CFHA Blog

Live Blogging: Dr. Russell Glasgow and Dr. Deborah Cohen Plenary Speech

Posted By Administration, Sunday, October 19, 2014

This is the final post in a series of live blogs from the 2014 CFHA Conference in Washington, DC. Big thanks to all the writers!

Implementation, Evaluation, and Getting to the Triple Aim by Russell Glasgow, PhD, and Deborah Cohen, PhD

October 18th 8-10 AM

 Colleen Fogarty and Stephanie Hern: Saturday blogging is more fun than cartoons any day 

 

8:38

Parinda announces the CFHA board members rotating off the board and encourages all members to participate in the election. She then passes the CFHA presidential baton to Rusty Kallenberg, the new president. Rusty begins his presidential remarks by noting the parallels between his career development and the organizational development of CFHA. When he was at George Washington, he often walked across the clinic to ask his behavioral health colleagues to help with patients. When he relocated to UCSD, he first met Todd Edwards and JoEllen Patterson, with whom he began to do collaborative care within the Family Medicine Department.

Rusty quotes Frank DeGruy: “Primary care fails without collaborative care.” CFHA is now at the developmental place of beginning “adult” activities. Rusty calls on us to grow the science and sustainability of collaborative care. “It’s a never-ending job improving collaborative care.” “As a primary care physician who has taken 20 years to figure this out, we must not make new physicians wait 20 years to figure this out. We need to teach them how to do this TODAY.”

Jennifer Hodgson introduces our plenary speakers, Drs. Russell Glasgow and Deborah Cohen.

9:00 

Colleen Fogarty (CF): Drs. Glasgow and Cohen take separate podiums, noting that, given the nature of CFHA, they wanted to integrate their presentations.

How is implementation science relevant to primary care? Dr. Glasgow explains: implementation science is designed to address pragmatic challenges, because much research isn’t really relevant to real-world practice. Conducted under such controlled circumstances, its findings are hard to generalize to real-world situations.

Russell quotes a valued collaborator, the “public health Larry Green”: “If we want more evidence-based practice, we need more practice-based evidence” (Larry W. Green, UCSF). He shows a schematic model of a contextual systems approach to implementation science and cites several different models for implementation science. Notwithstanding critics’ concerns that implementation science contains “more frameworks and models than well-done empirical studies!”, Dr. Glasgow notes that frameworks can help identify what steps are required for success and can inform both planning and the interpretation of results.

Glasgow: The RE-AIM model consists of five domains by which to consider an intervention:
- Reach
- Effectiveness
- Adoption
- Implementation
- Maintenance

 

9:08 

Stephanie Hern (SH): The goal is to make the case that implementation and evaluation science are key to getting to the Triple Aim. Implementation science is one of the field’s future directions. Deborah will talk about evaluation and how to frame evaluations, and both speakers will share about population health. We will see a video of a patient’s experience of the Triple Aim being attained within a primary care setting.

The pragmatic challenge: the majority of research is not really relevant to real-world practice. It is conducted under ideal conditions with selective populations, and it is hard to translate the results to real-world care.

Implementation science is multilevel research with a contextual systems approach. The critical elements of the intervention are present, but when we deliver a program it is embedded in many other things. Other important elements include the program delivery staff, the research evaluation team, the fit of the program, and the partnership among all players.

Models, theories, and frameworks can be helpful for identifying what we need to key on in order to be successful, both for planning and for understanding results. The RE-AIM model helps plan, evaluate, and report studies.

9:15

CF: Deborah Cohen reports on a demonstration project funded by the Colorado Health Foundation, Advancing Care Together (ACT). This study enrolled 11 practices, 9 primary care and 2 community health centers, who were experimenting with changing models of behavioral health in practice. As a demonstration project, these were not massively funded, but rather, received “a paltry amount of money to do something that can be sustained.” She asserts that “Little steps are monumental in a program’s ongoing development.”

SH: They knew costs would be lowered, based on evidence from other programs; these things were already proven for co-morbid conditions. With ACT, they wanted to know how to do it. How does published research inform how you work on the ground? They wanted to focus on implementation, but once it’s implemented, how are they doing?

CF: The team used the RE-AIM framework, recognizing that implementation of a new innovation occurs through rapid, short cycles of improvement. The team collected extensive data over 3 years, including online diaries, telephone calls, and clinic visits during which the team of 2-4 researchers looked at all aspects of a practice, including actual visits.

SH: Site visits were extensive and included all facets of operations, patients in and patients out. They evaluated reach: the percent of the target population screened, the percent screening positive, and the percent who received services. These were clinically relevant measures for the teams, not just outcomes; the teams could take a deeper look at their numbers. What do we do with the data collected? The data showed important steps in the change process. The researchers reviewed the data from each clinic every quarter and used it for learning moments and teachable moments.
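The reach measures described above form a screening cascade. As a minimal sketch with entirely hypothetical numbers (the talk did not report specific figures), each step uses the previous step as its denominator:

```python
# Hypothetical screening cascade illustrating the "reach" numbers the
# ACT evaluators tracked: of a target population, what percent were
# screened, screened positive, and went on to receive services.
# All counts below are invented for illustration.

target = 1000    # patients in the target population (assumed)
screened = 850   # screened for behavioral health needs
positive = 170   # screened positive
served = 119     # received services

print(f"Screened:          {screened / target:.0%} of target")    # 85%
print(f"Screened positive: {positive / screened:.0%} of screened") # 20%
print(f"Received services: {served / positive:.0%} of positives")  # 70%
```

Because each percentage is taken over the prior step rather than the whole population, a clinic can look strong at one step and still lose most patients by the end of the cascade.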

9:25

CF: Deb hands the microphone over to Russell with a “research fist bump.” Russell talks about the differences between individual-level changes and system-level changes, noting that the “reach” of an intervention encompasses the participation rate among eligible individuals and the representativeness of those participants. The “adoption” of the intervention reflects system-level involvement, including the participation rates among invited settings and staff and their representativeness.

Russell points out that denominators are crucial—reach and adoption are often confused; they ARE similar principles, but operate at different levels. If you don’t know the EXACT denominator, it’s okay to estimate!
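Russell’s point about denominators can be made concrete with a small sketch (the numbers are hypothetical, not from the talk): reach and adoption are the same ratio, computed over denominators at different levels.

```python
# Reach and adoption are the same idea at different levels:
# reach uses eligible individuals as the denominator,
# adoption uses invited settings (clinics) as the denominator.

def participation_rate(participating: int, eligible: int) -> float:
    """Fraction of the eligible denominator that took part."""
    return participating / eligible

# Individual level: reach = participants / eligible patients (assumed counts)
reach = participation_rate(participating=240, eligible=1200)

# Setting level: adoption = participating clinics / invited clinics
adoption = participation_rate(participating=8, eligible=11)

print(f"Reach: {reach:.0%}")        # 20% of eligible patients
print(f"Adoption: {adoption:.0%}")  # 73% of invited clinics
```

Confusing the two denominators makes a program look better or worse than it is, which is why Russell stresses that an estimated denominator is better than none.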

SH: Deborah talks more about the ACT program. It included brief counseling in PC, referral to traditional long-term therapy, warm hand-offs, and joint PC and CH counseling visits. They looked at whether screening was done. Some clinics screened 80-90% of patients, while others did discretionary screening. Why does this happen? It depends on how much you want to know about the patient population and what capacity you have to meet those needs. Not screening makes it impossible to fully know the needs of your patient population, and knowing those needs is a stimulator for the practice. Not screening is a little like sticking your head in the sand.

9:40

CF: Russell takes the microphone and switches to talking about the patient experience. Effectiveness (individual level) represents the effect on the primary outcome and the impact on quality of life. Maintenance represents the long-term effects of the intervention at the level of the setting, e.g., the clinic or health system. Evaluators look at the sustained delivery and modification of the intervention.

SH: Russell talks about the lessons learned on those two dimensions (sustained delivery and intervention modification). Back to the systems perspective: there are always consequences, some positive and some negative unintended consequences. Some programs with the greatest effect have the lowest reach, and vice versa. Setting-level maintenance is seldom achieved, and a program is almost never continued in exactly the same way.

CF: They show a powerful video of a patient health story, given by Patrick, who gave permission to share the video. Patrick talks about the impact of sustaining an injury that left him partially paralyzed and the family disruption, isolation, and ensuing alcohol abuse. He credits his physician with helping him find the motivation and resources for change, and the success of a behavioral health intervention when he reached a plateau in his change process.

SH: Patrick: Where do I find the courage to go further? My doctor could have said, “You’re great.” But he said, “Okay, if you want the courage, I believe it is in your behavior.” Coaching from his doctor opened up the areas where Patrick had plateaued and felt he could go further.

CF: In closing, Dr. Glasgow quotes a colleague saying, “All models are wrong,” while noting the value of the RE-AIM model for informing our evaluation of integration innovations.
