Education

Published on January 10, 2008

Author: Toni

Source: authorstream.com

The Bumps and Bruises of the Evaluation Mine Field
Presented by Olivia Silber Ashley, Dr.P.H.
Presented to the Office of Adolescent Pregnancy Programs Care Grantee Conference, February 1-2, 2007, New Orleans, Louisiana
RTI International is a trade name of Research Triangle Institute
3040 Cornwallis Road, P.O. Box 12194, Research Triangle Park, NC 27709
Phone 919-541-6427, Fax 919-485-5555, e-mail osilber@rti.org

Overview
- Core evaluation instruments
- Evaluation design
- Analysis

Background on Core Evaluation Instruments
- The Office of Management and Budget (OMB) recently examined the AFL program using its Program Assessment Rating Tool (PART)
- Identified program strengths: program purpose, design, and management
- Identified areas for improvement: strategic planning and program results/accountability
- In response, OPA developed baseline and follow-up core evaluation instruments and performance measures to track demonstration project effectiveness

Staff and Client Advisory Committee
Anne Badgley, Leisa Bishop, Doreen Brown, Carl Christopher, Cheri Christopher, Audra Cummings, Christina Diaz, Amy Lewin, David MacPhee, Janet Mapp, Ruben Martinez, Mary Lou McCloud, Charnese McPherson, Alice Skenandore, Jared Stangenberg, Cherie Wooden

Capacity Assessment Methods
- Review of grant applications, annual reports, and other information from the 28 most recently funded programs
- Qualitative assessment involving program directors, evaluators, and staff in 14 Title XX Prevention programs and 14 Title XX Care programs
- Telephone interviews, site visits, observations of data collection activities, and document review
- Conducted between January 26, 2006, and March 16, 2006
- 31 interviews involving 73 interviewees across 28 programs; 100% response rate

Selected Title XX Prevention and Care Programs
Baptist Children's Home Ministries; Boston Medical Center; Emory University; Freedom Foundation of New Jersey, Inc.; Heritage Community Services; Ingham County Health Department; James Madison University; Kings Community Action; National Organization of Concerned Black Men; Our Lady of Lourdes; Red Cliff Band of Chippewas; St. Vincent Mercy Medical Center; Switchboard of Miami, Inc.; Youth Opportunities Unlimited; Children's Home Society of Washington; Children's Hospital; Choctaw Nation of Oklahoma; Congreso de Latinos Unidos; Hidalgo Medical Services; Illinois Department of Human Services; Metro Atlanta Youth for Christ; Roca, Inc.; Rosalie Manor Community & Family Services; San Mateo County Health Services Agency; Truman Medical Services; University of Utah; Youth and Family Alliance/Lifeworks; YWCA of Rochester and Monroe

Capacity Assessment Research Questions
- How and to what extent have AFL projects used the core evaluation instruments?
- What problems have AFL projects encountered with the instruments?
Difficulties with Core Evaluation Instruments among Care Programs
(findings presented graphically in the original slides)

Difficulties with Core Evaluation Instruments among Prevention Programs
(findings presented graphically in the original slides)

Expert Work Group
Elaine Borawski, Claire Brindis, Meredith Kelsey, Doug Kirby, Lisa Lieberman, Dennis McBride, Jeff Tanner, Lynne Tingle, Amy Tsui, Gina Wingood

Draft Revision of Core Evaluation Instruments
- Confidentiality statement
- 5th grade reading level
- Instructions for adolescent respondents
- Re-ordering of questions
- Improved formatting
- Sensitivity to diverse family structures
- Consistency in response options
- Improved fidelity to original source items
- Eliminated birth control question for pregnant adolescents
- Modified birth control question for parenting adolescents
- Clarified reference child
- Separated questions about counseling/testing and treatment for STDs
- Modified living situation question
- Improved race question
- Added pneumococcal vaccine (PCV) item

Why is a Rigorous Evaluation Design Important?
- Attribute changes to the program
- Reduce the likelihood of spurious results
- OMB performance measure to improve evaluation quality
- Peer-reviewed publication
- Continued funding for your project and for the AFL program
- Ensure that program services are helpful to pregnant and parenting adolescents

Evaluation Design
- Appropriate to answer the evaluation research questions
- Begin with the most rigorous design possible
- Randomized experimental design is the gold standard for answering research questions about program effectiveness
- Units for study (such as individuals, schools, clinics, or geographical areas) are randomly allocated to groups exposed to different treatment conditions

Barriers to Randomized Experimental Design
- Costs: consume a great deal of real resources, are costly in terms of time, and involve significant political costs
- Ethical issues raised by experimentation with human beings
- Limited in duration
- High attrition in either the treatment or control groups
- Population enrolled in the treatment and control groups not representative of the population that would be affected by the treatment
- Possible program contamination across treatment groups
- Lack of experience using this design
(Bauman, Viadro, & Tsui, 1994; Burtless, 1995)

Benefits of Randomized Experimental Design
- Ability to infer causality: assures the direction of causality between treatment and outcome
- Removes any systematic correlation between treatment status and both observed and unobserved participant characteristics
- Permits measurement of the effects of conditions that have not previously been observed
- Offers advantages in making results convincing and understandable to policy makers: policymakers can concentrate on the implications of the results for changing public policy, and the small number of qualifications to experimental findings can be explained in lay terms
(Bauman, Viadro, & Tsui, 1994; Burtless, 1995)

Strategies for Implementing Randomized Experimental Design
- Read methods sections from evaluations using randomized experimental design
- Ask for evaluation technical assistance to implement this design
- Recruit all interested adolescents
- Ask parents/adolescents for permission to randomly assign to one of two conditions
- Divide program components into two conditions, or overlay one component on top of the others
- Focus outcome evaluation efforts on randomly assigned adolescents
- Include all adolescents in the process evaluation

An Example
- Study examined whether a home-based mentoring intervention prevented a second birth within 2 years of the first birth, and whether increased participation in the intervention reduced the likelihood of a second birth
- Randomized controlled trial involving first-time black adolescent mothers (n=181) younger than age 18
- Intervention based on social cognitive theory; focused on interpersonal negotiation skills, adolescent development, and parenting; delivered bi-weekly until the infant's first birthday
- Mentors were black, college-educated single mothers; the control group received usual care
- No differences in baseline contraceptive use or other measures of risk or family formation
- Follow-up at 6, 13, and 24 months after recruitment at first delivery; response rate of 82% at 24 months
- Intent-to-treat analysis showed that intervention mothers were less likely than control mothers to have a second infant; two or more intervention visits increased the odds of avoiding a second birth more than threefold
Source: Black et al. (2006). Delaying second births among adolescent mothers: A randomized, controlled trial of a home-based mentoring program. Pediatrics, 118, e1087-e1099.
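As a concrete sketch of the random allocation described in the strategies above, the following Python snippet shuffles a consenting sample and splits it into two conditions. The participant IDs and the seed are hypothetical, for illustration only:

```python
import random

def randomize(participants, seed=2007):
    """Allocate consenting participants at random to two study conditions.

    A fixed seed makes the allocation reproducible, which helps when
    documenting the design for reviewers.
    """
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "comparison": shuffled[half:]}

# Hypothetical IDs for illustration only
groups = randomize([f"P{i:03d}" for i in range(1, 21)])
print(len(groups["treatment"]), len(groups["comparison"]))  # 10 10
```

In practice, evaluators often use blocked or stratified randomization to balance key characteristics across conditions; this sketch shows only simple randomization.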
Obtaining and Maintaining a Comparison Group
- Emphasize the value of research
- Explain exactly what the responsibilities of the comparison group will be
- Minimize the burden on the comparison group
- Ask for commitment in writing
- Provide incentives for data collection
- Provide non-related services/materials
- Meet frequently with people from participating community organizations and schools
- Provide school-level data to each participating school (after data are cleaned and de-identified)
- Work with organizations to help them obtain resources for other health problems they are concerned about
- Add questions that other organizations are interested in
- Explain the relationship of this project to the efforts of OAPP
Adapted from Foshee, V.A., Linder, G.F., Bauman, K.E., Langwick, S.A., Arriaga, X.B., Heath, J.L., McMahon, P.M., & Bangdiwala, S. (1996). The Safe Dates Project: Theoretical basis, evaluation design, and selected baseline findings. American Journal of Preventive Medicine, 12, 39-47.

Analysis
- Include process measures in outcome analysis
- Attrition analysis
- Missing data
- Assessment of baseline differences between treatment groups
- Intent-to-treat analysis
- Multivariate analysis controlling for variables associated with baseline differences and attrition

Incorporate Process Evaluation Measures in Outcome Analysis
- Process evaluation measures assess qualitative and quantitative parameters of program implementation: attendance data, participant feedback, and program-delivery adherence to implementation guidelines
- Facilitate replication, understanding of outcome evaluation findings, and program improvement
- Avoids Type III error: concluding that a program is not effective when the program was not implemented as intended
Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
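One way to carry attendance data from the process evaluation into the outcome analysis is to compute a per-participant "dose" variable. A minimal sketch, using made-up attendance records (the participant IDs and session counts are hypothetical):

```python
def dose_per_participant(attendance, total_sessions):
    """Convert raw attendance records into a dose variable (proportion of
    offered sessions attended) that can enter outcome models as a covariate
    or exposure measure."""
    return {
        participant: len(sessions) / total_sessions
        for participant, sessions in attendance.items()
    }

# Hypothetical attendance records: participant -> list of sessions attended
attendance = {
    "P001": [1, 2, 3, 5, 6, 8, 9, 10, 11, 12],
    "P002": [1, 4],
    "P003": list(range(1, 13)),  # attended all 12 sessions
}
print(dose_per_participant(attendance, total_sessions=12))
```

A dose variable like this supports the kind of dose-response finding reported in the Black et al. (2006) example, where two or more visits changed the odds of the outcome.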
Attrition Analysis
- Number of participants lost over the course of a program evaluation
- Some participant loss is inevitable due to transitions among program recipients
- Extraordinary attrition rates generally lower the degree of confidence reviewers are able to place on outcome findings
- Not needed if imputing data for all respondent missingness
- Evaluate the relationship of study variables to dropout status (from baseline to follow-up)
- Report findings from the attrition analysis, including the direction of findings
- Control for variables associated with dropout in all multivariate outcome analyses
Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.

Missing Data
- Not the same as attrition (the rate at which participants prematurely leave an evaluation); missing data are the absence of, or gaps in, information from participants who remain involved
- A large amount of missing data can threaten the integrity of an evaluation
- Item-level missingness: run frequency distributions for all items, consider logical skips, report missingness, and address missingness above 10%
- Imputation procedures: imputed single values; multiple imputation (SAS Proc MI), which replaces missing values in a dataset with a set of "plausible" values; Full Information Maximum Likelihood (FIML) estimation in a multilevel structural equation modeling (SEM) framework in Mplus 4.1 (Muthen & Muthen, 1998-2006)
Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
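The item-level checks described above (frequency distributions per item, with a 10% reporting threshold) can be sketched in a few lines. The 10% cutoff comes from the guidance above; the records and field names are hypothetical:

```python
def missingness_report(records, items, threshold=10.0):
    """Percent of records missing each item, flagging items that exceed
    the reporting threshold (10% per the guidance above)."""
    n = len(records)
    report = {}
    for item in items:
        missing = sum(1 for r in records if r.get(item) is None)
        pct = 100.0 * missing / n
        report[item] = (round(pct, 1), pct > threshold)
    return report

# Hypothetical survey records; None marks a missing response
records = [
    {"age": 16, "contraceptive_use": "yes"},
    {"age": 17, "contraceptive_use": None},
    {"age": None, "contraceptive_use": "no"},
    {"age": 15, "contraceptive_use": "yes"},
]
print(missingness_report(records, ["age", "contraceptive_use"]))
# {'age': (25.0, True), 'contraceptive_use': (25.0, True)}
```

Logical skips (questions a respondent was legitimately never asked) should be coded distinctly from true missingness before running a report like this, or the percentages will overstate the problem.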
Analysis
- Appropriateness of data analytic techniques for determining the success of a program: employ state-of-the-art data analysis techniques to assess program effectiveness by participant subgroup, and use the most suitable current methods to measure outcome change
- Subgroup (moderation) analyses allow evaluation of outcomes by participant age and ethnicity, for example
- It is okay to start with descriptive statistics: report baseline and follow-up results for both treatment and comparison groups
- Conduct multivariate analysis of treatment condition predicting the difference of differences
- Control for variables associated with attrition and for variables associated with differences at baseline
Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.

Assessment of Baseline Differences between Treatment and Comparison Groups
Address the following research questions: are treatment and comparison group adolescents similar in terms of
- Baseline levels of outcome variables (e.g., educational achievement, current school status)
- Key demographic characteristics, such as age, race/ethnicity, pregnancy stage, marital status, living arrangements, and SES

Test for Baseline Differences
- Test for statistically significant differences in the proportions of adolescents in each category
- If you decide to analyze potential mediators as short-term program outcomes, test for baseline differences on these mediators
- Report results from these tests in the end-of-year evaluation report for each year that baseline data are collected; important for peer-reviewed publication
- Control for variables associated with treatment condition in outcome analyses

An Example: Children's Hospital Boston
- Study to increase parenting skills and improve attitudes about parenting among parenting teens through a structured psychoeducational group model
- All parenting teens (n=91) were offered a 12-week group parenting curriculum; the comparison group (n=54) declined the curriculum but agreed to participate in the evaluation
- Pre-test and post-test measures included the Adult-Adolescent Parenting Inventory (AAPI), the Maternal Self-Report Inventory (MSRI), and the Parenting Daily Hassles Scale
- Analyses controlled for mother's age, baby's age, and race
- Results showed that program participants, and those who attended more sessions, improved their mothering role, perception of childbearing, developmental expectations of the child, and empathy for the baby, and reduced the frequency of hassles in child and family events
Source: Woods et al. (2003). The parenting project for teen mothers: The impact of a nurturing curriculum on adolescent parenting skills and life hassles. Ambul Pediatr, 3, 240-245.

Moderation and Mediation Analyses
- Test for moderation: assess the interaction between treatment and demographic/baseline risk variables; when the interaction term is significant, stratify by levels of the moderator variable and re-run analyses for subgroups
- Test for mediation: standard z-test based on the multivariate delta standard error for the estimate of the mediated effect (MacKinnon, Lockwood, Hoffman, West, & Sheets, 2002; Sobel, 1982); treatment condition beta value attenuated by 20% or more after controlling for proposed mediators (Baron & Kenny, 1986)

An Example
(presented as a logic-model diagram in the original slides, linking program activities to outcomes)
- AFL Care Program: academic case management, family planning counseling
- Teacher characteristics: improved interactions with the adolescent, positive messages about the adolescent's capabilities
- Mediating effects: improved adolescent self-efficacy to succeed academically, leading to a longer adolescent stay in school
- Process evaluation: training, curriculum, program content, program delivery, program activities
- Outcome evaluation: main effect, mediating effect, moderating effect
- Moderators: demographic characteristics, family dysfunction, adolescent age at first pregnancy
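The mediation z-test cited above (Sobel, 1982) can be computed directly from the unstandardized path coefficients and their standard errors. A sketch with hypothetical numbers (these are illustrative, not from any AFL evaluation):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel (1982) z-test for the mediated effect a*b, where a is the
    treatment -> mediator path and b is the mediator -> outcome path,
    using the multivariate delta-method standard error."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    return (a * b) / se_ab

# Hypothetical path estimates for illustration
z = sobel_z(a=0.50, se_a=0.10, b=0.40, se_b=0.10)
print(round(z, 2))  # 3.12 (|z| > 1.96 suggests a significant mediated effect)
```

This is the simplest form of the delta-method standard error; MacKinnon et al. (2002) discuss variants with better small-sample behavior.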
Another Example
(presented as a logic-model diagram in the original slides)
- Goal: grandparent support group
- Grandparent characteristics: increased knowledge about immunization benefits, increased skills for avoiding conflict with the adolescent
- Mediating effects: improved adolescent behavioral capability to use contraception and negotiate with a partner; improved adolescent outcome expectations about immunizations
- Outcomes: increased adolescent contraceptive use, reduced adolescent repeat pregnancy, increased immunizations

Intent-to-Treat Analysis
- Requires that all respondents initially enrolled in a given program condition be included in the first pass of an analysis strategy, regardless of whether respondents subsequently received the program "treatment" (Hollis & Campbell, 1999)
- Report findings from the intent-to-treat analysis; important for peer-reviewed publication
- It is okay to re-run analyses, recoding respondents as not receiving the program or dropping them from analyses
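The intent-to-treat rule above — analyze everyone as randomized, regardless of the treatment actually received — can be made concrete with a small sketch. The records below are hypothetical:

```python
def outcome_rate(records, group):
    """Proportion of respondents assigned to `group` who had the outcome,
    counting every enrolled respondent as randomized, per the
    intent-to-treat principle."""
    in_group = [r for r in records if r["assigned"] == group]
    return sum(r["outcome"] for r in in_group) / len(in_group)

# Hypothetical enrollment records: 'received' may differ from 'assigned'
records = [
    {"assigned": "treatment", "received": True,  "outcome": 0},
    {"assigned": "treatment", "received": False, "outcome": 1},  # never attended, still counted
    {"assigned": "treatment", "received": True,  "outcome": 0},
    {"assigned": "control",   "received": False, "outcome": 1},
    {"assigned": "control",   "received": False, "outcome": 1},
    {"assigned": "control",   "received": False, "outcome": 0},
]
print(round(outcome_rate(records, "treatment"), 2))  # 0.33
```

Re-running with the non-attender recoded or dropped is then a secondary, per-protocol-style sensitivity analysis, as the slide suggests; it should be reported alongside, not instead of, the intent-to-treat result.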
