The formulation of evidence-based policy necessitates rigorous, objective evaluation of policy initiatives and, consequently, there has been significant growth in the evaluation of social policy over the last ten years. Alongside this, it is recognized that new policy initiatives need to be applied flexibly in order to be relevant to local populations. As a result, pilots and pathfinders are encouraged to undertake local evaluations in addition to the national evaluations commissioned by central government. These dual evaluations are seen as a vehicle for providing evidence on effectiveness whilst accommodating heterogeneity of needs and provision. We suggest that, without clear delineation of roles, dual evaluations are inefficient, likely to put additional pressure on busy practitioners (and the recipients of new services) to comply with varying data demands, and liable to present policy makers with confusing messages. In this article we focus on the potential for local and national evaluations to reach different conclusions, demonstrating how a simplistic application of quantitative techniques at local level can lead to inappropriate conclusions that contradict national findings. We make a number of recommendations that might facilitate better coordination of local and national evaluations.
Public Policy and Administration – SAGE
Published: Apr 1, 2012