ESRC internship scheme: Welsh Government experience; research in government and careers in analysis
Presented by Dr Steven Marshall, Chief Social Researcher; Dr Jamie Smith, Head of Local Government Research; and Kimberley Horton, PhD intern; at an ESRC Wales DTC event held in Cardiff on 28 November 2012.
• ESRC internship scheme:
  – Overview;
  – Welsh Government approach;
  – Experiences from an existing intern;
• The role of evidence and research in Government;
• Careers in Government;
• Questions
Around 180 people:
• Researchers, statisticians, economists, librarians, cartographers and support staff;
• Some analysts in policy departments;
• Social research only recently part of the central department (2009).
each policy department;
• Annual work plans agreed with policy departments;
• Research is mostly commissioned externally:
  – Develop approach and specification;
  – Manage the contract;
  – Briefing on the results;
• But some research is carried out in-house, and the extent varies across research teams.
of academia;
• Projects:
  – from 1 to 6 months;
  – frequency of adverts under review;
  – next round January 2013
• ESRC covers an extension to the grant, plus elements of travel and accommodation.
• Host organisations pay a contribution to ESRC.
for 18 co-funded internships per year;
• Projects tend to last between 3 and 6 months;
• Concentrated in certain policy areas to date, but expanding;
• Jamie will provide a manager's perspective, followed by the experiences of two of our current interns.
roles for PhD Interns
Three main types of roles:
1. Building capacity – e.g. by overseeing discrete parts of larger research projects or undertaking small studies that would otherwise have been commissioned externally
2. Scoping new areas of work – e.g. evaluative case studies to gauge potential for wider application, or gathering baseline information to inform policy development
3. Synthesising and drawing together existing evidence – e.g. rapid evidence assessments, secondary analysis of sources
PhD Internships help:
• Increase capacity – an integral part of the resourcing structure
• Offer solutions when unforeseen research needs emerge
• Bring fresh ideas and different perspectives
• Broaden the skills base of the team
• Improve links with universities
• Provide the satisfaction of helping students develop
evidence to...
• Anticipate issues/trends that governments will need to respond to
• Monitor social and economic change and policy implementation
• Appraise which policy/delivery options are likely to be most cost effective
• Tell us what key stakeholders – citizens, customers and service deliverers – think and understand
• Know whether our policies/programmes are effective and having an impact (and if not, why not)
• Help us learn and apply lessons to improve policy over time
• Ultimately – allow more informed decisions
evidence-based policy-making
• Not a new idea, but a relatively new ‘paradigm’ in government
• New Labour – out with ideological government, in with rational decision-making
• Prominent not only in government
• Embodied in the Civil Service Code
• As prevalent as ever?
conceptual difficulties
• Evidence-based, evidence-informed, evidence-aware – what’s the difference?
• Government is no more ‘doctor’ than society is ‘patient’
• No linear model of policy development – not formulaic
• Evidence is mainly about the past; policy is mainly about the future
• Detractors argue there must be policy before there is evidence (value judgements play a big part)
below constitute evidence-based use (Nutley et al 2002):
• Instrumental – leads directly to decision-making for policy and practice
• Conceptual – leads to changes in knowledge, understanding or attitude
• Mobilisation – used as an instrument of persuasion to legitimise action/inaction
• Wider influence – leads to larger-scale shifts in thinking
evidence is central to development and evaluation of policy… we need to be able to rely on social sciences and social scientists to tell us what works and why, and what types of policy initiatives are likely to be most effective’
Views of politicians vary…
“...we, as politicians, are elected by people to make a difference, to make things happen; we look at the evidence, but there comes a time when you say, ‘you use your judgment’. There could be other things where there is evidence that something works [but] for perfectly good policy reasons, we say it is not the thing we want to do”
“For me, real value is about being absolutely focused on the outcomes we are achieving out there in the real world… Tell us the facts and set out the evidence that underpins the options”
First Minister (presentation to senior civil servants, 2011)
a role for evidence at every stage?
• Rationale – e.g. evidence that there is a problem/question to be addressed, and that government intervention has the potential to help;
• Objectives – e.g. evidence on what aspects of the problem can be remedied, over what timescales and in what quantities;
• Appraisal – e.g. comparing different options to deliver the objectives, in terms of how likely they are to work and how much they cost;
• Monitoring – e.g. gathering monitoring data, and other evidence, during implementation to feed in to evaluations;
• Evaluation – e.g. an impact evaluation to determine whether the intended effects of a programme have been realised, or a process evaluation to determine how that happened;
• Feedback – e.g. using evaluation findings to determine the future of a programme or inform the design of a new one.
applies five principles that should be considered at every stage of our work:
1. Fit with the Government’s programme and Ministerial priorities
2. Impact on the people of Wales and the supporting evidence
3. Cost of our investment
4. Mechanisms available to incentivise change
5. Management of the work
What is KAS?
• GSR, GES, GSS, GIS, Operational Research, Library
• Collection, analysis and presentation of data and research for policy makers and the public
• Professional standards (quality standards, ethics, publication, etc.)
• Analytical teams for each DG
• Cross-cutting work (e.g. National Survey, Corporate Research)
things we do (in GSR)
• Describe situations factually and how they change over time
• Monitor specific aspects of policies/programmes and how they change over time
• Appraise proposed policies/options to establish their likely efficacy
• Evaluate how things worked in practice to learn lessons and assess impact
• Calculate value for money by assessing cost effectiveness, efficiency and effectiveness
• Estimate social impacts of policies/programmes
• Understand and measure behaviour change
trade-off of quality vs timeliness
• Good social science can take a long time
• But governments need to make quick decisions
• Challenge in reconciling these conflicts – how?
• Expediting – rapid evidence assessments, interim evaluations, shifting resources around
• Giving the best quality advice possible, with limitations clearly laid out
limits of causality and certainty
• Understanding the impacts of policy is difficult
[Chart: change in an impact measure over time – the post-new-initiative trend is compared against the counterfactual (no intervention) to isolate the impact of the new initiative]
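The counterfactual comparison described on this slide amounts to simple arithmetic: project the pre-initiative trend forward as the "no intervention" baseline, then take the gap between observed outcomes and that baseline as the estimated impact. A minimal sketch with hypothetical numbers (none of this is Welsh Government data):

```python
# Illustrative sketch (hypothetical numbers): estimating a policy's impact as
# the gap between observed outcomes and a projected counterfactual trend.

# Outcome measure for the five years before the initiative (hypothetical data).
pre_initiative = [50.0, 52.0, 54.0, 56.0, 58.0]

# Project the counterfactual by extending the pre-initiative linear trend.
annual_change = (pre_initiative[-1] - pre_initiative[0]) / (len(pre_initiative) - 1)
counterfactual = [pre_initiative[-1] + annual_change * (t + 1) for t in range(3)]

# Observed outcomes in the three years after the initiative (hypothetical data).
observed = [61.0, 64.0, 67.0]

# Estimated impact each year: observed minus counterfactual.
impact = [round(o - c, 1) for o, c in zip(observed, counterfactual)]
print(impact)  # [1.0, 2.0, 3.0]
```

The fragility the slide warns about is visible even here: the estimate depends entirely on the assumption that the pre-initiative trend would have continued unchanged without the intervention.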
limits of causality and certainty
• Understanding the impacts of policy is difficult
• But policy makers need to understand in order to make good decisions and spend money wisely
• Difficulties exacerbated by the timeliness vs quality trade-off and methods of policy implementation
• We endeavour to:
  – carry out sound impact evaluations
  – influence the way policies are designed and rolled out
  – advise against ineffective policy options
challenges around evidence and research
• Awareness of the importance of evidence varies (e.g. “We don’t need to evaluate that, we know it works”).
• Objectively assessing policy does not always fit with political priorities (e.g. “We’re keen to demonstrate the success of this policy”).
• A balanced view of effectiveness is sometimes seen as a negative (e.g. “We’re reluctant to do that in case it shows the policy isn’t working as well as we thought”).
• Sometimes the ‘wrong’ evidence can be the most powerful (e.g. “Statistics show we’re doing a lot more of that, so we’re definitely having an impact”).
resources, resources
• Good quality social science takes time and financial commitment (often lots!)
• Balance between commissioning and in-house work to keep skills sharp
• Budgets are increasingly under pressure across the board – more research funding = hard sell
• We continue to find creative ways to boost resources (like PhD Internships)
counter-arguments
• Financial pressure means the question of ‘what works?’ is more important than ever
• Hard to answer this question without rigorous social science
• Even the most costly research is less costly than ineffective policy
• More sophisticated approaches yield better estimates of impact and cost – pursue the most rigorous approach possible
– Social researchers;
– Economists;
– Statisticians;
– Operational researchers.
• The professions each have their own particular focus, but it can vary enormously between government departments.
– Data analysis (e.g. performance indicators);
– Guardians of official statistics (inc. publication).
• Operational researchers:
  – Huge variation in role across government;
• Additional benefits when professions work together.
• The fast stream is aimed at rapid promotion and demonstrating potential for the senior civil service;
• Social researchers, statisticians and operational researchers use both routes;
• Economists are fast stream entry only.
once a year;
• Commitment across government to continue even when recruitment is generally restricted;
• Begins in September, with final elements in Jan/Feb;
• Postings from around April/May;
• Assessments include some common elements as well as profession-specific elements;
• Some information is in the packs, but web links will be circulated after the event.
a suitable degree with social research methods, or a suitable higher degree;
• Centrally set assessments:
  – An application form;
  – A written knowledge test;
  – An oral briefing exercise;
  – A competence interview;
  – All based on the GSR competency framework.