
PO9F9-20 Development Design – Standards, Tools, and Skills for an Evidence-based Approach

Department: Politics & International Studies
Level: Taught Postgraduate Level
Module leader: David Connolly
Credit value: 20
Module duration: 10 weeks
Assessment: Multiple
Study location: University of Warwick main campus, Coventry

Introductory description

International development organisations and agencies assert that they use an evidence-based approach to design their projects, programmes, and policies. How does this work in practice? What are the challenges in turning words into action? This module examines how organisations and agencies pursue an evidence-based approach to the design of development programming, policy and practice, and the practical and structural challenges involved. It will explore the key operational standards and tools of evidence-based design and assess its contemporary challenges and opportunities. By the end of this module, students will develop practical knowledge of evidence-based design alongside a critical perspective on this approach.


Module aims

  1. To develop a critical understanding of the strengths and limitations of an evidence-based approach to development design.
  2. To learn about the key standards and tools used in evidence-based design in international development.
  3. To acquire hands-on, practical knowledge of how to apply evidence-based design in real-world development scenarios.
  4. To analyse the contemporary challenges and opportunities associated with implementing evidence-based design in development work.

Outline syllabus

This is an indicative module outline only, intended to give a sense of the topics that may be covered. Actual sessions held may differ.

Week 1: ‘The Role of Evidence in International Development Design’
Week 2: ‘Cases of Evidence-based Projects, Programmes and Policies’
Week 3: ‘Operational Standards’
Week 4: ‘Logical Frameworks and Theories of Change’
Week 5: ‘Context Assessment and Monitoring Tools’
Week 6: Reading Week
Week 7: ‘Evaluation and Learning Tools’
Week 8: Student Case Study Presentations
Week 9: ‘Navigating Contemporary Challenges and Opportunities: Decolonising Knowledge Production, and Shifting Donorship’
Week 10: ‘Navigating Contemporary Challenges and Opportunities: Data Diplomacy, and Disruptive Technologies’

Learning outcomes

By the end of the module, students should be able to:

  1. Evaluate and articulate, from a critical perspective, the strengths, limitations, potential biases, and broader implications of relying on evidence in decision-making processes.
  2. Demonstrate knowledge of the key standards and tools that govern evidence-based design in international development, including how evidence is gathered, analysed, and applied.
  3. Apply evidence-based design tools to real-world development sectors and scenarios through case studies.
  4. Critically analyse the contemporary challenges and opportunities in adopting and implementing evidence-based design in development work.

Indicative reading list

In keeping with the applied nature of this module, we will engage with different types of readings, including academic and professional journal articles, working papers, reports by international and multilateral agencies, and policy briefs and blogs by think tanks. The readings are organised below under the weekly seminar topics. Please ask during the seminars or office hours for specific recommendations from the list.

I encourage you to supplement this reading list and to deepen your knowledge of the topics. Listed below are some of the most pertinent knowledge hubs on the policy and practice of international development:

  • Journal of Development Effectiveness.
  • Overseas Development Institute
  • Center for Global Development
  • Observer Research Foundation
  • The World Bank
  • United Nations
  • BetterEvaluation (https://www.betterevaluation.org/)
  • Knowledge for Development and Diplomacy
  • International Initiative for Impact Evaluation
  • The Abdul Latif Jameel Poverty Action Lab (J-PAL)
  • Innovations for Poverty Action

Week 1/Seminar 1: ‘The Role of Evidence in International Development Design’
Ababou, Abil, and David Alzate. 2021. “Developing an Evidence-Based Mindset: Fostering a Culture of Evidence-Based Policymaking through Research, Training, and Policy Engagements.” In Changing Mindsets to Realize the 2030 Agenda for Sustainable Development: How to promote new mindsets and behaviors in public institutions to implement the Sustainable Development Goals, under the direction of the UN Department of Economic and Social Affairs, Division for Public Institutions and Digital Government. New York: United Nations. pp.66-74.

Biglan, Anthony, and Terje Ogden. 2008. “The Evolution of Evidence-Based Practices.” European Journal of Behavior Analysis, 9 (1): 81–95.

Jones, Harry, Nicola Jones, Louise Shaxson, and David Walker. 2013. ‘Knowledge, Policy and Power in International Development: a Practical Framework for Improving Policy’, London: ODI.

Rhoads Allen, Ruth, Pushpa Iyer, and Lillie Ris. 2023. ‘What Constitutes Effective Use of Evidence to Inform Peacebuilding Project Design?’, Evidence Review, Washington D.C: USIP.

United States Agency for International Development. 2016. ‘Strengthening Evidence-based Development’, Report, Washington D.C: USAID.

Week 2/Seminar 2: ‘Cases of Evidence-based Projects, Programmes and Policies’
Mercy Corps, Evidence-based Lessons for Implementing the Global Fragility Act
For background information and further reading on the Global Fragility Act (GFA) see: https://www.usip.org/fragility-conflict

‘Stopping As Success’
The goal of Stopping As Success: Locally Led Transitions in Development (SAS+) is to equip organisations with good practices to transition responsibly and to make way for local leadership in the development sector. For case studies see: https://www.stoppingassuccess.org/case-studies/

Global Evaluation Initiative (GEI):
The Global Evaluation Initiative (GEI) is a global network of organisations and experts that supports developing country governments in strengthening monitoring, evaluation, and the use of evidence. GEI focuses its support on country-owned efforts aligned with local needs, goals, and perspectives. Its integrated, systems-based approach moves beyond disconnected interventions, addressing the dynamic, interconnected real-world context of monitoring, evaluation, and evidence use in each country to provide tailored solutions that help governments gather and use data.

Progress on the SDGs: telling the Ghanaian story through the lens of citizens: Civil Society Organizations' shadow report on the voluntary national review
The Shadow Report complements the Government’s Voluntary National Review (VNR) report. It seeks to promote mutual accountability in the implementation of the SDGs. The report showcases the efforts and initiatives by civil society to implement the SDGs. The findings of the report are intended to help strengthen national interventions on Ghana's SDGs process. Members of the CSO Platform will use the outcomes of the report for advocacy and public awareness as well as for strengthening multi-stakeholder partnerships at the sub-national level.

The Center for Learning on Evaluation and Results for Francophone Africa (CLEAR-FA) in Madagascar.
This article captures relevant insights from CLEAR-FA’s role as an implementing partner of the Global Evaluation Initiative (GEI) in Madagascar. It describes CLEAR-FA’s approach to helping the government of Madagascar transform its monitoring and evaluation (M&E) system, and the challenges it faced from 2008 to 2020. The developments noted include the establishment of the National Integrated System for Monitoring and Evaluation (SNISE) and the shift from a donor-oriented to a country-owned system. Key challenges included fragmented stakeholders with undefined responsibilities, modalities, and capacities for evaluation, knowledge sharing, and learning.

The Center for Learning on Evaluation and Results for Lusophone Africa and Brazil (CLEAR-LAB), in collaboration with UNICEF-Mozambique.
CLEAR-LAB used its extensive experience supporting public policy evaluation in Brazil to forge a long-term collaboration with Mozambique in 2020. It introduced the Monitoring and Evaluation Systems Analysis (MESA) tool to map public sector institutions and stakeholders and developed a Capacity Development Plan to guide further activities to strengthen the national M&E system. In partnership with UNICEF, the MESA findings identified three areas of support: 1) providing advisory services to support the development of the country’s legal and policy framework for a national M&E system; 2) building the capacity of stakeholders around M&E; and 3) supporting a multisectoral team with piloting the first-ever rapid policy evaluation in Mozambique.

Week 3/Seminar 3: ‘Operational Standards’
Bates, Mary Ann, and Rachel Glennerster. 2017. “The Generalizability Puzzle.” Stanford Social Innovation Review 15 (3): 50–54.

OECD. 2020. Mobilising Evidence for Good Governance: Taking Stock of Principles and Standards for Policy Design, Implementation and Evaluation, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/3f6f736b-en.

Schmeidl, S., Ware, A., & Alberti, C. (2023). “Conflict sensitivity/Do No Harm (DNH) in development, humanitarian, and peacebuilding practice – reflections and emerging trends” [special issue editorial]. Development in Practice, 33(5), 517–527. https://doi.org/10.1080/09614524.2023.2215970

United Kingdom Department for International Development and Foreign, Commonwealth & Development Office, 2013. How to Note: Assessing the Strength of Evidence

Wenar, Leif. 2006. “Accountability in International Development Aid.” Ethics & International Affairs, 20(1): 1–23. https://doi.org/10.1111/j.1747-7093.2006.00001.x

Winters, Matthew S. 2010. “Accountability, Participation and Foreign Aid Effectiveness”, International Studies Review, Volume 12, Issue 2, June, Pages 218–243, https://doi.org/10.1111/j.1468-2486.2010.00929.x

Week 4/Seminar 4: ‘Logical Frameworks and Theories of Change’
Global Development Research Center. 2003. Logical Framework Analysis.

Golini, R., Landoni, P., & Kalchschmidt, M. 2017. “The adoption of the logical framework in international development projects: a survey of non-governmental organizations”. Impact Assessment and Project Appraisal, 36(2), 145–154. https://doi.org/10.1080/14615517.2017.1354643

Örtengren, Kari. 2004. The Logical Framework Approach: A Summary of the Theory behind the LFA Method.

Stein, D., & Valters, C. 2012. Understanding Theory of Change in International Development.

United Nations Development Group (UNDG). (undated). Theory of Change: Facilitator’s Guide.

Vogel, I. 2012. Review of the Use of ‘Theory of Change’ in International Development. London: UK Department for International Development.

Week 5/Seminar 5: ‘Context Assessment and Monitoring Tools’
Anderlini, S. N. 2006. Mainstreaming gender in conflict analysis: Issues and recommendations, Social Development Papers No. 33. Washington DC: World Bank.

Barakat, Sultan, and Thomas Waldman. 2013. “Conflict Analysis for the Twenty-First Century.” Conflict, Security and Development, 13: 259–283.

Mashatt, Merriam, Daniel Long, and James Crum. 2008. Conflict-sensitive Approach to Infrastructure Development, Special Report 197, Washington, D.C.: USIP.

Mobjörk, M., M. T. Gustafsson, H. Sonnsjö, S. van Baalen, L.M. Dellmuth, & N. Bremberg (2016) Climate-related Security Risks. Towards an Integrated Approach. Stockholm: SIPRI.

Pasanen, Tiina and Inka Barnett. 2019. ‘Supporting adaptive management: monitoring and evaluation tools and approaches.’ Working paper No. 659, London: ODI, 3 December.

Sudhakar, N. & Kuehnast, K. 2011. The other side of gender: Including masculinity concerns in conflict and peacebuilding. Washington DC: USIP.

USAID. 2024. Conflict and Violence Assessment Framework, Washington DC.

United Nations and World Bank (2018) Pathways for Peace: Inclusive Approaches to Preventing Violent Conflict. Washington DC: World Bank.

Whaites, Alan. 2017. "The Beginner’s Guide to Political Economy Analysis (PEA)." National School of Government International, London.

World Bank Group Communities of Practice. 2021. Conducting a Stakeholder Analysis:
This article provides guidelines for stakeholder analysis, a process used to identify the individuals or groups influenced or impacted by a community of practice. Stakeholders are grouped according to their level of participation, interest, and influence to determine the best approach for communicating with and involving each stakeholder group. Conducting a stakeholder analysis helps to better understand the audience, secure key support, gain early alignment on goals, and address conflicts or other issues early on. It can also ensure that all stakeholders have a clear and common understanding of the project's objectives and responsibilities.

Week 7/Seminar 6: ‘Evaluation and Learning Tools’
Alkin, M. and King, J. (2016). “The Historical Development of Evaluation Use”. American Journal of Evaluation, 37(4).

Eckhard, Steffen, and Vytautas Jankauskas. 2020. “Explaining the Political Use of Evaluation in International Organizations”, Policy Sciences, 53: 667–695.

Jankauskas, Vytautas and Steffen Eckhard. 2023. The Politics of Evaluation in International Organizations, Oxford: OUP. See Chapters 1, 2, and 8.

Kaufman, Julia, Amanda Glassman, Ruth Levine, and Janeen Madan Keller. 2022. ‘Breakthrough to Policy Use: Reinvigorating Impact Evaluation for Global Development’, Washington D.C: CGD.

Newcomer, Kathryn E. and Steven W. Mumford. eds., 2024. Research Handbook on Program Evaluation, Cheltenham: Edward Elgar. See Chapters 1, 2, 7 and 8.

Newcomer, K., Olejniczak, K., & Hart, N. 2022. “Learning agendas: Motivation, engagement, and potential.” New Directions for Evaluation, 63–83.

Weiss, Carol H. 1993. ‘Where politics and evaluation research meet,’ Evaluation Practice, Volume 14, Issue 1, Pages 93–106. https://doi.org/10.1016/0886-1633(93)90046-R

Wilson-Grau, Ricardo, Heather Britt, Yulianto Dewata, Patricia Rogers, and Kaye Stevens. 2012. "Outcome Harvesting."

Week 8/Seminar 7: Student Case Study Presentations

  • No required readings

Week 9/Seminar 8: ‘Navigating Contemporary Challenges and Opportunities: Decolonising Knowledge Production, and Shifting Donorship’
Baguios, A., King, M., Martins, A. and Pinnington, R. (2021) Are we there yet? Localisation as the journey towards locally led practice: models, approaches and challenges. ODI Report. London: ODI.

Equitable Evaluation Initiative. 2023. ‘The Equitable Evaluation Framework’

Chakma, Trimita, Georgia Booth, and Ruby Johnson. 2024. ‘Expanding Our Understanding of Evidence for Meaningful Participation’, Report, Amsterdam: Porticus.

Ingram, George. 2024. Locally Led and Globally Informed: Key Themes on Localization. September 20.
This essay is followed by a series of viewpoints from more than 20 experts on localization.

Mansuri, Ghazala and Vijayendra Rao. 2013. Localizing Development: Does Participation Work? Policy Research Report, Washington, DC: World Bank.

Newcomer, Kathryn E. and Steven W. Mumford. eds., 2024. Research Handbook on Program Evaluation, Cheltenham: Edward Elgar. See Chapters 9, 10 and 11.

Peace Direct. 2021. ‘Time to Decolonize Aid’, Report, London: Peace Direct.
This is an invaluable resource for reflecting on the power dynamics from which the international aid industry emerged, which continue to structure relations across countries and between actors. Chapter 5 “Structural Racism in the Modern-Day Aid System” contains a useful analysis of the structural, procedural, and cultural barriers to engaging local partners in monitoring, evaluation, and learning, and forming partnerships with local actors for knowledge generation and analysis.

World Bank Group. 2021. A Changing Landscape: Trends in Official Financial Flows and the Aid Architecture. Washington, DC: World Bank.
The analysis of the main trends begins on p.18.

Week 10/Seminar 9: ‘Navigating Contemporary Challenges and Opportunities: Data Diplomacy, and Disruptive Technologies’
Boyd, Andy, Jane Gatewood, Stuart Thorson, and Timothy D. V. Dye. 2019. “Data Diplomacy,” Science & Diplomacy, Vol. 8, No. 1, May.

Bjola, C. 2021. “AI for development: implications for theory and practice”. Oxford Development Studies, 50(1), 78–90.

Cummings, M. L., Heather M. Roff, Kenneth Cukier, Jacob Parakilas and Hannah Bryce. 2018. Artificial Intelligence and International Affairs: Disruption Anticipated, London: Chatham House, June.

Goralski, Margaret A., and Tay Keong Tan. 2020. “Artificial intelligence and sustainable development”, The International Journal of Management Education, Volume 18, Issue 1.

Khanal, Shaleen, Hongzhou Zhang, and Araz Taeihagh. 2024. “Why and how is the power of Big Tech increasing in the policy process? The case of generative AI”, Policy and Society.

Mir, Asfandyar and Niloufer Siddiqui. 2022. Losing Facts to Fiction: Nationalism, Misinformation, and Conspiracy Theories in Pakistan, USIP Special Report, No 514, November.

Newcomer, Kathryn E. and Steven W. Mumford. eds., 2024. Research Handbook on Program Evaluation, Cheltenham: Edward Elgar. See Chapters 18, 24, 33 and 34.

Vinuesa, R., Azizpour, H., Leite, I. et al. 2020. “The role of artificial intelligence in achieving the Sustainable Development Goals.” Nature Communications, 11, 233.

Research element

The assessment will involve a presentation and a report, which will require students to use a variety of primary and secondary sources.

Interdisciplinary

The field of international development and its study are interdisciplinary in their nature and evolution, and this will be the focus of many of the discussions during the module. Throughout the module, the reading list and sources used will explicitly draw on insights from different disciplines (law, anthropology, political science, economics).

International

The module attracts an international student body with its focus on global issues of international development.

Subject specific skills

  1. Connecting theories of international development to the policies and practices of International Development.
  2. Knowledge of the evolution of the evidence-based approach to the policy and practice of International Development, and of the key actors and events involved.
  3. Knowledge of the key issues, approaches, and skills in the field of International Development, with a focus on the evidence-based approach.
  4. Ability to critically assess and evaluate International Development programmes and policies.

Transferable skills

  1. Written communication skills
  2. Oral communication skills
  3. Detailed critical analysis skills
  4. Skills in the interpretation of primary and secondary sources
  5. Independent research skills
  6. Application of learning to case studies
  7. Summarising large bodies of work to highlight key points
  8. Assessing the implications of policy developments for theory and practice

Study time

Seminars: 9 sessions of 2 hours (18 hours, 9%)
Private study: 182 hours (91%)
Total: 200 hours

Private study description

Students will focus their preparatory reading for each seminar on the seminar questions provided in the module syllabus, and will also undertake independent research to complete their assessed work.

Costs

No further costs have been identified for this module.

You do not need to pass all assessment components to pass the module.

Assessment group A1
Assessment component: Report on case study
Weighting: 60%; eligible for self-certification: Yes (extension)

Students will draft a consultancy-type report on their case study (see details for other assessment) for the relevant donor or implementing agency. The report will contain an analysis and critique of the evidence-based approach along with targeted recommendations.

Reassessment component is the same
Assessment component: Presentation
Weighting: 40%

Students will select an international development project, programme or policy case study and present an analysis of its evidence-based approach.

Reassessment component is the same
Assessment group S
Assessment component: Small-group Presentation (15 mins, Week 8)
Weighting: 40%; eligible for self-certification: Yes (extension)

Students will select an international development project, programme or policy case study and present an analysis of its evidence-based approach.

Reassessment component is the same
Assessment component: Individual Assignment (3,000 words)
Weighting: 60%; eligible for self-certification: Yes (extension)

Reassessment component is the same

Feedback on assessment

Feedback form via Tabula, optional verbal consultation

Courses

Course availability information is based on the current academic year, so it may change.

This module is Core optional for:

  • Year 1 of TPOS-M9PT MA in International Development
  • Year 1 of TPOS-M9P9 Postgraduate Taught International Relations

This module is Optional for:

  • Year 1 of TPOS-M9PT MA in International Development
  • Year 1 of TPOS-M9Q1 Postgraduate Politics, Big Data and Quantitative Methods
  • Year 1 of TPOS-M1P3 Postgraduate Taught International Political Economy
  • Year 1 of TPOS-M1P8 Postgraduate Taught International Politics and East Asia
  • Year 1 of TPOS-M9P9 Postgraduate Taught International Relations
  • Year 1 of TPOS-M9PC Postgraduate Taught International Security
  • Year 1 of TPOS-M9PS Postgraduate Taught Political and Legal Theory
  • Year 1 of TPOS-M9PF Postgraduate Taught Public Policy

This module is Option list B for:

  • Year 1 of TPOS-M1PD Postgraduate Taught the Politics of Climate Change