The SAMEA Evaluation Hackathon will run from 8-22 October 2021 in partnership with United Nations Children's Fund (UNICEF), National Association of Social Change Entities in Education (NASCEE), Department of Planning, Monitoring and Evaluation (DPME) and JET Education Services (JET).

If you would like to get involved, see the themes and topics below. An underlying theme is how we apply monitoring and evaluation (M&E) in times of increasing uncertainty - the COVID-19 pandemic triggered a global crisis, but others are waiting in the wings, such as possible recessions, climate change impacts and biodiversity collapse.

Theme 1 Addressing the systemic crises facing South Africa

The overriding danger for the planet comes from climate change and ecosystem breakdown. We are entering dangerous waters in which temperature rise may trigger irreversible warming feedback loops, releasing highly dangerous methane from the seafloor (clathrates) and from permafrost. The human-induced climate crisis is escalating at unprecedented speed and is a multiplier of other pressures on biodiversity, both exacerbating the effects of these pressures and altering the frequency, intensity and timing of events. This is affecting “most ecological processes, with disruptions evident from the genetic level to the landscape level”. At the same time, the economic system, internationally and within South Africa, reproduces economic inequality, with the benefits of growth accruing to a few. Over forty percent of the South African population is unemployed, many of the employed survive on poverty wages, and the top 20 percent of the population receives over 68 percent of income. The level of poverty in South Africa, and the failure of the current economic model to reverse it, demands focused action.

Topic 1.1: Developing an environmental sustainability criterion and guideline

Traditionally, the Organisation for Economic Co-operation and Development (OECD) Development Assistance Committee (DAC) evaluation criterion on sustainability is used to look at the sustainability of the evaluand itself. This does not take account of the ways the evaluand could or should be contributing to wider sustainability, whether environmental, economic or social. The International Fund for Agricultural Development (IFAD), for example, has adopted a broader definition which addresses wider environmental sustainability. The task of this team is to develop a working definition of an environmental sustainability criterion for South Africa which reviews the actual/potential contribution of the evaluand to wider environmental sustainability (with social sustainability addressed in the equity task below), and a guideline for how to apply this in evaluations.

Topic 1.2: Developing an equity criterion and guideline

Equity was considered when the OECD DAC reviewed its six current evaluation criteria, but was not taken forward. In a similar fashion to Topic 1.1, the question is how evaluations could look at the actual or potential contribution of policies and programmes to promoting equity. The task for this team is to review the use of the equity criterion internationally and come up with a working definition for South Africa which includes the actual/potential contribution of the evaluand to promoting equity, and a guideline on how this could be applied in evaluations.

Theme 2 How to undertake M&E in times of crisis

“Traditional” M&E is a structured, logical undertaking, planned and scheduled well in advance. However, crises - such as the COVID-19 pandemic and the economic and human crises that followed - emerge rapidly and without prior warning, disrupting plans and programmes and requiring rapid, decisive, high-stakes decision making in contexts of limited data/evidence and of time and resource constraints, which limit research and evaluation endeavours. This creates a challenge for the M&E profession and its practitioners: to respond rapidly whilst maintaining acceptable rigour and, above all, to provide timely evidence to inform appropriate responses and adaptation during times of crisis. M&E can play a crucial role during times of crisis, but it has to be fit for context and purpose. The COVID-19 pandemic and subsequent lockdowns led to a great increase in working from home and a reduction in travel, and created new complexities for M&E with respect to collecting data, engaging with programme stakeholders and providing/facilitating feedback when face-to-face modalities are not safe or viable. Some of the examples/case studies drawn on in this theme will have a specific education sector focus. We have identified three potential challenges:

Topic 2.1: Designing a Rapid Evaluation to meet a real need (possibly more than one team depending on demand)

The COVID-19 crisis has highlighted the need for rapid evaluations that provide quick answers to policy makers. This type of responsive evaluation should become part of the evaluation toolkit going forward. The aim of this group is to design a rapid evaluation to meet a real need, developing an evaluation design and plan including: the focus/purpose of the evaluation; evaluation questions; sampling strategy; data collection and analysis methods, and how these will be applied to answer the evaluation questions; and data collection instruments (piloted if possible). It is possible that this evaluation will be undertaken following the hackathon.

Topic 2.2: Virtual M&E

Virtual engagement and technologies are now part of the M&E landscape and are undoubtedly here to stay. The task of this team is to advise M&E professionals in South Africa on how to undertake monitoring and evaluation virtually when face-to-face engagement is not possible due to safety, time, geographical and other constraints. The team will produce practical guidelines on how to undertake M&E virtually - for NGOs, government and evaluators - which may include: when virtual M&E is recommended and when it is not; minimum requirements for an organisation to undertake virtual M&E; minimising data use and facilitating access to data; guidance on sampling, data collection, data analysis, data protection and privacy; and good practice case studies.

Topic 2.3: Framework and tools for M&E of emergency crisis funds

Crisis events require a rapid response across stakeholders and sectors. Government responded rapidly to the onset of the COVID-19 pandemic, locking down the country and establishing the Solidarity Fund to coordinate donations and their disbursement. Private sector initiatives include the R1 billion Sukuma Relief programme and the South African Future Trust, also endowed with R1 billion. Civil society also sprang into action with initiatives such as Community Action Networks (CANs), which emerged as volunteer-led, grassroots, neighbourhood-level community responses to COVID-19 and beyond. The task of this team is to develop a framework and tools for monitoring an emergency/crisis fund. These may include: identification of appropriate criteria which can be used for M&E of an emergency/crisis fund; development of a framework (logic, indicators, reflection events, instruments (see below), timing, persons involved, and uses of monitoring information); and identification/development of tools/instruments (piloted if possible) which can be used for monitoring and reporting on an emergency/crisis fund.

Theme 3 Made in Africa Evaluation – developing approaches to M&E rooted in African Indigenous Knowledge Systems

The M&E landscape has been led and dominated by evaluation approaches, theories, frameworks and practices coined in the Global North, whether in academia or in actual practice. Unfortunately, the African education systems responsible for creating and sharing knowledge (evaluation being no exception) are still dominated by the Western episteme and have done very little to build on African Indigenous Knowledge Systems (AIKS). How AIKS can transform the approach to evaluations and evaluation systems in South Africa, and in Africa at large, is not well understood or documented. We refer to this as Made in Africa Evaluation (MAE). The current attempt to unpack what an MAE approach can offer to evaluations and evaluation systems is necessary but not sufficient without understanding AIKS, and what they offer, in some depth. To understand MAE, it is key first to understand the conceptual meaning of AIKS and how it operates, and then to consider how this could inform our use of evaluations and evaluation systems going forward. Two topics have been identified under this theme.

Topic 3.1: Conceptual understanding of AIKS and the process of application

This group will seek to develop a knowledge base relevant to M&E, drawing on the rich diversity of African indigenous knowledge, and identify how it can inform the future development of MAE in South Africa and Africa more widely.

Topic 3.2: Case studies where AIKS have been applied in research or evaluation

This group will focus on identifying cases where people have identified elements of African indigenous knowledge and sought to apply them in practice - in research, in facilitation and in evaluation, possibly even from the planning stage. This will provide a set of approaches, as well as a database of actual examples which can be drawn on in future work. A set of immediately available methods may be identified, as well as areas for further work.

Theme 4 Other practical applications of M&E

The fourth theme seeks to synthesise existing information and develop practical tools which will be useful for M&E practitioners, as well as for students and academics conducting research on M&E. Some of the examples/case studies drawn on in this theme will have a specific education sector focus.

Topic 4.1: Development of an M&E Evidence Map

To further the M&E field and produce robust evidence, specific research is required on M&E theory and practice. The task of this team is to contribute towards the development of a map of available evidence on M&E theory. Available evidence and potential gaps will be presented visually on an evidence map, enabling M&E researchers to focus on the identified gaps. The team will review available studies in an existing Early Grade Mathematics evidence map. The hackathon event will (re)map the available studies in the evidence map against adopted criteria for an M&E evidence map, such as the type of evidence or the evaluation method employed in each study. The team will also develop an outline guideline which SAMEA, the Department of Planning, Monitoring and Evaluation (DPME) and other stakeholders could use to inform further similar evidence mapping initiatives and contribute towards an integrated M&E evidence map. They will identify possible gaps in the first-round evidence map where relevant theoretical or applied research should be encouraged. As an integrated evidence map unfolds, it can identify possible gaps in current M&E research that can be prioritised in future evaluation agendas.

Topic 4.2: Developing tools for strengthening the link between M&E and planning/budget processes

It is critical that evidence generated from M&E (as well as from other sources, such as research) is used to inform planning and budget decisions. In this way, considerations of the effectiveness, efficiency, impact and sustainability of interventions can inform policy and practice. In practice, however, there are few well-developed tools or mechanisms for doing this. One example that can be drawn on is the Socio-Economic Impact Assessment System (SEIAS) in the Presidency, which has developed an evidence guide to inform the development of legislation and regulation. This topic will review existing tools used internationally in the government, private and NGO sectors, and seek to provide some good models which can be applied in the government and NGO sectors in South Africa.

This Evaluation Hackathon is brought to you by SAMEA in partnership with: