IMPES

Principle Seven

Validate What You Measure

To ‘validate’ means to provide evidence for your data. This also matters when you consider how to communicate that data to your partners or to a public audience. When working with diverse groups of entrepreneurs, and particularly within emerging economies, this is sometimes easier said than done.

You do not need to achieve and measure everything. Be clear on what your intended impact is, prioritise the data that indicates this success, and define how you can adequately validate it with data, records, or other information.

7.1 Be transparent about the data you collect, how you validate it, and your limitations

Be transparent about what you are measuring, what data you are collecting, and how you are calculating this data. This is important to ensure that your team is validating and analysing this data in the same way, using the same calculations. It is also important when communicating your data with key stakeholders, such as funding partners. If there are limitations to your data collection, clearly outline them and acknowledge them as part of your MEL practices. 

For example, if you are measuring the revenue growth of a business:

  • Clearly define what data you are collecting to demonstrate ‘growth’ 
  • Outline how you are calculating revenue growth (e.g. time periods of data collection; reported revenue from the baseline survey subtracted from reported revenue from the endline survey)
  • Acknowledge any limitations to the data. For example, if data is collected at the baseline and endline of a six-month incubator program, ensure a clear comparison and validation of revenue growth by conducting follow-up surveys over the same time periods for the following one to two years. 
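
The calculation described in the bullets above can be made explicit and reproducible so that everyone on the team computes growth the same way. The following is a minimal sketch; the function name and field names are illustrative assumptions, not part of the IMPES guidance itself.

```python
# Sketch of a documented, shared revenue-growth calculation: endline revenue
# minus baseline revenue, with percentage growth relative to the baseline.
# The figures used below are illustrative only.

def revenue_growth(baseline_revenue: float, endline_revenue: float) -> dict:
    """Return absolute and percentage revenue growth over the program period."""
    absolute = endline_revenue - baseline_revenue
    percent = (absolute / baseline_revenue * 100) if baseline_revenue else None
    return {
        "absolute": round(absolute, 2),
        "percent": round(percent, 1) if percent is not None else None,
    }

print(revenue_growth(500.00, 701.60))
```

Writing the calculation down once, rather than leaving it implicit in each survey analysis, also makes it easier to state limitations (e.g. that the percentage is undefined when baseline revenue was not reported).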
7.2 Measure less but go a step further (dive deeper)

Consider how you can best analyse and learn from the data you collect. Rather than collecting data against many different outcomes and indicators, you may choose to narrow down and prioritise.

For example, a deeper dive into ‘job creation’ as an indicator might be to consider: 

  • Whether the jobs created are full-time and permanent/secure (not seasonal)
  • Whether the jobs created pay a fair minimum wage
  • The percentage of men and women employed
  • The percentage of people with disabilities employed
  • Whether salaries between men/women, indigenous/non-indigenous, and people with/without a disability, are equal and comparable
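
Going deeper in this way usually means recording each job as a structured record rather than a single count. A small sketch of what that could look like follows; the field names and sample records are assumptions for illustration.

```python
# Hypothetical per-job records supporting the disaggregated questions above:
# job security, wage fairness, gender, and disability inclusion.
# Sample data is invented for illustration.

jobs = [
    {"permanent": True,  "pays_fair_wage": True,  "gender": "F", "has_disability": False},
    {"permanent": False, "pays_fair_wage": True,  "gender": "M", "has_disability": True},
    {"permanent": True,  "pays_fair_wage": False, "gender": "F", "has_disability": False},
]

total = len(jobs)
summary = {
    "permanent_share":  sum(j["permanent"] for j in jobs) / total,
    "fair_wage_share":  sum(j["pays_fair_wage"] for j in jobs) / total,
    "women_share":      sum(j["gender"] == "F" for j in jobs) / total,
    "disability_share": sum(j["has_disability"] for j in jobs) / total,
}
print(summary)
```

A structure like this means "jobs created" can still be reported as one number, while the deeper breakdowns remain available for learning and validation.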
7.3 Be culturally sensitive to validation measures

Understand and be respectful of potential limitations that some entrepreneurs may face when providing you with data. Consider whether some questions are culturally appropriate to ask, and/or whether you should ask certain questions using a different format, such as focus group discussions, to make people feel more comfortable.

7.4 Integrate data collection into your program delivery

Building data collection into program delivery not only assists with validation, but it can also help to reduce the time and pressure on entrepreneurs to respond to many different surveys and questions. For example, if conducting a financial management workshop, you might ask entrepreneurs to bring financial records with them to the workshop. When building out cash flow or other financial forecasts using this information, consider asking entrepreneurs whether you can use this data for your own MEL and data validation.

7.5 Collect data over different timeframes to gain deeper insights

Collecting data using both qualitative and quantitative methods can provide deeper insights into what the data is telling you and provide further validation. Collecting data over a longer period and at different points in time can also tell you whether the change is long-term or only occurred at a certain point in time. For example, many businesses are affected by seasonal market changes. Collecting data at different times, such as during both high and low seasons and/or over a period of several years, can tell you whether this data is painting an accurate picture of impact.
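
One practical way to apply this is to compare like with like: the same season across different years, rather than a low-season baseline against a high-season endline. The sketch below illustrates this with invented figures; the data layout and function are assumptions, not a prescribed method.

```python
# Comparing the same season across years, so a seasonal peak is not
# mistaken for long-term growth. All figures are illustrative.

monthly_revenue = {
    (2022, "high_season"): 900.0,
    (2022, "low_season"): 300.0,
    (2023, "high_season"): 990.0,
    (2023, "low_season"): 360.0,
}

def same_season_growth(data: dict, season: str, year_from: int, year_to: int) -> float:
    """Percentage revenue growth for the same season, year on year."""
    before = data[(year_from, season)]
    after = data[(year_to, season)]
    return (after - before) / before * 100

print(same_season_growth(monthly_revenue, "high_season", 2022, 2023))
print(same_season_growth(monthly_revenue, "low_season", 2022, 2023))
```

Here both seasons grew by a similar margin year on year, which is stronger evidence of genuine growth than a single baseline-to-endline jump that happens to span a seasonal peak.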

7.6 Analyse and embed your validated data to ‘Learn’ and make improvements

Valid data is only the first part of the story. Remember that the ‘L’ in MEL stands for ‘Learning’. Once you have valid data that tells you how you are performing in different impact areas, it is helpful to use that data to make decisions and improvements. Think about the decisions you regularly make without enough information. For example, how much time should you spend on a topic in a workshop to improve entrepreneur outcomes, or how should you distribute resources? Developing a plan for how to use your data also helps you to ‘measure what matters’ and not waste time collecting unnecessary data. You can start by listing all the decisions that you would feel more confident about making if you had impact data available.

Case Study - SHE Investments

Validating data for revenue growth

A key outcome for SHE Investments (SHE) in Cambodia, an ESO focused on supporting women-led micro-small enterprises to scale, is revenue growth, leading to intended economic impact. 

For the first few years of programming, the SHE team asked entrepreneurs, “Did your revenue grow this month?”, and compared the revenue of the baseline data (when the entrepreneurs first started an incubator program) to the endline data (when they graduated seven months later).

After multiple programs and more than 50 graduates, the team started to see some inconsistencies in the data, which raised questions:

  • Baseline data showed business revenue as mostly round numbers (e.g. US$500 per month). Throughout the program the entrepreneurs were taught financial management skills, including tracking their income and expenses each month. At the end of the program, their monthly revenue was more specific (e.g. US$701.60). 
  • Some women, especially those in industries significantly affected by seasons (such as agriculture and tourism) experienced very high revenue growth during the program. This looked positive, but was it accurate and realistic? 
  • During high points of revenue, seasonal businesses also created more jobs. However, during low seasons, the number of jobs decreased. Were these jobs full-time or permanent? Or were they seasonal labourers hired temporarily? 
  • Some women were hesitant to share their financial data with the SHE team. All data was collected via surveys and interviews; however, some women did not answer the question about their revenue or did not want to show any financial records. The SHE team therefore relied on the women’s honesty and accuracy. 
  • Some women reported high increases in revenue but no other changes (such as job creation). So what were they using the money for? Were other areas of the business growing, or was this money being used for other means (such as family or socially impactful initiatives)? 
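
The round-number pattern in the first bullet can even be flagged automatically. The heuristic below is an assumption for illustration, not SHE’s actual method: figures that are exact multiples of 50 are treated as probable estimates rather than tracked records.

```python
# A simple, assumed heuristic for flagging likely-estimated revenue figures:
# round numbers (exact multiples of 50) suggest a guess, while specific
# figures suggest tracked records. Threshold of 50 is an arbitrary choice.

def looks_estimated(amount: float) -> bool:
    """Flag values that are exact multiples of 50 as probable estimates."""
    return amount % 50 == 0

print(looks_estimated(500.00))   # a baseline-style round figure
print(looks_estimated(701.60))   # a specific, tracked-looking figure
```

A flag like this cannot prove a figure was guessed, but it can prompt a follow-up conversation or a check against financial records before the number is validated.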
Using learnings to improve data validation

The SHE team used the lessons from these inconsistencies and questions to change the way they collected and validated data. This took time, patience, re-training of the team, and reintegration of new MEL processes into programming and workshops. 

Some key steps the organisation took were:

  • Changed data collection points: Conducting follow-up with entrepreneurs to collect and measure data at different points in time, over a longer term (e.g. program baseline and endline, as well as annual follow-ups, to compare any changes in data collected during the same periods annually). 
  • Understood when entrepreneurs were ready to provide accurate data: Collecting ‘baseline’ data related to finances during or after financial literacy workshops, to ensure that entrepreneurs had started to accurately track income and expenses (and were not guessing the numbers).
  • Introduced new tools for entrepreneurs: Developed additional, accessible tools and resources for entrepreneurs, both to help them record their financial data and to easily share it with SHE via generated financial reports (i.e. the KOTRA Riel app). 
  • Conducted team training: Trained MEL and other team members to understand who conducts data collection (workshop facilitators and field staff), who analyses it (the MEL team), and who uses the learnings to change and adapt programming (the whole team, led by the Leadership team). 

THE PRINCIPLES

A set of living, open source Guiding Principles for ESO Impact Measurement, led by a Community of Practice, and developed with input from key stakeholders.
