My 5 key take-aways from Covid-19 employment diagnostics

Source: ILO

Since the beginning of the Covid-19 pandemic, countless studies have been conducted to analyze the employment-related impacts of the crisis. These studies often differed considerably from employment and labour market assessments in pre-crisis years. What lessons can we learn from those assessments for future employment diagnostics in crisis and non-crisis times?

By Kevin Hempel | September 2022
What can the last 2.5 years teach us about analyzing labour market dynamics in crisis times?

Originating as a health crisis in early 2020, the Covid-19 pandemic quickly affected economies and labour markets across the globe. Since then, governments, social partners and development agencies have conducted countless employment-related diagnostic studies to measure the impact of the crisis on firms and workers, understand who has been most affected, and explore which policies can best address the labour market impacts of the crisis.

To reflect on the wealth of diagnostic studies carried out since the beginning of the pandemic, GIZ and ILO commissioned a study that was published earlier this year. The study sought to review the range of employment assessments, shed light on the different data and information sources used, and highlight implications for future employment diagnostics and crisis-resilient labour market information systems.

Below, I reflect on my key lessons learned on Covid-19-related employment assessments, drawn from reviewing these studies and talking to people across multiple organizations who commissioned or wrote them:

1) Strong labour market information systems (LMIS) matter in crisis times.

A major lesson that emerged during the pandemic is that countries with stronger existing LMIS were better positioned to conduct timely, high-quality diagnostics on the crisis. For instance, when the pandemic started, South Africa launched the Coronavirus Rapid Mobile Survey, which built on the existing National Income Dynamics Study, a national household panel survey running since 2008, and therefore did not need to set up data collection efforts from scratch. Similarly, countries that already used some remote surveying prior to the pandemic adapted more quickly when face-to-face surveying was disrupted. Hence, strengthening labour market information systems in non-crisis times should be high on the agenda in the many countries where these systems are still weak.

2) Non-traditional data sources are here to stay.

One major development during the pandemic was the widespread shift to remote surveys (mainly by phone), driven by (i) the inability to collect data face-to-face because of Covid-related containment measures and (ii) the demand for rapid information. Another major development was the increased use of big data, drawing on online job portals and proxies for economic activity (e.g., Google mobility data, Google Trends, electricity data, satellite/remote sensing). A major appeal of big data is that it provides high-frequency, real-time information that can be used in the absence of other labour market information or can complement other sources for a richer picture of economic and labour market dynamics. Clearly, these new sources are not perfect (e.g., limitations in terms of representativeness) and cannot replace traditional sources of labour market information such as labour force surveys. But they have clearly grown in acceptance during the pandemic and will continue to be an important source of labour market information in the future.

3) Proper combination and sequencing of different methods yields a richer picture.

Employment diagnostics during the pandemic drew on a wide range of data and information sources: national survey data, ex-ante vulnerability analysis (identifying sectors and population groups particularly vulnerable to the crisis), different types of modelling techniques, ad-hoc surveys of households and enterprises, big data, and policy response data. Since each method and data source comes with its own limitations, organizations such as the ILO and the World Bank have carried out various studies (for example, this ILO study on Serbia) that successfully combined multiple sources of information, either within the same diagnostic or by launching several complementary studies. Proper sequencing of diagnostics has also proven important, given the urgency for answers as a crisis unfolds. For instance, ex-ante vulnerability analysis of sectors and population groups or economic modelling could provide quick answers with limited resources in the absence of other data. While this early-stage analysis was underway, stakeholders could prepare additional data collection efforts (e.g., firm and household surveys) to provide more detailed and reliable information as the crisis unfolded.

4) Crisis diagnostics need to be designed as a marathon, not as a sprint.

During the early stage of the pandemic, virtually everyone conducted a study. As the crisis continued, however, only a few organizations were able to maintain systematic efforts to understand the evolving impacts of the pandemic. Most of us simply didn’t anticipate how long this crisis could last. Covid-19 taught us that pandemic-related crises are extremely dynamic, and that employment diagnostics must therefore be thought of as a process rather than a one-off exercise. Diagnostics must be able to capture the changing situation and inform policy responses over time. In this regard, studies like the World Bank’s high-frequency household phone surveys or Business Pulse surveys can serve as examples for the future.

5) Partnerships should be explored.

Collaboration on employment-related diagnostics can both improve the quality of the research and strengthen the foundation for evidence-based policy dialogue. Collaborations can take many forms. For example, stakeholders or projects with similar information needs (e.g., on household impacts or firms’ coping strategies) can pool money and skills to conduct joint studies, allowing for higher-quality assessments. Joint analysis can also help foster policy influence by speaking with one voice at times when policymakers must make important decisions with little time. There have also been good examples of cooperation among researchers who developed joint data collection instruments to measure crisis impacts in a consistent way. Finally, partnerships with technology and social media companies have opened up promising new sources of information. For instance, the Development Data Partnership emerged to facilitate the use of private sector data for public benefit.

These were some of my key take-aways from our study. For more detailed findings and recommendations, check out the full report. If you would like to share feedback or tell me about your own work in this area, send me an email.
About the author:

Kevin Hempel is the Founder and Managing Director of Prospera Consulting, a boutique consulting firm working towards stronger policies and programs to facilitate the labor market integration of disadvantaged groups. You can follow him on LinkedIn and Twitter.