
When higher education interventions don’t work

I am currently supporting a higher education project in Tunisia and came across an interesting World Bank study considered to be the first of its kind. Final-year undergraduates were given the opportunity to graduate with a business plan instead of following the standard curriculum, and were offered 120 hours of training that included ‘most of the components that are considered best-practice for entrepreneurship education’. The optional entrepreneurship track started in 2009/10 and has been running ever since.

In the first published analysis of the short-term impacts, the authors found that:

the entrepreneurship track was effective in increasing self-employment among applicants, but that the effects are small in absolute terms. In addition, the employment rate among participants remains unchanged, pointing to a partial substitution from wage employment to self-employment. The evidence shows that the program fostered business skills, expanded networks, and affected a range of behavioural skills. Participation in the entrepreneurship track also heightened graduates’ optimism toward the future shortly after the Tunisian revolution.

A second paper, published in 2019, examined the medium-term impact using the same cohort:

The medium-term results show that the impacts of entrepreneurship education were short-lived. There are no sustained impacts on self-employment or employment outcomes four years after graduation. There are no lasting effects on latent entrepreneurship either, and the short-term increase in optimism also receded… the lack of medium-term impact holds across sub-groups based on gender, family wealth, skills or social capital.

There are several possible lessons to draw, beyond the clear difficulty of achieving lasting impact. The first is that integrating enterprise education alongside existing curricula, rather than as a separate stream, could be an effective alternative. The second, suggested by the follow-up study, is that other limitations are a bigger constraint than the nature of the training, especially access to capital (there is evidence from Nigeria, cited in the paper, of monetary grants having a long-term, positive impact). The third is the importance of continued coaching, training and mentoring beyond the initial period of study.

But what really struck me was how relatively unusual it is to come across randomised controlled trials of interventions in higher education (and especially published studies of those where the intervention did not work). Fields such as medicine abound with multi-year trials. Primary education has also seen its fair share: this year’s Nobel Prize recognised the work of Abhijit Banerjee, Esther Duflo and Michael Kremer, popularised in the excellent book Poor Economics. There are plenty of large-scale evaluations and analyses of higher education, particularly around student outcomes, but I struggle to think of large-scale, experimental interventions. My hunch is that, as recognition of the role of higher education in development and social change has increased, so too will the demand for randomised controlled trials within the field.

Failures wanted

Tunis, Tunisia

The Tunisian study is helpful as it shows that a ‘common sense’ prescription (give students business and entrepreneurial skills instead of having them write an academic thesis as part of their degree) to a commonly perceived problem (unemployable graduates) simply did not work. I’d love to see more such studies. Sharing examples of what doesn’t work through large-scale, rigorous testing can be hugely valuable, albeit with the caveat that the results may not always be generalisable to other contexts. As I see it, several things need to be in place:

  1. The basic parameters of an academic study: a control group who do not participate in the intervention, careful analysis of the context and environment, benchmarking and continued evaluation, etc.
  2. An acceptance that the intervention may fail. This is why the World Bank is perhaps better placed to fund such a study than the Tunisian government, which would be less willing or able to share the outcomes widely if the project failed, or to experiment with public funds.
  3. A longer-term (multi-year) perspective with no expectation of clear answers in the short term.
  4. A process of freely disseminating the findings and sharing what has (not) worked.
  5. A sufficiently big budget to launch and maintain a long-term effort, and to provide the capacity for effective experimentation, iteration and evaluation. The World Bank’s Tunisia Tertiary Education for Employability Project runs for over five years and commits US$70 million.

Crucially, these conditions combined distinguish a rigorous, experimental study from a public policy intervention. If you know of any other experimental, evidence-based studies in higher education – especially those that have been deemed to have not worked – please let me know by email or in the comments below.

Photos of Tunis, Tunisia from Unsplash. Credits: main image, article image.


Working for economic transformation

Last week the Overseas Development Institute published an excellent report, Working for economic transformation, on generating better jobs in low income countries. The report notes that countries need to make decisions in education policy to match labour market demand:

Promoting higher-level technical or university education, more expensive per student, can help facilitate the leap to more productive sectors. But this can come at the cost of reduced funding for junior levels of education, which might allow for a wider boost to low-productivity employment. It can also help to exacerbate inequalities in education, with tertiary places more often taken up by those from wealthier backgrounds. Increasing investment in lower levels of education tends to have a greater effect on improving attainment outcomes among the poorest. At the same time, expanding educational enrolment and attainment may reduce available funding necessary for higher-quality education, such as for school facilities, teacher training and wages. (p.28)

In other words, there is a budgetary choice to be made between strengthening schools (junior levels of education) or universities (higher levels of education), and there are clear opportunity costs for each.

In policy terms, however, education should be seen as a continuum from school to college to university, and higher education has a role to play in strengthening the stages that come before it. This role needs to be explored, expanded and encapsulated in government policy: universities can train the teachers and advise on curriculum design to strengthen junior education, and their links with businesses can ensure future skills needs are being met throughout the system. At the same time, universities can work with businesses, especially small and medium enterprises (SMEs), to support their growth and in turn their demand for highly skilled graduates. Facilitating internships and placements for students in industry further strengthens these links, and provides a technical and enterprise focus to degrees. As the report concludes:

Setting employment at the centre of development agendas requires a whole range of policies, plus coordination among these policies and of the dynamics between different sectors of the economy. Education policies need to be matched with the requirements of growing industries with productive employment potential. (p.39)

The continuum between junior and higher levels of education becomes particularly important when we consider development policy. One consequence of the Millennium Development Goals’ focus on achieving universal primary education is the pipeline of students who could potentially go on to study at university as the MDGs draw to a close this year. Treating each stage in isolation risks missing the opportunity to greatly increase the number of highly skilled jobs created worldwide in the years ahead.