Essays on job search and retraining
Author(s): Ben Dhia, Aicha (Aicha Lucie)
Massachusetts Institute of Technology. Department of Economics.
Advisor(s): Esther Duflo and Frank Schilbach
This thesis comprises three essays in empirical labor economics. Broadly, the essays provide evidence on the existence and the effects of information barriers in job search and retraining. Chapter 1 (coauthored with Esther Mbih) begins from the observation that little is known about how job seekers decide to enroll in a training program. Decisions related to job training may be undermined by informational gaps, especially about program costs, enrollment procedures, and expected reemployment chances. The paper reports the results of a low-cost intervention designed to test for the existence of misinformation about training costs and returns, and for its impact on enrollment. Partnering with the French Public Employment Services and the largest training provider in France, we sent 50,000 emails advertising training opportunities to job seekers in four regions of France in late summer 2016. We randomly added short messages on training costs, registration procedures, and training returns to the basic email template. A baseline survey reveals misperceptions about the financial aspects of training participation among more than half of job seekers: they believe that they need to pay to participate in a training program (45%) and/or that their unemployment benefits would be affected (30%). Further, half of respondents perceive enrollment procedures as complex or very complex. We find that receiving an email with a message emphasizing the employment returns to training more than doubles the likelihood that job seekers call back the training center. However, callback rates are low in absolute terms (less than one percent), and we detect no impact on enrollment one to six months after the intervention.
We provide suggestive evidence that the effect on callbacks is driven by the increased salience of basic information about training rather than by belief updating.

Chapter 2 (coauthored with Bruno Crépon, Esther Mbih, Louise Paul-Delvaux, Bertille Picard, and Vincent Pons) presents the results of another large-scale randomized experiment, which evaluates the impact of an online platform that helps job seekers adopt effective job search strategies. The platform combines labor market data from the French public employment agency with personal data from individual profiles to recommend to users occupations and areas with high employment chances, and to give them concrete tips to improve their job search methods. The experiment was conducted in collaboration with the French public employment agency on a sample of 212,277 job seekers from April to November 2017. An encouragement design led to a take-up rate of 26.2% in the treatment group and of virtually zero in the control group. Following individual trajectories over the 18 months after the intervention, we do not observe any impact on job seekers' search effort or search scope, whether occupational or geographical. We find modest effects on search methods: job seekers using the website are more likely to rely on personal networks and to use resources provided by public employment services. However, we do not find any effect on self-reported well-being or on employment outcomes, in either the short run or the medium run, indicating that more intensive interventions are required to bring unemployment down.

Chapter 3 contributes to the debate on how to regulate the market for vocational training. Understanding the decision-making process of job seekers who benefit from public training is crucial to improving their matching with effective providers and to increasing competitive pressure on badly performing providers.
The chapter reports the results of an online survey of job seekers in France who had participated in a training program between January 2017 and April 2018. The survey aimed at understanding what they knew about the heterogeneous training providers and how they selected a center among them. I find two main results. First, job seekers use very limited information when making their choices. Only a third of respondents compare different centers before choosing one, and, to find a training provider, almost all respondents use a single source of information, which for half of them is their caseworker. Second, job seekers take into account various factors beyond the probability of finding a job. Logistical considerations such as the start date or the distance to home play a more important role than provider characteristics such as employment performance, size, or connections to firms. Taken together, these results may explain the low competitive pressure between training centers, which in turn may contribute to their low value added.
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Economics, September 2020
Cataloged from the student-submitted PDF of thesis.
Includes bibliographical references (pages 151-157).