The quality of a job depends not only on the job itself but also on the transition opportunities and career paths it opens up. However, limited by scarce data and restrictive models, prior research on labor market transitions has focused on transitions over short periods rather than over careers. We fill this gap by extending transformer neural networks to model sequences of jobs. Sequences of jobs differ from sequences of language, for which the transformer model was initially developed, so we modify the model in two ways: we enable two-stage prediction, first predicting whether an individual changes jobs and then predicting the specific occupation, and we incorporate covariates into the transformer architecture. We train our model on a dataset of 24 million American career trajectories collected from resumes posted online. The transformer, which conditions on all jobs in an individual's history, yields significant gains in predictive performance over a Markov baseline, and our modifications add substantially to this gain. We demonstrate two use cases of our model: inferring long-term wages associated with starting in various jobs and imputing intermediate jobs between a pair of known jobs.
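To make the two modifications concrete, the sketch below shows one way a causal transformer over job sequences could add a two-stage head (change-or-stay, then occupation) and fold integer-coded covariates into the input representation. This is an illustrative assumption-laden sketch in PyTorch, not the authors' released implementation; the class name, layer choices, and dimensions (e.g., TwoStageCareerTransformer, d_model=256) are hypothetical.

```python
import torch
import torch.nn as nn

class TwoStageCareerTransformer(nn.Module):
    """Sketch: causal transformer over job histories with a two-stage head."""

    def __init__(self, n_occupations, n_covariate_levels,
                 d_model=256, n_heads=4, n_layers=4, max_len=64):
        super().__init__()
        self.job_embed = nn.Embedding(n_occupations, d_model)
        # Covariates (e.g., integer-coded year or education) are embedded and
        # added to the job embedding, unlike a plain language-model transformer.
        self.cov_embed = nn.Embedding(n_covariate_levels, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Stage 1: will the individual change jobs at the next step?
        self.change_head = nn.Linear(d_model, 1)
        # Stage 2: if they change, which occupation comes next?
        self.occupation_head = nn.Linear(d_model, n_occupations)

    def forward(self, jobs, covariates):
        # jobs, covariates: (batch, seq_len) integer codes
        batch, seq_len = jobs.shape
        positions = torch.arange(seq_len, device=jobs.device)
        h = self.job_embed(jobs) + self.cov_embed(covariates) + self.pos_embed(positions)
        # Causal mask: each position conditions only on the jobs before it.
        causal_mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=jobs.device),
            diagonal=1,
        )
        h = self.encoder(h, mask=causal_mask)
        change_logit = self.change_head(h).squeeze(-1)  # logit of P(job change)
        occupation_logits = self.occupation_head(h)     # logits over occupations
        return change_logit, occupation_logits

# Hypothetical usage with dummy data:
model = TwoStageCareerTransformer(n_occupations=330, n_covariate_levels=50)
jobs = torch.randint(0, 330, (8, 10))
covs = torch.randint(0, 50, (8, 10))
change_logit, occ_logits = model(jobs, covs)
```

At prediction time, one natural way to combine the two stages is to assign the no-change probability to repeating the current occupation and to spread the change probability over other occupations via the softmax of the occupation logits; the paper's exact parameterization may differ.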
Author Information
Keyon Vafa (Columbia University)
More from the Same Authors
- 2022: An Invariant Learning Characterization of Controlled Text Generation
  Claudia Shi · Carolina Zheng · Keyon Vafa · Amir Feder · David Blei
- 2022: Adjusting the Gender Wage Gap with a Low-Dimensional Representation of Job History
  Keyon Vafa · Susan Athey · David Blei
- 2022: CAREER: Economic Prediction of Labor Sequence Data Under Distribution Shift
  Keyon Vafa · Emil Palikot · Tianyu Du · Ayush Kanodia · Susan Athey · David Blei
- 2019 Poster: Discrete Flows: Invertible Generative Models of Discrete Data
  Dustin Tran · Keyon Vafa · Kumar Agrawal · Laurent Dinh · Ben Poole