The quality of a job depends not only on the job itself but also on the transition opportunities and career paths it opens up. However, limited by scarce data and restrictive models, prior research on labor market transitions has focused on transitions over short periods rather than over entire careers. We fill this gap by extending transformer neural networks to model sequences of jobs. Sequences of jobs differ from the sequences of words in natural language for which the transformer was initially developed, so we modify the model in two ways: we enable two-stage prediction, first predicting whether an individual changes jobs and then predicting the specific occupation, and we incorporate covariates into the transformer architecture. We train our model on a dataset of 24 million American career trajectories collected from resumes posted online. The transformer, which conditions on all jobs in an individual's history, yields significant gains in predictive performance over a Markov baseline, and our modifications add substantially to this gain. We demonstrate the use cases of our model with two applications: inferring long-term wages associated with starting in various jobs and imputing intermediate jobs between a pair of known jobs.
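To make the two-stage prediction concrete, the sketch below shows one way such a head could combine its outputs: a binary "job change" probability gates between keeping the current occupation and a distribution over new occupations. The function names, logit inputs, and renormalization scheme are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def two_stage_predict(change_logit, occ_logits, current_occ):
    """Hypothetical two-stage head: first decide whether a job change
    occurs, then distribute the change probability over occupations.

    change_logit : scalar logit for "individual changes jobs"
    occ_logits   : logits over all occupations (hypothetical head)
    current_occ  : index of the individual's current occupation
    """
    p_change = 1.0 / (1.0 + np.exp(-change_logit))  # sigmoid
    occ_probs = softmax(occ_logits)
    # On a change, renormalize the occupation distribution over
    # occupations other than the current one.
    occ_probs[current_occ] = 0.0
    occ_probs = occ_probs / occ_probs.sum()
    final = p_change * occ_probs
    # The "no change" mass stays on the current occupation.
    final[current_occ] += 1.0 - p_change
    return final
```

One appeal of this factorization is that staying in the same job is by far the most common outcome over short horizons, so separating "does a transition happen?" from "to which occupation?" lets each sub-model specialize.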