
Data Annotator (Canada)

Jobright.ai
Canada
Full-time
Applications go directly to the hiring team

Full Description

Jobright is a next-generation AI job search platform built to make career navigation faster, smarter, and more personal. We are looking for a Data Annotator to craft the high-quality datasets that teach our AI agents how to read resumes, understand job markets, and guide people toward better careers.

Why Join Us

* Your labeling decisions become the ground truth that defines how our AI agents behave in production

* Watch agent quality climb week over week as the datasets you build feed directly into the next model iteration

* Sit close to the applied AI and research teams, with a clear path to grow into AI engineering or ML roles down the line

Responsibilities

* Annotate and review resumes, job postings, agent transcripts, and other text data that shape how AI agents learn and improve

* Work through ambiguous cases with judgment and care, surfacing patterns and gaps that help the team tighten guidelines and reduce future errors

* Team up with applied AI engineers to investigate where agents are failing, then design targeted labeling efforts that close those gaps

* Contribute to the playbooks, quality audits, and reviewer workflows that keep annotation output consistent as the team and dataset grow

Qualifications

Required

* Recent grad or early-career professional with 0 to 2 years of experience in annotation, content moderation, research, editorial work, or a similar field

* Clear writer and communicator who can articulate why a label is correct and raise good questions when the guidelines don't cover a case

* Meticulous eye for detail and steady judgment under ambiguity, paired with a working sense of how labeled data influences AI and ML model behavior

Preferred

* Internship or project experience in data annotation, linguistics, qualitative research, or content operations at a tech or AI-focused company

* Comfort moving through large volumes of nuanced material without losing accuracy, especially when the subject matter is unfamiliar at first

* Familiarity with annotation platforms such as Label Studio or Scale, basic SQL or spreadsheet skills, and hands-on experience reviewing LLM outputs or prompt-based workflows
