Our solutions enable content moderation policies that are consistent, targeted and effective. We design technology that is optimised for freedom of expression and subject to human oversight.
We’re working with some major partners around the world who trust our technology and team to help them deal with the current infodemic.
After a major new round of investment in 2020, and having secured a series of new partners, we’re now excited to expand our team.
We’re taking a different approach to the traditional recruitment process. Rather than posting a long list of jobs and job requirements, we want to talk to smart, passionate people.
So what areas are we hiring in?
Kinzen believes in the vital importance of the 'human in the loop', ensuring expert oversight of the data informing new and improved content moderation systems.
While data analysis chops are important, this is not a traditional data scientist role.
We want to talk to people who can:
Ideally, you will have worked in social media companies and understand the needs of trust and safety teams, and/or have direct experience curating for machine learning processes.
We want to talk to people who can:
Ideally, you will have worked with journalism or research organisations. Fluency in languages other than English is preferred but not required.
We want to talk to people who can:
Ideally, you will have experience in social media companies, with an understanding of the needs of trust and safety and content moderation teams.
We want to talk to people who can:
If you think your skills, experience and expertise match some of our starting assumptions, wherever in the world you are, come and pitch us your dream Kinzen role!
Email us an outline of your pitch, with your CV attached, at aine@kinzen.com. In the subject line, make clear which area you’re interested in (Data, Research & Editorial, Sales & Account Management or Marketing).