Trust & Safety · Full-time · Remote

Trust & Safety Lead

Lead Fluxer's trust and safety function, setting policy direction, managing escalations, and building a safe platform for all users.

Posted March 1, 2026

About the role

We're looking for a trust and safety lead to define and run Fluxer's approach to platform safety. You'll set policy direction, handle complex escalations, and build the processes and tooling that protect users. This is a leadership role with direct impact on product, policy, and operations.

Safety is core to Fluxer. As an open source platform, we face specific challenges around abuse, content moderation, and user protection. We need someone who can turn those challenges into clear policies, practical workflows, and consistent decisions.

What you'll do

  • Define and maintain Fluxer's content moderation policies and enforcement guidelines
  • Lead the trust and safety team, including hiring, mentoring, and performance management
  • Handle high-severity escalations involving abuse, illegal content, and imminent harm
  • Build and improve workflows for content review, appeals, and user reports
  • Partner with engineering to develop safety tooling and automated detection systems
  • Work with legal to ensure compliance with regulations such as the EU Digital Services Act
  • Produce transparency reports and communicate safety metrics to leadership
  • Work with external organisations (law enforcement, NGOs, and industry groups) where required

What we're looking for

  • 3+ years' experience in trust and safety, content moderation, or online safety at a technology company
  • Demonstrated experience leading a team or function
  • Strong understanding of online abuse vectors, including harassment, CSAM, spam, scams, doxxing, and extremist content
  • Experience drafting and enforcing content policies at scale
  • Strong judgement under pressure and the ability to make defensible decisions quickly
  • Excellent written and verbal communication skills
  • Familiarity with legal frameworks relevant to online platforms (DSA, GDPR, DMCA)

Nice to have

  • Experience working at a messaging or social platform
  • Background in law enforcement, child safety, or digital forensics
  • Experience with automated content moderation and detection tools
  • Understanding of open source community dynamics and self-hosted platform considerations
  • Additional language skills beyond English

How to apply

Send your CV and a short note about your experience and approach to platform safety to careers@fluxer.app. We read every application ourselves.


We aim to respond to every applicant within 30 days, though as a small team we may occasionally take a little longer.
