Trust & Safety · Full-time · Remote

Trust & Safety Specialist

Investigate abuse reports, make enforcement calls, and shape the policies and tooling that keep the platform safe.

Future role, not open

No roles listed here are currently open. You can still apply now, and we will keep your application on file for a future hiring round.

Listed March 1, 2026

About this future role

This role is not currently open. We keep this page up to show the trust and safety work we may hire for later, and we accept applications now for a future hiring round.

When this role opens, the trust and safety specialist will investigate abuse reports, make enforcement calls, and help shape the policies and tooling that keep Fluxer's community safe. As an early hire on this team, you'll do work that runs wider than pure casework: you'll have direct input into policy direction, detection improvements, and how we handle the harder, messier escalations.

The role asks for calm judgement, emotional resilience, and genuine care for user safety. There will be times you review distressing material, and we don't take that lightly. The work also makes a real, visible difference in reducing harm, and that ought to be one of the reasons you want it.

What you'll do

  • Investigate user reports of abuse, harassment, spam, and policy breaches
  • Make proportionate enforcement decisions: warnings, content removals, suspensions, permanent bans
  • Handle appeals with clarity and empathy, even when the answer is no
  • Spot recurring abuse patterns and drive the policy or tooling change when something is systemic
  • Lead policy updates and document the reasoning behind enforcement decisions
  • Handle urgent escalations involving illegal content or imminent risk of harm
  • Document cases thoroughly enough that someone else can pick them up cold
  • Partner with engineering to improve safety tooling and detection quality

What we're looking for

  • 1+ year in trust and safety, content moderation, or a related role at a technology company
  • Deep familiarity with how online abuse actually shows up on messaging platforms
  • The ability to make consistent, fair enforcement decisions under time pressure
  • Excellent written communication for user-facing responses
  • Genuine emotional resilience, since this role involves regular exposure to distressing content, and we take that seriously
  • Methodical attention to detail when working through investigations
  • Comfort with content moderation tools and ticketing systems

Nice to have

  • Experience at a messaging, social, or gaming platform
  • Familiarity with EU content regulation (Digital Services Act, e-Evidence)
  • Working knowledge of CSAM reporting requirements and workflows (NCMEC, IWF)
  • Additional language skills beyond English
  • Experience using data analysis to find patterns in abuse signals

How to apply

Email careers@fluxer.app with your CV as a PDF (not a Word .doc or .docx file), or a publicly accessible URL to an online CV, plus a short note about your trust and safety experience. This role is not open right now, so we are not yet reviewing or replying to applications; we will keep yours on file for a future hiring round.

