The Global Response Operations organisation within Global Operations responds to real-time crises, proactively identifies and evaluates emerging risks, conducts risk-related investigations, and assesses what we could be doing to best benefit our community. By understanding and consistently managing incidents and real-time crises to resolution, the organisation drives continuous improvement with partner teams across Meta.
The Detection team within Global Response Operations is focused on detecting emerging on-platform risks that pose harm to our community and businesses on our platform. We identify, investigate and understand those risks. We partner closely with other Global Response Operations teams, Policy, Product and Process teams to mitigate those risks.
We are looking for someone who has a passion for identifying risks and investigating where and how they manifest on our platforms. A successful candidate has strong investigative skills and can work in ambiguous and experimental settings - we are looking for a self-starter who is excited to help us build out our new detection team's work and processes.
This role may involve exposure to potentially graphic and/or objectionable content, including but not limited to graphic images, videos, audio and writings, offensive or derogatory language, and other potentially objectionable material, such as child exploitation, graphic violence, self-injury, animal abuse, and other content which may be considered offensive or disturbing.