This incident occurred as Meta continues to refine its content moderation policies, particularly its use of automated systems to detect harmful content. According to the company’s transparency report, Meta removed over 10 million pieces of violent and graphic content from Instagram between July and September of the previous year.