It isn’t just famous or highly visible women who face enough online abuse to consider leaving social media. A YouGov poll commissioned by the dating app Bumble found that nearly half of women aged 18 to 24 had received unsolicited sexual images within the past year. UK Member of Parliament Alex Davies-Jones put the phrase “dick pic” into the historical record during the debate on the UK Online Safety Bill when she asked a fellow male MP whether he had ever received one. As she made clear, it is not a rhetorical question for most women.

AI-enabled intimate image abuse, in which existing images are combined or manipulated to generate new, often realistic ones (so-called deepfakes), is another weapon of online abuse that disproportionately affects women. Estimates from Sensity AI suggest that 90 to 95 percent of all online deepfake videos are nonconsensual porn, and that around 90 percent of those feature women. The technology for creating realistic deepfakes is now outpacing our ability and our efforts to combat it. What we are seeing is a perverse democratization of the ability to cause harm: The barriers to entry for creating deepfakes are low, the fakes are increasingly realistic, and the current tools for identifying and combating this abuse simply can’t keep up.

The effects of online harm against women are chilling. To see the impact, we can look to research conducted in societies where women face greater social restrictions. In a pioneering study, Katy Pearce and Jessica Vitak found that women in Azerbaijan were opting out of being online altogether: in an honor-based culture with high degrees of surveillance, the potential real-world repercussions of online harassment were simply too great. In other words, women faced an impossible double standard, unable to control their image on social media yet punished severely for it.

There are answers. Better safety-by-design measures can help people control their images and their messages. Twitter, for example, recently gave people control over how they are tagged in photos. Bumble rolled out the aptly named Private Detector, an AI-enabled tool that gives users control over which unsolicited nude images, if any, they want to see. Legislation, such as the UK’s proposed Online Safety Bill, can push social media companies to address these risks. The bill is far from perfect, but it takes a systems-based approach to regulation, asking platform companies to assess the risks their services create and to develop upstream solutions: improving human content moderation, handling user complaints better, and building systems that take better care of users.

This regulatory approach is no guarantee that women won’t log off in great numbers in 2023. If they do, not only will they miss out on the benefits of being online; our online communities will suffer, too.