A new technology can pinpoint victims of intimate partner violence four years earlier than other detection systems and with 80% accuracy.
The Automated Intimate Partner Violence Risk Support System (AIRS) uses clinical history and radiologic data to identify patients seen in the emergency room who may be at risk for intimate partner violence (IPV). Developed over the past five years, AIRS has been rolled out to the emergency department at Brigham and Women’s Hospital in Boston as well as surrounding primary care sites. The tool has also been validated at the University of California, San Francisco Medical Center and is being evaluated by the Alameda Health System for its role in clinical workflow.
“Data labeling quality is a huge concern—not just with intimate partner violence care, but in machine learning for healthcare and machine learning, broadly speaking,” says cofounder Irene Chen. “Our hope is that with training, clinicians can be taught how to spot intimate partner violence—we are hoping to find a set of cleaner labels.”
AIRS is an AI-based program that runs on electronic health record (EHR) data. It takes an individual patient’s EHR data, including past radiographic imaging results and clinical history, and uses an algorithm to evaluate the patient’s risk for, and the severity of, IPV. This is followed by a “silent study,” in which the algorithm makes care recommendations for patients it has flagged as likely victims so that its performance can be assessed.
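The article does not disclose AIRS’ internal model, but the workflow it describes (pull a patient’s EHR features, compute a risk score, and log recommendations during a silent study) can be sketched roughly as follows. Every field name, signal, weight, and threshold below is a hypothetical stand-in for illustration, not AIRS’ actual logic.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PatientRecord:
    """Hypothetical EHR slice; the fields are illustrative, not AIRS' schema."""
    patient_id: str
    diagnosis_codes: List[str]      # e.g. ICD-10 codes entered for billing
    radiology_findings: List[str]   # free-text impressions from past imaging
    prior_er_visits: int


def ipv_risk_score(record: PatientRecord) -> float:
    """Toy risk score: a weighted tally of a few illustrative signals.

    AIRS presumably uses a trained model over far more features; this
    stand-in only shows the shape of the computation.
    """
    score = 0.0
    if any("ulna fracture" in f.lower() for f in record.radiology_findings):
        score += 0.4  # defensive forearm injury, per the article
    if any(code.startswith("T74") for code in record.diagnosis_codes):
        score += 0.5  # ICD-10 T74.* covers confirmed adult abuse/maltreatment
    score += 0.05 * min(record.prior_er_visits, 5)
    return min(score, 1.0)


def silent_study_log(record: PatientRecord, threshold: float = 0.6) -> Optional[str]:
    """During a 'silent study', recommendations are recorded for later
    evaluation rather than surfaced to clinicians."""
    risk = ipv_risk_score(record)
    if risk >= threshold:
        return f"{record.patient_id}: risk={risk:.2f} -> recommend social-work referral"
    return None


if __name__ == "__main__":
    rec = PatientRecord("pt-001", ["T74.11XA"], ["Midshaft ulna fracture, left"], 3)
    print(silent_study_log(rec))  # pt-001: risk=1.00 -> recommend social-work referral
```

In a real deployment, the hand-set weights would be replaced by a model trained on labeled EHR data, and the “silent” output would flow into an evaluation log rather than being printed.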
Both the radiological data and the patient clinical histories have been revealing for Chen and her cofounder Bharti Khurana. For instance, Chen shared that many victims of intimate partner violence detected through AIRS had a broken ulna (a bone in the forearm), a defensive injury. When questioned, these patients often said they had fallen, but a real fall typically triggers the instinct to catch oneself, which is more likely to break the wrist than the ulna. An ulna fracture instead suggests the forearm was raised against an attack from above, which made it a strong indicator of IPV.
AIRS’ clinical data comes from three sources: patient diagnosis codes (usually intended for billing purposes), records of whether patients opt in to hospital resources correlated with IPV (such as social workers or legal assistance), and direct interviews with clinicians, whose expertise is incorporated into the algorithm. The latter source can be the most challenging, as many physicians, despite years of training, are never taught to spot IPV. This gap in training is significant: a May 2024 study published in Nature found that only 25% of IPV cases are correctly diagnosed, underscoring the need for more systematic detection methods like AIRS.
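The article does not say how these three sources are combined into training labels. One plausible, purely illustrative scheme is to treat clinician judgment as authoritative when it exists and fall back to proxy signals from diagnosis codes and resource opt-ins otherwise; the code sets, resource names, and precedence rule below are assumptions, not AIRS’ actual labeling pipeline.

```python
from typing import Optional, Set

# Hypothetical stand-ins for the three label sources the article describes.
IPV_DIAGNOSIS_CODES: Set[str] = {"T74.11XA", "T76.11XA"}            # billing codes tied to abuse
IPV_LINKED_RESOURCES: Set[str] = {"social_work", "legal_advocacy"}  # opt-in hospital resources


def proxy_label(diagnosis_codes: Set[str],
                resources_used: Set[str],
                clinician_review: Optional[bool]) -> Optional[bool]:
    """Fuse the three sources into one (noisy) training label.

    The precedence here is an assumption: a clinician's judgment, when
    available, overrides the proxy signals from codes and opt-ins.
    """
    if clinician_review is not None:
        return clinician_review
    if diagnosis_codes & IPV_DIAGNOSIS_CODES:
        return True
    if resources_used & IPV_LINKED_RESOURCES:
        return True
    return None  # no evidence either way; leave the record unlabeled


# Example: no clinician review, but the patient opted in to legal advocacy.
print(proxy_label(set(), {"legal_advocacy"}, None))  # True
```

Chen’s point about label quality applies directly here: billing codes and resource opt-ins are noisy proxies, which is why cleaner clinician-sourced labels are the goal.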
Suzanne Freitag, director of the Ophthalmic Plastic Surgery Service at Massachusetts Eye and Ear, who has decades of experience treating victims of IPV, cautions against treating AI as a “magic font of knowledge” that can replace a clinician’s training. While she believes in the “pattern recognizing” power of AIRS, a hallmark of radiology, Freitag remains cautious about using patient clinical history as a “ground truth” for IPV diagnosis.
“I try to be careful not to stereotype because domestic violence happens to people of all ethnicities, socioeconomic statuses, sexualities, and education levels,” Freitag says.
Chen and Khurana, for their part, believe that AI can help eliminate implicit biases and provide a clearer diagnosis of IPV victimization. The two first connected in February 2020, when Khurana saw Chen (then a PhD candidate in electrical engineering and computer science at MIT) deliver a talk on algorithmic bias in medicine at Harvard. Shortly afterward, Khurana, a radiology professor at Harvard Medical School, approached Chen about collaborating to apply machine learning to intimate partner violence detection. Five years and a $3.2 million National Institutes of Health grant later, Chen and Khurana have not only built and validated AIRS but are now working to expand its implementation across multiple hospital networks.
Social work staff have also been crucial to the implementation of AIRS at Brigham and Women’s Hospital, where Chen and Khurana have tapped into records from an existing program called Passageway. A free, voluntary, and confidential resource, Passageway connects patients affected by IPV with social workers and legal representation. Chen, while not blind to the imperfections of implementing machine learning in clinical settings, feels “optimistic” about AIRS’ implementation. She points to a 2023 Pew Research Center study, which found that 38% of a sample of 10,000 Americans believed AI would improve patient outcomes. Skepticism of AI remains widespread, but Chen feels it is important not to surrender to it.
Annie Lewis O’Connor, a nurse practitioner and founder of the Coordinated Approach to Resilience and Empowerment Clinic at Brigham, believes that clinicians and technology must work in tandem to care for patients experiencing intimate partner violence. O’Connor, who assists with AIRS’ prediction model, appreciates its help in assessing “IPV risk and severity” as well as in providing “clinical decision support.”
“To understand the usability, feasibility, and application of AI and machine learning tools, we must be diligent in gathering evidence on outcomes from the use of such tools,” O’Connor writes in an email. “[AI] is something that complements and assists the clinician in the care and treatment of patients experiencing IPV.”