To create a solution geared toward widespread adoption, Shift reviewed the lessons learned from previous work and assessed persistent gaps. The Shift approach introduces additional high-value, real-world clinical use cases, develops a reusable methodology, and builds on a standards-based foundation. Shift has prioritized four clinically informed use cases, addressed in two phases:
Use case 1: A 75-year-old man with a past diagnosis of depression and opioid use disorder wants to manage who sees what. He’s comfortable sharing some history with his primary care team, but not with third-party apps or non-clinical actors. He also wants to limit access to his behavioral health notes in some situations without blocking all medical data.
Why it matters: Older adults often live with complex health histories and deserve control over how that information is shared. This use case highlights the real-world challenges of protecting sensitive behavioral health and substance use history, especially for individuals covered under 42 CFR Part 2 and those with overlapping conditions.
Without the means to segment data granularly, today’s systems often fall back to all-or-nothing sharing, forcing patients like him to choose between privacy and care, or to risk unintended exposure. This use case shows how granular data segmentation and consent can respect privacy while supporting safe, coordinated care.
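To make this concrete, the sketch below shows one way such a preference could be expressed, assuming a FHIR R4 Consent resource and HL7 v3-ActCode sensitivity labels (PSY for psychiatric information, ETH for substance use information). The patient and app identifiers are hypothetical, and the resource is an illustration of the general standards-based pattern, not the Shift specification itself.

```python
# Illustrative sketch only: a FHIR R4 Consent in which the patient permits
# sharing in general but denies a third-party app access to resources that
# carry behavioral health or substance use sensitivity labels.
# All identifiers and code choices below are hypothetical examples.

ACT_CODE = "http://terminology.hl7.org/CodeSystem/v3-ActCode"

behavioral_health_labels = [
    {"system": ACT_CODE, "code": "PSY"},  # psychiatric information sensitivity
    {"system": ACT_CODE, "code": "ETH"},  # substance use information sensitivity
]

consent = {
    "resourceType": "Consent",
    "status": "active",
    "scope": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/consentscope",
            "code": "patient-privacy",
        }]
    },
    "category": [{
        "coding": [{"system": "http://loinc.org", "code": "59284-0"}]  # patient consent
    }],
    "patient": {"reference": "Patient/example-older-adult"},  # hypothetical id
    "provision": {
        "type": "permit",  # baseline: his clinical data may be shared with the care team
        "provision": [{
            "type": "deny",  # exception: withhold labeled sensitive data
            "securityLabel": behavioral_health_labels,
            "actor": [{
                "role": {
                    "coding": [{
                        "system": "http://terminology.hl7.org/CodeSystem/v3-ParticipationType",
                        "code": "IRCP",  # information recipient
                    }]
                },
                "reference": {"reference": "Device/third-party-app"},  # hypothetical
            }],
        }],
    },
}
```

The nested deny provision is what lets a baseline permit for the care team coexist with a narrower restriction on labeled behavioral health data; enforcement still depends on source systems applying matching security labels, which is the segmentation capability this use case calls for.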
Use case 2: Two adolescents, one of whom is a minor, wish to keep reproductive health data (including pregnancy and abortion information), genetic data, gender identity, and immigration status private from various actors, both through the patient portal and in data exchange across state lines.
Why it matters: Adolescents navigating sensitive care deserve privacy protections that reflect their unique legal, developmental, and emotional needs. These data points are highly stigmatized and legally complex, especially when shared across state lines or accessed through patient portals.
Current systems don’t allow for that level of nuance. Without granular consent and segmentation, these adolescents face exposure, risk of harm, and loss of trust in the healthcare system. This use case shows how Shift can support privacy, safety, and dignity for some of the most vulnerable patients.
Use case 3: A woman with a documented history of intimate partner violence and other health-related social needs wants to keep this information private. She’s concerned about it appearing in the patient portal, being shared across systems, or being accessed by third-party apps that are not HIPAA-compliant.
Why it matters: Information about a patient’s housing, income, food insecurity, or history of violence is critical for delivering whole-person care, but it’s also deeply personal. When this data is shared through systems not built to protect it, patients may face unintended consequences, including stigma, discrimination, or even personal safety risks.
Without granular privacy controls, sensitive social needs data can be misused or exposed—especially when shared with community platforms or mobile apps outside traditional health care. This use case demonstrates the urgent need for consent-aware systems that balance care coordination with patient safety and autonomy.
Use case 4: A mother receives care for a sexually transmitted infection during pregnancy and later experiences postpartum depression. This information is documented in connection with her newborn’s medical record. She wants to ensure this sensitive data is not visible to the child’s other parent through a shared portal, nor to her child once they reach adulthood.
Why it matters: Health records for mothers and infants are often tightly linked—especially during pregnancy and postpartum care. But when maternal health data (like STI results or mental health history) gets merged into the child’s record, it can later be viewed by unintended parties, such as the child’s other parent or even the child themselves.
Today’s systems don’t easily allow patients to control how commingled data is handled, especially when legal guardians, or later the adult child, gain access. This use case highlights the need for data segmentation and consent mechanisms that account for family dynamics, sensitive timelines, and evolving access rights.
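As a complementary sketch, the snippet below illustrates how an access layer could filter commingled records before they are exposed through a shared portal view, withholding any resource whose security labels the mother has restricted for a given viewer. The viewer categories, label codes, and helper functions are assumptions for illustration, not part of the Shift design.

```python
# Illustrative sketch only: filter commingled records by security label
# before exposing them through a shared (e.g. proxy or guardian) portal view.
# Label codes and viewer categories are hypothetical examples.

from typing import Iterable

ACT_CODE = "http://terminology.hl7.org/CodeSystem/v3-ActCode"

# Labels the mother has restricted per viewer (derived from her consent).
RESTRICTED_LABELS_BY_VIEWER = {
    "other-parent-proxy": {"STI", "PSY"},  # e.g. STI result, postpartum depression notes
    "adult-child":        {"STI", "PSY"},  # restrictions can persist into the child's adulthood
    "care-team":          set(),           # clinicians still see the full record
}

def resource_labels(resource: dict) -> set[str]:
    """Collect v3-ActCode security label codes from a FHIR resource's meta."""
    return {
        coding.get("code")
        for coding in resource.get("meta", {}).get("security", [])
        if coding.get("system") == ACT_CODE
    }

def visible_resources(resources: Iterable[dict], viewer: str) -> list[dict]:
    """Return only the resources whose labels are not restricted for this viewer."""
    denied = RESTRICTED_LABELS_BY_VIEWER.get(viewer, set())
    return [r for r in resources if not (resource_labels(r) & denied)]

# Example: a labeled STI observation is withheld from the other parent's
# proxy view but remains visible to the care team.
sti_result = {
    "resourceType": "Observation",
    "meta": {"security": [{"system": ACT_CODE, "code": "STI"}]},
}
well_child_visit = {"resourceType": "Encounter"}

assert visible_resources([sti_result, well_child_visit], "other-parent-proxy") == [well_child_visit]
assert len(visible_resources([sti_result, well_child_visit], "care-team")) == 2
```

Because the restriction set is keyed to the viewer rather than hard-coded into the record, it can change as access rights evolve, for example when the child reaches adulthood.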