Privacy Journo Requests

Connect with journalists covering privacy. From breaking news to in-depth features, discover relevant media opportunities from top publications in this category.

Never Miss a Privacy Journo Request

Get instant alerts when Privacy journalists post new journo requests. Join the community of sources landing media opportunities daily.

DHS Employees in Vermont - Immigration Enforcement & Policy Reporting

Do you work for the Department of Homeland Security in Vermont? Seven Days wants to hear from you.

I'm Lucy Tompkins, the immigration reporter at Seven Days. If you have worked for Customs and Border Protection, Immigration and Customs Enforcement, Citizenship and Immigration Services, or other agencies under the DHS umbrella in Vermont, I'd like to hear from you. You don't need a specific complaint or allegation to reach out. Even if you're not sure what you want to share, you're welcome to get in touch for an off-the-record conversation or to ask questions about how I handle sources. I'm working on ongoing reporting about immigration enforcement and policy in Vermont.

I understand you may be taking a risk in contacting me, and I take your privacy extremely seriously. We won't publish your name or any information you share without your permission.

To get in touch, you can send an email to [email redacted]. If you'd rather talk on Signal, an encrypted messaging app, you can call or send a message to lucytompkins.55. Signal offers end-to-end encryption for sharing text, photos, videos and calls, and it is available on mobile and desktop. You can also set messages to disappear from your phone after a certain length of time. Be aware that sending things from a work computer or email account, or while connected to work Wi-Fi networks, may create added risk, because many corporate and government accounts log web traffic.

sevendaysvt.com

California Domestic Violence & Housing Providers - Funding Cuts Impact

✅ Work in the domestic violence or housing space? ✅ Provide services to survivors in California? ➡️ I want to hear from you! 🙏🏼 Please fill out our survey: https://lnkd.in/gwmkdjtn

The 19th is reporting on housing challenges for domestic violence survivors in California. We know many people stay in abusive relationships due to housing insecurity, and many women become homeless after fleeing violence. We also know critical programs are at risk of being defunded. We need your help to understand what's at stake with federal budget cuts, funding priorities and the state budget.

Who we want to hear from: Organizations that receive public funding to help people access housing, including emergency, temporary, transitional and permanent housing. If you work in domestic violence services and help survivors seek out housing, we would love to hear from you too.

How we will use this information: Survey responses will serve two purposes: to inform our reporting and to produce resources that help survivors navigate housing needs. Some questions will be shared with the California Partnership to End Domestic Violence, but there is an option to keep your responses private to The 19th.

About us: The 19th is an independent, nonprofit newsroom reporting at the intersection of gender, politics, policy and power. This project is supported by the USC Center for Health Journalism Domestic Violence Impact Reporting Fund. I am a reporter based in Los Angeles who has been covering funding challenges for domestic violence services extensively, and I am leading this effort.

Deadline: April 28, 2026. You can always reach me over email at [email redacted] or securely on Signal (username: jsmn.01) if preferred. Your privacy and security are important to us. All of the questions are optional, and you are welcome to share as much or as little as you are comfortable with. We will contact you directly if we wish to use any of your information in a story.

Read more about the project and how we will use this information on the survey form itself. Please share! THANK YOU! https://lnkd.in/gwmkdjtn

19thnews.org

Electric Vehicle Owners - Denied Access to Event & Driving Data

Four weeks. That's how long I'd had my brand-new, all-electric 2026 Toyota bZ when a driver struck my vehicle on I-405, hitting the side of the car where my toddler and my elderly mother were sitting. A vehicle equipped with dozens of sensors, cameras, and onboard AI systems that monitor everything from lane positioning to braking patterns in real time. My car knew exactly what happened: every input, every output, every millisecond of data leading up to and through the impact. I can't access any of it.

When I contacted Toyota about retrieving my vehicle's Event Data Recorder and driving data, I hit a wall that had nothing to do with technology and everything to do with policy. The car collected the data. The car used the data. But the person behind the wheel, the person whose driving generated that data in the first place, has no meaningful right to it under current U.S. law.

In Europe, this would be a different conversation. GDPR Article 15 gives individuals the right to access personal data collected about them; Article 20 gives them the right to receive it in a portable format. If my car knows everything about how I drive, European law says I have the right to see what it knows. U.S. law says almost nothing.

This isn't a niche automotive issue. It's the consumer rights question of the next decade. Every AI-enabled product we interact with (our cars, our phones, our home devices, our workplace tools) is collecting behavioral data, building models from it, and making decisions based on it. The gap between what these systems know and what they make available to the humans generating that data is growing wider, not narrower.

We talk a lot about AI transparency in this industry. Usually we mean model explainability or algorithmic bias. But there's a more fundamental layer: do you have the right to see what an AI system recorded about you? Can you access the data your own behavior generated? And if not, who does that data actually belong to?

My car knew everything and said nothing. That's not a technology problem. That's a design choice protected by a regulatory vacuum. And until we close that gap, every person interacting with an AI-enabled product is generating value they can't access, can't verify, and can't use in their own defense. This is one of several edge cases I'm exploring in a book I'm developing on AI's unresolved boundaries: https://lnkd.in/diQ4zcww

I'd love to hear from my network. Have you ever been unable to access data that was yours? A vehicle, a medical device, a fitness tracker, a workplace tool that knew more about your behavior than you were allowed to see? Or do you have thoughts on where data privacy rights need to go from here? Drop your story or perspective in the comments or shoot me a DM. I would love to connect!

UK Adults 18-40 Living at Home - Impact on Intimacy & Sex Life

Are you currently in a relationship but still living at home with your parents?

We're looking to speak with individuals who live at home with their parents and feel their living situation has impacted their sex lives and ability to be intimate, whether that's lack of privacy, awkward moments, or having to get creative when it comes to maintaining a healthy sex life. We are particularly interested in how financial pressures and living arrangements are influencing your opportunities for privacy, intimacy, and fulfilling sexual experiences. We want to hear honest, real-life experiences: the challenges, the workarounds, and how it's affected your intimate relationships.

We're looking for people who:
- Are aged 18-40 and based in the UK
- Are currently in a relationship and living at home (e.g. with parents or family), or are single and living at home
- Feel that living at home has impacted your ability to be intimate and your sex life
- Are happy to speak openly with journalists for an interview about your experience on this subject (e.g. with newspapers or magazines)
- Are comfortable being named and sharing supporting images to be used by press

In return, participants will receive a £75 voucher (retailer of choice) as a thank you for your time. If this sounds like you, or someone you know, please get in touch for more details at [email redacted] with a few details of your story and your contact information. Thanks!

CISOs & Privacy Officers - Humanoid Robots & Inference Data Governance

A $4,370 humanoid robot showed up on AliExpress. The hardware is the least interesting part.

You can now order a robot the same way you'd buy a phone case: a 4-foot, 50-pound humanoid with an onboard LLM. Ships soon. No waiting list. No enterprise contract. Just a cart and a checkout button. The tech press is writing about cartwheels and wheel kicks. I'm writing about what happens next.

The real story isn't the body. It's the layer running on top of it. The R1 comes with a multimodal LLM: voice recognition, image recognition, command processing. That means it is not just a mechanical chassis. It's a sensory endpoint. A listener. A watcher. An agent operating inside your physical space, inferencing on what it sees and hears, and feeding that data through a model. We've spent years debating the privacy implications of smart speakers and phone cameras. Those are passive by comparison. A humanoid robot in your home or lab is an always-on, spatially aware intelligence node.

"Your address tells a story. Your house tells an even better one, if you have a robot walking through it." I've spent years studying how data brokers reconstruct identity from fragments: location pings, purchase history, social graph signals. The LOCUS work we do at BHIL maps what your address reveals about you before anyone ever steps through the door. What happens to that threat model when the sensor has legs?

The democratization argument cuts both ways. "Democratization" has always had a shadow: when capability becomes cheap, it doesn't just flow to research institutions. It flows to everyone. And the LLM layer means this isn't a dumb actuator; it's a model with context, memory potential, and connectivity. We are somewhere between 18 and 36 months from humanoid robots being a normal fixture in commercial environments: warehouses, hospitals, retail, hospitality. The personal and professional data exposure surface is about to change in ways most organizations haven't started modeling.

Three questions you should be asking right now:
1. What is your data governance policy for AI-enabled physical agents in your space? Most companies have a BYOD policy. Almost none have a BYOB policy (bring your own bot).
2. Who owns the inference data? When a robot processes what it sees in your facility, what does the model retain? Where does it go? The terms-of-service conversations that defined the social media era are coming for physical AI.
3. How do you build persona-aware intelligence workflows when the data source is embodied? The frameworks we use to analyze human behavior from digital signals were built for screens. The robot is the delivery mechanism. The intelligence layer is the product. The data it generates is the asset.

We're not in the era of "should we think about this." We're in the era of "this is already in a cart." I'm curious what you're seeing in your sector. Are clients asking about this yet? Are your risk teams? Drop a comment or reach out.

UK Adults 18-40 Living With Parents - Impact on Sex Life & Intimacy

Are you currently in a relationship but still living at home with your parents? £75 for your story.

We're looking to speak with individuals who live at home with their parents and feel their living situation has impacted their sex lives and ability to be intimate, whether that's lack of privacy, awkward moments, or having to get creative when it comes to maintaining a healthy sex life. We are particularly interested in how financial pressures and living arrangements are influencing your opportunities for privacy, intimacy, and fulfilling sexual experiences. We want to hear honest, real-life experiences: the challenges, the workarounds, and how it's affected your intimate relationships.

We're looking for people who:
- Are aged 18-40 and based in the UK
- Are currently in a relationship and living at home (e.g. with parents or family), or are single and living at home
- Feel that living at home has impacted your ability to be intimate and your sex life
- Are happy to speak openly with journalists for an interview about your experience on this subject (e.g. with newspapers or magazines)
- Are comfortable being named and sharing supporting images to be used by press

In return, participants will receive a £75 voucher (retailer of choice) as a thank you for your time. If this sounds like you, or someone you know, please get in touch for more details. Email [email redacted] with PARENTS in the subject line. Thanks!

Australian ECEC Directors & Providers - Children's Data Privacy Audit

Every ECEC service in Australia collects data about children: enrolment forms, developmental records, medication, incidents, daily observations, photos. More and more of it is held not by the service itself, but by third-party platforms the service chose, set up, and handed the keys to. Most directors have no idea what happens to that data after it leaves their system. That's not a criticism. It's a structural problem. The platforms are complex, the privacy policies are written for lawyers, and nobody in the sector has sat down and actually measured what these tools do against what Australian law requires.

Guarding Little Footprints is a research project auditing the data privacy and security practices of the platforms used in Australian early childhood education and care. Each platform is assessed against the Australian Privacy Principles, the NIST Cybersecurity Framework, and OWASP application security standards, using only publicly available information. The goal isn't to name and shame. It's to give the sector (directors, approved providers, peak bodies, and regulators) an evidence base it doesn't currently have.

I'll be sharing findings here as the research progresses. If you work in ECEC, advise services, or care about children's data rights, I want to hear from you. Which platforms are you using? What questions do you wish someone would answer? Drop them in the comments. Your experience shapes where this research goes.

Charlie Health Employees & Patients - IOP Scaling & Medicaid Billing

Is this really about access, or are we just creating an assembly line?

I survived the troubled-teen industry and have spent my career researching how profit-driven systems can exploit people when they are most vulnerable. In my latest Substack, I take a close look at Charlie Health, a fast-growing virtual IOP company that has taken the industry by storm. Technology can help fill gaps, but in behavioral healthcare, scaling up always changes the care itself.

Here's what my analysis reveals:
- The MSO Mirage: This legal structure allows venture-backed companies to grow quickly while avoiding direct responsibility for clinical care.
- The Newport Blueprint: Newport Healthcare's high-volume referral strategy has now been automated and digitized.
- Care by Dashboard: Some internal reports allege that "Performance Improvement Plans" are based on survey response rates rather than actual clinical outcomes.
- The Oregon Inquiry: State regulators are now asking tough questions about unlicensed staff and over $15 million in Medicaid billing.

Families in crisis don't care about "scalability" or "throughput." They want steady, ongoing care. If we swap real, sometimes complicated psychological work for standardized, numbers-based interactions, we aren't solving the crisis. We're just moving people through the system.

You can read the full investigation here: https://lnkd.in/e6j-p6HT

🗣️ I always want to hear from people on the front lines. If you are a current or former patient, family member, or employee at Charlie Health or Newport, please reach out to me at [email redacted]. Your privacy and anonymity are extremely important to me.

#BehavioralHealth #HealthcareReform #CharlieHealth #PrivateEquity #SystemicAnalysis #MentalHealthAdvocacy #SellingSanity

AI-Related Psychosis Case Studies for Documentary - Poland Preferred

Looking to speak with people who experienced a psychotic episode during intense AI use (documentary project).

Hi, less than a year ago I went through what is sometimes called an "AI-related psychosis." It was such a large-scale and intense experience that it didn't fit into the framework of my ordinary life or my previous understanding of reality. I'm stable now and trying to understand what it was and how to live with it. From my perspective, experiences like this are rarely openly discussed, which can make it especially difficult for people who've gone through something similar to return to a stable life. Often there isn't even a language to describe what happened, or support that takes this kind of experience seriously.

Out of this came the idea for a documentary film about people who have lived through similar states. I'm studying directing in Poland and currently preparing this project for further development. If you are based in Poland, that would be a big plus, but I'm open to speaking with people from other countries as well. I'm not interested in sensationalism or blaming technology. I'm interested in how a person returns to themselves after an experience that goes beyond their usual picture of the world: how self-perception changes, and how people deal with shame, loneliness, and misunderstanding from others.

If you've had a similar experience and are open to a calm, confidential conversation, please DM me here. Anonymity is absolutely possible, both in our conversations and in the film itself. I take personal boundaries and privacy very seriously. English is not my first language, so I may use a translator in our communication. Thank you.
