Autonomous vehicles are poised to reshape how people use and perceive public spaces, raising questions about privacy, accountability, and urban design that cities must address now. This article examines seven critical areas where self-driving cars will challenge existing norms, from the blurred lines between public and private space inside vehicles to the erosion of anonymity on city streets. Experts in urban planning, transportation policy, and data privacy offer their perspectives on these emerging issues and potential solutions.
- Federated Governance Limits Inference Risks
- Citywide Fleets Archive Streets, Eroding Anonymity
- Absent Driver Weakens Bystander Restraint
- Trip Logs Expose Visits, Threatening Coverage
- Shared Rides Blur Boundaries, Prompting Backlash
- Fragmented Fault Plus Weaponized Footage Undermine Justice
- Vehicle Interiors Seem Private, Requiring Safeguards
Federated Governance Limits Inference Risks
I’ve spent 15+ years building federated data platforms where privacy-by-design isn’t optional—it’s the entire architecture. What concerns me most about autonomous vehicles is the data exhaust problem: these cars will generate massive streams of biometric and behavioral data that reveal health patterns people haven’t even consented to share with their own doctors.
Here’s my specific concern: sensor fusion. These vehicles combine cameras, lidar, thermal imaging, and motion detection to steer safely. But that same sensor array can detect gait abnormalities suggesting Parkinson’s, breathing patterns indicating respiratory disease, or even heart rate variability from micro-movements. In my work with genomic data, we’ve seen how seemingly anonymized information can be re-identified when combined with other datasets. Now imagine your daily walking route, the times you visit certain medical buildings, and your movement patterns all captured passively by passing vehicles and pooled into centralized databases.
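The re-identification risk described above can be sketched in a few lines. This is a hypothetical illustration of a linkage attack, assuming an attacker holds a small auxiliary dataset (a check-in app, a parking log); every record is invented.

```python
# Hypothetical linkage attack: "anonymized" sighting records are
# re-identified by joining on quasi-identifiers (place and time)
# with an auxiliary dataset. All data below is invented.

# Fleet-side records: no names, just where and when someone was seen.
sightings = [
    {"pseudonym": "p-301", "place": "oncology-clinic", "hour": 9},
    {"pseudonym": "p-301", "place": "pharmacy", "hour": 10},
    {"pseudonym": "p-417", "place": "gym", "hour": 18},
]

# Auxiliary data an attacker might already hold.
aux = [
    {"name": "Alice", "place": "oncology-clinic", "hour": 9},
    {"name": "Bob", "place": "gym", "hour": 18},
]

# Join on the quasi-identifiers to map pseudonyms back to names.
identified = {}
for s in sightings:
    for a in aux:
        if (s["place"], s["hour"]) == (a["place"], a["hour"]):
            identified[s["pseudonym"]] = a["name"]

print(identified)  # {'p-301': 'Alice', 'p-417': 'Bob'}
# Once p-301 resolves to Alice, every other p-301 sighting (the
# pharmacy visit) is hers too, even though it never appeared in
# the auxiliary data.
```

The point is that the attacker only needs two overlapping observations; everything else in the pseudonymous record comes along for free.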
My expectation is that we’ll need federated governance models for autonomous vehicle data—similar to what we built for healthcare. The analysis should happen at the edge (in the vehicle itself) with only necessary safety insights extracted, not raw observational data shipped to central servers. I’ve seen this work at scale across 12+ children’s hospitals analyzing sensitive patient data without anyone accessing the raw information. The same privacy-preserving computation can work for autonomous fleets.
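The edge-processing idea above can be sketched simply: the vehicle analyzes raw perception frames locally and only a coarse, safety-relevant aggregate ever leaves the car. This is a minimal illustration, not any vendor's actual pipeline; the class names and the 1.5 m threshold are invented.

```python
# Minimal sketch of edge-only analysis: raw frames stay on the
# vehicle; only a derived safety summary is transmitted.

from dataclasses import dataclass

@dataclass
class Frame:
    """One raw perception frame (stand-in for camera/lidar output)."""
    pedestrian_distance_m: float
    pedestrian_speed_mps: float

def safety_summary(frames: list[Frame]) -> dict:
    """Runs entirely on the vehicle. Returns only the aggregate
    needed for safety telemetry; raw frames are never shipped."""
    close_passes = sum(1 for f in frames if f.pedestrian_distance_m < 1.5)
    return {"close_passes": close_passes, "frames_processed": len(frames)}

frames = [Frame(0.9, 1.2), Frame(3.0, 1.0), Frame(1.2, 0.4)]
payload = safety_summary(frames)  # this dict is all that leaves the car
print(payload)  # {'close_passes': 2, 'frames_processed': 3}
```

The design choice mirrors the federated healthcare pattern: the query moves to the data, and only the answer moves back.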
The difference between my genomics world and autonomous vehicles: patients explicitly consent to share their DNA for analysis. With autonomous vehicles, you’re opted in just by existing in public space, and there’s no Data Access Committee reviewing who gets to analyze your behavioral patterns captured on someone else’s commute.

Citywide Fleets Archive Streets, Eroding Anonymity
What I’m worried about is incidental observation becoming persistent, aggregated surveillance. A self-driving car isn’t just a car; it’s a mobile sensor platform. Scale that out to thousands of cars in a city, and the combined cameras, LiDAR, and other sensors become a queryable archive of public life. The EFF has warned about exactly this: the data can be aggregated and retained, tracking where people go at massive scale. The problem isn’t any individual car spying on you. It’s that the entire fleet produces a permanent record of public space, and the practical expectation of privacy erodes with it.
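The "queryable archive" concern can be made concrete with a toy sketch. No single car's log is revealing on its own, but pooled sightings become a trajectory. Every record here is invented for illustration.

```python
# Hypothetical aggregation risk: pooling per-car sighting logs
# turns scattered observations into a time-ordered trajectory.
# All records below are invented.

fleet_logs = {
    "car-01": [("08:02", "Main & 1st", "red-jacket")],
    "car-02": [("08:11", "Elm & 4th", "red-jacket"),
               ("08:40", "Oak & 9th", "blue-cap")],
    "car-03": [("08:19", "Clinic entrance", "red-jacket")],
}

def track(description: str) -> list[tuple[str, str]]:
    """Pool every car's sightings and return a time-ordered path
    for anyone matching the description."""
    hits = [(t, place)
            for log in fleet_logs.values()
            for (t, place, desc) in log
            if desc == description]
    return sorted(hits)

print(track("red-jacket"))
# [('08:02', 'Main & 1st'), ('08:11', 'Elm & 4th'), ('08:19', 'Clinic entrance')]
```

Each individual log entry looks innocuous; the privacy harm only appears once the fleet's data is joined and sorted.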

Absent Driver Weakens Bystander Restraint
What I keep thinking about with autonomous vehicles is how much more exposed people might feel inside them, especially when they’re sitting still in public view.
When you’re parked in a regular vehicle, there’s a driver inside and the assumption is that someone’s in control. That creates a natural boundary: people don’t usually approach or stare, because they recognize the vehicle as occupied. But when the car’s driving itself, or just idling somewhere without a visible driver, that boundary starts to blur.
You’re sitting there, maybe answering emails or taking a call, and people walking by may not see you as separate from the space around them. That loss of perceived privacy could become a real concern, especially if these vehicles start clustering in busy zones where personal space is already limited.
I don’t think it’s going to stop adoption, but it will definitely change how people behave inside the vehicle: more guarded, more aware of who can see in.

Trip Logs Expose Visits, Threatening Coverage
I run a men’s health clinic in Providence where privacy is everything—guys don’t exactly advertise their Low T appointments. What worries me about autonomous vehicles is the erosion of anonymous travel to sensitive medical facilities. Right now, a patient can park three blocks away and walk to our Richmond Square office without anyone tracking that journey.
With autonomous vehicles, every drop-off and pick-up becomes a timestamped, GPS-logged event stored on corporate servers. Insurance companies are already mining health data—imagine them correlating repeated trips to a urology clinic with risk assessments for coverage. We’ve had patients drive from Connecticut specifically because they want discretion from their local communities.
My specific expectation: within three years, we’ll see the first health insurance rate adjustment or claim denial tied to autonomous vehicle trip patterns. Someone’s regular visits to a fertility clinic, addiction treatment center, or mental health facility will leak through data-sharing agreements nobody actually reads. In healthcare, we’re bound by HIPAA—these transportation companies aren’t.
The real kicker is that unlike your own car where you control the data, ride-sharing autonomous vehicles will have zero patient confidentiality protections. I’ve already had two patients this year specifically mention they’re worried about Uber drivers knowing they’re coming to a men’s sexual health clinic.

Shared Rides Blur Boundaries, Prompting Backlash
I think autonomous vehicles are going to blur personal space in a way we’re not fully ready for. When cars become shared, sensor-packed environments, the inside of a vehicle stops feeling private and starts feeling like a semi-public, monitored space. One specific concern I have is data exhaust—who owns the conversations, behavior patterns, and movement data generated inside these cars. People will assume “it’s just a ride,” but in reality they’re sitting inside a rolling data collection device. My expectation is there’ll be a backlash once people realize how much is being tracked by default. The companies that win long term will be the ones that make privacy visible and controllable, not buried in terms no one reads.

Fragmented Fault Plus Weaponized Footage Undermine Justice
After handling over 40,000 injury cases across Florida, I’ve watched technology reshape how accidents happen and how liability gets assigned. My biggest concern with autonomous vehicles is liability fragmentation in crash investigations. Right now, when someone’s injured, we have clear parties: driver, vehicle owner, maybe a bar that overserved them. With AVs, that splits into the vehicle manufacturer, software company, sensor supplier, map data provider, and potentially the “passenger” who may have had override capability.
I’m specifically worried about surveillance data weaponization against injury victims. In my rideshare accident cases, we already fight Uber and Lyft over internal trip data—they control the cameras, GPS, and app records that could prove our client’s case, but they’re not neutral parties. Now imagine every autonomous vehicle is a rolling surveillance platform capturing not just its passengers, but everyone around it. Insurance adjusters will have footage of pedestrians on sidewalks before an accident, and they’ll use it to argue comparative negligence. “Your client was distracted by their phone two blocks before our vehicle’s sensor failed.”
I’ve seen this playbook already. After Florida’s 2023 tort reform cut our statute of limitations from four years to two, insurance companies became more aggressive about early settlements because they know time pressure works. With AV crash data locked behind corporate lawyers and proprietary “black boxes,” that two-year window becomes even more dangerous for injured people who need that evidence to build their case.

Vehicle Interiors Seem Private, Requiring Safeguards
As vehicles drive themselves and people use commute time for emails, short meetings, or sleep, the cabin will feel like private space even while moving through public areas. My specific concern is that in-cabin sensors and connectivity could capture those moments, so strong privacy controls and clear data policies will be required.

