A Google self-driving car crashes in California. A man in Florida is killed while in a Tesla on Autopilot.
Uber is testing almost fully autonomous vehicles in Pittsburgh, while Ford separately announces its own plan for cars without an accelerator pedal or steering wheel by 2021.
And now the National Highway Traffic Safety Administration offers its support for autonomous vehicles.
It sounds dangerous, but it also sounds inevitable. Yet every day I pass people on the highways and byways and think, "Yo, pal, I think a machine could drive better than you." In many cases, it wouldn't even have to be a very smart machine.
Furthermore, as Sturgis Kid 4.0 has just gotten his learner's permit, I'm acutely aware of all the information and skill needed to pilot a 4,000-pound machine.
But no steering wheel in five years? One wonders how safe this is going to be.
Facing up to danger: These new types of crashes catch our attention, but we've become inured to the daily tally on America's roadways.
Auto fatalities rose in 2015 after years of decline, up 7.7 percent according to the NHTSA's early reporting, totaling about 35,200 for the year.
What's more, NHTSA research from the mid-oughts found that driver error was a factor in 94 percent of auto crashes.
So factoring in those numbers, I'm thinking it's machines 1, humans 0. But how will Ford and other companies pull this off?
Limits remain: Randy Visintainer, Ford Motor Co.'s director of autonomous vehicles, outlined Ford's vision to deliver a vehicle without a steering wheel, brakes, or gas pedal for commercial use by 2021. He said the Society of Automotive Engineers defines six levels of autonomy, numbered 0 to 5.
The first three levels incorporate features found on almost all 2016 and 2017 vehicles - adaptive cruise control, lane keeping, automatic braking - all of which still require the driver to remain in control. Level 3 adds part-time autonomous driving, with a human driver available as a backup; this is where Tesla Autopilot lands.
Ford's plan is to deliver Level 4 - a completely autonomous vehicle within a limited domain of roads, which means a driver does not need to be ready to take control of the vehicle, Visintainer said.
"It's constrained, but a human is not involved," he said. Level 5 is fully autonomous without range restrictions.
How it's happening: Visintainer explained that engineers set up what's called a "prior map," which recognizes fixed or reasonably fixed features above the ground - buildings, sidewalks, lampposts - and saves them into memory.
Then the lidar - a laser-based sensor that works much like radar - overlays that map with a real-time image, which picks up "nonstationary objects": animals, people, and other cars.
The system is also fed the rules of the road, so it knows where traffic lights and stop signs are, how drivers take turns at a four-way stop, and similar conventions.
"Sometimes the rules of the road are open to interpretation," Visintainer said. "Are they [the other driver] an aggressive driver? Are they a courteous driver? So we can adjust accordingly."
The system then couples these software elements with the sensory information - lidar, cameras, radar - fusing them into a situational awareness of the environment around the vehicle and a safe path forward.
Where would it happen? Visintainer said the Level 4 autonomy would be limited to places that are mapped in this sort of detail. Right now, Ford is testing on public roads and test tracks in Michigan, California, and Arizona.
But that's not just a Ford project - Google, TomTom, and other mapping services are creating maps at the same level of detail. And Uber is working on its own separate systems on Pittsburgh's streets.
How can this be safe?
"The one thing we can do with the autonomous vehicle is we can program with the knowledge that a professional driver has," Visintainer said.
That means the control system will understand things like how much grip the vehicle has with the road, and will be designed to keep the vehicle within its limits. Who among us has learned that any way but the hard way?
Safety issues: Henry Jasny, vice president and general counsel for Advocates for Highway and Auto Safety, said the main safety issue is for self-driving cars with humans as the backup driver.
"There's a real issue as to how well these machines will alert the driver," Jasny said. "And if the driver starts to reengage, how good will their situational awareness be at that time?"
He actually said full automation would probably be a safer bet. The concern at that point is how much transparency companies will offer, the level of government oversight, and the amount of public information - especially since each company is working on its own proprietary system.
"We're not interested in stopping this," Jasny said. "We think the honeypot at the end of the rainbow will be all vehicles being fully automated. We want to get there, but we want to do no harm on the road to getting to that goal."