Both the US and UK seem to be leaning toward requiring level 3 autonomous cars to train their "eyes" not just on the outside world, but also on the driver.
The issue with this type of autonomous car is the distracted driver. Level 3 designates a car as conditionally autonomous, meaning control would be handed back to the driver from autopilot during an emergency. The British committee reported that it is especially dangerous to "hand back control of the vehicle to the driver when it is unable to deal with a certain situation."
In the US, the National Highway Traffic Safety Administration (NHTSA) has picked up on this potential for disaster as well. The NHTSA noted in its policy that "automakers may need to monitor human awareness as part of their cars' designs."
Concerns about level 3 cars have also been raised by Ford's chief engineer for autonomous vehicles (it seems their test drivers had a nasty habit of taking naps), so major regulations will likely have to be put in place if these cars ever make it to market at level 3.
Without this kind of monitoring, these cars, especially in their infancy, could be as dangerous as, or even more dangerous than, a human driver, since they'll still require driver intervention at certain points of the driving process: the extremely dangerous ones. There's a growing push to skip level 3 in commercial vehicles altogether for this very reason.
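To make the idea of attention monitoring concrete, here's a minimal hypothetical sketch in Python. The sensor names, thresholds, and fallback behavior are all invented for illustration; no automaker's actual handover logic is being described. The point is simply that a level 3 system could refuse to hand back control when its driver-facing sensors say the human isn't ready:

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    """Illustrative driver-monitoring readings (names are assumptions)."""
    eyes_on_road: bool          # e.g., from an in-cabin camera
    hands_on_wheel: bool        # e.g., from steering-wheel touch sensors
    seconds_since_alert: float  # time since the driver last acknowledged a prompt

def handover_is_safe(state: DriverState, max_response_lag: float = 4.0) -> bool:
    """Return True only if the driver appears ready to retake control."""
    return (
        state.eyes_on_road
        and state.hands_on_wheel
        and state.seconds_since_alert <= max_response_lag
    )

def on_system_limit_reached(state: DriverState) -> str:
    """Decide what the car should do when autopilot hits its limits."""
    if handover_is_safe(state):
        return "hand over control"
    # A napping or distracted driver can't take over safely, so the
    # car should fall back to a minimal-risk maneuver instead.
    return "pull over safely"
```

For example, a driver who is alert and hands-on gets control back (`on_system_limit_reached(DriverState(True, True, 1.0))` returns `"hand over control"`), while one who hasn't responded to prompts does not. This is exactly the gap regulators are worried about: without the monitoring inputs, the system has no way to tell the two drivers apart.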