He believes Tesla's technology is almost at the point where allowing humans to steer the vehicle would be more dangerous than relying on Autopilot.
"I think it will become very, very quickly, maybe even towards the end of this year – but I'd say, I'd be shocked if it's not next year at the latest – that having a human intervene will decrease safety," predicted Musk.
His vision is that once Autopilot is proven to be 200 per cent safer than a human driver, handing control back to the human would actively increase the danger.
Musk went on to compare self-driving cars to elevators, which once required a human operator.
"Now no one wants an elevator operator because the automated elevator stops at the floors and it's much safer than the elevator operator," said Musk.
The rate of improvement in self-driving capabilities is "exponential", according to Musk, who argued that there would be no need for camera-based monitoring of the driver.
Fridman asked Musk why Tesla allows drivers to use Autopilot in a far broader range of driving conditions than Cadillac's Super Cruise system, whose operational design domain (ODD) Fridman described as "very constrained" and "much narrower" than Tesla's.
Fridman noted that Tesla lets drivers use its self-driving capabilities "basically anywhere". Musk clarified that drivers can engage it "anywhere that [the vehicle] can detect lanes with confidence".
"Frankly it's pretty crazy allowing people to drive a two-ton death machine manually," said Musk. "That's crazy, like, in the future people will be like, 'I can't believe anyone was just allowed to drive one of these two-ton death machines, and just drive it wherever they wanted…'. It's going to seem like a mad thing in the future that people were driving cars."
The Tesla boss is a lot more upbeat about self-driving technology than Ford CEO Jim Hackett, who this week said the company had "overestimated the arrival of autonomous vehicles". Ford still plans to release a self-driving car in 2021, but "its applications will be narrow, what we call geofenced, because the problem is so complex".
Musk brushed aside questions about Tencent Keen Security Lab's research demonstrating that small objects, such as stickers placed on the road, could cause a Tesla to steer into oncoming traffic.
"It's very easy to block that by having anti or negative recognition. If the system sees something that looks like a matrix hack, exclude it. It's such an easy thing to do," said Musk.
"People have no idea about neural nets, you know, they probably think it's a fishing net or something."