Yeah, I’ve been working in aerospace, automotive, industrial and rail safety for over 20 years. You don’t get to say “this software does thing” and then in the safety manual say “you don’t get to trust that the software will actually do thing”.
Further, when you count the operator as a layer of protection in your safety system, the probability of dangerous failure is a function of the time between the fault (the software doing something stupid) and the failure (the crash). The shorter that window, the less likely the operator can intervene in time, and the less safe the system is.
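To make that concrete, here is a minimal back-of-the-envelope sketch (Python, with made-up numbers): assume driver takeover time follows a lognormal distribution with a median of roughly 2.5 seconds, and count a dangerous failure whenever the fault-to-crash window closes before the driver manages to take over. The specific distribution and parameters are assumptions for illustration, not measured values.

    import numpy as np

    # Hypothetical model: driver takeover time as a lognormal
    # (median ~2.5 s with a long tail). Parameters are made up
    # purely to illustrate the trend.
    rng = np.random.default_rng(0)
    takeover_s = rng.lognormal(mean=np.log(2.5), sigma=0.5, size=100_000)

    for window_s in (1.0, 2.0, 5.0, 10.0):
        # Dangerous failure: the fault-to-crash window closes before
        # the operator takes over.
        p_fail = np.mean(takeover_s > window_s)
        print(f"{window_s:4.1f} s window -> P(no intervention) ~ {p_fail:.2f}")

With those made-up numbers, a one-second window leaves the driver out of the loop the vast majority of the time, while a ten-second window almost always gives them a chance. The exact figures don't matter; the shape of the curve is the point.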
Here’s a clue: Musk doesn’t know anything about software safety. His company’s lead in autonomous technology has less to do with technical innovation and more to do with cutting corners where they can get away with it.
“Hey dad, the WiFi in my dorm room keeps cutting out”
“Have you gotten your Ethernet hooked up yet?”
“Hey dad, when I try to stream TV, it keeps buffering”
“Have you gotten your Ethernet hooked up yet?”
Someday they’ll get it.