In a pivotal moment for the autonomous transportation industry, California chose to expand one of the biggest test cases for the technology.

  • Gsus4@feddit.nl · 1 year ago

    It hits different when you’re the one being crashed into, but if it crashes less than monkeys behind the wheel and liabilities are all accounted for and punished accordingly, bring it!

      • Gsus4@feddit.nl · 1 year ago

        Because corporations can’t be allowed to get away with what would land any of us in jail if we did it. We know they will cut corners if allowed, so make sure FSD is safer and that citizens are not defrauded when dealing with economic behemoths.

        In other words, it’s good that they have fewer accidents, but the ones they do have should be treated the same way we treat human drivers, or harsher, so that playing with the odds is not just an economic factor to optimize and cut corners on. Take aviation safety rules as the model: even low-cost airlines have to follow them, unlike the legal Wild West these companies created with social media.

        With FSD the example is LIDAR: it is more expensive, but it is an evolving technology that is essentially safe, while Elon wants to use just cameras because they’re cheaper, even though they’re much less safe; it’s not a solved problem on the cheap. That’s why you need to penalize them for making such choices, or outright forbid them from making them. They are going to be setting the standards here, and there is a risk that a shittier technology wins a few bucks for Elon at the cost of lives far into the future: we can’t half-ass this forever just because Elon wants his cars to cost half of what it takes to do it right.