It’s been a few weeks since my last post, but as I get close to giving a 50-minute talk at Philcon, I think I should take some time away from promoting my new book (gotta mention it once a post!) to check on how things have been going with some of the topics I’ve written about before.
First, a bit on the robots in religion. My #1 reader (hi mom!) sent me this article and I pulled the thread a little to see what the newest developments are. The main article deals with Mindar, an anthropoid robot that recites a 25-minute pre-programmed sermon.
What leaps out at me in the monk’s words is that he hopes the robot itself becomes a quasi-deific item. That as the monks come, live their lives, and die, an AI that will one day be loaded into Mindar’s body will continue to learn and grow wiser. That the robot will eventually achieve enlightenment or channel the goddess of mercy as an animate, intelligent incarnation. This has so many possibilities: humans as builders of gods. Robots as links to the divine. Robots as religious relics, tended by the monks and priests.
There have also been a couple more robots that take care of ‘rote’ worship. In India, there’s a robot arm that performs a ritual motion.
Elsewhere, a device is being used to recite the name of the Buddha or say prayers endlessly, in the manner of a prayer wheel, only purely digital. While neither intelligent nor a classical robot, it shows the march of progress.
On the autonomous vehicle front, the AV START Act and SELF DRIVE Act, in the Senate and House respectively, have been abandoned. SELF DRIVE passed a couple of years ago, but the AV START Act never did, and its abandonment leaves a gap. The House and Senate formed a joint committee in July, planning for industry involvement and cooperation between the two chambers in drafting a new autonomous vehicle bill. This has so far gone nowhere. Having nothing else it can do, the DOT has sent around $60 million in grants to institutions in seven states, a mixture of university research groups and state Departments of Transportation.
While a fairly predictable development, the impeachment investigations and almost inevitable Articles and trial will leave the House and Senate unlikely to work together on anything this year, and on nothing that isn’t a crisis until 2021. Meanwhile, autonomous shuttles are being rolled out nationwide, from Kalamazoo to the Brooklyn Navy Yard. Waymo is partnering with firms in France on a 20-mile mixed highway and city street shuttle from De Gaulle to La Defense, while Tesla continues to make grandiose claims about “Full Self Driving” capability (which is apparently not what it says on the tin, because the “always keep your hand on the wheel and eye on the road” disclaimer remains), including the existing “Smart Summon” feature. Let’s have a look at Smart Summon in action.
And a longer video of the Tesla hitting the garage wall
But that’s just features. We’re here for policy. Well, let’s see what the police think about policing non-existent AV policy.
6:00 mark if it doesn’t take you right there.
And we hear it! The line we’ll get from every driver of every ADAS car from now until a law is passed. “I wasn’t driving it!” And the cop saying what most will say when confronted with no regulatory guidance, “Then who do I write the ticket to?”
It’s a classic question, but one that is going to be asked with increasing frequency. When an autonomous system does wrong, who is liable? Ticketing or arresting sleeping drivers in Teslas on Autopilot is clear enough; there’s an operator asleep at the wheel. Whatever the features involved, that’s a well-established situation. When there is no one in the car at all, we hit the issues in this video. In this case, I’m surprised the owner didn’t get a citation for operating the vehicle in a dangerous manner – or whatever ticket you get for trying a stunt where you steer from outside the car, or let go of the wheel and run alongside.
It’s also worth noting that, though it’s hard to be sure, the car appeared to stop not because there was a police car behind it, but because the owner stopped it or it reached its destination. This has happened before: police had difficulty getting an autonomous Tesla to stop when a driver was passed out at the wheel.
Let’s look at the Smart Summon disclaimer
Smart Summon is designed to allow your car to drive to you or a location of your choosing, maneuvering around and stopping for objects as necessary. Like Summon, Smart Summon is only intended for use in private parking lots and driveways. You are still responsible for your car and must monitor it and its surroundings at all times and be within your line of sight because it may not detect all obstacles. Be especially careful around quick moving people, bicycles and cars.
In other words, “if this screws up, it’s your fault for not paying attention” just like Autopilot. The question on my mind, then, is how fast you can stop the car if you see it about to hit something. Given the damage so far, probably not fast enough.
Our final update is on autonomous delivery drones. This was not the story I expected to see as the catalyst for a “torch and pitchfork” moment, and that lack of awareness is exactly what caused the issue in the first place.
In a post from Emily Ackerman, a disabled PhD student at the University of Pittsburgh, we hear a report of a Starship robot (previously discussed in the Autonomous Delivery Crawlers: A Policy Perspective article) keeping Ms. Ackerman from getting out of a crosswalk by blocking the curb cut. This turns out to be not a bug but a feature.
It makes sense from the perspective of an engineering team that likely had few if any differently mobile members. The robot needs to cross the street. It should communicate this in the most legible way possible. So it places itself in the curb cut to signal its intent. It shouldn’t cross when there is an obstruction, so it waits for a clear path and a ‘walk’ signal if possible. A person in a wheelchair would likely be wide enough to stop it from proceeding, and so it gets itself stuck right where the person in the wheelchair needs to go in order to get out of the street. Starships are probably not programmed to back up, and seemingly have few or no sensors back there. The robot is pretty well stuck by its weight and the high-traction surface, so it can’t be moved.
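Starship hasn’t published its crossing logic, so here’s a purely hypothetical sketch of the behavior described above – the states, rules, and function name are all my own inference from what the robot was observed doing, not anything from Starship’s actual software:

```python
# Hypothetical reconstruction of the observed crossing behavior.
# Not Starship's code; rules are inferred from the robot's behavior.

def crossing_action(at_curb_cut: bool, path_clear: bool, walk_signal: bool) -> str:
    """Decide a delivery robot's next move at a street crossing."""
    if not at_curb_cut:
        # Stage in the curb cut to make the intent to cross legible.
        return "advance to curb cut"
    if path_clear and walk_signal:
        return "cross street"
    # Any obstruction (e.g. a wheelchair user exiting the crosswalk)
    # leaves the robot parked in the cut. Note there is no "back up"
    # branch at all: with no rear sensors, reversing is never an option.
    return "wait in curb cut"

# The failure mode from the report: a person blocks the path while
# the robot occupies the cut, so it simply waits there indefinitely.
print(crossing_action(at_curb_cut=True, path_clear=False, walk_signal=True))
```

The point of the sketch is the missing branch: every rule is individually sensible, but with no way to yield the cut, “wait” and “block the only accessible exit from the street” are the same action.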
What makes this more ridiculous is the following video
It can climb the curb. Probably decreases its mean time between failures by stressing the motors and joints, but it’s built to do it. It doesn’t even need to descend the cut to get into or out of the street. Placing itself there helps communication, but there are other ways to signal that.
As an issue, it’s a reminder that what seems like safe behavior for a robot may mean taking up space intended for the safety of human traffic. What is truly outstanding are the replies!
I wouldn’t have the slightest problem kicking it out of the way
i will personally fight this robot
Let me know if you need me to come by and give it the old Philly special.
That comment refers to the destruction of a hitchhiking robot in Philadelphia.
carry a screwdriver around and scrap em for parts
And aluminum baseball bat and/or a sledgehammer should solve this issue.
Gonna smash the shit out of it if I ever see one
I’m cherrypicking, but that’s a pretty good number of violent responses. No doubt (at least mostly) non-serious, but the fact remains that this is a serious issue that caused a great deal of public anger. Starship pulled the robots off the street when they got the tweet, so they recognized it as a hazard that needed to be addressed immediately.
The problem will grow, though. These are the first of many, and already crowded sidewalks will be choked with them. Perhaps they’ll migrate to the streets and bike lanes, becoming closer in form to humans on ebikes, scooters, and motorcycles, but the fact remains that they will take up space, and often dangerously. The proliferation of delivery people is a hazard I know well from living in New York, where many behaved quite dangerously, risking themselves and others.
The shift over to robots will focus annoyance that was previously spread across many individuals onto single companies. After all, if one robot behaves badly, that means they all will in the same situation. A couple more incidents like this, and the industry will die. Some may say rightly. It’s up to the developers to prove the detractors wrong by designing delivery crawlers that are safe, but as we see in policy everywhere, it only takes one bad actor to get an entire group thrown out.
These issues are developing daily, and I hope to get back to Pittsburgh or another high-tech city soon so I can experience them firsthand. If there’s any takeaway here, it’s that some companies (Starship) learn from their mistakes and others (Tesla) just keep going ahead hoping for the best. “Move fast and break things” is an OK motto for purely digital, non-safety-critical products, but when the tech is embodied and safety-critical, what you break might be a human body.
Longer Mindar video: