"If I walked out in front of a Google car travelling at 60mph, I have no real appreciation of how the vehicle will behave, so I'm effectively putting myself at a disadvantage," says Ben Byford, a computer ethics commentator.
No shit, you might be tempted to say, but Byford does in fact raise an interesting point. How autonomous cars will receive and interpret information has been debated ad nauseam; how they will communicate back with us has been discussed rather less. Of course, new laws will be introduced and highway codes altered to help people adapt to having autonomous machines on the streets, but when a pedestrian attempts to cross the road or a human driver tries to edge out of a junction, the behaviour of autonomous cars may be a little harder to comprehend.
In cities where jaywalking isn't illegal, or where pedestrians have priority, those on foot tend to look for physical cues from the driver - be it a flash of the lights or a wave of the hand - to show them that they've been seen and can cross, move out, or merge. But if a car isn't piloted by a human driver, how will those physical cues be represented? The machine itself, through deep learning, may comprehend what is in its peripheral vision and what action to take, but how will it relay to the child on his bike or the pensioner crossing the road what that action is to be?
Autonomous car makers, if left to their own devices, would of course prefer a network of roads upon which only driverless cars are allowed, with no possibility of interference from pedestrians or cyclists. According to Professor Adam Millard-Ball of the University of California, this may lead to some streets being altered to promote pedestrian use whilst pavements and cycle lanes are removed from others, thereby reducing autonomous cars' need to interact with humans at all. That's not a realistic prospect in every city, though, meaning manufacturers still need to come up with other solutions for vehicle-to-pedestrian communication.
In a white paper from Duke University, roboticist Michael Clamann described an experiment in which a van showed a display similar to that of a traffic crossing, with 'walk' and 'don't walk' signals as well as a speed readout. "The idea was that the participants would use the speedometer to determine whether it was safe to cross... pedestrians relied on old habits when interacting with new technologies."
Robert Brunner, an industrial designer who worked at Apple, says self-driving cars should go further, however, inspiring confidence whilst being friendly and inviting. To that end, Honda revealed two beautiful EVs at Frankfurt last year, both of which include front grille displays that can show charging updates, greetings and, most usefully, alerts. Nissan has also recently tested its 'intention indicator', which uses LED strips to communicate: when pedestrians or cyclists are nearby, the strip shines red, signalling that the car is aware of them, while a display shows messages such as "After you". Other companies are looking at still more novel methods, including using emojis and the car's horn to convey those humanistic cues.
As it stands, though, companies haven't yet cracked the ideal way for autonomous cars to communicate with us. We learn how to communicate from a very young age, and the earlier people interact with a technology, the easier it is to communicate with it. But as autonomous technology takes to the road, there will be a crucial transition period during which young and old alike will have to get to grips with encountering it in environments not yet designed for it. So, how do you think it should be done? Are extra lights enough, or will they just create confusion? Would you understand what a car meant if it was displaying an emoji? Is there even any need for communication at all? Let us know what you think.