I recently joined an Institution of Engineering and Technology (IET) workshop looking into some of the unexpected consequences of autonomous vehicles. The IET will be publishing a report shortly, but here I want to promote the accessibility design agenda.
The technical capabilities of these vehicles are mind-boggling: the on-board compute power for navigation, vehicle management and entertainment alone is formidable. To be fully autonomous they must be able to operate independently of any external inputs, yet they will also benefit from the explosion of external data sources. The impact on city planning, traffic, parking and public services more broadly will be life-changing for everyone.
Doubtless the vehicles themselves, as they become more capable, will welcome the increased levels of communication: new information sources and peer-to-peer data flowing between fellow road users, as well as input from the streets and other city constituents. Making these information flows bi-directional, linking on-board vehicle systems to city information sources such as traffic and building data, is also essential: it is what brings a wheelchair user to the right door, lets a robot or drone deliver the package, and finds the right assistance for a vision-impaired person trying to find their way into the mall.
However, what really needs careful consideration is the user interface of these vehicles and their associated services. The most important requirement is that the wide variety of users, with different levels of IT skill and different accessibility needs, can summon the autonomous vehicle and any assisted travel through their preferred means. And if someone else orders or initiates the service on a person's behalf, that person must still be communicated with in their preferred manner.
Designing the interface from scratch with inclusivity in mind will avoid the painful, and often ill-fated, fall-back of bolting on support for different disabilities afterwards.
The good news is that the work being done on omni-channel customer experience, with its multiple options for communicating with people and machines, addresses many of these issues. Add upfront design that embraces the accessibility features of smartphones, tablets, personal assistants and home automation systems, and we have the beginnings of an inclusive design.
The justification for this is not just about including all disabled people in the digital economy. Inclusive design actually makes a service easier for everyone to use: some people like talking to their app, some like interacting via a touch screen, and some might even prefer the old QWERTY keyboard approach.
Artificial intelligence and machine learning will also contribute to the smooth incorporation of all users into the emerging scenarios. For example, once an individual with particular needs is identified as having ordered an autonomous vehicle service, the system can route a specific vehicle, possibly specially adapted, to the desired location.
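The matching step described above can be sketched as a simple capability filter. Everything here, the `Vehicle` fields, the adaptation names and the `dispatch` function, is an illustrative assumption for this post, not any real dispatch system's API:

```python
# Hypothetical sketch: match a rider's stated accessibility needs to the
# nearest vehicle whose fitted adaptations cover all of them.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str
    adaptations: frozenset  # e.g. {"wheelchair_ramp", "audio_guidance"}
    distance_km: float      # distance from the requested pickup point

def dispatch(fleet, required):
    """Return the closest vehicle satisfying every required adaptation."""
    suitable = [v for v in fleet if required <= v.adaptations]
    return min(suitable, key=lambda v: v.distance_km, default=None)

fleet = [
    Vehicle("AV-01", frozenset(), 0.5),
    Vehicle("AV-07", frozenset({"wheelchair_ramp"}), 2.1),
    Vehicle("AV-12", frozenset({"wheelchair_ramp", "audio_guidance"}), 3.4),
]

chosen = dispatch(fleet, {"wheelchair_ramp"})
# AV-07 is chosen: it covers the requirement and is closer than AV-12.
```

A real system would of course weigh far more than distance, but the core idea, treating accessibility needs as first-class dispatch constraints rather than an afterthought, is the point.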
Autonomous vehicles can then contribute further by helping local authorities streamline services for people requiring home visits and social care, and, of course, by linking them into healthcare systems. Ambulances will be redefined by autonomous driving, with paramedics able to concentrate on looking after the patient.
So we should also be excited about the impact of autonomous cars on groups previously excluded from driving. New ways of spending time in transit, entertainment, work, rest, will appeal to all customers, but we must think carefully about the human-machine interface and offer it in a variety of forms that suit all users under all circumstances.
One final thought: what happens when the autonomous vehicle senses a person with a guide dog and the guide dog senses an approaching vehicle? A Mexican stand-off – go find the algorithm for that one!
Keep an eye out for the full IET report coming soon.