Frame 22-2.png

One of the most powerful aspects of driving a car is the feeling of independence and control: being able to go anywhere at any time, to listen to whatever you like, and to have a space that is your own. With autonomous cars, a wider set of people could experience these joys of driving without actually driving.

To approach this challenge, we build on the well-established area of accessible interface design for smartphones.

 

We are currently developing the first versions of a design system for control and communication that builds on the accessibility features of the mobile phone.

Interface Features Walkthrough

Accessible Onboarding

Screen Shot 2021-12-30 at 12.53.50 PM.png
Screen Shot 2021-12-30 at 12.54.07 PM.png

Choose your accessibility requirements as you sign into the app

Order A Ride

Screen Shot 2021-12-30 at 1.40.13 PM.png
Screen Shot 2021-12-30 at 1.40.28 PM.png

Order a ride from marked accessible locations

Screen Shot 2021-12-30 at 1.42.10 PM.png

Verify your ride and destination with a scan or by voice (for riders with visual impairments)

Screen Shot 2021-12-30 at 1.42.25 PM.png
Screen Shot 2021-12-30 at 1.42.46 PM.png

Choose seating side and deploy ramp if needed

During A Ride

Screen Shot 2021-12-30 at 1.44.20 PM.png

Get notified of updates during the ride

ezgif-3-5bcb629ffb42.gif
ezgif-4-2d426583f345.gif
ezgif-4-d9b9828074ad.gif

Home screen and seating controls

ezgif-2-a3aab5beda61.gif
ezgif-4-83e287ebbbfc.gif
ezgif-4-b44f9b4b069b.gif
ezgif-2-caf64a0e17a1.gif

More controls, also activated by voice

Dropping Off From Ride / Next Destination

Screen Shot 2021-12-30 at 1.50.06 PM.png
Screen Shot 2021-12-30 at 2.24.11 PM.png

Set a pickup location, the next destination, or end the ride

Screen Shot 2021-12-30 at 1.50.25 PM.png

User feedback

Inclusive Design for Autonomous Vehicles

Research Study for Department of Transportation Design Challenge

Timeline
Tools Used

February - December 2021

Figma

Skills

User Research

User Testing & Interviews

Collaborative Designing

Storyboarding

Wireframing

Collaborators
 
Nik Martelaro and research team 
Roles

UI/UX Designer

Researcher

 
The Challenge
How might we design vehicle transportation interfaces to be more accessible to all?

Designing interfaces for accessible autonomous ride-share experiences

Autonomous vehicles are the future: more and more modes of transportation, including our personal vehicles, are becoming autonomous. We want to expand the experience of driving to more people by creating an interface designed with people living with disabilities in mind.

Oftentimes, people don't realize a design is inaccessible simply because of a lack of exposure and, therefore, knowledge. As a designer on this team, I hoped not only to learn more about designing for accessibility but also to raise awareness of the importance of creating products that account for differences among people.

Screen Shot 2021-06-13 at 8.47.18 AM.png

I worked with researchers to conduct interviews with people in our local community who had experience with disability, analyzed this information, and built out the preliminary structure for our interface.

 
Research
Screen Shot 2021-06-09 at 3.12_edited.png
Community Interviews

Every month we held a community meeting where we talked to our users and asked for feedback.

Communicating with the research team
Screen Shot 2021-06-09 at 3.39.44 PM.png

We gathered information on the accessibility features already available on major phone operating systems such as iOS and Android.

Screen Shot 2021-06-13 at 9.39.47 AM.png

Using this information on accessible features, we placed them into various scenarios.

Screen Shot 2021-06-21 at 10.34.28 AM.png
Journey mapping


 
Screen Shot 2021-06-21 at 10.34.47 AM.png

What users are currently experiencing

As a design team, we created journey maps of what users with accessibility needs currently experience in different contexts and with different disabilities, based on our research, to build a holistic understanding of what users would want to experience. This also helped us identify the points in a trip where this interface could intervene and significantly improve an accessible autonomous car experience.

Screen Shot 2021-06-13 at 9.49.37 AM.png
Screen Shot 2021-06-13 at 9.49.24 AM.png

While the research team focused on collecting data from external sources and organizing it into textual information, I took this and created visual scenarios to further understand each situation and what users might want, or run into problems with, during a car ride.

Screen Shot 2021-06-09 at 3.44.57 PM.png
Screen Shot 2021-06-09 at 3.45.18 PM.png

I worked together with the researchers to develop a series of use cases to more fully understand what our interface needed, and interviewed others to get feedback. The researchers compiled our process into a spreadsheet as well as a guideline document for designing the interface.

 
Prototyping Interface

As I continued creating scenarios, I began translating them into voice flows and used these to guide how I designed the initial interface.

Initial frames for onboarding
 
Screen Shot 2021-06-13 at 11.23.15 AM.pn
Screen Shot 2021-06-13 at 11.23.55 AM.pn

First time user selection for personal autonomous car screens

Transitioning from a personal car to a car-share interface, informed by more user feedback
 
Screen Shot 2021-06-20 at 9.08.57 AM.png

By talking to more users, our team shifted from designing a personal autonomous car interface to designing for a car-share service, because most interviewees expected car-sharing to be the most common and widely used model, largely because it would be the more affordable option.

Screen Shot 2021-06-19 at 9.24.55 PM.png
Screen Shot 2021-09-27 at 9.57.00 AM.png

With shared public cars, we would need to ask more questions at the beginning of travel to determine what accessible features a rider requires, such as the size of their wheelchair. We therefore later created a series of necessary onboarding questions.

Emphasize allowing users to visualize the status of the car
Screen Shot 2021-06-15 at 2.51.51 PM.png

I suggested using step-by-step notifications of the vehicle's status for people living with sight impairments. The user can listen to and visualize where the car is and when it is approaching, empowering them to feel in control during the entire process.

A log at the bottom lists the processes and steps so that people living with cognitive impairments, or anyone who wants to confirm that everything is proceeding accurately, can refer back to it.

Screen Shot 2021-06-15 at 11.14.53 PM.pn

I emphasized the redeploy button because, during our interviews, people who regularly used cars with ramps often had to redeploy the ramps due to how finicky they were.

During the ride
Screen Shot 2021-06-15 at 11.16.11 PM.pn

In a car-share scenario, users wondered if there was some way to indicate whether the drop-off/pick-up zone was accessibility-friendly. This was a common problem our users faced, so I explored letting users rate the drop-off/pick-up zones as well as view a live satellite view of the area.

Screen Shot 2021-06-15 at 2.52.11 PM.png
Clear and consistent communication of travel 
 

Knowing how different users rely on either sound or visuals to input and understand information from the interface, I designed the starting navigation page to be as visual as possible at major points in the process, such as having a visual of the car in real-time space and having voice control narrate significant events in the journey.

In our interviews, I found that users want to know what is happening during the ride, but they also don't want to be overloaded with information too often. Striking a balance in what the voice feedback includes was an important aspect I focused on exploring.

Important events that should require voice control updates include:

• Traffic, delays in travel time

• Changes in road condition (series of speed bumps, etc.)

• Rerouting 

• Emergencies (low fuel, malfunction, etc.)

Screen Shot 2021-06-20 at 7.52.40 AM.png

At any time during the ride, the user can use their voice or tap the controls tab to change any of the conditions in the car, such as the seat position and temperature.

Ending the ride

Screen Shot 2021-06-21 at 12.58.26 PM.png
Screen Shot 2021-06-21 at 12.58.59 PM.png

At the end of the ride, similarly to during the ride, the user is updated on where they are being dropped off. This includes a live view of the area to better understand the outside conditions, as well as an update on the outside environment and weather. I learned this would be especially relevant to wheelchair users.

The interface then confirms where the user would like to be picked up, and the car stays parked in case of any mistakes.


Screen Shot 2021-06-15 at 2.58.34 PM.png
Mitigating user mistakes
 

A recurring topic in our discussions was how to handle mistakes when they happen. In my interface concepts, I therefore focused on building in points for the user to intervene at crucial moments. For example, if the car arrives at a destination the user did not want, they can tap or tell the interface to reroute, stay parked, or even return to their original location (options also accessible at any time during the ride).

Screen Shot 2021-06-21 at 12.57.42 PM.png
Creating a user community
 
Screen Shot 2021-06-20 at 9.09.17 AM.png

As I talked to users with accessibility needs in our community while designing this interface, I realized that each person was very passionate about the topic and had many personal experiences and things they wanted to say. As our interface becomes further developed, I thought it would be valuable to encourage a community base where users can interact with each other, share information, and help each other with autonomous-vehicle-related problems.

Refinement
Screen Shot 2021-06-20 at 10.06.04 AM.png
Consolidating designs with cross-functional teams
 
Screen Shot 2021-06-21 at 1.31.12 PM.png
Screen Shot 2021-12-30 at 2.25.45 PM.png

We collaborated with the research team to create a finalized design system so that designers and researchers would be on the same page. This included explanations of the reasoning and importance behind chosen colors, fonts, sizes, consistent user interactions, and more.

 
Reflection & Next Steps
My major takeaways

• I really appreciated how our advisor gave us control over how we managed our work and our teams. This let me practice my leadership and organizational skills, working in smaller teams while contributing to a larger one.

• This was such a large project, and I am very lucky to have joined the team at an early stage, so I was able to contribute to and guide the project from the beginning to a well-designed state. I learned the full process of how communication designers design an app interface up to the point where they hand it off to coders.

• I also really enjoyed how closely we interacted with community members. I learned how to lead interviews, debrief, and draw out key insights for improving an interface design.

Interface implementation & further design refinement

• Mapping the voice flow with user interactions for the interface

• Refining the user interface design to be more succinct and flow better

• Testing the current prototype with our users

• Starting to develop the interface and working with coders in doing so