
GE Aviation Usability Study

Shop Floor Usability Study

GE Aviation

SmartShop Application

Operators, inspectors, and engineers use SmartShop to manage the jet engine repair process and track technical repairs on jet engine parts.

Location

GE’s jet engine repair shop in McAllen, Texas

Tools

Sketch, InDesign, QuickTime, Final Cut Pro

Feature

Multi-Select Repair (ability to select multiple engine parts and perform an action)

Team

Technical product manager, developer rep, UX Design (me)

Users

Final Inspector, FPI Inspector, Prep Weld Operator, Supervisor, Engineer

Shop Visit #

Third shop visit

Multi-Select Feature

The first version of SmartShop allowed users to select only one engine part at a time and to “claim operations” (mark that certain work has been done correctly on a given part) one at a time. However, this didn’t match how many users batch-process parts. The multi-select feature lets users select and act on multiple parts at once, just as they do in the repair shop.
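To make the interaction concrete, here is a minimal sketch of the selection model behind a flow like this one. The names (PartSelection, claimOperation) and part IDs are illustrative assumptions, not SmartShop’s actual code.

```typescript
// Minimal sketch of a multi-select model (illustrative names, not SmartShop's code).
type PartId = string;

class PartSelection {
  private selected = new Set<PartId>();

  // Clicking a part toggles it in or out of the selection,
  // so a user can de-select one part without starting over.
  toggle(id: PartId): void {
    if (this.selected.has(id)) {
      this.selected.delete(id);
    } else {
      this.selected.add(id);
    }
  }

  // Claim an operation (e.g., "weld prep complete") on every selected part in one action.
  claimOperation(operation: string, claim: (id: PartId, op: string) => void): void {
    this.selected.forEach((id) => claim(id, operation));
  }
}

// Usage: select three parts, de-select one, then batch-claim an operation.
const selection = new PartSelection();
["part-001", "part-002", "part-003"].forEach((id) => selection.toggle(id));
selection.toggle("part-002"); // de-selected
selection.claimOperation("FPI inspection passed", (id, op) =>
  console.log(`${op} claimed on ${id}`)
);
```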


Objective

We wanted to know whether the multi-select flow we designed would work for users in the physical shop. Most importantly, could it meet the diverse needs of the shop’s various user roles?

Usability Study Method

We ran a task-driven usability study. Using an InVision prototype, we asked users to complete a series of steps to process a batch of engine parts. I read from a script to keep our findings consistent across sessions. We recorded how long each task took, tracked users’ clicks, and noted the spots where they were confused.

Challenges

The first challenge was making sure our designs accommodated and optimized the highly complicated jet engine repair process. We conducted extensive discovery research in parallel with the usability studies.

The second challenge was to segment our findings by user role while designing a single interface optimized for different roles. There are numerous operator and engineer roles at the McAllen shop, and the same role varies from one shop location to another. We needed to make the product effective and intuitive globally, from Texas to Malaysia. Everyone uses the same screens, so we wanted each role to have quick and easy access to the information that is relevant to them without being inundated by irrelevant information, as sketched below.
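One way to picture a single screen serving multiple roles is a per-role view configuration. The roles below match the study participants, but the field names are hypothetical placeholders, not SmartShop’s actual schema.

```typescript
// Sketch of per-role view configuration (hypothetical fields, not SmartShop's schema).
type Role =
  | "FinalInspector"
  | "FPIInspector"
  | "PrepWeldOperator"
  | "Supervisor"
  | "Engineer";

// Each role sees only the columns relevant to its job.
const columnsByRole: Record<Role, string[]> = {
  FinalInspector: ["partNumber", "operationStatus", "signOffHistory"],
  FPIInspector: ["partNumber", "indicationNotes", "operationStatus"],
  PrepWeldOperator: ["partNumber", "weldSpec", "operationStatus"],
  Supervisor: ["partNumber", "operationStatus", "assignedOperator", "dueDate"],
  Engineer: ["partNumber", "repairHistory", "engineeringNotes"],
};

// The same screen renders different columns depending on the signed-in role.
function visibleColumns(role: Role): string[] {
  return columnsByRole[role];
}

console.log(visibleColumns("Supervisor"));
```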

Finally, but most importantly, we needed to develop a strong rapport with our users. Generally, operators were wary of outside observers. They had experience with people who aimed to automate their jobs or fine them for breaking codes. By building relationships and being transparent about the purpose of our research, we were able to show them that SmartShop’s job is to make operators’ jobs easier and safer, not to replace them. It was important that they felt comfortable enough to be forthright about their day-to-day process — especially if it didn’t align with the established workflow — so we could design the best experience for them. Additionally, we avoided using stopwatches on the shop floor; timing exactly how long certain processes take was a sensitive subject due to the fear of automation. Instead, we were precise when timing UX interactions off the shop floor and less precise in our discovery research on current shop-floor processes. Once that trust was established, it was imperative that we honored it by maintaining our users’ anonymity when writing reports and sharing results.

Takeaways

Overall, the study was a success. We built lasting relationships with our users and validated that our general UX workflow works with various physical workflows and different user roles. Observing operators repair jet engine blades and discussing their pain points on the job gave us invaluable insight into our end users.

Through the study, we identified gaps and opportunities. Some gaps were part of the multi-select feature we were testing, while others were underlying issues that were revealed or exacerbated when we added multi-select functionality. For example, most users expected to be able to click on the engine parts in our action drawer. They wanted to be able to de-select parts so they could “flag” one part at a time from the same screen. When we got back to the office, I whiteboarded a few options that let the user de-select parts directly from the drawer.

Next Steps

Unfortunately, subsequent shop visit plans were cancelled due to the pandemic, but I’ve outlined the next research steps below:

We made plans to test with more realistic prototypes in more realistic scenarios. The InVision prototype worked well for testing the proof of concept and information architecture. However, InVision has its limits, and we needed a prototype that maintained state (for example, remembering which parts a user had already selected) to accurately test the interaction, so we partnered with a front-end developer to create a rapid prototype. We also found that users were tripped up when the data was inconsistent. For example, one of our screens showed engine part numbers that corresponded to a different engine model than the one listed. Many users could not get past the data inconsistency to test the functionality. To address this, we worked with a shop partner to make sure our data was accurate and matched each user’s specific job function.
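One way to catch that class of data problem before a session is a small fixture check that verifies every part number on a screen belongs to the engine model the screen lists. The data shapes and part numbers below are made up for illustration.

```typescript
// Sketch of a prototype-fixture sanity check (hypothetical data, not GE's).
interface ScreenFixture {
  screen: string;
  engineModel: string;
  partNumbers: string[];
}

// Assumed lookup: which part numbers belong to which engine model.
const partsByModel: Record<string, Set<string>> = {
  "MODEL-A": new Set(["PN-1001", "PN-1002"]),
  "MODEL-B": new Set(["PN-2001"]),
};

// Flag any part number that doesn't belong to the screen's listed engine model.
function findMismatches(fixture: ScreenFixture): string[] {
  const valid = partsByModel[fixture.engineModel] ?? new Set<string>();
  return fixture.partNumbers.filter((pn) => !valid.has(pn));
}

// Usage: PN-2001 belongs to MODEL-B, so it gets flagged on a MODEL-A screen.
const mismatches = findMismatches({
  screen: "Claim Operations",
  engineModel: "MODEL-A",
  partNumbers: ["PN-1001", "PN-2001"],
});
if (mismatches.length > 0) {
  console.warn(`Fixture mismatch: ${mismatches.join(", ")}`);
}
```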

Finally, during the next visit, we planned to set up infrastructure so we could test on the shop floor while operators were repairing parts. Testing on a mounted screen while the user performed physical tasks on the shop floor — often standing up, wearing gloves, and operating dangerous machinery — would yield far more realistic results than sitting at a computer.
