Water damage brings misery for everyone, including the insurance companies that cover it. Although it was the third-highest cause of compensation paid to this company’s customers, only 1–2% of water damage claims were sent to subrogation for possible manufacturer reimbursement for faulty materials. Even so, that small fraction recovered over $35 million annually from pipe and hose manufacturers, making it financially appealing to investigate more claims.
The main reason so few water damage claims go through subrogation is the difficulty of identifying water line and hose manufacturers. Though most plumbing hoses are covered by a manufacturer warranty for a given amount of time, determining the company responsible once a tag or sticker has been removed is a tedious process. Hoses must be shipped to an analyst and inspected, adding time and cost.
The Innovation Team at this insurance company wanted to expedite this process, and came to the user experience team for ideas. During a one-day session, volunteers from the user experience department broke into small groups and came up with solutions, leveraging optical recognition technology that was being developed.
Our group began by learning about plumbing hoses; each part had a unique name, and each manufacturer had distinct designs and characteristics for those parts. We proposed that once a field rep photographed the damaged piece with a smartphone, optical recognition technology could analyze its characteristics and compare them against a photo database to determine the manufacturer. This would decrease assessment time and eliminate the need to ship the hose to an assessment center. Determining the manufacturer could now take seconds.
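The matching step we proposed can be illustrated with a minimal sketch. Everything below is hypothetical: the actual optical recognition engine, its feature extraction, and the manufacturer names are assumptions for illustration. The idea is simply that a photo is reduced to a feature vector (e.g., color and texture measurements) and ranked against reference samples in a database.

```python
# Illustrative sketch only: ranking manufacturers by how closely a captured
# photo's features match reference samples. In the real product, feature
# extraction would be done by the optical recognition engine; here the
# vectors and manufacturer names are made-up placeholders.
import math

# Hypothetical reference database: manufacturer -> feature vector
# (e.g., a normalized color/texture histogram extracted from sample photos).
REFERENCE_DB = {
    "Manufacturer A": [0.9, 0.1, 0.3, 0.7],
    "Manufacturer B": [0.2, 0.8, 0.6, 0.1],
    "Manufacturer C": [0.5, 0.5, 0.5, 0.5],
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors, 0.0 (unrelated) to 1.0 (identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def best_matches(photo_features, db, top_n=3):
    """Return the top_n (manufacturer, score) pairs, highest similarity first."""
    scored = [(name, cosine_similarity(photo_features, vec)) for name, vec in db.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]

# A field rep's photo, already reduced to a feature vector by the engine.
matches = best_matches([0.88, 0.12, 0.28, 0.72], REFERENCE_DB)
```

In practice the app would present the ranked matches to the field handler for confirmation rather than picking one automatically, which is why the later screens in this case study revolve around selecting among candidate manufacturers, product lines, and colors.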
Within a few hours our small group had come up with a basic flow, wires, and basic comps. Despite its low fidelity, the Innovation Team was very excited about the solution we presented. We sent off our visuals and presentation and hoped our suggestion would help the company.
The Innovation Team disappeared into the background, and it wasn’t until over a year later that a similar concept appeared on my radar. The company was having issues similar to the plumbing hoses, but this time with damaged siding on claimants’ homes. Previously, to complete a claim, field handlers would have to cut a piece of damaged siding off the house and mail it in for an inspection team to estimate its value. By studying the texture and color, the manufacturer could be determined, and therefore a better estimate of replacement cost was given to the claimant. The tradeoff for this accuracy was the time it took to receive and inspect the siding, and even then identifying a manufacturer could be difficult.
The insurance company had continued to develop its optical recognition technology and began to build a mobile app for field handlers. While the app had been handed around many departments, it had yet to receive attention from the User Experience department, and thus lacked a logical flow, consideration for human-centered design, and any semblance of cohesive design. My first step was to map out the existing flow and screens, which in turn helped me discover numerous pain points.
Usability testing of the live app confirmed many of the pain points the UX team had suspected. Field handlers were utterly confused about color calibrating their photos, which was required for each photo session. The task of holding up a calibration chart to the app and capturing a photo of the siding needed clearer instructions and guidance to prevent user frustration and abandonment.
After gathering feedback from previous testing, I began to iterate on onboarding screens and guides. Various illustration styles and perspectives were tested along with instructional text. Variations in overlays were also created to guide the user into capturing the correct content.
Users were also baffled by the steps required after submitting a siding photo and receiving potential matches. Manufacturer name, product line, and color were not clearly delineated, and users did not understand how to make these selections. A convoluted flow that separated the texture selection from the color selection resulted in numerous errors and much frustration.
Our need to test various versions could not wait for development to implement changes on a test device each time. Instead, we relied on paper prototypes and overlays printed on transparencies to test our changes to the flow, interface, and copy. Using printouts of the guides, siding samples, and the color calibration tool, we were still able to use functioning parts of the app. The low-fidelity prototype also allowed us to make changes on the fly, and we made multiple quick changes based on the instant feedback we received.
Once we had determined a way to make the flow clear and concise, the interface needed to be tested. Multiple versions for selecting color and texture matches were tested, with variations in UI color, copy, and hierarchy iterated on after receiving user feedback.
After settling on the best path for users and reinventing the interface, I revised the app map and documented improvements. This also helped business and development understand the changes to the flow, style, and interactions.
As screens were being finalized, I began documenting styles, colors, typography, and interactions in style guides for both iOS and Android. This guide could then be applied to future optical recognition apps to ease development and create uniformity in the company’s suite of tools.
As Siding ID began to wind down, the desire for a similar app for plumbing hoses came to the surface. It became apparent that the solution we had proposed to the Innovation Team the previous year was now coming to fruition. We were presented with crude comps and a muddled flow, much like the Siding ID project. We quickly set about sorting out the problem spots and wiring proposed screens.
Having gone through similar work with Siding ID, it was much quicker to map out the app and digest wireframes. Content strategists were also leveraged, assisting us with guidance and instructional text.
With brand battles settled, I was finally able to focus on other areas of the app and the UI for both the iOS and Android versions. The hub of the app, the dashboard, relied on secondary colors and iconography for visual interest while delivering important information.
Designing screens came easily as I incorporated the styles set in the Siding ID guide. While we did not have a working prototype to test, paper prototypes sufficed and guided our decision-making process. Business was pleased with our work, and development easily understood our design direction. Most importantly, our field handlers were going to be provided with a tool that would save them time and frustration, save the company money, and provide quicker, more accurate estimates for our customers. It was a proud moment watching our infant concept grow into a fully developed product and enter the world.