
Case Study

Emergency Evacuation App

Designing for the moment when everything goes wrong

New Zealand sits on the Pacific Ring of Fire. Earthquakes, tsunamis, volcanic eruptions, and floods are not hypothetical; they're recurring events. I designed a mobile app that helps people find the safest route to evacuation points, in real time, when panic is at its peak.

Role

Lead UX/UI Designer

Duration

6 months

Platform

iOS & Android

The Problem

When a natural disaster strikes, people often have only minutes — sometimes seconds — to decide whether to evacuate and move to safety. In these critical moments, they need information that is clear, reliable, and easy to act on.

Instead, the information available is often fragmented and difficult to interpret. Alerts may arrive on time, but the guidance that follows can be unclear, too generic, or not tailored to individual circumstances. For tourists, newcomers, or commuters unfamiliar with local areas, this lack of clarity increases confusion and delays critical decisions.

The core challenge: how do you design an interface that someone can use under extreme stress, with shaking hands, in poor lighting, possibly while running?

User pain points

  • No real-time, GPS-based evacuation guidance existed
  • Government alerts were text-heavy and hard to parse under stress
  • People didn't know where their nearest evacuation point was
  • Cell networks often fail during disasters, so apps that require an internet connection become useless

Design constraints

  • Must work offline after initial data sync
  • Must be usable in one hand, while moving
  • Information hierarchy: most critical info first, always
  • Accessibility: high contrast, large tap targets, screen reader support

Research & Discovery

I began by understanding the real scenarios. I studied New Zealand's Civil Defence emergency management guidelines, analysed how people behaved during the 2011 Christchurch earthquake and the 2016 Kaikoura event, and interviewed residents in Wellington, a city that sits directly on a fault line.

Key Insight 1

People don't read instructions during emergencies. They scan for visual cues. The UI needed to communicate through color, iconography, and spatial hierarchy, not paragraphs.

Key Insight 2

Families need to coordinate. Parents need to know their children's school evacuation plan. A shared family evacuation plan feature became critical.

Key Insight 3

Offline capability isn't a feature; it's the feature. Cell towers are among the first infrastructure to fail. The app had to cache evacuation maps and routes locally.

Who we designed for

Three primary user archetypes guided every design decision.

The Commuter

Uses public transport daily. Unfamiliar with evacuation routes outside their neighbourhood. Needs real-time, GPS-guided routing to the nearest safe point from wherever they are.

The Parent

First instinct during an emergency is to reach their children. Needs visibility into school evacuation status, family member locations, and the ability to set meeting points.


The Tourist

No local knowledge. Doesn't know where to go or what the alert system sounds like. Needs the simplest possible interface with clear visual directions and multilingual support.


Design Process

I followed a user-centred design process, iterating through low-fidelity wireframes to high-fidelity prototypes, with usability testing at each stage.

Research & Mapping

Stakeholder interviews, competitive analysis, user journey mapping

Wireframing

Low-fi sketches, information architecture, navigation structure

Prototyping

Interactive prototypes in Figma, micro-interaction design, gesture mapping

Testing & Iteration

Usability testing with 12 participants, stress-scenario simulation, iteration

Connecting with a larger audience

I created an invitation poster for the research (including the date, time, and duration) and displayed it at the WelTec & Whitireia Symposium to reach a wider audience and invite target users to participate in the design thinking workshop.

Design Thinking workshop

Empathy Map

Ideation

I used the Crazy 8s method during the ideation stage: participants sketched eight different ideas in eight minutes to shape the mobile application.

In the ideation activity, everyone contributed and pitched ideas; we then ranked them and selected the best for inclusion in the product.

This ensured that the user's voice was heard and directly shaped how the product would be developed.

Wireframe & Prototype

Core Features

Real-time disaster alerts

Push notifications triggered by Civil Defence data. The alert screen uses color-coded severity levels (red for immediate danger, amber for warning, blue for advisory) with large, readable text and a single primary action: "Navigate to Safety".

The design deliberately limits choices during a crisis. When an earthquake alert fires, you don't need a menu; you need one button.
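This single-action pattern can be sketched as a simple lookup table. The colour values, labels, and function names below are illustrative, not the app's production code:

```python
# Illustrative sketch of the severity-coded alert system described above.
# Hex values and labels are hypothetical placeholders.
SEVERITY_LEVELS = {
    "immediate": {"color": "#D32F2F", "label": "Immediate danger", "action": "Navigate to Safety"},
    "warning":   {"color": "#FFA000", "label": "Warning",          "action": "Navigate to Safety"},
    "advisory":  {"color": "#1976D2", "label": "Advisory",         "action": "View details"},
}

def render_alert(severity: str) -> dict:
    """Return the alert card for a given severity: one colour, one headline,
    and exactly one primary action, never a menu."""
    level = SEVERITY_LEVELS[severity]
    return {
        "background": level["color"],
        "headline": level["label"],
        "button": level["action"],
    }
```

Because every screen draws from the same mapping, a user who has learned the colour system once can read any alert at a glance.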

GPS-powered safe routes

The map view shows the user's current location, nearby evacuation points, and the fastest walking route to safety. Routes update in real time based on reported road closures, flooding, and structural damage.

I designed the map for maximum clarity: high-contrast route lines, oversized destination markers, and voice-guided turn-by-turn navigation so users can keep the phone in their pocket while moving.
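The core routing behaviour, picking the nearest evacuation point that hasn't been reported closed, can be sketched as follows. This is a simplified straight-line-distance model for illustration; the real app routes along the walking network, and all names here are hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6_371_000  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_open_point(user, points, closures):
    """Pick the closest evacuation point not in the reported-closures set.

    user: (lat, lon); points: dicts with "id", "lat", "lon"; closures: set of ids.
    """
    candidates = [p for p in points if p["id"] not in closures]
    return min(candidates, key=lambda p: haversine_m(user[0], user[1], p["lat"], p["lon"]))
```

When a closure report arrives, re-running the selection with the updated set is what drives the "routes update in real time" behaviour described above.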

Offline mode

The app pre-downloads evacuation maps, assembly point data, and emergency contact information for the user's region. When connectivity drops, the app switches seamlessly to cached data with no user action required.

A subtle indicator shows online/offline status, but the core experience remains identical. Users shouldn't have to think about whether they have signal.
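The sync-then-fall-back behaviour can be sketched like this. The app's actual storage layer and data schema are not public, so the class and method names below are illustrative assumptions:

```python
import json

class RegionCache:
    """Pre-downloaded regional evacuation data with transparent offline fallback.

    A minimal sketch of the caching behaviour described above, not the
    app's production code.
    """

    def __init__(self, path):
        self.path = path  # local file holding the cached region data

    def sync(self, fetch):
        """While online, refresh the local cache from the fetch callable."""
        data = fetch()
        with open(self.path, "w") as f:
            json.dump(data, f)
        return data

    def get(self, fetch, online):
        """Return live data when possible; otherwise serve the cache silently."""
        if online:
            try:
                return self.sync(fetch)
            except OSError:
                pass  # connection dropped mid-request: fall through to cache
        with open(self.path) as f:
            return json.load(f)
```

The caller never branches on connectivity: `get()` returns the same shape of data either way, which is what lets the core experience stay identical when the signal drops.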

Family evacuation plans

Users can create a family group, set designated meeting points, and share their real-time location during an emergency. Each family member sees the others on the map, with estimated walking time to the shared meeting point.

During usability testing, this was the feature that generated the strongest emotional response. Parents said it addressed their deepest anxiety during disaster scenarios.
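The walking-time estimate shown next to each family member can be sketched with a standard average-walking-speed assumption (about 1.25 m/s, roughly 4.5 km/h); the app's actual estimator is not public:

```python
import math

def walking_time_min(distance_m, speed_m_per_s=1.25):
    """Estimated walking time in whole minutes, rounded up.

    1.25 m/s is a common average-walking-speed assumption; rounding up
    avoids showing an optimistic estimate during an evacuation.
    """
    return math.ceil(distance_m / speed_m_per_s / 60)
```

So a meeting point 900 m away displays as a 12-minute walk, and even a very short distance rounds up rather than showing "0 minutes".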

Visual Outcome

Interaction Design Decisions

One-hand operation

All critical actions are reachable with a thumb on a standard-sized phone. The navigation button sits at the bottom centre. Alert dismissal uses a swipe gesture, not a small X. This matters when hands are shaking.

Color as communication

Red, amber, and blue aren't decorative; they're functional. Each color maps to a specific threat level across the entire app. Users learn the system once and can parse any screen in under two seconds.

Progressive disclosure

During an active alert, the app shows only what matters: direction, distance, and estimated time. Detailed information (shelter capacity, supply availability) is available but tucked behind a secondary action.

Audio + haptic feedback

Alerts trigger haptic vibration patterns distinct from regular notifications. Arrival at an evacuation point triggers a confirmation vibration. For visually impaired users, the entire flow is voice-navigable.
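The distinct-pattern requirement can be illustrated with timing arrays in the style of Android's `VibrationEffect.createWaveform` (alternating off/on durations in milliseconds). The specific patterns below are hypothetical, not the ones shipped in the app:

```python
# Hypothetical vibration patterns as off/on millisecond pairs,
# in the style of Android waveform timing arrays.
EMERGENCY_ALERT = [0, 500, 200, 500, 200, 500]  # three long pulses
ARRIVAL_CONFIRM = [0, 100, 100, 100]            # two short ticks
REGULAR_NOTIFY  = [0, 150]                      # single short buzz

def is_distinct(a, b):
    """Two patterns read as different if they differ in pulse count
    or total duration, so they can be told apart without looking."""
    return len(a) != len(b) or sum(a) != sum(b)
```

Checking candidate patterns against each other this way is one cheap guard against an emergency alert being mistaken for an ordinary notification buzz.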

Reflection

Designing for extreme context is humbling. You can't rely on user patience, ideal lighting, or steady hands. Every assumption about "normal" use breaks down. This project taught me that the best interface is the one that disappears when it matters most: it just works, no thinking required.

Accessibility isn't an add-on. When I designed for people using the app during an earthquake (large tap targets, high contrast, voice navigation, haptic feedback), I was designing for everyone. The accessibility features improved the experience for all users, not just those with disabilities.

What I'd do differently: I'd push harder for real-world stress testing. Lab usability tests are useful, but simulating actual panic is nearly impossible. I'd advocate for field testing during Civil Defence drills, with real participants in real locations, to validate the design under conditions closer to reality.