Philip Jean-Pierre
Personal Project  ·  AI-Augmented UX Workflow

Project StarJammer

A controlled experiment in AI-augmented UX design. Full discovery-to-handoff programme — 12 personas, 6 audience segments, research-led IA, and a complete design system — in 4 weeks. AI cut the timeline. It didn't cut the thinking.

12 · Personas Across 6 Segments
5 · Critical UX Findings
6 · UI Modes Designed
4 wk · Discovery-to-Handoff

01 · Context

Role: Solo UX Lead — research, strategy, IA, design, design system

Timeline: 4 weeks

Type: Personal project / controlled methodology study

Status: Research + Design Complete

Why This Project Exists

StarJammer is a satellite tracking and space operations dashboard. It's not a client engagement. That's the point. I needed a contained environment to run a full UX programme and test a specific hypothesis: that AI can accelerate UX work without displacing the thinking that makes UX work actually good.

The claim that AI is replacing designers is everywhere. So is the opposite claim — that AI is just a tool and nothing has changed. Both are wrong. The real question is more specific: where in the process does AI add genuine value, and where does a human in the loop remain essential?

StarJammer was built to answer that question with evidence, not opinion.

The hypothesis: AI as workflow accelerant, not autonomous agent. Every strategic decision, research interpretation, and quality gate stays human. AI compresses the distance between thinking and artifact. It doesn't replace the thinking.

The result: The hypothesis proved out. The core strategic insight — that StarJammer needed persona-aware UI modes instead of a single expert interface — came directly from the research data. Not from a model.

My Role Across the Programme

Research Lead

Ran all discovery and definition phases: stakeholder framing, competitor analysis, heuristic evaluation against Nielsen's 10 principles, qualitative data coding, persona synthesis. No AI shortcuts on the reasoning.

UX Strategist

Designed the IA and audience mode architecture from research findings. The mode-switching solution didn't come from AI ideation — it came from 12 personas with radically incompatible interface needs.

Product Designer

Produced all UX deliverables: 14-screen wireframe set, high-fidelity UI across 6 interface modes, and the Yamato 6.1 design system — tokens, components, WCAG 2.2 AA across light and dark themes.

Methods

Stakeholder Interviews · Competitive Audit · Heuristic Evaluation · Qualitative Coding · Persona Development · Journey Mapping · IA Design · Wireframing · High-Fidelity UI

Tools

Figma · FigJam · Claude · GPT-4 · Confluence · Notion

02 · Problem

Built for One User. Aimed at Many.

StarJammer's dashboard delivered serious technical capability. It also assumed a single user type. A heuristic evaluation and persona analysis surfaced five interface failures — each with cross-segment impact, none of them edge cases.

Onboarding Chasm

Every new user hit a wall. The default interface dropped them into a fully-loaded expert dashboard with no guided entry. 8 of 12 personas experienced immediate, often terminal friction at first launch.

No Share Layer

ISS passes, close approaches, satellite discoveries — none of it shareable. Six personas independently named this as their highest unmet need. A structural ceiling on growth sitting in plain sight.

Historical Data Absent

The telemetry panel was live-only. Dr. Kenji (JAXA researcher persona) needed to examine eccentricity trends during a geomagnetic storm. That view didn't exist. Researchers had no path to the data they needed.

AI Underutilised

The AI Mission Assistant was a corner widget with a generic prompt. Seven personas could derive real value from contextual AI guidance — but nothing connected them to it or adapted it to their task.

The root cause across all five findings: One density, one language register, one feature hierarchy — for a radically diverse user base. A light/dark toggle wasn't the answer. A mode-based architecture was.

Heuristic Violations Driving the Failures

Match System & Real World

Expert telemetry vocabulary with no plain-language equivalents. Casual users and educators had no way to interpret what they were seeing.

Recognition Over Recall

Mode-specific features were buried. Users couldn't discover capabilities relevant to their context without already knowing where to look.

Flexibility & Efficiency

No accelerators for expert users. No simplification paths for casual ones. One interface tried to serve everyone and fully served no one.

03 · Research

The Full Programme — No Shortcuts on the Reasoning

I ran structured discovery and definition through four research phases. AI was used in synthesis and documentation — not in the analytical work itself. Every model output was reviewed, revised, and approved by me before it influenced any design decision.

Central Research Question: "What interface, content, and interaction design changes will make StarJammer meaningfully usable across its full spectrum of users — without compromising depth for expert operators?"

That question has two competing requirements embedded in it. Resolving the tension between them — broad accessibility without expert compromise — was the design problem.

12 · Personas Researched
6 · Audience Segments
5 · Competitors Audited
100% · Saturation Achieved

Five Research Findings That Drove Design

04 · Personas & Information Architecture

The IA Came from the Research, Not a Whiteboard

The mode-switching architecture wasn't a design principle I brought to the project. It was the only solution that fit the data. Twelve personas with radically different mental models, primary tasks, and feature priorities — all on one platform. One interface couldn't resolve that. Six modes could.

Representative Personas

🔭 Dr. Amara Osei

STEM Educator · Ghana

Needs a projectable classroom UI with simplified telemetry and a guided "Tonight's Sky" mode. Designed for a classroom wall, not a single screen.

📡 Marcus Webb

Ham Radio Operator · UK

Needs precise orbital timing, pass prediction with azimuth/elevation, and Doppler shift calculations. High data density is a feature, not a problem.

🛰️ Dr. Kenji Watanabe

Aerospace Researcher · JAXA

Needs historical telemetry trend views, API-level data access, and research export. The feature the platform was missing entirely.

Audience Segment Mapping

Segment | Representative Personas | Primary Need | UI Mode
Space Enthusiasts | Sofia, Kowalski Family | Discovery, wonder, sharing | Discovery Mode
Amateur Radio Ops | Marcus Webb, Yuki Tanaka | Pass timing, freq / Doppler data | Operator Mode
Scientists & Researchers | Dr. Kenji, Dr. Amara | Historical data, API access, export | Research Mode
Educators & Students | Ms. Chen, Sofia | Projectable, low-density, guided | Classroom Mode
General Public | Kowalski Family, James | Accessibility, large targets, plain language | Family Mode
Defence & Government | Agent Torres, Director Park | SSA overlays, classification, maximum density | Pro / Operator Mode

05 · AI Workflow

Where AI Fit and Where It Didn't

The honest account: AI cut timeline on synthesis, iteration, and documentation. It didn't cut the thinking. Here's the actual breakdown — what stayed human, what AI accelerated, and what the time savings looked like.

Phase | Who Did the Work | What That Looked Like
Research Design | Human | Research questions, methodology selection, scope definition — entirely mine. No AI involvement in how the study was structured.
Stakeholder Interviews | Human | Interview design, facilitation, and initial analysis. AI had no role here.
Qualitative Coding | Human | Inductive and deductive coding across the interface evaluation. The analytical interpretation — what the patterns meant — was my work.
Strategic Insight | Human | The mode-based architecture came from the research data. 12 personas with incompatible primary tasks made a single-interface solution untenable. That conclusion required understanding what the data meant — not pattern matching.
Research Synthesis | AI-Accelerated | After coding manually, I used Claude to accelerate affinity cluster labelling and theme naming. Every output reviewed, edited, approved before it influenced decisions. Estimated time saving: ~40% on synthesis documentation.
Concept Generation | AI-Accelerated | Used AI to generate divergent concepts for mode-switching patterns, onboarding structures, and the share layer model. I selected, combined, and refined. Nothing shipped directly from AI output.
UX Writing | AI-Accelerated | Prompted AI for UI copy — button labels, onboarding tooltips, contextual help text, AI Mission Assistant prompts — then edited for voice, clarity, and persona fit. Estimated time saving: ~60% on copy drafting.
Design Iteration | AI-Accelerated | Used Claude to review wireframe logic and flag cognitive load issues. Used AI to accelerate documentation — annotations, spec writeups, handoff notes. All output under my creative direction.

The principle that held across the whole programme: Every AI touchpoint reduced clock time. None reduced the reasoning required from me as the designer. Synthesis, strategic decisions, and quality gates stayed human. The methodology stayed clean.

What This Demonstrates

A 4-week timeline for a full discovery-to-handoff programme — 12 personas, competitive analysis, heuristic evaluation, qualitative coding, IA design, 14-screen wireframe set, high-fidelity UI across 6 modes, and a complete design system — would normally require 8 to 12 weeks of solo effort. AI compressed that timeline without changing the quality gates or removing the analytical work.

That's the case for AI as accelerant. Not as replacement. The outputs are only trustworthy because the thinking behind them was human.

06 · Design Decisions

The Core Solution: Stop Trying to Serve Everyone with One Interface

The research finding was clear. Twelve personas across six segments didn't have different experience levels — they had fundamentally different primary tasks. The architecture had to reflect that. Six UI modes, each calibrated to a specific context, expertise level, and task set.

🌌 Discovery Mode

Casual users, first-timers, the Kowalski family. Simplified sky view, guided ISS pass alerts, shareable Story Cards for every high-emotion moment.

📡 Operator Mode

Ham radio operators. Pass prediction with azimuth/elevation, Doppler shift data, frequency window overlays. Everything Marcus needs, nothing he doesn't.

🔬 Research Mode

Scientists and researchers. Historical telemetry trend views, export, API access, annotation tools. The mode that filled the structural gap for Dr. Kenji.

🏫 Classroom Mode

Educators. Full-screen projectable layout, simplified data vocabulary, guided tour mode. Designed for a classroom wall, not a personal screen.

👨‍👩‍👧 Family Mode

General public. Large touch targets (48px+), reduced information density, plain-language labels throughout. Accessibility as the default, not a setting.

🛡️ Pro / Defence Mode

Government and defence operators. SSA data overlays, classification tagging, maximum information density, secure context-switching controls.

Key Decisions and Reasoning

Why modes, not progressive disclosure? Progressive disclosure still assumes a single feature hierarchy. Different segments have entirely different primary tasks — not just different experience levels. A ham radio operator and a JAXA researcher both need depth. It's completely different depth. Progressive disclosure couldn't resolve that. Modes could.

Where does the AI Mission Assistant live? It's contextual, not a corner widget. In Research Mode it surfaces relevant telemetry summaries. In Discovery Mode it answers "What am I looking at?" In Operator Mode it feeds pass timing alerts. One AI feature — six context-aware expressions. The widget problem wasn't the widget. It was the lack of context.

How does the share layer work? Story Cards — auto-generated, shareable panels triggered by high-emotion moments (ISS passes, close approaches, milestones). Each card includes a map view, a data snapshot, and a one-tap share action. Designed to be the mechanism of organic acquisition: convert wonder into reach at zero marginal cost.
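As a minimal data sketch of that mechanism — the event and card shapes below are illustrative assumptions, not the actual StarJammer data model:

```typescript
// Hypothetical Story Card assembly. Field names and the event model
// are assumptions for illustration only.
interface SkyEvent {
  kind: "iss-pass" | "close-approach" | "milestone";
  title: string;
  timestamp: string; // ISO 8601
  location: { lat: number; lon: number };
  dataSnapshot: Record<string, string>; // e.g. { maxElevation: "78°" }
}

interface StoryCard {
  headline: string;
  mapCenter: { lat: number; lon: number }; // drives the card's map view
  snapshot: Record<string, string>;        // the data snapshot panel
  shareText: string;                       // feeds the one-tap share action
}

// Each high-emotion event maps deterministically to a shareable card.
function buildStoryCard(e: SkyEvent): StoryCard {
  return {
    headline: e.title,
    mapCenter: e.location,
    snapshot: e.dataSnapshot,
    shareText: `${e.title} · ${e.timestamp}`,
  };
}
```

The point of the sketch is the zero-marginal-cost property: once the event model exists, every qualifying event yields a card with no additional design or authoring work.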

How does mode-switching work for users who don't know what mode they need? Onboarding routes users to a recommended mode based on a three-question context prompt. The mode can be changed at any time from the nav shell. Users aren't locked — they're guided.
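A sketch of how that routing could work. The three questions, their answer values, and the decision rules here are illustrative assumptions, not the shipped onboarding spec:

```typescript
// Hypothetical onboarding mode routing. Question set and answer
// values are assumptions for illustration.
type Mode = "discovery" | "operator" | "research" | "classroom" | "family" | "pro";

interface OnboardingAnswers {
  purpose: "explore" | "radio" | "science" | "teach" | "family" | "operations";
  dataComfort: "plain" | "technical";
  setting: "personal" | "shared-screen";
}

function recommendMode(a: OnboardingAnswers): Mode {
  // Primary task dominates; the other two answers refine the call.
  switch (a.purpose) {
    case "radio":      return "operator";
    case "science":    return "research";
    case "teach":      return "classroom";
    case "operations": return "pro";
    case "family":     return "family";
    case "explore":
      // Shared screens and plain-language preference push toward
      // the lower-density, guided modes.
      if (a.setting === "shared-screen") return "classroom";
      return a.dataComfort === "plain" ? "discovery" : "operator";
  }
}
```

The recommendation is only a default: the nav shell keeps the mode switcher visible, so users stay guided rather than locked.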

07 · Design System

Yamato 6.1 — Built for Six Modes, One Coherent Language

Yamato 6.1 is the design system underlying every StarJammer mode. Built on atomic design principles, it delivers a unified visual language while giving each mode room to express its distinct information density and register. Every component was designed for WCAG 2.2 AA compliance from the start — not retrofitted.

Component Library

Nav Shell · Mode Switcher · Pass Alert Card · Telemetry Panel · Story Card · AI Assistant · Orbital Map · Data Chip · Timeline Bar · Filter Drawer · Share Sheet · Tooltip Layer · Alert Banner · Loading Skeleton

Token Architecture

Color Tokens

Semantic tokens for surface, text, border, and accent — applied consistently across both light and dark themes. Mode-specific accent overrides handle the visual distinction between Discovery, Research, and Defence contexts.

Typography Scale

Display, body, and mono type scales defined by viewport size and mode density. Family Mode uses a larger base size and higher line-height throughout. Research Mode supports data-dense mono rendering for telemetry values.

Spacing & Density

Three density tiers — Compact, Standard, Comfortable — map directly to mode contexts. Pro/Defence runs on Compact. Family Mode runs on Comfortable. Mode switching triggers a density change, not just a content change.
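One way to express that tier-to-mode mapping in tokens. The Pro/Defence-to-Compact and Family-to-Comfortable assignments come from the text; the pixel values and the other mode assignments are illustrative assumptions:

```typescript
// Hypothetical density token mapping. Tier names are from the design
// system text; values and most mode assignments are assumptions.
type Density = "compact" | "standard" | "comfortable";

const densityTokens: Record<Density, { gap: number; rowHeight: number; minTarget: number }> = {
  compact:     { gap: 4,  rowHeight: 28, minTarget: 32 },
  standard:    { gap: 8,  rowHeight: 36, minTarget: 40 },
  comfortable: { gap: 12, rowHeight: 48, minTarget: 48 }, // Family Mode's 48px+ targets
};

const modeDensity: Record<string, Density> = {
  discovery: "standard",
  operator:  "compact",
  research:  "compact",
  classroom: "comfortable",
  family:    "comfortable",
  pro:       "compact",
};

// Switching mode resolves a full density token set —
// a density change, not just a content change.
function tokensForMode(mode: string) {
  return densityTokens[modeDensity[mode] ?? "standard"];
}
```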

Motion Tokens

Duration and easing tokens defined for all interactive states. Prefers-reduced-motion support baked in at the token level — not handled as a special case per component.
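Baking reduced-motion in at the token level might look like the following sketch; token names and durations are assumptions, not the Yamato 6.1 values:

```typescript
// Hypothetical motion tokens with prefers-reduced-motion resolved once,
// at the token level, so no component special-cases it.
interface MotionToken { durationMs: number; easing: string }

const baseMotion: Record<string, MotionToken> = {
  modeSwitch: { durationMs: 240, easing: "cubic-bezier(0.4, 0, 0.2, 1)" },
  panelOpen:  { durationMs: 180, easing: "ease-out" },
  alertPulse: { durationMs: 600, easing: "ease-in-out" },
};

function resolveMotion(reducedMotion: boolean): Record<string, MotionToken> {
  if (!reducedMotion) return baseMotion;
  // Collapse every duration to zero here, once, instead of
  // handling reduced motion per component.
  const out: Record<string, MotionToken> = {};
  for (const [name, t] of Object.entries(baseMotion)) {
    out[name] = { ...t, durationMs: 0 };
  }
  return out;
}

// In a browser, the flag would come from
// window.matchMedia("(prefers-reduced-motion: reduce)").matches
```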

Accessibility Standards

08 · Outcomes

Research-Led Design. AI-Accelerated Delivery. Methodology Intact.

The programme produced a complete, handoff-ready design package in 4 weeks. The AI integration didn't cut corners — it compressed timelines on the work that didn't require human judgment while leaving the work that did entirely in human hands.

12 · Personas Delivered
14 · Wireframe Screens
6 · HiFi UI Modes
1 · Design System (Yamato 6.1)

Business Impact of the Design Strategy

User Acquisition

The sharing layer targets viral discovery directly. Every unshareable wonder moment was a missed acquisition event. Story Cards convert high-emotion moments into organic growth at zero marginal cost.

Retention

Persona-aware onboarding, mode switching, and the elimination of the onboarding chasm reduce early churn across all 6 segments — particularly the 8 personas who hit terminal friction at first launch.

Market Expansion

Classroom Mode and Family Mode open StarJammer to the education and consumer markets — segments the expert interface systematically excluded. New modes are new addressable market.

Competitive Differentiation

No direct competitor — Heavens-Above, N2YO, Stellarium — offers AI-assisted satellite operations with persona-aware UI modes. This is the category-defining gap the research surfaced.

What This Project Proves About AI in UX

The thesis proved out. AI as accelerant, not replacement. The core insight — persona-aware modes over a single expert interface — came from the data. The qualitative coding, the saturation testing, the strategic interpretation: all human. AI cut the time from research artifact to synthesis document, from concept to spec, from wireframe logic check to annotated handoff. It didn't cut the reasoning.

The replicable principle: AI integration works when it reduces the distance between a human decision and a human artifact. It breaks down when it's asked to make the decision itself. Keep the quality gates human. Use AI to move faster between them.