Philip Jean-Pierre
UX Strategy Case Study

NCSES Data Tools: Strategic Direction Through Design Thinking

Multi-phase research program that transformed complex federal data discovery into actionable user-centered strategy, serving researchers, policymakers, and data analysts across federal, academic, and private sectors.

4 Research Methods · 50+ Stakeholders · 15 Competitive Brands · 3 Strategic Themes

01 · Context

Role: UX Lead, Researcher, Designer, & Strategist (Contractor)

Team: Cross-functional · Research-led strategy

Organization: National Center for Science and Engineering Statistics (NCSES)

Timeline: March 2022 · Strategic planning phase

Scope: National data discovery platform serving 200+ million users

The Strategic Challenge

NCSES manages the nation's most comprehensive data on science and engineering workforce trends, education pathways, and research investment. Yet stakeholders described the current platform as "klunky," fragmented, and inaccessible to anyone outside academic research circles.

Executive leadership tasked the team with one question: What strategic direction should guide NCSES Data Tools for the next 1–2 years?

The problem wasn't lack of data. It was lack of clarity about user needs, the competitive landscape, and governance direction. The platform served at least four distinct user personas — from "power users" extracting data for analysis to "information consumers" seeking pre-built reports. Each needed a different pathway into the data. The platform provided none of them.

Research Phases

Competitive Assessment · Usability Assessment · Stakeholder Interviews · Survey Manager Workshops

Tools & Methods

Design Thinking Framework · Figma · Workshop Facilitation · Thematic Analysis · Competitive Benchmarking

02 · Approach

Design Thinking as Strategic Framework

Rather than jumping to solutions, the team embraced design thinking to move from problem assumptions to evidence-based strategic direction. The program followed a structured "Understand → Define → Ideate → Prototype → Test" cycle — compressed into a planning timeline.

The Strategic Process

Understand Phase

Deep research into user needs, competitive landscape, and internal capabilities.

Define Phase

Synthesize findings into strategic themes and user-centered problems to solve.

Build & Validate Phase

Convert strategic direction into roadmap-ready recommendations with prototypes.

Evidence-Driven Decision Making

Each research method answered a specific strategic question, creating triangulation that protected against bias and assumption-driven roadmapping.

This wasn't academic research. Every finding had to answer a commercial or strategic question. Design thinking kept the program grounded in user reality rather than organizational assumptions.

03 · Research Execution

Phase 1: Competitive Assessment

Examined 15 successful data platforms — World Bank, CDC, Census, Harvard Dataverse, Kaggle, Google Scholar, and others — to identify patterns worth investigating for NCSES. The goal wasn't copying competitors. It was understanding what patterns solve specific user problems at scale.

Four Competitive Themes Emerged

Depth and Breadth Clarity

Successful platforms show users what data exists and how much is available. Breadcrumbs, faceted navigation, result counts, and visual hierarchy all signal data scope.

Theme-Based Organization

Users enter data by topic, industry, measure, or use case — not by survey name. Navigation accommodates all user expertise levels. (A data-model sketch after this list illustrates the pattern.)

Metadata at the Forefront

Complex data demands detailed pages. Each dataset gets a rich profile showing visualizations, relationships, and direct contact paths to data owners.

Multi-Dimensional Help

Effective platforms combine help centers (self-service library) with contextual help (tooltips, guided tours, embedded instructions).
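
The case study doesn't specify NCSES's metadata model; as a hedged illustration of the first three themes, here is a minimal TypeScript sketch of a dataset profile that supports theme-based entry points, surfaces rich metadata, and feeds faceted counts that signal depth and breadth. All field names are assumptions:

```typescript
// Illustrative dataset profile: the same dataset is reachable by topic,
// measure, or use case, not only by its survey name.
interface DatasetProfile {
  id: string;
  surveyName: string;            // still present, but not the only door in
  topics: string[];              // e.g. "STEM workforce"
  measures: string[];            // e.g. "median salary by field"
  useCases: string[];            // e.g. "state economic development"
  owner: { name: string; email: string };  // direct contact path to the data owner
  related: string[];             // ids of related datasets
}

// Faceted counts signal depth and breadth: how many datasets exist per topic.
function facetCounts(datasets: DatasetProfile[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const d of datasets)
    for (const t of d.topics) counts.set(t, (counts.get(t) ?? 0) + 1);
  return counts;
}
```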

Phase 2: Usability Assessment

Conducted a lightweight qualitative review of the current platform, examining the landing page, search, data explorer, and table/chart builders. Findings were organized by feature to feed immediate sprint cycles and inform the larger strategic themes.

Usability Findings Summary

Landing Page: Navigation doesn't communicate position within NCSES ecosystem. No help content. Doesn't showcase data depth/breadth.

Search: Autocomplete provides too many results and lacks a mix of suggestion types (one remedy is sketched after this list). Fails basic search-logic tests.

Data Explorer: No filters for attribute-based discovery. Tabbed view misleading. No visual connection between results and search term.

Table & Chart Builders: No contextual guidance. Too many controls surfaced simultaneously. Users overwhelmed.
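
To make the "suggestion-type mix" finding concrete, here is a minimal sketch of grouped autocomplete: cap each suggestion type rather than flooding the dropdown with one kind of match. The type names are illustrative assumptions, not the platform's actual taxonomy:

```typescript
type SuggestionType = "dataset" | "topic" | "report";

interface Suggestion {
  type: SuggestionType;
  label: string;
}

// Cap results per type and guarantee a mix of suggestion types,
// so users see the breadth of what a query can reach.
function mixSuggestions(all: Suggestion[], perType = 3): Suggestion[] {
  const byType = new Map<SuggestionType, Suggestion[]>();
  for (const s of all) {
    const bucket = byType.get(s.type) ?? [];
    if (bucket.length < perType) byType.set(s.type, [...bucket, s]);
  }
  return [...byType.values()].flat();
}
```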

Phase 3: Stakeholder Interviews

Conducted 50+ interviews with executives, product managers, data stewards, and survey managers. Stakeholders shared vivid mental models of what NCSES should become.

Executive Vision

"Our data should be used to answer the questions of today and predict the future. We have to change the narrative. Connect our data to the bigger problems our nation is trying to solve."

"Dumping more data into the tool does not equal success. We need to synthesize the data into information people can use."

Phase 4: Survey Manager Workshop

Deep-dive session with data custodians revealed how they help their users solve real problems. Key use cases included researcher career outcome analysis, professional association benchmarking, state economic development strategy, and university program positioning.

A critical insight: Survey managers acknowledged that their users' analytics expertise varies, but they assumed all users understood survey data. The platform should guide without over-explaining — and allow users to toggle between "expert mode" and simplified flows.

04 · Key Findings & Themes

Three Strategic Imperatives

Theme 1: User Diversity

NCSES serves four distinct user types, each with different expertise and needs. Current platform design assumes one audience.

Power Users

Extract data for analysis in external tools. Need raw access, metadata, comparison capabilities.

Data Savvy Users

Understand survey data and tables. Need rich metadata and contextual documentation.

Data Interested Users

Know what they want but not the technical path. Need guided discovery and simplified flows.

Information Consumers

Prefer pre-designed reports and visualizations. Don't engage with raw data or tool complexity.

Strategic Action: Platform must accommodate all four user types through different entry points and interaction models — not force all users through one pathway.
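
One hedged way to picture "different entry points, one platform": a simple mapping from persona to a default entry experience, with every pathway still reachable through navigation. The routes and mode names below are assumptions for illustration, not the platform's actual design:

```typescript
type Persona = "power" | "dataSavvy" | "dataInterested" | "consumer";

// Illustrative mapping from persona to a default entry experience.
const entryPoint: Record<Persona, string> = {
  power: "/data/download",        // raw extracts, bulk metadata
  dataSavvy: "/data/explorer",    // tables plus rich documentation
  dataInterested: "/discover",    // guided, theme-based discovery
  consumer: "/reports",           // pre-built reports and visualizations
};
```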

Theme 2: More Data & Data Types

Stakeholders unanimously agreed: adding establishment data, longitudinal datasets, and cross-survey harmonized tables would increase relevance across sectors — government policy, academic research, and private industry.

But the current tool suite (primarily table and chart builders) couldn't handle all of these data types. Expansion required parallel tool development: an interactive explorer for hierarchical data, comparison tools for longitudinal trends, and pre-built dashboards for common policy questions.

Strategic Action: Roadmap must include tool diversification, not just data addition. Establish data-harmonization governance to enable cross-survey queries (a crosswalk sketch follows).
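
Cross-survey queries hinge on a governed variable crosswalk: a mapping from each survey's native variable names to shared harmonized measures. A minimal sketch, with invented identifiers:

```typescript
// Governed crosswalk: one harmonized measure, many survey-native variables.
interface CrosswalkEntry {
  measureId: string;                              // harmonized measure id
  sources: { surveyId: string; variable: string }[];
}

// Resolve a harmonized measure to the native variable in a given survey,
// or undefined if that survey does not report the measure.
function resolveVariable(
  crosswalk: CrosswalkEntry[],
  measureId: string,
  surveyId: string,
): string | undefined {
  return crosswalk
    .find((e) => e.measureId === measureId)
    ?.sources.find((s) => s.surveyId === surveyId)?.variable;
}
```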

Theme 3: Align Future Direction

Stakeholders emphasized alignment with Executive Order modernization directives: transparent, accessible, easy to use, modern interface, user-centric, recognizable brand.

The current platform felt opaque and niche. The future state demanded being "the gateway portal to data" — discoverable, integrated, and salient to the broader public.

Strategic Action: Rebrand NCSES as research authority, not data warehouse. Make platform accessible to non-technical audiences. Invest in awareness campaign.

05 · Feature Innovations: Build & Feedback

The Build Table Feature: Measure-First Architecture

One of the most critical findings: successful competitive platforms (World Bank, CDC, FinStats) use a "measure-first" approach when users build custom tables. Instead of asking "which survey?", they ask "what measurement do you want?"

The NSF portal prototype demonstrated this pattern with an interactive table builder that prioritized user intent over data structure. Users selected desired measures (e.g., "Doctorate Holders by Field"), and the tool automatically identified compatible surveys and years.
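
The prototype's code isn't part of this case study; as a sketch of the measure-first pattern it demonstrated, assume a catalog that indexes each survey by the measures and years it covers, then resolve surveys from the user's selection (all names here are illustrative):

```typescript
// Hypothetical catalog entry: which measures and years a survey covers.
interface SurveyCoverage {
  surveyId: string;          // e.g. "SED" (Survey of Earned Doctorates)
  measures: Set<string>;     // harmonized measure ids this survey reports
  years: number[];           // years with published data
}

// Given the measures a user picked, find surveys that cover all of them.
function findCompatibleSurveys(
  selected: string[],
  catalog: SurveyCoverage[],
): { surveyId: string; years: number[] }[] {
  return catalog
    .filter((s) => selected.every((m) => s.measures.has(m)))
    .map((s) => ({ surveyId: s.surveyId, years: s.years }));
}

// Usage: the builder asks "what do you want to measure?" first,
// then resolves compatible surveys and years behind the scenes.
const matches = findCompatibleSurveys(
  ["doctorate-holders-by-field"],
  [
    { surveyId: "SED", measures: new Set(["doctorate-holders-by-field"]), years: [2019, 2020, 2021] },
    { surveyId: "NSCG", measures: new Set(["employment-by-sector"]), years: [2019, 2021] },
  ],
);
```

The design choice matters: the mapping from user intent to data structure happens once, in the catalog, rather than in every user's head.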

Color Selection & Schema Loading

The prototype also introduced an advanced feature: users could define or load custom color schemas for visualizations. This addressed a critical stakeholder need — the ability to create 508-compliant, on-brand export content without requiring design expertise.

This pattern solved a hidden pain point: federal researchers needed to produce 508-compliant reports but lacked design tools or expertise. Embedding color accessibility into the data tool eliminated downstream approval friction.
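
The case study doesn't describe the prototype's internals; as a hedged sketch of how such a feature might validate a loaded schema, here is the WCAG 2.x contrast computation that underpins Section 508 requirements, applied to chart colors. The schema shape and function names are assumptions:

```typescript
// WCAG 2.x relative luminance for an sRGB color (channels 0-255).
function luminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors, per WCAG 2.x (ranges from 1:1 to 21:1).
function contrastRatio(
  a: [number, number, number],
  b: [number, number, number],
): number {
  const [l1, l2] = [luminance(...a), luminance(...b)].sort((x, y) => y - x);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Hypothetical check: every series color in a loaded schema must meet the
// 3:1 minimum WCAG 2.1 requires for graphical objects against the background.
function validateSchema(
  background: [number, number, number],
  series: [number, number, number][],
): boolean {
  return series.every((color) => contrastRatio(color, background) >= 3);
}
```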

Interactive Feedback Feature

Parallel to the build process, the team prototyped an embedded feedback system. Rather than forcing users to file support tickets or navigate external survey tools, feedback was contextual and lightweight.

The feedback system was bidirectional: survey managers received structured feedback, and the platform showed crowdsourced use case patterns back to users — building community and signaling that user voice shaped the roadmap.
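
A sketch of what "contextual and lightweight" might look like in practice: the payload carries its own context (page, dataset, tool state) so survey managers receive structured, actionable input rather than free-text tickets. The endpoint and field names are assumptions:

```typescript
// Hypothetical structured feedback payload captured in place.
interface ContextualFeedback {
  page: string;                       // where the widget was opened
  datasetId?: string;                 // dataset in view, if any
  toolState?: Record<string, string>; // e.g. selected measures and filters
  rating?: 1 | 2 | 3 | 4 | 5;
  useCase?: string;                   // "what were you trying to do?"
  comment?: string;
}

// Submit without leaving the page; the endpoint name is an assumption.
async function sendFeedback(fb: ContextualFeedback): Promise<void> {
  await fetch("/api/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(fb),
  });
}
```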

Interactive Prototype

View the working prototype demonstrating these patterns:

06 · Strategic Outcomes & Recommendations

Strategic Direction: Two-Year Roadmap

The research program concluded with a 1–2 year strategic roadmap organized around the three strategic themes, with measurable success criteria tied to user outcomes.

Year 1: Foundation

Year 2: Expansion

Governance & Measurement

Impact

What Success Looks Like: NCSES becomes the destination for anyone seeking evidence on science and engineering trends. Researchers and policymakers find answers without friction. Federal data scientists produce 508-compliant exports without external design support. Survey managers see increased user engagement and fewer support requests — evidence that the platform is self-service and intuitive. NCSES workforce data informs national policy on STEM education and workforce development.

07 · Reflection

Strategy Without Research Is Assumption

Leadership could have commissioned a strategic plan based on organizational priorities and competitive benchmarks alone. Instead, they invested in a research-led approach that centered user needs, competitive reality, and organizational capabilities in equal weight.

The payoff: strategic direction that stakeholders believed in because they saw themselves in the research. Executives heard their own language reflected back. Survey managers recognized their users' problems. Product teams understood the trade-offs and could argue for resource prioritization based on evidence, not opinion.

Design Thinking as Leadership Tool

Many organizations treat design thinking as a creative brainstorm framework. Here, it was a strategic discipline. The "Understand → Define → Build" cycle forced the team to stay grounded in evidence at every stage, prevented premature solution-jumping, and created alignment across silos.

Complex Systems Require User-Centered Governance

NCSES data is genuinely complex — survey terminology differs, data harmonization is nontrivial, user expertise varies wildly. The platform can't hide that complexity. But it can make the complexity navigable by meeting users where they are (theme-based discovery, measure-first builders, multi-dimensional help) rather than forcing them through survey-centric pathways.

This required rethinking governance: data stewards needed to be stakeholders in platform design, feedback from users needed to be prioritized over stakeholder assumptions, and success metrics had to measure user outcomes (answer-finding speed, tool self-service rate), not platform outputs (data additions, feature launches).