Multi-phase research program that transformed complex federal data discovery into actionable user-centered strategy, serving researchers, policymakers, and data analysts across federal, academic, and private sectors.
Role: UX Lead, Researcher, Designer, & Strategist (Contractor)
Team: Cross-functional · Research-led strategy
Organization: National Center for Science and Engineering Statistics (NCSES)
Timeline: March 2022 · Strategic planning phase
Scope: National data discovery platform serving 200+ million users
NCSES manages the nation's most comprehensive data on science and engineering workforce trends, education pathways, and research investment. Yet stakeholders described the current platform as "klunky," fragmented, and inaccessible to anyone outside academic research circles.
Executive leadership tasked the team with one question: What strategic direction should guide NCSES Data Tools for the next 1–2 years?
The problem wasn't lack of data. It was lack of clarity about user needs, competitive landscape, and governance direction. The platform served at least four distinct user personas — from "power users" extracting data for analysis to "information consumers" seeking pre-built reports. Each needed a different pathway; none existed.
Rather than jumping to solutions, the team embraced design thinking to move from problem assumptions to evidence-based strategic direction. The program followed a structured "Understand → Define → Ideate → Prototype → Test" cycle — compressed into a planning timeline.
Understand: Deep research into user needs, competitive landscape, and internal capabilities.
Define: Synthesize findings into strategic themes and user-centered problems to solve.
Build: Convert strategic direction into roadmap-ready recommendations with prototypes.
Each research method answered a specific strategic question, creating triangulation that protected against bias and assumption-driven roadmapping.
This wasn't academic research. Every finding had to answer a commercial or strategic question. Design thinking kept the program grounded in user reality rather than organizational assumptions.
Examined 15 successful data platforms — World Bank, CDC, Census, Harvard Dataverse, Kaggle, Google Scholar, and others — to identify patterns worth investigating for NCSES. The goal wasn't copying competitors. It was understanding what patterns solve specific user problems at scale.
Successful platforms show users what data exists and how much is available. Breadcrumbs, faceted navigation, result counts, and visual hierarchy all signal data scope.
Users enter data by topic, industry, measure, or use case — not by survey name. Navigation accommodates all user expertise levels.
Complex data demands detailed pages. Each dataset gets a rich profile showing visualizations, relationships, and direct contact paths to data owners.
Effective platforms combine help centers (self-service library) with contextual help (tooltips, guided tours, embedded instructions).
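The data-scope signaling pattern above (result counts behind each facet) can be sketched minimally. The catalog entries and field names below are illustrative assumptions, not NCSES's actual schema:

```python
from collections import Counter

# Hypothetical catalog entries; fields are illustrative, not the real schema.
CATALOG = [
    {"topic": "Workforce", "source": "Survey of Doctorate Recipients"},
    {"topic": "Workforce", "source": "National Survey of College Graduates"},
    {"topic": "Education", "source": "Survey of Earned Doctorates"},
    {"topic": "R&D Funding", "source": "Higher Education R&D Survey"},
]

def facet_counts(entries, field):
    """Count datasets per facet value so the UI can label each filter
    with how much data sits behind it (e.g. 'Workforce (2)')."""
    return Counter(e[field] for e in entries)

counts = facet_counts(CATALOG, "topic")
```

Showing the count next to each facet is what signals scope: users see at a glance how much data exists before committing to a filter.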
Conducted lightweight qualitative review of the current platform, examining landing page, search, data explorer, and table/chart builders. Findings were organized by feature to feed immediate sprint cycles and inform larger strategic themes.
Landing Page: Navigation doesn't communicate position within NCSES ecosystem. No help content. Doesn't showcase data depth/breadth.
Search: Autocomplete provides too many results. Missing suggestion-type mix. Fails search logic tests.
Data Explorer: No filters for attribute-based discovery. Tabbed view misleading. No visual connection between results and search term.
Table & Chart Builders: No contextual guidance. Too many controls surfaced simultaneously. Users overwhelmed.
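The missing "suggestion-type mix" flagged in the search findings can be sketched as interleaving a capped number of results per suggestion type rather than flooding the dropdown. The suggestion sources and caps here are hypothetical:

```python
# Hypothetical suggestion sources; a real implementation would query an index.
SUGGESTIONS = {
    "datasets": ["Doctorate Recipients 2021", "Doctorate Recipients 2019"],
    "topics": ["Doctoral workforce", "Doctoral education"],
    "reports": ["Doctorate Employment Report"],
}

def mixed_autocomplete(suggestions_by_type, per_type=2, total=5):
    """Take at most `per_type` suggestions from each type, labeled by type,
    then truncate to `total` so the dropdown stays scannable."""
    mixed = []
    for kind, items in suggestions_by_type.items():
        for item in items[:per_type]:
            mixed.append((kind, item))
    return mixed[:total]

results = mixed_autocomplete(SUGGESTIONS)
```

Capping per type addresses both findings at once: the dropdown stops overwhelming users with one result type, and every suggestion carries a visible type label.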
Conducted 50+ interviews with executives, product managers, data stewards, and survey managers. Stakeholders shared vivid mental models of what NCSES should become.
"Our data should be used to answer the questions of today and predict the future. We have to change the narrative. Connect our data to the bigger problems our nation is trying to solve."
"Dumping more data into the tool does not equal success. We need to synthesize the data into information people can use."
Deep-dive session with data custodians revealed how they help their users solve real problems. Key use cases included researcher career outcome analysis, professional association benchmarking, state economic development strategy, and university program positioning.
A critical insight: Survey managers acknowledged that their users' analytics expertise varies, but they assumed all users understood survey data. The platform should guide without over-explaining — and allow users to toggle between "expert mode" and simplified flows.
NCSES serves four distinct user types, each with different expertise and needs. Current platform design assumes one audience.
Extract data for analysis in external tools. Need raw access, metadata, comparison capabilities.
Understand survey data and tables. Need rich metadata and contextual documentation.
Know what they want but not the technical path. Need guided discovery and simplified flows.
Prefer pre-designed reports and visualizations. Don't engage with raw data or tool complexity.
Strategic Action: Platform must accommodate all four user types through different entry points and interaction models — not force all users through one pathway.
Stakeholders unanimously agreed: adding establishment data, longitudinal datasets, and cross-survey harmonized tables would increase relevance across sectors — government policy, academic research, and private industry.
But current tool suite (primarily table/chart builders) couldn't handle all data types. Expansion required parallel tool development: interactive explorer for hierarchical data, comparison tools for longitudinal trends, and pre-built dashboards for common policy questions.
Strategic Action: Roadmap must include tool diversification, not just data addition. Establish data harmonization governance to enable cross-survey queries.
Stakeholders emphasized alignment with Executive Order modernization directives: transparent, accessible, easy to use, modern interface, user-centric, recognizable brand.
Current platform felt opaque and niche. Future state demanded being "the gateway portal to data" — discoverable, integrated, salient to broader public.
Strategic Action: Rebrand NCSES as research authority, not data warehouse. Make platform accessible to non-technical audiences. Invest in awareness campaign.
One of the most critical findings: successful competitive platforms (World Bank, CDC, FinStats) use a "measure-first" approach when users build custom tables. Instead of asking "which survey?", they ask "what measurement do you want?"
The NSF portal prototype demonstrated this pattern with an interactive table builder that prioritized user intent over data structure. Users selected desired measures (e.g., "Doctorate Holders by Field"), and the tool automatically identified compatible surveys and years.
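The measure-first lookup can be sketched as an index from measures to the surveys and years that supply them. The mapping below is a hypothetical stand-in for real survey metadata, not the prototype's data model:

```python
# Hypothetical measure-to-source index; real metadata would drive this.
MEASURE_INDEX = {
    "Doctorate Holders by Field": {
        "surveys": ["Survey of Doctorate Recipients"],
        "years": [2017, 2019, 2021],
    },
    "R&D Expenditures by Institution": {
        "surveys": ["Higher Education R&D Survey"],
        "years": [2019, 2020, 2021],
    },
}

def sources_for_measure(measure):
    """Resolve a user-chosen measure to compatible surveys and years,
    instead of asking the user to know the survey name up front."""
    entry = MEASURE_INDEX.get(measure)
    if entry is None:
        raise KeyError(f"No survey provides the measure: {measure!r}")
    return entry["surveys"], entry["years"]

surveys, years = sources_for_measure("Doctorate Holders by Field")
```

The design choice is the inversion itself: the survey name becomes an output of the interaction rather than a required input, which is what lets non-expert users build tables.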
The prototype also introduced an advanced feature: users could define or load custom color schemas for visualizations. This addressed a critical stakeholder need — ability to create 508-compliant, on-brand export content without requiring design expertise.
The color selection feature solved a hidden pain point: federal researchers needed to produce 508-compliant reports but lacked design tools or expertise. Embedding color accessibility directly into the data tool eliminated downstream approval friction.
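The check at the heart of such a feature can be sketched with the WCAG 2.x contrast math that Section 508 incorporates for text (AA minimum 4.5:1). This is a minimal illustration of the standard formula, not the prototype's implementation:

```python
def _channel(c):
    # Linearize one sRGB channel (0-255) per the WCAG 2.x definition.
    c /= 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two sRGB colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_508_text(fg, bg):
    """True when a color pair meets the 4.5:1 AA threshold for normal text."""
    return contrast_ratio(fg, bg) >= 4.5
```

Running every pair in a custom schema through a check like this is what lets the tool flag failing combinations before a chart is ever exported.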
Parallel to the build process, the team prototyped an embedded feedback system. Rather than forcing users to file support tickets or navigate external survey tools, feedback was contextual and lightweight.
The feedback system was bidirectional: survey managers received structured, contextual feedback, and the platform surfaced crowdsourced use case patterns back to users — building community and signaling that user voice shaped the roadmap.
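The bidirectional loop can be sketched as structured capture plus aggregation. The field names and categories here are illustrative assumptions, not the prototype's schema:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class FeedbackEvent:
    page: str        # where in the tool the feedback was left
    category: str    # e.g. "use case", "data gap", "confusing term"
    message: str

def use_case_patterns(events):
    """Aggregate structured feedback into the crowdsourced patterns
    the platform shows back to users, closing the loop."""
    return Counter(e.category for e in events)

events = [
    FeedbackEvent("table-builder", "use case", "State workforce benchmarking"),
    FeedbackEvent("table-builder", "use case", "University program positioning"),
    FeedbackEvent("search", "confusing term", "What is a harmonized table?"),
]
patterns = use_case_patterns(events)
```

Because each event is tied to a page and category rather than free text alone, survey managers get routable signal and users get visible evidence that feedback accumulates somewhere.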
View the working prototype demonstrating these patterns:
The research program concluded with a 1–2 year strategic roadmap organized around the three strategic themes, with measurable success criteria tied to user outcomes.
What Success Looks Like: NCSES becomes the destination for anyone seeking evidence on science and engineering trends. Researchers and policymakers find answers without friction. Federal data scientists produce 508-compliant exports without external design support. Survey managers see increased user engagement and fewer support requests — evidence that the platform is self-service and intuitive. NCSES workforce data informs national policy on STEM education and workforce development.
Leadership could have commissioned a strategic plan based on organizational priorities and competitive benchmarks alone. Instead, they invested in a research-led approach that centered user needs, competitive reality, and organizational capabilities in equal weight.
The payoff: strategic direction that stakeholders believed in because they saw themselves in the research. Executives heard their own language reflected back. Survey managers recognized their users' problems. Product teams understood the trade-offs and could argue for resource prioritization based on evidence, not opinion.
Many organizations treat design thinking as a creative brainstorm framework. Here, it was a strategic discipline. The "Understand → Define → Build" cycle forced the team to stay grounded in evidence at every stage, prevented premature solution-jumping, and created alignment across silos.
NCSES data is genuinely complex — survey terminology differs, data harmonization is nontrivial, user expertise varies wildly. The platform can't hide that complexity. But it can make the complexity navigable by meeting users where they are (theme-based discovery, measure-first builders, multi-dimensional help) rather than forcing them through survey-centric pathways.
This required rethinking governance: data stewards needed to be stakeholders in platform design, feedback from users needed to be prioritized over stakeholder assumptions, and success metrics had to measure user outcomes (answer-finding speed, tool self-service rate) not platform outputs (data additions, feature launches).