Case Study

Designing Across Government: UX Consulting for Public Service Teams

UX consulting across government programs — helping public service teams clarify complex problems, align stakeholders, and move from discovery to design direction.

Confidentiality note: The artifacts and project details shown in this case study are shared for portfolio and application review only. Some details have been anonymized, adapted, or redacted to protect confidential information. Please do not copy, distribute, or share these materials without permission.

Overview

I worked on an experience design consulting team that partnered with government agencies to improve public services, internal tools, and digital programs.

Each engagement started in a different place. My role was to quickly understand the context, identify what kind of design support was needed, and create artifacts that helped teams make decisions.

At a glance

Role

Senior UX Designer / Experience Design Consultant

Scope

Supported 6+ government programs, services, and digital tools.

Team

Worked with program staff, policy partners, business analysts, product teams, technical teams, and leadership.

Focus

Intake and scoping, discovery in Miro, heuristic analysis, user research planning, wireframing, prototyping, design systems, and handoff.

The challenge

Government teams often knew something needed to improve, but the problem was not always clearly defined. Our team helped agencies make sense of complex services, identify user needs, and move toward practical design solutions.

What that looked like in practice:

  • Ambiguous problem spaces: Teams came in with broad goals like “improve the process,” but the user pain points were unclear.
  • Multiple stakeholder groups: Program staff, policy teams, technical teams, leadership, and end users often had different priorities.
  • Projects already in progress: I often joined work midstream and had to quickly understand the history, constraints, decisions, and open questions.
  • Complex government workflows: Services were shaped by policy requirements, compliance needs, legacy systems, and operational constraints.
  • Limited time or user access: The design approach had to be practical, focused, and right-sized to each engagement.

My role

I supported multiple consulting engagements from discovery through design direction. Depending on the project, I led or contributed to research, facilitation, workflow mapping, wireframes, prototypes, and design recommendations.

I also helped strengthen the team’s design practice by contributing to reusable templates, engagement materials, and design processes as the team grew.

My consulting approach

I did not apply the same process to every project. I adapted the approach based on the team’s needs, timeline, maturity, and constraints.

When a team needed a particular kind of support, I matched my focus and artifacts to it:

  • Problem clarity: Understanding the ask, defining users, clarifying constraints, and identifying the right design questions. Artifacts: intake notes, problem statements, opportunity areas.
  • Stakeholder alignment: Facilitating working sessions and turning scattered feedback into shared priorities. Artifacts: workshop boards, journey maps, prioritization matrices.
  • User understanding: Researching user needs, pain points, workflows, and service barriers. Artifacts: research plans, testing scripts, moderator guides, findings decks.
  • Fast usability insight: Evaluating existing tools when timelines or user access made research harder to run immediately. Artifacts: heuristic analysis, severity ratings, annotated findings, recommendations.
  • A tangible solution: Making abstract service problems visible through flows, wireframes, and prototypes. Artifacts: user flows, wireframes, high-fidelity prototypes.
  • Delivery support: Documenting design rationale, recommendations, and next steps for implementation teams. Artifacts: annotated designs, handoff notes, recommendation decks.
  • Long-term consistency: Creating reusable templates and patterns that helped the design team scale. Artifacts: SOW templates, research templates, design templates, engagement materials.

How I worked

01

Clarify the problem

I helped teams define the actual problem, users, constraints, and decisions that needed to be made.

02

Understand the system

I looked beyond the screen to understand policy, workflow, operational, and technical factors.

03

Facilitate alignment

I used workshops and visual artifacts to align teams around shared priorities.

04

Make it tangible

I created maps, flows, wireframes, and prototypes to make decisions easier.

05

Support handoff

I created documentation and templates so teams could continue the work.

A closer look at my process

This section shows the artifact trail behind the work: how I moved teams from unclear requests to research-backed design direction. Each phase uses the strongest artifact as the hero visual, with supporting visuals underneath so the process feels concrete without overwhelming the page.

01. Intake, SOW, and project scoping

Scope of work document outlining the engagement approach, deliverables, and evaluation plan.
Timeline and work item table for the engagement.
Project kickoff notes showing links, timeline, meeting notes, and project structure.

Purpose

Turn a broad request into a focused design engagement with clear goals, constraints, stakeholders, and decisions.

What I did

  • Clarified the project ask, timeline, and success criteria.
  • Identified users, stakeholders, dependencies, and known constraints.
  • Captured project context through structured notes and intake questions.
  • Helped define the right design approach for the team’s stage of work.

What this shows

Project leadership, consulting judgment, communication skills, and the ability to join work already in progress.

02. Discovery, mapping, and prioritization

Journey map showing screen flows, stakeholder notes, and current-state discovery work.
Prioritization exercise mapping findings by impact and effort.
User flow map showing fields, data sources, editable elements, and step-by-step flow structure.

Purpose

Make scattered project context visible so the team could align on what was known, unknown, and worth exploring.

What I did

  • Used Miro to capture stakeholder input, meeting notes, and discovery findings.
  • Grouped notes into themes, pain points, open questions, and opportunities.
  • Mapped current-state workflows and key handoffs.
  • Prioritized pain points by impact and effort to identify quick wins and bigger bets.

What this shows

Facilitation, synthesis, storytelling, multidisciplinary collaboration, and systems thinking.

03. Heuristic analysis

Heuristic analysis setup using usability heuristics and evaluation templates.
Miro board overview of heuristic evaluation findings.
Synthesis and evaluation methodology slide showing severity levels and scoring criteria.

Purpose

Identify usability issues quickly and give the team an objective way to prioritize improvements.

What I did

  • Reviewed existing screens against Nielsen Norman Group usability heuristics and interaction design principles.
  • Created a Miro-based evaluation template so findings could be captured consistently across flows.
  • Documented issues with severity ratings and design rationale.
  • Defined a scoring method so critical, major, minor, positive, and suggestion-level feedback could be reviewed consistently.
  • Translated findings into practical recommendations the team could act on.

What this shows

UX judgment, interaction design skill, accessibility awareness, and the ability to make progress under constraints.

04. User research planning and synthesis

User research planning template showing session structure, prompts, observation notes, and planning materials.
User research tracker showing participant schedule, consent status, device type, facilitation roles, and note-taking logistics.
Research synthesis board showing observed patterns, findings, and opportunity areas.
Theme clusters from user research showing grouped observations and recurring pain points.

Purpose

Structure research so the team could test assumptions, understand user needs, track sessions, and translate findings into design direction.

What I did

  • Defined research goals, participant criteria, and key questions.
  • Created testing scripts, task prompts, and note-taking structures.
  • Tracked participant sessions, consent status, device type, facilitation roles, and note-taking responsibilities.
  • Grouped observations into themes to identify recurring pain points and opportunities.

What this shows

Human-centered design, research maturity, communication skills, and evidence-based decision-making.

05. Wireframing, prototyping, and design systems

Figma design board showing prototype exploration, alternate design options, and screen states.
Figma file overview showing screens, flows, and design iterations.
Design system board showing reusable components and pattern references.

Purpose

Translate research, stakeholder input, and policy requirements into a tangible experience the team could review and refine.

What I did

  • Created user flows to clarify paths, decision points, and dependencies.
  • Explored alternate flow options and screen states in Figma before moving toward final designs.
  • Designed wireframes and prototypes to test structure, content, and interactions.
  • Used reusable patterns and design system components where possible.
  • Referenced existing component libraries to reduce inconsistency and support scalable design decisions.

What this shows

Visual design, interaction design, systems thinking, design systems experience, and end-to-end product design skills.

Key artifacts

Intake and SOW materials

Show how I scoped work, clarified the ask, documented constraints, and aligned on engagement goals.

Miro discovery boards

Show note-taking, synthesis, stakeholder input, themes, workflows, and open questions in one shared space.

Heuristic analysis

Shows usability findings, severity ratings, annotated screenshots, and prioritized recommendations.

User research materials

Show research plans, testing scripts, task prompts, moderator guides, decks, and synthesis outputs.

Wireframes and prototypes

Show how I translated findings, requirements, and workflows into tangible design solutions.

Design system references

Show how I used existing components and patterns to support consistency across screens and workflows.

Design principles I applied

Human-centered

Grounded design decisions in user needs, pain points, and real workflows.

Systems-minded

Considered the full service ecosystem, including policy, operations, technology, and handoffs.

Accessible and inclusive

Focused on clear content, usable patterns, and experiences that work for diverse users.

Policy-aware

Balanced user needs with government requirements, compliance, and delivery constraints.

Pattern-driven

Used reusable design patterns and templates to create consistency across complex services.

Future-focused

Looked for opportunities to support data-informed decisions, emerging technology, and scalable delivery.

Impact

Program impact

  • Supported 6+ government programs, services, and tools.
  • Helped teams clarify complex problems and user needs.
  • Created artifacts that supported decisions and delivery.

Team impact

  • Supported a growing design consulting practice.
  • Contributed to reusable templates and engagement materials.
  • Helped create more consistent ways of scoping design work.

Design impact

  • Made complex workflows easier to understand.
  • Connected user needs with policy and operational requirements.
  • Used design artifacts to align teams and move work forward.

What this shows about my design practice

This experience shaped how I work in complex government environments. I learned how to enter ambiguous projects, build trust quickly, and use design to create clarity.

The biggest takeaway: design in government is not just about improving interfaces. It is about helping public service teams reduce complexity, work across constraints, and build services that better meet people’s needs.