Iterative Usability Research
Beyond Research: Rebuilding a Government Toolkit
How I transformed a high-risk project by combining UX research with hands-on development to deliver an impactful, cost-saving national tool.
Company
Office of Management & Budget
Timeframe
3 months (with extension)
Team Structure
Lead Researcher partnered w/ Service Designer

The Challenge: A Project on the Brink of Failure
Mandated by a Presidential Executive Order, our goal was to pilot a toolkit to help federal agencies measure "administrative burden," a term used to describe the friction people experience when interacting with government services.
However, upon joining the project, I discovered the proof-of-concept (POC) tool was critically flawed and the entire initiative was at risk.
1
The Tool Was Unusable
The POC was clunky, non-functional, and not ready for testing. Using it would only yield feedback on Excel's limitations, not on our core methodology.
2
Agencies Were Skeptical
Pilot agencies already saw the tool as a burdensome "compliance exercise." A frustrating experience would confirm their fears and kill any chance of adoption.
3
The Project's Future Was in Doubt
The assumption was that a future phase would require a costly web app. If our pilot failed, the entire initiative—and its significant potential funding—could be cancelled.
Research Deep Dive
To navigate the project's complexities, we grounded our work in a robust research plan, which I adapted in real-time to address the challenges with the tool.

Objectives & Key Questions
Our research was designed to move beyond assumptions and gather concrete evidence about the toolkit's utility and usability.
Primary Objectives
Evaluate the usability, clarity, and value of the Holistic Burden Assessment Toolkit.
Understand how effectively agencies could use the tool to identify and measure burden in their services.
Identify barriers and necessary improvements to ensure successful adoption.
Research Questions
What are the primary pain points for agency staff when using the toolkit?
How can the tool's workflow and visualizations be improved to better support agencies?
What guidance and support do agencies need to use the tool effectively?
What would motivate agencies to adopt this tool as a valuable instrument, not just a mandate?

Methodology & Sampling
Given the need to both gather feedback and iteratively develop the tool, I implemented a hybrid methodology.
A Hybrid Approach
Co-Design & Discovery Workshops: Initial sessions to understand agency needs and their existing journey maps.
Moderated Usability Testing ("Talk Aloud"): Once I built V1, we conducted sessions where I guided participants through realistic tasks to gather rich qualitative feedback.
Qualitative Thematic Analysis: After sessions, I used Dovetail to systematically code feedback and identify themes, pain points, and opportunities.
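To make the thematic-analysis step concrete, here is a minimal sketch of the underlying idea: each piece of session feedback is tagged with one or more theme codes, and tallying those codes shows which pain points recur. The actual coding was done in Dovetail; the quotes and theme names below are invented purely for illustration.

```python
from collections import Counter

# Each tuple pairs a piece of session feedback with the theme codes applied to it.
# These quotes and codes are hypothetical examples, not real pilot data.
coded_feedback = [
    ("I wasn't sure which tab to fill in first",   ["navigation", "guidance"]),
    ("Re-typing the same step names felt tedious", ["data-entry"]),
    ("The chart made the worst steps obvious",     ["visualizations"]),
    ("I'd want instructions for my whole team",    ["guidance"]),
]

# Tally how often each theme appears across sessions.
theme_counts = Counter(code for _, codes in coded_feedback for code in codes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} mention(s)")
```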
Participant Sampling & Screening
Our participants were six federal agencies (including HUD, DOJ, FEMA, and OSHA). We specifically recruited representatives directly involved in customer experience and service delivery. This was critical because they had the necessary context of their agency's services and could test the tool with a real-world journey, making their feedback essential for validating its practicality.
The Pivot: From Researcher to Builder
Recognizing the roadblock posed by the unusable tool, I leveraged my hybrid background in UX and development to propose a strategic pivot:
I would rebuild the tool myself.
The Research Process
The "Aha!" Moment &
Gaining Stakeholder Buy-In
I framed the problem around the client's risks and presented a clear, time-boxed plan: give me two weeks to build a functional V1 tool while the team ran initial workshops. By providing a concrete, low-risk solution, I turned a roadblock into a trust-building opportunity.
1
Parallel-Pathing Research & Development
I architected our workshops to run in parallel with my development. While our Service Designer led agencies through journey-mapping exercises, I was in the background rebuilding the tool. This strategic scheduling kept the project on track and bought crucial development time.
2
Conducting Hands-On Usability Testing
With my functional V1 complete, we ran moderated "talk aloud" usability tests with six federal agencies. As both the developer and moderator, I was able to gather direct, actionable feedback on the new tool's workflow and value. I then used Dovetail for thematic analysis to pinpoint areas for final improvements.
3
Delivering the Final Product
I synthesized all feedback from the pilots and implemented the final changes, delivering a polished, user-validated V2 of the toolkit and a comprehensive instruction manual, completing the project beyond its original scope.
The Solution: From POC to Robust Application
My development work transformed a static proof-of-concept into a dynamic, user-friendly, and insightful application entirely within Excel; a simplified sketch of the kind of logic the rebuild automated follows the comparison below.
Before (V0 - The POC)
A cumbersome tool:
Manual, error-prone data entry
High cognitive load requiring users to cross-reference multiple tabs
Static fields with limited scalability and a rigid workflow
Basic, uninspiring results display
After (V2 - My Rebuild)
A robust, fully functioning tool:
Fast, scalable data entry via dynamic tables
Automated descriptions and ratings to reduce cognitive load
Flexible sequencing to match real-world workflows
Enhanced, actionable data visualizations to drive change
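As promised above, here is a simplified sketch of the kind of logic the rebuilt tool automated. To be clear, the real toolkit lives entirely in Excel (dynamic tables, formulas, and charts); this Python version is only an illustration, and the field names, scoring rule, and thresholds are assumptions rather than the toolkit's actual schema.

```python
from dataclasses import dataclass

@dataclass
class JourneyStep:
    """One step in a customer's journey through a government service (illustrative fields)."""
    name: str
    time_minutes: int        # estimated customer time spent on this step
    documents_required: int  # paperwork the customer must gather
    handoffs: int            # times the customer is passed between offices

def burden_score(step: JourneyStep) -> float:
    """Combine a step's inputs into a single numeric burden score (assumed weighting)."""
    return step.time_minutes / 30 + step.documents_required + 2 * step.handoffs

def burden_rating(score: float) -> str:
    """Translate the numeric score into an automated Low/Medium/High rating."""
    return "High" if score >= 8 else "Medium" if score >= 4 else "Low"

def summarize(journey: list[JourneyStep]) -> None:
    """Print steps from most to least burdensome, with their ratings."""
    for step in sorted(journey, key=burden_score, reverse=True):
        score = burden_score(step)
        print(f"{step.name:<30} {burden_rating(score):<6} (score {score:.1f})")

summarize([
    JourneyStep("Gather eligibility documents", 90, 6, 1),
    JourneyStep("Submit online application", 30, 1, 0),
    JourneyStep("Attend in-person interview", 120, 2, 3),
])
```

In the spreadsheet itself, the equivalent is dynamic tables feeding formula-driven ratings and charts, so agency staff enter each journey step once and the ratings and visualizations update automatically.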

Key Findings: What We Learned from the Pilot
The pilot yielded critical insights not just about the tool, but about the organizational mindset needed to successfully reduce administrative burden.

Collaboration is Key
Agencies that assembled cross-functional teams (policy, CX, IT) gained a far more comprehensive understanding of burden.

A Growth Mindset Wins
Those who saw the tool as a way to learn and improve extracted immense value, while those focused on compliance struggled to engage.

Data is a Catalyst for Change
Participants recognized the visualizations as a powerful tool to advocate for resources and drive change within their agencies.

The Tool Revealed Data Gaps
For agencies without robust CX data, the tool provided a clear framework for what user data they *should* be collecting.
Outcomes & Impact: Delivering Unexpected Value
My hybrid approach didn't just save the project; it delivered tangible outcomes that far exceeded the original scope and client expectations.
+3 Months
Project Extension Secured
The project's success led directly to a contract extension and increased revenue for my firm.
$100,000s
Saved in Development Costs
Eliminated the need for a costly web app by creating a robust, scalable Excel solution.
High
Agency Engagement
The user-friendly tool drove overwhelming enthusiasm and immediate desire for adoption.
Reflections & Professional Growth
This project was a profound learning experience that honed my skills as a senior UXR and strategic consultant.
The Power of Versatility
I learned that the most impactful UX professionals don't just identify problems—they actively seek to solve them. Combining my UXR skills with my technical capabilities allowed me to de-risk the project and deliver a solution that exceeded all expectations.
Build Trust to Challenge Assumptions
By first delivering a functional tool on a tight deadline, I earned the client's trust. This trust empowered me to later challenge their assumptions and confidently advocate for a better path forward.
Empathy is a Strategic Tool
The key to overcoming agency skepticism was empathy. By understanding their fear of being judged and reframing the tool as a way to help them advocate for themselves, we turned potential adversaries into enthusiastic partners.
True Value Drives Adoption
The core challenge was building a tool people might not want to use. By relentlessly focusing on what would provide tangible value to the agencies—actionable data and an easy interface—we created a pull for the product, transforming a mandate into a desired resource.