See full case study below
For confidentiality reasons some content has been intentionally rebranded, modified, or omitted. In this case study the client will be called Staffing Co. or Versa.
Staffing Co. is an international B2B staffing company, top 5 nationally for private-label merchandising. They partner with 3.5K clients across the US and Canada in 150K stores, running food sampling events, conducting store resets, and stocking merchandise. Staffing Co. manages $180 billion in physical and digital sales and runs over 2 million in-store sampling events per year.
Their team of over 50,000 employees (called "associates" or "teammates") is split across 4 distinct Business Units, each operating independently with its own leadership, client contracts, brands, and recruiting processes.
The primary challenge for Staffing Co. was high employee turnover, leading to inflated talent costs and a lack of continuity across the organization.
107% Turnover YoY across Business Units
- That's 42% higher than the high-volume-talent industry benchmark of 75%
- Certain BUs fare even worse: BU A at 163% turnover and BU B at 134%
- 60% of this churn occurs within the first 90 days
$15 Million Spent to Hire 20K Employees
- Extremely high acquisition costs, at approximately $750 per new hire
- 30% of all hired associates never make it to their first shift, meaning $7.5 million wasted on non-deployed talent
Existing exit procedures did not adequately capture churn drivers, and there was ambiguity within the organization as to what led to this problem and how it might be addressed.
After months of preparation, we launched a 90-day pilot across two markets, facilitating the hiring of 292 new associates, conducting 64 in-depth interviews, and gathering 273 survey responses alongside weekly supervisor check-ins.
The result was measurable business impact: NPS rose for 5 of 7 enhancements, eNPS increased by 18%, new-hire retention improved by 8 percentage points, and the pilot delivered ~$35M in run-rate margin, ultimately expanding to reach 410+ associates in a second wave.
NPS was higher for 5 of 7 enhancements compared to baseline
eNPS increased by 18% within the pilot group.
New hire retention rose 8 percentage points
In the exploratory phase of work we conducted 32 user interviews and launched an organization-wide survey to capture 2 types of user research:
Research to understand the user (associate)
Research to understand the organization / client context
We found both types of research critical to building a deep understanding of the systems-design pain points that existed. This informed 2 core deliverables.
4 User Persona Cards
We created 4 user persona cards to capture the needs, behaviors, and motivations of the primary retail associate personas: the Juggler, the Growth Seeker, the All-In, and the Hobbyist.
Persona cards became a critical artifact for client and consultant interaction. By clearly integrating persona features with organizational statistics, business-focused and executive-level leaders could see how design thinking and personas can guide the building of impactful solutions that resonate with employees.
User Experience Blueprint + Systems diagrams
We then created a detailed user experience blueprint that captured the Net Promoter Score (NPS) of current employees and the projected NPS improvements from 54 proposed ideas, dimensionalized by user value, business value, and operational value.
This allowed us to visualize the user journey and pinpoint pain points at a glance. To capture the nuances of what occurred at each step, we created a series of journey maps and systems-diagram deep dives.
We started off by identifying causes and effects of current employment practices across 4 main steps:
Attraction & Acquisition
Onboarding & Training
Deployment & Retention
Recognition & Rewards
After this initial brainstorm, we created detailed systems diagrams for the key practices in place across these steps, mapping out human, digital, and system touchpoints across sub-steps.
To align with executive-level leadership and foster collaboration, we facilitated a comprehensive client Innovation and Design-Thinking workshop. The workshop included the following key elements:
Journey Maps: We printed and presented our journey maps for client discussion
Idea Generation: The client participated in brainstorming sessions to generate innovative ideas and solutions that built off of the ideas we pre-created. This hands-on approach encouraged active participation and co-creation.
Idea Charters: Clients then broke out into groups and created idea charters and refined sketches to bring their refinements to life.
Idea Charters
Idea charters help create fun and effective brainstorming sessions by facilitating a space for the capture of ideas and ensuring all breakout groups discuss all angles of the problem.
In total, we had ~30 people attend the workshop, with 6 breakout groups of 5 each.
Workshop Outcomes: Green Lit Pilot
The workshop concluded with a clear direction: a strong desire from Staffing Co. to run a pilot program to bring these innovative concepts to life. This pilot would allow us to test and validate our ideas before making a broader investment and scaling the initiative across the organization.
Next steps involve planning and executing the pilot, gathering feedback, and iterating on our designs to ensure successful implementation and maximum impact.
Once we knew the direction we were going, we began internal ideation to build on the idea charters.
Leveraging mind maps; human, digital, and systems ideation; and "How Might We" (HMW) clusters, we eventually narrowed our thinking down to 7 preliminary concepts.
To test our ideas with users, we created digital sketches of the 7 concepts and recruited participants via Respondent.io.
We spoke to 6 job candidates, 14 current associates, and 7 supervisors.
Synthesizing the user feedback, we found that the most popular ideas (across user groups) focused on:
Sense of connection
Feelings of recognition
Sense of proactive support
Who makes the final call on design choices when working for a client?
It's all about balance.
Though there is never a clear-cut answer, we found it in the middle ground between user needs (informed by user testing), organizational capability and macroeconomic trends (informed by quantitative data analysis), and client appetite to do the work (informed by meetings with Staffing Co. executives).
Business and design goals are one and the same: to deliver value and a delightful experience to the end user. By framing our insights this way, we were able to generate buy-in from our clients.
After packaging up our ideas, we pitched them back to the client… who loved it! The client team was aligned with our pilot vision and wanted to hit the ground running to start building our ideas.
With alignment on our ideas, we moved into prototyping. We began by creating a full storyboard of the envisioned future experience, illustrating exactly how the associate experience would evolve over the coming months. This helped us clearly communicate the vision to supervisors, BU leads, and the executive client team, making the proposed changes tangible and actionable.
Building high-fidelity assets
Because this pilot would ultimately be rolled out to real associates, we wanted it to feel high-fidelity and production-ready, not just a rough prototype. Our goal was for scores and ratings to reflect genuine reactions, and for the experience to truly help associates so it could be scaled as far as possible.
To achieve this, we turned to a diverse set of prototyping methods to bring the ideas to life:

20 Custom HTML Email Templates

27 SharePoint Pages

2 Figma flows

1 Automated Excel
My custom pages embedded in the client SharePoint wowed the SharePoint Client Lead, whom I met to discuss advanced site edits:
"This is great. I was looking at the pages you (Jennifer) created and I'm baffled. I actually have some questions for you on how you did that!"
– Client, discussing my SharePoint pages
Creating Ecosystem Buy-In
Once the assets were created, we still needed ecosystem-wide buy-in. The success of our designs depended on supervisors, welcome buddies, and senior associates actively participating in the onboarding experience, so we spent three weeks on targeted outreach to these groups.
We explained the concepts, tied them back to the pain points they addressed, and painted a clear vision of the positive outcomes, generating excitement about the company's direction. We also solicited volunteers (with additional pay) to take part in piloting the program, ensuring we had committed partners for rollout.
Once the digital and human systems were in place, we began preparing for our pilot rollout. We conducted an analysis of potential client markets and decided to run two pilots in parallel, one in each of two distinct geographies, to avoid skew from geographic bias. We also identified nearby control counties for each pilot to compare results against.
Market #1: Cook County, IL

Population: ~5.1M (second-largest county in the U.S.)
Highly urban, anchored by Chicago
Diverse demographics and dense labor market with a mix of retail, service, and professional industries
Market #2: Maricopa County, AZ

Population: ~4.6M (fastest-growing large county in the U.S.)
Suburban/urban mix, anchored by Phoenix
Rapidly expanding service sector with strong part-time and hourly workforce
Diff-in-Diff (DiD) Experiment Design
This was essentially a DiD design, a method where two groups are measured over the same period:
Pilot group: Cook County & Maricopa County, where our 7 initiatives were implemented over a 90-day period
Control group: Matched nearby counties with no changes during the same 90 days
By comparing the change over time in the pilot group to the change over time in the control group, we could isolate the impact of our initiatives from any external trends (seasonal changes, market fluctuations, etc.).
Impact = (Pilot After - Pilot Before) - (Control After - Control Before)
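The DiD formula above can be sketched as a small function. This is a hypothetical worked example; the retention rates below are illustrative numbers, not the pilot's actual data.

```python
# Hypothetical illustration of the diff-in-diff impact calculation.
# All rates are made-up retention percentages, not real pilot figures.

def did_impact(pilot_before, pilot_after, control_before, control_after):
    """Impact = (Pilot After - Pilot Before) - (Control After - Control Before)."""
    return (pilot_after - pilot_before) - (control_after - control_before)

# Suppose pilot-market 90-day retention rose from 60% to 70%, while the
# matched control counties rose from 58% to 61% over the same window.
impact = did_impact(60.0, 70.0, 58.0, 61.0)
print(impact)  # 7.0 percentage points attributable to the initiatives
```

Subtracting the control group's change strips out seasonal and market-wide trends that affected both groups equally.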
We wanted to ensure that any differences in results were driven only by our pilot initiativesβnot by underlying differences between markets. To do this, we created a balance table comparing test and control counties on key metrics (demographics, workforce composition, employment rates, etc.). After validating that these populations were statistically similar, we moved forward with confidence.
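A balance table like the one described can be built from per-covariate checks such as the standardized mean difference (SMD); a common rule of thumb is |SMD| < 0.1 for acceptable balance. The sketch below uses hypothetical covariate values (e.g., median associate age), not the actual county data.

```python
import math

# Hypothetical balance check for one covariate between test and control
# counties. Values are made up for illustration only.

def standardized_mean_diff(test, control):
    """Standardized mean difference using the pooled standard deviation."""
    mt = sum(test) / len(test)
    mc = sum(control) / len(control)
    vt = sum((x - mt) ** 2 for x in test) / (len(test) - 1)
    vc = sum((x - mc) ** 2 for x in control) / (len(control) - 1)
    return (mt - mc) / math.sqrt((vt + vc) / 2)

test_ages = [34, 36, 35, 33, 37]
control_ages = [35, 34, 36, 33, 36.5]
balanced = abs(standardized_mean_diff(test_ages, control_ages)) < 0.1
print(balanced)  # True: this covariate is acceptably balanced
```

Repeating this check across demographics, workforce composition, and employment rates yields the balance table used to validate the test/control pairing.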
Monitoring ~800 Survey Responses and ~300 Check-ins over 90 days
After months of preparation, we launched our long-anticipated pilot, a major milestone for the team. Over the 90-day period, we:
Hired 292 Associates
with streamlined onboarding and up-skilling
Held 36 BU Meetings
weekly, with batched supervisors in each BU
Monitored 240 Check-ins
between supervisors and associates
In parallel, we conducted 64 in-depth interviews with associates, candidates, and supervisors to capture a holistic view of what was working and where improvements could be made. We also sent out regular surveys to measure Net Promoter Score (NPS) and overall satisfaction, receiving 783 total survey responses.
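The NPS figures from those surveys follow the standard 0-10 calculation: the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch, with made-up sample scores rather than real survey data:

```python
# Standard NPS calculation applied to 0-10 survey responses.
# The sample scores are illustrative, not actual pilot data.

def net_promoter_score(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

sample = [10, 9, 8, 7, 9, 6, 10, 3, 9, 8]
print(net_promoter_score(sample))  # 30.0
```

Computing this per enhancement, before and after rollout, is what allowed us to say NPS rose for 5 of the 7 enhancements.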
The feedback was overwhelmingly positive, with many associates highlighting how the pilot improved their onboarding experience. Constructive feedback also emerged, giving us valuable opportunities to refine processes in real-time and share tangible insights back with stakeholders.
Listen to their stories below!
"When I signed up for a part-time job I didn't know that there would be this much community and support. Especially at my age (53) learning a bunch of new things can feel hard but they made it easy."
– "Alyssa", New Hire
These continuous listening efforts ensured the pilot stayed responsive and adaptable, helping us maintain strong engagement and momentum throughout the 90 days.
Reflect – Impact & Next Steps
By the end of the 90-day pilot, we saw significant improvements across multiple key metrics:
NPS was higher for 5 of 7 enhancements
eNPS increased by 18% within the pilot group.
New hire retention rose by 8 percentage points
Pilot delivered ~$35M run-rate margin in talent opportunities
The results were strong enough that the client extended the engagement, launching three new initiatives and conducting a second wave of pilot testing. In total, our work went on to impact 410+ new hires.
The Power of Listening to Users
In fast-paced design cycles, it's easy to rush to ship or settle for a small testing sample, which can cause valuable user voices to be overlooked. This project was different: I invested deeply in listening to and understanding associates. I personally reviewed 250+ check-in notes and 783 survey responses, becoming the resident expert on associate opinion within the team.
That commitment didn't just improve the product; it made me feel more connected to our users, and more motivated to design empowering, impactful experiences for them.
What the team said about working with me…
"She wowed the team with her at-cause attitude, attention to detail, and high-quality output. Jennifer has proven herself to be reliable, collaborative, and adaptive; consistently throughout this case she has implemented feedback and improved at an incredible pace. Jennifer's key strengths are her visual design, collaborative systems thinking, and analytical mindset (evidenced by her innovative approach to her design work, which significantly increased the value of the work and always supported the holistic case answer)."
"Jennifer continues to impress team members with her proactivity, reliability, and stellar design work. Jennifer consistently took on new asks (whether learning new tools or developing new types of deliverables) and proactively got smart on the skills needed to produce high-quality output."
