There’s a specific design investment that we rarely see correctly measured and reported on: the ROI of design education and design thinking workshops.
If you invest in workshops, courses, or resources to educate members of your staff, you should be able to accurately trace their value after implementation. 71% of companies that practice design thinking report an improved team culture, and that’s fantastic. But you need quantifiable results.
Check in at the 30-, 60-, and 90-day marks
We’ve seen scheduled check-ins with the employees who received training produce valuable data. Meeting with participants at 30, 60, and 90 days after the education provides insight into how effective the training was at the organizational level, and whether it’s actually being put into practice. Some data points to consider in your 30-60-90 day follow-ups:
- Was the information effectively taught and retained?
- Is the information being actively applied to daily work?
- Is there room for further education and improvement?
- Has the professional development had an effect on employee satisfaction?
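Answers to those four questions can be tracked over time. As a minimal sketch, the check-in responses could be scored on a simple rating scale and averaged per question at each milestone; the field names, the 1-5 scale, and all figures below are illustrative assumptions, not a prescribed format.

```python
# Hypothetical sketch: aggregating 30/60/90-day check-in responses.
# Question keys, the 1-5 scale, and the sample scores are assumptions.
from statistics import mean

# Each response rates the four follow-up questions on a 1-5 scale.
checkins = {
    30: [{"retention": 4, "application": 3, "growth": 4, "satisfaction": 4},
         {"retention": 5, "application": 4, "growth": 3, "satisfaction": 5}],
    60: [{"retention": 4, "application": 4, "growth": 4, "satisfaction": 4}],
    90: [{"retention": 5, "application": 5, "growth": 3, "satisfaction": 5}],
}

def average_scores(responses):
    """Average each question's score across all respondents."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

for day, responses in sorted(checkins.items()):
    print(f"Day {day}: {average_scores(responses)}")
```

Comparing the day-30 averages against day-90 shows whether retention and application are holding up or fading as the training recedes.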
Ask your design education partner whether they provide these follow-ups as part of their services; it’s ideal to involve the same organization that delivered the training.
Measure Changes in Outputs, Outcomes, and Time-Spent
71% of design-trained companies report a 10X increase in asset production. If that holds for your team, you’ll need to be ready to measure and account for every one of those assets. Compare the raw output of assets and artifacts against the pre-training baseline to get a rough estimate of impact. A more detailed process for asset measurement can be found in our ROI of Design report.
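The baseline comparison above is simple arithmetic. A minimal sketch, using made-up quarterly figures purely for illustration:

```python
# Hypothetical sketch: rough impact estimate from asset counts.
# The quarterly figures are illustrative, not real benchmarks.
baseline_assets = 40   # assets produced per quarter before training
current_assets = 110   # assets produced per quarter after training

uplift = (current_assets - baseline_assets) / baseline_assets
print(f"Asset production change vs. baseline: {uplift:+.0%}")
```

The same percent-change calculation applies to any output metric you already track, as long as you capture the pre-training baseline before the workshops begin.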
Most of the design metrics we cover in the report can be applied across industries, while some are industry-specific. Regardless, if you have systems in place for tracking performance metrics (and you should), any improvements due to design training will be apparent in them.
Track shifts in employee work styles
Design training will also result in changes in work styles that generate new metrics to measure against. How many ideas were generated in ideation and brainstorming sessions (hint: you can just count the sticky notes)? It’s also likely that design training participants will share some of their exercises with clients or internal stakeholders. These sessions should be recorded, and feedback surveys sent to everyone who participated.
These shifts in workflow after design training range from dramatic to nuanced, but the effects ripple outward and can sometimes get lost in the shuffle. It’s important to monitor where the work originates so you can accurately track its impact.
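The sticky-note counting above can be turned into a trackable metric. As a sketch, assuming per-session idea counts logged before and after training (all counts below are invented for illustration):

```python
# Hypothetical sketch: comparing ideas generated per session
# before and after design training. Counts are illustrative.
pre_training_sessions = [6, 8, 5]     # sticky notes per session, pre-training
post_training_sessions = [14, 11, 17] # sticky notes per session, post-training

def avg(counts):
    """Mean ideas generated per session."""
    return sum(counts) / len(counts)

change = avg(post_training_sessions) / avg(pre_training_sessions) - 1
print(f"Average ideas per session changed by {change:+.0%}")
```

Even a log this simple, kept consistently, gives you a before-and-after picture of ideation output that survey impressions alone can’t provide.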
Want to know how your employees feel? Ask them.
A direct way to gauge the effectiveness of design training in your company is to ask the trainees directly. But if you want reliable, actionable data, you have to ensure one thing: anonymity.
Effective design is about creating safe spaces for freedom of expression. If your employees know the C-suite is going to read these and their names are attached to their answers, you won’t get quality feedback. It’s best for these surveys to come from a third party, like your design training facilitator, to encourage safety and honesty.
It’s important to draft these questions before the training even takes place. Again, it’s best to ask your design thinking partner whether they provide monitoring and evaluation services; they have experience running these evaluations and can provide critical guidance on the questions you ask.