
- Key Questions:
What are the largest user-centric bottlenecks behind the scientific replication crisis? Which specific usability improvements most directly reduce wasted time, training costs, and error? How do those improvements scale and provide bilateral, sustainable value?
- End Users:
Academia: Principal Investigators; Post-Doctoral, Graduate, and Undergraduate Students
Pharmaceutical: Assay or Analytical Development Managers, Analysts; Quality Control Managers, Analysts
- Industries:
B2C - pivot to B2B - SaaS
Academia
Pharmaceutical - Quality Control
Contract Research Organizations (CROs)
- Organization:
Inscriptures, in collaboration with Harvard Medical School, the NSF, and Rutgers University.
- Product:
Assays® - a service-to-SaaS scaled solution to visualize and manage biomedical research procedures.
- Problem Hypothesis:
High research lab turnover leads to lost institutional knowledge and abandoned research projects. An estimated 80-85% of all biomedical research - $268.4B per annum - is wasted, spanning both academic and pharmaceutical procedures.
- Market Research Opportunity:
For biomedical industries alone: the New England Journal of Medicine estimated global biomedical Research and Development (R&D) costs at $268.4 billion in 2012, a snapshot of a growing trend (Chakma, et al. 2014). An estimated 85% of cumulative research investment - equating to $200 billion in 2010 - is wasted (Chalmers and Glasziou 2009). Findings or alterations in methodology go unpublished or shelved, requiring additional investment in repeated protocols. Human error is an unavoidable elephant in the clinic and laboratory that is often disposed of (Compton, Szklarski, and Booth 2017). A 15-year meta-analysis by Harvard Medical School strictly categorized inappropriate testing in a clinical environment and estimated that 20-30% of tests met these criteria (Zhi et al. 2013). Up to 89% of published trial reports are nonreplicable due to poor descriptions, according to the Lancet (Glasziou et al.).
Step 2: Discovery Survey
- Why?:
Before the one-on-one interview methodology depicted in the InAct case study, a survey provides both a useful interview-recruiting tool and a quantitative sampling baseline to triangulate qualitative insights. This survey targeted the first userbase identified by market and product research: academic biosciences. These respondents would become the key user testers in the usability research segment of this project.
- Goal:
Evaluate demand via unbiased problem-solution and product-market questions. Translate user pain points into dollar amounts to gauge whether they represent a lucrative entry point. Provide adequate information for a deeper drill-down within discovery and evaluative interviews.
- Tools:
Mail Merge, Google Forms, Survey Monkey, Microsoft Forms, Excel, Power BI, etc.
- Methods:
Demographic binning, random sampling; nominal, ordinal, interval, and ratio questions plus open-ended responses; Likert scales, ranking.
- Participants:
Target userbase respondents, Product leadership, UX designer, Subject Matter Expert (SME).
- Results:
The sampled university students predominantly use iPhones and identify student time commitment as a major problem: namely training, scheduling, and juggling research with coursework. Other personas, like graduate and post-doctoral students, are more closely attached to projects and lament the high student churn rate: institutional knowledge falls through the cracks, leaving research project leadership demotivated to train students in assay methodology. All respondents indicated a strong need for time-saving, tracking, and management solutions alongside safety. Saving money was not a concern, but all respondents indicated willingness to pay an average of $100 per annum for a solution. Research supports high usage if the university were to pay for a software license.
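The synthesis above can be sketched as a small analysis script: bin respondents by role, summarize a Likert item per bin, and average stated willingness to pay. All field names and sample values below are illustrative placeholders, not the actual survey data.

```python
# Hypothetical survey-synthesis sketch: demographic binning plus Likert and
# willingness-to-pay summaries. Sample records are invented for illustration.
from collections import defaultdict
from statistics import mean

responses = [
    {"role": "undergraduate", "time_pressure_likert": 5, "wtp_usd": 80},
    {"role": "undergraduate", "time_pressure_likert": 4, "wtp_usd": 100},
    {"role": "graduate",      "time_pressure_likert": 5, "wtp_usd": 120},
    {"role": "postdoc",       "time_pressure_likert": 4, "wtp_usd": 100},
]

# Demographic binning: group responses by persona/role.
bins = defaultdict(list)
for r in responses:
    bins[r["role"]].append(r)

# Summarize the 1-5 Likert item per bin.
for role, group in bins.items():
    likert_avg = mean(r["time_pressure_likert"] for r in group)
    print(f"{role}: avg time-pressure score {likert_avg:.1f} (n={len(group)})")

# Overall willingness to pay across all respondents.
overall_wtp = mean(r["wtp_usd"] for r in responses)
print(f"Average willingness to pay: ${overall_wtp:.0f}/year")
```

In practice the same binning and averaging would run over the exported Google Forms/SurveyMonkey data rather than inline records.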
Skip to Step 6: User Testing
- Why?:
After the generative research steps 3-5 covered in the InAct Case Study expanded the problem space, participants were incentivized to user test the solution space from low to high fidelity to save time and cost. The Rapid Iterative Testing and Evaluation (RITE) method was applied in collaboration with the UX Designer to rapidly cycle feedback. User testing consisted of 30-60 minute moderated, remote sessions, where task completion mapped to the discovery research findings quickly surfaced design feedback in four color-coded categories: use cases/yellow, positive/green, negative/red, desires/blue. Negative feedback and desires directly influenced the next user testing round.
- Goal:
Accurately and promptly provide UX designers, Product Leadership, and the Engineering team an evidence-based product iteration bullet train to glean the most user-identified value from the least resources.
- Tools:
Figma, Adobe XD, Paper, or other low-to-high fidelity mockup or prototype, Miro, Figjam, Zoom, Teams
- Methods:
RITE method, task analysis, ranking/card sorting (e.g. “rank the menus”), co-creation (e.g. “if you could redesign this...”), contextual inquiry (e.g. “how applicable might this be, or not?”), objectivity (“You can’t hurt my feelings. I didn’t design this.”), etc.
- Participants:
User Testing Participants; undercover “note-takers”: UX Designer, Product Manager, Engineer.
- Results:
After several weeks, rapid feedback informed minor and major changes toward a patent-pending prototype that averaged little to no negative feedback for an MVP. Feedback also informed feature prioritization.
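The color-coded feedback log behind these RITE rounds can be sketched as a simple data structure: tag each note with one of the four buckets, then pull the negatives and desires forward into the next round. The category names mirror the colors described above; the sample feedback items are hypothetical.

```python
# Minimal sketch of a color-coded RITE feedback log. The four categories
# match the buckets used in sessions; the notes themselves are invented.
from dataclasses import dataclass
from enum import Enum
from collections import Counter

class Category(Enum):
    USE_CASE = "yellow"
    POSITIVE = "green"
    NEGATIVE = "red"
    DESIRE = "blue"

@dataclass
class Feedback:
    session: int
    note: str
    category: Category

log = [
    Feedback(1, "Used the timer while running an assay", Category.USE_CASE),
    Feedback(1, "Step checklist is easy to follow", Category.POSITIVE),
    Feedback(1, "Couldn't find the protocol search", Category.NEGATIVE),
    Feedback(1, "Wants offline access in the lab", Category.DESIRE),
]

# Negative feedback and desires drive the next testing round.
next_round = [f for f in log if f.category in (Category.NEGATIVE, Category.DESIRE)]
counts = Counter(f.category.value for f in log)
print(counts)           # tally per color
print(len(next_round))  # items to address before round 2
```

Keeping the log structured this way makes each round's "what changed and why" auditable when designs iterate quickly.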
Pivot Back to Step 3:
New End User Discovery

- Why?:
Despite successful user testing, concurrent market and discovery research identified a larger $1.6T pharmaceutical quality control market with similar pain points but a potentially lower cost of entry. None of the academia-based research went to waste, but the project refocused to prioritize empathy and understanding of the processes behind Quality Control in the pharmaceutical manufacturing environment. Exploratory research steps 1-2, covered in the InAct Case Study, were followed, leading to a synthesis of proprietary interviews with pharma-based B2B participants, depicted in the Journey Map below.
- Goal:
Succinctly yet accurately depict and simplify complex user processes, pain points, and insights for product, service, and company leadership. Also: identify key user values for market entry and inform an MVP.
- Tools:
Miro, Lucidspark, Figjam, or other whiteboard software. PowerPoint for presentations, including insights from Pendo, WalkMe, Google Analytics, or another analytics tool.
- Methods:
RITE method, task analysis, ranking/card sorting (e.g. “rank the menus”), co-creation (e.g. “if you could redesign this...”), contextual inquiry (e.g. “how applicable might this be, or not?”), objectivity (“You can’t hurt my feelings. I didn’t design this.”), etc.
- Participants:
User Testing Participants; undercover “note-takers”: UX Designer, Product Manager, Engineer.
- Results:
After several weeks, rapid feedback informed minor and major changes toward a patent-pending prototype that averaged little to no negative feedback for an MVP. Feedback also informed feature prioritization.
Step 4: Discovery Action Plan

- Why?:
Discovery syntheses are further broken down into a shared action plan, translating findings and discussion points into fodder for design ideation. Without a discovery action plan to reference, too much information may overwhelm or demotivate stakeholders. In this special case, specific artifacts from research sessions helped build empathy toward a filmed service offering that would strategically scale into the earlier-tested mobile software.
- Goal:
Distill, without misrepresenting or concealing, all syntheses into a short document stakeholders can easily reference to commence work within the solution space.
- Questions Answered:
What do SOPs currently look like?
What does the current QC work environment look like?
What does QC management fear most?
What are the core communicative pain points we need to address between Assay Development and QC labs?
- Tools:
Miro, Lucidspark, Figjam, or other whiteboard software. Teams or Slack for sharing the plan.
- Methods:
Findings/project-dependent: the KISS method, opt-in complexity, HMW ideation, Co-Creation, Contextual Inquiry, Fear Mapping.
- Participants:
All product stakeholders: Product Leadership - Managers and Owners, Engineering/Dev Team, QA Lead, UX Designer, UX Manager, General Managers, Solution Consultants.
- Results:
Product and Sales Leadership relied on the below artifacts to create a profitable and scalable lightweight Assay visualization service offering.
Step 5: Landing Page Ideation
- Why?:
Discovery syntheses are further broken down into a shared action plan, translating findings and discussion points into fodder for design ideation. Without a discovery action plan to reference, too much information may overwhelm or demotivate stakeholders. In this special case, specific artifacts from research sessions helped build empathy toward a filmed service offering that would strategically scale into the earlier-tested mobile software.
- Goal:
Distill, without misrepresenting or concealing, all syntheses into a short document stakeholders can easily reference to commence work within the solution space.
- Tools:
Miro, Lucidspark, Figjam, or other whiteboard software. Teams or Slack for sharing the plan.
- Methods:
Findings/project-dependent: the KISS method, opt-in complexity, HMW ideation, Co-Creation, Contextual Inquiry, Fear Mapping.
- Participants:
All product stakeholders: Product Leadership - Managers and Owners, Engineering/Dev Team, QA Lead, UX Designer, UX Manager, General Managers, Solution Consultants.
- Results:
Product and Sales Leadership relied on the below artifacts to create a profitable and scalable lightweight Assay visualization service offering.

Step 6: User Testing
- Why?:
In a rapid two-week timeline, users who previously participated in discovery research were re-recruited for rapid-fire user testing of the novel landing page. The Rapid Iterative Testing and Evaluation (RITE) method was again used in collaboration with the UX Designer to quickly cycle feedback, starting with the landing page design from the ideation session. User testing consisted of 30-60 minute moderated, remote sessions, where different prospective customer/user scenarios were presented, with findings quickly identifying design feedback in four color-coded categories: use cases/yellow, positive/green, negative/red, desires/blue. Negative feedback and desires directly influenced the next user testing round depicted in the user feedback map.
- Goal:
Accurately and promptly provide UX designers, Product and Marketing Leadership, and the Engineering team an evidence-based product iteration bullet train to glean the most user-identified value from the least resources.
- Tools:
Figma, Adobe XD, Paper, or other low-to-high fidelity mockup or prototype, Miro, Figjam, Zoom, Teams
- Methods:
RITE method, scenarios, co-creation (e.g. “if you could redesign this...”), contextual inquiry (e.g. “how applicable might this be, or not?”), objectivity (“You can’t hurt my feelings. I didn’t design this.”), etc.
- Participants:
User Testing Participants; undercover “note-takers”: UX Designer, Product Manager, Marketing Solution Consultant, Engineer.
- Results:
After only two cost- and time-effective weeks, rapid feedback informed design handoff toward a live, lightweight website marketing a service.
Step 7: Evaluative Synthesis
- Why?:
Given the rapid nature of this evaluative research loop, this step occurred concurrently with Step 6 - represented by the below Kanban board. Every few days, newly synthesized findings - following the same color-coded feedback - would be pasted into the “To Discuss” bucket, awaiting the purview of marketing, product, and UX design stakeholders. This Kanban board acted as a living document: every piece of feedback had to be implemented, deferred (until further testing or for feasibility reasons), or otherwise fully addressed. At the end of user testing, this proprietary document allows for insight into aspects of the landing page that were shelved.
- Goal:
Accurately and promptly provide UX designers, Product and Marketing Leadership, and the Engineering team an evidence-based live summary of product iteration.
- Tools:
Miro, Figjam, Zoom, Teams, Projects, Kanban software.
- Methods:
Kanban, Agile, standup philosophies (not strictly followed).
- Participants:
Core stakeholders: UX Designer, Product Manager, Marketing Solution Consultant, Engineer.
- Results:
After only two cost- and time-effective weeks, rapid feedback informed design handoff toward a live, lightweight website marketing a service.
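The "living document" rule described in Step 7 - every feedback item must end up implemented, deferred, or otherwise addressed - can be sketched as a simple triage check over the board. Item names and states below are illustrative, not the actual Kanban contents.

```python
# Sketch of the Kanban triage rule: every feedback item must reach one of
# three terminal states before a testing round closes. Items are invented.
TERMINAL = {"implemented", "deferred", "addressed"}

board = {
    "Hero tagline unclear":          "implemented",
    "Add pricing calculator":        "deferred",    # shelved for feasibility
    "Demo video autoplay annoying":  "addressed",
    "Contact form works well":       "to discuss",  # not yet triaged
}

# Anything left outside a terminal state blocks closing out the round.
unresolved = [item for item, state in board.items() if state not in TERMINAL]
print(unresolved)
```

The "deferred" bucket is what later makes the shelved landing-page aspects recoverable, since each deferral stays on the board with its rationale.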
Step 8: Feature Prioritization
and Product Strategy Action Plan

- Why?:
Combining user testing results from the shelved mobile user testing and the product landing page, a prioritization workshop was necessary to close in on the solution space and begin MVP deployment. Helping the UX Designer inform handoff, I crafted proprietary action plans that product and executive leadership adopted as a 5-year roadmap. This step ensured that none of the shelved research was forgotten; each item received an associated timestamp to support a scalable and profitable go-to-market product strategy. This long-term value of research would prove more lucrative than the short-term insights, also motivating the design team to keep iterating.
- Goal:
Apply all research findings towards a prioritized product/solution roadmap - involving input from Engineering, UX, Product, and Innovation Leadership.
- Tools:
Miro, Figjam, Zoom, Teams, Sharepoint.
- Methods:
UX Research Workshops, Brainstorming, Whiteboarding, Process Mapping.
- Participants:
Core stakeholders and upper management: UX, Product, Engineering, Marketing.
- Results:
No research or design efforts were wasted; they were reapplied toward a focused product strategy that had an impact on the entire company and afforded UX a “seat at the table”.
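A prioritization workshop like the one above typically reduces to a scoring pass over candidate features. The case study does not specify a scoring model, so the RICE formula (reach x impact x confidence / effort) used here is an assumption, and the feature names and numbers are illustrative only.

```python
# Hypothetical RICE-style scoring sketch for the prioritization workshop.
# RICE is an assumed model (not stated in the case study); all features,
# reach/impact/confidence/effort values are invented for illustration.
features = [
    {"name": "Assay visualization service", "reach": 8, "impact": 3, "confidence": 0.9, "effort": 2},
    {"name": "Mobile protocol tracker",     "reach": 6, "impact": 2, "confidence": 0.7, "effort": 5},
    {"name": "Offline lab mode",            "reach": 4, "impact": 2, "confidence": 0.5, "effort": 3},
]

# Score = reach * impact * confidence / effort; higher lands earlier on the roadmap.
for f in features:
    f["rice"] = f["reach"] * f["impact"] * f["confidence"] / f["effort"]

roadmap = sorted(features, key=lambda f: f["rice"], reverse=True)
for f in roadmap:
    print(f"{f['name']}: {f['rice']:.1f}")
```

Whatever the actual model, scoring each shelved item and timestamping it is what lets a 5-year roadmap absorb research without losing any of it.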
Step 9: Readout and Handoff

- Why?:
Handoff to the engineering team can be costly if they fail to understand the careful methodologies behind each design decision. Careful communication of the user-evidenced design rationale avoids mishaps in handoff that could later result in engineering corrections or a botched product release. Ensuring a smooth handoff isn't merely sharing design files, but also sharing research artifacts, summaries, and synthesis documents to establish the "why?" behind every piece being developed.
- Goal:
Apply all research findings towards a prioritized product/solution roadmap - involving input from Engineering, UX, Product, and Innovation Leadership.
- Tools:
Miro, Figjam, Zoom, Teams, Sharepoint.
- Methods:
UX Research Workshops, PowerPoint.
- Participants:
Core stakeholders and upper management: UX, Product, Engineering, Marketing.
- Results:
Immediately, the landing page was engineered and published as designed, thanks to looping Engineering into the research process. Marketing teams also felt more confident in their role as solution consultants - selling value rather than price tags. Overall, internal silos were broken, all in favor of a successful service launch and a market roadmap toward the already-tested mobile software.