UX ENGINEERING

UI / UX — Conversion-Focused Design

Measurable product design that unites research, information architecture, design system, prototyping, usability testing, accessibility and dev handoff into a single discipline.

UX is no longer 'a pretty screen'; it's the engineering that turns research + IA + design system + prototype + testing + accessibility + performance + handoff into one operation.

User behavior has changed: decision times are shorter, patience thresholds are lower, accessibility is now mandatory, and AI interfaces have outgrown static wireframe logic. Most teams are still stuck in the habit of 'opening Figma without research' and 'shipping one-off screens instead of a design system'. The Roibase UX operation is built on six principles; each has a written metric tree to measure, report and iteratively improve.

Roibase perspective

METHODOLOGY

A 6-layer UX engineering operation

Discover → Define → System → Design → Validate → Handoff. Each layer is delivered with a deliverable + owner + SLA.

01

DISCOVER

Business goal, stakeholder interviews, 5-8 in-depth user interviews, competitive audit, analytics + session replay review.

02

DEFINE

Persona + JTBD, problem statement, IA card-sort / tree-test, content audit, user flow + wireframe map.

03

SYSTEM

Design token library, component inventory, pattern atlas, motion + a11y principles; Figma Variables → code pipeline.

04

DESIGN

High-fidelity screens + interactive prototype + micro-interaction spec; responsive breakpoints + dark mode.

05

VALIDATE

Usability test (moderated + remote), accessibility audit, performance budget review, stakeholder review + iteration.

06

HANDOFF

Storybook + MDX docs, props/variant matrix, design QA pipeline, dev training sessions, 90 days of support.

— COMPARISON

Freelance UX vs. vanilla agency vs. Roibase UX engineering

The difference across three approaches in research, system, accessibility, dev handoff and ROI.

Dimension | Freelance UX | Vanilla agency | Roibase UX engineering
Research | None or shallow | 1-2 interviews | 5-8 interviews + analytics + JTBD
Design system | Ad-hoc screens | Figma style library | Token-first + component + pattern atlas
Usability testing | Usually none | Single round | Moderated + remote + SUS/SEQ report
Accessibility | Mostly ignored | Contrast check | WCAG 2.2 AA + AAA target + axe-core
Dev handoff | Figma link | Inspect + style guide | Storybook + MDX + props/variant matrix
CRO integration | None | None | A/B-ready variant system at design time
Performance | LCP ignored | Rarely checked | Font/motion/image budget upfront
12-month TCO | Low entry, per-screen recurring fees | Medium but fragmented | Medium + live system + process discipline

PROOF

Outcomes, measured

+38%
Task completion

6-month average after user testing; SUS 72 → 84.

+27%
Onboarding conversion

90 days after IA + onboarding redesign.

-42%
Support tickets

Impact of form + error state + empty state redesign.

98%
WCAG 2.2 AA compliance

axe-core pass rate across all delivered screens.

< 1.5s
LCP baseline

Performance budget set at design stage.

12
Token categories

Color, type, spacing, radius, shadow, motion, z-index, breakpoint, opacity, blur, transition, duration.

WHAT WE DO

Engagement scope

Every offering is an outcome-based work package. Roibase blends strategy and execution inside a single team — no hand-offs.

01 / 10

Discovery + user research

Stakeholder interviews, 5-8 in-depth user interviews, competitive audit + jobs-to-be-done mapping; the problem space gets clear first.

02 / 10

Information architecture

Navigation users genuinely understand, validated with card sort + tree test + first-click test; taxonomy built from the outside in, not the inside out.

03 / 10

Design system (token-first)

Tokens Studio + Style Dictionary → Figma Variables → code; color, typography, spacing, motion, radius, shadow and state from a single source.
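To make the pipeline concrete, here is a minimal sketch of the token-to-CSS step. In the real operation Tokens Studio and Style Dictionary handle this; the token names and values below are illustrative only.

```typescript
// Illustrative token → CSS custom properties transform. In production
// Style Dictionary does this; the tokens here are made-up examples.

type Token = { value: string | number; type: string };
type TokenTree = { [key: string]: Token | TokenTree };

const tokens: TokenTree = {
  color: {
    brand: { primary: { value: "#0f62fe", type: "color" } },
  },
  spacing: {
    md: { value: 16, type: "dimension" },
  },
};

function isToken(node: Token | TokenTree): node is Token {
  return typeof (node as Token).value !== "undefined" &&
    typeof (node as Token).type === "string";
}

// Flatten the tree into CSS variables, e.g. --color-brand-primary.
function toCssVariables(tree: TokenTree, path: string[] = []): string[] {
  const lines: string[] = [];
  for (const [key, node] of Object.entries(tree)) {
    if (isToken(node)) {
      const value = node.type === "dimension" ? `${node.value}px` : String(node.value);
      lines.push(`--${[...path, key].join("-")}: ${value};`);
    } else {
      lines.push(...toCssVariables(node, [...path, key]));
    }
  }
  return lines;
}

console.log(toCssVariables(tokens).join("\n"));
// --color-brand-primary: #0f62fe;
// --spacing-md: 16px;
```

The point of the single source: change one token value and every platform output (CSS, iOS, Android) regenerates from it.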

04 / 10

Component library + pattern atlas

Button, form, nav, card, table, modal, toast, empty state; props + variant + a11y state + responsive breakpoint map.

05 / 10

High-fidelity prototype

Figma interactive prototype + micro-interaction spec; motion prototyping with Rive/Lottie; tech brief for Lenis/Framer Motion.

06 / 10

Usability testing operation

Moderated 5-user series + remote (Maze/Useberry/UserTesting); SUS, SEQ, task completion, time-on-task metrics.
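The SUS number reported above is computed with a fixed formula, sketched here so the metric is auditable rather than a black box.

```typescript
// System Usability Scale scoring: 10 responses on a 1-5 Likert scale.
// Odd-numbered items are positively worded (contribution = response - 1),
// even-numbered items negatively worded (contribution = 5 - response);
// the summed contributions are multiplied by 2.5 to give a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS needs exactly 10 responses");
  const sum = responses.reduce((acc, r, i) => {
    if (r < 1 || r > 5) throw new Error("Responses must be between 1 and 5");
    return acc + (i % 2 === 0 ? r - 1 : 5 - r); // i = 0 is item 1 (odd)
  }, 0);
  return sum * 2.5;
}

// Best possible answers on every item → a perfect score.
console.log(susScore([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])); // 100
```

Scores above 68 are above average; above 80 is the excellence threshold (see the glossary below).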

07 / 10

Accessibility (WCAG 2.2) audit

axe-core + manual contrast + keyboard + screen reader (NVDA/JAWS/VoiceOver) audit; AA baseline, AAA target; KVKK/EAA compliance.
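The "manual contrast" part of the audit follows the WCAG 2.x formula; a self-contained sketch:

```typescript
// WCAG 2.x contrast ratio between two sRGB colors given as hex strings.
// Relative luminance: linearize each channel, then apply WCAG weights.
function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const channels = [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * channels[0] + 0.7152 * channels[1] + 0.0722 * channels[2];
}

// Contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05); range 1-21.
// AA requires at least 4.5:1 for normal text, 3:1 for large text.
function contrastRatio(fg: string, bg: string): number {
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

console.log(contrastRatio("#000000", "#ffffff")); // 21
```

axe-core automates this check across rendered pages; keyboard and screen-reader passes still need a human.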

08 / 10

Design QA + visual regression

Chromatic / Percy / Playwright visual regression; Figma-to-code pixel parity + component anatomy QA.

09 / 10

CRO integration

A/B-ready variant system: 2-3 hypothesis variants per component prepared at design stage; export for VWO/Optimizely.
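What "A/B-ready at design time" means in practice: each variant already exists as a named component, and assignment is deterministic per user. The variant names and hashing below are illustrative; in production the assignment comes from VWO/Optimizely.

```typescript
// Illustrative A/B-ready variant registry. Variant names and the hash
// are made up for the example; a real experiment platform (VWO,
// Optimizely) would own the assignment and exposure logging.

type CtaVariant = "control" | "benefit-copy" | "urgency-copy";
const CTA_VARIANTS: CtaVariant[] = ["control", "benefit-copy", "urgency-copy"];

// Deterministic bucketing: the same user always sees the same variant.
function assignVariant(userId: string): CtaVariant {
  let hash = 0;
  for (const ch of userId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return CTA_VARIANTS[hash % CTA_VARIANTS.length];
}

console.log(assignVariant("user-42") === assignVariant("user-42")); // true
```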

10 / 10

Dev handoff + Storybook

Props/variant/state matrix, behavior spec, motion timing, responsive breakpoint; Storybook + MDX docs kept live.
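A sketch of how the props/variant/state matrix maps onto Component Story Format (CSF). The `Button` component and its props are hypothetical; in a real `.stories.ts` file `meta` would be the default export and each story a named export, which Storybook + MDX then keep documented.

```typescript
// Hypothetical Button used to illustrate CSF-style stories; a plain
// string renderer keeps the example self-contained and framework-free.

type ButtonProps = {
  variant: "primary" | "secondary" | "ghost";
  size: "sm" | "md" | "lg";
  disabled?: boolean;
  label: string;
};

const Button = (props: ButtonProps): string =>
  `<button class="btn btn--${props.variant} btn--${props.size}"` +
  `${props.disabled ? " disabled" : ""}>${props.label}</button>`;

// CSF-style metadata Storybook uses to index the stories.
const meta = { title: "Components/Button", component: Button };

// One story per row of the props/variant/state matrix.
const Primary: { args: ButtonProps } = {
  args: { variant: "primary", size: "md", label: "Save" },
};
const Disabled: { args: ButtonProps } = {
  args: { variant: "primary", size: "md", disabled: true, label: "Save" },
};

console.log(meta.title, Button(Primary.args));
```

Because every state is a story, visual regression and accessibility checks can run against the full matrix automatically.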

— BENEFITS

The tangible change in your product experience

When research + system + testing + accessibility + handoff integrate, ROI is no longer tied to a single metric.

+38% task

User friction drops

Task completion rises, support tickets fall; the product becomes 'understandable' and onboarding drop-off shrinks.

+27% conv

Conversion grows

With an A/B-ready variant system, CRO discipline begins at design time; funnel drop-offs are fixed before they ship.

-45% dev

Dev velocity increases

Token-first + Storybook + MDX handoff cuts per-screen build time noticeably and reduces defects.

98% AA

Accessibility legal risk drops

WCAG 2.2 AA + EAA (European Accessibility Act 2025) compliance; when the audit comes, the evidence is ready.

1 system

Brand consistency rises

A design system delivers the same identity + behavior across every channel (web, app, email, social, sales deck); brand trust compounds.

4x speed

Scale gets easier

Adding a new product / locale / market is measured in days, not weeks, thanks to the component library.

DELIVERABLES

What you get in every UX engagement

A fixed deliverable list for the setup + 6-month operation package; no surprise fees.

  • UX diagnostic report

    Analytics + session replay + heuristic audit + competitive benchmark; 40-60 pages.

  • User research findings

    Transcripts + thematic coding from 5-8 in-depth interviews, persona + JTBD map.

  • Information architecture set

    Card-sort / tree-test results, sitemap, user flow, wireframe book.

  • Design token library

    Color, type, spacing, motion, shadow, radius, z-index; Tokens Studio + Style Dictionary + code export.

  • Component library (Figma + Storybook)

    Button, form, nav, card, table, modal, toast, empty state + dark mode + responsive breakpoint + a11y state.

  • Pattern atlas + motion spec

    Onboarding, search, filter, checkout, dashboard patterns + Rive/Lottie motion spec files.

  • High-fidelity prototype

    Interactive Figma prototype + micro-interactions; recording + PDF version for stakeholder review.

  • Usability test report

    5-8 moderated + 15+ remote participants, SUS/SEQ/task completion, insights + prioritized action list.

  • Accessibility audit + remediation

WCAG 2.2 AA/AAA audit, axe-core + manual testing, remediation plan, accessibility conformance report.

  • Performance budget

    LCP/INP/CLS targets, font + image + motion budget, render strategy + monitoring plan.

  • Dev handoff package

    Storybook + MDX docs, props/variant/state matrix, responsive breakpoints, QA checklist.

  • Runbook + training + 3 months of support

    Design system runbook, token update process, Figma + code training sessions, 90 days of support.
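Among the deliverables above, the performance budget is a checkable artifact, not a slide. A minimal sketch of the gate it implies; the budget numbers mirror the targets stated in this document, and the measured values would come from Lighthouse/CrUX in a real pipeline.

```typescript
// Minimal performance-budget gate against Core Web Vitals targets.
// Budget values follow the targets in this document (LCP < 1.5s);
// `measured` would come from Lighthouse or field data in practice.

type Metrics = { lcpMs: number; inpMs: number; cls: number };

const BUDGET: Metrics = { lcpMs: 1500, inpMs: 200, cls: 0.1 };

// Returns a list of human-readable budget violations; empty = pass.
function overBudget(measured: Metrics, budget: Metrics = BUDGET): string[] {
  const failures: string[] = [];
  if (measured.lcpMs > budget.lcpMs) failures.push(`LCP ${measured.lcpMs}ms > ${budget.lcpMs}ms`);
  if (measured.inpMs > budget.inpMs) failures.push(`INP ${measured.inpMs}ms > ${budget.inpMs}ms`);
  if (measured.cls > budget.cls) failures.push(`CLS ${measured.cls} > ${budget.cls}`);
  return failures;
}

console.log(overBudget({ lcpMs: 2100, inpMs: 180, cls: 0.04 }));
// → ["LCP 2100ms > 1500ms"]
```

Wired into CI, a non-empty result blocks the merge, which is what makes the budget enforceable rather than aspirational.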

— SCOPE

What we do, what we don't

UX engineering scope is written down; it prevents surprises and after-the-fact invoices.

We do

  • User research (interviews + JTBD + analytics)
  • Information architecture (card sort + tree test)
  • Design token + component + pattern library
  • Figma Variables → Style Dictionary → code pipeline
  • High-fidelity prototype + micro-interaction spec
  • Usability testing (moderated + remote)
  • WCAG 2.2 AA/AAA accessibility audit + remediation
  • Performance budget + LCP/INP/CLS targets
  • Design QA + visual regression (Chromatic/Percy)
  • Storybook + MDX dev handoff + training
  • CRO variant system (A/B-ready design)
  • Runbook + 90 days of support

We don't

  • Front-end application development (delivered with partner teams)
  • Back-end / DB design (stays with the software team)
  • Brand strategy / logo (branding team / separate engagement)
  • Copywriting / content production (we do review UX copy)
  • Icon / illustration production (licensed libraries + partners)
  • Photography / video production
  • Long-term monthly screen production (the system is built and internalized by the team)
  • User research panels or incentive payment management (partner lab)

HOW WE WORK

A 12-16 week setup, followed by monthly system operation

01

Weeks 1-2: discovery + stakeholder kickoff

Business goal, stakeholder interviews, analytics/session replay review, locking scope and target metrics.

02

Weeks 3-4: user research + JTBD

5-8 in-depth interviews, transcript + thematic coding, persona + jobs-to-be-done map.

03

Weeks 5-6: IA + content audit

Card sort + tree test, user flow, wireframe map, content audit + copy strategy.

04

Weeks 7-8: design system + tokens + components

Token library, component inventory + pattern atlas, Figma Variables + Style Dictionary export.

05

Weeks 9-10: high-fidelity design + prototype

High-fi design of core screens + interactive prototype, micro-interactions + motion spec.

06

Weeks 11-12: usability test + a11y audit

Moderated + remote user testing, WCAG 2.2 audit + remediation, performance budget check.

07

Weeks 13-14: iteration + stakeholder sign-off

Iteration on test findings, final stakeholder review, Storybook + dev handoff preparation.

08

Week 15+: handoff + dev pipeline + support

Storybook + MDX docs delivered, dev training sessions, design QA pipeline, CRO variant system + 90 days of support.

— TOOLKIT

Our tooling — vendor-agnostic but deliberate choices

We pick what fits each client; we protect our independence by refusing commissions.

DESIGN & PROTOTYPING

Figma (variables + dev mode) · FigJam / Miro · Tokens Studio · Style Dictionary · Principle / Rive · Lottie / Framer Motion

RESEARCH & TESTING

Maze · Useberry · UserTesting · Lookback · Dovetail (insight repo) · Optimal Workshop

ACCESSIBILITY & QA

axe-core / Deque · Stark (Figma) · NVDA / JAWS / VoiceOver · Chromatic / Percy · Playwright visual regression · Lighthouse + WebPageTest

HANDOFF & ANALYTICS

Storybook + MDX · Zeroheight / Supernova · Hotjar / FullStory (session replay) · Amplitude / Mixpanel · GA4 + funnels · VWO / Optimizely (A/B)

QUESTIONS

Frequently asked

How long does an engagement take?

6-8 weeks for a single flow (onboarding, checkout, dashboard); 12-16 weeks for a full product redesign + design system. Research, design, testing and handoff are all included.

— GLOSSARY

Core concepts of UX engineering

A shared language for your design, development and product teams.

01
Jobs-to-be-Done (JTBD)
A framework for defining the concrete job a user is trying to get done with a product; behavior- and context-focused, instead of personas.
Persona · User Research · Outcome · Discovery
02
Information Architecture (IA)
Organizing a product's content and functions to match users' mental models; validated with card sort + tree test.
Card Sort · Tree Test · Navigation · Taxonomy
03
Design Token
Expressing design decisions like color, typography, spacing and motion as platform-agnostic variables; distributed from a single source via Figma Variables + Style Dictionary + code pipeline.
Style Dictionary · Figma Variables · Design System · Theming
04
Design System
A single source of truth for design and development, combining token + component + pattern + guideline + governance layers.
Token · Component Library · Pattern Atlas · Governance
05
WCAG 2.2
Web Content Accessibility Guidelines 2.2; success criteria at A, AA and AAA levels; 9 new criteria (2023) focused on keyboard, focus, drag and target size; AA is the industry baseline.
Accessibility · EAA · axe-core · A11y
06
System Usability Scale (SUS)
A 10-item standard questionnaire that scores interface usability between 0-100; > 68 is above average, > 80 is the excellence threshold.
Usability Testing · SEQ · NPS · Benchmark
07
Single Ease Question (SEQ)
A single question measuring perceived task difficulty on a 1-7 scale right after a task; a task-level complement to SUS.
SUS · Task Completion · Usability Testing · Effort
08
Largest Contentful Paint (LCP)
The render time of the largest content element visible in the viewport (Core Web Vitals); < 2.5s is good. Controlled at the design stage with font + image + render budgets.
INP · CLS · Core Web Vitals · Performance Budget
09
Interaction to Next Paint (INP)
The worst-case latency from a user interaction to the next painted frame, observed across the page's lifetime (Core Web Vitals); < 200ms is good. The successor to FID as of 2024.
LCP · CLS · Web Vitals · Responsiveness
10
Storybook
A tool for developing, documenting and QA'ing components in isolation; with MDX, design-dev handoff documentation stays live.
Component Library · MDX · Handoff · Visual Regression
11
Visual Regression Testing
Automatically verifying with Chromatic / Percy / Playwright that UI components look the same after code changes; prevents design drift.
Chromatic · Percy · Playwright · Design QA
12
European Accessibility Act (EAA)
The EU accessibility directive (2019/882) whose obligations apply from 28 June 2025; in practice it requires WCAG AA-level accessibility (via EN 301 549) for e-commerce, banking, transport and telecom.
WCAG 2.2 · Accessibility · Compliance · EN 301 549
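The visual-regression idea from the glossary can be illustrated in a few lines. This is a toy model of what Chromatic / Percy / Playwright do under the hood; real tools diff rendered screenshots with perceptual tolerances.

```typescript
// Toy visual regression: compare a baseline and a candidate render
// pixel-by-pixel and report the fraction of pixels that changed.
// Each "image" is a flat array of grayscale values (0-255).
function diffRatio(baseline: number[], candidate: number[], tolerance = 0): number {
  if (baseline.length !== candidate.length) return 1; // size change = full diff
  const changed = baseline.filter((px, i) => Math.abs(px - candidate[i]) > tolerance).length;
  return changed / baseline.length;
}

const baseline = [0, 0, 255, 255];
const candidate = [0, 0, 255, 128]; // one pixel drifted

console.log(diffRatio(baseline, candidate)); // 0.25
```

A CI gate then fails the build when the ratio exceeds a threshold, which is exactly how design drift gets caught before release.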

— DECISION TREE

Is a UX engineering operation right for you?

Answer Yes/No to 4 questions; get a clear result.

01 / 04

Do you have a live digital product (web/app/SaaS), or a launch plan in the next 3-6 months?

The minimum threshold for research + system + testing operations to return ROI.

— LET'S BEGIN

Let's turn design from guesswork into learning.

In a 1-hour UX diagnostic we examine your current flow through analytics + heuristic + a11y lenses, surfacing the riskiest assumptions and quick wins.