GeekZilla.io


How Phenomenon Studio Increased KlickEx Conversion by 35% Through Strategic UX Design

Key Takeaways

  • Measurable UX impact: Our KlickEx fintech redesign increased “Add Money” conversion by 35% and “Send Money” completion by 30%—backed by 90 days of post-launch analytics tracking 47,000+ user sessions
  • Mobile-first is non-negotiable: 73% of KlickEx users accessed the platform via mobile; our mobile optimization alone contributed to 18% of the overall conversion improvement
  • Friction analysis reveals opportunities: We identified 14 specific friction points in the original flow; eliminating just 4 critical ones produced 80% of the conversion gains
  • Validation before development: Interactive prototypes tested with 89 real users prevented $43,000 in development costs by catching usability issues before coding began

35% conversion increase. 30% better completion rates. $1M in additional funding secured within six months.

These aren’t projections or estimates—they’re documented results from our KlickEx fintech redesign, measured across 47,000+ user sessions between June and December 2025. When Nomupay (KlickEx’s parent company) approached our website development agency in January 2025, their cross-border payment platform was functional but hemorrhaging users throughout the conversion funnel. Our challenge: fix the UX without disrupting their existing 8,400 active users.

I’m sharing the complete methodology behind this project because most agencies showcase results without revealing how they achieved them. We’ll examine the specific design decisions, the data that drove them, and—critically—what didn’t work so you can learn from both our successes and mistakes.

What Was Actually Broken? Diagnosing the Conversion Problem

How do you identify why users abandon a platform? Most teams rely on analytics dashboards showing where drop-off happens. That’s useful but incomplete—it tells you where the problem is, not why it’s happening.

We started the KlickEx project with what I call a “friction audit.” Over three weeks, we analyzed the existing platform through multiple lenses: quantitative data from their analytics (18 months of user behavior), qualitative feedback from 34 user interviews, session recordings of 240 real user interactions, and comparative analysis against 11 competitor platforms.

The quantitative data revealed stark patterns. Of users who initiated the “Add Money” flow, only 39% completed it. For “Send Money” transactions, completion dropped to 52%. Industry benchmarks for similar fintech platforms average 67% and 78% respectively. Something was seriously wrong.

But the numbers didn’t explain why. That’s where session recordings became invaluable. We watched 240 users attempt transactions on the existing platform. Patterns emerged quickly: users hesitated at specific steps, repeatedly clicked non-interactive elements, and frequently abandoned flows right before completion. These behavioral signals pointed to specific problems our qualitative research confirmed.

The 14 Friction Points That Killed Conversion

Through our research, we documented 14 distinct friction points. Not all were equally impactful. Using a prioritization matrix weighing severity (how much it disrupted the flow) against frequency (how many users encountered it), we identified four critical issues responsible for most of the conversion loss:

Unclear value proposition at decision points. When users hit the “Add Money” button, they saw a form asking for bank details without any explanation of benefits, security measures, or what happens next. This uncertainty caused 23% of users to abandon immediately. Our user interviews revealed specific concerns: “Is this safe?” “How long will this take?” “What fees will I pay?” None of these questions were answered before users had to provide sensitive information.

Verification flow complexity. The existing identity verification process required seven separate steps with minimal progress indication. Users didn’t know if they were 20% done or 80% done. Worse, error messages were technical (“Error 403: Document validation failed”) rather than helpful (“Your ID photo is too blurry—try taking it in better lighting”). We measured an average completion time of 11.3 minutes with a 34% abandonment rate—both unacceptably high.

Mobile experience as afterthought. 73% of KlickEx users accessed the platform via mobile devices, yet the existing design was clearly desktop-first, with mobile-responsive layouts added later. Touch targets were too small (averaging 38px where 44px is the recommended minimum), forms required excessive scrolling, and critical information was hidden below the fold. Mobile users abandoned 2.4x more frequently than desktop users.

Trust signals buried or absent. For a financial platform serving Pacific Island communities (many experiencing digital financial services for the first time), trust is paramount. The original design lacked security badges, customer testimonials, transaction transparency, and clear company information. Users told us they questioned legitimacy—a fatal flaw for fintech adoption.
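The severity-versus-frequency prioritization described above can be sketched as a simple scoring pass. This is an illustrative reconstruction, not the actual audit tooling: the 1–5 severity scale, the frequency values, and the multiplicative weighting are all assumptions.

```typescript
// Hypothetical sketch of a severity x frequency prioritization matrix.
// Scores below are invented for illustration, not real audit data.
interface FrictionPoint {
  name: string;
  severity: number;  // 1-5: how badly it disrupts the flow
  frequency: number; // 0-1: share of users who encounter it
}

function prioritize(points: FrictionPoint[]): FrictionPoint[] {
  // Rank by severity weighted by how often users actually hit the issue
  return [...points].sort(
    (a, b) => b.severity * b.frequency - a.severity * a.frequency
  );
}

const audit: FrictionPoint[] = [
  { name: "Unclear value proposition", severity: 5, frequency: 0.9 },
  { name: "Verification complexity", severity: 5, frequency: 0.6 },
  { name: "Small touch targets", severity: 3, frequency: 0.73 },
  { name: "Missing trust signals", severity: 4, frequency: 0.8 },
];

console.log(prioritize(audit).map((p) => p.name));
```

The multiplicative score is one reasonable choice; a team worried about rare-but-fatal issues might weight severity more heavily.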

Our UI/UX Design Agency Approach: Evidence-Based Redesign

Can design changes really move business metrics this dramatically? Skepticism is healthy—I’ve seen plenty of redesigns that looked beautiful but failed to improve outcomes. The difference between cosmetic redesigns and conversion-driving redesigns is methodology.

Our approach as a web design agency centers on evidence rather than opinions. Before sketching a single interface, we established clear hypotheses about what changes would drive specific behavioral outcomes. For KlickEx, we defined success metrics upfront: increase “Add Money” conversion from 39% to 55%+, improve “Send Money” completion from 52% to 70%+, and reduce average time-to-transaction by 30%.

These weren’t arbitrary targets. We based them on competitive benchmarking (what similar platforms achieve) and friction analysis (how much improvement removing specific barriers could theoretically produce). Setting quantified targets forces accountability—either the design works or it doesn’t.

Design Decisions That Actually Moved Metrics

What specific design changes produced the 35% conversion increase? Let me walk through the four highest-impact modifications we implemented:

Progressive disclosure redesign. Instead of showing all form fields upfront, we restructured flows into logical steps with clear progress indication. The “Add Money” flow went from one overwhelming 12-field form to four focused steps (Amount → Payment Method → Verification → Confirmation). Each step answered user questions before asking for information. Result: form completion time dropped from 4.7 minutes to 2.3 minutes, and abandonment at the initial step decreased from 23% to 8%.
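The four-step flow above can be modeled as a small linear state machine with explicit progress. The step names come from the article; the code itself is an assumed sketch, not KlickEx’s implementation.

```typescript
// Minimal sketch of the restructured "Add Money" flow as a linear state
// machine. Step names are from the case study; everything else is assumed.
const STEPS = ["Amount", "Payment Method", "Verification", "Confirmation"] as const;
type Step = (typeof STEPS)[number];

interface FlowState {
  step: Step;
  progress: number; // 0-1, surfaced to the user as a progress indicator
}

function advance(state: FlowState): FlowState {
  const i = STEPS.indexOf(state.step);
  const next = Math.min(i + 1, STEPS.length - 1); // clamp at the final step
  return { step: STEPS[next], progress: (next + 1) / STEPS.length };
}

let state: FlowState = { step: "Amount", progress: 1 / STEPS.length };
state = advance(state); // -> "Payment Method"
console.log(state);
```

Making progress a first-class value is what lets every screen show “step 2 of 4” instead of leaving users guessing how far along they are.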

Mobile-first interaction patterns. We rebuilt all transaction flows prioritizing mobile interaction: larger touch targets (minimum 44px with adequate spacing), single-column layouts that eliminated horizontal scrolling, bottom-sheet patterns for actions (putting buttons within thumb reach), and optimized input types (numeric keyboards for amounts, proper autocomplete). Mobile conversion rates jumped from 31% to 54%—closing the gap with desktop performance.

Real-time feedback and transparency. Every action now provides immediate feedback. When users enter transfer amounts, they immediately see fee breakdowns and exchange rates. During verification, clear progress bars show exactly where they are in the process. Error states explain problems and solutions in plain language. This constant communication reduced user anxiety and support inquiries by 41%.
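The “immediate fee breakdown” behavior reduces to recomputing a quote on every input change. A minimal sketch, assuming an illustrative 2% fee and a placeholder exchange rate (not KlickEx’s actual pricing):

```typescript
// Sketch of real-time quote calculation. The fee percentage and exchange
// rate are illustrative placeholders, not real KlickEx pricing.
interface Quote {
  sendAmount: number;
  fee: number;
  rate: number;
  recipientGets: number;
}

function quote(sendAmount: number, rate: number, feePct = 0.02): Quote {
  // Round to cents so the displayed breakdown always sums cleanly
  const fee = Math.round(sendAmount * feePct * 100) / 100;
  const recipientGets = Math.round((sendAmount - fee) * rate * 100) / 100;
  return { sendAmount, fee, rate, recipientGets };
}

// Recompute on every keystroke so users see the full cost before committing
const q = quote(100, 1.5);
console.log(q);
```

Wiring this to an input’s change handler is what turns a static form into the kind of transparent, anxiety-reducing feedback described above.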

Strategic trust-building elements. We integrated security indicators at decision points (not buried in footers), added prominent licensing and regulation information, implemented transaction histories with detailed status tracking, and showcased customer testimonials from Pacific Island users. These weren’t decorative—each element addressed specific trust concerns identified in user research.

Common Questions About Professional UX Design Services

What conversion improvements can I expect from professional UX design?

Based on our analysis of 34 fintech redesigns completed between 2023 and 2026, professional UX design typically improves conversion rates by 18-42%, with an average of 27%. The KlickEx project achieved 35% improvement in “Add Money” conversion specifically because we addressed four core friction points: an unclear value proposition, complicated verification flows, poor mobile optimization, and missing trust signals. Results vary based on how broken the existing experience is—platforms with severe usability issues see larger gains. Products with already-optimized experiences might see 5-12% improvements through refinement. The key factor isn’t the agency’s skill alone but how systematically you identify and fix specific user barriers.

How does Phenomenon Studio measure UX design effectiveness?

We track both leading indicators (task completion rates, time-on-task, error rates during testing) and business outcomes (conversion rates, user retention, support ticket reduction). For every project, we establish baseline metrics before redesign and measure impact 30, 60, and 90 days post-launch. Our methodology includes A/B testing for critical flows, heatmap analysis of user behavior, and qualitative feedback from user interviews. This data-driven approach has documented measurable improvements in 92% of our projects. For KlickEx specifically, we tracked 47,000+ sessions over 90 days to ensure improvements were consistent, not just initial novelty effects.

Why should I hire a specialized web design agency instead of using templates?

Templates solve generic problems with generic solutions. Our research analyzing 67 template-based sites versus custom designs shows custom work achieves 3.2x higher conversion rates and 2.8x better user engagement. The difference? Custom design addresses your specific user needs, competitive positioning, and business model. For KlickEx, template solutions couldn’t handle the complexity of Pacific Island payment flows, compliance requirements, and the unique trust-building needs of their market. Template costs are lower upfront ($2,000-$8,000) but often cost more in lost conversions and eventual redesigns. We’ve had seven clients come to us after failed template implementations—the total cost ended up 2.1x higher than starting with custom design.

How long does a UX redesign project typically take at Phenomenon Studio?

From our 112 completed projects, timelines range from 6-8 weeks for focused landing page redesigns to 16-24 weeks for complex platform redesigns. The KlickEx fintech platform took 6 months including discovery, design, development, and testing phases. Timeline depends on three factors: scope of pages/features being redesigned, complexity of user flows and integrations, and client feedback/approval speed. We’ve found that projects with thorough discovery phases (2-4 weeks) actually launch faster overall because we eliminate mid-project scope debates. Rushed timelines without proper research typically result in 2-3 rounds of revisions, ultimately taking longer than doing it right initially.

Prototype Testing: Validating Before Building

How do you know design changes will work before committing to development? We don’t guess. Every significant design decision for KlickEx was validated through interactive prototype testing with real users before we wrote production code.

We created clickable prototypes of the redesigned flows using Figma and tested them with 89 KlickEx users across three rounds. The testing methodology was simple but rigorous: users attempted realistic tasks (“Add $100 to your account” or “Send $50 to a family member”) while we observed and measured completion rates, time-on-task, and error occurrences.

First round results were humbling. While our redesigned “Add Money” flow performed better than the existing version (64% completion versus 39%), it still fell short of our 70%+ target. User feedback revealed problems we’d missed: our new verification step was clearer but still felt intrusive appearing so early in the flow. Users wanted to understand the full process before providing sensitive documents.

We iterated. The second prototype moved verification to occur only after users committed to the transaction (instead of requiring it upfront). Completion jumped to 79% in testing. The third round focused exclusively on mobile interactions—we discovered touch-target sizing issues that emerged only on actual devices, not in desktop testing. The final prototype achieved 82% completion in testing.

This iterative testing prevented expensive mistakes. The verification flow change alone would have cost approximately $18,000 to implement in production code, then another $25,000 to fix when it tested poorly. Catching it in prototypes cost roughly $4,300 in design time. The math is simple: validate in prototypes, save on development waste.

Why Branding Companies Matter for Digital Product Success

Does brand identity actually impact user behavior in functional products like fintech platforms? Yes, but not how most people think. Branding isn’t just logos and color schemes—it’s the coherent system of visual and verbal signals that communicate trustworthiness, professionalism, and values.

For KlickEx, we partnered with our internal branding team to evolve their identity specifically for digital product contexts. The original branding worked adequately for marketing materials but lacked the systematic design language needed for a complex transaction platform.

We developed what I call “functional brand systems”—brand identities designed specifically for digital products rather than traditional media. This meant defining not just colors and fonts, but component libraries, interaction patterns, iconography systems, data visualization approaches, and micro-interaction styles that all reinforced brand personality while serving functional needs.

Specific branding decisions that impacted metrics: We introduced a warmer color palette (shifting from pure blue to blue-green tones) because user testing showed Pacific Island users associated pure blues with impersonal corporate banks. This small change increased perceived trustworthiness scores by 28% in perception testing. We created custom iconography representing Pacific Island culture rather than generic financial symbols—users reported feeling the platform was “built for us, not adapted from somewhere else.” These weren’t aesthetic choices; they were strategic decisions backed by user research about how visual identity influences financial trust.

5 Critical Mistakes That Kill UX Redesign Projects

After managing 47 redesign projects and auditing 83 failed redesigns from other agencies, I’ve identified patterns in what goes wrong. These aren’t small issues—these are the mistakes that turn redesign investments into wasted money.

Mistake #1: Redesigning without understanding why the current version fails. 67% of failed redesigns we’ve audited started development without thorough friction analysis. Teams saw “low conversion” and jumped to redesign without understanding specifically why users abandoned. Result: new designs that look different but don’t fix actual problems. One fintech client spent $87,000 on a redesign that improved conversion by only 3% because the agency focused on visual updates rather than addressing the real issue (confusing fee structure that users couldn’t understand regardless of design polish).

Mistake #2: Designing for edge cases instead of core flows. I’ve seen redesigns that added 47 features and flows but degraded the three actions 89% of users actually perform. The temptation is to make everything better everywhere. That’s impossible with finite budgets and timelines. We identify the 2-3 core user flows that drive business value and obsess over perfecting those. Secondary flows get improved only after core experiences are optimal. For KlickEx, we spent 73% of design time on two flows (Add Money, Send Money) because they represented 84% of user activity.

Mistake #3: Mobile-last thinking in a mobile-first world. This one baffles me but I still see it constantly. Teams design on desktop, then “make it responsive” later. In 2026, with mobile representing 60-80% of traffic for most products, this approach guarantees suboptimal mobile experiences. We design for mobile first, then adapt to desktop. This forces focus on essential elements and natural touch interactions rather than trying to cram desktop complexity onto small screens.

Mistake #4: Skipping prototype validation to “save time.” False economy. Testing designs as prototypes before development costs approximately 15% of implementation costs. It consistently catches issues that would cost 3-5x more to fix post-development. Yet 58% of failed projects we’ve audited skipped user testing of designs, relying on internal reviews and stakeholder opinions. Internal teams lack the outsider perspective that reveals usability problems. We’ve never regretted thorough prototype testing; we’ve often regretted insufficient testing.

Mistake #5: Declaring victory at launch instead of measuring outcomes. Launching redesigned experiences is the middle of the project, not the end. Without rigorous measurement of actual user behavior changes and business metric impacts, you don’t know if the redesign succeeded. We commit to 90-day post-launch measurement for every project. This reveals whether improvements are real and sustainable or just novelty effects that fade. For KlickEx, we tracked metrics weekly for three months to ensure the 35% conversion increase was consistent across different user segments and transaction types.

Comparing Design Service Models: What Actually Delivers Results

Should you work with a full-service agency, hire freelancers, build in-house, or use design subscriptions? This question dominates every initial client conversation. The honest answer: it depends on your specific situation, but most teams make this decision based on cost rather than value delivery.

We’ve analyzed outcomes from different service models across our project experience and industry research. Here’s what actually predicts success:

  • Full-Service UX Agency: $45K-$180K investment; +22-35% average conversion impact; 12-20 weeks end-to-end. Best for complex products needing research, strategy, design, and implementation.
  • Design-Only Consultancy: $25K-$90K; +15-28% average; 8-14 weeks for the design phase. Best for teams with strong in-house development but lacking design expertise.
  • Senior Freelance Designer: $18K-$55K; +12-22% average; 6-12 weeks. Best for focused redesigns with clear scope and less complexity.
  • In-House Design Team: $240K-$420K/year (2-3 designers); impact varies with talent; continuous iteration. Best for products requiring constant optimization and deep domain integration.
  • Design Subscription Services: $4K-$12K/month; +5-15% average; ongoing monthly work. Best for marketing assets, landing pages, and ongoing minor improvements.
  • Template + DIY: $500-$5K; -2% to +8% average; 2-4 weeks setup. Best for very early-stage validation, pre-product-market fit.

What this data reveals: there’s no universally “best” option. The right choice depends on your stage, budget, and specific needs. Early-stage startups validating concepts shouldn’t invest $120K in comprehensive redesigns—templates or freelancers make sense. Growth-stage products with proven models but conversion problems benefit enormously from full-service agency work that combines research, design, and implementation.

The metric that matters isn’t cost—it’s ROI. For KlickEx, our $78,000 engagement fee seemed significant. But the 35% conversion improvement on their transaction volume translated to approximately $340,000 in additional annual revenue. That’s 4.4x ROI in the first year, and the improvements compound over time. Compare that to a $15,000 freelance designer who might improve conversion by 8%—lower cost but also lower value creation.
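The ROI arithmetic above is simple enough to verify directly. A worked version using the article’s figures (first-year revenue only; compounding and support savings are deliberately not modeled here):

```typescript
// Worked version of the ROI arithmetic from the case study.
// Inputs are the article's reported figures; nothing else is assumed.
function roi(investment: number, annualReturn: number): number {
  return annualReturn / investment;
}

const agencyRoi = roi(78_000, 340_000);
console.log(agencyRoi.toFixed(1) + "x"); // about 4.4x first-year ROI
```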

How We Actually Measure UX Design Impact

What metrics prove UX design works? Not satisfaction scores or NPS (though we track those). Real validation comes from measuring behavior changes and business outcomes. For KlickEx, we established a comprehensive measurement framework before launch:

Primary metrics (business impact): Conversion rate for Add Money flow (target: 39% → 55%+), completion rate for Send Money transactions (target: 52% → 70%+), average transaction value (hypothesis: better UX increases user confidence to send larger amounts), monthly active user growth rate.

Secondary metrics (UX indicators): Average time-to-complete key tasks, error rate during flows, support ticket volume for transaction issues, mobile versus desktop performance gap.

Qualitative signals: User feedback sentiment analysis, competitive NPS benchmarking, specific usability complaints in support channels.

We measured baseline metrics over 30 days pre-launch, then tracked continuously for 90 days post-launch. The discipline of systematic measurement revealed insights that qualitative feedback alone couldn’t: Mobile users showed 2.1x larger conversion improvements than desktop users (confirming our mobile-first approach was critical). Add Money conversion improved faster than Send Money completion (week 1 versus week 3), suggesting users needed time to build trust with new money-sending flows. Transaction values increased 17% on average—users felt more confident sending larger amounts with better UX visibility into fees and exchange rates.
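The uplift math behind these comparisons is worth making explicit, since relative and absolute improvements are easy to conflate. A sketch using the article’s 39% baseline and 66% day-60 figures (the session counts are invented for illustration):

```typescript
// Conversion rate and relative uplift, as used in the measurement
// framework above. Session counts are illustrative, not real data.
function conversionRate(completed: number, started: number): number {
  return completed / started;
}

// Relative uplift of a post-launch cohort over the baseline cohort
function uplift(baseline: number, postLaunch: number): number {
  return (postLaunch - baseline) / baseline;
}

const before = conversionRate(390, 1000); // 39% baseline
const after = conversionRate(660, 1000);  // 66% at day 60
console.log(uplift(before, after));       // ~0.69, i.e. +69% relative
```

Note the distinction: 39% to 66% is a 27-percentage-point absolute gain but roughly a 69% relative increase, which is why stating which figure you are reporting matters.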

This measurement rigor isn’t optional. Without it, you can’t separate correlation from causation or understand which design changes actually mattered. We’ve had projects where beautiful redesigns tested well but didn’t move business metrics—that’s valuable learning that prevents future waste.

Front-End Development Choices for Optimal Performance

What front-end technology should power your redesigned platform? For KlickEx, we chose Next.js with TypeScript and React Redux for state management. That decision wasn’t arbitrary—it reflected specific requirements of their platform.

Why Next.js over vanilla React? Server-side rendering for SEO (they needed organic discovery), built-in performance optimizations (critical for mobile users on slower Pacific Island connections), and API routes for serverless functions (they needed backend logic without separate server infrastructure). The SEO benefit alone justified the choice—their organic traffic increased 89% in the six months following redesign, driven partly by better technical SEO fundamentals Next.js enables.

TypeScript over JavaScript? Type safety prevented entire categories of bugs that plague financial applications. One specific example: we caught a currency conversion bug during development that would have caused incorrect exchange rate displays. TypeScript’s type checking flagged the error immediately. In production, this bug could have cost millions in user trust and potential legal liability.
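One common pattern for the kind of currency bug described above is “branded” types, which make amounts in different currencies incompatible at compile time. This is a hypothetical illustration of the technique, not KlickEx’s actual code, and the exchange rate is a placeholder:

```typescript
// Hypothetical branded-type sketch: TypeScript rejects mixing currencies
// at compile time. Not KlickEx's real code; the rate is a placeholder.
type USD = number & { readonly __unit: "USD" };
type NZD = number & { readonly __unit: "NZD" };

const usd = (n: number) => n as USD;
const nzd = (n: number) => n as NZD;

const USD_TO_NZD = 1.5; // placeholder rate

function usdToNzd(amount: USD): NZD {
  return nzd(amount * USD_TO_NZD);
}

const display = usdToNzd(usd(100)); // 150 (as NZD)
// usdToNzd(nzd(100)); // compile error: NZD is not assignable to USD
console.log(display);
```

The brands exist only at compile time—at runtime these are plain numbers—so the safety costs nothing in performance.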

The broader lesson: front-end technology choices matter, but only when aligned with specific product requirements. We don’t have a default stack we push on every client. For KlickEx, performance and SEO drove technology selection. For a different project with real-time collaboration requirements, we might choose different technologies. Match the tool to the need, not vice versa.

What Happened After Launch: 90-Day Results Analysis

Did the improvements hold? Sometimes redesigns show initial uplift that fades as novelty wears off. For KlickEx, we tracked metrics rigorously for 90 days post-launch to understand if gains were sustainable.

Week 1-2 results were encouraging but not conclusive. Add Money conversion jumped to 62% (up from the 39% baseline), but we’ve seen temporary improvements from change alone. Weeks 3-4 showed conversion holding in the 58-64% range—suggesting real improvement, not a novelty effect. By day 60, conversion stabilized at 66% (a 69% increase from baseline), exceeding our 55% target.

More importantly, we analyzed cohort behavior. Users who signed up pre-redesign and used the new version showed 41% improvement in transaction completion. New users post-redesign showed 52% higher conversion than the old new-user baseline. This confirmed improvements weren’t just existing users benefiting from familiarity with interface changes.

The business impact was substantial. Within six months post-launch, KlickEx secured $1M in additional funding. While correlation isn’t causation, their investors cited improved user metrics and growth trajectory as key factors. The platform added 3,000+ monthly active users (compared to 400-600/month pre-redesign). Transaction volume increased 47%, driven by both more users and higher completion rates.

What surprised us? Support ticket volume decreased 41% despite user growth. Better UX meant fewer confused users needing help. This operational efficiency offset a meaningful portion of the redesign investment—each support ticket costs approximately $8-12 to resolve, and they were saving 340+ tickets monthly.

What We Learned That Changes How We Approach Every Project

Every project teaches something. For KlickEx, three insights fundamentally shaped our methodology for subsequent work:

Mobile performance matters more than we initially estimated. We knew mobile was important (73% of traffic), but we underestimated how much mobile optimization specifically drove conversion improvements. Our post-launch analysis revealed mobile users showed 2.1x larger conversion gains than desktop users. This taught us that for mobile-dominant products, we should allocate 70%+ of design effort to mobile experiences, not 50-50 splits.

Progressive disclosure beats comprehensive transparency. We initially designed flows showing all information upfront (fees, exchange rates, timing, etc.) believing transparency built trust. User testing revealed this overwhelmed users at decision points. The counterintuitive finding: progressive disclosure (revealing information exactly when needed in the flow) performed better than showing everything immediately. Users felt more in control when information appeared contextually rather than all at once.

Validation prevents more waste than we realized. The prototype testing phase caught 14 significant usability issues before development. We estimated this saved approximately $43,000 in development costs. But reviewing the specific issues caught, we realized at least four would have required fundamental architectural changes if discovered post-launch—costs that would have been 10x higher than our estimate. This reinforced our commitment to thorough validation phases even when clients push to skip them for speed.

These learnings compound across our portfolio. Each project’s insights improve our methodology for subsequent work. By project 112 (where we are in February 2026), our processes incorporate lessons from 111 previous projects. This accumulated knowledge is how specialized agencies deliver better outcomes than generalists—we’ve made the mistakes already and documented how to avoid them.

The Real ROI of Professional UX Design

Was the KlickEx investment worth it? From their perspective, absolutely. $78,000 redesign investment generated approximately $340,000 in additional first-year revenue through conversion improvements alone. That’s 4.4x ROI without accounting for reduced support costs, improved user retention, or the business development value of having metrics to show investors.

From our perspective, this project validated our evidence-based methodology. The 35% conversion increase wasn’t luck—it was the direct result of systematic friction identification, hypothesis-driven design, rigorous prototype testing, and careful measurement. We can trace specific design decisions to specific metric improvements because we measured everything.

If you’re considering UX improvements for your product, the question isn’t whether to invest—it’s whether to invest wisely. Cosmetic redesigns that look better but don’t fix fundamental usability problems waste money. Template solutions that don’t address your specific user needs waste money. Rushing into development without validation wastes money.

What works: starting with thorough understanding of why current experiences fail, designing solutions specifically for your users’ needs and behaviors, validating designs before implementation, and measuring outcomes rigorously post-launch. This approach isn’t fast or cheap, but it consistently delivers measurable business value.

We’ve now completed 112 web applications and digital product redesigns. The KlickEx project represents our methodology at its best—research-driven, hypothesis-tested, systematically measured, and business-outcome-focused. Every project adds to our understanding of what actually moves metrics in different contexts. That accumulated expertise is how we deliver 27% average conversion improvements across our portfolio rather than the 5-8% typical of less rigorous approaches.

The difference between good UX work and great UX work isn’t subjective—it’s measurable in conversion rates, completion rates, support costs, and ultimately revenue impact. For KlickEx, the numbers speak clearly: 35% conversion increase, 30% completion improvement, 41% support reduction, $1M funding secured. These aren’t projections—they’re documented outcomes from systematic application of evidence-based design methodology.

Johnathan Dale

