Published on March 20, 2014
UX and CRO: Toolkits and Tips. UXPA UK, 20th March 2014. @OptimiseOrDie
@OptimiseOrDie • UX and Analytics (1999) • User Centred Design (2001) • Agile, Startups, No budget (2003) • Funnel optimisation (2004) • Multivariate & A/B (2005) • Conversion Optimisation (2005) • Persuasive Copywriting (2006) • Joined Twitter (2007) • Lean UX (2008) • Holistic Optimisation (2009) Was : Group eBusiness Manager, Belron Now : Consulting
@OptimiseOrDie UX Hype Cycle (diagram): Discovered UX · Became obsessed with UX · Save the Trees · Breaking the bonds · Getting the mix right
@OptimiseOrDie AB Test Hype Cycle (timeline diagram): Discovered AB testing · Tested stupid ideas, lots · Most AB or MVT tests are bullshit · Triage, Triangulation, Prioritisation, Maths · Zen Plumbing
@OptimiseOrDie Hands on!
Nice day at the office, dear? @OptimiseOrDie
Craig’s Cynical Quadrant (Improves revenue × Improves UX):
- UX yes, revenue no: Client delighted (and fires you for another UX agency)
- UX yes, revenue yes: Client fucking delighted
- UX no, revenue no: Client absolutely fucking furious
- UX no, revenue yes: Client fires you (then wins an award for your work)
Top Tools & Tips #1 Get out of the office #2 Immerse yourself #3 Session Replay #4 Voice of Customer #5 Get the right inputs #6 Act like a P.I. #7 Experience testing #8 Split testing tools #9 Get performance #10 Analytics health check #11 Going agile Examples @OptimiseOrDie
#1 : GET OUT OF THE OFFICE @OptimiseOrDie
@OptimiseOrDie 1a : Lab Based Testing
@OptimiseOrDie 1b : Remote UX Testing (diagram: moderator, participant and remote viewers)
@OptimiseOrDie 1c : Crowdsourced Testing
Remote UX tools (P=Panel, S=Site recruited, B=Both): Usertesting (B) www.usertesting.com · Userlytics (B) www.userlytics.com · Userzoom (S) www.userzoom.com · Intuition HQ (S) www.intuitionhq.com · Mechanical Turk (S) www.mechanicalturk.com · Loop11 (S) www.loop11.com · Open Hallway (S) www.openhallway.com · What Users Do (P) www.whatusersdo.com · Feedback Army (P) www.feedbackarmy.com · Userfeel (P) www.userfeel.com · Ethnio (for recruiting) www.ethnio.com
Feedback on prototypes / mockups: Pidoco www.pidoco.com · Verify from Zurb www.verifyapp.com · Five Second Test www.fivesecondtest.com · ConceptShare www.conceptshare.com · Usabilla www.usabilla.com
@OptimiseOrDie 1d : Beer, Caffeine and Work Breaks
@OptimiseOrDie 1e : Guerrilla Testing
DESKTOP & LAPTOP: CamStudio (free) www.camstudio.org · Mediacam AV (cheap) www.netu2.co.uk · Silverback (Mac) www.silverbackapp.com · Screenflow (Mac) www.telestream.net
MOBILE: UX Recorder (iOS) www.uxrecorder.com · Skype Hugging bit.ly/tesTfm · Reflection bit.ly/GZMgxR · Reflector bit.ly/JnwtMo
@OptimiseOrDie 1f : The Secret Millionaire • Tesco placed IT staff in front-line roles with the product • You have to create this kind of feedback loop • If it isn't there, you need to push/encourage • Connect the team with pain points AND the outcomes of their work, split tests and changes • Hugely motivational strategy • One last tip – learn how to interview like a pro • Read these: "Don't Make Me Think" amzn.to/1gIZEJn "Rocket Surgery Made Easy" amzn.to/1e0hnUL "Talking to Customers" bit.ly/1e0hT58 "Talking with Participants" bit.ly/1kKL3LE "Don't Listen to Users" bit.ly/1cQpiIE "Interviewing Tips" bit.ly/1fKqu03 "More Interviewing Tips" bit.ly/1bmvGT
#2 : IMMERSE YOURSELF @OptimiseOrDie • Test ALL key campaigns • Use Real Devices • Get your own emails • Order your products • Call the phone numbers • Send an email • Send 11 shoes back • Be difficult • Break things • Experience the end-to-end journey • Do the same for competitors • Team are ALL mystery shoppers • Wear the magical slippers • Be careful about dogfooding though!
@OptimiseOrDie #3 : GET SESSION REPLAY
• Vital for optimisers & fills in a ‘missing link’ for insight • Rich source of data on visitor experiences • Segment by browser, visitor type, behaviour, errors • Forms analytics (when instrumented) are awesome • Can be used to optimise in real time!
Session replay tools: • Clicktale (Client) www.clicktale.com • SessionCam (Client) www.sessioncam.com • Mouseflow (Client) www.mouseflow.com • Ghostrec (Client) www.ghostrec.com • Usabilla (Client) www.usabilla.com • Tealeaf (Hybrid) www.tealeaf.com • UserReplay (Server) www.userreplay.com
• Sitewide Omnipresent Feedback • Triggered (Behavioural) Feedback • Use of Features, Cancellation, Abandonment • 4Q Task Gap Analysis very good • Kampyle www.kampyle.com • Qualaroo www.qualaroo.com • Feedback Daddy www.feedbackdaddy.com • 4Q 4q.iperceptions.com • Usabilla www.usabilla.com #4 : GET THEIR VOICE
• Make contact and feedback easy & encouraged • Add contact & feedback to everything (e.g. all mails) • Read Caroline Jarrett, run surveys (remember them?) • Run regular NPS and behaviourally triggered surveys • Get ratings on Service Metrics • Find what drives the ‘level’ of delight • Ask your frequent, high spend, zealous users questions • Make the team spend ½ a day a month at the Call Centre • Meet with your Sales and Support teams ALL the time • Tip : Take them for Beers and encourage bitching #4 : GET THEIR VOICE
Insight - Inputs #FAIL Competitor copying Guessing Dice rolling An article the CEO read Competitor change Panic Ego Opinion Cherished notions Marketing whims Cosmic rays Not ‘on brand’ enough IT inflexibility Internal company needs Some dumbass consultant Shiny feature blindness Knee-jerk reactions #5 : Your inputs are all wrong @OptimiseOrDie
Insight - Inputs Insight Segmentation Surveys Sales and Call Centre Session Replay Social analytics Customer contact Eye tracking Usability testing Forms analytics Search analytics Voice of Customer Market research A/B and MVT testing Big & unstructured data Web analytics Competitor evals Customer services #5 : These are the inputs you need… @OptimiseOrDie
• For your brand(s) and competitors • Check review sites, Discussion boards, News • Use Google Alerts on various brands & keywords • See what tools they’re using (www.ghostery.com) • Sign up for all competitor emails • Run Cross Competitor surveys • This was VITAL for LOVEFiLM • Use Social & Competitor Monitoring tools : slidesha.re/1k7bflG #6 : ACT LIKE A PI
#7 : MAKE MONEY FROM TESTING! @OptimiseOrDie
Email testing: www.litmus.com · www.returnpath.com · www.lyris.com
Browser testing: www.crossbrowsertesting.com · www.browserstack.com · www.spoon.net · www.saucelabs.com · www.multibrowserviewer.com
Mobile devices: www.appthwack.com · www.deviceanywhere.com · www.mobilexweb.com/emulators · www.opendevicelab.com
• Google Content Experiments bit.ly/Ljg7Ds • Optimizely www.optimizely.com • Visual Website Optimizer www.visualwebsiteoptimizer.com • Multi Armed Bandit Explanation bit.ly/Xa80O8 • New Machine Learning Tools www.conductrics.com www.rekko.com @OptimiseOrDie #8 : MAKE MORE MONEY FROM TESTING!
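The tools above all report results in the same statistical terms: an uplift and a confidence level. As a rough illustration of the arithmetic behind those numbers – not any particular vendor's method – here is a minimal two-proportion z-test sketch, with made-up counts:

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: one-sided confidence that B beats A.
    conv_*: conversion counts, n_*: visitor counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    confidence = 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF
    uplift = (p_b - p_a) / p_a                 # relative uplift of B over A
    return uplift, confidence

# Illustrative numbers only: 2.0% vs 2.5% conversion on 10k visitors each
uplift, conf = ab_significance(200, 10_000, 250, 10_000)
```

With these invented counts the variant shows a 25% relative uplift at roughly 99% confidence – which is why sample size matters before calling a winner.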
#9 – Fight! • Google PageSpeed Tools • Webpagetest.org • Mobitest.akamai.com @OptimiseOrDie
Real testing : mobitest.akamai.com @OptimiseOrDie

Site | Size | Requests
The Daily Mail | 4574k | 437
Starbucks | 1300k | 145
Direct Line | 887k | 45
Ikea (.se) | 684k | 14
Currys | 667k | 68
Marks & Spencer | 308k | 45
Tesco | 234k | 15
The Guardian | 195k | 35
BBC News | 182k | 62
Auto Trader | 151k | 47
Amazon | 128k | 16
Aviva | 111k | 18
Autoglass | 25k | 10
Slides : slidesha.re/PDpTPD If you really care, download this deck: @OptimiseOrDie
Scare the Ecom or Trading director:
#10 : Your analytics tool is broken! @OptimiseOrDie
• Get a Health Check for your Analytics – Mail me for a free pack • Invest continually in instrumentation – Aim for at least 5% of dev time to fix + improve • Stop shrugging : plug your insight gaps – Change ‘I don’t know’ to ‘I’ll find out’ • Look at event tracking (Google Analytics) – If set up correctly, you get wonderful insights • Would you use paper instead of a till? – You wouldn’t do it in retail so stop doing it online! • How do you win F1 races? – With the wrong performance data, you won’t @OptimiseOrDie #10 : Your analytics tool is broken!
#11 : Go Agile @OptimiseOrDie
Methodologies - Lean UX
“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”
Positive – Lightweight and very fast methods – Realtime or rapid improvements – Documentation light, value high – Low on wastage and frippery – Fast time to market, then optimise – Allows you to pivot into new areas
Negative – Often needs user test feedback to steer the development, as data alone is not enough – Bosses distrust stuff where the outcome isn’t known
Agile UX / UCD / Collaborative Design
“An integration of User Experience Design and Agile* Software Development Methodologies” (*Sometimes)
(Cycle: Concept → Research → Wireframe → Prototype → Test → Analyse)
Positive – User centric – Goals met substantially – Rapid time to market (especially when using Agile iterations)
Negative – Without quant data, user goals can drive the show – missing the business sweet spot – Some people find it hard to integrate with siloed teams – Doesn’t work with waterfall IMHO
Lean Optimisation
“A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”
Positive – A blend of several techniques – Multiple sources of Qual and Quant data aid triangulation – Focus on priority opportunities drives unearned value and customer delight for all products
Negative – Needs a one-team approach with a strong PM who is a Polymath (Commercial, Analytics, UX, Technical) – Only works if your teams can take the pace – you might be surprised though!
Lean CRO (cycle diagram): Inspection · Immersion · Identify · Triage & Triangulate · Outcome Streams · Instrument · Measure · Learn
We believe that doing [A] for People [B] will make outcome [C] happen. We’ll know this when we observe data [D] and obtain feedback [E]. (reverse) @OptimiseOrDie
Agile - Summary • Design your own methodology – experiment and optimise with your team • Don’t be a slave – the methodology is the slave, not your master http://tcrn.ch/1gPpUNo • Collaborative working – Harvard study into teams – it’s an all-the-time thing • Ask me later… Questions – see me on Twitter, G+ or ask by mail @OptimiseOrDie
Scarcity principle... #1 : EXAMPLES
Scarcity principle... #1 : EXAMPLES
• 20M+ visitor tests with People Images • Some interesting stuff at Autoglass (Belron) • Negative body language is a turnoff • Uniforms and branding are a positive (ball cap) • Eye gaze and smile are crucial • Hands are awkward without a prop • Best prop tested was a clipboard • Single image better than groups • In most of the 33 countries where strong female and male images were tested against each other, the female image won • So – a question about this test @OptimiseOrDie #2 : SPLIT TESTING PEOPLE
Terrible Stock Photos : headsethotties.com & awkwardstockphotos.com Laughing at Salads : womenlaughingwithsalad.tumblr.com Other Stock Memes : linkli.st/optimiseordie/7Fdxz BBC Fake Smile Test : bbc.in/5rtnv @OptimiseOrDie
UK Wave 2 - Isi
#3 : TV ADVERTISING (TV Off / TV On) Isi went on to star in the TV slot and helped Autoglass grow recruitment of female technicians, as well as proving a point!
SPAIN +22% over control 99% confidence @OptimiseOrDie #3 : PHOTOGRAPHY UX
@OptimiseOrDie #4 : VOC, NPS, EXPERIMENTS • Belron NPS programme is huge • Millions of people every year, across the world • 35% survey takeup, 6% dropout rate! • (Try @lukew and @cjforms and @stickycontent) • Higher scores than some consumer products • Why? Measuring the drivers of delight • Even on A/B tests, we could split NPS data • We could see a new funnel drove a 5.5% rise • Lovefilm beat their competitors using NPS • How? Measuring key service metrics • Regression to find high value investment areas • Contact deflection using self service • Analytics, split testing, UX
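Splitting NPS by test variant, as Belron did, is simple arithmetic: the percentage of promoters (scores 9-10) minus the percentage of detractors (0-6). A minimal sketch, with invented scores rather than any real survey data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Split NPS by A/B variant, as described above (scores are made up)
control = nps([9, 8, 7, 10, 6, 9, 3, 8])
variant = nps([10, 9, 9, 8, 10, 9, 6, 8])
```

Scores of 7-8 are passives: they dilute the score but count in neither direction, which is why driving delight (not just satisfaction) moves the number.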
How is it working out for Craig? • Methodologies are not Real Life™ • It’s mainly about the mindset of the team and managers, not the tools or methodologies used • Not all my clients have all the working parts • Use some, any, techniques instead of ‘guessing’ • Bringing together UX techniques with the excellent tools available – along with analytics investment – will bring you successful and well-loved products • Blending Lean and Agile UX with conversion optimisation techniques (analytics, split testing, Kaizen, Kano) is my critical insight from the last 5 years • UX got hitched to numbers, they ran away and lived happily ever after
If it isn’t working, you’re not doing it right @OptimiseOrDie
Email : firstname.lastname@example.org · Twitter : @OptimiseOrDie · LinkedIn : linkd.in/pvrg14 · Slides on Slideshare.net/sullivac tonight!
RESOURCE PACK • Maturity model • Best CRO people on twitter • Best Web resources • Good recent books to read • Triage and Triangulation • The Bucket outcome methodology • Belron methodology example • CRO and testing resources • Companies and people to watch • Building a ring model • Manual Models for Analytics @OptimiseOrDie
Maturity model (Levels 1-5, across Testing focus, Culture, Process, Analytics focus, Insight methods and Mission):
Level 1 – Starter Level (Ad Hoc, Local Heroes, Chaotic Good): Guessing · A/B testing · Basic tools · Analytics · Surveys · Contact Centre · Low-budget usability · Outline process · Small team · Low-hanging fruit · Mission: Get buy-in
Level 2 – Early maturity: + Multivariate · Session replay · No segments · + Regular usability testing/research · Prototyping · Onsite feedback · + Funnel optimisation · Call tracking · Some segments · Micro testing · Bounce rates · Big-volume landing pages · Mission: Prove ROI
Level 3 – Serious testing: Dedicated team · Volume opportunities · Well developed process · + Funnel analysis · Low-converting & high-loss pages · + Offline integration · Single-channel picture · + User Centred Design · Layered feedback · Mini product tests · Mission: Scale the testing
Level 4 – Core business value: Cross-silo team · Systematic tests · Streamlined · + Funnel fixes · Forms analytics · Channel switches · + Cross-channel testing · Integrated CRO and analytics · Segmentation · + Customer sat scores tied to UX · Rapid iterative testing and design · Mission: Mine value
Level 5 – You rock, awesomely: Ninja team · Testing in the DNA · Company-wide · + Spread tool use · Dynamic adaptive targeting · Machine learning · Realtime · Multichannel funnels · Cross-channel synergy · + All-channel view of customer · Driving offline using online · All promotion driven by testing · Mission: Continual improvement
Best of Twitter @OptimiseOrDie @danbarker Analytics @fastbloke Analytics @timlb Analytics @jamesgurd Analytics @therustybear Analytics @carmenmardiros Analytics @davechaffey Analytics @priteshpatel9 Analytics @cutroni Analytics @Aschottmuller Analytics, CRO @cartmetrix Analytics, CRO @Kissmetrics CRO / UX @Unbounce CRO / UX @Morys CRO/Neuro @PeepLaja CRO @TheGrok CRO @UIE UX @LukeW UX / Forms @cjforms UX / Forms @axbom UX @iatv UX @Chudders Photo UX @JeffreyGroks Innovation @StephanieRieger Innovation @DrEscotet Neuro @TheBrainLady Neuro @RogerDooley Neuro @Cugelman Neuro
Best of the Web @OptimiseOrDie Whichtestwon.com Unbounce.com Kissmetrics.com Uxmatters.com RogerDooley.com PhotoUX.com TheTeamW.com Baymard.com Lukew.com PRWD.com Measuringusability.com ConversionXL.com Smartinsights.com Econsultancy.com Cutroni.com www.GetMentalNotes.com
Best of Books @OptimiseOrDie
Triage and Triangulation • Starts with the analytics data • Then UX and user journey walkthrough from SERPS -> key paths • Then back to analytics data for a whole range of reports: • Segmented reporting, Traffic sources, Device viewport and browser, Platform (tablet, mobile, desktop) and many more • We use other tools or insight sources to help form hypotheses • We triangulate with other data where possible • We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk) • A simple quadrant shows the value clusters • We then WORK the highest and easiest scores by… • Turning every opportunity spotted into an OUTCOME “This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.” @OptimiseOrDie
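The uplift-vs-difficulty quadrant above can be sketched as a simple scoring pass. Everything here – the issue names, the 1-10 scores and the threshold of 5 – is illustrative, not the actual scoring system used in the deck:

```python
# Hypothetical issue list: (name, estimated uplift 1-10, difficulty 1-10)
issues = [
    ("Basket error message unclear", 8, 2),
    ("Checkout redesign", 9, 9),
    ("Homepage hero copy", 3, 2),
    ("Payment page form labels", 7, 3),
]

def quadrant(uplift, difficulty):
    """Place an issue in a value/effort quadrant (threshold of 5 assumed)."""
    value = "high value" if uplift >= 5 else "low value"
    effort = "easy" if difficulty <= 5 else "hard"
    return f"{value} / {effort}"

# Work the 'high value / easy' cluster first: rank by difficulty minus uplift
for name, u, d in sorted(issues, key=lambda i: i[2] - i[1]):
    print(f"{name}: {quadrant(u, d)}")
```

The ranking puts a cheap, high-uplift fix ahead of an expensive redesign with a similar uplift estimate – which is the whole point of the quadrant.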
The Bucket Methodology “Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.” Test If there is an obvious opportunity to shift behaviour, expose insight or increase conversion – this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue. Instrument If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found. Hypothesise This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction. Just Do It JFDI (Just Do It) – is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Items in here require low effort or are micro-opportunities to increase conversion, and should be fixed. Investigate You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
5 - Belron example – Funnel replacement: Final prototype → Usability issues left → Final changes → Release build → Legal review kickoff → Cust services review kickoff → Marketing review → Test Plan Signoff (Legal, Mktng, CCC) → Instrument analytics → Instrument Contact Centre → Offline tagging → QA testing → End-End testing → Launch 90/10% → Monitor → Launch 80/20% → Monitor (< 1 week) → Launch 50/50% → Go live 100% → Analytics review → Washup and actions → New hypotheses → New test design → Rinse and Repeat!
CRO and Testing resources • 101 Landing page tips : slidesha.re/8OnBRh • 544 Optimisation tips : bit.ly/8mkWOB • 108 Optimisation tips : bit.ly/3Z6GrP • 32 CRO tips : bit.ly/4BZjcW • 57 CRO books : bit.ly/dDjDRJ • CRO article list : bit.ly/nEUgui • Smashing Mag article : bit.ly/8X2fLk @OptimiseOrDie
So you want examples? Examples of companies putting this stuff together in a good way: • Belron – Ed Colley • Dell – Nazli Yuzak • Shop Direct – Paul Postance (now with EE) • Expedia – Oliver Paton • Schuh – Stuart McMillan • Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann • Gov.uk – Adam Bailin (now with the BBC) Read the gov.uk principles : www.gov.uk/designprinciples And my personal favourite of 2013 – Airbnb! @OptimiseOrDie
Building a Model : #1 Avinash article · #2 The Ring Model · #3 Three examples · #4 Benefits · #5 Summary
6.1 – Avinash “See-Think-Do” • Avinash Kaushik, analytics guru, proposes a very nice model for marketing. A brilliant article can be read here: • http://www.kaushik.net/avinash/see-think-do-content-marketing-measurement-business-framework/ • But this sort of thinking is also relevant to optimisation • CRO often focuses purely on the ‘Do’ stage – rather than the ‘See’ or ‘Think’ stages.
6.1 – Example
6.2 – The Ring Model • Simply looking at conversion points is not enough • We need a way to look at the ‘layers’ or ‘levels’ reached • So I developed a ring or engagement model • This works for many (but not all) websites • Focuses on depth of engagement, not pages viewed • Helps to see the key loss steps, like a funnel • It’s not a replacement for funnel diagrams • It helps to see the ‘big picture’ involved • So – let’s try some examples
6.3 – Examples – Concept : Bounce → Engage → Outcome
6.3 – Examples – Shoprush.com : Bounce → Search or Category → Product Page → Add to basket → View basket → Checkout → Complete
6.3 – Examples – 16-25Railcard.co.uk : Bounce → Login to Account → Content Engage → Start Application → Type and Details → Eligibility → Photo → Complete
6.3 – Examples – Guide Dogs : Bounce → Content Engage → Donation Pathway → Donation Page → Starts process → Funnel steps → Complete
6.3 – Within a layer : Page 1 · Page 2 · Page 3 · Page 4 · Page 5 → Exit or Deeper Layer · Micro Conversions: Email · Like · Contact · Wishlist
6.4 – Benefits • Helps you see where flow is ‘stuck’ • Sorts out small opportunities from big wins • Ignores pages in favour of ‘Macro’ and ‘Micro’ conversions • Lets you show the client where focus should be • Helps flush flow or traffic through to lower levels • Avoids prioritising the wrong part of the model! • Example – the Shoprush problem is basket adds, not checkout • If you had 300k product page views, 5k adds and 1k checkouts – where would your problem be? • If you had 300k product page views, 100k adds and 1k checkouts – it’s a different place! • Example – a Google AdWords site has bad traffic, not bad conversion
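The arithmetic behind those two scenarios is just the conversion rate between consecutive steps; a quick sketch using the numbers above:

```python
def step_rates(counts):
    """Conversion rate between each pair of consecutive funnel steps."""
    return [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]

# Product page views -> basket adds -> checkouts
a = step_rates([300_000, 5_000, 1_000])    # adds ~1.7%, checkout 20%: fix basket adds
b = step_rates([300_000, 100_000, 1_000])  # adds ~33%, checkout 1%: fix checkout
```

Same total outcome (1k orders), completely different priority: the weakest step-to-step rate tells you where to spend the effort.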
6.5 – Benefits contd. • A nice simple way to visualise complex websites • Does not rely on pages – more ‘steps’ or ‘layers’ • Helps you see where traffic is ‘stuck’ or ‘failing to engage more deeply’ • Combining traffic potential with UX and persuasion issues identifies the opportunity • Avoids visual bias when doing an expert review • In the e-commerce example, Rush have optimised the product page first, not the homepage. • Questions?
By Hand Analytics : #1 When to use this method? · #2 How to use it · #3 Demo · #4 Limitations
5.1 – When to use this method • If goals are unreliable / broken / have no data • If flows are mixed in funnels (mid-stage joiners) • If the conceptual model does not match the site config • When the data you need does not exist
5.2 – How to use this method • For example, with a funnel • Use UNIQUE PAGEVIEWS (and events, if available) • Do NOT mix with pageviews or visitor counts • Step 1 – Basket UPVs • Step 2 – Customer details • Step 3 – Shipping • Step 4 – Payment • Step 5 – Thank you • Use regex / advanced segments to aggregate or filter • Gives you a unique count of people at each step • Always be aware of time periods!
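Those steps can be sketched as a small script over exported unique-pageview counts. The URL paths, patterns and counts below are invented for illustration – the point is aggregating UPVs per funnel step with a regex, just as you would with in-page filters or advanced segments in the analytics tool:

```python
import re

# Hypothetical unique-pageview counts exported per URL path
upvs = {
    "/basket": 12000,
    "/checkout/details": 7400,
    "/checkout/shipping": 6900,
    "/checkout/payment": 6100,
    "/checkout/thankyou?order=123": 4500,
    "/checkout/thankyou?order=456": 300,  # URL variants aggregated by regex
}

steps = [
    ("Basket", r"^/basket"),
    ("Customer details", r"^/checkout/details"),
    ("Shipping", r"^/checkout/shipping"),
    ("Payment", r"^/checkout/payment"),
    ("Thank you", r"^/checkout/thankyou"),
]

# Aggregate UPVs per step: a regex per step collects all matching URLs
funnel = []
for name, pattern in steps:
    total = sum(c for url, c in upvs.items() if re.search(pattern, url))
    funnel.append((name, total))
```

Because UPVs count each visit once per page, the step totals are comparable with each other – which is exactly why mixing them with raw pageviews or visitor counts breaks the funnel.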
5.3 – Limitations & Benefits • Mixing and matching data can look nice but causes issues • Time consuming and more complex • Try to use in-page filters, not advanced segments (sampling) • Is not readily replayed by the client Some benefits though: • Construct segmented funnels • Split by other data attributes • Very good way to spot variances inside funnels • Vital for multi-device category websites
END SLIDES Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck. If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear! Regards, Craig.