Multi-platform testing – the answer to omni-channel CX

Image

The omni-channel experience

“Omni-channel” is the new buzzword being tossed around. But what does it actually mean? And how do you begin to develop an omni-channel strategy?

With the rise of Near Field Communication (NFC) and mobile phone usage in-store, consumers are increasingly embracing digital technologies and devices in all stages of their buying journey. This integration of digital into off-line shopping behaviour means customers experience a brand, rather than a channel within a brand. In response, companies are designing omni-channel strategies to deliver a seamless approach to the customer experience across multiple touch points. It’s about true continuity of the customer experience.

An article published in The Wall Street Journal (January 2014) says “retailers are still struggling with omni-channel strategies” – and Australian companies are no different. A company that wants to start thinking about the omni-channel experience needs to be open to making its customers’ experience continuous and universal. To do this, you need to understand why and how your customers integrate different touch points into their buying journey.

To meet the demands of our clients, Objective Digital has deployed a multi-platform testing methodology. Conducting usability testing with eye tracking on multiple devices, rather than on each device in isolation, yields richer insights into your customers’ omni-channel experience. Multi-platform testing lets our clients understand how consumers experience their brand, rather than just their interaction with a single channel.

Objective Digital was recently commissioned by one of Australia’s leading internet betting and entertainment websites. As an online organisation, one of their research objectives was to understand how their multiple platforms integrated, and what kind of omni-channel customer experience they were creating. In response, we deployed our multi-platform testing methodology to investigate which device – desktop, mobile, or tablet – produced the most efficient customer experience at different points of the online betting journey.

Image

Multi-platform testing set-up

Participants were tested and eye-tracked on two of the three devices (desktop, mobile, or tablet) across four key tasks. Each task was scored against a set of quantitative measures, followed by qualitative questions to understand the user’s experience with the different devices. At the completion of the project, we had compiled qualitative, quantitative, and eye-tracking data on each of the four tasks across the three devices. This painted a clear picture of how customers were engaging with the online betting agency across its different channels. From here, we made recommendations on how our client could improve certain channels and leverage others depending on their customers’ expectations and needs.
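
As an illustration, quantitative scores of this kind can be rolled up per device and task with a few lines of analysis. The sketch below uses entirely hypothetical data (these are not our client’s figures, and the task name is invented) to show the general idea:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical observations: (device, task, completion_time_seconds, success)
observations = [
    ("desktop", "place bet", 42, True),
    ("mobile", "place bet", 71, True),
    ("tablet", "place bet", 55, False),
    ("desktop", "place bet", 38, True),
    ("mobile", "place bet", 64, False),
]

# Group the raw scores by (device, task) so we can compare channels.
times = defaultdict(list)
successes = defaultdict(list)
for device, task, seconds, success in observations:
    times[(device, task)].append(seconds)
    successes[(device, task)].append(success)

for key in sorted(times):
    avg_time = mean(times[key])
    success_rate = sum(successes[key]) / len(successes[key])
    print(f"{key[0]:8s} {key[1]}: {avg_time:.0f}s avg, {success_rate:.0%} success")
```

Tables like this make it easy to spot, per task, which channel is most efficient and which needs work.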

The philosophy of omni-channel is simple; the execution of omni-channel strategies, however, has been mediocre at best. To accomplish the migration to omni-channel, companies must have complete visibility of how their users navigate their multiple touch points and channels. Part of what we are doing as customer experience consultants is filling that omni-channel gap for our clients.

Eye tracking mobile devices – now even easier with the Tobii X2 mobile device stand

Our dedicated love of mobile devices

During my commute to and from work each day, there’s one thing I must not forget to bring with me. It’s not my train ticket or my wallet – it’s my phone, and without it I am lost. The truth is, I am not the only one; people love their smartphones and other mobile devices. People are spending more and more time on mobile devices and, based on Flurry’s latest analysis, 80% of that time is spent in apps. But the question is: which apps? A recent study by Compuware found that 79% of people will give an app only one or two tries, and if it doesn’t meet expectations they’ll never use it again.

As marketers, it’s critical to understand how to engage your consumers by providing a better experience when they use your mobile apps. The great news is, we have just the right solution for you!


Latest solution for mobile device eye tracking testing

Our latest Tobii Mobile Device Testing solution lets you study how consumers experience mobile websites and apps, and how they engage with mobile ads. Tobii X2 Eye Trackers, paired with the Tobii Mobile Device Stand, provide a dedicated solution for efficient, high-quality testing of mobile devices and tablets.

Image

Accurate: the solution delivers the highly accurate data you need to test small devices, where logos, text and buttons may all sit within one degree of visual angle and any compromise on accuracy can lead to the wrong conclusions.

Efficient: we can take the eye tracker to any location where it is convenient to run your tests, giving you enormous flexibility in recruiting participants.

Flexible and natural interaction: the solution allows test participants to interact with the mobile device in a natural way, creating a distraction-free user experience. Users can hold the device, smoothly rotate it between landscape and portrait modes, and interact with it from a comfortable viewing angle.

High-quality data: suitable for both quantitative and qualitative studies that require high-quality behavioural data. The fixed-mounted high-definition (HD) scene camera captures the detail needed to analyse small screens.

Image

We’re here to help you get on the front foot

Interbrand, a corporate identity and brand consulting firm, recently ranked Apple the most valuable brand in the world – the first time Coca-Cola hasn’t topped that list since it was first published in 2000. The world as we know it is changing, and mobile devices are a huge part of it. Websites worldwide now get more traffic from mobile devices than from PCs, and mobile users browse 70% more web pages per visit. It’s time to get on the front foot of your consumers’ mobile experience, and we’re here to help.

Click here for a quick video introduction to our mobile device testing solution!

Yoana Francisca

Free Tobii Webinar: Introducing the Tobii UX Live solution for usability testing

In this free 30-minute webinar we will discuss the value of eye tracking-supported usability testing and how to incorporate it in your development process. You will also learn more about the Tobii UX Live solution and how you can use it yourself when testing web pages and software.

In this webinar you will learn more about:

  • The value of usability testing and how to incorporate it in the development process.
  • The benefits of eye tracking-enabled usability testing.
  • How to set up and run a study using Tobii UX Live.
  • An overview of the Tobii UX Live solution.

Attendees will have the opportunity to ask questions via chat after the webinar.

Audience and prerequisites:

This webinar is open to anyone who wants to learn more about the Tobii UX Live solution and usability testing. Pre-registration is required.

About the instructors:

This webinar is taught by Johan Koch, product manager at Tobii Technology in Stockholm, Sweden, and Tommy Strandvall, global training manager at Tobii Technology in Stockholm, Sweden.

Dates and registration:

Date & time: Friday, September 20, 2013, at 17:00 Sydney time
Price: FREE
Registration: Pre-registration is mandatory: Click here to register.

Feel free to contact us if you have any questions!

Cheers,

Sheilah

Changing course

When we conduct usability testing, we always have a plan – a discussion guide or testing script that has the entire session mapped out to the finest detail (how to greet the participants, what tasks to give them, etc.). However, sometimes, you need to change your plans in the middle of testing, as I found out myself.

I recently finished a usability testing project for a mobile website. It was a pretty awesome-looking site, carrying over lots of features from its desktop cousin. After the first 2–3 sessions, I started noticing a clear pattern: there were so many features on the site that users were getting lost trying to find things.

confused_user

It was evident that the issue was with the information architecture of the site (i.e. how information was structured on the site). There was no point getting the participants to do the same tasks and uncover the same issues over and over again, so I decided to conduct a card-sorting activity during the remaining sessions. During a short break between sessions, I quickly printed out cards with labels from the existing site.

lightbulb_moment

Before showing the site to the participants and asking them to complete the tasks, I gave them the pile of cards and asked them to group them into categories they deemed logical. After the grouping exercise, I asked them to name each of the groups they had come up with. The whole process took just 10 minutes, and after a few participants I could see a clear pattern in how they were sorting the cards. Not so surprisingly, the IA of the mobile site was very different to what the participants came up with – no wonder they were having difficulty finding things.
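
For a larger study, the groupings from a card sort like this can be analysed programmatically. Here is a minimal sketch (the card labels and groupings are invented, and this is not the tool we used) that counts how often each pair of cards was grouped together across participants – the pairs with the highest counts suggest the categories users expect:

```python
from itertools import combinations
from collections import Counter

# Each participant's result is a list of groups; each group is a set of
# card labels. These labels and groupings are invented for illustration.
results = [
    [{"Apples", "Pears"}, {"Carrots", "Potatoes"}],
    [{"Apples", "Pears", "Carrots"}, {"Potatoes"}],
    [{"Apples", "Pears"}, {"Carrots", "Potatoes"}],
]

pair_counts = Counter()
for groups in results:
    for group in groups:
        # Count every pair of cards this participant placed together.
        for pair in combinations(sorted(group), 2):
            pair_counts[pair] += 1

# Pairs grouped together by the most participants point to the
# categories users expect to find in the IA.
for pair, count in pair_counts.most_common():
    print(f"{pair[0]} + {pair[1]}: {count}/{len(results)} participants")
```

With only a handful of participants you can eyeball the piles instead, but the same counting logic scales to dozens of sessions.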

fruit

The clients were fascinated by the findings from the ad-hoc card-sorting activity. It helped them get into the heads of their users and see the system from a user’s point of view. They were pleased to receive the extra deliverable – a new IA based on actual user input – and, from a project management point of view, the card-sorting activity cost us (and the client) no additional time or budget: 10 minutes for the activity and a couple of hours to analyse the findings and restructure the IA.

IA

So it got me thinking: what would I do differently next time? Maybe I could prepare cards before the first session just in case I need them, or perhaps use a digital card-sorting tool like OptimalSort. Online tools are useful because participants can sort the cards on their computer screens and clients can watch the results live via screen sharing.

Moral of the story: don’t be afraid to change your course if you’re heading in the wrong direction. Keep asking yourself whether you’re getting rich insights that answer the research questions, and whether those are the most important questions to be asking. If not, what other methods can you use to find the answers? Planning and being adaptable are more important than a plan that doesn’t work.

change_direction

Have you had to change course in the middle of a usability testing project? What did you do? Feel free to share your experiences and thoughts in the comments below.

Lean eye tracking – making eye tracking quick and affordable

Eye tracking can be a valuable tool in the UX tool kit; however, some teams perceive it as expensive and long-winded. In agile and lean environments (where time and unnecessary steps are trimmed wherever possible), consultants need to find ways to make setup, testing and transferring findings as seamless as possible. In this post I want to share some tips I’ve learned over the years to make eye tracking both cost- and time-efficient.

1) Establish clear objectives and areas of interest

Like in any project, it is essential to have clear research and business objectives. The same goes for what you want to get out of eye tracking. Make sure all the key areas and hypotheses pertaining to eye tracking are communicated before the testing starts. Ask your client to outline the following:
  • Which pages/areas are of the most interest? 
  • Are there specific copy, navigation or multimedia elements we want feedback on? 
  • How do we expect eyes to behave (any hypotheses)?

Understanding these questions puts the moderator in a good position to hit the ground running once fieldwork starts.

2) Analyse in real time

Too often eye tracking is sold as a sexy set of heatmaps. While these are appealing (and sometimes insightful), I find the most valuable form of eye tracking is real-time viewing. It allows the moderator to understand objective user behaviour and discuss it immediately afterwards. The moderator’s screen can also be live-streamed over the web, so other members of the team can be part of the real-time analysis without having to leave their desks (or country, for that matter).

Some examples of real time observations include:

  • What is being focused on (and what is missed)
  • Long & repeat gazes
  • The gaze journey

Image

In my experience, real-time viewing is more valuable to the analysis than any other eye tracking output. Remember to timestamp interesting behaviours, and always verify your own observations and hypotheses with the respondent using follow-up questions or retrospective think-aloud techniques.

3) Streamline analysis

Finally, take time out of the analysis phase by using the software in smarter ways (I am using Tobii Studio). Create video segments (video clips) right after saving the recording and debriefing with your respondent. This works especially well if you have another consultant helping you, or while your participant fills out an exit questionnaire. I usually do this in the following steps:
  1. Isolate segments quickly by using timestamps or by replaying the footage faster than real time.
  2. Use the finding or observation as the file name. Do this for as many of the eye tracking findings as you can.
  3. Add in relevant heatmaps and gaze trails (from the pre-determined areas of interest we talked about in step 1).
Heatmap

With these steps you’ve got the eye tracking collateral you need to make a compelling case for your findings.

Just like screen recording and online surveys, eye tracking is a great tool to help you reach confident results. Incorporating eye tracking into lean and agile environments doesn’t need to be an expensive and cumbersome experience, provided you have the right technology and approach. Have you found any tricks for using eye tracking efficiently?

Further reading: Lean UX and the dichotomy of being a UX consultant by Tim Yeo

 

Image

DAN SORVIK
Dan is the Research Manager at Objective Digital. When he’s not conducting eye tracking studies and uncovering customer insights he is doing DIY, playing Xbox and enjoying an ice cold margarita.

Usability Training Sydney 27 April

Our next training course is running next month!

27 April 2012. 9.30am to 4.30pm, 301, 15 Lime Street, King Street Wharf, Sydney NSW 2000

$595 + GST (lunch provided)
Book usability testing training now by emailing jbreeze@ObjectiveDigital.com


Do you want to learn about usability testing, what it involves and how to do it?

 

Gain confidence in usability testing through our hands-on, interactive course.

 

During our one-day interactive usability testing course you will:

  • Learn what’s involved in usability testing
  • Grow your understanding of usability and its heuristics
  • Find out how to plan for a usability test and recruit the right respondents
  • Learn how to prepare a testing script
  • Discover how to run a testing session
  • Understand how to evaluate results and report on usability tests
  • Learn more about the logistics and technologies of usability testing
  • Be given guidelines and tips for successful testing
  • Receive templates that cover each of the steps involved in usability testing.


To put into practice what you learn, you’ll complete a series of exercises throughout the day, ranging from planning a testing script to uncovering usability issues. At the end of the day you will moderate your own usability test and also be a participant in one.

You will leave the course with a deeper understanding of how to plan, prepare, run, analyse and report on usability testing, whether in a management capacity or running usability tests yourself.

You will also be provided with a usability testing course pack that holds all the templates and materials you will need to run your own sessions.

Our usability testing course has a maximum of 12 participants to keep it interactive, fun and informal.

Who should attend:

  • Marketers
  • Web developers
  • Designers
  • Business analysts
  • Online communications managers
  • IT and web managers
  • Testers
  • Online content owners and creators
  • Project managers

 

Book usability testing training now by emailing jbreeze@ObjectiveDigital.com

The workshop fee will include morning tea, lunch and afternoon tea. The workshop will be held at 301, 15 Lime St, Sydney from 9.30am to 4.30pm.

iPad app development – the road ahead

iPad app development is on a steep upward trajectory, yet iPad app usability is trailing in its wake. Recent work we conducted for a Sydney client echoed some of the findings that Jakob Nielsen has begun to uncover in the US regarding usability.

One of the biggest challenges we found (amongst many) was that users adopt two distinctly different mental models when using an iPad. They are the magazine mental model and the computer mental model.

Magazine mental model

With the magazine mental model users expect the app to work similarly to an ebook or emagazine because the shape and size of the iPad device mimics earlier e-reader devices. Users are satisfied with media publications that have been brought online in an electronic form, as it is an efficient, extensive and familiar experience. It is a very linear experience and the structure of the material is in a familiar format of chapters with an initial table of contents (or a catalogue with product categories). It is a wide but flat experience, rather like snorkelling in the water, looking down.

An example of the kind of industry or content suited to a magazine mental model is online shopping – anything where the user browses and reviews options prior to making their purchase decision.

Computer mental model

However, for every other app, users expect a computer mental model. So what is this? Well, it’s not to say that an iPad is simply a pared-down PC; rather, users expect a more accessible, engaging, interactive and immersive experience, delivered in a fluid way with functionality ignited at the touch of a fingertip. For example, in our travel project, participants in the study wanted to be able to see in real time where the current cruise ships were on the globe. The ideal iPad app experience should be a deep one, rather like scuba diving where you can dive deeper, much deeper.

Some examples of the types of industries or content suited to this format are online booking services (travel, entertainment), financial budgeting and planning (modelling) and travel schedules. Where the real experience (e.g. holidays) involves engaging all the senses – sight, sound, touch – the computer mental model is ideal.

The expectations that users bring to the experience are twofold: it must be not only informative but entertaining. Our research revealed that the majority of participants used the iPad while eating a meal, having a drink, relaxing on the couch or in bed. Other research from the UK revealed that in 2010, as much as 20% of iPad usage occurred in bed! It goes without saying that, given the context of use, an app must be simple and uncomplicated. All in all, it’s a very tall order.

There seems to be a sense of déjà vu with the rash of iPad and mobile app developments. Where 10 or so years ago businesses were rushing to have a website, we are seeing something of a repeat with the development of apps. In the rush to develop these apps, however, the usability experience is less than ideal, and often not even considered. The difficulty spans many issues, but two critical ones – which should be at the top of the list when designing an iPad app – are a lack of navigation and a lack of affordance.

Lack of navigation: The discipline of developing a simple, stable navigation is lacking in many apps. Many apps don’t even have an obvious back or home button to help users get back on track when lost. The home page is a critical landmark that users are familiar with, and its absence severely undermines the experience. Mouse-over functionality is non-existent on the iPad, so rollover and dropdown menus are eliminated from the experience. These are normally very helpful functions on a website, so the navigation design needs to work even harder to surface the content in an easy, simple way.

Lack of affordance: The current trend towards sleek, flat designs in iPad apps means that perceived affordance for target areas on the screen is eliminated (a properly designed button, for example, has the visual affordance of pushability). Participants in our research became confused: certain links were avoided because they didn’t look like links, while participants actively tapped items that were deliberately not functional. As the design stood, it was not clear what you could tap and what you couldn’t, generating a considerable amount of frustration. You may well wish to make your functions discoverable – if there is a game element – but hiding features from your users is not recommended.

If you would like to know more about our experiences with mobile and iPad testing, shoot us an email at lphillips@objectivedigital or call us on 02 8065 2438.

Why our clients are finally finding the real value in usability testing…

By Esse Spadavecchia on 3 August 2011

Recently at Objective Digital we have had a plethora of financial clients asking for usability testing with our eye-tracking facilities, as well as larger focus groups to ascertain people’s views on finances and their attitudes towards banking online.

This is a good thing. Actually, it’s a great thing. For one, as a banking customer of a couple of our clients’ banks, I’m overjoyed that they’re interested in hearing what their end users have to say… and what’s more, they’re really listening. Secondly, it’s good for business. And we do love that, really. But how did this all come about?

How did we instil the need for usability testing?

There was a long period of time when “usability testing” and “user-centred design” were just hot jargon being played with in the industry… I don’t think our clients really knew how they would make improvements following usability testing sessions, or how to include the outcomes in their process.

So what changed?

Well, for one, the User Experience (UX) industry started evolving to a point where its deliverables weren’t just a 98-page Word document with detailed findings that no one would ever read…

Yep, we started wising up to the fact that nobody likes to read lengthy documents, whether they are online or printed and bound. Even if you did put your super snazzy company logo in the top left corner with scented spray.

Now, there was GOLD in those documents. Literally thousands of magic pointers that, if followed to the letter, would make your site grow a cape and take off faster than Superman… but who has time to read? Let alone work out a strategy for what gets done first and who should be responsible for it all.

And then what?

Deliverables became PowerPoint presentations and findings became bullet points of highlights. Graphs and tables with summaries became the norm. Edited short video clips of usability testing and eye-tracking accompanied every document.

Suddenly our clients were taking notice. One of our clients went on to say that our report was the ‘Best report I’ve ever read, tells the story in a way that makes absolute sense. We can’t wait to do this for all our products’.

This was the hot tamale they’d been looking for. This was actionable. This they could deal with and still get home in time to watch MasterChef. Winner.

As an example, we delivered a 28-page preso to one of our clients recently, most of which consisted of links to snippets of users talking about what they liked and what they didn’t like. The chief programme guy got it immediately. He stood up, shook our hands and said, “Guys, I know what we need to do. It’s so obvious.” I know we keep saying it, but yes, the users’ voice speaks a million words… even more so than an image, even if it’s of Paris Hilton.

So we started fashioning all our final presentations like this:

  • Short overviews and executive summaries.
  • Stats on what people liked, where they failed, what they wanted.
  • All accompanied by video and eye-tracking data, as well as heatmaps and gazeplots.
  • An actionable list of what needs to be done, prioritised by user need vs. business need vs. technical complexity.
  • With a cherry on top.
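
That prioritisation step can be as simple as a weighted score per recommendation. Here is a hypothetical sketch (the items, the 1–5 scales and the scoring rule are all invented for illustration, not our actual process):

```python
# Score each recommendation: higher user and business need, and lower
# technical complexity, float an item to the top of the action list.
recommendations = [
    {"item": "Fix login error message", "user_need": 5, "business_need": 4, "complexity": 1},
    {"item": "Redesign navigation", "user_need": 4, "business_need": 5, "complexity": 4},
    {"item": "Add product videos", "user_need": 2, "business_need": 3, "complexity": 3},
]

def priority(rec):
    # Simple additive score; a real project might weight these differently.
    return rec["user_need"] + rec["business_need"] - rec["complexity"]

for rec in sorted(recommendations, key=priority, reverse=True):
    print(f"{priority(rec):>2}  {rec['item']}")
```

The exact formula matters less than making the trade-offs explicit, so the team can debate the scores rather than the list order.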

Gazepath
Gazemap

So now our clients keep coming back to us. Not only because we’re lovely and generally very good-looking, but also because we actually deliver something they can act on. The guy at the top gets it because he can fashion a business case from it. The manager of the programme gets it because he can work it into his project timeline and of course the designer and propeller-heads get it because it’s actionable and they can action it. Simple.

So now our clients know usability testing is good. Not only because they are hearing the voice of their customers, but also because now they know what to do about it. Brilliant.

At the end of the day it’s the deliverables that are evolving… we forget in the UX industry that our clients have other stuff to read too (no! really?). And we are very, very clever, but it means nothing if people can’t do something with it.

It’s like getting a stunning villa in France delivered vacuum-packed and bubble-wrapped with limited instructions (in French) on how to set it up and install the plumbing… it’s bound to be abandoned.

And you probably won’t buy a villa from the same people again.

So go forth, grasshopper, and create shorter presentations and deliverables with real impact!

Sydney Research Network by Objective Digital

We are pleased to announce our new Sydney Research Network on Facebook.


If you or anyone you know would like to participate in our face-to-face user research and usability testing sessions in George St, Sydney, then sign up. You can earn up to $100 an hour!


And please don’t forget to like our new page.

DIY Document Camera for mobile testing & recording

Recently we did some mobile iPhone testing and were unable to use a document camera, so I ended up creating a DIY one.

We used the GorillaMobile Original, a Gorilla Pod by Joby. It comes with some handy mounting options, such as a suction-cup clip and a high-bond removable adhesive clip.

The webcam we used was a Logitech Webcam Pro 9000, which we fixed to the Gorilla Pod using the removable adhesive clip that came with the pod kit.

As we use Macs, we simply hooked the webcam up to Photo Booth, though you can use any other recording software (screen or webcam) to record the research sessions. It takes a bit of fiddling to allow Photo Booth to use an external webcam – we used a program called Macam – and the quality of an external webcam is not as good as it would be on a PC.

This setup was used successfully for two days and cost much less than a document camera. Especially for those who already have a webcam and/or Gorilla Pod, it is a very cheap alternative.

Other useful sites that show alternative DIY recording options:

Make your own iPhone usability testing sled for 5 pounds

Recording usability tests on the iPhone

 

Image