user experience Archives - DevOps Online North America

2018: Time to tackle new customer experience challenges
22 January 2018

The demand for constant availability of apps, sites and services is nothing new. Digital users are only getting more demanding, and businesses that ensure their services are ‘always up’ and performing well will dominate those that can’t successfully tackle performance issues.

Even established industry players still struggle to maintain a culture of constant availability, and this year we’re set to see companies get proactive to ensure availability, top performance and, ultimately, unparalleled user experience.

So, what issues are organisations at the forefront of user experience excellence looking to tackle?

Availability needs to be addressed before companies can compete on user experience

2017 saw more than its fair share of major website and application outages, from Virgin Money Giving going down the night before the London Marathon, to WhatsApp crashing over New Year’s – forcing people to actually talk to one another… via Facebook, probably. The impact on organisations’ reputations, not to mention their pockets, ranged from inconvenient to disastrous.

It’s not just overwhelming volumes of online traffic reducing many critical websites and applications to near failure – it’s also the so-called glitches in the IT layers behind these channels that stop sites functioning as intended.

Recently, it was revealed that hundreds of parents in the UK are struggling to access their tax-free childcare benefits via an HMRC-run website, resulting in many nurseries and other childcare providers not being paid and parents scrambling to allocate personal funds. In some extreme cases, parents found that all records of their childcare benefit funds had been wiped from the system without explanation.

It’s examples like these that show how far many organisations still have to go before they can even compete on user experience. Whatever the cause – prevailing organisational attitude or otherwise – failing to build a 24/7, user-first culture of availability limits how competitive you can be in the market and, in many cases, how well you can deliver critical services.

Businesses will look to improve user experience by exercising and testing the supply chain

Application supply chains are becoming longer and more complex as digital enablement is bolted on top of legacy systems, putting new and unforeseen workloads on applications.

This means it is no longer viable to infer a good user experience from monitoring individual components; the experience needs to be measured at the edge, at a macro level.
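
What measuring ‘at the edge’ can look like in practice is a synthetic check that times a complete user journey, exactly as a user would experience it, rather than any single component. A minimal sketch in Python, assuming the popular requests library; the journey steps, URLs and budget are hypothetical placeholders:

```python
import time

import requests  # third-party HTTP client: pip install requests

# Hypothetical user journey; each step is a (name, URL) pair.
JOURNEY = [
    ("home", "https://example.com/"),
    ("search", "https://example.com/search?q=widgets"),
    ("product", "https://example.com/products/42"),
]

def run_synthetic_check(budget_seconds=3.0):
    """Time the whole journey end-to-end, the way a user experiences it."""
    session = requests.Session()
    start = time.monotonic()
    for _name, url in JOURNEY:
        response = session.get(url, timeout=10)
        response.raise_for_status()  # a failed step fails the whole journey
    elapsed = time.monotonic() - start
    print(f"journey took {elapsed:.2f}s (budget {budget_seconds:.1f}s)")
    return elapsed <= budget_seconds

if __name__ == "__main__":
    if not run_synthetic_check():
        raise SystemExit("user journey exceeded its performance budget")
```

Run on a schedule from outside the data centre, a check like this catches the macro-level failures that per-component monitoring misses.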

For organisations looking to stand out on user experience, exercising and testing the supply chain could be a real market differentiator in 2018.

UX requirements will drive evolution of the C-suite

As organisations become increasingly digitally enabled, it will become more important for the whole C-suite/board to be involved in looking at the ‘performance dashboard’.

Roles will have to transform in 2018; the CIO and CDO (Chief Digital Officer) will start to merge as the necessity for customer awareness and UX appreciation supersedes technical expertise alone.

The head of customer experience will rise to become one of the most important roles supporting digital services. In some companies, this may mean that CDOs begin to address more customer experience requirements, and in others, it may mean that CDOs morph into the customer experience role over time.

In both instances, CDOs will need to begin to work more closely with UX design teams, playing a key role in user research and people-centric design before moving on to play an integral part in the direction of application load testing to ensure services work on all levels, end-to-end.

The ultimate goal here is to design user-friendly digital services that perform during peak traffic times, allowing large industry players to deliver a user experience in line with customer expectation, and giving smaller competitors the opportunity to compete in the fastest evolving digital landscape we’ve seen to date.

Organisations will prioritise web and application monitoring and load testing

By now, most of us know that unhappy users make for an unprofitable business (presuming you haven’t cornered your market entirely) – but you might not be aware of the scale of the impact that poor website or application performance can have on the success of your business. Take website homepage loading time, for example. How long is ten seconds? Long enough to lose you 40% of potential users if they can’t access what they want in that timeframe.

At the end of the day, it doesn’t matter if a site or service’s graphical user interface is optimised for an amazing user experience – if performance and response times are not up to scratch, organisations will lose users. In 2018, we’re set to see CEOs prioritise web and application monitoring and load testing so that user experience teams can focus on what they’re good at: developing innovative digital services that make the user experience easier and better.
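
Dedicated tools such as JMeter or Gatling do this properly, but a minimal sketch conveys the principle behind load testing: simulate concurrent users, measure response times from the outside, and watch the percentiles rather than the average. The target URL and user counts below are placeholders:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com/"  # placeholder target
USERS = 50                    # concurrent simulated users
REQUESTS_PER_USER = 10

def one_user(_user_id):
    """Issue a burst of requests and record how long each one takes."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.monotonic()
        with urlopen(URL, timeout=30) as response:
            response.read()
        timings.append(time.monotonic() - start)
    return timings

with ThreadPoolExecutor(max_workers=USERS) as pool:
    all_timings = sorted(t for user in pool.map(one_user, range(USERS)) for t in user)

p95 = all_timings[int(len(all_timings) * 0.95) - 1]
print(f"median {statistics.median(all_timings):.2f}s, p95 {p95:.2f}s")
```

The tail (p95, p99) is what matters here: a healthy median can hide exactly the slow responses that drive users away.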

Written by Tim Cox, global head of operations, Apica.

Smart Home Solutions: Putting the spotlight on user experience
30 November 2017

With the Christmas shopping period rapidly approaching, Amazon Echo products are likely to be a top item on this year’s gift lists for those eager to explore the opportunities of a digital home assistant. As the product range around Alexa – Amazon’s intelligent personal assistant – extends, it has also become a key platform for third-party products, and pressure is mounting on those third-party vendors, and on Amazon itself, to deliver seamless user experiences.

The Internet of Things offers increasingly sophisticated capabilities, and the opportunity to transform our everyday lives. In turn, demands have grown exponentially around user experience and product quality. Today, IoT systems are highly distributed and diverse – to ensure everything runs smoothly, including graphically rich systems, testing is essential.

Increasingly complex user interfaces

The Echo portfolio is a key example of how we no longer simply interact with digital devices via a controlled environment such as a keyboard. User interfaces are no longer ‘just’ digital forms for us to navigate, built from components such as labels and buttons.

Through services including Spotify and smart home devices, voice and video are layered in: this requires testing the entire system end-to-end to ensure delivery of a seamless user experience every time. Testers are no longer saying something is ‘out of scope’ simply because it was not developed internally. Ongoing monitoring of user experience is now a key requirement for success in a digitalised era.

The importance of multi-level testing

When it comes to voice, testers need to be able to ensure a great user experience at each point of interaction – including how Alexa handles different regional accents and generates appropriate responses. Failing to do so causes end-user frustration – and an embarrassing situation for Amazon if anecdotes about mishearing home assistants go viral.

As vendors add third-party capabilities into their ecosystems, they need to be able to test (interact with and validate) the entire end-to-end experience, involving all components, at every level. For example, you might have multiple user interfaces (an Alexa voice-activated UI, phone, laptop and fridge), plus multiple IoT systems in the house gathering information from many different sources – say, a power meter collecting data from the lights, fridge and heating. You have several things producing data, several things aggregating and processing data, and several things presenting data. As a result, testing becomes increasingly about testing a set of independent, technologically different components working together to do something.
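
A toy sketch of what such a multi-component test can look like: every class below is an invented stand-in for a real device or service, and the point is that the assertion exercises the whole chain rather than any single part:

```python
# Toy end-to-end test across technologically different components:
# a producer (power meter), an aggregator, and a presenter (dashboard).
# All classes are invented stand-ins for real devices and services.

class PowerMeter:                  # produces data
    def read(self):
        return {"device": "fridge", "watts": 120}

class Aggregator:                  # aggregates and processes data
    def __init__(self):
        self.readings = []

    def ingest(self, reading):
        self.readings.append(reading)

    def total_watts(self):
        return sum(r["watts"] for r in self.readings)

class Dashboard:                   # presents data
    def render(self, aggregator):
        return f"Total load: {aggregator.total_watts()} W"

def test_end_to_end_flow():
    """Validate the whole chain, not each component in isolation."""
    meter, aggregator, dashboard = PowerMeter(), Aggregator(), Dashboard()
    aggregator.ingest(meter.read())
    assert dashboard.render(aggregator) == "Total load: 120 W"
```

Run under a test runner such as pytest, a check like this fails whenever any link in the chain changes behaviour, even if every component still passes its own unit tests.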

Setting the (user experience) standard

If businesses want to gain a competitive advantage in the digital personal assistant market, they need to consider implementing a set of standards: for example, the assistant must respond within a certain amount of time, work with simplified English, deal with different accents and provide helpful error messages. Service providers should meet these standards, from an end-user experience point of view, before becoming part of the Echo ecosystem – and the standards must make it easy for those providers to achieve the same quality of user experience.
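
Expressed as code, such a standard reduces to a set of measurable thresholds checked against test results. Everything below – names and numbers included – is illustrative, not any vendor’s actual criteria:

```python
# Hypothetical acceptance thresholds a third-party capability might have
# to meet before joining an ecosystem; the numbers are illustrative.
UX_STANDARDS = {
    "max_response_seconds": 2.0,
    "min_intent_accuracy": 0.95,   # across a corpus of regional accents
    "requires_error_message": True,
}

def meets_standards(measured):
    """Compare measured results from a test run against the standard."""
    return (
        measured["response_seconds"] <= UX_STANDARDS["max_response_seconds"]
        and measured["intent_accuracy"] >= UX_STANDARDS["min_intent_accuracy"]
        and (measured["has_error_message"]
             or not UX_STANDARDS["requires_error_message"])
    )

# Results gathered from, say, an accent test corpus:
print(meets_standards({"response_seconds": 1.4,
                       "intent_accuracy": 0.97,
                       "has_error_message": True}))  # True
```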

The ability of users to smoothly and accurately interact with any IoT solution will decide whether it succeeds in the over-saturated landscape. If Amazon wants the Echo to be an essential tool for years to come, and not just for Christmas, then it needs to guarantee an experience that delights all users – at all times.

Written by Antony Edward, CTO of Testplant.

The future of analytics is self-service
25 July 2016

Today we live in a world of self-service. More tasks than ever before can be completed without even a hint of social interaction, from withdrawing money to buying fuel. The trend has developed so much in recent years that Costa has even developed a self-service approach to fresh coffee, all to quench our collective thirst to avoid the conversational drudgery of human interaction. However, as Greg Richards, Sales and Marketing Director at business intelligence firm Connexica, explains, this is more about speed than unsociability – and it’s a trend that is here to stay.

Many of us would struggle to remember the last time we did the weekly shop without using a supermarket’s express self-service checkout. The reason we choose to do this is usually in the name itself: it is an express service – faster and with less hassle.

Most industries are adapting to cater for this changing nature of user interaction, placing more of a focus on speed. This makes for faster checkouts, a faster refill and even a faster double-espresso skinny-decaf gingerbread Frappuccino.

Evolution of technology, data and analytics

Clearly, this influx of user-friendly, timesaving technology is a direct reaction to the changing nature of consumer interaction. In consumer technology, this is represented by the instant gratification provided by smartphones and tablets that allow users full access to the internet, leaving a long trail of data ripe for interpretation in their wake.

The big data landscape represents not only a shift in data quantity, but in its perceived quality to individuals and businesses alike. By harnessing this surge in data, real insight can be gleaned from customer behaviour to enhance services and target specific groups based on emerging patterns.

With this in mind, the notion that 90% of data available has been created in only the last two years is not to be taken lightly. This avalanche of data, with the right analysis, should pave the way for business efficiency savings, improvements in targeted marketing and developments of more streamlined user experiences in every sector. This begins with a more streamlined and quicker approach to business intelligence – with self-service business analytics.

Self-service business analytics

It is no secret that traditional tools rely on professional analysts to interpret data sets through dedicated knowledge of SQL (Structured Query Language) across multiple disparate systems, such as OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) – systems that are far from straightforward.

Although these technologies deserve credit for the level of analysis they have provided over the years, and have been stretched a long way, the huge increase in data shows no signs of slowing. This poses new problems for technologies that were never designed to analyse such a wealth of diverse data sets, and for the majority of staff, who are unable to provide detailed technical analysis themselves. The result is a performance bottleneck.

However, this all changes when users are given the power. Self-service tools built for more user-friendly data analysis not only allow businesses to save money by avoiding hiring expert analysts or booking expensive training, but also improve the quality of analysis. With search-based business intelligence, for example, users can quickly find the data they require through natural-language search, drastically improving data access times.
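
Real search-based BI products do far more, but a toy sketch conveys the core idea: free-text terms are matched against the data itself, so the user never writes SQL. The dataset and the matching rule below are invented:

```python
# Toy illustration of search-driven data access: match free-text terms
# against records instead of asking the user to write SQL.
SALES = [
    {"region": "north", "product": "coffee", "revenue": 1200},
    {"region": "south", "product": "coffee", "revenue": 800},
    {"region": "north", "product": "tea", "revenue": 450},
]

def search(query):
    """Return every record whose values contain all the query terms."""
    terms = query.lower().split()
    return [row for row in SALES
            if all(term in map(str, row.values()) for term in terms)]

print(search("coffee north"))
# -> [{'region': 'north', 'product': 'coffee', 'revenue': 1200}]
```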

Summary

With the barriers of old technology removed, we can expect a new generation of data analysis, one that allows the wealth of data to be instantly viewed and reviewed, freeing experts to focus on less menial tasks. This is all possible thanks to self-service analytics, which allows more time to be spent improving services instead of waiting for data analysis.

It’s clear that the convergence of the data-driven culture of business and the self-service nature of society is a technological evolution that is here to stay. As such, companies and organisations will continue to reap the benefits, identifying endless opportunities to better their business practice before their self-served coffee goes cold.

Edited from press release by Cecilia Rehn.

The lasting impact of application response times
6 October 2015

Kevin Surace, CEO, Appvance Inc., discusses the increased expectations in application response times.

In a world of instant gratification, application transaction times have a lasting impact on user and brand perception, as well as on worker productivity. Smart companies are striving for applications to have transaction times of less than 1 second – through the server, internet, browser and JavaScript engine, to the user’s eyes. For each second of delay, 7% of users get distracted and wander.

A few years ago, website designers and application developers tried to achieve transaction times under 10 seconds. Back in the HTML 1.0 days, load testing only verified whether a site would crash under load. No client-side code existed, there were no smartphones, sites ran on slower servers, and productivity increased simply because computers were being used in the workplace at all.

Time limits determined by human perceptual abilities

As early as 1968, user experience professionals identified three time limits, determined by human perceptual abilities, to keep in mind for application development. Jakob Nielsen restated these levels of computer response in his 1993 (essentially pre-web) book Usability Engineering:

0.1 second: about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.

1.0 second: about the limit for the user’s flow of thought to stay uninterrupted, even though the user will notice the delay. Normally, no special feedback is necessary during delays of more than 0.1 but less than 1.0 second, but the user does lose the feeling of operating directly on the data.

10 seconds: about the limit for keeping the user’s attention focused on the dialogue. For longer delays, users will want to perform other tasks while waiting for the computer to finish, and should be given feedback indicating when the computer expects to be done.
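
These limits translate directly into UI feedback logic. A minimal sketch, where the feedback labels are hypothetical UI states rather than anything Nielsen prescribes:

```python
def feedback_for(expected_seconds):
    """Choose UI feedback from Nielsen's three perceptual thresholds."""
    if expected_seconds <= 0.1:
        return "none"          # feels instantaneous; just show the result
    if expected_seconds <= 1.0:
        return "subtle"        # flow of thought survives; no spinner needed
    if expected_seconds <= 10.0:
        return "spinner"       # keep attention on the dialogue
    return "progress_bar"      # user will task-switch; estimate completion

for t in (0.05, 0.5, 4.0, 30.0):
    print(t, "->", feedback_for(t))   # none, subtle, spinner, progress_bar
```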

Increased expectations

While perceptual abilities have not improved, expectations have. In the 2000s, people began investigating the correlation between the speed of a website and the loss of customers:

Yahoo found that a 400-millisecond improvement to the speed of its home page increased page views by 9%.

Firefox shaved 2.2 seconds off its average page load time and increased download conversions by 15.4%.

Shopzilla reduced load time from 7 seconds to 2 seconds, increasing page views by 25% and revenue by 7–12%.

One Google experiment designed to look at response times increased the number of search results per page from 10 to 30, with a corresponding increase in page load times from 400 ms to 900 ms; this resulted in a 25% drop-off in first-result-page searches. Adding a checkout icon (a shopping cart) to search results made responses 2% slower (8 ms added), with a corresponding 2% drop in searches per user.

Similar results are now showing up in mobile apps, with users expecting near-instantaneous response rates.

Users have become used to sub-1-second response times. Driven by long-term efforts from leaders such as Google, Amazon, Facebook and others, companies aim to deliver against any request as close to 100 ms as possible, which means it’s near instantaneous to the user. By focusing on the user experience (not just server response), companies can tune server code and architecture, as well as client side code, to deliver rapid response rates. This has started to condition the world to believe that instantaneous delivery is a reasonable expectation to any request. If Google can search the world’s largest database and return results in a few hundred milliseconds, why is it acceptable to allow corporate transactions to take 6 seconds?

Setting the bar high

The bar should be set at 100 ms. One second is too long: by then users have lost their train of thought and started to wander off – with a 1-second delay, conversions drop by around 7% and customer satisfaction by 16% at every transaction. If the largest players can drive toward 100 ms, anyone can. Their methods have required full beginning-to-end performance validation (i.e. UX through back end) at every build (true agile), and a set of goals to keep bringing transaction times down at every release (hourly or daily). Nothing gets added to the code or architecture that could affect instantaneous response. Every day is an exercise in driving down transaction times, pursued with almost cult-like discipline.
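
A build-time performance gate along those lines might look like the sketch below. The use-case names and budgets are illustrative, and the measured p95 timings would come from whatever test harness the team already runs:

```python
# Fail the build when any major use case exceeds its transaction-time budget.
BUDGETS = {"login": 0.5, "search": 0.3, "checkout": 1.0}  # seconds, p95

def check_build(measured_p95):
    """Return True only if every use case is within budget."""
    ok = True
    for name, seconds in measured_p95.items():
        budget = BUDGETS.get(name, float("inf"))
        if seconds > budget:
            print(f"FAIL {name}: p95 {seconds:.2f}s > budget {budget:.2f}s")
            ok = False
    return ok

# Timings from the latest build; 'search' has regressed, so the build breaks.
if not check_build({"login": 0.42, "search": 0.55, "checkout": 0.90}):
    raise SystemExit(1)  # a performance regression never reaches users
```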

Continuous integration should also be continuous improvement. At each build, QA or DevOps should verify transaction times of all major use cases, and drive them down. If a corporation has 100,000 workers using a common application that sees a transaction time improvement of 2 seconds, it can save some US$14 million per year. How about improving every transaction in all applications? Millions of dollars could be saved in worker time, truly impacting the bottom line. It should be a C-suite imperative, giving a lot of smart developers, DevOps and QA teams plenty to work on in the coming year.
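
The US$14 million figure is easy to reproduce under one plausible set of assumptions – the transaction volume, working days and loaded labour cost below are guesses, since the article states none of them:

```python
workers = 100_000
seconds_saved_per_transaction = 2
transactions_per_worker_per_day = 50   # assumption
working_days_per_year = 240            # assumption
loaded_cost_per_hour = 21.00           # assumption, US$

hours_saved = (workers * seconds_saved_per_transaction *
               transactions_per_worker_per_day * working_days_per_year) / 3600
print(f"{hours_saved:,.0f} hours ≈ ${hours_saved * loaded_cost_per_hour:,.0f} per year")
# -> 666,667 hours ≈ $14,000,000 per year
```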

Domino’s Pizza working with Hitachi Consulting to ensure continuous integration and testing of applications
13 August 2015

As part of its DevOps strategy to increase the agility and scalability of its IT function, Domino’s Pizza is working with Hitachi Consulting. The consulting firm will augment Domino’s Pizza’s software delivery team and help with the design, build, testing and rollout of new applications, in addition to providing DevOps consultancy.

Enhancing customer experience through automating key aspects of the testing process

Domino’s Pizza’s e-commerce operation accounts for 69.4% of all delivered sales in the UK & ROI, and therefore it comes as no surprise that the company is always looking to enhance its customer experience. Its ambitious plans include introducing more technology innovations, which means adopting a continuous IT delivery methodology and automating key aspects of the testing process, including regression testing. Hitachi Consulting will also supply DevOps consultancy, as Domino’s Pizza looks to grow cross-team collaboration and take advantage of cloud infrastructure and DevOps tools.

“We’ve got a truly collaborative relationship between Domino’s Pizza and Hitachi Consulting,” says Rod Brooks, Domino’s Pizza’s Head of IT (Software Delivery). “With our eye constantly on innovation, we are always seeking ways to make our customers’ lives easier. So, it’s important to have the capability to become more agile and the scalability to keep ahead of the competition. Hitachi Consulting provides us with that capability, but in turn we’re able to share our experiences as a rapidly growing e-commerce and m-commerce business.

“Working with Hitachi Consulting also means we have access to the company’s knowledge and experience in other areas, such as cloud architecture, business intelligence and enterprise IT,” adds Brooks.

Challenge of ensuring new applications are rolled out quickly, but are also well-tested and customer-ready

“Domino’s Pizza is a fast-growing company that harnesses technology to keep it ahead of the competition, but this creates challenges of its own in terms of ensuring that new applications are rolled out quickly, but are also well-tested and customer-ready,” said Chris Saul, Vice President of Digital, EMEA, at Hitachi Consulting. “We are delighted to be able to increase the IT function’s capabilities in this area, while having the opportunity for our team to work for a successful, dynamic business that understands how IT can be an enabler, not just a resource.”

“With DevOps, there is a strong focus on continuous testing and constant delivery. This means there is more emphasis on testing much earlier in the solution delivery lifecycle. Domino’s Pizza has embraced continuous integration and testing to help the software delivery team rapidly build, test, and deliver secure, high-quality applications,” Jonathon Wright, Director, Testing Quality Assurance, at Hitachi Consulting commented.
