Why cloud native is the new normal

The technologies of the cloud have changed, and its purpose is shifting. The new normal rests on five major cloud elements essential to organisations that want to remain competitive and relevant: building cloud-native applications, deploying a multi-cloud strategy, bringing mobile applications into the cloud, building viable ‘data lakes’, and democratising data with analytics tools. Together, these are pivotal to helping businesses across industry sectors become AI-driven.

AI is not just a technology; it is a business paradigm shift that cannot be ignored.

Gartner states that 80% of internally developed software is now cloud-enabled or cloud-native, and the evolving cloud ecosystem has allowed businesses to operate faster, more flexibly and in real time, which has intensified competitive pressure. Accepting cloud-native and a multi-cloud approach as the new normal means avoiding cloud vendor lock-in, so that companies can deliver better than five-nines availability (99.999%) and avoid the service downtime that costs millions.

With 68% of organisations already having a digital transformation strategy in place, or in the process of implementing one, and the vast majority seeing cloud as a critical component of their change strategies, the debate continues over whether the term ‘digital transformation’ is outdated, because the essence of what companies must do has always centred on the cloud.

Simply put, businesses need to embrace these five major cloud elements if they are to remain relevant in a fiercely competitive digital arena, across all sectors.

Furthermore, the three AI technologies crucial to businesses in 2019 are sight, language and conversation. Leaders need to leverage these services within their own context by introducing cloud AI into existing applications, enabling businesses to apply data science expertise, which is often in short supply. Having a viable data lake, with data that is tagged and ingested in the right way, is therefore more effective than simply investing in analytics services.
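One way to introduce cloud AI into an existing application without binding it to a single vendor, in the spirit of the lock-in concerns discussed here, is to put the capability behind your own interface. This is a minimal, hypothetical sketch: the service, class names and the stub provider are illustrative, not any real cloud SDK.

```python
from abc import ABC, abstractmethod

class VisionService(ABC):
    """Application-owned interface for an image-labelling capability."""
    @abstractmethod
    def label_image(self, image_bytes: bytes) -> list[str]: ...

class StubVisionService(VisionService):
    """Stand-in for a vendor SDK; a real adapter would call a cloud API."""
    def label_image(self, image_bytes: bytes) -> list[str]:
        return ["placeholder-label"]

def tag_product_photo(service: VisionService, photo: bytes) -> list[str]:
    # Application code depends only on the interface, not the vendor,
    # so the provider behind it can be swapped in a multi-cloud setup.
    return service.label_image(photo)

print(tag_product_photo(StubVisionService(), b"\x89PNG..."))  # ['placeholder-label']
```

Swapping cloud AI vendors then means writing one new adapter rather than rewriting application code.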

Multi-cloud is tech’s multi-pass

A recent study by the Cloud Industry Forum (CIF) found that organisations are increasingly open to multi-cloud environments, with three-quarters of firms adopting more than one cloud service to power their digital transformation. Business leaders are finally realising that cloud vendor lock-in impedes the creativity, availability and fluidity enabled by a multi-cloud approach.

There is also an increase in companies using hybrid and multi-cloud environments from the big vendors: AWS, Google Cloud and Microsoft Azure. This trend has been encouraged by cloud providers offering managed versions of open-source stacks, such as Apache Kafka, for certain capabilities.

This enables easier migration from one cloud to another and is key to avoiding vendor lock-in while still allowing a business to focus on its digital USP.

Standardisation of the cloud means that multi-cloud allows businesses to run workloads more cost-effectively, as prices often differ between cloud service providers. For enterprises with critical workloads and cloud experience, multi-cloud can improve both uptime and competitiveness.

A multi-cloud strategy also makes the most of IT spend: the standardisation of cloud-native technologies allows businesses to use the right cloud vendor for the right product.

For example, microservices enable event-driven scaling (e.g. for Black Friday peaks) through containerisation (e.g. Docker) and orchestration (e.g. Kubernetes). Hyper-scale provisioning on cloud infrastructure then lets thin clients (web apps, native mobile apps, Alexa skills) consume multiple microservices, offering strong resilience and flexibility through autonomous, self-healing capabilities and designing for failure.

It is this orchestration of containers that helps a system withstand partial cloud outages when coupled with a cloud provider’s fabric and regional capabilities.
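The self-healing and event-driven scaling described above both come down to a reconciliation loop: the orchestrator continually compares desired state with actual state and converges the two. This toy sketch illustrates the idea only; the function and action names are made up, not a real Kubernetes API.

```python
def reconcile(desired_replicas, running_replicas):
    """Return the actions an orchestrator would take to converge the
    actual number of running containers towards the desired count."""
    actions = []
    diff = desired_replicas - len(running_replicas)
    if diff > 0:                      # too few containers: start more
        actions += ["start"] * diff
    elif diff < 0:                    # too many: stop the surplus
        actions += ["stop"] * (-diff)
    return actions

# Self-healing: a container crashes, leaving 2 of 3 desired replicas.
print(reconcile(3, ["c1", "c2"]))          # ['start']

# Event-driven scale-up for a Black Friday spike: 3 -> 10 desired,
# so the loop emits seven 'start' actions.
print(len(reconcile(10, ["c1", "c2", "c3"])))  # 7
```

Running this loop continuously is what lets the platform ride out a partial outage: replicas lost in one region are simply restarted elsewhere.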

Death of native apps

Embracing native mobile apps into the cloud is also vital to enabling the Internet of Things (IoT), AI and virtual reality, which means native apps will need to keep pace. Mobile apps are costly if they are not part of the cloud mix.

Finally, building a viable data lake that curates information in a pragmatic manner, to avoid turning it into a swamp, is essential to sustaining competitive advantage when introducing AI and machine learning (ML) into the mix of data science tooling. For companies to remain relevant, they must embrace AI: it is not just a technology but a business paradigm shift that cannot be ignored.

These cloud trends will continue to assert themselves in any digital transformation strategy, and will help lead the charge towards becoming an AI-driven business. That includes having a strong understanding of how apps, data, analytics and identity management will enable businesses to become more efficient and compliant.

Creating viable data lakes

In the last five years alone, Internet users have increased by more than 82%, and Gartner anticipates that data volume will grow 800% by 2022, with 80% of it unstructured.

2019 will be critical for enterprises to build a usable data lake as they continuously deploy cloud services. Adding an intelligent set of discoverable, metadata-tagged data from all their systems, devices and services lets them extract value from the terabytes of structured and unstructured data they generate each day. On that foundation they can run analytics, business intelligence, ML and AI, and gain vital insights into new efficiencies that deliver a competitive edge.

Compared to a traditional data warehouse approach, a key principle of data lake architecture is to provide a place to land all the raw data without transformation or loss, so that any transformations on the data can be replayed at will. The challenge with this approach in an enterprise is maintaining a level of control over the landing of the data, so that its volume and veracity do not become overwhelming or turn the lake into a data swamp.
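The "land raw, replay transformations at will" principle can be sketched in a few lines. This is an illustrative toy, not a real data lake product: records are stored untouched in a landing zone, tagged with metadata on ingest, and any transformation is re-run over the full history on read.

```python
import time

class DataLake:
    def __init__(self):
        self.raw = []                          # immutable landing zone

    def land(self, source, payload, tags=None):
        # Store the record as-is; transformation happens later, on read.
        self.raw.append({
            "source": source,
            "ingested_at": time.time(),
            "tags": tags or [],
            "payload": payload,
        })

    def replay(self, transform, tag=None):
        # Re-run any transformation over the raw history; filtering by
        # metadata tag is what keeps the lake from becoming a swamp.
        for record in self.raw:
            if tag is None or tag in record["tags"]:
                yield transform(record["payload"])

lake = DataLake()
lake.land("till-01", {"sku": "A1", "qty": 2}, tags=["sales"])
lake.land("sensor-7", {"temp_c": 21.5}, tags=["iot"])

# A transformation written today can still be applied to all history.
total_qty = sum(p["qty"] for p in lake.replay(lambda p: p, tag="sales"))
print(total_qty)  # 2
```

Because nothing is lost on landing, a flawed transformation can simply be fixed and replayed, which is not possible once a warehouse has transformed data on the way in.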

By using a Lambda architecture, businesses gain the ability to report on streaming data in near real time, offering almost immediate visibility of important events; a significant step change from the traditional data warehouse approach, where you might wait 24 hours for a batch run.
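The core of the Lambda architecture mentioned here is that a batch layer holds views precomputed over all historical data, a speed layer accumulates increments from the stream since the last batch run, and queries merge the two. This is a deliberately simplified sketch with made-up names, not a production design.

```python
batch_view = {"page_views:home": 10_000}   # recomputed e.g. nightly
speed_layer = {}                           # updated per streaming event

def on_stream_event(key):
    # Speed layer: increment counters as events arrive in real time.
    speed_layer[key] = speed_layer.get(key, 0) + 1

def query(key):
    # Serving layer: merge the batch view with real-time increments,
    # giving near real-time answers instead of waiting for the batch.
    return batch_view.get(key, 0) + speed_layer.get(key, 0)

for _ in range(42):                        # 42 new events arrive today
    on_stream_event("page_views:home")

print(query("page_views:home"))  # 10042
```

When the nightly batch job next runs, it folds the streamed events into the batch view and the speed layer is reset, keeping the fast path small.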

Companies then need to apply a pragmatic approach to making sense of the data. That means a storage taxonomy; curating data workloads through classification; data security and control over who has access to it (identity and access management, IdAM); and data science tooling to help data scientists create and apply good models to the data lake to improve future analysis.

The democratisation of data science

Artificial intelligence is a business paradigm shift that cannot be ignored. In 2018, AI and ML started to gain traction, particularly when processing structured and unstructured data in order to help businesses make intelligent decisions and spot trends. Today, cloud AI can offer intelligence at scale, scanning vast numbers of image, audio, video or text files to track patterns and anomalies.

Some AIs are functioning at levels that were impossible even two years ago, generating unparalleled business value. As a result, we are now seeing a growing appreciation for how cloud AI could revolutionise business models in a cloud-native ecosystem.

2019 will continue to see businesses building AI into their digital strategies. The biggest uptake will be using cloud AI to take menial tasks away from human beings and apply a level of intelligence to basic business processes. For example, the growth of AI chatbots in contact centres means they can answer 80% of repetitive questions, leaving humans to deal with the more complex and important 20%.
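The triage pattern described above, where a bot answers the repetitive majority and escalates the rest, can be sketched minimally. The intents, answers and keyword matching here are made up for illustration; a real contact-centre bot would use a trained intent classifier.

```python
# Known repetitive questions and their canned answers (illustrative).
FAQ = {
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "reset password": "Use the 'Forgot password' link on the login page.",
}

def handle(question):
    """Answer if the question matches a known intent, else escalate."""
    q = question.lower()
    for intent, answer in FAQ.items():
        if intent in q:                 # naive keyword intent matching
            return ("bot", answer)
    return ("human", "Routing you to an agent.")   # the complex cases

print(handle("How do I reset password?")[0])       # bot
print(handle("My invoice total looks wrong")[0])   # human
```

The escalation branch is the important design choice: the bot removes administrative burden only if the hand-off to a human is seamless for the harder 20%.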

These intelligent scripts are able to remove a layer of administrative burden, whilst offering a higher level of customer experience. Freedom is in the cloud – it’s time to start flying.

Simon Evans, CTO, Amido
