The need for DevOps and data virtualisation

Ash Ashutosh, CEO, Actifio, describes the blend of DevOps and data virtualisation that will make in-house software development more efficient.

Every successful company collects data. This data is processed, analysed and transformed into information that creates real business value. Data has become a strategic asset for organisations around the world and needs to be harnessed in the right way. The value of this data capital depends on the quality of the software used to process it.

As digital transformation sweeps across every sector of the economy, software is increasingly becoming a key success factor in daily operations. In today’s digital age, a growing number of companies, even those where software development isn’t a focus, are striving for more efficient data processing. However, an off-the-shelf piece of software that caters for every business model and process simply doesn’t exist. Software must therefore be adapted or developed, and many companies handle that development in-house.

Software development, previously regarded as a pure IT issue, is growing in importance and has become a business-critical task as a result of the digitisation of everyday business. Speed and efficiency are now paramount. As business processes continue to change, implementing the right software across an organisation has become even more important for competitive business operations.

The development and integration of new applications have to become more efficient, as more and more companies already recognise. But this recognition also requires a willingness to rethink and move away from old models, and from mottos like “as we have always done it”. A practical approach that demands exactly this rethinking is DevOps.

Breathe new life into the IT department

DevOps breathes new life into the IT department by fostering co-operation between the development team (Dev) and IT operations (Ops). It might sound straightforward, but it’s a little more complex than that. It’s an approach to software development that strengthens communication, collaboration, integration and automation.

At the core of DevOps are four crucial elements: speed, quality, control and cost. Speed is fundamental to competitive execution and market positioning. Quality is vital to successful implementation and long-term viability. Control, the command of data use, security, access and process, is crucial to safe operations. And cost, of course, is a pivotal consideration in nearly all business decisions.

With DevOps, previously separate teams are united around a common goal, with a more efficient process in place and faster production maturity for the software. Before, the two parties often had almost contradictory objectives. The development team was tasked with developing new applications as quickly as possible, and for this it needed large numbers of data copies. The IT operations team had to provide computing and storage resources for development and then integrate the new applications into the production system. Its primary concern was that the infrastructure remained reliable and available, and that mission-critical data wasn’t compromised. As a result, acting quickly on developers’ requests was often not feasible. Given two such opposing missions, it is no wonder that the teams followed differing processes that, overall, ran less efficiently.

A typical IT organisation has silo structures that thwart the full potential of agility, flexibility and speed. For example, in most companies it may take hours, if not days or even weeks, to provide copies of large data volumes. For developers who need ongoing copies of production data for testing purposes, this represents a significant bottleneck and leads to the all too familiar tensions between development and operations.

To combat this issue, dummy data is often used in order to shorten the delivery time, but this can affect the quality of the applications. Since no full copy of the production environment is available, errors may go undetected, only to surface in the QA process or later in production. DevOps is a good approach to optimising processes such as those between development and operations. That said, DevOps can unfold its full potential only if copies of large amounts of data can be provided rapidly and in the best available quality.

Contemporary data management

To address this challenge, a modern and more efficient solution for data management is required. This solution should speed up the provisioning of data and minimise the Dev team’s dependence on the Ops team. This is where copy data virtualisation comes into play.

This technology is based on a single ‘golden’ physical master copy of production data from which an unlimited number of virtual copies can be provisioned immediately. These virtual data copies are then available for a variety of use cases, not only development and testing but also backup, disaster recovery, data analytics and more, without the need for separate ‘silos’ and systems to manage them. Multiple redundant physical copies of the same data, which unnecessarily burden storage resources and are responsible for the ‘flood of data’, are avoided at the root.
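To make the idea concrete, the sketch below models the copy-on-write principle that typically underpins such virtual copies: reads fall through to the single golden master, and only blocks a user changes consume extra storage. It is a simplified, hypothetical illustration in Python; the class names and behaviour are assumptions for explanation, not Actifio’s product API.

```python
# Hypothetical sketch of copy data virtualisation via copy-on-write.
# Names and structure are illustrative only.

class GoldenMaster:
    """Single physical master copy of production data, stored as blocks."""
    def __init__(self, blocks):
        self.blocks = dict(enumerate(blocks))

class VirtualCopy:
    """A virtual copy: reads fall through to the master, writes stay local."""
    def __init__(self, master):
        self.master = master
        self.local_changes = {}  # only modified blocks use extra storage

    def read(self, block_id):
        return self.local_changes.get(block_id, self.master.blocks[block_id])

    def write(self, block_id, data):
        self.local_changes[block_id] = data  # the master is never touched

master = GoldenMaster(["cust-0042", "order-0815", "invoice-4711"])
dev_copy = VirtualCopy(master)   # provisioned instantly, no full physical copy
test_copy = VirtualCopy(master)  # any number of copies share the same master

dev_copy.write(0, "cust-0042-anonymised")
print(dev_copy.read(0))   # the developer sees the local change
print(test_copy.read(0))  # other copies and the master are unaffected
```

In this model, provisioning another copy is just creating another lightweight object over the same master, which is why virtual copies can be handed out in seconds rather than the hours or days a full physical copy would take.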

With copy data virtualisation implemented, the operations team can even give developers instant ‘self-service’ access to full-size virtual copies of production data. This results in better quality applications being delivered more quickly. Developers no longer have to deal with bottlenecks in the provisioning of data, and IT operations is no longer burdened with the complex deployment and management of physical copies of production data. Both parties are mutually engaged for the good of the business.

The growing importance of data and software

Integrating application development with IT operations along the lines of the DevOps model requires a rethink. Silos and separate processes must be dismantled to enable accelerated application development. Streamlined data management, thanks to copy data virtualisation, fits perfectly into this concept, as information silos are eliminated in favour of simple and rapid data access.

Copy data virtualisation allows organisations to realise the full potential of DevOps: an efficient combination of business-driven, rapid application development with high quality and fast, smooth business processes. The key is intelligently managed virtual data. This will be even more true in the future, because data and software are becoming ever more important as the underlying infrastructure is commoditised. With the rise of cloud and as-a-service offerings, infrastructure no longer even has to be owned by the company itself.

DevOps is becoming a key part of IT planning for organisations globally, and if it hasn’t been adopted yet, it will certainly be considered. What’s clear is that, combined with copy data virtualisation, it can be an incredibly powerful step towards changing the way business is done at a time when gaining a competitive advantage is critical.

 

Edited for web by Cecilia Rehn.
