5 key drivers for software testing in 2018: driving the digital change

QMetry explores some major drivers that are becoming extremely important for digital change in 2018.

Software testing has seen two major transitions in the past decade. The advent of agile heralded the shift from manual to automated testing, and more recently DevOps has been driving a second wave of change for testing teams with continuous integration and delivery. In many ways, we are also seeing some old practices come back to life in testing. Let us explore some major drivers that are becoming extremely important in 2018 and will remain so in the future.

DevOps

The DevOps paradigm and continuous delivery have led to a digital shake-up for enterprise IT. With DevOps comes a new kind of agility: compressed release cycles, enhanced application quality and, above all, a renewed synergy between development and operations. This breaking down of barriers is essential to the transformation journey.

Yet it is not just about speed and quality; the business value of DevOps is quite remarkable. It evolves the paradigm of continuous integration into continuous deployment. Simply put, code that is written and committed to version control is now automatically built, tested and installed in the production environment for consumption. This helps the entire ecosystem, because environments and processes become standardised and every action in this chain becomes automated.
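To make that chain concrete, here is a minimal sketch of such a pipeline in Python, assuming hypothetical build and deploy scripts (build.sh, deploy.sh) and a pytest suite; a real pipeline would normally be defined in a CI server such as Jenkins or GitLab CI rather than hand-rolled like this.

```python
# Minimal sketch of a commit-to-production chain. The shell scripts named
# here are hypothetical stand-ins for a project's real build/deploy steps.
import subprocess
import sys

STAGES = [
    ("build",  ["./build.sh"]),           # compile/package the committed code
    ("test",   ["pytest", "tests/"]),     # run the automated test suite
    ("deploy", ["./deploy.sh", "prod"]),  # install into the production environment
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"--- stage: {name} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # Fail fast: stop the chain so a broken build never reaches production.
            sys.exit(f"stage '{name}' failed; aborting pipeline")

if __name__ == "__main__":
    run_pipeline()
```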

The role of software testing becomes crucial here, as ‘test early, test frequently’ is the key to achieving better quality software for DevOps teams. This requires the effective use of key enablers such as test automation, continuous integration, the capability for continuous feedback and the right mix of tools and processes for DevOps.

Test automation

But with great power comes great responsibility. Testing is one of the key pieces in the continuous delivery puzzle, and organisations are now scaling up their agile and DevOps practices toward continuous testing and continuous delivery. There is high pressure to fix defects quickly in order to reduce technical debt and keep up with the pace set by agile and DevOps thinking.

Thus, as companies shorten their sprint cycles and increase their release velocity, testing must match both the frequency and the volume of releases. Manual testing alone simply cannot sustain this pace. Test automation is one of the key drivers for DevOps and digital transformation, and it greatly increases your coverage and accuracy.

Test automation sits at the heart of your CI/CD pipeline. From faster feedback loops, reduced expenses, reusability and better organisation to faster time to market, automated testing vastly improves your efficiency and overall quality.
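As a minimal illustration of the kind of automated check that runs on every commit, here is a small pytest example; the function under test (apply_discount) is invented for the purpose and stands in for real application code.

```python
# A tiny automated test of the kind a CI pipeline runs on each commit.
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage (hypothetical app code)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_discount_applied():
    assert apply_discount(100.0, 20) == 80.0

def test_invalid_percent_rejected():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```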

Artificial intelligence & machine learning

Modern test management has the advantage of information. The vast amounts of test data and results produced by test automation suites hold valuable insights and intelligence. While it is impossible to wade through all this data manually, BOTs can quickly analyse terabytes of information. BOT-enabled tools can then produce actionable insights and optimisation recommendations. These analytics are useful for detecting and removing performance bottlenecks, identifying the scenarios that fail most often and flagging critical areas.
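The following toy sketch illustrates the kind of analysis such a BOT might automate: counting which scenarios fail most often across historical results. The result records here are made up; in practice they would be pulled from a test management or CI tool.

```python
# Sketch: surface the most frequently failing scenarios from result records.
from collections import Counter

results = [
    {"scenario": "checkout", "status": "failed"},
    {"scenario": "checkout", "status": "failed"},
    {"scenario": "login",    "status": "passed"},
    {"scenario": "search",   "status": "failed"},
    {"scenario": "login",    "status": "passed"},
]

failures = Counter(r["scenario"] for r in results if r["status"] == "failed")

# Report the hotspots so testers can focus effort where risk is highest.
for scenario, count in failures.most_common(3):
    print(f"{scenario}: {count} failures")
```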

Artificial intelligence, machine learning and the use of BOTs have emerged as major disruptive forces in the current digital landscape. By leveraging the power of AI and ML in their test automation efforts, businesses can significantly increase their ability to meet time-to-market pressures while also meeting the stringent quality needed for desired business outcomes. These AI- and ML-led smarter test automation solutions – also known as intelligent testing – will be the core differentiators in achieving frictionless automation and continuous feedback.

Unified testing platforms

The organic progression of software testing organisations has led to the use of various legacy tools and processes, and to teams with a wide variety of scripting languages and methods. This weighs down test automation efforts, as it leads to duplication of effort, lack of reusability and high maintenance overheads. The cumulative outcome is increased time to market and suboptimal software quality.

DevOps and digital transformation have set the pace for a highly proactive, fast and integrated delivery culture. Confidence in the quality and timeliness of releases requires a unified approach that transcends silos and barriers. This has led to the era of tightly integrated, unified testing platforms that address various testing needs and challenges under one umbrella. Modern, unified solutions provide holistic test management that optimises the testing cycle with end-to-end coverage and integrates with your existing automation, CI/CD and project management tools.

Predictive & prescriptive quality analytics

Delivering quality at the speed of business is a digital imperative. This requires businesses to define requirements clearly, use the right infrastructure and follow established processes. AI and ML algorithms have made it possible to mine large volumes of data from automation suites and test management tools for insights and intelligence. Predictive analytics can help you shorten testing cycles by optimising your process, anticipating defects early and prioritising the tests that will have the greatest impact.
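As a hedged illustration of predictive test prioritisation, the sketch below trains a simple classifier on invented historical features (lines changed in covered code, recent failure count) and ranks the next run's tests by predicted failure probability. It assumes scikit-learn and NumPy; real systems would use far richer features and models.

```python
# Sketch: rank tests by predicted failure risk using illustrative data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per historical run: [lines changed in covered code, recent failures]
X_history = np.array([[120, 3], [5, 0], [300, 7], [40, 1], [10, 0]])
y_history = np.array([1, 0, 1, 0, 0])  # 1 = the test failed on that run

model = LogisticRegression().fit(X_history, y_history)

tests = ["payment_flow", "profile_page", "search_api"]  # hypothetical names
X_next = np.array([[250, 4], [8, 0], [60, 2]])          # features for next run

# Run the riskiest tests first to surface likely defects early.
risk = model.predict_proba(X_next)[:, 1]
for name, p in sorted(zip(tests, risk), key=lambda t: -t[1]):
    print(f"{name}: predicted failure probability {p:.2f}")
```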

Prescriptive analytics goes one step further and makes automation smarter by providing a holistic view of root causes and failures, heat-map visualisations and recommendations that are both actionable and accurate.
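A toy illustration of the prescriptive idea: group failures by their error signature and attach a recommended action. Both the signatures and the recommendation map below are invented for the example.

```python
# Sketch: cluster failures by error signature and suggest a next action.
from collections import Counter

failure_logs = [
    "TimeoutError: checkout service",
    "TimeoutError: checkout service",
    "ElementNotFound: #submit-btn",
    "TimeoutError: payment gateway",
]

signatures = Counter(log.split(":")[0] for log in failure_logs)

RECOMMENDATIONS = {
    "TimeoutError": "raise service timeouts or investigate slow dependencies",
    "ElementNotFound": "update selectors after the recent UI change",
}

for sig, count in signatures.most_common():
    advice = RECOMMENDATIONS.get(sig, "triage manually")
    print(f"{sig} ({count} failures): {advice}")
```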

The game changer

Delivering software quality continuously is the need of the day in the era of digital transformation. As the lines continue to blur with DevOps and agile practices enabling more fluid and transparent work practices, integration and visibility are essential for effective test management. Software test automation holds the power to increase your coverage, depth and scope and solve many of the prevalent challenges.

But it is intelligent test automation – with its power of data, AI and machine learning – that is the real game-changer. Intelligent testing takes quality engineering to the next level in the DevOps context, giving you an unparalleled competitive advantage: optimised testing activities, instant feedback, faster data-driven decisions and self-adaptive software testing across your quality lifecycle. Smarter testing led by AI and BOTs heralds the third wave in the software testing ecosystem.

Written by Rutesh Shah, CEO, Infostretch Corp

Friend or foe? Is it agile and DevOps or agile versus DevOps?

Olivier Bonsignour, Executive Vice President of Product Development, CAST, argues that architectural software quality standards must be adopted to achieve true success.

Agile is today’s norm in the application development world. Over the last 10 years, its benefits have been proven and go beyond simply helping development teams become more efficient. Agile improves user satisfaction, reduces re-work and has a tangible impact on team morale, frequently rewarding developers with focused goals and rapid timelines.

Moreover, the widespread adoption of DevOps has made agile an even more compelling approach for IT development shops. Agile is not DevOps, and DevOps is not agile, but it’s quite difficult to excel in DevOps if agile principles are not followed. And while agile makes perfect sense independent of DevOps, one can argue that agile only reaches its full potential when complemented by DevOps practices.

For DevOps to be most effective, it must not be used just by development teams. At a minimum, DevOps should include a global IT team approach at the enterprise level. In fact, it has been recommended that IT organisations adopt enterprise agile practices to support focused innovation, adding DevOps practices to enable safe and frequent release of new software.

In the article that follows, I assume organisations are adopting this new high‑speed, secure and feature‑oriented approach to development.

Speed versus risk

Among the different benefits offered by agile and DevOps, developers can more quickly react to users’ changing needs – a critical factor in today’s digital, ‘always on’ world. Agile has consistently focused on shortening cycles between new requirements and the delivery of relevant features to the end user. DevOps’ major promise is to support a fast delivery pace for these changes, pushing the ‘fail fast’ concept to a reality.

This is a slightly provocative reduction of DevOps’ benefits, as experts and those able to take a step back understand that speed is also a question of context and need. The principal impact of DevOps is not speed per se, but the cultural switch required to put the end user at the centre of the process.

However, teams embracing agile, with or without DevOps, must not forget that sustaining a rapid pace of change requires good application structure and a solid foundation. To guarantee speed and adaptability, applications must be built on sound underlying architecture that is constantly adapted and refined to keep the application itself (and the team) agile. There may be no signs of weakness in the immediate sprints, but neglecting this is a recipe for failure 12 months or more down the line.

At first glance, this kind of architectural approach would appear to be in total opposition to agile methodologies and the iterative process at their core. However, building an application on a sound architecture doesn’t necessarily mean you need to get everything right and fully planned from the ground up. It simply means that your architectural blueprint should be constantly monitored and validated against architectural engineering best practices, thereby identifying areas of inefficiency, non-compliant components and potential security and resiliency gaps.

In a DevOps context, it means teams must be empowered to automatically check that these principles are met, that changes are not introducing any new risks and that there are no hidden side effects linked to those changes. And because the iterative process means regularly investing in modernising the architecture, adapting it to the ever-changing needs of the application and refactoring accordingly, teams also need to be supported with always up-to-date, easy-to-produce architecture blueprints. Doing so is the only way to properly achieve speed without introducing new risks to the organisation.
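One way to automate such a check, sketched below under the assumption of a simple layered architecture (web → services → domain), is to parse each module's imports and fail the build on any dependency that violates the intended layering; dedicated tools exist for this, but the principle fits in a few lines of Python.

```python
# Sketch of an automated architecture gate for a CI job. The layer map is
# hypothetical and would be project-specific in practice.
import ast
from pathlib import Path

# Each layer may depend only on the layers listed for it.
ALLOWED = {
    "web": {"services", "domain"},
    "services": {"domain"},
    "domain": set(),
}

def layering_violations(path: Path, layer: str) -> list[str]:
    """Return the layering violations found in one module's imports."""
    tree = ast.parse(path.read_text())
    violations = []
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            targets = [alias.name.split(".")[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            targets = [node.module.split(".")[0]]
        else:
            continue
        for target in targets:
            # Flag only imports of known layers that the map does not allow.
            if target in ALLOWED and target != layer and target not in ALLOWED[layer]:
                violations.append(f"{path}: layer '{layer}' must not import '{target}'")
    return violations

# As a CI gate, fail the build on any violation, e.g.:
# problems = layering_violations(Path("web/views.py"), "web")
# assert not problems, "\n".join(problems)
```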

Agile and quality

Both agile and DevOps are required to create a model that is both fast while keeping risk and structural soundness in mind. But neither of them will inherently create ‘quality’ or low technical debt. Some might think that, thanks to the rapid cycle and famous fail fast principle, early failures cost less to fix and have a more limited impact. That is certainly true, but IT organisations must go beyond fail fast in production. Constantly shifting software failure identification to the left keeps any disruption at the absolute lowest impact and lowest cost.

Agile, by abbreviating the development cycle and embedding more direct feedback from end users in the process, produces applications that better fit the desired requirements and can be quickly adapted to reflect the inevitable changes in requirements over the life of the project. DevOps, with its focus on automation and fail fast, also contributes to better quality and creates optimal conditions for monitoring the quality of the application. Yet even with agile and DevOps aiding proactive and even automated software failure identification, developers must follow coding and architectural best practices to embed quality in their applications.

Using software quality best practices in development may sound easier than it is in practice. Your team may have incredibly skilled developers but lack team leadership and focus. You may have mediocre talent working in incredibly advanced agile and DevOps programs. Regardless of the team size or demographic, a common standard must be adopted to establish rules and benchmark performance and quality.

In my experience, the Consortium for IT Software Quality (CISQ) has some of the more comprehensive standards for software quality. CISQ standards are commonly used to detect critical violations of good coding and architectural practices in software, measure compliance and establish outcomes with software vendors. In fact, we work with CISQ at CAST for both internal development practices and external vendor agreements.

Adopting a set of industry standards may very well be the missing piece to ensuring quality in your applications. Leveraging architectural quality standards will also help your team identify if software quality risks are being introduced through changes made to the application source code. This is the kind of software risk that can lie undetected for months, suddenly rearing its head and causing disastrous outages. Sometimes the most important practices are the ones that take a little extra focus but guarantee a structurally sound underlying architecture over time.

Agile and legacy

While modern development methodologies have been widely adopted, that does not change the fact that today’s applications often remain just a part of larger systems in complex enterprise ecosystems. Thus, even building an app using an agile approach can often require some use of legacy technologies and applications. Often, the teams maintaining these legacy applications are not familiar with agile, and the technologies involved are not optimised for an agile or DevOps approach. Similarly, agile teams building the new component may not be aware of constraints posed by the legacy technologies they are required to connect to.

This can pose workflow challenges and significantly increase the risk of a project if not handled appropriately. Some recommend taking an agile‑DevOps approach to legacy modernisation to give teams more control over operational issues that may present during the transition process. This can also surface expectations that have not been realised even when operational issues aren’t present.

Regardless of your specific scenario, properly following best practices will decrease and control the risks incurred by any modifications of complex systems. The best approach is to, again, take an architectural view of both applications in question to clearly establish interactions between modern and legacy components.

Understanding old code is a challenge, particularly if it has not been well‑documented. When teams add new features, they will often need to modify existing components, requiring some architectural knowledge of both the existing and modern code base. Establishing a holistic, architectural view of the application will ensure that the new web front‑end and its related services do not over‑trigger the legacy back‑end and create performance issues down the road.
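As one small, hedged example of protecting a legacy back-end from an over-eager new front-end, the sketch below memoises repeated lookups so a burst of identical requests reaches the legacy system only once; fetch_from_legacy is a hypothetical stand-in for the real legacy client.

```python
# Sketch: shield a legacy back-end from redundant load with a simple cache.
from functools import lru_cache

def fetch_from_legacy(customer_id: str) -> dict:
    # Hypothetical stand-in for an expensive call into the legacy back-end.
    print(f"legacy lookup: {customer_id}")
    return {"id": customer_id}

@lru_cache(maxsize=1024)
def get_customer(customer_id: str) -> dict:
    # Repeated requests for the same customer are served from the cache,
    # so the legacy system sees each distinct lookup only once.
    return fetch_from_legacy(customer_id)

get_customer("42")
get_customer("42")  # second call hits the cache; legacy sees one lookup
```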

The challenge ahead

In 2017, the big challenge for teams will be combining an architectural, holistic system‑level view with their existing agile and DevOps practices. While bringing new functionality to the market quickly will remain a top priority, teams must pause to take a quick breath. In this case, that breath is system‑level analysis. Only with the integration of system‑level analysis will we see the full potential and promise of agile and DevOps come to life.

Fail fast will remain a best practice; however, in its current form it does not always point to the architectural issues that cause significant operational disruption. When combined with system-level analysis, teams can quickly fix bugs while proactively preventing massive outages, proving their worth and directly contributing to business success.

Agile and DevOps teams must evolve to the next level to become truly effective, going beyond efficiency to secure development practices. At the basic level, this must include the adoption and implementation of industry standards from organisations such as CISQ, the Software Engineering Institute, Object Management Group and others. Arming dev teams and team leaders with a framework for quality improvement and proactively fighting against software risk and technical debt will be critical to this deeper level of success and direct contribution to business initiatives.


Edited for web by Cecilia Rehn.
