How to keep on top of testing techniques & tools

Test Magazine journalist Leah Alger interviews senior software testing professionals to find out how they keep on top of testing techniques & tools

Tell me about yourself and your job role:

Sudeep Chatterjee: I am a Senior Technology Manager with more than 19 years’ experience at top-tier investment banks, fintech and consulting firms, managing testing globally for enterprise-wide change programmes. I am currently consulting as Head of Testing at Bank of America Merrill Lynch within the FICC – Global FX Technology group. Prior to Bank of America Merrill Lynch, I worked as Head of Testing at Lombard Risk, Barclays, UBS, GE and Accenture, primarily focusing on building high-performing, multi-disciplinary testing teams and delivering testing for complex technology-driven business transformation initiatives.

Niranjalee Rajaratne: I am an IT professional with over 12 years’ experience who has worked in various verticals such as telecoms, financial services, publishing and e-commerce. Currently, I am the head of quality assurance at Third Bridge. Third Bridge is a leading independent financial research company that provides private equity firms, hedge funds and strategy consultants with the information they need to make informed decisions about investment opportunities.

What is the most complex system architecture you have managed?

Sudeep Chatterjee: I have managed testing of many complex system architectures: PoS systems for one of the UK’s largest retail companies, complex integrated voice-recognition systems for one of the UK’s largest telecommunications companies, complex banking systems built over quant and big data solutions and, most recently, a high-frequency, low-latency electronic trading platform for FX.

Niranjalee Rajaratne: One that was developed on unstable ground, where the system was built and managed in an ad-hoc manner. There was a poor team structure in place and the teams lacked sufficient skills and knowledge. The business knowledge was also not built into the system properly. This created uncertainty around software testing and delivery, and made it difficult to achieve quality and take proactive measures to manage risk.

Are your change management processes modernised and agile friendly?

Sudeep Chatterjee: Our change management process is agile and focuses on ensuring the SDLC is built over robust processes, which allows for faster delivery with high quality.

Niranjalee Rajaratne: Yes, to some extent. We keep evolving. We try to iterate the change management process so that it enables us to meet business needs faster, better and within budget. As an outcome, we try to engage business units early, keep communication consistent and get buy-in from senior management as early as possible. The team is given autonomy and time to try new things and adapt to change as new and brilliant ideas come to light.

How do you stay ahead of the competition when it comes to mobile systems, cloud computing and APIs?

Sudeep Chatterjee: Adopting a modern digital transformation strategy is part of the technology roadmap, which includes best-in-class delivery for mobile platforms and cloud solutions for end users.

Niranjalee Rajaratne: Research and assessments help us to continuously refine and adopt the right level of technology to achieve business objectives. We assess our current workflows and software to identify gaps in efficiency that technology can bridge.

Can you give me an example of a CI/CD failure you’ve experienced, and what you learned from it?

Sudeep Chatterjee: In my experience, CI/CD has mostly gone wrong when dev teams are used to working from manual plans and runbooks for deployments to non-production and production environments. When teams start using CI/CD tools without proper training, issues come up, particularly around configuration setups.

Niranjalee Rajaratne: I can give many: the wrong selection of CI servers, inefficient build infrastructure, an uncoordinated and unstructured pipeline, inadequate skill sets and a lack of management support. The result was delivering business value to the end user, albeit on broken builds. Continuous delivery cannot be performed, and an organisation cannot be productive, with an inconsistent CI process. To make CI/CD work, it was important that business and technology management understood the value of an efficient build pipeline and supported it, without seeing the effort as purely technical work.

How has DevOps changed your way of working?

Sudeep Chatterjee: DevOps has helped organisations reduce the time it takes to deliver software to production, with continuous testing embedded in the process. It has also strengthened testing approaches such as behaviour-driven development (BDD).
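As a rough illustration of the behaviour-driven style Chatterjee refers to, the sketch below shows a given/when/then test written in Python with pytest. The order-book class and its place_fx_order method are hypothetical placeholders invented for this example, not any bank’s actual API.

# A minimal sketch of a behaviour-driven test in the given/when/then style,
# written with pytest. OrderBook and place_fx_order are hypothetical.
import pytest

class OrderBook:
    """Toy in-memory order book used only for this illustration."""
    def __init__(self):
        self.orders = []

    def place_fx_order(self, pair, side, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        order = {"pair": pair, "side": side, "amount": amount, "status": "ACCEPTED"}
        self.orders.append(order)
        return order

def test_valid_fx_order_is_accepted():
    # Given an empty order book
    book = OrderBook()
    # When a well-formed order is placed
    order = book.place_fx_order("EUR/USD", "BUY", 1_000_000)
    # Then it is accepted and recorded
    assert order["status"] == "ACCEPTED"
    assert len(book.orders) == 1

def test_non_positive_amount_is_rejected():
    # Given an empty order book
    book = OrderBook()
    # When an order with a non-positive amount is placed
    # Then it is rejected
    with pytest.raises(ValueError):
        book.place_fx_order("EUR/USD", "SELL", 0)

Run as part of a CI stage (for example with “pytest -q”), tests like these give the fast feedback both interviewees describe.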

Niranjalee Rajaratne: DevOps improved infrastructure and created efficient build pipelines to help get new products and services to users quickly. It created a shift in testing and quality assurance by enabling automation of repetitive tasks such as regression and smoke testing. Development teams were empowered to follow test-first practices such as TDD and BDD, helping them towards an efficient build pipeline and fast feedback.
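As an example of the kind of repetitive check that can be automated in a pipeline, the following Python script runs a simple HTTP smoke test after a deployment. The SERVICE_BASE_URL environment variable and the endpoint paths are assumptions made for illustration, not details of any of the interviewees’ systems.

# A minimal post-deployment smoke test that a CI/CD stage could run.
# BASE_URL and the endpoint list are hypothetical examples.
import os
import sys
import requests

BASE_URL = os.environ.get("SERVICE_BASE_URL", "http://localhost:8080")
ENDPOINTS = ["/health", "/api/v1/status"]  # paths assumed for illustration

def smoke_test() -> bool:
    ok = True
    for path in ENDPOINTS:
        url = BASE_URL + path
        try:
            response = requests.get(url, timeout=5)
            if response.status_code == 200:
                print(f"PASS {url}")
            else:
                print(f"FAIL {url}: HTTP {response.status_code}")
                ok = False
        except requests.RequestException as exc:
            print(f"FAIL {url}: {exc}")
            ok = False
    return ok

if __name__ == "__main__":
    # A non-zero exit code marks the pipeline stage as failed.
    sys.exit(0 if smoke_test() else 1)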

What DevOps tools do you use?

Sudeep Chatterjee: Jenkins, Git, Bitbucket, Maven and Ansible.

Niranjalee Rajaratne: GitHub, Ansible, Docker, Jenkins, Elasticsearch and Monit are a few.
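To show how tools like these can be scripted rather than driven by hand, here is a minimal sketch that queues a parameterised Jenkins job over Jenkins’ REST API from Python. The server URL, job name and parameters are hypothetical, and a real setup would pull the API token from a secrets store rather than an environment variable.

# A minimal sketch of triggering a parameterised Jenkins job via its REST API.
# JENKINS_URL, JOB_NAME and the parameters are hypothetical examples.
import os
import requests

JENKINS_URL = "https://jenkins.example.com"   # hypothetical server
JOB_NAME = "fx-platform-regression"           # hypothetical job
USER = os.environ["JENKINS_USER"]
API_TOKEN = os.environ["JENKINS_API_TOKEN"]

def trigger_build(parameters: dict) -> None:
    # /buildWithParameters queues a run of a parameterised job; Jenkins
    # responds with 201 Created and a Location header for the queue item.
    response = requests.post(
        f"{JENKINS_URL}/job/{JOB_NAME}/buildWithParameters",
        params=parameters,
        auth=(USER, API_TOKEN),
        timeout=10,
    )
    response.raise_for_status()
    print("Build queued:", response.headers.get("Location"))

if __name__ == "__main__":
    trigger_build({"ENVIRONMENT": "uat", "SUITE": "smoke"})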

Have you faced any bad experiences when implementing DevOps?

Sudeep Chatterjee: Implementing DevOps without a change in organisational culture can cause conflicts between teams. For DevOps teams to succeed, it is important that the change management process complements this ethos, and all team members, from BAs, developers and architects to environment management, QA and application support, must understand their role in the DevOps world.

Niranjalee Rajaratne: We certainly did have challenges and we still have some to this day, but I would not say they were bad experiences. Rather they were learning opportunities for the team to perfect the delivery pipeline.

How will DevOps continue to improve software delivery?

Sudeep Chatterjee: DevOps will continue to improve and will become ‘the norm’ for all software delivery. Dev and QA teams will learn to work with DevOps tools, including the de facto standards for continuous integration and continuous delivery.

Niranjalee Rajaratne: It will further support teams’ autonomy to be self-organised by removing dependencies and further breaking down silos. It will continue to improve inter-team collaboration, empowering people to learn new skills and share knowledge. It will bring users and technology teams closer together to achieve a symbiotic relationship. If iteratively improved, this will enable organisations to make a paradigm shift in how they deliver software.

Do you believe manual testing will come to a close because of automation and DevOps?

Sudeep Chatterjee: There will always be a requirement for manual exploratory testing by domain experts, though this may not just be a QA professional; it can be anyone in the team with strong domain knowledge, such as a BA, product owner or user.

Niranjalee Rajaratne: The purpose of manual testing and the value it brings cannot be fully replaced by automated testing. While automated testing is an integral part of DevOps, manual testing should continue in order to help achieve optimum levels of quality in software.

Anything else you would like to add?

Sudeep Chatterjee: One of the challenges in the agile and DevOps world will be how organisations measure product quality and compensate their employees through performance management. You must make sure the HR process matures enough to be able to compensate the entire team, or you will still have to rely on individual feedback and performance benchmarking between team members.
