Sumo Logic talks data, cloud and why they decided to make a free service

Cloud is everywhere. It sits at the heart of data storage and computing power, and without it, IT would not have developed half as quickly as it has. Yet for all its usefulness, it can be costly, and the market around it is fiercely competitive. Against that backdrop, Sumo Logic, a continuous intelligence platform, is working on ways to reduce the cost of running real-time analytics and archiving data. Following the company's announcement of an On-Demand and Archiving Service, Mark Pidgeon, Vice President of Technical Services at the company, talks to us about what the plans will mean for the cloud, and at what point the company realised that something needed to change.

What will the announcement of on-demand and archiving services mean for tech firms?

More companies today rely on data from their applications for operational insight and business intelligence – for example, seeing how well their applications are performing, how their cloud infrastructure is operating, and where there are opportunities to improve their services. However, today's data analytics pricing and licensing models are broken and simply don't reflect the rapidly changing ways companies are using data. A new approach is needed to help harness the full value of all this data. Reducing the cost and changing the economics around cloud is essential if we are to help more companies take advantage of this in their DevOps projects.

With our new on-demand and archiving services, companies no longer have to make a trade-off as their machine data grows: they no longer have to choose between paying runaway licence costs and simply not using this data.

Organisations will now be able to dynamically segment their data and tailor the analytics accordingly for real-time insights, frequent or infrequent interactive searching, or troubleshooting and full data archiving. These capabilities enable customers to maximise the value, choice, and flexibility required for operating and securing their digital businesses.

Why did you decide to make the Archiving Service free of charge?

Sometimes you have data that you either have to keep for compliance or want to keep for analysis, but you don't necessarily need to look at it immediately. Take a security event – you want to be able to review a lot of data quickly, but you don't want to be hosting all that data on standby and incurring a cost on the off chance that you will need it. Instead, you should be able to store that data somewhere cheap and then look at it quickly when you need to.

Our approach with our Archiving Intelligence Service is to work with your existing storage, bring the data in and analyse it quickly to give you the result you need. This helps companies look at their data when they need to, and be more selective about what they import and why. It also changes how companies think about this kind of activity, and about which data is worth keeping.
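To make the pattern concrete, here is a minimal, generic sketch (not Sumo Logic's actual API): compressed log batches are parked in cheap object storage and only pulled back when an investigation needs them. It assumes AWS with the boto3 SDK, and the bucket name is hypothetical.

    import gzip
    import json
    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3")
    BUCKET = "example-log-archive"  # hypothetical cold-storage bucket

    def archive_events(events, key):
        """Compress a batch of log events and park it in low-cost storage."""
        body = gzip.compress("\n".join(json.dumps(e) for e in events).encode())
        s3.put_object(Bucket=BUCKET, Key=key, Body=body,
                      StorageClass="STANDARD_IA")  # infrequent-access tier

    def rehydrate(prefix):
        """Pull archived batches back only when you actually need to search them."""
        pages = s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=prefix)
        for page in pages:
            for obj in page.get("Contents", []):
                raw = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
                for line in gzip.decompress(raw).splitlines():
                    yield json.loads(line)

The point of the pattern is that you pay for storage at rest, not for readiness: the analytics cost only arrives at the moment you choose to look.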

At what point did you realise that something needed to change in analytic capabilities?

If you look at what DevOps teams are involved in today, they are building the applications that companies use to compete in the market. New companies formed today get started in the cloud and take a 'cloud-native' approach to IT around those applications. More established companies are looking at how to transition to that model.

Looking at the data from those applications involves more than just the DevOps team – this data can provide direct indicators of the impact that business decisions have over time. If you redesign your app or update your mobile experience, you can see the effect on how people use your applications or how they buy. If you make a mistake, you can rectify it; if you get it right, you can see the results and the improvement in the data immediately. This business role for DevOps data is evolving rapidly.

In terms of data analysis, where are the biggest areas where people tend to ‘shoot themselves in the foot’?

The biggest problem is that many teams don't have the right ways to measure their activities, or they are using older metrics that are no longer relevant. How we work with clients – and how we run our own IT processes internally – is to look for a key goal that supports customers, and work back from there. For example, you may want to work on customer experience. Traditionally, you might have looked at survey data like Net Promoter Scores collected after a sale, and those still have value. However, when you have real-time data on site performance and shopping cart abandonment, you can see whether particular situations lead to more people leaving the site.

You can then turn these into measures that you can track. In our own DevOps team, we refer to these as service level objectives (SLOs) and service level indicators (SLIs). If you don't put these kinds of metrics in place, you can end up optimising for the wrong kinds of activity.
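As a rough illustration of the idea (a generic sketch, not Sumo Logic's tooling), an SLI might be the share of checkout requests that succeed within a latency budget, checked against an SLO; the numbers below are made up:

    from dataclasses import dataclass

    @dataclass
    class Request:
        ok: bool           # did the request succeed?
        latency_ms: float  # how long it took

    def sli_good_ratio(requests, latency_budget_ms=300.0):
        """SLI: fraction of requests that succeeded within the latency budget."""
        good = sum(1 for r in requests if r.ok and r.latency_ms <= latency_budget_ms)
        return good / len(requests) if requests else 1.0

    SLO = 0.995  # objective: 99.5 percent of requests are 'good'

    window = [Request(True, 120), Request(True, 480), Request(False, 95)]
    sli = sli_good_ratio(window)
    print(f"SLI = {sli:.3f}, SLO met: {sli >= SLO}")

The value of framing metrics this way is that the indicator is computed from real traffic, so the objective is tied to what customers actually experience rather than to an internal proxy.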

What can you tell us about ‘blind spots’ in monitoring and analytics?

Due to legacy volume-based, one-size-fits-all pricing models, many companies are being forced to make a trade-off when it comes to data for use cases such as code troubleshooting and preventive site or service maintenance. Because this data is used on an ad hoc basis, many companies choose not to use it at all, since ingesting and analysing it is simply not cost-effective. By leaving this data out for cost reasons, developer, operations and SecOps teams lose the full visibility needed to operate and secure their digital business.

What big things are you seeing happen in Continuous Intelligence at the moment?

Continuous Intelligence is a natural next step for companies from continuous integration and continuous deployment. Those pipelines are how forward-thinking companies build their applications and get new features into production. Continuous Intelligence builds on this to show companies how those applications are performing, how secure they are, and how they are supporting the business.

More companies are adopting this kind of approach – it relies on getting data into people's hands in the right format so they can make use of it, and it helps them do more of their work using data to make smarter decisions.

It seems as though many companies aren’t putting enough practices in place when it comes to data security. Do you think this is true and if so, why do you think this is happening?

Cloud SIEM is a big trend that will develop more in 2020. According to a survey conducted by Dimensional Research, 93 percent of security professionals think traditional SIEM solutions are ineffective for the cloud, and two-thirds identified the need to consolidate and rethink traditional tools.

Security Information and Event Management, or SIEM, is where companies consolidate their data from multiple sources to understand what is taking place around them. Companies running in the cloud need that data on what is taking place all the time, so they can understand and run their infrastructure securely. The traditional approaches to SIEM can't scale up to cope with the sheer amount of data coming in, so new approaches are needed.
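At its core, that consolidation step is about building one time-ordered picture from many feeds. As a generic sketch (not any vendor's implementation, with invented source names), events from several sorted streams can be merged into a single timeline for analysis:

    import heapq
    from datetime import datetime, timezone

    # hypothetical event feeds, each already sorted by timestamp
    firewall = [(datetime(2020, 1, 6, 9, 0, tzinfo=timezone.utc), "fw", "deny tcp/445")]
    auth_log = [(datetime(2020, 1, 6, 9, 1, tzinfo=timezone.utc), "auth", "login failed: admin")]
    app_log  = [(datetime(2020, 1, 6, 9, 2, tzinfo=timezone.utc), "app", "500 on /checkout")]

    # heapq.merge lazily combines sorted streams into one ordered timeline,
    # which is what lets an analyst correlate events across sources
    for ts, source, message in heapq.merge(firewall, auth_log, app_log):
        print(f"{ts.isoformat()} [{source}] {message}")

The scaling problem the cloud introduces is simply that each of those feeds becomes enormous, which is why lazy, streaming approaches matter.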

The traditional SIEM vendors are investing more into cloud, while cloud-native companies have built out their products to support the huge volumes of data that cloud applications create. There will be a lot that happens in the market over the next year, as more companies face problems around security and cloud.

What do you think people can do to ensure they are treating data security seriously?

The most important thing is to look at how to automate your approach to security responses and analysis. For many Security Operations Centres, finding the right staff and skills is a massive challenge – according to ISACA, almost 60 percent of companies had open roles for security professionals, and 32 percent found it took at least six months to fill a role. That is a long time for your existing team to be short-staffed, so automating processes and making your existing staff more productive will certainly help.
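Even very basic automation pays off here. As a simple, generic sketch (not a Sumo Logic feature, with invented alert names and severities), collapsing duplicate alerts and ranking the remainder by severity cuts the volume an analyst has to triage by hand:

    from collections import Counter

    # hypothetical alert stream: (signature, severity on a 1-5 scale)
    alerts = [
        ("failed-login-burst", 3), ("failed-login-burst", 3),
        ("storage-bucket-public", 5), ("failed-login-burst", 3),
        ("tls-cert-expiring", 2),
    ]

    def triage(alerts):
        """Collapse duplicate signatures, then surface the highest severity first."""
        counts = Counter(sig for sig, _ in alerts)
        severity = dict(alerts)
        return sorted(
            ((sig, sev, counts[sig]) for sig, sev in severity.items()),
            key=lambda item: item[1],  # sort by severity
            reverse=True,
        )

    for sig, sev, seen in triage(alerts):
        print(f"severity {sev}: {sig} (seen {seen}x)")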

Alongside this, it’s important to build up an understanding of how security functions for DevOps teams and supports all the great innovation work that is taking place. The impact of GDPR in the UK and the forthcoming data privacy regulations in California demonstrate that compliance is a big issue for business. This can be challenging, particularly when security is seen as a cost centre, but it’s essential that the business knows that this work is necessary and delivering value.

What do you think the big trends in 2020 will be?

According to the ISC2 report on cybersecurity for 2019, 75 percent of those surveyed were very or extremely concerned about cloud security. The biggest challenges were struggling with compliance (34 percent) and lack of visibility into infrastructure security (33 percent).

Dealing with these issues will be difficult, but getting more clarity on what is taking place across applications and cloud instances will help. At the same time, more people will gain the skills needed to manage cloud services at scale, so this should become less of an issue over time.

Is there anything that you will be glad to leave behind in 2019?

Default deployments of services that are not secure. The number of issues that have come down to application components not being hardened with basic steps like access control or encryption is shocking. I hope this is something that will be left behind as people apply best practices around DevOps. However, that is not guaranteed. Using data to spot poor deployments or missed security steps should help.
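On that last point, even a few lines of code can surface missed hardening steps. As a minimal, generic example (assuming AWS and the boto3 SDK; not a Sumo Logic feature), this flags storage buckets that were deployed without default server-side encryption:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def unencrypted_buckets():
        """List buckets that have no default server-side encryption configured."""
        flagged = []
        for bucket in s3.list_buckets()["Buckets"]:
            name = bucket["Name"]
            try:
                s3.get_bucket_encryption(Bucket=name)
            except ClientError as err:
                if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
                    flagged.append(name)
        return flagged

    print(unencrypted_buckets())

Feeding the output of checks like this into your monitoring data is exactly the kind of signal that catches a poor deployment before an attacker does.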

Where do you see the future of intelligence services heading?

Getting data used across businesses is something that we have been talking about for the past few years, but it has not been adopted as widely as I think many people assume. Getting data to be pervasive within an organisation – where every team has access to data as part of their workflows and can use it productively – is an end goal. We already see companies like Samsung SmartThings or Tacton taking Sumo Logic beyond the DevOps team, giving access to that data to other departments and making it useful for them.

I think that kind of progression around data will take place over the next year or two, as more people start to get curious about how other teams in their company work with data. As one team gets more successful, others want to tap into that success and use it in their own ways. It's an organic process, and there is no one-size-fits-all approach that works for every company, but it can be encouraged. The customers we work with get really enthusiastic, and they want to spread the word to other teams about the value they see.

 
