BI Archives - DevOps Online North America (31 Media Ltd)

The importance of realistic data in tests
https://devopsnews.online/the-importance-of-realistic-data-in-tests/ (Wed, 12 Sep 2018)

The post The importance of realistic data in tests appeared first on DevOps Online North America.

There are many tests that we can execute in Business Intelligence (BI) systems, or in any system that uses uncontrolled data: extract, transform, load (ETL), queries, performance and so on.

Testing should be done under conditions as close as possible to those the system will face in production with actual users. One of the keys to success here is the data used during the testing process.

Some applications use only data they produce themselves, like alarm clocks. Others use only predefined data, like weather apps. Those cases are relatively easy. But when your application or system consumes many types of data, including varied external and sometimes unstructured data (as big data systems do), the data might be corrupted or unexpected; in other words, data the code cannot handle. Unexpected data is not the only source of data integrity issues, though. The processing of items that are supposed to be handled can also malfunction: for example, an inability to process a certain picture format or a variation of it.

Another kind of malfunction lies in the BI system's ability to filter or query correctly: for example, querying for items up to 2k and also getting items of 2.1k. You can think of such a test, but you can't think of all of them. Running your tests against a large amount of data increases the chance of finding more issues.
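The boundary case above can be turned into a concrete test. Here `filter_items` is a hypothetical stand-in for the system's query layer; the test seeds values around the 2k limit and asserts that nothing above it leaks through:

```python
def filter_items(items, max_size):
    """Stand-in for the BI query layer: return items with size up to max_size."""
    return [item for item in items if item["size"] <= max_size]

def test_size_filter_boundary():
    # Seed values straddling the limit, including the 2.1k case from the text.
    items = [{"name": n, "size": s}
             for n, s in [("a", 1999), ("b", 2000), ("c", 2001), ("d", 2100)]]
    result = filter_items(items, max_size=2000)
    # Nothing above the limit may appear, and the limit itself is included.
    assert all(item["size"] <= 2000 for item in result)
    assert {item["name"] for item in result} == {"a", "b"}

test_size_filter_boundary()
```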

The number of possibilities is unlimited, and so is the number of data types and tests. This carries many risks, from data loss or incorrectly processed data to system downtime.

Regarding the last point, it is also the programmer’s responsibility to handle unexpected data in the code.

Reducing risks

Always use as much data as you can in your tests. Fill the database with items similar to the ones you are testing, alongside all other supported and unsupported data. For example, if you test emails, make sure you have a lot of emails in the system: some in different languages, of different lengths, with and without attachments, with different attachment sizes and so on. If you don't have enough data, try to develop test code that can produce a large amount of data whose content you control.
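A small sketch of such a generator (the field names are illustrative): it produces a reproducible corpus of emails varying in language, body length and attachment size:

```python
import random
import string

def make_test_emails(count, languages=("en", "he", "ru"), seed=42):
    """Generate a reproducible corpus of synthetic emails with varied
    languages, body lengths and attachment sizes (field names illustrative)."""
    rng = random.Random(seed)  # fixed seed -> the corpus is reproducible
    emails = []
    for i in range(count):
        body_len = rng.choice([0, 10, 1_000, 10_000])  # empty to long bodies
        emails.append({
            "subject": f"test-{i}",
            "language": rng.choice(languages),
            "body": "".join(rng.choices(string.ascii_letters + " ", k=body_len)),
            # roughly a quarter of the emails carry no attachment at all
            "attachment_bytes": rng.choice([None, 1, 10_240, 5_000_000]),
        })
    return emails

corpus = make_test_emails(500)
```

Because the seed is fixed, a failing test can be re-run against exactly the same data.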

Otherwise, when the test says "apply filter X and make sure you see item A, a document titled 'I am a document'", and item A is the only document in the database, you might miss a bug that retrieves every document starting with "I", or other data that somehow enters the result (a picture named "I am a document", for example).
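To make that concrete, here is a sketch with a hypothetical `search_documents` standing in for the system's filter, seeded with decoys that share the title's prefix or its exact name:

```python
def search_documents(items, title):
    """Stand-in for the system's filter: exact title match, documents only."""
    return [i for i in items if i["type"] == "document" and i["title"] == title]

items = [
    {"type": "document", "title": "I am a document"},           # expected hit
    {"type": "document", "title": "I am a different document"}, # decoy: same prefix
    {"type": "picture",  "title": "I am a document"},           # decoy: same title, wrong type
]

result = search_documents(items, "I am a document")
# Exactly one hit: the decoys must not slip into the result set.
assert len(result) == 1 and result[0]["type"] == "document"
```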

Customer data

Using real data from the customer or from production is not a nice-to-have but a very important factor in the success of the tests, and it can uncover all the kinds of issues above. True, it is not always available, but because of its importance we must do our best to get it.

Feel free to abuse

Feel free to abuse the data, from corrupted data packets to pictures. Truncate the data, add unexpected data, make it long or short and so on. Fill the database with unsupported data.
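A minimal sketch of such abuse, assuming the system under test accepts raw payloads somewhere: a helper that truncates a payload, flips a few bytes and appends an unexpected trailer:

```python
import random

def corrupt(data: bytes, seed=0) -> bytes:
    """Produce an abused variant of a payload: truncate it, flip a few
    random bytes and append unexpected trailing bytes."""
    rng = random.Random(seed)                        # reproducible corruption
    out = bytearray(data[: max(1, len(data) // 2)])  # truncate roughly in half
    for _ in range(min(3, len(out))):                # flip a few random bytes
        i = rng.randrange(len(out))
        out[i] ^= 0xFF
    out += bytes([0x00, 0xFF, 0x7F])                 # unexpected trailer
    return bytes(out)

# Feed corrupted variants of a valid payload (here, a PNG-like header) to the
# system under test and verify it rejects them without crashing.
sample = b"\x89PNG\r\n\x1a\n" + b"\x00" * 64
bad = corrupt(sample)
```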

Know your customer

If you have a specific customer, research them. If they are in a specific country and the data comes from the web, research the most popular websites and languages in that country. If it is related to apps or phone types, research the most common apps, phones and so on in that country, and base your tests on that.

A final tip about the data: ETL and executing queries are not enough to validate that everything works. You need to make sure the data travelled through the whole system, all the way to export, correctly.
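One way to sketch such an end-to-end check (the record structure here is illustrative) is to fingerprint the ingested records and compare against the exported ones, ignoring order:

```python
import hashlib
import json

def fingerprint(records):
    """Order-independent fingerprint of a record set, for comparing what
    went into the pipeline with what comes out at export."""
    canon = sorted(json.dumps(r, sort_keys=True) for r in records)
    return hashlib.sha256("\n".join(canon).encode()).hexdigest()

ingested = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
exported = [{"id": 2, "v": "b"}, {"id": 1, "v": "a"}]  # export order may differ

# If any record was lost or altered on the way, the fingerprints diverge.
assert fingerprint(ingested) == fingerprint(exported), "data changed in transit"
```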

BI and analytics market to hit US$18.3 billion
https://devopsnews.online/bi-and-analytics-market-to-hit-us18-3-billion/ (Fri, 17 Feb 2017)

The post BI and analytics market to hit US$18.3 billion appeared first on DevOps Online North America.

Global revenue in the business intelligence (BI) and analytics software market is forecast to reach US$18.3 billion in 2017, an increase of 7.3% from 2016, according to the latest forecast from Gartner, Inc.

By the end of 2020, the market is forecast to grow to US$22.8 billion.

Accessibility, agility and deeper analytical insight

Modern BI and analytics continues to expand more rapidly than the overall market, which is offsetting declines in traditional BI spending. The modern BI and analytics platform emerged in the last few years to meet new organisational requirements for accessibility, agility and deeper analytical insight, shifting the market from IT-led, system-of-record reporting to business-led, agile analytics including self-service.

The modern BI and analytics market is expected to decelerate, however, from 63.6% growth in 2015 to a projected 19% by 2020. Gartner believes this reflects data and analytics becoming mainstream. The market is growing in terms of seat expansion, but revenue will be dampened by pricing pressure.

“Purchasing decisions continue to be influenced heavily by business executives and users who want more agility and the option for small personal and departmental deployments to prove success,” said Rita Sallam, research vice president at Gartner. “Enterprise-friendly buying models have become more critical to successful deployments.”

The influences behind modern BI and analytics

Gartner said that the rapidly evolving modern BI and analytics market is being influenced by the following seven dynamics:

Modern BI at scale will dominate new buying

While business users initially flocked to new modern tools because they could be used without IT assistance, the increased need for governance will serve as the catalyst for renewed IT engagement. Modern BI tools that support greater accessibility, agility and analytical insight at the enterprise level will dominate new purchases.

New innovative and established vendors will drive the next wave of market disruption

The emergence of smart data discovery capabilities, machine learning and automation of the entire analytics workflow will drive a new flurry of buying because of its potential value to reduce time to insights from advanced analytics and deliver them to a broader set of people across the enterprise. While this “smart” wave is being driven by new innovative startups, traditional BI vendors that were slow to adjust to the current “modern” wave are driving it in some cases.

Need for complex datasets drives investments in data preparation

Business users want to analyse a diverse, often large and more complex combinations of data sources and data models, faster than ever before. The ability to rapidly prepare, clean, enrich and find trusted datasets in a more automated way becomes an important enabler of expanded use.

Extensibility and embeddability will be key drivers of expanded use and value

Both internal users and customers will either use more automated tools or will embed analytics in the applications they use in their context, or a combination of both. The ability to embed and extend analytics content will be a key enabler of more pervasive adoption and value from analytics.

Support for real-time events and streaming data will expand use

Organisations will increasingly leverage streaming data generated by devices, sensors and people to make faster decisions. Vendors need to invest in similar capabilities to offer buyers a single platform that combines real-time events and streaming data with other types of source data.

Interest in cloud deployments will continue to grow

Cloud deployments of BI and analytics platforms have the potential to reduce the cost of ownership and speed time to deployment. However, data gravity, with the majority of enterprise data still residing on-premises, continues to be a major inhibitor of adoption. That reticence is abating, and Gartner expects the majority of new licensing spend to be for cloud deployments by 2020.

Marketplaces will create new opportunities for organisations to buy and sell analytic capabilities and speed time to insight

The availability of an active marketplace where buyers and sellers converge to exchange analytic applications, aggregated data sources, custom visualisations and algorithms is likely to generate increased interest in the BI and analytics space and fuel its future growth.

“Organisations will benefit from the many new and innovative vendors continuing to emerge, as well as significant investment in innovation from large vendors and venture capital-funded startups,” said Ms Sallam. “They do, however, need to be careful to limit the technical debt that can occur when multiple stand-alone solutions that demonstrate business value quickly turn into production deployments without adequate attention being paid to design, implementation and support.”

 

Edited from press release by Jordan Platt.

The use of a great BI strategy
https://devopsnews.online/the-use-of-a-great-bi-strategy/ (Tue, 22 Nov 2016)

The post The use of a great BI strategy appeared first on DevOps Online North America.

Zafar Ali, Director, IDB Solutions LTD, discusses how you can boost your success through the creation of a good BI testing strategy. 

Enterprises use various BI tools, such as Cognos, Oracle Business Intelligence Enterprise Edition (OBIEE), Tableau and SQL Server Reporting Services (SSRS), to analyse swathes of data and convert raw data into a more presentable format, such as a dashboard or key performance indicators (KPIs), in order to make well-informed decisions. But you need to make sure your testing strategy is good.

The success of any BI project depends on trust in the data shown in the BI dashboards and reports, the performance of the BI system, data integrity, data security, the ease of use of the application and the graphical representation of the data.

Testing a large-scale BI system faces challenges such as data quality assurance (DQA), metrics/aggregation rule tests and data grain. The success of any BI system depends on the traceability of the BI functionality to the business requirements (BR), the issue/anomaly fix log and how well the test cases are developed.

What is BI testing?

BI testing plays a pivotal role in the success of large-scale BI rollouts in enterprises.

BI testing ranges from data validation, formatting, security and performance to the graphical user interface (GUI). An emphasis on thorough BI testing is pivotal for improving the quality of the data and the user interface, and for meeting business requirements.

Developing a good testing strategy

The purpose of BI/DWH/ETL testing is to get credible data for your end-users. Making sure a comprehensive testing strategy is adopted can increase the credibility of BI.

A comprehensive test strategy is a stepping-stone for an effective test cycle of a BI application. Your test strategy should cover every stage of the data flow, to make sure everything from data input to data output is effectively tested.

To ensure testing readiness, the following key areas of your testing strategy should be focused on:

  • Testing scope – testing categories and types to be used
  • Test environment
  • Test data availability
  • DQ and performance benchmark
  • Issue/enhancement log and traceability

The different testing categories

DWH/ETL testing:

ETL testing is quite different from conventional testing, and there are many challenges in performing data warehouse testing. Here is a list of a few ETL testing challenges:

  • Incompatible and duplicate data.
  • Loss of data during the ETL process.
  • Unavailability of an inclusive test bed.
  • Testers have no privileges to execute ETL jobs on their own.
  • The volume and complexity of the data are huge.
  • Faults in business processes and procedures.
  • Trouble acquiring and building test data.
  • Missing business data transformation rules.

Data is important for businesses to make critical business decisions. ETL testing plays a crucial part in validating and ensuring that the business information is exact, consistent and reliable.
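One common validation, sketched here with an in-memory SQLite database standing in for the real source and target systems, is a reconciliation check: compare row counts and an aggregate checksum on both sides of the ETL job to catch lost or altered rows:

```python
import sqlite3

def table_stats(conn, table, key):
    """Row count plus a simple aggregate checksum, for reconciling an ETL
    target table against its source."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM({key}), 0) FROM {table}")
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount INTEGER)")
rows = [(i, i * 10) for i in range(100)]
conn.executemany("INSERT INTO src VALUES (?, ?)", rows)
conn.executemany("INSERT INTO tgt VALUES (?, ?)", rows)  # the ETL job's work

# Reconciliation: no rows lost, amounts preserved end to end.
assert table_stats(conn, "src", "amount") == table_stats(conn, "tgt", "amount")
```

A sum is only a coarse checksum; real reconciliations often add per-column hashes or sampled row-level comparisons.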

Functional testing:

When a new report or dashboard is developed for consumption by other users, it is important to perform a few checks to validate the data and design of the included reports.

Verify that the new report or dashboard conforms to the report requirement / design specifications. Some of the items to check are:

  • Verify that the report or dashboard page title corresponds to the content of the reports.
  • For reports with charts, the axes should be labelled appropriately.
  • The aggregation level of the data in the reports should be as per the report requirements.
  • Verify that the report or dashboard page design conforms to the design standards and best practices.
  • Validate the presence and functionality of the report download and print options.
  • Where applicable, verify that the report help text exists and is appropriate for the report content.
  • Verify the existence of any required static display text in the report, such as FOIA text.
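Some of these checks can be automated if the BI tool can export report metadata. The dictionary structure below is hypothetical; adapt it to whatever the tool actually exposes:

```python
def check_report(report: dict) -> list:
    """Run a few of the functional checks above against a report's
    metadata (hypothetical structure)."""
    problems = []
    if not report.get("title"):
        problems.append("missing page title")
    for chart in report.get("charts", []):
        if not chart.get("x_label") or not chart.get("y_label"):
            problems.append(f"chart {chart.get('name')} missing axis labels")
    if "download" not in report.get("actions", []):
        problems.append("download option absent")
    return problems

report = {"title": "Sales by Region",
          "charts": [{"name": "trend", "x_label": "Month", "y_label": "Revenue"}],
          "actions": ["download", "print"]}
assert check_report(report) == []  # a well-formed report raises no problems
```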

Stress testing:

BI stress testing is similar to the testing of any other web/desktop application. The objective is to simulate concurrent users accessing reports with different prompts and to understand the bottlenecks in the system.

A typical BI user logs in to the system, navigates to reports (or dashboards), applies prompts and drills down to other reports. After a report is rendered, the BI user reviews the data for a certain amount of time, called ‘think time’. Conducting a stress test requires simulating this user behaviour, concurrently, for different user loads, so it is important to have a list of different user logins for the stress test. When executing the reports, each user can pick a different set of prompt values for the same report.
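Assuming a stub in place of the real report request, the behaviour described above, concurrent users with per-user prompts and think time, can be sketched with plain threads:

```python
import random
import threading
import time

def simulate_user(user_id, reports, results, think_time=0.01):
    """One virtual BI user: render each report with a random prompt,
    then pause ('think time') before moving on."""
    rng = random.Random(user_id)  # each user picks their own prompts
    for report in reports:
        prompt = rng.choice(["2016", "2017", "EMEA", "APAC"])
        start = time.monotonic()
        time.sleep(0.001)  # stand-in for the actual report request
        results.append((user_id, report, prompt, time.monotonic() - start))
        time.sleep(think_time)  # the user reviews the rendered data

results, threads = [], []
for uid in range(5):  # five concurrent virtual users
    t = threading.Thread(target=simulate_user,
                         args=(uid, ["sales", "inventory"], results))
    threads.append(t)
    t.start()
for t in threads:
    t.join()
```

Real stress tests use dedicated load tools and distinct logins per virtual user; the point here is only the shape of the simulation: concurrency, varied prompts and think time.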

Security testing:

BI applications also have authentication and authorisation security requirements, often integrated with single sign-on (SSO), the lightweight directory access protocol (LDAP) or another mechanism such as Oracle WebLogic. The objective of security testing is to validate that each BI user’s access to BI reports, subject areas and dashboards is limited according to their access level. Access to reports is generally controlled by an application role or an LDAP-based security feature in the BI tool.
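The core assertion of such a test can be sketched as follows; the role names, report lists and `visible_reports` stub are all illustrative, with the stub standing in for a query against the actual BI tool:

```python
# Role-based access check: each role's visible reports must stay within
# what that role is allowed to see (names are illustrative).
ROLE_REPORTS = {
    "analyst":   {"sales", "inventory"},
    "executive": {"sales", "inventory", "finance"},
}

def visible_reports(role):
    """Stand-in for asking the BI tool which reports a role can open."""
    return ROLE_REPORTS.get(role, set())

for role, allowed in ROLE_REPORTS.items():
    seen = visible_reports(role)
    assert seen <= allowed, f"{role} can see reports outside its access level"

# A user with no recognised role must see nothing at all.
assert visible_reports("guest") == set()
```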

Single sign-on is often used as the authentication mechanism for BI applications in large enterprises; the objective of this testing is to ensure that users are able to access BI applications using their single sign-on access (or Windows authentication).

Report/BI testing:

BI tools such as OBIEE and Business Objects empower business users by giving them the capability to create their own reports without the help of a developer. These tools generate the database queries for the reports automatically, based on the selected measures and dimensions. In OBIEE the model is defined in the RPD, while Business Objects stores the model in the form of a universe. Business users can select any combination of the dimension and measure attributes available in the subject area to come up with their own ad hoc report.

 

Edited for web by Jordan Platt.
