6 BI challenges IT teams must address
Business intelligence (BI) enables companies to harness insights from massive amounts of data. But doing so requires overcoming a range of strategic and tactical challenges.

Every day, organizations of every description are deluged with data from a variety of sources, and attempting to make sense of it all can be overwhelming. A strong business intelligence (BI) strategy can help organize the flow and ensure business users have access to actionable insights.

“By 2025, it’s estimated we’ll have 463 million terabytes of data created every day,” says Lisa Thee, data for good sector lead at Launch Consulting Group in Seattle. “For businesses to stay in touch with the market, be responsive, and create products that connect with consumers, it’s important to harness the insights that come out of that information.”

BI software helps companies do just that by shepherding the right data into analytical reports and visualizations so that users can make informed decisions. But without the right approach to implementing these tools, organizations can struggle to maximize value and achieve their business goals.

Here are six common BI challenges companies face — and how IT can address them.

1. Low user adoption rates

Diana Stout, senior business analyst, Schellman

It’s critical for organizations that want to realize the benefits of BI tools to get buy-in from all stakeholders straight away, as any initial reluctance can result in low adoption rates.

“The number-one issue for our BI team is convincing people that business intelligence will help to make true data-driven decisions,” says Diana Stout, senior business analyst at Schellman, a global cybersecurity assessor based in Tampa, Fla.

To gain employee buy-in, Stout’s team builds BI dashboards to show them how they can easily connect to and interact with their data, as well as visualize it in a meaningful way.

“For example, say a stakeholder thinks one certain product line is the most profitable,” she says. “I can build a dashboard and show them the intelligence that either proves that what they think is correct, or I can prove them wrong and show them why.”

This enables users to see the value in adopting BI tools, according to Stout.

2. Determining which BI delivery method fits best

There are many traditional IT-managed ways to deliver reports and insights from data. But by using self-service BI tools, with more intuitive dashboards and UIs, companies can streamline their processes by letting managers and other non-technical staff build and explore reports themselves and, in turn, derive more business value from the data.

Axel Goris, global visual analytics lead, Novartis

There can be obstacles, however, to taking the self-service approach. Having too much access across many departments, for example, can result in a kitchen full of inexperienced cooks running up costs and exposing the company to data security problems. And do you want your sales team making decisions based on whatever data it gets, and having the autonomy to mix and match to see what works best? Central, standardized control over tool rollout is key. And to do it correctly, IT needs to govern the data well.

Because of these tradeoffs, organizations must ensure they select the BI approach best suited to the business application at hand.

“We have more than 100,000 associates in addition to externals working for us, and that’s quite a large user group to serve,” says Axel Goris, global visual analytics lead at Novartis, the multinational pharmaceutical corporation based in Basel, Switzerland. “A key challenge was organization around delivery — how do you organize delivery, because a pharmaceutical company is highly regulated.”

An IT-managed BI delivery model, Goris explains, requires a lot of effort and process, which wouldn’t work for some parts of the business.

“That’s because they feel it’s overly complex; there’s too much overhead, and they want to move faster and be more agile,” Goris says. “And if IT is the go-to place for delivery, then IT becomes a bottleneck because we’re not big enough to do the delivery for everyone.”

To deal with this challenge, Novartis implemented both types of delivery: the IT-managed method and the self-service, business-managed approach.

“With business-managed delivery, we provide the platforms and tools, and allow the business, within certain parameters, to go on its own, use its preferred vendors, or have teams do it themselves, and that’s very popular,” he says, adding that it all comes down to determining “how we can serve everyone in the business or allow BI users to serve themselves in a way that’s scalable.”

3. To integrate data or not

As organizations find themselves having to integrate data from a variety of sources both on-premises and in the cloud — which can be a time-consuming and complicated process — demand to simplify setup increases. Some organizations sidestep the problem entirely. Lionel LLC, for instance, the American designer and importer of toy trains and model railroads based in Concord, N.C., uses its ERP as its system of record, according to CIO Rick Gemereth.

Rick Gemereth, CIO, Lionel LLC

“Our single data source is NetSuite, and we have our entire ERP, our e-commerce, based on NetSuite,” he says. “And one of the benefits of that is we don’t have the challenge of trying to marry data from different sources.” Yet what works for Lionel might not work elsewhere. The challenge is finding the solution that works best for your particular circumstances.

Stout, for instance, explains how Schellman addresses integrating its customer relationship management (CRM) and financial data.

“A lot of business intelligence software pulls from a data warehouse where you load all the data tables that are the back end of the different software,” she says. “Or you have a [BI tool] like Domo, which Schellman uses, that can function as a data warehouse. You can connect to the software and it’ll pull it into a table. Then you have all those tables in one place so you can grab the information and fiddle with it.”
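The pattern Stout describes, landing each system's back-end tables side by side so they can be joined, can be sketched in a few lines. This is an illustrative example only: the table names and figures are invented, and SQLite stands in for a real warehouse or a tool like Domo.

```python
import sqlite3

# Hypothetical sketch: land CRM and finance extracts as tables in one
# lightweight "warehouse" (an in-memory SQLite database here), then join.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crm_accounts (account_id TEXT, name TEXT)")
conn.execute("CREATE TABLE finance_invoices (account_id TEXT, amount REAL)")
conn.executemany("INSERT INTO crm_accounts VALUES (?, ?)",
                 [("A1", "Acme"), ("A2", "Globex")])
conn.executemany("INSERT INTO finance_invoices VALUES (?, ?)",
                 [("A1", 1200.0), ("A1", 800.0), ("A2", 500.0)])

# Once both extracts live in one place, a single query answers questions
# that previously required stitching two systems together by hand.
rows = conn.execute("""
    SELECT c.name, SUM(f.amount) AS revenue
    FROM crm_accounts c
    JOIN finance_invoices f ON f.account_id = c.account_id
    GROUP BY c.name
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Acme', 2000.0), ('Globex', 500.0)]
```

The point of the consolidation step is exactly what Stout notes: once the tables are in one place, analysts can "grab the information and fiddle with it" without touching the source systems.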

Jim Hare, distinguished VP and analyst at Gartner, says that some people think they need to take all the data siloed in systems in various business units and dump it into a data lake.

“But what they really need to do is fundamentally rethink how data is managed and accessed,” he says. “What Gartner is writing about is the concept of a data fabric.”

Defined as an enabler of frictionless access and sharing of data in a distributed environment, a data fabric aims to help companies access, integrate, and manage their data no matter where that data is stored, using semantic knowledge graphs, active metadata management, and embedded machine learning. “Data fabric allows the data to reside in different types of repositories in the cloud or on prem,” Hare says. “It’s about being able to find relevant data and connect it through a knowledge graph. And key to this is the metadata management.”
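A real data fabric involves far more machinery than any short snippet can show, but the core idea Hare describes, centralized metadata pointing at decentralized data, can be illustrated with a toy catalog. All dataset names and business concepts below are invented for illustration.

```python
# Illustrative sketch only: a toy metadata catalog in the spirit of a
# data fabric. Each dataset stays in its own repository; what is
# centralized is the metadata linking datasets to shared business concepts.
catalog = {
    "crm.accounts":    {"store": "cloud",   "concepts": {"customer"}},
    "erp.invoices":    {"store": "on-prem", "concepts": {"customer", "order"}},
    "web.clickstream": {"store": "cloud",   "concepts": {"session"}},
}

def datasets_for(concept):
    """Walk the metadata to find every dataset, wherever it lives,
    that exposes a given business concept."""
    return sorted(name for name, meta in catalog.items()
                  if concept in meta["concepts"])

print(datasets_for("customer"))  # ['crm.accounts', 'erp.invoices']
```

Note that the query never moves the underlying data: the on-prem invoices and the cloud CRM tables are discovered and connected through metadata alone, which is the contrast Hare draws with dumping everything into a data lake.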

4. Allowing perfect to be the enemy of good enough

Conventional wisdom says companies need to work with high-quality data to glean insights necessary to make the best business decisions. But that’s not quite accurate, says Nicole Miara, digital transformation lead at LKQ Europe GmbH, a subsidiary of LKQ Corp. and a leading distributor of automotive aftermarket parts based in Zug, Switzerland.

Just because you don’t think the data is of the highest quality doesn’t mean it doesn’t have value.

Nicole Miara, digital transformation lead, LKQ Europe GmbH

When it comes to making decisions, a company’s desire for perfect data can slow its efforts as it spends time gathering as much data as possible, fixing incomplete records, or correcting formats. Perfect data is difficult to achieve, but organizations can work with and analyze imperfect data to start translating it into business insights, according to Miara. Through Project Zebra, an open-sourced think tank composed of business leaders, academics, and technologists working to drive supply chain improvement, she was able to use imperfect data to make good business decisions and significantly improve the supply chain.

“Data doesn’t have to be perfect to start the journey,” she says. “It’s a step-by-step approach.” Plus, she adds, you can’t make predictions if you don’t have the basic data layer.

For example, LKQ Europe was trying to apply its data, including sales data, to improve its supply chain operations in light of 35 months of disruption it experienced due to the pandemic. However, the company only had data on its sales history for about 12 months.

“We took invoice data, and we didn’t have additional information regarding our sales, so we took that imperfect sales data and tried to find correlations to our future business,” Miara says. “But we wanted to understand if we could improve our forecasting to predict demand based on that data alone. We found that our imperfect data correlated very well with outside signals, such as inflation and the employment index, even though the data wasn’t perfect.”

5. Dealing with resistance to change

Change management was the number-one struggle Happy Feet International faced when implementing business intelligence, says Nick Schwartz, CIO of the luxury vinyl plank and tile flooring company based in Ringgold, Ga.

Nick Schwartz, CIO, Happy Feet International

The flooring industry is a technological infant and, as such, a lot of people don’t use technology, according to Schwartz. In fact, when Schwartz joined the company three years ago, salespeople didn’t even use email on a day-to-day basis since they were more comfortable conducting business over the phone.

“People are used to doing things a certain way,” he says. “They’ve been doing it that way for years and they ask why you’re trying to do it a different way. So we have to simplify the experience as much as possible for them, and also hold longer training sessions.”

6. Data governance consistency

Organizations need to ensure they have mature data governance processes in place, including master data management as well as governance around key metrics and key performance indicators (KPIs), says Justin Gillespie, principal and chief data scientist at The Hackett Group, a research advisory and consultancy firm.

“We all hear the horror stories,” he says. “Every company I talk to has the same problem: people come to meetings with different numbers and spend the whole meeting arguing about how so-and-so got his or her number. Having a centralized, governed set of KPIs and metrics that are certified by the organization is key.”

Governance is also about standardizing the tools and platforms, according to Gillespie. “From a tools and technologies perspective, it’s rarely about not having tools, it’s usually about having too many tools,” he says. “So companies should standardize on one tool set and then create a proficiency around it.”
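In miniature, the "centralized governed set of KPIs" Gillespie calls for might look like a registry holding one certified definition per metric, so every report computes a number the same way. Everything below is a hypothetical sketch, not any vendor's governance product.

```python
# Hypothetical sketch: one certified definition per KPI. Reports call
# kpi() instead of re-deriving "net revenue" their own way.
CERTIFIED_KPIS = {
    "net_revenue": lambda rows: sum(r["amount"] for r in rows
                                    if not r["refund"]),
}

def kpi(name, rows):
    """Compute a metric only if it has a certified definition."""
    if name not in CERTIFIED_KPIS:
        raise KeyError(f"{name} is not a certified KPI")
    return CERTIFIED_KPIS[name](rows)

orders = [{"amount": 100.0, "refund": False},
          {"amount": 40.0,  "refund": True},
          {"amount": 60.0,  "refund": False}]
print(kpi("net_revenue", orders))  # 160.0
```

The design choice mirrors Gillespie's point: when every team pulls the metric from the same certified definition, meetings stop being arguments about whose number is right.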
