Power Community

Fortune 500 Pharmaceutical Company Leverages Power BI and ValQ for Strategic Planning

Summary: A Fortune 500 pharmaceutical company headquartered in New Jersey, United States, demonstrated the impact of strategic data transformation to better prepare for the future. Their solution uses Microsoft Azure, Power BI, and ValQ to harness the potential of their data, performing predictive financial modeling and planning that informs better business outcomes.

Watch this video to hear the finance director from the Fortune 500 pharmaceutical company describe his journey using Power BI for planning.

Fortune 500 pharmaceutical company uses Power BI to solve planning challenge

The US-based Fortune 500 pharmaceutical company faced challenges with fragmented data: sorting through multiple data sources (e.g., Excel, flat files, SAP, and other cloud-based systems) with a disjointed toolset. These barriers blocked access to critical information, making it increasingly difficult to perform economic modeling, evaluation, and metrics calculations.

The finance director for supply chain management within the company was looking for a way to perform much deeper end-to-end analysis of the demand/delivery cycle to understand cost drivers and other performance metrics. This solution would enable insight into the actionable areas of the business that drive the economics, and therefore the figures reported in their financial results.

Ultimately, their solution needed to unify a significant amount of disparate data in a coherent manner that would allow them to perform predictive modeling and understand the financial implications of certain actions prior to execution.

Power BI’s low-code reporting and visualization made it a suitable tool for the company’s finance director to begin work on the proof of concept, and its flexibility, security, and functionality meant it would also scale well into full implementation. Power BI’s data modeling and data source connection capabilities made it an obvious choice for their solution. Most importantly, Power BI supported the planning capabilities the director needed through the partnership with Visual BI and their ValQ application, which delivers forecasting, simulations, and performance analysis.

The platform behind the data

The platform the company is leveraging for the POC—which took 10 days to implement—takes all source data and moves it into the Azure environment. Source data is loaded via Azure Data Factory into Azure Data Lake Storage Gen2 for storage. From there, data is transformed via Azure Databricks and Azure Machine Learning in two stages.

Stage one involves the steps necessary to make the data fit for purpose (see the sketch after this list), which are to:

  1. Clean and purge
  2. Organize and structure
  3. Establish relationships
  4. Apply business logic
  5. Model to support analysis
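
As a rough illustration of stage one, here is a minimal PySpark sketch of the kind of cleaning, structuring, and modeling that might run in Azure Databricks. The table names, paths, columns, and business rule are assumptions for illustration only; the company's actual pipeline and schemas are not described in this story.

```python
# Minimal sketch of a stage-one transformation in Azure Databricks (PySpark).
# Table names, columns, paths, and the business rule are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# 1. Clean and purge: drop exact duplicates and rows missing key fields.
orders = (
    spark.read.format("delta").load("/mnt/datalake/raw/supply_orders")  # hypothetical path
    .dropDuplicates()
    .dropna(subset=["material_id", "plant_id", "order_qty"])
)

# 2. Organize and structure: keep a consistent, typed column set.
orders = orders.select(
    F.col("material_id").cast("string"),
    F.col("plant_id").cast("string"),
    F.col("order_qty").cast("double"),
    F.to_date("order_date").alias("order_date"),
)

# 3. Establish relationships: join to a material master dimension (provides std_unit_cost).
materials = spark.read.format("delta").load("/mnt/datalake/raw/material_master")
orders = orders.join(materials, on="material_id", how="left")

# 4. Apply business logic: e.g., flag orders above a standard cost threshold.
orders = orders.withColumn(
    "high_cost_flag", F.col("order_qty") * F.col("std_unit_cost") > 100_000
)

# 5. Model to support analysis: write a curated table for downstream use.
orders.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/supply_orders")
```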

The second stage builds on this by leveraging data platform functions that (see the sketch after this list):

  1. Automate tasks
  2. Generate analytics and insights
  3. Detect anomalies
  4. Simulate model changes
  5. Predict
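
As a simple stand-in for the second stage, here is a hedged pandas sketch of one such function—anomaly detection via a per-plant z-score on weekly spend. It is illustrative only; the models the company actually runs in Azure Machine Learning are not described here.

```python
# Illustrative stage-two step: flag anomalous weekly spend per plant with a z-score.
# A stand-in sketch only; the company's actual Azure Machine Learning models are not public.
import pandas as pd

def flag_spend_anomalies(weekly_spend: pd.DataFrame, threshold: float = 3.0) -> pd.DataFrame:
    """Add a z-score and an 'is_anomaly' flag where spend deviates from each plant's norm."""
    stats = (
        weekly_spend.groupby("plant_id")["spend"]
        .agg(spend_mean="mean", spend_std="std")
        .reset_index()
    )
    out = weekly_spend.merge(stats, on="plant_id", how="left")
    out["z_score"] = (out["spend"] - out["spend_mean"]) / out["spend_std"]
    out["is_anomaly"] = out["z_score"].abs() > threshold
    return out

# Tiny made-up example: the 400 spike stands out against the plant's normal spend.
df = pd.DataFrame({
    "plant_id": ["P1"] * 5,
    "week": [1, 2, 3, 4, 5],
    "spend": [100.0, 105.0, 98.0, 102.0, 400.0],
})
print(flag_spend_anomalies(df, threshold=1.5))
```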
Source data from AWS, Oracle, Excel, SAP HANA, and SharePoint is ingested into Azure Data Lake Storage Gen2. The data is then processed and transformed using Azure Databricks and Azure Machine Learning services, pushed to Azure SQL Database, and presented in Power BI, Excel, Power Apps, and PowerPoint, with Azure Data Factory moving data throughout the process. Internal end users access the processed data to make decisions.

Image 1: The company’s financial analytics architecture, showing data storage, processing, and interaction.

This data manipulation and transformation is key to the entire application development process. The data serving and interaction layer leverages Microsoft Power Platform: the solution uses Power BI to present the data and Power Apps to let users interact with it. Whether a user pulls data from the Power BI dashboards or from Excel, the information will be the same because of the shared data transformation process.

The next key element in the company’s solution is the integration of Visual BI’s planning tool, ValQ, which provides the simulation capability.

The simulation starts with a demand signal: aligned volume assumptions flow downward to trigger each process—demand, supply network, financial supply chain, and inventory distribution and logistics. After each process is complete, highly automated forecasting takes place, producing forecasts for revenue, cost of goods, variance, purchase price, tax, distribution and logistics, insurance, discards, and warehousing.
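
To make the idea concrete, here is a hedged, toy sketch of a driver-based chain in Python: a demand signal cascades into production volume and then into the downstream forecasts. All drivers, rates, and names are invented for illustration and are not the company's actual ValQ model.

```python
# Toy driver-based planning chain: a demand signal cascades into volume and cost forecasts.
# All drivers, rates, and names are illustrative assumptions, not the company's ValQ model.
from dataclasses import dataclass

@dataclass
class Assumptions:
    price_per_unit: float = 12.0       # revenue driver
    cogs_per_unit: float = 7.5         # cost of goods driver
    logistics_per_unit: float = 0.8    # distribution and logistics driver
    discard_rate: float = 0.02         # share of produced units discarded
    tax_rate: float = 0.21

def forecast(demand_units: float, a: Assumptions) -> dict:
    # Supply network: produce enough to cover demand plus expected discards.
    produced_units = demand_units / (1 - a.discard_rate)
    revenue = demand_units * a.price_per_unit
    cogs = produced_units * a.cogs_per_unit
    logistics = produced_units * a.logistics_per_unit
    pre_tax = revenue - cogs - logistics
    return {
        "produced_units": round(produced_units),
        "revenue": revenue,
        "cost_of_goods": cogs,
        "distribution_and_logistics": logistics,
        "tax": max(pre_tax, 0) * a.tax_rate,
    }

# Pulling the demand lever re-plans every downstream figure.
print(forecast(100_000, Assumptions()))
print(forecast(120_000, Assumptions()))
```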

Image 2: Visualization of the company’s simulation model, structured around demand signals that result in production-focused forecasting.

The Fortune 500 company’s integrated planning process starts with a demand signal. This signal feeds the supply network plan—how a company produces materials to meet the overall demand. The next step is to incorporate the financial supply chain, which tracks the cost of physically transporting products in and out of various markets. The final step in the process is managing inventory and determining distribution logistics. Each step in this process is a driver of financial outcomes (see image 2, left swim lane).

The financial outcomes lead to financial assumptions (see image 2, horizontal swim lanes) that are applied to each process step to arrive at the various forecasts and analyses (see image 2, right swim lane).

The pharmaceutical company spends significant time in, and obtains a large amount of utility from, the area of the Power BI platform focused on production management. Power BI helps them understand aspects of production management such as the timing and scheduling of events, repackaging counts, runtime versus setup time, and changeover time on the factory line or floor. Knowing how these aspects influence required resources is used to determine how to optimize output over time and thereby become more cost efficient. By adjusting the various inputs in Power BI, the pharmaceutical company can understand how each decision will impact the overall outcome, which then automatically realigns planning.
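
For example, with invented numbers, a short calculation shows how setup and changeover time eat into effective runtime and push up cost per unit—the kind of lever the director adjusts in Power BI:

```python
# Illustrative production-management calculation: how setup and changeover time
# eat into effective throughput and raise cost per unit. All numbers are invented.

def cost_per_unit(shift_hours: float, setup_hours: float, changeover_hours: float,
                  units_per_runtime_hour: float, line_cost_per_hour: float) -> float:
    runtime_hours = shift_hours - setup_hours - changeover_hours
    units_produced = runtime_hours * units_per_runtime_hour
    total_cost = shift_hours * line_cost_per_hour  # the line costs money even while idle
    return total_cost / units_produced

# One extra hour of changeover time raises the per-unit cost noticeably.
print(cost_per_unit(8, 1.0, 0.5, 500, 1200))  # ~2.95 per unit
print(cost_per_unit(8, 1.0, 1.5, 500, 1200))  # ~3.49 per unit
```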

The overall planning system solution uses Power BI and ValQ to treat all financial assumptions as levers that can be adjusted to reveal deep insights into how each variation will impact the overall supply chain outcome. In this way, they simulate and understand how pulling one lever will financially impact every other factor, which then automatically re-plans each stage of the supply chain process.

End to end production cost visibility

Once the demand signal and what needs to be produced have been established, the finance director looks to the production requirements and associated costs. With Power BI dashboards, the company is able to see end-to-end analysis reports with insight into every step in the manufacturing process, where each step occurs, and what the cost structure is along the way.

With all of these elements calculated and displayed in a decomposition tree, the user sees an accumulated cost for the material to be sold in the Chain Cost Analysis dashboard. The decomposition tree capability makes simple and easy what would otherwise be an unwieldy amount of data to navigate. Each bar within the report shows the cost drivers, or the cost impact, of each step in the manufacturing process on a per-unit basis, which allows for end-to-end visibility over a complex process.
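
A minimal sketch of the underlying roll-up—per-unit cost added at each manufacturing step and accumulated into a chain cost—using hypothetical step names and costs:

```python
# Accumulating per-unit cost across manufacturing steps, as a chain cost analysis
# might roll it up. Step names and per-unit costs are hypothetical.
from itertools import accumulate

steps = [
    ("API production",  1.80),   # cost added per unit at this step
    ("Formulation",     0.95),
    ("Primary packing", 0.40),
    ("Repackaging",     0.25),
    ("Quality release", 0.15),
]

running = list(accumulate(cost for _, cost in steps))
for (name, cost), total in zip(steps, running):
    print(f"{name:<16} +{cost:5.2f}  accumulated {total:5.2f} per unit")
```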

Now, cost drivers and accumulated costs can be quickly and easily identified within the larger scope of the production process so that the company can focus on opportunities for improvement. As key areas are flagged for improvement within the multi-step production process, the user can access a deeper Step Cost Analysis report within the Power BI dashboard, which gives an embedded cost overview.

Power BI dashboard reports show the Chain Cost Analysis and Step Cost Analysis within the end-to-end manufacturing process.

This enables analysis of costs on the manufacturing floor—including labor and machinery usage—and how they translate into cost per unit. The user can then identify throughput: how many units per hour a given manufacturer is able to produce, and at what cost.

Within the Step Cost Analysis view, users access overhead cost details on a product-specific basis, which is critical for a large company that produces a large variety of products with variable production requirements.

The next level of analysis involves the materials needed to perform the production process: obtaining the IDs of those materials, determining quantities, and calculating the overall cost of materials.

The solution then looks across the business portfolio of materials for any given manufacturing site to understand where resources are contended, which products go through that site, and the overall cost of everything that specific plant touches. Ultimately, this delivers a holistic view of the manufacturing process rather than just one product or one production location.

Modeling, simulating, and forecasting with confidence

The company’s Power BI reporting, paired with ValQ’s embedded predictive modeling capabilities, enables economic assumptions to be woven into the model to arrive at the final cost per unit of a particular portfolio of products.

The simulation model starts with demand, filters down to supply, and then captures the spending required to support the production processes, which ultimately results in the final cost. A critical feature of ValQ is the ability to link all of these relationships together within the model.

For instance, if a user elects to scale demand up or down by geographic region, the model will automatically update the supply requirements in the network that supplies that specific region.
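
A hedged sketch of that linkage: scale demand in one region and recompute the units required from the plants that serve it. The region-to-plant mapping and allocation shares below are invented for illustration.

```python
# Toy linkage between regional demand and the supplying plant network.
# The region-to-plant mapping and allocation shares are invented for illustration.

REGION_PLANTS = {
    "EU": {"Plant A": 0.6, "Plant B": 0.4},   # share of EU volume each plant supplies
    "US": {"Plant C": 1.0},
}

def supply_requirements(base_demand: dict, scale: dict) -> dict:
    """Scale regional demand and roll it down to required units per plant."""
    required = {}
    for region, demand in base_demand.items():
        scaled = demand * scale.get(region, 1.0)
        for plant, share in REGION_PLANTS[region].items():
            required[plant] = required.get(plant, 0.0) + scaled * share
    return required

# Scaling EU demand up 15% automatically updates the EU plants' requirements.
print(supply_requirements({"EU": 200_000, "US": 150_000}, {"EU": 1.15}))
# {'Plant A': 138000.0, 'Plant B': 92000.0, 'Plant C': 150000.0}
```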

The company’s simulation and planning solution dashboard, which combines Power BI and ValQ capabilities.

Within this simulation, the model calculates all the implications of the demand change and provides the supply requirements needed to meet the modeled demand. It then automatically updates supply by stage and by plant, matching the required supply to the group of plants in the network that would support that region’s demand.


Simulation and planning capabilities allow users like the finance director to manipulate a massive amount of critical data with ease, drill down to the individual cost components that are associated with it and that drive it, and then model changes at that level.

Effectively, it starts with volume; the volume drives the cost, and the costs have their own independent variables that can also be adjusted to model the financial impact of a particular decision.

Planning with certainty and saving time

By applying technology to business processes, the pharmaceutical company has eliminated redundant data and redundant process-based work. This has saved a significant amount of time and effort, and it also allows them to perform more thorough analysis and provide quick feedback on the impact of that analysis.

The solution’s unique combination of Power BI and ValQ enables planning with certainty for the Fortune 500 pharmaceutical company based on a variety of economic assumptions.

Their simulation and planning tool also provides line of sight into the downstream economic impact of changes across organizations, which has reduced bias by enabling data-based decision-making and evaluation.

To read more Power BI customer stories like this one, visit the Microsoft customer stories portal.
