Enhancing Supplier Quality Management with Business Intelligence: A Strategic Solution for Manufacturers
- Tina Okonkwo
- Sep 16, 2024
- 9 min read
Updated: Sep 17, 2024
For manufacturers, maintaining consistent raw material quality is essential to ensuring operational efficiency and minimizing costs. However, many organizations face challenges related to supplier inconsistencies, which can result in material defects that disrupt production, cause downtime, and drive up expenses.
Managing supplier quality across multiple plants presents additional complexity. Without a centralized system to track and evaluate supplier performance, companies often lack the visibility required to identify trends, address performance gaps, and enhance overall supply chain efficiency. This lack of insight can hinder the ability to make data-driven decisions critical to improving supplier relationships and reducing production risks.
To address these challenges, a strategic solution is needed — one that consolidates supplier data, identifies areas of concern, and provides actionable insights. This article explores how business intelligence can be utilized to optimize supplier quality management, enabling business leaders to make informed, data-driven decisions that positively impact operations and the bottom line.
Problem Statement
Enterprise Manufacturers Ltd. had no consistent way to validate supplier performance across different plants. The lack of visibility into supplier quality resulted in frequent downtime and inconsistent production outputs. As the management team aimed to centralize this information, the following key questions were raised:
Which vendors/plants are causing the greatest defect quantity?
Which vendors/plants are responsible for the most downtime?
Are there particular combinations of materials and vendors that are consistently problematic?
How does the same vendor and material perform across different plants?
Are there any insights or patterns the team may have overlooked?
The business needed a clear, data-driven approach to visualize and answer these questions.
Analysis Approach
After thoroughly reviewing the requirements, it was crucial to approach the problem from the client’s perspective. The core need was evident:
“Our company wants to start monitoring the performance and quality of goods sent to us. We don’t have a procurement system in place, and while we lack complete data, we need insights to monitor and improve this activity quickly.”
Understanding this, I focused on the key insight driving the analysis — defective materials have a direct impact on production efficiency. This leads to production delays, increased downtime, and ultimately, financial losses. Recognizing this, my analysis was shaped with a clear objective — to quantify the financial implications of poor quality and identify the primary sources of these issues.
Instead of presenting stakeholders with abstract metrics like the total number of defective units or minutes of downtime, the analysis aimed to highlight the tangible financial impact of these problems. The goal was to make it clear how much money was being lost due to quality issues and to identify the main offenders across vendors, plants, and materials.
About the Dataset

The dataset for this analysis comprises:
Date: The date when the defect was recorded.
Vendor: The name or identifier of the supplier providing the material.
Plant Location: The geographic location or site of the plant where the material was used.
Category: The category or classification of the material or product.
Material Type: The specific type or kind of material being used.
Defect Type: The classification of the defect observed in the material.
Defect: A brief description or code for the type of defect.
Total Defect Qty: The total quantity of defective materials recorded.
Total Downtime Minutes: The total number of minutes of downtime caused by the defective material.
Methodology
Although the data required minimal cleaning, establishing an efficient data model was crucial to improving performance and optimizing the report generation process. The existing "one-table" data model was cumbersome, often resulting in longer-than-usual run times when generating reports. To address this, I focused on breaking out key dimensions into separate tables based on their attributes.
Segmentation into Dimension Tables: Upon inspecting the data, I evaluated which attributes could be separated into distinct dimension tables. By segmenting the data into manageable components, I created dimension tables with unique identifiers for categories like Vendor, Plant Location, Material Type, and Defect Type.
The goal was to create a model that would allow for more efficient querying and insightful reporting. Below is a summary of the steps I followed:
Duplicate the Query: For each Dimension table that needed to be created, I duplicated the original query.
Rename the Tables: Each query was renamed according to its role in the model: the Supplier Fact Table for central metrics like defect quantity and downtime minutes, and dimension tables for categories like Vendor, Plant Location, Defect, Defect Type, Category, and Material Type.
Select Relevant Columns: For each table, only the specific columns relevant to that dimension were retained. This step helped streamline the dataset and ensure that each table contained only the information necessary for analysis. I also generated primary keys using an index column for each dimension.
Building the Fact Table: Once the dimension tables were established, I used Power BI’s Merge Feature to integrate the keys back into the original dataset, forming a well-structured Fact Table.
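In the report itself these steps were done through the Power Query interface, but the same idea can be sketched in DAX. Below is a minimal, hypothetical example for the Vendor dimension, assuming a flat table named 'Supplier Fact' with a text column [Vendor]; RANKX stands in for Power Query's index column to generate a surrogate key.

```dax
-- Hypothetical DAX equivalent of the Power Query steps; table and column
-- names are assumptions, not necessarily the report's actual names.

-- Dimension table: distinct vendors plus a surrogate key.
Dim Vendor =
VAR Vendors =
    DISTINCT ( SELECTCOLUMNS ( 'Supplier Fact', "Vendor", 'Supplier Fact'[Vendor] ) )
RETURN
    ADDCOLUMNS (
        Vendors,
        "Vendor Key", RANKX ( Vendors, [Vendor], , ASC, DENSE )
    )

-- Calculated column on the fact table, playing the role of the merged foreign key.
Vendor Key =
LOOKUPVALUE (
    'Dim Vendor'[Vendor Key],
    'Dim Vendor'[Vendor], 'Supplier Fact'[Vendor]
)
```

The same pattern repeats for Plant Location, Material Type, and the other dimensions; once every key is in place, the text columns can be dropped from the fact table.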

Creating a Date Table: One essential part of the process was building a Date Table. In data analysis, having a dedicated Date Table is invaluable for creating time intelligence measures and applying time-related filters to reports. Using DAX (Data Analysis Expressions), I built an accurate Date Table that enabled effective time series analysis, making it easier to track trends over time.
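As an illustration, a minimal date table along these lines can be built with DAX. The exact columns in the report may differ; the date range here is simply derived from the fact table's Date column rather than hardcoded.

```dax
-- Minimal date table sketch; column names are illustrative.
Date Table =
ADDCOLUMNS (
    CALENDAR ( MIN ( 'Supplier Fact'[Date] ), MAX ( 'Supplier Fact'[Date] ) ),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "mmm" ),
    "Quarter", "Q" & ROUNDUP ( MONTH ( [Date] ) / 3, 0 )
)
```

Marking this table as the model's official date table (Table tools > Mark as date table) lets time-intelligence functions work reliably against it.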

Star Schema Data Model: With both my Dimension and Fact Tables in place, the next step was data modeling. I established relationships between tables by connecting the primary keys in the dimension tables to the foreign keys in the Fact Table. This resulted in a clean Star Schema Model. Organizing the data this way keeps queries efficient, so reports can be generated without performance lags.

Analysis
Once the data was in place, the next challenge was designing a dashboard that would deliver actionable insights at a glance. I knew I needed a balance between high-level summaries for executives and more detailed breakdowns for the procurement and production teams.
The main challenge for me was defining the right metrics, which were ultimately determined by the data I had available. The three key metrics I focused on were:
Downtime: The total hours lost due to defective materials.
Defect Quantity: The number of defective units received from suppliers.
Downtime Cost: The financial impact, where I assumed the company was losing $10 for every hour of downtime. (The measures behind these metrics are sketched below.)
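A minimal DAX sketch of how such measures might be written, assuming the fact table and column names from the dataset description; the $10 rate is the assumption stated above:

```dax
-- Core metrics; 'Supplier Fact' and its columns follow the dataset description.
Defect Quantity = SUM ( 'Supplier Fact'[Total Defect Qty] )

-- Source data is recorded in minutes; reporting is in hours.
Downtime Hours = DIVIDE ( SUM ( 'Supplier Fact'[Total Downtime Minutes] ), 60 )

-- Assumed flat rate of $10 per downtime hour.
Downtime Cost = [Downtime Hours] * 10
```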
Initially, I questioned whether presenting defect quantity would add value, since there was no direct comparative data. However, I realized that defect quantity and downtime do not always move together: in some instances the number of defects was lower while downtime was higher, and vice versa, which made showing both metrics worthwhile.
I also had to decide whether to present downtime in minutes or hours. I chose hours to align with the manufacturing industry’s practice of calculating costs on an hourly basis, ensuring consistency in the report.
This raised a further question: why would downtime still occur when defects were recorded as having no impact? I assumed other factors might be at play, such as machine failures, power outages, staffing shortages, or delays in receiving materials.

The idea behind the homepage was to provide a clean, professional interface with intuitive navigation for stakeholders. I added a navigation bar that lets users switch between the core areas of the report, each focusing on a different aspect of supplier quality and performance. The layout mimics modern website design, ensuring that even non-technical decision-makers can explore critical metrics effortlessly.
The focus here is on leveraging graphic design to create an intuitive, visually appealing dashboard that improves the user experience, enabling stakeholders to find key metrics at a glance.
Overview Page

The top section of this page provides a snapshot of the most critical metrics:
Defects: The total number of defects reported across all vendors, along with percentage changes compared to the previous month and year.
Downtime Hours: Total downtime caused by defective materials, which immediately highlights the operational impact.
Downtime Cost: This was a key insight — by setting a custom Downtime Cost per Hour (in this case, $10), we could directly calculate the financial impact of downtime, making the cost implications crystal clear for the C-suite. (A sketch of this adjustable rate follows below.)
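One way to make the hourly rate adjustable rather than hardcoded is a what-if parameter. This is a hypothetical sketch: the 0 to 100 range and the default of 10 are assumptions, and the report may implement the rate differently.

```dax
-- What-if parameter table for the hourly downtime rate; Power BI's
-- Modeling > New parameter dialog generates something similar.
Cost Per Hour = GENERATESERIES ( 0, 100, 1 )

-- Reads the slicer selection, defaulting to the $10 assumption.
Selected Cost Per Hour = SELECTEDVALUE ( 'Cost Per Hour'[Value], 10 )

-- Downtime cost driven by the selected rate.
Downtime Cost (Dynamic) = [Downtime Hours] * [Selected Cost Per Hour]
```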
A monthly trend chart was added where users could toggle between defect quantities and downtime hours to see how these metrics changed throughout the year. Another critical section of this page was the Worst Performers section, where I broke down the defect data by vendor, material, plant, and defect type.
I also displayed the ratio of impact, no-impact, and rejected defects, along with their distribution across the years.
Vendor Performance

For the vendor analysis, the goal was to highlight which suppliers were contributing the most to the company's production issues. I included a Top N Analysis feature in the dashboard, allowing users to focus on a specific number of vendors and display either the top or bottom performers based on a selected metric (in this case, Downtime Hours or Defect Quantity). This is particularly useful when trying to identify vendors that are either excelling or underperforming.
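A common DAX pattern for this kind of Top N toggle is a ranking measure that blanks out vendors beyond the selected cutoff. In the sketch below the cutoff is hardcoded for illustration, whereas in the report it would typically come from a slicer; 'Dim Vendor' is the assumed dimension table from the earlier sketch.

```dax
-- Rank each vendor by downtime across all vendors.
Vendor Rank =
RANKX ( ALL ( 'Dim Vendor'[Vendor] ), [Downtime Hours], , DESC, DENSE )

-- Show the metric only for vendors inside the Top N; others return blank
-- and drop out of the visual. N is hardcoded here for illustration.
Top N Downtime Hours =
VAR N = 10
RETURN
    IF ( [Vendor Rank] <= N, [Downtime Hours] )
```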
By categorizing vendors into high, medium, and low-risk groups based on their defect impact and downtime, I was also able to dynamically analyze the correlation between these factors. I defined three risk tiers: high-risk vendors with more than 800 hours of downtime, medium-risk vendors with between 400 and 800 hours, and low-risk vendors with less than 400 hours.
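Those tiers map naturally onto a SWITCH-based classification. A minimal sketch, reusing the assumed [Downtime Hours] measure and the thresholds above:

```dax
-- Risk tier per vendor, using the 800/400-hour thresholds from the analysis.
Vendor Risk Level =
VAR Hours = [Downtime Hours]
RETURN
    SWITCH (
        TRUE (),
        Hours > 800, "High Risk",
        Hours >= 400, "Medium Risk",
        "Low Risk"
    )
```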
Plant Performance


To analyze plant performance, I incorporated a Top N Analysis to identify the top or bottom performers based on key metrics. A map visualizes the geographic distribution of plants and their performance, and a table provides a detailed breakdown of impact, no-impact, and rejected defects, helping to pinpoint the most problematic locations. I also included a tooltip showing location-level metrics.
Material Performance

To analyze material performance, I provided a detailed breakdown of key metrics, including downtime by material type, defect type, and category. I visually highlighted Logistics and Mechanicals defects to emphasize their criticality. The increasing trend in Mechanicals defects indicates a potential need for equipment maintenance or process improvements, while the rising trend in Logistics defects suggests the need for supply chain optimization.
Downtime Impact

This page highlights the significant financial impact caused by defective materials, which could be alarming to management when they see the substantial dollar losses tied to downtime. The top section provides a breakdown of downtime costs per hour, highlighting specific days where costs were highest, with September and December showing the most significant spikes. The bar chart on the right visualizes the monthly downtime costs, allowing for quick identification of peak cost periods.
At the bottom, users can analyze the Vendor-Plant and Vendor-Material combinations to identify the best and worst performers.
Key Insights Summary
Rising Defects and Downtime: Defect quantities surged to 2.6 billion units, with a corresponding increase in downtime to 216,000 hours. At the assumed $10 per hour, that downtime translates to a financial impact of $2.16 million (216,000 hours × $10), underscoring the direct cost of defective materials on production efficiency.
Financial Impact: A crucial insight from the analysis is the direct correlation between downtime and financial losses. Assuming a downtime cost of $10 per hour, the company incurred over $2 million in downtime costs, with spikes in September and December indicating periods of severe disruption. This financial perspective provides stakeholders with a clear understanding of how supplier quality issues are affecting the bottom line.
Worst-Performing Vendors: Certain suppliers were identified as high-risk due to their consistent contribution to defects and downtime. Vendors like Avamm, Meejo, and Yombu emerged as the top offenders, contributing the highest defect quantities and significant downtime hours, reflecting the need for targeted supplier improvement initiatives.
Plant-Specific Issues: The analysis highlighted several plants, such as Hingham, Charles City, and Twin Rocks, as hotspots for defect-related downtime, with each plant reporting defect quantities nearing 100 million.
Defect Categories: Bad Seams was the most frequent recurring defect type, requiring attention from both suppliers and internal quality control.
Material Performance: Raw materials were the most problematic, causing significant downtime. The rising trends in mechanical and logistics defects suggest a need for process and supply chain improvements.
Vendor-Material/Plant Combinations: The analysis drilled down into specific vendor-material and vendor-plant combinations to identify the worst performers. For example, the vendor Abata, when supplying raw materials, caused significant downtime and costs, with 3 million defects and 249 downtime hours, resulting in nearly $2,500 in lost productivity.
Recommendations
Centralize Data and Automate Reporting: A centralized data management system is essential for tracking and evaluating supplier performance across multiple plants. Automating data collection and reporting will enhance visibility, reduce manual effort, and ensure consistency across operations.
Focus on Financial Impact: Beyond tracking defect quantities and downtime, manufacturers should emphasize the financial impact of production delays. This approach ensures that business leaders can directly see the cost implications of supplier quality issues, driving more urgent and informed decision-making.
Implement a Continuous Improvement Program: Based on the insights gathered from supplier performance data, manufacturers should establish a continuous improvement program focused on reducing downtime, improving material quality, and addressing recurring defects. This will drive supplier accountability and improve the overall efficiency of the supply chain.
Leverage Advanced Analytics: To further enhance supplier quality management, manufacturers can integrate advanced analytics and predictive models. By identifying patterns and trends early on, businesses can proactively address potential issues before they result in costly downtime or material waste.
Conclusion
Business intelligence tools can revolutionize supplier quality management by transforming raw data into insightful, actionable reports. For Enterprise Manufacturers Ltd., the implementation of a BI-driven approach revealed trends, inefficiencies, and potential risks that would have otherwise gone unnoticed. By quantifying the financial impact of defective materials, the company was able to identify and address the root causes of production inefficiencies.
Centralizing procurement and performance data through Power BI will enable the management team to make more informed decisions regarding vendor performance, plant operations, and material selection. The analysis not only highlighted key performance gaps but has also paved the way for long-term process improvements.
Interactive Power BI Report
Click HERE to view the interactive report.
Resources
Want to learn how I created this? There is a full walk-through on my YouTube channel.
Link to Original post here