
Applying the Principles of Lean IT to Data Management – Part 3


In the final installment of this 3-part series, we look at how the practices and principles of Lean IT provide the approach and tools needed to eliminate data quality issues at the source. See Part 1 here and Part 2 here.

For a practical example of Lean IT and data management, see the webinar Lean IT: Driving SAP Continual Process Improvement.

Lean IT as Part of the Solution

In lean, all improvement starts with the people. The first step is to engage with your employees by leading with respect, creating a meaningful challenge, and fostering a workplace where excellence is the norm. When you engage with people respectfully, they respond in ways you never could have dreamed possible.

Bad data entered into high-performance IT systems produces what one CEO described as "Crap at the speed of light!" Even if we successfully create a lean application development environment, bad data will efficiently yield non-actionable information and, over time, people will disengage and lose trust in their leaders, as well as in the ERP system. The resourcefulness and genius of most workforces lies dormant beneath layers of distrust, uncertainty, and unreliability in the systems they are forced to contend with.

You engage with people because they really are your most valuable resource; in fact, people are your only appreciating resource! It is those who are closest to the work (as well as closest to the problems) who most deeply understand the challenges and are best positioned to develop effective countermeasures.

In lean, we engage those closest to the work to define a target state, determine our current status, measure the gap, and identify roadblocks and obstacles. We run a series of experiments to strengthen our understanding of cause and effect and to validate which countermeasures effectively move the dial and achieve measurable results. This cycle of learning and discovery (referred to as PDCA – Plan-Do-Check-Adjust) is repeated until it becomes embedded in daily work routines. Although it seems straightforward and relatively simple, it is wickedly difficult to accomplish!

Figure 2 illustrates the PDCA cycle – an educational cycle of trial and discovery. The initial step of the process is Plan, which starts with going and seeing where the work is performed in order to fully understand the current situation from the perspective of customers, end users, and those doing the work. Emphasis is placed on facts (data) over opinion, and on discovering potential obstacles that threaten the attainment of a target state (where we need to be in terms of quality, delivery time, productivity, cost, and customer satisfaction). As we tighten our grasp of the situation, we develop experiments to validate our understanding.

In the Do phase, we run the experiment and compare outcomes with our expectations. Again, we place an emphasis on measuring what matters most to customers, end users and those doing the work. In Check, we reflect on the results of the experiment and learn. If the results are what we anticipated, we have validated our understanding of cause and effect and demonstrated that the countermeasures intended to spark improvements actually work! If the results show a gap still exists, then we still do not fully understand, and need to consider alternative approaches.

Finally, in the Adjust phase, we determine our next step based on what we learned during Check. If we have identified a working solution, we make it part of the standard work process. On the other hand, if we have not discovered an approach that gets us to where we need to be, we enter another cycle of learning and discovery by repeating the PDCA sequence. The cycle is repeated until we attain the required results.
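The PDCA sequence described above can be sketched as a simple control loop. The Python sketch below is purely illustrative, not anything from the book or from SAP: the metric, target, and countermeasure names are all hypothetical stand-ins for whatever your team actually measures and tries.

```python
# Minimal PDCA loop sketch. Each cycle: Plan (measure the gap to the target
# state), Do (run one countermeasure as an experiment), Check (did the metric
# actually improve?), Adjust (standardize what worked, or rethink the plan).

def pdca(state, target, countermeasures, metric, max_cycles=10):
    """Repeat Plan-Do-Check-Adjust until metric(state) reaches the target."""
    adopted = []                                  # validated standard work
    for _ in range(max_cycles):
        gap = metric(state) - target              # Plan: grasp the situation
        if gap <= 0:
            break                                 # target state attained
        for name, countermeasure in countermeasures:
            trial = countermeasure(state)         # Do: run the experiment
            if metric(trial) < metric(state):     # Check: did it move the dial?
                state = trial                     # Adjust: standardize it
                adopted.append(name)
                break
        else:
            break  # no countermeasure helped; we don't yet understand cause/effect
    return state, adopted
```

Here `state` might be something as simple as a measured defect rate in order entry, and each countermeasure a change (training, input validation) whose effect is checked against expectations before it becomes standard work.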

The power of PDCA or lean problem solving is in its scalability, neutrality, and methodology. PDCA works well with relatively simple challenges as well as complex enterprise-level problems. It is domain neutral and functions effectively in any discipline, be it IT, supply chain, finance, manufacturing, service, healthcare, science, or other fields. The structured methodology supports a way of thinking and mental framework of how we approach problems that drive higher quality analysis and more effective responses.

So where does technology fit into lean IT? The answer may surprise you: it fits in last! First we engage the people and provide the tools, training, and support they need to drive improvement to core business processes. Those people in turn focus on the process to enhance quality at the source and the flow of value to end users and customers. Then, and only then, do we implement new technology and/or reconfigure our current systems to enable and automate redesigned processes that focus primarily on effectiveness.

This is not easy work, but the payoff creates new levels of performance, a cultural shift, and a competitive advantage that are almost impossible to match. Happy people do great work. When people experience the positive feeling of solving problems at the root cause level (rather than repeatedly working around chronic problems), they experience radical shifts in performance, personal growth, and teamwork. When they have the tools and information they need to succeed, that’s where the magic happens. When it comes to technology, it starts with quality at the source and that means accurate, complete, and timely data.

If you want to succeed, data quality is not an option, it’s a necessity. The quality of your data will determine the quality of your information, which plays the defining role in the quality of your work. Lead with respect by creating work systems and processes that produce great work!

So Now What?

So what can you do to effectively address data quality problems and leverage lean IT to extract greater value from SAP?

Learn: start your lean IT journey by reading our first book, Lean IT: Enabling and Sustaining Your Lean Transformation, and by pre-ordering the "How To" book The Lean IT Field Guide (available November 2015). Pre-order the book here.

Connect: engage with your employees by going to where the work is done to better understand the challenges they are experiencing and to see things from their perspective.

Involve: introduce lean process improvements by applying the PDCA cycle of experimentation.

Automate: only after you have engaged your people and improved core work processes, should you consider new technology and/or reconfiguration to automate and streamline newly improved processes.

As we discussed earlier, lean IT is all about engaging people, improving processes, and leveraging technology – always in that order. The sooner you begin to address the issue of data quality, the sooner you begin to realize the impact of high-quality actionable information.

Applying the Principles of Lean IT to Data Management – Part 2


In Part 2, we take a look at how Lean IT can be applied to make significant improvements to data quality, reducing the wasteful rework associated with incomplete and inaccurate information. See Part 1 here.

For a practical example of Lean IT and data management, see the webinar Lean IT: Driving SAP Continual Process Improvement.

How Lean IT Can Help

Lean IT is a framework for understanding information and technology in a new light by applying the principles and methods of lean. Lean is all about creating excellence in the workplace, in the work, and in the quality of information. Bad data produces non-actionable information, which leads to errors in judgment and behavior.

If there is a chronic lack of high-quality information, it's impossible to sustain a smooth flow of work because fixing mistakes drains precious energy and mental capacity from your employees.

The illustration above is the principles pyramid developed while writing the book Lean IT. At its core, lean IT is about leveraging technology to deliver customer value with the least amount of effort required. In order to achieve this, we remove all unnecessary effort – ambiguity of process, avoidable mistakes, self-inflicted variation, corrections, rework, delays, and extra steps. For the purpose of this discussion, let’s focus on the principle of Quality at the Source.

Quality at the source means performing work right the first time, every time. Imperfect work (work that is incomplete or inaccurate) is never sent forward to the next operation, end users, or customers. We measure quality in terms of percentage of work that is accurate, complete, timely, and accessible (as defined by end users and customers). The critical nature of the quality of information is well known. We’re all familiar with the truism “Garbage in, garbage out!” Without quality information, the results will always be inferior and require heroic efforts and creative rework to meet customer expectations.
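The measurement described above — the percentage of work that is accurate, complete, and timely, as defined by end users — can be made concrete with a small check over master-data records. This Python sketch is a hypothetical illustration only: the field names and plausibility rules are invented examples, not an SAP schema or any standard rule set.

```python
from datetime import date

# "Quality at the source" sketch: score each data record against the
# completeness, accuracy (plausibility), and timeliness criteria described
# in the text. All field names and thresholds are illustrative assumptions.

REQUIRED_FIELDS = ("material_id", "description", "unit_price", "updated_on")

def is_actionable(record, today, max_age_days=90):
    """A record counts as actionable only if complete, plausible, and fresh."""
    complete = all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)
    accurate = complete and record["unit_price"] > 0            # basic plausibility
    timely = complete and (today - record["updated_on"]).days <= max_age_days
    return complete and accurate and timely

def quality_rate(records, today):
    """Percentage of records passing every check — quality at the source."""
    if not records:
        return 100.0
    good = sum(is_actionable(r, today) for r in records)
    return 100.0 * good / len(records)
```

Tracking a rate like this at the point of data entry, rather than downstream, is what keeps imperfect work from ever being sent forward.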

The essential ingredient of quality information is quality data. It is amazing that many, if not most, organizations focus the majority of their effort and resources on technology while paying very little attention to the quality of the data within the system. You can have state-of-the-art hardware, application stack, network, connectivity, and security, but if you have data issues, the result is non-actionable information reaching end users, along with confusion, frustration, errors, workarounds, and the pain that accompanies them. This wasted, non-value-added effort and annoyance only gets worse over time. Why? Because once people no longer trust the system, they adapt skillfully and perform their job outside of the system to get their work done!

The Promise of ERP

Today, complex business enterprises are connected and managed through integrated information systems like SAP. Enterprise Resource Planning (ERP) has been around since the 1980s and is both a blessing and a curse. The promise of an integrated system with a cohesive database that creates a single rock-solid record of "the truth" has been the vision of ERP systems since their inception. ERP is the ultimate connective tissue of the enterprise, enabling disparate silos of the organization to work towards common objectives, access information, maintain accurate records, and share information. Imagine trying to run a modern corporation without technology!

When actionable information is missing or unavailable, it is often hard to detect. People tend to rely on what the system tells them and often only discover that information is inaccurate and incomplete after the fact by hearing about a problem from downstream operations, end users, or worst-case scenario – their customers.

Data Quality, ERP, and Respect for People

In lean, respect for people refers to management’s responsibility to create a work environment where people are positioned to be successful and grow to their full capabilities. It means creating a workplace where everyone has the tools, processes, and information they need to do great work. It also means creating a culture where uncertainty and ambiguity are actively eradicated, while transparency and trust are intentionally fostered. Knowing your ERP system is housing bad data and not doing anything about it is the antithesis of respect for people, and sends a clear message to all that management places a higher priority on other things.

If poor information quality becomes a chronic issue within ERP, users lose trust in the system and rely on ingenuity to get what they need to complete their work. Spreadsheets, stand-alone databases, pay-per user cloud-based apps, in-house solutions developed outside of IT, and other inventive efforts by users add new layers of anonymous technology in the shadows of the ERP system. This creates a black market of information beyond the integration, security, and scrutiny of the IT department! The technical debt associated with shadow IT systems accumulates over time, crippling an organization’s ability to respond to complex performance issues, and blocks straightforward upgrades to system functionality.

As the sanctioned ERP system goes through controlled releases of functional modules and upgrades, informal, unauthorized shadow IT systems proliferate spontaneously at a very rapid pace driven by user needs, the bureaucracy of IT, and the distrust of ERP.

In my next post, we'll go more deeply into how Lean IT effectively comes to grips with bad data by eliminating its creation at the origin.

Applying the Principles of Lean IT to Data Management – Part 1


The Tyranny of Bad Data 

We've all experienced the frustration and pain associated with bad data: either we know that the information obtained from IT systems is inaccurate or incomplete (and endure the non-value-added work that comes with it), or we are unaware that the information is based on bad data and suffer the cascading impact of taking the wrong course of action due to misinformation. It is incredible that so much is invested in enterprise technology solutions while so little attention is devoted to ensuring that high-quality data is the sole source of system information. Only a handful of companies have discovered how to exploit the power of lean IT to shorten time-to-value in development cycles while assuring data integrity.

The ultimate purpose of information and technology is to enable people to perform great work as effectively and efficiently as possible. From a lean IT perspective, we want to leverage technology to empower people to do excellent work with the least amount of required effort. Technology has the capability to gather, store, organize, manipulate, manage, calculate, analyze, summarize, format, and report limitless amounts of data in order to create actionable information. Technology that efficiently delivers bad information only serves to enable waste, delays, and poor results. For our purposes, information needs to possess the following attributes to be deemed actionable: accurate, timely, complete, and accessible.

For a practical example of Lean IT and data management, see the webinar Lean IT: Driving SAP Continual Process Improvement.

When bad data happens to good people

Donald Rumsfeld, former US Secretary of Defense, infamously said: "You go to war with the army you have, not the army you might want or wish to have at a later time." In the same way, we do business with the data quality we have, not the data quality we might want! But what happens when highly effective technology processes inaccurate, incomplete, and out-of-date data — when bad data happens to good people?

Scenario #1 – We know we don’t know…

If people recognize that the information they are receiving is not actionable, they are forced to choose from damaging alternatives like adjusting their course of action based on years of experience, assumptions, and perceived understanding. Some develop rules of thumb based on personal knowledge, while others devise creative workarounds to obtain the information they require when system information is suspect and unreliable.

Unfortunately, none of these countermeasures confronts the root cause of the problem or guarantees a timely and accurate business outcome. Undocumented workarounds and tribal knowledge of what to do when the system delivers bad information may work in one instance and fail in another, and all of these actions are forms of guessing that are impossible to scale and sustain.

Scenario #2 – We don’t know we don’t know…

When people rely on information from IT systems, assuming accuracy, timeliness, and completeness, and that information is actually compromised, things get much worse. Bad data generates bad information, prompting misinformed decisions, mistakes, oversights, and the creation of still more bad data! The compounding impact of bad data and non-actionable information is a frustrating, downward cycle of errors, corrections, rework, and delays that forces people to resort to heroic efforts to deliver mediocre results. Customers instantly notice the lack of service, timeliness, and quality. As employees become more aware of data problems, they begin to lose trust in the system and resort to the workarounds described in scenario #1, which may feel better and attain some results, but do not materially improve the situation. In fact, the more exceptions and workarounds to the way work is conducted, the more variability the customer experiences in service levels, quality, and delivery time!

In my next post, we'll explore how Lean IT addresses the issue of bad data at a level that creates measurable, sustainable change for the better.