Wednesday, July 19, 2017

When is Q not Quality?

The delivery of Quality while constrained by the Project Management Triangle is the holy grail of all change programmes. 
[Image: the Project Management Triangle. Source: Wikipedia]

The definition of Quality, however, is often complicated, especially when there is no physical product or clear monetary benefit.

For many change / transformation teams, the metric for a Programme is about delivery - how much can the team deliver in the next n years, despite not knowing where they are today. In this case, Q becomes Quantity.


The drive to meet a Quantity-led KPI often forces delivery teams to shoehorn in a solution by the stated deadline, whether or not that solution is scalable enough to meet enterprise needs.


Over time, the story is the same: 

  • functionality and usability suffer
  • manual workarounds start to appear to plug the gaps
  • the implemented work cannot be reused for other areas
  • data is a mess
  • the original team leaves
This results in management: 
  • pulling the plug
  • commissioning another project to replace it (go back to the paragraph above, rinse and repeat)
  • turning the manual processes into work packages and shipping them off to an outsourcing outfit - the corporate version of sweeping dirt under the carpet

Transitioning
So to the main point - how can an organisation transition from a Quantity-driven mission to one that focuses on Quality? 

Start by acknowledging that a Quantity-driven metric, while easy to measure, may not be the best place to start. Look at past deliveries and draw out the lessons learnt from them. Use these findings to define what Quality looks like for the organisation.

Ideas about Quality

Some ideas around a Quality deliverable can include:
  • Is the Product a working product? Did it deliver to scope?
  • Can the Product and/or its corresponding processes and outputs be reused? 
  • Is the carry-over defects list something that can be resolved within the next 6-12 months? 
  • Can it really be good if people have to handle it manually?
  • See also this article on how to identify bad data
And a bonus philosophical question: If the product does what it says on the box, but does not work due to bad data or other bad processes, did the project team do a good job?

How to get good Quality

  • Change the metrics. Stop making it about how many - make it about how well solutions align with the enterprise architecture.
  • No Enterprise Architecture? Perhaps that's where the investment should go first. Not enough budget? Put into play a business-aligned data architecture (by the way, if there is a strong Enterprise Architecture setup, many projects can be run using Agile methodologies).
  • Spend more time understanding the current state, along with other in-flight projects and their expected outcomes.
  • If time is an absolute constraint, ensure that an enterprise solution is planned to replace any workarounds, and invest in that solution post go-live. Senior management's reduced appetite to fix the problem after implementation is understandable but, like wet rot in a log cabin, it is not good for the organisation in the long term.
  • Keep communications open between the senior owner and the project team so they can make the necessary calls to change, hold or stop projects.
  • Create a culture where people are rewarded for calling out issues. Additional time or cost in the short term may mean tens or even hundreds of times the savings in the future.
  • Get people to cross-pollinate ideas. Get rid of silos, especially the Us vs Them mindset of IT vs Business (Users). This is best done when there are crossovers from Business into IT. And yes, there will be an investment cost in people, but it's well worth it.
Remember: the right KPIs drive the right behaviours, which will, in the long run, build a stronger foundation for the organisation.

Saturday, January 28, 2017

9 signs of data gone bad

In my early days as a junior auditor, I could never quite understand why the accountants of the companies we were auditing would stay late every time we were there. I initially thought it was because we were interrupting their day. However, when I had to ask for schedules (different ways of presenting or breaking down the financial data), I realised that they were staying late to create reports that were used only annually.

Dredging up historical data was painful for the head accountant, as they had to find out what had been posted to a certain account during the year, and why. Sometimes, errors were spotted and reclassifications had to be made. Nevertheless, our presence and incessant requests would put them in a somewhat grumpy mood.

When I first started working in financial reporting myself, I realised why the late nights happened. Every month, errors in the ledger would create a break against expectations, budget or forecast. I did not like this, as I did not enjoy going home at 9 or 10 pm for a few days every month end over something so unnecessary. So I started my own investigations in between the reporting days.

I started by teasing out the recurring issues that had to be fixed every single month. Over the next 8 months, I managed to find the roots of the problems and would speak to the owners to eradicate the errors. Eventually, I got the overtime down to zero, which resulted in the team being disbanded. No good deed goes unpunished.


In a later part of my career, I applied the same logical thinking to data testing. This resulted in seamless upgrades, as I managed to catch the bugs before implementation (bugs which, incidentally, were missed by the vendor's other clients). Sadly, shortly after I left the team, I heard that the governance and accountability around data change management were abandoned, swiftly followed by a severe deterioration in the quality of the data. That system fell into disrepute and was never the same again.

To me, data is like the raw materials or components in a supply chain: if any single piece is of poor quality, the final product will be defective. But since data is not life-threatening - unlike, say, a car - the same level of Quality Control is not applied.

Before fretting over data, it is important to understand if the data has gone bad. Things to consider include:
  1. Manual handling/workarounds
    Ideally, this should be close to Six Sigma (no more than 3.4 defects per million opportunities). But let's be realistic - is even 3 sigma achievable? That is roughly 67 defects per 1,000 opportunities. If not, there is a serious data problem at data capture and/or data handling (a worked example of the sigma arithmetic follows this list)
  2. Confusion
    Project / change teams are constantly arguing over where to get good data, or are finding it hard to get the data they need
  3. Multiple spends
    Requests for spend on data sourcing for various projects which are similar in nature
  4. Not reusing data
    Data owners consistently receiving requests for similar data feeds and/or fixes
  5. Data user silos
    This happens when Users do not share data and disparate reports are created. A whole new cottage industry is then set up to perform reconciliation activities. See #6
  6. Reconciliation cottage industries
    When the number of operations people working on reconciliations is growing consistently or exponentially, year-on-year
  7. Delivery does not match Data turnaround time
    If the data is "available" daily, but direct reports take more than 2 days to create, have a look to see whether it is a data quality or a process issue
  8. Upset users
    Teams who have to work overtime to produce reports tend to have lower job satisfaction, higher burnout rates and less energy to focus on what is most important: understanding the data
  9. Regulatory wrath
    Poor data will always get discovered, as the final products - the reports - fail to reconcile with each other.
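To make the sigma arithmetic in #1 concrete, here is a minimal Python sketch; the defect and record counts below are made up purely for illustration. It converts a count of manual workarounds into DPMO (defects per million opportunities) and the corresponding short-term sigma level:

    from statistics import NormalDist

    def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
        """Defects per million opportunities."""
        return defects / (units * opportunities_per_unit) * 1_000_000

    def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
        """Short-term sigma level, using the conventional 1.5-sigma shift."""
        return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

    # Hypothetical numbers: 40 manual corrections across 2,000 records,
    # each record offering 5 opportunities for a defect
    rate = dpmo(defects=40, units=2_000, opportunities_per_unit=5)
    print(f"{rate:,.0f} DPMO ~ {sigma_level(rate):.1f} sigma")
    # prints: 4,000 DPMO ~ 4.2 sigma. For reference, 3 sigma is about
    # 66,807 DPMO (the ~67 defects per 1,000 above) and 6 sigma is 3.4 DPMO

The 1.5-sigma shift is the standard Six Sigma convention for the drift between short-term and long-term performance; drop it if you prefer the unshifted figure.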
I'll write another article with my thoughts on how bad data can be prevented, and on how it can be repaired.