Don’t let data skeletons in silos ruin your Halloween – or the rest of your year. Make a commitment to a formal data governance program to take the scare out of your data integration and management initiatives.
Possessiveness and politics are among the “usual suspects” for killing data governance best practices. Isn’t it funny how regular, nice, family-oriented people can put on a business-attire costume and transform into data-hoarding goblins?
In most organizations, the battle for competitive supremacy is not just an external thing. Most business users and managers want to maintain control of their data for a variety of reasons. The most common ones include:
- a desire to gain leverage with upper management for more resources;
- an earnest belief that they are the best stewards of their data;
- a preference for secrecy over sharing as a form of job security; or
- a lack of trust in IT teams to meet data requests in a timely fashion.
Whatever the reason, data silos haunt governance programs. They are perpetuated by business units or departments that build or buy their own apps to support a specific business initiative. The resulting technology fragmentation and divergent semantic definitions turn data integration and management into a graveyard where nobody knows where the bodies are buried.
Silos prevent an enterprise-wide view of business-critical information. They make discovery impossible and block efforts to map data to sources, applications and users with any accuracy. In the face of rapidly growing unstructured and semi-structured data, plans to integrate and analyze newer big data sources and formats are DOA, because the enterprise cannot efficiently manage even its existing data – mostly structured and howling to break free of data marts and warehouses. The enterprise is also exposed to the nightmare of undue risks hiding out in these departmental or individual end-user silos.
The results can include missed business opportunities, hellacious compliance violations and blood-draining resource waste – all of which murder operational intelligence and competitiveness.
Change the Habits, Change the Outcomes
Many data owners fall into the habit of making decisions as if their business unit or division were in a laboratory test tube. Without proper data governance practices, they lose sight of the value of data on a macro scale. Bad or incomplete data prevents the enterprise from achieving ROI and risk management objectives.
Want to become a data-driven enterprise? Then make data governance a strategic habit. Its importance, and the directive to practice it, must begin with a commitment from senior management. To instill the habit cycle, the program should provide basic training that does not consume extensive time, and it must include rewards and penalties to ensure adherence throughout the organization – from business unit managers to data entry personnel.
Business users should be involved from the outset in order to define the use cases and objectives the data will serve. These can include maintaining accurate patient or customer records, ensuring the consistency of reference data or adhering to GRC (governance, risk and compliance) requirements.
A formal data governance program establishes procedural and cultural best practices for defining data elements across different user and application silos. It should be overseen by a data governance council composed of business unit leaders with subject matter knowledge and IT staff who bring technology expertise. The council oversees all aspects of master data, including:
- identifying the data elements;
- creating consistent definitions for use across the enterprise;
- establishing rules (and rewards) for data entry, maintenance and changes to the master data; and
- setting data retention and auditing guidelines.
Once existing systems have been audited and inventoried for data assets, the application and data owners identified, and the uses and management of the data agreed upon, integration can begin. Each data element must have a name, a structural format and a definition, all of which are documented within a core metadata repository. Since master data is used by multiple applications, errors in the “golden” file propagate to every app that uses it. So data must be matched and cleansed against those standards to ensure quality and integrity.
Quality data feeds better assumptions that go into model building and validation. Companies can deploy self-service data discovery and visualization tools with more confidence that they will improve analytics and decision outcomes. Users will have more confidence in the quality of the larger data sets they are working with, which leads to more associations and relationships. Better data enables apps to be the true life blood of the enterprise.
So yes, everyone wants to do more analytics, especially as big data seeps into the organization. But you bloody well better make sure the quality of the data is trustworthy, because bad data has a way of sneaking up on you when you least expect it. And you don’t want bad decision outcomes to cause heads to roll.