Originally published on LinkedIn.
Companies have data in many places. And many companies do not know what data they have, where it is stored, who and what has access to it, whether the data can be trusted, or how to organize it in a timely manner into decision criteria for leadership teams.
The easiest way to know if what I'm saying is the truth is to ask someone on your technical staff to provide you with an asset and access inventory.
Give them one business day. Their reaction will reveal your truth.
Running a company minimally requires two things: knowing where you want to go and having access to timely, trustworthy data that will guide your journey. This article discusses the data aspect only.
And as you may already hope, suspect or know, addressing unsecured, unmanaged, disparate applications, data and permissions is a solvable problem. Achieving one view into your company is also solvable. Let's look at the plan.
Inventory all software applications and data repositories inside and outside your company, as well as anything interacting with or exchanging data with your applications and repositories.
What is the technology collecting, managing and editing your data? Where is it hosted? By whom? Is it good, questionable or corrupt data? Who and what has access to it? What are they doing to the data? Who is managing the security and sanctity of the data? How do you know you can trust the data? Is the data current, and how frequently is it refreshed?
Is the data managed via role-based permissions, or is it wide open for too many people and systems to manipulate, extract and exploit? Is it direct-connect? Copy-paste? Batch-uploads? API-accessible? Is it secured while at rest? Is it secured while in transit?
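To make this concrete, here is a minimal sketch in Python of what a single asset-and-access inventory record might capture. The AssetRecord class and its field names are illustrative assumptions, not a standard; the point is that every question above becomes a field someone on your staff must be able to fill in.

```python
# A minimal, illustrative asset-and-access inventory record.
# The class and field names are assumptions for the sake of example.
from dataclasses import dataclass
from typing import List

@dataclass
class AssetRecord:
    name: str                      # application or data repository
    hosted_by: str                 # vendor, cloud account, or internal team
    data_quality: str              # "good", "questionable", or "corrupt"
    actors_with_access: List[str]  # people, services, and third parties
    access_method: str             # "api", "direct-connect", "batch-upload", "copy-paste"
    encrypted_at_rest: bool
    encrypted_in_transit: bool
    refresh_frequency: str         # how current the data is expected to be

def inventory_gaps(asset: AssetRecord) -> List[str]:
    """Flag the gaps the questions above are asking someone to close."""
    gaps = []
    if not asset.actors_with_access:
        gaps.append(f"{asset.name}: no record of who or what has access")
    if not asset.encrypted_at_rest:
        gaps.append(f"{asset.name}: data is not secured while at rest")
    if not asset.encrypted_in_transit:
        gaps.append(f"{asset.name}: data is not secured while in transit")
    if asset.refresh_frequency == "unknown":
        gaps.append(f"{asset.name}: no one knows how current the data is")
    return gaps

# The one-business-day exercise, in miniature.
crm = AssetRecord("legacy-crm", "unknown vendor", "questionable",
                  [], "copy-paste", False, True, "unknown")
for gap in inventory_gaps(crm):
    print(gap)
```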
Think your company isn't likely to be attacked, corrupted, ransomed or otherwise exploited? Consider your brand value, consumers, privacy laws and bad company press. Do people trust your brand today? Will they after a breach?
When data originates from multiple data sources, the structure of the data is usually non-uniform. The first step is to understand the current structure and state of all data at the origination point.
The second step is to determine to what Common Data Format (CDF) all data will be funneled and/or otherwise re-organized. In other words, if your company's growth strategy has been through Mergers and Acquisitions, you likely have many data stores with similar types of data, but with different states of sanity. If you want one view across all of these data stores, words must have the same meaning for all instances of all data. Establishing the same meaning for all similar instances is "normalization" or "establishing a Common Data Format."
Many to one.
Only after there exists a common data format are you able to see, understand and make decisions that confidently and consistently take into consideration all parts of the company.
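As one illustration of "many to one," here is a minimal sketch assuming two acquired CRM systems with invented field names. Your actual CDF and the mappings into it will be whatever your teams define; the mechanics are what matter.

```python
# A minimal sketch of normalizing two acquired companies' records into one
# Common Data Format (CDF). Source field names and the target schema are
# invented for illustration.

CDF_FIELDS = ("customer_id", "full_name", "email")

# Each acquired system calls the same concepts by different names.
FIELD_MAPS = {
    "acquired_crm_a": {"cust_no": "customer_id", "name": "full_name", "mail": "email"},
    "acquired_crm_b": {"id": "customer_id", "contact_name": "full_name", "email_addr": "email"},
}

def normalize(source: str, record: dict) -> dict:
    """Map one source record onto the CDF: many shapes in, one shape out."""
    mapping = FIELD_MAPS[source]
    normalized = {cdf_field: None for cdf_field in CDF_FIELDS}
    for source_field, value in record.items():
        if source_field in mapping:
            normalized[mapping[source_field]] = value
    return normalized

print(normalize("acquired_crm_a", {"cust_no": "A-17", "name": "Pat Q", "mail": "pat@example.com"}))
print(normalize("acquired_crm_b", {"id": "B-09", "contact_name": "Sam R", "email_addr": "sam@example.com"}))
# Both print records with the same keys: customer_id, full_name, email.
```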
When you understand all places from which data originates and have a CDF, your teams are then able to write predictable, repeatable and auditable methods of extracting, normalizing and putting data into your new, single source of truth.
To be clear, the methods of extracting, normalizing and putting data must be predictable, repeatable and auditable. And the structure into which all data is put is itself the CDF. Anything less and you will simply be creating a new mess that must be managed on top of your existing ecosystem, whatever its state.
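Here is a minimal sketch of what "predictable, repeatable and auditable" can mean in practice: each extract, normalize and put step is wrapped so every run leaves the same kind of trail. The audit record shape and the stand-in steps are assumptions for illustration, not a prescribed implementation.

```python
# A minimal sketch of an auditable load into the single source of truth.
# The audit record fields and the stand-in steps are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone

AUDIT_LOG = []   # in practice, an append-only store

def audited(step_name):
    """Wrap a pipeline step so every execution is recorded and verifiable."""
    def wrap(fn):
        def run(payload):
            result = fn(payload)
            AUDIT_LOG.append({
                "step": step_name,
                "at": datetime.now(timezone.utc).isoformat(),
                "input_hash": hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest(),
                "output_hash": hashlib.sha256(json.dumps(result, sort_keys=True).encode()).hexdigest(),
            })
            return result
        return run
    return wrap

@audited("extract")
def extract(payload):
    return payload  # stand-in for pulling from a source system

@audited("normalize")
def normalize(payload):
    return {"customer_id": payload["cust_no"], "email": payload["mail"]}  # map to the CDF

@audited("put")
def put(payload):
    return payload  # stand-in for writing to the single source of truth

put(normalize(extract({"cust_no": "A-17", "mail": "pat@example.com"})))
print(json.dumps(AUDIT_LOG, indent=2))  # same steps, same trail, every run
```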
Now that you've made the effort to ensure all data, from all locations, is secured and normalized, protect it. This means there must exist a predictable, repeatable and auditable manner by which applications, systems and companies access your data. Notice I didn't say people.
To access data from the single source of truth, there must exist a predictable, repeatable and auditable set of actors, permissions and activities. If there is variability in actors, permissions and activities, it will no longer be a single source of truth.
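As one illustrative sketch, assuming a simple in-code policy rather than any particular access-control product, this is what a predictable, repeatable and auditable access check can look like: the actors, permissions and activities are fixed in one place, and every request is logged against it.

```python
# A minimal sketch of rule-based access to the single source of truth.
# The actors, roles, and permitted activities below are invented for illustration.

POLICY = {
    # actor (a system or application, not a person) -> activities it may perform
    "reporting-service":  {"read"},
    "streaming-pipeline": {"read"},
    "etl-loader":         {"write"},
}

ACCESS_LOG = []

def request_access(actor: str, activity: str) -> bool:
    """No entry in the policy, no access. Every request is recorded."""
    allowed = activity in POLICY.get(actor, set())
    ACCESS_LOG.append({"actor": actor, "activity": activity, "allowed": allowed})
    return allowed

assert request_access("reporting-service", "read") is True
assert request_access("reporting-service", "write") is False   # reports never mutate truth
assert request_access("ceos-nephew-laptop", "read") is False   # not in the policy, not special
```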
Require anyone or anything that wants access to your data to follow your rules. Non-negotiable. This includes people in Mensa, people with twenty years of tenure who have been there since the company started, the CEO's nephew and your mom.
Your single source of truth is special. No one who wants access to the data is special. Despite what their mom told them when they were young.
Attach reporting solutions. Attach streaming solutions. Attach elastic search. Attach dashboards. Follow the rules. Enjoy peace.
Now you can trust that your data has integrity. You can trust it is secure. You can trust your data is predictable, repeatable and auditable. You can trust your company has one message.
And you can trust that all of your applications, repositories, data management and security behaviors, actors, hosting solutions and reports are something upon which you can bank your company's reputation.
Trility Consulting's Matthew D Edwards joined Bâton Global for a podcast series focused on Exploring Human-Centric Transformation and how leveraging data can simplify and automate processes for team members, stakeholders and, ultimately, customers while protecting their most valuable asset: people.