Truth in data starts with your data culture
Do you need a single source of truth (SSOT) for data? Yes. For a data-driven enterprise, an SSOT is a necessity: it feeds deliberate, accurate insights into business decision-making. Sixty-nine percent of CFOs agree that an SSOT is necessary to run an enterprise. However, it is not something to rush into, nor should you feel paralyzed by the lack of one.
There are several artifacts and processes that build the SSOT so that this asset can contribute to your organization’s decision-making culture. Identifying them makes it clear that an SSOT is not the goal but the result of a well-executed data journey and culture.
What is an SSOT?
Traditionally, “SSOT” is an IT term referring to a data storage practice of obtaining data from a single location. However, this is much more than a technology conversation. Before you can even design an SSOT strategy, there are necessary steps to be taken.
Let’s use cooking as an example. Making a fine gourmet meal is a process. You must have some skill, select fine ingredients, follow a specific method of preparation, make warranted recipe adjustments, and, of course, find a delectable way to present the finished dish. Working with data is no different. Before you can present your data to your organization as the “single source of truth,” there is a lot to consider. Understand that the truth is drawn from an aggregation of data artifacts that are themselves drawn from multiple sources. This aligns with Forbes.com contributor Brent Dykes’ definition of an SSOT: a data storage principle to always source a particular piece of information from one place. Before any of this can happen, you need to create a sound strategy for the data that will be pushed into this singular location; if you fail to do this, your SSOT initiative will fail too.
Five steps to SSOT success
There are five major stages to consider when establishing a data culture enabled with an SSOT. They are:
- Data source curation. Revealing your data sources is the first step. This typically takes the form of a workshop with stakeholders and data and technology leaders. The idea is to bring forth all the data sources in scope for the initiative and investigate their point of creation. The objective of this step is to identify not only the pertinent data points but also their true sources. Through a workshop, users gain insight into the processes behind the data they have been consuming and ascertain the point at which that data is ready for consumption. For example, you may have a database table named “Customer Table” housing company information such as names, addresses, and firmographic and geographic data that is sourced from multiple raw files. Discussions can reveal the sustainability of these sources, and users can collectively decide to improve or drop a source altogether.
- Authoritative data source identification. There is a need to identify “officially recognized data” that the business is confident in using. Certain groups with authority should be accountable for the data. These entities are groups of data owners or experts in the organization who develop or manage data for a specific business purpose. The data these entities create is referred to as authoritative data. The data may be created internally or externally; all of it must be validated for quality and accuracy to be considered authoritative. Answers to questions such as “At which point is the data ready for business consumption?,” “Are there data owners who are accountable for the data as it stands today and how it will evolve moving forward?,” and “Are the sources and aggregation protocols complete as expected?” are important to understanding which data groups should be considered authoritative sources for the SSOT. These concerns imply that data owners, creators, and providers need to review data policies and processes in order to declare these sources authoritative.
- Data stewardship. Data and data sources typically require preparation, including augmentation, standardization, validation, and enrichment. Once authoritative data sources are identified, designated members or a central group become responsible for the input and maintenance of these sources.
- Identify a trusted data source. This is a publisher of data, potentially drawing from multiple authoritative sources. It may be an internal organization or an external provider whose process of compiling or aggregating data has been vetted by the users of the data; their strengths, limitations, and uses are known and transparent. An example is D&B Optimizer, Dun & Bradstreet’s secure, cloud-based marketing solution. Optimizer supplies data through our sustainable, repeatable data stewardship services applied to your trusted data.
- Data architecture design and implementation. The four stages above should make clear that an SSOT is not merely a technological implementation. There are crucial ingredients to this proverbial gourmet meal: the data, its qualifications, its processes, its defined rules, and the parties involved in its creation and maintenance. All of these should be sorted out before any system design or selection. The goal here is to scale the stages above and foster the trusted data in a capable, secure, and accessible environment that will become the SSOT. Articulating the data flow, distribution, and access is an important aspect of this stage.
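To make the curation and validation steps above concrete, here is a minimal sketch of merging multiple raw feeds into a single curated customer table. All source names, fields, and the validation rule are hypothetical illustrations, not part of any specific product or the stages as defined above:

```python
# Hypothetical sketch: merge two raw customer feeds into one curated
# record set, rejecting rows that fail basic validation. Field names
# (customer_id, name, address) and feeds are illustrative only.

def validate(record):
    """A record is ready for consumption only if key fields are present."""
    return bool(record.get("name")) and bool(record.get("address"))

def curate(*sources):
    """Merge raw sources into a single table keyed by customer_id.
    Later sources win, and invalid records are rejected up front
    rather than silently stored."""
    table = {}
    for source in sources:
        for record in source:
            if validate(record):
                table[record["customer_id"]] = record
    return table

crm_feed = [
    {"customer_id": 1, "name": "Acme Co", "address": "1 Main St"},
    {"customer_id": 2, "name": "", "address": "2 Oak Ave"},  # fails validation
]
billing_feed = [
    {"customer_id": 1, "name": "Acme Co", "address": "1 Main Street"},  # newer address
]

customer_table = curate(crm_feed, billing_feed)
```

The point of the sketch is the ordering: validation and source precedence are decided before anything lands in the curated table, which mirrors sorting out data, rules, and ownership before designing the system itself.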
SSOT goes beyond technology
Traditionally, SSOT is an IT term or responsibility, but many SSOT initiatives have failed to deliver much-needed digital assets because of a predominant focus on technology. Yes, technology is very much part of the equation; that is not being discounted. However, the data, subject matter experts, business logic, planning, and governance are equally important to successfully deploying your organization’s SSOT. It boils down to this: The single source of truth is merely the result of a well-designed and well-executed data strategy, not the goal. The goal is to provide data that is actionable, dependable, and accurate at scale.