The Qualytics 8 – Timeliness

We spent a lot of time dealing with data quality issues before we decided to take a stand and build Qualytics. Drawing on that experience, we identified and defined the Qualytics 8: the key dimensions that must be addressed for a trustworthy data ecosystem. We believe that to achieve comprehensive data quality, data should be assessed across these 8 main categories.

The Qualytics 8

Qualytics uses these 8 fundamental categories to assess data quality.

  • Completeness: required fields are fully populated
  • Coverage: availability and uniqueness of expected records
  • Conformity: alignment of the content to the required standards, schemas, and formats
  • Consistency: the value is the same across all datastores within the organization
  • Precision: your data is at the resolution that is expected; how tightly can you define your data?
  • Timeliness: data is available when expected
  • Volumetrics: data has the same size and shape across similar cycles
  • Accuracy: your data represents the real-world values it is expected to model

Timeliness is a measure of how often data is available when it’s expected. It can be calculated as the time difference between when information should be available and when it actually becomes available. Informed business decisions depend on consistent and timely information. Critical measures of data quality therefore include tests specifying how quickly data must be propagated, along with compliance with other timeliness constraints such as periodic availability.
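This calculation can be sketched in a few lines of code. The sketch below is illustrative only; the feed name, timestamps, and SLA are hypothetical.

```python
from datetime import datetime, timedelta

def timeliness_lag(expected_at: datetime, available_at: datetime) -> timedelta:
    """Lag between when data should be available and when it actually is.

    A lag of zero or less means the data arrived on time.
    """
    return available_at - expected_at

# Hypothetical example: a nightly feed due at 02:00 lands at 02:45.
expected = datetime(2024, 1, 15, 2, 0)
actual = datetime(2024, 1, 15, 2, 45)
lag = timeliness_lag(expected, actual)
print(lag)                  # 0:45:00
print(lag <= timedelta(0))  # False -> the feed missed its expected time
```

In practice the "expected" timestamp would come from a schedule or service-level agreement, and the "available" timestamp from the datastore itself.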

For example:

Good (timely) data:

  • A customer places an online order, and the seller has access to that information in accordance with its supply chain timeline.
  • A company must decide which location to allocate more resources to. After analyzing the data across locations, it allocates resources to the location with the most customer records. Because all the records were available for analysis, the company makes the right call.

Bad (untimely) data:

  • A customer buys a product, but the product doesn’t arrive in time because the transaction record wasn’t available until after the guaranteed delivery date.
  • Thinking all the data was ready, a data scientist builds a model based on customer records, when in fact some records were not yet available. The model turns out to be inaccurate because the missing records would have provided greater insight.

Ensure Timeliness

If all data is available, then the data is timely. The question is: how does one ensure that all data is available? 

Analyzing data both at rest and in flight is key. If a company can only analyze data at rest, it misses the in-flight data that could contain valuable information. Being able to analyze data without waiting for it to arrive at another location reduces wait time and increases accessibility. As a result, all data becomes available data, and companies can make more informed business decisions.

Importance and Industry Insight

Without timely information, businesses can easily make wrong decisions, and those decisions can cost organizations time and money and damage their reputation. The bottom line is that if you’re analyzing data, you need all of it to be available. Along with more informed decision making, addressing timeliness is a step toward gaining confidence in overall data quality. Although there are a number of data quality characteristics to consider, addressing them together lets users be fully aware of the quality of their data and make improvements.

According to Jim Harris, Blogger-in-Chief of OCDQ Blog, “Due to the increasing demand for real-time data-driven decisions, timeliness is the most important dimension of data quality.”

How Does the Qualytics Data Firewall Address Timeliness?

At Qualytics, we support analyzing data in flight as well as data at rest. A primary benefit of analyzing data in flight is that it enables Qualytics users to validate the expected rates of data movement and the timeliness of mission-critical data. With these additional data points, users know whether all data is in fact available.


As mentioned, untimely data can lead to inaccurate insights and poor decision making, but Qualytics works to present users with all of their data, which is why analyzing both in-flight and at-rest data is key. On top of that, we take historical shapes and sizes into consideration as a way to monitor volume for quality issues. This means that if the volume received doesn’t match the volume expected from historical data, we alert users.
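As a rough illustration of this kind of volume monitoring (not Qualytics’ actual algorithm), a simple z-score check can flag a record count that deviates sharply from its historical pattern; the daily counts below are hypothetical.

```python
from statistics import mean, stdev

def volume_anomaly(history: list[int], current: int, threshold: float = 3.0) -> bool:
    """Flag the current record count if it deviates from the historical
    mean by more than `threshold` standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Hypothetical daily row counts for a feed:
daily_counts = [10_050, 9_980, 10_120, 10_010, 9_940]
print(volume_anomaly(daily_counts, 10_060))  # False: within the normal range
print(volume_anomaly(daily_counts, 4_200))   # True: likely a partial load
```

A production system would account for seasonality and trend rather than a flat historical mean, but the principle is the same: volume that breaks from its historical shape is a signal worth alerting on.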

With our product Protect, customers start by connecting to a datastore. Using proprietary algorithms, the Data Firewall then infers rich metadata through a deep profiling operation on the connected data. Because data is always changing, the rules should change too. This is why the Data Firewall uses unsupervised learning to detect model and data drift in metadata, while supervised learning ensures rules stay aligned with user needs. Additionally, users can write their own rules using our rich metadata. The Data Firewall can then do its job: anomalies in flight can be quarantined or alerted on, and anomalies at rest can be identified and alerted on.
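The quarantine step can be pictured with a minimal sketch. This is not the Data Firewall’s implementation; the rules, record fields, and stream below are all hypothetical stand-ins for inferred or user-written checks.

```python
from typing import Callable, Iterable

Rule = Callable[[dict], bool]  # a rule returns True when the record passes

def screen_in_flight(records: Iterable[dict], rules: list[Rule]):
    """Split an in-flight stream into passing records and quarantined anomalies."""
    passed, quarantined = [], []
    for record in records:
        if all(rule(record) for rule in rules):
            passed.append(record)
        else:
            quarantined.append(record)
    return passed, quarantined

# Hypothetical rules applied to an orders stream:
rules = [
    lambda r: r.get("order_id") is not None,  # completeness check
    lambda r: r.get("amount", 0) > 0,         # conformity check
]
stream = [{"order_id": 1, "amount": 25.0}, {"order_id": None, "amount": 12.0}]
ok, quarantine = screen_in_flight(stream, rules)
print(len(ok), len(quarantine))  # 1 1
```

Quarantined records can then be surfaced to users for review rather than silently propagating downstream.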

Qualytics is the complete solution for instilling trust and confidence in your enterprise data ecosystem. It seamlessly connects to your databases, warehouses, and source systems, proactively improving data quality through anomaly detection, signaling, and workflow. Learn more about the Qualytics 8 factors in our other blogs: Accuracy, Conformity, Consistency. Let’s talk about your data quality today. Contact us at