.@philsimon on the collision between the two.
I've spent much of my career managing the quality of data after it was moved from its sources to a central location, such as an enterprise data warehouse. Nowadays we not only have a lot more data, but a lot of it is in motion. One of the […]
In my previous post I discussed the practice of putting data quality processes as close to data sources as possible. Historically this meant data quality happened during data integration in preparation for loading quality data into an enterprise data warehouse (EDW) or a master data management (MDM) hub. Nowadays, however, there’s a lot of […]
We had just completed a four-week data quality assessment of an inside plant installation. It wasn't looking good. There were huge gaps in the data, particularly when we cross-referenced the systems. In theory, each system was meant to hold identical information about the plant equipment. But when we consolidated the […]
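A cross-system check like the one described in that assessment can be sketched in a few lines of Python. The system names, equipment IDs, and fields below are hypothetical, chosen only to illustrate finding gaps (records missing from one system) and conflicts (fields that disagree between systems):

```python
# Hypothetical snapshots of equipment records from two plant systems
# that should, in theory, hold identical information.
asset_registry = {
    "PUMP-001": {"location": "Bay 1", "status": "active"},
    "PUMP-002": {"location": "Bay 2", "status": "active"},
    "VALVE-010": {"location": "Bay 3", "status": "retired"},
}
maintenance_system = {
    "PUMP-001": {"location": "Bay 1", "status": "active"},
    "VALVE-010": {"location": "Bay 4", "status": "retired"},  # location conflict
    # PUMP-002 is missing entirely -- a gap
}

def cross_reference(a, b):
    """Return IDs missing from either system and fields that disagree."""
    missing_in_b = sorted(set(a) - set(b))
    missing_in_a = sorted(set(b) - set(a))
    conflicts = {
        key: {f: (a[key][f], b[key][f]) for f in a[key] if a[key][f] != b[key][f]}
        for key in set(a) & set(b)
        if a[key] != b[key]
    }
    return missing_in_b, missing_in_a, conflicts

missing_in_b, missing_in_a, conflicts = cross_reference(asset_registry, maintenance_system)
print(missing_in_b)   # equipment present only in the asset registry
print(conflicts)      # fields where the two systems disagree
```

In practice the match key is rarely this clean; a real assessment would first standardize identifiers before comparing.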
The post IoT: How could your company benefit from real data accuracy? appeared first on The Data Roundtable.
In my last post we started to look at two different Internet of Things (IoT) paradigms. The first involved only streaming automatically generated data from machines (such as sensor data). The second combined human-generated and machine-generated data, such as social media updates that are automatically augmented with geo-tag data by […]
The concept of the Internet of Things (IoT) is used broadly to cover any organization of communication devices and methods, messages streaming from the device pool, data collected at a centralized point, and analysis used to exploit the combined data for business value. But this description hides the richness of […]
The post The Internet of Things and the question of data quality appeared first on The Data Roundtable.
Throughout my long career of building and implementing data quality processes, I've consistently been told that data quality could not be implemented within data sources, because doing so would disrupt production systems. Therefore, source data was often copied to a central location – a staging area – where it was cleansed, transformed, deduplicated, restructured […]
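The staging-area pattern described here (copy source data, then cleanse and deduplicate the copy so production is never touched) can be sketched as follows. The record layout, cleansing rules, and match key are illustrative assumptions, not the author's actual process:

```python
import copy

# Illustrative source records, copied to a staging area so the
# production system is never modified during cleansing.
source_records = [
    {"id": 1, "name": "  Acme Corp ", "city": "NEW YORK"},
    {"id": 2, "name": "acme corp", "city": "new york"},   # duplicate of id 1
    {"id": 3, "name": "Globex", "city": "Springfield"},
]

def stage(records):
    """Copy source data into staging, leaving the source untouched."""
    return copy.deepcopy(records)

def cleanse(record):
    """Trim whitespace and normalize casing: a minimal transformation."""
    return {
        "id": record["id"],
        "name": record["name"].strip().title(),
        "city": record["city"].strip().title(),
    }

def deduplicate(records):
    """Keep the first record for each (name, city) match key."""
    seen, unique = set(), []
    for r in records:
        key = (r["name"], r["city"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

staged = [cleanse(r) for r in stage(source_records)]
deduped = deduplicate(staged)
print(len(deduped))  # the two Acme rows collapse into one
```

Note that deduplication only works here because cleansing ran first; on the raw records, the two Acme rows would not match, which is why staging pipelines order the steps this way.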
In my first blog article I explained that many insurance companies have implemented a standard data model as the basis for their business analytics data warehouse (DWH) solutions. But why should a standard data model be more appropriate than an individual one designed especially for a certain insurance company? Besides faster […]
A soccer fairy tale
Imagine it's Soccer Saturday. You've got 10 kids and 10 loads of laundry – along with buried soccer jerseys – that you need to clean before the games begin. Oh, and you have two hours to do this. Fear not! You are a member of an advanced HOA […]
The post Can SAS Data Management get you to soccer on time? appeared first on The Data Roundtable.
As I explained in Part 1 of this series, spelling my name wrong does bother me! However, life changes quickly at health insurance, healthcare and pharmaceutical companies. That said, taking unintegrated or uncleansed data and propagating it to Hadoop may only help one issue. That would be the issue of getting the data […]
The post Health insurance, healthcare and pharmaceutical perspective on data quality – Part 2 appeared first on The Data Roundtable.