If your enterprise is working with Hadoop, MongoDB or other nontraditional databases, then you need to evaluate your data strategy. A data strategy must adapt to current data trends based on business requirements. So am I still the clean-up woman? The answer is YES! I still work on the quality of the data. […]
The demand for data preparation solutions is at an all-time high, and it's primarily driven by the demand for self-service analytics. Ten years ago, if you were a business leader who wanted to get more in-depth information on a particular KPI, you would typically issue a reporting request to IT […]
The post Data prep and self-service analytics – Turning point for governance and quality efforts? appeared first on The Data Roundtable.
In DataFlux Data Management Studio, the predominant component of the SAS Data Quality bundle, the data quality nodes in a data job use definitions from something called the SAS Quality Knowledge Base (QKB). The QKB supports over 25 languages and provides a set of pre-built rules, definitions and reference data that perform operations such as parsing, standardization and fuzzy matching to help you cleanse your data. The QKB comes with pre-built definitions for both customer and product data, and it allows you to customize and add rules to accommodate new data types and rules specific to your business. (You can learn more about the QKB here.)
Sometimes, within the same data job, you may want to work with an alternate QKB installation that contains different definitions. For example, your default QKB may be the Contact Information QKB; however, in your data flow you may want to use a definition that exists only in the Product Data QKB. The data quality nodes have the BF_PATH attribute as part of their Advanced Properties, which enables you to do this.
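To make the behavior concrete, here is a minimal conceptual sketch in Python (not the DataFlux API; the class and function names, and the install path, are illustrative assumptions): when a node's BF_PATH property is set, the node draws its definition from the alternate QKB installed at that path; otherwise it falls back to the active (default) QKB.

```python
# Conceptual sketch only -- not DataFlux code. Illustrates how a node might
# resolve which QKB supplies its definitions when BF_PATH is set.
from dataclasses import dataclass, field


@dataclass
class QKB:
    name: str
    definitions: set = field(default_factory=set)


def resolve_qkb(active_qkb, bf_path=None, installed_qkbs=None):
    """Return the QKB a node should draw definitions from.

    If the node's BF_PATH advanced property is set, use the alternate QKB
    installed at that path; otherwise fall back to the active QKB.
    """
    if bf_path:
        return (installed_qkbs or {})[bf_path]
    return active_qkb


# Default (active) QKB: Contact Information
contact = QKB("Contact Info", {"Name Standardization", "Address Parse"})
# Alternate QKB registered at a hypothetical install path
product = QKB("Product Data", {"Brand/Manufacturer"})
installed = {r"C:\QKB\ProductData": product}

# A node with BF_PATH set pulls its definition from the Product Data QKB...
node_qkb = resolve_qkb(contact, bf_path=r"C:\QKB\ProductData",
                       installed_qkbs=installed)
assert "Brand/Manufacturer" in node_qkb.definitions
# ...while a node without BF_PATH keeps using the active QKB.
assert resolve_qkb(contact).name == "Contact Info"
```

The key design point mirrored here is that BF_PATH is per-node, so one data job can mix nodes bound to the default QKB with nodes bound to an alternate one.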
Note: You must have the alternate QKB data installed and be licensed for any QKB locales that you plan to use in your data job.
Here is an example data job that uses the BF_PATH Advanced property to call the Brand/Manufacturer Extraction definition from the Product Data QKB. It also calls Standardization definitions from the default Contact Information QKB. Notice that, from the data flow perspective, it is one seamless flow.
The BF_PATH advanced property setting for the Extraction node that is using the Brand/Manufacturer definition from the Product QKB contains the path of where the Product Data QKB was installed.
Note: The path setting could be set as a macro variable. The path information can be obtained from the QKB registration information in the Administration riser bar in Data Management Studio.
Once the BF_PATH advanced property is set, you will no longer be able to use the user interface to select your definition. You will need to know the definition name and any other relevant information for the node, so you can supply it via the appropriate Advanced properties.
In my example, I need to set the definition and token fields for the Brand/Manufacturer definition using the PARSE_DEF and related token Advanced properties.
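The resulting Advanced property settings might look like the sketch below. The install path and the comment annotations are hypothetical; only BF_PATH and PARSE_DEF are property names from this example, and the token property should be set as listed in your node's Advanced properties panel.

```
BF_PATH   = C:\QKB\ProductData       (hypothetical install path of the Product Data QKB;
                                      could also be supplied as a macro variable)
PARSE_DEF = Brand/Manufacturer       (definition name, as shown in the Administration
                                      riser bar for the Product Data QKB)
```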
Note: I was able to find out the needed information by viewing the Product Data QKB information in the Administration riser bar in Data Management Studio.
Here is the user interface for the Extraction node that is using the Brand/Manufacturer definition from the Product QKB and its data preview.
Note: The definition name cannot be displayed since it is not in the active QKB. However, you can still select the Extraction field and Additional Output information in the user interface.
In conclusion, the BF_PATH Advanced property is useful when you want to use an alternate QKB installation in your data job. For more information on Advanced properties for nodes in DataFlux Data Management Studio, refer to the topic “Advanced Properties” in the DataFlux Data Management Studio 2.7: User’s Guide. For more information on the SAS Quality Knowledge Base (QKB), refer to its documentation. Finally, to learn more about SAS Data Quality, visit sas.com/dataquality.
Auditability and data quality are two of the most important demands on a data warehouse. Why? Because reliable data processes ensure the accuracy of your analytical applications and statistical reports. Using a standard data model enhances auditability and data quality of your data warehouse implementation for business analytics. Auditability by […]
The post Don't let your data warehouse be a data labyrinth! appeared first on The Analytic Insurer.
At some point, your business or IT leaders will decide – enough is enough; we can't live with the performance, functionality or cost of the current application landscape. Perhaps your financial services employer wants to offer mobile services, but building modern apps via the old mainframe architecture is impractical and a replacement […]
The post How to observe the impact of modernisation through a data quality lens appeared first on The Data Roundtable.
The “big” part of big data is about enabling insights that were previously indiscernible. It's about uncovering small differences that make a big difference in domains as widespread as health care, public health, marketing and business process optimization, law enforcement and cybersecurity – and even the detection of new subatomic particles. […]
.@philsimon on whether organizations need MDM to gather valuable insights about their customers.
The post Examining the relationship between MDM and customer intelligence appeared first on The Data Roundtable.
Master data management (MDM) is distinct from other data management disciplines due to its primary focus on giving the enterprise a single view of the master data that represents key business entities, such as parties, products, locations and assets. MDM achieves this by standardizing, matching and consolidating common data elements across traditional and big […]
Single view of customer. It's a noble goal, not unlike the search for the Holy Grail – fraught with peril as you progress down the path of your data journey. If you're a hotelier, it can improve your customer's experience by providing the information from the casinos and the spa at check-in to better meet your […]
The post In search of the Holy Grail: Sorry Hadoop fans, only MDM delivers single view of customer appeared first on The Data Roundtable.