
Today’s Linkedin Discussion Thread: Enterprise Data Quality April 28, 2009

Posted by Peter Benza in Data Analysis, Data Elements, Data Governance, Data Optimization, Data Processes, Data Profiling, Data Quality, Data Sources, Data Standardization, Data Synchronization, Data Tools, Data Verification.

Here is the most recent question I added to my LinkedIn discussion group, Enterprise Data Quality.

QUESTION: What master data or existing “traditional” data management processes (or differentiators) have you found useful across the enterprise for data quality?

MY INSIGHTS: Recently, I was able to demonstrate (and quantify) the impact of using an NCOA (National Change of Address) updated address on match/merge accuracy when two or more customer “names and addresses” from three disparate source systems were present. The test approach warrants consideration especially now that customer files at big companies number in the hundreds of millions of records. Ideally, you would apply this test to the entire file, not just a sample set. But we all know that today it’s about money, time, value, resources, etc.

For testing purposes, I advised that all individual customer address attributes be replaced (where information was available) with NCOA-updated addresses, then loaded and processed through the “customer hub” technology. If you are not testing a piece of technology, constructing your own match key or visually checking sample sets of customer records before and after is an alternative. Either way, inventory the matches and non-matches from the two different runs – once with addresses as-is and once with addresses that leverage the NCOA information – as sketched below.
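To make that before/after inventory concrete, here is a minimal sketch in Python. It assumes two hypothetical CSV extracts (one with addresses as-is, one where available NCOA-updated addresses have been swapped in) and an illustrative match key built from last name, house number, and ZIP; none of the file names, column names, or key rules come from an actual customer hub product.

```python
# A minimal sketch of the before/after comparison described above.
# File names, column names, and the match-key definition are all
# illustrative assumptions.
import pandas as pd

def match_key(df: pd.DataFrame) -> pd.Series:
    """Build a simple deterministic match key from name and address parts."""
    return (
        df["last_name"].str.upper().str.strip()
        + "|" + df["house_number"].astype(str).str.strip()
        + "|" + df["zip5"].astype(str).str[:5]
    )

def duplicate_rate(df: pd.DataFrame) -> float:
    """Share of records that collapse onto another record's match key."""
    keys = match_key(df)
    return 1 - keys.nunique() / len(keys)

as_is = pd.read_csv("customers_as_is.csv")
ncoa = pd.read_csv("customers_ncoa_updated.csv")

print(f"as-is duplicate rate: {duplicate_rate(as_is):.1%}")
print(f"NCOA duplicate rate:  {duplicate_rate(ncoa):.1%}")
```

The difference between the two duplicate rates approximates the match/merge lift attributable to the NCOA update alone, before any technology-specific matching is applied.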

My goal was to establish a business process focused on “pre-processing customer records” using a reliable third-party source (in this case, NCOA), instead of becoming completely dependent on a current or future piece of technology that may offer the same results, especially when the matching algorithms are probabilistic. This approach reduces your dependency on any one vendor and lets you focus on the “lift” the technology itself may offer, which is useful if you are comparing two or more products.

Whereas, inside a deterministic matching utility (or off-the-shelf solution), adding extra space or columns at the end of your input file to store the NCOA addresses lets you accomplish the same results. But for test purposes, the easier route may be to replace addresses wherever an NCOA record is available.
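As a sketch of that extra-columns variant (the column and file names are again made up for illustration), the NCOA address travels alongside the original, and the deterministic key simply prefers it where NCOA returned a record:

```python
# Sketch: keep the NCOA address in extra columns rather than overwriting,
# and let the deterministic match key prefer it when present.
# All column and file names here are illustrative assumptions.
import pandas as pd

df = pd.read_csv("customers_with_ncoa_columns.csv")

# Prefer the NCOA-updated fields where NCOA returned a record.
df["match_addr"] = df["ncoa_address"].fillna(df["address"])
df["match_zip"] = df["ncoa_zip5"].fillna(df["zip5"])

key = (
    df["last_name"].str.upper().str.strip()
    + "|" + df["match_addr"].str.upper().str.strip()
    + "|" + df["match_zip"].astype(str).str[:5]
)
groups = df.groupby(key).size()
print(f"{(groups > 1).sum():,} match groups contain more than one record")
```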

Remember, depending on the volume of records your client is dealing with, a pre-process (business process) may be preferable to loading all the customer names and addresses into the third-party customer hub technology and processing them there. Caution: this all depends on how the business is required (i.e., for compliance) to store information from cradle to grave. The rule of thumb for an MDM customer hub is to store the “best/master” record (the single customer view), with the exception of users with extended search requirements. The data warehouse (vs. the MDM solution) then becomes the next challenge: what to keep where, and how much. But that is another discussion.

The improvement realized by using the updated customer addresses was substantial (over 10% on average across all the sources factored into the analysis). That means tens of millions of customer records will match/merge more effectively (and efficiently), before counting the incremental lift the “customer hub” technology enables with its proprietary tools and techniques. That lift becomes the real differentiator!
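For a back-of-the-envelope sense of scale (the input figures below are assumptions chosen to match the “hundreds of millions” and “over 10%” ranges above, not results from the engagement):

```python
# Rough arithmetic behind "tens of millions" of additional matches.
total_records = 300_000_000   # assumed file size
lift = 0.10                   # the ~10-point average improvement observed

additional_matches = total_records * lift
print(f"Additional records matched: {additional_matches:,.0f}")
# -> Additional records matched: 30,000,000
```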

Cognos data quality rapid assessment service January 17, 2008

Posted by Peter Benza in Data Accuracy, Data Analysis, Data Governance, Data Integration, Data Management, Data Metrics, Data Profiling, Data Quality, Data Standardization, Data Stewardship, Data Tools.

http://www.cognos.com/performance-management/technology/data-quality/pdfs/fs-cognos-data-quality-rapid-assessment-service.pdf

BusinessObjects data quality XI January 17, 2008

Posted by Peter Benza in Data Accuracy, Data Analysis, Data Architecture, Data Assessment, Data Consolidation, Data Hygiene, Data Integrity, Data Profiling, Data Quality, Data References, Data Strategy, Data Templates, Data Tools.

Standardize, Identify Duplicates, Correct, Improve Match, Append, Consolidate, and more.    

http://www.businessobjects.com/products/dataquality/data_quality_xi.asp

SOA Governance at BEA: Essential to your enterprise transformation strategy January 17, 2008

Posted by Peter Benza in Data Analysis, Data Architecture, Data Governance, Data Integration, Data Management, Data Optimization, Data Profiling, Data Security, Data Stewardship.

Effective SOA governance is an essential element in any enterprise transformation strategy. It can help your organization achieve measurable, sustainable business value.

Read about this and other webcasts, whitepapers, and more at BEA.

http://www.bea.com/framework.jsp?CNT=index.jsp&FP=/content/solutions/soa_governance/

What types of common data problems are found in your master data? January 13, 2008

Posted by Peter Benza in Data Analysis, Data Assessment, Data Governance, Data Hygiene, Data Metrics, Data Profiling, Data Quality.

Master data exists across your entire enterprise. Companies today are assessing the best way to consolidate all their information assets (data sources) into a “single customer view”.

What types of data problems exist in your organization today, or may emerge as you move toward managing data at the enterprise level?

[Be first to answer this question]

MDM Accelerator® by Zoomix January 9, 2008

Posted by Peter Benza in Data Accuracy, Data Aggregates, Data Analysis, Data Assessment, Data Consolidation, Data Dictionary, Data Formats, Data Governance, Data Hygiene, Data Integration, Data Management, Data Metrics, Data Processes, Data Profiling, Data Quality, Data References, Data Sources, Data Standardization, Data Stewardship, Data Synchronization, Data Templates, Data Tools.

To learn more about MDM Accelerator® by Zoomix, or to post your comments, visit:

http://www.zoomix.com/mdm.asp

Online data gathering – great resource for surveys and business forms August 25, 2007

Posted by Peter Benza in Data Analysis, Data References, Data Templates, Data Tools, Data Warehouse.

I came across this website and, after reading what it does, I just had to share it. Responses can be exported to an Excel or Word file, and you can even add your response and form data to a data warehouse: an Access file is downloaded that lets you do further analysis, if you desire.

Visit www.askget.com to learn more about this online data gathering tool. 

Malcolm Chisholm, President, Askget.com, Holmdel, NJ

What data variable(s) are useful to determine when a customer record should be classified as active or inactive? August 25, 2007

Posted by Peter Benza in Data Analysis, Data Management, Data Metrics, Data Mining, Data Processes, Data Research, Data Variables.

Be the first to author a comment on this subject.

Data quality and plotting customer address data on a map August 19, 2007

Posted by Peter Benza in Data Analysis, Data Hygiene, Data Integration, Data Metrics, Data Profiling, Data Quality, Data Tools.

Consider the insights and knowledge your organization will gain about the quality of its customer name/address data prior to centralizing all the disparate data sources in one location. Here is an actual slide deck I prepared a few years ago, using the output from my analysis to illustrate how maps and data profiling can assist in assessing data quality.
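To give a flavor of the technique without the slide deck itself, here is a minimal sketch assuming the addresses have already been geocoded into hypothetical lat/lon columns; it flags records that fall unusually far from their ZIP code’s centroid and plots them, which is the kind of visual data-quality check the deck illustrated.

```python
# Sketch: map-based profiling of geocoded customer addresses. Assumes a
# hypothetical extract with zip5, lat, and lon columns; no specific
# geocoder or mapping tool is implied.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("customers_geocoded.csv")  # columns: zip5, lat, lon

# Distance of each point from its ZIP centroid, as a crude quality flag.
centroids = df.groupby("zip5")[["lat", "lon"]].transform("mean")
df["offset"] = ((df["lat"] - centroids["lat"]) ** 2
                + (df["lon"] - centroids["lon"]) ** 2) ** 0.5
suspect = df["offset"] > df["offset"].quantile(0.99)

plt.scatter(df["lon"], df["lat"], s=2,
            c=suspect.map({False: "lightgray", True: "red"}))
plt.title("Customer addresses (red = far from ZIP centroid)")
plt.show()
```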

What type of statistical modeling approach would you recommend for those times when your response file is under 1,000 buyers? August 13, 2007

Posted by Peter Benza in Data Analysis.

Be one of the first to author an article in this category!