What other data aggregate functions are useful besides averages and means? September 19, 2007Posted by Peter Benza in Data Aggregates, Data Consolidation, Data Elements, Data Errors, Data Research.
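Beyond averages, other common aggregates include the median, mode, standard deviation, and percentiles. A minimal sketch using Python's standard `statistics` module (the sample values are illustrative, not from any real dataset):

```python
import statistics

values = [4, 8, 8, 15, 16, 23, 42]  # illustrative sample data

# Median: the middle value, robust to outliers like 42
print(statistics.median(values))          # 15

# Mode: the most frequent value
print(statistics.mode(values))            # 8

# Sample standard deviation: spread around the mean
print(statistics.stdev(values))

# Quartiles: cut points dividing the data into four groups
print(statistics.quantiles(values, n=4))
```

Medians and modes are often more informative than means when a data element is skewed or contains extreme outliers, which is common in customer and transaction data.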
Incorporating data quality into your business, technical, and regulatory standards September 18, 2007Posted by Peter Benza in Data Profiling, Data Quality, Data Research, Data Warehouse.
This white paper describes how application developers can incorporate data quality into their Microsoft SQL Server 2005 Integration Services solutions. (22 printed pages)
Here is an excerpt from the beginning of this paper:
The quality of the data that is used by a business is a measure of how well its organizational data practices satisfy business, technical, and regulatory standards. Organizations with high data quality use data as a valuable competitive asset to increase efficiency, enhance customer service, and drive profitability. In contrast, organizations with poor data quality spend time working with conflicting reports and flawed business plans, resulting in erroneous decisions made with outdated, inconsistent, and invalid data.
For the rest of this article:
Upcoming information quality and data management tradeshows August 25, 2007Posted by Peter Benza in Data Integration, Data Management, Data Profiling, Data Quality, Data Research, Data Tools.
Europe’s Most Authoritative
Data Management and Information Quality Conferences
29 October – 1 November 2007 • London, UK
Victoria Park Plaza Hotel
This year there are three major shows in one: Information Quality, DAMA International, and Meta Data.
What data variable(s) are useful to determine when a customer record should be classified as active or inactive? August 25, 2007Posted by Peter Benza in Data Analysis, Data Management, Data Metrics, Data Mining, Data Processes, Data Research, Data Variables.
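One commonly used variable is recency of the last transaction: a record is classified as active if the customer has transacted within some window. A hedged sketch of that idea; the 18-month window and the sample dates are assumptions for illustration, not a recommended standard:

```python
from datetime import date, timedelta

# Assumed activity window: roughly 18 months. The right threshold
# depends on your business and should be validated against your data.
ACTIVITY_WINDOW = timedelta(days=548)

def classify(last_transaction: date, as_of: date) -> str:
    """Classify a customer record by recency of its last transaction."""
    if as_of - last_transaction <= ACTIVITY_WINDOW:
        return "active"
    return "inactive"

as_of = date(2007, 8, 25)  # fixed reference date for reproducibility
print(classify(date(2007, 1, 10), as_of))  # recent purchase -> active
print(classify(date(2004, 6, 1), as_of))   # stale record -> inactive
```

In practice, recency is often combined with other variables (purchase frequency, account status, returned mail flags) rather than used alone.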
How complete is your data? August 15, 2007Posted by Peter Benza in Data Completeness, Data Governance, Data Quality, Data Research.
Data completeness is contingent upon first knowing the target population* and then measuring, for each data element, the ratio of missing values (bad values) to good values.
Consider setting up missing-value reports (or, better yet, aggregate datasets) on a scheduled basis so you can study data completeness patterns over time. These findings might also surface data governance processes, policies, and standards in your organization that deserve consideration.
It is advisable to involve a statistical analyst early on when outlining this process, to help define data completeness specific to your organization: past, present, and future.
*The target population could be anything from your customer name/address master database to product-specific datasets and all their associated attributes.
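The per-element measurement described above can be sketched in a few lines. This is a minimal illustration, assuming records arrive as dictionaries and treating `None` and empty strings as bad values; the field names and sample records are hypothetical:

```python
# Hypothetical customer records; None and "" stand in for missing values.
records = [
    {"name": "Ann Lee", "address": "12 Oak St", "zip": "60601"},
    {"name": "Bob Ray", "address": None, "zip": "60602"},
    {"name": None, "address": "9 Elm Ave", "zip": None},
]

def completeness_by_element(rows):
    """Return the share of good (non-missing) values for each data element."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r[col] not in (None, "")) / total
        for col in rows[0]
    }

# Each element in this sample is 2/3 complete (one bad value out of three).
print(completeness_by_element(records))
```

Running a report like this on a schedule and storing the resulting ratios yields exactly the kind of aggregate dataset the post suggests for studying completeness trends over time.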