By Guest Blogger, Anshuman Sindhar
Over the last decade, the volume of data produced by both corporations and individuals has increased exponentially. In addition, increases in velocity (the speed of data creation) and variety (structured and unstructured data, including social media, email, PDFs, etc.) have contributed to a lack of veracity — abnormality or “noise” — in the data. This has led many executives to question data validity: is the data on which decisions are based accurate? Is it timely and consistent? Do we know? Or are we just guessing?
The rate of data growth has exceeded leading market analysts’ ability to predict the next horizon:
- IDG: Unstructured data is growing at the rate of 62% per year.
- IDG: By 2022, 93% of all data in the digital universe will be unstructured.
- Gartner: Data volume is set to grow 800% over the next 5 years and 80% of it will reside as unstructured data.
There has never been a greater need for high-quality data. Today, good data = good business = better analytics. Without good data, how do we know who the most profitable customers are? What products are we selling, and to whom? Are we offering the right products to the right customers? How do we reduce operational risk in serving our customers? These questions are becoming increasingly difficult to answer.
Issues with effectively and efficiently managing enterprise data:
In my experience, having worked at many different organizations across banking, capital markets, high-technology, networking and retail industries, there are 12 data challenges that lead to bad analytics:
- Data Governance
Often, no such organization exists. A Data Governance program is usually an afterthought, and such programs often fail: they need considerable executive buy-in and commitment to succeed.
- Data Management
Data management principles, policies, and procedures are often only loosely defined and coordinated. In addition, multiple ways to roll up data lead to multiple aggregates of the same data. Customer views may vary depending on whether the audience aggregating the data is internal or external. Product data can be vastly different depending on the industry and line of business. Consolidating hierarchies into a single comprehensive view, with links into the other ways the information is aggregated, requires acceptance by business units and can be a significant change management effort.
- Uncoordinated Funding
Decisions on which initiatives to fund are often made in isolation. Lines of business have their own funding sources, and business-user-friendly technologies such as data visualization tools have made it easier for business units to fund and run their own projects in isolation.
- System Risks
Multiple systems holding the same data, but at different levels of granularity and data integrity, create the risk of there being no single source of truth. The ledger is populated from sub-ledgers that reside in various business units. Sub-ledgers and their associated data are generally aggregated before populating the ledger, which leads to a loss of granularity. A significant effort is required to extract, integrate, and reconcile data back to the “upstream” systems of record.
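The reconciliation step above can be sketched in a few lines. This is a toy illustration, not the author’s process: account identifiers, amounts, and the tolerance are all assumptions, and it compares sub-ledger sums to the ledger balance per account.

```python
from collections import defaultdict

def reconcile(subledger_rows, ledger_totals, tolerance=0.01):
    """Return accounts whose sub-ledger sum differs from the ledger balance."""
    sums = defaultdict(float)
    for account, amount in subledger_rows:
        sums[account] += amount
    breaks = {}
    for account, expected in ledger_totals.items():
        diff = round(sums.get(account, 0.0) - expected, 2)
        if abs(diff) > tolerance:
            breaks[account] = diff  # a "break" to investigate upstream
    return breaks

# Illustrative data: account 2000 is short $5 in the sub-ledger.
rows = [("1000", 250.00), ("1000", 100.00), ("2000", 75.00)]
ledger = {"1000": 350.00, "2000": 80.00}
print(reconcile(rows, ledger))  # {'2000': -5.0}
```

In practice the hard part is not the arithmetic but tracing each break back through every transformation to the system of record.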
- Resource Constraints
Many initiatives fail to gather business unit support upfront. Resource constraints and other priorities pull the organization in several different directions, leading to a reduced focus on what may take time and effort over and above the “day jobs”. To get the planned value from BI investments, banks need strong senior management commitment and business unit involvement to establish a true “partnership” with IT.
- Data Proxy
Incorrect or inconsistent business rules applied to the data. For example, if APR is blank on a loan document, does it make sense to enter a default value?
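One way to avoid silently proxying data is to flag missing values for remediation rather than substituting a default. A minimal sketch, assuming a dict-based loan record with an illustrative `apr` field:

```python
def resolve_apr(loan):
    """Return (apr, is_proxy) so downstream analytics can distinguish
    real data from missing values — no default is silently substituted."""
    apr = loan.get("apr")
    if apr is None or apr == "":
        return (None, True)   # route to a data steward for remediation
    return (float(apr), False)

apr, is_proxy = resolve_apr({"loan_id": "L-1001", "apr": ""})
print(apr, is_proxy)  # None True
```

Carrying the proxy flag through to reports lets consumers decide whether a record is fit for their purpose.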
- Master Data Management
Often a customer master or product master does not exist. In addition, there are inconsistent definitions of a product. Many business units define products as customer or channel-specific. Over time, these definitions permeate into the organization. Separating the product view to become independent of customer, channel, departments or other organizational views can take a significant effort. In addition, a product should be independent of the pricing characteristics, and should be recognizable to internal and external customers. Once a consistent product definition is established across the organization, systems and processes need to be identified and modified to deliver more updated product information. These could affect existing systems, business reports, performance metrics, even incentive plans which require conversion of product data across systems to enable a new definition.
- Audit / Compliance disengaged
Lines of defense, data controls, and risk mitigation strategies are undefined and are not linked to business risk control and self-assessment activities.
- Data Quality Issues
Data quality issues prevent product and customer data from being consistent. Problems with the way data is captured at the time of entry can lead to inaccuracies, as can missing checks in customer-facing systems such as loan origination or portfolio management systems, and errors introduced as data is transformed while moving through multiple systems. Translation errors arise when different groups interpret the same data element in different ways, and there is often no documentation of those interpretations.
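Entry-time checks are the cheapest place to catch these issues. The sketch below is illustrative only — the field names and plausibility ranges are assumptions, not actual business rules:

```python
def validate_loan_record(rec):
    """Return a list of data-quality errors; an empty list means the
    record passes these (illustrative) entry-time checks."""
    errors = []
    if not rec.get("customer_id"):
        errors.append("customer_id missing")
    apr = rec.get("apr")
    if apr is None:
        errors.append("apr missing")
    elif not 0 < apr < 50:
        errors.append("apr out of range: %s" % apr)
    ltv = rec.get("ltv")
    if ltv is not None and not 0 < ltv <= 1.5:
        errors.append("ltv out of range: %s" % ltv)
    return errors

print(validate_loan_record({"customer_id": "C1", "apr": 4.2, "ltv": 0.8}))  # []
print(validate_loan_record({"apr": 99}))
```

Running the same rule set at entry and again downstream also surfaces transformation errors between systems.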
- Householding
Defining households accurately depends on accurate customer identifiers, which are built by creating and managing a customer master. Mastering data involves collecting data from key systems. Match-merge algorithms create unique customer identifiers across these systems, which can then be used to group current customers interacting with different parts of the organization into a household. Similarly, merging external third-party data expands households to include members who are not current customers.
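A heavily simplified sketch of the grouping step, assuming records with illustrative `last_name` and `address` fields. Production match-merge uses fuzzy matching and survivorship rules; exact normalized keys keep the example short:

```python
def normalize(s):
    """Lowercase and collapse whitespace so near-identical keys match."""
    return " ".join(s.lower().split())

def household_key(rec):
    return (normalize(rec["last_name"]), normalize(rec["address"]))

def build_households(records):
    """Group customer IDs from multiple systems by a shared key."""
    households = {}
    for rec in records:
        households.setdefault(household_key(rec), []).append(rec["id"])
    return households

records = [
    {"id": "CRM-1", "last_name": "Lee",  "address": "12 Oak St"},
    {"id": "DEP-9", "last_name": "LEE",  "address": "12 oak st"},
    {"id": "CRM-2", "last_name": "Shah", "address": "4 Elm Ave"},
]
print(build_households(records))
# {('lee', '12 oak st'): ['CRM-1', 'DEP-9'], ('shah', '4 elm ave'): ['CRM-2']}
```

Note how the CRM and deposit-system records collapse into one household only because both keys normalize identically — which is why accurate identifiers matter so much.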
- Inappropriate Data Access
Inappropriate data access due to lack of accurate authentication, authorization or proper data security can put intense pressure on the data infrastructure. Proper privacy and security can help in strengthening the infrastructure.
- Privacy Issues
Sensitive personal data, especially a customer’s financial information, needs to be protected as it moves from customer-facing systems to operational systems. Data generated in systems such as loan origination systems, insurance agency management systems, or deposit management systems usually contains sensitive customer information. Details such as a customer’s geographic information, FICO scores, Loan-To-Value ratios, and property value estimates need to be stored in a way that prevents a compromise of the customer’s identity.
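One common technique for this is deterministic pseudonymization: replace the raw identifier with a keyed token before data leaves the customer-facing system, so records still join downstream without exposing identity. A minimal sketch — in production the key would come from a secrets manager, never from source code:

```python
import hashlib
import hmac

# Assumption: demo key only; a real deployment holds this in a vault.
SECRET_KEY = b"demo-key-only"

def pseudonymize(value):
    """Deterministic token: the same input always yields the same token,
    so downstream systems can still join records on it."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C-1001", "fico": 712, "ltv": 0.83}
safe = dict(record, customer_id=pseudonymize(record["customer_id"]))
print(safe["customer_id"] != record["customer_id"])  # True
```

Using an HMAC rather than a plain hash means an attacker who knows the identifier format cannot recompute tokens without the key.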
Setting a strong data governance structure in the beginning can determine the success of an Analytics initiative.
Enterprise Analytics involves sharing key data elements across the company, including data elements related to products, customers, suppliers, etc. Once defined, key data elements can be reused across enterprise-wide initiatives such as Customer Data Integration (CDI) or Master Data Management.
The key to building a successful data quality improvement strategy is to start with a cross-organizational holistic picture rather than just focus on integrating pieces together. This includes building:
- Comprehensive Data Governance
- A Common Language
- Consistent Data Granularity
- Consistent Quality and Standards
- Data Stewardship, not just Ownership
Look for my next blog in the coming weeks for a deeper look on this.
About the Author:
Mr. Anshuman Sindhar has two decades of financial management, systems development, entrepreneurship and consulting expertise. Anshuman has held various leadership positions at BearingPoint, CenturyLink Cognilytics, Capco, IBM and Paradigm Technology.
Paradigm Technology is a strategic consulting company serving the banking, airline, manufacturing, high-tech and retail marketplaces. We utilize innovative business and technology solutions to help clients enable their digital transformation programs, and improve their Analytics, Cloud, Master Data Management, and Project Leadership solution delivery. Paradigm is ready to support you in your GDPR compliance journey. For more information about Paradigm Technology and GDPR, email firstname.lastname@example.org or visit us at www.pt-corp.com.