Channel: DCAM – Element22

Standstill in Data Management? Metrics, Adherence, Meaning, Lineage and Quality offset solid progress on Data Governance


DCAM Data Management Benchmark only improves by 0.09 since 2015 to 3.22 in 2017

The EDM Council recently conducted the data management benchmarking study for the second time since its inception in 2015. The 2017 study continues to assess where the global financial information industry collectively stands against the requirements for sustainable data management as defined by the Data Management Capability Assessment Model (DCAM).

The results demonstrate that while we continue to operationalize data governance and policies, not much progress was made in making data more trustworthy and accessible. Financial institutions are still mired in tactical tasks such as the mapping of data from physical repositories to applications, or reconciling business glossaries to define the business meanings of data.

The benchmark indicates that progress in establishing mature data management is at a standstill. The average score across all 8 components of the DCAM is 3.22, a minor improvement of 0.09 since 2015 when the benchmark was at 3.13.

Out of the 22 statements, 15 received higher scores than in 2015, 4 decreased in score, 2 remained unchanged, and 1 statement was added since the 2015 survey:

2017 2015
1 Our organization has a defined and endorsed data management strategy 3.5 3.5
2 The goals, objectives and authorities of the data management program are well communicated 3.4 3.3
3 The data management program is established and has the authority to enforce adherence 3.5 3.4
4 Stakeholders understand (and buy into) the need for the data management program 3.5 3.5
5 The funding model for the Data Management Program is established and sanctioned 3.6 3.4
6 The costs of (and benefits associated with) the Data Management Program are being measured 2.7 2.8
7 The data management program is sufficiently resourced 3.2 3.3
8 Data management operates collaboratively with existing enterprise control functions 3.3 3.0
9 Data governance structure and authority is implemented and communicated 3.6 3.4
10 Governance “owners” and “stewards” are in place with clearly defined roles and responsibilities 3.5 3.2
11 Data policies and standards are documented, implemented and enforced 3.6 3.3
12 The “end user” community is adhering to the data governance policy and standards 3.0 2.7
13 The business meaning of data is defined, harmonized across repositories and governed 3.0 2.9
14 Critical data elements are identified and managed 3.3 3.2
15 Logical data domains have been declared, prioritized and sanctioned 3.2 3.3
16 End-to-end data lineage has been defined across the entire data lifecycle 2.8 2.7
17 Technical architecture is defined and integrated (new in 2017) 3.2
18 All data under the authority of the Data Management Program is profiled, analyzed and graded 2.6 2.7
19 Procedures for managing data quality are defined, implemented and measured 3.1 3.0
20 Root cause analysis is performed and corrective measures are being implemented 3.1 3.0
21 Technology standards and governance are in place to support data management objectives 3.2 3.1
22 The data management program is aligned with internal technical and operational capabilities 3.2 3.1
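The breakdown cited above (15 higher, 4 lower, 2 unchanged) can be recomputed directly from the table; a minimal sketch in Python, with statement 17 omitted because it was only added in the 2017 survey:

```python
# (2017, 2015) score pairs for the 21 statements scored in both surveys
# (statement 17 is excluded; it was added after the 2015 survey)
pairs = [
    (3.5, 3.5), (3.4, 3.3), (3.5, 3.4), (3.5, 3.5), (3.6, 3.4),
    (2.7, 2.8), (3.2, 3.3), (3.3, 3.0), (3.6, 3.4), (3.5, 3.2),
    (3.6, 3.3), (3.0, 2.7), (3.0, 2.9), (3.3, 3.2), (3.2, 3.3),
    (2.8, 2.7), (2.6, 2.7), (3.1, 3.0), (3.1, 3.0), (3.2, 3.1),
    (3.2, 3.1),
]

higher = sum(1 for new, old in pairs if new > old)
lower = sum(1 for new, old in pairs if new < old)
unchanged = sum(1 for new, old in pairs if new == old)

print(higher, lower, unchanged)  # 15 4 2
```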

 

4th Report from BCBS on the adoption of the principles for effective risk data aggregation confirms standstill

BCBS published its 4th report on BCBS 239 compliance “Progress in adopting the Principles for effective risk data aggregation and risk reporting” in March 2017.

The 11 principles were scored between 2.60 and 3.37, and only 3 principles (27%) received a score higher than 3.

P1    P2    P3    P4    P5    P6    P7    P8    P9    P10   P11
2.93  2.60  2.73  2.97  2.73  2.83  2.70  3.03  3.07  2.90  3.37

 

Similar to the DCAM Data Management Benchmark results, the improvement seen in the BCBS progress report since 2015 is only marginal: a 0.05 increase in the average score, to 2.90, with 3 principles receiving lower scores than in the 2016 assessment. Based solely on these scores, only 21% of the 30 institutions are in full compliance.
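The reported average and the count of principles scoring above 3 can be checked against the table of principle scores above; a quick sketch in Python:

```python
# BCBS 239 principle scores (P1 through P11) from the March 2017 progress report
scores = [2.93, 2.60, 2.73, 2.97, 2.73, 2.83, 2.70, 3.03, 3.07, 2.90, 3.37]

average = sum(scores) / len(scores)
above_three = sum(1 for s in scores if s > 3)

print(round(average, 2))  # 2.9
print(above_three)        # 3 of 11 principles (27%) scored above 3
```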

 

5 major problem areas have been identified by the 2017 DCAM Data Management Benchmark Study:

Metrics    2.7  The costs of (and benefits associated with) the data management program are being measured
Adherence  3.0  The end user community is adhering to the data governance policy and standards
Meaning    3.0  The business meaning of data is defined, harmonized across repositories and governed
Lineage    2.8  End-to-end data lineage has been defined across the entire data lifecycle
Profiling  2.6  All data under the authority of the Data Management Program is profiled, analyzed and graded

 

Metrics, Adherence, Meaning, Lineage and Profiling (which the industry also commonly refers to as data quality assessment) received the lowest scores of the 22 statements.

The lack of measurement of data management operations, adherence, and data quality flags a key risk in the execution of a firm’s data management strategy.

This is the primary reason chief data officers fail to effectively communicate the value that their data office contributes to an organization.

Without a harmonized definition of business meanings, the industry will never be able to unravel interconnections, automate processes, or manage linked risks. Work is underway, but glossary reconciliation has proven to be a significantly more difficult and time-consuming task than anticipated.

With thousands of applications and hundreds of data models, this challenge gets even more complicated. Terms must be mapped from physical repositories and linked to logical data models, ultimately resulting in transparent lineage and data flows. But, as laid out in the BCBS 239 progress report and the 2017 DCAM Data Management Benchmark, financial institutions have a long way to go to define end-to-end data lineage.

The consequence is that data quality suffers because we don’t understand the rules, lack a common definition, and can’t map the business glossary to physical repositories where the assessment of data quality should actually be performed.

These 5 problem areas are the main reason why the implementation of robust data management seems to be at a standstill, and years away from completion.

 

Data management programs must be driven by or integrated into business initiatives

The foundation of data governance has already been designed to service the enterprise, but now we need to tackle the open challenges. We need to align ourselves more clearly with the business objectives and focus on the most important areas at each firm.

Therefore, we suggest that data management programs need to be driven by business initiatives, and fully integrated into these initiatives. This will help to scale to the level that is required to bring the problematic capabilities to maturity, while generating value-add for the most critical business areas.

Most importantly, it will fast track our progress towards trustworthy and accessible data for the business areas that matter the most.

What is the better approach to data management maturity: boiling the ocean, or focusing on a specific set of initiatives?

We are certain that firms with stronger project management capabilities and an initiative-driven approach will lead future scoreboards of BCBS 239 compliance and data management maturity by a large margin.

We shall see the next time the EDM Council conducts a round of assessment on the state of data management in the financial industry with Pellustro.

A market commentary provided by

Thomas Bodenski, Partner, Element22

The opinions expressed are as of August 2017 and may change as subsequent conditions vary.

