lead: Rachael Akinyede

contributors: Christoph Gerbig, Andrea Kaiser-Weiss, Christian Plaß-Dülmer, Beatrice Ellerhoff, Hannes Imhof, Max Reuter

related deliverable: D-K.2.2: Second Progress report including first consolidated Traceability Matrix [month: 24/o-15] (MPI-BGC)

See Jira Metadata here: https://itmsgermany.atlassian.net/jira/software/c/projects/IK/boards/5?selectedIssue=IK-37

Summary:

This deliverable follows up on the first progress report and the conceptual design of a traceability matrix to track the progress of the ITMS project. It aims to consolidate (unify and reinforce) our plans for the traceability matrix based on discussions at the ITMS Benchmarking Meeting on 21 February 2024 and follow-up discussions between the modules. These consolidated plans will be applied in the upcoming deliverable D-K.3.2 part 2, “Applied Traceability Matrix and updates at GMs”. In addition, this deliverable reports on the status of the yearly interim report to the funding agency, BMBF.

  1. Second progress report:
    All individual partners submitted the second interim progress report (for the year 2023) to the funding agency, BMBF, on schedule.

  2. First consolidated Traceability Matrix:
    As reported in /wiki/spaces/ITMS/pages/48562177, the two levels of traceability identified there will be adopted: a structural traceability matrix, which tracks progress against the schedule given by deliverables and milestones, and a performance-oriented traceability matrix, which evaluates progress towards the ITMS project aims using specific metrics. A preliminary application of both matrices has been reported in /wiki/spaces/ITMS/pages/53706757.

    1. Report on consolidated Structural Traceability Matrix:
      Consolidating the structural traceability matrix means bringing together up-to-date information on the project schedule from the project management infrastructure (Confluence and Jira), facilitated by the ITMS K-working-group coordinators and applied continuously. Since the conceptual design of the structural traceability matrix, the pre-defined requirements for providing such reports remain in effect and involve answering the following questions:

      • Was a deliverable/milestone submitted or reached?

      • Was it submitted/reached on time?

      • Was it reached as planned or with limitations/concessions?

      • What are the currently pending tasks and open risks within the project?

      In addition to the above, we planned to make this information accessible to everyone within the project in a single Confluence location. The following has thus been established:

      • Extended visualization of the status of deliverables and tasks to all principal investigators of all project partners via Jira Kanban boards ("todo", "ongoing", "done"), including “red flagging” of tasks as a traffic-light system in case of issues.

      • In the event of severe issues, reporting of such “red flags” to all project members via the respective Confluence module space.

      • Submission of deliverables via Confluence pages.

      • Implementation of a 4-eye check as a review process for all submitted deliverables. The K-working-group coordinators are responsible for this check, ensuring that all submitted deliverables and accompanying metadata are complete, accessible, fit for purpose, and understandable, amongst other criteria decided by the reviewer performing the 4-eye check.

      Since not all Jira information is yet easily accessible in a single place to all project members, we further aim to establish the following (an illustrative sketch of a single traceability-matrix entry is given at the end of this subsection):

      • Extended visualization of the status of deliverables and tasks to all project members via Jira Kanban boards as described above.
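
      To make the structural traceability matrix more concrete, the following is a minimal, hypothetical sketch (in Python) of how a single matrix entry could be represented; the class, field names, and example values are our own illustration and are not part of the ITMS project tooling.

```python
# Illustrative sketch only: one possible representation of a structural
# traceability matrix entry, mirroring the questions and Kanban states above.
# All names and example values are hypothetical, not ITMS tooling.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Status(Enum):
    """States mirroring the Jira Kanban board columns."""
    TODO = "todo"
    ONGOING = "ongoing"
    DONE = "done"


@dataclass
class TraceabilityEntry:
    item_id: str                               # deliverable or milestone, e.g. "D-K.2.2"
    due_month: int                             # project month from the work plan
    status: Status = Status.TODO
    submitted_on_time: Optional[bool] = None   # unknown until the item is due
    limitations: str = ""                      # concessions or deviations from the plan
    red_flag: bool = False                     # traffic-light marker for severe issues

    def needs_escalation(self) -> bool:
        """Red-flagged entries are reported to all members via the module spaces."""
        return self.red_flag


# Example with placeholder values for the deliverable this page belongs to.
example = TraceabilityEntry(item_id="D-K.2.2", due_month=24,
                            status=Status.DONE, submitted_on_time=True)
```

      Whether such entries are maintained directly in Jira or exported to a table, the fields above correspond one-to-one to the questions listed in this subsection.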

    2. Report on consolidated Performance-oriented Traceability Matrix
      In developing a traceability matrix that evaluates the status of the greenhouse gas monitoring capability developed within ITMS, the use of key performance indicators (KPIs) was suggested. As a rule, each KPI should be selected based on the following pre-defined requirements, as previously reported:

      • Precision: Is the metric clearly defined, and is success tested in a way that identifies and facilitates progress in the project?

      • Applicability: Can the metric be straightforwardly implemented?

      • Objectivity: Could someone else apply the measure and obtain the same result?

      • Communication: Can others understand it?

      Here, we have consolidated all quantifiable KPIs applicable to the individual modules, including possible scoring metrics. These KPIs have been selected to benefit not just the planning of phase 2 but all envisioned phases of the project:

Key performance indicators (KPIs) consolidated across all modules

Module B: KPIs (Abbreviation) →

| Metrics/Scores | Score | Documentation | Timeliness | Integration in ITMS | Primary Data Quality Parameter | Secondary Data Quality Parameter |
| --- | --- | --- | --- | --- | --- | --- |
| Excellent | 10 | Defined by data provider by adapting a template. Reviewed, e.g., by data user. | .. | .. | .. | .. |
|  | 9 | .. |  |  |  |  |
| Very good | 8 | .. |  |  |  |  |
|  | 7 | .. |  |  |  |  |
| Good | 6 | .. |  |  |  |  |
|  | 5 | .. |  |  |  |  |
| Fair | 4 | .. |  |  |  |  |
|  | 3 | .. |  |  |  |  |
| Basic | 2 | .. |  |  |  |  |
|  | 1 | .. |  |  |  |  |

Module K: KPIs (Abbreviation) →

| Metrics/Scores | Score | User-friendliness | Interoperability | Management Success |
| --- | --- | --- | --- | --- |
| Excellent | 10 | metric based on soft factors under development | metric based on soft factors under development | metric based on soft factors under development |
|  | 9 |  |  |  |
| Very good | 8 | metric based on soft factors under development | metric based on soft factors under development | metric based on soft factors under development |
|  | 7 |  |  |  |
| Good | 6 | metric based on soft factors under development | metric based on soft factors under development | metric based on soft factors under development |
|  | 5 |  |  |  |
| Fair | 4 | metric based on soft factors under development | metric based on soft factors under development | metric based on soft factors under development |
|  | 3 |  |  |  |
| Basic | 2 | metric based on soft factors under development | metric based on soft factors under development | metric based on soft factors under development |
|  | 1 |  |  |  |

Module M: KPIs (Abbreviation) →

| Metrics/Scores | Score | Use of multiple Datastreams for Inversion | Use of multiple Datastreams for Diagnostics | Timeliness for provision of inversion products | Uncertainty reporting | Number of species treated: MPI | Number of species treated: DWD | Number of sectors treated: MPI | Evaluation against external inversion systems | Analysis of different settings (different locations, networks/data streams, tracers) | User interaction / links to other projects and stakeholders (based on strength of interactions with links A, B, C plus extra engagement) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Excellent | 10 | all available ground-based obs. + remote sensing + TCCON + COCCON | + subset of each data stream set aside for validation (i.e. not used in inversion) | Real-time (within 3h) | + spread of inversion ensemble (station subsets, prior fluxes, + satellite vs. ground-based) | 5 (gross fluxes biosph. CO2 + anthr. CO2 + CH4 + N2O) | gross fluxes biosph. CO2 + anthr. CO2 + CH4 + N2O + F-gases | + all sectors contributing at least 10% of total emissions, 12 regions in Germany | all of the below + for all different species | 3 cities, 6 monitoring configurations, 4 seasons, total CO2 and fossil fuel and biogenic CO2, yes | Strong interaction with A, B, C plus citizen engagement |
|  | 9 |  |  | Near real-time (days) |  |  |  |  |  | 3 cities, 6 monitoring configurations, 4 seasons, total CO2 plus fossil fuel and biogenic CO2, no | Strong interaction with A, B, C |
| Very good | 8 | + night-time data | + subset of each data stream set aside for validation (i.e. not used in inversion) | Year -1 month | + spread of inversion ensemble (station subsets, prior fluxes) | 4 (gross fluxes biosph. CO2 + anthr. CO2 + CH4) | 4 (biosph. CO2 + anthr. CO2 + CH4 + F-gases) | all species: 4 sectors in Germany + lateral boundary correction | detailed intercomparison protocol | 3 cities, 6 monitoring configurations, 2 seasons, total CO2 plus fossil fuel and biogenic CO2, no | Strong interaction with A, B, weak interaction with C |
|  | 7 |  |  | Year -4 months |  |  |  |  |  | 2 cities, 6 monitoring configurations, one season, total CO2 plus fossil fuel and biogenic CO2, yes | Weak interaction with A, B, C plus citizen engagement |
| Good | 6 | + additional vertical levels at tall towers | + aircraft (IAGOS + campaigns) | Year -1 year | + monthly spatial maps | 3 (gross fluxes biosph. CO2 + CH4) | CH4 + biosph. CO2 + anthr. CO2 | CH4: + lateral boundary correction; CO2: simplified biospheric and anthropogenic | + reporting of detailed inversion settings (prior error, model-data mismatch error, station set, posterior error) | 2 cities, 4 monitoring configurations, 3 seasons, total CO2 plus fossil fuel and biogenic CO2, no | Strong interaction with A, B, no interaction with C |
|  | 5 |  |  | Year -1.5 years |  |  |  |  |  | 1 city, 6 monitoring configurations, one season, total CO2 and fossil fuel and biogenic CO2, no | Strong interaction with A, weak interaction with B, no interaction with C |
| Fair | 4 | all available ground-based obs. | subset of sites | Year -2 years | statistical posterior uncertainty at monthly/country scale | 2 (biosph. CO2 + CH4) | CH4 + simplified biospheric CO2 | CH4: 4 sectors in Germany; CO2: simplified biospheric (no vertical profile) | posterior flux comparison | 1 city, 2 monitoring configurations, total CO2 and fossil fuel and biogenic CO2, yes | Weak interaction with A and B, no interaction with C |
|  | 3 |  |  | Year -2.5 years |  |  |  |  |  | 1 city, 4 monitoring configurations, one season, total CO2, no | Strong interaction with A, no interaction with B and C |
| Basic | 2 | <20 ground-based sites | none | Year -3 years | none | 1 (biosph. CO2) | Level 2: CH4 including time profiles; Level 1: passive tracer with time-constant emissions | no sectors, 4 regions in Germany | none | 1 city, 2 monitoring configurations, one season, total CO2, no | Weak interaction with A |
|  | 1 |  |  | On request |  |  |  |  |  | 1 city, no monitoring configuration | Only outreach |

Module Q&S: KPIs (Abbreviation) →

| Metrics/Scores | Score | Resolution: Spatial | Resolution: Temporal | Uncertainties | Timeliness | Integration into Q&S Database (currently GRETA) | Calculation Approach | Data quality (detail level of input data, consistency) | Data availability to ITMS |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Excellent | 10 | 10 m | hourly | data from ensembles and gridded | real-time (within 3h) | automatic integration into Q&S database | model ensembles / multiple models | perfect (high resolution / high consistency) | good / low expenses |
|  | 9 | 100 m |  |  | near real-time (days/1 week) |  |  |  |  |
| Very good | 8 | 1 km | daily | data from ensembles | year -1 month |  | Tier 3 |  |  |
|  | 7 | 2 km |  |  | year -4 months | running own data portal |  | high resolution / medium consistency |  |
| Good | 6 | 4 km | weekly | gridded uncertainty (value ± error) | year -1 year |  |  | medium resolution / high consistency |  |
|  | 5 |  |  |  | year -1.5 years |  | Tier 2 | medium resolution / medium consistency |  |
| Fair | 4 | 6 km | monthly |  | year -2 years | manual transfer to Q&S database |  | high resolution / low consistency |  |
|  | 3 | 10 km |  | uncertainty per sector (value ± error) | year -2.5 years |  | Tier 1 | medium resolution / low consistency |  |
| Basic | 2 |  | quarterly |  | year -3 years |  |  |  |  |
|  | 1 | 25 km | yearly | no | on request | manual transfer to user | guess | low resolution / low consistency | bad / high expenses |

Comments

All the KPIs presented here will be further evaluated per module. They will also be combined into overall metrics in Module K and visualized using a spider graph (radar chart), a tool useful for combining multiple metrics and comparing one system with another in a single graph.
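
As an illustration of such a spider graph, the following minimal Python/matplotlib sketch plots a set of hypothetical KPI scores on the 1–10 scale defined above; the KPI subset and the score values are placeholders, not actual ITMS evaluation results.

```python
# Minimal sketch of a KPI spider (radar) graph; scores are placeholders,
# not actual ITMS evaluation results.
import numpy as np
import matplotlib.pyplot as plt

kpis = ["Documentation", "Timeliness", "Integration in ITMS",
        "User-friendliness", "Interoperability"]   # example KPI subset
scores = [8, 6, 7, 5, 9]                           # hypothetical values on the 1-10 scale

# One angle per KPI; repeat the first point to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(kpis), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(kpis)
ax.set_ylim(0, 10)                                 # scale: 1 (Basic) to 10 (Excellent)
ax.set_title("Example KPI spider graph (placeholder scores)")
fig.tight_layout()
fig.savefig("kpi_spider_example.png")
```

Comparing two systems (or two evaluation dates) would simply mean plotting a second polygon with its own scores on the same axes.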

Planned updates related to this deliverable:

The traceability matrix and its measurement metrics will be discussed continuously at ITMS-K meetings and, in particular, before the due dates of the following deliverables:

  • D-K.2.3: Third interim report including deliverable status and updated traceability matrix [month: 36/o-39] (DWD).

  • D-K.2.4: Final report including deliverable status and updated traceability matrix [month: 48/o-51] (MPI-BGC).
