DIKE_12-2015-04

Marine Strategy Framework Directive (MSFD)
Common Implementation Strategy
12th meeting of the
Working Group on Data, Information and Knowledge Exchange (WG DIKE)
0900-1700, 12 October 2015
Conference Centre Albert Borschette, Room AB/5B, Rue Froissart 36, B-1040 Brussels
Agenda item: / 5
Document: / DIKE_12-2015-04
Title: / UPDATE to indicator structure and common elements for information flow
Prepared by: / ETC/ICM for EEA
Date prepared: / 29/09/2015
Background: / This document provides an update to the DIKE_11-2015-05 document [1], which introduced an analysis and proposal of a common structure and elements for an indicator assessment. Specifically, the update incorporates the latest developments in the regional indicator approaches and their bearing on the common MSFD indicator schema, as well as feedback received from WG DIKE 11.
The schema presented in this version should be regarded as a final draft, to be agreed at the Marine Directors meeting.

WG DIKE is invited to:

  1. Review and agree the common schema as presented.

Indicator structure and common elements for information flow

1 Summary

It is expected that assessments developed in the context of the Regional Sea Conventions (RSCs) will be used in support of the 2018 MSFD Art. 8 assessments, including through the use of commonly agreed indicators. For the two RSCs developing their next assessments (HELCOM's HOLAS II and OSPAR's Intermediate Assessment 2017), the indicator assessment approach is seen as a direct contribution to Member State needs in 2018.

The analysis presented in DIKE 11 Doc 05 of the two RSC approaches and the EEA approach to indicator assessment (Figure 1) suggests that there is an underlying common structure, and the possibility to agree a set of core information that could be mapped between the regional and European approaches and used in the provision of information that feeds upwards from Member States to RSCs to the European level.

Figure 1 Indicator common categorization and number of information fields in each structure (from DIKE 11 Doc 05)

The analysis and suggested improvements in the DIKE 11 document have been discussed with the RSCs and the EEA, and a summary of enhancements is given below that demonstrates a further convergence towards a common set of information for region-based indicators.

These improvements have been incorporated into the MSFD indicator schema draft and are presented in this document.

2 Indicator assessments

For environmental assessments, indicators are a well-established method to apply a metric to the status and trends of specific pressures and states. This approach, in one form or another, has been used in the RSCs and at the European scale for the last two assessment rounds. The MSFD criteria also correlate to an indicator-type approach, and this is recognised by the RSCs in their planning for their next regional assessments – the so-called 'roof reports'.

The breadth of subject matter of these indicators has already led to the formulation of agreed structures for the information that forms part of the indicator; this is equally true for the EEA, OSPAR and HELCOM. Information on developments within the Mediterranean and Black Sea is as yet not known. Each of these organisations has elaborated templates of information elements; Table 1 summarizes their approach and status.

Table 1 Overview of existing indicator structures

Organization / Development status / Presentation format / Structure
EEA / Currently used for core set indicators (CSI) / Online / Follows a strict metadata template adapted from the European SDMX Metadata Structure (ESMS)
HELCOM / Currently used for some existing assessments (e.g. eutrophication) and available for use in the roof report / Online / Follows a document-based structure based on an agreed HELCOM set of information
OSPAR / Template finalised and being used for the OSPAR Intermediate Assessment (2017). Available for use in the roof report / Planned to be online following development of the OSPAR Assessment Tool / Follows a document- and metadata-based structure, developed within OSPAR
UNEP/MAP / Not known / Not known / Not known
BSC / Not known / Not known / Not known

3 Revisions to indicator assessment structures

3.1 EEA Core indicator [2]

Recalling the recommendations to the EEA schema from DIKE 11 Doc 05:

Possible enhancements for MSFD

- There is no specific reference to geographical assessment units. However, the map products do depict the results by MSFD sub-regions, and the metadata structure has a field for 'contributing countries'.

- There is no explicit reference to an 'assessment dataset', which both OSPAR and HELCOM refer to; the EEA structure refers to the overall underlying dataset.

- With the exception of the DPSIR label type, there are no specific labels that link the indicator to a specific MSFD utilisation, i.e. using the MSFD criterion IDs.

The EEA have undertaken work to map the common schema to their existing indicator schema; see Annex 2: Analysis and mapping in detail (Excel workbook) "EEA_indicator Map Common". They are now in the process of resolving how to incorporate these changes. This means that they will provide more specific linkages to address the MSFD-specific enhancements, including:

- Clearer references to access and use rights

- Inclusion of trend information

- Linkage to assessment/reporting units

- Inclusion of an assessment dataset provision

- Labelling for MSFD criteria

3.2 HELCOM Core indicator [3]

Recalling the recommendations to the HELCOM schema from DIKE 11 Doc 05:

Possible enhancements for MSFD

- There is no explicit placeholder for temporal scope, i.e. the year range of the assessment

- There is no direct reference to which countries the indicator assessment applies to (i.e. a list of countries), although this can be inferred from the map of assessment units/data results

- Although an authorship citation is provided, there is no explicit reference to the access and use rights for the assessment dataset or map products.

HELCOM have agreed to include these enhancements in a future update to their template in connection with HOLAS II; the template is not yet elaborated with these elements in a presentable form. In addition, HELCOM have indicated that, with unique labelling, it will be possible for a national indicator to link to the regional indicator instance, and similarly it will be possible to provide reciprocal links from the regional indicator to a related national indicator. This also simplifies the potential issue of multiple-language support at a regional/European level, as contracting parties can work with national languages at the national indicator level and make use of the linkages.

3.3 OSPAR Common Indicator [4]

Recalling the recommendations to the OSPAR schema from DIKE 11 Doc 05:

Possible enhancements for MSFD

- A clearer information provision for the data product relating to an assessment result would be useful.

- The field 'linkage' would benefit from a more structured approach: a loose collection of links, most of which will be intrinsic to interpreting the assessment, may be too ambiguous and lead to information not being translated correctly into a common set of information fields.

- The structure would be improved by more explicit references to quality aspects of the assessment, monitoring and data aggregation methods.

OSPAR have undertaken a revision of their indicator template; see Annex 2: Analysis and mapping in detail (Excel workbook) "OSPAR_Indicator assessment". This revision addresses the suggested enhancements, including:

- Inclusion of the assessment method as a distinct field

- Data snapshot and the wider data results split into two distinct fields

In addition, OSPAR have indicated that, with unique labelling, it will be possible for a national indicator to link to the regional indicator instance, and similarly it will be possible to provide reciprocal links to a related national indicator (using the 'linkages' field). This also simplifies the potential issue of multiple-language support at a regional/European level, as contracting parties can work with national languages at the national indicator level and use the linkages.
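The reciprocal linking described above can be illustrated with a minimal sketch. The record type, field names and URI-style identifiers below are hypothetical illustrations only; in practice the agreed schema fields ('Unique reference' and 'version linkage'/'linkages') would carry this information:

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorRecord:
    """Minimal illustrative indicator record: a unique reference plus links to related records."""
    unique_reference: str                          # e.g. a URI or DOI ('Unique reference' field)
    language: str                                  # national records may use a national language
    linkages: list = field(default_factory=list)   # references to related indicator records

# Hypothetical identifiers, for illustration only
regional = IndicatorRecord("urn:example:regional:xyz:2017", "en")
national = IndicatorRecord("urn:example:national:se:xyz:2017", "sv")

# Reciprocal linking: each record points at the other via its unique reference,
# so a reader of the national (national-language) record can reach the regional one and vice versa
regional.linkages.append(national.unique_reference)
national.linkages.append(regional.unique_reference)
```

Because the linkage is carried by stable unique references rather than by duplicated content, the national record can remain in the national language while the regional record stays in English.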

3.4 UNEP/MAP

DIKE 11 Doc 05 did not specifically address a regional indicator structure from UNEP/MAP; the Member States recommended that this should be undertaken as a follow-up. The EEA and the European Topic Centre have been in communication with the UNEP/MAP Secretariat, and at the time of this report there is no further information that would enable such an analysis to be incorporated.

4 Revised version of the common indicator schema

Based on the feedback from DIKE 11 and the revisions to the regional templates resulting from the analysis, a second draft of the common schema is presented in Table 2. Only one additional field is suggested in the new draft (highlighted); most of the other changes relate to a clarification of the use of a field or a widening of its concept definition. The right-hand column (Change history) indicates the entity that suggested the change, the nature of the change and the field it affects.


Table 2 MSFD indicator schema updated draft outline

x = more than one field present in existing indicator structure
o = one field present in existing indicator structure

n = no explicit fields in existing indicator

ID / Category and relevant fields / Description / Content type / EEA / HELCOM / OSPAR / Recommendation / Change history (PROCESS: PARTY - nature of change [to which element])
Access and Use / Explicit information on access rights, usage rights related to the data products, indicator publication etc. i.e. data policy, copyright restrictions etc. / o / o / o
1 / Conditions applying to access and use / i.e. Copyright, data policy / text or URL / Required
Assessment Findings / Key messages, assessment results (text and graphic form), trend analysis and conclusions presented primarily as text / x / x / x
2 / Key assessment / Longer description of assessment results by assessment/reporting units / text / Required / DIKE 11: OSPAR - addition of reporting unit to [description]
3 / Key messages / Short descriptions of indicator outcome, e.g. trends, outcome against assessment threshold / text / Required / DIKE 11: OSPAR - extension of [description]
4 / Results and Status / Textual description of assessment results, could include graphics / text and figures / Required
5 / Trend / Textual description of assessment trend, could include graphics / text and figures / Optional
Assessment methods / Methodologies, aggregation methods, indicator specifications, references to other relevant methods / x / x / x
6 / Indicator Definition / Short description of indicator aimed at general audience / text / Required
7 / Methodology for indicator calculation / Text and tabular information on the process of aggregation and selection etc / text or URL / Required / DIKE 11: OSPAR - URL added as [Content type]
8 / Methodology for monitoring / Short textual description of monitoring requirements/method / text / Optional
9 / Indicator units / Units used for indicator / text or URL / Optional / DIKE 11: OSPAR - URL added as [Content type]; suggest to be optional as will be extensively incorporated in other fields [Recommendation]
10 / Concept and target setting method / Text describing concept used and target method / text or URL / Optional / DIKE 11: OSPAR - URL added as [Content type], GES removed from [Field title]
Assessment Purpose / Purpose of assessment, rationale of approach, targets and policy context / x / x / x
11 / Indicator purpose / Justification for indicator selection and relevance / text or URL / Required / DIKE 11: OSPAR - URL added as [Content type]
12 / Policy relevance / Text relating indicator to policy / text or URL / Optional / DIKE 11: OSPAR - URL added as [Content type]
13 / Relevant publications (policy, scientific etc) / Citable URLs to policy documents related to indicator / text or URL / Optional / DIKE 11: OSPAR - URL added as [Content type]
14 / Policy Targets / Description of policy target / text / Optional
Contact and Responsibility / Contact details for assessment indicator, including authorship and organisational contact points / x / x / x / DIKE 11: SE - Ownership is confusing, responsibility covers better [Category title]
15 / Contributing countries / List of contributing countries (ISO 2-letter country code) / text / Optional / DIKE 11: OSPAR - Concept changed to a more generic country as opposed to individual author [Field title] + [Description]
16 / Citation / Full citation / text or URL / Required / DIKE 11: OSPAR - URL added as [Content type]
17 / Point of contact / Organisational contact / text / Required
Data inputs and outputs / Data sources, assessment datasets, assessment results (tabular and dataset), snapshots etc. / x / x / x
18 / Data sources / Underlying datasets / text and/or URL / Required
19 / Assessment dataset / snapshot dataset that was derived from underlying data / URL / Required
20 / Assessment result / summary results dataset/table/figure / File or web service / Required
21 / Assessment result- map / GIS version of assessment result i.e. Shape file or WFS / File or web service / Optional
Geographical scope / Assessment units, other geographical information, countries / x / o / x
22 / Assessment/Reporting unit / Nested assessment unit (if available) / text / Optional / DIKE 11: OSPAR - addition of reporting unit to [Field title]
23 / Countries / Countries that the indicator covers / text / Required
24 / Other geographical unit / alternate source of geographical reference for indicator i.e. ICES areas / text or URL / Optional
37 / Assessment area (context) / Gives a description of the area where the assessment is made (e.g. the assessment may be part of an RSC assessment and therefore cover an entire region, or it may be a national assessment covering all national waters or a part of them) / text / Optional / DIKE 11: FI - NEW FIELD suggested addition to clarify context of assessment unit [Field title]
Labelling and Classification / identification and classification systems, such as INSPIRE, DPSIR, MSFD criteria / x / o / x
25 / DPSIR / assessment framework linkage / text / Optional
26 / MSFD criteria / Criteria coding as listed in Annex III, Tables 1, 2 and 3 / text / Required / DIKE 11: SE - table 3 missing from [Description]
27 / Indicator title / Full title of indicator as published / text / Required
28 / INSPIRE topics / Keyword topics / text / Required
Quality Aspects / Explicit information content on quality of assessment, including data and methods. i.e. Uncertainty, gaps in coverage etc. / x / x / x
29 / Data confidence / Adequateness of spatial and temporal coverage, quality of data collection etc. / text / Required
30 / Indicator methodology confidence / Knowledge gaps and uncertainty in methods/knowledge / text / Required
31 / GES - confidence / Confidence of target, descriptive text / text / Optional
Temporal scope / Time range of assessment, usually expressed as a year range / x / o / x
32 / Temporal Coverage / Assessment period expressed as year start - year end / date range / Required
Version control / Publishing dates, references to previous indicator versions, URI's etc. / x / o / x
33 / Last modified date / date of last modification / date time / Optional
34 / Published date / publish date of indicator / date time / Required
35 / Unique reference / Citable reference unique to resource i.e. URI, DOI / text or URL / Optional
36 / Version linkage / Link to other versions of assessment / URL / Optional
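To make the Required/Optional recommendations in Table 2 operational, the schema could be expressed as a machine-readable checklist. The sketch below lists the fields marked 'Required' in Table 2 and checks a candidate indicator record against them; the slug-style field names are invented for illustration and are not agreed identifiers:

```python
# Required fields taken from the 'Recommendation' column of Table 2
# (schema field IDs shown in the comments); names are illustrative slugs.
REQUIRED_FIELDS = {
    "conditions_of_access_and_use",            # 1
    "key_assessment",                          # 2
    "key_messages",                            # 3
    "results_and_status",                      # 4
    "indicator_definition",                    # 6
    "methodology_for_indicator_calculation",   # 7
    "indicator_purpose",                       # 11
    "citation",                                # 16
    "point_of_contact",                        # 17
    "data_sources",                            # 18
    "assessment_dataset",                      # 19
    "assessment_result",                       # 20
    "countries",                               # 23
    "msfd_criteria",                           # 26
    "indicator_title",                         # 27
    "inspire_topics",                          # 28
    "data_confidence",                         # 29
    "indicator_methodology_confidence",        # 30
    "temporal_coverage",                       # 32
    "published_date",                          # 34
}

def missing_required(record: dict) -> set:
    """Return the required schema fields absent from an indicator record."""
    return REQUIRED_FIELDS - record.keys()

# Example: a partially filled record, showing which required fields remain
record = {"indicator_title": "Example indicator", "countries": ["SE", "FI"]}
remaining = missing_required(record)
```

A check of this kind could run at the point where a regional or national indicator is mapped into the common schema, flagging incomplete records before they are propagated upwards.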


5 Annex 1: Compiled feedback from DIKE 11

id / Entity / Comment(s) / Action needed? / Resolution/Comment
1 / EEA / 6 fields need to be added to the EEA indicator template to comply with common structure / EEA marine unit to be in contact with their IDM/COM units to resolve / Add fields to EEA template
2 / Sweden / Does this mean that e.g. WFD and MSFD indicators can easily be combined? / EEA to comment back / n/a
3 / Sweden / Trends are mentioned twice and it is unclear what is referred to: trends over time? Could they be combined or do they refer to different things? Or is 'Trend' under 'category' a 'trend indicator' to be used when it is not possible to establish a GES boundary? / EEA/ETC to comment back / The 'Trend' field is the most specific and will include a clear indication of the direction of a trend in an indicator, i.e. increasing/decreasing/no change;
'Key messages' and 'Key assessment' may be expanded texts/graphics etc. which may include trend (in time) information but also other information on the assessment result and its implications
4 / Sweden / For 'GES - concept and target setting method', better to use 'boundary' instead of 'target', so as not to be mixed up with targets according to Article 10 / Target came from the HELCOM template; need guidance from COMM/EEA / For discussion
5 / Sweden / 'indicator purpose' - Ecosystem or policy relevance ? / ETC to comment / In theory both ecosystem and policy relevance information can be included/referred to. However, there is a specific field for links to policy relevance so this would mean it is more likely to include ecosystem relevance here
6 / Sweden / 'Relevant publications (policy, scientific etc)' - the most important references must be listed, but URLs are often not possible for scientific papers / ETC/EEA to comment / OK to have URLs and 'offline' citations as long as the citation source is publicly accessible
7 / Sweden / definition of 'policy targets' / COMM/EEA to comment/clarify / can refer to any level of policy i.e. national, regional etc.
8 / Sweden / Confused by the term 'ownership' as there can be multiple owners of an assessment / ETC to clarify / The 'owner' is the agency, organisation or institute responsible for the collation and provision of the indicator. 'Ownership' can be switched to 'responsible organisation' if this is clearer
9 / Sweden / 'Underlying data' - further explanation needed / ETC to clarify / Underlying datasets refers to the provision of data pertaining to assessment under Art. 19.3 of the Directive. The data are not being requested to be reported as such, but made available in a way that allows the indicator to be transparent regarding the evidence base on which it was built
10 / Sweden / 'Assessment dataset' further explanation needed / ETC to clarify / This is the instance of the dataset derived from the underlying dataset(s) that has usually undergone some form of processing i.e. filtering for quality control, harmonization of units etc. This is typically the dataset from which the indicator results are derived. The concept is recognised in both HELCOM (assessment dataset) and OSPAR (snapshot dataset) indicator processes. The 'instance' concept is important as the underlying data sources are dynamic and it is therefore difficult to reproduce an assessment result without the point in time assessment dataset.
11 / Sweden / 'Assessment result' - what is the intention? / ETC to clarify / The EEA, MS and RSCs have all expressed a wish to have the most up-to-date information shared between systems and to ensure that an assessment result at regional level is mirrored at European level. This could be aided by automation in the system(s) providing indicator results. Both OSPAR and HELCOM have suggested that such services could be a feature of their online systems (ODIMS and the HELCOM Map and Data Service respectively), so it would be prudent to cater for this while recognising that in some cases a less automated process may be necessary. Overall, the aim for WISE Marine is a system where as little manual work as possible is needed to keep the information up to date