

Framing an independent, integrated and evidence-based evaluation of the state of Australia’s biophysical and human environments

2014, Journal of Environmental Planning and Management

Trevor Ward (1,2), Steven Cork (3), Kirstin Dobbs (4), Peter Harper (5), Peter Harris (6), Tom Hatton (7), Robert Joy (8), Peter Kanowski (9), Richard Mackay (10), Neil McKenzie (11), Barbara Wienecke (12)

1 Greenward Consulting, Wembley, WA, Australia
2 University of Technology Sydney, Ultimo, NSW, Australia
3 EcoInsights, Canberra, ACT, Australia & Crawford School of Public Policy, ANU, Canberra, ACT, Australia
4 Great Barrier Reef Marine Park Authority, Townsville, Qld, Australia
5 Population, Labour and Social Statistics Group, Australian Bureau of Statistics, Canberra, ACT, Australia
6 Environmental Geoscience Division, Geoscience Australia, Canberra, ACT, Australia
7 CSIRO, Perth, WA, Australia
8 Newnham, Launceston, Tas, Australia
9 Fenner School of Environment and Society, ANU, Canberra, ACT, Australia
10 Godden Mackay Logan Pty Ltd, Sydney, NSW, Australia & La Trobe University, Bundoora, Victoria, Australia
11 CSIRO Land and Water, Canberra, ACT, Australia
12 Australian Antarctic Division, Department of the Environment, Hobart, Tas, Australia

Corresponding author: Trevor J Ward, PO Box 493, Wembley, WA, 6014, Australia; email: tjward@bigpond.net.au

Keywords: environment report card; conceptual framework; integrated performance assessment; expert elicitation; consultative process

Abstract

A new approach was developed for Australia's 2011 national State of the Environment (SoE) report to integrate the assessment of biophysical and human elements of the environment. A Common Assessment and Reporting Framework (CARF) guided design and implementation, responding to jurisdictional complexity, outstanding natural diversity and ecosystem values, high levels of cultural and heritage diversity, and a paucity of national-scale data. The CARF provided a transparent response to the need for an independent, robust and evidence-based national SoE report. We conclude that this framework will be effective for subsequent national SoE assessments and other integrated national-scale assessments in data-poor regions.

1. Introduction

Australia is the world's largest island continent and sixth largest country, with jurisdiction and management authority over 27.45 million km2, including a landmass of 13.59 million km2 and an associated marine zone of 13.86 million km2 stretching from the tropics to sub-Antarctic regions (including the Exclusive Economic Zone, Extended Continental Shelf, and Australian Antarctic Territory: Symonds et al. 2009). The environments encompassed by this area of land and sea include many tangible and intangible assets and values, including mineral and natural resources, natural and cultural heritage values, and a range of ecosystem services for generation of wealth, lifestyle, well-being, recreation and cultural appreciation (SoEC 2011). The assets and values are also represented by many iconic and globally recognised unique features, including 19 World Heritage sites. The ecosystems and biodiversity are exceptional, as they comprise 7 to 10% of all known species, including more than 17,000 species of flowering plants and 33,000 known marine species (Steffen et al. 2009, Butler et al. 2010). The geologically ancient landmass is diverse, with cultural traditions that extend back thousands of years (Blewett 2012, 571).
This terrestrial and marine domain is now overlain with 200 years of colonial and post-colonial activity and heritage, particularly across the coastal landscapes where the majority of Australians live within the seven largest cities (SoEC 2011). The Australian Government's Environment Protection and Biodiversity Conservation (EPBC) Act requires a State of the Environment (SoE) report to be prepared every five years on the nation's ecosystems, natural and physical resources, heritage value and quality of places, and relevant social, economic and cultural aspects. The fourth such assessment and report, conducted by an independent committee (SoEC 2011), was built on the foundations provided by earlier national SoE assessments and by state and regional-level environment reports (e.g. GBRMPA 2009, Dobbs et al. 2011). The 2011 SoE report extended the assessment system of earlier reports to encompass more fully the expectations of the EPBC Act and to provide a more comprehensive report. The overall Driving Forces, Pressure, State, Impact, Response (DPSIR) approach adopted for the SoE 2011 report is a performance assessment consistent with modern environmental reporting (Smeets and Weterings 1999), but we incorporated several new aspects, layers of information and a process culminating in summary report cards for each theme. In this respect, the SoE 2011 report was designed to be both technically factual and readily accessible to a wide variety of stakeholders, employing information products at multiple levels in both print and electronic formats. In addition to meeting national objectives, national SoE reports play an important role as source documents for Australia's international reviews and reporting obligations, such as the OECD's environmental performance reviews and Australia's report to the UN Convention on Biological Diversity.

Australia's national SoE process assesses and reports on condition, trends and pressures, but does not make management recommendations nor discuss policy responses/options to any issues that may be described in the report. In that sense, the SoE report is constrained to be assessment, synthesis, evaluation and reporting at the national scale. This provides policy makers with a robustly derived set of issues but does not constrain the types of policy responses that may be invoked. The intention of this approach is to generate an independent perspective on environment performance of the assets and values, to foster a national conversation about issues and their fundamental drivers relevant to these assets/values, and to provide an agreed platform of knowledge that can be used by policy-makers to engage with stakeholders about policies and appropriate supporting management strategies to address the issues.

It is difficult to design and apply a single assessment and reporting system that integrates and is equally effective for the biophysical and human elements of highly complex natural ecosystems, built environments and cultural heritage at a national scale. High-level and strategic evaluation of these systems at the national scale requires:

• identification of metrics that represent the intrinsic attributes of the systems,
• a process for estimating their condition and aggregation to achieve a synthesis and summary of the available information, and
• a reporting modality that is accurate, accessible and timely, with direct utility for national-level policy and management initiatives.
In high-value, large and complex natural and urban systems such as those of Australia, careful attention to core ecological and equity principles is needed to guide the content of the process (e.g. Foley et al. 2010), as well as to establish the context and uncertainties. For a low-bias assessment, the use of a small number of well-known system attributes has to be balanced against the use of a greater number of less reliable attributes that represent different aspects of the systems being assessed (to avoid high levels of uncertainty in model structure, sensu Walker et al. 2003, and minimise the potential for Type III error, e.g. Bark et al. 2013). There needs to be an explicit trade-off between detailed information on a limited subset of ecosystem and human attributes (which may have been studied for purposes unrelated to national policy) and lower-resolution information on a broader range of attributes and policy responses that are more relevant to the broader range of issues relevant to an integrated national-scale SoE assessment.

In this paper we outline the framework for the integrated assessment system we developed for Australia's 2011 national State of the Environment report, with selected examples of the process detail drawn from the nine assessment themes. With a focus on the processes of consultation, data capture, and synthesis and interpretation, we describe how the national-scale reporting was designed to respond to policy-driven requirements. We also consider a broad array of system-level attributes to provide support for a systematic and integrated approach to environment reporting. This experience is synthesised into a logframe (Team Technologies 2005) that may be of value for guiding future national SoE reports in Australia, as well as environmental assessments at other scales and in other jurisdictions where integration of biophysical and human elements of the environment is required.

2. Conceptual reporting approach

The SoE 2011 report principally comprises information on drivers (key activities and pressures on the environment); a series of theme reports that consider condition (state), pressures, management responses, resilience and risks; and culminates in an outlook for the future (SoEC 2011, 28). This structure, modelled on the Great Barrier Reef Outlook Report (GBRMPA 2009), provides consistency in most themes with previous SoE reports as well as the opportunity to develop a broadly-based environmental outlook. We adopt much of the European checklist approach (Kristensen et al. 1999), although the assessment and project management aspects are uncoupled to ensure there is a clear separation between the commissioning government agency (Australian Government Department of Sustainability, Environment, Water, Population and Communities; DSEWPaC—now Department of the Environment) and the independent committee appointed to be responsible for the report content.

A unified assessment and reporting system (the Common Assessment and Reporting Framework—CARF) was established to provide for assessment outcomes that were consistent across all themes, irrespective of their environment attributes. Based on performance assessment metrics (sensu Smeets and Weterings 1999), the assessment system provided a single type of finding in each theme, summarised into 'report card' format, enabling easy and direct comparisons of the assessment outcomes amongst themes (and, in the future, within themes).
Integration was achieved by applying the common assessment and reporting system to all aspects of all themes, including the biophysical and human environments. A central element of the CARF is the establishment of benchmarks, against which performance grades are assigned. For the natural systems being assessed, intrinsic 'naturalness' benchmarks were applied to represent a near-pristine set of the conditions that prevailed at the time of European settlement of Australia (notionally about 1800). In the human-environment constructs, such as the heritage and built environment themes, intrinsic natural attributes are not easily discoverable or estimable. Although these human constructs operate within the context of a natural environment that provides various environmental services (provisioning, regulating, cultural), there are nonetheless few appropriate natural or pristine surrogates that can serve as intrinsic benchmarks for assessment.

As a result, perception-based benchmarks for attributes of the environment, such as integrity of heritage values and urban livability, were developed and applied in the heritage and built environment assessments respectively. In the atmosphere theme, target-based benchmarks were applied in relation to human health, for attributes such as levels of ambient air quality that impact human health. In the land theme, soil condition was assessed against the baseline of condition prior to vegetation clearing. All the benchmarks were chosen to be consistent with precedents in national assessments in Australia. The CARF therefore enabled each theme to estimate a measure of current condition quality (inter alia) as the 'distance' of the current condition from a measured or estimated set of the benchmark conditions relevant to the theme under assessment.

For all themes (Table 1), the central technical issue revolved around development of a CARF that would robustly deal with the institutional and subject-matter complexity and a paucity of available data and information. There was a considerable number of relevant datasets available to some themes, but much of the information either did not relate specifically to the metrics, scale or scope required, or could not be synthesised and made available to the assessment in the required timeframe. In addition to the lack of an adequate knowledge base (or form of quantitative data or surrogates) to resolve the condition, pressures and trends within the CARF at a national scale, the assessment process was also constrained by the need for the report to be prepared and concluded within a two-year timeframe.
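To make the 'distance from a benchmark' grading described above concrete, the following minimal Python sketch scores a current condition estimate against its benchmark and maps the result onto a small set of ordinal grades. The metric, benchmark value, thresholds and grade labels are hypothetical illustrations, not the scales actually used in SoE 2011.

```python
# Minimal sketch: grading current condition as 'distance' from a benchmark.
# All metric names, benchmark values and grade thresholds are hypothetical.

def grade_condition(current: float, benchmark: float) -> str:
    """Map the ratio of current condition to its benchmark onto a grade.

    `benchmark` represents the reference state (e.g. notional ~1800
    'naturalness', a pre-clearing soil baseline, or an agreed target);
    `current` is the present estimate on the same scale.
    """
    ratio = max(0.0, min(current / benchmark, 1.0))  # clamp to [0, 1]
    if ratio >= 0.9:
        return "very good"
    if ratio >= 0.7:
        return "good"
    if ratio >= 0.4:
        return "poor"
    return "very poor"

# Example: a hypothetical 'native vegetation extent' metric.
print(grade_condition(current=62.0, benchmark=100.0))  # -> "poor"
```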
Table 1. Australia's SoE 2011 assessment themes and their main focus.

Theme: Focus
- Atmosphere: Climate, greenhouse gases, ambient air quality, indoor air quality, stratospheric ozone
- Inland Water: Inland river, lake, wetland and aquifer environments and ecosystems
- Land: Soil, vegetation and land use systems
- Marine Environment: Marine environments, ecosystems and biodiversity
- Antarctic Environment: Environments, ecosystems and biodiversity of Australia's Antarctic Territory and sub-Antarctic islands
- Biodiversity: Species, environments, native vegetation, ecological communities and ecosystem services
- Heritage: Natural heritage and reserved lands and waters, Indigenous heritage, historic heritage places
- Built Environment: Livability and efficiency associated with human-made physical structures, and the environmental regulating, provisioning and cultural services that support these
- Coasts: Pressures that impact the coast assets and values

Transparency was a key principle underlying the preparation and reporting system for this SoE, not only to provide an appropriate level of accountability in the expenditure of public funds, but also to document in the public domain the approach and assumptions underpinning the assessments, the process undertaken, and the findings. A high level of transparency was also important for an easily accessible archive (at www.environment.gov.au/soe), so that: (a) the assessment process could be efficiently replicated (and improved where necessary) to permit the five-year scales of change to be estimated at the next SoE assessment in a comparable way; and (b) public-domain contestability was feasible to promote continuous improvement of the assessment system.

3. Design of assessments

The EPBC Act is silent about the process that should be used for SoE assessments. Since the inception of national SoE reporting in Australia (1996), every five years governments have appointed and relied upon a small, independent committee of experts to guide and oversee assessment and production of a report that complies with the requirements of the Act. For the 2011 SoE report, the national environment minister appointed a group of eight experts to the SoE committee to represent each of the main themes, and co-opted a member of DSEWPaC staff with special expertise in Antarctic issues (SoEC 2011, 24). Resources were provided by DSEWPaC, including operating funds and support from dedicated staff, including technical and administrative staff who assisted with research, the logistics of the committee operations, report production, and inter- and intra-government liaison. The committee was also supported by commissioned research, case studies, compilations of statistics, and analyses drawn where possible from existing data and analytical products of national and state government agencies.

While the independence of the assessment process is not mandated in the EPBC Act, independence, transparency, wide consultation and technical robustness provide for a measure of 'arms length' assessment, and this was adopted by the committee as a form of assessment and reporting 'best practice' that was of particular importance for maintaining relevance to community expectations. The broad approach and design of the SoE process and report were developed by DSEWPaC in conjunction with the committee, including the overall framework and policy-derived principles for the process, the types of products that would be needed, and the extent of consistency with earlier SoE reporting products.
The 2011 SoE process resulted in four main products:

1. the main written document, which contains a set of definitive findings in each theme;
2. an 'in-brief' written document that essentially provides a summary of the main report;
3. an electronic on-line resource with supporting and additional information not able to be included in the printed report; and
4. a series of stakeholder engagements and presentations following release of the final report.

In addition to presentation in a format consistent with the DPSIR approach, all the products use language and concepts familiar to policy makers and the general public. They were designed to be useful for at least five years in secondary and tertiary education, in local and state government, and as overview material for use in research contexts in the private and public sectors. Key high-impact facts were developed into simple graphics, designed to be accessible for all readers (e.g. the number of places added to the National Heritage List in each year from 2005-06 to 2010-11: SoEC 2011, 707).

A consistent format for the report card was used to provide a more integrated overview of condition, pressures, trends and confidence (SoEC 2011, 29). The report card presented aggregated and summarised information, using either three or four performance grades, supported by short pieces of text to highlight the main underpinning arguments or evidence. This reporting requirement was also used as the basis for structuring consultation and information capture, although it was applied differently across themes depending on the availability of data and information, and the type of issues to be addressed in each theme. Each theme assessment involved extensive discussions with government agencies and specific consultation about the assessment approach, direction and data sources. During the planning and writing periods, theme authors consulted extensively with technical peers and related experts, through both individual and workshop sessions, to determine the availability of information, to fill knowledge gaps, and to determine which data were relevant for reporting.

Uncertainty, in the sense of precision and accuracy, has been identified, declared and addressed as a core activity in the SoE 2011 process, recognising the importance of this for providing outcomes that are relevant to environmental reporting (Udovyk and Gilek 2013). Precision of the findings (statistical uncertainty, sensu Walker et al. 2003) was established within the consultation process—authors and experts assigned their own estimates of confidence using a confidence structure established in the CARF. The accuracy of the expert opinion (model outcome uncertainty, sensu Walker et al. 2003) was checked by verification against extant technical data where that was practical, a substantial feature in some themes. To minimise the likelihood of substantive inaccuracy, a broad base of experts was consulted within a systematic process using the CARF reporting template, which constrains outputs to a maximum of four grades of performance (five nationally agreed grades in the case of urban air quality) in each metric. Also, opinion accuracy was traded off against resolution in the issues by keeping the assessment and reporting focused on a broad base of intrinsic assets and values, thus reducing the risk of decision model failure that could arise from high levels of model outcome uncertainty. Finally, considerable resources were devoted to an independent peer review process to check the structure, content and findings of each theme.
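To illustrate how a single report-card finding can carry a grade together with its trend and the confidence structure discussed above, here is a minimal Python sketch. The component, grade labels and confidence categories are hypothetical examples; the actual SoE 2011 report cards were compiled through the consultative process described in the text rather than generated from code.

```python
# Illustrative sketch of a report-card entry: a single graded finding
# carrying its trend and confidence, as described in the text.
# Component names, categories and values are hypothetical examples.
from dataclasses import dataclass

GRADES = ("very poor", "poor", "good", "very good")
TRENDS = ("deteriorating", "stable", "improving")
CONFIDENCE = ("limited evidence", "limited consensus", "adequate", "high")

@dataclass
class ReportCardEntry:
    component: str          # the asset/value being assessed
    grade: str              # one of GRADES (at most four performance grades)
    trend: str              # one of TRENDS
    confidence_grade: str   # confidence in the grade (precision/consensus)
    confidence_trend: str   # confidence in the trend
    comment: str = ""       # short supporting evidence or argument

    def __post_init__(self):
        assert self.grade in GRADES and self.trend in TRENDS
        assert self.confidence_grade in CONFIDENCE
        assert self.confidence_trend in CONFIDENCE

entry = ReportCardEntry(
    component="Quality of habitats: example habitat type",
    grade="good",
    trend="deteriorating",
    confidence_grade="adequate",
    confidence_trend="limited evidence",
    comment="Condition inferred from expert judgement, verified against "
            "available regional survey data.",
)
print(entry)
```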
4. Implementing the assessments

The process of designing and implementing the CARF was developed incrementally, and is summarised here in a scale- and theme-independent logframe (Team Technologies 2005) (Table 2). In each theme, different forms of data and information were available for analysis, including reports on various aspects of pressure, condition and trends in the environmental assets and values. However, much of the available information consisted of only partial assessments, covering limited topics or areas of the nationally relevant sets of issues. Each theme therefore, while consistent with the CARF, took a different path to select, assemble and assess data and information (Table 2), informed mainly by the context of the issues, the actors, the time and resources available for the reporting process, and the need to reach findings consistent with the CARF report card format.

The initial consultation tasks involved identification and engagement with an appropriate set of experts to satisfy the dual purposes of securing ownership of and engagement with the assessment, and identifying data and information sources. The data and information both suitable and available for assessment were identified and obtained from within government and non-government organisations, and through the experts consulted. Experts were also consulted in some themes to assist with setting the spatial and structural boundaries, with the establishment of a typology for the assessment (such as assisting to establish an assessment structure of parameters, components and metrics), and with development of appropriate surrogates for reference benchmarks. This consultation also helped to ensure that the assessment typologies established links between condition and trends in the environmental assets/values and the management frameworks, through relevant performance measures that could be useful in the development of policy responses. The diversity of approaches used by the assessment themes to implement the consultation, data capture and synthesis aspects is described here using five theme examples.

4.1 Built Environment

The built environment essentially refers to human-made constructs, albeit within the context of a natural environment that provides various environmental services to these constructs and the people who live within them. This posed challenges for the assessment process, as these attributes of the built environment are generally neither directly discoverable nor observable in nature itself, and there is no natural state to provide a benchmark for making assessments. After the identification of relevant experts, based on discipline expertise and coverage of the potential issues, the initial consultation process focussed on clarifying the scope of the chapter, including the attributes of the built environment for which assessments would be made, and identifying data sources that could assist in the assessments. There are considerable data on the economic and social aspects of the built environment, but little data to inform the environmental dimension, particularly at a national level, and so informed expert opinion was needed in order to develop an assessment that conformed to the CARF. A workshop of the experts was conducted to prepare the assessments.

For the assessment of state, a matrix of population size (representing groups of urban areas) by built environment attribute (such as urban amenity, transport, housing, etc.) within the two main components (livability, urban environmental efficiency) was developed. For each cell of the matrix, the available relevant evidence was considered along with expert opinion to determine both a grade and a trend, expressed in terms of an agreed grading scale (SoEC 2011, 821). Environmental attributes were aggregated to determine an overall grade for each population size group, rather than aggregating across population size groups to determine an overall grade for each environmental attribute.
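A minimal Python sketch of this matrix-based aggregation is given below. The population-size groups, attributes, grades and the use of a median to aggregate are hypothetical illustrations of the approach, not the actual workshop procedure or grading scale used for the built environment theme.

```python
# Sketch of the built environment state matrix: rows are population-size
# groups of urban areas, columns are environment attributes. Grades are
# aggregated across attributes to give one overall grade per population
# group (not across population groups per attribute).
# All group names, attributes and grades below are hypothetical.
from statistics import median

GRADE_SCALE = ["very poor", "poor", "good", "very good"]

matrix = {
    "cities over 1 million": {"urban amenity": "good", "transport": "poor", "housing": "good"},
    "100k to 1 million":     {"urban amenity": "good", "transport": "good", "housing": "very good"},
    "under 100k":            {"urban amenity": "very good", "transport": "poor", "housing": "good"},
}

def overall_grade(attribute_grades):
    """Aggregate attribute grades for one population-size group (median rank)."""
    ranks = sorted(GRADE_SCALE.index(g) for g in attribute_grades.values())
    return GRADE_SCALE[int(median(ranks))]

for group, grades in matrix.items():
    print(group, "->", overall_grade(grades))
```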
Confidence levels were assigned to indicate the extent of consensus in the ratings of the individual cells in the matrix and the consistency of ratings across the environmental attributes in the aggregation process. The comments section of the assessment summary was used to identify key determinants in the grading process, with the narrative and data presented in the chapter itself providing a more comprehensive evidence base for the assessment grades and trends. While there were differences of opinion among the experts who were involved in the assessment process, the structured approach that was used delivered assessments that considered all of the available evidence and reflected a balance of the experts' opinions, including a representation of all relevant forms of uncertainty.

4.2 Heritage

Heritage is a subset of the wider natural and cultural environment which is perceived as a valued inheritance to be passed on to future generations. For the purposes of SoE 2011, natural heritage was regarded as those lands which have or should be reserved for conservation purposes. For both Indigenous and historic heritage, SoE similarly addressed not only formally listed places (on statutory and non-statutory registers) but also those places which warrant heritage listing. Assessment of the current state of Australian heritage involves considering which values have been identified and their current condition. Evaluation of condition and trend was undertaken using two interrelated and complementary methodologies: commissioned expert assessment, and a series of expert workshops.

For the 2001 and 2006 SoE reports, the condition and integrity of a small stratified sample of places on the Register of the National Estate were assessed (Pearson and Marshall 2011). For SoE 2011 this study was repeated, and extended to cover natural and Indigenous heritage (ERM 2011, Pearson and Marshall 2011, Schnierer et al. 2011). A series of workshops was convened with representatives from peak government and non-government bodies in each heritage sub-theme: natural, cultural and historic. Participating agencies included the Australian Committee for IUCN, the 'Heads of National Parks' forum, state heritage officials, Australia ICOMOS and the DSEWPaC Indigenous Advisory Committee. At these workshops, a series of open questions was posed, leading to identification of matters relevant to the assessment of the condition and trend of Australia's heritage, as well as individual assessment of the agreed metrics. In the majority of cases, additional data sources or informants were also identified during these workshops. The workshop notes were circulated to participants for verification and published on the SoE 2011 website (http://www.environment.gov.au/soe/2011/report/heritage/index.html).
The Australian Heritage Council participated in two workshop discussions—the first to discuss the methodology, priorities and structure of the heritage theme chapter; and the second, an assessment and evaluation similar to that conducted with the other peak heritage bodies. Considerable structuring of available data and workshop expert opinion was needed to ensure conformity with the CARF. Allocation of grades and data confidence levels was undertaken separately by the heritage theme author, as the assigned grades and confidence levels were also informed by the external condition and integrity reports. In addition to reporting through the report cards of the CARF, many of the conclusions in the heritage theme chapter are demonstrated through case studies. There was strong consensus expressed in the workshops regarding important issues, circumstances or trends, but often no empirical data. The report card grades or trends are therefore also supported by examples that 'prove the point' (e.g. incremental destruction of Indigenous heritage: SoEC 2011, 737).

4.3 Biodiversity

Biodiversity is defined as the variety of life, including the diversity of species and the genetic material that they embody, and the aggregations of species and their interactions with the non-living world that constitute 'ecosystems' and landscapes. Successive Australian SoE reports have noted that limited data are available on biodiversity at national scales. While information is collated in various databases at a national scale, the collection of information on biodiversity (such as in field surveys) is performed by state-level government agencies and non-government organisations (e.g. Birds Australia) that may be collections of local groups. Past national SoE reports have collected published and unpublished information on aspects of the pressures on biodiversity and the changes in state that those pressures induce, but this information has rarely been collected with a view to supporting inferences at a national scale.

The assessment of biodiversity in the SoE 2011 report drew primarily on three sources of information:

1. a review of conclusions about drivers, pressures, state, impacts and responses at the state level by each of the state governments in their most recent SoE reports;
2. assessments of biodiversity made in other chapters of the SoE 2011 report (e.g. the marine chapter conducted an extensive expert consultative process, the inland water chapter drew on recent reviews of biodiversity in Australian rivers, and the land chapter reviewed current information on native vegetation); and
3. key recent reviews of some groups of species at a national level, including a review of the representation of terrestrial ecosystems in Australia's protected area system (Taylor et al. 2011), an independent report to the Australian Government about Australia's terrestrial biodiversity (DEWHA 2009), and a national assessment of the state of Australia's birds (Olsen 2008).

Given this information base, which included several previous expert consultative processes, the Biodiversity theme authors made judgments about the grades to be applied in the report cards in relation to pressures on biodiversity and state and trends. The levels of consensus among experts and the amount and reliability of information were made explicit in the confidence indicators in the report cards. Where there was limited evidence or consensus, which was a frequent occurrence, specific experts were contacted to determine if relevant information had been overlooked.
This approach did not avoid the problem that in many cases information was inadequate to support unequivocal conclusions, but it did allow identification of uncertainties, of where strong conclusions could be made, and of where it was most critical to obtain new information to support strategic decision-making. While most jurisdictions understand the nature and implications of pressures on biodiversity, there was a general pattern of inadequate investment to meet objectives and a failure to achieve desired outcomes for biodiversity. This was recognised as a major issue, which points to the need for critical examination of management effectiveness in relation to biodiversity.

4.4 Marine environment

The available marine data for the national assessment related to a species complement of only a few hundred species and habitats, representing, for example, less than 1% of the known species that occupy Australian marine waters. Further, most of the available species-level datasets were highly spatially biased towards intensively studied shallow-water areas, or structurally biased towards data that were required for natural resource management (such as for fisheries) rather than intrinsic ecological attributes of populations or habitats. An extensive expert consultative process was designed because of the paucity of suitable public-domain marine biological data and information. The primary objective of this was to secure a set of low-bias expert judgement assessments of the available marine data and information in relation to the marine assets and values. At least two currently active and field-experienced experts able to attend workshops were identified for each broad discipline/issue area. The consultative process involved establishing an assessment typology to represent the assets and values of the entire marine jurisdiction that was not biased by the extent of available data, and then conducting a series of three assessment workshops. The experts assigned scores/grades to condition, trends, pressures and confidence, and discussed their consensus assignments in the presence of their peers. The outcomes of the assessment process are summarised in the SoE 2011 report (SoEC 2011, 388) and are presented in more detail elsewhere (Ward forthcoming).

Information about each of the metrics (a hierarchical set of parameters, components and indicators) in the assessment was sought from the invited experts. Their grading judgements were assigned using pre-agreed assessment procedures and grading scales (SoEC 2011, 394), and this then provided the primary data for assessment. The opinions and judgements were contested and verified, sometimes extensively, at the workshops, in order to ensure that judgements could be considered as a consensus of the participating experts and be based on the available data and information (Ward 2011, Ward forthcoming). The data provided by the experts were aggregated into high-level graphical summaries. Summary statistics of the unweighted data and non-parametric statistical tools were used to avoid implicit weighting and complex indices that might bias outputs beyond that established by the explicit architecture of the assessment. All the data were standardised to a single grading scale for the report card, consistent with the CARF (SoEC 2011, 392), and have also been used for more detailed analyses providing more complex overviews that integrate condition, trend and information quality, to inform the development of integrated policy analysis and responses (Ward forthcoming).
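A minimal Python sketch of this kind of unweighted, non-parametric aggregation of expert scores is shown below. The components, expert scores and the 0-10 scale are hypothetical; the actual workshop procedures and grading scales are documented in SoEC (2011, 394) and Ward (2011).

```python
# Sketch: aggregating expert condition scores for marine components
# without weighting, using simple non-parametric summaries (median and
# inter-quartile range) rather than weighted or composite indices.
# Expert scores below are hypothetical illustrations on a 0-10 scale.
from statistics import median, quantiles

expert_scores = {
    "shelf habitats": [7, 6, 8, 7, 6],
    "demersal fish":  [4, 5, 3, 5, 4],
    "apex predators": [3, 2, 4, 3, 3],
}

def summarise(scores):
    """Unweighted summary of one component's expert scores."""
    q1, _q2, q3 = quantiles(scores, n=4)  # quartiles of the raw scores
    return {"median": median(scores), "iqr": (q1, q3), "n_experts": len(scores)}

for component, scores in expert_scores.items():
    print(component, summarise(scores))
```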
4.5 Antarctic environment

The natural and cultural values of Australia's Antarctic Territory (AAT), the Australian territory of Heard Island and McDonald Islands, and the waters surrounding these areas are managed by the Australian Antarctic Division (AAD) of the Department of the Environment (formerly DSEWPaC). The AAD operates with a strong basis in science, including extensive collaboration with other national and international institutions with Antarctic interests and expertise, and is Australia's principal institution for Antarctic science, policy and management. The data collection, synthesis and analysis for the SoE 2011 report were therefore led by the AAD, in cooperation with other relevant government and tertiary education institutions with expertise in Antarctic matters. While not fully independent of government, the process implemented by the AAD was heavily based on established scientific knowledge, was guided by the SoE committee within the common framework for decisions and reporting (the CARF), and involved extensive consultation outside DSEWPaC to establish the issues and verify the information base.

The Antarctic chapter covered various topics that were also addressed in the broader Australia-specific context (inter alia biodiversity, marine, heritage), but it also includes information about the cryosphere and about policy and governance issues specific to the management of the AAT and Australia's sub-Antarctic islands. This chapter therefore constitutes a report on the AAT and the sub-Antarctic islands as a single entity. However, not all issues relevant to the AAT and the islands could be dealt with in detail, so topics were selected to be most relevant for the 2011 report and to be representative of a broad range of Antarctic issues.

In preparation for the SoE 2011 report, there was extensive external consultation and collaboration focused on preparing the theme chapter. A number of meetings were held at the AAD's headquarters, also attended by staff from other research organisations, to determine the topics to be included in the 2011 report and the structure of the assessment tables. Consensus was achieved during the expert meetings on grades for condition, trends, pressures and confidence where data, particularly long-term data, were available. Where insufficient data were available to make an assessment, this was indicated appropriately in the tables and the confidence assignments. In support of the chapter development, case studies were commissioned from experts both within and external to the AAD. In the discussions of the various topics, the most up-to-date international science literature was considered, and the data summaries and presentation of each section were reviewed through repeated iterations by both internal and external experts prior to the final submission of the chapter for external peer review as part of the SoE-wide process.

5. Discussion

Like this assessment, the past cycles of national SoE reporting in Australia have been heavily constrained by limited availability of data and information at the national scale. As a result, past reports are typified by the 'shopping list' approach—reporting of issues and examples to provide evidence that reflects the importance of a pre-determined set of themes and issues of the day, filtered through the expertise and experience of the appointed committee and a limited range of consulted experts.
While this has provided an acceptable basis for SoE reporting and policy development in the past, as a result of a number of recent, highly fractious environmental debates there is now a heightened awareness in the Australian community of the need for greater process transparency and accountability in government, and of the need to provide for direct links between environment assessments and management responses. This includes a need to use unbiased data and knowledge, as well as internally coherent and consistently repeatable structures for making assessments that may be used to influence management strategies in sensitive areas of public policy such as climate, water, heritage and natural resource management. The adoption of the independent, evidence-based and highly consultative integrated assessment approach we report here distinguishes the 2011 SoE report from its predecessors, which have mainly relied on selected examples of the presumed issues drawn from information-rich sources. The CARF approach has allowed the SoE 2011 report to move away from the development of a catalogue of what is not known or has not been recorded, to pro-active evaluation and assignment of clear assessment grades based on more defendable, repeatable and transparent data collection processes, supported by explicit estimates of uncertainty that can be applied to the findings.

To provide a mechanism for feedback and assessment of the effectiveness of the SoE process as a whole, a structured feedback process was implemented by DSEWPaC. This resulted in substantial feedback, mostly supportive and constructive, indicating that there has been wide and positive acceptance in the Australian community. In the public release of the report, the national environment minister declared that the report would serve as a touchstone for subsequent government policy and decisions across all portfolios. Online and download readers of the report were issued with an invitation for voluntary feedback. The web-based survey (conducted through SurveyMonkey®) indicated that, in the 18 months after the report's release, 83% of respondents agreed that the SoE 2011 report had improved their understanding of national environmental issues, 74% agreed that they were made aware of new issues, 85% were assisted by the report in their work or study, and the decisions or actions of 68% of readers were influenced by the content of the report.

Many of the theme chapters are demonstrably influencing national policy. The heritage chapter is directly informing and influencing the preparation of a 'National Heritage Strategy' by the Department of the Environment and the Australian Heritage Council. Likewise, the Primary Industries Standing Committee is responding to soil management challenges identified by SoE 2011 (primarily relating to soil carbon stocks, widespread soil acidification and unsustainable rates of soil erosion by water). Internationally, the approach has been adopted for environmental reporting in similarly data-poor marine situations for the purposes of the World Ocean Assessment (Ward 2012). Ultimately, the most important measure of success will be the extent to which policy responses in the forthcoming decade reflect the issues and findings presented in the report.
It is too early to declare the process and report a success in terms of environmental outcomes, even though the report is making an important contribution to public debate and is helping to develop a sound basis for consideration of environment issues within public and policy-development circles. Whether this will translate to effective strategies and actions for improvement in Australia's environment issues remains to be determined, and will unfold with continuing iterations of the national SoE assessment process.

6. Conclusions

SoE 2011 represents a landmark in national environment reporting in Australia. Despite the persistent institutional and information complexities, significant improvements over earlier approaches were achieved. To meet the multiple, and sometimes competing, objectives, the project design was adaptive and developed incrementally, as summarised here in the CARF logframe. In particular, the CARF was important for improving the planning, implementation and management of SoE activities across multi-year budget cycles; for facilitating effective consultation and engagement with technical experts; and for a clear articulation of the capacity of the project to deliver high-impact outputs, which was needed to develop a supportive and engaged base of stakeholders.

For an SoE report that engages with the community, influences policy decisions in government, and is well accepted in the private sector, a well-founded consultative process that operates within a single unified assessment framework has been required. For the SoE 2011 report, the two central purposes of consultation were to develop and benchmark a process that provided a sense of ownership in the stakeholders, and to secure a robust and defendable technical information base for the findings of the report. With appropriate design and management of bias and risks, the use of expert judgement worked effectively and met both purposes. We therefore conclude that a basis of expert knowledge, drawn from within a well-designed consultation process, is both achievable and important for SoE reporting purposes and is effective in delivering credible findings.

The effectiveness and value of the CARF for SoE assessment and reporting will grow with each iteration of SoE assessments that uses the same approach. We expect that, with repeat cycles of assessments that use this CARF, future trends will become more obvious and new information will clarify the condition of many components assessed in SoE 2011. We also expect that such ongoing implementation of this approach will incrementally increase the effectiveness of management in maintaining the structure, functions and health of the full range of Australia's environments. In this way, the five-year cycle of national SoE assessment and reporting activities may then become accepted as an investment in the future rather than only a cost to the current budget.

Acknowledgements

This manuscript was improved by guidance and input from Nancy Dahl-Tacconi (formerly DSEWPaC). The analysis reported here does not necessarily represent the views or the official position of the Australian Government or the Australian Government Department of the Environment. The process and outcomes were made possible by the contributions of many consulted experts, reviewers and colleagues across many disciplines and organisations, to whom we are grateful (listed in SoEC 2011, 898-903).
The work reported here was funded and supported by DSEWPaC, and we recognise the high level of professional support provided by the DSEWPaC SoE Team and other staff, the professional support of the Biotext information consultancy (biotext.com.au), the vision of the commissioning Minister for Environment Protection, Heritage and the Arts (Peter Garrett) in presenting us with this challenge, and the support of the presenting Minister for Sustainability, Environment, Water, Population and Communities (Tony Burke) in accepting and delivering the final report to Australia. PTH publishes with the permission of the Chief Executive Officer, Geoscience Australia.

References

Bark, R. H., L. J. M. Peeters, R. E. Lester, C. A. Pollino, N. D. Crossman, and J. M. Kandulu. 2013. Understanding the sources of uncertainty to reduce the risks of undesirable outcomes in large-scale freshwater ecosystem restoration projects: An example from the Murray–Darling Basin, Australia. Environmental Science & Policy 33: 97–108.

Blewett, R. 2012. Shaping a Nation: A Geology of Australia. Canberra: Geoscience Australia and ANU Press.

Butler, A., T. Rees, P. Beesley, and N. Bax. 2010. Marine biodiversity in the Australian region. PLoS ONE 5(8): e11831. doi:10.1371/journal.pone.0011831.

DEWHA. 2009. Assessment of Australia's terrestrial biodiversity 2008. Canberra: Australian Government Department of the Environment, Water, Heritage and the Arts.

Dobbs, K., J. Day, H. Skeat, J. Baldwin, F. Molloy, L. McCook, M. Johnson, et al. 2011. Developing a long-term outlook for the Great Barrier Reef, Australia: a framework for adaptive management reporting underpinning an ecosystem-based management approach. Marine Policy 35: 233–244.

ERM. 2011. Condition and integrity assessment of natural heritage places. Environmental Resources Management, Australia. Canberra: Australian Government Department of Sustainability, Environment, Water, Population and Communities on behalf of the State of the Environment 2011 Committee.

Foley, M., B. Halpern, F. Micheli, M. Armsby, M. Caldwell, C. Crain, E. Prahler, et al. 2010. Guiding ecological principles for marine spatial planning. Marine Policy 34: 955–966.

GBRMPA. 2009. Great Barrier Reef Outlook Report 2009. Townsville: Great Barrier Reef Marine Park Authority.

Kristensen, P., L. Anderson, and N. Denisov. 1999. A checklist for state of the environment reporting. Technical Report 15. Copenhagen: European Environment Agency.

Olsen, P. 2008. The State of Australia's Birds 2008: A Five Year Review. Melbourne: Birds Australia.

Pearson, M., and D. Marshall. 2011. Study of condition and integrity of historic heritage places for the 2011 State of the Environment report. Canberra: Australian Government Department of Sustainability, Environment, Water, Population and Communities on behalf of the State of the Environment 2011 Committee.

Schnierer, E., S. Ellsmore, and S. Schnierer. 2011. State of Indigenous cultural heritage 2011. Canberra: Australian Department of Sustainability, Environment, Water, Population and Communities on behalf of the State of the Environment 2011 Committee.

Smeets, E., and R. Weterings. 1999. Environmental indicators: Typology and overview. Technical Report 25. Copenhagen: European Environment Agency.

SoEC. 2011. Australia State of the Environment 2011. Canberra: Australian Department of Sustainability, Environment, Water, Population and Communities on behalf of the State of the Environment 2011 Committee.
http://www.environment.gov.au/resource/state-environment-report-2011-soe-2011-contents

Steffen, W., A. Burbidge, L. Hughes, R. Kitching, D. Lindenmayer, W. Musgrave, M. Stafford Smith, and P. Werner. 2009. Australia's biodiversity and climate change. Canberra: Australian Government Department of Climate Change.

Symonds, P., M. Alcock, and C. French. 2009. Setting Australia's limits. AusGeo News 93, March 2009. http://www.ga.gov.au/ausgeonews/ausgeonews200903/limits.jsp

Taylor, M., P. Sattler, J. Fitzsimons, C. Curnow, J. Beaver, L. Gibson, and G. Llewellyn. 2011. Building Nature's Safety Net 2011: The State of Protected Areas for Australia's Ecosystems and Wildlife. Sydney: WWF-Australia.

Team Technologies. 2005. The Logframe Handbook: A Logical Framework Approach to Project Cycle Management. Washington, DC: World Bank. http://documents.worldbank.org/curated/en/2005/01/5846691/logframe-handbook-logical-frameworkapproach-project-cycle-management

Udovyk, O., and M. Gilek. 2013. Coping with uncertainties in science-based advice informing environmental management of the Baltic Sea. Environmental Science & Policy 29: 12–23.

Walker, W., P. Harremoes, J. Rotmans, J. Van Der Sluijs, M. Van Asselt, P. Janssen, and M. Krayer Von Krauss. 2003. Defining uncertainty: a conceptual basis for uncertainty management in model-based decision support. Integrated Assessment 4 (1): 5–17.

Ward, T. J. 2011. National marine condition assessment—decision model and workshops. Canberra: Australian Government Department of Sustainability, Environment, Water, Population and Communities on behalf of the State of the Environment 2011 Committee. http://www.environment.gov.au/soe/2011/report/marine-environment/supporting-material.html

Ward, T. J. 2012. Workshop Report: Regional Scientific and Technical Capacity Building Workshop on the World Ocean Assessment (Regular Process), Bangkok, Thailand, 17–19 September 2012. Bangkok: UNEP/COBSEA. http://cobsea.org/documents/BangkokWSreport_lores.pdf

Ward, T. J. (forthcoming). Australia's marine environment is in good condition, but in decline [for Ocean and Coastal Management].

Table 2. The Common Assessment and Reporting Framework (CARF) for Australia's SoE 2011 report: a scale-independent project logframe.
Columns: Step; Phase (sequence of activities); Objective (intended achievements); Policy driver (legislative mandate and requirements); Actors (organisations and players involved); Output (the desired product); Purpose* (project attributes serviced); Performance measure (indicative step-wise KPI).

Step 1a (Preparatory)
- Objective: Establish purpose and scope for assessment and reporting, and identify target audiences
- Policy driver: Interpret and operationalise the legislative mandate; confirm the policy mandate for assessment and reporting
- Actors: Australian Environment Minister; DSEWPaC executive; other Australian Government agencies
- Output: Whole-of-government commitment to process
- Purpose: Clients, Relevance, Coordination & Facilitation
- Performance measure: Summary statement of purpose; confirmed with key actors

Step 1b (Preparatory)
- Objective: Ensure that key informants and stakeholders are effectively engaged, to build ownership of process and outcomes
- Policy driver: Establish the context of the process, specific sectors with an interest in the findings, and specific points of engagement; identify the set of key stakeholders and their specific interests
- Actors: SoE community, national and state government agencies, resource users, NGOs, tertiary science/knowledge base
- Output: Stakeholders committed to process
- Purpose: Clients, Relevance, Credibility, Communication, Coordination & Facilitation
- Performance measure: Summary statement of stakeholders and their specified interests; confirmed with key actors

Step 1c (Preparatory)
- Objective: Ensure the data and knowledge base is current and the best available for assessment
- Policy driver: Establish the nature and extent of existing knowledge and fill key gaps; audit existing data, reports and benchmarks, and secure additional topic reviews where needed
- Actors: SoE community, national and state government agencies, resource users, NGOs, tertiary science/knowledge base
- Output: Inventory of data and knowledge, agreed with key actors
- Purpose: Clients, Relevance, Credibility, Communication, Coordination & Facilitation
- Performance measure: Summary statement of existing knowledge and sources; confirmed with key actors

Step 2a (Consultation)
- Objective: Spatially bound the assessment and reporting process and its broad components, for clarity about what areas/assets/values are included/excluded; secure agreement with key stakeholders
- Policy driver: Implement the mandate for spatial assessment and reporting consistent with ecosystem-based management and reporting; set boundaries and be clear about what is in and out of scope (e.g. national environment boundaries for the area(s) to be assessed and any sub-divisions that might be needed)
- Actors: National and state government agencies, NGOs, tertiary science/knowledge base subset (societies, experts, workshop participants)
- Output: Bounding of the biophysical, issues and governance systems covered by the report
- Purpose: Relevance, Credibility, Communication
- Performance measure: Maps, process structure and boundaries agreed with key actors

Step 2b (Data capture)
- Objective: Match the reporting capacity/expectation to the biophysical structure of assets and values; clearly establish the distinction between assessment of the environment and assessment of sustainability
- Policy driver: Parameters and decision model: identify and agree on a typology (the aspects to be assessed, the grading scale to be deployed, baselines/reference points, rules for grading decisions, and the form of final reporting)
- Actors: Tertiary science/knowledge base subset (experts, workshop participants)
- Output: Agreed set of metrics to be covered by the report, and rules governing assessment process decisions
- Purpose: Credibility, Communication
- Performance measure: Typology and decision model agreed with key actors

Step 3 (Synthesis, interpretation)
- Objective: Assign credible and defendable grades to the ecosystem assets and values with a known level of accuracy and confidence; prepare the information base, assign scores/grades through workshops where specific findings can be tested with peers, and iterate workshop outcomes for verification and confirmation
- Policy driver: Operationalise the reference/benchmark framework for assessment decisions relative to policy needs/mandate; produce robust assessment decisions in a report that is credible and defendable
- Actors: Tertiary science/knowledge base subset (experts, workshop participants)
- Output: Credible and defendable assessment decisions
- Purpose: Credibility, Relevance
- Performance measure: Raw workshop outputs

Step 4 (Synthesis, interpretation)
- Objective: Synthesise and summarise the assessment findings; conduct a detailed scoring analysis to support the report card output
- Policy driver: Produce assessment decisions in a report that is accessible and effective for the target audiences
- Actors: Lead agency; supporting facilitator
- Output: Credible, accessible and effective assessment report
- Purpose: Credibility, Communication, Coordination & Facilitation
- Performance measure: Synthesised workshop outputs

Step 5a (Peer review)
- Objective: Secure quality peer review of findings, independent of theme authors
- Policy driver: Produce a report that is credible and defendable
- Actors: DSEWPaC; independent reviewers with qualifications and experience at least equal to those of the theme authors
- Output: Report of peer review, and responses from theme authors
- Purpose: Credibility, Relevance
- Performance measure: Reviewer reports, authors' response reports, consequent updating of draft report

Step 5b (Production)
- Objective: Produce a range of accessible products that engage with target audiences
- Policy driver: Report that provides strategic decision support and is effective for national-level policy making
- Actors: Lead agency; supporting facilitator; communications team
- Output: Draft SoE report
- Purpose: Communication, Relevance, Coordination & Facilitation
- Performance measure: Draft SoE report

Step 6 (Release, outreach)
- Objective: Direct engagement with target audiences; outreach with report card findings and policy implications
- Policy driver: Report that provides strategic decision support and is effective for national-level policy making
- Actors: Australian Environment Minister; national government agencies; NGOs; media team
- Output: Issue discussion amongst policy makers and the broader Australian community
- Purpose: Communication, Relevance, Coordination & Facilitation
- Performance measure: Level of public engagement

Step 7 (Feedback)
- Objective: Estimate the effectiveness and impact of the process and the report, relative to the mandate, and contribute to 'return on investment' and 'value for money' assessment of public funds expenditure
- Policy driver: Performance-based feedback on effectiveness, to guide future improvements in the design and implementation of future national SoE reports; formal evaluation of report process and effectiveness
- Actors: Australian Environment Minister; lead agency; other national government agencies
- Output: Guidance for incremental improvement of the assessment and reporting process
- Purpose: Communication, Relevance, Coordination & Facilitation, Impact
- Performance measure: Public domain report of formal evaluation process

Step 8 (Iteration)
- Objective: Enable adaptive ecosystem-based management through time-series analysis of environment condition and trends; repeat a comparable process on a five-yearly cycle
- Policy driver: Interpret and operationalise the legislative mandate
- Actors: Australian Environment Minister; DSEWPaC executive; other national government agencies
- Output: Whole-of-government commitment to process
- Purpose: Clients, Relevance, Coordination & Facilitation, Impact
- Performance measure: Summary statement of purpose; confirmed with key actors

*Project attributes and their defining features:
- Clients: ensure that all interested and contributing actors are identified, have the opportunity to input, and have access to outputs.
- Relevance: maintain the focus of the process and the assessment on the primary environment issues and matters that relate to an objective, robust and independent assessment with accessible outputs.
- Credibility: the assessment process is transparent, independent, evidence-based, is conducted to a high technical level of robustness and independently peer reviewed, and issues findings bounded within an appropriate envelope of uncertainty and confidence.
- Communication: the interim and final products of the assessments are appropriately framed into accessible language and concepts, and made available to the community and the actors in a programmed manner to enable informed and respectful analysis and public feedback.
- Coordination/Facilitation: there is an effective set of arrangements with capacity to ensure a high level of coordination is maintained by the process, and facilitation of client consultation is effective.
- Impact: the demand for print and electronic SoE products is tracked, and there is an assessment of citations in the various technical journals and policy development domains that may indicate uptake and the type and level of policy response.