Impact assessment (IA) relies on being able to predict the changes that may occur to environmental and social systems, values and resources as a result of a proposed action. Because these predictions can trigger significant decisions about a development proposal, including whether it should proceed, their accuracy clearly matters. We have many sophisticated modelling tools available these days, but how accurate are the predictions we make in IA? While those involved in IA often bemoan the lack of follow-up, a handful of studies have tried to verify the predictive accuracy of IA, and the results are mixed. Although it's rare for impacts not to be identified at all, over- and under-prediction are common.
Obviously, one source of error arises from the (in)competence and (in)experience of practitioners collecting baseline data and performing modelling and other analytical tasks. The conscious and unconscious bias of these practitioners when they make decisions or assumptions about modelling approaches is also an issue. These issues can be addressed to some extent by skill development, sensitivity testing and peer review. Follow-up validation studies might also be of great benefit here.
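To illustrate what sensitivity testing can look like in practice, the sketch below varies one assumed parameter of a deliberately simplified, hypothetical pollutant-decay model and reports the resulting spread in the predicted impact. The model, parameter names, values and units are illustrative assumptions only, not drawn from any actual IA study or guidance.

```python
# A minimal sensitivity-testing sketch: the model, parameters and values below are
# hypothetical illustrations, not any real IA methodology.
import numpy as np

rng = np.random.default_rng(seed=1)

def predicted_concentration(discharge_m3_per_day, decay_rate_per_day, distance_km):
    """Toy first-order decay model standing in for whatever predictive model a study uses."""
    travel_time_days = distance_km / 10.0       # assumed travel speed of 10 km/day
    source_conc = discharge_m3_per_day * 0.001  # assumed source concentration factor (mg/L)
    return source_conc * np.exp(-decay_rate_per_day * travel_time_days)

# Instead of fixing a single "best guess" decay rate, sample it across a plausible range
# and see how much the prediction moves in response to that one assumption.
decay_rates = rng.uniform(0.05, 0.5, size=10_000)
predictions = predicted_concentration(2_000, decay_rates, distance_km=5)

print(f"Predicted concentration, 5th-95th percentile: "
      f"{np.percentile(predictions, 5):.2f} to {np.percentile(predictions, 95):.2f} mg/L")
```

The point of an exercise like this is not the particular numbers but the habit of reporting a range driven by stated assumptions rather than a single figure, which also connects to the communication issues discussed below.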
Another key cause of inaccuracy or uncertainty is inadequate baseline data or poor conceptualisation of the environmental and social systems we are trying to model. The obvious response is greater effort and rigour in data collection and more sophisticated models, and this is entirely appropriate. But there are also broader issues of epistemic uncertainty, or imperfect knowledge. The social and environmental systems that IA practitioners work with are extremely complex and subject to a wide range of natural variations and, increasingly, anthropogenic pressures. This complexity and variability places both practical and theoretical limits on how accurately we can measure, conceptualise and model these systems.
This is a challenging proposition for IA practitioners and the technical specialists who perform predictive studies. Many of us come from science and engineering backgrounds and are heavily influenced by the positivist, rationalist ideal that, with enough effort, we can know and understand the world around us. It's also nerve-wracking for those involved in decision-making, whether choosing between alternative designs and mitigation measures or deciding whether a proposal should proceed. IA is built on a rational decision-making model that relies on a large amount of accurate information to weigh up the pros and cons of a proposal, so the idea that we may never have enough evidence to feed this model is very disconcerting.
Of course, this issue has been around as long as the practice of IA, and there are a number of ways to deal with it. Apart from minimising uncertainty as far as possible and practicable in IA studies, it has also been suggested that uncertainty needs to be communicated better in IA reports. Studies have shown that IA reports still tend to avoid expressing uncertainty about the accuracy of predictions, which makes the predictions appear more certain than they really are. This may be because practitioners are not fully aware of the uncertainty, but it may also stem from a desire to appear confident and to avoid challenge.
Another issue is that statements made about impacts in IA reports are often vague, ambiguous and unverifiable, which is a problem not only for those reviewing the reports but also for longer-term management and monitoring. Finally, once proposals are implemented, techniques such as adaptive management are available to help deal with uncertainty, but these approaches require strong management frameworks, well-targeted monitoring and effective responses.
EIANZ is keen to examine how Australian and New Zealand practitioners identify and manage uncertainty, and what sort of guidance and support might be required. The plan is to form a working group from amongst the IA SIS membership to work on this issue over the next 12 months or so, with a view to producing good practice guidance for practitioners. If you are interested in being involved, please contact Claire Gronow for more information.