|Title:||A resourceful and adaptable method to obtain data on the status of seagrass meadows|
|Citation:||Aquatic Botany, 2017; 141:17-21|
|Author:||Nicole R. Foster, Douglas G. Fotheringham, Daniel J. Brock, Michelle Waycott|
|Abstract:||Coastline degradation, and the subsequent loss of ecosystems, has long been attributed to anthropogenic stress and is an all too familiar issue affecting coastal habitats. Should management and conservation efforts fail to improve the quality of coastal ecosystems and the services they provide, these ecosystems may be irrevocably damaged. A significant limitation to conservation efforts is often the ability to track change in seagrass meadows, owing to the considerable time and cost of monitoring underwater habitats. Remote sensing is often used to improve our knowledge of habitat status; however, ground-truthing remote sensing results is difficult when historical data are required. We apply an innovative and resourceful approach to obtaining data on the status of seagrass meadows from resources that are available in many areas as a by-product of other data collection. We employ underwater digital photographs originally taken to monitor sediment movement patterns. We successfully developed a method to critically and easily evaluate these photographs for habitat status, enabling the generation of a data set that could not be obtained in other ways. This method can also be applied, in a citizen science project, to other underwater digital photographs to support the assessment of submerged coastal ecosystem habitat status.|
|Keywords:||Seagrass; digital photographs; remote sensing; South Australia; coastal monitoring; citizen science|
|Rights:||© 2017 Published by Elsevier B.V. All rights reserved.|
|Appears in Collections:||Environment Institute publications|