View your project summary
Guide to using the "Dashboard" page
Review the updated results from your project using the Project "Dashboard" page, accessible from the left-hand ribbon menu of the Data Portal.

The Dashboard updates automatically to highlight the latest statistics across the project and among sites, extracting the total hard coral cover values estimated by humans and the machine over time. Specific information includes:
Sites - the total number of sites within the Project.
Images - the total number of uploaded images.
Human Annotatable Points - the total number of human annotation points available across your project to train the model. This value is the number of enabled images multiplied by the number of points per image you defined during initial project creation. The progress tracker reflects the proportion of those points that users within the project have annotated to date. Generally, we suggest annotating around 30% of the points across your dataset, spreading effort evenly between surveys and checking how the AI model is performing using the "Model Report" page. Depending on the complexity and size of your project, you may find that the machine performs well with considerably fewer annotations.
Transfer Learning Points - the total number of points annotated by the model across the project, including those not annotated by a human. This encompasses all the additional work your AI model is doing across your project based on machine learning. Each model will annotate a total of 50 points per image, including any human annotatable points available.
Hard Coral Cover Estimates - the model and human hard coral cover estimates for a given year and site can be viewed in the bar plots provided. View the total percentage hard coral cover (y-axis) reported across different years (x-axis), site by site, either by selecting a site from the dropdown menu or by clicking a given site on the map on the right-hand side of the dashboard. For longer time-series datasets, it may be useful to view this data as a line graph, which can be toggled via the button to the right of the site list. Blue bars (or points) represent hard coral cover for each year based on human annotations, and orange represents hard coral cover for each year based on machine estimates. Blue bars are missing where no human annotations have been made for a given year and site. Hover over each bar to see the percentage hard coral cover estimate. Hovering over the plot reveals additional viewing features, including options to zoom, change axes, select areas to view, and download the plot as a .png image file.
Site Map - view project sites relative to each other on an interactive satellite map. Select sites to view the site name and coordinates, and to change the hard coral cover estimates plotted alongside.
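The point arithmetic described under "Human Annotatable Points" can be illustrated with a short sketch. The image count, points per image, and annotation progress below are hypothetical example values, not values from any real project; the 30% guideline is the suggestion given above.

```python
# Sketch of the Dashboard point arithmetic, using hypothetical example numbers.
enabled_images = 200      # hypothetical: images enabled in the project
points_per_image = 10     # hypothetical: set during initial project creation

# Human Annotatable Points = enabled images x points per image.
human_annotatable_points = enabled_images * points_per_image
print(human_annotatable_points)          # 2000

# Suggested annotation effort: around 30% of the available points.
suggested_target = round(0.30 * human_annotatable_points)
print(suggested_target)                  # 600

# The progress tracker shows annotated points as a proportion of the total.
annotated_so_far = 480    # hypothetical progress
print(annotated_so_far / human_annotatable_points)  # 0.24
```

In practice you would read these values directly from the Dashboard; the sketch only shows how the totals and the progress proportion relate to each other.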
What are the differences between "Human Annotatable Points" and "Transfer Learning Points"?
Under the "Dashboard" page, you'll see a value given for the number of human annotatable points and a second value for the number of transfer learning points. The number of human annotatable points is defined by the user during initial project creation and remains fixed within a project. The machine uses these points in training and applies that learning across the project (up to 50 points per image). For example, if you have 5 human annotatable points per image, the machine will add 45 other transfer learning points randomly, adding up to a total of 50 points; if you have 20 human annotatable points per image, the machine will add 30 other transfer learning points randomly. These machine transfer learning points can be visualised on a given image via the "Classify Images" page by unchecking the red "Only Show Human Annotatable Points" default option in the filter. Data for these points is also available in the point classifications file export.
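The per-image split described above can be sketched in a few lines. The fixed total of 50 points per image comes from the text; the function name below is purely illustrative.

```python
# The machine annotates a fixed total of 50 points per image (from the text).
TOTAL_POINTS_PER_IMAGE = 50

def extra_transfer_points(human_points_per_image: int) -> int:
    """Illustrative helper: how many transfer learning points the machine
    adds randomly per image, beyond the human annotatable points."""
    return TOTAL_POINTS_PER_IMAGE - human_points_per_image

print(extra_transfer_points(5))   # 45, matching the first example above
print(extra_transfer_points(20))  # 30, matching the second example above
```

Whatever the per-image human point count, the machine tops each image up to the same 50-point total, so projects with fewer human annotatable points simply get more randomly placed transfer learning points.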