Coral reefs are highly complex and heterogeneous, making it difficult to measure reef population and community dynamics given the logistical constraints of underwater sampling. Even simple metrics such as coral growth and partial mortality are challenging to obtain in situ with standard monitoring techniques. Underwater photogrammetry can be used to survey large areas and extract information across organisational levels (colony, population, and community). Image segmentation, a process that groups similar pixels and assigns them a class, can provide colony-scale metrics. However, the challenge lies in automatically segmenting the thousands of colonies that are imaged. Automating data extraction from photogrammetry is likely to become a key component of coral reef monitoring, as it builds capacity to process high volumes of data. This study investigates the application of machine learning to optimise the extraction of structural complexity metrics and coral classification from photogrammetry outputs to quantify coral reef community assemblages. I propose a semi-automated, iterative workflow that: 1) segments coral colonies based on key morphologies; and 2) extracts colony-level metrics from 2D and 3D models, as sketched below. This method aims to automatically extract coral community composition data, enabling spatial analysis to quantify differences between sites and datasets.
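As a concrete illustration of the colony-level metric extraction in step 2, the following minimal sketch shows how a per-pixel class mask (such as the output of a semantic segmentation model) could yield colony-scale planar areas. The mask format, the `colony_planar_areas` helper, and the ground sampling distance parameter are illustrative assumptions, not the study's actual implementation.

```python
# Minimal sketch: derive colony-scale planar areas from a 2D class mask
# produced by a semantic segmentation model. Assumptions (not from the
# study itself): the mask is a NumPy array of integer class IDs, and the
# orthomosaic has a known ground sampling distance (GSD, metres per pixel).
import numpy as np
from scipy import ndimage

def colony_planar_areas(class_mask: np.ndarray, class_id: int, gsd_m: float):
    """Return the planar area (m^2) of each connected colony of one class."""
    # Isolate pixels belonging to the target morphology/class.
    binary = class_mask == class_id
    # Group touching pixels into individual colonies.
    labeled, n_colonies = ndimage.label(binary)
    # Pixel count per colony (label 0 is background, so skip it).
    pixel_counts = np.bincount(labeled.ravel())[1:]
    # Convert pixel counts to metric area using the GSD.
    return pixel_counts * gsd_m ** 2

# Example: a toy two-class mask at 5 mm/pixel resolution.
mask = np.array([[1, 1, 0],
                 [0, 1, 0],
                 [0, 0, 2]])
print(colony_planar_areas(mask, class_id=1, gsd_m=0.005))  # one 3-pixel colony
```

Per-colony areas of this kind, aggregated across classes and sites, are one route from segmentation masks to the community composition comparisons described above; 3D metrics such as surface rugosity would require the mesh or point cloud rather than the 2D mask alone.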