Description
A wide range of complementary data (radar, optical, infrared) from Sentinel-1, -2, -3 and, more recently, Sentinel-5P is now available under a free and open data policy. For some applications, such as agriculture, the synergy between these data has already been demonstrated, and there is growing interest in jointly exploiting radar and optical data to compensate for the limitations of each data product used alone. SAR sensors combine independence from weather and illumination conditions with sensitivity to the size, density, orientation and dielectric properties of the observed targets, while optical sensors capture multi-spectral information related to leaf structure, pigmentation and moisture; combining the two can provide greater insight and context in many application areas.

This half-day tutorial will demonstrate how to use the open tools (ESA SNAP, QGIS, R) available within the RUS environment to run a supervised classification over an agricultural area using Sentinel-1 GRD and Sentinel-2 products. Participants will be able to choose between two of the most widely used machine learning algorithms for this purpose: Random Forest and Support Vector Machine (SVM). A multi-temporal, multi-sensor, pixel-based data fusion approach will be applied in several scenarios to analyse the influence of different input data on the final classification accuracy. The fusion will be implemented at feature level, i.e. the images are combined before the core processing task (classification) is applied. Participants will become familiar with SAR and optical data pre-processing using batch processing in SNAP and its command-line implementation (GPT), as well as with existing R scripts exposed through a Graphical User Interface (GUI) within QGIS.
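As a rough illustration of the feature-level fusion and Random Forest classification described above, the sketch below shows how the step might look in R. It assumes the Sentinel-1 backscatter and Sentinel-2 reflectance bands have already been pre-processed and co-registered in SNAP and exported as GeoTIFFs; the file names, band selection, class attribute and the terra/randomForest packages are illustrative assumptions, not the tutorial's actual scripts.

```r
# Minimal sketch, assuming pre-processed, co-registered GeoTIFFs exported from SNAP.
# File names, band choices and the 'class' field of the training polygons are hypothetical.
library(terra)
library(randomForest)

s1 <- rast("S1_sigma0_VV_VH.tif")        # SAR features (VV, VH backscatter)
s2 <- rast("S2_L2A_B02_B03_B04_B08.tif") # optical features (blue, green, red, NIR)

# Feature-level fusion: stack SAR and optical bands into one multi-band raster
features <- c(s1, s2)
names(features) <- c("VV", "VH", "B02", "B03", "B04", "B08")

# Sample the fused stack under labelled training polygons
train   <- vect("training_polygons.shp")   # polygons with a 'class' attribute
samples <- extract(features, train)        # data.frame with an ID column per polygon
samples$class <- as.factor(train$class[samples$ID])
samples <- na.omit(samples[, -1])          # drop the ID column, remove missing values

# Train a Random Forest on the fused features and classify the full scene
rf <- randomForest(class ~ ., data = samples, ntree = 500)
classmap <- predict(features, rf, na.rm = TRUE)
writeRaster(classmap, "classification_RF.tif", overwrite = TRUE)
```

The same fused feature stack could be passed to an SVM (e.g. via the e1071 package) instead of Random Forest, which is the choice the tutorial leaves to participants.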
Presenter: Miguel Castro Gómez, Remote Sensing Specialist, RUS (Research and User Support for Sentinel Core Products) Service Operation, Serco UK&E Local & Regional Government
Duration: 3–4 hours
Software: SNAP, QGIS, R