This hackathon on “AI for Sustainability” aims to bring together students and researchers from a wide range of disciplines to collaborate on a small set of strategically selected projects at the intersection of environmental science and artificial intelligence (AI).

The goal is to collaborate in multi- and interdisciplinary teams to provide research-driven and practical solutions to topics in sustainability. Breakfast, lunch, tea/coffee and water are provided (free) to participants throughout the hackathon.

We welcome participants from many different disciplines to contribute to different aspects of the projects.

28 March - 1 April 2022

Registration is now closed.

Hackathon Projects

For our first hackathon, we have chosen the following projects. Participants will be drawn from a multitude of disciplines and backgrounds and will be allocated to one of the three groups. It is not expected that all participants are able to program or have prior knowledge of AI and machine learning. It would be desirable, though, if participants could pick a project that relates to their interests and discipline in some way (e.g. computer science, environmental sciences, engineering, or similar).

Project 1: Hedge identification from earth observation data with interpretable computer vision algorithms.

This project aims to establish whether explainable AI methods for image analysis can identify environmental features, such as hedgerows, from earth observation data, e.g. satellite imagery. Hedgerows are an important feature of the UK landscape and support biodiversity by providing a home to a range of different species, as well as offering wildlife corridors that link hedgerows with woodland and trees. Hedgerows are also increasingly researched for their carbon capture potential. This project builds on an existing investment from Natural England and The Tree Council. Data and baseline code are provided to the hackathon group. The task is to apply a comparative set of recent deep learning models (e.g. vision transformers, capsule networks) and establish whether they offer benefits over the existing convolutional neural network model, e.g. in terms of model performance and/or explainability.
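
As a very rough sketch of the kind of baseline involved (illustrative code only, not the provided baseline), the snippet below builds a small convolutional patch classifier in TensorFlow/Keras, the framework used in the lab sessions. The function name, the 64x64 patch size and the binary hedgerow/no-hedgerow labelling are all assumptions made for the example.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_patch_classifier(patch_size=64, channels=3):
    # Classify a fixed-size satellite image patch: hedgerow present or not.
    model = models.Sequential([
        layers.Input(shape=(patch_size, patch_size, channels)),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of "hedgerow present"
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_patch_classifier()
model.summary()

The hackathon task would then be to replace a backbone of this kind with, for example, a vision transformer or capsule network, and to compare both predictive performance and how readily the predictions can be explained.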

Project 2: Monopile fatigue estimation from nonlinear waves using deep learning.

This project is connected with a current Supergen ORE Hub project which estimates the fatigue and remaining useful life of wind turbine monopiles. It seeks to make a direct contribution to the reliability of renewable energy, specifically offshore wind, and in this way to contribute to the UK’s and the global transition to carbon net zero. The project combines numerical modelling approaches that capture the physics of wind and wave effects on monopiles with deep learning methods to create rapid assessments of monopile fatigue, thus directly informing the operations and maintenance decisions of turbine operators. Data and baseline code are provided for this project. The goal is to benchmark the performance of a variety of different neural networks (e.g. recurrent neural networks, transformer networks, convolutional neural networks) to establish which family of deep learning model is most suitable for this task and project.
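
To give a flavour of one of the model families to be benchmarked (a sketch only, not the provided baseline code), the snippet below defines a small recurrent network in TensorFlow/Keras that regresses a fatigue-related quantity from a window of wave measurements. The window length of 256 samples, the single input channel and the scalar target are assumptions made for the example.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_lstm_regressor(window_length=256, n_features=1):
    # Map a window of wave measurements to a single fatigue-related value.
    model = models.Sequential([
        layers.Input(shape=(window_length, n_features)),
        layers.LSTM(64),
        layers.Dense(32, activation="relu"),
        layers.Dense(1),  # e.g. an estimated damage or remaining-life proxy
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Illustrative usage on random data standing in for the provided wave time series.
model = build_lstm_regressor()
x = tf.random.normal((8, 256, 1))  # 8 windows of 256 wave samples each
y = tf.random.normal((8, 1))       # corresponding fatigue targets
model.fit(x, y, epochs=1, verbose=0)

Benchmarking would then involve swapping the LSTM block for, say, 1D convolutional or transformer layers and comparing the models on the same data splits.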

Project 3: Live sentiment tracking during floods from social media data.

This project will analyse text and image data from social media tweets collected during historical flooding events. With the ever-increasing use of social media, this kind of data is becoming an important source of information during flooding (and other emergency) events, and can potentially be used to direct emergency response, rescue or other assistance to where it is most needed. This can be achieved via a combination of sentiment analysis and topic identification, i.e. what are people tweeting about, and how badly (or not) are they affected? Current approaches to this idea are still in their infancy, mostly due to the difficulty of processing such information in real time, i.e. while events are unfolding. Data and baseline code for this project are available. The key objective is to build on top of an existing system that can identify sentiment and topics offline, and to establish experimentally whether it is feasible to do so online from a live stream of tweets.
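
As an illustrative sketch of the offline starting point (not the existing system itself), the snippet below trains a tiny Keras text classifier on a handful of made-up example tweets and then scores newly arriving tweets in a small batch, which is the basic loop an online version would have to run continuously. The vocabulary size, sequence length, example tweets and labels are all assumptions made for the example.

import tensorflow as tf
from tensorflow.keras import layers, models

# Tiny made-up corpus; the real data would be the historical flood tweets.
texts = tf.constant([
    "our street is completely flooded, we need help",
    "lovely sunny day by the river",
    "water is coming into the house, sandbags are not holding",
    "flood warnings lifted, all clear now",
])
labels = tf.constant([1, 0, 1, 0])  # 1 = distressed / affected, 0 = neutral

# Map raw tweet text to padded integer sequences.
vectorizer = layers.TextVectorization(max_tokens=10000, output_sequence_length=30)
vectorizer.adapt(texts)

model = models.Sequential([
    layers.Input(shape=(30,), dtype="int64"),
    layers.Embedding(input_dim=10000, output_dim=32),
    layers.GlobalAveragePooling1D(),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability a tweet signals distress
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(vectorizer(texts), labels, epochs=5, verbose=0)

# "Online" use: score tweets in small batches as they arrive from a live stream.
incoming = tf.constant(["basement flooding fast near the bridge"])
scores = model.predict(vectorizer(incoming), verbose=0)
print(list(zip(incoming.numpy(), scores.ravel())))

The open question for the group is whether a pipeline like this, extended with topic identification, can keep pace with a live stream of tweets rather than a static collection.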

Programme

Our programme is as follows. As the event runs during the teaching period, we do not expect all participants to attend every session; you are welcome to join us and contribute to whichever sessions you are available for.

MONDAY 28 March
09:00 - 10:00: Arrival and Breakfast (RBB 3rd floor, social area)
10:00 - 12:00: Session 1: Deep learning: simple neural nets and backprop (RBB 3rd floor, LTD)
12:00 - 13:00: Lunch and group introductions (RBB 3rd floor, social area)
13:00 - 14:00: Session 2: Introduction of projects, Q&A (RBB 3rd floor, LTD)
14:00 - 16:00: Lab session, getting started: Tensorflow and Keras (Turing lab, RBB 335)

TUESDAY 29 March
09:00 - 09:30: Arrival and Breakfast (RBB 3rd floor, social area)
09:30 - 10:30: Plenary -- Intro to Image Analysis: Convolutional Neural Networks vs Transformers (RBB 3rd floor, LTD)
10:30 - 12:00: Development in groups (CNNs) (Cray lab, RBB 321)
12:00 - 13:00: Lunch and discussion (RBB 3rd floor, social area)
13:00 - 15:00: Development in groups (Transformers) (Turing lab, RBB 335)

WEDNESDAY 30 March
09:00 - 10:00: Arrival and Breakfast (RBB 3rd floor, social area)
10:00 - 11:00: Development in groups (Hopper Lab, RBB 207)
11:00 - 12:00: Taught component: Transformer networks (RBB LTD)
12:00 - 13:00: Lunch and discussion (RBB 3rd floor, social area)
13:00 - 15:00: Development in groups (Transformers) (Hopper Lab, RBB 207)

THURSDAY 31 March
09:30 - 10:00: Arrival and Breakfast (RBB 3rd floor, social area)
10:00 - 12:00: Development in groups (Turing lab, RBB 335)
12:00 - 13:00: Lunch and discussion (RBB 3rd floor, social area)
13:00 - 15:00: Development in groups (Turing lab, RBB 335)

FRIDAY 1 April
09:00 - 10:00: Arrival and Breakfast (RBB 3rd floor, social area)
10:00 - 12:00: Plenary: Group presentations of results, discussion (RBB 3rd floor, LTD)

Organisation

Nina Dethlefs, Neil Gordon, Lydia Bryan-Smith, Onatkut Dagtekin - Computer Science

Agota Mockute, Robert Houseago, Josh Wolstenholme, Rob Thomas - Energy and Environment Institute

Thanks!


Thanks everyone for attending! We've had a fantastic week with over 30 active participants across the days and some really interesting outcomes.

We've managed to identify and highlight hedges from space, laying the foundation for further research into the biodiversity of hedges and the wildlife that they support. We have also managed to discover a new suite of deep learning algorithms to predict fatigue in monopile structures and inform their remaining useful life. And we have created a spatially-aware flood warning system based on social media tweets and images.

Thanks to our fantastic group leads for sharing their data and expertise and for supporting the groups in achieving these outcomes. Thanks to all our participants for turning up, giving some of their own time to feed into our research projects, and making the hackathon a success!

Thanks to NERC Discipline Hopping and the Energy and Environment Institute for funding us!