HACKATHON CHALLENGES

CARBON: How to go low carbon by reducing emissions and cutting energy consumption?

MapBiomas - There are 150,000 deforestation alerts detected in Brazil every year. To make these alerts ready for action, a report needs to be prepared identifying the period and the land tenure of each deforestation event; this is essential to determine, for example, whether the deforestation is legal or illegal. In 2018 only 1,000 reports were prepared by environmental authorities, so less than 1% of the deforestation alerts were considered for action. MapBiomas has developed a method that now produces over 2,000 such reports per week, and there are now over 140,000 reports ready to support action. The challenge now is to turn those reports into real action, such as: environmental agencies fining illegal deforestation, banks and traders refusing to do business with those causing deforestation, and the general public pressuring companies and government to act. Public APIs are available for MapBiomas and all data is openly available.
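As one starting point for prioritising reports for action, a minimal sketch is shown below, assuming an exported table of alert reports; the file name and the column names (detected_at, land_tenure, area_ha, state) are illustrative assumptions, not the actual MapBiomas schema, which should be taken from the public API documentation.

```python
# Hedged sketch of triaging alert reports for action, assuming an exported table of
# MapBiomas alert reports (column names here are illustrative, not the real schema).
import pandas as pd

reports = pd.read_csv("mapbiomas_alert_reports.csv", parse_dates=["detected_at"])

# Example triage: recent alerts overlapping categories where clearing is likely illegal
actionable = reports[
    (reports["detected_at"] > "2021-01-01")
    & (reports["land_tenure"].isin(["protected_area", "indigenous_land", "embargoed"]))
]

# Summaries that an agency, bank or campaign could act on
print(actionable.groupby("state")["area_ha"].sum().sort_values(ascending=False))
```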

 

Cool Farm Alliance CIC - Cool Farm Alliance could do even more great deeds if they had the ability to take advantage of GIS map data, as well as satellite or drone images. Such a solution would simplify the identification and reporting of various features like buildings, hedges, fields, woods, watercourses etc. On a large scale, this is about quantifying on-farm greenhouse gas emissions, soil carbon sequestration and biodiversity impact – to enable growers to make decisions that reduce their environmental impact. Read more about Cool Farm Alliance.

What data is relevant for the challenge?

Remote sensing data, ideally from satellites so the analysis can be completed at a global scale. Drone data sets would be the next best option, but the data would be slower and more expensive to gather.

Are there any public data you believe could be relevant?

Anything else you want to add regarding this use case?

A solution would simplify landscape classification for reporting and environmental impact assessments. It should be able to automatically detect landscape features such as buildings, hedges, fields, woods and watercourses, and determine their size and area, and from this identify areas used for cultivation, areas set aside for nature habitats, and so on.

This data can then be used to populate greenhouse gas and biodiversity assessments.  This simplifies the administrative burden for farmers completing these assessments and has application for a range of environmental reporting platforms and farm management systems. 

  1. Draw the boundaries of a farm quickly and easily, using feature-jump logic (as in Strava route planning)
  2. Easily label and polygonise the various features (buildings, hedges, fields, woods, watercourses etc.)
  3. For each feature, calculate its area according to the known scaling of the image (see the sketch after this list)
  4. If possible, capture the height (or use 3D imagery to do so) and similarly calculate approximate volume (to aid 3D mapping, especially relevant for perennials/agroforestry)
  5. Use the above to populate a farm definition file for use in the CFT, or similar
  6. If data is available, take state-of-the-art LIDAR scanner data and turn it into biomass calculations for trunk, branch and leaf ratios of perennial crops, using AI/ML for different crop types, e.g. oil palm vs. mango vs. avocado vs. apple
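A minimal sketch of step 3 follows, assuming the feature polygons from steps 1–2 are available as pixel coordinates and the image's ground sample distance (metres per pixel) is known; the example coordinates and the shapely dependency are assumptions for illustration.

```python
# Minimal sketch of step 3: area of a labelled feature from pixel coordinates and a
# known image scale. Assumes polygons are already drawn and labelled (steps 1-2) and
# the ground sample distance (metres per pixel) is known for the image.
from shapely.geometry import Polygon

def feature_area_m2(pixel_coords, metres_per_pixel):
    """Area in square metres of a polygon drawn in pixel space."""
    poly = Polygon(pixel_coords)          # e.g. a hedge or field outline
    return poly.area * metres_per_pixel ** 2

# Hypothetical field outline (pixel coordinates) on 10 m/pixel imagery (e.g. Sentinel-2)
field_px = [(120, 40), (180, 40), (185, 110), (118, 112)]
area_m2 = feature_area_m2(field_px, metres_per_pixel=10)
print(f"field area: {area_m2 / 10_000:.2f} ha")
```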

The Cool Farm Tool allows farmers to monitor and report their environmental performance, and it acts as a decision-support tool to help them identify opportunities and actions for moving towards regenerative, sustainable farming practices.

The Cool Farm Alliance is an international not-for-profit membership organisation working towards sustainable agriculture, with a vision of a global agriculture system that builds soil carbon, helps to mitigate climate change and restores ecological balance. https://coolfarmtool.org

The existing Cool Farm Tool can be found here; free use of the tool comes with five assessments, so you can get a feel for the existing tool functionality: https://app.coolfarmtool.org/account/login/?next=/

 

Hudson Carbon - We are working with partners to identify intensive livestock feeding operations (Concentrated Animal Feeding Operations, CAFOs), but grazing livestock are more difficult to identify. Building on a model developed by researchers in Canada, hackathon participants could help improve accuracy and capacity and create new approaches to identifying grazing animals. This project would add value to our efforts to identify emissions from livestock globally. Read more about Hudson Carbon.

What data is relevant for the challenge? 

Aerial Imagery, machine learning algorithms 

Is there any example of data that you can share with the hackathon teams?

Counting Cattle using satellite imagery. https://arxiv.org/pdf/2011.07369.pdf
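As one possible baseline, the hedged sketch below counts 'cow' detections in a very-high-resolution aerial tile using an off-the-shelf COCO-pretrained detector from torchvision. It is only a starting point: typical satellite resolution is probably too coarse for this, and a detector fine-tuned on overhead livestock imagery (as in the paper above) would be needed in practice; the tile file name is a placeholder.

```python
# Hedged baseline: count cattle-like detections in a tile of very-high-resolution
# aerial imagery with a COCO-pretrained detector (COCO includes a 'cow' class).
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

COW_LABEL = 21  # index of 'cow' in torchvision's COCO_V1 category list

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def count_cattle(image_path, score_threshold=0.5):
    """Count detections labelled 'cow' above a confidence threshold in one image tile."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([image])[0]
    keep = (out["labels"] == COW_LABEL) & (out["scores"] > score_threshold)
    return int(keep.sum())

print(count_cattle("aerial_tile.png"))  # hypothetical tile
```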

Are there any public data you believe could be relevant?

Any file attachments that are relevant?

Rangeland data files (ArcGIS 'My Map'), and some more livestock grazing data for New Mexico here: https://data.doi.gov/dataset/blm-grazing-allotment-polygons

 

 

Ember & Subak - Methane as a greenhouse gas is 84 times more powerful than carbon dioxide over a 20-year period. Around a third of anthropogenic methane emissions come from the fossil fuel industry; it seeps from deep coal mines, it vents from oil and gas wells, and it leaks from pipelines. The extent of these emissions has historically been underestimated, and voluntary reporting of emissions has proved insufficient. With global demand for fossil fuels forecast to rebound, it is more important than ever to understand the climate impact of their production. Only in recent years have the tools become available to measure methane emissions directly, although the practice is in its infancy. The challenge is to use the available monitoring data to identify large methane emission events, and attribute these to fossil fuel infrastructure on the ground. This will help to identify the offending players and raise awareness of the true climate impact of fossil fuels. Read more about Ember and Subak.

What data is relevant for the challenge?

Satellite and atmospheric measurements of methane concentrations. Ground-based monitoring of methane and reported emissions events. Maps of energy infrastructure.

Methane emissions:

Known leaks:

Fossil fuel and energy infrastructure:
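One way to approach the attribution step is a simple nearest-facility match between detected plumes and mapped infrastructure. The sketch below assumes two tables (plume detections and facilities) with latitude, longitude and name columns; the column names, the example coordinates and the 10 km search radius are illustrative assumptions.

```python
# Hedged sketch: attribute detected methane plumes to the nearest mapped fossil fuel
# facility within a search radius. Plume and facility tables are assumed inputs.
import numpy as np
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def attribute(plumes, facilities, max_km=10.0):
    """For each plume, return the closest facility within `max_km`, if any."""
    matches = []
    for _, p in plumes.iterrows():
        d = haversine_km(p["lat"], p["lon"], facilities["lat"], facilities["lon"])
        i = d.idxmin()
        matches.append(facilities.loc[i, "name"] if d[i] <= max_km else None)
    return plumes.assign(nearest_facility=matches)

# Hypothetical plume detections and infrastructure points
plumes = pd.DataFrame({"lat": [31.42, 36.10], "lon": [-103.15, -107.52]})
facilities = pd.DataFrame({"name": ["well pad A", "coal mine B"],
                           "lat": [31.40, 36.30], "lon": [-103.17, -107.40]})
print(attribute(plumes, facilities))
```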

 

WWF - WWF implements REDD projects that aim to reduce emissions from deforestation and forest degradation and thereby enhance forest carbon stocks in forest landscapes in developing countries. An important aspect of these projects is being able to quantify greenhouse gas (GHG) emission reductions and removals (ER) from avoiding unplanned and planned deforestation and forest degradation. The challenge is to develop a system that can quantify GHG ER remotely, without having to do field work on the ground. The system should ingest remote spatial data (satellite/radar etc.) on forest-cover change dynamics, matched with variables hypothesized to influence deforestation trends such as altitude, slope, ecosystem type and political jurisdiction. This should be coupled with data about forest biomass and other relevant indicators in a landscape, to quantify the resulting changes to carbon stocks in aboveground tree biomass. The system should then couple continuous forest-cover change detection, using for example AI/machine learning or other relevant techniques, with subsequent analysis to quantify GHG emission reductions and removals from avoiding unplanned and planned deforestation and forest degradation.
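For the carbon-accounting step, a minimal sketch is shown below, assuming the change-detection output provides deforested area per stratum and a mean aboveground biomass value is available for each stratum; the strata, biomass values and the IPCC default carbon fraction of 0.47 are illustrative assumptions, and a full VM0007-style accounting would include further carbon pools and uncertainty deductions.

```python
# Minimal sketch of the carbon-accounting step, under simple assumptions: detected
# deforested area (ha) per stratum and mean aboveground biomass (t dry matter/ha).
CARBON_FRACTION = 0.47   # IPCC default carbon fraction of dry biomass
C_TO_CO2 = 44.0 / 12.0   # convert tonnes of carbon to tonnes of CO2

def emissions_tco2(deforested_ha, agb_t_per_ha):
    """Gross emissions (t CO2) from clearing `deforested_ha` hectares of forest
    with mean aboveground biomass `agb_t_per_ha` (t dry matter per hectare)."""
    carbon_t = deforested_ha * agb_t_per_ha * CARBON_FRACTION
    return carbon_t * C_TO_CO2

# Hypothetical strata from a change-detection output: (deforested ha, mean AGB t/ha)
strata = {"lowland forest": (120.0, 310.0), "montane forest": (45.0, 220.0)}
total = sum(emissions_tco2(area, agb) for area, agb in strata.values())
print(f"gross emissions: {total:,.0f} t CO2")
```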

What data is relevant for the challenge?

Examples of data could be freely available optical U.S.G.S. Landsat, Sentinel-2, synthetic aperture radar, and CORONA satellite imagery to reconstruct historical forest-cover change dynamics in a landscape. Very-high-resolution PlanetScope data could be used for high-accuracy assessment of forest dynamics during recent years (since 2016).

Any file attachments that are relevant?

https://verra.org/methodology/vm0007-redd-methodology-framework-redd-mf-v1-6/

 

ECOSYSTEM: How to preserve and protect the biodiversity and health of the world's ecosystems?

Smithsonian Institution - The Bird Friendly Certification program seeks to develop a tool to help Bird Friendly® certified farmers and auditors around the world assess the quality of the wildlife habitat retained in and around farms. While agriculture often drives biodiversity loss, the Habitat App will empower users to understand and maintain habitat for wildlife within managed land. The Habitat App will guide users through  Bird Friendly habitat assessments, which measure critical features such as tree cover, density, and diversity. The Habitat App will thereby allow users to assess and remotely verify compliance with Bird Friendly Certification criteria and assess and improve the relative Bird Friendliness of their farm practices. Today, the Bird Friendly Certification is limited to two crops—coffee and cocoa—and requires in-person audits from trained inspectors, which can be costly to farmers. The Habitat App will be a useful tool for auditors, farmers, and the Smithsonian as it will streamline the certification audit, decrease certification fees, and populate a database of habitat features that can be monitored for change over time. The Habitat App can potentially scale up to identify and track habitat in a diversity of crop systems and urban and suburban developments. Tomorrow, we envision a diversity of Bird Friendly crops and producers who are able to assess and manage the habitat they retain on their farms using the Bird Friendly Habitat App. Read more about Bird Friendly Coffee.

What data is relevant for the challenge?

Documentation of wildlife habitat features in and around coffee and cocoa farms. These habitat features include: tree canopy cover, tree density, tree height, tree species diversity, agrochemical use, property borders, % forest on the landscape, and GPS/timestamp verification of all samples. These habitat features can be assessed manually, with values input by a trained inspector, but could also be automated by certain tools/apps already on the market: the "Measure" app on iPhone for canopy heights and "CanopyApp" on iPhone for canopy cover. Each data collection point could include a GPS location, which could be cross-referenced with satellite-derived products, such as % tree cover or % primary forest cover, to verify the amount of forest present around farms.
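As an illustration of the cross-referencing step, the hedged sketch below samples a % tree-cover raster at inspector GPS points; the raster file name and sample coordinates are assumptions, and any tree-cover product in geographic coordinates (for example a Hansen Global Forest Change tile) would work the same way.

```python
# Hedged sketch: sample a tree-cover raster at inspector GPS points to verify forest
# around a farm. The file name and points are assumptions for illustration.
import rasterio

def tree_cover_at_points(raster_path, lonlat_points):
    """Return % tree cover at each (lon, lat) point from a single-band raster
    in geographic coordinates (EPSG:4326)."""
    with rasterio.open(raster_path) as src:
        return [float(val[0]) for val in src.sample(lonlat_points)]

# Hypothetical sample points taken from a farm inspection sheet
points = [(-89.62, 14.55), (-89.61, 14.56)]
cover = tree_cover_at_points("treecover2010_tile.tif", points)
print([f"{c:.0f}% tree cover" for c in cover])
```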

Are there any public data you believe could be relevant?

Satellite products that describe forest, tree cover, and forest loss: 

Anything else you want to add regarding this use case?

The Excel inspection sheet is completed for every farm. Please note that a cooperative would consist of multiple farms (and sheets) under a collective. We can supply these from completed inspections.

Today, we have two “Bird Friendly” standards at the Smithsonian, but we are leading an initiative called the “Bird Friendly Coalition” with a constituency of over 50 organizations who conduct or promote Bird Friendly practices, and for whom a monitoring and evaluation tool has been identified as a high priority.

 

Climate Policy Radar - Laws and policies will make or break our ability to overcome climate change: whether regulating EV charging points, subsidies for renewable energy, or mandatory risk disclosure by corporates. In order to develop better laws and policies we need an in-depth understanding of existing laws and policies, so policymakers can replicate successes and avoid mistakes. This challenge involves developing solutions for querying the full text of documents in the global climate legislation database. Read more about Climate Policy Radar.

Is there any example of data that you can share with the hackathon teams?

Data set (c. 2,100 laws and policies) available here: https://climate-laws.org/legislation_and_policies A downloadable CSV with metadata is available. When clicking on each law there is a short English summary (written by researchers) and a link to the full text – sometimes a link to an external site, sometimes a locally stored PDF.
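A possible baseline for querying the collection is a TF-IDF ranking over the CSV metadata, sketched below; the file name and the Title/Description column names are assumptions and should be adjusted to the actual export, and the full-text PDFs would need to be fetched and parsed separately.

```python
# Hedged sketch of a baseline full-text query over the downloadable CSV.
# Column names ("Title", "Description") are assumptions; adjust to the actual export.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

laws = pd.read_csv("laws_and_policies.csv")                 # hypothetical file name
corpus = laws["Title"].fillna("") + " " + laws["Description"].fillna("")

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(corpus)

def query(text, top_k=5):
    """Rank laws/policies by TF-IDF cosine similarity to a free-text query."""
    scores = cosine_similarity(vectorizer.transform([text]), doc_vectors)[0]
    return laws.iloc[scores.argsort()[::-1][:top_k]]["Title"]

print(query("electric vehicle charging infrastructure subsidies"))
```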

 

PlanBørnefonden - Humanitarian organisations can only help women, girls, men and boys affected by a disaster if we understand their needs, resources and capacities as fast and thoroughly as possible. Often it is girls and women who are the most vulnerable in a crisis. For example, gender-based violence increases during crises, and girls and women face increased work burdens as they have to care for children and those injured or sick, as well as prepare food and walk long distances to get water. At the same time, men may lose their sense of purpose as they lose jobs, and boys are recruited to join armed forces. A tool used after a disaster strikes to document and analyse vulnerabilities, capacities, and roles related to gender is called Rapid Gender Analysis (RGA). The aim of an RGA is to understand these gender dynamics in a context before and after a disaster strikes and to provide recommendations for what the emergency response should look like so that we meet needs and promote gender equality. RGAs are conducted by comparing existing data on gender dynamics in a context with newly collected data from interviews, focus group discussions, and surveys with the affected people. All of this should be done within two weeks after a disaster strikes. However, field access is often difficult, gender experts are not first responders, and translation is lengthy. ‘Rapid’ GA is often not rapid. While there is a wealth of information on gender dynamics available, these data often remain unexplored due to a lack of resources to analyse them. How can we collect and analyse data efficiently from various data sources, thereby producing faster, cheaper, and richer RGAs? This will strengthen the humanitarian community’s ability to meet the needs of the most vulnerable in a crisis situation in the most efficient way.

What data is relevant for the challenge?

Existing Rapid Gender Analyses, gender in briefs, RGA assessment tools used to collect and analyze data for RGA, guidance notes on how to produce RGA  

Is there any example of data that you can share with the hackathon teams?

Find existing RGA’s and assessment tools here: https://genderinpractice.care.org/core-concepts/gender-analysis-framework/good-practices-framework-on-gender-analysis/emergencies/

Is your organization using any hardware (IoT, sensors, cameras etc.) to collect data?

Kobo Toolbox, Magpie and Poimapper on tablets to collect data; we sometimes use cameras to take pictures (used for photovoice to document circumstances), and data is also collected on paper.

Are there any public data you believe could be relevant?

Data from GDELT, existing RGAs, and gender in briefs.

 

TransitionZero - Our knowledge of the cooling types that fossil fuel power plants use is very incomplete; filling this gap would greatly aid understanding of the emissions and water risk of power plants globally. Using a sample of power plants where cooling types are known, we can build a machine learning model to identify cooling types from publicly available satellite imagery.

What data is relevant for the challenge?

Sentinel-2 satellite data, OpenStreetMap tags, power plant locations
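A hedged baseline for the classification step is sketched below, assuming Sentinel-2 chips have already been extracted around plants with known cooling types; it uses simple per-band statistics with a random forest rather than a CNN, and the placeholder arrays, chip size and cooling-type labels are illustrative assumptions.

```python
# Baseline sketch, assuming Sentinel-2 chips have already been extracted around power
# plants with known cooling types (labels). Uses simple per-band statistics rather
# than a full CNN; chip extraction itself is out of scope here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def chip_features(chip):
    """Per-band mean and standard deviation for a (bands, height, width) chip."""
    return np.concatenate([chip.mean(axis=(1, 2)), chip.std(axis=(1, 2))])

# Placeholder training data standing in for real chips and known cooling types
chips = np.random.rand(200, 4, 64, 64)
labels = np.random.choice(["once-through", "recirculating", "dry"], size=200)

X = np.stack([chip_features(c) for c in chips])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())
```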

 

WASTE: How to become more sustainable, by developing products with zero waste?

Buy Food With Plastic - Today our plastic collection process is manual, and we would like to see how it can be automated. For example, instead of writing down the number of plastic bottles collected, we want some kind of digital solution, where the number of bottles could be shown on our public website, alongside the number of meals provided. We urge you to be creative and innovative. Surprise us with how technology can simplify this process and make our business even more attractive. Read more about what we do in our Impact Report.

 

WATER: How to be water positive, meaning replenishing more water than is used?

WattTime - Due to the seasonality of rainfall, hydropower plants behave differently at different times of year. Furthermore, due to an excess of water, some hydro generators may at certain times release water past the generators (‘spillage’). How could we model hydro behavior in order to predict it and identify spillage? This will help WattTime improve its understanding of grid behavior and identify opportunities for emissions reductions. Read more about WattTime.
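One simple way to start is to fit a seasonal flow-to-generation relationship and flag periods where reported generation falls well below what inflow would suggest, as sketched below; the CSV name, the inflow and generation column names, and the 0.7 threshold are all illustrative assumptions, since the relevant public datasets are listed separately.

```python
# Hedged sketch: flag possible spillage as periods where generation sits well below
# an expected flow-to-power relationship fitted on historical data.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("hydro_plant_timeseries.csv", parse_dates=["date"])  # hypothetical
df["month"] = df["date"].dt.month

# Fit a simple seasonal flow-to-generation model (month dummies plus inflow)
X = pd.get_dummies(df["month"], prefix="m").assign(flow=df["inflow_m3s"])
model = LinearRegression().fit(X, df["generation_mwh"])
df["expected_mwh"] = model.predict(X)

# Candidate spillage: generation much lower than the inflow would suggest
df["spill_flag"] = df["generation_mwh"] < 0.7 * df["expected_mwh"]
print(df.loc[df["spill_flag"], ["date", "inflow_m3s", "generation_mwh"]].head())
```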

Public data available:

Articles for reference: