
This February, EDF Mexico is hosting the conference Marea Digital (“Digital Tide”): technology solutions for sustainable fishing. This hybrid event is being held in La Paz, Baja California Sur, Mexico, and on Zoom, and brings together fishing community representatives, technology providers, fishery managers and more. The event will be presented in English and Spanish, and event resources will be available on SAFET's website in the near future.


Congratulations to EDF Mexico for coordinating such an exciting event!


Learn more about the agenda: https://ppv.mx/mareadigital/index.php/agenda


Meet the Marea Digital speakers: https://ppv.mx/mareadigital/index.php/ponentes


See which technology providers attended: https://ppv.mx/mareadigital/index.php/proveedores


For general details, please see the Marea Digital website: https://ppv.mx/mareadigital/




In this episode of 5 Questions, we talked with Claire Van Der Geest, former General Manager of the Fisheries Information and Services Branch at the Australian Fisheries Management Authority (AFMA), and one of the most knowledgeable and enthusiastic people I’ve ever talked to about Electronic Monitoring (EM). During her tenure, Claire oversaw much of AFMA's empirical data collection, from licensing, logbooks and observer programs, and helped shepherd its digital transformation for programs like Electronic Monitoring and Electronic Reporting (ER). She also chaired the Western and Central Pacific Fisheries Commission (WCPFC) EM and ER Working Group and served as Vice Chair of the Indian Ocean Tuna Commission (IOTC) EM and ER Working Group. In this podcast, we talked to Claire about how she got involved in EM, what exactly EM is (for the layperson), how EM is progressing at the RFMO level, the importance of engaging multiple stakeholders in EM programs, and lessons learned from implementing EM programs. Since this podcast was recorded, Claire has moved on to a new role as a private consultant.

 

Q: What's your role at AFMA and how did you get involved in the world of Electronic Monitoring?


The General Manager of the Fisheries Information and Services Branch at AFMA is responsible for the services that collect all of AFMA's empirical data – from logbooks and licensing to the observer program, the electronic monitoring program, catch documentation records and port sampling – a whole range of services that underpin the data collection supporting sustainable fisheries management across the Commonwealth in Australia.


“How did I get involved in electronic monitoring – well, I landed the service delivery.” Claire was involved in a comprehensive review of the electronic monitoring program, which at the time had been running for seven years. They looked at what people thought it was, what the services are, how the program had performed, the benefits, etc., and really did a deep dive to ensure that the services continue to improve.


Q: What exactly is e-monitoring for the layperson?


Electronic monitoring can have multiple meanings. In fact, traditional vessel monitoring systems, or VMS, are a form of electronic monitoring; electronic logbooks, as opposed to paper logbooks, are a form of electronic monitoring; so we need to be careful when we think about what is truly electronic monitoring in a traditional context and in a broad context. Increasingly there is a conversation about electronic technologies in fisheries as an umbrella term that encompasses all of these things, which lets us use “electronic monitoring” in a very specific context – in the context that we use it, we are thinking about having CCTV cameras on fishing boats. At the moment in Australia, this is a combined system with sensors on equipment that kick-start the cameras to record fishing activities or events of interest. These events could have high-risk implications for the fishery at large, or they may be particular events where AFMA wants to verify what’s actually occurring on the water. “That’s what we think of as being electronic monitoring, and that is the more global view of what electronic monitoring is, but it’s important to realize there is a broader context also.”
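To make the event-triggered design concrete, here is a minimal sketch in Python of how gear sensors might kick-start camera recording. The sensor and camera interfaces and thresholds are hypothetical placeholders, not AFMA's actual system.

```python
import time

# Hypothetical threshold and buffer -- illustrative values only, not AFMA's.
WINCH_LOAD_THRESHOLD = 50.0   # sensor reading that suggests gear is being worked
POST_EVENT_BUFFER_S = 600     # keep recording 10 minutes after activity stops

def monitor(sensor, camera):
    """Start the cameras whenever an equipment sensor suggests fishing activity."""
    recording_until = 0.0
    while True:
        load = sensor.read_winch_load()      # hypothetical sensor interface
        now = time.time()
        if load > WINCH_LOAD_THRESHOLD:
            if not camera.is_recording():
                camera.start_recording()      # hypothetical camera interface
            recording_until = now + POST_EVENT_BUFFER_S
        elif camera.is_recording() and now > recording_until:
            camera.stop_recording()
        time.sleep(1)
```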


Q: EM has been at the center of many Regional Fishery Management Organization (RFMO) discussions, and with disruptions from the COVID-19 global pandemic and concerns about the level of human observer coverage in fisheries such as longline, what’s happening at the regional governance level?


Monitoring fishing activities at sea has always been important. All RFMOs have had observer requirements to conduct at-sea monitoring of fishing events for a long time, but compliance has been relatively low for some parts of the fleet, where it is difficult either to place an observer due to the size of the vessel or to accommodate the person onboard for the duration of the trip. It’s important to recall that we don’t implement tools for no reason – we implement onboard observers or electronic monitoring to collect and verify data on what’s going on at sea. We already have a requirement for onboard observer coverage at sea; if we can’t meet that requirement, then we have an opportunity to use a different monitoring tool, and that other monitoring tool is electronic monitoring. Both onboard observers and EM have the same purpose: verifying fishing activities at sea.


As part of her many roles at WCPFC, Claire also chaired the Electronic Monitoring (EM) and Electronic Reporting (ER) Working Group, where there have been engaged conversations to progress both EM and ER, but she notes that it is really a complex area to navigate.


The technology around the onboard observer doesn’t change, but EM technology is continuously changing, so any standards need to be future-proofed. Privacy is also an issue because we’ll be collecting images of people's faces, which is highly sensitive. Progress needs to be pragmatic and iterative, and we need to work collaboratively to understand the differences between jurisdictions – for example, what kind of privacy legislation exists in Australia and how that carries over into an RFMO. Both WCPFC and the IOTC have progressed their discussions, with Australia helping to lead them; the Commission for the Conservation of Southern Bluefin Tuna (CCSBT) has just had its initial conversations on EM. Other RFMOs that Australia is not a party to, such as the International Commission for the Conservation of Atlantic Tunas (ICCAT) and the Inter-American Tropical Tuna Commission (IATTC), are also having discussions on EM. At various levels, there is a really strong need to harmonize across RFMOs. All of the Chairs of the different groups are in dialogue on how to harmonize to the greatest extent possible, so that we have systems on boats that can work across multiple RFMOs, and countries that are party to multiple RFMOs only need to implement one EM program, not more.


Q: It seems there are a lot of layers that necessitate coordination. Is there a space to engage stakeholders like the private sector, NGOs, and fishers, or is it purely top-down? What are the potential roles & responsibilities, and potential incentives?


If we don’t have stakeholders engaged, we’re not going to get the best outcome that we can. We definitely need industry onboard – we’re putting cameras/CCTV on their boats: “I don’t have a camera above me in my workplace.” It’s important to have rigorous rules about how this data is being used, “as it's capturing whatever I’m doing.” We as governments are really sensitive to what we’re actually doing, so we need clear rules and guidelines around what footage we need and why it targets certain locations. “No, [cameras] are not in the galley.” It’s important that people understand the purpose of this tool and for industry to be comfortable with what we are doing. We definitely need EM tech vendors onboard too, because the technology is changing, and we need them to understand the need from a business perspective and from a fisheries management perspective, i.e. “what do we need this tech to do.”


We also need to progress Artificial Intelligence (AI) and Machine Learning (ML) to reduce the amount of time needed to review and analyze video footage, as this is where the majority of a program’s costs sit. We need to ensure that existing algorithms for, say, “face-masking” are already included in the footage review software to protect the privacy of individuals.
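As a rough illustration of what face-masking might involve, here is a minimal Python/OpenCV sketch that blurs detected faces in a frame before review. It assumes a simple Haar-cascade detector is good enough, which is an assumption for illustration; production EM review software will use more robust approaches.

```python
import cv2

# OpenCV's bundled frontal-face Haar cascade (illustrative choice only).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def mask_faces(frame):
    """Blur any detected faces in a video frame to protect crew privacy."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        face = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(face, (51, 51), 0)
    return frame
```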


In terms of incentives, it’s a tricky one. We’re at this point where we keep talking about it, but until vendors see government commitment they won’t invest, and unfortunately, governments won’t invest until they see the prices come down – we’re at a tricky chicken-and-egg impasse, where we actually all need to make a commitment: “Yeah, this is going to happen...” So how do we strengthen the collaboration: here is what we need from a fisheries management perspective, go and make your tech do that for us, and we’ll continue to progress the development and implementation of this tool.


“We’re at this point where we keep talking about it, but until vendors see government commitment they won’t invest, but unfortunately, governments won’t invest until they see the prices come down – we’re at this tricky impasse of chicken and egg.”


It comes down to, “what is the data that fisheries management needs to effectively manage its resource? What tool is appropriate to collect that data?” In instances where observers are already collecting data effectively, we don’t need to replace them with EM; we can see them as two complementary tools, and in that way we can have the best of both worlds and really good verification of what’s happening at sea, which is the whole objective.


Q: AFMA has been piloting EM for some time now (as far back as 2005!), can you share with us some key lessons learned, especially for those just beginning to launch EM programs?


It is complex, and people shouldn’t see it as a quick process. To get effective buy-in you need to take this in a collaborative and iterative way. There are a lot of moving parts – “spinning plates on top of spinning plates” – plus policy ramifications and ancillary legislation implications, so engage collaboratively and understand it’s going to be a longer process.


Have a clear goal as to what you’re trying to achieve, whether it’s verifying data, biological samples, offsetting observer coverage, or whatever it is.


Collect really good baseline information, such as cost and benefit information, so we can continue to improve and demonstrate to industry what’s in it for them.


We have much more confidence now in our logbook data. When we compare and verify it against EM data it's highly congruent, and we’re very confident that our logbooks are now accurate. That’s because AFMA implements 100% EM coverage in the relevant fisheries and we randomly sample that data – these two pieces are essential. It’s a huge win because we reduce uncertainty in things like stock assessments and are more confident that our management will actually achieve its objectives. We are also able to implement more discrete management arrangements that target specific risks we’re trying to mitigate, whether spatial, temporal, or vessel-level compliance.
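As a minimal sketch of this kind of cross-check (hypothetical column names and made-up numbers, using pandas; not AFMA's actual workflow), one could randomly sample EM-reviewed trips and flag logbook records that diverge from what reviewers observed:

```python
import pandas as pd

# Made-up example records -- illustrative only.
logbook = pd.DataFrame({
    "trip_id": [1, 2, 3, 4],
    "reported_catch_kg": [1200, 950, 400, 780],
})
em_review = pd.DataFrame({
    "trip_id": [1, 2, 3, 4],
    "observed_catch_kg": [1185, 960, 395, 1020],
})

# Footage is recorded for every trip, but only a random sample is reviewed.
sample = em_review.sample(frac=0.5, random_state=1)

merged = logbook.merge(sample, on="trip_id")
merged["relative_diff"] = (
    (merged["reported_catch_kg"] - merged["observed_catch_kg"]).abs()
    / merged["observed_catch_kg"]
)
# Flag trips where logbook and EM review diverge by more than 10%.
print(merged[merged["relative_diff"] > 0.10])
```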






For the past decade, Global Fishing Watch has been at the forefront of cutting-edge, open-access vessel monitoring technology that you can view from anywhere in the world. Earlier this year, SAFET’s Executive Director, Jeremy Crawford, sat down with Global Fishing Watch’s Chief Innovation Officer, Paul Woods, for a segment called “5 Questions”, where we interview tech innovators and implementers in seafood and fisheries from around the world about the work they do and exciting new updates in their organizations. In this episode we chatted with Paul about how he got involved with Global Fishing Watch, the new API portal they’ve recently launched, and the API portal hackathon coming up at the beginning of next year.

 

Q: How did you get involved at Global Fishing Watch?


“I’m a software developer by background, working in commercial software as a career, mostly with start-up software companies in the Washington DC area. In the late 2000s, during the financial collapse, I was working with several big banks using some big data software that we built to help them optimize and pick through the wreckage of the mortgage crisis, and at some point in 2009 I thought, ‘I could be doing something better with my life’.”


Paul then came across an NGO called SkyTruth, which was using remote sensing data to report on environmental issues. “I really didn’t know anything about remote sensing, but after the Deepwater Horizon spill in 2010, I saw how effective SkyTruth was at reporting on the size of the slick and at being a truly independent watchdog. When it turned out that SkyTruth’s reporting on the size of the slick was much more accurate than any of the authoritative sources, I got excited about the potential to scale up that capability…”


Paul then went to work for SkyTruth and figured out how to automate and process remote sensing data to extract environmentally useful insights. The work with SkyTruth and remote sensing of ocean data ultimately led to Global Fishing Watch.


Q: Why is it important that vessel tracking data is democratized, free and available?


“Our first big insight came when we started working with satellite-collected vessel tracking data from AIS [Automatic Identification System]. Before 2012, it wasn’t possible to do what we’ve been doing; you could only get AIS data locally – from your own receiver on your boat or from a terrestrial receiver onshore near a port. Then when satellites started picking this up, we realized there was an enormous amount of information in there.”


While most people were just using it to look at ‘where are the vessels now’, Paul and SkyTruth were interested in historical data and seeing patterns of activity. Initially looking for oil spills, they wanted to know which vessels were associated with the oil spills, but when they took all the vessel data and threw it on a map, the fishing activity just ‘popped right out’. “If you plot all the data from AIS for a couple of months in the Pacific Ocean, the fishing just stands right out, you don’t have to do anything to immediately see in what part of the ocean the fishing is concentrated.”
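To give a sense of how fishing activity can “pop right out” of positional data, here is a minimal sketch using synthetic points and matplotlib (not Global Fishing Watch’s actual pipeline): plotting AIS position density makes concentrated activity stand out against scattered transit tracks.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic AIS positions -- illustrative only. Real inputs would be months of
# satellite-collected (lat, lon) fixes from thousands of vessels.
rng = np.random.default_rng(0)
transit = rng.uniform(low=[-40, 140], high=[10, 180], size=(5000, 2))    # scattered tracks
fishing = rng.normal(loc=[-15, 170], scale=[1.5, 1.5], size=(5000, 2))   # one dense hotspot
positions = np.vstack([transit, fishing])

plt.hexbin(positions[:, 1], positions[:, 0], gridsize=60, bins="log", cmap="viridis")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("AIS position density (synthetic example)")
plt.colorbar(label="log(count)")
plt.show()
```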


At the time they were looking at machine learning models and said to themselves, “You know, if we just took all this data and applied some models to it, we could tell people where all the fishing was… people might be interested in that…”.


That led to a conversation with Oceana, “…and when they saw what we could do they lit right up, and immediately came up with 100 things that we could do with the data right away. Then when we showed the data to Google, they said, ‘Wow – that’s a neat and novel global data set, and we would love to help you distribute it to everybody’. And here we are 8 years later, and we are still working our way through those original 100 things.”


“That was the fundamental founding idea of Global Fishing Watch. We need to get this new data set out and share it with people so everyone can see what’s going on – democratize access to it, in the hopes of better management.”


Q: Global Fishing Watch is at the forefront of innovation in fisheries monitoring – How is Global Fishing Watch staying ahead of the curve, in particular this new API portal that you’ve launched?


“We’re always applying new remote sensing technologies – we’re using satellite radar now to detect vessels that turn AIS off or that don’t broadcast, and that’s really exciting…”


Typically, users of the Global Fishing Watch map zoom around, click on a spot, use the slider to see historical data over a period of time, see some fishing activity and click on it – ‘that’s one way to get information out of the system’.


“But a lot of folks – researchers, institutions – already have their own software systems and really want the data where they can just plug it in, so that’s what the API portal is for. We piloted this with NOAA: we did a project with their Office of Law Enforcement, who wanted to build a big dashboard and look at fishing behavior. We learned a lot about feeding our data as a live stream into another data system, and that’s what led to the development of the API portal.”


“So, you can go up on the website, go to the API portal, and anybody who can do a little bit of coding can write a bit of software that will pull any slice of data from the global data set – it’s all the fishing events we detect from 2012 until 3 days ago, and all the identity information that we can track…”


“Previously we were dumping annual data sets, and one of the complaints was that the data sets were just way too big – in order to do anything you had to download this monstrous data set just to find the piece that you wanted. So that was one of the motivations for the API: now you can write a query that pulls out the exact slice you want, and if all you need is 100 vessels – just go get the 100 vessels.”
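As a rough sketch of what pulling a slice programmatically might look like (the endpoint, parameter names and response shape below are hypothetical placeholders; see the Global Fishing Watch API portal documentation for the real routes and authentication):

```python
import requests

# Hypothetical endpoint and parameters -- consult the API portal docs for the
# actual routes, query fields and token handling.
API_TOKEN = "YOUR_TOKEN_HERE"
url = "https://api.example.org/fishing-events"

response = requests.get(
    url,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={
        "start-date": "2023-01-01",
        "end-date": "2023-03-31",
        "region": "some-region-id",   # request only the slice you need
        "limit": 100,
    },
    timeout=30,
)
response.raise_for_status()
for event in response.json().get("entries", []):
    print(event)
```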


Q: What are some of the use cases you are seeing for the new API portal?


“We’re excited to see MRAG as one of our early adopters – they’re using our APIs to pull data for fishery assessments. They had been going to our public map and taking screenshots, whereas now they can pull the data directly.”


Telecommunication companies laying undersea cables have a lot of interest in vessels that bottom-trawl. “We will be releasing into the API a new classification where we identify which vessels we think are bottom-trawling. That is of interest for a lot of things, not just fishing – companies that lay undersea cables need to know where bottom-trawling occurs, since they don’t want nets being dragged across their cables.”


The other use case is obviously enforcement – fisheries Monitoring, Control and Surveillance (MCS). “If you go into enforcement command centers you’ll see their vessel monitoring system, or VMS, and somewhere you’ll see a screen from Global Fishing Watch – they’ll use that to check against what they’re seeing now, or if they’re seeing something and want to compare it to last year they’ll go into Global Fishing Watch. So now the potential is that Global Fishing Watch data can flow into their primary system, so they don’t have to keep flipping back and forth between screens.”


Q: Next year Global Fishing Watch is hosting a hackathon – can you tell us a little about that?


“The idea is to generate new ideas – whenever we release an API there are a few sophisticated users who know how to use it, but we feel there’s a lot of potential, particularly for apps. So we’re developing an app designed for use in ports by port inspectors, called ‘Vessel Viewer’. The idea is you can use the API in an app on your mobile device to get all the information on a vessel just as it arrives in port, before you start your inspection – everything you need to know is there.”


“However, there are hundreds of apps that could be useful like that and bring in other useful information that we don’t have. So what we’d love to do is get people’s ideas – maybe somebody already has an app and just wants some help integrating the API so they can overlay our data – so with the hackathon we’re hoping to attract many different use cases and people with different domain knowledge who will bring their ideas and come up with things we never would have thought of.”





