
5 Questions: Paul Woods, CIO at Global Fishing Watch

For the past decade, Global Fishing Watch has been at the forefront of cutting-edge, open-access vessel monitoring technology that anyone can view from anywhere in the world. Earlier this year, SAFET’s Executive Director, Jeremy Crawford, sat down with Global Fishing Watch’s Chief Innovation Officer, Paul Woods, for “5 Questions,” a segment in which we interview seafood and fisheries technology innovators and implementers from around the world about their work and about exciting new developments in their organizations. In this episode we chatted with Paul about how he got involved with Global Fishing Watch, the API portal the organization recently launched, and the API portal hackathon coming up at the beginning of next year.


Q: How did you get involved at Global Fishing Watch?


“I’m a software developer by background; I’ve spent my career in commercial software, mostly with start-up software companies in the Washington DC area. In the late 2000s, during the financial collapse, I was working with several big banks, using some big data software we had built to help them optimize and pick through the wreckage of the mortgage crisis. At some point in 2009 I thought, ‘I could be doing something better with my life’.”


Paul then came across an NGO called SkyTruth, which was using remote sensing data to report on environmental issues. “I really didn’t know anything about remote sensing, but after the Deepwater Horizon spill in 2010, I saw how effective SkyTruth was at reporting on the size of the slick and at being a truly independent watchdog. When it turned out that SkyTruth’s reporting on the size of the slick was much more accurate than any of the authoritative sources, I got excited about the potential to scale up that capability…”


Paul went to work for SkyTruth and figured out how to automate the processing of remote sensing data to extract environmentally useful insights. That work with SkyTruth on remote sensing of ocean data ultimately led to Global Fishing Watch.


Q: Why is it important that vessel tracking data is democratized, free and available?


“Our first big insight came when we started working with satellite-collected vessel tracking data from AIS [Automatic Identification System]. Before 2012, it wasn’t possible to do what we’ve been doing; you could only get AIS data locally – from your own receiver on your boat, or from a terrestrial receiver onshore near a port. Then, when satellites started picking this up, we realized there was an enormous amount of information in there.”


While most people were using AIS simply to ask ‘where are the vessels now?’, Paul and SkyTruth were interested in historical data and patterns of activity. Initially looking for oil spills, they wanted to know which vessels were associated with them – but when they took all the vessel data and threw it on a map, the fishing activity just ‘popped right out’. “If you plot all the AIS data for a couple of months in the Pacific Ocean, the fishing just stands right out; you don’t have to do anything to immediately see in what part of the ocean the fishing is concentrated.”


At the time they were looking at machine learning models and said to themselves, “You know, if we just took all this data and applied some models to it, we could tell people where all the fishing was… people might be interested in that…”.


That led to a conversation with Oceana, “…and when they saw what we could do they lit right up, and immediately came up with 100 things that we could do with the data right away.” Then, when they showed the data to Google, the response was, ‘Wow – that’s a neat and novel global data set, and we would love to help you distribute it to everybody.’ “And here we are 8 years later, and we are still working our way through those original 100 things.”


“That was the fundamental founding idea to Global Fishing Watch. We need to get this new data set out and share it with people so everyone can see what’s going on – democratize access to it, in the hopes of better management.”


Q: Global Fishing Watch is at the forefront of innovation in fisheries monitoring – How is Global Fishing Watch staying ahead of the curve, in particular this new API portal that you’ve launched?


“We’re always applying new remote sensing technologies – we’re using satellite radar now to detect vessels that turn AIS off or that don’t broadcast, and that’s really exciting…”


Typically, users of the Global Fishing Watch map zoom around, click on a spot, and use the slider to view historical data over a period of time; they spot some fishing activity and click on it. As Paul put it, ‘that’s one way to get information out of the system’.


“But a lot of folks – researchers, institutions – already have their own software systems and really want the data where they can just plug it in, so that’s what the API portal is for. We piloted this with NOAA; we did a project with their Office of Law Enforcement, who wanted to build a big dashboard and look at fishing behavior. We learned a lot about feeding our data as a live stream into another data system, and that’s what led to the development of the API portal.”


“So, you can go on the website, go to the API portal, and anybody who can do a little bit of coding can write a bit of software that will pull any slice of data from the global data set – all the fishing events we detect from 2012 until three days ago, and all the identity information that we can track…”


“Previously we were dumping annual data sets, and one of the complaints was that they were just way too big – to do anything you had to download a monstrous data set just to find the piece you wanted. That was one of the motivations for the API: now you can write a query that pulls out the exact slice you want, and if all you need is 100 vessels, just go get the 100 vessels.”
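In practice, pulling a slice via an API like this amounts to building a request with date-range and vessel filters and sending it with an access token. The sketch below shows the general shape in Python; the base URL, endpoint path, and parameter names are illustrative assumptions, not the portal’s documented interface – consult the API portal itself for the real details.

```python
# Illustrative sketch of querying a slice of fishing-event data, in the
# spirit of the Global Fishing Watch API portal described above. The
# base URL, endpoint path, and parameter names below are assumptions
# for illustration only -- check the portal's documentation for the
# actual interface and for how to obtain an access token.
from urllib.parse import urlencode

API_BASE = "https://gateway.api.globalfishingwatch.org/v2"  # hypothetical base URL


def build_events_query(start_date, end_date, vessel_ids=None, limit=100):
    """Build a request URL for just the slice of events we want,
    instead of downloading an entire annual data dump."""
    params = {
        "start-date": start_date,  # assumed parameter name
        "end-date": end_date,      # assumed parameter name
        "limit": limit,            # cap the number of results returned
    }
    if vessel_ids:
        # Filter to specific vessels (assumed parameter name)
        params["vessels"] = ",".join(vessel_ids)
    return f"{API_BASE}/events?{urlencode(params)}"


# Ask for one quarter of events, capped at 100 results:
url = build_events_query("2020-01-01", "2020-03-31", limit=100)
# The portal's access token would then be sent as a bearer header, e.g.:
#   requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
```

The point of the design is the one Paul describes: the query itself carries the filters, so the server returns only the 100 vessels you asked for rather than a monolithic annual file.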


Q: What are some of the use cases you are seeing for the new API portal?


“We’re excited to see MRAG as one of our early adopters; they’re using our APIs to pull data for fishery assessments. They had been going to our public map and taking screenshots, whereas now they can pull the data directly.”


Telecommunications companies laying undersea cables have a strong interest in vessels that bottom-trawl. “We will be releasing into the API a new classification where we identify which vessels we think are bottom-trawling. That is of interest for a lot of things, not just fishing – companies that lay undersea cables need to know where bottom-trawling occurs, since they don’t want nets being dragged across their cables.”


The other use case is, of course, enforcement – fisheries Monitoring, Control and Surveillance (MCS). “If you go into enforcement command centers you’ll see their vessel monitoring system, or VMS, and somewhere you’ll see a screen from Global Fishing Watch – and they’ll use that to check against what they’re seeing now, or if they’re seeing something and want to compare it to last year they’ll go into Global Fishing Watch. So now the potential is that Global Fishing Watch data can flow into their primary system, so they don’t have to go flipping back and forth between screens.”


Q: Next year Global Fishing Watch is hosting a hackathon – can you tell us a little about that?


“The idea is to generate new ideas – whenever we release an API there are a few sophisticated users who know how to use it, but we feel there’s a lot of potential, particularly for apps. We’re developing an app designed for use in ports by port inspectors, called ‘Vessel Viewer’. The idea is that you can use the API in an app on your mobile device to get all the information on a vessel just as it arrives in port, before you start your inspection – everything you need to know is there.”


“However, there are hundreds of apps that could be useful like that, with other useful information that we don’t have. What we’d love to do is get people’s ideas – maybe somebody already has an app and just wants some help integrating the API so they can overlay our data. With the hackathon we’re hoping to attract many different use cases, and people with different domain knowledge who will bring their ideas and come up with things we never would have thought of.”