EyeSea automates video analysis of fish-MHK device interactions

ORPC’s RivGen system (Photo: ORPC)

Researchers from the Pacific Northwest National Laboratory (PNNL) have created a video analysis software tool called EyeSea that automatically spots interactions between fish and marine and hydrokinetic (MHK) devices.

Funded by the Department of Energy’s Water Power Technologies Office, EyeSea uses machine vision algorithms to ‘watch’ video footage for any incident in which a fish or mammal is near an MHK turbine.

The tool automatically detects when a fish or mammal enters the frame and flags every event, according to PNNL.

The flagged events tell an operator which segments of footage should be evaluated, significantly reducing the time required to analyze the recordings manually.
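PNNL has not detailed EyeSea’s algorithms in this article, but the detect-and-flag workflow it describes resembles standard motion detection. A minimal sketch, using simple frame differencing as an assumed stand-in for EyeSea’s actual machine vision approach (function names and thresholds here are hypothetical):

```python
import numpy as np

def flag_frames(frames, threshold=25, min_pixels=50):
    """Flag frames whose difference from the previous frame suggests
    something (e.g., a fish) has entered the scene.

    frames: iterable of 2-D grayscale arrays (uint8).
    threshold: per-pixel intensity change considered significant.
    min_pixels: number of changed pixels needed to flag a frame.
    """
    flagged = []
    prev = None
    for i, frame in enumerate(frames):
        if prev is not None:
            # Absolute per-pixel change from the previous frame
            diff = np.abs(frame.astype(int) - prev.astype(int))
            if np.count_nonzero(diff > threshold) >= min_pixels:
                flagged.append(i)  # operator reviews only these segments
        prev = frame
    return flagged

# Synthetic demo: static background, then a bright blob enters in frame 2.
background = np.zeros((64, 64), dtype=np.uint8)
moving = background.copy()
moving[20:30, 20:30] = 200  # hypothetical "fish" entering the frame
print(flag_frames([background, background, moving]))  # → [2]
```

In practice an operator would then review only the flagged segments, which is what makes the approach so much faster than watching the full recordings.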

Another downside of manual analysis of underwater video footage is reduced accuracy: to speed up the process, researchers often resort to evaluating randomly selected one-hour intervals of footage rather than the full recordings.

EyeSea can ‘watch’ underwater video footage and automatically identify when wildlife enters the frame (Image: PNNL)

PNNL developed and tested the tool using footage from Ocean Renewable Power Company’s (ORPC) pilot project in Igiugig, Alaska.

For about two months, an ORPC turbine generated electricity during the middle of Alaska’s annual salmon run.

Researchers analyzed 43 hours of video footage, in which they observed nearly 20 fish interactions, with no instances of fish injury.

PNNL assessed the accuracy of EyeSea and determined it was 85% accurate at detecting when marine life was present.
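The article does not say how the 85% figure was computed; a plausible reading is per-segment agreement between EyeSea’s presence/absence calls and human annotations. A minimal sketch under that assumption (labels below are invented for illustration):

```python
def detection_accuracy(predicted, ground_truth):
    """Fraction of video segments where the tool's presence/absence
    call matches the human annotation."""
    assert len(predicted) == len(ground_truth)
    matches = sum(p == g for p, g in zip(predicted, ground_truth))
    return matches / len(predicted)

# Hypothetical per-segment labels: 1 = marine life present, 0 = absent.
pred  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
truth = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
print(detection_accuracy(pred, truth))  # → 0.8
```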

Based on this data, PNNL said it started refining the algorithms behind EyeSea.

If successful, EyeSea will be made available to MHK operators and developers to streamline siting and permitting processes and to meet post-installation monitoring requirements at future MHK sites, according to the US Department of Energy.