Automate the production of high resolution time series of newt observations using camera traps

SCHEME: Proof-of-Concept

CALL: 2017

DOMAIN: MS - Materials, Physics and Engineering

FIRST NAME: Xavier

LAST NAME: Mestdagh

INDUSTRY PARTNERSHIP / PPP: No

INDUSTRY / PPP PARTNER:

HOST INSTITUTION: LIST

KEYWORDS: Camera trap, Automation, Image processing, Amphibian, Newt

START: 2018-01-01

END: 2019-12-31

WEBSITE: https://www.list.lu/

Submitted Abstract

The strategic objective of the present project is to develop an early pilot production line for a non-intrusive amphibian trap able to automate the production of high-resolution time series, to demonstrate its performance in relevant environments, and to engage a suitable industry partner to further manufacture and commercialize the trap. Environmental measuring capacities are still in the Middle Ages, in the sense that they require extensive human intervention (manual live trapping), with associated impacts on the variables being measured (bias) and on animal welfare. Amphibians, as an important part of biodiversity, are intensively studied in the frame of research projects, legal obligations, bio-indication and education. Patent n° LU93388, filed in December 2016 by LIST, describes a tool that automates the production of amphibian sightings. It tackles the limitations of standard methods by being cheaper (reduced human resources), faster (no animal handling), better (high-resolution time series, more independent observations), and safer (no captured animals). The PoC-NEWTRAP project includes the following steps:
- a marketing study, including a technico-economic survey and a market study (micro market test and macro industry test),
- improvement of the existing prototype to meet the needs of end-users and manufacturers,
- expansion of the LIST IP portfolio, reinforcing the automation of wildlife observation,
- making the end-product available to the market through licensing-out.
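The "high-resolution time series" the trap is meant to produce can be illustrated with a minimal sketch. Note that the detection format, function name, and timestamps below are hypothetical illustrations, not part of the project: the idea is simply that timestamped camera-trap detections are binned into hourly observation counts.

```python
from collections import Counter
from datetime import datetime

def hourly_series(detections):
    """Bin timestamped detections (ISO 8601 strings) into hourly counts.

    Hypothetical sketch: the actual trap's data format is not described
    in the abstract.
    """
    counts = Counter(
        datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
        for ts in detections
    )
    # Return the series ordered by time bin.
    return dict(sorted(counts.items()))

series = hourly_series([
    "2018-04-12T21:05:00",
    "2018-04-12T21:40:00",
    "2018-04-12T22:10:00",
])
print(series)
```

Such a series yields many more independent observations per night than manual live trapping, which is the resolution gain the abstract refers to.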
