Visual Physiology and Neurobotics

Bio-inspired autonomy - forging nature's drone.

We have developed an autonomous robotic platform inspired by our recordings of brain activity in flying insects. The system detects, selects and pursues moving features in cluttered environments.

Core capabilities

Using behavioural, electrophysiological and morphological techniques, we have developed computational models that provide robust and efficient solutions for target detection and tracking.

These models have been translated onto our autonomous robotic platform and tested in real-world, unstructured environments. We examine how biology solves the problems that challenge the robot's ability to autonomously pursue moving targets.

We have developed the only tractable animal model system providing insight into predictive feature attention, a capability important for artificial vision systems.
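
To give a concrete flavour of this class of model, the sketch below shows a generic insect-inspired small-target detection stage: spatial high-pass filtering, ON/OFF half-wave rectification, and a delay-and-correlate step that enhances small moving features against clutter. It is an illustrative assumption only, not the laboratory's published model; the function name and all parameter values are hypothetical, and it assumes NumPy and SciPy are available.

    # Hypothetical sketch of a generic insect-inspired small-target detection
    # stage (not the laboratory's published model). Parameter values are
    # illustrative assumptions.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def small_target_response(frames, delay_alpha=0.3, surround_size=5):
        """frames: iterable of 2-D greyscale arrays with values in [0, 1]."""
        off_delayed = None  # low-pass (delayed) OFF-channel state
        responses = []
        for frame in frames:
            # Spatial high-pass: centre signal minus local surround average.
            highpass = frame - uniform_filter(frame, size=surround_size)
            # Half-wave rectify into ON (brightening) and OFF (darkening) channels.
            on_chan = np.maximum(highpass, 0.0)
            off_chan = np.maximum(-highpass, 0.0)
            # A first-order low-pass filter acts as a temporal delay on the OFF channel.
            if off_delayed is None:
                off_delayed = off_chan
            else:
                off_delayed = delay_alpha * off_chan + (1 - delay_alpha) * off_delayed
            # Correlating the undelayed ON channel with the delayed OFF channel
            # responds most strongly to small, dark, moving features.
            responses.append(on_chan * off_delayed)
        return responses

In practice, a target selection and pursuit controller would sit on top of such a detection map; that layer is not sketched here.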

Key collaborations

  • Established international collaboration with Lund University (SRC, STINT funded) and San Diego State University (AFOSR funded).
  • Support from two ARC Centres of Excellence (Centre for Robotic Vision, Centre for Nanoscale BioPhotonics).
  • DEWNR project support with permits and approvals for collection and filming.
  • Multidisciplinary collaboration across neurosciences, psychology, engineering and computer vision.

Key contact

Dr Steven Wiederman
Email: steven.wiederman@adelaide.edu.au
Phone: +61 403 079 575

Visual Physiology and Neurobotics Laboratory, University of Adelaide