  • Offer Profile
  • Our mission is to understand the science, engineering and social role of robotics and embedded intelligence. Our multidisciplinary approach aims to create autonomous devices capable of working independently, with each other, or with us in our human society.

    Launched in December 2005, the Bristol Robotics Laboratory is a collaborative research partnership funded by the University of Bristol, the University of the West of England and HEFCE. Under the direction of Professor Chris Melhuish, the laboratory is an evolution of the Intelligent Autonomous Systems Laboratory (IAS).
Product Portfolio
  •  Projects

    • Energy Autonomy: Ecobot

    • One goal of our work is to build energetically autonomous robots. For this, the Microbial Fuel Cell (MFC) technology is employed to extract electrical energy from refined foods such as sugar and unrefined foods such as insects and fruit. This is achieved by extracting electrons from the microbial metabolic processes. To be truly autonomous, robots will be required to incorporate in their behavioural repertoire actions that involve searching, collecting and digesting food. The robot will be designed to remain inactive until sufficient energy has been generated to complete its next task. This may prove to be a paradigm shift in the way action selection mechanisms are designed.

      Project code-name: ‘EcoBot’.

      So far, two such robots, namely EcoBot-I and EcoBot-II, have been developed which, to some extent, exhibit this type of behaviour. EcoBot-I, developed in 2002, employed E. coli and was fed with sugar; EcoBot-II, developed in 2004, used sludge microbes and was fed (amongst other substrates) with dead insects and food waste.

      This project, using the same MFC technology, is also looking into underwater autonomy based on artificial gills for robots.
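The “remain inactive until sufficient energy has been generated” behaviour described above can be pictured as an energy-gated action-selection loop. The sketch below is a minimal illustration only: the threshold, task cost and harvest rate are invented numbers, not the actual EcoBot charging figures.

```python
from dataclasses import dataclass

# Illustrative parameters only; the real EcoBot charging figures are
# not given in this profile.
CHARGE_THRESHOLD_J = 5.0   # energy needed before the next task can run
TASK_COST_J = 4.0          # energy consumed by one task

@dataclass
class EnergyBudgetRobot:
    """Sketch of MFC-style energy-gated action selection: stay dormant
    until the accumulated charge covers the next task."""
    stored_energy: float = 0.0

    def harvest(self, mfc_output_j: float) -> None:
        self.stored_energy += mfc_output_j

    def try_act(self) -> bool:
        if self.stored_energy >= CHARGE_THRESHOLD_J:
            self.stored_energy -= TASK_COST_J
            return True    # task executed
        return False       # remain dormant, keep charging

robot = EnergyBudgetRobot()
actions = []
for _ in range(10):
    robot.harvest(1.0)     # pretend the MFC stack yields 1 J per step
    actions.append(robot.try_act())
```

With these made-up numbers the robot acts on every fifth step, which is the charge/act duty cycle the paragraph describes.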
    • EcoBot III Project: Energy Autonomy

    • EcoBot III is an EPSRC-funded project which ended in January 2007. The main objective of this project was to develop a robot with onboard fluid circulation, capable of collecting its energy from the environment and getting rid of its own waste, with all of these functions powered by the MFCs. Due to the challenging nature of this project, rapid prototyping technology was fully utilised to produce a uniquely designed, lightweight and strong robot structure comprising multiple parts. Furthermore, because of the scarcity of onboard energy, the team was required to develop ultra-low-power electronic circuitry to operate the robot.

      The image at right shows the current EcoBot team.

      Ingestion/Digestion/Egestion Vessel (left image)
      This is the uppermost component of the robot, which consists of the ingestion, artificial digestion and solid waste excretion mechanisms. The image clearly shows sludge within the vessel and solid waste sedimented into the egester.

      Sludge and Water Distribution and MFCs (right image)
      This is the middle section of the robot which consists of the sludge distribution mechanism (white solid helical rings), and the MFCs (24 in total) which are shown just below the distribution mechanism. Underneath the MFCs there is an overflow collection tray which feeds back into the ingestion vessel above.
    • Towards Empathy in Humanoids

    • This project is concerned with the believability of emotionally expressive agents. The aim is to generate facial behaviour in a humanoid robot head so that, if a person speaks to the robot, the person feels listened to in a sympathetic way. In this study we will pursue techniques for the classification of dynamic facial expressions in humans and the generation of appropriate dynamic expressions in the robot. This will require ‘theory of mind’ models as well as dynamic emotional models. We are collaborating with computer graphics specialists who are able to extract ‘characteristic’ features from images and use these to create novel action sequences with qualitatively the same behaviour as the example set. Our particular interest is in how, and to what extent, one can achieve the illusion of psychological attending and understanding even though the robot lacks ‘true’ intelligence. We aim to find new approaches towards enhancing human-likeness by generating genuine, non-repetitive facial behaviour that conveys a certain underlying emotional state.
    • Robot Gesturing

    • This project focuses upon the production of credible conversational gestures by an anthropomorphic robot. An important part of human interaction with a robot is, for the human, that they have a feeling of engagement with an intelligent agent; one way this might be achieved is a robot that uses gestures as well as speech. Gestures are an integral part of human communication, not only for semantic content but also for evidence of speaker thought processes and engagement of the conversational partner. This will require a controller that is able to produce humanlike motions; to generate such a controller, an evolution strategy applied to a neural network is being investigated. The gestures themselves and their coordination with speech will be generated using a novel natural language with gesture generator (NLGG). The NLGG will provide the instructions that are to be carried out by the evolved controller. Of particular interest is how the gestures affect a person’s interactions with, and feelings towards, the robot. We will be collaborating with psychologists to carry out such an investigation, initially using hidden tele-presence control (Wizard of Oz), and eventually with the fully implemented system.
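As a rough illustration of what evolving a neural-network controller with an evolution strategy looks like, the sketch below tunes the weights of a tiny network with a greedy (1+λ) strategy. The network shape, the target trajectory and every hyperparameter are invented for illustration; none of this is the laboratory’s actual controller.

```python
import math
import random

def forward(weights, x):
    """Tiny 1-2-1 tanh network standing in for a motion controller."""
    h1 = math.tanh(weights[0] * x + weights[1])
    h2 = math.tanh(weights[2] * x + weights[3])
    return weights[4] * h1 + weights[5] * h2 + weights[6]

def fitness(weights, samples):
    # Negative squared error against an assumed target trajectory y = x**2.
    return -sum((forward(weights, x) - x * x) ** 2 for x in samples)

def evolve(generations=200, lam=8, sigma=0.2, seed=0):
    """Greedy (1+lambda) evolution strategy over the 7 network weights:
    mutate the parent with Gaussian noise, keep a child only if it improves."""
    rng = random.Random(seed)
    samples = [i / 10 for i in range(-10, 11)]
    parent = [rng.uniform(-1, 1) for _ in range(7)]
    best = fitness(parent, samples)
    for _ in range(generations):
        for _ in range(lam):
            child = [w + rng.gauss(0, sigma) for w in parent]
            f = fitness(child, samples)
            if f > best:                 # elitist replacement
                parent, best = child, f
    return parent, best
```

The same loop structure applies whether the fitness measures trajectory error, as here, or some rating of gesture credibility.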
    • Haptic Sensing
      Robotic Humanoid Hand Research

    • In order to interact physically with humans and the unstructured environments in which they live, robots will need an accurate and sophisticated sense of touch. This project is concerned with the research and design of a haptically instrumented robotic humanoid hand.

      We use our sense of touch to interact with each other and with our environment. It has been said that of all the senses, if lost, touch has the most detrimental effect on a person’s quality of life. With the absence of a tactile sense, humans are no longer able to control objects, or even their own limbs, without significant visual feedback and effort, as well as losing the ability to meaningfully interact and communicate with each other physically. The sense of touch is an essential part of autonomous independent existence and has a significant role in emotional interaction between humans.

      The multimodal tactile sensors that will be developed for the robotic hand will allow for accurate control and manipulation of objects as well as the ability to actively gather information haptically about objects within unstructured environments. This work will also investigate the use of such an instrumented hand for advanced prosthetics, and for human robot interactions where an accurate sense of touch plays an important role in safety.
    • Haptic Representation Project

    • This project aims to produce textural concepts using tribological (friction) data acquired by an artificial finger. The long-term goal is to have an artificial hand explore an environment to learn and classify the different textures encountered. Before this goal can be realised, the textural concept generation and acquisition techniques are being developed on an artificial finger using artificial textures specifically made to examine textural features.

      Our Finger (right image)
      The finger is made out of polycarbonate plastic using a rapid prototyping machine. A microphone was recessed into the back of the finger to record vibrations when in contact with surfaces; because the microphone is recessed, the ambient noise it picks up is reduced. To boost the signal-to-noise ratio, a pre-amplifier was positioned as close to the microphone as possible, and this results in very strong signals from direct contacts with the finger.

      Initial tests proved that the bare plastic material does not have enough friction or deformation when in contact with surfaces to adequately record tribological data. Latex skin was therefore added to the outside of the finger to improve the quality of recordings.
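One deliberately simple way such recordings could be turned into texture classes: reduce each vibration trace to two cheap features (RMS energy and zero-crossing rate) and label new traces by nearest class centroid. The feature set, classifier and synthetic “smooth”/“rough” traces below are all assumptions for illustration, not the project’s actual pipeline.

```python
import math

def features(signal):
    """Two crude texture features from a vibration trace:
    RMS amplitude and zero-crossing rate."""
    rms = math.sqrt(sum(s * s for s in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (rms, zcr)

def nearest_centroid(trace, centroids):
    """Label a trace by the closest class centroid in feature space."""
    f = features(trace)
    return min(centroids, key=lambda label: math.dist(f, centroids[label]))

# Synthetic stand-ins: a "smooth" texture (low frequency, low amplitude)
# versus a "rough" one (high frequency, high amplitude).
smooth = [0.1 * math.sin(2 * math.pi * 5 * t / 200) for t in range(200)]
rough = [0.5 * math.sin(2 * math.pi * 40 * t / 200) for t in range(200)]
centroids = {"smooth": features(smooth), "rough": features(rough)}
```

An unseen trace is then classified by calling `nearest_centroid(trace, centroids)`; a richer spectral feature set would slot into `features` without changing the rest.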
    • Learning in Swarm Robotic Systems

    • The aim of this project is to design a swarm robotic system with desirable group behaviours. We achieve this by introducing learning ability to individuals within the group. We are interested in letting the individual robots learn to interact with each other or their environment using rules that minimise the conflict resulting from interference, or maximise the total income of the group for survival. In order to explore these questions, we have designed a group of robots that perform foraging-like tasks. We chose foraging due to motivations related to biological inspiration, mostly from ant colonies.
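One way to picture an interference-reducing individual rule of this kind: each robot adapts a resting probability from local events, resting more after a collision (interference) and less after a successful retrieval. The update rule and rates below are purely illustrative assumptions, not the project’s algorithm.

```python
class Forager:
    """Hypothetical adaptive forager: rest_prob rises on interference
    and falls on success, so crowded robots back off."""

    def __init__(self, rest_prob=0.5, rate=0.1):
        self.rest_prob = rest_prob
        self.rate = rate

    def update(self, collided: bool, retrieved: bool) -> None:
        if collided:
            self.rest_prob = min(1.0, self.rest_prob + self.rate)
        if retrieved:
            self.rest_prob = max(0.0, self.rest_prob - self.rate)
```

Run over many robots, a rule like this lets the group self-regulate how many foragers are active at once without any central coordination.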
    • Albatross UAV: Dynamic soaring

    • Albatrosses use dynamic soaring to cross thousands of miles of open ocean. This project aims to use dynamic soaring to overcome the endurance limitations of small Unmanned Aerial Vehicles (UAVs). Previous research identified open-loop control laws, but for efficient use with real aircraft a closed-loop control law is required. We intend to use intelligent control to identify such a closed-loop control law using reinforcement learning. The second aspect of this project is skill transfer: initially, for obvious safety and cost reasons, controller training will be done in simulation; afterwards, the learned control laws will be used with a real glider.
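The reinforcement-learning component can be pictured with a toy tabular Q-learning loop. The real project learns closed-loop soaring control in a flight simulator; the one-dimensional “climb into the rewarding altitude band” task below is an invented stand-in that only shows the shape of such a learner.

```python
import random

N_STATES = 6          # discrete altitude levels (toy model)
ACTIONS = (-1, +1)    # descend / climb
GOAL = 4              # band where the assumed wind gradient pays off

def step(state, action):
    """Deterministic toy dynamics: move one level, clipped to the range."""
    nxt = min(N_STATES - 1, max(0, state + action))
    return nxt, (1.0 if nxt == GOAL else 0.0)

def q_learn(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=1):
    """Standard tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s = rng.randrange(N_STATES)      # random start so the reward is found
        for _ in range(20):
            if rng.random() < eps:
                a = rng.randrange(2)     # explore
            else:
                a = max((0, 1), key=lambda i: q[s][i])  # exploit
            s2, r = step(s, ACTIONS[a])
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

After training, the greedy policy climbs from any altitude below the band, which is the closed-loop behaviour the open-loop laws cannot provide.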
    • Flying Flock: Collective Locomotion

    • The principal aim of this research is to develop a set of collective minimalist movement algorithms for use on a group of flying autonomous robots. A group of physical robots has been designed, constructed and used to demonstrate how swarming and homing in three dimensions can be achieved using only simple rules. The robots employ helium balloons (blimps) and therefore have a limited payload for the propulsion, communication and localisation systems. The inspiration comes from social insects, which employ only local sensing and communication and do not directly communicate with all group members.
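The “simple rules” idea can be illustrated with a minimal cohesion/repulsion update in three dimensions, where each agent senses only the other agents’ positions. The thresholds and gains are invented for illustration and are not the algorithms flown on the blimps.

```python
# Minimal 3-D swarming sketch: each agent moves toward the centroid of
# the others unless it is already too close, in which case it backs away.
# All parameters are illustrative assumptions.

def centroid(positions):
    n = len(positions)
    return tuple(sum(p[i] for p in positions) / n for i in range(3))

def swarm_step(positions, too_close=1.0, gain=0.1):
    new = []
    for j, p in enumerate(positions):
        others = [q for k, q in enumerate(positions) if k != j]
        c = centroid(others)
        dist2 = sum((p[i] - c[i]) ** 2 for i in range(3))
        # cohere when far from the group, repel when crowded
        sign = -1.0 if dist2 < too_close ** 2 else 1.0
        new.append(tuple(p[i] + sign * gain * (c[i] - p[i]) for i in range(3)))
    return new
```

Iterating `swarm_step` draws a scattered group together until the repulsion threshold holds them apart, with no agent ever using more than local position information.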
    • Whiskerbot: Tactile Sensors

    • In 2005 we began a collaborative project with the Adaptive Behaviour Research Group at the University of Sheffield. The study, funded by EPSRC and referred to as Whiskerbot, investigates a biomimetic artificial whisker system which could provide a novel form of robot tactile sensor capable of texture discrimination and object recognition. The project involves mounting an array of actively controlled artificial whiskers on a mobile robot that will provide input to biologically accurate computational models of sensory pathways in the rat brain.