  • Offer Profile
  • The General Robotics, Automation, Sensing and Perception (GRASP) Laboratory integrates computer science, electrical engineering and mechanical engineering in a vibrant, collaborative environment that fosters interactions between students, research staff and faculty. GRASP has grown into a $10 million research center with impressive technological innovations. Pioneering GRASP researchers are building autonomous vehicles and robots, developing self-configuring humanoids, and making robot swarms a reality. Our doctoral students are trained in theory and practice and mentored to become leaders in research and education.
Product Portfolio
  • Mobile Robotics Research

    • RHex Hexapedal Robot

    • RHex is a biologically inspired hexapedal robot invented and first characterized at the dawn of the century as part of a large DARPA-funded consortium. A variety of RHex platforms have been developed since that time, and our lab has been particularly active in developing new versions for studying biologically inspired locomotion, gait control, and sensor-based navigation, as well as for developing novel courses and other educational materials.
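      As an illustration of the alternating-tripod gait idea that RHex-style hexapods use, the sketch below drives two tripods of legs half a cycle out of phase, with a slow stance sweep and a fast recirculation. It is a minimal, illustrative model; the leg grouping, arc widths, and timing parameters are assumptions, not the lab's actual controller.

```python
import math

# Minimal sketch of a clock-driven alternating tripod gait of the kind used on
# RHex-style hexapods: two tripods (legs 0, 2, 4 and legs 1, 3, 5) run half a
# cycle out of phase, and each leg sweeps slowly through a ground-contact arc
# and quickly through recirculation. Parameter names and values are illustrative.

STANCE_FRACTION = 0.4           # fraction of the cycle spent in the slow stance sweep
STANCE_ARC = math.radians(60)   # angular width of the stance sweep

def leg_angle(t, period, tripod_offset):
    """Desired hip angle (radians) for one leg at time t."""
    phase = ((t / period) + tripod_offset) % 1.0
    if phase < STANCE_FRACTION:
        # slow sweep through the stance arc
        return -STANCE_ARC / 2 + (phase / STANCE_FRACTION) * STANCE_ARC
    # fast recirculation over the remaining angle (2*pi - STANCE_ARC)
    swing_phase = (phase - STANCE_FRACTION) / (1.0 - STANCE_FRACTION)
    return STANCE_ARC / 2 + swing_phase * (2 * math.pi - STANCE_ARC)

def tripod_targets(t, period=0.5):
    """Hip targets for all six legs; odd legs lag the even tripod by half a cycle."""
    return [leg_angle(t, period, 0.0 if i % 2 == 0 else 0.5) for i in range(6)]

if __name__ == "__main__":
    for t in (0.0, 0.125, 0.25):
        print([round(a, 2) for a in tripod_targets(t)])
```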
    • MAST - Micro Autonomous System Technologies

    • Micro Autonomous Systems Technologies (MAST) is a collaboration with the University of Maryland, the University of Michigan, BAE Systems and the Army Research Laboratory. Our vision is to develop Autonomous Multifunctional Mobile Microsystems (Am3), a networked group of small vehicles and sensors operating in dynamic, resource-constrained, adversarial environments. While individual units may be specialized, Am3 will be multifunctional because of its heterogeneity, the ability of individual units to automatically reconfigure and adapt to the environment and to human commands, and its distributed intelligence. Am3 will need to operate with little or no direct human supervision, because groups like this will be very difficult, if not impossible, to manage or control efficiently by programming or by tele-operation. The deployment, monitoring, and tasking of such multifunctional groups will be challenging and will require new, yet-to-be-developed methods of communication, control, computation and sensing, specifically tailored to MAST applications.
    • Omnidirectional Vision

    • Omnidirectional vision systems can provide panoramic alertness in surveillance, improve navigational capabilities, and produce panoramic images for multimedia.
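      A common first step with a catadioptric (mirror-plus-camera) sensor is to unwarp the raw circular image into a cylindrical panorama by sampling along rays from the image center. The sketch below is a minimal, illustrative version of that resampling; the center, radii, and output size are assumed values, and a real system would use a calibrated mirror model.

```python
import numpy as np

# Minimal sketch of unwarping a catadioptric (mirror + camera) image into a
# cylindrical panorama by sampling along rays from the image center. The
# center, radii, and output size are illustrative; a real system would use a
# calibrated mirror/camera model.

def unwarp_panorama(omni, center, r_inner, r_outer, out_w=720, out_h=120):
    cy, cx = center
    thetas = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_inner, r_outer, out_h)
    # polar grid -> source pixel coordinates (nearest-neighbour sampling)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    src_y = np.clip((cy + rr * np.sin(tt)).astype(int), 0, omni.shape[0] - 1)
    src_x = np.clip((cx + rr * np.cos(tt)).astype(int), 0, omni.shape[1] - 1)
    return omni[src_y, src_x]

if __name__ == "__main__":
    fake_omni = np.random.rand(480, 480)   # stand-in for an omnidirectional frame
    pano = unwarp_panorama(fake_omni, center=(240, 240), r_inner=60, r_outer=230)
    print(pano.shape)                      # (120, 720)
```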
    • Tele-Immersion

    • Tele-Immersion will enable users at geographically distributed sites to collaborate in real time in a shared environment as if they were in the same physical room. This new paradigm for human-computer interaction is the ultimate synthesis of networking and media technologies.

       
    • Ben Franklin Racing Team

    • The Ben Franklin Racing Team's goal is to build fast, reliable, safe and autonomous vehicles that will revolutionize transportation systems in urban environments. We will leverage state-of-the-art advances in sensing, control theory, machine learning, automotive technology and artificial intelligence to build robotic cars. The team will participate in the 2007 DARPA Urban Challenge.
    • SWARMS - Scalable sWarms of Autonomous Robots and Mobile Sensors

    • The SWARMS project brings together experts in artificial intelligence, control theory, robotics, systems engineering and biology with the goal of understanding swarming behaviors in nature and applications of biologically-inspired models of swarm behaviors to large networked groups of autonomous vehicles. Our main goal is to develop a framework and methodology for the analysis of swarming behavior in biology and the synthesis of bio-inspired swarming behavior for engineered systems.
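      A minimal example of the kind of bio-inspired rule analyzed in this setting is nearest-neighbor averaging (consensus): each agent moves toward the mean position of the agents within its communication radius, and the group contracts toward agreement. The sketch below is illustrative only; the agent count, radius, and gain are assumptions rather than project parameters.

```python
import numpy as np

# Minimal sketch of a nearest-neighbour consensus rule of the kind studied in
# swarm analysis: each agent repeatedly averages its position with that of
# agents within a communication radius, so the group converges toward a
# common location. Agent count, radius, and gain are illustrative.

def consensus_step(positions, radius=1.5, gain=0.2):
    new = positions.copy()
    for i, p in enumerate(positions):
        # neighbours = all agents within the communication radius (including self)
        dists = np.linalg.norm(positions - p, axis=1)
        nbrs = positions[dists < radius]
        new[i] = p + gain * (nbrs.mean(axis=0) - p)   # move toward neighbourhood average
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 3, size=(10, 2))   # 10 agents in the plane
    for _ in range(200):
        pos = consensus_step(pos)
    print(np.round(pos.std(axis=0), 4))     # spread shrinks as the swarm agrees
```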
    • Unmanned Aerial Vehicles (UAV)

    • The main motivation for the project is to develop cooperative behavior between unmanned aerial vehicles and ground vehicles at the GRASP Lab. Another motivation is to develop control methodologies that allow the aircraft to form part of a heterogeneous robot team, including ground and other aerial vehicles, and to perform mission tasks at higher levels.
    • Human activity detection and recognition

    • We are developing computer algorithms to recognize humans at multiple levels of abstraction: from basic body and limb tracking, to human identification, to gesture recognition, to activity inference. The ultimate goal is to develop computational algorithms that understand human behavior in video.

      The rapid growth in the size of storage devices allows us to store hours, days or even months of video data. Manually watching and analyzing videos of such length is no longer feasible. In order to summarize or index videos (for search purposes), we need to develop algorithms that detect and classify events happening in the video without human supervision. To identify and describe various types of events, we seek informative features and ways of extracting or learning them from the video data.
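      As a toy illustration of unsupervised event detection, the sketch below flags frames whose difference from the previous frame exceeds a threshold and groups consecutive flagged frames into events. The threshold and the synthetic frames are stand-ins; real systems would use richer features than raw frame differences.

```python
import numpy as np

# Minimal sketch of unsupervised event detection in a video stream: flag frames
# whose difference from the previous frame exceeds a threshold, and group
# consecutive flagged frames into events. The threshold and the synthetic
# "video" below are illustrative stand-ins for real footage and features.

def detect_events(frames, threshold=10.0):
    events, current = [], None
    prev = frames[0].astype(float)
    for t, frame in enumerate(frames[1:], start=1):
        f = frame.astype(float)
        activity = np.abs(f - prev).mean()   # mean absolute frame difference
        prev = f
        if activity > threshold:
            current = [t, t] if current is None else [current[0], t]
        elif current is not None:
            events.append(tuple(current))
            current = None
    if current is not None:
        events.append(tuple(current))
    return events                             # list of (start_frame, end_frame)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    video = [rng.integers(0, 5, (48, 64)) for _ in range(100)]
    for t in range(40, 50):                   # inject a burst of motion
        video[t] = rng.integers(0, 255, (48, 64))
    print(detect_events(video))
```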
    • ACCLIMATE

    • This multi-university project involves the University of Pennsylvania, the University of California at Berkeley, and Carnegie Mellon University. It focuses on the design and evaluation of adaptive hierarchical control for mixed autonomous and human-operated semi-autonomous teams that deliver high levels of mission reliability despite uncertainty arising from rapidly evolving environments and malicious interference from an intelligent adversary. Equipment for this project is supported by an ARO DURIP grant.

       
      • Haptography (Haptic Photography)

      • Haptography, like photography in the visual domain, enables an individual to quickly record the haptic feel of a real object and reproduce it later for others to interact with in a variety of contexts. Establishing the approach of haptography would have several positive ramifications: letting doctors and dentists create haptic records of medical afflictions, such as a decayed tooth surface, to assist in diagnosis and patient health tracking; improving the realism and consequent training efficacy of haptic surgical simulators and other computer-based education tools; allowing a wide range of people, such as museum goers and online shoppers, to touch realistic virtual copies of valuable items; facilitating a haptographic approach to low-bandwidth and time-delayed teleoperation, as found in space exploration; and enabling new insights into human and robot touch capabilities.
        The primary hypothesis of this research is that the feel of tool-mediated contact with real and virtual objects is directly governed by the high-frequency accelerations that occur during the interaction, as opposed to the low-frequency impedance of the contact. Building on our knowledge of the human haptic sensory system, our approach will use measurement-based mathematical modeling to derive perceptually relevant haptic surface models and dynamically robust haptic display paradigms, which will be tested via both experimental validation and human-subject studies.
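        Consistent with that hypothesis, a natural first processing step is to split a measured tool acceleration into a low-frequency component and the high-frequency vibrations to be modeled and replayed. The sketch below applies a simple first-order high-pass filter to a synthetic signal; the sample rate, cutoff, and signal are illustrative assumptions rather than the project's actual pipeline.

```python
import numpy as np

# Minimal sketch of the signal split suggested by the haptography hypothesis:
# separate a measured tool acceleration into a slow component (hand motion,
# contact impedance) and a high-frequency component (contact vibrations).
# The cutoff frequency and the synthetic signal are illustrative.

def high_pass(signal, fs, cutoff):
    """First-order high-pass filter: y[n] = a*(y[n-1] + x[n] - x[n-1])."""
    rc = 1.0 / (2 * np.pi * cutoff)
    a = rc / (rc + 1.0 / fs)
    out = np.zeros_like(signal)
    for n in range(1, len(signal)):
        out[n] = a * (out[n - 1] + signal[n] - signal[n - 1])
    return out

if __name__ == "__main__":
    fs = 5000.0                                   # accelerometer sample rate (Hz)
    t = np.arange(0, 0.2, 1 / fs)
    slow = 0.5 * np.sin(2 * np.pi * 2 * t)        # slow hand motion
    texture = 0.05 * np.sin(2 * np.pi * 400 * t)  # texture-induced vibration
    accel = slow + texture
    vib = high_pass(accel, fs, cutoff=50.0)       # keep only the high-frequency part
    print(round(float(np.std(vib[500:])), 4))     # close to the std of the texture term
```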
         
      • Spatially Distributed Tactile Feedback for Stroke Rehabilitation

      • More than 780,000 Americans suffer a stroke each year, and approximately 80% of these individuals survive and require rehabilitation to regain motor functionality, though the optimal treatment method is not yet known. This project aims to create a new low-cost rehabilitation system that measures the user's arm movements in real time and uses a combination of graphical and tactile feedback to guide him or her through a set of motions chosen by the therapist. The user views the posture or motion to master on a screen and attempts to move his or her body to match. The movements of all the body segments are tracked through a motion capture system, displayed on the screen, and compared with the target body configuration in real time. When the user deviates more than a small amount from this target, tactors on the associated limb segment provide feedback, helping the user translate or rotate that part of his or her body toward the correct configuration.
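        The core feedback rule can be sketched as a per-joint comparison against the therapist's target configuration, with a tactor driven whenever a joint leaves its tolerance band. The joint list, tolerance, and signed-error convention below are illustrative assumptions, not the project's actual interface.

```python
import numpy as np

# Minimal sketch of the feedback rule described above: compare the measured
# joint configuration with the therapist-specified target and, when any joint
# deviates beyond a tolerance, activate the tactor on the corresponding limb
# segment, with the sign telling the user which way to move. Joint names,
# tolerances, and the output format are illustrative assumptions.

JOINTS = ["shoulder", "elbow", "wrist"]
TOLERANCE = np.radians(8)                   # allowed deviation per joint

def tactor_commands(measured, target):
    """Return {joint: signed error in radians} for joints outside the tolerance band."""
    commands = {}
    for name, q, q_ref in zip(JOINTS, measured, target):
        error = q - q_ref
        if abs(error) > TOLERANCE:
            commands[name] = error          # positive: rotate back toward the target
    return commands

if __name__ == "__main__":
    target = np.radians([45.0, 90.0, 0.0])
    measured = np.radians([47.0, 110.0, -15.0])
    out = tactor_commands(measured, target)
    print({k: round(float(np.degrees(v)), 1) for k, v in out.items()})
    # -> {'elbow': 20.0, 'wrist': -15.0}
```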
      • Motion Planning for Mobile Manipulation Platforms

      • This project is concerned with developing high-dimensional motion planners that can control mobile manipulation robotic systems. The challenge is to develop planners that operate in real time while still providing theoretical guarantees on performance, such as completeness. Example problems include fully autonomous door opening and mobile manipulation of objects in cluttered spaces. This project is in collaboration with the company Willow Garage.
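        As a toy stand-in for the planning problem, the sketch below runs A* on a small 2D grid, which is complete on that finite space: it returns a path if one exists and reports failure otherwise. The lab's planners work in far higher-dimensional configuration spaces, so this is only an illustration of the search structure, not the project's planner.

```python
import heapq

# Toy sketch of a search-based planner with a completeness guarantee:
# A* on a small 2D grid ('#' cells are obstacles). The grid, costs, and
# heuristic are illustrative.

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])   # admissible Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#":
                heapq.heappush(frontier, (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None   # exhausted the finite space: no path exists

if __name__ == "__main__":
    world = ["....#....",
             "....#....",
             ".........",
             "....#...."]
    print(astar(world, (0, 0), (0, 8)))
```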
      • The Penn Smart Chair

      • This project is an effort at the GRASP Laboratory to develop a new technology in the form of a smart wheelchair. The device is equipped with a virtual interface and on-board cameras that enable the user to navigate on the ground by interacting with the virtual system interface or by using one of the built-in control algorithms.
      • MARS - Multiple Autonomous Robots

      • The goal of the research is to develop a framework and the support tools for the deployment of multiple autonomous robots in an unstructured and unknown environment with applications to reconnaissance, surveillance, target acquisition, and the removal of explosive ordnance. The current state-of-the-art in control software allows for supervised autonomy, a paradigm in which a human user can command and control one robot using teleoperation and close supervisory control. The objective here is to develop the software framework and tools for a new generation of autonomous robots.
      • RiSE Climbing Robot

      • The goal of the RiSE project is to create a bioinspired climbing robot with the unique ability to walk on land and climb on vertical surfaces. Active research studies novel robot kinematics, precision-manufactured compliant feet and appendages, and advanced robot behaviors. This project is funded by the DARPA Biodynotics Program and is in collaboration with Boston Dynamics, Stanford University, Carnegie Mellon University, UC Berkeley and Lewis and Clark University.
      • Image Segmentation and Object Recognition

      • This research is motivated by two sets of questions: 1) how to extract “interesting” patterns from data, and 2) how to guide the grouping process to achieve specific vision tasks, such as recognizing familiar object shapes. In this direction, we have been pursuing a line of research building upon spectral graph theory.
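        The sketch below shows the spectral-graph idea in its simplest form: build a pairwise affinity matrix over the data, form the normalized graph Laplacian, and split the data by the sign of the second eigenvector (the Fiedler vector). The toy one-dimensional intensities and affinity scale are illustrative; image segmentation applies the same machinery to pixel affinity graphs.

```python
import numpy as np

# Minimal sketch of spectral grouping: Gaussian affinities, normalized graph
# Laplacian, and a bipartition from the sign of the Fiedler vector. The toy
# 1D "pixels" and the affinity scale sigma are illustrative.

def spectral_bipartition(features, sigma=1.0):
    x = np.asarray(features, dtype=float)
    w = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * sigma ** 2))   # affinity matrix
    d = w.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    lap = np.eye(len(x)) - d_inv_sqrt @ w @ d_inv_sqrt                 # normalized Laplacian
    _, vecs = np.linalg.eigh(lap)
    fiedler = vecs[:, 1]                                               # second-smallest eigenvector
    return fiedler > 0                                                 # boolean group labels

if __name__ == "__main__":
    intensities = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95]   # two clear intensity groups
    print(spectral_bipartition(intensities, sigma=0.2))
    # e.g. [ True  True  True False False False ] (or its complement)
```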
      • DaVinci

      • The DaVinci project brings together mathematicians and engineers from the Universities of Iowa, Maryland, and Pennsylvania and from Rensselaer Polytechnic Institute to address the urgent need for a thorough understanding of the mathematics of engineering systems that can be modeled by Differential Algebraic Inequalities and Differential Complementarity Problems. The project will open a new chapter in applied mathematics in which classical differential equation theory is merged with contemporary mathematical programming methods. The deliverables of our research are a set of broadly applicable mathematical theories, algorithms, and computational tools that will have a direct impact on an array of engineering and scientific disciplines.
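        For reference, a differential complementarity problem couples an ordinary differential equation to a complementarity condition that must hold at every time instant. The formulation below is the standard textbook form with illustrative notation, not necessarily the project's specific setting.

```latex
% Differential complementarity problem (standard form, illustrative notation):
% find trajectories x(t) and u(t) on [0, T] such that
\begin{align*}
  \dot{x}(t) &= f\bigl(x(t), u(t)\bigr), \qquad x(0) = x_0, \\
  0 \le u(t) \;&\perp\; g\bigl(x(t), u(t)\bigr) \ge 0,
\end{align*}
% where $u \perp g$ means $u(t)^{\top} g\bigl(x(t), u(t)\bigr) = 0$, i.e. at each
% instant every component of either $u$ or $g$ vanishes.
```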
      • HURT: Heterogeneous Unmanned RSTA Teams (UAV)

      • HURT is a multi-vehicle controller that coordinates and collaboratively plans urban RSTA missions for autonomous vehicles. It implements augmented autonomy for teams of arbitrary vehicle platforms.
      • Autonomous Aerial Vehicles

      • The Autonomous Aerial Vehicles research project is mainly focused on autonomous navigation of unmanned air vehicles. The challenge is to design systems that exhibit goal-driven behavior while sensing and reacting to a changing environment. This project is a collaboration between students and faculty from the University of Pennsylvania and industry experts from Dragonfly Pictures.