Reposted from the OSRF Blog.
We are happy to announce the final results of the Agile Robotics for Industrial Automation Competition (ARIAC).
ARIAC is a simulation-based competition designed to promote agility in industrial robot systems by utilizing the latest advances in artificial intelligence and robot planning. The goal is to enable industrial robots on the shop floors to be more productive, more autonomous, and more responsive to the needs of shop floor workers. The virtual nature of the competition enabled participation of teams affiliated with companies and research institutions from across three continents.
While autonomously completing pick-and-place kit assembly tasks, teams were presented with various agility challenges developed based on input from industry representatives. These challenges included failing suction grippers, notification of faulty parts, and the reception of high-priority orders that prompted teams to decide whether or not to reuse existing in-progress kits.
Teams had control over their system's suite of sensors positioned throughout the workcell, made up of laser scanners, intelligent vision sensors, quality control sensors, and interruptible photoelectric break-beams. Each team participating in the finals chose a unique sensor configuration, with varying associated costs and impacts on the team's strategy.
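Each team's sensor selection was specified up front in a configuration file. As a rough illustration of what such a configuration might look like, here is a hypothetical YAML sketch; the sensor types echo those named above, but the field names and schema are assumptions for illustration, not the exact ARIAC format:

```yaml
# Hypothetical sensor configuration sketch -- field names are illustrative.
sensors:
  break_beam_1:
    type: break_beam
    pose:
      xyz: [1.6, 2.0, 0.9]
      rpy: [0, 0, 3.14]
  logical_camera_1:
    type: logical_camera
    pose:
      xyz: [1.2, 1.5, 1.4]
      rpy: [0, 1.57, 0]
```

Because each sensor carried a cost that counted against the team's score, choosing fewer, cheaper sensors traded off against the richer workcell coverage that more expensive sensors provide.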
The diversity in the teams’ strategies and the impact of their sensor configurations can be seen in the video of highlights from the finals:
Scoring was performed based on a combination of performance, efficiency, and cost metrics over 15 trials. The overall standings of the top teams are as follows.
First place: Realization of Robotics Systems, Center for Advanced Manufacturing, University of Southern California
Second place: FIGMENT, Pernambuco Federal Institute of Education, Science, and Technology / Federal University of Pernambuco
Third place: TeamCase, Case Western Reserve University
Top-performing teams will be presenting at IROS 2017 in Vancouver, Canada in a workshop held on Sunday, September 24th. Details for interested parties are available at https://www.nist.gov/el/intelligent-systems-division-73500/agile-robotics-industrial-automation-competition-ariac
The IROS workshop is open to all, even those who did not compete. In addition to presentations about the approaches used in the competition, we will also explore plans for future competitions. If you would like to give a presentation about agility challenges you would like to see in future competitions, please contact Craig Schlenoff (email@example.com).
Congratulations to all teams that participated in the competition. We look forward to seeing you in Vancouver!
Reposted from the OSRF Blog.
We are excited to show off a simulation of a Prius in Mcity using ROS Kinetic and Gazebo 8. ROS enabled the simulation to be developed faster by using existing software and libraries. The vehicle's throttle, brake, steering, and transmission are controlled by publishing to a ROS topic. All sensor data is published using ROS, and can be visualized with RViz.
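The control interface boils down to publishing a single command message. As an illustrative, library-free sketch of the kind of message involved (the real demo uses a ROS message type defined in the osrf/car_demo repository; the field names and value ranges below are assumptions for illustration, not the actual message definition):

```python
from dataclasses import dataclass

# Illustrative stand-in for the vehicle control message published to a
# ROS topic. Field names and ranges are assumptions for illustration.
@dataclass
class PriusControl:
    throttle: float = 0.0  # 0 (off) .. 1 (full throttle)
    brake: float = 0.0     # 0 (off) .. 1 (full brake)
    steer: float = 0.0     # -1 (full right) .. 1 (full left)

def clamp(value, lo, hi):
    """Keep a command component within its valid range."""
    return max(lo, min(hi, value))

def make_command(throttle, brake, steer):
    """Build a well-formed control message from raw inputs."""
    return PriusControl(
        throttle=clamp(throttle, 0.0, 1.0),
        brake=clamp(brake, 0.0, 1.0),
        steer=clamp(steer, -1.0, 1.0),
    )

cmd = make_command(0.5, 0.0, 1.4)
print(cmd.steer)  # clamped to 1.0
```

In the actual demo, a message like this would be filled in and handed to a ROS publisher each control cycle, while the sensor topics stream back for visualization in RViz.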
We leveraged Gazebo's capabilities to incorporate existing models and sensors. The world contains a new model of Mcity and a freeway interchange. There are also models from the Gazebo model repository, including dumpsters, traffic cones, and a gas station. On the vehicle itself there are a 16-beam lidar on the roof, 8 ultrasonic sensors, 4 cameras, and 2 planar lidars.
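Sensors like these are attached to the vehicle through SDF. As a sketch of what one of the planar lidars might look like (the element names follow the SDF ray-sensor schema, but the specific values here are illustrative, not the ones used in car_demo):

```xml
<!-- Illustrative SDF sketch of a planar lidar; the values are
     examples, not the exact parameters from car_demo. -->
<sensor name="front_laser" type="ray">
  <update_rate>30</update_rate>
  <ray>
    <scan>
      <horizontal>
        <samples>640</samples>
        <min_angle>-1.57</min_angle>
        <max_angle>1.57</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.1</min>
      <max>30.0</max>
    </range>
  </ray>
</sensor>
```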
The simulation is open source and available on GitHub at osrf/car_demo. Try it out by installing nvidia-docker and pulling "osrf/car_demo" from Docker Hub. More information about building and running it is available in the README in the source repository.
Gazebo 6.x and 5.x have reached the end of their lives. We will continue to answer questions about these versions, but we will stop fixing bugs.
We are proud to announce the release of Gazebo 8. This version of Gazebo has short term support with an end-of-life on January 15, 2019.
A major API change comes with Gazebo 8, centered on the transition from Gazebo's internal math library to Ignition Math. Please refer to the changelog and migration guide to help with your transition.
The ability to dynamically and programmatically add visual elements to Gazebo has been added through a visual marker interface. Visual markers can consist of simple shapes, lines, triangles, and text. Additional features associated with visual markers can be found through the gz marker -h command line tool. A C++ example demonstrates how to manipulate visual markers from a stand-alone application.
We continually strive to improve Gazebo's user experience and to offer features that benefit a wide audience. To this end, a feature-rich plotting utility has been integrated into Gazebo. This utility supports plotting data from topics, models, and simulation parameters. Multiple plots can be created, and data can be exported to CSV or PDF files. Try inserting a model, such as the Double Pendulum, and pressing Ctrl-P.
Following the same rationale as the plotting utility, we are pleased to announce the integration of video recording in Gazebo 8. Simply select the camera icon on the right-hand side of the toolbar to start recording to an MP4, AVI, or OGV file. Select the icon again to stop recording and save the video file.
Enjoy the new release, and thanks for all the contributions,
OSRF Development Team
The Space Robotics Challenge (SRC), a NASA Centennial Challenge, has recently kicked off. The SRC "tasks teams with developing and displaying the ability of an R5 robot to assist in the procedures of a NASA mission, such as one to Mars, offering a $1 million prize pool for successful teams."
The SRC uses Gazebo and a set of plugins to simulate R5 and the challenge environments. Qualifications are underway, where competitors solve two tasks on their personal computers. Finals will take place next year, and will utilize CloudSim with Gazebo.
Gazebo can generate a lot of data, especially when simulating complex robots. This data can be used to tune model parameters, debug unexpected behavior, write unit tests, and perform system introspection and identification. However, raw data is difficult for a human to consume.
Gazebo 8 will ship with a plotting utility that can display data produced by Gazebo in real time. The plotting utility can process data from simulation models as well as data available on topics. Below is an example image of the plotting utility.
A generic Mars rover is available in the model database.
Slowly but surely, we are switching Gazebo from its built-in transport library to the new Ignition Transport library, which is based on ZeroMQ and Protobuf. Ignition Transport is independent of Gazebo and designed for use in both robotic and non-robotic applications.
Recently, we have added a number of new features to Ignition Transport.
Ignition Transport has been available in Ubuntu since Trusty, with version 1.3 scheduled for Yakkety. Gazebo 8, to be released in January 2017, will depend on Ignition Transport 1 or greater.
OSRF has internally used Ignition Transport in a number of projects, including HAPTIX and Mentor2. We believe it is ready for prime-time use, and we will be making more use of it in Gazebo and other projects.
Inertia plays an important role in simulation. An object's inertia defines how it will move and react to forces, including gravity. Incorrect inertia values can lead to strange behavior.
Gazebo has a visualization tool that helps debug inertia values. Within the Gazebo GUI, right click on a model and select View→Inertia. You should see a purple box with green axes for each link. The center of each box is aligned with the center of mass of its link. The sizes and orientations of the boxes correspond to unit-mass boxes with the same inertial behavior as their corresponding links.
A rule of thumb is to make sure the purple boxes roughly match each link in size. There are exceptions to this rule, such as an object that doesn't have uniform density.
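The unit-mass box shown by the visualization can be recovered directly from a link's principal moments of inertia using the standard box inertia formulas (Ixx = m(ly² + lz²)/12, and so on). A minimal sketch of that computation, independent of Gazebo:

```python
import math

def equivalent_box(ixx, iyy, izz, mass=1.0):
    """Dimensions of a box of the given mass whose principal
    moments of inertia match (ixx, iyy, izz).

    Inverts the box inertia formulas, e.g. Ixx = m * (ly^2 + lz^2) / 12.
    """
    lx = math.sqrt(6.0 * (iyy + izz - ixx) / mass)
    ly = math.sqrt(6.0 * (ixx + izz - iyy) / mass)
    lz = math.sqrt(6.0 * (ixx + iyy - izz) / mass)
    return lx, ly, lz

# A 1 x 2 x 3 unit-mass box has Ixx = 13/12, Iyy = 10/12, Izz = 5/12;
# recovering its dimensions from those moments:
print(equivalent_box(13/12, 10/12, 5/12))  # roughly (1.0, 2.0, 3.0)
```

If the box this produces is wildly larger or smaller than the link's visual geometry, the inertia values in the model are likely wrong, which is exactly what the purple-box visualization makes easy to spot.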
A fire station is an active pull request to the model database.