See the Getting help page.
Nobody actually asks this question, but many should. The guidelines for messages on the mailing list can be found on the Getting help page.
It is very important that you read that page before posting to the mailing list, at least if you want to be helped efficiently.
TODO: insert compatibility table here
We don't maintain binary packages for Unix-like systems, but some users and Linux distributions do. We do provide a binary installer for Windows. Look on the Get Player page for more details.
The story of the Player Project can be read in the Project History page.
Some online sources in rough order of usefulness:
- The 'official' documentation page at Sourceforge
- The Wiki for Player project
- Mailing list archives. Many questions have been answered here.
- Old FAQ page, mostly moved here
- Please submit links for this FAQ
See the Local installation tutorial.
See Shakespeare's 'Seven Ages of Man' speech.
Many P/S/G programs (including playerv, playernav, stage, and gazebo) can dump screenshots, and you might want to assemble these screenshots into a movie, for example to include in a presentation. Unfortunately, there is no good (universal) method for animating frames into a movie that will play on all platforms. Some known methods, all of which have pros and cons:
- On Linux, use mencoder (comes with mplayer). Works great, but the movies it makes generally don't run on Windows machines (some kind of DIVX problem). Sometimes Windows Media Player will play these movies, but PowerPoint won't let you embed them in a slide (maddening, isn't it?). Encoding with MPEG1 does work, but it looks terrible.
- On Windows, there is a nice freeware binary called BMP2AVI (google it) that does the trick. Simple, but pretty effective.
- On Windows/OS X, you can pay $30 for the full version of QuickTime, and use that to make your movies. You can generally tweak it so that the movies play on all platforms (QuickTime on Windows and MPlayer on Linux).
- xvidcap: Captures snapshots or movies of areas of the screen.
- wink: Input formats: Capture screenshots from your PC, or use images in BMP/JPG/PNG/TIFF/GIF formats. Output formats: Macromedia Flash, Standalone EXE, PDF, PostScript, HTML or any of the above image formats. Use Flash/html for the web, EXE for distributing to PC users and PDF for printable manuals.
Player is a device server that provides a powerful, flexible interface to a variety of sensors and actuators (e.g., robots). Because Player uses a TCP socket-based client/server model, robot control programs can be written in any programming language and can execute on any computer with network connectivity to the robot. In addition, Player supports multiple concurrent client connections to devices, creating new possibilities for distributed and collaborative sensing and control.
Previous work in the area of robot programming interfaces has focused primarily on providing a development environment that suits a particular control philosophy. While such tools are very useful, we believe that implementing them at such a low level imposes unnecessary restrictions on the programmer, who should have the choice to build any kind of control system while still benefiting from device abstraction and encapsulation.
Thus in Player we make a clear distinction between the programming interface and the control structure, opting for a maximally general programming interface, with the belief that users will develop their own tools for building control systems. Further, most robot interfaces confine the programmer to a single language, providing a (generally closed-source) language-specific library to which the user must link their programs. In contrast, the TCP socket abstraction of Player allows for the use of virtually any programming language. In this way, it is much more "minimal" than other robot interfaces.
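To illustrate the client/server model, here is a minimal sketch of a client written against the libplayerc++ C++ library (the host name, port, and device index are illustrative defaults); an equivalent client could be written in any language that can open a TCP socket.

    // Minimal sketch of a Player client using libplayerc++.
    // "localhost", port 6665 and device index 0 are illustrative defaults.
    #include <libplayerc++/playerc++.h>

    int main()
    {
      using namespace PlayerCc;

      PlayerClient    robot("localhost", 6665);  // TCP connection to the Player server
      Position2dProxy pp(&robot, 0);             // subscribe to position2d:0

      for (;;)
      {
        robot.Read();            // block until a new round of data arrives
        pp.SetSpeed(0.2, 0.0);   // forward speed [m/s], turn rate [rad/s]
      }
      return 0;
    }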
There is a list of supported devices.
The Get Player page has the links and the instructions.
Of course you can. Tutorials are available for users of OpenEmbedded, and for Cross-compiling Player from the command line.
What is the difference between Player and Stage and Gazebo? What is the difference between Player device drivers and simulated device models in Stage or Gazebo?
See the explanation on the How Player works page.
That's usually because either Player isn't running or because you're trying the wrong port. To check whether Player is running and to verify on which port(s) it is listening, use netstat. In Linux, the following should help (arguments will be different for other platforms):
- $ netstat --inet --tcp -lp
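If Player is running, the output should contain a LISTEN entry for its port (6665 by default). A rough example of what to look for (the PID and program name will of course differ on your system):

    Proto Recv-Q Send-Q Local Address    Foreign Address  State   PID/Program name
    tcp        0      0 0.0.0.0:6665     0.0.0.0:*        LISTEN  1234/player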
If you already have a working plugin driver and wish to integrate it, read Adding drivers to Player 3. If you want to create your own plugin driver, read Writing a Player driver.
When I run Player (possibly under Stage), it exits with the message "unknown host; probably should quit." What's the deal?
(This seems to occur mostly on OS X) Add an entry to your /etc/hosts for your machine's name. For example, if your machine is called foobar:
127.0.0.1 localhost foobar
There's probably already a line for 127.0.0.1 (known as the "loopback address"); you can just append your hostname to the end of that line.
It's possible to get scans at 75Hz from the SICK LMS, if you use an RS422 connection, which is a high-speed serial line. If you use a run-of-the-mill RS232 connection, the best you can expect is about 10Hz, depending on angular aperture and resolution.
Look here for one way to use a USB-RS422 converter to get high-speed laser scans.
If you purchased the Quatech card, check out the tutorial High Speed Lidar.
For a detailed explanation of how the LMS works, timing considerations, and what data rates can be expected, look here.
More info and tips on using Player to get high-speed laser scans can be found here.
How do I connect my (Sony or Canon) PTZ camera to a standard serial port (instead of, e.g., the AUX port on my Pioneer)?
ActivMedia robots that are equipped with a PTZ camera often have the camera connected to the AUX port on the P2OS board in the robot. Player does not support control of the camera through this connection, for reasons explained here. Instead, Player requires a standard, direct, serial line to the camera.
Documentation about the Sony PTZ units is available here. In particular, page 15 of this manual has a wiring diagram.
Here are some detailed wiring instructions for a Pioneer with a Canon camera and VSBC8 computer, courtesy of Jason L. Bryant at the Navy Center for Applied Research in Artificial Intelligence:
Instructions for rewiring a Pioneer robot so that the Canon PTZ camera device can be connected to a serial port (ttyS1) on the on-board VSBC8 computer rather than to the robot's microcontroller.
Purchase a VISCA - DB9 conversion cable (item # 0002V448 on-line), as well as a length of 20-wire ribbon connection cable (our cable is about 18 inches long). You will also need a 20-pin header connector.
Attach the 20-pin header to one end of the ribbon, taking note of the location of pin 1 on both the ribbon and the header connector. At the other end of the cable, split the ribbon into two 10-pin sections. Cut about 1 inch off of the last pin from each section (pins 10 and 20) so that you now have two 9-pin cable ends. Now attach two male DB-9 serial connectors to the ends, being sure that pins 1 and 11 go into the proper slots of the connectors. The serial connection with pin 1 will eventually go to the serial port on the microcontroller, and the other connection will hook to the VISCA - DB9 conversion cable.
Remove the top plate and nose from your Pioneer robot. Next, locate and remove the 20-pin header with a 9-wire rainbow colored ribbon from the serial port on the on-board computer. This header connects to serial ports ttyS0 and ttyS1; however, in the default Pioneer configuration, port ttyS1 is unused. The other end of this ribbon connects to the serial port on the microcontroller (look in your Pioneer manual for the location of this port, or just follow the cable).
Now place the 20-pin header of the cable you just made into the now-free serial ports on the computer. Snake the wires under the robot's control panel and to the back section of the chassis. Connect the serial connection from ttyS0 (the serial connection with pins 1 - 9) to the now-free serial port on the microcontroller. Connect the other serial connection (pins 11 - 19) to the female DB-9 connector on the VISCA to DB-9 conversion cable and snake the rest of this cable up and outside the robot cover. Replace the nose and top cover of your robot. Once you connect the other end of the VISCA cable to the camera, you will have a working PTZ camera on port /dev/ttyS1.
You can test that the connections work by running /usr/local/Aria/bin/demo on the robot, selecting 'C' for camera control, then the appropriate key for your particular camera (Sony, or Canon) connected to a serial port ('@' for a Canon), and finally '2' for serial port /dev/ttyS1.
There are several options for accessing image data from a camera in Player:
- Write a (plugin) Player driver which reads the data directly from the camera (through the camera interface).
- Use the socket interface to return the image data to the client side.
- Use an external streaming system, like Quicktime RTSP, gstreamer, VideoLAN or OpenH323.
Raw image data can be read on the client side using an appropriate proxy (e.g., CameraProxy in the C++ client, or playerc_camera_t in the C client). Be aware that this option will severely increase network traffic.
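As a rough sketch of this option (the host, port, and camera index are placeholders, and the CameraProxy method names are as documented for Player 3; verify them against your client library version), a client can pull one frame like this:

    // Sketch: fetch a single frame through the camera interface.
    // Host, port and camera index are placeholders; Decompress() is only
    // needed when the driver delivers compressed (e.g. JPEG) frames.
    #include <libplayerc++/playerc++.h>
    #include <cstdio>
    #include <stdint.h>
    #include <vector>

    int main()
    {
      using namespace PlayerCc;

      PlayerClient robot("localhost", 6665);
      CameraProxy  cam(&robot, 0);

      robot.Read();                          // fetch one update from the server
      cam.Decompress();                      // expand compressed image data, if any

      std::vector<uint8_t> frame(cam.GetImageSize());
      cam.GetImage(&frame[0]);               // copy the raw pixels out of the proxy

      std::printf("got %ux%u image, %u bytes\n",
                  cam.GetWidth(), cam.GetHeight(), cam.GetImageSize());
      return 0;
    }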
Setting up an external streaming server allows you to access the "live" video feed using many other popular programs. Since the data is not travelling via Player, there is less impact on the performance of Player. Also, streaming servers typically compress the images before sending, reducing the network load somewhat. That said, there are no samples in Player/Stage to show you how to do this, as it is completely outside of the project.
Searching the mailing lists for "camera" will bring up most of the previous discussions of this matter.
Links:
- http://developer.apple.com/darwin/projects/streaming/
- http://gstreamer.freedesktop.org/
- http://www.videolan.org/
- http://www.openh323.org/
What is the purpose of the key in a provides or requires field (e.g., the "odometry" in "odometry::position:0")?
This is explained in the Writing Configuration Files tutorial.
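In short, the key lets a driver provide (or require) more than one device of the same interface, and lets other entries refer to a specific one. A sketch of a configuration file (the driver and key names follow the common p2os example; your setup may differ):

    driver
    (
      name "p2os"
      port "/dev/ttyS0"
      # Two position2d devices from one driver, told apart by their keys:
      provides ["odometry:::position2d:0" "compass:::position2d:1" "sonar:::sonar:0"]
    )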
It's a request to a position2d device (e.g., a mobile robot) to set its internal odometry to a particular (X,Y,theta) value. It doesn't move the robot, just transforms the coordinate system in which odometry will be reported.
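From the C++ client, for instance, the request can be issued through Position2dProxy (a sketch; the pose values and device index are arbitrary, and you should verify the exact method name against your client library version):

    // Sketch: reset the reported odometry to a chosen pose.
    #include <libplayerc++/playerc++.h>

    int main()
    {
      using namespace PlayerCc;

      PlayerClient    robot("localhost", 6665);
      Position2dProxy pp(&robot, 0);

      // Declare the current pose to be (x = 1.0 m, y = 2.0 m, yaw = 0 rad).
      // The robot does not move; only the odometry coordinate frame changes.
      pp.SetOdometry(1.0, 2.0, 0.0);

      robot.Read();
      return 0;
    }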
The 'opaque' interface is designed for this purpose. It allows you to exchange messages with arbitrary content. On the client side, there's an OpaqueProxy. Of course, there will not be XDR wrappers for your custom messages, so you have to do your own (de)marshaling on each side.
The opaque interface is usually used to prototype new interfaces and/or extensions to existing interfaces. If you would like to add a new interface, you have two options: create a plugin interface or add a new interface definition to Player. For custom applications, it's better to create a plugin interface. If the interface needs to be integrated into Player, you can follow the instructions at Writing a Player interface.
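A rough client-side sketch of sending a custom message over the opaque interface follows. The player_opaque_data_t field names and the OpaqueProxy method shown are assumptions based on the Player 3 headers; check your own version before relying on them.

    // Sketch: push an application-defined byte blob through the opaque interface.
    // The struct layout, field names and SendCmd() signature are assumptions;
    // the driver on the other end must unpack the same bytes by hand.
    #include <libplayerc++/playerc++.h>
    #include <cstring>
    #include <stdint.h>

    struct MyCustomMsg        // hypothetical wire format; you marshal it yourself
    {
      int32_t command;
      float   argument;
    };

    int main()
    {
      using namespace PlayerCc;

      PlayerClient robot("localhost", 6665);
      OpaqueProxy  op(&robot, 0);

      MyCustomMsg msg = { 42, 1.5f };

      player_opaque_data_t blob;
      blob.data_count = sizeof(msg);
      blob.data = new uint8_t[blob.data_count];
      std::memcpy(blob.data, &msg, blob.data_count);

      op.SendCmd(&blob);      // send the raw bytes; no XDR wrappers are generated
      delete [] blob.data;
      return 0;
    }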