*: An adaptor is needed for the termination of the CAN circuit. From the technical documentation of the Continental ARS 308-2C/-21:
"Since no termination resistors are included in the radar sensor ARS 308-2C and ARS 308-21, two 120 Ohm terminal resistors have to be connected to the network (separately or integrated in the CAN interface of the corresponding unit)."
2- Turn on the power supply. The device should start working, and its operation is audible.
2. A brief introduction to CAN bus communication using Linux
There are various ways to communicate with a CAN bus. For Linux, SocketCAN is one of the most widely used CAN drivers. It is open source, ships with the kernel, and works with many devices. If you prefer to use a different vendor's driver, please refer to that driver's manual for communicating via CAN bus.
First load the drivers. The device sends its messages at a specific bitrate; if it is not matched, the stream will not be synchronized. Therefore the CAN link must be configured (via ip link) with the bitrate of the device. The bitrate for this device is fixed at 500 kbit/s and cannot be changed. Below is an example snippet for setting up CAN device can1:
$ sudo modprobe can_dev
$ sudo modprobe can
$ sudo modprobe can_raw
$ sudo ip link set can1 type can bitrate 500000
$ sudo ifconfig can1 up
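To confirm that the interface is up and the bitrate took effect, the link details can be inspected (plain iproute2; nothing here is specific to this sensor):
$ ip -details link show can1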
Now the connection between the computer and the sensor is established.
To check the information sent by the device, a user-friendly tool package called can-utils can be used with SocketCAN to access the messages via the driver.
Get can-utils:
$ sudo apt-get install can-utils
Display the CAN messages from can1:
$ candump can1
A stream of CAN messages should be received at this point. An example of the message stream is shown below:
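The original capture is not reproduced here, but candump prints one frame per line as interface, hex ID, data length and payload bytes; with the radar still in its default mode the stream consists of the output IDs described below (payload bytes omitted here, as they vary each cycle):
$ candump can1
  can1  600   [8]  .. .. .. .. .. .. .. ..
  can1  701   [8]  .. .. .. .. .. .. .. ..
  can1  702   [8]  .. .. .. .. .. .. .. ..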
The CAN messages sent from the device have to be converted into meaningful information. CAN messages have identifiers (headers) that identify the content of each message. The headers and contents of the messages sent from the radar are described below:
0x300 and 0x301 are the input signals: the ego-vehicle speed and yaw rate can be sent to the device. If this information is provided, the radar returns the detected objects' positions and speeds relative to the ego-vehicle. If it is not sent, the radar assumes that it is stationary.
0x600, 0x701 and 0x702 are the output messages of the radar. The structure of these messages is given below:
3. Configuring the radar
To receive tracked-object information, the radar must first be configured. This device does not transmit raw data from the radar scans. Instead, its microcontroller reads the raw sensing data and detects/tracks objects with its own algorithm (this algorithm is not accessible). The sensor sends the detected/tracked object information over the CAN bus.
The default behavior of the device is to send detected objects (not tracked ones). In order to receive the tracked-object messages 0x60A, 0x60B and 0x60C, a configuration message has to be sent. The following command sends the configuration message using can-utils (in cansend's notation, the hexadecimal CAN ID comes before the #, followed by the data bytes):
$ cansend can1 200#0832000200000000
Now the 0x60A, 0x60B and 0x60C messages are sent instead of 0x600, 0x701 and 0x702. We can check this by dumping the CAN stream to the terminal again:
$ candump can1
The stream should include 0x60A, 0x60B and 0x60C messages now.
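candump can also filter by ID, which is handy for watching only the tracked-object messages (standard can-utils filter syntax: ID:mask pairs appended to the interface name):
$ candump can1,60A:7FF,60B:7FF,60C:7FF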
4. Receiving CAN messages on ROS
The ROS package socketcan_interface is needed to receive CAN messages in ROS; it works on top of SocketCAN. With socketcan_interface a driver for this device can be developed. However, there is already a ready-made CAN driver in ROS called ros_canopen, which includes socketcan_interface. Install this package with the following command:
$ sudo apt-get install ros-kinetic-ros-canopen
This package will be used to publish the CAN messages received from the device into the ROS environment.
The socketcan_to_topic node in the socketcan_bridge package can be used to publish topics from the CAN stream. First start a ROS core, then launch the node with the name of the CAN port as an argument (e.g. can1), as shown below.
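A minimal invocation, assuming the default parameter and topic names of socketcan_bridge:
$ roscore
$ rosrun socketcan_bridge socketcan_to_topic_node _can_device:=can1
The frames are then republished as can_msgs/Frame messages and can be inspected with:
$ rostopic echo received_messages
This should display the received messages in ROS.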
This is a brief guide to getting Apollo 2.0 up and running. It is based on the Apollo README with additional setup for the Perception modules.
Requirements:
- Ubuntu 16.04 (also works on 14.04).
- Nvidia GPU. Install the drivers as described here. You don't need CUDA installed (it is included in the Apollo docker). On 16.04 you will need a reasonably new version; the steps below were tested with 390.25. The Apollo-recommended 275.39 will not work on 16.04, but will work on 14.04. However, as this route requires a newer GCC version that breaks the build system, it is much easier to go straight to the 390.25 driver.
Download code and Docker image
Get the code: git clone https://github.com/ApolloAuto/apollo.git
If you don't have Docker already: ./apollo/docker/scripts/install_docker.sh
Then log out and log back in again.
Pull the Docker image. The dev_start.sh script downloads the Docker image (or updates it if already downloaded) and starts the container: cd apollo/ && ./docker/scripts/dev_start.sh
Install Nvidia graphics drivers in the Docker image
Check which driver you are using (in host) with nvidia-smi.
First off we need to enter the container with root privileges so we can install the matching graphics drivers: docker exec -it apollo_dev /bin/bash
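The matching installer then has to be fetched into the container. The exact download command is not preserved here; assuming NVIDIA's usual download URL layout, it would look like:
$ wget http://us.download.nvidia.com/XFree86/Linux-x86_64/***.**/NVIDIA-Linux-x86_64-***.**.run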
where ***.** is the driver version running on your host system. Note: disregard the Apollo instructions to upgrade to GCC 4.9. Not only is it unnecessary with newer versions of the Nvidia drivers, it will also make the build fail. Stick with GCC 4.8.4, which comes in the Docker image.
Now install the drivers: chmod +x NVIDIA-Linux-x86_64-***.**.run
./NVIDIA-Linux-x86_64-***.**.run -a --skip-module-unload --no-kernel-module --no-opengl-files
Hit 'enter' to go with the default choices where prompted. Once done, check that the driver is working with nvidia-smi.
To create a new image with your changes, check what the container ID of your image is (on the host): docker ps -l
Use the resulting container ID with the following command to create a new image (on the host): docker commit CONTAINER_ID apolloauto/apollo:NEW_DOCKER_IMAGE_TAG
where CONTAINER_ID is the container ID you found before, and NEW_DOCKER_IMAGE_TAG is the name you choose for your Apollo GPU image.
Build Apollo in your new Docker image
To get into your new docker image, use the following: ./docker/scripts/dev_start.sh -l -t NEW_DOCKER_IMAGE_TAG
Now you should be able to build the GPU version of Apollo. Start by cleaning: ./apollo.sh clean
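The build command itself is not shown above; assuming the standard apollo.sh subcommands of this release, the GPU build would be:
$ ./apollo.sh build_gpu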
1. From within the docker image, start Apollo: scripts/bootstrap.sh
2. Check that Dreamview is running at http://localhost:8888.
3. Set up Dreamview by selecting the setup mode, vehicle, and map in the top right. For the sample data rosbag, select "Standard", "Mkz8" and "Sunnyvale Big Loop".
4. Start the rosbag in the docker container with rosbag play path/to/rosbag.bag.
5. Once you see the vehicle moving in Dreamview, pause the rosbag with the space bar.
6. Wait a few seconds for the perception, prediction and traffic light modules to load.
7. Resume playing the rosbag with the space bar.
Once the rosbag has finished playing, to play it again you must first shut down with scripts/bootstrap.sh stop and then repeat the above from step 1 (otherwise the time discrepancy stops the modules from working).
When recording a new rosbag from a previously recorded rosbag instead of from live sensor data, the clock can become a problem.
rosbag record stamps the new bag with the time at which it is created, but the original message timestamps are not updated, leaving the clock in the new rosbag and the topics' timestamps out of sync.
To fix this, add /clock to the list of recorded topics when recording the new rosbag; this keeps the clock of the original rosbag instead of creating a new one, as sketched below.
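As a sketch (bag and topic names are placeholders): play the source bag with --clock so it republishes its original clock, and record /clock alongside the topics of interest:
$ rosbag play --clock original.bag
$ rosbag record -O new.bag /clock /your_topic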
Traffic light recognition in Autoware requires the following to be in place:
- Vector Map
- NDT working
- Calibration publisher
- TF between camera and localizer
Traffic light recognition is split into two parts:
1. feat_proj finds the ROIs of the traffic signals in the current camera FOV.
2. region_tlr checks each ROI and publishes the result; it also publishes the /tlr_superimpose_image image with the traffic lights overlaid.
2a. region_tlr_ssd is a deep-learning-based version of the region_tlr detector.
Compile Autoware; CMake will detect SSD Caffe and compile the SSD nodes.
To test, download the object detection models from: http://ertl.jp/~amonrroy/ssd_models/ssd500.zip http://ertl.jp/~amonrroy/ssd_models/ssd300.zip
The 300 model runs faster but won't give good results at greater distances. In contrast, the 500 model requires more computing power but will detect at lower resolutions (farther objects).
In Autoware's RTM, use the [app] button next to ssd_unc in the Computing tab to select the correct image input source and the path to the models.
Launch the node and play a rosbag with image data.
Run the CUDA installer and follow the instructions on screen. DO NOT install the NVIDIA driver it bundles. Install the CUDA Samples in your home directory.
Once finished, confirm everything is OK: go to your home directory and execute cd NVIDIA_CUDA-X.Y_Samples/1_Utilities/deviceQuery, matching X.Y to your CUDA version (e.g. for CUDA 9.0: cd NVIDIA_CUDA-9.0_Samples/1_Utilities/deviceQuery).
Compile the sample by running make.
Run the sample with ./deviceQuery; you should see the details of your GPU(s) and CUDA setup.
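On a working setup the listing ends with a PASS line; abridged, the output looks roughly like this:
$ ./deviceQuery
./deviceQuery Starting...
 CUDA Device Query (Runtime API) version (CUDART static linking)
...
Result = PASS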
Check which USB device was assigned to the IMU using dmesg:
[ 8808.219908] usb 3-3: new full-speed USB device number 28 using xhci_hcd
[ 8808.237513] usb 3-3: New USB device found, idVendor=2639, idProduct=0013
[ 8808.237522] usb 3-3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[ 8808.237527] usb 3-3: Product: MTi-300 AHRS
[ 8808.237531] usb 3-3: Manufacturer: Xsens
[ 8808.237534] usb 3-3: SerialNumber: 03700715
[ 8808.265957] usbcore: registered new interface driver usbserial
[ 8808.265982] usbcore: registered new interface driver usbserial_generic
[ 8808.265999] usbserial: USB Serial support registered for generic
[ 8808.268037] usbcore: registered new interface driver xsens_mt
[ 8808.268048] usbserial: USB Serial support registered for xsens_mt
[ 8808.268063] xsens_mt 3-3:1.1: xsens_mt converter detected
[ 8808.268112] usb 3-3: xsens_mt converter now attached to ttyUSB0
Change permissions on the device: sudo chmod a+rw /dev/ttyUSB0 (it is probably ttyUSB0, as in the dmesg output above, but adjust to your setup).
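Alternatively, so the permission survives replugging the device, add your user to the dialout group and log out and back in (a common approach on Ubuntu, not specific to this sensor):
$ sudo usermod -a -G dialout $USER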
In an Autoware-sourced terminal, execute: rosrun xsens_driver mtdevice.py -m 2 -f 100
(this configures the IMU to publish raw data from the sensor at 100Hz)
To publish data execute (in a sourced terminal): rosrun xsens_driver mtnode.py _device:=/dev/ttyUSB0 _baudrate:=115200
Confirm data is actually arriving by running rostopic echo /imu_raw in a different terminal.
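To double-check that the configured 100 Hz rate is actually achieved, measure the publishing frequency:
$ rostopic hz /imu_raw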