

Self-Driving Car Studio
Accelerate, expand, and sustain self-driving research
The Quanser Self-Driving Car Studio is the ideal platform for teaching and academic research across a wide variety of self-driving topics, in an accessible and relevant way. Use it to jump-start your research or give students authentic hands-on experience with the essentials of self-driving. The studio brings you the tools and components you need to test and validate dataset generation, mapping, navigation, machine learning, and other advanced self-driving concepts at home or on campus.
The studio includes:
- 1 x high-performance preconfigured turnkey PC to serve as a testbed and infrastructure station
- 3 x high-definition monitors
- 1 x game controller
- 2 x floor maps (15.75’ x 9.2’/4.8m x 2.8m and 15.75’ x 20’/4.8m x 6.1m) for testing various driving scenarios (images in Gallery)
- 2 x custom PVC borders for branding and environmental mapping
- 4 x reprogrammable traffic lights with batteries
- 1 x set of accessories including multiple options for North American and European scale signs and ten traffic pylons
- 1 x car stand
- 1 x preconfigured router for high-speed wireless communication
- 1 x 1-year license for the QLabs Virtual QCar module
The Quanser Virtual QCar is a fully instrumented, dynamically accurate digital twin of the Quanser QCar system. It behaves the same way as the physical hardware and can be measured and controlled using Python, ROS, or MATLAB and Simulink. It can enrich your lectures and activities in traditional labs, or bring credible, authentic model-based lab experiences into your distance and online self-driving courses.
Like the physical QCar, the virtual system is a self-driving teaching and research platform complete with industrially relevant sensors such as Lidar and RGBD cameras.
QLabs Virtual QCar is available as a 12-month multi-seat subscription. The platform is compatible with the physical QCar content, covering examples such as 360° vision, RGBD imaging, autonomous driving, and more. It also integrates with the self-driving teaching content available with the Self-Driving Car Studio.
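
For example, on a QCar (physical or virtual) running ROS 2, a few lines of rclpy are enough to watch the Lidar stream. The following is a minimal sketch, assuming the scan is published as a standard sensor_msgs/LaserScan message on a /scan topic; verify the actual topic name on your installation with `ros2 topic list`.

```python
# Minimal sketch: monitor the QCar's 2D Lidar over ROS 2 and report the
# closest valid return. The '/scan' topic name is an assumption; adjust
# it to match the topics your QCar stack actually publishes.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class ScanMonitor(Node):
    def __init__(self):
        super().__init__('qcar_scan_monitor')
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # Discard out-of-range returns before taking the minimum.
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid:
            self.get_logger().info(f'Closest obstacle: {min(valid):.2f} m')


def main():
    rclpy.init()
    rclpy.spin(ScanMonitor())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```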
Product Details
| Specification | Details |
| --- | --- |
| Dimensions | 39 x 21 x 21 cm |
| Weight (with batteries) | 2.7 kg |
| Power | 3S 11.1 V LiPo (3300 mAh) with XT60 connector |
| Operation time (approximate) | ~2 h 11 min (stationary, with sensor feedback) |
| | ~30 min (driving, with sensor feedback) |
| Onboard computer | NVIDIA® Jetson™ TX2 |
| | CPU: 1.2 GHz quad-core ARM Cortex-A57 64-bit + 1.2 GHz dual-core NVIDIA Denver2 64-bit |
| | GPU: 256-core NVIDIA Pascal™ architecture, 1.3 TFLOPS (FP16) |
| | Memory: 8 GB 128-bit LPDDR4 @ 1866 MHz, 59.7 GB/s |
| LIDAR | 2k-8k resolution, 10-15 Hz scan rate, 12 m range |
| Cameras | Intel D435 RGBD camera |
| | 360° 2D CSI cameras using 4x 160° FOV wide-angle lenses, 21-120 fps |
| Encoders | 720-count motor encoder (pre-gearing) with hardware digital tachometer |
| IMU | 9-axis IMU (gyroscope, accelerometer, magnetometer) |
| Safety features | Hardware "safe" shutdown button |
| | Auto power-off to protect batteries |
| Expandable I/O | 2x SPI |
| | 4x I2C |
| | 40x GPIO (digital) |
| | 4x USB 3.0 ports |
| | 1x USB 2.0 OTG port |
| | 3x Serial |
| | 4x Additional encoders with hardware digital tachometer |
| | 4x Unipolar analog inputs, 12-bit, 3.3 V |
| | 2x CAN bus |
| | 8x PWM (shared with GPIO) |
| Connectivity | WiFi 802.11a/b/g/n/ac, 867 Mbps, dual antennas |
| | 2x HDMI ports for dual-monitor support |
| | 1x 10/100/1000BASE-T Ethernet |
| Additional QCar features | Headlights, brake lights, turn signals, and reverse lights (with intensity control) |
| | Dual microphones |
| | Speaker |
| | LCD diagnostic monitoring with battery voltage and custom text support |
| Supported Software and APIs | QUARC Autonomous Software License |
| | Quanser APIs |
| | TensorFlow |
| | TensorRT |
| | Python™ 2.7 & 3 |
| | ROS 1 & 2 |
| | CUDA® |
| | cuDNN |
| | OpenCV |
| | DeepStream SDK |
| | VisionWorks® |
| | VPI™ |
| | GStreamer |
| | Jetson Multimedia APIs |
| | Docker containers with GPU support |
| | Simulink® with Simulink Coder |
| | Simulation and virtual training environments (Gazebo, QuanserSim) |
| | Multi-language development supported with Quanser Stream APIs for inter-process communication |
| | Unreal Engine |
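
To give a sense of how the items above combine in practice, here is a minimal sketch that grabs frames from one of the QCar's four CSI cameras through GStreamer and OpenCV on the Jetson TX2. The nvarguscamerasrc element is the standard Jetson CSI camera source, but the sensor ID, resolution, and frame rate shown are assumptions to adapt to your camera configuration.

```python
# Sketch: capture frames from a CSI camera on the Jetson TX2 via a
# GStreamer pipeline opened with OpenCV. sensor-id, resolution, and
# frame rate are assumptions; match them to your camera setup.
import cv2

PIPELINE = (
    'nvarguscamerasrc sensor-id=0 ! '
    'video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! '
    'nvvidconv ! video/x-raw, format=BGRx ! '
    'videoconvert ! video/x-raw, format=BGR ! appsink'
)

cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError('Failed to open CSI camera; check the pipeline.')

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('QCar CSI camera', frame)
    # Press 'q' to quit.
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

The same captured frames can then feed the OpenCV, TensorFlow, or TensorRT workflows listed in the table for perception and machine learning experiments.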