Specifications
- Dimensions: 322 x 220 x 251 mm
- Drive: hub motors (4 x 14.4 W)
- Minimum ground clearance: 24 mm
- No-load max. speed: 1 m/s
- Max. climbing capacity: 40° (under track mode)
- Four-wheel differential load: —
- Ackermann mode load: 4 kg (minimum turning radius 0.4 m)
- Wheat (Mecanum) wheel load: 4 kg
- CPU: ARM 64-bit 4-core @ 1.43 GHz (Cortex-A57)
- GPU: 128-core NVIDIA Maxwell @ 921 MHz
- Memory: 4 GB 64-bit LPDDR4, 25.6 GB/s
- Main computer: NVIDIA Jetson Nano (4 GB)
- Battery: Li-ion 5200 mAh 12 V
- Charging interface: 5.5 x 2.1 mm DC barrel jack
- Voice assistant: iFlytek Voice Assistant / Google Assistant
- Speakers: left and right dual channels (2 x 2 W)
- Ports: TYPE-C x1, USB 2.0 x2
- Front display: 1.54-inch 128x64 white OLED screen
- Rear display: 7-inch 1024x600 IPS touch screen
- Control: mobile app, command control
- Wireless: Bluetooth, maximum distance 10 m
Robots can work precisely in automated environments, reducing the physical workload on humans. Unmanned Ground Vehicles (UGVs) are robotic vehicles that operate on the ground without a human on board. Producing an effective automated robot capable of different tasks typically relies on machine learning and deep learning.
Researching these technologies is currently challenging because it requires physical machines for code implementation and data gathering. Researchers likewise need the hardware and lower-level code already in place before a machine is ready for their use.
AgileX Robotics LIMO serves as a ROS development and learning platform, integrating four steering modes on top of pre-installed demos and examples. The platform adapts to a wider range of scenarios, more closely aligned with the requirements of industrial applications, and supports robot education, function R&D, and product development.
LIMO can perform advanced robotic applications using the NVIDIA Jetson Nano, a depth camera, the EAI X2L LiDAR, and other sensor configurations. These applications include SLAM, obstacle avoidance, path planning and navigation, autonomous positioning, and traffic light recognition.
LIMO is driven by 4x14.4W hub motors and measures 322x220x251mm. It weighs 4.2kg and can carry a dead load of 4.8kg. The wheelbase spans 200mm and the tread 175mm. The device has a minimum ground clearance of 24mm and a no-load maximum speed of 1m/s.
Processor & Operating System
LIMO’s CPU is a Cortex-A57, an ARM 64-bit 4-core processor running at 1.43GHz, while the GPU is a 128-core NVIDIA Maxwell at 921MHz. Memory is 4GB of 64-bit LPDDR4 at 25.6GB/s. The operating system is Ubuntu 18.04, and the Inertial Measurement Unit (IMU) is an MPU6050.
Display & Battery
The front carries a 1.54-in 128x64 white OLED display, while the rear has a 7-in IPS touch screen with a 1024x600 resolution. The robot runs on a 5200mAh 12V battery with a 40-minute working time and a 2-hour standby time. The charging interface is a 5.5x2.1mm DC barrel jack.
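As a back-of-envelope check on these figures, the stated 5200 mAh, 12 V battery and 40-minute working time imply an average draw of roughly 94 W. This is a rough estimate that ignores converter efficiency and battery derating:

```python
# Rough average power draw implied by the battery spec.
# Estimate only: ignores converter efficiency and battery derating.
capacity_ah = 5.2      # 5200 mAh
voltage_v = 12.0
runtime_h = 40 / 60    # 40 min working time

energy_wh = capacity_ah * voltage_v     # energy stored on board (Wh)
avg_power_w = energy_wh / runtime_h     # implied average draw (W)

print(f"Stored energy: {energy_wh:.1f} Wh")        # 62.4 Wh
print(f"Implied average draw: {avg_power_w:.1f} W")  # 93.6 W
```

The four 14.4 W hub motors account for 57.6 W of that at full rating; the rest plausibly covers the Jetson Nano, displays, and sensors.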
Four Steering Modes
LIMO’s four steering modes are tracked, omnidirectional, four-wheel differential, and Ackermann. The design lets the device switch quickly between these modes, allowing developers to research a variety of applications.
The four-wheel differential mode grants in-situ rotation ability, although it causes serious tire wear even with a 1kg load. The Mecanum (wheat) wheel mode, on the other hand, allows LIMO to move forward, laterally, obliquely, rotationally, or in any combination of these motions, and can carry a load of 4kg.
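The omnidirectional behavior described above follows from standard Mecanum-wheel inverse kinematics. The sketch below uses the common 45-degree-roller model with the wheelbase and tread from the spec sheet; it illustrates the principle and is not LIMO's actual firmware:

```python
# Inverse kinematics for a Mecanum (omnidirectional) platform:
# map body velocities (vx forward, vy left, wz yaw) to the four
# wheel linear speeds. Standard 45-degree-roller model; geometry
# taken from the LIMO spec sheet.
L = 0.200 / 2   # half wheelbase (m)
W = 0.175 / 2   # half tread (m)

def mecanum_wheel_speeds(vx, vy, wz):
    """Return (front-left, front-right, rear-left, rear-right)
    wheel speeds in m/s for the requested body motion."""
    k = L + W
    fl = vx - vy - k * wz
    fr = vx + vy + k * wz
    rl = vx + vy - k * wz
    rr = vx - vy + k * wz
    return fl, fr, rl, rr

# Pure lateral motion: wheels spin in the +/- pattern typical of
# Mecanum drives, with zero net forward component.
print(mecanum_wheel_speeds(0.0, 0.3, 0.0))   # (-0.3, 0.3, 0.3, -0.3)
```

Pure forward motion (`vx` only) spins all four wheels at the same speed, which is why the same chassis can also run as a plain differential platform.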
Meanwhile, the track mode gives LIMO good off-road performance for climbing small steps and 40-degree slopes. Lastly, the Ackermann mode addresses the geometry problem of wheels tracing circles of different radii when a vehicle steers. It can carry a load of 4kg with a minimum turning radius of 0.4m.
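The varying-radius problem that Ackermann steering solves can be made concrete with a little trigonometry: at a given turn radius, the inner front wheel must steer more sharply than the outer one. A sketch using the spec's 200 mm wheelbase, 175 mm tread, and 0.4 m minimum turning radius (an idealized geometric model, not LIMO's controller):

```python
import math

# Ackermann geometry: for a turn of radius R (measured at the
# rear-axle midpoint), the inner and outer front wheels need
# different steering angles so both trace concentric circles
# without scrubbing.
WHEELBASE = 0.200  # m, from the spec
TREAD = 0.175      # m, from the spec

def ackermann_angles(radius_m):
    """Inner and outer steering angles (degrees) for a turn radius."""
    inner = math.atan(WHEELBASE / (radius_m - TREAD / 2))
    outer = math.atan(WHEELBASE / (radius_m + TREAD / 2))
    return math.degrees(inner), math.degrees(outer)

inner, outer = ackermann_angles(0.4)   # spec minimum turning radius
print(f"inner wheel: {inner:.1f} deg, outer wheel: {outer:.1f} deg")
```

At the 0.4 m minimum radius the two angles differ by roughly ten degrees, which shows why a single shared steering angle would drag the tires.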
Nvidia Jetson Nano Developer Kit
The small yet powerful NVIDIA Jetson Nano is a computer designed for entry-level edge AI devices and applications. The kit provides acceleration libraries for computer vision, deep learning, graphics, multimedia, and more. It can also be expanded toward robot navigation and positioning, voice recognition, and image processing, especially in the high-end version.
ORBBEC® Dabai Stereo Depth Camera
The low-cost, high-precision ORBBEC® Dabai stereo depth camera provides color and depth images, generating dense point clouds while consuming little power. The ORBBEC SDK and ROS wrapper make computer vision development a breeze, with up to a 67.9° field of view, a 0.3-3m range, and +/-6mm accuracy.
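A depth camera produces those point clouds by back-projecting each depth pixel through the pinhole camera model. The sketch below illustrates the operation with made-up intrinsics (`FX`, `FY`, `CX`, `CY` are placeholders, not the Dabai's actual calibration):

```python
# Back-projecting a depth pixel to a 3D point with the pinhole
# model -- the basic operation behind the dense point clouds a
# depth camera produces. The intrinsics below are illustrative
# placeholders, NOT the camera's real calibration.
FX, FY = 460.0, 460.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 200.0   # principal point in pixels (assumed)

def deproject(u, v, depth_m):
    """Pixel (u, v) with depth in metres -> (x, y, z) in the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

# A pixel at the principal point maps straight down the optical axis:
print(deproject(320, 200, 1.5))   # (0.0, 0.0, 1.5)
```

Running this over every pixel of a depth frame yields the dense cloud that SLAM and obstacle-avoidance stacks consume.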
EAI X2L 360° LiDAR
The 360-degree two-dimensional YDLIDAR X2L is a ranging product with a 7Hz scan frequency, a 0.1-8m range, and a 3000Hz ranging frequency. It supports ROS navigation, obstacle avoidance, environment scanning, and 3D reconstruction. Absolute error is 2cm within one meter, with a relative error of 3.5% over 1-6m. The triangulation ranging principle, combined with the related optical, electrical, and algorithmic design, lets the product achieve high-precision, high-frequency distance measurement.
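A 2D LiDAR like this delivers each scan as (bearing, range) pairs, which navigation code converts to Cartesian points after filtering against the sensor's valid range window. A minimal sketch with invented sample data (the X2L driver's real output is a ROS LaserScan message, not these lists):

```python
import math

# Convert a 2D LiDAR scan from polar (bearing, range) pairs to
# Cartesian (x, y) points in the sensor frame, discarding returns
# outside the X2L's 0.1-8 m window. Sample data is illustrative.
def scan_to_points(angles_deg, ranges_m, r_min=0.1, r_max=8.0):
    """Filter a scan to valid returns and convert to (x, y) tuples."""
    points = []
    for a, r in zip(angles_deg, ranges_m):
        if r_min <= r <= r_max:
            rad = math.radians(a)
            points.append((r * math.cos(rad), r * math.sin(rad)))
    return points

pts = scan_to_points([0, 90, 180], [1.0, 2.0, 12.0])
print(pts)   # the 12 m return is discarded as out of range
```

This polar-to-Cartesian step is what lets mapping and obstacle-avoidance code treat the scan as geometry rather than raw range readings.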
The LIMO app, built on ROS, aims to replace the remote controller. It connects over Bluetooth at a maximum distance of 10m and gives the user multi-modal motion control, including the four-wheel differential mode. Besides Bluetooth, LIMO also connects over Wi-Fi through command control.
Limo Simulation Table
The simulation table designed specifically for LIMO works best in autonomous driving projects and curricula. It supports autonomous traffic light recognition, reverse parking, and navigation and detection. The table comes with 16 plates of 750x750x5mm serving as the base and 16 borders of 750x200x5mm as the sides, while the magnetic whiteboard works for text recognition. The traffic light runs in manual or automatic mode: manual mode requires pressing the round button, while in automatic mode red turns yellow after 35s, yellow turns green after 3s, and green turns back to red after another 35s.
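The automatic mode's timing (red 35 s, yellow 3 s, green 35 s) amounts to a small cyclic state machine, which can be sketched as:

```python
# The simulation table's automatic traffic-light cycle as a timing
# table: red 35 s -> yellow 3 s -> green 35 s -> red ...
# (durations taken from the text).
CYCLE = [("red", 35), ("yellow", 3), ("green", 35)]
PERIOD = sum(d for _, d in CYCLE)   # 73 s per full cycle

def light_at(t_seconds):
    """Colour showing t_seconds after the start of a red phase."""
    t = t_seconds % PERIOD
    for colour, duration in CYCLE:
        if t < duration:
            return colour
        t -= duration

print(light_at(0), light_at(36), light_at(40), light_at(73))
# red yellow green red
```

A vision pipeline on the table can be tested against this known 73-second cycle, since the light's state at any timestamp is fully determined.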
Tags: 4 wheeled robots, robotics, tracked robots