Archive

Posts Tagged ‘robot’

A Look at Three Options to Develop Real-Time Linux Systems on Application Processors – HMP, Real-Time Linux and Xenomai

October 15th, 2016 6 comments

This is a guest post written by Guilherme Fernandes, Raul Muñoz, Leonardo Veiga, and Brandon Shibley, all working for Toradex.

Introduction

Application processor usage continues to broaden. Systems-on-Chip, usually powered by ARM Cortex-A cores, are taking over several spaces where small ARM Cortex-M and other microcontroller devices have traditionally dominated. This trend is driven by several factors, such as:

  • The strong requirements for connectivity, often related to IoT, not only from a hardware point of view but also in terms of software, protocols and security;
  • The need for highly interactive interfaces such as multi-touch, high resolution screens and elaborate graphical user interfaces;
  • The decreasing price of SoCs, as a consequence of their volume gains and new production capabilities.

A typical example of this trend is the customers we see every day starting a product redesign to upgrade from a microcontroller to a microprocessor. This move brings new challenges, as the hardware design is more complicated and the operating system abstraction layer is much more complex. The difficulty of hardware design using an application processor is overcome by the use of reference designs and off-the-shelf alternatives like computer-on-modules or single board computers. On the operating system layer, the use of embedded Linux distributions is widespread in the industry. An immense world of open source tools is available, simplifying the development of complex and feature-rich embedded systems. Such development would be very complicated and time consuming with microcontrollers. Despite all these benefits, the use of an operating system like Linux still raises a lot of questions and distrust when determinism and real-time control applications are addressed.

A common approach adopted by developers is the strategy of separating time-critical tasks and regular tasks onto different processors. Hence, a Cortex-A processor, or similar, is typically selected for multimedia and connectivity features while a microcontroller is still employed to handle real-time, determinism-critical tasks. The aim of this article is to present some options developers may consider when developing real-time systems with application processors. We present three possible solutions to provide real-time capability to application processor based designs.

Heterogeneous Multicore Processing

The Heterogeneous Multicore Processing (HMP) approach is a hardware solution. Application processors like the NXP i.MX7 series, the NXP i.MX6SoloX and the upcoming NXP i.MX8 series offer a variety of cores with different purposes. If we consider the i.MX7S, we see a dual-core processor composed of a Cortex-A7 core @ 800MHz side-by-side with a Cortex-M4 core @ 200MHz. The basic idea is that the user interface and high-speed connectivity are implemented on an abstracted OS like Linux with the Cortex-A core while, independently and in parallel, control tasks execute on a real-time OS, like FreeRTOS, on the Cortex-M core. Both cores share access to memory and peripherals, allowing flexibility and freedom when defining which tasks are allocated to each core/OS. Refer to Figure 1.


Figure 1 – NXP i.MX7 Block Diagram (Click to Enlarge)

Some of the advantages of using the HMP approach are:

  • Legacy software from microcontrollers can be more easily reused;
  • Firmware updates (M4 core) are simplified, as the firmware may simply be a file in the filesystem of the Cortex-A OS;
  • Increased flexibility of choosing which peripherals will be handled by each core. Since it is software defined, future changes can be made without changing hardware design.

More information on developing applications for HMP-based processors is available in these two articles:

Toradex, Antimicro and The Qt Company collaboratively built a robot showcasing this concept. The robot – named TAQ – is an inverted pendulum balancing robot designed with the Toradex Colibri iMX7 computer on module. The user interface is built upon Linux with the Qt framework running on the Cortex-A7, and the balancing/motor control is deployed on the Cortex-M4. Inter-core communication is used to remote control the robot and animate its face, as seen in the short video below.
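
To give an idea of what the Linux side of such inter-core communication can look like, below is a minimal C sketch, not the TAQ robot's actual code, that sends a command to the Cortex-M4 firmware through the RPMsg tty device exposed by typical i.MX7 BSPs. The device node name and the "speed:<value>" text protocol are assumptions for illustration; the real names depend on the BSP and on the firmware running on the M4.

```c
/*
 * Minimal illustrative sketch (not the TAQ robot's actual code): send a
 * command from Linux on the Cortex-A7 to the firmware running on the
 * Cortex-M4 over RPMsg. The device node name and the "speed:<value>"
 * text protocol are hypothetical and depend on the BSP and M4 firmware.
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define RPMSG_DEV "/dev/ttyRPMSG"   /* assumed RPMsg tty node, BSP dependent */

int main(void)
{
    const char *cmd = "speed:50\n";             /* hypothetical M4 command */
    char reply[64];
    ssize_t n;

    int fd = open(RPMSG_DEV, O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open " RPMSG_DEV);
        return 1;
    }

    /* Forward the command to the M4 firmware. */
    if (write(fd, cmd, strlen(cmd)) < 0)
        perror("write");

    /* Optionally wait for an acknowledgment from the M4 (blocks until data). */
    n = read(fd, reply, sizeof(reply) - 1);
    if (n > 0) {
        reply[n] = '\0';
        printf("M4 replied: %s\n", reply);
    }

    close(fd);
    return 0;
}
```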

Real-Time Linux

The second approach we present in this article is software related. Linux is not a real-time operating system, but there are some initiatives which have greatly improved the determinism and timeliness of Linux. One of these efforts is the Real-Time Linux project. Real-Time Linux is a series of patches (PREEMPT_RT) aimed at adding new preemption options to the Linux Kernel along with other features and tools to improve its suitability for real-time tasks. You can find documentation on applying the PREEMPT_RT patch to the Linux kernel and developing applications for it at the official Real-Time Linux Wiki (formerly here).

We ran some tests using the PREEMPT_RT patches on a Colibri iMX6DL to exemplify the improvement in real-time performance. The documentation on preparing the Toradex Linux image to deploy the PREEMPT_RT patch is available at this link. We developed a simple application which toggles a GPIO at 2.5 kHz (200µs high / 200µs low). The GPIO output is connected to a scope where we measure the resulting square wave and evaluate the real output timings. The histograms below show the comparison between the tests on a standard Linux kernel configured for Voluntary Preemption (top) and a PREEMPT_RT patched Linux kernel configured for Real-time Preemption (bottom). The x-axis represents the period of the square wave sample and the y-axis represents the number of samples measured with that period. The table below the chart presents the worst-case and average data.


Figure 2: Histogram of the square wave generated using the standard Kernel (top) and Preempt-RT kernel (bottom) – Click to Enlarge

| Description | Samples | Smallest (µs) | Worst Case for 99% of Samples (µs) | Worst Case (µs) | Median (µs) | Average (µs) |
|---|---|---|---|---|---|---|
| Default Kernel | 694,780 | 36 | 415 | 4,635 | 400 | 400 |
| PREEMPT_RT Kernel | 683,593 | 369 | 407 | 431 | 400 | 400 |

Table 1: Comparison between Default Kernel and real-time Kernel when generating a square wave.
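
One of the attractions of the PREEMPT_RT approach is that the test application can be plain POSIX code: it requests a real-time scheduling policy, locks its memory, and paces itself with clock_nanosleep() on an absolute deadline. The sketch below illustrates the idea; it is not the exact code used for the measurements above, and the sysfs GPIO path and priority are placeholders.

```c
/*
 * Minimal illustrative sketch (not the exact measurement code): toggle a
 * sysfs GPIO every 200 µs (2.5 kHz square wave) under SCHED_FIFO.
 * The GPIO path is a placeholder; the GPIO must already be exported and
 * configured as an output.
 */
#include <fcntl.h>
#include <sched.h>
#include <stdio.h>
#include <sys/mman.h>
#include <time.h>
#include <unistd.h>

#define GPIO_VALUE "/sys/class/gpio/gpio52/value"   /* placeholder GPIO */
#define HALF_PERIOD_NS 200000L                      /* 200 µs high / 200 µs low */

static void timespec_add_ns(struct timespec *t, long ns)
{
    t->tv_nsec += ns;
    while (t->tv_nsec >= 1000000000L) {
        t->tv_nsec -= 1000000000L;
        t->tv_sec++;
    }
}

int main(void)
{
    /* Real-time scheduling class and locked memory to avoid paging latencies. */
    struct sched_param sp = { .sched_priority = 80 };
    if (sched_setscheduler(0, SCHED_FIFO, &sp) < 0)
        perror("sched_setscheduler (needs root)");
    mlockall(MCL_CURRENT | MCL_FUTURE);

    int fd = open(GPIO_VALUE, O_WRONLY);
    if (fd < 0) {
        perror("open gpio");
        return 1;
    }

    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);
    int level = 0;

    for (;;) {
        timespec_add_ns(&next, HALF_PERIOD_NS);
        /* Absolute-time sleep so the period does not drift over iterations. */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        level = !level;
        if (write(fd, level ? "1" : "0", 1) != 1)
            perror("write gpio");
    }
}
```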

An example software system using the PREEMPT_RT patch is provided by Codesys Solutions. They rely on the Real-Time Linux kernel, together with the OSADL (Open Source Automation Development Lab), to deploy their software PLC solution, which is already widespread throughout the automation industry across thousands of devices. The video below presents the solution running on an Apalis iMX6Q.

Xenomai

Xenomai is another popular framework for making Linux a real-time system. Xenomai achieves this by adding a co-kernel to the Linux kernel. The co-kernel handles time-critical operations and has a higher priority than the standard kernel. To use the real-time capabilities of Xenomai, the real-time APIs (aka libcobalt) must be used to interface user-space applications with the Cobalt core, which is responsible for ensuring real-time performance.


Figure 3: Dual Core Xenomai Configuration

Documentation on how to install Xenomai on your target device can be found at the Xenomai website. Additionally, there is a variety of embedded hardware known to work, as indicated in the hardware reference list, which includes the whole NXP i.MX SoC series.

To validate the use of Xenomai on the i.MX6 SoC we also developed a simple experiment. The target device was the Colibri iMX6DL by Toradex. We ran the same test approach as described above for the Real-Time Linux extension. The kind of application code used to implement the test is sketched below to highlight the use of the Xenomai APIs.
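
This is a minimal sketch rather than the original test code: a periodic real-time task, built on the Xenomai 3 Alchemy API, released every 200 µs to toggle a GPIO value. The GPIO sysfs path, task priority and period are illustrative, and the program must be built with the flags reported by xeno-config so the calls are serviced by the Cobalt core. Note that writing through sysfs migrates the task to secondary (non real-time) mode; a real test would rather access the GPIO registers directly or use an RTDM driver.

```c
/*
 * Minimal illustrative sketch (not the original test listing): a periodic
 * Xenomai task toggling a sysfs GPIO every 200 µs using the Alchemy API.
 * The GPIO path and priority are placeholders. Build with the flags from
 * `xeno-config --skin=alchemy --cflags --ldflags` so calls reach the Cobalt core.
 */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

#include <alchemy/task.h>
#include <alchemy/timer.h>

#define GPIO_VALUE "/sys/class/gpio/gpio52/value"   /* placeholder GPIO */
#define HALF_PERIOD_NS 200000ULL                    /* 200 µs high / 200 µs low */

static RT_TASK toggle_task;

static void toggle_loop(void *arg)
{
    int level = 0;
    int fd = open(GPIO_VALUE, O_WRONLY);

    (void)arg;
    if (fd < 0) {
        perror("open gpio");
        return;
    }

    /* Release this task every 200 µs from now on. */
    rt_task_set_periodic(NULL, TM_NOW, HALF_PERIOD_NS);

    for (;;) {
        rt_task_wait_period(NULL);       /* sleep until the next release point */
        level = !level;
        /* Caveat: a sysfs write forces a switch to secondary mode; a real
         * test would use an RTDM driver or direct register access instead. */
        if (write(fd, level ? "1" : "0", 1) != 1)
            perror("write gpio");
    }
}

int main(void)
{
    /* Priority 80 out of 99, default stack size, joinable so main can wait. */
    if (rt_task_create(&toggle_task, "gpio-toggle", 0, 80, T_JOINABLE)) {
        fprintf(stderr, "rt_task_create failed\n");
        return 1;
    }
    rt_task_start(&toggle_task, toggle_loop, NULL);
    rt_task_join(&toggle_task);
    return 0;
}
```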

The results comparing Xenomai against a standard Linux kernel are presented in the chart below. Once again, the real-time solution provides a clear advantage – this time with even greater distinction – over the time-response of the standard Linux kernel.


Figure 4: Histogram of the square wave generated using the standard Kernel (top) and Xenomai (bottom) – Click to Enlarge

| Description | Samples | Smallest (µs) | Worst Case for 99% of Samples (µs) | Worst Case (µs) | Median (µs) | Average (µs) |
|---|---|---|---|---|---|---|
| Default Kernel | 694,780 | 36 | 415 | 4,635 | 400 | 400 |
| Xenomai Implementation | 1,323,521 | 386 | 402 | 414 | 400 | 400 |

Table 2: Comparison between Default Kernel and Xenomai implementation when generating a square wave.

Conclusion

This article presented a brief overview of some solutions available to develop real-time systems on application processors running Linux as the target operating system. This is a starting point for developers who are aiming to use microprocessors and are concerned about real-time control and determinism.

We presented one hardware-based approach, using Heterogeneous Multicore Processing SoCs, and two software-based approaches, namely the PREEMPT_RT patch and Xenomai. The results presented are not intended to compare operating systems or real-time techniques. Each of them has strong and weak points and may be more or less suitable depending on the use case.

The primary takeaway is that several feasible solutions exist for utilizing Linux with application processors in reliable real-time applications.

Asus Zenbo Smart Home Robot is Looking for Developers

May 31st, 2016 1 comment

Asus introduced a $599 robot on wheels at Computex 2016. Asus Zenbo features a friendly face which doubles as a touchscreen display, answers voice commands, takes pictures on demand, and can help you with all sorts of tasks, from reading recipes out loud and making emergency calls if it detects an elderly person’s fall, to reminders such as taking medicine or going to a doctor’s appointment.

The robot could also be used for all sorts of home automation tasks such as turning lights on and off, so Asus also launched a developer program in order to get more applications leveraging the robot’s capabilities, including Android and web applications using speech recognition, facial recognition, indoor navigation, smart home protocols, etc… Few technical details have been provided so far though. The highlights of the ASUS press event do show the robot in action, answering questions, moving around and taking photos.

Via Liliputing

Autonomous Deep Learning Robot Features Nvidia Jetson TK1 Board, a 3D Camera, and More

January 25th, 2016 No comments

Autonomous, a US company that makes smart products such as smart desks, virtual reality kits and autonomous robots, has recently introduced a deep learning robot that comes with a 3D camera, speaker and microphone, Jetson TK1 board, and a mobile base.


The robot appears to be mostly made of off-the-shelf parts:

  • 3D Depth camera – Asus Xtion Pro 3D Depth Camera
  • Speaker & Microphone
  • Nvidia Jetson TK1 PM375 board – Nvidia Tegra K1 quad-core Cortex-A15 processor @ 2.3 GHz with a 192-core Kepler GPU, 2GB RAM, 16 GB flash
  • Kobuki Mobile Base – Kobuki is a mobile base designed for education and research on state-of-the-art robotics. It provides power supplies for an external computer as well as additional sensors and actuators, and its highly accurate odometry, amended by a calibrated gyroscope, enables precise navigation.

The robot is designed for research in deep learning and mobile robotics, and comes with Ubuntu, Caffe, Torch, Theano, cuDNN v2, and CUDA 7.0, as well as the Robot Operating System (ROS) set of open source software libraries and tools.


Kobuki Base

While there’s virtually no documentation at all on the product page, I’ve been told that the robot was built on top of the TurtleBot open source robot, and redirected to tutorials available via the TurtleBot Wiki, as well as useful resources for deep learning frameworks such as Caffe and Torch, and Google TensorFlow tutorials.

Autonomous Deep Learning Robot sells for $999 with manual charging, or $1048 with a self-charging dock.

Thanks to Nanik for the tip!

Reeman Playmate is a Robot for Kids Based on Rockchip RK3288 SoC

October 26th, 2015 2 comments

Rockchip processors are usually found in tablets and TV boxes, but one Shenzhen based company called Reeman has designed the Playmate robot for kids powered by a Rockchip RK3288 processor, which can take pictures, sing, dance, tell stories, be used for video chat, and go recharge itself automatically when the battery is low.

The Rockchip processor takes care of the 10.1″ touchscreen display, stereo microphones, speakers, camera, and the artificial intelligence, which recognizes Chinese, and to a lesser extent English, and takes relevant actions, while the motors and sensors are managed by a few ARM Cortex-M3 microcontrollers. The robot is still in development, and the company is working on adding a 3D camera to recognize objects, which will allow the robot to find its charging dock among other things.

The Android 4.4 robot will be available for 2,999 RMB (~$473) in Q1 2016 in China. Charbax filmed the robot at the company, and also had a tour of the R&D department.

You can visit the Reeman Playmate product page if you wish, but currently there’s virtually no information.

Via ARMDevices.net

GroBotz Interactive Robot Project is Made of Easy to Assemble Smart Blocks (Crowdfunding)

February 25th, 2015 No comments

GroBotz makes me think of Lego applied to robotics. The project consists of modules such as motors, sensors, buttons, switches, or cameras that snap together in order to create a robot on wheels, games, toys, a musical instrument, or whatever idea you may have, and the hardware is then programmed using a graphical user interface.


GroBie is made for GroBotz Modules

A Raspberry Pi board is used for the brain of the robot, and Microchip PIC MCUs for the smart blocks. The software is programmed in C# using Xamarin, the user interface is based on Unity, OpenCV is used for image processing, and during development the plastic parts were printed with a MakerBot, while the schematics and PCB layout were designed with CadSoft EAGLE.

The company has now come up with a number of modules as shown in the picture below.

GroBotz Smart Blocks

Your robot can then be controlled over Wi-Fi with the GroBotz app which works on Windows, Mac OS, iOS, Android and Linux devices. The software provides a “wire editor” to link up to 127 modules together and define the robot’s behavior. For example, you can wire a motor module to a joystick module, and easily control the motor with the joystick.

GroBotz has just been launched on Kickstarter, where the developers look to raise at least $300,000 to go ahead with production. The simplest kit is composed of a Light Game Cube and battery with a GroBotz T-Shirt and builders cube (which must be the plastic enclosure for the module), and requires a $50 pledge, but if you want something a bit more fun like the GroBie shown in the first picture, you’ll need to pledge $100 in order to receive 2 DC motors, a brain (Raspberry Pi), a battery and charger, a caster, and 2 wheels. There are other rewards, for example $500 for 30 building blocks. Delivery is scheduled for August to October 2015 depending on the chosen perk. You may also want to visit grobotz.com for a few more details.

Robotics News – Hack-E-Bot and RiQ Educational Robots, and Maker Club 3D Printed Robots (Pre-Orders / Crowdfunding)

November 6th, 2014 1 comment

I’ve come across several robotic projects this week, so instead of picking just one, or writing a post for each, I’ll summarize the three products in one post. Two of the projects are educational robots based on Arduino, namely the sub-$50 Hack-E-Bot and the more advanced RiQ robot, while Maker Club is a company providing the electronics for robotics kits, letting you print the plastic parts with your own 3D printer.

Hack-E-Bot Robot


Hack-E-Bot is an affordable open source robot that hopes to encourage children to learn about engineering, electronics, and programming. The robot is powered by Adafruit’s Trinket Arduino compatible board, connected to a breadboard and some sensor add-on boards. The basic version comes with a sonar sensor, but more add-on boards are on the way, including bump sensors, a buzzer, colored lights, a claw, a servo scanner, and so on.

The project is listed on CrowdSupply, and has been funded. The campaign is closed, but you can pre-order a full Hack-E-Bot kit for $47, delivered in December 2014. Taking pre-orders after the campaign is completed is apparently an option on CrowdSupply that’s not found on Indiegogo and Kickstarter.

RiQ Robot Kit


RiQ is a robot powered by an Atmel MCU (ATmega328P), with Bluetooth, ultrasonic/light/infrared/touch sensors, LEDs, a servo driver, and up to four DC motors. It’s programmed with PCS Cortex 5.0 drag-and-drop software that reminds me of Scratch, and that can also show the actual Arduino code to help learn real coding. iOS, Android, Mac OSX, and Windows are supported, and the connection to the “PCS BRAIN” is made via USB or Bluetooth. The company provides tutorials for six robots: “Scaredy Bot” who’s afraid of light, “Art” drawing bot, “Red Rover” which can follow “lines” made of IR light, “Bounced Bot” to demonstrate the use of buttons, “Bat Bot” for obstacle avoidance with the ultrasonic sensor, and “Illumi Bot” which flashes bright LEDs.

You can find more information, and/or pre-order RiQ Robot Kit for $199 on Edventures Lab’s RiQ Robot page. Shipping is scheduled for mid-November.

Maker Club 3D Printed Robots


Carduino 3D Printed Robot

Maker Club bots are also controlled by an Arduino compatible board called MakerConnect with Bluetooth Smart (BLE) connectivity. But instead of providing complete kits with all parts, the company will simply send you the electronics, and provide the CAD files so that you can 3D print your own robots. This saves on shipping (in theory), makes it more fun to build the robot, and easier to get replacement parts.

They’ve launched a flexible funding campaign on Indiegogo, where you can pledge for the MakerConnect board only (29 GBP / $46), or one of the four robotics kits: “The Grabber” robotic arm (39 GBP / $62), “Quadmonster” with four servos (45 GBP / $72), “Carduino” (59 GBP / $94), or “Insectoid”, some sort of spider with 18 servos (69 GBP / $110). If you don’t own a 3D printer, you can also add 20 GBP / $32 to get the 3D printed parts for the kit. Shipping is not included in the price, but “standard shipping costs” are expected for the UK, and between 20 and 35 GBP for the rest of the world. Delivery is scheduled for March or April 2015 depending on the perk.

Arduino Compatible Microduino JoyPad with TFT Display Lets You Play Games, Control Devices, and More (Crowdfunding)

September 30th, 2014 No comments

Last year, Microduino successfully launched their tiny Arduino boards and shields via Kickstarter, and they are now back on Kickstarter with the Microduino Joypad, another Arduino compatible board that also happens to be a gamepad with a small OLED display. It can be used standalone to play games on the tiny display, as a gamepad for a PC or game console, a control interface for quadcopters and robots, etc…
Microduino Joypad (main board) specifications:

  • MCU – Atmel ATMega328p/1284p/644p or 32U4 via Microduino Core, Core+, CoreUSB boards. (Not part of board but included in all perks)
  • Display I/F – TFT and OLED headers. Separate TFT display board included in all perks.
  • Controls – Left and right joysticks, 4 buttons, and left and right switches.
  • Audio – Microphone
  • Sensors – Light sensor, temperature sensor
  • USB – 2x micro USB ports: one for power and one for charging
  • Expansions – 2x UPin27 headers for Microduino shields.
  • Misc – Buzzer, vibrator, reset key, power switch, charging and power LEDs
  • Power – 5V – 1x micro USB for power, 1x Micro USB for charging, battery header

Tetris running on the Microduino Joypad
There’s very little information about programming the Joypad, but since it’s based on Microduino boards, you should be able to use the same Microduino interface and documentation. Microduino is also open source hardware with design files already available, and the company intends to make the Joypad open source hardware once it ships.

Arduino has already made a gamepad called Esplora, and if you wonder how it compares to the Microduino Joypad, the company listed the differences in the following table.

Esplora vs. Microduino Joypad comparison table

If you are interested, you can check out the Microduino Joypad Kickstarter campaign, where you can still pledge $54 for an early bird Joypad with a Microduino Core, a USB2TLL board (required for programming), and a USB cable, after which the kit will be available for $60. They also have various perks, including quadcopter and self-balancing robot kits going for $202 and $214 respectively. Shipping is free to the US, and $20 to the rest of the world, with delivery scheduled for October and November 2014.

Some Projects on Nvidia Jetson TK1 Development Board: Nintendo Emulator, USB3 Webcam, and Robotics

August 4th, 2014 6 comments

Nvidia Jetson TK1 is a development board powered by the company’s Tegra K1 quad-core Cortex-A15 processor, and especially its Kepler GPU that allows for OpenGL 4.4. It shipped to developers around April/May, and some of them have showcased their projects, or tested some hardware.

Dolphin Emulator on Nvidia Jetson TK1

Dolphin is an emulator for the Nintendo GameCube and Wii consoles that supports full HD (1080p) rendering, and runs on Windows, Linux and Mac OS, with an alpha version also available for Android. Ryan Houdek (Sonicadvance1), one of Dolphin’s developers, has leveraged Kepler’s OpenGL support via Nvidia’s GPU drivers to port the emulator to the platform running Ubuntu, but it should work as well on Tegra K1 hardware running Android, such as the Xiaomi MiPad tablet. You can watch the Mario Kart: Double Dash demo running at full speed on the Nvidia board below. According to the developer, such a framerate would not be achievable on the Qualcomm Snapdragon 800 because “Adreno Graphics Drivers are grossly inefficient compared to the TK1”.

The latest version of Dolphin for Android (Beta) is dated December 7, 2013, so I’d assume the optimizations shown above are not available right now. You can find more demos on Ryan Houdek’s YouTube channel.

USB3.0 Webcam @ 1080p30

Another developer, Korneliusz Jarzębski, has tested e-con Systems’ See3CAM_80 USB 3.0 HD camera connected to the board’s USB 3.0 port, using the camera’s “See3CAM” application. I understand that all that needed to be done was to enable hidraw support for USB devices in the Linux kernel, and it just worked out of the box. The application can perform real-time video processing, applying video filters (invert, particles, etc.), as well as changing image characteristics such as brightness, contrast and so on.

You can find a little more on his blog (Polish).

“Super-Computer-On-Legs” Robot

The last demo I’ll show today is a robot powered by the Jetson TK1 board that can walk to the nearest person it can see. The robot detects people via a camera and GPU-accelerated face detection (about 5 times faster than CPU-only face detection). Besides better performance, the robot is pretty power-efficient, as it only draws about 7 watts and lasts about 45 minutes powered by a small LiPo battery. The robot was showcased at the Robotics: Science and Systems conference last month, and while attendees were impressed by the performance and power consumption, they still noticed the board was a bit too big for most robots, especially quadcopters. But the platform clearly has potential, and Shervin Emami, the person behind the project who happens to work for… Nvidia, mentioned that work is being done on smaller Tegra K1 computer-on-modules that can be installed in a robot’s custom motherboard without unnecessary ports.

If you are interested in seeing more projects running on Jetson TK1 development board, you can consider following “Embedded Tegra & Jetson TK1 Blog” on Google+.