Archive

Posts Tagged ‘drone’

FOSDEM 2018 Open Source Developers Meeting Schedule

January 23rd, 2018 5 comments

FOSDEM (Free and Open Source Software Developers’ European Meeting) takes place every year on the first week-end of February, when developers meet for two days to discuss open source software projects. FOSDEM 2018 will take place on February 3-4 this year with 652 speakers, 684 events, and 57 tracks, an increase over last year’s 608 speakers, 653 events, and 54 tracks. There will be 8 main tracks namely: Community, History, Miscellaneous, Performance, Python, Security and Encryption, Space, and Global Diversity CFP Day.

There will also be 33 developer rooms, and since the full schedule is now available, I’ll make a virtual schedule mostly based on sessions from the Embedded, mobile, and automotive, Hardware Enablement, and Internet of Things devrooms.

Saturday, February 3, 2018

  • 09:50 – 10:15 – Turning On the Lights with Home Assistant and MQTT by Leon Anavi

In this presentation you will learn the exact steps for using the MQTT JSON Light component of the open source home automation platform Home Assistant for controlling lights through the machine-to-machine protocol MQTT. Practical examples for low cost devices combining open source hardware with free and open source software will be revealed. The presentation will provide a general overview of Home Assistant, details about the software integration of new devices to it through the MQTT protocol, and open source MQTT brokers such as Mosquitto. We will do a code review of an open source Linux daemon application for Raspberry Pi, written in the C programming language and based on the Paho library for the MQTT client and the piGPIO library used for pulse-width modulation (PWM) control of an RGB LED strip. We will compare it to an implementation of the same features for the ESP8266 WiFi microcontroller written as a sketch for the Arduino environment. Furthermore, the presentation will include details about reading data from various sensors and their setup in Home Assistant.
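
For readers who have not used this integration before, here is a minimal Python sketch (my own illustration, not code from the talk) of the device side of such a setup using the paho-mqtt library (1.x-style API); the broker address and topic names are assumptions that would have to match your Home Assistant MQTT JSON light configuration.

```python
# Hedged sketch: a device-side client for a Home Assistant "MQTT JSON light".
# Assumptions: a Mosquitto broker at 192.168.1.10 and command/state topics
# "home/rgb1/set" and "home/rgb1/state" configured in Home Assistant.
import json
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"          # assumed broker address
COMMAND_TOPIC = "home/rgb1/set"  # Home Assistant publishes commands here
STATE_TOPIC = "home/rgb1/state"  # the device reports its state here

def apply_to_hardware(payload):
    """Placeholder for the PWM code that would drive the RGB LED strip."""
    print("Would set LEDs to:", payload)

def on_connect(client, userdata, flags, rc):
    client.subscribe(COMMAND_TOPIC)

def on_message(client, userdata, msg):
    payload = json.loads(msg.payload.decode())
    apply_to_hardware(payload)
    # Echo the state back (retained) so Home Assistant shows current values
    client.publish(STATE_TOPIC, json.dumps(payload), retain=True)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```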

  • 10:25 – 10:50 – Accessing your Mbed device from anywhere using Pagekite by Bert Outtier

When looking at home automation solutions available on the market nowadays, one of the most important and expected features is to be able to control your home automation installation from anywhere in the world using a smartphone app. A vendor of a low-cost home automation solution requested us to add such a feature to their existing IP gateway product, which only allowed users to control their home automation system with their smartphone while connected to their local network at home. We were asked to make it possible for the smartphone app to connect to the IP gateway from anywhere in the world. This vendor’s IP gateway hardware and software were based on the Mbed platform, so they needed a solution that could fit within Mbed.

Since our client wanted an open-source, secure, low-cost and easy to set up solution that they could host themselves, we opted for Pagekite. However, since Mbed does not support OpenSSL, Linux sockets or libev, the existing libpagekite C library was not an option to start from. So we started to implement an Mbed flavour of the library ourselves, and decided to make it open source.

  • 11:00 – 11:30 – The free toolchain for the STM8 by Philipp Klaus Krause

The STM8 is a popular 8-bit architecture by ST Microelectronics commonly used in household electronics, automotive applications and industrial controls. For quite a while there were no free tools, and the irregular architecture makes it hard to support in GCC or LLVM. In recent years free tools for it started to appear, and they now form a free toolchain that surpasses preexisting non-free ones. The most important part is the Small Device C Compiler (SDCC). New tree-decomposition-based algorithms from recent compiler research have been implemented in SDCC, including a new register allocator particularly suited to irregular architectures with few registers. SDCC quickly surpassed the non-free compiler in standard compliance and OS support and generates substantially faster integer code. Programs can be flashed by stcgal (via a serial link on STM8 devices that have a bootloader) and stm8flash (via the SWIM interface of ST-LINK hardware). OpenOCD and GDB allow on-target debugging via the ST-LINK. IDEs complete the development environment. However, stcgal still needs non-free binary blobs for use with some devices and the ST-LINK has non-free firmware. SDCC still falls short in floating-point performance. While there are some ports of free RTOSes that use the free toolchain for the STM8, more would be desirable.

  • 11:30 – 12:00 – Building RT Linux distribution with Yocto by Pierre Ficheux

This talk will describe how to use PREEMPT_RT and Xenomai with the Yocto build system: building the image and SDK, developing a simple application, and testing performance.

Using the RT extension with Yocto is not that easy, because the linux-yocto-rt kernel is not usable on main embedded targets such as ARM (it works on the QEMU target only). Using Xenomai is much more complicated, as it needs several steps (patching the kernel, installing user-space libraries, building an extended SDK).

During the conference we will describe how to build a Yocto Linux image using PREEMPT_RT for popular boards such as the Raspberry Pi 3 or BeagleBone Black.

Some Xenomai support is provided by meta-eldk from DENX, but it supports only Xenomai 2.6. We will describe meta-xenomai as we maintain it for our customers (available on GitHub). This new meta-xenomai layer is based on Xenomai 3.x and a very recent kernel.

Then we will explain how to build a simple Xenomai application based on a periodic task. Finally, we will compare the performance of both extensions (PREEMPT_RT and Xenomai) on the same hardware.

  • 12:00 – 13:00 – How to keep your embedded Linux up and running? by Krzysztof Opasiak

Userspace software is imperfect, and we all know it. Running it for 5 minutes seems easy, but what about days or weeks? This problem has already given server guys a lot of sleepless nights. Nowadays the IoT and embedded Linux world is facing the very same problem. Unfortunately, solutions known from the server world (Nagios and friends) usually cannot be directly applied.

During this talk, Krzysztof will discuss problems related to monitoring and “healing” an embedded Linux distribution. First, the most common server approaches will be described. After that, Krzysztof will try to identify the key problems of applying these solutions to embedded platforms. Then Krzysztof will introduce faultd – a small but extendable daemon for system monitoring and CPR ;). How to use it? What can it do? What are the advantages and disadvantages? All those questions should be answered in this part. The last part is going to be a discussion of the presented idea and experience sharing.

  • 13:05 – 13:30 – A Guided Tour of Eclipse IoT: 3 Software Stacks for IoT by Benjamin Cabé

Whether you’re looking at the constrained devices that make for the “things” of the IoT, gateways that connect them to the Internet, or backend servers, there’s a lot that one needs to build for creating end-to-end IoT solutions. In this session, we will look at the typical software features that are specific to IoT, and see what’s available in the open source ecosystem (and more specifically Eclipse IoT) to implement them. A live demo of the Eclipse IoT Open Testbed for Asset Tracking will allow the audience to see some of the projects (such as Eclipse Kura, or Eclipse Kapua) in action.

  • 13:45 – 14:10 – Tizen:RT A lightweight RTOS platform for low-end IoT devices by Philippe Coval

The Tizen software platform has been designed to target consumer electronics, and since 2013 the OS has been powering many products on the market (from smart watches to TVs, cameras or even white goods). Even if this Linux based platform is very flexible, the Linux kernel has minimum size requirements, so Tizen can’t be deployed on constrained devices (ubiquitous microcontrollers).

To also target low-end devices, part of Tizen’s technology was rebased on the NuttX RTOS. Seamless connectivity is still provided by IoTivity, while new IoT features are becoming available to application developers too; this whole stack is Tizen:RT!

This presentation will give an overview of the Tizen ecosystem, and explain how to get started with Tizen:RT using QEMU and the SDK; finally, an IoT scenario will be demonstrated on the ARTIK 055s trusted system-on-module.

  • 14:25 – 14:50 – Eclipse IoT FOSS Platform for Cloud Based IoT Solutions by Steffen Evers

It is expected that in the next years billions of devices will be connected to the Internet of Things (IoT). Many of them will interact with cloud-based solutions to provide additional services on the devices or in the web. To bring IoT to the next level, technologies for supporting cross-domain/cross-vendor solutions are needed. There is already a lot of FOSS available to provide a technological base for building IoT solutions (e.g. Kubernetes). However, on top of it, software is needed for the connectivity challenges, support of domain-specific protocols, large scale messaging and device management, and integration with existing infrastructure. Eclipse IoT aims to address these needs and provide a FOSS IoT framework that makes IoT development fast and simple. In the last year Eclipse IoT has made a lot of progress, and the underlying environment in cloud technology has seen a lot of changes. In addition, upcoming challenges like automated driving and connected vehicles have resulted in new projects for better support of the automotive domain. This talk gives you an overview of the major Eclipse IoT projects and illustrates their capabilities with a short demo.

  • 15:05 – 15:30 – IoT.js – A JavaScript platform for the Internet of Things by Ziran Sun

IoT.js is a JavaScript platform that aims to provide inter-operable services for the IoT world. Powered by JerryScript, an ultra-lightweight modular JavaScript engine, the platform is designed to bring the success of Node.js to constrained IoT devices. To address interoperability, IoT.js provides a Node.js friendly architecture and comes with a subset of the Node.js APIs. Since Samsung OSG first presented IoT.js at FOSDEM in 2016, the platform has been through rapid growth in the last couple of years. With a lot of active, high-quality contributions from the IoT.js and JerryScript open source communities, IoT.js released version 1.0 in July 2017, which presented a rich set of features, hardware support and tools for developers. In this talk, we look at recent developments in IoT.js and share our vision and future plans. The talk is supported by a demo of IoT.js running on a constrained device that seamlessly connects to Node.js for third party cloud access.

  • 15:45 – 16:10 – The dark side of Internet of things by Dipesh Monga

With the advent of the Internet of Things, monitoring and controlling everything such as the coffee maker, lights, TV, fridge, etc. over the Internet has become child’s play. But are we really making our lives simpler, or diving into a vast ocean which is getting deeper and deeper? In today’s world, where the security of our data is a major concern and a number of websites are always tracking what we search for, what we watch, and our location, things are at least limited to data only; adding another dimension, i.e. physical entities, is really a big question.

From this talk the audience will take away an understanding of the privacy concerns related to IoT, and how they may be putting their personal information at risk by connecting their physical entities to the Internet. Is it really safe to connect things to the Internet?

  • 16:30 – 17:00 – Facing the Challenges of Updating Complex Systems by Enrico Jörns

Over the past three years, the growing zoo of Open Source update frameworks has made updating an embedded Linux system much easier. But the availability of a robust update tool solves only one step in the complex chain from a software artifact to an updated and working system on your devices.

Starting with a modern system consisting of a recent bootloader, kernel, init system and update tool, this talk ventures beyond the basic and already solved topics of A/B redundancy, atomicity, or simple update verification.

Enrico will present strategies for creating a robust update chain from automated testing up to full rollout management and show how to solve these challenges with recent Open Source software such as barebox, RAUC, systemd, hawkBit, casync and labgrid. You will learn how to deal with more modular and complex system setups, restricted systems, error recovery, product variants, re-signing for deployment, updating the bootloader itself and interaction with verified boot.

  • 17:00 – 18:00 – Multitasking on Cortex-M class MCUs, A deep-dive into the Chromium-EC OS by Moritz Fischer

We’re going to look at multi-tasking on small Cortex-M class MCUs like the ARM Cortex-M0. After a brief general overview of the Cortex-M0 programming model, exception handling and other required basics, we’ll start our deep-dive into one specific implementation in the Chromium-EC firmware. We’ll look at startup code, how tasks are implemented, and how to deal with priorities and peripheral interrupts.

The Chromium-EC firmware is a little (RT)OS that runs (mostly) on ARM cores of the Cortex-M class (M0/M3/M4), and powers Google’s Chromebooks as well as other devices (Project Sulfur SDR). Its permissive license makes it attractive for (ab)use in other projects, since kernel and U-Boot integration already exist.

  • 18:00 – 18:30 –  The Chromium project’s Way to Wayland by Maksim Sisov

Wayland is the most advanced X11-alternative display protocol, shipping today in a variety of desktop and embedded environments. Although the Chromium browser on Linux still defaults to using the X11 window system, there have been efforts to port it to different environments.

This effort happens in various fronts, including the development and stabilization of Ozone, an abstraction layer for graphics and input events, and the transitioning of some ChromeOS-oriented solutions to Linux, for example Chromium’s new “UI service”.

Igalia has been actively contributing to this multi-organizational collaboration, aiming at getting a full-fledged Chromium browser running natively on Wayland. The work happens on Chromium’s upstream repository so that the greater Chromium community can benefit from it.

  • 18:30 – 19:00 – GStreamer for tiny devices by Olivier Crête

GStreamer is a complete Open Source multimedia framework, and it includes hundreds of plugins, including modern formats like DASH, HLS or the first ever RTSP 2.0 implementation. The whole framework is almost 150MB on my computer, but what if you only have 5 megs of flash available? Is it a viable choice? Yes it is, and I will show you how.

I will start with simple tricks like only including the necessary plugins, and go all the way to statically compiling only the functions that are actually used, to produce the smallest possible footprint.

Sunday, February 4, 2018

  • 09:30 – 10:00 – Programming UEFI for dummies, what I have learned while tweaking FreePascal to output UEFI binaries by Olivier Coursière

With the upcoming end of legacy mode in UEFI firmware on PCs, every alternative or hobbyist operating system, bare metal programmer and wannabe OS developer will have to deal with UEFI on modern hardware. After presenting the binary format of UEFI applications, I will focus on the use of UEFI APIs through the EFI system table and UEFI protocols so you can get started.

  • 10:00 – 10:30 – Rustyarm AKA A project looking at Rust for Embedded Systems by Benedict Gaster (cuberoo_)

Rustyarm is a project in the Physical Computing group at the University of the West of England looking at the application of Rust on embedded microcontrollers. UWE Sense is a new hardware and software platform for IoT, built with ARM microcontrollers, Bluetooth LE and LoRaWAN, which runs a software stack written completely in Rust. While UWE Sense is a close-to-the-metal implementation, UWE Audio, a new hardware platform for studying high performance audio using ARM microcontrollers, uses Rust to implement a monadic reactive graph, supporting both an offline compiler and an embedded DSL. UWE Audio uses safe Rust, for example, describing domain clocks as generic associated types, providing both compile time guarantees that multiple streams will not be incorrectly sequenced at different sample rates, and the ability to dynamically compile for different parts of the system.

In this talk I will provide a high-level overview of the Rustyarm project, including how using Rust has made this project interesting, but also enabled providing guarantees with respect to the audio scheduler, for example. However, Rust has some shortcomings in the embedded domain, and we provide details on some of these and what we and the wider community are doing to address them. As an example of Rust’s application in the embedded domain we present early work on UWE Audio, a hardware and software platform for building digital music instruments, which as already noted is programmed solely in Rust.

  • 10:30 – 11:00 – How to build an autonomous robot for less than 2K€ by Miika Oja (PuluMan)

Telepresence, Delivery Boy, Security and Follow Me in one PULUrobot. PULUrobot solves the autonomous mobile robotics complexity issue without expensive parts and without compromise. By fearless integration and from-scratch design, our platform can do SLAM, avoid obstacles, feed itself, and carry a payload of over 100 kg, for less than 2000 EUR.

An application ecosystem can be born around it, as we offer a ready-made Open Source (GPLv2) solution in a tightly coupled HW-SW codesign. Pulu Robotics Oy was founded in July 2017, in Finland, to solve our own needs, with an efficient team of three. No one had prior knowledge of robotics.

By studying the market and other startups, we realized the common mistake is to use “robotic modules” as building blocks. They are highly expensive, provide little bang for the buck, are often inefficient, and require complex software middleware (such as ROS) as the glue in between. Due to our combined background in mechanical, electrical, software and manufacturing design, we took the approach of designing as much as possible by ourselves.

We are now selling the very first generation of robots for early adopters, hoping to give a kick-start to the open source community as soon as possible. Behind the curtains, we are focusing on the development of our next 3D sensor system, which will replace the current scanning 2D lidar with 360×90-degree full 3D distance data, and do it for the same price we currently pay for the Scanse 2D lidar used in the first small-scale production batch.

  • 11:00 – 11:30 – … like real computers! Making distributions work on single board computers by Andre Przywara

Installing an operating system on single board computers (SBCs or “Fruit-Pis”) is very board specific and requires a lot of hand holding. Standard distributions, if they support these boards at all, explicitly support only a small number of them, which leads to a lot of board specific images and distributions. This talk will show how this situation can be improved, to the point where off-the-shelf Linux (or BSD) distributions can be installed on those boards, without those distros knowing about each and every one of them. Key ingredients are standardized firmware interfaces like UEFI, stable device trees and on-board memory like SPI flash. This should make using ARM based SBCs as easy as using (x86) PCs today: like “real computers”. On top of this, ways to simplify and speed up mainline Linux kernel support are explored. Enabling kernel support for new SoCs usually takes a while, especially if the effort is driven by the community. This delays distribution support, up to a point where a certain SoC or board might become slightly dated when it’s finally supported. Using more device tree features and less hardcoded kernel data would reduce the code required to support new SoCs, ideally reaching a point where new SoCs could at least be booted with existing (distribution!) kernels, just by providing the proper device tree blob. This talk describes the idea and gives an example by looking at what can be done on Allwinner SoCs.

  • 11:30 – 12:00 – Booting it successfully for the first time with mainline by Enric Balletbo Serra

While things have gotten a lot better, new hardware bring-up sometimes still feels like pulling teeth. With the right methodology, tools and techniques, a significant amount of time, energy (and sanity) can be saved while enabling a new board to run Linux. In this talk, we’ll discuss the phased process involved in new board bring-up and the challenges it can pose, from reviewing initial schematic design to the successful upstreaming of any necessary bootloader and kernel patches. We’ll also provide some examples of the process based on a board that was recently made compatible with mainline.

  • 12:00 – 12:30 – SITL bringup with the IIO framework, bootstrapping a x86 based drone platform by Bandan Das

This talk is an introduction to using the Industrial I/O (IIO) framework to initialize sensors and acquire data to feed to a Software In The Loop (SITL) interface of drone software such as iNav/Cleanflight. Most flight controller boards are based on low power ARM microcontrollers, and the flight controller software is not usually based on Linux. However, with the availability of increasingly powerful boards with onboard sensors and multicore processors, using Linux based flight controller software can work to our advantage. Experimenting with onboard devices and scheduling algorithms can lead to interesting applications with minimal porting overhead to new architectures.

The talk starts with a quick overview of the IIO framework and using it to initialize the drivers for the onboard sensors of the Intel Aero platform, an x86 based flight controller board. Although not tied to the Aero board in any way, this talk will use it as an example to describe the onboard sensors and acquire data from them to successfully run a minimal SITL instance. The talk aims to explore how the IIO framework exposes data from these sensors and how users can utilize these interfaces, followed by a demo of the setup.
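
To give an idea of the interface involved (my own hedged illustration, not material from the talk), the IIO framework exposes raw sensor channels and scale factors as sysfs attributes that can be read from user space; the device index and channel names below are assumptions and differ from board to board.

```python
# Hedged sketch: read a raw accelerometer channel exposed by the IIO framework
# through sysfs and convert it with the reported scale. The device index and
# channel names are assumptions that vary per board and sensor driver.
from pathlib import Path

IIO_DEV = Path("/sys/bus/iio/devices/iio:device0")  # assumed device index

def read_attr(name):
    return float((IIO_DEV / name).read_text().strip())

def read_accel():
    # Raw counts multiplied by the scale give values in m/s^2
    scale = read_attr("in_accel_scale")
    return tuple(read_attr(f"in_accel_{axis}_raw") * scale for axis in "xyz")

if __name__ == "__main__":
    print("acceleration (m/s^2):", read_accel())
```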

  • 12:30 – 13:00 – Rapid SPI Device Driver Development over USB by Stefan Schmidt

On the quest for a cheap and easy way to connect some simple SPI devices to my laptop it was surprising to not find anything suitable available. The idea is neither new nor innovative and surely there must have been something already.

Maybe the use case was too special: connecting the SPI device to a Linux laptop over USB in order to develop an SPI kernel driver for it with a rapid development and test cycle. None of the solutions to access the SPI device over libusb in userspace would work for me. I needed an SPI master controller in kernelspace to work with the variety of devices and kernel subsystems.

After some research I settled on the MCP2210 chip, with its cheap and easy to get development boards and an out-of-tree driver as a good starting point. Maybe it is also something others are looking for, and it is surely worth demonstrating and explaining.

  • 13:00 – 14:00 – Implementing state-of-the-art U-Boot port, 2018 edition by Marek Vasut

This presentation is a practical guide to implementing a U-Boot port to a new system from scratch. U-Boot is the de-facto standard bootloader for embedded systems; there are plenty of U-Boot ports, yet the vast majority of them are implemented in a sub-optimal way. This talk first explains the U-Boot internals, the driver model (DM) and its interaction with the device tree (DT), as understanding these is vital to understanding the implementation of the core subsystems. The core subsystems are explained in detail afterward to allow developers to implement drivers the intended way, without hacks and workarounds. Unfortunately, not all systems have plenty of resources, but U-Boot caters for those as well. The final part of the talk discusses the U-Boot SPL, the preloader which initializes the hardware and DRAM and starts U-Boot, and the finer parts of this procedure, which tends to have plenty of pitfalls.

  • 14:00 – 15:00 – Image capture on embedded linux systems by Jacopo Mondi

Image capture is one of the most broad and complex fields of today’s computing applications. Capturing and displaying images with an embedded platform poses additional challenges, introduced by the rapidly increasing complexity of dedicated hardware blocks often found on modern Systems On Chip designed for mobile and industrial computing. Using real world examples of image sensors, connection buses and processing blocks this presentation provides an overview of current industry standard technologies with an introduction to Video4Linux2 kernel framework for driver development and its userspace APIs.

  • 15:00 – 16:00 – ARM64 + FPGA and more: Linux on the Xilinx ZynqMP by Luca Ceresoli

The Xilinx Zynq UltraScale+ MPSoC (aka ZynqMP) is a powerful and complex chip featuring 64-bit cores, 32-bit realtime cores, a large FPGA, a GPU, video codecs and dedicated power management and security units.

The main topics covered will be:

  • Overview of the hardware.
  • Available software support from Xilinx and from the community.
  • How the peculiar CPU+FPGA design effectively allows you to design “your own SoC”, with the technical steps to implement this with Linux.
  • Why booting is nontrivial on this SoC and the currently available ways to boot Linux.
  • Handling the H.264/H.265 hardware codecs.
  • GPU support issues.

Focus will be given to how much open source technologies can be used with the ZynqMP SoCs, why this matters, and the current status of open source resources with respect to the alternatives.

  • 16:00 – 16:50 – New GPIO interface for linux user space by Bartosz Golaszewski

Since Linux 4.8 the GPIO sysfs interface has been deprecated. Due to its many drawbacks and bad design decisions, a new user space interface has been implemented in the form of the GPIO character device, which is now the preferred method of interaction with GPIOs that can’t otherwise be serviced by a kernel driver. The character device brings in many new interesting features such as: polling for line events, finding GPIO chips and lines by name, changing & reading the values of multiple lines with a single ioctl (one context switch), and many more. In this presentation, Bartosz will showcase the new features of the GPIO UAPI, discuss the current state of libgpiod (user space tools for using the character device) and tell you why it’s beneficial to switch to the new interface.
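
To give a flavour of what the character device looks like from user space, here is a small hedged example using the Python bindings shipped with libgpiod (v1.x API); the chip name and line offset are assumptions for a typical SBC.

```python
# Hedged sketch using the libgpiod v1.x Python bindings: request a line as an
# output and toggle it. The chip name and line offset are assumptions.
import time
import gpiod

chip = gpiod.Chip("gpiochip0")   # assumed GPIO controller
line = chip.get_line(17)         # assumed line offset
line.request(consumer="blink-demo", type=gpiod.LINE_REQ_DIR_OUT)

for _ in range(5):
    line.set_value(1)
    time.sleep(0.5)
    line.set_value(0)
    time.sleep(0.5)

line.release()
chip.close()
```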

FOSDEM 2018 will take place at the ULB Solbosch Campus in Brussels, Belgium, attendance is free of charge, and no registration is required.

A Day at Chiang Mai Maker Party 4.0

December 6th, 2017 6 comments

The Chiang Mai Maker Party 4.0 is now taking place until December 9, and I went there today, as I was especially interested in the scheduled NB-IoT talk and workshop to find out the status of LPWA in Thailand. But there are many other activities planned, and if you happen to be in Chiang Mai in the next few days, you may want to check out the schedule on the event page or Facebook.

I’m going to go through what I’ve done today to give you a better idea about the event, or even the maker movement in Thailand.


The booths and activity area should be the same over the 4 days, but the talks, open activities, and workshops will be different each day. Today, people could learn how to solder in the activity area.
The event was not really big, with manufacturers/sellers like ThaiEasyElec, INEX, or Gravitech closer to the entrance…


… and slightly higher up in a different zone, companies and makers were showcasing their products or projects. I still managed to spend 5 interesting hours at the event attending talks and checking out the various projects.

I started my day with a talk entitled “Maker Movement in South East Asia” presented by William Hooi, previously a teacher, who founded One Maker Group and set up the first makerspace in Singapore, as well as helped introduce the Maker Faire to Singapore from 2012 onwards.


There were three parts to the talk: a history of the Maker movement (worldwide), the maker movement in Singapore, and whether making should be integrated into school curricula.
He explained that at first the government did not know about makers, so it was difficult to get funding, but eventually they jumped on the bandwagon, and are now pouring money into maker initiatives. One thing that surprised me in the talk is that makers used to hide their hobby for fear of being mocked by others, for example one person making an LED jacket, and another working on an Iron Man suit. The people around them would not understand why they would waste their time on such endeavors, but the makerspace and Maker Faire helped them find like-minded people. Some of the micro:bit boards apparently ended up in Singapore, and when I say some, I mean 100,000 units. Another thing that I learned is the concept of a “digital retreat for kids”, where parents send kids to make things with their hands – for example soldering – and not use smartphones or tablets at all, since they are already so accustomed to those devices.

Once I was done with the talk, I walked around, so I’ll report on some of the interesting projects I came across. I may write more detailed posts for some of the items later on.


A falling object detection demo using OpenCV on the software side, with a webcam connected to…


… an ASUS Tinker Board to handle fall detection, and an NVIDIA Jetson board for artificial intelligence. If a fall is detected, an alert is sent to the tablet, and the system also interfaces with the Xiaomi Mi Band 2.

Katunyou has also made a more compact product, still based on the Tinker Board, for nursing homes, or private homes where an elderly person may live alone. The person at the stand also organizes Raspberry Pi 3 workshops in Chiang Mai.

I found yet another product based on the Raspberry Pi 3 board. SRAN is a network security device made by Global Tech that reports threats from devices accessing your network using machine learning.


Nordic Technology House showcased a magic mirror based on a Raspberry Pi 3 and a webcam to detect your dance moves, but their actual product shown above is a real-time indoor air monitoring system that reports temperature, humidity, CO2, and PM2.5 levels, and can send alerts via LINE if thresholds are exceeded.

One booth had some drones, including the larger one above used to spray insecticides for the agriculture market.


There was also a large area dedicated to sewing machines, including some smarter ones where you can design embroidery on a tablet before sewing.

There were also a few custom ESP8266 or ESP32 boards, but I forgot to take photos.

The Maker Party is also a good place to go if you want to buy some boards or smart home devices.


Besides the Raspberry Pi Zero W / 3, ESP8266 boards and the ASUS Tinker Board seem to be popular items in Thailand. I could also spot a Sonoff wireless switch, and an Amazon Echo Dot, although I could confirm only English is supported, no Thai language.

BBC Micro:bit board and accessories can also be bought at the event.


M5Stack modules, and Raspberry Pi 3 Voice Kit were also for sale.


Books are also available for ESP32, Raspberry Pi 3, IoT, etc… in Thai language.


But if you can’t read Thai, there was also a choice of books in English about RPi, Arduino, Linux for Makers, IoT and so on. I then attended the second talk of the day: “NB-IoT” by AIS, one of the top telco companies in Thailand. Speakers included Phuchong Charoensub, IoT Marketing Specialist, and Pornsak Hanvoravongchai, Device Innovation Manager, among others. They went through various parts including a presentation of AIS’ current M2M business, what IoT will change (e.g. bringing in startups and makers), some technical details about NB-IoT, and the company’s offering for makers.

I’ll go into more details in a separate post tomorrow, but if you want to get started, the good news is that it’s now possible to pre-order a 1,990 THB Arduino Shield ($61) between December 6-9, and get it shipped on February 14, 2018. NB-IoT connectivity is free for one year, and will then cost 350 Baht (around $10) per year per device. However, there’s a cost to enable NB-IoT on LTE base stations, so AIS will only enable NB-IoT at some universities and maker spaces, meaning, for example, that I would most certainly not be able to use such a kit from home. An AIS representative told me there’s no roadmap for deployment; it will depend on the business demand for such services.

If you are lucky you may even spot one or two dancing dinosaurs at the event.

Cheap Evil Tech – WiFi Deauther V2.0 Board and Autonomous Mini Killer Drones

November 24th, 2017 10 comments

Most technological advances usually improve people’s lives, and with costs coming down dramatically over the years, they become available to more people. But technology can also be used for bad, for example by governments and some hackers. Today, I’ve come across two cheap hardware devices that could be considered evil. The first one is actually pretty harmless and can be used for education, but disconnects you from your WiFi, which may bring severe psychological trauma to some people, but should not be life threatening, while the other is downright scary with cheap targeted killing machines.

WiFi Deauther V2.0 board


Specifications for this naughty little board:

  • Wireless Module based on ESP8266 WiSoC
  • USB – 1x micro USB port (connector type changed, more stable)
  • Expansion – 17-pin header with 1x ADC, 10x GPIOs, power pins
  • Misc – 1x power switch,  battery status LEDs
  • Power Supply
    • 5 to 12V via micro USB port
    • Support for 18650 battery with charging circuit (Over-charge protection, over-discharge protection)
  • Dimensions – 10×3 cm

The board is pre-flashed with the open source ESP8266 deauther firmware, which allows you to perform a deauth attack with an ESP8266 against selected networks. You can select a target network or client, and the board will then disconnect that node constantly, either blocking the connection or slowing it down. You don’t need to be connected to the access point or know the password for it to work. You’ll find more details on how it works on the aforelinked GitHub page. Note: the project is a proof of concept for testing and educational purposes.

WiFi Deauther V2.0 board can be purchased on Tindie or Aliexpress for $10.80 plus shipping.

A.I. Powered Mini Killer Drones

The good news is that those do not exist yet (at least not for civilians), but the video shows what could happen once people, companies, or governments weaponize face recognition and drone technology to design mini drones capable of targeted killings. You could fit the palm-sized drones with a few grams of explosives (or lethal poison), tell them who to target, and once they found the target, they would land on the skull of the victim and trigger the explosive for an instant kill. Organizations or governments could also have armies of those drones for killing based on metadata obtained from phone records, social media posts, etc… The fictional video shows how those drones could work, and what may happen to society as a consequence.

Technology is already here for such devices. Currently you could probably get a $400+ DJI Spark drone to handle face recognition, but considering inexpensive $20+ miniature drones and $50 smart cameras are available (but not quite good enough right now), sub-$100 drones with face recognition should be up for sale in a couple of years. The explosive and triggering mechanism would still need to be added, but I’m not privy to the costs… Nevertheless, it should be technically possible to manufacture such a machine, even for individuals, for a few hundred dollars. Link to fictitious StratoEnergetics company.

The Future of Life Institute has published an open letter against autonomous weapons that has been signed by various A.I. and robotics researchers, and other notable endorsers, stating that “starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control”, but it will likely be tough to keep the genie in the bottle.

UFS 3.0 Embedded Flash to Support Full-Duplex 2.4GB/s Transfer Speeds

September 10th, 2017 3 comments

All my devices still rely on eMMC flash for storage, but premium smartphones, for example, make use of UFS 2.0/UFS 2.1 flash storage with performance similar to SSD, with Samsung UFS 2.0 storage achieving up to 850MB/s read speed, 260 MB/s write speed, and 50K/30K R/W IOPS. UFS 3.0 promises to roughly double the performance of UFS 2.0/2.1 with transfer rates of up to 2.4 GB/s, and separately, the UFS Card v2.0 standard should deliver UFS 2.1 performance on removable storage.

Image Source: Next Generation of Mobile Storage: UFS and UFS Card

Several Chinese and Taiwanese websites, including CTimes and Benchlife, have reported that companies have started getting UFS 3.0 & UFS Card v2.0 licenses from JEDEC, and Phison is working on a controller to support both new standards, which is scheduled to launch in 2018.

Premium smartphone SoCs are only expected to support UFS 3.0 in 2019 and beyond, and hopefully by that time eMMC will have been replaced by UFS 2.0/2.1 in entry level and mid range devices. The outlook for UFS cards is less clear, as I’ve yet to see a product equipped with a UFS card slot.


Based on a recent presentation at the Flash Memory Summit, (typical) embedded storage capacity will also increase to 32GB for IoT / multimedia applications, 256GB for smart home products and drones, 512GB for mobile devices, and over 1TB for automotive applications.

Via Liliputing

Kudrone Nano Drone Shoots “4K” Videos, Follows You With GPS (Crowdfunding)

March 24th, 2017 6 comments

Kudrone is a palm-sized drone equipped with a 4K camera that can follow you around for up to 8 minutes thanks to its 650 mAh battery by tracking your smartphone location via GPS. You can also take matters into your own hands by piloting the drone with your smartphone.

The drone also includes various sensors such as an accelerometer, a gyroscope, a magnetic compass, a sonar, and a vision positioning sensor enabling features such as auto hovering. Some of the specifications include:

  • Storage – Up to 64GB (micro SD card)
  • Connectivity – 802.11 b/g/n WiFi up to 80 meters
  • GNSS – GPS / GLONASS
  • Camera
    • Sony CMOS 1/3.2 image sensor (13MP)
    • F2.8 / H100 V78.5 / D:120 lens
    • Image resolution up to 3280 x 2464
    • Video resolution 4K, 2.7K, 1080p, 720p
  • Flight Parameters – Max altitude – 30 meters; hovering accuracy: +/- 0.1 meter
  • Battery – 650 mAh LiPo1S battery good for up to 8 minutes (but lower when the camera and GPS are on)
  • Dimensions – 174 x 174 x 43 mm

It’s a little odd that it records 4K videos while image resolution is limited to 3280×2464, so there may be some upscaling here, and the video quality is unlikely to match what most people would consider “4K”. You can see a video shot with the drone – but apparently not while flying – here, and it is limited to 1080p60 on YouTube.

Kudrone Team  provided a comparison pitting their drone against some other cheap nano drones, and some higher end drones by DJI and Parrot.

iPhone or Android mobile apps will allow you to control the drone, enable/disable features, and sync your photos and videos with your  smartphone. The preview is shown at 720p with a 160 ms delay.

The drone launched on Indiegogo several days ago, and has been pretty popular, having raised close to $700,000 with 21 days to go. All very early bird rewards at $99 are gone, but you can still get the drone for $109 with two propeller sets, a charger, a 16GB micro SD card, two batteries, and a pair of propeller protectors. Shipping adds $9 to the US or China, and $25 to the other countries I checked. Delivery is scheduled for July 2017. The drone is made by a company called Fujian Ruiven Technology, and Kudrone is not their first drone. However, you may want to check out the update section on Indiegogo to see pictures and video samples, as well as videos of the drone in action, to get a better idea of the drone’s current capabilities.


$80 BeagleBone Blue Board Targets Robots & Drones, Robotics Education

March 14th, 2017 3 comments

Last year, we reported that BeagleBoard.org was working with the University of California San Diego on the BeagleBone Blue board for robotics educational kits such as the EduMiP self-balancing robot and the EduRover four wheel robot. The board has finally launched, so we know the full details, and it can be purchased for about $80 on the Mouser, Element14 or Arrow websites.


BeagleBone Blue specifications:

  • SiP (System-in-Package) – Octavo Systems OSD3358 with TI Sitara AM3358 ARM Cortex-A8 processor @ up to 1 GHz,  2×32-bit 200-MHz programmable real-time units (PRUs), PowerVR SGX530 GPU, PMIC, and 512MB DDR3
  • Storage – 4GB eMMC flash, micro SD slot
  • Connectivity – WiFi 802.11 b/g/n, Bluetooth 4.1 LE (TI Wilink 8) with two antennas
  • USB – 1x USB 2.0 client and host port
  • Sensors – 9 axis IMU, barometer
  • Expansion
    • Motor control – 8x 6V servo out, 4x DC motor out, 4x quadrature encoder in
    • Other interfaces – GPIOs, 5x UARTs, 2x SPI, 1x I2C, 4x ADC, CAN bus
  • Misc – Power, reset and 2x user buttons; power, battery level & charger LEDs; 6x user LEDs; boot select switch
  • Power Supply – 9-18V DC input via power barrel; 5V via micro USB port; 2-cell LiPo support with balancing,
  • Dimensions & Weight – TBD

The board ships pre-loaded with Debian, but it also supports the Robot Operating System (ROS) & Ardupilot, as well as graphical programming via the Cloud9 IDE on Node.js. You’ll find more details, such as documentation, hardware design files, and example projects, on the BeagleBone Blue product page and GitHub.

The board is formally launched at Embedded World 2017, and Jason Kridner, Open Platforms Technologist/Evangelist at Texas Instruments, and co-founder and board member at BeagleBoard.org Foundation, uploaded a video starting with a demo of various robotics and UAV projects, before giving a presentation & demo of the board at the 2:10 mark using Cloud 9 IDE.


If you attend Embedded World 2017, you should be able to check out the board and demos at Hall 3A Booth 219a.

FOSDEM 2017 Open Source Meeting Schedule

January 31st, 2017 4 comments

FOSDEM (Free and Open Source Software Developers’ European Meeting) is a free 2-day event for software developers to meet, share ideas and collaborate that happens on the first week-end of February, meaning it will take place on February 4 & 5, 2017 this year. FOSDEM 2017 will feature 608 speakers, 653 events, and 54 tracks, with 6 main tracks namely: Architectures, Building, Cloud, Documentation, Miscellaneous, and Security & Encryption.
I won’t be there, but it’s always interesting to look at the schedule, and I made my own virtual schedule focusing especially on talks from “Embedded, mobile and automotive” and “Internet of Things” devrooms.

Saturday, February 4, 2017

  • 11:00 – 11:25 – Does your coffee machine speaks Bocce; Teach your IoT thing to speak Modbus and it will not stop talking, by Yaacov Zamir

There are many IoT dashboards out on the web; most will require a network connection to a server far far away, and use non-standard protocols. We will show how to combine free software tools and protocols from the worlds of IT monitoring, industrial control and IoT to create simple yet robust dashboards.

Modbus is a serial communication protocol developed in 1979 for use with programmable logic controllers (PLCs). In simple terms, it is a method used for transmitting information over serial lines between electronic devices. It is openly published, royalty-free, simple and robust.

Many industrial controllers can speak Modbus, and we can also teach “hobby” devices like Arduino boards and the ESP8266 to speak Modbus. A reliable, robust and simple free software Modbus client will be used to acquire the metrics from our device, then the metrics will be collected and sent to Hawkular and Grafana to store and visualize our data.
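
As a rough idea of what the client side of such a setup can look like (my own hedged sketch, not code from the talk), here is a Python example using the pymodbus library; the serial port, register address, unit id and register layout are assumptions, and the API details vary between pymodbus versions.

```python
# Hedged sketch: poll two holding registers from a Modbus RTU device over a
# serial line using pymodbus (2.x-style API). Port, baud rate, register
# address, unit id and register meaning are assumptions for illustration.
from pymodbus.client.sync import ModbusSerialClient

client = ModbusSerialClient(method="rtu", port="/dev/ttyUSB0",
                            baudrate=9600, timeout=1)
if client.connect():
    # Read 2 holding registers starting at address 0 from slave/unit 1
    result = client.read_holding_registers(address=0, count=2, unit=1)
    if not result.isError():
        temperature, humidity = result.registers  # assumed register layout
        print("temperature:", temperature, "humidity:", humidity)
    client.close()
```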

  • 11:30 – 11:55 – Playing with the lights; Control LIFX WiFi-enabled light bulbs, by Louis Opter

In this talk we’ll take a close look at one of the “smart” (WiFi-connected) light bulbs available on the market today. The bulbs expose a small API over UDP that I used to run an interface on a programmable buttons array. We will see how topics like reverse engineering, security, licensing, “self-hosting” and user experience came into play.

monolight is a user interface to control LIFX WiFi-enabled light bulbs. monolight runs on a programmable button array; it is written in Python 3.6 (to have type annotations and asyncio), and it interfaces with the bulbs through a more complex daemon written in C: lightsd.

This talk will start with a live demo of the button grid remotely controlling the light bulbs. We will then explore how it works and some of the motivations behind it (network isolation, trying to not depend on the “cloud”, reliability, user experience). Finally, we will look into what kind of opportunities even more open IoT products could bring, and leave the floor open to Q&A and discussion.

  • 12:00 – 12:30 – Creating the open connected car with GENIVI, by Zeeshan Ali, GENIVI Development Platform (GDP) technical lead

A number of new components have matured in GENIVI to provide a true connected car experience. A couple of them are key connectivity components; namely SOTA (Software Over the Air) and RVI (Remote Vehicle Interface). This talk will discuss both these components, how they work together, the security work done on them and their integration into the GENIVI Development Platform.

This talk will also run down the overall status of GENIVI’s development platform and how it can enable an automotive stack to speak not just with the cloud, but with IoT devices via the IoTivity interface.

  • 12:30 – 13:00 – Making Your Own Open Source Raspberry Pi HAT; A Story About Open Source Harware and Open Source Software, by Leon Anavi

This presentation will provide guidelines on how to create an open source hardware add-on board for the most popular single board computer, the Raspberry Pi, using free and open source tools from scratch. The Raspberry Pi Foundation’s specifications for HATs (Hardware Attached on Top) will be explained in detail. Leon Anavi has been developing an open source Raspberry Pi HAT for IoT for more than a year, and now he will share his experience, including the common mistakes for a software engineer getting involved in hardware design and manufacturing. The presentation is appropriate for anyone interested in building entirely open source products that feature open source hardware and open source software. No previous experience or hardware knowledge is required. The main audience are developers, hobbyists, makers, and students. Hopefully the presentation will encourage them to grab a soldering iron and start prototyping their DIY open source device.

  • 13:00 – 13:25 – Building distributed systems with Msgflo; Flow-based-programming over message queues, by Jon Nordby

MsgFlo is a tool to build systems that span multiple processes and devices, for instance IoT sensor networks. Each device acts as a black-box component with input and output ports, mapped to MQTT message queues. One then constructs a system by binding the queues of the components together. Focus on components exchanging data gives good composability and testability, both important in IoT. We will program a system with MsgFlo using Flowhub, a visual live-programming IDE, and test using fbp-spec.

In MsgFlo each process/device is an independent participant, receiving data on input queues, and sending data on output queues. A participant does not know where the data comes from, nor where (if anywhere) the data will go. This strong encapsulation gives good composability and testability. MsgFlo uses a standard message queue protocol (MQTT or AMQP). This makes it easy to use with existing software. As each participant is its own process and communicates over the network, they can be implemented in any programming language. Convenience libraries exist for C++, Python, Arduino, Node.js and Rust. On top of the message queue protocol, a simple discovery mechanism is added. For existing devices without native MsgFlo support, the discovery messages can be sent by a dedicated tool.

  • 13:30 – 13:55 – 6LoWPAN in picoTCP, and how to support new Link Layer types, by Jelle De Vleeschouwer

6LoWPAN enables, as the name implies, IPv6 communication over Low-power Wireless Personal Area Networks, e.g. IEEE 802.15.4. A lot of resources are available to allow 6LoWPAN over IEEE 802.15.4, but how can one extend the 6LoWPAN feature set for use with other link layer types? This talk will cover the details of a generic implementation that should work with every link layer type and how one can provide support for one’s own custom wireless network. The goal is to give quite a technical and detailed talk, with finally a discussion about when 6LoWPAN is actually useful and when it is not.

Last year, as a summer project, a generic 6LoWPAN adaption layer was implemented into picoTCP, an open source embedded TCP/IP-stack developed by Altran Intelligent Systems, with an eye on the IoT. The layer should also be able to allow multiple link-layer extensions, for post-network-layer processing. This could be used for mesh-under routing, link layer security, whatever you want. This talk will cover how one can take advantage of these features and caveats that come with it.

  • 14:00 – 15:00 – Groking the Linux SPI Subsystem by Matt Porter

The Serial Peripheral Interface (SPI) bus is a ubiquitous de facto standard found in many embedded systems produced today. The Linux kernel has long supported this bus via a comprehensive framework which supports both SPI master and slave devices. The session will explore the abstractions that the framework provides to expose this hardware to both kernel and userspace clients. The talk will cover which classes of hardware are supported and which use cases are outside the scope of the subsystem today. In addition, we will discuss subtle features of the SPI subsystem that may be used to satisfy hardware and performance requirements in an embedded Linux system.
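
For context (a hedged illustration of mine, not material from the talk), the most common way userspace clients reach the SPI framework is through the spidev interface, which can be driven from Python; the bus/device numbers and the JEDEC read-ID command below are assumptions.

```python
# Hedged sketch: talk to an SPI device through the spidev user-space interface
# using the Python "spidev" module. The bus/device numbers and the 0x9F
# (JEDEC read-ID) command are assumptions for illustration only.
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                 # assumed: /dev/spidev0.0
spi.max_speed_hz = 1_000_000
spi.mode = 0

# Send the JEDEC "read identification" command and clock out 3 ID bytes
resp = spi.xfer2([0x9F, 0x00, 0x00, 0x00])
print("ID bytes:", [hex(b) for b in resp[1:]])

spi.close()
```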

  • 15:00 – 15:25 – Frosted Embedded POSIX OS; a free POSIX OS for Cortex-M embedded systems, by Brabo Silvius

FROSTED is an acronym that means “FRee Operating System for Tiny Embedded Devices”. The goal of this project is to provide a free kernel for embedded systems, which exposes a POSIX-compliant system call API. In this talk I aim to explain why we started this project, the approach we took to separate the kernel and user space on Cortex-M CPUs without an MMU, and showcase the latest improvements on networking and supported applications.

  • 15:30 – 16:00 – How to Build an Open Source Embedded Video Player, by Michael Tretter

Video playback for embedded devices such as infotainment systems and media centers demands hardware accelerators to achieve reasonable performance. Unfortunately, vendors provide the drivers for the accelerators only as binary blobs. We demonstrate how we built a video playback system that uses hardware acceleration on i.MX6 by using solely open source software including Gstreamer, Qt QML, the etnaviv GPU driver, and the coda video decoder driver.

The Qt application receives the video streams from a Gstreamer pipeline (using playbin). The Gstreamer pipeline contains a v4l2 decoder element, which uses the coda v4l2 driver for the CODA 960 video encoder and decoder IP core (VPU in the Freescale/NXP Reference Manual), and a sink element to make the frames available to the Qt application. The entire pipeline, including the Gstreamer to Qt handover, uses dma_bufs to avoid copies in software. This example shows how to use open source drivers to ease the development of video and graphics applications on embedded systems.
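
To make the playbin-based pipeline above a little more concrete, here is a hedged Python/PyGObject sketch of a minimal player; it mirrors the described setup in spirit only (the actual demo is a Qt QML application in C++), and the file path is an assumption. On an i.MX6 with the proper drivers, playbin would pick the available v4l2 decoder element automatically.

```python
# Hedged sketch: play a video with GStreamer's playbin from Python/PyGObject.
# The real demo described above uses a Qt QML application; this only
# illustrates the playbin pipeline. The file path is an assumption.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch("playbin uri=file:///home/user/video.mp4")
pipeline.set_state(Gst.State.PLAYING)

# Block until end-of-stream or error, then shut the pipeline down
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```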

  • 16:00 – 16:25 – Project Lighthouse: a low-cost device to help blind people live independently, by David Teller

The World Health Organization estimates that more than 250 million people suffer from vision impairment, 36 million of them being entirely blind. In many cases, their impairment prevents them from living independently. To complicate things further, about 90% of them are estimated to live in low-income situations.

Project Lighthouse was started by Mozilla to try and find low-cost technological solutions that can help vision-impaired people live and function on their own. To this date, we have produced several prototypes designed to aid users in a variety of situations. Let’s look at some of them. This will be a relatively low-tech presentation.

  • 16:30 – 16:55 – Scientific MicroPython for Microcontrollers and IoT, IoT programming with Python, by Roberto Colistete Jr

MicroPython is an implementation of Python 3 optimised to run on microcontrollers, created in 2013 by the physicist Damien P. George. MicroPython boards run MicroPython on the bare metal and give you a low-level Python operating system running an interactive prompt or scripts.

The MicroPython boards currently use 32-bit microcontrollers clocked at MHz speeds and with RAM limited to tens or hundreds of kilobytes. These are the microcontroller boards with official MicroPython support at the beginning of 2017: Pyboard, Pyboard Lite, WiPy 1/2, ESP8266, BBC Micro:bit, LoPy, SiPy, FiPy. They cost between USD 3 and 40, are very small and light (a few to tens of mm in each dimension and about 5-10 g), and have low power consumption, so MicroPython boards are affordable and can be embedded in almost anything, almost anywhere.

Some hints will be given to the FOSS community to be open-minded about MicroPython: be aware that MicroPython exists, that MicroPython is a better programming option than Arduino in many ways, that MicroPython boards are available and affordable, that more Python 3 scientific modules could be ported to MicroPython, and that MicroPython combines well with IoT.
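
To give a flavour of what MicroPython code looks like on these boards (a generic hedged sketch, not from the talk), blinking an LED and reading an analog input only takes a few lines; the pin number and ADC channel are assumptions that differ between the Pyboard, ESP8266 and other ports.

```python
# Hedged MicroPython sketch: blink an LED and read an analog input.
# Pin numbers and the ADC channel are assumptions; they differ between ports
# (Pyboard, ESP8266, micro:bit, ...).
import time
from machine import Pin, ADC

led = Pin(2, Pin.OUT)   # assumed LED pin
sensor = ADC(0)         # assumed ADC channel (ESP8266-style)

while True:
    led.value(not led.value())             # toggle the LED
    print("sensor reading:", sensor.read())
    time.sleep(1)
```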

  • 17:00 – 17:25 – Iotivity from devices to cloud; how to make IoT ideas to real using FLOSS, by Philippe Coval & Ziran Sun (Samsung)

The OCF/IoTivity project aims to answer interoperability issues in the IoT world in many different contexts, accommodating a huge range of devices from microcontrollers to consumer electronics such as Tizen wearables or your powerful GNU/Linux system. The vision of IoTivity is not restricted to ad hoc environments: devices can also be connected to the Internet, making the service easily accessible by other parties. With cloud access in place, usage scenarios for IoT devices can be enriched immensely.

In this talk we walk through the steps on how to practically handle IoT use cases tailored towards various topologies. To introduce the approach used in IoTivity, we first give a detailed background introduction to the IoTivity framework. Then we will present a demo that shows a few examples, from setting up a basic smart home network to accessing the IoT resource via a third party online service. Challenges and solutions will be addressed from development and implementation aspects for each step of the demo.

We hope this talk will inspire developers to create new IoT prototypes using FLOSS.

  • 17:30 – 17:55 – Open Smart Grid Platform presentation, an Open source IoT platform for large infrastructures, by Jonas van den Bogaard

The Open Smart Grid Platform is an open source IoT platform. The open smart grid platform is a generic IoT platform, built for organizations that manage and/or control large-scale infrastructures. The following use cases are now readily available: smart lighting, smart metering, tariff switching, and microgrids. Furthermore, the following use cases are in development: distribution automation, load management and smart device management. The architecture of the open smart grid platform is modular and consists of multiple layers.

The open smart grid platform is highly unique for embracing the open source approach and the following key features:

  • Suitable for scalable environments, delivering high performance
  • High availability and multi-tenant architectures
  • Built with security by design, and regularly tested
  • A generic architecture: more use cases and domains are easily added to the platform
  • Based on open standards where possible

We believe the platform is interesting for developers who are interested in working on use cases for smart cities, utility companies, and other operators of large-scale infrastructure.

  • 18:00 – 19:00 – AGL as a generic secured industrial embedded Linux; factory production line controllers requirements are not that special, by Dominig ar Foll

There is no de facto secured embedded Linux distro, while the requirement is becoming more and more critical with the rise of IoT in industrial domains. When looking under the hood of the Yocto-built AGL (Automotive Grade Linux) project, it is obvious that it can fit 95% of the most common requirements of a secured embedded Linux. We will look at how non-automotive industries can easily reuse the AGL code and tools to build their own industrial products, and why it’s a safer bet than building one internally.

Industrial IoT cannot be successful without a serious improvement in security coverage. Unfortunately, there is as of today no off-the-shelf offering, and the skills required to create such a solution are at best rare, more often out of reach. AGL has created a customizable embedded Linux distro which is nicely designed for reuse in many domains outside of automotive. During the presentation we will see how to:

  • start your development with boards readily available on the Net,
  • change the BSP and add peripherals using Yocto layers or projects like MRAA,
  • integrate a secure boot in your platform,
  • add your middleware and your application without breaking the maintained core OS,
  • develop a UI on the integrated screen and/or an HTML remote browser,
  • update the core OS and your add-ons,
  • get support and influence the project.
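
As a small taste of the kind of peripheral access the MRAA point above refers to, here is a hedged sketch using MRAA’s Python bindings to toggle a GPIO pin. The pin number is a placeholder that depends on the board and its MRAA board definition, and this is only an illustration of the library, not material from the talk.

```python
# Toggle a GPIO with the MRAA library's Python bindings (pin number is board-specific).
import time
import mraa

pin = mraa.Gpio(23)        # placeholder pin number; check your board's pinout
pin.dir(mraa.DIR_OUT)      # configure the pin as an output

for _ in range(10):
    pin.write(1)
    time.sleep(0.5)
    pin.write(0)
    time.sleep(0.5)
```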

Sunday 5, 2017

  • 10:00 – 11:00 – How I survived to a SoC with a terrible Linux BSP, Working with jurassic vendor kernels, missing pieces and buggy code, by Luca Ceresoli

In this talk Luca will share some of his experiences with such vendor BSPs, featuring jurassic kernels, non-working drivers, non-existing bootloaders, code of appallingly bad quality, ineffective customer support and Windows-only tools. You will discover why he spent weeks in understanding, fixing and working around BSPs instead of just using them. The effects on the final product quality will be described as well. Luca will also discuss what the options are when you face such a BSP, and what both hackers and vendors can do to improve the situation for everybody’s benefit.

  • 11:00 – 12:00 – Open Source Car Control, by Josh Hartung

This fall my team launched the Open Source Car Control (OSCC) project, a by-wire control kit that makes autonomous vehicle development accessible and collaborative for developers at every level. In this presentation, we discuss the project and its implications for the development of autonomous cars in a vertically integrated and traditionally closed industry.

A primary barrier to entry in autonomous vehicle development is gaining access to a car that can be controlled with an off-the-shelf computer. Purchasing from an integrator can cost upwards of $100K, and DIY endeavors can result in unreliable and unsafe solutions. The OSCC project acts as a solution to these problems. OSCC is a kit of open hardware and software (based on Arduino) that can be used to take control of the throttle, brake, and steering in modern cars. The result is a fully by-wire test car that can be built for about $10K (USD), including the vehicle. In this discussion, we unpack the impetus and development of the OSCC project, challenges we encountered during development, and the role projects like OSCC have in a necessary “flattening” of the automotive industry.
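
To make the “off-the-shelf computer takes control of the car” idea concrete, here is a purely illustrative sketch of a host-side sender using the python-can library, assuming the kit’s Arduino modules listen for command frames on a CAN bus. The arbitration ID, payload layout, and scaling below are hypothetical placeholders, not the actual OSCC protocol.

```python
# Illustrative only: send a throttle command frame on a SocketCAN interface.
# The arbitration ID and payload layout are hypothetical placeholders, not the OSCC protocol.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

def send_throttle(percent):
    """Scale a 0-100% request into a single data byte and send it."""
    value = max(0, min(100, int(percent)))
    msg = can.Message(arbitration_id=0x62, data=[value], is_extended_id=False)
    bus.send(msg)

send_throttle(15)   # request 15% throttle (placeholder semantics)
```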

  • 12:00 – 13:00 – Kernel DLC Metrics, Statistic Analysis and Bug-Patterns, by Nicholas Mc Guire

SIL2LinuxMP strives to qualify a defined GNU/Linux subset for use in safety-related systems by “assessment of non-compliant development”. To demonstrate that the kernel has achieved suitable reliability and correctness properties, basic metrics of such properties and their statistical analysis can be used as part of the argument. Linux has a wealth of analytical tools built into it that allow extracting information on compliance, robustness of development, as well as basic metrics on complexity or correctness with respect to defined properties. While IEC 61508 Ed 2 always pairs testing and analysis, we believe that for a high-complexity system traditional testing is of relatively low effectiveness, and analytical methods need to be the primary path. To this end, we outline some of the approaches taken:

  • Bug-age analysis
  • Bug-rates and trend analysis
  • Code-complexity/bug relationship
  • Brain-dead correctness analysis
  • Interface and type-correctness analysis
  • API compliance analysis
  • Analysis of build-bot data

While much of the data points to robust and mature code, there are also some areas where problems popped up. In this talk we outline the methods used and give examples as well as key findings. FLOSS development has reached quite an impressive level of maturity; to go substantially beyond it, we think the use of quantitative process and code metrics will be needed – these results from SIL2LinuxMP may be a starting point.
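
To make the idea of bug-age analysis concrete, here is a rough sketch (not the SIL2LinuxMP tooling) of how one could estimate how long bugs lived in a kernel tree: for every commit carrying a “Fixes:” tag, measure the time back to the commit it fixes. It assumes a local kernel git checkout (the path and revision range are placeholders) and simply shells out to git.

```python
# Rough sketch of a bug-age metric: for each commit with a "Fixes: <sha>" tag,
# measure the time between the fixed commit and the fix itself.
# Not the SIL2LinuxMP tooling; just an illustration of the idea.
import re
import subprocess

def commit_time(repo, sha):
    """Return a commit's author time as a Unix timestamp."""
    out = subprocess.check_output(
        ["git", "-C", repo, "show", "-s", "--format=%at", sha], text=True)
    return int(out.strip())

def bug_ages(repo, rev_range="v4.9..v4.10"):
    """Yield bug ages in days for commits carrying a Fixes: tag in rev_range."""
    log = subprocess.check_output(
        ["git", "-C", repo, "log", "--format=%H%n%b%n==END==", rev_range], text=True)
    for entry in log.split("==END=="):
        lines = entry.strip().splitlines()
        if not lines:
            continue
        fix_sha = lines[0]
        for m in re.finditer(r"^Fixes:\s+([0-9a-f]{8,40})", entry, re.MULTILINE):
            try:
                age = commit_time(repo, fix_sha) - commit_time(repo, m.group(1))
                yield age / 86400.0
            except subprocess.CalledProcessError:
                pass  # the fixed commit may not be reachable in this checkout

if __name__ == "__main__":
    ages = list(bug_ages("/path/to/linux"))
    if ages:
        print("fixes analysed:", len(ages), "mean age (days):", sum(ages) / len(ages))
```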

  • 13:00 – 14:00 – Loco Positioning: An OpenSource Local Positioning System for robotics, presentation with a demo of autonomous Crazyflie 2.0 quadcopter, by Arnaud Taffanel

Positioning in robotics has always been a challenge. Outdoors, GPS solves most of the practical problems for robots, but indoors, precise localization is still done using expensive proprietary systems, mainly based on arrays of cameras.

In this talk, I will present the loco positioning system: an open source Ultra Wide Band radio-based local positioning system, why we need it and how it works. I will also speak about its usage with the Crazyflie 2.0 open source nano quadcopter, of course ending with an autonomous flying demo.
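
For a feel of what an autonomous hop looks like from the client side, here is a short, hedged sketch using the Crazyflie’s cflib Python library. The radio URI is a placeholder and the flight sequence is purely illustrative; it assumes an absolute positioning setup such as Loco Positioning is already configured on the quadcopter.

```python
# Minimal autonomous hop with a Crazyflie 2.0 using cflib (URI is a placeholder).
# Assumes a positioning system (e.g. Loco Positioning anchors) is already set up.
import time
import cflib.crtp
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = "radio://0/80/2M"   # placeholder radio address

cflib.crtp.init_drivers()
with SyncCrazyflie(URI) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.forward(0.5)   # fly 0.5 m forward
        time.sleep(1)
        mc.back(0.5)      # and return
        # leaving the MotionCommander context lands the quadcopter
```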

  • 14:00 – 14:50 – Free Software For The Machine, by Keith Packard

The Machine is a hardware project at Hewlett Packard Enterprise which takes a new look at computer architecture. With many processors and large amounts of directly addressable storage, The Machine program has offered an equally large opportunity for developing new system software. Our team at HPE has spent the better part of two years writing new software and adapting existing software to expose the capabilities of the hardware to application developers.

As directly addressable storage is such a large part of the new hardware, this presentation will focus on a couple of important bits of free software which expose that to applications, including our Librarian File System and Managed Data Structures libraries. Managed Data Structures introduces a new application programming paradigm where the application works directly on the stable storage form for data structures, eliminating serialization and de-serialization operations.

Finally, the presentation will describe how the hardware is managed, from sequencing power to a rack full of high-performance computing hardware, through constructing custom Linux operating systems for each processor and managing all of them as parts of a single computing platform.

  • 15:00 – 15:25 – Diving into the KiCad source code, by Maciej Sumiński

Let’s be sincere, all of us would love to change something in KiCad. I bet you have an idea for a new tool or another killer feature that would make your life so much easier.

You know what? You are free to do so! Even more, you are welcome to contribute to the project, and it is not as difficult as one may think. Those who have browsed the source code might find it overwhelming at first, but the truth is: you do not have to know everything to create useful extensions.

I would like to invite you for a walk through the KiCad source code to demonstrate how easy it is to add this tool you have always been dreaming about.

  • 15:30 – 16:00 – Testing with volcanoes – Fuego+LAVA, embedded testing going distributed, by Jan-Simon Möller

LAVA and Fuego are great tools individually already. Combining and extending them allows for a much broader test coverage than each tool alone can provide.

The focus of this talk is to share the experiences made and lessons learned so people can integrate such tools better in their own environment. It also raises the pain-points and open issues when setting up a distributed environment.

Especially for Automotive, Long-Term-Support, CIP or Consumer Electronics, advancing the Test-harness is essential to raise the bar and strengthen the confidence in our embedded platforms. Automated testing can improve our ecosystem from two sides: during development (feature does work and does not break things) and during maintenance (no regressions through backports).

  • 16:00 – 16:30 – Adding IEEE 802.15.4 and 6LoWPAN to an Embedded Linux Device, by Stefan Schmidt

Adding support for IEEE 802.15.4 and 6LoWPAN to an embedded Linux board opens up new possibilities to communicate with tiny, IoT-type devices.

Bringing IP connectivity to devices, like sensors, with just a few kilobytes of RAM and limited battery power is an interesting IoT challenge. With the Linux-wpan and 6LoWPAN subsystems, Linux is ready to support the needed wireless standards as well as the protocols that connect these tiny devices to the wider Internet, making Linux a practical border router or smart home hub for such networks.

This talk will show how to add the needed transceiver hardware to an existing board, and how to enable and configure the Linux-wpan and 6LoWPAN mainline subsystems to use it. The demonstration will also include setting up communication between Linux and other popular IoT operating systems like RIOT or Contiki.
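
Once the lowpan interface is up, the appealing part is that the tiny nodes are just IPv6 hosts, so talking to one from the Linux border router needs nothing more exotic than a standard socket. The hedged sketch below shows the idea; the node’s link-local address, the interface name, the port, and the request string are all placeholders for illustration.

```python
# Query a 6LoWPAN sensor node over plain IPv6/UDP from the Linux side.
# The node address, interface name, port, and payload are illustrative placeholders.
import socket

NODE = "fe80::212:4b00:1ca1:beef"   # placeholder link-local address of the node
IFACE = "lowpan0"                   # 6LoWPAN interface created on top of wpan0
PORT = 5683

sock = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
sock.settimeout(5)
# Link-local addresses need a scope id, i.e. the outgoing interface.
scope = socket.if_nametoindex(IFACE)
sock.sendto(b"read-temp", (NODE, PORT, 0, scope))
data, addr = sock.recvfrom(128)
print("reply from", addr[0], ":", data)
```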

  • 16:30 – 17:00 – OpenPowerlink over Xenomai, by Pierre Ficheux

Industrial Ethernet is a successor to classic fieldbuses such as CAN, MODBUS or PROFIBUS. POWERLINK was created by B&R Automation and provides performance and real-time capabilities based on standard Ethernet hardware. openPOWERLINK is open source and runs on lots of platforms such as Linux, Windows, various RTOSes, and dedicated hardware (FPGA). We will explain how to use openPOWERLINK on top of Xenomai 3, a powerful real-time extension for the Linux kernel based on co-kernel technology.

FOSDEM 2017 will take place at the ULB Solbosch Campus in Brussels, Belgium. No registration is required; you just need to show up in order to attend the event.

JeVois-A33 is a Small Quad Core Linux Camera Designed for Computer Vision Applications (Crowdfunding)

December 27th, 2016 8 comments

JeVois Neuromorphic Embedded Vision Toolkit – developed at iLab at the University of Southern California – is an open source software framework to capture and process images through a machine vision algorithm, primarily designed to run on embedded camera hardware, but also supporting Linux boards such as the Raspberry Pi. A compact Allwinner A33 camera has now been designed to run the software, for use in robotics and other projects requiring a lightweight and/or battery-powered camera with computer vision capabilities.

JeVois-A33 camera specifications:

  • SoC – Allwinner A33 quad-core ARM Cortex A7 processor @ 1.35GHz with VFPv4 and NEON, and a dual core Mali-400 GPU supporting OpenGL-ES 2.0.
  • System Memory – 256MB DDR3 SDRAM
  • Storage – micro SD slot for firmware and data
  • 1.3MP camera capable of video capture at
    • SXGA (1280 x 1024) up to 15 fps (frames/second)
    • VGA (640 x 480) up to 30 fps
    • CIF (352 x 288) up to 60 fps
    • QVGA (320 x 240) up to 60 fps
    • QCIF (176 x 144)  up to 120 fps
    • QQVGA (160 x 120) up to 60 fps
    • QQCIF (88 x 72) up to 120 fps
  • USB – 1x mini USB port used for power and to act as a UVC webcam
  • Serial – 5V or 3.3V (selected through VCC-IO pin) micro serial port connector to communicate with Arduino or other MCU boards.
  • Power – 5V (3.5 Watts) via USB port; requires a USB 3.0 port or a Y-cable to two USB 2.0 ports
  • Misc
    • Integrated cooling fan
    • 1x two-color LED: Green: power is good. Orange: power is good and camera is streaming video frames.
  • Volume – 28 cc or 1.7 cubic inches (plastic case included, with 4 holes for secure mounting)

The camera runs Linux with the drivers for the camera, the JeVois C++17 video capture, processing & streaming framework, OpenCV 3.1, and toolchains. You can either connect it to a host computer’s USB port to check out the camera output (actual image + processed image), or to an MCU board such as Arduino via the serial interface to use machine vision to control robots, drones, or other devices. Currently three modes of operation are available:

  • Demo/development mode – the camera outputs a demo display over USB that shows the results of its analysis, potentially along with simple data over serial port.
  • Text-only mode – the camera provides no USB output, but only text strings, for example, commands for a pan/tilt controller.
  • Pre-processing mode – The smart camera outputs video that is intended for machine consumption, and potentially processed by a more powerful system.

The smart camera can detect motion, track faces and eyes, detect & decode ArUco markers & QR codes, detect & follow lines for autonomous cars, and more. Since the framework is open source, you’ll also be able to add your own algorithms and modify the firmware. Some documentation has already been posted on the project’s website. The best way to see the capabilities of the camera and software is to watch the demo video below.
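
To give an idea of how the text-only mode might be consumed, here is a hedged sketch that reads lines from the camera’s serial-over-USB port on a host PC with pyserial. The device path, baud rate, and message format are assumptions chosen for illustration, not the documented JeVois protocol.

```python
# Read text messages from the JeVois serial port on a host PC (sketch only).
# Device path, baud rate, and message format are illustrative assumptions.
import serial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if not line:
            continue
        # Assumed example format: "TRACK <x> <y>" giving target coordinates
        fields = line.split()
        if fields[0] == "TRACK" and len(fields) >= 3:
            x, y = int(fields[1]), int(fields[2])
            print("target at", x, y)  # e.g. feed this to a pan/tilt controller
        else:
            print("raw:", line)
```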

The project launched on Kickstarter a few days ago with the goal of raising $50,000. A $45 “early backer” pledge should get you a JeVois camera with a micro serial connector with 15cm pigtail leads, while a $55 pledge will add an 8GB micro SD card pre-loaded with JeVois software, and a 24/28 AWG mini USB Y cable. Shipping is free to the US, but adds $10 to Canada, and $15 to the rest of the world. Delivery is planned for February and March 2017.