Sipeed MetaSense RGB ToF 3D depth cameras are made for MCUs & ROS Robots (Crowdfunding)

Sipeed MetaSense RGB ToF 3D Depth Cameras

We’ve just written about the Arducam ToF camera that adds depth sensing to the Raspberry Pi, but there are now more choices, as Sipeed has just introduced its MetaSense ToF (Time-of-Flight) camera family for microcontrollers and robots running ROS, with two models offering different sets of features and capabilities. The MetaSense A075V USB camera offers 320×240 depth resolution plus an extra RGB sensor, an IMU, and a CPU with a built-in NPU that makes it well suited to ROS 1/2 robots, while the lower-end MetaSense A010 ToF camera offers up to 100×100 resolution, integrates a 1.14-inch LCD to visualize depth data in real time, and can be connected to a microcontroller host, or even a Raspberry Pi, through UART or USB.
MetaSense A075V specifications:
SoC – Unnamed quad-core Arm Cortex-A7 processor @ 1.5 GHz with 0.4 TOPS NPU
System Memory – 128 MB RAM
Storage – 128 MB flash
Cameras – 800×600 @ 30 […]
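For the A010, the excerpt only says the camera can be attached over UART or USB; the frame format is not documented here, so the following Python sketch (using pyserial) is purely illustrative. The port name, baud rate, and frame size are placeholder assumptions, not values from Sipeed documentation.

# Hypothetical sketch: read raw data from a MetaSense A010 attached over UART/USB-serial.
# /dev/ttyUSB0, 115200 baud, and the 100x100 single-byte depth frame layout are assumptions,
# not documented values -- check Sipeed's protocol description before relying on this.
import serial

PORT = "/dev/ttyUSB0"    # assumed device node
BAUD = 115200            # assumed baud rate
FRAME_BYTES = 100 * 100  # assumed: one byte per pixel at the maximum 100x100 resolution

with serial.Serial(PORT, BAUD, timeout=2) as ser:
    raw = ser.read(FRAME_BYTES)          # read one frame's worth of bytes
    print(f"received {len(raw)} bytes")  # a real driver would parse the header/checksum here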

Odyssey Blue mini PC bundle ships with Frigate open-source NVR, Coral USB AI accelerator

Frigate NVR Odyssey Blue mini PC

Odyssey Blue mini PC based on the ODYSSEY-X86J4125 SBC is now offered as part of a bundle with the Frigate open-source NVR platform, which supports real-time local object detection, and a Coral USB AI accelerator. The Odyssey Blue mini PC is equipped with an Intel Celeron J4125 quad-core Gemini Lake Refresh processor, 8GB RAM, and a 128GB SSD preloaded with an unnamed Linux OS (probably Debian 11) and the Frigate Docker container. The solution can run object detection at over 100 FPS when equipped with the Coral USB accelerator. Since the hardware is not exactly new, and we’ve covered it in detail in the past, even reviewing the earlier generation SBC with the Celeron J4105 processor and Re_Computer enclosure, I’ll focus on the software, namely Frigate NVR, in this post. Frigate is an open-source NVR program designed for Home Assistant with AI-powered object detection that runs as a Docker container and uses […]
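Frigate reports its detections over MQTT, which is how Home Assistant and other consumers typically pick them up. As a rough illustration, the Python sketch below (paho-mqtt) subscribes to the default frigate/events topic and prints the detected object label and score; the broker address is a placeholder, and the exact JSON fields may differ between Frigate versions.

# Minimal sketch: listen for Frigate object detection events over MQTT.
# The broker address and the JSON field names ("after" -> "label"/"top_score") are assumptions
# based on Frigate's default "frigate/events" topic; verify against your Frigate version.
import json
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"  # placeholder MQTT broker address

def on_connect(client, userdata, flags, rc):
    client.subscribe("frigate/events")

def on_message(client, userdata, msg):
    event = json.loads(msg.payload)
    after = event.get("after", {})
    print(after.get("label"), after.get("top_score"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()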

PureThermal 3 board embeds FLIR Lepton FS thermal camera for $200

PureThermal 3 FLIR Lepton FS board

Groupgets PureThermal 3 (PT3) is a hackable thermal webcam that ships with the low-cost FLIR Lepton FS module and is compatible with FLIR Lepton (2.x – 3.x) LWIR camera cores. The company explains the new model offers the same basic functionality as the PureThermal 2 but with a few changes and additional features, although the motivation for the new design was primarily to address component shortages. Changes include the removal of the pads used to install an RF shield, and the replacement of the Tag-Connect TC2030 programming connector with a Tag-Connect EC-10-IDC.
PureThermal 3 specifications with changes highlighted in bold:
Microcontroller – STMicro STM32F412 Arm Cortex-M4 MCU @ 100 MHz with up to 1024 KB flash, 256 KB SRAM
Camera support
Supports FLIR Lepton (2.x – 3.x) LWIR camera cores
Ships with FLIR Lepton FS non-radiometric 160 x 120 resolution micro thermal camera module
USB – USB Type-C port with USB UVC […]
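Since the board enumerates as a standard USB UVC device, the thermal stream can be grabbed with ordinary video capture tools. The Python/OpenCV sketch below is a hedged example of pulling raw 16-bit (Y16) frames and normalizing them for display; the device index and the Y16 FOURCC handling are assumptions that depend on the host OS and OpenCV build.

# Rough sketch: grab raw 16-bit thermal frames from a PureThermal UVC device with OpenCV.
# Device index 0 and Y16 FOURCC support are assumptions; some OpenCV builds/backends
# will only deliver the pre-converted 8-bit stream instead.
import cv2

cap = cv2.VideoCapture(0)                                      # assumed device index
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"Y16 "))  # request raw 16-bit frames
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)                           # keep raw data, no RGB conversion

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Stretch the 16-bit values to 8 bits so the 160x120 image is visible on screen
    vis = cv2.normalize(frame, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
    cv2.imshow("lepton", vis)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()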

NVIDIA Jetson Xavier NX fanless embedded box PC features four HDMI input ports

NVIDIA Xavier NX embedded box pc four HDMI inputs

AAEON BOXER-8256AI is a fanless embedded box PC equipped with an NVIDIA Jetson Xavier NX system-on-module, offering four HDMI input ports (two Full HD, two 4K-capable), as well as two HDMI outputs, for smart healthcare equipment, digital signage, and entertainment. The embedded computer comes with 8GB RAM and 16GB flash provided by the NVIDIA module, supports M.2 NVMe and SATA storage, Gigabit Ethernet, plus optional WiFi, 4G, and 5G cellular connectivity through M.2 sockets.
BOXER-8256AI specifications:
SoM – NVIDIA Jetson Xavier NX with
CPU – 6-core NVIDIA Carmel Armv8.2 64-bit CPU
GPU – 384-core NVIDIA Volta GPU with 48 Tensor Cores
AI accelerator – 2x NVDLA deep learning accelerators
AI performance – Up to 21 TOPS at 15 Watts
System Memory – 8GB LPDDR4x
Storage – 16GB eMMC flash
Storage – MicroSD Slot, M.2 NVMe SSD socket, SATA III port
Display Interfaces – 2x HDMI […]
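On Jetson-based systems, HDMI capture inputs are usually exposed as V4L2 video devices, so applications can read them much like webcams. The sketch below is only an assumption-laden illustration using OpenCV; the /dev/video0 node, resolution, and pixel format all depend on how AAEON wires up the capture hardware on the BOXER-8256AI.

# Hypothetical sketch: read one of the HDMI inputs, assuming it shows up as a V4L2 device.
# The device node and the claim that OpenCV's V4L2 backend can open it directly are
# assumptions about AAEON's capture driver, not documented behaviour.
import cv2

cap = cv2.VideoCapture("/dev/video0", cv2.CAP_V4L2)  # assumed device node for HDMI input 1
ok, frame = cap.read()
if ok:
    print("captured frame:", frame.shape)  # e.g. (1080, 1920, 3) for a Full HD input
cap.release()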

AAEON launches UP Xtreme i11 & UP Squared 6000 robotic development kits

UP Xtreme i11 robotics development kit

AAEON’s UP Bridge the Gap community has partnered with Intel to release two robotic development kits based on either the UP Xtreme i11 or UP Squared 6000 single board computers in order to simplify robotics evaluation and development. Both robotic development kits come with a four-wheeled robot prototype that can move omnidirectionally, sense and map its environment, avoid obstacles, and detect people and objects. Since those capabilities are already implemented, users can save time and money and focus on further customization for specific use cases.
Key features and specifications:
SBC
UP Squared 6000 SBC – Recommended for power efficiency and extended battery power
SoC – Intel Atom x6425RE Elkhart Lake processor clocked up to 1.90 GHz with Intel UHD Graphics
System Memory – 8GB LPDDR4
Storage – 64GB eMMC flash, SATA port
UP Xtreme i11 SBC – Recommended for higher performance
SoC – Intel Core i7-1185GRE clocked at up to 4.40 GHz and […]
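Omnidirectional motion on a four-wheel base is typically achieved with mecanum wheels, where each wheel speed is a simple linear combination of the desired forward, sideways, and rotational velocities. The Python function below is a generic mecanum inverse-kinematics sketch for illustration only; the wheel radius and chassis dimensions are placeholders, and AAEON's kits may use a different drive arrangement.

# Generic mecanum-wheel inverse kinematics (illustration only, not AAEON's implementation).
# vx: forward m/s, vy: leftward m/s, wz: rotation rad/s; returns wheel angular speeds in rad/s.
# Wheel radius and the chassis half-length/half-width are placeholder values.
WHEEL_RADIUS = 0.05  # m (assumed)
HALF_LENGTH = 0.10   # m, half the wheelbase (assumed)
HALF_WIDTH = 0.095   # m, half the track width (assumed)

def mecanum_wheel_speeds(vx, vy, wz):
    k = HALF_LENGTH + HALF_WIDTH
    front_left  = (vx - vy - k * wz) / WHEEL_RADIUS
    front_right = (vx + vy + k * wz) / WHEEL_RADIUS
    rear_left   = (vx + vy - k * wz) / WHEEL_RADIUS
    rear_right  = (vx - vy + k * wz) / WHEEL_RADIUS
    return front_left, front_right, rear_left, rear_right

# Example: strafe left at 0.2 m/s without rotating
print(mecanum_wheel_speeds(0.0, 0.2, 0.0))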

Arducam ToF camera adds depth sensing to Raspberry Pi for $30 (Crowdfunding)

Arducam ToF camera Raspberry Pi

Arducam has launched a Time-of-Flight (ToF) camera module for Raspberry Pi that enables depth sensing by capturing 3D data (at 240×180 resolution) at a distance of up to 4 meters. Arducam has launched several cameras for Raspberry Pi boards over the years, most recently the Arducam Pi HawkEye 64MP camera, but the Arducam ToF camera is quite different: while it still connects to the MIPI CSI connector of the SBC, it is used to measure distances and depth and output 3D data.
Arducam ToF camera specifications:
Compatibility – Any Raspberry Pi board with a MIPI CSI connector
Effective number of pixels – 240×180
Frame Rate
Up to 120 fps (sensor)
Up to 30 fps (when processed by a Raspberry Pi using 4-phase RAW frames)
Sensor size – 1/6-inch
Modulation Frequency – 75MHz/37.5MHz
Viewing Angle – 70° Diagonal
Light Source – 940nm VCSEL illuminator
Output Formats
4-phase RAW Frame
Depth […]
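The "4-phase RAW" output refers to the standard continuous-wave ToF scheme: the sensor samples the reflected 940nm light at four phase offsets, and depth is recovered from the arctangent of their differences. As a rough worked example (not Arducam's actual pipeline), the NumPy sketch below converts four phase frames into distance; note that at a 37.5 MHz modulation frequency the unambiguous range works out to c / (2f) ≈ 4 m, matching the stated maximum distance.

# Illustrative 4-phase continuous-wave ToF depth calculation (not Arducam's actual SDK).
# a0..a270 are the four 240x180 raw frames sampled at 0/90/180/270 degree phase offsets;
# how those frames are actually laid out in the camera's RAW output is an assumption.
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F_MOD = 37.5e6      # modulation frequency from the specs, Hz

def depth_from_phases(a0, a90, a180, a270):
    # Phase of the returned signal, wrapped to [0, 2*pi)
    phase = np.arctan2(a270.astype(float) - a90, a0.astype(float) - a180)
    phase = np.mod(phase, 2 * np.pi)
    # Distance is proportional to phase; max unambiguous range = C / (2 * F_MOD) ~= 4 m
    return C * phase / (4 * np.pi * F_MOD)

frames = [np.random.randint(0, 4096, (180, 240), dtype=np.uint16) for _ in range(4)]
print(depth_from_phases(*frames).max())  # values fall in the 0..~4 m range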

SyncBot educational mobile robot supports NVIDIA Xavier NX or Intel Tiger Lake controller

SyncBot educational mobile robot RBX-I2000 controller

Syncbotic SyncBot is a four-wheel autonomous mobile robot (AMR) platform for research and education that can be fitted with an NVIDIA Xavier NX or an Intel Apollo Lake/Tiger Lake-based controller running the Ubuntu 20.04 operating system with the ROS 2 framework, and comes with a motion control MCU board with an EtherCAT master running Zephyr OS. The robot comes with four 400W TECO servo motors, can handle up to 80kg payloads for sensors and a robotic arm, features 12V and 24V power output for sensors and four USB 3.0 ports, and can also be equipped with an eight-camera kit with Intel RealSense and ToF cameras.
SyncBot AMR specifications:
Robot Controller Platform (one or the other)
SyncBotic A100 evaluation kit (Apollo Lake E3940)
SyncBotic SBC-T800 series (Intel Tiger Lake UP3)
SyncBotic SBC W series (Intel Tiger Lake UP3, waterproof version)
SyncBotic NSync-200 series (NVIDIA NX)
Dimensions – 200 x 190 mm
STM32-based Motion […]
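Since the controller runs Ubuntu 20.04 with ROS 2, driving the base usually comes down to publishing velocity commands that the motion-control MCU turns into servo set-points. The rclpy sketch below is a generic example of publishing geometry_msgs/Twist messages; the /cmd_vel topic name is an assumed convention, as Syncbotic's actual topic layout isn't given in the excerpt.

# Generic ROS 2 velocity publisher sketch; the /cmd_vel topic is an assumed convention,
# not a documented SyncBot interface.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class Teleop(Node):
    def __init__(self):
        super().__init__("syncbot_teleop_demo")
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.timer = self.create_timer(0.1, self.send_cmd)  # 10 Hz command rate

    def send_cmd(self):
        msg = Twist()
        msg.linear.x = 0.3   # m/s forward
        msg.angular.z = 0.2  # rad/s yaw
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = Teleop()
    rclpy.spin(node)

if __name__ == "__main__":
    main()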

AI, computer vision meet LoRaWAN with SenseCAP K1100 sensor prototype kit

Wio Terminal Grove Vision AI LoRaWAN module

CNXSoft: This is another tutorial using the SenseCAP K1100 sensor prototype kit, translated from CNX Software Thai. This post shows how computer vision/AI vision can be combined with LoRaWAN using the Arduino-programmable Wio Terminal, a Grove camera module, and a LoRa-E5 module connecting to a private LoRaWAN network using open-source tools such as Node-RED and InfluxDB. In the first part of the SenseCAP K1100 review/tutorial, we connected various sensors to the Wio Terminal board and transmitted the data wirelessly through the LoRa-E5 LoRaWAN module after setting the frequency band for Thailand (AS923). In the second part, we’ll connect the Grove Vision AI module, part of the SenseCAP K1100 sensor prototype kit, to the Wio Terminal in order to train models to capture faces, display the results from the camera on the computer, and evaluate how accurate the face detection model is. Finally, we’ll send the data (e.g. confidence) using […]
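Because LoRaWAN payloads are small, the detection result is usually packed into a few bytes rather than sent as text. As a hedged illustration of that step (the actual tutorial uses the Wio Terminal's Arduino code and a Node-RED decoder), the Python sketch below shows one possible way to pack a face count and confidence into a 3-byte payload and decode it again; the field layout is an assumption, not the format used in the article.

# Illustrative payload packing/unpacking for a face-detection result over LoRaWAN.
# The 3-byte layout (1 byte count, 2 bytes confidence x100) is an assumed example format,
# not the one used in the SenseCAP K1100 tutorial.
import struct

def encode(face_count: int, confidence: float) -> bytes:
    # ">BH": big-endian unsigned byte + unsigned 16-bit integer
    return struct.pack(">BH", face_count, int(confidence * 100))

def decode(payload: bytes):
    count, conf_x100 = struct.unpack(">BH", payload)
    return count, conf_x100 / 100.0

payload = encode(2, 0.87)
print(payload.hex())    # "020057"
print(decode(payload))  # (2, 0.87)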
