Intel Showcases Core m Compute Stick Prototype with RealSense Camera at IDF 2016 Shenzhen
Intel appears to be all-in on its RealSense technology at IDF 2016 Shenzhen. Alongside the RealSense Robotic Development Kit, the company is showcasing an Intel Core m “Skylake” TV Stick based on hardware similar to the STK2MV64CC Compute Stick, with a Core m3 or m5 vPro processor, but adding a RealSense F200 3D depth camera and an array of microphones.
The full specifications are not available, but we do know the stick also comes with one USB 3.0 port, a yellow USB 2.0 port which should be always-on, a micro USB port for power, and a micro HDMI port to connect to your TV. The stick is supposed to be placed on top of your TV, so you’d then be able to control the user interface, play games, and so on using gestures, with other applications potentially made possible by 3D depth sensing, such as Windows Hello, which allows you to sign in without a password.
NotebookItalia also reports that while the Core m TV stick design has been reserved for Intel’s own use so far, the company will release it so that OEM/ODM manufacturers can start offering their own Skylake TV sticks.
4 Replies to “Intel Showcases Core m Compute Stick Prototype with RealSense Camera at IDF 2016 Shenzhen”
Linux and OS X capture driver for F200, R200, and SR300 cameras -> https://software.intel.com/en-us/forums/realsense/topic/607339
Tested on the Intel Compute Stick (BOXSTK1AW32SCR), MinnowBoard Max, Kangaroo MD2B, and UP Board, but it should also work on similar platforms as long as there’s a USB 3.0 port.
Can these depth cameras be used for 3D scans?
RealSense can do that. They showcased it at CES 2014 @ https://www.youtube.com/watch?v=QSG9ypd-Zhk
“The most recent SDK for the Intel RealSense F200 camera now includes 3D scanning” -> https://software.intel.com/en-us/blogs/2015/03/24/intel-realsense-3d-scanning-scan-then-convert-obj-to-ply-for-blender-unity
So that’s further confirmation it should work with the F200 camera in the stick. You’d probably need a turntable to fully scan objects.