Intel Apollo Lake is the next generation of Intel's low-power processor family, set to replace Braswell Celeron processors, and Fanlesstech got hold of the specifications for two upcoming “Arches Canyon” NUC6CAYS and NUC6CAYH NUC (Next Unit of Computing) mini PCs based on the new processors, as well as the 2016-2018 roadmap for the complete (consumer-grade) Intel NUC family.
- SoC – Intel Celeron Jxxx quad core processor @ x GHz to y GHz (burst) with Intel HD graphics up to z MHz (10W TDP)
- System Memory – 2GB DDR3L-xxxx SO-DIMM (dual channel), upgradeable up to 8GB DDR3L-1866
- Storage – 32GB eMMC flash, 2.5″ SATA3 bay for 9.5mm hard drives, SDXC slot with UHS-I support
- Video Output – HDMI 2.0 (4K @ 60 Hz), VGA
- Audio – Up to 7.1 channels via HDMI, 3.5mm headset jack, 3.5mm rear speaker/TOSLINK combo jack
- Connectivity – Gigabit Ethernet (RJ45), Intel Wireless AC-316x M.2 module for 802.11ac 1×1 WiFi and Bluetooth 4.2 with internal antennas
- USB – 2x USB 3.0 ports at the front (yellow one for charging), 2x USB 3.0 ports at the rear, 2x internal USB 2.0 ports via header
- Misc – IR receiver, Kensington lock
- Power Supply – 12~19V DC input (65W wall-wart power supply included)
- Dimensions – 115 x 111 x 51 mm (plastic casing with inner metal structure)
The 2016-2018 NUC roadmap above was also “leaked” with more powerful Core i3, Core i5 and Core i7 NUCs. The first Apollo Lake NUC will be released in Q4 2016 with Windows 10, and the barebone version in Q1 2017.
Jean-Luc started CNX Software in 2010 as a part-time endeavor, before quitting his job as a software engineering manager and starting to write daily news and reviews full time later in 2011.
16 Replies to “Intel Apollo Lake NUC6CAYS & NUC6CAYH NUC mini PCs Specifications Released”
WHY IS THERE A VGA CONNECTOR!!!!
I don’t get it, there’s this push for stupid things like replacing the headphone jack with a USB-C port, but we’re stuck with VGA connectors for display connectivity in 2016, a place where digital actually makes sense… :'(
Because some of us want to plug it up to a far better CRT monitor instead of a crappy LED?
I heard Apollo Lake was canceled… are you sure about this post? Intel wants to drop this and go straight to Kaby Lake.
Apollo Lake is Pentium/Celeron/Atom, Kaby Lake is Core i processors.
@TLS yes, I know; they just wanted to drop that tree of Atom CPUs etc. (next-gen Apollo Lake).
If not, even better for me; I'm waiting for support for HDMI 2.0, VP9, HDR and 10-bit HEVC
(NVIDIA already supports 12-bit HEVC)
Because NOT EVERYONE USES HDMI. Many mini PCs are used for small projects, and many use a VGA connection. All boxes should have VGA, HDMI and DP.
No, VGA should simply just go and die… It’s a relic that should’ve disappeared a long time ago, but due to various “tolls” and import duties and other stupid anti-technology regulation by various governments, we’re still stuck with an analogue graphics interface from the 1980s. Maybe you think computers should still have composite video outputs as well?
Really?! So how would you explain business lines of laptops? I need to disappoint you, but what you wrote is pure BS.
A lot of infrastructure for presentation is still VGA. Ask Extron.
Maybe someone should drop a micro-VGA socket design.
I am deploying a NUC Core i5 with VMware ESXi 6 as my in-house server. So far so good. One TB of disk, 120GB PCIe stick, 64GB of RAM.
Installed VMware from a USB stick.
It’s a low cost device, so they want backward compatibility with low cost display devices like VGA TVs, monitors, and projectors. It costs next to nothing to include it, so why not? If it’s not present, they will lose some sales into the low end market that is their target market. Also remember, these things are sold worldwide. In places like India, Brazil, and Russia, there could be lots of buyers in the VGA space.
Lenovo has finally dropped VGA on all their ThinkPads, so no, it’s not on all business laptops. The one reason people still use VGA tends to be old projectors, and that’s about it. Tough, get an HDMI or DP to VGA adapter then… Personally, the VGA connector can’t disappear soon enough for me. Also, nothing I wrote was BS, as the VGA connector came around in the 1980s, and the reason displays in some countries still have VGA connectors comes down to higher duties and import taxes on displays with digital interfaces.
How does it make sense converting a digital signal to analogue and then back to digital? I can’t imagine India, Brazil and Russia are full of CRT monitors, as they’re simply too impractical. It really is time to knock this old interface on the head and move on. Oh, and please show me a TV with VGA connectors; I haven’t seen one for at least five years.
TLS, you are simply wrong in your statement about VGA. I’ve yet to receive a Dell, NUC, or other desktop-type system without VGA.
I’m not debating why, it’s simply there.
All TVs I’ve found in retail here in the US have VGA, so I don’t know where you are, but in the US they still mostly have VGA. The only thing I’ve seen dropped is the SVGA or stripped-sync inputs. About 1/4 have the RCA RS-170 color connector and support.
As for what I’d like to see, I’d like wireless display support integrated in some fashion that would work across systems. I’ve never seen it, but hardware VNC would be incredibly useful.
I bought a TV which had HDMI, RF, VGA and RCA inputs, made by Vizio, in the last 2 months.
(The Jim St is me, entered on a new system with differing input, not trying to pull anything)
But why would you even want to down convert a digital signal to analogue to then convert it back to digital again to get a picture on your screen? It makes no sense at all today, as no-one’s using analogue CRT screens any more…
An LCD’s endpoints are analog. The dot matrix controller module ( http://www.lcd-module.de/fileadmin/eng/pdf/zubehoer/t6963c.pdf for example ) takes digital input and outputs high frequency analog signals to oscillate the crystals to produce a color filter. The LED array then illuminates through those filter layers and you end up seeing specific colors. You’re not turning different colored LEDs on and off digitally.
When modern monitors have VGA input sockets, they don’t convert to digital; they process the signal with analog frequency modulation and amp cut-offs.
HDMI: Graphics card (DIGITAL) -> Monitor (DIGITAL-to-ANALOG).
VGA: Graphics card (DIGITAL-to-ANALOG) -> Monitor (ANALOG-to-ANALOG).
Having said all that, you’re right. It would still be preferable to just send a digital signal and have it processed in the monitor. The problem is patents, DRM and licensing. And HDMI is full of those…
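The digital-to-analog step on the VGA path described above can be sketched numerically. This is an illustration only: the function name is made up, and the 0.7 V figure is the conventional full-scale VGA video level for each color channel, which the graphics card’s RAMDAC produces from the digital frame buffer value.

```python
# Sketch of the VGA output stage: a RAMDAC maps an 8-bit digital
# color channel value (0-255) to an analog voltage (0 to ~0.7 V,
# the conventional VGA full-scale video level).

def vga_level(channel_value: int) -> float:
    """Map an 8-bit channel value to a VGA analog level in volts."""
    if not 0 <= channel_value <= 255:
        raise ValueError("8-bit channel value expected")
    return channel_value / 255 * 0.7

# The monitor must then recover pixel intensities from this voltage,
# which is where analog noise and loss can creep in on the VGA path;
# an HDMI link instead carries the channel values as digital bits.
print(vga_level(0))    # black level
print(vga_level(255))  # full intensity
```

This is why the thread’s summary holds: on the VGA path the precision of the picture depends on how cleanly that analog voltage survives the cable, while a digital link only has to keep bits intact.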