Intel Phases Out Quark SoCs and Microcontrollers

I had completely forgotten about Intel Quark SoCs and MCUs found in development kits and modules such as the Arduino 101 board or the Intel Curie module. Last time I heard about those chips, Intel had just discontinued several boards, including the two aforementioned products, and I just assumed the Quark chips had quietly reached end-of-life as well.

Intel Quark Discontinued

But actually those chips will still be available for a while longer, as Intel has just issued a product change notification announcing the discontinuance and end of life of the following parts:

  • Intel Quark SoC X1020D
  • Intel Quark SoC X1000
  • Intel Quark SoC X1010
  • Intel Quark SoC X1021D
  • Intel Quark SoC X1001
  • Intel Quark SoC X1011
  • Intel Quark SoC X1020
  • Intel Quark SoC X1021
  • Intel Quark Microcontroller D1000
  • Intel Quark Microcontroller D2000
  • Intel Quark SE C1000 Microcontroller

In case you have a product based on one of those SKUs, you can still place orders until July 19, 2019, and the last shipment is scheduled to occur on July 17, 2022.

Via AnandTech, and thanks to Jon for the tip.

Laurent

No mention of the Quark S1000. Was it ever released? It was the only interesting one, since it had no x86 😛

blu

I’m curious about that as well — the more ESP32 products, the better ;)

dgp

The Quarks had some advantageous features, like properly documented secure boot, but they were never going to compete in the “race to the bottom” segment. I have no idea why they even thought it’d be a good idea to get into it.

David Willmore

So, did Intel learn any lessons from this? I hope the companies who used these products and relied on Intel as a supplier and partner learned an important lesson.

dgp

Only giving six months for last orders seems a bit cruel, but I think it suggests that almost no one was using them.

willy

They’re good at making high-power, high-performance chips but not at the lower end. There’s a limit to design reuse. The x86 architecture simply is not well suited to being scaled down, and it’s not just due to the legacy compatibility, which is mostly emulated nowadays. Even the i376, which dropped support for real mode and 16-bit modes, was possibly too complex for embedded systems. When you look at a RISC instruction decoder (MIPS, ARM) on one side and at an x86 one on the other, you quickly understand that one of them starts with a huge penalty. The instruction set was optimal for the 8088, which was not pipelined, but when you pipeline such a thing and have to deal with dispatch it cannot be fun. And all those transistors turning on and off at clock rate to do stuff the other ones don’t need come at a cost (space, test time and thus dollars, and energy).

From a marketing perspective they should build low-end RISC processors and claim “RISC is for low-end”. Anyone could verify this every day, considering that their non-x86 competitors are still many times slower for most workloads, and they would reinforce their position in the high-end market. They could also justify lower prices for the entry-level market without affecting the x86 license rates.

blu

You’d think. Instead we’ll get Larrabee 3 ; ) Apropos, their non-x86 competitors in the HPC segment are gaining momentum. And their non-x86 competitors in the mobile segment are eyeing the next mobile milestone — notebooks.

dgp

>And their non-x86 competitors in the mobile segment are eyeing the next mobile milestone — notebooks.

They’d better get on with it. RISC-V is going to totally decimate the microcontroller market for ARM and the others.

Paul M

I think it will be five years before RISC-V takes over mid- to high-end Arm. And Arm have fought back with cheap or free licensing of low-end cores.

dgp

>I think it will be five years before RiscV takes over mid to high end Arm.

I didn’t write anything about the mid or high end. RISC-V is already competitive (on paper) with the Cortex-M cores.

>And Arm have fought back with cheap or free licensing of low end cores.

Could you link to where the license fees have become “free”? As far as I know, it’s free to start: you can get your free Cortex-M0 IP if ARM deems you worthy, but if you actually start to build chips with it you still pay royalties. On the other hand, anyone can grab a RISC-V core from GitHub and take it all the way to mass production.

David Willmore

x86 can be made very low power. Maybe not performance-competitive with a more natively low-power arch like ARM or MIPS, but it can be scaled down there; after all, that’s where it came from a long time ago.

What I was expecting was for the research people at Intel who have been working on Near-Threshold Switching (and Sub-Threshold Switching) to make the future members of this family. That would give us very performance-competitive parts (in the 500 MHz range), but I haven’t heard anything coming out of them. There was that ARM core from the guys in Texas who are working in that field….

dgp

>From a marketing perspective they should build low-end RISC processors and claim “RISC is for low-end”

The Intel Quark processors weren’t really about being super low power, as far as I remember. It was more about having most of the nice stuff about x86 (platform structure, strong driver support, etc.) in the low-cost/consumer-goods market. Things like gateways for smart home gadgets.

The main issue is that you can’t compete with a $4 Cortex-A7 chip if your chip costs $10, even if the competitor’s SDK looks like total garbage and its hardware is made up of whatever IP blocks were on sale that day.

Jon Smirl

I still can’t believe Intel made a reality TV show featuring the Quark processors.
https://makezine.com/2015/10/05/intel-new-reality-show-makers/

David Willmore

Surely it’s a comedy. 🙂

theyguyuk

The future will be reduced-instruction hardwired SoCs with loadable complex instructions, loaded into high-speed, fast-response memory on the SoC die, so the SoC only loads the complex instructions it needs for the software it runs. Think of it as silicon tuned for gaming, productivity software, video editing, photography, programming, or design.
Instead of just listing variables, programs will tell the SoC which complex instructions need loading in order to run.
At present the local on-die, high-speed, writable complex-instruction cores are still being worked on, as is heat management.

A good few years away yet :(, funding allowing.

CampGareth

Or it could be FPGAs, where you don’t have any of this instruction business to worry about and just make the hardware do exactly the task you want. Don’t need a particular function right now? You can turn it off or, better yet, repurpose the space it consumed for something else. I am biased though; I work at a company making FPGAs easier to work with (they are hard to write for at the moment).

willy

You’re suggesting Intel should start using the Altera they acquired a few years ago, maybe? 🙂

notzed

But who’s gonna write the compilers for such a machine? And then the software that uses them? Why would it work this time? Hasn’t it been tried before? It all just costs too much.

Even the SIMD instructions are barely used by most software because the compilers are so miserable at exploiting them. Despite decades of sustained effort, one still needs to bypass the compiler to get anywhere. The language designers and compiler writers are too busy trying to re-design another C variant (or just get C++ to even compile properly) rather than working out how to use existing machines, or getting features added to them that they could use.

FPGA has similarly been promising programmable hardware for decades. It’s got its uses, but general-purpose computing hasn’t been one of them so far.

blu

You’ve hit the nail on the head — plenty of silicon available today but not (m)any good compilers for it. That’s why I find Tensilica interesting and promising.

Sky Velleity

I thought these got killed off with the Edison/Joule/everything-else IoT purge of 2017.
I’ve seen them on a few suppliers’ websites; I just assumed they were old stock. Surprised they were still making the things.