Bootlin Releases Open Source VPU Driver for Allwinner Processors with MPEG2 and H.264 Video Decoding

At the beginning of the year, Bootlin – formerly Free Electrons – launched a crowdfunding campaign to bring an open source Allwinner VPU driver to mainline Linux. VPU (Video Processing Unit) drivers are used to encode and decode videos. The company was successful in raising enough money from small donors, as well as from several companies manufacturing Allwinner development boards, including Olimex, Pine64, Libre Computer, FriendlyELEC, and Xunlong Software (Orange Pi).

The amount raised (€31,612) was enough to assign two engineers to work on the main goals, as well as on some stretch goals, namely support for the newer Allwinner H3, H5, and A64 processors, and H.265 video decoding. The good news is that the company has now delivered the first release of their work on the main goals.

Open Source VPU Driver on A33-OLinuXino and ALL-H3-CC Boards – Click to enlarge

The photo above shows a demo of Kodi running with Bootlin’s open source Cedrus VPU driver on top of a Linux 4.18-rc kernel. Both MPEG2 and H.264 are supported, and they’ve gone a little beyond one of their main goals, since Allwinner H3 is also supported. Their work was based on the libvdpau-sunxi project and other work from the linux-sunxi community.

Bootlin’s announcement goes into more detail, but here’s a quick report card of the main goals achieved so far:

  • Support for older Allwinner SoCs: A10, A13, A20, A33, R8 and R16 – Fully met, plus H3 support as a bonus (that was planned for the stretch goals)
  • Production-ready MPEG2 decoding – Fully met, with improvements in both the kernel and user-space code; the MPEG2 codec was already partially supported.
  • H.264 video decoding implementation – Fully met, including high-profile H.264 support. Further debugging is likely needed.
  • Modifying the Allwinner display driver (DRM) to directly display the decoded frames instead of converting and copying them – Fully met, plus hardware scaling was fixed, and patch sets were contributed upstream. Bootlin worked on the A20 and A33 display drivers, and the community on H3.
  • User-space library to integrate into open source video players – Partially met via the libva-v4l2-request user-space library, which can be used by all libva-capable video players. That’s the theory; in practice it’s only working with Kodi for now, and more work is needed for VLC and GStreamer.
  • Upstreaming changes to the official Linux kernel – Almost met. They’ve gone through five iterations of the Sunxi-Cedrus Linux kernel driver, but those have yet to be merged. It should just be a question of time.
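Regarding the user-space integration above, libva-capable players normally pick their backend from the `LIBVA_DRIVER_NAME` environment variable. Here is a minimal sketch, assuming the driver is installed under the name `v4l2_request` (the name used by the libva-v4l2-request project; verify it on your own board, e.g. with `vainfo`):

```python
import os
import subprocess

# Select the libva backend for child processes. "v4l2_request" is the
# driver name assumed from the libva-v4l2-request project -- verify it
# on your own installation, e.g. by running `vainfo`.
env = dict(os.environ, LIBVA_DRIVER_NAME="v4l2_request")

# A libva-capable player (Kodi here) would then be launched with this
# environment; commented out since it needs the actual board and driver:
# subprocess.run(["kodi"], env=env)

print(env["LIBVA_DRIVER_NAME"])
```

The same effect is usually achieved from a shell with `export LIBVA_DRIVER_NAME=v4l2_request` before starting the player.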

So soon enough, you’ll be able to just grab a mainline kernel and the open source user-space libraries to get MPEG2 and H.264 video working on Allwinner processors. If you want to give the open source driver a try, Bootlin has prepared a LibreELEC rootfs with the Sunxi-Cedrus driver that works on Allwinner A20, A33, and H3 boards.



36 Replies to “Bootlin Releases Open Source VPU Driver for Allwinner Processors with MPEG2 and H.264 Video Decoding”

    1. Not sure about that. Does Allwinner support the most important HTPC features: HDMI audio passthrough with the most recent lossless codecs, and full HDMI CEC? People won’t switch from RPi to Allwinner if these are missing. On top of that, the Allwinner boards are legacy ARMv7, while the RPi 3B+ is modern 64-bit ARM with Gigabit Ethernet, which is important for very high quality streaming and >100 Mbps bitrates.

      1. > the Allwinner boards are legacy ARMv7 while RPi 3B+ is modern 64-bit ARM with gigabit ethernet

        Ah, RPi Trading’s ‘social media influencer’ is back! 😉

        The RPi 3B+, just like every other Raspberry Pi before it, has no native Ethernet capabilities. There’s just a USB-Ethernet adapter behind an internal USB hub added to the SBC (you do know that you can attach an el cheapo RTL8153 GbE dongle to every other SBC around, and that this combination doesn’t suck as much as the RPi 3B+’s crippled GbE?). Instead of a well-working GbE USB adapter, the RPi 3B+ got a USB chip that does not work as expected.

        Simple workaround: force the RPi 3B+ to downgrade to Fast Ethernet and the problems are gone.

        Every Raspberry Pi used as an HTPC cannot make use of any 64-bit/ARMv8 features, since the proprietary VideoCore stuff stops working once the 64-bit Cortex-A53 cores are brought up in that state. Enjoying 64-bit/ARMv8 benefits with a Raspberry Pi means not using anything media related (and for a headless device used as a small server, even the RPi 3B+ is a poor fit: the USB2 bottleneck and bus contention issues, as well as the old underpowering problems, make it a terrible choice for everything server related).

        With the HTPC use case, the RPi 3 and 3B+ are condemned to run in legacy ARMv7 mode, while at least two of the Allwinner SoCs that were part of the Kickstarter (1st stretch goal) are true 64-bit/ARMv8 SoCs and can benefit from the ARMv8 ISA running in 64-bit mode.

      2. Jerry, you should retrieve your head from your rear end, because you sound like you are speaking from Uranus.
        RPi with “modern” Gigabit Ethernet, my arse.

  1. I supported the crowdfunding campaign, but now I learn I paid for a “VPU (Video Processing Unit)”, which is something other than a GPU?

    1. In some devices the GPU and VPU are packaged in a single “GPU”, like the VideoCore IV in the Broadcom BCM283x processors found in Raspberry Pi boards. But in most Arm devices, the GPU takes care of 2D and 3D acceleration, while the VPU handles hardware video decoding and encoding.

      Time to share that link again:

      Anyway, you did good.

      1. > VideoCore IV in Broadcom BCM283x processor found in Raspberry Pi boards

        Do you know whether HW accelerated video decoding on Raspberries will ever work with fully open source drivers, like on Allwinner SoCs now? Or will everything still depend on the closed source RTOS running directly on the VideoCore?

        1. That part will never be open source :), or maybe in 20 years, when they’ll have moved on to something else. A bit like Microsoft does with their old software.

        1. HTPC users don’t really care whether it’s proprietary or not, as long as you can download the ROM for free. Also lol @ V4L2. I’ve used an RPi Model B for several years as an HTPC, and it supports all sorts of videos I throw at it. The RPi 3B+ is even better, with Gigabit Ethernet and 64-bit processing. If you just happened to have an extremely high quality source (DLNA), you could blast away with nearly uncompressed video quality.

          1. > RPi 3B+ is even better with gigabit ethernet and 64-bit processing

            Gigabit Ethernet is still broken and there is no 64-bit processing as long as all the proprietary VideoCore crap is involved (VideoCore is 32-bit and will remain 32-bit). So either you skip all the proprietary HW acceleration running on the VideoCore (then you can bring up the ARM cores in aarch64 state) or you remain 32-bit.

          2. Jerry – uncompressed HD video is ~750Mb/s (4:2:0 8 bit 1920×1080/30p or 60i – ignoring blanking and audio). There’s no way a Pi3B+ is going to be streaming that… HD Cam SR VT decks use 440Mb/s on-tape bitrates for ‘nearly uncompressed video quality’.
            (The HD-SDI broadcast uncompressed standard interconnect is rated at ~1.5Gb/s. UHD / 4K is 12Gb/s – and doesn’t fit neatly into a 10GbE connection if you keep it uncompressed…)
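            The ~750 Mb/s figure above can be checked with quick arithmetic, assuming 4:2:0 chroma subsampling at 8 bits per sample (12 bits per pixel on average) and ignoring blanking and audio, as stated:

```python
# Uncompressed 1080p30 bitrate: 4:2:0 8-bit averages 12 bits per pixel
# (8-bit luma plane plus two chroma planes at quarter resolution).
width, height, fps = 1920, 1080, 30
bits_per_pixel = 12

bitrate_bps = width * height * bits_per_pixel * fps
print(f"{bitrate_bps / 1e6:.1f} Mb/s")  # 746.5 Mb/s, i.e. roughly 750 Mb/s
```

            A 100 Mbps Fast Ethernet link, or even the RPi 3B+’s USB2-limited GbE, is nowhere near this.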

      1. We evaluated such an action, but the Mali GPU is significantly more complex, and the user-space side with Mesa is uncertain. It would need to bring in at least 100K from the community, with a 50% donor match (Libre Computer).

  2. I’m very glad I got a chance to contribute money to this kickstarter. I just wish we would have hit the H.265 stretch goal as that’s the only codec I use anymore. Still completely worth it, though. Good work, Bootlin!

    1. Wrt software support, the H2+ and H3 are the same. If you click on the link in the first sentence, you get the timeline for the stretch goals…

      1. So dear Tkaiser, when can we expect this Bootlin work to get into the latest mainline image? I’m presently on the old legacy desktop, eagerly waiting for full-featured stable mainline Armbian desktop images…

        1. > eagerly waiting for full featured stable mainline Armbian desktop images ….

          Sure. If you need this “now”, we would need to borrow money to put into the project. Hire people for a project which doesn’t make money?

          We have already made huge preparations (dirty work) so that this will be possible much sooner. We are already going faster, producing way more than seems possible, and it seems nobody cares.

          1. Dear Igor, nobody cares? In what context exactly? Do you mean that not many people financially support Armbian?

            PS: Dear, when I said “eagerly waiting” I was simply expressing my desire; in no way was I dictating to anybody. That said, I really appreciate the efforts of you and the Armbian team, and now this Bootlin team. You are all wonderful, hats off…

          2. I made some estimations the other day. End-user donations cover around 15 minutes of the main contributors’ time each month, under average commercial conditions. This means end users pay for a few out of a few hundred support questions each month. Where are the infrastructure, management, and most expensive R&D costs? The gap is huge, and if money were the motivation, we would have closed this down a long time ago. Remember also that this video acceleration campaign would not have been successful without the usual suspects, the board makers (with obvious interest and funds to support it), which contributed the majority. Those (Orange Pi, Libre Computer, FriendlyARM, Olimex) also help us manage the costs.

          3. Dear, I know Steven (Orange Pi)… he is a very nice, supportive guy and not a greedy one. He recently hired a few engineers to provide support for OPis, but I am not very sure about their technical competence. So if you wish, I can talk to him about supporting Armbian on a regular basis; this extra expense he could cover by raising the prices of OPis by a few cents. What’s your suggestion?

          4. Steven is among those who support Armbian on a regular basis, but all the support together is not at a level where we could hire people, which is probably the only way to sustain this level of operation in the long run. And the long run is the default for this type of work. I am low on ideas on how to solve all this, and you are free to contact me if you know how. Just saying how things are.

          5. If the Armbian team could consider getting into hardware, it might be easier to generate funds.
            Maybe a collaboration with Shenzhen Xunlong, or another company, for an “Armbian Edition” board that could be launched on a crowdfunding campaign first. Then you could decide to assign more resources to that particular board, so people may be drawn to it even if it’s a bit more expensive than equivalent boards.

            Projects like Espruino follow that “business model”. Espruino is a JavaScript firmware for microcontrollers that supports all sorts of MCU boards with the help of the community, but they also have their own Espruino boards with better support.

            I have no personal experience with this type of endeavor, so just my 2 cents.

  3. I really hope that they will also implement H264 encoding. It’s too bad that the stretch goal wasn’t met.

    1. Sure. What baffles me is why. I can understand that those (partially brilliant) engineers working for RPi Trading are in trouble: they’re amongst the 30-50 people on this planet familiar with the closed source mess that defines RPi, and once their guru moves on to something else they’ll be fired, since literally no one needs their VideoCore skills any more.

  4. Hello everyone! What is the maximum ENCODING frame rate that can be reached on the Allwinner A20 chip for Full HD (1920×1080) video? Has anyone measured that? I reached only 3 fps. As signal sources I used a CSI camera and an ffmpeg test video.

Leave a Reply

Your email address will not be published.