38 responses

  1. onebir
    January 19, 2013

    Maybe you could explain a bit about the Linux compatibility issues? If Mali supports the OpenGL and OpenVG APIs, why can’t PicUntu use it for graphics acceleration? And why is there a need for someone to develop open source drivers (Lima)? Which (if any) GPUs are *really* compatible with Linux (e.g. allowing Linux to use them for HD video & 3D graphics)?

  2. cnxsoft
    January 19, 2013

    I wish I could explain it, but I don’t clearly understand why myself. For Mali-400 I can see some higher-level libs/drivers are open source (http://malideveloper.arm.com/develop-for-mali/drivers/open-source-mali-gpus-linux-exadri2-and-x11-display-drivers/), but there are still binary blobs that appear to be SoC-specific, not only GPU-specific, AFAIK due to legal reasons. That’s why, for example, you may have Mali-400 support on the Snowball board, but the same Mali driver cannot be used on the Cubieboard.

    New Linux-based mobile OSes (Ubuntu for Phones, Tizen) are getting smart about it: they are compatible with the Android kernel so that they can use the same GPU drivers. At least, that’s what I understand.

  3. Dan
    January 19, 2013

    Granted, they don’t often appear in low-end stuff, but no Qualcomm Adreno? They are a pretty big one to omit, and their GPUs are top of the line. Not surprising given they are ex-ATI: AMD sold off the ATI embedded division to Qualcomm years ago. What a mistake that was.

    The upcoming Adreno 330 is 50% faster than the Adreno 320, which easily beats the Tegra 3.

  4. cnxsoft
    January 19, 2013

    Oops, I’ll try to add it a bit later (early next week). It takes a bit of time to find the information.

  5. onebir
    January 19, 2013

    Well, if you don’t completely understand it, my chances are… limited. Thanks! :)

  6. xranby
    January 19, 2013

    First, I would like to point out that you have missed Broadcom, who design the VideoCore line, in your list of ARM GPU manufacturers.

    One source of driver incompatibility between operating systems is hidden inside libEGL.
    libEGL.so (the library that you use to initialize OpenGL ES) is implemented differently on Android, on X11 GNU/Linux machines, and on the Dispmanx GNU/Linux Raspberry Pi.

    EGL API background:
    It is common, and allowed by the EGL spec, for the libEGL EGLNativeWindowType to differ across OpenGL ES implementations on different operating systems. For example:
    On Windows, EGLNativeWindowType is an “HWND”.
    On Android, EGLNativeWindowType is an “ANativeWindow*” pointer.
    On Linux/X11, EGLNativeWindowType is a “Window”.
    On the Raspberry Pi (Linux/Broadcom VideoCore IV), EGLNativeWindowType is an “EGL_DISPMANX_WINDOW_T”.
    and so on…

    This means that if you try to use a libEGL and OpenGL ES built for Android on an X11 system, it will fail: that libEGL is unable to connect the OpenGL ES render output to the X11 native window surface.

  7. misko
    January 19, 2013

    After all, somebody on the ODROID forum tried to get the A10 Mali drivers running on the Exynos Mali.
    It looks like it works.

  8. cnxsoft
    January 19, 2013

    Nice. Do you know what “it can’t use all PP blocks inside Mali-400” means? Is it just because the Mali-400 in the A10 only uses one core, so we can’t use all the cores inside the Mali-400 MP4 GPU present in the Exynos 4412?

  9.
    January 19, 2013

    This is similar to what I have been doing, here: http://blog.thinkteletronics.com/all-mobile-socsolutions/
    Also, for the SGX544MP2, I don’t think it hits 136 GFLOPS; more likely it’s in the 30 GFLOPS range.

  10.
    January 19, 2013

    I don’t think selling off the mobile division was a bad move. Look at GCN: it scales well (power and performance) even down to ULV parts like the AMD Temash APU.

  11. notzed
    January 19, 2013

    Lima is about developing a free software driver for the GPU. The GPU is a programmable processor just like a CPU, so graphics drivers are just as much software as a compiler is. A binary blob has all the problems associated with a binary blob: bugs can’t be fixed, long-term support is questionable, etc.

    The reason OpenCL isn’t used on Android is that Google has their own crappy alternative and doesn’t like using standards in general – a typical attitude from the over-qualified technicians who work there.

  12. kamejoko
    January 19, 2013

    Did you see @ 200 MHz and 136 GFLOPS @ 532 MHz?

  13. 1.21 jigga watts
    January 20, 2013

    PowerVR SGX544MP2
    136 GFLOPS
    744.8M Tri/s

    Woohaa ;-)
    Really bad typo, huh? ;-)
    (It’s around 34 GFLOPS, and less than 200 MTri/s; can’t remember the exact number there.)

  14. Simon
    January 20, 2013


    That page on the Mali Developer website says:

    “Note that these components are not a complete driver stack. To build a functional OpenGL ES or OpenVG driver you need access to the full source code of the Mali GPU DDK, which is provided under the standard ARM commercial licence to all Mali GPU customers. For a complete integration of the Mali GPU DDK with the X11 environment refer to the Integration Guide supplied with the Mali GPU DDK.”

    This looks like ARM’s fault, for not helping create the complete driver stack and for blocking it with the Mali GPU DDK license issue. This is not something every SoC manufacturer should have to solve on their own; ARM should solve it for all their SoC licensees.

  15. cnxsoft
    January 20, 2013

    @1.21 jigga watts
    It was not a typo, but a mistake on my part.
    I read it’s 6.4 GFLOPS per core @ 200 MHz (http://en.wikipedia.org/wiki/PowerVR#Series_5XT), so I multiplied that by 8 (shader cores) and adjusted for the frequency. The mistake is that I should have multiplied by 2 (GPU cores) instead of 8 (shader cores), so the values were all 4 times larger than in reality.
    This link confirms it’s 12.8 GFLOPS @ 200 MHz – http://www.anandtech.com/show/4413/ti-announces-omap-4470-and-specs-powervr-sgx544-18-ghz-dual-core-cortexa9

    Thanks for correcting my mistake!

  16. m][sko
    January 21, 2013

    cnxsoft :
    Nice. Do you know what “it cant use all PP blocks inside Mali-400″ means? Is it just because Mali-400 only use one core in A10, so we can’t use all the cores inside the Mali-400 MP4 GPU present in Exynos 4412?

    From dmesg:
    Mali PP: Creating Mali PP core: Mali-400 PP

    It really looks like a single GPU core.

  17. Luc Verhaegen
    January 24, 2013

    As the actual main developer of the Lima driver, I have some data to add here.

    The Mali-400 is a split shader design: 1 vertex shader (GP, geometry processor) and 1-4 fragment shaders (PP, pixel processor).

    On the ODROID-X2, they manage to clock the Mali to 533 MHz.

    The GP manages 30 MTri/s at 275 MHz, so at 400 MHz this translates to 43.6 MTri/s, and at 533 MHz (which kind of matches the others) you hit 58.1 MTri/s. The PP manages one pixel per core per clock: 1.6 GPix/s with 4 cores at 400 MHz (so 400 MPix/s per core), or 2.132 GPix/s at 533 MHz. A rather amazing number compared to the competition, and exactly the reason why the Exynos 4 was such a great performer in the mobile space, where the number of triangles tends to be low.

    On the A10, there is only a Mali-400 MP1, so 1 GP and 1 PP. The binary driver we got for it has apparently been built to support only 1 PP, and apparently ignores the other 3 PPs reported by the Exynos kernel. This single PP then has to render the whole display on its own; quadrupling the number of PPs, in a real-world test I have here, gives 4x the performance if the CPU can feed the triangles fast enough (the Rockchip dual A9s will happily do that, compared to the single A8 of my current favourite SoC, the Allwinner A10).

    Yes, SGX MPx rules atm. But throwing more silicon at the problem is not the best solution for the mobile space, especially not with a GPU which is a management and synchronization nightmare even on a single core.

    As for the lima driver, keep your eyes peeled at FOSDEM.

  18. onebir
    January 25, 2013

    @Luc Verhaegen
    Thanks for working on Lima – having non-crippled Linux on (at least some) Arm processors will help a lot of people :)

    For people like me who’ve never heard of FOSDEM, it’s on 2nd & 3rd Feb this year, so not even a very long wait :)

  19. sky770
    January 28, 2013

    @Luc Verhaegen

    +1 Great info!
    Thanks.. :)

  20. Hardik P
    August 15, 2013

    Which graphics is better: 16-core or PowerVR SGX544?

  21. cnxsoft
    August 15, 2013

    @Hardik P
    Which 16-core GPU?

  22. Alex
    September 19, 2013

    “So when it comes to 2D/3D graphics performance, we should not expect Freescale i.MX6 quad core Cortex A9 processor to outperform Rockchip RK3066 dual core Cortex A9 processor, and AllWinner A31 provides excellent graphics performance even if it features slower Cortex A7 cores (4 of them).”

    Sorry, but I don’t get this one. Does that mean SGX544MP2 > Mali-400 MP4 > Tegra 3 > GC2000?

    The reason I ask is that I want to know which chip would be better for gaming/emulation:
    Rockchip RK3188, quad core, 1.8 GHz (Cortex-A9 CPU, ARM Mali-400 MP4 GPU)
    Allwinner A31s, quad core, 1 GHz (Cortex-A7 CPU, PowerVR SGX544 MP2 GPU)

  23. m][sko
    September 20, 2013

    SGX is your friend.

    Look at iPad/iPhone GPU performance versus everybody else.

  24. Alex
    September 20, 2013


    I see… but a 0.8 GHz difference in CPU clock is massive, and the Apple devices don’t use exactly this GPU: http://www.android-hilfe.de/attachments/onda-v972-forum/161044d1358373694-9-onda-v972-quadcore-cpu-2gb-ram-vr-sgx544mp2-gpu-powervr_gpu_compare.jpg

    I’m really confused; the more I read about this topic, the less I understand ^^

    Can you describe the advantages and disadvantages of both options? Ty

  25. vivek
    December 22, 2013

    PowerVR SGX544 vs. Nvidia Tegra 4:
    which is better?
