Archive

Posts Tagged ‘gpu’

PowerVR SDK v3.4 Supports WebGL, 64-Bit Android 5.0 Lollipop, and MIPS Linux

October 21st, 2014 2 comments

Imagination Technologies has just released PowerVR SDK v3.4, adding the latest compilers for PowerVR Series6 and Series6XT GPUs to PVRShaderEditor, several performance optimizations, a new WebGL SDK, 64-bit support for Android 5.0 Lollipop, and Linux support for MIPS based processors.

The company has revamped the user interfaces of their tools, and made the following key changes:

  • PVRTrace, a tool to capture and analyze OpenGL ES and EGL API calls, now supports OpenGL ES 3.1 and compressed trace files, and its memory usage has been reduced
  • PVRTune, a performance analysis tool, now features new counters and “significant” performance optimizations.
  • PVRShaderEditor, a light-weight shader editing tool, adds the latest compilers for PowerVR Series6 (FP32 and FP16) and Series6XT GPUs, as well as GLSL disassembler output.
  • PVRTexTool, a utility for compressing textures, adds plugin support for Autodesk 3DSMax and Maya (2015 versions), and makes ETC decompression up to 20% faster per surface.

Imagination also claims to have improved documentation with a new SDK Browser, included in the SDK, which provides installation instructions, examples, source code, documents, etc… More details are available on the release notes page.

PowerVR SDK is available for Windows, Mac OS X & Linux (32-/64-bit).

Linaro 14.09 Release with Kernel 3.17 and Android 4.4.4

September 27th, 2014 No comments

Linaro 14.09 has just been released with Linux kernel 3.17-rc4 (baseline), Linux 3.10.54 & 3.14.19 (LSK), and Android 4.4.2 & 4.4.4.

Linaro has kept working on their member boards such as IFC6410 (Qualcomm), D01 (Huawei/Hisilicon), Arndale (Samsung), and Juno (ARM). They’ve also announced they’ll change the tools used to build GCC from cbuild1 to cbuild2 for the next release, and they’ve enabled a build with gcov (for code coverage analysis), which may mean they’ll work on reducing the kernel size by getting rid of unused code. I’ve also noticed the Arndale and Arndale Octa Ubuntu images have been based on Linux LSK with Mali GPU support since last month.

Here are the highlights of this release:

  • Linux Linaro 3.17-rc4-2014.09
    • GATOR version 5.19
    • updated topic from Qualcomm LT (ifc6410 board support) and HiSilicon LT
    • updated Versatile Express ARM64 support (FVP Base and Foundation models, Juno) from ARM LT.
    • updated Versatile Express patches from ARM LT
    • updated LLVM topic (follows the community llvmlinux-latest branch)
    • Big endian support (the 2014.05 topic version rebased to 3.17 kernel)
    • config fragments changes – added gcov config fragment, disabled DRM_EXYNOS_IOMMU to work around boot failure on Arndale
  • Linaro Toolchain Binaries 2014.09
    • based on GCC 4.9 and updated to latest Linaro TCWG releases: Linaro GCC 4.9-2014.09, Linaro binutils 2.24-2014.09, and Linaro GDB 7.8-2014.09.
    • This will be the last release done with cbuild1 and crosstool-ng. Next releases will be done with cbuild2. Official support for very old host environments will be dropped.
  • Linaro builds of AOSP 14.09 built with Linaro GCC 4.9-2014.09.
  • Linaro OpenEmbedded 2014.09
    • integrated Linaro GCC 4.9-2014.09, Linaro binutils 2.24-2014.09, integrated Linaro GDB 7.8-2014.09.
    • imported Linaro eglibc 2.19 into meta-linaro after OE-core switched to glibc 2.20
    • fixed shadow securetty for Qualcomm and STMicroelectronics SoCs
    • upstreaming – fixed libpng on aarch64 (neon symbol), updated PM QA to 0.4.14, updated libunwind to include aarch64 support
  • Linaro Ubuntu 14.09
    • added linux-tools (standalone perf, split from the kernel build)
    • updated packages: Juno firmware 0.8.1, LSK 3.10.55/3.14.19 and linux-linaro 3.17-rc4 kernels.
  • A gcov enabled build has been added
  • Linaro builds of the Android NDK have been updated to current upstream sources and current Linaro toolchain component releases.
  • Standalone Android toolchain binary builds now use Linaro binutils for improved armv8 support.

You can visit https://wiki.linaro.org/Cycles/1409/Release for a list of known issues, and further release details about the LEB, LMB (Linaro Member Builds), and community builds, as well as Android, Kernel, Graphics, Multimedia, Landing Team, Platform, Power management and Toolchain components.

Some Projects on Nvidia Jetson TK1 Development Board: Nintendo Emulator, USB3 Webcam, and Robotics

August 4th, 2014 4 comments

Nvidia Jetson TK1 is a development board powered by the company’s Tegra K1 quad core Cortex-A15 processor, and especially a Kepler GPU that supports OpenGL 4.4. It shipped to developers around April/May, and some of them have showcased their projects or tested some hardware.

Dolphin Emulator on Nvidia Jetson TK1

Dolphin is an emulator for the Nintendo GameCube and Wii consoles that supports full HD (1080p) rendering and runs on Windows, Linux and Mac OS, and there’s also an alpha version for Android. Ryan Houdek (Sonicadvance1), one of Dolphin’s developers, has leveraged Kepler’s OpenGL support via Nvidia’s GPU drivers to port the emulator to the platform running Ubuntu, but it should work as well on Tegra K1 hardware running Android, such as the Xiaomi MiPad tablet. You can watch a Mario Kart: Double Dash demo running at full speed on the Nvidia board below. According to the developer, such a framerate would not be achievable on the Qualcomm Snapdragon 800 because “Adreno Graphics Drivers are grossly inefficient compared to the TK1”.

The latest version of Dolphin for Android (beta) dates back to December 7, 2013, so I’d assume the optimizations shown above are not available right now. You can find more demos on Ryan Houdek’s YouTube Channel.

USB3.0 Webcam @ 1080p30

Another developer, Korneliusz Jarzębski, has tested e-con Systems’ See3CAM_80 USB 3.0 HD camera connected to the board’s USB 3.0 port, using the camera’s “See3CAM” application. I understand that all that needed to be done was to enable hidraw support for USB devices in the Linux kernel, and it just worked out of the box. The application can perform real-time video processing, applying video filters (invert, particles, etc.), as well as changing image characteristics such as brightness, contrast and so on.
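
Once hidraw support is enabled in the kernel, the camera’s extension controls are exposed to userspace via a /dev/hidrawX node. Below is a minimal C++ sketch, not taken from Korneliusz’s setup, that simply opens such a node, queries the device name and reads one raw report; the device path and report size are assumptions.

```cpp
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/hidraw.h>

int main() {
    // Device node is an assumption; the actual index depends on enumeration order.
    int fd = open("/dev/hidraw0", O_RDWR);
    if (fd < 0) { perror("open /dev/hidraw0"); return 1; }

    char name[256] = {0};
    ioctl(fd, HIDIOCGRAWNAME(sizeof(name)), name);   // query the device name string
    printf("hidraw device: %s\n", name);

    unsigned char report[64];                        // report size is a placeholder
    ssize_t n = read(fd, report, sizeof(report));    // read one raw HID report
    printf("read %zd bytes\n", n);

    close(fd);
    return 0;
}
```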

You can find a little more on his blog (Polish).

“Super-Computer-On-Legs” Robot

The last demo I’ll show today is a robot powered by the Jetson TK1 board that can walk to the nearest person it can see. The robot detects people via a camera and GPU accelerated face detection (about 5 times faster than CPU-only face detection). Besides the better performance, the robot is pretty power-efficient, as it only draws about 7 watts and lasts about 45 minutes powered by a small LiPo battery. The robot was showcased at the Robotics Science and Systems Conference last month, and while attendees were impressed by the performance and power consumption, they still noticed the board was a bit too big for most robots, especially quadcopters. But the platform clearly has potential, and Shervin Emami, the person behind the project who happens to work for… Nvidia, mentioned work is being done on smaller Tegra K1 computer-on-modules that can be installed on a custom robot motherboard without unnecessary ports.
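
The robot’s source code isn’t published in the article, but a GPU-accelerated face detector along these lines can be put together with OpenCV’s CUDA module. The sketch below is a hypothetical example, not Shervin Emami’s implementation; the cascade file name, camera index and the OpenCV 3.x CUDA API (OpenCV built with CUDA support) are assumptions.

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/cudaobjdetect.hpp>

int main() {
    // Cascade file path is a placeholder; any OpenCV Haar face cascade will do.
    auto cascade = cv::cuda::CascadeClassifier::create("haarcascade_frontalface_default.xml");

    cv::VideoCapture cap(0);                 // default camera
    cv::Mat frame, gray;
    cv::cuda::GpuMat gpuGray, gpuObjects;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        gpuGray.upload(gray);                             // copy the frame to GPU memory
        cascade->detectMultiScale(gpuGray, gpuObjects);   // GPU-accelerated detection

        std::vector<cv::Rect> faces;
        cascade->convert(gpuObjects, faces);              // bring results back to the host
        for (const auto& r : faces) cv::rectangle(frame, r, {0, 255, 0}, 2);

        cv::imshow("faces", frame);
        if (cv::waitKey(1) == 27) break;                  // ESC to quit
    }
    return 0;
}
```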

If you are interested in seeing more projects running on Jetson TK1 development board, you can consider following “Embedded Tegra & Jetson TK1 Blog” on Google+.

ARM TechCon 2014 Schedule – 64-Bit, IoT, Optimization & Debugging, Security and More

July 23rd, 2014 No comments

ARM Technology Conference (TechCon) 2014 will take place on October 1 – 3, 2014, in Santa Clara, and as every year, there will be a conference with various sessions suitable for engineers and managers, as well as an exposition where companies showcase their latest ARM based products and solutions. The detailed schedule for the conference has just been made available. Last year, there were 90 sessions organized into 15 tracks, but this year, despite receiving 300 applications, the organizers decided to scale it down a bit, and there will be 75 sessions in the following 11 tracks:

  • Chip Implementation
  • Debugging
  • Graphics
  • Heterogeneous Compute
  • New Frontiers
  • Power Efficiency
  • Safety and Security
  • Software Development and Optimization
  • Software Optimization for Infrastructure and Cloud
  • System Design
  • Verification

There are also some paid workshops that take all day with topics such as “Android (NDK) and ARM overview”, “ARM and the Internet of Things”, or “ARM Accredited Engineer Programs”.

As usual, I’ve gone through the schedule builder, and come up with a virtual schedule of interesting sessions for the 3-day event:

Wednesday – 1st of October

In this session, Dr. Saied Tehrani will discuss how Spansion’s approach to utilize the ARM Cortex-R line of processors to deliver energy efficient solutions for the automotive MCU market has led the company to become a vital part of the movement toward connectivity in cars. Beginning with an overview of the auto industry’s innovation and growth in connected car features, he will explain how these systems require high performance processing to give drivers the fluid experience they expect. Highlights in security and reliability with ARM Cortex-R, including Spansion’s Traveo Family of MCU’s will also be presented.

HEVC and VP9 are the latest video compression standards, significantly improving compression ratios compared to their widely used predecessors, H.264 and VP8. In this session the following will be discussed:

  • The market need for GPU accelerated HEVC and VP9 decoders
  • Challenges involved in offloading video decoding algorithms to a GPU, and how Mali GPU is well suited to tackle them
  • Improvement in power consumption and performance of Mali GPU accelerated decoder
  • big.LITTLE architecture and CCI/CCN’s complementing roles in improving the GPU accelerated video decoder’s power consumption

ARM’s Cortex-M family of embedded processors are delivering energy-efficient, highly responsive solutions in a wide variety of application areas right from the lowest-power, general-purpose microcontrollers to specialised devices in advanced SoC designs. This talk will examine how ARM plans to grow the ARM Cortex-M processor family to provide high performance together with flexible memory systems, whilst still maintaining the low-power, low-latency characteristics of ARM’s architecture v7M.

IoT devices as embedded systems cover a large range of devices from low-power, low-performance sensors to high-end gateways. This presentation will highlight the elements an embedded engineer needs to analyse before selecting the MCU for his design. Software is fundamental in IoT: from networking to power management, from vertical market protocols to IoT Cloud protocols and services, from programming languages to remote firmware update, these are all design criteria influencing an IoT device design. Several challenges specific to IoT design will be addressed:

  • Code size and RAM requirements for the major networking stacks
  • Optimizing TCP/IP resources versus performance
  • Using Java from Oracle or from other vendors versus C
  • WiFi (radio only or integrated module)
  • Bluetooth (Classic versus LE) IoT protocols

Thursday – 2nd of October

Amongst ARM’s IP portfolio we have CPUs, GPUs, video engines and display processors, together with fabric interconnect and POP IP, all co-designed, co-verified and co-optimized to produce energy-efficient implementations. In this talk, we will present some of the innovations ARM has introduced to reduce memory bandwidth and system power, both in the IP blocks themselves and the interactions between them, and how this strategy now extends to the new ARM Mali display processors.

Designing a system that has to run on coin cells? There’s little accurate information available about how these batteries behave in systems that spend most of their time sleeping. This class will give design guidance on the batteries, plus examine the many other places power leakages occur, and offer some mitigation strategies.

64-bit is the “new black” across the electronics industry, from server to mobile devices. So if you are building or considering building an ARMv8-A SoC, you shall attend this talk to either check that you know everything or find out what you shall know! Using the ARMv8 Juno ARM Development Platform (ADP) as reference, this session will cover:

  • The ARMv8-A hardware compute subsystem architecture for Cortex-A57, Cortex-A53 & Mali based SoC
  • The associated ARMv8-A software stack
  • The resources available to 64-bit software developers
  • Demonstration of the Android Open Source Project for ARMv8 running on Juno.

Rapid prototyping platforms have become a standard path to develop initial design concepts. They provide an easy-to-use interface with a minimal learning curve and allow ideas to flourish and quickly become reality. Transitioning from a simple, easy-to-use rapid prototyping system can be daunting, but shouldn’t be. This session presents options for starting with mbed as a prototyping environment and moving to full production with the use of development hardware, the open-source mbed SDK and HDK, and the rich ARM ecosystem of hardware and software tools. Attendees will learn how to move from the mbed online prototyping environment to full production software (see also the short mbed sketch after the list), including:

  • Exporting from mbed to a professional IDE
  • Full run-time control with debugging capabilities
  • Leveraging an expanded SDK with a wider range of integration points
  • Portability of applications from an mbed-enabled HDK to your custom hardware
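
To give an idea of how compact an mbed prototype is, here is a minimal, hypothetical program against the classic mbed SDK; the same source can be exported from the online compiler to an offline IDE. Pin names such as LED1, USBTX and USBRX depend on the target board.

```cpp
#include "mbed.h"

DigitalOut led(LED1);        // on-board LED (board-specific pin name)
Serial pc(USBTX, USBRX);     // virtual serial port over the debug USB interface

int main() {
    pc.printf("mbed application started\r\n");
    while (true) {
        led = !led;          // toggle the LED
        wait(0.5);           // delay in seconds (classic mbed SDK API)
    }
}
```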

Statistics is often perceived as scary and dull… but not when you apply it to optimizing your code! You can learn so much about your system and your application by using relatively simple techniques that there’s no excuse not to know them. This presentation will use no slides but will step through a fun and engaging demo of progressively optimizing OpenCL applications on an ARM-powered Chromebook using IPython. Highlights will include analyzing performance counters using radar diagrams, reducing performance variability by optimizing for caches and predicting which program transformations will make a real difference before actually implementing them.
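
To give a flavour of the “relatively simple techniques” mentioned, here is a small, self-contained C++ sketch that times a stand-in workload repeatedly and reports mean, standard deviation, median and minimum; the workload function is only a placeholder for a real OpenCL kernel dispatch and wait.

```cpp
#include <algorithm>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

// Placeholder workload standing in for an OpenCL kernel launch + wait.
static void runKernelOnce() {
    volatile double acc = 0.0;
    for (int i = 0; i < 1000000; ++i) acc += i * 0.5;
}

int main() {
    const int runs = 30;
    std::vector<double> ms(runs);
    for (int i = 0; i < runs; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        runKernelOnce();
        auto t1 = std::chrono::steady_clock::now();
        ms[i] = std::chrono::duration<double, std::milli>(t1 - t0).count();
    }

    const double mean = std::accumulate(ms.begin(), ms.end(), 0.0) / runs;
    double var = 0.0;
    for (double x : ms) var += (x - mean) * (x - mean);
    const double stddev = std::sqrt(var / (runs - 1));

    std::sort(ms.begin(), ms.end());
    std::printf("mean %.3f ms, stddev %.3f ms, median %.3f ms, min %.3f ms\n",
                mean, stddev, ms[runs / 2], ms.front());
    return 0;
}
```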

Friday – 3rd of October

The proliferation of mobile devices has led to the need of squeezing every last micro-amp-hour out of batteries. Minimizing the energy profile of a micro-controller is not always straightforward. A combination of sleep modes, peripheral control and other techniques can be used to maximize battery life. In this session, strategies for optimizing micro-controller energy profiles will be examined which will extend battery life while maintaining the integrity of the system. The techniques will be demonstrated on an ARM Cortex-M processor, and include a combination of power modes, software architecture design techniques and various tips and tricks that reduce the energy profile.
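
As a rough illustration of the sleep-mode side of such strategies, the sketch below parks a Cortex-M core in sleep until an interrupt arrives, using standard CMSIS-Core intrinsics; the device header name is a placeholder, and a real design would also gate peripheral clocks and select the appropriate low-power mode.

```cpp
#include <stdint.h>
#include "cmsis_device.h"   // placeholder for the vendor CMSIS device header, e.g. stm32f4xx.h

volatile uint32_t pending_events = 0;   // set from interrupt handlers

int main(void) {
    // Uncomment to request deep sleep instead of regular sleep on WFI:
    // SCB->SCR |= SCB_SCR_SLEEPDEEP_Msk;

    while (1) {
        while (pending_events == 0) {
            __WFI();                    // sleep until the next interrupt wakes the core
        }
        // Handle the event(s), power down unused peripherals, then go back to sleep.
        pending_events = 0;
    }
}
```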

One of the obstacles to IoT market growth is guaranteeing interoperability between devices and services. Today, most solutions address application requirements for specific verticals in isolation from others. Overcoming this shortcoming requires adoption of open standards for data communication, security and device management. Economics, scalability and usability demand a platform that can be used across multiple applications and verticals. This talk covers some of the key standards like Constrained Application Protocol (CoAP), OMA Lightweight M2M and 6LoWPAN. The key features of these standards, like Caching Proxy, Eventing, Grouping, Security and Web Resource Model, for creating efficient, secure, and open standards based IoT systems will also be discussed.

Virtual prototypes are gaining widespread acceptance as a strategy for developing and debugging software, removing the dependence on the availability of hardware. In this session we will explore how a virtual prototype can be used productively for software debug. We will explain the interfaces that exist for debugging and tracing activity in the virtual prototype, how these are used to attach debug and analysis tools, and how these differ from (and improve upon) equivalent hardware capabilities. We will look in depth at strategies for debug and trace and how to leverage the advantages that the virtual environment offers. The presentation will further explore how the virtual prototype connects to hardware simulators to provide cross-domain (hardware and software) debug. The techniques will be illustrated through case studies garnered from experiences working with partners on projects over the last few years.

Attendees will learn:

  • How to set up a Virtual Prototype for debug and trace
  • Connecting debuggers and other analysis tools.
  • Strategies for productive debug of software in a virtual prototype.
  • How to set up trace on a virtual platform, and analyse the results.
  • Hardware in the loop: cross domain debug.
  • Use of Python to control the simulation and trace interfaces for a virtual platform.

14:30 – 15:20 – GPGPU on ARM Systems by Michael Anderson, Chief Scientist, The PTR Group, Inc.

ARM platforms are increasingly coupled with high-performance Graphics Processor Units (GPUs). However, the GPU can do more than just render graphics: today’s GPUs are highly integrated multi-core processors in their own right and are capable of much more than updating the display. In this session, we will discuss the rationale for harnessing GPUs as compute engines and their implementations. We’ll examine Nvidia’s CUDA, OpenCL and RenderScript as a means to incorporate high-performance computing into low power draw platforms. This session will include some demonstrations of various applications that can leverage the general-purpose GPU compute approach.
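
For readers new to GPU compute on ARM, the sketch below shows roughly what a minimal OpenCL 1.1 host program looks like (a simple vector addition); error handling is omitted for brevity, and it assumes an OpenCL driver and headers are installed on the target platform.

```cpp
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// OpenCL C kernel source: element-wise vector addition.
static const char* kSrc = R"CLC(
__kernel void vadd(__global const float* a, __global const float* b, __global float* c) {
    int i = get_global_id(0);
    c[i] = a[i] + b[i];
}
)CLC";

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "vadd", nullptr);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), a.data(), nullptr);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), b.data(), nullptr);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), nullptr, nullptr);

    clSetKernelArg(kernel, 0, sizeof(cl_mem), &da);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &db);
    clSetKernelArg(kernel, 2, sizeof(cl_mem), &dc);

    size_t global = n;
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, dc, CL_TRUE, 0, n * sizeof(float), c.data(), 0, nullptr, nullptr);

    std::printf("c[0] = %f (expected 3.0)\n", c[0]);
    return 0;
}
```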

Abstract currently not available.

That’s 14 sessions out of the 75 available, and you can make your own schedule depending on your interests with the schedule builder.

In order to attend ARM TechCon 2014, you can register online, or simply show up and pay the regular on-site rate, but that will cost you, or your company, extra.

Pass type                      Super Early Bird Rate   Early Bird Rate   Advanced Rate         Regular Rate
                               (ended June 27)         (ends August 8)   (ends September 19)
VIP                            $999                    $1,299            $1,499                $1,699
All-Access                     $799                    $999              $1,199                $1,399
General Admission              $699                    $899              $1,099                $1,299
AAE Training                   $249                    $299              $349                  $399
Software Developers Workshop   $99                     $149              $199                  $249
Expo                           FREE                    FREE              $29                   $59

There are more types of passes this year, but the 2-day and 1-day passes have gone out of the window. The expo pass used to be free at any time, but this year you need to register before August 8 to get it for free. VIP and All-Access passes provide access to all events, General Admission excludes the AAE and software developer workshops, and the AAE Training and Software Developers Workshop passes give access to the expo plus the corresponding workshops. Discounts of up to 30% are available for groups.

Imagination Technologies Unveils Low Power Low Footprint PowerVR GX5300 GPU for Wearables

July 22nd, 2014 No comments

Up to now, most wearables have been based on MCU solutions or derived from mobile platforms, which may either not provide the advanced features required by users, or consume too much power and take more space than needed. With Ineda Dhanush and Mediatek Aster, we’ve already seen silicon vendors design wearable SoCs, and now Imagination Technologies has just announced the PowerVR GX5300 GPU targeting wearables, with support for OpenGL ES 2.0, 480p to 720p resolutions, and a silicon area of just 0.55 mm² on a 28nm process.

PowerVR GX5300 Block Diagram

The PowerVR GX5300 GPU will support Android, Android Wear, and Linux based operating systems, and according to the company it has the following key features:

  • Unified shaders – The TBDR graphics architecture offers unified shaders where vertex, pixel and GPU compute resources are scaled simultaneously.
  • Low power and high precision graphics – All PowerVR GPUs offer a mix of low (FP16) and high precision (FP32) rendering and implement the full OpenGL ES 2.0 specification.
  • Reduced memory footprint – PowerVR GX5300 supports PVRTC, a texture compression format which reduces memory bandwidth and decreases power consumption. It can help silicon vendors reduce memory costs (see the sketch after this list).
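
As an illustration of how applications consume PVRTC textures, here is a minimal, hypothetical OpenGL ES 2.0 snippet uploading a pre-compressed PVRTC 4bpp RGB surface (for example produced with PVRTexTool); the function name and parameters are placeholders.

```cpp
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Upload an already-compressed PVRTC 4bpp RGB texture. 'data' and 'dataSize'
// would typically come from a .pvr file generated by PVRTexTool.
GLuint uploadPvrtcTexture(const void* data, GLsizei width, GLsizei height, GLsizei dataSize) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG,
                           width, height, 0, dataSize, data);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
```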

Typical applications will be embedded Linux or Android-based connected home systems that require graphics rendering such as smart washing machines, and wearables running Android Wear such as smartwatches.

PowerVR GX5300 is available for licensing now, but it has not been announced in any wearable SoCs just yet, so it’s probably something we’ll see in products in 2015.

Hardkernel Unveils $179 ODROID-XU3 Development Board Powered by Samsung Exynos 5422 SoC

July 8th, 2014 9 comments

Remember the ODROID-XU2 development board based on Exynos 5420? The bad news is that it apparently got scrapped, but the good news is that it gave birth to the ODROID-XU3 development board powered by the latest Samsung Exynos 5422 octa core big.LITTLE SoC, with support for Ubuntu 14.04 and Android 4.4, including GPU 3D acceleration, with the company promising a full desktop experience in Ubuntu.

ODROID-XU3 specifications:

  • SoC – Samsung Exynos 5422 quad core ARM Cortex-A15 @ 2.0GHz + quad core ARM Cortex-A7 @ 1.4GHz with Mali-T628 MP6 GPU supporting OpenGL ES 3.0 / 2.0 / 1.1 and OpenCL 1.1 Full profile
  • System Memory – 2GB LPDDR3 RAM PoP (933MHz, 14.9GB/s memory bandwidth, 2x 32-bit bus)
  • Storage – Micro SD slot (up to 64GB) + eMMC 5.0 module socket (16, 32, or 64GB module available)
  • Video Output – micro HDMI (Up to 1080p) and DisplayPort (up to 2160p)
  • Audio Output – micro HDMI and 3.5mm headphone jack
  • Network Connectivity – 10/100Mbps Ethernet (via LAN9514 USB hub + Ethernet controller)
  • USB – 1x USB 3.0 host port, 1x USB 3.0 micro USB OTG port, 4x USB 2.0 ports
  • Expansion – 30-pin header for access to GPIO, IRQ, SPI and ADC signals
  • Debugging – Serial console header
  • Misc – Accurate current sensors and voltage sensors for energy measurement, Power and RGB LEDs, cooling fan header
  • Power Supply – 5V/4A power adapter using 5.5/2.1mm barrel.
  • Dimensions – PCBA: 94x70x18mm; Enclosure: 98x74x29mm

ODROID-XU3 Block Diagram

The company can also provide USB modules / dongles for optical S/PDIF output, Gigabit Ethernet (USB 3.0 to Ethernet adapter), Wi-Fi 802.11b/g/n 1T1R with antenna, and 2.5″/3.5″ SATA drives (USB 3.0 to SATA III adapter). Hardkernel will provide Ubuntu 14.04 with OpenGL ES + OpenCL support, and Android 4.4.2, both based on Linux kernel 3.10 LTS, and the source code will soon be available on their GitHub account. This is also the first ODROID board that supports the Heterogeneous Multi-Processing / Global Task Scheduling implementation of big.LITTLE processing. You can get an overview of the board and see Ubuntu 14.04 running an OpenGL ES 3.0 demo, and playing a windowed YouTube video – which makes me think hardware video decoding may not be implemented yet – in the video below.

ODROID-XU3 comes with a plastic case, an active cooler and a 5V/4A power adapter, and you can pre-order it for $179 + shipping on Hardkernel’s website, with deliveries scheduled for the 18th of August 2014. There’s no internal storage on the board, so you’ll need a micro SD card (Class 10 strongly recommended), or even better, purchase one of the eMMC modules with the board. The 16GB eMMC module sells for $39 and includes an adapter to connect it to your PC. You can get more information, and/or purchase the board and a few of its 23 accessories, on Hardkernel’s ODROID-XU3 product page.

Google Releases Android L (Lollipop?) Developer Preview

June 26th, 2014 2 comments

Google I/O is taking place right now in San Francisco, and the company made several announcements. Although they have not announced the full codename of Android 5.0, referring to the next version simply as “Android L” (Lollipop would be nice though), they’ve already documented the key changes made to Android L, and a developer preview will be released later today (26 June), together with binary images for Google Nexus 5 and Nexus 7.

Besides the smartphone and tablet developer preview, there will be 3 other SDKs for Android L:

  • Android Wear SDK – Android for wearables with sync notifications, wearable apps, data transfer APIs, and voice actions, e.g. “Ok Google, call mum”.
  • Android TV Preview SDK – Android for TVs with pre-built fragments for browsing and interacting with media catalogs, in-app search, and recommendations.
  • Android Auto SDK – Android for the car with apps featuring consistent user experience between vehicles, and minimizing distractions.

I’ll go through various software and hardware announcements for Android Wear and TV in separate blog posts, and probably skip Android Auto for now.

So what’s new in Android L Developer Preview?

Material Design

Material Design is a new design language that will let developers create apps which look similar to Google Now. Google chose the name “Material” as it is apparently inspired by real materials such as paper and ink. The Android L user interface will be entirely designed with Material Design. The best way to understand it is to look at an example.

Gmail Now vs Gmail “L”

On the left, we’ve got the current Gmail app, and on the right the newly designed app for Android L. Much of it looks like cosmetic changes, but you’ll have noticed the three-dot and new mail icons are gone, and all menus will be accessible via the top left icon. There are also some light and shadow effects that will make users feel like they’re touching real elements.

More details can be found in this Material Design presentation (PDF).

Improved Notifications

Notifications have also changed with a new design based on Material, and the ability to display notifications on the lock screen.

I understand lockscreen notifications are optional, and you can hide them from the lock screen using visibility controls. As you can see from the screenshot above, it works very much like Google Now, with cards that you can discard once you’re done. Notifications will also be able to pop up in games or other full screen apps, and you’ll be able to take action within the notification, for example by declining or accepting a video call request.

Recents

The list of recent apps will become the list of recent everything, simply called “Recents”, as it will include apps, web pages, and documents.

Better Tools for Improving Battery Life

As devices become more powerful, they also become more power hungry, despite efforts by SoC designers to reduce energy usage. Badly programmed apps are however the main culprit of short battery life, so Google has introduced Project Volta to help users and developers optimize power consumption. Developers can use the “Battery Historian” tool to monitor the power consumption of different processes, and see which hardware block (e.g. the cellular radio) is currently being used.

Users will also have their own feature dubbed “Battery Saver” to improve battery life, and Google claims their Nexus 5 should be able to last an extra 90 minutes on a charge with Battery Saver enabled. This is achieved by reducing the performance of the device once the battery has dropped below 20% charge. At that point, a notification pops up to let the user choose whether to enable Battery Saver mode.

Under the hood improvements

As has been widely reported, Google recently killed Dalvik in a commit to AOSP, and ART will become the default Java runtime, using ahead-of-time compilation for faster application loading and improved memory usage. Google also claims it provides true cross-platform support for ARM, MIPS and x86.

Android L will support 64-bit instruction sets including ARMv8, x86-64 and MIPS64. This will provide a larger number of registers and increased addressable memory space. Java developers won’t need to change their apps for 64-bit support. One of the first 64-bit Android devices is likely to be the Nexus 9 tablet powered by Nvidia Tegra K1 “Denver”, as previously reported.
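
As a side note for native (NDK) code, which is built per ABI, a trivial illustrative check of the compiler’s predefined architecture macros looks like this (the macros are standard GCC/Clang ones; the program itself is hypothetical).

```cpp
#include <cstdio>

// Report which ARM ABI this binary was compiled for, using standard
// GCC/Clang predefined macros (illustrative only).
int main() {
#if defined(__aarch64__)
    std::puts("Built for 64-bit ARMv8-A (arm64-v8a)");
#elif defined(__arm__)
    std::puts("Built for 32-bit ARM (armeabi / armeabi-v7a)");
#else
    std::puts("Built for a non-ARM architecture (e.g. x86, x86_64, MIPS)");
#endif
    return 0;
}
```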

On the graphics side, Android L adds support for OpenGL ES 3.1, and includes the Android Extension Pack, which provides tessellation and geometry shaders and other features that should bring PC and console class graphics to Android games, according to Google.
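
Apps that rely on the Android Extension Pack are expected to check for it at runtime; a hypothetical C++ (NDK) helper, assuming an OpenGL ES 3.1 context is already current, might look like this.

```cpp
#include <GLES3/gl31.h>
#include <cstring>

// Returns true if the GL_ANDROID_extension_pack_es31a extension is exposed
// by the current OpenGL ES 3.1 context (a context must be bound when calling this).
bool hasAndroidExtensionPack() {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, "GL_ANDROID_extension_pack_es31a") == 0) {
            return true;
        }
    }
    return false;
}
```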

Via Anandtech and Liliputing
