Design, Implementation, and Evaluation of Adaptive Code Unloading for Resource-Constrained Devices


ACM Transactions on Architecture and Code Optimization (TACO)
Vol. 2, No. 2, June 2005, pages 131-164.

Lingli Zhang and Chandra Krintz


Abstract

Java Virtual Machines (JVMs) for resource-constrained devices, e.g., hand-helds and cell phones, commonly employ interpretation for program translation. However, compilers produce significantly better code quality than interpreters, and hence use device resources more efficiently, since they can analyze large sections of code at once and exploit optimization opportunities. Moreover, compilation-based systems store code for reuse by future invocations, obviating the redundant computation required to re-interpret the same code.

However, the code storage required for compilation can significantly increase the memory footprint of the VM. As a result, on devices with limited memory, this additional code storage may preclude some programs from executing, significantly increase memory management overhead, and substantially reduce the amount of memory available to the application.

To address the increased memory pressure imposed by native code storage, we present the design, implementation, and empirical evaluation of a compiled-code management system that can be integrated into any compilation-based JVM. The system reduces the memory footprint of the VM by dynamically identifying and unloading dead or infrequently used compiled code; if unloaded code is later reused, the system recompiles it. As such, our system adaptively trades off memory footprint, and its associated memory management costs, against recompilation overhead. Our empirical evaluation shows that our code management system significantly reduces the memory requirements of a compile-only JVM while maintaining the performance benefits enabled by compilation.
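
To make the mechanism concrete, the following is a minimal sketch in Java of a code cache that evicts the least frequently used compiled methods when a memory budget is exceeded and recompiles them on demand. The class, names, and threshold are hypothetical illustrations based on the abstract's description, not the paper's actual implementation.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of a compiled-code manager: unloads cold native
    // code under memory pressure and recompiles it on reuse.
    class CodeUnloadingManager {
        // Per-method bookkeeping: compiled code plus an approximate use count.
        private static class Entry {
            byte[] nativeCode;   // compiled code, or null if unloaded
            long useCount;       // approximate invocation count
        }

        private final Map<String, Entry> codeCache = new HashMap<>();
        private final long maxCodeBytes;   // illustrative, resource-based code budget
        private long currentCodeBytes;

        CodeUnloadingManager(long maxCodeBytes) {
            this.maxCodeBytes = maxCodeBytes;
        }

        // Called on each method invocation; recompiles if the code was unloaded.
        byte[] lookup(String method) {
            Entry e = codeCache.computeIfAbsent(method, m -> new Entry());
            e.useCount++;
            if (e.nativeCode == null) {
                e.nativeCode = compile(method);   // recompilation on reuse
                currentCodeBytes += e.nativeCode.length;
                maybeUnload(e);                   // keep footprint under budget; keep e resident
            }
            return e.nativeCode;
        }

        // Evict the least frequently used compiled methods while over budget.
        private void maybeUnload(Entry keep) {
            while (currentCodeBytes > maxCodeBytes) {
                Entry victim = null;
                for (Entry e : codeCache.values()) {
                    if (e != keep && e.nativeCode != null
                            && (victim == null || e.useCount < victim.useCount)) {
                        victim = e;
                    }
                }
                if (victim == null) break;               // nothing else to evict
                currentCodeBytes -= victim.nativeCode.length;
                victim.nativeCode = null;                // discard; recompiled on next use
                victim.useCount = 0;                     // let stale hotness information decay
            }
        }

        private byte[] compile(String method) {
            // Placeholder: a real VM would invoke its JIT compiler here.
            return new byte[64];
        }
    }

The design point this sketch captures is that unloading is speculative: evicting cold code costs only a possible recompilation later, so the manager can reclaim code memory aggressively when resources are scarce.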

We investigate a number of implementation alternatives that use dynamic program behavior and system resource availability to determine when to unload as well as what code to unload. From our empirical evaluation of these alternatives, we identify a set of strategies that enable significant reductions in the memory overhead required for application code. Our system reduces code size by 36%-62% on average, which translates into significant execution time benefits for the benchmarks and JVM configurations that we studied.
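
For illustration, the two decision axes (when to unload, what to unload) can be framed as in the sketch below. The specific policy names are our own hypothetical examples, not necessarily the exact alternatives the paper evaluates.

    // Illustrative policy axes for adaptive code unloading.
    enum UnloadTrigger {
        ON_GC,               // react to garbage-collection events (memory pressure signal)
        ON_TIMER,            // unload periodically at a fixed interval
        ON_CODE_SIZE_LIMIT   // unload when the code cache exceeds a resource-based budget
    }

    enum VictimSelection {
        LEAST_RECENTLY_USED,   // evict code not executed since the last check
        LEAST_FREQUENTLY_USED  // evict code with the lowest invocation counts
    }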