Concepedia

TLDR

Future high‑performance virtual machines will improve performance through sophisticated online feedback‑directed optimizations. This paper presents the architecture of the Jalapeño Adaptive Optimization System to support leading‑edge virtual machine technology and enable ongoing research on online feedback‑directed optimizations. The system is built as an extensible, thread‑federated architecture in Java, supporting adaptive multi‑level optimization and online feedback‑directed inlining based on statistical sampling applied to application code, libraries, and the virtual machine itself. Empirical results show the profiling technique incurs low overhead and improves startup and steady‑state performance, even without online feedback‑directed optimizations.

Abstract

Future high-performance virtual machines will improve performance through sophisticated online feedback-directed optimizations. This paper presents the architecture of the Jalapeño Adaptive Optimization System, a system to support leading-edge virtual machine technology and enable ongoing research on online feedback-directed optimizations. We describe the extensible system architecture, based on a federation of threads with asynchronous communication. We present an implementation of the general architecture that supports adaptive multi-level optimization based purely on statistical sampling. We empirically demonstrate that this profiling technique has low overhead and can improve startup and steady-state performance, even without the presence of online feedback-directed optimizations. The paper also describes and evaluates an online feedback-directed inlining optimization based on statistical edge sampling. The system is written completely in Java, applying the described techniques not only to application code and standard libraries, but also to the virtual machine itself.
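To make the sampling idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual implementation) of how a timer-driven statistical sampler might accumulate method samples and flag "hot" methods as candidates for multi-level recompilation. All class and method names here are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of sampling-based hot-method detection: a timer-driven
// sampler records which method was executing at each tick; methods whose
// sample counts cross a threshold would be handed to a recompilation system.
public class MethodSampler {
    private final Map<String, Integer> samples = new HashMap<>();
    private final int hotThreshold;

    public MethodSampler(int hotThreshold) {
        this.hotThreshold = hotThreshold;
    }

    /** Called from a (hypothetical) timer interrupt with the current method. */
    public void sample(String method) {
        samples.merge(method, 1, Integer::sum);
    }

    /** True once a method has accumulated enough samples to count as hot. */
    public boolean isHot(String method) {
        return samples.getOrDefault(method, 0) >= hotThreshold;
    }

    public static void main(String[] args) {
        MethodSampler sampler = new MethodSampler(3);
        // Simulate timer ticks landing mostly in one method.
        sampler.sample("App.compute");
        sampler.sample("App.compute");
        sampler.sample("App.io");
        sampler.sample("App.compute");
        System.out.println(sampler.isHot("App.compute")); // prints "true"
        System.out.println(sampler.isHot("App.io"));      // prints "false"
    }
}
```

Because only a small fraction of timer ticks are processed, a sampler of this shape can keep profiling overhead low, which is the property the abstract's empirical results emphasize; the paper's feedback-directed inlining similarly samples caller-to-callee edges rather than whole methods.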
