When type instability matters

07/28/2023, 8:10 PM — 8:20 PM UTC
32-144

Abstract:

Type instabilities are not always bad! Using non-concrete types and avoiding method specialization and type inference can help improve latency and, in specific cases, runtime performance. The latter is observed in inherently dynamic contexts where all possible method signatures cannot be compiled upfront, because code needs to be compiled at points of dynamic dispatch by design. We present a concrete case from our production environment, additional examples, and the related trade-offs.
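
As a hedged illustration (not taken from the talk itself), one standard way to opt out of specialization in Julia is Base's @nospecialize macro, which lets a single compiled method instance serve many argument types:

```julia
# Minimal sketch, assuming only Base functionality: @nospecialize hints the
# compiler not to specialize this method on the type of `x`.
function describe(@nospecialize(x))
    # One compiled instance can be reused for Int, Float64, String, ...
    # instead of compiling a fresh specialization for each argument type.
    return string(typeof(x), ": ", x)
end

describe(1)        # compiles once
describe(1.5)      # reuses the non-specialized code
describe("hello")
```

The trade-off is that such calls may fall back to dynamic dispatch internally, which is exactly the compilation-versus-runtime balance the talk explores.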

Description:

Writing type-stable code has largely been accepted as standard good practice. When the compiler can fully infer the types that flow through a program, it can apply optimizations and specialize, emitting fast code for the specific function signatures that get compiled. Method specialization and type inference are expensive and a major source of the perceived latency when using Julia. While in most cases this one-time cost is acceptable given the performance gains at runtime, there are specific contexts in which the cost of such analyses becomes prohibitive. In this talk, we present situations that may benefit from preventing the compiler from reasoning too much about the code, considering the trade-off between compilation and execution time. Such situations include evaluating generated code, especially when generating new types. We also discuss other cases in which certain data structures, such as Expr and other tree-shaped data structures, benefit from having non-concrete fields.
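
As a rough sketch of that last point (the Node type below is hypothetical; only Expr comes from Base), a tree-shaped structure can deliberately use a non-concrete field so that heterogeneous children do not force extra inference and specialization:

```julia
# Hypothetical tree type in the spirit of Base's Expr, which stores
# head::Symbol and args::Vector{Any}.
struct Node
    head::Symbol
    children::Vector{Any}  # intentionally non-concrete: Nodes, Symbols, literals, ...
end

# Heterogeneous trees can be built without the compiler having to reason
# about every combination of child types.
tree = Node(:call, Any[:+, 1, Node(:call, Any[:*, 2, :x])])

# Base's Expr uses the same layout:
ex = :(1 + 2 * x)
ex.head          # :call
typeof(ex.args)  # Vector{Any}
```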
