Paper Review: The 800 Pound Python in the Machine Learning Room

High-level

Summary:

The paper demonstrates the ability to overcome the shortcomings of existing tracing- and translation-based approaches by performing a relatively simple source-to-source transformation that allows operator-overloading techniques to be extended to language built-ins, including control-flow operators, function definitions, etc.
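To make the idea concrete, here is a minimal sketch (all names invented, not the paper's actual API) of what "extending operator overloading to built-ins" looks like: an `if` statement is rewritten into a call to a hypothetical `_if` function, which a back end can then overload just like `__add__` on tensor objects.

```python
# Hypothetical sketch of control-flow "virtualization". `_if` is an
# invented name; the paper's actual transformation differs in detail.

# Original user code:
#     if x > 0:
#         y = x
#     else:
#         y = -x

def _if(cond, then_fn, else_fn):
    # A plain-Python back end just executes one branch; a staging back
    # end could instead record both branches into a graph.
    return then_fn() if cond else else_fn()

# Conceptually, the source-to-source transformation turns the `if`
# statement above into this overloadable function call:
def absolute(x):
    return _if(x > 0, lambda: x, lambda: -x)

print(absolute(-3))  # -> 3
```

Because `_if` is an ordinary function, a back end can intercept it even when `cond` is a symbolic value, which plain tracing cannot do.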

This paper is basically about an implementation of LMS (Lightweight Modular Staging) in Python; at its core, it is about code transformation at runtime.

Evaluation:

The semantics of the transformation are modeled in PLT Redex. It enables a form of multi-stage programming in Python. The required transformation is captured in a proof-of-concept, back-end-agnostic system, and it is used in AutoGraph.

There are no large-scale experiments; how can I know whether this approach works or not?

Takeaways:

Operator overloading: record executed tensor operations via tracing based on operator overloading.
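A minimal sketch of this tracing technique (the `Tracer` class and tape format are invented for illustration): executing ordinary Python arithmetic on tracer objects records each operation into a tape instead of computing values.

```python
# Sketch of tracing via operator overloading; names are invented.
class Tracer:
    def __init__(self, name, tape):
        self.name, self.tape = name, tape

    def _record(self, op, other):
        # Create a fresh symbolic result and log the operation.
        out = Tracer(f"t{len(self.tape)}", self.tape)
        self.tape.append((op, self.name, getattr(other, "name", other), out.name))
        return out

    def __add__(self, other):
        return self._record("add", other)

    def __mul__(self, other):
        return self._record("mul", other)

tape = []
x, y = Tracer("x", tape), Tracer("y", tape)
z = x * y + x                # runs Python, but records tensor ops
print(tape)
# -> [('mul', 'x', 'y', 't0'), ('add', 't0', 'x', 't1')]
```

The paper's point is that this only captures the operators Python lets you overload; built-in control flow slips through, which is what the source-to-source transformation addresses.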

Given the Python program in Figure 2 (left) as input, Cython will produce a .c file which runs approximately 35% faster than the pure Python version.

Practical Value

What can you learn from this to make your research better?

Program -> Dependency graph -> optimization
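A toy illustration of this pipeline (the graph representation here is invented, not the paper's): once a program's ops are recorded as a dependency graph, classic optimizations such as dead-code elimination become simple graph passes.

```python
# Invented graph representation: each node is (output, op, inputs).
graph = [
    ("a", "mul", ("x", "y")),
    ("b", "add", ("a", "x")),
    ("c", "neg", ("y",)),   # not reachable from the output -> dead code
]

def eliminate_dead_code(graph, output):
    live, worklist = set(), [output]
    while worklist:                      # walk dependencies backwards
        node = worklist.pop()
        if node in live:
            continue
        live.add(node)
        for out, _, inputs in graph:
            if out == node:
                worklist.extend(inputs)
    return [n for n in graph if n[0] in live]

print(eliminate_dead_code(graph, "b"))
# -> [('a', 'mul', ('x', 'y')), ('b', 'add', ('a', 'x'))]  ('c' removed)
```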

Details and Problems

From the presenters' point of view, what questions might the audience ask?

What does “virtualization” mean in this context?

I feel this is just a translation from Python source to an operator-style source; some limitations should still exist.

What is the essence of multi-stage programming?
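One way to answer: values known at staging time are used to generate specialized code that runs later. A minimal sketch using plain string-based code generation (not the paper's mechanism) is the classic power-function example, where a known exponent is unrolled away:

```python
# Stage 1: n is known now, so the loop over n is unrolled at codegen time.
def stage_power(n):
    body = " * ".join(["x"] * n) if n > 0 else "1"
    src = f"def power(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)   # Stage 2: compile and load the specialized code
    return namespace["power"]

power3 = stage_power(3)    # generates: def power(x): return x * x * x
print(power3(2))  # -> 8
```

The generated `power` contains no loop and no reference to `n`; that specialization is what tracing/staging systems exploit when shapes or constants are known before execution.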

@lms uses a custom ast.NodeTransformer object to perform in-place modifications on the ASTs extracted from user code.
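A self-contained illustration of this mechanism (the `_add` name and the specific rewrite are invented here, mirroring the idea rather than @lms itself): an `ast.NodeTransformer` that rewrites `a + b` into an overloadable call `_add(a, b)`.

```python
import ast

class AddToCall(ast.NodeTransformer):
    """Rewrite `a + b` into `_add(a, b)` (illustrative, not @lms itself)."""
    def visit_BinOp(self, node):
        self.generic_visit(node)           # rewrite nested expressions first
        if isinstance(node.op, ast.Add):
            return ast.Call(
                func=ast.Name(id="_add", ctx=ast.Load()),
                args=[node.left, node.right],
                keywords=[],
            )
        return node

source = "def f(a, b):\n    return a + b\n"
tree = AddToCall().visit(ast.parse(source))
ast.fix_missing_locations(tree)            # repair line/col info on new nodes

namespace = {"_add": lambda a, b: a + b}   # a trivial "back end"
exec(compile(tree, "<ast>", "exec"), namespace)
print(namespace["f"](2, 3))  # -> 5
```

Swapping in a different `_add` (e.g. one that builds graph nodes) changes the behavior of `f` without touching the user's source again, which is the point of redirecting built-ins to overloadable calls.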

What is the difference between run-time code generation and lambda expression?
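One way to frame this question: a lambda is a closure whose code is fixed at definition time (only the captured values vary), while run-time code generation builds genuinely new code from run-time data. A small contrast, sketched with `exec` (assumption: this is one illustrative mechanism, not the only one):

```python
n = 3

# Closure: the body `x ** n` exists once; n is looked up at call time.
pow_closure = lambda x: x ** n

# Run-time code generation: the literal 3 is baked into freshly built code.
namespace = {}
exec(f"def pow_generated(x):\n    return x ** {n}\n", namespace)
pow_generated = namespace["pow_generated"]

print(pow_closure(2), pow_generated(2))  # -> 8 8

n = 4
print(pow_closure(2), pow_generated(2))  # -> 16 8  (generated code is fixed)
```

The generated function is independent of later changes to `n`, and its specialized body can be further compiled or optimized, which closures do not offer.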

Cython, Torch Script, and similar systems which perform wholesale translation in this fashion fail to utilize known data to specialize code generation and to take advantage of the ability to execute code during the translation.