Dynamically typed languages such as JavaScript and Python are increasingly popular, yet static typing has not been totally eclipsed: Python now supports type annotations, and languages like TypeScript offer a middle ground for JavaScript: a strict superset of the language, to which it transpiles, coupled with a type system that permits partially typed programs. However, static typing has a cost: adding annotations, reading the added syntax, and wrestling with the type system to fix type errors. Type inference can ease the transition to more statically typed code and unlock the benefits of richer compile-time information, but is limited in languages like JavaScript as it cannot soundly handle duck-typing or runtime evaluation via eval. We propose DeepTyper, a deep learning model that understands which types naturally occur in certain contexts and relations and can provide type suggestions, which can often be verified by the type checker, even if it could not infer the type initially. DeepTyper leverages an automatically aligned corpus of tokens and types to accurately predict thousands of variable and function type annotations. Furthermore, we demonstrate that context is key in accurately assigning these types and introduce a technique to reduce overfitting on local cues while highlighting the need for further improvements. Finally, we show that our model can interact with a compiler to provide more than 4,000 additional type annotations with over 95% precision that could not be inferred without the aid of DeepTyper.
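The abstract's core idea, suggesting an annotation that the type checker can then verify rather than infer on its own, can be sketched with a small hypothetical TypeScript example. The identifiers below (area, Rectangle, annotatedArea) are illustrative only and are not taken from the paper.

// Hypothetical illustration: a duck-typed function whose parameter type a
// standard type checker cannot infer from the body alone.
function area(shape) {
  // 'shape' is implicitly 'any' (and would be flagged under --noImplicitAny);
  // inference only knows it must expose 'width' and 'height'.
  return shape.width * shape.height;
}

// A model trained on an aligned corpus of tokens and types might suggest an
// annotation such as 'Rectangle', which the compiler can then verify against
// all call sites.
interface Rectangle { width: number; height: number; }

function annotatedArea(shape: Rectangle): number {
  return shape.width * shape.height;
}

console.log(annotatedArea({ width: 3, height: 4 })); // 12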
Tue 6 Nov (displayed time zone: Guadalajara, Mexico City, Monterrey)

13:30 - 15:00  Deep Learning (Research Papers) at Horizons 6-9
Chair(s): David Rosenblum, National University of Singapore

13:30 (22m)  Talk: Deep Learning Type Inference
Vincent J. Hellendoorn (University of California at Davis, USA), Christian Bird (Microsoft Research), Earl T. Barr, Miltiadis Allamanis (Microsoft Research, Cambridge)

13:52 (22m)  Talk: DeepSim: Deep Learning Code Functional Similarity

14:15 (22m)  Talk: Code Vectors: Understanding Programs Through Embedded Abstracted Symbolic Traces
Jordan Henkel (University of Wisconsin–Madison), Shuvendu K. Lahiri (Microsoft Research), Ben Liblit (University of Wisconsin–Madison), Thomas Reps (University of Wisconsin–Madison and GrammaTech, Inc.)

14:37 (22m)  Talk: MODE: Automated Neural Network Model Debugging via State Differential Analysis and Input Selection
Shiqing Ma (Purdue University, USA), Yingqi Liu (Purdue University, USA), Wen-Chuan Lee (Purdue University), Xiangyu Zhang (Purdue University), Ananth Grama (Purdue University, USA)