Use Deep Learning to Expedite the Hardware Architecture and Design Process

Train an augmented, super-lightweight Transformer model, 'CircuitFormer', to predict synthesis results.

The number of transistors that can fit on one monolithic chip has reached billions to tens of billions in this decade thanks to Moore’s Law. With each new technology generation, the transistor count per chip grows at a pace that brings about an exponential increase in design time, including the synthesis process used to perform design space exploration. Such long delays in obtaining synthesis results hinder an efficient chip development process, significantly impacting time-to-market. In addition, these large-scale integrated circuits tend to have larger and higher-dimensional design spaces to explore, making it prohibitively expensive to obtain the physical characteristics of all possible designs using traditional synthesis tools.

In this work, we propose a deep-learning-based synthesis predictor called SNS (SNS’s not a Synthesizer) that predicts the area, power, and timing physical characteristics of a broad range of designs two to three orders of magnitude faster than the Synopsys Design Compiler while providing, on average, a 0.4998 RRSE (root relative square error). We further evaluate SNS via two representative case studies, a general-purpose out-of-order CPU case study using the open-source RISC-V BOOM design and an accelerator case study using an in-house Chisel implementation of DianNao, to demonstrate the capabilities and validity of SNS.
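The RRSE metric quoted above normalizes the prediction error against a naive baseline that always predicts the mean of the ground-truth values, so a score below 1.0 means the predictor beats that baseline. As a reference for readers, here is a minimal sketch of how RRSE is conventionally computed (the function name and inputs are illustrative, not taken from the SNS codebase):

```python
import math

def rrse(predictions, targets):
    """Root relative square error: the squared prediction error
    normalized by the squared error of always predicting the
    mean of the targets, then square-rooted.

    A value of 0 is a perfect fit; 1.0 matches the mean-only
    baseline; lower is better.
    """
    mean_target = sum(targets) / len(targets)
    squared_error = sum((p - t) ** 2 for p, t in zip(predictions, targets))
    baseline_error = sum((t - mean_target) ** 2 for t in targets)
    return math.sqrt(squared_error / baseline_error)
```

For example, a predictor that simply outputs the mean of the targets scores exactly 1.0, while perfect predictions score 0.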

Leveraging recent advances in artificial intelligence and graph analytics, SNS turns input designs into graph representations for synthesis prediction. Inspired by a path-based approach to social network analysis, SNS takes a novel circuit-path-based approach: it predicts the physical characteristics of individual circuit paths and aggregates the paths’ characteristics to predict the area, power, and timing of the entire input design. Leveraging sequence processing techniques from natural language processing, which learn the order and placement of words within a sentence, SNS learns the order and placement of functional units in a circuit path to provide more accurate synthesis result predictions. With very few open-source hardware designs available as training data, SNS utilizes generative models to generate training datasets, providing accurate predictions even when training data is scarce.
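To make the circuit-path-based idea concrete, the sketch below shows one plausible way per-path predictions could be aggregated into design-level estimates. This is an illustrative simplification, not the actual SNS aggregation: it assumes area and power accumulate across (disjoint) paths and that design timing is set by the slowest path, and it ignores logic shared between paths.

```python
def aggregate_path_predictions(path_preds):
    """Combine per-path predictions into design-level estimates.

    path_preds: list of dicts, one per circuit path, each with
    'area', 'power', and 'delay' fields predicted by the model.

    Simplifying assumptions (not from the SNS paper): paths are
    treated as disjoint, so area and power are summed, while the
    design's timing is the delay of its critical (slowest) path.
    """
    return {
        "area": sum(p["area"] for p in path_preds),
        "power": sum(p["power"] for p in path_preds),
        "timing": max(p["delay"] for p in path_preds),
    }
```

In this view, each path is a sequence of functional units, which is what lets sequence models from NLP be repurposed to encode it.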

Publications

Ceyu Xu, Chris Kjellqvist, and Lisa Wu Wills, "SNS's not a Synthesizer: A Deep-Learning Based Synthesis Predictor". International Symposium on Computer Architecture (ISCA) 2022.