Photos are available in the DATE 2024 Gallery.

The time zone for all times mentioned at the DATE website is CET – Central European Time (UTC+1). AoE = Anywhere on Earth.

ET01 Using Generative AI for Next-generation EDA

Start
Tue, 26 Mar 2024 16:30
End
Tue, 26 Mar 2024 18:00
Room
TBA
Organiser
Hammond Pearce, University of New South Wales Sydney, Australia
Presenter
Jason Blocklove, NYU, United States
Presenter
Siddharth Garg, NYU, United States
Presenter
Jeyavaijayan Rajendran, TAMU, United States
Presenter
Ramesh Karri, NYU, United States

Tutorial resources: https://github.com/JBlocklove/LLMs-for-EDA-Tutorial

Motivation: There are increasing demands for integrated circuits but a shortage of designers: Cadence’s blog reports a shortfall of 67,000 employees in the US alone. These pressures, alongside shifts to smaller nodes and greater complexity, lead to buggy designs and slow time-to-market. State-of-the-art generative AI tools like GPT-4 and Bard have shown promising capabilities in the automatic generation of register transfer level (RTL) code, assertions, and testbenches, and in bug/Trojan detection. Such models can be further specialized for hardware tasks by fine-tuning on open-source datasets. As generative AI solutions find increasing adoption in the EDA flow, there is a need to train EDA experts in using, training, and fine-tuning such models in the hardware context.
Intended audience: Students, academics, and practitioners in EDA/VLSI/FPGA and Security
Objectives: In this tutorial we will show the audience how current capabilities in generative AI (e.g. ChatGPT) can be used to accelerate hardware design tasks. We will explore how it can be used with both closed and open-source tooling, and how you can also train your own language models and produce designs in a fully open-source manner. We'll discuss how commercial operators are beginning to make moves in this space (GitHub Copilot, Cadence JedAI) and reflect on the consequences for education and industry (will our designs become buggier? Will our graduating VLSI students know less?). We'll cover all of this using a representative suite of examples ranging from simple (basic shift registers) to complex (AXI bus components and microprocessor designs).
Abstract: There are ever-increasing demands on the complexity and production timelines of integrated circuits. This puts pressure on chip designers and design processes, and ultimately results in buggy designs with potentially exploitable mistakes. Given that computer chips underpin every part of modern life, enabling everything from your cell phone to your car, traffic lights to pacemakers, coffee machines to wireless headphones, mistakes have significant consequences. This unfortunate combination of rising demand and increasing difficulty has resulted in shortages of qualified engineers, with some reports indicating that 67,000 jobs in the field remain unfilled.

Fortunately, there is a path forward. For decades, the Electronic Design Automation (EDA) field has applied the ever-increasing capabilities of machine learning and artificial intelligence to steps throughout the chip design flow. Steps from layout, through power and performance analysis and estimation, to physical design are all improved by programs that are taught rather than programmed.

In this tutorial we will explore what's coming next: EDA applications of the newest type of artificial intelligence, generative pre-trained transformers (GPTs), also known as Large Language Models (LLMs). We will show how models like the popular ChatGPT can be applied to tasks such as writing HDL, searching for and repairing bugs, and even tackling complex verification tasks like producing assertions. Rather than constraining ourselves to commercial and closed-source tooling, we'll also show how you can train your own language models and produce designs in a fully open-source manner. We'll discuss how commercial operators are beginning to make moves in this space (GitHub Copilot, Cadence JedAI) and reflect on the consequences for education and industry (will our designs become buggier? Will our graduating VLSI students know less?). We'll cover all of this using a representative suite of examples ranging from simple (basic shift registers) to complex (AXI bus components and microprocessor designs).
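As a flavour of the scripting involved in using an LLM to write HDL, here is a minimal Python sketch of turning a natural-language spec into a design prompt. All names are illustrative and the model call is stubbed out with a canned response; in practice you would send the prompt to a hosted service (e.g. ChatGPT) or a locally running open-source model.

```python
# Sketch: prompting an LLM for RTL generation. The model call below is a
# stand-in (hypothetical), not a real API client.

def build_rtl_prompt(description: str, module_name: str, ports: list[str]) -> str:
    """Assemble a design prompt from a natural-language specification."""
    port_list = "\n".join(f"  - {p}" for p in ports)
    return (
        "You are a hardware designer. Write synthesizable Verilog-2001.\n"
        f"Module name: {module_name}\n"
        f"Specification: {description}\n"
        f"Ports:\n{port_list}\n"
        "Reply with a single Verilog module and no explanation."
    )

def fake_llm(prompt: str) -> str:
    """Stand-in for a real model call; returns a canned shift register."""
    return (
        "module shift_reg(input clk, input rst, input d, output reg [7:0] q);\n"
        "  always @(posedge clk)\n"
        "    if (rst) q <= 8'b0;\n"
        "    else     q <= {q[6:0], d};\n"
        "endmodule\n"
    )

prompt = build_rtl_prompt(
    "An 8-bit serial-in shift register with synchronous reset.",
    "shift_reg",
    ["clk: clock", "rst: synchronous reset", "d: serial input", "q[7:0]: parallel output"],
)
rtl = fake_llm(prompt)
print(rtl.splitlines()[0])
```

The returned text would then be written to a `.v` file and handed to a simulator, which is the workflow the hands-on session walks through.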

Necessary background: Experience with EDA flows and software such as Xilinx Vivado, Yosys, iverilog, etc. will be helpful but is not required, as training will be provided on the day.
References:
(Tutorial presenters in bold)

S. Thakur, B. Ahmad, Z. Fan, H. Pearce, B. Tan, R. Karri, B. Dolan-Gavitt, S. Garg, "Benchmarking Large Language Models for Automated Verilog RTL Code Generation," 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE), Antwerp, Belgium, 2023, pp. 1-6, doi: 10.23919/DATE56975.2023.10137086.

J. Blocklove, S. Garg, R. Karri, H. Pearce, "Chip-Chat: Challenges and Opportunities in Conversational Hardware Design," 2023 Machine Learning in CAD Workshop (MLCAD). Preprint: https://arxiv.org/abs/2305.13243

H. Pearce, B. Tan, B. Ahmad, R. Karri and B. Dolan-Gavitt, "Examining Zero-Shot Vulnerability Repair with Large Language Models," 2023 IEEE Symposium on Security and Privacy (SP), San Francisco, CA, USA, 2023, pp. 2339-2356, doi: 10.1109/SP46215.2023.10179324.

B. Ahmad, S. Thakur, B. Tan, R. Karri, H. Pearce, “Fixing Hardware Security Bugs with Large Language Models,” under review. Preprint: https://arxiv.org/abs/2302.01215

R. Kande, H. Pearce, B. Tan, B. Dolan-Gavitt, S. Thakur, R. Karri, J. Rajendran, “LLM-assisted Generation of Hardware Assertions,” under review. Preprint: https://arxiv.org/abs/2306.14027

On the day:

Hands-on session

Content: Audience members will use the language models to achieve various tasks within a simple EDA environment focused on simulation.

Goals: While we will also demo approaches using more complex software, the hands-on session will focus on iverilog (Icarus Verilog), a simple, free, and open-source simulator for Verilog designs. iverilog is not demanding (it can run on local machines/laptops) and is compatible with Windows, Linux, and macOS.
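The simulation step itself is a pair of command-line invocations: compile the design and testbench with `iverilog`, then run the result with `vvp`. A minimal Python sketch of driving this from a script is below; the file names are hypothetical, and the commands are only executed if iverilog is installed and the files exist, so the script can be read anywhere.

```python
# Sketch of the hands-on simulation step: compile with iverilog, run with vvp.
import os
import shutil
import subprocess

design_files = ["shift_reg.v", "shift_reg_tb.v"]  # hypothetical file names
compile_cmd = ["iverilog", "-g2012", "-o", "shift_reg.vvp"] + design_files
run_cmd = ["vvp", "shift_reg.vvp"]

if shutil.which("iverilog") and all(os.path.exists(f) for f in design_files):
    subprocess.run(compile_cmd, check=True)            # compile to a vvp image
    result = subprocess.run(run_cmd, capture_output=True, text=True)
    print(result.stdout)                               # $display/$monitor output
else:
    print("skipping simulation; would run:", " ".join(compile_cmd))
```

Waveforms dumped via `$dumpfile`/`$dumpvars` in the testbench can then be inspected in gtkwave.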

Pre-requisites: While it is preferable for participants to have installed gcc, build-essential, iverilog, and gtkwave in advance, doing so on the day is not difficult and we can provide guidance at the beginning of the session.

Tutorial material: Reference material on the pre-requisites and the manuscripts from the listed references.
Tutorial plan

0-15 mins: Introduction and motivation by Hammond Pearce, Ramesh Karri, Siddharth Garg, and Jason Blocklove (presenter TBD)

15-35 mins: Hands-on Chip-chat - using ChatGPT for writing, simulating, and bug-fixing Verilog by Jason Blocklove and Hammond Pearce (participants will be provided with scripts that they can adapt to interact with ChatGPT for their own tools)

35-60 mins: Hands-on VeriGen: Developing Open-source EDA datasets and models by Shailja Thakur and Jason Blocklove

60-80 mins: AI for Bug Detection: Accelerating hardware fuzzing and flagging bugs and Trojans with Generative AI by Benjamin Tan and JV Rajendran

80-90 mins: Gazing into the Crystal Ball: The future of EDA with Generative AI by Siddharth Garg and Ramesh Karri
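The Chip-chat and bug-fixing sessions above revolve around one idea: close the loop between the model and the simulator, feeding error logs back as conversational turns until the design passes. A minimal Python sketch of that loop is below; both the model and the simulator are stubs (the "model" simply fixes a missing semicolon once it has seen an error), standing in for a real LLM API and iverilog.

```python
# Sketch of a generate-simulate-repair feedback loop. Stubs only: the real
# versions would call an LLM service and invoke iverilog on the RTL.

def fake_model(history: list[dict]) -> str:
    """Stand-in model: produces a fixed design once it has seen an error log."""
    has_feedback = any(m["role"] == "tool" for m in history)
    # First attempt deliberately omits the ';' after the module header.
    return "module top; endmodule" if has_feedback else "module top endmodule"

def fake_simulate(rtl: str) -> tuple[bool, str]:
    """Stand-in simulator: flags the missing semicolon as a syntax error."""
    if "module top;" in rtl:
        return True, ""
    return False, "syntax error near 'endmodule'"

history = [{"role": "user", "content": "Write an empty Verilog module named top."}]
for attempt in range(5):
    rtl = fake_model(history)
    ok, log = fake_simulate(rtl)
    if ok:
        break
    # Feed the simulator's error log back as the next conversational turn.
    history.append({"role": "tool", "content": log})

print(f"passed after {attempt + 1} attempt(s)")
```

The scripts provided to participants follow the same shape, with the stubs replaced by a ChatGPT client and an iverilog invocation.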