
JasperGold™

High-Level Formal Verification

Vigyan Singhal
Harry D. Foster

Agenda

• Jasper introduction
• Model checking
• Block-level verification
- High-level requirements
- Formal testplan
- Coverage
• Formal Testplanner
• PSL (Property Specification Language)

Jasper Design Automation

• Jasper is the premier electronic design automation (EDA) supplier of high-level formal functional verification software.

• Jasper’s solution achieves 100% Actual Coverage, improving the quality of electronic design, predictably and within verification schedule constraints.

• Jasper has unique and valuable technology that changes the verification game and “makes formal verification real”.

Jasper Design Automation


• Founded in 1999 (originally Tempus Fugit)

• Over $20M in funding -- led by Foundation Capital, Accel Partners

• Management
- Kathryn Kranen President and CEO (Verisity, Quickturn)
- Vigyan Singhal CTO (Cadence, UC Berkeley)
- Harry Foster Chief Methodologist (Verplex, HP)
- Craig Cochran VP Marketing (Synopsys)
- Craig Shirley VP Sales (Verisity)
- Nafees Qureshy VP Engineering (CoWare)

• 9 PhDs in Formal Verification


- UC Berkeley, Stanford, CalTech, Chalmers, Gothenburg

Sequential Satisfiability (model checking)

Is there a sequence of input assignments such that
p is 1 at any finite time?

[Diagram, stepped across several slides: a circuit of combinational gates plus flops (with initial values), with input a, an internal flop q, and an output p. Each successive slide advances the circuit one cycle, annotating the wires with their value sequences, until an input sequence driving p to 1 is found.]

Combinational gates + flops (with initial values)

Complexity: PSPACE-hard (Aziz 93)
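The search these slides step through can be sketched as an explicit-state breadth-first traversal of the flop state space. Below is a toy Python rendering; the two-flop circuit, its next-state function, and the single input a are invented for illustration and are not the circuit on the slide:

```python
from collections import deque

def reachable_p(init_state, step, is_p):
    """Explicit-state search: is there a sequence of input
    assignments that drives p to 1 at some finite time?"""
    seen = {init_state}
    frontier = deque([(init_state, [])])
    while frontier:
        state, inputs = frontier.popleft()
        if is_p(state):
            return inputs              # witness input sequence
        for a in (0, 1):               # enumerate the Boolean input
            nxt = step(state, a)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, inputs + [a]))
    return None                        # p is unreachable: proven safe

# Invented example circuit: flops q and p, both initialized to 0.
# Next-state: q' = a, p' = q & a, so p becomes 1 only after a has
# been held at 1 for two consecutive cycles.
def step(state, a):
    q, p = state
    return (a, q & a)

witness = reachable_p((0, 0), step, lambda s: s[1] == 1)
print(witness)  # shortest witness: [1, 1]
```

On real designs the state space is far too large to enumerate explicitly (hence the PSPACE-hardness noted on the slide); production model checkers rely on symbolic representations and SAT engines instead.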

Model Checking use model

• Does the design have a given desirable property?

[Diagram: Properties and Constraints, together with the Design under test, feed a Model Checker, which returns either “Verified” or a counterexample.]

• Properties - Specification
• Constraints - Assumptions needed to prove properties
• Design under test - Implementation

Functional Verification Today

[Diagram: a large HDL chip model driven by a chip-level testbench.]

• Verification starts late
- Creating the chip-level testbench can be as difficult as creating the chip model.
• Poor coverage
- Full chip verification does not provide enough coverage to find all bugs.
- High controllability of block-level interfaces is difficult to achieve from stimulus generated at the chip boundary.
• Low observability
- Internal errors during full chip simulation might not propagate to an observable output during the test.

Design and Verification Flow

[Diagram: the design flow from Spec through Refine & Partition down through chip, super-block, block, and unit/module levels (CPU, SRAM, Test IP, I/F, FIFO, algorithm, MUX, controller, sync, timers, FSMs and their actions), validated at Integration & Verification against a chip-level testbench; only a small block-level testbench exists at the unit level.]

• Most of today’s verification is performed at the chip level
• Very little block-level verification is performed
• Cost of a re-spin > $1M + months of delay

What is necessary for FV to succeed in industry?

FV solution must provide:

1. High Return-on-Effort
- Measurable proof of quality
2. Ease of use (FV expert not required for success)
- Encapsulate formal expert knowledge into the tool
- Strong engines + advanced abstraction techniques
- Guide the user to success
3. Sophisticated debug capabilities
- Design illumination, enabling the user to learn the design
4. Means to overcome the “blank page” syndrome
- Specification challenge
- Methodology challenge
5. Metrics to measure progress and success

What’s the solution?

• Give up, accept the raw algorithm limitations, and perform bounded model checking, or….

• How do we handle complexity for other applications?
- For example, place-n-route is an NP-complete problem.
- In place-n-route, using guidance, the tool can automatically partition the problem down into tractable pieces.

• To succeed, formal verification tools must combine advanced techniques with multiple, powerful search engines
- For example, managed automatic tunneling, ….
- Encapsulate the knowledge of the formal expert into the tool!

JasperGold Manages Proof Capacity
Tunneling, Monitors, and Abstraction

• Conventional and hybrid formal tools choke on all the logic in the cone of influence.
• Progress and coverage are unknown.

[Diagram: an RTL block with its inputs, outputs, and a counter; around the high-level requirement, JasperGold refines the proof space.]

• JasperGold refines the proof space, abstracts formal-unfriendly structures, and displays progress and coverage.

JasperGold - Ease of Use

[Diagram: an RTL design and high-level design requirements feed the Jasper Formal Testplanner™; the JasperGold™ Proving Environment, with a knowledge base of design-specific methodology and requirements templates and Precognitive Proof Engines™, produces either isolated functional bugs or blocks proven 100% correct.]

• Static, exhaustive proofs without simulation
• Proof navigation suggests next steps
• Isolates the root cause of each bug to the faulty line of RTL
• Interactive, context-sensitive debugging environment

In-line assertions (PSL example)

module btrack (clk, allocate, deallocate, set);

  input clk, allocate, set, deallocate;

  reg  active_r, valid_r;
  wire active_s, valid_s, active_in, valid_in;

  assign active_in = ((allocate | active_s) & ~deallocate);

  always @(negedge clk)
    active_r <= active_in | set;
  assign active_s = active_r;

  assign valid_in = (active_s & (set | valid_s));

  always @(negedge clk)
    valid_r <= valid_in;
  assign valid_s = valid_r;

  // psl property deallocate_when_not_active =
  //   always {deallocate} |-> {~active_s; valid_r};
  // psl assert deallocate_when_not_active;

endmodule
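The suffix-implication operator `|->` used in the property above can be illustrated with a small dynamic checker. This is a hedged sketch of the property’s meaning over a recorded simulation trace, not Jasper’s implementation; the trace format (a list of per-cycle signal dictionaries) is invented for illustration:

```python
def check_deallocate_property(trace):
    """Trace-level check of the PSL property
         always {deallocate} |-> {~active_s; valid_r}
    Whenever deallocate holds at cycle i, the consequent SERE must
    match starting at that same cycle (overlapping |-> semantics):
    ~active_s must hold at cycle i, and valid_r at cycle i+1.
    A consequent left incomplete at end-of-trace passes (weak view)."""
    for i, cyc in enumerate(trace):
        if cyc["deallocate"]:
            if cyc["active_s"]:
                return False   # first cycle of the consequent fails
            if i + 1 < len(trace) and not trace[i + 1]["valid_r"]:
                return False   # second cycle of the consequent fails
    return True

# A passing and a failing trace (each signal is 0/1 per cycle):
ok = [
    {"deallocate": 1, "active_s": 0, "valid_r": 0},
    {"deallocate": 0, "active_s": 0, "valid_r": 1},
]
bad = [
    {"deallocate": 1, "active_s": 1, "valid_r": 0},  # active_s violates
]
print(check_deallocate_property(ok), check_deallocate_property(bad))
# prints: True False
```

A formal tool, of course, checks this obligation over every reachable trace rather than one recorded run.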


A high-level requirement example
[DesignCon 04 PCI Express Paper]

High-level requirement = no packets are dropped, duplicated, or corrupted

[Diagram: the Tx path. TLPs flow from the Tx side through Seq_num (increment by 1), LCRC, the retry buffer, and the packet arbiter (also fed by DLLPs from the Rx side) out to the PHY over the fabric, with flow control; points marked “A” carry implementation assertions such as FIFO overflow and one-hot checks.]

A = Implementation assertions: low-level assertions
(similar to writing simulation monitors)
Illustrating with an Example

[Diagram: an AHB-to-MSIF bridge sits between an AHB Master and the Wishbone fabric (via MSIF); could a transaction be lost inside the bridge?]

Requirement: A transaction from an AHB Master on an external device must NOT get duplicated, corrupted, or dropped when it goes out to the Wishbone fabric

Illustrating the packet ordering requirement using a FIFO

[Diagram: on the data transport path (Seq_num, LCRC, retry buffer, packet arbiter, DLL, from the Wishbone side to the PHY), the packets A, B, C in flight are modeled as a FIFO: each packet is pushed at write_ptr when it enters and must emerge at read_ptr in the same order.]

HLR example

[Diagram: a small FIFO with din[7:0] in, dout[31:0] out, wrPtr/rdPtr pointers, and a store[] array holding the in-flight entries A, B, C.]

always @(posedge clk or posedge rst)
  if (rst) begin
    wrPtr <= 4'd0; rdPtr <= 4'd0;
    for (i = 0; i < 4; i = i + 1)
      store[i] <= 64'b0;
  end else begin
    if (datain) begin
      wrPtr <= wrPtr + 4'd1;
      store[wrPtr] <= din;
    end
    if (dataout) rdPtr <= rdPtr + 4'd1;
  end

prove { ~ ((dataout) && (dout != store[rdPtr])) }
prove { ~ (((wrPtr + 4'd1) == rdPtr) && ~dataout && datain) }
prove { ~ ((wrPtr == rdPtr) && (dataout)) }

Notice the dependency on the number of transactions in flight
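The three prove obligations above can also be read as a scoreboard check. As a hedged illustration (a Python sketch that assumes a 4-deep FIFO and an environment that never writes when full or reads when empty; all names are invented), a golden reference model flags the same violations dynamically:

```python
from collections import deque
import random

DEPTH = 4

def check_fifo(num_cycles=1000, seed=0):
    """Randomly exercise a simple pointer-based FIFO and check the
    three prove obligations dynamically: data integrity (dout equals
    the oldest entry), no overflow, and no underflow."""
    rng = random.Random(seed)
    store = [0] * DEPTH            # design: storage array
    wr_ptr, rd_ptr = 0, 0          # design: write/read pointers
    count = 0                      # environment: occupancy tracker
    ref = deque()                  # golden reference model
    for _ in range(num_cycles):
        datain = rng.random() < 0.5 and count < DEPTH   # never write when full
        dataout = rng.random() < 0.5 and count > 0      # never read when empty
        if dataout:
            dout = store[rd_ptr]
            assert dout == ref.popleft(), "data integrity violated"
            rd_ptr = (rd_ptr + 1) % DEPTH
            count -= 1
        if datain:
            din = rng.randrange(256)
            store[wr_ptr] = din
            ref.append(din)
            wr_ptr = (wr_ptr + 1) % DEPTH
            count += 1
        assert 0 <= count <= DEPTH, "overflow/underflow"
    return True
```

Unlike the prove statements, random simulation only samples behaviors; a formal proof covers every legal input sequence.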

High-level Requirements: high return on effort

[Diagram: a plot of design behavior against intent vs. implementation; high-level requirements such as data integrity and arbitration fairness sit at the intent end, while low-level assertions such as “increment by 1”, one-hot, and FIFO overflow sit at the implementation end.]

High-level Requirements
• End-to-end
• Black box
• Based on design intent
• Are harder to prove

Low-level Assertions
• Localized
• Implementation-specific
• Can find bugs faster

High-Level Formal vs. Conventional Formal

[Diagram: a spectrum from intent (µ-architecture specification, high-level requirements) down to implementation (RTL coding, implementation assertions); JasperGold operates at the high-level requirements end, while most formal tools operate at the implementation-assertions end.]

Seven steps of a formal testplan

[Diagram: the steps flow from the µ-arch spec, through the block’s interface (inputs and outputs), to formal descriptions such as assert_never ();]

1. Identify good candidates (from the µ-arch spec)
2. Identify an English requirements list
3. Define the interface
4. Build the requirements checklist
5. Write a formal description of each requirement
6. Define the verification strategy and proof steps
7. Define coverage goals and coverage items

Formal Testplan Example
• Transmit Path (tx)
- Packets should not lose ordering or get dropped
- Packets should not get duplicated
- Packet headers should not get corrupted
- Link Credits sent are less than or equal to Network Credits received
- No Link Credits received from the network are lost (every credit is
sent downstream, or recorded locally)
- No packets are retried until timer expires
- Any packet not ack’ed is retried
- Retried packets should not lose ordering or get dropped
- Retried packets should not get duplicated
• Receive Path (rx)
- Alignment
- Packet Latency
- …


Overcoming the formal specification “blank-page” syndrome

JasperGold Proving Environment - Sophisticated Debug

• Design Illumination mode steers the user through the design - “lets you get into the designer’s head without talking to them”
• Enables you to quickly expose the root cause of a bug, isolated down to the faulty line of RTL code
• Provides design-specific next steps and proof progress metrics

Are there success stories?

• Real examples include:
- Arbiters of many different kinds
- On-chip bus bridge
- Power management unit
- DMA controller
- Host bus interface unit
- Scheduler, implementing multiple virtual channels for QoS
- Interrupt controller
- Memory controller
- Token generator
- Credit manager block
- Standard interface (AMBA, PCI Express, …)
- Proprietary interfaces
- Clock disable unit

• Common characteristic: concurrency, multiple data streams
- Multiple, concurrent streams
- Hard to completely verify using simulation

“10 bugs per 1000 gates”
- Ted Scardamalia, IBM

Control, data transport

• Concurrent
• Handles multiple streams
• “Corner-case” bug is missed because randoms never excited that
timing combination
• Bug example: during the first 3 cycles of a transaction start from one
side of the interface, a second transaction came in on the other side of
the interface and changed the config register.
• Design examples: arbiters, standard interface protocols (PCI), DMA
controller


Data transform

• Sequential, functional
• Single-stream
• “Corner-case” bug is missed because testplan was not complete (spec
was large)
• Bug example: the IFFT result is incorrect if both inputs are negative
• Designs: floating point unit, graphics shading unit, convolution unit in a
DSP chip, MPEG decoder

On-chip Bridge (e.g. AHB-AHB)

[Diagram: Master A (HTRANSA, HREADYA, HRESPA) and Master B (HTRANSB, HREADYB, HRESPB) drive an AHB-AHB bridge, which in turn drives a Slave (HTRANS, HREADY, HRESP).]

• HLRs
- Bridge transfers data correctly from one end to the other (accounting for 16-bit to 64-bit data width change, and splitting of long burst packets)
- Bridge is fair to both masters
- …
• Complexity
- Timing relationship between the two masters
- Error responses from the slave with arbitrary timing relationship to what is happening in the master
• Corner-case
- Bridge locks up if the slave issues an error on the 2nd cycle of a 4-beat request from master A, and master B makes a new request on the same cycle the error is passed on to master A

Verification solution must reduce collateral damage

• Today, design concurrency and complexity are explored late in the flow
• The problem is that at this stage, massive design interdependencies exist in fully-developed RTL
• Late-stage bugs can have significant collateral damage

[Chart: complexity explored vs. exhaustively verified complexity over time, across sandbox, block-level, and system-level verification; complex bugs are found late!]

JasperGold Delivers “Provably Correct Design”

[Chart: high-level requirements from the micro-architecture spec (arbitration fairness, packet ordering, error handling, data integrity, flow control), each driven to 100% verified over time.]

“Prove-as-you-design”
Incrementally add design functionality, prove its correctness, and then ensure that it remains correct through final regressions

Two Reasons for Formal Verification

[Chart: bugs remaining vs. time in a typical debug curve; an early bug-hunting phase, then a long assurance phase that ends only when the last bug is found.]

“By far, we [at Intel] have found that the greatest value of formal verification is assurance” – John O’Leary, Intel

Full proof provides the most value

Design and Verification Flow

[This slide repeats the earlier “Design and Verification Flow” diagram: most of today’s verification is performed at the chip level, very little block-level verification is performed, and the cost of a re-spin > $1M + months of delay.]

© 2003, Jasper Design Automation, Inc. All rights reserved.

Design Cycle Comparison

Simulation Flow:
Design Blocks → Sanity Check → Create Testbenches → System Simulation
(Months)

Simulation + Formal Verification Flow:
Design Blocks → Formal Verify → Create Testbenches → System Simulation
(Weeks)

Design and formal verification performed in parallel - 100% Proof

Return on Investment

[Chart: a project timeline (weeks 0, 6, 10, 26, 32). Simulation methodology: block design, unit-level testing and simulations, system-level testing and simulations, then silicon testing; easy bugs are found first, then corner-case bugs, then TAPE OUT!, then more corner cases! Jasper methodology: complete verification with JasperGold during design; for every bug at system level, identify the block, identify the fix, and re-run all; tape-out early, tape-out right.]

Conclusion: Making Formal Verification Mainstream

• Higher returns
- Block verification on real-sized designs
- Expectation and predictability of results
- For even higher results:
• Interactivity and methodology support
• User creativity always adds more
• Higher investment from the user
• Lower investment → marginal payoff
- Cannot be factored into project schedules
- Need predictable effort estimates
- User interaction is acceptable, if effort can be quantified and estimated up front
- Today, simulation cannot be replaced: launch-and-forget will not replace simulation
- Don’t short-change formal by applying it only to local assertions
