Scientific puzzle of formal circuit verification at next week's DAC 
 Scientific puzzle of formal circuit verification at next week's DAC

This may be a side effect of commercialization of Computer Science,
but there is a very interesting scientific puzzle that one can observe
at next week's Design Automation Conference (DAC).  Namely, that although
formal verification has been a degenerating research program since at
least 1972, nearly every technical paper and new exhibitor is describing
tools for formal verification of circuits.

These tools supposedly allow RTL sign-off without any need to simulate, with
verification done instead by software checking of user-supplied constraints.
Seemingly, "bread boarding" and its modern equivalent, "extracted-delay
gate-level simulation", are no longer needed.

Here are some of the puzzles.  1) Because formal proving is related to
BDD construction from netlists, this formal analysis should be usable for
code breaking in communications cryptography.  Yet a paper at this year's
Eurocrypt showed that heuristics for the related NP-complete problem do
not help.  The reference is: Krause, M., "BDD-based Cryptanalysis of
Keystream Generators", Proceedings of Eurocrypt 2002, Springer, 2002.
Also, the objections to formal verification first detailed in the
Lighthill Report, submitted to the British Government in 1972, still need
answers.  Some are: 1) how is the correctness of human-coded assertions
verified, i.e. don't errors in assertions get injected into formally
verified circuits?  2) how can circuit-proving data structures avoid
exponential size?  3) how can verification work when the same formal
algorithms are used to synthesize circuits in one direction and verify
them in the other direction, using the same "closed system" algorithms?

It is unfortunate that the DAC technical committee does not try to balance
papers that advocate theories with papers that falsify theories.  In any
case, what is probably the most one-sided scientific refereeing in the last
500 years should make for interesting listening and observing.

--
Steve Meyer                             Phone: (612) 371-2023

520 Marquette Ave. So., Suite 900
Minneapolis, MN 55402



Tue, 23 Nov 2004 13:17:05 GMT  
 Scientific puzzle of formal circuit verification at next week's DAC

Quote:
> This may be a side effect of commercialization of Computer Science,
> but there is a very interesting scientific puzzle that one can observe
> at next week's Design Automation Conference (DAC).  Namely, that although
> formal verification has been a degenerating research program since at
> least 1972, nearly every technical paper and new exhibitor is describing
> tools for formal verification of circuits.

Define 'degenerating' because I have no idea what you're trying to say.
Please - I'm genuinely puzzled by what you said.  I know a few people
making their living quite nicely doing formal methods in academia,
government, and industry.  What part of FM is degenerating?

Quote:
> These tools supposedly allow RTL sign-off without any need to simulate, with
> verification done instead by software checking of user-supplied constraints.

I've never heard *any* formal methods person advocate not simulating the
RTL.  Maybe I've not travelled in the same circles you have.  All you get
from FM is confidence that two representations of your circuit (your spec
and your implementation) have some relationship (equivalency, implication).
That certainly doesn't prevent your synthesis tool from breaking your
design, or your layout tool, or the fab house.  

I like simulating the RTL as much to have confidence in my formal specs
as to gain trust in my design.

Quote:
> Here are some of the puzzles.  1) Because formal proving is related to
> BDD construction from netlists, this formal analysis should be usable for
> code breaking in communications cryptography.  Yet a paper at this year's
> Eurocrypt showed that heuristics for the related NP-complete problem do
> not help.  The reference is: Krause, M., "BDD-based Cryptanalysis of
> Keystream Generators", Proceedings of Eurocrypt 2002, Springer, 2002.
> Also, the objections to formal verification first detailed in the
> Lighthill Report, submitted to the British Government in 1972, still need
> answers.

I think you need to get out and look at formal methods more.  I'm from the
theorem prover side of the house (as opposed to the BDD side) and it's not
a problem to omit BDDs entirely if you don't mind algebraic representations.  

I haven't read the paper you've cited, but I gave a lot of thought to
crypto applications when I was in grad school.  Feel free to ignore this:

As to why BDDs aren't much use for crypto, the answer isn't hard: BDDs only
have huge gains when the logic functions collapse nicely under a certain
ordering of the logic variables.  A big random function of 128 variables
probably won't become tractable just because you throw BDDs at it.  Luckily,
most of the sorts of functions that real-life hardware designers use are
much more amenable to BDDs, given a suitable variable ordering.  Crypto
functions are generally designed to mix up the variables quite completely,
making the functions hard to manipulate algebraically (via theorem provers)
or via BDDs.  
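
If you want to see the ordering effect for yourself, here's a naive Python
sketch (hypothetical code, not from any real tool; it enumerates every
assignment, so it only works for a handful of variables).  It counts ROBDD
nodes for the classic function (x0&y0)|(x1&y1)|...|(x5&y5) under two
orderings: interleaving the pairs keeps the BDD small, while listing all
the x's before all the y's blows it up.

def bdd_size(func, order):
    """Count the internal nodes of the ROBDD of `func` under `order`.

    func  : maps a dict {variable: bool} to bool
    order : list of variable names, top of the BDD first
    Builds the full decision tree (exponential time) and reduces it with
    a unique table, so it is only usable for tiny examples.
    """
    unique = {}                               # (var, low, high) -> node id

    def build(assignment, level):
        if level == len(order):               # terminal node: 0 or 1
            return int(func(assignment))
        var = order[level]
        low = build({**assignment, var: False}, level + 1)
        high = build({**assignment, var: True}, level + 1)
        if low == high:                       # redundant test: drop the node
            return low
        return unique.setdefault((var, low, high), len(unique) + 2)

    build({}, 0)
    return len(unique)

n = 6
def f(a):                                     # (x0&y0) | ... | (x5&y5)
    return any(a["x%d" % i] and a["y%d" % i] for i in range(n))

interleaved = [v for i in range(n) for v in ("x%d" % i, "y%d" % i)]
separated   = ["x%d" % i for i in range(n)] + ["y%d" % i for i in range(n)]

print(bdd_size(f, interleaved))               # small: grows linearly with n
print(bdd_size(f, separated))                 # grows exponentially with n

Crypto functions are built so that no ordering behaves like the interleaved
one does here.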

As to the 1972 paper, I'd have to go find it.  The earliest FM-for-hardware
papers I've studied are the RSRE Viper project from the mid 1980s.  Is the
Lighthill Report online?

Quote:
> Some are: 1) how is the correctness of human-coded assertions
> verified, i.e. don't errors in assertions get injected into formally
> verified circuits?

The problem of writing 'correct' specifications is, IMHO, harder than the
problem of showing that a given design meets the specification.  Joe
Jackson said it succinctly: you can't get what you want 'till you know
what you want.  The best way I've found for writing good specs is lots
of peer review together with use-case analysis; simulators also help a
lot.  Writing specs for hardware isn't much different from writing specs
for software.  Excepting those damn-near unreadable temporal logic specs,
of course.  Those are the specification equivalent of APL - write-only.
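
To make that concrete, here's a rough Python sketch (hypothetical, not any
particular flow) of what simulating against the spec buys you: the
assertion below is over-strict, and a few simulated cycles expose the spec
bug before anybody "formally proves" a design against it.

def rtl_counter(cycles):
    """RTL-level model of a 4-bit counter that wraps after 15."""
    count = 0
    for _ in range(cycles):
        count = (count + 1) % 16
        yield count

# The spec author wrote `count < 15` where the intent was `count <= 15`.
spec_holds = all(count < 15 for count in rtl_counter(20))
print("spec holds in simulation:", spec_holds)   # False: the spec is wrong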

Quote:
> 2) how can circuit-proving data structures avoid
> exponential size?

Here are two ideas: a) BDDs may reduce ugly logic equations to a more
compact form, under a suitable ordering of the variables.  See McMillan's
"Symbolic Model Checking", or any BDD reference.  I say 'may' because
there are worst-case functions that are not amenable to BDDs.  Fortunately,
those sorts of functions are rare in most hardware designs, although crypto
hardware is a notable exception.  b) If you maintain the function
symbolically, and do your verification that way, you may avoid exponential
size at the expense of doing a big proof instead.  If you do a brute-force
case analysis on the variables, of course you'll get an exponential number
of proof cases.  But the power of the proof tools is being able to perform
induction or otherwise make the proof tractable.
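
As a toy illustration of (b) - just a sketch with a hypothetical one-bit
cell, not any particular prover - the inductive style means you discharge
the full-adder cell once (eight cases) and then argue by induction on the
bit position, instead of simulating an n-bit adder over all 2^(2n+1) input
vectors:

from itertools import product

def full_adder(a, b, cin):
    """One-bit ripple-carry cell: returns (sum, carry-out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

# Cell obligation (the base case and the induction step both reduce to
# this): the cell really adds, i.e. a + b + cin == s + 2*cout, all 8 cases.
assert all(
    a + b + cin == s + 2 * cout
    for a, b, cin in product((0, 1), repeat=3)
    for s, cout in [full_adder(a, b, cin)]
)

Once that holds, induction on the bit position gives correctness of the
whole ripple-carry adder, so the effort is constant per bit rather than
exponential in the width.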

Quote:
> 3) how can verification work when the same formal
> algorithms are used to synthesize circuits in one direction and verify
> them in the other direction, using the same "closed system" algorithms?

I think you're confused here.  Every formal verification tool I've seen
attempts to show equivalence (or at least implication) of two different
formalisms.  For example, between a formal specification written in some
logic (be it a temporal logic, or higher-order logic, or even propositional
logic) and an implementation represented as a state machine, or BDDs, or
some other representation.  The benefit lies in showing that the
implementation satisfies the spec.

The equivalence checking folks make their money (or used to, anyway)
by showing the equivalence of two different forms of the implementation,
as when the design is revved for bug fixing.  The formal checking
hopefully takes less time than running a complete regression simulation.
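
For what it's worth, here's a rough Python sketch of the question an
equivalence checker answers (a hypothetical 2-to-1 mux, compared by brute
force; real tools use BDDs or SAT rather than enumeration, but the check
is the same):

from itertools import product

def spec(sel, a, b):
    """Behavioural 'RTL' description of a 2-to-1 mux."""
    return a if sel else b

def impl(sel, a, b):
    """Gate-level style implementation of the same mux."""
    return (sel & a) | ((sel ^ 1) & b)

# Miter-style check: the two representations must agree on every input.
assert all(spec(s, a, b) == impl(s, a, b)
           for s, a, b in product((0, 1), repeat=3))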

Quote:
> It is unfortunate that the DAC technical committee does not try to
> balance papers that advocate theories with papers that falsify theories.

No one declared DAC the end-all and be-all of formal methods.
If you've got some interesting results to present to the world, you've
already found Usenet.  Matt Drudge broke the Monica Lewinsky fiasco on
his web site - I would assume that you could create a web page on which
to publish your results.

If you're just bent because DAC rejected your paper, get over it.  The
papers of many people are rejected from DAC - they've got finite space
and time considerations.  You're actually in good company.

Quote:
> In any case, what is probably the most one-sided scientific refereeing
> in the last 500 years should make for interesting listening and
> observing.

I think you *might* be over-reacting just a smidge to having your paper
rejected.  500 years of history goes back to 1502.  Do you really believe
DAC 2002 is the most one-sided scientific event in all that time?

So what's the kernel of your gripe with formal methods?  It sounds like
you're convinced that FM is totally bogus.  I think it's merely a pain
in the butt, expensive, and arcane.  But with ICs having 50M transistors
and development schedules under 12 months, FM starts making sense compared
to the concept of thousands of workstations running simulations 24/7.

Kelly



Tue, 23 Nov 2004 14:32:59 GMT  
 Scientific puzzle of formal circuit verification at next week's DAC
You make my point.  If conferences publish papers in an area, they should
balance papers that "prove" with papers that "refute".  If such papers
were published, you would know what the technical term "degenerating
research program" means.  It is a technical term from the mid-century
philosophers of science Lakatos and Feyerabend (a good starting reference
is: Lakatos, I., "Mathematics, Science and Epistemology", Philosophical
Papers Volume 2, Cambridge University Press, 1978).

Formal verification is a degenerating research program because it
replaced the superior Polya program without any scientific debate and
because continual predictions of impending success have been made for the
last 30 years and are still being made.

This discussion should not be taking place in a newsgroup but rather
in refereed conferences.
/Steve

--
Steve Meyer                             Phone: (612) 371-2023

520 Marquette Ave. So., Suite 900
Minneapolis, MN 55402


Fri, 26 Nov 2004 02:59:05 GMT  
 