First ACM & IEEE International Conference on

Formal Methods and Models for Codesign (MEMOCODE'2003)

Programming models, formal analysis methods and verification techniques for high-level
system design and (re)engineering: towards convergence of formal methods and industrial trends.

June 24-26, 2003 -- Mont Saint-Michel, France


Industrial case studies

I. Conformance checking
Arvind, Stan Liao, Nancy Lynch, Carl Pixley, Rob Slater, Marten Van Hulst

II. Hierarchical verification
Forrest Brewer, Hans Eveking, Grant Martin

III. Abstraction vs. optimization
Brian Bailey, Stephen Edwards, Rajesh Gupta, James Hoe

IV. Incremental verification
Harry Foster, Hillel Miller, Marten Van Hulst

V. Post-production patchability
Forrest Brewer, Masahiro Fujita

Industrial case studies. Prospective authors are invited to submit novel and unpublished work: position papers and scientific contributions on actual design problems proposed by members of the program committee from major EDA industry companies.

Problem III : Should the space of implementation possibilities be determined by the abilities of high-level synthesis and validation?

Authors : Brian Bailey, Stephen Edwards, Rajesh Gupta, James Hoe

A high-level hardware description, employing high-level language constructs and/or computation models, is inherently more "abstract" than its concrete implementation. A high-level construct typically has many correct mappings to lower-level constructs, and a high-level model may allow many correct implementations and behaviors. This abstraction of "details" from a high-level description offers the potential to increase design productivity by allowing a more concise and intuitive design conceptualization and by enabling a greater scope of automatic/semi-automatic optimization and synthesis. In addition, an abstract design representation can enable more effective validation by conforming to a simpler computation model and by sparing the validation engine many peripheral details. However, the adoption of high-level modeling, validation and synthesis methodologies also implies that the designer has less exact control over the specifics of the final implementation. This decoupling between what can be specified and what is ultimately implemented can hinder a designer's ability to achieve optimality, and may even reduce the applicability of high-level validation results. This compromise presents a dilemma for microelectronic system designers, whose state-of-the-art implementations have routinely required manual intervention at all levels of the design flow.
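The point that a single behavioral specification admits many correct implementations, each conforming to the same validation check while differing in "low-level" characteristics, can be illustrated with a small language-agnostic sketch (a hypothetical software analogy, not any particular HDL flow): the spec fixes only input/output behavior, and two structurally different refinements both pass the same behavioral conformance test.

```python
def spec(xs):
    """Behavioral specification: the elements in ascending order."""
    return sorted(xs)

def impl_insertion(xs):
    """One refinement: insertion sort (simple structure, poor worst-case speed)."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

def impl_merge(xs):
    """Another refinement: merge sort (more machinery, better scaling)."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    a, b = impl_merge(xs[:mid]), impl_merge(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def conforms(impl, reference, tests):
    """Validation against the abstract model checks behavior only,
    so it cannot distinguish between the two implementations."""
    return all(impl(t) == reference(t) for t in tests)

tests = [[3, 1, 2], [], [5, 5, 0], list(range(8, 0, -1))]
assert conforms(impl_insertion, spec, tests)
assert conforms(impl_merge, spec, tests)
```

Both implementations are "correct" with respect to the specification, yet a designer who cares about the implementation's cost profile must reach below the abstraction to choose between them, which is precisely the tension the problem statement describes.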

Focus & Scope : This "methodology-enforced" tradeoff between quality and designer efficiency is not new. For example, when IBM adopted synchronous design practices in the early eighties, the decision led to hardware with lower performance than its competitors', but that hardware could be verified more effectively. In this case, trading performance for verifiability was an overall win for IBM. Nevertheless, it is not a given that a move to higher-level modeling and validation is always a good thing. As with most engineering decisions, a quantitative understanding of the actual tradeoffs involved is needed before a high-level modeling methodology can be considered mature (or ready) for practical adoption. Consequently, we seek contributions that shed light on the compromises in design quality and validation accuracy required to gain improvements in modeling and validation efficiency through specific high-level modeling approaches. We welcome contributions ranging from formal understanding of modeling limitations and tradeoffs to specific case studies and design experiences.

Example Issues/Problems :

- How essential are high-level synthesis and validation to your upcoming designs? What are the demands and requirements of your future designs in terms of synthesis and validation?

- What must designers give up to take advantage of synthesis or validation in a given high-level design framework? What, and how much, does a designer stand to gain by adopting a high-level design framework?

- What low-level control over implementation details are designers most willing to trade for increased ease of design and productivity? How does a designer decide how much performance (in terms of speed, power, size) is worth sacrificing to move to a high-level design framework?

- What low-level constructs pose the greatest challenge to a validation engine? What high-level constructs would be most helpful in specifying tomorrow's designs?

- Platforms are becoming a prevalent framework for embedded systems design. While they limit the architecture of a solution, they provide a large amount of pre-verified hardware and a stable software base on which solutions can be built. How do we provide incremental verification that ensures all of the 'rules' of the platform have been observed when additional hardware is added? How can you ensure that a function mapped onto the platform still operates as intended?