IBM Research & CRI

Haifa Verification Conference 2006

IBM Haifa Labs



IBM Tools Track 2006
October 24, 2006
Organized by IBM Research Lab in Haifa, Israel



Abstracts


Smart-lint: Improving the Verification Flow

Itai Yarom and Viji Patil, Intel

As design feature sizes shrink and more transistors are squeezed into a system-on-a-chip (SoC) IC, the sheer number of on-chip devices far outstrips a design team's ability to harness the full benefits of all the transistors. Furthermore, according to a Synopsys survey, logic bugs are among the main causes of failures in first silicon. To address these needs, the EDA community provides the logic designer with a large set of tools, including simulation, formal verification, and linting. To benefit fully from these tools, logic design teams should use them as one environment rather than as separate tools. In this paper, we demonstrate how using simulation, formal verification, and linting as one environment can provide a solution that is greater than the sum of its parts. Several groups at Intel from DEG and MG report good results using this flow.


Model-Driven Development with the jABC

Bernhard Steffen, Tiziana Margaria, Ralf Nagel, Sven Joerges, and Christian Kubczak, University of Dortmund, Germany / University of Potsdam, Germany

We present the jABC, a framework for model-driven application development based on Lightweight Process Coordination. With the jABC, users (product developers and system/software designers) easily develop services and applications by composing reusable building blocks into hierarchical (flow-)graph structures that are executable models of the application. This process is supported by an extensible set of plugins that provide additional functionality, so that jABC models can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex software systems has proven effective and well suited to cooperation between non-programmers and technical people, and it is now being rolled out in operative practice.


Detecting Design Flaws in UML State Charts for Embedded Software

Janees Elamkulam, Sandeep Kohli, Gururaja Kowlali, Satish Chandra Gupta, Sai Dattathrani, Ziv Glazberg, Ishai Rabinovitz, and Claudio Paniagua Macia, IBM

Embedded systems are used in a variety of critical devices, and the correct functioning of these devices is crucial. For nontrivial devices, exhaustive testing is costly, time consuming, and often impossible. A complementary approach is to perform static model checking to verify certain design correctness properties. Although static model checking techniques are widely used for hardware circuit verification, the goal of model checking software systems remains elusive. However, embedded systems fall into the category of concurrent reactive systems and can be expressed as communicating state machines. The behavior of concurrent reactive systems is more similar to that of hardware than of general software.

So far, this similarity has not been exploited sufficiently. IBM Rational Rose RealTime (RoseRT) is widely used for designing concurrent reactive systems and supports UML state charts. IBM RuleBase is an effective tool for hardware model checking. In this paper, we describe our experience using RuleBase for static model checking of RoseRT models. Our tool automatically converts RoseRT models into RuleBase input, allows the user to specify constraints graphically using a variation of sequence diagrams, and presents model checking results (counterexamples) as sequence diagrams consisting of states and events in the original UML model. The model checking step is seamlessly integrated with RoseRT. No prior knowledge of model checking or formal methods is expected, and familiarity with UML sequence diagrams is exploited to make temporal constraint specification and counterexample presentation more accessible. This approach brings the benefits of model checking to embedded system developers at little learning cost.
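At its core, a counterexample from a model checker is a reachable-state trace. The toy Python sketch below is not the RuleBase/RoseRT tooling; it uses a made-up sender/receiver model to show how a breadth-first search over a communicating-state-machine model yields the event trace that a counterexample sequence diagram would render:

```python
from collections import deque

# Toy communicating-state-machine model: a sender/receiver pair.
# A state is (sender, receiver); events move the pair between states.
TRANSITIONS = {
    ("idle", "idle"): [("send",  ("wait", "got"))],
    ("wait", "got"):  [("ack",   ("idle", "idle")),
                       ("drop",  ("wait", "idle"))],
    ("wait", "idle"): [("retry", ("wait", "got"))],
}

def find_counterexample(start, bad):
    """BFS for a reachable bad state; return the event trace leading to it
    (the analogue of a counterexample rendered as a sequence diagram)."""
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        state, trace = frontier.popleft()
        if state == bad:
            return trace
        for event, nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [event]))
    return None  # property holds: the bad state is unreachable

print(find_counterexample(("idle", "idle"), ("wait", "idle")))
# → ['send', 'drop']
```

BFS returns a shortest trace, which is also what makes real counterexamples readable as compact sequence diagrams.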


Seqver: A Sequential Equivalence Verifier for Hardware Designs

Daher Kaiss, Silvian Goldenberg, Ziyad Hanna, and Zurab Khasidashvili, Intel

This paper addresses the problem of formal equivalence verification of hardware designs. Traditional equivalence verification methods and tools are commonly based on combinational equivalence verification (CEV). We present a novel method and tool, Seqver, for performing sequential equivalence verification (SEV). The theory behind Seqver is based on alignability theory; in this paper, we present a refinement of that theory, strong alignability, which introduces automatic model synchronization into the verification process. Automatic synchronization (reset) of sequential synchronous circuits is considered one of the most challenging tasks in sequential equivalence verification. Earlier attempts were based on BDDs or classical reachability analysis, which by nature suffer from capacity limitations. Seqver is empowered with hybrid verification engines that combine state-of-the-art SAT- and BDD-based engines for synchronization and verification. Seqver is widely used at Intel today for formally verifying leading next-generation CPU designs.
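Seqver itself relies on SAT/BDD engines and alignability theory; as a purely illustrative sketch under drastically simplified assumptions (two invented toy machines with tiny explicit state spaces and a known common reset), sequential equivalence can be phrased as output agreement over all reachable states of the product machine:

```python
from collections import deque

# Two toy Moore machines intended to be sequentially equivalent:
# both output 1 exactly when they have seen an odd number of 1-inputs.

def step_a(state, inp):   # 1-bit parity register
    return state ^ inp

def out_a(state):
    return state

def step_b(state, inp):   # modulo-4 counter; parity is its low bit
    return (state + inp) % 4

def out_b(state):
    return state & 1

def sequentially_equivalent(reset_a, reset_b):
    """Explore the product machine from the synchronized reset states and
    check that the outputs agree in every reachable product state."""
    seen, frontier = set(), deque([(reset_a, reset_b)])
    while frontier:
        sa, sb = frontier.popleft()
        if (sa, sb) in seen:
            continue
        seen.add((sa, sb))
        if out_a(sa) != out_b(sb):
            return False, (sa, sb)  # counterexample product state
        for inp in (0, 1):
            frontier.append((step_a(sa, inp), step_b(sb, inp)))
    return True, None

print(sequentially_equivalent(0, 0))  # → (True, None)
```

Industrial tools replace this explicit enumeration with symbolic SAT/BDD reasoning, and the synchronization step (finding a common reset) is precisely the hard part that strong alignability addresses.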


Unpaved Road between Hardware Verification and Software Testing Techniques

Panel Moderator: Shmuel Ur, IBM Haifa Labs

To many, hardware design verification and software testing may seem like separate disciplines: they are usually practiced by different people, in different companies, and discussed at different conferences. Yet there are significant similarities between software development and hardware design, and the successful adoption of techniques originally developed for one field in the other suggests that the disciplines are related. One prominent crossover example is code coverage, which was first developed for software testing and is now commonly used in hardware verification. Another is the FSM-based test generator, developed for the verification of hardware modules and now successfully employed in software testing. Moreover, some techniques, such as reliability estimation, were developed for hardware, adapted for software, and are now starting to show their usefulness for hardware again. I would like the panel to analyze the similarities and differences between hardware verification and software testing and, drawing on our knowledge and experience in both fields, to identify technologies that are mature in one field and ready to cross over. Negative experiences of what did not or will not work are also important.
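To give a flavor of the FSM-based test generation mentioned above (using a hypothetical three-state protocol machine, not any specific tool), one classic recipe derives one test per transition by prefixing each transition's input with a shortest input sequence that reaches its source state:

```python
from collections import deque

# A toy protocol FSM: states mapped to {input: next_state} transitions.
FSM = {
    "idle":  {"open": "ready"},
    "ready": {"send": "busy", "close": "idle"},
    "busy":  {"ack": "ready"},
}

def tests_covering_all_transitions(start="idle"):
    """Emit one input sequence per transition: a shortest path from the
    start state to the transition's source, followed by its input."""
    # Shortest input sequence reaching each state (BFS).
    reach, frontier = {start: []}, deque([start])
    while frontier:
        s = frontier.popleft()
        for inp, t in FSM[s].items():
            if t not in reach:
                reach[t] = reach[s] + [inp]
                frontier.append(t)
    return [reach[s] + [inp] for s in FSM for inp in FSM[s]]

for seq in tests_covering_all_transitions():
    print(seq)
```

Transition coverage of the model is the hardware analogue of branch coverage in software, which is part of why the technique crossed over so naturally.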


Invited speaker: Practical Methods in Coverage-oriented Verification of the Merom Microprocessor

Alon Gluska, Intel

Functional coverage is a well-known means of measuring verification progress. However, coverage approaches such as coverage-driven and coverage-oriented verification are often difficult or impractical to implement. This paper presents the coverage methodology used in the verification of Merom, Intel's first converged-core microprocessor. We describe practical methods and applied techniques that enabled a high return on a significantly reduced investment in coverage measurement and analysis. Given the tight schedule, this approach provided a clear metric for measuring verification progress and for effectively steering resources to improve the quality of the design under test.


Invited speaker: Increasing Predictability in the Design Verification Process

Andrew Piziali, Cadence

The uncertainty with which hardware and software designs are committed to production ultimately leads to surprises, with negative consequences, after a product ships. We are learning how to quantify that uncertainty and determine the cost trade-offs of reducing it to meet business objectives. A rigorous verification planning process that identifies the scope of the verification problem and specifies its solution is the first step toward managing this uncertainty. This planning process quantifies the verification problem and prioritizes its solution sequence. The initial application space and use model of the design determine the subset of the design that is verified first. Capturing these choices in a verification plan that is both human- and machine-readable transforms this former verification artifact into an application-specific verification interface used to manage the verification process.


An Open Source Simulation Model of Software Testing

Shmuel Ur, Elad Yom-Tov, and Paul Wernick, IBM

This paper describes a discrete event simulation model, built in Matlab, that simulates the programming and testing phases of a software development project. We give three preliminary concrete examples of how this model can be used to examine the effect of adopting different strategies for coding and testing a new software system. Specifically, we provide results of simulation runs that model the effects on the coding and testing phases of different testing strategies, of adopting pair programming in an otherwise unchanged process, and of test automation. The model source code is available for download at http://qp.research.ibm.com/concurrency_testing, and we invite researchers and practitioners to use and modify the model.
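The paper's Matlab model is far richer, but a minimal Python sketch of the same idea, with invented parameters (a fixed bug-injection rate for the coding phase and a per-cycle detection probability for the testing phase), might look like:

```python
import random

def simulate(loc=1000, bugs_per_kloc=20, detect_prob=0.3,
             test_cycles=10, seed=1):
    """Much-simplified sketch: coding injects bugs at a fixed rate; each
    testing cycle finds each remaining bug independently with probability
    detect_prob. Returns the remaining-bug count after each cycle."""
    rng = random.Random(seed)
    bugs = round(loc * bugs_per_kloc / 1000)  # coding phase
    history = []
    for _ in range(test_cycles):              # testing phase
        bugs -= sum(rng.random() < detect_prob for _ in range(bugs))
        history.append(bugs)
    return history

print(simulate())
```

Varying the parameters (e.g., a lower injection rate to mimic pair programming, or more cycles at lower cost to mimic test automation) is how such a model compares development strategies.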




Proceedings publication: Springer Lecture Notes in Computer Science


