A functional ATPG as a bridge between functional verification and testing
MARCONCINI, Cristina
2008
Abstract
Embedded systems technology powers many of today's innovations and products. Rapid technological advancement fuels increasing chip complexity, which in turn enables the latest round of products. Embedded systems touch many aspects of everyday life, from the pocket-sized cell phone and digital camera to the high-end server that searches an online database, verifies credit card information, and sends the order to the warehouse for immediate delivery. Expectations for these chips grow at an equal rate, despite the additional complexity. For example, critical applications like chips that monitor the safety processes in cars require that the chip not fail during normal use of the vehicle. Likewise, it is not acceptable for a user to be denied access to an online brokerage account because the server is down. As a result, an enormous amount of engineering goes into each of these chips, whether it is a microprocessor, a memory device, or an entire system on a chip. All types of chips pose a set of challenges that engineers must solve for the final product to succeed in the marketplace. The verification flow has become a bottleneck in the development of today's digital systems. Chip complexity and market competitiveness have increased to the point that design teams spend approximately 70% of their effort finding the bugs that lurk in their designs. In particular, two main phases of the verification flow are testing [1] and functional verification [2]. The latter aims to ensure that the design satisfies its specification before manufacturing, by detecting and removing design errors. Testing focuses on quickly detecting production defects as chips come off the manufacturing line. Even though testing and functional verification are often grouped together, the two disciplines have little in common. A chip that successfully passes testing may still add one to one and get a result of three if its functional verification was poor.
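The gap between the two disciplines can be illustrated with a toy sketch (the function names and the injected bug are hypothetical, not from the thesis): silicon that perfectly matches its netlist still fails functional verification if the netlist itself computes 1 + 1 = 3.

```python
import random

def spec_add(a, b):
    """Golden reference model: correct 8-bit addition."""
    return (a + b) & 0xFF

def buggy_design_add(a, b):
    """Hypothetical design with a logic error in the carry chain:
    when both LSBs are 1, bit 0 is left set, so 1 + 1 yields 3."""
    s = (a + b) & 0xFF
    if (a & 1) and (b & 1):  # injected design bug, for illustration only
        s |= 1
    return s

def functional_verify(design, spec, trials=1000, seed=0):
    """Simulation-based check of the design against its specification;
    returns the input pairs on which the two disagree."""
    rng = random.Random(seed)
    return [(a, b)
            for a, b in ((rng.randrange(256), rng.randrange(256))
                         for _ in range(trials))
            if design(a, b) != spec(a, b)]

failures = functional_verify(buggy_design_add, spec_add)
```

Manufacturing test, applied to a chip built from `buggy_design_add`, would report it defect-free: the silicon matches the (wrong) design. Only the comparison against the specification exposes the error.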
Testing only confirms that the manufactured chip is equivalent to the circuit design handed to the manufacturing process; it makes no statement about the logical functionality of the chip itself. Design teams dealing with the verification of a system must handle three constraints: scheduling, cost, and quality. Because the success of digital systems depends heavily on hitting the marketplace at the right time, scheduling has become imperative. The use of automatic tools reduces both the verification time and the probability of committing errors. A valid solution is dynamic verification, which exploits simulation-based techniques and automatic test pattern generators (ATPGs) to generate the required test sequences. Customers expect delivered products to meet a standard of quality; this is especially true of critical applications. Furthermore, if the marketplace perceives that a product is of poor quality, it can have a devastating effect on the company. Another critical constraint is cost, which drastically influences the different verification phases. The cost of an undetected bug grows exponentially over time. If a bug is detected early during verification, it is inexpensive to fix: the designer needs only to rework the high-level design description, and the next verification run shows that the update fixed the original problem. A bug found during system test, however, may cost hundreds of thousands of dollars: hardware must be refabricated, and there are additional time-to-market costs. Finally, one of the most expensive bugs is one that the customer discovers. This not only incurs warranty replacement costs but may also tarnish the image of the company or its brand of products. Functional verification is the biggest lever affecting all three constraints: a chip can be produced early if the verification team is able to remove design errors efficiently.
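As a minimal sketch of what such a pattern generator does, the loop below selects test patterns for a tiny gate-level netlist and measures which stuck-at faults they detect. The circuit, the fault list, and the exhaustive enumeration are illustrative assumptions, not the thesis's method; real ATPGs use far smarter search.

```python
import itertools

# Toy netlist (illustrative): y = (a AND b) OR c, internal net n1 = a AND b.
def good_circuit(a, b, c):
    return (a & b) | c

def faulty_circuit(a, b, c, fault):
    """Evaluate the netlist with one stuck-at fault injected on a net."""
    net, val = fault
    if net == 'a': a = val
    if net == 'b': b = val
    if net == 'c': c = val
    n1 = a & b
    if net == 'n1': n1 = val
    y = n1 | c
    if net == 'y': y = val
    return y

# Stuck-at-0 and stuck-at-1 on every net of the toy circuit.
FAULTS = [(n, v) for n in ('a', 'b', 'c', 'n1', 'y') for v in (0, 1)]

def atpg_exhaustive():
    """Naive generator: walk all input vectors, keep each pattern that
    detects at least one not-yet-detected fault (its faulty output
    differs from the fault-free one)."""
    undetected, tests = set(FAULTS), []
    for pattern in itertools.product((0, 1), repeat=3):
        newly = {f for f in undetected
                 if faulty_circuit(*pattern, f) != good_circuit(*pattern)}
        if newly:
            tests.append(pattern)
            undetected -= newly
    return tests, undetected

tests, undetected = atpg_exhaustive()
```

On this toy example the selected patterns cover every stuck-at fault; the size of `tests` relative to the full input space is the compaction an ATPG aims for.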
The cost of re-fabricating a chip multiple times can drive development expenses to an unacceptable level and negatively affect the product schedule. Functional verification reduces the number of re-spins and removes latent problems, also avoiding quality problems in the developed products. Another advantage of functional verification is that designers are able to work at a higher abstraction level, where design descriptions are more tractable than gate-level ones. On the other hand, efficient logic-level ATPGs for digital systems are already state of the art, while high-level functional ATPGs are still in a prototyping phase. This thesis defines a methodology that exploits the positive aspects of both functional verification and logic-level testing, while also providing the benefits of testing at both of these two verification levels.
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/113571
URN:NBN:IT:UNIVR-113571