Coordination and Visibility Challenges in Design Verification

In our previous blog, we outlined the critical role of verification in chip design. For verification teams to be productive, a range of coordination- and communication-related challenges must be addressed.

For example, the design process is typically split across several groups, each handling a different aspect of the design. This division can create disconnects, miscommunication, and gaps in information sharing between groups, making it difficult for verification engineers to build a complete picture of the design when generating and running tests. It is therefore essential for the verification team to understand how the design teams are progressing and to plan accordingly.

Furthermore, verification engineers have to contend with the fact that the hardware design team keeps changing the design long after functional verification appears complete. This is especially true today, given the industry's high rate of change and the speed of innovation it demands. As a result, verification teams must be able to run regression tests regularly and understand the results of those regressions at a granular level.

As designs near completion, visibility into metadata such as test run times, and the ability to drill down into failures, become even more important. Teams need to understand why a test failed and, in some cases, pin down the root cause in enough detail to support further analysis and investigation. This information is typically buried in a test signature that must be unpacked.
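To make this concrete, unpacking a test signature usually means turning a raw failure line into structured fields (severity, simulation time, component, message) that can be filtered and aggregated. Here is a minimal Python sketch; the `UVM_ERROR @ …: […] …` line format is an assumption for illustration, since real simulator output varies by vendor and methodology.

```python
import re

# Hypothetical failure-signature format (assumed for illustration):
#   "UVM_ERROR @ 152000ns: [axi_scoreboard] data mismatch at addr 0x40"
SIGNATURE = re.compile(
    r"(?P<severity>UVM_ERROR|UVM_FATAL)\s+@\s+(?P<time>\d+)(?P<unit>[munpf]?s):"
    r"\s+\[(?P<component>[^\]]+)\]\s+(?P<message>.*)"
)

def unpack_signature(line):
    """Turn one raw failure line into a dict of structured fields, or None."""
    m = SIGNATURE.match(line.strip())
    if not m:
        return None
    fields = m.groupdict()
    fields["time"] = int(fields["time"])  # keep numeric for sorting/plotting
    return fields

sig = unpack_signature(
    "UVM_ERROR @ 152000ns: [axi_scoreboard] data mismatch at addr 0x40"
)
```

Once failures are structured this way, grouping by component or first-error time makes it far easier to spot patterns across a regression.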

To address these challenges, many chip design teams are looking for ways to improve verification. Some teams examine metadata such as testbench run times to isolate the “needle in a haystack” change that could have triggered failures. Others make the verification process more efficient through automated parsing of regression logs. For any of this to succeed, however, verification teams need a robust data processing infrastructure that just works.
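The two ideas above, mining run-time metadata and automatically parsing regression logs, can be combined in a small triage pass: collect per-test status and run time, then flag failures and unusually slow runs. The sketch below assumes a simple `TEST <name> PASS|FAIL in <secs>s` log format invented for illustration; the "unusually slow" threshold (twice the mean run time) is likewise an arbitrary placeholder.

```python
import re

# Assumed log-line format for this sketch, not any particular tool's output:
#   "TEST smoke_basic PASS in 40s"
RESULT = re.compile(r"^TEST (?P<name>\S+) (?P<status>PASS|FAIL) in (?P<secs>\d+)s$")

def triage(log_lines):
    """Return (failing tests, tests running far slower than the mean)."""
    results = [m.groupdict() for m in map(RESULT.match, log_lines) if m]
    failures = [r["name"] for r in results if r["status"] == "FAIL"]
    times = {r["name"]: int(r["secs"]) for r in results}
    mean = sum(times.values()) / len(times) if times else 0
    # Placeholder heuristic: flag anything over twice the mean run time.
    slow = [name for name, secs in times.items() if secs > 2 * mean]
    return failures, slow

log = [
    "TEST smoke_basic PASS in 40s",
    "TEST axi_burst FAIL in 55s",
    "TEST full_random PASS in 300s",
]
failures, slow = triage(log)
```

Even a crude pass like this turns a pile of logs into two short lists worth a human's attention, which is the point of the automation: narrowing the haystack before anyone starts digging.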

In our next blog, we will describe a potential solution for tracking verification and regression runs more effectively. To learn more about how our solution can help you, contact us at: