Oral/Lecture

Test Parameter Tuning with Blackbox Optimization: A Simple Yet Effective Way to Improve Coverage

Constrained random verification in industrial settings relies on parameterized tests. The parameters that control test stimulus generation are typically set when a test is first written and seldom varied afterward in nightly regressions. In this work, we formulate test parameter configuration as a blackbox optimization problem and introduce the Smart Regression Planner (SRP), an approach that automatically configures the tunable test parameters to better explore the input space and accelerate convergence towards coverage goals. The optimizer in SRP can drive the parameter updates with a Bayesian optimization technique that uses coverage from nightly regressions as feedback. Our evaluation on open-source as well as larger industrial designs demonstrates that SRP achieves up to 9.84% higher average coverage over 100 nights than the human baseline. Importantly, it converges to coverage milestones up to 20x faster than the human baseline. With high-level test parameter optimization, we introduce a problem space and an opportunity to achieve categorically higher coverage in industrial settings with very low overhead. Furthermore, through two practical use cases, we show that employing multi-objective optimization and transfer learning can further accelerate the verification process.
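The nightly feedback loop described above can be sketched as follows. This is a minimal illustration, not SRP itself: the parameter names, the synthetic coverage function, and the random proposal step are all hypothetical stand-ins (a real SRP-style optimizer would fit a Bayesian surrogate model to past (parameters, coverage) observations rather than sampling at random).

```python
import random

def run_regression(params):
    # Hypothetical stand-in for a nightly regression run: returns a
    # coverage score in [0, 1] for a given test-parameter configuration.
    # In practice this would come from simulating the design under test.
    seq_len, burst_prob = params
    return 1.0 - ((seq_len - 64) / 128) ** 2 - (burst_prob - 0.7) ** 2

def tune_parameters(n_nights=30, seed=0):
    # Simplified blackbox-optimization loop: each "night" proposes a new
    # parameter configuration, measures the resulting coverage, and the
    # best configuration seen so far is retained. Random proposals are
    # used here purely for illustration.
    rng = random.Random(seed)
    history = []
    for _ in range(n_nights):
        params = (rng.randint(1, 256), rng.random())  # tunable test knobs
        history.append((run_regression(params), params))
    return max(history)  # (best coverage, best parameter configuration)

best_cov, best_params = tune_parameters()
```

The key property the sketch captures is that the test writer's one-time parameter choice is replaced by an automated search that improves with every regression night.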

Qijing Huang, UC Berkeley
Hamid Shojaei, Google
Fred Zyda, Google
Azade Nazi, Google
Shobha Vasudevan, Google
Sat Chatterjee, Google
Richard Ho, Riho