Over the past decade, the core engines of functional verification have matured. Formal verification, simulation, emulation, and FPGA-based prototyping are the anchors that deliver verification productivity. While they continue to evolve and differentiate on core parameters like performance, capacity, and memory footprint, the next leaps in productivity will come from the fabric that binds them together, along with advanced analytics of the data the engines create. What will fuel these leaps? Will it be smarter management of the engines?
• Switching between engines to leverage advantages and availability
• Selection of best engine for the task
• Optimizing engine use for overall throughput or latency
Will it be smarter testing?
• Good cycles vs. bad cycles
• Which tests should be run more, which should be run less, and with what parameters/settings?
• Which tests give the highest return?
Or will it be something else, or all of the above?
It is safe to say that one thing all of these have in common is the need for relevant data that can be analyzed, correlated, and ranked. This panel will review the requirements for verification in an increasingly application-specific and connected world, examine the key trends in verification productivity, and consider what impact data analytics and machine learning could have.