Olivera Stojanovic, Vtool; Nemanja Mitrovic, Vtool; Anna Revitzki, Vtool
The growing complexity of SoCs is challenging the efficiency of present-day verification approaches, making verification processes increasingly involved. As waveform databases expand and simulation run times grow longer, logs and code execution traces become harder to read and understand. This demands greater resourcefulness from verification teams, alongside more powerful verification tools. By addressing several complex SoC verification bottlenecks, this paper proposes a viable, highly effective verification solution that applies Big Data techniques to standard verification outputs. The approach presented here transforms traditional debugging by applying proven AI techniques and algorithms to unified Big Data datasets derived from multiple sources. To highlight the applicability and potential of AI for complex SoC verification, the approach is tested, demonstrated, and ultimately proven successful on several NoC verification examples. It provides a nearest-endpoint address-matching algorithm that is markedly more effective at pairing source and destination endpoints in failing transfers. The approach also suggests potential resolutions for common failures by identifying the data values shared across failed transfers, and its time-scale analysis extracts deviations in transfer execution time to flag potential throughput issues, transfer timeouts, and more. Although standard debugging methodologies can resolve such common NoC verification issues, the AI-driven approach described in this paper significantly accelerates SoC verification, making the entire process far more efficient.
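To illustrate the common-value analysis mentioned above, the following Python sketch extracts attribute values shared by all failing transfers. The record format and field names (`src`, `dst`, `burst_len`, `qos`) are illustrative assumptions, not the paper's actual data model; a real flow would parse these records from simulation logs or a transaction database.

```python
from collections import Counter

# Hypothetical failing-transfer records, as might be parsed from
# simulation logs; the field names here are illustrative only.
failed = [
    {"src": "cpu0", "dst": "ddr1", "burst_len": 16, "qos": 3},
    {"src": "dma0", "dst": "ddr1", "burst_len": 16, "qos": 1},
    {"src": "cpu1", "dst": "ddr1", "burst_len": 16, "qos": 2},
]

def common_values(records, threshold=1.0):
    """Return (field, value) pairs present in at least `threshold`
    fraction of the failing transfers, with their frequencies."""
    counts = Counter(
        (field, value) for rec in records for field, value in rec.items()
    )
    n = len(records)
    return {fv: c / n for fv, c in counts.items() if c / n >= threshold}

# With threshold=1.0 this keeps only values common to every failure,
# e.g. all three transfers above target "ddr1" with burst length 16.
print(common_values(failed))
```

Surfacing such shared values (here, a single destination and burst length) points the engineer toward a likely root cause without manually scanning every failing transfer.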