
EETE FEB 2014

Fig. 2: Sample output from the Argon Coverage tool.

This is a key challenge, says Alan Scott, CEO at Argon. “We have come at it from an odd background because a lot of the team has been in chip design,” said Scott. Argon has a pipeline tool that gives the theoretical ranges and output from the test streams so that the streams themselves can be evaluated. This ensures that any issues seen come from the design of the block rather than from the test streams. This is linked to an interactive view of the specification.

“It’s a very human-readable approach,” he said. “Because you are able to measure these against the specification, we can select a tiny set of streams that covers all the things you want to test. This means you can simulate RTL in minutes without having to use an FPGA, and you can do all the development using the compressed set.”

“There’s been a little delay in people making IP blocks available for HEVC, but we are seeing a surprising number of people wanting to do their own, often because they have done H.264 in the past and want the smallest chip cost,” said Scott. “We are quite pleasantly surprised.”

The streams are produced on the same principle used in microprocessor random instruction testing. They are generated randomly at the syntax level, giving high specification coverage per byte, which is good news for simulator-based testing – see figure 1. These streams form a provably rigorous decoder test, giving silicon IP designers great confidence that any decoder which can produce correct output for this set will be able to handle anything that might be thrown at it.

The compiler produces a decoder which mirrors the structure of the specification document and has profiling hooks inserted into its code at every calculation. This allows a coverage report to be produced for a single stream or a set of streams, showing which code paths were hit or missed, and what ranges of values were supplied to each calculation.
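Argon's pipeline itself is proprietary, so the profiling-hook idea can only be sketched. Below, a toy two-clause "specification" is mirrored by a decoder whose every branch and calculation reports into a coverage object; all names are illustrative, not Argon's API.

```python
class Coverage:
    """Records which spec branches were hit and the value ranges each calculation saw."""
    def __init__(self):
        self.hits = {}     # branch name -> hit count
        self.ranges = {}   # calculation name -> (min, max) of observed values

    def branch(self, name):
        self.hits[name] = self.hits.get(name, 0) + 1

    def value(self, name, v):
        lo, hi = self.ranges.get(name, (v, v))
        self.ranges[name] = (min(lo, v), max(hi, v))

def decode(stream, cov):
    """Toy decoder whose structure mirrors a two-clause specification."""
    for symbol in stream:
        if symbol < 16:                # spec clause A: small symbols
            cov.branch("clause_A")
            cov.value("delta", symbol * 2)
        else:                          # spec clause B: large symbols
            cov.branch("clause_B")
            cov.value("delta", symbol - 16)

cov = Coverage()
decode([3, 40, 7], cov)
assert cov.hits == {"clause_A": 2, "clause_B": 1}   # both code paths were hit
assert cov.ranges == {"delta": (6, 24)}             # observed value range per calculation
```

A branch that was never exercised would simply be absent from `cov.hits`, which is exactly what a per-stream coverage report surfaces.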
This comes in the form of an interactive collapsed tree, with each branch dynamically linked to the relevant line in the specification. The compiler which produces the coverage tool can also produce an encoder that generates valid streams which are random at the syntax level. Like the coverage tool, the encoder’s structure mirrors the structure of the specification document, but where the coverage tool would read from its input stream, the encoder generates a random value, within the rules of the specification, and writes it to its output stream.

Argon is now looking at how to support other profiles in HEVC and, now that the standard is stable, is expecting more IP blocks to emerge. “We have one large IP vendor waiting for this point before kicking off their development,” said Scott.

Making sure video works is one side of the code verification challenge. Making sure all the options and elements of control code can be tested is another, being tackled by Coveritas. The software analyses the constraints of the software under test and generates random but repeatable streams of data, playing through multiple scenarios and testing for edge and corner cases. The key is that this process is both random and repeatable, and automated on a vast scale.

The last four years have been focused on a few large target customers rather than the broader market, and the company has now worked with key SoC users Cisco Systems, Ericsson, Renesas and NXP. With the release of version 2.0 of the Vitaq tool in 2013 the company is looking further afield, showing the importance of software testing at the top of the SoC ecosystem. Software, not the hardware, is now the critical factor in an SoC development, says Sean Redmond, CEO at Coveritas. Vitaq includes C++ libraries which provide the framework to express the constraints and sequential rules that are required to build such powerful environments.
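Vitaq’s actual C++ API is not described in detail here, so the constraint-and-sequential-rule idea can only be sketched. In this hypothetical Python sketch, value constraints bound each parameter and a sequential rule restricts which operation may legally follow which:

```python
import random

# Value constraints: legal values for each parameter (illustrative, not Vitaq's API).
OPS = ["read", "write", "reset"]
SIZES = range(1, 9)

# Sequential rule: which operations may legally follow the previous one.
NEXT_ALLOWED = {
    None:    ["read", "write"],           # a run may not start with a reset
    "read":  ["read", "write"],
    "write": ["read", "write", "reset"],  # reset is only legal after a write
    "reset": ["read", "write"],
}

def generate(rng, steps):
    """Draw a random but rule-obeying sequence of (op, size) steps."""
    seq, prev = [], None
    for _ in range(steps):
        op = rng.choice(NEXT_ALLOWED[prev])
        seq.append((op, rng.choice(list(SIZES))))
        prev = op
    return seq

seq = generate(random.Random(0), 6)
# every step obeys both the value constraints and the sequential rule
assert all(op in OPS and size in SIZES for op, size in seq)
```

The generator never has to reject and retry: by drawing only from `NEXT_ALLOWED`, every run it produces is a fully formed, rule-obeying scenario.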
The resulting test program is then run on a conventional workstation, and may either test the system software directly on the same workstation or interface to the electronic system development board under test using available serial interfaces, network connectivity or other suitable mechanisms. As the environment runs, Vitaq’s constrained random scenario generator automatically creates scenarios of fully formed use-case tests that obey the pre-defined rule sets. Each run with a given seed will produce the same parameter values and sequence of events, and so is precisely repeatable, while selecting a different random seed will produce a completely different run exercising completely different aspects of the software and system under test, all the while obeying the defined rule sets.

The latest interest is coming from a global SoC vendor that is offering software-defined functionality on a programmable platform and needs to test all the different use cases on the silicon, says Redmond, as this provides up to ten times the productivity in software testing without the developers having to learn new test methodologies.

Bringing hardware SoC techniques to code coverage and testing is delivering increased quality and faster time to market. Compiling a standard into an executable specification and using that to generate the test streams automates the testing of standards-based blocks. Using random yet repeatable data to test the options and branches of complex software helps improve the quality of the software and smooth the performance of the system.
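The seed behaviour described above is the standard property of a seeded pseudo-random generator; a minimal illustration (hypothetical code, not Vitaq itself):

```python
import random

def scenario(seed, steps=5):
    """Draw a use-case parameter sequence; identical seeds give identical runs."""
    rng = random.Random(seed)                  # seeded, independent generator
    return [rng.randint(1, 100) for _ in range(steps)]

assert scenario(seed=7) == scenario(seed=7)    # same seed: precisely repeatable
# a different seed produces a different run that still obeys the same bounds
assert all(1 <= v <= 100 for v in scenario(seed=8))
```

Logging only the seed of a failing run is therefore enough to reproduce it exactly.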

