Abstract:
As the semiconductor industry prepares for the Internet of Things, one of its major challenges will be lowering the DPPM (defective parts per million) rate even as device volumes continue to grow. The electronic systems populated by semiconductor devices are moving from items of convenience (PCs) to items of necessity (smartphones) to mission-critical systems (autonomous automobiles), and the demand for greater device quality is reaching all market segments.
In the face of these ever-tightening quality requirements, Big Data analytics is becoming essential to ensure that no suspect devices are shipped into the end market. Semiconductor manufacturing test today is a "one size fits all" process, with every device subjected to the same battery of tests.
In a manufacturing environment where data can be aggregated in real time from multiple test points, Big Data analytics enables companies to establish a "Quality Index" that scores every individual device independently, providing the opportunity to optimize the test flow while simultaneously improving overall quality metrics.
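The presentation does not specify how such a score would be computed; as a purely illustrative sketch, one common approach is a composite z-score, where each device's measurements from several test points (names, test points, and values below are all hypothetical) are compared against population statistics aggregated from prior test data:

```python
import statistics

def quality_index(device_readings, population_stats):
    """Score one device as the mean absolute z-score across its test
    measurements. Lower scores mean the device sits closer to the
    population norm; higher scores flag statistical outliers."""
    z_scores = []
    for test_name, value in device_readings.items():
        mean, stdev = population_stats[test_name]
        z_scores.append(abs(value - mean) / stdev)
    return statistics.fmean(z_scores)

# Population statistics (mean, stdev) aggregated from earlier devices
# at each test point -- hypothetical parameters for illustration.
stats = {"idd_leakage_ua": (12.0, 1.5), "vth_mv": (450.0, 10.0)}

# A device near the population norm scores low; an outlier scores high
# and could be routed to extra screening instead of the standard flow.
typical = quality_index({"idd_leakage_ua": 12.3, "vth_mv": 452.0}, stats)
outlier = quality_index({"idd_leakage_ua": 18.0, "vth_mv": 480.0}, stats)
```

A score like this lets the test flow be tuned per device: high-scoring outliers receive additional screening, while devices well inside the population norm can skip redundant tests, reducing test time without sacrificing quality.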
This presentation will show how semiconductor companies today are putting Big Data solutions in place to improve overall product quality while simultaneously reducing their manufacturing costs, using only data they already possess.