Customer Story
Racing Toward the Future
With Silicon – and JMP®
Polar Semiconductor designs and manufactures advanced wafers for automotive computer chips. JMP® helps the company save time, reduce costs and continually improve in a competitive market.
Polar Semiconductor
Challenge: Optimize efficiencies, manage costs, make the best use of equipment and data, and continually improve the manufacturing of innovative wafer solutions.

Solution: Apply JMP to automate and optimize the collection, preparation, analysis, reporting and workflows of engineering and process data.

Results: JMP scripting, data visualization and dashboards give Polar the data insights and accuracy it needs to keep fab equipment running smoothly, achieve efficient processes, maintain top product quality and deliver new innovations.
In the digital economy, products increasingly rely on information technology to deliver new capabilities. That’s especially true in the automotive industry, where integrated circuits have become integral to virtually every component and will enable future functionality such as autonomous operation.
The foundation of many automotive computer chips is advanced wafer technology engineered and manufactured by Polar Semiconductor. Based in Bloomington, MN, Polar is the wafer fabrication division of integrated circuit designer Allegro MicroSystems and a subsidiary of Japanese semiconductor manufacturer Sanken Electric. Polar’s skilled engineers use sophisticated equipment and a 140,000-square-foot clean room to produce more than 3,000 wafers a week for automotive and other advanced industrial applications.
Semiconductor manufacturing is a competitive industry, with razor-thin margins and constant pressures to deliver higher quality and capacity. To compete and win, Polar needs to optimize efficiencies, manage costs, fully use its data and equipment, and continually innovate. To achieve these goals, the company relies on state-of-the-art analytics.
Greater demand, greater challenges
The automotive and other industrial computer chips built on Polar’s wafer technology couldn’t be more in demand. “What’s driving the industry right now are automotive and the Internet of Things [applications],” says Jim Gillard, the Operations Master Black Belt for Polar. “If you look at your car and how many electronic features are controlled by integrated circuits, it’s just growing so quickly.”
Yet huge demand doesn’t mean small challenges. Like many companies, Polar needs to drive innovation, improve quality and increase output – all while controlling costs. Wafer fabrication equipment is expensive; foundries can’t make huge capital investments in physical assets every time they need to deliver advancements. That calls for optimizing existing equipment to achieve new competitive advantages.
“We’re a 200mm fab, and right now 200mm fabs around the country are full,” Gillard explains. “It’s really putting pressure on getting more from our existing equipment and upgrading the right tools with the right features at the right time.”
For example, if you’re etching wafers with radio frequency plasma, you need to make sure the output is consistent, regardless of the equipment being used. “There’s a significant amount of engineering effort put toward tool-matching and variation reduction,” says Evan Ngo, a Senior Electrical Test Engineer for Polar.
Matching process outputs and reducing sources of variation involves capturing and analyzing large volumes of data. “We collect many real-time tool signals for things like radio frequency powers and their matching network parameters, microwave powers, pressures, temperatures, gas flows, etc.,” Ngo explains. He and his colleagues must keep those signals well-controlled, using tight inline limits to identify and remedy faults before a misbehaving tool can skew the distributions of the electrical device tests at the end of the line.
“Keep in mind that one wafer sees a couple hundred tools in its lifetime, and a couple hundred parameters per tool,” Ngo adds. “If you have 25 wafers in a lot, that’s a lot of data to analyze and correlate. So you need a statistical program like JMP.” The arithmetic bears him out: a couple hundred parameters on each of a couple hundred tools means on the order of 40,000 signal streams per wafer, multiplied across all 25 wafers in a lot.
Mastering data flows
Gillard and Ngo both have extensive experience with JMP. Gillard, a Lean Six Sigma Master Black Belt, specializes in process engineering, design of experiments (DOE), critical-to-quality (CTQ) performance, statistical process control (SPC) and the Cpk process capability index. He provides training in Six Sigma and the statistical software to go with it, and JMP is center stage.
Gillard uses JMP to analyze Polar’s process data, working with groups across the company to identify potential trouble areas and widen or tighten limits where needed. He has developed JMP scripts to collect, prep, analyze and report data for CTQ, SPC and Cpk – in the process, improving SPC charts by 35 percent and Cpk by 18 percent.
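Polar’s scripts aren’t reproduced in this story, but a minimal JSL sketch of that kind of automated SPC and Cpk reporting might look like the following; the table name, column name and spec limits are hypothetical stand-ins, not Polar’s actual values:

    // Minimal JSL sketch of automated SPC/Cpk reporting.
    // "ProcessData.jmp", :Film Thickness and the spec limits are hypothetical.
    dt = Open( "ProcessData.jmp" );

    // Control chart for one critical-to-quality parameter
    Control Chart Builder( Variables( Y( :Film Thickness ) ) );

    // Capability (Cpk) analysis against illustrative spec limits
    Distribution(
        Continuous Distribution(
            Column( :Film Thickness ),
            Capability Analysis( LSL( 95 ), USL( 105 ) )
        )
    );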
Gillard began using JMP with a previous employer. “Everyone started using JMP because you could script and pull in data directly,” he recalls. “Once people saw it, they’d say, ‘Whoa, I just circle this and then this graph highlights – wow!’ Scripting, pulling in the data, visualizing, doing that over and over – you can’t really do that in Excel or Minitab.”
Ngo, who has participated in Gillard’s Six Sigma training, works in Polar’s electrical test area. He analyzes electrical data to monitor patterns and identify root causes of failures. The goal is to make sure every product is top quality. “A co-worker first showed how to do a variability analysis plot in JMP, and I said, ‘Perfect, that’s exactly what I wanted!’”
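The variability analysis he describes is a one-line platform launch in JSL; the response and grouping columns below are hypothetical:

    // Hypothetical columns: one electrical measurement grouped by tool and lot
    Variability Chart(
        Y( :Leakage Current ),
        X( :Tool, :Lot ),
        Std Dev Chart( 1 )
    );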
He further embraced JMP as he began to work with larger data sets. “Excel can handle only about 1 million rows,” he points out. “And VLOOKUPs in Excel are pretty painful when you’re doing 10 or 20 of them.” Ngo works with tables that can be several thousand columns by several million rows. “JMP can handle tables with millions of rows easily. That, along with quick table joins and scripting, makes it useful for doing repeat correlations and analyses,” he says.
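In JSL, the work of a stack of VLOOKUPs collapses into a single Join message. A sketch, with hypothetical table and key-column names:

    // Two hypothetical source tables keyed by lot and wafer IDs
    inline = Open( "inline_measurements.jmp" );
    etest = Open( "electrical_tests.jmp" );

    // One join replaces row-by-row lookups, even on tables with millions of rows
    joined = inline << Join(
        With( etest ),
        By Matching Columns( :LotID = :LotID, :WaferID = :WaferID )
    );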
Data analysis according to script
Gillard and Ngo both do extensive scripting in JMP. Gillard, for example, has scripted parametric model functions for electrical tests and inline functions for the process itself. The scripts organize the data into graphs and assign them to a user, who can simply click a file and quickly review the charts. “In the past we had to look at one chart at a time, which wasn’t efficient,” Gillard says.
Ngo has built applications to help Polar’s electrical test area analyze device characteristics such as resistances, capacitances, leakages, voltage thresholds, breakdown voltages and more by correlating these values to data collected inline. “Now we have a way to quickly visualize what’s going on for any given product and we can use scripted functions to further analyze what we see to verify our hypotheses,” he says. “If there are abnormalities, we can now find the root cause faster and be more confident in our findings.”
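A hedged JSL sketch of that correlation step, using hypothetical column names: a fitted line for one inline-to-device pair, then a correlation matrix across several parameters at once:

    // Does an inline signal (etch RF power) shift an end-of-line device test?
    // All column names here are hypothetical.
    Bivariate(
        Y( :Threshold Voltage ),
        X( :Etch RF Power ),
        Fit Line
    );

    // Correlations across several device and inline parameters
    Multivariate(
        Y( :Threshold Voltage, :Breakdown Voltage, :Etch RF Power, :Chamber Pressure ),
        Correlations Multivariate( 1 )
    );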
He also scripted a dashboard to monitor the company’s line yield and display standardized key metrics that route actions to their owners. “I first fine-tuned how we collect our scrap data, because if you’re dashboarding on bad data, there’s no point,” he observes. “Now we have data collection set up to be ownership-oriented, and we can track scrap across our entire facility and view the data in the same way across all groups.” Ngo emphasizes the significant time savings of JMP scripts. “Before this, groups used to manually pull data from a SQL database and make their own charts; you can see how time-consuming this would be if it needed to be done weekly.”
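A scripted replacement for that manual routine might look like this in JSL; the connection string, query and column names are hypothetical:

    // Pull scrap data straight from the database (hypothetical DSN and query)
    scrap = Open Database(
        "DSN=FabDB;",
        "SELECT week, area, scrap_count FROM scrap_log",
        "Scrap Data"
    );

    // One standardized trend chart every group views the same way
    Graph Builder(
        Variables( X( :week ), Y( :scrap_count ), Overlay( :area ) ),
        Elements( Line( X, Y ) )
    );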
Do you see what I see?
The ability to share data with key stakeholders is just as important. “One challenge a lot of companies deal with is, ‘Where is the information?’” Gillard believes. “Having a dashboard that consistently communicates data is very valuable. Rather than having to perform an analysis repeatedly, you can just check your dashboard. Or if you have an action list, you can just publish that on the dashboard for various people to use. Those dashboards help us immensely, and we create them in JMP.”
Going forward, Gillard and Ngo are looking at JMP Pro for big data analysis. “People are talking about sensor data and machine learning and all this data that will be captured from machines every second,” Gillard says. “That involves a tremendous amount of data. It will take tools like JMP to dig through it and figure out what’s important.”