Revolutionizing Data Analysis and Delivery in Semiconductor High Volume Manufacturing

Kevin Lennon & Adrian Porter

Contents

Abstract

Bios

Introduction

Intel

Intel Ireland

Part 1: Conception and Beginning

Analysis Tool Considerations

The JClick! Method

Technical Growth

Simple JClick!s

Beginnings of Code

Data Consolidation

Learning

The JCat! – The JClick! Analysis Tool

Local Growth

Part 2: Initial Growth

Intel Ireland Internal Lean Conference

Crossing the Atlantic

A Philosophy of Sharing

Part 3: Takeoff

IMEC 2010: Intel Manufacturing Excellence Conference

ACE – The Analytically Capable Engineer

The JClick! class

Support Systems

JClick! and the Manager

Standardization

Effective Meetings

Managing the Resource

A Total Solution

The Return to IMEC

Sustained Growth

Part 4: JClick! and you

Running JClick!

Building Momentum by Sharing and Collaborating

Quality of Code

Abstract

Intel Corporation, one of the world’s largest semiconductor manufacturers, supplies processors used in every part of the connected world, from high-powered multi-core servers to ultra-low-power Internet of Things devices. The manufacture of the silicon wafers behind these world-class products creates terabytes of data, data that can, and must, be used to maximize value.

In 2009 the authors initiated a data analytics revolution by creating an automated data delivery system at Intel Ireland’s Fab 24 site that has gone on to save over a thousand engineers hundreds of thousands of hours of engineering time. It has automated mundane tasks to allow immediate solutions, and it has detected defect signals that were previously undetectable. The system is called JClick! and JMP is right at its heart. What started as a lean project in a single process engineering department has spread to 7 high volume manufacturing facilities and is being used by thousands of employees daily. By focusing on the individual needs of the engineer and looking for lots of small wins, JClick! has succeeded in making analytics personal. JClick! is a remarkable tale of technical innovation and inventive marketing, one everyone should hear.

Bios

Adrian Porter is a Senior Staff Technologist at Intel’s Fab24 facility. Whilst his core role is as a Lithography Process Expert, he specialises in Advanced Problem Solving and Data Visualisation of Factory Performance. He is a 22-year veteran of Intel, having worked in Ireland and across Intel’s global network.

Kevin Lennon is an Intel Staff Product Development Engineer. He is a recognized industry expert in the lean applications of analytics and most recently served on the Board of Accreditation for Ireland’s first undergraduate Data Science degree.

Introduction

Intel

Intel Corporation (also known as Intel, stylized as intel) is an American multinational corporation and technology company headquartered in Santa Clara, California (in the area colloquially referred to as "Silicon Valley") that was founded by Gordon Moore (of Moore's law fame) and Robert Noyce. It is the world's largest and highest-valued semiconductor chip maker based on revenue, and is the inventor of the x86 series of microprocessors: the processors found in most personal computers (PCs). Intel supplies processors for computer system manufacturers such as Apple, Inc., Lenovo (formerly IBM), Hewlett Packard, Inc. and Dell. Intel also manufactures motherboard chipsets, network interface controllers and integrated circuits, flash memory, graphics chips, embedded processors and other devices related to communications and computing.[1]

Intel Ireland

Intel Ireland's Leixlip campus, located in County Kildare, began operations in 1989. Since then, Intel has invested over $12.5 billion in turning the 360-acre former stud farm into one of the most technologically advanced manufacturing locations in Europe.

The Leixlip campus is home to a semiconductor wafer fabrication facility which produces latest generation silicon microprocessors that are at the heart of a variety of platforms and technology advancements which are essential to the way we learn, live and work today.

Intel first came to Ireland in 1989, establishing what was to become one of Europe’s leading semiconductor manufacturing locations at Collinstown Industrial Park in Leixlip. Today, more than 4,500 people work at the campus, and in March 2014 Intel shared details of the progress of a $5 billion campus upgrade investment at the Leixlip campus, the largest private investment in the history of the Irish State, which will prepare the facility to manufacture latest generation Intel process technology on 300mm wafers. The latest investment by Intel in the Leixlip campus brings the cumulative capital invested in Ireland over the past 25 years to $12.5 billion.[2]

Part 1: Conception and Beginning

In early 2008 Intel Ireland embraced the lean methodology. They began a series of educational seminars on eliminating waste and standardization and this new thinking was applied to all areas of work. It was at this time the authors identified an area for development centered on improving capability in data extraction and analysis.

A typical semiconductor factory can produce gigabytes of data per fabricated wafer. Multiple measurements are taken at hundreds of process steps: electrical measurements, film thicknesses, hardness, critical dimensions and thousands of end-of-line tests are taken on every lot that runs in the factory, not to mention the endless amount of data collected on defects and lot movement (tool chambers, process times and so on). All of this data is vital to ensuring a stable process, identifying process issues and delivering product to customers on time.

While applying the lean toolbox provided by Fab 24 management, the authors found that engineers were using multiple extraction tools and multiple analysis tools, with multiple ways of using them. They also found that engineers were spending far more time preparing data than making decisions: up to 90% of their time was spent extracting and preparing data, a pattern the authors would later find repeated in every company. Even simple differences in how engineers presented their work slowed down how quickly a decision could be made. For example, an engineer working a shift at the start of the week might use an x-axis scaled by week, while the engineer at the end of the week might scale the x-axis by day. These subtle differences served only to slow down decision making. Couple this with the fact that each analysis was done manually, and that the most common way to deliver an analysis was email, and the result was a process flow loaded with waste.

The authors wanted to change the paradigm of data analysis, and they set their sights on both standardizing and improving the efficiency of data extraction and analysis in Fab24. The new paradigm was to have the data they needed at their fingertips and the required analysis ready to be applied. Before they could arrive at this point, however, the first step in standardization was to pick the tools for the job. They quickly settled on a proprietary extraction tool that would automate the data extraction, but the analysis tool required more consideration. They evaluated several options before choosing JMP. The table below shows these tools and the considerations.

Analysis Tool Considerations

Tool / Considerations
SAS / Provided all the analysis requirements. Able to operate on large data sets. Required learning to code in it. High barrier to entry.
JMP / Easy to use. Provided all analysis requirements. Bristling with interactivity. Already a standard tool used by all engineers. Easy to extract the code behind an analysis.
Excel / Limited analysis capability. Relatively easy to automate. Large amount of user support.
Perl / Large amount of user support online. Solutions for most of the analyses existed but were difficult to find. High barrier to entry for non-programmers.

The JClick! Method

The two most important factors in the success of JClick! were the simplicity of the system and the marketing of that system.

The JClick! method, without exception, uses the following flow. This is the key to reducing hours of analysis to seconds and, while simple, is the most significant lesson of this paper.

  1. Schedule an SQL query to run at the required frequency.
  2. Send the output of that query, as a CSV file, to a shared location accessible by anyone who needs it.
  3. Create a JSL script that opens that CSV file and performs an analysis. The key in creating these scripts is to let JMP do the vast majority of the scripting for you.
  4. Save the JSL script in the same shared location. Embed a script that calls this JSL script (using Include) in a button on a journal.
  5. Distribute the journal.
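Steps 3 and 4 above can be sketched in JSL; the file paths, column names, and choice of analysis below are hypothetical placeholders, not the authors' actual scripts:

```jsl
// tool_trend.jsl — the analysis script saved in the shared location (step 3)
dt = Open( "//shared/JClick/tool_trend.csv" );          // output of the scheduled query (step 2)
dt << Variability Chart( Y( :Thickness ), X( :Tool ) ); // analysis code largely generated by JMP itself

// Script embedded in a journal button (step 4): it only calls the shared script
Include( "//shared/JClick/tool_trend.jsl" );
```

Because the button holds nothing but an Include(), updating the shared script updates the analysis for every user the next time they click.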

This journal, in step 5, becomes the analytic hub for your team. Whenever an engineer needs to do an analysis, they open the journal and the most up-to-date data is there, along with the most up-to-date analysis, all at the click of a button. Alternatively, the journal itself can be stored in the shared location and opened through a custom menu item.

Technical Growth

Once the method was established the authors began to grow their technical depth in JMP, JSL, SQL and general automation methods. Over the next couple of years they would, along with the rest of the JClick! team, master their craft.

Simple JClick!s

At first they simply coded up graphs that they had previously generated manually, for example using variability plots to identify process tool mismatches (see Figure 1 and Figure 2).

Figure 1: Tool Comparison Variability Plot

Figure 2: Variability Plot with Trending

Beginnings of Code

They quickly realized that growing their JSL knowledge would allow analyses that were simply impossible without a great deal of manual work. The simplest example of this is the 24hr reference line seen in Figure 3. This simple addition to the code generated by JMP meant that you could always see data that was new in the last 24 hours, and the reference line would update every time the script was executed.

Figure 3: 24hr Reference Line
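A sketch of such an addition is below, assuming illustrative column names and a time-valued x-axis; which AxisBox holds the time axis depends on the particular report:

```jsl
// Open the shared data and launch a trend chart (names are illustrative)
dt  = Open( "//shared/JClick/tool_trend.csv" );
biv = dt << Bivariate( Y( :Thickness ), X( :Process Time ) );

// Add a dashed red reference line 24 hours before "now".
// Today() returns the current datetime in seconds; In Days(1) is one day of seconds.
Report( biv )[AxisBox( 2 )] << Add Ref Line( Today() - In Days( 1 ), "Dashed", "Red", "24hr" );
```

Because the line is computed from Today() each time the script runs, re-executing the script always marks off the most recent 24 hours of data.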

Data Consolidation

One of the earliest uses for JClick! was to consolidate and personalize data for the engineers who were accessing it. Using JMP and JSL, Kevin and Adrian were able to gather the data they needed to do their job and consolidate it in a journal. This meant that instead of going to several different automated systems they had the data in one place. It wasn’t long before this simple use of JClick! became a go-to tool for the authors’ local team (see Figure 4).

Figure 4: Data Consolidation
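A consolidation journal of this kind might be sketched as follows; the window title, button names, and script paths are hypothetical:

```jsl
// One journal window, one button per data source (all names illustrative).
// Each button runs the latest shared script, which opens its own CSV.
nw = New Window( "Team Dashboard",
    << Journal,
    Button Box( "Tool Status",   Include( "//shared/JClick/tool_status.jsl" ) ),
    Button Box( "Defect Rates",  Include( "//shared/JClick/defect_rates.jsl" ) ),
    Button Box( "Yield Summary", Include( "//shared/JClick/yield_summary.jsl" ) )
);
```

The journal replaces several separate automated systems with a single entry point that each team can tailor to its own sources.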

Learning

The authors had a lot of tools at their disposal and leveraged many of the learning opportunities provided by SAS to grow their capabilities; some examples are listed below.

  • The JSL Scripting Guide provided a detailed reference on how JSL is constructed.
  • A free seminar in Marlow in 2009 provided insight into the partition platform and the row filter.
  • That same Marlow seminar also introduced the authors to Stephen Few and the benefits of effective visualization.
  • Discovery conferences like this one provided information on using JMP for design of experiments and even on using neural networks for defect detection.
  • Keynotes at Discovery conferences, from speakers such as Jonah Lehrer, Kaiser Fung and Dick De Veaux, opened the authors’ eyes to new ways of approaching problem solving.
  • Webcasts also provided a valuable learning tool, Xan Gregg’s remarkable Graph Builder being one of the most viewed webcasts in Ireland.
  • But ahead of all of these, the most valuable learning came from collaboration within the growing JClick! team. Sharing and healthy competition meant the pace of improvement was far faster than it could have been otherwise.

The JCat! – The JClick! Analysis Tool

The next breakthrough in the authors’ journey came when they began to discover the wonderful world of display boxes, data filters and buttons. This provided a giant step forward in their ability to do analysis. In fact this tool provided the means to answer John Sall’s call from Discovery 2010: “Be brave. Do it live”. By stitching together a graph, a data filter and a number of analysis scripts embedded in buttons, an engineer could get to a decision so quickly that the analysis was often done before the meeting was over. Saying “I’ll have that by the next meeting” was a thing of the past. A redacted version of the JCat! can be seen in Figure 5.

Figure 5: The JCat!
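A minimal sketch of this pattern, with illustrative column names and script paths rather than the authors' actual JCat! code, might stitch the three pieces together like this:

```jsl
// Open the shared data (path and columns are illustrative)
dt = Open( "//shared/JClick/lot_data.csv" );

// A trend graph with a local data filter attached, so filtering
// affects only this analysis and not the whole data table
gb = dt << Graph Builder(
    Variables( X( :Process Time ), Y( :Defect Count ) ),
    Elements( Points() )
);
gb << Local Data Filter( Add Filter( Columns( :Tool, :Lot ) ) );

// Buttons appended to the report launch deeper, pre-scripted analyses
Report( gb ) << Append(
    H List Box(
        Button Box( "Tool Comparison", Include( "//shared/JClick/tool_compare.jsl" ) ),
        Button Box( "Defect Pareto",   Include( "//shared/JClick/defect_pareto.jsl" ) )
    )
);
```

The interactive filter narrows the view live in a meeting, and each button answers a follow-up question in one click instead of a follow-up action item.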

Local Growth

By consolidating data, creating innovative applications and implementing a central repository for the JClick! applications, the authors now had a tangible product that could be shared with other engineers. They started with their local team and began designing applications that would provide efficiency improvements. In fact one of the first applications to be widely adopted saved 60 engineers in one department approximately 30 minutes a day. JClick! dashboards (JMP journals with script buttons) soon became a tactical necessity for managing that same department. JClick!s were set up to inform engineers of tool status, product movement, defect rates, yield performance and more, all in a far more accessible and tailored way than ever before.

While the authors were mastering the skills required to make JClick! possible, they also made what was perceived as a complex process simple. So simple, in fact, that they believed anyone could learn it and create their own JClick!s. It was this determination that would fuel the growth taking JClick! from two engineers to over two thousand.

Like many large companies, Intel factories provide many in-house data delivery systems that cater to many people. However, they can rarely provide 100% of the solution to 100% of the customers. JClick!, with its highly personalized delivery, closed the gap that could not be closed by the larger solutions.

Part 2: Initial Growth

Intel Ireland Internal Lean Conference

In late 2009 Intel Ireland held an internal lean conference to present the improvements that had been brought about during its lean journey. This was a perfect opportunity to showcase JClick! to the rest of the factory. Presenting the work served two purposes: it allowed the authors to share many of the applications they had created, but it also gave them insight into how they might fine-tune the methodology.

It was also for this conference that the name JClick! was conceived. The decision to give the methodology a name turned out to be an inspired choice: 7 years on, that name carries a very specific meaning for a type of analytical application within Intel’s factories. The phrase “let’s put it in JClick!” can be heard from Vietnam to Israel.

Crossing the Atlantic

Intel factories employ a method that they call Copy Exactly! The Copy Exactly! methodology focuses on matching the manufacturing site to the development site. Matching occurs at all levels for physical inputs and statistically-matched responses (outputs). This process enables continuous matching over time by using coordinated changes, audits, process control systems, and joint Fab management structures. [3]

Because of Copy Exactly!, factories running the same process technology use the same equipment, the same databases and the same metrics to measure performance. This meant that any of the JClick! applications built in Ireland could be transferred to the Arizona site (which was running the same technology) with almost no modifications.

This transfer happened when Tracy Hendricks, a peer of one of the authors (Kevin), was given a demonstration of the Ireland JClick! dashboard. Within weeks she had transferred the entire dashboard and was evangelizing the methodology within the Arizona site. Tracy became the first customer and also the first person to join the JClick! team of two. Her enthusiasm for the methodology and her eagerness to learn it meant that the team soon had another engineer building applications for her own organization.

A Philosophy of Sharing

For a long time, complex data analysis and scripting were seen as the purview of an elite few, and it was thought that only a certain type of person could learn to build automated applications. The authors were determined to turn this belief on its head. They believed they had simplified the process of creating an automated data solution to the point where anyone could do it, and they were determined to share everything they learned with anyone who wanted to learn. By enabling every engineer with the tools to create their own personalized applications, they could amplify the impact of their work while making every employee more efficient at their job.

Part 3: Takeoff

IMEC 2010: Intel Manufacturing Excellence Conference

JClick! was chosen to be a display at the 2010 Intel Manufacturing Excellence Conference. IMEC, a biennial event attended by a worldwide audience of 1000 selected Intel employees, shares papers, presentations, and exhibits to proliferate "best known methods" across the company. A rigorous selection process determines the exhibits and presentations (only 1% of papers are selected).[4]

The team of three had prepared a demonstration dashboard that showcased the various applications they had created and were currently using within their organizations. The display was a hit, and during the first session various factory managers showed great interest in what was on show and in how the team had solved both simple and complex problems. This interest continued for all 4 days.

JClick! had struck a chord with the engineers who attended the conference. The team continually explained that this method was accessible to everyone with just a small amount of personal investment. What’s more, JClick! provided an answer for the many thousands of automated solutions that were needed but were just too small for the traditional software development approach.