
Manufacturing memory means scribing silicon in a sea of sensors



Enlarge / How it's made: silicon wafers! (credit: Micron)

At Micron's memory chip fabrication facility in the Washington, DC, suburb of Manassas, Virginia, the entire manufacturing area is blanketed in electronic detectors in all their various forms. But the primary purpose isn't to keep intruders out or anything so prosaic. "A lot of them are microphones," a spokesman for Micron said. "They listen to the robots."

It turns out that there are thousands of microphones throughout the facility, or "fab," as silicon manufacturing plants are commonly known. There are microphones inside the giant $70 million cameras that imprint the component layout on the silicon surface of a memory chip. There are microphones lining the robot-controlled railways that carry colorful plastic FOUPs (front-opening unified pods) along the ceiling throughout the plant. There are microphones near essentially every moving part in the facility.

All those thousands of microphones are listening for signs of wear, for variances to develop in the noises made by the machines, so that maintenance can be scheduled before anything breaks and causes downtime. Downtime, as you might imagine, is about the worst thing that can happen to an automated chip-making facility.

Listen up

Micron engineers have created an AI system that uses deep learning to visualize the sounds produced by moving parts within the production machinery. The system creates a full-color, time-based spectral display, which the AI software then watches for changes over time. The images resemble the sonar displays used by Navy ships and submarines to detect underwater noises.

Enlarge / An older sonar waterfall display, showing spectrographic data by bearing over time. This is similar to Micron's visualizations.
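The article doesn't describe Micron's exact signal chain, but a minimal sketch of how a microphone signal could be turned into that kind of time-based spectral display, using SciPy and Matplotlib, might look like this (the synthetic signal, window sizes, and plot details are illustrative assumptions, not Micron's pipeline):

```python
# Minimal sketch (not Micron's actual pipeline): turn a microphone signal into
# a time/frequency "waterfall" image that software can watch for changes.
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram

SAMPLE_RATE = 48_000                              # Hz, assumed
t = np.arange(0, 5.0, 1.0 / SAMPLE_RATE)          # five seconds of "audio"

# Stand-in for a real recording: a motor hum at 120 Hz plus broadband noise,
# with a faint 3 kHz whine growing over time (a pretend sign of wear).
audio = (np.sin(2 * np.pi * 120 * t)
         + 0.02 * t * np.sin(2 * np.pi * 3_000 * t)
         + 0.1 * np.random.default_rng(0).normal(size=t.size))

# Short-time Fourier transform: frequency content in overlapping windows.
freqs, times, power = spectrogram(audio, fs=SAMPLE_RATE,
                                  nperseg=2048, noverlap=1024)

# A log scale makes quiet harmonics (early signs of wear) visible.
plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Acoustic waterfall for one moving part")
plt.savefig("waterfall.png")
```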

Because each moving part has a unique sound signature, the AI software first has to learn what each part sounds like in normal operation before it can detect problems. With some time and guidance, the deep learning software can detect how badly a part is degraded, and in some cases even diagnose what's wrong with it.

The ability to detect potential failures and to classify the sound signatures comes from convolutional neural networks (CNNs), which are used to extract detailed information from audio spectrographic images. The images are stored and used later as acoustic fingerprints for classification of potential problems.
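Micron hasn't published its model, but a minimal PyTorch sketch of the general idea, a small CNN that maps a spectrogram image to a condition class, could look like this (the class labels, layer sizes, and input shape are assumptions):

```python
# Minimal sketch, not Micron's model: a small CNN that maps a spectrogram
# image to a condition class ("normal", "worn", "failing" are assumed labels).
import torch
import torch.nn as nn

NUM_CLASSES = 3  # assumed: normal / worn / failing

class SpectrogramCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pools down to a fixed-size vector
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of single-channel spectrogram images, shape (N, 1, H, W)
        fingerprint = self.features(x).flatten(1)   # the "acoustic fingerprint"
        return self.classifier(fingerprint)

model = SpectrogramCNN()
dummy_batch = torch.randn(8, 1, 128, 128)     # eight fake spectrogram images
scores = model(dummy_batch)                   # (8, NUM_CLASSES) class scores
```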

Acoustics are used for more than just listening for worn bearings. Micron has also used its microphones to find water and air leaks and failed components by mapping the minute sounds of those leaks, along with the location from which they are emanating.
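The article doesn't say how the location is worked out. One common approach, offered here purely as an illustration rather than as Micron's method, is to compare when the same sound reaches neighboring microphones; a toy time-difference-of-arrival sketch with made-up numbers:

```python
# Toy sketch of one common localization idea (not necessarily Micron's method):
# estimate the time difference of arrival (TDOA) of a hiss at two microphones.
import numpy as np

SAMPLE_RATE = 48_000          # Hz, assumed
SPEED_OF_SOUND = 343.0        # m/s at room temperature

rng = np.random.default_rng(0)
hiss = rng.normal(size=4_800)                # 0.1 s of broadband leak noise
delay_samples = 24                           # true offset between mics (made up)

mic_a = np.concatenate([hiss, np.zeros(delay_samples)])
mic_b = np.concatenate([np.zeros(delay_samples), hiss])  # same hiss, arriving later

# Cross-correlate and find the lag with the strongest match.
corr = np.correlate(mic_b, mic_a, mode="full")
lag = corr.argmax() - (len(mic_a) - 1)

tdoa = lag / SAMPLE_RATE
extra_path = tdoa * SPEED_OF_SOUND           # how much farther mic B is from the leak
print(f"estimated lag: {lag} samples, extra path: {extra_path:.3f} m")
```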

Of course, that's not all AI does in the fab, nor are all the sensors simply arrays of microphones. A process called "real-time defect analysis" uses deep learning to perform image analysis of silicon wafers on the way to becoming memory chips. The image analysis looks for defects in the surface of the wafer, from tiny scratches to errors in the photographic process and everything in between. Those silicon wafers, which are actually very thin slices of a single 300-millimeter-wide silicon crystal, are the base of what will eventually be a whole batch of memory chips with electronic elements imprinted on their surfaces. Eventually, the wafers are divided into an array of chips that are cut apart for individual use in products.

Before Micron started using AI-based image analysis to find defective wafers, the job depended on people looking at the surface. As you might imagine, the visual fatigue created by this sort of work was significant, and the resulting human error was a significant problem.

By using deep learning, the image classification system can perform consistent classifications without the loss of productivity that comes when employees start making mistakes after looking at the tiny circuits on silicon wafers for too long. Instead, the AI system learns over time, is able to accept human feedback, and can be updated with new information or classification parameters while in production.

As important as image processing for wafer inspection is, along with acoustic monitoring of production, the use of sensors in the fab goes beyond those two cases. According to Micron data scientist Ted Doros, there are sensors for almost any environmental factor you can imagine. Doros said that the company's machine learning is based on pattern matching with deep neural nets to classify wafer defects, which in turn allows the company to intelligently grade its production output. Silicon that might not meet the requirements to be turned into first-line "mission critical" consumer products might be used for less demanding things, such as electronics used in toys.
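The grading criteria aren't spelled out in the article, but the idea of turning per-wafer defect classifications into an output grade can be sketched with a simple, entirely hypothetical mapping (the defect classes, thresholds, and grade names are all invented for illustration):

```python
# Minimal sketch of "intelligently grading output" (rules are invented for
# illustration; Micron's real criteria are not public).
from collections import Counter

# Hypothetical per-die defect classes emitted by the wafer-inspection model.
DEMOTING_DEFECTS = {"scratch", "photo_error", "particle"}

def grade_wafer(die_defects: list[str]) -> str:
    """Map one wafer's defect classifications to a product grade."""
    counts = Counter(d for d in die_defects if d in DEMOTING_DEFECTS)
    defect_rate = sum(counts.values()) / max(len(die_defects), 1)

    if defect_rate < 0.01:       # thresholds are illustrative, not Micron's
        return "mission-critical"
    if defect_rate < 0.05:
        return "standard"
    return "low-demand (e.g. toys)"

# One wafer's worth of (fake) per-die classifications.
wafer = ["clean"] * 950 + ["scratch"] * 20 + ["photo_error"] * 30
print(grade_wafer(wafer))        # -> "low-demand (e.g. toys)"
```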

The company also developed a series of what it calls "super sensors" that combine two to fourteen functions in a single package. Doros said that these functions can include vibration, temperature, humidity, or partial corona discharge. Some sensors can detect radio frequency interference and power quality. In one case, the AI-based pattern matching was able to spot the turbulence caused by a leak in the levitation system that floats wafers on a layer of air, and flag the spot for maintenance.
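Micron hasn't published a data format for these packages, but one reading from such a combined sensor could be modeled roughly as follows (the field names, units, and sensor ID are assumptions based on the functions Doros listed):

```python
# Rough model of one reading from a combined "super sensor" package.
# Field names, units, and the sensor ID are assumptions, not Micron's schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SuperSensorReading:
    sensor_id: str
    timestamp: float                               # seconds since epoch
    vibration_g: Optional[float] = None            # acceleration, in g
    temperature_c: Optional[float] = None
    humidity_pct: Optional[float] = None
    partial_discharge_pc: Optional[float] = None   # picocoulombs
    rf_interference_dbm: Optional[float] = None
    line_voltage_v: Optional[float] = None         # power-quality proxy

reading = SuperSensorReading(
    sensor_id="fab1-bay07-ss42",                   # hypothetical ID
    timestamp=1_561_000_000.0,
    vibration_g=0.02,
    temperature_c=21.4,
    humidity_pct=43.0,
)
```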

Enlarge / One of Micron's "super sensors," combining multiple detectors in a single package. (credit: Wayne Rash)

One unexpected set of sensors at the Manassas factory is there to monitor seismic activity. Normally, you wouldn't think of Northern Virginia as being prone to earthquakes, but a magnitude 5.8 earthquake on August 23, 2011, knocked the fab offline for the first time ever. It turns out that quakes are more of a problem than previously expected. "Micro quakes can ripple through the fab," Doros explained, but with the seismic sensors in place, the AI can account for them and make adjustments in the production line.

The advanced manufacturing processes built into the Manassas fabrication plant are reflected in the plant's output. Doros said that so far, the company has seen 25% fewer quality events, along with a 25% improvement in time to yield and a 10% increase in manufacturing output.

Micron announced in 2018 that it's doubling the size of its Manassas fab and adding a global research center that will be used to develop even more smart manufacturing capabilities. That new facility will come online in stages, with the first wafers starting to appear in late 2020, Doros said.

