Wafer Technology

What Does Wafer Mean?

A wafer is a very thin disc of silicon (one of the most abundant semiconductor materials available worldwide) or another semiconductor material. Wafers are used to create electronic integrated circuits (ICs) and silicon-based photovoltaic cells. In these designs, the wafer serves as the substrate, and engineers use processes such as doping, ion implantation and etching to build the integrated circuit on it.

The use of a wafer in integrated circuits begins with a solid cylinder, or ingot, of silicon that is purified, melted and cooled. The ingot is then carefully sliced into very thin wafers. Substances called “dopants” are introduced to create n-type or p-type regions in the silicon; processes like sputtering, vapor deposition and molecular beam epitaxy deposit material onto the wafer, while photolithography and etching transfer circuit patterns into it. All of this is performed in clean rooms to avoid contamination and other problems that come with working at this scale.



The general idea is that, across the wafer's very small surface, regions of differing charge and substrate composition are arranged so that complex operations can be carried out within a miniature physical footprint. Before the wafer became the convention, more rudimentary methods were necessary (e.g., the vacuum tubes used in the ENIAC); modern substrate design frees up tremendous potential in device advancement.


A wafer is also known as a slice or substrate.

The doping of silicon wafers and other innovations in building microprocessors continue a trajectory described by Moore's Law: Gordon Moore observed in 1965 that the number of transistors that could be fabricated on a given surface area was doubling regularly, a rate he later revised to roughly every two years. This prediction has been borne out by the rapid evolution of microprocessors: first came small-scale circuit integration, then medium-scale, then large-scale, and finally very large-scale and ultra-large-scale integration. This, in turn, has made devices ever smaller and more compact.
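To get a feel for what a two-year doubling period implies, here is a rough back-of-the-envelope sketch in Python. The starting point (the Intel 4004 of 1971, with roughly 2,300 transistors) and the strict two-year period are illustrative assumptions, not exact industry figures:

def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    # Number of doublings elapsed between the two years
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Illustrative projection from the Intel 4004 (1971, ~2,300 transistors)
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(2_300, 1971, year):,.0f}")

Compounded over 50 years, this projects tens of billions of transistors per chip by 2021, which is the right order of magnitude for modern high-end processors.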

With nanoscale wafer design now a given in modern industry, attention turns to complementing these complex microprocessors with innovative batteries and other small components. The interface itself, meaning the controls and the screen of a device, is also being miniaturized according to the same principles. That miniaturization is what has put a minicomputer in nearly everyone's pocket, and made cameras so small they are nearly undetectable.


Beginners can get a better sense of the wafer and the integrated circuit it holds by contrasting this technology with the printed circuit board (PCB). The PCB is the part of a circuit design that users can see when they open a device: a board with soldered components and conductive traces running between them, not only ICs but also resistors, capacitors and other parts. The integrated circuit on the wafer, by contrast, is one component that gets mounted on the PCB. Both contain circuitry, but they work in different ways: the IC does the lion's share of the processing, while the PCB carries signals between components.


There is also a difference in how the technologies are built: while PCB material is usually fairly hardy, the wafer is delicate and sensitive, which is why clean rooms are required for its manufacture. The “chip on board” (COB) approach melds the two ideas, mounting a bare IC die directly on the PCB to create complex device circuitry.
