An Evaluation of the Future of Computing

13 May, 2010

This article was written by Brendan Grebur and Jared Wein.

Current technology can no longer support the ever-shrinking size of transistors and switches. To combat this limitation, researchers are focusing on novel approaches that either replace current MOSFET transistors or supplement their abilities. Many notable technologies have been leveraged toward the problem, but none has yet been realized at large scale. The articles reviewed here explore two areas of technology aimed at resolving our computational situation. First, manufacturing hardware at the molecular scale presents new challenges as the peculiarities of quantum mechanics begin to reveal themselves. Advances in nanotube technology and an ever-growing knowledge of quantum physics present new opportunities and even challenge our classical view of computation. Second, the actual means of computing are explored as the electron is abandoned for photons or nanofluids.

This paper will cover different research areas that aim to pave the way for the future of computing. The first part describes, compares, and contrasts the techniques being researched. The second part describes two techniques that the authors predict will be adopted within their lifetime. The last part covers the research area that offers the greatest improvement and the largest chance to fundamentally change and improve computing.

Trillion Crummy Components [5]

The ability to perform in the face of massive component failure and faults deeply concerned system architects and operators in the early days of computing. The fortuitous advance to integrated circuits rendered the issue moot. However, as feature sizes approach the nanometer scale, the problem has resurfaced. This paper looks into a method for building FPGAs from a nanoswitch/CMOS combination.

In manufacturing ever smaller components, failure rates inevitably increase as little room is left for error. Researchers explored building nanoscale systems containing faulty components and attempted to cope with them dynamically. Wires connected by nanoswitches on the scale of 15 nanometers were produced through nanoimprint lithography (NIL).

The design suffers from the failure or faulting of the nanoswitches that connect the CMOS transistors to form the configurable logic system. Scanning the system for these faulty components provides a map of the functional ones, which can then be dynamically connected through the FPGA configuration. Replacing components typically implemented in CMOS on an FPGA with nanoswitches freed a significant amount of space, enough to obtain an eightfold increase in logic density, which translates to a multiple-generation chip improvement. In the end, even with 50% of the switches nonoperational, production yield remained at 99.7%. Performance of the FPGAs was affected by the nonfunctional components, but only slightly.
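To make the defect-tolerance argument concrete, the effect of redundancy on yield can be illustrated with a back-of-the-envelope simulation: treat each nanoswitch as working with some probability and ask how often a chip still has enough working switches to route every required connection. The Python sketch below is such an illustration; the number of connections and the number of redundant candidate switches per connection are assumed values chosen only to echo the 50% failure figure quoted above, not parameters taken from the paper.

```python
import random

def chip_yields(n_connections, redundancy, p_switch_ok):
    """Return True if every required connection can be routed through
    at least one working switch among its redundant candidates."""
    for _ in range(n_connections):
        if not any(random.random() < p_switch_ok for _ in range(redundancy)):
            return False  # no working switch left for this connection
    return True

def estimate_yield(trials=10_000, n_connections=1_000,
                   redundancy=20, p_switch_ok=0.5):
    """Monte Carlo estimate of the fraction of chips that remain usable."""
    good = sum(chip_yields(n_connections, redundancy, p_switch_ok)
               for _ in range(trials))
    return good / trials

if __name__ == "__main__":
    # With half the switches broken but 20 candidate switches per
    # connection, the estimated yield stays above 99%.
    print(f"estimated yield: {estimate_yield():.3f}")
```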

The authors of this paper argue that faulty components can be overcome through redundancy, and they give an initial indication that faulty components can easily be routed around through spare switches. The paper leaves many important questions unanswered, such as the minimum amount of redundancy required for each of the necessary functional units. The authors assume that only the wires and switches fail, not the CMOS transistors. When the approach is extended to CMOS transistors, or their replacements, how many of those will need to be replicated? This could essentially undo the advantage of reducing component sizes, since more components become necessary to maintain functionality.

Carbon Nanotubes [3]

Single-walled carbon nanotubes (SWNTs) exhibit a variety of advantages over silicon when used in transistors, including reduced electron scattering, better heat dissipation, and a less chemically reactive material. Researchers have already produced p-type and n-type transistors that outperform their silicon-based counterparts in speed and mobility.

A hurdle for implementation remains in correctly placing and orienting the SWNTs within a system, as a single nanotube measures about 1 nm in width. Nevertheless, a ring oscillator built on a single nanotube sustained operation in excess of 70 MHz, affirming the potential of the technology. Any further abilities will depend on advances in the construction of nanotubes.

The other application resides in constructing nanotube switches for nonvolatile memory. Inherent characteristics could allow impressive switch density, along with switching speeds in excess of 100 GHz, all with minimal power consumption.

Restrictive production methods have prevented full exploitation; however, novel implementations could resolve these problems. One company used mats of criss-crossing nanotubes that can be locked in certain positions, a simple but quite effective approach with lifetimes of 50 million cycles and state switching in under 3 ns.

Further exploiting SWNT characteristics, nanotubes can replace copper connections, as they do not exhibit copper's limitations at the nanoscale. The bandwidth of nanotubes is more than three orders of magnitude larger than that of copper, and bundles of nanotubes can be used to reduce the electrical-current losses seen with copper wires.

Quantum computing also benefits from the use of nanotubes. Singled-out electrons can be 'caged' within the carbon structure to represent a quantum bit. The increased spin relaxation time of particles within nanotube quantum dots is advantageous for constructing quantum computers.

This paper looks into completely replacing our dependency on silicon and copper for constructing circuits. SWNTs offer superior performance in almost all aspects but have not become ubiquitous due to large-scale manufacturing issues. The material itself is so versatile, composed of abundant carbon, and inherently scalable that once the manufacturing hurdles are overcome we are sure to see it overwhelm the market.

Molecular, Chemical, and Organic [6]

Chemical computing currently operates on the macroscopic level to implement complex algebraic operations. Optical, electrical, or chemical signals are used to represent data. Fluorescence cues have been the most successful, but implementing them at a reduced scale remains the crux.

One approach attempts to use single molecules as temporary bridges between electrodes, but accurate measurements of its effectiveness are rarely obtained.

Scaling remains an issue, even at the theoretical level, for constructing circuits from organic molecules. Researchers must deal with quantum mechanical effects when operating at the molecular level, and the theorized behavior may in fact be an artifact of the approximations used in their calculations.

Some attempts to embrace these quantum phenomena have found success. A cascading effect seen when lining up carbon monoxide molecules results from the tunneling of vibrational energy from one molecule to another. The positioning of the molecules can represent a binary encoding and perform logic operations.

Traditional electron usage is abandoned here in favor of the natural interactions of varying molecules to represent data or perform useful work. Again, other materials are researched for their potential to outperform current technology. Many issues arise from quantum phenomena and from accurately placing or observing individual molecules. Such limitations will inevitably inhibit the widespread application of organic computing.

Quantum Computers [2]

Quantum computing represents an inevitable shift in our quest for ever smaller computation scales. Only recently has the combination of quantum theory and computer science been accepted as reality. Beginning with Peter Shor's discovery, quantum computing entered the scene by undermining one of the most important features of cryptography: the difficulty of factoring.

Much of the theoretical framework regarding quantum computing was almost rendered useless by the introduction of imprecision and quantum noise. Since computation occurs on infinite-precision data represented by the amplitudes of configurations, any errors that enter the system would propagate and corrupt the results. However, it has been shown that error-correcting codes, coupled with redundancy, can prevent such events from destroying the configuration.
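The standard textbook illustration of how redundancy protects information against noise is the three-qubit repetition code for bit flips. The sketch below is a purely classical Monte Carlo of that code's decoding logic (copy the bit three times, flip each copy independently with probability p, decode by majority vote); it does not simulate quantum states, and the physical error rates used are arbitrary assumptions for illustration.

```python
import random

def logical_error_rate(p, trials=100_000):
    """Classical Monte Carlo of the 3-bit repetition (majority-vote) code.

    Each of the three copies of a logical 0 is independently flipped with
    probability p; decoding fails when two or more copies flip."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:  # majority vote decodes incorrectly
            failures += 1
    return failures / trials

if __name__ == "__main__":
    for p in (0.01, 0.05, 0.1):
        # Analytically the failure rate is 3p^2(1-p) + p^3, i.e. it drops
        # from order p to order p^2 -- the essence of the threshold idea.
        print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.4f}")
```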

Further discoveries, specifically the threshold theorem, solidified the feasibility of quantum computers. The only hindrance lies in the implementation of quantum bits and the manner in which they are controlled.

One of the greatest advancements to stem from this field was the introduction of quantum key distribution. The unconditional detection of eavesdropping allows a level of security unparalleled by traditional methods. Implementation details are surprisingly tolerant of imperfections in device endpoints and error rates. Such technology is currently marketed and actively investigated by many large technology companies.
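Quantum key distribution is usually described in terms of the BB84 protocol, in which eavesdropping is detected as an elevated error rate in the sifted key. The following sketch is a classical toy model of BB84's sifting step with an optional intercept-resend eavesdropper; the key length and attack strategy are illustrative assumptions, not details drawn from the paper.

```python
import random

def bb84_sift(n_bits=2000, eavesdrop=False):
    """Toy BB84: Alice sends random bits in random bases; Bob measures in
    random bases; they keep only positions where the bases matched.
    An intercept-resend eavesdropper introduces ~25% errors in that key."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [random.randint(0, 1) for _ in range(n_bits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            e_basis = random.randint(0, 1)
            # Eve measures; a wrong basis randomizes the bit she resends.
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis  # Bob now receives Eve's re-prepared photon
        # Bob's measurement: matching basis reproduces the bit, else random.
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return len(sifted), errors / len(sifted)

if __name__ == "__main__":
    for spy in (False, True):
        kept, qber = bb84_sift(eavesdrop=spy)
        print(f"eavesdropper={spy}: sifted key {kept} bits, error rate {qber:.2%}")
```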

Quantum algorithms have forced us to reconsider standard approaches to classical computation problems. Algorithms ranging from searching an unsorted database to estimating Gaussian sums have runtimes significantly smaller than any known classical approaches.
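For the database-search case, the quadratic speedup can be stated numerically: classical search over N unstructured items needs on the order of N queries, while Grover's algorithm needs roughly (π/4)√N. The snippet below simply evaluates those two query counts for a few sizes; it is a back-of-the-envelope comparison, not an implementation of the algorithm.

```python
import math

def classical_queries(n: int) -> int:
    """Worst-case queries for unstructured classical search."""
    return n

def grover_queries(n: int) -> int:
    """Approximate oracle calls for Grover's algorithm: ~(pi/4) * sqrt(n)."""
    return math.ceil(math.pi / 4 * math.sqrt(n))

if __name__ == "__main__":
    for n in (10**3, 10**6, 10**9):
        print(f"N={n:>13,}: classical ~{classical_queries(n):>13,} "
              f"quantum ~{grover_queries(n):>7,}")
```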

Quantum computing has certainly altered every aspect of computer science. Not only will circuit technology be completely redesigned, but the manner in which information can be represented opens up endless possibilities. The functionality is there, but much is still needed to actually access it.

Optical Computing [1]

By replacing electrons with photons, computers would no longer suffer common sources of failure, including electromagnetic disturbances and short circuits. Even more advantageous attributes provided by light are its immense bandwidth, low-loss transmission, and unprecedented capacity when used for storage. Optical devices also tend to be less expensive to manufacture and operate at higher frequencies, resulting in superb computational speeds.

Currently constructed logic gates operate on the order of nanoseconds, while photonic switches can reach the femtosecond range. From figures like these, it would seem only natural for the technology to completely replace our silicon implementations. However, some important factors continue to prevent a fully optical solution.

One issue concerns the immense difficulty of cascading large numbers of logic gates together. Half-adders can currently be produced, but extending past that presents a greater challenge. Another issue relates to the materials used for constructing switches, which demand generous amounts of photons to function properly. Such optical intensity generates counterproductive effects, including stimulated Brillouin scattering and self-phase modulation.
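To make the cascading benchmark concrete, a half-adder is the two-gate circuit whose sum bit is an XOR of the inputs and whose carry bit is an AND; chaining two half-adders (plus an OR of their carries) gives the full adder that lies just past what cascaded optical gates currently manage. The sketch below states that logic in Python as an illustration only; it models the Boolean functions, not any optical device.

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two one-bit inputs: sum = a XOR b, carry = a AND b."""
    return a ^ b, a & b

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Cascading two half-adders (plus an OR of the carries) gives a full
    adder -- the step beyond which cascaded optical gates become hard."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s} carry={c}")
```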

By exploiting particles that travel at the universal speed limit, we reach the limit of transmission speed. In addition, the production of photons is far more efficient than that of electrons, while potentially decreasing thermal output. The technology to effectively link the necessary number of optical logic gates together, however, remains to be found.

Nanofluidic Computing [4]

Fluid interactions on the micro-scale can be manipulated to perform computational logic or to serve as memory storage. These actions are known to be slower than current silicon technology, so the aim is simply to supplement it.

One type of microfluidic device implemented logic gates without electrical control by using the resistance of fluid flows. Such an approach offered the advantage of simultaneously running multiple logic gates from the same flow, so cascading gates could be easily implemented.

Another type used a hybrid approach where an electrical current would restrict the flow of potassium ions between silicon dioxide plates. A binary state could then be represented and logic gates created from them. Essentially, the standard transistor is replaced by fluid.

Given the technology's limitations, its most likely role in standard computing systems would be as a reliable communication line. Applications could also extend to analyzing other fluids and performing actions based on the presence of a substance. Since microfluidic systems require only small amounts of test material, they may find a place in blood tests or in testing hazardous samples.

This area of research is at an early stage and appears limited by computational speed and the ability to cascade components. Perhaps the parallel capability and small sample size could prove useful in the future, but for now there appear to be no practical applications.

Summary

A Trillion Crummy Components attempts to resolve a problem we are likely to face soon. By adding redundant components accessible through the crossbar technique, we can easily compensate for failures and maintain a functional system. Here only a scaling issue is addressed, while CMOS technology is retained. All the other papers looked to completely replace CMOS, most with the goal of ground-up nanoscale construction. Some simply used different materials to complete the same tasks, with different signals representing data, e.g., photons, chemicals, or fluids. One paper, on quantum computing, explored an area that could completely revolutionize everything we know about computing, challenging the very concept of data representation with superpositions and infinite-precision variables.

Out of all these radically different approaches, some can certainly be defined as more viable than others. Carbon nanotubes offer nothing but advantages over our silicon devices. Their inherently miniature size supports massive density, coupled with power efficiency, greater heat dissipation, and astounding mechanical switching speed. With more research to perfect the manufacturing process, full-scale realization should be upon us shortly.

Optical computing also offers a formidable technology we are likely to see soon. Optics are already heavily used in the computing industry for networking, data storage (CD-ROM), and scanning. Replacing electrical communication and data representation with light significantly decreases latency and operates flawlessly in the face of electromagnetic interference. Photons offer massive bandwidth because frequencies can overlap without corruption. Optical computing promises such an increase in computational speed (a factor of up to 10⁷) that it would prove an invaluable commodity, worthy of any development cost. Cascadability appears to be the biggest challenge, but exploration into suitable materials could resolve the problem.

The crummy-computing techniques discussed could well appear as transitions are made to the aforementioned devices, but they are restricted to traditional silicon devices. The future appears to point toward a radical departure from the norm. Organic and nanofluidic computing are hindered by a lack of speed or reliability. Finally, quantum computing suffers from a lack of large-scale qubit implementations and from the difficulty of properly applying quantum error correction. We are sure to see the birth of quantum computers, but not within this lifetime.

Future Impact

One technology stands apart from the rest in its potential to forever change computation. Quantum computing has already provided a security technology guaranteed by physics to be unbreakable. A technology that allows accurate representation of an infinite-precision variable is, without doubt, an incredible advancement; no other technology visited provides such a feature. With claims as dramatic as solving NP-complete problems in polynomial time, computation becomes an afterthought. Many other traditional computing problems have been redesigned to run on quantum computers, experiencing substantial decreases in runtime. With such raw computational power, humans are offered the chance to look further into the physics governing our world through simulation. Quantum computers are sure to offer us a wealth of information and possibilities to change the way we live and think.

References

[1] Abdeldayem, H. and Frazier, D. O. “Optical computing: need and challenge.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 60 – 62.

[2] Bacon, D. and Leung, D. “Toward a world with quantum computers.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 55 – 59.

[3] Kong, J. “Computation with carbon nanotube devices.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 40 – 42.

[4] Marr, D. W.M., and Munakata, T. “Micro/nanofluidic computing.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 64 – 68.

[5] Robinett, et al. “Computing with a trillion crummy components.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 35 – 39.

[6] Stadler, R. “Molecular, chemical, and organic computing.” Communications of the ACM. Special Issue: Beyond Silicon: New Computing Paradigms (Sep 2007): 43 – 45.
