Thursday, November 27, 2014

Who’s Paying for All of This? The Federal Government’s Involvement

With the potential implications quantum computing can have on processing speed, data capacity and, perhaps most importantly, information security, it’s not at all surprising that the Federal Government takes great interest in its development.  More specifically, agencies such as the NSA take a great interest in the new technology.  According to leaks from the well-known whistleblower Edward Snowden, approximately $79.7 million was spent on research in a program referred to as “Penetrating Hard Targets”.  Portions of this funding were specifically allocated to developing a quantum computer that could be used to break modern cryptography.

Understandably, much of the research program is undisclosed due to its classified status, but, as covered in this blog’s earlier posts, quantum computing can potentially have profound effects on encryption and cryptography.

In addition to the NSA, many other agencies and companies have taken great interest in the development of quantum computing.  DARPA, the organization credited with the birth of the internet, has provided years of funding, and Google has even opened a “Quantum Artificial Intelligence Lab”.

Perhaps the most interesting Federal involvement in the effort behind quantum computing comes from NASA.  The space agency operates a specialized lab known as QuAIL, the Quantum Artificial Intelligence Laboratory.  This program dives deep into the new technology, exploring theoretical possibilities in quantum processing, algorithms and even hybrid quantum-classical approaches.


With all the Federal attention given to this research, one may ask whether it’s having a positive or negative effect on the process.  From one angle, we may be reminded of the bureaucracy added to any process involving the government.  On the other hand, the additional resources granted by Federally-backed funding can only speed up progress.  With results still in their infancy, it seems only time will tell.

Robertson, Adi (2014). ‘NSA secretly funding code-breaking quantum computer research, says Washington Post’. The Verge. Retrieved from: http://www.theverge.com/2014/1/2/5267430/nsa-reportedly-secretly-funding-code-breaking-quantum-computer-research

‘A Description of the Penetrating Hard Targets project’. The Washington Post. Retrieved from: http://apps.washingtonpost.com/g/page/world/a-description-of-the-penetrating-hard-targets-project/691/

‘Quantum Artificial Intelligence Laboratory’. NASA. Retrieved from: http://www.nas.nasa.gov/quantum/

Tuesday, November 25, 2014

Maintaining the Flow: How Scientists Are Putting a "Spin" on the Whole Process.

As we’ve been discussing, coherence, or the lack thereof, is a big, if not the biggest, issue slowing the development of quantum computing.  At small scales, quantum bits can maintain their quantum state, achieving coherence for a reasonably long period of time.  At larger scales, on the contrary, quantum bits lose coherence rapidly, causing the quantum process they’re supporting to cease.  Finding the right material to maintain this coherence is only part of the equation. Performing the correct process on this material is equally important.
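As a rough illustration, loss of coherence is often modeled as an exponential decay with a characteristic time (commonly called T2).  The numbers in the sketch below are hypothetical, chosen only to show how quickly a “noisy” large-scale system loses its quantum state compared to a well-isolated one:

```python
import math

def coherence_remaining(t, t2):
    """Fraction of quantum coherence left after time t, modeled
    as a simple exponential decay exp(-t / T2)."""
    return math.exp(-t / t2)

# Hypothetical coherence times (in seconds) -- illustrative only.
small_scale_t2 = 100e-6   # a well-isolated qubit: T2 of 100 microseconds
large_scale_t2 = 1e-6     # a larger, noisier system: T2 of 1 microsecond

# After 5 microseconds of computation:
t = 5e-6
print(coherence_remaining(t, small_scale_t2))  # most coherence remains
print(coherence_remaining(t, large_scale_t2))  # almost entirely decohered
```

The point of the toy model is the gap between the two results: the same computation time that barely dents a well-isolated qubit all but destroys the state of the noisier system.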


The manipulation of electrons across a given medium is a key factor in quantum mechanics.  As electrons pass through a conducting source, their quantum state is maintained by a balance of conducting and insulating properties within the material.  While traditional electronics are not concerned with the quantum characteristics of an electron, quantum computing relies on them.  One of these characteristics is the “spin” of the electron, and one of the challenges scientists face is controlling this spin, which has a direct effect on the quantum state.
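To make the idea of spin as a quantum state concrete, a spin-1/2 electron can be modeled as a two-component complex vector.  This is the simplified textbook picture, not the formalism of the paper cited below; the states and the measurement rule here are standard quantum mechanics:

```python
import math

# Spin-1/2 states as two complex amplitudes: [amp_up, amp_down].
spin_up = [1 + 0j, 0 + 0j]
spin_down = [0 + 0j, 1 + 0j]

# A superposition: equal parts spin-up and spin-down.
superposition = [1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j]

def z_expectation(state):
    """Average spin measured along the z axis: P(up) - P(down).
    Probabilities are the squared magnitudes of the amplitudes."""
    p_up = abs(state[0]) ** 2
    p_down = abs(state[1]) ** 2
    return p_up - p_down

print(z_expectation(spin_up))        # 1.0
print(z_expectation(spin_down))      # -1.0
print(z_expectation(superposition))  # 0.0
```

The superposed state averages to zero because a measurement finds it “up” or “down” with equal probability, which is exactly the kind of delicate balance that makes spin hard to control.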

Xing, Yanxia, Yang, Zhong-liu, Sun, Qing-feng, & Wang, Jian (2014). ‘Coherent single-spin source based on topological insulators’. Physical Review B: Condensed Matter and Materials Physics. Retrieved from http://journals.aps.org.mutex.gmu.edu/prb/pdf/10.1103/PhysRevB.90.075435

Sunday, November 23, 2014

Who's Behind the Curtain?

When thinking about quantum computing, processing, mechanics or any other type of quantum physics, many of us are reminded of names like Galileo, Newton or Einstein.  While these individuals were pioneers in their respective fields, hundreds of others have offered great contributions to the science.  Today’s cast of industry-leading physicists includes minds like Stephen Hawking, author of A Brief History of Time, and Michio Kaku, author of Physics of the Impossible.  Both experts in theoretical physics, they have demonstrated remarkable understanding of the world of physics as well as captured the world’s attention by using mainstream media to convey their ideas.  While these men stand as pillars in the industry, the works of scholars like Yong P. Chen, while not widely popular, are some of the driving factors behind current advancement in quantum computing.

Associate Professor Yong P. Chen is a leading physicist at Purdue University with expertise in areas including physics, mathematics and nanotechnology.  Professor Chen began his career at IBM’s Zurich Research Laboratory in Switzerland in the summer of 1998 and continued his contributions at other institutions of higher learning, including the Grenoble High Field Laboratory and Rice University.  He began his tenure at Purdue University in 2007 and is currently working on projects involving low-dimensional physics, topological insulators, spintronics, and several others.  The latter two have offered amazing results in the area of quantum computing and, in a sense, revitalized efforts into its research.  For the first time in years, scientists can see substantial returns at a potentially large enough scale to yield promising results.

While leading a team of researchers at Purdue University, Professor Chen and his colleagues discovered a material potentially capable of maintaining quantum coherence at substantial scales.  Coherence is the ability of a particle to maintain its quantum state.  Without coherence, quantum computing is not possible, and scientists throughout the industry invest much of their time in solving the challenge of maintaining it.  Chen and his team’s discovery of topological insulators may be the answer or “smoking gun” (as described by Yang Xu, a member of Chen’s team) to unraveling the mystery of coherence.

Currently, Chen and his team are working with other universities, including Princeton and the University of Texas, in an effort to better understand the material and determine the most efficient approach for producing it.  One of the most remarkable characteristics of this material is that the electrons it conducts behave as if they have no mass.  While, historically, the mass of the carriers in a quantum insulator had a direct correlation with its ability to maintain coherence, with this material that does not seem to be the case.  Topological insulators work by conducting a unique type of electron that behaves as massless due to “spin-polarization.”  This unique process is the major focus of spintronics, the study at the heart of Professor Chen’s research.  With current momentum and continued success, Yong P. Chen may be the next name we jump to when considering the breakthroughs of an era.

Purdue University, Department of Physics and Astronomy. ‘Yong P. Chen’. Retrieved from http://www.physics.purdue.edu/people/faculty/yongchen.html

What's Currently Being Done About This?

If you’ve ever developed an application, whether commercial, enterprise or personal, you’ve probably had the often-cumbersome issue of performance weighing heavily in the back of your mind.  How many variables do I have?  When should I instantiate this object?  How often will this method be called?  Is this loop too complex?  These are just some of the overwhelming number of questions we ask ourselves as developers with every programming decision we make.

There are a number of reasons we have these questions.  The first, and most obvious, is that we don’t want to create an application that responds too slowly or “freezes”.  The terms “Spinning Wheel” and “Blue Screen of Death” have become commonplace in the industry because they are so often experienced and, likewise, so despised.  Other, less well-known, reasons include implementing critical, time-based functionality like scheduled emails or daily uploads.  In processes like these, when one process lags or doesn’t complete in a timely manner, other processes completely fail.  Additionally, interesting things happen when more than one process competes for time slots in the process lineup.  Race conditions are random, sporadic and inconsistent errors that occur when different methods or functions compete for the same artifact and, as a result, often receive it in different states.  These errors are both confusing and very time consuming to resolve.  If only machines could move fast enough to support our far superior mental capacity.
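Race conditions like the ones described above can be made concrete with a small sketch.  The Python example below is contrived on purpose: it uses events to force two threads into the classic “lost update” interleaving, so a failure that is normally sporadic becomes reproducible:

```python
import threading

counter = 0                      # shared state that both threads update
a_has_read = threading.Event()
b_is_done = threading.Event()

def thread_a():
    global counter
    stale = counter              # 1. A reads the counter (sees 0)
    a_has_read.set()
    b_is_done.wait()             # 2. A pauses while B runs to completion
    counter = stale + 1          # 4. A writes back 0 + 1, erasing B's update

def thread_b():
    global counter
    a_has_read.wait()
    counter += 1                 # 3. B increments (counter becomes 1)
    b_is_done.set()

a = threading.Thread(target=thread_a)
b = threading.Thread(target=thread_b)
a.start(); b.start()
a.join(); b.join()

print(counter)  # 1 -- two increments ran, but one update was lost
```

Both threads “successfully” incremented the counter, yet the final value is 1 instead of 2, because thread A wrote back a stale read.  In real code the interleaving is decided by the scheduler, which is exactly why these bugs appear and vanish at random.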

As discussed in my earlier post, there is a solution, and we have a slight idea how to achieve it.  Until recently, we could achieve quantum processing only at very small scales; the complexity of maintaining two states at the same time was just too much for the materials we knew of to handle.  A team of scientists at Purdue University led by Yong P. Chen, associate professor of physics and astronomy and of electrical and computer engineering, is experimenting with new materials called “topological insulators”.  Their research thus far has yielded some amazing results, which have led the team to refer to this material as the “smoking gun” in the effort to find materials capable of facilitating quantum processing.

Since discovering the benefits of using topological insulators, research teams have allocated resources toward identifying the best material for producing these insulators.  These efforts have received great attention and support from groups including Harvard, The Welch Foundation, the U.S. Army and other organizations.  With continued support, a reliable material may be realized, resulting in fast, fault-tolerant processing that could make development practices more efficient and make those performance-based headaches a thing of the past.

Venere, Emil (2014). ‘Topological insulators’ promising for spintronics, quantum computers. Science Daily, Retrieved from http://www.sciencedaily.com/releases/2014/11/141113195156.htm

There may be a solution

Have you ever been frustrated with your computer when it slows down, loses work or comes down with a nasty virus?  In a life surrounded by laptops, smartphones and tablets, we spend much, if not the majority, of our time inundated by technological marvels recently thought to be impossible.  Scientists, engineers and developers around the world are in a constant battle to keep our devices up-to-date while conceiving the “next great advancement” in our modern arsenal.

Studies predicting performance and optimization trends are prepared regularly by institutions such as UC Berkeley in an effort to stay ahead of the curve.  These studies, along with well-known observations such as Moore’s Law, predict that processor capabilities double roughly every two years.  At this rate, some experts expect to reach our technological “ceiling” in approximately eighty years.  This would result in a halt to our “ever-advancing” progress in processing speed, capabilities and innovation.  Luckily for us, there are currently efforts to preempt this roadblock before it comes to pass.  The sciences of computer engineering and quantum physics are joining forces to explore the possibilities of quantum computing.
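It’s worth pausing on what “doubling every two years” compounds to, which a quick back-of-the-envelope calculation shows:

```python
def capability_growth(years, doubling_period=2):
    """Multiplier on processing capability after `years`, assuming
    capability doubles once every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(capability_growth(10))  # 32.0   -- a decade gives 5 doublings
print(capability_growth(20))  # 1024.0 -- two decades, 10 doublings
```

Exponential growth like this is why the eventual “ceiling” matters so much: each halted doubling forfeits more capability than all the doublings before it combined.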

Quantum computing, a broad term encompassing quantum processing, refers to using quantum particles as a medium for storing and processing information.  It is being explored as an alternative to the traditional electron-based approach, and it fundamentally changes computing as we know it by drastically increasing the speed and capacity of both processors and memory.  Current technologies fall short in the sense that they can hold information in only one of two states at a time.  Groupings of positive or negative charges, often represented by ‘1s’ or ‘0s’, represent data to the machine at the hardware level.  Using only two states at a time limits our technological growth to doubling with every advancement.  While this may seem substantial, with broadband, wireless and the countless other networking capabilities we currently enjoy, we’re “evolving” at a much faster rate.

Quantum processing considers the approach of using different particles, such as photons and quantum bits or “qubits”, to store and process information.  While current methods hold a positive OR negative charge to convey information, the quantum approach offers ways to hold a positive AND negative charge.  This essentially means that the machine can be in two different states at the same time.  Using this approach would render any proverbial “ceilings” obsolete.
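A toy model helps make the OR-versus-AND distinction concrete.  The sketch below, plain Python and illustrative only, represents a qubit as two complex amplitudes; the squared magnitude of each amplitude gives the probability of measuring that state:

```python
import math

def measurement_probabilities(qubit):
    """Probabilities of measuring 0 or 1: the squared magnitudes
    of the qubit's two complex amplitudes [amp_0, amp_1]."""
    return [abs(amp) ** 2 for amp in qubit]

# A classical bit is always in exactly one state.
classical_bit = [1 + 0j, 0 + 0j]                    # definitely 0

# A qubit in equal superposition is, in a sense, 0 AND 1 at once.
superposed = [1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j]

print(measurement_probabilities(classical_bit))  # [1.0, 0.0]
print(measurement_probabilities(superposed))     # ~[0.5, 0.5]
```

The classical bit yields 0 with certainty, while the superposed qubit yields either outcome with equal probability, which is the extra expressive power quantum processing hopes to exploit.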

After approximately forty years of research, quantum processing can be achieved at only a very small scale.  The problem is that, unlike the traditional Newtonian physics most of us are accustomed to, quantum mechanics behaves very differently at large scales than it does at small ones.  Scientists have been experimenting with ways around this dilemma and have arrived at some potential solutions, including using photons, or a combination of photons, electrons and quantum bits.  Unfortunately, over the last decade we have reached a plateau.  If forward movement isn’t motivated in this area, we may soon experience that very same “plateau” in the performance of the everyday devices we have become so accustomed to and, in many senses, dependent on.