MIT working to make custom chips for AI more attractive for mobile

This reminds me of other research I’ve read about over the years on adding other kinds of custom processors into the many-core mix that is now prevalent even on mobile devices. This story isn’t even about creating a custom chip for neural networks but about improving their power efficiency to make them more adoptable on phones.
Read More …

Building quantum registers from imperfect crystals

Chris Lee at Ars explains some new research that could fill in a critical piece needed for a practical quantum computer: a way of storing multiple qubits, similar to a register in a classical computer, so that more sophisticated computation and communication can be realized. Lee does his usual excellent job of making what can be a pretty opaque topic very readable, especially in explaining how this work likely informs future applications.
Read More …

Breakthrough at IBM will continue to fuel Moore’s Law

Brian Barrett at Wired has a good explainer on the challenges of scaling as the backbone of Moore’s Law. He goes over the transistor design that has helped keep the price-to-performance trend going over the last few years, as well as a new approach IBM is optimistic will keep the trend going for several more years. There is no mention in the article of where miniaturization reaches physical limits, but that threshold is still out there. Unlike this particular development, at a certain point the realities of physics at ridiculously small scales will require shifts away from transistors and silicon.

Read More …

Spray-on memory may advance true ubiquitous computing

Research like this continues to invite us to re-think how computing technologies can intersect with and enliven the world around us. This isn’t going to transform your PC or phone but rather will continue the trend of weaving computation into everything, even more so than the still relatively apparent spread of the so-called Internet of Things. This technology also reminds us to ask: when memory is so available you can quite literally paint it on, what would you do with it? Environmental sensors are really just the tip of the iceberg.

Read More …

SciFi Designer to Work on Real World Technology

John Pavlus has an interesting post at io9 about Mark Coleran, who has come up with quite a few of the most recognizable fantasy user interfaces seen on film in recent years. More often than not, tech enthusiasts find Hollywood’s imaginings of futuristic computers to be groan-inducingly bad. Coleran’s work, however, is grounded in real design principles.

“[The FUIs] look fantastic when you see them in the theater, but a lot of that is actually grounded in reality — stuff that’s not mainstream yet, that I’ve been researching and experimenting with.” Case in point: Coleran’s design for a near-future music player in Children of Men looks uncannily similar to iTunes’s “Coverflow” interface, which came out nine months later.

As it turns out, his experience includes both real world design as well as the more fantastical work that has appeared on screen. That mixture of the visionary with the practical no doubt factored into his being hired by Bonfire Labs.

But he says Bonfire hired him to be more of a “visual concept designer” for their interactive and advertising clients — sort of a Syd Mead for UIs, “looking at the bigger picture rather than the detail of individual buttons,” says Coleran. “My background from the film work, plus my experience in engineering, electronics, and graphic design, sort of fits with these interactive projects. There’s an element of futurism, where you can play the ‘what ifs’ out to their logical conclusions. Not just for the sake of it, but if you know the rules, you can break them to get something better.”

Read on for his thoughts on the distinct challenges presented by movie UIs and by ones that, no matter how speculative, are intended for the real world. If only Canonical or Red Hat had snapped him up. Can you imagine what a “Syd Mead for UIs” could do to revolutionize Linux on all kinds of devices?

A master of science fiction movie gadgets moves over to the real world, io9

Barrier to Graphene Replacing Silicon in CPUs

Slashdot links to some discussion originating out of IBM. Yu-Ming Lin, a researcher from Big Blue, explained in an interview that graphene transistors cannot be switched off in the same way as silicon ones. Slashdot and bit-tech both quote Yu-Ming’s explanation as saying:

… there is an important distinction between the graphene transistors that we demonstrated, and the transistors used in a CPU. Unlike silicon, graphene does not have an energy gap, and therefore, graphene cannot be “switched off,” resulting in a small on/off ratio.

The quote is from an as-yet-unpublished interview. It isn’t clear if the lack of an energy gap is a quality of graphene as a material or of the current way transistors are constructed from it. Given the direct comparisons to silicon, I infer the former. If it were the latter, then the possibility would remain that a different approach could overcome this critical obstacle.

The article goes on to share some more optimistic thoughts from Yu-Ming on plenty of interesting applications within computer chips for graphene. A further quote from Mike Mayberry, Intel’s director of component research, suggests this all may still be theoretical, that more experimentation may be required before we can so confidently declare the practical limits of the material.

Graphene offers considerable advantages over silicon; a few are mentioned in the bit-tech article. I’ve discussed many of them in past posts here and on the podcast. It is intriguing to imagine graphene’s further use in computing, even replacing many of the materials in use today. Mayberry’s quote reminds us how wide the gap is between such speculation and even tomorrow’s technology, just in terms of what we know about silicon and don’t yet know about graphene.

Graphene Won’t Replace Silicon In CPUs, Says IBM, Slashdot

Producing Graphene from Table Sugar

Slashdot links to a Gizmag piece covering some research that addresses one of the things I’ve always wondered about the amazing material, graphene. Being just a one-atom-thick sheet of carbon, the material itself is environmentally sustainable, but I’ve wondered if we could produce it without nasty chemicals being involved or arising as byproducts.

Researchers at Rice University have made graphene even sweeter by developing a way to make pristine sheets of the one-atom-thick form of carbon from plain table sugar and other carbon-based substances. In another plus, the one-step process takes place at temperatures low enough to make the wonder material easy to manufacture.

The process they discovered is also versatile. The article doesn’t say if this is true of sugar as a carbon source, but using the original material, plexiglass, they could variously produce single-, double- and multilayer sheets. They could also alter the doping of the graphene. The addition of other materials is key to controlling the sheet’s resulting electrical and optical properties.

Graphene Can Be Made With Table Sugar, Slashdot

Harnessing Chaos for Computation

John Timmer at Ars Technica explains some fascinating new work on a type of processor that could build on the advantages of FPGAs and provide the speed of more conventional CPUs. Timmer explains the relationship between specialized chips, like DSPs, and traditional CPUs pretty well. In the course of doing so, he notes how a field programmable gate array in many ways represents the best of both, allowing such a chip to dedicate all of its silicon to specialized tasks while still being able to change the type of task as needed. In reality, FPGAs have limits that make them useful only in certain circumstances, like prototyping new chip designs without dedicating fabrication capacity to building fixed chips.

The key to this new approach is harnessing chaos theory.

Those who think of chaos as completely unpredictable are likely to be wondering how unpredictable behavior can be used to perform logic operations. But chaos theory isn’t concerned with unpredictability; instead, it focuses on what are called nonlinear functions, ones where the ultimate output is very sensitive to the initial conditions. When you can control the initial conditions, you can still predict the output.

That ability is at the heart of a chaotic processor. The authors of a recent paper in Chaos describe what they call “chaogates,” which use simple, nonlinear functions to perform logic operations. The basic idea is that, ultimately, you want a logical output, a binary 1 or 0. It’s possible to convert the output of even a complex function into that sort of binary distinction using a strategically placed less than or equal to (<=) operation. If this sort of function is hardwired into the chip, then it’s simply a matter of knowing how to select your inputs so that you get the operation of your choice.
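
To make the threshold idea a bit more concrete, here is a minimal sketch in Python of the textbook chaogate construction built on the logistic map. It is my own illustration, not the exact scheme from the Chaos paper: the bias, delta and threshold values are the commonly cited demonstration ones, and I use a simple greater-than comparison where the article mentions a less-than-or-equal-to, but the principle is the same. One fixed nonlinear map, and the gate you get is selected entirely by how you set up its initial condition.

    # Hypothetical illustration, not the construction from the paper itself:
    # the classic "chaogate" built from the chaotic logistic map. A fixed
    # nonlinear map plus a threshold comparison behaves as a logic gate, and
    # which gate you get is chosen purely by the initial condition (the bias).

    def logistic(x):
        # One iteration of the chaotic logistic map.
        return 4.0 * x * (1.0 - x)

    def chaogate(i1, i2, bias, delta=0.25, threshold=0.75):
        # Encode the two logic inputs as a shift of the initial state,
        # iterate the map once, then threshold the result into a bit.
        x0 = bias + (i1 + i2) * delta
        return 1 if logistic(x0) > threshold else 0

    # The same "hardware" becomes a different gate by changing only the bias.
    GATE_BIAS = {"AND": 0.0, "OR": 0.125, "XOR": 0.25, "NAND": 0.375}

    for name, bias in GATE_BIAS.items():
        truth_table = [chaogate(a, b, bias) for a in (0, 1) for b in (0, 1)]
        print(name, truth_table)  # e.g. AND -> [0, 0, 0, 1], XOR -> [0, 1, 1, 0]

If this sketch captures the spirit of the design, it also suggests why the single-clock-cycle re-purposing mentioned below is plausible: switching from one gate to another is just a matter of loading a different bias value, not rewiring anything.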

This is very early-stage work. While there is a working prototype, it is far from the scale that would make it comparable to existing FPGAs. Timmer notes one aspect of these “chaogates” that has already worked out well: they can be re-purposed in about a single clock cycle. If that holds as they are accelerated from the current 30MHz to more useful speeds, that would be a considerable advantage.

The biggest barrier is that the existing hardware description languages, used in programming FPGAs, do not apply to these new chips. In addition to proving the theory and building workable prototypes, the researchers have to invent an entirely new, compilable language as well.

Researchers harness chaos theory for new class of CPUs, Ars Technica

feeds | grep links > Kickstarter for Interactive Fiction, Microsoft Response to Open Kinect, Another Computer Built in Minecraft, and More

feeds | grep links > Towards a Graphene Transistor, Over Broad Child Protection Law Blocked, B&N Caught Deleting Customer’s EBooks, and More

Apologies for the second day of just links. I was in a rush to get to the local CopyNight here in DC last night. I took a sick day from work today to try to finally get over this cold and have been trying to keep blogging to a minimum, too, in order to maximize my rest.

Thankfully, tonight’s podcast is an interview I recorded last week, so it will go out as scheduled with minimal effort.