15 new world-changing technologies

The Next Big Thing?

The memristor is a microscopic component that can “remember” electrical states even when turned off. It’s expected to be far cheaper and faster than flash storage. A theoretical concept since 1971, it has now been built in labs and is already starting to revolutionize everything we know about computing, possibly making flash memory, RAM, and even hard drives obsolete within a decade.

The memristor is just one of the incredible technological advances sending shock waves through the world of computing. Other innovations in the works are more down-to-earth, but they also carry watershed significance. From the technologies that finally make paperless offices a reality to those that deliver wireless power, these advances should make your humble PC a far different beast come the turn of the decade.

In the following sections, we outline the basics of 15 upcoming technologies, with predictions on what may come of them.

Some of these advances are breathing down our necks; others are still just out of reach. And all of them have to be reckoned with.

— Memristor: A Groundbreaking New Circuit

— 32-Core CPUs From Intel and AMD

— Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards

— USB 3.0 Speeds Up Performance on External Devices

— Wireless Power Transmission

— 64-Bit Computing Allows for More RAM

— Windows 7: It’s Inevitable

— Google’s Desktop OS

— Gesture-Based Remote Control

— Radical Simplification Hits the TV Business

— Curtains for DRM

— Use Any Phone on Any Wireless Network

— Your Fingers Do Even More Walking

— Cell Phones Are the New Paper

— Where You At? Ask Your Phone, Not Your Friend

— 25 Years of Predictions

The Future of Your PC’s Hardware

1) Memristor: A Groundbreaking New Circuit

Since the dawn of electronics, we’ve had only three types of passive circuit components: resistors, inductors, and capacitors. But in 1971, UC Berkeley researcher Leon Chua theorized the possibility of a fourth type of component, one whose resistance would depend on how much current had flowed through it: the memristor. Now, just 37 years later, Hewlett-Packard has built one.

What is it? As its name implies, the memristor can “remember” how much current has passed through it. And by altering the amount of current that passes through it, you can change its resistance, making the memristor a one-element circuit component with unique properties.

Most notably, it can save its electronic state even when the current is turned off, making it a great candidate to replace today’s flash memory.

Memristors will theoretically be cheaper and far faster than flash memory, and allow far greater memory densities. They could also replace RAM chips as we know them, so that, after you turn off your computer, it will remember exactly what it was doing when you turn it back on, and return to work instantly. This lowering of cost and consolidating of components may lead to affordable, solid-state computers that fit in your pocket and run many times faster than today’s PCs.

Someday the memristor could spawn a whole new type of computer, thanks to its ability to remember a range of electrical states rather than the simplistic “on” and “off” states that today’s digital processors recognize. By working with a dynamic range of data states in an analog mode, memristor-based computers could be capable of far more complex tasks than just shuttling ones and zeroes around.
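To make that memory effect concrete, here is a minimal simulation sketch in Python (the article itself contains no code) of the simplified linear ion-drift memristor model HP’s researchers described. All constants here are illustrative stand-ins, not values from any real device.

# Toy memristor based on the linear ion-drift model: resistance depends on
# the total charge that has passed through the device, and the internal
# state persists when the current is switched off (non-volatility).

R_ON, R_OFF = 100.0, 16000.0    # resistance at the two extremes (ohms)
K = 10000.0                     # drift coefficient, an arbitrary stand-in
DT = 1e-3                       # simulation time step (seconds)

class Memristor:
    def __init__(self):
        self.x = 0.0            # internal state, a continuous value in [0, 1]

    def resistance(self):
        # Resistance varies continuously between R_OFF and R_ON with the state.
        return R_ON * self.x + R_OFF * (1.0 - self.x)

    def step(self, current):
        # The state integrates the current over time (i.e., it tracks charge).
        self.x = min(1.0, max(0.0, self.x + K * current * DT))

m = Memristor()
for _ in range(50):             # "write": drive 1 mA through the device
    m.step(current=1e-3)
print(f"after writing:   R = {m.resistance():.0f} ohms")

for _ in range(500):            # "power off": zero current flows
    m.step(current=0.0)
print(f"after power-off: R = {m.resistance():.0f} ohms (state remembered)")

Note that the state x is a continuum rather than a bare 0 or 1; that range of stable states is exactly what the analog-computing speculation above rests on.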

When is it coming? Researchers say that no real barrier prevents implementing the memristor in circuitry immediately. But it’s up to the business side to push products through to commercial reality. Memristors made to replace flash memory (at a lower cost and lower power consumption) will likely appear first; HP’s goal is to offer them by 2012.

Beyond that, memristors will likely replace both DRAM and hard disks in the 2014-to-2016 time frame. As for memristor-based analog computers, that step may take 20-plus years.

2) 32-Core CPUs From Intel and AMD

If your CPU has only a single core, it’s officially a dinosaur. In fact, quad-core computing is now commonplace; you can even get laptop computers with four cores today. But we’re really just at the beginning of the core wars: Leadership in the CPU market will soon be decided by who has the most cores, not who has the fastest clock speed.

What is it? With the gigahertz race largely abandoned, both AMD and Intel are trying to pack more cores onto a die in order to continue to improve processing power and aid with multitasking operations. Miniaturizing chips further will be key to fitting these cores and other components into a limited space. Intel will roll out 32-nanometer processors (down from today’s 45nm chips) in 2009.
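As a minimal illustration of why more cores mean more processing power for parallel work, the Python sketch below (not from the article; the prime-counting job is an arbitrary stand-in for any CPU-bound task) splits a workload across one worker process per available core.

# Split a CPU-bound job across all available cores. With N cores and N
# workers, wall-clock time approaches 1/N of the single-core time, as
# long as the chunks are independent of one another.

import multiprocessing as mp
import time

def count_primes(bounds):
    # Deliberately crude trial division, to keep each worker busy on the CPU.
    lo, hi = bounds
    return sum(
        1
        for n in range(max(lo, 2), hi)
        if all(n % d for d in range(2, int(n ** 0.5) + 1))
    )

if __name__ == "__main__":
    cores = mp.cpu_count()
    chunks = [(i * 20000, (i + 1) * 20000) for i in range(cores)]

    start = time.perf_counter()
    with mp.Pool(processes=cores) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"{total} primes on {cores} cores in {time.perf_counter() - start:.2f}s")

The catch, as the paragraphs below note, is that all those workers still share one pool of RAM, so memory access rather than core count becomes the bottleneck as the number of cores grows.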

When is it coming? Intel has been very good about sticking to its road map. A six-core CPU based on the Penryn design should be out imminently; Intel will then shift focus to a brand-new architecture called Nehalem, to be marketed as Core i7. Core i7 will feature up to eight cores, with eight-core systems available in 2009 or 2010. (And an eight-core AMD project called Montreal is reportedly on tap for 2009.)

After that, the timeline gets fuzzy. Intel reportedly canceled a 32-core project called Keifer, slated for 2010, possibly because of its complexity (the company won’t confirm this, though). That many cores requires a new way of dealing with memory; apparently you can’t have 32 brains pulling from one central pool of RAM. But we still expect cores to proliferate once the kinks are ironed out: 16 cores by 2011 or 2012 is plausible (when transistors are predicted to shrink again, to 22nm), with 32 cores by 2013 or 2014 easily within reach. Intel says “hundreds” of cores may come even further down the line.

3) Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards

When AMD purchased graphics card maker ATI, most industry observers assumed that the combined company would start working on a CPU-GPU fusion. That work is further along than you may think.

What is it? While GPUs get tons of attention, discrete graphics boards are a comparative rarity among PC owners, as 75 percent of laptop users stick with good old integrated graphics, according to Mercury Research.

Among the reasons: the extra cost of a discrete graphics card, the hassle of installing one, and its drain on the battery. Putting graphics functions right on the CPU eliminates all three issues.

Chip makers expect the performance of such on-die GPUs to fall somewhere between that of today’s integrated graphics and stand-alone graphics boards, but eventually, experts believe, their performance could catch up and make discrete graphics obsolete. One possibility is to devote, say, four cores of a 16-core CPU to graphics processing, which could make for blistering gaming experiences.
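That core-splitting idea can be sketched in ordinary code: the hypothetical Python snippet below reserves 4 of 16 workers for graphics-style tasks and leaves the rest for game logic. A real fused chip would do this partitioning in silicon; the snippet, with invented function names, only illustrates the scheduling concept.

# Partition a hypothetical 16-core CPU: 4 workers for graphics-style work,
# 12 for general-purpose work, each group drawing from its own pool.

from concurrent.futures import ProcessPoolExecutor

TOTAL_CORES = 16        # the article's hypothetical 16-core CPU
GRAPHICS_CORES = 4      # cores set aside for graphics processing

def shade_tile(tile_id):
    return f"tile {tile_id} shaded"      # stand-in for pixel/vertex work

def simulate_tick(tick):
    return f"tick {tick} simulated"      # stand-in for game logic / AI

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=GRAPHICS_CORES) as gfx, \
         ProcessPoolExecutor(max_workers=TOTAL_CORES - GRAPHICS_CORES) as cpu:
        frames = list(gfx.map(shade_tile, range(8)))
        ticks = list(cpu.map(simulate_tick, range(8)))
    print(frames[0], "|", ticks[0])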

When is it coming? Intel’s soon-to-come Nehalem…
