Staying up to date with everything that's going on in PCs and tech is almost impossible, so these are the ten technologies that you should be most aware of, as they're the ones that'll make the biggest difference to your life.
1. 3D gaming
The fact that getting any kind of 3D image from a 2D screen means wearing a pair of sunglasses, or worse, makes three-dimensional gaming slightly less convincing than multitouch and natural user interfaces, even though the two have been commoditised at almost the same time.
An Acer Aspire 5738 laptop with a 3D display costs about £550 at the moment – not bad for something with cutting-edge technology that adds depth to any DirectX 9 game. The screen is of the polarised-filter type, which is the new norm for extra dimensions.
Instead of using coloured filters to split an image in two – one for each eye – the vertical pixel columns alternate between the left and right images and are shone through a piece of polarised glass. A pair of dark glasses with oppositely polarised lenses ensures that each eye sees only one image. The difference in-game is tangible too: something like WoW runs and looks incredible, even on low-end graphics hardware.
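The column-interleaving trick can be sketched in a few lines of Python. This is a toy illustration of the principle, not anyone's actual display driver: even pixel columns are drawn from the left-eye frame, odd columns from the right-eye frame, and the polarised lenses route each set to one eye.

```python
# Toy illustration of a polarised 3D display's column interleave:
# even pixel columns show the left-eye image, odd columns the
# right-eye image; oppositely polarised lenses separate them again.

def interleave_columns(left, right):
    """left/right: equal-sized frames as lists of rows (lists of pixels)."""
    if len(left) != len(right):
        raise ValueError("frames must have the same height")
    out = []
    for lrow, rrow in zip(left, right):
        # Pick from the left frame on even columns, right frame on odd.
        out.append([lrow[x] if x % 2 == 0 else rrow[x]
                    for x in range(len(lrow))])
    return out

left = [["L"] * 4 for _ in range(2)]    # 4x2 all-"L" frame
right = [["R"] * 4 for _ in range(2)]   # 4x2 all-"R" frame
print(interleave_columns(left, right)[0])  # ['L', 'R', 'L', 'R']
```

Note that each eye only ever sees half the horizontal resolution, which is the main compromise of the polarised approach.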
It's over in TV land that the real push for 3D is happening, though, as LCD suppliers ask us to upgrade again to watch hyper-real cinema in the lounge. Compared to the other technologies we've talked about here, 3D requires a lot of effort on the part of the viewer (those pesky glasses), and most of us are very lazy; hence the ubiquity of MP3 and standard-definition movies, while Blu-ray and higher-resolution sound standards continue to flounder. We value ease of use over quality every time.
In its favour, 3D doesn't actually require any work on the part of games developers or publishers, as the stereoscopic image is created at the driver level. On the other hand, that means there's no massive push by the people who make and sell games to encourage us to adopt it.
2. Streaming games
Advances in superfast broadband haven't just helped the cause of downloadable games. They'll also have no small impact on the future of streaming games over broadband – or at least that's the theory.
Several companies are pursuing – and a significant amount of money has been invested in – the idea that one day your precious PC will be almost entirely redundant as a games machine.
The concept is simple: all the game's data is hosted on a central server, and all your machine has to do is receive the display and send back input commands. It's a little like the technology used for MMORPGs, except that the rendering engine isn't on your PC – it's in the same server farm as the core intelligence.
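That split of responsibilities can be sketched as below. All the names here are hypothetical – this is not OnLive's or Gaikai's actual protocol – but the shape is the same: the server owns the game state and the renderer, while the thin client merely forwards input and displays whatever frame comes back.

```python
# Minimal sketch of the streamed-game split (hypothetical names, not
# any real service's protocol): the server owns the game state and
# rendering; the client only sends input and shows finished frames.

class GameServer:
    def __init__(self):
        self.x = 0  # toy game state: player position

    def handle_input(self, command):
        if command == "right":
            self.x += 1

    def render_frame(self):
        # Stand-in for the real rendering engine; returns "video".
        return f"frame with player at x={self.x}"

def client_tick(server, command):
    # The thin client's whole job: send input, receive a finished frame.
    server.handle_input(command)   # in reality, sent over the network
    return server.render_frame()   # in reality, a compressed video stream

server = GameServer()
print(client_tick(server, "right"))  # frame with player at x=1
```

Everything interesting happens on the right-hand side of those two comments, which is exactly why the network round trip becomes the whole story.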
This idea was actually mooted some years ago with the Phantom console, which never made it to the stores. It's looking unlikely that OnLive (www.onlive.com), Gaikai and Microsoft's own streaming project will end up as vapourware, though, despite the obvious concerns about input lag: the delay that occurs every time you press a key, as the signal has to travel hundreds of miles before a character even moves.
Proponents say that even twitch-heavy FPS games are possible, but we're a little more sceptical. There's another reason that at least one of these services will be properly launched soon, and that's vested interest from games publishers.
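A back-of-envelope sum shows why we're sceptical. Every figure below is an illustrative assumption rather than a measurement of any real service, but the point stands: the speed-of-light round trip is only part of the delay, and encoding and routing overheads push the total towards territory that twitch gamers will notice.

```python
# Back-of-envelope input lag for a streamed game. All figures are
# illustrative assumptions, not measurements of any real service.

distance_km = 300          # assumed one-way distance to the server farm
speed_km_per_ms = 200      # light in fibre covers roughly 200 km per ms
propagation_ms = 2 * distance_km / speed_km_per_ms  # round trip

encode_decode_ms = 30      # assumed video compress/decompress budget
routing_ms = 20            # assumed switching and queueing overhead

total_ms = propagation_ms + encode_decode_ms + routing_ms
print(f"estimated added lag: {total_ms:.0f} ms")  # ~53 ms on these figures
```

On these numbers the distance itself is almost negligible; it's the video pipeline either side of the wire that eats the frames.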
Because no content is stored on your machine, of course, it's impossible to pirate a streamed game, which is obviously an attractive proposition for them. In the immediate future, though, the technology is more likely to work like games such as Quake Live, which combine some local processing power with some server-based cycles. That's certainly the route Microsoft is taking, and it seems more achievable than relying on 'the cloud' at this stage.
3. Six-core processors
You won't have to wait long for this one. Intel's Westmere CPUs may be hanging around with the dregs of processor society at the moment, chucking their chips in with the integrated graphics crowd, but they're about to grow up – and fast.
Sometime over the next few months, Intel will go two better than the current line-up of quad-core CPUs by launching a six-core version of its high-end Core i7 line. It's based on the existing Nehalem architecture, and the headline feature is a process shrink down to 32nm, while the rest of the spec sheet remains largely the same. It could be a genuine upgrade.
Games programmers are getting much better at working with multithreaded code, so most major titles – like Empire: Total War and its forthcoming sequel Napoleon – should see a much bigger performance increase when given extra cores to play with than the often sporadic leaps in frame rate we saw going from two to four cores.
Because the benefits will lie in the number of things you can do at once, rather than the speed of any one of them, Intel are encouraging some developers to add extra content specifically for people with a six-core CPU. Given the plethora of disappointments we've had lately with almost every technology that's promised to increase our frame rates, we'll reserve judgement until we have one in the office.
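The standard way to estimate how much those two extra cores are worth is Amdahl's law, which relates speed-up to the fraction of a program that can actually run in parallel. The parallel fractions below are illustrative, not measurements of any particular game:

```python
# Amdahl's law: why extra cores only pay off if the game is well
# threaded. speedup = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that runs in parallel and n is the core count.

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9):
    gain = speedup(p, 6) / speedup(p, 4)
    print(f"parallel fraction {p:.0%}: 4->6 cores is {gain:.2f}x faster")
```

If only half the engine is threaded, going from four to six cores buys about 7 per cent; at 90 per cent threaded, it's closer to 30 per cent – which is exactly why better multithreaded code matters more than the cores themselves.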
The good news is that these hexa-powered processors will fit into most existing X58 motherboards after a simple BIOS flash. The bad news is that X58 motherboards are still very expensive.
4. Wireless power
A few years ago we saw a demonstration by a team at Fulton Innovation of a product called eCoupled. Using the principle of electromagnetic induction – by which a current can be induced in one wire coil by a changing current in another placed nearby – the crazy boffins were able to demonstrate wireless power transfer.
Despite being high voltage, they said, it was safe, efficient and could be applied to any surface. The demo room consisted of a kitchen without plugs, but full of lights that could be stuck anywhere and a frying pan that heated up just by sitting it on the counter. Put a phone on the same counter and it began charging. Clearly, this was the future.
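The physics behind it is Faraday's law of induction: a coil of N turns sitting in a changing magnetic flux develops a voltage proportional to how fast that flux changes. The coil figures below are our own illustrative assumptions, not Fulton's actual specifications, but they show how an induction pad can reach the high voltages mentioned above:

```python
# Faraday's law behind induction charging: emf = -N * dPhi/dt.
# All coil figures below are illustrative assumptions, not the
# specifications of eCoupled or any real product.

import math

turns = 50              # assumed turns in the receiving coil
area_m2 = 0.001         # assumed coil area (10 square centimetres)
b_peak_tesla = 0.01     # assumed peak field from the transmitting coil
freq_hz = 100_000       # assumed drive frequency (100 kHz)

# For B(t) = B_peak * sin(2*pi*f*t), the peak rate of change of flux
# through the coil is B_peak * A * 2*pi*f.
emf_peak = turns * b_peak_tesla * area_m2 * 2 * math.pi * freq_hz
print(f"peak induced EMF: {emf_peak:.0f} V")  # ~314 V on these figures
```

The high drive frequency is doing most of the work there, which is why these pads switch at radio frequencies rather than mains frequency.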
Fulton are still working on wireless power, but it's a different company that's beaten them to the shops: Powermat – and its products are expensive for something that replaces a 50p mains plug.
The good news is that the Wireless Power Consortium are going to finalise a standard for wireless power called Qi later this year, which should mean prices drop and manufacturers have the confidence to build the technology straight into devices, rather than requiring an adaptor.
If you think that's crazy, though, take a look at Airnergy by RCA. It's a tiny dongle that can turn Wi-Fi signals back into electricity for charging phone batteries and the like.
5. Wireless displays
The last two standards for monitors, HDMI and DisplayPort, didn't exactly have us all rushing out to upgrade our PC screens and graphics cards, so it's a safe bet that DVI will remain the cabled interface of choice for some time to come. What about connecting a monitor to your PC without wires though?
That's something that could be worth shelling out for. Two different technologies were on the show floor at CES, which should be available en masse this year.
The first, WirelessHD, is being pushed by the usual line-up of TV and DVD player manufacturers as a replacement for HDMI. It uses a short-range, high-bandwidth link in the 60GHz spectrum to transmit HD video and audio from a set-top box or media centre to a TV screen.
The idea is nothing new – Philips have had a kit out for a while that does the same thing – but WirelessHD is a proper standard, and should ensure maker A's TV works nicely with maker B's Blu-ray machine and so on.
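The reason all that bandwidth is needed becomes obvious if you do the sums on an uncompressed HD picture. The parameters below are just the standard 1080p video figures rather than WirelessHD's exact payload format:

```python
# Why a wireless HDMI replacement needs so much bandwidth:
# uncompressed 1080p video is a multi-gigabit stream. These are the
# standard video parameters, not WirelessHD's exact payload format.

width, height = 1920, 1080   # 1080p frame
bits_per_pixel = 24          # 8 bits each for red, green and blue
frames_per_sec = 60

bitrate_gbps = width * height * bits_per_pixel * frames_per_sec / 1e9
print(f"uncompressed 1080p60: {bitrate_gbps:.2f} Gbit/s")  # ~2.99 Gbit/s
```

That's roughly three gigabits a second before audio – far beyond what ordinary Wi-Fi can carry uncompressed, and the reason WirelessHD needs its own high-frequency radio.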
Perhaps more relevant for us, though, is Intel's new Wireless Display, or WiDi. It's designed specifically for laptops in order to remove the hassle of cables when you want to dock them with a proper screen, and like WirelessHD sends the video signal to a receiver box.
Unlike WirelessHD, WiDi can't handle protected content and the like, but it's much simpler, since it requires no new hardware inside the laptop. Instead of using a separate transmitter, WiDi is a software layer on top of the existing Wi-Fi chip, so it's much cheaper to produce. Provided there's no latency introduced to the picture refresh rate, this could be a killer.