
Author Archives: Alex Kidman

Microsoft Xbox Series X Review: Nice improvements, but no need to rush

Microsoft recently launched its fourth console generation with the arrival of the Xbox Series X and Xbox Series S. They follow the original 2001 Xbox, 2005’s Xbox 360 and 2013’s Xbox One generations, although each of those saw both minor and major revisions throughout its lifecycle.

Where prior Xbox generations launched with a single console, Microsoft’s opted for two: the cheaper and slightly less powerful $499 Xbox Series S, which works with digital games only, and the higher-tier $749 Xbox Series X. Microsoft sent me the latter model for a fortnight’s testing recently to see what’s good – and what’s not – with the next generation of gaming consoles.

While PC gaming hardware generally sees smaller iterative improvements over time rather than “generational” leaps, the whole console business is built on the idea of a single gaming box that should last five or more years until the next box – or in this case, the next Xbox – comes along.

So, it’s no surprise that the Xbox Series X is a powerful device, because it will be expected to carry consumers’ gaming expectations not just for today, but for at least half a decade. The Xbox Series X is capable of some truly stunning 4K visuals with the right software, and the use of SSD storage means it’s also considerably quicker at loading games than prior Xbox generations.

Visually, the console itself looks like a small black PC case, ideally placed like a monolith in your living room, and Microsoft’s only really tweaked around the edges of the Xbox controller. That’s also because the Xbox Series X/S consoles will work quite nicely with existing Xbox One controllers, so they really couldn’t change much.

One detail I do wish Microsoft had changed is the interface. It’s a mess of boxes that calls back to the “tiled” display Microsoft seemed to think everyone wanted with Windows 8, and it honestly makes getting to anything except the games or content you’ve played recently a bit of a chore. There’s time to make changes, I guess, and that familiar-if-arguably-a-bit-broken interface at least won’t confuse anyone used to the Xbox One systems.

It’s been a little misleading for a while to think of these machines as purely “gaming” consoles, because both Microsoft and Sony have pushed them heavily as home entertainment centres as well. As such, either the X or S console can access a wide range of streaming services, such as Netflix, Stan, Disney+ and Apple TV+, while the pricier Xbox Series X also supports 4K Blu-ray playback if physical media is your thing.

A key part of Microsoft’s particular play this generation is the ability to play existing Xbox, Xbox 360 and Xbox One games natively on the Xbox Series X and Xbox Series S. It’s not a 100% path all the way back to every game released for a Microsoft console since 2001 – Microsoft maintains a list of compatible games on its website if you’re curious – but it has two distinct advantages.

Firstly, if you’re already in possession of a library of games, you can keep playing them. Secondly, the Xbox Series X uses its faster internal storage and processing power to significantly lessen load times and, in some cases, improve visual quality on compatible games. I’ve honestly spent most of my reviewing time playing through some genuine classics that never looked quite so good or loaded quite so fast before.

Part of this is cheating in a way. Drop an original Xbox or Xbox 360 game into the drive on the Xbox Series X and what it’ll do is download a full copy of the game from Microsoft’s servers into the internal storage, so it can load it that way. At that point, the disc you have purely serves to authenticate that you’re still allowed to play the game in question. Naturally that does mean that there’s no way to validate disc games on the disc-free Xbox Series S, although existing Microsoft Xbox One digital-only download purchases should still be redeemable on those new consoles.

However, while I applaud the backwards compatibility angle, it also serves to paper over the cracks – in this case, the low number of genuinely new games available for the new console platform. That’s an inevitability – and it’s much the same story over on rival Sony’s side of the fence with its new PlayStation 5 console – but it did serve to remind me that, given the long expected lifespan of a new console, it’s often not the best move to rush to buy on day one.

Sure, there are folks who simply must have the latest and greatest as soon as they can – but outside that backwards compatible library, they’ll face a decent wait for genuinely new and actually exclusive games to play. I’ve seen some online auction sites awash with folks trying to sell the new generation of consoles for literally thousands of dollars – or scam folks by selling empty boxes or “pictures” of consoles – but I’d honestly bide my time if I were you. The Xbox Series X is a nice device, no doubt – but it’s one that will only improve with a little time and patience.


Google’s latest Chrome update brings serious speed

When you’re browsing the web, the one thing that you don’t want is slow. Nobody likes to be left waiting, but all too often we’re stuck staring at a half-loaded page or a spinning animation letting us know that something is happening – but rarely what it might be.

Google’s Chrome browser currently has the lion’s share of the browser market on desktop devices – it’s a slightly more complex story on mobile devices, where the popularity of Apple’s iPhone and iPad lines keeps its Safari browser in a strong position – but it’s often decried as being slow and clunky even if you don’t have any extensions running on it.

Google, it seems, is listening to this outcry for speed, announcing recently that the latest version of its Chrome browser has been specifically tuned for performance and speed. Chrome version 87 is, according to Google’s claims, now capable of launching 25% faster than Chrome 86, with typical page loading speeds up to 7% faster as well. Smart optimisation of memory allocated to tabs means that it’ll give more grunt to the active tab and less to ones that you’re not currently looking at.

If you’re guilty of having too many tabs open at once – whether you define that as “more than six” or “so many I can’t make out any individual tab names any more” – you can always free up a little memory by closing a few tabs you don’t need any more. However, if all those tabs are vital to your online experience, Chrome 87 also introduces “tab search”, letting you search for a specific tab even if you can’t make it out through your messy tab organisation.

Chrome 87 will also introduce more of what Google calls “actions” in the address bar. That’s where you’d normally put in the URL of a site you want to visit. Most folks are probably aware that you can directly search Google there as well – type “pet kennels” there and it’ll return a Google search page on that term, or whatever else you pick. New actions in Chrome 87 focus strongly on privacy and security, including the ability to manage your passwords if you store them in Chrome by typing “Edit passwords”, or to launch an incognito private page simply by typing “incognito” and clicking on the button that appears under the address bar when you do.

How do I tell which version of Chrome I’ve got?

Google’s Chrome browser typically handles updates automatically for you, so you generally don’t have to go chasing updates, which are usually applied when you open and close the application itself. If you’re curious, however, it’s pretty easy to check your current Chrome version and kick off an update session if required.

Open up Chrome on your computer and click on the three little dots stacked vertically at the top right of the browser window. You’ll get a pop-up menu with “Help” near the bottom. Move your cursor down to Help, and you’ll get a sideways stacked menu that should have “About Google Chrome” at the top. Click on that, and Chrome will open a new tab that tells you the version of Chrome you’re using, as well as setting off a check for a newer version. If it finds one, it’ll download it, and typically prompt you to click a button to restart Chrome.
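
If you’d rather skip the menu clicking, you can also ask Chrome itself from a little script. The Python sketch below is purely illustrative – it assumes Chrome lives in its usual default location, which may not be true on your machine, and on Windows you’re better off sticking with the menu method above.

import platform
import subprocess

# Usual default locations for the Chrome binary – adjust these if your install differs.
CHROME_BINARIES = {
    "Darwin": "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome",  # macOS
    "Linux": "google-chrome",  # assumes Chrome is on your PATH
}

binary = CHROME_BINARIES.get(platform.system())
if binary:
    # Asking the binary directly prints something like "Google Chrome 87.0.4280.66"
    result = subprocess.run([binary, "--version"], capture_output=True, text=True)
    print(result.stdout.strip())
else:
    print("On Windows, use the About Google Chrome menu described above.")

Either way, the number at the front of the result – 86, 87 and so on – is the part that matters.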

If you’re jumping from Chrome 86 to Chrome 87 – as I’ve just recently done – you should notice a nice speed boost relative to Chrome’s usual sluggish pace. I do try to keep my tabs under control, but anything that can make browsing the web faster and less stressful is good news in my book.


Apple makes big software changes, but fails to launch Big Sur smoothly

It’s been a massive couple of weeks for Apple, one of the world’s biggest computer hardware companies, with numerous new iPhone models and a slew of new “Apple Silicon” MacBooks hitting store shelves. Apple announced its first “Apple Silicon” Macs at its WWDC event mid-year, promising that it would release at least one Mac running on its own processors by the end of the year. Ultimately we got three, with new MacBook Pro 13 inch, MacBook Air and Mac Mini models running on Apple Silicon available to buy.

While I’ve written about Apple’s ambitions for its own processors before, here’s a quick recap. Apple Silicon brings the same essential computer architecture that already lies beneath the glass of every iPhone, iPad and iPod Touch, as well as the Apple TV, to the Mac world. The core idea here is one of unification, because using the same parts means the very same apps can run across multiple different devices.

For now, however, the way Apple is handling the transition is twofold. New apps should be written as “Universal” apps, workable on both Apple Silicon Macs and the now-older Intel Macs that Apple’s been selling for nearly a decade and a half. Intel-specific apps that haven’t yet been rewritten as Universal apps will run through an emulation layer that Apple calls “Rosetta 2”, which fools the Apple Silicon Mac into thinking it’s running its own code.

It’s still very much up in the air as to how well this will work for every app, but the reality here is that Rosetta 2 apps will be with us for some years. Apple’s not abandoning Intel-based Macs right away either, stating mid-year that it had plans for up to two years of new Intel Macs alongside a growing family of Apple Silicon-based Macs.

The software that brings them all together is the latest version of macOS, “Big Sur” – named, if you were curious, for a landmark stretch of California coastline. Big Sur recently became available for Intel-based Macs, a week ahead of Apple selling those first Apple Silicon models, all of which ship with Big Sur pre-installed.

Apple has huge plans for Big Sur – it’s a large scale visual rewrite of macOS with more than a touch of iOS design inspiration, and on those Apple Silicon Macs, you’ll also be able to natively run actual iOS apps. Again, that’s the unifying concept in play there – if Apple can get the actual operating system off the ground, that is.

Apple OS updates are usually popular when they arrive, and it’s not uncommon to see a few slower-than-usual downloads in the early days of any operating system update. Big Sur, however, was a big headache for Apple, because for many users – I was one of them, for what that’s worth – it simply wouldn’t update at all. Users were hit with error messages, and to compound the problem, the issues around Big Sur availability hit a lot of other Apple online services as well, making them slow to respond or, in some cases, completely unresponsive.

There’s a good general maxim that says you should always hold off on large scale software updates while any bugs are ironed out, but it’s generally expected that you would at least be able to upgrade if you wanted to.

As always, if you are keen – and presuming Apple’s ironed out its availability issues more broadly by the time you read this – it’s vital that you back up your personal files before upgrading. Ideally, do so both online in the cloud and on a local storage device. Not only does that give you a level of redundancy if something does go wrong, it also makes restoring simpler and relatively quick, because with a local copy you don’t have to rely on slow internet downloads for your precious personal files.
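
If you like the idea of a quick local copy before you hit the update button, here’s a minimal Python sketch of the sort of thing I mean. The folder names and the /Volumes/Backup mount point are purely hypothetical – swap in your own folders and wherever your external drive actually appears.

import shutil
from pathlib import Path

# Purely illustrative paths: your Documents folder, copied to an external
# drive mounted at the hypothetical location /Volumes/Backup.
source = Path.home() / "Documents"
destination = Path("/Volumes/Backup") / "Documents"

# Copy everything under Documents; dirs_exist_ok lets you re-run this over an older copy.
shutil.copytree(source, destination, dirs_exist_ok=True)
print(f"Copied {source} to {destination}")

Time Machine or a cloud sync service will do the same job with less fuss, of course – the point is simply to have a second copy of anything important before Big Sur touches the first.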


Amazon Fire TV Stick Lite vs Google Chromecast with Google TV: Making your smart TV smarter

Smart TVs – flat panels that include some level of integrated internet-based streaming functionality – have been a reality for years now, with a wide array of approaches and compatible apps built right into your living room TV. In terms of an all-in-one solution it seems like a no-brainer, because if your TV can easily access the streaming services you love, you don’t need any extra remotes, or to cast content from a laptop or phone or anything like that.

When you first buy a smart TV and connect it up to your home internet, that may well be true, but there is a problem. Most smart TV makers aren’t all that good at keeping the apps on their smart TVs up to date.

That’s an issue on two levels. As new services appear – Disney+, which only launched late last year, springs to mind – you may find that you’re not able to access them directly from your TV. The app simply doesn’t exist, and if you’ve got an older but still perfectly good TV, you may find that your TV manufacturer has no interest in software upgrades for it.

Staleness can be an issue even for apps that are already present on your smart TV. As the services themselves make changes to their quality options, software delivery or user interfaces, some smart TV apps can become non-starters, even though they look like they should be compatible. The app launches, you see the logo, and then all you get is an error message when you want something enticing to watch.

It’s why, when buying a new TV, I’ve long been a fan of prioritising picture quality above all else, because that’s a factor that can last years or even decades if you’re lucky, with everything else being a bit of a bolt-on.

Need better audio? Add a surround sound system or high quality soundbar.

Need better smart TV access? Add a cheap smart TV dongle.

Luckily that’s pretty easy to do. Just recently both Google and Amazon updated their smart TV offerings for Australians, via the quite cheap Amazon Fire TV Stick Lite and the slightly more expensive Google Chromecast with Google TV.

The Fire TV Stick Lite is “lite” because in the US, Amazon offers a range of smart TV appliances, with the Lite being the lowest cost option.

It’s the only model sold here in Australia, retailing at $59. It’s a very simple HDMI-connected stick with its own remote control that mostly focuses on Amazon’s own Prime Video streaming service. Prime Video can be subscribed to by itself, but most consumers effectively treat it as a “freebie” alongside an existing Amazon Prime shipping subscription. Alongside Prime Video, it’s compatible with Netflix, Stan, Disney+ and Apple TV+, as well as the major free-to-air “catch up” services such as ABC iView, 9Now and SBS On Demand.

Google’s sold cheap Chromecast devices in Australia for a few years now, but to date they’ve all relied on casting and control via a smartphone, tablet or Chrome browser. The $99 Google Chromecast with Google TV steps it up a notch with an included remote and a user interface effectively built on the Android TV platform that you find on some smart TVs.

This means there’s a wider array of apps you can install on the Chromecast with Google TV. It’s got coverage for all the big hitters – Netflix, Stan, Disney+ and even Amazon Prime Video – although you don’t get Apple TV+; Apple and Google don’t like each other that much, at least for now. If you’re wondering what the difference is between this model and the existing Chromecasts, think of it as a super-sized model: you can still cast to the new Google Chromecast with Google TV from compatible devices, but it’s also got an onscreen menu and a remote control as well.

Having tested out both streaming sticks, they’re fine for their basic purposes, but if I were buying, unless Apple TV+ is really important to you, I’d opt for the pricier Google Chromecast with Google TV. Its response is a little quicker than the Amazon stick’s, and critically it’s capable of 4K streaming, where the Amazon stick will never go above 1080p.

While 4K streaming is also a matter of your subscription tier for selected services, as well as your broadband quality, the Google Chromecast with Google TV is eminently portable, so you can take it with you from TV to TV, and it’ll be ready when you upgrade your subscriptions or your broadband improves. The Fire TV Stick Lite will always be a lightweight solution – hopefully Amazon will bring more models to Australia soon.


Should you buy a laptop with integrated or dedicated graphics?

Intel has just released a new GPU for laptop builders to incorporate into their designs in the form of the Intel Iris Xe Max. If you’re looking at “GPU” like I just started speaking Dutch, it stands for Graphics Processing Unit – the bit of your computer that handles flinging images around onscreen.

The Iris Xe Max is designed for ultraportable laptops that need a little more graphics grunt than you’d get from a standard integrated GPU, so it’s best suited to tasks like video editing, rendering and gaming.

What’s interesting here is that Intel claims the Iris Xe Max can handle 1080p gaming tasks with aplomb, but can also share its computing power with the standard CPU if you do hit an intense task.

You might not care about gaming per se, but it does represent a big leap forward in terms of what can be done with an integrated graphics solution, because it really wasn’t that long ago that these units struggled to even shift a single column of Excel data across a display.

When you buy a new computer – and most notably laptops although this can be true of desktops as well – there’s a choice to be made between integrated and dedicated graphics processing.

An integrated GPU means that the graphics handling is done in (essentially) the same chip as the actual computing, sharing memory resources and generally lowering energy usage. Conversely, a dedicated GPU can have its own memory and deliver superior performance when needed, but at a cost of power usage, which is why you typically don’t see them on ultralight notebooks, or for that matter low cost models either.
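
If you’re not sure what’s inside a machine you already own, you can ask the operating system to list its graphics adapters. The Python sketch below is a rough illustration rather than a polished tool, and it assumes the built-in commands it calls (wmic on Windows, lspci on Linux, system_profiler on macOS) are available on your system.

import platform
import subprocess

# Pick a built-in command that lists display adapters on this platform.
system = platform.system()
if system == "Windows":
    cmd = ["wmic", "path", "win32_VideoController", "get", "name"]
elif system == "Linux":
    cmd = ["lspci"]  # look for "VGA compatible controller" or "3D controller" lines
else:
    cmd = ["system_profiler", "SPDisplaysDataType"]  # macOS

output = subprocess.run(cmd, capture_output=True, text=True).stdout
print(output)

If two adapters show up – say, an Intel entry alongside an NVIDIA GeForce one – you’ve got both an integrated and a dedicated GPU in the same machine.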

If your computing needs are modest – a little light web browsing, office documents and the like – then integrated graphics will be more than enough. Working out which type you’re getting is a little easier if you’re buying a PC based on Intel processors, because Intel doesn’t make standalone graphics cards itself. Typically, what you’ll find on systems with Intel CPUs and standalone GPUs are cards using NVIDIA’s GeForce solutions.

Rival AMD makes both CPUs with integrated graphics and dedicated graphics cards in its own right, although in the laptop space which one you’re getting is usually easily determined by the price sticker on the laptop.

While gamers are often cited as the core market for laptops with dedicated GPUs, and there’s an entire class of “gaming laptops” to suit their needs, it’s not the only scenario where a system like that makes sense.

If you do a lot of work involving graphics elements such as photo or video editing, most of the popular software can leverage the power of a good GPU to speed up your workflow. More recently, there was a bit of a run on actual graphics cards for desktops due to the whole cryptocurrency boom, because again their computing power could be leveraged that way – not that I’m advocating for that approach, but it’s another example of how computing power can be shifted around different computing architectures.

It’s also worth bearing in mind, if you’re buying a laptop, that upgrades beyond memory – and even that can’t be assumed – are usually not a realistic proposition, so if you expect to need more power over time, a dedicated GPU should offer more performance than an integrated one. Systems that come with a dedicated card will typically drop back to the integrated GPU for low-intensity tasks, which means you get the best of both worlds that way.

That being said, for most everyday users, the integrated graphics you get will manage most tasks just fine, and there’s still very much a price differential between systems, and especially laptops, with decent dedicated GPUs.

Outside much older laptop stock – where you run the risk that improvements like Intel’s Iris Xe Max may have outpaced an older dedicated GPU anyway – you won’t typically see a dedicated GPU on a laptop under around $900; anything below that price is almost certain to be relying on integrated graphics.


Google won’t give up search without a fight

One of the biggest tech news stories of recent months emerged when the US Department of Justice announced that it’s going to take search giant Google to court, alleging that it has violated antitrust laws in a monopolistic fashion. According to statements reported by the New York Times, “nothing is off the table” in terms of remedies, including potentially breaking up parts of Google, a fairly standard approach for cases of this type.

That’s assuming the DOJ wins the case, and as with all legal matters, that could take years to conclude, because, not shockingly, Google’s said that it’s going to fight the case. That’s where it gets interesting for Australian users of Google’s services, which is pretty much everybody who searches the web, uses an Android phone, or relies on Google-specific services such as Google Maps or YouTube.

Google isn’t the first big tech company to fall foul of US law, with Microsoft infamously being pursued over anticompetitive bundling practices in the late 1990s and early 2000s. That was a case that the DOJ ultimately won, but Microsoft didn’t end up being split up, at least in part because the length of time it took for the case to wind through the courts meant that a lot of the arguments around matters such as bundling an Internet browser had been rendered somewhat moot because the Internet had moved on.

Google may well be in a different position this time round – the Internet absolutely is all of its business, and it’s constantly iterating to stay on top of technology movements and changes, as well as investing heavily in external applications and companies that it sees as either complementary or competitive to its own business. That’s got to be at least part of the DOJ’s point, but it’s a reality that a lot of Google’s projects, both purchased and developed in-house, don’t stick around all that long. There’s even a website where you can track defunct Google products, the Killed By Google site.

Google’s not likely to kill off products if it does ultimately lose its case, and it’s not assured that this is what the DOJ would seek, with much of its argument relating to contracts around search-specific functions, such as the deal that reportedly sees Google pay Apple $8-$12 billion a year to remain the default search engine on iOS. There are alternative search engines out there, such as Microsoft’s Bing or the privacy-centric DuckDuckGo, but when Google is the preinstalled default and many folks don’t even consider which service they’re using, it’s got to be pretty hard for rivals to get any kind of market traction.

It’s further complicated by the fact that Google itself reorganised a few years back, placing its different business units, including its search engine, under an umbrella corporation called Alphabet. This isn’t a case that will resolve quickly, and the Google that exists by that time could be quite a different company. We’ve seen similar, albeit smaller, moves here in Australia through efforts such as the ACCC’s news media bargaining code.

While crystal ball gazing in technology is a dangerous matter – back when the Microsoft/DOJ case first emerged, Google didn’t even exist, after all – it doesn’t take too much psychic energy to suggest that Google will fight this both in the courts and in the court of public opinion, and while this is a US case, there’s no doubt that the effects of losing it would ripple through Google’s product offerings. A huge part of the reason it’s so big and influential right now is precisely because it can leverage the data we give it every time we search or use its products, whether that’s a direct Google web search, a Google Maps search or even chasing down the latest viral video on YouTube.


Apple revives MagSafe as it drops chargers from iPhones

Apple recently launched its 2020 crop of iPhone smartphones, comprising four different sizes and models that will become progressively available over the next month or so. The realities of the COVID-19 pandemic have meant Apple has had to stagger its iPhone 12 launch schedule, with the basic iPhone 12 and iPhone 12 Pro going on sale first in Australia from the 23rd of October, while those who want either the smallest iPhone – that’s the iPhone 12 Mini – or the largest model, the iPhone 12 Pro Max, have to wait until the 13th of November to get their hands on one.

Amidst all of Apple’s hype around faster processors, better cameras and the first 5G iPhone models the company has produced, it also took a step back into its own past in a somewhat unusual way, resurrecting the “MagSafe” brand it once used for its proprietary MacBook chargers, right up until it started shifting to USB-C charging for its laptop products.

If you’re a longer-term Mac user, you probably fondly remember MagSafe, which used an array of magnets in the charging plug part of the MacBook to ensure that the charger clicked in place. More importantly, it also meant that if somebody tripped or walked over the power cord, all that came undone was that magnetic attachment, not your laptop as it came crashing to the floor. That’s why it was “MagSafe”, you see, because it kept your MacBook safe.

The new MagSafe, however, has nothing to do with MacBooks at all, or at least not with any MacBooks Apple is currently selling. Instead, it’s an extension of the Qi wireless charging that’s already present in the company’s iPhones. Qi can be super handy when you don’t have a charging cable to hand, but it does involve placing your iPhone (or any other Qi-compatible device) in just the right spot to line up with the charging coils within a wireless charger. Get it right, and the juice will flow, but get it wrong, and you may as well have just left your iPhone on any bench for all the extra power you’ll get.

MagSafe for iPhone will use that same idea of magnetic attraction to, as per Apple’s claims, ensure that you don’t have to worry as much about getting the placement right. Magnets in the new MagSafe chargers will line up with the rear of the new iPhone 12 models to make charging more secure. They’ll also be faster, with up to 15W charging where iPhones currently top out at 7.5W, although that’s not a unique MagSafe feature per se – plenty of wireless charging compatible Android phones can charge at 15W or even higher already.

Apple will initially produce MagSafe chargers itself, although third-party brands such as Belkin have already announced products that should hit the market pretty soon. It’s an interesting technology take, although it does come at a price – and not just the one you’ll pay for a new MagSafe charger. Along with announcing MagSafe as one part of the iPhone charging story, Apple also announced that from now on, it won’t include a phone charger or headphones with any iPhone.

The latter might not be that much of a loss – Apple’s “free” headphones with iPhones have always been bland at best – but the loss of the charger could be a tad more challenging. Apple’s position is that it’s doing so for environmental reasons on the grounds that many users already have chargers – though it will still sell standalone chargers if you need one – but there is a challenge there as many folks may be tempted to simply buy the cheapest charger they can find.

That’s generally a poor idea, and sadly we’ve seen more than one instance of harm – and some fatalities – as a result of the use of poorly built phone chargers. What you save in using a cheap charger could cost you a whole lot more than you expected.


What will the upcoming NBN changes mean for your home broadband?

NBN Co recently announced that it’s spending some $3.5 billion to upgrade parts of the nation’s Fibre to the Node (FTTN) network to full Fibre to the Premises (FTTP) over the next three years.

While the NBN itself has been one massive political football, for better or worse, the practical reality of its near-finished state in 2020 is that there are definitely some “winners” and some “losers” when it comes to the quality and speed (and they’re different matters) of their NBN connection. If you’re already in an area with FTTP connections, then you’ve got the current best tech available, with the highest likely reliability to boot. However, the majority of fixed-line connections in Australia use the older, slower and less reliable copper network for the last part of the connection.

As such, the news of an upgrade should be very welcome for most of Australia’s broadband users. That’s pretty much all of us, by the way, with recent ACMA figures suggesting that in the 2019-20 financial year, Australian internet usage by population hit 99% for the first time.

So if you’re on a FTTN connection with speed or reliability woes, you’re set, right?

Well… maybe. The devil, as always, is in the detail, and there’s a mix of unknowns and some catches to be aware of before you start planning for your high-speed future. Not shockingly, the rollout (again) of trucks, workers and cabling will take some time, and while NBN Co’s aim is to have at least 75% of the fixed-line broadband network capable of speeds of up to 1Gbps by 2024, its own estimates don’t see much actual access to higher speeds before around 2023.

Then there’s the way the actual rollout is being handled. If you were in an area originally serviced by FTTP, the NBN rollout hooked fibre up to your premises, and the choice of plans and providers was then up to you. However, for this upgrade rollout, the plan is to roll fibre down streets where FTTN is present, but only connect houses or businesses whose occupiers sign up for a higher speed plan.

It’s not at all clear how much of a speed boost over your current plan you’d have to sign up for, bearing in mind that some FTTN connections barely manage 25Mbps even now. There’s also some suggestion that you’d have to sign up to a contract for a decent term – not just for a month before settling back down to a slower, cheaper plan with the better reliability of fibre – although again details are scant.

It’s a move designed to make the rollout a little cheaper, although it feels like an odd step from a technology standpoint, because it means that in areas that have seen the FTTP upgrade, NBN Co would be on the hook for remediation of both the fibre and the remaining copper in the streets, effectively running two networks in the place of one.

That 75% figure is important too, because it means there will still be some premises that don’t even see the option. While some of those will be premises already served by fibre to the basement, there are still going to be some folks unable to access faster speeds or more reliable connections regardless.

Against this is the rising spectre of 5G networks, with a range of “available” 5G home broadband options. I’m putting “available” in quotes there because while it’s often touted as an NBN beater, 5G has its own challenges in terms of rollout and especially shared spectrum.

A single device on the current sub-6GHz 5G networks in an area might be able to punch out some impressive download speeds – much less so for uploads, though – but once you saturate an area, as is likely once more 5G phones become available, that spectrum is shared amongst all devices, and you could end up with considerably more variable speeds as a result.

Even the telcos that have built 5G networks have long maintained that the two are more “complementary” technologies, with 5G filling in the cracks where the NBN cannot or will not provide what a broadband user needs.


Will consumer VR ever hit it really big?

Virtual Reality, often shortened to VR, is one of those “future tech” concepts – along with hoverboards, jetpacks and teleportation – that we always seem to be just on the cusp of… without ever quite getting there.

However, unlike teleportation – which conventional physics suggests might be a bit of a non-starter – or the risky nature of hoverboards and jetpacks, VR as a tech has been a reality for some time now, and not just in commercial or scientific research terms.

For some years now, there’s been a push for consumer-based virtual reality in the home. We’ve seen efforts from the likes of Facebook-owned Oculus, HTC with its Vive platform, Google with Daydream, Sony with its PSVR headset and Samsung with its Gear VR initiative.

VR is nothing new in a conceptual sense, but the last five years really saw an explosion of consumer-facing virtual reality hardware, much of it designed to work with devices you already had, whether that was a PC, a phone or a games console.

However, while we’ve now seen a touch over five years of consumer-grade VR – typically enough time for a technology to mature and for more widespread adoption of a new platform – that isn’t what has happened to any great extent.

Samsung, for example, launched its Gear VR platform – an affordable VR system built around its Galaxy smartphones slotting into a specially designed headset – back in 2015, but despite revisions to accommodate newer phones, it pulled the plug on its VR ambitions at the end of September.

While there were a few smaller competitors – the likes of LG and Alcatel, for example – in that phone-driven VR space, the other big player in smartphone-based VR was Google with its Daydream platform. Like Samsung’s effort, the Daydream View headset accommodated a smartphone that you’d slot into it, along with a controller for managing your virtual experiences, although Daydream compatibility spanned a wider range of phones. I’m using the past tense here because, you guessed it, Daydream is dead too, with Google recently pulling the plug on its own VR ambitions.

Sony has its play in the VR space via its PSVR headset, which works with the PlayStation 4 console and according to reports will also function with the company’s new PlayStation 5 system when that launches on the 12th of November. As you’d expect, the PSVR experience is heavily game-led, and while Sony’s put a fair bit of promotion into the platform since it launched 4 years ago, it’s not exactly set the gaming world on fire. There are rumours that Sony’s working on a PSVR2 exclusively for the PS5, but no confirmation just yet. It’s not uncommon to see the PS4-compatible version on sale at electronics stores at a significant discount, which is rarely the province of a red-hot must-have tech gadget.

All this is not to say that VR is dead; there’s a lot of work going into commercial and educational applications, and platforms like the Oculus or HTC’s Vive are still continuing along nicely. There are some issues with the technology as it stands, especially if you wear glasses or find things like 3D effects disconcerting.

Back when cinemas pushed “3D” movies above all else, I found them essentially unwatchable due to headaches, so I’m not exactly a prime candidate for VR either. However, I can very much see the potential, because right now our tech interactions are increasingly screen based, and VR removes that friction point by placing you essentially “inside” the experience – whether that’s an action game, movie experience or educational activity.


Nuki Smart Lock 2.0 & Arlo Essential Spotlight Camera: Gadgets to keep your home safe

In recent years there’s been a glut of smart home devices with a strong focus on what amounts to self-managed security. Where once you might have paid an external firm for monitoring services – or just bought a large bitey dog – you can now use technology to tell you what’s happening in and around your home. But how well do these products work? I’ve been spending some time recently with a few smart home security products, including a Bluetooth-connected smart lock and a mobile smart security camera.

Nuki Smart Lock 2.0
RRP: $419+

Nuki’s Smart Lock isn’t in fact a lock in its own right. It’s a module that you install over your existing lock with your key placed inside. It then uses a motorised turner to flip the key around as required when you tell it to open or close a given door. Because it’s an add-on module, it’s a nicer option than some smart locks that require a full replacement of your actual lock. With landlord approval, you could pretty easily install the Nuki Smart Lock 2.0 on a rental property if you wanted and take it with you when you left that property later on.

The idea is that you install it inside your home, so nobody knows it’s there, but then use a Bluetooth-connected smartphone to actually unlock the door as you approach. Nuki also sells a Wi-Fi connection bridge, so you can then manage your lock and its status from anywhere on the planet, although of course the Bridge costs extra.

Installation of the Nuki Smart Lock is an interesting process; you’ve got to measure up your lock and ensure the module will fit, and then either bolt it into place if your lock has a protruding cylinder, or effectively “stick” it on if it’s a flatter lock type. One catch I discovered early on was that if you don’t get good adhesion, the Nuki lock can pretty easily spin itself out of place through sheer motor force. Placement can also be tricky if your lock and handle sit too close to one another.

Once it’s installed securely, however, I was quite pleased with how well it typically worked. There’s a button on the back, so if you’re inside you don’t even need a phone – a simple tap will get the key spinning and unlock the door. It’s also voice compatible with Amazon Alexa or Google Assistant, but you’ll have to invest in the Nuki Bridge if you want that kind of functionality. Likewise, you can integrate a keypad if you want number-pad entry, but that’s an added cost too.

Arlo Essential Spotlight Camera
RRP: $229

Arlo has been around in the home security camera space for some time, but most of its products to date have relied on the idea of having multiple cameras connected to an Arlo Hub that hooks into your home internet connection. You can do that with the Arlo Essential Spotlight Camera, but it’s also designed to act as a standalone unit if you only wanted the single camera in place.

As the name suggests, it’s a camera with an integrated spotlight that fires up if the camera detects movement after dark. Once it’s charged up and set up – a process that’s nicely simple through Arlo’s app for iOS or Android – you then have to choose where to place it. It’s rugged enough for outdoor use, although the further it sits from a decent Wi-Fi signal, the more power it’s going to use. One drawback with this model compared to other Arlo devices is that it uses a sealed battery, so when you want to recharge it, you’ve got to take the entire camera offline, unless you’re using it in a situation where you can keep constant power flowing to it.

Actual video pickup is really good, with nicely crisp 1080p images even in low light situations. You do need to be careful about placement however, because the spotlight is very bright, and a few of my family members did comment while I was testing that it was a tad blinding if it spotted them returning home late at night.

Arlo’s proposition isn’t just hardware, however, and if you want longer-term storage of your footage, as well as advanced features such as object detection and custom motion zones, you’ve got to pay extra for an Arlo Smart subscription package. You get a three-month trial with the Arlo Essential Spotlight Camera, but after that plans start at $4.49 a month.

Bear in mind with a camera like this that if your core interest is making sure you get alerts about your property when you’re not there, you’ll also need a decent-speed broadband service at that property to send images to your phone or other device while you’re away.

