
Author Archives: Alex Kidman

Has the time come to switch to SSD?


You probably don’t think that much about the storage inside your laptop or desktop, excluding those times when you start running out and your computer complains about the lack of overall storage.

A quick primer, as a tiny digression. If you look at the specifications of your computer, it’ll list two types of storage: the memory and the storage (sometimes just referred to as the hard drive). You might think the two are the same, but they’re not. Memory is, in human terms, short-term memory, active only while your computer is on. Think of it as a digital blackboard that gets wiped clean every time the power goes down, and may be cleared progressively as you use different applications. Storage is more like your long-term memory, where anything you create, download or install is kept.

It’s feasible to upgrade the memory (or RAM) in many systems, especially desktops, although some thinner laptop designs use sealed memory compartments that can’t be upgraded. Storage is an easier proposition, however, because you can add external devices such as thumb drives or full external hard drives to gain extra portable storage for any laptop or desktop with a spare USB port.

For years, the external drives you increased your storage with weren’t that much different to the drives within your computer. They were mechanical drives with read heads, encased in enclosures that provided power and not much else, although many external drives used physically smaller drives than their desktop counterparts. In recent years, there’s been a growth in the use of solid state drives, which use memory that’s not terribly dissimilar (at a basic level) to the memory used for RAM. Solid State Drives (SSDs) are quieter, thinner, and generally more power efficient, but their real benefit is pure speed.

I’ve recently been testing out Samsung’s T5 SSD, the company’s latest generation of its external solid state drives. They’re very small, very lightweight and very fast indeed. Connectivity is via either of two cables provided in the box, covering both standard USB (USB-A) and the newer round USB-C standard, and the drives are available in 250GB, 500GB, 1TB and 2TB sizes. Samsung also supplies a simple utility to encrypt the data on the drive if you’re thinking of using it for business purposes, or even if you just want to keep your family photos truly private.

Connecting the 500GB T5 SSD to a 2016 MacBook Pro, I was able to hit an impressive 513.5MB/s real world read speed, and write speeds of up to 482.7MB/s. You’ll never see that kind of speed from an external mechanical drive, although if you have a laptop or desktop with an internal SSD, it’ll be even faster again, because the realities of shifting data over an external interface will always introduce some overhead.
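If you’re curious about your own drive’s real-world performance, a rough sequential write test is simple to sketch. This is only an illustration (the file path and test size are arbitrary, and operating system caching means a dedicated benchmarking tool will give more reliable numbers):

```python
import os
import time

def measure_write_speed(path, size_mb=64):
    """Write size_mb megabytes to path and return a rough MB/s figure.

    The fsync call forces data out to the drive itself rather than
    letting it sit in the operating system's write cache.
    """
    chunk = os.urandom(1024 * 1024)  # 1 MB of incompressible data
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    os.remove(path)  # tidy up the test file
    return size_mb / elapsed
```

Point it at a file on the external drive you want to test; pointing it at your internal SSD instead gives you a baseline to compare against.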

All well and good and robust, but there is a catch, and it’s one worth knowing about. SSDs have been favoured by speed freaks, but to date they’ve been the more costly option against standard mechanical drives. Samsung’s new drives start at $199 for the 250GB version, all the way up to a wallet-scaring $1,249 for the 2TB version. Considering you can pretty easily buy a 2TB mechanical drive for less than $199, you’ve got to balance your storage needs against your need for speed. That being said, when the first external SSDs emerged they were both slower than the T5 and much more expensive. That price and storage gap is shrinking, and as it does, expect to see more and better SSD bargains along the way.


Time to relearn all your password rules


For just about any online service you’d care to name, you’re going to be requested to set up a password in order to securely access those services. This may be for a relatively trivial reason, such as one-time access to a site you’re not sure you’re going to use regularly, or something far more serious such as your online banking.

Either way, you’ve probably been hit by a set of password rules that required you to pick a unique password (always important) with at least one capital letter and one number as part of the combination. There’s a reason those rules have permeated the internet: they can be traced back to a US security document from 2003, which laid out the (at the time) understood best practice for password creation.

There’s just one problem. The rules that were laid down then were built on both a limited understanding of passwords, and an even more limited subset of “bad” passwords to work from, most of which dated from the 1980s. They recommended, amongst other things, that passwords should be regularly changed, as frequently as every 90 days.

For many of us, this has led to really lax practices, such as re-using passwords across multiple sites, or using really simple ciphers such as appending a number (usually a 1) to the end of a new password to make it easy to remember. Many folks adopted the use of numbers to replace letters, so that “e” becomes “3”, “A” becomes “4” and “O” becomes “0”, for example.

There’s a big problem here, because that creates a recipe for passwords, and it’s one that, especially as processing power has grown, has been ever easier for computers to crack. The author of the original password document now states that they’re not terribly suitable for human beings to use, because they promote passwords that are hard for humans to remember, but easy for hackers to crack.
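To see why those substitution recipes offer so little protection, consider how cheaply a cracking tool can enumerate every “leet” variant of a dictionary word. A minimal sketch (the substitution table here is a small, illustrative subset of what real cracking wordlists bake in):

```python
from itertools import product

# A few common letter-for-number substitutions; real cracking
# wordlists include far more variations than this.
SUBS = {"a": "a4", "e": "e3", "i": "i1", "o": "o0", "s": "s5"}

def leet_variants(word):
    """Yield every variant of word under the substitution table."""
    choices = [SUBS.get(ch, ch) for ch in word.lower()]
    for combo in product(*choices):
        yield "".join(combo)

variants = list(leet_variants("password"))
# "password" has 4 substitutable letters (a, s, s, o), so there are
# only 2**4 = 16 variants to try -- trivial extra work for a cracker.
```

Sixteen extra guesses per dictionary word is nothing to a machine testing millions of candidates per second, which is exactly the point the new guidance makes.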

So what’s the solution? The new rules being proposed change up the way that traditional passwords were thought of.

Out with mandatory numbers, because we’re (generally) lazy and always tend to append them to the ends of our passwords.

Out too, with forced changes of passwords, because that should only be necessary if there’s a known breach of a given service or site.

Users should be encouraged to use passphrases, because you can generally remember a phrase much more easily than a random jumble of letters, whether it’s a song lyric, a poetry phrase or simply a string of words that you happen to like and can find memorable.

Of course, you can still mix it up a little and, for example, use methods such as Diceware, where you roll dice to pick words from a random list, or use acronyms based on the lyrics of your favourite song.
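The Diceware idea is simple enough to sketch in a few lines. This illustration swaps the physical dice for a cryptographically secure random generator, and uses a tiny stand-in wordlist (the real Diceware list has 7,776 words, one per five-dice roll):

```python
import secrets

# Tiny illustrative stand-in; the real Diceware list is far larger.
WORDS = ["correct", "horse", "battery", "staple", "koala", "billabong",
         "teacup", "violin", "glacier", "pepper", "marble", "sunset"]

def diceware_passphrase(n_words=5, wordlist=WORDS):
    """Join n_words picked uniformly at random into a passphrase.

    secrets.choice uses a cryptographically secure random source,
    standing in for the physical dice rolls.
    """
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))
```

With the full 7,776-word list, a five-word phrase gives roughly 7,776^5 (about 2.8 × 10^19) possibilities, which is why length beats forced complexity.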

The new rules also stipulate password lengths of up to 64 characters, but before you panic at that length, they also suggest allowing password fields to support pasting in passwords. That means they should work with password managers such as Dashlane, 1Password or Keepass, and that’s good news if you have many passwords to remember, as so many of us do.

With a decent password management app, all you need is one decent passphrase or password, and then you can let the app do the calculations and creation of new passwords for you on the fly, unlocking the app with your master password and pasting in new passwords as needed.
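Under the hood, a manager’s password generator is doing something along these lines: drawing characters from a large alphabet with a cryptographically secure random source. A minimal sketch (the length and alphabet are arbitrary choices, not any particular manager’s settings):

```python
import secrets
import string

def generate_password(length=20):
    """Build a random password of the sort a password manager creates:
    characters drawn from letters, digits and punctuation via a
    cryptographically secure random generator."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Because the manager remembers the result for you, the password can be as long and random as the site allows, with no need for it to be memorable.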


Apple’s iPod range is almost defunct


While Apple has a history as a computer company that dates all the way back to the 1970s and the original Apple I computer and the wildly successful Apple II model that was the favourite of many Australian educational institutions in the 1980s, by the 1990s the company wasn’t doing so well. Considerably more affordable Windows PCs had overtaken Apple in the personal computer market, and it had essentially failed to innovate in a way that resonated with the majority of consumers. I can well remember, as a much younger writer, viewing Apple as a company that probably wouldn’t be around all that much longer, right around the same time that Michael Dell, CEO of Dell Computers, declared that if he were Apple, he’d be refunding the shareholders’ money.

That’s not a claim he’d make any more, because the value of Apple as a company is phenomenal. There’s simply no other word to describe it, given what an absolute basket case it was in the late 1990s. Apple did simplify and greatly improve its desktop and laptop computer range with the shift to Intel architecture, but that wasn’t what revitalised the company. Instead, it was in reading the market around personal music players and building the iconic personal music player of the early part of this century: the iPod.

Apple wasn’t the first to market with a solid-state music player by a long shot, but there really wasn’t anything with either the quality or style to compare to it, especially with Apple’s own iTunes music store behind it. These days iTunes as an application can be a bit of a pig, even on its native macOS platform, but back then it was so much easier than any other competing legitimate platform, and even some of the dodgier pirate alternatives too. The iPod saved Apple, and it paved the way for both its iPhone and iPad products that have led it to such a phenomenal share price. You could buy Apple, but you’d need a really hefty wallet to do it with.

A big part of Apple’s problem in the late 1980s and 1990s was that it rested on its laurels for way too long, spinning out the Apple II and staying with its PowerPC architecture for Macs way beyond the point where it was practical to do so. It seems that Apple isn’t going to stay nostalgic for the iPod, however. It sold its last mechanical hard drive based iPod a few years ago, and recently also discontinued the iPod Shuffle and iPod Nano, which means that there’s now only one product that it sells which is still called an iPod. That’s the iPod Touch, which these days is effectively a lower-power, slightly cheaper iPhone without the iPhone calling technology built in. It’s a fair bet that, while it’s a function that does get used, most iPod Touch buyers are more interested in other apps than simply having a music player.

What does this mean for you if you’re still rocking an iPod Nano or Shuffle? You can expect the available support and repairs from Apple for those products to dry up relatively quickly; right now there are probably a few spare parts for units still under warranty, but once those are gone, if your Nano or Shuffle won’t play, it’s time to send it responsibly to e-waste recycling. If all you want is music, there are still a few budget brands offering simple players, but few with the style of an iPod. Even the iPod Touch, sitting a few generations behind the iPhone 7 or current model iPad Pro in processing terms, is probably living on borrowed time, but it’s your best bet if you do want to keep your tech purchases purely within the Apple camp.


Are you risking it all on public Wi-Fi?


It’s feasible within Australia to stay connected online for relatively low prices, especially if you make smart use of public Wi-Fi. It’s possible to get online from many public libraries, cafes, shopping centres and popular public locations, where the venue either offers Wi-Fi as a straight service (a la libraries) or as part of a hook to get you in the door. It’s extremely convenient if you just want to quickly Google up some details, check a map or send a social media update. Sure, you rarely get particularly great speeds, but you’re paying nothing for the service.

Or are you?

How careful are you when you go online to ensure that what you’re doing is for your eyes only? A recent study by Norton by Symantec, the 2017 Norton Wi-Fi Risk Report, suggests that an alarming number of Australians are either unaware of whether they’re safe when using public Wi-Fi, or simply don’t care. 60% of respondents said they felt safe when using public Wi-Fi, seemingly without checking whether what they were doing was secure, or using tools such as VPNs to ensure their information was encrypted. 59% simply couldn’t tell if a public Wi-Fi network was secure or not. Still, these networks are very popular, with 83% of respondents indicating that they’ve used public Wi-Fi to log into personal email accounts, check bank balances and/or share photos and videos. That isn’t always a particularly wise option.

So what should you do? The general rule of thumb when using public Wi-Fi is to assume that just about nothing is secure. If you don’t control the network, you have few easy ways to tell whether others are snooping on your web traffic.

At the very least, that should give you pause for thought when using such services for any kind of sensitive information, whether it’s checking your email (remembering that your password has to be sent over the network to verify that it’s you), sending personal photos or any kind of online financial transaction.

However, that doesn’t mean you should equally assume that all public Wi-Fi is a poisoned well with no recourse. If the sites you’re browsing over public Wi-Fi use HTTPS encryption (you can spot this because the full URL starts with HTTPS rather than the more open HTTP, or via a padlock symbol in your browser), then you’re part of the way to being secure.
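That first check is mechanical enough to express in code. A minimal sketch of the scheme test (a real browser also validates the site’s certificate before showing the padlock, which this deliberately doesn’t attempt):

```python
from urllib.parse import urlparse

def looks_secure(url):
    """True only if the URL uses the encrypted https scheme.

    This is just the first of the checks a browser performs; the
    padlock also depends on the site presenting a valid certificate.
    """
    return urlparse(url).scheme == "https"
```

Anything that fails even this basic test is sending your traffic in the clear, readable by anyone else on the same network.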

It’s not generally a great idea to do any private financial transactions on such networks, such as online banking, but if you must for whatever reason, it’s wise to invest in a VPN (Virtual Private Network) application. A decent VPN will encrypt your web traffic, keeping it secure for just you and whoever you’re interacting with, and should only cost from $5-$10 per month. It might irk to have to spend on security, but the difference here could be $10 on a VPN versus having your identity compromised for all sorts of nefarious uses, not to mention your bank account’s contents suddenly and irrevocably heading offshore.


2018 could see a mobile broadband price war


Optus recently announced that it was sinking around $1 billion into improving its nationwide mobile telephony network, with a particular focus on improving its regional coverage. That billion dollars will go into 500 new mobile sites across the nation, including 114 under the Federal Government’s Mobile Blackspot program, as well as upgrading more than 200 sites from 3G to 4G.

That last bit is critical, because mobile telephone services have jumped well beyond simple calls and texts. Data is king in these kinds of equations, as a simple 3G network is more than enough to handle the bulk of call and text needs. Throw data into the mix, especially as we’ve expanded into more data-intensive services in a mobile setting such as videoconferencing or, for that matter, video streaming, and 4G is barely enough.

If you’re a regional phone or mobile broadband user, Optus’ announcement is good news, even if you’ve got no particular intention of switching telcos to Optus itself. That’s because it provides further market ammunition against Telstra, the biggest player in the telco space in Australia, and especially in regional areas. The big players all claim greater than 95% coverage in Australia, but those figures are always population based, and Australia’s population is heavily focused on living along the coastline, especially in the eastern states. Telstra’s inland coverage has been the one to beat, and Optus is giving it a solid go.

Even if you’re more of a metropolitan user, it’s still good news, as 2018 is shaping up to be a highly competitive year in the mobile broadband space, with TPG also set to start rolling out its own mobile network to around 80% of the population. There’s no doubt that it will want to capture customers, and the easiest way to do that will be via competitive pricing. Telstra’s long sold itself on the quality and breadth of its network, but with outages and competitors creeping in on its network map’s coverage, some of those advantages level out, which should see it facing ever more pressure to drop prices. It’s certainly not as though Optus or for that matter Vodafone aren’t going to compete as well.

All of this movement is also happening as the picture around 5G networks is starting to coalesce. 2018 will see a number of important 5G trials take place in Australia, and while we won’t see actual networks or 5G mobile broadband devices until 2020 at the earliest, what we will see is an increasing focus on delivering better value data packages with wider overall coverage, and most likely at lower price points. 5G is all about data, whether it’s from the internet-of-things approach taken up by devices such as Google’s recently launched Google Home, or long-distance high speed broadband, a particular focus for Australia.

As such, while the data inclusions on plans are improving at a fairly regular rate right now, 2018 could see telcos switch to lowering the overall cost of data instead as they scramble to nab customers from each other and soak up any of the as yet uncommitted userbase left. Given the relatively high rate of smartphone adoption in Australia, that can’t be that many people.

Just in case you were wondering, the emergence of improved wireless broadband doesn’t take away the focus for high speed fixed line broadband networks such as the NBN. In many cases, the 4G and 5G networks we’ll be using in the future will use the NBN as the effective data backbone, because even though the technology is improving, wireless broadband is by its nature both lossy and prone to interference and oversaturation, whereas physical fibre optic cable has an immense potential upside. They’re essentially complementary, rather than competitive, technologies.


Google opens up Game of Thrones via Street View


When it comes to opening up the world of information, few companies have the reach and impact of Google. The odds are insanely high that when you search for information online, you’re using Google’s proprietary search algorithms, with only Microsoft’s Bing search engine standing as serious competition, or DuckDuckGo if you’re more privacy minded.

Most of us tend not to be, which gives Google the primary space on our searches. Likewise, Google’s Android operating system and its associated smartphones and tablets are the most popular in the land, and Google has a host of other services such as mapping to offer us views into our own world. Recently on a trip to Tokyo, I would have been seriously lost were it not for the ability to fire up Google Maps and make my way around, although some friendly locals did help out there too.

It’s not just the real world that Google’s opting to map, either. Some years ago Google rather secretly added the Tardis from Doctor Who to its mapping solution, and it’s recently supplemented that admittedly fantastical location with something that merges the virtual world with the real world, by way of popular (albeit not for the whole family) TV series Game Of Thrones. While fans wait for George R.R. Martin to finish out the last book, the TV adaptation is rapidly racing towards the epic story’s conclusion.

To mark that milestone, Google’s done the hard work of mapping out many of the real world locations used as filming bases for the show through its Street View project. (Game of Thrones: The old Views and the new)

Street View is the spinoff from its maps product that uses 360 degree photography, so that rather than looking at a top-down map and wondering where you are, you can examine real world landmarks as they appeared when the photos were taken, for a more natural navigation experience. Google’s breakdown of the Game Of Thrones filming locations gives you a genuine feel for quite how far and wide the producers of the show go to get just the right backdrops, even though the series is also noted for its heavy use of computer aided background imagery. So for example, if you’ve ever wanted to tour Winterfell, the home of House Stark, it’s right here and actually in central Scotland, but the nearby forest locations used for filming are actually in Northern Ireland. King’s Landing locations range from Northern Ireland to Croatia (so that’s one seriously big capital!) while both Volantis and the Dothraki Kingdom are represented by filming locations in Spain.

For fans of the show it’s a real treat to peek behind the curtain and see those real world locations, and obviously for those fans with travel plans, also a way to perhaps get a few selfies in iconic locations that are now also often popular tourist destinations in their own right. Even if you’re not, it’s a lovely way to see how information can be leveraged to deliver a new view on how we see the world.


Does it make sense to jump into beta operating systems?


Apple announced the latest updates to its macOS and iOS platforms at the recent WWDC event, with delivery of iOS 11 and macOS High Sierra promised for later in the year, most likely around the October/November timeframe.

Operating system updates (in major forms) are a yearly occurrence for Apple users, provided free of charge once they’re ready to roll out. For years now, the standard wisdom around these releases has been to maybe wait a while once they’re fully rolled out for those last minute bugs to be ironed out, unless there’s a really specific feature that you’ll need right away. That’s because when software that’s as complex as an operating system gets updated and set free across millions of users, each with individually different needs, applications and approaches, a few issues are bound to crop up.

What Apple has done in recent years to mitigate this somewhat is open up its software for public appraisal, albeit in beta form. Beta software is, by its very definition, incomplete and prone to some errors, and it’s something that Apple has made available to its developer community for some years to enable updates and new software to be developed with compatibility in mind.

Apple has opened up the public beta versions of both iOS 11 (for iPhones and iPads) and macOS High Sierra (for Mac computers) for download, and if you’re very keen, you can sign up for Apple’s Beta program to access them here. You’ll need an Apple ID, but if you’re running an iPhone, iPad or Apple computer, it’s all but inconceivable that you won’t have one of those.

Getting early access means that you’ll get to try out new features across both operating systems, although those may vary depending on the capabilities of your device, especially on iOS. Newer devices like the shiny new iPad Pro 10.5 inch get the full whack of iOS 11 features, while older iPads and iPhones lower down the compatibility list get a scaled down experience. That’s again a standard play on Apple’s part. The reasoning usually given is that they’re limited by the internal hardware on those older devices.

However, there’s then the question of whether or not you should apply beta software to your existing devices. If they’re your only devices and you rely on them absolutely for business or personal purposes, the answer is almost certainly no. Again, beta software has bugs that are both known and being worked on, and quirks that are yet to be uncovered. You don’t want that on a machine you rely on.

However, if you’re in the Apple camp heavily enough to have a spare system, it may be worth enrolling a single device for testing purposes, especially for business. That’s because it’ll give you more of an insight into how your current apps actually run under the new environments, as well as spotting any issues. Beta releases come with a strong focus on app and performance reporting, which means if you do spot an issue with an app you can provide feedback to Apple (or other software or hardware providers) that may lead to a faster fix for those problems. That way, you’re much better prepared for when the full software release happens.

If you’re in the Windows camp, by the way, you’ve got access to much the same kind of beta testing, albeit on a rolling and ongoing basis. Microsoft runs what it calls the Windows Insider Program (https://insider.windows.com/), which allows you to enrol individual Windows PCs to receive the ongoing beta versions of Windows 10. Again, though, it’s best for testing machines and software combinations, rather than using on your day to day or only computer.


A decade of iPhones changed more than just our calling habits


Just recently, the very popular Apple iPhone celebrated the tenth anniversary of its on-sale date. Complex tech products don’t just spring up out of the earth, so it’s actually more like a grumpy teenager than a bright-faced ten year old, with even secretive Apple admitting that it worked for some years on getting the iPhone look and feel “just right” for launch.

That original 10 year old iPhone was never formally launched in Australia back in 2007, although very dedicated Apple fans did import some of them. If you’re still holding onto one, it’s perhaps a collector’s item but not likely to be much of a phone any more, because it was a 2G device, and our local 2G networks only have a matter of weeks left before they’re switched off entirely. The first iPhone concentrated heavily on just a few pre-built apps with a strong focus on telephony; allegedly Apple CEO Steve Jobs was paranoid that any more complex apps might interfere with a phone call, which would never do if they were going to take on industry heavyweights such as Nokia.

Fast forward a decade, and a brand that’s bought the rights to the Nokia name is launching Android phones locally at budget price points, because that’s what the Nokia name is worth now. Apple has seen astonishing growth in that time, often being the world’s most valuable company, but it’s not just a question of stock market value.

That second iPhone, the iPhone 3G, which launched here in 2008 before anywhere else in the world, changed things. Aside from the smooth iOS interface, which refined ideas others had sketched around for years without ever getting them quite right, the secret sauce was in the apps, and the ability for external developers to make apps for the iPhone. Apple still keeps a tight rein on the types of apps it will let play in its iPhone garden, but from that grew competitors such as Google’s Android, which offers more flexibility, albeit sometimes at the cost of whether an app will work or not, plus some security worries.

Before the iPhone’s emergence, mobile networks could carry data, but did so at astonishingly low rates. Now mobile data usage dwarfs every other type of mobile usage, to the point where if you graph it, you have to squint really hard to make out the volume of calls. Internet user numbers have jumped from the millions to the billions, and while not everyone is using an iPhone (indeed, iPhones are the minority share against Android), that first kickstart was quite essential in getting us to this position.

It’s also changed the way we interact while out and about. Some may decry folks always checking their phones, and there’s perhaps something in that, but at the same time it’s opened up the worlds of photography, social media, and even disruptive new industries like Uber along the way. Fortunes have been made and lost, endless games of Candy Crush and Angry Birds have been played, and there’s been an increasing focus on cloud-based services that extend way beyond mobiles to our desktops and laptops.

Apple has somewhat slowed its innovative pace since those early iPhones, so it seems a little unlikely we’ll look back on the second decade of iPhones as such a radically changing time. I guess we’ll find out when we’re using the iPhone 15 or 16 sometime in 2027.


Anyone fancy a transparent flexible tablet?


Predicting the next big thing in technology is always a risky game. There’s always the possibility that you’ll pick the incorrect next big thing, or for that matter assume that current big things will maintain their status well into the future.

Watch any Sci-Fi show from the 70s or 80s that references the (then) far away world of 2010, and the chances are pretty good that you’ll see lots of curved cathode-ray monitors, few touchscreens and possibly even a rotary dial or two for communications purposes.

Still, there are technology ideas that persist over time as seeming “futuristic” while never quite coming to market. That’s because while technology as an entire field moves pretty quickly, change in individual components can occur much more slowly. It’s why, while research is continuing in multiple fields, we’re still stuck with lithium ion batteries powering our laptops, tablets and mobile phones: nobody’s yet totally cracked new technologies that work as well and as safely.

It’s also why, while those older science fiction shows showed cathode ray TVs, we’re still using display screens that are flat and solid. There’s been plenty of research into more flexible screen display technologies over the years, and even some implementations. Earlier this year at CES in Las Vegas, LG had an array of its “flexible” W series OLED TVs set up as a massive curved chasm you could walk through, which was visually impressive, but probably not the way you’d want to watch the next State of Origin match. Then again, put the opening crawl of Star Wars: A New Hope on that thing, and I could have stayed there all day.

Getting that technology available at scale and cost, however, is another matter. As the Korea Herald reports, LG isn’t standing still on flexible displays, having now developed a screen that’s not only flexible, but also transparent and high resolution, with a display size of up to 77 inches.

That opens up all sorts of possibilities for usage, because while today’s laptops, tablets and phones all tend towards the same kinds of designs because of the limitations of screen display technologies, something that you can both bend and see through could be formed into many shapes for all kinds of applications.

Before you bin your smartphone and put a brick through your laptop display, though, even LG reckons it’s a number of years away from mass commercialisation. It’s suggested that the displays might first be used for digital signage, or perhaps aquariums. LG produces displays not only for its own gear but (as is common in technology generally) also for many third parties. It’s the same all over, with a lot of memory and processors coming out of Samsung, while a lot of camera modules are produced by Sony, for example.

Still, even in the mid-term, some of the ideas that LG has around flexible transparent displays are fascinating. An aquarium built out of transparent displays has a lot of scope for engagement and education. Imagine staring at an aquarium full of fish, only to have the side of the tank interactively display the species of each tracked fish, with more information on display as required.

That feels like something out of science fiction, and while it’s not here yet, it’s also not that far away.


Are you protecting your business from phishing attacks?


What’s the most precious part of your business? Depending on your trade, that answer might vary, but when you boil most businesses down to their core, it will usually revolve around money; either the operating capital that keeps you afloat, or the profits that you make on a day by day basis as the result of whatever it is that you do.

That information is usually stored electronically, and that has had a profound effect on business efficiency, whether it’s the speed with which you can turn around an email to a client, or the level of detail you can provide the tax office if they come calling. There are few who would advocate a return to a more fragile, harder to index paper business world.

Having said that, the use of technology to run your business isn’t without its risks, especially when it comes to the rising prevalence of phishing attacks. Phishing broadly defines the act of maliciously impersonating some other person or business you deal with (whether it’s your bank manager, your clients or the company that hosts your web site) in order to gain access to your private information. It’s by no means a new concept, but it’s also one that’s on the rise. The classic phishing approach is via email, because if you’re a scammer, it’s very low cost to execute, and even if 99% of your emails are either bounced back or ignored, that 1% that you fool could be very lucrative indeed.

There are numerous technological approaches you can take to mitigate phishing risks and minimise your exposure, such as filtering incoming email, but by far the best defence remains your own actual intelligence. A recent report from Mimecast (Updated Email Security Risk Assessment) suggests that emails impersonating other businesses for phishing purposes rose 400% in the last quarter. Testing the actual email of 44,000 users over 287 days uncovered 9 million pieces of spam, 8,318 dangerous file types, 1,669 known and 487 unknown malware attachments, and 8,605 impersonation attacks.

Bear in mind that in all these cases, the email had already passed through some kind of spam detection filter, which is why it’s vital to keep your wits about you at all times. Always check the simple stuff, like spelling errors, or even errors in the way you’re addressed. Why would your business partner/bank/other entity not address you by your full name, rather than, say, “customer”? There’s one bit of phishing spam I’ve hit recently that seems to love inverting my name, so it always sends through to kidmanalex, which is a bit of a giveaway. Unfortunately, not all the scammers are quite that dumb.
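Some of those manual checks are simple enough to sketch in code. Here's a toy Python illustration (not a real mail filter, and the domain and message below are made up) of two of the giveaways described above: a generic greeting instead of your actual name, and links pointing at a domain that doesn't match the claimed sender.

```python
# Toy illustration of two common phishing red flags.
# Not a substitute for a proper email security product.
import re

GENERIC_GREETINGS = ("dear customer", "dear user", "dear account holder")

def suspicious_signs(sender_domain: str, body: str) -> list:
    """Return a list of simple red flags found in an email body."""
    signs = []
    lowered = body.lower()
    # 1. Generic greeting instead of addressing you by name.
    if any(lowered.startswith(g) for g in GENERIC_GREETINGS):
        signs.append("generic greeting")
    # 2. Links whose domain doesn't match the claimed sender.
    for url_domain in re.findall(r"https?://([\w.-]+)", body):
        if not url_domain.endswith(sender_domain):
            signs.append("link to unrelated domain: " + url_domain)
    return signs

print(suspicious_signs(
    "mybank.com.au",
    "Dear customer, verify your account at http://mybank-secure.example.net/login",
))
```

Real scammers work hard to dodge exactly these sorts of checks (look-alike domains, your name scraped from a breach), which is why the human once-over still matters.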

Even if you don’t operate a business, where many of these phishing scams are targeted, it’s worth keeping your exposure to spam and malware to a minimum. You might think you have no data worth pillaging or no online banking for fraudsters to access, but if they can get access to your computer, that’s a valuable resource in and of itself. It’s fairly common for email phishing scams to be routed through unsuspecting bot-controlled PCs to evade detection, which means that if your machine is compromised, it could be putting others at risk. As we saw with the recent WannaCry infection, scammers will often use multiple attack vectors, so if the phishing email doesn’t get you, a gap in your patching updates just might.

