Worst Computers EVER!

There are some examples of computers that are categorically bad. They could have been badly specified, overpriced, saddled with terrible support, prone to exploding PSUs… the list goes on. We’ve therefore gathered a selection of such computers, the ones that nearly everyone agreed were pretty awful.
Apple III
After the success of its previous home computers, Apple continued the line with the release of the Apple III in late 1980, at an eye-watering cost of between $4,300 and $7,800. Sadly, due to initial problems, the Apple III was withdrawn from the shelves and re-issued nearly a year later, in November 1981.
The Apple III is regarded as one of the biggest failures in the history of modern computing. Its problems were many, and included warped circuit boards and severe overheating that would pop chips out of their sockets and melt floppy disks. The aluminium case served as a heatsink, and would grow incredibly hot to the touch. And, according to internet rumour, the system was designed by the marketing team rather than the engineering team.
Too many components crammed into too small a space were mainly to blame for the massive overheating. Needless to say, by 1984 the Apple III was laid to rest in favour of its successor, the Lisa.
Coleco Adam
Launched in 1983, the Coleco Adam was a cassette and cartridge-based home computer that could also serve as an expansion device for the ColecoVision games console. Its initial price was around $790, but that also included a printer, a couple of Coleco controllers and a keyboard. It was comparable to the Commodore 64 of the time, but it came with a myriad of problems that meant the Adam only ever sold about 100,000 units.
One of the more bizarre issues with the Adam was that the power switch was located on the back of the printer, not the computer. So if you didn’t have the printer, or the printer failed for some reason, the entire Adam setup was next to useless. The major issue with the Adam, though, was that when powered up it released a huge surge of electromagnetic energy, and any tapes already inserted during start-up were promptly wiped.
IBM PCjr
The IBM PCjr (PC Junior) was released in 1984 and is one of the company’s biggest flops of the 80s. Introduced at the unbelievable price of $1,300 (without a monitor), this 128KB system was designed for home users who didn’t want the full-blown IBM PC experience. Sadly, it didn’t work out too well.
One of the major issues with the PCjr was that much of the existing IBM PC software couldn’t run on it, due to its lack of memory. There was also limited hardware expansion, with just a single external slot, and a terrible keyboard. IBM later swapped out the keyboard and dropped the price of the PCjr, which surprisingly improved sales. However, by the following year the PCjr was discontinued.
eMachines eTower 366c
eMachines became a household name in the late nineties thanks to its amazingly cheap PCs flooding the market. At a time when the internet was taking its first fledgling steps, having an eMachine meant you could get onto that superhighway thingie for as little as a few hundred pounds.
The computer in question, from the many that eMachines dished out, was the eTower 366c: a beige-coloured behemoth that could be bought for around £400, or as part of a deal with certain ISPs on a three-year contract. While that sounds reasonable, the eTower did have its issues. The PSUs were notoriously faulty, the fans were noisy, the built-in modems were next to useless, customer support was poor, and many users reported their eTowers randomly turning themselves on in the middle of the night.
Needless to say, they didn’t last long.
Texas Instruments TI-99/4
1979, and the golden era of the home computer was about to kick off in a big way. Before we got there, though, we had the TI-99/4. This monstrous computer, with its own Zenith 13-inch display and terrible keyboard, was, to quote The New York Times, “an embarrassing failure.” An inflated price, non-standard BASIC and a severe lack of software meant the TI-99/4 didn’t last long on the shelves.
On top of the aforementioned issues, the TI-99/4 also suffered from a lack of expansion options, overheating and random power shutdowns, and you could only ever type in capitals. Texas Instruments fared better a couple of years later with the release of its successor, the TI-99/4A.
Future Tech
Many of these systems were filling a need, or a niche in the market, at a time when the home computer was still quite a new experience for a lot of people. In some respects you can hardly blame the companies for opting for something new, and a little extreme, to fit the bill. On the other hand, a lot of corners were cut in some models. Who knows, maybe in another forty years someone will write about those terrible PCs we used back in the 2020s, while typing on their quantum-based, mind-connected computers?
Stop Your Kids Spending Your Money On Google Play!

Luckily, you can enable a parental lock on the Google Play Store to prevent others from downloading things you don’t want them to. Setting these controls is a good idea for a few reasons. It will help you avoid hefty bills when ‘paid-for’ apps are downloaded without you knowing, and you can rest assured that children aren’t downloading content inappropriate for their age. Whatever your reasons for setting limits, you will need to know how to do it!
Enable parental lock on Google Play
First, open the Play Store and tap the menu icon at the top of the screen; this icon is usually three horizontal lines. From the menu options that appear, tap Settings, then tap the sliding button next to the option titled Parental Controls. This turns parental controls on, and your device will ask you to create a parental lock PIN.
Once you have created your PIN, you will be shown a range of options you can control. For example, you can ask the Play Store to request your PIN before letting you download any content rated for over-18s. Work your way through the list of things that can be downloaded and tap any that you want to set restrictions for. Once you have finished, you can rest safe in the knowledge that the children in your life will not be able to accidentally download anything you think they shouldn’t. It is not, however, a magic bullet, and it remains important to keep an eye on what your children are viewing and downloading on your smartphone or tablet.
A Billion Android Devices at Risk From Viruses! Is Yours?

Rather than simply highlighting the problem, Which? then went about performing a limited test, asking an expert antivirus lab (AV Comparatives) to try to infect five different phones. Each phone tested was at least three years old and could only be updated to Android 7.0 or 8.0, but all were still available to buy across the world.
The AV lab managed to infect all of the devices, some with multiple well-known viruses, including Joker, BlueFrag and Stagefright. None of the phones, except a Samsung Galaxy A5 running Android 8.0, fully supported Google Play Protect, the built-in malware protection for Android.
Apple typically supports its iPhones with security updates for around five years, and Microsoft usually supports older versions of its operating systems for as long as ten years. Google, it seems, has a much shorter security support window.
If you think you could be one of the billion, the first thing to do is check which version of Android your device is using. To do this, open the main settings app, then look for About Phone/Device and tap Software Information. You will see the currently installed version here.
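If you happen to have a computer nearby, you can also read the version over USB using Android’s adb debugging tool. This is a purely optional route, and assumes adb is installed on the computer and USB debugging is enabled on the phone:

adb shell getprop ro.build.version.release   # prints the Android version, e.g. 8.0.0
adb shell getprop ro.build.version.security_patch   # prints the date of the last security update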
If your device is using a version of Android older than 8.0, check to see if you can update. Again in the Settings app, look for Advanced > System Update; sometimes you will just see Software Update in the main settings menu. Tap this, then tap Check for Updates.
If an update is available, download and install it. If not, you should consider upgrading to a newer device as soon as possible. If, however, upgrading is not an option for you right now, there are things you can do to reduce the risk of virus infection.
Install a Third-Party Antivirus App
Download and install antivirus software on your device if you don’t already have some. There is a range of options available from the big names in antivirus (Norton, Kaspersky, AVG and so on), but some of these may not be compatible with very old versions of Android (such as Android 4.4).
Be Careful What You Download
Installing well-known apps through the Google Play Store should be pretty safe, but installing from outside the official store (known as sideloading) is risky. If you are planning on doing this, ensure the app is from a reputable source, and always re-enable the Unknown Sources block in the Android settings afterwards.
Backup Your Data
In an ideal world, you should be backing up your data at least monthly. Many newer devices have features in place to do this automatically, but older devices may not. Try to back up to a local location (your computer for example) but if this isn’t possible, back up to a free cloud service like Dropbox or Google Drive.
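For example, if you have the adb tool mentioned earlier set up on your computer, copying your photos across for safekeeping is a one-liner. Note that phone-backup is just a hypothetical folder name here:

adb pull /sdcard/DCIM ./phone-backup   # copies the phone’s photo folder to your computer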
ExpressVPN Review – The Best VPN Ever?

ExpressVPN launched in 2009, and has been protecting the digital rights of users across the world with an impressive arsenal of security features ever since. Running from 160 server locations in 94 countries, ExpressVPN can provide you with a solid connection wherever you’re located. And since the company is registered in the British Virgin Islands, a privacy-friendly jurisdiction, there’s no logging taking place. In fact, the company regularly welcomes third-party audits of its logging policies, to verify that all connections to and from the VPN are private. That covers IP addresses, browsing history, traffic destination and data, and DNS queries.
In terms of security technology, ExpressVPN tops the bill with some of the best VPN protocols we’ve ever tested. With an AES-256 cipher, RSA-4096 keys and SHA-512 HMAC authentication, ExpressVPN offers quite possibly the highest level of security of any VPN available. In addition, the company uses Certificate Authorities to prevent man-in-the-middle attacks, and Perfect Forward Secrecy, meaning a new secret key is issued every time you connect, and again every sixty minutes. All in all, it’s all but impossible for an attacker to view your connection information and data transfers while you’re connected to ExpressVPN.
Other features include a network kill switch, which locks out all Internet traffic should you lose connection to the VPN (except local network traffic, so you can still access a network printer); split tunnelling, so some traffic can be passed through the VPN while the rest accesses the Internet normally; IP address masking; unlimited bandwidth; and the ability to connect up to five devices simultaneously. ExpressVPN’s servers also run entirely in RAM, so once the power is removed from the volatile memory, anything stored in it is gone forever; there’s certainly no storing of information about your connection.
One of the main bugbears with VPNs is connection speed. While it’s great to be able to connect and obtain an IP address from another country, the speed on offer is often something of a let-down. However, that’s not the case with ExpressVPN.
Using the Ookla online speed test, we recorded 38.25Mbps download and 13.17Mbps upload while not connected to ExpressVPN. When we connected to a local ExpressVPN server, one based in London Docklands, we managed an excellent 35.64Mbps download with 12.11Mbps upload. We then connected to a server on the other side of the planet, in New Zealand, and were surprised to get a download speed of 26.56Mbps and an upload speed of 11.55Mbps; not much of a loss compared with our original test without the VPN at all.
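If you’d like to reproduce this sort of test from a terminal rather than the Ookla web page, the third-party speedtest-cli tool works well. It’s a community Python client for Ookla’s service, and our suggestion rather than anything ExpressVPN provides:

pip install speedtest-cli
speedtest-cli   # prints ping, download and upload figures; run it with the VPN off, then on, and compare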
Using the ExpressVPN software is extremely easy. Once it’s installed, you can choose from a list of recommended servers, ones closest to your current location for maximum speed, or browse All Locations, which lists all the available servers across the 94 countries. Connecting and disconnecting is done by clicking the power button in the centre of the ExpressVPN app window, and there are also plenty of options to fine-tune your experience, as well as to install the VPN as an extension to your web browsers.
Pricing is slightly higher than some VPNs we’ve reviewed in the past, but as the saying goes, you get what you pay for; in this case, the highest possible level of security and speed. A one-month subscription to ExpressVPN costs around £9.99, while six months comes in at £7.71 per month. There’s currently (at the time of writing) a special deal for 12 months with 3 months free, which works out at just £5.15 per month. All pricing models come with a 30-day money-back guarantee and are charged in US dollars; you can even pay with Bitcoin to heighten your privacy, if you so wish.

ExpressVPN has proved itself to be well above the competition in terms of security and privacy. Its policies are rock solid, and its features are more than enough for any sort of user in need of secure Internet access. The connection speeds are the best we’ve seen from a VPN, and while the pricing may be slightly higher than some of the competition, we think it’s worth every penny. In short, it’s simply the best VPN on the market, and we highly recommend it.
Fix Raspberry Pi File Manager Crashing

Raspbian’s PCMan File Manager app is usually a pretty solid piece of code. Despite continual updates to the core distro, alongside many other updates and upgrades, the file manager has always remained stable. However, if you’ve updated and upgraded recently, you’ll probably have come across an issue whereby opening the file manager results in it closing almost instantly.
The problem lies with the way we’ve traditionally updated and upgraded packages within a Linux distribution. Historically, most of us have used apt-get update and apt-get upgrade, which is fine, but there have been some subtle changes over the years that mean we now need to look at things a little differently.
The apt-get command worked well enough, but Debian has since replaced it with apt. So where you would once enter apt-get update to refresh the list of available packages, followed by apt-get upgrade to install the newer versions, you would now use apt update and apt upgrade.
Apt-get upgrade would install newer versions of all the system’s installed packages, but without installing any new dependencies. Apt upgrade, on the other hand, installs new versions and any new dependencies they require, but won’t do anything that would result in a package being removed from the system.
In turn, the PCMan File Manager and libfm packages were updated, with pcmanfm getting a new set of dependencies. This meant that if you executed an apt-get upgrade, the pcmanfm package was held back while libfm was upgraded.
The result is that pcmanfm and libfm end up out of sync and, as such, no longer compatible. Opening the file manager under these circumstances results in the app closing itself again.
To fix the issue, therefore, you will need in future to update and upgrade your Raspberry Pi by issuing the following commands:
sudo apt update
sudo apt full-upgrade
The full-upgrade option of apt will install any newer versions of the packages, check for new dependencies and upgrade the lot, even if it means a package has to be removed in the process.
Running the above commands will update and upgrade all your packages, and ultimately fix the issue you’re having with the file manager shutting down as soon as you open it. From this point on, it’s best to remove apt-get update/upgrade from your scripts and the like, and opt for the better apt update/full-upgrade options.
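If you want to double-check that the fix has taken, you can list what’s still upgradable and confirm the two packages now match. This is an optional check; the exact libfm package names can vary between Raspbian releases, hence the wildcard:

apt list --upgradable   # pcmanfm should no longer be listed as held back
dpkg -l pcmanfm 'libfm*'   # shows the installed versions of pcmanfm and the libfm libraries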
Is Apple Still Working on a Foldable iPhone?

Stop us if you have heard this before! According to the Apple rumour mill, a forthcoming top-of-the-range iPhone might feature a foldable screen, as seen in current competitors such as the Samsung Galaxy Fold and Huawei Mate Xs, but not until 2024 at the earliest, and perhaps not even with the 2024 refresh.
The forthcoming foldable iPhone is said to be called the iPhone Flip, and will feature a flexible version of the Ceramic Shield display that debuted with the iPhone 12. It’s said to feature a single foldable screen, rather than two individual displays like the ZTE Axon M or the LG G8X ThinQ Dual Screen. Whether the iPhone Flip’s screen folds horizontally or vertically isn’t known for sure, but its name would suggest the latter.
An Apple patent titled ‘Hybrid coverlay/window structure for flexible display applications’ suggests the foldable screen will have four layers: a cover layer, a hardcoat layer, an inner surface and a transparent support layer. The website Economic Daily News claims the screens are already being tested at Foxconn in Taiwan, a key Apple manufacturing supplier. These tests are said to involve opening and closing the phone over 100,000 times, making sure the screen doesn’t crease or distort with continual use. Another leak, this time from supply chain sources in Asia, claims that two prototypes have now passed this strenuous test.
Which way the folding iPhone opens isn’t yet known. The name ‘iPhone Flip’ would suggest a clamshell design with a horizontal fold across the screen, but this is far from final. Rumour has it that one of the two prototypes under consideration has a clamshell form factor, while the other opens like a book, with the fold dividing the screen vertically. A clamshell iPhone would offer a screen of a similar size and dimensions to a regular iPhone, but would fold down to a smaller size to better fit a handbag or pocket. A book-type fold would mean a phone of a similar size to a standard model when folded, but with a bigger screen when open. This is all speculation, of course, but it will be interesting to see which Apple goes for.
It’s rumoured that Samsung – whose folding display for the first-generation Galaxy Fold proved notoriously unreliable – is to make the screens for Apple, and is involved with the prototype tests. This isn’t as unlikely as it sounds, as Samsung often makes displays for other mobile phone manufacturers.
The screen technology for the folding iPhone is likely to be either OLED, as used in the 12-series iPhones, or mini LED, which is rumoured to debut with some of this year’s mobile devices. We assume both are being tested, with a decision on which to use made after their performance has been assessed.
The Android foldable smartphones available today aren’t cheap, and Apple’s rumoured offering is likely no exception. Could it be the first iPhone to top the £2,000 price point? Perhaps we’ll find out in the autumn of 2024, or perhaps not. Which of these rumours and leaks will prove accurate, and which are just idle speculation? As always, we’ll have to wait and see.
Remembering Retro Tech: The Atari 7800

Although the Atari 7800 was announced in 1984, it took three years for the console to reach UK shelves, having gone on sale in the U.S. the year before. This gap was enough to cause Atari some immense headaches, as the Nintendo Entertainment System had already been released and was dominating the living rooms of the gaming market that Atari so desperately wanted to claim.
Our fondest memories of the Atari 7800 are of its version of Galaga and the original 7800 game, Desert Falcon. In our humble opinion, the 7800 had one of the best ports of Galaga, managing to retain the feel, sprites and smooth motion of the arcade version in a way that other systems simply didn’t.
Atari also had the revolutionary concept of making the Atari 7800 backward compatible with the 2600, something modern console makers could do well to improve upon across their current models.
Its history
The Atari 7800 was actually commissioned from a company external to Atari, General Computer Corporation, and was designed throughout 1983 and 1984, ready for a rollout that same year.
The sale of Atari to Tramel Technology Ltd in 1984, though, put a stopper on the launch of the 7800, which incidentally was originally going to be called the 3600. A quick upgrade a couple of years later made the 7800 a better machine, technically speaking, and one that also allowed the aforementioned backward compatibility with the Atari 2600.
It was powered by a customised 6502 processor running at around 1.8MHz, with 4KB of RAM and a 4KB ROM. The cartridges were each around 48KB in size, although rumour has it that some stretched to around 64KB toward the end of the console’s life.
The faster processor, named Atari SALLY, was similar to that found in the Commodore 64 (the 6510 was the direct successor to the 6502), and the rival console of the time, the NES. This meant that faster graphics could be displayed, which brought a more arcade feel to the home.
The end of the 7800, though, was quite swift. Within a few years, production had no choice but to be terminated, as Nintendo was dominating the market with a huge 85% share and had secured a large catalogue of gaming content from the likes of Taito. Although profits were generally good for Atari, the company never did manage to reclaim the home console crown that it had enjoyed so much in its heyday.
The good
Classic Atari console design, backward compatibility with the 2600; Commando and Dig Dug!
The bad
A disappointingly limited number of 7800-specific games; better graphics on the NES (sorry to say, but there you go); a terrible controller.
Conclusion
The 7800 effectively ended Atari’s home console division. The XE did follow, but that didn’t fare well either, dying a relatively quick death.
Such is the way of things, but we do have to thank Atari for bringing us the 7800; without it, we wouldn’t have had Galaga championships at home.
The History of Linux

As we sit in front of the latest version of Linux Mint, Ubuntu or openSUSE, revelling in the glorious animated desktops, taking pleasure in the ease of use the GUI grants and enjoying the fact that 99% of our hardware works perfectly out of the box, do we ever wonder how our favourite operating system got to this point? Do we consider and appreciate the amount of time and effort that a long list of developers has put into reaching this Zen-like state of user and OS? Most likely not.
A quick reminisce about Linux distros long gone made us think one day about the history of this wonderful OS and its journey over the last couple of decades. When was it born? How did it evolve? Which distros stand out in history as the pivotal turning points that changed a humble bedroom project into the desktop OS we have today? And which poor distros fell by the wayside as failed, crumpled heaps?
1991 –
– 0AD to 1991: In the beginning, there was Unix, created by the great bearded ones, Ken Thompson and Dennis Ritchie, in 1969. Throughout the eighties, a number of projects started life, all based on the encompassing vision that is Unix: Richard Stallman’s GNU Project, the Berkeley Software Distribution (BSD), the book ‘Operating Systems: Design and Implementation’ by Professor Andrew S. Tanenbaum, and MINIX (Mini-Unix), which was released to the academic world in conjunction with that book. But it wasn’t until 1991 that a young Finnish student called Linus Torvalds would combine all of the ingredients that made up those landmark systems into a kernel that would take the world by storm.
– 1991: There are many legends that tell of the start of Linux. In one of them, Linus, while playing around in MINIX, piped data to his hard drive instead of his modem and wiped out the MINIX partitions he had created; frustrated by the limitations of the OS, he decided to create his own. Another has it that he wrote the kernel to get better functionality out of the new Intel 386 machine he was using. Another still is that he was barred from further improving MINIX, and so went on to develop his own system. Whatever the real story may be, he successfully created a free terminal emulator that was based on MINIX, which was based on Unix, and that would eventually become the workings of an operating system kernel. On 25th August 1991, Linus posted this famous message on the MINIX newsgroup:
“From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID:
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki
Hello everybody out there using minix –
I’m doing a (free) operating system (just a hobby, won’t be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I’d like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things).
I’ve currently ported bash(1.08) and gcc(1.40), and things seem to work.
This implies that I’ll get something practical within a few months, and
I’d like to know what features most people would want. Any suggestions
are welcome, but I won’t promise I’ll implement them :-)”
After that, FTP servers around the world became abuzz with versions of Linux (originally named ‘Freax’), which grew at an astounding rate due to the number of contributors involved.
– 1991: Version 0.01 of Linux is a far cry from what’s available these days, but if you want to get your hands dirty, point your browser to https://mirrors.edge.kernel.org/pub/linux/kernel/Historic/linux-0.01.tar.gz and download the 71KB kernel in all its glory, along with the release notes from https://mirrors.edge.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.01.
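If you’d rather grab and unpack it from a terminal, the following standard commands will do the trick (assuming wget and tar are available, as they are on virtually every Linux system):

wget https://mirrors.edge.kernel.org/pub/linux/kernel/Historic/linux-0.01.tar.gz
tar xzf linux-0.01.tar.gz   # unpacks the 0.01 kernel source for a nostalgic browse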
Unfortunately, we can’t stay in 1991. Needless to say, though, Linux evolved into a fully-blown OS, with the Manchester Computing Centre creating one of the first distributions to use a combined boot/root disk, named MCC Interim Linux.
1992 to 1994 –
Not much of a time jump, but between 1992 and 1994 we saw the rise of the most influential founders of the modern Linux desktop: Slackware, Red Hat and Debian, along with the Linux kernel growing to version 0.95, the first capable of running the X Window System.
– 1992: Slackware had something of a rocky start, despite being one of the first systems to adopt the ‘new’ Linux kernel of the time. Slackware started life as SLS, the Softlanding Linux System, founded by Peter MacDonald in 1992. SLS was quite ahead of its time, as it was the first Linux distribution to contain not only the 0.99 Linux kernel but also the TCP/IP stack and the X Window System. However, SLS was a buggy beast at best, and it wasn’t long before it was superseded by Patrick Volkerding’s Slackware, which is crowned the longest-running Linux distro.
– 1993: SLS did more than just spawn Slackware. Frustrated by its bugginess, another user found the motivation to go it alone and create a new branch of Linux distribution. In 1993, Ian Murdock went forth and gave birth to a system called ‘The Debian Linux Release’, allegedly named after his then girlfriend Debra Lynn and himself, Ian.
– 1994: As Slackware evolved, other distros began to form, using Slackware as the code base. One such distro that appeared on the scene in 1994 was the ‘Software und System-Entwicklung’, or as it was more commonly known, ‘S.u.S.E Linux’.
– 1994: One final distro that saw the light of day, on 3rd November 1994, was called ‘Red Hat Commercial Linux’, created by Marc Ewing and named after the red hat he wore while at university.
– 1994: On 14th March 1994, Linux 1.0.0 was launched with a whopping 176,250 lines of code under its belt; thus was the start of something wonderful.
1995 to 1999 –
We take quite a leap now, as the next five years saw some of the greatest Linux distributions arise from the ‘big three’, along with some rather notable off-shoots of the Linux family tree, including the infamous penguin attack of 1996. All this Linux history happened amid the dot-com boom. Incredible.
– 1995: Jurix Linux was an interesting distro, notable for a number of reasons: it was allegedly the first distro to include a scriptable installer, allowing an admin-based install to copy the installation process across similar machines; it was one of the first to fully support bootp and NFS; and it was one of the first Linux systems intended to use EXT2. But what really made Jurix an important milestone in Linux history was the fact that it was the base system used for creating the openSUSE Linux that we know and use today.
– 1995: The Red Hat-based branch of Linux OSs was a fertile bunch during this five-year stretch. Notable releases such as Caldera, Mandrake, TurboLinux, Yellow Dog and Red Flag all began life from the sudden big bang of the ever-evolving Linux kernel, which went from version 1.2.0 to 2.2 between 1995 and 2000. In fact, version 2.0, launched in 1996, saw something like 41 releases in the series. It was this fast turnaround of the kernel, and the addition of some very important features, that solidified the Linux operating system as the server OS of choice for IT professionals the world over. Version 2.0, for instance, brought features such as SMP support, better memory management and the ability to run on more types of processor. Version 2.2 heralded improved SMP, support for the PowerPC architecture and read-only support for NTFS.
– 1996: While on holiday in Australia, Linus visited a zoo, where he was bitten by a ferocious penguin. He was then infected with ‘penguinitis’, which makes the victim lie awake at night dreaming of penguins and becoming very fond of them – his words, not ours! Anyway, Linus liked penguins; they are “goofy and fun”, as he commented. As for the name ‘Tux’, again according to Internet fable, this comes from (T)orvalds (U)ni(X). So now you know.
– 1996: Debian-based systems, although not as prolific as their Red Hat counterparts, began to grow, favouring a much less technical, less server-room approach. Being more desktop-orientated, a Debian-based distro was often on the front of the popular magazines of the time, showing off such notable entries as Libranet, Storm, Finnix and Corel Linux.
– 1996: Of course, the most notable happening during these five years was the birth of KDE and Gnome. KDE (Kool Desktop Environment) was founded in 1996 by Matthias Ettrich, a student at the University of Tübingen, who proposed not just a set of working applications but an entire desktop environment for them to work in. No longer did users have to fiddle around in CDE or X11-based environments; now we had Qt! By 1998, KDE version 1.0 was open to the world, and the first distro to use it was Mandrake. By 2000, version 2.0 was out, featuring a greatly improved system with Konqueror, KOffice and KIO networking.
– 1997: Miguel de Icaza and Federico Mena announced the development of a new desktop environment and accompanying applications, based on GTK+ and called Gnome. Interestingly, according to Internet folklore, the first Linux OS to feature Gnome was Red Hat. Gnome fast became a popular desktop environment, being quick, malleable and very friendly for the average user, and by May 2000 Gnome 1.2 ‘Bongo’ was released.
– 1998: Oracle and Sun announced official support for Linux, as the OS became increasingly popular and more system admins started to adopt it in their server rooms.
– 1999: Red Hat went public and achieved the eighth-biggest first-day gain in Wall Street history, further fuelling the rise of Linux.
2000 to 2005 –
The next five years saw an incredible surge of Linux-powered computers hitting the media, with further improvements to the kernel, heaps of new applications and the appearance of the first Live Distro.
– 2000: Knoppix, a friendly Debian-based distro developed by Klaus Knopper, was one of the most popular of its time. It was noteworthy for many reasons, but the main one was the fact that it could boot directly from CD. True, this is something we take for granted these days, or, to be fair, don’t really even consider now that optical media for the PC is drawing to a close; but Knoppix 1.4, released on 30th September 2000, could be inserted into any PC and boot into a fully working Linux, with access to a massive range of hardware and the ability to automatically connect to almost any network available at the time. Knoppix set the bar for other Linux distros to follow, and from its humble beginnings it spawned quite the family tree of Knoppix-based distros, many of which are still with us today.
– 2000: With all these pre-built distros now becoming the flavour of the month, and starting to look vaguely like Microsoft’s OS offerings, a project was started to help get Linux users back in touch with what makes Linux work. Linux From Scratch (LFS) was conceived by Gerard Beekmans, along with a book that gave users instructions on how to build their own Linux system from source.
– 2000: Linux is freedom, and it must be allowed to grow. But to ensure the protection and advancement of Linux, a group had to be formed to keep it independent. So, in 2000, the Open Source Development Labs, forerunner of today’s Linux Foundation, was formed to sponsor the work of Linus and the developer community in making and improving Linux, but also to defend it and keep it within the core values of freedom, collaboration and education. A bit like The Avengers, but without the tight-fitting suits.
– 2001: A pivotal moment for the Linux kernel came with version 2.4, released on 4th January. Version 2.4 contained support for USB, PC Cards and ISA Plug and Play, and went on to include Bluetooth, RAID and EXT3. In fact, 2.4.x was the longest-supported kernel series, ending with 2.4.37.11 in 2011, and demonstrated just how versatile and powerful the Linux kernel had become since the early days of 1.0.
– 2002: Red Hat, having now enjoyed some time on the stock market, decided that although it made some money supporting its free Red Hat Linux OS, the time had come to adopt a more commercial, business-like approach. From this came a two-way split: Red Hat Enterprise Linux 2.1, with kernel 2.4.9, more stability and long-term support for the enterprise user; and Fedora Core as the community distribution. Interestingly, with RHEL being open source, Red Hat makes the source code freely available on its FTP servers, and several groups downloaded and compiled it into their own distros (chiefly removing the Red Hat branding and repositories). CentOS, Oracle Linux, CERN and Scientific Linux are notable examples: all the goodness of a well-built distro, but without access to the mighty hat’s expert knowledge and software.
– 2002: December 2002 saw the release of a notable distro, CRUX. With special emphasis on the ‘keep it simple’ theme that had become popular at the time, CRUX was extremely lightweight and focused on the developer rather than the end user. At a time when Linux distros were growing exponentially and vying for the position of replacement to Windows, CRUX took a different approach and thinned itself down to the bone, becoming a welcome minimalist distro. What’s most notable about CRUX, though, is that it was the inspiration and base for the incredibly popular Arch Linux.
– 2003: With kernel 2.4 doing so well, version 2.6 was announced on 18th December. With it came support for PAE, new CPUs, improved 64-bit support, 16TB file system sizes and, in time, EXT4, plus much more.
– 2004: Though the Linux distro was now approaching an almost Zen-like harmony between user and PC, it was still seen as distant by those who preferred the flavourings of Microsoft. A new philosophy was needed: something that would make Linux more personal, more human; something Ubuntu. Based on Debian, Ubuntu’s aim was to create an easy-to-use Linux desktop that could be kept up to date by an end user with very little Linux experience. With the release of Ubuntu 4.10, the Warty Warthog, on 20th October 2004, this dream was realised. There’s little that can be said regarding the rise of Ubuntu; its popularity grew to such a point that it, and the rest of the Ubuntu family tree, became some of the most well-known Linux distributions in the world.
2006 to 2012 –
– 2006: Of the many differing distros launched from 2006 onwards, one became the most popular Linux distro of recent times. Linux Mint 1.0, ‘Ada’, was released in 2006. A heady mixture of FOSS and proprietary software, this ‘works-out-of-the-box’ distro briefly followed the Ubuntu base until 2007, when it started to use its own code base. Linux Mint has adapted itself to embrace and offer the newest technologies, while still keeping an ear to the ground and listening to its users; hence the huge support for this great distro.
– 2007/8: KDE 4 was released and met with some criticism due to its lack of stability, with Linus himself calling KDE 4.0 a “break everything” and “half-baked” release. However, users began to enjoy the Plasma desktop and its cutting-edge look and feel, so that by the time KDE 4.2 was released in 2009, everyone had forgotten about the terrible experience they’d had previously. What a fickle bunch we users are.
– 2008: The 23rd September saw the release of the most popular Linux-based operating system ever, although 90% of its users have no idea that it’s Linux-based at all. That OS is Android. Version 1.0 launched with the HTC Dream and could achieve everything you’d expect from a modern smartphone, but it was buggy. Version 1.1 fixed most of the bugs, but it wasn’t until version 1.5, ‘Cupcake’, that Android really started to get interesting and pave the way for smartphones to take over the world.
– 2011: Ubuntu had gone from strength to strength during this time. It was regularly at the top of the Linux user charts, it had a huge fan base and it was easy to use. Then, one sunny April day, the fourteenth release of Ubuntu came about with a slightly different look: Unity. Apart from KDE 4 and Gnome 3, never has such venom been spat at a desktop interface as at Unity. It’s safe to say that nearly everyone at the time hated it, and many still do (despite regular updates, until Canonical recently abandoned it). Ubuntu fell from favour and never really regained the popularity it enjoyed so much in its early years.
– 2011: After some years with the 2.6.x kernel, version 3.0 was finally released with the following release note: “NOTHING. Absolutely nothing.” Those were Linus’s words. Indeed, with the kernel numbers getting too high and the 2.6.x notation getting out of hand, Linus decided that a new number was called for. Version 3.0; there you have it.
– 2011: After the debacle of KDE 4 some years earlier, you’d think that those who created desktop environments would have learned what the public liked. This obviously hadn’t reached the ears of the Gnome team, who in April of this year launched Gnome 3.0. Like lemmings, Linux users ran towards the cliff and threw themselves off, in favour of KDE or earlier versions of Gnome; such was its effect on the community. The damage was done, and Gnome is still paying for it, with the likes of Linux Mint offering users alternative desktop environments in the form of MATE and Cinnamon. To be fair, though, Gnome has managed to claw back its user base and now provides a modern, flashy, yet stable and clean environment.
2012 – Present Day
Distributions come and go, and the last six years have seen some past favourites laid to rest, while others have sprung up in their place. The highest-ranking distro of the last six years has been Linux Mint, and it’s easy to see why. The distro’s combination of speed, stability and ease of use has made it extremely popular, not just with Windows refugees but also with more experienced Linux users.
Ubuntu has managed to remain reasonably steady. Having dropped Unity last year and adopted GNOME (version 3.30 in the Ubuntu 18.10 release), the OS is starting to stabilise.
Other distros are enjoying popularity, too. Elementary, with its Pantheon desktop environment; Manjaro, an Arch-based distro offering GNOME, KDE and Xfce desktops; and the Debian-based MX Linux, with a fast Xfce desktop, are all excellent additions, and all can claim their heritage from the above list of ancestral operating systems. What next? Who knows; perhaps in the next twenty-five years we’ll celebrate the distro at the end of the universe.
Retro Linux distros
Should you wish to try out any of these distros, then take a look at the following:
For SLS, pay a visit to http://www.ibiblio.org/pub/historic-linux/distributions/sls-1.03/ and download version 1.03, which contains kernel 0.99 alpha and XFree86 1.3.
For Slackware 1.1.2, go to http://www.ibiblio.org/pub/historic-linux/distributions/slackware-1.1.2/.
Red Hat 1.0 “Mother’s Day” can be found at http://www.ibiblio.org/pub/historic-linux/distributions/redhat-mothers-day-1.0/.
For Debian 0.91, as released in January 1994, go to http://www.ibiblio.org/pub/historic-linux/distributions/debian-0.91/.
Linux Mint 1.0 can be found at the main Linux Mint site: https://www.linuxmint.com/edition.php?id=3
Solving Your Digital Camera’s Storage Problem!

Premium cards versus budget cards
The popularity of SD cards has led to a large number of budget brands springing up, and many supermarkets and chain stores sell own-brand cards at often very low prices. However, the best advice is to stick to the premium brands, such as SanDisk, Lexar, Pretec, PNY or Kingston, or to camera brands such as Fujifilm or Panasonic. Although they may be more expensive, the higher standards of quality control mean that premium cards are usually much more reliable. If you’ve got a high-performance camera, it’s also worth spending a bit extra for faster data transfer rates to get the best out of it.
How many pictures can I take?
The total number of pictures that can be stored on a memory card is a difficult thing to quantify, for a couple of reasons. Digital cameras usually store images using the JPEG file format, which compresses image data to save storage space. Most cameras have a menu setting for image quality that varies the rate of compression, with higher-quality images taking up more space. The compressed size of the image can also vary depending on the subject being shot, since more detailed images contain more data. For a typical modern 16-megapixel digital camera, the file size can vary between about 4.5MB for a good-quality JPEG and 30MB for an uncompressed Raw file, which means an average 8GB card will be enough for approximately 1,400 JPEGs or 260 Raw files.
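If you want a rough figure for your own card and camera, the sum is simply usable capacity divided by average file size. Here’s a quick one-liner using awk, with a nominal 8,000MB of space and the file sizes above plugged in; real-world counts will be a little lower once formatting overhead and file-size variation are accounted for, which is why the estimates above are on the safe side:

awk 'BEGIN { printf "%d jpegs or %d Raw files\n", 8000/4.5, 8000/30 }'   # prints: 1777 jpegs or 266 Raw files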
Five Great Tips that Will Change the Way You Use your iPhone!

1: Hold the Space Bar
Most people already know that when typing on your iPhone or iPad, you can double-tap the space bar to type a full stop. But did you know you can also use it to drag the cursor? Tap and hold the space bar, and the letters on the virtual keyboard disappear. You can then drag your finger around the keyboard to move the cursor with precision. It’s great for correcting errors in what you’ve typed.
2: Shake to Undo
If you make a mistake on your iPhone or iPad, you can undo it by simply shaking your mobile device, though you have to set this up first. Open the Settings app and tap Accessibility, then on the next screen tap Touch. There’s an option called ‘Shake to Undo’; make sure this is switched on. Now, when typing, you can shake your device to undo the last thing you typed.
3: Hide Private Photos
If you have pictures in your Photos app that you’d rather other people didn’t see, there’s an easy way to hide them. In the Photos app, tap Select (top-right corner) and then tap the photos you wish to hide. Tap the Share icon and, from the menu that pops up, select the Hide option. Those pictures are no longer visible in your camera roll, but can be found in an album called Hidden, in the Utilities section at the foot of your Albums screen. To hide the Hidden album itself, go to Settings > Photos and switch the Hidden option off; the pictures are then impossible to find in the Photos app, and to get them back you have to return to Settings > Photos and turn the Hidden option on again. To unhide hidden pictures, select them, tap the Share icon and select Unhide.
4: Use the Camera Flash for Notifications
You can set your iPhone or iPad to blink the camera flash when a notification arrives. To do this, open the Settings app and tap the Accessibility option. In the Audio/Visual section, look for ‘LED Flash for Alerts’ and turn it on; the camera flash on the back of your device will now blink with each notification. When this switch is on, you can also choose whether or not you get a flash when the phone or tablet is set to silent.
5: The Magnifier App
You can use your mobile device as an awesome electronic magnifying glass. First, open the Settings app and tap the Accessibility option. In the Vision section, tap Magnifier and turn it on. A new app, called Magnifier, now appears on the last page of your Home screen; open it and you can use your device as a magnifying glass, replete with a zoom slider and filter options.