Razorback


The Coexistence of Old and New Technology

February 6, 2022 at 2:05 PM
Category: Hardware

This world is not what I used to know it as. Even though computers are becoming so powerful that they could very well last forever in everyday usage, somehow, new methods are being developed to accelerate the obsolescence of five-year-old products. See that new, exciting, revolutionary iteration of some brick telephone? Forget it, it'll be trash in a while. On top of that, it's as if absolutely everything has to be so phone-centric now. Want to do this thing? Download an "app" that takes up precious space on your telephone! Do you need to get tested or schedule an appointment? Dude bro, just push fake buttons on a touch screen!

It's as if one day, future generations will not have any concept of non-capacitive buttons. If that's going to be the case, though, they'll probably never know what it was like to compute in an environment that wasn't outright sandboxed, either. It really doesn't have to end up this way. In fact, for a while, it didn't seem like it was headed in that direction. There was a brief time around the late 2000's when the trendiest techno bling coexisted peacefully with the tried and true solutions of yesterday. It was beautiful; there was something for everyone.

I guess somewhere along the way, a larger collective decided that we've gotta start throwing out so many things at once practically overnight because oh shit, we still don't have hoverboards or flying cars! I didn't even notice how much had been lost until some years into the following decade - much of it could've easily lived alongside what we're left with now.

CRTs and LCDs

Two 4K IPS monitors next to each other

This might not be the best example to start off with. In many ways, LCD monitors open up a lot more possible display configurations; if you want to stack visible screens vertically, all you need is some VESA mounting equipment. To achieve the same result with CRTs, you'd need a much bulkier, backbreaking setup. I can set up two 32" 4K IPS monitors on a fairly small desk and still have plenty of room for the keyboard, mouse, and whatever else I might want to throw on there. (Yes, I do make good use of the resolution... with 4K, one can review many, many lines of code at once, especially in a vertical orientation!)
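
If you want a rough idea of what that looks like in numbers, here's a quick back-of-the-envelope sketch in Python. The line height and editor chrome figures are made-up assumptions, not measurements of my actual setup:

    # Rough arithmetic behind the "many, many lines" claim. The assumed font
    # metrics are illustrative guesses, not measurements.
    panel_height_px = 3840   # a 4K panel rotated into portrait orientation
    line_height_px = 20      # e.g. a ~15px monospace font plus line spacing
    chrome_px = 120          # tab bar, status bar, and other editor chrome

    visible_lines = (panel_height_px - chrome_px) // line_height_px
    print(f"~{visible_lines} lines of code visible at once")  # ~186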

Orca showing up on monitor in CGA mode

That being said, a good number of things were lost in the transition from CRTs to LCDs. Perhaps the most notable of these is resolution scaling, which only grows more problematic with higher resolutions. On LCD monitors, an image can only look crispy when the monitor's native resolution is set. Anything lower, and you'll see a lot of blurring around the edges of pixels. Meanwhile, you can take a 21" CRT monitor and it will look super crispy at any resolution, whether it be 640x480 or 1600x1200. Considering how demanding the more bloated games are with 150GB worth of textures, dropping to a lower resolution can matter greatly if your machine is underpowered, especially given the ongoing GPU scalping crisis. If lowering the resolution still only looks great on CRT monitors... maybe that's all the justification needed to start manufacturing them again?
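
To put some numbers on the scaling problem, here's a small Python sketch. It's a simplification of how panels and GPUs actually scale, but the gist is that a lower resolution only has a chance of staying sharp on an LCD when every source pixel maps onto a whole number of physical pixels; anything fractional forces interpolation, and that's where the blur comes from:

    # Check whether a lower resolution maps cleanly onto an LCD's native pixel
    # grid. Fractional or mismatched factors force interpolation (blur), while
    # matching integer factors could at least be scaled pixel-perfect. A CRT
    # doesn't care, since the electron beam just draws whatever it's handed.
    def scale_factors(native, target):
        """Return the horizontal and vertical scaling factors."""
        return native[0] / target[0], native[1] / target[1]

    native = (3840, 2160)  # a 4K panel

    for target in [(1920, 1080), (2560, 1440), (1280, 1024), (640, 480)]:
        fx, fy = scale_factors(native, target)
        clean = fx == fy and fx.is_integer()
        verdict = "integer match, can stay sharp" if clean else "fractional/mismatched, blurs"
        print(f"{target[0]}x{target[1]}: {fx:.2f}x by {fy:.2f}x ({verdict})")

Even the clean 1920x1080 case only works out if the monitor or GPU driver actually offers integer scaling, and plenty of them don't bother.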

The later CRTs from the late 90's to the 2000's were also very capable of high refresh rates. In CRTs, getting high refresh rates was easy and, in many ways, a necessity; 60Hz isn't really enough for them, as some users may notice the slight flickering that a jump to 75Hz or 85Hz could fix. Only in recent years have LCDs really started to catch up on refresh rates, to say nothing of the bright colors some of the higher quality CRTs offered. Not that going beyond 60Hz is as necessary on an LCD as it is on a CRT to prevent headaches, but it just makes you wonder - maybe if CRTs were given a few extra years, they could've evolved into even greater superpowers backed by decades of a maturing technology.

When we moved, I made the grave mistake of not cherishing the four CRT monitors I had on hand. I gave every single one of them up in 2012, and it wasn't until 2014 that I figured out I messed up by doing that; there is no way some glossy 16:9 LED panel would match up with a 486 desktop chassis from 1992. Since then, I've been accumulating CRT monitors little by little and making certain a lot of them get good use, starting with a ViewSonic E70f.

I didn't really bring up much about televisions here, but those could also continue to be useful as long as they still work; before we moved, we still had plenty of CRT TVs around the old house. Even though they were tough to lift, they did their job well until they too were abandoned. A shame, really; one of them had component input, which would've probably made it more valuable these days.

SSDs and Floppy Disks

This might be more of a novelty than anything else, but it's really interesting to point out that at the same time SSDs were starting to take off, you could still buy brand new motherboards with onboard floppy controllers, floppy drives, and packs of floppy disks as late as 2009. No joke, I did that myself; I bought a pack of 30 floppy disks from a brick and mortar computer store at that time. There was no way I was going to use them for storing critical files (the standard never moved beyond 1.44MB even when 2.88MB was available for a time), but I sure did make use of them in various applications, particularly those involving old computers.

Come Christmas 2010 when I got a bunch of parts for building a brand new Core i7 machine with Windows 7, a floppy drive was included in the mix, as my uncle knew I wanted to use such a thing. But... the motherboard had no onboard floppy controller. We hoped maybe we could just find some PCI floppy controller like we could a serial/parallel controller and call it a day, but it seems there was no such thing. The floppy disk was on its way out, and I think one of the only reasons why it survived the late 2000's in the consumer field was because Windows XP insisted you could only load a new storage controller driver from a floppy disk unless you pulled off some slipstreaming wizardry.

I did eventually get a USB floppy drive in the hopes it would answer my needs, but... it sucked ass. It was very unreliable, constantly failing its operations to the point where I had to keep replugging it in the hopes of eventually getting some disk image loaded. I later opted to only use real floppy drives connected to onboard controllers on separate computers instead.

I was slow to adopt SSDs largely due to their excessive cost compared to the capacity they offered at the time. It wouldn't be until somewhere under three years later that I would build myself a hex-core workstation with a SATA SSD installed, and another five years before I made the jump to NVMe. All the while, I've continued to situationally use floppy disks, and still do.

For the average user, floppy drives are something one would be better off without due to their slowness and extremely limited capacity, but there are some niche cases where they are still useful. In rare cases, particularly involving old nuclear-related applications, it would be for the greater good that they continue to be used. Nobody wants a modernized internet-equipped nuclear system of some kind even if they think they do; what could happen if a hacker were to break into one, or a serious software bug occurred? Manage nuclear energy the smart way, log off!

VHS and Blu-Ray

Gotta be real with ya here... I have NEVER found a practical use for Blu-Ray, not even for data storage. Even when I had a Blu-Ray drive, I only ever used it with CDs and DVDs. I hardly ever watch feature films, and the last time I was ever in a movie theater was in 2016 (following another seven-year gap at that). Whenever I wanted to move large volumes of data between computers without networking, I always used USB flash drives. They're compact, reusable, and devoid of moving parts. Even now, I see burning CDs or DVDs as more of a chore than anything else.

VHS tape being inserted into a VCR

VHS tapes, on the other hand, I have found to be useful for a number of things. I didn't record TV programs on them since I had completely lost interest in television at that point, but the practical functions of VHS were abundant enough that they compelled me to buy a package of four brand new blank VHS tapes from Best Buy in early 2009, I believe. It provided a means of recording gameplay footage at a convenient location before transferring it to my computer, though I wonder if I should've just gotten myself double-ended composite cables for my Canopus ADVC-110 instead.

Though I don't remember much of what I recorded to my tapes, considering the pack I had seems to have gone missing, I know I had come up with a way to export videos on my computer to a VHS tape. First, I'd connect my MiniDV camcorder via FireWire to one of the Macs I used, tell iMovie or Final Cut to print the video to the MiniDV tape starting at some empty spot, and it would automatically stop the tape when the video was finished copying to it. From there, it would just be a matter of plugging the camcorder into some VCR's composite input and recording the footage to tape. It's kind of elaborate and there are better ways of doing it, but that was the first method I established.

I still use a VCR/DVD combo player from 2004 from time to time whenever I get ahold of someone who wants their video tapes digitized. While I do it for cheap, you're shit out of luck if you want me to convert your tapes to MP4 files. I only deal my service locally under the counter!

For those who need to use VHS and Blu-Ray, it just so happens that there are Blu-Ray/VHS combo boxes out there. The VHS part probably sucks, though, as I think by 2006 most VCR manufacturing was handled solely by Funai, which is reportedly a garbage brand. The last commercial VHS tapes produced were likely from around 2007, at least here in the United States.

Analog TV and HDTV

Now here's where things start to get kind of shaky. Up until this point, all the obsolete technologies I mentioned are still 100% usable, so long as the devices themselves are intact. Old school analog over-the-air television is no longer usable in most of the world. The only way you can use that kind of TV now is to feed the receiver some external cable-driven signal, or whatever it is they asked everyone to switch to.

I do not have much personal experience with analog television. There were only two instances where I really used it. One time, I fired up some sort of pocket TV stashed away behind the bar downstairs. It wasn't something that I was hooked on, but it was cool to know there was such a thing as a hand-sized TV. It may have been one of those "radio TVs" given the kind of tuner it had. I think it was in either 2005 or 2006 that I used that thing for a bit, and it was receiving some sort of news or weather broadcast. More recent attempts to use the device have turned up nothing but static.

My more bizarre experience with analog TV was in early 2009, some months before the bulk of it was shut down in the United States. I did not intend to watch any such broadcasts, let alone anything coming off of cable TV, hence there was no coax cable coming into my bedroom from what I recall... yet somehow, the Vizio TV I was using expressly with game consoles picked up some fuzzy TV signal. It had to be an old analog signal coming from the airwaves. I don't know what caused it, but maybe I should explain:

The holiday before this happened, I had gotten myself a piss yellow SNES with only a couple of cartridges. It didn't come with a composite cable, so I had to connect it either to my TV or VHS/DVD combo box through coax. I don't think the SNES had any part in it, but when I shut it off, I started receiving some odd signal that appeared to be some kind of TV show. I asked about this earlier, and I was told about this thing called "the VCR effect" where VCRs were supposedly known to amplify over-the-air signals. If that was the case, this must have taken place at one point when I was recording footage from my SNES to a VHS tape, or more specifically after I shut the console and/or VCR off. It's hard to describe because I have no footage documenting this thing actually happening, but it shocked me to some degree. I guess now we'll never know.

In the United States, there have been some exceptions to the mandate to shut down analog over-the-air stations, at least until last year. Low-power stations covering a short range were allowed to continue operating normally, but they existed sparsely across the country. I think someone I know brought up being able to watch such an over-the-air station during that grace period, unless it was just a digital receiver.

While antenna TV lives on through digital signals, which I imagine are more efficient than pure analog waves, one has to question the ethics of shutting down a long-established TV signal. What about emergencies, if cable TV were to go out during a storm? Not sure if that ever happened, but either way, outlawing analog signals just to make room for speeding up video streaming to iPhones or whatever is not something I would call wise.

Seems 4G and 5G aren't enough, either; now, everyone's talking about shutting down shortwave radio. Some countries have ended up following through with the idea. While its quality wasn't as good as AM or FM radio, the ability to tune into foreign stations from almost anywhere proved to be an important diplomatic function, and now that it's being phased out, um... yeah, yeah, you and your "oh but there's the digital converter box oh we can do the satellite now", what does that mean? An antenna is a simple, tried and true rod of useful signal receivings; does everyone suddenly have such an aversion to it now? If that's not enough, some countries are shutting down FM radio of all things... you know, what was used to receive all sorts of tunes in high fidelity during a road trip?!

This might sound insane to you, but I have an alarm clock. Not my computer running a program, but an actual alarm clock I thrifted for $2. Except when I have to reset it following a power outage, it has never failed to tell me the time on a large, beautiful, red 7-segment LED display. Sometimes, I need to set an alarm to wake me up at a certain time, and I have it tuned to an AM radio station - more pleasing than a simple buzzer. I have a feeling that in order to make 6G, 7G, or whatever happen, they're gonna go after AM radio, and it's not gonna be pretty.

But now I'm also hearing talks about countries wanting to shut down 2G services... so now phone frequency bands are cannibalizing each other? This is getting way out of hand. You may as well run a bulldozer through a rainforest while you're at it. If whoever's in charge of this really wants to shut down a band, make it 3G. As I understand it, 3G devices can fall back to 2G, so while they'll connect slower, nobody will be taken offline. But if we're talking about phone bands being shut down now, all these sacrifices of analog TV and shortwave radio are going to be for nothing.

Landlines and Cellular Phones

Telephone handset next to a keyboard

Now we're getting somewhere! A cell phone in the early 2000's was much like a laptop in the 90's; it wasn't as robust as landline, but it was there for those who truly needed it. As cellular technology developed, such phones became a highly viable alternative to payphones, largely because you could take one with you anywhere and place or receive calls wherever there was coverage. It was for those who were regularly on the go, those who sometimes needed to be reached directly whenever they were away from home.

Of course, since cellular reception was really shaky, it was only natural that using a cell phone would be less desirable than landline whenever the latter was available. Landline has always offered crispier audio quality, and in many cases it's proven far more reliable in emergencies for a number of reasons: for one, should the emergency occur at home, your address is automatically sent to the operator even if you are unable to say much in the moment. Furthermore, in the event of a power outage, landlines can keep supplying their own power for quite a while (reportedly around two weeks), so if you were to use a simple handset powered by the phone line, you could still call anyone - very important in the event a natural disaster occurs!

But don't cell phones also have batteries? Sure, if coverage is still available, you could place calls without power at your house, but mind that a cell phone's battery lasts nowhere near as long as the backup power behind a copper phone line, unless one intelligently manages their phone's battery life by turning it off when it isn't needed. Regardless, without that kind of management, one full charge is certainly not long enough to outlast a very real blackout that happened a couple decades back. But the landlines still had power! When people's homes caught fire, reportedly due to the usage of candles going wrong, they could call 911 with phones that didn't require a separate power supply.

Imagine if that blackout happened today; millions of us would get fucked over a lot harder. Seriously, if you run into a situation where you desperately need to call someone, your phone's battery is running low, and you don't have power or any backup batteries holding a charge, you're gonna be left with no choice but to pray. But I'm not trying to downplay the usefulness of cell phones; one helped me get out of a nasty situation. Why not have both? Relying on cellular telephones exclusively is just creating a single point of failure.

US Robotics modem

Landline also became highly versatile in its later years. If you needed to instantly transmit a paper document to a remote location, a fax machine could use a phone line to get the job done. Similarly, data could be exchanged between computers with modems, back when "modem" only meant a device that modulates and demodulates signals over telephone lines. Sure, even at its peak, dial-up internet was pathetically slow (part of the reason why Razorback is designed to be lightweight), but look at it this way: with any conventional phone line, you could connect to a corporate network or the internet from practically any location. Public Wi-Fi has taken its place these days, but still... a century-old system that initially required you to speak with a human operator to reach someone turned out to be capable of doing THAT. How cool is that?
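
For a rough sense of just how slow, here's a quick back-of-the-envelope sketch in Python; the page sizes are made-up illustrative figures, not measurements of any real site:

    # Back-of-the-envelope transfer times over a dial-up link. The page sizes
    # below are illustrative guesses, not measurements of any particular site.
    def dialup_seconds(size_kib, link_kbps=56, efficiency=0.8):
        """Rough transfer time: payload size in KiB, nominal modem speed in
        kilobits per second, and a fudge factor for protocol overhead and
        line quality."""
        bits = size_kib * 1024 * 8
        return bits / (link_kbps * 1000 * efficiency)

    for label, size_kib in [("lean, text-heavy page", 50),
                            ("image-heavy page", 500),
                            ("script-bloated page", 5000)]:
        print(f"{label}: {size_kib} KiB, ~{dialup_seconds(size_kib):.0f} seconds")

The lean page comes in at around ten seconds; the bloated one takes the better part of fifteen minutes. That's roughly why this site tries hard to stay in the first category.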

I very much remember wanting to have my own cell phone as a little kid, particularly because it was this new and exciting technology that could let me place calls on the go. I didn't get one until I was 13, but even though the rest of my family had cell phones, landline was still very dependable for all of us. At the end of 2006, we updated our Panasonic answering machine to a new one that supported multiple cordless handsets connected to a single base, allowing us to place calls from anywhere in the basement. No need for a convoluted setup of multiple leftover answering machines each connected to different phone lines. That system was well-featured; it even allowed us to place intercom calls between handsets without having to go through a telephone service. It was awesome.

Yet when I finally got a cell phone, I slowly started to notice something off-putting about these things. Oh, so if you put your phone number in somewhere, someone might just charge your phone bill even if you say not to? And don't even get me started on texting... the cell phone I got was a leftover from before, some Sony Ericsson from 2006 with only a numeric keypad and some other buttons. It was NOT fun to type on. Since all of us were using cell phones now, I hadn't even noticed that we had quietly forfeited landline completely.

I later upgraded to another cheapo phone with a full keyboard (all actual buttons), which was better, but even then I never thought so highly of texting. What's wrong with just placing a phone call? I could easily get more across in a one-minute call than I could in some text message. Plus, texting is lifeless. It takes away the ability to convey raw emotion unless you have a way with words. Maybe that's why more phones started adopting the ability to press a button five times to convince others you're really laughing.

Isn't it something, how much more frequently people burn through phones now than they used to? It could take many years for us to move from one phone to the next back in the landline days, especially when an answering machine was part of the equation. Whenever we did, there was a clear-cut advancement in function, as was the case with the aforementioned answering machine and its multiple handsets. Compare that to the three cell phones I burned through in three years. Two of them broke by my own hand, I'll admit: one got shorted in the washing machine because I forgot to take it out of my pocket, and the other, well... the phone itself was not broken, but I guess the SIM card has a self-destruct feature that triggers if you attempt to access a certain settings menu or whatever it was.

"Oh, let's just get a new SIM card for $10 or whatever..." you know what, no. I had enough with cell phones. I decided to sign up for a VoIP service after getting a second-hand Cisco business phone. VoIP doesn't offer all the features of traditional copper landlines, but it is what I have settled on. Yes, when I was only 16, I was so pissed off with cell phones that I personally registered for my own landline-like phone plan. I only pay as much as a Little Caesars® Hot-N-Ready® pepperoni pizza for the service, and in all the years I've been using it, I have NEVER had a single issue with it. Ever. I can transmit my voice to others with clarity equivalent to landline, and I can press real buttons on my phone and get the status from a sharp, crispy monochrome backlit LCD. It's fucking magical. I probably would still be using a cell phone, perhaps alongside my current service, if not for such things being so locked down, expensive, and sketchy. I don't truly need a cell phone, so what gives.

Still, I envision going back to pure copper landline whenever I can buy myself my own house (hence, gotta learn more programming)... that is, assuming it will even still be a thing. I'm worried it might not be by the end of this decade.

Desktop Computers and...

Desktop computers are alive and well, but a couple months ago I was in a discussion about how that might not be the case in the future. I don't want to believe it myself, but there is a possibility that maybe in the following decade or so, there won't be such a thing as a "computer" with a local CPU practical enough to perform even the most basic functions. Why? Cloud cloud cloud cloud cloud cloud cloud cloud! Everything's gotta be in the "cloud", because that supposedly makes everything better and the future and stuff.

Basically, this dystopian vision would consist of nothing but thin clients and iPads connecting to one giant megacomputer that streams a sandboxed environment to the end user. They will say that this will eliminate the problems of tight storage space (which was intentionally manufactured) and limited local CPU power for rendering your projects... no it won't. You will have to lease your storage and CPU cores month to month at a high rate, and it's only a matter of time before malicious actors start stealing collectivized CPU/GPU resources for their own gains, whether it be crypto mining or just causing pain. That's already happening on a smaller scale, so imagine how bad it would get if so many of the world's computers were consolidated into a handful of remote sites.

If the tech industry actually leans towards this, local desktop machines won't necessarily disappear outright, but if only a few manufacturers keep selling traditional computers to ordinary consumers, they're bound to get mega expensive and hard to find (because they will definitely be scalped). By then, either I'm gonna become a billionaire or I will be killed in a burglary for all the computers I have. Or, I will go to prison because it'll become illegal to own a traditional desktop computer, because only criminals use those!

As I think about it more, this could actually happen. CPU manufacturers may end up determining it is more economical to put all their development into only the highest-end server CPUs, leaving only table scraps' worth of local computing power to end users - just enough to boot up and log on to the "greater" supercomputer, where you are entirely at the mercy of a company.

Yeah, in some sense, I do use a """"""cloud"""""" service, since that's what they market themselves as, but please call it a VPS. So far they haven't let me down; they've been very reliable, and I'm closing in on 365 days of uptime on this server I've been leasing for cheap. Still, if I could work out the logistics of hosting this website and some others in-house with the kind of upstream speed I'd expect, I would ditch this service in a heartbeat, even if the long-term costs would be higher. Ugh. Just thinking about the word "cloud" applied to technology makes me puke. What does it even mean? It's such a vague and pretentious term.

It's inevitable that new technological innovations will come forward. Much like the conflict between the environment and developments in human civilization, new technology must learn to coexist with old, well-established standards instead of demolishing them. If planned obsolescence continues to be a crack addict that can't stop shitting out electronic waste, everyone will be paying the price for incrementalist "innovation".


Comments

Marble64 - June 18, 2023 at 04:05 PM

Yeah. today, I got this Trimline phone from an antique store for 12$ and it may get some phone service running some time soon. :bugfix_delight:

