Overhead photo of a Tandon TM100-1 floppy drive and a 5.25" floppy

How To Revive A Tandon Floppy Drive

In this episode of [Adrian’s Digital Basement], we dive into the world of retro computing with a focus on diagnosing and repairing an old full-height 5.25-inch floppy drive from an IBM 5150 system. Although mechanically sound, the drive has trouble reading disks, and Adrian quickly sets out to fix the issue. Using a Greaseweazle, a versatile open-source tool for floppy disk diagnostics, he tests the drive’s components and explores whether the fault lies with the read/write head or with the drive’s electronics.
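For anyone following along at home, a drive checkout like this typically starts with a few basic Greaseweazle commands. Below is a minimal Python sketch that shells out to the standard gw command-line tool; the subcommands shown (info, rpm, read) exist, but exact flags and output vary by version, so treat the invocations as illustrative rather than gospel.

```python
# Minimal sketch of a Greaseweazle-based drive checkout, assuming the
# standard `gw` CLI is installed and on the PATH. Subcommand flags and
# output formats vary by version -- these calls are illustrative only.
import subprocess

def gw(*args: str) -> str:
    result = subprocess.run(["gw", *args], capture_output=True, text=True)
    result.check_returncode()            # bail out if the tool reports an error
    return result.stdout

print(gw("info"))                        # confirm the Greaseweazle is detected
print(gw("rpm"))                         # spindle speed: ~300 RPM for a 5.25" drive
gw("read", "suspect_drive.scp")          # dump raw flux to a file for inspection
```

Measuring the spindle speed and grabbing a raw flux image first lets you separate mechanical problems from read-head or electronics faults before you ever reach for the soldering iron.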

The repair process provides fascinating insights into the Tandon TM100-1 floppy drive, a key player in vintage computing. Adrian explains how the drive was designed as a single-sided unit, yet hints at potential double-sided capability due to its circuit board, raising possibilities for future tweaks. Throughout the video, Adrian shares handy tips on proper mechanical maintenance, such as keeping lubrication in check and setting correct spring tension. His attention to detail, especially on termination resistors, provides vital knowledge for anyone looking to understand or restore these old drives.

For fans of retro tech, this episode is a must-watch! Adrian makes complex repairs accessible, sharing both technical know-how and nostalgic appreciation. For those interested in similar hacks, past projects like the Greaseweazle tool itself or Amiga system repairs are worth exploring. To see Adrian in action and catch all the repair details, check out the full video.

Continue reading “How To Revive A Tandon Floppy Drive”

IBM’s 1969 Educational Computing

IBM got their PCs and PS/2 computers into schools in the 1980s and 1990s. We fondly remember educational games like Super Solvers: Treasure Mountain. However, IBM had been trying to get into the educational market long before the PC. In 1969, the IBM Schools Computer System Unit was developed. Though it never reached commercial release, ten were made, and they were deployed to pilot schools. One remained in use for almost a decade! And now, there’s a new one — well, a replica of IBM’s experimental school computer by [Menadue], at least. You can check it out in the video below.

The internals were based somewhat on the IBM System/360’s technology. Interestingly, it used a touch-sensitive keypad instead of a traditional keyboard. From what we’ve read, it seems this system had a lot of firsts: the first system to use a domestic TV as an output device, the first system to use a cassette deck as a storage medium, and the first purpose-built educational computer. It was developed at IBM Hursley in the UK and used magnetic core memory. It used BCD for numerical display instead of hexadecimal or octal, with floating point numbers as a basic type. It also used 32-bit registers, though they stored BCD digits and not binary. In short, this thing was way ahead of its time.
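To make that register format concrete, here’s a quick Python sketch (our illustration, not IBM’s actual hardware encoding) of packing a decimal value into a 32-bit register as BCD: one 4-bit nibble per digit, eight digits total.

```python
# Packed BCD in a 32-bit register: one 4-bit nibble per decimal digit.
# Our illustration of the concept, not IBM's actual circuitry.

def to_bcd(value: int, digits: int = 8) -> int:
    if not 0 <= value < 10 ** digits:
        raise ValueError("value does not fit in the register")
    bcd = 0
    for shift in range(digits):
        value, digit = divmod(value, 10)
        bcd |= digit << (4 * shift)
    return bcd

def from_bcd(bcd: int, digits: int = 8) -> int:
    value = 0
    for shift in reversed(range(digits)):
        value = value * 10 + ((bcd >> (4 * shift)) & 0xF)
    return value

assert to_bcd(1969) == 0x00001969    # the hex pattern reads like the decimal number
assert from_bcd(to_bcd(1969)) == 1969
```

That readability is one practical upside of BCD for a machine aimed at students: a register dump shows the number itself, with no base conversion required.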

Continue reading “IBM’s 1969 Educational Computing”

Mainframe Chip Has 360MB Of On-Chip Cache

It is hard to imagine what a mainframe or supercomputer can do when we all have what amounts to supercomputers on our desks. But if you look at something like IBM’s mainframe Telum chip, you’ll get some ideas. The Telum II has “only” eight cores, but they run at 5.5 GHz. Unimpressed? It also has 360 MB of on-chip cache, plus I/O and AI accelerators. A mainframe might use 32 of these chips, by the way.

[Clamchowder] explains in the post how the cache has a unique architecture. There are actually ten 36 MB L2 caches on the chip: eight serve the cores, one for each; a ninth backs the I/O accelerator; and the tenth is uncommitted.

A typical CPU will have a shared L3 cache, but with so much L2 cache, IBM went a different direction. As [Clamchowder] explains, the chip reuses the L2 capacity to form a virtual L3 cache. Each cache has a saturation metric, and when one cache fills up, some of its data is evicted to a less-saturated cache block.

Remember the uncommitted cache block? It always has the lowest saturation metric, so unless the same data already happens to live in another cache, evicted lines typically end up in the spare block.
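Here’s a toy Python model of that spill-to-the-least-saturated-peer idea. Capacities, replacement policy, and coherence are all grossly simplified; this is our illustration of the concept, not IBM’s implementation.

```python
# Toy model of a Telum-style "virtual L3": L2 victims migrate to the
# least-saturated peer cache instead of a dedicated shared L3.

class CacheBlock:
    def __init__(self, name: str, capacity: int):
        self.name = name
        self.capacity = capacity
        self.lines = {}                     # tag -> data

    @property
    def saturation(self) -> float:
        return len(self.lines) / self.capacity

class VirtualL3:
    def __init__(self, blocks):
        self.blocks = blocks

    def insert(self, owner: CacheBlock, tag: str, data):
        if len(owner.lines) >= owner.capacity:
            # Evict a victim and spill it to the least-saturated peer.
            victim_tag, victim_data = owner.lines.popitem()
            spill = min((b for b in self.blocks if b is not owner),
                        key=lambda b: b.saturation)
            if len(spill.lines) < spill.capacity:
                spill.lines[victim_tag] = victim_data
        owner.lines[tag] = data

blocks = [CacheBlock(f"core{i}", capacity=4) for i in range(8)]
blocks += [CacheBlock("io", 4), CacheBlock("spare", 4)]
l3 = VirtualL3(blocks)

for b in blocks[1:9]:                       # every other active block is 3/4 full
    b.lines.update({f"busy{i}": 0 for i in range(3)})
for n in range(7):                          # overfill core0's four-entry cache
    l3.insert(blocks[0], f"line{n}", n)

print(blocks[-1].lines)                     # core0's victims landed in the spare block
```

Because the uncommitted block starts idle, it reports the lowest saturation and naturally soaks up victims from busy neighbors, which is exactly the behavior the post describes.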

There’s more to it than that — read the original post for more details. You’ll even find speculation about how IBM manages a virtual L4 cache across CPUs.

Cache has been a security bane lately on desktop CPUs. But done right, it is good for performance.

IBM’s Latest Quantum Supercomputer Idea: The Hybrid Classical-Quantum System

Although quantum processors exist today, they are still a long way from becoming practical replacements for classical computers. This is due to many practical considerations, not least the need for cryogenic cooling and the system’s susceptibility to external noise, which demands a level of error correction that does not yet exist. To work around these limitations somewhat, IBM has now pitched the idea of a hybrid quantum-classical computer (marketed as ‘quantum-centric supercomputing’) which, as the name suggests, combines the strengths of both to create a classical system with what is effectively a quantum co-processor.

IBM readily admits that nobody has yet demonstrated quantum advantage, i.e. a quantum computer actually beating a classical one at some task. But they figure that by aiming for the more modest quantum utility (the co-processor level), a quantum system could conceivably accelerate certain tasks for a classical computer, much like a graphics processing unit (GPU) offloads everything from rendering graphics to massively parallel computing tasks courtesy of its beefy vector processing capacity. IBM’s System Two is purported to demonstrate this when it is released.

What the outcome here will be is hard to say: the 2023 quantum utility demonstration paper it references, involving an Ising model, was repeatedly beaten by classical computers and even trolled by a Commodore 64-based version. At the very least, IBM’s new quantum utility focus ought to keep providing us with popcorn moments like those, and maybe a usable quantum system will roll out by the 2030s if IBM’s projected timeline holds up.

Making Intel Mad, Retrocomputing Edition

Intel has had a death grip on the PC world since the standardization around the software and hardware available on IBM boxes in the ’90s. And if you think you’re free of them because you have an AMD chip, that’s just Intel’s instruction set with a different badge on the silicon. At least AMD licenses it, though — in the ’80s there was another game in town that didn’t exactly ask for permission before implementing, and improving upon, the Intel chips available at the time.

The NEC V20 was a drop-in replacement for the Intel 8088 that also brought some performance improvements. Even though the 186 and 286 were available by the time of its release, this was an era before planned obsolescence reigned as a business model, so there were plenty of 8088 systems still working and relevant that could take advantage of the upgrade. In fact, the V20 implemented some of the improved instructions from those more modern chips. And it wasn’t an expensive upgrade either, with kits starting around $16 at the time, about $50 today after adjusting for inflation.

This deep dive into the V20 isn’t limited to a history lesson and technological discussion, though. There’s also an Arduino-based project that exercises the 8088, with some upgrades to support the NEC V20, plus a test suite for a V20 emulator.
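Speaking of telling these chips apart: one classic software check exploits the PUSH SP quirk. The 8088 pushes the already-decremented stack pointer, while the V20 (like the 186 and later) pushes the original value. Here’s a toy Python sketch of that behavior, the kind of semantic detail a V20 emulator test suite has to get right; the helper names are ours, hypothetical, not the linked project’s code.

```python
# Toy model of the PUSH SP quirk that lets software tell an 8088 from a
# V20. Hypothetical helper names -- not code from the linked test suite.

def push_sp(sp: int, stack: dict, model: str) -> int:
    new_sp = (sp - 2) & 0xFFFF
    # 8088: the already-decremented SP is what lands on the stack.
    # V20 (like the 186 and later): the pre-decrement SP is pushed instead.
    stack[new_sp] = new_sp if model == "8088" else sp
    return new_sp

def looks_like_v20(model: str) -> bool:
    stack = {}
    sp = push_sp(0x0100, stack, model)
    return stack[sp] == 0x0100      # pushed value == old SP means V20-style

assert not looks_like_v20("8088")
assert looks_like_v20("v20")
```

Real detection routines ran exactly this kind of instruction-behavior probe at boot, since the chips are otherwise pin- and software-compatible.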

This straightforward upgrade wasn’t all smooth sailing at the time, though. A years-long legal battle ensued over the contents of the V20 microcode and whether or not it constituted copyright infringement. Intel was able to drag the process out long enough that by the time the lawsuit settled, the chips were relatively obsolete, leaving the NEC V20 to sit firmly in retrocomputing (and legal) history.

The IBM PC: Brainchild Of A Misfit

We’ve read a number of histories of the IBM PC and lived through that time, too. But we enjoyed [Gareth Edwards’] perspective in a post entitled The Misfit who Built the IBM PC. The titular character is Don Estridge, a decidedly atypical IBM employee who was instrumental in creating the personal computer market as we know it.

It’s not that IBM invented the personal computer — far from it. But the birth of the PC brought personal computers to the mainstream, especially in offices, and — much to IBM’s chagrin — opened up the market for people to make add-on cards for printers, video, and other accessories.

IBM was a computer juggernaut in the late 1970s. Its divisions were the size of other companies, and some have compared it to a collection of mafia families. The company was heavily invested in big computers, and management was convinced that personal computing was, at most, an avenue to video games and most likely a fad.

Though IBM was known as a conservative company, the PC project drew on a number of corporate misfits who had been technically successful but were often punished for coloring outside the lines. They developed a prototype. The post quotes one of the people involved as saying, “The system would do two things. It would draw an absolutely beautiful picture of a nude lady, and it would show a picture of a rocket ship blasting off the screen. We decided to show the Management Committee the rocket ship.” Wise choice.

That’s just one of the tidbits in this post, and if you have any interest in the computer history of the 1980s, you’ll definitely want to check it out. Estridge died in 1985, so he didn’t get to see much of what became of the market he opened up. Of course, many other players appear in this story. The PC has many parents, as you might expect.

We’ve done our own recounting of this story. However, we tend to obsess more over the internals.

Aiken’s Secret Computing Machines

This neat video from the [Computer History Archives Project] documents the development of the Aiken Mark I through Mark IV computers. Partly shrouded in the secrecy of World War II and the Manhattan Project effort, the Mark I, “Harvard’s Robot Super Brain”, was built and donated by IBM, and marked their entry into what we would now call the computer industry.

Numerous computing luminaries besides its designer Howard Aiken used the Mark I. Grace Hopper, Richard Bloch, and even John von Neumann all used the machine. It was an electromechanical computer, using gears, punch tape, relays, and a five-horsepower motor to keep it all running in sync. If you want to dig into how it actually worked, the deliciously named patent “Calculator” goes into some detail.

The video goes on to tell the story of Aiken’s various computers, the rift between Harvard and IBM, and the transition of computation from mechanical to electronic. If this is computer history that you don’t know, it’s well worth a watch. (And let us know if you also think that they’re using computer-generated speech to narrate it.)

If “modern” computer history is more your speed, check out this documentary about ENIAC.

Continue reading “Aiken’s Secret Computing Machines”