
Why is network-byte-order defined to be big-endian?

The big-endian question comes up on those Internet chat places young techies like to hang out in. To us old-timers it falls under the heading of “Kids these days!” Here is the tale for those too young to remember.

Around 1987 AST Research introduced the AST Premium, which won the InfoWorld Editor’s Choice award. There were many models, differing in memory and hard drive capacity. If you had the Model 140 shown in the ad image, you really had something. You had a desktop computer with 1 MB of RAM running at 10 MHz with a massive 40 MB MFM hard drive. It even had a color monitor with EGA graphics. You were a geek stud muffin. You also had enough disposable income to drop just over $3K on a computer to use at home.

There were cheaper models which came with less RAM and only two floppy drives, but if you had this you were da-man! This particular computer is near and dear to my heart because I actually owned one. I wrote an awful lot of software with it. Honestly, I still miss the keyboard. You could even get tape drives for another wad of cash. I never met anyone who bought a Model 170 which had the 70 MB hard drive nobody could possibly fill.

RFC1700

Around October of 1994, RFC1700 was birthed. Given the amount of debate and haggling, it shouldn’t take a rocket scientist to figure out that several years went into creating RFC1700.

It was here that it was determined network-byte-order would be, for all time, world without end, big-endian. What kids today don’t realize is what the state of the industry was. Today they look at the industry through the eyes of a yippy-yappy dog sitting on the porch watching the big dogs run past. This dog is the size of the first installment of a Time-Life-books-style dog: after each monthly payment you get another piece of the dog to assemble. When you complete your “easy” 60 or 84 monthly payments, you then have a whole dog.
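That decision still shows up in every sockets API today. Here is a minimal sketch, using Python’s standard `struct` and `socket` modules, of what “network byte order” means in practice:

```python
import socket
import struct

value = 0x0A0B0C0D  # an arbitrary 32-bit value

# "!" in struct format strings means network byte order, i.e. big-endian:
# the most significant byte goes onto the wire first.
wire = struct.pack("!I", value)
assert wire == b"\x0a\x0b\x0c\x0d"

# The classic htonl/ntohl pair performs the same conversion on integers;
# on a big-endian host they are no-ops.
assert socket.ntohl(socket.htonl(value)) == value
```

On a little-endian x86 host, packing the same value with `"<I"` would emit those four bytes in the opposite order, which is exactly the ambiguity RFC1700 settled.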

Buying The Whole Dog

Laugh all you want at that statement, it’s true. It is also how a great many things were sold during the time. People didn’t have any money. Minimum wage was around $3.25 – $3.50/hr in most states. A standard 2000 hour work year at $3.50/hr meant your annual income was $7000. That computer was roughly half someone’s annual income. They still had to both eat and live indoors.

My parents bought a Random House Dictionary from Time-Life Books in this manner. Each month another bundle of pages came to be added to the clapboard spring cover assembly. We still have it.

Encyclopedias were sold in this manner as well. Instead of assembling each book you got one volume after each payment. They kept you buying the entire set by making the master index its own volume and the last one you got. Credit cards didn’t exist and this was the solution the educational market came up with. Parents wanted the best for their children and most schools could afford only one set of encyclopedias and one podium sized dictionary, if that. No, dogs weren’t sold this way, but now you understand the reference.

Everyone Had a Network Standard

Every major player in the computing world had their own networking standard. This wasn’t just a big-endian vs. little-endian thing. IBM had Token Ring. Novell had NetWare. Digital Equipment Corporation (DEC) had DECnet. The list went on and on.

Network Interface Cards (NICs) for many of these proprietary network protocols were north of $1,000. None of these protocols could talk to each other without buying some really expensive gateway software or devices.

LANtastic came out with NodeRunner for small hobby computer users: a peer-to-peer network that could run over the existing parallel port or use 2 Mbit NICs and coax cable. It sold well because you could get NICs for around $25. It sucked horribly once you got beyond a handful of computers. We were all still running DOS then, trying to make things fit within the hard 640K memory limit.

Starting salaries for programmers fresh out of school were around $20K. Parents slaved at blue collar jobs sacrificing a lot to send one or more kids to school to get a degree in computer programming. The starting salary of their child would be more than both of their incomes combined. It was a way to climb the economic ladder.

Every Protocol Had Its Legitimate Selling Point

Token Ring was deterministic. Every node got a chance to hold the token when its turn came. That node could hold the token for a maximum amount of time. Holding the token meant it could transmit on the network. If the node had nothing to send it immediately passed the token along.

As you added more nodes, the number of times your node could hold the token over any given time window decreased. It shouldn’t take a rocket scientist to figure out the dude who needed to transfer 100MB took forever while slowing others down. I vaguely remember there were various priority schemes. They generally involved privilege rings, like a caste society. Nodes in the upper crust of network society got to hold the token more often than each of the lower crusts.
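To see why that curve bends, here is a toy back-of-the-envelope model. This is my own illustration, not anything from a Token Ring spec, and it assumes the worst case where every node uses its full token-holding time on each rotation:

```python
def token_opportunities_per_second(num_nodes, hold_ms=10):
    """Worst-case chances per second any one node gets to transmit,
    assuming every node holds the token for the full hold_ms."""
    rotation_ms = num_nodes * hold_ms  # one full trip around the ring
    return 1000 / rotation_ms

# 10 nodes: each node sees the token 10 times a second.
assert token_opportunities_per_second(10) == 10.0
# 100 nodes: once a second, and the 100MB transfer still isn't done.
assert token_opportunities_per_second(100) == 1.0
```

The 10 ms holding time is a made-up parameter for illustration; the point is that per-node bandwidth falls linearly with node count even before collisions or retransmissions enter the picture.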

We Did Not Have the Internet

You have to remember, we didn’t have the Internet. Nobody was surfing for funny cat videos. Nodes at the lower levels were clerical staff who needed to pull a spreadsheet or word processing document from a file server then work on it locally, saving it back to the server when done. They didn’t often notice they were at the bottom tier of the network.

NetWare became the corporate office standard for a few reasons. PCs weren’t considered real computers. To this day that is still a mostly legitimate point of view. PCs started popping up on clerical desks in departments, as did the little printers they needed. The tasks most did consisted of word processing and spreadsheets.

Multiple people needed to have access to the same documents, and sneaker-netting floppy disks around caused problems. Besides, printers needed to be shared, as they were far from cheap. Servers started popping up in various departments until each department came to the realization they needed to share files with other departments. (Data Silo ring a bell?) Eventually management started consolidating the servers until they started hitting user-count limits.

The Size of NetWare

Kids today can’t possibly understand how big NetWare was. Yeah, there were lots of others, but Novell dove in hard. It provided Btrieve, an indexed record manager which actually allowed multiple users/programs to read and write records in the same file, not the fake file sharing done under DOS. Department-level applications started springing up all over corporate America, as did RAD (Rapid Application Development) tools.

NetWare also had GroupWise. Basically SharePoint with email and other doodads. Not Internet email, GroupWise email. You could send an email to another user who had a GroupWise account, but you couldn’t send an email to the mainframe group, who were still using 3270 terminals and never received a GroupWise account or desktop computer.

DEC Had the Most Amazing

Of all the proprietary networking protocols, DECnet was the most amazing. Combined with a VMS cluster there was nothing that could hold a candle to it. I was at a client site which had, for all intents and purposes, a global cluster. They had labs and other facilities in Germany, Puerto Rico, and other places, but the corporate headquarters was in Illinois. I could run a report on the corporate computer and print it on a printer in one of these labs.

VMSmail allowed all users to type text email (we could even send files) to any other user anywhere on the cluster. VMSphone allowed you to chat with another user via two text windows on your terminal. As hokey as that may sound to you today, back when international calling was around $2/minute, the phone feature could almost pay for the cluster, especially when one added in the cost savings of VMSmail. Airmail letters weren’t and still aren’t free.

DEC also had PATHWORKS which would let PCs connect and store data on the midrange computer. If memory serves, you could also print to the system printers. While that may not sound like much, keep in mind “average” PCs came with a 20MB drive and studly PCs came with 40MB. The RA81 offered 456MB and the RA82 increased storage to 622MB and the operating system could logically link multiple physical drives into one logical drive.

PATHWORKS was a big deal. Properly configured storage could allow mainframe applications to read and write files in the PC storage areas. You no longer had to get a physical printout of a report. It could be copied to your “network drive” and you could look at it in your favorite text editor.

Every Protocol Had Its Failure

Token Ring had an inverted hockey stick performance curve. After you reached that tipping point in number of users and data transfers the performance just plummeted. This led to companies having multiple rings and various hacks to route/bridge messages between them.

All of the networks running on PC hardware had both user limits and distance limits. While the distance limits hobbled many, they could be overcome via various repeater technologies which also introduced another point of failure to the network.

The only proprietary network I know of which could have scaled to be much of what the Internet is today was DECnet. The downfall was it really needed to be running on VMS which, at that time, only ran on midrange computers. No “home user” was going to plunk down $30K-$850K for a “home” computer, especially one earning $3.50/hr. Let us not forget, all of the early models also required a special air conditioned room.

Yippy-Yappy Dogs Got Loud

Departments didn’t want to give up their PCs. Corporations also wanted to stop tying up north of $5K in each and every desktop computer. Data centers wanted to rein in this flood of personal computers which was now sucking up twice as much power and real estate as the real machine room.

TCP/IP was beginning to surface in various places. Adding insult to injury the original IBM XT class of desktop computers was being amortized out. First it was employees who could buy the XT computer they (or someone) had been using at work for $300-$500. Some companies amortized them out completely and just issued them to employees. The rest started getting sold in the used computer market. Generic XT class computers could be purchased brand new for around $800 with a 20MB hard drive and monochrome monitor.

Every vendor wanted to be the standard having everyone else license and buy from them. The market wanted “cheap stuff.”

The Big Dog Put Its Paw Down

The little yippy-yappy dogs wanted respect. They didn’t deserve respect and were too stupid to realize it. Between RAD and people doing more with spreadsheets than anyone ever should have, upper management had no idea what any department did or what information it had. Management could no longer run reports across departments to get an idea about what was going on at the company. This one time, upper management and the real world both had the correct view of things.

If a tiny little piece of a Time-Life Book dog wanted to be considered in any way shape or form part of a dog it had to talk to the big dog.

IBM agreed to support TCP/IP, the open protocol suite, after much arm twisting. Once this was done, everybody else was screwed. IBM and most everybody else ran their own protocols on top of TCP/IP for a time (and some still do today). DECnet runs on top of TCP/IP. If NetWare still exists anywhere, it exists on top of TCP/IP as well.

EBCDIC vs. ASCII

IBM had EBCDIC and the rest of the world had ASCII. Eventually ASCII became the first part of the UTF encoding standards now in use on the Web. IBM assembly already had a lot of “mask” instructions so it bowed to letting the specifications be created in ASCII.
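You can still see the gulf between the two encodings from any modern machine. As an illustration, Python ships EBCDIC code pages alongside ASCII (cp037 is the common US/Canada EBCDIC code page):

```python
text = "HELLO"

ascii_bytes = text.encode("ascii")
ebcdic_bytes = text.encode("cp037")  # cp037: EBCDIC US/Canada code page

# Same letters, completely different byte values on the wire.
assert ascii_bytes == b"HELLO"  # 0x48 0x45 0x4C 0x4C 0x4F
assert ebcdic_bytes == b"\xc8\xc5\xd3\xd3\xd6"

# The round trip is lossless for text both encodings can represent.
assert ebcdic_bytes.decode("cp037") == text
```

Two machines disagreeing on what byte means “H” is the character-set version of the endianness fight: with no agreed wire format, nothing interoperates.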

The little yippy-yappy dogs thought they were going to run the table at this point. The little yippy-yappy dogs weren’t tall enough to see down the street.

The Hill the Little Yippy-Yappy Dogs Died On Was Called Big-Endian

I was very deliberate in stating the storage capacity of the “da-man!” PC of the day. I was very deliberate in pointing out the size of the RA disk drives in use on DEC equipment of the day and why PATHWORKS was such a big deal. Now I must be very deliberate in pointing out what the mainframe was.

Nine-track tapes commonly had densities of 800, 1600, and 6250 cpi, giving roughly 22.5MB, 45MB, and 175MB respectively on a standard 2,400-foot (730 m) tape. Yes, there were record gaps and other space-consuming things, but a 2,400-foot tape had a lot of storage.
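Those capacities are easy to sanity-check. A sketch of the raw ceiling, before subtracting the inter-record gaps that make the real-world figures above come in a touch lower:

```python
def raw_tape_bytes(density_cpi, length_feet=2400):
    """Raw capacity: characters per inch times tape length in inches."""
    return density_cpi * length_feet * 12  # 12 inches per foot

assert raw_tape_bytes(800) == 23_040_000      # ~22.5MB after gaps
assert raw_tape_bytes(1600) == 46_080_000     # ~45MB after gaps
assert raw_tape_bytes(6250) == 180_000_000    # ~175MB after gaps
```

At nine-track densities one character occupies one frame across the tape, so characters per inch and bytes per inch are interchangeable here.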

IBM iron could read and write those things faster than most of my beloved DEC machines could read/write RA series disk drives. DEC could read/write its disks at speeds which made my 40MB MFM PC seem like it was chiseling storage into stone. Most PC users never bought a tape drive. Most DEC shops owned at least one tape drive and bought tapes, at most, a box at a time. IBM shops tended to buy 1,000 tapes at a time.

Big Dog Data is Big-Endian

Big dogs eat mountains of data that would bury a city block hundreds if not thousands of feet deep every minute. They consume that data in big-endian form. They generate piles of output that even at high noon cast a shadow over the top of the two-story house that has the porch where the yippy-yappy dog is yapping from.

We are talking about Standard Oil processing all of the seismic data from one single test trying to determine where to drill. We are talking about payroll companies processing weekly payroll for both full and part-time workers, all of them. Then creating tapes to send to the IRS reporting both wages and withholding for each worker.

In 1990 there were over 98 million full-time employees in America. Granted some worked for small companies who filed and paid such things quarterly using paper, but a company the size of Sears didn’t have such an option. When it came to the really big employers the government wanted its money now. Even if you could wait until the end of the quarter to pay it, they wanted to know how much was coming in right now.

There was no way that IBM, upper management at any major corporation, or those in charge of government agencies using IBM computers, all of whom participated in the standards committees, were going to allow anything into the standard which dimpled the throughput of their million-plus-dollar blue box. They especially weren’t going to allow it for a little yippy-yappy dog with a 40MB drive. All of these officials were looking down the road at transmitting payroll and tax data electronically. Instead of trucks bringing thousands of tapes each week, it was going to be communications infrastructure transmitting billions of records which the big blue boxes would happily consume.

For a deeper journey through IT history pick up a copy of The Minimum You Need to Know About the Phallus of AGILE.

Roland Hughes started his IT career in the early 1980s. He quickly became a consultant and president of Logikal Solutions, a software consulting firm specializing in OpenVMS application and C++/Qt touchscreen/embedded Linux development. Early in his career he became involved in what is now called cross platform development. Given the dearth of useful books on the subject he ventured into the world of professional author in 1995 writing the first of the "Zinc It!" book series for John Gordon Burke Publisher, Inc.

A decade later he released a massive (nearly 800 pages) tome "The Minimum You Need to Know to Be an OpenVMS Application Developer" which tried to encapsulate the essential skills gained over what was nearly a 20 year career at that point. From there "The Minimum You Need to Know" book series was born.

Three years later he wrote his first novel "Infinite Exposure" which got much notice from people involved in the banking and financial security worlds. Some of the attacks predicted in that book have since come to pass. While it was not originally intended to be a trilogy, it became the first book of "The Earth That Was" trilogy:
Infinite Exposure
Lesedi - The Greatest Lie Ever Told
John Smith - Last Known Survivor of the Microsoft Wars

When he is not consulting Roland Hughes posts about technology and sometimes politics on his blog. He also has regularly scheduled Sunday posts appearing on the Interesting Authors blog.