EBCDIC and the P-Bit
(The Biggest Computer Goof Ever)

Computer History Vignettes

By Bob Bemer

The IBM 360 was to have been primarily an ASCII-based computer, still handling the ordering of existing BCD files. These files were the only obstacle to ASCII -- there was no way previously-compiled code for any IBM computer would run on the new machine anyway. Confirming evidence is everywhere, including elsewhere on this site.

I was myself in charge of such "Logical Systems Standards" for IBM at the time, and have written 20 papers about ASCII. One doesn't get the sobriquet "Father of ASCII" for nothing.

Who Goofed?

The culprit was T. Vincent Learson. The only thing to be said in his defense is that he had no idea what he had done. It happened when he was an IBM Vice President, prior to his tenure as Chairman of the Board, those lofty positions where you believe that, if you order it done, it actually will be done. I've mentioned this fiasco elsewhere. Here are some direct extracts:

From My DoD Luncheon Keynote at Salt Lake City

"Some of you are aware of a terrible waste of resources in that the IBM mainframe world uses EBCDIC, whereas others, and PCs, use ASCII. A 1-to-1 translation between the two exists, and it can be done imperceptibly via a chip. A few of us fought to have it that way. But a file ordered on an ASCII key just won't do when the EBCDIC ordering is needed. And that costs money and time.

I mention this because it is a classic software mistake. IBM was going to announce the 360 in 1964 April as an ASCII machine, but their printers and punches were not ready to handle ASCII, and IBM just HAD to announce. So T.V. Learson (my boss's boss) decided to do both, as IBM had a store of spendable money. They put in the P-bit. Set one way, it ran in EBCDIC. Set the other way, it ran in ASCII.

But nobody told the programmers, who were like a Chinese army in numbers! They spent this huge amount of money to make software in which EBCDIC encodings were used in the logic. Reverse the P-bit to work in ASCII, and it died. And they just could not spend that much money again to redo it.

After all, the entire 360 venture was nicknamed "You Bet Your Company", after a TV game show of that era. And IBM found the reason, or excuse, to use EBCDIC in the huge costs to their users to change their existing files to ASCII ordering. But this short-range argument fell apart when we added a lower case alphabet."
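The 1-to-1 translation mentioned in the keynote can be sketched in a few lines. This is a minimal illustration, not the chip-level mechanism: it uses Python's built-in cp037 codec (IBM EBCDIC, US/Canada) as a stand-in, since the article does not name a specific EBCDIC code page.

```python
# A lossless, 1-to-1 EBCDIC/ASCII mapping, sketched with Python's
# cp037 codec standing in for the 360's EBCDIC (an assumption; the
# article names no particular code page).

text = "IBM System/360"

ebcdic_bytes = text.encode("cp037")        # text -> EBCDIC bytes
round_trip = ebcdic_bytes.decode("cp037")  # EBCDIC bytes -> text again

assert round_trip == text  # the mapping loses nothing in either direction
print(ebcdic_bytes.hex())
```

Because the mapping is invertible, translating character data is cheap; as the keynote says, the real cost lies in files already *ordered* on one code's collating sequence.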

From Reference 7

"The position of IBM was a most important factor for progress of a standard code, and the System 360 was crucial to IBM's position. It was designed to handle both the Extended BCD Code (for upward compatibility of much former equipment) and the eventual ASCII. However, the resistance in X3 and in ECMA to an 8-bit code, together with the fact that the ASCII printer and card reader were not ready when 360 announcement time neared, led to the decision to make EBCDIC the primary code. It was reasoned that ASCII could wait until the matter was settled, at which time the software would be modified slightly, the P-bit switched to ASCII internal mode, and everything would be fine.

Unfortunately, the software for the 360 was constructed by thousands of programmers, with great and unexpected difficulties, and with considerable lack of controls. As a result, the nearly $300 million worth of software (at first delivery!) was filled with coding that depended upon the EBCDIC representation to work, and would not work with any other! Dr. Frederick Brooks, one of the chief designers of the IBM 360, informed me that IBM indeed made an estimate of how much it would cost to provide a reworked set of software to run under ASCII. The figure was $5 million, actually negligible compared to the base cost. However, IBM (present-day note: Read "Learson") made the decision not to take that action, and from this time the worldwide position of IBM hardened to "any code as long as it is ours".

On 1964 October 16, C. E. Mackenzie explained to GUIDE the plans for implementing ASCII in System 360. He stressed the profound difference between supporting the development of a standard and supporting the standard itself, such as actually implementing it in hardware and software. ... He noted that "IBM has been unable to determine any appreciable customer needs for ASCII on magnetic tape, or on punched cards, or on perforated tape as input/output for a computer". He related the low need for data communication with ASCII.

In its 360 ESSG Information Letter No. 17 (1964 October 14), IBM said that the choice between the two codes "is determined by a mode bit ... a sharp difference between the two codes is the collating sequence. The EBCDIC sequence is consistent with that of previous systems and is therefore largely (sic) compatible with that of our customers' files. The natural ASCII sequence, on the other hand, would place the numbers (sic) before the letters (not yet definitive)".
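The collating-sequence difference the Information Letter describes is easy to demonstrate: in ASCII the digits (0x30-0x39) sort before the letters (0x41 and up), while in EBCDIC the letters sort before the digits. A small sketch, again using Python's cp037 codec as an assumed stand-in for the 360's EBCDIC:

```python
# Sort the same keys under ASCII and EBCDIC byte orderings to show why
# a file ordered on an ASCII key is not in EBCDIC order. cp037 is a
# stand-in EBCDIC code page (an assumption, not from the article).

keys = ["A9", "AB", "B1", "9A"]

ascii_order  = sorted(keys, key=lambda s: s.encode("ascii"))  # digits collate first
ebcdic_order = sorted(keys, key=lambda s: s.encode("cp037"))  # letters collate first

print(ascii_order)   # ['9A', 'A9', 'AB', 'B1']
print(ebcdic_order)  # ['AB', 'A9', 'B1', '9A']
```

This is the incompatibility that made pre-sorted customer files, not the character translation itself, the sticking point.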

Note co-author Dr. Werner Buchholz in References 2 and 5. He's the person who coined the term "byte".



References

  1. R.W.Bemer, "A proposal for a generalized card code of 256 characters", Commun. ACM 2, No. 9, 19-23, 1959 Sep
  2. R.W.Bemer, W.Buchholz, "An extended character set standard",
    IBM Tech. Pub. TR00.18000.705, 1960 Jan, rev. TR00.721, 1960 Jun
  3. R.W.Bemer, F.A.Williams, "Survey of coded character representation", Commun. ACM 3, No. 12, 639-641, 1960 Dec
  4. R.W.Bemer, H.J.Smith, Jr., F.A.Williams,
    "Design of an improved transmission/data processing code",
    Commun. ACM 4, No. 5, 212-217, 225, 1961 May
  5. R.W.Bemer, W.Buchholz, "Character set",
    Chapter 6 in Planning a Computer System, McGraw-Hill, 1962
  6. R.W.Bemer, "The American standard code for information interchange", Datamation 9, No. 8, 32-36, 1963 Aug, and ibid 9, No. 9, 39-44, 1963 Sep
  7. R.W.Bemer, "A view of the history of the ISO character code",
    Honeywell Computer J. 6, No. 4, 274-286, 1972
  8. R.W.Bemer, "Inside ASCII", Interface Age 3, 1978 May, Jun, Jul