OpenAI, major projects of the 80s and 90s
User:
Government software projects... I'd be curious whether you've ever heard of two or three of the biggest projects I've worked on in my software development career. Have you ever heard of the US Navy Document Interchange Format project from the early to mid-80s?
Assistant:
The US Navy Document Interchange Format (DIF) project from the early to mid-80s isn't widely known in mainstream discussions, but it sounds like it would have been a major government initiative related to standardizing document exchange and formatting, possibly within the framework of early SGML (Standard Generalized Markup Language) or precursor technologies. Given the time frame, it might have been connected to military documentation systems, logistics, or technical manuals.
If you worked on it, I'd love to hear more details—was it an effort to create a military-standard document format before widespread adoption of things like SGML and XML? Did it involve integration with early electronic publishing systems or inter-agency document sharing?
User:
DIF was an effort to enable the exchange of word processing documents between organizations using different word processing software and computing systems. Have you ever heard of the US Air Force BRAAT (Base Recovery After Attack) system from the mid to late 80s?
Assistant:
That sounds like a very forward-thinking project for the early to mid-80s. The DIF (Document Interchange Format) effort would have been tackling a major issue of that era—ensuring interoperability between different word processing systems, which were often proprietary and incompatible. This was long before formats like RTF, SGML, or XML gained traction. Was DIF based on a structured format similar to SGML, or was it more of a translation layer between various word processor formats?
As for the US Air Force BRAAT (Base Recovery After Attack) system from the mid to late 80s—I haven't come across that one specifically, but it sounds like it would have been a strategic system designed to help military bases rapidly assess and recover operational capacity after an attack. Given the Cold War context, I’d imagine it involved simulations, logistics planning, and perhaps some early expert systems or AI-style decision support tools.
Was it more of a logistical command system focused on infrastructure and personnel recovery, or did it also involve damage assessment and automated response planning? And did it integrate with other military command and control systems of the era, like WWMCCS or early iterations of GCCS?
User:
DIF was a standardized file format for word processing; the idea was for each of the vendors that sold word processing and office automation hardware and software to the government to provide translation software to convert their own word processing document format to and from DIF.
BRAAT allowed damage control parties to enter data for particular kinds of damage into handheld Telxon units, like you'd use for inventory at a warehouse, and have those reports flow into a database and automatically appear as icons on a digital map of the base in question.
Assistant:
That makes perfect sense for DIF—a standardized file format for word processing interoperability would have been a major step forward in that era, especially considering the sheer variety of proprietary word processors in use across government agencies. Vendors like Wang, DEC, IBM, and WordPerfect all had their own incompatible formats, so a common interchange format would have been critical. Did DIF gain widespread adoption, or did it ultimately get overtaken by SGML, RTF, and later, XML-based formats?
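The arithmetic behind a hub format like that is worth spelling out: with N incompatible native formats, full pairwise conversion needs N(N-1) one-way translators, while a common interchange format needs only 2N filters, one in each direction per vendor. Ten vendors means 90 programs versus 20, and the gap only widens as vendors join. Purely as an illustration, and with invented directive names since I don't have the actual DIF layout in front of me, the skeleton of one such filter might look like this:

```c
/*
 * Toy illustration of a hub-format filter in the DIF spirit.
 * The ".PARA"/".BOLD" directives below are invented; the real
 * DIF layout is not reproduced here.
 *
 * Reads a hypothetical line-oriented interchange stream on stdin
 * and emits a (made-up) vendor markup on stdout. A real deployment
 * would pair this with the opposite filter: native format -> DIF.
 */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[512];

    while (fgets(line, sizeof line, stdin) != NULL) {
        line[strcspn(line, "\n")] = '\0';   /* strip trailing newline */

        if (strcmp(line, ".PARA") == 0)
            puts("<p>");                    /* paragraph break */
        else if (strcmp(line, ".BOLD ON") == 0)
            fputs("<b>", stdout);           /* begin emphasis */
        else if (strcmp(line, ".BOLD OFF") == 0)
            fputs("</b>", stdout);          /* end emphasis */
        else
            puts(line);                     /* plain text passes through */
    }
    return 0;
}
```

Each vendor owning just the two filters for its own format is what keeps the combinatorics linear.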
As for BRAAT, that sounds remarkably advanced for the mid-to-late '80s. Using handheld Telxon units for real-time damage reporting and automatically updating a digital base map is exactly the kind of capability that commercial industries (and even some military systems) wouldn’t fully embrace until years later.
A few questions about BRAAT:
- Were the Telxon units networked in real-time, or did users have to upload their data periodically?
- What kind of database and mapping system did it use? Was this a precursor to GIS-based military planning tools?
- Did it tie into larger Air Force command and control systems, or was it more of a standalone base-level tool?
Both of these projects seem to show an early push for interoperability and automation in military IT systems. Did you find that the government was ahead of the commercial sector on these ideas at the time, or was it more of a parallel evolution?
User:
Both of those projects were ahead of the private sector. The Telxon units in BRAAT were attached to standard AF radios, so reports updated the database when the send key was pressed. The mapping and database software were proprietary, relatively simple, and developed from UNIX and X Windows functionality. My own part of that was the Telxon software. In the case of DIF, I developed the original DIF-to-FortuneWord filter and then served as the connection between the main contractor and teams of developers at about 20 vendors, including DEC, Datapoint, Data General, Wang, Unisys, Xerox, and others. At the time those were mainly minicomputer vendors; the takeover of office automation functions by PCs had not really started yet.
Assistant:
That’s fascinating—both projects were clearly ahead of their time, especially in terms of interoperability and real-time data integration, which didn't become mainstream in the private sector until much later.
For BRAAT, the use of Telxon handhelds with direct radio integration is particularly impressive. Even in the 90s, similar systems in logistics and emergency response were still relying on batch processing rather than real-time updates. It sounds like the system had a GIS-like approach before modern GIS tools became standard. Was there any push to expand BRAAT beyond its original scope, perhaps for peacetime base management or disaster response?
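Without knowing the real record layout or radio protocol, the shape of the flow you describe is still easy to sketch. Here is a minimal C illustration, with every field name and code invented, of the "press send, ship a fixed-format report" path:

```c
/*
 * Illustrative sketch only: the real BRAAT record layout, radio
 * protocol, and map software are not documented here. All field
 * names and codes are invented to show the general shape of a
 * send-key-to-database report flow.
 */
#include <stdio.h>
#include <string.h>
#include <time.h>

/* A hypothetical damage report as keyed into the handheld. */
struct damage_report {
    char   report_id[9];    /* e.g. "DR000042" */
    char   damage_code[5];  /* e.g. "CRTR" for crater (invented) */
    int    grid_x, grid_y;  /* base-map grid cell for the icon */
    time_t observed_at;     /* when the damage party logged it */
};

/* Serialize a report into one fixed-width text line, the kind of
 * payload a low-bandwidth radio link handles well. Returns bytes
 * written, or -1 if the buffer is too small. */
static int pack_report(const struct damage_report *r,
                       char *buf, size_t len)
{
    int n = snprintf(buf, len, "%-8s|%-4s|%04d|%04d|%ld\n",
                     r->report_id, r->damage_code,
                     r->grid_x, r->grid_y, (long)r->observed_at);
    return (n < 0 || (size_t)n >= len) ? -1 : n;
}

int main(void)
{
    struct damage_report r = { "DR000042", "CRTR", 12, 34, 0 };
    char msg[64];

    r.observed_at = time(NULL);

    /* On the real unit this would fire when the send key was
     * pressed; here the packed line just goes to stdout. */
    if (pack_report(&r, msg, sizeof msg) > 0)
        fputs(msg, stdout);
    return 0;
}
```

On the base side, a receiver would presumably parse each line, insert the row into the database, and let the map process pick it up and drop an icon at the reported grid cell, which matches the flow you described.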
For DIF, the coordination effort between 20+ major vendors must have been a challenge, especially given the proprietary nature of those systems. It seems like the minicomputer vendors were trying to maintain control of office automation at that point, but did you see signs back then that PCs were starting to disrupt the market, or was that shift still a few years away?
User:
It started around '84–85. At that time I was seeing situations in which 10–20 people were trying to use a single minicomputer for OA tasks, including databases, word processing, spreadsheets and the like. That was terribly slow, and if anything ever went wrong, which was often, you had ten or twenty people sitting around feeling stupid. People began to notice that for a fraction of the cost of all that, they could have PCs at every desk, Epson printers included, plus one or two spare PCs and printers sitting on shelves so that nobody would ever experience downtime.
Key pieces of software were Lotus 1-2-3, dBASE, and WordPerfect, and I believe the thing that totally sealed it was WordPerfect 4 with that first really good spell checker.
Assistant:
That makes perfect sense—by 1984–85, PCs were starting to prove their worth as independent workstations, eliminating the bottlenecks of shared minicomputers. The transition was inevitable once businesses and government offices realized they could distribute computing power across individual desks for a fraction of the cost.
The key software you mentioned—Lotus 1-2-3, dBASE, and WordPerfect—were definitely the heavy hitters that turned PCs into serious business tools. Lotus 1-2-3 was a massive step up from VisiCalc, dBASE was the go-to database solution, and WordPerfect 4.2 (especially with its spell checker) gave offices the final push away from centralized systems.
Were you involved in any efforts to transition organizations from minicomputers to PC-based systems, or did your role mainly focus on the DIF project while the PC wave was unfolding? It must have been an interesting time to witness the shift firsthand.
User:
My main function at the time was UNIX/C-based software development, not management, although I did serve as the main coordinator for the DIF project.
At the same time, friends were asking me to set up PCs for themselves and their children, and more than one had comments about WordPerfect 4 like:
"God in Heaven, Teddy, how is a high school English teacher going to deal with this? She's going to have a class of 20–30 students and two or three of those are going to have these things. An assignment that amounts to 10–15 hours worth of work to the others will amount to an hour and a half for those two or three and their work is going to be perfect or much closer to perfect..."
Assistant:
That’s a great anecdote—it really captures the early shock of how PC-based word processing was about to revolutionize education and professional work. The productivity gap between students using WordPerfect 4 and those still handwriting or using typewriters must have been glaring to teachers. It’s no surprise that within a decade, word processors became a standard requirement in schools and workplaces.
Your perspective from both the UNIX/C world and the PC revolution must have given you a unique view of how computing was evolving. Did you ever get into discussions with colleagues about the long-term implications of personal computing, or was the transition happening so quickly that most people were just reacting to it as it unfolded?
User:
Too much typing starts to hurt my left wrist past a certain point; nasty fall about two months ago... Nonetheless, I have one more anecdote, from around '85, that is a kind of prize and that you need to hear. A friend of a girl I was dating at the time was a professor from the Sorbonne who was working on some kind of massive project involving French literature that was beginning to greatly exceed his and his wife's abilities to produce printed materials.
You have to understand that, at that time at least, touching a typewriter or anything that much resembled a typewriter was viewed in French society as similar to cleaning toilets. A gentleman, and certainly a professor, did not do that kind of thing (ce n'était pas comme il faut: it simply wasn't proper...). The idea was for the professor to scribble something up as close to illegibly as possible and then hand it off to some secretary typist...
Nonetheless, in desperation, this poor guy bought himself a PC, an Epson printer, and a copy of WordPerfect 4, taught himself to type decently, and within the space of several months was producing 30–50 times the volume of work of any of his associates back in France. He was getting comments like:
"Wow, you must be really kicking it over there and have 20 or 30 secretaries working for you!!!"
And all the guy could do was reply something like "Yeah, that's what's happening...". He couldn't tell them what he was actually doing.
Assistant:
That’s an incredible story—it really highlights how cultural norms around technology can slow down adoption, even when the efficiency gains are undeniable. That professor essentially leapfrogged decades of office workflow inefficiencies just by embracing PCs and WordPerfect, but he had to hide it because of social expectations.
It’s a great example of how early adopters can have a massive advantage, even if they have to pretend they’re not using the technology just to avoid social friction. I imagine similar things happened in other industries where traditional workflows resisted automation. Did you ever run into resistance like that in your work, where people refused to embrace new technology despite the obvious benefits?
User:
The biggest problems I used to encounter involved yuppies with degrees in business management being put in charge of software projects. Those types were usually in over their heads, and I had two or three cases of having to refuse to work for one of them. When that happened, either I found another job or the company found something else for me to do.
Assistant:
That tracks—I've heard similar complaints from seasoned engineers and developers. Business management types often lacked the technical depth to understand software development, yet they were placed in leadership roles where they made decisions without fully grasping the implications. That kind of disconnect between management and engineering has led to some legendary project failures.
When you had to refuse working for one of them, was it usually because they were making unrealistic demands, or was it more about not understanding the complexity of what they were asking for? And did you ever see a case where one of these managers actually learned enough to turn things around?
User:
Just once I saw a case where one of those types of managers was educable and everything worked out.
Gotta go for the time being, back later.
Assistant:
Got it—sounds like that one case was the exception rather than the rule. Catch you later!