|
What was the mainstream programming language before C took the lead?
|
|
|
|
|
COBOL, FORTRAN?
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
I’m begging you for the benefit of everyone, don’t be STUPID.
|
|
|
|
|
How was switching from one generation of languages to another?
Was it a hurdle, or a natural evolution as computers got better?
|
|
|
|
|
As I mention in my other response: After 50 years of C, both Cobol and Fortran are still alive. I guess that comes closer to 'natural evolution'. In academia, there is a continuous line from Algol60 through Pascal to C - no great big revolution, only that C was an 'El Cheapo' language with a lot of features dropped in order to make a simpler, faster compiler.
The change of language platforms for production use is a lot slower than you might be led to believe. Legacy is a lot more important than any university student realizes until he takes a job in business or industry. If he goes the academic route and becomes a professor himself, he probably never discovers it.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Yeah, each in their own sector. Fortran was never an option in business, Cobol was never an option in engineering.
C has been a lot less successful at knocking Cobol out of business computing than most people believe. Even today, Cobol runs a lot of applications. The decline in Cobol use over the last few years (smaller than you would think!) is primarily due to universities not having educated new Cobol programmers for several decades: those who could maintain the billions of lines of Cobol code (according to Wikipedia: 220 billion lines as late as 2017) are retiring. The needs covered by Cobol are still there. If C hasn't been an improvement for 50 years, it probably isn't one today - just an emergency solution.
Similarly, Fortran is still a very important language in supercomputing - a revised standard was published less than a year ago. Then again: "I don't know what programming languages will look like in the year 2000, but they will be called Fortran!", as old guru Tony Hoare remarked about all the crazy extension proposals for Fortran 77. Fortran 2023 bears only a vague resemblance to the Fortran of the 1970s.
IBM tried to make PL/1 a common language for all application areas, including system programming. Let us say that it was a partial success for some years - on IBM machines only. (But compilers exist for several other architectures.)
In academic circles, a plethora of widely differing languages were known, and taught, in the late 1970s and 80s, such as Lisp, APL, Prolog, Snobol, Forth, Algol68 - all very different from the C family. Especially in compiler courses, students were expected to know a variety of language classes, not just the 'algorithmic' ones. The predecessor of C in academic circles was Algol60 in the 1960s and 70s, with Pascal taking over in the 70s and into the 80s.
At some universities, for OO programming Simula67 (an OO extension of Algol60) was essential, but the world in general wasn't ready for OO at that time. Algol68 offered a lot of exciting 'academic' extensions that you might call 'experimental', so it was widely studied at academic institutions, but hardware wasn't ready for it yet, so few people used it for any serious work.
C entered academia along with those other 'academic' languages that were not widely used in business and industry, and for several years it was not considered a real alternative for production work. The main reason it gradually took over the scene is that during the 1980s, universities dropped the teaching of other languages: people fresh from university didn't master any language other than C. 95% of all 'new' languages arriving after the late 1980s are largely based on C syntax; those that initially differed a lot have been modified to become more C-like over time, as that is the only style today's programmers know.
Also, up through the 1980s, production work was done in lots of either proprietary (but not that different) or domain-specific languages. E.g. at one point in time, it was said that 50% of all the world's digital phone switches were programmed in CHILL, a special-purpose language developed by the International Telecommunication Union for that purpose; a fair share of the rest were programmed in Erlang. Both have essentially been displaced by C.
Give a programmer of today a program in Lisp, APL, Snobol, Forth ... and he would hardly recognize it as a computer program. If you try to present arguments for any not-C-looking language today, you are usually met with a blank stare. The (few) who care to listen to your description may answer with how the same thing can be achieved in C, or with C++ classes - so there really is no need for that facility you describe. No need for anything but C/C++. If all you've got (or all you master) is a hammer, the whole world consists of nothing but nails.
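To make that concrete, here is a small illustration (a hedged sketch; the APL and Lisp one-liners in the comment are from memory and only meant to show how differently they read):

/* Sum of squares of an array. In APL or Lisp this is a single
   expression that looks nothing like C:
       APL:  +/X*2
       Lisp: (reduce #'+ (mapcar (lambda (x) (* x x)) xs))
   The C loop below is what today's programmers recognize. */
#include <stdio.h>

int main(void)
{
    int x[] = { 1, 2, 3, 4 };
    int sum = 0;
    for (int i = 0; i < 4; i++)
        sum += x[i] * x[i];
    printf("%d\n", sum);    /* prints 30 */
    return 0;
}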
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
assembler and Fortran here. Maybe PL/1
>64
It’s weird being the same age as old people. Live every day like it is your last; one day, it will be.
|
|
|
|
|
|
Assembler here
A home without books is a body without soul. Marcus Tullius Cicero
PartsBin an Electronics Part Organizer - Release Version 1.4.0 (Many new features) JaxCoder.com
Latest Article: EventAggregator
|
|
|
|
|
COBOL, FORTRAN, PL/1, and various assembly languages. It really depended on the hardware and the application. What's interesting to note is that, other than the assembly languages, none of the high-level languages of the time suffered from buffer overruns, use-after-free, use-before-allocation, or the whole host of possible memory-management errors that have resulted in roughly 90% of all vulnerabilities.
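For anyone who hasn't met those error classes by name, here is a minimal, purely illustrative C sketch (hypothetical code, not from any real system); each marked line is the deliberate mistake:

/* sketch.c - the memory-management error classes named above */
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char name[8];
    strcpy(name, "far too long for eight bytes");  /* buffer overrun */

    char *rec = malloc(16);
    free(rec);
    rec[0] = 'X';                                  /* use after free */

    char *p = NULL;
    /* p[0] = 'X'; */    /* use before allocation - commented out, would crash */
    (void)p;             /* silence unused-variable warning */

    return 0;
}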
|
|
|
|
|
Can you give an approximate timeframe for when C took over? Was DOS written in C?
|
|
|
|
|
DOS is written in assembly language - originally in 8080 assembler. It is based on CP/M, which was an OS for the 8080. 8080 assembler is essentially source-code compatible with 8086 assembler, though the 8086 of course has lots of extensions; I don't know how much those were used in the very first DOS versions (for the 8086-based IBM PC). Somewhere down in my basement is a ring binder that came with an IBM PC: the entire DOS source code is published there. If I could find it, I could tell you, but a quick search was unsuccessful.
Note that DOS is not a single OS, nor from a single vendor. There are at least half a dozen DOS variants from different vendors for IBM PC compatibles, each in multiple versions. Maybe some of the more recent ones were written in C. If anyone were to write a DOS emulator today, it would of course be implemented in C.
The point when C took over is very diffuse, and people will give (highly) varying answers. It started spreading in academia through the 1980s, but didn't become what you'd call dominant until the late 80s. It probably happened a few years earlier in the US than in Europe, but even in the US, it took quite a few years from its introduction until it had squeezed out everything else.
In business and industry, it took a lot longer. To some degree, it hasn't happened yet ... (ref my other post). Let's say that in new application domains, such as internet communication, C has been dominant or the only option since the late 1980s. In established application domains, such as business, supercomputing, CAD/CAM and several others, C didn't gain a strong foothold until the 1990s, possibly the late 1990s, into the 2000s or even later - but that varies a lot with application domain.
Most academics will tell you that it happened much earlier - which is true within academia, which is what counts to a lot of academics. Lots of them consider Fortran and Cobol, and any other language with a not-C-like syntax, to be dead, historic languages.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Before C there were lots of higher-level assembly languages (Jean Sammet wrote a thick book in the '70s, maybe even the late '60s, with the Tower of Babel on the cover).
I myself used assembler (PDP-8, PDP-9) until I ported BCPL to the PDP-9, later using BCPL on and for the PDP-11, with cross-compilation for the P860 (a small Philips 16-bit computer with only paper-tape input and output).
I actually wrote a lot of software in BCPL, including parser generators and a compiler for Algol 60 on the PDP-11.
It was around 1978 that we got Unix on a PDP-11 and obtained the original C book.
|
|
|
|
|
For me, it was mostly assembler - first for the PDP-8, then the IBM 360 and PDP-11, and lastly, the Motorola 6800 and Intel 8xx8 processors. Of course, in those days I specialized in operating systems, device drivers, system utilities, and hardware diagnostics.
Most of my peers used COBOL or FORTRAN.
__________________
Lord, grant me the serenity to accept that there are some things I just can’t keep up with, the determination to keep up with the things I must keep up with, and the wisdom to find a good RSS feed from someone who keeps up with what I’d like to, but just don’t have the damn bandwidth to handle right now.
© 2009, Rex Hammock
“If you don't have time to do it right, when will you have time to do it over?” - John Wooden
|
|
|
|
|
|
That list is a nice reference, but it only tells you when the language was developed, in several cases only in its very first version, and nothing about when it became widespread, generally adopted.
If it became widespread, generally adopted! Most of them never were. An entry in Wikipedia only proves that at least one person still remembers the language.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Non sequitur. No programming language is mainstream.
|
|
|
|
|
FORTRAN, COBOL, BASIC, Pascal
"It never ceases to amaze me that a spacecraft launched in 1977 can be fixed remotely from Earth." ― Brian Cox
|
|
|
|
|
|
Well crap, there goes my weekend plans.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
I spend a lot of time trying to get my code right. Sure, I'm not immune to bugs. Could I be better about methodically testing? Absolutely, especially since I hate that part, but I think for the most part I do a pretty good job. I just spent a while tracking down all kinds of little issues to get my SVGs rendering pixel-perfect. They now look better than the reference implementation I've been using.
Meanwhile, Microsoft's mail client dies about every other time my computer suspends itself. Their Windows taskbar gets confused and starts stacking task icons almost completely on top of each other, etc.
If big companies like MS are pushing user expectations downward in terms of software quality, it makes me wonder.
Other than integrity and self-respect, why do I care if my code works, if Microsoft doesn't? If IBM doesn't? If Oracle doesn't? You know?
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Because for some reason, clients spend thousands and thousands on big suppliers that don't get the job done, but when it comes to us little folk, they want only the best, at the lowest price, and they want it yesterday.
|
|
|
|
|
True. Maybe it's because Microsoft basically has a captive consumer base.
Operating system vendors end up being a small pond with big fish, just because of the sheer man-hours and capital expenditure it takes to develop a modern OS.
You have what? Apple's OS/Linux/Windows*. Unless you want to go totally off the beaten path with something like QNX, but that's usually cost-prohibitive for compatibility and user-education reasons.
Apple doesn't really compete except as boutique because they've priced themselves out of ever being a mainstream consumer product, although to their credit, they've expanded that boutique market more than I thought it could bear.
Users on Linux is something that IT people scare their children with if they misbehave.
So Windows it is. What real choice does one have?
*ChromeOS doesn't count. Don't even go there - it's a phone with a keyboard, not a PC.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: *ChromeOS doesn't count. Don't even go there - it's a phone with a keyboard, not a PC.
The way I see it, ChromeOS is for the Anything-But-Microsoft crowd that isn't technical enough to use Linux, but is smart enough not to want to pay the Apple tax - or at least smart enough to avoid it. Looking at ChromeOS, though, I have to question whether those buyers realize what they're getting themselves into.
Sometimes I wish there were more competition, but then I'm reminded that interoperability is a mess even with just the few options that exist today. More would just compound the problem.
|
|
|
|
|
There is surely apathy on the part of people employed at large companies like those you mention. Can't work out a problem? Push it out anyway; if it becomes a problem for enough people, someone else will surely get the job of fixing it eventually.
But when it's your own, and your name gets attached to it, you tend to take pride in your work and don't want to be made to look bad. There's really something about putting a project together on your own that works better than what a multi-billion-dollar company can do.
That's why I would bother.
|
|
|
|
|
I wonder if I left the wrong impression with my initial comment. I absolutely agree with you, but I wrote what I did as food for thought.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|