|
ErnestoNet wrote: Consider that C can work in microchips with very little memory and a very
reduced CPU instruction set
You are wrong.
You are confusing language, compilation and executable. One uses a C compiler to create a program; the resulting executable then runs on the chips that you cite.
compiler != program != executable
ErnestoNet wrote: Java and .NET are very different that C. They are "platforms", not languages
Wrong.
Java and C# very specifically are languages.
I would guess that you are unaware of real time Java which is specifically targeted at embedded devices.
You are confusing language with libraries, and you will NOT be able to write a C program that does anything discernible without the C libraries. And it is quite possible to write a C program on Windows that would never run on the chips that you cite.
ErnestoNet wrote: I'm not defending C, but most modern languages aren't languages, they are
frameworks.
What exactly do you think that the C++ Standard Library is? Where do you think 'printf' comes from in C?
ErnestoNet wrote: C/C++ do not target app developers. They target compiler developers, OS
developers, driver developers and libraries developers. I guess
thats why C tops TIOBE (and C++ is 4º).
Nonsense. The people who write compilers, drivers, OSes and libraries are a small, small fraction of the developer base. If those were the only people using it then it would be far down the TIOBE list.
Developers creating business applications use it. That is why it shows up there.
You would be better off citing embedded development but there is also C++ and Java development in that domain.
|
It's not about being right, but if you want to be right, there you have it:
YOU'RE RIGHT!
That said, if you don't use the STL, you don't load it. The header approach only compiles (and loads) what you use.
In fact, when you compile and statically link, only the functions that you use are linked. In Java and others, you load everything (even if you don't use it).
About embedded development, I know a lot of people who work in it, and they work mostly with assembler and C. Java and C++ may exist, but they are not used.
Today embedded has changed with ARM cellphones.
A 2GB-memory, dual-core-CPU cellphone is closer to a PC than to embedded. There you can run Java, as on a PC.
A real, cheap embedded chip with 64KB of memory and a slow processor can't run Java or C++ with decent performance.
Java embedded only runs on ARM to start with. It requires at least 130KB (with tweaking) and 700KB by default. It requires a network connection! It does not support real time. You can check everything here:
http://www.oracle.com/ocom/groups/public/@otn/documents/digitalasset/1852008.pdf[^]
I'm going to say it again, just because Bjarne Stroustrup said it and I think he is right:
Java and .NET ARE PLATFORMS.
it´s the journey, not the destination that matters
|
ErnestoNet wrote: In Java and others, you load everything (even if you don't use it).
Wrong. In a number of ways.
First, both C and C++ with modern compilers commonly rely on shared libraries. Modern OSes load the entire shared library even if only one method is used.
Static linking, although possible, isn't normally used on desktop OSes, and there is no reason to expect that Java/C# would need to do anything different on a desktop OS.
In contrast, Java and C# do something similar, but, at least for Java, that is done solely as a run-time performance optimization. One can create a stripped-down version of the library if one wants. And one can also load only on use (at least in Java), even when using the standard API jars. I suspect the same is true for C#. And that is true for a desktop OS.
If one has an embedded Java SDK then there are going to be substantial differences when compared to a desktop OS. But the same thing applies to using C in embedded systems as well.
ErnestoNet wrote: Java and C++ may exist, but they are not used.
If it wasn't used then there wouldn't be companies creating compilers for just that purpose; Googling confirms that. A company selling nothing but compilers wouldn't be viable if there wasn't a sizable market of people creating applications with it.
As another example the following is a link for a company that specializes in creating java embedded solutions.
http://www.k-embedded-java.com/[^]
Following is a device.
http://www.avidwireless.com/AVIDdirector.html[^]
ErnestoNet wrote: Java embedded only runs on ARM to start with. It requires at least 130Kb (with
tweaking) and 700kb
That is a product that Oracle offers. It is far from the only product out there.
|
ErnestoNet wrote: You said it yourself, in C and C++ you can statically link. Or not use. In Java/C# you can't. You load 42Mb rt.jar in Java.
Actually, no, I didn't say that. I said the opposite, rather specifically.
What part of what I said didn't you understand?
ErnestoNet wrote: About embedded Java in http://www.avidwireless.com/Products.html[^], it looks rather expensive and not a custom solution.
Which is irrelevant.
ErnestoNet wrote: We are talking about different kind of machines here.
If you want to define "embedded" software in a specific way and get the developers who currently call themselves embedded developers to agree that they are not in fact embedded developers then go for it.
But until then my statement stands.
|
ErnestoNet wrote: Text based compilers are easier to write that binary based compilers.
The headers approach is easy to build a compiler from. C is a very
simple language to write a compiler. There are LOTS of C compilers.
At this point I am rather certain that you do not know what you are talking about.
C headers are part of the language. Period.
C compilers implement the C language. Period.
The first fact is only related to the second by the fact that headers are in the language. It has nothing to do with compilation.
There is no such thing as a "binary" compiler in common usage. The best I can suppose is that you are talking about what occurs in a Java Virtual Machine when it processes a Java class file. That process is best described as interpretation, not compilation.
Your confusion about the above also has nothing to do with your confusion about what portability means.
ErnestoNet wrote: More compilers provide better portability.
Wrong. You have confused availability with portability. More compilers means you can use it on more platforms.
In point of fact, almost all C code written for the Windows system will not work on any other platform without extreme care, which is my point about portability in the first place.
ErnestoNet wrote: There are not many cross platform/architecture languages.
You are wrong.
Java and Perl exist on many platforms. Excluding small unix platforms, perl exists on basically all unix systems that C does.
And again availability is NOT the same as portability.
ErnestoNet wrote: Name a crossplatform language and I'll compare that to C in terms of
portability....
Your term definition for portability is wrong.
http://en.wikipedia.org/wiki/Software_portability[^]
By the definition of programming languages, I can always (within resource limits) take a compiler/interpreter that originated on one platform and implement it for another. You are using that as your definition and then claiming that because C exists in many places, that makes it "portable"; but what it actually proves is that it is "popular" and "useful", which is different from portability.
And in addition Perl shows up on all those systems too.
|
Call it availability. C is the most available language.
There is an important point you make:
I can always (within resource limits) create compiler/interpreter that originated on one platform and implement it for another.
"within resource limits" is the key here. C is simple. Java JVM is complicated (and owned).
It's important that you know that I'm not defending C.
It's really ugly compared to other languages and prone to error.
But it's fast (predictable fast, no GC collection pause), very "available", flexible and stable.
Most of the programs I use everyday are built in C/C++.
None in Perl. None in Java.
it´s the journey, not the destination that matters
|
ErnestoNet wrote: "within resource limits" is the key here. C is simple. Java JVM is complicated
(and owned).
Not sure what you think "owned" means, but as far as I know the term "Java" is owned, yet the language is licensed in such a way that I can in fact take it and create a compiler and complete JDK for a new platform free of charge.
See "Licensing" in the following.
http://openjdk.java.net/faq/[^]
If you had to start from scratch then C, as in C11, is not trivial. And I am talking about the language, not the libraries. Implementing everything from scratch would be a large undertaking.
Of course, one would start with the existing source and modify it for a new platform.
Which is the same thing one would do for Java.
ErnestoNet wrote: But it's fast (predictable fast, no GC collection pause),
Again - research Real Time Java.
Other than that modern business usage is not impacted by the modern GC.
ErnestoNet wrote: Most of the programs I use everyday are built in C/C++. None in Perl. None in
Java.
Which is why *I* said that language choice is by user preference and not availability.
Perl is available on all of the C platforms so they don't choose C because it is available.
|
pasztorpisti wrote: I'm open to any reasoning against the above list
In terms of the errors - those are not errors that show up for me. The errors that I must find involve logic errors, whether those errors are made by me or others.
And if you spend significant amounts of time day to day on errors like those then, as best I can judge, you have a problem that can only be solved with a change in process.
pasztorpisti wrote: In my opinion tools (including dev envs and languages) are very important.
I can, and have, achieved orders-of-magnitude increases in performance by changing requirements.
The only way I have ever achieved anything close to that at the implementation level is when the design that led to the implementation was itself wrong. Other than that, implementation improvements can only lead to small increases when everything else is held steady.
The following is what impacts performance and even project success.
1. Requirements - most impact
2. Architecture
3. Design
4. Implementation - the least impact.
1/2 are not impacted by language, although 2 might be impacted by technology.
3 can be impacted by technology, but only minimally by language.
4 is impacted by language.
So language has less impact because the implementation stage itself has the least impact.
pasztorpisti wrote: I think every company that continues other than garage development has some
kind of dev process,
Every company has a process. Some companies have a formalized process. The only measured and significant improvements in development have come from formalizing the process and improving that. As far as I can remember, the improvement measures consisted of fewer bugs (detected at various points in the life cycle), shorter delivery time, reduced cost and reduced maintenance cost.
pasztorpisti wrote: but that is another dimension of the problem of cutting development time and
providing quality on a completely different level: management
No it isn't. Again the only impact from the studies that I read was based on process improvements. Tools of any sort had nothing to do with it.
|
Please understand that I never questioned the importance of a development process, and the same is true for architectural decisions and design. Indeed, the biggest mistakes I have ever seen were architectural/design mistakes. Without good design, or anything above that, we have nothing to speak about. My answers assume good architectural conditions and this time focus on the beautification of the least important 4th step.
Perhaps some day I will work just with the first 3 steps, and I will probably be less concerned about the language and toolset the coders have a hard time with... Seriously!
|
Richard MacCutchan wrote: When it was chosen, it was the only choice.
Really?
The Multics operating system was being programmed in PL/1.
The Burroughs B-5000 and subsequent machines had their operating systems coded in Algol with extensions. And a damn fine operating system it was.
My alma mater, Case Western Reserve University, implemented an operating system with an Algol-like language called Chili.
All of this while C was just beginning to happen and Unix was not yet born.
Richard MacCutchan wrote: I meant it was unfair in that you were judging a language developed in the 80s
by the standards of today's knowledge and technology.
Oh! Does that mean you will be saying nice things about Fortran and COBOL? Or, at least, refrain from saying nasty things about them?
|
Vivic wrote: All of this while C was just beginning to happen and Unix was not yet born.
And when UNIX was born it was developed with C (and its predecessors) in mind. Of the others I only ever worked on Burroughs' Algol-based OS, and it was one of the most difficult I ever tried to understand.
Vivic wrote: Does that mean you will be saying nice things about Fortran and COBOL
I often have. I worked with both languages in the 70s and 80s and found them perfectly adequate for solving specific problems. That is not the case today, but it does not detract from their usefulness at the time.
One of these days I'm going to think of a really clever signature.
|
pasztorpisti wrote: The problem is that the accident has already happened and windows and linux are
already in C.
Which would be relevant, except for the fact that Windows has been re-written several times.
|
The sources of win2k have been leaked. Download them and look at the code. It contains tons of legacy code even from win31. Not to mention the backward compatibility between windows versions. Windows has never been rewritten.
|
I'm surprised no hackers from around the world have broken into Microsoft(R) servers and stolen every OS source code or project (or at least some of them), and then published them on torrents. But Google, Bing, Yahoo and all those other people filter out those results and/or delete them. There have probably been private court orders of Microsoft vs. some hacker stealing source. But I'm pretty sure Microsoft knows more about security than any other security company out there combined, considering you have to send your driver to them to get it signed now. Basically, Microsoft probably has their servers extremely hard to break through or breach; they probably have it to where, if a breach or unauthorized transfer of specific files is detected, it disconnects the servers and computers and locks them down, terminating the hacking or suspicious signal. Who knows, they are probably smart and keep the internet cord unplugged, considering the fact that the government leaked its "state of the art" F-35 blueprints on the internet, where everyone can see them (technically, because if there's internet, there's hackers). Or maybe that's bogus material the government sent out to intentionally trick hackers into thinking they got the "real stuff", but who knows what crazy stuff tech companies do these days.
Simple Thanks and Regards,
Brandon T. H.
Programming in C and C++ now, now developing applications, services and drivers (and maybe some kernel modules...psst kernel-mode drivers...psst).
Many of life's failures are people who did not realize how close they were to success when they gave up. - Thomas Edison
|
Every OS contains tons of bugs; old bugs go, new ones come with the new features. Still, the vast majority of security holes are caused by buggy networking applications and not by the OS itself. The more complex a networking application, the more chance you have of a security hole in your system. (For example, a browser is quite a complex piece of software!!!) Anyway, several Windows and Internet Explorer patches followed the Windows source leaking, not without reason, but because they had to fix a lot of discovered and (I suspect) known but shelved (!!!) bugs that became obvious to the hackers from the sources!
|
pasztorpisti wrote: It contains tons of legacy code even from win31.
Which would be relevant if I had claimed that they had written a newer version of windows from scratch using a black box development effort.
But I am pretty sure that, for the vast majority of code, some command like 'type' has essentially remained the same since pre-Win95.
On the other hand, I know for a fact what is required to move a code base originally written for 32 bits, when that was new, to a 64-bit system.
How exactly do you think that Windows handles VMing? Did you ever attempt the same thing in Windows 95? I can tell you that I did in fact run two OSes on a Win95 box, and it was not easy.
|
Win95 is "just a nice DOS program". Even if they have replaced and extended parts of the OS, they have always stayed backward compatible at the API and source level, and that alone is a good reason for not switching the language, even if all they do is wrap a new kernel. You can take the sources of a pretty old Windows program and, with minimal or no modifications, have good chances of compiling it even for a 64-bit system. I think this backward compatibility drains a lot of their energy, and it implies some practical restrictions (like language).
|
pasztorpisti wrote: Win95 is "just a nice DOS program"
The point however is that windows has been re-written.
pasztorpisti wrote: You can take the sources of a pretty old windows program and with minimal or no modifications you have good chances to compile it even for a 64 bit system
I suspect I can take 'cat' from a pre-Windows Unix variant and, with "minimal" modification, get it to work on Windows 7.
That however doesn't mean that windows wasn't re-written.
|
jschell wrote: The point however is that windows has been re-written.
Well, from our point of view it's not important how much they extended or replaced their codebase. It's a fact that they kept backward compatibility with their old API, and that doesn't leave much space for practically changing the underlying language. Lots of parts have been tweaked and replaced in Windows in the past decade, but I was shocked at how many hacks have been kept in the code, along with Win9x compatibility layers. I'm pretty sure the level of backward compatibility they maintain often ties their hands pretty tightly. By hack I really mean a (sometimes really dirty) hack, for example to avoid crashes of specific popular old programs that have bugs which don't crash the program on older systems but, without special handling, would simply die on NT. And the hacks are explicitly commented with bug IDs and reasonings. This is another hidden face of Windows' backward compatibility. I think what makes Windows successful also holds it back in development, but I really respect the MS coder guys for what they achieved. Keeping backward compatibility on such a large scale is tremendous work.
They don't rewrite everything. What I was curious about in the sources was module loading, which consisted of pretty old sources with a lot of Win3.1 and Win9x code. They use a lot of typedefs, which makes porting relatively easy even to 64 bit, and that's quite OK as long as they go on with backward compatibility.
jschell wrote: I suspect I can take 'cat' from a pre windows unix variant and with "minimal" modification get it to work on a windows 7.
That however doesn't mean that windows wasn't re-written.
I wouldn't compare the complexity of a cat program with even the simplest Windows GUI program. The same is true for the WinAPI versus POSIX. The POSIX API doesn't contain API calls that seriously enforce Windows-specific compatibility restrictions internally. On the other hand, if a program, for example, subclasses a Windows common-controls dialog and expects the border size to be X pixels, and hacks around with GUI hooking and expects you to send unrelated window messages to avoid a crash, then you have a very complex (and sometimes not too well designed) API system to simulate natively, and that's nowhere near as easy as implementing a pure, simple POSIX API.
Well, we are talking about why they haven't changed the language, and the answer is clear: source-level backward compatibility. It's pointless spinning around how much of Windows they rewrote - it's simply meaningless; if they decided to keep backward compatibility, it's impractical to start thinking about changing the language.
|
Windows isn't written in Basic.
So, based on your reasoning, why does Microsoft keep producing that?
|
Ah, sorry, I misunderstood your post, so let me correct my answer. I don't know the answer to that, and I'm not really interested in researching one because I'm not a big Basic fan. I could just guess, and I don't like doing that. You can, however, open a new topic for that, and then everyone can tell their opinions or whatever they know about it. But how is this related to bad/good things in C/C++?
modified 22-Sep-12 23:55pm.
|
pasztorpisti wrote: like header files that terribly slow down the compile time
Do you have any proof supporting this sentence, regarding the C language?
pasztorpisti wrote: Again, the only reason for the existence of C/C++ is massive amount of legacy code
This is an opinion (mine, for instance, is completely different).
Veni, vidi, vici.
|
CPallini wrote: Have you any proof supporting this sentence, regarding the C language?
Yes. In recent years we have had many occasions when we had to rearrange the header includes and optimize compile times for our CI system. We compiled the codebase (~2 million LOC) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid, and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. That's what I'm talking about, not some few-file hobby project where it makes no sense to measure such things.
CPallini wrote: This is an opinion (mine, for instance, is completely different).
And could you make a list of language features and compare them to some other languages that have better support for them? I see significant deficiencies in C++ compared to some other languages, and its syntax becomes more and more complex with every new draft. A language that has redundant features and backward compatibility with an ancient other language simply can't be "optimal".
|
pasztorpisti wrote: Yes, in the last years we had many times when we had to rearrange the header includes and optimize for compile times for our CI system. We compiled the codebase (~2millions loc) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. Thats what I'm talking about not some few file hobby projects that make no sense to measure such things.
Still, that is not proof. You should compare it to the compilation time of a similar project written in your favourite language and achieving the same performance (if your favourite language could assist you in that).
pasztorpisti wrote: And could you make a list of language features and compare that to some other languages that have better support for that?
C and C++ are performant. No other language (other than assembly) compares with them. You should know that.
pasztorpisti wrote: I see significant deficiencies in C++ comparred to some other languages, and its syntax because more-and-more complex with every new draft.
While, for instance, C# syntax is becoming simpler?
pasztorpisti wrote: A language that has redundant features and backward compatiblity with a thousand years old other language simply can't be "optimal"
Still, it is compatible.
I wouldn't call it 'optimal'. However, I like it (this doesn't mean I show apparent disgust for other languages - with the very exception of COBOL).
Veni, vidi, vici.
|