Mr. Spear was writing about a fantasy world - not reality! You write like a real Ph.D.
Some humans were brilliant programmers - Steve Jobs, Bill Gates, Charlie Bachman and many others come to mind, regardless of what I did. I don't think any of them had one of those - you know (a Ph.D.)
Yes, I pile it higher and deeper at times. I did actually get a degree in BS.
Steve Jobs was not a programmer - where do you people get this stuff?
He was a salesman and a manager, that's it. Oh, and human.
And a forward-thinker, not many of us are! My mistake - of course I meant 'The Woz'.
Correction: Steve Jobs was not a brilliant programmer, though no doubt he was a brilliant salesman. Common misconception - in reality, Jobs sold Wozniak's programs.
Hi r, I am a brilliant programmer - created the very first mainframe terminal emulator program for the Apple II - called it 'STEM' (you figure it out). Wished I'd had Steve around to sell it for ME!
For sure - who doesn't want Steve to sell their stuff? Someone who can convince a planet full of people that they need some crap they had never even seen before has to be the greatest salesman.
Edsger Dijkstra pointed out philosophy majors as one of the better sources of good programmers in his article criticizing computer science programs for not producing a better crop of programmers. You see, there is very little opportunity to learn analysis in computer science, because the programs are oriented to teaching technique and theory - it wasn't until graduate school that I was REQUIRED to have adequate error handlers in my homework code, or to analyze its performance and optimize it for peak efficiency. More than half of the students in my classes in the M.S. Comp Sci program had degrees in areas other than computer science.
Contrast this with philosophy, where detailed formal analyses of philosophical positions and the consequences that arise from them are demanded from you starting sophomore year. Or to my major, psychology (the B.S. kind, not the B.A. kind), where in my sophomore year I had to turn in nine separate experimental reports with analysis (my B.S. program also required two semesters of BASIC programming, as our faculty believed that learning to program would help us perform statistical analyses and mathematical models of behavior as the state of the art improved). And of course there's physics and mathematics, both of them producing bumper crops of programmers every year. Computer science programs have improved a bit thanks to the criticisms of Dijkstra and others...but the other fields have not themselves grown less difficult.
I don't know what your background is, but the fact is, ANYONE can become a programmer without ever having to learn to write an explicit report detailing how and why their program works, and what its side effects could be if left in operation. Most programmers, in fact, are barely competent enough to leave understandable comments. About 90% of programmers are what I call "coders", as they will code whatever they're told to code. Virtually every programmer I've met who I'd consider to be in the ten percent I call "developers" had a bachelor's degree in another area, and sometimes a master's in another area as well.
You might want to consider a bit more exposure to philosophy and psychology yourself, enough at least to avoid making yourself look like a bigoted pinbrain by making childishly insulting and glaringly ignorant remarks about these fields - believe me, there's plenty of room for critiques and VALID insults in both fields if only you know more about them than the fact that you don't like them.
As I said earlier, your story was amusing, it's just the conclusion that rankled. Your defense of that conclusion is worthless, useless, and not amusing in the least, except in one small respect: you're actually boorish enough to write this stuff where anyone can read it, and still believe you can be taken seriously.
Hmmm... I'm wondering how many sense-of-humour failures I can notch up today. Yes, yes - you're right... except in one thing: not anyone can become a programmer. I'd say one person in a hundred probably can, in a real sense. I also don't think it's a discipline that can be taught unless you're predisposed to doing it anyway.
Incidentally, I'm not defending my conclusion, and I hardly expected anyone to take it seriously, let alone waste five paragraphs attacking it. However (check your psychology degree for this one) I guess I must've touched a nerve since that's what happened.
Dan Sutton wrote: Incidentally, I'm not defending my conclusion, and I hardly expected anyone to
take it seriously, let alone waste five paragraphs attacking it. However (check
your psychology degree for this one) I guess I must've touched a nerve since
that's what happened.
It was the tone of the message more than anything else - it's a tone I'm all too familiar with ("think you're smart, college boy?"). Unfortunately, it's never easy to tell exactly what attitude is behind the print you see on the Web without explicit markers of some sort (e.g. <over-the-top-exaggeration-'cause-i'm-irritated>). Again, and also unfortunately, you managed to quote (with nearly identical phrasing) more than one programmer I worked with in the past, and those quotes were being directed to me personally...which might have been easier to take if I didn't have twice the programming skills asleep that these individuals had on their best day. All told, it was all too easy to believe you meant what you said in exactly the way you said it...but, as my own hot buttons had been pressed, I didn't take the time to ask you if you really meant it that way before I launched my "retaliatory strike." Apologies.
I do note that programming education was pretty happenstance forty years ago, and that few if any resource materials were available to assist your education unless your employer provided them. At my high school in 1974, my first BASIC course was taught using a teletype hooked into the old GE Apple Time Sharing Service, 300 baud (bits-per-second, more or less) and paper tape to store the program. We were taught the language, but not really how to use it effectively. It wasn't until college that I was introduced to structured programming and why spaghetti code was bad...but by the time I was finishing my M.S., enormous advances had been made available to anyone who could buy or borrow a book on algorithms and data structures, or even on how to document your program. And still, the clueless would wind up with programming jobs. I helped hire such a total waste of space myself in 1999 for an Internet start-up...she could program, she just couldn't figure out how to read someone else's code and understand it, so she wrote everything from scratch...and while it worked some of the time, reinventing the wheels we had already built and were busily polishing was not on our agenda that year, especially since it was obvious from her code that she was not going to be giving the Turing lecture any time soon, nor writing any of the three programmers senior to her out of a job. And yes, she was a computer science graduate who, when she resigned (we decided not to tell her we were going to fire her that week anyway), was moving to North Carolina to help her husband sell real estate, thus saving her potential future employers quite a few headaches, IMHO.
Oh, BTW, remind me to tell you sometime about the stripper I met who could debate Kantian ethics as easily as she could twirl her tassels...but that's a story for another day.
Yeah - I got lucky: my first programming teacher taught us Algol 60 (this would've been at some point in the late '70s or something): he refused to teach us BASIC ("I'm here to teach programming, not Pidgin English!") so structure came into it pretty early on, as did Dijkstra, who was a god as far as our teacher was concerned... Of course, by then I'd already been programming (at home) for several years -- I did notice that most of the students who couldn't already program by the time they started the class never really learned -- it appeared to have to be something one would cultivate in one's self, rather than just a type of course material like any other. In any event, after that came a PDP 11/34 which had paper tape readers, punch card readers, and a front panel with a bunch of lights on it where you could program the thing one instruction at a time, by putting your (binary) op code into an accumulator, putting a "store" instruction into the instruction register and then hitting the "execute one instruction" button. Beautiful. Makes you really understand how great the Pascal compiler on the thing was.
I've generally tried to avoid hiring programmers to work on any of my stuff -- as a veteran of the genre, I tend not to play well with others... but I've managed to write some pretty massive projects regardless. I came to America in '93, and the idea of a programmer who knew more than one (or eight) language(s) seemed to be a revelation to the people I ran into, so life has not been dull...
Dan Sutton wrote: I came to America in '93, and the idea of a programmer who knew more than one
(or eight) language(s) seemed to be a revelation to the people I ran into, so
life has not been dull...
This is a conversation I remember from 1997, at least five languages ago.
Me: "Dave, how many languages have you actually used in business?"
My boss: "Crap, I dunno, I lost count around ten."
Me: "Yeah, I count eight myself, but a couple more I can't remember if I used it just in grad school or on work projects as well."
That was before I had done any form of Web development. Not long before, I'd left a job which had a huge AS/400 shop, and most of these guys knew RPG, enough CL to get by, and nothing else at all.
Also, I remember when my girlfriend (later my wife) took FORTRAN, aced every test and paper homework assignment...then choked completely when it came time to write the program and test it. I've taught read-only SQL to clerks who cheerfully claimed to know nothing about PCs - first telling them that it really was easy, then becoming the drillmaster from hell until they did EXACTLY what I told them, at which point they discovered that they were SQL geniuses (or, at least, they could now perform ad hoc queries to help with QA or support issues without having to call a programmer immediately). I probably should have been a teacher, but the expected income sucked so I became a programmer instead.
I've done the same thing with SQL. Unfortunately (a) I suck as a teacher -- no patience; (b) they generally learn just about enough to get themselves into trouble... and then it's all my fault!
It's funny how people's brains work: a girlfriend of mine was excellent at debugging programs, but couldn't develop anything from scratch: something about the lateral thinking required -- it just wasn't there. She declared that women can't program because she couldn't. That appealed to me, somehow.
cpkilekofp wrote: ANYONE can become a programmer without ever having to learn to write an explicit report detailing how and why their program works, and what its side effects could be if left in operation.
See, this is part of the reason Philosophers don't make good programmers - your basic assumption was WRONG!
I wasn't talking about 'writing a report about a program' - it was writing a 'reporting program' as in: Headers, Page #, Body (detail, totals), Footers. We actually had to code these by hand back in the dark ages....
Frank Towle wrote: I wasn't talking about 'writing a report about a program' - it was writing a
'reporting program' as in: Headers, Page #, Body (detail, totals), Footers. We
actually had to code these by hand back in the dark ages....
ROTFLMAO...no. Your understanding of what I said is, er, WRONG. I wasn't talking about your guy, I was talking about what the average Philosophy and Psychology students have to do in order to get even halfway through their degree, that is, describe the workings of complex abstractions in detail in their native language. Now, I understand, there are a lot of computer science programs that include technical writing as a requirement, which I think is a major step in the right direction; one of the reasons I was made to suffer Experimental Psychology was so that, by the time I'd done the ninth or tenth experiment, I'd have learned how to write a passable description of a psychological experiment, i.e. one that would stand at least some chance of getting published in a peer-reviewed journal. Hmmm...these days, you could have had your Philosophical Phlower Child document the code, keeping him out of trouble, if you couldn't push him aside any other way - at least he'd probably know how to write in English.
And yes, I remember in one job we let the weak guy do the report programs - he turned out to be the saboteur by willful neglect, as every one of his programs was Y2K non-compliant even though we were writing this stuff in 1995. In every other job I had after that, the really weak guys didn't last long enough to contribute much - it was a much more competitive environment.
Oh, just a passing thought, but had your guy had a good grounding in Symbolic Logic in his coursework? That was one of the bigger advantages of a Philo. degree vs. some of the other fields, as few of the others dealt in Boolean or other discrete mathematics, leaving their graduates with no familiarity with De Morgan's Rule and the other shortcuts.
Oh, you mean "The negation of a disjunction is the conjunction of the negations." This kind of trap produces compound conditions that should be totally outlawed in software development!
Frank Towle wrote:
Oh, you mean "The negation of a disjunction is the
conjunction of the negations." This kind of trap produces compound conditions
that should be totally outlawed in software development!
Um, it also allows compound logical statements to be broken up into a series of pure conjunctive clauses (i.e. Horn clauses), which allows one to search a set of Horn clauses for those that are true for a particular set of facts. This is the essence and basis of the Prolog programming language.
Properly used, the De Morgan transformation and other logical transformations allow logical statements to be reorganized into whatever best fits the computing environment you must use. It's like the C language: don't throw it out just because some idiot can overwrite the operating system in DOS.
Interesting - I recall being able to assign a 'probability' to something being a fact; I think we called it 'fuzzy logic', but I can't find the specifics. Is some of your last reply related to this? I don't have the time to look up the stuff you mention.
Major sense-of-humour failure in this thread.
Frank Towle wrote: A programmer working with me many years ago either had a short attention span or
leaned on his professors’ admonition that everything in the world is gray… He
would never reuse a snippet that worked and because we were asked to comment our
code (this was back in cryptic Assembler/Fortran land) he would liberally
sprinkle ‘THIS MIGHT WORK’ anywhere there was questionable logic.
Not sure I understand that...the logic was "questionable" yet you are still asserting that it would work absolutely 100% of the time?
Or you just didn't like that the person noted that there was in fact some question as to exactly what might happen?
Frank Towle wrote: Lesson: Don’t hire Philosophy Majors for Dev projects!
Are you referring to a senior developer who has years of experience and fails to match the culture of the group? Or who is just incompetent? Obviously then there is a failure in the interview process in that it didn't weed them out in the first place. Or alternatively didn't proactively review their product once they started and get rid of them when they failed to meet expectations.
Or are you talking about a novice with no experience? Any company that hires beginning programmers and does not provide extensive long term mentoring deserves whatever happens. Those cases certainly have nothing to do with the employee.
Hi J, Ph.M. was 'inherited' with the project. And no, much of his logic would have to be rewritten, before the days of formal code reviews... 'If it ain't broke don't fix it' approach. This was in a package that was sold throughout the 1970's and 80's for $200K+ each copy - We relegated Ph.M. to writing report programs (report writers didn't exist) that we could quickly verify and did not let him near any LOB processing.
'Culture of the group'? We never heard of such a thing then! You just worked with who you were given - but yes, frustrating at times: one of our crew (I'm sorry, TEAM) punched his hand through a plastered wall, he got so mad at something - probably ME telling him he had to do something over... Training programmers was very expensive; we would have several years' salary invested in someone who had never SEEN a computer, let alone programmed one. You would have to agree today is very different! We eventually re-wrote this same package for three different platforms using basically the same design, although you couldn't say the environments were anywhere near the same. I was then asked to head up a NEW Quality Assurance department to get our lack of same under control; the first task was to quantify almost TEN THOUSAND bug reports across the now FOUR platforms - and we were still selling product. 'Outstanding in our Field'!
There would be value in formal 'Computing History' courses to provide perspective about just how far (or not) this industry has progressed in 50 years.
I happen to have two undergraduate majors: philosophy and mathematics (numerical analysis). I have been programming in the robotics industry for 20 years and am quite successful and proficient in a domain which is very demanding.
I have seen many programmers come in and out of this industry, and I can tell you that education is not really what makes a quality developer. We have all worked with many CS graduates who were ineffective and helpless. It is more about the person than the degree.
I thought this issue was well understood. Perhaps not by all.
Agree totally with your conclusion. It's LOGIC, LOGIC, LOGIC.
Still think several extended courses in Computer History would be invaluable to these young whippersnappers who start their career at .NET and never look back. That's how the 'Wheel' gets reinvented over and over!
Recently my group was asked to troubleshoot an embedded program written by our overseas sister company, which was "never coming up" and their engineers(?) couldn't figure out why. A few seconds of perusing revealed the following in the middle of the program's initialization code:
for (long i = 0; i < 99999L; i++) dlytsk(cur_task, DLY_SECS, 9999);
The dlytsk function causes the calling task to sleep for the specified number of seconds. So the effect of this snippet was to sleep for around 32 years before allowing initialization to complete. One has to wonder what the original intent was (both of the programmer and the manager who hired him!).