|
Are you asking which one not to go to?
|
|
|
|
|
|
It wasn't one hospital. I wrote it for a hospital system that McDonnell Douglas sold throughout the USA. I don't think the system is in use today.
|
|
|
|
|
I never saw it as critical to human safety, but some parts of it were pretty critical indeed.
The rest of my development path has always been in financial areas; this was the only project where I had to deal with this kind of data.
|
|
|
|
|
I've worked on three projects where operator safety was a concern.
The first was an imaging device that used three lasers to write an image on photographic film. They weren't the problem. The problem was the infrared laser used to track mechanism positioning, which could cause blindness or other injuries. The engineers were so afraid of a software crash leaving the IR laser on that I had to set a bit on an I/O port ten times a second to keep power on the machine.
The second was control software for a fluorescent light bulb manufacturing line. At one end, you have a blast furnace churning out molten glass. Six hundred feet later, you have a very hot glass tube being measured and cut using (get this) sprays of water.
The most important one was an emulation of the flight control software for a special version of the F-16 fighter jet. The purpose of the emulation was to verify and validate the flight control system design. This special version of the F-16 was a test bed for a number of fancy controls based upon aerodynamic instability. The aeronautical engineer on the project explained it like this to me: "If we screw up, and the flight control system stops working, the pilot dies two seconds later."
Software Zen: delete this;
|
|
|
|
|
I did training software for the F-15. In training mode the pilot could simulate using the weapons during a flight mission. It was pretty important that the missiles didn't fire... Ah yes, in assembler on a Z8000... those were the days.
Toto1107
|
|
|
|
|
Our emulation for the F-16 was in Ada on a MicroVAX. Unfortunately, Ada compilers at the time were relatively rare, and we were a beta site for the company that wrote the one we used. If you compiled a generic package, it crashed VMS, which up to that point I had thought was impossible.
Software Zen: delete this;
|
|
|
|
|
Gary Wheeler wrote: "If we screw up, and the flight control system stops working, the pilot dies two seconds later."
At which point you promptly discarded the beer you'd been sipping on.
"For fifty bucks I'd put my face in their soup and blow." - George Costanza
|
|
|
|
|
Actually it was coffee, but you've got the right idea.
Software Zen: delete this;
|
|
|
|
|
A few years back I was talking to a nerdy German programmer whose software had been responsible for killing a whole bunch of children a year earlier. I asked him how he felt about that, and he said it was good to get some feedback on how well the software was working, since up to the time of the incident he had heard nothing.
|
|
|
|
|
John Stewien wrote: it was good to get some feedback
Please tell us he wasn't that blasé about what he'd done.
Software Zen: delete this;
|
|
|
|
|
He didn't exactly have a strong personality, so I've tried to convey the strange feeling that I got when he answered. Some people are too caught up in the mathematics to worry about anything else in life. I don't think the humanity side of it registered with him.
|
|
|
|
|
Why should it? It's not like he did it on purpose.
|
|
|
|
|
If you write software whose failure can cause injury or death, then you have a moral and ethical obligation to ensure its correctness.
The OP is pointing out that the programmer in this case is oblivious to the human cost of his ineptitude.
Earl Truss wrote: It's not like he did it on purpose.
I believe it's called "due diligence". If you choose not to exercise it, it's your fault.
Software Zen: delete this;
|
|
|
|
|
There was no mention of the programmer being inept in any way. I assumed that if he was working on software that might endanger property or people, it was tested "properly". We can't guarantee it's perfect, just that it has been tested as required. I agree that "due diligence" is the test. If you exercise it and the software still fails, it's probably not your fault, even if you feel it might be...
|
|
|
|
|
Earl Truss wrote: even if you think it might be ...
Which is pretty much what most people would feel. Except for this programmer in question. No guilt and calling deaths "feedback" is inhuman.
|
|
|
|
|
My argument is that the programmer's attitude makes him incapable of exercising due diligence, and therefore the software was not tested properly.
Frankly, his response to the tragedy sounds sociopathic.
Software Zen: delete this;
|
|
|
|
|
Manager: Some people just died using your software. You'd better fix it.
Dev: Did you log it in Bugzilla?
Manager: No...
Dev: Then I'm not going to fix it.
or
Manager: Some people just died using your software. You'd better fix it.
Dev: Yes. They didn't use the right combination of arguments.
Manager: They died.
Dev: Not my fault they didn't RTFM. Dumbarses. We done here?
I'm making light of it but holy crap, what a whack job.
|
|
|
|
|
Man, you guys are reading a lot into a short comment from someone else. You really don't know the situation or the real conversation, so it seems like some of these comments are assuming the worst. I'd think the people who responded "not sure" deserve a bit more questioning.
|
|
|
|
|
Yeah. That's what we do. It's more interesting than being objective, fair, and rational...
|
|
|
|
|
Given my experience with lusers, chances are GIGO and PEBKAC are the #1 and #2 causes of "bugs" in any given system.
|
|
|
|
|
Isn't there a similar issue with people who design/build cars that people get killed in? Or safety systems, like airbags, that can inadvertently kill people under the wrong circumstances? Using "stuff" has an inherent risk, I guess.
There are side effects, too. I worked on a system that fueled a car on natural gas or propane. If it was installed incorrectly by some third-party guy who had taken a course from us to be certified, there were some potential hazards, one of which was a backfire igniting a fuel injector and burning up an engine. It was hard for the software to detect what was going on there, but the whole project had a certain safety aspect associated with it that we used to worry about.
But there are some things out of your control, once the product leaves your hands. I'm not agreeing with the attitude conveyed in the short quote. Maybe it was more of a coping mechanism on the guy's part, the way that air traffic controllers refer to "packages" rather than "airship with 300 lives aboard". I'd like to hear the whole story before passing judgment...
|
|
|
|
|
GuyWithDogs wrote: Maybe it was more of a coping mechanism on the guy's part, the way that air traffic controllers refer to "packages" rather than "airship with 300 lives aboard". I'd like to hear the whole story before passing judgment...
Yes, that was my thought also ... although I did not state it first. Thanks.
|
|
|
|
|
Yeah, that's why you hear military guys talk about "targets". Your mission is to hit the target; that way you don't think about the consequences. Those hit you later...
|
|
|
|
|
Any indication of whether they had any QA on the team, or whether he had written unit tests?
ed
~"Watch your thoughts; they become your words. Watch your words; they become your actions.
Watch your actions; they become your habits. Watch your habits; they become your character.
Watch your character; it becomes your destiny."
-Frank Outlaw.
|
|
|
|