|
Clifford Nelson wrote: If I remember correctly, VB pre .NET was from 1
Not really - you had the choice of specifying whether arrays started at 0 or 1 (via Option Base), but the default was 0.
|
|
|
|
|
Terrence Dorsey wrote: I can think of a whole array of reasons
Starting from 0 or 1?
|
|
|
|
|
0 isn't a real number anyway, how can you have zero of something? Oh I have zero cake (and I assure you, I don't).
Next you'll be telling me that I can keep counting backwards: 3, 2, 1, 0, then ?, ??, ??.
What kind'o'crazy maths is this?
|
|
|
|
|
Zero is my hero.
|
|
|
|
|
Look at it this way: when you are born, you are zero years old. It's perfectly natural to start at 0.
|
|
|
|
|
You get used to it.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
|
|
|
|
|
Starting at 1 is just silly from a low-level perspective...
array[n] is equivalent to telling the computer "I want to read/write the data at the start address of the array, plus n * the size of an item in the array". If you started at 1, you'd either waste an item's worth of memory or have to compute "(n - 1) * the size of an item in the array", which adds an extra operation for every array access.
One way wastes memory, the other wastes CPU time; just start at 0 and be more efficient.
|
|
|
|
|
So to help you get prepared we have created a six-week plan for you. The plan is based on a minimum of 10 hours of study time per week. The better prepared you are, the better you will do at implementing Windows 8 within your curricula, or even simply at having the discussion with students using Windows 8 next term. From a Microsoft perspective we have a massive amount of material to help you prepare and create a very comprehensive application, and we provide you with a suite of tools and documentation to help you create your very first Metro 8 app. Get ready to graduate magna cum app!
|
|
|
|
|
Terrence Dorsey wrote: Get ready to graduate magna cum app
If you are too lazy to strive for summa cum app.
|
|
|
|
|
Is it worth the effort? There are a lot of people out there still using XP. That should mean that I can continue to use Windows 7 until at least 2022 (XP came out in 2001, if I remember right).
|
|
|
|
|
I am just now switching to Windows 7; I figure it has been thoroughly tested by now.
I have no intention of ever switching to Windows 8, I will wait for the new improved version after that.
Just because the code works, it doesn't mean that it is good code.
|
|
|
|
|
I think you are right about the decision never to switch to Windows 8. Vista was bad; Windows 7 was the fixed second version. Windows ME was bad; Windows XP was the fixed second version. Windows 95 had issues, and 98 basically fixed them.
|
|
|
|
|
If you’re a software developer and you’re thinking about changing jobs, you’re probably at least a bit anxious (if not downright freaked out) about the prospect of facing a whiteboard armed with only a trusty dry erase marker and your wits while an interviewer fires a coding question at you. That’s not shocking because software development interviews are weird: the skills necessary to answer the technical and behavioral/situational questions that are asked don’t necessarily map 1:1 with the skills to be a good developer. Ask me the questions, bridge-keeper. I'm not afraid.
|
|
|
|
|
Coding questions are not the difficult parts of interviews. The more difficult ones are architectural, interpersonal, and "what are the advantages of [technology-x] over [technology-y]" questions. Oh, and what to wear.
|
|
|
|
|
I used to get anxious about the prospect of a coding question. But then I had an interview for a programming job with no questions related to actual coding. Now I'm anxious about the possibility that there may be no coding question.
|
|
|
|
|
I loved the interview I had with a company looking for a WPF developer. They asked all sorts of questions, but not one on WPF. Of course, I did not get the job, since my expertise is WPF.
|
|
|
|
|
My favorite non-coding but code-related question I've been asked in an interview:
Interviewer: "Tell me about Java."
Me: "Well, what would you like to know? History? Platform independence? Coding examples? Structure? Or something like polymorphism?"
Interviewer: Blinks a few times and does his best deer-in-the-headlights impression.
Me: Knowing I won't get the job at this point. "Ummm, I think I should be going."
|
|
|
|
|
Apple has been working on its file system, and with iOS it had almost killed the concept of folders, before reintroducing them with a peculiar restriction: only one level! With Mountain Lion it brings its one-folder-level logic to OS X. What could be the reason for such a restrictive measure? What's wrong with all your files right there on the desktop?
|
|
|
|
|
Can Apple fix the problem of software companies making the Documents folder their default folder, or dumping their own folders into it? I hate getting extra crap in the Documents folder. Not even Microsoft keeps the Documents folder clean; it seems to me that Visual Studio should have its own folder under the user folder, not use the Documents folder.
|
|
|
|
|
When the Director of Research for Google compares one of the most highly regarded linguists of all time to Bill O’Reilly, you know it is on. Recently, Peter Norvig, Google’s Director of Research and co-author of the most popular artificial intelligence textbook in the world, wrote a webpage extensively criticizing Noam Chomsky, arguably the most influential linguist in the world. Their disagreement points to a revolution in artificial intelligence that, like many revolutions, threatens to destroy as much as it improves. Chomsky, one of the old guard, wishes for an elegant theory of intelligence and language that looks past human fallibility to try to see simple structure underneath. Norvig, meanwhile, represents the new philosophy: truth by statistics, and simplicity be damned. No biting, no scratching, no kicking, no gouging, no kickboxing, no punching, no slapping, no spitting...
|
|
|
|
|
It's essentially the same discussion as when Deep Blue defeated Kasparov in the '90s (whether it really did is controversial).
There have been numerous attempts to create a program that played chess 'like' a human, especially before the microprocessor era, with mixed success. But once processors and memory reach a certain capacity, the brute-force approach will always win.
One can argue that an advanced chess engine like Houdini just crunches numbers really fast and doesn't really understand that it is playing chess. But arguing that the machine doesn't understand chess falls into the category of the "fallacy of exhaustive hypotheses". If you make a deep abstraction of what our biological brain actually does, it's all just number crunching as well.
Giraffes are not real.
|
|
|
|
|
When placing audio and video elements on a web page, I’ve worked on a number of projects where, for one reason or another, clients want their media to autoplay. To us nerds it may seem like common sense that automatically playing media at a visitor is a bad idea for accessibility. The W3C has made this clear with its WCAG guidelines – we’re nerds; we care about these kinds of things. Oh no! Mute, mute, mute!
|
|
|
|
|
Last year, I coded an embedded YouTube player to automatically start playing a video with the sound muted. Clicking a button that was over the video controls made the button disappear and the video unmute.
|
|
|
|
|
I have run into web sites that do this, and it can be really irritating. What is really bad is that you do not even know which page is the irritant. There was a restaurant web site I went to where, even after I turned off the sound once, the sound came on again when I navigated back. I hated it.
|
|
|
|
|
Over the past five years, Apple’s software and hardware have redefined the mobile and tablet industries. Google, Samsung, RIM, Amazon, HP, and others have all tried to follow suit. But to date their offerings have been subpar: good enough at best. When you’re competing with Apple, good enough is not good enough. Because even to Apple, good enough isn’t good enough. Software’s natural vector is towards complexity.
|
|
|
|