|
|
I still bring up the Nintendo emulator from time to time for a little nostalgia. To this day I'm still amazed by what those developers could do with so few resources, compared to today's games at 2GB+ in size; and they still have about the same playtime/storyline length as the classics, just the graphics have gotten better over time.
|
|
|
|
|
On Monday, Microsoft shocked the tech world by announcing a pair of Windows 8 tablets, both dubbed Microsoft Surface. But it wasn't just the press event's mysterious nature that made the news so stunning.
For the first time, Microsoft will make its own Windows PCs. The company will be in direct competition with hardware partners such as HP and Dell, and judging from early reactions, Microsoft is in a good position to win. The potential effects of Surface on the PC market can't be overstated.
And yet, anyone who's paid attention to the tech industry for the last five years shouldn't be too surprised. Microsoft's approach with Surface--designing the hardware in tandem with the software--is the same approach that Apple has taken for decades. And Apple's method is paying off--just look at the rise of the iPhone and iPad, and the success of the MacBook Air. It took a few years for other companies to catch on, but now it's finally happening.
|
|
|
|
|
|
REUTERS - Microsoft Corp's (MSFT.O) new Surface tablet looks promising, but many questions need to be answered before it can be considered a rival to Apple Inc's (AAPL.O) iPad, analysts said the day after the company made its first foray into the computer hardware business.
The shares of the world's largest software company jumped almost 3 percent as Wall Street and investors welcomed Microsoft's move. But analysts had questions about the lack of enthusiasm among developers for creating applications that run on the new Windows 8 operating system and the absence of hard details on pricing and availability.
|
|
|
|
|
I don’t claim to be psychic, so I’m not embarrassed by my utter failure to predict what Microsoft would announce at its mystery event in Hollywood on Monday. I was seduced by the scuttlebutt that the company would unveil a Kindle Fire-like media-consumption tablet with Xbox and Barnes & Noble Nook-related features, running something that might not be Windows 8. It sounded more plausible than other possibilities.
|
|
|
|
|
|
Always interesting to see the different paths people take. His Eightfold path is a little too long for me. I went with:
1. Pay Apple
2. Unity3D
3. Publish Apps
This path also lets you target Android by replacing "Apple" with "Google". Works for games at least.
|
|
|
|
|
|
Josh moving to iOS development was an interesting one. However, his love for WPF (WPF/.NET developers, more to the point) has driven him to start writing a book for .NET developers about iOS development. He sets out to explain the differences, his experiences, and the general principles of iOS development.
http://ijoshsmith.com/2012/06/05/sneak-peek-reference-vs-pointer/
I probably won't go down the iOS development route for the moment, but it will surely be an interesting read!
|
|
|
|
|
I keep hearing aphorisms about the "software crisis" and the lack of progress in software development. I have been programming for over 15 years, and I find such claims to be completely false: I am convinced that I could reproduce with today's tools the work of a competent programmer of 15 years ago in a small fraction of the time. By analogy to Moore's law and (more appropriately, because of its intention to provoke, rather than predict) Proebsting's law, I propose that programmer productivity doubles every 6 years. This assumes the codebase does not expand as a power of the aging developer's increasing girth.
|
|
|
|
|
I know that the tools provided in C# have made a big difference. I would be a lot less productive if I could not use data binding in WPF, which lets me use the MVVM pattern, and I know that LINQ also has a big impact. Anonymous methods let me put everything in one method instead of having to spread the logic out, making the code more obvious.
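None of this is really C#-specific; the loop-versus-query contrast behind LINQ shows up in any language with comprehensions and inline functions. A rough Python sketch of the same idea (an analogy only, not the WPF/LINQ code itself, and the order data is made up):

```python
# Hypothetical order data: (name, quantity, unit price).
orders = [("widget", 3, 2.50), ("gadget", 1, 9.99), ("widget", 10, 2.50)]

# The "old way": an explicit loop with mutable temporary state.
totals_loop = []
for name, qty, price in orders:
    if qty > 1:
        totals_loop.append((name, qty * price))

# The query style (LINQ's rough analogue): one expression, with the
# predicate and the projection written inline where they are used.
totals_query = [(name, qty * price) for name, qty, price in orders if qty > 1]

assert totals_loop == totals_query
print(totals_query)
```

The win is the same one the post describes: the filter and the calculation live in a single obvious place instead of being spread across loop boilerplate.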
|
|
|
|
|
Except that an overflow has occurred and we are now using JavaScript, which has set productivity back to the beginning.
|
|
|
|
|
Sorry, but that's just self-praising BS.
- Where are the programming languages that miraculously make you so productive? Why should a programmer that uses good old C be less productive?
- How much time do you have to invest to learn your way around in all those great tools, languages, IDEs and frameworks? How much time does it cost you to keep up with new versions of all that?
- Sure you appear to be very productive when you have a big fat framework behind you. In the past the smarter programmers built, updated and used their own libraries. After a while they ended up, at least in theory, with a framework that is perfectly adapted to whatever they were doing. Those who could not get to that point never were very productive anyway.
- How often have you had a tool that was not able to do just what you needed? How often have you spent time to hunt obscure bugs and the great framework did little to help you understand what was happening? Is that productive?
- The rookies will not notice it, but programming has been dumbed down a lot in the last 20 years. Building one contraption after another that does little more than pass data to and from a database and generate some HTML from it honestly bores me to tears. And what happens when you have to leave the beaten path? Will the tools and frameworks help? Not very likely. You are not supposed to leave the path, and your punishment will be that you are on your own. But only if the more patronizing contraptions will allow you to do that at all. As if it were the pinnacle of productivity when a tool actively protects you from yourself.
As I see it: If you want to be productive then THINK! Tools, languages and frameworks come and go. Concentrate on things that are independent of them, like algorithms and design patterns. You will be productive when you know what you are doing. Everything else is just an illusion, usually at your expense.
At least artificial intelligence already is superior to natural stupidity
|
|
|
|
|
How long until we see cars where the driver/customer cannot access the engine at all? If Apple made cars, the hood would be secured with pentalobe screws (Please visit an authorized garage to have your wiper fluid refilled). I’m not sure if this would be a good thing for cars but in computers, I’ll gladly take better design and usability in a smaller and lighter package over upgradeability and serviceability. Is this the end of DIY computer repair?
|
|
|
|
|
However, it should be possible to upgrade without a technician. Generally there is not much that can be done to a modern laptop without replacing the whole motherboard, but there is still the memory, the disk drive, and a few other things. I certainly know about Dell charging me to replace a motherboard because of a bad power connector, and then there was the time I had a bad battery connection (the battery would not charge), which Dell could not find and which required replacement of the motherboard.
|
|
|
|
|
|
The end of DIY mainstream computers. We'll be back to being weird geeks with screwdrivers again.
|
|
|
|
|
Sacrilege! You must prove your worth by soldering together your computer. Screwing together prefabricated stuff is for users.
|
|
|
|
|
CDP1802 wrote: Screwing together prefabricated stuff is for users
Which is why I only buy screwless cases.
|
|
|
|
|
I am pretty sure I can incite a minor panic walking through the office with a soldering iron.
|
|
|
|
|
I have repeatedly been confounded to discover just how many mistakes in both test and application code stem from misunderstandings or misconceptions about time. By this I mean both the interesting way in which computers handle time, and the fundamental gotchas inherent in how we humans have constructed our calendar — daylight saving time being just the tip of the iceberg. Tonight we're gonna party like it's new DateTime(1999, 12, 31);
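A minimal sketch of one such gotcha, using Python's `zoneinfo` (Python 3.9+): adding "one day" to a timezone-aware datetime moves the wall clock forward 24 hours, but across a DST spring-forward only 23 real hours elapse.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/New_York")
# Noon local time the day before the 2021 US spring-forward (DST began Mar 14).
start = datetime(2021, 3, 13, 12, 0, tzinfo=tz)

# Adding a timedelta performs *wall clock* arithmetic on the local time...
wall = start + timedelta(days=1)  # reads 12:00 on Mar 14

# ...but converting both instants to UTC shows only 23 hours actually passed,
# because 2:00-3:00 a.m. local time never existed that night.
utc = ZoneInfo("UTC")
elapsed = wall.astimezone(utc) - start.astimezone(utc)
print(elapsed)  # 23:00:00
```

Whether the 24-hour or the 23-hour answer is the "right" one depends entirely on whether the requirement means calendar days or elapsed time, which is exactly the kind of ambiguity the post is talking about.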
|
|
|
|
|
Some of those are silly.
A program that's stuck in a VM might not know time has effectively stopped and restarted? How the hell is a programmer supposed to handle that situation - resync the clock with NIST before calling any function that gets time?
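One common mitigation (not a complete answer to the VM pause case, but the standard defensive move) is to never measure elapsed time with the wall clock at all: use a monotonic clock for durations and timeouts, since the OS guarantees it never goes backwards when NTP or an admin adjusts the time. A Python sketch of the distinction:

```python
import time

# Wall-clock time: subject to NTP corrections, manual changes, VM pause/resume.
wall_start = time.time()

# Monotonic time: guaranteed never to go backwards, so it is the right
# tool for measuring how long something took.
mono_start = time.monotonic()

time.sleep(0.1)  # ... do some work ...

wall_elapsed = time.time() - wall_start       # could even be negative if the clock was set back
mono_elapsed = time.monotonic() - mono_start  # always >= 0

print(f"wall: {wall_elapsed:.3f}s, monotonic: {mono_elapsed:.3f}s")
```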
|
|
|
|
|
Chris Losinger wrote: resync the clock with NIST before
No guarantees...
A problem I had to fix a number of years ago was that we got the time from NIST, stored it in a script, and ran the script on the cluster (this was OpenVMS), but when the system was busy, it didn't run on time, thereby setting the clocks back. It was actually a simple fix.
|
|
|
|
|
Here's some more:
- Time always goes forward.
- At least, it doesn't go backwards.
- There is a whole number of days in a year.
(ok those were easy)
- Date strings can be unambiguously converted to timestamps.
(you might even encounter that one)
- Every day of the month always exists, except the leap day.
- If a date exists in one country, it also exists in every other country.
- The difference between the dates in two places is never larger than one day.
- Whether a year is a leap year can be calculated from just the year.
- etc?
We made a huge mess, and no one is making any serious move to clean it up.
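As a quick illustration of the date-string entry in the list above: without knowing the intended format, the same string names two different days. A Python sketch:

```python
from datetime import datetime

# "Date strings can be unambiguously converted to timestamps" - not quite.
s = "01/02/2003"

us = datetime.strptime(s, "%m/%d/%Y")  # US convention: January 2nd
eu = datetime.strptime(s, "%d/%m/%Y")  # most of the world: February 1st

print(us.date())  # 2003-01-02
print(eu.date())  # 2003-02-01
assert us != eu
```

Both parses are valid; only out-of-band knowledge of the sender's locale tells you which one was meant.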
|
|
|
|