Terrence Dorsey wrote: How many devices do you carry during the day?
That depends on what I put in my backpack the night before!
Bill Gates is a very rich man today... and do you want to know why? The answer is one word: versions.
Dave Barry
Read more at BrainyQuote.
Technology keeps adapting, trying to keep up with our needs and the way we handle things. Why, then, has email never really changed? When the first email was sent in the 1970s, it was not much different from the email we know today. ...or why I live at the inbox.
Probably the same reason that letters (snail-mail) haven't really changed much since 1840: they work. (Okay, not as popular because of email but still relatively unchanged from the time of the first postage stamp until now).
You can whack all sorts of work-flow and other goodies onto email but at the end of the day it's just a message.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
Before Linux was around there was a huge problem with UNIX-based operating systems: they were all proprietary, and each OEM had its own version of the OS. Look carefully and you'll notice that the same thing is happening with Android.... Every day Android feels a lot more like vaporware to me. Interesting view from someone who thinks BlackBerry is a compelling alternative.
I closed the page when I saw the usual crap about HTML 5 being the "future".
=====
\ | /
\|/
|
|-----|
| |
|_ |
_) | /
_) __/_
_) ____
| /|
| / |
| |
|-----|
|
=====
===
=
I have always half-jokingly taken credit for inventing the @reply on Twitter. Recently, user @rabble put together a blog post titled Origin of the @reply - Digging Through Twitter’s History, in which he did some research to show when it was first used. Only his research isn’t entirely correct and it doesn’t give fair credit to everyone involved. Those who forget history are condemned to @reply it.
Hadoop is firmly planted in the enterprise as the big data standard and will likely remain firmly entrenched for at least another decade. But, building on some previous discussion, I’m going to go out on a limb and ask, “Is the enterprise buying into a technology whose best day has already passed?” GFS, MapReduce and you.
Daily Code Drills is an experiment by Zed Shaw, creator of the Learn Code the Hard Way series of tutorials, to see if a daily drill in Python or Ruby helps build your coding muscles. Try to do this every day for as long as you can or until you can do the whole thing in 10 minutes. Where is your drill sergeant, men? Stack traced, sir!
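Drills in this spirit are short exercises you rewrite from scratch until they take minutes rather than hours. A hypothetical example of the idea (this is not one of Shaw's actual drills, just a sketch):

```python
# A hypothetical daily drill (not one of Zed Shaw's actual exercises):
# rewrite this from memory each day, then check it against the assertions.

def word_count(text):
    """Return a dict mapping each whitespace-separated word to its count."""
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

assert word_count("the cat and the hat") == {"the": 2, "cat": 1, "and": 1, "hat": 1}
assert word_count("") == {}
```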
The bureaucrats are taking over - oh, yawn.
What about a daily drill of actual work - a radical idea, I know.
Peter Wasser
Art is making something out of nothing and selling it.
Frank Zappa
Self-publicist.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
We already know there are tools to measure how fast a program runs. Programs called profilers measure running time in milliseconds and can help us optimize our code by spotting bottlenecks. While profilers are useful, they aren't really relevant to algorithm complexity. Algorithm complexity is designed to compare two algorithms at the idea level, ignoring low-level details such as the implementation programming language, the hardware the algorithm runs on, or the instruction set of the given CPU. We want to compare algorithms in terms of just what they are: ideas of how something is computed. Algorithm complexity is just a way to formally measure how fast a program or algorithm runs.
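To make the idea concrete, here is a minimal sketch (the function names and the comparison-counting scheme are illustrative, not from the linked article): counting basic operations lets you compare a linear scan against binary search independent of language or hardware.

```python
# Compare two search algorithms by their growth in basic operations
# (comparisons), not by wall-clock time. This is what complexity
# analysis captures: the linear scan grows like n, binary search
# like log n, on any machine and in any language.

def linear_search(items, target):
    """Return (index, comparisons) using a left-to-right scan: O(n)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Return (index, comparisons) on a sorted list: O(log n)."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

for n in (1_000, 1_000_000):
    data = list(range(n))
    target = n - 1              # worst case for the linear scan
    _, lin_ops = linear_search(data, target)
    _, bin_ops = binary_search(data, target)
    print(f"n={n}: linear={lin_ops} comparisons, binary={bin_ops} comparisons")
```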
"Algorithm complexity is just a way to formally measure how fast a program or algorithm runs"
Err... no, that's completely wrong.
Algorithm complexity has almost nothing to do with how fast the algorithm runs.
How fast an algorithm runs depends on the machine it's running on, the language it's written in, how well the algorithm has been coded, and many other factors besides.
Algorithm complexity is a measure of how difficult it is to derive the algorithm outputs from the inputs.
pt1401 wrote:
Algorithm complexity is a measure of how difficult it is to derive the algorithm outputs from the inputs.
But the main reason anybody cares about algorithmic complexity (outside of pure academic research) is that studying an algorithm, determining its complexity, and then searching for a simpler method is a path to performance gains. End users, library callers and QA testers do not care how "difficult" the algorithm is; they just want the algorithm to turn input into output as fast as possible.
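A quick sketch of that point, assuming nothing beyond the standard library (both functions are illustrative stand-ins, not from any real codebase): dropping from O(n²) to O(n) is exactly the kind of win complexity analysis points you toward.

```python
import timeit

def has_duplicate_quadratic(items):
    """Compare every pair: O(n^2) comparisons."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """Track seen values in a set: O(n) expected time."""
    seen = set()
    for value in items:
        if value in seen:
            return True
        seen.add(value)
    return False

# Same inputs, same outputs; only the idea behind the algorithm differs,
# and that difference is what the end user feels as speed.
data = list(range(5_000))        # worst case: no duplicates at all
for fn in (has_duplicate_quadratic, has_duplicate_linear):
    t = timeit.timeit(lambda: fn(data), number=3)
    print(f"{fn.__name__}: {t:.4f}s")
```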
Complexity is a pretty generic term. Perhaps it's algorithm performance complexity vs algorithm design complexity. Though, the term "complexity" does imply nuance more than magnitude. Maybe a better term would be "algorithm efficiency".
Daniel Clifford recently gave a great talk at Google I/O 2012 called “Breaking the JavaScript Speed Limit with V8”. In it he goes in depth to explain 13 simple optimizations you can make in your JavaScript code to help Chrome’s V8 JavaScript engine compile and run it faster. In the talk he gives a lot of great explanations of what they are and why they help, but if you just want the quick and dirty list, here goes... Better than it ran before. Better, stronger, faster...
Some developers waste time waiting for their employer to train them on new technology or complaining they aren’t getting the training to stay current. While companies continue to cut training budgets, every developer should take the initiative to educate themselves, especially with so many free resources available on the web. When a developer takes risks like this, everyone on a team benefits. You really need to come out of the dark ages of software development.
Somehow felt that the article was a smart ad for Scala :P
What would happen if you tried to hit a baseball pitched at 90% the speed of light? A new xkcd series answering your hypothetical questions with physics.
The Internet was designed to be robust, fault-tolerant and distributed, but its technology is still in its infancy.
The fact that the Web has not stopped functioning in its initial decades sometimes encourages us to assume that it never will. But like any system, biological or man-made, the Internet has the potential to fail. Don't be too proud of this technological terror you've constructed.
Apple didn’t cut the iPad from whole cloth (which probably would have been linen). It was built upon decades of ideas, tests, products and more ideas. Before we explore the iPad’s story, it’s appropriate to consider the tablets and the pen-driven devices that preceded it. From the Dynabook to the future.
PCs are more complicated and less reliable than they should be. They require too much maintenance, like a car that requires you to top off the oil, check the tire pressure and fill the gas tank on every trip. Even though they use chips that are far more powerful than the ones in the iPad, they’re often much slower. They rarely have built-in wireless broadband. Every moment I spend dealing with this stuff is a moment I’m not spending creating content. I find that deeply frustrating. Is the iPad good for content creation as well as content consumption?
I'd argue that the last truly revolutionary version of Windows was Windows 95. In the subsequent 17 years, we've seen a stream of mostly minor and often inconsequential design changes in Windows – at its core, you've got the same old stuff: a start menu, a desktop with icons, taskbar at the bottom, overlapping windows, toolbars, and pull-down menus.... Windows 8 is, in my humble opinion, the most innovative version of Windows Microsoft has released since Windows 95. What's good about Windows 8? A ton of stuff.
I'd argue that the last truly revolutionary version of Windows was NT4.
It marked the transition from a personal OS to a business-capable OS - Win95 was more revolutionary with its UI changes, but NT4 was the game-changer for MS.
NT4 brought a lot to the table, but I thought Win2k brought the best parts of NT to an OS that was better suited to day-to-day desktop use.
Director of Content Development, The Code Project