|
Salt started life as a remote execution system: a class of software applications written to address concerns of the form, “I have this command I want to run across 1,000 servers. I want the command to run on all of those systems within a five second window. It failed on three of them, and I need to know which three.” Other systems were designed to do this, of course, but they failed in several ways.... Salt's approach was far simpler. Salt leverages the ZeroMQ message bus, a lightweight library that serves as a concurrency framework. It establishes persistent TCP connections between the Salt master and the various clients, over which communication takes place. A good intro to Salt if you're responsible for setting up servers and VMs.
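The fan-out-and-collect pattern described above can be sketched in a few lines. Salt itself pushes jobs over persistent ZeroMQ connections; the sketch below is not Salt's implementation, just a minimal illustration of "run this everywhere, tell me which hosts failed" using Python's standard `concurrent.futures`. The `run_on_host` stub and the host names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_host(host, command):
    """Hypothetical stand-in for dispatching `command` to one host.
    Here we just simulate: hosts whose number is divisible by 250 fail."""
    return host, int(host.split("-")[1]) % 250 != 0

def fan_out(hosts, command):
    """Run `command` on every host concurrently; return the hosts it failed on."""
    with ThreadPoolExecutor(max_workers=100) as pool:
        results = pool.map(lambda h: run_on_host(h, command), hosts)
    return [host for host, ok in results if not ok]

failed = fan_out([f"web-{n}" for n in range(1, 1001)], "uptime")
print(failed)  # → ['web-250', 'web-500', 'web-750', 'web-1000']
```

The point of the pattern is that the caller gets back exactly the failing subset, rather than having to grep 1,000 log files.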
|
|
|
|
|
I guess that system is worth its salt...
I hope it makes people thirst for more!
...
...
...
...
...
...
...
Sorry.
Gryphons Are Awesome! Gryphons Are Awesome!
|
|
|
|
|
An Apple 13-inch MacBook Pro is the "best performing" Windows laptop? Yes, says a PC services company that has done "frustration analytics" on some of the best-selling PCs. The MacBook Pro won out over established PC makers like Dell, Acer, and Lenovo, according to Soluto, which was quick to explain its finding: "A main factor in this machine's metrics is the fact that every Windows installation on it is clean. With PC manufacturers loading so much crapware on new laptops, this is a bit of an unfair competition. But, on the other hand, PC makers should look at this data and aspire to ship PCs that perform just as well as a cleanly installed MacBook Pro." Bonus: they come with Mac OS (and Unix... sort of).
|
|
|
|
|
That tells you what a great job the laptop manufacturers do. Notice there's no Sony or Samsung on the list.
|
|
|
|
|
I heard Apple has paid extra attention to tweaking its motherboards so components use less power. That's why MacBook battery life is better than that of Windows laptops.
|
|
|
|
|
Of course, why couldn't Dell have done that? I can understand the lower-end brands. I remember when Apple hardware was very poorly designed: the Mac II couldn't be upgraded to higher speeds because, I guess, they had been stupid and not designed the motherboard to handle them.
|
|
|
|
|
The second on the list (Asus) is not much worse, but costs almost a third as much. Makes you think: do you want to pay triple for that difference?
|
|
|
|
|
I had a 15" Pro through work two years ago, and sure, the ergonomics were great, the backlit keyboard was swanky, and the clean install of Windows 7 Ultimate was excellent.
So when I went to buy one for my own business I got a MacBook Pro, right?
No, because for £700 less than the best MacBook Pro at the time I got the same screen, the same amount of RAM, a larger hard drive, and a faster quad-core processor, with a Blu-ray drive thrown in, a second graphics card with 2GB of memory, a B&O sound system, etc., in a plasticky, poorly laid out Asus.
If I were designing dining rooms I'd have bought a Mac; I needed real computing "performance" in a reasonably portable package that didn't give me RSI, and in spec terms Apple was nowhere.
"The secret of happiness is freedom, and the secret of freedom, courage."
Thucydides (B.C. 460-400)
|
|
|
|
|
The tech news has been awash with Twitter shut-downs lately. I'm sure that my company makes more revenue than some of those whose API access has been shut off to the outraged fanfare of onlooking technorati. Unlike those other companies, I don't have a leg to stand on. I'm directly cannibalizing Twitter's Ads revenue model and doing so on their very own platform. A clever business niche, or playing with fire?
|
|
|
|
|
The stable release of Ubuntu 13.04 became available for download today, with Canonical promising performance and graphical improvements to help prepare the operating system for convergence across PCs, phones, and tablets. "Performance on lightweight systems was a core focus for this cycle, as a prelude to Ubuntu’s release on a range of mobile form factors," Canonical said in an announcement today. "As a result 13.04 delivers significantly faster response times in casual use, and a reduced memory footprint that benefits all users." This is the year of Linux on the tablet.
|
|
|
|
|
The next Xbox is based on the "Core" (base) version of Windows 8. This suggests a common apps platform or at least one that is similar to that used by Windows 8. It also suggests that Microsoft could open up this platform to enthusiast developers. (That last bit is supposition on my part.) ... Microsoft originally planned to offer both a “full” version of the next Xbox (with video game playing capabilities) and a lower-end entertainment-oriented version, code-named “Yuma,” that didn't provide gaming capabilities. But plans for Yuma are on hold, and no pure entertainment version of the next Xbox will appear in 2013 (or possibly ever). Is a gaming console still relevant in an era of cheap, always connected mobile devices?
|
|
|
|
|
Another point is that this Xbox must always be connected to the internet to work. It makes you wonder what the point of a gaming console is. Also, it's $500 unless you commit to two years at $10, when it's the low price of $300. I just can't see why one would bother; just use your phone, or whatever. However, I have never been a user of these gaming consoles.
|
|
|
|
|
As the executive chairman of one of the biggest tech companies on the planet, Eric Schmidt seems well-placed to do some crystal ball gazing. So what does he think will be the major trends of the future? Self-driving cars, Google Glass, and mobile operating systems named after desserts? Not quite. In the run-up to the launch of his new book ‘The New Digital Age’, co-authored by Google Ideas director Jared Cohen, the two Googlers gave CNN a few predictions about what would be big news in the world of tech for years to come. They stressed that the technology means nothing unless it is adopted by people, and how they use it will make all the difference. Bonus prediction: an important update will begin just when you need to do something else.
|
|
|
|
|
I'm not impressed. Nothing seemed very "visionary."
There are only 10 types of people in the world, those who understand binary and those who don't.
|
|
|
|
|
And you need to be a Google exec to "predict" these things? *yawn*
Marc
|
|
|
|
|
My nom de net is Dr. Drang, and I blog about scripting, engineering, and occasional other topics at leancrew.com. In real life, I’m an engineer (civil and mechanical) who spends most of his time figuring out why things have broken. Dr. Drang is a consulting engineer who blogs about scripting and fatigue analysis.
|
|
|
|
|
30 years of research on memory-safe C/C++ should be enough. It’s time to suck it up, take the best available memory safety solution, and just turn it on by default for a major open-source OS distribution such as Ubuntu.... If the safe-by-default experiment succeeded, we would have (for the first time) a substantial user base for memory-safe C/C++. There would then be an excellent secondary payoff in research aimed at reducing the cost of safety, increasing the strength of the safety guarantees, and dealing with safety exceptions in interesting ways. Would it be better to just stop using C/C++ instead of making them safe?
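What "turn on memory safety by default" buys you can be illustrated in any memory-safe language. The sketch below mimics, in Python, the bounds check a memory-safe C/C++ dialect would insert around every array store; `checked_store` and `BoundsError` are hypothetical names, not part of any real instrumentation tool.

```python
# A minimal sketch of the bounds check a memory-safe dialect inserts
# around every array write: trap deterministically instead of silently
# corrupting adjacent memory. All names here are illustrative.
class BoundsError(Exception):
    pass

def checked_store(buf, index, value):
    """Store `value` at buf[index], raising instead of corrupting memory."""
    if not 0 <= index < len(buf):
        raise BoundsError(
            f"write at index {index} into buffer of length {len(buf)}")
    buf[index] = value

buf = [0] * 8
checked_store(buf, 3, 42)      # in bounds: fine
try:
    checked_store(buf, 8, 99)  # one past the end: a classic off-by-one
except BoundsError as e:
    print("trapped:", e)
```

The cost the author wants research to drive down is exactly that branch on every store; the payoff is that an off-by-one becomes a clean trap rather than an exploitable write.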
|
|
|
|
|
Yes, I have to agree. From what I understand, properly done, C# is just about as fast as C++. Obviously there's still a need for the ability to bypass automatic memory management.
|
|
|
|
|
You'd wish that were true but, especially in numerical areas that make use of all of your processor's vector instructions, .NET is not great. Also note the recent shift away from .NET towards native: Microsoft needed better performance in order to get decent battery life on tablets and phones.
Wout
|
|
|
|
|
Terrence Dorsey wrote: Would it be better to just stop using C/C++ instead of making them safe?
Yeah, they should just rewrite the whole OS in JavaScript/C#/whatever. I'm sure the tiny performance hit would be worth the "safety".
|
|
|
|
|
|
In a word, no, and no, they can't make them safe either. Witness C# and Java: all these years of 'safety' and nothing of any serious size or complexity stands up unless manual memory management is used, and when it's not, memory footprints still balloon and effective leaks still occur.
The safest systems are the most transparent systems, where there is little or nothing 'under the hood'; in fact, there is no hood. Then and only then can the smallest problem be seen for what it is the moment it arises, and hence fixed.
To put it another way, in space there are no 'no user serviceable parts inside, do not void warranty by opening' sealed units.
"The secret of happiness is freedom, and the secret of freedom, courage."
Thucydides (B.C. 460-400)
|
|
|
|
|
Terrence Dorsey wrote: Would it be better to just stop using C/C++ instead of making them safe?
For most new development, yes; but for performance-sensitive apps GC pauses aren't acceptable, and the cost of porting billions of lines of legacy code to a new language is beyond prohibitive (see COBOL[^]).
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
As a relatively new Common Lisp user, I’ve compiled a list of notes and tips on learning it. It’s a synthesis of my own experience, as well as my observations of the Lisp world. Lisp has a reputation as a hard-to-learn language, and I believe this is not the case, but there are such things as bumps in the road. Some of them are false beliefs about Lisp that might scare people, others are actual nuisances that need to be dealt with, and yet others are simply culture shock. Since Lisp is old, it has its own distinct culture and jargon, and people often get confused or put off by the differences. An old post, revamped for a new generation of Lisp learners.
|
|
|
|
|
One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices. I felt a great disturbance in the Force, as if several emails had just arrived.
|
|
|
|