|
probably skip this version of Windows in our organization altogether
for tablets, we have Android-based tablets
for work, we'll use WinForms and raw sockets! (screw WPF and WCF!)
dev
|
|
|
|
|
|
Terrence Dorsey wrote: Either you change with the times or you do something else.
Change (aka "Paradigm Shift") for the sake of change is just plain stupid (i.e. it's not "Cool", it's not "Fashionable", it's indeed very "Uncool")
- if the new platform or API doesn't allow developers to code things up *much* more efficiently, and doesn't deliver any real/additional functionality, what's the point?
dev
|
|
|
|
|
So essentially his point is that we have to just eat whatever piece of sh*t Microsoft throws our way, and pretend it is the best thing since sliced bread?
Well, forget it. If I did that I would have wasted my time on Silverlight and countless other "abandontechs". And every year I'd spend a lot of time upgrading my XNA n programs to XNA n+1, which for some reason must always be incompatible in some larger or smaller way.
That's not going to work.
The nice thing about MS is that the newfangled stuff is optional. You can ignore it, focus on functionality instead of "new for the sake of new", and actually get something done.
|
|
|
|
|
Both the HDD, the core building block of nonvolatile storage in computer systems today, and the SSD are part of a class of storage called block devices. These devices use logical addressing to access data and abstract the physical media, using small, fixed, contiguous segments of bytes as the addressable unit. Each block device consists of three major parts: storage media, a controller for managing the media, and a host interface for accessing the media. While the ubiquitous SSD shares many features with the hard-disk drive, under the surface they are completely different.
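The logical-addressing idea above is easy to see in code. Here's a minimal sketch (my own illustration, not from the article) in which a plain file stands in for the physical media; the Logical Block Address (LBA) is simply multiplied by a fixed block size to find the byte offset, exactly the abstraction a block device presents to the host:

```python
# Sketch of block-device logical addressing. A file object stands in
# for the media; 512 bytes is the classic HDD sector size.
BLOCK_SIZE = 512

def read_block(f, lba):
    """Read one fixed-size block by Logical Block Address (LBA)."""
    f.seek(lba * BLOCK_SIZE)       # translate LBA -> byte offset
    return f.read(BLOCK_SIZE)

def write_block(f, lba, data):
    """Write one block; blocks are fixed, contiguous byte segments."""
    assert len(data) == BLOCK_SIZE
    f.seek(lba * BLOCK_SIZE)
    f.write(data)
```

The point of the abstraction is that the host never sees cylinders, heads, or flash pages: whether the controller maps LBA 3 to a spinning platter or remaps it constantly across NAND cells (as an SSD does for wear leveling), the interface stays the same.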
|
|
|
|
|
In August, a collection of military, government, and nongovernmental humanitarian organizations from 22 countries in the Pacific gathered in Singapore for Pacific Endeavor 2012, a joint exercise to test how quickly and how well they could communicate in the face of a disaster. While the simulated mission was peaceful, some of the participants were put through a separate, more hostile test—Cyber Endeavor, a full-on "live fire" cyberwarfare exercise focused on protecting information in a collaborative environment, "with both innocent bystanders and hostile attackers." This is my terminal. There are many like it, but this one is mine...
|
|
|
|
|
The Standish Group is an organization that has been studying and reporting on software projects for many years. In 1995, it reported that only 16.2% of software projects succeeded, that is, finished on time and on budget. In large companies, the number was only 8%. By 2012 there was some improvement, but the 2012 report is still fairly damning of the industry as a whole. It cites a 12% success rate for waterfall projects and a 42% success rate for agile projects. Assuming the budgets and plans were realistic, that's not good. But you know what "assume" means...
|
|
|
|
|
software developers and project managers typically have too little power with management to bargain for decent resources/budget/timeframe
dev
|
|
|
|
|
When neural networks are used to model a set of existing data so that predictions can be made on new data, the main challenge is to find the set of weight and bias values that generate the outputs that best match the existing data. The most common technique for estimating optimal neural network weights and biases is called back-propagation. Although there are many excellent references that describe the complicated mathematics that underlie back-propagation, there are very few guides available for programmers that clearly explain how to program the back-propagation algorithm. My neural network hurts from trying to follow this.
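For the programmers the article has in mind, the core of back-propagation fits in a page. The sketch below (my own minimal illustration, not the article's code) trains a one-hidden-layer network with sigmoid activations and squared-error loss; bias terms are omitted for brevity:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, w_h, w_o, lr=0.5):
    """One forward/backward pass. w_h: hidden weights (list of rows),
    w_o: output weights. Returns the network's output."""
    # forward pass
    h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_h]
    o = sigmoid(sum(wo * hi for wo, hi in zip(w_o, h)))
    # backward pass: propagate the error signal (delta) from output to hidden
    delta_o = (o - target) * o * (1 - o)
    delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(len(h))]
    # gradient-descent weight updates
    for j in range(len(w_o)):
        w_o[j] -= lr * delta_o * h[j]
    for j, row in enumerate(w_h):
        for i in range(len(row)):
            row[i] -= lr * delta_h[j] * x[i]
    return o
```

Called repeatedly over a training set, the weights drift toward values whose outputs best match the existing data, which is exactly the optimization problem the article describes.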
|
|
|
|
|
My CPU is a neural net processor — a learning computer.
|
|
|
|
|
In our trade we sometimes like to segment the land of the programmers. You have the computer scientists, and the engineers, and the programmers, and the app developers, and the hackers, and the script kiddies, and on and on. Some groups like to make fun of the others (e.g. “heh. PHP kiddies. heh.”). There is a notion that you really need a deep understanding of the programming world before you can get going. Sometimes knowing just enough is fine.
|
|
|
|
|
Terrence Dorsey wrote: Sometimes knowing just enough is fine.
Cannot agree more! In fact, a good library/framework/package/tool, whatever you call it, is one which gets the most jobs done with a minimal learning curve (WCF config files failed miserably on this count, and after fiddling [... professionally] with it for two years, we've gone back to raw sockets on a new project)
dev
|
|
|
|
|
I really, really dislike WCF; glad I am not the only one! I do some things with it every once in a while because there are not many options, but every time it's a pain getting into it again because there's just too much to know. The .NET 2.0 stuff, like .NET Remoting and adding a web reference for a web service, was pretty smooth. It's a pity they abandoned that for some over-designed, horrid framework.
How's the raw socket working out? WCF is not great, but usually you can hack it until it approximately works the way it needs to work.
Wout
|
|
|
|
|
AFAIK, Remoting, Web Services, and web references still work. To do web references, in the add reference dialog, click 'Advanced', and then click the 'Add Web Reference...' button at the bottom.
I think computer viruses should count as life. I think it says something about human nature that the only form of life we have created so far is purely destructive. We've created life in our own image.
Stephen Hawking
|
|
|
|
|
of course it works, but the question is why bother
even if I'm not employing a message bus I can code up the comm layer in 2-3 days with libraries I've already written
dev
|
|
|
|
|
Yeah they do, but I still can't figure out why they replaced WinForms/Managed DirectX with WPF (fail), and .NET Remoting/web references with WCF (fail). The LINQ stuff is cute, but marginally useful.
Most improvements in .NET 3.5/4.0 have been in the C# language itself; I do like lambdas, which shorten a lot of constructs. But the framework has been bloated with a lot of crap, when they could have improved the existing framework instead, improving performance for numerically and graphics-intensive applications and maintaining Managed DirectX. The last few years they've been hurling frameworks into dev space, and then discontinuing them a few years later because they didn't catch on (because they were crap).
Wout
|
|
|
|
|
our raw socket works just fine - it's a high-volume intRAnet application (ticking prices and calculated figures getting dished out in realtime to clients on multiple machines)
I have an existing code library that does everything from serializing object graphs to compression/encryption/load distribution, so it took me no time to implement - and I don't need to remember WCF config, and I can debug every line of code (no need to guess at what svclog throws at you)
dev
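For readers wondering what a hand-rolled comm layer like dev describes might look like, here is a hypothetical sketch (Python rather than .NET, and my own naming throughout): serialize an object graph, compress it, and length-prefix the frame so the receiver knows how many bytes to pull off the socket:

```python
import pickle, struct, zlib

def pack_frame(obj):
    """Serialize + compress an object, prefixed with a 4-byte
    big-endian payload length (the framing header)."""
    payload = zlib.compress(pickle.dumps(obj))
    return struct.pack("!I", len(payload)) + payload

def unpack_frame(data):
    """Inverse of pack_frame: read the length header, then decode."""
    (length,) = struct.unpack("!I", data[:4])
    return pickle.loads(zlib.decompress(data[4:4 + length]))
```

The length prefix is the crucial bit over raw TCP, since the stream gives you no message boundaries of its own; everything else (encryption, load distribution) layers on top of the same framing.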
|
|
|
|
|
Sampling theory deals with the process of taking some continuous signal that varies with one or more parameters, and sampling the signal at discrete values of those parameters. If you're not familiar with signals and signal processing, you can think of a signal as some continuous function of any dimension that varies along its domain. To sample it, we calculate that function's value at certain points along the curve. Working with discrete samples has a lot of advantages. Read this to find out why.
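The idea is small enough to show directly. In this sketch (my own illustration), the continuous signal is a 1 Hz sine and sampling just means evaluating it at t = i/fs for integer i:

```python
import math

def sample(signal, fs, duration):
    """Evaluate a continuous function at discrete times i/fs."""
    n = int(fs * duration)
    return [signal(i / fs) for i in range(n)]

sine_1hz = lambda t: math.sin(2 * math.pi * t)
samples = sample(sine_1hz, fs=8.0, duration=1.0)  # 8 samples across one cycle
```

Eight samples per cycle is comfortably above the two-samples-per-cycle Nyquist minimum, so the original sine is still fully recoverable from the discrete values.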
|
|
|
|
|
After yet another round of futile Twittering on the subject of research software, I thought I'd share a deeply personal story -- a story that explains some of my rather adamant stance that most research scientists need to think more critically about their code, and should adopt at least some of the basic coding hygiene used by virtually every modern practicing programmer. The scientific method applied to software in science.
|
|
|
|
|
Bitcoin is a digital currency, shared among its peer network that helps people to buy and sell things online with no connection to their local currency. Sounds enticing, but this young concept is not for the faint-hearted. Its growing success has attracted incessant attacks against both individuals and Bitcoin merchants. This post explores the risks associated with adopting the fledgling currency and looks at some of the major security breaches that have hit the headlines in the last year. I'd buy that for a Bitcoin!
|
|
|
|
|
This should've been tagged as pro-bitcoin propaganda.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
Despite being one of its inventors, I can see a time when ClearType becomes obsolete. It solved a real problem of too-low display resolutions when we invented it about nine years ago, and I estimate it still has a long useful life in front of it. But it definitely has a limited lifespan. How long that will be depends on how quickly displays move towards higher resolution. That might take longer than anyone realizes. In fact, maybe a combination of ClearType and hardware resolution will get us where we need to be - in which case it could be around forever. Bill Hill explains where fonts, resolution and ClearType collide.
|
|
|
|
|
Apart from the fact that this article is over four years old, it contains a glaring factual error - it was the late, great Bill Shankly who made the quip about football being more important than life or death, not Tommy Docherty!
====================================
Transvestites - Roberts in Disguise!
====================================
|
|
|
|
|
Bill Hill wrote: The human visual system has a vernier acuity of about 1/600th of an inch. We can't see anything smaller than that.
I take exception to this statement and to all statements regarding the limits of our perception. Anything that is wave-based, frequency-based, has harmonics. The higher-frequency waves mix with the lower-frequency waves and alter them. Some people are very sensitive to these changes and some are not.
Tom Scholz, of the band Boston, said in an interview that he had a hard time listening to music from CDs, so he hooked up a spectrum analyzer, compared a song from a CD with the same song from vinyl, and found that the digitized music on the CD had introduced harmonics and phase shifts that the record did not have. If you have an understanding of Fourier series, you can see why this would be the case (you should also take a look at the article below on signal processing).
Now, you might not be able to distinguish elements below 1/600th of an inch, but that does not mean you cannot see them. I have a form of blue/yellow color blindness where I can see blue and I can see yellow, but I have a hard time distinguishing the two of them side by side. I can do it, but it is harder than red/green.
What I am saying is that people will blithely throw away data that they believe isn't useful because it cannot be immediately appreciated when, in fact, that data colors other data, makes it richer and more vibrant, and should not be tossed out so quickly.
m.bergman
For Bruce Schneier, quanta only have one state : afraid.
To succeed in the world it is not enough to be stupid, you must also be well-mannered. -- Voltaire
In most cases the only difference between disappointment and depression is your level of commitment. -- Marc Maron
I am not a chatbot
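The harmonics half of m.bergman's argument is easy to sanity-check in a few lines. This toy sketch (mine, not Scholz's measurement) shows that coarsely quantizing a pure sine wave adds energy at the third harmonic that the original signal does not contain:

```python
import math

def dft_mag(x, k):
    """Magnitude of the k-th DFT bin of a real sequence x."""
    n = len(x)
    re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
    im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
    return math.hypot(re, im)

N = 64
sine = [math.sin(2 * math.pi * i / N) for i in range(N)]
coarse = [round(s * 2) / 2 for s in sine]  # crude 5-level quantization

# bin 1 is the fundamental; bin 3 is the third harmonic
```

The pure sine has essentially zero energy in bin 3, while the quantized staircase version shows clear third-harmonic distortion; whether anyone can *hear* (or see) such artifacts is, as the comment says, a separate question about perception.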
|
|
|
|
|
Well, I absolutely cannot stand colour fringing, or grayscale antialiasing blur. But I don't mind the slightly pixellated look of monochrome font rendering. Why is that a problem again?
|
|
|
|