|
Mercurius84 wrote: This special program shrinks down the size by ~30% (7MB for example) and then stores it in a container.
Fine - but why do you need to do that? What is the business or technical need that requires this?
Mercurius84 wrote: What I mean by OS handling is unlike a normal compression and storing them into a folder.
Not sure what you mean by that - as I already said, desktop OSes already support compression. So that point by itself is moot.
|
|
|
|
|
Hi,
Actually, our company wants to develop something similar to a COLD system.
But more generically... we are building a new product based on this concept...
|
|
|
|
|
Mercurius84 wrote: similar to COLD system.
No idea what that is, but if it is a static storage system then your stated requirements are still not anywhere close to being reasonable grounds for creating and marketing such a system.
Just to make it clear again: your stated requirements do not present a need for anything.
I can only presume that either you work in a highly unusual environment (an unstated requirement) or the stated file count/size does not span the actual need (another unstated requirement).
|
|
|
|
|
|
Hi,
We are in the process of implementing a new eCommerce platform, which would replace our existing eCommerce Server 2007.
We are looking at some of the solutions out there. Does anyone have any suggestions?
Thanks!!
|
|
|
|
|
Start by collecting requirements for what the system needs to do.
Then size the solution to your business needs.
|
|
|
|
|
Hi, I want to start studying process-based software.
I don't know where to start. Can you give me the names of the courses to take, in order?
For example: Workflow, BPMN, ...
thanks
|
|
|
|
|
I'm fairly new to writing unit tests and I've run into something of a design/architecture question.
I'm operating under the belief that any given method should not be overly large, and any that is should be refactored into smaller methods.
As a result I end up with this:
public class MyClass : IMyClass
{
    public string MyMainMethod()
    {
        Method1();
        Method2();
        return "somestring";
    }

    private void Method1()
    {
    }

    private void Method2()
    {
    }
}
Obviously this is grossly simplified, but it serves my purposes for this question. My classes are loosely coupled, so when writing a unit test I can stub any interface that is injected into MyClass and isolate the code under test.
How do I go about stubbing Method1 and Method2? I'm using Moles (and can't change because of company restrictions), but I suspect/hope this is a testing-platform-independent question.
Should I be designing this differently? Not using private methods but public virtual ones would allow me more flexibility, but it doesn't feel like the right approach.
Any advice or pointers would be greatly appreciated.
- Andrew
|
|
|
|
|
Andrew, there are two schools of opinion on this. In one school, you'd effectively elevate the visibility of these methods purely for the purpose of testing. In the second school, you wouldn't directly test these methods. Instead, you'd test the public method that called them. In other words, if there were no route to your private method, then it shouldn't exist.
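To make that second school concrete, here is a minimal sketch (in Java for neutrality, since the question is testing-platform independent; all names are made up). The test touches only the public method, and the private helpers are verified through its observable result:

```java
// Mirrors the thread's MyClass shape: one public entry point, two private helpers.
public class MyClassSketch {
    private final StringBuilder log = new StringBuilder();

    public String myMainMethod() {
        method1();
        method2();
        return log.toString();
    }

    private void method1() { log.append("one;"); }  // private helper
    private void method2() { log.append("two;"); }  // private helper

    public static void main(String[] args) {
        // The "test" only calls the public method; if the result is right,
        // both private helpers were exercised on the way through.
        String result = new MyClassSketch().myMainMethod();
        assert result.equals("one;two;") : result;
        System.out.println(result);
    }
}
```

The design consequence is that the private methods never need stubbing at all: they are implementation detail, and their behavior is pinned down by the public method's contract.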
|
|
|
|
|
That makes sense. I guess I was trying to be able to test the private methods individually and separately from the public method that calls them. It sounds like you're saying I could just test them through tests on the public method.
This feels cleaner to me; I really didn't like the idea of elevating them just for tests. I made them private for a reason, and changing that just for a test felt wrong.
Thanks for the feedback, Pete.
|
|
|
|
|
If there is some need for verifying the code of those functions, I'd use either:
- protected instead of private and create a sub-class for testing
- reflection for invoking the private methods.
But testing is normally about public functions only. And you may have found a situation where that stringency is not fully appropriate.
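For the reflection option, here is a minimal sketch of the pattern (shown in Java; class and method names are made up - in .NET the equivalent goes through System.Reflection and MethodInfo.Invoke):

```java
import java.lang.reflect.Method;

public class PrivateInvokeSketch {
    static class MyClass {
        private String method1() { return "from private"; }
    }

    // Look up the private method by name, lift the access check, and call it.
    public static String callPrivate() throws Exception {
        Method m = MyClass.class.getDeclaredMethod("method1");
        m.setAccessible(true); // suppress the usual access control
        return (String) m.invoke(new MyClass());
    }

    public static void main(String[] args) throws Exception {
        System.out.println(callPrivate()); // prints: from private
    }
}
```

The usual caveat applies: a test written this way breaks at runtime (not at compile time) when the private method is renamed, which is one reason the test-through-the-public-API school exists.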
|
|
|
|
|
Hi,
If you want to know how much of your private code is tested by your unit tests, there are tools to measure test coverage e.g. NCover.
Such a tool may report which code lines are not hit by tests.
Usually, each assembly is assigned a minimum coverage that must be reached. 100% coverage can actually be very difficult to reach.
Getting the coverage feedback can be a useful experience.
You could get more conscious about how your coding style can make testing easier; branches and exceptions are usually up for discussion.
Kind Regards,
Keld Ølykke
|
|
|
|
|
Hi,
I am looking for a way to solve my problem in a proper way:
I have a pool of objects as input (let's name them p).
I have a lot of different decision criteria (let's name them c).
I now want to sort those p with weighted c to get the best decision about the ordering of p.
I'm thinking about a system where the c are objects derived from a base class (name it Criteria). I want to be able to weight those c in a way that I am not sure of yet.
Maybe it would be good to create a directed graph with those weights and let a search algorithm like Dijkstra's run over it and get back the best result.
Sorry for the confusing description. I can't get it sorted in my own brain.
I think I am not the first one with a problem like this, and hopefully there are some design concepts/patterns that solve such a problem without using if-else.
Another question: Is there some design concept where the system can learn from its results and maybe start weighting those c on its own, based on earlier outcomes?
I appreciate any input from you!
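One common shape for this is the Strategy pattern plus a weighted sum: each Criteria-style object scores a candidate, the scores are combined with weights, and the pool is sorted by the combined score. A minimal sketch in Java - all names are illustrative, and a linear weighted sum is only one possible way to combine criteria:

```java
import java.util.*;
import java.util.function.ToDoubleFunction;

public class WeightedSort {
    // Rank a pool of candidates by a weighted sum of criterion scores.
    // Each criterion is just "candidate -> score"; the weight says how much it counts.
    public static <P> List<P> rank(List<P> pool,
                                   Map<ToDoubleFunction<P>, Double> weightedCriteria) {
        List<P> sorted = new ArrayList<>(pool);
        sorted.sort(Comparator.comparingDouble((P p) -> {
            double total = 0;
            for (Map.Entry<ToDoubleFunction<P>, Double> e : weightedCriteria.entrySet())
                total += e.getValue() * e.getKey().applyAsDouble(p);
            return total;
        }).reversed()); // highest combined score first
        return sorted;
    }

    // Demo with made-up criteria over strings.
    public static List<String> demo() {
        Map<ToDoubleFunction<String>, Double> criteria = new LinkedHashMap<>();
        criteria.put(String::length, 1.0);                  // mild preference: longer is better
        criteria.put(s -> s.startsWith("a") ? 10 : 0, 2.0); // strong preference: starts with 'a'
        return rank(Arrays.asList("bee", "ant", "aardvark"), criteria);
    }

    public static void main(String[] args) {
        System.out.println(demo()); // [aardvark, ant, bee]
    }
}
```

For the self-learning part, the usual direction is to treat the weights as parameters and adjust them from feedback on earlier results (simple gradient-style updates, or learning-to-rank methods), rather than hard-coding them - no if-else chains needed in either case.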
|
|
|
|
|
|
Please, can anyone explain to me the concept of a Concurrent OO System,
and also its practical applications?
|
|
|
|
|
|
Thank you for the reply, Mr. Richard MacCutchan.
But I have already searched this topic on Google.
I am still a little bit confused: is the cloud based on a concurrent OO system?
|
|
|
|
|
N_cooL wrote: I am still a little bit confused: is the cloud based on a concurrent OO system?
"Which" cloud? There is more than one.
They're built using "normal" proven languages, not using esoteric academic languages.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
N_cooL wrote: please any one told me concept of Concurrent OO System. That's not a well-defined concept. Concurrency and OO are usually explained as different concepts, and combined later on. It might be interesting when you're going to develop a new language[^].
N_cooL wrote: and also about it's practical application. Simulation[^]. Don't know whether *anyone* ever used it, never seen it personally.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Hi,
I just started a few days ago at my new workplace. The first issue I am researching: we have the main company web site on the main domain, and all the images are located on a sub-domain.
It is a heavily image-driven web site.
In Internet Explorer, after you load the web site and hit F12, you are able to run some statistics on the current web page. There is latency in loading the images, even though the images are ready to be loaded.
Any thoughts on the latency? Do you think it is because they are served from a sub-domain?
When I plug the image URL directly into the browser, the response time is much faster, while through the main web site the response time is almost double. Also, the received size is not the same for both images - once from the main web site, and once from plugging in the URL directly.
Thank you!!
|
|
|
|
|
Where is the sub-domain hosted - on the same server as the main domain?
What is "plug in the URL directly to the image"? I do not understand that.
|
|
|
|
|
How is the site loading the images? Have you seen that code? It is very strange that the image does not have the same size both ways.
|
|
|
|
|
I think this has to do with the fact that some browsers first load what they see as 'the main content' - e.g. if your site is hosted on 'mydomain.com' and your images are hosted on 'img.mydomain.com', the browser loads content residing on 'mydomain.com' first, because you entered this URL and the browser assumes that you want to see this content before loading the rest from a sub-domain.
Another possible cause is very large images - their transmission over the network is slower than the site's content, and therefore they are displayed after the site has loaded (simply because they are not completely transmitted yet).
cheers,
Marco
|
|
|
|
|
Hi folks,
In the near future I expect almost all new applications to be built in MVC, with very few enterprise applications going for ASP.NET Web Forms. What is the reason behind this?
I've asked the same question of many architects, but everybody gives a different answer. Could anybody tell me why MVC, and what its advantages are?
|
|
|
|
|
It's about separating the different parts of the application for ease of maintenance. Microsoft has this site: http://www.asp.net/mvc[^] dedicated to it.
Use the best guess
|
|
|
|