|
I have this situation and I would like other members' opinions on it. My language is C++.
There is a base class with a vital virtual function called Job. Within the derived classes, there are several possibilities of things to be done before Job and/or after it. Job will be called about a hundred times* (the design itself has no problem, so if this sounds absurd it is only because I am not providing the exact details of the class). Also, this is supposed to run on multiple platforms, including .NET, mobile, etc., not just as native code on big, fast CPUs. The core that contains this will remain in C++, and the necessary wrapping will be done to get it to run on the said platforms.
In this scenario, considering speed and other things, which would be a better solution?
1. Have just the Job function and let the derived class do something like this:
void DerivedClass::Job()
{
    // pre-work, the job itself, and post-work all live here
}
2. Have three virtual functions, PreJob, Job and PostJob, and have the call to Job bracketed like the following, so that the code is neatly segregated:
void JobCaller()
{
    pBaseClass->PreJob();
    pBaseClass->Job();
    pBaseClass->PostJob();
}
EDIT: *about a hundred, for user-interaction events like mouse clicks, screen taps, etc., depending on the platform. Basically, the user may "test" the load by playing with it.
|
(2) is the cleaner approach, obviously.
If PreJob+Job+PostJob amounts to more than 100 lines of code (dynamically speaking), then the difference in performance will be unnoticeable.
If most jobs have different PreJob and PostJob functionality (never shared with the base class), the added structure doesn't bring you much.
So look at the statistics and decide for yourself.
|
I agree with Luc on the performance issues, but I disagree when he says that the added structure would not be beneficial. If your core Job function cannot stand on its own (i.e., it requires pre- and post- functions, regardless of how much those functions differ from derived class to derived class), then I personally would absolutely declare Pre- and Post- as "abstract" methods, to ensure that every new derived class you create implements them. This is exactly why abstract declarations were invented. You could then invoke Pre- and Post- from Base::Job and not have to implement a Derived::Job method at all. Like this (I apologize if my syntax isn't perfect; I've been doing a lot of C++/CLI, so I might not be completely right in native C++ syntax):
class Base
{
public:
    virtual void Job()
    {
        Pre();
        Post();
    }
protected:
    virtual void Pre() = 0;   // pure virtual in native C++ ("abstract" is C++/CLI)
    virtual void Post() = 0;
};
class Derived : public Base
{
protected:
    virtual void Pre() override
    {
    }
    virtual void Post() override
    {
    }
};
Derived* myDerived = new Derived();
myDerived->Job();
|
I would also add a
protected virtual void Actual() = 0;
containing the actual job code, and call that from the Job() method.
Not only does it make things clearer; you can also be sure that the actual job code will have to be overridden in every derived class.
|
Unless the optimizer has changed much, if PreJob and PostJob don't do anything in a class, I believe they are optimized out. It's been a long time since I've done any serious C++, but I seem to remember that this is the case. As the base class is only going to be using VTable lookups here anyway, the overhead should not be significant in real terms.
|
Have you looked at Aspect-Oriented Programming (AOP)? It provides a great alternative to inheritance in cases like yours. I don't know how complicated that triplet of Pre, Actual, and Post calls will become, but my experience tells me that orchestrating overridden virtual calls of that kind quickly becomes a nightmare... A good halfway step to AOP is a design pattern like Chain of Responsibility; in many cases this gives you greater flexibility and better control over the behaviour than (often hacky) orchestration of calls to overridden methods.
By the way, if you are in native C++, have a look at the C++ idiom called Wrap http://www2.research.att.com/~bs/wrapper.pdf - an interesting possibility to get compile-time configurable Aspect-like behaviour. Often (but not always) its flexibility exceeds that of a polymorphic, inheritance-based implementation, though it can have interoperability issues...
Cheers,
Paul
|
From a perfectionist's point of view, having job callers deal with sequences of calls violates encapsulation: job callers do not need to know that a job requires "setup" and "cleanup" steps.
To hide the internals from your callers while keeping the distinction in your implementations, make an abstract class with a skeleton method, then override the pre/post/job methods as needed in your derived classes. This technique is commonly known as the "Template Method" pattern; I prefer "skeleton" to avoid confusion with C++ templates.
void AbstractClass::Job()
{
    doPreJob();
    doJob();
    doPostJob();
}
void DerivedClass::doPreJob() {}
void DerivedClass::doJob() {}
void DerivedClass::doPostJob() {}
void JobCaller()
{
    pAbstractClass->Job();
}
|
Hi,
I am stuck with a problem: I have been asked to fill in details about the application we have built, so that a server can be provided for us.
The details to be filled in are as below:
disk space required to store each associate (KB/MB)
size of each transaction (KB/MB)
So how do I estimate these?
Regards,
Naina
|
nainakarri wrote: So how do i estimate?
Look at the classes you are using and what information they expect to contain, multiply by the expected quantity. This is just simple mathematics.
Just say 'NO' to evaluated arguments for diadic functions! Ash
|
Hi, thanks for the reply.
Is there any tool or way to calculate the size of the data being transported when using the application?
Naina
|
What data, how are you transporting it, where are you processing it? Your question is far too generalised to give a valid answer. Try searching the internet (Google, Bing) for the subject matter you are interested in.
Just say 'NO' to evaluated arguments for diadic functions! Ash
|
Sizes get measured on a development system, and if necessary on a mock-up. Just measure the total size for quantities 1, 10, 100, and 1000 and you'll know the unit size. The quantities you need to estimate yourself. No tools involved, just common sense.
|
I now have to deal with a whole lot of very badly structured solutions and projects. My first bugbear is solution files within project folders. To me this smells very bad, as projects are subordinate to solutions, and maintaining a healthy solution folder structure should not rely only on the solution file contents, but should be implicit in the actual Windows folder structure.
I would like to propose a standard for use by me and my colleagues, suggesting a truly hierarchical structure: a Code folder containing a solution file, with subordinate folders for all projects, containing only .csproj files and source, no solution files! I would also very much like to include some useful background information justifying my proposal, but I don't want to ramble too much and would like sources I could quote, with attribution of course. What good articles, tutorials, books, etc. could I consult for this?
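For illustration, the proposed layout would look something like this (solution and project names are placeholders):

```
Code/
  MySolution.sln
  ProjectA/
    ProjectA.csproj
    (source files)
  ProjectB/
    ProjectB.csproj
    (source files)
```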
|
I'm sure most questions here could be answered by Google. In fact, most of mine are actually answered by Google, but I like forums because they introduce experience factors versus mere PageRank.
|
The first few hits seemed quite to the point, and your post did show interest but not really prior research, hence...
|
lol, hate the cynicism, love the approach.
|
I don't see the problem with having individual solutions. That's a good thing, particularly if a solution can be implemented in isolation. I also can't see a problem with other, more enterprise-type solutions referencing projects within another solution.
I doubt you will find a book or article expressly on the subject, because it is a fundamental principle of OOP. What may be more concerning is a lack of namespacing guidance. As enterprises develop over time, namespacing does become an issue (that's life), and changing namespacing is relatively simple. What you need to consider is backward compatibility: changing namespacing can cause upgrade-deployment issues.
|
I disagree with your ideas.
We have a few solutions, and some of them share projects; e.g., both Solution1 and Solution2 make use of ProjectA. Here, the structure you propose would fail.
Our .sln files reside in the "main" project of a solution, not in a shared project. If shared projects contained .sln files, I would not like that either.
|
What would be a good source of information on the various methodologies and the documentation that should go along with them: who is responsible for the various documents, document templates (industry standards), etc.?
|
This is really a research question, type the methodology name or subject that interests you into Google and see what happens.
Just say 'NO' to evaluated arguments for diadic functions! Ash
|
Microsoft has something they call the Microsoft Solutions Framework, which has definitions of teams and various artifacts. I haven't looked at it since several versions back, when I was studying for the now-retired 70-300 certification exam. It looks like it has been updated to handle some Agile processes since then.
I don't know that I would actually recommend using that particular framework, but at least it will give you a good idea of some of the concepts you should consider.
|
I struggle with this question teaching Project Management. There is no standard. Each organization is different in their approach, and each project within an organization is different. You need to "customize" a methodology to meet the organization's development guidelines and at the same time meet the project requirements.
So keep an open mind and customize the templates and guidance you find to the project at hand.
|
In my project I am using Visual Studio and SQL Server.
During my project review, the reviewers asked me, "Why did you prefer Visual Studio, ASP.NET and SQL Server, and waste money on buying them?"
"Why didn't you go with free and open-source development options like PHP/MySQL?"
I could only reply with some features of Visual Studio.
But I still wonder why Microsoft sells these things when the powerful PHP is free.
How should I reply to the reviewing team? Please give me a good explanation.
|
Well, they seem to be mistaking Visual Studio for a language. PHP is a language, while VS is an IDE. The key thing is that it is integrated: you can do more than just edit source code in it; you can manage your databases, run unit tests, code coverage, profiling, metrics analysis, and so on, all within the IDE.
|