|
George L. Jackson wrote: I failed to see the last line of his post!
I do that a lot these days. I guess in my case, it's an age thing.
Although I target UNICODE, I still tend to use TCHAR.H, though as I do more and more C# it will become less relevant.
Steve S
Developer for hire
|
|
|
|
|
If you are old, I am ancient! I got ten years on you. You were six years old when I wrote my first FORTRAN program in high school.
|
|
|
|
|
I have been playing around with subclassing a RichEdit control to get an XP-like Syslink common control which works on all versions of Windows. I've been reasonably successful but wanted to remove the dependency on RichEdit. I searched CP and came across a couple of classes which look suitable:
1) Hyperlink Text View Control[^]
This is WTL based but has not been updated for a while and did not compile straight away. I changed some of the included files and got the example project to compile under Visual Studio 2005, but when I run the program it fails with a Debug Assertion error "vector iterator not dereferencable". Does anyone know what's wrong with this class, or have a version compatible with VS 2005?
2) XHTMLStatic - An extra-lean custom control to display HTML[^]
This is MFC based and has been updated more recently. It derives from MFC's CStatic, so would there be any problems using it with WTL?
Does anyone have a WTL compatible version of this class ?
|
|
|
|
|
Hi guys. I have been working on this for a long time and am not getting anywhere. I wrote a DLL in ATL 8, and on the dev machine it works just fine. I take it to another computer and all I get is "This application has failed because the application configuration is incorrect. blah blah blah"
I have tried everything I can think of and am out of ideas. I installed the VC redistributables via vcredist_x86.exe. Depends reports no missing dependencies. The same component written in ATL 7 works as expected; that is to say, regsvr32 can register the object in the registry and it can be used. But this ATL 8 issue has me baffled. I think it has something to do with the WinSxS changes but cannot find out how to overcome that. Even if I put the run-time DLLs in the same directory, the registration fails.
Any help? Thanks.
|
|
|
|
|
Well, I solved my own problem. It turns out the problem was regsvr32 and the way it was being called. It just so happened that the DLL I was trying to register had a space in its name (foo bar.dll), so to register it with regsvr32 I had to enclose the name of the DLL in double quotes ("foo bar.dll").
|
|
|
|
|
From the 'I feel I ought to know the answer but I don't' dept.
I need to create a large array of UDTs and I'm short on space. Using std::set would be a no-brainer if I could work out (and justify) how much extra space it'll need at run time compared with a hand-crafted array.
I realise I don't even know how to measure it at run time!
Any hints and tips would be welcome.
Dave Cross
|
|
|
|
|
That all depends on the type of efficiency you are looking for.
If you want memory efficiency, vector or deque is better (deque being better than vector in that case as well). Also, if you need random access capability, you should stick with one of those two collections.
If you want fast sorting efficiency, set is a good choice since it sorts elements as they are inserted.
Keep in mind that set also requires elements to be unique, so if you need elements that might compare equal (e.g. a collection of integers containing two 5s), you will need to use multiset or another collection like vector, deque or list.
Basically, it depends on what you are doing, what type of data you have, and how you intend to use that data.
If you decide to become a software engineer, you are signing up to have a 1/2" piece of silicon tell you exactly how stupid you really are for 8 hours a day, 5 days a week
Zac
|
|
|
|
|
std::set is normally implemented with balanced binary search trees (AVL or red-black). That gives you O(log N) deletion, insertion and lookup of single nodes. Memory usage is O(N).
If you need to measure the memory overhead introduced by any STL container, you can always create your own allocator which just forwards calls to new and delete. In the same call, you can log the memory requested, and compare it against memory needed for your data.
--
In Hypno-Vision
|
|
|
|
|
Thanks for the clue. I traced the .insert() operation and found it made two calls to operator new, for 33 and 72 bytes, when inserting my class (int and char[50], not Unicode).
That looks to me like an overhead of 105-54=51 bytes per entry.
std::list required 97 bytes per entry, but std::vector only 56.
I have to store about 10,000 records, and ~500K seems like too big a price to pay for the convenience of std::set.
Do these results seem reasonable to you?
Dave Cross
|
|
|
|
|
int + char[50] should be 56 bytes with default alignment and packing taken into account (4-byte boundary). That makes the vector allocation sound reasonable.
How many inserts did you make? Do at least 100 inserts, as the various containers may need housekeeping information which doesn't grow with the number of items. I don't see why std::set should keep ~50 extra bytes per element! Most implementations are red-black trees, generally requiring 12 bytes (no packing) per element (colour + child pointers).
Also, do the measurements in release mode, as the STL stores extra debug information in debug builds.
--
Mit viel Oktan und frei von Blei, eine Kraftstoff wie Benziiiiiiin!
|
|
|
|
|
Here[^] is a good example on how to implement your own STL-compliant container. It'll show you how to use allocators.
--
Please rise for the Futurama theme song
|
|
|
|
|
If you can allocate it all at once, std::vector is the most memory efficient - it allocates a single contiguous block of memory and will have something like 12-16 bytes overhead on top (pointer to allocated memory, size of allocated block, use count of allocated block). If the vector has to reallocate, though, you will have high memory usage while the vector allocates a new (larger) contiguous block and copies elements to it.
std::deque is second in terms of overhead. It allocates memory in large-ish chunks, so if you need more elements, it just allocates a new chunk. Obviously, it needs to keep track of each chunk, so you're likely to have 12-16 bytes overhead per-chunk, possibly (I'm guessing here!).
std::set - mmmm, you don't really want to use that unless you need its particular properties, i.e. it's an ordered set and it's pretty efficient for checking for the existence of a particular element (given the ordering function you specify). Even then, it's probably more performant to sort a std::vector and use one of the algorithms (std::lower_bound to get an iterator somewhere near the element, or std::binary_search to show the existence (but not the location!) of the element).
BTW - you may find Effective STL by Scott Meyers[^] useful for STL related hints and tips - tip number 1 is "Choose your containers with care." - hmmm, sounds relevant
HTH!
|
|
|
|
|
Stuart Dootson wrote: Even then, it's probably more performant to sort a std::vector and use one of the algorithms (std::lower_bound to get an iterator somewhere near the element, or std::binary_search to show the existence (but not the location!) of the element)
Not if the vector changes frequently (size and values)!
Stuart Dootson wrote: tip number 1 is "Choose your containers with care." - hmmm, sounds relevant
I'd say it's relevant even at times when one thinks it's not.
--
Mit viel Oktan und frei von Blei, eine Kraftstoff wie Benziiiiiiin!
|
|
|
|
|
Thanks for that.
I have the book and I know the relative merits of the containers. I need the functionality of std::set, but I could code around the limitations of vector or deque if it was worthwhile.
My question is really how to tell, in a given situation, exactly how much extra memory std::set uses compared with, say, std::vector. I can knock up a little test program to dynamically allocate 10,000 items using each method, but how do I tell how much space is allocated to the whole collection? I need something like a run-time_sizeof().
I have the feeling I've forgotten something simple.
Dave Cross
|
|
|
|
|
No, I don't think you have forgotten anything simple. It's difficult to say what memory usage is going to be without looking at the implementation. However, if it's VC 7.1 you're using, then it looks like you have the following memory usage (this is from a running program built for Win32 Release):
The std::set object is 12 bytes. The VC 7.1 std::set implementation is a red-black tree. Each node of the tree is 16 bytes plus the size of the object you've stored. In my case, I've stored pointers, so each node is 20 bytes. Ouch!
So total usage of a VC 7.1 std::set is 12 + n(16 + s), where n is the number of items stored and s is the size of each stored item.
Now, I'd guess most std::set implementations will have a similar level of overhead. So....I'm guessing you don't want to use std::set if you're tight on memory....
--
|
|
|
|
|
Stuart Dootson wrote: Now, I'd guess most std::set implementations will have a similar level of overhead. So....I'm guessing you don't want to use std::set if you're tight on memory
That is correct, at a minimum, since each node in the set must carry at least two pointers (to the left and right children), and some implementations carry a pointer to the parent node as well.
If you are trying to keep your memory footprint down, vector is the way to go (assuming you have a good estimate of how many elements it will contain and can reserve that memory at some point). That said, deque will be more memory-friendly if you have a lot of elements. The reasoning is fairly simple: if you estimate 10,000 elements for your vector but end up adding 10,001, vector will try to reallocate to 15,000 elements and copy over the 10,001. So, for a few moments, you will have more than double the required memory allocated just to add a single element. In the same scenario, deque will just allocate another large chunk of memory somewhere and add your single element in that chunk.
set isn't very memory friendly. It is designed to contain unique elements that are pre-sorted for you, making it more efficient for certain types of operations, but definitely not for memory usage.
If you decide to become a software engineer, you are signing up to have a 1/2" piece of silicon tell you exactly how stupid you really are for 8 hours a day, 5 days a week
Zac
|
|
|
|
|
I created a COM object containing two interfaces, and a client that uses those interfaces. One of the interfaces has a VARIANT property (get and put). I set the value of this variant to a string; then, when I try to allocate a BSTR to communicate with the other interface, I find that the variant property of the other interface changes to the value of the new BSTR, and I don't know why.
I'm using ATL to build both the COM DLL and the client application.
|
|
|
|
|
I found that I was returning the variant simply as follows:
HRESULT get_Value(VARIANT* pVal)
{
    (*pVal) = m_Value;
    return S_OK;
}
but the correct approach is to test for the BSTR type and, if that is what the variant holds, manually allocate a new system string for it:
HRESULT get_Value(VARIANT* pVal)
{
    if (m_Value.vt == VT_BSTR)
    {
        pVal->vt = VT_BSTR;
        pVal->bstrVal = SysAllocString(m_Value.bstrVal);
    }
    else
        (*pVal) = m_Value;
    return S_OK;
}
|
|
|
|
|
Samy_Ibraheem_Mo wrote: HRESULT get_Value(VARIANT* pVal)
BZZZZZZ! Don't do that. The contents of the variant may be an object which is shared (interfaces), or needs to be deep copied (strings). Please use the function VariantCopy() instead.
HRESULT get_Value(VARIANT* pVal)
{
    if (!pVal) return E_POINTER;
    return ::VariantCopy(pVal, &m_Value);
}
--
-= Proudly Made on Earth =-
|
|
|
|
|
HI.
I'm developing an ATL control with composite controls, and it's used in a VB application. Everything went well until it started to throw an error when loading the VB application, after which I'm not able to open the VB project again.
It looks like there's a problem with versions or something like that. But I wrote a new ATL control and the same error happened, so now I'm lost: any new control I build will fail the first time it's used in VB, and then fail every time after that.
Maybe it's something easy and dumb to solve, but I can't work out what it is.
Please I need your help.
thank you.
Demian.
"I have always wished that my computer would be as easy to use as my
telephone. My wish has come true. I no longer know how to use my telephone."
-Bjarne Stroustrup, computer science professor, designer of C++
programming language (1950- )
|
|
|
|
|
Hi
Please bear with me as I am new to ATL.
I am trying to reproduce the "Attributes Tutorial" in MSDN online.
The solution consists of:
One ATL exe singleton server (apartment threaded) with two interfaces: IServer with one method [HRESULT Send(VARIANT data)] and _IServerEvents with one event [HRESULT Transfer(VARIANT data)]. This server has the attribute: event_source(com)
and
One ATL dll client (single threaded) also has two interfaces: IClient with one method [HRESULT Send(VARIANT data)] and _IClientEvents which at the moment I have left empty. There is also one public member function HRESULT CClient::Transfer(VARIANT data) . This Client has the attributes: event_source(com), event_receiver(com, true) both set.
What I am trying to do is __hook the Transfer event from the server to the Transfer function in the client using:
CComPtr<IServer> m_pIServer;
bool m_bConnected;
HRESULT hr;
hr = m_pIServer.CoCreateInstance(__uuidof(CServer));
if (SUCCEEDED(hr))
hr = __hook(_IServerEvents, m_pIServer);
if (hr == S_OK)
m_bConnected = true;
Everything compiles without problems, and the CoCreateInstance() works at runtime; I just can't seem to get the __hook() to work. I am using the two-argument version because my client DLL has both event_source(com) and event_receiver(com, true) set.
Any help would be great.
Thanks
Steve Manikiam
|
|
|
|
|
Did you include the following line in the client object's declaration?
__event __interface _ICISDropListEvents;
Regards,
Alex Boyce
|
|
|
|
|
After reading http://www.codeproject.com/vcpp/stl/vector_vs_deque.asp,
I realize that deque has its own speed advantage over vector in almost every aspect (except for memory deallocation). Does that mean we should prefer deque over vector (in contrast with the C++ standard, which recommends vector over deque)?
thanks!
|
|
|
|
|
That depends on what you are trying to do with the container. For example, if you are using the container to store chars (that is, using it instead of the string class), vector is a much better choice than deque, since you can do things like the following (as long as the vector is non-empty and null-terminated):
vector<char> charVec;
const char text[] = "hello";               // sizeof includes the null terminator
charVec.assign(text, text + sizeof(text));
cout << &charVec[0] << endl;
Which would not work using a deque, because its storage isn't contiguous.
If you are using the container as a pool of objects, then deque is a better choice.
If you decide to become a software engineer, you are signing up to have a 1/2" piece of silicon tell you exactly how stupid you really are for 8 hours a day, 5 days a week
Zac
|
|
|
|
|
Zac Howland wrote: If you are using the container as a pool of objects, then deque is a better choice.
Only if you need random access. list/queue is much better when random access isn't needed and insertion order (or queue semantics) is what matters.
--
Simulcast on Crazy People's Fillings
|
|
|
|