|
I don't know!
Because that is what I found in the dbcore.cpp file while debugging my app.
|
|
|
|
|
Chances are, at the point of that failed call, either pDbState has been deleted
and/or the object containing it has been destroyed.
Mark
Mark Salsbery
Microsoft MVP - Visual C++
|
|
|
|
|
But I can't make any changes to that code! It belongs to Microsoft, not me!
I encounter the same error even with the simplest possible code:
<code>
CMyDatabaseClass db;
db.Open();
db.Close(); // <b>Immediately close it</b>
</code>
Please, someone, help me.
|
|
|
|
|
The problem may be somewhere else.
There are only a few places where CDatabase::Free() is called from.
Try putting a breakpoint in there and see how many times it gets called.
Also try stepping through the Open() and Close() calls - anything fishy going on there?
Mark
|
|
|
|
|
Mark Salsbery wrote: anything fishy going on there?
Wow, what are you thinking? One filet-o-fish coming right up! Would you like special sauce on that?
led mike
|
|
|
|
|
A Freudian Slip for sure! Now I'm hungry dammit...
Mark Salsbery
|
|
|
|
|
How do I declare and initialise a global variable, exported from one DLL, and use it from various files?
Let me be more specific. Assume that I have two libraries (i.e. DLLs) in my application, A and B. Library A consists of the A_1.cpp and A_2.cpp files (among others). Library B consists of a B_1.cpp source.
The requirement is that both use a global variable A_X, which is stored in the A DLL.
In UNIX using gcc, I declare the variable in a common header, A_X.h. Thus A_X.h looks like
<br />
extern int A_X; <br />
Then in library A my two sources look like
A_1.cpp source
<br />
#include "A_X.h"<br />
<br />
<br />
int A_X = 10;<br />
A_2.cpp source
<br />
#include "A_X.h"<br />
<br />
.... code using A_X;<br />
and in library B the source code of B_1.cpp is looking like
<br />
#include "A_X.h"<br />
<br />
.... code of B_1 using A_X;<br />
I have tried the following in Windows:
A_X.h header :
<br />
#ifdef LIB_IS_A<br />
__dllspec(dllexport) extern int A_X; <br />
#else<br />
__dllspec(dllimport) extern int A_X; <br />
#endif<br />
and the
A_1.cpp source in A dll
<br />
#include "A_X.h"<br />
<br />
<br />
__dllspec(dllexport) int A_X = 10;<br />
The rest of the sources are the same.
Although the solution worked, I get the following warning in Win32 when I try to build A.dll:
A_1.obj : warning LNK4197: export "?A_X@@3IA" specified multiple times; using first specification
How can I declare my A_X to be a global variable? I.e., declare it in a header, use it in both A.dll and B.dll, store it in A.dll once, and be able to use it from any source that includes that header?
|
|
|
|
|
In DLL:
<br />
__declspec(dllexport) int g_iYourVariable;<br />
In EXE files:
<br />
__declspec(dllimport) int g_iYourVariable;<br />
|
|
|
|
|
Apologies for the typo in the original post: where it says __dllspec, read <code>__declspec</code>.
However, I am not sure that I understand, since the only things I am building are two DLLs (EXEs come much further down the line, and they do not really know the variable).
Thus, how do I create g_iYourVariable, initialise it in one of the source files of the first DLL, use it in all other sources of the same DLL, export it, and be able to import it (using __declspec(dllimport)) in any other DLL?
|
|
|
|
|
Thus, how do I create g_iYourVariable, initialise it in one of the source files of the first DLL, use it in all other sources of the same DLL, export it, and be able to import it (using __declspec(dllimport)) in any other DLL?
1 - define a handy macro to make it easy for your DLL to use dllexport and allow clients to use dllimport. Put it in a file, call it "MyDLLExports.h".
#ifdef BUILDING_MYDLL<br />
# define MyDLLExport __declspec(dllexport)<br />
#else<br />
# define MyDLLExport __declspec(dllimport)<br />
#endif
2 - compile all sources of your DLL with the preprocessor definition 'BUILDING_MYDLL' added - when building only your DLL.
3 - define a header file where your variable will exist. e.g.
#ifndef __MyDLL_MyVar_h__<br />
#define __MyDLL_MyVar_h__<br />
<br />
#include "MyDLLExports.h"<br />
<br />
MyDLLExport extern int MyDLLmyVar;<br />
<br />
#endif // __MyDLL_MyVar_h__
|
|
|
|
|
That's almost exactly how I did it. It seems to work, but I still get a linker warning about multiple definitions. I think the problem comes from the difference between the declaration and the definition with initialisation.
To be more specific, my code originates from Unix. What I used to have there was
File export.h
<br />
#ifndef _my_variable_decl<br />
#define _my_variable_decl<br />
<br />
extern int my_var;<br />
<br />
#endif<br />
which was included by all the clients.
However, because the variable is extern, I have to define it in a local file in one of my libraries. The definition and its initialisation are
File globals.cpp
<br />
#include <export.h><br />
<br />
int my_var = 10;<br />
Now migrating to Win32 I transform it
File export.h
<br />
#ifndef _my_variable_decl<br />
#define _my_variable_decl<br />
<br />
#ifdef _WIN32<br />
#ifdef _MY_DLL <br />
#define MY_EXTERN __declspec(dllexport) extern<br />
#else <br />
#define MY_EXTERN __declspec(dllimport) extern<br />
#endif<br />
#else<br />
#define MY_EXTERN extern<br />
#endif<br />
<br />
MY_EXTERN int my_var;<br />
#endif<br />
and modified the initialisation as follows
File globals.cpp
<br />
#include <export.h><br />
<br />
#ifdef _WIN32<br />
#define MY_GLOBAL _declspec(dllexport)<br />
#else<br />
#define MY_GLOBAL <br />
#endif<br />
<br />
MY_GLOBAL int my_var = 10;<br />
As you said, _MY_DLL is set only when I compile the main DLL (part of which is globals.cpp). Other DLLs include export.h. This code works, and my_var behaves as a global variable inside the main DLL and in any other DLL (note also that it is still compatible with UNIX). However, when I build the main DLL I get a warning:
globals.obj : warning LNK4197: export "?my_var@@3IA" specified multiple times; using first specification
which is apparently because the export is specified twice (at least that's what my Visual C++ 6.0 linker says).
I tried to remove the extern keyword in the Windows part of export.h, but then it does not compile. I tried to add __declspec(selectany) and again I get the same warning.
It is an annoying warning and nothing more, so I am inclined to endure it, since my_var is a real global variable.
I thought to reshape the code as following
File export.h
<br />
#ifndef _my_variable_decl<br />
#define _my_variable_decl<br />
<br />
#ifdef _WIN32<br />
#ifdef _MY_DLL <br />
#define MY_EXTERN(_type, _var, _val) __declspec(dllexport) _type _var = _val<br />
#else <br />
#define MY_EXTERN(_type, _var, _val) __declspec(dllimport) _type _var<br />
#endif<br />
#else<br />
#define MY_EXTERN(_type, _var, _val) extern _type _var<br />
#endif<br />
<br />
MY_EXTERN(int, my_var, 10); <br />
<br />
#endif<br />
File globals.cpp
<br />
#include <export.h><br />
<br />
#ifdef _WIN32<br />
#define MY_GLOBAL(_type, _var, _val)<br />
#else<br />
#define MY_GLOBAL(_type, _var, _val) _type _var = _val<br />
<br />
#endif<br />
<br />
MY_GLOBAL(int, my_var, 10);<br />
but by doing that I am not sure how many instances of my_var I would have. Would it be one, or more than one, since export.h will be included by more than one source in the main DLL? It also introduces the danger that the initial value is specified in two different places (one for Unix and one for Win32), which may lead to bugs in the future (forgetting to update one of the two).
modified on Thursday, April 10, 2008 9:33 AM
|
|
|
|
|
There may be an easier way to do this, but one way to create global shared memory is to use CreateFileMapping in one DLL and OpenFileMapping in the other.
Pseudo-code is something like:
DLL #1
------
int *pMemory;
hMem = CreateFileMapping(....."MySharedMemory");
pMemory = (int *)MapViewOfFile(hMem);
*pMemory = 3;
DLL #2
-------
int *pMemory;
int Value;
hMem = OpenFileMapping(....."MySharedMemory");
pMemory = (int *)MapViewOfFile(hMem);
Value = *pMemory;
Not sure if this is 100% correct; it's been a while since I used it. Search CodeProject for "Shared Memory with IPC with threads" to get more details.
|
|
|
|
|
Thanks, but I think shared memory will complicate my code immensely. I gave it a thought, but it is too much of a complication. In the worst case I'll keep it as it is and ignore the compiler warnings. After all, despite the warnings, the existing code seems to work (though I'm not sure whether that's by accident).
|
|
|
|
|
Hello everyone,
In the book ATL Internals, it is mentioned that the properties of CComPtr are:
--------------------
Release the encapsulated interface pointer when the class destructor executes;
Automatically releases its interface pointer during exception handling, ...
--------------------
Do they both mean invoking the Release method (inherited from IUnknown) of the interface?
thanks in advance,
George
|
|
|
|
|
|
Thanks led mike,
Question answered.
regards,
George
|
|
|
|
|
Hello everyone,
1.
A BSTR is always a wide-character buffer. I do not know why the name of the macro A2WBSTR is so special and adds a seemingly needless 'W'.
For the other macros dealing with BSTRs, since a BSTR is already a wide-character buffer, no 'W' is needed, as in W2BSTR and A2WSTR.
2.
What are the differences between A2WBSTR and A2BSTR? I cannot find the answer by searching.
thanks in advance,
George
|
|
|
|
|
Yes, AFAIK, BSTRs were (and are) always wide-character strings.
Perhaps the reason they did this was to specify the encoding of the characters in the BSTR? Likely not, but just a guess.
Of course, you could always just look at how those macros/functions are implemented to try and find differences!
Peace!
-=- James
|
|
|
|
|
|
On my system, provided the _ATL_EX_CONVERSION_MACROS_ONLY symbol is not defined, they are the same:
#if defined(_UNICODE)
inline BSTR T2BSTR_EX(__in_opt LPCTSTR lp) {return ::SysAllocString(lp);}
inline BSTR A2BSTR_EX(__in_opt LPCSTR lp) {return A2WBSTR(lp);}
inline BSTR W2BSTR_EX(__in_opt LPCWSTR lp) {return ::SysAllocString(lp);}
#ifndef _ATL_EX_CONVERSION_MACROS_ONLY
inline BSTR T2BSTR(__in_opt LPCTSTR lp) {return ::SysAllocString(lp);}
inline BSTR A2BSTR(__in_opt LPCSTR lp) {return A2WBSTR(lp);}
inline BSTR W2BSTR(__in_opt LPCWSTR lp) {return ::SysAllocString(lp);}
#endif // _ATL_EX_CONVERSION_MACROS_ONLY
#else // !defined(_UNICODE)
inline BSTR T2BSTR_EX(__in_opt LPCTSTR lp) {return A2WBSTR(lp);}
inline BSTR A2BSTR_EX(__in_opt LPCSTR lp) {return A2WBSTR(lp);}
inline BSTR W2BSTR_EX(__in_opt LPCWSTR lp) {return ::SysAllocString(lp);}
#ifndef _ATL_EX_CONVERSION_MACROS_ONLY
inline BSTR T2BSTR(__in_opt LPCTSTR lp) {return A2WBSTR(lp);}
inline BSTR A2BSTR(__in_opt LPCSTR lp) {return A2WBSTR(lp);}
inline BSTR W2BSTR(__in_opt LPCWSTR lp) {return ::SysAllocString(lp);}
#endif // _ATL_EX_CONVERSION_MACROS_ONLY
#endif // defined(_UNICODE)
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
|
|
|
|
|
Hi CPallini,
After checking the implementation on my system (MSVC 2008), my conclusion is:
A2BSTR is for null-terminated string input, while A2WBSTR can also accept a non-null-terminated buffer with an explicit length argument.
Correct?
_Check_return_ inline BSTR A2WBSTR(_In_opt_ LPCSTR lp, int nLen = -1)
{
if (lp == NULL || nLen == 0)
return NULL;
USES_CONVERSION_EX;
BSTR str = NULL;
#pragma warning(push)
#pragma warning(disable: 6385)
int nConvertedLen = MultiByteToWideChar(_acp_ex, 0, lp,
nLen, NULL, NULL);
#pragma warning(pop)
int nAllocLen = nConvertedLen;
if (nLen == -1)
nAllocLen -= 1;
str = ::SysAllocStringLen(NULL, nAllocLen);
if (str != NULL)
{
int nResult;
nResult = MultiByteToWideChar(_acp_ex, 0, lp, nLen, str, nConvertedLen);
ATLASSERT(nResult == nConvertedLen);
if(nResult != nConvertedLen)
{
SysFreeString(str);
return NULL;
}
}
return str;
}
regards,
George
|
|
|
|
|
I don't know, because you did not post the A2BSTR definition (in your MSVC 2008).
|
|
|
|
|
Here it is, CPallini.
Do you think my previous analysis is correct?
inline BSTR A2BSTR(_In_opt_ LPCSTR lp) {return A2WBSTR(lp);}
regards,
George
|
|
|
|
|
Your analysis is wrong.
A2BSTR does no more and no less than call A2WBSTR, i.e. their effect is identical (there is not even a difference in the stack levels involved, since A2BSTR is declared inline).
BTW, all BSTRs must have a length field (which is the reason why you typically call SysAllocString to allocate a BSTR rather than doing it directly).
|
|
|
|
|
Thanks CPallini,
CPallini wrote: A2BSTR does no more and no less than call A2WBSTR, i.e. their effect is identical
I do not agree. With A2WBSTR you can specify the second argument, which is the length of the ANSI input string, but with A2BSTR you cannot; you can only use the implicit default of -1.
Agree? Any comments?
regards,
George
|
|
|
|