|
Hi
I had this same question earlier, and someone told me that linkers are stupid. That's about all I can tell you, because I'm a newbie myself. The solution to your question is to undefine GetObject, like so:
#undef GetObject
If you use System::Windows::Forms::MessageBox you will also need to undefine MessageBox.
Hope that helped, and if anyone could explain why this is, that would be great.
|
|
|
|
|
|
Since you're using CString in your unmanaged project, you're using MFC. Throughout the MFC and Windows headers there are many #defines so that code can be compiled using either ANSI characters or wide characters. When you compile, the headers check which character set you're building for and substitute the appropriate function name. In your case it was "GetObjectA", meaning your code was being compiled for ANSI (8-bit) characters. You have to do it for the message box as well because there are both MessageBoxA and MessageBoxW functions, so it can compile using either ANSI or wide strings.
I hope that made some sense.
I'm not sure that's completely right, as I'm not a pro programmer or anything.
- monrobot13
|
|
|
|
|
I'm not sure about the .NET stuff, still learning it myself. But from an MFC perspective, xxxA is ANSI, i.e. 8-bit chars, whilst xxxW is Unicode (UCS-2), i.e. 16-bit characters. The normal way to control this is with a preprocessor define of MBCS or UNICODE. (A multi-byte character set is just ANSI 8-bit characters, with certain lead bytes defined to encode further characters using a second 8-bit character.)
|
|
|
|
|
Slaru wrote:
someone
Me
Slaru wrote:
told me that linkers are stupid
No, I told you preprocessors are stupid. Somewhere in windows.h there is a preprocessor directive that turns every MessageBox into MessageBoxA or MessageBoxW, depending on whether it is a Unicode or non-Unicode build. Now, the *stupid* preprocessor does not know that your MessageBox is the .NET MessageBox and not the Win32 API MessageBox, so it turns System::Windows::Forms::MessageBox into System::Windows::Forms::MessageBoxA.
|
|
|
|
|
Here's the scenario:
// MyUnmanagedDll.dll
__declspec(dllexport) bool MyDll::Go(unsigned char *ptr)
{
    // read from ptr and print out ptr's contents
    MyPrint("During MyDll, ptr contains\n", ptr);
    return true;
}

// MyManaged.cpp
namespace A
{
    [DllImportAttribute("MyDll.dll", EntryPoint="Go")]
    extern "C" __declspec(dllimport) bool Go(unsigned char *ptr);
}

main()
{
    unsigned char* myPtr = new unsigned char(200);
    for(int i=0; i<200; i++)
    {
        myPtr[i] = '0';
    }
    MyPrint("Before calling MyDll, ptr contains\n", myPtr);
    MyDll::Go(myPtr);
    MyPrint("The ptr now contains after call from MyDll\n", myPtr);
}
Output:
Before calling MyDll, ptr contains
00000000...(200 times)
During MyDll, ptr contains
00000008072000...(rest 0)
After MyDll, ptr contains
00000008072a0710000...(rest 0)
What is going on here? I tried marshaling the data from "unsigned char gcPtr __gc[]" into an LPArray, and the same thing happened! I've also tried casting to an IntPtr, with the same results, and tried GC::KeepAlive(gcPtr), with no luck.
The corrupt data are 4 consecutive bytes during the DLL call, and 8 consecutive bytes when it returns. They fall somewhere around the 150th to 180th byte; it varies (not sure why).
Any help would be greatly appreciated. Thanks much in advance!!
Cal
|
|
|
|
|
>unsigned char* myPtr = new unsigned char(200);
This line allocates only a single character, initialized to the value 200, not an array of 200 characters. To allocate the array, use new unsigned char[200]; (and free it later with delete[]).
READIN writin rhythmetic
|
|
|
|
|
Hi All,
I'm using the Magic library in my MC++ app, but I'm running into a problem trying to access one of its enums. The enum is declared like this in the source for the library:
public enum DisplayTabModes
{
HideAll,
ShowAll,
ShowActiveLeaf,
ShowMouseOver,
ShowActiveAndMouseOver
}
and the property is declared like this:
public DisplayTabModes DisplayTabMode
{
get { return _displayTabMode; }
set
{
if (_displayTabMode != value)
{
_displayTabMode = value;
Notify(TabGroupBase.NotifyCode.DisplayTabMode);
}
}
}
When I try to change the property using this code:
tabGroup->DisplayTabMode = TabbedGroups::DisplayTabModes::HideAll;
I get this error:
error C2248: 'HideAll' : cannot access protected enumerator declared in class 'Crownwood::Magic::Controls::TabbedGroups::DisplayTabModes'
Maybe I'm missing something, but it looks like "HideAll" is public. Anyone know what I'm doing wrong? Any help is much appreciated.
- monrobot13
|
|
|
|
|
Shouldn't your code be like this
Notify(TabGroupBase::NotifyCode::DisplayTabMode);
instead of
Notify(TabGroupBase.NotifyCode.DisplayTabMode);
A wild guess.
Sonork 100.41263:Anthony_Yio
|
|
|
|
|
Anthony_Yio wrote:
instead of
// Propogate to all children
Notify(TabGroupBase.NotifyCode.DisplayTabMode);
That code is in the library, which is written in C#, so the dots are correct.
|
|
|
|
|
Yes, the Magic library is written in C#, but when you are using MC++ syntax you need to qualify names with :: instead of a period.
The period means a different thing in MC++.
Sonork 100.41263:Anthony_Yio
|
|
|
|
|
Hi all
I am writing a Managed C++ class which internally uses unmanaged classes.
For my class to work, I need to pass a callback function pointer to one of my unmanaged classes, as shown below:
__gc class MyClass : public System::Object
{
    UnmanagedClass* pUMClass;
    static void MyStatic(void*, unsigned long, unsigned long);
    MyClass(void)
    {
        pUMClass = new UnmanagedClass();
        pUMClass->SetCallback(&MyClass::MyStatic);
    }
};
Is it possible to pass a managed method pointer to an unmanaged class?
If it is possible, how can I cast the void* parameter (which I'm setting as the client data) back to my class type?
Thanks in advance
|
|
|
|
|
hi,
I want to redirect the Windows telnet.exe to an MFC GUI (CRichEditCtrl).
I have managed to redirect cmd.exe, but not telnet.exe or ftp.exe. Can someone help me? Please e-mail me at
lior_zar@hotmail.com
|
|
|
|
|
I did a coordinate transformation to flip the X-axis in a Graphics object. All the plotting works great, but when I use DrawString the text is flipped around the x-axis. Does DrawString not know about the coordinate transformation? How would one fix this?
Thanks for your help,
icdma
|
|
|
|
|
Sorry, I flipped the Y-Axis not the X-AXIS. But DrawString flips around the x-axis...
|
|
|
|
|
hi;
How can I input voice in a C++ program through a microphone, and is there any way I can control my program, written in C++, with voice commands? I really need detailed help. Thanks.
|
|
|
|
|
Ok, here's what must be the stupidest question on the board, but I have to ask.
I'm trying to migrate from C++ (VC6) to the new .NET style of managed C++.
I'm trying to enumerate any/all NIC cards on the system and get a list of valid IP addresses using the System::Management::ManagementClass object.
At one point in my code I'm calling:
ManagementObject * mo = new ManagementObject();
mo->get_Item("IPEnabled")
Now the question
This get_Item() function returns a System::Object pointer. As I debug and step through my code, I can see this pointer does indeed point to an object which toggles between "true" and "false".
What I'm trying to do is simple:
if(mo->Equals(true))
{
    GO DO SOME STUFF
}
That if statement does not compile. Calling Equals(false) compiles, but always returns "true" (I guess because mo is not a NULL pointer). How do I test the value of a generic system object to see if its "value" is true or false?
This seems like it should be so simple, yet it's turning out to be so hard it's embarrassing. Can anyone point me in the right direction?
Phrustrated Phil
|
|
|
|
|
Phil C wrote:
What I'm trying to do is simple:
if(mo->Equals(true))
{
GO DO SOME STUFF
}
What you are doing there is totally absurd! A ManagementObject can NEVER EVER be a Boolean.
Now, if you are doing it as you did originally, AFAIK you just need to unbox the returned object pointer, e.g.
if (__unbox(mo->get_Item("IPEnabled")))
{
}
leppie::AllocCPArticle(Generic DFA State Machine for .NET);
|
|
|
|
|
Oops, I wrote it wrong while trying to trim away all the fat and make it legible.
What I'm trying should have read:
Object *booltest = mo->get_Item("IPAddress");

if(booltest)
{
    DO STUFF
}

if(booltest->Equals(true))
{
    DO STUFF
}
Basically, the entire mo->get_Item("IPEnabled") call was originally inside my if statement (like you wrote), and I was trying to break it down into pieces to determine where I was going wrong.
Thank you for telling me I'm totally absurd. I was really hoping someone like you would tell me how stupid I am even though I admitted right up front that this is a pretty embarrassing question.
I think you did point me to the key statement though. I haven't figured out the whole box/unbox theory yet. For that I do thank you sincerely.
Less Phrustrated Phil
|
|
|
|
|
If I declare a variable as
Int32 arr __gc[] = new Int32 __gc[100];
and I have a legacy function which takes an int * and a count, is there any way to pass arr and 100 to that function? I'm thinking of something along the lines of
Int32 __pin *ptr = arr;
f(ptr, 100);
Will this work? I mean arr is actually pointing to a System::Array, right?
READIN writin rhythmetic
|
|
|
|
|
Jeremy Osner wrote:
I mean arr is actually pointing to a System::Array, right?
arr is pointing to a class derived from System::Array.
|
|
|
|
|
Right -- so is there any way I can get an interior pointer to the memory that implements the array?
READIN writin rhythmetic
|
|
|
|
|
Yes!! I have found the solution at MSDN -- I pin the pointer to the first element of the array.
READIN writin rhythmetic
|
|
|
|
|
|
> __int32 __pin * ptr = (int*)&array[0];
Why do you put that (int *) cast in? The line compiles without the cast -- I would think the cast would generate a compiler error -- can't convert a __gc pointer to an unmanaged pointer. No?
READIN writin rhythmetic
|
|
|
|