Try:
#include <iostream>   // <iostream.h> is the pre-standard header; use <iostream>

int main()
{
    std::cout << '\a';  // '\a' is the alert (bell) character
}
if that's what you mean...
bye
---
I'm using the new Visual C++ Express beta and when I try to compile a file containing this code:
int fileLength = binRead->ToString()->Length;
array<Byte>^ sendFile = gcnew array<Byte>(fileLength);
I get the following errors:
error C2065: 'array' : undeclared identifier
error C2275: 'System::Byte' : illegal use of this type as an expression stdafx.cpp : see declaration of 'System::Byte'
error C2065: 'sendFile' : undeclared identifier
error C3192: syntax error : '^' is not a prefix operator (did you mean '*'?)
error C2061: syntax error : identifier 'array'
From several articles I've read, it seemed to me as though I was declaring the array in the proper manner. Can anyone give me a hand with this? Could the error possibly be elsewhere in my code?
[Edit] I know that when declaring native arrays the length must be a constant; is that also the case with C++/CLI arrays?
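On the [Edit] question, an aside (not from this thread): in native C++ the length of an automatic array must be a compile-time constant, but heap allocations and std::vector accept a runtime length, and a C++/CLI `gcnew array<Byte>(fileLength)` likewise takes a runtime length. A native sketch of the contrast, with a hypothetical helper name:

```cpp
#include <vector>

// Hypothetical sketch: runtime-sized buffers in native C++.
std::vector<unsigned char> MakeSendFile(int fileLength)
{
    // unsigned char nativeBuf[fileLength];  // ill-formed in standard C++:
    //                                       // the array bound must be constant
    return std::vector<unsigned char>(fileLength); // runtime length is fine
}
```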
---
BrianOlej wrote:
error C3192: syntax error : '^' is not a prefix operator (did you mean '*'?)
If the compiler does not recognize that operator, it might indicate that you do not have the project configured properly for managed extensions.
"No matter where you go, there your are." - Buckaroo Banzai
-pete
---
I would have to have rethought my entire life if that was the problem. Naw, when I comment out the arrays the rest of the managed code compiles fine.
Signature under construction.
---
Sorry. Shot in the dark... missed everything and hit myself in the head
"No matter where you go, there your are." - Buckaroo Banzai
-pete
---
#using <System.dll>
using namespace stdcli::language;
That may help.
Thank You
Bo Hunter
---
Perfect! Thank you, it solved my problem.
Signature under construction.
---
Interesting... I thought that was automatically imported for every /clr build. Maybe not in Beta 1, I guess.
Matt
---
Yeah, I thought so as well, I guess not.
Signature under construction.
---
I believe mscorlib.dll is the only automatic import, but I have been wrong before.
Thank You
Bo Hunter
---
How can I repair and compact a Microsoft Access database with ADO.NET?
Thanks
---
I don't know if that is possible. You may have to use the JET engine for those operations.
"No matter where you go, there your are." - Buckaroo Banzai
-pete
---
What is the difference between Managed C++ and just plain old C++??
Also, does "managed" have anything to do with the whole "Trusted Computing" deal M$ is trying to implement?
Mark
As you journey through life take a minute every now and then to give a thought for the other fellow.
He could be plotting something.
-Hagar the Horrible
---
hello,
Can anyone tell me how to force an MFC window to be the parent window of a .NET form (System.Windows.Forms.Form)? I want to create a modeless .NET dialog (form) within my MFC application.
---
I'm just new to .NET, so I just plain don't know how most stuff is done.
My current problem is that I have multiple identical forms open at once. Each form has a TreeView control. These TreeViews are all supposed to display the same data, although each separate form might have a different area of the tree expanded. My problem: if I add a node somewhere in the TreeView on one form, how do I make the TreeViews on all the other forms update themselves with the new node, without disturbing which areas are expanded on each form?
If it's useful, the data the TreeViews display is stored separately, so if I have to rebuild the whole of each TreeView from that data I can, but of course that would mean remembering which parts of the view are expanded, which just sounds nasty.
Thanks
Sam
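One common approach (a sketch, not from the thread, and framework-agnostic rather than WinForms; all names are hypothetical): keep the tree data in one shared model and have each form subscribe for change notifications, so only the new node is inserted into each view and expansion state is never touched.

```cpp
#include <functional>
#include <string>
#include <vector>

// Shared model: every open form registers a listener; AddNode
// notifies them all so each view can insert just the new node.
class TreeModel {
public:
    using Listener = std::function<void(const std::string& parent,
                                        const std::string& node)>;

    void Subscribe(Listener l) { listeners_.push_back(std::move(l)); }

    void AddNode(const std::string& parent, const std::string& node) {
        nodes_.push_back(node);
        for (auto& l : listeners_) l(parent, node); // fan out to all views
    }

private:
    std::vector<std::string> nodes_;
    std::vector<Listener> listeners_;
};
```

In WinForms terms, each form's handler would call something like `treeView->Nodes->Add(...)` on the matching parent node; since only the new node is inserted, the expanded/collapsed state of the rest of that form's tree is left alone.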
---
I want to write a method that converts a .NET string (encoded in UTF-16, as I understand it) to a UTF-8 encoded string. The prototype for this method might be something like this:
Convert(String *unicodeText, char buffer[], int buflen);
The intent is to populate the buffer with a null-terminated, UTF-8 encoded string logically equivalent to the .NET string passed in as the first argument. Of course, buflen is there to prevent overflow, although in practice it will never impose any real limitation.
I've explored the Encoder and Encoding classes, but as far as I can tell they only get me to a managed array of System::Char or System::Byte. Perhaps I can cast a System::Byte to a (char *) or a char. Can I use Marshal::StringToXXXX to convert it to ASCII and/or Unicode and then use some other function?
Any help? I'd be extremely grateful.
Jerry
jerry@cs.stanford.edu
---
You certainly have to use System::Runtime::InteropServices::Marshal to copy memory of any kind from managed to unmanaged.
If you have already converted to UTF-8 in managed code, then you should be able to marshal the memory into an unmanaged buffer, right?
"No matter where you go, there your are." - Buckaroo Banzai
-pete
---
This is the best solution I ultimately came up with. It's not all that satisfying, because I'd think there would be something that lets me do the equivalent of strcpy from a Byte[] to a char[], but if there is, I'm unaware of it.
// Populates the specified buffer with the UTF-8 encoded,
// null-terminated C string logically equivalent to the
// given Unicode .NET string.

// wrapper around the more generic overload
void Convert(String *unicodeText, char buffer[], int buflen)
{
    Convert(unicodeText, buffer, buflen, Encoding::UTF8);
}

void Convert(String *unicodeText, char buffer[], int buflen,
             Encoding *encoding)
{
    Byte bytes __gc[] = encoding->GetBytes(unicodeText);
    if (bytes->Length > buflen - 1) throw new Exception();
    for (int i = 0; i < bytes->Length; i++)
        buffer[i] = (char)bytes[i]; // System::Byte narrows to char
    buffer[bytes->Length] = '\0';
}
A UTF-8 representation of "hello" is just like the traditional ASCII representation, since all five characters map to numbers less than 128: for code points below 128, UTF-8 and ASCII are identical. But German strings containing umlauts and eszetts, such as "FluBen" (where we'll pretend the u is really a u-umlaut and the B is the double s, a.k.a. eszett), require more bytes. "FluBen" would marshal to 8 bytes, because the center two characters have UTF-8 mappings that are two bytes wide.
If anyone knows of a more official solution to this, then I'd be delighted to hear of it. Thanks for your time.
Jerry
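As a cross-check of the byte counts above, here is a small standalone (purely native, no .NET) UTF-8 encoder sketch. It is not from the original post: it handles BMP code points only and ignores surrogate pairs, so it is an illustration rather than a complete converter.

```cpp
#include <cstdint>
#include <string>

// Encodes a UTF-16 string as UTF-8, assuming no surrogate pairs
// (i.e., every char16_t is a complete BMP code point).
std::string Utf8Encode(const std::u16string &utf16)
{
    std::string out;
    for (char16_t c : utf16) {
        uint32_t cp = c;
        if (cp < 0x80) {                               // 1 byte: ASCII range
            out += static_cast<char>(cp);
        } else if (cp < 0x800) {                       // 2 bytes: e.g. umlauts
            out += static_cast<char>(0xC0 | (cp >> 6));
            out += static_cast<char>(0x80 | (cp & 0x3F));
        } else {                                       // 3 bytes: rest of BMP
            out += static_cast<char>(0xE0 | (cp >> 12));
            out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
            out += static_cast<char>(0x80 | (cp & 0x3F));
        }
    }
    return out;
}
```

With this, "hello" encodes to 5 bytes, while a six-letter word whose middle two letters are a u-umlaut (U+00FC) and an eszett (U+00DF) encodes to 8 bytes, matching the count described above.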
---
Hi. I just have a little question: is it possible to see a DLL import's map? When I tried to do so in a managed C++ assembly, instead of "XXX.dll" I saw a commented "No map", but in 'normal' (C# or VB.NET) assemblies I see everything. There should be a way to see the map (the name of the DLL holding the exported function) in a managed C++ exe, right?
Here's what I got when I tried to peek at a DLL import using ILDASM:
.method public static pinvokeimpl(/* No map */)
int32 modopt([Microsoft.VisualC]Microsoft.VisualC.IsLongModifier) modopt([mscorlib]System.Runtime.CompilerServices.CallConvStdcall)
ThemepDemoCheck(uint16 modopt([Microsoft.VisualC]Microsoft.VisualC.IsConstModifier)* A_0) native unmanaged preservesig
{
.custom instance void [mscorlib]System.Security.SuppressUnmanagedCodeSecurityAttribute::.ctor() = ( 01 00 00 00 )
// Embedded native code
// Disassembly of native methods is not supported.
// Managed TargetRVA = 0x001D65D7
} // end of method 'Global Functions'::ThemepDemoCheck
Thanks in advance
---
I am currently reading ".NET and COM: The Complete Interoperability Guide" by Adam Nathan. As a way to work my way through the material (to better understand it), I am creating a Windows Forms project from which I can programmatically manipulate MS Word. The author recommends using the Type Library Importer (TLBIMP.EXE) to convert the Microsoft Word 10.0 Object Library (MSWORD.OLB) into an Interop Assembly. The assembly is then referenced in the Visual C++ .NET project and placed in the same directory as the compiled executable. This is surprisingly easy, and the assembly can then be inspected with the IL Disassembler (very similar to the way the OLE/COM Object Viewer works). At this point, in your source code you can instantiate COM objects very simply with the new operator. The author describes all this clearly and in great detail in his book.
Anyway, many of the method calls on the COM object involve passing structures to the COM component as a VARIANT. The Interop Marshaler apparently maps the COM VARIANT type to the .NET System::Object type, but there is no way to pass a struct via a System::Object type to the COM object. So the method call throws an exception if you attempt to pass a System::Object to it (or just doesn't compile in the first place). However, the author has included in his book a code listing which manually marshals the VARIANT type in a given example to a structure which the COM component accepts, and the function call then succeeds. It is confusing as hell, frankly, and I don't yet fully understand why it works. The basic idea is to reassemble the assembly after changing the signature of the original method so that it accepts an IntPtr instead of the VARIANT. He then writes a class that is like an Interop chimera, which converts the original VARIANT into a struct via the IRecordInfo interface (InteropServices) and a couple of functions that you've probably never heard of: GetRecordInfoFromTypeInfo (oleaut32.dll), Marshal::GetITypeInfoForType, and Marshal::GetObjectForIUnknown. (I'm leaving a lot of information out of this explanation just for brevity and clarity.)
This seems like an awful lot of trouble to go to, and potentially, you might have to write many of these custom marshalers, all different and complex.
I'm wondering if any of you have had to implement these kinds of marshalers in .NET applications, and if so, just how do they perform, and is the development process as insane as it looks?
Any opinion would be appreciated, thanks.
---
Well, apparently, no one has any interest in this subject.
I've spent some time exploring the Word.dll assembly, and as it turns out the methods referred to in the above message were a lot more easily implemented than I had originally thought. In the Word assembly (which is a huge library), in most of the methods I've encountered so far, the Type Library Importer attaches a custom attribute (the MarshalAsAttribute) to the [in] parameters of the methods (which are .NET Object&), which apparently gets type-cast to an ordinary struct (not a struct contained within a VARIANT), so there was no need to write a custom marshaler for any of the data conversions. Also, many of the parameters required for function calls on an interface or instantiated object are optional. However, the Visual C++ .NET compiler requires that an object be supplied for these optional parameters (which override default settings). Weirdly enough, this is easily accomplished by casting Type::Missing to a System::Object using the dynamic_cast operator (or the __try_cast operator, if you'd rather handle the exceptions) and supplying this as the optional parameter. Example:
Object* OptionalParameter = dynamic_cast<Object*>(Type::Missing);
Where you want to supply a structure for one of the optional parameters, it must be boxed.
There is a help file supplied that describes the various objects that can be instantiated and the interfaces that can be used to make function calls. Unfortunately, the help file is written for Visual Basic programmers, and provides only minimal guidance in writing the C++ code. It does, however, tell you how to obtain the many objects, and essentially what functions you must implement to accomplish various tasks in the Word application.
So, as it turns out, Microsoft Word can be automated fairly easily.
---
OK, I have just inherited a piece of C++ .NET code (I feel like a proud PAPA).
This C++ .NET code is in the form of assemblies that interface with another set of C# assemblies, which are in turn called from an ASP.NET project. The problem is that when the ASP.NET project is executed, I can watch the resources on this process go from 72,000K to almost 154,000K in a hurry, progressively getting "worse".
So, my first thought is a memory leak in the C++ assemblies. Here are the questions:
1. Can I use #define CRTDBG_MAP_ALLOC to debug memory leaks in DLLs?
2. Technically, according to MS, I should put
_CrtSetDbgFlag( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );
at the start of my program.
Since it is just a class library, where would the _CrtSetDbgFlag call go?
thanks
Sameer
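A side note, not from the thread: the MSVC CRT debug heap is Windows-specific, and in a DLL the _CrtSetDbgFlag call usually goes in DllMain on DLL_PROCESS_ATTACH, since a class library has no main(). As a portable illustration of the same idea (a hypothetical sketch, NOT the CRT debug heap), a library can count live allocations itself and report a nonzero count at shutdown:

```cpp
#include <cstdlib>
#include <new>

// Hypothetical leak counter: track how many allocations are still live.
static long g_liveAllocs = 0;

void *operator new(std::size_t n)
{
    ++g_liveAllocs;
    if (void *p = std::malloc(n ? n : 1)) return p;
    throw std::bad_alloc();
}

void operator delete(void *p) noexcept
{
    if (p) { --g_liveAllocs; std::free(p); }
}
```

If g_liveAllocs is nonzero when the library unloads, something leaked; the CRT debug flags give the same verdict plus allocation numbers and source locations.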
---
What is the function for converting a char* to an int? I know there should be an easy one, but I forgot.
Really, thank you
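For reference (standard C++, nothing .NET-specific): atoi is the quick answer, but strtol reports where parsing stopped, so failures can be detected. ParseInt below is a hypothetical wrapper, not a standard function:

```cpp
#include <cstdlib>

// atoi("abc") silently returns 0; strtol lets the caller check
// whether any digits were actually consumed.
int ParseInt(const char *s, bool *ok)
{
    char *end = 0;
    long v = std::strtol(s, &end, 10); // base 10
    *ok = (end != s);                  // false if no digits were parsed
    return static_cast<int>(v);
}
```

The one-liner remains atoi(s) (or std::stoi in later C++ standards) when you don't care about distinguishing "0" from a parse failure.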