|
I have made a program to accept user input and use that input to execvp another program. The problem is that execvp has a prototype of int execvp(const char *file, char *const argv[]); and the user's input is a C++-style string (string input, for instance). How can I convert that C++-style string into something that execvp can use? The c_str() function DOES NOT work.
|
|
|
|
|
Which argument are you having trouble with? What format is the string you're getting as input? c_str() does indeed return a const char *, if you've got a std::string. argv is an array of char *, not just a char *.
What do you mean by 'it doesn't work'? A more meaningful error report would help those trying to help you.
Christian Graus - Microsoft MVP - C++
|
|
|
|
|
Our company is in need of a software-based application that can compress our data on Windows 2003 servers. The software will run as a service on our servers and will compress data as it is transferred from one system to the next. It will need to be highly reliable and stable. We're looking to compress files as they are transferred to our file server. The files will need to have the option of being uncompressed or compressed when transferred from one system to the next. We're looking for the maximum compression possible. The goal is to build a Linux-based file storage system. This system will be designed for storing files at a compression rate of 75:1. File shares will be accessible via URL (\\share01\data). Data will be compressed as files are transferred to individual folders within each share. An important key factor is that file permissions remain unchanged within the share. Users will have the option to move the data either compressed or uncompressed out of the system. If data is moved off of the system compressed, it will be traveling to a similar device with the same compression software. Speed of the data transfer is very important.
Any help?
Paul
|
|
|
|
|
Which bit do you need help with?
ANPPS wrote: This system will be designed for storing files at a compression rate of 75:1.
Where did this figure come from? Did you just pull it out of the air? What sort of files will they be? Compression of jpg images, for example, is next to impossible, because they are already highly compressed. In fact, I just tested, and a 1.2 MB bmp is a 392 KB zip, or a 121 KB jpg (jpg is, of course, a lossy compression, so this doesn't mean you should expect a 10:1 compression ratio to be achievable for programs).
I'd say that you need to research compression algorithms, find out what is possible, and then implement it from there. The rest of the system sounds pretty straightforward to me.
Christian Graus - Microsoft MVP - C++
|
|
|
|
|
Hi again, fellows
I have this function in my code:
void ImprimeTexto(unsigned int iFont, const char* cValue)
{
    if (iFont == 0) return;
    glEnable(GL_BLEND);
    glPushAttrib(GL_LIST_BIT);
    glListBase(iFont - 32);
    glCallLists(strlen(cValue), GL_UNSIGNED_BYTE, cValue);
    glPopAttrib();
    glDisable(GL_BLEND);
}
Because of my const char* cValue variable I have a memory leak of 4 KB.
I need to pass a variable of type const char* because glCallLists (an OpenGL function) requires it; the last parameter is a pointer to GLvoid.
Here is the function signature:
GLAPI void APIENTRY glCallLists( GLsizei n, GLenum type, const GLvoid *lists );
I've tried to pass a std::string but the function doesn't work.
Guys, can you help me with this?
Thanks a lot
|
|
|
|
|
Pass a std::string, and then use the c_str() function to get a const char * to pass into the OpenGL function.
Christian Graus - Microsoft MVP - C++
|
|
|
|
|
I suppose that you dynamically allocate the list array before calling ImprimeTexto, and then you forget to deallocate it.
|
|
|
|
|
Some questions on Win32/.Net interaction DLLs:
When compiling a managed DLL that uses Win32 functions, what character set should be selected, or does it always become Unicode due to /clr?
If one creates a native DLL that is to be used with [DllImport...] from .NET, what character set should be used there? I've seen them set to compile with MBCS. Is there a point in this? I know if it was set to Unicode it wouldn't work on W98, but does MBCS enable it to receive "exotic" Unicode glyphs (they translate) from a .NET client app, or might it just as well be set to "Not Set"?
|
|
|
|
|
|
- who are you?
- what is the question?
- why do you shout?
- what is a popup dictionary?
- do you know you're worth a '1'?
TOXCCT >>> GEII power [toxcct][VisualCalc]
|
|
|
|
|
Probably one of the most useless topics I've ever seen. I think I know what you want to do, but it's really complicated. And if you don't give more information, no one will help you.
|
|
|
|
|
Moooommmmmm, I'm seeing Anonymous everywhere now!!!
Hmmm, this one seems to be nicer, however.
TOXCCT >>> GEII power [toxcct][VisualCalc]
|
|
|
|
|
The second "Anonymous" guy with no name was me. Sorry, a problem with my name and HTML.
|
|
|
|
|
|
toxcct wrote: cynferdd wrote:
"Anonymous" guy with no name
isn't that the same?!
I meant "with no name written"
lol
|
|
|
|
|
What he wants to do is probably the same as I do: getting members a la IntelliSense...
|
|
|
|
|
Maybe you could ask the Visual Assist team...
I doubt you'll get an answer, but why not try?!
TOXCCT >>> GEII power [toxcct][VisualCalc]
|
|
|
|
|
Calling EgWin::Calibrate works only if I do some stuff, like writing nothing, before I call another function in an unmanaged DLL. Why does it not work if I take out this stupid Console::Write("")??
<code>
class EgWin
{
public:
    EgWin() {}
    ~EgWin() {}
    int Init() { return EgInit(&stEgControl); }
    void Calibrate(IntPtr hwnd) { Console::Write(""); EgCalibrate(hwnd); }
};

public __gc class EgWinProxy
{
public:
    EgWinProxy() {}
    ~EgWinProxy() {}
    int _EgInit() { return mpC->Init(); }
    void _EgCalibrate(IntPtr hwnd) { mpC->Calibrate(hwnd); }
private:
    EgWin * mpC;
};
</code>
|
|
|
|
|
How is EgWin able to see methods on EgWinProxy at all?
Christian Graus - Microsoft MVP - C++
|
|
|
|
|
The call goes from EgWinProxy to EgWin, and from EgWin to an unmanaged DLL which is not visible in this snippet. EgWin cannot see EgWinProxy, but EgWinProxy keeps a pointer to EgWin.
Hope you have an idea...
|
|
|
|
|
Hi fellows
I've discovered direct.h recently and I would like to know where I can find more information about this header file. Can someone pass this info to me? Tutorials and links are very welcome.
Thanks for the help
|
|
|
|
|
Despite the name, direct.h isn't a DirectX header; it's the C run-time header that declares the directory control functions (_chdir, _getcwd, _mkdir, _rmdir), and MSDN's run-time library reference covers them. If it's DirectX you're after, Google for the DirectX SDK; those headers are worthless without the rest of the library.
Christian Graus - Microsoft MVP - C++
|
|
|
|
|
Hello,
I found really unexpected behaviour in calls from C++ Managed Extensions to native code. The following code shows what I wanted to do:
#pragma unmanaged
class UnmanagedClass
{
public:
    virtual bool unmanagedFunction() const
    {
        return false;
    }
};

#pragma managed
void managedFunction()
{
    UnmanagedClass* object = new UnmanagedClass();
    bool result = object->unmanagedFunction();
}
The question is: what would be the value of "result" after the virtual function call? Well, believe me or not, but it's TRUE!
I debugged it in the disassembly window, and inside the native code the value of the EAX register (AL, actually) is correctly set to FALSE (xor al, al).
When the function returns, my debugger shows that the EAX value has changed into TRUE! There must be some kind of managed wrapper between the Managed Extensions code and the native code.
Has anybody any idea what is going on here?
And how do I solve the problem?
|
|
|
|
|
I got the same result you got. Don't know why.
Anyway, I normally use #pragma managed and unmanaged to control function compilation only, and for the class I use either __nogc or __gc to tell it whether I want the class managed or unmanaged.
I tried the modified code below and it works. Hope this helps.
__nogc class UnmanagedClass
{
public:
    virtual bool unmanagedFunction() const
    {
        return false;
    }
};

#pragma managed
void managedFunction()
{
    UnmanagedClass* object = new UnmanagedClass();
    bool result = object->unmanagedFunction();
}
|
|
|
|
|
I believe it's been mentioned somewhere that there is a bug in the compiler, or perhaps the runtime, that gets confused with bool when calling unmanaged code from managed code. I believe one workaround is to use BOOL instead of bool. I think what happens is that the size differs between the two worlds (8-bit in unmanaged, but 32-bit in managed) and the translation doesn't account for it. I could be wrong, of course.
--
Joel Lucsy
|
|
|
|