|
Thank you very much.
YES, I am here.
|
|
|
|
|
That won't work at all.
Ryan
Being little and getting pushed around by big guys all my life I guess I compensate by pushing electrons and holes around. What a bully I am, but I do enjoy making subatomic particles hop at my bidding - Roger Wright (2nd April 2003, The Lounge)
Punctuality is only a virtue for those who aren't smart enough to think of good excuses for being late - John Nichol "Point Of Impact"
|
|
|
|
|
char buffer[100];
sprintf(buffer,"%f",floatvalue);
|
|
|
|
|
Thank you. But it seems that I must include the file stdio.h.
YES, I am here.
|
|
|
|
|
When I look at the answers, I see that they are completely different. The first casts the float to a char* so that you can look at the bits of how the processor encoded the float; I am not sure why you would want that.
[EDIT]
Also if you want that you should use BYTE or unsigned char instead.
[/EDIT]
The second returns a string that represents the float number. This is more likely what you want.
Which one are you looking for?
John
|
|
|
|
|
char* ConvertBSTR(BSTR text)
{
    return (char*)_T(text);
}
I've got this function, but when I call:
char* temp;
temp = ConvertBSTR(BSTRTEXT);
MessageBox(NULL, temp, "", MB_OK);
it only converts the first character. Does somebody know how to resolve this problem?
|
|
|
|
|
[edit] This solution won't work because the converted string is actually on the stack, so the returned string will be invalid. The macros will have to be used inline, not in separate functions. [/edit]
You can't use a BSTR like that. A BSTR is a Unicode string. To convert a BSTR to a char*, use the ATL conversion macros:
[edit]You have to include "atlconv.h" first[/edit]
char *ConvertBSTR(BSTR text)
{
    USES_CONVERSION;
    return W2A(text);
}
In your case, you only need a const char*, so you can do this:
const char *ConvertBSTR(BSTR text)
{
    USES_CONVERSION;
    return W2CA(text);
}
Hope this helps,
Ryan
|
|
|
|
|
No, it still doesn't work.
Any other options?
|
|
|
|
|
That won't work because the converted string is stored on the stack, and it goes out of scope as soon as the function returns. Use the macros inline, instead of in separate functions like that.
--Mike--
"So where does that leave us? Well, it leaves us right back where we started, only more confused than before." -- Matt Gullett
Ericahist | Homepage | RightClick-Encrypt | 1ClickPicGrabber
|
|
|
|
|
That's the problem: the macros won't work here, and they are too slow. I'm making a game engine, so it needs to be fast. I'm using COM, which is the reason for using a BSTR. And now I call a DirectX function that needs an LPCWSTR. So I have to convert it, but how? :S
|
|
|
|
|
You can pass a BSTR to a function expecting an LPCWSTR, because both strings are in the same format (a zero-terminated Unicode string).
--Mike--
|
|
|
|
|
But then I get an error that the format isn't right.
|
|
|
|
|
Michael Dunn wrote:
That won't work because the converted string is stored on the stack, and it goes out of scope as soon as the function returns
Aargh. Of course. I should have known that.
Ryan
|
|
|
|
|
BSTR a = SysAllocString(L"asdfghj12345");
MessageBoxW(NULL, (LPCWSTR)a, NULL, NULL);
|
|
|
|
|
Hello all,
How can I convert a double value into a string?
Can anybody help me?
Thank you very much!
sunny
|
|
|
|
|
With this function, I think:
_gcvt();
It converts a double number to a string and stores the string in a buffer.
|
|
|
|
|
Try using sprintf (or _stprintf to support Unicode as well):
char buf[16];
sprintf(buf, "%f", doubleValue);
TCHAR tbuf[16];
_stprintf(tbuf, _T("%f"), doubleValue);
If you're using MFC, you can do this:
CString str;
str.Format(_T("%f"), doubleValue);
Hope this helps,
Ryan
|
|
|
|
|
sprintf()
[EDIT]
Ryan answered the question while I had the thread open. He has a very good answer.
[/EDIT]
John
|
|
|
|
|
John M. Drescher wrote:
answered the question while I had the thread open
Don't you hate it when that happens?
Ryan
|
|
|
|
|
This was the first time I noticed it.
John
|
|
|
|
|
I don't know if this will solve your problem (you're not explaining it exactly), but be careful when you read data directly from memory and store it in the structure. If you use a memcpy, you will probably have some trouble, because the default byte alignment in the structure is 8 (take a look at the MSDN documentation about the #pragma pack directive).
That means, in fact, that all your member variables after the first are aligned on the smaller of the member's size or the 8-byte packing boundary. So, if that's not the case in memory, you will have some serious trouble when you want to "read" your data.
|
|
|
|
|
Thanks cedric,
I realized what I was doing wrong, but I didn't know about the 8-byte boundary.
sj
|
|
|
|
|
Could you publish the code?
You might consider using the MS-specific __declspec(property) and ignoring alignment. Example:
#include <stdexcept>

typedef struct TSimSignal
{
    TSimSignal(void * pData = 0) : m_pData(pData) {}

    __declspec(property(get=GetSimWriteFlag,put=PutSimWriteFlag))
    int SimWriteFlag;
    __declspec(property(get=GetDisplayReadFlag,put=PutDisplayReadFlag))
    int DisplayReadFlag;
    __declspec(property(get=GetValue,put=PutValue))
    double Value[45];
    __declspec(property(get=GetTimeStamp,put=PutTimeStamp))
    double TimeStamp;

    int GetSimWriteFlag()
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        int * pValue = reinterpret_cast<int*>(m_pData);
        return *pValue;
    }
    void PutSimWriteFlag(int value)
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        int * pValue = reinterpret_cast<int*>(m_pData);
        *pValue = value;
    }
    int GetDisplayReadFlag()
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        int * pValue = reinterpret_cast<int*>(m_pData) + 1;
        return *pValue;
    }
    void PutDisplayReadFlag(int value)
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        int * pValue = reinterpret_cast<int*>(m_pData) + 1;
        *pValue = value;
    }
    double GetValue(long index)
    {
        if(!m_pData || index < 0 || index >= 45)
            throw std::out_of_range(__FUNCSIG__);
        double * pArray = reinterpret_cast<double*>(reinterpret_cast<char*>(m_pData) + sizeof(int) * 2);
        return pArray[index];
    }
    void PutValue(long index, double value)
    {
        if(!m_pData || index < 0 || index >= 45)
            throw std::out_of_range(__FUNCSIG__);
        double * pArray = reinterpret_cast<double*>(reinterpret_cast<char*>(m_pData) + sizeof(int) * 2);
        pArray[index] = value;
    }
    double GetTimeStamp()
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        double * pValue = reinterpret_cast<double*>(reinterpret_cast<char*>(m_pData) + sizeof(int) * 2 + sizeof(double) * 45);
        return *pValue;
    }
    void PutTimeStamp(double value)
    {
        if(!m_pData)
            throw std::out_of_range(__FUNCSIG__);
        double * pValue = reinterpret_cast<double*>(reinterpret_cast<char*>(m_pData) + sizeof(int) * 2 + sizeof(double) * 45);
        *pValue = value;
    }

protected:
    void * m_pData;
} TSimSignal;
|
|
|
|
|
I've posted a couple of times before about this problem, and I seemed to be making some progress, but now I've hit a wall. This is for an on-screen keyboard used to input text into text boxes in other windows. I'm using a global WH_GETMESSAGE hook to capture messages, and when the message is a WM_LBUTTONDOWN and the window it is being sent to is the keyboard, I use AttachThreadInput and a keybd_event to simulate keyboard input. This works, but I have to call SetFocus() every time.
Since I am capturing the WM_LBUTTONDOWN message, why is the focus changing? I tried capturing the WM_LBUTTONUP message too, just ignoring it by passing 1 to CallNextHookEx (yes, I pass 1 to CallNextHookEx on the WM_LBUTTONDOWN too). I've also tried capturing messages like WM_SETFOCUS, WM_KILLFOCUS, and WM_ACTIVATE. None of those ever seem to actually be sent, or if they are, they aren't being seen by my hook... but it's global!
hook = SetWindowsHookEx(WH_GETMESSAGE, (HOOKPROC)MessageProc, hHookDll, 0);
If there are any other snippets of code I can post that might help, let me know. Thanks!
|
|
|
|
|
Perhaps you need to go deeper and earlier in the message sequence. Have a look at WM_NCHITTEST and WM_MOUSEACTIVATE (returning MA_NOACTIVATE from WM_MOUSEACTIVATE stops a click from changing the focus). Good luck!
onwards and upwards...
|
|
|
|