|
Clearly that depends on the encoding of the date in the DWORD, which you haven't mentioned.
Steve
|
Nikesh Jagtap wrote: I have a DWORD and I want to convert it into a time format.
Assuming that you mean your DWORD is actually a time_t value then you can use one of the Time Management Routines[^].
It's time for a new signature.
|
This way:
char buf[0x20];
DWORD dw = 0x41107ECE;
time_t t = dw;
struct tm *ptime = gmtime(&t);
strftime(buf, sizeof(buf), "%Y/%m/%d %a %H:%M:%S UTC\n", ptime);
printf("%s", buf); /* never pass the buffer itself as the format string */
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
Hi,
CString sz;
sz.LoadString( HINSTANCE hin, UINT id );
How can I achieve the same thing using std::string instead of CString?
Thanks.
|
There's no such functionality in the standard library for std::string, but of course you could convert the CString to a std::string. If you are asking how to read a string from a file, look up the ifstream class.
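In case the file-reading reading was what was meant, a minimal sketch of the ifstream suggestion (the helper name and file name are purely illustrative):

```cpp
#include <fstream>
#include <iterator>
#include <string>

// Slurp an entire file into a std::string via an ifstream.
std::string read_file(const std::string& path)
{
    std::ifstream in(path.c_str(), std::ios::binary);
    return std::string(std::istreambuf_iterator<char>(in),
                       std::istreambuf_iterator<char>());
}
```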
Life is a stage and we are all actors!
|
You have to do it yourself. LoadString[^] is a Windows-specific function (wrapped by CString), and std::string is standard C++, so it can't be expected to have any OS-specific functions.
Steve
|
Hi,
First you need to call FindResource[^] to find the resource string, then you can call SizeofResource[^] to get the size of the string so you can resize your std::string, and finally call LoadString[^] to copy it.
Best Wishes,
-David Delaune
|
Hi,
Not sure who gave me a one vote... but the solution I presented is a secure and safe method for loading a std::string from a resource.
The method presented by Aescleal will fail for a string longer than the 4096-character constant. The wstring solution presented by Stephen Hewitt just flat out doesn't work for std::string.
Best Wishes,
-David Delaune
|
It does for a wstring (which was all it was intended to do; it wasn't a reply to the OP but to Aescleal).
It's documented. MSDN (on LoadString):
nBufferMax [in]
int
The size of the buffer, in characters. The string is truncated and null-terminated if it is longer than the number of characters specified. If this parameter is 0, then lpBuffer receives a read-only pointer to the resource itself.
Secondly, I tried it (after reading the documentation thoroughly first) and verified that it did work.
Finally, your technique will not work. LoadResource returns nothing that LoadString can consume.
Steve
|
Stephen Hewitt wrote: Finally, your technique will not work. LoadResource returns nothing that LoadString can consume.
Hi Stephen,
You may be correct. I was thinking that you could just do something like this:
HRSRC hres = FindResource(NULL, MAKEINTRESOURCE(IDS_SOMESTRING), RT_STRING);
DWORD dwSizeRes = SizeofResource(NULL,hres);
char *p = NULL;
int result = LoadString((HINSTANCE)&__ImageBase,AFX_IDS_APP_TITLE,p,0);
std::string s(p,dwSizeRes);
Which is not much different than what you posted.
With the method that you posted...I am thinking that the std::string would need a NULL terminator in the string resource. I admit that I have not tried either method (including the one I just typed above). I will test both methods later when I get some free time.
Best Wishes,
-David Delaune
|
Hey Steve
I just spent 30 minutes testing both of our code samples. I could not get the code you wrote working for std::string, probably because the resource strings are actually stored in Unicode within the PE files. However, for wstrings your code is absolutely perfect. I suspect that the brilliant code you posted could actually be modified to utilize wcstombs and work for both ANSI and Unicode builds.
As for my code sample... I had even less success, although I was eventually able to get it working. I had to re-read some of the MSDN docs and ended up using FindResourceEx with the language identifier. To make things worse, the address I was receiving was the start of the string resource block. I had to load the string block into an HGLOBAL and walk the string table before I could even read the string.
Anyway I just wanted to let you know that you were correct.
Best Wishes,
-David Delaune
|
I didn't give you a 1 vote, but the method I outlined will work for strings greater than 4096 characters; it just won't load the whole string. If you're really worried about that, you have to use the OS/2 vintage "try it once to fail, once to succeed" method of sizing buffers.
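A portable sketch of that grow-and-retry sizing loop. The stand-in loader below mimics LoadString-like semantics (truncate, NUL-terminate, return the count copied); in real code the call inside the loop would be LoadString, and all names here are illustrative:

```cpp
#include <cstring>
#include <string>
#include <vector>

// Pretend resource: 200 characters, longer than the initial buffer.
static const std::string g_resource(200, 'x');

// Stand-in for LoadString: copies at most buf_size - 1 characters,
// truncates, NUL-terminates, and returns the number copied.
int mock_load_string(char* buf, int buf_size)
{
    if (buf_size <= 0)
        return 0;
    int n = (int)g_resource.size();
    if (n > buf_size - 1)
        n = buf_size - 1;                      // truncate, like LoadString
    std::memcpy(buf, g_resource.data(), n);
    buf[n] = '\0';
    return n;
}

// If the call filled the buffer to capacity, assume truncation,
// double the buffer and try again.
std::string load_whole_string()
{
    std::vector<char> buf(64);
    for (;;)
    {
        int copied = mock_load_string(&buf[0], (int)buf.size());
        if (copied < (int)buf.size() - 1)
            return std::string(&buf[0], copied);
        buf.resize(buf.size() * 2);
    }
}
```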
And it'd be fairly easy to change Stephen's method to work for std::string. Depending on how much you know about the resource and the characters in it, it's either trivial (use the two-iterator string constructor) or slightly harder (use wcstombs).
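The two conversions just mentioned could be sketched like this (helper names are illustrative, and the ASCII-only shortcut is safe only when that assumption actually holds):

```cpp
#include <cstdlib>
#include <string>

// Trivial case: the text is known to be plain ASCII, so truncating each
// wchar_t via the two-iterator constructor loses nothing.
std::string narrow_ascii(const std::wstring& ws)
{
    return std::string(ws.begin(), ws.end());
}

// General case: convert through the current locale with wcstombs.
std::string narrow_locale(const std::wstring& ws)
{
    if (ws.empty())
        return std::string();
    std::string out(ws.size() * MB_CUR_MAX, '\0');
    std::size_t n = std::wcstombs(&out[0], ws.c_str(), out.size());
    if (n == (std::size_t)-1)
        return std::string();  // hit an unconvertible character
    out.resize(n);             // trim to the converted length
    return out;
}
```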
Ash
|
Load it into a character array first, then assign the result to the std::string.
std::string load_string_from_resource( HINSTANCE app_instance, unsigned string_ID )
{
    char buffer[ 4096 ];
    int chars_copied = LoadString( app_instance, string_ID, buffer, sizeof( buffer ) );
    if( chars_copied <= 0 )
        throw std::runtime_error( "Resource not found!" );
    return std::string( buffer, chars_copied );
}
(Code not tested, Errors and Omissions Excluded. Note LoadString returns a character count, not bytes.)
Cheers,
Ash
|
If you're using Unicode it can be done more efficiently than that:
std::wstring LoadString(HINSTANCE hInstance, UINT uID)
{
    LPCWSTR pString;
    int res = LoadStringW(hInstance, uID, (LPWSTR)(&pString), 0);
    if (res == 0)
        return std::wstring();
    return std::wstring(pString, res);
}
MSDN on LoadString[^]:
If this parameter is 0, then lpBuffer receives a read-only pointer to the resource itself.
NOTE: Raw string resources are not NULL terminated.
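That last point is why the (pointer, length) form of the constructor is used above: it copies exactly the given number of characters and never looks for a terminator. A small portable illustration (names are mine, not from the post):

```cpp
#include <cstddef>
#include <string>

// Build a wstring from a buffer that is NOT NUL-terminated, which is
// exactly the shape of a raw string resource.
std::wstring from_raw(const wchar_t* p, std::size_t len)
{
    return std::wstring(p, len);  // copies exactly len characters
}
```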
Steve
|
Cool, I'd either forgotten about or never known the behaviour of LoadString when you pass zero as the buffer size.
What's the reason this wouldn't work on ANSI as well?
|
I'm not positive, but it doesn't seem to. Note that the documentation says it returns a pointer to the raw resource, and string resources are always Unicode, so perhaps this is the reason. In fact, in the ANSI case 0xFFFFFFFF was returned to indicate an error (when 0 is passed as the buffer size), which isn't mentioned in the documentation.
Steve
|
Thanks for that, I'll stick that on the list as the first new thing I've learnt today.
Ash
|
Very nice and useful. Thanks.
|
gmallax wrote: Very nice and useful
Then please upvote. Thanks, AR
When the wise (person) points at the moon the fool looks at the finger (Chinese proverb)
|
Hi,
Here we declare a char buffer as char buffer[ 1024 ] = { '\0' };
How do we release this?
|
It is released at the next }:
ReturnType SomeClass::SomeFunction(SomeParams)
{
    char buffer[1024] = {0};
    return ReturnType::SomeValue;
}
cheers,
AR
When the wise (person) points at the moon the fool looks at the finger (Chinese proverb)
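For contrast, a small portable sketch (all names illustrative) of when each kind of buffer is released:

```cpp
#include <string>
#include <vector>

// The automatic array needs no explicit release; it is reclaimed when
// control leaves its enclosing braces. Only the raw new[] needs delete[].
std::string storage_demo()
{
    char automatic_buf[1024] = { '\0' };        // released at the closing }
    std::vector<char> managed_buf(1024, '\0');  // heap, freed by its destructor

    char* manual_buf = new char[1024];          // heap, must be delete[]d
    manual_buf[0] = '\0';
    std::string result(automatic_buf);          // empty: buffer starts zeroed
    delete[] manual_buf;                        // the only explicit release here
    return result;
}
```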
|
Hi,
Can you please help me solve this error? I'm getting the following:
"fatal error C1001: INTERNAL COMPILER ERROR
(compiler file 'msc1.cpp', line 1786)"
I'm using VC6.0, and in my project settings there are no options like /Og or /Oa, but I'm still getting the error.
Can anyone suggest a solution?
Thanks in advance...
Archana
|
I had a few of those. I think I rearranged some of the code to get it to work, but it was a long time ago. Use the #error directive at the top of the source file that fails, move it downwards and see where it breaks. Then try rearranging your code at that point.
|