|
nguyenbinh07 wrote: should be
Should be? YOU are responsible for setting the font if you want something different.
Mark Salsbery
Microsoft MVP - Visual C++
|
Hi all,
I have a problem in an MFC SDI application. I want to show a flash window before the SDI frame window is created.
Please help: how can I do this?
Tanmay
|
You mean a splash screen? See here[^].
|
Have you searched CodeProject for articles on "Splash screen"?
Many are stubborn in pursuit of the path they have chosen, few in pursuit of the goal - Friedrich Nietzsche
.·´¯`·->Rajesh<-·´¯`·.
[Microsoft MVP - Visual C++]
|
Hi,
I have .inf and .sys files in my "C:\Program Files\Inno Setup 5" directory. I want to write code to copy these files into "C:\WINDOWS\system32", but I don't know how to implement this. Please help me.
Regards,
Anitha
|
Did you try asking this question in any of the InnoSetup forums? You may get more help over there instead of trying to reframe the question and posting it here time and again.
Somethings seem HARD to do, until we know how to do them.
_AnShUmAn_
|
Hello all,
I wrote a small program in Visual Studio to print the one's complement of zero, and got the result -1. I am surprised by this, because what I expected was ~0 = 11111111!
Why is it -1?
Thank you
|
spicy_kid2000 wrote: ~0 = 11111111
Yes, in binary. You are probably storing the result in a signed variable (e.g. an int or a char). For a signed variable, if all bits are set, the resulting value is -1 in the two's complement representation used on virtually all platforms.
|
Hello, thank you.
In VC++, how can we print the right value?
|
And what is, in your opinion, the right value?
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
In my opinion, if it is an unsigned int, it should be 2^32 - 1.
|
Then store the result in an unsigned int and display it. What's your code?
|
Here is my code:
int main(int argc, char* argv[])
{
    unsigned int i = 0;
    printf("%d", ~i);
    return 0;
}
|
Use %u instead of %d. %d is for signed integers and %u is for unsigned integers.
|
The flaw is in the %d format specifier. According to the documentation[^], %d specifies a 'signed decimal integer', hence your unsigned number is interpreted as signed and then printed out.
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
spicy_kid2000 wrote: if it a unsigned int, it should be 2 ^32 -1
Good. The following snippet may confirm your opinion:
#include <stdio.h>

int main()
{
    unsigned int u = 0;
    u = ~u;
    printf("%u (%x)\n", u, u);
    return 0;
}
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
Code:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    char myVar = 0xFF;
    printf("Signed: %d\n", myVar);
    printf("Unsigned: %d\n", (unsigned char)myVar);
    system("Pause");
    return 0;
}
Output:
Signed: -1
Unsigned: 255
Press any key to continue . . .
|
Hi all,
I want to change a date format from dd/mm/yyyy to yyyy/mm/dd.
Suppose I am getting the date as Date = 12/04/2008; I want it to be changed to 2008/04/12.
I am trying with the code below, but I am getting the system time instead.
The code is:
CString Date, Datee;
Date = "12/05/2008";
SYSTEMTIME sysTime;
GetSystemTime(&sysTime);
Datee.Format("%04d%02d%02d", sysTime.wYear, sysTime.wMonth, sysTime.wDay);
Can anyone help me?
manju
Hi, I am Manju. I have completed my B.E. in Computer Science and am looking for a job. I am interested in VC++.
manju
|
manju#123 wrote: I want to change a date format from dd/mm/yyyy to yyyy/mm/dd..
It would help a lot if you would tell us how this date is stored: in a plain string, a COleDateTime object, ...?
Don't you have the different elements of the date (day, month and year) separately available?
|
manju#123 wrote: SYSTEMTIME sysTime;
GetSystemTime(&sysTime);
Gets the current system time, in UTC.
manju#123 wrote: I am trying with the below code but i am getting the system time..
That's because you are using the system time and just formatting it before displaying it.
manju#123 wrote: Datee.Format("%04d%02d%02d",sysTime.wYear,sysTime.wMonth,sysTime.wDay);
Did you try to print the Datee string after it is formatted? It should then be in the format you need.
Somethings seem HARD to do, until we know how to do them.
_AnShUmAn_
|
Try using GetLocalTime...
|
If you are talking about the Date variable, then you can get the desired format using:
CString Date = "12/05/2008";
CString YY, MM, DD;
int nLen;
nLen = Date.ReverseFind('/');
YY = Date.Right(nLen - 1);
Date = Date.Left(nLen);
nLen = Date.ReverseFind('/');
MM = Date.Right(nLen);
Date = Date.Left(nLen);
DD = Date;
CString sNewDateFormat;
sNewDateFormat.Format("%s/%s/%s", YY, MM, DD);
And if you are talking about the variable Datee, then the code you have written is fine.
|
manju#123 wrote: I am trying with the below code but i am getting the system time..
No surprise there since you are explicitly using sysTime members.
manju#123 wrote: can anyone help me
Yes. To get your yyyy/mm/dd format, use:
Datee.Format("%04d/%02d/%02d", sysTime.wYear, sysTime.wMonth, sysTime.wDay);
Now if you actually want to use the date in the Date variable, try:
CString Date = "12/05/2008";
COleDateTime dt;
dt.ParseDateTime(Date);
CString Datee = dt.Format("%Y/%m/%d");
"Love people and use things, not love things and use people." - Unknown
"The brick walls are there for a reason...to stop the people who don't want it badly enough." - Randy Pausch
|