|
Yes, that's what I'm going to do: reduce the last column by 2-4 pixels... but that's a kind of 'hack' ;) The goal is to create this list view without the scrollbar at the bottom, not by using LVS_NOSCROLL but by resizing the columns. Of course, the scrollbar should appear if I shrink the whole dialog window or change a column width with the mouse...
Thanks for help anyway
|
|
|
|
|
Use LVSCW_AUTOSIZE_USEHEADER for the cx param when you set the size of the LAST column.
|
|
|
|
|
Hi
I have a toolbar with its own background (bitmap) on a ReBar object. This ReBar also has its own background (the same bitmap). And again, the ReBar is on a dialog window with the same background. All three objects use the same background (CBitmap object) and handle the ON_WM_ERASEBKGND message to paint themselves.
The problem is, when I place the manifest file next to the application exe, the toolbar loses its background! The ReBar and dialog window remain untouched; only the toolbar becomes plain (no texture on it). What's more, this 'bug' doesn't depend on Windows themes: when I turn the XP themes on/off, the toolbar remains plain... Removing the manifest file helps in both cases, but then I can't use XP-style controls. Do you have any idea what could be wrong?
If it helps, for the toolbar I use:
m_Toolbar.CreateEx(this, WS_CHILD | TBSTYLE_FLAT | WS_VISIBLE | CBRS_TOP | CBRS_SIZE_FIXED | TBSTYLE_CUSTOMERASE);
and for the ReBar:
m_ReBar.Create(this);
m_ReBar.AddBar(&m_Toolbar, NULL, NULL, RBBS_USECHEVRON | RBBS_GRIPPERALWAYS);
....
rbbi.fMask = RBBIM_CHILDSIZE | RBBIM_IDEALSIZE | RBBIM_SIZE | RBBIM_ID; //Bar info flags
Thanks in advance!
|
|
|
|
|
Why does an occasional sleep in your thread cause your cpu usage to drop?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Peter Weyzen
Staff Engineer
SoonR Inc. - http://www.soonr.com
|
|
|
|
|
Because it's not doing anything while it's "sleeping".
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
|
|
|
|
|
But doesn't a Sleep(0) help too, in a situation where a tight loop in a thread hogs all the CPU? Yielding the CPU seems to help...
Is that true?
And why?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Peter Weyzen
Staff Engineer
SoonR Inc. - http://www.soonr.com
|
|
|
|
|
Sleep(0) gives up the remainder of the thread's timeslice, allowing another thread to run
immediately so, yes, it helps.
Check out Scheduling Priorities[^]
Mark
|
|
|
|
|
But it doesn't sleep 0 millisecs. It actually sleeps about 25 millisecs (and this value is dependent on your hardware).
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
|
|
|
|
|
John Simmons / outlaw programmer wrote: But it doesn't sleep 0 millisecs.
I never said it did
|
|
|
|
|
I want to broadcast a message to multiple instances of an application!
I have CEdit m_edit1;
CButton IDC_SEND //// m_send;
////////////////////////////////
CString sText;
m_edit1.GetWindowText(sText);
I want to display the message from m_edit1 in the m_edit1 of every instance of the application when I press the m_send button.
How can I do that?
Bravoone
|
|
|
|
|
If the strings aren't too long, you can turn your string into an "ATOM".
GlobalAddAtom stores the string and gives you a handle to it. GlobalGetAtomName does the reverse.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Peter Weyzen
Staff Engineer
SoonR Inc. - http://www.soonr.com
|
|
|
|
|
I have a CDialog-based application and I want to transfer data from one dialog to another. The base dialog is the same, so it's something like broadcasting a message to multiple instances of an application:
void CTestFindWindowDlg::OnButton()
{
    // Send the message in broadcast mode
    BOOL a = ::SendNotifyMessage(HWND_BROADCAST, WM_MYMESSAGE, 0, lpString);
}
void CTestFindWindowDlg::OnTestMyMessage(WPARAM wParam, LPARAM lParam)
{
    int a = (int)lParam;
    CString strValue;
    strValue.Format("%d", a);
    AfxMessageBox("broadcast: " + strValue);
}
Now all I want is to display the text in CEdit m_edit1, let's say... how can I do that?
And the text must come from m_edit1!
Bravoone
|
|
|
|
|
I'm not sure what exactly you're trying to do, but for information on sending messages, here is an excellent article:
http://www.codeproject.com/dialog/messagemgmt.asp[^]
I'd love to help, but unfortunately I have prior commitments monitoring the length of my grass. :Andrew Bleakley:
|
|
|
|
|
Bravoone_2006 wrote: i want to Broadcast a message to multiple instances of an application !
Use WM_COPYDATA .
"Approved Workmen Are Not Ashamed" - 2 Timothy 2:15
"Judge not by the eye but by the heart." - Native American Proverb
|
|
|
|
|
Hello,
I am trying to encrypt some data which contains a structure, and the encrypted data looks like the following:
ËO.ÛßÞ¸ëô'¾óÉïD»åÏë3¸·««««««««"
The code to encrypt the data is the following:
dwBufferLen = dwCount = ((STRUCT_BUF)pBufPtr)->ulSize;
if(!CryptEncrypt(hKey, 0, TRUE, 0, pBufPtr, &dwCount, dwBufferLen))
{
    if(GetLastError() == ERROR_MORE_DATA)
    {
        //MessageBox("error","",MB_OK);
    }
}
Now I want to convert this encrypted data to a string so that it can be displayed on the screen for the user to type in.
However, when I call CryptBinaryToString with the data passed in, it returns a string with alphanumeric characters but also other garbage characters. The string looks like the following:
y08u29/euOv0vWzx9Oo
DWORD len;
if(!CryptBinaryToString(pBufPtr, dwCount, 1, NULL, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
    MessageBox("error","",MB_OK);
}
char *msg2 = (char*)malloc(len);
*msg2 = '\0';
if(!CryptBinaryToString((BYTE *)pBufPtr, dwCount, 1, msg2, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
}
Instead of 1 as the third argument above, I tried passing in 4 and 12, but I still get the same result. So how do I get rid of the garbage characters? Please help.
Thanks.
vg
|
|
|
|
|
vgandhi wrote: Instead of 1 as the third argument above I tried passing in 4 and 12 but still get the same result
Where are you getting those numbers from?
The third argument should be one of the following, correct?
CRYPT_STRING_BASE64HEADER
CRYPT_STRING_BASE64
CRYPT_STRING_BINARY
CRYPT_STRING_BASE64REQUESTHEADER
CRYPT_STRING_HEX
CRYPT_STRING_HEXASCII
CRYPT_STRING_BASE64_ANY
CRYPT_STRING_ANY
CRYPT_STRING_HEX_ANY
CRYPT_STRING_BASE64X509CRLHEADER
CRYPT_STRING_HEXADDR
CRYPT_STRING_HEXASCIIADDR
I suppose CRYPT_STRING_HEXASCII is the human-readable one.
Mark
|
|
|
|
|
Basically, 1 stands for CRYPT_STRING_BASE64 and 4 stands for CRYPT_STRING_HEX.
CRYPT_STRING_HEXASCII doesn't help in this case either.
Thanks
vg
|
|
|
|
|
This works for me with no garbage chars...
BYTE BinaryBytes[256];
for (int i = 0; i < 256; ++i)
    BinaryBytes[i] = (BYTE)i;
DWORD dwStrLen = 0;
::CryptBinaryToString(BinaryBytes, 256, CRYPT_STRING_HEXASCII, NULL, &dwStrLen);
LPTSTR pszBuf = new TCHAR[dwStrLen];
::CryptBinaryToString(BinaryBytes, 256, CRYPT_STRING_HEXASCII, pszBuf, &dwStrLen);
delete[] pszBuf;
|
|
|
|
|
Dear Mark,
Yes, this worked for you, but if you look at the string value in this case, how will you display it to the user? It is not in any displayable format, whereas I want to display the data on the screen for the user to either type in or read aloud. Thanks.
vg
|
|
|
|
|
What are you expecting to see?
The source is bytes, 0 to 255. How do you want to represent them on the screen?
You can always loop through the source bytes and convert them to whatever format you want.
Mark
|
|
|
|
|
Why are you calling that garbage? You're passing the flag 1, which is CRYPT_STRING_BASE64 , and you're getting back a base 64 encoded version of the data.
|
|
|
|
|
Dear Mike,
The string that I get from CryptBinaryToString has a square-shaped ASCII character in it, and that gets displayed to the user that way. So if the user were to type that string, there is no way he/she could do it. I am getting a base 64 encoded version of the data, but I'm just wondering why it always includes those funny characters at the end, and whether there is a way to avoid them.
Thanks.
vg
|
|
|
|
|
This code works for me:
int main()
{
    LPCSTR data = "Some encrypted data goes here...";
    LPTSTR pszBase64 = NULL;
    DWORD cchString = 0;
    CryptBinaryToString((const BYTE*)data, strlen(data), CRYPT_STRING_BASE64, NULL, &cchString);
    pszBase64 = (LPTSTR)_alloca(cchString * sizeof(TCHAR));
    CryptBinaryToString((const BYTE*)data, strlen(data), CRYPT_STRING_BASE64, pszBase64, &cchString);
    wcout << pszBase64 << endl;
    return 0;
}
I get the output:
U29tZSBlbmNyeXB0ZWQgZGF0YSBnb2VzIGhlcmUuLi4=
|
|
|
|
|
Yes, but you're using a string instead of a byte buffer. So try this out:
typedef struct chal_buffer
{
    DWORD ulSize; // size of the entire structure
    DWORD Num1;
} CHALL_BUF, *PCHALL_BUF;
PBYTE pBufPtr = (PBYTE)malloc(sizeof(CHALL_BUF));
if(!pBufPtr)
    return;
((PCHALL_BUF)pBufPtr)->ulSize = sizeof(CHALL_BUF); // in bytes
((PCHALL_BUF)pBufPtr)->Num1 = 1233456;
if(!CryptAcquireContext(&hProv, NULL, MS_DEF_PROV, PROV_RSA_FULL, 0))
{
    if(GetLastError() != NTE_BAD_KEYSET)
    {
    }
    if(!CryptAcquireContext(&hProv, NULL, MS_DEF_PROV, PROV_RSA_FULL, CRYPT_NEWKEYSET))
    {
    }
}
if(!CryptCreateHash(hProv, CALG_MD5, 0, 0, &hHash))
{
    DWORD dwRet = GetLastError();
}
// Derive a session key from the hash object.
if(!CryptDeriveKey(hProv, CALG_RC4, hHash, 0, &hKey))
{
    DWORD dwRet = GetLastError();
}
// Destroy the hash object.
CryptDestroyHash(hHash);
hHash = 0;
dwBufferLen = dwCount = ((PCHALL_BUF)pBufPtr)->ulSize;
if(!CryptEncrypt(hKey, 0, TRUE, 0, pBufPtr, &dwCount, dwBufferLen))
{
    if(GetLastError() == ERROR_MORE_DATA)
    {
        //MessageBox("error","",MB_OK);
    }
}
DWORD len = 0;
if(!CryptBinaryToString(pBufPtr, dwCount, CRYPT_STRING_BASE64, NULL, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
    MessageBox("error","",MB_OK);
}
LPTSTR pszBase64 = (LPTSTR)_alloca(len * sizeof(TCHAR));
if(!CryptBinaryToString((BYTE *)pBufPtr, dwCount, CRYPT_STRING_BASE64, pszBase64, &len))
{
    DWORD dwRet;
    dwRet = GetLastError();
}
vg
|
|
|
|
|
What are you expecting to see? CryptBinaryToString() only understands BYTEs as input so whether
you input a char string or a byte buffer is irrelevant.
A byte has 256 values. That means that somehow CryptBinaryToString() must be able to represent
256 values with a limited set of ASCII characters.
Again, what do you want to see? Maybe that API isn't appropriate for your needs.
Mark
|
|
|
|
|