Hi,
I am seeing strange behavior with the CryptStringToBinary Cryptography API.
Please see the code below (configuration: x64 Debug):
#include "stdafx.h"
#include <windows.h>
#include <strsafe.h>
#include <iostream>
#include <exception>

int main()
{
    DWORD dwSkip = 0;
    DWORD dwFlags = 0;
    DWORD dwDataLen = 0;
    LPCWSTR pszInput = L"MyTest";

    // First call: ask the API for the required byte length.
    // Note: the character count passed here includes the terminating NUL.
    if( !CryptStringToBinary(
            pszInput,
            static_cast<DWORD>( wcslen( pszInput ) ) + 1,
            CRYPT_STRING_BASE64,
            NULL,
            &dwDataLen,
            &dwSkip,
            &dwFlags ) )
    {
        DWORD dw = GetLastError();
        throw std::exception( "Error computing byte length." );
    }

    BYTE *pbyteByte = NULL;
    try
    {
        // operator new[] throws std::bad_alloc on failure rather than
        // returning NULL, so no separate null check is needed.
        pbyteByte = new BYTE[ dwDataLen ];
    }
    catch( ... )
    {
        throw std::exception( "Out of memory." );
    }

    delete[] pbyteByte;
    return 0;
}
With some pszInput strings CryptStringToBinary returns TRUE, but with L"MyTest" as pszInput it returns FALSE with error code 0x0000000D (ERROR_INVALID_DATA). The problem seems to be related to the length of the string passed to the API: when I pass the length without the NUL terminator, the API always returns TRUE. But in that case, is the byte length returned in dwDataLen correct?
Could anybody help me understand the reason behind this behavior?
Also, in which case does the API return the correct byte length?
Thanks in advance!