|
The latter, and no.
________________
David Wulff
http://www.davidwulff.co.uk
"I loathe people who keep dogs. They are cowards who haven't got the guts to bite people themselves" - August Strindberg
|
|
|
|
|
In 'SQL Server' security the group login 'DOMAIN/Domain Users' is permitted to use my target database. On Friday (1/18) I was logged into my SQL NT server as 'DOMAIN/Administrator' and the app under development was testing well. When I returned to work on Monday (1/21) the same app, without any code changes, would not execute any SQL queries.
After testing, I discovered that logged in as '(local)/Administrator' the app would work. Logged in as '(local)/Mike' the app would work, and logged in as 'DOMAIN/Mike' the app would work. However, logged in with a domain account name other than 'Administrator' that does not exist on the SQL NT server itself, the app fails. So I know that 'SQL Server' is doing NT authentication of a sort. It just appears that between Friday and Monday it stopped using the domain controller for authentication.
I have tested to be sure that the DC still considers the SQL NT server to be part of the domain. I have shut down and restarted the SQL NT server twice.
I can run the SQL Server Enterprise Manager from other workstations logged into the domain with domain-only user names.
Any suggestions of what to try or what may have changed between Friday and Monday are appreciated.
>>>-----> MikeO
|
|
|
|
|
Hi,
I have a numeric field!
I want to set the Required property on this field; I mean the user must fill in the field!
How can I do that with SQL statements? (I need the SQL functions.)
My article this month: Game programming with DirectX, by Lan Mader.
Please visit: www.geocities.com/hadi_rezaie/index.html
Hadi Rezaie
|
|
|
|
|
I have a server that runs a configurable number of threads that process incoming requests and write them to a database. Is it advisable to use one OLE DB connection (via the CDataSource class) to send all the concurrent database updates? My question is: does one SQL query on the connection block another, if SQL Server itself does not cause locking? Or, put in different words, does the connection wait for one command to finish before starting another?
Thanks for any help ...
Thomas
|
|
|
|
|
I have a question:
- I have all rights to control the server (but I have no permission to install SQL Server on it)
- I have a database file (saved from another server that had SQL Server installed, and copied to my server)
- I must access and update the database.
--> How can I access a SQL Server database when I have no permission to install SQL Server or any DLLs on my server?
I have no solution...
If you have any ideas, please help me by email at: newcinc@hcm.fpt.vn
(Sorry if my English is not good...)
Thanks for reading.
Hai Binh (Vietnam)
http://go.to/JSLib
|
|
|
|
|
You need to update the SQL Server, or the database, because you can import the database from another database.
If you have another SQL Server running with the database that you want, you can import that database into your SQL Server...
Best Regards....
It's Friday, and Party!!!
My software never has bugs. It just develops random features.
Carlos Antollini.
Sonork ID 100.10529 cantollini
|
|
|
|
|
Without creating a DSN how do I specify that the connection should use TCP/IP?
I'm using ADO.
<reason behind my question>
I need to connect to a remote SQL Server and the only thing our firewall lets through is TCP/IP. I could just specify the IP address, but that means if we have to change it for some reason or another, the program no longer works; so I point it to a hostname (database.mycompany.com) which will always resolve to the correct IP.
My current workaround uses a DSN with the client configured to TCP/IP, changing it to another protocol breaks the application (thus I know that TCP/IP is what makes it work).
</reason>
TIA,
James
Sonork ID: 100.11138 - Hasaki
"Not be to confused with 'The VD Project'. Which would be a very bad pr0n flick. " - Michael P Butler Jan. 18, 2002
|
|
|
|
|
You could try configuring Client Network Utility on all the clients but that could get messy and I'm not totally positive it would fix your problem. You also might want to configure the Server Network Utility to only allow TCP/IP connections.
Try adding this to your ADO connection strings:
"Network=dbmssocn; Data Source=database.mycompany.com, 1433;"
1433 is the port SQL Server runs on by default. If you have SQL Server running on any other port, then specify it there. But if you know the default port is being used, I think the ", 1433" is optional in the connection string.
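A minimal sketch of building such a connection string in C++ (the host name is yours to substitute, and the SQLOLEDB provider clause is an assumption; adjust to your environment):

```cpp
#include <sstream>
#include <string>

// Builds an ADO connection string that forces the TCP/IP netlib
// (dbmssocn). 1433 is SQL Server's default port; pass another value
// if your server listens elsewhere.
std::string BuildConnString(const std::string& host, int port = 1433)
{
    std::ostringstream os;
    os << "Provider=SQLOLEDB; Network=dbmssocn; Data Source="
       << host << "," << port << ";";
    return os.str();
}
```

Because the host is a name rather than an IP, a later address change only requires updating DNS, not the program.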
This may also be of interest:
http://support.microsoft.com/default.aspx?scid=kb;EN-US;q238949
Andy Gaskell, MCSD
|
|
|
|
|
Thanks, I'll give that a shot on Monday
James
Sonork ID: 100.11138 - Hasaki
"Not be to confused with 'The VD Project'. Which would be a very bad pr0n flick. " - Michael P Butler Jan. 18, 2002
|
|
|
|
|
Hi,
I have this problem. I have created this class for ADO binding:
class Obce : public CADORecordBinding
{
BEGIN_ADO_BINDING(Obce)
ADO_VARIABLE_LENGTH_ENTRY2( 1, adVarWChar, m_szCITY, sizeof(m_szCITY), m_lCITYStatus, FALSE)
END_ADO_BINDING()
//Attributes
public:
CHAR m_szCITY[46];
ULONG m_lCITYStatus;
};
Now I have an edit control IDC_CITY and the variable m_strCity. All I need is to exchange data between these two variables.
How do I do: m_strCity -> m_szCITY and m_szCITY -> m_strCity?
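A minimal sketch of the copy in both directions, using std::string to stand in for CString (with MFC you would go through LPCTSTR one way and CString assignment the other); the buffer size matches the m_szCITY declaration:

```cpp
#include <cstring>
#include <string>

const size_t kCityLen = 46;   // matches CHAR m_szCITY[46]

// m_strCity -> m_szCITY: truncate if needed and always null-terminate
void StringToBuffer(const std::string& s, char (&buf)[kCityLen])
{
    std::strncpy(buf, s.c_str(), kCityLen - 1);
    buf[kCityLen - 1] = '\0';
}

// m_szCITY -> m_strCity: the buffer is null-terminated after binding
std::string BufferToString(const char (&buf)[kCityLen])
{
    return std::string(buf);
}
```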
Thank you for your help or suggestions.
David Pokluda (pokluda@mujweb.cz)
|
|
|
|
|
Hi,
I use an ADO Data Control 6.0 in my software.
I want to install my software with InstallShield, but I don't know whether it is possible to modify the ADO control's data source, because when I install my software it doesn't find the data path.
A solution would be to modify the data source of the ADO Data Control directly in my software, but I don't know if that is possible.
If someone can help me!
Thanks a lot.
Ludovic
|
|
|
|
|
Hi there,
My VC++ code accesses SQL Server via ADO (using the helper class CADORecordBinding). I have a column of the DBTYPE_DBTIMESTAMP data type.
How can I put a value in it? Is there a corresponding data type in C++?
|
|
|
|
|
Out of SQL Server Books Online:
timestamp is a data type that exposes automatically generated binary numbers, which are guaranteed to be unique within a database. timestamp is used typically as a mechanism for version-stamping table rows. The storage size is 8 bytes.
...
A nonnullable timestamp column is semantically equivalent to a binary(8) column. A nullable timestamp column is semantically equivalent to a varbinary(8) column.
So it does not look like you can set a value on it; it is auto-generated by the server...
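For illustration, a minimal T-SQL sketch (table and column names are made up): the timestamp column is omitted from the INSERT and the server fills it in:

```sql
CREATE TABLE Orders
(
    OrderID  int IDENTITY PRIMARY KEY,
    Amount   money,
    RowStamp timestamp          -- auto-generated binary(8); never INSERT into it
)

INSERT INTO Orders (Amount) VALUES (10.00)   -- RowStamp is set by the server
SELECT RowStamp FROM Orders WHERE OrderID = 1
```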
|
|
|
|
|
Maybe my words were not clear enough: I was looking for the corresponding data type in C++, nothing else. What's more, I've now found it, in the 'ADO Data Bound Class Wizard' article. Thanks for your help!
Best regard,
Kamp Huang
|
|
|
|
|
I'm working on a C++ program that merges and exports Excel workbook sheets to plain text files. The program runs as a service and therefore needs to be stable and have very good memory management. I've done lots of testing, and I've finally come to the conclusion that the ODBC driver (or perhaps the Excel driver) does not return all the memory that it allocates. I removed all my own memory allocation and noticed that the program still grows by approx. 10 kB of memory (on average) per ODBC connection.
I do close both the recordset and the database when I'm done with the data retrieval.
Has anyone else noticed this problem, and is there a workaround? I would be very pleased to hear any ideas on the issue.
cheers,
Martin Fridegren
|
|
|
|
|
System: Windows 2000 Professional, VC6 SP5, MDAC 2.6, SQL Server 7
When I retrieve 15,000,000 records, the memory usage rises slowly to 2 GB. How can I resolve this memory problem?
The main routine is below:
m_ptrRecordset->Open(
    bstrSQL,
    m_ptrConnection.GetInterfacePtr(),
    ADODB::adOpenForwardOnly,
    ADODB::adLockUnspecified,
    ADODB::adAsyncExecute );
//--------------------------------------------------
do
{
    while(!ptrRS->adoEOF)
    {
        lRow++;
        ptrFields = ptrRS->Fields;
        nCols = ptrFields->Count;
        for(long n = 0; n < nCols; n++)
        {
            vCol = n;
            hr = ptrFields->get_Item(vCol, &ptrField);
            VARIANT _result;
            VariantInit(&_result);
            hr = ptrField->get_Value(&_result);
            CString str((LPCTSTR)CHelpers::CrackStrVariant(_result));
            fwrite(str.GetBuffer(2048), sizeof(char), str.GetLength(), file);
            if(n < nCols - 1)
                fwrite(",", sizeof(char), 1, file);
        }
        fwrite("\n", sizeof(char), 1, file);
        hr = ptrRS->raw_MoveNext();
        if(FAILED(hr))
            break;
    }
    ptrRS = ptrOldRS->NextRecordset(&vRowsAffected);
}
while(ptrRS != NULL);
Please pardon my weak English!
|
|
|
|
|
We have an app that does a whole bunch of calcs and ends up writing upwards of 200,000 records into a SQL 7 database via direct ODBC calls (SQLExecDirect). The app takes maybe 15-25 minutes to run, which isn't so bad considering, but I want to know if it can be sped up. After profiling it, I find that 89% of the time is spent inside SQLExecDirect().
We don't use stored procs, and the records are inserted using a simple INSERT INTO blah (xxx,xxx,xxx,xxx) VALUES (xxx,xxx,xxx,xxx) statement. The number of 'parameters' for the queries is maybe 20 or 25.
My understanding of stored procs is that they compile once and execute quicker, but will this make much difference given the simple nature of the statements and the number of parameters I would be working with?
Thoughts would be appreciated.
---
"every year we invent better idiot proof systems and every year they invent better idiots ... and the linux zealots still aren't being sterilized"
|
|
|
|
|
One thing that can speed it up a lot (in my experience) is to batch multiple insert statements together. For example, append all your insert statements to a string and, when it reaches a certain length, execute it and start over again. The limit on the length of a query statement is pretty big, I think around 16K, so you can safely build strings up to at least that limit.
Someone told me that wrapping your inserts in transactions speeds things up, but from my personal experience I did not see any improvement.
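A minimal sketch of that batching idea (the executor callback is a stand-in for whatever actually runs the SQL, e.g. a single SQLExecDirect call):

```cpp
#include <cstddef>
#include <functional>
#include <string>

// Accumulates INSERT statements and hands the whole batch to an
// executor whenever adding another statement would exceed the limit.
class InsertBatcher
{
public:
    InsertBatcher(std::size_t limit, std::function<void(const std::string&)> exec)
        : limit_(limit), exec_(exec) {}

    void Add(const std::string& stmt)
    {
        if (!batch_.empty() && batch_.size() + stmt.size() + 1 > limit_)
            Flush();
        batch_ += stmt;
        batch_ += ';';
    }

    void Flush()                    // call once more when you are done
    {
        if (batch_.empty()) return;
        exec_(batch_);
        batch_.clear();
    }

private:
    std::size_t limit_;
    std::function<void(const std::string&)> exec_;
    std::string batch_;
};
```

In the real app the limit would be close to the maximum statement length, so each round trip carries as many inserts as possible.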
I hope this helps.
|
|
|
|
|
The fastest way of inserting records into a table is using BCP (bulk copy).
200,000 records takes about 3 or 4 seconds on my machine.
Another good option is DTS, if you need to do calculations on the data.
Crivo
Automated Credit Assessment
|
|
|
|
|
I agree with Daniel...
You should use the BCP utility....
Regards
Carlos Antollini.
Sonork ID 100.10529 cantollini
|
|
|
|
|
Can you distribute BCP with your application?
Besides that, what I proposed is essentially the same thing that BCP does (batch inserting), only without the hassle of writing it to a file and then using an external app to import it. What do you think?
|
|
|
|
|
Yes, you can distribute the BCP utility. But I remembered something better: the BULK INSERT command.
You can write a stored procedure that uses the BULK INSERT command. You only need to give it the path of the file containing the data, plus the field and row separators, and you're ready....
The good news is that BULK INSERT is executed on the server, so in that case all the resources used are the server's....
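A minimal T-SQL sketch of such a procedure (table name, file path, and terminators are placeholders; note the path is the one the server sees):

```sql
CREATE PROCEDURE LoadMyTable
AS
    BULK INSERT dbo.MyTable
    FROM 'C:\data\mytable.txt'      -- path on the SERVER, not the client
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n'
    )
```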
Regards....
Carlos Antollini.
Sonork ID 100.10529 cantollini
|
|
|
|
|
This indeed looks like a better choice, provided they can upload or access their file from the server. Good thinking.
|
|
|
|
|
Thanks... Always remember that the path you pass to BULK INSERT is the path as seen from the server...
Regards...;)
Carlos Antollini.
Sonork ID 100.10529 cantollini
|
|
|
|
|
given that the prog calculates the data to be inserted, would you suggest i write all the data to a file and then run a BCP to bulk insert it?
i think writing it all to a memory buffer first might be a bit resource-gobbling, no?
---
"every year we invent better idiot proof systems and every year they invent better idiots ... and the linux zealots still aren't being sterilized"
|
|
|
|