|
davers wrote: I believe they prohibit bulk inserts here
Does that mean you can't prepare a DataTable and send the lot via a DataAdapter? That doesn't use the SqlBulkOperations class, but it does mean the insert becomes one message in each direction instead of message-response-message-response-..., which is going to be significantly slower because each round trip means a new command parse and construct at the SQL end.
Performing Batch Operations Using DataAdapters[^]
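For reference, a minimal sketch of the batched-adapter approach (the `Readings` table and its columns are invented here; adjust to the real schema). The key settings are `UpdatedRowSource = None` on the insert command and `UpdateBatchSize` on the adapter, so the adapter sends many rows per round trip:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class AdapterBatchDemo
{
    // Build the in-memory table to be pushed in one go.
    public static DataTable BuildTable((int Id, double Value)[] rows)
    {
        var table = new DataTable("Readings");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Value", typeof(double));
        foreach (var r in rows)
            table.Rows.Add(r.Id, r.Value);
        return table;
    }

    public static void Send(string connectionString, DataTable table)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var insert = new SqlCommand(
            "INSERT INTO Readings (Id, Value) VALUES (@Id, @Value)", conn))
        using (var adapter = new SqlDataAdapter())
        {
            // Map each parameter to its source column in the DataTable.
            insert.Parameters.Add("@Id", SqlDbType.Int, 0, "Id");
            insert.Parameters.Add("@Value", SqlDbType.Float, 0, "Value");
            insert.UpdatedRowSource = UpdateRowSource.None; // required for batching
            adapter.InsertCommand = insert;
            adapter.UpdateBatchSize = 500;                  // rows per round trip
            conn.Open();
            adapter.Update(table);                          // sends the lot
        }
    }
}
```

Setting `UpdateBatchSize` to 0 lets the provider pick the largest batch size it can handle.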
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
Glad you've narrowed it down, Dave.
As others have suggested, if you can do a BULK INSERT from a file, then parse all your data first into a single (csv) file and pass that to BULK INSERT. Very fast.
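A sketch of that idea, assuming SQL Server's BULK INSERT, with an invented staging path and table name. One caveat: plain BULK INSERT does not understand quoted CSV fields unless you are on SQL Server 2017+ with `FORMAT = 'CSV'` in the WITH clause, so either keep delimiters out of the data or add that option:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Data.SqlClient;

class BulkInsertDemo
{
    // Quote a field only when it contains a delimiter or a quote.
    public static string CsvLine(params string[] fields) =>
        string.Join(",", fields.Select(f =>
            f.Contains(",") || f.Contains("\"")
                ? "\"" + f.Replace("\"", "\"\"") + "\""
                : f));

    public static void Import(string connectionString, string[][] records)
    {
        string path = @"C:\temp\import.csv";           // hypothetical staging file
        File.WriteAllLines(path, records.Select(r => CsvLine(r)));

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "BULK INSERT dbo.MyTable FROM '" + path + "' " +
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')", conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();                     // one statement, whole file
        }
    }
}
```

The file path in the BULK INSERT statement is resolved on the server, so the staging file has to be somewhere the SQL Server service account can read.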
|
|
|
|
|
Indeed, parsing first is usually faster than inserting each record separately using SQL. I did a quick search just now and this [^] resembles a possible approach using parameterized SQL, but it is also possible without an explicit "BULK" operation; a parameterized insert query would do (wrapped in a SQL transaction...)
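A sketch of that "parameterized insert inside one transaction" idea, with an invented one-column staging table (`Lines`). The command is created once, only the parameter value changes per row, and the single commit avoids one implicit transaction per insert:

```csharp
using System.Data.SqlClient;

class TransactionInsertDemo
{
    // Hypothetical single-column staging table "Lines(Line NVARCHAR(MAX))".
    public const string InsertSql = "INSERT INTO Lines (Line) VALUES (@line)";

    public static void InsertAll(string connectionString, string[] lines)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            using (var cmd = new SqlCommand(InsertSql, conn, tx))
            {
                var p = cmd.Parameters.Add("@line", System.Data.SqlDbType.NVarChar);
                foreach (var line in lines)
                {
                    p.Value = line;          // reuse the command, swap the value
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();                 // one commit for the whole batch
            }
        }
    }
}
```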
Cheers,
|
|
|
|
|
- Your code is prone to SQL injection (if the source file can be modified).
- You might want to process the files in a defined order.
- Your code does not validate that the data is what you expect.
- There are a bunch of hard-coded constants.
If you only run it once, then it does not matter much; but then it also would not matter much that it takes about one hour to process the files.
One thing that would have a significant impact on performance is to keep the connection open for the whole import, if that is not already done.
Philippe Mori
|
|
|
|
|
I would think a RegEx would work a bit faster.
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli
|
|
|
|
|
I would do this completely differently.
Read the text files into memory
Insert each record into a data table in memory - no transformation
Bulk copy the raw data into a staging table in sql server
Use a stored procedure to transform the data.
Bulk copy and stored procedures will reduce the impact of the primary bottlenecks: IO and transforms
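The bulk-copy-to-staging step could look roughly like this (the staging table and procedure names are invented):

```csharp
using System.Data;
using System.Data.SqlClient;

class StagingLoadDemo
{
    // Push the raw, untransformed rows into a staging table in one shot,
    // then let a stored procedure do the transform server-side.
    public static void Load(string connectionString, DataTable raw)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.Staging_Raw";  // hypothetical
                bulk.BatchSize = 5000;
                bulk.WriteToServer(raw);                        // minimal logging, no per-row SQL
            }
            using (var cmd = new SqlCommand("dbo.TransformStaged", conn)) // hypothetical proc
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```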
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Two simple things to consider:
- at 4-5 files per second, those 17000 files will take an hour. So what? Run it sometime at night when nobody else is at work.
- SQL statements can be Prepare'd. That means you have to change it to a parameterized query first, call Prepare once, and afterwards only set fresh parameter values before each call to ExecuteNonQuery.
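A sketch of the Prepare approach, with an invented `People` table. Every parameter must have an explicit type (and a size, for variable-length types) before `Prepare()` is called:

```csharp
using System.Data;
using System.Data.SqlClient;

class PreparedInsertDemo
{
    // conn must already be open; Prepare() requires an open connection.
    public static void Run(SqlConnection conn, (int Id, string Name)[] rows)
    {
        using (var cmd = new SqlCommand(
            "INSERT INTO People (Id, Name) VALUES (@id, @name)", conn))
        {
            cmd.Parameters.Add("@id", SqlDbType.Int);
            cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100);
            cmd.Prepare();                       // parse/compile once on the server

            foreach (var row in rows)
            {
                cmd.Parameters["@id"].Value = row.Id;
                cmd.Parameters["@name"].Value = row.Name;
                cmd.ExecuteNonQuery();           // only fresh values each time
            }
        }
    }
}
```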
|
|
|
|
|
I recently had a similar problem and gained some speed with a two-step change: parallelize the
foreach (string f in files), and then ditch the file streaming in favor of File.ReadAllLines(...). Basically the disk was my bottleneck, and it performed better doing bulk reads of data instead of streaming line by line; after each bulk read I had a chunk of data in memory that I could process, bound only by the limitations of my CPU. This does, of course, assume that you have some memory to spare, because you're reading entire files in at a time. There was also a limit to how many concurrent file reads I could throw at my disk; after a certain point the performance dropped significantly, so I had to throttle back how many parallel file reads my app could do. I did some trial and error to find the best number, YMMV.
That would also require you to redo your database access, though, since you probably don't want to be doing that from multiple threads; but that kind of goes along with some of the other suggestions you've received. Collecting the data in memory and doing a bulk insert later on, instead of individual inserts, would make a big difference, too.
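A sketch of that combination: Parallel.ForEach with a MaxDegreeOfParallelism throttle (tune the number for your disk), and one File.ReadAllLines bulk read per file:

```csharp
using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

class ParallelReadDemo
{
    // Read every file in one gulp each, at most maxParallel files at a time.
    public static string[] ReadAll(string[] files, int maxParallel)
    {
        var lines = new ConcurrentBag<string>();
        Parallel.ForEach(
            files,
            new ParallelOptions { MaxDegreeOfParallelism = maxParallel },
            f =>
            {
                foreach (var line in File.ReadAllLines(f))   // bulk read, no streaming
                    lines.Add(line);
            });
        return lines.ToArray();
    }
}
```

Note that a ConcurrentBag does not preserve order; if row order matters, collect per-file results and merge afterwards.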
|
|
|
|
|
I want to develop a RESTful CRUD web service between a PostgreSQL database and an ASP web application. I have looked at many examples and tutorials but have not found a solution. Here is where I am:
I have a service "Service191" that I can call from Mozilla or from the WCF Test Client:
public class Service191 : IService191
{
public string data;
public static NpgsqlConnection conn;
public void connection()
{
try
{
Mails mail = new Mails();
string strConnString = "Server=194.206.X.XXX; Port=5432; Database=XXXXX; User Id=XXXX; Password=XXXXX";
DAL.DAL dal = new DAL.DAL(strConnString);
GenerateGetRequest();
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
}
}
To test this, I call the function GenerateGetRequest():
private static void GenerateGetRequest()
{
string url = "http://localhost:49761/Service191.svc/mails?id=14";
HttpWebRequest GETRequest = (HttpWebRequest)WebRequest.Create(url);
GETRequest.Method = WebRequestMethods.Http.Get;
HttpWebResponse GETResponse = (HttpWebResponse)GETRequest.GetResponse();
Stream GetResponseStream = GETResponse.GetResponseStream();
StreamReader sr = new StreamReader(GetResponseStream);
MessageBox.Show(sr.ReadToEnd());
}
The URL is what my web application will send later. "Mails" is the table the web service queries. I also have a Handler:
public class Handler : IHttpHandler
{
private DAL.DAL dal;
private string connString;
private Mails mail;
private ErrorHandler.ErrorHandler errHandler;
#region HANDLER
public bool IsReusable
{
get
{
return false;
}
}
public void ProcessRequest(HttpContext context)
{
try
{
string url = Convert.ToString(context.Request.Url);
connString = "Server = 194.206.X.XXX; Port = 5432; Database = XXXX; User Id = XXXX; Password = XXXXX";
dal = new DAL.DAL(connString);
errHandler = new ErrorHandler.ErrorHandler();
switch (context.Request.HttpMethod)
{
case "GET":
READ(context);
break;
case "POST":
CREATE(context);
break;
case "PUT":
UPDATE(context);
break;
case "DELETE":
DELETE(context);
break;
default:
break;
}
}
catch (Exception ex)
{
errHandler.ErrorMessage = ex.Message.ToString();
context.Response.Write(errHandler.ErrorMessage);
}
}
#endregion
#region CRUD
private void READ(HttpContext context)
{
try
{
int id = Convert.ToInt16(context.Request["id"]);
mail = dal.GetMail(id);
if (mail == null)
{
context.Response.Write(id + ": no mail found");
return;
}
string serialized = Serialize(mail);
context.Response.ContentType = "text/xml";
WriteResponse(serialized);
MessageBox.Show("mail READ");
}
catch (Exception ex)
{
errHandler.ErrorMessage = dal.GetException();
errHandler.ErrorMessage = ex.Message.ToString();
}
}
private void CREATE(HttpContext context)
{
try
{
byte[] PostData = context.Request.BinaryRead(context.Request.ContentLength);
string str = Encoding.UTF8.GetString(PostData);
Mails mail = Deserialize(PostData);
dal.AddMail(mail);
MessageBox.Show("mail CREATE");
}
catch (Exception ex)
{
errHandler.ErrorMessage = dal.GetException();
errHandler.ErrorMessage = ex.Message.ToString();
}
}
private void UPDATE(HttpContext context)
{
try
{
byte[] PUTRequestByte = context.Request.BinaryRead(context.Request.ContentLength);
context.Response.Write(PUTRequestByte);
Mails mail = Deserialize(PUTRequestByte);
dal.UpdateMail(mail);
MessageBox.Show("mail UPDATE");
}
catch (Exception ex)
{
errHandler.ErrorMessage = dal.GetException();
errHandler.ErrorMessage = ex.Message.ToString();
}
}
private void DELETE(HttpContext context)
{
try
{
int id = Convert.ToInt16(context.Request["id"]);
dal.DeleteMail(id);
MessageBox.Show("mail DELETE");
}
catch (Exception ex)
{
errHandler.ErrorMessage = dal.GetException();
errHandler.ErrorMessage = ex.Message.ToString();
}
}
#endregion
private Mails Deserialize (byte[] xmlByteData)
{
try
{
XmlSerializer ds = new XmlSerializer(typeof(Mails));
MemoryStream memoryStream = new MemoryStream(xmlByteData);
Mails mail = new Mails();
mail = (Mails)ds.Deserialize(memoryStream);
return mail;
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
throw;
}
}
private static void WriteResponse(string strMessage)
{
HttpContext.Current.Response.Write(strMessage);
}
private String Serialize(Mails mail)
{
try
{
String XmlizedString = null;
XmlSerializer xs = new XmlSerializer(typeof(Mails));
MemoryStream memoryStream = new MemoryStream();
XmlTextWriter xmlTextWriter = new XmlTextWriter(memoryStream, Encoding.UTF8);
xs.Serialize(xmlTextWriter, mail);
memoryStream = (MemoryStream)xmlTextWriter.BaseStream;
XmlizedString = UTF8ByteArrayToString(memoryStream.ToArray());
return XmlizedString;
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
throw;
}
}
private String UTF8ByteArrayToString(Byte[] characters)
{
UTF8Encoding encoding = new UTF8Encoding();
String constructedString = encoding.GetString(characters);
return (constructedString);
}
}
But I don't understand why my handler is never called, so I always get a 400 error.
I also have a DAL class that handles the connection and the requests to the database:
public class DAL
{
private NpgsqlConnection conn;
private NpgsqlCommand command;
private static string connString;
private static List<Mails> mailList;
private ErrorHandler.ErrorHandler err;
public DAL(string _connString)
{
err = new ErrorHandler.ErrorHandler();
connString = _connString;
}
public void AddMail (Mails mail)
{
try
{
using (conn = new NpgsqlConnection(connString))
{
string npgsqlInsertString = "INSERT INTO mails (id_entete, emmetteur, destinataires, objet, contenu, date_envoi, heure_envoi) VALUES (@id_entete, @emmetteur, @destinataires, @objet, @contenu, @date_envoi, @heure_envoi)";
command = new NpgsqlCommand();
command.Connection = conn;
command.Connection.Open();
command.CommandText = npgsqlInsertString;
NpgsqlParameter idParam = new NpgsqlParameter("@id_entete", mail.Id_entete);
NpgsqlParameter emmParam = new NpgsqlParameter("@emmetteur", mail.Emmetteur);
NpgsqlParameter destParam = new NpgsqlParameter("@destinataires", mail.Destinataires);
NpgsqlParameter objParam = new NpgsqlParameter("@objet", mail.Objet);
NpgsqlParameter contParam = new NpgsqlParameter("@contenu", mail.Contenu);
NpgsqlParameter dateParam = new NpgsqlParameter("@date_envoi", mail.Date_envoi);
NpgsqlParameter heureParam = new NpgsqlParameter("@heure_envoi", mail.Heure_envoi);
command.Parameters.AddRange(new NpgsqlParameter[] { idParam, emmParam, destParam, objParam, contParam, dateParam, heureParam });
command.ExecuteNonQuery();
command.Connection.Close();
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
public void UpdateMail (Mails mail)
{
try
{
using (conn = new NpgsqlConnection(connString))
{
string npgsqlUpdateString = "UPDATE mails SET id_entete=@id_entete, emmetteur=@emmetteur, destinataires=@destinataires, objet=@objet, contenu=@contenu, date_envoi=@date_envoi, heure_envoi=@heure_envoi WHERE id=@id";
command = new NpgsqlCommand();
command.Connection = conn;
command.Connection.Open();
command.CommandText = npgsqlUpdateString;
NpgsqlParameter keyParam = new NpgsqlParameter("@id", mail.Id);
NpgsqlParameter idParam = new NpgsqlParameter("@id_entete", mail.Id_entete);
NpgsqlParameter emmParam = new NpgsqlParameter("@emmetteur", mail.Emmetteur);
NpgsqlParameter destParam = new NpgsqlParameter("@destinataires", mail.Destinataires);
NpgsqlParameter objParam = new NpgsqlParameter("@objet", mail.Objet);
NpgsqlParameter contParam = new NpgsqlParameter("@contenu", mail.Contenu);
NpgsqlParameter dateParam = new NpgsqlParameter("@date_envoi", mail.Date_envoi);
NpgsqlParameter heureParam = new NpgsqlParameter("@heure_envoi", mail.Heure_envoi);
command.Parameters.AddRange(new NpgsqlParameter[] { keyParam, idParam, emmParam, destParam, objParam, contParam, dateParam, heureParam });
command.ExecuteNonQuery();
command.Connection.Close();
}
}
catch (Exception ex)
{
err.ErrorMessage = ex.Message.ToString();
throw;
}
}
public void DeleteMail (int id)
{
try
{
using (conn = new NpgsqlConnection(connString))
{
string npgsqlDeleteString = "DELETE FROM mails WHERE id=@id";
command = new NpgsqlCommand();
command.Connection = conn;
command.Connection.Open();
command.CommandText = npgsqlDeleteString;
NpgsqlParameter idParam = new NpgsqlParameter("@id", id);
command.Parameters.Add(idParam);
command.ExecuteNonQuery();
command.Connection.Close();
}
}
catch (Exception ex)
{
err.ErrorMessage = ex.Message.ToString();
throw;
}
}
public Mails GetMail(int ID)
{
try
{
if (mailList == null)
{
mailList = GetMails();
}
foreach (Mails mail in mailList)
{
if (mail.Id == ID)
{
return mail;
}
}
return null;
}
catch (Exception ex)
{
err.ErrorMessage = ex.Message.ToString();
throw;
}
}
private List<Mails> GetMails()
{
try
{
using (conn = new NpgsqlConnection(connString))
{
mailList = new List<Mails>();
string npgsqlSelectString = "SELECT * FROM mails";
command = new NpgsqlCommand(npgsqlSelectString, conn);
command.Connection.Open();
NpgsqlDataReader reader = command.ExecuteReader();
while (reader.Read())
{
Mails mail = new Mails();
mail.Id = (int)reader[0];
mail.Id_entete = (int)reader[1];
mail.Emmetteur = reader[2].ToString().Replace(" ", "");
mail.Destinataires = reader[3].ToString().Replace(" ", "");
mail.Objet = reader[4].ToString().Replace(" ", "");
mail.Contenu = reader[5].ToString().Replace(" ", "");
mail.Date_envoi = reader[6].ToString().Replace(" ", "");
mail.Heure_envoi = reader[7].ToString().Replace(" ", "");
mailList.Add(mail);
}
command.Connection.Close();
return mailList;
}
}
catch (Exception ex)
{
err.ErrorMessage = ex.Message.ToString();
throw;
}
}
public string GetException()
{
return err.ErrorMessage.ToString();
}
}
So, what can I do to get the function ProcessRequest(HttpContext context) called? Thank you for your help, and sorry for my bad English!
|
|
|
|
|
Hi All,
In my project, a textbox value is updated on a timer tick; I want to bypass the timer update while I am changing the value in the textbox.
|
|
|
|
|
And?
What have you tried?
Where are you stuck?
What help do you need?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
I tried the textbox Enter event, so when I select the textbox it is not updated, and I used the Leave event to apply the entered value. But I have 50 textboxes, and I want to make a common function for all of them.
|
|
|
|
|
So do: you can call the same event handler for all your boxes, and use the Tag property to hold the "I'm editing" flag. You get the appropriate TextBox instance passed in as the sender parameter of the event handler.
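A sketch of that pattern, assuming WinForms; SaveValue and GetLatestValue are invented placeholders for wherever the values actually come from and go to:

```csharp
using System;

static class EditGuard
{
    // A textbox may be refreshed by the timer unless its Tag flag says
    // the user is currently editing it.
    public static bool ShouldUpdate(object tag) => !Equals(tag, true);
}

/* In the form, wire every TextBox to the same handlers once:

    foreach (var tb in Controls.OfType<TextBox>())
    {
        tb.Enter += Box_Enter;
        tb.Leave += Box_Leave;
    }

    private void Box_Enter(object sender, EventArgs e)
        => ((TextBox)sender).Tag = true;              // "I'm editing"

    private void Box_Leave(object sender, EventArgs e)
    {
        var tb = (TextBox)sender;
        tb.Tag = false;
        SaveValue(tb.Name, tb.Text);                  // hypothetical: apply the edit
    }

    private void Timer_Tick(object sender, EventArgs e)
    {
        foreach (var tb in Controls.OfType<TextBox>())
            if (EditGuard.ShouldUpdate(tb.Tag))
                tb.Text = GetLatestValue(tb.Name);    // hypothetical lookup
    }
*/
```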
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
I am a newbie to C#; can you please explain how to do that?
|
|
|
|
|
Which bit? This is all basic stuff ...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
That's OK, but I don't know which control is the active one when I select a textbox.
|
|
|
|
|
Sorry?
Do you mean you don't know which control it was when you enter the event handler? Because as I said, that is given to you as the sender parameter.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
|
Hi,
I want to send push notifications to mobile phones that I will register, and I want to use Azure Notification Hubs for this purpose. Is this possible?
Best regards.
|
|
|
|
|
That's a question that is specific to Azure Notification Hubs, not to C#. According to their website, it is possible.
Push notifications for consumer and enterprise apps—from any backend to any device platform
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
I have an app that transfers files to a server via FTP. This class doesn't actually do the Ftp'ing, it just maintains a queue of files to be sent.
Some requirements:
1) No more than 3 files can be FTP'd at one time.
2) The appearance of a file in the target folder should start the process.
3) When a file is done being FTP'd, the next file should automatically start.
So I came up with this FTPQueue class[^]. I would appreciate any feedback.
Thanks
public static class FTPQueue
{
private const int MAX_FILE_UPLOADS = 3;
private static int _filesBeingUploaded = 0;
private static Queue<string> _fileQueue = new Queue<string>();
private static List<string> _filesInProgress = new List<string>();
public static void FileReceived(string fileName)
{
Console.WriteLine("Received file {0}", fileName);
_fileQueue.Enqueue(fileName);
UploadNextFile();
}
private static async void UploadNextFile()
{
if (_fileQueue.Count == 0)
{
return;
}
string fileName = (string)_fileQueue.Peek();
if (!string.IsNullOrEmpty(fileName))
{
if (_filesBeingUploaded >= MAX_FILE_UPLOADS)
{
return;
}
if (_filesInProgress.Any(x => x == fileName))
{
return;
}
_filesInProgress.Add(fileName);
_filesBeingUploaded++;
Console.WriteLine("Files being uploaded {0}", _filesBeingUploaded);
Task<bool> upload = TransferFile(fileName);
bool result = await upload;
Console.WriteLine("File {0} upload complete", fileName);
FileUploadComplete(fileName, result);
}
}
private static async Task<bool> TransferFile(string fileName)
{
Console.WriteLine("Transferring file {0}", fileName);
await Task.Delay(10000);
bool result = true;
return result;
}
private static void FileUploadComplete(string fileName, bool result)
{
if (result)
{
Console.WriteLine(string.Format("File {0} upload complete", fileName));
if (_fileQueue.Count > 0)
{
Console.WriteLine("Removing file {0} from the queue", fileName);
_fileQueue.Dequeue();
_filesInProgress.Remove(fileName);
}
_filesBeingUploaded--;
Console.WriteLine("Files being uploaded {0}", _filesBeingUploaded);
}
else
{
Console.WriteLine(string.Format("File {0} upload failed", fileName));
_filesBeingUploaded--;
_filesInProgress.Remove(fileName);
}
UploadNextFile();
}
}
For testing I ran it from a simple Console app:
class Program
{
private static Timer _timer;
static void Main(string[] args)
{
_timer = new Timer();
_timer.Interval = 3000;
_timer.Elapsed += _timer_Elapsed;
_timer.Enabled = true;
Console.ReadLine();
}
private static void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
var fileName = Path.GetRandomFileName();
Console.WriteLine("Created new file {0}", fileName);
FTPQueue.FileReceived(fileName);
}
}
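One alternative worth considering for the review: a SemaphoreSlim can replace the hand-maintained counter and the MAX_FILE_UPLOADS check, and its finally block guarantees the slot is returned even when a transfer throws. A sketch, not a drop-in replacement:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledUploads
{
    private const int MaxFileUploads = 3;
    private static readonly SemaphoreSlim Slots = new SemaphoreSlim(MaxFileUploads);

    // Queue an upload; at most MaxFileUploads run concurrently, and a slot
    // is released even when the transfer throws.
    public static async Task UploadAsync(string fileName, Func<string, Task> transfer)
    {
        await Slots.WaitAsync();
        try
        {
            await transfer(fileName);
        }
        finally
        {
            Slots.Release();
        }
    }
}
```

Callers just fire UploadAsync for every file that appears; the semaphore does the queuing and the "start the next one" behaviour falls out automatically.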
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
|
|
|
|
|
Why even bother trying to FTP more than 1 file at a time; is this supposed to increase throughput?
|
|
|
|
|
Why not?
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
|
|
|
|
|
Well, does it increase throughput? If not, it's a make-work project.
|
|
|
|
|
It's possible for many files to be dropped into the folder at once, so I want to send more than one at a time to decrease the total time it takes to get them to the server.
If there are a lot of files and they go one at a time, it could take forever for them to finish.
I set it to 3 for starters, but if it works well I could up it to 5 or even 10. Each one will be in its own process, so they should all go concurrently.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
|
|
|
|
|