|
musefan wrote: Why would I want to create a file in memory, store in DB and then read it straight back into memory?
Separation of responsibilities: one (or several) process(es) creates, another sends. Maybe another archives. Who knows?
|
|
|
|
|
PIEBALDconsult wrote: Separation of responsibilities: one (or several) process(es) creates, another sends. Maybe another archives. Who knows?
For some requirements, yes, maybe so. But mine is simply 'Create and Send' - it's wasted effort to split the process
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
I'll check back in a year.
|
|
|
|
|
don't forget now... maybe I will come crawling before then
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
musefan wrote: maybe I will come crawling before then
I don't think so... I'm sure you'll be able to handle it on your own.
|
|
|
|
|
I had the same experience; my guess is the attachment files remain locked by the mail system until it is done sending them. Your code is not to blame for any of that. Solutions would be to create:
- an attachment from the file contents;
- an attachment from a stream, as you did;
- a copy of the file (leaving even more garbage on the file system).
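A minimal sketch of the first option, creating the attachment from the file's contents so no handle keeps the file locked during the send (all names here are illustrative, not from the original post):

```csharp
using System;
using System.IO;
using System.Net.Mail;
using System.Net.Mime;

public class AttachmentExample
{
    // Builds an attachment from the file's bytes rather than the file itself.
    // The file handle is closed as soon as ReadAllBytes returns, so nothing
    // holds a lock on the file while the message is queued or sent.
    public static Attachment FromFileContents(string path)
    {
        byte[] contents = File.ReadAllBytes(path);        // handle closed here
        var stream = new MemoryStream(contents);          // disposed with the message
        return new Attachment(stream, Path.GetFileName(path),
                              MediaTypeNames.Application.Octet);
    }

    public static void Main()
    {
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "export data");

        using (var message = new MailMessage())
        {
            message.Attachments.Add(FromFileContents(path));
            File.Delete(path);                            // safe: no lock held
            // new SmtpClient("mailhost").Send(message);  // hypothetical host
        }
    }
}
```

The trade-off is the same one discussed below: the whole file sits in memory for the lifetime of the message.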
|
|
|
|
|
Or don't delete the file -- archive it at a later date. You never know when you may need a paper trail to CYA. Or to resend for some reason.
|
|
|
|
|
Files are created purely for sending to users. There is no need for keeping the file - besides, generated files could become frequent and numerous, so I want to keep the file system tidy
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
musefan wrote: Files are created purely for sending to users
Aren't they all?
musefan wrote: There is no need for keeping the file
"There will be." -- Murphy
musefan wrote: keep the file system tidy
Create them in a directory specifically for them.
|
|
|
|
|
PIEBALDconsult wrote: Aren't they all?
No, I have many files on my computer that are not required to go anywhere. In my service the files being sent are exports of data at regular intervals
PIEBALDconsult wrote: "There will be." -- Murphy
If I ever need to keep them, then I will start keeping them. But I don't now and cannot think of what they would be useful for
PIEBALDconsult wrote: Create them in a directory specifically for them.
That's still a folder full of files I don't need or want. Plus they are already generated in a nice file structure to keep them unique
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
musefan wrote: I will start keeping them
I see no reason not to keep them now. Do you need plausible deniability?
musefan wrote: I don't now and cannot think of what they would be useful for
Mostly as a paper trail and for resending when someone's system crashes and is rebuilt. But also in case sometime down the road someone asks about comparing current numbers with numbers from a year earlier -- it happens, and if you aren't already archiving (and perhaps summarizing) data, all you have are files, reports, extracts... or maybe you don't.
Don't throw anything away.
|
|
|
|
|
If I need to resend, I can generate the files again. These files are just data exports from a database; no data is lost by deleting them, as it already exists in the DB, which itself will be backed up and archived.
These files are for users and they can do whatever they wish with them. It is their data to control... and to look after. We are not providing a file storage system for them.
Like I have already said, if it wasn't for an MDB file being generated I probably wouldn't even let them out of memory. In that instance, would you be telling me to dump my RAM and keep copies of that 'just in case'? If requirements change, then they change, and it can be handled then.
You also said to plan for the future - but what if we look at this from the flip side? What if these generated files become thousands a day, even a TB of data a day (a bit extreme, but you never know)? Now the question will be 'what good reason is there to keep the files?'
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
musefan wrote: a TB of data a day
::gibbering:: Well... maybe a very pro-active archive regimen and compression?
|
|
|
|
|
Thanks for the reassurance. It would be nice if there was a function SendAndIWillWaitThankYou(), but sadly not.
In this case I am happy to use a memory stream, as I don't expect the generated files to be too large. I don't like the idea of loading them into memory if they get too big, so I may have to revisit this at some stage if they start growing.
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
musefan wrote: SendAndIWillWaitThankYou()
The email server may not have an Internet connection... or only 56K...
|
|
|
|
|
My code is on a separate thread anyway, so I don't mind it waiting however long. Perhaps in conjunction with the Timeout property I would be happy to wait, say, 30 minutes and then log a failure.
In the event it wants to queue the attachment and return control to me, I would much prefer it to make its own copy of the file to be queued; then it can be responsible for the cleanup itself.
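A sketch of that arrangement, assuming a synchronous Send running on a worker thread (the host name is hypothetical):

```csharp
using System;
using System.Net.Mail;

public class TimedSend
{
    // Waits up to 30 minutes for the synchronous Send, then logs a failure.
    // SmtpClient.Timeout is in milliseconds and applies to synchronous sends.
    public static void Send(MailMessage message)
    {
        using (var client = new SmtpClient("mailhost"))   // hypothetical host
        {
            client.Timeout = (int)TimeSpan.FromMinutes(30).TotalMilliseconds;
            try
            {
                client.Send(message);                     // blocks this worker thread
            }
            catch (SmtpException ex)
            {
                Console.Error.WriteLine("Send failed: " + ex.Message);
            }
        }
    }
}
```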
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
I'd rather have a DeleteWhenSent property, so I don't have to worry at all.
And I don't really want to receive e-mails that you consider too large for storing them in memory...
|
|
|
|
|
Luc Pattyn wrote: And I don't really want to receive e-mails that you consider too large for storing them in memory
Yes. In this case it is an 'if you really want it' function. Users have the option of an FTP transfer of the same files if they would prefer (recommended)
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
Luc Pattyn wrote: I don't really want to receive e-mails that you consider too large
Neither do I, but you know those MBA-types and their love of Excel...
|
|
|
|
|
Hiya,
For me, I found that disposing the client and/or the message releases the file lock.
Also, there is a problem sending emails over 4 MB in size (in case you do not know about it).
I would set the attachment's transfer encoding: attachment.TransferEncoding = System.Net.Mime.TransferEncoding.SevenBit
kind regards
Dan
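Dan's two points combined into a sketch, assuming .NET 4 (where SmtpClient is disposable); addresses and the host name are hypothetical:

```csharp
using System.IO;
using System.Net.Mail;
using System.Net.Mime;

public class SendAndClean
{
    public static void Send(string filename)
    {
        using (var message = new MailMessage("from@example.com", "to@example.com",
                                             "Export", "See attachment"))
        using (var client = new SmtpClient("mailhost"))       // hypothetical host
        {
            var attachment = new Attachment(filename);
            // Dan's suggested workaround for the large-message issue:
            // force seven-bit transfer encoding on the attachment.
            attachment.TransferEncoding = TransferEncoding.SevenBit;
            message.Attachments.Add(attachment);
            client.Send(message);
        }   // disposing the message disposes the attachment, releasing the lock

        File.Delete(filename);                                // safe after dispose
    }
}
```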
|
|
|
|
|
Agreed - the file remains locked until the Attachment instance is disposed. Calling Dispose on the MailMessage instance disposes of all attachments for the message.
protected virtual void Dispose(bool disposing)
{
    if (disposing && !this.disposed)
    {
        this.disposed = true;
        if (this.views != null)
        {
            this.views.Dispose();
        }
        if (this.attachments != null)
        {
            this.attachments.Dispose();
        }
        if (this.bodyView != null)
        {
            this.bodyView.Dispose();
        }
    }
}

public void Dispose()
{
    if (!this.disposed)
    {
        foreach (Attachment attachment in this)
        {
            attachment.Dispose();
        }
        base.Clear();
        this.disposed = true;
    }
}
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Did you try using your second approach, but with a FileStream (the .NET equivalent of Java's FileInputStream)?
Just change
Stream stream = new MemoryStream(File.ReadAllBytes(filename));
to
Stream stream = File.OpenRead(filename);
and retain the cleanup loop at the bottom.
This avoids the memory stream.
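A self-contained sketch of this approach, using FileStream (the .NET counterpart of Java's FileInputStream); the key point is that the delete must wait until the message, and therefore the stream, has been disposed:

```csharp
using System.IO;
using System.Net.Mail;

public class StreamAttachmentExample
{
    public static void Main()
    {
        string filename = Path.GetTempFileName();
        File.WriteAllText(filename, "export data");

        using (var message = new MailMessage())
        {
            // File.OpenRead returns a FileStream; the file stays locked
            // only until the message (and its attachments) is disposed.
            Stream stream = File.OpenRead(filename);
            message.Attachments.Add(new Attachment(stream, Path.GetFileName(filename)));
            // ... send the message here ...
        }   // attachment and its stream disposed with the message

        File.Delete(filename);   // now safe: the stream has been closed
    }
}
```

Unlike the MemoryStream version, this never holds the whole file in memory, at the cost of keeping the file locked for the duration of the send.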
|
|
|
|
|
Thanks for the suggestion. I don't have time to test that at the minute but I have put a TODO so will give it a go at some point
Life goes very fast. Tomorrow, today is already yesterday.
|
|
|
|
|
The C# 4.0 compiler is slow on my 4-core laptop, and I noticed only one core is busy when compiling. Is the C# compiler multi-core aware?
|
|
|
|
|
AFAIK it is not. However, you may be interested in this[^]. I haven't tried it yet.
|
|
|
|