|
Yes. The TIFF file with an alpha channel works perfectly under XP. Both BMP and TIFF files WITHOUT an alpha channel work under both OSes.
"Microsoft -- Adding unnecessary complexity to your work since 1987!"
|
|
|
|
|
Possibly a problem with the TIFF codec in Windows 7 then I'm afraid. If I were you, I'd raise a report through Microsoft Connect.
|
|
|
|
|
Thanks!
"Microsoft -- Adding unnecessary complexity to your work since 1987!"
|
|
|
|
|
You're welcome. I'm just sorry I couldn't be of more assistance.
|
|
|
|
|
Hi everyone!
I'm looking for a library that allows me to load, save and edit .dxf documents. Preferably it should be free. I need to allow the user to define "zones" over the "graphic" through a visual interface.
Thank you in advance!
modified 8-Mar-12 11:57am.
|
|
|
|
|
|
Thank you very much. I will take a look at these libraries!
|
|
|
|
|
The Autodesk API is not what I need. As I understand it, it is more about developing AutoCAD plugins or driving AutoCAD functions remotely, but always with the main program installed. I will take a look at your second proposal...
|
|
|
|
|
Hi all,
I need a component that finds all the directories and can show them in a tree view.
|
|
|
|
|
It's part of the Framework (meaning it's already on your machine). You'd use the GetFiles method of the Directory[^] class and display the results in a TreeView[^].
I'm sure there are plenty of examples of this on the internet. What have you tried so far?
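Not from the original posts, but here is a minimal console sketch of the idea: Directory.GetDirectories does the enumeration, and the same recursion can populate a TreeView by creating a TreeNode per directory instead of printing. The class and method names are my own choices.

```csharp
using System;
using System.IO;

class DirTreeDemo
{
    // Recursively print each subdirectory, indented by depth.
    // In a WinForms app you would create a TreeNode per directory here
    // and add it to the parent node's Nodes collection instead of printing.
    public static void PrintTree(string path, int depth)
    {
        Console.WriteLine(new string(' ', depth * 2) + Path.GetFileName(path));
        try
        {
            foreach (string sub in Directory.GetDirectories(path))
                PrintTree(sub, depth + 1);
        }
        catch (UnauthorizedAccessException)
        {
            // Skip directories we are not allowed to enumerate.
        }
    }

    static void Main()
    {
        PrintTree(Directory.GetCurrentDirectory(), 0);
    }
}
```

The try/catch matters in practice: walking a whole drive will hit directories (e.g. system folders) the process cannot open.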
Bastard Programmer from Hell
|
|
|
|
|
A nice sample here[^].
Unrequited desire is character building. OriginalGriff
I'm sitting here giving you a standing ovation - Len Goodman
|
|
|
|
|
Hi,
I am getting a System.OutOfMemoryException when using a FileStream to load a zip file. The file is 521 MB. My code is:

// Insert file into a SQL Server table using a FileStream
private void btnInsert_Click(object sender, EventArgs e)
{
    OpenFileDialog openFileDlg = new OpenFileDialog();
    openFileDlg.InitialDirectory = Directory.GetCurrentDirectory();
    if (openFileDlg.ShowDialog() == DialogResult.OK)
    {
        FileInfo fi = new FileInfo(openFileDlg.FileName);
        FileStream fs = new FileStream(fi.FullName, FileMode.Open, FileAccess.Read);
        BinaryReader rdr = new BinaryReader(fs);
        byte[] fileData = rdr.ReadBytes((int)fs.Length);
        rdr.Close();
        fs.Close();
        // ... the insert of fileData into the SQL Server table follows ...
    }
}

My program dies on this line: byte[] fileData = rdr.ReadBytes((int)fs.Length);
I have also attached the details of the System.OutOfMemoryException (http://www.mediafire.com/?vk77dph2vjcq3vx)[^].
I am running this on a Windows 7, 64-bit machine.
Thanks for your help.
Sharon
|
|
|
|
|
Sharonc7 wrote: byte[] fileData = rdr.ReadBytes((int)fs.Length);
You're reading the 521 MB in a single call. How about reading a single megabyte, writing that, and moving on to the next one?
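A minimal sketch of that suggestion, copying a file in 1 MB chunks through one reusable buffer instead of loading it whole. The destination here is another file; in the poster's case each chunk would instead be appended to the database column (the class and path handling are my own, not from the thread).

```csharp
using System;
using System.IO;

class ChunkedCopy
{
    // Copy a file through a single reusable 1 MB buffer. Memory use stays
    // at 1 MB no matter how large the source file is.
    public static void Copy(string sourcePath, string destPath)
    {
        byte[] buffer = new byte[1024 * 1024];
        using (FileStream src = File.OpenRead(sourcePath))
        using (FileStream dst = File.Create(destPath))
        {
            int read;
            while ((read = src.Read(buffer, 0, buffer.Length)) > 0)
                dst.Write(buffer, 0, read);   // here: write chunk to the destination
        }
    }

    static void Main()
    {
        string src = Path.GetTempFileName();
        string dst = Path.GetTempFileName();
        File.WriteAllBytes(src, new byte[2500000]);   // ~2.5 MB test file
        Copy(src, dst);
        Console.WriteLine(new FileInfo(dst).Length);  // 2500000
        File.Delete(src);
        File.Delete(dst);
    }
}
```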
Bastard Programmer from Hell
|
|
|
|
|
Two possibilities:
1. You really ran out of memory. If your code fails right away, this is the likely cause.
2. A byte array is an object; objects larger than about 85 KB (85,000 bytes) get allocated on the "Large Object Heap". That heap does not get compacted, hence it is subject to fragmentation, which in turn can mean a big memory block is not available even though plenty of smaller ones are. If your code succeeds a couple of times and then starts to fail, this is the likely cause.
Now, as Eddy said, why do you want all that data in memory at once? Do you really need that, or are you just being lazy?
BTW: the line byte[] fileData = File.ReadAllBytes(openFileDlg.FileName); could easily replace six of your statements (assuming having all the data at once really is necessary). Notice how the word "Stream" does not appear: streaming means consuming data in chunks, not getting it all in one go. Your code is an example of "stream abuse".
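To illustrate the equivalence, a small self-contained sketch (the temp file stands in for the file picked in the dialog): the FileInfo/FileStream/BinaryReader sequence and File.ReadAllBytes produce the same array.

```csharp
using System;
using System.IO;

class ReadAllBytesDemo
{
    static void Main()
    {
        string path = Path.GetTempFileName();           // stand-in for openFileDlg.FileName
        File.WriteAllBytes(path, new byte[] { 1, 2, 3 });

        // The long way: FileStream + BinaryReader + two Close calls.
        byte[] longWay;
        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        using (BinaryReader rdr = new BinaryReader(fs))
            longWay = rdr.ReadBytes((int)fs.Length);

        // The short way: one call, same result.
        byte[] shortWay = File.ReadAllBytes(path);

        Console.WriteLine(longWay.Length == shortWay.Length);  // True
        File.Delete(path);
    }
}
```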
|
|
|
|
|
Hi,
Thanks for the help. As I mentioned, I have a machine that has 8 GB of memory. I had to shut down my machine to take it home last night. When I tried to load the file again today it loaded with no problem. Normally I will only be loading one file at a time, but in this case I was testing my development database and was loading 28 files one right after the other.
Now my question is: since it is working OK now, is there a problem I still need to fix?
Sharon
|
|
|
|
|
Sounds like a fragmentation issue then; and that could be tough, as it may well be a fundamental weakness of .NET memory management.
Sharonc7 wrote: Normally I will only be loading one file at a time
Is that one file per run? If so, nothing needs to be done.
Otherwise the "solutions" are (pick one or more):
1. when it happens, restart your app;
2. organize things so that you don't need those large objects, or at least reduce their number;
3. make sure you free the large objects as soon as you don't need them any more (and before allocating more of them). This may involve setting a reference to null AND calling GC.Collect() explicitly (which under normal circumstances is better not done);
4. organize memory reuse, rather than forgetting about stale objects and allocating new ones. However, this can't be done for methods that return a (new) large object, such as your stream.Read or my File.ReadAllBytes.
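Point 3 above could be sketched roughly like this (my own illustration, not code from the thread; the processing step is a placeholder): each large array is released before the next one is allocated.

```csharp
using System;
using System.IO;

class LargeObjectRelease
{
    // Drop the reference to each large array and collect before the next
    // big allocation, so the LOH block can be reused. GC.Collect() is
    // normally best avoided; this is the exceptional case described above.
    public static void ProcessFiles(string[] paths)
    {
        foreach (string path in paths)
        {
            byte[] data = File.ReadAllBytes(path);                  // potentially a large object
            Console.WriteLine("{0}: {1} bytes", path, data.Length); // stand-in for real processing
            data = null;   // no live reference remains
            GC.Collect();  // frees the LOH block before the next iteration allocates
        }
    }

    static void Main()
    {
        string f = Path.GetTempFileName();
        File.WriteAllBytes(f, new byte[100000]);  // large enough for the LOH
        ProcessFiles(new[] { f });
        File.Delete(f);
    }
}
```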
|
|
|
|
|
Luc Pattyn wrote: That heap does not get compacted, hence it may become a subject of fragmentation, ...
I did not know that.
|
|
|
|
|
There is a length limit for a stream; it won't take all of your machine's memory. I think it was around 512 MB on 64-bit systems.
|
|
|
|
|
Thanks everyone. I will give this a try.
Sharon
|
|
|
|
|
Based on Luc Pattyn's answer I found the following:
Trouble with the Large Object Heap[^]
It claims the problem does not exist on large-memory platforms, although that seems unlikely to me (more likely it just takes longer to show up).
However, if that is the problem, then based on your original code snippet the solution would be to use a fixed buffer rather than allocating one each time. This would require redoing the code to allow for reading files that are larger than the allocated buffer.
The solution is simple if you are doing sequential processing, but more complicated with multithreaded processing. In that case you would need to use an object pool, where the pool contains buffers.
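A minimal sketch of such a buffer pool (my own illustration; the class name and sizes are invented): threads rent a fixed-size buffer and return it when done, so the large byte arrays are reused instead of reallocated.

```csharp
using System;
using System.Collections.Concurrent;

// Thread-safe pool of equally sized buffers. Renting reuses a returned
// buffer when one is available, so large byte[] objects are recycled
// instead of piling up on (and fragmenting) the Large Object Heap.
class BufferPool
{
    private readonly ConcurrentBag<byte[]> _pool = new ConcurrentBag<byte[]>();
    private readonly int _bufferSize;

    public BufferPool(int bufferSize) { _bufferSize = bufferSize; }

    public byte[] Rent()
    {
        byte[] buffer;
        return _pool.TryTake(out buffer) ? buffer : new byte[_bufferSize];
    }

    public void Return(byte[] buffer)
    {
        if (buffer != null && buffer.Length == _bufferSize)
            _pool.Add(buffer);
    }

    static void Main()
    {
        var pool = new BufferPool(1024 * 1024);
        byte[] a = pool.Rent();
        pool.Return(a);
        byte[] b = pool.Rent();
        Console.WriteLine(object.ReferenceEquals(a, b));  // True: the buffer was reused
    }
}
```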
|
|
|
|
|
Hi!
I have a file which contains this information:
A
B
C
D
E
F
I iterate over all lines in the file. When I reach B, I want to store the location of C in a variable.
When I later hit EndOfStream I would like to set the StreamReader's pointer back to C with the Seek() method and continue reading until I hit EndOfStream again.
Here is where I stumble into problems. When I store the StreamReader's BaseStream.Position in a long variable, it always stores 16.
When later using the Seek() method with the stored location I don't read C, just null.
I don't know if EndOfStream has something to do with it. I also tried to use the DiscardBufferedData() method, but it just flags the EndOfStream property.
This might look like very weird functionality, but it's a small part of a feature I need.
I just need to know how to set the position of a StreamReader instance somewhere in a file and continue reading from it.
Here is my code:
static void Main(string[] args)
{
    string filePath = @"C:\File1.txt";
    StreamReader reader = new StreamReader(filePath);
    long position = 0;
    while (!reader.EndOfStream)
    {
        string line = reader.ReadLine();
        if (line == "B")
        {
            position = reader.BaseStream.Position;
        }
        Console.WriteLine(line);
    }
    reader.BaseStream.Seek(position, SeekOrigin.Begin);
    Console.WriteLine(reader.ReadLine());
    reader.Close();
}
|
|
|
|
|
The reason it returns 16 is that 16 is the length of the file...
BaseStream.Position does not do what you want; you want to use the Seek method combined with the position of the character in the stream. So, there is good news and bad news:
Good news: the position is stored in StreamReader.charPos, and it does exactly what you want.
Bad news: it's a private variable.
So getting and using it is a bit dodgy: you can do it via reflection, but then your code relies on an undocumented feature, which could disappear at any time. If you want to go that route, there is an extension method which does it all for you here: http://www.daniweb.com/software-development/csharp/threads/35078/page2[^] The code is about 3/4 of the way down.
Not a good idea, to my mind. Instead, I would use a BinaryReader and read the data directly, processing the lines myself:
BinaryReader br = new BinaryReader(File.OpenRead(@"D:\Temp\MyList.txt"));
int recordLength = 16;
for (int i = 0; i < 10; i++)
{
    Console.WriteLine("{0}: {1} = {2}", i, br.BaseStream.Position, (char)br.ReadByte());
}
Console.WriteLine();
Gives:
0: 0 = A
1: 1 = B
2: 2 = C
3: 3 = D
4: 4 = E
5: 5 = F
6: 6 = G
7: 7 = H
8: 8 = I
9: 9 = U
Ideological Purity is no substitute for being able to stick your thumb down a pipe to stop the water
|
|
|
|
|
It appears that StreamReader reads more than you expect and buffers it; hence the Position of the Stream isn't where you think it is.
I suggest using a Stream rather than StreamReader.
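One way to do that, sketched below (my own illustration, not from the thread): read lines byte by byte straight from the FileStream, so Position is always exactly the start of the next line and Seek works as expected. It assumes single-byte (ASCII) text with \n or \r\n line endings.

```csharp
using System;
using System.IO;
using System.Text;

class SeekableLineReader
{
    // Read one ASCII line, leaving stream.Position exactly at the start
    // of the next line. Returns null at end of stream.
    public static string ReadLine(Stream stream)
    {
        var sb = new StringBuilder();
        int b;
        while ((b = stream.ReadByte()) != -1)
        {
            if (b == '\n') return sb.ToString();
            if (b != '\r') sb.Append((char)b);
        }
        return sb.Length > 0 ? sb.ToString() : null;
    }

    static void Main()
    {
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "A\nB\nC\nD\nE\nF\n");
        using (FileStream fs = File.OpenRead(path))
        {
            long positionOfC = 0;
            string line;
            while ((line = ReadLine(fs)) != null)
                if (line == "B")
                    positionOfC = fs.Position;   // byte offset of the line after B
            fs.Seek(positionOfC, SeekOrigin.Begin);
            Console.WriteLine(ReadLine(fs));     // C
        }
        File.Delete(path);
    }
}
```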
|
|
|
|
|
Hello everyone,
I am starting to develop an application using C#, but I cannot handle the DataGridView. I actually manage to display data from a table, but after displaying it I need to recover the contained of a cell and I cannot! In fact I want to use the DataGridView to manage user profiles (add, edit and delete).
Please, I need your help!
|
|
|
|
|
I need to recover the contained of a cell but I can not
This doesn't make sense. Did you mean 'content'? The DGV allows you to access cells directly via its indexer: myDGV[2, 6].Value (column 2, row 6), or equivalently myDGV.Rows[6].Cells[2].Value.
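A small self-contained sketch of both access patterns on an unbound grid (the column and values are invented for illustration):

```csharp
using System;
using System.Windows.Forms;

class DgvCellDemo
{
    [STAThread]
    static void Main()
    {
        var dgv = new DataGridView();
        dgv.Columns.Add("UserName", "User Name");
        dgv.Rows.Add("alice");

        // Indexer order is [columnIndex, rowIndex]:
        object byIndexer = dgv[0, 0].Value;
        // Equivalent, via the Rows collection (rowIndex first):
        object byRows = dgv.Rows[0].Cells["UserName"].Value;

        Console.WriteLine(byIndexer);        // alice
        Console.WriteLine(byRows);           // alice

        // Writing a cell works the same way:
        dgv.Rows[0].Cells[0].Value = "bob";
        Console.WriteLine(dgv[0, 0].Value);  // bob
    }
}
```

For editing a user profile, you would typically read the edited values back out of the row's cells (or out of the bound data source) in the CellEndEdit event.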
|
|
|
|