|
OK, I will try.
Can you suggest more features?
This code was posted by me...
|
|
|
|
|
I'm trying to insert data into an Access database, and the insert works: when I check the database after the first insert, the data is there. But when I close the application and run it again, the last inserted row is missing. What might be the problem?
|
|
|
|
|
Actually, the scenario is not clear. You are probably using a DataTable with a disconnected recordset. Try to insert directly.
Md. Marufuzzaman
|
|
|
|
|
No, I'm using oleDbDataAdapter1, dataSet11, and oleDbConnection1. The following is the code:
DataRow dr = dataSet11.Table1.NewRow();
dr["ic"] = txt1.Text;
dr["DES"] = txt2.Text;
dr["STKBAL"] = txt3.Text;
oleDbConnection1.Open();
dataSet11.Table1.Rows.Add(dr);
oleDbDataAdapter1.Fill(dataSet11);
oleDbDataAdapter1.Update(dataSet11,"table1");
oleDbConnection1.Close();
|
|
|
|
|
Ummm... you're adding a DataRow to a DataTable in a DataSet, then filling the DataSet with the contents of the database? Doesn't that wipe out the data you just put in the DataSet?
Other than that, why are you using a DataSet and a DataAdapter at all? You probably just want to use ExecuteNonQuery to add the data.
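A minimal sketch of that direct-insert suggestion. The table and column names (table1, ic, DES, STKBAL) are taken from the code earlier in this thread; the connection-string and parameter names in the commented usage are placeholders of mine, not anything from the original post:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

// Builds a positional-parameter INSERT statement in the style OLE DB
// expects: '?' placeholders, bound in the order they are added.
static string BuildInsertSql(string table, params string[] columns)
{
    string cols = string.Join(", ", columns);
    string marks = string.Join(", ", columns.Select(_ => "?"));
    return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
}

// With an Access database this would be used roughly like so
// (connString is hypothetical):
//
// using (var conn = new OleDbConnection(connString))
// using (var cmd = new OleDbCommand(
//     BuildInsertSql("table1", "ic", "DES", "STKBAL"), conn))
// {
//     cmd.Parameters.AddWithValue("p1", txt1.Text);  // bound by position
//     cmd.Parameters.AddWithValue("p2", txt2.Text);
//     cmd.Parameters.AddWithValue("p3", txt3.Text);
//     conn.Open();
//     cmd.ExecuteNonQuery();
// }

Console.WriteLine(BuildInsertSql("table1", "ic", "DES", "STKBAL"));
```

No DataSet, no adapter, no Fill to trip over: the row goes straight to the database file.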
|
|
|
|
|
I would have thought that you might want to Open the Connection first??
Excellence is doing ordinary things extraordinarily well.
|
|
|
|
|
Check all the stored procedures/SQL queries that are fired at the database before you close your application. If the data was inserted, it can only vanish if some command that deletes it is run.
Had it been SQL Server, Profiler would have helped you out. I don't know whether it works with Access, or whether there is something similar for Access databases.
|
|
|
|
|
If you are running your application from Visual Studio and your Access database file is added to the project, right-click the file, select Properties, and set "Copy to Output Directory" to "Copy if newer". With "Copy always", every build overwrites the database in the output folder with the pristine copy from the project, which discards your inserted rows.
Hope it helps.
Let there arise out of you a band of people inviting to all that is good, enjoining what is right, and forbidding what is wrong: They are the ones to attain felicity.
Āli-'Imrān (The Family of Imran), 104.
|
|
|
|
|
Here is my code. I tried the same code for file access control and it works well, but it doesn't work for the common application data folder on Vista.
<code>
private static void GrantEveryoneFullControlRight(string directory)
{
    try
    {
        if (!Directory.Exists(directory))
            Directory.CreateDirectory(directory);

        DirectoryInfo dirInfo = new DirectoryInfo(directory);
        DirectorySecurity ds = dirInfo.GetAccessControl(AccessControlSections.Access);

        // Note: despite the method name, this grants to the built-in
        // "Users" group; that group name is localized on non-English
        // versions of Windows.
        FileSystemAccessRule rule = new FileSystemAccessRule(
            "Users", FileSystemRights.FullControl, AccessControlType.Allow);
        ds.AddAccessRule(rule);
        dirInfo.SetAccessControl(ds);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}
</code>
Glad to discuss with you and best wishes.
|
|
|
|
|
|
I do something very similar except use "Everyone" in the access rule:
<code>
string appDataPath = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
string _localUserDataPath = Path.Combine(appDataPath, "My Company\\MyApp\\");

if (!Directory.Exists(_localUserDataPath))
{
    DirectorySecurity ds = new DirectorySecurity();
    ds.AddAccessRule(new FileSystemAccessRule("Everyone", FileSystemRights.FullControl, AccessControlType.Allow));
    Directory.CreateDirectory(_localUserDataPath, ds);
}
else
{
    // Note: a fresh DirectorySecurity replaces the directory's existing
    // ACL rather than adding to it; to preserve existing rules, start
    // from di.GetAccessControl() instead.
    DirectoryInfo di = new DirectoryInfo(_localUserDataPath);
    DirectorySecurity ds = new DirectorySecurity();
    ds.AddAccessRule(new FileSystemAccessRule("Everyone", FileSystemRights.FullControl, AccessControlType.Allow));
    di.SetAccessControl(ds);
}
</code>
|
|
|
|
|
Does this work on Vista?
Glad to discuss with you and best wishes.
|
|
|
|
|
Well, I thought it was working on Vista, but I must have been running my client in admin mode, because I can't get it to work again. What does work is creating an installer package for my Windows service that creates the directory in ProgramData. Using VS.NET 2008, I added a custom folder called Common Application Data Folder (any name will work). I set its DefaultLocation property to [CommonAppDataFolder] and its Property property to COMMONAPPDATAFOLDER. Then I added my own folders to this folder (Company\commondir). When the install is run, it creates the folders and adds "Users" to the security properties for those folders.
This seems to give "Users" enough permission to read/execute and write files in that location. My non-admin Windows client can read/write/execute from the folder, and my Windows service can as well.
|
|
|
|
|
Thanks for your reply. We granted the directory access rights in the installer too. It works well.
Glad to discuss with you and best wishes.
|
|
|
|
|
I am working in C# and dealing with very large images - too large to load the entire image into memory (which is what the GDI+ Bitmap class does). I wondered if anyone here has had a similar experience and, if so, how this type of issue was solved?
Any ideas are much appreciated.
Thanks!
|
|
|
|
|
Hi,
I would try to avoid such a situation completely, probably by working from a grid of smaller images to start with.
Except maybe for a database, I wouldn't want to have objects that are larger than main memory.
What is the size of your image, what is its source, and what is your application?
Luc Pattyn [Forum Guidelines] [My Articles]
The quality and detail of your question reflects on the effectiveness of the help you are likely to get.
Show formatted code inside PRE tags, and give clear symptoms when describing a problem.
|
|
|
|
|
Unfortunately I can't really decrease the size of the images, at least as they enter the software. Are there any other solutions that might work with super-large images?
[further detail on the issue]
I'm working with scanned images. My users believe (incorrectly) that to get a good scan they need to work at 1000 dpi. Scanning an entire page yields a 250 MB image. I keep the original intact for good measure and only work with a copy of it, which means another 250 MB. Finally, they make numerous partial copies of the original image to do whatever they do, which means probably another 50 MB x 10 copies = 500 MB. Total memory for this scenario so far is somewhere around 1 GB. If the user applies filters in the software, these can eat extra memory and push it over the edge.
Even if the software doesn't crash in a single scenario like that, the user will probably proceed to do this a few times, possibly in a few different tabs at once. Long story short, it takes some work to get the software to crash, but sooner or later they swamp the software with all the images they are using. No one seems too happy about the whole ordeal...
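As a sanity check on those figures, a quick bit of arithmetic. The letter-size page (8.5 x 11 inches) and 24-bit color depth are assumptions of mine; they land close to the 250 MB per image described above:

```csharp
using System;

// Bytes needed for an uncompressed scan of a widthIn x heightIn inch page
// at the given dpi and bit depth.
static long ScanBytes(double widthIn, double heightIn, int dpi, int bitsPerPixel)
{
    long pixels = (long)(widthIn * dpi) * (long)(heightIn * dpi);
    return pixels * bitsPerPixel / 8;
}

// 8.5 x 11 in at 1000 dpi is 8500 x 11000 = 93.5 million pixels; at 24-bit
// color that is 280,500,000 bytes, roughly the 250 MB quoted above.
Console.WriteLine(ScanBytes(8.5, 11, 1000, 24));
```

At 32 bpp ARGB (the format the filters below use), the same page grows to 374,000,000 bytes, which is why a handful of copies exhausts a 32-bit process.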
|
|
|
|
|
Would compression help? TIFF images support LZW compression.
Maybe skip keeping the original intact, to save 250 MB, and reload it from disk when you want to restore it.
Another idea: instead of partial copies, create a class that provides a virtual partial copy: a link to the whole image, plus a Rectangle that defines a portion of it.
If a page contains only black-and-white text, you could use one bit per pixel, which would give you 32x compression and faster processing.
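The virtual-partial-copy idea could be sketched roughly like this. VirtualCrop and its members are hypothetical names of mine, not from any library:

```csharp
using System;
using System.Drawing;

// A virtual partial copy: no pixel data of its own, just a reference to the
// one shared full-size image plus the Rectangle it covers.
class VirtualCrop
{
    public readonly object Source;     // the shared full-size image
    public readonly Rectangle Region;  // portion of the source this crop shows

    public VirtualCrop(object source, Rectangle region)
    {
        Source = source;
        Region = region;
    }

    // Map a point in crop-local coordinates to source-image coordinates.
    public Point ToSource(Point local)
        => new Point(Region.X + local.X, Region.Y + local.Y);
}
```

Each "copy" then costs a few dozen bytes instead of tens of megabytes. The catch is that edits made through a crop write into the shared source; true copy-on-write would need extra machinery.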
|
|
|
|
|
Thanks for these great ideas - just a few thoughts on them:
+ I'm not sure about compression; I shall have to try it.
+ I have a cache already. I shall check it to make sure it operates correctly.
+ Sometimes my users delete the "original" from the screen. "Virtual" copies wouldn't work well, then.
+ The "1bpp" idea is probably the best, but all of the filters I have right now only work on 32 bpp ARGB. I shall have to rewrite these so I can directly modify a 1bpp picture. By the way, can .NET/GDI+ work directly with 1bpp?
Keep the ideas coming!
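On the 1bpp question above: GDI+ can load, save, and LockBits a Format1bppIndexed bitmap, but Graphics cannot draw on indexed pixel formats, so rewritten filters would have to work on the raw scanlines. A small sketch of the stride arithmetic that involves; the helper name is mine:

```csharp
using System;

// GDI+ pads each scanline to a 4-byte boundary; this computes the stride
// (bytes per row) of a 1 bit-per-pixel bitmap of the given width.
static int Stride1bpp(int widthPixels)
{
    int bytes = (widthPixels + 7) / 8;   // 8 pixels packed per byte
    return (bytes + 3) / 4 * 4;          // round up to a 32-bit boundary
}

// An 8500-pixel-wide row (8.5 in at 1000 dpi) needs 1064 bytes at 1bpp,
// versus 34,000 bytes per row at 32 bpp ARGB.
Console.WriteLine(Stride1bpp(8500));
```

So the 32x saving claimed earlier holds per scanline, minus a few padding bytes.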
|
|
|
|
|
awaldro wrote: I'm working with scanned images. My users believe (incorrectly) that to get a good scan they need to work at 1000 dpi. Scanning an entire page yields a 250 MB image.
If I may ask, what is on their pages that would require such a resolution?
The last time I scanned with such a resolution, I could count almost every single atom! If you know what I mean.
|
|
|
|
|
I certainly do - the resolution you can get from a reasonable scanner is quite amazing! The users are scanning latent prints from crime scenes and lift cards. Unfortunately, none of them (or you) would be happy with an invalid positive ID - an "inclusion" - so they use 1000 dpi. Additionally, there is a standards board that regulates the latent-print software market and 'requires' 1000 dpi. It's really funny - some users think they are going "above and beyond" by scanning at 1200 dpi!
I believe I am going to try writing my own C++ code (piecing together parts of FreeImage and a few other free libraries I have found). Who knows how this experiment will turn out.
Of course, if there are any other ideas - I am still open!
|
|
|
|
|
Why is 1 GB of memory a problem? Are you talking about RAM?
I came, saw, and then coded.
|
|
|
|
|
The problem isn't "1 GB". The problem is "1 GB per Tab" - which is a bit different!
|
|
|
|
|
Oh, yes, the tabs you mentioned. How much RAM do they have? 3 GB of RAM plus virtual memory may still be enough. Have a look at the virtual memory; if that fills up, then you have a problem.
I came, saw, and then coded.
|
|
|
|
|
The computers have 4 GB of physical memory (32-bit CPU), but I certainly don't get 4 GB of memory. Actually, it seems like I get roughly 1.1 GB. I presumed this was a .NET thing. Is there a site anywhere that actually says how much memory the .NET runtime can allocate?
|
|
|
|