|
Greetings!
1) How do I eliminate the first column in the Windows Forms DataGrid, the one whose cells contain the auxiliary stars and arrows?
2) Can the DataGrid control be used to develop a daily timesheet like the one in Outlook? If not, what should I use, or should I write my own Windows control?
Thanks for your support!
|
|
|
|
|
1) Have a look at the RowHeadersVisible property of the DataGrid and of the DataGridTableStyle (the latter is only necessary when you are using self-created table styles).
2) I don't think you will have much luck with the DataGrid. I think there are some free and commercial controls out there, but I don't have any at hand right now. Google is your friend.
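In code, that comes down to something like the following sketch. The helper name is made up, and it assumes you have a DataGrid instance at hand:

```csharp
using System.Windows.Forms;

static class GridSetup {
    // Hides the row-header column (the one showing the arrow/star glyphs).
    public static void HideRowHeaders(DataGrid grid) {
        grid.RowHeadersVisible = false;
        // Custom table styles carry their own copy of the setting.
        foreach (DataGridTableStyle style in grid.TableStyles)
            style.RowHeadersVisible = false;
    }
}
```

Typical usage would be a call like GridSetup.HideRowHeaders(dataGrid1) in the form's constructor after InitializeComponent.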
|
|
|
|
|
On the VS forms designer, some Controls act as a container for other Controls and some do not. Select a Button and drag another from the toolbox onto the first and you get two Buttons on the Form. But select a Frame and drag a Button onto it and you get a Button on a Frame on a Form. What tells the designer whether or not to treat a Control as a container for other Controls?
I've tried to create a container Control by inheriting from UserControl, but I cannot get it to act as a container (despite the fact that UserControl inherits from ContainerControl). The only way I've found to write my own container Control is to inherit from Frame. But there are Controls written by others on Code Project that act as containers and actually inherit from Control directly. I've looked carefully at their source code (and in the documentation) and I cannot discover what it is specifically in the code that tells the designer to put Controls dragged onto it into its Components collection rather than its parent's.
Can anyone enlighten me?
|
|
|
|
|
|
Hm, yes, I read that bit before, but it does not really answer the question. The components member that gets added in the UserControl (and Form) template implements this interface. My problem is that if you create a control based on UserControl, the designer for UserControls will add components dragged onto it from the toolbox to its components collection; so far so good. But when you put this UserControl onto a Form, it no longer acts as a container, and components dragged onto it get added to the Form's components collection, not the UserControl's. Something is telling the Form designer to treat some components as valid containers and some not, even when they all have components collections. I expected this to be an attribute, but apparently not.
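For what it's worth, there does appear to be an attribute involved: associating the control type with ParentControlDesigner is what tells the forms designer to treat it as a container for dropped controls. A minimal sketch (the class name is invented; ParentControlDesigner requires a reference to System.Design.dll):

```csharp
using System.ComponentModel;
using System.Windows.Forms;
using System.Windows.Forms.Design;   // ParentControlDesigner lives in System.Design.dll

// The Designer attribute tells the forms designer to let this control
// host other controls dropped onto it at design time.
[Designer(typeof(ParentControlDesigner))]
public class HostPanel : UserControl { }
```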
|
|
|
|
|
On MSDN it says that only objects having a Finalize method are put in the finalization queue. But since all objects derive from the Object class, and Object has a Finalize method, does that mean that all objects are put in the finalization queue?
Signatures are dumb
|
|
|
|
|
The GC adds only those objects whose type overrides the Finalize method.
Regards
Senthil
_____________________________
My Blog | My Articles | WinMacro
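To illustrate the point: in C#, declaring a destructor is what overrides Object.Finalize, and only instances of such types get registered on the finalization queue. A small sketch (the class names are made up):

```csharp
using System;

class Plain { }   // no finalizer: instances are never queued for finalization

class Noisy {
    public static volatile bool Finalized;
    // Declaring a destructor overrides Object.Finalize, so every Noisy
    // instance is registered on the finalization queue when allocated.
    ~Noisy() { Finalized = true; }
}

class Demo {
    // Allocate in a separate method so the objects are unreachable afterwards.
    static void Allocate() { new Plain(); new Noisy(); }

    public static bool Run() {
        Allocate();
        GC.Collect();
        GC.WaitForPendingFinalizers(); // blocks until the finalizer thread drains the queue
        return Noisy.Finalized;
    }

    static void Main() => Console.WriteLine(Run());
}
```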
|
|
|
|
|
As an introduction: I work for a small-scale CRM software vendor. Since our latest version we are officially on .NET and have a completely API-based middleware architecture. Basically, our APIs are glorified business objects that act as a facade over the database (SQL Server). Most of our clients (around 97%) are on .NET and use these APIs to build their eBusiness applications. The rest are on the earlier, ColdFusion-based version; they will be moving to .NET with the upgrade.
With the new web services "hype", our architect wants us to go the web services route and remove the complete API collection. He wants all clients building eCRM applications to use web services to reach the data. His reasoning is that this makes the client architecture independent of the CRM.
I, for one, think this architecture blows, because we can accomplish the same goal with a service-oriented architecture, get better performance (no SOAP overhead), stay in-process, and spin fewer wheels.
I am looking for input from you geniuses out here. Your input will definitely help me build my case over the next few weeks.
Thanks
|
|
|
|
|
Hi,
I want to programmatically apply a policy to a user/group.
The equivalent manual steps are:
Run gpedit.msc
The path is
Local Computer Policy\Computer Configuration\Local Policies\User Rights Assignments\
In the "Log on Locally" policy there, I want to add the group \\machineName\GroupName.
To achieve this I used the GetAppliedGPOList API to enumerate GPOs.
The API returns SUCCESS, but I don't get anything in the GROUP_POLICY_OBJECT structure.
If I could get the GPO data in that structure, I could call some API to modify the GPO data.
The code is:
// To get the user's SID, call the LookupAccountName API
char domain[256] = "";
DWORD domainSize = sizeof( domain );
DWORD sidSize = 256;
PSID pSid = (PSID) new BYTE[sidSize];
memset( pSid, 0, sidSize );
SID_NAME_USE eSidUse;
if ( !LookupAccountName( NULL, "Administrator", pSid, &sidSize,
                         domain, &domainSize, &eSidUse ) )
{
    DWORD err = GetLastError();
    delete[] (BYTE*) pSid;    // don't leak the SID buffer on failure
    return false;
}
if ( !IsValidSid( pSid ) )
{
    delete[] (BYTE*) pSid;
    return false;
}
// With the user's SID, call GetAppliedGPOList
GROUP_POLICY_OBJECT *pGPOList = NULL;
GUID guid =
{ 0x827D319E, 0x6EAC, 0x11D2, { 0xA4, 0xEA, 0x00, 0xC0, 0x4F, 0x79, 0xF8, 0x3A } };
DWORD error = GetAppliedGPOList( NULL, NULL, pSid, &guid, &pGPOList );
Is there any other way to apply a policy to a user/group? WMI?
Please help!
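As an alternative to going through GPOs at all: for the specific "Log on locally" right, the LSA account-rights API can grant it directly. This is a sketch under the assumption that you already have the group's SID (the function name is made up, error handling is elided, and it is Windows-only, linked against advapi32):

```cpp
// Sketch: granting SeInteractiveLogonRight ("Log on locally") to an account
// via the LSA API instead of editing GPOs. Error handling is elided.
#include <windows.h>
#include <ntsecapi.h>

bool GrantLogonLocally( PSID pSid )
{
    LSA_OBJECT_ATTRIBUTES attrs;
    ZeroMemory( &attrs, sizeof( attrs ) );

    LSA_HANDLE policy;
    // NULL system name means the local machine
    if ( LsaOpenPolicy( NULL, &attrs, POLICY_ALL_ACCESS, &policy ) != 0 )
        return false;

    // The privilege name behind the "Log on locally" user right
    WCHAR name[] = L"SeInteractiveLogonRight";
    LSA_UNICODE_STRING right;
    right.Buffer        = name;
    right.Length        = (USHORT)( wcslen( name ) * sizeof( WCHAR ) );
    right.MaximumLength = right.Length + sizeof( WCHAR );

    NTSTATUS status = LsaAddAccountRights( policy, pSid, &right, 1 );
    LsaClose( policy );
    return status == 0;
}
```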
|
|
|
|
|
Hey all. I am trying to set up a data adapter to show me a list of classes. I do this with a SELECT command that inner-joins some other tables to get names etc. Double-clicking an item then brings up a dialog to edit that data.
The issue I'm having, I guess, is a design one. Because I already have all the data in the dataset on the form with the list, should I use that dataset for the binding and editing in the popup dialog? That way I save a trip to the DB to fetch data I already have. If so, I then have the problem that I can't run DataAdapter.Update, because the SELECT command had an inner join, so no update or insert commands were generated.
Alternatively, I can have two data adapters, each selecting its own table; then the update after the dialog closes will work. But then the list doesn't show the names, since I haven't joined the tables in the select.
Is there a way to create an expression column in one table (the one bound to the list) that gets the correct name from the second table? The dataset already has the relations set up in the XSD.
Cheers, hope that makes some sense.
Thanks
Luke
|
|
|
|
|
Sure.
Have a look at the DataColumn.Expression description on MSDN. Just add a column to your table and give it a valid expression. This even works for aggregating values from all children of one parent, etc.
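A small sketch of the expression syntax for pulling a value from the parent table through a relation. The table, column, and relation names here are made up to match the scenario:

```csharp
using System;
using System.Data;

class ExpressionDemo {
    public static string GetTeacherName() {
        var ds = new DataSet();
        var teachers = ds.Tables.Add("Teacher");
        teachers.Columns.Add("TeacherId", typeof(int));
        teachers.Columns.Add("Name", typeof(string));
        var classes = ds.Tables.Add("Class");
        classes.Columns.Add("ClassId", typeof(int));
        classes.Columns.Add("TeacherId", typeof(int));
        // Parent/child relation from Teacher to Class
        ds.Relations.Add("TeacherClasses",
            teachers.Columns["TeacherId"], classes.Columns["TeacherId"]);
        // The expression column pulls Name from the parent row via the relation,
        // so the joined name is available for binding without a joined SELECT.
        classes.Columns.Add("TeacherName", typeof(string),
            "Parent(TeacherClasses).Name");
        teachers.Rows.Add(1, "Smith");
        classes.Rows.Add(100, 1);
        return (string)classes.Rows[0]["TeacherName"];
    }

    static void Main() => Console.WriteLine(GetTeacherName()); // Smith
}
```

Because the expression column is computed, DataAdapter.Update on each plain table is unaffected by it.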
|
|
|
|
|
I have a two-dimensional array of 600 by 800 integers, and while processing this array I need to visit each element approximately 250 times. That brings the total number of references I need to make to a mammoth figure of 120 million, accordingly increasing the time taken by this method (I am using C#.NET) to around 3.5 seconds. This figure, however, is more than what is acceptable in my scenario. I would want to bring it down to around 2 seconds...
So, I was wondering: is there a way to construct and process a matrix-like structure (with or without two-dimensional arrays, or using pointers or something) that will take less time?
Any assistance is appreciated...
|
|
|
|
|
Try using System.Collections.ArrayList. You'll be surprised, but the ArrayList is hundreds of times faster than an Array. Enumeration is also lightning fast compared to the Array. It's all I use unless some prebuilt function just must use a standard Array.
The ArrayList is also dynamic, which means it can shrink and grow automatically without the need to ReDim or move values from one Array to the next.
One quick note about the ArrayList constructor: it must be instantiated with New, unlike an Array (Dim TestArray(15) As String). You must create the ArrayList, Dim TestArrayList As New ArrayList, then add to it like this: TestArrayList.Add(Value).
Hope this helps,
Scott Page
|
|
|
|
|
Thanks...
I'll try it out...
|
|
|
|
|
Nope! No success...
I understand an ArrayList is better when a dynamic data structure is required, but I don't need a dynamic structure. I basically add the values to the array once at the beginning (takes a fraction of a second), process them (this is what takes ~3-4 seconds), and then read them back from the array (also takes almost no time).
Besides, I need a two-dimensional array. Building such a structure would require me to add a new ArrayList object for each row, which in turn adds to the overhead I pay when referencing each element, since I now need to cast each row to ArrayList.
The use of ArrayList also adds the extra overhead of casting the object to int each time I need to refer to it, as well as calling two functions (ArrayList.RemoveAt() and ArrayList.Insert()) whenever I need to update a value in the structure (something I do often in my processing).
The time improvement I'm looking for isn't so much about updating the data structure I use, but about how I store the values so that they can be read and written fast (faster than in an array).
Hope you have a solution...
|
|
|
|
|
I deal with large amounts of data on a regular basis as well, and most of the time I only need to manipulate a small portion to get the results I need. I'm not sure if this applies to your problem, but if you know the specific points you need, you could reference that location within the array and calculate a smaller amount of data. If you need all of the data and can't live without any of it, then as far as I know right now, processing 120 million data points is just a slow process no matter how you code it.
Sorry I couldn't provide more help, hope someone knows of a solution for you.
Scott
|
|
|
|
|
|
Do you have information about any resources on how to implement multi-dimensional arrays as pointers (in unsafe code) in C# or in C?
|
|
|
|
|
|
I'm already using pointers + unsafe code to convert the input image to a pixel matrix, and to reconvert the processed matrix back to an image.
I don't see using pointers throughout my code providing much improvement, since most of my operations are basic mathematical operations on the integer matrix. Besides, a pixel map resembles the image more closely than the Bitmap data, which is a linear list of the pixel values in the input image. I often need to analyze two-dimensional "windows" around each pixel, which is simpler to do in a two-dimensional map than in a linear data structure.
As for now, the methods suggested by Robert have given some improvements. I'll look into pointers again - am just not very familiar with their use at this point.
Thanks...
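For reference, a minimal sketch of walking a two-dimensional array through a raw pointer in C#. The class and method names are made up, and the code has to be compiled with the /unsafe flag:

```csharp
using System;

class UnsafeSum {
    // Sums a rectangular int[,] through a raw pointer; compile with /unsafe.
    public static unsafe long Sum(int[,] pix) {
        long total = 0;
        // A rectangular array is stored row-major, so one fixed pointer
        // can walk every element linearly without per-access bounds checks.
        fixed (int* p = pix) {
            int n = pix.Length;
            for (int i = 0; i < n; i++) total += p[i];
        }
        return total;
    }

    static void Main() {
        var pix = new int[,] { { 1, 2, 3 }, { 4, 5, 6 } };
        Console.WriteLine(Sum(pix)); // 21
    }
}
```

A two-dimensional "window" at (y, x) then becomes pointer arithmetic of the form p[i * width + j], with i and j ranging over the window rows and columns.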
|
|
|
|
|
This is probably not a data structure problem but an algorithm problem.
It would help a lot if you could describe what you are actually doing or what result you are expecting. I always like crunching performance issues, but for that some more input is needed.
Nevertheless, some suggestions:
1. Don't use Length or GetLength within a loop. Assign the lengths of the two dimensions to variables and use those. Depending on what you are doing within those loops, this can speed things up by up to 50% (only true for Release mode).
2. Test in Release mode. Depending on what operations you are doing, this can have a real impact on your performance.
3. Get a faster processor.
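Suggestion 1 looks like this in practice. A small self-contained sketch (names are illustrative, not the poster's code):

```csharp
using System;

class LoopDemo {
    public static long Sum(int[,] iPix) {
        // Hoist the dimension lookups out of the loops instead of
        // calling GetLength(...) on every iteration.
        int iht = iPix.GetLength(0), iwd = iPix.GetLength(1);
        long sum = 0;
        for (int y = 0; y < iht; y++)
            for (int x = 0; x < iwd; x++)
                sum += iPix[y, x];
        return sum;
    }

    static void Main() {
        var iPix = new int[600, 800];
        iPix[0, 0] = 7;
        Console.WriteLine(Sum(iPix)); // 7
    }
}
```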
|
|
|
|
|
I've tried my best to make my algorithm as optimal as I can...
1.
Here's what I'm trying to do: the two-dimensional matrix I mentioned is a pixel map of an image. I need to process the image, updating the value of each pixel depending on the characteristics of a 10x10 pixel window with the pixel I wish to update at its center. A typical loop, for example, goes like this:
/*
 iht - height of array
 iwd - width of array
 iPix - two-dimensional array
 n1, n2, n3 - integers
 iWindowSize - size of the processing window
*/
int xl, xr, yt, yb, n1, n2, n3;
for (int y = 0; y < iht; y++) {
    for (int x = 0; x < iwd; x++) {
        // get margins of the processing window
        xl = (x < iWindowSize) ? 0 : x - iWindowSize;
        xr = (x > iwd - iWindowSize - 1) ? iwd : x + iWindowSize + 1;
        yt = (y < iWindowSize) ? 0 : y - iWindowSize;
        yb = (y > iht - iWindowSize - 1) ? iht : y + iWindowSize + 1;
        // get mean value
        n1 = 0; n2 = 0;
        for (int i = yt; i < yb; i++) {
            for (int j = xl; j < xr; j++) {
                n1 += iPix[i, j];
                n2++;
            }
        }
        n3 = n1 / n2;   // n3 is the mean
        // get variance
        n1 = 0;
        for (int i = yt; i < yb; i++) {
            for (int j = xl; j < xr; j++) {
                n1 += (n3 - iPix[i, j]) * (n3 - iPix[i, j]);
            }
        }
        n2 = n1 / n2;   // n2 is the variance
        // update pixel
        if (n2 < (n3 * n3) / 4) iPix[y, x] = n3;
        // this is an example of the kind of processing I am using
    }
}
2.
Haven't tried Release mode so far... Guess I'll try that option...
3. Can't change the processor!
|
|
|
|
|
Release mode doesn't provide any advantage.
On the contrary: whereas in Debug mode the average processing time for 14 pictures was 3.411 seconds, it climbed to 3.540 seconds for the same set of pictures in Release mode!
|
|
|
|
|
Hmmm... I've tested exactly the code you posted, and on my machine the Release-mode version cut the time needed to nearly half. Have you tested it within VS or standalone?
Check that the 'optimize' flag is set to true, 'check for arithmetic overflow' to false, and 'generate debug info' also to false.
But I think I have found something to increase the speed of the algorithm itself:
If I understand it correctly, you loop through all columns and, within that loop, through all rows. Then you sum up all values within the window. One big part of the work is summing the values within this window. If you picture all those windows in a chart, you can see that they all overlap each other, so you are summing the same values dozens of times. My suggestion would be to sum up the values of the needed rows for all columns at once before processing a line. That way you don't have to sum all values within the window, only the prepared column sums. The same is probably possible for the variance.
I hope this was somewhat clear. If not, I'll probably implement it myself when I have a few minutes free.
Btw: do you only work with integers? Are those matrix values calculated for only one color channel (red, green, blue), or is the complete RGB value really stored? If the latter, summing up those values could easily overflow an integer variable.
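The column-sum idea generalizes to a summed-area table (integral image), which makes every window sum O(1) regardless of window size. A sketch of the idea, not the poster's actual code (names are made up):

```csharp
using System;

class IntegralDemo {
    // Build a summed-area table: s[y, x] = sum of pix over rows 0..y-1, cols 0..x-1.
    public static long[,] BuildIntegral(int[,] pix, int h, int w) {
        var s = new long[h + 1, w + 1];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                s[y + 1, x + 1] = pix[y, x] + s[y, x + 1] + s[y + 1, x] - s[y, x];
        return s;
    }

    // Sum over the window rows yt..yb-1, cols xl..xr-1 in constant time.
    public static long WindowSum(long[,] s, int yt, int yb, int xl, int xr)
        => s[yb, xr] - s[yt, xr] - s[yb, xl] + s[yt, xl];

    static void Main() {
        var pix = new int[,] { { 1, 2, 3 }, { 4, 5, 6 }, { 7, 8, 9 } };
        var s = BuildIntegral(pix, 3, 3);
        Console.WriteLine(WindowSum(s, 0, 2, 0, 2)); // 1+2+4+5 = 12
    }
}
```

A second table built over the squared pixel values gives window variances the same way, via var = E[x^2] - (E[x])^2. With 8-bit pixels and a 600x800 image, long accumulators comfortably avoid overflow.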
|
|
|
|
|
Great find!
I've made the changes you suggested to my algorithm (to avoid repeatedly summing up columns in overlapping windows) and have been able to shed over a second!
Release mode, however, is still not giving much improvement. The 'optimize' flag is true, and 'check for arithmetic overflow' and 'generate debug info' are both set to false. I should mention that the code I posted is only a small section of my entire procedure. On my end, I'm processing the input image through a series of functions (filters), each of which updates pixel values depending on its environment and is similar to the extract in my previous post.
Currently, the entire procedure takes an average of ~ 2.4 seconds for a set of 14 test pictures in both Debug and Release modes.
Also, am only working with integers. The input images are 8-bit grayscale images, so these integers are only between 0 and 255.
Thanks...
|
|
|
|