|
It sounds like you don't want to use app.config at all*; find another solution.
* I write my own config files.
|
|
|
|
|
I love it, but I don't want others to open it and play with the data in there!
|
|
|
|
|
That's like saying you love your girlfriend, but don't want others to... oh, wait, maybe that's not a good analogy.
|
|
|
|
|
Is it possible to compile the App.config into the EXE output, or as a separate DLL? Will I still be able to read and write the appSettings in that case?
|
|
|
|
|
Hi,
I am new to C#. I need help designing software to move objects in a window. There can be 4-5 objects which will move in the window, and we can use the mouse to catch all those objects. I hope someone will help me do that.
|
|
|
|
|
Hi,
you should learn to look around; make Google your friend, and use the CodeProject search facilities. Here[^] are some results.
|
|
|
|
|
Windows Forms controls have a property named Location. By changing it you can move objects in the window. For example, to move a button on the screen you write:
yourButtonName.Location = new Point(newX, newY);
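To let the user "catch" and drag such objects with the mouse, you can combine Location with the mouse events. A minimal sketch (the field and method names here are mine, purely illustrative):

```csharp
// Illustrative sketch: make any control draggable with the left mouse
// button by tracking the grab point on MouseDown and repositioning the
// control on MouseMove.
private Point dragOffset;

private void AttachDrag(Control c)
{
    c.MouseDown += (s, e) => dragOffset = e.Location;
    c.MouseMove += (s, e) =>
    {
        if (e.Button == MouseButtons.Left)
        {
            var ctl = (Control)s;
            ctl.Location = new Point(
                ctl.Left + e.X - dragOffset.X,
                ctl.Top + e.Y - dragOffset.Y);
        }
    };
}
```

Call AttachDrag once for each of your 4-5 controls and they all become draggable.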
|
|
|
|
|
So I've set my grid to allowReorder.
This persists the column order while the program runs, but once the program is restarted it goes back to the default.
Can someone let me know how I can save a user's reorder preference?
I'm thinking it could be a property? If not, maybe build a method to take care of this.
|
|
|
|
|
In your DataGridView, look at the ColumnHeaderMouseClick event. You can read DataGridView1.SortedColumn.Index to know which column is being sorted on, and DataGridView1.SortOrder to see whether it is ascending or descending. Then, the next time you refill the grid, sort on that column in that order.
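As a sketch of that approach (the field names and grid name are mine, purely illustrative): remember the sorted column and direction in the click handler, then reapply them after a refill.

```csharp
// Illustrative sketch: remember the last sort column and direction,
// then reapply them after the grid has been refilled.
private int lastSortColumn = -1;
private SortOrder lastSortOrder = SortOrder.None;

private void dataGridView1_ColumnHeaderMouseClick(
    object sender, DataGridViewCellMouseEventArgs e)
{
    if (dataGridView1.SortedColumn != null)
    {
        lastSortColumn = dataGridView1.SortedColumn.Index;
        lastSortOrder = dataGridView1.SortOrder;
    }
}

private void ReapplySort()
{
    if (lastSortColumn >= 0 && lastSortOrder != SortOrder.None)
    {
        dataGridView1.Sort(
            dataGridView1.Columns[lastSortColumn],
            lastSortOrder == SortOrder.Ascending
                ? System.ComponentModel.ListSortDirection.Ascending
                : System.ComponentModel.ListSortDirection.Descending);
    }
}
```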
|
|
|
|
|
I think he was asking about the order that the columns appear, left to right.
|
|
|
|
|
Ugh, that's what I get for not reading thoroughly.
|
|
|
|
|
|
As Piebald suggested, persist the order for each user. I use XML and store that sort of thing in the user's application data folder. ASP would use a cookie.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Yes, I persist on ColumnHeaderClick.
So it looks like I would have to save the column order when the user navigates away from that specific panel.
My question (XML is going to be the best option):
each column has an index, and on build the grid creates the columns in order of their index.
Which column header property is this, and how would I go about saving it to and reading it from XML to populate those index numbers?
|
|
|
|
|
I use XML in this article[^].
You can set the column index at runtime, construct the DGV from code, or reorder the underlying data source to reflect the column choice. Lots of choices.
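As a sketch of the XML approach (the file path, application name, and class are mine, purely illustrative): each column's DisplayIndex is the property that reflects the user's reordering, so save those to an XML file in the user's application-data folder and restore them on load.

```csharp
using System;
using System.IO;
using System.Linq;
using System.Windows.Forms;
using System.Xml.Linq;

// Illustrative sketch: persist each column's DisplayIndex to a small
// XML file in the user's application-data folder and restore it on load.
static class GridLayout
{
    static string PathFor(string gridName) =>
        Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyApp", gridName + ".xml");

    public static void Save(DataGridView grid)
    {
        var doc = new XElement("columns",
            grid.Columns.Cast<DataGridViewColumn>()
                .Select(c => new XElement("column",
                    new XAttribute("name", c.Name),
                    new XAttribute("displayIndex", c.DisplayIndex))));
        Directory.CreateDirectory(Path.GetDirectoryName(PathFor(grid.Name)));
        doc.Save(PathFor(grid.Name));
    }

    public static void Load(DataGridView grid)
    {
        if (!File.Exists(PathFor(grid.Name))) return;
        foreach (var col in XElement.Load(PathFor(grid.Name)).Elements("column"))
            grid.Columns[(string)col.Attribute("name")].DisplayIndex =
                (int)col.Attribute("displayIndex");
    }
}
```

Call GridLayout.Load in the form's Load handler and GridLayout.Save when the user navigates away (or in FormClosing).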
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Hi,
I wrote some code that has 4 layers with a different number of neurons in each layer, and it has 5 inputs and outputs. I have read many neural network codes and they run very fast, but mine is very slow.
When I use 400 patterns for input and 100,000 iterations, I have to wait a very long time.
Can you help me solve this problem?
I await your recommendations.
Best
The code:
Main block:
double err = 10, mmse;
int ss = data.GetLength(0);
snetwork(0);
initializes();
mmse = 0;
double mser = 0.0;
for (i = 0; i < iterasyon; i++)
{
    mser = 0.0;
    for (int xx = 0; xx < data.GetLength(0); xx++)
    {
        ffw(xx);
        backpropagate(xx);
        mser += mse(nb_cikis, xx);
    }
    mmse = mser / data.GetLength(0);
    if (mmse < 0.00001) break;
}
ffw_egitimout();
Back-propagation of the error:
void backpropagate(int r)
{
    double sum;

    // Output layer: error and delta
    for (int j = 0; j < networks.layers[networks.layers.Length - 1].noron.Length; j++)
    {
        networks.layers[networks.layers.Length - 1].noron[j].hata =
            nb_cikis[r, j] - networks.layers[networks.layers.Length - 1].noron[j].output;
        networks.layers[networks.layers.Length - 1].noron[j].delta =
            networks.layers[networks.layers.Length - 1].noron[j].output
            * (1 - networks.layers[networks.layers.Length - 1].noron[j].output)
            * networks.layers[networks.layers.Length - 1].noron[j].hata;
    }

    // Hidden layers: propagate the deltas backwards
    for (int i = networks.layers.Length - 2; i > 0; i--)
    {
        for (int j = 0; j < networks.layers[i].noron.Length; j++)
        {
            sum = 0.0;
            for (int k = 0; k < networks.layers[i].noron[j].sbaglanti.Length; k++)
            {
                sum += networks.layers[laybul(networks.layers[i].noron[j].sbaglanti[k])]
                           .noron[nronbul(networks.layers[i].noron[j].sbaglanti[k])].delta
                       * networks.layers[laybul(networks.layers[i].noron[j].sbaglanti[k])]
                           .noron[nronbul(networks.layers[i].noron[j].sbaglanti[k])].dentw[k];
            }
            networks.layers[i].noron[j].delta =
                networks.layers[i].noron[j].output
                * (1 - networks.layers[i].noron[j].output) * sum;
        }
    }

    // Apply momentum from the previous weight changes
    for (int i = 1; i < networks.layers.Length; i++)
    {
        for (int j = 0; j < networks.layers[i].noron.Length; j++)
        {
            for (int k = 0; k < networks.layers[i].noron[j].obaglanti.Length; k++)
            {
                networks.layers[i].noron[j].dentw[k] +=
                    momentum * networks.layers[i].noron[j].prew[k];
            }
            networks.layers[i].noron[j].bias += momentum * networks.layers[i].noron[j].prewbias;
        }
    }

    // Update the weights and biases
    for (int i = 1; i < networks.layers.Length; i++)
    {
        for (int j = 0; j < networks.layers[i].noron.Length; j++)
        {
            for (int k = 0; k < networks.layers[i].noron[j].obaglanti.Length; k++)
            {
                networks.layers[i].noron[j].prew[k] =
                    Lr * networks.layers[i].noron[j].delta
                    * networks.layers[laybul(networks.layers[i].noron[j].obaglanti[k])]
                        .noron[nronbul(networks.layers[i].noron[j].obaglanti[k])].output;
                networks.layers[i].noron[j].dentw[k] += networks.layers[i].noron[j].prew[k];
            }
            networks.layers[i].noron[j].prewbias = Lr * networks.layers[i].noron[j].delta;
            networks.layers[i].noron[j].bias += networks.layers[i].noron[j].prewbias;
        }
    }
}
Feed-forward calculation:
void ffw(int v)
{
    double ssum;

    // Input layer: copy the pattern straight to the outputs
    for (int n = 0; n < networks.layers[0].noron.Length; n++)
    {
        networks.layers[0].noron[n].output = ndata[v, n];
    }

    // Remaining layers: weighted sum of the inputs, plus bias, squashed
    for (int l = 1; l < networks.layers.Length; l++)
    {
        for (int n = 0; n < networks.layers[l].noron.Length; n++)
        {
            ssum = 0.0;
            for (int k = 0; k < networks.layers[l].noron[n].obaglanti.Length; k++)
            {
                ssum += networks.layers[laybul(networks.layers[l].noron[n].obaglanti[k])]
                            .noron[nronbul(networks.layers[l].noron[n].obaglanti[k])].output
                        * networks.layers[l].noron[n].dentw[k];
            }
            ssum += networks.layers[l].noron[n].bias;
            networks.layers[l].noron[n].output = activation(ssum);
        }
    }
}
Network classes:
class neuron
{
    public double[] dentw, prew;
    public double prewbias, bias, hata, delta;
    public double[] input;
    public double output;
    public int[] sbaglanti;
    public int[] sbagk;
    public int[] obaglanti;
    public int numara;
    public int lay;
    public int nn;
    public int dents;
    public int oid, sid;

    public neuron(int onron, int snron)
    {
        dentw = new double[onron];
        prew = new double[onron];
        input = new double[onron];
        sbaglanti = new int[snron];
        sbagk = new int[snron];
        obaglanti = new int[onron];
        oid = onron;
        sid = snron;
        dents = onron;
    }
}

class layer
{
    public neuron[] noron;

    public layer()
    {
        noron = new neuron[nron];
        for (int index = 0; index < noron.Length; index++)
        {
            noron[index] = new neuron(nron, nron);
        }
    }

    public layer(int neurons, int sneurons, int kendi)
    {
        noron = new neuron[kendi];
        for (int index = 0; index < noron.Length; index++)
        {
            noron[index] = new neuron(neurons, sneurons);
        }
    }
}

class net
{
    public int ban, hen;
    public layer[] layers;

    public net()
    {
        layers = new layer[lay];
        for (int index = 0; index < layers.Length; index++)
        {
            layers[index] = new layer();
        }
    }

    public net(int lays, int[] neuronMap)
    {
        layers = new layer[lays];
        for (int index = layers.Length - 1; index >= 0; index--)
        {
            if (index == 0)
                layers[index] = new layer(1, neuronMap[index + 1], neuronMap[index]);
            else if (index == layers.Length - 1)
                layers[index] = new layer(neuronMap[index - 1], 1, neuronMap[index]);
            else
                layers[index] = new layer(neuronMap[index - 1], neuronMap[index + 1], neuronMap[index]);
        }
    }
}
#endregion

static int[] neuronMap = new int[] { 5, 10, 10, 5 };
net networks = new net(neuronMap.Length, neuronMap);
modified on Monday, October 11, 2010 4:56 AM
|
|
|
|
|
If you are going to post code, format it properly by using the pre tags.
I know the language. I've read a book. - _Madmatt
|
|
|
|
|
Would you read that much code with this kind of formatting if someone posted it to you?
Think about it and decide.
your time begins now
Life's Like a mirror. Smile at it & it smiles back at you.- P Pilgrim
So Smile Please
|
|
|
|
|
It's hard to go through your complete program, but it looks like you are running too many loops (some of them nested).
I would start by refactoring some of your logic to avoid running so many loops.
Also, try to use break; to get out of a loop once its work is complete.
The funniest thing about this particular signature is that by the time you realise it doesn't say anything it's too late to stop reading it.
My latest tip/trick
Visit the Hindi forum here.
|
|
|
|
|
Are you still struggling with this!
Please format your code; as you can see it is difficult to read, and you are being down-voted.
Point one: nested loops = poor performance [generally].
Your class structure is wrong, IMO. Each Neuron should have a List<Dendrite> for both input and output. A Dendrite should have both an input and an output Neuron, as well as a weighting. You might need to put deltas in there too, I can't remember. There need to be special subclasses for the input layer and output layer, as these have no input neurons and output neurons respectively.
Set the activation inputs, then calculate the activations on each layer in succession. The calculation is easy:
InputActivation = 0.0;
foreach (Dendrite dendrite in InputDendrites)
    InputActivation += dendrite.Activation;
Activation = Squash(InputActivation);
Note that I have the Dendrite class calculating the activation from its input, "Weight * InputActivation".
The backprop works in a similar way (but in reverse!): the desired output is placed onto a property on the output layer, and then each layer is worked back to the input. I really can't remember the ins and outs of this, but you only really need to loop over each neuron in each layer, plus the dendrites.
One advantage of generic Lists over arrays (if you are having a speed problem) is that you can trim "dead" connections (ones whose weights are near zero and have little momentum over multiple iterations). Another suggestion is to define halting criteria, which determine when the NNW is getting it right enough to stop learning. You can over-train a NNW.
One final thing: the OO encapsulation is poor, IMO. The calculation code (as I have suggested above) should be in the Neurons, not in the main code block. This goes for the backprop calcs too. If you do this, you won't have so many array lookups confusing the mix, and you'll be able to profile your app a little better.
[Edit]
My class structure probably isn't the best; I last wrote a NNW ~10 years ago as an undergrad, in C++, but hopefully you'll get the idea. Did you check out the NNW article on CP I mentioned in your previous thread?
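The structure I mean looks roughly like this (a sketch from memory, class and member names are mine; the sigmoid is just one possible squashing function):

```csharp
using System;
using System.Collections.Generic;

// Rough sketch of the structure described above: each Dendrite joins an
// input neuron to an output neuron and carries the weight, so the
// activation calculation lives inside the classes, not in the main loop.
class Dendrite
{
    public Neuron Input;
    public Neuron Output;
    public double Weight;

    // Contribution of this connection to the output neuron's input.
    public double Activation => Weight * Input.Activation;
}

class Neuron
{
    public List<Dendrite> InputDendrites = new List<Dendrite>();
    public List<Dendrite> OutputDendrites = new List<Dendrite>();
    public double Activation;

    public void Calculate()
    {
        double inputActivation = 0.0;
        foreach (Dendrite dendrite in InputDendrites)
            inputActivation += dendrite.Activation;
        Activation = Squash(inputActivation);
    }

    static double Squash(double x) => 1.0 / (1.0 + Math.Exp(-x)); // sigmoid
}
```

With this shape you loop over the layers, calling Calculate on each neuron, and all the index juggling disappears.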
|
|
|
|
|
Hi everybody,
Firstly, thanks for the reply.
I wonder, is there any solution for the nested loops in an ANN?
Can you explain with an example?
My project is a back-propagation neural network.
If I change my class structure, will the program run faster?
|
|
|
|
|
Hello,
I am dealing with some problems using the WPF advanced text formatting capabilities. I have created my custom TextSource, which loops through custom spans in text lines that contain properties such as offset, length and forecolor. For each span, it creates a TextRun; however, through debugging, I have found that it creates the TextRuns multiple times for each span, which results in the text being shown multiple times.
Has anyone had the same problem? Should I post my code?
Thanks,
Theo
|
|
|
|
|
Theodor Storm Kristensen wrote: Should I post my code?
Yes - but in the WPF forum.
Real men don't use instructions. They are only the manufacturer's opinion on how to put the thing together.
|
|
|
|
|
Hi Experts,
Kindly let me know how I can use the following command from C#:
MD C:\ABC123
Regards
(Riaz)
|
|
|
|
|
Google is your friend:
http://msdn.microsoft.com/en-us/library/as2f1fez.aspx[^]
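In short, the C# equivalent of that shell command is a one-liner (assuming you have permission to create directories on C:):

```csharp
using System.IO;

// Equivalent of the shell command "MD C:\ABC123".
// CreateDirectory also creates any missing parent directories,
// and does nothing if the directory already exists.
Directory.CreateDirectory(@"C:\ABC123");
```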
.45 ACP - because shooting twice is just silly ----- "Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
|
|
|
|