Hmmm, how about this nested loop?
static void Main()
{
    List<int[]> n = new List<int[]>();
    // assume n already holds N entries
    for (int i = 0; i < n.Count; i++)
    {
        n[i] = new int[100];
        for (int j = 0; j < n[i].Length; j++)
        {
            for (int k = 0; k < n[i].Length; k++)
            {
            }
        }
    }
}
Will this still be O(N^3)? The two inner loops are not proportional to the size of the problem.
It is said that the most complex structures built by mankind are software systems. This is not generally appreciated because most people cannot see them. Maybe that's a good thing because if we saw them as buildings, we'd deem many of them unsafe.
This is when you can start talking about O(N*M^2), where M is the "usual" size of an array stored in the list. In reality you'd probably have some idea of how many elements go into the arrays. Say you were storing four integers in each array and had hundreds of these arrays in the list; you would probably refer to it as O(N), as M is fairly insignificant. If the arrays contained n.Count items, then it would be O(N^3).
Big-O notation is a very rough analysis; you are generally interested in the highest power.
It depends on the data too: some sorting algorithms are worst case O(N^2), but the likely case is much gentler.
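A quick way to see the difference is to count how many times the innermost body runs; a small sketch (the sizes are made up) comparing a fixed M with an M tied to N:

```csharp
using System;
using System.Collections.Generic;

class IterationCount
{
    // Builds a List of n arrays of m ints and counts how many times the
    // innermost loop body executes: n * m * m iterations in total.
    public static long Count(int n, int m)
    {
        List<int[]> list = new List<int[]>();
        for (int i = 0; i < n; i++)
            list.Add(new int[m]);

        long iterations = 0;
        for (int i = 0; i < list.Count; i++)
            for (int j = 0; j < list[i].Length; j++)
                for (int k = 0; k < list[i].Length; k++)
                    iterations++;
        return iterations;
    }

    static void Main()
    {
        // M fixed at 4: doubling N doubles the work, so it behaves as O(N).
        Console.WriteLine(Count(100, 4)); // 1600
        Console.WriteLine(Count(200, 4)); // 3200
        // M tied to N: the count grows as N^3.
        Console.WriteLine(Count(10, 10)); // 1000
    }
}
```

With M constant the M^2 factor is just a constant multiplier, which Big-O discards; only when M grows with N does the cubic term appear.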
I see. So in the example above, if N > 100, then M (the number of integers in each array) does not really matter? But if N < M, then it matters.
So can I say that the best case is just O(N) and the worst is O(N^3)?
Hey Guys, don't want to start another thread so here it goes...
for (int i = 0; i < n; i++)
{
    int x = 1 + 1;
}
is this O(N) or O(1)? I'm just doing a constant operation inside a loop.
It is O(N). It has nothing to do with what is inside the loop (as long as it is not another loop dependent on "n"), just how many times you execute the loop. In your stated case it is linear: for each increase of n you execute the loop only one more time.
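To make that concrete, a sketch that counts how often the constant-time body executes:

```csharp
using System;

class LoopCount
{
    // Runs a constant-time statement n times and returns how often it ran.
    public static int Body(int n)
    {
        int count = 0;
        for (int i = 0; i < n; i++)
        {
            int x = 1 + 1; // constant-time work; its cost does not depend on n
            count++;
        }
        return count;
    }

    static void Main()
    {
        Console.WriteLine(Body(1000)); // 1000
        Console.WriteLine(Body(2000)); // 2000: double n, double the work -> O(N)
    }
}
```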
You are doing the constant operation N times...
Yeah, but it doesn't matter, right?
It's still O(N). It doesn't matter whether the operation inside the loop is expensive or inexpensive?
Yep, it doesn't matter that the execution time of the body is trivial. The main thing is that if you double N then you double the execution time; N*10 -> 10*t, etc.
In that exact case the compiler would probably optimise it to x = 2.
I am receiving some data files which have been compressed by the Unix compress program. The files have a .Z extension (note this is not TAR/GZIP format).
So far I am unpacking these using an old compress port (shelling out of my .NET app to run the process), but I would like to get the algorithm and do it in managed code. The .Z files have the magic bytes 1F 9D at the start of each file.
Either C# or VB is okay, or even C if that's all that is available.
'Howard
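Full LZW decompression is too long for a post, but recognising the format in managed code is at least straightforward. A minimal sketch (the byte values in Main are invented for the example) that checks the 1F 9D magic bytes before handing the data to a decoder:

```csharp
using System;
using System.IO;

class ZFormatCheck
{
    // Returns true if the stream starts with the compress(1) magic bytes 1F 9D.
    // The third byte of a real .Z file encodes the maximum code width (low 5
    // bits) and the block-mode flag (high bit), which a decoder reads next.
    public static bool IsUnixCompress(Stream s)
    {
        int b1 = s.ReadByte();
        int b2 = s.ReadByte();
        return b1 == 0x1F && b2 == 0x9D;
    }

    static void Main()
    {
        byte[] fake = new byte[] { 0x1F, 0x9D, 0x90 }; // invented header bytes
        using (MemoryStream ms = new MemoryStream(fake))
            Console.WriteLine(IsUnixCompress(ms)); // True
    }
}
```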
Hi all,
Are there any general patterns for sorting and grouping data in memory?
My application should read statistical results from a poultry scale. The poultry scale is used to weigh live chicks as they grow, and the main reason to use the scale is to get the average weight of a flock during the growing period. There can be more than one scale at a farm, and after the results are downloaded to the PC, I have a set of scales and a set of weighing results for each scale. For example:
SCALE_NAME * DATE * AVERAGE_WEIGHT
SCALE1 * 2008-08-01 * 2.56
SCALE1 * 2008-08-05 * 2.75
SCALE2 * 2008-08-01 * 2.32
SCALE2 * 2008-08-05 * 2.60
Once I load the data from a database, I need to sort and group it according to the scale name and time. For example, I need to display a graph of weight over time for one scale, compare this graph to another scale etc.
What I don't know is how the data should be organized in memory. Do I store the scale name directly in the Weighing class, or do I make a list of scales first and then have each Weighing object point to the appropriate Scale object? I can imagine how to get the data from the database, but I somehow don't know how to work similarly when the data is in memory. I use .NET 2.0, so LINQ is out of the question.
As you can see, I don't even know what to ask or what to look for... Are there any general approaches to this? I believe this problem must be common.
Thank you,
Petr
My first impression is you can have a Weighing class with three fields: scaleName, date, and weight, and then a generic List<Weighing>.
Reading from the database into the List<Weighing> is easy; just make a new Weighing object for each database record and add it to the list.
Once you have all the Weighings in a list, you can easily manipulate them.
To get a list of all the scales, for example, insert each Weighing's scaleName as a key into a generic Dictionary<string, object>, which acts as a set of unique names (.NET 2.0 has no HashSet).
To get all the Weighings for a particular scale sorted by date, walk the original list and, whenever the scaleName member equals the particular scale, insert the Weighing into a SortedList<DateTime, Weighing> keyed on its date.
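In .NET 2.0 terms, that design can be sketched like this (class and field names are just placeholders):

```csharp
using System;
using System.Collections.Generic;

// One row from the scale: name, date, average weight.
class Weighing
{
    public string ScaleName;
    public DateTime Date;
    public double Weight;

    public Weighing(string scaleName, DateTime date, double weight)
    {
        ScaleName = scaleName;
        Date = date;
        Weight = weight;
    }
}

class Program
{
    // Groups weighings by scale name; each group stays sorted by date
    // because the SortedList keys on the date.
    public static Dictionary<string, SortedList<DateTime, Weighing>> GroupByScale(List<Weighing> all)
    {
        Dictionary<string, SortedList<DateTime, Weighing>> groups =
            new Dictionary<string, SortedList<DateTime, Weighing>>();
        foreach (Weighing w in all)
        {
            if (!groups.ContainsKey(w.ScaleName))
                groups[w.ScaleName] = new SortedList<DateTime, Weighing>();
            groups[w.ScaleName].Add(w.Date, w);
        }
        return groups;
    }

    static void Main()
    {
        List<Weighing> all = new List<Weighing>();
        all.Add(new Weighing("SCALE1", new DateTime(2008, 8, 5), 2.75));
        all.Add(new Weighing("SCALE1", new DateTime(2008, 8, 1), 2.56));
        all.Add(new Weighing("SCALE2", new DateTime(2008, 8, 1), 2.32));

        Dictionary<string, SortedList<DateTime, Weighing>> byScale = GroupByScale(all);
        Console.WriteLine(byScale.Count);                      // 2 scales
        Console.WriteLine(byScale["SCALE1"].Values[0].Weight); // earliest SCALE1 entry
    }
}
```

One caveat: SortedList throws on duplicate keys, so if one scale can log two results on the same date you'd key on the full timestamp instead.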
While it's nice to have your domain model all sorted out, your scenario is purely a reporting requirement. Database servers are designed for, and are very good and very fast at, these manipulations - do it there, and get the results back as a dataset if you have to.
Even if your data layer supports aggregations, it's rare that your query actually matches the model - you could end up with a list of Chickens which aren't real Chicken instances but which sort of represent the average Chicken for the Scale they point at - are you really going to get a benefit from that? Generally I find it's easier to hook data up using datasets for reporting purposes - using reporting services.
Hi. I want some stuff regarding the bin packing algorithm. Please help me.
2489128 wrote: I want some stuff
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
What help do you need? An idea of how to do it, or just code so you don't have to and won't learn a thing?
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
Actually I want detailed stuff, i.e. theory, calculating the order, etc. I have to implement it in my project.
And none of the 6360 hits for bin packing algorithm that come up on Google helped you any?
If you don't have the data, you're just another a**hole with an opinion.
Actually I've found some good stuff, but all of it is paid:
1
2
3
I want some free stuff for study purpose.
Does this [^] help?
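If it's the classic heuristics you're after, first-fit decreasing is the usual textbook starting point; a minimal sketch (item sizes invented for the example):

```csharp
using System;
using System.Collections.Generic;

class BinPacking
{
    // First-fit decreasing: sort items largest-first, place each into the
    // first bin with enough spare capacity, opening a new bin if none fits.
    // Returns the number of bins used (a heuristic, not the optimum).
    public static int FirstFitDecreasing(List<int> items, int capacity)
    {
        items.Sort();
        items.Reverse(); // largest first

        List<int> binLoads = new List<int>();
        foreach (int item in items)
        {
            bool placed = false;
            for (int i = 0; i < binLoads.Count; i++)
            {
                if (binLoads[i] + item <= capacity)
                {
                    binLoads[i] += item;
                    placed = true;
                    break;
                }
            }
            if (!placed)
                binLoads.Add(item); // open a new bin
        }
        return binLoads.Count;
    }

    static void Main()
    {
        List<int> items = new List<int>(new int[] { 5, 7, 5, 2, 4, 2, 5, 1, 6 });
        Console.WriteLine(FirstFitDecreasing(items, 10)); // 4
    }
}
```

It's only a heuristic (exact bin packing is NP-hard), but it's the standard baseline to understand before reaching for anything like simulated annealing.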
2489128 wrote: I want some free stuff for study purpose.
Then you needed to be more specific in your question. Your question essentially came across as "I have a homework assignment that involves the bin packing algorithm" and that you were too lazy to even look it up. You didn't tell us what you knew or didn't know. There was no specific question.
You may also try to implement it using Simulated Annealing [^].
Tim Craig wrote: none of the 6360 hits for bin packing algorithm that come up on Google helped you any?
Certainly there has to be something useful in those 6360 results.
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
Paul Conrad wrote: Certainly there has to be something useful in those 6360 results
I learned more about it than I wanted to from the first one, the Wikipedia article.