|
I am attempting to come up with an algorithm for ranking players in a multiplayer game. The Elo system (used in chess et al.) does not seem to fit, as it is geared towards head-to-head games and I'm dealing with a multiplayer game with a variable number of players.
I basically have 4 (possibly 5) "statistics" that I can capture about a player's history. I may want to weight some of those statistics more than others.
Can anyone start me off on how to proceed?
|
|
|
|
|
What about MS TrueSkill?
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
[my articles]
|
|
|
|
|
I think more information about the game is needed to come up with an appropriate algorithm, in particular how the variable number of players influences the outcome.
|
|
|
|
|
It's a tricky problem.
It's tempting to add the statistics together, but then you'll find players exploiting the system by finding a technique for gaining huge levels in one stat.
The next thought is to multiply the scores together, as that would require a broader-based approach where players do a bit of everything. Unfortunately, this stops specialist players from gaining many points.
Often in multiplayer games you find that the guy who is running around helping his team mates or coordinating the efforts of his team is providing the most valuable service of anyone, but will never get points for his actions. It's a very hard thing to determine a score, and normally the people who care most about their individual score are the people you least want playing.
Even giving people points based on their team's performance is a risky ploy, as you'll have people switching sides to improve their rankings.
Is there a way you could give people a score based on their team's improvement since they joined? That would then rank them according to the value they add through their presence.
Russell
|
|
|
|
|
It is indeed tricky.
I should point out that this is not a team-based game. Essentially everyone plays versus everyone else.
|
|
|
|
|
In that case I would record your scores in each of the five or six categories. At the end of a game, rate your own play as you see fit from 1 to 5. If you have friends whose opinions you trust, with a different style of play but who are nonetheless worthy opponents in your eyes, then get them to do the same.
You may find that one of the metrics correlates very well with the subjective scores given by the players. In that case, that's what you should mainly base your scores on. Otherwise you need some kind of statistical analysis of the results to work out the correct balance of the points, or use some kind of very basic weighted network and try to get a best fit between the inputs and the ratings you have given yourselves.
Russell
|
|
|
|
|
Okay -- you have a PvP game in which you can gather several statistics.
First, you may want to incorporate some additional stats under the covers that cannot be seen -- for example, time between deaths, kills between deaths, etc.
Part of your weighting will then be the number of players participating at given times within the game.
Finally you would have your standard stats which everyone can see (total kills, hit accuracy, etc.).
You will have a bell curve for your weighting system. With too few or too many players the stats are badly skewed, but in the sweet spot you would gain the highest scoring.
At the end of a match (assuming these are timed matches) you would first compute the weighting factor based on the number of players against the bell curve * (time between deaths / factor) * (kills between deaths / factor) -- or some such formula.
Now take your weighting factor and apply it to the sum of your key stats * each stat's weight factor.
This way when you hit the sweet spot a player who lives a long time, kills many people during that time, and has a high chance of being killed (the number of players playing without a ripe field of fodder) will get a higher ranking than a player that finds someplace to hide and only nails a few players.
Just my opinion.
|
|
|
|
|
I need to find the closest point to each of a series of points with 2-dimensional coordinates. To give a real-world example, say you took a picture of the stars at night and you wanted to find which star appears closest to each star when you look at them (this is not the real application, just an example).
Obviously the easy way to do this would be to find the distance from every point to every other point and then sort the distances, but this could be very slow if there are a lot of points, since the number of comparisons grows quadratically. I have a few ideas on quicker ways to do this, but I wanted to make sure I'm not overlooking some existing algorithm that would be good for this problem. Actually, I'm not even sure what the name of this problem is (if it has one). If anyone knows the name or knows of some existing algorithms, it would be a big help.
Thanks,
Mike
|
|
|
|
|
Someone may know about established algorithms, but one way to go would be to organise the 'star' co-ordinates into boxes on a grid. That way you would only have to search a small number of boxes, depending on the co-ordinates of each query point and the nearest point found in each box. If you have variable densities of 'stars' then you could nest your boxes in some areas.
Peter
"Until the invention of the computer, the machine gun was the device that enabled humans to make the most mistakes in the smallest amount of time."
|
|
|
|
|
Unfortunately, Peter, this is not sufficient. Even comparing the smallest distance between any of the stars in all of the grid boxes may miss the case where a star in one grid box is immediately adjacent to another star in another grid box. You also need to make a quick comparison at the end inside a narrow band that spans the grid lines, the width of the band being the smallest distance between two stars found so far.
Many, many optimizations also need to be done in the algorithm to avoid expensive calculations. With 9,100 stars of magnitude 6.5 or brighter, about 4,500 should be visible from any point on Earth. A brute-force pass over all pairs is O(n²): roughly n²/2 ≈ 10,125,000 distance calculations for n = 4500, and that is a lot.
What concerned me is not that these were stars (~4500), but that (as stated) this is an analogy for... Maybe the real problem is one in physics, where the points are atoms in some condensate and number in the billions, and the distance is used to determine attractions.
Dave Augustine.
|
|
|
|
|
Member 4194593 wrote: Even comparing the smallest distance between any of the stars in all of the grid boxes may miss the case where a star in one grid box is immediately adjacent to another star in another grid box
That's why I said in my post that you would only have to search a small number of grid boxes - not just one.
I haven't thought about it much, but one way to start would be to search a point's own box first; if a star is found that is closer than any box boundary then you are finished, otherwise you have to search any box that intersects the circle around the query point whose radius is the distance to the nearest star found so far.
Peter
"Until the invention of the computer, the machine gun was the device that enabled humans to make the most mistakes in the smallest amount of time."
|
|
|
|
|
I think the point having such a property is the average one, i.e.
{X<small>avg</small>, Y<small>avg</small>}
where X<small>avg</small> is the mean value of the X coords of the points (the same consideration applies to Y ).
Of course that point has zero probability of coinciding with one of your set. But if you choose the point in the set that is nearest to that average point, then you have (IMHO) a very good candidate for fulfilling your requirements.
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
[my articles]
|
|
|
|
|
Look at using a multidimensional binary tree (also called a k-d tree). It allows search by any number of keys (in your case two, the two coordinates), as opposed to a normal binary tree, which only uses one.
|
|
|
|
|
Simply use a point where x is the average of all the x coordinates and y is the average of all the y coordinates. For example, in standard C++:
#include <iostream>
#include <ostream>
#include <iterator>
#include <algorithm>
#include <numeric>

using namespace std;

struct point
{
    point() : x(0), y(0) {}
    point(double x, double y) : x(x), y(y) {}
    double x;
    double y;
};

point operator+(const point &lhs, const point &rhs)
{
    return point(lhs.x + rhs.x, lhs.y + rhs.y);
}

point operator/(const point &lhs, double rhs)
{
    return point(lhs.x / rhs, lhs.y / rhs);
}

ostream& operator<<(ostream &os, const point &pt)
{
    os << "(" << pt.x << ", " << pt.y << ")";
    return os;
}

int main()
{
    const point pts[] =
    {
        point(1.0, 2.0),
        point(2.0, 1.0),
        point(3.0, 3.0),
        point(4.0, 4.0),
        point(-1.0, 3.0)
    };
    const point *pEnd = pts + sizeof(pts)/sizeof(pts[0]);

    ostream_iterator<point> osi(cout, "\n");
    copy(pts, pEnd, osi);
    cout << endl;

    point pt = accumulate(pts, pEnd, point());
    pt = pt / (sizeof(pts)/sizeof(pts[0]));
    cout << "Closest point is " << pt << endl;

    return 0;
}
Steve
modified on Monday, January 14, 2008 6:25:06 PM
|
|
|
|
|
I don't think that meets the problem requirements; the result must be one of the existing points of the input.
"...say you took a picture of the stars at night and you wanted to find what star appears to be closest to each star when you look at the..."
|
|
|
|
|
Anyway it gives a good hint for starting the search.
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
[my articles]
|
|
|
|
|
Thank you everybody for your suggestions. cp9876, that was the same idea I had in mind; I guess great minds think alike.
On the other ideas that suggested averaging or something similar: I'm not sure I understand your suggestions, and I'm not sure if there was some confusion about the question I asked.
Say there were 100 'stars'. Were you suggesting a way to find the one 'star' that has the shortest total distance to all the other stars, or a way to find 100 nearest neighbours, one for each of the 100 'stars'? My problem is the latter.
In either case I appreciate everybody taking the time to help me, and I apologize for any ambiguity in my original question.
thanks,
Mike
|
|
|
|
|
Most of the others were solving the problem of which point is 'nearest' in some overall sense to the given points. If you aim to minimise the sum of the squares of the distances to the given points then you get the average solution.
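For the record, the one-line calculus behind that claim (each coordinate can be treated independently):

```latex
\frac{d}{dx}\sum_{i=1}^{n}(x - x_i)^2 \;=\; 2\sum_{i=1}^{n}(x - x_i) \;=\; 0
\quad\Longrightarrow\quad
x = \frac{1}{n}\sum_{i=1}^{n} x_i
```

The same argument for y gives the average point {X avg, Y avg} suggested above.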
Peter
"Until the invention of the computer, the machine gun was the device that enabled humans to make the most mistakes in the smallest amount of time."
|
|
|
|
|
Hey, you should use a Delaunay triangulation. By a property of the triangulation, one of the stars adjacent to a given star in the triangulation, call it MY_STAR, will be the closest to MY_STAR. So your algorithm would be:
1. Triangulate: O(n log n)
2. For each star, look at all adjacent stars and take the minimum distance.
There are O(n) triangles, and thus O(n) edges (another property of the Delaunay triangulation), so this step is O(n).
This should give you a loglinear solution, which seems optimal.
|
|
|
|
|
Wouldn't it be an O(n) operation to simply compute the distance to every single star, then select the minimum?
--- Modified realization
Oh, sorry. I didn't notice that he wants to find the closest star to ALL the other stars. I thought he was just looking for the closest star to a particular point.
Sounds like somebody's got a case of the Mondays
-Jeff
modified on Monday, January 21, 2008 3:24:22 PM
|
|
|
|
|
Hello,
I have a function that takes three variables and gives back two. Something like (a, b) = f(x, y, z). I'd like f to be like a look-up table; that is, I do not have a mathematical representation of f. So the question is: how do I represent such a look-up table efficiently, given that I would like to have interpolation between existing values?
In the 2D case, when I have a = f(x), I can represent f as a Bezier curve (which I can create and edit manually). When I load it up I create a map where m[x] gives me a, and I use interpolation if x is not in the map.
My solution so far is based on the 2D case, which means I have a map like m[x][y][z] (basically a map of maps of maps of a pair (a, b)).
Is there a library that works with such issues in C++? Something that already provides similar functionality efficiently? I'd like to be able to create f(x,y,z), save it, load it and use it.
Thanks a lot.
|
|
|
|
|
It depends on whether your f needs to have specific properties.
In the simplest case you can use linear interpolation over both components (a and b), just like you interpolate 3D surfaces with triangulation/polygons.
If f should be C^1, C^2, ..., things get nasty and you might have to use some powerful numerical interpolation methods.
Even in the most basic case you would have to find the 4 closest fixed points (see the other thread on this forum) and linearly interpolate over those.
|
|
|
|
|
Thanks CKing! In my particular case linear interpolation is sufficient. I'd like to see if there is a library that deals with similar issues so I can reuse it.
Thanks.
Regards,
Alexander.
|
|
|
|
|
Hello friends,
I read about the algorithm to find the Longest Monotone Subsequence, which finds the longest monotonically increasing subsequence in a list of numbers.
Will you please explain the concept behind it?
|
|
|
|
|