|
_Erik_ wrote: It is just a straight application of the axiom of induction[^] for natural numbers
But from the wiki article:
...proving that if any one statement in the infinite sequence of statements is true, then so is the next one.
What you've done is show that it may hold for two arbitrary sequential cases, but you haven't derived a proof that if case n is true, then case n+1 must be too. That really requires symbolic manipulation of the equations.
Days spent at sea are not deducted from one's allotted span - Phoenician proverb
|
|
|
|
|
Yes, I agree; as I said before, this is just a heuristic method, not a proof. Increasing the number of random cases (and successors) would give a very accurate result without having to check every possible input.
Edit: Have my 5 for your comment
modified on Friday, March 18, 2011 10:53 AM
|
|
|
|
|
Ok, maybe I now understand what you meant. However, it doesn't solve the problem, since false positives are not allowed at all.
|
|
|
|
|
Yes, I know... but I did it when I was at university, and nobody was able to find two expressions that could produce a false positive. It checked the base case (of course) and another 10 random values (and their respective successors).
|
|
|
|
|
Erik's suggestion is excellent, and it is quite similar to how fast probabilistic primality tests find large primes quickly.
If you test with enough bases (or, here, enough random inputs) the chance of a wrong equation passing can be reduced to one in billions.
This means it falls under the meteor rule:
if the chance of a false positive is less than that of being hit by a meteor today, it's okay to use it.
So I think Erik has got an excellent idea.
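The analogy can be made concrete. Below is a minimal sketch of the random-trials idea (the class and method names are my own, and the tested expressions are placeholders): if one random input exposes a difference with probability p, then the chance that k independent inputs all miss it is (1 - p)^k.

```java
import java.util.Random;
import java.util.function.LongUnaryOperator;

// Sketch: test two candidate expressions at k random inputs. Each trial
// that passes shrinks the false-positive probability geometrically,
// just as each extra base does in a Miller-Rabin primality test.
public class RandomEquivalence {
    static boolean probablyEqual(LongUnaryOperator f, LongUnaryOperator g,
                                 int trials, Random rnd) {
        for (int i = 0; i < trials; i++) {
            long x = rnd.nextLong();
            if (f.applyAsLong(x) != g.applyAsLong(x))
                return false;            // definitely not equivalent
        }
        return true;                     // only *probably* equivalent
    }

    public static void main(String[] args) {
        // x + x vs 2 * x are genuinely equal, so every trial passes.
        System.out.println(probablyEqual(x -> x + x, x -> 2 * x,
                                         64, new Random(42))); // prints true
    }
}
```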
|
|
|
|
|
I respectfully disagree. This will be part of an optimizing compiler; if the meteor decides to hit us after all, then someone will have a nearly impossible-to-diagnose bug.
There are also large classes of expressions for which Erik's heuristic method either does not work at all, regardless of how many iterations it uses, or requires such a large number of iterations that it becomes impractical.
For example: (uint64)x / 3 and ((uint64)x * 0x55555556) >> 32 are the same for x in the interval 0..0x7FFFFFFF but not outside it. For a random 32-bit x, that gives a 0.5^k error probability, where k is the number of iterations - which is fairly good, but if you combine a couple of those you can make the error probability arbitrarily high.
And I'm not even "cheating" with comparisons here - a single comparison against a constant can make the error probability arbitrarily high. Comparisons could be special-cased, of course, but can the expressions above be?
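The boundary in that example is easy to verify directly. A small sketch in Java (longs stand in for uint64, `>>>` is the unsigned shift, and the multiplier is written out as 0x55555556 = ceil(2^32 / 3)):

```java
// The multiply-and-shift form of x/3 agrees with real division
// only while x stays below 2^31.
public class DivMagic {
    static long divMagic(long x) {
        // 0x55555556 == ceil(2^32 / 3); >>> keeps the shift unsigned.
        return (x * 0x55555556L) >>> 32;
    }

    public static void main(String[] args) {
        long inside = 0x7FFFFFFFL;
        long outside = 0x80000000L;
        System.out.println(inside / 3 == divMagic(inside));   // prints true
        System.out.println(outside / 3 == divMagic(outside)); // prints false: 715827882 vs 715827883
    }
}
```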
|
|
|
|
|
David, if these are analytic expressions (defined and differentiable for all input values), then a simple proof by induction will ALWAYS verify their equivalence. So you only have to try a few values.
BHM
|
|
|
|
|
Maybe so, but that's not what we have here: these are expressions with types such as int32 (see the original post), and the class of expressions is not limited to anything simple.
For example, 1/x and 2/x differ only for x = 1 and x = 2 (edit: and -1 and -2), but there are 2^32 possibilities for x if it is an int32, and even more if it is an int64 - the chance that either 1 or 2 is ever tested isn't that high.
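Those odds can be quantified. A quick sketch (the scan window is arbitrary, since both expressions evaluate to 0 everywhere outside |x| <= 2):

```java
// Count exactly where the candidate expressions 1/x and 2/x disagree.
public class NeedleInHaystack {
    // True where the two expressions disagree (x = 0 is undefined for both).
    static boolean differs(int x) { return x != 0 && 1 / x != 2 / x; }

    public static void main(String[] args) {
        int count = 0;
        for (int x = -1000; x <= 1000; x++) if (differs(x)) count++;
        System.out.println(count); // prints 4: only x in {-2, -1, 1, 2} differ
        // 4 differing inputs out of 2^32 means a uniformly random probe
        // hits one with probability ~1e-9; millions of samples will
        // almost certainly miss them all.
    }
}
```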
modified on Sunday, April 3, 2011 8:58 AM
|
|
|
|
|
Can you define or characterize this "good portion"?
For example, if you want to detect equivalence between bounded-degree polynomials (two computer programs or expressions whose result is a polynomial in the input), and you know the degree bound, it can easily be done by checking a small, finite number of inputs.
|
|
|
|
|
A good portion is hard to define.
It should find "significantly" more equivalences than merely comparing whether two expressions are textually equal.
The class of expressions is not constrained in any way. I suppose I could first detect whether two expressions are both polynomials and then use a special comparison, but there are plenty of other kinds of expression, so in order to find a "good portion" of the equivalences I'd need something for them too.
|
|
|
|
|
Well, I suppose you know that for polynomials of degree n, you need to check for equality at (n+1) points. If all are equal, the polynomials are the same.
Anyway, detecting polynomials is NOT easy. They can be computed in many ways, with for loops, while loops and other kinds of computation. It's not always apparent from analyzing an expression that it computes a polynomial.
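The (n+1)-points rule itself is easy to apply once the degree bound is known. A hedged sketch (the names are mine, and exact integer arithmetic is assumed so there are no rounding issues):

```java
import java.util.function.LongUnaryOperator;

// Two polynomials of degree at most n that agree on n+1 distinct points
// are identical: their difference is a degree-<=n polynomial with n+1
// roots, hence the zero polynomial.
public class PolyCheck {
    static boolean sameDegreeAtMost(LongUnaryOperator p, LongUnaryOperator q, int n) {
        for (long x = 0; x <= n; x++)    // any n+1 distinct points work
            if (p.applyAsLong(x) != q.applyAsLong(x)) return false;
        return true;                     // provably equal, not just probably
    }

    public static void main(String[] args) {
        // (x+1)^2 vs x^2 + 2x + 1: degree 2, so 3 points decide it.
        System.out.println(sameDegreeAtMost(x -> (x + 1) * (x + 1),
                                            x -> x * x + 2 * x + 1, 2)); // prints true
        // (x+1)^2 vs x^2 + 1 differ already at x = 1.
        System.out.println(sameDegreeAtMost(x -> (x + 1) * (x + 1),
                                            x -> x * x + 1, 2)); // prints false
    }
}
```

Unlike random sampling, this is a proof rather than a probability - but only for the polynomial case, and only once the degree bound is known.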
Alex.
|
|
|
|
|
I could detect some polynomials and match them that way, I suppose.
That may be more effort than it's worth... well, I guess I'll try it and see how much it helps.
|
|
|
|
|
This is an excellent question - the bad news is, the answer is rather complicated. You would need to implement a unification algorithm[^] to do it correctly.
Things would not be as bad if you considered Prolog[^] as the language for your implementation, but something tells me that that would be out of the question
On the other hand, you can always embed Prolog to get unification for free.
But no matter how you look at it, the learning curve is going to be steep.
Good luck!
|
|
|
|
|
That page only seems to discuss logic formulas (unless I'm missing something?) - does it also work for other kinds of formulas?
|
|
|
|
|
Absolutely. Formally, you can represent A+B as add(A,B), and define an additional rule that add(A,B) :== add(B,A). In fact, since you are interested only in a yes/no answer (as opposed to variable assignments produced by unification), you don't even need to implement a complete unification: a simpler recursive matching algorithm would do. The only challenge in its implementation is matching ((A+B)+C) trees to (A+(B+C)). You can deal with it by flattening the list of operands, and trying all possible orderings on one of the sides. Don't forget to memoize[^] your matches, both positive and negative. Otherwise, your search will be too slow.
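A minimal sketch of that matcher (the tree representation and the canonical-string memo key are my own; note also that the operand matching below is greedy rather than trying all orderings, which a complete matcher would need to do via backtracking):

```java
import java.util.*;

// Expressions are trees; associative-commutative operators (+, *) are
// flattened into operand lists so ((A+B)+C) and (A+(B+C)) both become
// +(A,B,C). Matches, positive and negative, are memoized.
public class AcMatch {
    record Node(String op, List<Node> kids) {
        static Node leaf(String name) { return new Node(name, List.of()); }
    }

    static final Set<String> AC = Set.of("+", "*");
    static final Map<String, Boolean> memo = new HashMap<>();

    // Flatten nested applications of the same AC operator.
    static List<Node> flatten(Node n, String op) {
        if (!n.op().equals(op)) return List.of(n);
        List<Node> out = new ArrayList<>();
        for (Node k : n.kids()) out.addAll(flatten(k, op));
        return out;
    }

    static boolean equiv(Node a, Node b) {
        String key = show(a) + "|" + show(b);
        Boolean hit = memo.get(key);
        if (hit != null) return hit;           // memoized, positive or negative
        boolean r = compute(a, b);
        memo.put(key, r);
        return r;
    }

    static boolean compute(Node a, Node b) {
        if (!a.op().equals(b.op())) return false;
        if (AC.contains(a.op())) {
            List<Node> xs = flatten(a, a.op());
            List<Node> ys = new ArrayList<>(flatten(b, a.op()));
            if (xs.size() != ys.size()) return false;
            // Greedy multiset match: pair each operand of a with one
            // still-unmatched equivalent operand of b.
            for (Node x : xs) {
                boolean found = false;
                for (Iterator<Node> it = ys.iterator(); it.hasNext(); )
                    if (equiv(x, it.next())) { it.remove(); found = true; break; }
                if (!found) return false;
            }
            return true;
        }
        if (a.kids().size() != b.kids().size()) return false;
        for (int i = 0; i < a.kids().size(); i++)
            if (!equiv(a.kids().get(i), b.kids().get(i))) return false;
        return true;
    }

    // Canonical string used as the memo key.
    static String show(Node n) {
        if (n.kids().isEmpty()) return n.op();
        StringBuilder sb = new StringBuilder(n.op()).append('(');
        for (Node k : n.kids()) sb.append(show(k)).append(',');
        return sb.append(')').toString();
    }

    public static void main(String[] args) {
        Node a = Node.leaf("a"), b = Node.leaf("b"), c = Node.leaf("c");
        Node l = new Node("+", List.of(new Node("+", List.of(a, b)), c)); // (a+b)+c
        Node r = new Node("+", List.of(a, new Node("+", List.of(b, c)))); // a+(b+c)
        System.out.println(equiv(l, r));                              // prints true
        System.out.println(equiv(l, new Node("+", List.of(a, b))));   // prints false
    }
}
```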
|
|
|
|
|
Java or C#, which is better?
|
|
|
|
|
I would personally prefer C#.
|
|
|
|
|
As far as algorithms go, they are equal.
"You get that on the big jobs."
|
|
|
|
|
Actually, I wouldn't say that.
What is the algorithm for 64-bit unsigned division in Java?
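For the record: Java has no unsigned integer types, so 64-bit unsigned division must be emulated with signed operations (Java 8 later added Long.divideUnsigned, which does exactly this). A sketch of the classic halve-divide-correct trick; the names are mine:

```java
// Emulating 64-bit unsigned division with Java's signed longs.
public class UDiv {
    // Unsigned less-than: flip the sign bits, then compare signed.
    static boolean ultLess(long a, long b) {
        return (a ^ Long.MIN_VALUE) < (b ^ Long.MIN_VALUE);
    }

    static long udiv(long x, long d) {
        if (d < 0)                       // divisor >= 2^63: quotient is 0 or 1
            return ultLess(x, d) ? 0 : 1;
        long q = ((x >>> 1) / d) << 1;   // halve, divide signed, double
        long r = x - q * d;              // remainder is in [0, 2d)
        return ultLess(r, d) ? q : q + 1;
    }

    public static void main(String[] args) {
        // 0xFFFFFFFFFFFFFFFF / 3, treated as unsigned, is (2^64 - 1) / 3.
        System.out.println(udiv(-1L, 3L)); // prints 6148914691236517205
    }
}
```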
|
|
|
|
|
For a Java-programmer, Java is better. For a C# programmer, C# is better. Try to find where they differ, as that decides which of the two is best for the given circumstance.
I are Troll
|
|
|
|
|
COBOL
Regards
David R
---------------------------------------------------------------
"Every program eventually becomes rococo, and then rubble." - Alan Perlis
The only valid measurement of code quality: WTFs/minute.
|
|
|
|
|
|
Even better - I once spent 6 months doing MUMPS. Don't think I've recovered.
Regards
David R
---------------------------------------------------------------
"Every program eventually becomes rococo, and then rubble." - Alan Perlis
The only valid measurement of code quality: WTFs/minute.
|
|
|
|
|
Strange, I had mumps when I was a boy and recovered after a couple of weeks.
I must get a clever new signature for 2011.
|
|
|
|
|
Look at the IDEs available for each language. Since you can create nice programs in both Java and C#, the IDE is what matters - and I prefer Visual Studio.
|
|
|
|