|
Dan Sutton wrote: However, if you were to downscale a disintegrated universe it would be hot: the implication is that you'd need energy to do it, and that energy has to end up going somewhere (and would be expressed as heat). So regardless of whether elementary particles scale, you'd still experience the types of temperature variation you'd expect.
Well, you convinced me. Honestly, the idea of pure energy is still mysterious to me, even though we experience it in everyday life. I think we can end this fascinating discussion here.
Dan Sutton wrote: I guess the question is whether that twist itself occupies more physical space as the universe expands: I have a feeling that it probably does -- which is a tricky thing because it implies that Planck's constant isn't a constant at all... but inasmuch as we're capable of measuring it, it certainly appears to be one.
It will take some time until we find definite answers to these kinds of questions. A Theory of Everything, maybe... Wait for it.
Besides, there is an interesting theory saying that matter is discrete at the Planck scale... inside a black hole. I don't know how it differs from plain quantum theory, but I saw it in the "news" some time ago. They wrote that it implies there is no actual singularity at the centre of a black hole; it is just packed to its limit. The theory might solve the information loss paradox (hopefully). Discrete Black-Hole Radiation and the Information Loss Paradox[^].
PS1. Out of curiosity: are you a physicist? As for me, I'm just an astronomy enthusiast, especially when it comes to crazy Mars missions (like Mars One[^]).
PS2. "There are two things that mankind will never understand: black holes and women's brains."
|
|
|
|
|
Funny you should mention the Planck thing: I've thought of it myself. I have a feeling that all singularities detonate as soon as they form, having achieved critical mass. However, current theory holds that a singularity has zero size -- but if that were the case, then there would be zero time flow within it, and thus (to the outside observer) the detonation would take an infinite amount of time to occur -- which violates a number of principles, not least that of heat death. So in fact, I think the singularity is probably one Planck length in diameter -- which allows the explosion to occur -- and, as you say, solves any number of other inconveniences.
Talking about singularities is vaguely problematic, in that they're mathematical derivatives rather than observed phenomena: we believe they're there, but theory states that they don't form until after the black hole itself comes into existence: mathematics says they must exist and thus they do... but that's as close as we're going to get - at least for now.
I'm not a bona-fide physicist, no - but I am fascinated by the subject and I study it quite a bit. I will, however, admit to being a science-fiction freak -- it's a great genre for getting the mind working...
Speaking of science fiction, and of the Theory of Everything, Greg Egan suggests in his book "Distress" that as soon as anyone is able to explain exactly how the universe works and provide such a grand unifying theory, the universe will rearrange itself spontaneously so that it has always worked that way. Similarly, quantum theory suggests that a lot of reality exists because we observe it: that, for example, there was no such thing as a quark until someone discovered them, and then, at that point, the universe had always had quarks in it. Unfortunately, quantum theory supports this type of retroactive creation... I wonder what we've done with this conversation!
|
|
|
|
|
Is the universe expanding, or is it simply our ability to see further into the universe that is expanding? If we can't see the reaches of the universe, how can we know that it is expanding?
And, if it is expanding, what is it expanding into? Does the absence of matter mean that space doesn't exist?
|
|
|
|
|
Well, the general consensus is that it's expanding because everything we observe out there is somewhat red-shifted; the Doppler effect tells us that everything's retreating from everything else.
The question of what it's expanding into is more interesting. My own theory, which I've held for a long time and which is now becoming accepted by various factions within astrophysics, is that the universe is actually an exploding singularity within a larger universe. This explains several things, such as the fact that the size of the universe is (mathematically) much greater than it should be: in theory, if the universe is 13.7 billion years old (as is currently stated), then its radius should be 13.7 billion light years, since it shouldn't be possible for it to expand faster than the speed of light. But in fact, it's something like twice that -- a conundrum which has stumped physicists for a while now.
However, if the universe is an exploding singularity, then its theoretical radius is determined by the radius of the event horizon of the black hole surrounding that singularity -- into which matter can fall from outside. This would explain massive objects on the boundaries of what we can see -- such as quasars and so on -- which conventional closed-system theory cannot explain, and also where all that extra mass came from. There would be a shift in perception between what we can see and the universe outside, simply because of the time dilation effect one would perceive when approaching a large center of mass. Furthermore, if one were to calculate the distribution of matter inside a black hole with the mass of the universe, one would come out with a distribution of matter virtually identical to what we can see now.
My theory goes on to state that (a) all singularities detonate at the instant they form (having achieved critical mass), but that, because of the time dilation effect, an outside observer would not detect the explosion: it would appear to take an almost infinite time to occur (although, to an entity inside the exploding black hole, time would proceed at a regular pace, with the "outside" appearing almost infinitely slow, and thus unfathomable: there would be a definite interface between "inside" and "outside"); and that (b) a singularity is not zero-sized at all, but is in fact one Planck length across, which removes the problem of it taking an infinite length of time for the explosion to occur (as seen from outside).
This theory is supported by the fact that known black holes, such as the supermassive type seen at the center of the Milky Way, do radiate massive quantities of energy -- primarily in the form of neutrino jets at the poles, as they spin. As Hawking pointed out a few weeks ago, the idea that information cannot leave a black hole is patently false: we see it happening all the time. There's still a hell of a lot of thinking to be done on this subject, in any event.
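As a rough sanity check on the "universe inside a black hole" idea, one can compare the Schwarzschild radius of a black hole having the mass of the observable universe to the observed scale. This is only a back-of-the-envelope sketch: the mass figure of ~1.5×10^53 kg is one commonly quoted estimate (it is an assumption here, and published estimates vary by a factor of a few).

```python
# Schwarzschild radius r_s = 2GM/c^2 for a black hole with a rough,
# assumed mass of the observable universe (~1.5e53 kg; estimates vary).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M = 1.5e53             # assumed mass of the observable universe, kg
LIGHT_YEAR = 9.461e15  # metres per light year

r_s = 2 * G * M / c**2       # Schwarzschild radius in metres
r_s_ly = r_s / LIGHT_YEAR    # the same radius in light years

print(f"r_s is roughly {r_s_ly / 1e9:.0f} billion light years")
```

With that assumed mass, the horizon radius comes out in the low tens of billions of light years -- the same order of magnitude as the "twice 13.7 billion light years" figure mentioned in the post, which is the coincidence the argument leans on.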
|
|
|
|
|
A well-reasoned explanation, which is better than the usual case where it is simply stated that 'scientists say...' and countering that simple argument is to invite ridicule.
While I may not understand what you wrote, or agree with it, as I said, it is a possible, well explained answer.
Thank you,
Tim
|
|
|
|
|
Thanks! [Disclaimer: I'm not sure I agree with it, either, but it does have the benefit of being an explanation which isn't currently disprovable, and which does explain a lot of "that weird sh*t" which seems to plague the field...!]
|
|
|
|
|
WALL OF TEXT!
<voice type="Ebenezer Scrooge"> Bah. dumb bugs </voice>
|
|
|
|
|
Somehow I was thinking that Number of Programmers and Code Quality would be inversely proportional.
|
|
|
|
|
They are. If you rewrite it a little, you get: Q = c·t/N
Thus quality is proportional to time, and inversely proportional to the number of programmers.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Seems a bit incomplete to me; you need to add the number of managers, and the level of customer involvement, AKA feature creep.
|
|
|
|
|
"If your actions inspire others to dream more, learn more, do more and become more, you are a leader." - John Q. Adams
"You must accept one of two basic premises: Either we are alone in the universe, or we are not alone in the universe. And either way, the implications are staggering." - Wernher von Braun
"Only two things are infinite, the universe and human stupidity, and I'm not sure about the former." - Albert Einstein
|
|
|
|
|
I'll buy that. Change "programmers" to "people". LOL!
|
|
|
|
|
I was thinking more in the lines of:
NQM(2f+1)/t=c, where:
- N is the number of programmers on the project;
- Q is the quality of the final product;
- M is the number of managers on the project;
- f is feature creep (as a percentage of the original number of features);
- t is the time taken to develop the product;
- c is a constant
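Treating the joke formula seriously for a moment: solving NQM(2f+1)/t = c for Q gives Q = c·t/(N·M·(2f+1)). A throwaway sketch (the function name, the sample numbers, and c = 1 are all invented for illustration):

```python
def quality(n_programmers, n_managers, feature_creep, time_taken, c=1.0):
    """Solve N*Q*M*(2f+1)/t = c for Q, the quality of the final product.

    feature_creep is f, the fraction of extra features relative to the
    original spec (0.5 means 50% creep).
    """
    return c * time_taken / (n_programmers * n_managers * (2 * feature_creep + 1))

# Doubling the number of managers halves the "quality", as the formula predicts:
q1 = quality(n_programmers=5, n_managers=1, feature_creep=0.5, time_taken=100)
q2 = quality(n_programmers=5, n_managers=2, feature_creep=0.5, time_taken=100)
```

Quality grows with time and shrinks with headcount, management, and creep, which matches the earlier Q = c·t/N when M = 1 and f = 0.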
|
|
|
|
|
|
Doubling the number of features doesn't just double the time needed, as it also adds complexity.
Hmm, it's probably more like (f+1)^2 when I think about it.
|
|
|
|
|
learner'sbug wrote: not f+1 ?
Because f+1 is a race with no passing.
This space intentionally left blank.
|
|
|
|
|
Brilliant! Although... since you don't want to affect N inversely with M, then I suggest:
(N^(M(2f+1)))Q/t=c
...which, since M implies 2f+1, could theoretically be shortened to:
(N^M)Q/t=c
(or else, we could include a constant to state the probability of someone posting a thread like this...)
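Solving the shortened variant (N^M)Q/t = c for Q gives Q = c·t/N^M, so each extra manager divides quality by another factor of N. A hypothetical sketch (names and c = 1 are invented for illustration):

```python
def quality_exp(n_programmers, n_managers, time_taken, c=1.0):
    """Solve (N^M)*Q/t = c for Q: quality decays exponentially
    in the number of managers."""
    return c * time_taken / (n_programmers ** n_managers)

# With N=10 programmers, each added manager costs an order of magnitude:
# M=1 -> Q = t/10, M=2 -> Q = t/100, M=3 -> Q = t/1000, ...
```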
|
|
|
|
|
It makes sense. When M=1, everything works fine, but when M=2 or more, quality drops exponentially.
Also, instead of t we could take a chi-squared distribution, since beyond a certain point, giving more time brings more bad code.
|
|
|
|
|
I think we are getting somewhere, but you also need to account for "New Technology" with a heavier factor than feature creep (e.g. "We are going to change our platform to be all in the Cloud").
Soren Madsen
"When you don't know what you're doing it's best to do it quickly" - Jase #DuckDynasty
|
|
|
|
|
Nah, a new technology is just a whole load of features in one go; you just need to break them apart and the formula will still work.
|
|
|
|
|
NQ/t=c
is incorrect.
It is:
NQ/t=C^2
Where "C" = Change; as in "_________ in my pocket" from all of the "_________ requests"
|
|
|
|
|
|
Not so sure about that.
If either N or Q is 0, then c = 0.
Knowing this and assuming c is a constant, then t = lim(x->inf), or lim(x->-inf).
|
|
|
|
|
Well, it almost makes sense: if you have no programmers, it's not really supposed to mean anything...
|
|
|
|
|
Yes, no programmers means zero entropy. It also means that either everything is working fine and the visible problem space is zero... or total economic collapse.
|
|
|
|
|