Friday, 9 June 2017

"Cosmos is a Greek word for the order of the universe. It is, in a way, the opposite of Chaos. It implies the deep interconnectedness of all things. It conveys awe for the intricate and subtle way in which the universe is put together."
~Carl Sagan

#wordsofwisdom #cosmos

13 comments:

  1. Cosmos and chaos are competing "-oses" for the universe :) In the contemporary sense, though, "chaos" isn't the opposite of "cosmos" as described, and rather resembles the portrait given of the latter in the quote. A couple of dates: Sagan's Cosmos '80, Gleick's Chaos '87, Zelazny's Amber series '70-'91...

  2. At first there was order. From this order came chaos. Reality is the process of converting order into chaos. Chaos is the ultimate destiny of the universe.

  3. At the heart of thermodynamics is a dual macroscopic-microscopic view, and while causality descends the scales by fixing more and more intricate details, the details themselves pass beyond a horizon of indiscernibility. What accumulates beyond that horizon is called entropy. Insisting that entropy is "chaos" under-emphasizes indiscernibility to pander to omniscientism, imho.

  4. Boris Borcic Then please explain the difference between chaos and entropy…

  5. Entropy is a measure of the ambiguity of the macroscopic state in terms of the number of indiscernible yet distinct microscopic states that a single macroscopic state may correspond to. With time, this ambiguity grows and hides more and more of the detailed effects of past causation. What you call "chaos" is just a partial picture of the hiding mechanism.

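The microstate-counting picture in the comment above can be sketched numerically. A minimal toy model (my own illustration, not from the thread): N coins, where the macro-state is the number of heads and each micro-state is one full head/tail configuration.

```python
from math import comb, log

# Toy model: N two-state "molecules" (coins). The macro-state is the
# number of heads; a micro-state is one full head/tail configuration.
# The "ambiguity" of a macro-state is its multiplicity: how many
# distinct micro-states it may correspond to.
N = 10
for heads in range(N + 1):
    multiplicity = comb(N, heads)  # micro-states per macro-state
    entropy = log(multiplicity)    # Boltzmann entropy with k_B = 1
    print(f"{heads:2d} heads: {multiplicity:3d} micro-states, S = {entropy:.2f}")
```

The middle macro-state (5 heads) is the most ambiguous, with 252 micro-states versus 1 for the fully ordered states, matching the idea that entropy measures how much micro-detail a macro-description hides.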
  6. Boris Borcic​ The mathematical relationship between entropy and molecular movement was worked out by Ludwig Boltzmann. He showed that entropy represents the total number of different ways the molecules can move and interact. With each collision, kinetic energy is exchanged: molecules with more kinetic energy lose some of it by colliding with molecules that have less, and molecules with low kinetic energy can gain some by colliding with others. This process continues until the kinetic energy is equally distributed among all the molecules and their various modes of movement. Boltzmann stated that entropy can be related to disorder, because the more ways such a system can move internally, the more disordered it is. A system in "order" has one state in which all the molecules are locked, without any freedom of movement, while a dynamic system in perfect equilibrium represents a system in "perfect disorder". So entropy is a measure of disorder.
    In statistical mechanics, entropy is defined in terms of accessible regions of phase space, with all configurations equally accessible. In this perspective, a system is "chaotic" when a particle's trajectory covers each and every bit of the phase space accessible to it.
    So, when we assume that statistical mechanics applies to such a system, this means that we cannot predict trajectories in phase space, either because there are too many initial conditions or because we cannot track each and every trajectory. And after an infinite time the trajectories will also cover the whole phase space.
    This situation is strongly connected to the notion of chaos in classical mechanics…

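Boltzmann's counting can be made concrete with the standard Einstein-solid toy model (my choice of example; the thread doesn't name it): q energy quanta shared among n oscillators have multiplicity C(q + n - 1, q), and for two solids in contact the equal energy split is the macro-state with the most micro-states.

```python
from math import comb

# Einstein-solid toy model: q energy quanta among n oscillators
# have multiplicity C(q + n - 1, q) (stars and bars).
def multiplicity(n, q):
    return comb(q + n - 1, q)

# Two identical solids of 20 oscillators sharing 40 quanta in total.
# The macro-state is the split (qA, 40 - qA); count its micro-states.
n, q_total = 20, 40
omega = {qA: multiplicity(n, qA) * multiplicity(n, q_total - qA)
         for qA in range(q_total + 1)}
print("most probable split:", max(omega, key=omega.get))  # prints 20
```

Equal sharing wins because it maximizes the number of indiscernible micro-states, which is the statistical-mechanics reading of equilibrium described in the comment above.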
  7. Mariusz Rozpędek Relative to my shorthand affirmation that "entropy is a measure of the ambiguity that grows with time of (observationally available) macro-states in terms of the number of (observationally unavailable) detailed micro-states", would you please make clear whether the picture you just detailed at length in response is meant to:

    (a) give another perspective on what my statement affirms, without contradicting it,

    (b) constitute a refutation of my statement by contradicting it on what it says, or

    (c) disqualify my statement as "not even wrong" by plastering over it a self-contained, pre-packaged picture without any effort to relate it to my own language?

    As I perceive it, you are attempting (c) while really doing (a).

    The statistical mechanics picture elucidated thermodynamics by piercing a horizon in theory, which it did by providing a narrative of secret microscopic chaos in such a way that the known macroscopic quantities, such as temperature and entropy, got explained as emerging from the statistics of micro-states whose actual detailed knowledge escapes us.

    Not just the statistics, but the dynamics: if there are many micro-states for a single macro-state, a situation that's static at the macro level can be secretly dynamic at the micro level.

    Of course the historical moment of piercing the horizon of thermodynamics gave great visibility to its key narrative ingredient: portraying disordered micro-states so as to allow defining statistics over their hidden diversity (itself a correlate of their disorder). An unfortunate side-effect, in my opinion, is the election of "disorder" (or your "chaos") over "ambiguity" as the one-keyword shorthand for the light shed by statistical mechanics on thermodynamics.

    Ultimately, I relate this conflict to my one-line credo that omniscience is impossible, which is equivalent to affirming that knowledge horizons are a necessary rather than a contingent feature of world experience. In the presence of a "translucent" knowledge horizon like the one featured by statistical mechanics, this leads me to prefer focusing on the horizon and its properties, over an "omniscientist" attitude of capitalizing on the generic mental picture it authorizes us to make of what's beyond it (your "chaos").

  8. Boris Borcic​ Because English is not my native language I do not feel comfortable playing word games, but I will try to explain my position in other words.
    Firstly, as far as I know, the term "ambiguity" is not used in the context of entropy. As I understand it, "ambiguity" means that specific and distinct interpretations are permitted at the same time. To some extent we can use this term when talking about entropy. But, in my opinion, when we are considering trajectories in phase space, the available information is rather vague.
    Moreover, looking at any system at the quantum level, it is impossible to form a strict description of all of its states. One should also note that every single measurement will change the internal state of such a system. Quantum phenomena are not deterministic, so we can't have certainty at all. Only probability…
    Such an approach is also typical of modern science. Contemporary science does not consider phenomena as "certain", but rather estimates their probability. This is true for biology, medicine and physics.
    And there is no place for "omniscience" in the current system of knowledge… There is no way to know everything about a macroscopic system at the quantum level. On the other hand, from information theory we know that the more microstates a given system has, the greater our uncertainty about its description. We have to ask more questions to know its condition, and the more questions we have to ask, the more information the system contains. It is therefore more difficult to describe, or such a description becomes even impossible… If we do not ask the right number of questions, we are left with a lot of uncertainty about the state of the system. At the same time, every question asked changes the state of the system, increasing the uncertainty.
    About such a system one will say that it is indescribable, or in other words chaotic. But most people will not say that it is "ambiguous"…

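The "number of questions" point above is Shannon's measure of information: identifying one of W equally likely micro-states by yes/no questions takes about log2(W) of them, since each answer can at best halve the candidates. A minimal sketch (my own illustration):

```python
from math import ceil, log2

def questions_needed(W):
    """Yes/no questions needed to pin down one of W equally
    likely micro-states (each answer halves the candidates)."""
    return ceil(log2(W))

for W in (2, 8, 1024, 10**6):
    print(f"{W} micro-states -> {questions_needed(W)} questions")
```

So "more microstates, more questions, more information" is literal: the entropy of W equally likely microstates is log2(W) bits.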
  9. Mariusz Rozpędek, that's right, I don't think I've ever heard "ambiguity" employed in this context -- I am using it in this context not because I've been told what to say, but because it is what I think given what I've learned (in terms similar to yours).

    My key insight about ambiguity is encapsulated in my saying: ambiguities are like microbes: the pathogenic ones steal attention.

    Whatever, there's like a sticking point remaining for me that thermodynamic entropy is first of all described as sizing the physical freedom of the micro-state for the given macro-state. The macro-state is obviously a representation. The macro-state is a representation that's. literally. ambiguous. proportionally to the diversity of micro-states it may represent.

  10. Boris Borcic OK. I think I can see what you're thinking about… :-)

  11. "the ambiguity of the macro-state grows with time" would be my choice version of the second law.

    I agree that "vagueness" feels like the common-sense term to use instead of my "ambiguity". However, I feel "ambiguity" suggests an activity of counting interpretations (aka micro-states) much better than "vagueness" does. And such counting is of course central to the definition of entropy.

    And to further my initial point, the chaos, the famous disorder of entropy, is one we imagine but don't see. The famous "disorder" names what we don't see (but imagine astutely).

  12. Hey, Mariusz Rozpędek, thanks for prompting forward what now invites a try at "vague binocular accommodation" with statistical quantum mechanics.

    An imo thing about horizons and "omniscientism", or excess cognitive greed (as I am now tempted to call it), is that the greedy, in the presence of a translucent horizon, will want to make it transparent, to remove it, while the wise will welcome an occasion to get better acquainted with the horizon's translucency.

  13. I mean, the macro-state shares (some) characteristics with the wave-function. Is this what they call Wick Rotation?
