The Science of Consciousness - Integrated Information Theory (IIT)

Received today a notification about a lecture this week at CERN titled "The Science of Consciousness". Intrigued by the title, I went to check what it's about and had a sort of knee-jerk reaction to the lecture description (especially the bolded part; below). I can't really point out in detail why, it just doesn't feel right, e.g. indicating that humans are animals, that being conscious equals having a subjective experience (implying that consciousness can't be objective, IMO), and stating that the consciousness of mice and bees represents a "more difficult case [to study]" than that of [adult] humans.

I might be reading too much into this and "adding" things which are not really there, so a comment/view from others would be appreciated.
However, to me, it seems like consciousness is stripped of any spiritual dimension and reduced to purely mechanical (neural) processes of a machine. Well, maybe that's how "machines" are [only] capable of apprehending consciousness.

_http://indico.cern.ch/event/393890/ said:
CERN Colloquium

The Science of Consciousness
by Dr. Christof Koch (Chief Scientific Officer, Allen Institute for Brain Science, Seattle)

Description
We not only act in the world but we consciously perceive it. The interactions of myriad of neuronal and sub-neuronal processes that are responsible for visual behaviors also give rise to the daily movie screened for our benefit in the privacy of our own skull. I will discuss the empirical progress that has been achieved over the past several decades in characterizing the behavioral and the neuronal correlates of consciousness in human and non-human animals and in dissociating selective visual attention from visual consciousness. I will introduce Tononi’s integrated Information Theory (IIT) that explains in a principled manner which physical systems are capable of conscious, subjective experience. The theory explains many empirical facts about consciousness and its pathologies in humans. It can also be extrapolated to more difficult cases, such as fetuses, mice, or bees. The theory predicts that many, seemingly complex, systems are not conscious, in particular digital computers running software, even if these were to faithfully simulate the neuronal networks making up the human brain.

About Tononi (there are only two search results with his name, none related to IIT, which, AFAIK, is not mentioned on the forum):

_http://en.wikipedia.org/wiki/Giulio_Tononi said:
[...]

This research has led to a comprehensive hypothesis on the function of sleep (proposed with sleep researcher Chiara Cirelli), the synaptic homeostasis hypothesis. According to the hypothesis, wakefulness leads to a net increase in synaptic strength, and sleep is necessary to reestablish synaptic homeostasis. The hypothesis has implications for understanding the effects of sleep deprivation and for developing novel diagnostic and therapeutic approaches to sleep disorders and neuropsychiatric disorders.

Tononi is also a leader in the field of consciousness studies, and has co-authored a book on the subject with Gerald Edelman. He developed the integrated information theory (IIT): a scientific theory of what consciousness is, how it can be measured, how it is realized in the brain and, why it fades when we fall into dreamless sleep and returns when we dream. The theory is being tested with neuroimaging, TMS, and computer models. His work has been described as "the only really promising fundamental theory of consciousness" by Christof Koch.

Koch is the guy giving the lecture whose notification is described above.

http://en.wikipedia.org/wiki/Christof_Koch said:
Christof Koch is an American neuroscientist best known for his work on the neural bases of consciousness. He is the Chief Scientific Officer of the Allen Institute for Brain Science in Seattle. From 1986 until 2013, he was a professor at the California Institute of Technology.

[...]

Since the early 1990s, Koch has studied the physical basis of consciousness as a scientifically tractable problem, and has been influential in arguing that consciousness can be approached using the modern tools of neurobiology. His primary collaborator in the endeavor of locating the neural correlates of consciousness was the molecular biologist turned neuroscientist, Francis Crick and, more recently, the psychiatrist and neuroscientist Giulio Tononi. Koch advocates for a modern variant of panpsychism, the ancient philosophical belief that some minimal form of consciousness can be found in all biological organisms, from single cells to humans, and in appropriately built machines. Koch writes a popular column, Consciousness Redux, for Scientific American Mind on scientific and popular topics pertaining to consciousness.

[...]

In early 2011, Koch became the Chief Scientific Officer of the Allen Institute for Brain Science, leading their ten-year project concerning high-throughput large-scale cortical coding. The mission is to understand the computations that lead from photons to behavior by observing and modeling the physical transformations of signals in the visual brain of behaving mice. The project seeks to catalogue all the building blocks (ca. 100 distinct cell types) of the then visual cortical regions and associated structures (thalamus, colliculus) and their dynamics. The scientists seek to know what the animal sees, how it thinks, and how it decides. They seek to map out the murine mind in a quantitative manner. The Allen Institute for Brain Science currently employs about 270 scientists, engineers, technologists and supporting personnel. The first four years of this ten-year endeavor to build brain observatories were funded by a donation of $300 million by Microsoft founder and philanthropist Paul G. Allen.

About IIT:

http://en.wikipedia.org/wiki/Integrated_information_theory said:
Integrated information theory (IIT) is a proposed theoretical framework intended to understand and explain the nature of consciousness. It was developed by psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin–Madison. Tononi's initial ideas were further developed by Adam Barrett, who created similar measures of integrated information such as "phi empirical".

Overview
The integrated information theory (IIT) of consciousness attempts to explain consciousness, or conscious experience, at the fundamental level using a principled, theoretical framework. The theory starts from two key postulations regarding the nature of consciousness: That consciousness has information regarding its experience, and that the experience is integrated to the extent that parts of an experience are informative of each other.

Here, IIT embraces the information theoretical sense of information; that is, information is the reduction in uncertainty regarding the state of a variable, and conversely is what increases in specifying a variable with a growing number of possible states. When applied to conscious experience as we know it, since the number of different possible experiences generated by a human consciousness is considerably large, the amount of information this conscious system must hold should also be large. The list of a system's possible states is called its "repertoire" in IIT.

In a system composed of connected "mechanisms" (nodes containing information and causally influencing other nodes), the information among them is said to be integrated if and to the extent that there is a greater amount of information in the repertoire of a whole system regarding its previous state than there is in the sum of all the mechanisms considered individually. In this way, integrated information does not increase by simply adding more mechanisms to a system if the mechanisms are independent of each other. Applied to consciousness, parts of an experience (qualia) such as color and shape are not experienced separately for the reason that they are integrated, unified in a single, whole experience; applied in another way, our digestive system is not considered part of our consciousness because the information generated in the body is not intrinsically integrated with the brain.

In IIT 3.0, the 2014 revision of IIT, five axioms were established in underpinning the theory:
  • Consciousness exists
  • Consciousness is compositional (structured)
  • Consciousness is informative
  • Consciousness is integrated
  • Consciousness is exclusive
The suggestion is that the quantity of consciousness in a system is measured by the amount of integrated information it generates.

The underlined part above kind of confuses me (well, all the other things are also presented in much "higher" academic language than a normal layman like myself can easily wrap his mind around), knowing that the entropy S of a system is defined as (Boltzmann equation): S ~ ln W, where W is accessible phase space, i.e. possible states of the system. So, as I understand above underlined part, it states that information increases the entropy of a system (variable)?

Well, it's a Wikipedia entry, whose reliability is always questionable, so here's the article from the journal The Biological Bulletin (it says open access to full article):

_http://www.biolbull.org/content/215/3/216.full said:
Consciousness as Integrated Information: a Provisional Manifesto
Giulio Tononi

Abstract
The integrated information theory (IIT) starts from phenomenology and makes use of thought experiments to claim that consciousness is integrated information. Specifically: (i) the quantity of consciousness corresponds to the amount of integrated information generated by a complex of elements; (ii) the quality of experience is specified by the set of informational relationships generated within that complex. Integrated information (Φ) is defined as the amount of information generated by a complex of elements, above and beyond the information generated by its parts. Qualia space (Q) is a space where each axis represents a possible state of the complex, each point is a probability distribution of its states, and arrows between points represent the informational relationships among its elements generated by causal mechanisms (connections). Together, the set of informational relationships within a complex constitute a shape in Q that completely and univocally specifies a particular experience. Several observations concerning the neural substrate of consciousness fall naturally into place within the IIT framework. Among them are the association of consciousness with certain neural systems rather than with others; the fact that neural processes underlying consciousness can influence or be influenced by neural processes that remain unconscious; the reduction of consciousness during dreamless sleep and generalized seizures; and the distinct role of different cortical architectures in affecting the quality of experience. Equating consciousness with integrated information carries several implications for our view of nature.

Well, after reading this I feel like I've just drunk a Pan Galactic Gargle Blaster (from D. Adams' The Hitchhiker's Guide to the Galaxy), but my overall understanding of this theory is that neither consciousness, nor information, nor experience is really defined independently of the others, making the whole thing circular and self-referential.
Some help from those more experienced in the neuro-, cognitive and psychological sciences would be appreciated.

Also, if all this has already been covered in some other topic, I apologise for starting a new thread, and ask the mods to merge it with the appropriate one. Like I said, I didn't find it using the Search function.

edit: Well, maybe the fairest question to those more experienced in these things would be: Is there any real use in diving deeper into this representation of consciousness? For me it seems not...
 
Hi Sasha,

When they say "information is the reduction in uncertainty regarding the state of a variable, and conversely is what increases in specifying a variable with a growing number of possible states". It means that information is in a sense the opposite of entropy. The phrasing is bizarre.
For example, if I hold a coin in one of my hands (left or right) and asked you in which hand the coin is, for you, the entropy is higher than for me, because you have two possibilities with equal probabilities, while on my part I know where the coin is. Therefore, information "is" which reduces uncertainty (to know is to reduce entropy) and which increases the specification of a variable (either x=right, or x=left) when there are many possible states (right and left in this case). I don't know if it's clearer this way :D
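
If a toy calculation helps, here is a quick Python sketch of my own (the shannon_entropy helper is just my illustration, nothing from IIT itself): before you learn which hand holds the coin the entropy is 1 bit, and once you know it drops to 0 - that 1-bit drop is the information you gained.

[code]
import math

def shannon_entropy(probs):
    # Shannon entropy in bits: H = sum(p * log2(1/p)), skipping zero-probability states
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Before you ask: the coin is in the left or the right hand with equal probability.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit of uncertainty

# After I open my hand: one state is certain, the other impossible.
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits - no uncertainty left
[/code]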

However, I agree that the approach of explaining everything from a reductionist point of view must appear very limited for those of us who are somehow used to multilevel analysis, but that's all they know of.
 
mkrnhr said:
Hi Sasha,

When they say "information is the reduction in uncertainty regarding the state of a variable, and conversely is what increases in specifying a variable with a growing number of possible states". It means that information is in a sense the opposite of entropy. The phrasing is bizarre.
For example, if I hold a coin in one of my hands (left or right) and asked you in which hand the coin is, for you, the entropy is higher than for me, because you have two possibilities with equal probabilities, while on my part I know where the coin is. Therefore, information "is" which reduces uncertainty (to know is to reduce entropy) and which increases the specification of a variable (either x=right, or x=left) when there are many possible states (right and left in this case). I don't know if it's clearer this way :D

However, I agree that the approach of explaining everything from a reductionist point of view must appear very limited for those of us who are somehow used to multilevel analysis, but that's all they know of.

Yes, it is. :)
I was confused by the bolded part especially because they used the phrase "with a growing number", implying, to me, that the number of possible states is increasing at the same time that the specifying of the variable is taking place. A really strange way to put it.
With the simple rephrasing "increases the specification of a variable when there are many possible states" it becomes crystal clear.
Thank you mkrnhr. :flowers:
 
information is the reduction in uncertainty regarding the state of a variable, and conversely is what increases in specifying a variable with a growing number of possible states.

Per my understanding, the first part stating "information is the reduction in uncertainty regarding the state of a variable" refers to decoding a variable. An example can be a guessing game like 20 questions. Here the variable is a word that person A has in mind. Person B needs to guess the word person A has thought of by asking A 20 questions. A can only answer "yes" or "no" to each question. The number of possible states of the word to be guessed can be very large at the beginning of the game. However, each question asked and answered (in yes or no) provides information to B about the word and reduces the number of possibilities - or the uncertainty regarding the word. There are ways of framing the questions such that each question provides more information about the word and progressively reduces the uncertainty. It is an interesting game - if you have not played it, you can try with someone.

The second part which states "[information] is what increases in specifying a variable with a growing number of possible states" can refer to encoding the variable. If a variable can have 2 possible states (like the example mkrnhr gave), then it takes one "bit" to encode it. It is either left or right. If we denote left as "1" and right as "0", we can have 1 binary digit (bit) which can take these two values to describe the state of the variable "which hand is the coin in".

Taking another example, suppose one is traveling from one place to another by moving in the 4 cardinal directions - north, south, east, west; no cutting across allowed. The variable here is "direction of travel", which can take 4 possible values. To encode these 4 directions we can use 2 bits - each of which can take the two values 0 and 1 as earlier.
north - 00
east - 01
west - 10
south - 11

Now suppose we allow cutting across at 45 degree angles - like north-east, south-west, and so on. Now the variable "direction of travel" has 8 possible values. To encode this variable, we can use 3 bits:
north : 000
north-east: 001
east : 010
south-east : 011
south : 100
south-west: 101
west : 110
north-west: 111

Thus we need more bits to encode a variable which has a greater number of states. Why encode? Consider a variable that can take 1024 different values. If we use encoding, then the state of the variable can be described by 10 bits. In other words, 10 bits of information are enough to give us a complete description of a variable which can take 1024 different values. Thus, the number of bits of information needed to specify a variable increases as the number of possible states of the variable increases.
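
To make the counting concrete, here is a small sketch of my own (encode_states is just a made-up helper name) that assigns fixed-length binary codes exactly like the tables above:

[code]
import math

def encode_states(states):
    # Fixed-length binary code: each state needs ceil(log2(number of states)) bits
    bits = max(1, math.ceil(math.log2(len(states))))
    return {state: format(i, '0{}b'.format(bits)) for i, state in enumerate(states)}

print(encode_states(['left', 'right']))                              # 1 bit each
print(encode_states(['north', 'east', 'west', 'south']))             # 2 bits each
print(encode_states(['N', 'NE', 'E', 'SE', 'S', 'SW', 'W', 'NW']))   # 3 bits each
print(math.ceil(math.log2(1024)))                                    # 10 bits for 1024 possible values
[/code]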

The example given is simple and uses binary coding. There are other types of coding possible which are more efficient and exploit properties like the likelihood of occurrence of the different possible states of the variable under consideration.

[quote author=Saša]
knowing that the entropy S of a system is defined as (Boltzmann equation): S ~ ln W, where W is accessible phase space, i.e. possible states of the system. So, as I understand above underlined part, it states that information increases the entropy of a system (variable)?
[/quote]

The basis of information theory comes from the work of Claude Shannon. The mathematical formulation of Shannon "entropy" has a logarithmic term just like the Boltzmann equation above. When computed, Shannon entropy measures the amount of "uncertainty" in the system. You can think of it this way - if a system has a greater number of possible states, in general we will need more information to describe the system (just like we needed 3 bits when we allowed 8 directions of travel instead of 1 bit if we just allowed left or right travel) - or ask more questions to guess which state the system is in. I don't know if you are thinking "entropy=disorder=bad" and "information=order=good" in this context, but it helps if you don't - osit.
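
If it helps, here is a toy illustration (my own sketch, nothing formal from IIT): for a uniform distribution over W equally likely states the Shannon entropy is just log2(W) bits, which parallels the Boltzmann S ~ ln W you quoted - the same counting of states, only a different logarithm base (and Boltzmann's constant as the physical unit).

[code]
import math

# Shannon entropy of a uniform distribution over W equally likely states: H = log2(W) bits.
# Compare with Boltzmann: S ~ ln W - same idea, different log base and units.
for W in (2, 8, 1024):
    print(W, math.log2(W))   # 1.0, 3.0 and 10.0 bits respectively
[/code]

So a bigger repertoire does not mean that information "increases the entropy"; it means there is more uncertainty to begin with, and correspondingly more information is needed to pin the state down - osit.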
 
mkrnhr said:
However, I agree that the approach of explaining everything from a reductionist point of view must appear very limited for those of us who are somehow used to multilevel analysis, but that's all they know of.

Me too, though at the same time, it could still be quite useful by providing a systematic and tangible/physical framework to describe Information Theory in a biological context. It may even force some astute researchers to look beyond the reductionist view. I could see something like mirror neuron experiments possibly coming up with results where the total information transferred can't be accounted for from brain activity in fMRI scans, or something to that effect.
 
obyvatel said:
[...]

The basis of information theory comes from the work of Claude Shannon. The mathematical formulation of Shannon "entropy" has a logarithmic term just like the Boltzmann equation above. When computed, Shannon entropy measures the amount of "uncertainty" in the system. You can think of it this way - if a system has a greater number of possible states, in general we will need more information to describe the system (just like we needed 3 bits when we allowed 8 directions of travel instead of 1 bit if we just allowed left or right travel) - or ask more questions to guess which state the system is in. I don't know if you are thinking "entropy=disorder=bad" and "information=order=good" in this context, but it helps if you don't - osit.

Thank you obyvatel for this detailed and very clear explanation, especially the part about the increase in the number of bits needed to encode the variable/system with a growing number of possible states. :flowers:

I'm understanding entropy mostly and mainly from mathematical/physics POV: as a measure of available states that some system can obtain (the term W), which basically corresponds to the "uncertainty" mentioned above, thus not associated with "man-given" attributes of good and bad.
It's just a quantity that describes a given system, and in principle not opposed to information, but inversely proportional to the amount of information someone has about the given system (uncertainty/unknown -> all - known), or proportional to the amount of information needed to unambiguously determine/specify that system.
IOW, from this POV, entropy IS information: information telling us what (how much) is still unknown (needed) to have complete description/determination.

Hope this didn't muddy the waters even more than before.
 
Saša said:
http://en.wikipedia.org/wiki/Integrated_information_theory said:
Integrated information theory (IIT) is a proposed theoretical framework intended to understand and explain the nature of consciousness... IIT embraces the information theoretical sense of information; that is, information is the reduction in uncertainty regarding the state of a variable, and conversely is what increases in specifying a variable with a growing number of possible states. When applied to conscious experience as we know it, since the number of different possible experiences generated by a human consciousness is considerably large, the amount of information this conscious system must hold should also be large. The list of a system's possible states is called its "repertoire" in IIT.
...knowing that the entropy S of a system is defined as (Boltzmann equation): S ~ ln W, where W is accessible phase space, i.e. possible states of the system. So, as I understand above underlined part, it states that information increases the entropy of a system...

Saša said:
I'm understanding entropy mostly and mainly from mathematical/physics POV... IOW, from this POV, entropy IS information: information telling us what (how much) is still unknown (needed) to have complete description/determination.

I hang out on a philosophical Idealism Facebook group that is into IIT. Philosophical Idealism is pantheism/7th-density-like. One thing this group mentioned is that IIT may require a Quantum Mechanics version; they say this partly because the group is also into the Penrose-Hameroff Quantum Consciousness model. Quantum consciousness may add a solution to what is called the binding problem. Via superpositions, quantum mechanics may be able to bind a lot of information into a single quasi-particle-like structure.

What though is this information in Mathematical Physics terms?

ark said:
Our groupoid, consisting of a couple of points and of connecting them arrows, is just a frame, a scaffolding, and, as such, not very attractive to our minds. Therefore, in order to make our structure more “habitable”, we will cover it with a smooth structure – much like Bartholdi covered the skeleton of the Statue of Liberty with copper. Instead of copper we use another smooth material, this time of a mathematical nature, that is with numbers... our mathematical sculpture, based on the fundamental groupoid skeleton, when finished, has the shape of the algebra of 2x2 complex matrices:

[a b]
[c d]

where a,b,c,d are complex numbers, each having a real and an imaginary part. We have four complex numbers, that is eight real numbers. Our complete construction spans therefore an eight-dimensional space!

Some more details on this math structure:

http://www.tony5m17h.net/Thv0207095.pdf

Cl(2N;C) = Cl(2;C) x ...(N times tensor product)... x Cl(2;C)
Cl(2;C) = M2(C) = 2x2 complex matrices

http://www.tony5m17h.net/II1vNfactor.html
Since Cl(2n,C) = Mat2^n(C) and Cl(2,C) = Mat2(C), it can be written as
Cl(2n,C) = Mat2^n(C) = Mat2(C) x ...(n times tensor)... x Mat2(C).

That leads me to think that my D4-D5-E6-E7-E8 VoDou Physics model is basically similar to hyperfinite II1, but with real instead of complex structure, as follows:

My D4-D5-E6-E7-E8 VoDou Physics model uses the real-Clifford-periodicity of order 8, in which the tensor factorization is
Cl(8n,R) = Cl(8,R) x ...(n times tensor)... x Cl(8,R).

Cl(8,R) = Mat16(R) and has graded structure
1 8 28 56 70 56 28 8 1

and total dimension 2^8 = 256 with full spinor dimension sqrt(256) = 16 and two mirror image half-spinors, each 8-dimensional. Note that the 52-dimensional exceptional Lie algebra F4 corresponds to the ( 8 + 28 +16 )-dimensional vectors plus bivectors plus full spinors.

So you have a binary structure (2^8=256) that can give you spinors for matter/antimatter (fermions), vectors for spacetime, and bivectors for force particles (bosons). How to do it exactly is another story. Ark for example uses the bivectors of the Cl(6) inside the Cl(8) to get the conformal group that he uses for conformal gravity.
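
As a quick sanity check on those dimension counts (a Python sketch of my own, just doing the arithmetic quoted above):

[code]
from math import comb

# Graded structure of Cl(8,R): the grade-k subspace has dimension C(8,k)
grades = [comb(8, k) for k in range(9)]
print(grades)                        # [1, 8, 28, 56, 70, 56, 28, 8, 1]
print(sum(grades), 2 ** 8)           # 256 256 - total dimension
print(int(sum(grades) ** 0.5))       # 16 - full spinor dimension sqrt(256)
print(grades[1] + grades[2] + 16)    # 52 - vectors + bivectors + full spinors, matching F4
[/code]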

Relating this a little more to philosophical Idealism/pantheism/7th density consciousness:

http://www.tony5m17h.net/IFAEinstein.html

Since Einstein referred to Spinoza as a "religious genius ... distinguished by ... cosmic religious feeling", my comments will come from a Spinoza - Pantheist - Taoist perspective. According to the Stanford Encyclopedia of Philosophy:

"… A defining feature of pantheism is allegedly that God is wholly immanent … pantheism denies the theistic view that God transcends the world …

the most complete attempt at explaining and defending pantheism from a philosophical perspective is Spinoza's Ethic …

philosophical Taoism is one of the best articulated and thoroughly pantheistic positions there is …".

So that I can discuss how such religion fits together with science, here is an outline of how I see science, taken from comments (by NC and B. and Count Iblis and me) on the Cosmic Variance blog:

NC said: "… the idea that genes drive evolution … is stopping off at an arbitrary point in the long chain of causality. The only scientific thing … to do, … searching for ultimate causes, is to not stop at genes but go on a step and tell us about how 'selfish background radiation' drives evolution …".

I said: "... Once background radiation is brought into play, from a pantheistic view, you get to Dave Rothstein's possibility of "God intervening every time a [quantum event] measurement occurs". ...".

B. said: "… Instead of asking where the universe comes from, ask where the natural numbers 'come from'. ... Is maths the foundation for the theory of everything? …".

Count Iblis said: "… That's a good question and that has lead some people to postulate that reality is purely mathematical in nature. … You can define them [the natural numbers] recursively:

0 = {} (empty set)
1 = 0 U {0} = {0}
2 = 1 U {1} = {0,1}
etc. …";

B. said: "… in the end you'll sit in this field of complex numbers, and every one of them is just a point in a plane. Does C have a cause? …".

I said: "... If you want to continue the process, you might note that

C is the real Clifford algebra Cl(0,1;R), and
you can go from there on to real Clifford algebras of arbitrarily high dimension.
Since bivectors give you Lie algebras, you get gauge-group-type things, and
you might think of spinors as fermions, and
think of the vector space as spacetime, and
even try to put such things together to form Lagrangians … and
see where Einstein-type faith might lead. ..".

So, my picture of science is very much like my physics model based on real Clifford algebras, which can be described (due to periodicity-8) in terms of the 2^8 = 256-dimensional Clifford algebra Cl(8).

A little on the entropy/information of a universe:

http://www.tony5m17h.net/cosm.html

... The quantum entropy of N = I qubits is: S = I ln(2). ...[if the bound were saturated]... We would get .. a huge total entropy S_now = 10^120 ln(2) ...

... to get the actual entropy [now], one should compute it as: ...
S_now = 10^120 ln(2) / S_decoherence = 10^120 / I_max

If one agrees with Penrose [... The Emperor's New Mind, Oxford University Press (1989). in which Penrose says:

"... let us consider what was previously thought to supply the largest contribution to the entropy of the universe, namely the 2.7 K black-body background radiation. ... The background radiation entropy is something like 10^8 for every baryon (where I am now choosing 'natural units', so that Boltzmann's constant, is unity). (In effect, this means that there are 10^8 photons in the background radiation for every baryon.) Thus, with 10^80 baryons in all, we should have a total entropy of 10^88 for the entropy in the background radiation in the universe. Indeed, were it not for the black holes, this figure would represent the total entropy of the universe, since the entropy in the background radiation swamps that in all other ordinary processes. The entropy per baryon in the sun, for example, is of order unity. On the other hand, by black-hole standards, the background radiation entropy is utter 'chicken feed'. For the Bekenstein-Hawking formula tells us that the entropy per baryon in a solar mass black hole is about 10^20, in natural units, so had the universe consisted ... of ... galaxies ...[consisting]... mainly of ordinary stars - some 10^11 of them - and each to have a million (i.e. 10^6) solar-mass black hole at its core ... Calculation shows that the entropy per baryon would now be actually ... 10^21, giving a total entropy, in natural units, of 10^101 ...]

... that the entropy now should be of order 10^101, this corresponds to the maximum amount of quantum information at the moment of decoherence: ...
I_max = 10^19 (n_cr = 10^9)

where n_cr stands for the critical number of nodes which are needed to process the maximum quantum information, I_max ... it follows that the early quantum computational universe decohered at ...
t_decoherence = 10^(-34) sec.
 
Saša said:
IOW, from this POV, entropy IS information: information telling us what (how much) is still unknown (needed) to have complete description/determination.

Indeed. I would also say that entropy measures the potentiality of information (there would be no order without a potentiality for "chaos", to use a more archaic description). The Shannon entropy measures the maximum "space" for information to be transmitted through, but it's independent of the content. It's like the thermodynamic entropy which describes the permissible states of a system in the phase space. It tells nothing about the information itself, although there are attempts to define that through measures of complexity (Kolmogorov's entropy comes to mind but I don't recall the details). For example, you can have some white noise (maximum entropy, no information). Also, information depends not only on what the medium permits (entropy) but also on the sender (coder) and the receiver (decoder).

I think this point has been discussed somewhere in this thread.
 
mkrnhr said:
Saša said:
IOW, from this POV, entropy IS information: information telling us what (how much) is still unknown (needed) to have complete description/determination.

Indeed. I would also say that entropy measures the potentiality of information (there would be no order without a potentiality for "chaos", to use a more archaic description). The Shannon entropy measures the maximum "space" for information to be transmitted through, but it's independent of the content. It's like the thermodynamic entropy which describes the permissible states of a system in the phase space. It tells nothing about the information itself,

One analogy could be an empty DVD. It is essentially a container where a certain amount of information can be stored. Compare the DVD to a floppy disk; the latter has far less capacity to store information. The value or meaning of what is stored is a different matter - something that Shannon entropy is not concerned with.

If we bring consciousness into the picture for the analogy, it is not just the storage capacity that is important but also how meaningful or integrated the stored content is. This is what IIT refers to.

In Work terminology, the concept of "being" involves both these aspects
- how much learning from experience is stored
- how interconnected the results of learning are (Steve Mithen's "Prehistory of the Mind").

This is alluded to in Gurdjieff's "knowledge and being" relationship.

[quote author=ISOTM]
People of Western culture put great value on the level of a man's knowledge but they do not value the level of a man's being and are not ashamed of the low level of their own being. They do not even understand what it means. And they do not understand that a man's knowledge depends on the level of his being.
"If knowledge gets far ahead of being, it becomes theoretical and abstract and inapplicable to life, or actually harmful, because instead of serving life and helping people the better to struggle with the difficulties they meet, it begins to complicate man's life, brings new difficulties into it, new troubles and calamities which were not there before.

"The reason for this is that knowledge which is not in accordance with being cannot be large enough for, or sufficiently suited to, man's real needs. It will always be a knowledge of one thing together with ignorance of another thing; a knowledge of the detail without a knowledge of the whole; a knowledge of the form without a knowledge of the essence.

"Such preponderance of knowledge over being is observed in present-day culture. The idea of the value and importance of the level of being is completely forgotten. And it is forgotten that the level of knowledge is determined by the level of being. Actually at a given level of being the possibilities of knowledge are limited and finite. Within the limits of a given being the quality of knowledge cannot be changed, and the accumulation of information of one and the same nature, within already known limits, alone is possible. A change in the nature of knowledge is possible only with a change in the nature of being.

[/quote]


Shannon's information/entropy in its original form deals with probability distributions of random variables and their mapping to real numbers. Kolmogorov's complexity on the other hand may be more applicable in the context of relating consciousness and information.

From "Shannon Information and Kolmogorov Complexity " by Peter Grunwald and Paul Vitanyi

Kolmogorov complexity is defined by a function that maps objects (to be thought of as natural numbers or sequences of symbols, for example outcomes of the random variables figuring in the Shannon theory) to the natural numbers. Intuitively, the Kolmogorov complexity of a sequence is the length (in bits) of the shortest computer program that prints the sequence and then halts.
[...]
Suppose we want to describe a given object by a finite binary string. We do not care whether the object has many descriptions; however, each description should describe but one object. From among all descriptions of an object we can take the length of the shortest description as a measure of the object’s complexity. It is natural to call an object “simple” if it has at least one short description, and to call it “complex” if all of its descriptions are long.

Relating this to the Work context, consider reacting to situations in a predictable machine-like manner, often referred to as "running a program". If this "i" is predictable, then describing this "i" is relatively easy. In information theory language, we will need only a short sequence of symbols to describe this state completely; its Kolmogorov complexity will be low and it will be a "simple" object with less information content.

Conversely, with higher levels of integration of the psyche, responses become more autonomous and complex. To describe such a state adequately, progressively longer sequences will be needed, indicating a higher Kolmogorov complexity and higher information content.
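
Kolmogorov complexity itself is uncomputable, but a crude way to get a feel for the idea is to use a compressor as a stand-in (a toy sketch of my own, not anything from the paper): a predictable, repetitive sequence has a short description, while an unstructured one of the same length does not.

[code]
import zlib, random

predictable = b"yes no " * 200                                       # a repetitive, "running a program" pattern
unpredictable = bytes(random.randrange(256) for _ in range(1400))    # same length, no structure

print(len(predictable), len(zlib.compress(predictable)))       # 1400 -> a few dozen bytes
print(len(unpredictable), len(zlib.compress(unpredictable)))   # 1400 -> stays close to 1400 bytes
[/code]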
 
Would it be helpful to separate entropy in "non-living" objects from living systems? It seems that life has a "negative" entropy / reverse entropy operating (reflected in the ability to rejuvenate / reproduce on a cellular level, etc.). There was a file Ark posted a few years ago "What is Life?" by Schrodinger that lays out this negative entropy idea.

From an esoteric science / 4th Way Work point-of-view, when there's enough consciousness where a given "something" shows the signs of a living organism, then the entropy principles of "non-living" systems don't apply. Keep in mind that from 4th Way esoteric science's stance there's nothing in existence that is NOT alive. The C's have also said that even first density ("non-life") has consciousness. But it seems the consciousness on that level is so mechanical that it is fully controlled by physical laws only; and limited to the information that can be received by the natural physical laws that apply. Whereas living organisms / systems can have this negative entropy and grow and develop to levels where exclusively strict physical laws, including entropy in physical terms, may not apply in the same way as non-living systems.

If so, then the quality AND quantity of information receiving and transmitting goes to higher and higher levels as the consciousness level goes up, right? The more Being increases, the more knowledge AND understanding (connections) are possible. We can speculate that at a higher level of Being, what is now in our unconscious would be conscious. The hypothetical example of 4th density in the C's transmissions posits that at that level thoughts / mind can literally change the physical reality - variable physicality. So the quality of information receiving, transmitting, processing, etc. is of a whole other order in 4D.
 
I think the living/non-living consideration relates to the concept of densities. In this case, everything has its own level of consciousness according to its limitations. 1d information would be about fundamental interaction, the level of analysis of physics with particles, fields, etc.: two particles interact because they sense each other; basically non-evolutive. 2d information would be about a self-propagating order (information) in competition with thermodynamic entropy, evolutive only through external input. 3d would be about self-propagated consciousness where the "foraging" of information induces a transformation of the system itself, which can be seen as adaptive from the information point of view, evolutive externally and internally because of the limits imposed by physicality. 4d, where physicality is much less predominant, should offer other possibilities to the consciousness capabilities. It all depends on the level of complexity and the mode of information analysis.
The DVD analogy is brilliant. In our density it could be a system that auto-updates its complexity through its self-adaptive processing of information (here the notion of Truth can play a role, because absorbing lies would be like eating something that is harmful to the organism, since it doesn't harmonize with the internal structure that allows growth) until the limit is reached (learning all 3d lessons), in which case a major update is needed. OSIT for the metaphoric aspect.
 
Yeah, that's a good explanation. And with the DVD metaphor, it's like when the limits of 3D are reached (and all lessons learned) you go to a whole other order of storage capacity (say, like Blu-ray).
 
SeekinTruth said:
Would it be helpful to separate entropy in "non-living" objects from living systems? It seems that life has a "negative" entropy / reverse entropy operating (reflected in the ability to rejuvenate / reproduce on a cellular level, etc.). There was a file Ark posted a few years ago "What is Life?" by Schrodinger that lays out this negative entropy idea.

I don't know for sure, but I somehow doubt it, since entropy (as defined/stated in previous posts) might be useful in determining the advancement/development of individual entities/beings. More below...
Thanks for the hint about Schroedinger's article; here's the link to it from this thread.


SeekinTruth said:
From an esoteric science / 4th Way Work point-of-view, when there's enough consciousness where a given "something" shows the signs of a living organism, then the entropy principles of "non-living" systems don't apply. Keep in mind that from 4th Way esoteric science's stance there's nothing in existence that is NOT alive. The C's have also said that even first density ("non-life") has consciousness. But it seems the consciousness on that level is so mechanical that it is fully controlled by physical laws only; and limited to the information that can be received by the natural physical laws that apply. Whereas living organisms / systems can have this negative entropy and grow and develop to levels where exclusively strict physical laws, including entropy in physical terms, may not apply in the same way as non-living systems.

If so, then the quality AND quantity of information receiving and transmitting goes to higher and higher levels as the consciousness level goes up, right? The more Being increases, the more knowledge AND understanding (connections) are possible. We can speculate that at a higher level of Being, what is now in our unconscious would be conscious. The hypothetical example of 4th density in the C's transmissions posits that at that level thoughts / mind can literally change the physical reality - variable physicality. So the quality of information receiving, transmitting, processing, etc. is of a whole other order in 4D.

Leaving aside the weight of information characteristic of a given level of existence (density), I would say the entropy as presented in previous posts could be used in characterising the advancement of a system/entity on a given level, and its stepping onto the next one (phase transition, or graduation as the C's have put it).

Here's a simplified framework of my understanding:
Learning the lessons of a given level would correspond to understanding, for which neither gathered knowledge nor the "size" of being alone would suffice; the combination (maybe convolution would be the proper mathematical term) of the two is needed. Only with understanding would the entity (on a certain level of existence) really know how much more there is to learn/experience, while with just gathering knowledge this perception would not change, IMO. Hence, the entropy as a measure of the (still) unknown for this entity would increase, and, as stated in thermodynamics/statistical physics, it would define the direction of the arrow of time (increase in entropy -> real/forward direction of time), i.e. advancement.

Each level of existence (density) has a certain amount of lessons associated with it (like the curriculum of the 3rd grade in school, for example). When stepping onto the next one (like a phase transition), the entropy (as defined above) for the entity would exhibit a huge increase, as new landscapes (the understanding of the amount of unknown) would (momentarily) become available. That could represent a graduation, viewed from the entity's perspective, since it would not be completely aware of the level boundary while living/experiencing that level (density) itself.

On the other hand, an outside observer would "see" the density boundaries (like viewing the curriculum), and for him the entropy of that entity would not increase in the case of "normal" advancement through a given density (having knowledge of the amount of lessons associated with it) or overall (having knowledge of All That Exists), including graduations. In this sense, the first situation could maybe be related to 5D (contemplation level, making a "check list" while reviewing the curriculum) and the latter one to 7D, where/when in principle time doesn't exist (at all).

The above would kind of present a level/density-independent model, and mkrnhr gave a really nice and, IMO, valid assessment of the information-density relationship.

Regarding the relation between awareness/consciousness and polarity/duality (STS/STO), and WRT conscience, I'm still trying to wrap my mind around it, since a real understanding of how an STS entity advances, when conscience is the organ that facilitates the growth needed for advancement and graduation, is still escaping me. Well, the underlying assumption is that STS's don't have conscience, which might be wrong. Otherwise, I have even more difficulty understanding the STS way and graduation to the 4D STS side. Like, having no conscience, and thus no possibility to grow and increase consciousness by themselves, STS entities impose lies on those who have conscience and, by making them believe lies, eat their consciousness (which probably can be done in more ways than just imposing lies and making someone believe them) and grow that way. The problem is the reconciliation of wishful thinking (characteristic of STS) and the increase of awareness/consciousness (a prerequisite for advancement/graduation), which, for me, kind of go in opposite directions. Well, work in progress...
 
What I find intriguing is that though we seem to understand quite a bit about matter (1D) and its interactions in aggregate form, we are quite in the dark about how smaller units of matter behave. Statistical physics can do a good job of explaining a lot of matter-to-matter interactions. However, its accurate predictions refer to an ensemble - a large number of particles, molecules, etc. The fate of individual particles and molecules is not described by statistical physics. Enter quantum mechanics and uncertainty. Fundamentally, we do not know how one electron exists and interacts with its environment in all its details.

Is this state of affairs due to a lack of knowledge on our part - i.e. will the state of affairs change when we are able to build better instruments and/or ever more abstract theories? Once we are able to build a quantum computer, will all be known about matter? That is a possibility that cannot be ruled out, and many (if not most) scientists believe it. And since it seems that this is a matter of belief, here is an alternative belief. There is uncertainty at all levels of creation. The more one knows (traverses the density hierarchy), the more the hazards due to uncertainty (briefly mentioned in this thread) increase instead of decreasing. Seems counter-intuitive at first glance and conflicts with the age-old idea that there is an omniscient and omnipotent God. However, within the narrow range of experience in the human realm, it rings true. More knowledge, more responsibility, higher stakes for decisions and actions.

But what does it mean that we cannot understand 1D completely? When we heat a pot of water, we know that if we heat it strong enough and long enough, we will change the state of water to vapor. But neither can we distinguish effectively one molecule of water from another molecule of water; nor can we predict the fate of one molecule of water throughout the heating process. This is where I believe consciousness comes in along with the "esoteric" assertion that everything has consciousness. Part of what we call 1D extends beyond the realm of what we can grasp. Being 3D we lord over 1D - or so it seems - but not quite completely. This may have interesting connotations at a changed scale where we may be like an ensemble of molecules for a higher intelligence.

[quote author=Saša]
Regarding the relation between awareness/consciousness and polarity/duality (STS/STO), and WRT conscience, I'm still trying to wrap my mind around it, since a real understanding of how an STS entity advances, when conscience is the organ that facilitates the growth needed for advancement and graduation, is still escaping me. Well, the underlying assumption is that STS's don't have conscience, which might be wrong. Otherwise, I have even more difficulty understanding the STS way and graduation to the 4D STS side.
[/quote]

I do not think developing conscience as we understand the term is the only way to advance from one level to another. The STO way of advancement may require the development of values through work on the emotional center in a particular direction, which leads to the development of conscience. However, increasing complexity (in the information-theoretic sense) is most likely attainable without a concomitant development of conscience, and that seems to be the signature of what we call STS development. Smart does not necessarily equate to good. That is my current understanding.
 