Monday, December 1, 2008

Quantum Game of Life II

by Carlos Pedro Gonçalves

Below is a video of a simulation of the quantum game of life, implemented in NetLogo.



Following the previous blog post on this game, each cell (patch) can be in a vacuum state, or it can become punctured by a qubit state. In the video, the grey cells correspond to the vacuum states that may potentially become occupied by a qubit state |Q> = a|0> + b|1>, while the black and white cells have become occupied (in act) by a qubit state. Each color corresponds to one of the two basis states of the qubit, that is, white corresponds to |0> and black corresponds to |1>.

The actualization of each of the two qubit computational basis states occurs through a probabilistic process, since there is decoherence in the qubit histories. The model works with two gate rules: a simple Hadamard rule, and a chaotic rule that exemplifies path-dependent quantum computation. The video above uses path-dependent quantum computation, in the dynamics of creation and annihilation as well as in the qubit information dynamics.
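
The original model is implemented in NetLogo; the following is a minimal Python sketch of the kind of single-cell update described above, assuming a Hadamard gate rule followed by a probabilistic actualization step that stands in for decoherence. The function and parameter names are illustrative and are not taken from the NetLogo model.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

    def update_cell(amplitudes, rng):
        # Apply the gate rule to the cell's qubit amplitudes (a, b), then
        # "actualize" one basis state with probability |a|^2 or |b|^2,
        # a crude stand-in for decoherence of the qubit histories.
        a, b = H @ amplitudes
        outcome = 0 if rng.random() < abs(a) ** 2 else 1   # 0 -> white, 1 -> black
        return outcome, np.array([a, b])

    rng = np.random.default_rng(0)
    state = np.array([1.0, 0.0], dtype=complex)    # cell starts occupied by |0>
    outcome, state = update_cell(state, rng)
    print("displayed color:", "white (|0>)" if outcome == 0 else "black (|1>)")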

The video is also available at:

http://www.youtube.com/watch?v=14jdqWEe0wE

Saturday, November 29, 2008

Quantum Game of Life

by Carlos Pedro Gonçalves

The following are pictures of a simulation of a quantum game of life, implemented in NetLogo:


Initial Configuration




After a Few Steps



Each cell (patch) can be in a vacuum state, or it can become punctured by a qubit state. In the pictures the black cells correspond to the vacuum states that may potentially become occupied by a qubit state, while the white cells have become occupied (in act) by a qubit state.

Quantum information is, thus, created and annihilated out of an information vacuum. A clustered patchworked dynamic geometry emerges, in the current simulation, from local connectivity rules that make it more probable for clustering to occur.
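
A minimal sketch of the kind of local connectivity rule alluded to above, assuming that a vacuum patch becomes occupied with a probability that grows with the number of occupied neighbors; the grid size, base rate and bonus are illustrative values, not the ones used in the NetLogo model.

    import numpy as np

    rng = np.random.default_rng(1)
    grid = np.zeros((20, 20), dtype=bool)      # False = vacuum, True = occupied by a qubit
    grid[10, 10] = True                        # seed a single occupied patch

    for _ in range(50):
        # Count occupied Moore neighbors on a toroidal grid.
        neighbors = sum(np.roll(np.roll(grid, i, 0), j, 1)
                        for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
        # Creation is more probable next to occupied patches, which favors clustering.
        p_create = np.minimum(1.0, 0.05 + 0.15 * neighbors)
        grid |= (~grid) & (rng.random(grid.shape) < p_create)

    print("occupied patches after 50 steps:", int(grid.sum()))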

Friday, November 28, 2008

Defining Complexity

by Maria Odete Madeira

Diversity and complexity are contextually (co)implicated, the nature of each depending on the nature of the context in which they are rooted and upon which they systemically depend. This means that to refer the term complexity to a single definition would be not only impoverishing but also difficult in terms of a desirable explicative rigor.

To speak of complexity one must comprehend complexity itself. To comprehend comes from the Latin cum + prehendere, which means to apprehend conjointly; thus, the term “comprehend” folds, in its own definition, the diversity that allows one to designate it also as complex, in the sense in which it folds, in the same act of comprehension, successive resendings between different individuals, or subjects, which implies an explicare and, thus, an unfolding of the subjectivity in intersubjectivity.

Any attempt to define complexity will have to consider, in the act itself of defining complexity, the complexity implicated in the act of comprehension, necessary to that definition.

Comprehending complexity systemically depends upon individuals, groups, species, cultures, societies and civilizations, in their differences, multiplicities and diversities.

Tuesday, November 18, 2008

Being and Existence

by Maria Odete Madeira

In Plato, as in Parmenides, the Being is not addressed from the plane of the Existence, but from the plane of the Truth.

The Being is the very act of Being, immediately realized within the Being, itself.
To Be the Being is the realization of the Unity in its irreducible absolute act of Being.

When, within a Platonic position, one states that the Beauty, the Good or the Truth ARE, one is stating that the Beauty, the Good, or the Truth are in themselves as such: Beauty, Good and Truth.

This in itself means a pure functionality of the Being, as condition of itself, depurated of any constitutive relational element, but which, because of that, can itself constitute itself necessarily as condition of possibility of the existence of the entities in the relative rotative relational positions of the multiple and divergent modes of the being there (dasein).

Wednesday, November 5, 2008

Nietzsche - Man... a chain of forces...

by Maria Odete Madeira

“Eu que sou filho da terra, sinto as doenças do sol como se fossem eclipses meus (…)” (Nietzsche, The Will to Power)

“I, that am son of the earth, feel the diseases of the sun as if they were my eclipses (…)”

For Nietzsche, the “I” is a sort of animal, an individuation constituted by chains of multiple forces articulated between each other and whose motion of rotation displaces itself incessantly in the direction of other forces upon which it exerts its pressure, growing or diminishing, rhizomating in accordance with the trajects and trajectivities.

Man… a chain of forces dependent upon all the cosmic forces, man… a territory of fluxes of individuated forces, whose frontiers are critically situated between two limits: the infinitely large and the infinitely small, and…, yet, unpredictable, and…, as any small particle, man is a being in itself, simultaneously potential/virtual and actual, simultaneously wave and particle.

Sunday, November 2, 2008

The Nothing

by Maria Odete Madeira

Nihilum or Nothing is a referential expression that signals the non-existence of all the things.

The Nothing is understood in its relation with the Being: the Being is and the Nothing is-not; the Being is the Being, because the Being is, and the Nothing is the Not-Being because the Nothing is-not.

However, the Nothing is also Being, because the Nothing is the Nothing, that is, the Nothing is the Being of the Not-Being.

Every referential expression resends towards a referent. The referential expression Nothing resends towards a referent: the Being of all the things.

In any language the referent can be defined as an object of perception. In the referential expression Nothing, the referent is the Being itself that, as such, has the power of generating its own negation: the Not-Being of the Being, understood as a non-presence and, in this way, as a non-existence.

Parmenides introduced, in the Western thinking, the concept of eternity as total presence of the Being. In accordance with Parmenides, the Being exists as total presence, the Being is not born nor dies, the Being is and exists as an eternal present.

However, the Not of the negation imposed itself on the human thinking and language as a power of the Being, as capability of displacing oneself, in a strange loop, from the Being to the Not-Being, in the same flow of Being, as total presence, and, thus, simultaneous, coincident and permanent rotativity of oneself, in oneself and for oneself.

Heidegger experienced, through a feeling of anguish, the interval of thinking between the Being and the Entity. It is plunged in a profound anguish that the author conjectures that the entities are not the Being and that the truth of the entities is in that which they are not: the Being itself.

For Heidegger, in front of the Being, the man is helpless, lost in the Nothing. Because the Being, itself, is neither this nor that: the Being, itself, is Nothing.

Friday, October 24, 2008

Cognition, Truth, Essence and Substance

by Maria Odete Madeira

Every cognitive act is about one or more relations. This about is never neutral, it is committed with an intentionality that points towards a signaled relation. To point is also to orient oneself towards the existing things as presences in the world (ousia, dasein), in relation with one another (Heidegger, 1969).

In Plato’s Phaedo and Phaedrus, the problem of knowledge is placed in what regards its object, it being concluded that the response to the problem would have to be found through knowledge itself; but for that, and in accordance with the Platonic procedures, it was necessary to first define what knowledge is in itself.

In the Meno, an exercise of fundamental reflexivity is developed towards defining knowledge, in regards to its origin and nature, but it is in Book VII of the Republic that a theory of knowledge is developed, whose archetypal operationality still remains in all the Western scientific thinking. This theory of knowledge proposes a cognitive model, in which the knowledge acquired through the senses is linked to change, which is a configurator of ephemeral appearances, and, because of that, this knowledge is considered an uncertain knowledge, particular and subjective, a knowing of opinion or doxa (Note 1). The true knowledge or episteme is sought in an immutable reality, intelligible and supra-sensitive, only accessible to the human intellect.

The Greek terms episteme and logos that mean, respectively, science (true discourse) and discourse, compose the term epistemology. This term still remains, in all the Western thinking, reflexively committed with the Greek matrix in what regards the conditions of truth, consistency, legitimacy and legality of knowledge, that refers it and relates it with the existence and reality of the things.

The Greeks understood and considered reality to simply be that which things were. However, this principle of definition would become more complex and controversial, by the need for the development of the concept itself, from the first causes and first principles of the things themselves, that is, from that which explained the nature of these things, in what regarded their origin, which led to the need to work upon two fundamental notions: the notion of matter (Note 2) and the notion of form (Note 3).

These two notions appeared and were developed in the Greek thinking as two primary epistemological oppositions. The first Greek philosophers and, later on, the atomists, considered the first causes and the first principles, according to matter, while Socrates, Plato and Aristotle considered these same causes and principles, according to form. This division has configured, along the whole Western thinking, the interpretative trajectivity of notions fundamental to the exercise of the reflexive thinking in its relation with the conditions of truth, synthesized in the scientific statements.

Heraclitus conjectured an idea of truth persisting and governing through the permanent change of all the things, that is, an order and proportion that the logos expressed and that constituted a universal explicative pattern accessible to all and that, because of that, should be consensually interpreted.

Parmenides, on the contrary, argued for the existence of a truth inaccessible to the common of mortals, being beyond every aspects and perspectives of the daily experiences, that are linked to change or to opinion (doxa). This truth was only accessible to the pure thought or noema.

One can state that with Heraclitus and Parmenides it was trajected, in the Western thinking, a perspectivic genetics that replicated, in the scientific statements, a fundamental opposition between the senses and the intellect or reason, the first, linked to the change and the experience of this change, as well as to the visibility of this change, and, the latter, linked to what is considered to be invisible, permanent and immutable.

However, and even though the methodological criteria diverged, in each of the two thinkers, Heraclitus and Parmenides wished to access the same permanent order and proportion responsible for the existence of all the things and from which any discourses could be stated as true.

The character of permanence, immutability and indestructibility constituted, thus, the fabric of the notion of truth and of the true discourse which, in turn, were linked to the notion of necessity (anankê, necessitas). Some thing could be considered true only when that same thing was necessarily true. The necessity was defined by Aristotle as that which has to be in that way and cannot be in any other way (Meta., V, 5, 1015a; 1015b).

The notion of necessity signals an idea of permanence that, in turn, places it in relation with the notion of essence, from the Latin essentia, term derived from esse which can mean, as verb, to be or to exist and, as name, the being.

Essence has its Greek correspondent in ousia, a noun formed from "to einai" which, in turn, also means to be. In its origin, essence meant just being; later on, and through usage, it came to mean that which is. The accent in the verb displaced itself to the subject (hypokeimenon, subjectum) to which the being is attributed.

Later on, the notion of essence came to coincide with the notion of substance, from the Latin substare (to underlie), interpreted (the notion of substance) as subject and substratum of accidents. In simple and general terms, one can speak of the essence of the substance and of the accidents.

With Plato, the term eidos (Note 4) (eidos or idea) came also to designate that which was understood by essence (or essences) of the things.

For Plato, in the line of Parmenides, the ideas or essences existed separate from the concrete things. According to the author, the corruptible nature of all the things that constituted the world of sensitive experience, made it impossible the access to a true explanation about the things. The true knowledge or episteme should, thus, be sought in a supra-sensitive reality, only accessible to the intellect, that is, in the ideas or intelligible essences.

The theoretical or intuitive nature of the platonic knowledge is associated with an intelligence or cosmic reason, nous, also designated by spirit, related with the ideas or essences, which were considered to be the cause of the nous in the soul, by participation. The operativity of the nous was that of facilitating the access of the human intellect to the true knowledge.

Aristotle dissolved the nous in a dynamis (potency) and in an energeia (act), criticizing Plato about the theory of the eidos. For this author, the essences did not constitute a subsistent reality separate from the things, being just their formal cause.

By knowing the things we could also know their eidos. The true knowledge or episteme could be found in them as an energeia or principle of being (esse, einai), by opposition to dynamis, which Aristotle considered as a principle of determinable indetermination, that is, the dynamis could only be determined by the energeia that, in itself, was the determination.

The scholastics interpreted the energeia as a synonym of force (vis), that is, a causal principle of action, belonging to the category of quality. The medieval philosophers called it impetus, Descartes called it quantity of motion, Leibniz and Newton called it living force (vis viva).

While disagreeing between each other as to the locality of the universal essences, both Plato and Aristotle agreed, however, that the scientific discourse could only be considered to be true, when it satisfied criteria of universality and of necessity. Thus, it was imposed as a demand of any scientific inquiry, to find the fundament of any statement of truth or of falsehood, in that which were the first causes and the first principles of the things.

In this way, for there to be discursive rigor, the matter and its forms or essences should be analyzed separately as things that are different from each other, both in terms of their identity as well as of the value attributed to them. The epistemic analysis of one should not interpretatively contaminate the analysis of the others.

The object of the knowledge, its form or essence, meaning and definition obeyed, for Plato as well as for Aristotle, rigorous criteria of adequability and agreement that allowed a general semantics of deductive nature (Note 5) that prolonged itself in Euclid’s geometry and in Archimedes’ statics, both based upon a system of axioms that constituted self-evident truths, and upon a system of theorems derived from those same axioms.

The semantic determinism of the Greek scientific discourse generalized itself as an epistemic model until Epicurus, distancing himself from the primitive and deterministic atomism of Democritus (Note 6), introduced the notion of clinamen (declination) as a capability that the atoms have of spontaneously deviating themselves from their trajectories. Thus, Epicurus introduced, in the scientific discourse, an element of irreducible unpredictability, absent until then, with epistemological consequences about that which was considered as criterion of truth.


Note 1 – The distinction between true knowledge or episteme and knowledge of opinion or doxa, the latter considered as inferior knowledge, dates back to Xenophanes. In the poem of Parmenides the sensation or aisthesis regards the appearance or opinion (doxa). The sensation has to do with the perception of the senses (aistheta) that Parmenides excluded from that which he considered as truthful knowledge, which belongs only to the domain of the being (on).

Note 2 – Matter, with origin in the Latin materia, corresponds to the Greek terms: arché (principle); stoichein (element); chora (receptacle); and hyle (materia prima). To explain the nature (physis) and the change of the things, the first Greek physicists or cosmologists conjectured that all things came from one single or several principles (archai), these being considered as substantial beings that existed in themselves and by themselves, such as: the water, the air, the energy or the atoms. Aristotle was the first Greek philosopher to use the term hyle (materia prima) (Meta Z 3, 1029a) as subject of substantial or accidental change, that is, hyle as the first substratum of each thing, or immanent principle, from which some thing comes to the being (Physics, I, 7. 129a; 190b – 192b). Hyle is the indeterminate and the potential principle of the being, opposed to the form, this last considered as actual principle of the being. The term materia prima or hyle was introduced by the scholastics, who added the term materia secunda or deutera hyle. The concept materia prima corresponded to the concept of potential matter (in fieri), and, the concept of materia secunda, to the concept of actual matter (in actu).

Note 3 – The form can be designated by the Greek terms eidos (conceptual form, species) and morphe (real material form).

Note 4 – The notion of eidos, translated by idea, became one of the fundamental concepts of all the Western thinking. Originally, the eidos was that which was seen, the aspect, the appearance or the form, normally designating the form of the bodies. In the time of Herodotus, the eidos and its cognate the idea, which meanwhile had started to become of common use in the Greek society, assumed the meaning of characteristic property or type. Frequently, the eidos/idea appeared as a technical term, linked to the notion of potency (dynamis). The eidos/idea can be considered as a linguistic sign, whose semantic richness guarantees its applicability to a certain variety of contexts without that constituting a loss in its original connotative or denotative value, that is, the identity of its primitive nature as linguistic sign is not lost, despite its adaptive growth. Plato recognized its richness and plurality of senses and vastly applied it in different contexts. More important, however, is that sense which is projected by its metaphysically substantivated application as subsistent essence in an incorporeal, intelligible and incorruptible world. This notion, that constituted the core of the theory of the ideas, has its roots in the Socratic eidos, as subsistent essence of certain modes of ethical behavior. Taken as a whole, according to Plato, the ideas constituted the archetypes or models in imitation of which all the things had been made, being, because of that, only them (the ideas), the guarantee of all the scientific knowledge (Sophist, 246b; Phaedo, 99e; Parmenides, 132d; Timaeus, 52a). The terms eidos and idea were used by Aristotle with the same meaning that Plato attributed to them (Meta., I, 6, 9).

Note 5 – Aristotle created a theory of demonstrative judgments which he called syllogism, from the Greek sullogizomai which means to bring together. In formal logic, syllogism means the external signal of the deductive reasoning. In its traditional (Aristotelian) form, it is an inference in which a conclusion is inferred from two premises with a common term and through the elimination of that same term. The inference is always made by means of deductive reasonings, from the Latin deducere which means to extract or to diminish. The deduction is a reasoning through which, from one or more considered propositions (antecedents), one necessarily concludes an unknown proposition (consequent). The notion of deduction is in conformity and agrees with the Theory of the Syllogism of Aristotle. The syllogism is, in terms of its structure, considered the most complete and robust expression of the deductive reasoning.

Note 6 – The primitive atomism was attributed to Leucippus and Democritus. Leucippus has remained as a vague and obscure figure, more being known about Democritus and his writings. From the little information that has been gathered, it is conjectured that Democritus must have developed the fundaments of the primitive atomism from Leucippus; these fundaments agreed with the Parmenidean principle that all things had their explicative principle in an immutable and imperishable order, but disagreed with Parmenides in stating that that same order had a material nature, constituted by solid corpuscles that alternately and eternally collided with and repelled each other, in accordance with a regular and determined mechanics, in an unlimited space. These small particles or atomoi were considered the smallest particles of matter, solid, hard and indestructible, different from each other in size and shape and in their relative topological positions, in terms of motion and distance.

Monday, September 29, 2008

On Risk and Responsibility


By Maria Odete Madeira


Responsibility, from the Latin respondere, is a name for a systemic capability that any autonomous human agent has to account for his/her actions and respective effects, accepting the consequences of these actions (to be accountable).

Because of this capability, the human agents have the right and the duty to account (respondere) for their actions.


“(…) Today the guiding hand of natural selection is unmistakably human with potentially Earth-shaking consequences.

The fossil record and contemporary field studies suggest that the average rate of extinction over the past hundred million years has hovered at several species per year. Today the extinction rate surpasses 3,000 species per year and is accelerating rapidly; it may soon reach the tens of thousands. In contrast, new species are appearing at a rate of less than one per year (…) The broad path for biological evolution is now set for the next several million years. And in this sense the extinction crisis – the race to save the composition, structure, and organization of biodiversity as it exists today – is over, and we have lost.”

Stephen M. Meyer, 2006, The end of the wild, pp.3-5.


The question about the responsibility of the human agents is now stretched to the limit point from which we must ask: what fundamental value do we assign to the planet?

In a risk assessment, where the agents involved have an interest in a support to a certain action, there is the inevitable contamination of that assessment with an undervaluation of the scenarios considered to be unfavorable to the intended course of action. This includes, not only, biases in the assumptions and in the processes of risk quantification, but, also, biases in the interpretations of the results.

Many of the technologies that we are now beginning to produce may open up the risk of situations with catastrophic consequences to the planet.

Some of the risk scenarios associated with these situations are known, and worked upon by the sciences involved and by risk science.

It is necessary, within a scientific approach to risk, to consider risk as a fundamental ontological operator linked to the mechanism of life and death. Generally, the risk assessment is made from the perspective of superstructures that transcend the ontological planes of immanence.

Any systemic cognitive synthesis includes an evaluation of risk and it may be more, or less, accurate in accordance with the functioning of the systemic homeostatic mechanisms.

The question that is raised about the human assessment of risk is that this type of evaluation of risk is being done in planes that ontologically transcend the systems and problems themselves; call these, for instance, planes of transcendence.

In these planes of transcendence, we have what we call the political, economic, military and scientific games, that introduce an ontological systemic bias in what regards the life and death of the systems. The evaluations of the risks are not being done about what may constitute a threat or an opportunity to the systems; what are being done are evaluations of the discourses and strategies of power, compromised with the economy, politics, the military and science.

The risk has become, in today's economies, a product aimed at the satisfaction of pleasure, which raises a question: what are the phenotypic effects synthesized and metabolized by a neurocognition of the risk? To what point are the homeostatic mechanisms, that include, for instance, background feelings, being organismically blocked?

What are the risk assessments that the human agents, as consumers of risk products, may produce? In what way are they being affected by the effects of a risk culture and a risk economy?

Furthermore, the economy itself is feeding upon the risk for its own development. Besides a risk consumption culture and economy, there are the technological and scientific risks, as well as the power dynamics of wars and of natural disaster aftermaths.

No less important is to consider the role, in the perception of risk, of the religions as power institutions supported by dogmatically organized discourses of faith, built from conventionalized truths and supported by active ideological mechanisms of convincing and vanquishing. To what point is it not the case that religions constitute a mechanism of alienation, effective and with structural effects, that blocks the cognitive processes linked to the risk perception?

Sunday, September 14, 2008

Intelligence and Reason


by Maria Odete Madeira

Intelligence has its etymological root in the Latin intelligentia, a term composed of intus, which means within, and legere, which means to choose, to elect.

Intelligence is a dispositional systemic capability to perceive, capture, select and process data and knowledge, from cognitive (co)dynamics, linked to dynamics of interpretation and comprehension, genetically incorporated in all the living systems.

Reason has its etymological origin in the Latin Ratio, meaning Calculation. Rationem ducere means computare, to calculate.

All the living systems actively depend upon the cognitive processes.


All the living systems are producers and consumers of knowledge.

In any cognitive processing, dynamics of Intelligence and dynamics of Reason intervene. Intelligence and Reason are not dynamics exclusive to a given living system, but of all the living systems.

Wednesday, August 27, 2008

on homogenization and heterogenization

by Maria Odete Madeira

While the homogenization, understood as a dynamic process, is intentionally directed towards capturing, in the systems, patterns that exhibit similarities between each other, towards a systemic identity, the heterogenization, on the other hand, positions itself, with respect to the homogenization, as an energetic antagonism, intentionally directed towards capturing, in the systems, patterns that are different and divergent between each other.

The energetic antagonism implies an undefined enchaining of contradictories, that, in the systems, can potentiate mechanisms of autopoiesis capable of producing the so-called systemic fracturing evolutionary jumps, from which lines of fugue can be traced as a threat or opportunity to the systems in what regards their growth and development.

Monday, August 25, 2008

Hegel - to think

by Maria Odete Madeira

To Hegel, the territory of science is thinking; to think is to integrate the existence in a plane of non-reductionist universality, that is, the universality must be considered in a perspective of a systemic totality in relation with the respective parts that constitute it.

Hegel designates by element or ether the constitutive horizon of science, characterizing that element or ether by its translucid and simple character, that allows one to signal it as a standpoint (Standpunkt), that is, as a taking of position that preoccupies itself with the reality from its fundament.

The ether constitutes, for Hegel, a universal principle of possibility of all the experience.

Sunday, August 10, 2008

Color - light and gravity

by Maria Odete Madeira


The human brain reconstructs the color from the luminous energies reflected by the diverse colored surfaces. To that end, neural areas specialized in the perception of the colors are mobilized, as well as areas of the cerebral cortex specialized in the analysis of the forms, of the disposition in the space and of the motion (Changeux, 1994).

In the presence of the objects, the eyes capture the luminous radiations that these emit, converting them into electrical impulses that propagate to the cerebral cortex, the place where an internal representation of the color is built.

The human eyes are sensitive to the electromagnetic radiations in a band called the visible spectrum, in which are localized seven visible colors that distinguish themselves by their respective wavelengths.

The perception of the light takes place through the incidence of the luminous rays that penetrate to the retina, which converts them into electrical signals, subsequently transmitted, through the optic nerve, to the brain, which interprets them, configuring an internal image that corresponds to the external image responsible for the emission of light.

The world, such as we see it, is filled with colored forms. Our physical perception of the objects involves the perception of the color. This perception is a dynamic act, that passes through the apprehension of external variables that implicate the spatialization of the color, as actualization of this same color, in a real and communicational time that relates the observed color with the observer.

We can, therefore, speak of an overcoding, associated with the color. If, on the one hand, in the spatial unfolding, the color unfolds, reflexively, the senses implicated in it, on the other hand, it folds the memories and experiences, subjectively triggered in the observer, as observer that is affected by the interpellation of the color.

In this sense, the color affects and is affected, in a reflexive nanodynamics expanded to proto-conscious levels that mobilize sets of neurons that enter in resonance with the overcoded senses, in the act of subject/observer and color/observed, intercrossing short or medium term memories, with those of long term, all of them interpellating coded and codifier living experiences.

The experience of the color is, also, an experience of light and gravity. Affections that fold and unfold, explicate and implicate, intercrossing perceptions and projections of light and gravity.

In a primordial dialogue of convincing and vanquishing, shaded and alchemical blacks are dramatically crossed by solid, material, measurable and quantifiable lights. Tensional spaces, geometrized by light and shadow are, thus, configured.

Black and crepuscular inquietudes alternate with swift, intense and extenuated whites, reflecting hybrid and undecided harmonies, chiseled of night and day, light and gravity.

White theories, objective and explicative, twist and turn through dissymmetric, bifurcating, faithful to Plato’s cave, molecular black flows, in the eternal and eternized struggle between white and black, light and gravity.

Between the cave and the sun, intersections of straight lines, planes, volumes, intervals, regions, translucid and empty white intuitions, shadows, secrets and senses triangulate themselves.

Limits that intercross, agitate, hesitate and ramify, in dissipating, metallurgic, prosthetic and projective webs.

Between the white and the black, between the light and the gravity, are the intersection, the shadow, the remain, the trace, the debris, the sense as pre-figurative energies.

We have disconnected ourselves, little by little, from the space of the land in which we live. Little by little, under our eyes, disappeared the space of the sunlight, of the agriculture, of the sacred, of the war, of the states, of the written page that geometry expressed in its intimidating purity (borrowing Serres, 1993).

Since then, a bundle of bodies, messages, knowings and light circulate. In a global land, a new communicational space installed itself in multiple and interlinked networks. A space of mixture, a shaded space.

Monday, August 4, 2008

Referent and Reference

by Maria Odete Madeira

Any existent can, in any language, be considered a referent, whose existence can be linguistically built in a statement external to it.

Any referential expression is directed towards one or more referents, with which the respective sense(s) and meaning(s) are related.

We are before an act of reference, whenever one states an expression that identifies or names an existent.

In a semiotic language, any sign can be approached as a referent or as a reference, depending upon the position and systemic dynamics of each entity/sign in the statement.

In a situation or process of self-reference, a rotative coincidence is realized, in which the system folds upon itself, in itself. Whenever this occurs, we are before a dynamics of autopoiesis.

Saturday, July 26, 2008

On Phenomenology - Discipline and Method

by Maria Odete Madeira

Phenomenology is a branch of philosophy, applied to the study of the phenomena.

Phenomenon (phainomenon) is that which reveals itself, by itself and in itself. Thus, all existing things can be considered to be phenomena.

Phenomenology is not a method, it is a disciplinary area constituted by a theoretical body, available as fundament for the construction of methods and methodologies aimed at the study of any existing reality that, genetically, by itself and in itself, reveals itself as position rooted in the world.

Criticizing the positivism as a mere construction of facts, Husserl (1907) held that the so-called crisis of the sciences had to do with a reduction of humans to constructed facts. In the positivistic model, humans, as such, were alienated in the human facticity, derooted from that which constituted them as beings that are intentionally inscribed in the world, from which they received the sense of themselves and of the things with which they related.

For Husserl, the world was the originary datum, that was turned towards humans and towards which they, themselves, turned self-reflexively, in order to determine and distinguish the sense of the things.

Protesting against the reduction of human reason to mere exercises of calculation, Husserl defended phenomenology as the science of the phenomena. In Husserl, the phenomenon is an eidos that reveals itself in an intuition that is defined by an intentionality that assigns a plenitude of presence to the eidos (phenomenon) that is aimed at. Each eidos is accessible, only, to a certain type of intuition: the intuition that refers and identifies it in an immediate way and that, because of that, captures it in all its totality.

Intuition has its origin in the Latin terms tueri (to see) and in (in, within) that conjointly mean the action of seeing directly within the things, thus, signaling a mode of immediate knowledge of an existing object that, because it exists, shows itself, as such, to the consciousness that aims at it, and, because it shows itself, it also demonstrates itself in its modes of existence, so that it can be described from its fundament (eidos).

The experience, in Husserl, assumes the sense of lived world (Lebenswelt), as that which is in the origin of the knowledge and that is before any reflexive activities. That is, the objects present themselves to the consciousness, giving themselves in their completeness (eidos) in order to be, only after, and through a conscious act of radical reflexivity, on the part of the subject, actively apprehended by the consciousness that intentionally signaled them as existents in themselves, and by themselves, and, in this way, as autonomies.

The noumenon, that was impossible to be categorized in Kant, appears, in Husserl, as that which is immediately intuited. Thus, Husserl turned away from the criteria that transcended the experience, to be able to, in the same experience (Lebenswelt) and in a direct way, capture that which, in the phenomena, was their noumenic sense.

In Heidegger, the notion of experience lost the theoretical sense present in Husserl, to mix itself with the notion of existence. The author considered that, originarily, the things do not “make their apparition” and do not “appear to be” to the humans as phenomena or objects of thought, but, instead, as entities that (co)exist in a complex system of references that constitutes the world where each human exists, understands and interprets himself as being-the-there (dasein), projectively launched to a future that he existentially anticipates.

This perspective, of a gnoseology, as activity rooted in the worldly experience, was, also, developed by Merleau-Ponty (1945), who considered that all the universe of science is constituted upon the lived world. The notions of subjectivity and objectivity are synthesized in the notion of the world, lived and understood by subjects incarnated in it. The phenomenological perception is, thus, considered as an originary pre-reflexive experience of each subject with an own body (corps propre), that body, understood, as a node of living significations that is incorporated by an operating intentionality that links it to the world of life.

In this way, the phenomenological perception constitutes itself as the ground that is previous to all reflexive activity. Assuming that knowledge is, always, the apprehension of an ontologically constituted structure, an ontologically constituted structure that, as such, is given to any originary human consciousness, (a consciousness) which is fully rooted in a world that is previous to it, exterior to it and autonomous.

Monday, July 21, 2008

On Individuation

by Maria Odete Madeira

Individuation is a philosophical term that designates the intrinsic, constitutive, genetic and autopoietic event of separation of each individual within a same species.

It is a systemic principle of separation that operates from the first causes and the first principles that were involved in the genesis of each individual, be these principles entities, situations, processes or events.

In methodological terms, it constitutes an ontological principle used as a classificative criterion of identity.

On Penrose's Argument Against Density Operators

by Carlos Pedro Gonçalves

What is a quantum state?

Should we speak of a quantum state at all?

Should we speak of quantum states or of quantum processes?

These questions can be raised from the work of Baugh, Finkelstein and Galiautdinov (http://arxiv.org/abs/hep-th/0206036) and from the results obtained by Gonçalves and Madeira (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1438013), about the connection between a stationary quantum state and the consistent histories formalism, these results being obtained from the relational structure of the different bases in which a stationary quantum state can be expanded.

A different, but related, problem arises from Penrose’s Road to Reality (Penrose, 2004), where the author questions the mathematical structure that should be used to formalize what is usually called a quantum state.

Thinking about these two threads, one is led to the following question:


What is the most fundamental mathematical structure that should be used to describe the quantum system, and, what is the nature of the physical semantics that this structure formalizes?


This is the main question to which we shall return, recurrently, during this article.

Looking at Penrose's (2004) work, and regarding the first part of the question, we find that Penrose (2004) raised the problem of formalizing the quantum state by:

A) The density operator, whose zero-entropy space (using von Neumann’s notion of entropy) is composed of the density operators for the so-called "pure states".

B) The normalized kets (for the "pure states")

It is clear that the density operator is more general, since it can be used for statistical mixtures, but is it more fundamental than the ket for pure states? And, should we call these states at all?

The notion of a zero entropy density operator is effectively equivalent to a projective notion of a (pure) quantum state, as Penrose noticed. Therefore, one might take the position that such density operators appropriately describe a “physical quantum state”, taking the perspective that only that which has an impact on measurement problems can be considered to be physical.
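
As a quick check of that zero-entropy characterization, here is a small NumPy sketch (the states chosen are arbitrary illustrative examples) computing the von Neumann entropy S = -Tr(rho log rho) for a pure-state density operator and for a statistical mixture:

    import numpy as np

    def von_neumann_entropy(rho):
        # S = -Tr(rho log rho), computed from the eigenvalues, with 0 log 0 = 0.
        eigvals = np.linalg.eigvalsh(rho)
        eigvals = eigvals[eigvals > 1e-12]
        return float(-np.sum(eigvals * np.log(eigvals)))

    ket0 = np.array([1.0, 0.0])
    ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)

    rho_pure = np.outer(ket_plus, ket_plus)                    # a "pure state" |+><+|
    rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * rho_pure    # a statistical mixture

    print(von_neumann_entropy(rho_pure))    # ~0: zero entropy <=> pure state
    print(von_neumann_entropy(rho_mixed))   # > 0: mixtures have positive entropy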

About this, however, Penrose (2004, p.796) argues that:

“(…) I feel uncomfortable about regarding such a ‘pure-state density matrix’ as the appropriate mathematical representation of a ‘physical state’. The phase factor (…) is only ‘unobservable’ if the state under consideration represents the entire object of interest. When considering some state as part of a larger system, it is important to keep track of these phases (…)”

Penrose’s main issue is related to the superposition principle. As Penrose puts it, the basic quantum linearity is obscured in the density operator description. Indeed, one of Penrose’s objections against the density operator is that it complicates the simpler linearity of the ket formalism.

So far, Penrose’s arguments apply equally to the density operator and to the density matrix. In pages 797 to 800 of Road to Reality, however, Penrose proceeds by discussing what he considers to be the “confused ontological status of the density matrix”; in this case, the argument centers on the matrix and not on the operator. Indeed, some of the statements used as counter-arguments apply correctly to the density matrix but not to the operator.

The major argument refers to the inability of the density matrix to distinguish between different kinds of entangled pairs. For instance, consider the following scheme/example:

[the two example preparations were shown here as images in the original post]

Even though we have two different kinds of entangled pairs, the density matrix is the same, that is, the density matrix does not seem to distinguish the bases.

However, this is not the case if we take into account the density operator. The density operators are not only different, but if we determine the projection, for instance, of the second density operator onto the basis in which the first is represented, we obtain:

[the resulting expression was displayed here as an image in the original post]

Indeed, the second operator is a statistical mixture of two pure states that are superpositions of |0> and |1> (namely |+> and |->).

What the above results show is that the projections of the second density operator onto the basis {|0>, |1>} and onto the basis {|+>, |->} differ with respect to the probabilities assigned to the quantum events formalized by these projections.
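
For concreteness, one possible pair of preparations of the kind discussed here (an illustrative reconstruction, not necessarily the example shown in the images of the original post): with |+> = (|0> + |1>)/sqrt(2) and |-> = (|0> - |1>)/sqrt(2), take

    rho_1 = (1/2)|00><00| + (1/2)|11><11|
    rho_2 = (1/2)|++><++| + (1/2)|--><--|

Written each in its own basis, the two operators have the same numerical matrix, diag(1/2, 0, 0, 1/2). Projecting rho_2 onto the basis {|00>, |01>, |10>, |11>}, however, gives <00|rho_2|00> = <01|rho_2|01> = <10|rho_2|10> = <11|rho_2|11> = 1/4, so the probabilities it assigns to the computational-basis events differ from those assigned by rho_1 (1/2, 0, 0, 1/2).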

We can use a density matrix for reading probabilities; however, one must never confuse the matrix with the operator, and one must always use the operator for the fundamental description.

The problems of ontological confusion, raised by Penrose, can be raised with respect to the density matrix but not with respect to the density operator.

This stresses the importance of precision of language, and the issue of the generalized practice of calling the density operator a density matrix (a practice followed by Penrose, and which Feynman called attention to in his Lectures on Physics, as a practical but mathematically imprecise simplification). This must be considered as a simplification of language, as Feynman stressed, but, nonetheless, it is a mathematical imprecision in the usage of the terminology, and, when dealing with fundamental arguments, one must take into account the distinction between the density matrix and the density operator.
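
To make the matrix/operator distinction concrete, here is a small NumPy sketch (the operator chosen is an arbitrary illustrative example, not taken from Penrose): the same operator, re-expressed in a rotated basis, has different matrix entries, while operator-level quantities such as the trace and the eigenvalues are unchanged.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # basis change {|0>,|1>} -> {|+>,|->}

    rho = np.diag([0.75, 0.25])                    # one density operator...
    rho_rotated = H @ rho @ H.conj().T             # ...with its matrix in the rotated basis

    print(rho)             # entries in the {|0>,|1>} basis
    print(rho_rotated)     # different entries in the {|+>,|->} basis
    print(np.trace(rho), np.trace(rho_rotated))                        # equal traces
    print(np.linalg.eigvalsh(rho), np.linalg.eigvalsh(rho_rotated))    # equal spectra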

However, Penrose’s argument about the phase seems to stand for both operator and matrix, quoting Penrose (2004, p.803):

“Under normal circumstances, moreover, one must regard the density matrix as some kind of approximation to the whole quantum truth. For there is no general principle providing an absolute bar to extracting detailed information from the environment. Maybe a future technology could provide means whereby quantum phase relations can be monitored in detail, under circumstances where present-day technology would simply ‘give up’. It would seem that the resort to a density-matrix description is a technology-dependent prescription! With better technology, the state-vector could be maintained for longer, and the resort to a density matrix put off until things get really hopelessly messy! It would seem to be a strange view of physical reality to regard it to be ‘really’ described by a density matrix (…)”

Although these arguments may seem compelling, one may raise a question regarding the statement on the approximation to the whole quantum truth; the question is: what about the so-called 'impure states'?

As Penrose notices, one cannot discard that, at the quantum level, detailed phase relations may get “lost”, because of some deep overriding basic principle. It is still too soon to discard such a hypothesis, and this, indeed, may be likely, if one considers a foamy Planck scale space-time (quantum foam) (Penrose, 2004).

Furthermore, there is still a division in the community in what regards the information loss in black holes. Even if many believe, including, more recently, Hawking (http://arxiv.org/abs/hep-th/0507171), that information may not be lost, we cannot yet reject this possibility.

It seems that accepting Penrose's argument leads to the position that, if we wish to use a fundamental mathematical description of physical reality, we must use two different formalisms, a ket for the pure states and a density operator for all the other cases, and we cannot discard the need for the usage of the density operator.

Thus, regarding the first part of our main question (what is the most fundamental mathematical structure that should be used to describe the quantum system?), the arguments seem to point towards using the density operator only when necessary, as a technological tool.

But is this sustainable? Do the phases matter?

The answers to these questions cannot be entirely solved by appealing to mathematics alone.

Indeed, a mathematician might be divided between: (a) a choice where one would work with what can be argued to be a more fundamental structure with respect to the information conserved in the description (the phase information), but two formalisms would be used for two different situations (pure vs impure states); (b) a choice where one works with a single formalism but part of the information (the phase) is lost.

Since we are dealing with physics, all that matters is whether or not the phase is physically relevant, or, even, whether or not the density operator expresses, formally, the most fundamental physical nature, the normalized ket just being a useful representation, that can be shown to be equivalent to the density operator up to a global phase factor.
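
A minimal check of that last statement, with illustrative values: multiplying a normalized ket by a global phase leaves the density operator |psi><psi| unchanged.

    import numpy as np

    psi = np.array([0.6, 0.8j])                  # a normalized ket: |a|^2 + |b|^2 = 1
    rho = np.outer(psi, psi.conj())              # density operator |psi><psi|

    phase = np.exp(1j * 1.234)                   # an arbitrary global phase factor
    rho_phased = np.outer(phase * psi, (phase * psi).conj())

    print(np.allclose(rho, rho_phased))          # True: the global phase drops out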

In effect, so far, all that we can get from the system is the information contained in the density operator. The question of whether or not there might be some technology to recover the phase from a measurement, is still open to discussion.


One may argue that, physically, the phase is irrelevant; one may alternatively argue that the phase is not physically irrelevant. However, to do the latter would demand the mathematical formulation of what might constitute a measurement procedure for the phase, leading inevitably to the problem of the physical meaning and measurability of a complex number.

If one chooses to spend some time with this issue, one is led to this bifurcation of perspectives, where the choice depends less on mathematics and more on physics, in particular, our main question «what is the most fundamental mathematical structure that should be used to describe the quantum system, and what is the nature of the physical semantics that this structure formalizes?» should be considered as a whole, since one cannot really consider the formalism, independently from the object of intentionality of the formalization (that which the formalization is about and that justifies the development of the formalization itself). What is fundamental for the mathematical structures of the formalism, may not be so for the object of formalization.

In the end, the interpretation of quantum mechanics that one follows may decide the choice between the two paths, if one wishes to make such a choice at all, or if, and until, a fundamental thinking about physics demands such a choice.

An interpretation of quantum mechanics that thinks about the nature of quantum processes, inevitably restricts our choices about what is fundamental due to the ontological and epistemological commitments that we assume, along the way of the construction of a scientifically grounded interpretation.

In the interpretations that assign a physical nature to the wave function as corresponding to a pilot wave, the phases are relevant, even if they cannot be measured, since the fundamental object of formalization is that pilot wave.

For a follower of Bohr, on the other hand, the whole discussion would be pointless, since the quantum formalism is just a useful tool used to predict results of experiments, whether we use a ket, a wave function or a density operator is irrelevant.


Furthermore, Bohr was “suspicious” of complex numbers, these could be useful tools, but, in the end, all that mattered were the predictions, and if a phase is unobservable by current technology it is a waste of time to think about it or to assign it a physical significance.


In the Aristotle-based realist interpretation, followed by Heisenberg, the density operator should be taken as the formalization of the fundamental physical structure, since what the formalism “formalizes” is the tendency of a potential alternative to be actualized; this intensity of the dynamis is quantifiable in terms of a degree, a degree with which probabilities coincide numerically, when these probabilities are interpreted as being proportional to the physical propensity of the potential alternative to be actualized, which is nothing but the intensity of the dynamis associated with that alternative.

Taking this into account, the diagonal terms of the density operator are the fundamental structure, since they are in direct correspondence with the object of formalization of the theory, i.e., they formalize the most fundamental physical structure, and their interpretation is naturally processual, a processual nature that is obscured by the ket representation.

A closely related mathematical argument can be found in Bohm, Davies and Hiley's paper Algebraic Quantum Mechanics and Pregeometry (http://arxiv.org/abs/quant-ph/0612002), where the authors built quantum theory from the primitive idempotents that are directly related to the different entries of the density operator. Bohm et al. show that the ket notation hides the fact that each ket represents an object with two labels.

Thus, in the end, one’s solution to the phase problem and the answer to the central question placed here, depends on one’s choice of interpretation of quantum mechanics.

Saturday, July 19, 2008

On Situation

By Maria Odete Madeira

The term situation comes from the Latin situs that means relational position or relational disposition of some thing.

The notion of situation has been largely worked upon by philosophy. Besides the scholastic authors, other authors such as Jaspers, Sartre, Merleau-Ponty, Kierkegaard and Heidegger were some of the philosophers that developed philosophical criteria called situationists, in which the situation is thought upon as a concrete and objective reality that underlies all existing things.

Any existent is considered an existent-in-situation, because any existent is one among other contingent existents, situationally launched in the world.

The term world is genetically related to the term nature, that comes from the Latin natura (gnatura, natus, gnatus, nasci), which means to be born. This term has its Greek equivalent in ousia, which means to produce, to give origin to (fazer nascer, faire naître).

Both terms (natura and ousia) are equivalent to the Greek term gignomai, which means to come to be.

Natura and ousia are both related to the Greek term genesis (birth). All that exists, exists as presence and existence (natura, ousia, dasein).

Friday, July 18, 2008

The Production of Judgments in Kant

by Maria Odete Madeira

For Kant, the production of judgments consists in thinking the diverse of the empirical experience as if it was contained in a universal.

If the universal is previously given, as a rule, principle or law, indicating, a priori, the conditions in which the empirical diverse can be subsumed, then the judgment is said to be determinant and it is objective. However, if only the empirical diverse is given, and the faculty of judgment (existent in all subjects capable of thought and reason) has to find the universal for that empirical diverse, then the judgment is said to be reflexive, indeterminate and subjective.

For Kant, to judge is, always, an exercise of subsuming the empirical diverse in unifying rules. Be it a subjective or an objective judgment, that judgment has, always, its foundation and its condition of possibility in general rules of unity of synthesis of the diverse of the experience. These rules are considered, by Kant, rules a priori, without which no possible experience could be thought or known.

Thus, in Kant, every general unity of synthesis has its condition of possibility in a reflexive capability, dispositionally existent in all the subjects capable of thought and reason.

Wednesday, July 16, 2008

On Transdisciplinarity

by Maria Odete Madeira

No researcher can rigorously identify the border that separates the interdisciplinary from the transdisciplinary work. In question are matters such as individual and collective experiences, processual and methodological convergences and integration of the disciplinary knowledge.

The plurality of the senses and meanings, synthesized by each concept, triggers self-referent systemic lines of fugue that can oscillate between the interdisciplinarity and the transdisciplinarity.

Thus, any transdisciplinary exercise is conditioned by a structuring systemic uncertainty, in regards to the interpretation, comprehension and verbalization of the signs and respective meanings that emerge from the borders of each system involved in the processes.

The great challenge that is placed to the transdisciplinary work is, precisely, the development of techniques and technologies that allow the researchers to capture, in the systems, all the available information about the identity of each of these systems, as well as to capture the respective autopoietic processes involved in the systems’ development, evolution and growth.

Wednesday, July 9, 2008

on observation and experience

During the Renaissance, the impulse towards observation trajected and projected new criteria based upon the notion of experience, from the Latin experientia, that signals a sense of proof through which some thing is acquired or learned.

The observation and experience, during this same period, became the basic elements of a new and compulsive rationality, that tried to apprehend quickly the laws of nature, in order to apply them to fundamental methodologies.

The notion of experience became a fundamental notion that oriented itself by criteria of true/objective knowledge, conditioned by two proposals: a notion of experience linked to empiricism, and a notion of experience linked to critical rationalism.

Linked to empiricism, experience is defined as individual living experience/action, accumulation of information and evidence of the immediate/qualitative observation.

Linked to critical rationalism, experience is approached from criteria of quantitative/repeated qualitative, compared, transmissible with fundamentation and pluripersonal observation. The accumulation of information does not constitute evidence/certainty and the individual experience is considered only in terms of specific information.

Tuesday, July 1, 2008

On the Transcendental - Aristotle, Aquinas, Kant and Husserl

by Maria Odete Madeira

The notion of transcendental assumed, in Aristotle, a metaphysical sense of characterization of attributes such as the unity, the true and the good.

Thomas Aquinas assumed Aristotle’s transcendental attributes and added two others: res and aliquid.

Kant introduced a new sense to the notion of transcendental, relating it to the Copernican Revolution. This notion, which until Kant was taken in a "formal-logical" sense that grounded the things in themselves, came, after Kant, to constitute the a priori condition of the possibility for something to constitute itself as a phenomenon and object for any subject capable of thought and knowledge. In this way, concepts or categories assumed the value of transcendentals and, because of that, existed, according to Kant, a priori (dispositionally) in each human subject.

The transcendental no longer operated at the fundament of the things in themselves, but only at the fundament of the phenomena and respective objects of knowledge for any subject capable of knowing.

Husserl maintained the Kantian sense of the term, but radicalized it by making the reduction to subjectivity (epoché) the ultimate fundament of the sense and validity of experience.

Friday, June 27, 2008

Why something and not nothing? Why the entity and not nothing?

by Maria Odete Madeira

"Why something and not nothing?” (Leibniz) Nihil est sine ratione; “Why the entity and not nothing?” (Heidegger).

The matter of the foundation of the Being is a good, endless and disturbing metaphysical question that started with Parmenides and was lengthily developed by Heidegger.

Schelling placed the question of the foundation as a self-position of the Being unfolded in two positions: one as Absolute in itself and another as the other of itself, this last, reciprocally presupposed in a circular topology.

By placing itself as the foundation of itself, the Being placed itself in existence, as a difference from which all things come.

For Heidegger, each entity is one of the modes of the Being and, at the same time, its difference; each entity is the being-there (Dasein) referring to itself (ek-sistence).

The truth of the Being (aletheia) belongs to the Being itself; the Being, as physis, is aletheia. The Being gives itself (es gibt) as Being and Time and, thus, as something that has given itself, the Being is event (Ereignis), the logos that wants to be heard.

Thursday, June 26, 2008

Love (Hegel, Hölderlin, Unitrinity, Schelling...)

by Maria Odete Madeira

Hegel and Hölderlin introduced a new thinking about sensitivity, in opposition to the anguish and insecurity before alterity (natural, personal and social) that were at the origin of the Aufklärung.

This new thinking tried to capture and recuperate the pulsional dynamism and the affective dimension that constitute the human being.

The issue of love was developed as a reply to intrinsic problems and to an internal determinant dialectics that operated as a function of synthesis in the system, that is, as a middle term between the theoretical order and the practical domain.

Love was incorporated, in Hegel’s logical system, as fundament of the harmony of the “Spirit”, in its function of unification of reflexive and effective synthesis of the thought and the feeling.

Love configured, in Hegel, the dialectical motion of reason, as exposition, negation and return of reason to itself, surpassed. The loving dynamics adequately described the character of the “Absolute” as a fundament of itself, that is, the reconciled return to itself from its other (Vorlesungen über die Ästhetik, Theorie Werkausgabe).

A different thinking about the feeling of love seems to come from the notion of Unitrinity, which inscribes itself in the kenosis of the Son (Christ) that reveals God’s mystery as love, a gift that inscribes in space and time an ineffable exchange within the divinity itself. That is, love as a feeling towards the other: because God as unity does not exclude the other (the Son), the Son is already within the divinity itself as the object of intentionality for the realization of love, realizing what can be considered a Superunity (Pseudo-Dionysius the Areopagite).

In the unity, the feeling of love is already a communion with the other without negation, unlike Hegel, and closer to Schelling.

One cannot find, however, in any philosopher, a feeling without an intentionality, without an underlying reason for its being; in the case of Hegel, this reason is to be negated and surpassed.

In a neurobiological framework and in a philosophical framework, both feelings and emotions possess an unsurpassable intentionality: they are always about something (their object of intentionality) that relates the individual/agent/subject with his/her environment (culture, civilization, people, recollections, artifacts, etc.), possessing a fundamental adaptive value.

Monday, June 23, 2008

Epicurus - Clinamen

by Maria Odete Madeira

While disagreeing with regard to the topos of the universal essences, Plato and Aristotle were in agreement on a fundamental point: the scientific discourse (episteme) could only be considered true when that same discourse satisfied criteria of universality and of necessity. Thus, it was demanded of the inquiry that it find the foundation of any statement of truth or of falsehood in what were the first causes and the first principles of the things themselves.

In this way, in order to have discursive rigor, the matter and its forms or essences (eidos) should be analyzed separately as different things, both in terms of their identity as well as of the value that was assigned to them. The epistemic analysis of one should not contaminate, interpretatively, the epistemic analysis of the others.

The object of knowledge, that object's matter and its form or essence, as well as its meaning and definition, obeyed, for Plato and for Aristotle, rigorous criteria of appropriateness and agreement, that would allow a general deductive semantics. These criteria prolonged themselves in Euclid’s geometry and in the statics of Archimedes, both based upon a system of axioms that constituted self-evident truths, and upon a system of theorems derived from those same axioms.

The semantic determinism of the Greek scientific discourse was generalized as an epistemic model until Epicurus, distancing himself from the primitive atomism of Democritus, introduced the notion of clinamen (declination) as a capability, an arbitrium that the atoms have to spontaneously deviate from their trajectories. Thus, Epicurus incorporated, in the philosophical and scientific discourse, an irreducible element of unpredictability at the epistemic level, absent up until then, with ontological and epistemological consequences for that which was considered the criterion of truth.

Without putting into question the “general laws of nature” known at the time, the author located a present and permanent dispositional element of constitutive arbitrariness (arbitrium) in any process of formation of emergent structures, an element incorporated in the dispositional genetics of these same structures. This element has consequences at the level of systemic perception and cognitive processing/computation, and at the ontological level of the threat of destructuration and the opportunity of structuration present in any physical existent, which function as mechanisms of potential risk (linked to the mechanisms of life and death) and, thus, as undetermined, permanently displaced, within the structure itself.

Saturday, June 7, 2008

About the Life of Knowledge and the Knowledge of Life

by Maria Odete Madeira

Any production of knowledge is supported by an organizing activity that acts in accordance with rules whose objective is the resolution of problems, promoting the organism’s survival and adaptive fitness.

The discovery of the double helix, by Watson and Crick, allowed the application, to the notion of living organization, of the cybernetic scheme of a machine governed by an informational program, inscribed in the structure of the DNA molecules, that organizes and directs all the activities of the cells (Morin, 1986).

In this way, a living organization can be signaled and referred as a self-cognitive, self-organizing and self-replicating agent that is capable, through exchanges with the environment, of concentrating, in itself, the flows of order that feed it and sustain it as a spatio-temporally localized individuated structure.

The knowledge of the life of the systems introduces us to the life of the knowledge, itself. Being, doing and knowing are, thus, inseparable.

The development of communication networks between the different agents allowed the transformation of the natural flows and turbulences into the subjugated motricity that was at the origin of the human space of millions of years ago, a space that can be signaled as a bio-anthropological space aimed at the satisfaction of the biological needs of survival and of the immediate and practical interests, full of feelings and emotions, fantastic visions and terrors, but also full of techniques and precise calculations, syncretically linked to the objects.

In that historical time, the time of the myth, the interface with the environment was profoundly biological, accompanied by that which Damásio (1999) designates as core consciousness, characterized by a weak grasping ability and a weak reflexive operativity that did not allow the exercise of abstract thinking.

The type of thinking produced was profoundly linked to the aleatory motion of the natural forces, whose nature revealed itself as powerful, threatening and dramatic.

In this way, the production of judgments exhibited a perceptual and conceptual pattern that allowed the hominidian networks to signal, identify and classify the ecosystemic space as a fluid and fluctuating nature, determined by local dynamic, unstable, coevolutionary and organic rules, foundationally conditioned by locally emergent mechanisms of territorialization and deterritorialization, rhizomatically aleatory. This space was a producer of myths (meudh, mudh, myo, mytheo, mythos) and of rites, linked to the vital emotions and feelings of immediate survival, and a creator of visual, tactile, acoustic and olfactory action spaces, conceptually non-schematizable, but which interacted with the strategic calculus that allowed the development of techniques of working the stone, then the bone and, also, the metal, as well as the development of cognitive memories associated with the knowledge of the plants, of the animals and of the environment.

With the passage from mythological thinking to the so-called rational thinking, knowledge came to be explicitly referred to as kosmos, or order, and as logos, a term of Greek origin derived from the verb legein, which originally meant to gather, to enumerate or to choose.

In turn, the noun logos, which initially meant collection and (re-)collection of the multiple, came to mean discourse.

The kosmos and the logos, which substituted the myth as an attempt at interpretation and explanation of reality, in its complexity, corresponded to the development of the capability of the organismic human grasp, accompanied by the expansion of consciousness.

The core consciousness, thus, lost operative protagonism to the extended consciousness, which came, since then, to operate in the interface with the environment. In accordance with Damásio (1999), if core consciousness is the indispensable foundation of consciousness, extended consciousness is its glory.

Unlike core consciousness, extended consciousness allows one to work on temporally more expanded interaction surfaces, connectable to mechanisms of retention and protention.

Monday, May 5, 2008

Unity and System

by Maria Odete Madeira

The unity (unitas, atis) is a syntax that can be operationalized to capture, in the systems, that which, in them, is about their internal cohesion, consistency and coherence.

These internal systemic properties (cohesion, consistency and coherence), enactively produced by the system (Varela), signal, refer and identify the system as a being or entity, and, therefore, also an identity.

The system, considered in terms of its individuation, bearer of an identity, is, thus, signaled as a concrete spatio-temporally localized existent in permanent rotative coincidence with itself.

In this way, from an ontological approach, the unity can be thought of as a concrete relational existence, coexistent with that which, in the system, is multiplicity, division and dispersion.

If it is the case that, to form itself, the system needs the existence of a principle of unity enactively present in the system itself, it is also the case that the existence of that same principle of unity depends, constitutively, upon the capability of the relational dynamics of the system to produce and operationalize that principle of unity as a permanent organizing principle, dispositionally available in the system. This principle can be thought of as an ontological, logical and epistemological attribute that makes any judicative synthesis about the nature of the unity of the system dependent upon the relations that are coexistent and actively present in the system, since the unity itself of the system is grounded (urgrund) upon those relations.

In this way, the unity can be consistently considered, in the statements, as an organizing principle, systemically synthesized, monadologically complex, relational and in network.

Without relations there can be no systemic unity.

Sunday, April 27, 2008

On Mathematics

by Carlos Pedro Gonçalves


In formulating the question “what is mathematics?” it is important to notice that the intellectual matrix and structure that underlies our concept of mathematics is Greek. This should not be confused with a statement that mathematical activity started in Greece; it does mean, however, that our usage of the term mathematics and our conception of what it means to “do mathematics” have a Greek root.

Thus, in trying to answer the question “what is mathematics?” we should, perhaps, first look at the Greek root of the word mathematics: mathematike, which comes from manthanein (to learn, to study), meaning learning, study, science. It is also important to look at the expression mathematike tekhne, which places mathematical activity as science techne, translated into Latin as ars mathematica.

As a science techne, the mathematical activity tries to find the quantities, structures and patterns exemplified by the things, and builds an abstraction of these quantities, structures and patterns that allows the study of these quantities, structures and patterns in themselves, in their nature and generative mechanisms.

Monday, April 14, 2008

The Death of John Wheeler

by Carlos Pedro Gonçalves





“I like to say, when asked why I pursue science, that it is to satisfy my curiosity, that I am by nature a searcher, trying to understand. Now, in my eighties, I am still searching. Yet I know that the pursuit of science is more than the pursuit of understanding. It is driven by the creative urge, the urge to construct a vision, a map, a picture of the world that gives the world a little more beauty and coherence than it had before. Somewhere in the child that urge is born.”

John Wheeler, Geons, Black Holes and Quantum Foam, p.84.

The passing away of John Wheeler:

http://jayryablon.wordpress.com/2008/04/14/john-archibald-wheeler-rip/



The “it from bit”


The “it from bit” is a fundamental perspective about the foundations of quantum theory, and about the nature of the universe.

Given Wheeler’s choice of the Aristotelic realistic framework, the philosophically consistent interpretation of the “it from bit” means that an actualized reality (the it) comes from a question that nature asks itself, as a result of a particular kind of physical interaction in which a quantum system is “asked a question” about the pattern of exemplification of a physical property. Such a question is of such a nature that it begs a yes/no answer on the part of the system; hence, the it (actualized reality) comes from the bit.

A physicist would state, using the terminology proper to quantum theory, that a decoherence-inducing interaction produces a local diagonal density operator with respect to an expansion in some observable’s eigenbasis, an eigenbasis to which the “observing system” is physically sensitive. The “observing system” does not need, according to Wheeler, an elaborate level of reflexivity; something as simple as a piece of mica, for instance, does the job!

The term decoherence refers to the process that produces a local loss of interference terms of the system’s state. Through the decoherence process, the system’s environment interacts with the system in such a way that an entanglement is produced with respect to a certain observable’s eigenbasis.
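In standard textbook notation (a worked restatement for the reader, not taken from the original post), for a system prepared in a superposition a|0> + b|1> and an environment initially in a reference state |E_0>, the decoherence-inducing interaction and the resulting reduced density operator can be written as:

\[
(a\,|0\rangle + b\,|1\rangle)\otimes|E_0\rangle \;\to\; a\,|0\rangle|E_0\rangle + b\,|1\rangle|E_1\rangle \equiv |\Psi\rangle
\]
\[
\rho_S \;=\; \operatorname{Tr}_E\,|\Psi\rangle\langle\Psi| \;=\; |a|^2|0\rangle\langle 0| + |b|^2|1\rangle\langle 1| + a b^{*}\langle E_1|E_0\rangle\,|0\rangle\langle 1| + a^{*} b\,\langle E_0|E_1\rangle\,|1\rangle\langle 0|
\]

When the environment records the branch label so well that the overlap \(\langle E_0|E_1\rangle\) is approximately zero, the off-diagonal (interference) terms vanish and \(\rho_S\) becomes diagonal in that eigenbasis, which is the “local diagonal density operator” referred to above; the surviving diagonal weights \(|a|^2\) and \(|b|^2\) measure how strongly each branch tends to be actualized.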

Thus, for instance, if there is a system of particles that effectively acts as an obstacle able to localize the particle, the potential regions of localization correspond to a correlated potential state of the particle’s position and a potential state of the system of particles. The positioning pattern of interaction introduces a decoherence with respect to the position observable. Informationally, this can be considered to correspond to a yes/no question placed by the system of particles to the particle. Prosaically, the system of particles asks the particle, with respect to each potential region of localization: “Are you here?”.

A typical physical experiment of this is the double slit experiment.
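As a minimal numerical sketch of this “yes/no question” (not part of the original post; the amplitudes, the two-branch simplification and the CNOT-like coupling are illustrative assumptions), the two slits can be treated as two localization branches of a particle that becomes entangled with a which-path environment; tracing the environment out yields the diagonal reduced density operator discussed above:

import numpy as np

# Toy "which-slit" decoherence model (illustrative sketch, not the post's own code).
a, b = 1/np.sqrt(3), np.sqrt(2/3)            # amplitudes of the two potential branches
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Before the interaction: particle superposition x environment ready in |E0>
psi_before = np.kron(a*ket0 + b*ket1, ket0)

# Decoherence-inducing interaction: the environment copies the branch label
# ("Are you here?"), producing the entangled state a|0>|E0> + b|1>|E1>
psi_after = a*np.kron(ket0, ket0) + b*np.kron(ket1, ket1)

def reduced_density(psi, dim_sys=2, dim_env=2):
    """Partial trace over the environment of the pure state |psi><psi|."""
    rho = np.outer(psi, psi.conj()).reshape(dim_sys, dim_env, dim_sys, dim_env)
    return np.trace(rho, axis1=1, axis2=3)

print(np.round(reduced_density(psi_before), 3))   # off-diagonal interference terms present
print(np.round(reduced_density(psi_after), 3))    # diagonal: |a|^2 and |b|^2 remain

In this sketch the diagonal entries that survive the partial trace (here 1/3 and 2/3) are the weights with which each branch tends to be actualized, which is one way to read, in the standard formalism, the “intensity of the dynamis or potentia” discussed below.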

Now, what Wheeler states is that when the particle is asked this question, and when decoherence occurs, the branching of potential alternatives forces a “choice” of a potential branch to be actualized. This actualization depends on the degree to which each branch tends to be actualized, which is nothing but the measure of the intensity of the dynamis or potentia, since it is the nature of the dynamis to tend towards the act.

In terms of quantum cosmology, this is a perspective with fundamental consequences:

- Before the actualized universe we have to consider a myriad of potential universes, each tending to be actualized with different propensities;

- A quantum computation inducing entanglement had to take place that led to the actualization of a given universe, or, at least of some characteristics of the universe.

The second point makes us question the nature of the first event. Either there was a single actualized event that originated the universe, which would mean an initial maximal entanglement with respect to the relevant cosmological observables, or there was a series of actualizations that processually produced the final result of the big bang. Wheeler admits the theoretical possibility that some parts of what could be considered to be in the universe’s past may only be actualized some time in the future. This is visually expressed by Wheeler’s eye diagram (see the beginning of the article), which pictorially represents the question placed by Wheeler: Does looking back “now” give actual reality to what happened “then”?

Regarding this question, Wheeler holds that the universe can be thought of as a grand synthesis, putting itself together all the time as a whole. Its history is, in this sense, not a history as we usually conceive it. It is not one thing happening after another after another. It is a totality in which what happens “now” actualizes what happened “then”, determining what happened then.

Sunday, April 13, 2008

The Conceptual Confusion between Potentia and Possibility in Physics

by Maria Odete Madeira

Some errors of interpretation of philosophical concepts used in science lead to fundamental problems at the root of the theories, when these concepts play a foundational role in the edifice of that theory. An example of this is the statement, by Heisenberg, of an equivalence between dynamis (potentia) and possibility.

Heisenberg established, incorrectly, an equivalence between the Aristotelic notion of dynamis and possibility.

The dynamis and the possibility are two distinct notions, both logically and ontologically. All existing things, situations, events, beings or entities are contingent things and, because of that, are, as such, subject to change. In this way, all physical existents are a composite of dynamis (potentia) and energeia (actus), the dynamis being as real as the energeia.

The dynamis that composes each physical existent is, logically and ontologically, a principle and, also, a temporal moment of determinable indetermination, its determination being done by the energeia that corresponds to it, logically and ontologically, as its principle of determination. Every situation, event, being, entity is constituted by these two principles: dynamis and energeia.

The possibility is an abstract term of any language, be that language logical, epistemological or ontological. The possibility designates the intelligible structure of the possible: possible is all that can be or not be, without logical contradiction.

Any physical existent has in itself, as such, its own possibility, as a neutral element that precedes and accompanies it along its existence. All that is, or exists, must be possible to be or exist, because it, effectively, is, or exists, and, equally, all that is not, or does not exist, must be possible not to be, or not to exist, because it, effectively, is not.

The category of possibility is, in the motion energeia/dynamis/energeia, logically and ontologically, a neutral element. One can talk about a determination of the dynamis by the energeia, and, thus, of an actualization, but one cannot talk about an actualization of the possibility, nor establish an equivalence between the notion of possibility with the notion of dynamis.

Monday, March 31, 2008

On Metaphysics



The term "Metaphysics" (ta meta ta physika) was introduced by Andronicus of Rhodes to classify a collection of texts of the Corpus Aristotelicum that concerned those matters that did not have place in the books of Physics.

Metaphysics refers to the branch of philosophy that studies the first causes, principles and origins of all things. Causes, determinations, reasons and respective foundations are approached within Metaphysics.

As the science of the first causes, principles and origins of all existing things, Metaphysics is an archeology (arche). As science of the being as being it is an ontology. Metaphysics is irreducibly a philosophical concept about the first causes, principles and origins of all existing things.

One must stress that we are not addressing anything that is outside the Cosmos. Indeed, every physical object is also a metaphysical object, in its connection with its origin and foundation.


In this way, as a methodological criterion, Metaphysics concerns the foundation of all existing things as such. It is, thus, about inquiring into the things themselves, into that which, in them, is present as their originary and originating condition, which means that a metaphysical inquiry demands that one question the foundation itself, which, as such, is present in the things themselves, as that which maintains them and sustains them as entities, identities and integrities, existent and subsistent in themselves and by themselves.


The foundation is a principle of reason and, thus, of proportion, pattern, number. As Leibniz stated “nihil est sine ratione”. The foundation is incorporated in the nature of the things themselves, as their possibility as existent things, in themselves and by themselves (autonomy).

That which Metaphysics strives for is to find not only that proportion/symmetry/order but also that which preceded it. It is no longer, in this last case, about inquiring into the being, entity or identity, but about questioning the foundation itself and, thus, searching for the originating principle of the foundation itself, as its existing potential anteriority (Aristotle), that is, what ordering principle (number, pattern, order) underlies the foundation itself as an existent foundation of each individuated existent.

Whenever a cosmologist addresses the first causes, principles and origin of the Cosmos/Order and of the entanglement that we call Universe, inquiring about the nature of the laws of physics or about the foundations of physics, he/she is engaged in a metaphysical inquiry, having to make an effort to interpret that which exists in the Cosmos outside his/her logocentrically conventioned closed mental models and Sapiens' logics, which, in many cases and in an abusive way, are called metaphysical models, putting into evidence a total lack of knowledge of the notion of Metaphysics itself.

Friday, March 14, 2008

Music and Eternity

By Maria Odete Madeira


Source: Gonçalves, C.P. (2019), Messengers, 


The myth assigned music a divine origin. The Pythagoreans, in turn, referred it (music) to a metaphysical acoustics, in which the intervals and the numerical relations underlying the relations of consonance were considered in the formulation of a doctrine of the ethos of the tonality, of the rhythms and of the instruments, all of this correlated with the effects produced in the listeners.

The order of the world and of everything was configured by the harmony and union of the diverse (Philolaus), and this harmony was determined by numerical relations. The soul of each man, as sound, could enter into tune with the order of things and with the music of the spheres.

As Mathesis Universalis, music can be considered as a combinatorial topology of signs that expresses, in a hieroglyphic resonance, that which the logos hides; Leibniz called it the hidden exercise of an arithmetic of the soul.

This is about questioning the organic and trans-organic feeling (the nostalgia in Mahler/Adorno) about temporality and trans-temporality, in order to attain an outside of chronological time, an urgrund (Schelling), and hence concurrent with the first creation.

Musicality is not necessarily restricted to musical pieces. Letters and phrases evoke sounds, and it is possible to convey an additional sense through the topological motion of the verb. The written text has a musical semantics and syntax naturally embedded through the resendings between the written text itself and the verbal articulation (even interior to the reader) of that same text.

Each written text resends towards a metaphysical sense as its first cause. The text is not surrounded, but crossed through by its limit, marked in its interior by the multiple tracks of its margin. Proposing simultaneously the monument and the mirage of the trace (différance, Derrida, Marges de la Philosophie), the trace, simultaneously marked and erased, simultaneously alive and dead, lives, as always, from simulating the life in its kept inscription.

Schopenhauer considered the existence of an infinite will, independent of all individuation, existing in any living being as an infinite principle, unique and indivisible (a noumenon), without any connection to space/time/causality, which, for Schopenhauer, constituted the principium individuationis. This will objectivates itself as an eidos, accessible through art, revealing itself in music as the direct and immediate presence of a transcendence (a perspective that Wagner’s music exemplifies).

Steiner (Grammars of Creation) states that music signals and puts in resonant motion that which, in humans, exists as noumenon.