Ideas on Complexity

January 2010

Aspects of Theology That Are Better Understood through Complexity

  1. If the world is a complex system, we know it is not destined to reach an optimum; it has no tendency toward linear growth. This idea fits better with the vision of history that faith paints for us: the world is not called to reach paradise on this very Earth. In fact, the book of Revelation tells us that the end will come at a low point of history.
  2. It becomes easier to understand that the Church has to be, at the same time, hierarchical and charismatic: that it has to listen both downward, to see what the Holy Spirit stirs up, and upward, to hear what the hierarchy says. In fact, the role of any hierarchy or leadership in an organization is to validate the consonance between the initiatives that come from below and the mission of the organization. The strength is below; the vision, above.
  3. Not only man has a vocation to holiness; the whole of creation is called to God.
  4. Matter is more than tangible things; matter is nothing but one way in which energy exists. Everything material is energy. This fact has several interesting consequences:
    1. Everything is connected. Moreover, everything material can possibly be seen as varied manifestations of a single energy. This better explains how animals and plants exist to be used by other animals, which limit or even end their "development as individuals." When a cow kills with its tail a fly that was bothering it, we cannot think that death has come to frustrate the life of that individual fly, which possibly had offspring that will also die because their mother never arrived with food for them. Both the fly and its offspring remain in nature under another format. The only individuals whose existence has to justify itself are persons. This does not prevent a large part of the meaning of a person's life from being understood through their connection with others. The existence of a child who dies very young is not justified entirely by the amount of love it may have "generated" during its life. To understand that existence, we must see it in the context of the whole of history and of the connection of that physically truncated life with other lives.
  5. One of the main lessons the Bible wants to teach us in its account of creation is that the explanation of the evil in the world is not the existence of a god of evil (alongside a god of good). The whole of creation comes from the God of Good who, moreover, is the only God: there is no god of evil. Agents of evil do exist, but they were created by the Good God, and therefore, ultimately, they must serve a good purpose. In fact, the Bible is the best place to come to grips with the great mystery that troubles people of every age: why evil exists. One could say that every religion attempts to answer this question of the reason for evil's existence. That is why people become interested in religion, among other reasons, when they experience evil at close range. One of the most recent religions, materialist evolutionism or simply materialism, consists, in essence, of a coherent explanation of the existence of evil.
  6. Physics reveals ever more clearly that the structure of the universe is that of a thought. Cosmology also reveals that it had a beginning, and thermodynamics that it will have an end. Complexity reveals that the human being, far from being an evolutionary accident, is "the expected one." All of this allows us to hypothesize that the universe is, at bottom, an immense compliment, a thought of love, cast, possibly, from one person to another. Faith, for its part, confirms that creation is the gratuitous fruit of a love.
  7. We commonly think that human intelligence is, at bottom, a capacity to process information. The most intelligent person, on this view, is the one who extracts the most consequences most quickly, both from particular observations and from general laws. But this kind of intelligence is, at bottom, an evolution of the intelligence of animals, which also process information. It cannot be what makes us spiritual creatures. The most characteristic effect of the human spirit is not its capacity to process information but its freedom. Freedom is something very subtle and transparent: it is the capacity to follow or not to follow a stimulus arising from our nature (matter + history). For that reason, freedom occupies only one "bit" of information: it is a 0 or a 1; we accept or we do not accept. Everything in man that is not freedom comes from his material part (heredity + experience; that is, DNA + learning). One consequence is that it is very difficult to know what in a person is the fruit of their freedom and what came as given, with their baggage of DNA + experiences. The truth of a person is what that person has done with their freedom (this is the only thing on which a person will be judged at the end). It is practically impossible to know what is genuinely one's own in a person's successes and failures. That is why we are forbidden to judge anyone, even ourselves.
  8. There is a criterion that any eventual explanation of reality will have to satisfy: it must be compatible with atheism. This is so because we know, by faith, that God cannot force anyone to accept him. It implies that in any explanation of the universe there will always remain the possibility, for whoever wishes to deny God, of doing so rationally. The evolutionary worldview currently taking shape meets this criterion.
  9. How can the emergence of life (a tendency toward order) be reconciled with the second law of thermodynamics (which prescribes a tendency toward disorder)? The explanation is that life does not violate the second law, because what the second law really says is that to go from disorder toward order, useful energy must be consumed; that is, the entropy of the universe as a whole must increase. And this is indeed what happens: life exists by consuming solar energy. One implication of this reading of the second law is that, for life to be possible in the universe, all the prior development that eventually allowed stars like our sun to exist was necessary: our universe is not so large after all; it is the size required for its purpose to be carried out.
  10. We do not arrive at truth through the dialectical clash between those who are right and those who are not. We walk toward truth directly, because we are designed for it. Therefore, our strategy for finding the truth about the universe should not be to attack the arguments of those who say the universe does not speak of a creating intelligence. Our strategy should be to use revealed truths as a guide to orient our steps in the search for natural truth. In doing so we will see, among other things, that we can even take advantage of many truths discovered by those without faith, and walk together toward a truth to which we are all called.

"Self-thinning" is a term that refers to the progressive density-dependent mortality that occurs within an even-aged group of plants as the individuals grow in size. The Self-Thinning Rule describes the dynamics of self-thinning mathematically:

B = C * N^(-1/2)

where B = biomass of the population, C = a constant, and N = the density of plants. When this equation is plotted on a graph of log biomass (log B) vs. log density (log N), the result is a straight line of negative slope called the thinning line (Fig 4.1). This thinning line is a boundary of maximum total biomass for the population at a given density and is considered a carrying capacity. (Taken from "Population Dynamics", Ecology Connections, University of Calgary, December 2004)
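Why the thinning line is straight on a log-log plot follows from taking logarithms of the rule above (a one-step derivation added here for clarity; it is not part of the quoted passage):

log B = log C − (1/2) * log N

That is, in the (log N, log B) plane the rule traces a line with intercept log C and slope −1/2.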

Information theory and Entropy – Tom Carter – Summary


Information theory and Entropy

Tom Carter

A. How to measure complexity: as the amount of unpredicted information

We want to be able to compare two systems, and be able to say that system A is more complex than system B.

Various approaches to measuring complexity have been proposed, among them:

1. Human observation and (subjective) rating

2. Number of parts or distinct elements (what counts as a distinct part?)

3. Dimension (measured how?)

4. Number of parameters controlling the system

5. Minimal description (in which language?)

6. Information content (how do we define/measure information?)

7. Minimal generator/constructor (what machines/methods can we use?)

8. Minimum energy/time to construct

My first focus will be on measures related to how surprising or unexpected an observation or event is. This approach has been described as information theory.

B. Deriving a definition of information: I(p) = log(1/p)

We would like to develop a usable measure of the information we get from observing the occurrence of an event having probability p.

We will want our information measure I(p) to have several properties (note that along with each axiom comes the motivation for choosing it):

1. We want the information measure to be a non-negative quantity: I(p) ≥ 0.

2. If an event has probability 1 (it is certain that it will occur), we get no information from the occurrence of the event: I(1) = 0.

3. If two independent events occur (whose joint probability is the product of their individual probabilities), then the information we get from observing the events is the sum of the two informations: I(p1 * p2) = I(p1)+I(p2). (This is the critical property . . . )

4. We will want our information measure to be a continuous (and, in fact, monotonic) function of the probability (slight changes in probability should result in slight changes in information).

From these axioms we can therefore derive the following:

1. I(p^2) = I(p * p) = I(p) + I(p) = 2 * I(p)

2. Thus, further, I(p^n) = n * I(p) (by induction . . . )

3. I(p) = I((p^(1/m))^m) = m * I(p^(1/m)), so

I(p^(1/m)) = (1/m) * I(p), and thus in general I(p^(n/m)) = (n/m) * I(p)

4. And thus, by continuity, we get, for 0 < p < 1 and a > 0 a real number: I(p^a) = a * I(p)

From this, we can derive the nice property:

I(p) = −log_b(p)

This will be our definition of the information measure:

I(p) = log_b(1/p)

[This means that the information contained in an event is inversely related to the probability of the event: the more probable the event, the less information it carries.]
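As a quick numerical check (my illustrative sketch, not part of Carter's text), the definition can be coded directly; base 2 is the conventional choice of b, giving information in bits:

    import math

    def information(p: float, base: float = 2.0) -> float:
        """I(p) = log_base(1/p): information gained by observing an event of probability p."""
        return math.log(1.0 / p, base)

    print(information(1.0))    # 0.0 -- a certain event carries no information
    print(information(0.5))    # 1.0 bit
    print(information(0.25))   # 2.0 bits = information(0.5) + information(0.5),
                               # the additivity property for independent events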

C. Defining entropy as the average information per symbol

Suppose now that we have n symbols {a1, a2, . . . , an}, and some source is providing us with a stream of these symbols. Suppose further that the source emits the symbols with probabilities {p1, p2, . . . , pn}, respectively. For now, we also assume that the symbols are emitted independently (successive symbols do not depend in any way on past symbols). What is the average amount of information we get from each symbol we see in the stream?

If we make N (independent) observations, each symbol a_i will appear about N * p_i times, so we will get total information I of:

I = ∑ (N * p_i) * log(1/p_i)

But then, the average information we get per symbol observed will be:

I/N = (1/N) ∑ (N * p_i) * log(1/p_i) = ∑ p_i * log(1/p_i)

This will be our (actually Shannon's) definition of entropy: the average information per symbol in a stream of symbols, or the expected value of information:

H(P) = ∑ p_i * log(1/p_i)
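Transcribing the definition into code (again my sketch, not Carter's; the probabilities are assumed to sum to 1, and zero-probability symbols contribute nothing in the limit):

    import math

    def entropy(probs: list[float], base: float = 2.0) -> float:
        """H(P) = sum of p_i * log(1/p_i): average information per symbol (bits for base 2)."""
        return sum(p * math.log(1.0 / p, base) for p in probs if p > 0.0)

    print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(entropy([0.9, 0.1]))   # ~0.47 bits: a predictable source tells us little
    print(entropy([0.25] * 4))   # 2.0 bits: uniform over 4 symbols, i.e. log(4)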

D. What is the maximum amount of unpredicted information that a system may have?

Using the Gibbs inequality to find the maximum of the entropy function above, we get:

0 ≤ H(P) ≤ log(n)

That is, the maximum of the entropy function is the log of the number of possible events, and occurs when all the events are equally likely.

An example illustrating this result: How much information can a student get from a single grade? First, the maximum information occurs if all grades have equal probability (e.g., in a pass/fail class, on average half should pass if we want to maximize the information given by the grade).

The maximum information the student gets from a grade will be:

Pass/Fail: log(2) = 1 bit

A, B, C, D, F: log(5) ≈ 2.3 bits

A, A-, B+, . . ., D-, F (twelve grades): log(12) ≈ 3.6 bits

Thus, using +/- grading gives the students about 1.3 more bits of information per grade than without +/-, and about 2.6 bits per grade more than pass/fail.
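These figures can be verified in one line each (base-2 logarithms; the count of twelve grades for the +/- scale is inferred from the 3.6-bit figure and the differences quoted above):

    import math

    for scale, n in [("pass/fail", 2), ("A-F", 5), ("A-F with +/-", 12)]:
        print(f"{scale}: {math.log2(n):.2f} bits")
    # pass/fail: 1.00 bits
    # A-F: 2.32 bits
    # A-F with +/-: 3.58 bits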

E. Other characteristics of this definition

1. These definitions of information and entropy may not match with some other uses of the terms. For example, if we know that a source will, with equal probability, transmit either the complete text of Hamlet or the complete text of Macbeth (and nothing else), then receiving the complete text of Hamlet provides us with precisely 1 bit of information.

2. It is important to recognize that our definitions of information and entropy depend only on the probability distribution. In general, it won’t make sense for us to talk about the information or the entropy of a source without specifying the probability distribution.

3. This observation (almost :-) accords with our intuition: two people listening to the same lecture can get very different information from the lecture. For example, without appropriate background, one person might not understand anything at all, and therefore have as probability model a completely random source, and therefore get much more information than the listener who understands quite a bit, and can therefore anticipate much of what goes on, and therefore assigns non-equal probabilities to successive words.

[…]

Evolution of Cooperation – Robert Axelrod – Summary


The Evolution of Cooperation

By Robert Axelrod

Under what conditions will cooperation emerge in a world of egoists without central authority? This question has intrigued people for a long time.

A good example of the fundamental problem of cooperation is the case where two industrial nations have erected trade barriers to each other’s exports. Because of the mutual advantages of free trade, both countries would be better off if these barriers were eliminated. But if either country were to eliminate its barriers unilaterally, it would find itself facing terms of trade that hurt its own economy. In fact, whatever one country does, the other country is better off retaining its own trade barriers. Therefore, the problem is that each country has an incentive to retain trade barriers, leading to a worse outcome than would have been possible had both countries cooperated with each other.

A. Lessons from the Prisoner's Dilemma

This basic problem occurs when the pursuit of self-interest by each leads to a poor outcome for all.

This may be illustrated with a game called the Prisoner's Dilemma, so named because in its original form two prisoners face the choice of informing on each other (defecting) or remaining silent (cooperating). Each must make the choice without knowing what the other will do.

The rewards are: If both players defect: Both players get $1. If both players cooperate: Both players get $3. If one player defects while the other player cooperates: The defector gets $5 and the cooperator gets zero.

One can see that no matter what the other player does, defection yields a higher payoff than cooperation.
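The dominance of defection can be checked mechanically from the payoffs just quoted (an illustrative sketch; the payoff table is from the text, the code is not):

    # Row player's payoff for (my_move, their_move); C = cooperate, D = defect.
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0,
              ("D", "C"): 5, ("D", "D"): 1}

    for their_move in ("C", "D"):
        best = max(("C", "D"), key=lambda mine: PAYOFF[(mine, their_move)])
        print(f"Against {their_move}, the best reply is {best}")
    # Both lines print D: defection dominates, yet mutual defection
    # pays 1 each while mutual cooperation would pay 3 each -- the dilemma.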

The winner of the computer tournament for the iterated Prisoner's Dilemma was the simplest of all the candidates submitted: TIT FOR TAT, a strategy of simple reciprocity which cooperates on the first move and then does whatever the other player did on the previous move.

The analysis of the data from these tournaments reveals four properties which tend to make a strategy successful:

  1. avoidance of unnecessary conflict by cooperating as long as the other player does,
  2. provocability in the face of an uncalled-for defection by the other,
  3. forgiveness after responding to a provocation, and
  4. clarity of behavior so that the other player can recognize and adapt to your pattern of action.

In summary, the best strategy is: Don’t be envious, don’t be the first to defect, reciprocate both cooperation and defection, and don’t be too clever.
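A minimal simulation (my sketch, not Axelrod's tournament code) makes the reciprocity concrete: TIT FOR TAT never outscores its partner in a single pairing, but it avoids being repeatedly exploited:

    # Payoffs as in the dilemma above; C = cooperate, D = defect.
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

    def tit_for_tat(opponent_history):
        """Cooperate on the first move, then copy the opponent's last move."""
        return opponent_history[-1] if opponent_history else "C"

    def always_defect(opponent_history):
        return "D"

    def play(strat_a, strat_b, rounds=20):
        """Total payoffs of two strategies over an iterated game of fixed length."""
        hist_a, hist_b = [], []          # moves played so far by each side
        score_a = score_b = 0
        for _ in range(rounds):
            a, b = strat_a(hist_b), strat_b(hist_a)
            score_a += PAYOFF[(a, b)]
            score_b += PAYOFF[(b, a)]
            hist_a.append(a)
            hist_b.append(b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (60, 60): unbroken mutual cooperation
    print(play(tit_for_tat, always_defect))  # (19, 24): exploited once, then retaliates

Note that reciprocity scores well not by beating its partner but by sustaining the mutually rewarding outcome, the point returned to in section F below.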

B. Lessons from World War I: Live and Let Live

This system emerged during the trench warfare of the Western Front in World War I. In the midst of this bitter conflict, the front-line soldiers often refrained from shooting to kill, provided their restraint was reciprocated by the soldiers on the other side.

C. Conditions for Stable Cooperation

1. The individuals involved do not have to be rational: The evolutionary process allows successful strategies to thrive, even if the players do not know why or how.

2. Nor do they have to exchange messages or commitments: They do not need words, because their deeds speak for them.

3. Likewise, there is no need to assume trust between the players: The use of reciprocity can be enough to make defection unproductive.

4. Altruism is not needed: Successful strategies can elicit cooperation even from an egoist.

5. Finally, no central authority is needed: Cooperation based on reciprocity can be self-policing.

6. For cooperation to emerge, the interaction must extend over an indefinite (or at least an unknown) number of moves. For cooperation to prove stable, the future must have a sufficiently large shadow. This means that the importance of the next encounter between the same two individuals must be great enough to make defection an unprofitable strategy (a numerical sketch of this condition follows this list).

7. In order for cooperation to get started in the first place, there must be some clustering of individuals who use strategies with two properties: The strategy cooperates on the first move, and discriminates between those who respond to the cooperation and those who do not.
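The "shadow of the future" in point 6 can be sketched numerically (my illustration, using the payoff values from the dilemma above and a hypothetical continuation weight w for each successive round; no such formula is quoted in this summary). Against a reciprocating partner, a single defection earns the temptation payoff once but leads to mutual defection thereafter:

    # T, R, P, S are the standard labels for the payoffs quoted earlier:
    # temptation (5), reward (3), punishment (1), sucker's payoff (0).
    T, R, P, S = 5, 3, 1, 0

    def cooperate_forever(w):
        """Discounted value of mutual cooperation: R + w*R + w^2*R + ... = R / (1 - w)."""
        return R / (1 - w)

    def defect_on_reciprocator(w):
        """Defect once (gain T), then face mutual defection (P) from then on."""
        return T + w * P / (1 - w)

    for w in (0.2, 0.5, 0.9):
        print(f"w={w}: cooperate={cooperate_forever(w):.2f}, defect={defect_on_reciprocator(w):.2f}")
    # w=0.2: cooperate=3.75,  defect=5.25  -> short shadow: defection pays
    # w=0.5: cooperate=6.00,  defect=6.00  -> break-even: w = (T-R)/(T-P)
    # w=0.9: cooperate=30.00, defect=14.00 -> long shadow: cooperation pays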

D. How Cooperation Evolves

Cooperation can begin with small clusters. It can thrive with strategies that are “nice” (that is, never the first to defect), provocable, and somewhat forgiving. Once established in a population, individuals using such discriminating strategies can protect themselves from invasion. The overall level of cooperation tends to go up and not down.

The foundation of cooperation is not really trust, but the durability of the relationship. When the conditions are right, the players can come to cooperate with each other through trial-and-error learning about possibilities for mutual rewards, through imitation of other successful players, or even through a blind process of selection of the more successful strategies with a weeding out of the less successful ones.

E. The Value of Provocability

One of my biggest surprises in working on this project has been the value of provocability and that it is important to respond sooner, rather than later. I came to this project believing one should be slow to anger. The results of the computer tournament for the Prisoner’s Dilemma demonstrate that it is actually better to respond quickly to a provocation.

F. A Self-Reinforcing Ratchet Effect

Once the word gets out that reciprocity works – among nations or among individuals – it becomes the thing to do. If you expect others to reciprocate your defections as well as your cooperations, you will be wise to avoid starting any trouble.

The establishment of stable cooperation can take a long time if it is based upon blind forces of evolution, or it can happen rather quickly if its operation can be appreciated by intelligent players.

There is a lesson in the fact that simple reciprocity succeeds without doing better than anyone with whom it interacts. It succeeds by eliciting cooperation from others, not by defeating them. We are used to thinking about competitions in which there is only one winner, competitions such as football or chess. But the world is rarely like that. In a vast range of situations, mutual cooperation can be better for both sides than mutual defection. The key to doing well lies not in overcoming others, but in eliciting their cooperation.