
Wednesday, March 6, 2024

Chance, design and artificial life

In previous posts in this blog I have mentioned my experiments on artificial life: the computer simulation of processes similar to those that take place in living beings. Artificial life should not be confused with synthetic life, which is the construction of artificial living beings in the laboratory.

One of the most widely used tools in artificial life (and in related fields) is the genetic algorithm, which simulates biological evolution inside the computer and applies it to the entities under study. These experiments combine chance and necessity (the title of Monod’s book mentioned in the previous post). Chance is usually supplied by a pseudo-random number generator that perturbs the operation of the rest of the algorithm, whose deterministic rules represent necessity.
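To make the idea concrete, here is a minimal genetic-algorithm sketch in Python (an illustration of the general technique, not the code used in my experiments; the toy fitness function, population size and mutation rate are arbitrary choices). Chance enters through the pseudo-random number generator; necessity is the fixed rule that always keeps the fitter half of the population.

import random

GENOME_LENGTH = 20     # bits per individual
POPULATION_SIZE = 30
MUTATION_RATE = 0.02   # probability of flipping each bit
GENERATIONS = 100

def fitness(genome):
    # Toy objective ("ones-max"): count the 1-bits in the genome.
    return sum(genome)

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def crossover(a, b):
    point = random.randint(1, GENOME_LENGTH - 1)   # chance: random cut point
    return a[:point] + b[point:]

def mutate(genome):
    # Chance: each bit flips with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [random_genome() for _ in range(POPULATION_SIZE)]
for _ in range(GENERATIONS):
    # Necessity: the fitter half is always selected as the breeding pool.
    population.sort(key=fitness, reverse=True)
    parents = population[:POPULATION_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POPULATION_SIZE - len(parents))]
    population = parents + children

print("Best fitness after evolution:", max(fitness(g) for g in population))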

Thursday, May 18, 2017

Is the increase in life expectancy accelerating?

Nick Bostrom
Some philosophers, such as Nick Bostrom and the transhumanists, have concocted an updated version of Nietzsche’s superman. Their forecasts rest on two scientific advances that have been presented as imminent for several decades: immortality, which would be attained when advances in medicine increase life expectancy by more than one year per year; and artificial intelligence, the design of super-intelligent machines. Both advances could be combined to attain immortality through artificial intelligence, by downloading our consciousness (something we cannot even define scientifically) into a super-intelligent machine, so that it would go on existing inside the machine.
Unfortunately for transhumanists, the UN data do not confirm their expectations. Let us look first at the data about the evolution of the maximum life expectancy in the world from 1950 to 2015 (see table 1). These and the following data have been taken from https://esa.un.org/unpd/wpp/Download/Standard/Mortality/.
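As a side note, the transhumanist threshold is easy to state arithmetically: life expectancy would have to grow by more than one year for every calendar year that passes. The short Python sketch below checks that condition on a series of values; the figures used here are hypothetical placeholders, not the UN data of table 1, which can be downloaded from the address given above.

# Hypothetical life-expectancy values (years at birth); NOT the UN figures.
series = {1950: 71.0, 1975: 75.0, 2000: 79.0, 2015: 82.0}

years = sorted(series)
for start, end in zip(years, years[1:]):
    rate = (series[end] - series[start]) / (end - start)  # years gained per calendar year
    verdict = "exceeds" if rate > 1.0 else "falls short of"
    print(f"{start}-{end}: {rate:.2f} years/year, {verdict} the one-year-per-year threshold")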

Thursday, May 26, 2016

Disappointment in the face of unreasonably optimistic forecasts

Arthur C. Clarke
The future is unpredictable. The information revolution that began in the 80s with the personal computer, followed in the 90s by the global expansion of the Internet, and continued in the first decade of this century with smartphones, came as a surprise to many futurists. Half a century ago, all predictions agreed that future computers would be larger. In fact, they became smaller. By 1965, something like the Internet seemed a prediction for the next century (see Arthur C. Clarke’s short story Dial F for Frankenstein). Looking back, many of the scientific advances of the twentieth century were surprising. Why, then, do we insist on making predictions, if they are almost never fulfilled?
The March 2016 issue of the Spanish edition of Scientific American includes an article entitled Neuroscience: how to avoid disappointment, by Professor Alfredo Marcos, which reviews some modern predictions about research on the human brain that he considers far too optimistic. If these forecasts are not met, as can be expected, the disappointment of the public and of the governments that sponsor and fund these scientific efforts could lead to a wave of excessive skepticism. Here are a few of his words:
However much we learn about the brain, we must not expect that it will provide us with the immediate healing of all our medical and social ills, from Alzheimer's to violence; much less with the keys to human existence.

Thursday, October 8, 2015

We are not living in a simulation: a note on Nick Bostrom’s proposal

Simulation of the collision of two galaxies
In a paper published in 2003 [1], Nick Bostrom proposed the following reasoning:
A technologically mature “posthuman” civilization would have enormous computing power. Based on this empirical fact, the simulation argument shows that at least one of the following propositions is true:

(1)   The fraction of human-level civilizations that reach a posthuman stage is very close to zero;
(2)   The fraction of posthuman civilizations that are interested in running ancestor-simulations is very close to zero;
(3)   The fraction of all people with our kind of experiences that are living in a simulation is very close to one.
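The arithmetic behind the trilemma can be summarized as follows (this is my own paraphrase, with notation modelled on Bostrom’s paper [1]): let $f_P$ be the fraction of human-level civilizations that reach a posthuman stage, $\bar{N}$ the average number of ancestor-simulations run by a posthuman civilization, and $\bar{H}$ the average number of individuals that have lived in a civilization before it reaches that stage. The fraction of observers with human-type experiences who live in a simulation is then

$$ f_{sim} = \frac{f_P\,\bar{N}\,\bar{H}}{f_P\,\bar{N}\,\bar{H} + \bar{H}} = \frac{f_P\,\bar{N}}{f_P\,\bar{N} + 1}. $$

Because a posthuman civilization’s computing power would make $\bar{N}$ enormous whenever a non-negligible fraction of such civilizations chose to run ancestor-simulations, $f_{sim}$ can only stay far from one if $f_P$ is nearly zero or if almost no posthuman civilization is interested in running them, which yields propositions (1), (2) and (3).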