Monday, April 13, 2015

Stories close in time, worlds apart

The son of the people who lived in the house next door to the blue house was visiting. His grandmother "owns" it. His parents had vacated many months ago, and it looked abandoned. He said things had been stolen from the unsecured work areas behind the house.

He and his sister were there for two weeks to clean up the property and cut the grass. The water and electricity had been disconnected, so he asked if he could use the outside faucet on the blue house. I agreed and lent him a bucket.

He said they would camp out in the house. "Not a problem, as I camp out in Sealy."


The next day I was talking to our neighbour, just back from seeing his wife and new baby in Nigeria. He showed me photos of the new house he is building there: a mansion with a visitors' wing, a live-in 24/7 guard station, and a high surrounding wall that will be electrified. As the power supply is uncertain, the system has a battery backup that will last a week.

The place also advertises that there is something worth stealing.

Sunday, April 12, 2015

Mini Library Sighting


On a corner house in the Houston Heights area, with free books and toys.

Tuesday, April 7, 2015

Moderately intimidating

A back-of-the-envelope estimate of what it would take to create human intelligence in a computer (artificial intelligence). From Nick Bostrom.


What would it take to recapitulate evolution? Not every feat accomplished by evolution in the course of the development of human intelligence is relevant to a human engineer trying to artificially evolve machine intelligence. Only a small portion of evolutionary selection on Earth has been selection for intelligence. 

More specifically, the problems that human engineers cannot trivially bypass may have been the target of a very small portion of total evolutionary selection. For example, since we can run our computers on electrical power, we do not have to reinvent the molecules of the cellular energy economy in order to create intelligent machines—yet such molecular evolution of metabolic pathways might have used up a large part of the total amount of selection power that was available to evolution over the course of Earth’s history. (7)

One might argue that the key insights for AI are embodied in the structure of nervous systems, which came into existence less than a billion years ago. (8) If we take that view, then the number of relevant “experiments” available to evolution is drastically curtailed. There are some 4–6 × 10^30 prokaryotes in the world today, but only 10^19 insects, and fewer than 10^10 humans (while pre-agricultural populations were orders of magnitude smaller). (9) These numbers are only moderately intimidating.

Evolutionary algorithms, however, require not only variations to select among but also a fitness function to evaluate variants, and this is typically the most computationally expensive component. A fitness function for the evolution of artificial intelligence plausibly requires simulation of neural development, learning, and cognition to evaluate fitness. We might thus do better not to look at the raw number of organisms with complex nervous systems, but instead to attend to the number of neurons in biological organisms that we might need to simulate to mimic evolution’s fitness function.

We can make a crude estimate of that latter quantity by considering insects, which dominate terrestrial animal biomass (with ants alone estimated to contribute some 15–20%). (10) Insect brain size varies substantially, with large and social insects sporting larger brains: a honeybee brain has just under 10^6 neurons, a fruit fly brain has 10^5 neurons, and ants are in between with 250,000 neurons. (11) The majority of smaller insects may have brains of only a few thousand neurons. Erring on the side of conservatively high, if we assigned all 10^19 insects fruit-fly numbers of neurons, the total would be 10^24 insect neurons in the world. This could be augmented with an additional order of magnitude to account for aquatic copepods, birds, reptiles, mammals, etc., to reach 10^25. (By contrast, in pre-agricultural times there were fewer than 10^7 humans, with under 10^11 neurons each: thus fewer than 10^18 human neurons in total, though humans have a higher number of synapses per neuron.) The computational cost of simulating one neuron depends on the level of detail that one includes in the simulation.

Extremely simple neuron models use about 1,000 floating-point operations per second (FLOPS) to simulate one neuron (in real-time). The electrophysiologically realistic Hodgkin–Huxley model uses 1,200,000 FLOPS. A more detailed multi-compartmental model would add another three to four orders of magnitude, while higher-level models that abstract systems of neurons could subtract two to three orders of magnitude from the simple models. (12) If we were to simulate 10^25 neurons over a billion years of evolution (longer than the existence of nervous systems as we know them), and we allow our computers to run for one year, these figures would give us a requirement in the range of 10^31–10^44 FLOPS. For comparison, China’s Tianhe-2, the world’s most powerful supercomputer as of September 2013, provides only 3.39 × 10^16 FLOPS. In recent decades, it has taken approximately 6.7 years for commodity computers to increase in power by one order of magnitude. Even a century of continued Moore’s law would not be enough to close this gap. Running more specialized hardware, or allowing longer run-times, could contribute only a few more orders of magnitude.

This figure is conservative in another respect. Evolution achieved human intelligence without aiming at this outcome. In other words, the fitness functions for natural organisms do not select only for intelligence and its precursors. (13) Even environments in which organisms with superior information processing skills reap various rewards may not select for intelligence, because improvements to intelligence can (and often do) impose significant costs, such as higher energy consumption or slower maturation times, and those costs may outweigh whatever benefits are gained from smarter behavior. 

Excessively deadly environments also reduce the value of intelligence: the shorter one’s expected lifespan, the less time there will be for increased learning ability to pay off. Reduced selective pressure for intelligence slows the spread of intelligence-enhancing innovations, and thus the opportunity for selection to favor subsequent innovations that depend on them.

Furthermore, evolution may wind up stuck in local optima that humans would notice and bypass by altering tradeoffs between exploitation and exploration or by providing a smooth progression of increasingly difficult intelligence tests. (14) And as mentioned earlier, evolution scatters much of its selection power on traits that are unrelated to intelligence (such as Red Queen’s races of competitive co-evolution between immune systems and parasites).

Evolution continues to waste resources producing mutations that have proved consistently lethal, and it fails to take advantage of statistical similarities in the effects of different mutations. These are all inefficiencies in natural selection (when viewed as a means of evolving intelligence) that it would be relatively easy for a human engineer to avoid while using evolutionary algorithms to develop intelligent software. It is plausible that eliminating inefficiencies like those just described would trim many orders of magnitude off the 10^31–10^44 FLOPS range calculated earlier.

Unfortunately, it is difficult to know how many orders of magnitude. It is difficult even to make a rough estimate—for aught we know, the efficiency savings could be five orders of magnitude, or ten, or twenty-five. (15)


Bostrom, Nick (2014-07-03). Superintelligence: Paths, Dangers, Strategies (Kindle Locations 769-775). Oxford University Press. Kindle Edition.
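
The arithmetic in that passage can be checked with a few lines of Python. This is just a sketch of my own: the inputs are the order-of-magnitude figures from the quote, but the way I split the 10^31–10^44 FLOPS range into a per-neuron cost and a billion-fold speed-up factor is my reconstruction, not Bostrom's, and the variable names are mine.

# Rough check of the figures quoted above; inputs are the
# order-of-magnitude values from the passage, the breakdown is mine.

insects = 1e19                    # insects alive today
neurons_per_insect = 1e5          # assign every insect a fruit-fly brain
insect_neurons = insects * neurons_per_insect   # ~1e24
all_neurons = insect_neurons * 10               # +1 order of magnitude for other animals -> ~1e25

humans = 1e7                      # pre-agricultural human population (upper bound)
human_neurons = humans * 1e11                   # < 1e18, negligible next to the insects
print(f"insect neurons: {insect_neurons:.0e}, human neurons: {human_neurons:.0e}")

sim_years = 1e9                   # simulate a billion years of evolution...
wall_years = 1                    # ...compressed into one year of computer time
speedup = sim_years / wall_years

flops_low = 1e3 / 1e3             # simple model (1e3 FLOPS) minus ~3 orders for higher-level abstractions
flops_high = 1.2e6 * 1e4          # Hodgkin-Huxley (1.2e6 FLOPS) plus ~4 orders for multi-compartment detail

low = all_neurons * speedup * flops_low         # ~1e34 FLOPS
high = all_neurons * speedup * flops_high       # ~1e44 FLOPS
print(f"required: {low:.0e} to {high:.0e} FLOPS")

tianhe2 = 3.39e16                 # FLOPS, September 2013
moore_century = 10 ** (100 / 6.7) # one order of magnitude every 6.7 years, for a century
print(f"Tianhe-2 after a century of Moore's law: {tianhe2 * moore_century:.0e} FLOPS")

Run as written, this prints a range of roughly 1e+34 to 1e+44 FLOPS, inside the quoted range, and a century of Moore's law applied to Tianhe-2 still comes up short of even the low end of that reconstruction.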

Monday, April 6, 2015

Fighting Entropy

The posts were set in concrete, perfectly vertical. But after 25 years of weather, gravity, microbiological activity, cosmic rays, and many other processes that are unknown, they lean badly.

But with money and effort we can fight back, and turn this neglected garden area into a thing of beauty.

Thursday, April 2, 2015

Lung replacements

Need new lungs? If you can't stay alive for 30 days on one lung, with assistance, you can get a brand-new lung grown from your own cells, so there is no rejection. Once things have settled down, in another 30 days you can swap out the other side.

Basically they take your lung and strip out the cells, leaving a sterile scaffold of the proteins collagen and elastin. Then, in a tank, they grow a new lung from stem cells guided by the protein structure.