Information Processing

Pessimism of the Intellect, Optimism of the Will

Friday, July 03, 2015

Humans on AMC



This is a new AMC series, done in collaboration with Channel 4 in the UK. I just watched the first episode and it is really good.

Directional dominance on stature and cognition



Interesting results in this recent Nature article. The dominance effect is quite strong: the equivalent of first-cousin inbreeding (inbreeding coefficient ~ 1/16) results in a decrease of about 1/6 of an SD in height and about 1/3 of an SD in cognitive ability. That means the effect of alleles which depress the trait is significantly more than twice as large in the homozygous (aa) case as in the heterozygous (aA) case.
Directional dominance on stature and cognition in diverse human populations
(Nature July 2015; doi:10.1038/nature14618)

Homozygosity has long been associated with rare, often devastating, Mendelian disorders [1], and Darwin was one of the first to recognize that inbreeding reduces evolutionary fitness [2]. However, the effect of the more distant parental relatedness that is common in modern human populations is less well understood. Genomic data now allow us to investigate the effects of homozygosity on traits of public health importance by observing contiguous homozygous segments (runs of homozygosity), which are inferred to be homozygous along their complete length. Given the low levels of genome-wide homozygosity prevalent in most human populations, information is required on very large numbers of people to provide sufficient power [3,4]. Here we use runs of homozygosity to study 16 health-related quantitative traits in 354,224 individuals from 102 cohorts, and find statistically significant associations between summed runs of homozygosity and four complex traits: height, forced expiratory lung volume in one second, general cognitive ability and educational attainment (P < 1 × 10^−300, 2.1 × 10^−6, 2.5 × 10^−10 and 1.8 × 10^−10, respectively). In each case, increased homozygosity was associated with decreased trait value, equivalent to the offspring of first cousins being 1.2 cm shorter and having 10 months’ less education. Similar effect sizes were found across four continental groups and populations with different degrees of genome-wide homozygosity, providing evidence that homozygosity, rather than confounding, directly contributes to phenotypic variance. Contrary to earlier reports in substantially smaller samples [5,6], no evidence was seen of an influence of genome-wide homozygosity on blood pressure and low density lipoprotein cholesterol, or ten other cardio-metabolic traits. Since directional dominance is predicted for traits under directional evolutionary selection [7], this study provides evidence that increased stature and cognitive function have been positively selected in human evolution, whereas many important risk factors for late-onset complex diseases may not have been.
From the paper:
... After exclusion of outliers, these effect sizes translate into a reduction of 1.2 cm in height and 137 ml in FEV1 for the offspring of first cousins, and into a decrease of 0.3 s.d. in g and 10 months’ less educational attainment.
These results support the claim that height and cognitive ability have been under positive selection in humans / hominids, so that causal variants tend to be rare and deleterious. For related discussion, see, e.g., section 3.1 in my article On the genetic architecture of intelligence and other quantitative traits and earlier post Deleterious variants affecting traits that have been under selection are rare and of small effect.
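To see how directional dominance combines with inbreeding to depress the trait mean, here is a minimal toy model (my own illustration, not the paper's analysis): heterozygotes sit closer to the fit homozygote, and inbreeding shifts genotype frequencies toward homozygosity. With pure additivity (d = 1/2) the mean is unchanged by inbreeding; with directional dominance (d < 1/2) it drops.

```python
import numpy as np

# Toy model of directional dominance (illustrative only, not the paper's
# method). Per-locus trait values (arbitrary units): AA = 0, aA = -d*e,
# aa = -e, with deleterious allele 'a' at frequency p. d < 1/2 means the
# heterozygote is closer to the fit homozygote (directional dominance).
n_loci, p, e, d = 10_000, 0.05, 0.01, 0.2

def trait_mean(F):
    # Genotype frequencies under inbreeding coefficient F
    f_aa = p**2 + F * p * (1 - p)
    f_aA = 2 * p * (1 - p) * (1 - F)
    return n_loci * (f_aA * (-d * e) + f_aa * (-e))

# First-cousin offspring (F = 1/16) vs outbred (F = 0):
print(trait_mean(1/16) - trait_mean(0.0))  # negative: inbreeding depression
# Setting d = 1/2 (pure additivity) makes the difference exactly zero.
```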

Friday, June 26, 2015

Sci Foo 2015


I'm in Palo Alto for this annual meeting of scientists and entrepreneurs at Google. If you read this blog, come over and say hello!

Action photos! Note most of the sessions were in smaller conference rooms, but we weren't allowed to take photographs there.

Tuesday, June 23, 2015

Schwinger meets Rabi


Seventeen-year-old Julian Schwinger meets Columbia professor I. I. Rabi (Nobel Prize 1944) and explains the EPR paper to him.
Climbing the Mountain: The Scientific Biography of Julian Schwinger [p.22-23] ... Rabi appeared; he invited Motz into his office to discuss 'a certain paper by Einstein in the Physical Review'! Motz introduced Julian and asked if he could bring his young friend along; Rabi did not object, and so it began.

The Einstein article turned out to be the famous paper of Einstein, Podolsky, and Rosen, with which young Julian was already familiar. He had studied quantum mechanics with Professor Wills at the City College, and discussed with him the problem of the reduction of a wave packet after additional information about a quantum system is gained from a measurement. 'Then they [Rabi and Motz] began talking and I sat down in the corner. They talked about the details of Einstein's paper, and somehow the conversation hinged on some mathematical point which had to do with whether something was bigger or smaller, and they couldn't make any progress. Then I spoke up and said, "Oh, but that is easy. All you have to do is to use the completeness theorem." Rabi turned and stared at me. Then it followed from there. Motz had to explain that I knew these things. I recall only Rabi's mouth gaping, and he said, "Oh, I see. Well, come over and tell us about it." I told them about how the completeness theorem would settle the matter. From that moment I became Rabi's protégé. He asked, "Where are you studying?" "Oh, at City College." "Do you like it there?" I said, "No, I'm very bored."'

Watching young Julian demonstrate such 'deep understanding of things that were at the time at the frontier and not clearly understood,' Rabi decided on the spot to talk to George Pegram, then chairman of the physics department and dean of the graduate faculty, to arrange Julian's immediate transfer to Columbia. He and Motz left Julian waiting and went to see Pegram ...
Hans Bethe (Nobel Prize 1967) supported the transfer :-)
[p.24] Bethe provided an enthusiastic letter of support after he read Julian's notes on electrodynamics. Bethe's letter, dated 10 July 1935, reads as follows:
Dear Rabi,

Thank you very much for giving me the opportunity to talk to Mr. Schwinger. When discussing his problem with him, I entirely forgot that he was a sophomore 17 years of age. I spoke to him just as to any of the leading theoretical physicists. His knowledge of quantum electrodynamics is certainly equal to my own, and I can hardly understand how he could acquire that knowledge in less than two years and almost all by himself.

He is not the frequent type of man who just "knows" without being able to make his knowledge useful. On the contrary, his main interest consists in doing research, and in doing it exactly at the point where it is most needed at present. That is shown by his choice of his problem: When studying quantum electrodynamics, he found that an important point had been left out in a paper of mine concerning the radiation emitted by fast electrons. That radiation is at present one of the most crucial points of quantum theory. ...
Climbing the Mountain is one of the best scientific biographies I have read, on par with books by Pais on Einstein and Oppenheimer, and by Schweber on QED. The account of the early communication between Schwinger and Feynman about their very different formulations of QED is very interesting. See also Feynman's cognitive style and Feynman and the secret of magic.

Schwinger was one of the 64 mid-career scientists studied by Harvard psychologist Anne Roe.

Schwinger, of course, did not believe in wavefunction collapse or other Copenhagen mysticism: see Schwinger on quantum foundations.

Saturday, June 20, 2015

James Salter, 1925-2015


"Forgive him anything, he writes like an angel."

Remember that the life of this world is but a sport and a pastime.  NYTimes obituary.

From a 2011 post:



I've been a fan of the writer James Salter (see also here) since discovering his masterpiece A Sport and a Pastime. Salter evokes Americans in France as no one since Hemingway in A Moveable Feast. The title comes from the Koran: Remember that the life of this world is but a sport and a pastime ... :-)

I can't think of higher praise than to say I've read every bit of Salter's work I could get my hands on.
See also Lion in winter: James Salter.



INTERVIEWER

But why a memoir?

SALTER

To restore those years when one says, All this is mine—these cities, women, houses, days.

INTERVIEWER

What do you think is the ultimate impulse to write?

SALTER

To write? Because all this is going to vanish. The only thing left will be the prose and poems, the books, what is written down. Man was very fortunate to have invented the book. Without it the past would completely vanish, and we would be left with nothing, we would be naked on earth.

Wednesday, June 17, 2015

Hopfield on physics and biology

Theoretical physicist John Hopfield, inventor of the Hopfield neural network, on the differences between physics and biology. Hopfield migrated into biology after making important contributions in condensed matter theory. At Caltech, Hopfield co-taught a famous course with Carver Mead and Richard Feynman on the physics of computation.
Two cultures? Experiences at the physics-biology interface

(Phys. Biol. 11 053002 doi:10.1088/1478-3975/11/5/053002)

Abstract: 'I didn't really think of this as moving into biology, but rather as exploring another venue in which to do physics.' John Hopfield provides a personal perspective on working on the border between physical and biological sciences.

... With two parents who were physicists, I grew up with the view that science was about understanding quantitatively how things worked, not about collecting details and categorizing observations. Their view, though not so explicitly stated, was certainly that of Rutherford: 'all science is either physics or stamp collecting.' So, when selecting science as a career, I never considered working in biology and ultimately chose solid state physics research.

... I attended my first biology conference in the summer of 1970 at a small meeting with the world's experts on the hemoglobin molecule. It was held at the Villa Serbelloni in Bellagio, in sumptuous surroundings verging on decadence as I had never seen for physics meetings. One of the senior biochemists took me aside to explain to me why I had no place in biology. As he said, gentlemen did not interpret other gentlemen's data, and preferably worked on different organisms. If you wish to interpret data, you must get your own. Only the experimentalist himself knows which of the data points are reliable, and so only he should interpret them. Moreover, if you insist on interpreting other people's data, they will not publish their best data. Biology is very complicated, and any theory with mathematics is such an oversimplification that it is essentially wrong and thus useless. And so on... On closer examination, this diatribe chiefly describes differences between the physics and biology paradigms (at the time at least) for engaging in science. Physics papers use data points with error bars; biology papers lacked them. Physics was based on the quantitative replication of experiments in different laboratories; biology broadened its fact collecting by devaluing replication. Physics education emphasized being able to look at a physical system and express it in mathematical terms. Mathematical theory had great predictive power in physics, but very little in biology. As a result, mathematics is considered the language of the physics paradigm, a language in which most biologists could remain illiterate. Time has passed, but there is still an enormous difference in the biology and physics paradigms for working in science. Advice? Stick to the physics paradigm, for it brings refreshing attitudes and a different choice of problems to the interface. And have a thick skin. ...
Also by Hopfield: Physics, Computation, and Why Biology Looks so Different and Whatever happened to solid state physics?

See also In search of principles: when biology met physics (Bill Bialek), For the historians and the ladies, As flies to wanton boys are we to the gods, and Prometheus in the basement.

Friday, June 12, 2015

Entanglement and fast thermalization in heavy ion collisions


New paper! We hypothesize that rapid growth of entanglement entropy between modes in the central region and other scattering degrees of freedom is responsible for fast thermalization in heavy ion collisions.
Entanglement and Fast Quantum Thermalization in Heavy Ion Collisions (arXiv:1506.03696)

Chiu Man Ho, Stephen D. H. Hsu

Let A be a subsystem of a larger system A∪B, and let ψ be a typical state from the subspace of the Hilbert space H_AB satisfying an energy constraint. Then ρ_A(ψ) = Tr_B |ψ⟩⟨ψ| is nearly thermal. We discuss how this observation is related to fast thermalization of the central region (≈ A) in heavy ion collisions, where B represents other degrees of freedom (soft modes, hard jets, collinear particles) outside of A. Entanglement between the modes in A and B plays a central role; the entanglement entropy S_A increases rapidly in the collision. In gauge-gravity duality, S_A is related to the area of extremal surfaces in the bulk, which can be studied using gravitational duals.
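A quick numerical check of the typicality claim, under simplifying assumptions (Haar-random pure state, no energy constraint, so the "thermal" reduced state is just maximally mixed):

```python
import numpy as np

# Sketch of the typicality statement: for dim(B) >> dim(A), a random pure
# state on H_A (x) H_B has rho_A = Tr_B |psi><psi| close to identity/dim(A),
# and entanglement entropy S_A near its maximum log(dim(A)).
rng = np.random.default_rng(0)
dA, dB = 4, 4096

psi = rng.normal(size=(dA, dB)) + 1j * rng.normal(size=(dA, dB))
psi /= np.linalg.norm(psi)            # random pure state on H_A (x) H_B

rho_A = psi @ psi.conj().T            # partial trace over B
evals = np.linalg.eigvalsh(rho_A)
S_A = -np.sum(evals * np.log(evals))  # entanglement entropy of A

print(S_A, np.log(dA))                        # S_A ~ log(dA)
print(np.abs(rho_A - np.eye(dA) / dA).max())  # rho_A ~ maximally mixed
```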




An earlier blog post Ulam on physical intuition and visualization mentioned the difference between intuition for familiar semiclassical (incoherent) particle phenomena, versus for intrinsically quantum mechanical (coherent) phenomena such as the spread of entanglement and its relation to thermalization.
[Ulam:] ... Most of the physics at Los Alamos could be reduced to the study of assemblies of particles interacting with each other, hitting each other, scattering, sometimes giving rise to new particles. Strangely enough, the actual working problems did not involve much of the mathematical apparatus of quantum theory although it lay at the base of the phenomena, but rather dynamics of a more classical kind—kinematics, statistical mechanics, large-scale motion problems, hydrodynamics, behavior of radiation, and the like. In fact, compared to quantum theory the project work was like applied mathematics as compared with abstract mathematics. If one is good at solving differential equations or using asymptotic series, one need not necessarily know the foundations of function space language. It is needed for a more fundamental understanding, of course. In the same way, quantum theory is necessary in many instances to explain the data and to explain the values of cross sections. But it was not crucial, once one understood the ideas and then the facts of events involving neutrons reacting with other nuclei.
This "dynamics of a more classical kind" did not require intuition for entanglement or high dimensional Hilbert spaces. But see von Neumann and the foundations of quantum statistical mechanics for examples of the latter.

Thursday, June 11, 2015

One Hundred Years of Statistical Developments in Animal Breeding

This nice review gives a history of the last 100 years in statistical genetics as applied to animal breeding (via Andrew Gelman).
One Hundred Years of Statistical Developments in Animal Breeding
(Annu. Rev. Anim. Biosci. 2015. 3:19–56 DOI:10.1146/annurev-animal-022114-110733)

Statistical methodology has played a key role in scientific animal breeding. Approximately one hundred years of statistical developments in animal breeding are reviewed. Some of the scientific foundations of the field are discussed, and many milestones are examined from historical and critical perspectives. The review concludes with a discussion of some future challenges and opportunities arising from the massive amount of data generated by livestock, plant, and human genome projects.
I've gone on and on about approximately additive genetic architecture for many human traits. These arguments are supported by the success of linear predictive models in animal breeding. But who has time to read literature outside of human genetics? Who has time to actually update priors in the face of strong evidence? ;-)

Wednesday, June 10, 2015

More GWAS hits on cognitive ability: ESHG 2015



This is a talk from ESHG 2015, which just took place in Glasgow. The abstract is older; at the talk the author reportedly described something like 70 genome-wide significant hits (from an even larger combined sample) which are most likely associated with cognitive ability. This is SSGAC ... stay tuned!
Title: C15.1 - Genome-wide association study of 200,000 individuals identifies 18 genome-wide significant loci and provides biological insight into human cognitive function

Keywords: Educational attainment; genome-wide association; cognitive function

Authors: T. Esko1,2,3, on the behalf of Social Science Genetic Association Consortium (SSGAC); 1Estonian Genome Center, University of Tartu, Tartu, Estonia, 2Boston Children’s Hospital, Boston, MA, United States, 3Broad Institute of Harvard and MIT, Cambridge, MA, United States.

Abstract: Educational attainment, measured as years of schooling, is commonly used as a proxy for cognitive function. A recent genome-wide association study (GWAS) of educational attainment conducted in a discovery sample of 100,000 individuals identified and replicated three genome-wide significant loci. Here, we report preliminary results based on analyses conducted in 200,000 individuals. We replicate the previous three loci and report 15 novel, genome-wide significant loci for educational attainment. A polygenic score composed of 18 single nucleotide polymorphisms, one from each locus, explains ~0.4% of the variance in educational attainment. Applying data-driven computational tools, we find that genes in loci that reach nominal significance (P < 5.0 × 10^−5) strongly enrich for 11 groups of biological pathways (false discovery rates < 0.05) mostly related to the central nervous system, including dendritic spine morphogenesis (P = 1.2 × 10^−7), axon guidance (P = 5.8 × 10^−6) and synapse organization (P = 1.7 × 10^−5), and show enriched expression in various brain areas, including hippocampus, limbic system, cerebral and entorhinal cortex. We also prioritized genes in associated loci and found that several loci are known to harbor genes related to intellectual disability (SMARCA2, MAPT), obesity (RBFOX3, SLITRK5), and schizophrenia (GRIN2A), among others. By pointing at specific genes, pathways and brain areas, our work provides novel biological insights into several facets of human cognitive function.
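For concreteness, a polygenic score of the kind described is just a weighted allele count; here is a minimal sketch with hypothetical weights (the SSGAC's actual SNPs and effect sizes are not reproduced here):

```python
import numpy as np

# Minimal polygenic-score sketch: score = allele counts weighted by GWAS
# effect sizes. The 18 weights below are hypothetical placeholders, not
# the SSGAC's actual loci. With ~0.4% of variance explained, individual
# predictions are essentially noise; the value is in group-level analyses.
rng = np.random.default_rng(0)
n_snps = 18
betas = rng.normal(0, 0.02, n_snps)        # hypothetical effect sizes
genotypes = rng.integers(0, 3, n_snps)     # 0/1/2 copies of effect allele

score = genotypes @ betas
print(f"polygenic score: {score:.4f}")
```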

Sparsity estimates for complex traits

Note the estimate of a few thousand to ten thousand causal SNP variants, consistent with my estimates for height and cognitive ability.

Sparsity (number of causal variants), along with heritability, determines the amount of data necessary to "solve" a specific trait. See Genetic architecture and predictive modeling of quantitative traits.

T1D looks like it could be cracked with only a limited amount of data.
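Rough arithmetic (my own extrapolation, combining the sparsity estimates below with the N ~ 30s rule of thumb from the talk linked above):

```python
# Rough sample-size estimates from the N ~ 30*s phase-transition rule of
# thumb; s values span the paper's 2,633-9,411 range of sparsity estimates.
for s in (2_633, 5_000, 9_411):
    print(f"s = {s:>5,} causal SNPs  ->  N ~ {30 * s:>7,} genotypes")
```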
Simultaneous Discovery, Estimation and Prediction Analysis of Complex Traits Using a Bayesian Mixture Model

PLoS Genet 11(4): e1004969. doi:10.1371/journal.pgen.1004969

Gene discovery, estimation of heritability captured by SNP arrays, inference on genetic architecture and prediction analyses of complex traits are usually performed using different statistical models and methods, leading to inefficiency and loss of power. Here we use a Bayesian mixture model that simultaneously allows variant discovery, estimation of genetic variance explained by all variants and prediction of unobserved phenotypes in new samples. We apply the method to simulated data of quantitative traits and Wellcome Trust Case Control Consortium (WTCCC) data on disease and show that it provides accurate estimates of SNP-based heritability, produces unbiased estimators of risk in new samples, and that it can estimate genetic architecture by partitioning variation across hundreds to thousands of SNPs. We estimated that, depending on the trait, 2,633 to 9,411 SNPs explain all of the SNP-based heritability in the WTCCC diseases. The majority of those SNPs (>96%) had small effects, confirming a substantial polygenic component to common diseases. The proportion of the SNP-based variance explained by large effects (each SNP explaining 1% of the variance) varied markedly between diseases, ranging from almost zero for bipolar disorder to 72% for type 1 diabetes. Prediction analyses demonstrate that for diseases with major loci, such as type 1 diabetes and rheumatoid arthritis, Bayesian methods outperform profile scoring or mixed model approaches.
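The mixture prior at the heart of such models can be illustrated in a few lines. Here is a simplified two-component ("spike and slab") version of the posterior inclusion probability for a single SNP; the paper's model uses a richer mixture of normals, and these parameter values are illustrative only:

```python
import numpy as np
from scipy import stats

# Point-normal mixture prior sketch: each SNP effect is exactly zero with
# probability 1-pi, or drawn from N(0, sigma_b^2) with probability pi.
# Given a marginal estimate beta_hat with standard error se, the posterior
# inclusion probability (PIP) follows from Bayes' rule. Illustrative only.
def inclusion_prob(beta_hat, se, pi=0.01, sigma_b=0.1):
    like_slab = stats.norm.pdf(beta_hat, 0, np.hypot(se, sigma_b))
    like_spike = stats.norm.pdf(beta_hat, 0, se)
    return pi * like_slab / (pi * like_slab + (1 - pi) * like_spike)

print(inclusion_prob(0.25, 0.05))   # strong signal -> PIP near 1
print(inclusion_prob(0.05, 0.05))   # weak signal   -> PIP stays small
```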
Table S5 below gives estimates of sparsity for various disease conditions.


[Table S5 (figure omitted): sparsity estimates for Coronary artery disease (CAD), Type 1 diabetes (T1D), Type 2 diabetes (T2D), Crohn's disease (CD), Hypertension (HT), Bipolar disorder (BD), and Rheumatoid arthritis (RA).]

Replication and cumulative knowledge in life sciences

See Ioannidis at MSU for video discussion of related topics with the leading researcher in this area, and also Medical Science? Is Science Self-Correcting?
The Economics of Reproducibility in Preclinical Research (PLoS Biology)

Abstract: Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible—in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.
From the introduction:
Much has been written about the alarming number of preclinical studies that were later found to be irreproducible [1,2]. Flawed preclinical studies create false hope for patients waiting for lifesaving cures; moreover, they point to systemic and costly inefficiencies in the way preclinical studies are designed, conducted, and reported. Because replication and cumulative knowledge production are cornerstones of the scientific process, these widespread accounts are scientifically troubling. Such concerns are further complicated by questions about the effectiveness of the peer review process itself [3], as well as the rapid growth of postpublication peer review (e.g., PubMed Commons, PubPeer), data sharing, and open access publishing that accelerate the identification of irreproducible studies [4]. Indeed, there are many different perspectives on the size of this problem, and published estimates of irreproducibility range from 51% [5] to 89% [6] (Fig 1). Our primary goal here is not to pinpoint the exact irreproducibility rate, but rather to identify root causes of the problem, estimate the direct costs of irreproducible research, and to develop a framework to address the highest priorities. Based on examples from within life sciences, application of economic theory, and reviewing lessons learned from other industries, we conclude that community-developed best practices and standards must play a central role in improving reproducibility going forward. ...

Tuesday, June 09, 2015

Whither the World Island?


Alfred W. McCoy, Professor of History at the University of Wisconsin-Madison, writes on global geopolitics. The brief excerpts below do not do the essay justice.
Geopolitics of American Global Decline: Washington Versus China in the Twenty-First Century

... On a cold London evening in January 1904, Sir Halford Mackinder, the director of the London School of Economics, “entranced” an audience at the Royal Geographical Society on Savile Row with a paper boldly titled “The Geographical Pivot of History.” This presentation evinced, said the society’s president, “a brilliancy of description… we have seldom had equaled in this room.”

Mackinder argued that the future of global power lay not, as most British then imagined, in controlling the global sea lanes, but in controlling a vast land mass he called “Euro-Asia.” By turning the globe away from America to place central Asia at the planet’s epicenter, and then tilting the Earth’s axis northward just a bit beyond Mercator’s equatorial projection, Mackinder redrew and thus reconceptualized the world map.

His new map showed Africa, Asia, and Europe not as three separate continents, but as a unitary land mass, a veritable “world island.” Its broad, deep “heartland” — 4,000 miles from the Persian Gulf to the Siberian Sea — was so enormous that it could only be controlled from its “rimlands” in Eastern Europe or what he called its maritime “marginal” in the surrounding seas.

... “We didn’t push the Russians to intervene [in Afghanistan],” Brzezinski said in 1998, explaining his geopolitical masterstroke in this Cold War edition of the Great Game, “but we knowingly increased the probability that they would… That secret operation was an excellent idea. Its effect was to draw the Russians into the Afghan trap.”

Asked about this operation’s legacy when it came to creating a militant Islam hostile to the U.S., Brzezinski, who studied and frequently cited Mackinder, was coolly unapologetic. “What is most important to the history of the world?” he asked. “The Taliban or the collapse of the Soviet empire? Some stirred-up Moslems or the liberation of Central Europe and the end of the Cold War?”

... After decades of quiet preparation, Beijing has recently begun revealing its grand strategy for global power, move by careful move. Its two-step plan is designed to build a transcontinental infrastructure for the economic integration of the world island from within, while mobilizing military forces to surgically slice through Washington’s encircling containment.

The initial step has involved a breathtaking project to put in place an infrastructure for the continent’s economic integration. By laying down an elaborate and enormously expensive network of high-speed, high-volume railroads as well as oil and natural gas pipelines across the vast breadth of Eurasia, China may realize Mackinder’s vision in a new way. For the first time in history, the rapid transcontinental movement of critical cargo — oil, minerals, and manufactured goods — will be possible on a massive scale, thereby potentially unifying that vast landmass into a single economic zone stretching 6,500 miles from Shanghai to Madrid. In this way, the leadership in Beijing hopes to shift the locus of geopolitical power away from the maritime periphery and deep into the continent’s heartland.

... To capitalize such staggering regional growth plans, in October 2014 Beijing announced the establishment of the Asian Infrastructure Investment Bank. China’s leadership sees this institution as a future regional and, in the end, Eurasian alternative to the U.S.-dominated World Bank. So far, despite pressure from Washington not to join, 14 key countries, including close U.S. allies like Germany, Great Britain, Australia, and South Korea, have signed on. Simultaneously, China has begun building long-term trade relations with resource-rich areas of Africa, as well as with Australia and Southeast Asia, as part of its plan to economically integrate the world island.

... Lacking the geopolitical vision of Mackinder and his generation of British imperialists, America’s current leadership has failed to grasp the significance of a radical global change underway inside the Eurasian land mass. If China succeeds in linking its rising industries to the vast natural resources of the Eurasian heartland, then quite possibly, as Sir Halford Mackinder predicted on that cold London night in 1904, “the empire of the world would be in sight.”

Hmm... where have I seen this before?
Chung Kuo is a series of science fiction novels written by David Wingrove. The novels present a future history of an Earth dominated by China. ... Chung Kuo is primarily set 200 years in the future in mile-high, continent-spanning cities made of a super-plastic called 'ice'. Housing a global population of 40 billion, the cities are divided into 300 levels and success and prestige is measured by how far above the ground one lives. ... The ruling classes – who base their rule on the customs and fashions of imperial China – maintain traditional palaces and courts both on Earth and in geostationary orbit. There are also Martian research bases and the outer colonies, with their mining planets.

Friday, June 05, 2015

Game of Thrones at the Oxford Union



The three shows I've been following in recent years are Game of Thrones, Silicon Valley, and Mad Men (now over). Some of the Amazon Prime pilots I've seen look promising, like The Man in the High Castle.

Monday, June 01, 2015

James Simons interview

A great interview with Jim Simons. From academic mathematics to code breaking to financial markets :-)

Saturday, May 30, 2015

Americans in China

Featuring Evan Osnos, Kaiser Kuo, and Jeremy Goldkorn (of the Sinica podcast).


Sunday, May 24, 2015

John Nash, dead at 86



The original title of this post was For this you won a Nobel (Memorial) Prize? But see sad news at bottom.
A Beautiful Mind: Nash went to see von Neumann a few days after he passed his generals. He wanted, he had told the secretary cockily, to discuss an idea that might be of interest to Professor von Neumann. It was a rather audacious thing for a graduate student to do. Von Neumann was a public figure, had very little contact with Princeton graduate students outside of occasional lectures, and generally discouraged them from seeking him out with their research problems. But it was typical of Nash, who had gone to see Einstein the year before with the germ of an idea.

Von Neumann was sitting at an enormous desk, looking more like a prosperous bank president than an academic in his expensive three-piece suit, silk tie, and jaunty pocket handkerchief.  He had the preoccupied air of a busy executive. At the time, he was holding a dozen consultancies, "arguing the ear off Robert Oppenheimer" over the development of the H-bomb, and overseeing the construction and programming of two prototype computers. He gestured Nash to sit down. He knew who Nash was, of course, but seemed a bit puzzled by his visit.

He listened carefully, with his head cocked slightly to one side and his fingers tapping. Nash started to describe the proof he had in mind for an equilibrium in games of more than two players. But before he had gotten out more than a few disjointed sentences, von Neumann interrupted, jumped ahead to the yet unstated conclusion of Nash's argument, and said abruptly, "That's trivial, you know. That's just a fixed point theorem." 
See also What use is game theory? Compare the excerpt below about Nash's Embedding Theorem (also of interest: Theorem proving machines).
A Beautiful Mind: Nash's theorem stated that any kind of surface that embodied a special notion of smoothness can actually be embedded in Euclidean space. He showed that you could fold the manifold like a silk handkerchief, without distorting it. Nobody would have expected Nash's theorem to be true. In fact, everyone would have expected it to be false. "It showed incredible originality," said Mikhail Gromov, the geometer whose book Partial Differential Relations builds on Nash's work. He went on:
Many of us have the power to develop existing ideas. We follow paths prepared by others. But most of us could never produce anything comparable to what Nash produced. It's like lightning striking. Psychologically the barrier he broke is absolutely fantastic. He has completely changed the perspective on partial differential equations. There has been some tendency in recent decades to move from harmony to chaos. Nash says chaos is just around the corner. 
John Conway, the Princeton mathematician who discovered surreal numbers and invented the game of Life, called Nash's result "one of the most important pieces of mathematical analysis in this century."
In writing this post, I googled "a beautiful mind" to find a link to the Amazon page. I was shocked to find a news article about the death of John Nash and his wife Alicia (both are in the photo above) yesterday in a car accident! May they rest in peace.

Saturday, May 23, 2015

Ioannidis at MSU

These videos are from an interview I did with John Ioannidis when he visited Michigan State earlier this month. The whole thing (29 min) and more short clips are available here.


Is 85% of NIH funding wasted?





Early candidate-gene associations rarely replicated, but GWAS hits do.





The flyer for his talk:

Friday, May 22, 2015

Genetic architecture and predictive modeling of quantitative traits



As an experiment I recorded this video using slides from a talk I gave last week at NIH. I will be giving similar talks later this spring/summer at Human Longevity Inc. and BGI. The commonality between these institutions is that all three are on the road to accumulating a million human genomes. Who will get there first?

Recording the video was easy using Keynote, although it's a bit odd to talk to yourself for an hour. I recommend that everyone do this, in order to reach a much larger audience than can fit in a lecture hall :-)

Genetic architecture and predictive modeling of quantitative traits

I discuss the application of Compressed Sensing (L1-penalized optimization or LASSO) to genomic prediction. I show that matrices comprised of human genomes are good compressed sensors, and that LASSO applied to genomic prediction exhibits a phase transition as the sample size is varied. When the sample size crosses the phase boundary complete identification of the subspace of causal variants is possible. For typical traits of interest (e.g., with heritability ~ 0.5), the phase boundary occurs at N ~ 30s, where s (sparsity) is the number of causal variants. I give some estimates of sparsity associated with complex traits such as height and cognitive ability, which suggest s ~ 10k. In practical terms, these results imply that powerful genomic prediction will be possible for many complex traits once ~ 1 million genotypes are available for analysis.
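Here is a toy version of the phase-transition experiment, with random Gaussian matrices standing in for genotype matrices and an arbitrary penalty (illustrative only; the talk analyzes real genomes and the scaling in detail):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy compressed-sensing phase transition: as the sample size n crosses
# roughly 30*s, the LASSO support starts to capture all s causal variants.
rng = np.random.default_rng(0)
p, s = 2_000, 20                      # markers, causal variants
beta = np.zeros(p)
beta[rng.choice(p, s, replace=False)] = rng.normal(0, 1, s)

for n in (100, 300, 600, 1_200):      # sample sizes crossing the boundary
    X = rng.normal(size=(n, p))       # stand-in for a genotype matrix
    y = X @ beta + 0.5 * rng.normal(size=n)
    coef = Lasso(alpha=0.05, max_iter=10_000).fit(X, y).coef_
    found = set(np.flatnonzero(coef)) >= set(np.flatnonzero(beta))
    print(f"n = {n:>5}: all causal variants selected: {found}")
```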

Thursday, May 21, 2015

Fifty years of twin studies


The most interesting aspect of these results is that for many traits there is no detectable non-additivity. That is, gene-gene interactions seem to be insignificant, and a simple linear genetic architecture is consistent with the results.
Meta-analysis of the heritability of human traits based on fifty years of twin studies
Nature Genetics (2015) doi:10.1038/ng.3285

Despite a century of research on complex traits in humans, the relative importance and specific nature of the influences of genes and environment on human traits remain controversial. We report a meta-analysis of twin correlations and reported variance components for 17,804 traits from 2,748 publications including 14,558,903 partly dependent twin pairs, virtually all published twin studies of complex traits. Estimates of heritability cluster strongly within functional domains, and across all traits the reported heritability is 49%. For a majority (69%) of traits, the observed twin correlations are consistent with a simple and parsimonious model where twin resemblance is solely due to additive genetic variation. The data are inconsistent with substantial influences from shared environment or non-additive genetic variation. This study provides the most comprehensive analysis of the causes of individual differences in human traits thus far and will guide future gene-mapping efforts.
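The consistency check behind that claim is simple: under a purely additive model the MZ twin correlation should be about twice the DZ correlation, with no room left for shared environment. A textbook Falconer-style sketch (not the paper's actual estimation machinery):

```python
# Falconer-style ACE decomposition from twin correlations (textbook logic).
# Under a purely additive model, r_MZ = a^2 and r_DZ = a^2 / 2, so twin
# data with r_MZ ~ 2 * r_DZ imply negligible shared environment (c^2 ~ 0).
def ace(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)      # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz        # shared environment
    e2 = 1 - r_mz               # unshared environment + measurement error
    return a2, c2, e2

# Correlations consistent with the paper's ~49% mean reported heritability
# and purely additive twin resemblance (r_MZ = 2 * r_DZ):
print(ace(r_mz=0.49, r_dz=0.245))   # ~ (0.49, 0.0, 0.51)
```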
See also Additivity and complex traits in mice:
You may have noticed that I am gradually collecting copious evidence for (approximate) additivity. Far too many scientists and quasi-scientists are infected by the epistasis or epigenetics meme, which is appealing to those who "revel in complexity" and would like to believe that biology is too complex to succumb to equations. ("How can it be? But what about the marvelous incomprehensible beautiful sacred complexity of Nature? But But But ...")

I sometimes explain things this way:

There is a deep evolutionary reason behind additivity: nonlinear mechanisms are fragile and often "break" due to DNA recombination in sexual reproduction. Effects which are only controlled by a single locus are more robustly passed on to offspring. ...

Many people confuse the following statements:

"The brain is complex and nonlinear and many genes interact in its construction and operation."

"Differences in brain performance between two individuals of the same species must be due to nonlinear (non-additive) effects of genes."

The first statement is true, but the second does not appear to be true across a range of species and quantitative traits.
On the genetic architecture of intelligence and other quantitative traits (p.16):
... The preceding discussion is not intended to convey an overly simplistic view of genetics or systems biology. Complex nonlinear genetic systems certainly exist and are realized in every organism. However, quantitative differences between individuals within a species may be largely due to independent linear effects of specific genetic variants. As noted, linear effects are the most readily evolvable in response to selection, whereas nonlinear gadgets are more likely to be fragile to small changes. (Evolutionary adaptations requiring significant changes to nonlinear gadgets are improbable and therefore require exponentially more time than simple adjustment of frequencies of alleles of linear effect.) One might say that, to first approximation, Biology = linear combinations of nonlinear gadgets, and most of the variation between individuals is in the (linear) way gadgets are combined, rather than in the realization of different gadgets in different individuals.

Linear models work well in practice, allowing, for example, SNP-based prediction of quantitative traits (milk yield, fat and protein content, productive life, etc.) in dairy cattle. ...

Wednesday, May 20, 2015

Imperial exams and human capital

The dangers of rent-seeking and the educational signaling trap. Although the imperial examinations were probably g-loaded (and hence supplied the bureaucracy with talented administrators for hundreds of years), it would have been better to examine candidates on useful knowledge, which every participant would then acquire to some degree.

See also Les Grandes Ecoles Chinoises and History Repeats.
Farewell to Confucianism: The Modernizing Effect of Dismantling China’s Imperial Examination System

Ying Bai
The Hong Kong University of Science and Technology

Imperial China employed a civil examination system to select scholar bureaucrats as ruling elites. This institution dissuaded high-performing individuals from pursuing some modernization activities, such as establishing modern firms or studying overseas. This study uses prefecture-level panel data from 1896-1910 to compare the effects of the chance of passing the civil examination on modernization before and after the abolition of the examination system. Its findings show that prefectures with higher quotas of successful candidates tended to establish more modern firms and send more students to Japan once the examination system was abolished. As higher quotas were assigned to prefectures that had an agricultural tax in the Ming Dynasty (1368-1643) of more than 150,000 stones, I adopt a regression discontinuity design to generate an instrument to resolve the potential endogeneity, and find that the results remain robust.
From the paper:
Rent seeking is costly to economic growth if “the ablest young people become rent seekers [rather] than producers” (Murphy, Shleifer, and Vishny 1991: 529). Theoretical studies suggest that if a society specifies a higher payoff for rent seeking rather than productive activities, more talent would be allocated in unproductive directions (Acemoglu 1995; Baumol 1990; Murphy, Shleifer, and Vishny 1991, 1993). This was the case in late Imperial China, when a large part of the ruling class – scholar bureaucrats – was selected on the basis of the imperial civil examination [1]. The Chinese elites were provided with great incentives to invest in a traditional education and take the civil examination, and hence few incentives to study other “useful knowledge” (Kuznets 1965), such as Western science and technology [2]. Thus the civil examination constituted an institutional obstacle to the rise of modern science and industry (Baumol 1990; Clark and Feenstra 2003; Huff 2003; Lin 1995).

This paper identifies the negative incentive effect of the civil exam on modernization by exploring the impact of the system’s abolition in 1904-05. The main empirical difficulty is that the abolition was universal, with no regional variation in policy implementation. To better understand the modernizing effect of the system’s abolition, I employ a simple conceptual framework that incorporates two choices open to Chinese elites: to learn from the West and pursue some modernization activities or to invest in preparing for the civil examination. In this model, the elites with a greater chance of passing the examination would be less likely to learn from the West; they would tend to pursue more modernization activities after its abolition. Accordingly, the regions with a higher chance of passing the exam should be those with a larger increase in modernization activities after the abolition, which makes it possible to employ a difference-in-differences (DID) method to identify the causal effect of abolishing the civil examination on modernization.

I exploit the variation in the probability of passing the examination among prefectures – an administrative level between the provincial and county levels. To control the regional composition of successful candidates, the central government of the Qing dynasty (1644-1911) allocated a quota of successful candidates to each prefecture [3]. In terms of the chances of individual participants – measured by the ratio of quotas to population – there were great inequalities among the regions (Chang 1955). To measure the level of modernization activities in a region, I employ (1) the number of new modern private firms (per million inhabitants) above a designated size and equipped with steam engines or electricity, as a proxy for the adoption of Western technology, and (2) the number of new Chinese students in Japan – the most important host country of Chinese overseas students – (per million inhabitants), as a proxy for learning Western science. Though the two measures might capture other things, for instance entrepreneurship or human capital accumulation, the two activities are both intensive in modern science and technology, and are thus employed as proxies of modernization. ...
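The difference-in-differences logic is easy to mock up on synthetic data. A minimal sketch (hypothetical numbers; the paper's actual specification also includes fixed effects, controls, and the regression-discontinuity instrument):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal difference-in-differences sketch of the identification idea
# (synthetic data, illustrative only): prefectures with higher exam quotas
# show a larger post-abolition jump in modernization activity, so the
# coefficient on quota:post recovers the effect of interest.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "quota": rng.uniform(0, 1, n),     # chance of passing the exam
    "post": rng.integers(0, 2, n),     # 1 = after the 1904-05 abolition
})
df["firms"] = (0.2 + 1.5 * df.quota * df.post   # true DID effect = 1.5
               + 0.3 * df.quota + 0.1 * df.post
               + rng.normal(0, 0.2, n))

print(smf.ols("firms ~ quota * post", df).fit().params)
```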
From Credentialism and elite employment:
Evaluators relied so intensely on “school” as a criterion of evaluation not because they believed that the content of elite curricula better prepared students for life in their firms – in fact, evaluators tended to believe that elite and, in particular, super-elite instruction was “too abstract,” “overly theoretical,” or even “useless” compared to the more “practical” and “relevant” training offered at “lesser” institutions – but rather due to the strong cultural meanings and character judgments evaluators attributed to admission and enrollment at an elite school. I discuss the meanings evaluators attributed to educational prestige in their order of prevalence among respondents. ...

Saturday, May 16, 2015

The Grisly Folk


H.G. Wells on the first encounters between modern humans and Neanderthals. See also The Neanderthal problem and Neanderthals dumb?
The Grisly Folk: ... But one may doubt if the first human group to come into the grisly land was clever enough to solve the problems of the new warfare. Maybe they turned southward again to the gentler regions from which they had come, and were killed by or mingled with their own brethren again. Maybe they perished altogether in that new land of the grisly folk into which they had intruded. Yet the truth may be that they even held their own and increased. If they died there were others of their kind to follow them and achieve a better fate.

That was the beginning of a nightmare age for the little children of the human tribe. They knew they were watched.

Their steps were dogged. The legends of ogres and man-eating giants that haunt the childhood of the world may descend to us from those ancient days of fear. And for the Neandertalers it was the beginning of an incessant war that could end only in extermination.

The Neandertalers, albeit not so erect and tall as men, were the heavier, stronger creatures, but they were stupid, and they went alone or in twos and threes; the menfolk were swifter, quicker-witted, and more social — when they fought they fought in combination. They lined out and surrounded and pestered and pelted their antagonists from every side. They fought the men of that grisly race as dogs might fight a bear. They shouted to one another what each should do, and the Neandertaler had no speech; he did not understand. They moved too quickly for him and fought too cunningly.

Many and obstinate were the duels and battles these two sorts of men fought for this world in that bleak age of the windy steppes, thirty or forty thousand years ago. The two races were intolerable to each other. They both wanted the caves and the banks by the rivers where the big flints were got. They fought over the dead mammoths that had been bogged in the marshes, and over the reindeer stags that had been killed in the rutting season. When a human tribe found signs of the grisly folk near their cave and squatting place, they had perforce to track them down and kill them; their own safety and the safety of their little ones was only to be secured by that killing. The Neandertalers thought the little children of men fair game and pleasant eating. ...
Razib Khan discusses other examples from this genre.

The ravages of time

These make me happy and sad at the same time.





Tuesday, May 12, 2015

The view from here: vast and mysterious



An even better visualization would be the state vector of our universe rotating in a vast Hilbert space. Too bad my brain can't picture it!

Monday, May 11, 2015

New kids on the blockchain



WSJ reports on institutional interest in blockchain technologies.
WSJ: Nasdaq OMX Group Inc. is testing a new use of the technology that underpins the digital currency bitcoin, in a bid to transform the trading of shares in private companies.

The experiment joins a slew of financial-industry forays into bitcoin-related technology. If the effort is deemed successful, Nasdaq wants to use so-called blockchain technology in its stock market, one of the world’s largest, and potentially shake up systems that have facilitated the trading of financial assets for decades. ...

The blockchain is maintained, updated and verified by a vast global network of independently owned computers known as “miners” that collectively work to prove the ledger’s authenticity.

In theory, this decentralized system for verifying information means transactions need no longer be channeled through banks, clearinghouses and other middlemen. Advocates say this “trustless” structure means direct transfers of ownership can occur over the blockchain almost instantaneously without the risk of default or manipulation by an intermediating third party.

One idea is that encrypted, digital representations of share certificates could be inserted into minute bitcoin transactions known as “Satoshis,” facilitating an immediate, verifiable transfer of stock ownership from seller to buyer.

Still, bitcoin-based settlement remains untested in the real world. Regulators worry about the anonymous status of the bitcoin miners that collectively manage the system. It is conceivable that bad actors might one day take over the mining network and destroy the integrity of its verification system, some say.

... Real-time settlement has been a goal of regulators and investors alike as it would reduce the risk of counterparty failure and free up billions of dollars of capital that is sidelined during that wait period.

Oliver Bussmann, chief information officer of Swiss bank UBS AG, last year said the blockchain was the biggest disrupting force in the financial sector, meaning its success could potentially have far-reaching ramifications for banks, trading houses and others. His bank has since established a special blockchain lab to study uses of the technology.

Nasdaq named Fredrik Voss, a vice president, as its new “blockchain technology evangelist” to lead efforts to increase use of the technology.
See my earlier discussion Crypto-currencies, Bitcoin and Blockchain. For these kinds of applications I think miners should not be the primary mechanism for blockchain verification. The bank or exchange should do it itself and then post a large bounty (e.g., $50 million) to anyone who finds an error in the publicly available blockchain. Of course, in these scenarios it would be nice to have more functionality than just transfers (which is all Bitcoin can do now). It would be trivial to encode options, derivatives contracts, conditional agreements, etc. in the blockchain. See Ethereum.
One interesting scenario is for a country (Singapore? Denmark?) or large financial entity (Goldman, JPM, Visa) to issue its own crypto currency, managing the blockchain itself but leaving it in the public domain so that third parties (including regulators) can verify transactions. Confidence in this kind of "Institutional Coin" (IC) would be high from the beginning. An IC with Ethereum-like capabilities could revolutionize the financial industry. In place of an opaque web of counterparty relationships leading to systemic risk, the IC blockchain would be easily audited by machine. Regulators would require that the IC authority know its customers, so pseudonymity would only be partial.
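In the simplest terms, the public audit described above is a hash-chain check. A minimal sketch (illustrative only; real block formats, signatures, and transaction schemas are far richer):

```python
import hashlib
import json

# Minimal sketch of auditing a published hash-chained ledger (illustrative
# only -- not Bitcoin's actual block format or any real exchange's schema).
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def audit(chain):
    """Anyone holding the public chain can verify every link."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False   # a broken link would earn the posted bounty
    return True

genesis = {"prev_hash": None, "tx": [["issue", "ACME", 1_000_000]]}
b1 = {"prev_hash": block_hash(genesis), "tx": [["transfer", "A->B", 100]]}
print(audit([genesis, b1]))   # True; tamper with genesis and it flips
```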

Saturday, May 09, 2015

Our Kids and Coming Apart



Nick Lemann reviews Our Kids: The American Dream in Crisis by Robert D. Putnam. At the descriptive level, Putnam's conclusions seem very similar to those of Charles Murray in Coming Apart.

Of course, description is much easier to obtain than causality.
NYBooks: ... By the logic of the book, access to social capital ought to be strongly associated with going to college and doing well there—otherwise, why stress it so strongly? The syllogism would be: social capital leads to educational attainment, which leads to mobility. But for his classmates, Putnam reports, academic achievement was the factor most predictive of college attendance, and the link between such achievement and parental encouragement (of the kind he has copiously praised in the main body of the book) was only “modestly important,” and “much weaker” than the link between class rank and college attendance. Not only that:
No other measure of parental affluence or family structure or neighborhood social capital (or indeed anything else we had measured)—none of the factors that this book has shown are so important in producing today’s opportunity gap—had any appreciable effect on college attendance or other educational attainment.
In the methods appendix, Putnam refers readers to his website for more detail on his findings about his classmates. There, he writes:
No measure of parental resources adds any predictive power whatsoever—not parental occupational status, not parental unemployment, not family economic insecurity during high school, not homeownership, not neighborhood characteristics, and not family structure…. Parental education, parental encouragement, and class rank were all modestly predictive of extracurricular participation, but holding constant those variables, extracurricular participation itself was unrelated to college-going.
So is it really the case that Putnam has shown that strong social capital once produced individual opportunity—let alone that the deterioration of social capital has produced what he calls the opportunity gap? The passages I just quoted seem to indicate that the strong association between social capital and opportunity that is Putnam’s core assertion has not been proven. Putnam doesn’t define “social capital” precisely enough to rigorously test its effects, even on as small and unrepresentative a sample as the one in his survey, and he doesn’t attempt to test its effects precisely in the present. It could even be that, rather than social capital generating prosperity, prosperity might generate social capital, which would mean Putnam has been showing us the effects of inequality, not the causes.
It seems possible to me that:

1. American society has become increasingly meritocratic in the last 50 years, with advancement more and more dependent on largely heritable attributes such as cognitive ability, conscientiousness, future time orientation, etc. Consequently, gaps between different SES groups have become more and more difficult to remediate.

2. External forces, such as automation and global economic competition, have placed a larger and larger premium on attributes such as those listed above, leaving Americans of below average ability at a severe disadvantage.

The consequences of these observations are exacerbated by an increasingly winner-take-all economic system.

If these points are correct, then Our Kids and Coming Apart are documenting consequences, not causes.

See also Income, Wealth and IQ, US Economic Mobility, and Random microworlds: the mystery of nonshared environment.
