Oct. 5th, 2011

rfmcdonald: (photo)

Remember when it was so warm, and when we were so young?
rfmcdonald: (Default)
Clifford's essay, at Love and Fiction, about the perils of striving too hard for authenticity is a near-perfect summation of my thinking on that. Our late modern/post-modern era is an era that promiscuously mixes up everything. Purism for purism's sake doesn't make sense.

North Americans are not exactly alone in this desire for authenticity, but it's not universal either. The Chinese, for instance, don't seem to be as troubled by it. My dad worked as a government scientist, and he went through a phase where he was always hosting dignitaries from Taiwan. I don't know why; I think there was some talk of building a semiconductor plant or something. Anyway, these visitors would always want Chinese food when they went out for dinner. Dad would always take them to the Mandarin Ogilvie, a restaurant I can personally recommend if you're in the area.

One time he took a Japanese visitor to a fancy French Canadian restaurant in Gatineau. By chance, a young chef from Japan was training in the kitchen, and there was a fancy Japanese dish on the menu. You can guess what the visitor ordered; apparently he was ecstatic. It was his favourite food, so he ate it. He was apparently uninterested in trying the local cuisine (or, alternatively, in pretending to himself that he was something he was not).

None of this is to say Asians never like to eat local food when they go on a trip, any more than I am trying to say American tourists never eat McDonald's on their vacations (and as a side note, if you've never eaten Western fast food on a trip to Asia, all that means is you didn't stay there very long). Nor am I suggesting that you shouldn't try to "get off the beaten path" or to see things from the local perspective when you go on a trip. If you're going to bother paying the money and burning the fossil fuels to get to some foreign land, there isn't much sense in just doing the things you'd do at home.

But at the same time, as William Shatner so ably pointed out, I do think we should remember that there is no authentic way to be anything other than to be yourself. This was the one overwhelming lesson I learned from all my travel; that wherever I went, I was always a stranger to them. Wherever I went, I was still me.

My authentic life is what I live every day. Whether you like it or not, when you go to Starbucks, work at your job, and, you know, live your actual life, you're being authentic right there.

Things can be "good" or "bad", or "boring" or "fun", or "tasty" or "bland", but there's very little you can do that isn't real. I think we should remember that this moment, the one we're having right now, is as "authentic" as anything else. That our real lives are something someone else would take a trip to see and that our food is so strange to them they might not eat it. That before our eyes is an assortment of miracles.


Go, read.
rfmcdonald: (Default)
Razib Khan's post at Discover's GNXP takes a look at the origins of Homo sapiens. This is a surprisingly complicated question, especially now that it's well known that modern humans carry contributions from at least two highly distinctive hominid populations, the Neandertals and the recently discovered Denisovans.

One of the terms in paleoanthropology which can confuse is that of archaic Homo sapiens (AHS). This is in contrast to anatomically modern humans (AMH). A simple Out of Africa “recent-origin-with-replacement” model allowed to sidestep the semantic imprecision in tossing disparate populations into a generic category such as AHS (similarly, the term “animal” as opposed to “human” has some colloquial utility, but it’s not scientifically useful). But the possibility of admixture from archaic lineages in modern human populations forces us to grapple with the dichotomy between AHS and AMH, as modern humans may be a compound of these two categories (not to mention the idea of behaviorally modern humans, who are a subset of AMH).

I assume that fleshing out the details of a new paradigm which is both precise and accurate will be a project for the coming years. But before we move on we need to fix more sturdily our understanding of the genealogical relationships of contemporary human populations. Over the past few years there have been major strides in this domain, confirming the broad outline of a dominant African heritage for modern humans. Geneticists have moved from classical markers to SNP data, focusing on hundreds of thousands of genetic variants. But now they’re shifting to whole genome sequences, which with errors excepted encapsulate the totality of the lowest order aspect of human genetic variation.


Razib takes a look at the paper. The conclusion? The hunter-gatherer San of southern Africa are the most distinctive population because of their very early separation from the other human populations of the world, even other Africans.

Notably, our point estimate of ~130 kya suggests that the San divergence occurred ~2.5 times as long ago as the African-Eurasian divergence, that major human population groups diverged at least ~80,000 years before the out-of-Africa migration and that the San divergence is more than one-third as ancient as the human-Neanderthal divergence…Still, human effective population sizes are sufficiently large that these divergence times are small relative to the time required for lineages to find common ancestors in ancestral populations. Indeed, of the mutations differentiating a San individual from a Eurasian individual, only about 25% are expected to have arisen since the San divergence. Thus, the ancient divergence of the San population does not alter the essential fact that far more human variation occurs within population groups than between them.
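The ratios in the excerpt are easy to sanity-check. A back-of-envelope sketch in Python, using only the point estimates quoted above (the ~52 kya and ~390 kya figures below are derived from those ratios, not taken from the paper itself):

```python
# Back-of-envelope check of the divergence figures quoted above.
# All inputs are the paper's point estimates as given in the excerpt.

san_divergence_kya = 130

# The San split is ~2.5x as old as the African-Eurasian split...
african_eurasian_kya = san_divergence_kya / 2.5       # ~52 kya

# ...and "more than one-third as ancient" as the human-Neanderthal
# split, which bounds that split below ~3x the San figure.
human_neanderthal_upper_kya = san_divergence_kya * 3  # below ~390 kya

print(african_eurasian_kya, human_neanderthal_upper_kya)  # 52.0 390
```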


Razib continues, noting that the San/non-San and Neanderthal/AMH divergences differ in degree, not in kind.

There’s now a fair amount of data and results which indicate a deep divergence between the traditionally hunter-gatherer populations of Sub-Saharan Africa and the farmers and agro-pastoralists of the continent. If these results are correct (and they seem to be in line with the earlier genomic analysis of the Bantu and Khoisan from South Africa) then we have here a rather straightforward refutation of the very simple Out of Africa model, where one East African group replaces everyone else. The Bushmen seem to be of another lineage. Secondly, these results suggest that we should be cautious about inferring a qualitative difference between Neandertals and AMH, where the former is bracketed as “archaic”. The difference between Bushmen and non-Bushmen is far less than between Neandertal and AMH, but it is by a factor of multiples, not order of magnitude.


Fascinating. What will come next out of the labs of the geneticists and the sites of the physical anthropologists and who knows what else?
rfmcdonald: (Default)
I'd meant to comment on this very interesting story by Wired Science's Danielle Venton on basic differences in behaviour between humans and the chimpanzees that arguably constitute our most closely related primate species. Finding the real differences between human intelligence and non-human intelligences, rather than making broad sweeping judgements, is always necessary.

Might altruism be more distinctly human than we'd previously conceived?

Primate researchers, working with semi-free ranging chimpanzees at a sanctuary in Uganda, found chimpanzees recruit a helping partner only if it gets them more food than they’d get alone. The study, described in Animal Behavior, Sept. 7, is part of a current trend in primatology to unpick how motivation and mental state affects an animal’s interactions.

“It looks like motivation plays a very important role in how we behave,” said Anke Bullinger, primary author. “And it gives a hint that even though species might be cognitively capable of doing certain things, they might not show the behavior, because they just don’t want to.”

The extent of human cooperation is unique, but not cooperation itself. Chimpanzees, bonobos, elephants, and many birds work together for joint rewards.

“The interesting thing is that there isn’t much research on the motivational aspects of this,” Bullinger said. “I suspect that motivation plays a role in many aspects of cognition, not just in cooperative behavior, but also in social learning, in communication.”

For the study, Bullinger and her colleagues set food boards out of the chimpanzee’s direct reach. To bring the banana-bearing platforms close, the chimps pulled on a rope resting on the ground. Chimpanzees had two options. One board they could pull close solo. On another board, loose rope threaded between loops. To get these boards, both ends had to be pulled, so the chimpanzee had to go get their partner, waiting in an adjoining room.

When Bullinger placed two banana pieces on the single board, and four pieces on the partner board, amounting to the same payoff for each chimpanzee, the animals chose to work alone the vast majority of the time. If another banana piece for each was added to the partner board, the chimpanzees overwhelmingly chose to collaborate.
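The decision rule the study describes comes down to comparing per-chimp payoffs. A hypothetical sketch (the function name and the even two-way split are my framing, not the study's):

```python
# Hypothetical model of the payoff comparison described above.
# Numbers come from the study write-up: 2 pieces on the solo board,
# 4 or 6 pieces on the partner board, split between the pair.

def prefers_partner(solo_pieces, partner_board_pieces):
    """Recruit a partner only if the per-chimp share beats working alone."""
    share = partner_board_pieces / 2  # reward is split between the pair
    return share > solo_pieces

print(prefers_partner(2, 4))  # equal payoff (2 vs. 2): work alone -> False
print(prefers_partner(2, 6))  # one extra piece each (3 vs. 2) -> True
```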

“We were a bit surprised that just one more piece made such a difference,” Bullinger said.

The study implies that chimpanzees view others as social tools, as a means of maximizing their own rewards.

“It seems they care about what they want,” said David Watts, a primate researcher and Yale anthropologist, not involved in the study. “They don’t get a partner involved because they enjoy that, or because they care about what their partner wants. While, in human cases, we are often motivated to get someone else involved, just for the sake of having them involved.”

If this represents a fundamental difference between humans and other animals, it could help explain why humans evolved to be better problem solvers, and use a wider range of habitats.

“If you have a species that is psychologically predisposed to work with others, and get them involved, that opens up possibilities,” Watts said. “There is a wider range of goals they can accomplish, the population can do much better ecologically and evolutionarily.”
rfmcdonald: (Default)
Extraordinary Observations' Rob Pitingolo has an interesting essay examining the dynamics of American housing markets. Why are there shortages even if supply grows? It's the demand side of the equation.

I [argued] that we can't really talk about the "housing market" because housing isn't homogeneous. People are willing to pay more for high-quality housing than they are for low-quality housing. So if you change the type and quality of housing in a neighborhood, you alter the underlying structure of the market itself, and the price at which people are willing to pay to live there, all else equal.

But let's set that aside for a moment and assume, for argument's sake, that housing is a commodity. From a theoretical perspective, we can draw the housing market as a series of simple charts, similar to what you might remember from an intro to microeconomics course.

In a neighborhood housing market, the demand curve is downward sloping. There's nothing notable about this; and in the short-run, the supply curve is a vertical line. This is consistent with the reality that in most neighborhoods you can't start construction whenever you want to. In fact, it's often a huge hassle to construct new units, for a variety of reasons, NIMBYism being one of them. So in the short-run, the supply is fixed.

The theory continues that in order to meet the high or pent-up demand, as well as make housing more affordable, we ought to increase the supply of housing. [...] If all goes well, the resulting equilibrium will be a lower price and higher quantity of housing.

But it doesn't end there, because we haven't accounted for the demand curve, which very well may shift. This could be because new amenities have popped up in the neighborhood; but also because the neighborhood is getting safer, cleaner and generally becoming a more desirable place to live. It could even be for a completely exogenous reason. Whatever the case, let's imagine that the end result is a rightward shift in the demand curve.

If the shift in the demand curve is great enough, it can completely overwhelm the shift in the supply curve, leading to an equilibrium with higher prices.

Now, I'm not using this as an argument to say we shouldn't work to increase density in cities or metro areas. But I do want to show that the simplistic "increase supply to make housing affordable" argument doesn't always hold, for theoretical reasons, especially at the neighborhood-level, and in urban areas that are already relatively dense and desirable.
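Pitingolo's argument can be sketched numerically. A minimal illustration, assuming a linear demand curve and entirely made-up numbers (none of these figures come from the essay): a supply increase lowers the price on its own, but a large enough rightward demand shift leaves the new equilibrium price higher than where it started.

```python
# Hypothetical illustration of the supply/demand story above:
# fixed short-run supply, then a supply increase overwhelmed by
# a demand shift. All numbers are invented for the example.

def demand_price(quantity, intercept, slope):
    """Price buyers will pay at a given quantity: P = intercept - slope * Q."""
    return intercept - slope * quantity

# Short-run: supply is a vertical line at 100 units.
p0 = demand_price(100, intercept=3000, slope=10)  # 3000 - 1000 = 2000

# Supply grows to 120 units; on the same demand curve, price falls.
p1 = demand_price(120, intercept=3000, slope=10)  # 3000 - 1200 = 1800

# But new amenities shift demand rightward (intercept rises to 3600):
# the larger supply now clears at a HIGHER price than before.
p2 = demand_price(120, intercept=3600, slope=10)  # 3600 - 1200 = 2400

print(p0, p1, p2)  # 2000 1800 2400
```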


Pitingolo goes into more detail in his post, and provides cool graphics, too!
rfmcdonald: (Default)
The New Yorker on Facebook just posted a Malcolm Gladwell article taking a look at Steve Jobs's role as an innovator in computers, specifically concentrating on the things he borrowed from the Xerox company's PARC research complex--the graphical user interface, the mouse. Was he stealing? No, Gladwell argues: he was implementing in the real world technologies developed in an ideal world. He expands on this by making an analogy to the Revolution in Military Affairs, the body of theory that hopes to explain how technology changes war.

The difference between direct and indirect manipulation—between three buttons and one button, three hundred dollars and fifteen dollars, and a roller ball supported by ball bearings and a free-rolling ball—is not trivial. It is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that’s appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.

In a recent study, “The Culture of Military Innovation,” the military scholar Dima Adamsky makes a similar argument about the so-called Revolution in Military Affairs. R.M.A. refers to the way armies have transformed themselves with the tools of the digital age—such as precision-guided missiles, surveillance drones, and real-time command, control, and communications technologies—and Adamsky begins with the simple observation that it is impossible to determine who invented R.M.A. The first people to imagine how digital technology would transform warfare were a cadre of senior military intellectuals in the Soviet Union, during the nineteen-seventies. The first country to come up with these high-tech systems was the United States. And the first country to use them was Israel, in its 1982 clash with the Syrian Air Force in Lebanon’s Bekaa Valley, a battle commonly referred to as “the Bekaa Valley turkey shoot.” Israel coördinated all the major innovations of R.M.A. in a manner so devastating that it destroyed nineteen surface-to-air batteries and eighty-seven Syrian aircraft while losing only a handful of its own planes.

That’s three revolutions, not one, and Adamsky’s point is that each of these strands is necessarily distinct, drawing on separate skills and circumstances. The Soviets had a strong, centralized military bureaucracy, with a long tradition of theoretical analysis. It made sense that they were the first to understand the military implications of new information systems. But they didn’t do anything with it, because centralized military bureaucracies with strong intellectual traditions aren’t very good at connecting word and deed.

The United States, by contrast, has a decentralized, bottom-up entrepreneurial culture, which has historically had a strong orientation toward technological solutions. The military’s close ties to the country’s high-tech community made it unsurprising that the U.S. would be the first to invent precision-guidance and next-generation command-and-control communications. But those assets also meant that Soviet-style systemic analysis wasn’t going to be a priority. As for the Israelis, their military culture grew out of a background of resource constraint and constant threat. In response, they became brilliantly improvisational and creative. But, as Adamsky points out, a military built around urgent, short-term “fire extinguishing” is not going to be distinguished by reflective theory. No one stole the revolution. Each party viewed the problem from a different perspective, and carved off a different piece of the puzzle.

In the history of the mouse, Engelbart was the Soviet Union. He was the visionary, who saw the mouse before anyone else did. But visionaries are limited by their visions. “Engelbart’s self-defined mission was not to produce a product, or even a prototype; it was an open-ended search for knowledge,” Michael Hiltzik writes, in “Dealers of Lightning” (1999), his wonderful history of Xerox PARC. “Consequently, no project in his lab ever seemed to come to an end.” Xerox PARC was the United States: it was a place where things got made. “Xerox created this perfect environment,” recalled Bob Metcalfe, who worked there through much of the nineteen-seventies, before leaving to found the networking company 3Com. “There wasn’t any hierarchy. We built out our own tools. When we needed to publish papers, we built a printer. When we needed to edit the papers, we built a computer. When we needed to connect computers, we figured out how to connect them. We had big budgets. Unlike many of our brethren, we didn’t have to teach. We could just research. It was heaven.”

But heaven is not a good place to commercialize a product. “We built a computer and it was a beautiful thing,” Metcalfe went on. “We developed our computer language, our own display, our own language. It was a gold-plated product. But it cost sixteen thousand dollars, and it needed to cost three thousand dollars.” For an actual product, you need threat and constraint—and the improvisation and creativity necessary to turn a gold-plated three-hundred-dollar mouse into something that works on Formica and costs fifteen dollars. Apple was Israel.

Xerox couldn’t have been I.B.M. and Microsoft combined, in other words. “You can be one of the most successful makers of enterprise technology products the world has ever known, but that doesn’t mean your instincts will carry over to the consumer market,” the tech writer Harry McCracken recently wrote. “They’re really different, and few companies have ever been successful in both.” He was talking about the decision by the networking giant Cisco System, this spring, to shut down its Flip camera business, at a cost of many hundreds of millions of dollars. But he could just as easily have been talking about the Xerox of forty years ago, which was one of the most successful makers of enterprise technology the world has ever known. The fair question is whether Xerox, through its research arm in Palo Alto, found a better way to be Xerox—and the answer is that it did, although that story doesn’t get told nearly as often.


Although the story of Apple, and Jobs, deserves to be told, too. (This can be done implicitly, mind. Would I be able to tell you any of this if not for the informatics wrought by Apple?)