Friday, January 01, 2016
Problem is, the putative intelligence responsible for nature is, alas, a stunningly peculiar species of intelligence. According to the IDers, it is unlike any intelligence anyone ever has encountered anywhere outside of theology (or science fiction). IDers propose . . . a brainless intelligence (and an omniscient one at that).
That’s right. Somehow, despite all inductive evidence to the contrary, there exists, they argue, an intelligence that requires no brain, nor even solid-state circuitry, for it to attend its business. Nature’s intelligent designer doesn’t need a body of any sort. The designer belongs to a unique class of intelligences, of which it is the sole member.
This odd construct, a disembodied supermind, derives from a faulty analogy. IDers, observing human invention, notice that nature in suggestive ways looks and acts like a high-tech machine. They reason that, since explaining machinery requires that we refer to intelligence, explaining nature must as well. This line of analogical thinking constitutes a case study in the disreputable practice known as cherry picking.
Sunday, November 15, 2015
But when I went back to re-watch the vid the other day, being not logged in, the comments for the vid did not include mine. Puzzled, I opened a second browser, went to YouTube, and logged in. And then, there was my comment.
So, displaying the same vid simultaneously in two browsers--one where I'm logged in and one where I'm not--gives me two different displays. When YouTube knows it's me, because I'm logged in, it includes my posted comment. When it doesn't know (at least by way of log in) who's watching, my comment doesn't appear.
Notice that each screen gives a different count for ALL COMMENTS. One gives five and one gives six. The latter count includes my comment. So, whether the comment shows is not a function of whether my comment is among the newest or most popular.
Given the subject matter of the show and of my comment--the machinations of the ruling class--it's tempting to believe that the selective display of the comment is a function of some algorithm's politically calculated filtering.
Anybody else notice selective editing of ALL COMMENTS by YouTube--your comments displayed in ALL COMMENTS only when you're logged in?
Sunday, October 25, 2015
But the metaphor—nature as machine—does not reveal nature as having been intelligently designed. It’s a metaphor.
Invert the assumption: Maybe it’s not the case that nature is a machine, with an implied design (as if technology were primordial and nature modeled on its principles). Maybe it’s the obvious case that nature came first, with humans being smart enough to study nature’s ways and apply that experience to building tools and towns and space stations. Nature inspires. But that does not make it an example of what it inspires.
Technicians make progress when their designs harmonize with natural law, but that does not mean that nature itself, with all its laws, was designed. Nature is just ground level, ontological bedrock. It does not come with a requirement that something supernatural, behind the scenes, got it started or propels it along.
People have no trouble seeing that certain natural phenomena seem to bear hallmarks of design—certain chunks of nature resemble humanly crafted artifacts—and so some people conclude, unjustifiably, that nature must be an intelligently designed artifact. But many things resemble things that they are not. Sometimes the similarities are striking.
A bat is in striking ways similar to a bird. Both are warm-blooded vertebrates, send out distinctive vocal signals, eat insects, flap wings to fly, congregate in social groups and so on. But an expedition in search of bat eggs will end up with egg on its face.
The God of classical theology, a deity that preceded and designed the physical universe, is a kind of bat egg.
The conclusion that bat eggs and a classical God must exist is justified if bats are birds and nature is an artifact. But if the similarity between bats and birds and between nature and high-tech is just a resemblance, then both conclusions land in the dump.
Sunday, October 18, 2015
Point is, reproductive success is taken to be an effect caused by adaptations.
This position, stated otherwise, contends that, but for the adaptations, the creature would experience reproductive failure relative to its local conspecific peers.
But we can invert the assumption: Maybe the default position ought to be that organisms normally enjoy reproductive success, absent any factors, endogenous or exogenous, that undermine that success.
Organisms are integrated wholes, adapted, but not possessing adaptations. Organicism suggests that a creature doesn’t possess adaptations any more than an atom possesses protons. A proton, or a fused bundle of them, just is an atom. Atoms typically come decked out with other particles, the neutrons and electrons, but no protons, no atoms.
Similarly, a creature does not possess anatomical and physiological adaptations. It simply is its anatomy and its physiology. It comprises them, and they compose it. No physiology or anatomy, no creature.
In the case of a bird, for example, wings might be called an adaptation, but lay a couple on the ground and not much will happen. Lay a wingless bird on the ground and not much will happen beyond the suffering and demise of the bird. Whatever gets designated as an adaptation contributes no more to the rest of the creature than the rest contributes to it. Without the wings there just is no viable “rest of the bird,” that happens not to own wings. For evolutionary purposes, there just is no critter.
So, which is it: does the presence of adaptations enable a creature to live and reproduce, or does the absence of lethal or sterilizing circumstances allow it to?
Saturday, July 18, 2015
Despite “gene” losing its power to denote, the science of genomics continues to advance and looks determined to keep doing so. It does so even as reductionism’s shortcomings become increasingly apparent, as research reveals the seemingly intractable interwovenness of the processes and subprocesses of biological metabolism.
Reductionism fails because everything that goes on inside a living cell depends on—is caused by, directly or indirectly—everything else that goes on inside the cell. And then, adding more complication, there are exogenous influences. And above it all, no locus of control. There is a lacuna of control. The cell has no brain. The sequencing of the human genome, that recent triumph of reductionism, like the cataloging of elementary particles, provides a compendium, but it resides far from the macrostructure, far from an accounting of gross outcomes. Structures and processes that define the macro world do not map readily onto elementary bits.
Friday, June 19, 2015
Nonetheless, such speculations invite philosophizing:
- Can a nonphysical anything wield intelligence?
- Can intelligence reside in a mere, albeit complex, molecule?
|Cutaway illustration of the living cell. Neither divine artifact nor improbable chemical machine.|
To suppose that a deliberating mind is needed to design or operate the biochemical levers that trigger or impede processes inside a cell is to anthropomorphize. To suppose that somewhere physically inside the cell is a something that makes such decisions as are made is to anthropomorphize. The latter observation holds with particular force now that research into gene regulatory networks demonstrates that the biochemistry inside a cell operates as an organic whole. There are dependencies and interdependencies, but no executive intelligence sits atop a hierarchy of control.
We have “intelligent” and “design,” “master genes,” “control switches,” “codes” and “programs” from which to construct an understanding of the cell as a representative organism. Such concepts are fine workaday metaphors, but literalizing and projecting them onto nature is a detour into anthropomorphism. Nature is not designed or programmed by an intelligence or anything else. Nature is not a whew! of chance. Nature is not of gods or fortunate happenstance. Nature is neither a miracle nor a machine.
Peel back the curtain, and there’s nothing to see. Nature, in all its messy complexity, in all its nurturing and desolation, in all its unlikely satisfactions is all there is: Organism. Nature earns its living by weaving novelty, habit, objects and subjects into ever more intense, elaborate and sublime aesthetic processes and experiences. It suffers the setbacks inherent in being alive. Its animate soul inspires each new universe it bears. Nature is ontologically animate, exuberant, irreducible, and non-contingent. This is the broad sense of organicism, the last philosophy left standing once dumb dead matter and disembodied consciousness have slapped one another silly.
Tuesday, June 09, 2015
Although each cell in that body inherits all of the fertilized egg cell’s genes, specific genes get switched on and off by other genes that produce regulatory molecules, and those genes are switched on and off by other genes and the molecules they produce according to what is needed for each type of cell. And all of this biochemical management occurs by feedback loops. The biochemistry that oversees the differentiation (and stabilization) of cell types in the developing organism is organized into gene regulatory networks (GRNs), very elaborate chemical feedback loops.
Q: How does a fertilized egg cell give rise to such a variety of cell types as compose the body of a complex organism?
A: That fertilized egg cell’s DNA arrived pre-loaded with the genetic information needed to craft the specialized cell types that compose the body of that complex organism.
The point to be made about GRNs is that, by regulating gene expression, the cellular machinery can coax from a highly conserved set of genes (those of the original fertilized egg) a liberal diversity of cell types (skin, muscle, nerve, etc.).
So far so good. But one noteworthy development is the increasing significance that evolutionary theorists ascribe to GRNs. Gene regulatory networks, not the acquisition of new genes, manage the differentiation of species from their common ancestors in much the same way as they manage the differentiation of cells in a body from their common ancestor, the fertilized egg cell. Diverse descendant phenotypes lurk within the DNA of ancestral genomes and genotypes alike, waiting to be switched on.
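The switch-flipping logic of a gene regulatory network can be caricatured in a few lines of code. The sketch below is a toy Boolean network, not biology: the gene names, the wiring, and the external signal are all invented for illustration. Its only point is the one above: a single fixed rule set (the conserved genome), exposed to different regulatory contexts, settles into different stable expression patterns (different "cell types").

```python
# Toy Boolean gene regulatory network. One fixed "genome" (the rule set),
# different external signals, different stable expression patterns.
# Gene names and wiring are invented for illustration only.

RULES = {
    # Each gene's next state is a function of the current expression state.
    "geneA": lambda s: s["signal"],                   # switched on by an external signal
    "geneB": lambda s: s["geneA"] and not s["geneC"],
    "geneC": lambda s: not s["geneA"],
}

def develop(signal, steps=10):
    """Iterate the feedback loop until expression settles (a fixed point)."""
    state = {"signal": signal, "geneA": False, "geneB": False, "geneC": False}
    for _ in range(steps):
        nxt = {g: bool(rule(state)) for g, rule in RULES.items()}
        nxt["signal"] = signal
        if nxt == state:          # expression pattern has stabilized
            break
        state = nxt
    return {g: state[g] for g in RULES}

# Same rule set (genome), two different signals, two stable "cell types":
print(develop(signal=True))   # {'geneA': True, 'geneB': True, 'geneC': False}
print(develop(signal=False))  # {'geneA': False, 'geneB': False, 'geneC': True}
```

Nothing in the rule set changes between the two runs; only the regulatory context does, which is the sense in which a conserved genome can yield divergent outcomes.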
The science of comparative genomics confirms a fundamental conservation of DNA across species, a finding that came as a surprise to everyone. The genetic similarities seen across species are too striking to sweep under the rug, and at least some researchers are candid about the new data’s impact on evolutionary theory.
Charles R. Marshall, a biological science professor at the University of California, Berkeley, observes in a book review in the September 20, 2013 issue of Science,
"In fact, our present understanding of morphogenesis indicates that new phyla were not made by new genes but largely emerged through the rewiring of the gene regulatory networks (GRNs) of already existing genes"
Sunday, April 12, 2015
Now feathery, wild.
Blooms, a fog of spicules.
II. Millipedes occult
A vault of heavy metals,
The Kubrick sky,
The spinal dust . . . .
Splits a sky’s
Migrating through parchment.
Blue sky erased
Line by line.
V. Across veiny, steel wool
Nudged to horizon's edge.
Density of particulates.
Dispersal fields from hard
Shed their hair
Pilots feed the new blanket,
Of silken gauze, rended.
Monday, December 08, 2014
Things seemed to be going well, but then *Kablooey*, the moderator recognized, rightly, that I had diverted the conversation into a rabbit hole. I had wandered off script. And he threw me in the penalty box -- my posts in this thread were relegated to the Trash Can.
Given the unpopularity of my posts -- my arguments won no converts -- it's tempting to believe that their final resting place in the Trash Can had something to do with that unpopularity. Nonetheless, the moderator did recognize that I was not addressing the topic of scientific method directly and deserved some kind of rebuke. But the Trash Can?
For the curious, you can judge for yourself whether my comments are Trash-Can worthy:
Sunday, September 21, 2014
Reconceptualizing Evolution as an Instance of Development - Phylogeny is its own Ontogeny; Start with the Zygote.
During the development of a complex organism, a fertilized ovum, or zygote, divides in two, then again into twice as many cells and eventually into all the cells that compose the organism's body. As the cells proliferate, they differentiate in form and function into the various cell types of that particular kind of body. This differentiation into skin, stomach, nerve, and other cell types occurs even though the cells of a developing body all share a common genotype, that of the original zygote. The paradox of one genotype yielding many cellular phenotypes has been resolved, in a general sense, through the mechanisms of epigenetics. A relatively new branch of molecular biology, epigenetics addresses issues related to gene regulation and gene regulatory networks. The new discipline aims to explain how, during development, genes get turned on and off and when (as in larval or adult forms of organisms) and where (as in spleen or kidney) they do.
Epigenetics is an upstart. It would seem to demote DNA from being the cell's chief executive to its merely utilitarian, dumb server. DNA includes an archive of messenger-RNA templates (and the messenger RNA molecules transcribed from the templates still pass through an editing suite before being escorted to the ribosome, where they get translated into proteins). The molecular machinery of epigenetics, through normal chemical bonding, excites or inhibits DNA "expression" or "action." The countless combinations of sections of DNA that can be expressed and repressed here and there in sequence or in tandem produce multiform cellular phenotypes from the highly conserved DNA of the original zygote.
From a complex database a skilled operator can extract many kinds of reports, by slicing the data this way then that. DNA is such a complex database, responding to many and diverse calls for data. The creatures of the Earth are reports summoned from DNA, not expressions of any executive talent that resides in the DNA. This is the new view of things from the world of epigenetics.
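The database analogy can be made literal in a short sketch. Everything here is invented for illustration: a toy "genome" table and a query function that extracts a different "report" (the set of expressed genes) for each tissue, from the same unchanged data.

```python
# Toy version of the database analogy: one conserved "genome" (the data),
# many "reports" (cell types) extracted by different regulatory queries.
# All gene and tissue names are invented for illustration.

GENOME = {
    "keratin":   {"tissues": {"skin"}},
    "myosin":    {"tissues": {"muscle"}},
    "synapsin":  {"tissues": {"nerve"}},
    "ribosomal": {"tissues": {"skin", "muscle", "nerve"}},  # housekeeping gene
}

def express(tissue):
    """The 'report': genes whose regulatory context includes this tissue."""
    return sorted(g for g, meta in GENOME.items() if tissue in meta["tissues"])

print(express("skin"))    # ['keratin', 'ribosomal']
print(express("muscle"))  # ['myosin', 'ribosomal']
```

The GENOME table is never modified; each phenotype is a different slice of it, which is the epigenetic picture the paragraph above describes.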
Thursday, August 21, 2014
Advocacy of eugenics continues under the banner of population control and similar euphemisms.
The Anglo-American eugenicists of the early 20th century invoked Darwinian natural-selection theory to gird their ideological bent. But, according to the arguments and evidence that Alan Bennett presents in "Evolution Revolution", these social engineers did not hijack Darwinism, nor twist it into service in a way to which Darwin would have objected. On the contrary, Darwin embraced the eugenicist agenda from the outset. Not only did Darwin himself promote eugenics, but the agenda's advocates also included Darwin's half-cousin, Francis Galton, who formalized the concept and propounded it as civic duty; Thomas Henry Huxley ("Darwin's Bulldog"); and Huxley's grandsons, Julian and Aldous Huxley, Julian serving for a time as president of the British Eugenics Society and Aldous sketching a blueprint for a caste society in his "Brave New World."
The objective of the Darwinian offensive was twofold, as Bennett summarizes:
- Cast the working class in the role of the unfit.
- Denigrate religion.
But the anti-Darwinian angle of Bennett's argument unwinds in a complicated way and extends beyond discrediting the motives of Darwin and his acolytes. That is, the attack is not merely ad hominem. Bennett establishes it as a point of historical fact that the concept of "descent with modification" had been around for some time prior to Darwin. Victorian society was not hostile to the idea of evolution, which it saw as evidence of God's wisdom, in His having crafted natural law so as to give rise to the diversity of life.
Neither was the mechanism of natural selection original with Darwin. It too was a concept familiar to Victorian scientists. But natural selection failed to gain traction as a scientific idea, before Darwin and his propagandists took up the cause, because the scientists of the day perceived that it was inadequate to account for the diversity of life. Under the influence of an optimizing mechanism, such as natural selection, they reasoned, phenotypes should converge, not diverge, with the passing of generations.
Natural selection theory never has rested on solid scientific evidence or reasoning, although, by appealing to statistics and common prejudice, Darwinians grafted onto it the trappings of a science. As a result, the sequentially amended theory became almost infinitely elastic in its capacity to absorb anomalous findings. It managed consistently to re-describe "how nature works" in ways contrived to preserve a niche for itself in the explanatory scheme. From the time Charles Darwin foisted it upon the world, natural selection theory effectively served the ideological ends of diverse brands of racists and elitists, despite its lack of scientific rigor.
However, if we follow Bennett in rejecting natural selection as the primary engine of evolution, then we are left with a process minus any explanation as to how it works. We still have to account for evolution's particular outcomes. Bennett proposes to fill the void, but the mechanism that he nominates to serve as evolution's centerpiece arrives with its own baggage.
Sunday, May 11, 2014
|This was the sky over Minneapolis, the view from my alley.|
|I hiked over to the Ford Bridge, which connects Minneapolis and St. Paul, for a better view.|
|The planes kept coming. This fellow had a curious flight plan. Maybe he forgot his lunch.|
|Sunsets never looked like this before.|
Saturday, April 19, 2014
"The new forces, elevating in their nature though they be, do not act upon the social fabric from underneath, as was for a long time hoped and believed, but strike it at a point intermediate between top and bottom. It is as though an immense wedge were being forced, not underneath society, but through society. Those who are above the point of separation are elevated, but those who are below are crushed down."
— Henry George, Progress and Poverty
Cowen is an economist at George Mason University. He achieved fifteen minutes of fame via an e-book, The Great Stagnation, in 2011. The e-book created a buzz loud enough to grab the attention of a publisher with a printing press. Stagnation insinuated its way between hard covers, from where it continued to make the case that a low-wage, slow-growth economy is something that the world had better get used to. It’s the new normal.
Average is Over, evidently a hurried sequel to Stagnation, reprises Cowen’s message: Extreme income inequality is here to stay. We can't tax the rich, after all, because they have too many channels through which to transfer the burden to the middle class and the poor.
Cowen offers up education as a tool that sub-millionaires might use to elevate themselves economically, but such education as Cowen conceives of might more candidly be called instruction, or schooling, or obedience training. So, operating in a bimodal economy—one that concentrates wealth at the tippy top and diffuses poverty across the broad bottom of the barrel, with no middle between—how does the top dispose of the barrel bottom? Cowen seems to think that that’s a problem to be solved and that he has a solution:
“What if someone proposed that in a few parts of the United States, in the warmer states, some city neighborhoods would be set aside for cheap living?”

Cowen describes the housing in these cheap zones as being modest but not ramshackle, and it all seems fuzzily commonsensical. But then,
“We also would build some makeshift structures there, similar to the better dwellings you might find in a Rio de Janeiro favela. The quality of the water and electrical infrastructure might be low by American standards, though we could supplement the neighborhood with free municipal wireless (the future version of Marie Antoinette’s famous alleged phrase, “Let them watch internet!”). Hulu and other web-based TV services would replace more expensive cable connections for those residents. Then we would allow people to move there if they desired. In essence, we would be recreating a Mexico-like or Brazil-like environment in part of the United States, although with some technological add-ons and most likely greater safety.”

Ok, so let’s fill another flute with bubbly and kick back to get a good look as the great unwashed descend ever deeper into collective poverty. It’s a kind of spectator sport for the well heeled, really. Let’s anticipate this decline in quality of life and contain the newly impoverished in ghettos modeled after Brazilian slums. If we call these habitats for impoverished humanity camps maybe they’ll seem almost recreational. Maybe FEMA would do a good job running these camps, keeping everything orderly and responding to emergencies. They might even cook up a motivational slogan. Maybe something like Arbeit Macht Frei.
No one should find the favela prospect objectionable, Cowen opines, after all, “no one is being forced to live in these places. Some people might prefer to live there. I might prefer to live there if my income were low enough.” He goes on to remind readers that some neighborhoods deteriorate naturally into shantytowns: “The end result is no different from the deliberate shantytowns already discussed.”
Let that cute phrase rattle around inside your skull.
What a striking public policy objective. Cowen wants his manufactured ghettos to be modeled after Brazilian favelas. So, how is it going for the residents of those South American slums? Maybe not so good. Could be better.
Sunday, February 23, 2014
You Are Being Stagnated: The social engineers are managing your standard-of-living expectations downwardly.
This assertion about technological innovation admittedly invites debate, given the ongoing computerization of pretty much everything. Nonetheless, say the pessimists, for whatever reason, recent breakthroughs lack the economic potency of the technological breakthroughs of the early to middle twentieth century. Those innovations fueled deep and broad economic gains. More recent breakthroughs are of a different nature. They are hood ornaments glued onto tried-and-true technologies, and they concentrate their relatively meager economic gains selectively in the pockets of the already rich. Evidently, the "New Normal" applies only to the middle class and the poor. Cowen seems too cavalier about this implication of the new economic order: It's just the way the cookie crumbles.
Columnist Paul Krugman attributes much of the economic stagnation to a lack of demand for goods and services, essentially telling us that the legendary "job creators" won't be creating jobs any time soon, not until they see more cash in the pockets of prospective customers. If that's a major hold-up, keeping the economy from recovering, then putting cash in the pockets of consumers might be a good thing--the kind of economic stimulus that might work. If only that were the objective of the professional theoreticians and technicians "working" on the problem.
Despite these implications of Cowen's and Krugman's diagnoses, the prescription from both sides is simple: Get Used To It. Krugman glowingly cites economist and political advisor Lawrence (Larry) Summers as having arrived at the same conclusions about the New Normal and The End of Innovation. Now, there's a champion for the working bloke.
Push-back from the contrarians, those who reject the end-of-innovation premise, predictably cites the ubiquity of computing and communications technologies. The new devices have thoroughly reshaped the habits of consumers and big business alike. Computers streamline manufacturing and make comparison shopping as simple as swiping a touchscreen. The impact of the more recent technologies can hardly be said to pale next to that of the twentieth century's marvels. Nonetheless, the scope of their impact is easier to appreciate when one looks under the hood and sees that tiny little integrated circuit, or IC.
Saturday, December 21, 2013
Rapid evolution of novel forms: Environmental change triggers inborn capacity
It's just too pat. Carrying around a reservoir of unexpressed mutations? Just for fun? Just "in case"? Just "by chance"? I don't think so.
See report at http://wi.mit.edu/news/archive/2013/rapid-evolution-novel-forms-environmental-change-triggers-inborn-capacity
Thursday, August 29, 2013
The scene at right occurs a few minutes into the episode, "Say My Name," in Season 5 of the cable TV show, "Breaking Bad".
In the screenshot, the lower right quadrant of the sky features four or more roughly parallel white streaks. Maybe stunt pilots just finished an air show. Who knows?
Maybe the streaks are just oddly configured cirrus clouds. Nah.
They can't be normal jet-engine contrails, because those dissipate too quickly. They wouldn't hang around long enough to share the sky with the next flights to come along in the same path.
Hmmmm . . . .
One thing these pop culture insertions accomplish is to help normalize the new sky.
The camera doesn't linger, so we can't know whether, an hour or so later, these streaks unfolded themselves into feathery, striated, filamentous fields of rippled, gauzy gray haze that finally swallowed the sky.
Sunday, July 21, 2013
Saunders acknowledges that the media's braindead incantations are agenda driven: "[. . . ] it's clear that a significant and ascendant component of that voice has become bottom-dwelling, shrill, incurious, ranting, and agenda-driven." So, he's got himself into a contradiction, though you have to untangle the essay to get a good look at it.
He continues, "It strives to antagonize us, make us feel anxious, ineffective, and alone; convince us that the world is full of enemies and of people stupider and less agreeable than ourselves; is dedicated to the idea that, outside the sphere of our immediate experience, the world works in a different, more hostile, less knowable manner. This braindead tendency is viral . . . ."
Why would he characterize the deliberate undermining of an accurate picture of the world as reflecting braindeadness? The propaganda ministers who employ the megaphone are hardly braindead. How about crazy like a fox?
Saunders aims too low. He mistakes the messenger for the composer, the program for the programmer, the on-air personality reading a teleprompter for the executive directors of the telecommunications conglomerate who employ the teleprompter reader. The collage on the cover of the first paperback edition makes clear his target: the on-air script readers.
But these poor toilers are just reciters, pieces of the corporate complex's manufactured public face. Why does Saunders fail to target the people who write the scripts that the on-air personalities are forced to read? Or their bosses? Or theirs? He calls the on-air personalities "informants," as if they possess information and share it in a spirit of civic mindedness, in a marketplace of ideas, hobbled only by their stupidity. Oh, and the profit motive. But they don't possess anything like information. They possess a knack for projecting an amiable facade. Witting or unwitting, they serve nonbraindead masters.
Sunday, July 14, 2013
Notice the advertisement that occupies the bottom half of page A5. It is an airline ad for AirFrance.
The grid of trails in the sky evokes patterns commonly associated with chemtrails, with the ropes from the swings contributing to the aerial coverage. The ad does not include any headline or body copy that references playground activity, vacationing, family travel, or anything to create a context for the kids on swings. They are context-free props, but for the chemtrail ropes. Since this ad ran, the skies over the Minneapolis-St. Paul area have received extensive, repeated trail coverage.
What went through the mind of the graphic designer who laid out this ad, or the art director who approved it, or the agency rep who sold it to AirFrance? It is implausible to suppose that the reference to chemtrails is unintentional. If similar ads appear where you live, you might want to prepare for a heavy dose of heavy metals.
A couple other points:
Such images help normalize a conspicuously hash-marked sky. NASA already is working to shape the perceptions of children, by conflating chemtrails and ordinary contrails. See here: http://mynasadata.larc.nasa.gov/804-2/contrail-watching-for-kids/
A common rebuttal to warnings about chemtrails is that they are equal-opportunity toxifiers, that not even the perpetrators could avoid inhaling the contents of the sprays. It might be that through advertisement, the perpetrators signal their cohorts as to where the whammy will fall, giving them fair warning to take a vacation or other leave of absence.
Sunday, April 28, 2013
What Darwin Got Wrong." But Nagel dismisses the centerpiece of their attack—the incoherence of natural selection theory—cavalierly, to my mind, opining in a footnote that they misinterpret the theory. Really? How so? Now, that would be worth reading.
Which is not to say that the present book is not worth reading. When someone of Nagel's stature presents secular objections to the NeoDarwinian paradigm, feathers are bound to fly, as they do in any number of critical reviews of the book. Science seems to feel pressured to circle the wagons around the Darwinian account no matter the veracity of the counterarguments. It's time to take a break.
Conventional thinkers who keep natural selection theory at the top of their list of explanatory tools can use it to explain any aspect of organic nature. They need only contend that whatever is observed, say consciousness and rational thought, or blue feathers and big beaks, is as it is because that phenotypic trait was "more adaptive" than the alternatives expressed in the ancestral population. If in some instances that explanation seems implausible, then the explanation is “genetic drift.”
In any case, Nagel sidesteps this catch-all application of Darwinian reductionism by pointing out that nature from the outset must have had the potential to sprout living beings with minds. Nothing in conventional scientific thinking accounts for this potential inhering in nature. The prospect transcends the materialist, NeoDarwinian paradigm, or at least that's Nagel's contention.
Sunday, March 31, 2013
This struck me during a visit to the Minnesota state capitol building in St. Paul. The artwork inside the building's rotunda is uniformly and inescapably pagan. I take responsibility for the poor quality of the photos that follow, but the inherent pagan imagery is clear. Nothing Christian to be gleaned.
I suppose that the dearth of Biblical imagery in seats of government has something to do with the notion of a Constitutional wall that separates church and state. And the reason pagans get a free ride might have to do with that word, "church," as opposed to, say, "temple." Curiously, Mormons maintain both churches and temples, covering the bases, I suppose.
In any case, it's clear that Christianity is good enough for Joe and Josephine Blow, but paganism the elite retains to itself for its own veneration.
Saturday, March 09, 2013
August 4, 2013, update to this post: Narrative Science is a company that helps businesses communicate by turning data into stories. The company's software program, Quill, "is an artificial intelligence engine that generates, evaluates and gives voice to ideas as it discovers them in data." The company's website elaborates: "Quill imports your data and builds an appropriate narrative structure to meet the goals of your audience. Using complex Artificial Intelligence algorithms, Quill extracts and organizes key facts and insights and transforms them into stories, at scale. Quill uses data to answer important questions, provide advice and deliver powerful insight in a precise, clear narrative."

That ability could come in handy. For the CIA.
The intelligence agency's business-investment arm, In-Q-Tel, has contributed an undisclosed amount of funding to Narrative Science. My original post, below, might explain why the CIA is pursuing the engineering of stories. The agency is out to weaponize narrative.
The Conference on Contemporary Political History kicked off with remarks from then director of the Miller Center, Philip Zelikow. (A few years later, during a leave from his directorship, Zelikow kept on a short leash the Kean-Hamilton Commission, also known as the 9/11 Commission, which he steered as its staff executive director.) In his opening remarks to the history conference, Zelikow laid out his concerns regarding contemporary political history:
“‘Contemporary’ is defined functionally by those critical people and events that go into forming the public’s presumptions about its immediate past. This idea of ‘public presumption’ is akin to William McNeill’s notion of ‘public myth’ but without the negative implication sometimes invoked by the word ‘myth.’ Such presumptions are beliefs (1) thought to be true (although not necessarily known to be true with certainty), and (2) shared in common within the relevant political community. The sources for such presumptions are both personal (from direct experience) and vicarious (from books, movies, and myths).” (All quotes from Zelikow are from the Miller Center Report, Vol. 14, No. 3, Winter 1999.) We will return later to McNeill and public myth. First, we need to follow Zelikow into the sources of public presumption. He identifies four:
“First, public presumptions can be ‘generational.’ They are formed by those pivotal events that become etched in the minds of those who have lived through them [. . . ]. The current set begins in approximately 1933, although the New Deal generation is fading. The Second World War and Vietnam, however, continue to resonate powerfully.
“Second, particularly ‘searing’ or ‘molding’ events take on ‘transcendent’ importance and, therefore, retain their power even as the experiencing generation passes from the scene. In the United States, beliefs about the formation of the nation and the Constitution remain powerful today, as do beliefs about slavery and the Civil War. World War II, Vietnam, and the civil rights struggle are more recent examples.
“Third, public presumptions often concern 'dramatic stories plucked out of time,' such as the Alamo, Pickett’s Charge, or the Titanic.
“Fourth, some public presumptions gain currency because they have a particular resonance for us today, either because they invoke powerful analogies to the present [. . . .] or because they offer a causal link and seem to explain ‘why we are the way we are today.’ Taken together, we see that presumptions that remain ‘contemporary’ are—with few exceptions from the 18th and 19th centuries—events and episodes from the last 60 years.”
Depending on which conspiracy theory one subscribes to, that of the Bush administration and the Kean–Hamilton Commission or some variety suggested by the 9/11 Truth Movement, it takes little imagination to perceive in Zelikow’s assessment a foreshadowing of events to come, his remarks about searing events taking on transcendent importance being uttered just three years prior to the attacks of 9/11.
And nowhere in his comments will one find any concern that public presumptions should reflect history accurately.
Sunday, March 03, 2013
The cover, which appears on Discover's March, 2013, issue, looks like something dredged from a 1950s science-fiction serial. A guy in a suit lifting off to his next sales meeting represents Evolution's Next Stage? Does it get any more pedestrian, or vintage, than that? What's the next breakthrough—lawn-mowing robots?
America's visionary futurists peered over the edge—and all I got was this lousy jet pack.
Is it really plausible that the editors behind this popular science magazine can't envision a future grander than one that might have dazzled them in grade school? Or, maybe technological deprivation is an idea that media owners would like us to get used to.
The cover image is so dated that its cheesiness looks insidious. Is it an artifact of the project to deliberately dumb us down? Another exercise in normalizing middle-class expectations of a stagnant-to-declining standard of living?
Surely the technological imagination can conjure something more exciting than this to sell us as Evolution's Next Stage. The next stage in evolution? Try THIS.
Sunday, January 27, 2013
Lecture: Evolution in Four Dimensions
Click the image for an excellent talk by Eva Jablonka, in which she describes provocative new findings in epigenetics and animal behavior. The findings move natural selection farther toward the periphery of evolutionary theory. Phenotypes, as they differentiate during evolution, seem to self-organize, as do the differentiating cells in a developing organism. My contention is that evolution and development resemble one another because they are two appearances of the same process, which is development.
If one could lay Darwin's concept of natural selection next to the current model of evolution, one would have a hard time finding much in common between them. Outside of the vaguest generalization of the evolutionary process, captured in Darwin's phrase, "Descent with modification," nothing much of the original formulation survives to contribute to the current model, the so-called Extended Synthesis.
Epigenetics, niche construction, phenotypic plasticity and other intriguing new developments in bioscience sit at the center of that synthesis and throw into question the foundations of the Darwinian model. Despite evolution theory's provisional character, however, the fossils record descent’s modifications, and they taunt us: What during descent accounts for these modifications?
Saturday, November 10, 2012
This realization produces a discomfort that psychologists call cognitive dissonance. It happens when a person holds beliefs that are incompatible, as in “I am a shrewd operator” and “I’ve been chumped.” A wave of cognitive dissonance is washing over the U.S. electorate, as voters face up to their chumpdom.
The electorate is seeing that political and economic fundamentals rest on something other than which party perches on the branches of the U.S. government. Wings left and right fly in and out of office, but election wins of neither camp deflect entrenched trajectories:
- The wealthy grow disproportionately wealthier and the workaday blokes correspondingly poorer.
- U.S. troops and their private-sector surrogates occupy more foreign turf.
- Our sources of sustenance become increasingly adulterated.
Fortunately, self-deception has its limits.
Philosopher of science Thomas Kuhn found in the history of science examples of self-deception's limits, and he described what happens when those limits are breached. In The Structure of Scientific Revolutions, Kuhn pointed out that when scientific experiments produce unexpected results, and eventually they will, then scientists explain away the anomalies, somehow or other, to preserve their foundational beliefs, what Kuhn called their paradigm. Scientists accommodate the anomalous results by amending the paradigm. In the above example, if “I am a shrewd operator” is the prevailing paradigm, and “I’ve been chumped” is an anomalous occurrence, then “I must have had a bad day” might be an adequate amendment to accommodate the anomalous data while preserving the paradigm.
But a paradigm can be stretched only so far. Eventually the anomalies, such as strings of coincidences or runs of bad days, accumulate beyond the elastic threshold of the paradigm, and it snaps. When that happens, the paradigm’s adherents convert or, eventually, die off, and a new generation adopts a new paradigm that accommodates the anomalous data.
The Structure of Political Revolutions.
Politics, like science, rests on foundational beliefs, or paradigms. A majority of the electorate, for example, seems to believe that the ruling class comprises more or less decent human beings who act in good faith (more or less) toward the ruled. But when outcomes contradict this belief, when its expectations are not met, then the electorate stretches the paradigm to accommodate apparent acts of bad faith, such as, for example, a pile of broken campaign promises.
A generally accepted amendment grants that our elected officials and their advisers are beneficent actors who act in good faith but are stupid. (This is the honest-but-stupid hypothesis.) Maybe they make big mistakes, because deep down they are boobs, or, more charitably, they just lack sufficient smarts to figure out how to keep their promises. The economy, the Middle East, the environment—it’s all so complicated. Sometimes things go wrong. The rulers mean well, but they bungle. Dang. Or, maybe they’re honest, well meaning and competent, but cowardly when opposed. (This variant is the honest-but-spineless hypothesis.)
Armed with these amendments, adherents of the prevailing paradigm can mitigate the dissonance created by anomalous outcomes. They can still gather and console themselves under the old paradigm. But the paradigm must have an elastic threshold; it cannot accommodate an infinite number of anomalous outcomes and amendments.
Sunday, September 23, 2012
The issues are in the tissues. A massage therapist once shared with me that piece of trade wisdom. She was making a point about the interplay between mental and physical discomforts. She was not the first to link the two.
Scottish psychiatrist R. D. Laing in the 1960s and '70s, following on from Freud and Jung, proposed that the earliest experiences in life inform not only personal psychopathologies but also the myths of the tribe. He was interested particularly in myths that share a pattern with prenatal and early postnatal developmental histories and suggested that the mythic tales might recapitulate the adventures of zygotes, embryos, and fetuses.
|Uterine Endometrium Adopting the Newly Arrived Blastocyst|
The stories recapitulate the journeys and development of the zygote, embryo, and fetus, suggested Laing. That is, the stories parallel the prenatal story: A zygote, encased in a membrane, called the zona pellucida, travels downstream—down the fallopian tube—until, as a blastocyst, it is adopted by the uterine endometrium. Attached to the uterus, it matures into a fetus, and at the requisite time it is born into the world.
Laing also cites correspondence between Freud and Jung in which the psychoanalysts discussed another natal motif, that of the doppelganger, the hero's atrophied and subordinate twin. Examples of hero/doppelganger pairs include Gilgamesh and Enkidu, Romulus and Remus, and Don Quixote and Sancho Panza. Cain and Abel could be cited. The psychoanalysts interpreted the weak or unfortunate twin as representing that lost, discarded companion, the placenta.
Although Jung is most associated with the notion of a collective unconscious, Freud, too, toyed with the idea of a sort of phylogenetic memory. In a manuscript he shared with colleague Sandor Ferenczi, published posthumously as A Phylogenetic Fantasy, Freud speculated that some psychopathologies developed as reactions to conditions encountered by our ancestors during the ice ages.
From DNA to Phenotypes
The arrival of the first neurons provides the embryo with a mechanism, prospectively, that can record experiences, that is, with a memory. But the developmental stages prior to the appearance of the first neurons must record experiences by other means (if at all). Some other means must also come into play if phylogenetic memory is to be a realistic prospect.
The Masonic Pyramid, shown here as the Great Seal, searches for its capstone.
The all-seeing eye in the capstone's place has, thanks to NASA, extended its vision to the planet Mars.
There, NASA's Curiosity robot explorer last week found a real treasure: The Missing Capstone, shown below.
The official story is leaked HERE. We await further developments.
A nice color close-up is posted at http://mardew.com/hires/MSL0044-2/
|Click to Enlarge|
Saturday, August 18, 2012
- It is a creationist hypothesis. (The star larvae hypothesis has no use for, nor does it address, supernaturalism, so how it qualifies as creationism is hard to figure, unless creationism has an infinitely elastic definition.)
- Its mix of ideas includes “religious creationist arguments” and “paranormal topics.” (The hypothesis includes creationist arguments only insofar as it includes arguments that are critical of the theory of natural selection, but the criticisms of natural selection have no root in any religious consideration. The hypothesis has no use for, nor does it address, paranormal topics, unless one is using “normal” in the Kuhnian sense, in which case any reference to anomalous data is a reference to something “para”normal.)
- It is guilty of “quote mining and misrepresenting the Gaia hypothesis and panspermia ideas of Fred Hoyle.” (The hypothesis cites sources in the ordinary way that such presentations do. If there’s any mining, it’s in the sidebar quotes, but those are for color. They’re not essential to the hypothesis. The accusation of misrepresentation is strange, but mudslingers tend not to aim very carefully.)
- It denies macroevolution and claims there are no transitional fossils. (This characterization could be made only by someone who has not read, or understood, the hypothesis.)
I can’t help but ponder the P.R. cliché about no publicity being bad publicity. Since the wiki entry appeared, visits to the star larvae site have ticked up a bit.
Sunday, July 08, 2012
The image of the heroic atheist is the archetypal opposite of that of the religious believer. The believer is a psychological weakling, a child unable to cope with the world of rationality, of merely physical existence, and of other cold "facts" to be faced.
In Religion for Atheists, Alain de Botton, an atheist, acknowledges these images as cultural conventions and in doing so tries to lay the foundation for a rapprochement. Little in this book will hearten his fellow atheists, however. Instead, the author chooses to acknowledge, with great sympathy, the needy child in us all.
|A triumphant Madalyn Murray O'Hair slays religion.|
"Christianity describes the capacity to accept dependence as a mark of moral and spiritual health. Only the proud and vainglorious would attempt to deny their weaknesses, while the devout can declare without awkwardness, as a sign of their faith, that they have spent time in tears at the foot of a statue of a giant wooden mother. The cult of Mary recasts vulnerability as a virtue and thus corrects our habitual tendency to believe in a conclusive division between adult and childhood selves. At the same time, Christianity is appropriately delicate in the way it frames our needs. It allows us to partake of the comfort of the maternal without forcing us to face up to our lingering and inescapable desire for an actual mother. It makes no mention of our mother. It simply offers us the imaginative pleasure of being once again young, babied and cared for by a figure who is mater to the world."
There is more than a touch of Freud underlying such sentiments.
Throughout the book runs the theme of human neediness and the need to face that neediness with humility and to acknowledge the solace that religious customs and institutions provide. The author's thesis is that purely secular customs and institutions could provide the same comforts if secular society were to make the necessary social-engineering investments. But it is an act of faith to believe that, lacking theological foundations, a secular civilization could craft traditions equal in therapeutic efficacy to those crafted by the world's God-inspired religions. Whether God's existence is real or imagined doesn't bear on His capacity to inspire, or to comfort. With neither a real nor imagined God to lean on, secular society might never be able to pull off the author's therapeutic mission.
The author does identify a substantive common ground that serves the psychological needs of both the atheist unbelievers and the believers in things unseen. To the amusement and encouragement of the star larvae hypothesis, the author points to stars as a natural intersection of secular and religious concern:
"If such a process of re-evaluation [of calibrating our lives to cosmic standards] offers any common point of access open to both atheists and believers, it may be via an element in nature which is mentioned in both the Book of Job and Spinoza's Ethics: the stars. It is through their contemplation that the secular are afforded the best chance of experiencing redemptive feelings of awe. [. . . . ] Nightly -- perhaps after the main news bulletin and before the celebrity quiz -- we might observe a moment of silence in order to contemplate the 200 to 400 billion stars in our galaxy, the 100 billion galaxies and the 3 septillion stars in the universe. Whatever their value may be to science, the stars are in the end no less valuable to mankind as solutions to our megalomania, self-pity and anxiety."
To which we reply, piously, Amen. The author's instincts have delivered him to the promised land. We encourage him to cross into it.
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
Four Quartets: Little Gidding
T. S. Eliot
Friday, June 08, 2012
He proposes that advanced civilizations "go green" and integrate themselves into nature so perfectly that they remain hidden. Adapting Arthur C. Clarke's Third Law, Schroeder expresses his solution to Fermi's paradox as,
"Any sufficiently advanced technology is indistinguishable from nature."
Clarke rendered his law originally as,
"Any sufficiently advanced technology is indistinguishable from magic."*
So, anything that stumps our powers of explanation might be an instance of
1. A supernatural process
2. A natural process
3. An advanced technology
The challenge for us is to figure out what's what. That is, if any sufficiently advanced technology is indistinguishable from nature and from magic, then any sufficiently frustrating-to-science natural process will be indistinguishable from a sufficiently advanced technology and from magic. And any instance of magic we might mistake for a complex corner of nature or an advanced technology. For now, let's reserve magic as god of the gaps.
If a civilization's technology, designed by that civilization's intelligence, ever becomes so "green" that it becomes indistinguishable from nature, then it will have ended nature as something distinguishable from technology. In such a universe we have no criteria by which to distinguish nature from advanced technologies, that is, from intelligent design. This restatement of Schroeder's point has a point, which is that we should cease debating naturalistic evolution versus intelligent design. We claim to have criteria by which to distinguish the two causes, based on their effects, but Schroeder's conjecture suggests otherwise.
The universe might be somebody's well-organized garbage dump, albeit not self-evidently so. That's one interpretation. "Nature is somebody's science project," quips the star larvae home page. Another interpretation. Alternatively, we might see the universe as somebody's artwork. Or somebody's magical conjuration. Or somebody's dream. Or whatever. Are these interpretations necessarily mutually exclusive? (The star larvae hypothesis proposes above all, as did Whitehead, a biological interpretation: nature—the universe—as organism.)
The multiplicity of interpretations leads to a metaconclusion, perhaps, which is that we perceive the world through the lens of our instincts, conditioned biases, temperamental leanings and so on. Nothing profound there. But when we ponder the universe as a whole, the vastness of the object accommodates all conceptual frames. The concept of universe is an epistemological black hole.
The star larvae hypothesis embraces Schroeder's interpretation of Fermi's paradox because it points in a promising direction. But the hypothesis plays favorites. Schroeder says high-tech civilizations hide their garbage in plain view but camouflaged as nature. The star larvae hypothesis says high-tech civilizations simply ARE nature. Nature is high tech. Both narratives, though differing in their figure/ground designations, point to a common conclusion: If you want to see where technology is headed, look to nature.
The star larvae hypothesis invites us to contemplate the twinkles that dot the night sky and see the canopy not as housing, or hiding, technological civilizations but as comprising civilian technicians, engineered bodies abiding, alive.
*Clarke's first two laws, worth pondering in their own right, are (1) "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong;" and (2) "The only way of discovering the limits of the possible is to venture a little way past them into the impossible."