The Fifty-Fifty Universe


Way back in the 1970s, I was fascinated by cosmology, the study of the origin of the universe.

In the 1920s, Edwin Hubble, working at the Mt. Wilson Observatory (which I can see from the end of my block), discovered that the universe consists of uncounted separate galaxies like our Milky Way. Next, Hubble found that the galaxies were flying farther apart.

In the late 1920s, a brilliant Belgian priest and physicist named Father Georges Lemaître propounded the “hypothesis of the primeval atom,” or what we now call the Big Bang: The universe as we know it had been created a finite amount of time ago.

Lemaître’s theory, which was highly reminiscent of St. Thomas Aquinas’ “Prime Mover” proof for the existence of God—everything moving in the universe must have been set into motion by an Unmoved Mover, namely God—elicited suspicion among scientists that it was all just a Catholic plot to win believers in a Creator.

“Dyson’s insight that things are most interesting when they are a hard-to-predict fifty-fifty proposition germinated within me.”

So, physicist and science fiction author Fred Hoyle, among others, proposed an alternative Steady State model in which the universe is infinitely old yet, while expanding, is otherwise stable. The empty space left behind by the receding galaxies hatches new matter through some process that is currently unimagined but is definitely natural, not supernatural.

The Big Bang and the Steady State were comparably plausible alternatives until the mid-1960s when two Bell Labs radio astronomers, Arno Penzias and Robert W. Wilson, discovered microwave background radiation coming equally from all points in the sky. They asked some Princeton theoretical physicists to explain this and were excitedly told that their finding proved we must still be within the primordial Big Bang.

I had the good fortune to have dinner with Wilson, the only Nobel laureate I’ve ever met, in 1978 at his alma mater, Rice U. So I’ve always been sensitive to Princeton’s allegations that their theoreticians deserved the Nobel, not the Bell Labs knob twiddlers. (Only three Nobelists per field are allowed per year, so it was impossible to honor all the Bell Labs and Princeton scientists simultaneously.)

But when I mentioned my concerns to a U. of California professor of astronomy, he laughed: “Is it a coincidence that Wilson, the finest radio astronomy experimentalist of his generation, found what nobody else did, the key discovery of the age?”

Once the Big Bang had overcome the Steady State model, the next question became whether there had been only one Big Bang…or perhaps an infinite number of them endlessly repeating (thus allowing the universe to be uncreated and dispensing with the Big Bang’s apparent need for a Creator).

Possibly the galaxies would fly apart after being launched by each successive Big Bang, but then their expansion would slow down due to gravity, eventually grinding to a stop, and then the galaxies would plummet back to each other in an unstoppable Big Crunch?

And that would, somehow, lead to yet another Big Bang?

Nobelist Steven Weinberg complained in 1977 that an infinitely old universe must be improbably perfect in its mechanism. Even the slightest flaw would mean it would break down irreparably over the eons.

But that seemed overly theoretical, rather Borgesian.

So, astronomers set off to measure the amount of matter in the observable universe. Was it dense enough to reverse the Big Bang and cause a Big Crunch?

Frustratingly, they kept coming up with contradictory results. Whatever the answer was, it was irritatingly close to the borderline.

Well…of course it is, responded physicist Freeman Dyson in his 1979 book Disturbing the Universe. It’s got to be right around the tipping point. If it were clear-cut that the universe is denser than the break-even point, then the Big Crunch would have already happened by now. If the universe were much sparser, then we’d already be well into the Big Chill.

Hence, the universe must be very near the cutoff.

Dyson, a somewhat theistic optimist, propounded his “principle of maximum diversity,” which “says that the laws of nature and the initial conditions are such as to make the universe as interesting as possible.”

Perhaps, implied Dyson, the universe was designed to be puzzling to researchers?

(In case you are wondering, the English physicist Freeman Dyson, who died in 2020 at age 96, does not appear to be closely related to the billionaire inventor Sir James Dyson.)

Eventually, cosmology got too interesting for me. In 1998, two sets of astronomers discovered that the expansion of the universe was not slowing down, as the elegant model of the 1970s had assumed, but was instead speeding up, presumably due to mysterious “dark energy.”

Huh?

At that point, I gave up following new developments in cosmology.

But Dyson’s insight that things are most interesting when they are a hard-to-predict fifty-fifty proposition germinated within me.

Consider sports gambling. The NFL has worked to make it as hard as possible to predict at the beginning of the season who will win the Super Bowl at the end. And if that’s not good enough, bookies will offer you no end of clever bets to make your outcomes as random as possible.

Not surprisingly, Americans find the NFL really interesting.

That’s because the things we tend to argue about the most are those that are most arguable.

For example, in a 2009 Taki’s column, I wrote about the then-current debate over who was the greatest NFL quarterback, Peyton Manning or Tom Brady. Manning had just beaten Brady 35–34 in a game as close as the score. I quoted Steven Pinker on why the Brady vs. Manning argument was so interesting: “Mental effort seems to be engaged most with the knife edge at which one finds extreme and radically different consequences with each outcome, but the considerations militating towards each one are close to equal.”

Personally, back then I leaned toward Manning over Brady.

Today we know better, so, fortunately for me, we don’t argue over that as much.

Similarly, when it comes to the endless controversy over the causes of the racial gaps in intelligence, I endorse the anti-extremist position that most likely both nature and nurture matter.

The scientific evidence is utterly undeniable that racial differences in average intelligence exist. What remains debatable is whether these gaps are due to nature or nurture.

The moderate, sane view is that both likely matter.

Truly, how plausible is the extremist view that the only possible cause is 100 percent nurture?

Frankly, that seems crazy to espouse.

Far more reasonable is that both nature and nurture matter.

I don’t actually have much of an opinion on whether the actual distribution of causality for race and IQ is 80 percent nurture and 20 percent nature or 80 percent nature and 20 percent nurture. So my basic assumption is that the cause is a Dysonian fifty-fifty split between nature and nurture.

That could well be wrong, but it reduces the chance of my being really off. I don’t like being badly wrong.

You might argue that twin studies suggest 80 percent nature and only 20 percent nurture. But be aware that twin studies don’t adjust for changes in era. All twins were born within a few minutes of each other rather than a generation apart.

So, I’m okay with assuming a Dysonian fifty-fifty split.

And yet, I’m much denounced for my moderation, when respectable opinion is fanatically extremist: 100 percent nurture…or nothing!

Nobody can explain why such a crank view is more sensible than that IQ is influenced by both genes and environment.

But I suspect that purveyors of the conventional wisdom assume that, because they themselves are lying about what they actually believe (they would have their careers canceled if they told the truth), I must be lying as well.

But what if I’m not?

Perhaps I’m instead addicted to the truth?

Ed West writes, in response to Pinker’s upcoming book on preference falsification:

One unspoken effect of preference falsification is that it causes commentators on the left to become more suspicious of conservatives, because they suspect that their views on this subject are actually more right-wing than they profess, and they are dissembling. The social pressure to have the correct outward opinions leads to a sort of Spanish Inquisition-style paranoia among the country’s moral guardians about people’s true beliefs. Tell us your real opinions, so we can get you sacked!

But, I ask, when would I have the time to mislead? Over the past few decades, I’ve posted many millions of words articulating exactly what I believe. Unlike my haters, I haven’t had the leisure to espouse any fake beliefs.

I’m constantly being accused of being a horrific hater by people who seem to assume that if I dare say X out loud, I must really believe X-squared because they only say in public the square root of what they really believe.

They can’t believe I just tell the truth as I see it.

And yet I do.

But that is inconceivable to them because it would suggest that I’m morally better than them.

Perhaps, though, Occam’s razor suggests I am?