Know This
Here’s an idea that could lead us into a new kind of monetary world: Can you think of how we might mitigate the problems while keeping the advantages of our current money system? Imagine a scientific method of making odorless powder from our feces and replacing money with that powder: that is, feces-standard money, or FSM.
Every morning we can put our powder into reactors situated in our communities to supply food for the microorganisms that produce various fuels, such as methane and biodiesel. We can receive a certain amount of FSM in exchange for the powdered feces and use the FSM to obtain any equivalent value within the system. Feces, like ordinary currency, is limited and precious; nobody can produce more than a certain amount, and it can be converted to energy.
Furthermore, everyone can make feces every day. Whenever we produce and use FSM, it will remind us of the bottom-line connection between FSM and human beings. Thus, FSM has meaning from the perspectives both of the economy and the human mind.
FSM can become “basic income” as long as we put feces into the reactors on a daily basis instead of continuing to flush the toilet. We don’t have to rank the feasibility of the concept of basic income against that of our present monetary system. We can use both money systems without conflict.
FSM is different from other types of credit, such as mileage, coupons, and online coins, because it is directly connected to our existence and free will—our intention to not use the flushable toilet.
Suggestions: An FSM system can be designed around an app or in other, similar ways. FSM can be used, along with the present currency, to buy such things as gas, coffee, and food, and to pay for various enjoyable activities. The value of the energy or other products derived from the feces at a designated time or date can be distributed to the system’s participants. (Of course, since FSM can’t provide everything we need, conventional money also has to be used.) The proposed money system will also require the development of various technologies, such as biological processes for energy production, and of new industries as well, such as the manufacture of bathroom appliances that convert feces into powder.
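One purely illustrative way such an app could keep score, sketched under assumptions that are not part of the proposal (the record structure, the proportional-split rule, and every name below are hypothetical):

```python
# Illustrative sketch only: credit FSM to participants in proportion to the
# powdered feces they deposit, based on the FSM-denominated value of the energy
# the community reactor produces that day. Nothing here is specified by the essay.
from collections import defaultdict

deposits_kg = defaultdict(float)   # participant -> powder deposited today (kg)

def record_deposit(participant: str, kilograms: float) -> None:
    deposits_kg[participant] += kilograms

def distribute_fsm(energy_value_fsm: float) -> dict[str, float]:
    """Split the day's energy value, denominated in FSM, across depositors."""
    total = sum(deposits_kg.values())
    if total == 0:
        return {}
    return {who: energy_value_fsm * kg / total for who, kg in deposits_kg.items()}

record_deposit("alice", 0.25)
record_deposit("bob", 0.25)
print(distribute_fsm(energy_value_fsm=10.0))   # {'alice': 5.0, 'bob': 5.0}
```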
The Ironies of Higher Arithmetic
Jim Holt
Philosopher and essayist; author, Why Does the World Exist?
The abc conjecture, first proposed in 1985, asserts a surprising connection between the addition and multiplication of whole numbers. (The name comes from that amiable equation, a + b = c.) It appears to be one of the deepest and most far-reaching unresolved conjectures in mathematics, intimately tied up with Roth’s theorem, the Mordell conjecture, and the generalized Szpiro conjecture.
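Stated precisely: for every ε > 0, only finitely many triples (a, b, c) of coprime positive integers with a + b = c satisfy

$$ c > \operatorname{rad}(abc)^{1+\varepsilon}, $$

where rad(n) denotes the product of the distinct primes dividing n.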
In 2012, Shinichi Mochizuki of Kyoto University claimed to have proved the abc conjecture—potentially a stunning advance in higher mathematics. But is the proof sound? No one has a clue. Near the end of last year, some of the world’s leading experts on number theory convened in Oxford to sort abc out. They failed.
Mochizuki’s would-be proof of the abc conjecture uses a formalism he calls “inter-universal Teichmüller theory” (IUT), which features highly symmetric algebraic structures dubbed Frobenioids. At first, the problem was that no one (except, we must suppose, its creator) could understand this new and transcendently abstract formalism. Nor could anyone see how it might bear on abc. By the time of the Oxford gathering, however, three mathematicians—two of them colleagues of Mochizuki’s at Kyoto, the third from Purdue in the U.S.—had seen the light. But when they tried to explain IUT and Frobenioids, their peers had no idea what they were talking about—“indigestible,” one of the participants called their lectures.
In principle, checking a proof in mathematics shouldn’t require any intelligence or insight. It’s something a machine could do. In practice, though, a mathematician never writes out the sort of austerely detailed “formal” proof that a computer might check. Life is too short. Instead, she offers a more or less elaborate argument that such a formal proof exists—an argument that, she hopes, will persuade her peers.
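For a sense of what such a machine-checkable proof looks like, here is a trivial illustration in the Lean proof assistant (illustrative only; any proof assistant would serve):

```lean
-- Every step is an explicit inference the computer verifies mechanically;
-- no insight is needed to check it, only to produce it.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```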
With IUT and abc, this business of persuasion has got off to a shaky start. So far, converts to the church of Mochizuki seem incapable of sharing their newfound enlightenment with the uninitiated. The abc conjecture remains a conjecture, not a theorem. That might change this summer, when number theorists plan to reconvene, this time in Kyoto, to struggle anew with the alleged proof.
So what’s the news? It’s that mathematics—which in my cynical moods I tend to regard as little more than a giant tautology, one that would be as boring to a transhuman intelligence as tic-tac-toe is to us—is really something weirder, messier, more fallible, and far more noble.
And that “Frobenioids” is available as a name for a Brooklyn indie band.
Broke People Ignoring $20 Bills on the Sidewalk
Michael Vassar
Cofounder and chief science officer, MetaMed Research
It’s not every year that Edge echoes South Park. I guess everyone’s trying to figure out what’s real right now. The 2015 South Park season revolved around people losing the ability to distinguish between news and advertising. One day they wake up broke, at war, and unable to easily distinguish friend from foe. News, as a concept, is gone. Science, as a concept, is gone. In information warfare, the assumption that reliable, low-context communication is even possible recedes into fantasy, taking with it both news and science and replacing them with politics and marketing. I think the real news, viewed from behind the new extra-strength Veil of Maya, is what you think you see with your own eyes and have checked out against the analytic parts of the scholarly literature. Here’s what I’ve got.
Last November, I was visiting Universidad Francisco Marroquín, in Guatemala, which is known primarily for being the most libertarian university in the world. While there, my co-conspirator found, on the floor of the Economics Department, local currency worth almost exactly US$20. Technically, that money wasn’t visible from the sidewalk, but the signs announcing gasoline prices clearly are. In the last few years, I have observed those prices decoupling both from the price of oil and from one another, whether across town or even across the street. Growing up, I always noticed whether the gas prices on opposite sides of the road differed by 1, 2, or sometimes even 3 cents. Today, such prices typically differ by more than $.20. I recently saw two stations across the street from one another with prices differing by $.36/gallon, and two stations a mile apart charging $2.49/gallon and $3.86/gallon respectively. For a median American driver, $.20/gallon, invested at historically normal rates of return, would add up to about $1,500 over the next decade. Median retirement savings for families aged 55–64 are only $15,000, and for families with retirement accounts, median savings are still only $150,000.
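A back-of-the-envelope check of that figure, with the driving volume and rate of return treated as assumptions rather than taken from the essay:

```python
# Rough check of the "$.20/gallon adds up to about $1,500 over a decade" claim.
# The gallons-per-year and rate-of-return figures are assumptions, not from the essay.
GALLONS_PER_YEAR = 550      # approximate median U.S. driver (assumed)
SAVED_PER_GALLON = 0.20     # dollars saved by choosing the cheaper station
ANNUAL_RETURN = 0.07        # "historically normal" rate of return (assumed)
YEARS = 10

balance = 0.0
for _ in range(YEARS):
    balance = (balance + GALLONS_PER_YEAR * SAVED_PER_GALLON) * (1 + ANNUAL_RETURN)

print(f"Value after {YEARS} years: ${balance:,.0f}")   # roughly $1,600, near the essay's figure
```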
I’m OK with people not behaving like Homo economicus, but if broke people are becoming less economically rational with time, it suggests that people don’t feel they can predict the future in basic respects, and that they aren’t relying on savings to provide for their basic needs. Who can blame them for financial recklessness? Theory, as well as practice, tells us that their leaders have been setting an ever worse example for generations.
Financial economics provides the analytic literature on economic caution and risk. In 1987, Larry Summers and Brad DeLong showed that given a risk premium (a standard assumption in financial economics), irrational noise traders crowd out rational actors over time (“The Economic Consequences of Noise Traders”). When Peter Thiel talks about the shift from “definite optimism” to “indefinite optimism,” he’s characterizing the pattern selected for by this dynamic. This shift toward noise trading inflates equity prices, concentrates wealth, and causes more speculative assets to command higher prices each decade than similarly speculative assets would have commanded in the previous decade. Some 84 percent of corporate valuations now take the form of intangibles, up from 16 percent forty years ago; that sounds like the world I see around me. The overall divergence of map from territory in economic settings ultimately means the annihilation of strategy as we know it for most people, making economic prudence a predictably losing strategy. In the long run, unless we can figure out a better way to aggregate local economic information, we won’t have the patience to use that information effectively.
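As a toy illustration of that crowding-out dynamic (a deliberately simplified sketch, not the Summers and DeLong model; every parameter below is assumed):

```python
import random

# Toy illustration: if risky assets carry a premium, agents who overweight risk
# tend to grow their wealth share over time, so cautious capital is crowded out
# in expectation. Parameters are assumptions for illustration only.
random.seed(0)
YEARS = 50
RISK_PREMIUM, VOLATILITY, SAFE_RATE = 0.04, 0.18, 0.02

cautious, aggressive = 1.0, 1.0               # starting wealth of each group
CAUTIOUS_WEIGHT, AGGRESSIVE_WEIGHT = 0.3, 1.0  # fraction held in the risky asset (assumed)

for _ in range(YEARS):
    risky_return = SAFE_RATE + RISK_PREMIUM + random.gauss(0, VOLATILITY)
    cautious *= 1 + CAUTIOUS_WEIGHT * risky_return + (1 - CAUTIOUS_WEIGHT) * SAFE_RATE
    aggressive *= 1 + AGGRESSIVE_WEIGHT * risky_return + (1 - AGGRESSIVE_WEIGHT) * SAFE_RATE

print(f"cautious wealth share after {YEARS} years: {cautious / (cautious + aggressive):.2f}")
```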
We Fear the Wrong Things
David G. Myers
Professor of psychology, Hope College; author, Intuition: Its Powers and Perils
If we knew that AK-47-wielding terrorists would kill 1,000 people in the U.S. in 2016, a thinkable possibility, then we should be afraid—albeit only one-tenth as afraid as we should be of other homicidal gun violence (which kills more than 10,000 Americans a year), and one-twentieth as afraid as we should be of riding in a motor vehicle, where 22,000 Americans die annually. Yet several recent surveys show us to be much less fearful of the greater everyday threats than of the dreaded horror. The hijacking of our rationality by fears of terrorist guns highlights an important and enduring piece of scientific news: We often fear the wrong things.
Shortly after 9/11, when America was besieged by fear, I offered a calculation: If we now flew 20 percent less and instead drove half those unflown miles, then, given the greater safety of scheduled airline flights, we could expect about 800 more people to die on our roads. Why do we fear flying, when, for most of us, the most dangerous part of our trip is the drive to the airport? Why do terrorist fears so effectively inflate our stereotypes of Muslims, inflame “us vs. them” thinking, and make many of those of us who are Christians forget the ethics of Jesus (“I was a stranger and you welcomed me.”)?
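A rough reconstruction of that calculation, with the inputs stated as ballpark assumptions rather than drawn from the essay:

```python
# Rough reconstruction of the post-9/11 estimate; the inputs are early-2000s
# ballpark assumptions, not figures from the essay.
AIR_PASSENGER_MILES = 500e9      # annual U.S. scheduled-airline passenger-miles (assumed)
SHIFTED_FRACTION = 0.20 * 0.5    # fly 20% less, drive half of those unflown miles
ROAD_DEATHS_PER_MILE = 1.5e-8    # ~1.5 deaths per 100 million miles driven (assumed)

extra_miles_driven = AIR_PASSENGER_MILES * SHIFTED_FRACTION
extra_road_deaths = extra_miles_driven * ROAD_DEATHS_PER_MILE
print(f"Expected additional road deaths: about {extra_road_deaths:.0f}")  # ~750, near the essay's 800
```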
Underlying our exaggerated fears is the “availability heuristic”: We fear what’s readily available in memory. Vivid, cognitively available images—a horrific air crash, a mass slaughter—distort our judgments of risk. Thus we remember—and fear—disasters (tornadoes, plane crashes, attacks) that kill people dramatically, in bunches, whereas we fear too little the threats that claim lives one by one. Bill Gates once observed that we hardly notice the half-million children quietly dying each year from rotavirus—the equivalent of four 747s full of children every day. And we discount the future (and its future weapon of mass destruction, climate change). If only such deaths were more dramatic and immediate. Imagine (to adapt one mathematician’s suggestion) that cigarettes were harmless—except, once in every 25,000 packs, for a single cigarette filled with dynamite. Not such a bad risk of having your head blown off. But with 250 million packs a day consumed worldwide, we could expect more than 10,000 gruesome daily deaths (the approximate actual toll of cigarette smoking)—surely enough to have cigarettes banned.
News-fed images can make us excessively fearful of infinitesimal risks. And so we spend an estimated $500 million on anti-terrorism security per U.S. terrorist death but only $10,000 on cancer research per cancer death. As one risk expert explained, “If it’s in the news, don’t worry about it. The very definition of news is ‘something that hardly ever happens.’”
It’s entirely normal to fear violence from those who despise us. But it’s also smart to be mindful of the realities of how most people die, lest the terrorists successfully manipulate our politics. With death on their minds, people exhibit “terror management.” They respond to death reminders by derogating those who challenge their worldviews. Before the 2004 election, reported one research team, reminders of 9/11 shifted people’s sympathies toward conservative politicians and antiterrorism policies.
Media researcher George Gerbner’s cautionary words to a 1981 congressional subcommittee ring true today: “Fearful people are more dependent, more easily manipulated and controlled, more susceptible to deceptively simple, strong, tough measures and hard-line postures.”
Ergo, we too often fear the wrong things. And it matters.
Living in Terror of Terrorism
Gerd Gigerenzer
Psychologist; director, Center for Adaptive Behavior and Cognition, Max Planck Institute for Human Development, Berlin; author, Risk Savvy
Terrorism has indeed caused a huge death toll in countries such as Afghanistan, Syria, and Nigeria. But in Europe or North America, a terrorist attack is not what will likely kill you. In a typical year, more Americans die from lightning than from terrorism. A great many more die from second-hand smoke and “regular” gun violence. More likely still, Americans can expect to lose their lives to preventable medical errors in hospitals, even the best of them. The estimated number of unnecessary deaths has soared from 98,000 in 1999 to 440,000 annually, according to a recent study in the Journal of Patient Safety.
Why are we scared of what most likely won’t kill us? Psychology provides an answer. It is called “fear of dread risks.” This fear is elicited by a situation in which many people die within a short time. Note that the fear is not about dying but about suddenly dying together with many others at one point in time. When as many—or more—people die distributed over the year, whether from gun violence, motorcycle accidents, or in hospital beds, it’s hard to conjure up anxiety.
For that reason, terrorists strike twice: first with physical force, and second by capitalizing on our propensity for dread-risk fear. After 9/11, many Americans avoided flying and used their cars instead. As a consequence, some 1,600 people died in the resulting traffic accidents, more than the total number of individuals killed aboard the four hijacked planes. That can be called Osama bin Laden’s second strike. All those people could still be alive if they had flown instead of driven, given that there was no deadly accident on commercial airline flights in the U.S. for a number of years thereafter.
Although billions have been poured into Homeland Security and similar institutions to prevent the first strike of terrorists, almost no funding has been provided to prevent the second strike. I believe that making the public psychologically aware of how terrorists exploit our fears could save more lives than NSA Big Data analytics. It could also open people’s eyes to the fact that some politicians and other interest groups work on keeping our dread-risk fear aflame to nudge us into accepting personal surveillance and restriction of our democratic liberties. Living in terror of terrorism can be more dangerous than terrorism itself.
The State of the World Isn’t As Bad As You Think
Steven R. Quartz
Neuroscientist; professor of philosophy, Caltech; co-author (with Anette Asp), Cool
If you find yourself at a cocktail party searching for a conversation starter, I’d recommend the opening line of a recent Bill and Melinda Gates Annual Letter: “By almost any measure, the world is better than it has ever been.” Although people will react with incredulity at the very possibility that things could be getting better, they’ll welcome the opportunity to straighten you out. Be prepared for the recitation of the daily headlines—bad news piled on top of worse—that will inevitably follow. Virtually everyone I’ve mentioned this quote to is sure it’s wrong.
For example, about two-thirds of Americans believe that the number of people living in extreme poverty has doubled in the last twenty years. People point to conflicts in the Middle East as evidence of a world in chaos, the retreat of democracy, plummeting human rights, and a global decline of well-being. Yet the news from social science, drawn from the accumulation of large-scale longitudinal data sets, belies this declinist worldview.
In reality, extreme poverty has nearly halved in the last twenty years—about a billion people have escaped it. Measures of material well-being—income, infant mortality, life expectancy, educational access (particularly for females)—have improved at their greatest pace in the last few decades. The number of democracies among the developing nations has tripled since the 1980s, while the number of people killed in armed conflicts has decreased by 75 percent. This isn’t the place to delve into the details of how large-scale statistical data sets, increasingly representative of the world’s population, provide a more accurate, though deeply counterintuitive, assessment of the state of the world (for that, see Steven Pinker’s The Better Angels of Our Nature).
Instead, I want to suggest three reasons why it’s such important scientific news. First, although these long-term trends may not resuscitate an old-fashioned notion of progress—certainly not one suggesting that history possesses intrinsic directionality—they do call out for a better understanding (and recognition) of the technological and cultural dynamics driving long-term patterns of historical change. What’s even more interesting is their stark demonstration of how deeply our cognitive and emotional biases distort our worldview. In particular, there’s good evidence that we don’t remember the past as it was. Instead, we systematically edit it, typically omitting the bad and highlighting the good, leading to cognitive biases such as “rosy retrospection.”
At the cultural level, these biases make us vulnerable to declinist narratives. From Pope Francis’s anti-modernist encyclical to capitalism’s inevitable death by internal contradictions and moral decline, declinist narratives intuitively resonate with our cognitive biases. They are thus an easy sell and cause us to lose sight of the fact that until a few centuries ago the world’s population was stuck in abject poverty, a subsistence-level Malthusian trap of dreary cycles of population growth and famine.
In reality, not only has material well-being increased around the globe but global inequality is also decreasing, as a result of technological and cultural innovations driving globalization. We should be particularly on guard against declinist narratives that also trigger our emotional biases, especially those hijacking the brain’s low-level threat-detection circuits. These alarmist narratives identify an immediate or imminent threat, a harbinger of decline, which unconsciously triggers the amygdala and initiates a cascade of brain chemicals—norepinephrine, acetylcholine, dopamine, serotonin, and the hormones adrenalin and cortisol—creating primal, visceral feelings of dread and locking our attention to that narrative, effectively shutting down rational appraisal.