At Ambit, we spend a lot of time reading articles covering a wide gamut of topics, ranging from the zeitgeist to the futuristic, and encapsulating them in our weekly ‘Ten Interesting Things’ product. Some of the most fascinating topics covered this week are: Capitalism (Do billionaires destroy democracy?), Science (The antiscience movement is going global; Will my cat eat me if I die?), Technology (Why bitcoin won’t become like money) and Book Review (Kazuo Ishiguro sees what the future is doing to us). Here are the ten most interesting pieces that we read this week, ended April 2, 2021.
1) Are billionaires really destroying democracy and capitalism?
Everywhere in the world, majorities think that billionaires need to be taxed heavily. But is it the right thing to do? This article shows how billionaires can be a force for good, especially if their resources were used for the common good. There are three ways this can happen: 1) The billionaires suddenly realize that the common good of the planet is an important goal for them, so they join forces to save humanity and nature; 2) Our governments make billionaires pay their “fair share” of taxes, or; 3) Society decides that billionaires shouldn’t exist, and our governments simply tax them out of existence.
We should start by asking where billionaires come from. Do they just spring out of the fertile digital soil of Silicon Valley? Can anyone – with enough hard work – become a billionaire? The task should be to create better education and better skills in people so that they can create value for others and get paid for the value they create. In 1996 there were 423 billionaires spotted in the wild. In 2019, that number rose to 2,153. Billionaires constitute just 0.00003% of the world population, but they currently own the equivalent of 12% of the gross world product (GWP) and a much larger percentage of the total wealth of the world.
Instead of pushing billionaires away, we need them to step forward the way Bill and Melinda Gates and Warren Buffett have. They signed the Giving Pledge to make this world a better place to live. The Giving Pledge is a commitment by the world's wealthiest individuals and families to dedicate the majority of their wealth to giving back. Imagine how many low-income people would benefit if all 2,000-plus billionaires, not just 211, implemented such a pledge. If you’re a billionaire, ask yourself: what kind of world do I want to leave to my children? The rest of us should ask this question too. Our time as humans on this planet is almost up. Who will lead us?
2) Antiscience is killing thousands globally
[Source: Scientific American]
Antiscience has emerged as a dominant and highly lethal force, one that threatens global security as much as terrorism and nuclear proliferation do. We must mount a counteroffensive and build new infrastructure to combat antiscience, just as we have for these other, more widely recognized and established threats. Antiscience is the rejection of mainstream scientific views and methods, or their replacement with unproven or deliberately misleading theories, often for nefarious political gain. It targets prominent scientists and attempts to discredit them.
Beginning in the spring of 2020, the Trump White House launched a coordinated disinformation campaign that dismissed the severity of the epidemic in the United States, attributed COVID deaths to other causes, claimed hospital admissions were due to a catch-up in elective surgeries, and asserted that ultimately the epidemic would spontaneously evaporate. It also promoted hydroxychloroquine as a spectacular cure, while downplaying the importance of masks. Other authoritarian or populist regimes in Brazil, Mexico, Nicaragua, the Philippines and Tanzania adopted some or all of these elements. In the summer of 2020, the language of the antiscience political right in America was front and center at antimask and antivaccine rallies in Berlin, London and Paris. At the Berlin rally, news outlets reported ties to QAnon and extremist groups.
Adding to this toxic mix are emerging reports from U.S. and British intelligence that the Putin-led Russian government is working to destabilize democracies through elaborate programs of COVID-19 antivaccine and antiscience disinformation. Public refusal of COVID-19 vaccines now extends to India, Brazil, South Africa and many low- and middle-income countries. Thousands of deaths have so far resulted from antiscience, and this may only be the beginning as we are now seeing the impact of vaccine refusal across the U.S., Europe and the low- and middle-income countries of Africa, Asia and Latin America. Containing antiscience will require work and an interdisciplinary approach. The stakes are high given the death toll that is already accelerating from the one-two punch of SARS-CoV-2 and antiscience. Antiscience is now a large and formidable security issue.
3) A deep history of work, from the stone age to the age of robots
In this article, James Suzman, a social anthropologist based in Cambridge, England, shares five key insights from his new book, Work: A Deep History, from the Stone Age to the Age of Robots.
I. We work a lot harder than our hunter-gatherer ancestors:
Nearly a century ago, the economist John Maynard Keynes predicted that by 2030, our workweek would be only 15 hours long. What happened? We’ve crossed all the technological thresholds Keynes identified, so why aren’t we living in the economic promised land? Well, if Keynes were here today, he’d probably blame our unshakeable instinct to work. He believed that human beings are cursed: we have infinite desires, but there aren’t enough resources to satisfy them. As a result, everything is, by definition, scarce. Today, economists refer to this paradox as the “fundamental economic problem,” and they believe it explains our constant will to work.
II. All living organisms are born to work:
Every living organism works—it seeks out and captures energy so it can grow, reproduce, and capture even more energy. But humans are versatile—we can learn skills, develop tools, and deploy different tactics to secure our energy needs. Also, the possible motivations are endless, and they all play out simultaneously.
III. We’re all farmers at heart:
The idea that hard work is a virtue and idleness a vice can be traced back to the agricultural revolution some 10,000 years ago. The practical demands of making a living from the soil upended the existing equation between effort and reward. Hunter-gatherers had relished immediate rewards—slaughter an animal, chow down right away. Farmers, on the other hand, developed “delayed return economies.” They invested their labor in the land for the promise of a reward in the future. This, of course, is the basis of our economy today.
IV. Country work ain’t city work:
Even in the most sophisticated agricultural civilizations, like ancient Rome, four out of five people still lived in the countryside and worked the land. But the urbanites who managed to unshackle themselves from the challenges of food production were able to pioneer new ways of living and working. They invented a wide range of professions, setting up shop as lawyers and scribes, secretaries and accountants, poets and prostitutes. And these weren’t just careers—they were social identities.
V. Changes to work today are as profound as the agricultural revolution:
Our economic norms and institutions, not to mention our work ethic, evolved in an age when scarcity was real and visceral, an age when people made a living from the land because food was their principal source of energy. But things have changed. Our productivity has surged, courtesy of improvements in technology and our routine exploitation of fossil fuels.
4) Book review: The Great Transition: Climate, Disease and Society in the Late-Medieval World, by Bruce Campbell
Bruce Campbell’s The Great Transition chronicles an important and gloomy historical moment. The two centuries between the 1260s and 1470s witnessed the collapse of international networks of exchange, multiple wars, economic contraction, repeated famines, and demographic decline. The single most profound event was what is still considered the most devastating pandemic of human history: the Black Death of the middle of the fourteenth century. Campbell’s book has twelve tables, seventy-eight figures, most of them graphs, and a bibliography running forty-six pages. Campbell has always favored data-heavy analyses; his many decades of study on English agriculture were based on massive compilations of data on crop yields, and he has recently coauthored a comprehensive survey of the British economy from the thirteenth to nineteenth centuries.
The issue of how one tells time with genetic evidence is crucial. In the one map Campbell gives showing the path of the Black Death, he overlays data from different sources. This is usually a powerful technique by which to show the larger patterns in tree-ring data or climate and crop data. Here, however, Campbell is collapsing two kinds of evidence for the plague’s geography. Plague is not primarily a human disease. We should expect gaps in our documentary evidence, because outbreaks passing solely through wild animal populations would be unlikely to elicit human notice. Campbell is right that the fluctuating climate of the thirteenth and fourteenth centuries could have contributed to epizootics of plague. But there were already multiple plague strains in existence by that point, more than we can document now, and they were already scattered in various niches across the terrain of central Asia.
Campbell does not use the term Anthropocene in The Great Transition. Nevertheless, our own very modern dilemma haunts this book, for we, too, live with the specter of a Great Transition. If sudden climatic shifts could cause such devastation in the fourteenth century, when cities were smaller, travel was slower, and human contamination of air, water, and land was minimal, what hope do we have for today, when germs can travel around the world in less than two days, and more than half the world’s population of seven billion people live in cities? Does it matter whether our climate change is human induced? We are no more able to undo climate change than the people of the fourteenth century were.
5) Review: The enigma of reason
Scott H. Young, author of Ultralearning, writes in this blog post about his favourite book of last year, Dan Sperber and Hugo Mercier’s The Enigma of Reason. He says the basic puzzle is: I. If reason is so useful, why do human beings seem to be the only animals to possess it? II. If reason is so powerful, why are we so bad at it? Why do we have tons of cognitive biases? The answer to both of these puzzles, which has far-reaching implications for how we think and make decisions, is that we’ve misunderstood what reason actually is. The classic view of reason is that it is simply better thinking. Reasoned thinking is better than unreasoned thinking.
Sperber and Mercier argue that reason is actually a very specialized cognitive adaptation. Other animals do not possess reason because they don’t have the unique environment human beings exist in, and thus never needed to evolve the adaptation. According to Sperber and Mercier, the purpose of reason, as a faculty, is to generate and evaluate reasons. A few of the most important implications of this theory, if it is true, are: I. Reasoning isn’t a big part of intelligence or (potentially) consciousness; II. It’s possible to make smart decisions without being able to give reasons for them; III. We are smarter when we argue than when we think alone; IV. Feedback loops may explain the role of classical reason; and V. You will reason better if reasons are harder to provide.
In the end, our minds are not separated into a war between a ruling, but often frail and feeble, reason and a willful unconscious. Instead, the mind is split among many, many different unconscious processes, each with its own domain and specialized function, with reason standing alongside them. In one sense, this is a demotion of reason, from being a godlike faculty that separates us from animals to being just one of many tools in our mental toolkit. But in another sense, this is a restoration of reason: instead of appearing to be a sloppy, feeble and poorly functioning faculty, reason does exactly what it was designed to do, and it does so very well.
6) Reimagining the laptop for a work-from-home era
[Source: The Wall Street Journal]
Almost all of us have been working remotely since the pandemic struck last year. And remote working is here to stay. A lot of focus has turned toward the key work-from-home technology tool: the laptop. But relying so heavily on the laptop has raised all sorts of issues—from camera and sound quality to security and privacy. Here’s what a variety of experts had to say about the future of remote working.
I. Improve the way we look on camera:
The better we get at videoconferencing, the more we notice bad videoconferencing and poor camera angles. Innovation in software will make us all look better on camera.
II. Better wireless connections:
For most people the main form of connectivity for their laptop is wireless. Various forms of wireless connectivity are being substantially improved. The latest generation of Wi-Fi (Wi-Fi 6) hit the market in 2019. Looking forward, work has already begun on Wi-Fi 7. Each of these brings further performance improvements in speed and latency.
III. More, and better, screens:
Screen sizes of individual devices are unlikely to get bigger, but the total amount of screen real estate will increase. People will prefer using multiple monitors for better multitasking—to access other applications while videoconferencing, for example.
IV. Nix the noise:
All sorts of audio issues arise with work-from-home use of laptops: roommates quarreling, dogs barking, and so on. But algorithms on laptops will soon be able to separate out background noises, and do so fast enough that the disturbances get continuously filtered out before leaving the laptop.
V. Sharpening the background:
Virtual backgrounds are on their way to being a necessary part of the online meeting experience. For a host, dynamic background images can go a long way in differentiating the 10 a.m. Zoom meeting from the 3:30 p.m. one.
VI. Security inside and outside the laptop:
Working from home creates a number of security concerns for companies, which will lead to enhancements for laptops that you can and can’t see. Similarly, laptop manufacturers may adopt facial recognition or other biometric unlocking software similar to what we have grown accustomed to on cellphones.
VII. Keeping the laptop safe (from kids and others):
Individuals and companies are focusing on how to protect work laptops now that they are being used more often from home. There are a number of basic security-hygiene rules that can be put in place to protect a device. These include screen-lock timers, so kids can’t access a device when the employee has left the room.
7) Bitcoin: Why governments will continue their monopoly over money
The entire idea behind creating bitcoin was to give the world an alternative to the paper, or fiat, money system. The paper money system is run by central banks and governments; they can manipulate it at will. In the aftermath of the financial crisis of September 2008, Western central banks printed trillions of units of paper money in the hope of getting their economies back on track. Bitcoin’s pseudonymous creator, Satoshi Nakamoto, saw this as an abuse of the trust people had in paper money. Bitcoin was supposed to be a solution to this breach of trust: a cryptocurrency that did not use banks or any third party as a medium, and whose code has been written in such a way that only 21 million units can ever be created.
Also, unlike the paper money system, which is ultimately run by individuals, the bitcoin system is decentralised and has no owner. In fact, these are the main reasons offered by those who believe bitcoin is money, or at least the future of money. But bitcoin has its own share of issues. Any form of money needs to have a relatively stable value. Between 21 January and 16 February, the price of a bitcoin went up 59.4% to $49,225. This made bitcoin investors wealthier. Nevertheless, the question one needs to ask here is what a huge increase in the value of any form of money actually means. It means that the prices of the goods and services that money can buy have fallen massively.
Due to the overall limit of 21 million, bitcoin is often compared to gold, the argument being that, like gold, bitcoin cannot be created out of thin air. This is true, but it comes with a corollary. While the supply of bitcoin is limited, the same cannot be said about the supply of cryptocurrency as a whole. Bitcoin is the most popular cryptocurrency, but it’s not the only game in town. Hence, the number of bitcoins may not go up, but the number of bitcoin-like assets will continue to go up in the years to come, as newer cryptocurrencies get launched. It’s one thing to have competition in soaps and mobile phones; it’s another thing to have different forms of money compete. Also, bitcoin and cryptocurrencies are attracting the attention of governments. Many countries are setting up committees to look into the matter.
8) Why finance gurus switched their bait from millions to thousands of dollars
[Source: The New York Times Magazine]
The Southern California real estate broker Kevin Paffrath uploaded a video to his “Meet Kevin” YouTube channel, updating viewers on the status of the stimulus: “Mark your calendar, there’s a big day coming!” He put it up on Jan. 9, with the dream of $2,000 stimulus checks not yet deflated. This video would be just one of dozens about potential stimulus packages posted that day, even that evening — many of them from finance influencers like Paffrath, whose pitches normally involve real estate, stocks or airline points. A year ago, they were promising to share their proprietary secrets for achieving wealth, staging monologues in the driver’s seats of luxury cars and poolside on cruise ships.
A CNBC profile reported that Paffrath actually makes most of his money not from the industry he built his status on, not from investing or even from buying rental properties, but via his audience itself, from his YouTube channel’s advertising revenue and affiliate programs. Stimulus-check updates began doubling Paffrath’s other videos in view counts; one update became the most popular video on his channel, with 1.1 million views. Viewer demand didn’t come from upward-bound entrepreneurs after all, it seemed, but rather from those enduring the kind of precarity where the precise timing of a $2,000 deposit could mean keeping the lights on or the difference between housing and eviction. Every hour, a glut of new videos provided the latest on whether relief was coming and how many dollars of it were likely to arrive. Paffrath typically uploaded two videos each day.
In the days leading up to the relief bill becoming law, Paffrath’s stimulus content remained his most popular product; soon he was posting videos calming those members of his audience for whom the $1,400 deposit had not yet arrived. Can the path forward for someone like Paffrath really lead back to making videos from the driver’s seat of a Tesla, promising to make viewers rich? Or will what he has seen during this stint — months of tending to a public desperate for news of a couple thousand dollars — open his eyes to the possibility of being just another rich person hustling the poor?
9) Kazuo Ishiguro sees what the future is doing to us
[Source: The New York Times Magazine]
Ishiguro’s new book, “Klara and the Sun,” is his first since winning the Nobel Prize in Literature in 2017. The novel is set in a near-future America, where the social divisions of the present have only widened and liberal-humanist values appear to be in terminal retreat. The book addresses itself to an urgent but neglected set of questions arising from a paradigm shift in human self-conception. If it one day becomes possible to replicate consciousness in a machine, will it still make sense to speak of an irreducible self, or will our ideas about our own exceptionalism go the way of the transistor radio?
Ishiguro is not the kind of writer who takes dictation from his characters. He has never been able to sit down at his desk and improvise, to launch into a novel from a standing start. He is a planner, patient and meticulous. Before he begins the writing proper, he will spend years in a sort of open-ended conversation with himself, jotting down ideas about tone, setting, point of view, motivation, the ins and outs of the world he is trying to build. Only once he has drawn up detailed blueprints for the entire novel does he set about the business of composing actual sentences and paragraphs.
In this, too, he follows a set of carefully honed procedures. First, writing very quickly and without pausing to make revisions, he’ll draft a chapter in longhand. He then reads it through, dividing the text into numbered sections. “Klara and the Sun” isn’t Ishiguro’s finest novel, but it provides a vision of where we are headed if we fail to move beyond this constraining view of freedom. What’s most unsettling about the future it imagines isn’t that machines like Klara are coming more and more to resemble human beings; it’s that human beings are coming more and more to resemble machines.
10) Will my cat eat me if I die? Science holds the delicious answer
This article answers four questions about cat aggression, our feline relationships, and ultimately, consumption: 1) Why does my cat bite me? Cats are hunters, and their strong bite comes in handy — whether it’s gripping a dead mouse or nibbling on your finger. Vanessa Spano is an associate veterinarian at Behavior Vets NYC and a resident at the American College of Veterinary Behaviorists. Many cat owners come to veterinarians with perfectly natural concerns about their cats’ biting, Spano says. However, while being bitten is never fun, it’s not exactly a sign of your cat’s hidden desire to consume your flesh. Instead, your cat is probably just playing with you, Spano says. This biting behavior, known as “mouthing,” mimics the play of young kittens.
2) Is it normal to wonder if I’ll be eaten by a cat? It’s worth unpacking the psychology behind this question. What does it tell us about the tenuous relationships that humans have with their pets? “There is certainly an ethological, biological rationale behind a human questioning another animal's desire to eat him [or] her,” Spano says. She does admit the fear is not irrational. Domestic pets can, on rare occasions, become aggressive towards their human housemates. But it’s far more likely that a cat would chow down on wild prey rather than its human companions, Spano says. 3) Would a cat eat its owner? Mikel Delgado, a cat expert at Feline Minds, says cats won’t typically chow down on their living owners. But imagine this scenario: a human dies, leaving their cat without food for days or even weeks. You can predict how the scenario might naturally unfold.
4) What type of cat is most likely to eat its owner? “Feral cats are often used to hunting and finding their own food, but their dietary needs are basically the same,” Delgado says. “If a cat is starving, there is no reason to think they would not eat available meat, even if that was human flesh,” Delgado says. Melissa Connor, a co-author of a 2020 study on feral cat scavenging and director of the Forensic Investigation Station at Colorado Mesa University, ultimately agrees with Delgado. “Feral and domestic cats do seem to have different scavenging patterns,” Connor says. However, these different eating habits aren’t the result of domestic cats forming bonds with humans. Instead, the habits relate to the “condition of the body at the time of scavenging” or the cat’s experience “in consumption of whole animal carcasses.”