At Ambit, we spend a lot of time reading articles that cover a wide gamut of topics, including investment analysis, psychology, science, technology, philosophy, etc. We have been sharing our favourite reads with clients under our weekly ‘Ten Interesting Things’ product. Some of the most interesting topics covered in this week’s iteration are related to ‘Big tech and barter’, ‘San Francisco’s seismic gamble’, and ‘brain similarities in best friends’.
Here are the ten most interesting pieces that we read in the week ended April 27, 2018.
1) China’s Ant Financial shows cashless is king [Source: Financial Times] Ant Financial, the Alibaba financial arm that grew out of the original Alipay business, is preparing to launch a $9bn private fundraising round ahead of its much anticipated IPO. That fundraising is expected to give Ant Financial a valuation of around $150bn, making it the world’s most highly valued unlisted technology group. To put it into perspective, that means that investors believe Ant Financial is worth 50% more than Goldman Sachs. Alongside showing that a generation of new Chinese technology companies have the scale, expertise and firepower to compete with US rivals, this news also demonstrates the way that China, the second-biggest economy in the world, has embraced the shift towards a cashless society more enthusiastically than Europe or the US. “These companies are like Facebook if it had a bank on top of it and everyone had a bank account [with Facebook],” says Joe Ngai, managing partner for greater China at McKinsey. “There is really nothing like this in the west.”
Ant Financial is now one of the backbones of consumption in China. Together with its rival Tencent, the two companies account for more than 90% of the country’s $16tn annual mobile payments market. The company benefits by being part of the broader Alibaba empire, which is blazing its own trail in e-commerce. The e-commerce sales are conducted on Alibaba’s shopping platform, but the payments for most of the purchases are made through Ant Financial. Analysts describe Alipay as the glue that holds together Alibaba’s online shopping empire. Alipay started as an online escrow service in 2004 that allowed merchants and shoppers to send and receive payments but, more importantly, it gave shoppers a means to claim back refunds if they felt cheated on a sale, providing a glimmer of confidence in a marketplace devoid of trust.
The evidence of Ant’s business and that of Tencent is visible on nearly every street corner in cities around China. In Beijing, Shanghai or Shenzhen, young people seldom use cash to pay for coffee, fast food or even daily groceries. Instead they scan two-dimensional QR codes with their phones that process the payment via Alipay or Tencent’s WeChat Pay. Alongside its payments business, Yu’E Bao, Ant’s money market fund, highlights the breakthrough nature of some of its businesses. When it was launched in 2013, it allowed customers to move their balances on the payment platform — as little as Rmb1 ($0.16) — into a fund paying interest as high as 7%. Within a year it had taken in Rmb570bn in assets under management to become China’s largest money market fund, and last year it was crowned the largest in the world. Also, Sesame Credit, Ant’s credit scoring business, pulls data from users across its spectrum of businesses, including information about which users buy what, who repays debts and which clients save more than others. This data allows Ant Financial to understand its customers better and make more profitable lending decisions.
Alongside the good news, though, there have been concerns around securing regulatory approvals, which have led Ant to push back its planned IPO to late 2018 or 2019. For crucial payments infrastructure to be allowed to fall, even potentially, into the hands of foreign investors, Ant will need an explicit blessing from Beijing — something that has yet to materialise in the public domain. The fact that Temasek plans to invest in Ant in the new fundraising round suggests that the company could be closer to getting approval for having major foreign investors on its register. From an operational standpoint too, despite its dramatic growth within China, Ant Financial’s efforts to expand overseas have been much slower. While it has made significant investments in 11 overseas companies since 2015, its biggest overseas gambit collapsed in January when its $1.2bn bid for Dallas-based payments group MoneyGram International was blocked by US regulators. The buyout would have made Ant an important player in the US payments system. However, fears over a Chinese company controlling the personal financial data of US clients ultimately killed the agreement.

2) How Big Tech brought back the barter economy [Source: Financial Times] Seven years ago, after the financial crisis, anthropologist David Graeber published a provocative book, Debt: The First 5,000 Years, which challenged how economists think about debt, credit and barter. Graeber argued that economists tended to assume financial history had moved in a neat evolutionary line: first, so-called primitive people engaged in barter (swapping food for cloth, say); then they adopted money (think ancient gold coins); last, they embraced debt (aka modern banks, mortgages and credit cards). While this picture seems appealingly easy to understand, Graeber insisted that it was completely wrong. He pointed out that simple, ancient societies had complex systems of credit and barter that did not vanish when money appeared. To put it another way, history does not always move in one direction — barter, credit and money can, and do, coexist. It is an idea we urgently need to rediscover, but this time in relation to Big Tech. In recent weeks, there has been an uproar about the revelations that large tech companies such as Facebook and Google have been harvesting consumer data for commercial ends.
At first glance, this looks exploitative. But in exchange for giving up their data, consumers have received something — digital services such as messaging systems, maps, information and apps. Indignant techies love to point out that consumers have been given these services “for free”, since there is often no monetary payment involved; meanwhile, politicians (and consumer groups) complain that tech companies have taken consumer data “for free” too. Perhaps a better way to frame these transactions is to revive that ancient term “barter”. What consumers and tech companies have essentially been doing is bartering services for personal data — in the same way that hunter-gatherers might have bartered berries for meat. We might have thought that the 20th-century economy was built on money, but the early 21st-century cyber economy is partly based on barter too.
So then does this matter? An anthropologist might say no. But most policy makers, business leaders and consumers would beg to differ. For one thing, the exact nature of what was being bartered here — the sheer volume of data being hoovered up by tech companies and their reach into our most personal messages, preferences and political views — was arguably not known by consumers. Few of us have the time or legal expertise to fully read or comprehend the lengthy terms and conditions that flash up before we can access digital services. It is also fair to say that our leaders, laws and economic models are not set up to cope with a world where barter is much more than a historical curiosity. Economists, for example, do not have any real way to include barter in their view of the economy, since they tend to measure everything according to price. “Free” items, such as apps or data exchanges, are largely ignored in the data on gross domestic product. Lawyers do not know how to cope with barter when it comes to discussing issues of antitrust or the abuse of monopoly power, since the US concept of antitrust and collusion presumes that the way to measure consumer exploitation is to see whether they have been charged excess prices — as measured with money.
Meanwhile, consumers have not been offered an alternative to the barter trades that drive the digital economy — or the chance to consider how they might structure them differently. Is it “unfair” if Facebook (or anyone else) grabs all your data in perpetuity in exchange for letting you have free social media? Does this barter actually represent good value? And is there a way to have gradations on this exchange — and enable consumers to drive a better bargain? Thankfully, a debate about this is (belatedly) starting. Policy makers in Europe are limiting the data that tech companies can take. Meanwhile, some tech entrepreneurs and data scientists are trying to introduce more clarity (and money) into these barters by campaigning for consumers to be given proper ownership of their “digital assets” (i.e. data), so that they can “sell” these in clear-cut transactions. But there are some big impediments: will consumers pay money for cyber services? Can blockchain, an electronic database for transactions, really act as a ledger for data? Would governments ever introduce the legislation needed to make this work? For now we are left in limbo: our laws and models assume we have a money-based world; but our mobile phones and laptops operate with barter trades we barely understand.

3) How Domino’s Pizza drove a 90x increase in stock value by acting like a tech start-up [Source: producthabits.com] In this piece, the author discusses how Domino’s Pizza reinvented themselves, how they took criticism positively and changed the business for good. They took a huge, scary risk and completely scrapped and remade their core product: pizza. This self-awareness, and the guts to act on it, has paid off. Now they’re punching far above their weight class. Domino’s stock has grown 90x from $2 to $180 since 2010. These growth rates dwarf those of Facebook, Google, Amazon, and Apple. And though their delivery guarantees, marketing campaigns, and even the pizza itself have gone through radical changes, most of their core principles are the same.
The most impressive part about Domino’s growth from 1960 to the early 2000s was that they became one of the biggest pizza delivery chains in the world with some of the worst pizza in the world. So how did they cope with this? Simple: they thought like a start-up. They decided to respond to the droves of bad feedback by completely reinventing their pizza recipe, arguably the best move in Domino’s history. Having grown their company and brand in the early days and cemented their reputation for fast, cheap, convenient pizza delivery, Domino’s listened to, and acted on, customer feedback and completely recreated their pizza recipe in response to criticism about the terrible taste. Domino’s also continued to reimagine ways to order pizza and is using all of the technology available to them to double down on convenience in ordering and delivery. In 2007, Domino’s was the first company in the pizza delivery industry to offer mobile ordering, which quite literally changed the way people thought about food ordering and delivery.
Delivery has always been central to the Domino’s Pizza business and brand. They’ve specified from their earliest days that they’re a pizza delivery company. This is how they’ve built convenience and reliability into their brand. They’ve been very self-aware about this and it’s been one of their huge strengths. But another of their huge strengths is the ability to adapt: they adapted their pizza to respond to customer criticism. Now Domino’s focus is on the newest technologies, such as self-driving cars, zero-click ordering and drone delivery, which reflect three larger trends in how Domino’s is innovating the process of delivery. Their advances increasingly let the customer order from more devices, ask the customer to do less to place an order, and remove aspects of human interaction from ordering and delivery. Recently, they introduced another ordering platform, Facebook. Their recent turnaround and innovations are prime examples of how start-ups should overcome obstacles and apply forward thinking to their core value.
4) San Francisco’s big seismic gamble [Source: NY Times] While the 1,070-foot Salesforce Tower is a landmark in San Francisco, this building, along with other high rises in the vicinity, has been built on the soft soil and sand on the edge of the bay. And that’s a potential danger for a city that sits precariously on unstable, earthquake-prone ground. But the city is also putting up taller and taller buildings clustered closer and closer together because of the state’s severe housing shortage. Now those competing pressures have prompted an anxious rethinking of building regulations. Experts are sending this message: the building code does not protect cities from earthquakes nearly as much as you might think. San Francisco now has 160 buildings taller than 240 feet and a dozen more are planned or under construction. California has strict building requirements to protect schools and hospitals from a major earthquake, but not skyscrapers. A five-story building has the same strength requirements as a 50-story building.
Over a century ago, on April 18, 1906, San Francisco experienced one of the worst earthquakes in world history. The violent shaking ignited a fire that lasted three days, destroying 500 city blocks and 28,000 buildings. Half of the population of around 400,000 was made homeless, and many were forced to flee the city. So, after more than a hundred years, how safe are the skyscrapers in San Francisco? Even the engineers who design them can’t provide exact answers. Earthquakes are too unpredictable, and few major cities have been tested by major temblors. Previous earthquakes have revealed flaws with some skyscrapers: a widely used welding technique was found to rupture during the 1994 Northridge earthquake in Los Angeles. And while California has made significant strides in earthquake preparedness over the past century, experts such as Thomas H. Heaton, director of the Earthquake Engineering Research Laboratory at the California Institute of Technology, remain sceptical about building high rises in earthquake zones.
What shifted the debate on seismic safety was the sinking and tilting of the 58-floor Millennium Tower. When it was completed in 2009, the building won numerous awards for ingenuity from engineering associations. The developer and city officials knew of the building’s flaws for years, but kept them confidential until 2016, when news leaked to the public. The latest measurements, taken in December, show that the building has sunk a foot and a half and is leaning 14 inches toward neighboring high rises. It is across the street from Salesforce Tower and right next to a transit hub for buses, trains and eventually high speed rail that is being touted as the Grand Central of the West. The area around Millennium Tower is considered among the most hazardous for earthquakes. At least 100 buildings taller than 240 feet were built in areas that have a “very high” chance of liquefaction, a process where the ground acts like quicksand during an earthquake.
In light of the problems with the Millennium Tower, there are now increasing calls in California for a reassessment of earthquake risks, much of it focused on strengthening the building code. In January, a Southern California assemblyman, Adrin Nazarian, introduced a bill in the State Legislature that would require the building code to make new buildings strong enough for “functional recovery” after an earthquake. The bill passed its first hurdle, a committee hearing. The goal of the code, say proponents of a stronger one, should be the survival of cities — strengthening water systems, electrical grids and cellular networks — not just individual buildings. Research by Dr. Keith Porter, a seismic engineer at the University of Colorado, offers warnings on the economic consequences of a major earthquake in the San Francisco Bay Area. He has calculated that one out of every four buildings in the Bay Area might not be usable after a magnitude 7 earthquake, which, although severe, is not the worst the area could experience.

5) Why is the human brain so efficient? [Source: nautil.us] The brain is often compared with another complex system that has enormous problem-solving power: the digital computer. Both the brain and the computer contain a large number of elementary units—neurons and transistors, respectively—that are wired into complex circuits to process information conveyed by electrical signals. At a global level, the architectures of the brain and the computer resemble each other. But which has more problem-solving power—the brain or the computer? Given the rapid advances in computer technology in the past decades, one might think that the computer has the edge. Indeed, computers have been built and programmed to defeat human masters in complex games, such as chess in the 1990s and recently Go. As of today, however, humans triumph over computers in numerous real-world tasks—ranging from identifying a bicycle or a particular pedestrian on a crowded city street to reaching for a cup of tea and moving it smoothly to one’s lips—let alone conceptualization and creativity. So why is the computer good at certain tasks whereas the brain is better at others?
The computer has huge advantages over the brain in the speed of basic operations. Personal computers nowadays can perform elementary arithmetic operations, such as addition, at a speed of 10 billion operations per second. The brain can perform at most about a thousand basic operations per second, or 10 million times slower than the computer. The computer also has huge advantages over the brain in the precision of basic operations. The computer can represent quantities (numbers) with any desired precision according to the bits (binary digits, or 0s and 1s) assigned to each number. For instance, a 32-bit number has a precision of 1 in 2³², or about 4.3 billion. Empirical evidence suggests that most quantities in the nervous system (for instance, the firing frequency of neurons, which is often used to represent the intensity of stimuli) have variability of a few percent due to biological noise, or a precision of 1 in 100 at best, which is millions of times worse than a computer.
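To make the quoted ratios concrete, here is a quick back-of-the-envelope restatement in Python; the numbers are simply the figures cited above, nothing new is assumed.

```python
# Back-of-the-envelope restatement of the figures quoted above.
computer_ops_per_s = 10e9      # ~10 billion elementary operations per second
brain_ops_per_s = 1e3          # ~1,000 basic operations per second
print(f"speed ratio: ~{computer_ops_per_s / brain_ops_per_s:,.0f}x")       # ~10,000,000x

computer_precision = 2 ** 32   # a 32-bit number: 1 part in ~4.3 billion
neuron_precision = 100         # biological noise: ~1 part in 100 at best
print(f"precision ratio: ~{computer_precision / neuron_precision:,.0f}x")  # tens of millions
```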
The calculations performed by the brain, however, are neither slow nor imprecise. For example, a professional tennis player can follow the trajectory of a tennis ball after it is served at a speed as high as 160 miles per hour, move to the optimal spot on the court, position his or her arm, and swing the racket to return the ball in the opponent’s court, all within a few hundred milliseconds. Moreover, the brain can accomplish all these tasks (with the help of the body it controls) with power consumption about tenfold less than a personal computer. How does the brain achieve that? An important difference between the computer and the brain is the mode by which information is processed within each system. Computer tasks are performed largely in serial steps, as in a computer program. For this sequential cascade of operations, high precision is necessary at each step, as errors accumulate and amplify in successive steps. The brain also uses serial steps for information processing, but in addition it employs massively parallel processing, taking advantage of the large number of neurons and the large number of connections each neuron makes.
This massively parallel strategy is possible because each neuron collects inputs from and sends output to many other neurons—on the order of 1,000 on average for both input and output for a mammalian neuron. (By contrast, each transistor has only three nodes for input and output altogether.) Information from a single neuron can be delivered to many parallel downstream pathways. At the same time, many neurons that process the same information can pool their inputs to the same downstream neuron. This latter property is particularly useful for enhancing the precision of information processing. The computer and the brain also have similarities and differences in the signaling mode of their elementary units. The transistor employs digital signaling, which uses discrete values (0s and 1s) to represent information; neurons likewise fire all-or-none digital spikes, which allows reliable long-distance propagation of signals. However, neurons also utilize analog signaling, which uses continuous values to represent information, allowing a neuron to integrate up to thousands of inputs and enabling the brain to perform complex computations.
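The precision-enhancing effect of pooling is easy to demonstrate. Below is a minimal, self-contained sketch (our own illustration, not taken from the article) showing that averaging many independent, noisy inputs shrinks the error roughly in proportion to the square root of the number of inputs pooled.

```python
import numpy as np

# Illustrative sketch (ours, not from the article): each "input neuron" reports
# a true signal with ~1% independent noise, mirroring the 1-in-100 precision
# quoted above. Pooling (averaging) many such inputs at a downstream neuron
# reduces the error roughly by the square root of the number of inputs.
rng = np.random.default_rng(0)
true_signal = 1.0
noise_sd = 0.01  # ~1% noise per input

for n_inputs in (1, 10, 100, 1000):
    # 10,000 independent pooling events, each averaging n_inputs noisy inputs
    samples = rng.normal(true_signal, noise_sd, size=(10_000, n_inputs))
    mean_error = np.abs(samples.mean(axis=1) - true_signal).mean()
    print(f"{n_inputs:5d} pooled inputs -> typical error ~ {mean_error:.5f}")
```

In this toy model, pooling 100 noisy inputs improves precision roughly tenfold, one intuition for how slow, imprecise elements can nevertheless support fast, precise behaviour.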
Another salient property of the brain, which is clearly at play in the tennis return-of-serve example, is that the connection strengths between neurons can be modified in response to activity and experience—a process that is widely believed by neuroscientists to be the basis for learning and memory. Repetitive training enables the neuronal circuits to become better configured for the tasks being performed, resulting in greatly improved speed and precision. These principles of parallel processing and use-dependent modification of connection strength have both been incorporated into modern computers, most notably in machine learning and artificial intelligence.
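As a loose analogy for use-dependent modification of connection strength, here is a minimal sketch using a Hebbian-style rule of our own choosing (the article names no specific learning rule): a connection weight is strengthened whenever the neurons on either side of it are repeatedly active together.

```python
import numpy as np

# Minimal sketch of a use-dependent, Hebbian-style weight update (our own
# illustration; the article does not specify a learning rule). The connection
# weight grows whenever the input and output neurons are active together, so
# repeated pairings leave the circuit better configured for the task.
rng = np.random.default_rng(1)
weight = 0.1          # initial connection strength
learning_rate = 0.2

for trial in range(10):
    pre_active = rng.random() < 0.8    # pre-synaptic neuron fires on this trial
    post_active = rng.random() < 0.8   # post-synaptic neuron fires on this trial
    if pre_active and post_active:
        weight += learning_rate * (1.0 - weight)   # saturating strengthening
    print(f"trial {trial:2d}: weight = {weight:.3f}")
```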
6) New Zealand to ban future offshore oil and gas exploration [Source: Financial Times] New Zealand has become one of the world’s first countries to ban future offshore oil and gas exploration in a move heralded by environmental campaigners as a symbolic blow to “Big Oil”. The South Pacific nation’s ban is an important policy move at a time when nations are exploring how to comply with their requirements under the Paris climate change agreement. France, Belize and Costa Rica have already announced bans on either fossil fuel exploration or production, although these are largely symbolic as none are major oil producers. However, the policy shift announced by Prime Minister Jacinda Ardern, the Labour party leader, marks a change in direction for New Zealand, which under the previous conservative government prioritised fossil fuel exploration to help the economy grow. Over the past decade Shell, Chevron, Statoil and other resources companies have sought to search for oil and gas in waters off the coast of New Zealand, which has the world’s fourth largest maritime exclusive economic zone.
The oil lobby said it was disappointed with the “surprise” decision, which it claimed was taken without consultation with the industry. “Huge investments have been made by companies already anticipating offshore block offers which have now gone to waste and people’s jobs will be affected,” said Cameron Madgwick, chief executive of the Petroleum Exploration & Production Association of New Zealand (PEPANZ). There are 31 active oil and gas exploration permits in New Zealand, 22 of which are offshore. These existing exploration and mining rights — which run until 2030 at the latest — are being protected under the policy change, which the government said would not affect current jobs. Ms Ardern told reporters future oil and gas exploration would now be limited to onshore acreage in the oil-rich Taranaki region.
New Zealand is a relatively small oil and gas producer on the global stage, extracting most of its production from the five offshore fields of Maui, Pohokura, Kupe, Maari and Tui. It produces about 15m barrels of oil and 175bn cubic feet of gas per year, and the industry contributes about NZ$2.5bn ($1.8bn) to the local economy, according to PEPANZ. “This sends a powerful message: We are ending the age of oil,” said Russel Norman, executive director of Greenpeace New Zealand. “Just as New Zealand did in 1987 when it went nuclear-free and stood up to the powerful US military, this has shown bold global leadership on the greatest challenge of our time — putting people ahead of the interests of oil corporations and the hunt for fossil fuels that are driving dangerous climate change.”

7) You share everything with your bestie – even your brain waves [Source: NY Times] We make friends to hang out with, to share feelings, to discuss topics of mutual interest and much more. Researchers, however, have found that people tend to make friends who resemble themselves: chances are that you and your friends share many of the same traits and interests, and people choose friends of similar thinking, class and behaviour. But why? Scientists have found that the brains of close friends respond in remarkably similar ways, for example as they view a series of short videos: the same ebbs and swells of attention and distraction, the same peaking of reward processing, and so on. The neural response patterns evoked by the videos proved so congruent among friends, compared to patterns seen among people who were not friends, that the researchers could predict the strength of two people’s social bond based on their brain scans alone.
Carolyn Parkinson, a cognitive scientist at the University of California, Los Angeles, said that she was struck by the exceptional magnitude of similarity among friends. The results were more persuasive than she would have thought. Nicholas Christakis, a biosociologist at Yale University, said that it suggests friends resemble each other not just superficially, but in the very structures of their brains. “Our results suggest that friends might be similar in how they pay attention to and process the world around them,” Dr. Parkinson said. “That shared processing could make people click more easily and have the sort of seamless social interaction that can feel so rewarding.” Dr. Christakis and his co-workers recently demonstrated that people with strong social ties had comparatively low concentrations of fibrinogen, a protein associated with the kind of chronic inflammation thought to be the source of many diseases.
Also, researchers have been intrigued by the evidence of friendship among nonhuman animals. Gerald G. Carter of the Smithsonian Tropical Research Institute in Panama and his colleagues reported last year that female vampire bats cultivate close relationships with unrelated females and will share blood meals with those friends in harsh times — a lifesaving act for animals that can’t survive much more than a day without food. Through years of tracking the behaviours of a large flock of great tits, Josh A. Firth of Oxford University and his co-workers found that individual birds showed clear preferences for some flock members over others. When a bird’s good friend died or disappeared, the bereft tit began making overtures to other birds to replace the lost comrade.
The researchers decided to explore subjects’ neural reactions to everyday, naturalistic stimuli — which these days means watching videos. The researchers started with a defined social network: an entire class of 279 graduate students at an unnamed university. The students, who all knew one another and in many cases lived in dorms together, were asked to fill out questionnaires. The students were then asked to participate in a brain scanning study and 42 agreed. As an fMRI device tracked blood flow in their brains, the students watched a series of video clips of varying lengths, an experience that Dr. Parkinson likened to channel surfing with somebody else in control of the remote. Analysing the scans of the students, Dr. Parkinson and her colleagues found strong concordance between blood flow patterns and the degree of friendship among the various participants, even after controlling for other factors that might explain similarities in neural responses, like ethnicity, religion or family income.
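To give a flavour of what measuring “concordance” between two viewers’ responses could look like in code, here is a minimal sketch on entirely synthetic data (an assumption-laden illustration, not the study’s actual pipeline): each viewer’s response time course mixes a shared, stimulus-driven signal with private noise, friends are modelled as sharing more of the stimulus-driven signal, and Pearson correlation is the assumed similarity measure. The prediction step described in the next paragraph builds on exactly this kind of pairwise similarity score.

```python
import numpy as np

# Entirely synthetic illustration of the idea (not the study's actual method):
# two viewers watching the same videos share a stimulus-driven component in
# their response time courses; "friends" are modelled as sharing more of it.
# Pearson correlation is our assumed similarity measure.
rng = np.random.default_rng(2)
n_timepoints = 300
stimulus = rng.normal(size=n_timepoints)   # response driven by the videos

def viewer(shared_weight):
    # Mix the shared, stimulus-driven signal with viewer-specific noise.
    return shared_weight * stimulus + (1 - shared_weight) * rng.normal(size=n_timepoints)

friend_a, friend_b, stranger = viewer(0.7), viewer(0.7), viewer(0.2)

print("friend vs friend  :", round(float(np.corrcoef(friend_a, friend_b)[0, 1]), 2))
print("friend vs stranger:", round(float(np.corrcoef(friend_a, stranger)[0, 1]), 2))
```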
Using the results, the researchers were able to train a computer algorithm to predict, at a rate well above chance, the social distance between two people based on the relative similarity of their neural response patterns. Dr. Parkinson emphasized that the study was a “first pass, a proof of concept,” and that she and her colleagues still don’t know what the neural response patterns mean: what attitudes, opinions, impulses or mental thumb-twiddling the scans may be detecting. They plan next to try the experiment in reverse: to scan incoming students who don’t yet know one another and see whether those with the most congruent neural patterns end up becoming good friends.

8) The botch of the human body [Source: WSJ] Over the past decade, geneticists and biologists have learned more about our evolution than we ever thought possible. Not all of it is pretty. Some of our flaws are due to little more than a mismatch between the world we live in now and the world we evolved in. Our immediate ancestors spent several million years eking out a living on the grassy plains of sub-Saharan Africa, an environment that bears little resemblance to the ones most of us now inhabit. These unfortunates spent hours each day chewing tough roots, choking down leaves and stems, munching on tiny berries and gorging on rare windfalls of meat, bone marrow and worms. It was during this time that our body’s metabolic system, which determines how we derive energy from food, developed one of its defining features: we are built to pack on the pounds when food is plentiful, and we retain that weight when we are deprived of calories. This system made very good sense back then, but it has backfired now that we graze not on roots and grubs but on a veritable savanna of rich, calorie-packed foods. Because natural selection never rewarded willpower, it’s no surprise that we have so little of it now.
A host of other bodily glitches are due to nothing more than the inherent limits of evolution. What sense could it ever have made, for example, to bend our backbones into an S-shape, which leaves us vulnerable to slipped disks, pinched nerves and lower back pain? Why are there seven bones in our ankle and eight in our wrist, which leave us vulnerable to sprains and strains? And how could our knees have come to hinge on a tiny ligament, the oft-torn anterior cruciate ligament, or ACL? All of these questions have a single, simple answer: evolution does not make or have a plan. Natural selection can only work on the bodies we have, making slight tweaks and tugs through the randomness of mutations. Human anatomy has other design quirks that defy easy categorization. Hobbled by poor design, humans also fall short of other animals in other ways. Many creatures stay healthy eating the same two or three foods their whole lives; the koala can do fine eating just one kind of leaf. Humans, on the other hand, have very particular needs for very specific micronutrients. Why? Because we lost the ability to make them for ourselves.
Even the human reproductive system—which one might expect to be fairly streamlined, given its importance for the continuation of the species—is riddled with errors and inefficiencies, as any of the nearly 10% of us who struggle with infertility can attest. For instance, women’s ovaries aren’t connected to their fallopian tubes, an evolutionary oversight that sometimes leads to eggs floating pointlessly out into the abdominal cavity. Even when eggs make it into the fallopian tube, it’s a miracle that sperm are able to locate them; sperm cells must travel around 17.5 centimeters to meet the egg, which is a challenge given that this is more than 3,000 times the length of their bodies—and that sperm cells cannot turn left. The corkscrew-like movement of their tails propels them in right-hand circles along a completely random path. Considering the challenges of even fertilizing an egg, never mind the other hurdles that developing fetuses must overcome between conception and childbirth, every baby really is a miracle.
But just because evolution is kind of dumb doesn’t mean that humans are. We are intelligent and resourceful—the ultimate generalists. Rather than specialising in one habitat, one food source or one survival strategy, we evolved to find success in whatever way we could. Our big brains were the key to this creative approach, but this came with a very big drawback. As we came to rely on ingenuity in solving life’s challenges, this relaxed the pressure on our bodies. We no longer had to be in tiptop shape in order to find a way to survive and thrive. Our error-prone bodies are what happens when the pressure is taken off. Just like evolution itself, humans fail to prioritize long-term planning even in the face of imminent threat. As a result, many of the dangers our species now face are purely of our own making. Ironically, the big brain that helped humankind to transcend the limitations of our bodies may turn out to be our biggest flaw.
9) Global chocolate sales hit a sweet spot [Source: Financial Times] A revival in global chocolate consumption — despite the trend towards healthier snacking — has been highlighted by a surge in sales at Switzerland’s Barry Callebaut, the world’s biggest supplier of chocolate and cocoa products. The Zurich-based company’s sales volumes rose 8% to 1.02m tonnes in the six months to February 28, powered by increases in all regions but especially Asia. While confectionery companies have been hit in recent years by consumers’ shift towards healthier foods, with global sales volumes falling in 2015 and 2016, last year saw a return to growth. This was helped by tumbling cocoa prices — and Barry Callebaut’s results suggested the sales revival was accelerating, despite cocoa prices starting to rise again recently.
The sweetening outlook will come as a welcome break for the chocolate industry, suggesting sales could rise even as consumers become fussier about what they eat, tempted by new product launches and promotions. “Consumers have not given up on chocolate — but they have become choosier, which still allows for volume growth,” said Jean-Philippe Bertschy, analyst at Vontobel bank in Zurich. “Even health conscious individuals can be contradictory. It’s à la carte — quinoa one day and chocolate the next,” Mr Bertschy added.
Barry Callebaut said its 8% sales volume growth in the first half of its financial year compared with a 2.5% expansion in the global chocolate confectionery market. The company continued “to see healthy market dynamics”, Antoine de Saint-Affrique, chief executive, said on Wednesday. Growth in the overall market was affected by many factors, he told the Financial Times, “but clearly the dynamics in the market were positively influenced by innovation and promotions”. He cited as an example Barry Callebaut’s launch last year of “ruby” chocolate, a new pink variety with a hint of berry flavour, which the confectioner claims is the biggest innovation in the sector since white chocolate in the late 1930s.
10) “2001: A Space Odyssey”: What it means and how it was made [Source: New Yorker] In this article, Dan Chiasson, who teaches at Wellesley College, pays tribute to one of the best sci-fi movies ever made. When the movie was released in 1968, many felt it was slow and boring, but others loved it. According to one report, a young man at a show in Los Angeles plunged through the movie screen, shouting, “It’s God! It’s God!” John Lennon said he saw the film “every week.” “2001” initially opened in limited release, shown only in 70mm on curved Cinerama screens. MGM thought it had on its hands a second “Doctor Zhivago” (1965) or “Ben-Hur” (1959), or perhaps another “Spartacus” (1960). The grandeur of “2001”, the product of two men, Arthur C. Clarke and Stanley Kubrick, who were sweetly awestruck by the thought of infinite space, required, in its execution, micromanagement to a previously unimaginable degree. Kubrick’s drive to show the entire arc of human life meant that he was making a special-effects movie of radical scope and ambition.
The outlines of a simple plot were already in place: Kubrick wanted “a space-probe with a landing and exploration of the Moon and Mars.” Kubrick liked to work from books, and since a suitable one did not yet exist, they would write it. When they weren’t working, Clarke introduced Kubrick to his telescope and taught him to use a slide rule. They studied the scientific literature on extraterrestrial life. Kubrick grew so concerned that an alien encounter might be imminent that he sought an insurance policy from Lloyd’s of London in case his story got scooped during production. In the coming decades, conspiracy theorists would even allege that Kubrick had helped the government fake the Apollo 11 moon landing.
The audiences who came to “2001” expecting a sci-fi movie got, instead, an essay on time. The plot was simple and stark. A black monolith, shaped like a domino, appears at the moment in prehistory when human ancestors discover how to use tools, and is later found, in the year 2001, just below the lunar surface, where it reflects signals toward Jupiter’s moons. At the film’s conclusion, it looms again, when the ship’s sole survivor, Dave Bowman, witnesses the eclipse of human intelligence by a vague new order of being. For the final section of the film, “Jupiter and Beyond the Infinite,” Frederick Ordway, the film’s scientific consultant, read up on a doctoral thesis on psychedelics advised by Timothy Leary. Such wide-ranging research was characteristic of Clarke and Kubrick’s approach. Yet some of the most striking effects in the film are its simplest. In a movie about extraterrestrial life, Kubrick faced a crucial predicament: what would the aliens look like? Cold War-era sci-fi offered a dispiriting menu of extraterrestrial avatars: supersonic birds, scaly monsters, gelatinous blobs.
Now, in the era of the meme, we’re more likely to find the afterlife of “2001” in fragments and glimpses than in theories and explications. The film hangs on as a staple of YouTube video essays and mashups; it remains high on lists of both the greatest films ever made and the most boring. On Giphy, you can find many iconic images from “2001” looping endlessly in seconds-long increments—a jarring compression that couldn’t be more at odds with the languid eternity Kubrick sought to capture. The very fact that you can view “2001,” along with almost every film ever shot, on a palm-size device is a future that Kubrick and Clarke may have predicted, but surely wouldn’t have wanted for their own larger-than-life movie. - Saurabh Mukherjea is CEO, and Prashant Mittal is Strategist, at Ambit Capital. Views expressed are personal