October 04, 2016
By Marian L. Tupy
Last Friday, Botswana celebrated 50 years of independence. The former Bechuanaland Protectorate gained independence from Great Britain on September 30, 1966, and has thrived ever since. In far too many African countries, "Independence Day" has been a cause for lamentation, not celebration.

Regrettably, African independence came at the worst possible moment. The 1960s was a decade when many Western countries seemed to have suffered a collective mental breakdown. In contrast, the USSR seemed to be doing rather well. The Soviets eclipsed the United States in the Space Race and, following the Cuban Revolution, communism gained a permanent foothold in the Americas. Appalled by the injustices of colonial rule—and ignoring some of its benefits—Africans cast away the European yoke along with some of its more beneficial features: representative democracy, property rights, the rule of law, free enterprise, and international trade. Understandably, but catastrophically, many African countries opted for the opposite of what the West had to offer and embraced socialism instead.

Not so with Botswana—a country that has been politically and economically freer than the rest of Africa for much of the last half-century. Why? Seretse Khama, the first president, was a tribal chief who maintained the tradition of public meetings, or kgotlas. Kgotlas were the traditional way in which Africans made local decisions, and they were a good way to keep the chiefs honest and accountable. When I visited the country in 2007, a game warden I spoke to in Chobe National Park reminisced about standing behind the minister of education in the line for groceries. A shop manager recognized the minister and motioned her to the front of the line. The minister flatly refused. The exceptional humility of Botswana's politicians is just one positive consequence of such "grassroots democracy."

Khama's economics were also out of step with the times.
He maintained a relatively "hands-off" approach to the economy, which was, for decades, the freest in Africa. Personally, I think that Khama, in addition to being a highly educated man (he was a graduate of Balliol College, Oxford), was kept from economic experimentation by geopolitical necessity. Back then, Botswana was surrounded by the immensely powerful South Africa in the south, South Africa–dominated South West Africa in the west, and Rhodesia in the east. None of those governments would have tolerated a Marxist state in its midst. And so Khama did a couple of sensible things: he kept the market economy he inherited and did not waste money on an independent military. Today, the prosperity and stability of Botswana are a testament to his enlightened leadership.
Economic freedom in Botswana has consistently been higher than the African average. Thanks to this higher economic freedom, Botswana's per-person GDP has increased rapidly. It passed the African average in the mid-1970s and has made steady progress since.

Botswana's life expectancy quickly recovered from the HIV/AIDS epidemic, rising to retake its place above the continental average. While many of its African neighbors were mired in autocracy, Botswana maintained a relatively high level of democracy for the continent. Botswana has also managed to control corruption within its borders better than Africa as a whole. Enabling much of this progress, Botswana's rule of law has exceeded Africa's for decades.

This article originally appeared in Reason.

September 30, 2016
By Chelsea Follett
An ambitious plan to fight disease

In a move that echoes the recent trend of tech companies setting their sights on health problems, Facebook founder Mark Zuckerberg and his wife Priscilla Chan have pledged to spend $3 billion over the next decade toward the goal of curing, managing, or preventing all diseases by the end of the century. The donation will focus on the prevention and cure of diseases rather than solely on their treatment after they occur, in contrast to most medical spending. The couple has already committed $600 million of the donation to create a new research center, called the Biohub, which will begin by working on two projects: the Cell Atlas, a map describing the different types of the body's most vital cells, and the Infectious Disease Initiative, a project focused on diseases such as HIV and Zika. While their plan is undoubtedly ambitious, Zuckerberg and Chan hope their donation will allow scientists and engineers to come together to build the tools that will speed up innovation in advanced research and effectively solve all disease.

New economy brings safer, more fulfilling jobs

Many have predicted that humans will work fewer hours in the future. We do in fact work fewer hours on average, but most people still have full-time jobs. The quality of jobs has changed as well: the average job has become much more pleasant in recent decades. It has also become much safer, with workplace fatalities falling steadily in the U.S. since the early 20th century and globally since the early 1990s. People are also a lot less likely to move for work, indicating that many no longer have to relocate for better job opportunities. Not only may rising productivity lead to even fewer working hours in the future, but those hours may also continue to become less tedious.

Sonic Tractor Beam Developed in Germany for under $10  

A team of engineers at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, has designed and built a device capable of using sound waves to project images and manipulate an object's movement. A similar device was created in October 2015 by Spanish researchers, but it was far more expensive and complex, necessitating further experimentation. By introducing a simpler and more cost-effective design, the German research team was able to assemble a device that costs around $10 and can direct and organize objects on a 2D surface. Moving forward, such a device could make ultrasound-based medicine far more personalized and efficient.
  

Genetically engineered corn boosts African crop yields

Corn production sustains much of Africa. However, insects called stem borers destroy much of the corn crop, particularly in Kenya. Pesticides have proven effective in the past, but for many small farmers they are too costly to implement. Monsanto has tested a new variety of corn that addresses both the cost and the pests. The new variety allows crops to better withstand stem borers, thus reducing production losses. While the insects will eventually develop resistance to the new strain of corn, it is likely to work long enough for new alternatives to be developed.
September 28, 2016
By Chelsea Follett
A new level of precision in eye surgeries

Surgeons at Oxford's John Radcliffe Hospital have successfully used a new device to perform the world's first robot-assisted eye surgery. The device, known as the Robotic Retinal Dissection Device, was developed by a Dutch medical robotics firm and allows surgeons to perform intraocular surgeries with precision of up to 1/1000 of a millimeter. Previously, doctors have been able to identify retinal diseases at the microscopic level using laser scanners and microscopes, but the possibility of minute tremors in a surgeon's hands has made operating on those conditions very dangerous. While still in its trial phase, this device has the potential to open up a new level of precision in eye surgery.


A new tool to use against pancreatic cancer


A new drug, known as IMM-101, has shown potential in treating metastatic pancreatic cancer by triggering the immune system to fight off the disease on its own, without any side effects or toxicity. The drug, which is used in conjunction with a form of chemotherapy, works by stimulating otherwise inactive T-cells in a patient's body to identify and attack the cancerous tumors after chemotherapy creates openings in the cancer cells' protective outer layer. Pancreatic cancer is especially deadly because it is often diagnosed after it has already spread to other parts of the body, which is part of the reason why the disease usually kills within a few months of being identified. While researchers point out that further study is necessary, the results of this small trial show promise that immunotherapy drugs will soon be more effective tools in the fight against pancreatic and other forms of cancer.

Letting technology get under your skin


As technology occupies an ever greater space in our lives, an increasing number of people are taking the next step by having certain technology implanted beneath their skin. RFID chips, or Radio Frequency Identification chips, are very small devices that can be easily implanted into the fatty tissue of a person's skin, where they can be programmed to perform various tasks when read by an RFID scanner. Retailers estimate that between 30,000 and 50,000 people currently have these chips implanted under their skin; the chips can be used for countless things, from gaining access to buildings to acting as a digital business card. RFID chips are increasingly being proposed for medical applications as well: by simply scanning a chip beneath a patient's skin, medical personnel could obtain a vast amount of potentially lifesaving medical information. While there are some ethical and security concerns pertaining to the use of such chips, younger generations appear to be embracing the technology more and more.

Tech giant tackles health


In a unique union between biology and computer science, the technology giant Microsoft has taken on a new, and rather monumental, task: solving cancer within a decade by reprogramming cells. After opening its first "wet" lab this summer, Microsoft has gathered a team of biologists, programmers, and engineers who will begin to test large maps of internal cell networks created by computer scientists. One of their goals is to create a molecular-sized computer made of DNA that could recognize cancerous cells and destroy them. Vital to the success of this project is software known as the Bio Model Analyzer, which is able to model the behavior of a healthy cell and compare it to that of a diseased cell in order to identify where the problem occurred and how it can be corrected. By viewing cancer as a "computational problem," Microsoft is hopeful that it can find a way to regulate cancer and effectively solve a disease that kills so many people every year.
One of the interesting side effects of rising prosperity around the world has been the change in eating habits in formerly poor countries. In China, for example, people historically relied on cereals, such as rice, for nutrition. Following economic liberalization and increased living standards—Chinese income per capita, adjusted for inflation and purchasing power parity, rose by an astonishing 1,300 percent between 1978 and 2015—consumption of cereals decreased. Conversely, consumption of animal products skyrocketed.

Consumption of fish and seafood, specifically, has been rising meteorically since the early 1980s. Today, the typical Chinese consumer eats more seafood than the typical American. And that, of course, has led to many an article about the imminent depletion of global fish stocks and subsequent environmental catastrophe.
But people who worry about the new and more expensive eating habits of people in formerly poor countries need not worry too much. Human ingenuity and free markets have a way of satisfying demand in affordable and environmentally sustainable ways.

Consider aquaculture. According to a recent Bloomberg story, scientists in Australia "are attempting to unlock the genome of the Black Tiger prawn to make a super invertebrate that will grow faster, fight disease more effectively and taste better than its free-roaming brethren… The prawns will grow on a 10,000 hectare… slice of the Legune cattle ranch, near the border of the Northern Territory and Western Australia."

If successful, "The first offspring from the project could be ready for sale at the end of 2018, and the site is targeting full output of 162,000 tons of prawns a year. That's more than four times Australia's current annual prawn consumption."

According to Chris Mitchell, director of the Seafarms Group, which plans to spend $1.5 billion building what will be the largest shrimp farm in the developed world, "the goal is to breed such hardy and tasty prawns that the project will never have to catch wild ones again."

Somewhere out there Julian Simon and Norman Borlaug must be smiling.
September 21, 2016
By Chelsea Follett
Why does socialism keep reappearing, despite its failure wherever it has been tried? The answer may lie in human nature. A week ago, HumanProgress.org Editor Marian L. Tupy moderated a forum on the topic, featuring moral psychologist Jonathan Haidt and evolutionary psychologists Leda Cosmides and John Tooby. If you missed the forum, consider reading Ronald Bailey’s excellent summary over at Reason. Also please consider watching the videos of the distinguished speakers below.


 



Researchers have just developed a way to fit yet more transistors into less space, creating an even more efficient computer chip. The breakthrough is good news for "Moore's Law," the observation that the number of transistors per square inch of an integrated circuit doubles roughly every two years.
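The doubling rule is easy to state as simple arithmetic. Here is a minimal sketch; the two-year doubling period is the rule as stated above, while the 1996–2016 window is merely an illustrative choice:

```python
def moores_law_factor(years, doubling_period=2):
    """Growth multiple after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Twenty years of two-year doublings is ten doublings:
print(round(moores_law_factor(20)))  # 1024, i.e. roughly a thousandfold
```

Ten doublings in twenty years already yields about a thousandfold gain in transistor density, which is why the decades-long compounding described below produces numbers that are hard to comprehend.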

Computers have come a long way since the days of ENIAC. The first computer was a $6 million giant that stretched eight feet tall and 80 feet long, weighed 30 tons, and needed frequent down time to replace failing vacuum tubes. A modern smart phone, in contrast, possesses about 1,300 times the power of ENIAC and can fit in your pocket. It also costs about 17,000 times less. (With a deal like that, no wonder there are now more mobile phone subscriptions than there are people on the planet.)

The drop-off in the price of computing power is so steep that it's difficult to comprehend. A megabyte of computer memory cost $400 million in 1957. That's a hefty price tag, even before taking inflation into account: in 2013 dollars, it would be $2.6 billion. In 2015, a megabyte of memory cost about one cent.
The cost of both RAM (roughly analogous to short-term memory) and hard drive storage (long-term memory) has plummeted. Consider the progress just since 1980. In that time, the cost of a gigabyte of RAM fell from over $6 million to less than $5; a gigabyte of hard drive storage fell from over $400,000 to three cents.

Whether you're reading this article on a smart phone, tablet, laptop or desktop computer, please take a moment to appreciate how incredible that device truly is. Ever more powerful, compact and affordable computers make our lives more convenient and connected than our ancestors could have ever imagined.
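To put those declines in perspective, here is a quick back-of-the-envelope calculation using the per-gigabyte figures quoted above (the endpoint prices are the article's; the rounding is mine):

```python
# Fold-decrease in cost per gigabyte, 1980 vs. today,
# using the figures cited in the text.
ram_1980, ram_now = 6_000_000, 5       # dollars per GB of RAM
hdd_1980, hdd_now = 400_000, 0.03      # dollars per GB of hard drive storage

ram_fold = ram_1980 / ram_now
hdd_fold = hdd_1980 / hdd_now

print(f"RAM became ~{ram_fold:,.0f} times cheaper")    # ~1,200,000 times
print(f"Storage became ~{hdd_fold:,.0f} times cheaper")  # ~13,333,333 times
```

In other words, RAM got over a million times cheaper and disk storage over ten million times cheaper in a single generation.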

They also enable a process called dematerialization—they allow us to produce and accomplish more with less. The benefits to the economy, the environment and human wellbeing are incalculable. If Moore's Law holds true, regulators stay out of the way, and outdated privacy laws catch up to the current technological realities, then things are only going to get better.


This article first appeared in Reason.
September 16, 2016
By Marian L. Tupy
In recent years, I have given a number of presentations to high-school and college students on the importance of economic freedom and the persistent threat of socialism – as witnessed, for example, by the recent economic meltdown in Venezuela. One problem that I have encountered is that young people today do not have a personal memory of the Cold War, let alone an understanding of the social and economic arrangements in the Soviet bloc, which, I suspect, are either downplayed or ignored in American school curricula. As a result, I have written a basic guide to socialist economics, drawing on my personal experience growing up under communism. I hope that this somewhat longer piece will be read by millennials, who are so often drawn to failed ideas of yore.

As a boy growing up in communist Czechoslovakia, I would, for many years, walk by a building site that was to become a local public health facility or clinic. The construction of this small and ugly square-shaped building was slow and shoddy. Parts of the structure were falling apart even while the rest of it was still being built.

Recently, I returned to Slovakia. One day, while driving through the capital of Bratislava, I noticed a brand new suburb that covered a hill that was barren a mere two years before. The sprawling development of modern and beautiful houses came with excellent roads and a large supermarket. It provided a home, privacy, and safety for hundreds of families.

How was it possible for a private company to plan, build, and sell an entire suburb in less than two years, but impossible for a communist central planner to build one small building in almost a decade?

A large part of the answer lies in "incentives." The company that built the suburb in Slovakia did not do so out of love for humanity. The company did so because its owners (i.e., shareholders or capitalists) wanted to make a profit. As Adam Smith, the founding father of economics, wrote in 1776, "It is not from the benevolence of the butcher, the brewer, or the baker, that we can expect our dinner, but from their regard to their own interest."

In a normally functioning market, it is rare for only one company to provide a certain kind of good or service. The people who bought the houses in the suburb that I saw did not have to do so. They could have bought different houses built by different developers in different parts of town at different prices. Competition, in other words, forces capitalists to come up with better and cheaper products – a process that benefits us all.

Communists opposed both profit and competition. They saw profit-making as useless and immoral. In their view, capitalists did not work in the conventional sense. The real work of building the bridges and plowing the fields was done by the workers. The capitalists simply pocketed the company's profits once the workers' wages had been paid out. Put differently, communists believed that the capitalist class exploited the working class – and that was incompatible with the communist goal of a classless and egalitarian society.

But capitalists are neither useless nor immoral. For example, capitalists often invest in new technologies. Companies that have revolutionized our lives, like Apple and Microsoft, received their initial funding from private investors. Because their own money is on the line, capitalists tend to be much better at spotting good investment opportunities than government bureaucrats. That is why capitalist economies, not communist ones, are the leaders in technological innovation and progress.

Moreover, by investing in new technologies and by creating new companies, capitalists provide consumers with a mind-boggling variety of goods and services, create employment for billions of people, and contribute trillions of dollars in tax revenue. Of course, all investment involves at least some level of risk. Capitalists reap huge profits only when they invest wisely. When they make bad investments, capitalists often face financial ruin.

Unfortunately, communists did not share the above views and banned private investment, private property, risk-taking and profit-making. All large privately held enterprises, like shoe factories and steel mills, were nationalized. A vast majority of small privately held enterprises, like convenience stores and family farms, were also taken over by the state. The expropriated owners seldom received any compensation. Everyone now became a worker and everyone worked for the state.

In order to prevent new income inequalities and new classes from emerging, everyone was paid more or less equally. That proved to be a major problem. Since people did not make more money when they worked harder, few of them worked hard. The communists tried to motivate, or incentivize, the workforce through propaganda. Posters of strong and determined workers were ubiquitous throughout the former Soviet empire. Movies about hardworking miners and farmers were supposed to instill the population with socialist zeal.

Propaganda alone could not increase the productivity of communist workers to Western levels. To incentivize the workforce, communist regimes resorted to terror. Workers who slacked off on the job were sometimes convicted of sabotage and shot. More often, they were sent to the Gulag – a system of forced labor camps. Sometimes, the authorities arrested and punished completely innocent people on purpose. Arbitrary terror, the communists believed, made the rest of the workforce more productive.

In the end, tens of millions of people in the Soviet Union, China, Cambodia, and other communist countries were sent to labor camps. The living and working conditions in the camps were inhuman and millions of people perished. My great uncle, who was accused and convicted of being a supporter of the underground democratic opposition in communist Czechoslovakia, was sent to mine uranium for the Soviet nuclear arms program. Working without any protection from radiation, he died of cancer.

By the late 1980s, communist regimes had lost much of their revolutionary zeal. Terror and fear subsided, and productivity declined further. By that point, an average industrial worker in Western Europe was almost eight times as productive as his Polish counterpart. Put differently, in the same time and with the same resources that a Polish worker needed to produce $1 worth of goods, a Western European worker could produce $8 worth of goods.

Just as they replaced the profit motive with propaganda and terror, so the communists replaced competition with monopolistic production. Under capitalism, companies compete for customers by slashing prices and improving quality. Thus, a teenager today can choose between jeans made by Diesel, Guess, Calvin Klein, Levi’s and many others.

Communists thought that such competition was both wasteful and irrational. Instead, communist countries tended to have one monopolistic producer of cars, shoes, washing machines, etc. But problems soon arose. Since producers in communist countries did not have to compete against anyone, they did not have any incentive to improve their products. Compare, for example, the BMW 850 that went into production in West Germany in 1989 and the Trabant that was made in East Germany at the same time.

Communist producers were protected from domestic competition by having a monopoly. They were also protected from foreign competition by prohibitively high import tariffs or an outright ban on imports. Put differently, they had a "captive" consumer base. The Trabant car manufacturer did not have to worry about losing customers, since the latter had nowhere else to go.

Moreover, the workers at the Trabant car plant received the same salary irrespective of the number of cars they produced. As a result, they produced fewer cars than were needed. People in East Germany had to wait for many years, sometimes decades, before they were able to buy one. Indeed, shortages of most consumer goods, from important items such as cars to mundane items such as sugar, were ubiquitous. Endless queuing became a part of everyday life.

Under capitalism, shortages are generally avoided through the movement of prices. Some prices, like those of national currencies traded globally, change virtually every second. Other prices change more slowly. If there is a shortage of strawberries, for example, their price will rise. As a result, fewer people will be able to buy strawberries. On the upside, the people who value strawberries the most and are willing to pay the higher price will always find them.

The movement of prices provides important information for the capitalists, who take their money and invest it in more profitable business ventures. If the price of something is rising, not enough of it is being produced. Investors rush in with new capital, hoping to make a profit. Production increases. The economy as a whole thus tends toward an "equilibrium" – a point at which capital is distributed roughly where it is needed.

Prices are an important source of information, but where do they come from? In a capitalist economy, nobody sets prices. They emerge "spontaneously" in the marketplace. Every time I buy a cup of coffee on the way to work, for example, I incrementally increase the price of the coffee bean. Every time I fail to buy my usual morning cup of coffee because I am late for work, I decrease its price by a tiny amount. If everyone stopped buying coffee, its price would collapse.

Communists banned profit, capitalists, competition, free trade and much (if not all) private property – all of which are necessary for accurate prices to emerge. Instead, tens of millions of prices for items ranging from tractors to a loaf of bread were set annually (or every few years) by government bureaucrats. Since they could neither accurately predict how much bread would be produced (i.e., supplied) nor how much bread would be consumed (i.e., demanded), the bureaucrats almost always got the prices wrong.

Price-setting made shortages associated with low productivity worse. If the price of flour was set too high, bakeries would bake too little bread and bread would disappear from shops altogether. If the price of flour was set too low, too much bread would be baked and much of it would end up rotten. Put differently, communist economies were very inefficient.

To complicate matters, communists sometimes mispriced items intentionally. The price of meat, for example, was kept too low year after year out of political considerations. Low prices created an impression of affordability. On their trips abroad, communist officials would often boast that the workers in the Soviet empire could buy more meat and other produce than their Western counterparts. In reality, shops were often empty. As a consequence, money was of limited use. To get around shortages, many people in communist countries resorted to bartering goods and favors (or services).

Under communism, the state owned all production facilities, such as factories, shops and farms. In order to have something to trade with one another, people first had to "steal" from the state. A butcher, for example, stole meat and exchanged it for vegetables that the greengrocer stole. The process was inefficient, but it was also morally corrupting. Lying and stealing became commonplace, and trust between people declined. Far from fostering brotherhood between people, communism made everyone suspicious and resentful.

Of course, not everyone was equally affected by shortages. Government officials and their families could generally avoid the daily hardships of life under communism by having access to special shops, schools, and hospitals. Communism started as a movement for greater equality. In reality, it was a return to feudalism. Like feudal societies, communist societies had an aristocracy composed of communist party members. Like feudal societies, communist societies had a population of serfs with limited or no rights and little possibility of social mobility. Like feudal societies, communist societies were held together by brute force.

Postscript:

I am sometimes asked why, if communism was so inefficient, it survived as long as it did. Part of the reason rests in the brute force with which the communists kept themselves in power. Part of it rests in the emergence of smugglers, who made the economy run more smoothly. When, for example, a communist shoe factory ran out of glue, the factory manager called his contact in the "shadow" or "underground" economy, who would then obtain the glue by smuggling it out of the glue factory or from abroad. Smuggling was illegal, of course, but it was preferable to dealing with the government bureaucracy – which could take years. So, in a sense, communism's longevity can be ascribed to the emergence of a quasi-market in goods and favors (or services).

This article originally appeared in CapX.
September 14, 2016
By Chelsea Follett
Watching the news, it can be easy to feel pessimistic about the state of the environment. Many a leader has warned of environmental catastrophes to come. Pope Francis, for example, has recently said that humanity is turning the planet into a “wasteland full of debris, desolation and filth.” But there are also many who hold a more optimistic view, believing that human ingenuity can help preserve the environment. HumanProgress.org advisory board member and Rockefeller University professor Jesse H. Ausubel, who was integral to setting up the world’s first climate change conference in Geneva in 1979, has shown how technological progress allows nature to rebound. He envisions a future where humanity is ever less dependent on natural resources. Here are 7 graphs that give cause for such environmental optimism.

1. As people escape poverty and spend less time and energy on the basics of survival, they can come to care more about the environment. The incredible decline in Chinese poverty spurred by economic liberalization, for example, has coincided with better preservation of forests. In 2015, there were 511,800 more square kilometers of forest area in China than in 1990. Over the same time period, Europe gained 212,122 square kilometers of forest area, while North America gained 64,410. Africa, on the other hand—the poorest continent—lost forest area.
2. To illustrate the divergent trends in Europe and Africa, it is also helpful to look at how forest area has changed as a share of total land area. This measure makes it easier to compare the continents despite their very different sizes. While the change is very slight in percentage terms (please note the Y axis scale), the direction of the trends over the last two decades is clear.

3. Not only does prosperity enable more people to care about the environment, but wealthy countries also have access to better and greener technology. As a result, many now use water much more efficiently than in the past. Consider Western Europe. According to data from the World Bank, between 1982 and 2014, Ireland increased its water productivity—the amount of GDP generated per unit of freshwater withdrawal—by 321%, while the United Kingdom's rose by 243%.

4. Better technology has also allowed wealthy countries to reduce cropland erosion. According to data from the U.S. Department of Agriculture, wind erosion of cropland in the United States decreased from 3.3 tons per acre in 1982 to 2.1 in 2007. Water erosion, similarly, fell from 4 tons per acre to 2.7 over the same time period.

5. Thanks to this reduction in erosion and numerous other agricultural productivity-boosting measures, humanity now produces far more food with less land. Between 1961 and 2014, global cereal yields per unit of land increased by 154%. If farmers worldwide can reach the productivity of the U.S. farmer, humanity will be able to return a landmass the size of India back to nature.

6. Technology has not only allowed humanity to use water and land more efficiently, but it has also enabled us to reduce pollution of the air. Agricultural processes now emit far fewer greenhouse gases, even while producing more food than ever before and bringing hunger to an all-time low. In the countries surveyed by the United Nations, from 1980 to 2012 total emissions fell by almost 340 thousand gigagrams of carbon dioxide equivalent.

7. Looking beyond agriculture, overall harmful emissions in the United States have actually fallen relative to the growth of the population, of GDP and of the number of vehicle miles traveled. Globally, emissions have also decreased somewhat relative to GDP.

This article first appeared in CapX.
September 13, 2016
By Marian L. Tupy
My Cato colleague, Johan Norberg, has just published his latest book, called Progress: Ten Reasons to Look Forward to the Future. I first came across Norberg's thoughtful writing in 2003, when, in response to the Battle of Seattle and other anti-globalization protests, he published In Defense of Global Capitalism. The book made a persuasive case in favor of global trade. Thirteen years later, as the current U.S. presidential campaign shows, the book, and the arguments it contains, continue to be relevant.

But back to Progress. The book, as the title suggests, documents progress that humanity has made in ten crucial areas: food supply, sanitation, life expectancy, poverty, violence, the environment, literacy, freedom, equality and the next generation (i.e., child labor). It has been favorably reviewed in The Economist, The (British) Spectator and, mirabile dictu, The Guardian.

I am glad to report that Cato has organized a book forum for Norberg on October 12, with Reason's science correspondent Ronald Bailey as commentator. The books by both authors (Bailey published his own tribute to human progress entitled The End of Doom: Environmental Renewal in the Twenty-first Century in 2015) will be on sale.

In any case, the release of Norberg's book allows me, once again, to pitch the wealth of data on a variety of subjects that is made available, free of charge, at HumanProgress.org. The website is aimed at journalists, students, lecturers and members of the general public who are interested in data concerning the state of humanity. Below, I include ten graphs pertinent to each chapter in Norberg's book.

1. Globally, food supply is at an all-time high. Even in Africa, people consume well in excess of the USDA-recommended 2,000 calories per person per day.
2. Globally, some 75 percent of people have access to improved sanitation (e.g., flush toilets, septic tanks and sewers), which is important, because unhygienic disposal of human excreta has been a leading source of illness in the developing world.
3. Life expectancy, as previously reported at Reason, is at an all-time high.
4. The share of people living in absolute poverty, Brookings Institution researchers believe, has never been smaller.
5. Wars have become rarer, and so have homicides—at home and abroad.
6. There are also good signs for the environment, as we pollute less in spite of a growing population and larger economic output.
7. Literacy, once a preserve of the few, is now widespread.
8. Economic freedom and democracy are not retreating.
9. The gender pay gap is declining in rich countries, gay equality is increasing in the developed world and segregationist attitudes have almost disappeared.

This article originally appeared in Reason.
September 12, 2016
By Chelsea Follett
Prosthetic Limbs Get a Leg Up From 3-D Printing

A Japanese tech startup has developed software to stably and cheaply 3-D print realistic prosthetic limbs from a flesh-like polymer. Prosthetic limbs can prove prohibitively expensive for amputees, particularly prosthetics specialized for activities such as swimming or skiing, or for wearing certain styles of shoes. The startup’s technology lowers the cost of prosthetics enough to supply even relatively poor countries like the Philippines—where 350,000 people need artificial limbs and 90% currently cannot afford them. The company tested its technology by donating 100,000 prosthetics to Filipinos in need.

Smoke Detector Blood Test May Boost Cancer Survival Odds

A new “smoke detector” blood test can spot cancer’s presence long before symptoms develop. Early detection that catches cancer before it has spread to multiple parts of the body is often key to a patient’s survival, so spotting the disease early on a wide scale could significantly lower cancer death rates. Just as a smoke detector finds fire indirectly, by testing for a byproduct – smoke – the new blood test finds cancer by testing for a byproduct – mutated blood cells. According to one of the researchers behind the blood test, “The old adage of no smoke without fire also applies to ‘no cancer without mutation’, as mutation is the main driving force for cancer development.”  

Body Heat Could Power Wearable Technology

Imagine a future where, while working out, your music player, your watch, your heart rate monitor and other small technological devices all draw their electricity from your own body heat. A new piece of wearable technology seeks to make that future a reality. The lightweight device, made of thermally conductive material that rests against the skin, can be worn as an armband or embedded into athletic clothing, and it converts body heat into energy to power other wearable electronics. The creators are particularly interested in using the tech to power heart monitors and similar health-tracking devices without the need for batteries. The armband version of the device is currently more successful at generating electricity than the version embedded into an athletic t-shirt.
September 09, 2016
By Marian L. Tupy and Chelsea Follett
If you are a sci-fi fan, then you have probably noticed the dystopian character of movies about the future. From the classics, such as Soylent Green and Blade Runner, to modern hits, such as the Matrix trilogy and District 9, Hollywood’s take on the future is almost invariably negative. The story lines tend to centre on depletion of natural resources, like in the Mad Max movies, the emergence of highly stratified societies, like Elysium, or both.

In Hollywood’s rendition, the future consists of a few people at the top, who partake in the good life and enjoy what’s left of earth’s resources, while the much more numerous masses suffer some form of enslavement and destitution. That is, until one day, a messianic figure emerges to overthrow the existing order, slaughters the oppressors, liberates the untermenschen and ushers in an era of peace and prosperity.

One of the most recent installments in Hollywood’s ceaseless torrent of dystopianism is the widely popular Hunger Games franchise. The plot warns of the dangers of authoritarianism and of the utter failure of central planning. Thanks to capitalism, the future will look very different. Before we get to that, here is a quick summary of the plot.

The Hunger Games is a book trilogy by Suzanne Collins, consisting of The Hunger Games, Catching Fire and Mockingjay. These books were adapted into four popular movies, with the last book split into two feature films — Mockingjay Part I & II. The books sold more than 65 million copies in the United States alone, and have been translated into 51 different languages. In total, the movies made almost 3 billion dollars worldwide. In an NPR poll, The Hunger Games were second only to Harry Potter in popularity among teenagers. The three-finger salute used by the revolutionaries in The Hunger Games series became a real-life symbol of defiance in Thailand, where people were imprisoned for making the gesture.

The Hunger Games is set in what used to be the United States of America, but has transmogrified into an evil authoritarian regime called “Panem” (from Latin "panem et circenses,” or “bread and circuses”). In an extremely wealthy city located somewhere in the Rocky Mountains called the Capitol, the wealthy live impossibly lavish lives and rule over the surrounding districts. They wear elaborate make-up and bizarre modes of dress very loosely reminiscent of the opulence of French courtiers right before the French Revolution. They have constant parties with much pageantry and impressive technology (e.g., their food dispensers and showers have hundreds of buttons, etc.). At their parties, when they become full, they drink a liquid drug that causes them to vomit, so that they can enjoy more of the fine food that is available to them. Most do not produce anything and those who do “work” perform jobs like “TV host” or “fashion designer” – mainly for their own amusement.

Each of the twelve surrounding districts has a single centrally-planned economic specialization. Some districts are richer than others, but most are very poor. District 7’s people, for example, cut trees for lumber all day. The people in District 4 catch fish, while District 9 produces grain, District 10 raises livestock, and District 11 maintains orchards. The poorest district is District 12, located in Appalachia, whose people are coal miners and frequently starve. They are not allowed access to advanced technology, although they have old television sets to view government propaganda and the Hunger Games.

The people of the Capitol host a reality television show called the Hunger Games, where children from the different districts must battle each other. The children have to survive without food in a large forest-like domed arena filled with genetically engineered monsters, and kill each other as well as the monsters. The last child alive is set free to return to his or her district. The Capitol’s residents see no moral problem with the Hunger Games – the lives of the poor laborers’ children mean nothing to them.

Over the course of the series, a girl, Katniss, and boy, Peeta, from the poor Appalachian mining district manage to survive as contestants on the Hunger Games twice. That forms the plot of the first two books. In the last book, they become involved with a violent revolution against the ruling class in the Capitol. The boy, Peeta, is captured by the government and tortured, but the revolution eventually succeeds. The Hunger Games are abolished and a new government is installed. Katniss and Peeta survive the war, grow up, and eventually have children together.

If you are looking for drama and excitement as dished out by the talented Ms. Collins, feel free to watch all 548 minutes of the four movies combined. If, on the other hand, you are interested in taking a peek at the future as it is currently being created by ordinary human beings, watch this 2-minute video of a driverless tractor developed by Case IH, a manufacturer of agricultural equipment.

Chances are, fully autonomous robots will complete the process of mechanization of American farming in our lifetime. Already less than 2 percent of the American labor force works in agriculture – many as tractor and truck drivers, not manual laborers. This tiny fraction of the American workforce produces enough food to feed not only the United States, but also, through American food exports, much of the rest of the world.

Put differently, feeding humanity does not require a permanent underclass of modern-day helots, as Hollywood would have you believe. Programmers and innovators in urban centers (i.e., the Capitol) compete with one another to produce labor-saving machines that make the lives of the people on farms (i.e., the districts) easier. Far from preventing the latter from acquiring new technology, the livelihoods of the former depend on the purchasing power of the farmers – who have higher incomes and more wealth than the American median.

Finally, consider agricultural productivity in the global context. As Professor Jesse H. Ausubel of the Rockefeller University writes, “agriculture has always been the greatest destroyer of nature, stripping and despoiling it, and reducing acreage left.” Thus, if humanity can further increase crop yields – since 1940, American farmers have quintupled corn production while using the same or even less land – some of the agricultural land could be returned to nature.

Globally, therefore, adoption of American farming techniques could increase agricultural productivity so much that a landmass the size of India could be returned to nature, without compromising the food supply to our apparently “peaking” global population – the world’s population is likely to peak at 8.7 billion in 2055 and then start to decline. Last, but not least, tens of millions of agricultural laborers in Africa and Asia will be freed from back-breaking labor, migrate to the cities and create wealth in other ways.

If you are truly concerned about the future of humanity in general, and hunger, poverty and equality in particular, forget about The Hunger Games and embrace the driverless tractor instead.
Americans have lately been debating the tradeoffs we face as the global poor rise. Their gains have been enormous and unprecedented. And yet the American working class has struggled to better itself even as conditions have improved for most others.

Percentiles 80-95 contain many from the relatively rich countries’ lower-income classes; there are a lot of Americans in there. Other factors may be at work, but let’s say for the sake of argument that the gains by the global poor have on balance harmed at least some of them.

So why is this happening? Is it part of some other nation’s malicious plan? Is it China, perhaps? Or India? Or did we inadvertently do it to ourselves, through bad trade agreements or “soft” foreign policy?

It’s natural to want to make the story about us, or our actions, or a villain who threatens us. Those sorts of explanations are politically useful; they suggest that the right leader can get us out of the mess we’re in.

But maybe the correct explanation isn’t about us at all. One way to see this is to ask a slightly different question: Why is the Great Global Enrichment happening right now? Why didn’t it happen in the 1960s? It happened in the 1960s in Japan, after all. It presumably could have happened elsewhere too. So why not?

The left-hand side of the graph contains few Americans or Europeans. It’s mostly made up of people from India, China, and Africa. In the 1960s, India was undergoing a slow-motion economic suicide, nationalizing major industries under Jawaharlal Nehru and Indira Gandhi, and pursuing economic autarky in the false belief that that’s just what industrializing nations need to do. China’s economic suicide was much more dramatic, with the Great Leap Forward bringing ecological disaster, mass starvation, and tens of millions of deaths. Over in Africa, a colonial-era infrastructure geared toward extraction found ready use in the hands of socialist and nationalist state agents, who expropriated foreign and domestic investments to enrich only themselves, while scaring away most future investments for a generation.

Things are different today. Since the 1990s, India has steadily pursued economic liberalization, and as a result, its economic growth has accelerated. China is no more than nominally Maoist anymore, and while its human rights record remains lamentable, at least the central government isn’t micromanaging the economy with Lysenkoist pseudoscience. In Africa, expropriation and nationalization of assets are at historical lows, making it safer than ever to invest in Africa, no matter where you come from.

So… maybe the story is not about us. It’s also not about an enemy who threatens us. It’s about the rest of the world not shooting itself in the foot anymore. It’s about other societies increasingly adopting economic liberalism, which happens to be very good at lifting people out of poverty.

In the process, the rest of the world is exposing many Americans to market discipline, which, yes, is going to hurt. But the only way to stop this process is to re-impose repressive economic regimes on billions of people. That’s a step that’s equal parts unwanted, unrealistic, and unethical, and the transformation at hand is just too big to be much affected by anything else that we might do.

Both sides of our political spectrum have purely venal reasons to want the story to remain about us. The left doesn’t want to admit that economic liberalism beats command-and-control socialism when it comes to mass enrichment. The right has lately embraced economic populism as a check on a purportedly hostile world – a worldview that positively requires one or more villains. But maybe we don’t live in a hostile world. Maybe we live in an increasingly excellent world, one that we created inadvertently, through the power of good examples. If so, that’s not a change that we should want to undo. Let them have their freedom, let the curse of poverty be lifted, and let the competition continue.
September 06, 2016
By Marian L. Tupy
Last week, Case IH, a manufacturer of agricultural equipment, unveiled a prototype of a farm tractor that can plant, monitor crops, and harvest without a driver. In the future, "autonomous vehicles" could complete the process of mechanization of American farming, thereby further increasing U.S. agricultural productivity. The negative effects of increased mechanization on the labor market should be minimal, since only 1.5 percent of the American labor force works in the agricultural sector. Conversely, productivity improvements in agriculture could amplify positive effects on the environment. As Jesse H. Ausubel writes, "agriculture has always been the greatest destroyer of nature, stripping and despoiling it, and reducing acreage left." Thus, if humanity can further increase crop yields—since 1940, American farmers have quintupled corn production while using the same or even less land—some of the agricultural land could be returned to nature. Globally, adoption of American farming techniques could increase agricultural productivity so much that a landmass the size of India could be returned to nature—without compromising food supply to our apparently "peaking" global population.

1. U.S. agricultural output has been growing...
2. ...even though very few Americans still work in agriculture.
3. As a result of increasing farm productivity, food prices have been declining.
4. Today Americans spend less on food as a share of their income than ever before...
5. ...and they get to consume more calories.
6. Globally, food prices are lower today than they were in 1960.
7. As a result, people around the world consume more calories.
8. And, in spite of global population growth, the use of land for agricultural purposes peaked around the year 2000.

This article was originally published in Reason.
September 02, 2016
By Marian L. Tupy
On a couple of previous occasions, I have written about the failures of socialism and about socialism’s continued appeal. In those columns, I pointed to research that suggests that at least some socialist instincts, including zero-sum thinking and egalitarian sharing, might be inherent to the design of the human brain. A number of people emailed me to express their skepticism about the “innate” nature of socialism. Isn’t socialism, they said, a relatively new phenomenon that arose, in large part, as a response to the perceived “abuses” of capitalism?

As a consolidated, if not necessarily coherent, criticism of capitalism, socialism is certainly new. But, so is capitalism, as we understand it. Prior to the Industrial Revolution, few people talked about either. However, as I will show below, flashes of socialist and anti-capitalist thinking can be discerned all the way back in antiquity, thus pointing to the deep-seated nature of intuitive responses to both economic “systems.”

As mentioned, “socialism” and “capitalism” are relatively new terms, but their basic precepts are not. Insofar as capitalism is only the latest iteration of an economic setup based on commerce, private property and profit making, there have always been those who found all three unpalatable.

Consider the following examples. Hesiod, the Greek poet who lived in the 8th century BC, believed that human history could be divided into golden, silver, bronze, heroic and iron ages. The defining characteristics of the golden age, he thought, were common property and peace. The defining characteristics of his contemporary iron age were profit-making and violence.

In Homer’s Odyssey, which was probably written in the 8th century BC, the Greek hero Odysseus is insulted for resembling a captain of a merchant ship with a “greedy eye on freight and profit.” According to 5th century BC Greek historian Herodotus, the Persian emperor Cyrus the Great dismissed his Spartan enemies by saying,

“I have never yet been afraid of any men, who have a set place in the middle of their city, where they come together to cheat each other and forswear themselves. Cyrus intended these words as a reproach against all the Greeks, because of their having market-places where they buy and sell….”

Writing in the 4th century BC, Plato envisaged an ideal society ruled by “guardians,” who had no private property, so as not to “tear the city in pieces by differing about ‘mine’ and ‘not mine.’” He observed that “all the classes engaged in retail and wholesale trade … are disparaged and subjected to contempt and insults.” In the ideal state, Plato averred, only non-citizens should engage in commerce. Conversely, a citizen who becomes a merchant should be punished with imprisonment for “shaming his family.” Even the hyper-rational Aristotle agreed that “exchange [of goods for profit] is justly condemned because it involves … profiting at others’ expense.”

In ancient Rome, wrote Professor D. C. Earl of the University of Leeds, “All trade was stigmatized as undignified … the word mercator [merchant] appears as almost a term of abuse.” In the first century BC, Cicero noted that retail trade is sordidus [vile] because retailers “would not make any profit unless they lied constantly.” The Roman masses shared this attitude. The comedies of Plautus were directed to a mass audience. In them, notes Earl, Plautus “makes frequent reference to the commercial classes, who are invariably treated with hostility and contempt.”

The hostility of Roman Catholic theologians to commerce is well known. Consider the Decretum Gratiani, which was the standard compilation of canon law from the time that Gratian published it in the mid-12th century AD until 1917. It declares, “Whoever buys something … so that it may be a material for making something else, he is no merchant. But the man who buys it in order to sell it unchanged … is cast out from God’s temple.”

Protestant theologians agreed. According to the economic historian R. H. Tawney, Martin Luther “hated commerce and capitalism.” In Das Kapital, Karl Marx approvingly quotes Luther as saying, “Great wrong and unchristian thievery and robbery are committed all over the world by merchants.” And John Calvin noted that the life of the merchant closely resembles that of a prostitute, for it is “full of tricks and traps and deceits.”

Idealized societies that various thinkers have imagined throughout the ages tended to share the prejudices of the ancients. Sir Thomas More, the Lord Chancellor of Henry VIII of England, coined the word “utopia” in a book of the same name. In More’s Utopia, both money and private property were abolished.

Over the succeeding centuries, humanity periodically acted on its revulsion toward trade, private property and profit making. Some experiments, such as those of 15th century Bohemian Taborites and 17th century Plymouth Colonists, were inspired by the Christian religion. Others, like Robert Owen’s 19th century experiments in New Harmony, Indiana, and New Lanark, Scotland, were not. In the end, all such experiments failed amid discord and poverty.

This necessarily truncated look at the past clearly indicates the ancient roots of human hostility toward some of the most important features of capitalism. To explore this topic further, I have organized a policy forum on “Socialism and Human Nature,” which will take place at the Cato Institute on September 14 at 11am.

The forum will feature three well-known thinkers: Jonathan Haidt, Professor of Ethical Leadership at New York University; John Tooby, Professor of Anthropology at the University of California–Santa Barbara; and Leda Cosmides, Professor in the Department of Psychological and Brain Sciences at the University of California–Santa Barbara. The panel will further explore the evolutionary origins of these impulses, and I will write about our conclusions in a future column.

This article was originally published in CapX.
September 02, 2016
By Chelsea Follett
A Cure for Alzheimer's Disease

What if a drug could ward off Alzheimer’s disease before it begins? A revolutionary new drug, aducanumab, could do just that, as well as halt the disease’s progress in those already affected. The disease poisons brain cells with clumps of damaging amyloid protein. The new drug contains antibodies that empower the body’s immune system to home in on the toxic protein and destroy it. Amyloid practically vanished in the brains of patients given a high dose of aducanumab. Alzheimer’s researchers are hailing the drug as the greatest breakthrough in a quarter century.

Injection-Free Dental Visits

Getting a shot is never pleasant, but it is sometimes necessary. Fortunately, it may now be needed in one less situation. For those about to undergo dental surgery, a nasal spray has proven safe and sufficient to provide local anesthesia. The nasal spray, called Kovanaze, has just gained approval from the U.S. Food and Drug Administration. In the final trial during testing of the drug, 88 percent of those who tried the nasal spray were able to complete dental restorative procedures involving drilling without any anesthetic injections, compared to 28 percent of those who received a placebo nasal spray.

A New Hope Against Zika

Zika, a mosquito-spread virus linked to severe birth defects when contracted by pregnant women, has swept across much of Latin America, and its carrier mosquitoes are now present as far north as Florida. A recent discovery provides hope that we may soon be able to defeat the virus: a drug already approved by the FDA to treat tapeworms also appears to prevent the Zika virus from replicating in Petri dishes in a lab. The research was published in the prestigious science journal Nature. Tweaking the tapeworm drug could lead to an effective Zika treatment.
August 31, 2016
By Chelsea Follett
It can be hard to remember that even in wealthy countries, food has not always been abundant, and in many parts of the world hunger remains a problem. Fortunately, we are making great headway towards solving it. Here are five charts summarizing the incredible progress that humanity has made against hunger.

1. According to data from the United Nations, as recently as 1992, over a quarter of the world’s population was undernourished. Since then, a dramatic decline in hunger has occurred, particularly in places like China where economic liberalization has led to rapid development. In 2015, the share of the world population suffering from undernourishment had fallen to about 18 percent, while in China it had fallen even further, to less than 10 percent.
2. Not only do fewer people go hungry as a share of the population, but the total number of people suffering from hunger has also declined. Despite population growth, the number of undernourished persons has fallen from over 950 million in 1992 to about 685 million in 2015. That’s almost 270 million fewer undernourished people, or a 28 percent reduction. China saw a more dramatic reduction of 51 percent: in 2015, 150 million fewer Chinese were undernourished than in 1992.

3. And even those who are malnourished are less severely malnourished. The average caloric shortfall among food-deprived persons (i.e., the number of calories by which they come up short of their daily requirement) has been shrinking. In 1992, a malnourished person typically consumed around 170 fewer calories per day than they needed. In China, the malnourished consumed 190 fewer calories than needed, on average. By 2015, the shortfall had decreased to about 100 calories worldwide and only 85 calories in China.

4. How has all of this progress been possible? In order to decrease hunger and feed a growing population, humanity has stepped up to the challenge by producing more food. The amount of food produced per person worldwide is now 20 percent greater than it was back in 2005. And back in 2005 it was almost double what it was in 1961. Thanks to the Green Revolution and subsequent innovations, crop yields (i.e., the amount of food produced per unit of land) have also risen. By producing more food per hectare, we are able to spare more land for other uses and better preserve the environment. Consider cereal yields:

5. Importantly, as the food supply has risen, the cost of food has also fallen, on average. The price index shown below has been adjusted for inflation and represents a composite of eighteen crop and livestock prices weighted by their share of global agricultural trade. Despite an uptick in food prices since 2001, the long-term trend is clearly one of decline. Today, the cost of food is less than half of what it was back in 1900.
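For readers who want to check the headline figures themselves, the percentage reduction follows directly from the rounded UN headcounts quoted above. A minimal sketch (in Python, using only the world figures stated in this article) recomputes it:

```python
# Undernourished persons worldwide, in millions, as quoted above.
world_1992 = 950
world_2015 = 685

drop = world_1992 - world_2015     # absolute decline: 265, i.e. "almost 270 million"
pct = drop / world_1992 * 100      # relative decline, in percent

print(f"{drop} million fewer undernourished ({pct:.0f}% reduction)")
# prints "265 million fewer undernourished (28% reduction)"
```

The same calculation applied to the China figures (a 150-million-person drop against its 1992 baseline) yields the 51 percent reduction cited above.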

This article first appeared in CapX.
A couple of weeks ago, a European friend of mine, who was passing through Washington, suggested that we get together for a few drinks to bitch about politics in Europe and America. "Could we," he requested in a typically dismissive way, meet at a place that serves imported beer as opposed to "the s--t that Americans drink?"

I pointed out to my snooty European friend that not all Americans drank Budweiser and Coors, and that the country abounded with thousands of breweries producing a great variety of beer. In 2013, for example, the United States had close to 2,600 breweries. Today, one source claims, there are over 3,700.

Moreover, the quality of American beer can be very high. To celebrate the annual International Beer Day, for example, one British newspaper ranked the best beers in the world. In a story titled World's best beers to try before you die, six out of the top 17, including the overall winner, came from the United States.
The eagle-eyed reader will notice the changing fortunes of the U.S. brewing industry. During the Prohibition era in the 1920s, the dead hand of the government brought the number of U.S. breweries down to zero—at least officially. Following the repeal of Prohibition, the beer industry quickly rebounded.

Beginning in the early 1940s, however, brewing activity started to decline as the brewing industry consolidated and became dominated by the likes of Anheuser-Busch, Miller and Coors. Why? Before Prohibition, A Concise History of America's Brewing Industry notes, most beer was consumed on-tap in bars or saloons. Between 10 and 15 percent of the beer was bottled, but "it was much more expensive than draught beer."

Then, in 1935, "the American Can Company successfully canned beer for the first time. The spread of home refrigeration helped spur consumer demand for canned and bottled beer, and from 1935 onward, draught beer sales have fallen markedly. The rise of packaged beer contributed to the growing industry consolidation."

What contributed to the revival of the U.S. brewing industry between the 1990s and the present? Part of the reason, surely, rests in the decline of capital costs. According to some estimates, a budding entrepreneur can start a micro-brewery for as little as $50,000. (Thanks to technological progress, each one of us can become a beer "producer" by purchasing a beer making kit for less than $200.) So, next time you raise a glass of chilled American beer, drink to technological progress and capitalism!

This article first appeared in Reason.

August 26, 2016
By Chelsea Follett
Cancer-melting drug approved for human use

A cancer-melting tablet invented in Australia has gained approval for human use in the United States. An American doctor can now prescribe the drug, venetoclax, to anyone suffering from chronic lymphocytic leukemia. The drug works by overpowering a protein that is vital to a cancer cell’s survival, causing the cell to melt. In medical trials, 80 out of 116 cancer patients who took venetoclax showed improvement, and about 20 percent of patients had their cancer melt away entirely. One patient was told he had only three weeks to live before he joined a venetoclax clinical trial. Today, after two years of taking the drug, he is healthy. Australia, the drug’s country of origin, has not yet approved its use outside of medical trials, but researchers are confident that will soon change.

Drones save lives after Italy disaster

In the wake of a devastating 6.2-magnitude earthquake that killed at least 267 people in Italy and destroyed ancient villages, rescuers are still searching through the rubble for survivors. Over 1,000 aftershocks have occurred so far, endangering the disaster responders. Fortunately, drone technology is helping to make rescue operations safer. Drones can quickly survey the extent of damage to an area by providing live video footage, while helicopters identify and retrieve survivors. Drones are quickly becoming common in rescue and recovery missions around the world, although they are sometimes hampered by negative public perceptions associating drones with war, as well as by the complexity of obtaining government flight permits.


Compound mines gold from old electronics

What if you could mine gold from your old, discarded cell phone? A team of British and American researchers has discovered a simple, nontoxic compound that allows you to do just that. The compound – composed of a hydrochloric acid leaching solution, a primary amide and a hydrocarbon solvent that smells like perfume – separates and selectively extracts gold from the array of metals present in discarded cell phones. The extraction process is low-cost, produces no environmental waste and is more efficient than previous extraction methods. Using this compound, the researchers were able to extract about 80 percent of the gold from old cell phones and other discarded electronics.

August 24, 2016
By Marian L. Tupy
They say that it is difficult to make predictions, especially about the future. Back in 2000, when Robert Mugabe started to expropriate commercial farms in Zimbabwe, thus consigning that country to economic ruin, I predicted that the good people of Zimbabwe would revolt rather than see their country go down the tubes. Sixteen years later, Mugabe is still in charge and Zimbabwe's economy has been, by and large, destroyed. Having learned a lesson—note to Bill Kristol—I have not made another prediction since.

On the upside, Mugabe will have to die someday. According to South Africa's Mail & Guardian, the 92-year-old has recently relinquished many of his responsibilities, works only 30 minutes a day and had his Singaporean doctors flown to Harare for an unspecified medical procedure. Assuming that the dictator really is on his final, unlamented leg, let us look at three highlights of his 36 years in office. (To put Mugabe's legacy in perspective, I will compare Zimbabwe with its regional neighbors: Botswana, Namibia and Zambia.)

When Mugabe took over, life expectancy in Zimbabwe was 60.5 years. It peaked in 1990 at 63 years. Then came the HIV/AIDS epidemic, and life expectancy collapsed to a low of 40.7 years in 2002. While HIV/AIDS hit the entire Southern African region, its consequences were particularly devastating in Zimbabwe, where they were exacerbated by the collapse of the healthcare system that followed the economic meltdown, by malnutrition and by the spread of other communicable diseases, such as cholera and tuberculosis. Today, life expectancy remains lower than it was 36 years ago.
Now let us look at inflation-adjusted income per capita, which was $633 in 1980. Average income rose to an all-time high in the mid-1990s, but then collapsed to $458. That's a decline of 28 percent. (I have not used my favorite income data set, which adjusts not only for inflation, but also for purchasing power parity, because it has no data for Botswana.) Contrast that with Botswana, where incomes rose by 285 percent. Even Zambia, which toyed with socialism in the 1970s and 1980s, is today richer than Zimbabwe. Worldwide, incomes rose by 57 percent and average income in Africa by 68 percent between 1980 and 2015. And let us not forget that Mugabe's economic mismanagement resulted in the second-highest hyperinflation in recorded history. According to my Cato colleague Steve Hanke, it reached 90 sextillion percent in 2008, with prices doubling every 24.7 hours.

Last, but not least, consider political freedom. Back in 1980, Zimbabwe was hardly a liberal democracy, but Mugabe, a convinced Marxist, managed to make things much worse. He turned Zimbabwe into a one-party state and sent his North Korean-trained goons to wipe out 20,000 supporters of the opposition in the province of Matabeleland. Zimbabwe's "democracy score" nosedived between 1980 and 2008, when Mugabe's ZANU-PF party was forced into a power-sharing agreement with the opposition Movement for Democratic Change.

Today, Zimbabwe, once Africa's second most sophisticated economy, is a wasteland. As the aging dictator's hold on power slips away, Mugabe's successor will face the unenviable task of undoing 36 years of failure.

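Hanke's doubling-time figure is easier to grasp as a daily rate. The short Python sketch below is only an illustration of the arithmetic, not part of his analysis: it converts the cited 24.7-hour price-doubling time into the daily inflation rate it implies, roughly 96 percent per day.

```python
def daily_inflation_from_doubling(hours_to_double: float) -> float:
    """Daily inflation rate implied by a given price-doubling time.

    If prices double every `hours_to_double` hours, then over a
    24-hour day they grow by a factor of 2 ** (24 / hours_to_double).
    """
    return 2 ** (24 / hours_to_double) - 1

# Zimbabwe at its November 2008 peak: prices doubling every 24.7 hours.
rate = daily_inflation_from_doubling(24.7)
print(f"Implied daily inflation: {rate:.0%}")  # roughly 96%
```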
DNA Database Offers Disease Insights  

A giant DNA database that pulled data from over 60,000 people from diverse parts of the world is helping scientists pinpoint the causes of disease. Analyzing that huge amount of DNA allowed an international team of researchers to identify 3,000 previously unknown genes that may cause disease. They were also able to conclude that 160 genetic mutations previously thought to be connected to disease are in fact harmless. The researchers focused on a number of different diseases, ranging from muscular dystrophy and cystic fibrosis to some types of heart disease.

Self-Driving Uber Fleet Arriving in Pittsburgh  

This month, the ride-sharing company Uber will unleash a new fleet of self-driving cars in Pittsburgh, Pennsylvania. To persuade customers to give the new service a try, whenever a passenger opts for a self-driving vehicle using the Uber smartphone app, the ride will be free. All of the self-driving vehicles will have human backup drivers behind the wheel in case they are needed. If all goes well in Pittsburgh, Uber hopes to one day extend the self-driving car service to other parts of the country.

Manmade Neurons Bring Artificial Intelligence Closer  

Tech company IBM made a breakthrough earlier this month in Zurich: a complete, fully functional artificial neuron. Neurons are the cells responsible for much of the action in the human brain. They recognize patterns, send electrical signals to other neurons and form connections. They are not perfectly predictable, and that slight degree of randomness actually makes them better at some tasks than computers. The researchers plan to link many artificial neurons into a network like the one in the human brain. Studying artificial neurons could help researchers gain a fuller understanding of how the brain works, and could one day help them develop human-like artificial intelligence.

Biometric Payments Could Be the Future

Everything you ever wanted to buy could soon be a literal finger’s tap away. A Japanese biometric startup, Liquid Inc., is introducing a mobile fingerprint-payment system that would allow businesses to use fingerprints both to authorize payments and to access customers’ financial data, without the need for a credit card. A few businesses in Japan have already implemented Liquid Inc.’s system. Skeptics doubt that the new system can match the speed and accuracy of credit cards or other mobile payments, but others hope it will be common by the 2020 Olympic Games.