Apple

President Obama received a lot of outraged criticism from the right during the 2012 campaign for his remark, “You didn’t build that.” What he meant, though he expressed it in an uncharacteristically clumsy way, was that behind every proudly self-made entrepreneur there is a huge web of supporting institutions and infrastructure built by the government. These essential supports include the obvious – the courts, the Interstate highway system, police and fire departments – but also a tremendous array of investments undertaken by national governments, many of which have provided a platform on which entrepreneurs can build. A recent article in The Economist, reviewing a new book entitled “The Entrepreneurial State” by Mariana Mazzucato of Sussex University in England, makes this explicit. And nowhere is this investment activity more influential than in the United States, supposedly the cradle of unbridled individual enterprise.

Beneficiaries of these investments include Apple: “The armed forces pioneered the Internet, GPS positioning and voice-activated ‘virtual assistants.’ They also provided much of the early funding for Silicon Valley. Academic scientists in publicly funded universities and labs developed the touchscreen and the HTML language. An obscure government body even lent Apple $500,000 before it went public.” They also include Google, which received early funding from the National Science Foundation. Pharmaceutical and biotechnology companies benefit from the $30 billion in annual funding for biomedical research from the National Institutes of Health.

As Ms. Mazzucato argues, “The entrepreneurial state does far more than just make up for the private sector’s shortcomings: through the big bets it makes on new technologies, such as aircraft or the internet, it creates and shapes the markets of the future. At its best the state is nothing less than the ultimate Schumpeterian innovator—generating the gales of creative destruction that provide strong tailwinds for private firms like Apple.”

There are reasons to be skeptical of some government investments. The Solyndra debacle, in which the Federal government provided $535 million in loan guarantees to a California solar-panel manufacturer that promptly went bust in the face of low-cost competition from China, is a cautionary example of the dangers of bureaucrats playing at being venture capitalists. But the rallying cry of the Tea Party – the Randian (Ayn and Paul) notion that the state is essentially a parasite feeding on the efforts of bold and visionary individual entrepreneurs – is a pure fairy tale.

The Economist article asks “why are some states successful entrepreneurs while others are failures?” and it provides an answer: “Successful states are obsessed by competition; they make scientists compete for research grants, and businesses compete for start-up funds—and leave the decisions to experts, rather than politicians or bureaucrats. They also foster networks of innovation that stretch from universities to profit-maximizing companies, keeping their own role to a minimum.” This sounds much like the blueprint for Silicon Valley, or any other successful technology-based industrial cluster.

In our current budget-cutting environment, these essential investments are under threat. According to a recent article in The Huffington Post, sequestration will cost the NIH 5% of its budget, or $1.7 billion, forcing the cancellation of 700 competitive research grants in the current Federal budget year. Similarly, the National Science Foundation is expected to issue 1,000 fewer research grants this year as a result of sequestration.

According to the Congressional Budget Office, the Federal Highway Trust Fund, which is funded by taxes on gasoline and diesel and which provides a substantial portion of the money states use to maintain state and national roads, is essentially insolvent and will run out of money completely by 2015. The only ways to avoid this would be to cut transportation funding by 92% or to raise the Federal gasoline tax by 50 cents a gallon, and it’s hard to imagine the Republican-controlled House of Representatives going along with the latter.

In the current political environment, much of the public investment that enables American businesses to innovate and prosper is under threat, mainly from ideologues who refuse to recognize the essential role that government initiative and funds have played since the founding of the Republic, from the Erie Canal to the interstate highway system, to nanotechnology research.

It is worth considering these facts in the context of the current legal battle between Verizon and the Federal Communications Commission. Verizon is challenging a net-neutrality order adopted by the FCC in 2010, which states that internet service providers (ISPs) cannot block lawful content and mobile broadband providers cannot block lawful websites. Verizon contends that the order violates its First Amendment rights. More to the point, Verizon and its competitors, including Comcast and AT&T, maintain that they spent billions of dollars to build their networks and should be able to grant or deny access as they please, or charge different customers different rates for transmitting on their networks. This is a spurious argument. The broadband companies are “common carriers,” a term that applies not only to telecoms companies but also to airlines, railroads, and trucking companies: they offer their services to the general public under a license or authority granted by a public regulatory body. Common carriers are subject to licensing requirements because they are using a public resource – radio spectrum, rights of way, public roads, or air traffic control systems – without which they would be unable to operate. In the case of the telecoms companies, they are also using a resource – the Internet – that was developed by DARPA, the U.S. Defense Advanced Research Projects Agency.

The Internet and the public airwaves are not the property of those companies to use as they please, free from any oversight or interference. It is time to tell them, “No, you didn’t build that.”



Foxconn International Holdings, the world’s largest contract manufacturer of electronic components, made notorious last year by a rash of employee suicides at its Chinese factories, recently published its half-yearly financial results, which showed that its annual labor costs per employee have risen by a third over the past year, to $2,900.

Foxconn, which is 71% owned by Hon Hai Precision Industry of Taipei and which also assembles products for Sony, Dell, and Hewlett-Packard, employs an estimated 400,000 people at its two factories in Shenzhen (Hon Hai, with 800,000 employees, is the 10th-largest employer in the world). These people, most of them young, many of them women, work 11-hour shifts, seven days a week. According to the New York Times, Mr. Ma Xiangqiang, a 19-year-old Foxconn employee who jumped to his death from a Foxconn dormitory in January 2010, had worked 286 hours in the month prior to his suicide, including 112 hours of overtime, more than three times the legal limit. By all accounts, Foxconn is not a fun place to work, combining some of the worst features of military service, summer camp, and prison, but the problems facing Foxconn are far from unique.


In the Financial Times, possibly the best newspaper in the world and full of intelligent reporting and comment, John Kay stands out for his incisive analysis and economy of expression. One of his most recent articles, “Why you can have an economy of people who don’t sweat,” takes to task the “manufacturing fetishists” who believe that any economic activity apart from manufacturing, agriculture, or mining is of minimal real value. We are all prey to this attitude to some degree, especially in the wake of the financial crisis, in which economies that rely heavily on a somehow “unreal” financial sector fared worse than those more focused on the production of things you can drop on your toe.

I have long been troubled by this attitude. Somehow, a banker inventing and flogging new forms of derivatives or trading algorithms seems morally and economically less worthy than someone who makes pig iron or beer for a living, and perhaps that view is justified. But on close examination, many other jobs that involve more brains than brawn generate far more real value, however you care to measure it, than tightening bolts or stitching sleeves on an assembly line.

Kay mentions Apple’s iPod as a product in which the value added by manufacturing the device itself, including the extraction and processing of the metals and plastics it contains, amounts to three or four dollars of the two or three hundred dollars the finished product sells for. Product and production design, marketing and promotion, and logistics and distribution are where the real value is created. This is increasingly true of almost any physical product we buy. When you pay several dollars a tablet for your heart medicine or Viagra, the value resides not in the cost of producing the pill itself, which in most cases amounts to fractions of a cent, but in the billions of dollars spent on research, development, and testing: the intellectual value. Even if pharmaceutical companies are often guilty of inventing cures for conditions no one knew existed until they saw the TV commercials, the intrinsic value of much of what they produce is incalculable, and has little to do with the hourly wages of workers on the assembly line. Kay also cites the example of book publishing, in which “the books that Britain exports have, for as long as I can remember, been made from trees grown abroad; but then globalization meant the paper was also made abroad, and increasingly the printing took place overseas. Soon shipments will be entirely electronic; selling a book will involve no physical objects.”

Lately I have been working on the development of industrial parks in Haiti, in which international garment manufacturers, attracted by cheap labor, special trade preferences, and proximity to the U.S. market, will set up factories.

This is unpopular in many quarters. Anti-globalists point to the exploitation of poor Haitians. A Haitian garment worker can expect to make less than six percent of what an American earns on minimum wage. Working and living conditions will be poor, and a worker might spend a quarter of his daily earnings on transport to and from work. Though the factories are for the most part clean, well lit, and well ventilated, it is not an easy life. Many also decry the shipment of American jobs overseas, as if employing Americans as minimum-wage sewing machine operators were vital to our national interest.

Most of the garment manufacturers likely to invest in Haiti’s industrial parks are Korean companies. Once Korea itself was a huge manufacturer and exporter of clothing, but as Korea grew richer it could no longer compete with lower-wage countries. But the Korean companies that had formerly made garments in Korean factories using Korean workers did not quit the business. Instead, they began setting up factories in places like Vietnam and Cambodia and Honduras and Nicaragua, which, with Korean experience and know-how, became some of the most efficient manufacturing operations in the world. The Korean companies integrated these factories into their global supply chains, becoming suppliers of choice to U.S. clothing companies and retailers like Levi Strauss, Wal-Mart, Nike, and Gap.

When you buy a pair of Levi’s made in Haiti, what is the source of value in the product? Is it the product design, distribution, quality control, and marketing provided by Levi Strauss? Is it the factory design and construction, production engineering and management, fabric sourcing, and logistics provided by the Korean manufacturer? Or is it the Haitian sewing machine operator? Obviously, all participants in this complex supply chain play an essential role, but you’d have to be a devout Marxist to argue that the product’s value resides exclusively or mainly in the direct manufacturing labor. That may have been the case in the 19th century, but certainly is not in the 21st. What differentiates one product from another, except perhaps for hand-made watches costing $100,000 and up, is not the skill and hard work of the laborer but the design, engineering, production management, logistics, distribution, and marketing that transform an idea into a tangible item.

It goes even further. The end product needn’t even be tangible, as John Kay points out in his example of book publishing. To put it another way, why do most of us instinctively feel that manufacturing a TV set is somehow a more worthy activity than producing a TV show?

Our entire way of looking at the economy is conditioned by this bias. We talk about our balance of trade, but although statisticians and econometricians do their best to quantify trade in services, such data are much harder to capture than information on trade in physical goods. When a software package is exported or an engineering firm designs and manages construction of a new road in another country, those transactions show up in the balance of trade of both the exporting and the importing country. But it gets far more complex when we consider all the trade in non-tangible items and services that takes place within a single company or supply chain. When Gap sends its new T-shirt design to a manufacturer in Haiti, it is effectively exporting its intellectual property to the manufacturer. When an engineer flies from Korea to Haiti to oversee the retooling of the manufacturing line for the new product, his employer in Korea is exporting his know-how. But it’s unlikely that either of these transactions shows up in official trade statistics. So, to complement our manufacturing bias, we have a system of recording data that fails to capture much of the value of non-physical trade and thus devalues that trade in the estimation of both the public and the policy makers who rely on those data.

It gets worse. Engineering is still considered a noble endeavor, but mainly because it remains directly linked to the production of something physical, be it a toaster, a bridge, or a pair of pantyhose. But a designer? Someone who slaps his initials on a T-shirt so as to sell it for ten times the price of the equivalent generic product. Where’s the value in that? And what about the marketers, the shippers, the wholesalers, the retailers, and the advertisers, not to mention the customer support staff? Useless parasites all of them, in the popular view, selling us stuff we don’t need and adding layer upon layer of profit to make everything more expensive than it should be. A moment’s reflection should dispel that belief, but it doesn’t. So Wal-Mart, which sources much of its merchandise from China, is considered a villain for shipping American jobs overseas, even though it has devised one of the most sophisticated systems of distribution, logistics, and inventory management on the planet and employs 2.1 million people worldwide, 1.4 million of them in the U.S.

The garment industry is an especially pertinent example, because it employs so many people and, in many poor countries, is the first step on the path to industrialization and greater prosperity. It is also one of the first industries to leave a place when the cost of production rises, usually as a function of rising wages. It happened in the U.S. right after WWII, when New England textile mills moved to the Carolinas, attracted by lower wages, less powerful unions, and cheaper electricity. It has happened again as textile and garment companies moved their factories to Mexico and El Salvador and Lesotho and China, and now Vietnam, Cambodia, and Haiti. Each time this happens there are recriminations and self-criticism. We are losing the manufacturing base that makes us strong. Our leaders should have done more to keep the jobs here. Those underhanded foreigners are grabbing “our” jobs.

If this were actually true, and if the migration of low-skill, low-paid jobs to poorer countries were such a threat, unemployment in the U.S. would be much higher than it is, whereas, until the current recession hit, we had one of the lowest unemployment rates of any advanced country. Some proportion of the U.S. labor force does work slinging burgers, greeting shoppers at Wal-Mart, and calling you at dinnertime to sell you a new cable TV service, but that’s only a fraction of the total. In Massachusetts, where I live, many of the textile jobs that went south and then overseas were replaced by higher-skilled and better-paid jobs in information technology, telecommunications, biotechnology, and finance. If our government had tried to resist the economic tide and preserve those textile jobs, these new industries might never have emerged. We should rejoice at these trends instead of seeing them as emblematic of the failure of our system of economic governance.

In John Kay’s words, “The productivity of modern economies is based on the division of labor. If everyone grows their own food, and gathers their own fuel, it takes them most of the day. There is little time or energy left for conversation, entertainment, trading derivatives or inventing new goods… specialization of tasks gave opportunities to achieve economies of scale and to focus on tasks at which individuals or companies were, or became, particularly skilled. Less time had to be devoted to toolmaking, hunting and foraging, and more was available for chatting, playing music, hairdressing, insurance broking and discovering how the world worked. Some new activities required rarer skills and were consequently well rewarded…the division of labor becomes ever finer and generally increases the wealth of all involved in the production process. Perhaps it is time for manufacturing fetishists to move beyond categories set by Stone Age man’s requirements for food and shelter.”


One could argue that true genius in business is more about giving consumers things they don’t even know they want than about giving them what they want or say they want. I remember the first time I saw the Apple iPod. I went online immediately and ordered one, even though in those dark ages I had to purchase third-party software to make it run with my PC. I also remember when the Sony Walkman, and later the CD player and disc, were introduced, and although I was slower to get those, I marveled at the genius of the people who gave us such elegant solutions to problems most of us were only dimly aware we had.


In 2003, motivated by the savagery of civil wars in Sierra Leone and Liberia, 75 countries joined a U.N.-sponsored global initiative to prevent trade in “conflict diamonds,” popularly referred to as “blood diamonds.” Conflict diamonds are gems mined in areas afflicted by armed conflict, the proceeds of which go to purchase arms and other materiel to prolong and intensify the conflict, which is usually all about control of those same diamond deposits. This initiative, called the Kimberley Process, instituted a system of certification under which governments of both source countries and purchasing countries would collaborate to prevent conflict diamonds from being sold internationally. The Kimberley Process was endorsed by major diamond producers, including world market leader De Beers, to avoid being tainted by the blood diamond label and, perhaps coincidentally, to reinforce their market dominance by banning trade in stones of uncertain provenance. But it was also a good-faith effort to put an end to the spread of vicious conflicts motivated and fueled by mineral resources.

Less well-known than the conflicts in West Africa is the civil war that continues to rage in parts of the Democratic Republic of Congo (DRC), known at various points in its history as Zaire, the Belgian Congo, and the Congo Free State, which in the late 19th and early 20th centuries was the private preserve of Leopold II, King of the Belgians. The current war, which dates back to the 1994 Rwandan genocide and the overthrow of dictator Mobutu Sese Seko in 1997 and has its roots in earlier political and ethnic squabbles, is reckoned to be the deadliest armed conflict since the Second World War, claiming over five million lives between 1998 and 2008.
