Cato Op-Eds

Individual Liberty, Free Markets, and Peace
Subscribe to Cato Op-Eds feed

Jim Harper

You’ve probably heard some version of the joke about the chemist, the physicist, and the economist stranded on a desert island. With a can of food but nothing to open it, the first two set to work on ingenious technical methods of accessing nutrition. The economist declares his solution: “Assume the existence of a can opener!”…

There are parallels to this in some U.S. state regulators’ approaches to Bitcoin. Beginning with the New York Department of Financial Services six months ago, regulators have put proposals forward without articulating how their ideas would protect Bitcoin users. “Assume the existence of public interest benefits!” they seem to be saying.

When it issued its “BitLicense” proposal last August, the New York DFS claimed “[e]xtensive research and analysis” that it said “made clear the need for a new and comprehensive set of regulations that address the novel aspects and risks of virtual currency.” Yet, six months later, despite promises to do so under New York’s Freedom of Information Law, the NYDFS has not released that analysis, even while it has published a new “BitLicense” draft.

Yesterday, I filed comments with the Conference of State Bank Supervisors (CSBS) regarding their draft regulatory framework for digital currencies such as Bitcoin. CSBS is to be congratulated for taking a more methodical approach than New York. They’ve issued an outline and have called for discussion before coming up with regulatory language. But the CSBS proposal lacks an articulation of how it addresses unique challenges in the digital currency space. It simply contains a large batch of regulations similar to what is already found in the financial services world.

The European Banking Authority took a welcome tack in its report on Bitcoin last July, submitting itself to the rigor of risk management. The EBA sought to identify the risks that digital currency poses to consumers, merchants, and a small variety of other interests. The EBA report did not apply risk management as well as it could have, and it came to unduly conservative results in terms of integrating Bitcoin into the European financial services system, but the small number of genuine risks it identified can form the basis of discussion about solutions.

It is very hard to assess a batch of solutions put forward without an articulation of the problems they are intended to solve, which is unfortunately what the draft model regulatory framework does. Hopefully, future iterations of CSBS’s work will include that needed articulation.

My comment spends some time on the assumption that state-by-state licensing for financial services providers has benefits that justify its large costs. “The public interest benefits of licensing obviously do not increase arithmetically with each additional license,” I wrote, assuming correctly, I hope, how the world works. CSBS is in a unique position to streamline the licensing regime.

I also caution CSBS about the assumption that making our finances “transparent to law enforcement” is an appropriate regulator’s role. The Supreme Court has been moving away from the Fourth Amendment doctrine under which some 1970s cases appeared to take constitutional protection away from our financial activities. Financial services regulators should take the side of law-abiding consumers on the question of financial privacy.

The CSBS effort is a sincere and fair one, and I think the organization is in a good position to steer its members away from technology-specific regulation like we saw from New York. I look forward to continued, deeper discussion with CSBS and to more work that integrates Bitcoin into the U.S. financial services system.

Nicole Kaeding

The Department of Energy (DOE) is admitting that it failed. Last week, it announced that it will stop development of FutureGen 2.0, a federally financed, coal-fired power plant in Illinois. FutureGen and its successor, FutureGen 2.0, wasted millions of tax dollars and were beset by multiple delays and cost overruns.

FutureGen was one of many federal energy projects experimenting with so-called “clean coal” technology. It sought to demonstrate the technical capabilities of carbon capture and sequestration (CCS) technology. CCS attempts to capture carbon dioxide emissions from coal-fired power plants and store them underground, preventing an increase in atmospheric carbon dioxide.

FutureGen was launched in 2003 by the George W. Bush administration as a public-private partnership to demonstrate CCS with a site chosen in Illinois. Costs would be shared among the federal government and 12 private energy companies. The project’s estimated cost grew from $1 billion to $1.8 billion by 2008, when it was cancelled due to the cost overruns.  

In 2010 the Obama administration revived the project using stimulus funding. The new project, FutureGen 2.0, was allotted $1 billion from the federal government, with private investors supposed to be providing additional funding.

The project was plagued with problems. Estimated costs grew quickly, rising from $1.3 billion to $1.65 billion. The Congressional Research Service cited “ongoing issues with project development, [and] lack of incentives for investment from the private sector.” Private investors were unwilling to invest in the project. As of August 2014, the FutureGen Alliance had yet to raise the $650 million in private debt and equity needed. There were additional concerns about the legality of a $1-a-month surcharge, to be added to the electricity bills of all Illinois residents, that would have subsidized the project. Late last year, the Illinois Supreme Court agreed to hear the case.

Now DOE has announced that it will suspend funding for the project. Energy Secretary Ernest Moniz told reporters, “frankly, the project has got a bunch of challenges remaining,” which is a startling admission from the administration. DOE said that the project failed to make enough progress to keep it alive and would not meet a September 30, 2015, deadline for spending the remaining stimulus funds that it had been allotted.

The project spent $202.5 million of the $1 billion before being cancelled. Together, the two iterations of FutureGen ended up costing taxpayers $378 million.

A related issue is that proposed regulations from the Obama administration would functionally require CCS for all new coal-fired power plants in the United States. But with the failure of FutureGen, the federal government has not demonstrated that the technology works as intended. DOE’s other CCS demonstration project, in Mississippi, is experiencing delays as well. Some experts question whether CCS is even technologically feasible at a cost-effective price.

FutureGen and FutureGen 2.0 are part of a long list of DOE failures. Repeating mistakes made during the Bush administration, DOE reopened FutureGen, which put millions more tax dollars at risk. DOE should stop trying to centrally plan technological advances, and instead let entrepreneurs experiment and the market guide the nation’s energy progress.

Patrick J. Michaels

Matt Drudge has been riveting eyeballs by highlighting a London Telegraph piece calling the “fiddling” of raw temperature histories “the biggest science scandal ever.” The fact of the matter is some of the adjustments that have been tacked onto some temperature records are pretty alarming—but what do they really mean?

One of the more egregious ones has been the adjustment of the long-running record from Central Park (NYC). Basically, it’s been flat for over a hundred years, but the National Climatic Data Center, which generates its own global temperature history, has stuck a warming trend of several degrees into it during the last quarter-century, simply because it doesn’t agree with some other stations (stations which, unlike Central Park, are not in the stable urban core of Manhattan).

Internationally, Cato Scholar Ross McKitrick and yours truly documented a propensity for many African and South American stations to report warming that really isn’t happening.  Some of those records, notably in Paraguay and central South America, have been massively altered.

At any rate, Chris Booker, author of the Telegraph article, isn’t the first person to be alarmed at what has been done to some of the temperature records. Others, such as Richard Muller of UC-Berkeley, along with Steven Mosher, were so concerned that they literally re-invented the surface temperature history from scratch. In doing so, they found the “adjustments” really don’t make all that much difference when compared to the larger universe of data. While this result has been documented by the scientific organization Berkeley Earth, it has yet to appear in one of the big climate journals, a sign that it might be having a rough time in the review process.

That’s quite different from what was found in 2012 by two Greek hydrologists, E. Steirou and D. Koutsoyiannis, who analyzed a sample of weather stations used to calculate global temperature and found that the adjustments were responsible for about half of the observed warming when compared to the raw data. Their work was presented at the annual meeting of the European Geosciences Union but has not been published subsequently in the scientific literature. That’s not necessarily a knock on it, given the acrimonious nature of climate science, but it seems that if it were an extremely robust, definitive paper, it would have seen the light of day somewhere.

But, before you cry “science scandal” based upon the Greek results, it’s a fact that one of the commonly applied adjustments—accounting for the biases introduced by the time of day at which the high and low temperatures for the previous 24 hours are recorded—does induce warming into most records, and that change is scientifically justified.

In sum, I’d hold fire about “the biggest science scandal ever.” The facts are:

  • when the global temperature records were reworked by people as skeptical as yours truly, nothing much emerged;
  • some of the data have been mangled, like the Central Park record—and there are serious problems over some land areas in the Southern Hemisphere; and
  • some of the adjustments for measurement biases introduce scientifically defensible warming trends.

David Boaz

I’m delighted to announce that my new book, The Libertarian Mind: A Manifesto for Freedom, goes on sale today. Published by Simon & Schuster, it should be available at all fine bookstores and online book services.

I’ve tried to write a book for several audiences: for libertarians who want to deepen their understanding of libertarian ideas; for people who want to give friends and family a comprehensive but readable introduction; and for the millions of Americans who hold fiscally responsible, socially tolerant views and are looking for a political perspective that makes sense. 

The Libertarian Mind covers the intellectual history of classical liberal and libertarian ideas, along with such key themes as individualism, individual rights, pluralism, spontaneous order, law, civil society, and the market process. There’s a chapter of applied public choice (“What Big Government Is All About”), and a chapter on contemporary policy issues. I write about restoring economic growth, inequality, poverty, health care, entitlements, education, the environment, foreign policy, and civil liberties, along with such current hot topics as libertarian views of Bush and Obama; America’s libertarian heritage as described by leading political scientists; American distrust of government; overcriminalization; and cronyism, lobbying, the parasite economy, and the wealth of Washington.

The publisher is delighted to have this blurb from Senator Rand Paul: 

“They say the libertarian moment has arrived. If you want to understand and be part of that moment, read David Boaz’s The Libertarian Mind where you’ll be drawn into the ‘eternal struggle of liberty vs. power,’ where you’ll learn that libertarianism presumes that you were born free and not a subject of the state. The Libertarian Mind belongs on every freedom-lover’s bookshelf.”

I am just as happy to have high praise from legal scholar Richard Epstein:

“In an age in which the end of big government is used by politicians as a pretext for bigger, and worse, government, it is refreshing to find a readable and informative account of the basic principles of libertarian thought written by someone steeped in all aspects of the tradition. David Boaz’s Libertarian Mind unites history, philosophy, economics and law—spiced with just the right anecdotes—to bring alive a vital tradition of American political thought that deserves to be honored today in deed as well as in word.” 

Find more endorsements here from such distinguished folks as Nobel laureate Vernon Smith, John Stossel, Peter Thiel, P. J. O’Rourke, Whole Foods founder John Mackey, and author Jonathan Rauch. And please: buy the book. Then like it on Facebook, retweet it from https://twitter.com/David_Boaz, blog it, buy more copies for your friends.

 

Chris Edwards

In recent decades, the Democratic Party has moved far to the left on economic policy. I have discussed the leftward shift on tax policy, which was illustrated once again by President Obama’s generally awful proposals in his new budget (see here, here, and here).

What about regulations? Consider the following statement by President Jimmy Carter on his signing a landmark railroad deregulation bill in 1980. Have you ever heard President Obama express such views or push for similar sorts of legislation?

Today I take great pleasure in signing the Staggers Rail Act of 1980. This legislation builds on the railroad deregulation proposal I sent to Congress in March 1979. It is vital to the railroad industry and to all Americans who depend upon rail services.

By stripping away needless and costly regulation in favor of marketplace forces wherever possible, this act will help assure a strong and healthy future for our Nation’s railroads and the men and women who work for them. It will benefit shippers throughout the country by encouraging railroads to improve their equipment and better tailor their service to shipper needs. America’s consumers will benefit, for rather than face the prospect of continuing deterioration of rail freight service, consumers can be assured of improved railroads delivering their goods with dispatch …

This act is the capstone of my efforts over the past 4 years to get the Federal Government off the backs of private industry by removing needless, burdensome regulation which benefits no one and harms us all. We have deregulated the airlines, a step that restored competitive forces to the airline industry and allowed new, innovative services. We have freed the trucking industry from archaic and inflationary regulations, an action that will allow the startup of new companies, encourage price competition, and improve service. We have deregulated financial institutions, permitting banks to pay interest on checking accounts and higher interest to small savers and eliminating many restrictions on savings institutions loans.

Where regulations cannot be eliminated, we have established a program to reform the way they are produced and reviewed. By Executive order, we have mandated regulators to carefully and publicly analyze the costs of major proposals. We have required that interested members of the public be given more opportunity to participate in the regulatory process. We have established a sunset review program for major new regulations and cut Federal paperwork by 15 percent. We created a Regulatory Council, which is eliminating inconsistent regulations and encouraging innovative regulatory techniques saving hundreds of millions of dollars while still meeting important statutory goals. And Congress recently passed the Regulatory Flexibility Act, which converts into law my administrative program requiring Federal agencies to work to eliminate unnecessary regulatory burdens on small business. I am hopeful for congressional action on my broad regulatory reform proposal now pending.

Today these efforts continue with deregulation of the railroad industry and mark the past 4 years as a time in which the Congress and the executive branch stepped forward together in the most significant and successful deregulation program in our Nation’s history. We have secured the most fundamental restructuring of the relationship between industry and government since the time of the New Deal.

In recent decades the problems of the railroad industry have become severe. Its 1979 rate of return on net investment was 2.7 percent, as compared to over 10 percent for comparable industries. We have seen a number of major railroad bankruptcies and the continuing expenditure of billions of Federal dollars to keep railroads running. Service and equipment have deteriorated. A key reason for this state of affairs has been overregulation by the Federal Government. At the heart of this legislation is freeing the railroad industry and its customers from such excessive control.

Steve H. Hanke

My misery index — a simple sum of inflation, lending rates, and unemployment rates, minus year-on-year per capita GDP growth — ranks every country for which suitable data exist, 108 in all. The table below is the sub-index of the Latin American countries included in the world misery index.

A higher score in the misery index means that the country, and its constituents, are more miserable. Indeed, this is a table where you do not want to be first.
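
To make the arithmetic concrete, here is a minimal sketch of the calculation described above. The component values are hypothetical placeholders, not figures from the actual index.

```python
# Hanke's misery index: inflation + lending rate + unemployment rate,
# minus year-on-year per capita GDP growth (all in percent).
def misery_index(inflation, lending_rate, unemployment, gdp_growth_per_capita):
    return inflation + lending_rate + unemployment - gdp_growth_per_capita

# Hypothetical component values, for illustration only.
sample = {
    "Country A": (40.0, 18.0, 8.0, -3.0),  # high inflation, contracting economy
    "Country B": (2.5, 6.0, 4.5, 3.0),     # low inflation, growing economy
}

# Rank from most to least miserable.
for name, components in sorted(sample.items(), key=lambda kv: misery_index(*kv[1]), reverse=True):
    print(f"{name}: {misery_index(*components):.1f}")
```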

Venezuela and Argentina, armed with aggressive socialist policies, end up the most miserable in the region. On the other hand, Panama, El Salvador, and Ecuador score the best on the misery index for Latin America. Panama, with roughly one tenth the misery index score of Venezuela, has used the USD as legal tender since 1904. Ecuador and El Salvador are also both dollarized (Ecuador since 2000 and El Salvador since 2001) – they use the greenback, and it is clear that the embrace of the USD trumps all other economic policies.

The lesson to be learned is clear: the tactics which socialist governments like Venezuela and Argentina employ yield miserable results, whereas dollarization is associated with less misery.

Chris Edwards

President Obama proposed an expansive spending plan for highways, transit, and other infrastructure in his 2016 budget.

Here are some of the problems with the president’s approach:

  • Misguided Funding Source. The president proposes hitting U.S. corporations with a special 14 percent tax on their accumulated foreign earnings to raise $238 billion. This proposal is likely going nowhere in Congress, and it is bad economic policy. The Obama administration seems to view the foreign operations of U.S. companies as an enemy to be punished, but in fact foreign business operations generally complement U.S. production and help boost U.S. exports.
  • Increases Spending. The Obama six-year transportation spending plan of $478 billion is an increase of $126 billion above current spending levels. Instead of increasing federal spending on highways and transit, we should be cutting it, as it is generally less efficient than state-funded spending. To close the Highway Trust Fund (HTF) gap, we should cut highway and transit spending to balance it with current HTF revenues, which mainly come from gas and diesel taxes.
  • Increases Central Control. The Obama plan would increase federal subsidies for freight rail and “would require development of state and regional freight transportation plans,” according to this description. But freight rail has been a great American success story since it was deregulated by President Jimmy Carter in 1980. So let’s not reverse course and start increasing federal intervention again. Let’s let Union Pacific and the other railroads make their own “plans”; we don’t need government-mandated plans.
  • Undermines User Pays. For reasons of both fairness and efficiency, it is a good idea to fund infrastructure with charges on infrastructure users. In recent decades, the HTF has moved away from the original user-pays model of gas taxes funding highways, as funds have been diverted to mass transit, bicycle paths, and other activities. Obama would move further away from user pays, both with his corporate tax plan and with his proposed replacement of the HTF with a broader Transportation Trust Fund.
  • Expands Mass Transit Subsidies. The Obama plan would greatly increase spending on urban bus and rail systems. But there is no proper federal role in providing subsidies for such local activities. Indeed, federal transit subsidies distort efficient local decision making—the lure of “free” federal dollars induces local politicians to make unwise and wasteful choices. Arlington, Virginia’s million-dollar bus stop is a good example.

For background on the transportation battle heating up in Congress, see my articles here and here. And see the writings of Randal O’Toole, Robert Poole, Emily Goff, and Ken Orski.

And you can check out the writings of Robert Puentes of Brookings, who joined me on C-Span today to discuss these issues.

David Boaz

Both Jeb Bush and Rand Paul are talking about broadening the appeal of the Republican Party as they move toward presidential candidacies. Both say Republicans must be able to compete with younger voters and people of all racial backgrounds. Both have talked about the failure of welfare-state programs to eliminate urban poverty. But they don’t always agree. Bush sticks with the aggressive foreign policy that came to be associated with his brother’s presidency, while Paul wants a less interventionist approach. Bush calls for “smarter, effective government” rather than smaller government, while Paul believes that smaller government would be smarter. Perhaps most notoriously, Bush strongly endorses the Common Core educational standards, building on George W. Bush’s policy of greater federal control of schooling.

Meanwhile, Paul promises to bring in new audiences by talking about foreign policy and civil liberties. As Robert Costa reported from an Iowa rally this weekend:

Turning to civil liberties, where he has quarreled with hawkish Republicans, Paul chastised the National Security Agency for its surveillance tactics. “It’s none of their damn business what you do on your phone,” he said. 

“Got to love it,” said Joey Gallagher, 22, a community organizer with stud earrings, as he nursed a honey-pilsner beer. “It’s a breath of fresh air.”

But the rest of Paul’s nascent stump speech signaled that as much as he wants to target his father’s lingering network, he is eager to be more than a long-shot ideologue.

Paul cited two liberals, Sen. Bernard Sanders (I-Vt.) and Rep. Alan Grayson (D-Fla.), during his Friday remarks and said he agrees with outgoing Attorney General Eric H. Holder Jr. on curbing federal property seizures and softening sentencing laws for nonviolent drug offenders — all a nod to his efforts to cast himself as a viable national candidate who can build bipartisan relationships and expand his party’s political reach.

“Putting a kid in jail for 55 years for selling marijuana is obscene,” Paul said.

Alan Grayson and Eric Holder? That’s pushing the Republican comfort zone. And what was the reception?

“Just look at who’s here,” said David Fischer, a former Iowa GOP official, as he surveyed the crowd at Paul’s gathering Friday at a Des Moines winery. “He is actually bringing women, college students and people who are not white into the Republican Party.”

That’s his plan. It’s a real departure from the unsuccessful candidacies of old, hawkish John McCain and old, stuffy Mitt Romney. It just might create the kind of excitement that Kennedy, Reagan, and Obama once brought to presidential politics. The question is whether those new audiences will show up for Republican caucuses and primaries to join the small-government Republicans likely to be Paul’s base.

Patrick J. Michaels and Paul C. "Chip" Knappenberger

You Ought to Have a Look is a feature from the Center for the Study of Science posted by Patrick J. Michaels and Paul C. (“Chip”) Knappenberger. While this section will feature all of the areas of interest that we are emphasizing, the prominence of the climate issue is driving a tremendous amount of web traffic. Here we post a few of the best in recent days, along with our color commentary.

Some folks are just slow to get it.

There is no way on God’s greening earth that international negotiators are going to achieve the emissions reductions that climate models tell them are necessary to keep the rise in the planet’s average surface temperature to less than 2°C above the pre-industrial value.

At the United Nations climate meeting held in Cancun back in 2010, after kicking around the idea for several years, negotiators foolishly adopted 2°C as the level associated with a “dangerous interference” with the climate system—what everyone agreed to try to avoid way back in 1992 under the Rio Treaty.

Bad idea—it won’t happen. Even the folks at the U.N. are starting to realize it.

According to an article in this week’s The Guardian titled “Paris Climate Summit: Missing Global Warming Target ‘Would Not Be a Failure’”:

EU climate chief and UN’s top climate official both play down expectations that international climate talk pledges will help hit 2C target… “2C is an objective,” Miguel Arias Canete, the EU climate chief, said. “If we have an ongoing process you can not say it is a failure if the mitigation commitments do not reach 2C.”

…In Brussels, meanwhile, the UN top climate official, Christiana Figueres, was similarly downplaying expectations, telling reporters the pledges made in the run-up to the Paris meeting later this year will “not get us onto the 2°C pathway”.

There’s so much backpedaling and spinning going on that you get motion sick reading the article. While we certainly saw this coming, we didn’t expect the admissions to start this early.

There is actually one way in which the global temperature rise may stay beneath 2°C, at least for the next century or so, even under the U.N.’s mid-range greenhouse gas emissions scenarios—that is, if the earth’s climate sensitivity is a lot lower than that currently adopted by the U.N. and characteristic of their ensemble of climate models.

Most climate negotiators and climate activists are loath to admit this might be the case, as that would be the end of first-class travel to various hot spots to (yet again) euchre our monies.

But scientific evidence continues to mount, maybe even enough to send them to the back of the plane. Instead of the earth’s equilibrium climate sensitivity—how much the earth’s average surface temperature will ultimately rise given a doubling of the atmospheric concentration of carbon dioxide—being somewhere around 3°C (as the U.N. has determined), the latest scientific research is starting to center around a value of 2°C, with strong arguments for an even lower value (closer to 1.6°C).

If the earth’s response to greenhouse gas emissions is to warm at only one-half to two-thirds the rate negotiators currently assume, it means it is going to take longer (and require more emissions) to ultimately reach a temperature rise of 2.0°C. This buys the negotiators more time—something about which negotiators, conference organizers, and associated service industries should be ecstatic!
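
Here is a back-of-the-envelope sketch of that point, under the purely illustrative assumption that the future warming rate scales in proportion to the equilibrium climate sensitivity. The baseline warming rate and the warming already realized are hypothetical placeholders, not values taken from any study.

```python
# Illustration only: a lower climate sensitivity stretches out the time
# (and the emissions) needed before the 2.0°C threshold is crossed.
THRESHOLD_C = 2.0
WARMING_SO_FAR_C = 0.9              # hypothetical warming already realized
RATE_AT_3C_SENSITIVITY = 0.20       # hypothetical warming rate (°C/decade) if sensitivity is 3°C

for sensitivity_c in (3.0, 2.0, 1.6):   # sensitivity values discussed in the text
    rate = RATE_AT_3C_SENSITIVITY * (sensitivity_c / 3.0)
    decades_left = (THRESHOLD_C - WARMING_SO_FAR_C) / rate
    print(f"sensitivity {sensitivity_c:.1f}°C: roughly {decades_left * 10:.0f} more years to 2°C")
```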

As an example of how much time would be bought by a lower climate sensitivity, researchers Joeri Rogelj, Malte Meinshausen, Jan Sedlacek, and Reto Knutti—whose work has been telling us all along that 2°C was basically impossible—ran their models incorporating some recent estimates of low sensitivity in place of the IPCC’s preferred sensitivity assessment.

What they found is available via the open access literature and worth taking a look at. Figure 1 sums it up. Using a lower climate sensitivity (pink) reduces the end-of-the-century temperature rise (left) and increases the quantity of carbon dioxide emissions before reaching various temperature thresholds (right).

 

Figure 1. Temperature evolution over time (left) and in association with cumulative carbon dioxide emissions (right) from models run under different assumptions for the equilibrium climate sensitivity. The black and dark blue colors use the U.N. values; green represents a higher climate sensitivity;  and pink a lower one (from Rogelj et al., 2014). Of note is a massive gaffe—which is to implicitly attribute all warming since 1861 to greenhouse gases. The fact is that the sharp warming of the early 20th century occurred before significant emissions.

This information will surely be useful in Paris this December for those countries who seek less stringent emissions reduction timetables.

And finally, it was announced this week that the homework the U.N. handed down to each country at the end of last year’s climate meeting in Lima is due in draft form on February 13th: every nation is to publish a target and timetable for reducing carbon dioxide emissions, along with a plan for how that is to be achieved.

It ought to be interesting to see what grades everyone receives on their assignments.

No doubt the U.N. officials have already seen some that were handed in early, which is why they announced that they are going to be  grading on a curve. What should have been “F”s (for failure to meet the 2° target) will now surely be given “A”s (for effort). As everyone knows, grade inflation is a worldwide phenomenon.

Paul C. "Chip" Knappenberger and Patrick J. Michaels

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature or of a more technical nature. These items may not have received the media attention that they deserved or have been misinterpreted in the popular press.

Posted Wednesday in the Washington Post’s new online “Energy and Environment” section is a piece titled “No, Climate Models Aren’t Exaggerating Global Warming.” That’s a pretty “out there” headline considering all the evidence to the contrary.

We summed up much of the contrary evidence in a presentation at the annual meeting of the American Geophysical Union last December.  The take-home message—that climate models were on the verge of failure (basically the opposite of the Post headline)—is self-evident in Figure 1, adapted from our presentation.

Figure 1. Comparison of observed trends (colored circles according to legend) with the climate model trends (black circles) for periods from 10 to 64 years in length. All trends end with data from the year 2014 (adapted from Michaels and Knappenberger, 2014).

The figure shows (with colored circles) the value of the trend in observed global average surface temperatures in lengths ranging from 10 to 64 years and in all cases ending in 2014 (the so-called “warmest year on record”). Also included in the figure (black circles) is the average trend in surface temperatures produced by a collection of climate models for the same intervals. For example, for the period 1951–2014 (the leftmost points in the chart, representing a trend length of 64 years) the trend in the observations is 0.11°C per decade and the average model projected trend is 0.15°C per decade. During the most recent 10-year period (2005–2014, rightmost points in the chart), the observed trend is 0.01°C per decade while the model trend is 0.21°C per decade.
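
For readers who want to see the mechanics, here is a minimal sketch of how such trends are computed: an ordinary least-squares slope over each window ending in the final year, expressed in °C per decade. The temperature series below is synthetic, not the observed record or any model output.

```python
import numpy as np

# Synthetic annual temperature anomalies, 1951-2014, used only to show the mechanics.
rng = np.random.default_rng(0)
years = np.arange(1951, 2015)
temps = 0.01 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

def trend_per_decade(yrs, vals):
    """Ordinary least-squares slope, expressed in degrees C per decade."""
    slope_per_year = np.polyfit(yrs, vals, 1)[0]
    return 10.0 * slope_per_year

# Trends of various lengths, all ending with the final year of the series.
for length in (10, 30, 64):
    window = slice(len(years) - length, None)
    print(f"{length}-yr trend: {trend_per_decade(years[window], temps[window]):+.3f} °C/decade")
```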

Clearly, over the period during which human-caused greenhouse gases have risen the fastest (basically any period ending in 2014), climate models consistently predict that the earth’s surface temperature should have warmed much faster than it did.

Given our results (and plenty like them), we were left scratching our heads over the headline of the Post article. The article was reporting on the results of a paper that was published last week in the British journal Nature by researchers Jochem Marotzke and Piers Forster, and pretty much accepted uncritically what Marotzke and Forster concluded.

The “accepted uncritically” is critical to the article’s credibility.

Figure 2 shows the results that Marotzke and Forster got when comparing observed trends to model-predicted trends of lengths of 15 years for all periods beginning from 1900 (i.e., 1900–1914) to 1998 (1998–2012). Marotzke and Forster report that overall, the model trends only depart “randomly” from the observed trends—in other words, the model results aren’t biased.

But this claim doesn’t appear to hold water.

During the first half of the record, when greenhouse gas emissions were relatively small and had little effect on the climate, the differences between the modeled and observed temperatures seem pretty well distributed between positive and negative—a sign that natural variability was the driving force behind the differences. However, starting in about 1960, the model trends show few negative departures from the observations (i.e., they rarely predict less warming than was observed). This was partially due to the models’ mishandling of two large volcanic eruptions (Mt. Agung in 1963 and Mt. Pinatubo in 1991), but it is also quite possibly a result of the models producing too much warming in response to increasing greenhouse gas emissions. It seems that the models work better, over the short term (say 15 years), when they are not being forced by a changing composition of the atmosphere.

Figure 2. Comparison of observed trends (black) with the climate model average trend (red) for periods of 15 years in length during the period 1900–2012 (adapted from Marotzke and Forster, 2015).

But the models appear to do worse over long periods.

Figure 3 is also from the Marotzke and Forster paper. It shows the same thing as Figure 2, but this time for 62-year-long trends. In this case, the models show a clear and persistent inability to capture the observed warming that took place during the first half of the 20th century (the models predict less warming than was observed over all 62-year periods beginning from 1900 through 1930). Then, after closely matching the observed trend for a while, the models began to overpredict the warming beginning in about 1940 and do progressively worse up through the present. In fact, the worst model performance, in terms of predicting too much warming, occurs during the period 1951–2012 (the last period examined).

Figure 3. Comparison of observed trends (black) with the climate model average trend (red) for periods of 62 years in length during the period 1900–2012 (adapted from Marotzke and Forster, 2015).

This behavior indicates that over longer periods (say, 62 years), the models exhibit systematic errors and do not adequately explain the observed evolution of the earth’s surface temperature since the beginning of the 20th century.

At least that is how we see it.

But perhaps we are seeing it wrong.

Over at the website ClimateAudit.org, Nic Lewis (of low climate sensitivity fame) has taken a very detailed (and complicated) look at the statistical methodology used by Marotzke and Forster to arrive at their results. He does not speak in glowing terms of what he found:

“I was slightly taken aback by the paper, as I would have expected either one of the authors or a peer reviewer to have spotted the major flaws in its methodology.”

“Some statistical flaws are self evident. Marotzke’s analysis treats the 75 model runs as being independent, but they are not.”

“However, there is an even more fundamental problem with Marotzke’s methodology: its logic is circular.”

Lewis ultimately concluded:

“The paper is methodologically unsound and provides spurious results. No useful, valid inferences can be drawn from it. I believe that the authors should withdraw the paper.”

Not good.

So basically no matter how* you look at the Marotzke and Forster results—taking the results at face value or throwing them out altogether—their conclusions are not well supported. And certainly, they are no savior for poorly performing climate models.

 

*No matter how, that is, except if you are looking to try to make it appear that the growing difference between climate model projections and real-world temperature change poses no threat to aggressive measures attempting to mitigate climate change.

References:

Marotzke, J., and P. Forster, 2015. “Forcing, Feedback and Internal Variability in Global Temperature Trends.” Nature, 517, 565–570, doi:10.1038/nature14117.

Michaels, P.J., and P.C. Knappenberger, 2014. “Quantifying the Lack of Consistency Between Climate Model Projections and Observations of the Evolution of the Earth’s Average Surface Temperature since the Mid-20th Century.” American Geophysical Union Fall Meeting, San Francisco, CA, Dec. 15–19, Paper A41A-3008.

Alex Nowrasteh

The latest issue of The Economist has a good article about allowing American states to set their own migration policies.

Last spring, Cato published a policy analysis on this very topic by Brandon Fuller and Sean Rust, entitled “State-Based Visas: A Federalist Approach to Reforming U.S. Immigration Policy.” Cato’s policy analysis explores the legalities, economics, and practical hurdles of implementing a state-based visa system in addition to the existing federal system. Cato even had an event in March 2014 (video available) where critic Reihan Salam and supporter Shikha Dalmia explored the idea.

The Economist article lays out the case well. Canada and Australia have state- and provincial-based visa systems that complement their federal immigration policies. The results have been positive for those local jurisdictions because they have more information and incentive to produce a better visa policy than a distant federal government does. American states could similarly experiment with less restrictive migration policies, attracting workers of any or all skill types.

The economic impact of immigration is positive, so the downsides of decentralized immigration policy would be small. Most importantly, The Economist echoes a point that Fuller and Rust made in their policy analysis: these migrant workers should eventually be able to move around the country for work. An unrestricted internal labor market is positive for the American economy; a freer international labor market would be too.

Please read The Economist piece, Cato’s policy analysis, and watch Cato’s event on this topic.

David Boaz

At TIME I write about the rise of libertarianism, Rand Paul, and my forthcoming book (Tuesday!) The Libertarian Mind:

Tens of millions of Americans are fiscally conservative, socially tolerant, and skeptical of American military intervention….

Whether or not Rand Paul wins the presidency, one result of his campaign will be to help those tens of millions of libertarian-leaning Americans to discover that their political attitudes have a name, which will make for a stronger and more influential political faction.

In my book The Libertarian Mind I argue that the simple, timeless principles of the American Revolution—individual liberty, limited government, and free markets—are even more important in this world of instant communication, global markets, and unprecedented access to information than Jefferson or Madison could have imagined. Libertarianism is the framework for a future of freedom, growth, and progress, and it may be on the verge of a political breakout.

Read the whole thing. Buy the book.

Julian Sanchez

Proponents of network neutrality regulation are cheering the announcement this week that the Federal Communications Commission will seek to reclassify Internet Service Providers as “common carriers” under Title II of the Communications Act. The move would trigger broad regulatory powers over Internet providers—some of which, such as authority to impose price controls, the FCC has said it will “forbear” from asserting—in the name of “preserving the open internet.”

Two initial thoughts:

First, the scope of the move reminds us that “net neutrality” has always been somewhat nebulously defined and therefore open to mission creep. To the extent there was any consensus definition, net neutrality was originally understood as being fundamentally about how ISPs like Comcast or Verizon treat data packets being sent to users, and whether the companies deliberately configured their routers to speed up or slow down certain traffic. Other factors that might affect the speed or quality of service—such as peering and interconnection agreements between ISPs and large content providers or backbone intermediaries—were understood to be a separate issue. In other words, net neutrality was satisfied so long as Comcast was treating packets equally once they’d reached Comcast’s network. Disputes over who should bear the cost of upgrading the connections between networks—though obviously relevant to the broader question of how quickly end-users could reach different services—were another matter.

Now the FCC will also concern itself with these contracts between corporations, giving content providers a fairly large cudgel to brandish against ISPs if they’re not happy with the peering terms on offer. In practice, even a “treat all packets equally” rule was going to be more complicated than it sounds on its face, because the FCC would still have to distinguish between permitted “reasonable network management practices” and impermissible “packet discrimination.” But that’s simplicity itself next to the problem of determining, on a case-by-case basis, when the terms of a complex interconnection contract between two large corporations are “unfair” or “unreasonable.”

Second, it remains pretty incredible to me that we’re moving toward a broad preemptive regulatory intervention before we’ve even seen what deviations from neutrality look like in practice. Nobody, myself included, wants to see the “nightmare scenario” where ISPs attempt to turn the Internet into a “walled garden” whose users can only access the sites of their ISP’s corporate partners at usable speeds, or where ISPs act to throttle businesses that might interfere with their revenue streams from (say) cable television or voice services. There are certainly hypothetical scenarios that could play out where I’d agree intervention was justified—though I’d also expect targeted interventions by agencies like the Federal Trade Commission to be the most sensible first resort in those cases.

Instead, the FCC is preparing to impose a blanket regulatory structure—including open-ended authority to police unspecified “future conduct” of which it disapproves—in the absence of any sense of what deviations from neutrality might look like in practice. Are there models that might allow broadband to be cheaper or more fairly priced for users—where, let’s say, you buy a medium-speed package for most traffic, but Netflix pays to have high-definition movies streamed to their subscribers at a higher speed? I don’t know, but it would be interesting to find out. Instead, users who want any of their traffic delivered at the highest speed will have to continue paying for all their traffic to be delivered at that speed, whether they need it or not. The extreme version of this is the controversy over “zero-rating” in the developing world, where the Orthodox Neutralite position is that it’s better for those who can’t afford mobile Internet access to go without rather than let companies like Facebook and Wikipedia provide poor people with subsidized free access to their sites. 

The deep irony here is that “permissionless innovation” has been one of the clarion calls of proponents of neutrality regulation. The idea is that companies at the “edge” of the network introducing new services should be able to launch them without having to negotiate with every ISP in order to get their traffic carried at an acceptable speed. Users like that principle too; it’s why services like CompuServe and AOL ultimately had to abandon a “walled garden” model that gave customers access only to a select set of curated services.

But there’s another kind of permissionless innovation that the FCC’s decision is designed to preclude: innovation in business models and routing policies. As Neutralites love to point out, the neutral or “end-to-end” model has served the Internet pretty well over the past two decades. But is the model that worked for moving static, text-heavy webpages over phone lines also the optimal model for streaming video wirelessly to mobile devices? Are we sure it’s the best possible model, not just now but for all time? Are there different ways of routing traffic, or of dividing up the cost of moving packets from content providers, that might lower costs or improve quality of service? Again, I’m not certain—but I am certain we’re unlikely to find out if providers don’t get to run the experiment. It seems to me that the only reason not to want to find out is the fear that some consumers will like the results of at least some of these experiments, making it politically more difficult to entrench the sacred principle of neutrality in law. After all, you’d think that if provider deviations from neutrality in the future prove uniformly and manifestly bad for consumers or for innovation, it will only be easier to make the case for regulation.

As I argued a few years back, common carrier regimes might make sense when you’re fairly certain there’s more inertia in your infrastructure than in your regulatory structure. Networks of highways and water pipes change slowly, and it’s a good bet that a sound rule today will be a sound rule in a few years. The costs imposed by lag in the regulatory regime aren’t outrageously high, because even if someone came up with a better or cheaper way to get water to people’s homes, reengineering physical networks of pipes is going to be a pretty slow process. But wireless broadband is not a network of pipes, or even a series of tubes. Unless we’re absolutely certain we already know the best way to price and route data packets—both through fiber and over the air—there is something perverse about a regulatory approach that precludes experimentation in the name of “innovation.”

Alan Reynolds

The U.S. job market has tightened by many measures – more advertised job openings, fewer claims for initial unemployment insurance, and substantial reductions in long-term unemployment and in the number of discouraged workers. Yet the percentage of the working-age population that is either working or looking for work (the labor force participation rate) remains extremely low. This is a big problem, since projections of future economic growth are constructed by adding expected growth of productivity to growth of the labor force.

Why have so many people dropped out of the labor force?  Since they’re not working (at least in the formal economy), how do they pay for things like food, rent and health care?

One explanation answers both questions: More people are relying on a variety of means-tested cash and in-kind benefits that are made available only on the condition that recipients report little or no earned income.   Since qualification for one benefit often results in qualification for others, the effect can be equivalent to a high marginal tax rate on extra work (such as switching from a 20 to 40 hour workweek, or a spouse taking a job).  Added labor income can often result in loss of multiple benefits, such as disability benefits, supplemental security income, the earned income tax credit, food stamps and Medicaid. 
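
A stylized example of that arithmetic, with entirely hypothetical dollar amounts and benefit reductions rather than any actual program schedule:

```python
# Hypothetical example of the implicit marginal tax rate created when extra
# earnings trigger both explicit taxes and the loss of means-tested benefits.
extra_earnings = 10_000.0               # e.g., moving from a 20- to a 40-hour workweek

explicit_taxes = 0.20 * extra_earnings  # assumed combined payroll/income tax rate
benefits_lost = 3_500.0                 # assumed reduction in food stamps, EITC, etc.

effective_marginal_rate = (explicit_taxes + benefits_lost) / extra_earnings
print(f"Implicit marginal tax rate on the extra work: {effective_marginal_rate:.0%}")  # 55%
```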

This graph compares annual labor force participation rates with Congressional Budget Office data on means-tested federal benefits as a percent of GDP.  The data appear consistent with work disincentives in federal transfer payments, labor tax rates and refundable tax credits.

Dalibor Rohac

This weekend, after months of animated and often vicious campaigning, Slovaks will vote in a referendum on same-sex marriage, adoptions, and sex education. Interestingly, the referendum was not initiated by the proponents of gay rights, who are not particularly numerous or well organized, but rather by the social-conservative group Alliance for Family. The goal is to preempt moves towards the legalization of same-sex unions and of child adoptions by gay couples by banning them before they become a salient issue. Overturning the results of a binding referendum would then require a parliamentary supermajority and would come only at a sizeable political cost.

Yet in spite of all the heated rhetoric, it seems unlikely that the threshold for the referendum’s validity will be met. Also, as I wrote in the International New York Times some time ago, Slovakia is slowly becoming a more open, tolerant place – something that the referendum will hopefully not undo. However,

[i]n the meantime, the mean-spirited campaigning and frequent disparaging remarks about gays and their “condition” are a poor substitute for serious policy discussions and are making the country a much less pleasant place, and not just for its gay population.

Another disconcerting aspect of the referendum is its geopolitical dimension. For some of the campaigners a rejection of gay rights goes hand in hand with a rejection of what they see as the morally decadent West:

Former Prime Minister Jan Carnogursky, a former Catholic dissident and an outspoken supporter of the referendum, noted recently that “in Russia, one would not even have to campaign for this — over there, the protection of traditional Christian values is an integral part of government policy” and warned against the “gender ideology” exported from the United States.

We will see very soon whether the ongoing culture war was just a blip in Central Europe’s history or whether it will leave a bitter aftertaste for years to come. Here is my essay on the referendum, written for V4 Revue. I also wrote about the referendum in Slovak, for the weekly Tyzden (paywalled), and discuss it in a video with Pavol Demes (in Slovak).

Paul C. "Chip" Knappenberger and Patrick J. Michaels

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A pre-print of a soon-to-be-published paper in the Journal of Allergy and Clinical Immunology describes a study conducted by researchers at the Johns Hopkins Children’s Center making this provocative finding:

Taking the United States as a whole, living in an urban neighborhood is not associated with increased asthma prevalence.

If it isn’t immediately obvious what this means, Dr. Joseph Perrone, chief science officer at the Center for Accountability in Science, spells it out in his article in The Hill:

It’s a radical finding. The study upends more than half a century of research that assumed outdoor air pollution in cities was to blame for higher asthma rates—a hypothesis repeatedly used by EPA regulators to justify the agency’s regulations.

Perrone goes on to explain:

For years, environmentalists and regulators have cited childhood asthma as an excuse for ever-stricter pollution rules. The U.S. Environmental Protection Agency (EPA), for instance, uses asthma as a pretext for nearly every “clean air” regulation issued since the 1970s.

But what if the assumed link between air pollution and childhood asthma doesn’t actually exist?

New research questions the long-held wisdom on asthma and air pollution, casting doubt over the scientific basis for EPA’s expansive regulatory agenda….

The study still points to air pollution as a cause for asthma, only it’s indoor air pollution—think second hand smoke, rodents, mold, etc.—that may be the main culprit.

This counters EPA’s asthma pretext for “clean air” regulations, as well as its regulations on climate change.

The latter was revealed in a 2009 memo recently obtained via FOIA by Chris Horner of the Competitive Enterprise Institute. In the memo, the EPA talks climate change strategy and discusses the perceived effectiveness of linking health impacts (e.g., “respiratory illness”) to climate change [emphasis added]:

Polar ice caps and the polar bears have become the climate change “mascots,” if you will, and personify the challenges we have in making this issue real for many Americans. Most Americans will never see a polar ice cap, nor will most have the chance to see a polar bear in its natural habitat. Therefore, it is easy to detach from the seriousness of the issue. Unfortunately, climate change in the abstract is an increasingly—and consistently—unpersuasive argument to make. However, if we shift from making this about the polar caps and [sic] about our neighbor with respiratory illness we can potentially bring this issue home to many Americans. As we do so, we must allow the human health argument to take center stage.

This strategy was clearly adopted.

For example, in announcing EPA’s proposed restrictions on carbon dioxide emissions from existing power plants, Administrator McCarthy’s comments were littered with health references, including her opening remarks:

About a month ago, I took a trip to the Cleveland Clinic. I met a lot of great people, but one stood out—even if he needed to stand on a chair to do it. Parker Frey is 10 years old. He’s struggled with severe asthma all his life. His mom said despite his challenges, Parker’s a tough, active kid—and a stellar hockey player.

But sometimes, she says, the air is too dangerous for him to play outside. In the United States of America, no parent should ever have that worry.

That’s why EPA exists. Our job, directed by our laws, reaffirmed by our courts, is to protect public health and the environment. Climate change, fueled by carbon pollution, supercharges risks not just to our health, but to our communities, our economy, and our way of life. That’s why EPA is delivering on a vital piece of President Obama’s Climate Action Plan.

Later she added, “For the sake of our families’ health and our kids’ future, we have a moral obligation to act on climate.” She went on to explain, “This is not just about disappearing polar bears or melting ice caps. This is about protecting our health and our homes.” And seemingly for good measure: “As a bonus, in 2030 we’ll cut pollution that causes smog and soot 25 percent more than if we didn’t have this plan in place. The first year that these standards go into effect, we’ll avoid up to 100,000 asthma attacks and 2,100 heart attacks—and those numbers go up from there.”

In light of the new Johns Hopkins study, McCarthy’s remarks are questionable, as the scientific support for these hyperbolic statements is fast receding.

Joseph Perrone hits the nail on the head with his conclusion:

This is science the EPA cannot ignore. If the agency is truly interested in “following the science,” it should spend more time addressing real public health threats than imposing costly rules based on dubious science that may only make us poorer and sicker.

We don’t find the Hopkins study at all surprising.  Smoky, confined indoor environments around the world are associated with major health issues.  Why should it be any different here?

Reference:

Keet, C.A., et al., 2015. “Neighborhood Poverty, Urban Residence, Race/Ethnicity, and Asthma: Rethinking the Inner-City Asthma Epidemic.” Journal of Allergy and Clinical Immunology, in press.

Steve H. Hanke

Yesterday, China’s Central Bank reduced bank reserve requirements for large banks by 50 basis points to 19.5%. The Chinese know that the nominal level of national income is determined by the magnitude of the money supply. They also know that banks produce the lion’s share of China’s money. Indeed, banks produce 77% of China’s M2 money.
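
A simple textbook illustration of why a lower reserve requirement supports more money creation, using the naive deposit multiplier (one divided by the reserve ratio) and ignoring excess reserves, currency drain, and other real-world frictions:

```python
# Textbook deposit multiplier: each unit of reserves can support 1/rr units of deposits.
old_rr = 0.200   # reserve requirement before the cut
new_rr = 0.195   # after the 50-basis-point reduction

old_multiplier = 1.0 / old_rr    # 5.00
new_multiplier = 1.0 / new_rr    # about 5.13

pct_gain = new_multiplier / old_multiplier - 1.0
print(f"Deposit multiplier: {old_multiplier:.2f} -> {new_multiplier:.2f} "
      f"(about {pct_gain:.1%} more deposit-creating capacity per unit of reserves)")
```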

As shown in the accompanying chart, the average annual growth rate of China’s money supply since January 2004 has been 17.45%. At present, the annual growth rate for the money supply has slumped to 11%. China’s reduction in the banks’ reserve requirements is designed to push money growth back up towards the trend rate so that an economic slump is avoided. China has made the right move.

Daniel J. Ikenson

President Obama is presiding over what may prove to be the most significant round of trade liberalization in American history, yet he has never once made an affirmative case for that outcome. Despite various reports of intensifying outreach to members of Congress, the president’s “advocacy” is couched in enough skepticism to create and reinforce fears about trade and globalization.

Politico reports:

On Tuesday, Obama sent a letter directly to Rep. Ruben Gallego (D-Ariz.), arguing that reaching new trade agreements is the only way to stop China from dominating the global markets and letting its lax standards run the world.

“If they succeed, our competitors would be free to ignore basic environmental and labor standards, giving them an unfair advantage against American workers,” Obama wrote Gallego in a letter obtained by POLITICO. “We can’t let that happen. We should write the rules, and level the playing field for the middle class.”

Certainly, playing the China card could help win support for Trade Promotion Authority and, eventually, the Trans-Pacific Partnership, but it needn’t be the first selling point. Pitching trade agreements as though they were inoculations against an otherwise imminent disease betrays a profound lack of understanding of the benefits of trade. With TPP near completion and the Transatlantic Trade and Investment Partnership talks expected to accelerate, the president’s stubborn refusal to make an affirmative case for his trade initiatives to the public and to the skeptics in his party is disconcerting. Bill Watson was troubled by the president’s feeble advocacy of trade liberalization in his SOTU address.

Does Obama really want a legacy as the president who increased Americans’ economic liberties and opportunities when the best case he can muster for his agenda is that if we don’t adopt it we’ll get crushed? I have questioned whether he supports his own trade agenda, considering – among other things – his commitment to arresting climate change and to reducing income inequality, both of which he believes are exacerbated by increased trade.

Never has the president described how the TPP will better integrate U.S. producers, consumers, workers, investors and taxpayers with customers, suppliers, supply-chain collaborators, and investors in Asia and the Americas. Never has he explained that by eliminating tariffs and other monopolistic impediments to trade and investment, the TPP will help increase the scope for economies of scale and specialization, which will help reduce production costs, freeing resources for lower prices, investment, and research and development. Never has he taken the time to point out that competition inspires innovation, which especially benefits companies operating in the United States, which are advantaged with privileged access to research universities and broad and deep capital markets to commercialize innovation. Never has he mentioned that by opening the door to more competition to bid on public procurement projects, the TPP will help ensure higher quality infrastructure, on-time completion, and better use of taxpayer dollars. Never has he touted the advantages to the U.S. economy of tighter integration with the world’s fastest growing region. None of these positive, promising, pioneering aspects of the TPP has been given an ounce of public attention from the president. 

Some Washington insiders will be sure to contact Bill or me to say it doesn’t matter how Obama portrays trade, as long as he gets enough votes. Well, sure, I understand the transactional nature of politics. But if you don’t try to convince anyone of the merits of trade, if you let the fallacies concocted by the monopolies that benefit from restricting trade lie unrebutted, to fester and metastasize, you legitimize those fears and guarantee a continuation of misinformation and discord where there should be much less.

Nicole Kaeding

One of the largest and fastest growing items in President Obama’s new budget is often overlooked. Net interest expenses will skyrocket over the next decade, growing by 250 percent.

The Congressional Budget Office (CBO) continues to warn about the rising burden of federal interest payments. CBO expects net interest to be the third-fastest-growing budget item over the next decade, accounting for 24 percent of the increase in federal spending during that period.

Interest expense will increase due to two factors: higher interest rates and larger outstanding debt.

First, federal borrowing rates are currently well below average. In 2014 the 10-year Treasury rate averaged 2.5 percent according to CBO. Since 1990 the 10-year Treasury has averaged approximately 5 percent. Lower-than-normal interest rates are currently keeping the government’s borrowing expenses low, but interest rates are expected to return to historic averages.

Second, the government has increased the federal debt quickly. Debt held by the public has grown by 120 percent since 2008, and that growth is expected to continue.

As a result of these two factors, CBO predicts that net interest will grow from 1.3 percent of gross domestic product in 2015 to 3 percent in 2025.

The president’s budget predicts a similar rise and shows just how large interest expense will become over the next decade. By 2025, according to the president’s own plan, the nation will spend more on interest than on either defense or nondefense discretionary spending. Only Social Security and Medicare will cost more than net interest.

The chart below shows the dramatic increase in net interest compared to defense and nondefense discretionary spending, as projected in the president’s budget.

Even with that sharp rise, the president’s budget low-balls interest costs a bit compared to CBO. His budget assumes that the 10-year Treasury rate slowly climbs from 4.0 percent in 2019 to 4.5 percent by 2025. CBO, on the other hand, assumes that the rate reaches 4.5 percent in 2019 and 4.6 percent in 2025. The president assumes a lower interest rate in every year of his budget request. As a result, the president expects net interest to cost $785 billion in 2025, compared with $827 billion for CBO.

The president’s debt and deficit assumptions, which are somewhat rosier than CBO’s, also play a part in this comparison. As I showed on Monday, he predicts that debt held by the public will be significantly lower in 2025 than CBO does: $20.307 trillion vs. $21.605 trillion.
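
As a rough cross-check of those figures, the sketch below divides each projection’s 2025 net interest by its projected debt held by the public to get an implied average rate. This is a simplification (actual federal interest accrues across many maturities and instruments), not an official measure from either the budget or CBO:

    # Implied average effective interest rate in 2025 under each projection,
    # computed as net interest divided by debt held by the public.
    # Dollar figures are in billions, taken from the text above.
    projections_2025 = {
        "President's budget": {"net_interest": 785, "debt": 20_307},
        "CBO":                {"net_interest": 827, "debt": 21_605},
    }

    for name, p in projections_2025.items():
        implied_rate = p["net_interest"] / p["debt"]
        print(f"{name}: implied average rate of roughly {implied_rate:.2%}")
    # Both work out to about 3.8-3.9 percent -- below the assumed 10-year
    # Treasury rates, since much of the debt is financed at shorter maturities.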

Even under the president’s optimistic assumptions, net interest costs will grow quickly over the next decade, illustrating the importance of controlling the growth in federal spending. Our current path means that future generations could end up paying a high price for our current, excess spending.

Daniel R. Pearson

In a previous blog post I discussed the implications of the proposed agreement to settle the antidumping and countervailing duty (AD/CVD) cases brought by U.S. sugar producers against imports from Mexico.  That article amounted to a lament on the difficulties of trying to balance sugar supply and demand by government fiat.  Market managers employed by the U.S. Department of Agriculture (USDA) and the Department of Commerce (DOC) have a really hard job, as do their counterparts in the Mexican government.  Not only do the supply, demand, and price of sugar tend not to stay quiet and well behaved, but important firms involved in the business also can prove (from the perspective of the program managers) to be vexing and disputatious.

Such is the case with Imperial Sugar Company and AmCane Sugar, both of which are U.S. cane refiners that rely on ample supplies of raw sugar to run their operations.  Much of that raw sugar comes from other countries; in recent years Mexico has been the largest supplier to the United States.  It now appears that U.S. cane refiners were not too happy with either the original proposed settlement that was announced on October 27, 2014, or the final suspension agreements announced December 19 that set aside the underlying AD/CVD investigations. 

One source of that unhappiness seems to have been that the initial proposal would have allowed 60 percent of imports from Mexico to be in the form of refined sugar rather than raw.  The U.S. and Mexican governments acknowledged that concern in the December 19 agreement by reducing the allowable level of refined sugar imports to 53 percent.  Another issue bothering U.S. refiners likely was the relatively narrow spread between the original proposal’s import reference prices, which were 20.75 cents per pound for raw sugar and 23.75 cents per pound for refined.  U.S. refiners may have feared suppression of their processing margins, if imported refined sugar from Mexico could have been sold at only 3 cents per pound above the price of raw sugar imports.  The December 19 version increased that price spread to 3.75 cents (22.25 cents for raw and 26.0 cents for refined).  From the standpoint of the refiners, that margin still may be uncomfortably narrow.
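
For reference, the refining spread in each version is simply the refined reference price minus the raw reference price. A quick sketch of that arithmetic, using the prices cited above (in cents per pound):

    # Gross spread between the refined and raw sugar reference prices under
    # the two versions of the suspension agreements (cents per pound).
    agreements = {
        "Original proposal (Oct. 27, 2014)": {"raw": 20.75, "refined": 23.75},
        "Final agreements (Dec. 19, 2014)":  {"raw": 22.25, "refined": 26.00},
    }

    for name, prices in agreements.items():
        spread = prices["refined"] - prices["raw"]
        print(f"{name}: spread of {spread:.2f} cents per pound")
    # 3.00 cents originally versus 3.75 cents in the final version -- a margin
    # the refiners apparently still consider uncomfortably narrow.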

Given those adjustments in the terms of the suspension agreements, many observers were surprised when on January 8, 2015, Imperial and AmCane took the unprecedented step of filing a challenge to the pact.  They petitioned the U.S. International Trade Commission (ITC) to determine whether the suspension agreements actually “eliminated completely” the “injurious effect of imports” on the domestic industry.  This is the first time that provision of law has been exercised since it was added to the statute in 1979.  If the ITC determines that injury has been fully ameliorated, the suspension agreements will remain in effect.  On the other hand, if the ITC determines that injury was not completely eliminated, the suspension agreements would be scrapped and both the ITC and DOC would resume work on the underlying AD/CVD investigations.

The statute grants the ITC only 75 days from the filing date to make this determination, which means the process needs to be completed by March 24.  Having never before done this type of investigation, the Commission issued a notice seeking input as to how it should evaluate whether the injury has been completely eliminated by the suspension agreements.  The ITC will hold a public meeting on February 19 “to receive oral presentations from parties to the reviews.”

Without delving into the wide variety of arguments that could be presented to the Commission, it seems reasonable to assume that Imperial and AmCane believe they will be able to provide convincing evidence – likely through use of non-public “business proprietary information” (BPI) – that the suspension agreements do not entirely eliminate their injury.  It may be challenging for supporters of the agreements to prove otherwise. 

Notwithstanding the unusual petition to the ITC, Imperial and AmCane also filed requests on January 16 with the DOC asking that the suspension agreements be terminated and the AD/CVD investigations be continued.  Sugar producers in both the United States and Mexico are not enamored with the thought that the suspension agreements might be overturned, so they are challenging the legal standing of the refiners to make such requests.  DOC is seeking comments on that issue; it all appears to be quite contentious.  (There is no similar question regarding standing with respect to the refiners’ petition to the ITC.)

It’s fair to say that the overall situation is rather fluid right now.   If the refiners are found to have standing in the DOC proceeding, the AD/CVD investigations will move forward toward an eventual decision by the ITC as to whether the duties determined by DOC should be imposed.  Even if it is decided that the refiners don’t have standing at the DOC, the ITC investigation as to whether the suspension agreements entirely eliminate the domestic industry’s injury will proceed. 

It is interesting to note that both Imperial and AmCane have made clear that they are quite willing to agree to a market-management scheme that better suits their interests.  Talks with all parties to the suspension agreements may yet produce new versions that achieve consensus.  However, such a negotiation must overcome a significant hurdle.  This is basically a zero-sum game in which some other player would have to earn less money from the pact in order for the refiners to earn more.  These discussions – if they occur – could be more than just a bit fractious.

So why are the U.S. refiners apparently willing to upset the whole applecart?  Obviously they must believe they would be better off with any of three possible outcomes: 

  • An eventual negative decision by the ITC on the merits of the AD/CVD cases would mean that no duties would be applied to imports from Mexico, so refiners would have the same access to raw sugar supplies as they’ve had since the sugar provisions of NAFTA were fully implemented in 2008. 
  • An eventual affirmative ITC decision on the AD/CVD cases would mean the imposition of antidumping duties on imports of sugar from Mexico in the neighborhood of 40 percent, plus anti-subsidy duties of up to 17 percent.  Those duties likely are high enough to prevent any imports from Mexico.  How would managers of the U.S. sugar program respond to the loss of more than a million tons of Mexican sugar from the U.S. market?  Refiners may have concluded that government officials would have no choice other than to increase the tariff-rate quota (TRQ) amounts for the 40 TRQ-holding countries in order to keep the U.S. market adequately supplied.  All of that additional sugar would be in raw form, so all of it would require refining in the United States. 
  • Since U.S. refiners have demonstrated that they are willing to gum up the works of the government-regulated sugar market if their interests aren’t sufficiently taken into account, perhaps there will be a renegotiation of the suspension agreements that will treat them more favorably. 

Ahhh, the challenges of managing the U.S. and Mexican sugar markets just seem to become greater and greater.  One wonders how the United States of America has gotten itself into this dirigiste situation.  Perhaps we will live long enough to see U.S. sugar policy reformed by ending all import restrictions and domestic support measures.  (More on this topic will be available in an upcoming paper.)  If the marketplace were made open and competitive, there is little doubt that sugar still would be produced in the United States, that some of it still would be imported from other countries, and that consumers would buy some combination of the two.  If supply and demand were allowed to guide sugar production, marketing, and consumption, resource allocation and economic efficiency would improve a great deal.  Deadweight losses to the economy would be reduced.  And the former government managers of the sugar program would likely find more satisfying work and suffer fewer headaches.
