Cato Op-Eds

Individual Liberty, Free Markets, and Peace

Pruitt v. Burwell: A Victory for the Rule of Law

Tue, 09/30/2014 - 16:55

Michael F. Cannon

From Darwin’s Fool:

The U.S. District Court for the Eastern District of Oklahoma handed the Obama administration another, and much harsher, defeat in one of four lawsuits challenging the IRS’s attempt to implement ObamaCare’s major taxing and spending provisions where the law does not authorize them. The Patient Protection and Affordable Care Act provides that its subsidies for private health insurance, its employer mandate, and to a large extent its individual mandate only take effect within a state if the state establishes a health insurance “Exchange.” Two-thirds (36) of the states declined to establish Exchanges, which should have freed more than 50 million Americans from those taxes. Instead, the Obama administration decided to implement those taxes and expenditures in those 36 states anyway. Today’s ruling was in Pruitt v. Burwell, a case brought by Oklahoma attorney general Scott Pruitt.

These cases saw two appellate-court rulings on the same day, July 22. In Halbig v. Burwell, a three-judge panel of the U.S. Court of Appeals for the D.C. Circuit ordered the administration to stop. (The full D.C. Circuit has agreed to review the case en banc on December 17, a move that automatically vacates the panel ruling.) In King v. Burwell, the Fourth Circuit implausibly gave the IRS the thumbs-up. (The plaintiffs have appealed that ruling to the Supreme Court.) A fourth case, Indiana v. IRS, brought by Indiana attorney general Greg Zoeller, goes to oral arguments in federal district court on October 9.

Today, federal judge Ronald A. White issued a ruling in Pruitt that sided with Halbig against King, and eviscerated the arguments made by the (more senior) judges who sided with the government in those cases…

Read the rest.

Categories: Policy Institutes

The EEOC Loses (And Loses. And Loses.) In Federal Court Again

Tue, 09/30/2014 - 15:58

Walter Olson

We keep reporting in this space about how federal courts have slapped down the long-shot lawsuits and activist legal positions advanced by the Obama administration’s Equal Employment Opportunity Commission (EEOC). In the Freeman case last year, a Maryland federal judge used such unflattering terms as “laughable,” “unreliable,” and “mind-boggling” to refer to Commission positions, while in the more recent Kaplan case, in which Cato filed a brief, a Sixth Circuit panel was only slightly more polite about the systematic shortcomings in the commission’s case. Both of those cases arose from the commission’s controversial crusade against the use of criminal and credit background checks in hiring.

Since our report in April, the commission has extended its epic, cellar-dwelling record of federal court losses with at least three more defeats. 

* Yesterday a Second Circuit panel ruled on a case in which the EEOC had claimed that female and male lawyers at the Port Authority of New York and New Jersey had not received equal pay for substantially equal work. The problems with the commission’s case were many, including a seemingly “random” choice of comparison employees that tried to dodge the significance of substantial differences between them based on how long they had been practicing law and had been at the Port Authority. Above all, the EEOC chose to rest its case on the notion that “an attorney is an attorney is an attorney,” which meant it was entitled simply to assume that Port Authority lawyers “in practice areas ranging from Contracts to Maritime and Aviation, and from Labor Relations to Workers’ Compensation” were all doing the “same” work meriting the same pay. The panel of appeals judges barely concealed its impatience with this unrealistic assumption (judges are, if nothing else, experienced lawyers themselves) and upheld the dismissal.

* The Obama EEOC had stirred wide alarm in the business community when it decided to take the position that many clauses used in garden-variety severance agreements, in which the departing employee agrees not to sue or disparage the employer, are in reality unlawful “retaliation” against protected activity. But in the first tryout of that position, the commission fell flat on its face, as InHouse Cafe relates:

On September 18, 2014, Judge John Darrah of the U.S. District Court for the Northern District of Illinois dismissed the EEOC’s lawsuit against CVS challenging CVS’ standard separation agreement. The judge will issue a written opinion explaining his decision at a later date.

* In late June, a federal judge in North Carolina granted summary judgment against the agency in a disabled-rights case brought by a complainant who, the judge found, “cannot perform the essential functions of the job with or without a reasonable accommodation.” The case was notable in at least two ways. First, it had been brought against Womble Carlyle, one of the bigger law firms in the South, which seemed to betoken the commission’s hubris: not only will we bring weak cases, it seemed to be saying, but we’ll even bring ‘em against the sorts of law firms that have the means and will to fight such things. The second notable feature was that the court had earlier ordered the EEOC to pay $22,900 in sanctions to the law firm, over an episode in which the plaintiff admitted that after the commission had begun representing her, she had shredded and discarded job-search records relevant to Womble Carlyle’s case. Lawyers call that spoliation of evidence. 

Somewhere, I’m sure, someone is thinking the commission’s real problem must be with holdover Republican-appointed judges unwilling to cut civil rights complainants a break. So it’s worth noting that in the Port Authority case, two of the three judges on the appellate panel are Obama appointees, while the district judge who dismissed the case was an appointee of Bill Clinton. The district judges in Illinois and North Carolina who dismissed the commission’s claims are respectively Clinton and Obama appointees. The problems with overreaching, extreme, and just plain sloppy litigating at the EEOC go beyond differences between liberal and conservative judges. 


Peace, Love, & Liberty: A Brilliant New Book

Tue, 09/30/2014 - 15:13

David Boaz

Hundreds of thousands of protesters are marching in Hong Kong under the banner of “Occupy Central for Love and Peace.” Have I got a book for them!

Cato Senior Fellow Tom G. Palmer has just edited Peace, Love, & Liberty, a collection of writings on peace. This is the fifth book edited by Palmer and published in collaboration with the Atlas Network, where he is executive vice president for international programs, and Students for Liberty, which plans to distribute some 300,000 copies on college campuses.

But don’t write this book off as a student handout. There’s really impressive material in here. Palmer wrote three long original essays: “Peace Is a Choice,” “The Political Economy of Empire and War,” and “The Philosophy of Peace or the Philosophy of Conflict.” These are important and substantial articles. 

But his aren’t the only impressive articles. The book also includes:

  • Steven Pinker on why we’ve seen a decline in war
  • Eric Gartzke on how free trade leads to peace
  • Rob McDonald on early Americans’ wariness of war
  • Justin Logan on the declining usefulness of war
  • Radley Balko on the militarization of police
  • Emmanuel Martin on how we all benefit if other countries prosper
  • Chris Rufer on a businessman’s view of peace
  • Sarah Skwire on war in literature
  • Cathy Reisenwitz on what individuals can do to advance peace

Plus classic pieces of literature including Mark Twain’s “War Prayer” and Wilfred Owen’s “Dulce et Decorum Est.”

And all this for only $9.95 at Amazon! Or even less from Amazon’s affiliates. If you want to buy them in bulk – and really, you should, especially for your peace-loving friends who aren’t yet libertarians – contact Students for Liberty.


Government Crowding Out, USPS Style

Tue, 09/30/2014 - 14:44

Chris Edwards

This is a really bad policy idea: the U.S. Postal Service wants to get into the grocery delivery business. Economists will sometimes support government interventions in industries where there are serious market failures. But with grocery delivery, private businesses are already performing the service, and no market failure is evident.

The USPS grocery idea is a desperate attempt to save the agency’s hide, rather than to solve any problems in the marketplace. The Washington Post frames it correctly: “After nearly six years of multibillion-dollar losses, the U.S. Postal Service has developed a new plan to help turn its finances around: Daily grocery deliveries.”

The problem is that government expansion into an activity squeezes out private providers and deters entrepreneurs from getting in. As the government expands, the private sector shrinks. Such “crowding out” occurs in many areas. An op-ed in the Wall Street Journal [$] today on retirement savings in different countries notes, “OECD data show a strong negative relationship between the generosity of public pensions and the income that retirees collect from work and private saving.”

The decline in mail volumes is prompting the USPS to extend its tentacles. GovExec reports, “from banking to passport photos, nearly all postal reform stakeholders agree any legislation must unchain the Postal Service to leverage its unique, in-every-community network to create new sources of revenue.” By “stakeholders,” GovExec appears to mean groups—such as the labor unions—that benefit from the subsidized status quo.

The Wall Street Journal reports [$] that the grocery gambit “is the latest in a string of aggressive moves by the Postal Service to compete in the package-delivery market.” But why would we want the government “aggressively” undermining private businesses, especially in an industry like package delivery that is already efficient and competitive?

If the USPS expands into new areas such as banking and groceries, we will end up with a mess of cross-subsidies between the agency’s different activities. Banks, for example, would complain that subsidized USPS banking was undercutting them, which would be inefficient and unfair. Such disputes would be chronic, and each dispute would descend into a battle over accounting between lobby groups in front of Congress.

For more efficiency and less lobbying, Congress should be encouraging the USPS to shrink, not expand. Does it make sense for a letter carrier to deliver groceries? The best way to find out is to privatize the letter carrier, repeal its legal monopoly, and then let it have a go. Postal privatization works. Britain, Germany, and the Netherlands have shown the way. 


How School Choice Saves Money

Tue, 09/30/2014 - 09:50

Jason Bedrick

School choice programs expand educational opportunity, but at what cost?

Opponents of school choice frequently claim that vouchers and scholarship tax credits “siphon” money from public schools and increase the overall cost of education to the taxpayers. However, these critics generally fail to consider the reduction in expenses associated with students switching out of the district school system, wrongly assuming that all or most school costs are fixed. When students leave, they claim, a school cannot significantly reduce its costs because it cannot cut back on its major expenses, like buildings, utilities, and labor. But if that were true, then schools would require little to no additional funds to teach additional students. A proper fiscal analysis considers both the diverted or decreased revenue as well as the reduction in expenses related to variable costs.

A new study by Jeff Spalding, Director of Fiscal Policy at the Friedman Foundation for Educational Choice, does exactly that. The study examines the fiscal impact of 10 of the 21 school voucher programs nationwide, finding a cumulative savings to states of at least $1.7 billion over two decades. Spalding, the former comptroller/CFO for the city of Indianapolis, is cautious, methodical, and transparent in his analysis. He walks readers through the complex process of determining the fiscal impact of each program, identifying the impact of each variable and explaining each equation along the way. He also makes relatively conservative assumptions, such as counting food service and interscholastic athletics as fixed costs even though they are variable with enrollment. Critically, Spalding accounts for those students who would have attended private school anyway, explaining:

One common complicating factor is student eligibility. If a voucher program allows students already enrolled in a private school to qualify, then those students do not directly relieve the public school system of any costs. Thus, there is a new public cost incurred for the vouchers provided to those students, but no corresponding savings for the public school system. Anytime voucher eligibility extends to students not currently enrolled in a public school, the net savings calculation must include that complicating factor.

States save money when the variable cost of each student to the district schools is greater than the cost of the voucher, accounting for the students who would have attended private school anyway. After wading through each state’s byzantine school funding formula, Spalding calculated that the voucher programs reduced expenditures across all 10 programs by $4.5 billion over two decades while costing states $2.8 billion, producing $1.7 billion in savings.
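For readers who want to see the arithmetic, the net-savings logic described above can be sketched in a few lines of code. The function name and all dollar figures here are hypothetical illustrations, not Spalding’s actual state-level data:

```python
# Illustrative sketch of the voucher net-savings calculation.
# All numbers below are made up for demonstration purposes.

def voucher_net_savings(switchers, non_switchers,
                        variable_cost_per_pupil, voucher_amount):
    """Net fiscal impact of a hypothetical voucher program.

    switchers: students who leave district schools for private schools
    non_switchers: voucher recipients who would have attended private
        school anyway (they add voucher cost but relieve the district
        schools of nothing)
    """
    # Variable costs the district schools no longer bear:
    savings = switchers * variable_cost_per_pupil
    # Total voucher outlay covers both groups of recipients:
    cost = (switchers + non_switchers) * voucher_amount
    return savings - cost

# Example: 10,000 switchers, 1,500 non-switchers, $9,000 in variable
# cost per pupil, and a $5,000 voucher.
print(voucher_net_savings(10_000, 1_500, 9_000, 5_000))  # 32500000
```

The sketch makes the key point concrete: a program only saves money when the variable cost relieved per switcher exceeds the voucher amount by enough to cover the vouchers going to students who never cost the district anything.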

In the last 40 years, government spending on K-12 education has nearly tripled while results have been flat. Moreover, the Census Bureau projects that the elderly will make up an increasingly larger share of the population in the coming decades, straining state budgets with spending on health care and retirement benefits. Schools will have to compete with hospitals and nursing homes for scarce resources.

In other words, our education system needs to become more effective and financially efficient, fast. Large-scale school choice programs promise to do both.


Amtrak Shouldn't Get to Write Its Own Ticket

Tue, 09/30/2014 - 09:21

Ilya Shapiro

Article One, Section One of the Constitution vests “all legislative powers” in Congress. The sovereign power to make laws comes from the people, so their representatives—Congress—should make those laws.

It sounds simple enough, but once the federal government started ballooning in size and regulating everything under the sun, that simple understanding had to go. There was too much governing for Congress to handle on its own, so the courts adjusted, allowing a proliferation of government agencies to exercise lawmaking power, within certain guidelines.

We’ve now apparently gotten to the point, however, that there’s so much governing to do that it’s too much for the government to handle on its own. In a case now before the Supreme Court, Amtrak—the for-profit, quasi-public entity that the federal government has deemed private for these purposes—has been given a part to play in making laws to regulate its competitors in the rail transportation industry.

If you think this sounds like a far cry from “all legislative powers” being vested in Congress, you’re not alone. The Association of American Railroads, which represents the rail companies subject to these regulations, sued the Department of Transportation, arguing that the Passenger Rail Improvement and Investment Act of 2008 unconstitutionally vests federal legislative power in a private entity by giving Amtrak the ability to set rail standards (in conjunction with the DOT). AAR has battled through the federal courts, most recently winning in the U.S. Court of Appeals for the D.C. Circuit, and is now trying to preserve that victory before the Supreme Court.

Cato, joined by the National Federation of Independent Business, has filed a brief supporting AAR. We argue that this case is different from other cases where courts have found prudential reasons for not enforcing the nondelegation doctrine, the concept that Congress can’t delegate its own legislative powers. As we explain, the judicial administrability, political accountability, and necessity arguments in favor of liberal delegation of lawmaking powers are far less valid in the context of delegation to private entities. Further, apart from these prudential concerns, the Court has vigilantly enforced these important structural limitations on delegation and should continue to do so here.

It’s perhaps too late to expect the courts to meaningfully rein in the massive delegation of power to the administrative state—though we should limit that delegation to implementation of law rather than actual legislation—but, as our brief explains here, it could be much worse. Many agencies are already dominated by the private interests they’re supposed to regulate (a dynamic known as “regulatory capture”), but allowing a private entity to secure a legislative role in governing its competitors not only exacerbates the problems that the administrative state already poses, it makes a mockery of the Constitution and erodes one more important structural protection for liberty.

The Supreme Court will hear oral arguments in Dept. of Transportation v. Association of American Railroads on December 8, with a decision expected in the spring.

This blog post was co-authored by Cato legal associate Julio Colomba.


Addressing the Critics of This Purportedly No Good, Very Bad Chart

Mon, 09/29/2014 - 15:52

Andrew J. Coulson

For the past few years I have charted the trends in American education spending and performance (see below). The goal is to see what the national data suggest about the productivity of our education system over time. Clearly, these data suggest that our educational productivity has collapsed: the inflation-adjusted cost of sending a student all the way through the K-12 system has almost tripled while test scores near the end of high-school remain largely unchanged. Put another way, per-pupil spending and achievement are not obviously correlated.

Not everyone is happy with these charts, and in this post I’ll respond to the critics, starting with the most recent: Matt DiCarlo of the Albert Shanker Institute, an organization that honors the life and legacy of the late president of the American Federation of Teachers. DiCarlo finds the chart “misleading,” “exasperating,” and seemingly designed to “start a conversation by ending it.” Since we’re actually having a conversation about the chart and what it shows, and since I’ve had countless such conversations over the years, perhaps we can agree that the last of those accusations is more of a rhetorical flourish than a serious argument.

DiCarlo links to a couple of earlier critics to do the heavy lifting in support of his antipathy, but he does admonish the use of a single percent-change y-axis as “not appropriate for and totally obscur[ing] changes in NAEP scale scores.” This is ironic. When I first began to publish these charts, I used two separate y-axes, as shown in the image below, which dates back to around 2009.

This, DiCarlo may be exasperated to hear, was roundly criticized for the ostensible crime of using… two separate y-axes, which, apparently, is only done by knaves and charlatans, according to several e-mails I received at the time. But of course the use of two separate y-axes is not inherently misleading. It depends on why they are used, whether or not the scales are sensible, etc. But when you are trying to reach a suspicious audience, it’s not very effective to just say: “no, you’re mistaken, there’s nothing wrong with this use of two y-axes.” They’ll just put that down to more knavery on your part. So, thinking I would eliminate one source of spurious objections, I switched to a single percent-change axis. And now we have DiCarlo’s objection to that. Catch-22.

But let’s investigate DiCarlo’s criticism. Does the percent change scale obscure important changes in the NAEP scores? Looking at the first chart, it is easy to see that the science score fell and never quite recovered to its original level before the test was discontinued, while the reading and math scores seem largely unchanged. As it happens, the raw NAEP score for math rose from 304 to 306, while the raw reading score rose from 285 to 287, and the raw science score fell from 305 to 295. We can see the science decline quite clearly, so it hasn’t been obscured. But the two-point gains in math and reading look essentially like flat lines. Which raises the question: is a two-point gain essentially equivalent to a flat line, or is it substantial?

Some people like to answer that question by saying that a gain of x points on the NAEP is equivalent to one school-year’s worth of learning. But, according to a 2012 paper commissioned by the NAEP Validity Studies Panel, claims of that sort are unempirical guesswork.

Fortunately, there is a tried-and-true metric that researchers use to quantify effect sizes: they express them in terms of standard deviations, and those measures can in turn be converted to percentile scores. For example, the earliest available standard deviation for the mean reading score of 17-year-olds was 46 points. Dividing 2 by 46, we get an effect size of 0.0435 SDs. That would take you from being in the middle of the pack in the early 1970s (that is, the 50th percentile) to the 51.7th percentile. So instead of outscoring half your peers, you’d outscore 51.7 percent of them. That’s not a huge difference, is it? That’s not a spike-the-football, endzone dance, “In. Your. Face!” kind of improvement. It’s really pretty small.

In math, the story is similar. The earliest SD available is for the 1978 administration of the test, and it was 35. A two-point gain would be an effect size of 0.057 SDs, which would raise you from median performer to the 52.3rd percentile. Again, this is not winning the lottery. This is not an “I’d like to thank the Academy” kind of moment.
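For the curious, the conversion in the last two paragraphs is just the effect size run through the normal cumulative distribution function. Here is a quick sketch, assuming normally distributed scores; the function name is mine, not standard statistical software:

```python
from math import erf, sqrt

def percentile_after_gain(score_gain, std_dev):
    """Convert a raw score gain into a percentile rank, assuming
    normally distributed scores: divide the gain by the standard
    deviation to get an effect size in SDs, then apply the normal
    CDF, Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))."""
    effect_size = score_gain / std_dev
    return 100 * 0.5 * (1 + erf(effect_size / sqrt(2)))

# The two cases discussed above:
print(round(percentile_after_gain(2, 46), 1))  # reading: 51.7
print(round(percentile_after_gain(2, 35), 1))  # math: 52.3
```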

So the fact that the reading and math scores look essentially flat in the chart at the top of this post is an accurate representation of the trend in raw NAEP scores. They are essentially flat.

Next, turning to the cost series in the top chart, both of the earlier critics cited by DiCarlo believed they smelled a rat. The legend of the version of the chart they reviewed referred to the cost trend line as a “13yr running total, i.e. K-12, spending per pupil,” which I thought was self-explanatory. It wasn’t. It seems that at least one of the critics was unfamiliar with the concept of a running total, thinking it was equivalent to simply multiplying the current year figure by 13. It’s not. Because of his misunderstanding, he wrote: “the cost figure increases (supposedly the total cost of a K-12 education taken by multiplying per-pupil costs by 13) are false.” Of course the error was his own, the result of failing to understand that a running 13yr total is the annual per-pupil spending for the given year, plus the corresponding figures for the preceding 12 years. This is an estimate of what was spent to put a graduate in the given year all the way through the K-12 system–i.e., the total cost of that graduate’s K-12 public schooling.

The other critic cited by DiCarlo seems not to have read the chart’s legend at all, claiming that I use “total rather than per pupil spending (and call it ‘cost’).” The legend explicitly states that it is a running 13yr total of per-pupil spending.

But, though both these critics were mistaken, I did learn (to my great surprise) that the idea of a running total is not universally understood. So, since that time, I have elaborated the explanation in the legend and moved it to the top of the chart in an effort to make the cost trend line easier to understand.
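Since the running total has tripped up more than one reader, here is a minimal sketch of the idea, with made-up spending figures purely for illustration:

```python
# A 13-year running total of per-pupil spending: for each graduating
# class, sum that year's figure with the preceding 12 years' figures.
# This estimates what was spent putting that year's graduate all the
# way through K-12. (It is NOT the current year multiplied by 13.)

def running_k12_totals(per_pupil_spending, window=13):
    """Return one total per graduating year, each summing `window`
    consecutive annual per-pupil figures ending in that year."""
    return [sum(per_pupil_spending[i - window + 1 : i + 1])
            for i in range(window - 1, len(per_pupil_spending))]

# 15 years of hypothetical inflation-adjusted per-pupil spending,
# rising $150 per year from an $8,000 base:
spending = [8_000 + 150 * year for year in range(15)]
totals = running_k12_totals(spending)
print(totals[0])  # graduate of year 13: sum of years 1 through 13
```

Note that with rising spending the running total is smaller than the multiply-by-13 shortcut the critic assumed, because earlier grades were cheaper than the graduation-year figure.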

Yet other critics have alleged that the overall flat performance of 17-year-olds is purely the result of changing demographics—i.e., the increasing test participation rates of historically lower-scoring groups, and so the aggregate data are misleading. There is a little something to the premise of this argument, but the conclusion still doesn’t follow. I explained why in my 2011 testimony before the House Education and the Workforce Committee, but I’ll summarize it here for completeness. 

It is true that both black and Hispanic students now score higher than they did in the early 1970s, and the difference isn’t negligible as it is with the overall aggregate trend. The first caveat is that the trends for white students, who still make up the majority of test takers, are only marginally better than the overall trends. Whites gained four points in each of reading and math, and lost six points in science. The overall picture for whites is thus also essentially a flat line, and it is their performance that is chiefly responsible for the stagnation in the overall average scores, not the increasing participation of historically lower-scoring groups.

The second caveat is that all of the improvement in the scores of Hispanic and black students had occurred by around 1990, and their scores have stagnated or even declined slightly since that time (see the testimony link above). While the improvements for these subgroups of students are not negligible, they have no relationship to the relentlessly rising spending trend. Spending increased before, during, and after the period during which black and Hispanic students enjoyed their score gains. If per-pupil spending were an important cause of those gains, we would expect more uniform progress, and that is not what the data show.

Finally, what of the claims that it is unfair to chart test scores over this period because students have become harder to teach, owing to poverty, single-parent families, low birth weight, or other factors associated with student performance? Claims like this are seldom accompanied by any sort of systematic numerical analysis. That’s too bad, because if the overall trend in such factors really has been negative, then they might well be dragging down student performance and skewing the NAEP scores lower. Fortunately, several years ago, Prof. Jay Greene of the University of Arkansas decided to take these criticisms seriously, tabulating the trends in 16 different factors known to be associated with student achievement (including the ones listed above) and combining them into a single overall index of “teachability.” What Greene found is that, if anything, children have become marginally more teachable over this period. So we should expect some improvement in scores even if schools haven’t improved at all.

In sum, while I grant that this particular chart does not capture every interesting piece of the puzzle—no single chart could—it is both useful and an accurate depiction of the lack of correspondence between spending and student achievement in U.S. schools over the past two generations, and of the fact that spending has risen out of all proportion with the academic performance of students near the end of high school.


Venezuela: A Military Regime

Mon, 09/29/2014 - 15:30

Juan Carlos Hidalgo

Bloomberg has a story today on the many perks that the Venezuelan army enjoys vis-à-vis the downtrodden civilian population. Whereas a regular Venezuelan has to line up for hours to get basic goods (when they are available), officers enjoy privileged access to fully stocked supermarkets, new cars, housing, and many other benefits.

The obvious strategy of the Venezuelan government is to keep the armed forces happy in case it needs them to hold on to power.  But the reality is far worse.  In fact, Venezuela is now a military regime. Even though President Nicolás Maduro is a civilian, he is surrounded by people who have donned a uniform: according to Bloomberg, “A third of Venezuela’s 28 ministers and half the state governors are now active or retired officers.”

The rise in prices is not the only kind of inflation affecting Venezuela. Bloomberg reports that “its military now has between 4,000 and 5,000 generals” for a ratio of one general for every 34 servicemen (in the United States the ratio is one general per 1,490 servicemen). As expected, generals enjoy higher salaries and many other benefits. Moreover, the intelligence community believes that high-ranking army officers control most illegal activities in Venezuela, from smuggling to drug trafficking. In other words, military men are profiteering from the status quo.

All this makes it unlikely that the armed forces would ever threaten the continuity of chavismo in power. Just the opposite: the army will be a key actor in propping up the regime, even if Venezuelans decide otherwise at the polls.


New York Airbnb Law Hurting B&Bs

Mon, 09/29/2014 - 15:22

Matthew Feeney

The campaign against Airbnb in New York City has claimed some surprising collateral damage: actual B&Bs. According to Mary White of BnB Finder, half of all the B&Bs in the city have closed down since legislation in 2011 aimed at limiting Airbnb abuses came into effect. The statewide law prohibits building owners from renting rooms in residential buildings and apartments for less than 30 days.

The law may have been aimed at preventing apartment buildings from being turned into illegal hotels, but it has resulted in fines being issued to B&B owners. The owner of one of New York City’s B&Bs launched an advocacy group in 2011 that lobbies for changes in legislation but does not list its members. Given the legislation affecting B&Bs, it is understandable that B&B owners would be hesitant to increase their exposure. A Crain’s New York Business article on the decline of B&Bs in New York City quotes the owner of a Brooklyn inn, who limits the amount of attention her business receives online in order to avoid city investigators:

Most people don’t even know B&Bs exist in the city. Now many innkeepers are keeping a lower profile, hoping not to attract the city’s attention.

The owner of a European-style inn in Brooklyn, who did not want to be identified, said she limits her online exposure by not listing her property with travel sites like Expedia. The reason is twofold: She wants to speak personally with her guests before they arrive to ensure “that the people who stay with us won’t steal anything from us,” and to keep out of the crosshairs of city investigators.

Airbnb has made enough of an impact in New York City for hotel industry representatives, local officials, and activists to launch Share Better, a coalition group which claims the popular site is contributing to New York City’s affordable housing crisis and allowing tenants to violate lease agreements. Earlier this month, I wrote about Share Better’s claims and discussed the regulatory gray area Airbnb operates in with MSNBC’s Eric Ortega and The New Republic’s Noam Scheiber.

As Share Better continues to make its arguments against Airbnb, regulators and lawmakers should not forget the recent decline in New York City B&Bs, the hard-hit, unintended victims of anti-competitive rental restrictions.


The Federal Government and American Indians

Mon, 09/29/2014 - 14:54

Chris Edwards

As research for this essay on the Bureau of Indian Affairs, I visited the Smithsonian National Museum of the American Indian (NMAI). I found virtually no information useful for my project.

I stopped by the museum information desk on the way out and said something to the effect, “There is very little here about the relationship between Indians and the federal government, yet that relationship is central to the story of American Indians over the last two centuries.” A few months ago, I emailed a similar complaint to the head of the NMAI, and he did kindly respond to me.

The museum has now taken a big step toward fixing the problem with its new exhibit about the history of treaties between tribes and the federal government. It’s a good exhibit, telling some of the stories about how the government deceived and cheated the Indians again and again, depriving them of their lands, resources, and freedom.

The general topic is interesting to me because it illustrates numerous libertarian themes, including the arrogance and dishonesty of federal officials, the eagerness of officials to substitute their own goals for individual freedom, government corruption, the failure of top-down planning from Washington, and the inability of hand-outs to create lasting prosperity.

As I discuss in my essay, there has been good news on Indian reservations in recent decades. But the federal government continues to fail in creating the legal structure needed so that people on reservations can prosper. One long-standing problem is the very poor functioning of law enforcement. In a story today about Indian tribes in North Dakota, the Washington Post says:

Investigating crime on Fort Berthold is more difficult than most places because the reservation sits in six different counties each with its own sheriff — some of whom do not have a good relationship with the tribe, according to tribal members. If the victim and suspect are both Native American, the tribal police or the FBI handles the arrest. But if the suspect is not Native American, in most cases the tribal police can detain the suspect but then have to call the sheriff in the county where the crime occurred. Sometimes they have to wait several hours before a deputy arrives to make the arrest. In a murder case, the state or the FBI might be involved, depending on the race of the victim and the suspect.

“There are volumes of treatises on Indian law that are written about this stuff,” Purdon said. “It’s very complicated. And we’re asking guys with guns and badges in uniforms at 3:30 in the morning with people yelling at each other to make these decisions — to understand the law and be able to apply it.”

I don’t know what the best solution to these particular problems is. I do know that the U.S. Constitution empowered the federal government to engage with the tribes, and that Congress should spend more time tackling such fundamental issues. Unfortunately, most members of Congress focus most of their efforts on hundreds of programs not authorized by the Constitution.

Anyway, kudos to the Washington Post for doing a series on justice issues in Native American communities. And kudos to the NMAI for informing the public about the government’s often appalling behavior over two centuries of dealing with the first Americans.

Categories: Policy Institutes

The Real Costs of HealthCare.gov

Mon, 09/29/2014 - 12:05

Nicole Kaeding

In May, Department of Health and Human Services (HHS) Secretary Sylvia Burwell testified to Congress that costs for building HealthCare.gov were $834 million. New research from Bloomberg Government suggests that Burwell’s estimate represents a low-end estimate.

According to the new report, spending for HealthCare.gov has been an estimated $2.14 billion. Burwell’s estimates did not include numerous costs related to the project. For instance, she did not include the contract costs for processing paper applications, which are used as a backup. That contract cost $300 million.

Burwell’s figure also does not include spending at the IRS and other agencies related to ACA requirements. For instance, the IRS is required to provide real-time interfacing with HealthCare.gov to verify income and family size for insurance subsidy calculations. Those requirements cost $387 million.

Bloomberg also includes $400 million in costs that were excluded by HHS using creative accounting. When it wrote the ACA, Congress did not appropriate money to HHS for the construction of a federal exchange. Instead, it provided unlimited grants to states to construct their portals. When many states refused to construct their exchanges, HHS was forced to develop HealthCare.gov, but without a dedicated source of funding. HHS said it would need to “get creative” about funding options, leaving many wondering where HHS would eventually get the money. According to Bloomberg, HHS shifted money around to finance the construction of HealthCare.gov, using a number of existing contracts to finance the website’s construction.

Finally, Bloomberg included $255 million more in costs than Burwell due to time period differences. Burwell’s costs were as of February 2014. Bloomberg included costs through August 20, 2014, and then projected the current level of spending forward to the end of the fiscal year, September 30. But even these figures are likely conservative, because federal agencies often ramp up spending, particularly contract spending, as they close out their fiscal years.

Implementing the ACA is a costly exercise; Bloomberg says the $2.14 billion for HealthCare.gov administration is only a small part of the full $73 billion costs of Obamacare since its passage in 2010. But the administration nonetheless owes taxpayers an accurate accounting for the costs of the system.

Categories: Policy Institutes

Halbig v. Burwell: House Oversight Committee Subpoenas IRS

Mon, 09/29/2014 - 12:03

Michael F. Cannon

This was a long time coming.

Those who follow Halbig v. Burwell and similar cases know the IRS stands accused of taxing, borrowing, and spending billions of dollars contrary to the clear language of federal law. The agency is quite literally subjecting more than 50 million individuals and employers to taxation without representation.

Congressional investigators have been trying to figure out how the IRS could write a rule that so clearly contradicts the plain language of the Patient Protection and Affordable Care Act. Unfortunately, the agency has largely stonewalled their efforts to obtain documents relating to the development of the regulation challenged in the Halbig cases.

Fortunately, last week the House Committee on Oversight and Government Reform finally used its subpoena power to demand the IRS turn over the documents that show what went into the agency’s decision.

We’ll see if the IRS complies, or if another of the agency’s hard drives conveniently crashes.

I’ve got a fuller write-up over at Darwin’s Fool.

Categories: Policy Institutes

Democrats and Their Mansions, Again

Sat, 09/27/2014 - 16:03

David Boaz

Two articles in today’s Washington Post Real Estate section remind me of how off-target a Post political article was a couple of months ago. The House of the Week is Paul and Bunny Mellon’s Upperville, Va., estate, which features a 10,000-square-foot main house on 2,000 acres and is being offered for $70 million. The Mellons often entertained their friends John F. and Jacqueline Kennedy there. Bunny Mellon, the daughter of the man who cofounded the Warner-Lambert drug company, married the heir to the Mellon Bank fortune. Sadly, she made headlines late in her long life for her multi-million-dollar support of Sen. John Edwards’s presidential campaign, including money to cover up his extramarital affair.

Meanwhile, the feature article in the Real Estate section looks at “an American palace,” a 40,000-square-foot house (and you thought the Mellons were extravagant at 10,000 square feet!) in Potomac, Md., built by a businessman who started a company with a federal grant, built it on government contracts, and then sold it for hundreds of millions of dollars. Frank Islam says that “‘to whom much is given, much is expected.’ It’s our responsibility to give back and share.” And share he does, with the kind of people who made all that government largesse possible:

Since moving into their 14-bedroom, 23-bathroom estate in 2013, the homeowners have regularly staged events for the Democratic Party. They held a June dinner attended by Vice President Biden and a fundraiser for Sen. Al Franken (D-Minn.) this month. 

Islam and Driesman have hosted nearly all the region’s Democrats, including Maryland Gov. Martin O’Malley and Lt. Gov. Anthony Brown; Sens. Timothy M. Kaine of Virginia and Benjamin L. Cardin of Maryland; and Montgomery County Executive Isiah Leggett.

All of which reminded me of another Post story by a longtime reporter back in May, which turns out to have been about the very same mansion:

The Potomac estate of IT entrepreneur and philanthropist Frank Islam seemed more fitting for a Republican soiree than a Democratic fundraiser, some of Maryland’s top elected officials said Wednesday….

“There are not too many people who own homes like this who are great Democrats,” Sen. Benjamin L. Cardin (D-Md.) told the audience of about 400.

As I said at the time, “Democrats don’t have much trouble finding billionaires and mansions for fundraising events. Reporters shouldn’t act like it’s an unusual event.”  

A month after that, Sen. Harry Reid declared in one of his tirades about billionaires in politics that the Democratic Party “doesn’t have many billionaires.” (Or maybe he said “any billionaires”; the audio is unclear.) PolitiFact found plenty of billionaire donors to both parties. Whatever you think of money in politics, reporters should stop recycling Democratic spin that big money is found on only one side of the aisle.

Categories: Policy Institutes

Who Needs to Be More Flexible in the TPP talks? Hint: It's Not Japan.

Fri, 09/26/2014 - 12:27

K. William Watson

According to news reports, the United States and Japan have again failed to reach a bilateral agreement on lowering import barriers, a necessary prerequisite to completion of the 12-member Trans-Pacific Partnership (TPP) trade agreement. U.S. negotiators and business interests are quick to blame Japan for being reluctant to eliminate tariffs on a handful of highly traded agricultural products. In truth, though, the Japanese government has shown much greater commitment to the TPP and more willingness to take political risk than the United States. If the TPP falls apart, the blame will not lie with the Japanese.

The tariffs in question are what trade negotiators refer to as “sensitivities.” For every country in any trade negotiation, there are some trade barriers that are very difficult to lower because of the domestic political power of the businesses and industries that benefit from them. In Japan’s case these are agricultural tariffs (on rice, wheat, sugar, meat, and dairy) that are the bread and butter of Japan’s politically powerful farmers. Getting rid of sensitive barriers can be done, but it requires greater political will from both local and foreign leaders. Politicians take great risks when they oppose the interests of a powerful lobby.

I’ve noted before that criticisms of Japan’s stance are inappropriately antagonistic in light of how beneficial tariff elimination would be to Japan itself. The Japanese government knows this, too. Earlier this week, Japanese Prime Minister Shinzo Abe spoke about how eager his government is to use the TPP talks as a way to enact broad agricultural reforms:

I consider it is indispensable for the future of Japanese agriculture to promote the domestic and international reforms in an integrated way.

To be honest with you, it is indeed an enormous task to suppress the resistance from the people who have been protected by vested interest. However, there is no future for them if they are not exposed to competition.

Rather than sympathize with their Japanese counterparts, however, the U.S. Trade Representative’s office continues to accuse Japan of expecting special treatment when all other TPP members are committed to more ambitious liberalization.

This attitude is incredibly hypocritical, considering that U.S. sensitivities are considered so far out of bounds that they aren’t even being discussed. Issues like agriculture subsidies, maritime shipping, antidumping reform, and government procurement aren’t on the table at all. Their absence is further evidence that it is the United States that lacks interest in taking political risk to advance the TPP.

But even if the U.S. government gets a pass on excluding all those non-tariff issues, it’s still falling short of Japan’s level of commitment. Consider this report from Japan’s Kyodo News Agency:

The United States has told Japan during their recent ministerial talks it will keep tariffs on automotive parts under a Pacific Rim free trade initiative, retracting its previous plan to scrap them immediately, negotiation sources said Thursday.

… The move is apparently out of consideration for the U.S. auto industry, which is a strong political support base for President Barack Obama’s Democratic Party, before U.S. midterm elections in November.

The United States is right to ask a lot from the other 11 countries in the TPP negotiations. But completing the agreement (not to mention getting it passed in Congress) is going to require the Obama administration to step on the toes of particular business groups and explain why the deal is good for the country as a whole. They could learn how it’s done by watching their Japanese counterparts.

Categories: Policy Institutes

Reagan and the Air Traffic Controllers

Fri, 09/26/2014 - 11:55

Chris Edwards

An obituary in the Washington Post for Robert Poli provides a chance to look back at a decisive moment in Ronald Reagan’s presidency. Poli was the head of the militant Professional Air Traffic Controllers Organization (PATCO), which launched an illegal strike in 1981. The Post describes the significance of the action:

The strike by PATCO, Reagan’s subsequent breaking of the union and the hiring of replacement workers were among the most significant job actions of their time, said Joseph A. McCartin, a professor at Georgetown University and a specialist on labor and social history. They “helped to define labor relations for the rest of the century and even into the 21st century,” he said, turning public sentiment away from striking as a legitimate labor tactic and further emboldening employers in the private sector to permanently replace striking workers.

Reagan’s hard line with the PATCO strikers six months into his presidency helped establish an image of him at home and overseas as a strong leader who would not be pushed around.

Here is the sequence of events: 

The PATCO work stoppage began Aug. 3, 1981, when at least 12,000 of the nation’s 17,000 air traffic controllers defied federal law and walked off their jobs, seeking higher pay, shorter hours, better equipment and improved working conditions in a long-simmering labor dispute.

There were widespread flight cancellations and delays, and 22 of the nation’s busiest airports were directed to reduce their scheduled flights by 50 percent.

That morning in the White House Rose Garden, Reagan declared, “I must tell those who failed to report for duty this morning they are in violation of the law, and if they don’t report for work within 48 hours, they have forfeited their jobs and will be terminated.”

Two days after the walkout began, Transportation Secretary Drew L. Lewis announced that at least 12,000 striking air traffic controllers had been terminated and would not be rehired “as long as the Reagan administration is in office.”

The Reagan administration stuck to its guns. The strikers were replaced by nonstriking controllers, air traffic supervisors, and military controllers until new controllers were trained.

The episode was a very gutsy move by Reagan, with beneficial consequences. But as I note here, the 1981 strike and response did not come out of nowhere—PATCO had been causing problems for years. In 1969, for example, about 500 members of PATCO called in “sick” in a protest, which caused major air service interruptions. And in 1970, about 3,000 members of PATCO took part in another “sickout,” or illegal strike, that caused chaos for the nation’s air traffic. Those sorts of union troubles continued during the 1970s, which set the stage for the Reagan showdown.

Today, the government’s air traffic controllers have a different union organization, NATCA. Rather than striking illegally, these folks do what a growing number of groups in society are doing to advance their agendas: they lobby.

Categories: Policy Institutes

Land Use and Local Government: The Facts On the Ground Are Libertarian

Fri, 09/26/2014 - 11:00

Walter Olson

Prof. Kenneth Stahl, who directs the Environmental Land Use and Real Estate Law Program at Chapman University School of Law, has a post at Concurring Opinions asking why libertarians aren’t more numerous among academic specialists in local government and land use law. Stahl describes his own views as siding with “leftists rather than libertarians,” that is to say, those who “have some confidence in the ability of government to solve social problems”: 

Nevertheless, were you to pick up a randomly selected piece of left-leaning land use or local government scholarship (including my own) you would likely witness a searing indictment of the way local governments operate. You would read that the land use decisionmaking process is usually a conflict between deep-pocketed developers who use campaign contributions to elect pro-growth politicians and affluent homeowners who use their ample resources to resist change that might negatively affect their property values. Land use “planning”—never a great success to begin with—has largely been displaced by the “fiscalization” of land use, in which land use decisions are based primarily on a proposed land use’s anticipated contribution to (or drain upon) a municipality’s revenues. Public schools in suburban areas have essentially been privatized due to exclusionary zoning practices, and thus placed off limits to the urban poor, whereas public schools in cities have been plundered by ravenous teachers’ unions.

… It hardly paints a pretty picture of local government. Yet, most leftists’ prescription is more government. 

To put it differently, libertarian analysis better explains what actually goes on in local government than does the standard progressive faith in the competence of government to correct supposed market failure. The post (read it in full!) goes on to discuss specifics such as annexation, incorporation, and economic stratification-by-jurisdiction; the relative success of lightly governed Houston in achieving low housing costs and attracting newcomers and economic growth; and the transference of progressives’ unmet hopes to regionalization, so memorably summed up by Jane Jacobs years ago: “A region is an area safely larger than the last one to whose problem we found no solution.”

Stahl: 

So why would left-leaning scholars, who have seen so clearly the failures of local government, place so much faith in a largely untested restructuring of governmental institutions, rather than looking to less government as the solution?

Great question.

Categories: Policy Institutes

The World Needs More Energy, Not Less

Fri, 09/26/2014 - 10:08

Paul C. "Chip" Knappenberger

This week, a few major media outlets covered my take on the effectiveness and judiciousness of President Obama’s call, at the U.N. Climate Summit, for all countries of the world to make pledges of how and how much they are going to reduce their national carbon dioxide emissions. It should be no surprise that I think such actions would be ineffective and imprudent.

My biggest criticism is that not all countries of the world are at the same stage of energy development. While the developed nations may have all the energy supplies they want and need, most developing countries do not. So, while developing countries pursue “luxuries” like indoor lighting and clean cooking facilities (not to mention improved sanitation), developed countries are awash in the luxury of debating whether to alter the relative components of their fuel mix in hopes that it may (or may not) alter the future course of the climate.

Since historically (and today) there is an extremely tight coupling between energy production and carbon dioxide emissions (since fossil fuels are used to produce the overwhelming bulk of our energy), calls like those from President Obama to restrict carbon dioxide emissions are akin to calls to restrict energy usage and expansion.

Imposing carbon restrictions on developing nations would have large-scale negative implications, not only to those directly affected, but to the world as a whole, as a large expanse of human ingenuity–arguably humanity’s greatest resource–would remain constrained by basic survival efforts and 50-year life expectancies.

Basically, no one is going to go along with this. So despite promises, when adhering to plans to reduce carbon dioxide emissions (whether informal or formalized in a treaty) comes up against economic expansion and human welfare improvements, the latter are going to win out every time (or so we would hope).

Consequently, it is a lot easier to “talk the talk” on this issue of cutting carbon dioxide emissions than it is to “walk the walk.” Even in the United States, where carbon dioxide emissions have been on a gentle decline for the past 7-8 years (something the President likes to take credit for, despite that being impossible), a substantial portion of that decline has come at the hands of the recession and the rather stagnant recovery.

Some countries, however, are straightforward enough to publicly recognize this and dispense with appearances. Take India, for example. The new environment minister there was forthright in a recent New York Times article:

In a blow to American hopes of reaching an international deal to fight global warming, India’s new environment minister said Wednesday that his country would not offer a plan to cut its greenhouse gas emissions ahead of a climate summit next year in Paris.

The minister, Prakash Javadekar, said in an interview that his government’s first priority was to alleviate poverty and improve the nation’s economy, which he said would necessarily involve an increase in emissions through new coal-powered electricity and transportation. He placed responsibility for what scientists call a coming climate crisis on the United States, the world’s largest historic greenhouse gas polluter, and dismissed the idea that India would make cuts to carbon emissions.

“What cuts?” Mr. Javadekar said. “That’s for more developed countries. The moral principle of historic responsibility cannot be washed away.” Mr. Javadekar was referring to an argument frequently made by developing economies — that developed economies, chiefly the United States, which spent the last century building their economies while pumping warming emissions into the atmosphere — bear the greatest responsibility for cutting pollution.

My guess is India is not alone in this sentiment, whether publicly expressed or not.

The bottom line is that the world needs more energy, not less. Humanity’s overall well-being will be tied to the success of such pursuit.

Here is how I summed things up in my USA Today op-ed:

Overall, the world needs more energy, not less. Whatever changes in the climate that are to come, humanity will be better prepared and more resilient if we are healthier, wealthier and wiser. Restricting our ability to progress in these areas is not the best way forward.

Categories: Policy Institutes

Eric Holder's Tenure

Thu, 09/25/2014 - 17:57

Ilya Shapiro

Eric Holder’s tenure marked one of the most divisive and partisan eras of the Justice Department.  From his involvement in the bizarre guns-to-gangs operation (“Fast & Furious”), for which he has been cited for contempt by the House and referred to a federal prosecutor (which referral went nowhere due to invocations of executive privilege), to his refusal to recognize the separation of powers—enabling President Obama’s executive abuses—he politicized an already overly political Justice Department.

One thing that differentiates Holder from other notorious attorneys general, like John Mitchell under Richard Nixon, is that Holder hasn’t gone to jail (yet; the DOJ Inspector General had better lock down computer systems lest Holder’s electronic files “disappear”).

Holder’s damage to race relations may be even worse than his contempt for Congress, however, as his management of the Justice Department and use of its powers betray a desire to use the law to advance a dubious view of social justice. For example, he sued fire and police departments to enforce hiring quotas and inflamed social tensions with his pronouncements on Stand Your Ground laws. He blamed banks for not lending enough to members of racial minority groups and other banks for “predatory lending” that led to disproportionate bankruptcies among those same groups. Ironically, he’s even challenged school choice programs, which overwhelmingly help poor black kids acquire better educations.

Still, it must be said that Holder was a “uniter not a divider” on one front: under his reign, the Justice Department has suffered a record number of unanimous losses at the Supreme Court. In the last three terms alone, the government has suffered 13 such defeats – a rate double President Clinton’s and triple President Bush’s – in areas of law ranging from criminal procedure to property rights to securities regulation to religious freedom. By not just pushing but breaking through the envelope of plausible legal argument, Attorney General Holder has done his all to expand federal (especially executive) power and contract individual liberty beyond any constitutional recognition.

Eric Holder will not be missed by those who support the rule of law.

Categories: Policy Institutes

Long-Term Solutions to the Ukraine Crisis

Thu, 09/25/2014 - 17:51

Emma Ashford

As I argued in a piece over at Forbes yesterday, western sanctions to roll back Russian action in Ukraine have been largely ineffectual. These sanctions, including asset freezes and visa bans, are “targeted” at those suspected of having influence on Putin. Yet the sanctions, designed to be minimally painful for European states, are toothless: the majority of individuals sanctioned have only a minimal role in policy, and they won’t fix the long-term problem.

Over 150 individuals have been sanctioned by the United States and European Union, including 65 Ukrainian rebels, whose inclusion is presumably intended to inhibit their ability to wage conflict. The remainder are Russian, but most have no access to the corridors of power. Anatoly Sidorov, for example, the Commander of Russian military units in Crimea, is likely uninvolved in the policy formulation process. Other names are stranger, such as Ramzan Kadyrov, head of the Chechen republic. No doubt, he’s a trenchant proponent of the rebels, but he doesn’t influence Russian policy. In all, I estimate only a small proportion of those included in joint sanctions are actually involved in high-level decisionmaking.

The sanctions also vary in impact. Vladislav Surkov, suspected mastermind of Russia’s Crimea strategy, joked with reporters that sanctions didn’t worry him, as his only interest in the United States was Tupac. His point is valid: for those with no assets in Western Europe or the United States, sanctions are merely inconvenient.

Newer sanctions on companies certainly carry some more bite, restricting the ability of Russian banks to raise capital on Western markets. But they still don’t touch Russia’s key source of government revenues, an estimated 50-70% of which come from oil and gas sales. Unfortunately, Russia supplies one-third of Europe’s natural gas, and several countries (e.g., Estonia, Latvia) are entirely dependent on Russian energy. An immediate stop to imports is simply not possible, especially at the start of winter.

In the long-run, however, the most energy-dependent countries are also those most worried about Russia for security reasons. Now is an excellent time for these countries to begin to slowly divest themselves of Russian gas and oil. Dependence is a two-way street, after all: Russia is dependent on European payments for energy, and it will be difficult, time-consuming, and expensive for Russia to find alternate buyers for its resources.

The United States can help Europe with this process. The global energy market is being reshaped by innovations like fracking and liquefied natural gas (LNG) transport. Thanks to shale gas, the United States is now one of the world’s largest producers of natural gas, with LNG shipments set to leave ports as early as 2015. Indeed, House Speaker John Boehner argued in March that the United States could help to curb Russia’s influence by encouraging natural gas exports to Eastern Europe. But for the sake of their own security, European states must begin the long process of shifting away from Russian energy supplies, and turning off the spigot of energy wealth that keeps the Kremlin afloat. 

Categories: Policy Institutes

The Collection of Evidence for a Low Climate Sensitivity Continues to Grow

Thu, 09/25/2014 - 17:09

Patrick J. Michaels and Paul C. "Chip" Knappenberger

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Nic Lewis and Judith Curry just published a blockbuster paper that pegs the earth’s equilibrium climate sensitivity—how much the earth’s average surface temperature is expected to rise in association with a doubling of the atmosphere’s carbon dioxide concentration—at 1.64°C (1.05°C to 4.05°C, 90% range), a value that is nearly half of the number underpinning all of President Obama’s executive actions under his Climate Action Plan.

This finding will not stop the President and the EPA from imposing more limits on greenhouse-gas emissions from fossil fuels. A wealth of similar findings have appeared in the scientific literature beginning in 2011 (see below) and they, too, have failed to dissuade him from his legacy mission.

The publication of the Lewis and Curry paper, along with another by Ragnhild Skeie and colleagues, brings the number of recent low-sensitivity climate publications to 14, by 42 authors from around the world (this doesn’t count our 2002 paper on the topic, “Revised 21st Century Temperature Projections”).  Most of these sensitivities are a good 40% below the average climate sensitivity of the models used by the U.N.’s Intergovernmental Panel on Climate Change (IPCC).

Lewis and Curry arrive at their lower equilibrium climate sensitivity estimate by using updated compilations of the earth’s observed temperature change, oceanic heat uptake, and the magnitude of human emissions, some of which should cause warming (e.g., greenhouse gases), while others should cool (e.g., sulfate aerosols). They try to factor out “natural variability.” By comparing values of these parameters from the mid-19th century to now, they can estimate how much the earth warmed in association with human greenhouse gas emissions.
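In schematic form, this kind of energy-budget estimate boils down to a one-line calculation: scale the observed warming by the ratio of the forcing from a doubling of CO2 to the net forcing actually realized (the forcing change minus the change in ocean heat uptake). The sketch below uses round, purely illustrative numbers, not the paper’s actual inputs:

```python
# Schematic energy-budget estimate of equilibrium climate sensitivity (ECS).
# All input values are round, illustrative numbers, not Lewis and Curry's data.

F_2XCO2 = 3.7   # W/m^2, canonical forcing from a doubling of CO2
delta_T = 0.8   # K, observed surface warming since the mid-19th century (illustrative)
delta_F = 2.2   # W/m^2, change in total radiative forcing over that period (illustrative)
delta_Q = 0.4   # W/m^2, change in the rate of ocean/planetary heat uptake (illustrative)

# Energy-budget relation: ECS = F_2xCO2 * dT / (dF - dQ)
ecs = F_2XCO2 * delta_T / (delta_F - delta_Q)
print(round(ecs, 2))  # prints 1.64 with these round inputs
```

The actual study derives probability ranges from uncertainty in each input, but the central calculation has this simple observation-based form.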

The estimate is not perfect, as there are plenty of uncertainties, some of which may never be completely resolved. Nevertheless, Lewis and Curry have generated a very robust observation-based estimate of the equilibrium climate sensitivity.

For those interested in the technical details, and a much more thorough description of the research, author Nic Lewis takes you through the paper (here) and has made a pre-print copy of the paper freely available (here).

In the chart below, we’ve added the primary findings of Lewis and Curry as well as those of Skeie et al. to the collection of 12 other low-sensitivity papers published since 2010 that conclude that the best estimate for the earth’s climate sensitivity lies below the IPCC estimates. We’ve also included in our Figure both the IPCC’s subjective and model-based characterizations of the equilibrium climate sensitivity. For those wondering, there are very few recent papers arguing that the IPCC estimates are too low, and they all have to contend with the fact that, according to new Cato scholar Ross McKitrick, “the pause” in warming is actually 19 years in length. 

 

Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state the value for the upper 95 percent confidence bound of their estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.

 

References:

Aldrin, M., et al., 2012. Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperature and global ocean heat content. Environmetrics, doi: 10.1002/env.2140.

Annan, J.D., and J.C. Hargreaves, 2011. On the generation and interpretation of probabilistic estimates of climate sensitivity. Climatic Change, 104, 423-436.

Hargreaves, J.C., et al., 2012. Can the Last Glacial Maximum constrain climate sensitivity? Geophysical Research Letters, 39, L24702, doi: 10.1029/2012GL053872.

Lewis, N. 2013. An objective Bayesian, improved approach for applying optimal fingerprint techniques to estimate climate sensitivity. Journal of Climate, doi: 10.1175/JCLI-D-12-00473.1.

Lewis, N., and J.A. Curry, 2014. The implications for climate sensitivity of AR5 forcing and heat uptake estimates. Climate Dynamics, doi: 10.1007/s00382-014-2342-y.

Lindzen, R.S., and Y-S. Choi, 2011. On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Sciences, 47, 377-390.

Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Masters, T., 2013. Observational estimates of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models. Climate Dynamics, doi: 10.1007/s00382-013-1770-4.

McKitrick, R., 2014. HAC-Robust Measurement of the Duration of a Trendless Subsample in a Global Climate Time Series. Open Journal of Statistics, 4, 527-535, doi: 10.4236/ojs.2014.47050.

Michaels, P.J., et al., 2002. Revised 21st century temperature projections. Climate Research, 23, 1-9.

Otto, A., F. E. L. Otto, O. Boucher, J. Church, G. Hegerl, P. M. Forster, N. P. Gillett, J. Gregory, G. C. Johnson, R. Knutti, N. Lewis, U. Lohmann, J. Marotzke, G. Myhre, D. Shindell, B. Stevens, and M. R. Allen, 2013. Energy budget constraints on climate response. Nature Geoscience, 6, 415-416.

Ring, M.J., et al., 2012. Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences, 2, 401-415, doi: 10.4236/acs.2012.24035.

Schmittner, A., et al., 2011. Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum. Science, 334, 1385-1388, doi: 10.1126/science.1203513.

Skeie, R.B., T. Berntsen, M. Aldrin, M. Holden, and G. Myhre, 2014. A lower and more constrained estimate of climate sensitivity using updated observations and detailed radiative forcing time series. Earth System Dynamics, 5, 139-175.

Spencer, R.W., and W.D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi: 10.1007/s13143-014-0011-z.

van Hateren, J.H., 2012. A fractal climate response function can simulate global average temperature trends of the modern era and the past millennium. Climate Dynamics, doi: 10.1007/s00382-012-1375-3.
