Andrew M. Grossman
In what may be the biggest business case of the term, the Supreme Court today declined to show its hand.
Halliburton v. Erica P. John Fund, Inc. takes aim at shareholder class actions, a field of law that the Court itself created in a 1988 case, Basic v. Levinson. A four-justice majority in Basic held that shareholders suing over misrepresentations may prove that they relied on the false statements—a necessary element of any fraud suit—by presumption: if the market for the stock in question is more-or-less efficient, their reliance on any misrepresentations that are baked into the price of the stock may be presumed. Without this presumption, each shareholder would have to individually demonstrate his or her actual knowledge of the misrepresentation and actual reliance upon it, precluding the kind of “commonality” required to bring a class action.
Basic came at the tail-end of the Court’s decades-long experiment in policymaking by creating and defining the contours of civil actions. Where Congress passed remedial laws—here, Section 10(b) of the Securities Exchange Act of 1934—the Court would often read into them “implied” causes of action allowing private litigants to bring suit and seek damages over alleged infractions that would otherwise be left to regulators.
The test of time has shown that the Court is ill-suited to this function, particularly in the securities-law context. Since Basic, stock-drop class actions have boomed, and attaining class certification (merely by relying on Basic’s presumption that the issues at play are common to class members) just about guarantees a settlement. But there has been commensurately little benefit to shareholders, who are, in the end, the ones who wind up paying any damages or settlements, with the lawyers skimming off a good portion. In other words, these suits are very likely a net negative for shareholders—which may explain why Congress has never authorized them legislatively. And in a 2013 decision, four of the Court’s conservatives stated their willingness to reconsider Basic.
Chief Justice Roberts, however, kept his own counsel then, and that is what he did today. His few questions, most directed at the plaintiffs’ counsel David Boies and Malcolm Stewart, arguing for the government in support of the plaintiffs, focused on concrete results.
“Don’t most cases settle immediately following certification and so never reach the merits?” the Chief Justice asked Boies. Similarly, Chief Justice Roberts asked Stewart, “And when the Court decided Basic, were high-quality ‘event studies’ available by which plaintiffs could show that the heart of Basic’s presumption—that a given misrepresentation affected the stock price—holds true?”
Justices Alito and Kennedy, who led the questioning among the justices skeptical of Basic, caught the Chief’s drift. They pushed hard on a fallback argument by the defendant (also made by several law professors) that would require shareholder plaintiffs, at the certification stage, to prove that all class members were injured in the same way by the misrepresentation—that is, to show that the misrepresentation caused a price impact. That showing, the defendant’s counsel Aaron Streett argued, is the “glue” that holds the class together. Even Justices Breyer and Sotomayor, assumed to be unlikely pickups for the defendants, expressed some sympathy for the idea that price impact is at least relevant to demonstrating commonality at the certification stage, though they wavered on whose burden it should be: the plaintiffs’ to prove it, or the defendants’ to rebut it. And Stewart conceded that requiring plaintiffs to make such a showing, such as through an event study, would have no negative effects, given that (under any view of the law) plaintiffs must make that showing at some point in the case—albeit, as currently understood, after certification, during the merits phase, if the case has not already settled.
The chief barrier to overturning Basic may not be its logic, its wisdom, or even its correctness as a matter of law, but instead stare decisis—that is, the Court’s respect for its prior decisions, particularly where they interpret statutes that Congress may subsequently reverse through legislation. Justice Kagan, in particular, seemed to suggest by her questioning that any changes since Basic have been minor and do not justify the Court’s upending settled law. The Chief Justice as well suggested that the Court is not well-suited to track developments in the field of economics that might undermine Basic and instead should leave that task to Congress.
If one had to make a prediction, it is that Basic’s presumption of shareholder reliance will continue in force, but with a new requirement of price impact engrafted upon it. While that result would not correct the Court’s initial mistake of creating and then expanding a kind of lawsuit that Congress never envisioned, it would at least limit the damage and do so in a way that is consistent with the Court’s overall jurisprudence on implied rights of action and class action certification. That would cut down on abusive litigation, while leaving the bigger questions of policy to Congress—where they rightly belong.
Yesterday’s budget from President Obama claims to raise taxes by $650 billion, in addition to the $650 billion in tax hikes from January 2013. However, careful analysis shows that the president wants much more money from Americans’ pocketbooks. The exact amount isn’t entirely clear, owing to the games the Office of Management and Budget is playing with its various tables, but if the president had his way, more than $1 trillion in tax hikes would be coming.
Here is just a sample of the tax hikes the president proposes:
- “Buffett Tax” ($53 billion): President Obama resurrected this tax, which would require high-income individuals to pay at least 30% of their income in taxes.
- Limiting tax deductions ($598 billion): President Obama would also limit the value of itemized deductions for high-income earners.
- Changes to the “Death Tax” ($131 billion): The president suggests going back to the estate tax rules of 2009 which would increase the marginal tax rate on estates and lower the exemption, subjecting more assets to taxation.
- Changes to oil and gas taxation ($44 billion): Frequently criticized by the president, these tax provisions are not subsidies to oil and gas companies, but instead ameliorate the tax code’s improper treatment of capital expenditures.
- Changes to international taxation ($276 billion): Instead of moving the United States to a territorial tax system like the rest of the industrialized world, the president proposes further raising taxes on corporations with overseas earnings.
- Cap on 401(k)/IRA Contributions ($28 billion): This provision would prohibit individuals from contributing to retirement accounts if the balance is greater than $3 million.
- Increase in tobacco taxes ($78 billion): To pay for his universal pre-K proposal, President Obama would increase the tobacco tax from $1.10/pack to $1.95/pack.
Like so many other sections of the budget, the president is trotting out old, tired, blame-the-rich rhetoric instead of tackling the country’s real problems.
Utah Constitutional Amendment 3, passed by referendum in 2004, states that no union other than one between a man and a woman may be recognized as a marriage. Derek Kitchen and five co-plaintiffs took issue with this definition and filed a lawsuit in federal district court last year to challenge the gay marriage ban. In a surprising and widely publicized December 2013 ruling, the court invalidated the amendment, finding that such a restriction was an affront to equal protection and the fundamental right to marry.
Meanwhile, Mary Bishop and Sharon Baldwin also filed a federal suit to challenge a similar provision that was added to Oklahoma’s constitution by referendum in 2004. Like Utah’s district court, the Oklahoma district court found the amendment unconstitutional. Following on the heels of last term’s Supreme Court ruling in United States v. Windsor—which struck down part of the Defense of Marriage Act—these ground-breaking red-state cases are now both before the U.S. Court of Appeals for the Tenth Circuit, which will consider the constitutionality of a state’s decision to exclude same-sex unions from the definition of marriage.
Reprising our collaboration in Hollingsworth v. Perry—the Prop 8 case in which the Supreme Court avoided ruling on the merits—Cato and the Constitutional Accountability Center have filed a brief supporting the Utah and Oklahoma plaintiffs’ fight for equality under the law in their respective challenges. We argue that the Equal Protection Clause of the Fourteenth Amendment was intended to protect against precisely the type of arbitrary and invidious singling-out that the Utah and Oklahoma marriage restrictions effect; that the original meaning of the Equal Protection Clause confirms that its protections are to be interpreted broadly; and that the clause provides every person the equal right to marry a person of his or her choice. We believe that the Utah and Oklahoma constitutional amendments conflict with the equal protection rights of those same-sex couples whose unions are treated differently than those of opposite-sex couples.
Every person has the right to choose whom to marry, and to have that decision respected equally by the state in which they live. Especially in the wake of Windsor, it is becoming clearer that laws like these that force same-sex unions into second-class status have no place in a free society. The Tenth Circuit should affirm the district courts’ decisions.
With briefing in Kitchen v. Herbert and Bishop v. Smith now complete, the Tenth Circuit will be hearing argument shortly, with a decision expected in late spring or summer.
This blogpost was co-authored by Cato legal associate Julio Colomba.
It seems mind-boggling. Minnesota public school staff forced a barefoot teenage girl in a wet bathing suit to stand outside in sub-zero weather until she developed frostbite.
It happened around 8:30 a.m. Wednesday at Como Park High School in St. Paul. Fourteen-year-old Kayona Hagen-Tietz says she was in the school’s pool when the fire alarm went off.
While other students had gotten out earlier and were able to put on dry clothes, Hagen-Tietz said she was rushed out with just her towel.
On Wednesday morning, the temperature was 5 below, and the wind chill was 25 below.
A teacher prevented her from getting her clothes from her locker because the rules stipulate that everyone must immediately leave the building in the event of a fire alarm. Shivering, the student pleaded to be allowed to go inside a car or another building but her request was denied.
Hagen-Tietz asked to wait inside an employee’s car, or at the elementary school across the street. But administrators believed that this would violate official policy, and could get the school in trouble, so they opted to simply let the girl freeze.
Students huddled around her and a teacher gave her a coat, but she stood barefoot for ten minutes before obtaining permission to sit in a vehicle. By that point, she had already developed frostbite.
How can something like this happen? Public choice theory offers an important insight: even in the public sector, people tend to act in their own self-interest and respond to incentives.
In this case, each of the rules makes perfect sense in the abstract. The fire alarm may signal a real danger so exiting the building with alacrity is essential. Likewise, rules about keeping students on school property and out of adults’ vehicles reflect legitimate concerns about student safety. And yet, in enforcing these safety rules rigidly in an extreme case where they should not apply, the teachers and administrators violated the rules’ very intention.
Government schools and their employees are not held accountable to parents, but to bureaucrats and their top-down rules. No doubt the school staff were well-intentioned—they recognized the harm this girl was suffering and almost certainly wanted to help her—but they were more afraid of violating the rules. Though the rules were written to protect students, in this case the rules were inimical to a student’s well-being, yet the staff still chose compliance over common sense.
No school is perfect—no human institution is—but incidents like these are much less likely to occur when schools are held directly accountable to parents. The only way to make government schools directly accountable to parents, as their private counterparts already are, is to enact educational choice programs that empower parents to vote with their (non-frostbitten) feet.
Daniel J. Mitchell
The President’s new budget has been unveiled.
There are lots of provisions that deserve detailed attention, but I always look first at the overall trends. Most specifically, I want to see what’s happening with the burden of government spending.
And you probably won’t be surprised to see that Obama isn’t imposing any fiscal restraint. He wants spending to increase more than twice as fast as needed to keep pace with inflation.
What makes these numbers so disappointing is that we learned last month that even a modest bit of spending discipline is all that’s needed to balance the budget.
By the way, you probably won’t be surprised to learn that the President also wants a $651 billion net tax hike.
P.S. Since we’re talking about government spending, I may as well add some more bad news.
I’ve shared some really outrageous examples of government waste, but here’s a new example that has me foaming at the mouth. Government bureaucrats are flying in luxury and sticking taxpayers with big costs. Here are some of the odious details from the Washington Examiner.
What can $4,367 buy? For one NASA employee, it bought a business-class flight from Frankfurt, Germany, to Vienna, Austria. Coach-class fare for the same flight was $39. The federal government spent millions of dollars on thousands of upgraded flights for employees in 2012 and 2013, paying many times more for business and first-class seats than the same flights would have cost in coach or the government-contracted rate. …Agencies report their premium travel expenses to the General Services Administration each year. These reports were obtained by the Washington Examiner through Freedom of Information Act requests. …The most common reasons across agencies for such “premium” flights in 2012 and 2013 were medical necessities and flights with more than 14 hours of travel time.
By the way, “medical necessities” is a loophole that can be exploited. All too often, bureaucrats get excuses from their doctors saying that they have bad backs (or something similarly dodgy) and that they require extra seating space.
But I’m digressing. It’s sometimes hard to focus when there are so many examples of foolish government policy.
Let’s look at more examples of taxpayers getting ripped off.
One such flight was a trip from Washington, D.C., to Brussels, Belgium, which cost $6,612 instead of $863. Similar mission-required upgrades included several flights to Kuwait for $6,911 instead of $1,471, a flight from D.C. to Tokyo for $7,234 instead of $1,081 and a trip from D.C. to Paris for $6,037 instead of $477. …NASA employees also racked up a long list of flights that cost 26, 72 and even 112 times the cost of coach fares, according to Examiner calculations. Several space agency employees flew from Oslo, Norway, to Tromso, Norway – a trip that should have cost $65. Instead, each flew business class for $4,668. Another NASA employee flew from Frankfurt, Germany, to Cologne, Germany, for $6,851 instead of $133, a flight that cost almost 52 times more than the coach fare. …One flight from D.C. to Hanoi, Vietnam, for an informational meeting cost $15,529 instead of $1,649, according to the agency’s 2012 report.
Frankfurt to Cologne for $6,851 and a domestic flight in Norway for $4,668?!? Did these trips include caviar and a masseuse? Were the planes made of gold?
I do enough international travel to know that these prices are absurd, even if you somehow think bureaucrats should get business class travel (and they shouldn’t).
And as you might suspect, much of the travel was for wasteful boondoggles.
Department of the Interior employees, for example, flew to such exotic locations as Costa Rica, Denmark, Japan and South Africa in 2012. …The Department of Labor sent employees to places like Vietnam and the Philippines for “informational meetings,” conferences and site visits.
The one sliver of good news is that taxpayers didn’t get mistreated to the same extent last year as they did the previous year.
The agencies spent $5.7 million in 2012, almost double the $3 million they paid for premium travel in 2013.
The moral of the story is that lowering overall budgets, as happened in 2013, is the only effective way of reducing waste.
Moments ago, the President released his fiscal year 2015 budget request. Sadly, for those who support smart, sensible budgeting, the President’s budget is nothing to celebrate. The budget increases spending and fails to tackle the true drivers of our budget problem—entitlement spending. All deficit reduction included in the budget is from revenue increases, not spending cuts.
Over the next several days, we’ll be analyzing the President’s budget in full detail, but here are the top-line numbers.
1. The President’s budget proposes spending more than the Ryan-Murray budget passed in December. Under the agreement reached by Budget Chairs Congressman Ryan and Senator Murray, and supported by the President, discretionary spending for fiscal year 2015 should be $1.014 trillion. The President’s budget includes a section that bumps that up by $56 billion, paid for mostly by tax increases.
2. Over the ten-year budgetary window, the President spends $171 billion more than Congressional Budget Office (CBO) projections. The budget also does not reach balance and runs deficits every single year.
3. According to Obama’s budget, the federal government will collect $3.3 trillion in tax revenue this year, more than in any other year in history. The budget includes $650 billion in new revenue through various distortionary tax hikes. The President’s Office of Management and Budget also made rosier assumptions about the growth of the economy over the next ten years. As a result, over the ten-year budgetary window, the President’s budget collects $3.1 trillion more in revenue than CBO assumed in February.
Steve H. Hanke
Last year, Nicholas Krus and I published a chapter, “World Hyperinflations”, in the Routledge Handbook of Major Events in Economic History. We documented 56 hyperinflations – cases in which the monthly inflation rate exceeded 50%. Only seven of those hyperinflations have savaged Latin America (see the accompanying table).
At present, the world’s highest inflation resides in Latin America, namely in Venezuela. The Johns Hopkins – Cato Institute Troubled Currencies Project, which I direct, estimates that Venezuela’s implied annual inflation rate is 302%. Will Venezuela be the eighth country to join the Latin American Hall of Shame? Maybe. But, it has a long way to go.
The Hanke-Krus Hyperinflation Table
Latin American edition
Source: Steve H. Hanke and Nicholas Krus (2013), “World Hyperinflations”, in Randall Parker and Robert Whaples (eds.) Routledge Handbook of Major Events in Economic History, London: Routledge Publishing.
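To get a feel for the scale of that threshold, note that monthly inflation rates compound multiplicatively: a constant 50% per month works out to roughly 12,875% per year, dwarfing even Venezuela’s 302%. A minimal sketch of the arithmetic (the figures here are just the threshold itself, not data from the chapter):

```python
# Compounding a constant monthly inflation rate into an annual rate.
# 50% per month is the hyperinflation threshold used in the chapter.
monthly_rate = 0.50

# Prices multiply by (1 + rate) each month, twelve times a year.
annual_rate = (1 + monthly_rate) ** 12 - 1

annual_pct = annual_rate * 100  # roughly 12,875% per year
```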
Ukrainians won an important political battle by ousting the corrupt Viktor Yanukovich as president. But replacing Yanukovich with another dubious politico will change little.
Washington also triumphed. Without doing much—no troops, no money, few words—Americans watched protestors frustrate Russia’s Vladimir Putin.
But now Russia is attempting to win as well, intervening in Crimea. Moscow has created a tinderbox ready to burst into flames. The only certainty is that the U.S. should avoid being drawn into a war with Russia.
In 2010 Yanukovich triumphed in a poll considered to be fair if not entirely clean. His corrupt proclivities surprised no one. However, while tarred as pro-Russian, Yanukovich refused to sign on to the Moscow-led Customs Union even as he accepted Putin’s largesse last November.
Still, protestors filled Maidan Square in Kiev over Yanukovich’s rejection of a trade agreement with the European Union. As I point out in my latest Forbes column: “The issue, in contrast to Kiev’s later brutal treatment of protestors, had nothing to do with democracy, human rights, or even sovereignty.” As such, it was not America’s business, but up to the Ukrainian people.
And Ukraine is divided. Broadly speaking, the nation’s west is nationalist and leans European while the east is Russo-friendly.
Demonstrations quickly turned into a de facto putsch or street revolution. Yanukovich’s ouster was a gain for Ukraine, but similar street violence could be deployed against better elected leaders in the future.
Moreover, many of those who look east and voted for Yanukovich feel cheated. There was no fascist coup, but the government they helped elect was violently overthrown. Some of them, especially in Crimea, prefer to shift their allegiance to Russia.
Kiev should engage disenfranchised Yanukovich backers. Kiev also should reassure Moscow that Ukraine will not join any anti-Russian bloc, including NATO. But if Crimeans, in particular, want to return to Russia, they should be able to do so.
There is no important, let alone vital, security issue at stake for the U.S. in the specific choices Ukrainians make. The violent protests against the Yanukovich government demonstrate that Moscow has no hope of dominating the country. Kiev will be independent and almost certainly will look west economically.
Russia could still play the new Great Game. Unfortunately, rather than play, Vladimir Putin upended the board by taking effective control of the Crimea.
Yet Putin tossed aside his trump card, a planned referendum by Crimea’s residents. A majority secession vote would have allowed him to claim the moral high ground. However, an election conducted under foreign occupation lacks credibility.
As it stands Russia has committed acts of aggression and war.
Even in the worst case the U.S. has no cause for military intervention. Who controls the Crimea ain’t worth a possible nuclear confrontation.
Putin is a nasty guy, but Great Power wannabe Russia is no ideologically-driven superpower Soviet Union. Moscow perceives its vital interests as ensuring regional security, not winning global domination. Yet bringing Ukraine into NATO would have created a formal legal commitment to start World War III.
The allies should develop an out for Russia. For instance, Moscow withdraws its forces while Kiev schedules independence referendums in Russian-leaning areas.
If Putin refuses to draw back, Washington and Brussels have little choice but to retaliate. The allies could impose a range of sanctions, though most steps, other than excluding Russian banks from international finance, wouldn’t have much impact.
Tougher would be banning investment and trade, though the Europeans are unlikely to stop purchasing natural gas from Moscow. The other problem is the tougher the response the more likely Russia would harm American interests elsewhere, including in Afghanistan, Iran, and Korea.
The Ukrainian people deserve a better future. But that is not within Washington’s power to bestow. Today the U.S. should concentrate on pulling Russia back from the brink in Ukraine.
A new cold war is in no one’s interest. A hot war would be a global catastrophe.
Should parents helping their child’s teacher put on a short class party have to submit to a background check first? Is it child endangerment to leave your toddler in the car for a few minutes on a mild day while you run into a shop? If your child gets hurt falling off a swing, is it potential child neglect not to sue every solvent defendant in sight? Should police have arrested a dad who walked into school at pickup time rather than wait outside for his kids as he was supposed to?
Author Lenore Skenazy has led the charge against the forces of legal and societal overprotectiveness in her book Free-Range Kids and at her popular blog of the same name. This Thursday, March 6 – rescheduled from a weather-canceled event originally set for last month – she’ll be the Cato Institute’s guest for a lunchtime talk on helicopter parenting and its near relation, helicopter governance; I’ll be moderating and commenting. The event is free and open to the public, but you need to register, which you can do here. You can also watch online live at this link.
If you only read one Cato brief this Supreme Court term, it should be this one.
Believe it or not, it’s illegal in Ohio to lie about politicians, for politicians to lie about other politicians, or for politicians to lie about themselves. That is, it violates an election law—this isn’t anything related to slander or libel, which have higher standards of proof for public figures—to make “false statements” in campaign-related contexts.
During the 2010 House elections, a pro-life advocacy group called the Susan B. Anthony List (SBA List) published ads in Ohio claiming that then-Rep. Steven Driehaus, who was running for re-election, had voted to fund abortions with federal money (because he had voted for Obamacare). Rather than contesting the truth of these claims in the court of public opinion, Driehaus filed a complaint with the Ohio Election Commission (OEC) under a state law that makes it a crime to “disseminate a false statement concerning a candidate, either knowing the same to be false or with reckless disregard of whether it was false.”
While the complaint was ultimately dropped, the SBA List took Driehaus and the OEC to federal court, seeking to have this law declared unconstitutional and thus enable advocacy groups to have more freedom going forward. The case has now reached the Supreme Court.
Joined by legendary satirist (and Cato’s H.L. Mencken Research Fellow) P.J. O’Rourke, our brief supports the SBA List and reminds the Court of the important role that “truthiness”—facts you feel in your heart, not in your head—plays in American politics, and the importance of satire and spin more broadly. We ask the Court a simple yet profound question: Doesn’t the First Amendment’s guarantee of free speech protect one man’s truth even if it happens to be another man’s lie? And who’s to judge—and on what scale—when a statement slides “too far” into the realm of falsehood?
However well intentioned Ohio legislators may have been, laws that criminalize “false” speech don’t replace truthiness and snark with high-minded ideas and “just the facts.” Instead, they chill speech, replacing the sort of vigorous political dialogue that’s at the core of the democratic process with silence. The Supreme Court of all institutions should understand that just because a statement isn’t fully true doesn’t mean it has no place in public discourse. Moreover, pundits and satirists are much better placed to evaluate and send up half-truths than government agencies are.
The Supreme Court will hear argument in Susan B. Anthony List v. Driehaus on April 22.
K. William Watson
Cato’s congressional trade votes database now includes votes from last year on major trade bills and amendments in both houses of Congress. The purpose of the database is to educate the public about the trade policy preferences of individual members. We do that by recording their votes on major trade bills and amendments and using the data to map a broader ideological profile.
Whether a particular member qualifies as a free trader, an isolationist, an internationalist, or an interventionist based on our methodology depends on their support for (or opposition to) trade barriers and subsidies.
In previous years, the farm bill and its various amendments have provided a treasure trove of vote data to pin down members’ proclivities on specific commodities and willingness to use public money to distort the economy for the benefit of select cronies. This year was no different, except that votes taken in the House of Representatives on the full package bill have been excluded. Those votes hinged almost entirely on the issue of food stamps, and because the purpose of the database is to reveal members’ trade policy positions, including them in the database would be inappropriate.
That doesn’t mean, of course, that you shouldn’t be dismayed by Republicans who, after successfully removing food stamps from the bill so that productive debate could be had on reforming farm programs, nevertheless voted en masse to continue our Soviet-style agriculture policy with no significant change.
The new votes on the site include the Senate farm bill, failed votes in both houses to reform the sugar program, an amendment to avoid protectionist regulations on imported olive oil, an extension of “Buy American” policies in government procurement, and a continuation of export marketing subsidies for wealthy agribusiness.
I encourage you to check out the site, read up on our unique methodology, and find out just how protectionist your favorite (or least favorite) member of Congress really is.
Patrick J. Michaels and Paul C. "Chip" Knappenberger
The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.
With all the stern talk about global warming and widespread concern over climate change, you would think that we humans would have a propensity for cooler temperatures. Everywhere you look, the misery that rising temperatures (and the associated evils) will supposedly heap upon us seems to dominate reports about the coming climate. But do patterns of population movement really support the idea that we prefer cooler locations?
Since 1900, the population of the United States increased from about 76 million people to about 309 million people in 2010. Accompanying that population growth were major advances in technology and industry, including vast improvements in our nation’s system of transportation. As planes, trains, and automobiles replaced the horse and buggy, Americans became more mobile, and where we live was no longer connected primarily with proximity to where we were born. Instead, we became much freer to choose our place of residence based on considerations other than ease of getting there.
Where has our new-found freedom of mobility led us? Figure 1 shows the rate of population change from 1900 to 2010 for each of the contiguous 48 states. Notice the increases both in states with warm climates, such as Florida, Texas, and California, and in states with big industry (that is, jobs), such as New York, Michigan, and Ohio.
Figure 1. The state-by-state population trend (people/year) from 1900 to 2010 (data from U.S. Census Bureau).
Which states are people less likely to choose to live in? States such as North Dakota, South Dakota, Montana, Maine, Vermont—all of which have harsh climates and low temperatures.
Comparing a map of the change in population (Figure 1) with a map depicting the average temperature of each state (Figure 2) reveals a pretty strong indication that people seem to be seeking out warmer states.
Figure 2. The state-by-state average annual temperature for the period 1900-2010 (statewide temperature data available from the U.S. National Climatic Data Center).
Another way of looking at human temperature preferences is to calculate what we’ll call the “average experiential temperature”—that is, the annual temperature that the average person living in the lower 48 states experiences each year. We can calculate this value by first multiplying the average temperature in each state during a particular year by the state’s population in the same year. Then we sum this product across the 48 contiguous states, and finally divide this sum by the total population of the country. In other words, the temperature in states with larger populations weighs more heavily on the national composite experiential temperature than does the temperature in states with sparser populations. As the population of the country redistributes itself over time, we can track how the average person’s climate changes.
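As a concrete sketch of that weighting scheme, here is the calculation for a hypothetical three-state country. The temperatures and populations below are illustrative made-up inputs, not the Census and NCDC figures used for Figure 3:

```python
# Population-weighted ("experiential") average temperature:
# each state's temperature counts in proportion to how many
# people live there.
states = {
    # state: (average annual temperature in degrees F, population)
    "Florida":   (70.7, 18_800_000),
    "Minnesota": (41.2, 5_300_000),
    "Texas":     (64.8, 25_100_000),
}

total_pop = sum(pop for _, pop in states.values())
experiential_temp = (
    sum(temp * pop for temp, pop in states.values()) / total_pop
)
# With these made-up inputs the result lands near the warm, populous
# states: about 64.5 F, well above the simple three-state mean of 58.9 F.
```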
When we do that for each year from 1900 to 2013, we get the result shown in Figure 3—a steadily rising temperature. In fact, the average experiential temperature has risen by a total of about 3.85°F over the course of the last 114 years (a rate of 0.34°F per decade).
Figure 3. The average experiential temperature of the population of the United States, 1900 to 2013.
But the history of experiential temperatures alone can’t tell us whether the increase has been unwillingly forced upon us by a large-scale warming of the climate from, say, an enhanced greenhouse effect, or whether the change results from Americans seeking out warmer locales of their own accord.
U.S. Average Temperature
To answer this question, we must calculate the area-weighted average temperature of the United States—that is, the combination of the yearly average temperature within each state weighted by that state’s total area. In this case, it is the size of the state, rather than the size of its population, that matters—the bigger the state, the bigger its contribution to the nationwide average.
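The recipe is the same as before, with each state’s land area substituted for its population as the weight. Again with placeholder values (the areas below are rounded illustrations in square miles, not the figures used in our calculation):

```python
# Area-weighted U.S. average temperature for one year.
# Temperatures (F) and land areas (sq mi) are illustrative only.
states = {
    "Florida":      (70.7,  65_758),
    "North Dakota": (40.4,  70_698),
    "California":   (59.4, 163_695),
}

weighted_sum = sum(temp * area for temp, area in states.values())
total_area = sum(area for _, area in states.values())
area_avg_temp = weighted_sum / total_area
```

Note how the result differs from the population-weighted version: sparsely populated North Dakota now counts for slightly more than densely populated Florida, pulling the average down.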
The result of this calculation is a quite different-looking temperature history. In Figure 4, we included the annual U.S. average temperature history along with the annual U.S. “experiential” temperature from Figure 3. We see that, while the United States’ actual temperature has fluctuated a bit, experiencing warm decades such as the 1930s and 1990s and cold ones such as the 1910s and 1970s, it has increased only slightly during the 20th century—about 0.90°F (a rate of 0.08°F/decade).
Figure 4. Average temperature of the United States, 1900 to 2013.
For what it’s worth, when you calculate the national temperature this way (using the state-by-state temperature data from the National Climatic Data Center, NCDC), you get a heckuva lot less warming than is in the “official” NCDC record put out by the U.S. Department of Commerce. The difference lies in the “adjustments” plastered on to the original data. Both records are adjusted for a bias related to the time of day at which the previous 24-hour highs and lows are recorded. It’s complicated, but it does slightly alter the data.
But the official version is additionally massaged more than—well, we can’t say in polite company. A laundry list can be found here. The sum of all of those adjustments is to put about twice as much warming in the record as is in our state-averaged plot.
Seeking the Heat
Although there has been a slight warm-up of the actual temperature, that rise is nowhere near the increase in the experiential temperature. In fact, the average experiential temperature has climbed at a rate more than four times that of the U.S. average temperature—which is what the experiential temperature would have been had the population distribution not changed at all. That means that Americans have actively been moving to warmer climates. And there is every indication that they are continuing to do so, as evidenced by the strong rise in experiential temperatures during the past 20 or 30 years.
While climatologists have not generally appreciated this fact, it has long been recognized by sociologists. As both people’s mobility and their ability to select the climate they prefer have increased throughout this past century, the core of the U.S. population has moved southward—into warmer climates. The overall migration of people into the southern “Sunbelt” states has created a temperature change over time for the “average American” that far outstrips the most pessimistic measurements of global warming for the past century, and rivals the projections for the next!
Apparently, people–or Americans at least–seem to prefer a warmer climate to a cooler one. Next time climate prognosticators warn of the perils of rising temperatures, remember this: when given the means and a choice, some (or rather, most) like it hot!
(Special thanks to Robert C. Balling Jr. and Randy Cerveny, who assisted with early versions of this research.)
Paul Krugman weighed in yesterday on the Trans Pacific Partnership (TPP). I agree with one of his points; I disagree with another.
First, the disagreement: Krugman claims protectionism is mostly gone, and thus the TPP is not all that important:
The first thing you need to know about trade deals in general is that they aren’t what they used to be. The glory days of trade negotiations—the days of deals like the Kennedy Round of the 1960s, which sharply reduced tariffs around the world—are long behind us.
Why? Basically, old-fashioned trade deals are a victim of their own success: there just isn’t much more protectionism to eliminate. Average U.S. tariff rates have fallen by two-thirds since 1960. The most recent report on American import restraints by the International Trade Commission puts their total cost at less than 0.01 percent of G.D.P.
Tariffs on certain goods are still quite high. A publication called World Tariff Profiles illustrates this nicely. If you look at p. 170 for U.S. statistics, you will see tariff duties for four general product categories of over 10%. You’ll also see maximum tariffs (i.e., the high tariff on particular products) of over 100%!
And if you look at the duty rates for other countries, they are generally much higher.
And none of that includes special “trade remedy” tariffs (anti-dumping, countervailing duties, safeguards), subsidies, discriminatory government procurement, or domestic laws and regulations that discriminate (such as local content requirements).
So, protectionism is alive and well.
Turning to the part where I agree with him, he says:
But the fact remains that, these days, “trade agreements” are mainly about other things. What they’re really about, in particular, is property rights—things like the ability to enforce patents on drugs and copyrights on movies. And so it is with T.P.P.
… Is this a good thing from a global point of view? Doubtful. The kind of property rights we’re talking about here can alternatively be described as legal monopolies. True, temporary monopolies are, in fact, how we reward new ideas; but arguing that we need even more monopolization is very dubious—and has nothing at all to do with classical arguments for free trade.
Now, the corporations benefiting from enhanced control over intellectual property would often be American. But this doesn’t mean that the T.P.P. is in our national interest. What’s good for Big Pharma is by no means always good for America.
I don’t have much to add to his points, which I think are pretty good ones. In my view, there’s a need for a real debate on how much intellectual property protection is appropriate (and, in fact, we will be discussing this here at Cato next week). Unfortunately, that’s not what we are getting either domestically or in the international trade context, where it seems that more is always better.
Daniel J. Mitchell
To make fun of big efforts that produce small results, the Roman poet Horace wrote, “The mountains will be in labor, and a ridiculous mouse will be brought forth.”
That line sums up my view of the new tax reform plan introduced by Rep. Dave Camp (R-Mich.), chairman of the House Ways and Means Committee.
Back in 1995, tax reform was a hot issue. The House Majority Leader, Dick Armey, had proposed a flat tax. Congressman Billy Tauzin was pushing a version of a national sales tax. And there were several additional proposals jockeying for attention.
To make sense of the clutter, I wrote a paper for the Heritage Foundation that demonstrated how to grade the various proposals.
As you can see, I included obvious features such as low tax rates, simplicity, double taxation, and social engineering, but I also graded plans based on other features such as civil liberties, fairness, and downside risk.
There obviously have been many new plans since I wrote this paper, most notably the Fair Tax (a different version of a national sales tax than the Tauzin plan), Simpson-Bowles, the Ryan Roadmap, Domenici-Rivlin, the Heritage Foundation’s American Dream proposal, the Baucus-Hatch blank slate, and—as noted above—the new tax reform plan by Chairman Camp.
Given his powerful position as head of the tax-writing committee, let’s use my 1995 methodology to assess the pros and cons of Camp’s plan.
Rates: The top tax rate for individual taxpayers is reduced from 39.6 percent to 35 percent, which is a disappointingly modest step in the right direction. The corporate tax rate falls from 35 percent to 25 percent, which is more praiseworthy, though Camp doesn’t explain why small businesses (which file using the individual income tax) should pay higher rates than large companies.
Simplicity: Camp claims that he will eliminate 25 percent of the tax code, which certainly is welcome news since the code has swelled to 70,000-plus pages of loopholes, exemptions, deductions, credits, penalties, exclusions, preferences, and other distortions. And his proposal does eliminate some deductions, including the state and local tax deduction (which perversely rewards states with higher fiscal burdens).
Saving and Investment: Ever since Reagan slashed tax rates in the 1980s, the most anti-growth feature of the tax code has probably been the pervasive double taxation of income that is saved and invested. Shockingly, the Camp plan worsens the tax treatment of capital, with higher taxation of dividends and capital gains and depreciation rules that are even more onerous than current law.
Social Engineering: Some of the worst distortions in the tax code are left in place, including the health care exclusion for almost all taxpayers. This means that people will continue to make economically irrational decisions solely to benefit from certain tax provisions.
Civil Liberties: The Camp plan does nothing to change the fact that the IRS has both the need and power to collect massive amounts of private financial data from taxpayers. Nor does the proposal end the upside-down practice of making taxpayers prove their innocence in any dispute with the tax authorities.
Fairness: In a non-corrupt tax system, all income is taxed, but only one time. On this basis, Camp’s plan is difficult to assess. Loopholes are slightly reduced, but double taxation is worse, so it’s hard to say whether the system is more fair or less.
Risk: The plan contains no value-added tax, and the absence of a VAT is a critically important feature of any tax reform plan. As such, there is no risk the Camp plan will become a Trojan Horse for a massive expansion of the fiscal burden.
Evasion: People are reluctant to comply with the tax system when rates are punitive and/or there’s a perception of rampant unfairness. It’s possible that the slightly lower statutory rates may improve incentives to obey the law, but that will be offset by the higher tax burden on saving and investment.
International Competitiveness: Reducing the corporate tax rate will help attract jobs and investment, and the plan also mitigates some of the worst features of America’s “worldwide” tax regime.
Now that we’ve taken a broad look at the components of Camp’s plan, let’s look at the grades in comparison to the other plans I’ve reviewed over the years:
You can see why I’m underwhelmed by his proposal.
Camp’s proposal may be an improvement over the status quo, but my main reaction is, what’s the point?
In other words, why go through months of hearings and set up all sorts of working groups, only to propose a timid plan?
Now, perhaps, readers will understand why I’m rather pessimistic about achieving real tax reform.
We know the right policies to fix the tax code.
And we have ready-made plans—such as the flat tax and national sales tax—that would achieve the goals of tax reform.
Camp’s plan, by contrast, simply rearranges the deck chairs on the Titanic.
P.S.: If you need to be cheered up after reading all this, here’s some more IRS humor to brighten your day, including the IRS version of the quadratic formula, a new Obama 1040 form, a list of tax day tips from David Letterman, a cartoon of how GPS would work if operated by the IRS, a sale on 1040-form toilet paper (apparently a real product), and two satirical songs about the tax agency (here and here).
Whatever its words, a poster without a striking image is a missed opportunity, and incongruous, vaguely disturbing images often work best. (The snake is among the most unsettling creatures on earth to gaze at, yet it figures as the sympathetic subject in not one but two great American political images, the “Don’t Tread on Me” Gadsden flag and Ben Franklin’s “Join or Die.”) For World Press Freedom Day last year, a journalists’-advocacy group in Jordan came up with this simple design. Yes, today’s tyrants are more interested in clamping controls on keyboards, blogs, and cellphone transmissions, but for evocativeness it’s hard to beat the chained nib of an old-style fountain pen, trembling somewhat as if in resistance.
Today, social media and meme culture endlessly rework classic posters and poster genres for purposes of commentary and satire. That stands in a great tradition: as a means of persuasion, posters are themselves a powerful part of the press. Use them in a good cause, and enjoy them too. [Earlier entries in this series: Monday, Tuesday, Wednesday, Thursday]
Patrick J. Michaels and Paul C. "Chip" Knappenberger
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
We have two new entries to the long (and growing) list of papers appearing in the recent scientific literature that argue that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, or near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With a low-end warming comes low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) to limit carbon dioxide emissions and limit our energy choices.
The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a pretty straightforward determination of the climate sensitivity. Loehle first uses a model of natural modulations to remove the influence of natural variability (such as solar activity and ocean circulation cycles) from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural variability model was then assumed to be largely the result, in net, of human carbon dioxide emissions. By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient, in the sense that at the time of CO2 doubling, the earth has yet to reach a state of equilibrium and some warming is still to come.
Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C with a 95% confidence range of it being between 1.75°C and 2.23°C.
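Taking the described recipe literally, the arithmetic is simple enough to sketch. Every number below is an illustrative placeholder—these are not Loehle’s actual inputs, his fitted trend, or his model-derived transient-to-equilibrium ratio:

```python
# Toy version of the sensitivity calculation described above.
# All values are illustrative placeholders, not figures from Loehle (2014).
delta_T   = 0.5     # residual warming attributed to CO2 over the period, degrees C
co2_start = 310.0   # atmospheric CO2 at the start of the period, ppm
co2_end   = 395.0   # atmospheric CO2 at the end of the period, ppm

# Warming per ppm of added CO2 (linear, per the described recipe).
warming_per_ppm = delta_T / (co2_end - co2_start)

# A doubling from co2_start means adding another co2_start ppm.
transient_sensitivity = warming_per_ppm * co2_start

# Scale transient to equilibrium using a model-derived ratio (assumed here).
transient_to_equilibrium = 1.4
equilibrium_sensitivity = transient_sensitivity * transient_to_equilibrium
```

The sketch treats the temperature response as linear in CO2 concentration, which is how the recipe reads when followed step by step; the published calculation should be consulted for the actual functional form and inputs.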
Compare Loehle’s estimate to the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity which assigns a 66 percent or greater likelihood that it lies somewhere in the range from 1.5°C to 4.5°C. Loehle’s determination is more precise and decidedly towards the low end of the range.
The second entry to our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell and published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran the simulation again adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran the simulation a final time adding in a more complex situation involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.
What they found was that the complex situation involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. And this situation also produced the lowest estimate for the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.
Spencer and Braswell freely admit that using their simple model is just the first step in a complicated diagnosis, but also point out that the results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity, while estimates based in real-world observations are much lower.
Our Figure below helps to illustrate the discrepancy between climate model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included in our Figure are both the IPCC’s latest assessment of the literature and the characteristics of the equilibrium climate sensitivity from the collection of climate models on which the IPCC bases its impacts assessment.
Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state the value for the upper 95 percent confidence bound of their estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.
Quite obviously, the IPCC is rapidly losing its credibility.
As a result, the Obama Administration would do better to come to grips with this fact and stop deferring to the IPCC findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, with the combined effects of manipulating markets and restricting energy choices.
Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.
Spencer, R.W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.
From the Washington Post:
Annapolis Police Chief Michael A. Pristoop thought he came prepared when he testified before a Maryland state Senate panel on Tuesday about the perils of legalizing marijuana.
In researching his testimony against two bills before the Judicial Proceedings Committee, Pristoop said, he had found a news article to illustrate the risks of legalization: 37 people in Colorado, he said, had died of marijuana overdoses on the very day that the state legalized pot….
Trouble is, the facts were about as close to the truth as oregano is to pot. After a quick Google search on his laptop, Raskin — the sponsor of the legalization bill that was the subject of the Senate hearing — advised the chief that the Colorado overdose story, despite its deadpan delivery, had been made up for laughs by The Daily Currant, an online comedy magazine.
Ouch! For more on the momentum of marijuana law reform, check out today’s New York Times.
Today, Senate Budget Chairman Patty Murray sent her caucus a memo on the country’s fiscal outlook. She details the “$3.3 trillion in deficit reduction put in place over the last few years,” a likely refrain in President Obama’s budget next week. However, Chairman Murray’s memo leaves much to be desired.
The first section of Murray’s memo highlights the various ways that the deficit has been reduced over the last several years by Congress and the President. The bulk of savings are from the discretionary spending caps put into place by the Budget Control Act (BCA) of 2011. Another large share of deficit reduction came from $727 billion in tax increases over the last several years. All told, Murray counts $3.3 trillion in deficit reduction. This is an incredibly small step toward tackling our $4 trillion budget.
But after detailing all of the great things this fiscal restraint is doing for the country, Chairman Murray completely turns course. Instead of detailing additional ways to cut spending and continue these marginal improvements, she starts a laundry list of needed government “investments”—spending programs. She calls for more spending on infrastructure, jobs programs, a minimum wage increase, and increased funding for research and development.
Notably, she also does not include future sequester cuts in her numbers, an implicit acknowledgement that she does not plan to keep those promised cuts.
Chairman Murray’s memo also fails to acknowledge the impending fiscal crisis. According to the most recent Congressional Budget Office (CBO) report, the country’s debt and deficit are stable for the next few years, but by 2017 they increase dramatically again. CBO expects the deficit to rise to 4% of GDP by the end of the decade. Even though revenues as a percent of GDP will be close to historical levels, Chairman Murray calls for more tax hikes, ignoring the real driver of our fiscal issues: spending.
The refrain from many over the last year has been that the deficit is back under control, and that Congress should go back to spending wildly. Chairman Murray’s memo follows that path. It fails to acknowledge the need for fiscal restraint and sets the stage for next week’s release of the President’s budget.
I wish I knew more about the medical system, so that I could write more intelligently about domestic and international trade issues in this area. With the caveat that there is a lot I don’t know here, I’m excited by the following development that will hopefully allow more inter-state trade in medical services:
A significant barrier to the interstate practice of telehealth is closer to being broken down. The Federation of State Medical Boards (FSMB) has completed and distributed a draft Interstate Medical Licensure Compact, designed to facilitate physician licensure portability that should enhance the practice of interstate telehealth. Essentially, the compact would create an additional licensing pathway, through which physicians would be able to obtain expedited licensure in participating states. As the FSMB notes in its draft, the compact “complements the existing licensing and regulatory authority of state medical boards, ensures the safety of patients, and provides physicians with enhanced portability of their license to practice medicine outside their state of primary licensure.” This is a potentially significant development because burdensome state licensure requirements have been a major impediment to the interstate practice of telehealth. A physician practicing telehealth is generally required to obtain a medical license in the state where the patient—not the physician—is located. As a consequence, physicians wishing to treat patients in multiple states need to obtain a license in each of those states in order to practice medicine lawfully, a lengthy and expensive process.
Thinking even bigger, the same idea could be applied internationally.
Jeffrey A. Miron
The standard argument for occupational licensing (government-imposed limits on who can supply medical, legal, plumbing, and other services) is that such laws protect the public from low-quality provision of these services.
This argument is not convincing on its own: licensing limits the quantity of services provided, raising prices, and thus harming consumers. A necessary condition for licensing to make sense, therefore, is that any improvements in service quality outweigh the losses from higher prices.
A new study, however, finds that when de-regulation allows nurse practitioners to perform more tasks without doctor supervision, the price of well-child medical exams declines (as implied by standard economics), with no “changes … in outcomes such as infant mortality rates.”
In at least this case, therefore, licensure is all cost and no benefit.