Are you on Instagram? The Cato Institute is!
We joined the popular image-sharing site in late October. Follow us at http://instagram.com/catoinstitute.
Wondering how YOU can spread the message of liberty on Instagram? Make sure to come to this month’s New Media Lunch. Join the Cato Institute this Thursday at noon for a lunchtime presentation, followed by a roundtable discussion. Allen Gannett of TrackMaven will highlight some interesting discoveries from TrackMaven’s recently released study of Fortune 500 companies on Instagram and share tips for translating their success to the nonprofit world. Make sure to register, as space is limited.
Not in D.C.? We will be livestreaming Allen’s presentation. Just navigate to http://www.cato.org/live at noon Eastern Time this Thursday, November 21st. You can also join the conversation on Twitter using #NewMediaLunch.
Paul C. "Chip" Knappenberger and Patrick J. Michaels
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
A new paper just hit the scientific literature that argues that the apparent pause in the rise in global average surface temperatures during the past 16 years was really just a slowdown.
As you may imagine, this paper, by Kevin Cowtan and Robert Way is being hotly discussed in the global warming blogs, with reaction ranging from a warm embrace by the global-warming-is-going-to-be-bad-for-us crowd to revulsion from the human-activities-have-no-effect-on-the-climate claque.
The lukewarmers (a school we take some credit for establishing) seem to be taking the results in stride. After all, the “pause,” as curious as it is/was, is not central to the primary argument: yes, human activities are pressuring the planet to warm, but the rate of warming is going to be much slower than is projected by the collection of global climate models upon which mainstream projections of future climate change, and the resulting climate alarm (i.e., calls for emission regulations, etc.), are based.
Under the adjustments to the observed global temperature history put together by Cowtan and Way, the models fare a bit better than they do with the unadjusted temperature record. That is, the observed temperature trend over the past 34 years (the period of record analyzed by Cowtan and Way) is a tiny bit closer to the average trend from the collection of climate models used in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) than is the old temperature record.
Specifically, while the trend in observed global temperatures from 1979-2012 as calculated by Cowtan and Way is 0.17°C/decade, it is 0.16°C/decade in the temperature record compiled by the U.K. Hadley Center (the record that Cowtan and Way adjusted). Because of the sampling errors associated with trend estimation, these values are not significantly different from one another. Whether the 0.17°C/decade is significantly different from the climate model average simulated trend during that period of 0.23°C/decade is discussed extensively below.
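As an aside for readers who want to check this sort of claim themselves, here is a minimal sketch (ours, not Cowtan and Way’s) of how a decadal trend and its sampling uncertainty are estimated from a series of annual anomalies. The series below is synthetic, and a real analysis would also account for autocorrelation, which widens the interval further:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1979, 2013)  # 34 annual values, 1979-2012

# Synthetic stand-in for an observed anomaly series: a 0.16 C/decade
# trend plus year-to-year noise. Swap in real annual means (e.g.,
# HadCRUT4) to reproduce the published numbers.
anoms = 0.016 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

res = stats.linregress(years, anoms)
trend = res.slope * 10.0         # degrees C per decade
ci95 = 1.96 * res.stderr * 10.0  # approximate 95% confidence interval

print(f"trend = {trend:+.3f} +/- {ci95:.3f} C/decade")
```

A difference of 0.01°C/decade sits comfortably inside an interval of this size.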
But, suffice it to say that an insignificant difference of 0.01°C/decade in the global trend measured over more than 30 years is pretty small beer and doesn’t give model apologists very much to get happy over.
Instead, the attention is being deflected to “The Pause”—the leveling off of global surface temperatures during the past 16 years (give or take). Here, the new results from Cowtan and Way show that during the period 1997-2012, instead of a statistically insignificant rise at a rate of 0.05°C/decade as is contained in the “old” temperature record, the rise becomes a statistically significant 0.12°C/decade. “The Pause” is transformed into “The Slowdown” and alarmists rejoice because global warming hasn’t stopped after all. (If the logic sounds backwards, it does to us as well: if you were worried about catastrophic global warming, wouldn’t you rejoice at findings indicating that future climate change is going to be only modest, rather than at results to the contrary?)
The science behind the new Cowtan and Way research is still being digested by the community of climate scientists and other interested parties alike. The main idea is that the existing compilations of the global average temperature are very data-sparse in the high latitudes. And since the Arctic (more so than the Antarctic) is warming faster than the global average, the lack of data there may mean that the global average temperature trend may be underestimated. Cowtan and Way developed a methodology which relied on other limited sources of temperature information from the Arctic (such as floating buoys and satellite observations) to try to make an estimate of how the surface temperature was behaving in regions lacking more traditional temperature observations (the authors released an informative video explaining their research which may help you better understand what they did). They found that the warming in the data-sparse regions was progressing faster than the global average (especially during the past couple of years) and that when they included the data that they derived for these regions in the computation of the global average temperature, they found the global trend was higher than previously reported—just how much higher depended on the period over which the trend was calculated. As we showed, the trend more than doubled over the period from 1997-2012, but barely increased at all over the longer period 1979-2012.
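A toy illustration of the coverage-bias argument (ours, not the paper’s actual method, and with made-up numbers): if the fastest-warming band is left out of an area-weighted average, the computed “global” trend is biased low.

```python
# Toy zonal bands: (area weight, trend in degrees C per decade).
# The Arctic band warms fastest; all numbers are illustrative.
bands = [(0.04, 0.50),   # Arctic: small area, fast warming
         (0.46, 0.17),   # northern mid/low latitudes
         (0.50, 0.12)]   # southern hemisphere

full = sum(w * t for w, t in bands)

covered = bands[1:]  # drop the data-sparse Arctic band
partial = sum(w * t for w, t in covered) / sum(w for w, _ in covered)

print(f"true global trend: {full:.3f} C/decade")
print(f"coverage-biased trend: {partial:.3f} C/decade")
```

The size of the real-world effect depends, as these numbers do not, on exactly how fast the unsampled regions are actually warming, which is precisely what is in dispute.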
Figure 1 shows the impact on the global average temperature trend for all trend lengths between 10 and 35 years (incorporating our educated guess as to what the 2013 temperature anomaly will be), and compares that to the distribution of climate model simulations of the same period. Statistically speaking, instead of there being a clear inconsistency (i.e., the observed trend value falls outside of the range which encompasses 95% of all modeled trends) between the observations and the climate model simulations for lengths ranging generally from 11 to 28 years, and a marginal inconsistency (i.e., the observed trend value falls outside of the range which encompasses 90% of all modeled trends) for most of the other lengths, the observations now track closely along the marginal inconsistency line, although trends with lengths of 17, 19, 20, and 21 years remain clearly inconsistent with the collection of modeled trends. Still, throughout the entirety of the 35-yr period (ending in 2013), the observed trend lies far below the model average simulated trend (additional information on the impact of the new Cowtan and Way adjustments on modeled/observed temperature comparisons can be found here).
Figure 1. Temperature trends ranging in length from 10 to 35 years (ending in a preliminary 2013) calculated using the data from the U.K. Hadley Center (blue dots), the adjustments to the U.K. Hadley Center data made by Cowtan and Way (red dots) extrapolated through 2013, and the average of climate model simulations (black dots). The range that encompasses 90% (light grey lines) and 95% (dotted black lines) of climate model trends is also included.
The Cowtan and Way analysis is an attempt at using additional types of temperature information, or extracting “information” from records that have already told their stories, to fill in the missing data in the Arctic. There are concerns about the appropriateness of both the data sources and the methodologies applied to them.
A major one is in the applicability of satellite data at such high latitudes. The nature of the satellite’s orbit forces it to look “sideways” in order to sample polar regions. In fact, the orbit is such that the highest latitude areas cannot be seen at all. This is compounded by the fact that cold regions can develop substantial “inversions” of near-ground temperature, in which temperature actually rises with height such that there is not a straightforward relationship between the surface temperature and the temperature of the lower atmosphere where the satellites measure the temperature. If the nature of this complex relationship is not constant in time, an error is introduced into the Cowtan and Way analysis.
Another unresolved problem comes up when extrapolating land-based weather station data far into the Arctic Ocean. While land temperatures can bounce around a lot, much of the ocean is partially ice-covered for many months of the year. Under “well-mixed” conditions, this forces the near-surface temperature to be constrained to values near the freezing point of salt water, whether or not the associated land station is much warmer or colder.
You can run this experiment yourself by filling a glass with a mix of ice and water and making sure it is well mixed. The water temperature must hover around 32°F until all the ice melts. Given that the near-surface air temperature stays close to the water temperature, the limitations of extrapolated land data become obvious.
Considering all of the above, we advise caution with regard to Cowtan and Way’s findings. While adding high Arctic data should increase the observed trend, the nature of the data means that the amount of additional rise is subject to further revision. As they themselves note, there’s quite a bit more work to be done in this area.
In the meantime, their results have tentatively breathed a small hint of life back into the climate models, basically buying them a bit more time—time for either the observed temperatures to start rising rapidly as current models expect, or time for the modelers to try to fix/improve the cloud processes, oceanic processes, and other processes of variability (both natural and anthropogenic) that lie behind what would otherwise be clearly overheated projections.
We’ve also taken a look at how “sensitive” the results are to the length of the ongoing pause/slowdown. Our educated guess is that the “bit” of time that the Cowtan and Way findings bought the models is only a few years long, and it is a fact, not a guess, that each additional year at the current rate of lukewarming increases the disconnection between the models and reality.
Cowtan, K., and R. G. Way, 2013. Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quarterly Journal of the Royal Meteorological Society, doi: 10.1002/qj.2297.
Juan Carlos Hidalgo
Chile went to the polls yesterday in what was perhaps the most important presidential election since the return of democracy in 1990. Many foreign observers focused on the curiosity that the two leading candidates were both daughters of Air Force generals who chose opposing sides during the military coup that toppled socialist president Salvador Allende in 1973. But what was at stake in this election wasn’t Chile’s past, but its future.
Let’s first recapitulate where Chile stands today: Thanks to the free market reforms implemented since 1975 by the military government of Augusto Pinochet – that were subsequently deepened by the democratic center-left governments that ruled the country since 1990 – Chile can boast the following accomplishments:
- It’s the freest economy in Latin America and it stands 11th in the world (ahead of the United States) in the Economic Freedom of the World report.
- It has more than tripled its income per capita since 1990 to $19,100 (PPP), which is the highest in Latin America.
- According to the IMF, by 2017 Chile will reach an income per capita of $23,800, which is the official threshold to become a developed country.
- According to the UN Economic Commission for Latin America and the Caribbean (ECLAC), Chile has the most impressive poverty reduction record in Latin America in the last two decades. The poverty rate went down from 45% in the mid-1980s to 11% in 2011, the lowest in the region.
- It has the strongest democratic institutions of Latin America according to the Rule of Law Index of the World Justice Project.
- It’s the least corrupt country in Latin America according to Transparency International.
- Along with Costa Rica and Uruguay, it has the best record in Latin America on political rights and civil liberties, according to Freedom House.
- High income inequality, which has long been a sore point for many, has decreased in the last decade.
With such an impressive record, it’s quite puzzling that the leading candidate, former president Michelle Bachelet, is running again under a platform calling for changes that would significantly alter the Chilean model by increasing the role of the government in the economy. In particular, Bachelet is proposing free higher education to everyone, the abolition of for-profit private schools and universities, the introduction of a state-owned pension fund in the country’s private pension system, higher taxes on businesses and professionals, and even a new constitution.
Bachelet came in first in yesterday’s election with 46.7% of the vote – short of the 50% necessary to avoid a runoff. On December 15th she’ll have to face again the center-right candidate Evelyn Matthei who came in second with 25%.
It’s very likely that Bachelet will win the runoff, but her governing coalition – which for the first time includes the Communist Party – fell short of the two-thirds majority needed to change the constitution. However, her coalition does have enough votes to push for her reforms on taxes, education, and pensions.
It is worth noting that, despite talk of Bachelet enjoying massive support among Chileans, not only did she fail to avoid a runoff, but she actually received fewer votes yesterday (3,070,012) than she got in the first round of 2005 (3,190,691). Much of this has to do with the fact that yesterday’s was Chile’s first presidential election with voluntary voting. Approximately 50% of Chileans able to vote didn’t show up at the polls. This means that Bachelet received the vote of only 22% of registered voters, hardly an overwhelming mandate for radical changes.
This doesn’t mean that Bachelet won’t push for those reforms, though. After all, her coalition captured a majority of the seats in Congress. Unfortunately, a large segment of Chile’s society seems to suffer from a “high expectations trap,” which involves the danger that a false sense of prosperity sets in before the country actually becomes rich. What we have seen in recent years is that the new middle class has become the driving force behind demands for the further expansion of the welfare state.
The future of the successful Chilean model will be at stake in the next 4 years.
Jeffrey A. Miron
Only a heartless libertarian could possibly object to bans on child labor, right? After all, no one wants to live in some Dickensian dystopia in which children toil endlessly under brutal conditions.
Unless, of course, bans harm, rather than help, both children and their families. And in a new working paper, economists Prashant Bharadwaj (UCSD), Leah Lakdawala (Michigan State), and Nicholas Li (Toronto) find just that. They
… examine the consequences of India’s landmark legislation against child labor, the Child Labor (Prohibition and Regulation) Act of 1986. … [and] show that child wages decrease and child labor increases after the ban. These results are consistent with a theoretical model … in which families use child labor to reach subsistence constraints and where child wages decrease in response to bans, leading poor families to utilize more child labor. The increase in child labor comes at the expense of reduced school enrollment.
And it gets worse. The authors
… also examine the effects of the ban at the household level. Using linked consumption and expenditure data, [they] find that along various margins of household expenditure, consumption, calorie intake and asset holdings, households are worse off after the ban.
Good intentions are just that: intentions, not results. The law of unintended consequences should never be ignored.
Douglas Walburg faces potential liability of $16-48 million. What heinous acts caused such astronomical damages? A violation of 47 C.F.R. § 64.1200(a)(3)(iv), an FCC regulation that enables lawsuits against senders of unsolicited faxes.
Walburg, however, never sent any unsolicited faxes; he was sued under the regulation by a class of plaintiffs for failing to include opt-out language in faxes sent to those who expressly authorized Walburg to send them the faxes.
The district court ruled for Walburg, holding that the regulation should be narrowly interpreted so as to require opt-out notices only for unsolicited faxes. But on appeal, the Federal Communications Commission, not previously party to the case, filed an amicus brief explaining that its regulation applies to previously authorized faxes too. Walburg argued that the FCC lacked statutory authority to regulate authorized advertisements. In response, the FCC filed another brief, arguing that the Hobbs Act prevents federal courts from considering challenges to the validity of FCC regulations when raised as a defense in a private lawsuit. Although the U.S. Court of Appeals for the Eighth Circuit recognized that Walburg’s argument may have merit, it declined to hear it and ruled that the Hobbs Act indeed prevents judicial review of administrative regulations except on appeal from prior agency review.
In this case, however, Walburg couldn’t have raised his challenge in an administrative setting because the regulation at issue outsources enforcement to private parties in civil suits! Moreover, having not been charged until the period for agency review lapsed, he has no plausible way to defend himself from the ruinous liability he will be subject to if not permitted to challenge the regulation’s validity. Rather than face those odds, Walburg has petitioned the Supreme Court to hear his case, arguing that the Eighth Circuit was wrong to deny him the right to judicial review without having to initiate a separate (and impossible) administrative review.
Cato agrees, and has joined the National Federation of Independent Business on an amicus brief supporting Walburg’s petition. We argue that the Supreme Court should hear the case because the Eighth Circuit’s ruling permits administrative agencies to insulate themselves from judicial review while denying those harmed by their regulations the basic due-process right to meaningfully defend themselves. The case also offers the Court the opportunity to resolve lower-court disputes about when the right to judicial review arises and whether a defendant can be forced to bear the burden of establishing a court’s jurisdiction.
This case raises important due-process concerns, and the Court would do well to adopt a rule consistent with the Eleventh Circuit’s holding on this issue—one that protects the right to immediately and meaningfully defend oneself from unlawful regulations. Otherwise, more and more Americans will find themselves on the bad end of obscene regulatory penalties imposed by unaccountable government agencies, with no real means to defend themselves.
The Court will decide whether to take Walburg v. Nack early in the new year.
Patrick J. Michaels
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
Global warming buffs have been fond of claiming that the roaring winds of Typhoon Haiyan were the highest ever measured in a landfalling tropical cyclone, and that therefore (?) this is a result of climate change. In reality, it’s unclear whether or not it holds the modern record for the strongest surface wind at landfall.
This won’t be known until there is a thorough examination of its debris field.
The storm of record is 1969 Hurricane Camille, which I rode out in an oceanfront laboratory about 25 miles east of the eye. There’s a variety of evidence arguing that Camille is going to be able to retain her crown.
The lowest pressure in Haiyan was 895 millibars, or 26.42 inches of mercury. To give an idea, the needle on your grandmother’s dial barometer would have to turn two complete counterclockwise circles to get there. While there have been four storms in the Atlantic in the modern era that have been as strong or a bit stronger, the western Pacific sees one of these approximately every two years.
Camille’s lowest pressure was a bit higher, at 905 mb (26.72 inches). At first blush it would therefore seem Haiyan would win the blowhard award hands down, but Haiyan had a very large eye around which its winds swirled, while Camille’s was one of the smallest ever measured. At times in its brief life, Camille’s eye was so small that the hurricane hunter aircraft could not safely complete a 360 degree turn without brushing through the devastating innermost cloud band, something you just don’t want to be near in a turning aircraft. In fact, the last aircraft to get into Camille, which measured 190 mph sustained winds, lost an engine in the severe turbulence and fortunately was able to limp home.
Haiyan’s estimated 195mph winds were derived from satellite data, rather than being directly sensed by an aircraft. But winds over the open ocean are always greater than those at landfall because of friction, and the five mph difference between the two storms is physically meaningless.
The chance that an onshore anemometer (wind-speed and direction sensor) will survive such a storm isn’t very high, so the winds are inferred by scientists and engineers from the texture and distribution of what’s left behind.
Every year, our National Hurricane Center summarizes the Atlantic hurricane season in painstaking detail in an article published in the prestigious journal Monthly Weather Review. Describing Camille’s destruction, it said:
Maximum winds near the coastline could not be measured, but from an appraisal of splintering of structures within a few hundred yards of the coast, velocities probably approached 175 k[nots].
That’s 201 mph. (Higher winds have been measured on small islands. With Haiyan and Camille, we are talking about storms running into large landmasses, where friction takes place.)
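For reference, the knots-to-mph conversion behind that figure is a one-liner:

```python
KNOTS_TO_MPH = 1.15078  # 1 knot = 1.15078 statute miles per hour

print(f"175 knots = {175 * KNOTS_TO_MPH:.0f} mph")  # -> 201 mph
```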
Camille killed 143 along the Gulf Coast, while Haiyan’s toll is currently estimated to be more than 2,500.
The difference, which is more than an order of magnitude, is largely (but not completely) due to poverty. Despite experiencing roughly five landfalling tropical cyclones per year, Philippine infrastructure simply isn’t as sound as it is in wealthier countries. As a grim example, a number of Haiyan’s casualties actually occurred in government-designated shelters that collapsed in the roaring eyewall.
In addition, the transportation infrastructure simply couldn’t handle a mass evacuation. If a similar situation applied to the U.S. Gulf Coast, Camille would have killed thousands at landfall, a fact noted in the Hurricane Center’s report on the 1969 season. Where Haiyan hit in the Philippines, there simply weren’t any roads capable of evacuating the citizens of Tacloban City safely inland, forcing them to ride it out dangerously close to the invading ocean and exposed to winds that pulverized most structures.
So, while we really don’t know which storm had higher winds, we do know that more affluent societies are much less affected by even the strongest storms. As Indur Goklany, (who writes frequently for Cato) has pointed out, if left to develop, the entire world will be much more resilient to climate change than it would be if the ineffective policies to “stop” it slowed economic growth.
Ted Galen Carpenter
A November 13 article in Reuters discusses the growing controversy over NATO’s new headquarters being built outside of Brussels. The price tag—some $1 billion—has raised more than a few eyebrows. “When defense budgets are being cut and in general when governments are under so much pressure from taxpayers to save money, it looks terribly extravagant,” opines Daniel Keohane, head of a leading think tank in Belgium. Several members of the British parliament also have questioned the cost.
NATO officials, though, defend the project, asserting that the existing headquarters, built in 1967, has outlived its usefulness. Of course, the same point could be made with far greater validity about the NATO alliance itself. After all, it was created during the depths of the Cold War in 1949 to, as Lord Hastings Ismay, NATO’s first secretary general, pithily observed, “keep the Russians out, the Americans in, and the Germans down.” Given the collapse of the Soviet Union and Russia’s manifold demographic, economic, and military limitations as a successor state, that mission now seems to be more than a little obsolete. For the past two decades, the alliance has been conducting a frantic search for relevant new missions, resulting in a dubious decision to add members in Eastern Europe and to wage even more dubious wars in places like Kosovo and Afghanistan.
Not only is NATO an alliance in search of purpose, but the willingness of the European members to free-ride on the military commitment of the United States to Europe’s defense is now even worse than it was during the Cold War. The already anemic military budgets of NATO’s European members have sagged further, and in some cases they are in virtual free fall. To build a billion-dollar, palatial headquarters under such circumstances exhibits contempt for taxpayers—especially U.S. taxpayers.
There seems to be a tendency of U.S. officials to endorse the building of expensive monuments to institutional egos at precisely the time that the institution in question has lost relevance. We saw that process take place in Iraq. Just as the nation-building mission was quickly heading south, the Bush administration built an embassy in Baghdad that was nearly as large as Vatican City. Today, it stands as a symbol of how badly Washington exaggerated the extent of America’s interests in Iraq and misconstrued the extent of U.S. influence there. With the construction of NATO’s new headquarters, we have yet another monument to hubris.
Steve H. Hanke
The story of the Venezuelan economy and its troubled currency, the bolivar, can be summed up with the following phrase: “From bad to worse”—over and over again. Yes, the ever deteriorating situation in Venezuela has taken yet another turn for the worse.
In a panicked, misguided response to the country’s economic woes, Venezuelan president Nicolas Maduro has requested emergency powers over the economy. And the Maduro government recently announced plans to institute a new exchange rate for tourists in an attempt to quash arbitrage-driven currency smuggling.
These measures will likely prove too little, too late for the Venezuelan economy. Indeed, the country’s economy has been in decline since Hugo Chavez imposed his unique brand of socialism on Venezuela.
For years, Venezuela has sustained a massive social spending program, combined with costly price and labor controls, as well as an aggressive annual foreign aid strategy. This fiscal house of cards has been kept afloat—barely—by oil revenues.
But as the price tag of the Chavez/Maduro regime has grown, the country has dipped more and more into the coffers of its state-owned oil company, PDVSA, and (increasingly) the country’s central bank.
Since Chavez’s death, this house of cards has begun to collapse, and the black market exchange rate between the bolivar (VEF) and the U.S. dollar (USD) tells the tale. Since Chavez’s death on March 5, 2013, the bolivar has lost 62.36% of its value on the black market, as shown in the chart below the jump.
This, in turn, has brought about very high inflation in Venezuela. The government has responded by imposing ever tougher price controls to suppress the inflation. But those policies have failed, resulting in shortages of critical goods, such as toilet paper, without addressing the root cause of Venezuela’s inflation woes.
The Maduro government has responded to this problem with the very same tactics employed by other regimes with troubled currencies. Yes, from Mugabe’s Zimbabwe to North Korea today, the playbook is simple, if misguided: deny and deceive.
Currently, official government data put Venezuela’s inflation rate at a mere 50% (a woefully inaccurate figure to begin with). Yet, on Tuesday October 22nd, finance minister Nelson Merentes submitted a proposed 2014 budget to the National Assembly that projected inflation at a level nearly half of the current official inflation rate.
At present, it seems doubtful that the Maduro government has any reason for optimism about Venezuela’s economy in the coming year. Indeed, this latest budget figure is simply an attempt to hide the truth about Venezuela’s massive inflation problem.
Just how big of a problem is inflation in Venezuela? The implied annual inflation rate in Venezuela is actually now in the triple digits, coming in at a whopping 283%, as shown in the chart below.
What’s more, the implied monthly inflation rate has now ramped up to 36%, as shown in the chart below. That’s dangerously close to the hyperinflation threshold of 50% per month. This is due to an accelerating depreciation of the bolivar, reflecting Venezuela’s deteriorating economic outlook.
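A rough sketch of the arithmetic behind these figures (ours, and purely illustrative; Hanke’s published methodology is more careful, netting out U.S. inflation, and his monthly figure reflects the latest acceleration rather than the period average):

```python
from datetime import date

# 1) Implied price-level change from black-market depreciation: if the
#    bolivar loses 62.36% of its value, bolivar prices of dollar-priced
#    goods rise by a factor of 1/(1 - 0.6236).
value_lost = 0.6236
cumulative = 1 / (1 - value_lost) - 1  # ~166% since March 5, 2013

days = (date(2013, 11, 17) - date(2013, 3, 5)).days  # approximate window
annualized = (1 + cumulative) ** (365 / days) - 1    # ~300%, in the
                                                     # ballpark of 283%

# 2) The conventional hyperinflation threshold is 50% per month.
threshold_annual = 1.5 ** 12 - 1        # ~12,875% per year
monthly_36_annualized = 1.36 ** 12 - 1  # ~3,900% per year if sustained

print(f"cumulative {cumulative:.0%}, annualized {annualized:.0%}")
print(f"50%/month threshold = {threshold_annual:.0%}/yr; "
      f"36%/month sustained = {monthly_36_annualized:.0%}/yr")
```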
At this pace, Venezuela could join the Hall of Shame as the world’s 60th episode of hyperinflation. Alas, it seems the Maduro government is determined to double down on its failed policies, rather than face the music about the failed chavismo economic experiment.
K. William Watson
No one is surprised that 151 liberal Democrats in the House don’t support granting the president fast track authority to negotiate trade agreements. But this week two groups of Republicans signed letters to the president joining those Democrats in their opposition. The news media have reported the story as evidence that the tea party opposes President Obama’s trade agenda.
The signatories of the letters are an odd combination of young, party-line Republicans and old-guard isolationists who oppose free trade. Neither group has anything to do with the tea party and both seem confused about how fast track works.
One of the assertions made in the letters is that establishing fast track authority cedes to the President Congress’s constitutional power to regulate trade. This is just wrong. Fast track is not a grant of authority to the President but rather an exercise of authority by Congress.
No one doubts that Congress can pass statutes that regulate trade. Also clear is that the President can enter into treaties with the advice and consent of the Senate. Fast track combines those processes by having the President negotiate the terms of an agreement, after which both houses of Congress pass (or not) a statute that ratifies and implements that agreement.
The contentious aspect of fast track authority comes from the fact that Congress agrees to hold an up-or-down vote on any agreement submitted by the president with no opportunity to add amendments. The good sense of this arrangement is obvious when you consider that a trade agreement is the product of complex and lengthy international negotiations that cannot be adjusted at the last minute just to accommodate each congressman’s pet issue.
But when establishing fast track authority, Congress also imposes restrictions on what must be included (or excluded) from any trade agreement placed on the fast track. In essence, Congress agrees to adopt practical, streamlined parliamentary procedures as long as the President negotiates agreements it likes. Fast track allows Congress to exert influence at an earlier, less-disruptive stage in the process.
While the legitimacy of fast track might be an interesting topic for constitutional scholars, the controversy is mostly a proxy for larger policy arguments about the value of trade in general. Protectionists disapprove of fast track and trade agreements because they want more barriers to trade, regardless of its constitutional status. Similarly, proponents of increased trade approve of fast track because trade agreements put a check on Congress’s tendency to protect domestic industries.
So why have 27 Republicans come out against fast track? Cato’s online trade votes database can help us answer the question.
About half the Republicans who signed the letters are old-guard isolationists who have opposed trade for decades. Indeed, if you were looking for a list of anti-trade Republicans, you need look no further than these signatories. Some of them have been in Congress for over 20 years, voting consistently for higher tariffs. You can take a look at the impressive voting records of some of these thankfully atypical Republicans here: Rob Bishop, John Duncan, Michael Fitzpatrick, Duncan Hunter, Walter Jones, Frank LoBiondo, David McKinley, and Chris Smith.
Most of the other Republicans expressing opposition to fast track haven’t been in Congress very long and have voted in lock-step with the Republican leadership on trade issues. They have voted in favor of lowering barriers, but only in reciprocal trade agreements, and they support corporate welfare subsidies like the Export–Import Bank. I don’t know why they’ve decided to take a stand against fast track, which the Republican leadership strongly favors. Perhaps they really think it’s unconstitutional, or perhaps their motives are largely partisan. Significantly adding to the confusion is the fact that of the five freshmen who have taken a position against fast track, four signed a letter in June emphasizing their support for the policy.
Oddly, there is little doubt that this second group of moderate Republicans will vote in favor of any free trade agreement submitted to Congress even if the President gets “unconstitutional” fast track authority. That’s not at all true of the Democrats who have voiced their opposition—they can be counted on to oppose trade agreements at every step of the way. Ultimately these Republicans seem to be quite confused about what they want and how they should go about getting it. I recommend that they stop listening to their party’s lingering protectionist minority about fast track, or anything else for that matter.
Caleb O. Brown
Thousands of family dogs are shot by law enforcement each year. A documentary now being assembled tries to find out why. You can watch the trailer above.
Some readers may be familiar with Cheye Calvo, the mayor of Berwyn Heights, Maryland. Police raided his home when they discovered that a large package of marijuana in transit was addressed to Calvo’s wife. The package, meant for someone else, led police to execute a SWAT raid during which they shot the family’s dogs, Payton and Chase. Puppycide is a feature-length documentary that attempts to capture both the impact on traumatized families and the rationales offered by police.
When police arrived, the package was unopened. No one was arrested. To my knowledge, the police never issued an apology.
The filmmakers at Ozymandias Media are currently running a Kickstarter campaign to get the film completed.
Another year, another Veterans Day. But November 11 began as Armistice Day, commemorating the end of World War I. The day remains a stark reminder of the stupidity of war.
At the 11th hour of the 11th day of the 11th month of 1918, World War I came to an end. In succeeding years allied states commemorated the conflict’s end on November 11.
Some 20 million people died in World War I. The horrific conflict brought down the continent’s established order, loosed the pestilence of totalitarianism, and led to the even deadlier World War II. The Great War, as it was originally called, was stupid beyond measure.
As the 20th century dawned, Europe enjoyed both peace and prosperity. However, Europe’s environment was combustible. One match strike set the continent ablaze.
On June 28, 1914 the Serbian nationalist Gavrilo Princip gunned down Archduke Franz Ferdinand, the heir to the throne, in Sarajevo, capital of the Austro-Hungarian province of Bosnia.
Vienna decided to use this act of state terrorism to break its Serbian antagonist. Germany stood by its ally. However, Serbia was backed by Russia, which in turn was allied with France. As conflict erupted other combatants jumped or were drawn in. The contending blocs, the Central Powers versus the Entente, acted as transmission belts of war.
There really was little to choose between the two militaristic blocs. The sins of the Central Powers are well-known, but the Entente’s members were no angels.
Serbia’s military intelligence was implicated in the Archduke’s murder. Tsarist Russia was an anti-Semitic despotism. Historically France was dangerous and militaristic, and its revanchist desire for war with Germany was strong. Britain opposed Germany more for commercial and imperial than humanitarian reasons. Belgium was perhaps the worst colonial power, responsible for the deaths of millions in the Belgian Congo.
The early Americans were determined to avoid getting entangled in imperial European affairs. However, as I point out in my latest Forbes online column, by World War I the U.S. had changed:
The so-called Progressives, led by Presidents Theodore Roosevelt and Woodrow Wilson, had taken charge. They were statists, imperialists, and militarists—inveterate social engineers on a global scale. After President Wilson was reelected in 1916, he hoped to remake the international order. That required America to be a belligerent, even though it had no significant interest in the conflict.
The trigger for U.S. involvement was both foolish and fraudulent. London broke international law by imposing a starvation blockade on Germany, ultimately killing hundreds of thousands of German civilians. Berlin responded with a new weapon, the submarine.
Some Americans died after traveling on British vessels, which carried bullets as well as babies. The famous Lusitania was an armed reserve cruiser carrying munitions through a war zone—making it a legitimate military target.
However, under pressure from the allied-sympathetic Wilson, Germany suspended U-boat attacks until February 1917. After Berlin resumed unrestricted submarine warfare President Wilson chose war. Some 200,000 Americans died, the victims of a president suffering from a toxic mix of egotism and myopia.
Alas, contra people’s hopes, the conflict did not turn out to be the War to End War. Washington’s entry allowed imposition of the Versailles Treaty, a “Diktat” highlighted by the allies’ greedy grab for plunder amid sanctimonious claims of justice. Adolf Hitler and World War II were the conflict’s most disastrous consequences.
Sometimes wars must be fought, and sometimes even the stupidest wars cannot be avoided. But often they could and should be. Like World War I.
To criticize America’s wars is not to doubt the patriotism and bravery of those who fought. Rather, to criticize the conflicts is to highlight the foolishness, arrogance, and ignorance of those who launched new wars or intervened in old ones.
After this Veterans Day Americans should contemplate how they have allowed politicians to drag the U.S. into unnecessary and costly wars, filling Arlington Cemetery and so many other final resting places with America’s finest. After this Veterans Day Americans should rededicate themselves to peace.
A new GAO report recommends that Congress end the SPOT program, which attempts to catch terrorists by spotting suspicious behaviors they may exhibit at airport checkpoints. The Transportation Security Administration currently spends more than $200 million a year on the Screening of Passengers by Observation Techniques program, even though there has been criticism from the start that there is no solid science behind it.
Here are observations about SPOT from my new Cato study on the TSA to be released next Tuesday:
The SPOT program illustrates the problems with top-down federal control over aviation security. The TSA ‘deployed SPOT nationwide before first determining whether there was a scientifically valid basis’ for it, notes the GAO. Nor did the TSA perform a cost-benefit analysis of SPOT before it was deployed. That is the way that the federal government often works—it rolls out an expensive ‘solution’ for the entire nation without adequate research, and it resists efforts to cut programs even if the benefits do not materialize.
The new Cato study focuses on a decade of TSA shortcomings and the advantages of privatizing airport security screening. In sharp contrast to the American approach of a federal monopoly over aviation security, the great majority of European countries and Canada use competitive contracting for airport screening.
If you are in D.C. today, please drop by our Capitol Hill noon forum on the TSA.
Opponents of school choice are once again seeking to restrict the ability of parents to select the best education for their child. New Hampshire’s Education Tax Credit program, enacted in 2012, permits businesses that donate to a state-approved scholarship organization to deduct 85% of that donation from their annual taxes. These scholarship organizations then offer scholarships to students who meet specific criteria, enabling them to attend nonpublic schools or schools outside their district, or to homeschool.
Because a large number of New Hampshire’s nonpublic schools are religious, opponents of school choice have challenged the program under, among other things, an 1877 amendment to the state constitution frequently referred to as a Blaine Amendment. Blaine Amendments, which were passed during that period in many states for the sole purpose of preventing Catholic schools from receiving government funding, prohibit “money raised by taxation” from being “granted or applied for the use of the schools or institutions of any religious sect or denomination.”
A New Hampshire trial court found that the Education Tax Credit program violated this discriminatory amendment and that scholarship funds could therefore not be granted to students attending religious schools. But the court incorrectly reasoned that money exempted from taxation under the tax-credit program was the equivalent of a government expenditure of public funds and therefore “money raised by taxation.” This type of reasoning—often referred to as “tax expenditure analysis”—has been explicitly rejected by other state supreme courts and the U.S. Supreme Court.
Indeed, as the Supreme Court held in Arizona Christian School Tuition Organization v. Winn (2011), such an approach “assumes that income should be treated as if it were government property even if it has not come into the tax collector’s hands … . Private bank accounts cannot be equated with the … State Treasury.” The trial court’s holding is similarly at odds with the original understanding of the Blaine Amendment and is unsupported by New Hampshire case law.
The State of New Hampshire and the Network for Educational Opportunity, represented by the Institute for Justice, have taken the case to the New Hampshire Supreme Court. Cato has filed an amicus brief supporting them, arguing that the educational tax credits are not “money raised by taxation” according to the original understanding of the 1877 amendment, New Hampshire case law, and U.S. Supreme Court precedent. The New Hampshire Supreme Court should reverse the trial court and restore a vital source of educational freedom and opportunity.
The case is Duncan v. New Hampshire.
Cato legal associate Lauren Barlow co-authored this post.
Steve H. Hanke
On October 28th, I wrote a blog post, “The NSA’s Rent is Too Damn High,” in which I looked at the $52.6 billion price tag for America’s spook infrastructure – the so-called “black budget.” When allocated across every American taxpayer, this staggering sum comes out to $574 per taxpayer, per year.
But, there are other edifying ways of gaining perspective on such a whopping amount of money. Doing so is important. Indeed, according to John Maynard Keynes’ biographer, Lord Skidelsky, Keynes believed that a good economist must always have “a sense of magnitudes.”
We can get a sense of magnitudes by looking at this year’s black budget as a portion of the major sources of the federal government’s revenues. The table below tells that tale:

| Source of Federal Revenue | 2012 Amount ($ Billion) | Black Budget ($ Billion) | Black Budget as % of Revenue Source |
|---|---|---|---|
| Individual Income Taxes | $1,132.21 | $52.60 | 4.6% |
| Corporate Income Taxes | $242.29 | $52.60 | 21.7% |
| Social Insurance Taxes | $845.31 | $52.60 | 6.2% |
| Excise Taxes | $79.06 | $52.60 | 66.5% |
| Estate and Gift Taxes | $13.97 | $52.60 | 376.4% |
| Customs Duties | $30.31 | $52.60 | 173.6% |
| Miscellaneous Receipts | $107.01 | $52.60 | 49.2% |
| Deficit (Borrowing) | $1,086.96 | $52.60 | 4.8% |

Source: Congressional Budget Office
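The last column of the table is a simple ratio; here is a minimal sketch reproducing it from the figures above (tiny differences from the table come from rounding in the published inputs):

```python
black_budget = 52.60  # $ billion

revenues = {  # 2012 federal revenues, $ billion (CBO, from the table)
    "Individual Income Taxes": 1132.21,
    "Corporate Income Taxes": 242.29,
    "Social Insurance Taxes": 845.31,
    "Excise Taxes": 79.06,
    "Estate and Gift Taxes": 13.97,
    "Customs Duties": 30.31,
    "Miscellaneous Receipts": 107.01,
    "Deficit (Borrowing)": 1086.96,
}

for source, amount in revenues.items():
    print(f"{source}: black budget = {black_budget / amount:.1%}")
```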
One story after another emerges about dysfunctional federal programs plagued by waste, fraud, and abuse. The core problem is that the government has grown so large that trying to make it function with efficiency and soundness has become impossible.
But Congress compounds the problem by creating programs that are ideal targets for crooks and scammers, and it resists repealing them even after years of scandal. The Earned Income Tax Credit, for example, has long suffered from an “improper payment” rate of more than 20 percent, which translates into throwing $10 billion of our hard-earned money down the drain every year. Whatever the policy arguments in favor of the program, that level of waste is hugely unfair to taxpayers, and the program ought to be repealed on this basis alone.
The Washington Post discusses another long-abused activity today—contract set-asides for small businesses. The article profiles how a Virginia businessman hit the jackpot with $1 billion of federal contracts by posing as a “disadvantaged” and “small” business under the Small Business Administration’s 8(a) program. The Post found that this Mercedes-driving owner of a luxury home is certainly not “disadvantaged,” and his business empire is not “small.”
The whole thing is disgusting, and I suspect a congressional committee will hold an oversight hearing to pretend to be concerned about the case. But the SBA 8(a) program gets abused over and over, as do other federal preference activities, such as Alaska Native Corporations. All such preferences ought to be abolished, and the government should live up to the goal of “Equal Justice Under Law” engraved on the Supreme Court building.
Indeed the entire SBA ought to be abolished. The agency’s programs distort the economy and promote crony capitalism. Americans are sick of dysfunction in Washington, but if they want the government to operate with any degree of integrity and competence they should demand much less of it.
Marian L. Tupy
Chimney sweeps are making a comeback in Great Britain. “According to the National Association of Chimney Sweeps,” the Telegraph reports, “Britain is experiencing the largest boom in chimney sweeping since Victorian times. ‘It’s been remarkable,’ said president Martin Glynn. ‘When we started NACS in 1982, there were just 30 members. Today we have 540 members nationally.’” The reason? Gas and electricity prices have risen so high that people prefer to burn wood in their previously unused or underused fireplaces.
Why are energy prices so high? The British government’s pathological obsession with renewable sources of energy. Generating energy from renewable sources (e.g., wind and solar) is much more expensive than generating it from conventional sources (e.g., coal and gas). Since the government mandates that a certain percentage of energy consumption come from renewable sources, energy prices are rising – fast.
That leads to this amazing paradox: Progressivism started in Victorian England – a time and place of tremendous and unprecedented material and social progress that, nonetheless, also had a darker side. The stories of workers living in freezing and damp dwellings, child chimney sweeps suffocating on the job, and environmental degradation were both horrific and true.

The evolution of progressivism has come full circle and progressives have turned reactionary: to combat the (highly uncertain) effects of climate change, modern-day progressives have embraced policies that lead to more burning of wood, freezing homes for the poor and vulnerable, and chimney sweeps back at work.
Christopher A. Preble
Benjamin Friedman and I have an op-ed in today’s International New York Times (and the New York Times iPad app, I just checked) which calls for shrinking the U.S. nuclear arsenal, and moving from a triad of delivery systems—bombers, land-based intercontinental ballistic missiles (ICBMs), and submarine-launched ballistic missiles (SLBMs)—to a submarine-only monad.
The main focus of the piece is on the strategy that led to the enormous growth of the arsenal in the 1950s and 60s, and the attendant history of the triad. We go into the history to show that the strategy driving our nuclear force posture is outdated and based on inaccurate assumptions. The rationale for the triad is equally dubious given the vast technological gains since ICBMs and SLBMs were first developed and deployed.
But the international system has obviously changed since the days of the Cold War. Potential targets for American nuclear weapons are growing scarcer. New nuclear powers like North Korea struggle to deploy even a handful of delivery vehicles. Targeting China’s few long-range missiles demands intelligence to find them, not sheer numbers of warheads to hit them. And Russia’s plans to modernize its non-nuclear forces suggest that it is not aiming for nuclear parity.
The op-ed draws from our recent white paper, “The End of Overkill?” and will be the subject of an upcoming event on Capitol Hill, for those of you who missed the policy forum at Cato last month. We’ve spoken and written about the paper before, but my hope is that additional exposure will draw attention to an understudied phenomenon: nuclear overkill. Placement in the New York Times certainly should help.
The fiscal situation helps, too. As we explain in the paper and the op-ed, the various military services grabbed a share of the nuclear mission in order to grow their budgets in the 1950s. Even the Army, effectively barred from developing strategic nuclear weapons, managed to get into the nuclear strategy game through “flexible response,” the claim that the presence of large numbers of U.S. troops stationed in Europe enhanced our ability to deter attacks on our allies. Such claims were dubious even then, but few people were inclined to scrutinize them.
By contrast, today’s budget battles are forcing the services to compete with one another, and with themselves (e.g., surface ships vs. submarines in the Navy, or ICBMs vs. fighter aircraft in the Air Force). In that context, as we conclude in the op-ed:
Budget-conscious service chiefs may see nuclear weapons as an attractive target, especially given their irrelevance in recent wars.
Pentagon competition helped create the triad; restored competition could help kill it.
You can read the whole thing here.
Why do parents choose a particular school? What information do they consider in making that choice? Do they prioritize high standardized test scores, rigorous college preparation, moral or religious instruction, or something else?
This morning, the Friedman Foundation released a new study, “More Than Scores: An Analysis of How and Why Parents Choose Private Schools,” that sheds light on these questions. The study surveyed 754 low- and middle-income parents whose children received scholarships from Georgia GOAL, a scholarship organization operating under Georgia’s scholarship tax credit law.
The study’s findings provide analysts and advocates across the education policy spectrum with much to consider.
Consistent with previous research, the study found extremely high levels of parental satisfaction, with 98.6 percent of respondents answering that they are “satisfied” or “very satisfied” with their chosen school relative to their previous experience at a government school. Opponents of school choice argue that we should focus our efforts on improving district schools, but we should not expect that any one school will be able to meet the needs of all students living in a given geographic area. This study and prior research clearly demonstrate that the district schools are failing to meet the needs of a significant portion of the population.
Moreover, contrary to those school choice opponents who argue that low-income and especially black families “don’t know how to make good choices for their children,” the study found that “low-income parents, single parents, African-American parents, and parents with less than a college education are willing and able to be informed and active education consumers on behalf of their children.” For example, about 93 percent of parents indicated that they would be “willing to take three or more time-consuming steps to obtain the desired information” about their children’s potential schools (e.g., taking a tour, consulting with friends, or attending an informational meeting). The authors note that the study’s findings cannot be generalized to the population at large, since the survey sample was limited to parents who had already completed the application process and received scholarships. That said, even the poorest people in the poorest nations on the planet have proven willing and able to select a quality education for their children.
But the study’s most interesting findings should give pause to supporters of school choice who seek to mandate standardized testing and other top-down reforms like Common Core.
The survey asked parents to identify the top five reasons they chose their child’s particular school using a list of 21 options plus “other.”
The top five reasons why parents chose a private school for their children are all related to school climate and classroom management, including “better student discipline” (50.9 percent), “better learning environment” (50.8 percent), “smaller class sizes” (48.9 percent), “improved student safety” (46.8 percent), and “more individual attention for my child” (39.3 percent).
By contrast, standardized testing ranked very low on the list of parental priorities:
Student performance on standardized test scores is one of the least important pieces of information upon which parents base their decision regarding the private school to which they send their children. Only 10.2 percent of the parents who completed the survey listed higher standardized test scores as one of their top five reasons why they chose a particular private school for their child.
These findings provide another reason why school choice programs should not require testing. Parents value different aspects of education very differently. Standardized testing creates a powerful incentive toward conformity, which diminishes the diversity of educational options available. Moreover, the findings indicate, at the very least, that parents recognize the limitations of standardized testing as a useful measure of learning, and they perhaps indicate a strong demand for schools that are not part of the standardized testing regime.
To determine what sort of information parents seek when making their decision, the survey asked them to rank 22 pieces of information as important or not. The top three “important” pieces of information were the average class size (80.2 percent), whether the school is accredited (70.2 percent), and the curriculum and course descriptions (69.9 percent). Standardized test scores came in sixth place, with barely more than half of respondents (52.8 percent) ranking them as “important.” As the authors note, that is “a somewhat low ranking relative to the disproportionate emphasis that many educators, politicians, policymakers, business leaders, and the media are placing on national standards and standardized testing.” Likewise, when asked to identify the most important information, only 5.4 percent of parents selected standardized test scores.
Whereas most “accountability” reformers emphasize testing, the study demonstrates that parents hold schools directly accountable and punish lack of performance or transparency by voting with their feet. As the study’s authors conclude:
Because they risk losing students to other K–12 schools in the educational marketplace, private schools have an incentive to voluntarily provide the information desired by parents. Based on the survey results, the failure of a private school to provide information would (79 percent) or might (20 percent) negatively impact a parent’s decision on whether to send his or her children there.
In other words, to the extent that some parents find standardized testing to be a useful tool, the market creates an incentive for schools to test their students and report the results. But whereas some education reformers would mandate testing for all students, a market allows parents who distrust or dislike testing to avoid it while still empowering them to find the information they need to make an informed decision about their child’s education.
Paul C. "Chip" Knappenberger and Patrick J. Michaels
It’s about time!
For months, we have been hammering away at the point that the Feds’ current determination of the social cost of carbon is grossly out of touch with the relevant scientific literature and economic guidance.
Perhaps in response to the fact that they can’t argue against what we have been saying, the Administration has finally capitulated and is opening up its determination of the social cost of carbon (SCC) for public comment.
Their SCC calculation—in keeping with the playbook of the president’s Climate Action Plan—is a backdoor way of implementing a carbon tax. And it is slowly, pervasively, and worst of all, silently, creeping into all of our lives. We’ve been trying to stop all of this by, at the very least, pulling back the cloak of secrecy and trying to make this once-esoteric subject a topic of dinnertime conversation.
Meanwhile, the government’s regulatory push using the SCC continues.
The Institute for Energy Research has recently identified nearly 30 federal regulations that have incorporated the SCC into their cost-benefit analyses (and several more have been recently announced).
The SCC is used to make regulations seem less costly. We say “seem,” because the “benefit” from reducing carbon dioxide (CO2) emissions, as valued by the SCC, is likely never to be realized by the American consumer—yet the other costs (such as increased manufacturing costs) most assuredly will be.
The SCC is a theoretical cost assigned to each additional ton of CO2 emitted. But the theory is so loosey-goosey that with a little creativity, you can arrive at pretty much any value for the SCC—a point noted by M.I.T.’s Robert Pindyck in an article for the Summer 2013 edition of Cato’s Regulation.
As the Obama Administration wants to regulate away as many carbon dioxide emissions as possible, it is in its own self-interest to try to arrive at the highest SCC value possible. This way, the more that CO2 emissions are reduced, the more money is “saved.”
Or so the idea goes.
But their path toward a high SCC is one that leads away from both the best science and the most common-sense economics.
Instead, we want to point out several opportunities to draw further attention to the shortcomings in the Administration’s SCC determination.
The period for accepting public comments on several proposed rulemakings is open and provides a good opportunity to remind the issuing agency what it did wrong. For example, here is a recently announced regulation proposal from the Department of Energy (DoE) that seeks to impose higher energy efficiency rules for residential furnace fans. It employs the SCC to make the rule seem a lot sweeter than it actually is.
We have already submitted comments on several of these proposed regulations, including DoE regulations to increase the efficiency standards for Microwave Ovens, Walk-In Freezers, and Commercial Refrigeration Equipment.
So it is significant that the White House’s Office of Management and Budget (OMB) just announced that the social cost of carbon determination currently in force will be open to public comment starting sometime in the presumably near future (keep an eye on the Federal Register for the official announcement).
While it is too early to tell, this willingness to hear public comments on the SCC probably originated from the comments received on the Petition to Reconsider the proposed Microwave Oven ruling—the first rulemaking to incorporate the Administration’s latest (and worst) iteration of the SCC, which was about a 50% increase over the original figure. There hasn’t been an official announcement as to the result of the Petition, but the scientific argument against it is a Cato product.
More than likely, though, this will all be for show. The feds could selectively use some comments and somehow find a way to raise the SCC even further. Like we said, that’s easy to do—crank down the discount rate, or crank up the damage function (make up new damages not included in the current models)—even while paying lip service to the lowered equilibrium climate sensitivity and the CO2 fertilization effect.
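To see why the discount rate is such a powerful lever, consider a minimal sketch (entirely illustrative numbers, not any agency’s actual damage schedule): the SCC is the present value of a long stream of future marginal damages, so discounting centuries of damages at a lower rate inflates the headline figure dramatically.

```python
# Illustrative only: a constant $10/year marginal damage per ton of CO2
# for 300 years, discounted at different rates. The federal working
# group's actual damage paths come from integrated assessment models.
def scc(annual_damage=10.0, years=300, rate=0.03):
    return sum(annual_damage / (1 + rate) ** t for t in range(1, years + 1))

for r in (0.05, 0.03, 0.025):
    print(f"discount rate {r:.1%}: SCC = ${scc(rate=r):,.0f} per ton")
```

Dropping the rate from 5 percent to 2.5 percent roughly doubles this toy SCC, with no change whatsoever in the assumed damages.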
We’d be more than happy to be wrong about this. But until then, our efforts to set things straight will continue.
Tim Lynch was right. Dallas Buyers Club is a terrific movie with a strong libertarian message about self-help, entrepreneurship, overbearing and even lethal regulation, and social tolerance. Matthew McConaughey, almost unrecognizable after losing 40 pounds, plays Ron Woodroof, a homophobic electrician who, in 1985, learns he has AIDS and has 30 days to live. There’s lots of strong language in his denunciation of the kinds of people who get AIDS, which he certainly is not. But after doing some research, he asks his doctor for AZT, the only drug for HIV/AIDS then available; he isn’t eligible, however, for the trials then in process. He turns to the black market, finds his way to Mexico, encounters a doctor who tells him that AZT is toxic and that there are better vitamins and drugs, and beats his original prognosis. As it occurs to him that there are plenty of other people in Dallas who could use these drugs, he sees an opportunity to make some money – if he can only learn to deal with gay people.
Soon he’s setting up a “buyers club,” in an attempt to evade FDA regulations on selling illegal or non-approved drugs. He’s got customers – oops, potential members – lining up. He’s on planes to Japan and Amsterdam to get drugs not available in the United States. And at every turn he’s impeded and harassed by the FDA, which insists that people with terminal illnesses just accept their fate. Can’t have them taking drugs that might be dangerous! You’ll be surprised to see how many armed FDA agents it takes to raid a storefront clinic operated by two dying men.
Go see Dallas Buyers Club.