David Fahrenthold reports in the Washington Post:
[P]ork, the habit of using taxpayer money for a legislator’s pet cause … appears to be stronger even than death.
That’s clear from the story of the Robert C. Byrd Highway, a decades-old road project in West Virginia that had received earmarked funds for years from Sen. Robert C. Byrd (D-W.Va.), the longest-serving senator in history, who died in 2010.
The highway has been maligned as a wasteful road to nowhere. But, now, it has outlived earmarks. It has even outlived Byrd.
This year, with continued support from Sen. John D. Rockefeller IV (D-W.Va.), the highway got $40 million in federal money. It will need about that much every year, state officials say, until it’s finished in 2035.
Paul C. "Chip" Knappenberger
In my recent op-ed for The Hill examining the Obama administration’s estimation of the social cost of carbon (SCC)—a measure of how much future damage is purportedly going to be caused by each ton of carbon dioxide that is emitted through human activities—I identified two major problems with their measure.
First, the administration’s SCC was based on an estimate of global rather than domestic damages from anthropogenic climate change—an odd scope for a measure designed to be incorporated in the cost/benefit analysis of U.S. rules and regulations governing domestic activities (such as the energy efficiency of microwave ovens sold in the United States). In fact, Office of Management and Budget (OMB) guidelines state that
Your analysis should focus on benefits and costs that accrue to citizens and residents of the United States. Where you choose to evaluate a regulation that is likely to have effects beyond the borders of the United States, these effects should be reported separately.
Instead of “reporting separately,” the administration’s SCC embodies “effects beyond the borders of the United States.”
Second, the administration recently revised (upwards) its initial calculation of the SCC. In doing so, it included updates to its underlying economic/climate-change/damage models, but it did not include any updates to the characteristics of the equilibrium climate sensitivity used by the models. Since the equilibrium climate sensitivity is the key factor in how much climate change will result from a given amount of anthropogenic carbon dioxide emissions, and since there is mounting scientific evidence that the equilibrium climate sensitivity is better constrained and lower than that used in the initial analysis, there is no defensible reason why the new science was not included in the administration’s revised SCC calculation.
So that’s two strikes against it.
What I didn’t go into in my op-ed, because it is a rather complicated topic, is the choice of discount rate used in the administration’s SCC analysis. The discount rate, generally put, reflects how much you are willing to pay now to avert future damages. The lower the discount rate, the more costly (in today’s dollars) future damages become. The same OMB guidelines mentioned above also cover the selection of the discount rate to use in cost/benefit analysis. OMB’s guidance is that an analysis should use a 7 percent discount rate as the base case and, to show the sensitivity of the results to the discount rate assumption, should also report results using a 3 percent rate.
The administration ignored that guideline as well. Instead it opted to determine the SCC using discount rates of 2.5, 3, and 5 percent, and didn’t include results for a 7 percent rate—results that would have indicated a substantially reduced cost of future damages.
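To see why the choice of rate matters so much, here is a minimal sketch of the standard present-value calculation. The $100 damage figure and the 50-year horizon are illustrative assumptions, not numbers from the administration’s analysis:

```python
# Present value of a hypothetical $100 of climate damage incurred 50 years
# from now, discounted at the rates discussed above. The $100 figure and the
# 50-year horizon are illustrative assumptions only.
def present_value(future_damage, rate, years):
    return future_damage / (1 + rate) ** years

for rate in (0.025, 0.03, 0.05, 0.07):
    print(f"{rate:.1%} discount rate: ${present_value(100.0, rate, 50):.2f}")
```

Under these assumptions, the $100 of future damage is worth about $23 today at a 3 percent rate but only about $3.40 at OMB’s default 7 percent rate, which is why including the 7 percent case would have shown a substantially lower social cost of carbon.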
With now three strikes against it, the administration’s determination of the social cost of carbon should be tossed out.
The door to doing so has just been opened slightly with the announcement that the Department of Energy is opening for public comment a Petition for Reconsideration of its use of the administration’s newly figured and newly increased SCC in its above-mentioned microwave oven energy efficiency rule. We’ll see what becomes of that.
In the meantime, here’s an example of how the SCC is currently being used and abused in the justification of new regulations. Economist Robert Murphy (who recently testified to Congress as to the problems with the administration’s SCC methodology) posted this gem from the Environmental Protection Agency discussions of a proposed new rule regulating discharges from steam electric power plants (and how the rule may impact carbon dioxide emissions from the plants):
As the above table shows, EPA is being a dutiful federal agency, following Executive Branch guidelines on how to calculate costs and benefits—it reports its findings using both a 3 percent and a 7 percent discount rate. Yet as the footnote explains, when reporting the benefits of reducing CO2 emissions, the EPA actually can’t use a 7 percent discount rate, because an estimate of the SCC (social cost of carbon) for a 7 percent rate is “not available.” Why is it not available? Because the [administration’s] Working Group explicitly ignored the OMB guidelines, and only reported the figures for 3 percent and 5 percent.
We thus have an absurd situation, in which EPA and other regulatory agencies will be following the rules and calculating benefits and costs at both the 3 percent and 7 percent discount rates. Yet, when they express the “social benefits” of reducing greenhouse gas emissions at the 7 percent rate, they are actually going to plug in the wrong number, and explain in a footnote why they are doing so. To repeat, this is important, because the “right” number would show that there are virtually no “social benefits” from reducing greenhouse gas emissions.
As I have explained elsewhere, there are far more problems to the Obama Administration’s computer-model-case against carbon, than just the choice of discount rate. Yet the knots into which the federal government has tied itself, in order to avoid revealing the truth about the actual economic literature, is quite revealing—not to mention hilarious.
The administration’s SCC is a devious tool designed to justify ever more expensive rules and regulations impacting virtually every aspect of our lives, and it was developed by violating federal guidelines and ignoring the best science.
All around bad news.
In an interview with CNN yesterday, outgoing FBI director Robert Mueller offered up words one could characterize as defending mass surveillance of all Americans’ phone calling. Indeed his interview has been portrayed as a defense of such spying, with outlets like NRO’s “The Corner” reporting “Outgoing FBI Chief: ‘Good Chance’ NSA Would Have Prevented ‘Part’ of 9/11.” But Director Mueller spoke much more equivocally than that.
Here’s what he actually said.
CNN: If we had the kind of intelligence that we were collecting through the NSA before September 11th, the kind of intelligence collection that we have now, do you think 9/11 would have been prevented?
MUELLER: I think there’s a good chance we would have prevented at least a part of 9/11. In other words, there were four planes. There were almost 20 — 19 persons involved. I think we would have had a much better chance of identifying those individuals who were contemplating that attack.
CNN: By this mass collection of information?
MUELLER: By the various programs that have been put in place since then. … It’s both the programs (under the Patriot Act) but also the ability to share the information that has made such dramatic change in our ability to identify and stop plots.
Mueller vaguely cited “various programs,” giving them a retroactive chance of preventing “a part of 9/11.” But even this defense of post-9/11 powers is insufficient.
In our 2006 paper, “Effective Counterterrorism and the Limited Role of Predictive Data Mining,” IBM scientist Jeff Jonas and I recounted the ease with which 9/11 attackers Khalid al-Mihdhar and Nawaf al-Hazmi could have been found had government investigators pursued them with alacrity. The 9/11 Commission said with respect to al-Mihdhar, “No one was looking for him.” Had they been caught and their associations examined, the 9/11 plot probably could have been rolled up. Sluggish investigation was a permissive factor in the 9/11 attacks, producing tragic results that nobody foresaw.
That absence of foresight is a twin with retrospective assessments like Mueller’s, which fail to account for the fact that nobody knew ahead of 9/11 what devastation might occur. Immediately after the 9/11 attacks, everybody knew what such an attack could cause, and everybody began responding to the problem of terrorism.
Would Patriot Act programs have prevented at least a part of 9/11? Almost certainly not, given pre-9/11 perceptions that terrorism was at the low end of threats to safety and security. A dozen years since 9/11, terrorism is again at the low end of threats to safety and security because of multiplicitous efforts worldwide and among all segments of society. It is not Patriot Act programs and certainly not mass domestic surveillance that make us safe. Even Mueller didn’t defend NSA spying.
I’ve often pointed out that the modern trend in America toward loading more and more legal risks and obligations onto employers tends to have the presumably unintended effect of creating a disincentive to employ people, especially when there is any hint that an employment relationship, even if productive otherwise, might take on elements of conflict.
Don’t just take my word for it. Here’s an item I’ve been meaning to note for a while from the excellent lawyer-blogger Eric B. Meyer of Dilworth Paxson in Philadelphia, who represents employers. It’s no longer brand new but has lost none of its relevance:
In the world of Human Resources, “hire slow, fire fast” generally holds true to avoid just about any lawsuit.
Meyer goes on to describe the case of a nursing assistant at a New Jersey senior living center who was written up for absenteeism, rules violations, and insubordination, and put on a series of supposedly final warnings and a “last chance” agreement.
Which is to say, the employer did not follow the maxim of “fire fast.” HR folks can probably guess what happened next: the worker filed a request under the Family and Medical Leave Act (FMLA), a federal law that 1) requires the employer to hold a job open for an absent employee under various circumstances and 2) lays out a minefield of ways the employer can incur liability if it can then be construed as having “discouraged” the request or “retaliated” against it. Much of the gamesmanship of employment law develops from doctrines like retaliation: an underlying claim of discriminatory treatment may be hopelessly weak, but a retaliation claim will succeed in keeping the suit going. (When the Supreme Court very slightly narrowed liability for retaliation in this summer’s case of University of Texas Southwestern Medical Center v. Nassar, the peals of anguish from the legal Left went on for weeks.) In this case the New Jersey senior center stepped on one of the mines: it proceeded to fire the worker over a last-straw-on-the-camel further offense that others testified would normally not count as a firing matter by itself.
If you’re an employer in some region or industry where employees seldom sue, you may be able to offer lenient discipline policies with multiple chances in hopes of breaking the bad habits of an otherwise wanted employee. States like California and Massachusetts, which have laid out drastic legal consequences for employers whose workers do not get full lunch breaks, are also the states where you are likeliest to find seemingly draconian employer policies of firing or disciplining workers caught doing even a tiny bit of work over lunch.
Most experienced HR people I’ve met seem to find it easy to grasp the legal logic of “Hire Slow, Fire Fast.” Why is it so hard for elected lawmakers to grasp?
Daniel J. Ikenson
This morning, Cato published a new study of mine titled “Reversing Worrisome Trends: How to Attract and Retain Investment in a Competitive Global Economy.” The thrust of the paper is that, despite still being the world’s premier destination for foreign direct investment, the U.S. share of the global stock of direct investment fell from 39% in 1999 to 17% today.
This downward trend is attributable to two broad factors. First, developing economies – many of which have achieved greater political stability, sustained economic growth, improved infrastructure, and higher-quality worker skill sets – are now viable options for attracting the kinds of FDI that were once untenable in those locales. Second, a deteriorating business and investment climate in the United States – owing to burgeoning, burdensome, and uncertain regulations; an antiquated, punitive corporate tax system; incoherent immigration, energy, and trade policies; a wayward tort system; cronyism and perceptions thereof; and other perverse policy incentives and disincentives – has pushed investment away.
The first trend should be welcomed and embraced; the second must be reversed. From the study:
Unlike ever before, the world’s producers have a wealth of options when it comes to where and how they organize product development, production, assembly, distribution, and other functions on the continuum from product conception to consumption. As businesses look to the most productive combinations of labor and capital, to the most efficient production processes, and to the best ways of getting products and services to market, perceptions about the business environment can be determinative. In a global economy, “offshoring” is an inevitable consequence of competition. And policy improvement should be the broad, beneficial result.
The capacity of the United States to continue to be a magnet for both foreign and domestic investment is largely a function of its advantages, many of which are shaped by public policy. Considerations of taxes, regulations, trade openness, access to skilled workers, infrastructure, energy policy, and dozens of other policy matters factor into decisions about whether, where, and how much to invest. It should be of major concern that inward FDI has been erratic and relatively downward trending in recent years, but why that is the case should not be a mystery. U.S. scores on a variety of renowned business surveys and investment indices measuring policy and perceptions of policy suggest that the U.S. business environment is becoming increasingly less hospitable.
Although some policymakers recognize the need for reform, others seem to be impervious to the investment-repelling effects of some of the laws and regulations they create. Some see the shale gas and oil booms as more than sufficient for overcoming policy shortcomings and attracting the necessary investment. The most naive consider “American” companies to be tethered to the U.S. economy and obligated to invest and hire in the United States, regardless of the quality of the business and policy environments. They fail to appreciate that increasingly transnational U.S.-based businesses are not obligated to invest, produce, or hire in the United States.
It is the responsibility of policymakers, however, to create an environment that is more attractive to prospective investors. Current laws, regulations, and other conditions affecting the U.S. business environment are conspiring to deter inward investment and to encourage companies to offshore operations that could otherwise be performed competitively in the United States.
A proper accounting of these policies, followed by implementation of reforms to remedy shortcomings, will be necessary if the United States is going to compete effectively for the investment required to fuel economic growth and higher living standards.
Details, charts, analysis, and citations are all included here.
Paul C. "Chip" Knappenberger and Patrick J. Michaels
Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
The U.N.’s Intergovernmental Panel on Climate Change (IPCC) is nearing the final stages of its Fifth Assessment Report (AR5)—the latest, greatest version of its assessment of the science of climate change. Information is leaking out, with some regularity, as to what the final report will contain (why it is secretive in the first place is beyond us).
A few weeks ago, The Economist reported on some of the information leaked from the new IPCC report. The key piece of information concerned the IPCC’s assessment of the equilibrium climate sensitivity—how much the earth’s average surface temperature increases as a result of a doubling of the atmospheric carbon dioxide concentration. As we have been reporting, the research now dominating the scientific literature indicates that the equilibrium climate sensitivity is around 2.0°C. This value is about 40% lower than the average climate sensitivity of the climate models used by the IPCC to make its future projections of climate change, including, among others, those for temperature and sea level rise. The Economist suggested that the IPCC was going to lower its assessed value for the equilibrium climate sensitivity based on the mountain of evidence from the literature, but gave no indication whether the IPCC was also going to lower, accordingly, all the projections made throughout the report.
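As a rough check on the percentage involved, here is a back-of-the-envelope sketch. The model-average sensitivity of 3.3°C is an assumed round figure for illustration; only the 2.0°C literature value appears in the text above:

```python
# Back-of-the-envelope check: a literature-based equilibrium climate
# sensitivity of ~2.0 C per CO2 doubling versus an assumed model-average
# value of ~3.3 C (illustrative figure, not taken from the IPCC report).
literature_ecs = 2.0   # deg C per doubling of CO2 (recent literature)
model_avg_ecs = 3.3    # deg C per doubling of CO2 (assumed model average)

fraction_lower = 1 - literature_ecs / model_avg_ecs
print(f"Literature value is about {fraction_lower:.0%} below the model average")
```

Under these assumed numbers the literature value comes out roughly 40% below the model average, and projections scaled off the higher model sensitivity would be inflated in similar proportion.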
The IPCC has three options:
1. Round-file the entire AR5 as it now stands and start again.
2. Release the current AR5 with a statement that indicates that all the climate change and impacts described within are likely overestimated by around 50%, or
3. Do nothing and mislead policymakers and the rest of the world.
We’re betting on door number 3.
In its article earlier this week reporting on leaked information from the IPCC AR5 report that it had acquired, the New York Times basically proved us right.
The Times article, written by global warming enthusiast Justin Gillis, was spun to play up the perceived horrors of the AR5—that humans have caused the majority of the temperature rise since 1950 (failing to mention that the observed rise is only about 75 percent of the value it was supposed to be according to the IPCC, that the warming rate has been declining, or that new studies suggest decreased aerosol emissions have played a significant role in the observed warming), that the sea level rise was possibly going to be large, dramatic, and dangerous (despite a plethora of new scientific findings to the contrary; see our latest Current Wisdom, for example), that climate change was leading to more and more extreme weather (ignoring that climate change was probably averting more extreme weather than it was creating), and that human-caused greenhouse gas emissions were going to push temperatures rapidly upwards (brushing aside the plethora of new scientific evidence that the future temperature rise will continue to be less than expected, just as it has been for the past 50 years).
[As an aside, you can tell right from the start that an article about anthropogenic climate change resulting from human emissions of carbon dioxide is going to be alarmist if it is accompanied by a picture of a smokestack spewing out water vapor (or anything else, for that matter—after all, carbon dioxide is an odorless, colorless gas). You know that it is going to be completely over the top if the picture of the smokestack spewing water vapor is backlit by the sun, a geometry which makes the water vapor emissions appear black and thus dirty and foreboding. The Times article includes both tricks.]
So if Justin Gillis’s New York Times article is any indication of the actual contents of the upcoming Fifth Assessment Report from the IPCC, or how its contents are going to be spun, it should be plainly obvious that our Option #3 is going to be the chosen course—“Do nothing and mislead policymakers and the rest of the world.”
We can’t say we are surprised.
But neither can we say that the IPCC’s new results will be published without a huge groundswell of pushback from those who won’t be fooled by the IPCC’s misassessment of the current state of climate science.
Stay tuned for the fallout from this mushroom.
Superabundant federal student aid has done a huge amount to get us into our bankrupting college mess. To get us out, today President Obama will propose, essentially, “soft” price controls. But they will likely leave the root problem intact while, if anything, adding new kinds of woe.
On his college bus tour, President Obama will propose that Washington start publishing ratings of schools based on such measures as average tuition, graduation rates, debt and earnings of graduates, and the percentage of a college’s students who are low-income. The ratings would also “compare colleges with similar missions.” Ultimately, the president will propose that the availability of aid be partly conditioned on the new ratings.
Let’s be clear: The price of college is almost certainly far higher than it should be, fueled largely by federal aid that essentially tells colleges “charge whatever you want – we’ll give students the money.” That’s a major reason that average, inflation-adjusted prices have more than doubled in the last 30 years. And it is good that the president, and many others, are essentially acknowledging the inflationary reality of aid. But will price controls help or hurt?
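For scale, the doubling figure above implies a substantial compound growth rate in real prices, as a quick sketch shows (the doubling and the 30-year horizon are from the text; the calculation itself is just compound-growth arithmetic):

```python
# Implied compound annual real growth rate if inflation-adjusted college
# prices doubled over 30 years.
annual_real_growth = 2 ** (1 / 30) - 1
print(f"{annual_real_growth:.2%} per year above inflation")
```

That works out to roughly 2.3 percent per year above general inflation, sustained for three decades.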
Perhaps the first question is, will the controls actually lower prices? As I’ve pointed out before, there is abundant, readily available data about colleges, earnings, etc., and it seems to have little effect on college consumption. Why? Many reasons, but there are almost certainly two major ones. First, the Feds will give almost any student almost any amount of money needed to pursue any major. That eliminates the terrific service that private lenders, who need to take decent risks, would provide: telling people when their college decisions are unrealistic, and not giving them the rope to financially hang themselves. The other big problem is that soundbite-driven politicians love to tell everyone they need to go to college, both driving credential inflation – a degree is often considered an essential even if it has no bearing on the skills one needs for a specific job – and pushing people into expensive degree programs because they think they are necessary to prosper.
Publishing data – and the president promises a “Datapalooza” – simply isn’t likely to help much if aid remains. But what about conditioning receipt of aid on a school’s rating? That might control prices, but it depends on the specifics of the withholding. In particular, what triggers sanctions – for instance, what constitutes raising prices too high, too fast – and what is the punishment per, say, unit of inflation? Alas, while he suggests conditioning Pell Grant and loan terms and amounts on the ratings, the president ultimately punts on these crucial matters, saying that he will convene lots of hearings to work out lots of details, and a final decision won’t be made until 2018.
At least in the near future, then, federal price controls won’t likely exert big, downward pressure on college costs. And despite the huge problem of tuition hyperinflation, that is probably a good thing.
Looking at the ratings criteria, it is pretty clear that private colleges – which can’t keep sticker prices artificially low by taking taxpayer dough upfront – will look disproportionately bad. That may be mitigated somewhat if private institutions are compared only to other private institutions, but even if that is the initial format how long will it be kept? And will for-profit schools – which primarily seek to furnish education for “gainful employment” – be compared to community colleges, which have a similar mission but are much more heavily subsidized upfront? (And which, according to federal data, appear to perform significantly worse than for-profit schools?)
There are also serious problems with making debt and earnings the ultimate end points for judging success or failure. First, on what time frame do you base earnings? One year after graduation? Five years? Ten years? And what about people who are happy to take on debt to study subjects that aren’t highly remunerative but are nonetheless rewarding? As long as the borrowers pay those debts off, why should Washington decide that the institutions are less worthy choices than other schools? Of course the answer is that the Feds are paying, but it is the paying that is the problem, and using that power to “manage” higher education only compounds the ills. Finally, collecting big data, as we all now know, opens us all up for invasions of privacy, as well as empowering even more intrusive federal management to decide who should major in what and where they should do it. Indeed, there are serious cradle-to-grave implications for all of this when coupled with the data collection driven by Race to the Top and other federal education laws.
Finally, there is the problem of judging schools based on how many low-income students they enroll. The unfortunate, but pretty well documented reality is that many low-income people enter higher education, incur big costs, but disproportionately don’t finish. This is no doubt a function of many things – bad K-12 schools, a greater need to work while in college, disproportionately attending weak institutions – but encouraging schools to simply enroll more low-income students won’t solve these problems. And conditioning aid amounts on graduation rates for such students won’t ultimately help. Yes, graduation rates might go up, but colleges could easily pass students along, or push them into easier majors, or do other things that keep the schools out of trouble while giving out largely worthless degrees.
Thanks in large part to federal aid, the price of college has risen astronomically, kneecapping students and taxpayers. Price controls will only mask the root problem while creating new pains of their own. Only phasing out student aid – eliminating the root problem – will get us the higher education system we need.
Ted Galen Carpenter
Officials, pundits, and gun control activists on both sides of the U.S.-Mexico border habitually argue that allegedly lax gun laws in the United States bear heavy responsibility for the drug-related violence in Mexico. As I write over at the National Interest Online, the latest example of that reasoning is a new study from the Council on Foreign Relations arguing that the “flow of high-powered weaponry from the United States exacerbates the soaring rates” of such violence.
It is hardly a new argument. Former Secretary of State Hillary Clinton repeatedly embraced the Mexican government’s view that permissive U.S. gun laws were a major contributor to bloodletting in Mexico. In summit meetings with both the current Mexican president and his predecessor, President Obama has adopted a similar position. But that is merely a convenient scapegoat for the horrific violence in our southern neighbor. The underlying reason for Mexico’s agony is not the easy availability of guns, but the enormous profitability of the illegal drug trade and the various pathologies that it spawns, including violence and pervasive corruption.
Indeed, the argument that supposedly lax U.S. gun laws are a major reason for Mexico’s drug violence is a red herring. That’s not to say that the cartels don’t get some of their weaponry from gun shops, flea markets, pawn shops, and gun shows in the United States, as gun control zealots charge. They do, but they also get them from numerous other sources. As I note in chapter 9 of The Fire Next Door, my latest book on the international drug war, the cartels obtain weapons from the international black market, the armories of Central American countries the U.S. helped fill during the fight against communist insurgents during the 1980s, and even Mexico’s own military depots.
The principal reason the drug gangs can obtain all the firepower they want is that they have vast financial resources at their disposal. Mexico’s share of the annual $300 billion to $350 billion global trade in illegal drugs is estimated to be at least $35 billion, and perhaps as much as $60 billion. The U.S.-led prohibition strategy is largely responsible for that perverse situation. Banning marijuana, cocaine, and other drugs today does not work any better than the prohibition of alcohol did in the 1920s and early 1930s. In both cases, it merely inflated profits and guaranteed that the trade would be dominated by violent criminals.
If we really want to help Mexico curb the carnage that has claimed more than 80,000 lives over the past 6 ½ years, the United States needs to adopt a strategy that de-funds the drug cartels. That means ending prohibition, not pursuing the quixotic goal of tougher gun laws.
Most people know the story of the boy who was rescuing sea stars that had washed up on a beach by throwing them back into the ocean. When a man scoffed to the boy that his efforts didn’t make a difference since he couldn’t save all of them, the boy tossed another sea star back into the ocean and replied, “It made a difference to that one.” The little-known ending to the story is that the boy was sued by the Southern Poverty Law Center for violating the Constitution’s Equal Protection clause.
Sadly, this is only a slight exaggeration. Earlier this week, the Southern Poverty Law Center filed a federal lawsuit contending that Alabama’s new scholarship tax credit program violates the Equal Protection clause and harms the low-income students attending failing public schools whom the law is intended to help:
[SPLC] President Richard Cohen said the new Alabama Accountability Act will take millions away from public schools and will make the failing schools worse than they are now. He said the law was promoted by Republican Gov. Robert Bentley as giving students a way out of failing schools.
“It’s a lie. Our clients do not have a way out of the failing schools that they are in,” he said.
The Montgomery-based law center sued on the opening day of classes for most public schools in Alabama. The suit focuses on a part of the law that allows families with children in Alabama’s 78 failing public schools to move them to a non-failing public school or to a private school that participates in the program. They can get a state tax credit of about $3,500 annually to help cover private school costs.
The lawsuit was filed on behalf of eight plaintiffs who say that they can’t afford to go to private schools and that the non-failing public schools are not accessible. The lawsuit raises equal protection issues.
One of the eight plaintiffs, Mariah Russaw, said she couldn’t afford the transportation costs even if her 12-year-old grandson, J.R., could leave Barbour County Junior High School in Clayton. All junior highs in the Barbour County school system are on the failing list. The nearest non-failing public school is 19 miles away in Pike County. The nearest private school is about 30 miles away, but it is not participating in the program.
The 62-year-old grandmother said it wouldn’t matter if the private school were participating. “I cannot afford to transport him to another school,” she said.
In short, SPLC argues that if the law can’t rescue every child from a failing school, then it shouldn’t be allowed to rescue any child. Not only would this line of reasoning hobble almost every government effort to incrementally address any problem, but the argument also rests on a misunderstanding of the status quo and the law’s likely impact.
The SPLC lawsuit claims that the law “creates two classes of students assigned to failing schools – those who can escape because of their parents’ income or where they live and those, like the Plaintiffs here, who cannot.” In fact, those two classes of students already exist. In our existing education system, low-income families are trapped in failing schools while wealthier families can afford either to live in districts with better public schools or to send their children to private school. The scholarship tax credit program is too limited to solve all the existing inequities, but it moves more students out of the first category and into the second. In other words, by expanding opportunities to low-income families, it makes an already unequal education system more equal.
Moreover, there is no evidence the program harms students who remain in public schools. The SPLC claims that the failing public schools are “likely to deteriorate further as their funding is continually diminished” as a result of students fleeing from those schools. But a mere assertion that harm is “likely” doesn’t cut it. Had the SPLC consulted the research literature instead of their fevered imaginations, they would have discovered that 22 of 23 studies of school choice programs found that such programs have a positive impact on public school performance; the remaining study found no visible impact.
In other words, the increased choice and competition help both the students who participate in the program and those students who remain in their assigned public schools. Striking down the program would thus make matters worse for the litigants and other families like them, not better. Expanding the program would improve outcomes even further. If the SPLC is truly motivated by a desire to help low-income families, it should drop its lawsuit and join the effort to expand educational options. There are lots of sea stars left on the beach and they could use a hand.
Patrick J. Michaels
The Current Wisdom is a series of monthly articles in which Patrick J. Michaels, director of the Center for the Study of Science, reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.
Could President Obama have picked a worse time to announce his Climate Action Plan?
Global warming has been stuck in neutral for more than a decade and a half, scientists are increasingly suggesting that future climate change projections are overblown, and now, arguably the greatest threat from global warming—a large and rapid sea level rise (SLR)—has been shown to be overly lurid (SOL; what did you think I meant?).
You hardly need an “action plan” when there is so little “action” worth responding to.
As I frequently discuss the lack of warming and the decreases in the estimates of future climate change, I’ll focus here on new scientific findings concerning the potential for future sea level rise, interspersing a little travelogue.
Projections of a large sea-level rise this century depend on rapid ice loss from Greenland and/or Antarctica. Yes, as ocean waters warm, they expand, but this expansion-induced rise is pretty well constrained and limited to about 6 inches, plus or minus a couple of inches, by century’s end. And the contribution from melting glaciers/ice in other parts of the world (not counting Greenland and Antarctica) is even smaller, maybe 2-4 inches. So that adds up to about 8-12 inches of sea level rise by the year 2100—not much different from what has already occurred over the past century. This is hardly catastrophic.
So getting a good handle on the contributions from Antarctica and Greenland is essential if you want to develop a reasonable expectation for the future. Lacking a good handle leads to unreasonable projections.
Here is an example of the latter.
A breathless passage from the book version of Al Gore’s An Inconvenient Truth:
I flew over Greenland in 2005 and saw for myself the pools of meltwater covering large expanses on top of the ice. …These pools have always been known to occur, but the difference now is that there are many more of them covering a far larger area of ice. …In Greenland, as in the Antarctic Peninsula, this meltwater is now believed to keep sinking all the way down to the bottom, cutting deep crevasses and vertical tunnels that scientists call “moulins.”
When water reaches the bottom of the ice, it lubricates the surface of the bedrock and destabilizes the ice mass, raising fears that the ice mass will slide more quickly towards the ocean.
…If Greenland melted or broke up and slipped into the sea—or if half of Greenland and half of Antarctica melted or broke up and slipped into the sea, sea levels worldwide would increase by between 18 and 20 feet.
Tony Blair’s advisor, David King, is among the scientists who have been warning about potential consequences of large changes in these ice shelves. At a 2004 conference in Berlin, he said: THE MAPS OF THE WORLD WILL HAVE TO BE REDRAWN. [all caps in original]
Gore went on to include page after page of now and then maps of the world’s major cities after a sea level rise of 20 feet (of course, assuming no adaptive measures put in place).
But Gore’s disaster mechanism has been shown to be impotent. In fact, a collection of recent papers published in the peer-reviewed scientific literature basically dispels all myths foretelling a large sea level rise this century coming from ice loss on Greenland. Recent research on Antarctica largely does the same.
First off, research by Sarah Shannon and 18 co-authors takes direct aim at Gore’s mechanism in their paper “Enhanced basal lubrication and the contribution of the Greenland ice sheet to future sea-level rise.” Here is what they conclude, in direct opposition to Gore’s claims:
Although changes in lubrication generate widespread effects on the flow and form of the ice sheet, they do not affect substantial net mass loss; increase in the ice sheet’s contribution to sea-level rise from basal lubrication is projected by all models to be no more than 5 percent of the contribution from surface mass budget forcing alone.
And “no more than 5 percent” turns out to be, by the year 2100, somewhere between 0 and 3 millimeters, or in English units, a tenth of an inch or less. Some disaster. Certainly “18 to 20 feet” is a lot scarier, but it is just plain wrong.
Another new study looks at (among other things) the sea-level rise effect of the acceleration of the discharge rate of those Greenland glaciers that empty directly into the sea. Heiko Goelzer and fellow researchers found that after an initial bump in the contribution to sea level rise as these glaciers retreat, once they draw back to the grounding line—the point where the outlet glaciers stop floating and instead rest on the bedrock—the loss rate slows dramatically. They conclude that dynamical changes to the flow rate of outlet glaciers may contribute between 8 and 18 millimeters of sea level rise by the year 2100. That is about a quarter to three-quarters of an inch. Again, not even close to a disaster.
Here’s your climate news scoop of the day: The highest discharge-volume glacier in the entire Northern Hemisphere—Greenland’s Jakobshavn—has grounded, which is really going to put the kibosh on the Greenlandic myth. Here’s a picture I took from my own Greenland sojourn* earlier this summer. It shows the southern end of Jakobshavn glacier, on June 24.
Looking south along the calving front of the Jakobshavn glacier, June 24, 2013. Photo by Patrick Michaels.
You can see that it is grounded over most of its humongous 10-kilometer face. The calved ice drops off in smaller chunks, dramatically reducing the size of the bergs that will eventually float down the spectacular Ilulissat Icefjord.
A small portion of the glacier was perhaps still floating when I was there, right near the north end, as indicated by a reduction in the height of the calving face, as shown in this photo.
Looking north along the calving front of the Jakobshavn glacier, June 24, 2013. Photo by Patrick Michaels.
As a tidewater glacier, Jakobshavn regularly calves some tremendous icebergs that take a couple of years to make their way down the 35-mile fjord, only to ground on the terminal moraine near Ilulissat (and conveniently located in view of the Hotel Arctic’s live webcam, here). Because the glacier has largely grounded, these bergs are not the giants that they once were (although some sizeable icebergs continue to be produced in the early summer as the floating ice tongues established in the winter break up). Hie thee to Ilulissat! The sooner the better!! Presumably some views through the webcam (which was near my room) will convince you!
(The terminal moraine near Ilulissat dates to the end of the Little Ice Age—meaning that the productive fishery at the mouth of the fjord was probably inaccessible. Farther south, such an expansion of ice no doubt covered much of the Viking pastureland, chasing them elsewhere (perhaps including North America).)
A third new study examined the direct contribution of changes in the surface mass balance (SMB) of Greenland (that is, total runoff from ice melting minus total gains from enhanced snowfall) to future sea level rise (they did not consider ice loss from changes in glacier flow speed). In their study “Estimating the Greenland ice sheet surface mass balance contribution to future sea level rise using the regional atmospheric climate model MAR,” Xavier Fettweis and colleagues found that declines in the SMB by the year 2100 led to somewhere between 2 centimeters and 13 centimeters of sea level rise, depending on the carbon dioxide emissions scenario used in their model. That’s somewhere between 1 and 5 inches (and these projections are based on climate models which, according to the latest science, overestimate future warming by some 70 percent).
So adding all of these effects up—basal lubrication, glacial dynamics, and enhanced melting—the total global sea level rise by the end of the 21st century originating from Greenland projected by the latest, greatest scientific studies averages out to be maybe 3 to 4 inches. Ho hum.
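The tally above can be checked with a quick back-of-envelope calculation. The ranges below are simply the figures cited from the three studies (Shannon et al., Goelzer et al., and Fettweis et al.); the midpoint of the summed range lands right around the “3 to 4 inches” claimed:

```python
MM_PER_INCH = 25.4

# Projected Greenland contributions to sea level rise by 2100 (millimeters),
# as reported in the three studies cited above.
contributions_mm = {
    "basal lubrication (Shannon et al.)": (0, 3),
    "outlet-glacier dynamics (Goelzer et al.)": (8, 18),
    "surface mass balance (Fettweis et al.)": (20, 130),  # 2-13 cm
}

# Sum the low and high ends of each range.
low = sum(lo for lo, hi in contributions_mm.values())
high = sum(hi for lo, hi in contributions_mm.values())
mid_inches = (low + high) / 2 / MM_PER_INCH

print(f"Total: {low}-{high} mm "
      f"({low / MM_PER_INCH:.1f}-{high / MM_PER_INCH:.1f} inches); "
      f"midpoint about {mid_inches:.1f} inches")
```

The summed range runs from about an inch up to roughly 6 inches, with a midpoint of about 3.5 inches.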
Like I said, sea level rise disaster scenarios that are dreamed up by Greenland shedding large volumes of ice (a la Al Gore, Jim Hansen, etc.) are SOL.
Fettweis, X., et al., 2013. Estimating the Greenland ice sheet surface mass balance contribution to future sea level rise using the regional atmospheric climate model MAR. The Cryosphere, 7, 469-489.
Goelzer, H., et al., 2013. Sensitivity of Greenland ice sheet projections to model formulations, Journal of Glaciology, 59, 733-749, doi:10.3189/2013JoG12J182
Shannon, S., et al., 2013. Enhanced basal lubrication and the contribution of the Greenland ice sheet to future sea-level rise. Proceedings of the National Academy of Sciences, doi:10.1073/pnas.1212647110
*Get that ticket to Greenland pronto! Travel hint: the shortest route is through Reykjavik on Iceland Air and then on Air Iceland to Ilulissat. Reserve in advance and you can get a Saga Class (business) seat for pretty cheap compared to the Majors (which will take you all the way to Copenhagen and then backtracking on Air Greenland’s A330 to Kangerlussuaq (Sondre Stromfjord) and an additional connection to Ilulissat, i.e. $$$$).
Phew. That was close.
Earlier this month, Secretary of Agriculture Tom Vilsack said that without a new farm bill to replace the 2008 farm bill, the USDA would not have the authority or the funds to continue paying the $147 million annual bribe we agreed to pay Brazil in 2010 as part of a trade deal. (The full details are available in this blog post, written at the time of the deal, and more about the underlying trade dispute is available in this 2005 policy analysis by Cato Adjunct Scholar Dan Sumner.) And without those bribes, Brazil would likely suspend the ceasefire deal and retaliate against U.S. export interests by raising import taxes and suspending its obligations to protect Americans’ intellectual property. So, Mr. Vilsack implied, Congress needs to pass a farm bill now, and include changes to the cotton program that would satisfy the Brazilians and prevent a trade war.
Well, stand down, America, because according to some unnamed trade experts quoted by Inside U.S. Trade today [$], Mr. Vilsack’s analysis is not exactly correct. He may even be lying:
Agriculture Secretary Tom Vilsack misconstrued the facts, or was at least misleading, when he claimed last week that the U.S. government will lose the authority on Oct. 1 to continue paying Brazil $147 million annually under a temporary settlement to a World Trade Organization dispute, according to four non-government experts.
The statement, these experts agreed, was clearly aimed at pressuring Congress to pass a new farm bill and thereby resolve the longstanding fight with Brazil over agricultural subsidies…
But a decision on whether to end that authority is clearly within the purview of the administration – not Congress, these experts said. In other words, if the authority is expiring this fall, it is only because the administration has determined internally that it wants it to expire and does not want to continue making the payments, they said…
“In my opinion, if the farm bill were extended again for one year and the Brazilians were OK with another year’s worth of payments and there was an agreement from both sides, Vilsack would have the authority to continue payments,” one Senate aide said. “He and the [administration] just don’t want to and they are using this as leverage” to try and secure passage of a new farm bill, the aide added.
Numerous other sources also pointed out that the administration did not gain new authority from Congress when it instituted the payments in 2010, and said there is no reason to believe it would need such an act of Congress now in order to maintain its current ability to pay Brazilian cotton growers. No appropriations legislation has been passed in several years, and this has not affected the payments at all…
Another agriculture industry source also said the secretary’s implication that passage of the farm bill would fix the dispute with Brazil was also misleading. Brazil has continuously criticized the pending farm bill proposals for potentially increasing the level of government support for cotton farmers compared with current levels, and for not making any changes to a WTO-faulted export credit guarantee program also operated by the CCC.
Vilsack also said last week that, due to sequestration, the administration would not be able to pay the full amount to Brazil this year either (Inside U.S. Trade, Aug. 9). Observers also said it was unclear to them whether this was really the case, or if this was an elected choice by the administration designed to increase pressure on Congress. [all emphases added]
So it looks like the mad panic to pass a farm bill is unjustified, at least as far as the Brazil cotton dispute is concerned. Mr. Vilsack, whose agency clearly has a vested interest in the farm bill’s passage given that it justifies the USDA’s very existence, may have been stretching the truth in pursuit of his goal. The USDA can keep paying the bribes whatever happens, and the farm bill proposals under consideration are unlikely to satisfy the Brazilians anyway.
The ideal solution, of course, would be to do away with U.S. cotton supports (and all other agricultural programs) altogether. We could save taxpayers and consumers billions of dollars, and put that $147m to better use. That would be a farm bill worth passing immediately.
I don’t know whether it is intentional promotion of the Common Core national curriculum standards or an honest but failed effort to describe objectively what the Core is, but recent polling on the Core has been heavily slanted to elicit pro-Core responses.
Case in point: the newest Education Next public opinion poll, which in the past has made terrific efforts to compensate for wording in other polls seemingly designed to elicit negative results against school choice. But on Common Core? Just read the question for yourself (#32 on the questionnaire):
As you may know, all states are currently deciding whether or not to adopt the Common Core standards in reading and math. If adopted, these standards would be used to hold the state’s schools accountable for their performance. Do you support or oppose adoption of the Common Core standards in your state?
First and foremost, that “all states are deciding” whether or not to adopt the Core is just incorrect. Some states are contemplating leaving the Common Core, but almost all states decided they would adopt in 2010. Many, of course, did so in a rush to get federal Race to the Top money. Indeed, federal coercion–and the flash adoption it spurred–are two of the biggest objections to the Core, and this question acts as if those hugely controversial things never happened.
Second, how many people, knowing little else about the Core, are going to oppose something that generically will hold “schools accountable for their performance?” Probably not many. And the fact is the Core does not hold anyone accountable for performance. That would be the role of tests coupled with sanctions, not the Core itself. Core supporters love to bash opponents for attributing to the Core things that do not directly come from it–data mining, squeezing out literature–but seem to have no trouble wrongly attributing positive things directly to it.
It’s no wonder the Education Next pollsters found big support for the Core, but faster-rising opposition: Much support likely comes from respondents only knowing what the pollsters tell them, while opposition is almost certainly coming primarily from people who over the last year have become aware of the reality of the Core, and don’t like it.
Just as bad as the Education Next poll is the AP-NORC “National Education Survey” that came out a few days ago, though it does furnish one very useful piece of information: more than half of respondents knew “little” or “nothing” about the Core, showing how influential a leading question could be. Unfortunately, it then provides just such a question (Q30), saying that “the objective of the Common Core is to provide consistent, clear standards across all states for students in grades K-12.” Who, knowing little to nothing about the Common Core, is going to oppose “consistent, clear standards?” That there is big debate about how consistent and clear they are is in no way indicated in the question, and, not surprisingly, it gets a plurality to say they think the Core will “improve the quality of education.” Perhaps the amazing thing is that it didn’t get a majority to say that.
In the end, whether national standards are a good or bad policy doesn’t have a lot to do with public opinion polls. But wouldn’t it be nice if the polls weren’t obviously slanted toward pro-Core outcomes?
… and you’re not following developments in Fourth Amendment law.
Jeffrey Toobin is the latest to claim that Smith v. Maryland settles the Fourth Amendment issues around the National Security Agency’s acquisition of data about every call made in the United States. He even links to the text of the decision in a recent blog post.
The majority opinion in Smith did say that people don’t have such expectations, but that rationale is weak, and the facts of Smith are inapposite to the present controversy. I think that’s easily gathered from reading the case with awareness of legal currents.
Here’s what happened in Smith:
On March 5, 1976, in Baltimore, Md., Patricia McDonough was robbed. She gave the police a description of the robber and of a 1975 Monte Carlo automobile she had observed near the scene of the crime. After the robbery, McDonough began receiving threatening and obscene phone calls from a man identifying himself as the robber. On one occasion, the caller asked that she step out on her front porch; she did so, and saw the 1975 Monte Carlo she had earlier described to police moving slowly past her home. On March 16, police spotted a man who met McDonough’s description driving a 1975 Monte Carlo in her neighborhood. By tracing the license plate number, police learned that the car was registered in the name of petitioner, Michael Lee Smith.
The next day, the telephone company, at police request, installed a pen register at its central offices to record the numbers dialed from the telephone at petitioner’s home. The police did not get a warrant or court order before having the pen register installed. The register revealed that on March 17 a call was placed from petitioner’s home to McDonough’s phone. On the basis of this and other evidence, the police obtained a warrant to search petitioner’s residence. The search revealed that a page in petitioner’s phone book was turned down to the name and number of Patricia McDonough; the phone book was seized. Petitioner was arrested, and a six-man lineup was held on March 19. McDonough identified petitioner as the man who had robbed her. (citations omitted)
It is not possible to argue honestly that the facts of Smith are anything like the NSA’s bulk data collection. The police had weighty evidence implicating one man. The telephone company voluntarily applied a pen register, collecting analog information about the use of one phone line by that one suspect. I can’t think of a factual situation further removed from the NSA’s telephone calling surveillance program.
Apologists for NSA spying have to rest their argument entirely on the Smith Court’s conclusion that there is no “expectation of privacy” in phone dialing information. But this is an unsafe resting place for at least two reasons.
First, the Court decided the Smith case wrongly, misapplying the “reasonable expectation of privacy” test, as courts often do. Randy Barnett and I pointed this out in our recent brief to the Supreme Court:
Justice Blackmun inaccurately applied [“reasonable expectation”] doctrine. The question whether a person has an actual (subjective) expectation of privacy is a question of fact, but the Court treated it as an objective question, denying the possibility of such an expectation. (“[I]t is too much to believe that telephone subscribers, under these circumstances, harbor any general expectation that the numbers they dial will remain secret.”) Having misapplied the subjective part of the Katz test, the Court appears also to have botched the objective part. Justice Blackmun marshaled arguments for the position that an expectation of privacy is unreasonable, but made no comparing or contrasting mention of counterarguments. Most likely, he treated the objective part of the Katz test subjectively, universalizing his own opinion as though it were the one true opinion on privacy around telephone dialing information. (citations omitted)
But more importantly, the Supreme Court is moving away from the “reasonable expectation of privacy” test entirely. In major Fourth Amendment decisions like Kyllo (2001) and Jones (2012), the Court did not use the “reasonable expectation of privacy” test to examine scanning of a home with a thermal imager and tracking of a vehicle with a GPS device. (Both require a warrant.) In Jardines, decided last term, the Court did not use the “reasonable expectation of privacy” test in finding that walking a drug-sniffing dog to the front door of a home was a Fourth Amendment search also requiring a warrant. As I said in a blog post about this minor victory for the Fourth Amendment, “Any case is a good case if it declines to use the failed ‘reasonable expectation of privacy’ doctrine.”
So, does Smith dispense with our Fourth Amendment interests in phone dialing information? Here’s what Justice Sotomayor said about that in Jones:
[I]t may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks…. I would not assume that all information voluntarily disclosed to some member of the public for a limited purpose is, for that reason alone, disentitled to Fourth Amendment protection.
I’m going to do you a favor here: Don’t bet on Smith v. Maryland.
With dead protesters littering the streets of Cairo, Secretary of State John Kerry’s theory that Egypt’s military rulers “were restoring democracy” isn’t looking very good. The dead won’t be able to vote in the new and improved Egypt.
Instead of acting as the regime’s enabler, the Obama administration should cut off foreign “aid.” If there is influence for Washington to exercise, officials should do so quietly and informally.
Unfortunately, U.S. policy toward Egypt has rarely focused on the Egyptian people. The $75 billion in “aid” was largely a payoff to successive dictators and their military Praetorian Guards. Washington worried about “stability,” not democracy.
Hosni Mubarak was finally overthrown in 2011. In last year’s presidential election, the Brotherhood’s Mohamed Morsi defeated Mubarak’s last prime minister, Ahmed Shafik. The secular liberals were not a factor.
Morsi failed to establish his organization’s democratic bona fides, and especially to reach out to disaffected Egyptians who only reluctantly backed him. But his opponents were little better, while the Mubarak state remained largely intact and undercut him at every turn.
It would have taken extraordinary skill, forbearance, and luck, none of which President Morsi possessed, to have succeeded. Had the opposition simply waited, Morsi would have discredited political Islam—democratically. In this way, argued Reuel Marc Gerecht of the Foundation for Defense of Democracies: “The Egyptian military may have snatched defeat from the jaws of victory.”
Instead, Morsi’s disparate opponents backed the Supreme Council of the Armed Forces (SCAF) in staging the July 3 coup: the president removed, his top aides arrested, his movement’s media shuttered and journalists arrested, the president and others charged with fanciful offenses, and his supporters gunned down in the streets.
Certainly it was an odd way to go about “restoring democracy.” David Kramer, Freedom House’s president, cited a “significant decline in most of the country’s democratic institutions” after Morsi’s ouster. What the al-Sisi government actually restored was the old Mubarak structure.
The Brotherhood resisted the military’s demand for abject surrender. After meeting with government officials Sen. Lindsey Graham (R-SC) said: “You could tell people were itching for a fight.” Indeed, reported the Washington Post, “Two weeks before the bloody crackdown in Cairo, the Obama administration, working with European and Persian Gulf allies, believed it was close to a deal to have Islamist supporters of ousted President Mohamed Morsi disband street encampments in return for a pledge of nonviolence from Egypt’s interim authorities. But the military-backed government rejected the deal and ordered its security forces to break up the protests.”
The military government acknowledged over 600 dead, and the toll almost certainly was much higher. Many of the killings appeared to be deliberate, the result of army snipers. Sherif Mansour of the Committee to Protect Journalists decried the “systematic” targeting of the press.
The slaughter in Cairo sparked more violence nationwide, including Brotherhood attacks on government buildings and Coptic churches. Although the army has the near-term advantage, the movement has survived prior attempts at suppression. Moreover, the government is encouraging the rise of a more radical and violent leadership. Al-Qaeda’s head, Ayman al-Zawahiri, was a member of the Brotherhood and imprisoned and tortured during a prior crackdown.
Continuing civil disorder and violence is almost certain. Terrorism is possible. The kind of strife in Iraq after the U.S. invasion and Algeria in the 1990s also is a risk. Of course, in any such conflict there will be little room for liberal and democratic values.
The Obama administration has ignored U.S. law requiring an aid cut-off after a coup because it wanted to preserve its “leverage.” Unfortunately, Washington has consistently demonstrated its impotence in Cairo. Most recently, Washington has been begging the military to promote reconciliation, without evident success.
The carnage in Cairo mimics that in Beijing’s Tiananmen Square. To subsidize Cairo today is to underwrite murder. Washington’s best policy is to support neither side and leave this tragic conflict to the Egyptian people.
Occupational licensing laws make it harder and more expensive for people to get jobs or to create innovative businesses that might not fit into the conceptual box designed by the last generation’s regulators. Worse, while these laws are supposed to be about protecting consumers against dangerous or inept practitioners, they’re often exploited by existing businesses to bar newcomers from competing against them.
But these problems are nothing compared to “Certificate of Public Convenience and Necessity” laws, also called “Certificate of Need” or CON laws. Unlike typical licensing rules, CON laws don’t have anything to do with whether a person is educated or qualified. Instead, they prohibit you from going into business unless you first prove to bureaucrats that a new business is “needed.” And these laws rarely define “need,” or explain how to prove it. Still worse, such laws usually allow existing firms to block a newcomer from starting a competing business. In short, CON laws bar you from going into business until you get permission from your own competitors. (It sounds like something from an Ayn Rand novel, right?)
Last week, Cato adjunct scholar Timothy Sandefur and his colleagues at the Pacific Legal Foundation filed a motion with a federal judge in Kentucky asking the court to strike down that state’s CON law for moving companies. The details are here, and they’re telling.
There have been 39 applications for new moving licenses since 2007. Those that were not “protested” by existing moving companies were approved without incident. But in 19 cases, existing firms did object. And in all of those cases, one of two things happened: either the applicant gave up and abandoned the application, or the government denied it on the grounds that existing moving services were “adequate.” The state never approved an application that was protested by existing firms, no matter what. In one case, an applicant who’d been working for moving companies for 39 years was denied a license in a decision that declared him fully qualified–but said existing companies didn’t need the competition. No wonder Sandefur calls the law “the Competitor’s Veto.”
Notably, of the 114 “protests” filed against applications for new moving licenses in the past five years, all said that the reason for protesting was that a new moving company would cause competition. None even alleged that the applicant was dangerous or incompetent or dishonest.
Not only is Kentucky’s CON law explicitly designed to protect established companies against entrepreneurs who want to work hard to support themselves, but it is also incomprehensibly vague. What is a “need”? What qualifies as “adequate”? Nobody knows, and state officials testified under oath that they don’t use any objective standards when making such determinations.
The Bluegrass State, by the way, isn’t the only example of this sort of thing. In an article forthcoming in the George Mason Civil Rights Law Journal, Sandefur demonstrates the same pattern in Missouri, where he challenged the constitutionality of a very similar law. There, too, all of the 110 protests filed against moving license applications were filed by existing firms, and all explicitly said that their only reason for objecting was to avoid competition. (That law was repealed last year.)
The Supreme Court has made clear that licensing requirements must focus on the applicant’s “fitness or capacity to practice the profession,” and it has invalidated CON laws that only protect cartels against competitors. Like those laws, Kentucky’s CON law isn’t about protecting the public. It’s based on fallacious and outdated economic theories that saw competition as wasteful and inefficient. Economists now generally agree that competition is the source of efficiency – and that when government tries to decide what kind of businesses are “needed,” that power will be captured by private interests seeking to benefit themselves at the expense of consumers and entrepreneurs. Here’s hoping the court strikes down the Kentucky anti-competition law and enforces the constitutional right to earn a living.
Daniel J. Mitchell
As regular readers know, one of my great challenges in life is trying to educate policy makers about the Laffer Curve, which is simply a way of illustrating that government won’t collect any revenue if tax rates are zero, but also won’t collect much revenue if tax rates are 100 percent. After all, very few people will be willing to earn and report income if the government steals every penny.
In other words, you can’t estimate changes in tax revenues simply by looking at changes in tax rates. You also have to consider changes in taxable income. Only a fool, for instance, would assume that you can double tax revenue by doubling tax rates.
But how do you explain this to the average person? Or, if you want a bigger challenge, how do you get this point across to a politician?
Over the years, I’ve picked up a few teaching examples that seem to be effective. People are always shocked, for example, when I show them the IRS numbers on how rich people paid a lot more tax when Reagan cut the top tax rate from 70 percent to 28 percent.
And they’re also more likely to understand why class-warfare tax policy won’t work when I show them the IRS data on how upper-income taxpayers have considerable control over the timing, level, and composition of their income.
Perhaps my favorite teaching technique, though, is to ask folks to pretend that they’re running a restaurant and to think about what might happen to their sales if they double the price of hamburgers. Would it make sense to assume that they would get twice as much revenue?
Almost everybody understands that hamburger sales would plummet and that they would likely lose revenue.
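The hamburger example can be made concrete with a little arithmetic. Here is a minimal sketch in Python, assuming a hypothetical constant-elasticity demand curve (the $5 base price, 1,000 daily sales, and elasticity of -1.5 are all illustrative numbers, not data from the post):

```python
# Toy illustration of the hamburger example: doubling the price does not
# double revenue once customers respond. The demand curve and all the
# numbers below are hypothetical assumptions chosen only for illustration.

def quantity_demanded(price, base_price=5.0, base_quantity=1000, elasticity=-1.5):
    """Constant-elasticity demand: quantity sold falls as the price rises."""
    return base_quantity * (price / base_price) ** elasticity

def revenue(price):
    """Revenue is simply price times quantity sold at that price."""
    return price * quantity_demanded(price)

r_before = revenue(5.0)    # revenue at the original $5 price: $5,000
r_after = revenue(10.0)    # revenue after doubling the price to $10

print(f"Revenue at $5:  ${r_before:,.0f}")
print(f"Revenue at $10: ${r_after:,.0f}")
# With an elasticity of -1.5, sales fall by enough that revenue actually
# DROPS when the price doubles -- it certainly does not double.
```

The same logic is the essence of dynamic scoring: the revenue effect of a tax-rate change depends on how the tax base responds, not on the rate change alone.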
Well, great minds (or at least wonky minds) think alike, because the Tax Foundation has released a great video on dynamic scoring that uses donuts to make the same point: “The Economics of Tax Reform: Lessons from the Donut Shop.”
The video suggests that it would be a good idea to modernize the revenue-estimating process.
I fully agree. The Joint Committee on Taxation, which is responsible for revenue estimates on Capitol Hill, is notorious for using antiquated and biased methodology.
I elaborate (and use my hamburger example) in this video I narrated for the Center for Freedom and Prosperity: “The Laffer Curve, Part III: Dynamic Scoring.”
P.S. The Joint Committee on Taxation also is responsible for producing biased estimates of so-called tax expenditures.
P.P.S. Only 15 percent of CPAs (the folks who see first-hand how taxes impact behavior) agree with the Joint Committee on Taxation’s methodology.
In a nation with a strong tradition of holding major political contests in years divisible by the number two, politicos are mostly confined to chirping about distant elections during odd-numbered years. The exceptions in the year following a presidential election are New Jersey and Virginia, which hold their gubernatorial elections. In addition, due to the passing of Senator Frank Lautenberg, New Jersey will hold a special election to the U.S. Senate. In all three elections, one or both of the major candidates have made school choice an issue. That makes sense because school choice is increasingly popular, especially once implemented. Unfortunately, while the candidates should be commended for promoting school choice policies in general, their specifics leave much to be desired.
Last week, the Republican gubernatorial candidate in Virginia, Ken Cuccinelli, unveiled an education plan calling for an expansion of the state’s scholarship tax credit program (or the creation of a separate program) that would direct funds to students currently attending a failing public school. However, what Virginia’s scholarship tax credit program really needs is the policy equivalent of Extreme Home Makeover to remove unnecessary regulations on private schools, shift administration of the program to the Department of Revenue, increase the credit amount, and expand the uses of the scholarships beyond just tuition. As Andrew Coulson has demonstrated, it is the least regulated, most market-like private schools that do the best job of serving families.
In New Jersey, Governor Chris Christie is once again advocating for a scholarship tax credit program, just as he had promised in 2009. Thus far, Christie has not fulfilled his promise. While Christie has repeatedly included a tax credit program in his proposed budgets, he has also repeatedly signed budgets lacking such programs. Certainly Christie faces a hostile legislature on this issue, but he has proven capable of getting his top priorities through. Private school choice does not appear to be one of them. That said, recognizing the need for competition, Christie did implement a modest public school choice law and has helped transform some traditional district schools into charter schools. It’s certainly possible that if reelected, he might spend his enhanced political capital on finally enacting private school choice. Color me skeptical, but if the latest polls are any indication, voters will give Christie the opportunity to finally keep his promise.
In the New Jersey Senate race, both candidates have declared their support for school choice. Indeed, the issue has become something of a political football, with former mayor Steve Lonegan accusing Mayor Cory Booker of not being sufficiently pro-school choice:
“It is time for Cory Booker to man up and say once and for all whether he will support school vouchers if he is elected to the U.S. Senate or will he join President Obama in shutting down school voucher programs…Cory had seven years to give low-income students in Newark a chance at receiving a quality education. Instead, he has offered platitudes and vague statements.”
The attack is somewhat disingenuous since a voucher program would have to be enacted at the state level, not the local level. Some commentators see the attack as an effort to drive a wedge between Booker and his base to dampen support on election day, though Booker’s support for vouchers didn’t hurt him in the primary. What’s not clear is how either candidate’s support for school choice will translate into policy in the Senate. Supporting the Washington, D.C. voucher program is certainly laudable, but pushing for a national voucher program would be misguided.
In summary, it is encouraging that the popularity of school choice programs has translated into greater political support, but this year’s elections don’t offer much for school choice advocates to get excited about.
A new essay at Downsizing Government focuses on infrastructure investment. The essay discusses problems with federal infrastructure spending and the advantages of privatizing infrastructure to the full extent possible.
Unfortunately, the current administration’s infrastructure policy has been mainly focused on increasing spending on misguided activities such as high-speed rail. But here are some of the problems with such a federal-led approach to infrastructure:
- Investment is misallocated. Federal investments are often based on political pork-barrel factors rather than actual marketplace demands. Amtrak investment, for example, has long been spread around to low-population areas where passenger rail makes little economic sense. Most of Amtrak’s financial losses come from long-distance routes through rural areas that account for only a small fraction of all riders. Every lawmaker wants an Amtrak route through their state, so investment gets misallocated away from where it is really needed, such as the Northeast corridor.
- Infrastructure is utilized inefficiently. Government infrastructure is often utilized inefficiently because supply and demand are not balanced by market prices. The vast water infrastructure operated by the Bureau of Reclamation, for example, greatly underprices irrigation water in the western United States. The result is wasted resources, harm to the environment, and a looming water crisis in many areas in the West.
- Investment is mismanaged. Federal agencies don’t have the strong incentives that businesses do to ensure that infrastructure projects are constructed and operated efficiently. Federal highway, energy, airport, and air traffic control projects, for example, have often experienced large cost overruns. The Big Dig in Boston—which was two-thirds funded by the federal government—exploded in cost to five times the original estimate. And over much of the last century, the Army Corps of Engineers and the Bureau of Reclamation were known for spending on boondoggle projects, distorting their analyses, harming the environment, and spending on projects to further private interests rather than the general public interest.
- Mistakes are replicated across the nation. Perhaps the biggest problem with federal intervention in infrastructure is that when Washington makes mistakes it replicates them across the nation. High-rise public housing projects, for example, were a terrible idea that federal funding helped spread nationwide. Federal subsidies for light-rail projects have biased cities to opt for these expensive systems, even though they are generally less efficient and flexible than bus systems. High-speed rail represents another federal effort to induce the states to spend money on uneconomical infrastructure.
- Burdensome regulations. A final problem with federal infrastructure spending is that it usually comes part and parcel with piles of regulations. Federal Davis-Bacon labor rules, for example, raise the cost of building state and local infrastructure. In general, federal regulations impose one-size-fits-all solutions on the states even though the states may have diverse infrastructure needs.
Many policymakers are concerned that America maintain top-notch infrastructure to compete in the global economy. But the best way forward is for the federal government to cut subsidies and to devolve control over infrastructure to state and local governments. To meet demands for new infrastructure capacity, the states should innovate with privatization.
America’s entrepreneurs are looking for new opportunities. So let’s give them a crack at improving the nation’s infrastructure by reducing federal subsidies and regulations that deter private investment in airports, highways, and many other facilities.
For more, see www.downsizinggovernment.org/infrastructure-investment.