Cato Op-Eds

Individual Liberty, Free Markets, and Peace
Subscribe to Cato Op-Eds feed

Chris Edwards

Americans are concerned about the performance of the federal bureaucracy. Many people think that federal workers are overpaid and underworked. Some recent news stories provide fresh input to the debate.

A story yesterday at GovExec.com regards pay and performance. The federal pay structure is less efficient than private pay structures because it is generally based on seniority, not job performance. But GovExec.com finds that attempts to introduce federal performance pay have not worked very well either:

Most federal agencies are not making meaningful distinctions in performance ratings and bonuses for senior executives, according to a new watchdog report. About 85 percent of career senior executives received “outstanding” or “exceeds fully successful” ratings in their performance reviews between fiscal years 2010 and 2013, at the same time that agencies have made smaller distinctions in the amount of individual bonuses, the Government Accountability Office found. This has created a system where nearly everyone is considered outstanding…

The level of federal pay is the focus of another recent story. GovExec.com reports on the large number of workers who enjoy high pay:

More than 16,900 federal employees took home in excess of $200,000 in base salary in 2014, according to a partial database of federal salary data.

The report is based on data from FedSmith.com, which is an excellent source of federal workforce information. FedSmith’s database can list employees and their salaries by agency. For example, there are 159 people at the Small Business Administration who made more than $150,000 in wages in 2014. That’s 159 too many in my view, as the agency should be closed down.

Another recent article regards federal firing. The Federal Times confirms the extraordinarily low firing rate in the federal government compared to the private sector:

Even as lawmakers press for greater accountability within government, agencies have fired fewer employees than at any time in the last 10 years, according to data from the Office of Personnel Management.

Agencies fired 9,537 federal employees for discipline or performance issues in fiscal 2014, down from 9,634 in 2013 and down from a high of 11,770 in fiscal 2010, according to the data. The firing rate held at 0.46 percent of the workforce in both fiscal 2013 and fiscal 2014 — the lowest rate in 10 years.

The private sector fires nearly six times as many employees — about 3.2 percent — according to the Bureau of Labor Statistics, and whether the government fires too few people or just not the right people is the subject of continued debate.

For more on the federal workforce, see here.

Ilya Shapiro

This morning the Supreme Court ruled in Yates v. United States that Sarbanes-Oxley—the massive legislation prompted by the accounting scandals of the early 2000s—can’t be used to prosecute a fisherman who caught undersized grouper.  It makes eminent intuitive sense. Luckily, it’s also correct as a matter of statutory interpretation. That is, even though the relevant provision (Section 1519) punishes those who would knowingly destroy or conceal “any record, document, or tangible object” in order to impede an investigation, Justice Ginsburg is correct in writing for the plurality that “it would cut §1519 loose from its financial-fraud mooring to hold that it encompasses [objects not] used to record or preserve information.”

And Justice Alito, in a narrow concurrence that ultimately controls the case, is even more correct to apply traditional canons of statutory construction—the rules that guide judges in interpreting laws—and thereby find that “tangible object,” in the context of the list of nouns that are Sarbanes-Oxley’s target, refers to “something similar to records or documents.” In a colorful opinion rife with salamanders, crocodiles, and oil derricks, Alito asks the correct question: “How does one make a false entry on a fish?”

As Cato wrote in our brief, words such as “record” and “document” modify the term “tangible object” to include things like hard drives and floppy disks (remember those?), not grouper. Moreover, an all-encompassing reading of “tangible object” would render the words “record” and “document” unnecessary. And the broader context of Sarbanes-Oxley illuminates the relevant meaning here: The Act focuses on financial fraud in the context of companies, not fauna. Thus, the words “tangible object” should be read differently in Sarbanes-Oxley than they would be in, say, the Federal Rules of Criminal Procedure.

If the term “tangible object” were read as broadly as the government wished, it could criminalize an unfathomable range of activities, from throwing away cigarette butts to washing away footprints in the sand. It wouldn’t provide adequate notice about potential legal violations, which individuals have a right to so they can plan their actions accordingly and avoid getting caught in government nets.

After all, prosecutors and law enforcement officials can’t arbitrarily expand the range of criminal offenses as if they themselves were fishermen, exaggerating the size of their catches to a credulous legal system.

Doug Bandow

Christianity is thriving in China. There may be more religious believers than Communist Party members. 

Beijing’s sensitivities to religion are well-known.  Religion offers a competitive worldview to the Party.  The latter fears many Christians, especially Catholics, have loyalties beyond China’s borders.  Religion brings people together in ways that might eventually influence politics.

In its early days, the People’s Republic of China responded harshly to religious activity, but official policy has moderated over time.  There is an increasing amount of reluctant toleration of religious belief. 

Beijing appears to have a more relaxed policy.  Last year, I visited a church of around 800 in the capital.  It operated openly, attracted many young people, and hosted dozens of baptisms on the Sunday I attended.  I saw a car in traffic that sported the traditional Christian “fish.” 

Ironically, the lesson of the West’s experience with religion is that the best way for a government to avoid conflict between religious believers and political authorities is to provide the greatest freedom possible.  Obviously, there have been many strains of Christianity throughout the centuries.  However, the faith emphasizes a transcendent commitment to God while accommodating many different political perspectives.

The Apostle Paul, whose ministry benefited from the order imposed by the Roman Empire, urged submission to the ruling authorities.  There were exceptions, however, most obviously when secular rulers sought to impede the exercise of faith.

For instance, when the Jewish leadership in the Sanhedrin instructed the Apostles Peter and John to no longer preach about Jesus’s death and resurrection, they responded that they had to obey God rather than men.  The original disciples and their followers persevered despite episodic persecution. 

Ironically, the Romans found Christians to be good citizens.  Indeed, Christians helped ameliorate some of the social problems evident in the ancient world. 

Only when more abusive emperors demanded to be worshiped as gods did Christians resist, often at the cost of their lives. Contra the empire’s expectations, persecution did not extinguish religious faith. 

As Christianity became the majority faith in the West, the faithful began to play a much larger political role.  But in general they sought to shape, not overturn, the political order.  And their greatest concern always was the status of their faith, both individually and communally.  The worst battles between religious and government powers occurred when the latter sought to exercise spiritual authority.  

The PRC doesn’t easily fit into the Western experience.  However, one lesson clearly applies.  The best way to minimize political confrontations between church and state is to reduce government restraints on religion.  Christians have no unified view of politics, but believers everywhere agree on the importance of being allowed to worship God.

Interfering with the ability of people to live their faith guarantees, indeed requires, resistance.  Whatever the government’s objectives, the impact will be social conflict.  Believers will perceive the state to be challenging their core faith. 

As I point out on China-US Focus:  “The Party cannot win this battle.  For years Christianity’s growth was concentrated in rural areas, but recently has spread to the cities and reached a better educated population.  Based on existing growth rates there could be nearly 250 million Christians in China by 2030.” 

Beijing hopes to make Christianity conform to China.  Gu Mengfei of the Three-Self Patriotic Movement, which represents legal Protestant churches, explained that the PRC desired to “encourage more believers to make contributions to the country’s harmonious social progress.”  Such harmony is best achieved by eliminating the greatest source of potential conflict, barriers to religious practice. 

China increasingly is influencing Asia and the world.  But the PRC in turn will be influenced by developments within.  One of those is religion.  The government would best respond in a way that accommodates increased religious faith.

Jeffrey Miron

On February 26, 2015, marijuana becomes legal (again) under the laws of Washington, D.C. The key rules are:

  • It will be legal to possess up to two ounces of pot.
  • It will be legal to smoke said pot on private property.
  • It will be legal to transfer (give) an ounce or less of pot to someone else.
  • It will be legal to grow and cultivate up to six pot plants—no more than three mature ones—in your home.
  • You must be 21 years old to possess, consume, or grow pot.
  • Selling pot will still be illegal.
  • As will be smoking pot in any public space, which includes restaurants, bars, and coffee shops.
  • And, of course, none of this applies to federal land (which accounts for 22 percent of the District), where marijuana remains illegal under federal law.

Overall, this is progress.  But note that:

1. Federal marijuana prohibition still applies.

2. The age limit of 21 is misguided (just as with alcohol).  That limit guarantees that much marijuana use will remain outside the law.

3. The limit on possession amounts is silly; the ban on sale is idiotic.

4. Perhaps restaurants, bars, and coffee shops will circumvent the ban on smoking in public by offering free edibles.

5. The federal government owns 22 percent of the land in D.C.?  Geez.

 

Daniel J. Ikenson

Trade Promotion Authority (TPA or Fast-Track Negotiating Authority) is not an executive power grab.  It is a compact between the legislative and executive branches, which each have distinct authorities under the Constitution when it comes to conducting trade policy. The purpose of forging such a compact is that negotiations would be impracticable – and likely interminable – if each provision were subject to the whims of 535 legislators.

Opponents of trade liberalization have smeared TPA as a wholesale capitulation to the president, who allegedly is freed of any congressional oversight and given a blank check to negotiate unamendable trade deals in secret without any input from Congress – only the capacity to vote up or down on the final deal. In reality, though, TPA is the vehicle through which Congress conveys its trade policy objectives, conditions, and demands to the president, who negotiates with those parameters in mind. Provided the president concludes a negotiation that abides by those congressional parameters, the deal is given fast track consideration, which means essentially that legislative procedures are streamlined and expedited.

The trade committees are reportedly close to introducing trade promotion authority legislation, although there remains some debate about what it should include. Enforceable provisions to discipline currency manipulation would be a bad idea, as would be including provisions to reauthorize the ineffective and misguided Trade Adjustment Assistance program (which is widely acknowledged to be a payoff to organized labor).

But one important provision (or set of provisions) that has created a bit of an impasse between Senate Finance Committee Chairman Orrin Hatch (R-UT) and its Ranking Member Sen. Ron Wyden (D-OR) concerns certification that an agreement abides by the requisite congressional conditions to be afforded fast track treatment. Those of us who argue that TPA is not an executive power grab, but a practical, constitutional solution to a policymaking quandary, must acknowledge the propriety of such a provision – or a provision that accomplishes as much. There must be a mechanism through which the president is held to account – a check that the deal reflects the broad wishes of Congress.

Under previous TPA legislation, Congress was afforded opportunities to offer “Resolutions of Disapproval” over procedural concerns (including whether the trade deal advanced the objectives and goals of Congress). Such resolutions were required to be reported to and approved by the respective trade committees in each chamber. Certainly, any new trade promotion authority legislation will include similar provisions. But Sen. Wyden reportedly believes that stripping an agreement of fast track treatment should be easier to accomplish than it was under previous TPA legislation – perhaps by allowing for more channels through which such “resolutions” could come to the floor for a vote and by requiring 60, as opposed to 67, votes in the Senate to pass the resolution.

Talk of an affirmative need to “certify” that the agreement comports with congressional objectives – as opposed to passing a resolution that it doesn’t comport – would also seem to present another bottleneck that would amount to a second TPA vote. Of course, many Democratic Party constituencies that oppose trade liberalization generally would appreciate these lower thresholds, which make derailing trade agreements easier.  And that concern explains Chairman Hatch’s view that Wyden’s position would “make it so that fast track won’t work … the whole purpose of fast track is to be able to get these things … either rejected or approved.”

Congress must have the authority to decide whether an agreement qualifies for fast track treatment, otherwise there is nothing to hold the president to account.  But that authority should not be so vast as to negate the purpose and effect of trade promotion authority. Ensuring necessary checks and balances should not be a partisan matter.

Benjamin H. Friedman

The Boston Globe kindly published a piece I wrote about the lack of strategy guiding Pentagon spending, but gave it the somewhat misleading title and subtitle:  “The Pentagon’s Bloat: Accounting tricks and self-interested politicians ensure that US military spending will remain immune from any real ‘hard choices.’”*

The article doesn’t really bemoan bloat, in the standard sense of wasteful or inefficient pursuit of objectives. It complains about excessive objectives—our overly capacious definition of security—and explains the cause.

My argument is that it would be terrific if Ashton Carter, the new Secretary of Defense, and the military service chiefs were correct in their contention that they cannot execute the U.S. security strategy without exceeding the $499 billion cap that law imposes on 2016 Pentagon spending. They made that claim in requesting a budget that requires raising the cap by $34 billion or eliminating it, another $51 billion for war and relief from future years’ caps.

Our current “strategy” isn’t really one. Strategy, by definition, requires prioritization among competing threats and methods of defending against them. Our government uses that word to rationalize the avoidance of those choices. The primacy theory that best describes our approach to security is really a justification for a log-roll of disparate military interests and goals, most only vaguely related to our safety. A poorer state facing more pressing threats would have to choose among those objectives, which is what strategy does. Poverty demands choices that wealth avoids. And as realists explain, big threats unify preferences, lowering obstacles to strategy formation.

The United States has long been rich and safe enough to minimize choices among defenses and avoid strategy. So we get the phony, listicle sort: recitations of nice things that we hope U.S. military power might accomplish, justified as security objectives. That has the effect of conflating safety with values, and promoting a sense of insecurity.

I argue that the current austerity—the Pentagon budget is down almost 25 percent, in real terms, including the wars, since 2010—will not cause us to change course and pick among allies or regions to defend, military services to fund, and possible wars to fight. My article’s subtitle notwithstanding, that’s not because of politicians’ self-interest, which, incidentally, is both desirable and imperishable in a democracy.

I blame three other culprits. One is the monopolistic behavior of the military services. If they competed more for each other’s share of the budget, they might offer alternative strategic concepts. Another is the beliefs of the Pentagon civilians who might encourage that sort of competition. They think interservice conflict is always bad, even though it can enhance their management ability, and embrace primacy. Elsewhere I explain why.

A third culprit is the shallowness of the current austerity. The cuts underway take us only back to about 2004 levels of military spending, and two-thirds of that is just declining war costs. So the pressure to make hard choices in the base (non-war) budget is limited. Plus the uncapped war budgets (Overseas Contingency Operations or OCO funding) increasingly include funds that belong in the base. That “accounting trick” pads the budget against austerity and the need for hard choices. Note also that OCO is, like the 2001 Authorization of Military Force, a tool of executive war powers. Vague AUMFs provide legal authority for wars presidents want to start without further democratic check; OCO provides the funds. Because wars should be hard to start, we should get rid of both.

These factors lead me to conclude pessimistically:

With a trim here and an accounting trick there, the Department of Defense will muddle along its present course, while elected leaders justify it with paeans about American military power’s indispensability to every pleasant noun that “global” can modify. We that object might take solace in the fact that our hubris is a luxury that our fortune affords. Only blessed nations can worry so much about their safety while confusing it with everything they want.

*An article Justin Logan and I wrote for Orbis a few years ago includes a similar but more developed version of this argument. 

Steve H. Hanke

Since the New Year, Ukraine’s currency – the hryvnia – has collapsed, losing 51 percent of its value against the U.S. dollar. To put this rout into perspective, consider that the Russian ruble has only lost 8 percent against the greenback during the same period.

Like night follows day, the hryvnia’s meltdown has resulted in a surge of inflation. The last official Ukrainian year-over-year inflation rate is 28.5 percent. This rate was reported for January and is out of date. That said, the official inflation rate has consistently and massively understated Ukraine’s brutal inflation. At present, Ukraine’s implied annual inflation rate is 272 percent. This is the world’s highest inflation rate, well above Venezuela’s 127 percent rate (see the accompanying chart).

When inflation rates are elevated, standard economic theory and reliable empirical techniques allow us to produce accurate inflation estimates. With free market exchange-rate data (usually black-market data), the inflation rate can be calculated. Indeed, the principle of purchasing power parity (PPP), which links changes in exchange rates and changes in prices, allows for a reliable inflation estimate.

To calculate the inflation rate in Ukraine, all that is required is a rather straightforward application of a standard, time-tested economic theory (read: PPP). At present, the black-market UAH/USD exchange rate sits at 33.78. Using this figure and black-market exchange rate data that the Johns Hopkins-Cato Institute for Troubled Currencies Project has collected over the past year, I estimate Ukraine’s current annual inflation rate to be 272 percent – and its monthly inflation rate to be 64.5 percent. This rate exceeds the 50 percent per month threshold required to qualify for hyperinflation.

So, if Ukraine sustains its current monthly rate of inflation for several more months, it will enter the record books as the world’s 57th hyperinflation episode. 
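The relative-PPP arithmetic behind this estimate can be sketched in a few lines. In this illustration, only the 33.78 UAH/USD rate and the 50-percent-per-month hyperinflation threshold come from the text; the year-ago exchange rate and the zero U.S. inflation figure are assumed for illustration, not taken from the Troubled Currencies Project’s actual data.

```python
# Hedged sketch of a relative-PPP implied-inflation estimate.
# Only the 33.78 UAH/USD rate comes from the article; the year-ago
# rate (9.0) and zero U.S. inflation are illustrative assumptions.

def implied_inflation(e_then, e_now, foreign_inflation=0.0):
    """Relative PPP: 1 + pi_domestic = (1 + pi_foreign) * (E_now / E_then),
    where E is the domestic-currency price of one U.S. dollar."""
    return (1.0 + foreign_inflation) * (e_now / e_then) - 1.0

def is_hyperinflation(monthly_rate):
    """Cagan's classic threshold: more than 50 percent inflation per month."""
    return monthly_rate > 0.50

annual = implied_inflation(e_then=9.0, e_now=33.78)  # roughly 2.75, i.e. ~275 percent
```

With an assumed year-ago rate of 9 hryvnia to the dollar, the sketch yields an implied annual rate of roughly 275 percent, in the neighborhood of the 272 percent figure in the text; the article’s 64.5 percent monthly rate comfortably clears the hyperinflation test.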

Chris Edwards

Policymakers are battling over a funding bill for the Department of Homeland Security (DHS) and its agencies, including the Federal Emergency Management Agency (FEMA). The disagreement over the bill involves the funding of President Obama’s recent immigration actions.

If a DHS funding bill is not approved, the department will partially shut down. The administration has been highlighting the negative effects of that possibility, but the battle illustrates how the government has grown far too large. Federal shutdowns may cause disruption, but that is because the government has extended its tentacles into activities that should be left to state and local governments and the private sector.

To the extent possible, we should move the most important activities in society out of Washington because the federal government has become such a screwed-up institution. Air traffic control, for example, is too crucial to allow it to get caught in D.C. budget squabbles, as it did in 2013. Air traffic control should be privatized.

Let’s look at the story being told by FEMA head Craig Fugate about a possible shutdown:

I can say with certainty that the current standoff has a real impact on our ability to ensure that a wide range of emergency personnel across the country have the resources they need to do their jobs and keep our communities safer and more secure…

At FEMA, one of our critical missions is reviewing applications and awarding grants to communities across the country, which can help firefighters, police officers, hospital workers, and emergency managers get the staff, training and equipment they need to prepare for, respond to, recover from, and mitigate a wide array of hazards…

Today, we find ourselves in the midst of yet another continuing resolution, which only provides short-term, temporary funding to our agency. This isn’t just a slight technical difference – it has a major impact on our ability to assist state, local, and tribal public safety agencies…

Making matters worse, the current situation is a showstopper for our grant program. Our application process for grants should have started in October; it is now February and we still haven’t been able to issue new grants. Moreover, during these ongoing continuing resolutions, local first responders from across the U.S. have made plans to attend training classes at one of our three national training centers, where they will learn valuable skills they can bring back to their communities – only to have a wrench thrown in the works caused by uncertainty in the budget. Our state, local, and tribal partners are facing increasingly urgent choices about how they will make ends meet without matching FEMA grants.

Mr. Fugate is a well-regarded leader, unlike some of FEMA’s past leaders. But his argument is akin to this: if the government took over food distribution in America, the Federal Administrator for Food would point to an urgent crisis whenever his budget was blocked. The lesson is that the more control the federal government has over society, the more vulnerable we all are to its dysfunction.

As for FEMA, my recent study examines why it is a mistake to fund and direct disaster preparation, response, and relief from Washington. FEMA’s response to some major disasters has been slow, disorganized, and profligate. Fugate implied that local police and firefighters across the country are now hooked on the federal teat. How could that possibly be a good idea?

Federalism is supposed to undergird America’s system of handling disasters, particularly natural disasters. State, local, and private organizations should play the dominant role. So however the current battle over DHS funding turns out, policymakers should begin cutting FEMA’s budget and handing back responsibility for disasters to the states and private sector.

Paul C. "Chip" Knappenberger and Patrick J. Michaels

The White House Council for Environmental Quality (CEQ) has released a draft of revised guidance that “describes how Federal departments and agencies should consider the effects of greenhouse gas emissions and climate change” under reviews governed by the National Environmental Policy Act (NEPA)—an act which basically requires some sort of assessment as to the environmental impacts of all proposed federal actions.

Under the revised guidance, the CEQ makes it clear that they want federal agencies now to include the impact on climate change in their environmental assessments.

But here’s the kicker: the CEQ doesn’t want the climate change impacts to be described using measures of climate—like temperature, precipitation, storm intensity or frequency, etc.—but rather by using the measure of greenhouse gas emissions.

Basically, the CEQ guidance is a roadmap for how to circumvent the NEPA requirements.

Here is how the CEQ characterizes the intent of the NEPA:

NEPA is designed to promote disclosure and consideration of potential environmental effects on the human environment resulting from proposed actions, and to provide decisionmakers with alternatives to mitigate these effects. NEPA ensures that agencies take account of environmental effects as an integral part of the agency’s own decision-making process before decisions are made. It informs decisionmakers by ensuring agencies consider environmental consequences as they decide whether to proceed with a proposed action and, if so, how to take appropriate steps to eliminate or mitigate adverse effects. NEPA also informs the public, promoting transparency of and accountability for consideration of significant environmental effects. A better decision, rather than better—or even excellent—paperwork is the goal of such analysis.

Clearly, the emphasis of NEPA is on the “environment” and better informing policymakers and the public as to the potential impacts of proposed federal actions on the environment.

But here is how the CEQ summarizes the intent of its new guidance:

Agencies should consider the following when addressing climate change:

(1) the potential effects of a proposed action on climate change as indicated by its GHG emissions;

This represents a fundamental scientific error—greenhouse gas (GHG) emissions are not themselves a measure of an “environmental effect,” nor are they an indicator of “climate change.”

This misdirection—one inconsistent with the NEPA—immediately caught our attention, and we developed and submitted a Comment on the CEQ guidance that pointed out this glaring error. The public comment period, which originally closed yesterday, has been extended until March 25, 2015.

The sense of our Comment was that cloaking climate change impacts in the guise of greenhouse gas emissions serves not to “promote transparency” or to “inform decisionmakers” and “the public,” but rather has the opposite effect—misdirection and misinformation.

Why does the CEQ seek to limit the climate change discussion to greenhouse gases?

In light of the difficulties in attributing specific climate impacts to individual projects, CEQ recommends agencies use the projected GHG emissions and also, when appropriate, potential changes in carbon sequestration and storage, as the proxy for assessing a proposed action’s potential climate change impacts. This approach allows an agency to present the environmental impacts of the proposed action in clear terms and with sufficient information to make a reasoned choice between the no-action and proposed alternatives and mitigations, and ensure the professional and scientific integrity of the discussion and analysis.

They got the first part right. The reason it is “difficult” is not that tools don’t exist—after all, that’s what climate models were developed for: to take carbon dioxide emissions and convert them into environmental impacts—but rather that any attempt to run the emissions through such climate models would show they would have no detectable impact.

In other words, it would show the assessment of climate change impacts of federal actions, as directed by the CEQ, to be a complete and utter waste of time.

How do we know this? Because even a complete cessation of all greenhouse gas emissions from the U.S., starting tomorrow and running forever, would avert somewhat less than 0.15°C of future global temperature rise between now and the end of the century—an amount that is environmentally insignificant. Lesser actions will have lesser impacts; you can see for yourself here.
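The order of magnitude of that claim can be sanity-checked with a back-of-envelope calculation using the transient climate response to cumulative emissions (TCRE). This is a rough sketch, not the authors’ actual model run; the U.S. emissions figure and the TCRE range below are assumptions drawn from commonly cited values.

```python
# Rough TCRE-based sanity check of the "less than 0.15 degrees C" claim.
# The emissions figure and TCRE range are assumptions, not the authors' model run.

US_EMISSIONS_GTCO2_PER_YEAR = 5.4  # assumed, roughly mid-2010s U.S. CO2 output
YEARS = 85                         # now (2015) through the end of the century
GTCO2_PER_GTC = 44.0 / 12.0        # molecular-weight conversion, CO2 -> carbon

def averted_warming_c(tcre_c_per_1000_gtc):
    """Warming averted (deg C) if cumulative emissions are zeroed out,
    given a TCRE in degrees C per 1000 GtC of carbon."""
    cumulative_gtc = US_EMISSIONS_GTCO2_PER_YEAR * YEARS / GTCO2_PER_GTC
    return tcre_c_per_1000_gtc * cumulative_gtc / 1000.0

# IPCC's likely TCRE range is roughly 0.8 to 2.5 degrees C per 1000 GtC.
low, high = averted_warming_c(0.8), averted_warming_c(2.5)
```

Under these assumptions the averted warming falls between roughly 0.1°C and 0.3°C—the same order of magnitude as the 0.15°C figure cited in the text, though the exact value depends on carbon-cycle details that only a full model run captures.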

This is the last thing the White House wants federal agencies to conclude. So instead of assessing actual climate impacts (of which there are none) of federal actions, the CEQ directs agencies to cast the effect in terms of greenhouse gas emissions—which can be used for all sorts of mischief.  For example, see how the EPA uses greenhouse gas emissions instead of climate change to promote its regulations limiting carbon dioxide emissions from power plants.

No doubt this is the type of analysis that the CEQ has in mind—one which seeks to elevate policy initiatives (like the Climate Action Plan) above hard scientific analysis.

Here is how we concluded our Comment to the CEQ:

To best serve policymakers and the general public, the CEQ should state that all but the largest federal actions have an undetectable and inconsequential impact on the environment through changes in the climate. And for the largest federal actions, an analysis of the explicit environmental impacts resulting from greenhouse gas emissions arising from the action should be detailed, with the impacts assessment not limited to climate change but also to include other environmental effects such as impacts on overall vegetative health (including crop yield and production).

Substituting greenhouse gas emissions for climate change and other environmental impacts, as called for in the guidelines described in this current draft, is not only insufficient but scientifically inadequate and potentially misleading. As such, these CEQ guidelines should be rescinded and discarded.

Our Comment, in its entirety, is available here.

Jonathan Blanks

Those who follow police misconduct closely know that patterns of abuse can become normalized when tolerated or unchecked by police supervisors. Abuses that went unreported or were unsubstantiated in years past have been exposed by the growing presence of camera phones and other technologies that record police-public interactions. But they can’t catch them all.

The Guardian’s Spencer Ackerman has reported a truly disturbing practice in Chicago. The police have established a “black site” area where Americans are held incommunicado to be interrogated. Prisoners are held without charge and in violation of their constitutional rights and without access to legal counsel:

The facility, a nondescript warehouse on Chicago’s west side known as Homan Square, has long been the scene of secretive work by special police units. Interviews with local attorneys and one protester who spent the better part of a day shackled in Homan Square describe operations that deny access to basic constitutional rights.

Alleged police practices at Homan Square, according to those familiar with the facility who spoke out to the Guardian after its investigation into Chicago police abuse, include:

  • Keeping arrestees out of official booking databases.
  • Beating by police, resulting in head wounds.
  • Shackling for prolonged periods.
  • Denying attorneys access to the “secure” facility.
  • Holding people without legal counsel for between 12 and 24 hours, including people as young as 15.

At least one man was found unresponsive in a Homan Square “interview room” and later pronounced dead.

Unlike a precinct, no one taken to Homan Square is said to be booked. Witnesses, suspects or other Chicagoans who end up inside do not appear to have a public, searchable record entered into a database indicating where they are, as happens when someone is booked at a precinct. Lawyers and relatives insist there is no way of finding their whereabouts. Those lawyers who have attempted to gain access to Homan Square are most often turned away, even as their clients remain in custody inside.

“It’s sort of an open secret among attorneys that regularly make police station visits, this place – if you can’t find a client in the system, odds are they’re there,” said Chicago lawyer Julia Bartmes.

This is not Chicago’s first brush with systematic abuse of citizens. Just this month, a retired CPD detective who covered up the torture and false confessions of over 100 people in the 1970s and ’80s was released from prison. He still collects a $4,000 per month pension.

Police transparency is essential to effective policing, but police organizations often protect their officers from outside scrutiny, making it difficult to hold officers accountable for repeated violations of policy. Secretive internal investigations can stonewall public inquiry into disputed officer-related shootings committed in broad daylight. Left unchecked, entire police departments can develop institutional tolerance for constitutional violations in day-to-day policing. 

But what Ackerman reports seems to be the ultimate lack of police transparency. If what he reports is true, a full investigation should be launched by government officials outside of the Chicago Police Department to examine such egregious violations of civil and constitutional rights.

At PoliceMisconduct.net, we’re shining a light to bring these abuses out into the open.

Read Ackerman’s powerful report here.

Julian Sanchez

At a New America Foundation conference on cybersecurity Monday, NSA Director Mike Rogers gave an interview that—despite his best efforts to deal exclusively in uninformative platitudes—did produce a few lively moments. The most interesting of these came when techies in the audience—security guru Bruce Schneier and Yahoo’s chief information security officer Alex Stamos—challenged Rogers’ endorsement of a “legal framework” for requiring device manufacturers and telecommunications service providers to give the government backdoor access to their users’ encrypted communications. (Rogers repeatedly objected to the term “backdoor” on the grounds that it “sounds shady”—but that is quite clearly the correct technical term for what he’s seeking.) Rogers’ exchange with Stamos, transcribed by John Reed of Just Security, is particularly illuminating:

Alex Stamos (AS): “Thank you, Admiral. My name is Alex Stamos, I’m the CISO for Yahoo!. … So it sounds like you agree with Director Comey that we should be building defects into the encryption in our products so that the US government can decrypt…

Mike Rogers (MR): That would be your characterization. [laughing]

AS: No, I think Bruce Schneier and Ed Felten and all of the best public cryptographers in the world would agree that you can’t really build backdoors in crypto. That it’s like drilling a hole in the windshield.

MR: I’ve got a lot of world-class cryptographers at the National Security Agency.

AS: I’ve talked to some of those folks and some of them agree too, but…

MR: Oh, we agree that we don’t accept each others’ premise. [laughing]

AS: We’ll agree to disagree on that. So, if we’re going to build defects/backdoors or golden master keys for the US government, do you believe we should do so — we have about 1.3 billion users around the world — should we do for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government? Which of those countries should we give backdoors to?

MR: So, I’m not gonna… I mean, the way you framed the question isn’t designed to elicit a response.

AS: Well, do you believe we should build backdoors for other countries?

MR: My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this.

AS: So you do believe then, that we should build those for other countries if they pass laws?

MR: I think we can work our way through this.

AS: I’m sure the Chinese and Russians are going to have the same opinion.

MR: I said I think we can work through this.

I’ve written previously about why backdoor mandates are a horrible, horrible idea—and Stamos hits on some of the reasons I’ve pointed to in his question. What’s most obviously disturbing here is that the head of the NSA didn’t even seem to have a bad response prepared to such an obvious objection—he has no serious response at all. China and Russia may not be able to force American firms like Google and Apple to redesign their products to be more spy-friendly, but if the American government does their dirty work for them with some form of legal backdoor mandate, those firms will be hard pressed to resist demands from repressive regimes to hand over the keys. Rogers’ unreflective response seems like a symptom of what a senior intelligence official once described to me as the “tyranny of the inbox”: a mindset so myopically focused on solving one’s own immediate practical problems that the bigger picture—the dangerous long-term consequences of the easiest or most obvious quick fix—is barely considered.

What we also see, however, is a hint to why officials like Rogers and FBI Director James Comey seem so dismissive of the overwhelming consensus of security professionals and cryptographers that it’s not technically feasible to implement a magical “golden key” that will permit the “good guys” to unlock encrypted data while leaving it secure against other adversaries. No doubt these officials are asking their own experts a narrow, technical question and getting a narrow, technically correct answer: There is a subfield of cryptography known as “kleptography” that studies the design of “asymmetric backdoors.” The idea is that the designer of a cryptographic algorithm can bake into it a very specific vulnerability that depends on a lengthy mathematical key that is too large to guess and cannot be easily reverse-engineered from the algorithm itself. Probably the most famous example of this is the vulnerability in the Dual Elliptic Curve algorithm NSA is believed to have inserted in a widely used commercial security suite. More prosaically, there is the method companies like Apple use to control what software can run on their devices: Their processors are hard-coded with the company’s public key, and (in theory) will only run software signed by Apple’s private developer key.
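The code-signing model described above can be sketched in a few lines. This is a toy illustration only—textbook RSA with absurdly small primes, no padding, and hypothetical names—not how Apple or anyone else actually implements signing, but it shows the asymmetry involved: the device ships with only the public half of the key, and only the holder of the private half can produce software the device will accept.

```python
# Toy sketch of vendor code signing (NOT real crypto: textbook RSA,
# tiny illustrative primes, no padding; all names are hypothetical).
import hashlib

# Textbook RSA keypair. Real keys are thousands of bits long.
P, Q = 1000003, 1000033
N = P * Q                     # public modulus, baked into the device
E = 65537                     # public exponent, baked into the device
D = pow(E, -1, (P - 1) * (Q - 1))   # private exponent: vendor-only secret

def digest_int(blob: bytes) -> int:
    # Hash the software blob and reduce it into the modulus range.
    return int.from_bytes(hashlib.sha256(blob).digest(), "big") % N

def vendor_sign(blob: bytes) -> int:
    # Only the vendor, holding D, can compute this signature.
    return pow(digest_int(blob), D, N)

def device_verify(blob: bytes, sig: int) -> bool:
    # The device knows only (N, E) and refuses unsigned software.
    return pow(sig, E, N) == digest_int(blob)

firmware = b"legitimate update v1.2"
sig = vendor_sign(firmware)
assert device_verify(firmware, sig)         # signed code is accepted
assert not device_verify(b"tampered", sig)  # anything else is refused
```

The asymmetry is the whole point: publishing the verification key reveals nothing useful about the signing key, which is also the intuition behind the “asymmetric backdoor” designs kleptography studies.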

So there’s a sense in which it is technically feasible to do what NSA and FBI would like. There’s also a sense in which it’s technically possible for a human being to go without oxygen for ten minutes—but in practice you’ll be in for some rude surprises unless you ask the follow-up question: “Will the person be left irreparably brain damaged?” When Comey or Rogers get a ten-minute briefing from their experts about the plausibility of designing “golden key” backdoors, they are probably getting the technically accurate answer that yes, on paper, it is possible to construct a cryptographic algorithm with a vulnerability that depends on a long mathematical key known only to the algorithm’s designer, and which it would be computationally infeasible for an adversary to find via a “brute force” attack. In theory. But to quote that eminent cryptographer Homer Simpson: “I agree with you in theory, Marge. In theory, communism works. In theory.”

The trouble, as any good information security pro will also tell you, is that real world systems are rarely as tidy as the theories, and the history of cryptography is littered with robust-looking cryptographic algorithms that proved vulnerable under extended scrutiny or were ultimately impossible to implement securely under real-world conditions, where the crypto is inevitably just one component in a larger security and software ecosystem. A measure of adaptability is one virtue of “end to end” encryption, where cryptographic keys are generated by, and held exclusively by, the end users: If my private encryption key is stolen or otherwise compromised, I can “revoke” the corresponding public key and generate a new one. If some clever method is discovered that allows an attacker to search the “key space” of a cryptosystem more quickly than was previously thought possible, I can compensate by generating a longer key that remains beyond the reach of any attacker’s computing resources. But if a “golden key” that works against an entire class of systems is cracked or compromised, the entire system is vulnerable—which makes it worthwhile for sophisticated attackers to devote enormous resources to compromising that key, far beyond what it would make sense to expend on the key for any single individual or company.
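The revoke-and-re-key flexibility described above can be sketched as follows. This is a minimal illustration with entirely hypothetical names (no real cryptography is performed): each user key carries an identifier; a compromised key is revoked and replaced, possibly with a longer one, and nothing signed under a revoked identifier is trusted again. A single shared “golden key” offers no analogous recovery path.

```python
# Minimal sketch of end-user key rotation and revocation.
# Hypothetical names; keys are just random integers standing in
# for real key material.
import secrets

class UserKeyring:
    def __init__(self, key_bits=128):
        self.key_bits = key_bits
        self.keys = {}        # key_id -> key material
        self.revoked = set()  # key_ids no longer trusted
        self.current_id = 0
        self.rotate()

    def rotate(self, new_bits=None):
        # Generate a fresh key; optionally lengthen it if the old
        # size is believed to be within attackers' reach.
        if new_bits:
            self.key_bits = new_bits
        self.current_id += 1
        self.keys[self.current_id] = secrets.randbits(self.key_bits)
        return self.current_id

    def revoke(self, key_id):
        self.revoked.add(key_id)
        self.keys.pop(key_id, None)

    def is_usable(self, key_id):
        return key_id in self.keys and key_id not in self.revoked

ring = UserKeyring()
old = ring.current_id
ring.revoke(old)                 # key compromised: stop trusting it
new = ring.rotate(new_bits=256)  # re-key, doubling the key length
assert not ring.is_usable(old) and ring.is_usable(new)
```

Because each user controls only their own keyring, a compromise is contained to one user; compromising a system-wide master key would invalidate everyone at once, with no one able to rotate away from it.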

So maybe you don’t want a single master key: Maybe you prefer a model where every device or instance of software has its own corresponding backdoor key. This creates its own special set of problems, because now you’ve got to maintain and distribute and control access to the database of backdoor keys, and ensure that new keys can’t be generated and used without creating a corresponding key in the master database. This weak point—key distribution—is the one NSA and GCHQ are purported to have exploited in last week’s story about the theft of cell phone SIM card keys.  Needless to say, this model also massively reduces the flexibility of a communications or data storage system, since it means you need some centralized authority to generate and distribute all these keys.  (Contrast a system like GPG, which allows users to generate as many keys as they need without any further interaction with the software creator.) You also, of course, have the added problem of designing your system to resist  modification by the user or device owner, so the keys can’t be changed once they leave the manufacturer.
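The per-device-key model above can be made concrete with a toy sketch (hypothetical names, no real crypto). The manufacturer must generate a key for every device, record it in a central escrow database before shipping, and build the device so it cannot re-key itself—which makes that database the single high-value target the paragraph describes.

```python
# Toy sketch of a per-device backdoor-key escrow (hypothetical names).
# The central database is the weak point: lawful access and a thief
# who steals the database retrieve exactly the same keys.
import secrets

class EscrowDatabase:
    """Central store of per-device backdoor keys."""
    def __init__(self):
        self._keys = {}

    def provision(self, device_id):
        # Manufacturer generates and records the key before shipping;
        # the device must be designed so this key can never change.
        key = secrets.randbits(128)
        self._keys[device_id] = key
        return key

    def lawful_access(self, device_id):
        # Any party with database access can unlock any device.
        return self._keys[device_id]

escrow = EscrowDatabase()
device_key = escrow.provision("device-001")
assert escrow.lawful_access("device-001") == device_key
```

Contrast this with a system like GPG, where users mint their own keys locally: there is no central database to provision, protect, or steal.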

As I’ve argued elsewhere, the feasibility of implementing a crypto backdoor depends significantly on the nature of the system where you’re trying to implement it. If you want backdoors in an ecosystem like Apple’s, where you have a single manufacturer producing devices with hardcoded cryptographic keys and exerting control over the software running on its devices, maybe (maybe) you can pull it off without too massive a reduction in the overall security of the system. Ditto if you’re running a communications system where all messages are routed through a centralized array of servers—assuming users are willing to trust that centralized hub with access to their most sensitive data. If, on the other hand, you want backdoors that are compatible with a decentralized peer-to-peer communications network that uses software-generated keys running on a range of different types of computing hardware, that’s going to be a much bigger problem. So when Mike Rogers asks his technical experts whether Apple could realistically comply with a mandate to provide backdoor access to encrypted iPhone data, they might well tell him it’s technically doable—but that doesn’t mean there wouldn’t be serious problems implementing such a mandate generally.

In short, Rogers’ dismissive attitude in the exchange above seems like prime evidence that a little knowledge can indeed be a dangerous thing. He’s got a lot of “world class cryptographers” eager to give him the—very narrowly technically accurate—answer he wants to hear: It is mathematically possible to create backdoors of this sort, at least on certain types of systems.  The reason the rest of the cryptographic community disagrees is that they’re not limiting themselves to giving a simplified five-minute answer to the precise question the boss asked, or finding an abstract solution to a chalkboard problem.   In other words, they’re looking at the bigger picture and recognizing that actually implementing these solutions across a range of data storage and communications architectures—even on the dubious premise that the global market could be compelled to use broken American crypto indefinitely—would create an intractable array of new security problems.  We can only hope that eventually one of the in-house experts that our intelligence leaders actually listen to will sit the boss down for long enough to break the bad news.

Ilya Shapiro

Freedom of contract—the right of individuals to manage and govern their own affairs—is a basic and necessary liberty. The appropriate role of the government in contract-law disputes is to hold parties to their word, not to enforce its own policy preferences.

The New Jersey Supreme Court recently struck a blow against that basic freedom, however, in ruling that clearly worded arbitration provisions—one of the most common parts of consumer contracts—are unenforceable unless the parties comply with multiple superfluous formalities. The case arose when Patricia Atalese retained a law firm, U.S. Legal Services Group, to negotiate with creditors on her behalf. Atalese signed a retainer agreement with a standard arbitration provision: she checked a box that unambiguously indicated that she read and understood that all disputes would be settled via arbitration. Then, after a dispute over legal fees, Atalese disregarded the arbitration agreement and filed a lawsuit in state court.

The trial court dismissed her complaint and compelled arbitration, a ruling that was affirmed by the intermediate appellate court. But instead of letting that decision stand, the New Jersey Supreme Court broke from years of tradition and federal precedent, finding the arbitration provision unenforceable because it lacked certain magic words stating, in addition to all disputes being resolved by arbitration, that the parties were waiving their right to a civil jury trial.

Cato, joined by the National Federation of Independent Business, has filed an amicus brief urging the U.S. Supreme Court to review the case. We make three key points. First, the New Jersey court’s proposed requirement—that contracts with an arbitration provision include belt-and-suspenders-and-drawstring language regarding jury-trial waiver—is redundant. Agreeing to submit a dispute to an impartial arbitrator instead of going through the expense of litigation is the very essence of an arbitration agreement.

Second, the ruling is contrary to federal law. The Federal Arbitration Act, which has been in place for nearly 100 years, affords arbitration agreements certain protections. Specifically, it demands from the states a certain amount of even-handedness: states can’t nullify or refuse to enforce such agreements for reasons other than those that would invalidate any contractual provision (e.g., coercion, fraud, illegal subject matter, etc.). Since New Jersey law doesn’t require parties to state clearly that they understand the legal consequences of other contractual provisions (since a signature is accepted as evidence that the agreement was read and understood), applying that additional standard to arbitration agreements alone would violate federal law.

Finally, this ruling threatens to upset the business world by calling into question the validity of millions of contracts to which parties have mutually and unambiguously agreed. The vast majority of arbitration provisions, including many of those featured in previous Supreme Court cases, don’t include the explicit language mandated by the New Jersey ruling.

In short, the value of contracts lies in the certainty they create while enabling mutual gains. By creating substantial uncertainty about the enforceability of tens of thousands of arbitration agreements, the New Jersey Supreme Court will force businesses, litigants, and the taxpayers who fund the court system to waste time and money litigating disputes that would be more appropriately resolved by arbitration—as all the parties agreed in the first place.

The Supreme Court will decide whether to take up U.S. Legal Services Group v. Atalese this spring.

Steve H. Hanke

The financial press has become inundated with the word “austerity.” Since Greece’s left-wing Syriza proclaimed an “anti-austerity revolution,” strong adjectives, like “incredibly savage,” precede that overused word.

What was once a good word has become a weaselword. That, according to the Oxford Dictionary, is “a word that destroys the force of a statement, as a weasel ruins an egg by sucking out its contents.” How could that be?

Well, in the hands of an unscrupulous or uninformed writer, the inversion of a perfectly good word into a weaselword is an easy task. All one has to do is leave the meaning of a word undefined or vague, rendering the word’s meaning so obscure as to make it non-operational. With that, a meaningless weaselword is created.

In its current usage, the word austerity is so obscure as to evoke Fritz Machlup’s paraphrase of Goethe’s line from Faust: To conceal ignorance, Mephistopheles counsels a student to misuse words. Such is the story and fate of austerity.

Mark A. Calabria

Debate over whether to subject the Federal Reserve to a policy audit has occasionally focused on the size and composition of the Fed’s balance sheet. While I don’t see this issue as central to the merits of an audit, it has given rise to a considerable amount of smug posturing. Let’s step beyond the posturing and give these questions some of the attention they deserve.

First the facts. The Fed’s balance sheet has ballooned over the last few years to about $4.5 trillion. And yes, the Fed discloses such. No argument there. The Fed, like most central banks, has traditionally conducted its open-market operations in the “short end” of the market. The various rounds of quantitative easing have changed that. For instance, the vast majority of its holdings of Fannie & Freddie mortgage-backed securities ($1.7 trillion) have an average maturity of well over 10 years. Similarly, the Fed’s stock of Treasuries has long maturities, with about a fourth of those holdings in excess of 10 years.

Now the leverage question. We all get that the Fed cannot go “bankrupt” like Lehman. But that’s because “bankrupt” is a legal condition and one from which the Fed has been exempted, just as Fannie and Freddie cannot go “bankrupt” (they are considered legally outside the bankruptcy code). The eminent economic historian Barry Eichengreen tells us the Fed’s leverage doesn’t matter, as “the central bank can simply ask the government to replenish its capital, much like when a government covers the losses of its national post office.” Some of us would say that’s a problem, not a solution, just like it is with the Post Office.

Others would suggest the Fed’s leverage doesn’t matter because “the Fed creates money”. Again that misses the point. Any losses could be covered by printing money, but isn’t that inflationary?  And that, of course, is just another form of taxation. So it seems Senator Paul’s primary point, that the Fed’s balance sheet exposes the taxpayer to some risk has actually been supported, not discredited, by these supposed rebuttals.

Let’s get to another issue: the maturity of the Fed’s assets. There’s a good reason central banks generally stay in the short end of the market: it avoids taking on interest rate risk. When rates go up, bond values fall. Yes, the Fed can avoid recognizing those losses by simply not selling those assets, but that creates problems of its own. If we do see inflation, normally the Fed would sell assets to drain liquidity from the market. But would the Fed be willing to sell assets at a loss? At the very least there would be some reluctance. And yes, it could cover those losses by printing money, but that’s hardly helpful if the Fed finds itself in a situation of rising prices.

The point here is that the Fed’s balance sheet does raise tough questions about its exit strategy. Perhaps the economy will remain soft for years and the Fed can exit gracefully. Perhaps not. I raised this possibility before Congress a year ago. I don’t know anyone with a crystal ball on these issues. But one thing is certain: this is a debate we should be having. It’s the “nothing to see here, move along” crowd that poses the true risk to our economy.

Juan Carlos Hidalgo

Nicaragua’s plan to build an Interoceanic Canal that would rival the Panama Canal could be a major environmental disaster if it goes forward. That’s the assessment of Axel Meyer and Jorge Huete-Pérez, two scientists familiar with the project, in a recent article in Nature. Disturbingly, the authors point out,

No economic or environmental feasibility studies have yet been revealed to the public. Nicaragua has not solicited its own environmental impact assessment and will rely instead on a study commissioned by the HKND [The Hong Kong-based company that has the concession to build the canal]. The company has no obligation to reveal the results to the Nicaraguan public.

In recent weeks we have seen similar opinions aired in the Washington Post, Wired, The Economist, and other media. In their article, Meyer and Huete-Pérez explain how the $50-billion project (more than four times Nicaragua’s GDP), would require “The excavation of hundreds of kilometres from coast to coast, traversing Lake Nicaragua, the largest drinking-water reservoir in the region, [and] will destroy around 400,000 hectares of rainforests and wetlands.” So far, the Nicaraguan government has remained mum about the environmental impact of the project. Daniel Ortega, the country’s president, only said last year that “some trees have to be removed.”  

Interestingly, despite this potential massive threat to one of the most pristine environmental reservoirs in the Americas, none of the leading international environmental organizations, such as Greenpeace, Friends of the Earth or the Sierra Club, has issued a single statement about the Nicaragua Canal.

We know for a fact that this is not out of lack of interest in Central America. After all, some of these organizations were pretty vocal in their opposition to CAFTA. Why isn’t the Nicaragua Canal proposal commanding the attention of these international environmental groups?

Paul C. "Chip" Knappenberger and Patrick J. Michaels

You Ought to Have a Look is a feature from the Center for the Study of Science posted by Patrick J. Michaels and Paul C. (“Chip”) Knappenberger.  While this section will feature all of the areas of interest that we are emphasizing, the prominence of the climate issue is driving a tremendous amount of web traffic.  Here we post a few of the best in recent days, along with our color commentary.

In this week’s You Ought to Have a Look, we’re going to catch up on some new climate science that hasn’t gotten the deserved attention—for reasons soon to be obvious.

First up is a new study comparing climate model projections with observed changes in the sea ice extent around Antarctica.

While everyone seems to talk about the decline in the sea ice in the Northern Hemisphere, considerably less discussion focuses on the increase in sea ice in the Southern Hemisphere. If it is mentioned at all, it is usually quickly followed by something like “but this doesn’t disprove global warming, it is consistent with it.”

But, even the folks delivering these lines probably realize that the latter bit is a stretch.

In fact, the IPCC and others have been trying to downplay this inconvenient truth ever since folks first started to note the increase. And the excuses are getting more involved.

A new study pretty much exposes the emperor.

A team of three Chinese scientists led by Qi Shu compared the observed trends in Southern Hemisphere sea ice extent (SIE) with those projected by the collection of climate models used to forecast future climate changes by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). In a nutshell, they found increases in sea ice around Antarctica were not consistent with human-caused climate change at all—or at least not by how climate models foresee it taking place. Figure 1 shows the extent of the mismatch—rather shocking, really.

 

Figure 1. Comparison of observed (blue) and mean climate model projected (red) changes in Antarctic sea ice extent  (from Shu et al., 2015).

Shu et al. write:

The linear trend of satellite-observed Antarctic SIE is 1.29 (±0.57) × 10⁵ km² decade⁻¹; only about 1/7 of [climate] models show increasing trends, and the linear trend of [the multi-model mean] is negative, with a value of −3.36 (±0.15) × 10⁵ km² decade⁻¹.

This should pretty much quell talk that everything in the climate is proceeding according to plan.

For all the details, be sure to check out the full paper (which is open access).

 

The next paper worth having a look at is one that examines the impact of urbanization on thunderstorm development in the southeastern U.S.

Recall that a global warming talking point is that greenhouse gas-induced climate change will result in more episodes of intense precipitation.

As with all manner of extreme weather events, the association is far from being so simple. All sorts of confounding factors impact the observed changes in precipitation and make disentangling and identifying any impact from anthropogenic global warming nearly impossible. We have discussed this previously, and this new research provides more such evidence.

A team of researchers led by Alex Haberlie developed a method of locating “isolated convection initiation” (ICI) events from historic radar data. ICI events are basically thunderstorm kickstarters. Examining 17 years of data for the region around Atlanta, the team found:

Results reveal that ICI events occur more often over the urban area compared to its surrounding rural counterparts, confirming that anthropogenic-induced changes in land cover in moist tropical environments lead to more initiation events, resulting thunderstorms and affiliated hazards over the developed area.

In other words, pointing to increases in thunderstorms and declaring greenhouse gas emissions the culprit is overly simplistic and potentially misleading.

The full details are available here, although they are behind a paywall. But even a read of the abstract will prove enlightening. Turns out climate change is not so simple.

 

And finally, as the number of people shivering from cold in the Eastern U.S. increases, so, too, does the effort to link the cold to global warming—mostly through feedback from declines in Arctic sea ice.

While we have been over this before—the linkages are less than robust—we’re always happy to see new research on the topic.

Just published is a paper by the University of Washington’s Dennis Hartmann that examines the causes behind last winter’s brutal cold in the eastern U.S. Instead of a link to sea ice and high-latitude conditions, he found that tropical sea surface temperature (SST) anomalies in the Pacific Ocean were a driving force behind the cold air outbreaks last winter. Hartmann further notes that, as of his writing the paper (in January 2015), the same conditions present last winter had persisted into this one. The current situation bears this out.

This passage from Hartmann’s paper bears repeating and is worth keeping in mind:

This result is consistent with a long history of observational and modeling work indicating that SST anomalies in the extratropics are strongly driven by atmospheric circulation anomalies, while SST anomalies in the tropics can strongly force the atmospheric circulation.

In other words, while extratropical circulation drives our daily weather, tropical sea surface temperature patterns drive the circulation. Thus, don’t look to the Arctic to explain winter’s weather, but rather the Tropics. Hopefully, John Holdren etc. will take this to heart.

Dalibor Rohac

In a recent article for the Weekly Standard, I noted that freedom in Hungary was under attack. In the past several years, Prime Minister Viktor Orban has tightened his control over the media, harassed civil society organizations, politicized the judiciary, nationalized $14 billion worth of assets from private pension funds, and populated the board of Hungary’s central bank with appointees of the ruling party, Fidesz. Mr Orban – who was once seen as a pro-market, liberal reformer – has also become Vladimir Putin’s most reliable partner in the EU, having hosted him for a working visit just last week.

But not all Hungarians are applauding as the country descends deeper into what could be called ‘goulash authoritarianism’. In fact, the parliamentary by-election in the county of Veszprem on Sunday has brought a very encouraging piece of news. A Fidesz candidate was defeated by an independent candidate, Zoltan Kesz, endorsed by a coalition of left-of-center parties.

“The left-right divide has been turned on its head in Hungary; the relevant distinction here is between the pro-Western and pro-Eastern political parties,” says Mr Kesz, referring to Mr Orban’s geopolitical allegiances. It should also be said that Mr Kesz is no ordinary politician. An activist and English teacher, he is the founder of Hungary’s premier libertarian think tank, the Free Market Foundation. Interestingly, given the toxic political and ideological environment in Hungary, the organization has become known as the leading voice against racism in the country, and much of its effort has been aimed at countering the rise of political forces such as the xenophobic Jobbik party, currently the third-largest political force in the country.

Mr Kesz’ election is significant because it brings an end to the narrow supermajority that Fidesz had enjoyed in the Hungarian parliament since the election last year. In 2013, the parliament passed a number of controversial constitutional amendments, and many feared that the unchecked dominance of Fidesz could herald the demise of Hungarian democracy. While Mr Kesz’ electoral victory assuages those fears somewhat, he will be fighting an uphill battle to get his country back on track.

Ilya Shapiro

Vietnam vet Robert Rosebrock is 72 years old, but he’s still got enough fight in him to stand up for what he believes in. The Veterans Administration of Greater Los Angeles (VAGLA) and the U.S. Court of Appeals for the Ninth Circuit would prefer his fight to be in vain.

Rosebrock is protesting VAGLA’s use of a parcel of land, deeded to the U.S. government for the care of homeless veterans, for purposes other than that care. For example, VAGLA leased parts of the land to a private school, an entertainment company, and a soccer club, and occasionally used it for hosting events. Every Sunday for 66 weeks, Rosebrock hung at least one and as many as 30 U.S. flags from a border fence on the VA property that he believed was being misused.

After seeing a celebrity gala event on the property one Sunday afternoon, Rosebrock started hanging flags with the stars down, signifying dire distress to life and property—the distress faced by LA’s homeless veterans. At this point, VAGLA started enforcing its policy against “displaying of placards or posting of materials on bulletin boards or elsewhere on [VA] property.” When Rosebrock continued, believing his First Amendment rights would protect him, he was issued six criminal citations. He then stopped hanging his flag upside down but was later allowed to hang it right-side-up—a clear if unusual example of viewpoint-based speech discrimination that violates the First Amendment.

That part of his case was a slam-dunk; the difficulty came in making the violation matter. Rosebrock turned to the courts asking two things: an order that would stop VAGLA from discriminating against him in the future and one that would allow him to display his flag stars-down for an amount of time equal to how long he had been denied the right to display it. The district court found that because the VAGLA associate director sent an email to the VA police that the “no signs” regulation should be enforced precisely, Rosebrock’s requested remedies were moot—meaning, basically, that because VAGLA said it would play by the rules, the court wouldn’t order it to. This is known in legal circles as “voluntary cessation.”

Not long after the district court’s dismissal, the VA police disregarded the email and allowed Iraq War veterans to protest in violation of the regulation. Rosebrock raised this fact when he appealed to the Ninth Circuit, but the Ninth Circuit affirmed the ruling without even addressing the continued discriminatory enforcement. To paraphrase, the appellate panel held that although parties seeking to prove mootness through voluntary cessation bear a heavy burden, we should trust that the government will do what it says.

Robert Rosebrock said, “no thanks,” and is petitioning the Supreme Court to hear his case. Cato agrees, and has joined the Pacific Legal Foundation and Institute for Justice on a brief supporting the petition. We point out that the federal appeals courts are split on this mootness/voluntary cessation issue, that it’s an issue that arises all the time, and that there’s no reason government entities should be given a benefit of the doubt while everyone else has to prove “it is absolutely clear that the allegedly wrongful behavior could not reasonably be expected to recur.”

The Supreme Court should take this case, Rosebrock v. Hoffman, and tell the lower courts what we know, what Robert Rosebrock knows, and what everyone else in the country should already know by now: it doesn’t always make sense to take the government at its word.

Cato legal associate Julio Colomba contributed to this blogpost.

Walter Olson

We’ve reported earlier in this space on how the Obama administration’s Equal Employment Opportunity Commission (EEOC) keeps getting slapped down by federal judges over what we called its “long-shot lawsuits and activist legal positions.” Now the Fourth Circuit has weighed in on a high-profile employment screening case from Maryland – and it too has given the EEOC a good thwacking, in this case over “pervasive errors and utterly unreliable analysis” in the expert testimony it marshaled to show the employer’s liability. Those are the words of a three-judge panel consisting of Judge Roger Gregory, originally appointed to the court by Bill Clinton before being re-appointed by his successor George W. Bush, joined by Obama appointee Albert Diaz and GWB appointee G. Steven Agee. 

The case arose from the EEOC’s much-publicized initiative of going after employers that use criminal background checks in hiring, which the agency insists often have improper disparate impact on minority applicants and have not been validated as necessary for business reasons. It sued the Freeman Cos., a provider of convention and exposition services, over its screening methods, but Freeman won after district court judge Roger Titus shredded the EEOC’s proffered expert evidence as “laughable,” “unreliable,” and “mind-boggling.” The EEOC appealed to the Fourth Circuit. 

If it was expecting vindication there, it was very wrong. Agreeing with Judge Titus, Judge Gregory cited the “pervasive errors and utterly unreliable analysis” of the commission’s expert report, by psychologist Kevin Murphy. “The sheer number of mistakes and omissions in Murphy’s analysis renders it ‘outside the range where experts might reasonably differ,’” which meant it could not have been an abuse of discretion for Judge Titus to exclude it. 

Strong language, yet Judge Agee chose to write a separate concurrence “to address my concern with the EEOC’s disappointing litigation conduct.” Noting a pattern in multiple cases, Agee faulted the commission’s lawyers for circling the wagons on behalf of its statistical methods despite repeated judicial hints that it needed to strengthen its quality control. “Despite Murphy’s record of slipshod work, faulty analysis, and statistical sleight of hand, the EEOC continues on appeal to defend his testimony.” If the agency doesn’t watch out, exasperated judges might start imposing more sanctions against it. 

Incidentally, as a counterpoint to the EEOC’s bullheadedness, the U.S. Commission on Civil Rights held a briefing program a year ago on employee screening and criminal background checks that tries to include an actual balance of views. You can read and download it here.

Charles Hughes

Fresh off another victory lap last week, Obamacare supporters awoke last Friday to the news that the government had given nearly one million exchange enrollees incorrect tax forms that could significantly affect their tax returns. Some 800,000 enrollees in the federal exchange and roughly 100,000 in California were given the wrong forms, called 1095-As, which provide a monthly account of the premium subsidies exchange enrollees receive. The government uses that information to verify that the subsidy amounts are correct (although a pending Supreme Court case raises questions about the legality of any subsidies offered through the federal exchange). If enrollees file their taxes using the wrong information, the government cannot verify that they received the right amount of subsidies.

Government officials will now try to remedy their mistake by sending out new forms to the affected customers. These tax documents contained the wrong price for the ‘benchmark plan’, the second-lowest cost silver plan available that is used to calculate the exchange subsidy amount. A post on the HealthCare.gov blog explains that the erroneous forms included the benchmark plan premiums for 2015 instead of 2014, which led to the wrong subsidy amount being displayed on the forms people use to file their taxes. The errors are not confined to one area, so incorrect forms were sent throughout the country, making it harder for enrollees to know if they are affected.

Those given the wrong form will be able to access their corrected one sometime in early March, according to the report. About 50,000 people in this group have already filed their taxes using the incorrect tax information. Officials are now in the process of trying to contact this group, and they will likely have to resubmit their tax returns. Enrollees who already filed will not find much help at HealthCare.gov for now, which reads only: “Additional information will be provided shortly.” Overall, nearly one million exchange enrollees could see delays in getting their income tax refunds, or find that the size of their refund has changed due to corrections in the tax form. Many of these people depend on this tax refund, and unanticipated problems could have significant adverse consequences.

Filing taxes is already a cumbersome and aggravating process. Obamacare has made it even more arduous, as people have to attest to having health insurance coverage and report how much they receive in exchange subsidies. Even worse, nearly one in five HealthCare.gov customers were sent the wrong forms, and these people will have to delay filing their taxes, or even resubmit them. While this blunder will not cause the law to spiral out of control, it does reveal the potential for ongoing problems with its implementation. Following the news, HealthCare.gov CEO Kevin Counihan told reporters “We’re not doing any victory laps.” Other Obamacare supporters should take this lesson to heart.
