We have an uncompetitive federal corporate tax rate of 35 percent compared to Canada’s 15 percent. Our Roth IRA is inferior to Canada’s TFSA, as Amity Shlaes and I discussed in the Wall Street Journal. And while Serena Williams still tops rising star Eugenie Bouchard, we should be paying attention to “What Canada Can Teach Us About Tennis.”
Now we face another competitive threat from the north. This time it’s British Columbia’s seaports, says Bloomberg:
Container ships sailing across the northern Pacific are carrying more cargo and are setting course for British Columbia to avoid delays from a possible strike by U.S. West Coast longshoremen. Traffic in Prince Rupert soared 49 percent in July from a year earlier, according to data compiled by Bloomberg Intelligence, while volume dropped 19 percent in Seattle, its nearest major U.S. rival.
Canadian ports are gaining an advantage over their U.S. rivals amid an economic recovery that’s increasing container volumes from East Asia. While U.S. West Coast ports are mired in a labor dispute and congestion hobbles local railways, Prince Rupert is winning customers with its shorter sailing times from China and efficient infrastructure that can whisk freight to the U.S. Midwest and beyond.
“If people are using the Canadian ports now out of concern for a slowdown, and they like what they see and they like the processing times and the experience, they’ll continue to funnel some of their traffic that way,” said Emma Griffith, a director at Fitch Ratings in New York.
So Canadian seaports are gaining in the short-term because of our self-inflicted wound, but they may also gain in the long-term because of both natural and man-made advantages:
[Prince Rupert] lies ice-free 745 kilometers (462 miles) northwest of Vancouver, is as many as 68 hours closer to Shanghai in sailing time than is Los Angeles, according to the Prince Rupert Port Authority. Including rail times, cargo transiting from Shanghai through Prince Rupert would reach Chicago two days quicker than if the ships called at Oakland or Seattle-Tacoma, and three quicker than if they unloaded in Los Angeles…
One of Prince Rupert’s advantages is that inbound containers can be transferred directly to trains rather than trucks that head to a distribution center, which is what happens at other West Coast ports, according to Kris Schumacher, a spokesman for the port authority. This kind of traffic, which uses different modes of transportation, is known within the industry as intermodal freight, and it’s booming for Canadian National.
Meanwhile, back in the United States, it’s antibusiness-as-usual:
…there’s no indication when new contracts will be signed for workers at 29 ports from Washington state to California. About 20,000 dockworkers represented by the International Longshore and Warehouse Union have been without a contract since early July. The union and the maritime association are negotiating over work rules, salaries and health-care benefits.
In 2002, the maritime association locked out U.S. West Coast port workers after contract talks broke down. The 10-day shutdown ended when then-President George W. Bush invoked the rarely used Taft-Hartley Act to reopen the ports. The dispute cost the U.S. economy $1 billion a day, according to the maritime association.
Over at Cato’s Police Misconduct web site, we have identified the ‘worst case’ for August.
As you may have already guessed, it was the Ferguson Police Department. As the events in Ferguson played out during August, the police department there put on a clinic on how not to police a community. From the withholding of Darren Wilson’s name (he was the officer who shot Michael Brown six times), to brandishing weapons of war against a community expressing its anger and mourning through protest, to blatantly targeting journalists for arrest and assault, the events in Ferguson have shown just how disastrous poor policing can be to a community. If there is any silver lining to the situation, it is that people across the country have been presented with a good look at the consequences of when police misconduct goes unchecked and bad policies, like militarizing local police forces, are allowed to continue. Things were bad enough in Ferguson for them to collectively qualify as the worst police misconduct of August, but the situation will be much worse if the lessons of Ferguson are not learned and the mistakes not corrected in the future—and not just in Ferguson, but in similar towns around the country.
Finally, a not-so-’honorable mention’ goes to the Denver police officer who tried to get out of his DUI arrest by telling the arresting officer “Bro, I’m a cop.” That he would even attempt such a ploy tells us something about the police subculture–where too many law enforcement officers come to believe that they are above the law. They aren’t, and the arresting officer did the right thing by getting a dangerous drunk driver off the streets—cop or not.
Today the Washington Post is starting a series of articles entitled, “Stop and Seize,” which take a critical look at the power of the government to take cash away from people using civil asset forfeiture laws. Here are a few of the findings from the Post investigation:
- There have been 61,998 cash seizures on highways and elsewhere since 9/11, made without search warrants or indictments through the Equitable Sharing Program, totaling more than $2.5 billion. State and local authorities kept more than $1.7 billion of that while Justice, Homeland Security and other federal agencies received $800 million. Half of the seizures were below $8,800.
- Only a sixth of the seizures were legally challenged, in part because of the costs of legal action against the government. But in 41 percent of cases — 4,455 — where there was a challenge, the government agreed to return money. The appeals process took more than a year in 40 percent of those cases and often required owners of the cash to sign agreements not to sue police over the seizures.
- Hundreds of state and local departments and drug task forces appear to rely on seized cash, despite a federal ban on the money to pay salaries or otherwise support budgets. The Post found that 298 departments and 210 task forces have seized the equivalent of 20 percent or more of their annual budgets since 2008.
- Agencies with police known to be participating in the Black Asphalt intelligence network have seen a 32 percent jump in seizures beginning in 2005, three times the rate of other police departments. Desert Snow-trained officers reported more than $427 million in cash seizures during highway stops in just one five-year period, according to company officials. More than 25,000 police have belonged to Black Asphalt, company officials said.
Mandrel Stuart, a 35-year-old African American owner of a small barbecue restaurant in Staunton, Va., was stunned when police took $17,550 from him during a stop in 2012 for a minor traffic infraction on Interstate 66 in Fairfax. He rejected a settlement with the government for half of his money and demanded a jury trial. He eventually got his money back but lost his business because he didn’t have the cash to pay his overhead. “I paid taxes on that money. I worked for that money,” Stuart said. “Why should I give them my money?”

That’s a question that Cato has been asking policymakers for many years now. In 1992, Cato published “American Forfeiture Law: When Property Owners Meet the Prosecutor.” In 1995, Cato published Forfeiting Our Property Rights: Is Your Property Safe from Seizure?, by the late Rep. Henry Hyde (R-IL). In 1999, Cato held a conference titled “Forfeiture Reform: Now, or Never?” More recently, in 2010, Cato hosted an event for the authors of Policing for Profit, a report from our friends at the Institute for Justice. Over the years, in blog posts, op-eds, congressional testimony, radio interviews, and university lectures, Cato scholars have been defending the rights of people from forfeiture abuse.
Large government projects often double in cost between when they are first considered and when they are finally completed. This pattern—call it “Edwards’ Law”—is revealed in story after story about highways, airports, computer systems, and other types of government infrastructure.
The most expensive train station in the U.S. is taking shape at the site of the former World Trade Center, a majestic marble-and-steel commuter hub that was seen by project boosters as a landmark to American hope and resilience.
Instead, the terminal connecting New Jersey with downtown Manhattan has turned into a public-works embarrassment. Overtaking the project’s emotional resonance is a practical question: How could such a high-profile project fall eight years behind schedule and at least $2 billion over budget?
An analysis of federal oversight reports viewed by The Wall Street Journal and interviews with current and former officials show a project sunk in a morass of politics and government.
Edwards’ Law takes effect:
When completed in 2015, the station is on track to cost between $3.7 billion and $4 billion, more than double its original budget of $1.7 billion to $2 billion.
What were some of the causes of the cost doubling? Typical government stuff, it appears, such as squabbling between agencies and political incentives detached from the bottom line:
Those redesigning the World Trade Center—destroyed by terrorists in 2001—were besieged by demands from various agencies and officials, and “the answer was never, ‘No,’ ” said Christopher Ward, executive director from 2008 to 2011 of the Port Authority of New York and New Jersey, the project’s builder.
…The Port Authority, run jointly by the two states, has long been known for political infighting. City, state and federal agencies, as well as real-estate developer Larry Silverstein, also joined in. In public and private clashes, they each pushed to include their own ideas, making the site’s design ever more complex, former project officials said. These disputes added significant delays and costs to the transit station…
Former New York Mayor Michael Bloomberg, for example, insisted the memorial plaza be finished by the 10th anniversary of the Sept. 11, 2001 attacks. The request added more than $100 million to costs and months of delay…
Conflicting goals among agencies were a major cause of delays and added costs, an analysis by the Journal of monthly oversight reports by the Federal Transit Administration shows.
Here is a rule of thumb for citizens to remember when they hear about a proposed government project: whatever dollar figure the politician claims, double it for a more realistic estimate. For more, see here.
ObamaCare Exchanges Recklessly, Often Unlawfully, Throwing Taxpayer Money At Health Insurance Companies
Michael F. Cannon
The Obama administration has no idea how many people are currently enrolled [in exchanges] but they keep cutting checks for hundreds of millions of dollars a month for insurance subsidies for people who may or may not have paid their premium, continued their insurance, or are even legal residents.
And if you think they’re doing those “enrollees” a favor, remember that if it turns out a recipient wasn’t eligible for the subsidy, he or she has to pay the money back.
Surprised? Don’t be. This is part of a deliberate, consistent strategy by the Obama administration to throw money at individual voters and key health care industry groups—lawfully or not—to buy support for this consistently unpopular law.
Jason Millman of the Washington Post’s Wonkblog casually assumes that Democratic-appointed judges can be counted on to uphold the Affordable Care Act and its implementation against any legal challenge:
The Obama administration and supporters of the president’s health-care law are probably breathing a little easier this morning after some pretty big news from the U.S. Court of Appeals for the District of Columbia.
A few months after a three-member panel of the court ruled the federal government can’t provide insurance subsidies through federal-run exchanges in 36 states, the court on Thursday granted the Obama administration’s request for the entire panel to re-hear the case. The en banc hearing, as it’s known, wasn’t entirely unexpected—and with a heavy makeup of Democratic-appointed judges on the panel, it seems likely the administration will get a more favorable ruling when the entire court reconsiders the case later this year.
I don’t know. I know that Obamacare passed in both the House and Senate on straight party-line votes, over unanimous Republican opposition. But judges aren’t politicians. With a slew of Reagan- and Bush-appointed judges striking down gay marriage bans, I hope and expect that Democratic-appointed judges will show similar nonpartisan judiciousness when they consider the challenge to the IRS’s illegal implementation of insurance subsidies.
[cross-posted, slightly adapted, from Overlawyered]
…cheese counters could soon be a lot less aromatic, with several popular cheeses falling victim to a more zealous U.S. Food and Drug Administration. Roquefort — France’s top-selling blue — is in the agency’s cross hairs along with raw-milk versions of Morbier, St. Nectaire and Tomme de Savoie. …
Of course, French creameries haven’t changed their recipes for any of these classic cheeses. But their wheels are flunking now because the FDA has drastically cut allowances for a typically harmless bacterium by a factor of 10.
The new rules have resulted in holds even on super-safe Parmigiano Reggiano, and the risk of losing a costly shipment of a perishable commodity is likely to be enough to drive many European producers out of the market for export to America entirely. Highly praised artisanal cheese makers in the United States are facing shutdown as well.
They told us this administration was going to be run by wine-and-cheese liberals. Now where are they when they could do us some good?
Peter Van Doren
Rather than selling cars through independent dealers, the upstart electric car maker Tesla sells its automobiles directly to consumers. However, many states prohibit direct auto sales, thanks to laws from the mid-20th century that ostensibly were intended to protect dealers from automakers’ market power. The need for that protection was questionable when the laws and regulations were adopted and is even more dubious in today’s highly competitive auto market. But such laws are especially inappropriate when applied to a small new automaker that solely wants to engage in direct sales.
This week, the Georgia Automobile Dealers Association filed a petition with the state’s Department of Revenue in an attempt to bar further sales of Tesla sedans. Such battles have erupted in numerous states, from Missouri to New Jersey. In the latest issue of Regulation, University of Michigan Law professor Daniel Crane argues that dealer distribution restrictions are based on faulty ideas of consumer protection. Traditional dealers claim that competition among a brand’s dealers prevents the manufacturer from “gouging” consumers and extracting monopoly profits. Crane argues that standard economic theory demonstrates that these claims are nonsense. Firms with market power will be able to claim monopoly profits, regardless of whether middlemen, such as dealerships, are involved.
Moreover, by restricting competition among business models for auto sales, laws such as those in Georgia stifle competition among automakers. When companies such as Tesla seek to lower costs through innovative business designs, they face costly regulatory hurdles and legal challenges such as the sales ban in Georgia. These laws protect existing dealers and hurt consumers.
Mark A. Calabria
A recent paper from the Brookings Institution makes an important observation: businesses are “becoming older,” that is, the age profile of American business is increasingly dominated by older firms. One reason is that the entry rate of new businesses has been steadily declining for decades.
While this decline has been witnessed across firm sizes, it has been most dramatic among small firms. One potential contributor to the decline in new small businesses is the long-run decline in the personal savings rate. According to the Census Bureau’s Survey of Business Owners, the number-one source of capital for new businesses, by a long shot, is the personal savings of the owner. For firms with employees, about 72 percent relied upon personal/family savings for start-up capital. The other dominant sources of capital, credit cards and home equity, were much less frequently used. Recent legislative changes (the 2009 Card Act) and a volatile housing market have made those sources less reliable in recent years.
The chart below compares the trend in entry rates for new business establishments with fewer than five employees with the personal savings rate. The correlation between the two is 0.62. While both the decline in business entry and savings are likely driven by common macroeconomic factors, it seems plausible that if households have fewer savings, they are also less likely to be able to start a business. My preferred response would be to eliminate policies, such as those in the tax code or current monetary policy, which penalize savings. I suspect others might have different suggestions.
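The correlation cited here is just the standard Pearson coefficient between the two annual series. A minimal sketch of that calculation, using made-up placeholder numbers rather than the actual Census entry-rate and personal-savings data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical example: entry rate (%) of establishments with fewer than
# five employees, and the personal savings rate (%), by year. These
# numbers are illustrative only, not the series behind the 0.62 figure.
entry_rate = [11.2, 10.8, 10.1, 9.7, 9.0, 8.6]
savings_rate = [9.5, 8.7, 7.9, 7.2, 6.0, 5.5]

print(round(pearson(entry_rate, savings_rate), 2))
```

A coefficient near 1 means the two series move down (or up) together; it says nothing, of course, about which one drives the other, which is why the post hedges with “common macroeconomic factors.”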
Michael F. Cannon
My reaction to the D.C. Circuit’s decision to grant en banc review of Halbig v. Burwell in a nutshell:
- It is unnecessary.
- It is unwise.
- It is unfortunate.
- It appears political, as would a decision to overrule Halbig.
- It will likely only delay Supreme Court review.
- En banc review does not necessarily mean the court will overturn Halbig, though it doesn’t look good.
- I predict that even if the court overturns Halbig, the Obama administration will lose ground.
- The D.C. Circuit will not have the last word.
If you want to go outside the nutshell, where I unpack all this with more words and facts and links, go here.
A quick note on unfortunate happenings at the U.S. Court of Appeals for the D.C. Circuit this morning: The court vacated its excellent July 22 decision in Halbig v. Burwell, which had held that Obamacare’s plain language precluded the federal government from subsidizing the health insurance premiums of policies people obtain through exchanges established by the federal government. Just hours after that July 22 decision came down, the Fourth Circuit Court of Appeals ruled the other way on the question in King v. Burwell, setting up a circuit split and a reason for the Supreme Court to promptly decide the question, especially given the scope and magnitude of the issues at stake (36 states have declined to establish state exchanges, for which Obamacare does provide subsidies).
Thus, with the D.C. Circuit now having vacated its three-judge panel’s decision and having agreed to rehear the case en banc (by the entire court), there is no longer a circuit split and less urgency for the Supreme Court to take up the issue. Other cases challenging the federal subsidies are coming along, but for the moment, this is where things are. For more on these issues, see Ilya’s latest post and a WSJ op-ed by Adam White, both written before this morning’s decision. It’s rare for any circuit, but especially for the D.C. Circuit, to grant en banc rehearings. But then nothing has been normal about Obamacare, which is what you should expect when so politicized a program is thrust upon the nation.
When the Affordable Care Act was being debated in Congress, former House Speaker Nancy Pelosi infamously insisted that “we have to pass the bill to find out what’s in it.” It turns out, however, that the Obama administration—which has been making it up as it goes along with regard to ACA enforcement—doesn’t care “what’s in it.”
The IRS in particular has been implementing Obamacare as it thinks the law should be, not as it is. The ACA encourages states to establish health insurance exchanges by offering people who get their health coverage “through an Exchange established by the State” a tax credit—a subsidy to help them pay their premiums. In the event a state declines to establish an exchange, Section 1321 empowers the Department of Health and Human Services to establish a federal exchange in that state (without providing for the premium subsidy).
When, contrary to the expectations of the law’s architects, 34 states declined to establish an exchange—two more state exchanges have since failed—the IRS decided that those getting their insurance on federally established exchanges should qualify for tax credits regardless of the statutory text. In conflict with the U.S. Court of Appeals for the D.C. Circuit in a similar case called Halbig v. Burwell, the Fourth Circuit in King v. Burwell found the legal text to be ambiguous and thus deferred to the IRS interpretation.
The so-called Chevron doctrine counsels that statutory text controls when Congress has spoken clearly on an issue. But where Congress is ambiguous or silent, the agency can fill the regulatory gap with its own rules and policies. The problem here is that the ACA’s text was not ambiguous, and there is no evidence that Congress intended to delegate to the IRS the power to determine whether billions of taxpayer dollars should annually be disbursed to those purchasing health care coverage on federal exchanges. That the Fourth Circuit has bent over backwards to accommodate the administration’s latest Obamacare “fix” shows that it, too, is not so concerned with “what’s in” the law.
To that end, Cato joined four other organizations to support the plaintiffs’ petition for review by the Supreme Court. Our brief argues that the Court should hear the case because it offers the opportunity to reverse potentially grave harm to the separation of powers, to correct a misapplication of the Chevron doctrine, and to restore the idea that drastically altering the operation of a major legislative act belongs to the political process, not to the back rooms of an administrative agency. Just because those who voted for the ACA didn’t care what it said doesn’t mean that the executive and judicial branches should also turn a blind eye.
To see the legal machinations now at play in these cases regarding the Obamacare-IRS tax credits, see my recent op-ed in the National Law Journal. Since that was published this past Monday, the government received a 30-day extension to file its response to the King cert petition. That means that the Supreme Court will be considering at some point next month whether to take the case.
Mark A. Calabria
In the Dodd-Frank Act, Congress, without irony, decided the best way to end “too big to fail” was to have a committee of regulators label certain companies “too big to fail.” That committee, established under Title I of Dodd-Frank, is called the Financial Stability Oversight Council (FSOC) and is chaired by the Treasury Secretary. Like so much of Dodd-Frank, FSOC gets to write its own rules. Unfortunately FSOC won’t even write those rules, but instead it has decided that it knows systemic risk when it sees it. This has led to an ad hoc process that almost makes the bailouts of 2008 look systematic.
Compare the process for asset management firms and that for insurance companies. In late 2013, the Treasury released a report on the asset management industry. It was widely viewed as an attempt to make the case for labeling some asset management firms “systemic.” The report was widely criticized. Such criticism did not stop FSOC from conducting a public conference on the asset management industry in May 2014. Whether it was the public reaction to the conference or the paper, FSOC has largely abandoned labeling asset managers as “too big to fail.” That was an appropriate outcome, as firms in that industry are not systemic and shouldn’t be led to expect a federal rescue.
Now don’t get me wrong: A shoddy report and a conference do not constitute a thorough process. As someone who has overseen a rulemaking process, I can say they do not even meet the basics of the Administrative Procedure Act. But just when that process seemed wholly inadequate, along comes the “process” for insurance companies.
Not unexpectedly, AIG went along without a peep. Given its role in the crisis that’s not a surprise. But there’s been no report or even a conference on whether insurance companies pose systemic risk. Completing either one would, of course, require FSOC to define systemic risk and to offer some minimal metrics. Instead, what we have is unelected bureaucrats simply making it up as they go along.
And here I was thinking Dodd-Frank was meant to end the haphazard behavior of regulators in 2008 and lead us towards a predictable rules-based approach to ending systemic risk!
The federal government owned or leased 650,000 motor vehicles in fiscal year 2012. DHS’s fleet was the government’s second largest, consisting of 56,000 vehicles. This armada of cars and trucks cost taxpayers $534 million in 2012. Given the large expense, the IG reviewed a portion of the DHS fleet, 753 vehicles, “to determine whether, for FY2012, the Department met requirements to right size the composition of its motor vehicle fleet, [and] eliminate underused vehicles.”
The IG found that DHS vehicle management is poor. Vehicle identification numbers were not listed correctly for 39 percent of vehicles. Fifty-four percent of acquisition dates did not match other department records. The most damning finding was that 59 percent of vehicles were underused, meaning they were driven less than 12,000 miles, the governmental standard, in one year. Apparently, DHS has far too many cars and trucks, even assuming that the vehicles are used for efficient purposes.
The IG found that DHS does not purge unnecessary vehicles. Eighty-six percent of the underused vehicles were still owned by the department a year later. DHS was unable to provide documentation justifying vehicle retention and the additional expense.
These results led the IG to conclude: “we estimate that operating these underused vehicles cost between $35.3 million and $48.6 million. For these reasons, DHS cannot ensure its vehicle fleet composition is cost efficient, complies with department requirements, and has the correct number of motor vehicles to accomplish this mission.”
This is not the first time DHS has been criticized for its handling of its vehicle fleet. In 2013, the Government Accountability Office criticized DHS for similar problems, including incomplete data and failing to adequately analyze and utilize its vehicles. DHS is also not the only agency with underutilized vehicles, as GAO has highlighted for years.
The federal government spends $3 billion annually on its vehicle fleet, excluding the United States Postal Service. This should be an area in which bipartisan reforms are possible.
Patrick J. Michaels and Paul C. "Chip" Knappenberger
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
In our post last week titled “Climate Alarmism: When is this Bozo Going Down?” we described how new research increasingly casts doubt on the validity of climate models and their projections of future climate change. It is increasingly clear that climate models simply predict too much warming from human greenhouse gas emissions.
But the scientific community, or at least that part of it which makes its living off climate alarm, is slow to accept this.
Who can blame these folks? More money flows from the government into universities (or government labs) to study the effects of climate change if we all agree that human greenhouse gas emissions are leading to climate change of a dangerous magnitude.
So it is left to the emeritus or retired profs to lay bare the truth.
A fine example of this can be found in a recent article in the New York Times’ DotEarth blog run by ex-Times science reporter Andy Revkin. In his story looking into the implications of new scientific findings concerning the potential impacts of ocean circulation variability on our understanding of the behavior of the global average surface temperature history (parts of which we described in our last post), Revkin interviewed four prominent climate researchers. The level of confidence that each showed in the mainstream (climate model-driven) global warming meme (despite this new research suggesting that something may be rotten in the state of Denmark) appears proportional to how much professional advancement still lies ahead.
The views of Josh Willis (a young scientist from the government’s Jet Propulsion Laboratory) on climate change seemed unshaken by the new research:
In regards to your question, if you mean how robust is the “slowdown” in global surface warming, the answer is it is probably just barely statistically significant. If you are wondering whether it is meaningful in terms of the public discourse about climate change, I would say the answer is no. The basic story of human caused global warming and its coming impacts is still the same: humans are causing it and the future will bring higher sea levels and warmer temperatures, the only questions are: how much and how fast?
Andrew Dessler, a mid-career professor at Texas A&M, appeared pretty much equally unmoved:
Second, I think it’s important to put the hiatus in context. This is not an existential threat to the mainstream theory of climate. We are not going to find out that, lo and behold, carbon dioxide is not a greenhouse gas and is not causing warming. Rather, I expect that the hiatus will help us understand how ocean variability interacts with the long-term warming that humans are causing. In a few years, as we get to understand this more, skeptics will move on (just like they dropped arguments about the hockey stick and about the surface station record) to their next reason not to believe climate science.
John Michael Wallace, a late-career professor at the University of Washington who recently became emeritus, expressed much more interest in the idea that the new research could lower confidence in just how much human greenhouse gases were impacting climate change:
…It seemed to me that the hiatus in the warming, which by then was approaching ten years in length, should not be dismissed as a statistical fluke. It was as legitimate a part of the record as the rapid rises in global-mean temperature in the 1980s and 1990s…
The new paper by Tung and Chen goes much farther than we did in making the case that Atlantic multidecadal variability needs to be considered in the attribution of climate change. I’m glad to see that it is attracting attention in the scientific community, along with recent papers of Kosaka et al. and Meehl et al. emphasizing the role of ENSO-like variability. I hope this will lead to a broader discussion about the contribution of natural variability to local climate trends and to the statistics of extreme events.
And finally Carl Wunsch, late-career professor emeritus at M.I.T., was pretty frank about the new research and the state of climate science:
The central problem of climate science is to ask what you do and say when your data are, by almost any standard, inadequate? If I spend three years analyzing my data, and the only defensible inference is that “the data are inadequate to answer the question,” how do you publish? How do you get your grant renewed? A common answer is to distort the calculation of the uncertainty, or ignore it altogether, and proclaim an exciting story that the New York Times will pick up.
We couldn’t have said that better ourselves!
Uber has experienced an explosion in signups in Germany after a Frankfurt court handed down a temporary injunction banning the transport technology company. The countrywide ban follows a suit brought against Uber by Taxi Deutschland, an association of taxi dispatchers. Taxi Deutschland claimed that Uber did not have the necessary permits to operate. Under German law, drivers without a commercial license can pick up passengers as long as they do not charge more than the operating costs for the ride.
The Telegraph notes that Uber says it will continue to operate in Germany despite Taxi Deutschland claiming it will seek a fine of as much as €250,000 every time Uber provides a service without a license.
Since the injunction was issued, Uber claims it has experienced a more than 500 percent increase in signups in some parts of Germany compared with the same period last week. As City A.M.’s Guy Bentley notes, the Uber app has been downloaded even in parts of Germany where Uber drivers do not operate:
The taxi app company doubled signups in all five German cities where it operates, with demand in both Hamburg and Düsseldorf rising over 500 per cent. Even people who don’t live in cities where Uber operates are downloading the app, perhaps in an act of capitalist solidarity. Uber now ranks in the top ten most downloaded apps in Germany.
According to Uber, the increased signups on September 2 in the five German cities where it operates compared to the same period last week are as follows:
- Uber Hamburg up 590 percent
- Uber Düsseldorf up 518 percent
- Uber Munich up 329 percent
- Uber Berlin up 270 percent
- Uber Frankfurt up 228 percent
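These figures are week-over-week comparisons: signups on September 2 measured against the same period the previous week. A minimal sketch of that arithmetic, using hypothetical signup counts (Uber released only the percentages, not the raw numbers):

```python
def percent_increase(current, baseline):
    """Percent change from a baseline period to the current period."""
    return (current - baseline) / baseline * 100.0

# Hypothetical counts: if a city had 100 signups in the baseline week
# and 690 in the current week, that is the 590 percent rise reported
# for Hamburg.
print(percent_increase(690, 100))  # 590.0
```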
I noted in July that, according to Uber’s U.K. general manager, Uber enjoyed an 850 percent increase in British signups in one day following a London black cab protest against how Uber was being treated by London’s transportation agency.
Taxi companies are understandably frustrated by the rise of Uber and will continue to seek legal means to stifle the company’s growth. However, events in Germany and the U.K. have shown that attacks on Uber can provide the company with welcome exposure and new customers.
[cross-posted from Overlawyered]
The need for police forces isn’t going away, so what practical suggestions do libertarians have in the here and now for discouraging police resort to excessive force? Thanks to Ed Krayewski at Reason for quoting me on the subject of tackling the power of police unions, which not only protect bad actors from removal but tie the hands of well-intentioned administrators in a dozen other ways and exert political pressure against effective reform. (Other suggestions in the piece: increase use of body- and dash-cams, extend the role of civilian oversight boards, and end the Drug War; relatedly, curtail SWAT tactics and the use of other paramilitary force.)
On a perhaps not unrelated note, the Washington Post reports today on the police shooting of an unarmed suburban Washington, D.C. man in his front doorway after he refused to let police into his home following a domestic call. The fact that jumped out at me was that, a year after it happened, the Fairfax County police department is still releasing no information about the incident, not even the name of the officer who pulled the trigger. According to the Post’s account (related lawsuit), police shot kitchen contractor John Geer once but first aid did not arrive until an hour later — he bled to death — and his body remained unmoved for hours, like that of Michael Brown on the street in Ferguson, Mo. The Fairfax chief says his department is just following its own policy by not releasing the officer’s name or other information while an investigation is pending (and pending and pending) — but how that policy came to be adopted, and for whose benefit, are questions worth asking.
Michael D. Tanner
A New York Times editorial yesterday brought attention to the severe shortage of kidneys available for transplant. More than 100,000 Americans are on the waiting list for a kidney transplant, and the average wait time is almost five years. Last year there were only 4,715 transplants from living donors, and the vast majority of those donations came from relatives; only 463 kidney donations came from unrelated individuals. Relative to the pool of people waiting, this is little more than a drop in the bucket. Clearly, demand for kidneys is far outpacing supply, our system for supplying viable organs to those who need them is failing, and these failures have serious consequences: the National Kidney Foundation estimates that 3,381 patients died while waiting for a kidney transplant last year. Absent change, this problem will only get worse.
It is commendable that the editorial board raises this issue, but it then devotes the rest of the editorial to finding ways to remedy the problem without having to “resor[t] to paying for kidneys.” The primary reasons given for dismissing any kind of market for organs are that such a market is prohibited by law and is not supported by the World Health Organization. There are ethical concerns and fears of exploitation, and these should not be summarily dismissed. However, given how badly our current system is failing, could it be time to rethink our policy? Could a free market better address the needs of thousands of organ transplant patients? Because organ sales are currently illegal in most industrialized countries, there is almost no empirical data and no sense of what to expect if we were to make this shift. However, at a Cato event in March, Sigrid Fry-Revere shared what she learned from observing one of the only countries with a free market for organ donations: “The Kidney Sellers: A Journey of Discovery in Iran” (featuring the author, Sigrid Fry-Revere).
The Sunlight Foundation reports that the Federal Communications Commission has received more than 800,000 public comments on the topic of “net neutrality,” more than 60 percent of them form letters written by organized campaigns and more than 200 from law firms on behalf of themselves or their clients. That’s an impressive outpouring of public comments.
But Berin Szoka, a long-ago Cato intern who now runs TechFreedom, argues, “This debate is no longer about net neutrality. A radical fringe has hijacked the conversation in an attempt to undo two decades of bipartisan consensus against heavy-handed government control of the Internet.” TechFreedom has just launched DontBreakThe.Net, a web-based campaign to expose the danger facing the internet from well-meaning demands for something called “net neutrality.” In an open letter to FCC chairman Tom Wheeler, Szoka says:
Subjecting broadband to Title II of the 1996 Telecom Act would trigger endless litigation, cripple investment, slow broadband deployment and upgrades, and thus harm underserved communities. Al Gore may not have exactly ‘invented the Internet,’ but President Clinton’s FCC chairman Bill Kennard deserves much credit for choosing not to embroil the Internet in what he called the ‘morass’ of Title II. Kennard’s approach of ‘vigilant restraint’ unleashed over $1 trillion in private investment, which built the broadband networks everyone takes for granted today. Abandoning that approach would truly break the Internet.
Net Neutrality supporters such as Google, Facebook, and the NAACP haven’t jumped on the Title II bandwagon because they understand that Title II would threaten the entire Internet. Title II proponents claim the FCC can simply ‘reclassify’ broadband, but in truth, there’s no such thing as reclassification, only re-interpretation of the key definitions of the 1996 Telecom Act. If the FCC re-opens that Pandora’s Box, the bright line Chairman Kennard drew between Title II and the Internet will disappear forever. Startups and edge/content providers will inevitably be caught in the fray. And besides, the FCC has a long history of overstepping its bounds.
Invoking Title II would trigger years of litigation. It’s not clear the FCC could ultimately ‘reclassify’ broadband at all, and even less clear the FCC could, or actually would, follow through on talk of paring back Title II’s most burdensome rules, like retail price controls. Even if ‘reclassification’ stood up in court, the FCC still couldn’t do what net neutrality hardliners want: banning prioritization. The FCC would succeed only in creating a dark cloud of legal uncertainty. That would slow broadband upgrades and discourage new entrants, such as Google Fiber, from entering the market at all.
The best policy would be to maintain the ‘Hands off the Net’ approach that has otherwise prevailed for 20 years. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears. As former FCC Chairman Bill Kennard put it, ‘I don’t want to dump the whole morass of Title II regulation on the cable pipe.’ If we want to maintain a free and open Internet, and encourage broadband competition, the FCC would do well to heed his advice.
TechFreedom created this catchy graphic for its campaign to encourage more people to understand what’s at stake in the so-called “net neutrality” fight.
We’ve gotten used to American dominance in the internet/software industries. That may not last forever:
Over the weekend, China announced that it was planning to launch a homegrown operating system to replace Windows and Android for running the nation’s desktop and mobile devices. The first iteration of this “Made in China” OS could roll out as early as October.
There is some good and some bad in this. The good is that more competition in these industries would be welcome: while there is already a fair amount of disruption and innovation, a couple of firms are dominant in some key products and services, and consumers would benefit tremendously from more competition.
At the same time, this may not play out exactly like we free market supporters would like. For example:
- “This new Chinese OS, almost certainly, would be much more of a top-down initiative from the Chinese government (via the Ministry of Industry and Information Technology) that may be more about meeting the needs of the government rather than meeting the needs of consumers.”
- “China’s government could build in backdoors and trapdoors to make it easier to monitor, control and censor users.”
- This new entrant into the OS market may owe its existence in part to the fact that “Microsoft Windows 8 has been banned in China for use on new government computers since May.”
Americans have gained a lot from Chinese exports of products over the years. It would be great if the same thing could happen in the internet and software worlds.