Entries in National Broadband Plan (6)


NTCA: Include Text and Broadband to Cut USF Contribution Rate in Half 

With $2b Decrease to Revenue Base in 2 Years, “The USF is Being Starved”

When you glance at the National Broadband Plan action item website, you will see that it is reportedly at the 80% completion mark as the plan nears its two-year anniversary. Then, look at the section called “Accelerate Universal Broadband Access and Adoption”—this is where all the “good stuff” on USF/ICC lives—and you will see that the FCC’s goals have largely been achieved. Except for one: the elusive USF Contributions NPRM. According to the action item agenda, “To stabilize support mechanisms for universal service programs, in Q4 2010 propose rules to reform the process for collecting contributions to the USF.” Well, here we are a year after this objective should have been achieved, and still no progress on USF contributions reform…and NTCA is not letting the FCC forget about it, as evidenced by a January 9, 2012 Ex Parte meeting with FCC staff.

NTCA discussed with FCC staff “prompt and effective reform of the contributions mechanism that enables the federal universal service fund.” NTCA argued for a revenue-based contributions mechanism that “is technology neutral and best captures the value that consumers place on competing services;” “reflects the balance that consumers strike between different service offerings and the evolution of consumer preference;” is the “most equitable means of sharing responsibility;” and “can be implemented quickly with little burden to providers or the industry.” NTCA further argued that a revenue-based mechanism would be stabilizing and not overly complex, unlike a mechanism based on numbers or connections.

NTCA believes that the FCC has ample authority to extend the contributions base. One of the most convincing arguments was that the FCC has obviously made it a key mission to reform USF for the broadband era, so it logically follows that contributions should also be broadband-centric. NTCA explains, “Given that the Commission has indicated that retooling the USF program to support broadband-capable networks is among the most significant policy priorities, it would be both self-defeating and ironically anomalous for the Commission to build a broadband-focused fund of tomorrow on a foundation comprised solely of legacy services that fewer and fewer customers are buying.”

“Fewer and fewer” is certainly no exaggeration—if the USF contribution base is kept as it is, there soon won’t be anyone left to keep it afloat. NTCA estimates that “Over the past 2 years, assessable Telecommunications Revenues declined by $2 billion.” Just looking ahead three years, JSI Capital Advisors has projected that total wired access lines will decline from 86.49m in 2012 to 56.55m in 2015; meanwhile, broadband connections will increase from 102.30m in 2012 to 115.48m in 2015 (The ILEC Advisor: Communications Industry Forecast 2011-2020: ILEC and CLEC Access Lines; Communications Industry Forecast 2011-2020: Broadband Connections and Market Share). So… why hasn’t the FCC broadened the contribution base to include broadband connections yet?
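Those projections imply a stark asymmetry that a quick back-of-the-envelope calculation makes concrete (a sketch using only the JSI endpoint figures above, assuming a constant annual rate of change):

```python
def cagr(start_millions, end_millions, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end_millions / start_millions) ** (1 / years) - 1

# JSI Capital Advisors projections cited above (2012 -> 2015, in millions)
wired_decline = cagr(86.49, 56.55, 3)       # roughly -13% per year
broadband_growth = cagr(102.30, 115.48, 3)  # roughly +4% per year
```

The assessable voice base is shrinking roughly three times as fast as broadband connections are growing, which is the crux of NTCA's argument for broadening the base.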

Much of the fault likely lies in the challenge of deciding exactly who should contribute, and how much. NTCA recommends a revenue-based methodology, and suggests that the contributions base should be broadened to include non-interconnected VoIP revenue, fixed and mobile broadband revenues (a $49b base in 2012), and texting revenue (a $20b base in 2012). NTCA argues, “Non-interconnected VoIP and texting cannot function without supported networks, and should thus contribute.” Furthermore, “texting is increasingly a substitute for voice calls.” According to NTCA, including broadband and texting revenues in the contributions base could likely cut the steadily increasing contributions factor in half. Fixing the supply side of USF is crucial, and NTCA stresses that “The shrinking Contributions Base must be fixed or all of Universal Service is at risk.”
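The arithmetic behind the halve-the-factor claim is simple: the contribution factor is program demand divided by the assessable revenue base. The dollar figures below are illustrative placeholders, not actual FCC numbers; the point is that adding NTCA's proposed $69b of new broadband and texting revenue roughly doubles a base of that magnitude:

```python
def contribution_factor(program_demand_b, assessable_base_b):
    """USF contribution factor: program funding need / assessable revenue base."""
    return program_demand_b / assessable_base_b

# Illustrative figures only (not actual FCC numbers): an $8b program drawn
# from a ~$70b assessable base, before and after adding NTCA's proposed
# $49b of broadband revenue and $20b of texting revenue.
old_factor = contribution_factor(8.0, 70.0)            # ~11.4%
new_factor = contribution_factor(8.0, 70.0 + 49 + 20)  # ~5.8%, roughly half
```

Note the implication: halving the factor by adding $69b of revenue only works if the existing assessable base is itself on the order of $70b, which is why the base is so vulnerable to further voice-revenue erosion.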

So you include voice lines, broadband lines, all VoIP and texting in the contributions base—is that it? Probably not, according to NTCA, but the FCC should not hesitate to implement initial reforms before it decides who else should contribute. NTCA urges the FCC to study “how to address business models that rely heavily upon driving traffic from others to specific websites or web-based enterprises.”

Determining contributions for web-based businesses that “place substantial burdens on networks” and probably wouldn’t exist if not for the networks they utilize is definitely a murky area, but as NTCA suggests, this could be a longer-term goal. How would the FCC begin to determine which web-based businesses should pay into USF? Would it be based on traffic, revenue, bandwidth utilization, or something else? With the lines between service provider and content provider becoming increasingly blurry (Google being the prime example), how will the FCC apply a new methodology to companies that fall into different categories? One can expect content providers to cry foul that the FCC is attempting to stifle innovation by imposing fees in this realm, but if it weren’t for the networks, these businesses would not exist. They certainly wouldn’t be generating billions in revenue, or placing massive strain on broadband networks, thus requiring service providers to continually invest in upgrading their facilities…

The debate over contributions reform is likely to heat up in the coming months, and it will be very interesting to see what the FCC—and the industry—comes up with for new methodologies and solutions to address the rapidly shrinking contributions base. What do you think should happen…and what do you think will happen?

NTCA’s Ex Parte filing is available here.


State USF Reform Impact Studies Predict RLEC “Death Spiral”

At Least 4 Rural States in Big Trouble if USF Reform Goes Wrong for RLECs

Anxiety about the impending decision on USF/ICC reform is definitely reaching a boiling point, if the long list of ex parte filings in the last couple of weeks is any indication. In addition to industry pleas and gripes for or against particular plans, two more states have weighed in on the potential impact of reducing or eliminating high cost USF support for RLECs. Universities from Colorado and Missouri conducted economic modeling analyses to determine the impact of USF reform on state jobs, income and taxes, similar to previous studies performed by universities in New Mexico and Kansas (The ILEC Advisor: New Mexico Study Depicts Life Without USF; September 27, 2011).

The state USF and National Broadband Plan impact studies do not take a position on any proposed course of action; rather they analyze what might happen in the state’s economy if RLECs lose their current level of high cost USF support. It should be noted that the studies only look at the impact of reducing or eliminating USF, not ICC as well. The results of losing USF alone are pretty dire, and I can’t imagine any of these states would be better off after further RLEC revenue reductions due to dramatic drops in ICC. For states where access revenue makes up about 30% of total RLEC revenue, you could probably just double the losses shown in these studies, which were mostly calculated assuming that RLECs receive (and stand to lose) about 30% of their revenue from USF.

Like the New Mexico study, the study conducted by Missouri State University’s Bureau of Economic Research calculates potential losses based on the assumption that the state’s 35 RLECs would lose all of their USF support. According to the authors, “It is highly probable that many of the ILECs in Missouri will not be able to survive such a transition in the long run and would go bankrupt.” Furthermore, “Even if the ILECs would survive they would decrease their investment in new infrastructure and equipment by approximately 40%,” which may negatively impact the quality, availability and affordability of telephone and broadband service. The combination of more expensive and less available broadband and the possibility that customers would have to drop services are “two things the FCC has stated it does not want.”

The Missouri study explains two possible options that RLECs would have if USF were eliminated. The first option is to increase rates, but “for every 10% increase in price that the ILECs use to offset the decrease in universal service funds, they will lose 7.6% of their customers” to either wireless substitution or no telecommunications service at all. The authors explain that “this creates a death spiral. In order to regain revenues, prices are raised, customers lost, creating pressure to raise prices again, which will again result in more customer losses. The result could be that a significant number of the ILECs, unable to make up the lost revenue, will cease operations all together.” The second option would be to drastically cut costs, including firing employees and reducing investment, which would contribute to unemployment and slow down broadband deployment.
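The study's elasticity figure can be turned into a rough simulation of that spiral. This is my own sketch, not the study's model; it assumes the 10%-hike/7.6%-churn relationship holds at every step and that the lost USF equals 30% of revenue:

```python
def hikes_to_recover(revenue_gap=0.30, hike=0.10, churn=0.076, max_hikes=1000):
    """Count successive price hikes needed to replace lost USF revenue,
    where each hike also sheds a fixed share of remaining customers."""
    target = 1.0 / (1.0 - revenue_gap)  # revenue multiple needed from survivors
    price = customers = 1.0
    hikes = 0
    while price * customers < target and hikes < max_hikes:
        price *= 1.0 + hike
        customers *= 1.0 - churn
        hikes += 1
    return hikes, customers

hikes, remaining = hikes_to_recover()  # about 22 hikes, ~18% of customers left
```

Each hike nets only about 1.6% more revenue (1.10 × 0.924 ≈ 1.016), so closing a 30% gap takes roughly 22 successive hikes and sheds more than four-fifths of the customer base. That is the death spiral in numbers.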

The Colorado State University study took the analysis a little further. This study modeled different scenarios—eliminating USF, reducing USF by 30%, raising rates to help recover lost USF, and not raising rates, where the cost of losing USF support would be “borne entirely by providers.” The Colorado study reiterated the findings in the Missouri study, and concluded, “With such dramatic losses providers would likely follow one of two courses, each with important ramifications for Colorado’s rural communities.” The two courses are going out of business or surviving but with a much lower level of service. Another “death spiral” would likely occur in Colorado: “Some rural providers indicated that it would be difficult for them to raise prices without losing substantial numbers of customers. There was some indication that the impact of losing customers would be the provider going out of business.”

The four studies published so far used similar input-output models to illustrate the impact of eliminating or reducing USF on jobs, income and local/state taxes. The results are summarized in the table below. Although some of the studies project losses over a period of 5-10 years, I felt that the immediate impact results (estimates for 2012) were most accurate and revealing, given the assumptions in the studies. Some of the longer-range projections appeared fuzzy to me, as the studies did not consider possible USF replacements, like CAF or new revenue sources. The Missouri study also added the cumulative losses over 5 years, which led me to believe that their longer-term results were over-estimated.

We won’t know for sure until the final rules are revealed (expected this week, with a vote at the Oct. 27 FCC Open Meeting) exactly what percentage of USF support RLECs stand to lose (or how quickly they will lose it), but it might be an interesting exercise for companies to run the numbers and see what the impact of reducing or eliminating USF would be for their specific communities in terms of direct and indirect job, income and tax losses. How do you plan to offset USF revenue losses in order to avoid a “death spiral”? Obviously, generating more revenue is the solution; but what specific operating strategies and new business opportunities have enough momentum to overcome such significant losses and ensure future growth and profitability?

The Missouri and Colorado studies are available at SaveRuralBroadband.org.


LightSquared Forges Ahead Despite Widespread Resistance

Unveils “Interference Solution;” Has Announced 14 Partners to Date

Since it emerged on the wireless scene more than a year ago, aspiring satellite-terrestrial LTE network developer LightSquared has been nothing if not high-profile. Today, following a firestorm of political mudslinging last week between LightSquared head Philip Falcone and opponents of the technology, the company announced a “simple, affordable solution to GPS interference issues.”

The company said that it has signed an agreement with Javad GNSS Inc. to develop a system to eliminate the interference with high-precision GPS devices, including those employed in the agriculture, surveying, construction and defense industries.

In a press release, LightSquared said, “Javad GNSS has completed the design, made prototypes and tested those prototypes. Preproduction units will be released for public tests in October, followed by mass production. High-precision receivers for positioning applications are expected to go to market by November 2011 and precision timing devices by March 2012.”

Sanjiv Ahuja, chairman and chief executive officer of LightSquared said, “I have said from the beginning that this interference issue will be resolved as soon as smart engineers like Javad Ashjaee put their minds to it. With this new system, Mr. Ashjaee makes another mark for himself as a cutting edge pioneer in the precision GPS industry, a field he has helped shape for more than 30 years.” He added, “This breakthrough is a final step toward LightSquared’s goal of building a nationwide wireless network that will bring lower prices and better service to Americans from coast to coast.”

Mmmm…maybe. But based on the widespread discord surrounding not only the potential for interference with critical aviation and Defense Department systems, but also allegations that the Obama administration and Julius Genachowski’s FCC attempted to unduly influence various military and independent officials with respect to their testimony before the House Armed Services Subcommittee last week, I question LightSquared’s ability to meet its launch schedule. The company has said that it will begin to offer commercial service in the second half of 2012.

The Daily Beast reported yesterday that “A second government official has come forward saying the White House tried to influence his testimony concerning a wireless broadband project backed by a Democratic donor that military officials fear might impair sensitive satellite navigation systems.

“Anthony Russo, director of the National Coordination Office for Space-Based Positioning, Navigation, and Timing, told The Daily Beast he rejected “guidance” from the White House’s Office of Management and Budget suggesting he tell Congress that the government’s concerns about the project by the firm LightSquared could be resolved in 90 days, a timetable favorable to the company’s plans. “They gave that to me and presumably the other witnesses,” Russo said. “There is one sentence I disagreed with, which said that I thought the testing could be resolved in 90 days. So I took it out.”

Russo added that he believed the necessary testing would take at least six months. Last week, the online news magazine had reported that four-star Air Force Gen. William Shelton, head of U.S. Space Command, had also been pressured to reword his testimony. Insinuations were made that political donations made by Falcone to the Democratic Party played a role in the fiasco (despite the fact that Falcone has donated even more money to Republicans than to Democrats). The failure of Chairman Genachowski (an Obama appointee) to appear at the House Armed Services Subcommittee hearing last Thursday also raised the ire of Republicans:

"I have the unfortunate responsibility to inform the subcommittee that FCC Chairman Genachowski refused to appear today," Ohio Representative Michael Turner (R) said, calling the no-show "symptomatic of a disregard by the Chairman to the consequences of the FCC's January 26 waiver to LightSquared." He added, "I consider the Chairman's failure to show up today to be an affront to the House Armed Services Committee."

Unless you’ve been in a cave for this entire year, you probably know that making the House Republicans in this Congress mad can be a risky affair…and I don’t think LightSquared is out of the woods, despite today’s optimistic announcement.

It is possible that LightSquared will overcome the technological and interference challenges that its L-band spectrum faces…it may even be likely. But in my opinion it’s not at all clear that this can happen, and that a system can then be deployed, in time for its announced “partners” to compete effectively in the marketplace. In fact, I would venture to say that most of the 14 listed below, including Sprint, were largely hedging their bets when they signed on the dotted line.

Nevertheless, as I wrote last week, I do think it's critical for ILECs to ensure they are working toward offering their customers the highest-speed broadband service that is feasible in their markets. Depending on how rural your service area is, signing on LightSquared's dotted line (as long as there's a 'Get Out of Jail Free' card) probably doesn't hurt, and who knows...it might even be useful some day.


Building for the Future: Gig.U's Investment in Gigabit Networks

Public/Private Partnerships Spur Ultra-High-Speed Internet

When Google (Nasdaq:GOOG) announced in early 2010 that it would build a 1 Gbps fiber network in one lucky city, the company received more than 1,100 applications. Eager citizens and local organizations made the case (sometimes in outlandish ways) for why their cities needed ultra-high-speed networks for their businesses, schools, hospitals, local government, and homes. Elise Kohn, program director for the University Community Next Generation Innovation Project (a.k.a. Gig.U), says the Google experiment demonstrated an unprecedented desire for ultra-high-speed networks across the country, making a strong case for why such networks are essential to U.S. growth. But, as a national investment, who would be willing to pay for such extensive infrastructure? And what sectors would make immediate use of such robust connectivity? According to Kohn and Blair Levin, who is heading up the Gig.U project, research universities and their surrounding communities will be the foundation of the gigabit revolution. These communities conduct top-notch research, scientific innovation, medical advances, and so on, which makes them a vital test-bed for ultra-high-speed capabilities. In short, research universities both “consume and create,” in Kohn's words, and will allow us to see what's possible in the future with a gigabit.

“We're not saying everyone in America needs a gig—that's why this is a targeted investment where there's highest demand and highest yield,” Kohn says. At research universities, innovation and development would benefit from faster broadband speeds and even allow new advances in science, engineering, and medicine—key fields for U.S. global competitiveness. “If you look internationally or at what's happening at research universities,” according to Kohn, “there are important reasons that, if you want to be ahead, [1 Gbps] is where it's going.” Not only would an ultra-high-speed network allow for smooth videoconferencing and webcasting, but the improved capabilities and data transfer rates would encourage the development of new applications, research opportunities, and learning tools. As just one example, Kohn cites current innovations in medical technology that, with advanced network capabilities, allow surgeons to practice on life-like 3D projections when training for open-heart surgery.

Kohn also highlights technologies already implemented at Case Western Reserve, a school that she calls “a great champion of Gig.U's plan.” Case Western is one of Gig.U's 30 members and last year set up a pilot program connecting a several-block area surrounding campus. The Case Connection Zone now provides 1 Gbps fiber-optic service to more than 100 homes and has been a test bed for what Gig.U plans to do across the nation. “A number of our members [universities] are very well connected on campus,” Kohn says, “so that's not necessarily where we need to fill a need. But staff, faculty, and researchers go home at night, students live off-campus... and the research and development—the advanced work that they're doing—continues there.”

These dynamic research communities can also attract new businesses to a town or city, according to Lev Gonick, chief information officer at Case Western. Gonick said that within three months of implementing Case Connection Zone, three startups moved to the neighborhood. “Gig.U members came together to address our unique connectivity gap. We intimately understand that for American research institutions to continue to provide leadership in areas important to U.S. competitiveness, we have to act to improve the market opportunity for upgrading the networks in our university communities. We believe a small amount of investment can yield big returns for the American economy and our society,” says Gonick.

Gig.U shares Gonick's national focus: its entire leadership team has direct experience with America's broadband needs (and gaps) from working in various capacities at the FCC. Levin served as director of the FCC's National Broadband Plan, where he asserted that broadband was essential to American growth and competitiveness and that ultra-high speeds would be key to cutting-edge research and development. Kohn says the National Broadband Plan also revealed that ultra-high-speed networks were not something the federal government would be able to invest in, at least in the short term. So early this year, Levin contacted CIOs at several universities to get the conversation going, and at the end of July Gig.U's project was announced publicly.

Gig.U's member universities come from nearly every region of the country—from the deserts of Arizona and New Mexico to the mountains of Colorado, and from the heartland states of Nebraska and Illinois, to coastal communities in Maine, Florida, and Hawaii. Most importantly, the research universities of Gig.U represent midsized communities which could potentially benefit from advanced connectivity, according to Kohn. “The universities in Gig.U have strong relationships with the communities around them,” Kohn says, “so we're allowing the universities to do the outreach to communities and surrounding areas [to explain the Gig.U initiative].”

Karl Kowalski, chief information technology officer for the University of Alaska System, says he thinks Gig.U's public/private partnership will bring value for the community surrounding University of Alaska. “While much has been done to connect the University of Alaska Fairbanks to major research networks,” he says, “our communities, our partners and our state could advance this research through innovative testbeds and community involvement if ultra-high speed networks were available to all.”

At West Virginia University, another of Gig.U's member universities, Chief Information Officer Rehan Khan says that the group is looking for proposals in order "to deploy networks not in decades but rather within the next several years." The school, along with Gig.U's other members, hopes that new networks will spur local economies and job opportunities in their regions. Jay Cole, the WVU chief of staff who initiated the University’s involvement in Gig.U, said, "It is the general population we are seeking to serve and encourage to use University innovation to create new jobs and improve the economy."

On Aug. 18, Gig.U issued a Request for Information in the form of an open letter, saying the group will “consider ways in which multiple Project communities can work together... to improve the private sector business case for next-generation networks.” Kohn says the group has sought input from a variety of communications providers—from national providers like AT&T (NYSE:T), Comcast (Nasdaq:CMCSA), Frontier (NYSE:FTR), Windstream (Nasdaq:WIN), and Verizon (NYSE:VZ), to regional providers like Blackfoot Telecommunications Group in Montana and Smithville Communications in Indiana. “We are doing direct outreach to them,” Kohn says, “and they are also coming to member universities and expressing interest. We've also talked with Google, Lucent (NYSE:ALU), Cisco (Nasdaq:CSCO), and anyone involved in the ecosystem. If providers in the vicinity of one of our members have an idea for how to meet the needs of that community, together, they should definitely respond. It's a learning exercise.” The Request for Information period will end in November.

It's still hard to tell what Gig.U will look like when implemented, but Kohn says much of that will depend on the specific needs and the network configuration of each member university and its community. The group is not seeking federal funding, however, and new network build-outs would be funded by Gig.U members as well as private-sector companies and non-profits that join the project.

When asked about the precariousness of a “build-it-and-they-will-come” approach, Kohn said that scenario isn't really a concern in Gig.U's case. “Research universities and the communities around them already have a history of development, and this really creates a cycle of opportunity.” Kohn says this is not unlike the progression to high-speed from dial-up, in the way that high-speed has become a new standard, while creating new applications and advancements. “The risk/return profile for a private company to help build out these networks is better because of the universities,” according to Kohn. “They're more tech-savvy communities. Give them access now and they'll understand what they can do, and with those advances, more and more will start to need it."


It's 2018: Where’s the PSTN?

Telecom Experts Argue PSTN Could be Dead by 2018

On June 29, 2011, the Technical Advisory Council (TAC) presented a newsworthy and controversial recommendation to the FCC: “The FCC should take steps to prepare for the inevitable transition from the PSTN,” and it should do so as fast as possible by establishing a specific end date. TAC referenced a National Center for Health Statistics report that determined that only 6% of the U.S. population will use the PSTN in 2018; therefore, 2018 seems like a reasonable year to put the PSTN to bed forever. There is no denying that landline PSTN customers are bailing at a rapid pace, but is 2018 too soon to expect 100% broadband and wireless adoption, such that no Americans are without at least one reliable communications connection?

Tom Evslin (a member of TAC, a telecom expert and author of the blog Fractals of Change) noted in a blog post last week that “People are making a free-market decision to abandon the PSTN for cellular or VoIP service.  People are chatting and texting and emailing and tweeting instead of talking.” Free market momentum aside, Evslin argues that the government needs to be involved in the transition so that people are not stranded without any form of communication, which raises special concerns for public safety. Regarding the phase-out, Evslin argues, “The date, in my opinion, should be the earliest possible time we can assure that alternatives to the PSTN are universally available, so long as we spend less public money in providing these alternatives than it would cost us to keep the PSTN alive past the date certain.”

Several of TAC’s recommendations have specific consequences for RLECs and rural Americans, because Universal Service is currently married to the PSTN. Although the FCC is hoping to reform USF to support broadband networks, we haven’t quite gotten to that point yet. TAC recommends that the FCC “change USF funding and spending to support universal coverage and other social goals;” and “assure that mobile and/or broadband replacements are available everywhere PSTN is currently provided. The need will be greatest in rural areas.” Although I agree that the PSTN is well on the road to dying a slow death, I feel that it might be a bit hasty to start looking at ways to expedite the death of the PSTN before the ink is dry on rules for reforming USF. Furthermore, I also think it is necessary to reform USF contributions, which currently come from PSTN services, before moving towards a PSTN-less nation.  

I think the real controversy comes in when deciding how to end funding for the PSTN in rural areas. Evslin asks, “Why continue to subsidize the most expensive and least effective way of keeping people in touch?” I feel that this question really gets at the core of the USF reform debate, as many believe that it has clearly become wasteful and inefficient for consumers to foot the bill for slow adopters to continue using landline phones. However, I think this issue needs to be looked at from the perspective of telecom providers who use the PSTN to provide DSL and other services in addition to telephone services. For many providers, telephone service is becoming the least important source of revenue and is basically just an add-on for broadband. Meanwhile, consumers can utilize the PSTN foundation for landline calls or for VoIP calls using Skype or other over-the-top applications, and they can tweet and e-mail and Facebook all they want. Many consumers also like the security and reliability that a landline provides, even if they don’t use it very often, and this is especially true for households that have poor wireless coverage. Evslin also asks, “What about leaving great grandma with no 911 and no way to call her daughters?” How will the FCC ensure that all the grandmas are willing and able to use wireless or VoIP before the PSTN is phased out? I know how much trouble it was teaching my own grandma how to use a cellular phone, so this seems like a fairly daunting challenge that must be addressed with great care and consideration for all types of consumers.

A blog post on Telecompetitor by Bernie Arnason also commented on the difficulty of defining the PSTN. He asks, “Are the fiber connections to the wireless towers which carry wireless traffic and eventually interconnect with the PSTN, part of the PSTN? Are copper local loops that provide DSL service no longer part of the PSTN?” I believe that these are definitely some of the most important questions—where do we draw the line between the PSTN that should be phased out and the PSTN that is an integral component of broadband and wireless communications networks? I wonder if there is really a point to ending the PSTN if USF subsidies are eventually going to be entirely for broadband anyway—if there is still a consumer demand for landline service, why not just continue to offer it at the company’s full expense? I personally liked Hargray Telephone Company’s Broadband Incentive Plan for USF because it allowed for ongoing landline cost recovery as long as there were landline customers, but increased the subsidies for broadband depending on the broadband speeds that customers subscribed to. So, a company could in theory only have 10 landline customers in 2018 and therefore only get support for those 10 customers based on their 2011 support level. Meanwhile, the real cost recovery would come from the 10, 20, 100 Mbps broadband customers, where the company would get the 2011 landline recovery amount times a weighting factor based on the speed.
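As I read it, a Hargray-style recovery formula could be sketched as follows. The per-line dollar amount and the speed-weighting schedule below are hypothetical placeholders; the actual plan would define them:

```python
def monthly_support(landline_customers, per_line_2011,
                    broadband_customers_by_mbps, speed_weights):
    """Hargray-style recovery: frozen 2011-level support for remaining
    landlines, plus speed-weighted support per broadband subscriber."""
    landline = landline_customers * per_line_2011
    broadband = sum(count * per_line_2011 * speed_weights[mbps]
                    for mbps, count in broadband_customers_by_mbps.items())
    return landline + broadband

# Hypothetical inputs: 10 remaining landlines at $30/line of 2011-level
# support, plus broadband subscribers weighted by speed tier.
weights = {10: 1.0, 20: 1.25, 100: 1.5}  # illustrative weighting schedule
support = monthly_support(10, 30.00, {10: 200, 20: 150, 100: 50}, weights)
```

The landline term shrinks as customers cut the cord, while the speed-weighted broadband term grows with subscribership, shifting recovery toward broadband without stranding remaining landline customers.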

The discussion on ending the PSTN is definitely in the early stages, so it is hard to tell if TAC’s exact recommendations will come to fruition or not. TAC also recommends updating the National Broadband Plan to include the PSTN phase-out, but I think that the USF reform rules should be published before this step can be considered more seriously, or the FCC may end up creating even more anxiety-inducing regulatory uncertainty. Meanwhile, it wouldn’t be a bad idea for RLECs to start thinking about 2018, and start estimating their cord-cutting rates for the next few years. It also might not be a bad idea for RLECs to start teaching the local grandmas how to use cell phones and Skype.

Learn more about TAC’s PSTN recommendations here. Tom Evslin’s blog post is available here, and Bernie Arnason’s is here.