Category Archives: Technology

Why hyperlocal won’t save newspapers (and what will)

Stack of newspapers

Newspapers are in a tight spot. Advertising revenues have been declining for 11 years straight, and classified ads have all but vanished in the face of Craigslist and eBay. The move online hasn’t been smooth for them, either. First, they gave away their product, hoping to make it up on volume with increased ad sales. That hasn’t exactly worked—for every $1 newspaper websites bring in, they have lost $25 in print advertising. The only bright spot is subscriptions, which have miraculously held steady. Many papers are trimming their publication schedules—most recently and notably the New Orleans Times-Picayune—leading many communities to fear the ultimate demise of important institutions.

Whenever a business or industry falls on hard times, people trip over themselves to propose turnaround plans. Newspapers are no exception, and I’ll be damned if I’m going to be left out of the fray.¹ My diagnosis? Too many newspapers have placed their bets on intensely local coverage, or hyperlocal as they call it in the biz. That’s a mistake. To remain profitable, they need to concentrate on a particular topic instead of a geographic region.

That epiphany occurred to me Christmas morning over a bowl of cereal at my in-laws’. I was flipping through the Houston Chronicle when I noticed the paper had branded its energy coverage as FuelFix. Not the best name, but it’s a sound idea. Houston is a major hub for the oil and gas industry, and Chronicle reporters have spent years, even decades, reporting on it. Who else would be so well positioned to cover the industry?

The Chronicle isn’t the first paper to experiment with trade-specific coverage. The New York Times has done the same thing with financial firms and DealBook, to great success. By providing consistent, nearly obsessive coverage of an industry, both papers attract new readers and new advertisers interested in reaching a targeted audience.

Those two data points made me realize that most papers have it all wrong, at least as far as profitability is concerned. Hyperlocal coverage will never be profitable enough. On a local level, there’s simply not enough news worth paying for. Try too hard and you end up with stories like this. Subscribers will never fill the void—there simply aren’t enough people willing to pay for local news, especially when they can get the basics on TV, for free.² Hyperlocal won’t attract enough advertisers, either. The local advertising pie just isn’t that big.

This is where trade publication sections like FuelFix and DealBook come in. Revenues from their higher ad rates (and, yes, maybe even new subscribers) can subsidize the rest of the paper. It’s a new twist on an old model. In the past, classifieds and legal notices kept the rest of the paper afloat. Today, trade sections can serve the same role.

New York and Houston aren’t the only cities with papers that could benefit from a trade publication model. The Detroit Free Press already closely covers the auto industry, but it could do more. The Milwaukee Journal Sentinel could dig deep into manufacturing. The Chicago Tribune might look at commodities or financial firms outside of New York. The San Jose Mercury News and the San Francisco Chronicle could step up their tech coverage, too. There’s a lot of competition in that sector already, but it is highly profitable. And if that fails, there’s still a chance for them in biotech. Not all papers can follow this model, but having some survivors is better than none.

With profits from the trade side, newspapers could continue covering the less profitable—but arguably more important—stories. It makes business sense, too. Without the rest of the newspaper, the trade section loses some of its credibility. It would be just another trade publication.

This plan isn’t problem-free. As in the past, advertising-editorial conflicts could scuttle the whole experiment. But unlike with other proposed business models, that devil is well known. Newspapers have long managed such conflicts by erecting firewalls between advertising and editorial. The same could be done with trade sections by separating the two newsrooms. Even better, papers could spin the trade sections off into wholly owned subsidiaries and let the profits flow back to the regular news side. It might be enough to let newspapers live to die another day.


  1. I have my reasons for wanting venerable papers to survive. For one, I worked as a science reporter for the Chicago Tribune in the summer of 2008. In my childhood, I spent many mornings and evenings reading the local papers. And as an adult, I’ve come to appreciate the importance of print journalism—there are simply some stories better told and better remembered in written form.
  2. I’d be surprised if the other option—providing deep insights on the news—would change the equation, at least on a local level.

Photo by jeffeaton.

How TED and The City 2.0 took the internet for a ride

TED Global 2012

Last year, TED made a lot of noise when it announced that it was awarding its TED Prize to something called “The City 2.0”. In case you don’t know what “The City 2.0” is, it’s an idea. At least that’s what TED was telling us. They were awarding the prize to an “idea” instead of a person, sort of like when Time Magazine goes all crazy and awards the Person of the Year to a machine.

Well, TED isn’t about machines, but it is all about ideas, so it gave its award to an idea, which was really like giving an award to itself, which as you’ll see in a bit is actually a more accurate statement than you’d think. Did I mention that this idea had a website? It does. One that TED designed and built themselves. Well, half-built. See, it wasn’t exactly ready when the TED Prize was announced, which is funny because TED both built the site and gave the award.

The humdinger behind the original The City 2.0¹ was that people could use the site to start grassroots campaigns to change their neighborhoods and cities. The idea was that “the reach of the cloud” and “the power of the crowd” would join forces and, from that totally awesome buzzword high-five, ten winners would emerge.

Wait, ten winners? Didn’t an idea already win the prize? Turns out that even though an idea can have a website, it can’t accept $100,000. Well, it could if TED, which came up with the idea for The City 2.0, had awarded the money to itself, but that would just look tacky. So instead they split up the $100,000 with the intent of awarding it to those ten winners who would bubble up from the cloudy crowd.² I’m not sure how the cloudy crowd picks the best ideas, but at the time I guessed it would have something to do with voting, as most cloudy crowds do.

This is the awkwardly placed paragraph where I say that I reached out to The City 2.0 for comment because I felt like I had to and because I actually did. They haven’t responded, not even a “no comment” or anything.

With a whole $10,000 on the line, I’m sure TED was expecting The City 2.0 to be a hotbed of entrepreneurial activity. It wasn’t. Two months after the prize was awarded, I checked back. The site was finished, but it didn’t seem like anybody else knew that. There were just 124 ideas, most of which had only one or two votes in favor of them. Less than two months remained until TED Global, the event where the ten winners were supposed to be announced.

To be fair, there were a few ideas that were winners, some of which would go on to be actual winners, but they still weren’t announced at TED Global. (How these winners were determined I do not know because the only thing on the site about them was a handful of lousy sentences.) Most, though, were definitely not good. Some were rehashes of city council debates:³

Livable Elgin Parkway

Eglin Pkwy in Ft Walton is a 7 lane high speed road. In order to foster development / creativity, this could be reduced to a 2 lane road. Through traffic should be redirected to a high speed location.

Others didn’t even bother mentioning cities or local issues:

Creating a Viable 3rd Party

Democrats=Republicans

And my favorite?

Make the public aware of City2.0 [sic]

In order for this to become a reality local residents need to be made aware of The Wish.

Umm, guys, you realize anybody can read this, right?

You know how you feel when you see someone do something embarrassing and you feel embarrassed for them? That’s how I felt for The City 2.0.

Well, the TED people aren’t idiots. They know a failed venture when they see one, so they quietly pulled the plug on “The City 2.0” and rolled out… “The City 2.0”. The, um, 2.0 version of the site⁴ scrapped the map and didn’t show people’s submissions, because, like, some people’s ideas are totally lame and we don’t want that messing up our snazzy website we dropped a lot of coin on. In its place were “stories” selected by TED along with random urbanist links. That’s, like, totally better.

Did you know that October 13 was TEDxCity2.0⁵ Day? Neither did I, and one of the supposed TEDxCity2.0 events was in Boston, just two miles from where I live. It had people talking about leadership and Mars and poetry and financial markets and communication and aerodynamics and mind reading, but it didn’t have anyone talking about cities. Huh.

To date, TED has awarded eight of its ten prizes, five of which went to projects that already existed before The City 2.0 got all spendy. That’s not to say the winners don’t deserve it—they’re great projects—but it’s not exactly the crowdsourced, spontaneous wonderland that TED led us to believe The City 2.0 would be. Two more are supposed to be awarded in November and December. It’s December already.

What upsets me about The City 2.0 isn’t that it was a half-baked idea with a sucky website that dozens of other organizations were already doing way better. Or that it was a transparent marketing ploy meant to draw more attention to TED than to the issue. Or that there was absolutely zero transparency about how winners were going to be selected despite pandering to the “crowd” and its infinite wisdom. No, what gets me is that so many people bought into it. ArchDaily ran such a breathless piece that I almost mistook it for a press release.⁶ The Atlantic Cities ran a laudatory article. Inhabitat, too.

Well, not everyone was buying it. The Next American City was doubtful—at first. They wrote a bit about how The City 2.0 was a major letdown, man. But later, they heard about the relaunch of The City 2.0 somehow, wrote a nice piece about it, and ended up winning one of the ten The City 2.0 awards.⁷ I also wrote a skeptical piece about The City 2.0 way back when, but I didn’t hear about the relaunch and didn’t write a nice piece about it and didn’t win an award.

I shouldn’t be surprised by all of this. In fact, I’m not. I know TED is a marketing machine and that its only real interest is making sure the TED name is everywhere. I mean, what else can you expect from a conference that feels the need to address its own elitism? But that doesn’t make me any less angry. Angry that TED exploited the rampant churnalism⁸ that’s so prevalent on the internet. Angry that so many people bought into TED’s hokey and transparently vain message. And angry that TED would so arrogantly presume to fix something as complex as the city without giving it any more thought than would a few fresh-faced marketing graduates.


  1. They obviously didn’t have any better ideas for a name. Oh, and all links in this article to the original version of the site are from a publicly viewable development server. It looks marginally worse than the actual site did at the time, but all the data stored on the dev server is legit.
  2. Apparently, TED has run out of famous wealthy people to give their prize to, so they settled on ten less famous and less wealthy people. Maybe because less wealthy people are more easily wowed by less money?
  3. Have you ever been to a city council meeting? People can get angry there. I once saw a guy punch his own dog! Not really, though. But I can totally imagine it happening.
  4. Someone should tell them how version numbers work.
  5. GASP! That’s a long acronym.
  6. Somewhere, a stenographer just got his wings.
  7. NOTE: Correlation ≠ causation. In case you didn’t know that because you haven’t taken statistics or don’t read Slashdot.
  8. Churn + journalism. Lots of news and yet somehow no actual news. Sort of like CNN.

Photo by Stefan Schäfer, Lich.

Related posts:

Can crowdsourcing save the city?

In Star Wars, cities are evil

Responsive urban design

Searching for the truth in Apple’s Maps

New iOS 6 Maps

Last week Apple released iOS 6. Along with the usual bevy of new features and refinements, the company made one change that’s drawn some ire. The built-in Maps app ditched Google as its data source, instead relying on information from Apple’s own servers. Apple has doubtless been preparing for the change for years, but that didn’t mean the transition was smooth. In fact, it was anything but.

I can attest to that from first-hand experience. I managed to snag one of the last iPhone 5s in Cambridge last Friday. To be sure, it’s a gorgeous device—for some reason the tall screen combined with its thin profile makes me feel like I’m holding an artifact of the future. But that also meant I was forced to run iOS 6 and use Apple’s own maps.¹ The software behind the new Maps app is slicker and snappier than before, as I predicted a few months ago, but no one ever doubted that Apple could write some killer software. The concern was over the data. Could Apple build a GIS as complex and detailed as Google’s in just a few years? In short, no.

But for all the inconvenience this will bring in the short term, I still think Apple’s move away from Google was the right call. For everyone involved, users included. The importance of Apple’s Maps isn’t that it snubs a competitor or that it now provides turn-by-turn directions. No, Apple’s in-house solution is significant because it serves as a cartographic foil to Google’s Maps.

Compared with the Google Maps of today, Apple’s shortcomings are obvious. The lack of transit directions is problematic,² many addresses are misplaced, and its routing algorithms need some tweaking. As a 1.0, though, Apple’s Maps compares favorably with Google Maps circa 2005. Back then, there was no street view, no hybrid view, no API for developers. But it’s not 2005, it’s 2012. If Google’s excellent maps had never existed, we’d be heralding our new cartographic savior. But years ago Google showed us how useful good maps can be. We’ve already seen the light.

And we love it. Google’s interface is slick and responsive. Their experience in search means queries return useful results more often than not. The company has spent years polishing its maps, adding insane amounts of detail, revealing secrets even we didn’t know about our own neighborhoods. As a result, Google Maps has become the de facto source for web maps.

But that dominance poses a problem. Too often, we assume that maps are pictographic snapshots of the world around us. But we forget that they paraphrase the real world. Maps abridge things here, gloss over details there, and otherwise simplify the world. Someone has to decide what gets left out. Whether it’s one person or an entire company making those decisions, the potential for manipulation—intended or not—is high. Everyone has their own biases, and Google’s is advertising. As a result, businesses can pay to alter the map. It’s a subtle reminder that maps aren’t purely factual—they’re facts filtered through a worldview.

That’s not to say Apple’s cartographers and programmers don’t have their own biases. But the added competition will keep Apple, Google, and others more honest. For now, it’s Apple that needs to catch up; their maps are clearly lacking. Yet there may be a time when Apple’s maps exceed Google’s in quality, forcing the search giant to reconsider how it gathers and displays information, including ads. There will never be a map that everyone considers 100 percent correct, but the more competition, the closer we’ll get. In the long run, that benefits everyone.


  1. Yes, I could have used Google’s web interface in a pinch, but it’s not as elegant and I wanted to put the new Maps app through its paces.
  2. I used transit heavily over the weekend, and I can tell you that third-party apps are no panacea. Apple’s going to need to incorporate transit data sooner rather than later. My guess is that, once they iron out the thousands of agreements with as many agencies, they will.

Photo from Apple.

Why you should be excited about vector-based maps in iOS 6

iOS vector-based maps

Apple announced today that it’s revamping the Maps application on iOS devices—iPhone, iPad, iPod touch—introducing a lot of showy new features like turn-by-turn directions and 3D flyovers. While those make for sexy commercials, they won’t be as impactful as the switch from raster- to vector-based map data. If you’re not sure why you should be excited about the change—and you should be—read on.

Web mapping has revolutionized cartography, but from a data perspective, it’s still stuck in the past. Google Maps, MapQuest, and others deliver map data as images. From a computer’s perspective, they’re just like photos snapped by digital cameras. They’re a series of pixels that combine to form an image. That means they’re well suited for web browsers, which are adept at displaying image files.

But that same strength is also a weakness. Image files can have either small file sizes or high levels of detail, not both. To compensate, web mapping applications load a series of tiles based on the extents of your view and how closely you’re zoomed in. Drag the map or zoom further, and they load more tiles to show more of the map or provide more detail. That trade-off has worked well in a broadband world, but it begins to show some cracks on mobile devices, which have more limited bandwidth.
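The tile scheme described above can be sketched in a few lines. This is a minimal illustration of the standard “slippy map” tiling convention used by raster web maps (not Google’s or MapQuest’s actual code): at zoom level z the world is cut into 2^z × 2^z Web Mercator tiles, and the client fetches only the tiles covering its current view.

```python
import math

def tile_for(lat_deg, lon_deg, zoom):
    """Return the (x, y) index of the map tile containing a lat/lon at a zoom level."""
    n = 2 ** zoom  # the world is n x n tiles at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    # Standard Web Mercator y-tile formula
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Zooming in one level quadruples the number of tiles covering the world,
# which is why every deeper zoom triggers a fresh round of tile downloads.
print(tile_for(42.36, -71.06, 15))  # the tile over Cambridge, MA at zoom 15
```

Each pan or zoom translates into new (x, y, z) tile requests, and on a cellular connection those image downloads add up quickly.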

Enter vector-based maps. They’re nothing new—geographic information systems have been storing and displaying vector data for decades—but their application in consumer mapping systems is. Vector data is stored as a series of lines and points instead of pixels. That means the only data stored is the data that matters. A vector street map, for example, will have data only on the streets’ paths, resulting in a smaller file. A raster map of the same area would also have data representing the blocks in between the streets, even if they are empty, creating a larger file.
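As a back-of-envelope illustration of that size difference (the constants here are invented for the sake of the example, not real map data), compare the bytes needed to store one street as a vector polyline against a raster tile covering the same area:

```python
# Back-of-envelope comparison with invented numbers: a 1 km street stored
# as vector coordinates vs. a raster tile image covering the same area.

VERTEX_BYTES = 16                 # two 8-byte floats (lat, lon) per vertex
vector_bytes = 20 * VERTEX_BYTES  # ~20 vertices captures a 1 km street's path

TILE_SIDE = 256                   # standard web-map tile: 256 x 256 pixels
BYTES_PER_PIXEL = 0.5             # rough figure for a compressed tile image
raster_bytes = int(TILE_SIDE * TILE_SIDE * BYTES_PER_PIXEL)

# The raster tile pays for every pixel, including the empty blocks between
# streets; the vector version stores only the geometry that matters.
print(vector_bytes, raster_bytes)  # 320 vs. 32768
```

The exact numbers depend on compression and encoding, but the shape of the comparison holds: rasters pay for the whole picture, vectors pay only for the features.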

Another benefit of vector data is that it is just as precise at the widest zoom as it is at the closest. That eliminates the need for loading more tiles as you zoom in or out. It also means the entire data set—all scales, all extents—can be lighter-weight than a comparably detailed raster data set. That means map data can be sent more quickly to your iPhone. And it’ll look better—no more jagged boundaries if you happen to slip in between the zoom levels predefined by Google.

Vector-based maps aren’t a panacea, though. They’re simply not an option when it comes to displaying satellite photos—raster data is the only real choice for that. Plus, vector maps tend to require more processing power to display than raster maps. As a result, vector maps weren’t feasible on mobile devices until recently.

Those hurdles have disappeared in the past few years, at least for apps running on mobile devices. Google rolled out a vector-based maps app for Android in December 2010, and now that Apple has joined the party, it’s clear vector maps are here to stay. That doesn’t mean raster maps will disappear—satellite photos are still best viewed that way, and desktop and laptop users will be stuck with raster street maps for a while longer because of shortcomings of today’s web browsers. But mobile users—who have the most to gain at this point—will see immediate, though maybe not immediately apparent, benefits.

Photo courtesy of Apple.

Population density fostered literacy, the Industrial Revolution

Class portrait, unknown English school (undated)

Without the Industrial Revolution, there would be no modern agriculture, no modern medicine, no climate change, no population boom. A rapid-fire series of inventions reshaped one economy after another, eventually affecting the lives of every person on the planet. But exactly how it all began is still the subject of intense debate among scholars. Three economists, Raouf Boucekkine, Dominique Peeters, and David de la Croix, think population density had something to do with it.

Their argument is relatively simple: The Industrial Revolution was fostered by a surge in literacy rates. Improvements in reading and writing were nurtured by the spread of schools. And the founding of schools was aided by rising population density.

Unlike violent revolutions where monarchs lost their heads, the Industrial Revolution had no specific powder keg. Though if you had to trace it to one event, James Hargreaves’ invention of the spinning jenny would be as good as any. Hargreaves, a weaver from Lancashire, England, devised a machine that allowed spinners to produce more and better yarn. Spinners loathed the contraption, fearing that they would be replaced by machines. But the cat was out of the bag, and subsequent inventions like the steam engine and better blast furnaces used in iron production would only hasten the pace of change.

The wave of ideas that drove the Industrial Revolution didn’t fall out of the ether. Literacy in England had been rising steadily since the 16th century; then, between the 1720s and 1740s, it skyrocketed. In just two decades, literacy rose from 58 percent to 70 percent among men and from 26 percent to 32 percent among women. The three economists combed through historical documents searching for an explanation and discovered a startling rise in school establishments starting in 1700 and extending through 1740. In just 40 years, 988 schools were founded in Britain, nearly as many as had been established in previous centuries.

School establishments in Great Britain before 1860

The reason behind the remarkable flurry of school establishments, the economists suspected, was a rise in population density in Great Britain. To test this theory, they developed a mathematical model that simulated how demographic, technological, and productivity changes influenced school establishments. The model’s most significant variable was population density, which the authors claim can explain at least one-third of the rise in literacy between 1530 and 1850. No other variable came close to explaining as much.

Logistically, it makes sense. Aside from cost, one of the big hurdles preventing children from attending school was proximity. The authors recount statistics and anecdotes from the report of the Schools Inquiry Commission of 1868, which said boys would travel an hour or more each way to get to school. One 11-year-old girl walked ten miles a day for her schooling.

Many people knew of the value of an education even in those days, but there were obvious limits to how far a person could travel to obtain one. Yet as population density on the island rose, headmasters could confidently establish more schools, knowing they could attract enough students to fill their classrooms. What those students learned not only prepared them for a rapidly changing economy, it also cultivated a society which valued knowledge and ideas. That did more than just help spark the Industrial Revolution—it gave Great Britain a decades-long head start.

Sources:

Boucekkine, R., de la Croix, D., & Peeters, D. (2007). Early literacy achievements, population density, and the transition to modern growth. Journal of the European Economic Association, 5(1), 183–226. DOI: 10.1162/JEEA.2007.5.1.183

Stephens, W. (1990). Literacy in England, Scotland, and Wales, 1500–1900. History of Education Quarterly, 30(4). DOI: 10.2307/368946

Related posts:

Hidden cost of sprawl: Getting to school

Hunter-gatherer populations show humans are hardwired for density

Do people follow trains, or do trains follow people? London’s Underground solves a riddle

Photo scanned by pellethepoet.

What do population density, lightning, and the phone company have in common?

Lightning strike in Tokyo

File this one under “applications of population density”. Researchers working for Nippon Telephone and Telegraph—better known as NTT—discovered they could use an area’s population density to predict telecommunications equipment failure due to lightning strikes.

Telecommunications is an expensive business. Like other infrastructure, it requires a lot of manpower and capital to expand and maintain. But unlike many other systems, telecommunications—especially cellular network technology—has been advancing at a breakneck pace, requiring equipment to be upgraded or replaced every few years to stay current. Furthermore, the equipment is both delicate and expensive. Damage from a single lightning strike can easily cost tens to hundreds of thousands of dollars to repair.

The NTT researchers were interested in predicting where lightning strikes would exact the most damage in coming years, especially since some climate models predict more severe weather, which can lead to more lightning. The study focused on three prefectures in Japan—Tokyo, Saitama, and Gunma—which represent a gradient of population density ranging from one of the most built-up urban environments to relatively sparse farmland. The prefectures also fall along a gradient of lightning intensity, with Gunma at the high end receiving 10 strikes per square kilometer and Tokyo at the low end receiving 3 strikes per square kilometer.

Using past data on lightning strikes, telecom equipment failures due to lightning strikes, and the 2005 Japanese census, they developed a model to describe how telecom equipment failures due to lightning correlate with population density. At first blush, I expected urban areas to receive the brunt of the impact—after all, they have loads more equipment than rural areas—but the results were just the opposite. Expensive circuitry and antennas were safer in urban Tokyo than they were in rural Gunma, even when the discrepancy in lightning strikes between the two regions was taken into account.

The authors offer two explanations for why telecom equipment is safer in urban areas. First, many of the copper lines that feed base stations and boxes run underground in cities, which lowers the induced voltage during a strike. Second, the equipment itself tends to be exposed to the elements in the country, either on the ground or perched atop telephone poles. In the city, most of it is encased in apartment buildings.

But there is another possible explanation they missed—the design of telecom networks and their relationship to population density. The evidence lies in their calculated coefficient describing how population density predicts equipment failures due to lightning strikes. The coefficient is ¾, and if you’ve been reading this blog for a while, you’ll no doubt recognize that number. As an exponent, ¾ is a powerful descriptor, explaining a variety of phenomena ranging from how plant size influences population density to how human population density affects the density of place names.

In this case, ¾ seems to say less about the pattern of lightning strikes than it does about telecom network design and the differences between rural and urban infrastructure. Denser populations require more equipment, but not at a fixed rate. Cellular networks provide a good example. In rural areas, cell sizes are limited by area, not the number of users. It’s the opposite in the city—the more users, the smaller cells become. Therefore, phone companies can rely on fewer cells and less equipment per person in the city than in the country.
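The scaling argument above can be sketched with a toy model. The constants here are invented for illustration (the paper doesn’t publish a formula in this form): if equipment needed per unit area grows as density raised to ¾, then equipment per person necessarily shrinks as density rises.

```python
# Toy model with invented constants: equipment needed per km^2 scales as
# population density raised to the 3/4 power, so equipment *per person*
# falls as density rises (per_person = k * density**(-1/4)).

def equipment_per_km2(density, k=1.0, exponent=0.75):
    """Units of telecom equipment required per square km at a given density."""
    return k * density ** exponent

for density in (10, 100, 1_000, 10_000):  # people per km^2, rural -> urban
    per_person = equipment_per_km2(density) / density
    print(f"{density:>6} people/km^2 -> {per_person:.4f} units/person")
```

Run it and the per-person figure drops steadily from countryside to city, which is the crux of the argument: denser areas need more equipment in total, but less of it per resident, leaving less hardware per person exposed to any given storm.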

The relationship between infrastructure demands and population density could go a long way toward explaining why there is a lower rate of equipment failure in denser areas—there’s simply less equipment per person in the city than in the country. But the fact that telecom infrastructure—and damage to it—appears to scale at the same power that describes a range of phenomena related to density and metabolism, well, that’s just too good to be a coincidence.

Sources:

Zhang, X., Sugiyama, A., & Kitabayashi, H. (2011). Estimating telecommunication equipment failures due to lightning surges by using population density. 2011 IEEE International Conference on Quality and Reliability (ICQR), 182–185. DOI: 10.1109/ICQR.2011.6031705

Photo by potarou.

Related posts:

Floral metabolic densities

Hunter-gatherer populations show humans are hardwired for density

The curious relationship between place names and population density