From where I stand, 2015 is going to be a big year.
Sure, I can tell you about wearables reaching a new level of maturity. Likely not Google Glass in its current form, but the next wave of devices that includes not only the Oculus (developed by a former journalism major) but also Samsung and others that convert smartphones into full-immersion mobile devices. Mainstream wearables will likely stay on your wrist, at least while out in public. But their user interfaces (hands-free, constantly listening, always collecting data, etc.) will be baked into all our current and future devices.
Or I can tell you about the type of content that is going to be produced on and for these new devices, including virtual reality and augmented reality. Two words: Magic Leap. This mystery company has gotten attention from Google and others in the form of millions of investment dollars. Cinematic VR — created through high-end cameras or strung-together GoPros — is going to be a new, completely different form of video storytelling. Point-and-shoot 3D modeling is coming fast too, eventually making Unity 3D the latest addition to a newsroom’s growing toolkit.
I can also tell you about the social media platforms that will continue to pop up with a mobile-first focus. While most are likely to disappear just as quickly, there will be a handful that stick around and grow. They’ll likely be centered around anonymous broadcasting, with unknown masses easily creating bite-size, disposable content. (For example, keep an eye on Yik Yak. Users have already begun to break news there faster than Twitter. If you’re near a college campus, check out how wildly adopted it already is…but also prepare yourself for the raw realness that gets posted.) Do yourself a favor and read this Medium piece from Kik’s CEO, who lays out the mobile future, all happening outside of Silicon Valley.
Maybe I can tell you about how diverse communities — gender, people of color, and age — will shift into content creator/driver/leadership roles because they feel empowered (or frustrated) enough to stop waiting for traditional outlets to reflect them. In fact, finally, this year organizations will realize the future is with those groups. (Okay, that one is more of a hope.)
But the truth is, my prediction is all these things finally coming together. (Note: I didn’t say “convergence.”) For me, 2015 is the year technology empowers content — whether creation or consumption. It’s not the year of technology — it’s the year of the content made on the technology. Meaning tools for creating apps, VR, AR, etc. are moving from developers to content creators. Think of the moment when blogs empowered non-coders to create new content on the web.
Yes, we’ll continue to develop and get new technologies, but 2015 is the year when content creators will learn that shovelware is a dead end and gimmicky content for pageviews is pointless. They will mature by embracing the true nature of these platforms. It’s not the technology, it’s the content.
This is something I see here in the City of Angels. All those “big things” I listed above are happening — and likely centered — in Los Angeles. This town is the content capital of the world. You know it for films and TV, as well as video games. But did you know it is also the place for viral web videos? YouTube, Vine, Instagram, etc. “stars” appear to be coming from here. Creative content people are frustrated with the industry and creating their content on their own terms. Sound familiar?
You’ve heard podcasts are big right now? This town has been doing it for years and it doesn’t sound like radio. Kevin Smith, Marc Maron, Chris Hardwick, Felicia Day and countless others have been creating podcasts for and of the web for years and years, gathering a following of millions. (Check out Kevin Smith talking about the creation and development process for his most recent film Tusk in this Nerdist interview.)
While Yik Yak is in Atlanta and Kik Messenger is in Waterloo, Ontario, Los Angeles is home to Snapchat, Tinder, and Whisper…truly mobile apps that have influenced the tech industries. It’s also home to Oculus (VR leaders) and Qualcomm (AR leaders).
The game has changed, and so has who is playing and creating it. Let’s hope journalism is ready for it.
Robert Hernandez is an assistant professor of professional practice at the Annenberg School for Communication and Journalism at the University of Southern California.
If you Google “integrated newsroom,” you’ll see loads of search results for articles, op-eds, presentations, academic papers, and even images of seating charts. (This one makes me chuckle a little.) Everyone has their own definition of what this buzzword-y phrase means (as is often the case with buzzwords), but even so, I’ll throw mine onto the pile: An “integrated newsroom” is one in which reporters, designers, developers (and more) work closely, regularly, and happily together to develop features, apps, visuals, and tools.
Media organizations are still trying to crack the code when it comes to building integrated newsrooms, and this will remain a major challenge in 2015. There are several newsrooms with visual/graphics teams that are stunningly good at their jobs. But even with a dream team of developers, designers, and reporters outputting a formidable corpus of apps and editorial features, people and roles are still getting siloed — sometimes even silenced — in every newsroom.
Vox Media, where I work as a developer, is still working towards building an integrated newsroom, but I think we’re coming close. As a relatively young company built from and firmly grounded in technology, there are of course many factors that make us slightly better positioned to do so. To compare our position to a large newspaper is unfair; they have to think about print alongside digital, and have larger newsrooms with all the bureaucratic trappings that come with such scale. Even so, I think there are a few lessons that can still be drawn from our experience and challenges in building an integrated newsroom culture at Vox Media.

How does a newsroom become integrated?
I can only speak from personal experience, so first I’ll lay out some disclaimers. I’ve only worked in one newsroom, and even at Vox Media, I’m not officially a developer dedicated to any editorial team. I don’t sit in the Vox.com or SB Nation editorial bullpen in D.C. We do have an editorial apps team that works across all seven of our verticals to produce special features, as well as a developer solely dedicated to building web apps for Vox.com. My day-to-day work is primarily in building out our branded and native advertising content and platforms. But I think that makes my point even more salient: Although I’m not officially a newsroom/editorial developer, I’ve still gotten a glimpse of what it’s like to work in such a collaborative and integrated environment.
So here’s my single, most important tip for building a truly integrated newsroom: Talk to one another. And maybe become BFFs.

Educate, empathize, and encourage.
That said, it’s also important to build a culture where developers and designers aren’t forced into a position to “serve” the needs of writers and reporters. This is tricky, because I think we often see the content creation process as being content first, with all the “production stuff” (visuals, interactives, etc.) coming last. But creating an integrated newsroom doesn’t mean establishing an effective and efficient way for developers to build stuff out for writers’ content. It means building a culture where developers and writers can work together, in parallel, to solve a common problem and tell a story together. Have respect for one another, get all parties involved early in the process, and you’ll have way more fun that way.
The last point is encouragement. We have many editors at Vox who consistently encourage those on the product team to contribute writing and content to the website. This signals to me, as a developer, that my ideas are being heard and that I have a real opportunity to contribute to the site, whether that’s through a written article or a web app.

Break down walls and empower people.
Not physical walls (although you could do that too!) — I mean the walls of communication, walls of process, walls of bureaucracy. These have their purposes, but when building an integrated newsroom culture, they can impede more than help. Process and good management are important and necessary for producing a thing, but in the idea-origination stage of writing and developing, they can hinder. Let chaos breed some creativity.
At Vox Media, one tool of communication we use to break down such walls is Slack, the increasingly popular internal chat tool. Slack allows you to create as many chatrooms as you’d like. We create public chatrooms for most things and encourage people to join any room they wish. I don’t like pointing to one single tool as a silver bullet (your culture needs to complement the tool), but I truly believe this is one of the reasons why our team has become as collaborative, creative, and integrated as it is currently. Conversations in Slack range from serious to completely lol-worthy (gifs, bots, etc.). What’s great is that good (even genius) ideas can spring from either type of conversation. You’d be surprised at how many jokes in Slack have evolved into real stories or features on our sites.
One recent example of such a project is the Serial podcast app we built. After some discussion about the latest episode in our cross-company Serial chatroom (yes, we have one), a few people mentioned it might be a great opportunity to build an app that could serve as a “guide” to all the people mentioned in the podcast. Developers, designers, and writers from Vox all worked together on wireframing, designing, building, and gathering content. Five days later, an app was born. Sure, most everyone who worked on the app worked on it outside of their normal work projects and assignments. But working with a “flex” team of people who had never worked together before, on a project we were all very passionate about, was not only a blast but also helped educate and reinforce the idea that we work in an environment where such rogue projects are totally acceptable and encouraged.
If someone has an idea, they should feel empowered to run with it. Having an editor sign off on an idea should be a five-minute chat, not a days-long process. Getting an app set up should be a few hours’ worth of a developer’s time, not a several-week process of giving that person the right credentials or authorization on your system. (At Vox Media, most developers know how to set up an editorial app and have permission to do so.) Breaking down these structural impediments helps us build cool things faster.

What are the challenges of building an integrated newsroom?
Building this type of integrated culture isn’t easy. First and foremost, input and output are often at odds with one another. A reporter may want one thing. The developer may understand better the technical constraints of that thing and know she won’t have time to build it out and meet deadline, so she may have another thing in mind. This challenge merely boils down to perspective, process, and communication. Building a great app or tool is a team effort, and everyone should feel involved early on.
Second, it turns out that deadlines are not super fun. Building out interactive apps and visualizations for breaking news is incredibly difficult, though some newsrooms are doing it well. To get around this, start with evergreen pitches first to get a feel for a collaborative process without the burden of meeting a looming deadline.
Lastly, scaling and sustaining this type of integrated culture is difficult. Our company has had success so far because the size of our team is relatively small compared to others. All members of editorial and product sit on the same floor in our D.C. office, allowing communication to be natural and easy. Aside from building an integrated newsroom culture, the challenge of scaling such a culture could be one of the most difficult we face in 2015 and in the coming years.
Alisha Ramos is a front-end designer at Vox Media.
Here is hoping 2015 will see:
— The Pulitzer Prizes are not given to the same annual one-day wonders.
— Stories on Jill Abramson morph into stories from Jill Abramson.
— More math — any math — in all those gushy profiles of new media startups and journalism’s saviors.
— That likely new owners of the Financial Times and The Economist value the journalism there as much as the trophy brands they are buying.
— That more newsrooms (and Wall Street) will shun third-party “partner” parasites.
— An acknowledgement that 20/20 is good as hindsight goes, but if 2020 is your target for major change, you are in deep trouble.
— The good people of Circa’s newsroom have a soft, good landing somewhere.
— A meaningful thaw in the glacial pace at which most mainstream newsrooms still get their journalism to their audiences.
— More Ken Doctor analysis and less mainstream media punditry.
— The realization that if you are now behind your readers’ habits (think social and mobile, for example), your news organization is not playing catchup but is actually going backward.
Raju Narisetti is senior vice president for strategy at News Corp.
The conversation around the disruption of the news is surrounded by fear. We pick apart the successes and failures of massive media experiments with little sensitivity for the humans behind the screen. But even as editors and owners clash and journalists and technologists are pitted against each other, a movement of collaboration and experimentation rises, growing stronger every day.
A growing contingent of pioneers has been working on the ground to build the new frontier of news. Many of them organize through the global grassroots movement of Hacks/Hackers, where I’m now the executive director. In just five years, it has grown into a community of 60,000 people in more than 80 locations around the world. From every avenue, people flock to this space where journalists and technologists work together outside their normal workplaces, in environments that encourage collaboration, community, and creativity.
Reflecting on technology’s disruption of journalism, Emily Bell recently declared: “No serious news organization can expect to have an audience or a future if it hasn’t already worked out its place in the digital ecosystem.” I’ll add my own call: the ecosystem of the journalism industry requires an awakening to a collaborative and supportive environment where we unify to better serve public information needs.
That movement is growing. There are communities actively embracing the unification of journalism and technology all around the world. (Are you part of one? Take this quick survey.) It will continue to expand as a space of inclusivity and cross-disciplinary experimentation where diversity is valued.
Not only will that integration become mainstream, but I predict that in 2015 we’ll see it flourish with an increase of creativity and imagination in how we solve public information needs. Here are three ways I see this happening:

People-first strategies
We’ll see more projects like Melody Kramer’s People Not In News Commenting On The News embark on active listening and communicating with audiences in new ways. Those conversations will inform innovation in reporting and design methodologies and continue to revolutionize the ways we keep communities informed.

Experiments in cross-disciplinary teams, including community, arts, research, and data
Projects like the documentary Broken City Poets from Youth Speaks and the Center for Investigative Reporting pair poetry and investigative reporting to share stories of the community in Stockton. In 2015, Hacks/Hackers launches its own creative experiment piloting Convergences, conjoining arts and community to collect, analyze, and tell stories with data. In this case, we’re looking at the changing skills of the modern journalist and the shape, scale, vitality, hybridity, and dynamism of the growing Hacks/Hackers network.

Multi-language tools and strategies will emerge
As we increasingly communicate with global audiences, discoveries will be made that leverage illustration, the arts, and software engineering to explore multi-language tools and strategies to help people share stories across wider communities and discover ways to open up access to information.
Jeanne Brooks is the executive director of Hacks/Hackers.
Over the past 22 years, 1,059 journalists have been killed. Worldwide, some 430 journalists are in exile from their home countries. Hundreds more are injured, persecuted, muzzled, and threatened, mostly by governments and sometimes by influential non-governmental forces, all interested in stifling a free, fearless press. Much of this happens in countries where autocratic regimes are the norm and press freedom is ignored.
Meanwhile, there’s a lot of lip service paid, often after the fact, to such threats. In December, for instance, the U.N. held one of its usual summits to talk about this problem, and all that it resulted in was a so-called communiqué. The Committee to Protect Journalists (CPJ), a very worthy nonprofit group, does offer emergency assistance through its Gene Roberts Fund for Emergency Assistance, which has helped dozens of journalists in distress. But CPJ does this in the larger context of needing its donor-raised resources for its broader mission of protecting and promoting press freedoms worldwide.
So my idea is simple: Create a Journalist Rescue Fund.
It will be a global effort that will offer journalists under threat an avenue to be temporarily placed in journalism schools at universities around the world, and in media organizations willing to provide a short-term professional home. Such placement will allow them to continue to practice journalism without fearing for their lives, or even to acquire new skills in a safe environment until a more permanent home is found back in their country or elsewhere. The host partners for this program will be universities and news organizations around the world.
The idea isn’t as far-fetched as it might seem at first blush. It is based on my personal exposure to the very successful Scholar Rescue Fund, which is managed by the New York-based Institute of International Education — a 93-year-old organization that also administers the Fulbright Fellowships, among other things, and focuses on opening access to U.S. education to young people from around the world.
For the past 10 years, the Scholar Rescue Fund has placed nearly 500 scholars, mostly professors from 48 countries, in universities in other nations so they could fearlessly continue their work. These professors were all considered threats in their home countries, including most recently Iran, Iraq, Syria, and Libya, and often targeted by those regimes. The Scholar Rescue Fund is entirely funded by donations and managed by IIE; it began with a small number of major visionary philanthropists drawing from their personal family experiences of being persecuted.
The Scholar Rescue Fund has done tremendously impactful work in a very non-controversial way, working with many of the countries where such scholars are at risk. The Fund has lined up universities in 40 countries, including dozens in the U.S., and scores of partners to help make Scholar Rescue a reality. There’s a lot of documented research on its success with some highlights here.
A Journalist Rescue Fund, modeled on the Scholar Rescue Fund, would be at the heart of deeply held beliefs about free markets and free societies, the value of a free press, and the growing role of technology in furthering journalism communities and conversations around the world — all of which ought to stand for free expression.
A well-endowed Journalist Rescue Fund, perhaps jointly administered by two globally reputable nonprofits — the Institute of International Education and the International Center for Journalists — and working in coordination with CPJ, would have a profound, practical, and immediate impact on preserving hard-won freedoms in many countries, and on ensuring a legacy that goes well beyond individual institutions.
I believe a significant initial commitment of around $5 million could provide a decade of initial funding, with the goal of then becoming self-sufficient through other fundraising efforts.
Any philanthropists out there willing to help jumpstart what will be an enduring, global legacy of protecting journalists?
Raju Narisetti is senior vice president for strategy at News Corp. He serves as a trustee of the Institute of International Education and as a board member of International Center for Journalists.
Photo of a Bahraini anti-government protester Sept. 12 by AP/Hasan Jamali.
If there’s a news outlet you would expect to be ahead of the curve in digital media, it might be Wired. The San Francisco-based magazine of technology has been at it longer than just about anyone; it launched HotWired.com back in 1994, with completely different content from the print magazine. Its creators, whose efforts were chronicled by Kyle Vanhemert on the occasion of the site’s 20th anniversary, were among the first to try and shape what a successful digital news business might look like.
But the path from here to there hasn’t been particularly straightforward. HotWired, or Wired Digital, was sold to Lycos in 1999. There, it became Wired News but was otherwise largely abandoned until Condé Nast, which already owned the print magazine, bought it in 2006. After that, the magazine’s digital presence gradually advanced, with the website continuing to be run as a separate entity with a separate staff working in a different part of the building.
In 2012, boy-wonder Scott Dadich became editor-in-chief of the magazine. The following year, Dadich hired the magazine’s first-ever director of product management, Hayley Nelson.
“There was no product organization when I got here. They didn’t really know what product meant,” says Nelson. “That’s the product manager’s legacy — you’re always evangelizing. ‘Here’s what I do! I’m at this unique intersection between tech and sales and edit. I try to triangulate and listen to what everyone wants to do and make it all go forward.’”
Nelson had worked for Wired before, back when the magazine had only been around for a couple of years. Eventually, her career — with stops at Wharton, Johns Hopkins, the Associated Press, AIG, and more — took her to The New York Times, where she worked in product management for seven years on projects like building out business and technology verticals and merging the International Herald Tribune with the Times. It was also there that she met Chris Jones, now vice president of product development at Condé Nast, who ultimately convinced Nelson to make the move back to Wired.
Understanding what product management really is can be a challenge for journalists:
If you missed it yesterday. Newsies, techies and that troublesome term "product." https://t.co/4s2ZCa30Xm An observation of mine.
— Jay Rosen (@jayrosen_nyu) December 7, 2014
Which is why, when I met with Nelson in Wired’s office in San Francisco, I asked her to explain the day-to-day experience of what a product manager actually does.

Breaking free of “the print-digital paradigm”
Nelson was brought on at Wired to facilitate the integration of the digital and print sides of the magazine, a job that has been both gradual and complicated. Her first task, though, was to build a team that could execute the project. It’s no small thing to hire developers and designers in the Bay Area when you’re a media company that can’t hope to compete with the salary offerings of tech companies.
“When I got the job, I sort of said, well, I need a project manager. And we need a web producer. And we need production people that report to the web producer. And let me look at the tech team — do we have the right skill sets? I don’t think so, we need four more people,” Nelson says. “So over two years, we’ve gone from four to eight and a change in leadership and brought in a lot of new talent.”
Retaining that talent is also a challenge, one which Nelson tries to incorporate into her management strategy. For example, Nelson promoted lead engineer Kathleen Vignos to software engineering manager. Since then, Vignos has played an increasingly important role in shaping the culture of Wired’s tech team, instituting magazine-wide demos of new products that Nelson says have made the editorial team increasingly supportive of and excited about the work being done by their developer peers.
One shift at Wired that was already underway when Nelson arrived was changes to the newsroom seating chart. Digital and print editorial teams are no longer siloed, but sit together, and all are overseen by Dadich as editor-in-chief. In addition, some of the bigger names on the print masthead — Mark Robinson, Adam Rogers — have been tapped to work on digital projects.
“We put the people that do the web production of the magazine and some of the front-end design together with the art people from the magazine, for example,” says Nelson. “They stopped having their daily meetings at the same time, so they could go to each other’s meetings.”
But as personnel issues have started to resolve, merging print and digital content has been trickier. Like all publications making this transition, Wired has to deal with the scheduling incompatibility between a print magazine published once a month and a website that publishes new stories every day.
Recently, certain writers have found new ways to work around these disparities. For example, senior staff writer Mat Honan (who just left Wired to run BuzzFeed’s San Francisco bureau) was among the first at Wired to experiment with publishing early reporting online, and then producing a cleaned-up version of the same story to be published in print later on.
“It’s a lot of hard work to juggle both, and to move at two very different paces,” he says. “I think we could do a better job with figuring out who is doing what and when, so people don’t get crushed with concurrent print and web deadlines. Calendaring remains very hard.”
Honan says that while he considers himself equally part of the web and print teams, editors still tend to think of themselves as having more of a distinction in terms of medium. For Nelson, that’s one of the few remaining distinctions she’d like to see erased.
“If we’re going to be organized by vertical on the digital side, maybe those editors oversee the vertical coverage across all the platforms. That would be a sort of an end goal,” says Nelson. “I think eventually we want to break free of this print/digital paradigm, and it’s just going to be about content, streams of content.”
For the time being, most of Wired’s print content is available online for free, but not all of it translates well to the web, which could eventually mean killing some print features.
Nelson says she’s been closely watching how The New Yorker has dealt with its paywall and integration. This summer, that magazine unveiled a much applauded redesign of its website. The overhaul included both technical and design updates, and brought with it a temporary lifting of the magazine’s paywall. The magazine used the free period as an opportunity to track reader behavior, mainly with the intent of measuring how many stories the average visitor reads in a month. This data was used to develop the rules for The New Yorker’s metered paywall, which went into effect in November.
These strategic developments have been widely perceived as a success for the magazine. Traffic to the homepage continued to grow the day the new paywall launched. Fast Company published a story headlined “How the New Yorker finally figured out the Internet.” In an interview in October, David Remnick credited in part “the people who own the joint” — Condé Nast — for the magazine’s continued business and editorial success.
“There’s product people in place at every [Condé Nast] title, so we do a lot of sharing — are you guys experimenting with this? are you using that tool? have you spoken to this vendor? There’s a lot of shared learnings across the titles,” says Nelson.
Indeed, it wouldn’t be surprising if Wired followed in The New Yorker’s footsteps, launching a metered paywall after its next redesign, which Nelson says is currently slated for completion sometime in the first half of 2015. But for Nelson, the redesign has been a distant goal in the face of much more immediate problems for the magazine.

Managing with agility
“When I got the job, Scott basically said: So, where do we start? I kind of looked at the website…and said, low hanging fruit? You need to refresh your mobile experience and redo your gallery experience, because it’s just abominable. Like, embarrassing,” she says. “The other big thing in Year 1 that we did is moving from 17 different blog installs to one, which is this massive back-end project that we called Pangea.”
Since the days of HotWired, the magazine’s digital identity had splintered into lots of different “sub-brands,” many of which lived in different templates on the Wired Digital backend. Migrating all of that content onto a single CMS — WordPress, in this case — was a major headache. But it also highlighted some particular issues with Wired’s brand as perceived by readers.
“We’re trying to narrow the focus a little bit to our six key categories. Some of that is just gut feel. Some of it we heard in user testing, that our sub-brands weren’t testing with readers; they didn’t have names that meant something specific to readers,” Nelson says. “I don’t think we have all the answers about that.”
For now, Nelson says she tackles problems by dividing them into categories of immediate, weekly, and long-term projects. Immediate issues are when the site breaks — bugs that need to be dealt with, general maintenance, messed-up code, etc. Projects on the weekly scale are discussed on Mondays; the team uses Trello software for project management. Every day at noon, there’s a standup meeting to discuss progress.
Nelson’s title puts her “firmly in technology” at Wired and, indeed, her management style is derived from agile management, a product of the software industry. But as a director of product management, her role is very much about being the touchpoint between all parts of the magazine. At The New York Times, she spent more time in the newsroom, which makes her comfortable interacting with Wired editorial, while her background in business (she has an MBA from Wharton) makes her at home when talking to consumer sales.
For Condé titles, sales is centralized in New York, but each property sells differently. Though there were struggles moving print inventory a few years ago, Nelson says Wired’s digital ad sales have always been at the front of the pack. “We have been the most successful with digital advertising at the company, partly because we’re more native to the space — tech advertisers were there first before fashion and beauty,” she says. Video ads, too, came to Wired’s advertisers before other Condé Nast brands like Vogue, Allure, or GQ.
It didn’t hurt that in 2013, Wired launched Amplifi, a studio that creates custom digital campaigns, videos, microsites, and events for brands. The magazine also offers ad units that can be tied to stories that are trending on social media, according to Nelson.
Another development in ad tech that’s influenced the magazine is viewability, a developing standard that says at least 50 percent of an ad needs to be visible for at least one second for it to be counted as an impression. What it will look like in practice: fewer ads in the right rail, and more ads interspersed throughout long stories that come into view as the reader scrolls. Expanding these efforts will factor into Nelson’s most ambitious effort at Wired so far — the redesign.

“The whole shebang”
As with many recent digital overhauls, the most important and ambitious thing about the new Wired site will be its responsive design. Unlike some leading websites, such as BuzzFeed, Wired still gets the majority of its traffic from desktop. Most audience growth, however, is coming from mobile, and the site has been tested extensively on mobile devices. “We can’t roll that out in chunks — it’s sort of the whole shebang,” says Nelson. “It’s a massive project.”
In addition to designing and building a site ready to be viewed on all platforms, Nelson says her team has also been working to make the new site more visually rich, taking advantage of the extensive photo resources Wired has so long been known for in print. The new Wired also aims to offer better content recommendation for readers.
“We don’t actually do user authentication on the site right now. It’s actually something that’s in our road map for next year,” Nelson says. “That’s what will power future personalization on the site.”
Tagging content will allow Nelson’s team to collect data on what users have and haven’t read, and what content they seek out. Incorporating this behavioral information into the backend of the new site will make it easier to build an experience that caters to readers’ individual needs. Nelson is especially interested in finding ways to segment readers on a more granular level than existing categories allow for. “I think there are a lot more buckets than the social user, the searcher, the loyal fan,” she says.
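As a rough illustration of what finer-grained bucketing might look like, the sketch below segments readers by the tags of the stories they have read. The tags, thresholds, and bucket names are invented for the example; they are not Wired’s actual taxonomy.

```python
from collections import Counter

def segment_reader(tag_history, min_reads=5, focus_share=0.6):
    """Assign a reader to a bucket based on reading behavior.

    tag_history: tags of the stories the reader has viewed.
    Readers with few reads are 'drive-by'; readers dominated by one
    tag are specialists in it; everyone else is a generalist.
    """
    if len(tag_history) < min_reads:
        return "drive-by"
    top_tag, top_count = Counter(tag_history).most_common(1)[0]
    if top_count / len(tag_history) >= focus_share:
        return f"{top_tag}-specialist"
    return "generalist"

print(segment_reader(["security"] * 8 + ["design"] * 2))  # security-specialist
print(segment_reader(["gear", "science", "design"]))      # drive-by
```

The same behavioral data could then drive recommendations: a “security-specialist” bucket, for instance, would surface security stories first rather than a one-size-fits-all front page.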
Design-wise, the Wired team is taking inspiration from across and beyond the media industry. Nelson says her team looked at Fast Company as well as non-journalistic sites like Google and Pinterest as potential models. Wired considers The Verge a top competitor in terms of rate of production, traffic, and video content, but Nelson says they’re not huge fans of the Vox Media site’s square-heavy, multicolored design. For a taste of what a sleek, reimagined Wired might look like, check out their “Space. Time. Dimension.” package, guest edited by Interstellar director Christopher Nolan. Readers can now sign up to beta test the redesigned website.
— Harrison Weber (@HarrisonWeber) December 11, 2014

The bubble and beyond
Nelson, her team, and everyone at Wired have a lot of work ahead of them before the magazine’s digital brand catches up with the competition out in Silicon Valley. After all, what some consider a tech journalism bubble — Pando, Recode, BuzzFeed San Francisco, Vice’s Motherboard, Medium’s Backchannel, The Information, etc. — has been swelling for a while now, and Wired has some catching up to do.
It’s unclear whether that will happen under Condé. The Awl published a story earlier this month in which the editors reported that Scott Dadich, the editor-in-chief, might be trying to buy the magazine.
It would require many millions of dollars, obviously, but that’s nothing for a few venture capitalists in these Golden Days of the Content Bubble, especially for the Valley’s longstanding Magazine of Record. (How would it make money for its investors? The Wired conference business has never been as glittering as All Things D or The Atlantic‘s, but that could always change. And the Wired Store is just one of its infinite #branding opportunities.) The fifty-million-dollarish question: Would Condé let it go?
For her part, Nelson says headquarters has been relatively helpful when it comes to Wired trying new things, like a different email vendor or content management system. “I’m very much into the test-and-learn — let’s try different things and see what we get out of them,” she says. “And I think, for the most part, corporate has been really supportive of that.”
But there’s no doubt that for someone managing a media tech team — in which the name of the game is rapid iteration and competitors like Vox have what seem like endless technical resources — reporting to a century-old magazine publishing company could be frustrating. Whatever the future of Wired’s ownership, Nelson has her work cut out for her.
Image by Richard Giles used under a Creative Commons license.
Patriarchy, it’s time. You’ve had a good run there, turning a blind eye to diversity on one hand and trumpeting meritocracy on the other, as though both those things together mean you’re so gosh-darn focused on quality that you just don’t see race, gender, or class. But the jig is up, and has been for a long time — the only difference is that in 2015 you’re going to have to do something about it, because it’s starting to be at best a headache and at worst a hit to your bottom line.
This year has brought issues of diversity and privilege to the forefront of the conversation — and with them, a lot of swift public outrage. That outrage, facilitated by social media, has, for the first time, begun to reliably reach the inner sanctum. And suddenly, that inner sanctum has had to address it — uncomfortably, contritely, irritably, faux-apologetically, or maybe, just maybe, with the first real glimmer of awareness that when times are changing, leaders get out in front of that change.
That time seems to be, well, now. Organizations are getting caught flat-footed with all-male boards (hi Twitter!) and excruciating diversity numbers (don’t be evil, Google!), and the ensuing public pushback has prompted public hand-wringing (getting there, Microsoft!) and actual progress (lean in, Facebook!).
There’s a clear business case for diversity (duh — you are an idiot if you think the best decisions are arrived at by a slate of clones), but on a more urgent level, there’s a PR case, and it’s that kind of pressure that accelerates the march of progress (well hello, GoDaddy). Organizations don’t like being subject to a barrage of angry tweets, and like it less when it balloons into boycotts and mainstream news stories (jury’s still out, Uber). Most of the time, the org braces for a few days and then the story passes, which is why we see lineups like The Wall Street Journal’s celebration of dudes in April and Web Summit in Dublin or, just this week, Business Insider’s all-white, all-male list of Startups to Watch in 2015. But those of us who keep an eye on such things are seeing incremental change (Gigaom’s 22 percent women at its conference is still a serious improvement) — and even better, we’re starting to see other people keep an eye on such things. (See The New York Times’ Margaret Sullivan ruminating over byline disparities. Of course that was in April, when The New York Times could boast a female top editor. A glass cliff sure can change a landscape!)
Ergo, quotas. Not out of generosity or an earnest commitment to changing the ratio — please, nothing hates change more than the status quo — but out of urgency. Organizations are realizing that achieving actual diversity takes effort and commitment, and can’t be waved away with an obligatory seminar and vague promises to do better. It comes down to making it a priority. And when something is a priority, it becomes someone’s job.
And lo! In November, I saw something that made my heart leap: Bloomberg went there. From Bloomberg editor-in-chief Matthew Winkler’s memo to staff: “All Bloomberg News enterprise work must include at least one woman’s voice, and preferably a balance of men and women. Women are engaged in every topic we cover. Our journalism should reflect that variety.”
This felt like a huge step — and one that will not only ease headaches for Busy And Impatient People In Charge but will also actually make that coverage better and smarter, because it will include a broader range of perspectives and thus stories, angles, and insights. So bravo, Bloomberg! I look forward to the spread of that ethos to the actual slate of Busy And Impatient People In Charge (Who No Doubt Got There On Merit). Baby steps.
I’ve been thumping this drum for years and years now, and I think we may have reached the tipping point where complacency cedes to proactivity and stubborn blinders are forced off by even more stubborn awareness. The word “diversity” may bring eyerolls (and if that’s you, mofo, check yourself because you’re perilously behind) but it’s also bringing headaches, and power doesn’t like headaches. (Hello from a headache! Happy to be here.) Upshot: The more someone like Satya Nadella takes the heat for bad diversity numbers, the more incentivized he will be to say to his managers: Fix this.
And guess what? “We tried!” is no longer an excuse. “We couldn’t find anyone qualified!” is no longer an excuse. “I asked my three white dude friends and they couldn’t think of anyone, but look, they suggested these other three white dudes, and oh, the merit!” is not only no longer an excuse but, honestly, just an embarrassment — so seriously, just keep that one to yourself. It’s not a question of “just checking a box” — because in 2014, we now know that the ranks of the under-represented and over-qualified are thick with great candidates. It’s a matter of making it a priority to find them, book them, list them, feature them, hire them, promote them, invest in them, cultivate them, and pay them.
The tide is turning. Hello, quotas! See you in 2015.
Something remarkable happened this year. Something I’ve been waiting for, for a long time.
News reports on the web finally started to look more and more like… well, web-native articles.
Not print articles online, not broadcast journalism online, but online journalism, online. I’m talking about journalism which isn’t just text: whether that means linking and embedding or mixing text with images, video or audio.
So what changed in 2014? Here are three factors I’ve noticed growing in influence over the last 12 months.

Twitter, Facebook and mobile: algorithms and audiences
Starting with perhaps the biggest influence of all: Twitter and Facebook love multimedia.
In the case of Facebook, having an image or video with your update can be the difference between thousands of shares and not being seen at all: it’s built into their algorithm.
Twitter is slightly different: having an image is not (yet) a factor in their algorithm, but it is the single biggest factor in whether a tweet gets retweeted or not.
But Twitter and Facebook are just the most visible aspects of the broader move towards mobile consumption of news – and with that, the rise of visual social networks.
Most UK news organisations in 2014 have seen the majority of traffic coming from mobile (that’s phones and tablets), while research this year revealed a fifth of readers only use mobile.
In this context, having visual cues – not just once, but regularly – is key to reporting a story online. The Mail Online have been setting the style for some time now, peppering their reports with images and embedded video, creating long pages which contribute to metrics on dwell time and engagement.
Other outlets are now starting to adopt similar approaches. The Mirror, for example, not only adds images and video, but entire galleries too:
Even The Independent, with fewer resources than its competitors, has been able to mix text with images and video – which takes us on to the next factor.

Skilling up: curation
Recognising that journalists should ‘do what they do best and link to the rest’, adding curation to the job description has normalised the idea that journalism online can include material produced by people outside of your news organisation.
Content management systems increasingly reflect this, too, with embedding of tweets, video and audio getting easier – for some at least.
Which takes us on to…

Availability: from YouTube to Vine
There are not enough multimedia journalists to produce multimedia journalism. But there is, now, enough material online for journalists to incorporate into their work.
Many public organisations now maintain Flickr and Instagram accounts which act as public photo libraries. Video is distributed and archived on institutions’ YouTube channels. And it’s all sent out on Twitter and Facebook. The press release, as one Fire Service press officer told me recently, is “90% dead”. Instead, the focus is on releasing its constituent parts.
And of course the spread of mobile phones and social media platforms hasn’t just had an impact on consumption: if Instagram does now have more users than Twitter it’s not because people are sharing text updates. Vine has lowered the bar to sharing video and shaped the reporting of the Ferguson protests along the way. And Snapchat added to the mix a year ago with its ‘stories’.
It has been over a decade since commentators began talking about online journalism’s linked, multimedia promise: its ability to go beyond text, and beyond the single article. But when we talked about that we perhaps always expected it would be the journalists making all the media.
So far, that’s the exception rather than the rule.
In comments sections of news articles and blog posts, it’s not uncommon to see someone quip “I went straight to the comments.” Many readers actively seek not just to read articles when they come to a website, but also to engage on hot-button issues in politics, entertainment, or sports. Engagement is a primary appeal of consuming news online. Vibrant comment sections are a key way of growing and maintaining a news website’s readership.
Over the past few years, some news websites, notably ESPN, have moved to using Facebook for comments. Due to Facebook’s real name policy, using Facebook for comments in this way links commenting on news articles to the rest of the commenters’ lives. This is the most common way for a website to impose a real name policy in its comments section.
2015 will see a return to discussion formats that permit individuals to create and maintain a profile separate from their primary social and professional profile. In a world where cyberharassment is pervasive, requiring use of one’s real name forces individuals to accept any consequences and shaming that would come from offensive speech in real life. But this norm-imposed limitation of offensive speech comes at a substantial cost.
Requiring the use of real names also stifles speech that is not offensive. A person might not want her coworkers or family members to know that she enjoys cosplay or has libertarian political views — not because those aspects of her life are shameful or because she would speak about those interests in offensive ways, but because, for whatever reason, she would prefer that not be part of her public image. Such a person would not engage in forums that have real name policies. In this way, news sources with real name comment policies limit readership.
Accepting a degree of anonymity is a prerequisite for allowing individuals to engage with the news online. Engagement is a key part of the news consumption experience. Anonymity enables more engagement. More engagement leads to more clicks and thus, more revenue for content providers. 2015 will see these forces come to a head, and begin a move towards fewer real name policies in website comment sections.
AUSTIN — In 2007, Texas regulators quietly relaxed the state’s long-term air pollution guideline for benzene, one of the world’s most toxic and thoroughly studied chemicals. The number they came up with, still in effect, was 40 percent weaker, or less health-protective, than the old one.
The decision by the Texas Commission on Environmental Quality (TCEQ) was a boon for oil refineries, petrochemical plants and other benzene-emitting facilities, because it allowed them to release more benzene into the air without triggering regulatory scrutiny. But it defied the trend of scientific research, which shows that even small amounts of benzene can cause leukemia. The American Petroleum Institute, lobbyist for some of the nation’s largest benzene producers, privately acknowledged as early as 1948 that the only "absolutely safe" dose was zero.
It’s "the most irresponsible action I've heard of in my life," said Jim Tarr, an air-quality consultant who worked for the TCEQ’s predecessor agency in the 1970s. "I certainly can't find another regulatory agency in the U.S. that's done that."
The benzene decision was part of Texas’ sweeping overhaul of its air pollution guidelines. An analysis by InsideClimate News shows that the TCEQ has loosened two-thirds of the protections for the 45 chemicals it has re-assessed since 2007, even though the state’s guidelines at the time were already among the nation’s weakest.
The changes are being supervised by TCEQ toxicologist Michael Honeycutt, who began updating the way Texas develops its guidelines in 2003, when he was promoted to division chief. A genial, bespectacled man who takes great pride in his work, Honeycutt is a trusted advisor to top TCEQ officials and often acts as the agency’s scientific spokesman. He is also a frequent critic of federal efforts to reduce air pollution.
Honeycutt's actions reflect Texas’s pro-industry approach to air quality, which InsideClimate News and the Center for Public Integrity have been examining for the past year and a half. Most of the air-quality guidelines the state’s oil and gas producers are supposed to meet are not legally enforceable regulations. That means violators are rarely punished, and residents who complain about foul air near drilling sites have few places to turn for help.
Texas has made its anti-regulatory stance known on the national front. Attorney General Greg Abbott, the state’s governor-elect, has taken legal action against the U.S. Environmental Protection Agency 19 times since 2010, arguing that overly restrictive regulations stifle business growth, cost jobs and threaten the state’s economy. The EPA is “a runaway federal agency that must be reined in,” Abbott said last year when he challenged greenhouse gas regulations.
Honeycutt has publicly criticized the EPA for being overzealous in its regulation of ozone, which exacerbates asthma; particulate matter, a known respiratory hazard; and hexavalent chromium, the cancer-causing chemical that launched the Erin Brockovich case. In testimony before a congressional committee in 2011, he said the EPA had been overly cautious in evaluating the toxicity of mercury, a powerful neurotoxin known to lower IQ. Mercury is particularly harmful to developing fetuses.
"EPA ignores the fact that Japanese eat 10 times more fish than Americans do and have higher levels of mercury in their blood, but have lower rates of coronary heart disease and high scores on their IQ tests," Honeycutt said in a letter responding to written questions from one of the committee members after the hearing.
State Rep. Lon Burnam, a Fort Worth Democrat who has tried for years to strengthen Texas public health regulations, said Honeycutt's role as chief toxicologist is more political than scientific.
"I consider him an apologist for the polluters," Burnam said. "I think he doesn't give a tinker's dam about public health."
Honeycutt said the toxicologists on his staff are good scientists who take their jobs seriously.
“Our friends and family live in this state, too,” Honeycutt said. “My son wants to go to school in Houston, and I want him to be just as protected as every other kid in Houston."
Scientists interviewed for this story agree that Texas needed to update the process it uses to set air quality guidelines. When Honeycutt took over, he introduced formalized methods of risk assessment, an interdisciplinary field of science that includes toxicology, epidemiology and biostatistics. Risk assessment has become the most widely used method of determining the health risks chemicals pose to the public.
But scientists say the process has inherent uncertainties that open the door to bias.
"This is done across the spectrum, not only from those more inclined to have higher permissible standards, but also by those that would like to have lower ones," said Maria Morandi, a private consultant who formerly worked as a health scientist at the University of Texas School of Public Health in Houston.
The problem, Morandi said, is that finding the scientifically "correct" exposure level for each of the thousands of chemicals industries release into the air is impossible because it would require exorbitantly expensive experiments, or illegal and unethical testing on humans. The best scientists can do, she said, is extrapolate data from existing studies and hope the numbers they produce are low enough to protect a majority of the population.
The potential for bias comes in when the risk assessment team chooses which studies to include or exclude, and how to weigh the available evidence. Some scientists lean toward the side of public health and believe many existing standards aren’t strong enough. Others tend to be more lenient, taking the view that overly protective standards place needless and expensive burdens on industry.
It's "about what questions you ask, what uncertainties you leave alone and which ones you decide to focus on," said Ruthann Rudel, director of research at the Silent Spring Institute, a research center in Massachusetts. Bias in risk assessment is rarely a product of fraudulent science, she said, but rather a reflection of how scientists choose to frame their analysis.
The InsideClimate News analysis shows that in Texas, the bias tilts toward industry.
As of September, nearly 60 percent of the new guidelines Honeycutt’s team derived for outdoor air quality are less protective than analogous numbers used by the EPA and by California, whose guidelines are among the strictest in the nation.
A year after its benzene announcement, the TCEQ released a new cancer risk assessment guideline for another high-profile chemical: 1,3-butadiene, which is produced by the synthetic rubber industry and can cause leukemia. Texas is responsible for the majority of the nation's butadiene emissions.
Ron Melnick, a former scientist at the National Institute of Environmental Health Sciences, analyzed the TCEQ’s 139-page description of its butadiene decision-making process for InsideClimate News. When Melnick compared the Texas approach with the EPA’s, he said Texas “dismissed anything which might have made the risk seem higher than what they wanted."
The TCEQ’s new butadiene number is 60 times less protective than the EPA’s and 340 times less protective than California’s.
Such glaring discrepancies are possible—and perfectly legal—because the federal government rarely sets legally enforceable air quality standards for the chemicals it has assessed. That leaves each state to come up with its own approach for each chemical, which means people in different states are protected to different levels. A chemical release that could trigger a public-health alert in California, for instance, might not even be noticed by Texas regulators.
"It’s confusing, because you cross the state boundary and the toxicity of the chemical changes,” said Loren Raun, a health scientist who works for the city of Houston and teaches at Rice University. “That, right there, is a problem."
Few Texans are aware that Honeycutt’s department is changing the state’s air-quality guidelines. Because they are not legally enforceable standards, the toxicology department can update them without public hearings or approval from top officials, according to former TCEQ Commissioner Larry Soward.
When the TCEQ released its benzene proposal in 2007, the only person who submitted a public comment was a representative of a chemical trade group, who urged the TCEQ to further weaken the guideline. The agency refused.
Soward, who was one of the TCEQ’s top three officials when the benzene guideline was changed, said he didn’t learn of the revision until InsideClimate News asked him about it in July. Soward left the agency in 2009 and spent several years working for Air Alliance Houston, an environmental group.
When Soward was appointed a TCEQ commissioner in 2003, he said, he often met with Honeycutt to discuss public health issues and thought the toxicologist "was a very scientific-based, impartial person." By 2005, however, Soward felt Honeycutt was advocating for "positions he felt like he was supposed to advocate” for, regardless of the science.
"I think he really believes…that air pollutants don't really have a health effect unless there's such a toxic exposure to them that it leads to direct problems," Soward said. "I used to joke I didn't think there was a toxic pollutant he didn't like."
Burnam, the state representative, blames the TCEQ’s governor-appointed commissioners for the agency’s pro-industry bent.
"For the past 20 years, you’ve either had oil industry [George W.] Bush or oil industry apologist [Rick] Perry making all the appointments,” said Burnam, who was defeated in the March Democratic primary and leaves office this month. “…The good [employees] at the lower levels are totally frustrated and hamstrung."
'I love this job'
Honeycutt’s 15-member division is one of the largest state toxicology departments in the country. In addition to setting air-quality guidelines, it reviews air and water monitoring data, advises emergency crews after chemical accidents and provides scientific expertise to agency officials.
"We have probably one of the best toxicology departments in the world," Commissioner Toby Baker said at a TCEQ hearing last year.
The division's size remained relatively steady even when the TCEQ’s operating budget dropped 39 percent from 2008 to 2013. Its stature rose in 2012, when an agency-wide reorganization put Honeycutt’s department directly under the office of the TCEQ’s executive director. The toxicology division now occupies a suite of offices and cubicles in a gleaming blue building in Austin, on the same floor as the executive director and the TCEQ’s three commissioners.
"I love this job," Honeycutt said in November, during an interview in his spacious office. "This is the job I went to school to learn how to do. I get to sit on the side of the table opposite everybody. One thing we’ve learned is, usually when everybody’s mad at you you’re probably doing your job right."
Honeycutt, 48, studied toxicology at the University of Northeast Louisiana at Monroe, 30 miles from his hometown. His high school yearbook reveals he graduated with honors and was a leader, or "beau," of the library club.
He stayed at Monroe to get his Ph.D. in toxicology. David Roane, who now chairs the pharmacology department at East Tennessee State University, advised Honeycutt on his dissertation about how earthworms dispose of the element cadmium.
“I trusted his work more than most people and found him to be conscientious in a small town kind of way,” Roane said. “He was a real wholesome guy.”
Carey Pope was teaching in the toxicology department when Honeycutt was a graduate student. The two men still occasionally run into each other. Pope describes Honeycutt as “the kind of guy who was always the first in line to help you.”
After graduation, Honeycutt worked three years as a researcher for the Army Corps of Engineers, where he focused on screening for contaminants in sediments and soils. He joined the TCEQ in 1996, when the agency was still known by its former name: the Texas Natural Resource Conservation Commission, or TNRCC. Critics called it "train wreck."
By the time Honeycutt was promoted to toxicology division chief, TNRCC had become TCEQ and the agency was under fire for the way it managed air quality guidelines. The problem was the haphazard way it set Effects Screening Levels, or ESLs, for thousands of chemicals.
ESLs are critical because the TCEQ uses them to draft the air permits it issues to oil and gas production sites, refineries, power plants and other industries. Companies must show that chemical concentrations at the boundaries of their facilities will meet the ESLs. If they don’t, the TCEQ can require them to adjust their operations.
Most chemicals have a short-term ESL (for hour-long exposures) and long-term ESL (for annual average concentrations). For example, the short-term ESL for benzene in 2003 was 25 parts per billion (ppb) of benzene in air. The long-term benzene ESL was 1.0 ppb.
When the Houston Chronicle reported in 2005 that the TCEQ’s ESLs were among the least protective in the country, Honeycutt told the newspaper his department was addressing the problem by changing the way ESLs are established.
The TCEQ hired a nonprofit consulting firm—Toxicology Excellence for Risk Assessment (TERA)—to convene a panel of outside scientists to review the new procedure. TERA was founded by Michael Dourson, a former EPA toxicologist and one of Honeycutt’s close friends. TERA often works for industry and runs a database that has raised the profile of industry-funded risk-assessment values.
Morandi, the consultant, sat on the TERA review panel and said she was comfortable with the TCEQ document.
But she said what also matters is how the protocol is applied to individual chemicals.
The TCEQ documents its risk assessments in long, complex reports that are posted online for public comment. Honeycutt said he has tried to encourage more feedback by extending the comment period from 60 to 90 days. But few people outside industry have the time and expertise to understand or critique the highly technical documents.
The Texas environmental community tends to rely on a single expert—Elena Craft of the Environmental Defense Fund—to weigh in on risk assessment science. Air Alliance Houston Executive Director Adrian Shelley said he often turns to Craft for help on these issues.
Of the 56 comments that have been filed for the 45 chemicals the TCEQ has assessed, only one came from the environmental community. About 80 percent of the comments came from industry groups, including the American Chemistry Council and ExxonMobil.
Adam Finkel, executive director of the University of Pennsylvania's Penn Program on Regulation and a former director of health standards programs for the Occupational Safety and Health Administration, said environmental groups could help level the playing field by hiring more scientists who understand risk assessment.
Some environmental organizations have multi-million dollar budgets, he said, but they're focused on other issues.
'This is crazy'
While the TCEQ was developing its risk-assessment strategy, air pollution was making waves in the Texas press. In January 2005, a TCEQ report linked 1,3-butadiene and benzene to elevated cancer risks in Harris County. The county is home to Houston and many refineries and petrochemical plants that emit both chemicals.
The butadiene levels corresponded to two additional cancer cases per 10,000 people—20 times what the TCEQ considered acceptable at the time. Benzene levels were seven times higher than the TCEQ’s benchmark cancer risk.
That same month, the Houston Chronicle published "In Harm’s Way," a series by reporter Dina Cappiello. The newspaper had placed air monitors at 100 locations near large industrial sources and found 84 readings “high enough that they would trigger a full-scale federal investigation if these communities were hazardous waste sites."
Only a few measurements exceeded the TCEQ's cancer exposure guidelines, which the paper reported were "among the most lenient in the country." The Chronicle noted that the results "would be considered a serious health risk in other states."
The two reports hit a nerve with Bill White, a year into his first term as Houston’s mayor. A deputy secretary of energy during the Clinton administration, White made air quality a priority during his three terms as mayor. But he found himself fighting the TCEQ as well as the industries that were polluting his city.
A policy analysis article by Texas academics summed up the situation:
"The problem in Houston has been compounded by the reluctance of state and regional regulators to assume a strong role in pollution control and environmental enforcement, particularly concerning the chemical and refining industry, which is a key source of jobs and philanthropy in the region."
The TCEQ increased air monitoring in Harris County, but Houston wanted more concrete action. Honeycutt met frequently with Elena Marks, White’s director of health and environmental policy from 2004 to 2009.
Marks is now a fellow at Rice University, researching health care policy. She said she often came away from those meetings frustrated, because Honeycutt “always seemed to err on the side against human health.”
When city and county officials hosted a town hall meeting to discuss the alarming reports, the TCEQ didn’t show up, despite its pledge to send at least two representatives. Honeycutt later criticized the TCEQ’s own report, saying it was "overpredictive" about the cancer risks.
When Houston threatened to sue Texas Petrochemicals, the main culprit behind the elevated butadiene levels, Marks said the TCEQ got "pissed off" and worked out a pollution-reduction plan with the company. But the agreement was voluntary, and Houston continued to threaten legal action. Texas Petrochemicals finally reached a legally binding agreement with the city to reduce its emissions, and butadiene levels began to drop.
To tackle the benzene problem, White tried to persuade local businesses and the TCEQ to work together on a regional benzene reduction plan, but he said the TCEQ wasn’t interested.
Benzene levels in Houston did begin to fall. But White, now senior advisor and chairman of the financial firm Lazard Houston, attributes the change to the city's aggressive leadership, which "created a tremendous incentive for compliance and put pressure on the TCEQ."
Marks put it more bluntly. “Every time we found benzene emissions…we were just a pain in the ass—and the plants thought it was just easier to curb benzene.”
When asked to comment on the TCEQ's role in Houston during those years, agency spokesman Terry Clawson said in an email: "The TCEQ works in partnership with local governmental [entities] to address environmental issues within their communities."
In 2007, as Houston was still struggling to remove benzene from its air, Honeycutt’s department weakened the long-term benzene guideline by 40 percent, from 1.0 ppb to 1.4 ppb.
The new number was 13 times weaker than California’s guideline. It was at the least-protective end of the range recommended by the EPA, which last updated its benzene numbers in 1998.
Marks remembers her shock when she learned of the change.
"My reaction was 'This is crazy. Why would you do that?'" she said. "The more you learn, the more likely you’d be to tighten any standards or screening levels."
An examination of the TCEQ’s decision on butadiene shows how its conclusions could differ so sharply from the EPA’s.
The EPA’s analysis, done in 2002, relied primarily on an industry-funded University of Alabama at Birmingham study from the 1990s that tracked leukemia rates in workers.
The TCEQ’s analysis used a 2004 study by the same researchers, also funded by the industry. They said their original study had vastly underestimated the amount of butadiene the workers were exposed to, which meant it had overestimated the risk.
Melnick, the former NIEHS scientist who analyzed the TCEQ's butadiene document, said it’s hard to tell which of the two University of Alabama studies is more accurate—but the discrepancies show the "murky" history of the reports.
Because the TCEQ used the second study as its starting point, it began its analysis with numbers that showed butadiene was less toxic, Melnick said. It then made a series of further decisions that made the number even less conservative, including using a different statistical model and not adopting some uncertainty factors used by the EPA.
Melnick said it’s impossible to say the Texas number is wrong. But it’s clear that "Texas tried to load it up to allow the highest exposure possible."
'The things they didn't like'
Texas has invested time and money to oppose two federal efforts that could lead to tighter chemical regulations.
Its first effort was to address a 2009 National Academy of Sciences risk assessment report authored by Finkel, the Penn professor, and 14 other scientists from academia, government and consulting firms. Among other things, the report recommended that scientists reconsider the long-held assumption that any chemical not known to cause cancer has a safety threshold—a level below which it is completely safe. If adopted by risk assessors, the recommendation could lead to additional regulations.
The following year, the TCEQ helped lead a series of workshops to discuss the National Academies’ report. They were sponsored by the Alliance for Risk Assessment, a spinoff of TERA—the consulting firm founded by Honeycutt's friend Michael Dourson. Both Dourson and Honeycutt sit on the alliance steering committee. Dourson said the TCEQ came up with the idea for the workshops.
The TCEQ has awarded TERA at least $700,000 in contracts since 2010, with $7,000 going to the alliance to help fund the workshops. Honeycutt said that to avoid conflicts of interest he recuses himself whenever the TCEQ proposes a project to the alliance.
Honeycutt chaired the first workshop, which was held at TCEQ headquarters. Commission Chairman Bryan Shaw gave the opening speech. The agency has hosted three of the eight workshops that have taken place so far. More than 50 groups from industry, government, consulting and research centers support the workshops, according to the alliance website.
Honeycutt and Dourson say the workshops are designed to expand upon the National Academies’ report and foster collaborations to develop practical risk assessment methods. But Finkel and two other health scientists who work in risk assessment say the main focus was to criticize the report, especially the part about the non-carcinogen thresholds.
"They were essentially formed to respond to that report and the things they didn't like," said Tracey Woodruff, a professor at the University of California, San Francisco, who studies reproductive health and the environment.
Finkel said the workshops were so biased toward industry's point of view that he stopped attending them.
'He is our expert'
The TCEQ has also consistently opposed the EPA’s handling of ozone, one of six compounds with federal air standards. Ozone is created primarily by fossil fuel emissions and is known to exacerbate respiratory and cardiovascular disease. Exhaustive reviews by EPA scientists and independent agency advisors have urged that the federal standard of 75 parts per billion be lowered.
In November, the EPA proposed a new standard of 65-70 ppb, which the agency predicted would prevent thousands of premature deaths and asthma-related emergency room visits each year.
Just minutes after the EPA’s announcement, the TCEQ issued a press release in which chairman Shaw described the decision as "shortsighted."
A lowered standard would create serious problems for Texas’ three largest cities—Houston, San Antonio and Dallas—which are out of compliance with the current standard.
Honeycutt has criticized the EPA’s ozone science at public hearings, in comments submitted to EPA’s ozone panel, in presentations at scientific conferences and in his own scientific analyses posted on the TCEQ’s website.
In the TCEQ’s October newsletter, he said his agency’s "in-depth review" of the EPA’s scientific analysis found that "further lowering of the ozone standard will fail to provide any measurable increase in human health protection."
"The fervor with which they’ve been critical of the ozone standard…is unprecedented," said Craft, the Environmental Defense Fund scientist. "I can’t think of another state where they’ve spent the amount of time and resources on this issue as Texas."
Last year, the TCEQ paid a Massachusetts-based consulting firm, Gradient, $1.65 million to examine the science behind EPA's air quality standards, which include ozone. Clawson, the TCEQ spokesman, recently told the Texas Tribune that the agency is developing a separate Gradient contract to "provide a comprehensive review" of the science "addressing potential impact of ozone on asthma."
One of Honeycutt’s main objections to EPA science is that it’s based on an eight-hour ozone exposure. He thinks the standard should be weakened because people are rarely outside for that long.
The problem with that reasoning, Craft said, is that some people, including construction workers, do spend most of their day outside. "And what if someone wanted to stay outside all day? I think most people want the option of being able to go outside and feeling like you're breathing air that is healthy."
Another of Honeycutt’s arguments relies on a 2009 study that projects a few dozen more deaths in Houston if the ozone standard is tightened. Those results were based on the assumption that Houston would use a particular cleanup strategy that targets only one class of ozone-forming chemicals, Craft said. The TCEQ can avoid the problem by choosing a different plan, she said.
Robert Haley, an epidemiologist at the University of Texas Southwestern Medical Center, grappled with the TCEQ’s position on ozone last year when he was lobbying for the shutdown of three coal-fired power plants that contribute to the ozone problem in Dallas. Haley is a member of the Dallas County Medical Society, which has petitioned for the closures.
Haley spoke with each of the TCEQ commissioners in back-to-back meetings and said Honeycutt sat in on all of them. The commissioners "all deferred to him, [saying] 'He is our expert.'…
"They consult him on everything."
Not long after those meetings, the TCEQ denied the petition.
"It does no one any good to go and require reducing ozone if we're not having a beneficial impact," Shaw said during the petition hearing. "And there's data…that suggests that [reducing] ozone may not be giving us that benefit."
'Does not necessarily indicate a problem'
The TCEQ’s critics say the agency’s industry-friendly ESLs are just part of the air-quality problem in Texas. The bigger problem, they say, is that violations of the ESLs don’t necessarily trigger regulatory action.
The Texas guidelines are "just a number that they picked, and they said that when the air pollution monitors hit that number, then they would investigate further," said Marks, the former Houston environmental director. "There was no actual consequence to finding the air quality was above that particular number."
It's hard to tell when or how the TCEQ enforces the ESLs.
The agency’s toxicology website says if airborne concentrations "exceed the screening levels [ESLs], it does not necessarily indicate a problem but rather triggers a review in more depth."
The website also says ESLs are only used to screen companies that apply for air permits, and should not be used to gauge outdoor air quality. The TCEQ has a separate set of health-based guidelines to evaluate air-monitoring data, and those numbers can be up to three times less protective than the ESLs.
But even when these more lenient numbers are exceeded, the TCEQ doesn’t necessarily see a health risk.
When InsideClimate News asked the TCEQ what happens when air-monitoring data exceed guidelines, spokeswoman Andrea Morrow said the data are examined on a case-by-case basis. For example, she said, if an air monitor showed 5,000 ppb of a chemical whose TCEQ guideline was 1,000 ppb, the toxicology department would say "there is a potential for adverse health effects."
In other words, even a number that’s five times the TCEQ guideline doesn’t automatically trigger enforcement action.
InsideClimate News then asked if the agency had ever penalized or shut down a facility for violating ESLs or air monitoring guidelines. Clawson, the spokesman, said the TCEQ "does not collect and track information on enforcement actions” in a way that would enable him to answer that question. To get that information, he said, it would be necessary to examine individual investigation reports.
When Honeycutt was asked if he thought the ESLs should be turned into legally enforceable standards, he said that decision rests with the state legislature. A 2007 House bill that aimed to do that never made it out of committee.
Honeycutt defended the way his agency reacts when guidelines are exceeded. He said the TCEQ handles the problem by putting neighborhoods or regions with elevated chemical levels on an Air Pollutant Watch List. The agency then dedicates more resources to improving air quality in those areas, perhaps by investigating local industries or doing additional air monitoring.
"We don't just note [the problem] and go on with our lives," Honeycutt said. "We do do something about it."
But being added to the list doesn't guarantee a speedy solution.
In 1998, a neighborhood in Corpus Christi was placed on the list because annual benzene concentrations exceeded the TCEQ's 1.0 ppb guideline. The neighborhood was removed from the list 12 years later, in 2010—not because the annual benzene average had dropped below that level, but because the agency had weakened the guideline to 1.4 ppb.
The agency’s website cited the new guideline as the reason for the delisting.
My book celebrates African innovation and doing more with less. It’s in that spirit that I offer two related predictions for news in 2015: First: A push to target global audiences. Second: Product lessons from “mobile-first” markets.
My first prediction doubles as advice for publishers. Growth is an obsession for all new media companies, and in its simplest form, growth means “new.” Leave aside your millennial acquisition strategies, product tie-ins, and paid marketing campaigns — the newest digital news consumers are in emerging markets.
Global expansion as business strategy is gaining traction: Quartz and The Huffington Post both launched India properties in 2014; Politico is headed to Europe. BuzzFeed is hiring reporters in Nairobi, Lagos, and São Paulo. The New York Times has deputized editors to focus on international audience development and, after 125 years, The Wall Street Journal is committing resources to African bureaus. The wire services have long maintained outposts around the globe, but investment in foreign reporting for foreign audiences is the next step for western media.
I predict that 2015 will see additional expansion into the largest emerging markets where English is spoken — India, Nigeria, South Africa — and eventually, those where it’s not: Brazil, Mexico, Turkey, Indonesia. For years, North American media salivated over China’s huge population denominator, despite the controlled information economy. It’s smoother sailing in these other countries, with millions of eyeballs, significant print traditions, and an accelerating digital culture. Add to the mix the access-to-knowledge investments from large technology companies like Facebook and Google and you have a massive expansion of the addressable market.
Transposing content and strategy to new markets requires nuance. As a onetime foreign reporter in Kenya, I know the pitfalls of the lone correspondent model. It’s not clear whether global expansion should target western audiences (with famously low tolerance for “passport stories”) or local ones — nor whether western entrants should aim to complement the existing news ecosystem or serve as outright substitutes. Advertising markets also vary in sophistication and saturation. But global expansion leverages the simultaneity that defines modern media. And strong reporting from truly independent media could transform societies where the accuracy (much less freedom) of the press is in doubt.
What’s assured about the entry into these new markets is that the next billion news consumers will be mobile. With an array of devices on which to consume news, ranging from tablets to televisions, wealthy OECD countries have been “pulled” into mobile. Emerging media markets, by contrast, are born mobile first. Four billion people are still offline — but I’ve seen firsthand how data and hardware are becoming cheaper and more abundant. Telecoms and handset manufacturers are planning for a smartphone revolution — it’s only sensible for the media industry to do the same.
As a result of both trends, I predict the character of news presentation is going to converge — on developing, not developed market terms. We’ve seen carrier-independent chat and mobile payments explode on mobile in emerging markets and then make their way to the wealthy world. For news publishers, the product and content lessons learned from serving emerging markets could prove invaluable in the quest to build a mobile experience that does more with less.
Dayo Olopade is author of The Bright Continent: Breaking Rules and Making Change in Modern Africa and a Knight Law and Media Scholar at Yale.