On The Future Of Feature Writing


Multi-chapter, multimedia, highly immersive digital features from publishers seem to be becoming quite the thing. The latest is the rather lovely ESPN Grantland story of the Iditarod Trail Sled Dog Race, which follows on from their similar feature on The Long, Strange Trip of Dock Ellis. Then there was the beautiful Pitchfork cover story on Natasha Khan which wonderfully integrated text, imagery and audio.


And this example from Outside magazine, which graphically tells the story of an early ascent of Everest, is notable for incorporating third-party commercial sponsorship (albeit in a pretty clunky way).

The best-known example is the New York Times' Snowfall, of course, which integrated video, images and graphics in a seamless, flowing narrative. And as The Atlantic described it, did it in a way "that makes multimedia feel natural and useful, not just tacked on".

Inevitably, some have lauded Snowfall and its ilk as the 'future of journalism', but others have pointed out (mostly on Twitter at the time) just how time- and resource-intensive such projects can be. How can Snowfall be the future of journalism, they say, when it took six months and the involvement of 16 people to produce?

I think that kind of misses the point. To paraphrase The Atlantic, Snowfall (and others like it) are not the future of journalism, but that's OK. Snowfall recently won the Pulitzer for feature writing. This kind of reader experience, where one element flows seamlessly into the next, is uniquely suited to longform feature content (and to creating immersive feature experiences on tablet devices), and I'd argue far less suited to punchier, news-driven content. Snowfall was produced as the result of an off-and-on project and an almost documentary-style approach, outside the realm of what was possible with the normal CMS.

As the skills of journalists develop, as the shape of the skillset in media organisations expands to encompass new design, video, and graphics skills, as commissioning procedures change, as integration with third-party tools improves, and as CMSs develop to enable a greater, more intuitive use of multimedia content, this kind of immersive experience will only get quicker and easier to do.

In the meantime, it's a good time to experiment and learn. Andrew Kueneman, Deputy Director of Digital Design at the NYT said of Snowfall: "In the long term, we also walk away from an effort like this with many valuable lessons in design, development, team collaboration, editing, promotion, etc.—lessons we can apply going forward, and ones we could only learn while working on deadline."

So whilst this kind of thing might not be the future of journalism, it is surely at least part of the future of journalism, and a potentially quite significant one at that. This kind of mixing of traditional writing skills with techniques more akin to filmmaking enables a far richer palette for feature writers. Given that, I'm really surprised that we haven't seen more magazine publishers experimenting with this type of approach.

HT @Mrjamescarson for the Outside link


Layar And Magazines

People have been trying for years to crack the idea of creating some kind of interplay between print media and screen media. Unfortunately most of the offerings that people have come up with have never achieved any kind of traction, mostly because they've been simply too, well, clunky. I remember some extremely enthusiastic people coming to see me in the early noughties with a product that involved the reader having to hold their magazine up close to a PC webcam so that it might read a bar code embedded in a print ad (I remember thinking that by the time you've done that, you might as well have typed the URL into your browser).

So we've been left with the modern version of that - QR codes - which, as any print Art Director will tell you, can look rather messy on a carefully crafted magazine page layout. In order for this sort of stuff to have any value at all, it's got to involve as negligible an amount of effort as possible on the part of the reader, and be simple, intuitive and seamless.

Of course, when augmented reality came along, some were quick to create examples of it working with magazines and print ads, but they've still felt very gimmicky in a way that Layar's new version, demonstrated above, does not. Not that I think for a minute that anyone's going to read a whole magazine with their phone held up in front of it, but I rather like the idea of creating dynamic tags that enable me to buy stuff I see in a fashion piece (in reality I'm not that fashionable), or see an accompanying video to a print feature on my phone should I wish. Perhaps this sort of stuff is starting to come of age.

HT Hugo Rodger Brown for the link 


News+

What would a newspaper look like if it was invented today? That's the question that Bonnier have asked themselves and it's a good one. Because the answer is quite possibly not much like the current crop of digital news apps. Almost a year ago, Bonnier collaborated with BERG for a pre-iPad re-imagining of the digital magazine. What I liked about it was that it didn't make the mistake of trying to impose inappropriate analogue features onto a screen based digital experience.

Similarly, there's also a lot to like about News+. Like the fact that they have advocated a 'third way' that differs from both the traditional newspaper and the web, but combines editorial curation with plenty of opportunities for social curation, interaction and conversation. Publishers have seemed very keen to develop apps which look great, but are relatively closed systems that box content in and speak of a desire to control. It's almost like they're already forgetting the digital lessons that have long been so painful to learn.

News+ concept live from Bonnier from Bonnier on Vimeo.


Cooking Dinner With The iPad

Early iPad content apps seem to have suffered from a McLuhan-esque propensity to consider a new technology through the lens of the old, and what Jakob Nielsen called a 'crushing print metaphor'. There's perhaps an inevitability to the inconsistency in usability that Nielsen described, but the iPad is a re-imagined interface, and so requires re-imagined formatting of content.

Award-winning photographer William Hereford has created an experiment combining the kind of typeface you typically see in quality magazines with video which has been shot and edited to feel like a still photograph. Says William:

"My hope is to develop this video to work with tablet computers so that you could "swipe" between the vignettes instead of them playing with a rigid sequence from start to end. Tablets (and the internet really) provide the opportunity to look at moving images with the same studied intensity as a still photograph. Traditionally we are at the director's mercy regarding when a shot begins and ends- the whole experience is fleeting, which can be wonderful , but I like the idea of creating a moving image which runs on a loop or is shot over a long period of time so the media can be consumed and studied in ways a traditional film cannot."

Video, combining with photography, combining with print. A recombinant media experience. Surely we should be seeing more experimentation like this from established content producers?

Cooking Dinner Vol. I from William Hereford on Vimeo.

HT


The First Digital Magazine


So, back in December the Swedish publisher Bonnier collaborated with designers BERG London to conceptualise a vision for digital magazines on touchscreen devices. They did a pretty decent job - the most compelling vision, I thought, amongst a rash of others that were around or have come out since.

Well now the vision has been brought to life in the form of an iPad app for Popular Science magazine, called Popular Science+. With only 60 days available since Apple announced the iPad, they've had 6 editorial teams in 3 countries working to re-imagine the form of magazines. Popular Science+ is available now on iTunes, but a preview (below) has been made available at the Bonnier Beta Lab.

As before, I like that they have deconstructed the magazine form and reassembled it with a design vision that captures some of what makes magazine consumption such an inherently unique experience, and balanced it with the particular attributes of digital consumption. To paraphrase Sarah Ohrvall at Bonnier R & D (who kindly sent me a note about the launch) rather than producing what BERG call “a wrist screen running clock software”, they have built the watch. The real question is how ready publishers (with already stretched resources and long priority lists) are to serve another platform. For those lucky enough to already have an iPad, you can download the app here.

Mag+ live with Popular Science+ from Bonnier on Vimeo.