Man vs. Algorithm: When Media Companies Need a Human Touch

Before Amazon evolved into the ecommerce behemoth we know today, it functioned more like an online literary publication. For the first few years after launching in 1995, it relied primarily on a team of editors and writers to review books — the first product Amazon sold — and write copy for the homepage. The goal was to give the young site an authoritative voice and prove to customers that they were dealing with real people rather than machines, an important consideration at a time when many were wary of sharing credit card information online.
By the early 2000s, however, the editorial staff found itself locked in a losing battle against another group at Amazon, simply referred to as the personalization team, which developed algorithms to recommend products to users. One algorithm in particular, called Amabot, proved to be the death blow for Amazon's editorial staff.
"Amabot replaced the personable, handcrafted sections of the site with automatically generated recommendations in a standardized layout," according to The Everything Store, a new book exploring the history of Amazon. "The system handily won a series of tests and demonstrated it could sell as many products as the human editors."
Amazon, a data-driven company that has always been fiercely focused on managing costs, sided with Amabot and the majority of the editorial team was laid off or assigned to other areas. One Amazon employee later summed up the sentiment among the team members by placing an ad in the 2002 Valentine's Day edition of a local Seattle paper addressed to the algorithm that made them obsolete:
DEAREST AMABOT: If you only had a heart to absorb our hatred ... Thanks for nothing, you jury-rigged rust bucket. The gorgeous messiness of flesh and blood will prevail!
More than a decade later, "algorithm" is no longer such a dirty word. Major media companies like Amazon, Netflix and Pandora rely on algorithms to personalize the user experience and many journalism outlets now use algorithms to recommend stories, organize the homepage and even generate original content. But that doesn't mean real editors and curators have completely lost out. Sometimes the gorgeous messiness of flesh and blood does prevail.
"I'm going to be sitting with you all day here," the voice says a few moments before the first song starts playing. "I'm going to tell some stories, we're going to argue about some things and I'm going to bring on some very special guests."
Usually when you hear someone talking between songs on an Internet radio station, it's an ad, but on this Slacker Radio station it's a big part of the experience. The service recruits dozens of professional DJs and offers 350 curated radio stations covering country music, blues and the promisingly titled "chill" genre. In this case, the voice belongs to Brandon Victor Dixon, a stage actor who appeared in The Color Purple and the new musical Motown, and is guest hosting a countdown of the 55 Best Motown Songs of All Time.
As he makes his way through the list, Dixon talks about the history behind the songs, sings an occasional line from a tune and claims to be "offended" that certain tracks ("Bernadette" by The Four Tops, No. 54) didn't rank higher on the list. For a moment, you might really think you're listening to a classic terrestrial radio station, and that's the point. These stations combine the efficiency of being able to skip songs and fine-tune the playlist with the quirkiness that comes from having an actual radio host.
"We're building stations that if a computer tried to build, it would blow up," Kevin Stapleford, Slacker's senior director of programming, told Mashable in a recent interview.
Stapleford spent most of his career working with terrestrial radio stations at Clear Channel and Finest City Broadcasting, and joined Slacker at the beginning of this year to help with its efforts to double down on human curation. Now he oversees a digitally savvy network of radio stations that let hosts adjust their selections based on data from feedback online — analyzing whether listeners skip certain tracks, for example — but which are not run by data alone. For Stapleford and Slacker, the decision to shift away from the industry norm of relying entirely on algorithms isn't just about the noble cause of resurrecting traditional radio; it's an effective differentiator from rivals like Pandora and Spotify and a feature that has proven to be more engaging for users.
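Slacker hasn't detailed how that feedback analysis works. As a rough illustration only, the kind of skip-rate report a curator might consult could look like the sketch below; the log format, field names and cutoff are assumptions for illustration, not Slacker's actual system.

    from collections import defaultdict

    # Hypothetical listening-log records: (station, track, was_skipped).
    # These records and the 50% cutoff below are invented for illustration.
    events = [
        ("Motown Countdown", "Bernadette", True),
        ("Motown Countdown", "Bernadette", False),
        ("Motown Countdown", "My Girl", False),
        ("Motown Countdown", "My Girl", False),
    ]

    def skip_rates(events):
        """Return skips / plays for every (station, track) pair."""
        plays, skips = defaultdict(int), defaultdict(int)
        for station, track, was_skipped in events:
            plays[(station, track)] += 1
            skips[(station, track)] += was_skipped
        return {key: skips[key] / plays[key] for key in plays}

    # Surface tracks a host might want to reconsider (cutoff is arbitrary).
    for (station, track), rate in skip_rates(events).items():
        if rate >= 0.5:
            print(f"{station}: '{track}' skipped {rate:.0%} of the time")

In a setup like this, the data only flags candidates; the host still decides whether a heavily skipped track stays on the station.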

Slacker, which now has 30 million listeners, has found that users spend 20% more time listening to stations curated by real people. What's more, users who pay for a Slacker subscription end up spending 84% of their time on the service listening to the curated stations rather than creating playlists driven by algorithms, as one might do on Pandora. Stapleford attributes this success to the variety and personality of having a real person calling the shots.
"There are things that algorithms can do and things they just can't," he says. "One thing they can't do is delight and surprise a listener: you want to throw a curve ball every once in a while."
Michael Silberman recently had a similar revelation. Silberman, the general manager of digital media at New York magazine, says the publication has tried for years to find ways to maximize the number of articles that readers consume when visiting NYMag.com. As part of this effort, his team tested a promising algorithm from nRelate, a content recommendation platform, which automatically generated related articles at the bottom of posts on Vulture, New York's entertainment section.
What happened next surprised everyone: It turned out the links curated by editors "significantly outperformed" those generated by the algorithm. "We tested an algorithm against human selection and human selection won," Silberman told Mashable. "Humans are pretty good at figuring out what other humans are going to be interested in clicking."
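New York didn't publish the numbers behind "significantly outperformed." One simple way such a comparison could be checked is a two-proportion z-test on the click-through rates of editor-picked versus algorithm-picked links; the counts below are invented purely for illustration and are not the magazine's data.

    from math import sqrt

    def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
        """Z statistic for the difference between two click-through rates."""
        p_a = clicks_a / views_a
        p_b = clicks_b / views_b
        pooled = (clicks_a + clicks_b) / (views_a + views_b)
        se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
        return (p_a - p_b) / se

    # Invented numbers: editor-picked related links vs. algorithmic ones.
    z = two_proportion_z(clicks_a=1800, views_a=50000,   # human-curated
                         clicks_b=1500, views_b=50000)   # algorithmic
    print(f"z = {z:.2f}")  # |z| > 1.96 suggests the gap is unlikely to be noise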
That doesn't mean NYMag.com is suddenly pushing for everything to be done manually. Silberman says the company is very much interested in using algorithms and personalization technology to better target the reader's interest and filter out stories he or she may have already seen. But for the foreseeable future, he expects editors will continue to play a role in suggesting stories on the website and determining which stories are featured at the top of the main page.
"What we want to do is make sure that we are using our editorial resources in the most effective way possible," Silberman said in response to a question about whether he would draw any line in the sand for dividing the roles that algorithms and journalists play in the newsroom. "Certainly we won't have robots writing stories so that seems like an obvious line in the sand."
As it turns out, that may not be such an obvious line for all publications.
Robbie Allen has probably fielded more questions from frightened journalists than most CEOs. Allen runs Automated Insights, a North Carolina-based company that uses data processing and algorithms to turn numbers into narratives. The service generates more than 300 million pieces of content per year — which Allen proudly claims is greater than all other media organizations combined — on data-heavy subjects like sports scores, earnings reports and real estate stories.
Given that the content is produced automatically, the startup can customize stories for particular needs and users. It provides fantasy football recaps to Yahoo, which are tailored to each user's league. It has also provided sports recaps to local news outlets and, according to Allen, is close to announcing a partnership to provide NFL-related content to The Associated Press. (Narrative Science, a similar service, currently provides content to Forbes and others.)
"Early on there were a lot of questions about whether we were going to automate journalists out of a job and whether journalists in general should be worried about us," Allen told Mashable. As he sees it though, Automated Insights is going after the kind of hyper-personalized content that traditional publications and reporters never touched. "There's no sort of competition with journalists in general for that type of content."
While the automated stories may get the most attention in the press, they are actually just one of three functions that Allen says Automated Insights can serve for the media. The company can use its data tools to spot trends that may inspire more deeply reported stories, and it can use data to write the framework of a story, which reporters and editors then fill in with quotes, images and other details. The latter model is how Automated Insights sometimes works with local news outlets, and it provides a clear example of at least one thing algorithms can't do: hound people for quotes.
"There's no well-defined source for getting quotes in a timely fashion. That requires good ol' fashioned human interaction," he says, allowing just enough of a pause to let a journalist breathe a sigh of relief before adding: "I don't rule out anything could be automated."
Some news organizations have decided to be proactive and come up with their own tools to automate coverage. At The Los Angeles Times, database producers Ben Welsh and Ken Schwencke have quietly developed algorithms to generate stories on crime and, more recently, earthquakes moments after they happen.
"There just aren't enough people in the newsroom to say in a city the size of Los Angeles that there's a couple more burglaries than usual in a neighborhood," Welsh told Mashable, explaining the need to be more scientific with crime coverage. Likewise, he said the publication was sometimes "inconsistent" in deciding which earthquakes merited coverage in a region that gets more than its fair share. Editors, he said, would resort to asking themselves, "Did we personally feel it in the newsroom?"
With that in mind, Welsh and Schwencke worked together with editors to create algorithms for each subject that take into account factors like the magnitude and location of an earthquake, or how many standard deviations above average crime has climbed in a given period, to determine whether to run a story. For incidents that merit coverage, a short breaking news item is generated automatically with a few lines of text, some stats and a map of the relevant area.
It's not exactly Pulitzer Prize-winning writing, but as Welsh points out, "it will be on the blog within seconds and we won't spend a ton of resources."
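The Times' own scripts aren't reproduced in the article, so the snippet below is only a minimal sketch of the approach Welsh describes: check editor-defined thresholds, then fill a short text template. The thresholds, field names and wording are illustrative assumptions, not the paper's actual rules.

    # Hedged sketch of threshold-then-template story generation; cutoffs,
    # field names and phrasing are assumptions, not the Times' actual code.

    EARTHQUAKE_MIN_MAGNITUDE = 3.0   # illustrative cutoff
    CRIME_SIGMA_THRESHOLD = 2.0      # std deviations above the average

    def should_cover_quake(magnitude, distance_to_la_miles):
        """Editors' judgment 'enshrined in the code': size and location."""
        return magnitude >= EARTHQUAKE_MIN_MAGNITUDE and distance_to_la_miles <= 100

    def should_cover_crime(count, mean, std_dev):
        """Flag a neighborhood when a crime count is unusually far above average."""
        if std_dev == 0:
            return False
        return (count - mean) / std_dev >= CRIME_SIGMA_THRESHOLD

    def quake_item(magnitude, place, time_str):
        """Fill a short breaking-news template; an editor reviews it before it runs."""
        return (f"A magnitude {magnitude:.1f} earthquake struck near {place} "
                f"at {time_str}, according to the USGS. "
                f"This post was generated automatically and reviewed by an editor.")

    if should_cover_quake(magnitude=4.4, distance_to_la_miles=12):
        print(quake_item(4.4, "Westwood, California", "6:25 a.m. Monday"))

    if should_cover_crime(count=18, mean=9.5, std_dev=3.1):
        print("Burglaries in this neighborhood are well above the recent average.")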
While some might blanch at the idea of automating breaking news, Welsh argues that it is still very much steeped in the editorial process. By consulting editors early on, he believes their editorial judgment "is enshrined in the code." The story itself is also reviewed by an editor before it publishes on the site to ensure it passes muster. "Nothing that we do is put up without a human looking at it," he says.
Welsh is quick to point out the limitations of relying on algorithms for news coverage. An algorithm may be able to write up a short blog post on an earthquake, but it can't do the legwork needed to go beyond that and produce something like the Times' recent investigative feature on homes and offices at risk of collapse from a future earthquake.
"There's all sorts of outside information that a human will have when reading something that will put it in a greater context," he says. "Computers are really literal. They do what you tell them to do — exactly what you tell them to do. There's a blessing and a curse to that."
Image: Amazon, Mashable composite

Source: http://mashable.com/
