Tag Archives: Content

Vanity Press or Monopoly Busters?

A few months ago, I got an email from “iConcept Press” inviting me to write a book chapter in their IR journal based on my AAAI paper. I ignored it, just as I ignored another email in a similar vein from another “publishing house”, and found at least one blogger who was just as suspicious of this seemingly mass solicitation.

You see, in the academy we are conditioned to believe that the lower the chances of acceptance, the better the venue for publishing, so if you’re willing to accept me into your club right from the start – huh, forget it!

A couple of weeks ago I got another email from them. This time, the happy bunch invited me to be a reviewer on one of their books. Now, that was really amusing – if not a writer, then I’d be a reviewer? Pathetic, I thought. But is the picture really that simple?

It was interesting, first, to see that they do actually use a peer-review system, even if perhaps not a super-duper double-blind one. And then I started wondering: is that conditioning to favor low-acceptance publications really still relevant in the self-publishing era?

I remember when I published my first paper at AAAI, I was quite outraged at the idea that you have to pay, then give away all copyrights, and then be used as bait to draw in paying readers – the publication meant I could not give my own readers free access unless I paid again. In a time when publishing your words on the web is such a common privilege, that seems plain wrong.

Back in the days when publishing was a costly process, high selectivity guaranteed that subscribers wouldn’t waste their money sponsoring the print of low-quality papers. Furthermore, anything not printed had a very low chance of getting read by other researchers, not to mention cited, so readers relied on editors to indeed include only the best. Nowadays, papers are read mostly online, and if your paper is accessible to search engines, that suffices – whoever finds your research useful will read and cite it. This Wikipedia entry has the whole story in a nutshell.

So as for myself – I still have not published or reviewed with iConcept Press, but I am now less dismissive of this somewhat disruptive industry; not because it will win over the established venues, but because it will accelerate the move towards decentralized, online publishing, a better fit for our era.


Death of a News Reader

Dave Winer says I don’t read his posts. He’s right, I admit. I skim.

I’m overloaded. So over the past few months I’ve gradually reduced my subscription list from over 50 feeds to around a dozen, and at the same time increased my reliance on Genieo, which claims to already be tracking 537 feeds for me (not all of them are feeds I would fully subscribe to, but that’s the beauty of it…)

When trying to understand what had happened, I came to realize my reader subscriptions list was made of two types of feeds:

  1. Feeds that are generally on topics I’m interested in
  2. Blogs where I thought the author was interesting or smart

Type #1 is, practically speaking, simply not scalable. There are just too many good sources out there, and not all of their posts are really read-worthy for me, even just to skim through. So I let Genieo discover those feeds (simply by clicking through to some posts) and then removed them from my subscription list. It’s amazing how good it feels to safely eliminate a feed from your reader (“…yes, I am sure I want to delete!” :))

Type #2 is trickier, as I would usually be interested in all of the posts, even those outside my topics of interest. These include blogs by friends, and blogs by smart people I stumbled upon who seemed worth following. I also wouldn’t want Genieo (or any other learning reader, for that matter) to think I’m generally interested in those more random topics and clutter my personalized feed. So I kept this much shorter list in my reader, knowing I can visit it far less frequently without losing anything.

This combination has been working well for me in recent months. Social diet hurray!

The (Filtered) Web is Your Feed

A few months ago I was complaining here about my rss overload. A commenter suggested that I take a look at my6sense, a browser extension (now also an iPhone app) that acts as a smart RSS reader, emphasizing the entries you should be reading. I wanted to give my6sense a go then, but the technical experience was lousy, and moreover – I was expected to migrate all my rss reading to it. Too much of a hassle, so I gave it up.

In the past few weeks I’ve been test-driving a new player – Genieo, which takes the basic my6sense idea a few steps further. Genieo installs an actual application, not just an extension, that plugs into your browser. It tracks your rss feeds automatically, simply by looking for rss feeds in the pages you’re browsing, and learns your feeds without any setup work.

Genieo then goes further and discovers feeds on pages you visit even if you’re not subscribed to them, turning your entire browsing history into one big rss feed. It then filters this massive pool of content using a semantic profile it builds of your interests, based on analyzing the text you’ve read so far.
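That feed-discovery step is essentially standard RSS autodiscovery: pages advertise their feeds with `<link rel="alternate">` tags in the HTML head. Here is a minimal client-side sketch of harvesting them (the class and function names are my own illustration, not Genieo’s actual code):

```python
from html.parser import HTMLParser

# MIME types that signal a syndication feed
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedLinkFinder(HTMLParser):
    """Collects feed URLs advertised via <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if (a.get("rel") or "").lower() == "alternate" and a.get("type") in FEED_TYPES:
            if a.get("href"):
                self.feeds.append(a["href"])

def discover_feeds(html_page):
    """Return every feed URL declared in the page's markup."""
    parser = FeedLinkFinder()
    parser.feed(html_page)
    return parser.feeds
```

Run that over every page a user visits and you get exactly the “browsing history as one big feed” effect described above.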

For IR people this may sound a lot like Watson, Jay Budzik’s academic project turned contextual search turned advertising-technology acquisition. Watson approached this as a search problem: how would I formulate search queries to run in the background, fetching the documents most relevant to the user’s current context? The problem is, users are not constantly searching, and would quickly get annoyed by general search results they never asked for.

The good thing about an rss feed is that it explicitly says “this is a list of content items to be consumed from this source”, and its temporal nature provides a natural preference ranking (prefer recent items), so a heuristic of “users would be interested in recent and relevant items from feeds in pages they visited” works around the general search difficulty pretty well. Genieo circumvents the expected privacy outcry by running the entire logic on the client side; nothing of the analyzed data leaves your PC (privacy warriors would probably run sniffers to validate that).
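One way to picture that “recent and relevant” heuristic is as a ranking function that multiplies a topical-relevance score by an exponential recency decay. The formula and the 24-hour half-life below are purely my own illustration, not Genieo’s actual algorithm:

```python
import math

def score_item(age_hours, relevance, half_life_hours=24.0):
    """Combine a topical relevance score (0..1) with exponential
    recency decay: an item loses half its weight every half-life."""
    recency = math.exp(-age_hours * math.log(2) / half_life_hours)
    return relevance * recency

def rank_items(items):
    """items: list of (title, age_hours, relevance) tuples.
    Returns them best-first: fresh, on-topic items float to the top."""
    return sorted(items, key=lambda t: score_item(t[1], t[2]), reverse=True)
```

With a scheme like this, a two-day-old highly relevant post can still lose to a fresh, moderately relevant one – which matches the feed-reading intuition of preferring recent items.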

In my personal experience, the quality of most results is excellent; they are almost always posts that interest me. Genieo quickly picked up my feed subscriptions from clicks I made in my reader to the full article in a browser window (from which it extracted the rss feed), and after a while I could see it gradually picking up on my favorite memes (search, social and others). I have not yet given up my rss reader for Genieo, and I still have many little annoyances with it, but overall, for an initial version, it works surprisingly well.

However, the target audience even better suited for Genieo is not rss-savvy users like me, but the masses out there who don’t know and don’t care about reading feeds. They just want interesting news, and they don’t mind missing out on the full list (a-la Dave Winer’s “River of News” concept). Such users will find tools like Genieo as useful as a personal news valet can be.

To Tweet or Not to Tweet (hint: that’s not the question)

I was catching up on my RSS overload the other day, when this side note in a post by Naaman on Social Media Multitaskers caught my attention:

“I find that I now blog thoughts that are too long to fit in a tweet; so feel free to follow my tweets…”

[Image: “I am the man. I suffered. I was there.” CC by ‘Kalense Kid’/Flickr]

I’m not much of a media multitasker myself, so I don’t experience this duality first hand, but I can imagine it: you get an interesting thought or experience, and then you wonder – is this major enough to develop into a blog post, in which case I’ll go over here, or is it not that heavy / can’t be bothered, in which case I flutter my wings over there. Actually, I do experience this; it’s just that in the latter case I simply drop the thought (and excuse me for not considering Facebook status updates an option – that’s stuff for another post…)

This should not have been a dilemma at all, had blogging platforms evolved to accommodate microblogging, which today is somehow seen as the centralized domain of a single commercial company. You really should be able to hop onto your publishing platform, write that thought down, regardless of length, and fire it out. No need to figure out which channel to use, or whether the intended readers are indeed following you there. Similarly, your friends/readers should not have to subscribe to your feeds on different platforms, but rather consume just one, relying on a powerful set of rules to filter your stream as they see fit.

Posterous is a great (and fast-growing) example of how easy it can be from the blogger’s perspective. Just post it (or rather, email it) and it will get published as needed (e.g. shortened for twitter). But it does not make things any easier on the consumer, who still needs to decide where best to follow this blogger (does he perhaps write additional blog posts directly on his blog that won’t show up on his twitter? or vice versa?), and it reduces the basic filtering capability that may have existed when different post types were distributed across different services.

No need to reinvent the wheel here; blogging platforms are abundant, decentralized and perfectly fit to remain our publishing hub, with their developed CMS and loose but well-defined social networks. What blogging platforms should do – heck, what Automattic should do to evolve, is:

  1. Conversation – support the realtime conversational nature of short posts, with the right UI and notification mechanisms. The “P2” microblogging-optimized theme released almost two years ago was a good start; sadly, it still followed the line of thought of “blog or microblog, not both”. To move forward, Automattic needs to realize that Twitter is not a personality, it’s a state of mind, hence P2 can’t be a permanent theme – it should be a contextual theme.
  2. Publishing – acquire Posterous. As simple as that. These guys earned their fame by understanding the pains of publishing anytime, anywhere; they know a thing or two about usability and persuasion, and they have great buzz. The latter is not a luxury – a buzzed-up acquisition makes it very clear that this is a major strategy for you, a lot more than if you developed the same changes yourself.
  3. Consuming – that’s the tricky part… how do you embed Twitter and WordPress into the same stream, when each consumer has their own desired blend of it? We don’t want to invent a new technology; RSS is here to stay. We do want better ways of filtering our floods, using better tagging coupled with more clever feed options. How exactly – I do hope there’s an entire team at Automattic working on exactly that…

The Broken Web

Dave Winer recently pointed out two trends that pose a risk to user-created content on the web:

  • Over-reliance on url shorteners. Fueled by twitter’s laconic style, more and more links to content are created via an indirection through url-shortener services such as bit.ly and tr.im. The collapse of such a service may turn tons of links into broken links in an instant.
  • Centralized conversation platforms. Shifting the conversation away from their blogs, influential content publishers chose to center on platforms such as twitter and FriendFeed. Besides the increased noise inherent to lifestreaming, there is increased risk in making your contributions (and having your readers contribute back) on a site run by a private company with no real commitment to its users.

In the past two weeks, both these risks materialized to some extent. The url-shortener service tr.im shut down, and that 404 iceberg was avoided at the last minute by the owners’ decision to open-source it. Then Facebook acquired FriendFeed, and their PR said:

“…FriendFeed.com will continue to operate normally for the time being as the teams determine the longer term plans for the product.”

Hmm, right… So Scoble’s blog still loves him, and is probably a safer publishing venue.

But why is this such a big deal anyway? [Image: Broken web of intrigue, CC by ‘Looking for a Lighthouse’/Flickr]

We tend to forget how much we have invested in such services until they break down (as was the case with ma.gnolia). The web’s strength is in storing, and being able to search, the content produced by millions of earthlings. Losing large amounts of that content, or the links to it, has a significant impact. Especially for social search, that content could be vital (OK, perhaps except for the part about what you had for breakfast).

As always with such issues, the best solution is decentralization. For url shorteners, the ‘shortlink’ protocol has already been proposed for site-maintained shorteners, and WordPress has already implemented it. My blog is already enabled – try http://wp.me/plBAi-8Q. And then, content decentralization is in our hands. Think about it the next time you post your thoughts to twitter rather than to your blog…
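The shortlink convention is simple enough to consume programmatically: a page declares its own short URL either via a `<link rel="shortlink">` element in its markup or via an HTTP `Link` header. A rough sketch of parsing the header form (the function and its regex are my own illustration, not the protocol’s reference implementation):

```python
import re

def shortlink_from_link_header(link_header):
    """Extract the short URL from an HTTP Link header such as:
        Link: <http://wp.me/plBAi-8Q>; rel=shortlink
    Returns None when no shortlink relation is present."""
    for part in link_header.split(","):
        match = re.match(r'\s*<([^>]+)>\s*;\s*rel="?shortlink"?\s*$', part)
        if match:
            return match.group(1)
    return None
```

Because the short URL lives on the same domain as the content, it survives exactly as long as the site itself does – no third-party shortener to collapse underneath it.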