Monthly Archives: December 2003

Confessions of a Napster 2.0 user

As I’ve mentioned before, I’ve been using Napster 2.0 (formerly Pressplay) for about eight months. Despite a couple of minor glitches, it’s been a very positive experience overall. I’m on the $9.95/mo plan. A few observations from continued use:

  • First, I’m at my desk probably 10 hours a day on average, and I’m streaming content most of that time. I listen to a lot more music now than I ever did, and to a much wider variety of artists.
     
  • When I read about some new music, or a friend tells me about a new artist, I can just search for them on Napster and listen instantly. There have been only a couple of times when I couldn’t find what I wanted, and those artists don’t seem to be available on any online service I’ve seen. Examples: Madonna and LeAnn Rimes.
     
  • When I find something I really like and am constantly listening to, I go buy the CD rather than buying the music through the service. Then I can play it in the car, on my disconnected home stereo, and rip it to my computer to listen there. This is good for the label, I suppose, but I wonder what it does to Napster’s margins. I’m guessing that at $0.99/track there is very little margin anyway…so hopefully they’re making a reasonable profit from the monthly fee while I’m constantly streaming.
     
  • Most songs on Napster are available for streaming in full, but a few songs/artists have only 30-second previews. I’ve found that if an entire album is preview-only, I won’t listen to it…but one of my favorite albums has full versions of every song except one. Annoying, but I hate to admit it’s actually a good incentive to buy the CD. I want to hear the rest of that song! :-)
     
  • I really want a 10-foot interface for this stuff that I could run on a Media Center PC; I’ve heard rumors that one actually exists. I guess now all I need is that Media Center PC!

PR on the Cheap

Freelance writer Ron Miller wrote an article featuring NewsGator in his latest Home Base column in Network World Fusion, titled PR on the Cheap. From Ron’s weblog:

I describe how small companies who can’t afford to hire a PR firm or a clipping service can use a news aggregator such as NewsGator to help monitor the Web and blogs for information about their company or markets.

Great stuff!

Aggregators that automatically download web pages

This is a pretty common request for NewsGator:

Perhaps I’m missing something, but I think that actually having a reader go out and retrieve the referenced news web page along with the summary feed is much more valuable… Reading hundreds of news headlines is less useful when you are travelling, offline, etc., as there is no way to get the actual content.

Wouldn’t it be possible to add a feature that retrieves the referenced URL?

[NewsGator Forums]

Currently, NewsGator shows whatever is in the feed – nothing more, nothing less. If the feed contains full content, that’s what will be shown; if the feed contains only excerpts, that’s what will be shown. In essence, we show whatever the publisher intended.

There are other tools that will automatically go out and retrieve the contents of the web page at the link specified in the RSS item, at retrieval time (as opposed to viewing time), so it can be read offline. That’s essentially what’s being asked for above.
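To make that concrete, here’s a minimal sketch of the retrieval-time approach, in Python purely for illustration (this is not NewsGator’s code, and the function name is my own invention). It assumes a plain RSS 2.0 feed and omits error handling, conditional requests, and local storage:

```python
import urllib.request
import xml.etree.ElementTree as ET

def fetch_feed_with_pages(feed_url):
    """Fetch an RSS 2.0 feed and, for each item, also retrieve the
    full web page behind the item's <link> for offline reading."""
    with urllib.request.urlopen(feed_url) as resp:
        tree = ET.parse(resp)

    items = []
    for item in tree.iter("item"):
        link = item.findtext("link")
        entry = {
            "title": item.findtext("title", default=""),
            "summary": item.findtext("description", default=""),
            "page": None,
        }
        if link:
            # The retrieval-time scrape: one extra request per item,
            # made whether or not the user ever opens that item.
            with urllib.request.urlopen(link) as page:
                entry["page"] = page.read()
        items.append(entry)
    return items
```

Note that the page fetch happens once per item at retrieval time, regardless of whether the user ever reads it – which is where the trade-offs below come from.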

If the feed publisher really intended you to see the complete web page inside your aggregation tool, they could put the complete content inside the feed…and then we would show it. But often they don’t, obviously.

So we’re caught between doing what the publisher wants (driving a click-through) and doing what the user says they want (scraping the page). It’s a tough call – we don’t want to upset the publishers, as they’re the ones providing the content…

There are also a number of downsides to a scraping mechanism. It uses a sizable amount of bandwidth to retrieve all of these pages. You may not even be interested in some of the pages, so they’re retrieved for nothing, costing the publisher additional bandwidth. And advertising stats on the publisher side will be skewed.

Any comments?

EContent Top 100 – NewsGator Technologies

NewsGator Technologies has been named to the EContent Magazine Top 100 list for 2003! From the magazine:

“Welcome to the third annual EContent 100 – our list of companies that matter most in the digital content industry.”

So far, this is only in the print magazine, not yet online…I’ll add a link to this post when it hits their web site.

I’d like to take this opportunity to thank our staff, contractors, partners, and customers for making this possible. And if you think 2003 was good, wait until you see what we’re going to ship in 2004. :-)

Update: 2003 EContent Top 100 now online

PC Magazine on RSS again

In the December 30, 2003 issue of PC Magazine, they review several blogging tools (congratulations to Six Apart for the Editors’ Choice!), and talk about RSS and popular aggregators, including FeedDemon, SharpReader, and NewsGator. From the article:

RSS is even poised to change the business world. […] According to Greg Reinacker, the founder of NewsGator, programmers at at least one software company are using his aggregator to notify one another whenever they make changes to their communal code base.

MIME types and feed: again

Joe Gregorio has a good post about MIME types and the feed: scheme:

There has been much talk today, and in the far past, of how to automatically handle syndication subscription. The conversation was first brought up and thoroughly discussed by Greg Reinacker. The issue has resurfaced on the [atom-syntax] mailing list. Now there are a small contingent of folks pushing for a new uri scheme called ‘feed:’ that would enable syndication subscription. Creating a new URI scheme is a bad idea, don’t do it. [BitWorking]

There’s more to his post; click through and read it.

I had a long response typed out, and it sat in my Drafts folder for a day. I’ve deleted it; I just have two short comments now:

First: many folks don’t have sufficient access to their web server to change MIME types. That’s not a problem for, say, Blogger and TypePad, but it is indeed a problem for, say, some Radio and DasBlog users.

Second: as Joe mentions, using MIME types will require adding some kind of link tag to the feed, and he proposes exactly this for Atom feeds. But that doesn’t help with all of the existing RSS feeds in the wild today…and adding a tag to all RSS feeds is problematic (the spec is frozen, so we can’t require the tag, and adoption rates for an optional tag will be dismal).
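To illustrate why that link tag matters, here’s a minimal sketch of the two launch mechanisms, in Python purely for illustration (the function names and the rel="self" link convention are my assumptions, not anything from a spec). With the feed: scheme the handler receives the URL itself; with a MIME-type association, the browser typically hands the handler a downloaded copy of the feed, so the subscription URL has to be recovered from inside the feed:

```python
import xml.etree.ElementTree as ET

def subscribe(url):
    # Hand the URL off to the aggregator's subscription list.
    print(f"subscribed to {url}")

# feed: scheme: the registered handler receives the URI directly,
# so the subscription URL is right there. (Details of the scheme
# varied between proposals; this assumes a feed:http://... form.)
def handle_feed_uri(uri):
    subscribe(uri.replace("feed:", "", 1))

# MIME type: the browser downloads the feed and launches the handler
# with a local temp file. The original URL is gone, so we must find
# a self-referencing link element inside the feed itself.
def handle_mime_launch(temp_path):
    tree = ET.parse(temp_path)
    for elem in tree.iter():
        # Crude namespace handling: match any *link element carrying
        # rel="self" -- the kind of tag Joe proposes adding to feeds.
        if elem.tag.endswith("link") and elem.get("rel") == "self":
            subscribe(elem.get("href"))
            return
    print("no self link found; can't determine the subscription URL")
```

The second path only works if the tag is actually in the feed, which is exactly the adoption problem for today’s RSS feeds.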

It’s all about adoption rates – for something to be useful to the average user, it needs to be pretty pervasive. Both of the above issues fly in the face of quick adoption.

More reading here, here, here, and here. And lots of other places I’ve missed, probably. :-)