Scoble references someone worried about scaling syndication requests:
RSS and Atom feeds are pulled down by news aggregators like the NewsGator I use every hour. Multiply that by 20 million people and that's a bandwidth bill that's many times higher than it is via a web browser (because browsers only visit occasionally, and not every day).
RSS and Atom feeds pull down all the content every time, even content that hasn't changed since last time. Is there a way for him to send down only a feed that's changed, or even better, a partial feed with only new items? (My feed sends down 75 items each time, whether they're new or not.)
This is one of the things that already has a stock set of answers:
- Have your server handle conditional GET (honor If-Modified-Since and If-None-Match, and return 304 Not Modified when nothing has changed)
- Have your server use mod_gzip (or otherwise compress responses for clients that accept gzip)
That's pretty much it. These techniques have worked for years to scale the web, and I expect they will work for years to scale syndication as well...
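The two stock answers can be sketched in a few lines. This is a minimal, hypothetical request handler (the function name and header-dict shape are my own, not from any particular server): a conditional GET that returns 304 with no body when the aggregator already has the current feed, and gzip compression (what mod_gzip does transparently in Apache) for everything else.

```python
import gzip
import hashlib

def respond(feed_bytes, request_headers):
    """Return (status, headers, body) for a feed request.

    feed_bytes: the full feed document as bytes.
    request_headers: dict of incoming HTTP headers.
    """
    # Derive a validator from the feed content; a real server would more
    # likely use the file's mtime (Last-Modified) or a stored ETag.
    etag = '"%s"' % hashlib.md5(feed_bytes).hexdigest()

    # Conditional GET: if the aggregator already has this version,
    # send 304 Not Modified and no body at all.
    if request_headers.get("If-None-Match") == etag:
        return 304, {"ETag": etag}, b""

    headers = {"ETag": etag, "Content-Type": "application/rss+xml"}

    # gzip: compress for clients that advertise support. Feed XML is
    # highly repetitive, so it compresses very well.
    if "gzip" in request_headers.get("Accept-Encoding", ""):
        body = gzip.compress(feed_bytes)
        headers["Content-Encoding"] = "gzip"
    else:
        body = feed_bytes

    return 200, headers, body
```

A well-behaved aggregator remembers the ETag from the last fetch and sends it back as If-None-Match on the next one, so hourly polls of an unchanged feed cost a few hundred bytes of headers instead of the whole 75-item document.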