If you are running Lylina (the open-source RSS aggregator) on a Dreamhost account, the following fix will get feed fetching working.
Open discovery.php in your text editor of choice and jump to line 46. Remove the following:
$get_feed = file_get_contents($url);
Replace it with:
$ch = curl_init();
$timeout = 5; // set to zero for no timeout
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$get_feed = curl_exec($ch);
curl_close($ch);
Save and upload discovery.php and you should be all set.
Reasoning: Dreamhost disables file_get_contents() for remote URLs as a security precaution (mainly to thwart clueless webmasters from calling files via query strings) and relies on cURL to achieve the same effect. My fix makes that change and should have little to no effect on Lylina's overall performance.
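If you prefer to keep the change contained, the same fix can be wrapped in a small helper so line 46 only changes by one call. This is my own sketch, not Lylina code; the function name curl_get_contents is made up:

```php
<?php
// Drop-in substitute for file_get_contents($url) on hosts where
// fetching remote URLs with it is disabled. Returns the response
// body as a string, or false on failure (same as curl_exec).
function curl_get_contents($url, $timeout = 5)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $body = curl_exec($ch);
    curl_close($ch); // release the handle
    return $body;
}

// Line 46 of discovery.php then becomes:
// $get_feed = curl_get_contents($url);
```

Because it returns false on failure, just like curl_exec(), the rest of discovery.php can treat it the same way it treated file_get_contents().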
*Please note: I am not a developer of Lylina. Please direct any support issues to the Lylina SourceForge project forum.
There has been a lot of buzz throughout the net this weekend as most people publishing RSS feeds saw dramatic increases in their subscriber numbers as reported by services such as FeedBurner.
The reason? Google finally started reporting subscriber numbers from their services. Yahoo, Bloglines, et al. have been reporting this information for some time, but Google has been a bit behind the times. Friday, they announced this was changing, and Saturday FeedBurner’s subscriber reporting was updated.
If you aren’t using a feed service, you can gauge your subscribers by quickly perusing your log file. Feedfetcher, the Google feed bot, now reports a subscriber count when grabbing your feed, such as:
“Feedfetcher-Google; (+http://www.google.com/feedfetcher.html; 27 subscribers; feed-id=xxxxxxxxxxxxxxxxxxx)”
Subscriber counts for my sites jumped anywhere from 20% to 140%.
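Tallying those log entries by hand gets old quickly. A short script can total the counts for you; this is a sketch of my own, assuming the user-agent format shown above, keeping only the most recent count reported for each feed-id:

```php
<?php
// Total the subscriber counts reported by Feedfetcher-Google in an
// access log. Google reports one running count per feed-id, so we
// keep the most recent figure for each feed-id and sum them.
function feedfetcher_subscribers($log_text)
{
    $counts = array();
    foreach (explode("\n", $log_text) as $line) {
        if (preg_match('/Feedfetcher-Google;.*?(\d+) subscribers; feed-id=(\w+)/', $line, $m)) {
            $counts[$m[2]] = (int) $m[1]; // later entries overwrite earlier ones
        }
    }
    return array_sum($counts);
}
```

Feed your access log through it (file_get_contents works fine on a local file, even on Dreamhost) and you get a single subscriber total across all your feeds.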
One of the sites I write for regularly has a rather large visitor base: 5k daily uniques and over 2k RSS feed subscribers. We’ve always published a full-article feed for the convenience of our readers, but in the last year we have constantly battled splogs, spam blogs that exist only to aggregate others’ feeds for monetization purposes.
We’ve experimented with a few different tactics, but lately have settled on “poisoning” the feed with a blatant copyright notice that links back to our site and asks readers to inform us if they find our feed on a non-approved site:
© 2006 thesiteinquestion.com. This RSS Feed is for personal non-commercial use only. If you are not reading this material in your news aggregator, the site you are looking at may be guilty of copyright infringement. Please contact firstname.lastname@example.org so we can take the appropriate legal action.
While it may be a minor inconvenience to our subscribers reading our feed in their own feed reader, it amounts to 2-3 lines of text following some white space at the end of each post and should be fairly unobtrusive.
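The poisoning itself is a one-line change wherever the feed items are assembled. A sketch, assuming item content is built as a string before the RSS XML is written; the function name and parameters are my own, not from any particular feed library:

```php
<?php
// Append a copyright notice to a feed item's content before it is
// written into the RSS output. The wording mirrors the notice above;
// $site and $contact are whatever your site actually uses.
function poison_item($content, $site, $contact)
{
    $notice = "\n\n© 2006 $site. This RSS feed is for personal, "
            . "non-commercial use only. If you are not reading this "
            . "material in your news aggregator, the site you are "
            . "looking at may be guilty of copyright infringement. "
            . "Please contact $contact so we can take the appropriate "
            . "legal action.";
    return $content . $notice;
}
```

Appending rather than prepending keeps the notice out of the item excerpt most aggregators show first, which is what keeps it unobtrusive for legitimate subscribers.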
Another tactic we’ve tried in the past is stuffing hidden links into our feed. Taking a page out of the link spammers’ playbook, we included a hidden div containing keyword-rich links to domains that feed into affiliate programs. While this was highly successful in creating backlinks to sites (a blackhat SEO tip to think about), many of the web-based feed readers ran into display problems with the hidden div (namely bloglines.com). We’ve since discontinued the practice until we can perfect a way to avoid the display issues.
I hate splogs.