The idea is to develop an online service to track topics or entities in the news. A customer can register one tracker for free, consisting of a keyword of interest plus an RSS feed or an HTML site, with retention of up to 30 days, for example. They then get a dashboard with basic statistics: number of relevant articles per day, trend, and so on. Additional trackers are charged per bundle, and so are advanced metrics, for example keyword clustering, political bias, sentiment, or geographic clustering. Custom metrics can be developed together with the client for a consulting fee. Maybe the customer wants to create a timeline of events and correlate it with the timeline of articles.

The idea is to position the service between Feedly and Trackur by focusing on news and more official sources rather than on blogs and social media, and to offer added value over a scraper service such as scrapehub by providing more analytics and text-mining capabilities. The service is targeted at sophisticated users who need to analyse large volumes of data professionally, need more than just a news aggregator, and lack the capability to develop and run a web scraper themselves. Social media will not be the primary target; I am thinking more of news sites, public sources, patent sites, and other public and official outlets. It will be more a service that offers basic research capabilities than a brand-monitoring service. The political science literature contains very interesting examples of how to track topics and persons and analyse content.

I am working on a prototype of a minimum viable product where customers can register a feed and see simple descriptive statistics on a dashboard, and I will iterate from there.
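The MVP's core loop, matching a keyword against a registered RSS feed and counting relevant articles per day, can be sketched roughly like this. The sample feed, the field handling, and the function name are illustrative assumptions, not a fixed design; a real service would fetch the customer's registered feed URL and persist the counts.

```python
# Sketch: count RSS items mentioning a keyword, grouped by publication day.
# Uses only the standard library; a production version would fetch the feed
# over HTTP and handle malformed dates and namespaced feeds (Atom, etc.).
import xml.etree.ElementTree as ET
from collections import Counter
from email.utils import parsedate_to_datetime

def articles_per_day(rss_xml: str, keyword: str) -> Counter:
    """Count RSS <item>s whose title or description mentions the keyword,
    keyed by ISO publication date."""
    root = ET.fromstring(rss_xml)
    counts = Counter()
    for item in root.iter("item"):
        title = item.findtext("title") or ""
        desc = item.findtext("description") or ""
        pub = item.findtext("pubDate")
        if pub and keyword.lower() in (title + " " + desc).lower():
            day = parsedate_to_datetime(pub).date().isoformat()
            counts[day] += 1
    return counts

# Tiny in-memory example feed standing in for a customer's registered feed.
SAMPLE = """<rss version="2.0"><channel>
  <item><title>Election results are in</title>
    <description>Coverage of the election.</description>
    <pubDate>Mon, 02 Jan 2023 09:00:00 GMT</pubDate></item>
  <item><title>Sports roundup</title>
    <description>No politics here.</description>
    <pubDate>Mon, 02 Jan 2023 10:00:00 GMT</pubDate></item>
  <item><title>Election follow-up</title>
    <description>More election analysis.</description>
    <pubDate>Tue, 03 Jan 2023 08:00:00 GMT</pubDate></item>
</channel></rss>"""

print(articles_per_day(SAMPLE, "election"))
```

The per-day counts produced here are exactly the "number of relevant articles per day" series the dashboard would plot, and the trend metric falls out of the same data.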