Sitemaps work in tandem with incoming links.
Basically, the more often a bot from a given search engine comes back, the deeper it will spider your site.
It is quite easy to get a spider to come back to your site by using social bookmarking sites.
All a sitemap really does is show the search engine pages on your site that may only be deep linked. For instance, an article written two months ago would no longer be linked from the front page, so a spider may or may not find it.
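To give you a rough idea, a sitemap is just an XML file listing the URLs you want the bots to find. Here is a quick Python sketch that builds one from a list of article URLs; the URLs and output path are made-up examples, not Sott's real ones, so treat it as an illustration rather than something ready to drop in:

from datetime import date

# Hypothetical list of deep article URLs the front page no longer links to.
article_urls = [
    "https://www.example.com/articles/old-article-1.htm",
    "https://www.example.com/articles/old-article-2.htm",
]

# Build a minimal sitemap.xml following the sitemaps.org protocol.
entries = []
for url in article_urls:
    entries.append(
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

# Write it to the web root so spiders can fetch it at /sitemap.xml.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)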
However, remember that because Sott uses a lot of news from sources that published the information first, it is unlikely you will be ranked for those keywords unless you add enough commentary that the content becomes unique again.
Hope I did not muddy the water, but basically a sitemap can help get your site indexed more fully in the search engines, though it does not mean you will be ranked in the top 10 for those keywords.
If you do decide to make a sitemap for ranking purposes, I can get the links back to the site that are needed so the spiders will follow them, often using social bookmarking systems I already have set up.
You could also search your own raw server logs if you wanted to see which pages are currently getting ranked with no active effort, along with lots of other useful information.
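If you want a quick way to dig through the logs, here is a rough Python sketch. It assumes the log is in the usual Apache/nginx combined format, and the file path is just a placeholder; it counts which pages people land on from Google results, in other words pages that already rank for something:

import re
from collections import Counter

# Placeholder path; point this at your real raw access log.
LOG_PATH = "access.log"

# Combined log format: "METHOD /path HTTP/x.x" status bytes "referrer" "user-agent"
line_re = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = line_re.search(line)
        # Count only visits referred by a Google results page,
        # i.e. pages that are already ranking for something.
        if m and "google." in m.group("ref"):
            hits[m.group("path")] += 1

# Print the 20 most-visited pages coming in from Google.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")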
I think a sitemap would actually be more useful for humans to browse than for the bots, though; it would make it easier to view past articles.