Sitemaps, keywords, metadata, and other techniques collectively known as search engine optimization may once have served a purpose, but in the 21st century indexing technology is advancing far faster than any of the tricks designed to help you rank better.
I have always been of the mindset that if your site is such a mess that a Google bot can only find and properly index a page by reading helpful metadata or following a link in your sitemap, then the effort you are putting into SEO would be much better spent fixing the mess that created the indexing problem in the first place.
Your first and primary audience for website improvements should always be your users. A search engine's goals should be the same as yours: to deliver the best content to your users as quickly as possible. If you don't care about providing quality content to your users, it is probably time to look into a different career path.
Instead of worrying about feeding additional, extraneous information to bots, you should be removing barriers for *real* users.
What do these kinds of barriers typically look like?
- Slow page loads with too many unnecessary images.
- Large HTML templates full of extraneous elements, a major contributor to the slow page loads above.
- Disorganized heading structure. Is your `<h1>` actually the top-level element for the page?
- Ignoring accessibility considerations. These techniques help users while providing extra metadata for search engines.
- Failing to properly secure the connection between the server and the user.
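The heading-structure point above is easy to check mechanically. As a minimal sketch using only Python's standard library, the class below collects heading tags in document order and flags two common problems: missing (or duplicated) `<h1>` elements and skipped heading levels. The class name and the sample markup are hypothetical, not from any particular tool.

```python
# Minimal heading-structure checker built on the stdlib HTML parser.
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    """Collects heading levels in document order and reports structure issues."""

    def __init__(self):
        super().__init__()
        self.levels = []  # e.g. [1, 2, 2, 3] for <h1>, <h2>, <h2>, <h3>

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag names, so "h1".."h6" match directly.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        if self.levels.count(1) != 1:
            issues.append("expected exactly one <h1>")
        # A heading may step down any number of levels, but should only
        # step up one level at a time.
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"heading jumps from <h{prev}> to <h{cur}>")
        return issues

checker = HeadingChecker()
checker.feed("<body><h1>Title</h1><h3>Skipped a level</h3></body>")
print(checker.problems())  # → ['heading jumps from <h1> to <h3>']
```

Running a quick check like this in CI costs almost nothing, and a sane heading outline helps screen-reader users and crawlers alike.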
**Do I need a sitemap?**
Not if your navigation is well thought out.

**Do I need to agonize over my metadata?**
Not if the content is written for humans.

**Do I need to mark up all my content with the structured data format of the week?**
Only if you are a masochist.
The internet is always going to be a bit of a chaotic mess, with sites built and generated by all sorts of different tools. Luckily for site owners, Google has some of the best software engineers in the world tasked with digging through this mess to surface the most relevant content.

Quality content will always earn traffic and better search placement; low-quality pages won't. And for every trick someone finds to push low-quality content up the rankings, a more sophisticated countermeasure will be built to keep that content away from users who don't want to see it.