I'll also be adding insights into how this site was built, with code examples and thoughts on how those could be developed further.
Feel free to leave questions or comments at the bottom of each post - I just ask people to create an account to filter out the spammers. You won't receive any unsolicited communication or find your email sold to a marketing list.
Wagtail - Configure the robots.txt and Block Search Indexing (the correct way)
Rather than serving robots.txt as a static file, you can use Django/Wagtail templating to generate it dynamically. It's not the right place to block search engine crawlers, though; I'll show a method to apply that from your base template instead.
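As a taste of the approach, the file's content can be assembled in plain Python and returned from a view, so values like the sitemap URL always match the domain the site is served from. This is a minimal sketch, not the post's exact code; the helper name build_robots_txt and the disallowed paths are illustrative assumptions.

```python
def build_robots_txt(scheme: str, host: str) -> str:
    """Assemble robots.txt content for the requesting host.

    Generating the file per request keeps the Sitemap line in step
    with whichever domain the site is currently served from.
    """
    lines = [
        "User-agent: *",
        "Disallow: /admin/",  # illustrative: hide the Wagtail admin
        "",
        f"Sitemap: {scheme}://{host}/sitemap.xml",
    ]
    return "\n".join(lines) + "\n"


# In a Django project this would be wrapped as a view, roughly:
#   def robots_txt(request):
#       content = build_robots_txt(request.scheme, request.get_host())
#       return HttpResponse(content, content_type="text/plain")
```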
Configuring a Dynamic Sitemap on Wagtail
A sitemap lists a website’s most important pages, making sure search engines can find and crawl them. It's important to keep your sitemap up to date for optimal SEO. With a quick bit of coding, you can set your sitemap to be created dynamically on demand, ensuring it always reflects the latest content. There's another tweak needed for routable pages and multi-lingual sites using wagtail-localize.
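Wagtail already ships a sitemap view in wagtail.contrib.sitemaps, so the basic wiring is short. This is a sketch of the standard setup rather than the post's exact code; the routable-pages and wagtail-localize tweaks come on top of it.

```python
# urls.py — serve a dynamically generated sitemap.xml
# (standard Wagtail wiring; "django.contrib.sitemaps" and
# "wagtail.contrib.sitemaps" also need to be in INSTALLED_APPS)
from django.urls import path
from wagtail.contrib.sitemaps.views import sitemap

urlpatterns = [
    path("sitemap.xml", sitemap),
    # ... your other routes ...
]
```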
Making Wagtail pages more SEO friendly with Wagtail Metadata
Wagtail pages are great for creating a lot of rich content straight out of the box, but for SEO optimization, they need some tweaking.
Here, I subclass the Page model with some help from the wagtail-metadata plug-in.
This subclassed model becomes the base for all site pages and holds the data for Open Graph metadata, Twitter cards, page descriptions, and so on.
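The core of the pattern looks roughly like this; a sketch assuming a recent Wagtail (where Page lives in wagtail.models) and wagtail-metadata's MetadataPageMixin, with the class names BasePage and BlogPage as illustrative placeholders.

```python
# models.py — a shared abstract base page carrying SEO metadata
from wagtail.models import Page
from wagtailmetadata.models import MetadataPageMixin


class BasePage(MetadataPageMixin, Page):
    """All site pages inherit search/social metadata from this base."""

    class Meta:
        abstract = True


class BlogPage(BasePage):
    # Concrete pages now carry the mixin's fields (search image,
    # description, etc.) alongside their own content fields.
    ...
```

In templates, the metadata is then rendered with the plug-in's template tags ({% load wagtailmetadata_tags %} and {% meta_tags %}) in the page head.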
Wagtail is a leading open source CMS built on Python and the Django framework. Tens of thousands of organisations worldwide, including Google, Mozilla, NASA, and the British NHS, now use Wagtail. If you're new to Wagtail and looking to learn as a developer, I've gathered some great learning resources here to get you started.