The idea is to have Aurelia render the meta tags both on the client and on the server.
I don’t know yet how it will pan out for crawlers that execute JavaScript… will they rescan the meta tags after navigating via JS?
That’s a good point about them possibly crawling through the JS and checking the meta tags. I might have to add code to update the meta tags dynamically as well (not hard).
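A minimal sketch of what that dynamic update could look like. The helper names (`buildMetaTags`, `applyMetaTags`) and the tag choices are my own assumptions, not part of Aurelia's API; the idea is just to recompute the meta values per route and patch the `<head>` after each client-side navigation.

```typescript
// Hypothetical helpers -- names and shape are assumptions, not Aurelia API.
interface MetaTag {
  name: string;
  content: string;
}

// Compute the per-route meta values (pure, so it works on server and client).
function buildMetaTags(title: string, description: string): MetaTag[] {
  return [
    { name: "description", content: description },
    { name: "keywords", content: title.toLowerCase().split(" ").join(", ") },
  ];
}

// Patch the <head> after a client-side navigation, e.g. from a router hook.
// Guarded so it is a no-op during SSR, where the tags are rendered server-side.
function applyMetaTags(tags: MetaTag[]): void {
  if (typeof document === "undefined") return; // running on the server
  for (const tag of tags) {
    let el = document.querySelector(`meta[name="${tag.name}"]`);
    if (!el) {
      el = document.createElement("meta");
      el.setAttribute("name", tag.name);
      document.head.appendChild(el);
    }
    el.setAttribute("content", tag.content);
  }
}
```

In an Aurelia app you would call something like `applyMetaTags(buildMetaTags(...))` from a router lifecycle hook so every navigation refreshes the tags before a JS-executing crawler re-reads them.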
I ran a quick Google PageSpeed Insights check, and it looks like they think it runs okay. I also run it behind an nginx reverse proxy with gzip, 304 Not Modified responses, etc. enabled.
Page speed is important for SEO because Google ranks pages more highly if they load faster, work better on mobile devices, and so on. They’re pushing PWAs (progressive web apps), so they may even rank sites running service workers more highly. The more optimizations, the better. Meta tags for description and keywords are important, and so are Open Graph tags for Facebook, Twitter, Discord, Rocket.Chat, and any other chat/posting program that reads Open Graph meta tags to build website previews.
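For reference, a sketch of the tags involved. All the values here are placeholders; note that Open Graph tags use the `property` attribute rather than `name`, which matters if you update them dynamically.

```html
<!-- Standard SEO meta tags -->
<meta name="description" content="Short summary of the page">
<meta name="keywords" content="aurelia, ssr, seo">

<!-- Open Graph tags (property attribute, not name) used for link previews -->
<meta property="og:title" content="Page title">
<meta property="og:description" content="Short summary of the page">
<meta property="og:image" content="https://example.com/preview.png">
<meta property="og:url" content="https://example.com/page">
```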
Apart from the minification options in the webpack config, I also run all of the images through compressor.io when I save them, to improve speed.
May I ask, if you have time, could you publish a post about how to make Aurelia apps SEO-friendly both in the browser and with SSR?
I believe your knowledge and experience could help everyone.
I have a very low level of knowledge in this field, and I’m confident this information would help others too.
I think so, yes. Besides metadata, you will need to provide an href on <a> elements in addition to route-href or click.delegate() so that Googlebot can follow the links on your site.
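A small illustration of that point in an Aurelia template. The handler name `open(post)` and the URL shape are made up for the example; the point is only that a link driven purely by `click.delegate()` gives the crawler no URL, while adding a real `href` does.

```html
<!-- click.delegate alone: nothing for Googlebot to follow -->
<a click.delegate="open(post)">Read more</a>

<!-- with a real href, crawlers can follow the link; the router
     can still intercept the click for client-side navigation -->
<a href="/posts/${post.id}" click.delegate="open(post)">Read more</a>
```

(`route-href` attributes generate an `href` for you, so links built that way are already followable; it is the delegate-only links that need attention.)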