By forcing bots to use compatibility mode instead of the web app, we can ensure that all pages are crawlable by bots such as Google, so we do not lose any of the SEO benefits that come with our site.
I envision that this would work off an allow-list of user agents, and that the list would be filterable. This would not only let us ship a default list of agents, but also let a site add more agents to the list if it needs to.
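A minimal sketch of what that could look like, assuming WordPress-style filters; the default agent list and the `bot_compatibility_user_agents` filter name are placeholders for illustration, not an existing API:

```php
<?php
// Hypothetical default allow-list of crawler user-agent substrings.
// The filter name and the defaults below are illustrative only.
function get_bot_user_agents(): array {
    $defaults = [
        'Googlebot',
        'Bingbot',
        'DuckDuckBot',
        'Slurp',       // Yahoo
        'Baiduspider',
    ];

    // Let individual sites extend or replace the default list via a filter.
    return apply_filters( 'bot_compatibility_user_agents', $defaults );
}

function request_is_from_bot(): bool {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';

    foreach ( get_bot_user_agents() as $agent ) {
        if ( stripos( $user_agent, $agent ) !== false ) {
            return true;
        }
    }

    return false;
}
```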
In #6, use_compatibility_mode was made extendable, which makes it possible to add this as a decision list item. I'm inclined to make this a separate Composer package that the boilerplate uses by default. That would make it possible to remove it if someone wanted to do that for some reason.
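If use_compatibility_mode is filterable as described, a separate package could register the bot check as a decision list item roughly like this; the boolean filter signature is an assumption, and request_is_from_bot() is the hypothetical helper sketched above:

```php
<?php
// Assumed: use_compatibility_mode is a boolean WordPress-style filter,
// and request_is_from_bot() (see the earlier sketch) is available.
add_filter( 'use_compatibility_mode', function ( bool $use_compatibility_mode ): bool {
    // Force compatibility mode whenever a known crawler is detected;
    // otherwise leave the existing decision untouched.
    if ( request_is_from_bot() ) {
        return true;
    }

    return $use_compatibility_mode;
} );
```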
Related: #6