The mission of No More Homelessness (NMH) is to help every homeless person in the best way possible through its Experience Cloud site. NMH's site manager wants to set up search engine optimization (SEO) to ensure NMH's public Experience Cloud site is visible to search engines.
Which two practices should the site manager follow to ensure SEO is implemented successfully? Choose 2 answers
- A. Check whether a custom robots.txt file to control indexing has been created.
- B. Check whether the Experience Cloud site is public and activated.
- C. Check whether the SEO Institute has provided approval for the site with an end date.
- D. Check whether a manual sitemap refresh happens on the last day of every month.
Answer(s): A,B
Explanation:
A robots.txt file is a text file that tells web crawlers which pages or files they can or can't request from your site. It is used mainly to avoid overloading the site with requests; it is not a mechanism for keeping a page out of Google. You can create a custom robots.txt file for your Experience Cloud site to control how search engines index it.

To make the Experience Cloud site visible to search engines, you also need to make sure it is public and activated. A public site allows anyone on the internet to access it without logging in, and an activated site is live and ready for visitors.

The other options are not real requirements: there is no "SEO Institute" approval step, and Salesforce generates the sitemap for Experience Cloud sites automatically, so no manual refresh on the last day of every month is needed.
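For Experience Cloud sites, a custom robots.txt is typically served from a Visualforce page with a plain-text content type, which is then assigned to the site in the Digital Experiences settings. A minimal sketch is shown below; the disallowed path and sitemap URL are placeholders, not values from the question:

```
<apex:page contentType="text/plain" showHeader="false">
# Apply these rules to all crawlers
User-agent: *
# Keep a hypothetical members-only section out of the index
Disallow: /s/members-only/
# Allow everything else on the public site
Allow: /
# Point crawlers at the auto-generated sitemap (placeholder domain)
Sitemap: https://nmh.example.com/s/sitemap.xml
</apex:page>
```

Once the page is assigned to the site, crawlers requesting /robots.txt on the site's domain receive this content.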