Which out-of-the-box Salesforce B2B Commerce page can instruct web crawlers not to access specific Salesforce B2B Commerce pages?

A . CCCat?SiteMap
B . cc_RobotsTxt
C . CCSiteIndex
D . CCPage

Answer: B

Explanation:

The out-of-the-box Salesforce B2B Commerce page that can instruct web crawlers not to access specific Salesforce B2B Commerce pages is cc_RobotsTxt. This Visualforce page generates a robots.txt file, a plain-text file that tells web crawlers which pages or files they can or can't request from a site. The page uses the configuration settings CO.RobotsTxtAllow and CO.RobotsTxtDisallow to specify which paths are allowed or disallowed for web crawlers. For example, the directives User-agent: * and Disallow: /CCCart instruct all web crawlers not to access the CCCart page.
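For illustration, the generated robots.txt output might look like the following sketch. Only the /CCCart path comes from the example above; the other paths are hypothetical placeholders for whatever CO.RobotsTxtAllow and CO.RobotsTxtDisallow are configured to emit.

```
# Applies to all crawlers
User-agent: *
# Block the cart page (example from the explanation)
Disallow: /CCCart
# Hypothetical additional disallowed path
Disallow: /CCCheckout
# Everything else remains crawlable
Allow: /
```

Crawlers read this file from the site root (e.g. https://example.com/robots.txt) before requesting other pages, which is why Salesforce exposes it as a dedicated page rather than as part of the storefront templates.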

Salesforce Reference: B2B Commerce and D2C Commerce Developer Guide, Robots.txt File
