Search engines crawl web pages by following links from one page to another. If pages on your site aren't well-linked, or your site has few links from other sites, then not all pages on your website may be visible in this normal crawling process.
XML Sitemaps inform search engines of the URLs available for crawling on a website, including URLs that may not be visible in the normal crawling process. You can also use Sitemaps to provide search engines with additional information about your pages, such as the last update date or how often the page is expected to change.
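A minimal XML Sitemap following the sitemaps.org protocol looks like the sketch below; the URLs and dates are placeholders, and the optional `<lastmod>` and `<changefreq>` elements carry the extra information mentioned above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>      <!-- optional: last update date -->
    <changefreq>weekly</changefreq>    <!-- optional: expected change rate -->
  </url>
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required; search engines treat `<lastmod>` and `<changefreq>` as hints rather than directives.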
An SEO strategist will check that your XML Sitemap is valid and only includes pages that you want to appear in search engine results. They will also conduct an audit to identify any discrepancies between your XML Sitemap and the robots.txt file and canonical tags on your website.
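One common discrepancy such an audit catches is a sitemap URL that robots.txt blocks from crawling. The illustrative robots.txt below (paths are hypothetical) would conflict with a sitemap that lists pages under `/drafts/`:

```text
# robots.txt at https://www.example.com/robots.txt
User-agent: *
Disallow: /drafts/        # any sitemap URL under /drafts/ conflicts with this rule

# Advertise the sitemap location to crawlers
Sitemap: https://www.example.com/sitemap.xml
```

Similarly, a page whose canonical tag points elsewhere should generally not be listed in the sitemap, since the sitemap would be signaling a URL the site itself says is not the preferred version.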