Sitemap & Robots

Table of Contents
Your Sitemap (Fully Automatic)
What Levo does for you
Viewing your sitemaps
Submitting to search engines
Robots.txt Settings
1. Allowed Sources
2. Disallowed Sources
3. External Sitemap URLs
Saving your changes
What Your robots.txt File Actually Looks Like
Our Recommendation

Your website needs two things to show up on search engines like Google: a sitemap that lists every page on your site, and a robots.txt file that tells search engines which pages they are allowed to visit. Levo handles both automatically — but this tab lets you fine-tune the rules when needed.

How to get here: Open your site in Levo → Site Settings → click the Sitemap & Robots tab.


Your Sitemap (Fully Automatic)

A sitemap is exactly what it sounds like — a file that lists every page on your website so search engines can find them.
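For reference, here is a sketch of what a minimal sitemap file looks like, using the standard sitemap XML format with one placeholder page (the URL and date are illustrative, and the file Levo generates may include additional fields):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per published page -->
    <loc>https://yourdomain.com/about</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>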

What Levo does for you

  • Publish a page → Levo adds it to your sitemap automatically.

  • Unpublish a page → Levo removes it from your sitemap automatically.

  • Static pages (like Home, About, Contact) are grouped into one shared sitemap.

  • Collection pages (like Blog posts or Events) each get their own sitemap, so the files stay small even if you have thousands of entries.

You never need to create, upload, or maintain a sitemap yourself.

Viewing your sitemaps

You can see all sitemaps at any time. From the Pages list in your dashboard, click the three-dot menu (⋮) next to any page and select View Sitemap. A popup will show you:

  • Name: the identifier for that sitemap file

  • URLs: the number of pages included in it

  • Updated: when the sitemap was last regenerated

  • Link: a clickable link to the actual sitemap file

Submitting to search engines

Your main sitemap lives at:

https://yourdomain.com/sitemap.xml

Submit this URL once to Google Search Console and Bing Webmaster Tools. After that, search engines will re-read your sitemap on their own — you do not need to resubmit every time you publish a new page.
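For example, in Google Search Console the Sitemaps screen (under Indexing) lets you paste the URL and click Submit; Bing Webmaster Tools has a similar Sitemaps page. The exact menu labels may change over time, but both tools only need the URL once.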


Robots.txt Settings

Your robots.txt file is a set of instructions for search engine crawlers. It tells them which parts of your site they can explore and which parts they should skip. When a search engine visits your website, the robots.txt file is one of the very first things it checks.

The Sitemap & Robots tab has three fields you can configure:

1. Allowed Sources

Label in Levo: Allowed sources

This is where you list paths that search engines are explicitly allowed to crawl.

  • By default, this is set to /, which means "allow everything."

  • You typically only need to add entries here if you have blocked a broad path (see below) but want to make an exception for one specific page within it.

Example: You blocked /internal/, but you want Google to still find /internal/careers. Add /internal/careers as an allowed source.

Each path must start with / (for example: /blog/public-posts). Click Add Source to add more paths.
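To make the careers example concrete, here is a sketch of the rules that combination would produce in the generated file (/internal/ and /internal/careers are the illustrative paths from above):

User-agent: *
Allow: /internal/careers
Disallow: /internal/

Crawlers that follow the modern robots.txt standard apply the most specific (longest) matching rule, so /internal/careers stays crawlable while everything else under /internal/ is skipped.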

When to use this:

  • You have added a broad disallow rule (like /internal/) but need to make an exception for one or two pages inside that folder.

  • You want to be explicit about which sections of your site should be crawled, for organizational clarity.

When NOT to use this:

  • Your site is fully public and you have not added any disallow rules. The default / already covers everything — adding more allowed paths does nothing extra.

  • You want to hide a page from Google. This field is for allowing access, not restricting it.

2. Disallowed Sources

Label in Levo: Disallowed sources

This tells search engines not to crawl certain sections of your site. Useful for things like:

  • /thank-you/ — confirmation pages after a form submission

  • /internal/ — pages meant only for your team

  • /staging/ — work-in-progress pages you are still testing

Each path must start with /. Click Add Source to add more paths.
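Note that paths match by prefix. A single rule such as:

Disallow: /internal/

blocks /internal/careers and /internal/docs/handbook (both start with /internal/), but not a page like /internal-news, because that URL does not begin with the full /internal/ prefix, trailing slash included.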

When to use this:

  • You have thank-you pages, confirmation pages, or form-success pages that provide no value in search results.

  • You have internal or admin-only pages that are not meant for the public.

  • You are testing work-in-progress pages under a /staging/ or /draft/ path and do not want them indexed yet.

When NOT to use this:

  • You want to completely hide a page from search results. Disallowing only stops crawling — if another site links to your page, Google can still show the URL (without a description). For full removal, use the Disable Indexing toggle in that page's Page Settings → SEO tab instead.

  • Your pages contain sensitive or confidential information. Robots.txt is a request, not a security wall. Anyone can still visit the URL directly. For truly private pages, use Levo's page access controls.

  • You are unsure which paths to block. When in doubt, leave this empty — blocking the wrong path can accidentally hide important pages from Google.

[!WARNING] Disallowing a path stops Google from crawling it, but it does not guarantee the page is invisible. If another website links to that page, Google may still show the URL in search results (just without a page description). To fully hide a page from search results, use the Disable Indexing toggle inside that page's Page Settings → SEO tab instead.
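Levo's exact mechanism is not shown in this tab, but a Disable Indexing toggle like this conventionally works by adding a robots meta tag to the page's HTML, which tells search engines to drop the page from results even when they can reach it:

<meta name="robots" content="noindex">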

3. External Sitemap URLs

Label in Levo: External Sitemap URLs

Levo already includes your main sitemap in the robots.txt file automatically. However, if you manage content on other platforms outside of Levo — and those platforms have their own sitemaps — you can add those URLs here so search engines can discover them too.

Each entry must be a full URL (for example: https://shop.yourdomain.com/sitemap.xml). Click Add URL to add more.
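With one external sitemap added, the sitemap lines of the generated file would look something like this (shop.yourdomain.com is an illustrative subdomain):

Sitemap: https://yourdomain.com/sitemap.xml
Sitemap: https://shop.yourdomain.com/sitemap.xml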

When to use this:

  • You run an online store on Shopify, a help center on Notion or GitBook, or a subdomain blog on WordPress — and you want Google to discover those pages through your main domain's robots.txt.

  • You recently migrated to Levo but still have legacy content hosted elsewhere that search engines should continue indexing.

When NOT to use this:

  • All your pages live inside Levo. Your Levo sitemap is already included automatically — there is nothing to add.

  • You are adding your own Levo sitemap URL here. Levo already does this for you; adding it again would just create a duplicate entry.

Saving your changes

After editing any of the three fields above, click Save Changes. The button stays disabled until you make a change.


What Your robots.txt File Actually Looks Like

Behind the scenes, Levo generates a standard robots.txt file from your settings. If you set up a few rules, the output would look something like this:

User-agent: *
Allow: /
Disallow: /thank-you/
Disallow: /internal/

Sitemap: https://yourdomain.com/sitemap.xml

  • User-agent: * means these rules apply to all search engines, not just Google.

  • The Sitemap: line is included automatically — you do not need to add it yourself.
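You can also open the live file at https://yourdomain.com/robots.txt at any time to confirm that your saved rules are in effect (robots.txt always lives at the root of a domain).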


Our Recommendation

If you run a standard, public website with no sensitive or private pages, you do not need to change anything here. Levo's default settings — allow everything, disallow nothing — are already optimized for search engine visibility. Most users will never need to touch this tab.

