At its heart, Drupal is a way for people to build great websites in a short period of time. From an SEO standpoint, Drupal's clean, open source code makes it a very flexible and powerful content management system (CMS), but it takes some work to configure it just right for the search engines.
With this tutorial and basic knowledge of Drupal, you can build a perfectly optimized website. If you want a significant advantage over competitors who are not using Drupal, and want to maximize the return on investment of your Drupal website, this tutorial is for you.
The right tools make any project go smoothly. When you decide to SEO your Drupal website, you’ll need the following:
The latest release of the open-source Drupal CMS, Drupal 6, can be downloaded from http://drupal.org.
A module is a community-created plugin that enhances Drupal’s core functionality. From XML sitemaps to better page titles, modules are crucial to the search engine optimization of any Drupal site.
How to Install 99% of Drupal Modules:
1. Download the module's tar.gz file from its project page on http://drupal.org.
2. Extract the module into the sites/all/modules directory of your Drupal installation.
3. Go to Administer | Site building | Modules, check the box next to the new module, and click Save configuration.
Now that you know how to install modules, there are several that you’ll need in order to optimize your Drupal site:
The most crucial part of an SEO project is finding the right keywords. If you pick the wrong keywords, you’ll spend months working only to find that there is nobody who wants to buy your product. An extra few hours researching your keywords in the beginning will help you avoid this fate.
The Google Keyword Planner shows how many people searched for a particular keyword and related keywords in the last month or so. At its most basic, you type in the keyword you’re interested in and it gives you the quantity of searches for that keyword and derivatives of that keyword.
Once you have used the keyword tools, you should have a list of keywords that will be the focus of your SEO campaign.
The first step in convincing Google that you are the best is to tweak your site so that the keywords show up in all the right places. These changes to your site for the search engines are collectively called On-Page Optimization.
You will need to use your keywords on your site in the following ways:
A page title is a line of text in the HTML of a web page (the <title> tag), summarizing what the web page is about. Using the Page Title module, you have full control over inserting keywords into your page titles throughout your site.
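Rendered in HTML, an optimized page title might look like this (the keyword phrase and site name are placeholder values):

```html
<title>Organic Coffee Beans | yourDrupalsite.com</title>
```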
In the admin settings, under the Name field, enter what you want your site to be called.
A link title is text that describes a link and is used by search engines to help determine what the linked-to page is about. This is a great place for your keywords. In plain HTML, a link title appears as an attribute of the <a> tag, as follows:
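(The URL, title, and anchor text below are placeholder values.)

```html
<a href="http://www.yourDrupalsite.com/organic-coffee-beans" title="Organic Coffee Beans">Browse our organic coffee beans</a>
```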
The URL (Uniform Resource Locator) is the address used by a browser to locate a certain piece of content. Out of the box, Drupal's URLs are dynamic, which means they contain strange characters that are not search engine-friendly. To turn on clean URLs in Drupal, point your browser to www.yourDrupalsite.com/admin/settings/clean-urls or go to your admin screen and click Clean URLs. Once you have turned on clean URLs in Drupal, optimize your URLs with keywords using the Path module, recommended in step 1.
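For example, a dynamic Drupal URL and its clean equivalent might look like this (node 83 and the alias are hypothetical):

```
Dynamic:     http://www.yourDrupalsite.com/?q=node/83
Clean:       http://www.yourDrupalsite.com/node/83
Path alias:  http://www.yourDrupalsite.com/organic-coffee-beans
```

The Path module provides the final, keyword-rich form.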
Headings are visible page titles that appear large and bold at the top of a web site or page. These are terrific indicators of what the page is about and should be integral to the site structure. Identified with <H1> and <H2>, all the way down to <H6>, these header tags communicate exactly what the page is about. Use keywords in your headings, inside the header tags, for an optimized page heading.
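In HTML, an optimized heading structure might look like this (the keyword phrases are placeholders):

```html
<h1>Organic Coffee Beans</h1>
<h2>Why Buy Organic Coffee?</h2>
```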
The menu of your site can make a big difference to your site's indexability and standing in the search engines. You need to make sure that the keywords in your site's internal navigation give insight into the subject of your site. To do this, go to Administer | Site building | Menus and rename each menu item so that it contains the keywords for the page it links to.
Meta tags are pieces of text in the header of your website that tell search engine spiders about your site, but are invisible to your human visitors. While search engines now ignore meta tags as a ranking mechanism, they do take them into account for other things.
There are about a dozen different meta tags that you can use but here are the main ones you should care about:
In Drupal, it’s very easy to set meta tags for each node thanks to the Meta tags module. This handy module gives you some extra fields on each node that you create so that you can put in a description, keywords, and other metadata as you see fit.
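Rendered in the page's HTML head, the result might look like this (the content values are placeholders):

```html
<head>
  <title>Organic Coffee Beans | yourDrupalsite.com</title>
  <meta name="description" content="Fair-trade, organic coffee beans roasted fresh and shipped to your door.">
  <meta name="keywords" content="organic coffee, coffee beans, fair trade coffee">
</head>
```

The description is worth particular care, as search engines often use it as the snippet shown under your listing in the results.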
As smart as Google’s spiders are, it’s possible for them to miss pages on your site. The solution is a sitemap. Using a sitemap helps the search engine crawlers find more of your pages. In my experience, submitting an XML sitemap to Google will greatly increase the number of pages that show up when you do a site: search.
The XML Sitemap module creates a sitemap that conforms to the sitemap.org specification. Carry out the following steps to set up the XML Sitemap module:
1. Download the XML Sitemap module and install it just like a normal Drupal module.
2. Go to www.yourDrupalsite.com/admin/settings/xmlsitemap or go to your admin screen and click on Administer | Site Configuration | XML sitemap link.
3. Click on Settings and you’ll see a few options:
Minimum sitemap lifetime: This determines the minimum amount of time that the module will wait before regenerating the sitemap. Use this feature if you have an enormous sitemap that is taking up too many server resources. Most sites should leave this set on No minimum.
Include a stylesheet in the sitemaps: The module will include a simple stylesheet with the generated sitemap. It’s not necessary for the search engines, but it is very helpful for troubleshooting or if any humans are going to view the sitemap. Leave it checked.
Generate sitemaps for the following languages: In the future, this option will allow you to specify sitemaps for different languages. This is very important for international sites that want to show up in localized search engines. For now, English is the only option and should remain checked.
4. Click the Advanced settings drop-down and you’ll see several additional options.
Number of links in each sitemap page allows you to specify how many links to pages on your web site will be in each sitemap. Leave it on Automatic unless you are having trouble with the search engines accepting the sitemap.
Maximum number of sitemap links to process at once sets the number of additional links that the module will add to your sitemap each time the cron runs. This highlights one of the biggest differences between the new XML sitemap and the old one. The new sitemap only processes new nodes and updates the existing sitemap instead of reprocessing every time the sitemap is accessed. Leave this setting alone unless you notice that cron is timing out.
Sitemap cache directory allows you to set where the sitemap data will be stored. This is data that is not shown to the search engines or users; it’s only used by the module.
Base URL is the base URL of your site and generally should be left as it is.
5. Click on the Front page drop-down and set these options:
Front page priority: 1.0 is the highest setting you can give a page in the XML sitemap. On most web sites, the front page is the single most important part of the site, so this setting should probably be left at 1.0.
Front page change frequency: Tells the search engines how often they should revisit your front page. Adjust this setting to reflect how often the front page of your site changes.
6. Open the Content types drop-down:
Here, you will see each Content type listed separately. You probably want to leave these settings alone so that all your content shows up in the sitemap.
Default priority allows you to set the default for each node that you create of that content type. The default is usually 0.5, but you can adjust it if you want certain pages to have a higher or lower priority.
Click on Save content type.
Repeat for each content type that you wish to change.
7. Click Save configuration.
8. Now, you need to run cron. Cron is a scheduled task that takes care of many maintenance jobs in Drupal, including populating the XML sitemap. To run cron manually, point your browser to http://www.yourDrupalsite.com/cron.php and wait until the page stops loading. You will not receive any indication that it’s complete other than your browser finishing loading the page.
9. Submit your XML sitemap to Google.
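Rather than loading cron.php by hand every time, many Drupal administrators schedule it on the server with a crontab entry (a sketch; adjust the URL and frequency for your setup):

```shell
# Run Drupal's cron tasks at the top of every hour.
# wget fetches cron.php quietly, discards the output, and tries only once.
0 * * * * wget -O - -q -t 1 http://www.yourDrupalsite.com/cron.php
```

This keeps the sitemap, along with Drupal's other maintenance tasks, up to date automatically.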
The robots.txt file is a standard file that sits at the root level of your web site and carries instructions for the robots and spiders that may crawl your site. Drupal 6 ships with a default robots.txt file that does an adequate job.
1. Open your browser and visit the following link: www.yourDrupalsite.com/robots.txt
2. Using your FTP program or command line editor, navigate to the top-level of your Drupal web site and locate the robots.txt file.
3. Make a backup of the file.
4. Open the robots.txt file for editing. If necessary, download the file and open it in a local text editor tool.
5. Most directives in the robots.txt file are grouped under a User-agent: line. If you are going to give different instructions to different engines, be sure to place them above the User-agent: * section, as some search engines will only read the directives for * if their specific instructions appear after that section.
6. Add the lines you want.
7. Save your robots.txt file, uploading it if necessary, replacing the existing file (you backed it up, didn’t you?).
8. Point your browser to www.yourDrupalsite.com/robots.txt and double-check that your changes are in effect. You may need to do a refresh on your browser to see the changes.
In many cases, you should tweak your robots.txt file for optimal SEO results. Here are several changes you can make to the file to meet your needs in certain situations:
You are developing a new site and you don’t want it to show up in any search engine until you’re ready to launch it. Block all crawlers by adding Disallow: / just after the User-agent: * line:
User-agent: *
Disallow: /
Say you’re running a very slow server and you don’t want the crawlers to slow your site down for other users. Adjust the Crawl-delay by changing it from 10 to 20.
If you’re on a super-fast server (and you should be, right?) you can tell the bots to bring it on! Change the Crawl-delay to 5 or even 1 second. Monitor your server closely for a few days to make sure it can handle the extra load.
Say you’re running a site which allows people to upload their own images but you don’t necessarily want those images to show up in Google. Add these lines at the bottom of your robots.txt file:
User-agent: Googlebot-Image
Disallow: /*.jpg$
Disallow: /*.gif$
Disallow: /*.png$
If all of the files were in the /files/users/images/ directory, you could do this:
User-agent: Googlebot-Image
Disallow: /files/users/images/
Say you noticed in your server logs that there was a bad robot out there that was scraping all your content. You can try to prevent this by adding this to the bottom of your robots.txt file:
User-agent: Bad-Robot
Disallow: /
If you have installed the XML Sitemap module, then you’ve got a great tool that you should send out to all of the search engines. However, it’s tedious to go to each engine’s site and submit your sitemap URL. Instead, you can add one simple line to the robots.txt file.
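The addition is the Sitemap: directive from the sitemaps.org autodiscovery convention (the exact sitemap URL depends on your XML Sitemap module settings; /sitemap.xml is a common default):

```
Sitemap: http://www.yourDrupalsite.com/sitemap.xml
```

Crawlers that support autodiscovery will pick up the sitemap on their next visit, with no manual submission required.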