
Angular-SEO

SEO for AngularJS apps made easy. Based on PhantomJS and yearofmoo's article.

This version includes an updated, detailed Nginx configuration that passes the request's Host header on to PhantomJS, so a single PhantomJS instance can serve snapshots for multiple static sites.

Note: it can also serve a single website if a URL prefix is passed in the start parameters.

Requirements

You will need PhantomJS to make this work, as it will render the application to HTML.
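PhantomJS can be installed from phantomjs.org, or via npm if you prefer; the package name below is an assumption about your setup, not something this project requires:

$ npm install -g phantomjs-prebuilt
$ phantomjs --version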

How to use

The solution is made of 3 parts:

  • a small modification of your static HTML file
  • an AngularJS module that you have to include and call
  • a PhantomJS script

Modifying your static HTML

Just add this to your <head> to enable AJAX indexing by the crawlers.

    <meta name="fragment" content="!" />

AngularJS Module

Just include angular-seo.js and then add the seo module to your app:

angular.module('app', ['ng', 'seo']);

If you are using RequireJS, the script will detect it and define itself automatically, but you will need an angular shim defined, as angular-seo requires it:

requirejs.config({
    paths: {
        angular: 'http://cdnjs.cloudflare.com/ajax/libs/angular.js/1.0.3/angular.min',
    },
    shim: {
        angular: {
            exports: 'angular'
        }
    }
});
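Once the shim is in place, angular-seo can be loaded like any other AMD dependency and the app bootstrapped manually. A minimal sketch, assuming angular-seo.js is available under a path named 'angular-seo' (the path name and lib/ location are illustrative, not required by the library):

requirejs.config({
    paths: {
        angular: 'http://cdnjs.cloudflare.com/ajax/libs/angular.js/1.0.3/angular.min',
        'angular-seo': 'lib/angular-seo'   // assumed location of angular-seo.js
    },
    shim: {
        angular: { exports: 'angular' }
    }
});

require(['angular', 'angular-seo'], function (angular) {
    // The seo module has registered itself by now, so it can be listed as a dependency
    angular.module('app', ['ng', 'seo']);
    angular.bootstrap(document, ['app']);
});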

Then you must call $scope.htmlReady() when you consider the page complete. This is necessary because of the asynchronous nature of AngularJS (for example, pending AJAX calls).

function MyCtrl($scope) {
    // Items stands in for any asynchronous data service (e.g. one built on $resource);
    // signal that the page can be captured once the data has arrived.
    Items.query({}, function(items) {
        $scope.items = items;
        $scope.htmlReady();
    });
}

And that's all there is to do on the app side.

PhantomJS Module

For the app to be properly rendered, you will need to run angular-seo-server.js with PhantomJS. Make sure to disable caching.

To serve snapshots of a single website, use:

$ phantomjs --disk-cache=no angular-seo-server.js [port] [URL prefix]

To serve multiple websites using a single instance of PhantomJS, just don't pass a URL prefix:

$ phantomjs --disk-cache=no angular-seo-server.js [port]

The URL prefix is the URL that will be prepended to the path the crawlers try to fetch.

Some examples:

$ phantomjs --disk-cache=no angular-seo-server.js 8888 http://localhost:8000/myapp
$ phantomjs --disk-cache=no angular-seo-server.js 8888 file:///path/to/index.html

Testing the setup

Google and Bing replace #! (hashbang) with ?_escaped_fragment_=, so http://localhost/app.html#!/route becomes http://localhost/app.html?_escaped_fragment_=/route.

So say your app is running on http://localhost:8000/index.html (it works with file:// URLs too). First, run PhantomJS:

$ phantomjs --disk-cache=no angular-seo-server.js 8888 http://localhost:8000/index.html
Listening on 8888...
Press Ctrl+C to stop.

Then try with cURL:

$ curl 'http://localhost:8888/?_escaped_fragment_=/route'

You should then have a complete, rendered HTML output.
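For comparison, fetching the app directly (without PhantomJS) only returns the empty Angular templates; the snapshot from port 8888 should contain the markup filled in by your controllers. A quick check, assuming the setup above:

$ curl 'http://localhost:8000/index.html'                    # raw page: templates still empty
$ curl 'http://localhost:8888/?_escaped_fragment_=/route'    # snapshot: rendered HTML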

Running behind Nginx (or another web server)

Of course you don't want regular users to see this, only crawlers. To detect them, just look for _escaped_fragment_ in the query arguments.

For instance with Nginx:

# Production config
server {
  server_name <%= server_name %> default;
  listen <%= server_port %>;
   
  location / {
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header Host $http_host;

    # If not a search engine, serve the static files
    root "<%= base_dir %>";
    
    if ($args ~ _escaped_fragment_) {
      proxy_pass http://localhost:8888;
      break;
    }

    try_files $uri $uri/ /index.html;
  }
}
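
With this in place, a request carrying the _escaped_fragment_ parameter is answered by PhantomJS, while ordinary requests fall through to try_files. A quick check against the proxy (the hostname is a placeholder for your server_name):

$ curl 'http://example.com/?_escaped_fragment_=/route'   # proxied to PhantomJS, returns the rendered snapshot
$ curl 'http://example.com/route'                        # served from the static root as usual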