Blog

SEO Key Metrics and Factors for implementing an SEO quick check tool

If you are looking for a way to implement a simple but useful SEO quick check tool, here are some tips on the key metrics and factors required to get you started. The SEO quick check tool itself is based on PHP to provide a proof of concept that can easily be adapted and extended. Let’s start with a quick review of the SEO key metrics and factors.

SEO Key Metrics and Factors

Most of the work related to SEO happens at the HTML code level. Thus, inspecting the HTML code via its DOM tree plays an important part when conducting and evaluating SEO checks. A few of the key metrics and factors relevant to SEO are:

  1. Title
  2. Meta-description
  3. Meta-keywords (hardly relevant anymore, but included for the sake of completeness)
  4. OpenGraph Meta-tags (as alternative or addition to traditional meta-tags)
  5. Additional general Meta-tags (locale, Google webmaster tools verification, etc.)
  6. Headers <h*> and their ordering
  7. Alternate text attributes for images
  8. Microdata

Based on the underlying HTML code the following metrics can be calculated (a small calculation sketch follows the list):

  1. Length and “quality” of data provided
  2. Data volume
  3. Text to HTML ratio
  4. Loading time
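
To make these metrics concrete, here is a minimal sketch of how data volume, text to HTML ratio and loading time could be derived. It assumes that $html holds the raw page markup and $duration the measured loading time in seconds; both are hypothetical names that will be filled by the fetching code shown later.

$dataVolume = strlen($html);             // data volume in bytes
$textOnly = trim(strip_tags($html));     // rough plain-text extraction
$textToHtmlRatio = $dataVolume > 0
  ? round(strlen($textOnly) / $dataVolume * 100, 2)
  : 0;                                   // percentage of visible text
$loadingTime = round($duration, 2);      // loading time in seconds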

Apart from these core metrics make sure that the general syntax is correct and matches the W3C standards:

  1. W3C validation

You should even go one step further and validate against the Web Content Accessibility Guidelines (WCAG):

  1. WCAG validation (level A-AAA)

In addition to the HTML generated, make sure to provide search engines with enough information on the pages available to be indexed and those that should be left aside, e.g. by providing an XML sitemap and a robots.txt file:

  1. XML sitemap
  2. robots.txt

The XML sitemap can either be a sitemap index consisting of multiple sitemaps, where each one for instance refers to a specific page type (posts vs. pages), or a simple list of URLs. Link metrics in turn can be differentiated into site-internal and external links:

  1. internal links
  2. external links

When it comes to linking and SEO, acquiring link juice is the ultimate goal you should be going for. By getting backlinks from preferably established websites, link juice is transferred back to your site, thus strengthening it. This list is not complete and there are loads of details you need to keep in mind when dealing with SEO. Nevertheless, this post is about implementing an SEO quick check tool, right?

Implementing an SEO quick check tool

The following presents a proof of concept for implementing an SEO quick check tool written in PHP. Feel free to use it as a foundation. First of all, let’s assemble our toolset to save us a lot of trouble parsing and evaluating the DOM tree.

cURL

Of course there is also a PHP extension for cURL. Make sure that the corresponding extension is activated in your php.ini. We will be using cURL to fetch various remote assets, starting with the website HTML code itself:

 
function curl_get($url) {
  $ch = curl_init();
  curl_setopt($ch, CURLOPT_URL, $url);
  curl_setopt($ch, CURLOPT_TIMEOUT, 30);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
  curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

  $start = microtime(true); // float timestamp for benchmarking
  $result = curl_exec($ch);
  $end = microtime(true);

  if ($result === false) {
    $error = curl_error($ch);
    curl_close($ch);
    return array('error' => $error);
  }

  curl_close($ch);

  return array('data' => $result, 'duration' => round($end - $start, 2));
}

This function will be used throughout the SEO quick check tool and returns an array containing the fields

  1. data: data received based on $url
  2. duration: for benchmarking loading duration
  3. error: only set in case something went wrong

In case you need additional headers etc., feel free to adjust this function to your needs.

Simple HTML DOM

Once we have the HTML code we need to parse it into a DOM tree that we can evaluate. For PHP there is a handy library called Simple HTML DOM Parser that does a nice job parsing HTML code into a DOM:

$htmlDOM = str_get_html($html);

Yes, that’s all you need to parse the HTML code into a DOM object, which we will use to evaluate various tags. Please refer to the Simple HTML DOM Parser Manual for more information on how to use this tool.
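
As a quick illustration of how some of the key metrics and factors listed earlier can be extracted, here is a minimal sketch based on Simple HTML DOM. It assumes that simple_html_dom.php is available and that $html holds the markup fetched via curl_get(); the variable names are only examples.

require_once 'simple_html_dom.php';

$htmlDOM = str_get_html($html);

// title tag
$titleNode = $htmlDOM->find('title', 0);
$title = $titleNode ? trim($titleNode->plaintext) : null;

// meta description
$descNode = $htmlDOM->find('meta[name=description]', 0);
$metaDescription = $descNode ? $descNode->content : null;

// images without an alternate text attribute
$imagesWithoutAlt = array();
foreach ($htmlDOM->find('img') as $img) {
  if (trim($img->alt) === '') {
    $imagesWithoutAlt[] = $img->src;
  }
}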

SimpleXML

When dealing with XML in PHP, SimpleXML is definitely the way to go. We will be using SimpleXML for parsing XML sitemaps. First, we need to check whether an XML sitemap is present by inspecting the robots.txt, and then we will use the cURL function defined above to retrieve the sitemap for further inspection.

Check robots.txt

 
$robotsTxtResponse = curl_get($robotsUrl); //$url + "/robots.txt", use e.g. parse_url() to assemble URL correctly
$robotsTxt = $robotsTxtResponse['data']; //make sure to check if 'error' is not set in the response

So, let’s assume that robots.txt exists and its content is available through $robotsTxt.
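
As a side note, the robots.txt URL referenced in the snippet above can be assembled via parse_url() as hinted at in the comment. A minimal sketch, with hypothetical variable names:

$parts = parse_url($url); // $url is the page URL being checked
$robotsUrl = $parts['scheme'] . '://' . $parts['host']
  . (isset($parts['port']) ? ':' . $parts['port'] : '')
  . '/robots.txt';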

Load XML Sitemap

Based on the contents of robots.txt we can check whether an XML sitemap is present:

 
$siteMapUrl = null;
$siteMapMatches = array();

// robots.txt may reference the sitemap via a "Sitemap:" directive
if (preg_match('#^Sitemap:\s*(.+)$#mi', $robotsTxt, $siteMapMatches)) {
  // we got ourselves a sitemap URL in $siteMapMatches[1]
  $siteMapUrl = trim($siteMapMatches[1]);
}

Assuming we have determined a sitemap URL in $siteMapUrl, our next step is to check whether it is a plain sitemap or a sitemap index, i.e. a list of sitemaps for various content types such as pages, posts, categories, etc.

 
// load sitemap
$siteMapResponse = curl_get($siteMapUrl);
$siteMapData = $siteMapResponse['data']; // again, check if 'error' is set first

$isSitemapIndex = false;
$sitemaps = array();
$sitemapUrls = array();

if (preg_match('/<urlset/', $siteMapData)) { // plain sitemap
  $sitemapUrlIndex = new SimpleXMLElement($siteMapData);

  if (isset($sitemapUrlIndex->url)) {
    foreach ($sitemapUrlIndex->url as $v) {
      $sitemapUrls[] = (string) $v->loc;
    }
  }
} else if (preg_match('/<sitemapindex/', $siteMapData)) { // sitemap index
  $sitemapIndex = new SimpleXMLElement($siteMapData);

  if (isset($sitemapIndex->sitemap)) {
    $isSitemapIndex = true;
    foreach ($sitemapIndex->sitemap as $v) {
      $sitemaps[] = (string) $v->loc;
    }
  }
}

Depending on the content retrieved, this snippet either collects the page URLs of a plain sitemap or the locations of the nested sitemaps referenced by a sitemap index.
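
In case a sitemap index was detected, the nested sitemaps still need to be fetched to obtain the actual page URLs. Here is a minimal sketch building on the variables filled above:

if ($isSitemapIndex) {
  foreach ($sitemaps as $nestedSitemapUrl) {
    $nestedResponse = curl_get($nestedSitemapUrl);

    if (isset($nestedResponse['error'])) {
      continue; // skip sitemaps that could not be fetched
    }

    $nestedSitemap = new SimpleXMLElement($nestedResponse['data']);

    if (isset($nestedSitemap->url)) {
      foreach ($nestedSitemap->url as $v) {
        $sitemapUrls[] = (string) $v->loc;
      }
    }
  }
}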

W3C Validator

In order to validate a URL for W3C conformity you can use the handy w3c-validator by micheh. The code required to run this validator is pretty simple:

 
$validator = new \W3C\HtmlValidator();
$result = $validator->validateInput($html); // $html from above

if ($result->isValid()) {
  // Hurray! no errors found :)
} else {
  // Hmm... check failed
  //$result->getErrorCount()
  //$result->getWarningCount()
}

Again, please refer to the w3c-validator documentation for more information.

Google Web Search API

Although technically deprecated, the Google Web Search API is still handy to quickly generate the search preview:

 
// use the user's IP to reduce server-to-server requests
$googleWebSearchApiUrl = "https://ajax.googleapis.com/ajax/services/search/web?v=1.0&"
 . "q=site:" . urlencode($url) . "&userip=" . $_SERVER['REMOTE_ADDR'];

$googleWebSearchApiResponse = curl_get($googleWebSearchApiUrl);
// do some checks here if the request succeeded ('error' not set)
$searchResultData = json_decode($googleWebSearchApiResponse['data'], true);

// access data from the response
$searchResults = $searchResultData['responseData']['cursor']['resultCount'];
$searchResultAdditionalData = $searchResultData['responseData']['results'];

Conclusion

As you can see, implementing a basic SEO quick check tool can be achieved with a small set of tools and libraries. Furthermore, based on the key metrics determined you are able to quickly identify potential SEO problems.

Live Demo

Enough of the theoretical information? Ok! Head over to the Both Interact SEO Quick Check Tool for a live demonstration of this SEO Quick Check Tool. In case you like it feel free to drop a comment.

Manually update WordPress

So you want to manually upgrade your WordPress installation because the one-click update option is not available, due to an ancient (pre-2.7) version or missing file permissions. Although the one-click update is the most comfortable option, you can almost as easily update WordPress manually. The following steps show you how to manually update WordPress.

Backup database

As usual, make a backup of your database and your files. In case you don’t have access to tools such as phpMyAdmin or a shell terminal with proper permissions to export your database, have a look at Shuttle-Export, an easy-to-use PHP-based MySQL dump library. Make sure to remove this tool again after the upgrade process or protect it with .htpasswd, since you most likely don’t want strangers to export your database.

Deactivate plugins

Next, before removing files from your existing installation make sure to deactivate all plugins.

Remove existing installation files

Once again, make sure you have backed up your data and deactivated your plugins. Then remove all files and folders except the following from your installation:

  1. /wp-content
  2. /.htaccess
  3. /wp-config.php

Copy new files

Afterwards, copy all files and folders from your new WordPress version, thus overwriting existing files except the ones mentioned above.

Database upgrade

Now it’s time to check whether a database upgrade is necessary. Thus, navigate to /wp-admin/. If you are asked to, click the upgrade database button. Once this is done, click continue and log back into the dashboard as admin.

Final checks

Finally, check your settings and enable your plugins again. Also check for possible plugin updates. This is all there is to manually updating WordPress. Almost as easy as the one-click update, right? Make sure to remove or password protect your Shuttle-Export installation. You’re all set!

Install PHP intl extension using Homebrew on XAMPP

Recently, when trying to install the PHP intl extension for a Symfony2 based web project using Homebrew, the following error message kept showing up:

/usr/local/bin/brew: /usr/local/Library/brew.rb: /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/bin/ruby: bad interpreter: No such file or directory /usr/local/bin/brew: line 23: /usr/local/Library/brew.rb: Undefined error: 0

This post explains the steps needed to install the PHP intl extension using Homebrew on XAMPP.

Reinstall required?

Trying to do a quick re-install did not succeed:

bash-3.2$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

It appears Homebrew is already installed. If your intent is to reinstall you should do the following before running this installer again:

bash-3.2$ rm -rf /usr/local/Cellar /usr/local/.git && brew cleanup

So, basically it’s advised to remove the existing files and re-run the installer, which resulted in:

rm: /usr/local/Cellar/libmemcached/1.0.18/AUTHORS: Permission denied

rm: /usr/local/Cellar/libmemcached/1.0.18/bin/memcapable: Permission denied

rm: /usr/local/Cellar/libmemcached/1.0.18/bin/memcat: Permission denied

… and many more errors

Sudo to the rescue?

Ok, so let’s remove the files using sudo:

bash-3.2$ sudo rm -rf /usr/local/Cellar /usr/local/.git && brew cleanup

Password: *****

/usr/local/bin/brew: /usr/local/Library/brew.rb: /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/bin/ruby: bad interpreter: No such file or directory

/usr/local/bin/brew: line 23: /usr/local/Library/brew.rb: Undefined error: 0

Well, this is kind of embarrassing 🙂

Since the folders have been successfully removed, it might just be the brew cleanup command that causes the problems?

bash-3.2$ sudo brew cleanup

/usr/local/bin/brew: /usr/local/Library/brew.rb: /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/bin/ruby: bad interpreter: No such file or directory

/usr/local/bin/brew: line 23: /usr/local/Library/brew.rb: Undefined error: 0

It’s not 🙂 So, after manually removing these directories again it’s time for another attempt to call the brew installer:

bash-3.2$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

==> This script will install:

/usr/local/bin/brew

/usr/local/Library/…

/usr/local/share/man/man1/brew.1

==> The following directories will be made group writable:

/usr/local/sbin

/usr/local/share/man/man7

Press RETURN to continue or any other key to abort

==> /usr/bin/sudo /bin/chmod g+rwx /usr/local/sbin /usr/local/share/man/man7

==> /usr/bin/sudo /bin/mkdir /Library/Caches/Homebrew

==> /usr/bin/sudo /bin/chmod g+rwx /Library/Caches/Homebrew

==> Downloading and installing Homebrew…

remote: Counting objects: 3580, done.

remote: Compressing objects: 100% (3426/3426), done.

remote: Total 3580 (delta 35), reused 1458 (delta 18), pack-reused 0

Receiving objects: 100% (3580/3580), 2.74 MiB | 963.00 KiB/s, done.

Resolving deltas: 100% (35/35), done.

From https://github.com/Homebrew/homebrew

* [new branch]      master     -> origin/master

error: unable to unlink old ‘Library/Homebrew/extend/ENV/shared.rb’ (Permission denied)

error: unable to unlink old ‘Library/Homebrew/extend/ENV/std.rb’ (Permission denied)

error: unable to unlink old ‘Library/Homebrew/extend/ENV/super.rb’ (Permission denied)

error: unable to unlink old ‘Library/Homebrew/hooks/bottles.rb’ (Permission denied)

error: unable to create file Library/Homebrew/language/go.rb (Permission denied)

error: unable to unlink old ‘Library/Homebrew/language/haskell.rb’ (Permission denied)

error: unable to create file Library/Homebrew/language/java.rb (Permission denied)

error: unable to unlink old ‘Library/Homebrew/language/python.rb’ (Permission denied)

error: unable to create file Library/Homebrew/utils/fork.rb (Permission denied)

error: unable to unlink old ‘Library/Homebrew/utils/inreplace.rb’ (Permission denied)

error: unable to unlink old ‘Library/Homebrew/utils/json.rb’ (Permission denied)

error: unable to create file Library/Homebrew/utils/popen.rb (Permission denied)

error: unable to unlink old ‘Library/Homebrew/vendor/okjson.rb’ (Permission denied)

Checking out files: 100% (3584/3584), done.

fatal: Could not reset index file to revision ‘origin/master’.

Failed during: git reset --hard origin/master

Almost! As you can see, we also need to manually remove the Library/Homebrew folder, and then it finally works:

bash-3.2$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

==> This script will install:

/usr/local/bin/brew

/usr/local/Library/…

/usr/local/share/man/man1/brew.1

Press RETURN to continue or any other key to abort

==> Downloading and installing Homebrew…

remote: Counting objects: 3580, done.

remote: Compressing objects: 100% (3426/3426), done.

remote: Total 3580 (delta 35), reused 1458 (delta 18), pack-reused 0

Receiving objects: 100% (3580/3580), 2.74 MiB | 502.00 KiB/s, done.

Resolving deltas: 100% (35/35), done.

From https://github.com/Homebrew/homebrew

* [new branch]      master     -> origin/master

HEAD is now at f45116a swiftlint: add 0.1.0 bottle.

==> Installation successful!

==> Next steps

Run `brew help` to get started

bash-3.2$ brew help

Example usage:

  brew [info | home | options ] [FORMULA…]

  brew install FORMULA…

  brew uninstall FORMULA…

  brew search [foo]

  brew list [FORMULA…]

  brew update

  brew upgrade [--all | FORMULA…]

  brew pin/unpin [FORMULA…]

Troubleshooting:

  brew doctor

  brew install -vd FORMULA

  brew [--env | config]

Brewing:

  brew create [URL [--no-fetch]]

  brew edit [FORMULA…]

  open https://github.com/Homebrew/homebrew/blob/master/share/doc/homebrew/Formula-Cookbook.md

Further help:

  man brew

  brew home

Pretty easy, right? 😉 Now it’s time to actually install the PHP intl extension.

Install PHP intl extension using Homebrew

To actually install the PHP intl extension we can use Homebrew, thanks to Gigamike for the tip:

bash-3.2$ brew install icu4c

==> Downloading https://homebrew.bintray.com/bottles/icu4c-55.1.yosemite.bottle.tar.gz

######################################################################## 100,0%

==> Pouring icu4c-55.1.yosemite.bottle.tar.gz

==> Caveats

This formula is keg-only, which means it was not symlinked into /usr/local.

Mac OS X already provides this software and installing another version in

parallel can cause all kinds of trouble.

OS X provides libicucore.dylib (but nothing else).

Generally there are no consequences of this for you. If you build your

own software and it requires this formula, you’ll need to add to your

build variables:

    LDFLAGS:  -L/usr/local/opt/icu4c/lib

    CPPFLAGS: -I/usr/local/opt/icu4c/include

==> Summary

🍺  /usr/local/Cellar/icu4c/55.1: 244 files, 66M

And then finally install intl:

bash-3.2$ sudo /Applications/XAMPP/xamppfiles/bin/pecl install intl

Cannot find autoconf / phpize failed

In case you get the error Cannot find autoconf:

bash-3.2$ sudo /Applications/XAMPP/xamppfiles/bin/pecl install intl

downloading intl-3.0.0.tgz …

Starting to download intl-3.0.0.tgz (248,200 bytes)

…………………………………………….done: 248,200 bytes

150 source files, building

running: phpize

Configuring for:

PHP Api Version:         20121113

Zend Module Api No:      20121212

Zend Extension Api No:   220121212

Cannot find autoconf. Please check your autoconf installation and the

$PHP_AUTOCONF environment variable. Then, rerun this script.

ERROR: `phpize’ failed

Make sure that autoconf is installed and that PHP_AUTOCONF is set correctly (it is not set by default when using XAMPP).

You can easily install autoconf through Homebrew too:

bash-3.2$ brew install autoconf

==> Downloading https://homebrew.bintray.com/bottles/autoconf-2.69.yosemite.bottle.1.tar.gz

######################################################################## 100,0%

==> Pouring autoconf-2.69.yosemite.bottle.1.tar.gz

Warning: This keg was marked linked already, continuing anyway

🍺  /usr/local/Cellar/autoconf/2.69: 70 files, 3,1M

Next, make sure that PHP_AUTOCONF points to your autoconf binary by checking its path:

bash-3.2$ which autoconf

/usr/local/bin/autoconf

Now set PHP_AUTOCONF:

bash-3.2$ export PHP_AUTOCONF=/usr/local/bin/autoconf

and start the intl installation through PECL again:

bash-3.2$ sudo /Applications/XAMPP/xamppfiles/bin/pecl install intl

downloading intl-3.0.0.tgz …

Starting to download intl-3.0.0.tgz (248,200 bytes)

…………………………………………….done: 248,200 bytes

150 source files, building

running: phpize

Configuring for:

PHP Api Version:         20121113

Zend Module Api No:      20121212

Zend Extension Api No:   220121212

Specify where ICU libraries and headers can be found [DEFAULT] :

building in /private/tmp/pear/temp/pear-build-rootlnRgnx/intl-3.0.0

running: /private/tmp/pear/temp/intl/configure --with-icu-dir=DEFAULT

checking for grep that handles long lines and -e… /usr/bin/grep

checking for egrep… /usr/bin/grep -E

checking for a sed that does not truncate output… /usr/bin/sed

checking for cc… cc

checking whether the C compiler works… yes

checking for C compiler default output file name… a.out

checking for suffix of executables…

checking whether we are cross compiling… no

checking for suffix of object files… o

checking whether we are using the GNU C compiler… yes

checking whether cc accepts -g… yes

checking for cc option to accept ISO C89… none needed

checking how to run the C preprocessor… cc -E

checking for icc… no

checking for suncc… no

checking whether cc understands -c and -o together… yes

checking for system library directory… lib

checking if compiler supports -R… no

checking if compiler supports -Wl,-rpath,… yes

checking build system type… x86_64-apple-darwin14.3.0

checking host system type… x86_64-apple-darwin14.3.0

checking target system type… x86_64-apple-darwin14.3.0

checking for PHP prefix… /Applications/XAMPP/xamppfiles

checking for PHP includes… -I/Applications/XAMPP/xamppfiles/include/php -I/Applications/XAMPP/xamppfiles/include/php/main -I/Applications/XAMPP/xamppfiles/include/php/TSRM -I/Applications/XAMPP/xamppfiles/include/php/Zend -I/Applications/XAMPP/xamppfiles/include/php/ext -I/Applications/XAMPP/xamppfiles/include/php/ext/date/lib

checking for PHP extension directory… /Applications/XAMPP/xamppfiles/lib/php/extensions/no-debug-non-zts-20121212

checking for PHP installed headers prefix… /Applications/XAMPP/xamppfiles/include/php

checking if debug is enabled… no

checking if zts is enabled… no

… a lot more messages and compile output and finally:

———————————————————————-

Libraries have been installed in:

   /private/tmp/pear/temp/pear-build-rootlnRgnx/intl-3.0.0/modules

If you ever happen to want to link against installed libraries

in a given directory, LIBDIR, you must either use libtool, and

specify the full pathname of the library, or use the `-LLIBDIR’

flag during linking and do at least one of the following:

   - add LIBDIR to the `DYLD_LIBRARY_PATH' environment variable

     during execution

See any operating system documentation about shared libraries for

more information, such as the ld(1) and ld.so(8) manual pages.

———————————————————————-

Build complete.

Don’t forget to run ‘make test’.

running: make INSTALL_ROOT="/private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0" install

Installing shared extensions:     /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib/php/extensions/no-debug-non-zts-20121212/

running: find “/private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0” | xargs ls -dils

10839659   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0

10840601   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications

10840602   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP

10840603   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles

10840604   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib

10840605   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib/php

10840606   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib/php/extensions

10840607   0 drwxr-xr-x  3 root  wheel     102 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib/php/extensions/no-debug-non-zts-20121212

10840608 856 -rwxr-xr-x  1 root  wheel  434572 19 Mai 14:48 /private/tmp/pear/temp/pear-build-rootlnRgnx/install-intl-3.0.0/Applications/XAMPP/xamppfiles/lib/php/extensions/no-debug-non-zts-20121212/intl.so

Build process completed successfully

Installing ‘/Applications/XAMPP/xamppfiles/lib/php/extensions/no-debug-non-zts-20121212/intl.so’

install ok: channel://pecl.php.net/intl-3.0.0

configuration option “php_ini” is not set to php.ini location

You should add “extension=intl.so” to php.ini

Activate intl Extension in php.ini

Finally, make sure to add

extension=intl.so

to your php.ini and restart Apache. If everything is set up correctly you will see the intl information using phpinfo():

PHP intl extension phpinfo
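
As an additional quick sanity check (not part of the original setup steps), a short PHP snippet run via the XAMPP PHP binary can confirm that the extension is actually loaded:

if (extension_loaded('intl')) {
  echo 'intl is loaded, ICU version: ' . INTL_ICU_VERSION . PHP_EOL;
} else {
  echo 'intl is NOT loaded - check php.ini and restart Apache.' . PHP_EOL;
}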

 You are all set to go!

WCAG 2.0 – Web Content Accessibility Guidelines

When designing web sites and portals make sure to also address general accessibility issues governed for instance by the WCAG 2.0 – Web Content Accessibility Guidelines.


Basically, the WCAG defines three conformance levels:

  1. Level A (beginner)
  2. Level AA (intermediate)
  3. Level AAA (advanced)

Each level adds additional requirements concerning the four principles of the guidelines:

  1. perceivable
  2. operable
  3. understandable
  4. robust

Perceivable (section 1.1 Text alternatives through 1.4 Distinguishable) defines that

Information and user interface components must be presentable to users in ways they can perceive.

Operable (section 2.1 Operable through 2.4 Navigable) makes sure that

User Interface components and navigation must be operable.

Understandable (section 3.1 Readable through 3.3 Input Assistance) defines that

Information and the operation of user interface must be understandable.

Robust (section 4.1 Compatible) finally makes sure that

Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies.

WCAG 2.0 – Checklists

Below you find checklists for each WCAG level, published by Luke McGrath, that outline the guidelines to make websites WCAG conformant. Following the checklist for each of the levels you find online tools that enable you to check websites for conformance.

WCAG 2.0 – Checklist Level A (Beginner)

Guideline: Summary
1.1.1 – Non-text Content: Provide text alternatives for non-text content
1.2.1 – Audio-only and Video-only (Pre-recorded): Provide an alternative to video-only and audio-only content
1.2.2 – Captions (Pre-recorded): Provide captions for videos with audio
1.2.3 – Audio Description or Media Alternative (Pre-recorded): Video with audio has a second alternative
1.3.1 – Info and Relationships: Logical structure
1.3.2 – Meaningful Sequence: Present content in a meaningful order
1.3.3 – Sensory Characteristics: Use more than one sense for instructions
1.4.1 – Use of Colour: Don’t use presentation that relies solely on colour
1.4.2 – Audio Control: Don’t play audio automatically
2.1.1 – Keyboard: Accessible by keyboard only
2.1.2 – No Keyboard Trap: Don’t trap keyboard users
2.2.1 – Timing Adjustable: Time limits have user controls
2.2.2 – Pause, Stop, Hide: Provide user controls for moving content
2.3.1 – Three Flashes or Below: No content flashes more than three times per second
2.4.1 – Bypass Blocks: Provide a ‘Skip to Content’ link
2.4.2 – Page Titled: Use helpful and clear page titles
2.4.3 – Focus Order: Logical order
2.4.4 – Link Purpose (In Context): Every link’s purpose is clear from its context
3.1.1 – Language of Page: Page has a language assigned
3.2.1 – On Focus: Elements do not change when they receive focus
3.2.2 – On Input: Elements do not change when they receive input
3.3.1 – Error Identification: Clearly identify input errors
3.3.2 – Labels or Instructions: Label elements and give instructions
4.1.1 – Parsing: No major code errors
4.1.2 – Name, Role, Value: Build all elements for accessibility

WCAG 2.0 Level A basically makes sure that the content is accessible based on the four main principles: perceivable, operable, understandable and robust. As stated above, levels AA and AAA add additional requirements to these principles, which are outlined below.

WCAG 2.0 – Checklist Level AA (Intermediate)

Guideline: Summary
1.2.4 – Captions (Live): Live videos have captions
1.2.5 – Audio Description (Pre-recorded): Users have access to audio description for video content
1.4.3 – Contrast (Minimum): Contrast ratio between text and background is at least 4.5:1
1.4.4 – Resize Text: Text can be resized to 200% without loss of content or function
1.4.5 – Images of Text: Don’t use images of text
2.4.5 – Multiple Ways: Offer several ways to find pages
2.4.6 – Headings and Labels: Use clear headings and labels
2.4.7 – Focus Visible: Ensure keyboard focus is visible and clear
3.1.2 – Language of Parts: Tell users when the language on a page changes
3.2.3 – Consistent Navigation: Use menus consistently
3.2.4 – Consistent Identification: Use icons and buttons consistently
3.3.3 – Error Suggestion: Suggest fixes when users make errors
3.3.4 – Error Prevention (Legal, Financial, Data): Reduce the risk of input errors for sensitive data

WCAG 2.0 – Checklist Level AAA (Advanced)

Guideline: Summary
1.2.6 – Sign Language (Pre-recorded): Provide sign language translations for videos
1.2.7 – Extended Audio Description (Pre-recorded): Provide extended audio description for videos
1.2.8 – Media Alternative (Pre-recorded): Provide a text alternative to videos
1.2.9 – Audio Only (Live): Provide alternatives for live audio
1.4.6 – Contrast (Enhanced): Contrast ratio between text and background is at least 7:1
1.4.7 – Low or No Background Audio: Audio is clear for listeners to hear
1.4.8 – Visual Presentation: Offer users a range of presentation options
1.4.9 – Images of Text (No Exception): Don’t use images of text
2.1.3 – Keyboard (No Exception): Accessible by keyboard only, without exception
2.2.3 – No Timing: No time limits
2.2.4 – Interruptions: Don’t interrupt users
2.2.5 – Re-authenticating: Save user data when re-authenticating
2.3.2 – Three Flashes: No content flashes more than three times per second
2.4.8 – Location: Let users know where they are
2.4.9 – Link Purpose (Link Only): Every link’s purpose is clear from its text
2.4.10 – Section Headings: Break up content with headings
3.1.3 – Unusual Words: Explain any strange words
3.1.4 – Abbreviations: Explain any abbreviations
3.1.5 – Reading Level: Users with nine years of school can read your content
3.1.6 – Pronunciation: Explain any words that are hard to pronounce
3.2.5 – Change on Request: Don’t change elements on your website until users ask
3.3.5 – Help: Provide detailed help and instructions
3.3.6 – Error Prevention (All): Reduce the risk of all input errors

 

jQuery Lights out Plugin

In current times, saving energy in all its variants is becoming more and more important each day. When it comes to websites one simple way to save energy is to dim the display when idle.

Both Interact has released a simple yet versatile cross-browser jQuery lights out plugin. To keep things simple, you only need to include jQuery 1.3 or higher and the plugin itself. It will then do the magic of automatically adding the required DOM elements for the overlays and registering event handlers to track user interactions and idle times.

Feel free to download the jQuery lights out plugin from Github and adjust it to your needs.

Disable SSLv3 support for Apache

In case you haven’t disabled support for SSLv3 in Apache yet, do so now! You can easily disable SSLv3 in your Apache configuration (httpd.conf) via the -SSLv3 option:

SSLHonorCipherOrder on
SSLProtocol -ALL -SSLv3 +TLSv1 +TLSv1.1 +TLSv1.2
SSLCipherSuite ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+3DES:DH+3DES:RSA+AES:RSA+3DES:!aNULL:!MD5:!DSS

As always, make sure to restart Apache afterwards. Note that depending on your setup you might need to set the list of supported protocols for each vhost entry separately.

Test your configuration

Test whether your site’s security status conforms to best practice with respect to

  1. certificates
  2. protocol support
  3. key exchange
  4. cipher strength

at the Qualys SSL Labs SSL analyzer. This tool will check various parameters and provide you with an overall rating: Qualys SSL Lab Test Results

Resolving IPv6 issues with GMail using qmail smtp

In case you are running into problems with Google Mail (GMail) rejecting mail from your qmail server running on native IPv6 with the following message

2a00:1450:4013:0c01:0000:0000:0000:001a failed after I sent the message. Remote host said: 550-5.7.1 [xxx:xxx:xxx:xxx::x 12] Our system has detected that this 550-5.7.1 message is likely unsolicited mail. To reduce the amount of spam sent 550-5.7.1 to Gmail, this message has been blocked. Please visit 550-5.7.1 http://support.google.com/mail/bin/answer.py?hl=en&answer=188131 for 550 5.7.1 more information. e7si30051500wiy.79 – gsmtp

make sure that you double-check the following settings:

  1. MX record matches the hostname set in qmail, /var/qmail/control/me
  2. Reverse DNS record for your server IP, i.e. PTR record, for both IPv6 and IPv4
  3. SPF records are set properly and point to your MX records and server IP for both IPv6 and IPv4

Check MX record

First of all, make sure that your MX records match the hostname set in the qmail configuration. You can check your MX settings using the MXToolbox.

Check Reverse DNS record – PTR record

Second, make sure that your IPv4 and especially your IPv6 addresses have a PTR record that resolves to the MX entry set for your domain. You can check your reverse DNS lookup using the MXToolbox.

Check SPF record

Finally, GMail (and probably other mail providers too) will check for correct SPF or DKIM records. Normally, the SPF record alone is sufficient. Check out the SPFWizard on how to assemble your IN TXT SPF record. Make sure to add entries for IPv6 too, e.g.

v=spf1 mx a ip4:xx.xx.xx.xx ip6:y:yyyy:yyy:yyy:yyyy::y include:mail.yourserver.com include:mail2.yourserver.com -all
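
If you want to verify the published record programmatically, a small PHP sketch using dns_get_record() can list the TXT records and check for an ip6 mechanism (the domain name below is a placeholder):

$records = dns_get_record('yourserver.com', DNS_TXT);

foreach ($records as $record) {
  if (strpos($record['txt'], 'v=spf1') === 0) {
    echo $record['txt'] . PHP_EOL;
    echo (strpos($record['txt'], 'ip6:') !== false)
      ? 'ip6 mechanism present' . PHP_EOL
      : 'no ip6 mechanism found' . PHP_EOL;
  }
}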

Check SMTP

Finally, check your server setup by either sending out mail to GMail recipients or testing it with the excellent Allaboutspam e-mail tester. That should get qmail working with native IPv6 support and GMail.

Currency conversion for Google Shopping Feed in Magento

When generating Google Shopping feeds you need to make sure to use the correct currency for the target country. Thus, if you are for instance planning to target Great Britain, make sure to use GBP as the currency for prices in the products feed. This can easily be achieved using the Magento extension Wyomind Data Feed Manager. It will automatically convert from the currently set default currency to the one specified in the product pattern configuration.

Enable conversion currencies

For instance, let’s say EUR is your default currency. Then you first need to ensure that your target currencies are enabled in the Magento backend via System -> Configuration -> Currency Setup: Magento Currency Setup

Automatic currency conversion for Products Feed

In order for the automatic conversion to work you need to ensure that the built-in conversion web service has run at least once. Thus, either run the web service manually or, better, set it up to run on a regular basis through your cron setup: Magento Currency Conversion Webservice Setup. The Data Feed Manager currency conversion will work properly once the initial conversion rate data is available. Make sure to update the conversion rates on a regular basis!

Syntax for price conversion

In order to use the price conversion in the Data Feed manager use the following syntax:

{normal_price,[GBP],[0]} GBP

This will convert the price from the default currency to GBP based on the current conversion rates determined by the web service. Have a look at the Wyomind Data Feed Manager documentation for more information on how to use this extension.

Disable caching of API callbacks for Viveum Magento extension

Using a caching server like Varnish is mandatory for running Magento (efficiently). When using extensions that make use of callback API calls make sure that you exclude them from your caching rules.

As we are using Viveum as the payment gateway service for most eCommerce projects, we generally deploy the official Viveum Magento extension. When using this extension in combination with Varnish, make sure to at least exclude the following API calls from your caching list, e.g. through the Turpentine URL blacklist:

ops/payment/
ops/api/

Without excluding these callbacks we experienced problems especially related to PayPal transactions, including wrong customer information on the PayPal checkout page.

Google mobile search update

Today, Google has released its newest search algorithm update for mobile search results. It specifically targets displaying mobile-friendly website results in favor of non-mobile versions. In the first phase of this update the focus lies on searches conducted on smartphones only; tablets and other mobile devices are not yet targeted by this update, but chances are that they will be added soon.

Quick check

Is your site mobile-friendly? Check it now to ensure that you are listed in the mobile search results using the Google Mobile Friendly Test.

More information

Find more information on the smec blog (German).

Mass import products to Magento multi-store setup

So, you are about to import a large number of products into Magento. Well, there are a couple of different approaches available. First, you could use the ancient Data Flow import based on specific profiles. Then, you could use the improved Data Import version, which is still (very) resource consuming. Lastly, there is magmi, a highly efficient tool to mass import data into Magento, which is the preferred way to go in almost all cases when dealing with larger sets of product data. This post shows how to mass import products into Magento multi-store setups using magmi.

Export products

First, you need to export existing products as a CSV file. This can be achieved (again) using Data Flow, Data Import or a more elaborate database dump. Let’s assume at this point that you have a CSV file ready (very convenient, I know). In case you don’t, simply run the built-in Data Export function from within the admin backend (specify only those columns you definitely need to keep the processing to a minimum) or use the default Export All Products Data Flow profile. Edit this CSV to your needs so that you have a working set of products to be imported afterwards. Make sure at this point to retain the required columns so that you don’t run into problems afterwards. The columns required for the import process depend on the scenario at hand, but rest assured you need at least sku, store and any additional attribute such as name or description.

Export data manipulation tools

Make sure to edit your CSV file with a tool that is capable of handling UTF-8 correctly and also supports setting proper delimiters. LibreOffice is a very handy solution for this. You can easily use the Save as functionality to set proper delimiters for magmi to import your edited CSV afterwards. Use comma (,) as the column separator and double quotes (") as the text delimiter, which is the default setting. Here is a working sample CSV extract with the respective required columns in the first row:

"sku","description","short_description","name","store"
"sample-1","My shiny product for Englisch store","Shiny product","Product A EN","en"
"sample-1","My shiny product for German store","Shiny product","Product A DE","de"

CSV import schema

In order to mass import products for particular store views in a multi-store setup, make sure that the store column is set correctly to the store view code. Have a look at the core_store table or at the admin backend under System / Manage Stores to determine the correct store view code.

Note: The regular Data Import interface provided by Magento will activate the “Use default” setting on various attributes of the imported/updated products, thus potentially doing something you are not intending when importing/updating products! Instead, use magmi and the store column to explicitly set which products to import into which store view.

Note that you can also specify a list of store codes for the store column. The fastest and (imho) safest way to mass import products into Magento is magmi – the Magento Mass Importer. Next, we are going to set up magmi and create a profile to mass import products.

Setup Magmi

First, download and install magmi and protect it from outside access (e.g. via .htaccess). Note that by default, the web interface will be accessible by everyone who knows the corresponding URL! Open the web interface: http://www.your-domain/magmi/web/magmi.php and

  1. Edit the global settings
  2. Add and edit the import profile

Edit global configuration

Set your connection details and Magento version at hand: MAGMI Global configuration

Create an import profile

Next, setup the import profile based on the Default profile and specify CSV as data source: MAGMI Profile configuration

Make sure to enable the Magmi Optimizer as this will speed up the process significantly, especially when dealing with thousands of entries.

Run import

Finally, choose your CSV import file and set the import mode. You can choose between three different modes:

  1. Update existing items only, skip new ones
  2. Create new items, update existing ones
  3. Create new items, skip existing ones

Again, for multi-store setups make sure that the store column contains the correct store view code. The import process should run pretty fast using magmi.

Reindex data

Optionally, you can choose to kick off the reindexing process using magmi (enable the option Magento Magmi Reindexer), run the indexer with the shell script provided by Magento or use the corresponding admin backend option.

String comparison in Typoscript through user functions in TYPO3

Generally, string comparison using the on-board tools provided by TypoScript can be quite cumbersome in TYPO3. Although for globalStrings, for instance, there is the possibility to use regular expressions and the * character as a wildcard, oftentimes this is not flexible enough to handle more complex conditions, such as combined conditions.

User functions in TypoScript

Luckily, it is quite easy to add more complex string comparison functionality to TypoScript through user functions. Simply add your user defined function in localconf.php, e.g.

/**
 * Return TRUE on success, FALSE otherwise.
 * @return boolean
 */
function user_match($var1, $var2, $var3) {
  if($var1 == $var2) {
    return TRUE;
  }

  if($var1 == $var3) {
    return preg_match('#...#', $var2);
  }

  // ... further comparisons ...

  return FALSE;
}

You can then call this function in TypoScript as conditional statement:

[userFunc = user_match(value1, value2, value3)]
  # do something here
[end]

Have a look at the Typoscript Conditions reference for more information.

WordPress Rating-Widget shows blank reporting graph when using SSL

By default, the free version of the WordPress Rating-Widget does not officially “support” SSL/https setups. In practice there are no problems using it on SSL setups, except when it comes to the reporting graph, which is loaded via the http connection set in the configuration, thus causing CORS to kick in and prohibit non-safe external requests: Rating-Widget empty reporting graph. When looking at the failing request using the developer toolbar you can see the CORS warning: Rating-Widget CORS error. So, at this point feel free to either buy the pro version or change one line in the configuration to enable SSL/HTTPS support for the free version too (which I find should be supported anyway).

Enabling SSL support for reporting graph

Having a quick look at how the widget assembles the reporting graph URL for the iframe reveals that only one constant needs to be changed: WP_RW__ADDRESS. In lib/config.common.php change the following line

define( 'WP_RW__ADDRESS', 'http://' . WP_RW__DOMAIN );

to

define( 'WP_RW__ADDRESS', WP_RW__PROTOCOL . '://' . WP_RW__DOMAIN );

to automatically set the correct protocol based on your current setup. Voila, the graph works with https too: rating-widget-graph-screen

Retrieving value of hidden input DOM element in Google Tag Manager (GTM)

When implementing event tracking of user comments using Google Tag Manager for WordPress, I came across a pretty strange behavior when trying to retrieve the value of a hidden <input> field holding the current comment post ID (comment_post_ID), which should serve as the event label for Google (Universal) Analytics.

WordPress Comment Form

Below you find the WordPress comment form used by most templates today. Pay attention to line 11, which shows the hidden input field comment_post_ID. This field’s value should be used as the event label.

<form id="commentform" class="comment-form" action="http://www.yourdomain.com/wp-comments-post.php" method="post" novalidate="">
<span id="email-notes">Your email address will not be published.</span> Required fields are marked <span class="required">*</span> 
<label for="author">Name <span class="required">*</span></label> 
<input id="author" name="author" size="30" type="text" value="" /> 
<label for="email">Email <span class="required">*</span></label> 
<input id="email" name="email" size="30" type="email" value="" /> 
<label for="url">Website</label> <input id="url" name="url" size="30" type="url" value="" /> 
<label for="comment">Comment</label> 
<textarea id="comment" cols="45" name="comment" rows="8"></textarea> 
<input id="submit" class="submit" name="submit" type="submit" value="Send your comment" /> 
<input id="comment_post_ID" name="comment_post_ID" type="hidden" value="SOME_ID" /> 
<input id="comment_parent" name="comment_parent" type="hidden" value="0" /> 
<input id="akismet_comment_nonce" name="akismet_comment_nonce" type="hidden" value="SOME_HASH" /> 
<input id="ak_js" name="ak_js" type="hidden" value="SOME_HASH" />
</form>

Retrieve value using Google Tag Manager

Per definition, Google Tag Manager evaluates DOM Element variables based on an Element ID and optionally on an Attribute Name. In case no Attribute Name is specified, the text of the DOM element will be used by default:

If the attribute name is set, the variable’s value will be that of the DOM element attribute; otherwise, the variable’s value will be the text of the DOM element.

Unfortunately, this does not seem to (always) work for hidden input fields. Therefore, make sure to specify the Attribute Name value manually: Google Tag Manager DOM Element variable. Although you can also use a Custom JavaScript variable to achieve the same result, DOM Element variables are clearly the way to go here.

Cyanogenmod Updater crashes when using Nightly builds

In case you are trying to update Cyanogenmod to a current Nightly build (e.g. Nightly cm-11-20150201), chances are that the CM updater crashes before even rebooting into the actual update process. This problem has been confirmed by PsychoI3oy on the CM forum:

Yeah, we’ve been getting a lot of crashes in the backend from cmupdater. It has been fixed and we’re doing a special re-run of cm11 nightlies starting today that will have this fix. You’ll still have to manually install this update but after that it should be fine. For anyone else having this issue: please stop hitting report on the crash button. We have plenty of reports. Thanks

Solution

The solution is pretty simple: just do a manual update by powering up your device while pressing Home + Volume Up + Power simultaneously. Of course, make sure to download a recent nightly version using the CM updater first. As stated in various bug reports, this issue is resolved once a manual update has been executed. Personally, I did not experience problems with updates using the regular CM updater approach after executing the process manually as described above.