Using a custom static cache handler for offline wiki sync in eZ Publish

By: Xavier Cousin | January 6, 2012 | eZ Publish development tips

Mugo has an internal wiki where all team members share and store information. Previously, we mostly used text files saved in Dropbox, which actually worked quite well. However, we now use a more powerful solution that combines eZ Publish, a custom static cache handler, and Dropbox.

The Dropbox-only approach was already a good solution because it provided:

  • Immediate sync to all team members
  • Offline access
  • Ease of use
  • Several backup sources

We wanted to add better searchability, wiki content formatting, and better content organization without sacrificing any of the above points. After deciding that a password-protected eZ Publish install, along with its enterprise search extension eZ Find, could achieve the desired functionality, the key was to write a custom static cache handler to bring the eZ Publish content (stored on our server) into Dropbox (on each team member's computer). eZ Publish already ships with a static cache handler that stores the full HTML output of pages, but it didn't fully meet our needs.

Thankfully, a recent Mugo pull request to support custom static cache handlers had just been merged into the eZ Publish kernel, making this possible :) The pull request was originally intended to make it easier to purge the cache of a reverse proxy such as Varnish, but we were of course very happy to find this new, unforeseen use case!

The first step is to activate the static cache in eZ Publish and create a custom handler; to do so, we use the following settings in site.ini.append.php:

#this is if you want to re-generate the cache as soon as an object is published
#set your own cache handler by giving its class name 
#(make sure you regenerate the autoload after you add your class)
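
Those comments correspond to settings along the following lines. This is a sketch based on eZ Publish 4.x setting names, and myStaticCacheHandler is a placeholder for your own class name:

```
[ContentSettings]
StaticCache=enabled
StaticCacheHandler=myStaticCacheHandler
```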

Then, we have some standard settings in staticcache.ini.append.php; note that we're storing the static cache files in a static/ folder:

<?php /* #?ini charset="utf-8"?

#folder in which the cache will be generated (the folder must already 
#exist with the proper permissions or the cache may not be written)

# A list of URLs to cache

*/ ?>
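
As a sketch of what those values can look like (the folder name and URL list here are illustrative): StaticStorageDir points at the static/ folder mentioned above, and CachedURLArray[] accepts wildcard entries.

```
[CacheSettings]
StaticStorageDir=static
CachedURLArray[]
CachedURLArray[]=/*
```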

As a base for our custom static cache handler, we copied the default class (kernel/classes/ezstaticcache.php). Our only modifications were to the storeCache() method, in order to accomplish the following whenever new content was added or existing content was updated:

  1. Bypass the login requirement on our internal wiki in order to store the content. The default static cache handler does a cURL request to save the HTML output of each page, which would not suffice in our case.
  2. Since our local copies would reside outside of the normal eZ Publish environment, we implemented rewrite rules for design elements such as stylesheets and images.
  3. Save binary files locally

The code for the first two goals is shown below. We need to fetch the full view module result of the content in question, rendered within the correct pagelayout for the proper siteaccess.

//remember the current siteaccess to put it back after
$current_access = eZSiteAccess::current();
//set the siteaccess
$access = array( 'name' => 'mysiteaccess', 'type' => eZSiteAccess::TYPE_DEFAULT );
eZSiteAccess::change( $access );

//then run the content/view with the current node
$moduleName = "content";
$Module = eZModule::findModule( $moduleName );
$result = $Module->run( 'view',
    array( 'full', $nodeId ),
    array(
        'ViewCache' => false,
        'AttributeValidation' => array( 'processed' => true, 'attributes' => false ),
        'CollectionAttributes' => false
    ),
    //in this last parameter, you can set any variable in the array, and each of them
    //will be available in the view_parameters variable, which lets your templates
    //distinguish between a live and a cached version (if you want to remove things
    //from the static view, for example). You can also page through content by
    //setting limit and offset variables here.
    array( 'viewtype' => 'static' )
);
//once we have the content back from the module, we can fetch the pagelayout
$tpl = eZTemplate::factory();
$moduleResult[ 'content' ] = $result[ 'content' ];
$tpl->setVariable( "module_result", $moduleResult );
$content = $tpl->fetch( "design:pagelayout.tpl" );

//In this case I have to rewrite some css / js / image links to point them to the proper folder in the dropbox
$content = preg_replace( '/<link ([^>]*)href="\/extension\/myextension\/design\/myextension([^"]*)"([^>]*)>/sm', '<link $1href="$2"$3 />', $content );
$content = preg_replace( '/<script ([^>]*)src="\/extension\/myextension\/design\/myextension([^"]*)"([^>]*)><\/script>/sm', '<script $1src="$2"$3></script>', $content );
$content = preg_replace( '/"\/extension\/myextension\/design\/myextension\/images\/([^"]*)"/', '"/images/$1"', $content );

//set the access back to what it was
eZSiteAccess::change( $current_access );

For simplicity, we haven't shown the custom method that scans the content in question to see whether it references any binary files that we need to store.
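
As an illustration only — the helper name and regular expression below are hypothetical, not Mugo's actual code — such a scan could extract eZ Publish content/download links from the rendered HTML, so that each referenced file can then be saved alongside the static pages:

```php
<?php
// Hypothetical helper: collect the unique binary-file links
// (content/download URLs) referenced by a rendered page.
function findBinaryFileLinks( $content )
{
    preg_match_all( '/["\'](\/content\/download\/[^"\']+)["\']/', $content, $matches );
    return array_values( array_unique( $matches[1] ) );
}

// A page linking the same PDF twice yields a single entry
$html = '<a href="/content/download/84/301/file/spec.pdf">spec</a>' .
        '<img src="/content/download/84/301/file/spec.pdf" />';
print_r( findBinaryFileLinks( $html ) );
```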

On the Dropbox side, we did a bit of manual setup to copy the design elements of our new wiki into images/, javascript/ and stylesheets/ folders. The mapping of such files to the Dropbox folders is done by the preg_replace() calls in the code above. A Dropbox instance runs on the eZ Publish server to sync the contents of the static/ folder (as configured above in staticcache.ini.append.php) via a symbolic link. This folder is shared with all of the team members' Dropbox accounts.
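
The symbolic link setup can be sketched as follows. The paths are stand-ins: temporary directories here replace the real eZ Publish root and the server's Dropbox folder.

```shell
# Temporary directories stand in for the real locations
# (the eZ Publish docroot and the server's ~/Dropbox folder).
EZ_ROOT=$(mktemp -d)
DROPBOX_DIR=$(mktemp -d)

# The static cache handler writes into static/ under the eZ Publish root
mkdir -p "$EZ_ROOT/static"

# Expose that folder inside Dropbox via a symlink; Dropbox follows it
# and syncs the cached HTML to every team member's machine
ln -s "$EZ_ROOT/static" "$DROPBOX_DIR/wiki"

ls -ld "$DROPBOX_DIR/wiki"
```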

(In order to install Dropbox via the command line, follow these instructions.)

And that's it! Do you have a similar wiki need that you handled in a different way? Or do you have a new usage for a custom static cache handler, now that it's possible to have one in eZ Publish? Let us know!