The document component - Kore Nordmann

The document component

Conversion between different document markup languages is common not only in Content Management Systems; everyday applications like wikis or forums do it all the time.

In most cases just one language is used in the user interface, like some wiki dialect or BBCode, which is then rendered to HTML. But at some point you might also want to integrate technical documentation written in Docbook, or render PDF documents for printing.

The document component, from the eZ Components project, knows about different markup languages and can convert between all of them, while the conversion process itself stays highly configurable. The currently supported languages are:

Docbook
An XML-based document markup language, developed since the early 1990s. It implements the most complete document markup of these formats.
ReST
reStructuredText is a text-based format, also with quite complete markup, which is especially nice to read as a plain text file, but can still be converted to lots of different output formats.
HTML
We all know this. But HTML has issues when considered as a document markup language, because most documents are framed by layout, navigation, etc. The document component implements a filter stack with some heuristics for efficient HTML conversion.
Wiki
There are hundreds of wiki markup flavours out there. The document component can currently read three flavours (Creole, Dokuwiki and Confluence) and write one (Creole). Most wiki dialects lack support for common markup, like footnotes, citations or even inner-page links.
eZ XML
The XML based markup language used internally by eZ Publish.

To make the power of these conversions visible, I quickly hacked up a small application which allows you to enter text in a textarea, validate the input and convert it between all those markup languages. Have fun playing with that.

The code

The code which does all the conversion is essentially just this, with some additional error handling omitted:

$classes = array(
    'rst'     => 'ezcDocumentRst',
    'docbook' => 'ezcDocumentDocbook',
    'creole'  => 'ezcDocumentWiki',
    'xhtml'   => 'ezcDocumentXhtml',
    'ezxml'   => 'ezcDocumentEzXml',
);

// Instantiate the document class for the input format and load the text
$sourceClass = $classes[$from];
$source = new $sourceClass();
$source->options->errorReporting = E_PARSE;
$source->loadString( $text );

// Create the destination document from the Docbook representation
// of the source document
$destinationClass = $classes[$to];
$destination = new $destinationClass();
$destination->options->errorReporting = E_PARSE;
$destination->createFromDocbook( $source->getAsDocbook() );

echo $destination;

It first instantiates the document class representing the input format and loads the passed text. After that, the destination format class is instantiated and its content is set from the converted source document.
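The additional error handling mentioned above can be sketched roughly as follows. Note that the exception class name ezcDocumentParserException is an assumption here, used for illustration only — check the component's API documentation for the exact exceptions thrown on invalid input:

```php
<?php
// A rough sketch of the omitted error handling, assuming the parser
// throws ezcDocumentParserException (hypothetical name -- consult the
// API docs) when the input cannot be parsed.
try
{
    $source = new ezcDocumentRst();
    $source->options->errorReporting = E_PARSE;
    $source->loadString( $text );

    $destination = new ezcDocumentXhtml();
    $destination->createFromDocbook( $source->getAsDocbook() );
    echo $destination;
}
catch ( ezcDocumentParserException $e )
{
    // Report the parse error to the user instead of emitting a
    // broken conversion result
    echo "Could not parse input: ", $e->getMessage();
}
```

This way the web application can show validation errors for the entered text instead of failing silently.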

The document component uses Docbook as an intermediate format, because it covers all markup used by any of the other languages. But, if required, conversion shortcuts can be implemented, like a direct ReST to HTML conversion.
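Such a shortcut could look like the following sketch. The getAsXhtml() method is an assumption on my part — the component may expose direct conversions under a different name, so consult the tutorial for the actual API:

```php
<?php
// Hypothetical direct ReST to XHTML conversion, skipping the Docbook
// intermediate step. getAsXhtml() is an assumed method name -- the
// real shortcut API may differ.
$source = new ezcDocumentRst();
$source->loadString( $text );

// Convert directly to XHTML, without building a Docbook tree first
$xhtml = $source->getAsXhtml();
echo $xhtml;
```

Skipping the intermediate format saves one parse/visit cycle, which matters when you always convert between the same two languages.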

More examples and starting points for extending the document component with custom markup or custom conversions are offered in the tutorial.

The future

Currently we are working on adding support for ODF reading and writing, to make the Open Document Format an equal member in the set of formats.

Additionally, we are implementing user-customizable PDF rendering for all documents in this release cycle. This will use different backends, like pecl/haru, for the actual PDF creation, and will focus on styling and proper text rendering. The design document for this is, as always, available in the eZ Components SVN.

You are welcome to contribute support for more markup formats, like additional wiki dialects, BBCode or Markdown - contact us on IRC or via the mailing list.
