Keeping Drupal up-to-date with a legacy database
In past posts, Hani has brought up sample cases of migrating data from various WCI formats into Drupal (from basic content, to the knowledge directory, all the way back to ALUI Publisher).
In a recent exercise in the Drupal realm, I looked at a problem involving a large database back-end already in place. The task was for Drupal to simply keep up with the data in the legacy system and (optimistically) eventually take over operations as a content management system entirely. For now, I would consider my homework complete if I can get Drupal to keep up. 'Keeping up' in this puzzle means a few things:
1. some level of knowledge of what is currently in the legacy database,
2. awareness when new entries pop in,
3. awareness of updates to existing entries,
4. a mechanism for the legacy database and the Drupal instance to talk to each other in some coherent way.
For requirement 1, I will be using the Migrate module as a developer tool. The module itself is not meant to be end-user facing; it is meant to be extended for whatever your migration purposes and needs are on a given task. In my case, I will be leveraging the built-in support for DBTNG, MSSQL, and a variety of Oracle API sources - the general hope being that your flavor of database is covered somewhere on that list. (As always, extending the functionality of the Migrate module to other sources ... is left as an exercise, to whomever. Open!) A one-time run of the extended Migrate module gets us up to date on the full current contents of the legacy database.
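To make that concrete, here is a minimal sketch of what such a Migrate extension could look like, assuming Drupal 7 with Migrate 2.x. The 'legacy' connection key, the articles table, and its columns are hypothetical placeholders; your own source query and field mappings will differ.

<?php
/**
 * A minimal Migration subclass that pulls rows from a hypothetical
 * legacy 'articles' table into Drupal article nodes. The 'legacy'
 * connection key is assumed to be declared in settings.php.
 */
class LegacyArticleMigration extends Migration {

  public function __construct() {
    parent::__construct();
    $this->description = t('Import articles from the legacy database.');

    // Source: a DBTNG select query against the legacy connection.
    $query = Database::getConnection('default', 'legacy')
      ->select('articles', 'a')
      ->fields('a', array('id', 'title', 'body', 'changed'));
    // The map table lives in the Drupal database, so tell Migrate not
    // to try joining it into the legacy query.
    $this->source = new MigrateSourceSQL($query, array(), NULL,
      array('map_joinable' => FALSE));

    // Destination: standard Drupal article nodes.
    $this->destination = new MigrateDestinationNode('article');

    // Map table so Migrate can track each legacy row (and re-import updates).
    $this->map = new MigrateSQLMap($this->machineName,
      array('id' => array('type' => 'int', 'unsigned' => TRUE, 'not null' => TRUE)),
      MigrateDestinationNode::getKeySchema()
    );

    // Straightforward column-to-field mappings.
    $this->addFieldMapping('title', 'title');
    $this->addFieldMapping('body', 'body');
    $this->addFieldMapping('changed', 'changed');
  }
}

Register the class through hook_migrate_api() (and a files[] entry in the .info file) so Migrate can discover it; a single drush migrate-import run then brings Drupal up to speed with the current legacy contents.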
For the needs described in requirements 2, 3, and 4, I am going to create and implement a few API calls to be shared between the legacy system and Drupal. The freshly revamped 3.x Services module makes this decision easier:
- any services I create to tackle this particular legacy issue can be easily locked down, so as not to interfere with (or be interfered with by) the traffic coming and going from 99% of the rest of the Drupal instance;
- it offers thorough integration with Drupal functionality like files, nodes, taxonomy, users, etc.;
- the framework is designed to be extensible.
With the Services module in place, I now have REST endpoints awaiting requirements 2 and 3, which map onto the usual CRUD operations. For example, my legacy system will now call http://drupal.instance/my_services_path/push_new_entry/ with JSON or XML arguments to make Drupal aware of the existence of a new entry.
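To sketch what sits behind that URL, here is roughly what the Services 3.x resource could look like. The my_legacy_sync module name, the article content type, the permission, and the payload fields are illustrative assumptions rather than a finished implementation.

<?php
/**
 * Implements hook_services_resources().
 *
 * Exposes a push_new_entry resource; attached to an endpoint at
 * 'my_services_path', a POST to /my_services_path/push_new_entry with a
 * JSON or XML body routes to the callback below.
 */
function my_legacy_sync_services_resources() {
  return array(
    'push_new_entry' => array(
      'create' => array(
        'help' => 'Notify Drupal that a new entry exists in the legacy database.',
        'callback' => '_my_legacy_sync_push_new_entry',
        'access callback' => 'user_access',
        'access arguments' => array('administer nodes'),
        'args' => array(
          array(
            'name' => 'entry',
            'type' => 'struct',
            'description' => 'The legacy entry (e.g. legacy_id, title, body).',
            'source' => 'data',
            'optional' => FALSE,
          ),
        ),
      ),
    ),
  );
}

/**
 * Creates a node from the pushed legacy entry.
 */
function _my_legacy_sync_push_new_entry($entry) {
  $node = new stdClass();
  $node->type = 'article';
  $node->language = LANGUAGE_NONE;
  node_object_prepare($node);
  $node->title = $entry['title'];
  $node->body[LANGUAGE_NONE][0]['value'] = $entry['body'];
  node_save($node);
  // Return the new node ID so the legacy system can record the linkage.
  return array('nid' => $node->nid);
}

Requirement 3 (updates to existing entries) would follow the same pattern with an update operation on the resource, and the endpoint itself is configured - and locked down to just these resources and whatever authentication you choose - from the Services administration UI.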
Through these module extensions, the majority of the heavy lifting is now accomplished. Of course, there are side tasks that still need to be filled in, but for now the broad strokes have been worked out - and have largely already been implemented out in the wild by the Drupal community.