Migrating Large XWiki Archives
I'm not sure how many of you use XWiki, but at our company we use it extensively as the knowledge base for our IT information. Lately, our XWiki export backups have grown past 100 MB, which is not too surprising. The real problem is that when you try to import one of those exported archives, the Java virtual machine almost invariably runs out of memory. I have hit this on several occasions while trying to upgrade or migrate our XWiki installation.
To solve this problem, let's start by examining the structure of the XAR archive files that an XWiki export creates. A XAR (XWiki ARchive) is simply a ZIP file. It consists of a package catalog (package.xml) plus all of the serialized documents exported from the wiki. In XWiki's case, these files are wiki articles containing either the latest version of the article, or the article along with its full history of edits and author information. The layout of the archive is as follows:
/ (root)
/package.xml (The catalog of contained serialized objects)
/<Space>/<Article>.xml (one serialized wiki document per article)
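
Since a XAR is just a ZIP container, you can verify this layout yourself. Here is a minimal sketch using Python's standard zipfile module; the archive file name is hypothetical:

import zipfile

# A XAR export is an ordinary ZIP file, so the standard library reads it directly.
with zipfile.ZipFile("xwiki-backup.xar") as xar:  # hypothetical file name
    for entry in xar.namelist():
        print(entry)  # e.g. package.xml, Main/WebHome.xml, ...

Running this against a real export should print package.xml plus one entry per exported document.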
By judicious editing, you can split an oversized XAR into smaller archives that import without triggering an out of memory error. The solution is simple: extract the XAR with any unzip tool, then create new ZIP files (with the .xar extension), each containing a portion of the articles together with a package.xml catalog trimmed down to list only those articles.
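
The splitting itself can be scripted rather than done by hand. The sketch below (Python, standard library only) assumes the package.xml layout described above, with a <files> element whose <file> children name each document as Space.Article, and assumes each document is stored in the ZIP as Space/Article.xml; adjust to match what your export actually contains. The file names and chunk size are hypothetical:

# split_xar.py -- a rough sketch, not production code.
# Splits a large XWiki XAR into smaller XARs of at most CHUNK_SIZE
# articles each, copying package.xml into every chunk with its
# <file> list filtered down to the articles that chunk contains.
import copy
import xml.etree.ElementTree as ET
import zipfile

SOURCE = "xwiki-backup.xar"  # hypothetical input archive
CHUNK_SIZE = 50              # hypothetical number of articles per output archive

with zipfile.ZipFile(SOURCE) as src:
    catalog = ET.fromstring(src.read("package.xml"))
    files_node = catalog.find(".//files")
    entries = list(files_node)  # one <file> element per article

    for n, start in enumerate(range(0, len(entries), CHUNK_SIZE)):
        chunk = entries[start:start + CHUNK_SIZE]

        # Rebuild the catalog so it lists only this chunk's articles.
        part = copy.deepcopy(catalog)
        part_files = part.find(".//files")
        for el in list(part_files):
            part_files.remove(el)
        part_files.extend(copy.deepcopy(e) for e in chunk)

        with zipfile.ZipFile(f"xwiki-backup-part{n}.xar", "w",
                             zipfile.ZIP_DEFLATED) as dst:
            dst.writestr("package.xml", ET.tostring(part, encoding="unicode"))
            for e in chunk:
                # Assumption: the <file> text is "Space.Article" and the
                # matching ZIP entry is "Space/Article.xml" (spaces with
                # dots in their names would need smarter parsing).
                space, _, article = e.text.partition(".")
                name = f"{space}/{article}.xml"
                dst.writestr(name, src.read(name))

Each resulting xwiki-backup-partN.xar can then be imported on its own, keeping the JVM's memory usage within bounds.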
I hope this helps others; if it does, please feel free to let me know in the comments. If you have other XWiki tips, you can share those as well!