IBM Notes and Domino V9.0.1 extends support and enhances its collaboration toolset with social capabilities from IBM Connections V5.5

September 13, 2016 – 8:02 am

Read this for detailed information.

Domino JNA vs. IBM Domino Java API

September 11, 2016 – 12:02 pm

Today, I finally found the time to take a closer look at the Domino JNA project. The project was created by Karsten Lehmann of Mindoo.

The goal of the project is to provide low-level API methods that can be used in Java to speed up the retrieval of data; especially if you have to deal with lots of data, this can make a significant performance difference.

My scenario is the following:

Read entries from a Notes application into a Java bean, and add data from another document in another application to the bean. The number of documents in the source application can be anywhere from 1 to n. I do not “own” the design of the source database, so I cannot modify it.
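To make the scenario concrete, here is a minimal sketch of the kind of bean I am filling. The field names are made up for illustration; they do not reflect the actual design of the source database.

public class PersonBean {
	private String lastName;
	private String firstName;
	private String sourceKey;       // item values used to identify the matching document
	private String dataFromOtherDb; // value merged in from the second application

	public String getLastName() { return lastName; }
	public void setLastName(String lastName) { this.lastName = lastName; }
	public String getFirstName() { return firstName; }
	public void setFirstName(String firstName) { this.firstName = firstName; }
	public String getSourceKey() { return sourceKey; }
	public void setSourceKey(String sourceKey) { this.sourceKey = sourceKey; }
	public String getDataFromOtherDb() { return dataFromOtherDb; }
	public void setDataFromOtherDb(String dataFromOtherDb) { this.dataFromOtherDb = dataFromOtherDb; }
}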

In this article, I will concentrate on reading the source application in the fastest way possible. I will show how I do it right now, and also how you can do it with Domino JNA.

Here is a small piece of Java to measure the duration of the data retrieval.

public class StopWatch {
	private long    startTime = 0;
	private long    stopTime  = 0;
	private boolean running   = false;

	public void start() {
		this.startTime = System.nanoTime();
		this.running = true;
	}

	public void stop() {
		this.stopTime = System.nanoTime();
		this.running = false;
	}

	// elapsed time in milliseconds
	public long getElapsedTime() {
		long elapsed;
		if (running) {
			elapsed = (System.nanoTime() - startTime) / 1000000;
		} else {
			elapsed = (stopTime - startTime) / 1000000;
		}
		return elapsed;
	}

	// elapsed time in seconds
	public long getElapsedTimeSecs() {
		long elapsed;
		if (running) {
			elapsed = (System.nanoTime() - startTime) / 1000000000;
		} else {
			elapsed = (stopTime - startTime) / 1000000000;
		}
		return elapsed;
	}
}

I am using “fakenames.nsf” in my sample code. You can download the two sample databases fakenames.nsf and fakenames-views.nsf from this URL:

Next, place them in the data folder of your IBM Notes Client.

Here is the code that I use in my application. It uses a ViewNavigator to traverse the view. It then opens the underlying document for each entry found, using entry.getDocument(), and prints the values of some items to the console.

I need to do it this way because I need the contents of some items to identify the document in the other database. Unfortunately, not all of the needed values are in the view, so just reading the column values is not an option.

import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewNavigator;

public class Domino {
	public static void main(String[] args) {
		try {
			// initialize the Notes thread for a standalone application
			NotesThread.sinitThread();
			StopWatch stopWatch = new StopWatch();
			stopWatch.start();
			Session session = NotesFactory.createSession();
			Database dbData = session.getDatabase("", "fakenames.nsf");
			View view = dbData.getView("People");
			ViewNavigator navUsers = null;
			ViewEntry vweUser = null;
			ViewEntry vweTemp = null;
			Document docUser = null;
			navUsers = view.createViewNav();
			// we need neither child counts nor column values in this scenario
			navUsers.setEntryOptions(ViewNavigator.VN_ENTRYOPT_NOCOUNTDATA + ViewNavigator.VN_ENTRYOPT_NOCOLUMNVALUES);
			vweUser = navUsers.getFirst();
			navUsers.setCacheGuidance(Integer.MAX_VALUE, ViewNavigator.VN_CACHEGUIDANCE_READSELECTIVE);
			while (vweUser != null) {
				docUser = vweUser.getDocument();
				System.out.println(
						docUser.getItemValueString("lastname") + ", " + docUser.getItemValueString("firstname"));
				vweTemp = navUsers.getNext(vweUser);
				vweUser.recycle();
				vweUser = vweTemp;
			}
			stopWatch.stop();
			System.out.println("Elapsed: " + stopWatch.getElapsedTime() + " ms");
		} catch (Exception e) {
			e.printStackTrace();
		} finally {
			NotesThread.stermThread();
		}
	}
}

Now to Domino JNA. Here is the code. First, I get all IDs of the documents in the view, and then I use the result to open the underlying documents and read the data.

import java.util.EnumSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.concurrent.Callable;

import com.mindoo.domino.jna.NotesCollection;
import com.mindoo.domino.jna.NotesCollection.EntriesAsListCallback;
import com.mindoo.domino.jna.NotesDatabase;
import com.mindoo.domino.jna.NotesIDTable;
import com.mindoo.domino.jna.NotesNote;
import com.mindoo.domino.jna.NotesViewEntryData;
import com.mindoo.domino.jna.constants.Navigate;
import com.mindoo.domino.jna.constants.OpenNote;
import com.mindoo.domino.jna.constants.ReadMask;
import com.mindoo.domino.jna.gc.NotesGC;

import lotus.domino.NotesException;
import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;

public class DominoApi {
	public static void main(String[] args) throws NotesException {
		try {
			NotesThread.sinitThread();
			NotesGC.runWithAutoGC(new Callable<Object>() {
				public Object call() throws Exception {
					StopWatch stopWatch = new StopWatch();
					stopWatch.start();
					Session session = NotesFactory.createSession();
					NotesDatabase dbData = new NotesDatabase(session, "", "fakenames.nsf");
					NotesCollection colFromDbData = dbData.openCollectionByName("People");
					boolean includeCategoryIds = false;
					LinkedHashSet<Integer> allIds = colFromDbData.getAllIds(includeCategoryIds);
					// select the IDs we just read, so that NEXT_SELECTED navigation returns them
					NotesIDTable selectedList = colFromDbData.getSelectedList();
					selectedList.clear();
					selectedList.addNotes(allIds);
					String startPos = "0";
					int entriesToSkip = 1;
					int entriesToReturn = Integer.MAX_VALUE;
					EnumSet<Navigate> returnNavigator = EnumSet.of(Navigate.NEXT_SELECTED);
					int bufferSize = Integer.MAX_VALUE;
					EnumSet<ReadMask> returnData = EnumSet.of(ReadMask.NOTEID, ReadMask.SUMMARY);
					List<NotesViewEntryData> selectedEntries = colFromDbData.getAllEntries(startPos, entriesToSkip,
							returnNavigator, bufferSize, returnData, new EntriesAsListCallback(entriesToReturn));
					for (NotesViewEntryData currEntry : selectedEntries) {
						NotesNote note = dbData.openNoteById(currEntry.getNoteId(), EnumSet.noneOf(OpenNote.class));
						System.out.println(
								note.getItemValueString("lastname") + ", " + note.getItemValueString("firstname"));
					}
					stopWatch.stop();
					System.out.println("Elapsed: " + stopWatch.getElapsedTime() + " ms");
					return null;
				}
			});
		} catch (Exception e) {
			e.printStackTrace();
		} finally {
			NotesThread.stermThread();
		}
	}
}

Now, what do you think? Which code is faster? Domino JNA? Well, not really in my scenario.

I have done a couple of tests on a local machine for both code samples.

The average time (from 100 runs each) for my code to read 40,000 documents from “fakenames.nsf” and print the values of the firstname and lastname items is 9.70 seconds; the average for Domino JNA is 10.65 seconds.

This does not mean that Domino JNA does not have any advantage over the standard IBM Domino Java API; it depends on the scenario. And in my scenario, there is no advantage in using Domino JNA; it would only add complexity and platform dependencies.

If you have read this far, here is an extra for you. I played with the options and found that

navUsers.setCacheGuidance(Integer.MAX_VALUE, ViewNavigator.VN_CACHEGUIDANCE_READSELECTIVE);

gives a significant performance boost.


Without setting the cache guidance, the average time to get the data out of the application was 11.40 seconds. I could not see any difference when using VN_CACHEGUIDANCE_READALL instead of VN_CACHEGUIDANCE_READSELECTIVE.

ONTF DomBackup released

April 29, 2016 – 4:27 pm

I have uploaded a new version of ONTF DomBackup. The new version contains updated binaries for Windows 32/64-bit.

It addresses an issue that leads to data loss when compressing files larger than 2 GB. The issue is described here. The issue is Windows-related only!

The fixed version uses 7Zip as an external compression tool.


DAOS – Find a note for a missing NLO file

March 8, 2016 – 5:35 pm

Today, I saw the following message on my Domino console:

[0E88:005A-0FF4] 08.03.2016 17:19:01   The database d:\Domino\data\mail\ukrause.nsf was unable to open or read the file d:\DAOS\0002\97FC43BEED143800A6608E557BE888498DB9BC5100015B7C.nlo: File truncated – file may have been damaged

If you see such a message in your production environment, you should immediately find out

  • what caused the damage
  • which note the .nlo file belongs to

In my case, the answer to the first question is: antivirus software.

And here is how I found the answer to the second one.

If not already in place, set


Next, trigger a console command:

tell daosmgr LISTNLO MAP -V mail/ukrause.nsf

This will create a file listNLO.txt in your Domino data directory.

Open the file and search for the .nlo file in question.




You now have the note ID of the document that holds a ticket to the damaged .nlo.

Use your favorite tool to find and open the document.
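If your tool of choice happens to be the Domino Java API, a minimal sketch like the following would do. The note ID below is made up for illustration; use the one you found in listNLO.txt.

import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;

public class FindNote {
	public static void main(String[] args) throws Exception {
		NotesThread.sinitThread();
		try {
			Session session = NotesFactory.createSession();
			Database db = session.getDatabase("", "mail/ukrause.nsf");
			// hypothetical note ID as reported by LISTNLO (hexadecimal string)
			Document doc = db.getDocumentByID("8FA6");
			System.out.println("Subject: " + doc.getItemValueString("Subject"));
		} finally {
			NotesThread.stermThread();
		}
	}
}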


In my case, it was just a spam mail, so no worries. But if this happens in production, you should now go and find the (hopefully intact) .nlo in your backup and restore it.

Windows Fixpack Update Idiocy

February 28, 2016 – 9:12 am

Today I decided to install some recommended and optional fixes on my “productive” Windows 2008 R2/64 server.

In general, this has been a straightforward task in the past. Select all fixes and click Install. Grab a new cup of coffee, restart the machine after the upgrade, and carry on with daily business.

As always, I looked for the available free space and found it to be sufficient.


The overall size of all fixes to be installed was ~80MB, so 1.63GB should be more than enough.


8 of the 10 fixes installed without any issues, but the remaining 2 reported errors. As always, the MS error messages are completely useless.

So I asked Google for advice and found at least one hint on how to install the .NET 4.5.2 update: “Download the 4.5.2 offline installer.”

I did, and ran it locally. A message box popped up, and I could not believe what I saw.


So, the successful install consumed 900MB for ~70MB of fixes.

MS should really re-think their upgrade strategy.

And yes, I know. This is ALL so much better on Linux and MAC.



After another fixpack install ( 1.1 MB )


So, where has my free space gone??

Even removal of features needs free disk space. Insane !!


By the way: free disk space is now 0 bytes …

[Vaadin] – Create a simple twin column modal multi-select dialog

January 30, 2016 – 10:29 am

Vaadin is an open source web application framework for rich Internet applications. In contrast to JavaScript libraries and browser-plugin-based solutions, it features a server-side architecture, which means that the majority of the logic runs on the server. Ajax technology is used on the browser side to ensure a rich and interactive user experience. On the client side, Vaadin is built on top of, and can be extended with, the Google Web Toolkit.

To let the user interact and enter or change data, you can use simple text fields. But to keep data consistent, sometimes you want to make only specific values available and let the user select one or more of them.

Here is a small sample of a twin column modal select dialog box. It uses only Vaadin's basic features, so no third-party plugins are needed.

Here is how it looks.


And here is the source code

package org.bluemix.challenge;

import javax.servlet.annotation.WebServlet;

import com.vaadin.annotations.Theme;
import com.vaadin.annotations.VaadinServletConfiguration;
import com.vaadin.annotations.Widgetset;
import com.vaadin.data.Property.ValueChangeEvent;
import com.vaadin.data.Property.ValueChangeListener;
import com.vaadin.event.ShortcutAction;
import com.vaadin.server.VaadinRequest;
import com.vaadin.server.VaadinServlet;
import com.vaadin.ui.Button;
import com.vaadin.ui.Button.ClickEvent;
import com.vaadin.ui.FormLayout;
import com.vaadin.ui.TwinColSelect;
import com.vaadin.ui.UI;
import com.vaadin.ui.VerticalLayout;
import com.vaadin.ui.Window;
import com.vaadin.ui.Window.CloseEvent;
import com.vaadin.ui.Window.CloseListener;

@Theme("mytheme")
@Widgetset("org.bluemix.challenge.MyAppWidgetset")
public class MyUI extends UI {
	private static final int OPTION_COUNT = 6;
	public String selectedOptions = "";

	@Override
	protected void init(VaadinRequest vaadinRequest) {
		final VerticalLayout layout = new VerticalLayout();
		Button button = new Button("Click Me");
		button.addClickListener(new Button.ClickListener() {
			public void buttonClick(ClickEvent event) {
				final Window dialog = new Window("Select one or more ...");
				dialog.setWidth(400.0f, Unit.PIXELS);
				dialog.setModal(true);
				// register esc key as close shortcut
				dialog.setCloseShortcut(ShortcutAction.KeyCode.ESCAPE);
				TwinColSelect twinColl = new TwinColSelect();
				for (int i = 0; i < OPTION_COUNT; i++) {
					twinColl.addItem(i);
					twinColl.setItemCaption(i, "Option " + i);
				}
				twinColl.setLeftColumnCaption("Available options");
				twinColl.setRightColumnCaption("Selected options");
				twinColl.setWidth(95.0f, Unit.PERCENTAGE);
				twinColl.addValueChangeListener(new ValueChangeListener() {
					public void valueChange(ValueChangeEvent event) {
						selectedOptions = String.valueOf(event.getProperty().getValue());
					}
				});
				final FormLayout dialogContent = new FormLayout();
				dialogContent.addComponent(twinColl);
				dialogContent.addComponent(new Button("Done", new Button.ClickListener() {
					public void buttonClick(ClickEvent event) {
						dialog.close();
					}
				}));
				dialog.setContent(dialogContent);
				dialog.addCloseListener(new CloseListener() {
					public void windowClose(CloseEvent e) {
						System.out.println("Selected options: " + selectedOptions);
					}
				});
				addWindow(dialog);
			}
		});
		layout.addComponent(button);
		setContent(layout);
	}

	@WebServlet(urlPatterns = "/*", name = "MyUIServlet", asyncSupported = true)
	@VaadinServletConfiguration(ui = MyUI.class, productionMode = false)
	public static class MyUIServlet extends VaadinServlet {
		private static final long serialVersionUID = 452468769467758600L;
	}
}

When the dialog is closed, it prints the selected options to the console.

The dialog can also be closed by hitting the ESC key. Here, too, the code returns the selected options (if any).


This is just a basic sample; you can create a custom component and also pass the available options using a bean or whatever suits your needs best.
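As a sketch of that idea, here is a small hypothetical helper that fills a TwinColSelect from any collection of strings instead of the hard-coded loop used above; the option values in the usage comment are made up for illustration.

import java.util.Arrays;
import java.util.Collection;

import com.vaadin.ui.TwinColSelect;

public class TwinColOptions {
	// builds a TwinColSelect from an arbitrary collection of options
	public static TwinColSelect withOptions(String caption, Collection<String> options) {
		TwinColSelect select = new TwinColSelect(caption);
		for (String option : options) {
			select.addItem(option);                // item id
			select.setItemCaption(option, option); // visible caption
		}
		return select;
	}

	// usage:
	// TwinColSelect colors = TwinColOptions.withOptions("Colors",
	//         Arrays.asList("Red", "Green", "Blue"));
}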


Latest Windows 10 update completely wrecked my dev environment

January 5, 2016 – 9:00 am

I am doing dev with Visual Studio and other tools and programs on a Windows 10 VM. Today, I decided to check for updates and install them, because I had not done any updates for the past 2 months.

In general, these updates do not do any harm to the installed system, but today was different. The update took very long, and it looked just like the upgrade from Windows 8.1 to Windows 10: a couple of restarts, hints that all my data is still there where I put it, and other nice hints that nothing special will happen.

Half an hour later, the system showed the logon screen. After login, the desktop was empty and an error message appeared, telling me that the system could no longer access my shared MacBook drive. Network settings were completely overwritten, all system environment variables were gone, and Visual Studio no longer had any clue where to find my libraries …

So I had to go back to the last working Windows version. Much to my surprise, this worked. And it was quick; it took only 5 minutes to restore the prior version.


All my tools and drives are back.

Telekom, O-Two and me

December 2, 2015 – 6:16 pm

Whatever petty war the various telephone providers in Germany are fighting among themselves, the customer is the one who suffers.

On Friday, the Telekom technician (the “Telekomiker”, as I call him) finally showed up, a good 3 weeks after the order for an O2 DSL line had been placed.
Had I known how this would end, I would not have offered him coffee.
After a good 20 minutes, he trotted off again with nothing accomplished.

Nope, nothing he can do here; O2 first has to “perform work on the terminating line”.

Translated, this means: “If I connect 2 wires in the basement to strip 1, terminals 10a:10b, then O2 has to make sure that the wiring in the sub-distribution board on the first floor is patched so that the router in the office gets a signal.”

Strip 1, 10a:10b is just an example that I chose myself, in my boundless naivety.

The Telekomiker did not know where EXACTLY to connect the signal-carrying wires.

A call to O2 support with the question “Now what??”. The answer: “Call the construction crew; a new terminating line has to be laid. According to the Telekomiker, the existing one is defective.”

My objection that somebody here must be “not quite right in the head” was heard, but neither confirmed nor denied.

Today, Mister O-Two himself was on site and carried out the necessary “work on the terminating line”.
I learned from the debacle with the Telekomiker and offered no coffee.
The end of the story: still no DSL, because “the line has not been activated yet”.

Meaning: O2 has connected the terminating line to one side of strip 1, 10a:10b, but the Telekom line is still dangling in the room, happily carrying a signal.

O-Two told me, unprompted, that the Telekomikers like to do it this way, because we have an O2 contract and not a Telekom one.

Another call to support, asking for an update on the aforementioned state of mind, merely revealed that my request would be forwarded to the Telekomikers, and that someone might then condescend to give me a new appointment.

On the state-of-mind question there was no concrete information, even when I asked again. But I have already formed my own opinion.

The Telekomikers' next gig is on Monday, 07.12.2015.


[Vaadin] – widgetsets ‘com.vaadin.defaultwidgetset’ does not contain implementation for com.vaadin.addon.charts

November 15, 2015 – 8:12 am

While working on the IBM Vaadin Challenge, I ran into an issue after adding the charts component to my new project.


I implemented the charts component by adding the following line to my ivy.xml file

 <dependency org="com.vaadin.addon" name="vaadin-charts" rev="latest.release" />

and recompiled the widgetset.

Nevertheless, the error message appeared.
If you (like me) are new to Vaadin, you might spend some time solving this puzzle, so I thought I'd write a short description of how to fix it.

Go to your src folder and locate the compiled widgetset.


Next, open the file. It contains a line similar to this:

@VaadinServletConfiguration(productionMode = false, ui = ChartUI.class)

Modify the line so that it points to your widgetset (do not include the “.gwt.xml” part):

@VaadinServletConfiguration(productionMode = false, ui = ChartUI.class, widgetset = "your.package.YourWidgetset")

When you now run the application, it will display your chart.

Build Windows executables on Linux

October 23, 2015 – 8:31 am

If you have to build a binary (.exe, .dll, …) from source code for Linux and Windows, you need at least one build environment for each operating system.
In today's world, this is not a big deal, because we can have as many virtual machines as we want, thanks to VMware (or VirtualBox).
But this can become a complex environment and might also increase license costs, to name just a few problems.

Many of us use Linux as our primary operating system. So the question is: “Can we use Linux to compile and link Windows executables?”

The concept of targeting a different platform than the compiler is running on is not new, and is known as cross-compilation.

Cross-compiling Windows binaries on Linux has several benefits:

  • Reduced operating system complexity.
    On cross-platform projects that are also built on Linux, we get one less operating system to maintain.
  • Access to Unix build tools.
    Build tools such as make, autoconf, and automake, and Unix utilities such as grep, sed, and cat, to mention a few, become available for use in Windows builds as well. Even though projects such as MSYS port a few of these utilities to Windows, the performance is generally lower, and the versions are older and less supported than the native Unix counterparts. Also, if you already have a build environment set up under Linux, you don't have to set it up again on Windows; you can just use the existing one.
  • Lower license costs.
    As we know, Windows costs money in terms of license fees. Building on Linux, developers do not need a Windows installation on their machines, but maybe just a central Windows installation for testing purposes.

On a Linux build environment, a gcc that compiles native binaries is usually installed in “/usr/bin”.
Native headers and libraries are in turn found in “/usr/include” and “/usr/lib”, respectively.
We can see that all these directories are rooted in “/usr”.

Any number of cross-compiler environments can be installed on the same system, as long as they are rooted in different directories.

To compile and link a Windows executable on Linux, do the following:

(1) Go to the MinGW-w64 download page.

For 64-bit, open “Toolchains targetting Win64”, followed by “Automated Builds”, and download a recent version to /tmp.
For 32-bit, open “Toolchains targetting Win32”, followed by “Automated Builds”, and download a recent version to /tmp.


(2) Create 2 directories: mkdir /opt/mingw32 and mkdir /opt/mingw64

(3) Unpack the .bz2 files to the corresponding directories

For 64Bit tar xvf mingw-w64-bin_x86_64-linux_20131228.tar.bz2 -C /opt/mingw64
For 32Bit tar xvf mingw-w32-bin_x86_64-linux_20131227.tar.bz2 -C /opt/mingw32

(4) Create a new hello.c file in /tmp and paste the following code into it:

#include <stdio.h>

int main(void)
{
	printf("Hello World!\n");
	return 0;
}

(5) Next, you can build the Windows binaries using the following commands

For 64Bit /opt/mingw64/bin/x86_64-w64-mingw32-gcc /tmp/hello.c -o /tmp/hello-w64.exe
For 32Bit /opt/mingw32/bin/i686-w64-mingw32-gcc /tmp/hello.c -o /tmp/hello-w32.exe

You now have two Windows binaries that can be copied to a Windows environment.



Well, this is just a simple sample, and for more complex projects you will probably have to do a little more work, but it should give you an idea of how cross-compiling can be implemented.

[C++] – A plain simple sample to write to and read from shared memory

September 15, 2015 – 7:52 am

If you have two programs (or two threads) running on the same computer, you might need a mechanism to share information between the two programs or to transfer values from one program to the other.

One of the possible solutions is “shared memory”. Most of us know shared memory only from server crashes and the like.

Here is a simple sample, written in C++, that shows how you can use a shared memory object. The sample uses the Boost libraries, which provide a very easy way of managing shared memory objects independently of the underlying operating system.

#include <boost/interprocess/managed_shared_memory.hpp>
#include <iostream>
#include <string>
#include <utility>

using namespace boost::interprocess;

int main()
{
	// delete the SHM object if it already exists
	shared_memory_object::remove("my_shm");
	// create a new SHM object and allocate space
	managed_shared_memory managed_shm(open_or_create, "my_shm", 1024);
	// write into SHM
	// Type: int, Name: my_int, Value: 99
	int *i = managed_shm.construct<int>("my_int")(99);
	std::cout << "Write  into shared memory: " << *i << '\n';
	// write into SHM
	// Type: std::string, Name: my_string, Value: "Hello World"
	std::string *sz = managed_shm.construct<std::string>("my_string")("Hello World");
	std::cout << "Write  into shared memory: " << *sz << '\n' << '\n';
	// read INT from SHM
	std::pair<int*, std::size_t> pInt = managed_shm.find<int>("my_int");
	if (pInt.first) {
		std::cout << "Read  from shared memory: " << *pInt.first << '\n';
	} else {
		std::cout << "my_int not found" << '\n';
	}
	// read STRING from SHM
	std::pair<std::string*, std::size_t> pString = managed_shm.find<std::string>("my_string");
	if (pString.first) {
		std::cout << "Read  from shared memory: " << *pString.first << '\n';
	} else {
		std::cout << "my_string not found" << '\n';
	}
	return 0;
}

[How To] – Create your own IBM Notes Splash Screen

September 12, 2015 – 7:47 am

Inspired by Thomas Bahn's post, I started to play with the IBM Notes start screen.

My first “creation” was “YellowVerse 9”.


This technote describes what you need to replace the original start screen with your own creation.

It is important that you save your image as a Windows BMP. This is the only format that the IBM Notes client can handle.

To modify the existing splash.bmp image, I used Snagit, but it also works with MS Paint.
And if you own a more sophisticated graphics program and a graphics tablet, then you have many more possibilities.

The main challenge is to find images that can be made transparent. While the .bmp format does not support transparency, it is possible to add transparent images as an additional layer to the .bmp.

For your convenience, I have added image templates that can be used as a starting point.

You can also build your very own splash screen. The basic .bmp image needs to be 650x503 px. But if you really want to do it from scratch, you probably need more than just a simple graphics program.

Here is what I did with Snagit.


A couple of people asked on Twitter and other social media channels for the already posted splash screens. You can download them here.


This is nothing that enhances productivity or even a new way to work. It’s a time killer, but fun …

Update: here are some more …




Download additional files

VMWare Workstation – Unable to open kernel device “.\Global\vmx86” : The system cannot find the file specified

September 10, 2015 – 9:16 am

I recently upgraded my VMware Workstation from version 10 to 12. The software is running on Windows 10/64.

I never had any issues with VMware Workstation 10 on Windows 7, 8, or 8.1. But after the upgrade, I saw the following error message after almost every restart when I tried to start a VM:


There are several Google search results, even for older versions. Here is the most recent one that addresses the issue and provides a (non-working) workaround.

I uninstalled, rebooted, and installed the software as advised in the technote. After several restarts it seemed to work, but the error message returned right after the next system restart.

I then looked at antivirus and antimalware software as a potential candidate for the trouble. I found a couple of registry entries that had been identified as “potentially unwanted” and quarantined.

I restored them, and after a restart I could start the VMs. Problem solved!

Err, not really.

The error message returned this morning … Damn.

Next, I looked into the Event Log. Not really helpful; it only said that something went wrong, with no further information.

But I could at least see a pattern. Each time the error occurred, it looked like some service had not been started because of missing dependencies.

Next, I ran services.msc and found the following.


I tried to start the services manually. Both services started without any errors, and I was also able to start the VMs.

I am not really sure what causes the service start to fail; it looks like some kind of bad timing.

I will now change the service startup from automatic to manual and add some start/stop scripts to my desktop.

I do not use the VMs on a daily basis, so starting the VMware services manually will also save some system resources.

Speaking at SNoUG

September 4, 2015 – 7:57 am

After 2013 and 2014, I again have the honor to speak at SNoUG (Swiss Notes User Group) in Zurich on 28-Oct-2015.

My session is titled “Honey, I shrunk the data!”. This session has been held a couple of times before at various user groups, but it seems that there is still strong interest in this topic.

I will not only cover data and design compression, DAOS, and some new compact features; the session also includes all things DBMT.

See you in Zurich!

MWLUG2015 – Session Slides And Sample Application

August 23, 2015 – 8:25 am

MWLUG 2015 in Atlanta was a blast. Thanks to all who organized the event. Also, a big thank you to everyone who attended my session.

Here is the presentation and the sample database & XML data



More Magic For TeamCity Build Automation

August 12, 2015 – 7:43 am

I recently added a great plugin to my TeamCity configuration that makes the deployment of new releases simple. The plugin adds some new deployers to the TeamCity server.

I am using the FTP Deployer.


The deploy-64 build step opens a connection to my FTP server and uploads files that have been built.

In additional steps, I create the template from the database that sits on my Domino Dev server, stamp it with the release and build number and also copy the license files and release notes into the release directory. The directory is automatically created when the files are deployed.

With this configuration at hand, I am able to create a new build for Windows 32/64 and Linux 32/64, add additional files, automagically increment the build number, and zip all the bits and pieces into a distro.tgz.


Building a commandline interface with boost::program_options

July 26, 2015 – 8:08 am

My ONTF project “DomBackUp” can be configured and run via the command line without having the configuration database installed on the Domino server.
DomBackUp is a fork of nKBackup by Tom Lyne et al. I found the command-line interface code a bit hard to maintain when it comes to enhancements.
So I was looking for an alternative, platform-independent way to create the command-line interface in C++.

boost::program_options is a library that makes it easy to parse command-line options, for example, for console applications.

Here is the code snippet from the DomBackUp project

try {
	namespace po = boost::program_options;
	po::options_description description("DomBackUp usage", PO_LINE_LENGTH);
	description.add_options()
		("help,h", "Display this help message")
		("source,s", po::value( &szSource ), "source file/folder")
		("dest,d", po::value( &szDestination ), "destination file/folder")
		("include-sub-dirs,inc", po::bool_switch( &bIncludeSubDirs )->default_value(false),
		"Include subdirectories, applies to folder backup only (optional, default = false)")
		("throttle,t", po::bool_switch( &bThrottle )->default_value(false),
		"Wait short time between file writes (optional, default = false)")
		("zip,z", po::bool_switch( &bZipAfterBackup )->default_value(false),
		"Move file to .zip archive after backup (optional, default = false)")
		("unique-filename,u", po::bool_switch( &bUniqueFileName )->default_value(false),
		"appends a unix timestamp to the archive filename (optional, default = false)")
		("application-type,a", po::value( &szAppType )->default_value("nsf"),
		"nsf,ntf,both = application type (optional, default = nsf only)")
		("input-file,f", po::value( &szInputFile ),
		"Backup all the files specified in an .ind file created in the Domino data folder to <dest>")
		("version,v", "Display the version number");

	po::variables_map vm;
	po::store(po::command_line_parser(argc, argv).options(description).run(), vm);
	po::notify(vm);

	if (vm.count("help")) {
		std::cout << description << "\n";
		return FALSE;
	}
	if (vm.count("version")) {
		AddInLogMessageText("V %s on %s", NOERROR, szVersionNumber.c_str(), szServer);
		AddInLogMessageText("Termination complete.\n", NOERROR);
		return FALSE;
	}
}
catch (po::error& e) {
	AddInLogMessageText((char*)e.what(), NOERROR);
	return (ERR(error));
}
First of all, you have to include some header files and (optionally) set a namespace alias. The only downside of using boost::program_options is the fact that you have to build the static .lib: while most parts of the Boost libraries are header-only, boost::program_options is not. But once you have the library at hand, it is easy to write the code.

The first line of code creates your interface and sets the title as well as the line length. The line length is 80 by default.

po::options_description description("DomBackUp usage",PO_LINE_LENGTH);

The next thing to do is to add, via add_options(), all the options you would like to control via the command line:

	("help,h", "Display this help message")
	("source,s", po::value( &szSource ), "source file/folder")

Each option can have a description and a command, both verbose (--help) and abbreviated (-h).
In addition, you can bind the command to a variable. The variable can be of various types: string, int, or bool.

For boolean variables, you can implement some kind of toggle mechanism.

	("zip,z", po::bool_switch( &bZipAfterBackup )->default_value(false),
	"Move file to .zip archive after backup (optional, default = false)")

In this example, bZipAfterBackup is set to false by default. To enable the feature, simply add --zip or -z to the command line. This will toggle the value from false to true.

The --help (-h) command does not have any value binding, so we have to add a parser and some other lines of code that are executed when the --help (-h) command is used.

po::variables_map vm;
po::store(po::command_line_parser(argc, argv).options(description).run(), vm);
if (vm.count("help")) {
	std::cout << "\n" << description << "\n";
	return FALSE;
}

This is what you get when you type “load nbackup -h”

Cool, isn't it? Just a couple of lines of code and you get a professional command-line interface.

As said before, the downside is the overhead of having the boost libraries included in your project.

But the footprint is minimal, even though the Boost libraries themselves are many gigabytes of code.

Building ONTF DomBackUp with TeamCity

July 26, 2015 – 7:00 am

Building binaries for multiple platforms from source is time-consuming. For my ONTF project “DomBackUp”, I have to build binaries for AIX (not sure if I can support AIX in future builds), Linux, and Windows, both 32-bit and 64-bit.

Aside from Atlassian JIRA and Stash, a couple of virtual machines are involved. I also started using TeamCity to automatically build the binaries without going to each of the build systems and invoking the build process manually.

TeamCity is a Java-based CI server package. The TeamCity installation and configuration is quick and easy, and the fact that it is Java-based should not be an impediment to Windows development shops.
The TeamCity server is the main component, and the browser-hosted interface serves as the primary way to administer TeamCity users, agents, projects, and build configurations.

The main advantages are

  • Easy to setup, use, and configure
  • Widely-used and well documented
  • Integration with a wide variety of tools and technologies
  • Professional Server is free for up to 3 agents and 20 build configurations.


You can create the build steps manually or let TeamCity search your Git repository for existing .sln files and propose build steps automatically. All you have to do is review the steps and select the correct build configuration (x64 or Win32). For the Linux builds, I use boost build scripts, so you only have to tell TeamCity to invoke the script when the build agent runs.

TeamCity will automatically grab the output from the build script. This makes it very easy to identify errors in the script itself or even compile errors.



Using a CI solution makes it very easy to create nightly builds. Thanks to the flexible configuration, you are able to create the binaries alongside any needed template, have CI update and harmonize the version and build numbers across all parts of your project, create release notes from JIRA, and put all the parts together into a shippable package.

The process can be triggered manually or scheduled to create nightly builds.

You are also able to rebuild any previous version by simply checking out the correct commit from your repository.

[CMake] How to turn off incremental linking

July 17, 2015 – 9:47 am

I recently had to build a static library using CMake. In the current CMake version, apparently something has changed in the linker settings.

The build always ended with an error:

LINK : fatal error LNK1123: failure during conversion to COFF: file invalid
or corrupt [D:\0.GIT\libarchive-3.1.2\vc10.32\CMakeFiles\CMakeTmp\cmTC_bbd0c.vcxproj]

Looking at the linker settings, I found that CMake seems to enable incremental linking by default, and this is the root cause of the trouble, at least with Visual Studio 2010:


C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin\link.exe /ERRORREPORT:QUEUE
/NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib
/MANIFEST /ManifestFile:"cmTC_bbd0c.dir\Debug\cmTC_bbd0c.exe.intermediate.manifest"
/MANIFESTUAC:"level='asInvoker' uiAccess='false'"
/DEBUG /PDB:"D:/0.GIT/libarchive-3.1.2/vc10.32/CMakeFiles/CMakeTmp/Debug/cmTC_bbd0c.pdb"
/MACHINE:X86 cmTC_bbd0c.dir\Debug\cmTC_bbd0c.exe.embed.manifest.res
cmTC_bbd0c.dir\Debug\testCCompiler.obj /machine:X86 /debug

I searched the web for a solution, but could not find anything that really works. So here is what I did to fix the issue.

In [CMakePrgmDir]\share\cmake-3.3\Modules\Platform\Windows.cmake,

modify line 242 (the line number might be different in other versions):

# add /debug and /INCREMENTAL:YES to DEBUG and RELWITHDEBINFO also add pdbtype
# on versions that support it
#set( MSVC_INCREMENTAL_YES_FLAG "/INCREMENTAL" )   # <-- comment out this line

With this modification in place, I was now able to let CMake create the project configuration files.

Code Quest

July 16, 2015 – 7:37 am

A couple of days ago, I had to investigate an issue in a rather old template. The template had been delivered to a customer ages ago. After downloading the template from the repository (no, not from a Git or SVN repository) and after Domino Designer had opened the template, I saw a lot of errors in the script libraries. No worries at this point, because all of those issues can be solved.


I then opened one of the libs that contains the specific function. The contents of that lib looked different from the current version; no surprise there.
Then I searched the template for the function. Nothing. The function was not there, although I could see from the code that it IS in the template.

Calls to that function did not show an error, so it had to be somewhere inside the code. But I could not find it.

I then opened the template in Ytria scanEZ and navigated to the lib that was supposed to contain the function. Aside from the usual $ScriptLib items, the design also had $ScriptLib_error items.

In the end, I found my function in one of the $ScriptLib items. This template had been built in Designer 7, and the current Domino Designer 9.0.1 seems to have a problem with this mix of $ScriptLib and $ScriptLib_error items when opening such a design element in the editor.

As an aside, I found that “Recompile All LotusScript” apparently has no problem finding the correct code; all code compiled without any complaint.

Next, I deleted the $ScriptLib_error items from the design


I dismissed the warning (don't do that if you do not know what you're doing!!)


and after another recompile and a reopen of the template in Domino Designer, the template was fully functional again.