Photogrammetry Testing 10: MicMac

Here’s the original post, and links to all posts

I have previously outlined my goal of testing multiple photogrammetry solutions on a single dataset, and reporting times and results.

I’m using a dataset based on photographs of this Styracosaurus model (I’ve had it since I was quite young):


The dataset has 53 photos in total, and is available from this link. [This will be moved to figshare in due course].

The model is about 12 cm in total length, has texture for the scales, and a reflective brass nameplate on the base. The model was situated on a glass desk, and there may be reflection issues from that.

I’ve had a few requests to take a look at MicMac. I’ve dabbled with it before, and I’ve seen plenty of praise for it online, but because it – and the website you get it from – are in French, I [like a typical English speaker] haven’t invested much time in trying to suss it out.
Hold on to your pants, folks, because I’m going to fumble through. Most of what I’m doing I took from the English-language wiki. You can download the latest binaries from here.

As far as I can tell, there’s no main graphical interface, and this thing is run entirely from the command line. The default install went to C:\MicMac64bits for me, and for the sake of simplicity [i.e. so I don’t have to call binaries from a different directory] I’ve stuck my folder of photos, “Styrac”, in the MicMac binary folder. Not great practice, but it’s convenient here.
So I’ll open a command prompt inside the images folder, ‘Styrac’.
As an aside, the most recent version of Windows 10, the Creators Update, allows execution of Windows binaries from the Bash on Windows command line, which is ace. As such I’m running everything through bash so I can use the unix ‘time’ command to time how long each stage takes. However, that means the GPU is currently off limits (I think), though to my knowledge MicMac doesn’t leverage the GPU anyway.
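For the curious, the timing approach can be sketched as a small bash wrapper. This uses `date` rather than the `time` builtin so the result is easy to log; `run_stage` and `timings.log` are my own names, not part of MicMac, and `echo` stands in for the real binary so the sketch runs anywhere:

```shell
# Hypothetical timing wrapper -- run_stage and timings.log are my own
# inventions, not MicMac tooling. Assumes mm3d.exe sits one directory up,
# as in the folder layout described above.
run_stage () {
  label="$1"; shift
  start=$(date +%s)
  "$@"                              # run the actual command
  end=$(date +%s)
  elapsed=$(( end - start ))
  echo "$label: $(( elapsed / 60 ))m $(( elapsed % 60 ))s" | tee -a timings.log
}

# Example invocation (echo stands in for ../mm3d.exe here, so this is a dry run):
run_stage "Tie point search" echo ../mm3d.exe Tapioca MulScale ".*.JPG" 500 2500
```

On a real run you would drop the `echo` and call `../mm3d.exe` directly.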

Tie Point Matching

The first command to run is:

 ..\mm3d.exe Tapioca MulScale ".*.JPG" 500 2500

The ‘Tapioca’ command tells MicMac to look for tie points. ‘MulScale’ means it will first run through the images at 500px resolution (500 pixels on the longest side) to find the most likely matches, before running at a higher resolution, in this case 2500px. Note that ‘Tapioca’ and ‘MulScale’ are case-sensitive.
I had a load of errors about my Sony Nex-6 that zoomed by, but it kept on trucking.
Time taken for tie point search: 5m 54 seconds

Internal and Relative Orientation

It’s not helping that the MicMac process uses different terms from those I’ve come across before… this seems to be a portion of what I’d normally call matching cameras?
The command is:

 ..\mm3d.exe Tapas RadialStd ".*.JPG" Out=MyFolder

We’re creating a new sub-folder here called “MyFolder” in which ‘stuff’ will be stored (I have obviously not delved into the nitty gritty for this one) – you can call this whatever you want.
Time: 1m 33 seconds

Visualize Relative Orientation

This command just outputs stuff so we can see the cameras and sparse point cloud in MeshLab:

 ..\mm3d.exe AperiCloud ".*.JPG" MyFolder

Time taken: 1m 43 seconds
Here’s said sparse point cloud (AperiCloud_MyFolder.ply) visualized in MeshLab:


So far so good, seems everything is in order.

Image Masking

Now we can mask the sparse point cloud. Run:

 ..\mm3d.exe SaisieMasqQT AperiCloud_MyFolder.ply

This opens a GUI showing the point cloud and cameras:


Zoom and move the mouse according to the controls at the bottom of the window, then hit ‘F9’ to change to selection mode and draw a polygon around the area you’re interested in (in this case, the point cloud actually contains very little that isn’t of interest, so I’m being fairly liberal with my selection). Left click to build the polygon, and right click to close the loop:


Then go to Selection->Add inside polygon to selection (I just selected all the points), then File->Save Selection Info. This leaves two files in your images folder, in this case AperiCloud_MyFolder_polyg3d.xml and AperiCloud_MyFolder_selectionInfo.xml.
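Note that nothing here passes the mask to the next stage automatically. As far as I can tell, the polygon XML gets handed to the dense matcher via a Masq3D argument – that parameter name is an assumption on my part, so verify it with `mm3d C3DC -help` before relying on it:

```shell
# Assumed usage -- the Masq3D parameter name is from memory, not from the
# tutorials; verify with `mm3d C3DC -help`. The echo fallback lets this
# sketch run on a machine without MicMac installed (it just prints the
# command line instead of executing it).
MM3D=../mm3d.exe
command -v "$MM3D" >/dev/null 2>&1 || MM3D="echo $MM3D"

$MM3D C3DC BigMac ".*.JPG" MyFolder Masq3D=AperiCloud_MyFolder_polyg3d.xml | tee c3dc_cmd.txt
```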

Dense Correlation

The command to use is:

..\mm3d.exe C3DC BigMac ".*.JPG" MyFolder

Time taken: 19m 0 seconds

We’ve now created a dense point cloud, which appeared in my Styrac folder as “C3DC_BigMac.ply”. Here it is in MeshLab:


This is pretty poor, especially if we view the other side:


There are tools described on the website (but not in the tutorials) for meshing and texturing, but honestly the reconstruction is so poor I’m not going to bother.

Conclusions and Notes

Before commenting on the quality of the model, I want to point out that MicMac saves all intermediate steps and files, which in this case left >20 GB of files in my images folder.

The model is, obviously, pretty poor. The time taken wasn’t great either, and the tools are the hardest to use I’ve encountered yet.

Ultimately I can’t recommend MicMac to a novice user when there are so many clearer, easier-to-use packages available. I have no doubt as to the power of getting into the settings in MicMac, but by god it’s not straightforward to use, and the tutorials and instructions available are not great. If you have plenty of time you could probably get much better results, and I’d be interested to hear from people who’ve run this dataset through MicMac.
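For anyone wanting to replay the above in one go, the five stages can be strung together in a single script. The `mm3d` wrapper here is my own addition: it falls back to echoing each command when MicMac isn’t on the machine, so the script can be sanity-checked anywhere. On a real run, SaisieMasqQT will pause the script while its GUI is open:

```shell
#!/usr/bin/env bash
# All five MicMac stages from this post, in order. Assumes the script runs
# from the images folder ("Styrac") inside the MicMac binary directory, as
# described above. The wrapper function is my own convenience, not MicMac's.
set -e

mm3d () {
  # Dry-run fallback: if mm3d.exe isn't present, just print the command.
  if command -v ../mm3d.exe >/dev/null 2>&1; then
    ../mm3d.exe "$@"
  else
    echo "../mm3d.exe $*"
  fi
}

mm3d Tapioca MulScale ".*.JPG" 500 2500    | tee -a pipeline.log  # tie points
mm3d Tapas RadialStd ".*.JPG" Out=MyFolder | tee -a pipeline.log  # orientation
mm3d AperiCloud ".*.JPG" MyFolder          | tee -a pipeline.log  # sparse cloud
mm3d SaisieMasqQT AperiCloud_MyFolder.ply  | tee -a pipeline.log  # masking (interactive GUI)
mm3d C3DC BigMac ".*.JPG" MyFolder         | tee -a pipeline.log  # dense correlation
```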


14 thoughts on “Photogrammetry Testing 10: MicMac”

  1. Hi!
    I have tried to run your dataset in MicMac and ran into some issues:

    1) Not all images are taken at the same focal length according to the EXIF data (25mm and 37mm), which makes me think they might actually all have slightly different focal lengths, since a zoom lens can’t really be stable away from its extrema (here, 16 and 50mm). Unless told otherwise, MicMac does not try to find a different camera calibration for each image reporting the same focal length, which leads to suboptimal calibration (some form of averaging of the calibrations of all images reporting the same focal length). MicMac’s default behaviour is to expect imagery acquired with a camera with a stable internal orientation (the photogrammetric term for the lens and camera calibration).

    2) Image quality is not the greatest (ISO 2000 to 3200 on a NEX-6 is quite noisy, and sharpness was an issue, see image DSC09945.JPG for instance) so I had difficulties/failed at finding tie points for the images of the back side of the statue on the side of the plaque.
    Removing these images from the bundle returned an acceptable product, with a noticeable hole there…

    For this very dataset, MicMac is indeed not the best tool, but I am sure you would have a different conclusion with a different dataset (as some authors have published in scientific journals).

    1. Thanks. Yeah, the dataset is far from ideal, which helps me test the robustness of software. I’m sure MicMac can produce strong results with ideal photos – the same would be true of most of the software I’ve tested. My experience has generally been that robustness to variable parameters (e.g. a zoom lens) is absolutely vital to making sure multiple lab members can collect data in the field, or in the lab with a range of devices.

      OpenMVG suffers similarly with my datasets – it’s very rare I can get a good camera alignment with OpenMVG.

      Thanks for running the dataset through your workflow!
