Using World Machine in a project with a seamless open world of 262x262 km

Hello Guys!

I would like to share our experience of using World Machine with you. We have been developing a large-scale project based on real geospatial data. We have already finished the project’s first stage and released a preview. The demo was showcased at ITEC 2013 in Rome in May and was highly appreciated by visitors to our booth.


By the way, we were asked a lot about the program we used in our project for landscape refinement, so we did some advertising for WM =)
Your program is the only one that can process such a huge amount of satellite data, but there are some issues I would like to draw your attention to.

[hr]

–About the Project–

The project recreates a real place on the Earth’s surface (based on satellite data) using the Unigine engine. The greatest variety of free, highly detailed satellite data is available for the United States, so we chose a 262x262 km area in Washington State, USA. As source data for the project we used the following maps: an Elevation map with 10 m precision and a LandCover map with 30 m precision (a surface-type map where each type is marked with a specific color). We decided not to use Imagery maps, since they are patchy and filled with a huge number of artifacts such as shadows, smoke, objects on the surface, different lighting conditions and so on.

Since the accuracy of this source data was not sufficient for a realistic representation of the surface, we needed to find a way to add detail to it. We also needed some supplementary data, such as a tree density mask, terrain material masks etc., derived from the existing data. That’s why we turned to World Machine.

Technical details on the project:
Landscape geometric size – 262144x262144 m. Maximum height – 4400 m. The whole landscape is split into 16x16 terrains.
Height map resolution – 4 m/pixel (we plan to decrease this to 2 m/pixel).
Diffuse texture, Normal Map and Splat Mask (covering 5 materials) resolution – 2 m/pixel, so each texture is 131072 pixels across. With the help of the Tile Builder device we split the textures into 64x64 tiles of 2048 pixels each.
A great number of objects (trees, rocks, branches etc.) are placed on the surface by the Unigine engine. Each object type needs its own bitmap mask that controls its distribution over the surface. Currently there are 7 masks with a resolution of 8 m/pixel.
The calculation took approximately 3 days on a computer with the following specs: AMD FX™-8350 eight-core processor at 4 GHz, 32 GB RAM.
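For reference, all of these numbers follow directly from the 262144 m world size; a quick arithmetic check in Python (just an illustration, not part of our actual pipeline):

```python
# Hypothetical helper: verify the resolutions quoted above.
WORLD_SIZE_M = 262144  # landscape edge length in meters

def pixels(world_size_m, meters_per_pixel):
    """Texture edge length in pixels for a given ground resolution."""
    return world_size_m // meters_per_pixel

print(pixels(WORLD_SIZE_M, 4))          # height map: 65536 px
print(pixels(WORLD_SIZE_M, 2))          # diffuse/normal/splat: 131072 px
print(pixels(WORLD_SIZE_M, 2) // 2048)  # Tile Builder: 64 tiles per side
print(pixels(WORLD_SIZE_M, 8))          # object masks: 32768 px
```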

[hr]

–Creation–

The creation of the project was split into 2 stages (each stage required its own TMD file):

  1. The first stage was working on the source height map. At this stage we filtered unnecessary artifacts out of the height map, detailed the landscape by means of erosion, added extra detail to the mountain relief, and created the sea floor and the snow caps.

As the input for World Machine (Tiled File Input device) we used the source height map in VTP BT format (downloaded with GlobalMapper and split into tiles).
As the output we got the Height Map, Erosion Flow Map and Snow Depth Mask. We merged these maps into an RGB16 texture for further work in the second stage (see the packing sketch after this list).

  2. At the second stage we created the Diffuse map, Normal map, Splat masks and the masks for objects.

    As the input here we used the RGB16 texture obtained at the first stage, plus the LandCover map.

    The main idea of this stage was to split the landscape into distinct areas (sand, rocks, stones, ground, meadows, bushes, snow etc.) depending on the data from the LandCover map and the Height map. Then each area was filled with textures and detailed according to its own rules, using previously prepared Diffuse and Bump Maps.
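To illustrate the packing mentioned in stage 1, here is a minimal sketch of merging three 16-bit grayscale maps into a single RGB16 texture with NumPy (the file names and the use of tifffile as the 16-bit writer are my assumptions, not the exact tools in the pipeline):

```python
import numpy as np
import tifffile  # assumed writer; any 16-bit-capable RGB format would do

# Load the three single-channel maps produced at stage 1 (hypothetical paths).
height = tifffile.imread("height_16bit.tif")        # uint16, HxW
flow   = tifffile.imread("erosion_flow_16bit.tif")  # uint16, HxW
snow   = tifffile.imread("snow_depth_16bit.tif")    # uint16, HxW

# Pack them into one RGB16 image: R = height, G = erosion flow, B = snow depth.
rgb16 = np.dstack([height, flow, snow]).astype(np.uint16)
tifffile.imwrite("stage1_packed_rgb16.tif", rgb16)
```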

[hr]

–Issues and their possible solutions–

–Why I use two files in the project:

  1. It’s not always necessary to compute the data at the final resolution.
  2. The behavior of the Erosion device depends heavily on the resolution you work at. At low resolution the result is realistic and varied, while at high resolution the result becomes too homogeneous. Besides, when computing at high resolution we have to use smaller tiles, and as a result the erosion channels stop being seamless across the whole landscape. (Incidentally, in my opinion, one good way of creating fairly realistic stream channels in World Machine is to apply the Erosion device sequentially at different resolutions.)
  3. Some stages of the work need data that has already been pre-computed.

What would be needed in order to use only one file:

  1. Support a different working resolution for each device. As an example, you can see how this is implemented in Allegorithmic Substance Designer. It would make it possible to optimize the calculations where possible (blur operations particularly need to be optimized), and would also allow more sophisticated erosion control.
  2. Allow the cache to be saved to files at intermediate stages of the work. As I’ve read on this forum, this option is already planned.

–Long load times when working with bitmap data.
Currently the second file includes two Tiled File Input devices: the first at full resolution – 32768x32768, the second – 6864x6864. As a result, opening the extent takes about 6 minutes. If we need to increase the input data resolution or add more high-resolution inputs, this time will grow to unacceptable values. We’ve already covered this problem here - http://forum.world-machine.com/index.php?topic=2091.0. As far as I can see, it is caused by the cache pyramid creation used for the Layout View and Explorer View modes.

How to solve the problem:

  1. As was suggested in the forum topic, add an option to save the already calculated cache data. However, building it will still take a while, especially with large amounts of data, and the process is most annoying when you don’t use the Layout View and Explorer View modes at all.
  2. Add an option to the Tiled File Input device to disable the cache pyramid creation, which would entirely solve the loading problem. It would limit the usage of the Layout View mode, but I could enable it only on the devices where it is needed.

–No way to change or disable the interpolation of input data.
As far as I can tell, if the input texture is larger or smaller than the output, bilinear filtering is used by default. When working with height maps, bilinear interpolation gives a low-quality result – it would be much better to use trilinear interpolation. LandCover maps are even worse off – they should not be filtered at all, because each surface type has its own specific color. Right now, blended colors appear along pixel seams, which makes mask generation much more complicated. To minimize this effect, I’m forced to build quite a complicated filtering setup in WM.
So it would help a lot if such an option were added to the File Input and Tiled File Input devices.
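As a workaround outside of WM, categorical data can be resampled with nearest-neighbor before it is fed in; a minimal NumPy sketch (file names and resolutions below are just illustrative):

```python
import numpy as np

def resize_nearest(landcover, out_h, out_w):
    """Resize a categorical map without blending class colors.

    Each output pixel copies the nearest source pixel, so no intermediate
    ("secondary") colors ever appear along class borders.
    """
    in_h, in_w = landcover.shape[:2]
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return landcover[rows[:, None], cols]

# Example: upscale a 30 m/pixel LandCover map to the 2 m/pixel working grid.
# landcover = ...  # HxW (or HxWx3) array of class colors
# upscaled = resize_nearest(landcover, 131072, 131072)
```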

–Integer parameter problem in the Expander and Blur devices.
If you enable the Scale independent option, a value of 1 is assigned to the integer parameters, which is too large, and there is no way to reduce it further. You can disable the Scale independent option entirely, but then calculations at different resolutions give completely different results.

How to solve the problem:

  1. Use float values instead of integers for these devices.
  2. The perfect solution would be an option to set the values in meters.

–There is no device that can bias/stretch/distort pixels depending on a direction.
Let’s look at an example of using such a feature. Mountains differ greatly in size and structure. Over time they collapse and crumble, and the debris flow from each mountain has its own specific color. Such a device would make it easy to spread that color down the mountain slope. It would also help to create dirt streaks on snow surfaces.
Analogs of such a device are the Refraction and Offset components of FilterForge.
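To show roughly what is meant, here is a minimal sketch (NumPy, my own approximation rather than anything WM or FilterForge provides) that smears a mask downhill by repeatedly pulling each pixel from its uphill neighbor:

```python
import numpy as np

def smear_downhill(height, mask, steps=32):
    """Crude stand-in for a directional offset device: drag `mask` values
    downslope so debris colors run down the mountain sides.
    `height` and `mask` are HxW float arrays."""
    gy, gx = np.gradient(height)  # gradient points uphill
    ys, xs = np.indices(height.shape)
    src_y = np.clip(ys + np.sign(gy).astype(int), 0, height.shape[0] - 1)
    src_x = np.clip(xs + np.sign(gx).astype(int), 0, height.shape[1] - 1)
    out = mask.astype(float).copy()
    for _ in range(steps):
        # Each pixel keeps the stronger of its own value and its uphill
        # neighbor's, so bright debris areas stretch downhill step by step.
        out = np.maximum(out, out[src_y, src_x])
    return out
```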

–No way to work with relative values in devices such as Clamp and Constant.
For some operations I need to set specific values – 0.5 or 1. But currently these values are specified in meters. This means that if you need to change the maximum height of the landscape for some reason (for example, because the area being computed changed), you have to carefully go through all the devices and update these values by hand. It would be great to be able to switch between meters and relative values.
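For illustration, the conversion I end up repeating by hand is trivial, which is exactly why hard-coded meter values are so easy to get wrong; a tiny sketch (the 4400 m figure is simply our project’s maximum height):

```python
MAX_HEIGHT_M = 4400.0  # this project's maximum landscape height

def relative_to_meters(value_rel, max_height_m=MAX_HEIGHT_M):
    """Convert a relative [0..1] value to meters for devices like Clamp/Constant."""
    return value_rel * max_height_m

def meters_to_relative(value_m, max_height_m=MAX_HEIGHT_M):
    return value_m / max_height_m

print(relative_to_meters(0.5))  # 2200.0 m in this project
```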

–A lot of hard drive space is needed for the calculations.

  1. Temporary files grow enormously during the calculations. In my experience, the temporary files took almost 1 TB while the resulting files were about 200-300 GB in size. Moreover, the data is written to the resulting files only after it has been calculated into the temporary files, so if the free space on the hard drive runs out, the calculation stops and all the data is lost.

How to solve the problem:

  1. Perhaps the solution is to write out the result files as soon as they are calculated and then clear the temporary area. It would also help to show a warning when there is not enough free space on the hard drive.
  2. There is no option to set a separate resolution for the output data. Having one would allow the data to be calculated and stored at a lower resolution.
  3. The output formats for 8-bit grayscale data are not optimal. There is no low-precision PNG output, and TGA files are always saved as RGBA even though the format allows storing just one channel.
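As a post-processing workaround, RGBA outputs can be collapsed to single-channel 8-bit files; a minimal sketch using Pillow (file names are hypothetical):

```python
from PIL import Image

# Load a mask that WM wrote out as RGBA and keep only one channel.
rgba = Image.open("tree_density_mask.tga")
single = rgba.getchannel("R")  # 8-bit grayscale ("L" mode) image

# Re-save as a plain 8-bit grayscale PNG, roughly a quarter of the RGBA size.
single.save("tree_density_mask.png")
```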

–Usability:

  1. A tool for selecting all devices, groups and nodes at once and dragging them in a single action is sorely missing. Right now, if I need to insert blocks into the middle of a network, it takes a backbreaking amount of work!
  2. Our networks contain a huge number of links that intersect and distract from the work.
    As a solution, I would add an option to connect input and output blocks without drawing the link, so that the connection is only implied. It would simplify building the network and reduce the number of visible links.

[hr]

–Future Plans–

We are now starting to work on objects such as lakes, rivers, roads and cities. Obviously, the most appropriate approach is to use vector data from sources such as OpenStreetMap. But I can already say that World Machine doesn’t have tools for working with large vector datasets: the tools available now only allow working with vectors manually or within very limited options. As a workaround we can certainly convert this data into bitmap format, but those bitmaps would have to be high resolution, and working with them will hardly be convenient because of the long loading times.
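For reference, this is the kind of conversion we mean; a minimal sketch of rasterizing vector polygons to a bitmap mask with fiona and rasterio (the library choice, file names and origin coordinates are my assumptions):

```python
import fiona                      # assumed: reads OSM-derived shapefiles
import rasterio.features
from rasterio.transform import from_origin

# Rasterize water polygons (e.g. exported from OpenStreetMap) to a 2 m/pixel mask.
PIXEL_SIZE = 2.0
WIDTH = HEIGHT = 131072           # matches the project's texture resolution

with fiona.open("osm_water_polygons.shp") as src:  # hypothetical file
    shapes = [(feature["geometry"], 255) for feature in src]

transform = from_origin(west=500000.0, north=5300000.0,  # placeholder origin
                        xsize=PIXEL_SIZE, ysize=PIXEL_SIZE)

mask = rasterio.features.rasterize(
    shapes, out_shape=(HEIGHT, WIDTH), transform=transform,
    fill=0, dtype="uint8")
# `mask` can now be written out as an 8-bit bitmap and fed to WM or the engine.
```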

[hr]

I would like to thank you, Stephen, for the continuous work and the frequent updates! But if you want the program to be used in big, serious projects and even become a standard in this area, you should pay attention to the details. I hope my report will be useful to you.

Hi there,

Thanks for the write-up! Your project looks great. And thanks also for your candid feedback on what works and what doesn’t for your workflow!

You are absolutely right about the challenges with “big world” data. This has been a great growth area over the last year or two for World Machine, and represents an awesome opportunity to create focused enhancements to improve the product.

I’ll comment on this more pretty soon, but improving WM’s ability to function at scale is definitely one of the development directions we could push for the next version.

As a game enthusiast and independent developer, I am compelled to say that Unigine’s business model is antiquated. As an enthusiast I’m disappointed that so few titles utilize the engine. As an independent developer I personally have been denied the opportunity to even see what the engine has to offer; not that I have tens of thousands of dollars to put in front of anyone anyway.

There is tremendous value in having more users - as many as possible - working with your software. Call me crazy, but if Unigine isn’t attracting the big names that can afford to write all those zeros on a check, it’s probably time to consider getting behind independents.

Unigine?! Pffff… We won an indie license ~2 years ago. The engine is spaghetti code and slow, and support is non-existent (unless, I suppose, $40k is spent on a source license). So after a couple of years spent on it and not being able to get anything done (small indie devs don’t have 10 C++ coders on staff, which is what Unigine needs), we were cut off from all updates, forums, docs, etc.

Indies should stay as far away as they can from this engine. It’s designed to be used in simulation apps by large agencies who can throw money at support and tools (there are no Blender add-ons for Unigine, nor any free specs someone could use to make add-ons).

UNIGINE is targeted at medium and large projects, where it fits their needs very well; our customers are usually very happy with the code quality and support.
Some other products are better suited for indie developers, but I guess that’s off-topic in this thread.

Big, big +1 for the request for an Allegorithmic Substance-like “varied resolution” solution.
At the moment I’m eating the CPU time instead of splitting files, but I use early input devices in 32K+ generation that would be just fine at 4K themselves yet take forever to compute; this would fix that for me without needing to split files!

In the same vein - and it seems like it would be low-hanging fruit - I would appreciate being able to produce output at a lower resolution per output device. E.g., I need a 4096 height map but only a 2048 splat map.

Not that it’s a big deal to resize…

The good news is that multiresolution worlds will most likely be the next important new feature to be released!

They’re actually already up and running fine in my dev build, but the UI controls still need to be built out. You will be able to specify individual resolutions for devices and groups of devices, and possibly set a per-type default as well. So, for example, some or all of your heightmaps could be done at 2048x2048 but then be upscaled when used to drive textures… And of course, you could also do expensive “earthmoving” at a lower res, saving a lot of time on high-res builds.

I’ll be talking roadmap and more details very soon…

Any chance for LOD iterations to be generated for each tile? :)

That’s very, very good news to me! I just finished a full 4-day render at 32K that could have gone much quicker if some nodes had been lower-res with little quality loss, so it’s a very welcome change! Do you have some form of ETA? Will it be possible to get an (unsupported) early beta?
Thanks!

I’ll be announcing a bunch of dev-direction-related news soon, but in terms of multires support I hope to have an early dev build out, with a possibly non-final implementation, within a couple of weeks.
