DeltaEngine

AltDevConf Mono Slides about C# in Games

by Benjamin Nitschke 16. February 2012 13:46
Miguel de Icaza spoke at the AltDevConf a few days ago about .NET in games, especially C# with Mono.

There are some interesting points in the slides and some nice history lessons.

We are also honored that both our game SoulCraft and the Delta Engine are mentioned at the end.


Here are the slides: http://tirania.org/slides/AltDevConf-2012-Mono.pdf

Delta Engine Articles on PocketGamer.biz

by Benjamin Nitschke 29. November 2011 15:13

Our CEO Karsten was recently interviewed about Multiplatform development. Here is the article on PocketGamer.biz from yesterday:

http://www.pocketgamer.biz/r/PG.Biz/Delta+Engine/news.asp?c=35600

 

A few days ago PocketGamer.biz also published an article about our Delta Engine release and iOS, Android and WP7 support:

http://www.pocketgamer.biz/r/PG.Biz/Delta+Engine/news.asp?c=35571

Enjoy :D

Miguel de Icaza speaks about Xamarin and Delta Engine

by Benjamin Nitschke 14. September 2011 18:47
In one of the latest episodes of the awesome .NET Rocks! show, Miguel de Icaza and Nat Friedman speak about their new company Xamarin, which offers MonoTouch for iOS and Mono for Android, and they also mention our Delta Engine at the end. I have been listening to the show for 8-9 years (on and off, sometimes trying to catch up; they already have almost 700 episodes) and while it covers nothing about game programming, it is one of the best, most fun and most informative sources of .NET development information you can get (aside from reading the usual news from sources like CodeProject ^^).

Anyway, thanks Miguel for mentioning us. The public beta of the Delta Engine is also coming soon, and we hope to be able to announce some pretty cool news about our licensing model shortly :)

Also don't forget to check out the currently running Build conference from Microsoft about Windows 8. Lots of cool information about Windows 8, tablets and the direction Microsoft is heading. We obviously like all platforms, and every platform Microsoft and Mono provide .NET support for is a first-class citizen of the Delta Engine (which is coming soon; the public beta starts in 2.5 weeks) :)

Delta Engine Multiplatform Market Research and Engine Comparison

by Benjamin Nitschke 6. May 2011 07:17

This is the presentation I held at the Quo Vadis conference in Berlin on 2011-05-03.

The topics of this presentation are:

  • Why Multiplatform?
  • Market Analysis
  • Engine Comparison
  • Mobile Development is challenging (10 most important tips)
  • Example Game on iPhone, Android, WP7
  • Our Solution: The Delta Engine


Download the presentation here (11 MB, DeltaEngineMultiplatformDevelopmentPresentation2011-05.ppt).

You can also check out the example game Blocks and its source code on our Delta Engine Examples page.

People still play Arena Wars from good old 2004

by Benjamin Nitschke 15. March 2011 20:22
This video was recorded today by a fan. Pretty cool for our good old game, which we developed as students in 2003 (with the release in 2004), now 7-8 years ago. I still think the graphics are totally fine for a simple RTS game.
Fun stuff :) Maybe we should bring that good old game to our great Delta Engine some time soon ^^

Delta Engine Interview from the CeBIT 2011

by Benjamin Nitschke 14. March 2011 16:51
Here is a small interview I gave for Channel 9 / MSDN at CeBIT 2011 a week ago. It is in German and I only talk briefly about the engine and the SoulCraft Tech Demo we showed there.


Original link: http://channel9.msdn.com/Blogs/Lori/CeBIT-2011-MobileBits

CeBIT 2011 Pictures

by Benjamin Nitschke 8. March 2011 10:51
The CeBIT is now over. Here are some quick pictures I took over the last few days. I hope you enjoy them :) All pictures are also in our gallery.

The entrance of the CeBIT. CeBIT 2011 was especially crowded on Friday and Saturday.


Preparing for an interview for Microsoft's Channel9 TV. I explained what the Delta Engine is about, showed our flyer and the demo running on several devices, and also talked a bit about developing multiplatform games.


On Saturday I gave a short talk in the MSDN Tech-Room about Multiplatform Development and showed our SoulCraft Tech Demo. You can find the presentation here. After the presentation lots of people came and asked questions, and we probably had more visitors at our booth than in the rest of the week combined ^^


Yummy food at Microsoft's booth. The main topic of the CeBIT was Cloud Computing, but I always thought it is nothing new; we have all had our email in the cloud for 10-20 years. Once there is huge demand for the engine, we will move our build and content servers into the cloud as well, but for now our single server handles everything with ease.


And here is one extra photo at the end of the week with the other Microsoft Partners sharing our booth at P26.


Text and graphics displayed with just water. Pretty cool and amazingly precise. But someone really wanted to write "city" in a strange way ;)


Otherwise, on Friday and Saturday the crowd went crazy as usual whenever there was something to win or a pen was thrown into the masses. Lots of colorful booths too, but mostly boring business stuff.

I did not see much hardware, but there were plenty of 3D monitors (nothing really amazing, however) and millions of Android tablet devices, like the ones from this Korean company. At least 10 different tablets can be seen in this picture.


In halls 19 and 23 there was a lot of eSports action and loudness going on. Players fought in League of Legends, Quake Live, Counter-Strike and StarCraft 2 at the Intel Extreme Masters. Because most of this was on Saturday and I had to be at the booth all day, I missed most of it :( You can watch the VODs online on ESL, however.

Servers and network cabling at the Microsoft booth. Our server room looks much more chaotic, and we do not have this many cables.

In hall 2 there were a lot of Open Source booths: the Linux Tux penguin, OpenOffice and stuff like that.

On Friday evening some guys from our team also went to the Serious Game Awards, which was pretty cool: free drinks and food and a good award ceremony. But otherwise the games were really boring and serious ^^

On Thursday Boje and I also went to the Afterwork Clubbing party by Microsoft in the Funhouse. The party was huge, with DJs, lots of food, drinks and cocktails, and plenty of really drunk Microsoft guys :D

That's it for my CeBIT 2011 pictures. On Saturday evening there was also a nice get-together with all the other partners and Microsoft people, with some beers and food. After that I just fell into bed and slept most of the rest of the week.

SoulCraft Tech Demo was shown at CES by NVidia

by Benjamin Nitschke 5. January 2011 23:08
A small tech demo for our upcoming mobile game SoulCraft was shown by Nvidia at their CES 2011 press conference. More details can be found in the much longer development blog post I just made. SoulCraft is an RPG coming out this year that runs on iPhone, iPad, Windows Phone 7, Android phones and tablets, as well as on PC, Xbox 360, MacOS and Linux.

Here is a quick video of the early SoulCraft Tech Demo. Remember that this is only an early build and we are happy to have it running on Android Tegra hardware. We will improve this demo over the next month and then have more amazing things to show :)


Here is a photo from today's NVidia CES 2011 press conference while our SoulCraft Tech Demo is being shown on an LG Tegra Android device:


And finally some screenshots from the demo:






Again, if you want to read more, go to the next blog post, lots of details there.

Delayed blogging of building the first SoulCraft Tech Demo version

by Benjamin Nitschke 5. January 2011 22:14
Note: If you just want to see some screenshots and a video of the CES SoulCraft Tech Demo, go to the next blog post; this post is all technical and boring :D

Welcome to my super big blog post about the SoulCraft Tech Demo, more specifically the first iteration of the demo for the Nvidia CES press conference, which was today. Nvidia was showing their Tegra 2 chip on different devices like LG mobile phones and Acer tablets, and our SoulCraft Tech Demo is among the things they showed; my colleague Karsten was on stage today (20 minutes ago) and helped present it. Here is a photo from the event with our demo on screen.


Instead of blogging normally about the development process I decided to do delayed blogging of the last 8 days of developing this first version of our tech demo. Delayed blogging means I wrote down what I did every day, but did not publish it; instead it is now all published at once. Reasons for this are:
  • This was a top secret project for NVidia's CES press conference ^^
  • I also did not want to piss anyone off by blogging about problems with tools or hardware.
  • No one would have believed that we could pull it off (just look at the first screenshot).
  • Even internally everyone was very skeptical. If we failed and were not able to get the demo running on the Nvidia Tegra device, I could just wait until it finally worked. The CES was NOT a milestone for us; this demo was planned to be done by mid-February.
  • It is a nice way to blog the daily progress without having to think too much about making beautiful screenshots; just the end result counts :)

I am not going to talk a lot about what happened before (this blog post is already long enough). As mentioned in my previous blog post, we spent most of our time in the last months on the Delta Engine and optimizing 2D and 3D rendering for all platforms we support (iPhone, iPad, iPod Touch, Android, Windows Phone 7, PC, Xbox, etc.). These platforms are very different, and some of them, like PC and Android, come in hundreds of possible configurations and performance levels. Some platforms do not support custom shaders at all (like Windows Phone 7, ES 1.1 Android devices or older iPhones) or not very well (like slower Android devices or even the iPad, which just has too many pixels to fill for its slow GPU). These are all things we have to worry about when designing our engine and building shaders and 3D content.

Day 1 (day after Christmas)



Well, this is not really Day 1 of the SoulCraft Tech Demo; it has nothing to do with SoulCraft yet, and it was actually the whole Christmas week (20-26 Dec 2010). The result of all the hard work is still boring: all I worked on the whole week was to get a line on the screen. This might sound ridiculous; everyone knows it should not take much more than a line of code to put a line of red pixels on the screen, but in this case that line was a crazy amount of work. The reason for all this work, which even goes back to November, is our new rendering system, which is fully dynamic and can handle all kinds of vertex formats and data on all the very different platforms we support. When working on the Delta Engine it is always easier to focus on one specific task and optimize it like crazy, which is how the whole Delta Engine is built. We rarely have to go back to optimized parts and rethink them; they usually kick ass (like our module system, the rendering system, the input system, etc.).

We spent roughly 80% of our time in December getting all of the following to work (obviously not just for one single line, but for everything else to come; this line proves it all works now):
  • Dynamic shader creation with many different features that can all be combined and customized, fully optimized for each platform; most tweaking work goes into this.
  • Fully flexible vertex formats (you can put whatever data into them and they work on all platforms). This makes our messy optimization code for iPhone, Android and WP7 much easier; we just use shorts, bytes or floats and can switch formats around with some flags, without having to rewrite huge portions of code (it took many weeks to put those iPhone optimizations in place).
  • All rendering is now fully abstracted away; models have no knowledge of any vertex formats, shaders, geometry data, vertex buffers, etc. We just pass in the precomputed geometry (or calculate new geometry on the fly, as for lines, effects, unit tests, etc., which can easily be done millions of times per second) and let the Delta Engine handle all the rendering and optimizations. This makes writing rendering code A LOT easier; namespaces like Model Rendering shrank by 90% and render now 10-100 times faster.
  • Low-level graphics code has become more complex and we spent a lot of time optimizing things there, but high-level rendering is now very easy and straightforward. Changes are easy to make and we can experiment a lot more.
  • Also finished optimized 2D, 3D, dynamic and static geometry rendering (10k fps, up to 300k with lines when skipping SwapBuffer). Implemented a much more efficient system that works great for now. Some improvements like merging geometries still have to be done, but that's easier to do now too.
  • Tested and fixed font rendering and also improved 3D sphere generation; now UVs fit much better and there are no more wiggly UV errors!
  • We also had a major holdup on day 1 because OpenGL would seemingly crash with InvalidOperation when loading DDS textures, which made testing shaders and rendering very hard. It took a whole day until this issue was fixed. The problem was not the loading itself, but what happened before it: some gl methods reported an error, but we did not catch it until the texture loading happened (which actually worked, but looked like it crashed). See the error-checking sketch below.
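Deferred GL errors like this are much easier to catch when every graphics call is followed by an error check in debug builds, so the failure is reported at the call that caused it instead of at the next innocent one. Here is a minimal sketch of that idea, assuming OpenTK-style C# bindings; the helper name is illustrative, not the Delta Engine's actual API:

```csharp
using System;
using OpenTK.Graphics.OpenGL;

public static class GLDebug
{
    // Call this after every GL operation in debug builds; the stage name
    // tells you which call actually raised the error, instead of blaming
    // whatever happens to query GL.GetError() much later (like texture loading).
    [System.Diagnostics.Conditional("DEBUG")]
    public static void CheckGLError(string stage)
    {
        ErrorCode error = GL.GetError();
        if (error != ErrorCode.NoError)
            throw new InvalidOperationException(
                "OpenGL error in '" + stage + "': " + error);
    }
}

// Usage: pinpoints the real culprit instead of the innocent texture load.
// GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBufferHandle);
// GLDebug.CheckGLError("BindBuffer");
```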

Day 2



Day 2 is also very boring; I just extended the unit tests from yesterday and we can now draw some more complex models, still generated on the fly. We also had to make a lot of design decisions for the 3D content. It is the week between Christmas and New Year and 90% of our artists are not here, enjoying their holidays. We had to stick with what was already working and reduced the level a lot to make sure we could really pull it off.

We also worked on adding more shaders, tested the path camera to be used in the demo, and prepared our FBX importing tools and validators so we can test importing all 3D models tomorrow.

Day 3



On day 3 we started testing content and optimizing it. Even the reduced level has over 150 single meshes in it; rendering them all out one by one would be way too slow (that would even be slow on PC). One important thing the Delta Engine does for you automatically at the content generation stage is to pack textures together and handle all the rendering code for you. This works great for 2D content, but it was not implemented for 3D models yet. So today I remapped all 3D models to use the correct atlas texture coordinates (see the sketch below). Later this can be used to merge all meshes that share the same atlas textures and shaders.
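Remapping a model to an atlas boils down to a linear transform of each UV coordinate into the sub-rectangle its texture occupies in the atlas. A minimal sketch of the idea with hypothetical types; a real content pipeline also has to deal with padding, texture wrapping and so on:

```csharp
using System.Numerics;

public struct AtlasRegion
{
    // Placement of the original texture inside the atlas, in 0..1 atlas UV space.
    public Vector2 Offset;   // top-left corner of the region
    public Vector2 Scale;    // region size relative to the whole atlas
}

public static class AtlasRemapper
{
    // Rewrites mesh UVs from "whole texture" space into atlas space.
    // Note: UVs outside 0..1 (wrapping) cannot be remapped this way and
    // need their own atlas region or a separate texture.
    public static void RemapUVs(Vector2[] uvs, AtlasRegion region)
    {
        for (int i = 0; i < uvs.Length; i++)
            uvs[i] = region.Offset + uvs[i] * region.Scale;
    }
}
```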

Another task today was to test all 3D models, as many of them were not imported successfully with our new FBX importer. The code changes were pretty quick, but exporting and testing all 3D models one by one takes a lot of time. Good thing we reduced the demo early this week from 50 different 3D models to 20 that worked. Today we reduced it to around 15; the rest were too hard to fix and would have taken too much time (only one artist was left to help us out and clean up the mess from before, thanks Mathias for your hard work ^^).

Day 4



Today I helped fine-tune and optimize shaders; a lot of things could be merged and simplified. For example, all of the 3D models we tested yesterday have the same ambient, diffuse and specular colors set, so we could optimize them all out and simplify some shaders by a few instructions.

Then we went crazy and added Fresnel, specular maps, normal mapping with diffuse maps, light maps and diffuse specular to the shaders used by the characters in the SoulCraft demo. This is our friend the Abomination. Actually he is just a big enemy to be slaughtered in the game. The shader complexity went up to 40-50 instructions, but we optimized it down to 4 texture reads (diffuse map, normal map, specular map, light map) and 23 instructions to do all of the above. The result is what you see in the screenshot above. It runs pretty well in our emulator, but we have to test it on the mobile devices soon and make sure that this will not be rendered fullscreen (pixel fill-rate would kill us). Please note that we will optimize all shaders more in the next month. The light map will not be used anymore (instead he will be animated and we might think about real-time shadows) and the specular map could be derived from the diffuse map (e.g. inverted) or merged with the diffuse or normal map. Finally we have to test this on more platforms and check that our fall-backs look good enough.
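To give a rough idea of what those 23 instructions compute, here is an illustrative CPU reference of the lighting terms named above, written in C# rather than shader code. This is my own reconstruction under simple Blinn-Phong assumptions, not the actual demo shader; the exponents and weights are made up:

```csharp
using System.Numerics;

public static class CharacterLighting
{
    // normal comes from the normal map (already transformed to world space
    // here); the diffuse/specular/light map parameters correspond to the
    // four texture reads mentioned in the post.
    public static Vector3 Shade(
        Vector3 normal, Vector3 lightDir, Vector3 viewDir,
        Vector3 diffuseMap, float specularMap, Vector3 lightMap)
    {
        float nDotL = MathF.Max(Vector3.Dot(normal, lightDir), 0f);

        // Blinn-Phong style specular, masked by the specular map.
        Vector3 halfVec = Vector3.Normalize(lightDir + viewDir);
        float spec = MathF.Pow(MathF.Max(Vector3.Dot(normal, halfVec), 0f), 16f)
                     * specularMap;

        // Fresnel rim term: stronger where the surface grazes the view.
        float fresnel = MathF.Pow(
            1f - MathF.Max(Vector3.Dot(normal, viewDir), 0f), 4f);

        // The baked light map modulates everything as ambient lighting.
        return lightMap * (diffuseMap * nDotL
                           + new Vector3(spec + fresnel * 0.3f));
    }
}
```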

Please also note that this Abomination guy has way too many polygons and vertices (28,000 used vertices), but our artists had no time to reduce him further. Good thing we decided to skip animation and the level is pretty small, so our vertex shaders can be fast enough. This won't be as easy once we enable animation again and make the level much more complex and bigger.

Day 5



Originally we planned to deploy a first build on Android today, but a lot of our content still did not work well, and shaders needed to be tested and optimized more. We also wanted to implement vertex compression to make sure we save as much bandwidth as possible (that was the biggest performance improvement for the ZombieParty iPad version), but in the end that feature did not make it yet because we could not re-normalize the normals and tangents correctly.

We also started today with importing the level with all its meshes, loading all lightmap UVs from the level and merging them with the already imported 3D models without lightmaps. As you can see from the screenshot above, the lightmap and its UVs work fine after importing and look pretty cool; it gives the model ambient occlusion, especially when there are lots of vertices (and yes, this guy has a lot of vertices and a lot of detail). Some 3D models were broken and had no tangents exported. The lightmap UVs from the level also did not fit our existing imported meshes. We hoped we could fix this by using the vertices from the level and remapping them to the 3D models imported earlier. In the screenshot above you see the result of merging these vertices, but we had major problems using the index buffers and had to disable them.

What we did not know yet was that all merged vertices were totally messed up: only the position and lightmap UV fit, the normal UVs were completely wrong (rotated, flipped and whatever else the stupid lightmap baker did), and all texture seams were broken too. But it got even worse: different 3D meshes from the level had different numbers of lightmap UVs and even different UV seams, aside from breaking all our geometry and UVs anyway. Our initial approach of rendering a list of meshes with an extra lightmap UV vertex stream could not work anymore, at least not that easily. In theory it is possible to merge all meshes used in the level that have the same vertex positions into one unified mesh that works with all lightmap UVs, but this could even mean that index buffers could not be used and many more vertices would have to be processed. This is really complicated, and I will probably write about it in the future when we attack this problem again.

This was basically a show-stopper. We could have said right there and then: okay, we have a major roadblock here, we cannot continue. But everyone agreed that we should keep searching for alternatives. Maybe rendering out the whole level as merged meshes could work too. This would mean A LOT more distinct vertices need to be processed; for example, a model like the "Wall" is placed 30-40 times in the level, as are some columns and trees. Each of those would now have to be rendered out individually instead of using instancing. This obviously also has advantages, as we can combine a lot of meshes into big meshes that all share the same materials and atlas textures. However, all of this would have to be implemented by Monday (today was Friday, the last working day of the year, and tomorrow is New Year's). Risky risky ..

Day 6 (New Year's)

Okay, after a short New Year's party (at least I did not crash my non-existent car like my brother did) and after some sleep (not enough), I started working on a solution for the lightmap UV stream problem from yesterday. First all mesh data had to be merged, which was actually quite easy thanks to our content processor system. Rendering the positions and UVs out with the merged lightmap UV stream was also quick and easy; I just forced a simple textured shader with 1 instruction on all geometry and rendered it out. But all UVs were completely messed up, rotated and just plain wrong. This happened because the vertices from the level and the ones from the mesh were different (same in number, but ordered differently). So I used the UVs from the level; since our instancing would not work anyway, this did not matter (each mesh, even if it was just a copy, had different UVs). To keep things simple, all other features were disabled (no normals, no tangents, no index buffers, no vertex compression, no shaders except the textured shader). The following was the result: the level finally working with normal diffuse map UVs (yesterday we had just the lightmap UVs working).


So far so good; now I tried to enable one feature after another. This is what happened when I tried to use the index buffers from the imported meshes: they did not fit at all anymore. There was also another issue in this screenshot, I forgot to add the vertex offset of each mesh to the index buffer, but even after fixing this the original index buffers could not be used, and a completely new index buffer needed to be created, which to my surprise was a lot bigger. After thinking about it, it made sense: each new vertex contains not only the position, UVs, normals and tangents, but now also the lightmap UVs, which have different texture seams and borders than the normal UVs used for diffuse and normal mapping. Again, not good for performance, and obviously our fault for not testing this earlier, but what can you do if the level was only completed yesterday. See the sketch below for why the vertex count grows.
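The index buffer growth makes sense once you remember that a vertex can only be shared between triangles if all of its attributes match; adding a lightmap UV channel with its own seams splits vertices that were previously identical. A small sketch of that deduplication logic, with a hypothetical vertex type, just to illustrate the effect:

```csharp
using System.Collections.Generic;
using System.Numerics;

// A vertex can be reused via the index buffer only if every attribute
// matches. Adding LightmapUv (with its own seams) makes fewer vertices
// equal, so the deduplicated vertex list, and thus the mesh, grows.
public record struct FullVertex(
    Vector3 Position, Vector2 Uv, Vector3 Normal,
    Vector3 Tangent, Vector2 LightmapUv);

public static class IndexBufferBuilder
{
    public static (List<FullVertex> vertices, List<int> indices)
        Build(IEnumerable<FullVertex> triangleVertices)
    {
        var vertices = new List<FullVertex>();
        var indices = new List<int>();
        var lookup = new Dictionary<FullVertex, int>();
        foreach (var v in triangleVertices)
        {
            if (!lookup.TryGetValue(v, out int index))
            {
                index = vertices.Count;
                vertices.Add(v);
                lookup.Add(v, index);
            }
            indices.Add(index);
        }
        return (vertices, indices);
    }
}
```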


After going back to just merging the vertex positions, UVs, normals and tangents and displaying the level again with the new index buffers and different shaders (normal mapping, etc.), some models like the church in the background were inside out because they have mirrored world matrices. I needed to find an algorithm to detect inverted render matrices, but was unable to find a general one. Instead I hacked in a couple of simple checks that test whether the X, Y and Z axes form a right-handed coordinate system (if Z points in the other direction, the matrix must be mirrored); see the sketch below. When such a case is detected, all triangles of that mesh are flipped around, and then everything looks good again. Another thing that was already wrong in the first screenshot is the position of each mesh: the object matrix was already applied to all imported meshes, but the level stored the render matrix of each mesh in a way that applied the object matrix again. This had to be undone by applying the inverse of the pivot transformation. In this third screenshot the whole level was already rendering at 1200 FPS with over 250,000 vertices. Not bad, but remember that my PC is obviously faster than a mobile device and none of the complex shaders are used yet.
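The handedness check described above boils down to a scalar triple product: for a right-handed basis, cross(X, Y) points along Z, so a negative result (equivalently, a negative determinant of the upper 3x3 part) means the matrix mirrors the mesh. A minimal sketch using System.Numerics; the method names are mine:

```csharp
using System.Numerics;

public static class MatrixUtils
{
    // Returns true if the world matrix mirrors geometry, i.e. its
    // X/Y/Z axes no longer form a right-handed coordinate system.
    // Equivalent to checking the sign of the 3x3 determinant.
    public static bool IsMirrored(Matrix4x4 world)
    {
        var axisX = new Vector3(world.M11, world.M12, world.M13);
        var axisY = new Vector3(world.M21, world.M22, world.M23);
        var axisZ = new Vector3(world.M31, world.M32, world.M33);
        return Vector3.Dot(Vector3.Cross(axisX, axisY), axisZ) < 0f;
    }

    // For mirrored meshes, flip the winding of each triangle so
    // back-face culling does not turn the model inside out.
    public static void FlipTriangleWinding(int[] indices)
    {
        for (int i = 0; i + 2 < indices.Length; i += 3)
            (indices[i + 1], indices[i + 2]) = (indices[i + 2], indices[i + 1]);
    }
}
```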


After fixing the mirrored matrices and the positioning of each mesh, this is what our little world looks like. Originally even the reduced level was filled with almost twice as many objects, but we needed to cut some because of the already mentioned tangent problems and because of performance; basically we deleted everything that was not visible from the center of the level. This was when I realized that the demo would not look any better than this screenshot, apart from better shaders and some other fixes. With lightmaps enabled this already looks pretty nifty, but it would be more amazing to have a fuller level and all the other currently missing features like animations, effects, etc. But you have to settle for what you have, and we made some quick decisions to skip all content that did not work yet in order to get things done in the next few days.


Day 7

Uhh, just one day left until this all has to work on the Android Tegra device, and we had not put anything except some simple unit tests on the device yet. The problem was that the level was not even working on the PC, and every little problem took many hours to fix. We wanted to save time by deploying the project only once on the Tegra and then doing some final fixes there if we still ran into problems. Still, the idea was to deploy a first version on Android by the end of the day; we had already missed our Day 5 deadline for the first build because of all the level and lightmap problems. Early in the day I played around with different texture settings (mipmaps on or off, sharpened or not, bilinear versus trilinear filtering). When I deployed the corresponding unit test on the Android Nvidia Tegra device it looked even worse than on PC: compression artifacts were clearly visible, and because of the 16-bit RGB565 framebuffer the bilinear filtering lines looked much worse. It made the whole demo look crappy. So for a minor performance cost (not detectable on PC, but trilinear filtering costs a bit on mobile devices) I enabled trilinear filtering, which made it look better, but the mipmaps were still crappy and blurry. Sharpening made it even worse; they did not fit together anymore. Because of time constraints I left it the way it was before (mipmaps, no sharpening, and trilinear filtering for big planes like the ground). In the end no one noticed anything bad about the ground; we even reduced the ground shader from 18 instructions (normal mapping, specular-diffuse calculation, applying our fake-HDR light maps and so on) to just 5 instructions (2 texture reads: the diffuse map with our fake-HDR lightmap on top). This was one of the biggest performance gains besides reducing the resolution (see tomorrow).
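For reference, switching between bilinear and trilinear filtering is a single texture parameter in OpenGL; a quick sketch with OpenTK-style bindings (desktop GL enum names here, the ES equivalents are analogous):

```csharp
using OpenTK.Graphics.OpenGL;

public static class TextureFiltering
{
    // Bilinear: filter within the nearest mip level only; cheap, but
    // produces visible lines where the mip level changes.
    public static void SetBilinear()
    {
        GL.TexParameter(TextureTarget.Texture2D,
            TextureParameterName.TextureMinFilter,
            (int)TextureMinFilter.LinearMipmapNearest);
        GL.TexParameter(TextureTarget.Texture2D,
            TextureParameterName.TextureMagFilter,
            (int)TextureMagFilter.Linear);
    }

    // Trilinear: also blend between mip levels; slightly more expensive
    // on mobile GPUs, but hides the mip transition lines on big planes
    // like the ground.
    public static void SetTrilinear()
    {
        GL.TexParameter(TextureTarget.Texture2D,
            TextureParameterName.TextureMinFilter,
            (int)TextureMinFilter.LinearMipmapLinear);
        GL.TexParameter(TextureTarget.Texture2D,
            TextureParameterName.TextureMagFilter,
            (int)TextureMagFilter.Linear);
    }
}
```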


Next, we tried to enable vertex data compression again. Each vertex has 52 bytes (3 floats for position, 2 floats for UVs, 3 floats for normals, 3 floats for tangents and finally 2 more floats for the lightmap UVs, which is 13 floats = 13*4 bytes = 52 bytes), a crazy amount for mobile platforms. As mentioned on Day 1, our content system can happily change anything at the geometry and vertex level; for example, vertex compression can be enabled when generating content, and then each vertex goes down to 13 shorts = 26 bytes (32 bytes on platforms that require 4-byte alignment like XNA/DirectX), which can be twice as good for bandwidth and performance. As you can see from the following screenshot, it was not so easy: we still had the 4-byte alignment turned on but only stored 26 bytes per vertex, which messed up the whole level rendering. Even after fixing this we still had the problem that normals and tangents were totally messed up, and even re-normalizing them in the vertex shader did not help. We tried to fix this for a few hours, but again had to give up because there was no time left. We would rather have a slow working demo than nothing.
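Here is what the two vertex layouts from the arithmetic above could look like as C# structs. This is an illustrative sketch; the compressed shorts are assumed to be fixed-point values that get rescaled in the vertex shader, and the field order is made up:

```csharp
using System.Runtime.InteropServices;

// Uncompressed: 13 floats = 52 bytes per vertex.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct FatVertex
{
    public float PosX, PosY, PosZ;          // 12 bytes
    public float U, V;                      // 8 bytes
    public float NormalX, NormalY, NormalZ; // 12 bytes
    public float TanX, TanY, TanZ;          // 12 bytes
    public float LightmapU, LightmapV;      // 8 bytes
}

// Compressed: 13 shorts = 26 bytes per vertex (fixed-point values,
// rescaled in the vertex shader). On platforms requiring 4-byte
// alignment (XNA/DirectX) this has to be padded up to 28/32 bytes;
// declaring 26 bytes while the platform assumes alignment is exactly
// the kind of mismatch that scrambled the level rendering.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct CompressedVertex
{
    public short PosX, PosY, PosZ;
    public short U, V;
    public short NormalX, NormalY, NormalZ;
    public short TanX, TanY, TanZ;
    public short LightmapU, LightmapV;
}
```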


So at the end of the day everything came together: we quickly put a demo together on the PC with the Delta Engine, added the path camera and displayed the level with all shaders. There were obviously issues, like the lighting being a little dark and the sky cube mapping being broken, but most of that could be fixed easily, or there would be more time tomorrow for fine-tuning. Sadly the Android deployment turned out to be a disaster! I started deploying in the evening, when most of our level content issues were fixed and it at least somehow worked on the PC. Compiling it for Android with the Delta Engine was pretty easy and it ran right away on the Tegra, but we hit a major problem with content loading: everything above 1 MB crashes the Android AssetManager with a Java IO exception. I worked all night trying to find a solution and wrote a little re-packaging tool that runs the aapt tool (Android Asset Packaging Tool) with the -0 "" flag to store all content uncompressed (see the sketch below). This worked for a while with reduced content, but the full content with >48 MB still crashed badly. In fact, after launching the application nothing happened anymore: no log, nothing, it just froze. Debugging also did not work (it has always been pretty broken anyway).
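The workaround helps because the AssetManager size limit applies to compressed asset entries; stored (uncompressed) entries can be memory-mapped regardless of size. A sketch of what a re-packaging step might look like when driving aapt from C#; only the -0 "" flag is taken from the post, the remaining aapt package arguments and all paths are placeholders that must match the actual project layout:

```csharp
using System.Diagnostics;

public static class ApkRepackager
{
    // Rebuilds the .apk with aapt, using -0 "" so that files of every
    // extension are stored uncompressed; uncompressed assets avoid the
    // ~1 MB limit on compressed entries in the AssetManager.
    public static void RepackUncompressed()
    {
        var info = new ProcessStartInfo
        {
            FileName = "aapt",
            // Placeholder project layout: manifest, res/ and assets/ in
            // the working directory, output written to repacked.apk.
            Arguments = "package -f -0 \"\" " +
                        "-M AndroidManifest.xml -S res -A assets " +
                        "-I android.jar -F repacked.apk",
            UseShellExecute = false,
        };
        using var process = Process.Start(info);
        process.WaitForExit();
    }
}
```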


But this was not the only issue. When testing the level meshes one by one, new problems quickly appeared; for example, we saw that the depth buffer was disabled and our attempt to turn it on was completely ignored. We would have to write our own framework methods to call the OpenGL device creation methods ourselves, as the MonoDroid framework is very broken here. I reported a couple of bugs for the MonoDroid framework (which is still in preview, btw; you can't expect everything to work, and we have had plenty of other issues in the past, but always found a way around them).

The plane to the CES in Las Vegas was also leaving in a few hours for one of us, and my plane would leave the following night. I had already decided not to fly to the CES, because the demo would take at least another day to fix and I badly needed to sleep after working through so many nights (I had been awake for 24 hours that day already). This was pretty sad for me, because I would have enjoyed a few days of relaxing in Las Vegas and seeing the CES for the first time, but the flight would have been too stressful and the demo needed more time and fine-tuning than the few hours that were left.

Some of the issues, like the depth buffer (without it the demo is unusable) and the 1 MB issue, were real deal breakers again. Others would probably have given up, and again no one internally believed that the Android Tegra version could be fixed in the short time left ^^ I worked all of Day 8 and got plenty of help from my fellow programmers, but again most time was lost at night when I tried to reduce every piece of content to below 1 MB to make the demo run on the device (the whole level needed to be restructured and I re-saved every single texture and 3D model again). Once I had something working, everything else broke. For example, I wanted to test if the resolution change worked and switched to another Android Tegra device, and suddenly all my drivers broke and I could not deploy anything anymore. Reinstalling drivers took hours, and then the Android Tegra device crashed and did not boot up again. It had to be flashed again, and this kind of stuff obviously has to happen on the very last day when no time is left. Obviously the guys at Nvidia and my colleague, who was already sitting in a plane to Las Vegas, were pretty nervous that this all could fail. But in the end it all worked out and the hard work paid off, as you can read in Day 8:

Day 8

Screenshot 1 of the final result today (it looks 1:1 the same on Android). Pretty good for just a few days of hard work :)


Screenshot 2 shows the Abomination, which runs a little below 30fps when looking at it this close on the Android Tegra device, but looks pretty amazing.


And Screenshot 3 shows our level and how everything fits together. Obviously the level is still small and boring, but it will be improved in the near future! See the video in the next blog post to get a feel for how this all looks together on the device. Again, this is only the first iteration of the demo; we will make it more spectacular by next month.


Here you can see a few screenshots from the Android Tegra version, which went through a lot more optimization and tweaking today. Since I spoke about optimizations and other Android issues yesterday (some of them were also fixed today), I will focus on the content today, as most of the final touches and optimization were about content. As mentioned before, some content had to be removed because of tangent export problems (tangents were missing from some models and we have no regenerate-tangents method yet), other content used too complex shaders, some shaders could be simplified, and other content had to be moved around to fill fewer pixels on the screen. There was also some fine-tuning, like moving overlapping planes around to avoid z-fighting, repackaging some atlas textures, increasing texture resolution near the camera, reducing it in the background, etc.

Our target for this SoulCraft Tech Demo is reaching at least 30 fps in 1024x600 on the Android Tegra. Now that we also have a 1366x768 Tegra and have played around with it, we might even want to build a version in that resolution (and later even a full HD 1920x1080 version). Some of our unit tests run perfectly fine at 60 fps in that resolution, but for now the resolution was reduced a bit because of our content: the shaders are too complex right now and some models are too heavy. We also could not enable all features and vertex compression yet, because we had some last-minute problems and no time to figure them out for this first iteration (like enabling AA and testing our optimized shaders with compressed vertices).

Due to our complex shaders, and because most of our 3D content was created for PC/Console, we could only reach about 15-20 fps with yesterday's version, and thus reduced the resolution a bit to 800x480 (1/3 fewer pixels to render) today. I also did some extra optimizations (ground shader complexity reduced from 18 instructions to 4). Now it runs absolutely stable at 30 fps on Android Tegras (we also tested this on other Android devices, but performance is not very good there; we need a much lower resolution and lower poly counts). Performance is good now, but some content does not look as great as it could because of these optimizations (like the ground shader; even the character shaders have some issues right now and will be improved in the future, you can see the difference if you look at the screenshots from the last days).

Some pixel shaders use up to 30 instructions; e.g. the characters have diffuse maps, normal maps, specular maps, Fresnel calculation and even use light maps right now (to fit better into the level; we will have to remove light maps once animations are implemented). This is quite a lot for a mobile platform, especially if the resolution is high. On PC or consoles, complex shaders with 50 or 100 instructions can often be used, but even that is not common; often a simple shader with 10-20 instructions looks good enough (or 1-8 instructions for older hardware) and allows crazy high resolutions (like on my 30-inch screen with 2560x1600, I love it). We will optimize our shaders further and try to achieve good effects with more tricks (merging textures, doing crazy pixel shader optimizations, etc.). Our target is 1-4 instructions (2 on average) for low-end mobile hardware, 2-12 (6 on average) for medium mobile hardware and 4-24 (6-16 on average) for high-quality mobile hardware like Tegra. There is also another profile for PC and consoles that has no upper limit (30, 50 or even 100 instructions are possible). The good thing is that a shader can be very complex on the PC and will work fine on a fast GPU, but if the user has a slower PC we can easily switch to the high/medium/low mobile shader profiles and use the same optimizations for PCs or slower consoles.
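Expressed as code, this is just a per-profile budget table that the engine can consult when generating shaders. A small sketch; the profile names and selection mechanism are illustrative, only the instruction budgets come from the paragraph above:

```csharp
public enum ShaderProfile
{
    LowMobile,     // 1-4 instructions, 2 on average
    MediumMobile,  // 2-12 instructions, 6 on average
    HighMobile,    // 4-24 instructions, 6-16 on average (e.g. Tegra)
    PcConsole,     // no hard upper limit (30, 50, 100+ possible)
}

public static class ShaderBudget
{
    // Maximum pixel shader instructions allowed for a profile. A PC
    // with a slow GPU can simply fall back to one of the mobile
    // profiles and reuse the same optimized shaders.
    public static int MaxInstructions(ShaderProfile profile) => profile switch
    {
        ShaderProfile.LowMobile => 4,
        ShaderProfile.MediumMobile => 12,
        ShaderProfile.HighMobile => 24,
        _ => int.MaxValue,
    };
}
```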

Textures were initially 2048x2048, but there was visually no difference with 1024x1024 textures, and many textures have gone down to 512x512 or even 256x256 to save even more bandwidth. Our level was initially more than twice as big and has about 50 models, but only 15 made it into this first demo because of time constraints. There is a lot more to come for this SoulCraft Tech Demo; the CES version is just the first iteration. It will not be easy for our artists, as many of them are working on 3D content for mobile platforms for the first time and we had many heated debates, but with a working demo on the Tegra everyone will now understand the vertex, polygon and texture issues a lot more easily.

Another example is the target specification for characters or complex buildings: a maximum of 5000 vertices and 4000 polygons are allowed. We did have some 3D models with fewer polygons, but all the buildings and the characters in this demo are around 25000-30000 vertices with 8000-10000 polygons. That is way too much for a mobile platform. We still managed to render this quite well because we optimized our vertex shaders and this demo has no animation yet (it will be added soon). This content will be optimized better for the next iteration of this demo; we will also add a lot more content and make the world bigger, fuller and more exciting, with animations and effects.
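Budgets like these are easiest to enforce automatically at import time instead of during heated debates. A tiny validator sketch (hypothetical importer hook; the limits are the ones quoted above):

```csharp
public static class ContentValidator
{
    public const int MaxVertices = 5000;
    public const int MaxPolygons = 4000;

    // Called from the content importer; failing early with a clear
    // message beats discovering the problem at 15 fps on the device.
    public static void CheckBudget(string meshName, int vertexCount,
        int polygonCount)
    {
        if (vertexCount > MaxVertices || polygonCount > MaxPolygons)
            throw new System.InvalidOperationException(
                $"{meshName} exceeds mobile budget: {vertexCount} vertices " +
                $"(max {MaxVertices}), {polygonCount} polygons (max {MaxPolygons}).");
    }
}
```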

Please also note that the SoulCraft Tech Demo is NOT just a showoff for the great Nvidia Tegra platform on Android; it runs perfectly fine on all of our supported platforms: iPhone, iPad, iPod Touch, Android phones, Android tablets, Windows Phone 7, PC, Xbox 360, Linux and MacOS. We currently focused the content and optimizations on the Tegra, but the demo itself is written without any knowledge of the target platform (like ZombieParty and like every project with the Delta Engine). With a click of a button it compiles and runs on all the mentioned platforms, but we will obviously keep optimizing our engine for all target platforms and make sure all content works on all of them as well (that is actually most of the work left to do; creating a complex 3D game is no easy or quick task for artists). If all goes well we should have a pretty good SoulCraft demo in a month that works on all platforms and shows off all 3D features of the Delta Engine. Then we can focus on the gameplay again and finish the game itself :)

3D Model formats: Collada or FBX?

by Benjamin Nitschke 11. November 2010 14:33

Our Delta Engine has supported Collada files for 3D models since we have worked with Collada on many projects (since 2005). However, the support for Collada .DAE files in 3DS Max 2011 or Maya just gets worse every year. No one is actively working on a plugin anymore: ColladaMax is dead, Feeling Software is not working on Collada anymore, OpenCollada (open source Collada support) has done nothing for a year now, Sony does not like Collada anymore, Autodesk wants everyone to use FBX and never really supported Collada well, etc.

 

So should we give up on Collada? Well, yes; the artists will no longer use Collada (we still support it in our engine and content importers) and will instead export everything as FBX, which works fine out of the box in 3DS Max or Maya (or XSI or Blender, etc.). Now we have a bunch of FBX files (and some older Collada files we still want to support) and we are way too lazy to write a new importer right now (we want to work on our 3D SoulCraft TechDemo). There is the FbxConverter tool you can download from Autodesk's website (for free, you just need to register). With this tool you can freely convert FBX to Collada or one of many other formats (many different FBX versions, 3DS, OBJ, or DXF). The resulting Collada .DAE file is MUCH better than one exported directly from 3DS Max or Maya (at least if the artist does not have the latest FBX exporter installed). It seems to be in a more standard format; it exports all animation, bone, skin, camera, spline, etc. information, where the standard Autodesk Collada exporter leaves all of this out and makes the format useless. Sadly ColladaMax, OpenCollada, etc. do not provide a plugin for 3DS Max 2011 or 2010, because no one is working on that; older versions like 3DS Max 2009 work fine, but we had our problems with those plugins as well. Note: Feeling Software still supports FCollada, but it is no longer a free plugin; you need a commercial license, and we had too many problems with FCollada in the past to go that route again (some stuff works great, other stuff not so much, at least for us).

 

I will report back how our implementation goes with all this format mess :)

 


 

Update 2010-11-18: While Collada (including converted FBX files; and yes, I tried many different converters and exporters) still works great for simple static meshes and animated meshes, it just causes too many problems with much of our more advanced content. While we might continue to support Collada in the future (I am not so sure right now), we will switch completely to importing FBX files directly. This gives us much more flexibility, but of course it also adds a lot more work to support FBX completely (e.g. there is no easy managed library to load FBX files; you have to use the native FBX SDK). But everyone else uses FBX too (Unreal, Unity, most modeling tools, etc.), so I guess we have to jump on that bandwagon as well. Another good thing is that our artists already use FBX, and they don't care how stuff is converted; it just should work. Another important aspect is that we want the ability to convert game content and levels back into FBX files (we used Collada for that before); this way a nice interaction can happen between modeling tools like 3DS Max or Maya and the engine editors (but we are not there yet, obviously) :)


For me, the following put the final nail in the coffin of the FBX-to-Collada conversion route. We had additional problems like missing material info and missing tangent data that could not be reconstructed from the original FBX file; it turned out to be too much work, and there are plenty of other subtle problems with the conversion (it looks fine at first, but only after working with it for a while do you see more and more stuff not working the way you want):

From the FBXConverter Documentation by Autodesk:

What's not supported by COLLADA
The following is a list of elements that DAE (COLLADA) do not support on export or import.
■ Triangles, polylist, and polygons. The FBX Converter exports polygons without holes, and imports them without holes and triangles.
■ Bake & single matrices
■ Variable sampling rates.
■ Absolute Paths (when invalid)
■ Instances
■ Skew
■ Non-standard Materials and maps for different parameters
■ Orthographic cameras
■ Normals (for skin and shape)
■ Light parameters
■ Vertex animation
■ Material parameters animations
■ Effects specialized profiles
■ COLLADA physics
NOTE Export or import to DAE (COLLADA) can destroy or produce unpredictable results for some elements.

 

We also tried putting some FBX files into the XNA content pipeline to see what comes out at the other end, but there were way too many restrictions for us to work with it. I am happy to see that XNA's FBX support has gotten so much better (I last used it in 2006 and it totally sucked), but the following things were just too annoying:

 

■  All paths must fit, including textures, shaders, and whatever else is in the FBX file. File formats also have to match; artists love to use .psd files, but that is nothing I want to support (there is a .png file with the same name right there, but the content processor does not see it). All of the FBX models we tested used absolute paths that were just not available on the content server (or on my machine for testing). While FBX ASCII files can easily be edited, this is still a show stopper for us. A small XNA team might be okay with this restriction (you can just use relative paths in 3DS Max when exporting and make sure all images, shaders, etc. are in place), but I don't think this is the right way for us.


■  Many files caused other compile issues; for example, most of our animated meshes do not use just a standard skeleton and bones, but some other system (e.g. CAT). Again, this can sometimes be fixed before exporting, but it just does not work automatically; it is always a custom process with lots of fiddling around. The following error was just too annoying for me to keep trying (especially because you find very little useful information on the internet about problems like this): Skeleton not found. Meshes that contain a Weights vertex channel cannot be processed without access to the skeleton data.

 

■  Finally, the process always takes way too long. We already spend most of the content server time inside the XNA content pipeline (to convert stuff to WP7); hopefully loading FBX directly, instead of converting to Collada or going through the XNA content pipeline, is way faster ;)

 

■  Even with all these problems, the XNA Model Processor has some cool tricks up its sleeve. For example, when you get far enough in the importing process, an .xml file is generated in the Content/obj/x86/Debug directory, which contains a lot of extra information the generated .xnb file might not have. It can be useful to look at those .xml files and maybe even extract useful information out of them. For us it did not work so well for some easy-to-get information like tangents (they are just missing), but on the other hand it contained all the CAT bone information. Still good to know ^^

 

So we are back to converting FBX into our own format ourselves (not sure yet if I will write a native tool with the FBX SDK or just P/Invoke from C# ^^; a rough sketch of the P/Invoke route is below). For anyone interested, there are some cool projects out there using the FBX SDK; for example this FBXViewer is a nice one, it worked out of the box with everything I threw at it (as opposed to the Collada conversion mess or the XNA content pipeline).
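Since the FBX SDK is a C++ library, the P/Invoke route would need a thin flat-C wrapper compiled as a native DLL first. Everything below, from the wrapper name to its exports, is hypothetical and only meant to show the shape of the interop:

```csharp
using System;
using System.Runtime.InteropServices;

public static class FbxInterop
{
    // Hypothetical flat-C wrapper around the native FBX SDK; the FBX SDK
    // itself is C++ and cannot be P/Invoked directly, so a thin C layer
    // (here called "FbxWrapper.dll") would export functions like these.
    [DllImport("FbxWrapper.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern IntPtr LoadScene(string fbxFilename);

    [DllImport("FbxWrapper.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern int GetMeshCount(IntPtr scene);

    [DllImport("FbxWrapper.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern void FreeScene(IntPtr scene);

    public static int CountMeshes(string fbxFilename)
    {
        IntPtr scene = LoadScene(fbxFilename);
        if (scene == IntPtr.Zero)
            throw new InvalidOperationException("Failed to load " + fbxFilename);
        try
        {
            return GetMeshCount(scene);
        }
        finally
        {
            FreeScene(scene);   // never leak the native scene object
        }
    }
}
```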

