DeltaEngine

Playing around with VS2010 and the Parallel Extensions

by Benjamin Nitschke 26. May 2009 23:28
Last week when I installed VS2010 I only played around with it for a few minutes; I've been very busy at work because our game is in its final stage and there is a lot of heat going on (this is not a hint, no no, no hint from me, I'm not allowed to talk about it. Damn it, this could be a hint).

Currently the most interesting new feature for me, besides the cool VS2010 IDE itself, is the Parallel Extensions. Sadly the IDE is still unusable for real work IMO because all the addins just don't work; there isn't even a fix for TestDriven.NET yet, Jamie Cansdale is probably busy too ^^

VS2010 support for parallel programming goes beyond just adding a few extra classes in .NET 4.0: you also get a great IDE implementation with lots of useful features and new debugging and profiling tool windows for checking out parallel tasks, threads and the scheduling. There are also native C++ libraries that work with the Parallel Extensions (using lambda functions) and play well together with the STL. You can check out all the new features at the official VS2010 page!

Let's take a quick look at how to use these Parallel Extensions. I wrote this in a few minutes last week, I was just too lazy (I mean busy, of course) to post it until now. It is obviously very simple stuff, but it was still useful for testing out some of the new parallel IDE features and some new .NET 4.0 classes. First of all let's do a boring foreach loop, which displays the numbers from 0 to 9 and adds them to expectedTotalNum. This should obviously be 45 because sum(0..9)=45. Later we will do the same adding with a parallel foreach and check if we get the same result. Since it does not matter in which order we add these numbers, it is also a good test to just start a bunch of parallel tasks and let them do their work. Obviously you would never write parallel code just to add some numbers, but this should illustrate the point, and the overhead does not hurt your performance much anyway (as long as the body of the loop takes more than a few instructions to execute).
 // Initialize a list for some parallel testing :)
 List<int> someInts = new List<int>();
 for (int num = 0; num < 10; num++)
   someInts.Add(num);

 // Print out numbers sequentially
 int expectedTotalNum = 0;
 foreach (int num in someInts)
 {
   Console.WriteLine("sequential num=" + num);
   expectedTotalNum += num;
 }

 Console.WriteLine("expectedTotalNum=" + expectedTotalNum);
This outputs the obvious sequential adding of 0 to 9 to expectedTotalNum, which is 45 at the end of the loop:
sequential num=0
sequential num=1
sequential num=2
sequential num=3
sequential num=4
sequential num=5
sequential num=6
sequential num=7
sequential num=8
sequential num=9
expectedTotalNum=45
Now let's do the same in parallel, just by replacing foreach with Parallel.ForEach:
 // And do it with the new Parallel Programming classes in .NET 4
 int totalNum = 0;
 System.Threading.Parallel.ForEach(someInts, num =>
   {
     Console.WriteLine("parallel num=" + num);
     totalNum += num;
   });
 Console.WriteLine("totalNum=" + totalNum);
While the result is still the same, because totalNum is 45 at the end of this loop, we get there in a different way. As you can see this is a little bit confusing at first because the adding does not happen sequentially anymore, but in parallel instead. Also note that this output can change every time you execute it and can look very different on machines with a different number of CPUs:
parallel num=0
parallel num=2
parallel num=5
parallel num=7
parallel num=8
parallel num=9
parallel num=6
parallel num=3
parallel num=4
parallel num=1
totalNum=45
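A quick warning though: totalNum += num is a read-modify-write on a shared variable, so two iterations running at the same time can lose an update. With only ten tiny numbers it still comes out as 45 here, but it is a race. A minimal thread-safe sketch (same beta namespace as above; in the final .NET 4.0 release Parallel moved to System.Threading.Tasks) would use Interlocked.Add for the sum:

 // Same loop as above, but the shared sum is updated atomically
 int safeTotalNum = 0;
 System.Threading.Parallel.ForEach(someInts, num =>
   {
     Console.WriteLine("parallel num=" + num);
     // Interlocked.Add performs the read-modify-write as one atomic operation,
     // so concurrent iterations cannot lose each other's updates
     System.Threading.Interlocked.Add(ref safeTotalNum, num);
   });
 Console.WriteLine("safeTotalNum=" + safeTotalNum);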
Okay, good stuff so far, but you might not always have a foreach loop and you might not want to wait for it to complete anyway. Maybe you just have some work tasks that need to be executed, no matter in what order, and you might not even care whether they complete their work right away or a little bit later. A method could just queue up some work that has to be done and then return while the work tasks are executed in the background. This is where you would have used the Thread or ThreadPool classes in the past, which are great on their own, but you always have some setup code, they are hard to test, and you could not use them all over the place because setting up threads is a costly operation; if your work is just a few lines of code, it was always a better idea to execute it right away. But fear no more, now you can just create new tasks with the new Task class in the System.Threading namespace. This is a much easier job and has many advantages because .NET 4.0 handles all the thread creation for you, reuses threads once they are done with their task, and all of this works in a very performant way. Setting up tasks is a little bit more work than just executing Parallel.ForEach, but it allows much greater flexibility and you can add more tasks from wherever you are. Tasks can even have children and you can have a lot of complex code using all these tasks. Testing multi-threaded code is obviously harder than just writing sequential code, but with all the great additions to the IDE it is now easier than ever, thanks to the new Parallel Tasks window and the Parallel Stacks window, which shows you all the running threads and where all the task code is.
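To show the basic idea before the bigger example below, here is the simplest possible use: start a single task and wait for it (a tiny sketch using the same Task.Factory.StartNew call as the code further down):

 // Minimal sketch: start one task and wait for it to finish
 Task simpleTask = Task.Factory.StartNew(delegate
   {
     Console.WriteLine("Hello from a task!");
   });
 simpleTask.Wait();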

The following code will create 10 tasks and do the same thing as above, but with several additions. To be able to wait for all the tasks to finish we add all 10 tasks to the allTasks list. We also have to make sure that our local foreach variable does not change while we are executing tasks: the foreach loop will quickly create all tasks, but might not execute them right away, therefore using num directly can cause problems. Instead we just create a local copy of num and use that, which can't change because we never increase it like we do with num. Finally we add some boring thread sleeping to make it easier to debug this code and check out what's going on by adding breakpoints. Without the sleep the code still works, but checking out the Parallel Tasks and Parallel Stacks windows would most likely give us no results, or only the last few tasks that are still being executed at the end of the foreach loop, because the tasks are so simple and executed very quickly. We even wait a little after the foreach to make sure all the tasks have been added and are either being executed right now or are scheduled (waiting for execution).
 // And finally some tasks, yeah!
 totalNum = 0;
 var allTasks = new List<Task>();
 foreach (int num in someInts)
 {
   // We need a local variable for num because num itself can change
   // at the end of this loop before the task might even be executed!
   int numToBeAdded = num;
   allTasks.Add(Task.Factory.StartNew(delegate
   {
     totalNum += numToBeAdded;
     Console.WriteLine("Adding " + numToBeAdded + " in task with id=" + Task.Current.Id);
     // Wait a little for checking out the tasks in the new Tasks window in VS2010!
     Thread.Sleep(1000);
   }));
 } // foreach
 Console.WriteLine("Done with foreach loop. Tasks might still be pending");

 // Wait a little for all tasks to start
 Thread.Sleep(100);
 Task.WaitAll(allTasks.ToArray());

 // And finally return the result (45 again if everything worked)
 Console.WriteLine("totalNum=" + totalNum);
First of all the results, which add up to 45 again. This only works with numToBeAdded; if you use num instead it will sometimes give you different results, because at the time a task executes, num might already have changed, especially if you have more tasks than CPUs available for execution and time consuming code like Console.WriteLine or even a Thread.Sleep in there!
Adding 1 in task with id=1
Adding 7 in task with id=6
Adding 8 in task with id=7
Adding 6 in task with id=5
Adding 4 in task with id=3
Adding 3 in task with id=4
Done with foreach loop. Tasks might still be pending
Adding 9 in task with id=8
Adding 0 in task with id=10
Adding 5 in task with id=2
Adding 2 in task with id=9
totalNum=45
As you can see this looks even more confusing than the way Parallel.ForEach added those numbers, because while we create those tasks sequentially (numbers 1-10), it does not mean they are also executed in exactly that order. To make it easier to check these things out there is the Parallel Tasks window, which shows the following after starting all 10 tasks (I added a breakpoint after the Thread.Sleep(100), just before we wait for all tasks to complete):
[Screenshot: the Parallel Tasks window with all 10 tasks]
Not only do we see all of our 10 tasks here, we also see right away that 8 of them are currently being executed (because I have 8 logical CPUs on my hyper-threaded i7) and all of them are waiting because of the stupid Thread.Sleep(1000) we added to each of those tasks. A second later those tasks are done and the last 2 are executed as well, most likely reusing 2 of the threads created earlier. You can click on each task and see where it is currently executing in the source code, and you can also check out all the thread information in the normal Threads window. But even more useful than the Parallel Tasks window is the Parallel Stacks window, which shows how all this code is related, which task or thread created which new task, and so on:
[Screenshot: the Parallel Stacks window]
All good stuff. While I already have some ideas on how to use this in some of my current projects, I have not ported anything to .NET 4.0 / VS2010 yet because of the many issues I have with the IDE (no addins, the color theme not really working, I always have to reset it when starting VS2010, and I also don't like that the project and solution formats have changed, so I cannot easily switch back to VS2008, etc.). But hopefully more and more addins will work for VS2010 and some of the issues get fixed, then it will be great to use all this new .NET 4.0 stuff (dynamics, parallel extensions, MEF, etc.).

Early Visual Studio 2010 experiences

by Benjamin Nitschke 18. May 2009 23:39
The VS2010 beta came out a few hours ago. Installing it took way too long (over an hour, VS2008 takes less than 10 minutes for me) and it seems some addins don't install or just don't work, but other than that everything seems to work great:
  • TestDriven.NET provides a VS2010 option and it will appear in Addins, but the menu items will not appear anywhere. Just use the keyboard shortcuts and everything still works :) .. at least it worked for a short while (see below). Hopefully Jamie Cansdale fixes this soon because without TestDriven.NET I can't really use VS2010 right now. MS provides its own unit testing framework, which is integrated into VS2010 and now even available in the Pro edition (it was only in the more advanced editions before), but I still don't like it. It does not output anything to the console except exceptions, it is not possible to just start an ad-hoc unit test, and for all my functional tests it is useless as well. For just "normal" unit test projects the VS test integration is quite ok, but I would suggest trying out NUnit or xUnit together with TestDriven.NET, which is just more flexible, easier to use and improves productivity IMO.
  • Completely unable to install any CodeRush version (neither 2, 3, 9.1 nor Xpress), so I'm also unable to use my CR_Commenter :( Hopefully DevExpress provides a version soon, they were always very quick with early VS2005 and VS2008 support!
  • Most other addins I tried also did not install or just do not appear anywhere (only in VS2008)
  • One of my own addins I wrote 2 years ago for VS2008 also does not appear, it probably needs to be configured to appear in the VisualStudio/10 reg key!
Most annoying:
  • Even though Rico Mariani (the .NET performance god) is helping the VS team give VS2010 better performance, it currently lags all over the place. I already had 10 second delays in the Options, Add References and Renaming dialogs. It always seems to be an issue with opening stuff for the first time. After a while everything seems to be fast, but I would not say this is faster than VS2008 yet.
  • Only tried this on my home PC so far, but every time I close VS2010 I cannot open it again until I restart (or at least log off/log on), which is REALLY annoying. I will just keep VS2010 open all the time now and hope it never crashes ^^ The start splash-screen is also messed up and completely black. Maybe this only happens on my home PC because I tried to install so many addins, will try this tomorrow at work too.
    Update 2009-05-18: I just found out what was causing this. Some addins tried to load but failed for some reason, and then the addin loading code locks up and VS never finishes starting up (the process devenv.exe is still there, but you just don't see any IDE window, it is never created). It seems to be related to the TestDriven.NET-2.21.2448 version; I also tried TestDriven.NET-2.20.2438 - same problem :( (Jamie Cansdale, the creator of TestDriven.NET, was also recently blogging about this). Then I used , which also already has VS2010 support and that seems to work fine (no menus, but the keyboard shortcuts work fine). I posted more details on Jamie's blog!
Every project I have opened had to be converted to the new VS 10 format, which is kinda stupid because it seems nothing has changed; all the code works just fine and no changes were required. I will start using some .NET 4.0 features and test out stuff in the next days, especially Parallel Programming and MEF (Managed Extensibility Framework). I will probably also play around with the C# 4.0 dynamic keyword a bit, but can't think of anything yet where I would really need it. Maybe to talk more easily to IronPython or my own DLR language ..

Cool features:
  • The start page is pretty good and easily customizable, but I will probably never see it since I set up my VS to always load the last solution (like my Firefox ^^)
  • It is easy to drag tabs around and move them to different screens. Finally I can use some multi-monitoring.
  • Old exported VS2008 settings also work just fine in VS2010, even the coloring :) For some strange reason the background color is always white when I start VS2010 for the first time; after going to the options and closing them again, it gets the correct color (black for me).
  • Overall the IDE is very nice and fits perfectly into my color theme. It's also a lot easier to see what's going on and which tab is selected, and positioning tool windows all over the place is better than ever before.

Go check out VS2010 yourself; it is available on MSDN right now and on the MS site on Wednesday for everyone else.

Most Recently Used Tab Tweak for Visual Studio

by Benjamin Nitschke 18. May 2009 14:31
While I'm waiting for the VS2010 beta today (refreshing blogs and MSDN every few minutes :D), I just found out that you can have your Visual Studio tabs automatically ordered by adding the following registry value under HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0:
  • UseMRUDocOrdering: 1 (DWORD)
Now when you select any tab in Visual Studio, it will automatically move to the very front, thus keeping your recent documents at the start. It is kinda weird at first, but it seems to be a very useful feature. I like it already :)
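If you prefer to set this from code instead of using regedit, a tiny C# sketch using the Registry class from Microsoft.Win32 (with exactly the key path and value name from above) would look like this:

 using Microsoft.Win32;

 class SetMruDocOrdering
 {
   static void Main()
   {
     // Creates the UseMRUDocOrdering value if it is missing, otherwise overwrites it
     Registry.SetValue(
       @"HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0",
       "UseMRUDocOrdering", 1, RegistryValueKind.DWord);
   }
 }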



In case you don't want to create the registry key yourself and want me to help you out modifying your registry (always a safe idea), just execute this file to do the job:


PS: Since my brother is running a TOR exit node server, Google does not like it anymore when I do searches, because it thinks I search too much and might be infected with viruses and trojans (yeah, sure ..). I could use Google Custom Search sites like Blackle.com, which return the exact same results as Google, but for now I'm trying out Yahoo once again (I used it a lot almost 15 years ago, when there was no Google). Sometimes Yahoo is stupid and does not give good answers, e.g. searching for "year yahoo was launched" gives you random answers on Yahoo, but Google returns the Yahoo Answers link with the correct answer as the first result, wtf?

However, most of the time you do not get the stupid SEO-optimized search results and totally messed up product searches you get with Google or Live Search. As a Microsoft MVP I also get asked a lot why I don't use Live Search, and Microsoft is constantly trying to get more people (and MVPs) to use Live Search, but the results are very close to Google's most of the time anyway and the interface is just strange if you have been used to Google for 10 years (the English Live Search is actually nice, with a new image every day, but the German version is totally messed up, just white boxes??). Yahoo at least gives different results, which are sometimes better, sometimes worse .. Maybe we all should switch search engines from time to time; it's really stupid that Google has almost 95% market share here in Germany and it returns such crappy results so often (finding drivers has become 10 times as hard in the last years, searching for files or torrents is impossible because you only get SEO sites, no real results, product searches are totally messed up, Google Groups is not used much anymore, etc.). Okay, enough ranting about search engines .. time to refresh MSDN again.

VS2010 coming soon .. and a little tool: KillEmptyDirectories

by Benjamin Nitschke 14. May 2009 14:17
Visual Studio 2010 beta is apparently coming soon. Since I used VS2005 beta and VS2008 beta when they came out, I will be an early adopter once again. I really want to use the cool Parallel APIs in VS2010 and C# 4.0 dynamic stuff :)

Another topic: I try to synchronize several of my servers at different locations every day. This way it does not matter where I download or create a file, I can still use it the next day wherever I am (home, work, different work ^^). I use SyncBackSE for that job, a great tool, but the problem with it is that on slow internet connections (e.g. the slow DSL at work right now) it takes forever to even scan the directories. It's about 30,000 directories that have to be synchronized and this takes many hours, especially if the internet connection is already doing something else. Obviously uploading and downloading a few GB can also take a lot of time, but in 95% of the cases nothing or just a few MB change every day.

Because searching sub directories is so slow, I tried to make it faster by removing directories where possible (compressing 100 directories into a zip file, removing empty directories or old files, etc.). But this is obviously a lot of work for this many directories and I can easily overlook that sub directory 3592 contains 1,500 empty folders of some crap from years ago that no one will ever need again. For this reason I quickly wrote this little tool called KillEmptyDirectories; you can see a picture on the right! It only took me an hour to write, so there is no rocket science here and it is very straightforward. I did not even unit test it, I just tested it by executing it on several directories. It can also remove hidden and read-only files by changing the file attributes (otherwise File.Delete will always throw an UnauthorizedAccessException). I tested it on around 30,000 directories and was able to kill around 8,000 of those (deleting a lot of old stuff) ^^ With some additional compression of unused older directories I got it down to around 3,000 directories (from 30,000), so the scanning process is now about 10 times faster.

If you want to try it out, you can download it here. But use it with care, you can obviously kill directories with it, and by using the "Always kill these directories" option you can even delete directories with files in them. An extra confirmation message will appear if you try to use that feature. And this is the main function that does all the directory killing:

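 // Note: this method needs System.IO (Directory, File, Path), System.Linq
 // (for the Contains extension on string[]) and System.Windows.Forms (MessageBox).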
/// <summary>
/// Recursively kill directories, will be called as many times as we have
/// sub directories (can be many thousand times). Will always go through
/// all subdirectories first in case we can remove them, which makes it
/// much easier to delete directories with just empty sub directories.
/// </summary>
private int RecursivelyKillDirectories(string directory,
    string[] includeFiles, string[] alwaysKillDirectories,
    bool alwaysKillThisDirectory)
{
    int directoriesKilled = 0;
    string[] subDirectories = Directory.GetDirectories(directory);

    // Only delete this directory if there are no useful files in here!
    // Note: GetFileName will give us the last part of the directory! 
    if (alwaysKillDirectories.Contains(Path.GetFileName(directory)))
        alwaysKillThisDirectory = true;

    // Handle subdirectories always first (maybe we can kill them too)
    foreach (string subDir in subDirectories)
        directoriesKilled += RecursivelyKillDirectories(subDir, includeFiles,
            alwaysKillDirectories, alwaysKillThisDirectory);

    // Get all files here and count how many of those we can ignore
    string[] files = Directory.GetFiles(directory);
    int ignoreFileCount = 0;
    foreach (string file in files)
        if (includeFiles.Contains(Path.GetFileName(file)))
            ignoreFileCount++;

    // Only found ignored files (or no files at all) or do we want to kill
    // this directory anyway?
    if (files.Length == ignoreFileCount ||
        alwaysKillThisDirectory)
    {
        // Check again if we don't have any sub directories left here
        // Maybe the subdirectories above were already killed now!
        subDirectories = Directory.GetDirectories(directory);
        if (subDirectories.Length == 0 ||
            alwaysKillThisDirectory)
        {
            try
            {
                // Kill all files in it (only ignored files anyway)
                foreach (string file in files)
                {
                    // Make sure we can delete hidden and readonly files
                    FileAttributes attributes = File.GetAttributes(file);
                    if ((attributes & FileAttributes.ReadOnly) != 0 ||
                        (attributes & FileAttributes.Hidden) != 0)
                        File.SetAttributes(file, FileAttributes.Normal);
                    File.Delete(file);
                } // foreach
                Directory.Delete(directory);

                // We killed something, yeah
                directoriesKilled++;
            } // try
            catch (Exception ex)
            {
                MessageBox.Show("Failed to delete " + directory + ": " +
                    ex.ToString());
            } // catch
        } // if
    } // if

    // Most of the time nothing was killed
    return directoriesKilled;
} // RecursivelyKillDirectories(directory, includeFiles)
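For reference, calling it could look roughly like this (just a sketch; the directory and the two lists are made-up example values, in the real tool they come from the UI):

 // Hypothetical example values; the actual tool fills these from its text boxes
 string[] ignorableFiles = { "Thumbs.db", "desktop.ini" };
 string[] alwaysKill = { "obj", "Debug" };
 int killedCount = RecursivelyKillDirectories(@"C:\OldProjects",
     ignorableFiles, alwaysKill, false);
 MessageBox.Show("Killed " + killedCount + " directories.");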

Moved a lot of websites today

by Benjamin Nitschke 11. May 2009 20:37
Hi guys, sorry for not updating much here last week. I have been quite busy with all the iPhone projects going on (at mobile-bits.de) and of course our big game project at exDream (btw: we made a cool new team photo today with 15 people from exDream, it will be up soon at http://www.exdream.com/company.html). I also worked a lot on the DLR language I started a month ago and I'm quite happy with the current state: everything works very well right now and the very basic language functionality (only some very simple commands) is already there and very fast and easy to use. I will hopefully improve the language to a usable state this month, but it will probably take a long time (maybe a year) until everything is done for this language. More on that in a few weeks maybe ..

Because we also lost one of our main website servers at exDream last week (the old publisher didn't want to pay for it anymore), I had to move a lot of websites to a much slower server today (sites for ArenaWars, Rocket Commander, Mods, EuroVernichter, Xna Racing Game, other Xna projects, some internal sites, etc.). We will get a better internet connection for it in the next few weeks, but for now it is very slow and not very enjoyable to surf on there. The most annoying thing while moving all the wwwroot directories to the new slow server was the fact that the ACL permissions were totally screwed up. This has happened to me before, but today it was very annoying. I don't know exactly why this happens, but it seems when you extract files into wwwroot your existing directory permissions are more or less ignored. This only happened to me on that Win2003 server; I tried reproducing the same thing with Windows 7 at home, but there the directory permissions are set correctly when extracting sub directories. For some folders I could just set those permissions again, which is kinda strange because the base directory already had the permissions for the exact same user. But for other directories I could see that the web user permissions were already set (so I could not add them again), yet once accessing the website I would still just get:

HTTP Error 401 401.3 Unauthorized: Unauthorized due to ACL on resource

or this other error once I allowed my user to access the aspx files, which is pretty confusing too:

Parser Error Message: An error occurred loading a configuration file: Failed to start monitoring changes to 'C:\inetpub\wwwroot\dummywebapp\web.config' because access is denied.

Since I had set the ACL permissions and they looked correct, I was very confused and tried a lot of other stuff like changing the web application pool, recreating the web apps, using all kinds of different users for those web apps, etc. But none of that really helped. Only after I made sure all permissions for all files plus all directories of this web app were set correctly did it finally work. I would say this problem should be fixed and the error messages should be better, but Windows 2008 and Windows 7 are already so much better here; I was easily able to extract the exact same .rar file on my Windows 7 and everything just worked as expected, as opposed to the Win2003 server ..
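In case someone runs into the same thing: what finally helped was re-applying the permissions to every file and sub directory, not just the top folder. A rough C# sketch of one way to do that, adding an inheritable read rule at the top so it propagates down (the IIS_WPG account name is an assumption for a default Win2003 IIS 6 setup; the path is just the dummy one from the error above):

 using System.IO;
 using System.Security.AccessControl;

 class FixWebAppAcl
 {
   static void Main()
   {
     // Adjust path and account for your own web app
     var webAppDir = new DirectoryInfo(@"C:\inetpub\wwwroot\dummywebapp");
     DirectorySecurity security = webAppDir.GetAccessControl();
     // Grant read+execute and let the rule inherit to all files and sub directories
     security.AddAccessRule(new FileSystemAccessRule("IIS_WPG",
       FileSystemRights.ReadAndExecute,
       InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
       PropagationFlags.None, AccessControlType.Allow));
     webAppDir.SetAccessControl(security);
   }
 }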

Visual Studio compile times on different disk drives and SSDs

by Benjamin Nitschke 3. May 2009 20:39
This week I wanted to test using a Ramdisk (a RAM drive using just your main RAM) for compiling Visual Studio projects. Playing games or doing other disk intensive stuff with it would be great too, but most games are just way too big, and smaller, older games load pretty quickly anyway.

Copying files to and benchmarking a Ramdisk is incredibly fast at 4-6 GB/s (theoretically my RAM should almost reach 10 GB/s, but well, that's already way fast enough). Since I have been using the Windows 7 RC since Friday when it came out on MSDN, I had a lot of problems finding Ramdisk programs that actually work and do not crash every 2 seconds. Currently I use RamdiskVE by Cenatek, but the company does not exist anymore because it was bought by Dataram, which now provides RamDisk on its own. I was however unable to run Dataram's RamDisk on Windows 7; it constantly crashes and also has many other limitations. Here is also a discussion on the OCZ forum about Ramdisks if you want to check out some of the products for yourself.

The Ramdisk is also useful for temp files, IE temp files and other scratch disk functionality (e.g. Photoshop), but you should only use it for programs too stupid to use more of your RAM (e.g. because a program is 32 bit and you have way more memory in your 64 bit system). It will also increase the lifetime of your hard disks or SSDs in case you write a lot of stuff to your Ramdisk, because the Ramdisk is only loaded once at start up and saved once when shutting down your PC. But you should be aware that in case of a crash you will obviously lose all the changed content on your Ramdisk.

For testing I used my new rig with an i7 920 D0 CPU and 12 GB RAM, which I overclocked from 2.66 GHz to ~4.4 GHz. This is pretty fast; for example the For vs Foreach Performance application from the last blog post runs twice as fast as on my 6600 CPU from last week (basically all times cut in half, which is more than I expected).

I also tested quite a lot of hard disks and SSDs, which was kinda interesting because I learned that it does not matter if you mix totally different hard drives in a Raid 0 as long as both are similarly fast. For example I tried using 2 older 160 GB disks in a Raid 0 and only got around 60mb/s, which is slower than a single Raptor hard disk, but putting a new fast Samsung 250 GB and a Raptor 150 GB together into a Raid 0 gave me around 160mb/s. To keep things crazy I also added another Intel X25-M SSD to the one I already had and put them into a Raid 0 too, which is amazingly fast (>450mb/s). For the highest speed you should always make sure to enable the Write-Through Cache option on the Raid controller (in my case an Intel software raid with the ICH10R controller) AND to enable write caching, and finally turn off the Windows write-cache buffer flushing (both can be found in hardware->disk->policies) for your disks. Please note that I do not care about data redundancy since everything I do is saved on a server with a Raid 5 anyway. If one of my Raid 0s failed it would just be annoying to reinstall everything, but I would not lose any of my work or files.

So let's see how much benefit you actually get from compiling several different projects on a fast PC with those different disk configurations. The compile time obviously depends heavily on your CPU speed, but I tried to measure how much the total time changes just by using different disks. The following example shows a full recompile of one of my bigger solutions:

[Screenshot: full recompile of one of my bigger solutions]
All compiles are full recompiles from scratch (no intermediate files yet). I tested the DLR (change set 23173, about 25 MB in 20 C# projects), one of my own solutions with 5 C# projects and 1 C++ project (~43 MB), and finally the good old Quake3 v1.32 source code (5 MB of C code). Since I do not compile much C++ at home I was too lazy to test bigger C++ projects, but I would guess most times would just scale and the conclusion would be the same.

4 tests were executed:
  • Loading Visual Studio 2008 and opening each solution (average time),
  • Compiling the DLR and running the ToyConsole sample. This generates around 25 MB of files (17 MB of those are .pdb), ~60 files.
  • Compiling and running my own solution (~43 MB, 5 C# projects, 1 C++ project). Generates ~47 MB (lots of copying, >300 files).
  • And finally compiling Quake3 (~500 C files).
Most of the tests were done several times, but I timed all of them by hand with this cool freeware stopwatch called PC Chrono, so keep in mind that the results are not very accurate. Each test was done on the following drives:
  • Good old Raptor Hdd (one of the fastest desktop disks you can get, almost empty for better testing)
  • Single Ocz Ssd with 150mb/s read/write (remember that I complained about the bad JMicron controller on it way back in February)
  • Intel X25-M Ssd Raid 0 Array with 450mb/s read, 140mb/s write (nice ^^)
  • And finally the Ramdisk with 4500mb/s read+write (at least)


Disk / compile times            Loading VS+solution   Compiling DLR+start ToyConsole   Compile&Run own solution   Quake3 compile
Good old Raptor Hdd             3.75s                 9.01s                            2.95s                      17.57s
Single Ocz Ssd 150mb/s          2.96s                 10.47s                           3.24s                      22.67s
Intel X25-M Ssd Raid 0 Array    1.26s                 7.79s                            2.88s                      17.53s
Ramdisk with 4500mb/s           1.43s                 7.70s                            2.51s                      15.89s

With this data the following fancy graph was built. It shows that there are some improvements in several areas, but as long as you are not limited by your disk speed or IOPS (the number of input/output operations you can do per second), you do not get much benefit from way faster drives (the Ramdisk is at least 45 times faster than the hard disk I used, but the performance improvement is maybe 10 or 20%):
[Graph: compile times on the different drives]
Since it is so much fun using the Intel SSD Raid 0, I will keep using it. It is also big enough for all my programs and games; for example launching Left 4 Dead levels is 2-3 times faster than before. But I will probably not continue to use the Ramdisk for compiling. I can just keep everything on the SSD raid, which will not run out of space as quickly as my small Ramdisk. Maybe when I do some file-intensive stuff in the future I will try the Ramdisk idea again.

For now my advice for getting the fastest Visual Studio experience would be: get the fastest CPU you can (the i7 920 is pretty nice, overclocking is important too, without it my tests would be 40% slower, and 8 hyper-threads might also be useful in the future), enough RAM (4-6 GB), a fast disk like the Intel X25-M, and of course use Windows 7 (since everything responds much faster). Then you can really have a lot of fun compiling big and small projects, playing games, and doing other stuff on your PC.

BTW: My brother also just blogged about Windows 7 RC and using RamdiskVE. He also noticed that the network layer in Windows 7 is way faster for him than it was in Windows Vista: in Vista he got 30-60mb/s max, now it is around 110mb/s when copying files over the network.
