27 May 2009

New Laboratory

A few pics from our new office right at the Alexanderplatz in the center of Berlin. The office is still sort-of under construction because of some delays, but it will be finished soon (in about 2 weeks, hopefully). The good thing is that it is within walking distance from my home (about 3 kilometers), and it's a very nice walk (at least if the weather plays along).

The new office is actually a LOT bigger than what's visible in the photos. We decided on a floor plan where we can have around 5 to 16 people in one room, which is big enough to enable direct communication between team members, but not big enough to become noisy. There are several meeting rooms, a chill-out area, 3 kitchens, a server room plus a fallback server room, and enough room for future growth (we've had to move to a bigger office every 2 years since Radon Labs GmbH was founded; hopefully the new office is big enough that we can stay a bit longer this time around, because as far as Berlin is concerned, this is pretty much the perfect location).

Here's what we see when looking out of the office towards the Fernsehturm:

IMG_6187 


That's what it looks like in the areas that are still under construction, this is going to become one of the meeting rooms:

IMG_6234


And that's what the team offices look like (this is the room of the Drakensang character department, who are presumably out to lunch or fleeing from the alien attack or something):

IMG_6203


And this was right before the alien mothership appeared in the sky and started to lay waste to the city:

IMG_6267

PS: thanks to Stefan for letting me use his photos :)

23 May 2009

Another day with PSN

  1. Read somewhere on the Internets that Infamous demo is on PSN.
  2. Turn on PS3.
  3. “A system update is available”.
  4. Stare for an eternity at a very slowly growing bar indicating not only the current download status but also my growing anger level.
  5. Stare for another eternity at another status bar which says that it's installing the update or something…
  6. Contemplate how the PS3 feels more and more like a fucking IBM-PC connected to a 9600 baud analog modem.
  7. Finally! Hurry into the PSN shop, try to make sense of the confusing layout, until (after what feels like another eternity) it slowly dawns on me that the European shop (or German shop, or whatever other obscure region got the fist up its ass this time) does in fact NOT have the demo which (maybe) is only on the Japanese shop (or Americanese, or Papua-New-Guineaese, Kamerunese or whatever other region is lucky this time).
  8. Turn off the PS3 while shaking head in frustration thinking about how internet commerce ought to be the future and wipe out boxed products completely. Somebody should tell the bean counters and lawyers that the internet simply wasn’t designed with national borders in mind.

It’s seldom enough that I feel like turning on my PS3, but being greeted with a fucking system update which takes forever to download (and install!) every single fucking time is a huge *fucking* TURNOFF! At least on the PC, I can continue to do other things while Windows is updating.

To be fair, Microsoft isn't any better when it comes to denying us Germans the good stuff, but at least I can check Major Nelson's webpage for the all-too-common "Not available in Germany" message before spending forever looking for the download on the Marketplace. And system updates are just not an issue on the 360. The last one (the NXE update) took under 2 minutes to download (and install, might I add), and unlike a typical PS3 system update, the changes are usually visible to the user.

Maya Programming #1

[Edit: fixed some bugs in the MEL code]

Maya is an incredibly complex beast, but it is also an incredibly powerful beast. The C++ API fully exposes this complexity to the outside (even more than the MEL API does), and after about 10 years of working with the Maya SDK I still have some sort of love/hate relationship with the C++ API (and a thorough hate/hate relationship with MEL).

Much of the complexity stems from the fact that even a simple Maya scene is a huge network of interconnected tiny nodes (the dependency graph), and while the API tries its best to provide shortcuts for finding the relevant data, one often has to write several (sometimes many) lines of code to get to the actually wanted piece of data. Typical Maya code (at least typical exporter-plugin code) often consists of cascades of calls to getters and iterators just to find and read a single data element from some object hidden deep in the dependency graph. Things have gotten better over time though: almost every new Maya version has provided shortcuts for getting at typical data (like triangulated mesh geometry, vertex tangents, or the shader nodes connected to the polygons of a shape node). I think the Maya team needed a little while before they got a feel for the needs of the gaming industry. Since Maya has more of a movie-industry background, and (at least in Europe) is still a bit exotic compared to its (ugly) step-brother 3ds Max, this is understandable.

When programming a Maya plugin, the actual API documentation is relatively useless, because it's mainly a pure class-interface documentation and doesn't explain how the objects of a Maya scene relate to each other, or what the best practice is to solve a specific task (and worse, there are often a lot of different ways to do something, and it's up to the programmer to find the easiest, or cleanest, or fastest way). Instead of relying on the documentation, it's often better to explore the object relationships directly in Maya through the Script Editor, and only when one has a clear understanding of how the data is organized in Maya, look up the API docs to find out how it's done in C++.

Before starting to explore a Maya scene through the script editor one only needs to understand this:

The entire Maya scene is a single Dependency Graph, which is built from Dependency Nodes connected through Plugs (think of it as a brain made of interconnected neurons). Every time the Maya user manipulates the scene, he may add or remove nodes from the graph, connect or disconnect plugs, or feed new data into input plugs. After parts of the dependency graph have been manipulated, it is in a dirty state and must be brought up to date before being displayed (or examined by an exporter plugin). But instead of evaluating the entire graph (which would be very slow in a complex Maya scene with thousands of nodes), only the dirty parts of the graph which actually need updating will be evaluated. This is where the dependency stuff comes into play: every dependency node depends only on the nodes connected to its input plugs, and only plugs marked as "dirty" need to be evaluated. If some data is changed at a plug, the dirty state propagates through the graph to everything that depends on it, and when a result is requested, an "evaluation wave" pulls fresh values through the dirty nodes.

This system might seem like overkill for simple 3D scenes, but as soon as animation, expressions and the construction history come into play it all makes sense, and it's actually a very elegant design.
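To make the dirty-propagation and demand-driven evaluation idea concrete, here's a toy sketch in plain Python (not Maya code; the node names and the trivial "animation curve" are made up for illustration):

```python
class Node:
    """A toy dependency-graph node: output = func(inputs), cached until dirty."""
    def __init__(self, func, *inputs):
        self.func = func
        self.inputs = list(inputs)
        self.dependents = []
        for node in self.inputs:
            node.dependents.append(self)
        self.dirty = True
        self.cache = None

    def set_dirty(self):
        # a change dirties this node and everything that depends on it
        if not self.dirty:
            self.dirty = True
            for node in self.dependents:
                node.set_dirty()

    def evaluate(self):
        # pull model: only dirty nodes recompute, clean ones return the cache
        if self.dirty:
            self.cache = self.func(*[n.evaluate() for n in self.inputs])
            self.dirty = False
        return self.cache

# a "time" node feeding a toy "animation curve" feeding "translateX"
time = Node(lambda: 0.0)
curve = Node(lambda t: 5.0 - t, time)      # toy curve: x = 5 - t
translate_x = Node(lambda x: x, curve)

print(translate_x.evaluate())   # 5.0
time.func = lambda: 10.0        # the user drags the time slider...
time.set_dirty()                # ...dirtying everything downstream
print(translate_x.evaluate())   # -5.0
```

The second `evaluate()` only recomputes because the time change dirtied the chain; asking again without any change would just return the cached value.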

Now let's go on with the exploration:

Some important MEL commands for this are listAttr, getAttr, setAttr and connectionInfo. Let's start with a simple illustration of a dependency chain.

First open Maya's script editor and create a polygon cube at the origin (execute the command with Ctrl+Return):

polyCube

This creates a transform node (pCube1) with a child shape node (pCubeShape1), plus a polyCube1 construction-history node. Let's list the attributes of the transform node (for the sake of simplicity, treat attributes and plugs as equivalent here):

listAttr pCube1

This produces dozens of attribute names (a first indication of how complex even a simple Maya scene is). Somewhere in this mess is the translateX attribute which defines the position on the X axis. Let’s have a look at its content:

getAttr pCube1.translateX

This should return 0, since we created the cube at the origin. Let’s move it to x=5.0:

setAttr pCube1.translateX 5.0

When the command executes, the cube in the 3D view should jump to its new position.

So far so good. Let's create a simple dependency by adding a transform animation. Just type this into Maya's script editor (start a new line with Return, and execute the whole sequence with Ctrl+Return):

setKeyframe pCube1;
currentTime 10;
setAttr pCube1.translateX -5.0;
setKeyframe pCube1;

This sets an animation key at the current position (time should be 0 and the cube’s x position should be 5), then sets the current time to 10, moves the cube to x=-5 and sets another animation key.

Now grab the time slider and move it between frame 0 and 10; the cube should move on the X axis. Now let's try to read the translateX attribute at different points in time. Move the time slider to frame 3 and execute:

getAttr pCube1.translateX

This should return 2.777778. Now move the time slider to frame 6 and read the same attribute again:

getAttr pCube1.translateX

The result should now be -0.555556. The previously static attribute value now changes over time since we added an animation to the object. Obviously some other node manipulates the translateX attribute of pCube1 whenever the current time changes. Let's see who it is:

connectionInfo -sourceFromDestination pCube1.translateX

This yields: pCube1_translateX.output

So there’s an object called pCube1_translateX, which has a plug called output, which feeds the translateX attribute of our cube with data. Now let’s check what type of object this pCube1_translateX is:

nodeType pCube1_translateX

The result: animCurveTL.

Now let’s check what the docs say about this node type. Open the Maya docs, go to “Technical Documentation –> Nodes”, and type animCurveTL into the “By substring” search box. This is what comes up:

This node is an "animCurve" that takes an attribute of type "time" as input and has an output attribute of type "distance". If the input attribute is not connected, it has an implicit connection to the Dependency Graph time node.

Interesting! Let's double-check: if the output plug of the animation curve is connected to the translateX attribute, it should return the same value for a specific time… Moving the time slider to frame 3 should yield a value of 2.777778, and indeed, a

getAttr pCube1_translateX.output

returns the expected value.
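The non-linear values (a linear curve would give 2.0 at frame 3, not 2.777778) come from the animCurve node interpolating between the two keys with spline tangents. As a rough illustration in plain Python (not Maya code; the exact numbers depend on Maya's tangent settings, so the values below are for flat tangents and differ slightly from Maya's), cubic Hermite interpolation between the two keys looks like this:

```python
def hermite(p0, p1, m0, m1, t):
    """Cubic Hermite interpolation between values p0 and p1
    with tangents m0 and m1, for t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return ((2*t3 - 3*t2 + 1) * p0 + (t3 - 2*t2 + t) * m0
            + (-2*t3 + 3*t2) * p1 + (t3 - t2) * m1)

# keys: x = 5 at frame 0, x = -5 at frame 10; flat (zero) tangents
for frame in (0, 3, 6, 10):
    print(frame, round(hermite(5.0, -5.0, 0.0, 0.0, frame / 10.0), 6))
```

The curve eases in and out of the keys instead of moving at constant speed, which is exactly why getAttr returns those "odd" intermediate values.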

Now let's try something crazy: what if we feed the current value of translateX into the translateZ attribute of our cube, and thus make the dependency chain a bit more interesting? The result should be that the cube moves diagonally when the time slider is moved, even though only the translateX attribute is animated by an animation curve:

connectAttr -force pCube1.translateX pCube1.translateZ

Now, I cheated a bit by using the -force argument. The translateZ attribute had already been automatically connected to its own animation curve when we executed the setKeyframe command (the same thing happened to our translateX attribute). We need to break that connection before connecting the plug to a new source, and -force does just that.

Let's see if it works by moving the time slider. And indeed… the cube moves diagonally on the X/Z plane as expected. Cool shit.

So that’s it. That’s how everything in Maya works. The only difference to a really complex scene is that there are hundreds or thousands of dependency nodes connected through even more plugs.

21 May 2009

Maya Plugin

[Edit: I added some clarification to the “cleanup” point below]

I have started to write a new Maya plugin for Nebula3, which eventually may replace all (or parts of) our current plugin. The actual Maya plugin is only one small part of our asset pipeline, so this is not about rewriting the entire asset pipeline, just replacing one small gear in it. Otherwise it would be a truly Herculean task. In the beginning this will just be a private endeavour, free from time- or budget-limitations, so that no design compromises have to be made. This approach worked quite well for Nebula3 so it makes sense to use it more often in the future.

Our current Maya plugin is stable and fast, but at least the C++ part of it is beginning to show its age; it's becoming harder to maintain, and since it's based on Nebula2 code, low-level things like file I/O are a lot harder to do than in comparable Nebula3 code.

The new plugin will realize ideas I’ve been carrying around in the back of my head for quite some time, and which would be hard to implement into the existing plugin without a complete rewrite. The most important one is:

Separation into a platform-agnostic front-end and several specialized back-ends:
  • The actual Maya plugin will export into intermediate file formats which are completely platform-independent (and probably even somewhat engine-independent). Thus the plugin itself becomes more of a generic 3D engine exporter tool which doesn’t have to change every time a new target platform is supported or an engine feature is added or rewritten.
  • The back-end tools (or libs) convert the intermediate files into the files actually loaded by Nebula3. Those files can (and should) be highly platform-specific.

I'm expecting that the larger chunk of code goes into the platform-agnostic plugin, and that the back-ends will be relatively small and straightforward. The main advantage of this separation is better maintainability. The core plugin can remain relatively stable and clean, while the back-ends can have a higher frequency of change, and the "throw-away-and-rewrite" barrier is a lot lower, since only the relatively small back-end code has to be replaced, without affecting the core plugin and the other platform back-ends (too much). Also, the platform mini-teams have more freedom to implement platform-specific optimizations into their engine ports, since they have complete control over their exporter back-end.
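As a rough sketch of the split (hypothetical function names and data layout throughout, not actual Nebula3 code): the front-end writes one engine-independent intermediate representation, and each back-end consumes it independently:

```python
def export_intermediate(scene):
    """Front-end: extract a platform-agnostic intermediate representation
    from the DCC scene (here just a dict standing in for Maya data)."""
    return {
        "hierarchy": scene["nodes"],   # object hierarchy
        "meshes": scene["meshes"],     # raw vertex/index data
    }

def backend(intermediate, platform, endian):
    """Back-end: turn intermediate data into a platform-specific output.
    Real back-ends would reorder/compress data for the target hardware."""
    return {
        "platform": platform,
        "endian": endian,
        "meshes": [m["name"] for m in intermediate["meshes"]],
    }

scene = {"nodes": ["|tiger"], "meshes": [{"name": "tiger_body"}]}
inter = export_intermediate(scene)
win32_files = backend(inter, "win32", "little")
wii_files = backend(inter, "wii", "big")
print(win32_files["endian"], wii_files["endian"])   # little big
```

The point of the shape is that adding a new target platform only means adding one more small `backend`-style converter; the front-end never changes.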

The main disadvantage is that export times will probably be a bit higher than now. A LOT of effort has gone into optimizing the performance of our toolkit plugin (exporting a scene with hundreds of thousands of polygons should only take up to a few seconds), and writing an additional set of output files may affect performance quite drastically. I'm planning to use XML only for the intermediate object hierarchy structure (which depends on the material complexity of the scene, but shouldn't be more than a few dozen to a few hundred lines for a typical object), and to use binary file formats for the "large stuff" like mesh and animation data. But if the XML files hinder I/O performance too much, I will go to a performance-optimized binary format, even if human readability would be a major plus there (in the end, one set of back-end tools could convert to human-readable ASCII file formats).
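A minimal sketch of what such a binary "large stuff" file could look like (the layout here is made up for illustration and is not Nebula3's actual mesh format):

```python
import io
import struct

# hypothetical layout: uint32 vertex count, then tightly packed
# little-endian float3 positions
def write_mesh(stream, vertices):
    stream.write(struct.pack("<I", len(vertices)))
    for x, y, z in vertices:
        stream.write(struct.pack("<fff", x, y, z))

def read_mesh(stream):
    (count,) = struct.unpack("<I", stream.read(4))
    return [struct.unpack("<fff", stream.read(12)) for _ in range(count)]

buf = io.BytesIO()
write_mesh(buf, [(0.0, 1.0, 2.0), (3.0, 4.0, 5.0)])
buf.seek(0)
print(read_mesh(buf))   # [(0.0, 1.0, 2.0), (3.0, 4.0, 5.0)]
```

A back-end targeting a big-endian console would simply swap the `<` prefix for `>`; the small XML hierarchy file stays readable either way.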

If you're wondering why performance is so critical during export: consider that a project has about ten thousand 3D models to be exported (which isn't unrealistic for a complex RPG project like Drakensang, for example). If the export time can be reduced by only one second per object, the time for a complete rebuild will be reduced by almost 3 hours! Actually, most 3D models batch-export in much less than 1 second in our build pipeline; it's the texture conversion to DDS which eats the most build time…
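The arithmetic behind the "almost 3 hours" claim is simple enough to check:

```python
models = 10_000                 # rough asset count for a large RPG project
saved_seconds_per_model = 1.0   # hypothetical saving per exported object
hours_saved = models * saved_seconds_per_model / 3600.0
print(round(hours_saved, 2))    # 2.78
```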

There are a lot of other things a Maya exporter tool should do right to be considered well-mannered:

  • It should support batch-exporting and automation (command-line batch exporters, means of controlling export parameters on thousands of assets, standardized project directory structures, etc…).
  • It should be designed for a multi-project environment (a modeling artist or level designer must be able to quickly switch from one project to another).
  • It should of course offer a fast and exact preview for immediate quality control (the artist should be able to get an in-engine view of his work immediately).
  • It should not force artists to use archaic file formats. For instance, all texture conversion tools for the various console platforms I have encountered so far only accept crap like TGA or BMP as input, but NOT the industry standard PSD format! Quite baffling if one thinks about it. I don’t know how other companies deal with this, but I think it’s quite unacceptable to keep an extra TGA version for every one of tens-of-thousands textures around, just so that the batch exporter tools will work (for the console platforms we wrote our own wrappers, which first convert a PSD file to a temp TGA file and then invoke the conversion tools coming with the SDKs, but this is REALLY bad for the batch-export performance of course).
  • It should be fault-tolerant: Maya is an incredibly complex piece of technology, and plugins usually have no choice but to support only a specific subset of its features. The plugin should not crash or stop working when it encounters something slightly wrong or unknown in the Maya scene; instead it should provide the artist with clear warnings and readable error messages.
  • It should not require too many restrictions in the Maya scene: for instance, a very early version of our exporter tools required the artist to manually triangulate the scene, which is unacceptable of course.
  • It should clean up the Maya scene during export: it's relatively easy in Maya to create zero-area faces, or duplicate faces, or faces with a zero UV area, etc... The exporter should remove those artefacts, and in those cases where automatic handling is not possible, provide a detailed error log to the artist so that he has enough information to remove those problems manually. [EDIT: this was badly worded… of course the plugin should not modify the actual Maya scene, but instead remove artefacts from the data which has already been extracted from the Maya scene… it's a bad idea to modify the Maya scene itself during export!]
  • It should optimize the scene during export: for instance, the last time I looked at the XNA Maya plugin, it exported a single material group for every Maya shape node, resulting in hundreds of draw calls for our simple Tiger tank example object. This is almost as bad as requiring the artist to work with a triangulated scene. Instead, the plugin should try its best to optimize the scene for efficient rendering during export (like grouping polygons by material, sorting vertices for efficient vertex-cache usage, removing redundant vertices, and so on).
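The material-grouping and redundant-vertex-merging steps from that last point can be sketched like this (a deliberately simplified illustration; a real exporter would also compare normals, UVs and so on before merging two vertices):

```python
def optimize(triangles):
    """triangles: list of (material, (v0, v1, v2)) with vertices as tuples.
    Returns (unique_vertices, {material: [index triples]}) so that the
    renderer can issue one draw call per material."""
    vertices, index_of, groups = [], {}, {}
    for material, tri in triangles:
        indices = []
        for v in tri:
            if v not in index_of:           # merge redundant vertices
                index_of[v] = len(vertices)
                vertices.append(v)
            indices.append(index_of[v])
        groups.setdefault(material, []).append(tuple(indices))
    return vertices, groups

tris = [
    ("steel", ((0, 0, 0), (1, 0, 0), (0, 1, 0))),
    ("steel", ((1, 0, 0), (0, 1, 0), (1, 1, 0))),   # shares two vertices
    ("glass", ((0, 0, 1), (1, 0, 1), (0, 1, 1))),
]
verts, groups = optimize(tris)
print(len(verts))       # 7 unique vertices instead of 9
print(sorted(groups))   # ['glass', 'steel'] -> one draw call per material
```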

Of course this list could go on for a few more dozen points; there's almost 10 years of work in our asset pipeline, and there's probably more C++ and MEL code in it than in Nebula3 (which isn't necessarily a good thing ;)

15 Apr 2009

N3 SDK Apr 2009 download

Here's the new SDK (see previous post), I hope releases will be a little more frequent in the future. As always, source code for the Xbox360 or Wii versions is not included for obvious legal reasons.

http://www.radonlabs.de/internal/N3SDK_Apr2009.exe

Have fun!

14 Apr 2009

New N3 SDK (Apr 2009)

The new SDK release should be ready by tomorrow (unless there's some last-minute fuckup). For now here's a rough What's New list since the last SDK from September 2008:

Tools:
  • new command line tool: archiver3 - multiplatform wrapper for generating file archives
  • new command line tool: n2converter3 - convert .n2 files to .n3 files (Nebula graphics objects)
  • new command line tool: suiconverter3 - batch converter for SUI resources (simple user interface), currently only useful for the Wii port
  • new command line tool: synctool3 - sync local project directory from build server (only useful with our inhouse asset pipeline)
  • new command line tool: countlines3 - count N3 source code lines and generate Excel-compatible csv-file (comma separated values)
Foundation Layer:
  • brought Wii port up to date (not part of the public SDK of course)
  • Util::CmdLineArgs renamed to Util::CommandLineArgs
  • Scripting subsystem moved into an addon
  • Remote subsystem moved into an addon
  • new Macros __ConstructInterfaceSingleton / __DestructInterfaceSingleton
  • new standard define __MAYA__ if compiled as part of a Maya plugin
  • new concept: ExitHandlers (see Core::ExitHandler)
  • new low level debug feedback method Core::SysFunc::MessageBox()
  • new concept "root directory" (see CoreServer::SetRootDirectory)
  • various changes to enable using N3 code in N2/Mangalore apps (N2 now sits on top of a very slim N3 Foundation layer)
  • SizeT and IndexT now signed (had to be done for N2/Mangalore compatibility unfortunately)
  • IO::Console is now an InterfaceSingleton
  • various debug HTML page handlers can now sort table content by columns
  • Debug::DebugServer is now an InterfaceSingleton
  • added a minimal Debug::HelloWorldRequestHandler as example for a simple HttpRequestHandler
  • new comfort method Http::HtmlPageWriter::TableRow2(), saves code when creating a 2-column HTML table
  • added Http::HttpClient class, allows sending requests to HTTP servers
  • Http::HttpInterface is now an InterfaceSingleton and derived from Interface::InterfaceBase
  • new class Http::HttpRequestWriter
  • new class Http::HttpResponseReader
  • added a "single thread mode" to HttpServer
  • added new classes InterfaceBase and InterfaceHandlerBase
  • moved IOInterface-stuff under IO
  • new class Debug::ConsolePageHandler, displays console output in web server
  • generalized Zip filesystem stuff into general archive filesystem, with ZIP support as one specialization (on Wii, ARC files are used instead)
  • ZipFileSystem is now an InterfaceSingleton (no more per-thread wasted memory for archive table-of-contents)
  • added an "AsString()" method to IO::FileTime
  • new methods in Win360FSWrapper: SetFileWriteTime(), GetAppDataDirectory(), GetProgramsDirectory()
  • moved IO::ZipFileStream class to io/zipfs
  • moved path assign methods from IoServer into new InterfaceSingleton: IO::AssignRegistry
  • new standard assigns under Windows: "appdata" and "programs", mainly useful for tools
  • bugfix in IO::ExcelXmlReader for tables with empty cells
  • new class IO::HistoryConsoleHandler, captures console output into a ring buffer (used by Debug::ConsolePageHandler)
  • moved URI scheme methods from IoServer into IO::SchemeRegistry InterfaceSingleton
  • removed critical section from Stream::Open / Stream::Close
  • new method: IO::XmlWriter::WriteComment()
  • new specialized float4-loading methods: Math::float4::load_float3(), Math::float4::load_ubyte4n_signed()
  • vector comparison methods in Math::float4 more intuitive and flexible
  • Math::matrix44() default constructor now sets object to the identity matrix (default constructor still empty in Math::float4!)
  • new constructor from float4 in Math::quaternion
  • moved lots of math functions from scalar.h to platform-specific d3dx9_scalar.h to enable platform-specific optimizations
  • Memory::Heap constructor now accepts initial and maximum heap size
  • memory leak detection for Memory::Heap (doesn't quite work as expected yet)
  • removed global heaps: SmallBlockHeap, LargeBlockHeap, StringHeap
  • added global heaps: PhysicsHeap, AppHeap, StringObjectHeap, StringDataHeap
  • new experimental Win360MemoryPool class
  • Messaging::AsyncPort: can now add message handlers to already opened ports
  • added "deferred handled" flag to Messaging::Message (used in the rendering thread to keep CreateEntity messages around until their resources are loaded)
  • wrapped socket and TCP/IP classes to enable "TCP/IP-over-HIO2-Tunneling" on the Wii
  • System::Win32Registry(): split Read() method into ReadString() and ReadInt()
  • System::ByteOrder::Convert<TYPE>() methods now return a value, added new methods System::ByteOrder::ConvertInPlace<TYPE>()
  • new methods Threading::Interlocked::Exchange() and CompareExchange()
  • Win360Thread::IsRunning() now uses the Win32 function GetExitCodeThread() to detect whether the thread is running
  • new method Threading::Thread::YieldThread() (gives up time-slice)
  • new class Threading::SafeFlag
  • new concept: Timing::MasterTime and Timing::SlaveTime, main thread has a MasterTime object, and distributes "main time" to slave threads (i.e. the render thread)
  • Util::Array now has a MinGrowSize and MaxGrowSize to prevent excessive memory waste for huge arrays
  • new experimental class Util::Delegate (doesn't work in Codewarrior unfortunately)
  • new method Util::FixedArray::AsArray()
  • new methods Util::Dictionary::KeysAs<TYPE>(), and Util::Dictionary::ValuesAs<TYPE>()
  • new class Util::PriorityArray (which actually isn't needed anymore I think)
  • new class Util::RandomNumberTable
  • new class Util::Round
  • made many non-essential Util::String methods non-inline
Render Layer:
  • new Character subsystem (modular character rendering, fresh rewrite of N2's Character3 system)
  • new CoreFX subsystem (ported from Mangalore's VFX subsystem (visual effects)) -> NOTE: will be moved into addon
  • new CoreUI and UI subsystems (simple user interface system) -> NOTE: will be moved into addon
  • new Video subsystem (video playback, currently Xbox360 only) -> NOTE: will be moved into addon
  • new Particles subsystem (rewritten from scratch) -> NOTE: will be moved into addon
  • new PostEffect subsystem (ported from Mangalore) -> NOTE: will be moved into addon
  • new Vibration subsystem (game pad vibration support) -> NOTE: will be moved into addon
  • new Vegetation subsystem (Drakensang's grass renderer, currently broken under N3) -> NOTE: will be moved into addon
  • new concept: RenderModules, clean framework to add functionality to the render thread
  • new concept: AnimEvents, animations can emit events at certain sample times (i.e. for playing foot-step sounds at the right time)
  • new concept: character attachments (swords, etc...)
  • lots of bugfixes and improvements to Animation system
  • new concept: AnimDrivenMotion, synchronize character movement with its current animation
  • new concept: batched messages, drastically reduce communication overhead between threads by client-side batching of messages
  • new methods: Audio::AudioEmitter::Pause() and Resume()
  • new methods: AudioDevice::SetGlobalVariable() / GetGlobalVariable()
  • added/fixed Xbox360 support to XACT audio classes
  • new class: MultipleRenderTarget, wraps MRT rendering
  • new class: MouseRenderDevice (currently only implemented on Wii)
  • added support for GPU-instanced-rendering
  • RenderTarget: added support to resolve depth buffer into texture (Xbox360 only)
  • RenderTarget: added support to create a resolve-texture which can be efficiently read by the CPU
  • added "late-binding" to ShaderVariableInstance
  • D3D9StreamTextureLoader and D3D9Texture moved from win360 to d3d9, since specialized Xbox360 versions exist now
  • Debug::MeshPageHandler can now display a dump of the vertex data in the web browser
  • Debug::TexturePageHandler now displays the current resource state of textures (Initial, Pending, Loaded, etc...)
  • new class CoreGraphics::MemoryMeshLoader
  • renamed CoreGraphics::Shape to CoreGraphics::RenderShape (because of CodeWarrior problems with identical filenames in different directories)
  • added Multiple Render Target support to Frame::FramePass and Frame::FramePostEffect
  • Graphics::Display::GetDisplayMode() now returns display mode actually set by the CoreGraphics::DisplayDevice (may differ from the requested display mode)
  • Graphics::GlobalLightEntity: all light parameter changes are now transferred to the render-thread-side post-creation
  • Graphics::GraphicsEntity: internal entity handle now only becomes valid after resources have been loaded on the render-thread-side
  • Graphics::GraphicsInterface now uses batch messaging to communicate with render-thread (only 1 message sent per frame)
  • Graphics::Handle is now a smart pointer (fixes problems where render-thread graphics entities were disposed too early)
  • Graphics::ModelEntity: support for AnimDrivenMotion
  • new Input::GamePad methods: ButtonAsString(), AxisAsString(), GetStateAsInputEvents()
  • InternalGraphicsEntities are now registered with the InternalGraphicsServer
  • lots of changes in InternalGraphicsServer and InternalModelEntity which require a proper cleanup doh
  • added support for 2-sided lighting to global lights
  • new classes: Models::AnimatorNode and Models::AnimatorNodeInstance (legacy Nebula2 stuff)
  • new classes: Models::CharacterNode and Models::CharacterNodeInstance, integrate character rendering with model nodes
  • loading of ModelNodes completely rewritten (new .n3 file format, plus n2converter3 tool to convert .n2 files to .n3)
  • new Model::OnResourcesLoaded() method, if Model subclasses need to do initialization work after resources have finished loading
  • new methods to lookup ModelNodeInstances on ModelInstances
  • new class Models::StreamModelLoader
  • lots of other minor changes in namespace Models
  • new class: RenderUtil::MouseRayUtil, convert 2D mouse position into world-space 3D-ray
  • new method: ResourceManager::CheckPendingResources(), returns true when there are currently no pending resources waiting to be loaded
  • SharedResourceServer: several methods now accept a ResourceLoader object when creating shared resources

I also did some interesting line counting statistics recently for the Foundation and Render layers (hopefully the images won't be scaled down too much):

foundation_layer 

render_layer 

"General" is the platform-agnostic code, which is the same for all platforms, "Win360" is the code which is identical between Win32 and Xbox360.

4 Apr 2009

Still Alive

Just a quick update, since I haven’t posted for a while:

  • I've pretty much gone into hermit mode for the whole of March, with 10-hour days and six-day weeks at the Labs because of an interesting new project. Pretty much everything I did falls under NDA unfortunately, so there wasn't anything I could write about anyway… so that's my half-assed apology for not posting anything. Although I must admit that it was quite relaxing to simply cut off all non-essential communication and social contacts and just concentrate on the task at hand; it almost feels like I just came back from a long vacation, even though I'm a bit exhausted physically ;)
  • The Larrabee instruction set looks damn impressive, but I’ll withhold my enthusiasm until it’s actually available and doesn’t turn out to be just another fucking “onboard graphics chip”.
  • Same for OnLive. Don't tech journalists have any fucking common sense? Instead of looking behind the curtain, they hype this shit up. Even Xbox Live (the current Gold Standard) takes a nose-dive from time to time when a popular multiplayer game is released, and that's just doing match-making over its servers with (probably) a few Kbytes of traffic per session. Maybe OnLive is capable of demonstrating a system which scales up to a million simultaneous players spread across the globe, running a taxing 3D game with "acceptable" lag and image quality. If that happens I would be truly impressed. I predict that OnLive will collect a considerable amount of money from eager investors (who seem to believe that little nuisances like the Laws Of Nature can be dealt with by throwing the right amount of money at them), waste that money over the next few years while the tech demonstrations become less and less impressive (even though they'll be fabricated), and finally, if said investors are lucky, maybe one or two worthless patents and "yet another" video codec will come out of the whole venture. It's the Phantom all over again.
  • Drakensang won the new “German Computer Game Award” for “Best German Game” and “Best Youth Game”. Please excuse this uncommon display of enthusiasm, but let me just say: Yay!

I love Resident Evil 5. Capcom are my official kings of next-gen. Funny thing is: I almost skipped the game because of the demo. I haven't played any previous REs (at that time I hadn't converted to the dark side of console gaming yet), so I was completely put off by the strange "tank controls". Also, the demo took place in a "generic Middle-Eastern/North-African town" which I already knew well enough from COD4, MGS4 and FarCry2. No-but-thank-you.

But the actual game… I'd almost go as far as to say that this is the best game of this console generation. The only little imperfection is those controls, but it took me only an hour or so to not even think about them anymore and use them just as automatically as the GeOW, or COD, or Rainbow Six controls. They are different from the "standard FPS controls", but that's the point: they are just different from the others, and with a little practice just as easy and intuitive in their own way.

The pacing of the game, the location design (there are some really breathtaking locations after "Generic Town"), the soundtrack, the story-telling, the characters, the cutscenes, the BOSS-FIGHTS (OMG the bosses are epic), the way co-op gameplay is implemented – everything is absolutely perfect. And RE5 is a console game through and through, not some FPS with more or less obvious PC heritage like COD or Gears. Little things like the "persistent inventory", the massive amount of unlockables, the NewGame+ and Mercenaries mode – every little bit of RE5 feels, tastes and smells like a 100% pure next-gen console game. Alright, that's it, I need to go play some RE5 now. Bye.