Thursday, April 11, 2013

Googling "Little Crane"

Since January 2011, the release date of The Little Crane That Could, people have been googling the term, mainly in the UK and the US. The big spike of December 2012 coincides with the Android release.

Wednesday, April 10, 2013

False sense of security

So my new house came with Weiser Smart Key locks for the back, garage and basement doors. One morning, I could no longer open my garage. There was no way to get to my car to pick up my little monkey from day care. My builder is a great guy, and was on site with his installer within 15 minutes. They helped me get in by drilling out the lock. The next day, another guy came in and put in a replacement Weiser deadbolt lock. And what do you know: 10 minutes after he left, I discovered the same thing had happened, with a different lock and a different set of keys.

At this point, I had lost all confidence in Weiser locks, and decided to replace them all. I got ANSI Grade 1 fixed-key deadbolts from Schlage, and installed those on all doors that previously had that Weiser crap on them. They are robust, super smooth, and rated top grade for commercial use. My advice: never go with those 'programmable' locks known as Smart Key or SecureKey. See the photo above of all the useless crappy locks I removed from my house. Professionals rate the Weiser Smart Key in the same caliber as that lock that could be opened with a Bic pen. See this video on how to open any Weiser Smart Key lock in a few seconds.

So to sum up: never get a programmable lock, especially not the crap from Weiser that will lock you out at random. Instead get a lock that is not programmable and is rated ANSI Grade 1, for instance, this one from Schlage.

Wednesday, March 27, 2013

Ackermann

When I created Little Crane 2½ years ago, I naively thought that when a car turns left, both front wheels turn in at the same angle. This is not the case in real cars, as it would cause scrubbing of the tyres. Instead, the inner wheel is steered in more than the outer wheel.

When creating my next games, A Blocky Kind of Love and Buggy Bang! Bang!, I knew better, so I implemented proper Ackermann steering. Today I ported this code back into the Little Crane code base, so the next version will feature a smoother ride. I also implemented a drive differential, so that the outer rear wheel is driven to spin faster than the inner rear wheel in turns.

Ackermann steering was invented in 1816 by Georg Lankensperger, whose agent was Rudolf Ackermann. This Rudolf Ackermann is not to be confused with Prof. Dr. Ir. Akkermans.

Friday, March 1, 2013

Proportional Integral Differential

I am a game programmer who spends the bulk of his time writing physics simulations of virtual worlds. In these virtual worlds, there are typically people or vehicles. When such a person or vehicle is not under player control, but controlled by the computer, we often call it an AI (Artificial Intelligence) or NPC (Non-Player Character). A common problem is to realistically move these entities around in the world.

A naive way to move these virtual actors would be to write code that sets a new position during each frame of the simulation. This is typically how a ghost in Pac-Man or a space invader is moved on the screen. However, for a proper simulation this is the wrong way to do it. Modern complex games use physics simulations that calculate forces and accelerations.

This means that steering an object in the world becomes very indirect. The algorithm sets a force (linear, or a torque) which results in an acceleration, or change in velocity. The velocity, lastly, changes the position and orientation of the object. This makes steering a hard problem: how do you move an object to position X? If X is far away, you would apply force towards X, but if X is near, and we are quickly approaching it, we need to apply force away from X so that we come to a halt on X without overshooting.

Luckily, this problem has been solved for us by the engineers of process control. When heating a building, for instance, knowing when and how much to run the heater depends not only on the current and desired temperature, but also on how the difference between the two has been changing in the past. If the temperature is rising quickly (because the heater was running full blast) it is time to stop the heater, as the lag in the system will cause the temperature to continue rising. If a lot of cold air is entering the building, so that the temperature has hardly risen in the recent past, heating needs to be increased. For this, the engineers use so-called PID controllers.

PID control splits the steering into three different components:

  • Proportional (what is the error right now?)
  • Integral (what is the historic error?)
  • Differential (how is the error changing?)
...where the error is simply the difference between the current value and the desired value. Steering based on the differential of the error means that if we are fast approaching the desired value, we steer away, and if we are fast retreating from the desired value, we steer towards it. Steering based on the integral means that the longer we have had a large error in the past, the harder we steer towards the desired value.

Humans naturally steer the same way. Take, for instance, opening a door of unknown weight. First we push a little if it is closed. The longer the door remains closed, the harder we push (this is the integral part). And the faster it swings open (the error decreases quickly), the less we push it, or we even start pulling it. The PID system is nicely adaptive. If a hard wind is blowing the door shut, the integral steering makes us compensate by pushing harder; this compensates for what is called the steady-state error. The entities in our simulation will seem smarter for it, if they can adapt to changing conditions.

The PID controller calculates the required force for us, each time step in our simulation. The error is calculated from desired and actual values. Note that the desired value does not have to remain fixed, it can be a moving target. PID will cope with this automagically, e.g. when trying to aim a rifle at a chaotically moving target.

So, are there no downsides to this miracle technique? Well, not really, as long as the controller is tuned roughly correctly. We need to determine with what weights we mix the P, I and D control. We need to select the three P, I, D coefficients, and the optimal values are typically different in each application. Personally, I find that selecting P roughly 10 times larger than D, and I somewhere in between, always makes my steering converge nicely without much overshoot, and reasonably quickly. Note that all three coefficients need to be negative. (We need to steer against the current error, against the historic error and against the rate of change of the error.) Just start with some default values (-10, -2, -1) and tweak them with the following guide:

  • When system explodes due to excessive force, lower all coefficients.
  • When overshoot is large, increase D, lower I.
  • When convergence is slow, increase P and I, decrease D.
  • When observed value jitters a lot, decrease P and increase I.

It's high time for some code now. The code is really concise, provided it carries a little bit of state (previous error, historic error) along with the P, I, D coefficients.

//! Scalar PID controller
typedef struct
{
 float P;
 float I;
 float D;
 float previousError; // Last error
 float integralError; // Historic error
 bool  fresh; // If set, we have no 'last error' yet.
 bool  angular; // Angular PIDs have errors wrap around at -pi and +pi.
} pid1_t;


//! Reset a PID controller. Clears historical error.
void pid1_reset( pid1_t &p )
{
 p.previousError = p.integralError = 0.0f;
 p.fresh = true;
}

//! Calculate the steering force based on current value (ist) and desired value (soll).
float pid1_update( pid1_t &p, float dt, float ist, float soll )
{
 if ( dt <= 0.0f ) return 0.0f;
 float error = ist - soll;
 if ( p.angular )
 {
  // normalize angular error
  error = ( error < -M_PI ) ? error + 2 * M_PI : error;
  error = ( error >  M_PI ) ? error - 2 * M_PI : error;
 }
 p.integralError = ( p.fresh ) ? error : p.integralError;
 p.previousError = ( p.fresh ) ? error : p.previousError;
 p.integralError = ( 1.0f - dt ) * p.integralError +  dt * error;
 float derivativeError = ( error - p.previousError ) / dt;
 p.previousError = error;
 p.fresh = false;
 return p.P * error + p.I * p.integralError + p.D * derivativeError;
}

And to use this PID controller to aim a turret in a tower-defence game:

pid1_t pid;
pid1_reset( pid );
pid.angular = true;
pid.P = -10.0f;
pid.I = -2.0f;
pid.D = -1.0f;
while ( simulating )
{
    ...
    float desired = angleTowardsTarget( turret, enemy );
    float actual  = angleOfGun( turret );
    float steer   = pid1_update( pid, dt, actual, desired );
    applyTorque( turret, steer );
    ...
}

And that is pretty much it. The only thing to watch out for is that if the turret suddenly selects a different target, the PID controller needs to be reset, so that the historic error from the previous target does not influence the steering for the new target; this gives quicker convergence. To do this, just clear the historic error with pid1_reset(). And there you have it: a smoothly targeting turret that does not need kludges for the smooth-in/smooth-out parts of a synthetic animation. Animation is bad, simulation is good.

I use this PID code to:

  • Smoothly move my camera.
  • Smoothly reorient my camera.
  • Smoothly have a tank aim at a moving enemy.
  • Smoothly balance a helicopter at a desired attitude.
  • Smoothly push, pull and twist oars of a rowing boat.
  • Smoothly hover a bike over undulating terrain.
  • Smoothly steer a missile towards a fast moving target.
and much, much more. To see many of them in action, download one of my games; they are free. Search for 'Abraham Stolk' on Google Play or the iOS App Store.

Thursday, February 21, 2013

Matching up players in action games.

I am currently reading up on cartography. I need to visualize the earth in such a way that distances are accurately depicted. The way to do this is to map the globe using an interrupted sinusoidal projection. Why am I so interested in distances at the moment? It is because of multiplayer gaming.

One of my goals for 2013 is to release a multiplayer game. And writing multiplayer action games is hard, especially if you consider that my games always have a strong theme of physics simulation. Distributing a physics simulation is so hard that I consider it unsolved, and probably intractable as well. Games typically use hacks, tricks and fakery to give the illusion of a shared experience in a virtual world with simulated physics.

What makes multiplayer game development hard is not bandwidth restrictions, but network latency. In action-based games (as opposed to turn-based) you want immediate responses to player input. This player input needs to be communicated over the network to the other parties. Even at speeds near the speed of light, this communication takes a very long time from the computer's point of view.

The speed of light in vacuum is roughly 300,000 km/sec. This seems really fast, but if a UDP packet needs to take a one-way trip to a player 15,000 km away, it takes 0.05 seconds. And guess what: in 0.05 seconds our game will have rendered three frames at 60fps. And this is under ideal circumstances. What if the information needs to make a round trip? Then it is already 6 frames. It gets worse with the latencies introduced by the network equipment. And what if the packet makes a detour to another hub, or worse, a satellite hop? And worst of all, a UDP packet can get lost altogether.

As a game programmer, I have no control over the network equipment at a player's home and at his provider. I do have control on how I match up my players though. I cannot afford to have an online match where one player is in Europe and another is in Australia. The limited speed of the signal and the long distance will create large delays between when information is sent and when it is received. I need to do the match-making in my game lobby based on geography. And while doing that, I might just as well communicate this local nature of opponent selection to the players. So I am going to visualize the pool of potential opponents for the player using a mapped globe that accurately depicts distances.

To divide the globe into player pools, I was considering using time zones. This would give 24 partitions. But a single time zone spans a long north-south distance, so I will split those at the equator. As there are no gamers on the North and South Poles, the resulting 48 zones should be relatively compact.

The multiplayer gaming API in Apple's iOS is part of GameKit. It is hard to determine how Apple's servers work when matching up players. I hope they take locality into account when matching players, but I have not been able to determine whether they actually do. So I think I have to do the region encoding myself. There is a hook in iOS for this in the GKMatchRequest class, which lets you divide players with the playerGroup property. And using the queryPlayerGroupActivity:withCompletionHandler: method of GKMatchmaker I could even visualize player activity on the globe map.

Keep an eye on my blog to follow my progress in this venture. And wish me luck; I need it. There is very little documentation on distributed simulations. The main resource on this subject is seven years old and available at Gaffer on Games. I will leave you with a sneak peek at my new game.

Tuesday, February 19, 2013

Computer History

I thought I would document my computer history here. It all started with a Sinclair ZX Spectrum 48K (1982), which much later got replaced by a Sinclair Spectrum Plus (the latter I did not use very much).

The next computer was a PC-XT clone, with Intel 8088 microprocessor running at 8MHz, equipped with a rare HEGA (Hercules/EGA combo) graphics card.

While studying computer science at the University of Amsterdam, I bought the first computer with my own (not my father's) money, which was a SOYO 386DX at 33MHz. I later equipped this machine with a 387 coprocessor.

The next computer was again a PC, and again self-built from components. The processor was an Intel Pentium at 66MHz, and I remember FedEx stopping by to swap my processor, which had the FDIV bug, for a new one.

Next up was again a PC, and it was a really good deal from the bargain bin of a dealer at the HCC dagen. I bought a real multiprocessor motherboard equipped with two Pentium Pro processors at 200MHz. I was working at ElectroGIG at the time, and I chose it for its floating-point performance, which is paramount in ray tracing.

After ElectroGIG, I purchased a powerhouse: pretty much the fastest processor money could buy, thanks to some funding from the Silicon Polder Fund. I bought it during a road trip to the US, from Aspen Systems in Colorado. It was a DEC Alpha based, 533MHz EV56 164LX system called Durango II.

While I was working at the SARA supercomputer centre as a Virtual Reality specialist, one of the supers got dismantled. It was a Parsytec CC with 56 PowerPC processors; each node was a PReP Blackhawk 603 board. Because it was capable of running Linux, the scrapped nodes were distributed to personnel, and I received one. This computer was my home server for a long time.

SARA also encouraged and subsidized private computers for employees. Under the PC-privé regeling I bought a mini laptop years ahead of its time. The Sony Vaio C1VE is more a piece of jewelry than a plain laptop, and it featured a novel CPU from Transmeta. I was following Transmeta actively at the time, as it employed my idol Linus Torvalds.

After exotic architectures like DEC Alpha and PowerPC, I bought another x86 PC. Shopping around, I decided to get the very average-performing but attractively priced AMD Duron.

As I had my computer switched on day and night, performance was becoming less important to me, and I wanted something silent. My first attempt at an HTPC was based on the Via C3.

This got replaced by my second attempt at an HTPC, based on Mobile Pentium technology and an AOpen 915Gmm motherboard.

Tinkering with a Sony PS3 running Linux got me interested in Cell SPU programming. This in turn landed me a job with Slant Six Games in Vancouver. While in Vancouver I started exploring iOS, and for this purchased a Mac Mini and a MacBook Air.

Friday, January 25, 2013

The Little Computer That Could

There was once a Little Computer called Pi. He was smaller than all the big computers around him. Pi wanted to do the work that the big computers were doing. He would love to run high resolution 3D games for all the gamers in the village. But the big computers made fun of him. "Oh, you are too small for HDTV gaming." they would scorn. But the Little Computer would respond "I think I can! I Think I can!" But the big computers would not let him run the complex games. "Too slow for rigid body physics simulations" they would say. Well, "Running physics simulations, I think I can, I think I can!" responded Little Pi.

One day, when all the expensive computers were busy charging micro-transactions to their users, Little Pi was given the task of running a crane simulator game with complex rigid body simulation, and rendering it at a 1920x1080 resolution. The programmer demanded a fluid 30 frames per second. The Little Pi was straining, but thought "I think I can! I think I can!" It calculated the constraints and solved the matrices, it spewed out 2 MPixel framebuffers, and there it was: "I knew I could! I knew I could!"

If you want to play The Little Crane That Could on the big screen, now you can conveniently do so with a Raspberry Pi. This little $35 computer packs quite a punch, and manages to run the game at 1920x1080 at 30 frames per second, albeit without the dynamic shadows. So why don't you head over to the Pi Store and get yourself a free copy of The Little Crane That Could?

Friday, January 18, 2013

Raspberry Pi Explorations

Since I got challenged to port my game to the Raspberry Pi, I got myself one of these devices. Here I will document my explorations of the device.

  • I installed an OS image using RPi sd card builder tool.
  • Upon booting, you get access to a tool called 'raspi-config' which lets you resize the root filesystem to take up the full size of the SD card.
  • raspi-config also lets you change the keyboard mapping. This worked for the text-mode keyboard, but in X I still have a UK keyboard mapping. Editing /etc/default/keyboard did not help. Typing 'setxkbmap us' in an xterm does fix it though. Why is this still a problem? It got reported a long time ago.
  • I am not impressed with the case I bought for it. It is very fragile and hard to open and close. My advice is to shop for something better.
  • Hooking it up to the DVI input of my Samsung SyncMaster 172w did not work. My Dell Ultrasharp works superbly at 1920x1200 though.
  • You can power the device with a microUSB cable that is plugged into an iPhone charger.
  • The graphics chip is the Broadcom VideoCoreIV and there is a GitHub repository for it. Note that the misspelling 'VideCoreIV' is often repeated.
  • Apparently you can do OpenGLES2 without X.
  • A guide on EGL on the Pi.
  • The VideoCoreIV driver lacks both GL_OES_texture_half_float and GL_OES_depth_texture which would make shadow mapping a very expensive operation. I think I will have to leave out shadows from my Little Crane port.
  • I have not been able to find support for OpenSLES on the Raspberry Pi.
  • To get X Windows to use the full screen on TVs, you need to disable overscan in /boot/config.txt
  • To get natural scrolling, like OSX does, you need to: apt-get install x11-xserver-utils followed by echo "pointer = 1 2 3 5 4 7 6 8 9 10 11 12" > ~/.Xmodmap && xmodmap ~/.Xmodmap

Monday, December 31, 2012

2012: the year of the crane.

The Chinese may call it the year of the dragon. However, I hereby proclaim 2012 to be the year of the crane. And I should say: specifically the year of the Little Crane. In the year 2012 alone, The Little Crane That Could was downloaded more than 5 million times. Thank you gamers!

           2012   2011
  iOS     3454K  1550K
  Android 1656K      -
  Mac       81K      -

Christmas day was by far the best day for Little Crane on iOS, where it hit 23K free downloads and 1.5K purchases of the premium version that day alone.

I can also report that this year 42K people have bought the Little Crane World Editor. This makes 2012 an amazing year, exceeding the goals I set a year ago. And for the sake of completeness: Hover Biker saw 315K downloads, and A Blocky Kind of Love saw 97K downloads.

Friday, December 21, 2012

ANR: Application Not Responding (keyDispatchingTimedOut)

This is a heads up for those Android developers that use the NDK (Native Development Kit) to write Android apps. If you are using the NDK, you are almost certainly using the glue code that Google provides in the android_native_app_glue.c file.

Chances are that in your Google Play Developer Console, you see reports of Application Not Responding (ANR keyDispatchingTimedOut). For my app, I have 756 of these reports on an installed base of 1.5M downloads. Consulting Stack Overflow or other developer groups will invariably yield the advice not to block the main thread. However, it is easy to cause this ANR without blocking the main thread if you are using the android_native_app_glue.c file in your project.

If two events are generated at exactly the same time, by different sources or devices, the app will freeze. You can easily reproduce this with a PS3 controller hooked up to your Android device: depress both analogue sticks at exactly the same time, or release them at exactly the same time. If you do this while running an NDK-based app, the app will freeze and issue an ANR.

It took me a day of debugging to find a workaround for this, but I am happy to report that the following change to the glue code stops the issue from happening. What you need to do is get events from the queue repeatedly in a loop, instead of handling just a single event, in the process_input() function.

static void process_input(struct android_app* app, struct android_poll_source* source)
{
    AInputEvent* event = NULL;
    while ( AInputQueue_hasEvents( app->inputQueue ) )
    {
        if ( AInputQueue_getEvent( app->inputQueue, &event ) >= 0 )
        {
            int32_t handled = 0;
            uint32_t devid = AInputEvent_getDeviceId( event );
            uint32_t src   = AInputEvent_getSource( event );
            //LOGV("New input event: type=%d devid=%x src=%x\n", AInputEvent_getType(event), devid, src);
            int32_t predispatched = (AInputQueue_preDispatchEvent(app->inputQueue, event));
            if (app->onInputEvent != NULL && !predispatched) handled = app->onInputEvent(app, event);
            if (!predispatched) AInputQueue_finishEvent(app->inputQueue, event, handled);
        }
    }
}

I have reported the issue to Google.

Sunday, November 25, 2012

Unsafe Permissions

Where iOS gamers tend to pay for their gaming with a credit card, it seems that Android gamers prefer to pay with their privacy. Google has implemented warnings on unsafe permissions for the user to review before installing an app. Most users will click the warnings away without reading them.

I am proud of the fact that my game does not require a single unsafe permission, as can be witnessed on its google play page:

Out of curiosity, I checked the permissions of the Top 100 free Android games on Google Play. It turns out that apart from my The Little Crane That Could, there was only one other game (Duck Hunt Mario) that did not require unsafe permissions. That is a pretty astonishing result, I would say. And it makes me even more proud that I managed to pull this off: a successful game generating a handsome revenue without compromises. I wish Google Play would open up a new category for safe apps, to promote safety in the ecosystem.

Sunday, November 18, 2012

A conversation with a user

I had an interesting conversation with a user of my game recently. Some background info: my game is free, and there is a 'BUY' button that lets you unlock extra levels. If you press it, it takes you to the Google Play store, which lists the price in your local currency and describes the purchase.

For every 10,000 downloads, I get one email from a user asking about the price of the in-app purchase. I totally understand why they email me: they fear that pressing 'BUY' will immediately charge them, without their knowing the price in advance. I can relate to that, as mobile games have done sneakier things in the past. So I always respond to the email, and tell them the price is $2.89 or the equivalent in their local currency.

Now, the customer is of course king, and should be treated in the best possible way. Somehow, my patience ran out on Randy, and this conversation ensued. Was I professional? Not as much as I could have been. Was it funny? Yes.

On Friday, November 16, 2012, Randy [REDACTED] wrote:
How much for the upgrade? The site don't say anything about that....free would be nice...lol Thanks for your time
On Nov 16, 2012 3:40 PM, "Bram Stolk" wrote:
$2.89
On Fri, Nov 16, 2012 at 11:32 PM, Randy [REDACTED] wrote:
How many levels?.. consistently updated??...
On Nov 17, 2012 1:03 AM, "Bram Stolk" wrote:
18 more levels
On Saturday, November 17, 2012, Randy [REDACTED] wrote:
Love the game.. was just wondering how often this game is updated/more levels added.. before I purchase Thanks
On Nov 17, 2012 12:19 PM, "Bram Stolk" wrote:
I do not expect new levels soon, just bug fixes for now.
On Sunday, November 18, 2012, Randy [REDACTED] wrote:
Cool game and time waster... but I can't see spendig that money (even tho minimal) when one don't update with new levels or added game play.. keep me posted as to when lvls or game play added and I'll be more than happy to purchase...thanks
On Nov 17, 2012 11:46 PM, "Bram Stolk" wrote:
Talk about time waster.... How much time do you think I have trying to convince you to spend that measly 2.89? Almost a 1000 people a day make the purchase, so I am not going to look up your email and send you a message just to get your money. Some people will never spend money, that is fine.
On Nov 18, 2012 12:22 AM, "Randy [REDACTED]" wrote:
I have plenty of money to spend.. and even more time .blogging and pasting what "proActive" man you are and your response to a simple question.. hell , I guess if you dont have time for your People or your program; choose a different career...don't hate.. a I Stated before, I have nothing better to do than to make you look foolish and post the negativity; which you exude upon me.. thanks..for keeping me from buying your program.. and many others, from making the same mistake..
On Nov 18, 2012 12:38 AM, "Randy [REDACTED]" wrote:
I am In Vancouver.. let me know where you at? Talk your smartass..

Friday, October 19, 2012

Useful software

This post is mainly a reminder for myself, and contains a list of essential software I use frequently. If I ever need to install a virgin Macintosh, I can refer to this.

  • graphviz so that Doxygen can use 'dot'.
  • doxygen so that I can make sense of large convoluted code bases.
  • inkscape my tool of choice for designing 2D artwork.
  • GIMP my tool of choice for WYSIWYG image manipulation.
  • NetPBM my tool of choice for command line image manipulation.
  • ImageMagick for when NetPBM doesn't cut it.
  • FFMpeg for video manipulation on the command line.
  • GrandPerspective for finding those large files that clutter up your disk drive.
  • Kerkythea for state of the art rendering using Photon Maps.
  • Wings3D My overall favorite software. When I'm modelling in Wings3D I am happy.

Monday, July 23, 2012

Definitive Guide to Using Hyperdeck Shuttle II with Mac OSX

It took quite some experimentation and I was about to give up on this, but it looks like I cracked it. I can now record compressed video with the BlackMagic Hyperdeck Shuttle II and use the video file with Mac OSX. I was already able to use the uncompressed video files from the device, but those are just too freaking huge to comfortably work with on a MacBook Air.

STEP1:
Make sure you are using the latest drivers and firmware as posted on the support page of Blackmagic Design. I used version 3.0.2.

STEP2:
Set the device to record compressed files into a QuickTime container.

STEP3:
Don't try to record output from your iPhone... I tried, and it did not work. My iPad2 and iPad3 HDMI signals were accepted just fine. You will need an adapter from Apple to hook up an iPad to a Hyperdeck Shuttle II.

STEP4:
BlackMagic wants you to install the DNxHD codecs from Avid's website. Frankly: don't bother. You can try, but they did not work for me. I would either get a black-screen video with some audio (iPhone recording) or, worse, a crashing QuickTime Player (iPad recording). The crash I am seeing is this one:

Thread 3 Crashed:: Dispatch queue: com.apple.coremedia.playbackboss
0   com.apple.audio.toolbox.AudioToolbox 0x00007fff841595a7 _AT_AudioUnitUninitialize + 55
1   com.apple.audio.toolbox.AudioToolbox 0x00007fff841ed819 MixerChannel::Uninitialize() + 27
2   com.apple.audio.toolbox.AudioToolbox 0x00007fff841ede5e SubmixGraph::ConnectToDestination(bool, CAStreamBasicDescription const&, CAAudioChannelLayout*) + 184
3   com.apple.audio.toolbox.AudioToolbox 0x00007fff841e7120 MasterMixer::ConnectSubgraph(SubmixGraph*) + 686
4   com.apple.audio.toolbox.AudioToolbox 0x00007fff841e7415 SubmixGraph::ConnectInputChannel(bool, MixerChannel*, bool) + 657
5   com.apple.audio.toolbox.AudioToolbox 0x00007fff841e7826 AQMEDevice::AddRunningClient(AQIONodeClient&, bool) + 262
6   com.apple.audio.toolbox.AudioToolbox 0x00007fff841cbaf5 AudioQueueObject::StartRunning(AQIONode*) + 51
7   com.apple.audio.toolbox.AudioToolbox 0x00007fff841c47e3 AudioQueueObject::_Start(XAudioTimeStamp const&) + 637
8   com.apple.audio.toolbox.AudioToolbox 0x00007fff841c494a AudioQueueObject::Start(XAudioTimeStamp const&) + 18
9   com.apple.audio.toolbox.AudioToolbox 0x00007fff841dd8cb AQServer_Start + 64
10  com.apple.audio.toolbox.AudioToolbox 0x00007fff841e0e88 AudioQueueStart + 183
11  com.apple.MediaToolbox         0x00007fff877497d8 FigAudioQueueStart + 551
12  com.apple.MediaToolbox         0x00007fff877a33b3 0x7fff87728000 + 504755
13  com.apple.MediaToolbox         0x00007fff8779ae4e 0x7fff87728000 + 470606
14  com.apple.MediaToolbox         0x00007fff8779e2e0 0x7fff87728000 + 484064
15  com.apple.MediaToolbox         0x00007fff8779e508 0x7fff87728000 + 484616
16  com.apple.MediaToolbox         0x00007fff8779e75a 0x7fff87728000 + 485210
17  libdispatch.dylib              0x00007fff8dd77a86 _dispatch_call_block_and_release + 18
18  libdispatch.dylib              0x00007fff8dd792d6 _dispatch_queue_drain + 264
19  libdispatch.dylib              0x00007fff8dd79132 _dispatch_queue_invoke + 54
20  libdispatch.dylib              0x00007fff8dd7892c _dispatch_worker_thread2 + 198
21  libsystem_c.dylib              0x00007fff8825b3da _pthread_wqthread + 316
22  libsystem_c.dylib              0x00007fff8825cb85 start_wqthread + 13

STEP5:
FFmpeg to the rescue! Instead of trying to get DNxHD and QuickTime to play nice together, Open Source will save us, hooray! Note that some companies charge 500 euros for video conversion software, but ffmpeg will do the trick. You need to make sure you have a very recent version, as DNxHD support was only recently added. Get the ffmpeg binary from the ffmpegmac.net site.

STEP6:
Convert the DNxHD file to something more useful using the following command line:

$ ffmpeg -i Capture0002.mov -an output.mp4
Note that this invocation strips the audio; if you want to keep it, you will have to consult the ffmpeg manual on how to do that. The ffmpeg tool recognizes the iPad2 stream with the following properties:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Capture0002.mov':
  Metadata:
    creation_time   : 2012-07-23 22:12:33
  Duration: 00:00:13.38, start: 0.000000, bitrate: 238657 kb/s
    Stream #0:0(eng): Video: dnxhd (AVdn / 0x6E645641), yuv422p10le, 1280x720, 220200 kb/s, 60 fps, 60 tbr, 6k tbn, 6k tbc
    Metadata:
      creation_time   : 2012-07-23 22:12:33
      handler_name    : Apple Alias Data Handler
    Stream #0:1(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, 16 channels, s32, 18432 kb/s
    Metadata:
      creation_time   : 2012-07-23 22:12:33
      handler_name    : Apple Alias Data Handler

Friday, July 13, 2012

Beware of automatically downscaled retina images


The iOS development environment facilitates image content loading for retina and non-retina devices with a clever naming scheme. If an iOS app is running on a retina device and is instructed to load an image named foo.png, it will actually attempt to load the foo@2x.png file if it exists. By providing both foo.png and foo@2x.png images, both classes of devices are supported.

This convenience offered by UIKit goes one step further: if you do not provide the regular-resolution version foo.png but only the retina version foo@2x.png, then non-retina devices will load the high-res version and automatically downscale it by a factor of 2 so that it can be used. If you are tempted to provide just the retina versions of your images, and skip the regular versions, like I was, you are selling your app short. It turns out that UIKit does a horrible job at downscaling, and the image will look considerably worse than a version that you pre-scaled yourself.

To illustrate the effect, see below how the original is downscaled by UIKit, and how it is downsampled by my authoring tool Inkscape. A big difference, I would say.

The bottom line: if you care about your iPad2 and iPhone3GS users, don't skimp on images. Provide both the foo.png and the foo@2x.png files.


The retina version of the graphic, viewed at 1:1 zoom.


The image automatically downscaled by UIKit from the retina version, viewed at 2:1 zoom.


The image, exported at non-retina resolution by inkscape, viewed at 2:1 zoom.

Monday, June 25, 2012

Bram's upcoming indie game

I've been working on my next indie game. It uses a voxel art style, and like The Little Crane That Could, it features a top-notch physics simulation. Here are some screenshots of the game. Note: the new project has not been named yet.


On the farm.


Visiting the queen.


At the picnic.