
Thursday, June 13, 2013

Making a game with Keita Takahashi at Funomena

I can finally talk about the super exciting stuff that I've been working on for the past few months!

I have been helping Keita Takahashi prototype his next big game!! Along the way we joined forces with Robin Hunicke and Martin Middleton of Funomena. Robin and Martin worked on Journey before forming Funomena as an independent game studio.

So today Robin and Keita announced the game at Horizon - a first-year indie press conference held in parallel with E3. Here is the short Q&A about the game at Horizon and here is the blog post at Funomena.

Now that I've gotten the basic info and links out of the way - I am so excited about this coming together!!!

This all started at the Babycastles Summit last year, when the folks at Babycastles asked me to help implement one of Keita's ideas for the summit - Mario Ball. We kept in touch, and one day he asked me whether I could help him prototype an idea, and that's what we have been doing since then. (Maybe the full story in a later post as we reveal more about the game.)

I just want to end this post by saying this - collaborate and do amazing stuff!! I used to sit in a room and try to make games on my own, and I wasn't getting anywhere. As soon as I started collaborating with various people, amazing things started to happen. And it has finally led to this.

A new adventure begins.



(Keita's concept art of the Mayor and his Deputies celebrating)

Saturday, April 13, 2013

Update Omnibus

Wow - lots of things have happened in the past three months, but I hadn't taken the time to update my blog. So here is a summary of all things big and small from the past few months.

1. Quit my day job 
I finally quit my day job in order to pursue independent game development full time! The projects below gave me good enough reason to quit. Also, thanks to the financial support of my wife, I can take some time off to figure out what I want to do with games (and my US green card came through last year). So this has really just been all the right things finally coming together.

2. Babycastles
I have been helping the Babycastles folks with a few things. First I made them a new website - http://babycastles.com/. This project is important to me personally because I know I missed out on Babycastles earlier simply because I couldn't find enough information about what they really were!

I have also been helping them with smaller things like arranging a Global Game Jam venue at Silent Barn - Silent Jam! We put together a toolbox of game-making tools that some people outside the Silent Barn venue found useful - https://docs.google.com/document/d/10doDd27Shk4I7dX8OSbUK5croQZLeQsOXufrECqP4LI

I have been attending the Babycastles meetings more regularly. It's all coming together as Babycastles moves to its next phase of finding a more permanent space from where it can continue its awesomeness. This is going to be an exciting year!

3. OUYA Game Jam - Tossers with Ben Johnson
Ben Johnson of Babycastles and I collaborated on a game for the OUYA game jam - Tossers. It's a 2v2 local multiplayer game, a cross between curling and marbles, where instead of a rock or marble you throw your teammate into a ring of marbles. So far it's been fun working on the game and playing it, so I think we will take it to completion some time in the next few months.

The game didn't win anything in the OUYA contest, but we showed it at the NY Gaming Meetup - the first OUYA game to be demoed there. By this time we had added a power-up feature to add to the mayhem, as well as a puppy AI so that the game can also be played 1v1 with an AI teammate. People seemed to like the gameplay and we got some good feedback.

The meetup and our demo were covered on AlleyWatch :P - http://www.alleywatch.com/2013/03/new-york-february-gaming-meetup/

4. Ejecta WebGL talk at onGameStart NY
WebGL bindings in Ejecta allow you to port your WebGL apps/games to iOS. This has been a pure technology project, as I want to keep a good mix of game and tech projects.

The work I did has been merged into the Ejecta main branch! - https://github.com/phoboslab/Ejecta. A lot more interesting WebGL/three.js demos are now functional on iOS devices (maybe a separate post on this).
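For anyone curious what this looks like from the JavaScript side, here is a minimal sketch of a render loop on top of the merged WebGL support. It assumes the Ejecta-style setup of grabbing the screen canvas by id and asking it for a 'webgl' context - treat it as illustrative rather than a copy of any shipped example.

    // Minimal WebGL clear loop, written the same way you would for a browser.
    // Assumes an Ejecta-style environment where the screen canvas is available
    // via document.getElementById('canvas') and supports getContext('webgl').
    var canvas = document.getElementById('canvas');
    var gl = canvas.getContext('webgl');

    function frame() {
        gl.viewport(0, 0, canvas.width, canvas.height);
        gl.clearColor(0.1, 0.2, 0.4, 1.0);
        gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
        requestAnimationFrame(frame);
    }
    requestAnimationFrame(frame);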

I presented the work I did adding WebGL bindings to Ejecta at the first edition of onGameStart to happen in the US. Here are the slides. I hope the videos never come out because I wasn't too happy with my presentation :) I really should have cut it short in a few places and maybe jumped to the demos earlier.

I met many of the HTML5 folks whom I had known only on Twitter. It was nice to finally meet them in person!

5. GDC
I attended my first ever GDC! More than attending talks, I spent most of the time hanging out with really interesting people and helping Babycastles curate and set up what was a really great installation of relaxing games at SFMOMA. Here is a cool review of it - http://www.youtube.com/watch?v=wWLuEp4u5PE

My highlight from the few sessions I attended was the Experimental Gameplay Workshop. All the games in the lineup were amazing, but the one game that blew me away with its elegance was the Katamari-esque Kachina.

Of the parties, the Wild Rumpus was by far the most enjoyable. It featured the amazingly fun party game by Keita Takahashi called Tenya Wanya Teens, with its own custom controller. We also arranged our own Babycastles party at the end of GDC and invited all our friends - it was a great way to end GDC.



There are more exciting things afoot - I can hopefully start talking about them in the next few months! This has been a very interesting journey and I really owe it to the amazing folks at Babycastles!!

Games are only getting more interesting as different kinds of people get into making them, and Babycastles has been an amazing force in moving the scene forward in this regard. Hopefully I can help them keep doing their awesome work in the coming years and we can spread this mission to many more places in the world!

Thursday, November 29, 2012

EjectaGL : Quick Update - Basic texture support

Quick Update - Basic texture support has been hacked into EjectaGL.

Here is a super quick (hence the bad lighting, bad finger smudges and bad everything - like all my videos) demo of a port of Lesson 06 from learningwebgl.com.

To make up for the absence of a keyboard, the zoom is controlled by tilting the device (which Ejecta nicely supports). Touching the screen toggles between the different filtering options.
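For reference, here is a rough sketch of how that input scheme can be wired up with the standard devicemotion and touch events; it is not the actual demo code, and the scaling constants are made up.

    // Tilt controls the zoom, touching the screen cycles the texture filter.
    var zoom = -5.0;        // camera distance used by the render loop
    var filterMode = 0;     // 0: NEAREST, 1: LINEAR, 2: LINEAR_MIPMAP_NEAREST

    window.addEventListener('devicemotion', function (e) {
        var g = e.accelerationIncludingGravity;
        if (g) {
            zoom += g.y * 0.01;   // nudge the zoom based on how far the device is tilted
        }
    });

    document.addEventListener('touchstart', function () {
        filterMode = (filterMode + 1) % 3;   // render loop applies the matching texParameteri calls
    });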

I have tried to show the effect of mipmaps, which is the main part of Lesson 06 (open it on YouTube at a larger size to actually see the difference).

If you do want to give it a spin, be warned that for now I've pretty much stuck to implementing the APIs needed for the lessons. I just haven't gotten around to typing out all the bindings yet. Feel free to jump in and add some of them.

Monday, November 26, 2012

EjectaGL - WebGL on iOS with Ejecta!

Summary: 
I am starting a project called EjectaGL, which adds WebGL bindings to the awesome Ejecta project by Dominic Szablewski. EjectaGL extends the Ejecta framework to provide a WebGL implementation on vanilla JavaScriptCore without any of the DOM overhead. It provides a way to develop WebGL apps that can be shipped to the iOS App Store (not tested, but very possibly). It is far from complete - so please contribute if you are interested!


EjectaGL is distributed under the same MIT license as Ejecta.

Longer story:
I had worked on Voxel Flyer for js13kgames - a WebGL-based game in less than 13KB of compressed JavaScript. I wanted to keep developing it, but I sort of wished it would work on iOS too. A search for WebGL on iOS usually yields the jailbreak method for enabling WebGL, which is not very useful if you want to ship your WebGL app on the App Store. It feels like Apple might open up WebGL at any time, but that never seems to happen. I had always wanted to look into adding WebGL bindings to AppMobi's directCanvas, but that never happened due to the complexity of the directCanvas code (not to mention that the open source version has hardly seen the updates of their recent XDK).

Meanwhile, I saw that Dominic from Phobos Labs (on whose work directCanvas is actually based) released Ejecta, a newer version of his iOS library that provides a barebones, fast implementation of Canvas and Audio without all the DOM overhead. This time I felt the code was much friendlier and that I could actually have a real shot at implementing WebGL on top of Ejecta.

So a couple of days back I started reading my first tutorial on Objective-C and started hacking on Ejecta! The result is EjectaGL. EjectaGL replaces the onscreen 2D canvas with a WebGL-enabled canvas. At the current stage it's just a proof of concept, and I am discovering and fixing issues as I port each lesson from http://learningwebgl.com/. I tried porting Voxel Flyer to it and discovered that there is something wrong with the way uniforms are being passed, which I'm yet to finish debugging.

So basically it's at a very nascent stage, but things are looking mostly doable. I can definitely use some help from people to add more bindings, port more examples and review my code for memory leaks (I'm sure there are a few, given my newness to Objective-C) and other issues.

Check out the screenshot of Voxel Flyer running in the emulator, with the buggy lighting turned off, at an astounding 9 frames/second. (You can run this demo by downloading the GitHub source, creating an App/ folder and copying the contents of examples/VoxelFlyer into it.)





As for implementation details, the one major caveat is the absence of typed arrays in the current JavaScriptCore that Ejecta uses (it might be coming some time later though). I have tried to work around it by implementing a very primitive pseudo typed array in Objective-C. Basically you can instantiate a typed array, but that's about it. I found that most WebGL code just creates a Float32Array or Uint16Array right before binding it to a buffer and doesn't do much with the contents anyway. There could be issues with libraries like glMatrix that use typed arrays for fast matrix calculations; currently I just replace the typed array with a regular array in my stripped-down version of glMatrix. One workaround I see is implementing glMatrix itself in Objective-C, with a method interface that doesn't use typed arrays but an internal representation based on EjectaGL's typed array, which can then be passed on to EjectaGL. A similar strategy could be adopted if we want to get libraries like three.js running on EjectaGL (though that would be a big undertaking of its own).
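To make that concrete, here is the usage pattern the minimal pseudo typed arrays are meant to cover - a sketch of typical WebGL buffer setup, where the Float32Array is constructed from a plain array right before the upload and never touched again. (The vertex data here is just an example triangle.)

    // The common pattern EjectaGL's stub typed arrays can handle: construct, upload, forget.
    var vertices = [
        -1.0, -1.0, 0.0,
         1.0, -1.0, 0.0,
         0.0,  1.0, 0.0
    ];

    var vertexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

    // What would NOT work with the stub: reading or writing elements afterwards,
    // e.g. indexing into the typed array the way a real Float32Array allows.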

Not a lot of WebGL APIs are implemented currently, but I don't see any other major issues in doing so (other than the typed arrays). I am also thinking of adding things like rendering offscreen to a 2D canvas and using that as a texture in WebGL (my Voxel Flyer originally had a minimap that was a 2D canvas on top of the WebGL canvas).
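In browser WebGL the canvas-as-texture idea maps onto the texImage2D overload that accepts a canvas as its pixel source; here is a sketch of how the minimap could work once EjectaGL grows the equivalent binding (the drawing code is just a placeholder).

    // Draw the overlay into an offscreen 2D canvas, then upload that canvas as a
    // WebGL texture each frame. This is the standard browser API; the EjectaGL
    // equivalent is still hypothetical at this point.
    var mapCanvas = document.createElement('canvas');
    mapCanvas.width = 128;
    mapCanvas.height = 128;
    var ctx2d = mapCanvas.getContext('2d');

    var mapTexture = gl.createTexture();

    function updateMinimapTexture() {
        ctx2d.clearRect(0, 0, 128, 128);
        ctx2d.fillStyle = 'red';
        ctx2d.fillRect(60, 60, 8, 8);   // e.g. an enemy blip

        gl.bindTexture(gl.TEXTURE_2D, mapTexture);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, mapCanvas);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    }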

So for now I just wanted to quickly mention the project and get the ball rolling so that I can ask for help and also make sure there is no duplication of effort (I am pretty sure 100 different people were also thinking of a similar project).

I think there are a lot of applications for this project - it would be nice to get all those nice shaders/demos working on mobile devices, possibly even the three.js ones. I also see the possibility of making all the cool WebGL shader effects available in Construct 2 fully exportable to the iOS target.

Drop me a line here or on Twitter - @vikerman

Update: EjectaGL is now merged into the Ejecta main branch! You can get it at https://github.com/phoboslab/Ejecta

Tuesday, September 25, 2012

GameDoc: On-the-fly JavaScript game modding for the mobile web

I had a crazy idea and wanted to create a quick experiment to test it out. Mobile browsers are becoming good - JavaScript and canvas implementations are getting faster. There are certain advantages to the mobile browser as a game distribution platform, the most important of which is that it is open and not controlled by arbitrary rules set by Apple, Google or whoever (classic example - Phone Story). HTML5 games in general are picking up steam, but game makers are ignoring the mobile browser.

There are two reasons why games on mobile browsers don't work well:

  1. Game developers ignore things like centering the screen on the game area, setting the right zoom level, and allowing for touch input (or an alternate mechanism like the accelerometer) when there is no keyboard.
  2. Browser makers don't provide a good way to do certain things - locking screen orientation, a working audio API, etc.
1) is definitely solvable, while 2) involves letting platform creators know about the issues game developers face (some influential developers are already trying to do that - http://www.phoboslab.org/log/javascript). Maybe newer platforms like Firefox OS will push the standards higher on the existing ones.
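To make 1) concrete, here is a sketch of the kind of fixes a game page can apply on its own - the viewport width and the input hooks are illustrative stand-ins, not any particular game's code.

    // Make the game area fill the screen instead of a tiny corner of it.
    var meta = document.createElement('meta');
    meta.name = 'viewport';
    meta.content = 'width=640, user-scalable=no';   // 640 = the game's native width
    document.head.appendChild(meta);

    // Accept touch as an alternative to the keyboard: map screen halves to the
    // game's existing input handlers (game.moveLeft/moveRight are hypothetical hooks).
    document.addEventListener('touchstart', function (e) {
        if (e.touches[0].clientX < window.innerWidth / 2) {
            game.moveLeft();
        } else {
            game.moveRight();
        }
    });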

So the crazy idea is centered around the concept that HTML5 games, being based on open standards, should be modifiable by anyone (an idea floating around in Mozilla land) - in this specific case, modified to work well on mobile browsers. If a game developer didn't take care to make the game work well on mobile browsers, anyone should be able to fix it.

To prove this point I created a proof of concept, which I called "GameDoc". For the experiment I took the game at emptyblack.com. I really liked the game when I played it on the desktop and I wanted to play it on my phone. The game seems to run OK as far as the graphics are concerned, but it has the classic issues mentioned in 1). Since there is no keyboard input on the phone, the game is unplayable.

GameDoc is a native mobile app that basically wraps a browser WebView. It loads the game from the original URL, but sets the required zoom level on the WebView to get a good view of the game area and locks the screen orientation to landscape. And here is the cool part - it fixes the absence of a keyboard by injecting a JavaScript snippet into the page that adds on-screen controls. When these controls are pressed, keyboard events are simulated using the dispatchEvent mechanism. I used the WebView wrapper route to be able to do this JavaScript injection on the device (due to the absence of browser extensions on mobile).
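Here is a sketch of the kind of snippet GameDoc injects - on-screen buttons that synthesize keydown/keyup events through dispatchEvent. The button layout and key codes are illustrative, not the actual injected code.

    function sendKey(type, keyCode) {
        // Older mobile browsers make constructing real KeyboardEvents awkward,
        // so a generic event carrying an expando keyCode is a common workaround.
        var evt = document.createEvent('Event');
        evt.initEvent(type, true, true);
        evt.keyCode = keyCode;
        evt.which = keyCode;
        document.dispatchEvent(evt);
    }

    function addButton(label, keyCode, left) {
        var b = document.createElement('div');
        b.textContent = label;
        b.style.cssText = 'position:fixed;bottom:10px;left:' + left + 'px;' +
                          'width:60px;height:60px;background:rgba(0,0,0,0.4);color:#fff;' +
                          'text-align:center;line-height:60px;z-index:9999;';
        b.addEventListener('touchstart', function (e) {
            e.preventDefault();
            sendKey('keydown', keyCode);
        });
        b.addEventListener('touchend', function () {
            sendKey('keyup', keyCode);
        });
        document.body.appendChild(b);
    }

    addButton('<', 37, 10);   // left arrow
    addButton('>', 39, 80);   // right arrow
    addButton('A', 90, 150);  // 'Z' key, often jump/fire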

So here is a video of how the converted game works. It shows how the game is not playable in its original form in the mobile Chrome browser and how the fixed version works with simulated on-screen controls. It's not perfect - sometimes keyup/keydown events seem to get missed, and I think it might be due to multitouch issues. As you can see, I really threw this together rather quickly, with horrible-looking icons for the controls - but you get the idea.


The grand idea is that there could be a crowd-sourced HTML5 game portal where people can add their own customizations to a game's controls or to the game itself, and the GameDoc app (or whatever better name I come up with) would pull that extra information for the game and do the necessary modding.

Can I avoid the need for a separate app that wraps the WebView and just use the native browser? Maybe I can do something on the server side, but that might mess with things that are tied to the original domain, like cookies or local storage. Maybe I can just do the controls with some sort of IME trickery.

Anyway as I said it's a crazy experiment. At the least I hope to push HTML5 game developers to think more about the mobile web.

Let me know of your thoughts/comments.

Update: Looks like iOS 6 has better audio support, opening up better game support! And the awesome folks who make Construct 2 already support it - https://www.scirra.com/blog/98/the-web-as-the-platform

Tuesday, September 18, 2012

js13kgames and Voxel Shooter

I knew I had to take part in js13kgames - a competition to write JavaScript games that are less than 13KB when zipped. I had used HTML5 canvas for two of the last three game projects I worked on and was increasingly identifying myself as an HTML5 game programmer. However, a visit back to India and the work on Mario Ball had left very little time for me to actually work on my 13KB game.

At 13KB, I had decided to have some sort of procedural generation for the elements of the game. I also wanted to explore the Web Audio API for the sound. However, I was having a very hard time coming up with a concrete idea for the game. My very first idea was a weird fighting game where the player's form and powers were determined by a random combination of three elements - rectangles, circles and Bezier curves. The player and enemies basically looked like modern art pictures. I wasn't sure whether I was going too weird with this one. Some of the denizens of this unfinished game:





Then this entry came out on js13kgames - 13th Knight. It was a fully textured 3D world in WebGL in under 13KB! And it had awesome-looking trees!! Now I wanted to make a WebGL game. I had always wanted to learn WebGL but hadn't looked closely into it before. I had started looking at three.js for 7dfps but never got the time to do anything beyond a sphere moving on a plane. This time around, three.js wasn't going to be an option because no external libraries were allowed and three.js would not fit into 13KB (I later learned somebody else had managed to get a subset of three.js working in their 13KB game - Mindless).

So, not exactly knowing what I was going to build, I started with the WebGL lessons at http://learningwebgl.com/blog/?page_id=1217 to get the basic template for a WebGL program. I played around with it until I was able to draw a flat-colored cube with directional lighting. There were 5 days until the competition ended and I had to come up with something quick. Then it struck me - I would just redraw the cube a lot of times to create a voxel engine! (Being completely ignorant of the issues before starting a project is sometimes a good thing.) I had been missing a good flying game and I thought I would do a flying game where you shoot enemies on the ground - Voxel Shooter!!

I generated the terrain using a random combination of sine curves. I copied a small subset of the glMatrix library for the matrix operations. And I happily went about drawing cubes over and over again - one drawElements call per cube. I could reach a 50 x 20 array of cubes before the rendering became too slow. I tried removing faces of cubes that were hidden behind other cubes - a small increase in frame rate, but nothing great. I tried some cheating, like increasing the size of each cube so that I could get away with fewer cubes. Nope - the map was still too small. I banged my head trying to figure out a frustum culling method, but it just involved too much math to figure out in a small amount of time. Also, the far plane was going to be too near and the field of view too narrow if I could only display 1000 cubes. I just had to figure out a way to improve performance first.
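The terrain generation was along these lines - a heightmap built from a few overlapping sine waves. This is a sketch; the constants and map size used in the actual game were different.

    var MAP_SIZE = 256;
    var MAX_HEIGHT = 12;

    function buildHeightMap() {
        var heights = [];
        for (var z = 0; z < MAP_SIZE; z++) {
            heights[z] = [];
            for (var x = 0; x < MAP_SIZE; x++) {
                // Overlapping sine waves at different frequencies give rolling hills.
                var h = Math.sin(x * 0.08) * 4 +
                        Math.sin(z * 0.05) * 3 +
                        Math.sin((x + z) * 0.11) * 2;
                heights[z][x] = Math.max(0, Math.min(MAX_HEIGHT, Math.round(h + 5)));
            }
        }
        return heights;
    }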

Then I came across this - Google I/O 2011: WebGL Techniques and Performance. I had basically made all the mistakes mentioned in the talk. First and foremost, I was changing too much state for every cube - even re-setting the uniforms that were common to everything, like the lighting and the camera and projection matrices. I got a substantial boost when I fixed this issue, but still not enough to allow the 256 x 256 map size I was aiming for. The next thing was to reduce the number of GL draw calls by combining many cubes into one giant mesh. I had some trouble getting my head around this one because I hadn't understood the concept of attributes properly. Then I saw how I could augment the vertex data by putting each cube's world position in an attribute, so I could draw the entire terrain with a single drawArrays call - and BAM! Now everything was running smoothly at 19 fps (yeah, much better than 7 fps) for an entire map of 256 x 256.
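Here is a sketch of that batching fix: shared uniforms are set once per frame, and each vertex carries its cube's world position as an attribute so the whole terrain becomes a single drawArrays call. The attribute and uniform names are illustrative, not the ones from the actual game.

    var vertexShaderSrc =
        'attribute vec3 aPosition;   // vertex position within the unit cube\n' +
        'attribute vec3 aCubeOffset; // world position of the cube this vertex belongs to\n' +
        'uniform mat4 uProjection;\n' +
        'uniform mat4 uView;\n' +
        'void main() {\n' +
        '    gl_Position = uProjection * uView * vec4(aPosition + aCubeOffset, 1.0);\n' +
        '}';

    function buildTerrainBuffers(gl, heights, cubeVertices) {
        // cubeVertices: the 36 vertices (12 triangles) of a unit cube as a flat x,y,z array.
        var positions = [];
        var offsets = [];
        for (var z = 0; z < heights.length; z++) {
            for (var x = 0; x < heights[z].length; x++) {
                var y = heights[z][x];
                for (var i = 0; i < cubeVertices.length; i += 3) {
                    positions.push(cubeVertices[i], cubeVertices[i + 1], cubeVertices[i + 2]);
                    offsets.push(x, y, z);
                }
            }
        }
        var posBuf = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, posBuf);
        gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);

        var offBuf = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, offBuf);
        gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(offsets), gl.STATIC_DRAW);

        return { positions: posBuf, offsets: offBuf, count: positions.length / 3 };
    }

    // Per frame: set the shared uniforms once, then one draw call for the whole terrain.
    // gl.uniformMatrix4fv(uProjectionLoc, false, projectionMatrix);
    // gl.uniformMatrix4fv(uViewLoc, false, viewMatrix);
    // gl.drawArrays(gl.TRIANGLES, 0, terrain.count);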

Everything from here on was mostly polishing. I worked a lot on the camera - I was using the LookAt matrix but couldn't figure out how to roll, and I wanted the camera to roll when turning. After scouring the internet, I found a simple method: just pre-multiply the LookAt matrix with a roll matrix. I didn't like the way you could lose your bearings if you looked straight up, so I added some clouds (all drawn with a single GL draw call), filled the surrounding areas with water and added a wrap-around if you strayed too far from the island. I added some cubes to represent the enemies (all drawn with a single call). I spent quite a bit of time overlaying a 2D canvas as the minimap so that you would know where your enemies were. I tried adding bullets to shoot down the enemies, with the basic physics handled right in the shader, but that wouldn't work - I just had to give up on it. (Did you wonder why it was called Voxel Shooter but there is no shooting? :) I spent the whole night before the 7AM competition deadline adding these things, and before I realized it, it was time to submit!
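The roll trick itself is tiny - build the usual look-at view matrix and pre-multiply it with a rotation about the camera's forward axis. Here is a sketch using glMatrix 2.x-style calls (the game used an older, stripped-down copy of glMatrix, so the exact function names differed):

    var view = mat4.create();
    var roll = mat4.create();
    var rolledView = mat4.create();

    function buildViewMatrix(eye, target, up, rollAngle) {
        mat4.lookAt(view, eye, target, up);
        mat4.fromZRotation(roll, rollAngle);      // rotation about the camera's forward axis
        mat4.multiply(rolledView, roll, view);    // pre-multiply: roll applied in eye space
        return rolledView;
    }

    // e.g. tie the roll angle to how hard the player is turning:
    // var viewMatrix = buildViewMatrix([0, 10, 0], targetPos, [0, 1, 0], turnRate * 0.5);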

I was very dejected that I hadn't completed the game and was so close to the deadline. You just flew around doing pretty much nothing. You couldn't even crash into anything. Staying awake the whole night felt like a complete waste of time. And then, out of sheer desperation, I thought I would just add a simple check for crashing into the terrain and that would be the game - avoid crashing into anything. While testing it I kept flying through the enemies, so I thought I should add collision detection with the enemies as well. I added that, and then I realized that instead of treating crashing into enemies as a failure, that could be the whole point of the game!! I quickly added a score for picking up the red cubes (no longer the enemies!) and increased the speed of the game every time you finished a batch of them. I submitted this new version and was smugly happy with my own crisis handling. Next time I should just give myself more time to avoid such things.
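For the curious, that last-minute collision logic amounts to two cheap checks - compare the plane's altitude against the heightmap column below it, and treat proximity to a red cube as a pickup. A sketch with illustrative object shapes (not the actual game code):

    function checkCollisions(plane, heights, cubes) {
        // Crash: the terrain is a grid of unit cubes, so compare the plane's altitude
        // with the height of the column it is currently flying over.
        var gx = Math.floor(plane.x);
        var gz = Math.floor(plane.z);
        if (plane.y <= heights[gz][gx] + 1) {
            return { crashed: true, pickedUp: [] };
        }

        // Pickup: flying close enough to a red cube collects it.
        var pickedUp = [];
        for (var i = 0; i < cubes.length; i++) {
            var c = cubes[i];
            var dx = plane.x - c.x, dy = plane.y - c.y, dz = plane.z - c.z;
            if (dx * dx + dy * dy + dz * dz < 1.5) {
                pickedUp.push(i);
            }
        }
        return { crashed: false, pickedUp: pickedUp };
    }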

The next day I played through the other js13kgames entries and liked many of them. It was interesting to see the methods they used to keep the size down. The most amazing thing I learned about was jsfxr - a simple sound-effects generator that saves you from having to store audio in huge files. I am definitely going to use it more in my upcoming projects and in future versions of Voxel Shooter.

Some people seemed to like Voxel Shooter and it was really cool that indiegames.com picked it - http://indiegames.com/2012/09/browser_game_pick_voxel_shoote.html. Even Cheezburger picked it for their lunch leisure post - http://t.co/xO9drzmZ! Some were amazed it was done within 13KB, but I should mention that my game was very simple - it just does the same thing a lot of times, and I didn't spend any time trying to reduce the size.

The initial reaction from people has encouraged me to keep working on the game, and I think I will make it into a full-fledged game in the future. Lots of ideas are flying around in my mind now - sounds, bigger worlds, flying enemies, improved shading, shadows, water, lava, gamepad support, multiplayer, levels, missions... But now I will automatically think of one more thing - how small can I keep the code and assets?

And to end it all, here is a video from my awesome friend Brian, who scored 148 in Voxel Shooter! Can you beat that?!!



Sunday, September 16, 2012

Mario Ball with Keita Takahashi and Kaho Abe!

Something very unexpected happened. The Babycastles folks were running a Summit at the Museum of Arts and Design (MAD). It was centered around bringing Keita Takahashi's ideas to reality, along with other things like game-related talks (not just about videogames but also things like games in physical spaces) and indie music bands.

They wanted some help with the programming for one of their projects. For some inexplicable reason they thought of asking me, and I jumped at the opportunity. I had always wanted to explore physical games and alternate interfaces, and wanted to do things similar to what Kaho Abe does. And now they told me I was actually going to work with her! Not to mention an opportunity to work with Keita Takahashi!!

I had an initial meeting with Keita and Kaho. The project, Mario Ball, was to recreate the original Mario Bros. - but instead of using a normal controller, the idea was to control it with a wooden maze representing the level, with Mario controlled by an actual ball. I think Keita's original drawing explains this best.


Kaho had decided that we were going to use a magnetic ball for Mario and an array of magnetic reed switches underneath the controller surface to detect where the ball is and move the on-screen Mario accordingly (it was decided not to use computer vision because that was already being used for another game). This is where I was going to help them - actually build the game that processes the controller input to move Mario around. It was also decided to build the controller as a big box so that it would require two people to work with each other to actually control Mario! We also decided that we would make Mario go upside down when the ball is moved up and stays up there - something like a change in the direction of gravity. I was concerned about the resolution of the magnetic reeds and asked Kaho to also add an accelerometer, so that we could try to interpolate Mario's position when the ball was between two magnetic sensors and we had to guess which direction Mario was actually moving. Throughout the meeting I was mostly just thinking - Yay!! I am talking with Keita!!

Kaho was going to build this insane setup of 90 magnetic reed switches connected to an Arduino through a set of multiplexers. The program running on the Arduino would detect the active magnetic switch and write that value to the serial port along with the 3-axis acceleration information from the accelerometer. So now we were ready to start building the controller and the game. The kicker - we had 6 days to build this whole thing!! (And not even 6 whole days. I had to go to my day job on all of those days and work in the evenings, and Kaho was handling the hardware for 2 other games.)

I think my experience working with multi-tiered systems at my day job (at a small startup) helped here. We decided up front what the format of the controller output was going to be, so that we could initially work separately and in parallel and then put the controller and the game together when they were ready. I decided to build the game against a simulated ball and magnetic switches (similar to the mocks/stubs we use to test in isolation without external dependencies) so that I could iron out some of the issues I would face even before the actual controller was ready. In hindsight this got me to 50% of the actual solution - but that was good enough given the time crunch. The things I totally got wrong in my simulation were the area each magnetic switch covered and the maximum speed the ball would actually roll at in the real world.

So I went back and started to actually build the game. I didn't spend much time thinking about it, but I decided to use Processing because of its built-in ability to use the serial port to interface with the controller. This was both a good and a bad idea, but at the end of the day it was probably the right decision. The good part was that it all worked out, and even though I developed the whole thing on the Mac, it ran perfectly on the Windows machine where we finally put it - including the serial port part. The bad parts included Processing not really having an easy way to rotate and blend images at the same time (really? I expected better) and some bugs when part of the rendered image went offscreen. The IDE itself was horrible at times, and I had to actually delete one file at a time to find syntax errors. At the end of the day the positives probably outweighed the negatives, but next time around I would think a bit harder before using Processing for a game.

I spent the first few days getting the assets of the original Mario Bros. in (thanks Syed!) and trying to mimic the look of the original game as much as possible. I built the game in a way that is probably not recommended - I tried to get the graphics done well before the actual logic/gameplay. But I found that doing this motivated me much more than just dummy boxes and circles. Looking back, I think this also helped because I would never have been able to put in the effort to get the graphics right in the crazy days that were to follow. I also spent the initial time building a framework in Processing to read the world layout and character animations from Tiled TMX files. This also helped me out later, when Keita wanted me to add extra characters to the game and I was able to do it with relative ease. As usual, I wrote custom collision detection code instead of trying to figure out how to use Box2D with Processing. In my defense, I didn't have time to ramp up on anything more complicated than the simple graphics APIs, and this was the first time I was using Processing.

We met halfway through the week - both the hardware and the software were lagging behind where they should have been at that point. Keita was amazing as always - he had built the frame for the controller and created an actual Mario Ball by pasting colored strips onto the magnetic ball to form a Mario (thanks Lauren for the photo!).





That night Kaho and I worked through the whole night trying to finish as much as possible. We made really good progress, but we still weren't ready to put things together. The whole time I wasn't stressing myself out and tried to stay as calm as possible. I also told myself that Babycastles always manages to pull these things off - so I had nothing to worry about! (It turned out to be half true in the end :) So here is a picture of the amazing circuitry that Kaho put together that night. I think she called it unicorn barf or something like that.



The next day we had to move to MAD, and this is where things started going wrong. We weren't allowed to stay late the day before we had to set the whole thing up! So on the opening day (Friday) we had just a few hours to get everything ready. Kaho was also getting very busy dividing her time between all the projects she had to get working. This was a very tense day, and unfortunately we weren't able to get Mario Ball ready that night.

On Saturday we finally got the game and the controller talking to each other. It was a magical moment. I never thought we would get to this stage given the craziness of the whole project :) It had a few kinks, including a bug in the Arduino code that made it print lots of useless values in between the actual useful ones. A few hours and a few critical bug fixes later, we had something close to playable - there were some issues with Mario appearing to teleport when the ball moved too quickly, but the game was definitely playable. And so by Saturday afternoon Mario Ball was ready to go! We spent some time getting the actual hardware for it set up and the game was finally ready to be put up.

I haven't talked much about Keita, so let me mention just a bit here. He is a person of few but precise words, and I would like to keep this section the same way. He is an amazing combination of a nice guy and a person who pushes you to the limit. He wasn't there to just have fun - but to actually do something great. Good enough was not good enough - perfection was the only end point. In the end I didn't do everything he asked for - but the game definitely ended up much better than if we had just aimed for something good enough. The couple of things I took away from this whole experience were to aim high and to always keep improving.

Even though we were almost a day late, I was very happy to have Mario Ball running for more than half of the summit. I think most people had lots of fun playing the game. There were some who said it was too experimental - but I thought that was part of what we were trying to do. Kaho had done an amazing job with the hardware - it worked without anything breaking for most of the summit. (It briefly broke down on Sunday evening, but that was just because someone had yanked on the controller so hard that the USB cable connected to the Arduino came off. The crazy wiring just kept working the whole while.)

So here it is - a picture of a couple of people playing Mario Ball in the jungle-themed arcade section of the Summit (the jungle theme was an amazing achievement of its own - somebody needs to write about that). This should give you an idea of the size of the controller and how it takes two people to use it. There are so many details in the controller I can't capture here. Keita added the fur-like exterior and the window handles on the sides (which reminded him of his grandmother's house?) so that you can easily hold it and play.




Here is a video of people playing it at the Summit. I am sort of happy that I was able to capture the experience of the two players actually coordinating with each other to play the game.


I have no good words to end this post. Maybe that this is just the beginning. We will keep improving. You will see a better Mario Ball in the future!!