My favorite part of these kinds of demos is when the audience goes wild (well, relatively) for the breakdancing elephant animation, even more than for the pseudo-3D graphics and psychedelic color scanline gimmicks.
I love when people dig up new dirt on my favorite things from 30-ish years ago, in this case a playable prototype of a never-released “talkie” version of LucasArts’ Monkey Island 2: LeChuck’s Revenge. The folks at the venerable MixNMojo site have a good writeup, including a detailed archeology of the differences and newly discovered sound resources, along with information and images of LucasArts’ internal debugging tool, Windex (which ran on a second monitor in Hercules monochrome graphics mode!). Neat.
Nice write-up by Ars Technica on the ScummVM project’s history and developers. Hard to believe it’s been around for over 10 years already! (also, I hadn’t heard that they had a short-lived, controversial build that supported Eric Chahi’s Another World, one of the best games of all time…)
If you love the old Lucasfilm games and want a peek into how their venerable game engine worked from a very technical perspective, you should read this article that walks through a disassembled Maniac Mansion. Extra bonus: Ron Gilbert, the creator of the SCUMM scripting language, drops a lengthy note in the comments section with insider info:
One of the goals I had for the SCUMM system was that non-programmers could use it. I wanted SCUMM scripts to look more like movie scripts, so the language got a little too wordy. This goal was never really reached, you always needed to be a programmer. 🙁
Some examples:
actor sandy walk-to 67,8
This is the command that walked an actor to a spot.
actor sandy face-right
actor sandy do-animation reach
walk-actor razor to-object microwave-oven
start-script watch-edna
stop-script
stop-script watch-edna
say-line dave “Don’t be a tuna head.”
say-line selected-kid “I don’t want to use that right now.”
I think it’s amazing that they managed to build a script interpreter with preemptive multitasking (game events could happen simultaneously, allowing for multiple ‘actors’ to occupy the same room, the clock in the hallway to function correctly, etc.), clever sprite and scrolling screen management, and a fairly non-linear set of puzzles into software originally written for the 8-bit C64 and Apple II era of computers.
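The multitasking is the part worth sketching. Here’s a minimal, hypothetical illustration in Python of round-robin script scheduling — generators stand in for SCUMM scripts, and every name below is invented for the example, not taken from the actual interpreter:

```python
# A toy round-robin script scheduler in the spirit of SCUMM-style
# multitasking: each "script" is a Python generator that yields
# whenever it gives up its time slice.
from collections import deque

def walk_actor(name, steps):
    for i in range(steps):
        yield f"{name} walks (step {i + 1})"

def tick_clock(ticks):
    for i in range(ticks):
        yield f"hallway clock ticks ({i + 1})"

def run(scripts):
    """Advance each live script one step per round."""
    queue = deque(scripts)
    log = []
    while queue:
        script = queue.popleft()
        try:
            log.append(next(script))   # run until the script yields
            queue.append(script)       # still alive: reschedule it
        except StopIteration:
            pass                       # script finished; drop it
    return log

events = run([walk_actor("sandy", 2), tick_clock(3)])
```

Each script runs one step per pass through the queue, which is roughly how several actors, the hallway clock, and cutscene logic could all appear to run at once on a C64-class machine.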
Yukikaze, a “physical output device for a spectrum analyzer”. The idea is surprisingly simple, with elegant results: a case with powder beads that get blown around by sixteen DC fans mounted beneath, their speed controlled by Max/MSP. Real-life visualization fun.
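For a sense of the signal path, here’s a hypothetical Python sketch of the last step — turning per-band spectrum levels into fan duty cycles. The real piece does this in Max/MSP; the dB floor and 8-bit PWM range below are my own assumptions:

```python
# Hypothetical sketch of the mapping a patch like Yukikaze's would
# perform: take the level of each of 16 spectrum bands and turn it
# into a duty cycle for the matching DC fan. All numbers here are
# invented for illustration.
import math

def band_to_duty(level, floor_db=-60.0):
    """Map a linear band level (0..1) to an 8-bit fan duty cycle,
    using a dB scale so quiet content still moves the beads."""
    if level <= 0:
        return 0
    db = 20 * math.log10(level)            # linear -> decibels
    norm = max(0.0, 1.0 - db / floor_db)   # clamp over a 60 dB range
    return round(255 * min(1.0, norm))

def fan_speeds(band_levels):
    return [band_to_duty(lv) for lv in band_levels]
```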
Parchment is a JavaScript-powered Z-machine interpreter. Translation: you can now play your Zork and your Leather Goddesses of Phobos (or more modern pieces of interactive fiction) without leaving the comfort of your web browser.
Nice-looking little HTML5 <canvas> 2D game engine and toolkit written in JavaScript. More and more, apps are moving to the browser and out of the land of plugins and standalone RIA clients.
John Balestrieri is tinkering with generative painting algorithms, trying to produce a better automated “photo -> painting” approach. You can see his works in progress on his tinrocket Flickr stream. (Yes, there are existing Photoshop / Painter filters that do similar things, but this one aims to come closer to making human-like decisions. And no, this isn’t in any way a suggestion that machine-generated renderings will replace human artists – didn’t we already get over that in the age of photography?)
Whatever the utility, trying to understand the human hand in art through code is a good way to learn a lot about color theory, construction, and visual perception.
Magician Marco Tempest demonstrates a portable “magic” augmented reality screen. The system uses a laptop, small projector, a PlayStation Eye camera (presumably with the IR filter popped out?), some IR markers to make the canvas frame corner detection possible, Arduino (?), and openFrameworks-based software developed by Zachary Lieberman. I really love this kind of demo – people on the street (especially kids) intuitively understand what’s going on. This work reminds me a lot of Zack Simpson’s Mine-Control projects, especially with the use of cheap commodity hardware for creating a fun spectacle.
MyDsReader, Nintendo DS text-to-speech document reading software for the visually impaired. It uses the Flite synthesis engine designed for embedded devices, implements gesture controls for ease of use, and even throws in an integrated email and to-do client. Excellent use of homebrew development.
There’s something satisfying about hitting ZZ and returning to a webpage. This might be a good way to ensure that 100% of your blog comments come from *nix or code geeks…
Near real-time face detection on the iPhone using OpenCV. An obvious point to make, I know, but I still think it’s amazing: this would have been very difficult to do on any home computer just a few years ago, yet now our mobile devices can handle the task with relative ease.
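Part of why this is tractable on a phone: OpenCV’s classic Haar-cascade detector leans on the integral image (summed-area table), which makes every rectangular feature a constant-time lookup. Here’s a stdlib-only Python sketch of that core trick — an illustration of the technique, not OpenCV’s actual code:

```python
# The trick behind fast Haar-cascade face detection: the integral
# image. After one pass over the pixels, the sum of ANY rectangle
# costs four lookups, so thousands of Haar features per window
# stay cheap enough for mobile hardware.

def integral_image(img):
    """img: list of rows of grayscale values. Returns a summed-area
    table with an extra zero row/column for easy indexing."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w*h rectangle whose top-left corner is (x, y)."""
    return (ii[y + h][x + w] - ii[y][x + w]
            - ii[y + h][x] + ii[y][x])

# A two-rectangle Haar feature is then just a difference of sums:
# rect_sum(ii, x, y, w, h) - rect_sum(ii, x + w, y, w, h)
```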
Big news for high school hacker nerds everywhere who want to give their graphing calculator’s Z80 processor a better workout than just crunching algebra problems. Also a very good reminder that yesterday’s strong encryption now takes only a small bit of time to crack (in this case, it took one user with a dual-core Athlon about 75 days to break RSA-512). No, you can’t hide secrets from the future.
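To make the “no secrets from the future” point concrete, here’s a toy Python sketch: once the modulus is factored (trial division for this 16-bit example; 75 days of serious sieving for RSA-512), deriving the private key is trivial arithmetic. The numbers are the standard textbook toy key, nothing from the actual calculator keys:

```python
# Toy demonstration of why factoring breaks RSA: once n's factors
# are known, the private exponent is just a modular inverse away.
# (Requires Python 3.8+ for pow(e, -1, phi).)

def trial_factor(n):
    """Find a nontrivial factor by brute force (fine for toy n)."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    raise ValueError("n is prime")

e, n = 17, 3233              # textbook toy public key (n = 61 * 53)
p = trial_factor(n)
q = n // p
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent recovered

msg = 123
cipher = pow(msg, e, n)      # "encrypt" with the public key
plain = pow(cipher, d, n)    # decrypt with the cracked key
```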
Back when I was a young’un, we didn’t have to get around signing keys to run Z80 assembly; we just needed to build a serial port interface and grab a copy of ZShell…
Blit, an early Unix-based multitasking windowing system demo from Bell Labs, a precursor to the X Window System. X11 didn’t look much different ten years later, and true multitasking and multi-user systems have only recently filtered into the Mac and Microsoft Windows worlds. Not bad for 1982.
Videos from the recent ART && CODE Symposium, featuring presentations by the folks behind Scratch, Processing, Max/MSP/Jitter, and other fun + education-leaning graphics tools.