Jason Rowoldt

Do Androids Dream of Electric Sheep?

I've recently reconnected with a mentor I met in California, Howard Royster. Howard is an incredibly smart guy who holds several patents and was one of the early geniuses behind 3D objects. We've had many conversations about AniBot and its potential, so I wanted to start delving into some technical posts about how exactly AniBot was built and what it can potentially do, for Howard's benefit (and anyone else's who is interested). Some of these posts will be very technical; others will veer into a bit of philosophy, like this one.


AniBot was designed for, and always intended to be, an engine that creates digital content on an unprecedented scale. When I was designing it, I started reading about an English statistician named Thomas Bayes. Bayes has a theorem you can read all about here, but the important thing is that from his equation we can tease out what many people call a "Bayesian Engine". The math behind it is interesting, but the real thing I got out of it was that in some systems, every input must have a value that can be randomized when producing an output. That way, if anything is missing, there will still be an output instead of an error. In AniBot's case, this means the Animation Brain (which reads a "screenplay", or more accurately, the Shot Sheet that results from that screenplay) can be fed random inputs that fall within recognizable parameters. For instance, every shot in the Shot Sheet has things like camera angle, camera tracking, camera distance, actor (which is essentially the camera target, but can of course be a bit fuzzier than that), mood (lighting), tone, energy level, action... all of that can be randomized. In fact, when testing it, it starts out randomized. Think of the "a thousand monkeys typing on typewriters will eventually produce Shakespeare" thought experiment.
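To make that concrete, here's a minimal Python sketch of the "randomize anything missing" idea. The parameter names and value pools are hypothetical stand-ins, not AniBot's actual schema; they just show how an incomplete Shot Sheet entry can still yield a complete shot instead of an error.

```python
import random

# Hypothetical parameter pools -- the real engine would draw these from
# its database of uploaded objects, poses, and camera scripts.
SHOT_PARAMS = {
    "camera_angle":    ["low", "eye-level", "high", "overhead"],
    "camera_tracking": ["static", "pan", "dolly", "orbit"],
    "camera_distance": ["close-up", "medium", "long"],
    "actor":           ["android", "man_in_woods", "spaceship"],
    "mood":            ["bright", "noir", "golden-hour"],
    "tone":            ["calm", "tense", "comic"],
    "energy":          ["low", "medium", "high"],
    "action":          ["walk", "turn_to_camera", "stare"],
}

def complete_shot(partial_shot: dict) -> dict:
    """Fill every missing shot parameter with a random legal value,
    so the engine always gets a complete shot to render."""
    return {
        key: partial_shot.get(key, random.choice(values))
        for key, values in SHOT_PARAMS.items()
    }

# A Shot Sheet entry that only specifies the actor and the action --
# everything else is randomized within recognizable parameters:
print(complete_shot({"actor": "android", "action": "turn_to_camera"}))
```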


When I was designing it, I started there. I knew I had to create a system that could produce content automatically, and AniBot could actually do this early on. I called it "Dream Mode", because what it really does is take every object, every camera tracking method, every VFX script, every background, and every movement type that's been uploaded into it, and select from them randomly. In this way AniBot is "dreaming" when idle, that is, when the server is just sitting there and no one is asking it to produce and render something. We quickly learned, of course, that you can't store infinite content, so we had to limit its output to a few dreams at a time. The purpose was for me to review the completed videos and see if anything interesting popped up. If a dream produced something interesting, I could capture that shot or scene and create a Shot Template out of it. More on Shot Templates later, as they are how we "train" AniBot and make it more useful.
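As a rough illustration, here's what an idle-time Dream Mode loop might look like in Python. The asset categories and the small-batch cap come straight from this post; the pool contents, function names, and structure are my sketch, not AniBot's actual code.

```python
import random

# Hypothetical asset pools standing in for everything uploaded to AniBot.
OBJECTS     = ["android", "tree", "spaceship"]
TRACKING    = ["static", "pan", "dolly", "orbit"]
VFX_SCRIPTS = ["none", "explosion", "lens_flare"]
BACKGROUNDS = ["woods", "deep_space", "city_street"]
MOVEMENTS   = ["walk", "run", "idle", "zoom_to_face"]

DREAMS_PER_BATCH = 3  # can't store infinite content, so cap the output

def dream() -> dict:
    """Assemble one shot by selecting randomly from every uploaded asset."""
    return {
        "object":     random.choice(OBJECTS),
        "tracking":   random.choice(TRACKING),
        "vfx":        random.choice(VFX_SCRIPTS),
        "background": random.choice(BACKGROUNDS),
        "movement":   random.choice(MOVEMENTS),
    }

def dream_while_idle() -> list[dict]:
    """When the server is idle, emit a small batch of dreams for review."""
    return [dream() for _ in range(DREAMS_PER_BATCH)]

for shot in dream_while_idle():
    print(shot)
```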


So early on I started reviewing these videos. We had it set up so that every time I created an animation, once the system finished rendering and delivering it, it would spit out three "dreams". At first these were VERY random. It might be a long shot of a man walking in the woods, then a bunch of quick shots of a spaceship in space, and then a tree exploding. Fever dream stuff. But even though I KNEW how these videos were created (in fact, I programmed the machine that created them), they still surprised me.


At one point I was reviewing a dream when the main character, an android that a guy named Frank Sterling III had created for me, turned suddenly toward the camera and stared straight into it. Then the camera zoomed in on his face, and his expression went from happy to angry.


Now I KNOW how this happened. It was random. But it still freaked me out a bit. It really looked like there was agency there, like the character was mad at me. Of course, if you simply deleted the character pose that shows anger, the android would never pick that pose again. But now we get into a whole host of philosophical questions. What are we going to allow people to upload to the AniBot databases, exactly? Do we want to limit the facial expression of anger? Jealousy? Indignation? Only happy androids? What about sadness? Do we want to limit violent animations? No blood? No death? If we did that, I could never create an animated version of The Iliad, which I had wanted to do for a while. What about sex? My wife was very concerned about people using AniBot to make porn. What about Nazi salutes? What about a lot of things. Once you can create nearly instant content and pick from millions of objects and animations, you have to moderate them somehow, because once the genie is out of the bottle, it's hard to put back in. So we created a method to moderate everything that is uploaded into the AniBot database.
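This post doesn't spell out how that moderation method works, so what follows is only a guess at the simplest possible shape: a tag-based check at upload time. The tag names and the blocked list are hypothetical; deciding what belongs on that list is exactly the philosophical question above, which no code can settle.

```python
# Hypothetical blocked-tag policy -- the real AniBot policy is a product
# and philosophical decision, not something this sketch defines.
BLOCKED_TAGS = {"gore", "sexual_content", "hate_symbol"}

def allow_upload(asset: dict) -> bool:
    """Accept an uploaded object or animation only if none of its
    tags appear on the blocked list."""
    return not (set(asset.get("tags", [])) & BLOCKED_TAGS)

print(allow_upload({"name": "angry_pose", "tags": ["emotion:anger"]}))  # True
print(allow_upload({"name": "salute",     "tags": ["hate_symbol"]}))    # False
```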


To close, androids can dream of electric sheep. But only if you let them know what sheep are.

