Tuesday, 8 July 2014

3D Wood Grain

Using a block of wood and a plane, Keith Skretch made something amazing. He snapped a picture of the wood, then planed a thin layer off, snapped another picture, planed another layer off, and repeated this hundreds of times. In the resulting timelapse/stop-motion video you fly through the wood's structure, and can see knots and grain ripple by.


Waves of Grain from Keith Skretch on Vimeo
To my computational image analysis eyes, the truly amazing thing about this video is that it contains a detailed three-dimensional map of the internal structure of the blocks of wood; these blocks of wood have been digitally immortalised!
Let's look at just one of the blocks of wood:
The series of images from 29-36 seconds through Waves of Grain

So what can you do with this data? Well, you can reproject it to give a virtual view of what the left and front sides of the block of wood would have looked like:

That's quite cool, but it doesn't capture the power of having the full 3D information. More powerfully, you can make virtual cuts anywhere you want through the block of wood. You can cut it somewhere in the middle to take a look at the internal structure... The yellow lines mark where the virtual slices were made:
That's also quite cool, but it still doesn't capture the power of having all that 3D data. You can also reslice the image at any orientation you want; it doesn't have to be along neat orthogonal lines:
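If you want to try this on your own stack of photos, the whole trick is to treat them as one 3D array and sample it along whatever plane you like. Here is a minimal sketch in Python using numpy, scipy and imageio (not Skretch's or my actual workflow; the filenames and slice angle are invented):

```python
import glob
import numpy as np
from imageio import imread
from scipy import ndimage

# Stack the per-layer photographs into one volume indexed (z, y, x),
# where z is the depth planed into the block
files = sorted(glob.glob("planed_layer_*.png"))  # invented filenames
volume = np.stack([imread(f)[..., 0] for f in files])  # red channel of each RGB photo

# Orthogonal reslices are just indexing:
front = volume[0]                  # the first photo, i.e. the front face
left = volume[:, :, 0]             # virtual view of the left side
middle = volume[len(volume) // 2]  # a cut through the middle

# An oblique slice: sample along a tilted plane with interpolation
ny, nx = volume.shape[1:]
yy, xx = np.mgrid[0:ny, 0:nx]
zz = 0.3 * xx                      # invented tilt; out-of-volume points read as 0
oblique = ndimage.map_coordinates(volume, [zz, yy, xx], order=1)
```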

Again, quite cool. But you can still do more. Because this is now a purely digital representation of this block of wood you can display it in ways that would be physically impossible to make. Instead of just looking at the outside of the block...



... you can now look inside.



This 3D reconstruction lets you see how the growth rings appear in three dimensions, showing exactly where the grain runs. It lets you see how the knot, which is where a branch grew from the tree, cuts through the growth rings in a distinctive way. It lets you see pretty much everything about the internal structure of the wood!

This kind of approach is used all over biology, where it is normally called serial sectioning. You can use it for everything from reconstructing a whole person using histology and a light microscope to reconstructing a single cell by electron microscopy.

Software used:
ImageJ: 3D reconstruction.

Thursday, 3 July 2014

3D Lightning 2

About a year ago two redditors happened to take a photo of the same lightning bolt from different places, and I used the photos to make a 3D reconstruction: 3D Lightning.

Well, it happened again!
The two source images.

This time the lightning bolt struck One World Trade Center (the Freedom Tower), and two people got a shot of it from across the river. A little adjustment for the rotation of the images and some guesstimation of the photographers' approximate locations let me work out that there was very little vertical shift between their positions, but quite a large horizontal shift.

Just like last time, a 100% accurate reconstruction isn't possible; you would need to know the exact locations and elevations of the photographers, and the field of view of the cameras used, to do this precisely. However, just like last time, a rough reconstruction is possible, using the fact that the horizontal shift of each part of the lightning bolt between the two images is inversely proportional to its distance from the photographers (nearer parts shift more).
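To make the geometry concrete: this is the same parallax calculation as stereo vision, where distance = baseline × focal length ÷ disparity. A toy sketch with invented numbers:

```python
import numpy as np

# Pixel x-positions of the same points on the bolt in the two photos
# (matched by hand; all numbers here are invented for illustration)
x_photo1 = np.array([512.0, 530.0, 541.0])
x_photo2 = np.array([498.0, 510.0, 528.0])

baseline_m = 800.0   # guessed separation between the photographers
focal_px = 3000.0    # guessed focal length in pixel units

disparity = x_photo1 - x_photo2                 # horizontal shift per point
distance_m = baseline_m * focal_px / disparity  # nearer points shift more
print(distance_m)
```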

The approximate 3D reconstruction.

After grabbing the coordinates from the photos it was just a matter of plugging them into Blender to make an approximate 3D reconstruction.
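For the Blender end, building a rough bolt from a list of reconstructed points takes only a few lines of Blender's Python API. A sketch, using invented coordinates and the Blender 2.8+ API:

```python
import bpy

# Invented reconstructed (x, y, z) points along the bolt, top to bottom
points = [(0.0, 0.0, 400.0), (5.0, 2.0, 300.0), (3.0, -1.0, 150.0), (8.0, 0.0, 0.0)]

curve = bpy.data.curves.new("lightning", type='CURVE')
curve.dimensions = '3D'
spline = curve.splines.new('POLY')
spline.points.add(len(points) - 1)  # a new spline starts with one point
for spline_point, (x, y, z) in zip(spline.points, points):
    spline_point.co = (x, y, z, 1.0)  # POLY points take a 4th weight value

obj = bpy.data.objects.new("lightning", curve)
bpy.context.collection.objects.link(obj)
```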

Software used:
ImageJ: Image analysis.
Blender: 3D modelling and rendering.

Sunday, 8 June 2014

PixelTool

Many classic games, like Transport Tycoon, RollerCoaster Tycoon and Theme Hospital, have pixel art graphics using a limited number of colours. These graphics are tricky to draw and take a lot of skill, especially when trying to draw accurate 3D shapes from different angles while getting the perspective and shading right.

So I made PixelTool to help out!



What is PixelTool?

PixelTool is an online voxel-based tool for drawing isometric pixel art graphics. To use it you modify a 3D volume of voxels, picking 8-bit colours for each voxel and leaving the background as the 'magic blue', which is treated as transparent.

It takes the voxel data and renders it to a pixel-perfect isometric image, adding lighting and shadowing while still sticking to the original 8-bit colour palette.
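The heart of a renderer like this is small: walk the voxels from back to front and stamp each one onto the image at its isometric screen position, so nearer voxels overwrite farther ones. PixelTool itself is HTML/JavaScript; this is just a toy Python sketch of the projection, without the lighting and shadowing:

```python
import numpy as np

def render_isometric(voxels):
    """Project a (z, y, x) grid of palette indices (0 = empty) to a 2D
    image of palette indices, painting back to front."""
    nz, ny, nx = voxels.shape
    image = np.zeros((nx + ny + nz, 2 * (nx + ny)), dtype=np.uint8)
    for z in range(nz):          # low to high, so upper voxels win
        for y in range(ny):
            for x in range(nx):  # increasing x + y is nearer the viewer
                colour = voxels[z, y, x]
                if colour == 0:
                    continue
                u = 2 * (x - y) + 2 * ny     # screen column
                v = (x + y) + (nz - 1 - z)   # screen row
                image[v, u:u + 2] = colour   # each voxel is a 2 px dash
    return image

# A 4x4x4 block with a lighter top layer
block = np.full((4, 4, 4), 1, dtype=np.uint8)
block[-1] = 2
print(render_isometric(block))
```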

Slices through the voxel data of a piece of heavy hauling equipment for OpenTTD

The corresponding rendered image of the voxel block.

Blowing up the rendered image by 4 times lets you see what is going on in a bit more detail:


PixelTool isn't just a cheap imitation of 3D rendering software; it is a dedicated tool streamlined for making isometric sprites for classic 8-bit games.

Want to play some more?
Test PixelTool out online here: http://www.richardwheeler.net/interactive/pixeltool.html
Grab the source HTML/javascript code here: http://dev.openttdcoop.org/projects/pixeltool
Download this example of voxel data here: www.richardwheeler.net/hosting/voxeldata.txt
Join the discussion here: http://www.tt-forums.net/viewtopic.php?f=26&t=69974&start=60

Tuesday, 20 May 2014

Jurassic Wedding



You will have seen the instant internet classic of a dinosaur crashing a wedding... I got married this year and just had to do the same. Fortunately my wife agreed! I am a biochemist, but cloning a dinosaur to crash my wedding would have been a bit of a challenge, so I had to stick to the graphics approach instead.

So how do you get a dinosaur to crash your wedding?

Step 1: Recruit an understanding wedding photographer and guests for a quick running photoshoot. Make sure everyone is screaming and staring at something imaginary!


Step 2: Recruit a dinosaur. A virtual one will do, and I used this excellent, freely available Tyrannosaurus rex model for Blender.



Step 3: Get some dynamic posing going on! Most 3D graphics software uses a system called 'rigging' to add bones to a 3D model to make it poseable. This is exactly what I did: with 17 bones (three for each leg, seven for the tail, two for the body and neck, and two for the head and jaw) I made our pet T. rex poseable.

 The bone system

The posed result
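If you prefer scripting to clicking, the same kind of armature can be built with Blender's Python API. A tiny sketch of adding the first two tail bones (invented names and coordinates, Blender 2.8+ API):

```python
import bpy

# An armature datablock plus an object to put it in the scene
arm = bpy.data.armatures.new("TrexRig")
rig = bpy.data.objects.new("TrexRig", arm)
bpy.context.collection.objects.link(rig)

# Bones can only be added in edit mode
bpy.context.view_layer.objects.active = rig
bpy.ops.object.mode_set(mode='EDIT')

tail_01 = arm.edit_bones.new("tail.01")
tail_01.head = (0.0, 1.0, 1.5)   # invented coordinates
tail_01.tail = (0.0, 2.0, 1.4)

tail_02 = arm.edit_bones.new("tail.02")
tail_02.head = tail_01.tail      # chain it onto the previous bone
tail_02.tail = (0.0, 3.0, 1.3)
tail_02.parent = tail_01

bpy.ops.object.mode_set(mode='OBJECT')
```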

Step 4: Get the T. rex into the scene. By grabbing the EXIF data from the running photo I found that it was shot with a 70mm focal length lens. By setting up a matching camera in Blender and tweaking its position I made the perspective of the T. rex match the view of the running people.
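Reading the focal length and setting up the matching camera can both be scripted. A sketch (invented filename; assumes Pillow is available, and that the photo came from a full-frame camera):

```python
import bpy
from PIL import Image

# EXIF tag 37386 is FocalLength
exif = Image.open("running_photo.jpg").getexif()  # invented filename
focal_mm = float(exif.get(37386, 70.0))           # fall back to 70 mm

cam_data = bpy.data.cameras.new("MatchCam")
cam_data.lens = focal_mm        # focal length in mm
cam_data.sensor_width = 36.0    # assumption: full-frame sensor
cam = bpy.data.objects.new("MatchCam", cam_data)
bpy.context.collection.objects.link(cam)
bpy.context.scene.camera = cam
```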


Step 5: Make the dino look good. A 3D model is just a mesh of points in 3D space; to get it looking good, texturing and lighting need to be added, and for this project they also need to match the photo. Matching the lighting is particularly important, and I used Google Maps and the time the photo was taken to work out where the sun was as accurately as possible (a scripted version of this sun-matching step is sketched after the images below).

The T. rex wireframe

Textured with a flat grey texture.



With a detail bump texture and accurate lighting.

With colours, detail texture and lighting.
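Here is the sun-matching sketch promised above: the pysolar library gives the sun's altitude and azimuth for a place and time, which can then be used to aim a Blender sun lamp. The coordinates and timestamp are invented, pysolar is assumed to be available, and the rotation signs may need flipping for your scene's orientation:

```python
import math
from datetime import datetime, timezone

import bpy
from pysolar.solar import get_altitude, get_azimuth

lat, lon = 51.75, -1.26  # invented venue coordinates
when = datetime(2014, 4, 5, 14, 30, tzinfo=timezone.utc)  # invented time

altitude = get_altitude(lat, lon, when)  # degrees above the horizon
azimuth = get_azimuth(lat, lon, when)    # degrees clockwise from north

# A sun lamp points along its local -Z axis when unrotated
sun_data = bpy.data.lights.new("Sun", type='SUN')
sun = bpy.data.objects.new("Sun", sun_data)
bpy.context.collection.objects.link(sun)
sun.rotation_euler = (
    math.radians(90.0 - altitude),  # tilt away from straight down
    0.0,
    math.radians(-azimuth),         # turn to the sun's compass bearing
)
```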


Step 6: Layer it all together. To fit into the scene the dinosaur must sit into the picture in 3D, in front of some objects and behind others. To do this I just made a copy of some of the guests who needed to sit in front of the dinosaur and carefully cut around them. The final result is then just a matter of layering the pictures together.
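The compositing itself is straightforward: background photo, then the rendered dinosaur with its alpha channel, then the cut-out guests on top. A minimal sketch with Pillow and invented filenames:

```python
from PIL import Image

background = Image.open("running_photo.jpg").convert("RGBA")
dinosaur = Image.open("trex_render.png").convert("RGBA")   # has alpha
guests = Image.open("guests_cutout.png").convert("RGBA")   # has alpha

composite = background.copy()
composite.alpha_composite(dinosaur)  # each layer masked by its own alpha
composite.alpha_composite(guests)
composite.convert("RGB").save("wedding_disaster.jpg")
```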



So there you go! 6 steps to make your own wedding dinosaur disaster photo!


Software used:
Blender: 3D modelling and rendering.
Paint.NET: Final layering of the image.

Monday, 28 April 2014

A Year in The Life of a Computer

What does a year in the life of a computer look like?


Well, something like the map below! This is a map of every bit of mouse movement, every mouse click and every keyboard press I have made on my home and work computers, every day, for a whole year.


2013-2014 [click for a bigger view]

To make it I wrote a little Python script using pyHook to grab inputs in Windows, which I compiled to an .exe using py2exe. I set this up to start recording mouse movement, clicks and keyboard presses after I log into my home or work computer. After two years it had collected nearly 10 GB of data! This was far too much to look through by hand, so I wrote a second set of scripts to plot it as an image.
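A logger along these lines takes only a few lines of pyHook. This is a sketch of the idea, not my exact script (pyHook is Windows-only and from the Python 2 era), writing one timestamped line per event:

```python
import pythoncom
import pyHook

log = open("activity_log.txt", "a")  # invented log file name

def on_mouse(event):
    # event.Position is the (x, y) cursor position in screen pixels
    log.write("mouse\t%d\t%d\t%d\t%s\n" % (event.Time,
              event.Position[0], event.Position[1], event.MessageName))
    return True  # pass the event on so Windows still handles it

def on_key(event):
    log.write("key\t%d\t%s\n" % (event.Time, event.Key))
    return True

hm = pyHook.HookManager()
hm.MouseAll = on_mouse  # all mouse events: movement, clicks, wheel
hm.KeyDown = on_key
hm.HookMouse()
hm.HookKeyboard()
pythoncom.PumpMessages()  # Windows message loop; blocks forever
```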

So what does it all mean? Well, the map breaks down a bit like a normal calendar, with days of the week running from top to bottom and successive weeks running from left to right. The years and months are marked along the top.


Within each day my computer activity is broken down by time. Time runs from the top to the bottom of each day, from midnight to midnight. Coloured speckles on the dark background indicate computer activity. It is easy to see that I use computers a lot, with a gap from around midnight to 7 am when I am normally asleep, then smatterings of activity from around 8 am to midnight when I am at work or awake at home.


Different types of computer activity are shown in different colours.


The structure within each of the colours also contains information: distance in the horizontal direction corresponds to horizontal mouse position across my two screens (for mouse movement), which mouse button was clicked (for mouse clicks), and which key was pressed (for keyboard presses).
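In other words, each event becomes one pixel: the day picks a column block, the time of day picks the row, and the horizontal offset within the block encodes the position, button or key. A sketch of that mapping (layout constants invented; not my actual plotting script, which used ImageJ):

```python
import numpy as np

DAY_W, DAY_H = 64, 288  # invented: pixels per day column, rows per day
SECONDS_PER_ROW = 24 * 3600 / DAY_H

def plot_event(image, day, seconds_into_day, offset, colour):
    """Light one pixel. offset runs 0-1: scaled mouse x, button
    index or key code, depending on the event type."""
    x = day * DAY_W + int(offset * (DAY_W - 1))
    y = int(seconds_into_day / SECONDS_PER_ROW)
    image[y, x] = colour

week = np.zeros((DAY_H, 7 * DAY_W, 3), dtype=np.uint8)   # dark background
plot_event(week, 2, 9.5 * 3600, 0.3, (255, 0, 255))       # mouse: magenta
plot_event(week, 2, 9.5 * 3600 + 60, 0.8, (0, 255, 255))  # key: cyan
```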

2012-2013 [click for a bigger view]

In these maps of usage some interesting structures jump out; you can spot the type of work I was doing with my computer based on the type of mouse and keyboard activity:


This is usage on a day when I was writing my PhD thesis. The keyboard (cyan) shows loads of activity, while the mouse (magenta) shows relatively little.


This is a day when I was mainly using Blender for 3D graphics. The mouse (magenta) shows huge levels of activity, centred on just the left-hand screen. The keyboard is hardly active except for the control and shift keys, which light up as a single column of bright cyan pixels.

It is quite scary how much information can be gleaned from these maps of computer activity. Even without knowing which programs were open or which keys were being pressed, it is easy to work out where I have been, when I have been working, and the kind of things I was doing on my computer. Similar data can be collected remotely, particularly if an internet company tracks when and where you use the internet.

Stop for a second and think about the companies you interact with, and the data mining they can do. Think how much they can learn about you and your habits: Google and the websites you visit, your phone company and when and who you text and call, the supermarket you shop in and what you buy. These companies can work out what you are interested in, what you like and dislike, when you are awake and when you are asleep. This is big data, and it is valuable and it is powerful. Big data is how Target knew a man's teenage daughter was pregnant before he did!

Software used:
pyHook and py2exe: Data logging.
ImageJ: Data plotting.
Inkscape: Plot annotation.

Thursday, 17 April 2014

Tree of Plants

Everyone knows what plants are like: they have leaves and roots, flowers and seeds. Or do they? All of these classic features of plants are actually relatively recent developments in plant evolution. Conifers don't have flowers, ferns don't have seeds or flowers, and moss doesn't have leaves, roots, seeds or flowers! Leaves, roots, flowers and seeds are all features that evolved as plants adapted, starting from something like seaweed, to life on the land.

This term's issue of Phenotype has a bit of a focus on plants, and my research comic for this issue covers how plants evolved and adapted to land. You can download a pdf of the feature here; the full issue for the summer (Trinity) term will be available soon here.


While I was making this I started reconsidering just what the plant life cycle looks like, as the classic school education about how plants reproduce isn't very accurate! The classic teaching is that the pollen produced by a flower is like the sperm in mammals (including humans), and the ovum in the flower is like the egg in mammals. In fact, pollen and the developing seed are more like small haploid multicellular organisms, gametophytes, which used to be free-living. If you go back through evolutionary time towards the ferns, the gametophyte is a truly independent multicellular organism. Go back further still and the bryophytes spend most of their life cycle as the gametophyte.

If you imagine the same evolutionary history for humans then it is easy to see how different this life cycle is to that of animals; if the ancestors of humans had a life cycle similar to ferns then, roughly speaking, ovaries and testicles would be free-living organisms that sprout a full-grown human once fertilisation successfully occurs. I can't help but think that would have been a little strange!

Software used:
Autodesk Sketchbook Pro: Drawing the cells.
Inkscape: Page layout.