27 March 2014

Reindeer Test



This is my first test of filling the reindeer mesh I created with particles; the aim is to create the suggestion that the reindeer is being formed from stars.

I tried several different approaches before deciding on a set-up. My intention was to have the particles fill the mesh and fly around, rather than filling the mesh solidly and sticking together, and I also wanted the particles to fill the smaller areas of the mesh, such as the antlers, for clarity. Using a simple emitter set to a high Rate Per Second meant that the mesh filled with particles that were packed very tightly; although this eventually filled some of the more detailed areas, the particles were too static and there were too many of them, which made the simulation very slow.

In order to achieve the final result I decided to use separate curves to outline some of the main areas of the reindeer mesh. I then assigned particle emitters to the curves, so that the particles would emit from each vertex of a curve. I was also able to control the number of points in each curve using the 'Rebuild Curve' tool, which meant I could have the particles emit evenly along each curve. I then experimented with several different emission rates in order to get a look I was happy with and have the particles fill the mesh at an appropriate rate. One of the main changes that helped achieve this was switching the particles' lifespan from Live Forever to Constant.
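For reference, a set-up along these lines could be scripted with maya.cmds. I built mine through the UI, so treat this as a rough sketch only; the curve names, rates and lifespan values are placeholders.

```python
import maya.cmds as cmds

# Placeholder names: in my scene these are the curves outlining
# areas of the reindeer such as the antlers, body and legs.
outline_curves = ['antler_crv', 'body_crv', 'leg_crv']

particles = cmds.particle(name='starParticles')[0]
shape = cmds.listRelatives(particles, shapes=True)[0]

# Constant lifespan rather than Live Forever, so particles die off
# and the fill rate can be controlled through the emission rate.
cmds.setAttr(shape + '.lifespanMode', 1)   # 1 = Constant
cmds.setAttr(shape + '.lifespan', 8.0)

for crv in outline_curves:
    # Rebuild each curve so the emission points are evenly spread along it.
    cmds.rebuildCurve(crv, spans=30, degree=3, keepRange=0)
    # Emit from the curve; the rate was tweaked per curve by eye.
    emitter = cmds.emitter(crv, type='curve', rate=40, speed=0.5)[-1]
    cmds.connectDynamic(particles, emitters=emitter)
```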



In order to achieve the effect of the particles flying around inside the mesh I added a Turbulence field and a Vortex field. In the render above I also changed the particle render type to Streak and adjusted the shading. For lighting I simply used an HDR image to produce the test.
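Roughly, the fields and render type amount to something like the following; again this is a sketch, and the magnitudes are placeholder values I would tweak by eye.

```python
import maya.cmds as cmds

# Fields that push the particles around inside the mesh.
turb = cmds.turbulence(name='starTurbulence', magnitude=15, frequency=1.5)[-1]
vort = cmds.vortex(name='starVortex', magnitude=8)[-1]

# Connect both fields to the particle object created earlier.
cmds.connectDynamic('starParticles', fields=[turb, vort])

# Render the particles as streaks (6 = Streak on a classic particleShape).
cmds.setAttr('starParticlesShape.particleRenderType', 6)
```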

I was quite happy with the outcome of this, so I will work on adding a small amount of animation to the reindeer mesh, and I may also add some other background elements to create interest. I also need to add another particle simulation to help create the illusion that the particles are gathering together to form the deer. This could, however, be created separately and composited with the reindeer render.



Reindeer model in progress

2 January 2014

Filling Mesh with Fluid


In the following examples I have been experimenting with filling the character I created previously with a fluid in Maya. 

In order to achieve this effect I used a 3D fluid container, starting with the Fish Tank preset; by manipulating the attributes of the fluid I was able to make it rise to the top of its container. The container is placed around the outside of the character, and the character's mesh is selected and assigned as a collision object for the fluid.
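For my own notes, the container set-up boils down to something like this. It is only a sketch; I worked from the preset and the Attribute Editor, so the procedure and attribute names below are approximate and may differ between Maya versions.

```python
import maya.cmds as cmds
import maya.mel as mel

# Create a 3D fluid container; here a 40^3 resolution, 10 units per side.
# (I believe this is the MEL procedure the Fluid Effects menu calls.)
fluid = mel.eval('create3DFluid 40 40 40 10 10 10')

# Raising the buoyancy makes the density rise towards the top of the
# container; the value here is a placeholder.
cmds.setAttr(fluid + '.densityBuoyancy', 3.0)

# The character mesh was assigned as a collision object through the
# Fluid Effects > Make Collide menu rather than scripted here.
```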

The outcome of this was that although the fluid filled the mesh, there were also large amounts of fluid around the outside of the character. On this occasion I was unable to fix this, though I did like parts of the effect, aside from the blocky nature of the fluid, which comes from it adhering to the shape of the container. I will therefore look for other techniques for filling meshes with fluids.

Another way to approach this would be to enable 'Fill Object' in the Attribute Editor, which results in the fluid originating from the centre of the character mesh. I will try this approach next.


I am also aware that another approach would be to use particles to fill the mesh instead; these can also be converted to a mesh that can be textured, although the physics of the water may not be particularly convincing.




Having achieved a look I was fairly happy with, I did a test render of the scene to see what the final result would be. I quite liked the effect of the fluid appearing to burst from the lens of the character, as well as the reflection of the animated image sequence in the fluid, so I decided to experiment with having this appear to seep into the rest of the fluid by combining the test renders with the video footage from the experiment.

I used several duplicates of the footage with varied blending modes, which I masked and animated in After Effects to produce the rough test towards the end of the video below.




The next step will be to use a similar technique to combine the character's mesh filling up with the cornstarch / displacement map experiments. This will involve attempting to match the shading of the liquid that fills the character with that of the displacement animation, which is something I will experiment with at a later date.

Post by Jess

11 December 2013

Further Testing of Oil Footage as Animated Texture on Model

Expanding upon my last post, I have done some further testing of image sequences on the lens-headed character. In the following video I added an image of a lens to the image sequence before exporting from After Effects. Using this process means that we can use After Effects to create any sort of footage, which can then be exported as an image sequence and applied as a texture within Maya.

I also wanted to experiment with the reflectivity of the glass in the character's face, so here I have moved an image plane with a Surface Shader applied to it around the scene to test the reflections. I like the effect this has and think the technique could be used to emphasise particular areas of the animation.

28 November 2013

Projecting video footage on to 3D objects

Having modelled a basic character based on my designs in a previous post, I have been experimenting with using the live-action footage from our experiments as textures on objects within Maya.

In this example I extracted the 'lens' from the character's face and duplicated it. To create the look of glass I enlarged the duplicate and added several extrusions to create a raised surface. I also added a spot light as the main illumination in the scene and set the light's decay to Quadratic; this creates a more natural look, as the fall-off matches that of real light, with intensity decreasing with the square of the distance. The further away the light is from the object, the less light there will be, whereas with no decay the light's value does not change no matter how far away it is from the object. I also added a point light behind the character to highlight some of the rear details, and added two planes with surface shaders applied to them to create reflections.
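As a rough sketch, the lighting set-up amounts to something like the following; the intensities and names are placeholders, and decayRate 2 corresponds to Quadratic.

```python
import maya.cmds as cmds

# Key spot light with quadratic (inverse-square) decay for a more natural
# fall-off; quadratic decay needs a much higher intensity than no decay.
spot = cmds.spotLight(name='keySpot', intensity=400, coneAngle=45)
cmds.setAttr(spot + '.decayRate', 2)   # 0 none, 1 linear, 2 quadratic, 3 cubic

# Point light behind the character to pick out the rear details.
back = cmds.pointLight(name='rimPoint', intensity=150)
cmds.setAttr(back + '.decayRate', 2)

# Reflection cards: planes with a bright surface shader applied.
card = cmds.polyPlane(name='reflectionCard', width=10, height=10)[0]
shader = cmds.shadingNode('surfaceShader', asShader=True, name='cardShader')
cmds.setAttr(shader + '.outColor', 1, 1, 1, type='double3')
cmds.select(card)
cmds.hyperShade(assign=shader)
```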

I first tested a still image as the texture on the lens to see what it would look like. Clearly the lighting in this image is too dark, so I will also need to adjust this. I also used an image of a camera lens underneath the frame from the video here; adjusting the coloration of the video footage is something we will address at a later date.




In order to project the video onto the lens I first created a simple UV layout for it. I then took a UV snapshot and opened the file in After Effects, which allowed me to maintain the dimensions of the UVs and import the video footage over the top. I then created a circular mask to mask out the area of the video that fell outside the lens.
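The UV snapshot itself can also be exported with a single command; this is just a sketch, and the object name, path, resolution and file format are placeholders.

```python
import maya.cmds as cmds

# Select the lens object, then write out its UV layout so the
# footage can be lined up over it in After Effects.
cmds.select('lens_geo')
cmds.uvSnapshot(name='C:/project/uv/lens_uv.png',
                xResolution=1024, yResolution=1024,
                fileFormat='png', overwrite=True, antiAliased=True)
```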




Using a mask to create an image sequence
Doing this in After Effects then allowed me to export the footage as single frames (an image sequence), which I was able to set as a texture for the lens object in Maya. It took me a while to get this working correctly, the main problem being the naming of the files; Maya is rather particular about how files in an image sequence are named. I eventually found that naming the images image.00, image.01 and so on was the correct convention for the image sequence to work. I then exported a few frames to ensure that the sequence was working. As I am working with the mental ray shader mia_material_x, I was unable to preview the image sequence in the scene, as the shader just appears black.
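For future reference, hooking an image sequence up to a file texture can also be done with a few lines of Python. This is a sketch only; the file path and node names are placeholders, and I am connecting to an ordinary Blinn here rather than mia_material_x.

```python
import maya.cmds as cmds

# File texture pointing at the first frame of the sequence; Maya picks up
# the rest once Use Image Sequence (useFrameExtension) is enabled.
file_node = cmds.shadingNode('file', asTexture=True, name='lensSequence')
place = cmds.shadingNode('place2dTexture', asUtility=True)
cmds.connectAttr(place + '.outUV', file_node + '.uvCoord')
cmds.connectAttr(place + '.outUvFilterSize', file_node + '.uvFilterSize')

cmds.setAttr(file_node + '.fileTextureName', 'C:/project/seq/image.00.png', type='string')
cmds.setAttr(file_node + '.useFrameExtension', 1)

# Keep the frame extension in step with the timeline.
cmds.expression(string=file_node + '.frameExtension = frame')

# Drive a simple Blinn with the sequence and assign it to the lens.
shader = cmds.shadingNode('blinn', asShader=True, name='lensShader')
cmds.connectAttr(file_node + '.outColor', shader + '.color')
cmds.select('lens_geo')
cmds.hyperShade(assign=shader)
```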

I also created a transparent, reflective material for the outer lens, although I am not quite happy with the results; this purely functions as a test scene.



Image sequence as texture on lens, outer lens with bevel to create reflection 

Having achieved this, we are now aware of the workflow for applying image sequences to objects in Maya and of the way After Effects can be utilised to create image sequences from video footage. Below is the test of the image sequence.





20 November 2013

Fun With Corn Flour and Sound


This is the video I managed to capture from the experiment. Unfortunately, I do not think it is good enough.


  1. The white balance is not correct.
  2. Not enough light (bigger aperture, more lights).
  3. Slow shutter speed.
  4. Would be ideal if we could get a green background.
  5. For better results a stronger speaker / sub would be needed.
These are all things I am going to adjust in tomorrow's experimental session. One of the bulbs blew, leaving us with only one light, which left the scene a little dark. I did not want to open the aperture too wide in case the whole scene was no longer sharp, and I also did not want to raise the ISO because of noise. To lighten the scene I slowed the shutter speed, but this resulted in very blurry footage. Because of this, in tomorrow's session I will be strict about the variables and note down all the numbers being used.

© Conor Page