9/20/25
Continuing development on the tests for caustics, I now have access to the Higx point renderer, which takes Nuke's particle system to the next level.
The first two iterations are based on gravity and an animated driving force.
A test of one of the stills from the animatic, with the caustics and the effect masked.
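The gravity-plus-animated-force setup from the two iterations can be sketched outside of Nuke as a simple forward-Euler particle step. This is an illustration of the idea, not the actual node settings; the force function, gravity value, and frame rate here are all made-up stand-ins:

```python
import math

# Constant downward pull; a stand-in value, not the setting used in Nuke.
GRAVITY = (0.0, -9.8, 0.0)

def driving_force(t):
    # Animated driving force: direction swings over time (illustrative).
    return (math.sin(t), 0.0, math.cos(t))

def step(pos, vel, t, dt=1.0 / 24.0):
    # One forward-Euler step per frame at an assumed 24 fps.
    force = driving_force(t)
    vel = tuple(v + (g + f) * dt for v, g, f in zip(vel, GRAVITY, force))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# Simulate one second of a single particle starting at rest.
pos, vel = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
for frame in range(24):
    pos, vel = step(pos, vel, t=frame / 24.0)
```

The same split (a constant acceleration plus a time-animated force) is what the ParticleGravity and expression-driven force nodes do inside Nuke's particle graph.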
9/17/25
I have been working on a spreadsheet to plan the shoot for each shot that will be on either the greenscreen or the XR stage. Each shot is different, so planning what will be needed for comp, along with all CG assets, is essential to the workflow. We now have an Instagram for the project so that other students we will need help from can get involved: @metamorphosis_vfxthesis
9/15/25
Hannah has updated the animatic here:
9/12/25
I have done a couple of tests with Nuke's particle system, feeling out how much I may be able to do in comp in terms of additional effects to save on render times down the pipeline. These are purely tests to see whether it will be possible to create caustics and additional light/effects in Nuke.
I used Nuke's particle system; the movement is driven by cos, sin, and tan waves.
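The wave-driven movement can be sketched like this. It is a hedged Python illustration of the expression approach, not the actual Nuke setup; the amplitude and the tan clamp are assumed values. Note that tan has to be clamped, since it diverges near odd multiples of pi/2:

```python
import math

def wave_offset(t, amp=1.0, clamp=4.0):
    """Per-frame positional offset driven by cos, sin, and tan waves.

    Illustrative stand-in for the Nuke expressions; amp and clamp are
    made-up values, not the settings used in the tests.
    """
    x = amp * math.cos(t)
    y = amp * math.sin(t)
    # tan blows up near odd multiples of pi/2, so clamp it to keep
    # particles from shooting off-screen.
    z = max(-clamp, min(clamp, amp * math.tan(t)))
    return (x, y, z)

# Sample one second of motion at an assumed 24 fps.
path = [wave_offset(frame / 24.0) for frame in range(24)]
```

In Nuke itself the same waveforms can be typed straight into a node's expression fields (e.g. `sin(frame/24)`), which is what makes this kind of motion cheap to iterate on in comp.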
Here are the first tests with an HDRI behind them, for context on transparency.
First class! Super exciting to finally be starting 408 after quite a bit of prep and anticipation!
Going off the notes from today, I have come across a few color scripts that were used for the Tinker Bell movies that may be helpful for the team going forward.
For any physical props the character is going to interact with, Gaussian Splatting may be the way to go for integrating the CG. This article:
Gaussian Splatting in Superman
talks about how Superman used Gaussian Splatting to get more accurate depth. Although we don't have the same resources as the team that made it happen, I believe we can simplify the approach to make the depth pass work for our needs.
Hannah worked on the storyboards:
When researching projects by visual effects students, I found that a main commonality is the lack of camera movement. To fix that, I believe using the motion control rig on the greenscreen/bluescreen shots will allow for camera moves. I am not sure whether the XR stage has motion control capabilities, but if it does, that would help as well. Additionally, if we are able to take the camera information from the XR stage shots and import it into Unreal and Nuke while keeping the correct proportions, the effects integration would be physically accurate, along with a proper matchmove!
I had the privilege of working at Harbor over the summer for an internship, so I asked a few questions pertaining to the project, based on our constraints and the direction we are heading in.
As everyone on the core team is a visual effects artist, we will need to branch out and have people from other departments help where we may not be as strong, such as a camera operator and a director of photography. If we can have the backgrounds done for all of the shots, not just the ones on the XR stage, before we shoot, it will help the DP (director of photography) light the actress better, allowing for a seamless camera match.
As my portion of the project cannot start until filming begins, I have been doing a lot of research on the best ways to shoot the plates on the XR stage and against the greenscreen/bluescreen background, depending on the final lookdev.
The main thing I wanted to look into is integrating the material on the fairy with the effects that will be taking place around her. To do that, I think we are going to need the costume she will be wearing as a 3D asset, to allow for accurate integration of both lighting and texturing.
For my senior project, the theme right now is some sort of dark fairy fantasy world. I will be in charge of compositing and helping to conduct the on-set supervision needed for the live action elements.
Hannah Kim (Director/FX/Lookdev)
Gracie Szymanski (FX/Production)
Mia Esparragoza (Environment/Lookdev/Lighting)
Julian Schenker (Unreal Production)
Olivia Westling (Compositing/Editing)
Charlie Ragland (Lighting/FX)
9/8/25
9/5/25
The script can be found on Gracie's blog.
8/22/25
8/16/25
7/30/25
7/5/25