A test of what the magic may look like in shot 2K! The effect is done by applying animated noise and layering it on as if it were makeup. Additionally, I did a little bit of cleanup on areas of the skin that might be distracting, such as removing the visible bobby pins.
I used ST maps generated from frame 1015, and I will likely need to segment each map so it only runs between blinks, since blinking breaks the map from working properly. Eventually Hannah will have an effect for me to integrate into the eyes!
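Since one continuous warp can't survive a blink, the idea is one STMap per blink-free segment with a switch hopping between them. Below is a rough sketch of how that could be wired with Nuke's Python API; the node layout and the frame ranges are placeholders, not my actual setup.

```python
# Rough sketch only: one STMap warp per blink-free segment, with a Switch
# keyed so each segment uses its own warp. Frame ranges are hypothetical.
import nuke

blink_segments = [(1015, 1062), (1066, 1110)]  # placeholder ranges between blinks

switch = nuke.nodes.Switch()
for i, (start, end) in enumerate(blink_segments):
    # each STMap would read a map rendered from that segment's reference frame
    warp = nuke.nodes.STMap()
    switch.setInput(i, warp)

# animate the Switch so it jumps to the matching warp at each segment start
switch["which"].setAnimated()
for i, (start, end) in enumerate(blink_segments):
    switch["which"].setValueAt(i, start)
```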
2/2/26
The updated contact sheet of all the shots I am working on so far! 2B has been passed over to Olivia to comp; I'm so excited to see how that shot turns out.
This upcoming week I believe I will finally be getting the fairy wings for the shots, along with moving into full iteration mode with the rest of the team!
1/31/26
I have been given an updated backdrop with the camera data from 2A and 2I.2!! Here are those composites in progress. The color is not final either.
2A
With midterms coming up, I'm trying to get as many shots as possible into good shape for integration. So far the shots aren't causing too many problems: the background was evenly lit (a pro of using the XR stage), so a fix that works in one shot can usually be reused in another. The main issue is that edge work just takes time, but that's fine, since the backgrounds and all CG assets are still works in progress.
2I.2
1/28/26
For the past few days, Zachary from the senior project "Four Ways to Destroy Aspen High" and I have been working on getting Neat Video back on the computers so we can properly denoise our footage, and we finally have it! It took me about five hours to denoise all of the plates, since it's something that has to be done manually. The footage was already fairly clean, so getting high noise percentages to work with in Neat Video was a bit tricky! The highest I was able to get was 85%, while the lowest was 18%.
1/25/26
The past couple of days I have been developing a tool to speed up the process of scaling and repositioning the cameras exported from Unreal once they are in Maya. Here is the current state of that code! I have both parts working separately; putting them together as one tool is what is taking the most time. This will help a lot, since there is no real way to go about it other than the camera-fixing workflow we figured out in Maya.
Once I get everything working, I plan to make a GUI so the artists on the team have an easier time running the code.
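As a sketch of what the combined tool does (not the actual code), the two fixes look roughly like this in maya.cmds; the node name and scale factor here are placeholder assumptions:

```python
# Minimal sketch, not the real tool: shift the Unreal camera's keys to start
# at frame 0, then rescale the move. Names and the 0.01 factor are placeholders.
import maya.cmds as cmds

def fix_unreal_camera(camera="unreal_cam1", scale=0.01, start_frame=0):
    """Shift the camera's animation to start_frame and rescale the move
    from Unreal's centimeters into the scene's working units."""
    # slide every keyframe so the animation begins at start_frame
    first_key = cmds.findKeyframe(camera, which="first")
    cmds.keyframe(camera, edit=True, relative=True,
                  timeChange=start_frame - first_key)
    # parent the camera under a group and scale that group, so the
    # translation curves shrink uniformly without touching rotations
    grp = cmds.group(camera, name="camera_scale_grp")
    cmds.setAttr(grp + ".scale", scale, scale, scale, type="double3")
    return grp
```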
1/24/26
Over the weekend I focused on edge work! Shot 2A, although still in progress (everything surrounding the character is temp), has its edges almost entirely cleaned up. Until I get final assets, all of the shots I continue to work on are of course subject to change, but that's exactly why getting the edges ready for whatever happens comes first.
Additionally, I am happy with how 1B is turning out! The keying of the green webbing was something I was worried about, but it seems to be working out in the end, so that's super exciting.
1/21/26
All of the live action shots are now at the preliminary comp stage!
Having all of the shots at this stage will make updates much easier as the CG components evolve! Additionally, it means I can start on the matchmove for the wings in each shot!
1/18/26
Over the weekend, I did quick preliminary cleanups of about half of the live-action shots.
The background is the first background WIP from Houdini, made by Mia, our environment artist!
By the end of this week, the rest of the shots should be at this stage, so I can matchmove them all, which will allow for wing integration! As I get the actual backgrounds for the shots, I will be able to properly integrate our live-action plates into them.
What sped up this process tremendously was a ComfyUI setup I had built a few weeks prior. It is important to note that many of the shots simply failed: the models were unable to decipher what was supposed to be masked and what wasn't. Shot 1D in particular failed no matter what the threshold was. So despite having these advanced tools, they are not right for every shot. Here is what that failure looks like:
1/16/26
Today I spent a bunch of time figuring out whether the camera-export workflow was the best idea, and for a few reasons it was not. First off, the camera solve from Unreal was much smoother than the one from Nuke's 3D tracking.
Here's just the camera tracking comparison:
Additionally, the cameras did not start at frame 0 at the time of recording; the stage stamps frames as hour, minute, second, millisecond, which made them quite tricky to move in Nuke. The camera was also not at the origin, so binding it to an axis in Nuke was not optimal either. Because of this, I went to Charlie to help troubleshoot the issues we were getting with exporting our cameras from Unreal to Houdini! What I initially thought would work was not the case.
What we found did work, though, was exporting the camera from Unreal and importing it into Maya (because Houdini strangely would not read the camera to begin with), using Maya to move the keyframes to 0, then exporting the camera from Maya and importing it into Houdini as a Filmbox FBX, and finally binding the camera to an axis in Houdini so the rest of the team can use the same camera.
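Moving the keys to 0 mostly comes down to decoding the stage's hour/minute/second/millisecond stamps into frame numbers. A minimal sketch of that math, assuming a 24 fps plate (the rate here is a placeholder):

```python
# Hedged sketch: convert a stage timestamp to an absolute frame number,
# then shift every key back by the first sample's frame. Assumes 24 fps.
def stamp_to_frame(h, m, s, ms, fps=24.0):
    total_seconds = h * 3600 + m * 60 + s + ms / 1000.0
    return round(total_seconds * fps)

# e.g. the offset to subtract from every key (placeholder timestamp)
offset = stamp_to_frame(14, 32, 7, 500)
```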
Camera Tracking Comparisons vs the Footage for Shot 2I.2:
1/14/26
The workflow we are currently going with to get the camera data correctly distributed to the team, based on the data from the XR stage's RedSpy camera tracking and our ability to clean up that tracking: edit -> Unreal camera data -> Nuke + matchmove fix -> cameras to the rest of the team.
Additionally, I set up the final compositing schedule/spreadsheet now that we are just about picture locked!
1/12/26 - 1/13/26
Yesterday and today we had our final shoot days!
I was the on-set VFX supervisor, which for this shoot meant shooting all of the HDRIs, making sure we had color charts in every shot for the colorist, checking that the tracking data from the RedSpy tracker connected to the camera was feeding into Unreal correctly, and taking BTS!
The day prior, we had also finished redoing the wing tracking markers!
This is very similar to how the wings were created for the Maleficent movies; huge props to that team. The markers stick to the actress via kinesiology tape glued to the bottom of the wings, meaning no wires getting in the actress's way AND no wires for us to remove.
I was additionally in charge of data collection: making sure all of the data was uploaded to our collaborative space, which lets us access the footage anywhere on the SCAD campus and connect to the data remotely off campus via FileZilla. Uploading the footage and camera data, making the HDRIs, and taking BTS took about 13 hours in total across the two days. The footage came to 302 gigabytes across 53 takes, so it's safe to say we have plenty to work with!
1/9/26
Last test shoot today!!
The main purpose of this was to make sure our new actress felt comfortable on the XR stage, and to go over the plan for shooting on Monday and Tuesday.
I believe I also got the test footage looking color-correct, which means Olivia will be able to work her color magic on the overall footage down the pipeline! That's super exciting!
And a few BTS!
1/6/26
Hannah, our director, has made the map/pathway of where our fairy will be traveling to help with lighting! I'm super excited about this because it will inform where the moon needs to be for the entire shoot!
1/5/26
Today was the first day of the quarter! As a group we went over the plan, both overall and individually, to have the project done by week five of spring quarter (approx. 16 weeks from now)!
As our final shoot on the XR stage is next week, we are going to build the physical snow set needed to help integrate our live-action assets. We (the VFX students on the project) will be making it ourselves, since all of the film majors and film-adjacent people we recruited told us at the last minute that they are dropping the project, so it now falls on us! Some of us, myself included, have experience making set pieces and props, so this is not foreign territory, and I will have updates on that part of the project later this week!
Additionally, after more research I am redoing the wing rig and have decided on a more minimal approach so it doesn't impede the actress's movement and dress. Our original actress also dropped out, so we aren't 100% sure the new actress will fit into the dress, and thus the wing rig, which is another reason to remake it. These setbacks simply mean we can pivot to ideas we had on the back burner. Last thing! We have switched entirely from Unreal to Houdini, since everyone in the group has more experience with it, and the look Unreal was giving us was not what we wanted for the final piece. Houdini gives us more creative freedom over what the project will look like, without the restrictions of the engine.
Until we get to shooting, I am going to focus on making sure the matting I have set up so far in ComfyUI works, that all of the information we need on set gets documented, and that we can take the camera data from the XR stage into both Nuke and Houdini so everyone is using the same cameras!
Overall, today was very productive, and I'm excited to see what happens throughout the week!
12/18/25
Over the past few weeks, I have been testing ways to use ComfyUI to help with mattes!
To get the best result, I have to make two separate mattes from the ComfyUI output, as seen here.
One is of the character, and the other is of the snow prop.
Once I have both of those mattes, I am able to composite them together in Nuke.
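As a minimal sketch of that combine step in Nuke Python (the file paths are placeholders, and the real script has more cleanup between steps):

```python
# Sketch only: read the two ComfyUI mattes and union them with a max merge.
# In the actual script the result then feeds the plate's alpha via a Copy.
import nuke

char = nuke.nodes.Read(file="mattes/character.####.png")
snow = nuke.nodes.Read(file="mattes/snow_prop.####.png")

# a max merge keeps the union of the two mattes as one matte
union = nuke.nodes.Merge2(operation="max", inputs=[char, snow])
```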
The base footage along with the final (note: it is still a work in progress, but the proof of concept is what matters most).
The script in ComfyUI
11/12/25
The most up-to-date cut at the end of Fall Quarter.
11/7/25
Today we were supposed to do a test shoot on the XR stage, but unfortunately it was overbooked and we were only able to test how a blue screen with trackers would look. Luckily that is possible: the technicians used disguise to get the blue screen with trackers set up.
So we do have the ability to use the stage as a bluescreen, and even though we weren't able to shoot on the stage today, this was very helpful. We were able to reschedule for this Thursday, so we are very excited about that.
11/3/25
The callout for the construction of the wings! The concept art is by Hannah.
10/29/25
After a couple of days of extensive research (looking back on my notes from my talks with Gray Marshall, Michael Karp, Allie Sargent, and Molly Intersimone; reading The Technique of Special Effects Cinematography by Raymond Fielding; and consulting a few costuming friends, Kaicey Rhan and Bobby Chastain), I think a mix of a wired bone rig and physical stumps for tracking the wings is the best bet.
Bobby has graciously offered to help with the construction of the rig itself! I'm super excited to be collaborating with him on it.
10/27/25
Olivia and I blocked out the camera move for shot 2I.2. As it is our proof-of-concept shot, I was able to use the footage from the test shoot to do a quick composite with the camera movement from Unreal. Getting this shot roughly set up really gets me excited for our shoot in week 9!
Using the most up-to-date version of the Unreal environment from Mia, I helped Olivia and Charlie block out more of the greenscreen shots we plan to shoot, so our lighters have lighting reference on set.
10/23/25
I met today with Molly Intersimone, one of the amazing mentors I had during my Harbor internship, to get advice on how to shoot some of the more complex elements in the film.
She made a lot of really helpful notes; here are a few that we plan to implement while shooting:
- Transparent reference for consistency of the wings
- As many in-camera elements connecting the character to the background as possible, to keep the realism tied in
- A spine for the wing rig; rework the current plan and look at more ways wings have been done in the past, no need to reinvent the wheel
10/19/25
A test of using smart vectors and ST maps to get an idea of what we may be able to do in Nuke for makeup enhancements on our actress.
This is footage from the test shoot. The fact that this works is promising, because to matte out the face I am using one of the segmentation tools from Cattery: Face Parsing.
Since this works, it means that, as I mentioned previously, we will likely be able to use CopyCat to help with roto and with mapping out parts of the fairy, both to enhance the makeup and to remove her from the background on either the greenscreen stage or the XR stage, which is the part I am most worried about.
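For reference, this is roughly how a Cattery model can be dropped into a script through Nuke's Inference node; the .cat path below is a placeholder assumption for wherever Face Parsing lives on our machines.

```python
# Hedged sketch: load a Cattery model (here, Face Parsing) via the Inference
# node and feed it the plate. The model path is a placeholder.
import nuke

plate = nuke.nodes.Read(file="plates/test_shoot.####.exr")
face_matte = nuke.nodes.Inference(modelFile="Cattery/FaceParsing.cat",
                                  inputs=[plate])
```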
10/16/25
We have decided on the right tool to shoot the shots in a way that sets us up for success! We are going to use a Kessler slider that we have access to, and we can hook up whatever camera we get our hands on for the shoot. Its height is fully adjustable, so we can mount it on C-stands and move it in time with the actress to get the moving shots we are aiming for.
10/15/25
Over the quarter I have been in conversation with Gray Marshall, the chair of VFX at SCAD, asking his advice on how to shoot a few of the shots within the limitations we have as visual effects students.
First, as mentioned previously on this blog, I proposed we could use the motion control kit. The main reason to use motion control is for cases where the camera needs to make repeatable movements, such as multiple-exposure shots or, in what I thought would be our case, a running scene where the camera stays almost entirely parallel with the actress. After we went over the kit and talked more about the technical side of shooting visual effects plates, he recommended this book: The Technique of Special Effects Cinematography by Raymond Fielding. So many visual effects concepts have not changed since they were created, and this book is a testament to that. I haven't read as much of it as I'd like yet, but it's a very informative read.
Marshall also graciously reached out to an old colleague of his, Michael Karp, who was a motion control operator and is now mainly a matchmove artist. He gave great insight: sometimes using a rig like motion control can make a shot look out of place. He explained that for shots like the fairy running through the woods (2A and 2L), he would track the camera and then track the object, rather than controlling the camera's motion. By parenting the object track to the camera, any feeling of being out of place in world space is naturally nullified. After more research into how to approach these shots, Gray brought up the idea of using a slider rather than a motion control rig; it allows for consistent movement without all of the complexity that comes with setting up motion control.
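To make Karp's parenting trick concrete, here's a toy sketch of the math (not production code): the object is solved relative to the tracked camera, so any residual error in the camera solve is shared by both instead of fighting the plate.

```python
# Toy illustration of the parenting idea: express the object's solve in
# camera space, then rebuild its world transform from the camera per frame.
import numpy as np

def relative_to_camera(camera_world: np.ndarray, object_world: np.ndarray) -> np.ndarray:
    """Both arguments are 4x4 world matrices; returns the object's
    transform in the camera's space."""
    return np.linalg.inv(camera_world) @ object_world

# re-parented under the camera, the object's world transform is recovered
# each frame as camera_world @ relative, so the two drift together
```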
Luckily, Gracie and I are going to learn how to use the slider with Gray later today. I'm so excited that we have hopefully found the right tool for the project at last.
10/12/25
Because so many of the shots are going to require roto and keying to some degree, I have been looking into possibly training a CopyCat model of our actor to help with masking her out in various instances. I am not 100% sure how to go about it yet, so I have started researching; I have been using the Foundry's user guide as a jumping-off point.
10/9/25
Midterm review was super helpful! We got really constructive feedback on a couple of the shots that we may need to rework in the event we are unable to get certain equipment.
Here is an up-to-date edit with the changes:
10/7/25
The most up-to-date edit with the test footage!
*These are all super quick comps just to see how the footage looks. Additionally, Hannah here is not our actor; our actor was unable to be there for the entire test shoot day, but we will have her for the actual shoot and for the camera and costume test.
10/5/25
Reviewing the footage from yesterday made me realize that we really do need more lights to light the subject well, along with a better camera. We want to use the same camera that is on the XR stage (a RED Komodo 6K), but the issue we keep running into is that, as VFX students, we don't have access to rent the cameras. So we are in discussions with several film majors who are certified to use those specific cameras about working on the project with us.
10/4/25
Shoot day!
All things considered, the shoot was productive in terms of figuring out what we need to acquire for the next shoot and how to change shots and lighting plans.
Unfortunately, we ran into quite a few issues while preparing to shoot the test shots.
First, we didn't have access to the breaker to turn on the house lights for the greenscreen. When we did get them turned on, we realized one of the lights was blown out entirely, making it difficult to light the greenscreen evenly, so we had to use a small portion of the stage. Additionally, a few of the bulbs in one of the panels were also blown out, pictured below.
If we had been able to rent more lights from the Montgomery cage or use the film lighting kits, this might not have been as much of an issue. So renting more lights will be super important to get the shots we need properly lit.
10/2/25
Today Gracie, Hannah, and I met with our planned DP for this weekend's shoot, as well as for the rest of the shoots planned for the project. It was super informative, especially regarding the specific bounds we will need so the Unreal scene file works with the stage. My main concern with shooting on the XR stage is that, no matter how well lit the shots are, the background will still need to be removed in order to properly add the fairy wings to the talent.
I am going to have to research ways to make that inevitable background removal as painless as possible when we shoot on the stage in winter quarter.
10/1/25
Our test shoot is this weekend, so I have been spending most of my time on this project working on the prep material that will be needed, such as the lighting diagrams for each shot.
I bought a GODOX light that will be used for light interactions!!! I'm super excited to add it to my on-set VFX supervision kit!
Here's the spreadsheet with the information below, such as the call sheet, shoot schedule, lighting diagrams, kit checklist, and more! Feel free to check it out.
9/25/25
Continuing the development of the caustics tests, I now have access to Higx, a point renderer that takes Nuke's particle system to the next level.
The first two iterations, based on gravity and an animated driving force:
A test of one of the stills from the animatic, with the caustics and the effect masked.
9/17/25
We have been making a plan for shooting each shot on either the greenscreen or the XR stage. Each shot is different, so having a plan for which shots will be needed for comp and for full-CG assets is essential to the workflow.
We now have an Instagram for our project, so that other students we will need help from can get involved: @metamorphosis_vfxthesis
I have done a couple of tests with Nuke's particle system, feeling out how much I may be able to do in comp in terms of additional effects to save on render times down the pipeline. These are purely tests to see whether it's possible to create caustics and additional light effects in Nuke.
I used Nuke's particle system; the movement is driven by cos, sin, and tan waves.
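For flavor, here's the kind of wave-driven motion those expressions produce, written as plain Python rather than Nuke expression syntax (the constants are arbitrary placeholders):

```python
# Illustrative only: a per-frame offset built from sin/cos/tan waves,
# the same flavor of motion the Nuke expressions drive.
import math

def particle_offset(frame, speed=0.1, amp=0.5):
    t = frame * speed
    x = amp * math.sin(t)
    y = amp * math.cos(t * 0.7)
    # tan blows up periodically, which reads as an occasional dart; clamp it
    z = max(-amp, min(amp, 0.1 * math.tan(t * 0.3)))
    return x, y, z
```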
Here are the first initial tests behind an HDRI, for context on the transparency.
Tests of different sizes without the trails
Another test, smaller with trails
9/8/25
First class! Super exciting to finally be starting 408 after quite a bit of prep and anticipation!
Going off of the notes from today, I have come across a few color scripts that were used for the Tinker Bell movies that may be helpful for the team going forward.
9/5/25
For any physical props the character is going to interact with, Gaussian splatting may be the way to go for integrating the CG together. This article, Gaussian Splatting in Superman, talks about how Superman used Gaussian splatting to get more accurate depth. Although we don't have the same resources as the team that made that happen, I believe we can simplify the approach to make the depth pass work for our needs.
When researching projects by visual effects students, I found that a common weakness is a lack of camera movement. To fix that, I believe using the motion control rig on the greenscreen/bluescreen shots will help. I am not sure whether the XR stage has motion control capabilities, but if it does, that would help too. Additionally, if we can take the camera information from the XR stage shots and import it into Unreal and Nuke while keeping the correct proportions, that would make the effects integration physically accurate and give us a proper matchmove!
8/16/25
I have had the privilege of interning at Harbor over the summer, so I asked a few questions pertaining to the project, based on our constraints and the direction we are heading.
Since everyone on the core team is a visual effects artist, we will need to branch out and bring in people from other departments where we aren't as strong, such as a camera operator and a director of photography. If we can have the backgrounds done for all of the shots before we shoot, not just the ones on the XR stage, it will help the DP (director of photography) light the actress better, allowing for a seamless camera match.
7/30/25
As my portion of the project cannot start until filming begins, I have been doing a lot of research on the best ways to shoot the plates, whether on the XR stage or against a greenscreen/bluescreen background, depending on the final lookdev.
The main thing I wanted to look into is integrating the material on the fairy with the effects that will be taking place around her. To do that, I think we are going to need the costume she will be wearing as a 3D asset, to allow accurate integration of both lighting and texturing.
7/5/25
For my senior project, the theme right now is some sort of dark fairy fantasy world. I will be in charge of compositing and helping conduct the on-set supervision needed for the live-action elements.