Experiments with photogrammetry

August 2022

I experimented with different photogrammetry methods to digitize my roommate, my dorm, and other objects for a VR environment. The idea was simple: I thought it'd be fun to show people what it feels like to be in my dorm from anywhere. I learned that LiDAR scans with the Polycam app are quick and dirty; to get much higher-resolution models, I had to take hundreds of pictures and render them out on my Mac using the Object Capture API. When I tried this on my own body, hoping to rig it into a digital avatar for VRChat or Bonelab, I learned that the reconstruction starts to break down when the subject gets too large, so I split the scan into separate body parts: head, torso, feet, left arm, and right arm. I did this before I learned how to use Blender; I have since learned Blender, but have yet to put my body parts back together. I later explored this idea much further when I joined the EmpathyBytes VIP and used photogrammetry to capture Georgia Tech artifacts to preserve in a VR museum.
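For the curious, the "render hundreds of photos on my Mac" step can be driven from a small command-line Swift program using RealityKit's PhotogrammetrySession. This is a hedged sketch, not my exact script: the folder and output paths are placeholders, and it assumes macOS 12+ with a supported GPU.

```swift
import Foundation
import RealityKit

// Placeholder paths: a folder of overlapping photos in, a USDZ model out.
let photosFolder = URL(fileURLWithPath: "Photos/LeftArm", isDirectory: true)
let modelURL = URL(fileURLWithPath: "Models/leftArm.usdz")

var config = PhotogrammetrySession.Configuration()
config.featureSensitivity = .high  // helps on low-texture subjects like skin

let session = try PhotogrammetrySession(input: photosFolder, configuration: config)
try session.process(requests: [.modelFile(url: modelURL, detail: .full)])

// Stream progress until the reconstruction finishes or fails.
for try await output in session.outputs {
    switch output {
    case .requestProgress(_, let fraction):
        print("Progress: \(Int(fraction * 100))%")
    case .requestComplete:
        print("Saved \(modelURL.path)")
    case .requestError(_, let error):
        print("Reconstruction failed: \(error)")
    default:
        break
    }
}
```

Splitting a scan into body parts, as described above, just means running a session like this once per photo folder (head, torso, feet, each arm) and assembling the resulting models later in Blender.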

An AR model of my roommate standing next to my real roommate IRL.

Me lying still on the floor in a room with good lighting so my roommate can capture my left arm with photogrammetry.

A screenshot of the Unity project including photogrammetry of my dorm and my roommate. I thought it would be funny to replace the skybox with the texture generated from the 3D model of my roommate.

A screenshot from within the Polycam app showing the locations of all the photos I took and the generated model.

This was my first failed attempt to capture my whole body using hundreds of photos. As you can see, the subject (me) was too complex, so the reconstruction got confused: it tried to include the whole room while excluding my head and right arm.