Direct from the desk of the CEO: Epic Games acquires RealityCapture – This is HUGE!

HUGE NEWS in the 3D world! Capturing Reality, the Bratislava-based company behind RealityCapture, a leading photogrammetry tool, has been acquired by Epic Games, one of the world's largest video game companies and the maker of Unreal Engine, one of the leading 3D engines used in video games today. What does this mean, and what's going to happen? That's always the question. Who are these companies, and what is this software all about, anyway? Let's dive in a little bit.

What is Photogrammetry Software?

Before we step into the worlds of RealityCapture and Epic Games, let's take one step back and look at the software involved, to get a better understanding of what we're in for. Basically, photogrammetry software lets you take a bunch of photos (and/or laser scans) of an object or place and process them into an actual 3D shape: either a dense point cloud, or a solid mesh made of triangles, colored (textured) by the original photos. As you might guess, this is a pretty computationally intensive process, and in terms of input data the sky's the limit: you might use thousands or tens of thousands of source photos to create, say, an aerial 3D model of a city.

Over the years, photogrammetry has gone from a niche affair requiring a huge workstation to make even a humble model from thousands of photos, to a fairly fast process that lets a regular person with a gaming-grade GPU (more on that later) turn a thousand photos into an extremely high-poly (lots of polygons), very detailed 3D file of an object or environment. There are a few competing solutions in the photogrammetry software market; for reference on the capture side, check out the open-source packages VisualSFM, COLMAP, and Meshroom, as well as the online resources of whatever software you are using. There are lots of tips in archaeology blogs as well.
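To make the "computationally intensive" point above concrete, here is a back-of-the-envelope sketch of why input size matters so much: a naive structure-from-motion pipeline has to consider matching features between every pair of images, and the number of pairs grows quadratically. (`candidate_pairs` is an illustrative helper, not part of any photogrammetry package; real tools offer smarter matching strategies than the exhaustive one counted here.)

```python
def candidate_pairs(num_photos: int) -> int:
    """Number of image pairs an exhaustive matcher would consider."""
    return num_photos * (num_photos - 1) // 2

for n in (100, 1_000, 10_000):
    print(f"{n:>6} photos -> {candidate_pairs(n):>13,} candidate pairs")
```

At 10,000 photos that is nearly 50 million candidate pairs, which is why large datasets push even strong GPUs hard.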
The same rules of thumb apply to both drone and handheld cameras. With a drone you can capture large, wide areas, though, which is really cool (a city block, say). The 3D reproduction quality roughly matches what you provide in the photos, meaning that if you view your model from a relative camera distance similar to the source images, or further away, it will look good.

For a high-resolution scan you likely want a hierarchical set of pictures: first a roundabout of, say, 32 pics, then a similar turn around the details you want to model clearly, like columns, stairs, and whatever protrudes from or recedes into the house facade. The overall rule of thumb is that when you move, consecutive pics should have 80% overlap.

Prefer a bright overcast day; it's way better than direct sunlight. I usually notice only after taking the photos that I missed an angle and some details I wanted to see are not there. Since you took the images on an overcast day, you can just go back and take the few missing pics.

For full-on neurotic mode, put the camera on manual, take RAW images, and keep your aperture, shutter speed, and ISO constant throughout the shoot. Even though you have a drone, you still likely want to take photos of ground details by hand (unless you have a really good camera on the drone; my 12 MP is noticeably worse than my 24 MP). You want all the pixels you can get, since that helps with alignment, as long as they are NOT mushed by poor JPEG compression, like my Mavic does; hence RAW.

The DICE Star Wars photogrammetry presentation, "Photogrammetry and Star Wars Battlefront", is pretty good:
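The 80% overlap rule of thumb above can be turned into a rough spacing estimate. This is a minimal sketch, assuming a camera pointing straight down at flat ground with a known horizontal field of view; the function name and parameters are made up for illustration, not taken from any tool:

```python
import math

def max_step_between_shots(altitude_m: float,
                           hfov_deg: float,
                           overlap: float = 0.8) -> float:
    """Largest step between consecutive shots that still keeps the
    requested overlap, for a nadir-pointing camera over flat ground.

    altitude_m: height above the ground in meters
    hfov_deg:   horizontal field of view of the camera in degrees
    overlap:    fraction of the footprint shared by consecutive
                images (0.8 = the 80% rule of thumb)
    """
    # Ground footprint width covered by a single image.
    footprint = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    # Only the non-overlapping fraction of the footprint is new ground.
    return footprint * (1.0 - overlap)

# e.g. flying at 50 m with a 70-degree horizontal field of view:
step = max_step_between_shots(50.0, 70.0)
print(f"move at most {step:.1f} m between shots")
```

The same arithmetic works for a handheld walk-around: a wider lens or a greater distance to the subject gives a bigger footprint, and you always keep the new ground per shot to about one fifth of it.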