R&D New York Times (LiDAR?)
oshioyi
Have a look at this site https://rd.nytimes.com/projects/reconstructing-journalistic-scenes-in-3d on a desktop browser. I'm wondering what they used to capture the photogrammetry. It feels more 3D and immersive than Matterport etc. It seems like a mixture of LiDAR plus photogrammetry? The web embed works great as well, with a pre-set pathed walkthrough that plays as you scroll. A client would like to capture their heritage room and would like the capture data to contain both a 3D model and textures, and we are not sure whether the iPhone 12 Pro + iPad Pro LiDAR can do this. Something like https://www.youtube.com/watch?v=KYXIIB_JlYU&ab_channel=laanlabs
Post 1
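For anyone wanting to reproduce that pre-set pathed walkthrough on their own site: the usual approach is to map the page's scroll position to a camera position along a curve through the 3D scene. Below is a minimal TypeScript/Three.js sketch of that idea, assuming the room has already been captured as a textured glTF model. The model path and the waypoint coordinates are placeholders; the NYT's actual implementation isn't public in this thread.

```typescript
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xffffff, 1));
const camera = new THREE.PerspectiveCamera(
  60, window.innerWidth / window.innerHeight, 0.1, 100,
);

// Load the textured scan of the room (hypothetical file path).
new GLTFLoader().load("/models/heritage_room.glb", (gltf) => {
  scene.add(gltf.scene);
  render();
});

// The pre-set walkthrough path, as a smooth curve through the room
// (placeholder waypoints at roughly eye height).
const path = new THREE.CatmullRomCurve3([
  new THREE.Vector3(0, 1.6, 5),
  new THREE.Vector3(2, 1.6, 2),
  new THREE.Vector3(0, 1.6, -1),
  new THREE.Vector3(-2, 1.8, -4),
]);

// Map the page's scroll position to a 0..1 progress value.
function scrollProgress(): number {
  const max = document.documentElement.scrollHeight - window.innerHeight;
  return max > 0 ? Math.min(window.scrollY / max, 1) : 0;
}

function render(): void {
  const t = scrollProgress();
  camera.position.copy(path.getPointAt(t));
  // Aim slightly ahead along the curve so the camera faces its direction of travel.
  camera.lookAt(path.getPointAt(Math.min(t + 0.01, 1)));
  renderer.render(scene, camera);
}

window.addEventListener("scroll", () => requestAnimationFrame(render));
render();
```

The page itself just needs enough scrollable height (e.g. a tall empty container) for the scroll fraction to drive the walkthrough; story text panels can then be positioned over the canvas, NYT-style.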
Tosolini (Tosolini Productions, Bellevue, Washington)
This is such a beautiful, interactive, yet linear piece of storytelling. Thanks for sharing it. I'm guessing that for the photogrammetry part they used a DSLR camera and software like Agisoft PhotoScan (since renamed Metashape) or RealityCapture. The quality is definitely superior to what you can attain with the Apple LiDAR. I did some tests with various iPad Pro apps and the results are promising, but not photogrammetry quality. Still, for something like this, the LiDAR may be good enough. The second component of the NYTimes story is the scrolling animation, along the lines of what Apple does on its AirPods Pro page. This is a cool technique that tools like Webflow seem to support, but from what I read, it's fairly advanced to build by hand.
Post 2
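The AirPods Pro-style effect is commonly built by exporting the animation as a numbered image sequence and scrubbing through the frames on a canvas as the user scrolls, which is essentially the same scroll-to-progress mapping as the walkthrough sketch above, just driving pre-rendered frames instead of a live camera. A minimal TypeScript sketch, where the frame count and URL pattern are hypothetical placeholders:

```typescript
// Scroll-scrubbed image sequence, AirPods Pro style.
const FRAME_COUNT = 120;
const frameUrl = (i: number) =>
  `/frames/room_${String(i).padStart(4, "0")}.jpg`;

const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

// Preload all frames up front so scrubbing never waits on the network.
const frames: HTMLImageElement[] = [];
for (let i = 0; i < FRAME_COUNT; i++) {
  const img = new Image();
  img.src = frameUrl(i);
  frames.push(img);
}

function drawFrame(): void {
  const max = document.documentElement.scrollHeight - window.innerHeight;
  const t = max > 0 ? Math.min(window.scrollY / max, 1) : 0;
  // Pick the frame that corresponds to the current scroll fraction.
  const index = Math.min(FRAME_COUNT - 1, Math.floor(t * FRAME_COUNT));
  const img = frames[index];
  if (img.complete) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
  }
}

window.addEventListener("load", drawFrame);
window.addEventListener("scroll", () => requestAnimationFrame(drawFrame));
```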
This topic is archived.