Testing Apple iPad Pro (2020)
Wingman (WGAN Fan Club Member, Queensland, Australia)
Hi guys, I got it yesterday and spent all day trying to restore all my scans and the Capture app from iCloud. Unfortunately it was the last app/data to restore, at over 70 GB in size, and with my internet going up and down it took all day. By the end of the day it was finished, so I ran a little test. I do not think it says much, as I have tried only one scan, but here are some details.

On the iPad Pro it took 20 seconds from the moment I pressed scan to the moment I could start the next one. Then I repeated the same scan with my iPad 5th gen: it took 30 seconds, even though it looked as if it would finish in the same 20 seconds. The message about moving the camera appeared at about the same time as it did on the iPad Pro, but it took roughly 10 seconds longer before it would let me scan again.

I would not call this a valid test; it would be more meaningful to scan one place with several scan points on each iPad and compare the total times. I am going to use the iPad Pro on a job tomorrow and hope to get more data on how fast it really is.

As for its LiDAR, the apps that can use it are very limited so far. Canvas by Occipital does not support the iPad Pro (2020) for now, but they seem to be working on it: https://support.canvas.io/article/31-what-device-should-i-use-with-canvas
Post 1
lilnitsch (WGAN Fan Club Member, Coeur d'Alene, Idaho)
Matterport's beta Capture app program was starting to test its LiDAR capability.
Post 2
photosintheround
Do the new Pro versions have better WiFi connectivity to the Matterport and/or 360 cameras compared to the Air or regular iPad versions? (Assuming all latest generations.)
Post 3
Home3D (WGAN Standard Member, Los Angeles)
I've only begun to use my iPad Pro 2020 with LiDAR, and haven't used the LiDAR for anything yet. Excited to see what developers employ it for. I would imagine LiDAR would be very useful for apps such as CubiCasa if they update their software to use it. Most interesting is an app called Shapr3D, which has uses both with and without LiDAR. It's an intriguingly simple 3D CAD application. There's a free version which I believe runs on any fairly recent iPad. Play with it. I'd love to hear what others smarter than me think. Count me in for any sharing regarding iPad Pro 2020 LiDAR use.
Post 4
Wingman (WGAN Fan Club Member, Queensland, Australia)
Just tested it in my garage and the adjoining rumpus room, 12 scan points in total. On the iPad Pro 2020 all 12 scans were done in 7 minutes; the iPad 5th gen took 7:30. On the iPad Pro, a scan takes 20 seconds from pressing the button until you can press it again; on the iPad 5th gen it takes 32 seconds.

The only way to get the full benefit of that speed is to start walking towards the camera and move it to the next spot as soon as it stops spinning. You have about 4-6 seconds to reach the camera, move it to another spot and press 3D Scan; with the iPad 5th gen you have about 20 seconds to do all that. If you cannot move that fast, it is not going to scan much faster overall. Even so, saving 30 seconds on 12 scan points out of about 7 minutes is roughly 7% faster.

Other things: deleting scans is faster on the iPad Pro, and aligning seems to be faster too. Most new scans aligned either right before the 3D Scan button became active or 1-2 seconds after it; on the iPad 5th gen, aligning usually happens while you are scanning your next point. I have not tried many trim lines while scanning, but that will be tested on a real job.
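A quick sanity check on those numbers (just re-doing the arithmetic from the figures above): the per-spin gain is large, but the end-to-end gain is small because moving the camera dominates the cycle time.

(32 s - 20 s) / 32 s = 37.5% faster per spin
(450 s - 420 s) / 450 s ≈ 6.7%, i.e. about 7% faster overall (7:30 vs 7:00 for 12 scans)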
Post 5
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by lilnitsch

Do you know if it is available anywhere for me to download? Because if Matterport really makes it work as an addition to scanning with their Matterport camera, we could just use the iPad's LiDAR sensor to fill gaps in the mesh and avoid doing extra scans just for that. It may also fix the problem of scanning outside in full sunlight.
Post 6
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by photosintheround

I have not tested it yet, but it is not a problem for me with my iPad 5th gen. It works around corners as long as I am no more than 5-7 metres away and right behind the corner.
Post 7
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by Home3D

Yes, I have downloaded it but have not tried it yet. There are only 2-3 apps I could find, and so far I have tested only one, LiDAR Scanner. It does scan a room, but it is very limited in any other functionality. It works indoors but does not do much outdoors, even with the sun behind the house.
Post 8
Wingman (WGAN Fan Club Member, Queensland, Australia)
Actually, I was wrong about the sun: it works just fine in sunlight. I just do not have a suitable case for my iPad Pro and am using one made for the 2018 model. It has a small window for the camera, and on the iPad Pro 2020 that window covers the LiDAR sensor. When I was trying to check whether it works outdoors, I forgot the case was covering the LiDAR. After I took the iPad out of the case I could scan all the outdoor areas around my house. It worked even with the LiDAR pointed exactly at the sun, and it still collected 3D data.
Post 9
zebrajack
In fact, the iPad Pro LiDAR uses a ToF (time-of-flight) sensor, which has a larger wavelength range than the structured-light sensor adopted by Matterport. That means it is less affected by sunlight. Conventionally, though, its resolution and depth accuracy are not comparable to a structured-light sensor.
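For context on the time-of-flight principle mentioned above (a back-of-the-envelope figure, not a measured spec): the sensor converts a light pulse's round-trip travel time into distance, so the timing involved is on the nanosecond scale.

d = c × Δt / 2; for d = 5 m, Δt = 2 × 5 m / (3 × 10^8 m/s) ≈ 33 ns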
Post 10
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by zebrajack

I am not an expert in LiDAR types, but I guess it can still help: if we scan outdoors with the iPad Pro LiDAR and get a mesh for the outdoor area created by the iPad sensor, we should be able to add scans to it with a Matterport camera.
Post 11
lilnitsch (WGAN Fan Club Member, Coeur d'Alene, Idaho)
@Wingman Private message sent.
Post 12
Wingman (WGAN Fan Club Member, Queensland, Australia)
Bad luck for me today: I dropped it and its screen is now covered with tiny cracks. It still works just fine, but it is a bit dangerous to move a finger over the cracks as it keeps catching tiny pieces of glass. The good news is that I purchased it with AppleCare, so the only real inconvenience is waiting until it is repaired or replaced. If I had a proper rugged case it would probably have survived without cracking, but it is hard to get any case for the 2020 Pro model. I have been using one for the 2018 Pro, and even though it fits the 2020, it protects very little.
Post 13
lilnitsch (WGAN Fan Club Member, Coeur d'Alene, Idaho)
I rock a LapWorks case for my iPad Pro; it was the only case I found with both the notch for charging the Pencil and lanyard support. https://amzn.to/2YegZV1
Post 14
Wingman (WGAN Fan Club Member, Queensland, Australia)
I purchased this one today: https://www.amazon.com.au/gp/product/B0872F58TL. The Apple repair guys offered me the same case for $74 AUD, but I found it on Amazon Australia for $43. I can use my iPad Pro until the replacement comes from Apple; they said it can take up to 5 business days. I have just put a clear film on its screen to protect my fingers from any cuts. BTW, before I dropped it I was scanning an empty house and had about 50 scans done in 26 minutes. Pretty impressive speed. It can be even faster if you stay with the camera and walk around it while it spins.
Post 15
WallsCouldTalk
I scanned a home Thursday using the 2020 iPad Pro 11" 1TB. This is an upgrade from a two-year-old 6th gen that had run out of memory. I ordered this case to keep it in until Joy Factory releases a 2020 Pro model: Spigen Tough Armor Pro, designed for iPad Pro 11 (2020/2018), Gunmetal. The slot for the Pencil comes in handy as a place to connect my lanyard, allowing me to move around without juggling the iPad. I will order the Pencil soon and see if there are any efficiency improvements compared to the touch-screen ink pens I source from my bank's drive-through window.

I will be glad to get the Joy Factory case when it is released. The adjustable hand strap really improves the ergonomics and reduces hand fatigue with my carpal tunnel. The 2020 Pro is slightly bigger than the 6th gen, so I will have to modify the foam cutout in the Matterport case to accommodate it, but I will wait until the Joy Factory case is released before I start chopping up the foam.

As for the 2020 Pro, processing is MUCH faster than on my 6th gen. I will have to work on changing some workflow habits to really capitalize on the processing speed. I had grown accustomed to having time to trim between scans. It may eliminate the game of "jack in the box" I play waiting for the alignment to finish; you know, the one where you drag the trim line all the way across the model when it aligns faster than expected and repositions the view. Anyway, it barely gives me enough time to get the camera moved and leveled for the next spin. I'm certain there is an opportunity for efficiency improvements here, I just have to figure out the timing.

I've played with a few of the LiDAR apps, but none of them seem to offer any improvement to my work. They were fun to play with, though. I look forward to seeing what apps are developed to harness the LiDAR capabilities. Another hardware change I was pleasantly surprised with is the charging port: FINALLY I can use the same cable to charge my Android phone and the iPad! I'll update if I encounter any game changers.
Post 16
GFHoge
Thank you all for your excellent comments. I am not familiar with the LiDAR camera; can you help me understand it? Are you able to add virtual furniture to a 360 photo? How does it work with the MP2? (I am trying to justify the business expense.) Thanks, Geof
Post 17
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by GFHoge

LiDAR is what is used in Matterport cameras to collect the 3D data for the mesh. It emits laser pulses and measures the distance from the camera to any surface the laser beam bounces off. The iPad Pro 2020, and I believe some new iPhones, come with a type of LiDAR built in.

So far there is only one app I have found that can use the iPad/iPhone LiDAR: https://apps.apple.com/us/app/lidar-scanner-3d/id1504307090. Every other app I have tried does not seem to use the LiDAR at all; even when they mention LiDAR or 3D scanning, it looks like they are using photogrammetry.

The closest one with extended functionality is probably coming from Occipital, who are known for the Structure Sensor for iPads; I believe their sensor may be used in GeoCV's tech. I cannot find the page now, but a week ago I found a mention on their website that they are working on utilising the built-in LiDAR sensor. I subscribed to their updates and will let you know once I hear any news from them.

I would be more interested to see Matterport develop something to add the iPad LiDAR to the scanning workflow. The good thing about the built-in LiDAR is that it does not seem to care much about sunlight, while the LiDAR installed in any Matterport camera simply does not work in sunlight.
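As a rough illustration of what "an app that can use the LiDAR" has to do: here is a minimal sketch of opting into the 2020 iPad Pro's LiDAR mesh via ARKit 3.5's scene-reconstruction API. Only the ARKit types and calls are standard; the class name and delegate wiring are illustrative, and a real capture app would do far more with the mesh data.

import ARKit

// Minimal sketch: opting into the LiDAR-built mesh with ARKit 3.5 scene reconstruction.
// The class name is illustrative; only the ARKit types and calls are standard API.
final class LidarMeshSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene reconstruction is only offered on LiDAR-equipped devices,
        // so gate on the capability check rather than on the device model.
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
            print("No LiDAR scene reconstruction on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh   // ask ARKit to build a triangle mesh from the LiDAR
        session.delegate = self
        session.run(config)
    }

    // ARKit delivers the growing mesh as ARMeshAnchor updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("Mesh anchor updated, \(meshAnchor.geometry.vertices.count) vertices")
        }
    }
}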
Post 18
Home3D (WGAN Standard Member, Los Angeles)
I second the suggestion from @lilnitsch. I've used LapWorks hand-held iPad Pro cases for 4 years and they're wonderful.

@Wingman, permit me to clarify what I believe you accidentally misstated. Both the Matterport Pro 1 and Pro 2 cameras DO NOT have LiDAR. They have IR (infrared) scanners, which is why they cannot gather point-cloud depth data in direct sunlight. Sunlight is packed with infrared, so the Matterport IR scanner is "blind" in direct sunlight. This is why, if direct sunlight comes through a window and creates a bright rectangle on the floor, it will render in the dollhouse view as a "hole" in the floor. If the floor is very dark carpeting or wood, sometimes the dark color absorbs enough of the IR light that the point cloud DOES work, but go outside in the sun and the Pro cameras really won't work at all.

By contrast, the new 2020 iPad Pro has a LiDAR scanner, and this DOES work in direct sunlight. However, it captures (at least currently) a very coarse triangular mesh with less resolution than the Matterport Pro cameras, so I'm not expecting it to capture a Matterport level of scan detail. But I'm no tech genius, so perhaps this is a combination of hardware and software limitations that may improve with upgrades. We shall see what develops.

Yes, the original Occipital Structure Sensor was the IR device used by GeoCV. Take a look at the photo of my GeoCV camera beside the Occipital Structure Sensor mounted on its iPad Pro bracket: identical except for the exterior color. I'm a dreamer with great admiration for the GeoCV team, now all working at Occipital. Let's all wish them safety and good health. This is a team to watch.
Post 19
DanSmigrod (WGAN Forum Founder & WGAN-TV Podcast Host, Atlanta, Georgia)
@Wingman Thank you for starting this discussion and testing of the Apple iPad Pro (2020). Ugh! Wish it was better news about the screen. @lilnitsch @photosintheround @Home3D @zebrajack @WallsCouldTalk @GFHoge Thanks for all your great questions and insights too.

You probably saw this, but just in case:

✓ Video: iPhone 12 Leak Shows LiDAR Scanning Cameras Lenses

I could imagine that once the iPhone 12 comes out, companies, including Matterport, will create apps, in conjunction with rotators, for capturing spaces.

Stay healthy,

Dan
Post 20
DanSmigrod (WGAN Forum Founder & WGAN-TV Podcast Host, Atlanta, Georgia)
vGIS (23 April 2020): 2020 iPad Pro: Does the LiDAR sensor improve spatial tracking?
Post 21
Wingman (WGAN Fan Club Member, Queensland, Australia)
This is not just iPad Pro 2020 related, and I believe you can use it even on an Android tablet/phone. I found these guys (ViewAR), downloaded their ViewAR SDK app from the App Store and played with staging an actual live video view on the iPad. It does not seem to use LiDAR; it works with the device camera, finding a floor plane in your live video and letting you put virtual furniture on it. So basically it is a 3D AR overlay for an empty space that you can stage and take a 2D shot of from any angle. That is not all it does, and they have some other functions, but this is the most interesting one. Another function is to trace the room walls, which should build a computer copy of your real room with no furniture and then let you stage it.

Here is their website: https://www.viewar.com/. Or just find the app called "ViewAR SDK" in the App Store and enter "furniture.live.demo" as the app ID, and you can play with it. Do not try to use the "furniture.art.demo" ID as their manual says; it does not work.
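For what it's worth, the floor-finding step such staging apps rely on is typically generic ARKit plane detection rather than anything LiDAR-specific. A minimal sketch of that mechanism (plain ARKit, not ViewAR's actual SDK; the class name is made up for illustration):

import ARKit

// Minimal sketch of the generic floor-finding step that AR staging apps build on.
// This is plain ARKit plane detection, not ViewAR's SDK; the class name is illustrative.
final class FloorPlaneFinder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]   // look for horizontal surfaces such as floors
        session.delegate = self
        session.run(config)
    }

    // Detected surfaces arrive as ARPlaneAnchor objects; a staging app would
    // anchor its virtual furniture models relative to one of these planes.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors where plane.alignment == .horizontal {
            print("Found a floor-like plane, extent: \(plane.extent)")
        }
    }
}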
Post 22
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by Home3D

As I mentioned before, I am not an expert in LiDAR, but I know a bit about the light spectrum; that comes from an astronomy background and from operating laser cutters for over 12 years. According to the quote below, the lasers used in LiDARs are in the IR part of the spectrum, so LiDARs are infrared devices too.

Quote: Originally Posted by Velodyne
https://www.inverse.com/innovation/apple-ipad-pro-lidar

I believe Dan or somebody else posted an article on WGAN saying the laser used in the iPad Pro LiDAR is 1060 nm, so it is in the IR spectrum as well. So the big question, apart from the method used to calculate distances in Matterport cameras, is what wavelength their lasers use and how intense they are. I can only say it is not in the visible spectrum, because otherwise we would see it, but it is probably in a region close to visible light, where the sun's radiation at those same infrared wavelengths is most intense. Look at this chart: https://en.wikipedia.org/wiki/Sunlight#/media/File:Solar_spectrum_en.svg

905 nm is a sweet spot at the beginning of the IR spectrum where the sun is not so intense, and even though the 1550 nm wavelength is also occupied by the sun, it is on the less intense side of the chart. The 1060 nm wavelength is not quite such a sweet spot, but it is still better than most of what lies to its left, apart from a few specific wavelengths where there is almost no sun interference. Plus, I am sure the lasers used in commercial LiDARs, and probably in the iPad, may be intense enough to beat the sun, while the one used in Matterport cameras could sit at one of these three wavelengths but be too weak to beat the sunlight occupying the same wavelength. Do you know the wavelength and power of the IR in Matterport cameras?
Post 23
Wingman (WGAN Fan Club Member, Queensland, Australia)
BTW, if we know their wavelength I can see whether I can measure the intensity with my laser power meter. I have a good and expensive one for laser cutters that can measure up to 500 W across the wavelengths used in cutters, even though those wavelengths are at the far end of the infrared spectrum. I think Matterport simply did not care about the limited ability to scan outdoors in the sun and built their system for indoor use only. I do believe they could have done better using, for example, 905 or 1060 nm, as those obviously work in the iPad and in some commercial LiDAR systems. BTW, I believe you are using GeoCV; how do Occipital sensors deal with sunlight?
Post 24
Home3D (WGAN Standard Member, Los Angeles)
From my experience shooting GeoCV, which uses the original Structure Sensor from Occipital, it has worked, and not worked, in sunlight about the same as the Matterport Pro camera, so I presume the IR wavelength is about the same. This talk of wavelengths is way beyond my pay scale, but that has been my experience.
Post 25
zebrajack
@Wingman Why bother knowing the specific wavelength? ToF is sun-resistant by default. Rather than the wavelength, I am more concerned about the density and accuracy of the depth map. It looks like the projected spot pattern is much sparser than the IR pattern. Creating rough geometry may not be a problem, but more delicate detail looks difficult to achieve with just a simple depth-interpolation algorithm.
Post 26
Wingman (WGAN Fan Club Member, Queensland, Australia)
Because there may be a way to block the part of the sunlight spectrum that affects the performance of the IR sensor in the camera. I know many have tried ND filters, but it needs to be precise, not just blocking a wide range of wavelengths. Optics these days are quite precise and state of the art.

I have a laser cutter system that used to combine two 140 W laser tubes through a Brewster window, giving about 278 W in total. The window was state of the art, with a coating that let one beam pass straight through on one side while reflecting the other beam on the other side, lining them up into a single beam; both beams were, of course, the same wavelength. I still have the window, but due to my negligence with the water level in the chiller I burned out one tube. Later I replaced both tubes with a special combined tube that still has two laser sources inside but combines them within the tube.

Another example is from astronomy, where you can block light pollution across the spectrum yet let the light from stars and the hydrogen emission of nebulae pass through and be captured. Here are more details on astronomy filters: https://astronomy.com/-/media/import/files/pdf/8/c/7/0805_nebula_filters.pdf
Post 27
Wingman (WGAN Fan Club Member, Queensland, Australia)
Quote: Originally Posted by Home3D

It must be their version called Mark I. They do say it does not work in sunlight, but they say the Mark II does. Anyway, you are better off with the GeoCV system since it is modular; if GeoCV were still around, I believe they could have released an update to swap the Mark I for the Mark II.
Post 28
This topic is archived.