Front Porches & Soffits Get No Love

So, this was my first hack at a residential home and property 3D rendering, a product type for which we’re currently assessing local Realtor listing demand. Yes, people are reluctant to go tour homes on the market due to the COVIDS, and this seems like a pretty legit workaround. The problem with… one of the many problems with being located so close to our nation’s capital is the localized housing demand bubble that exists in perpetuity anywhere within 60 miles of it. It’s not a bad thing, but homes now spending an average of 3 1/2 days on the market doesn’t really create the demand for these 3D maps that we’re seeing in other, far less corrupt areas… Did I say that out loud? Sorry.

That said, with the self-diagnosed OCD which I blame all my problems on, whether or not I will ever make another one of these really cool renderings is of no consequence. Why? Because I can’t get it to be perfect, and, well, I don’t want to get all Howard Hughes on everyone. Suffice it to say, because of my made-up condition, there is without a doubt a better probability that the government of the Democratic People’s Republic of Korea and the government of the Republic of Korea will decide to reunite tomorrow night over a bottle of soju than there is that I’ll be able to just accept this underachievement, move on, and fight another day. So, here we are.

Alright, this being my first rodeo for this type of project, I’m fully cognizant that the issues I’m having are directly related to one or more of the not-so-established workflow steps we stumbled through while doing our best to look like we OBVIOUSLY knew what we were doing. In the spirit of taking my medicine and my beatings, I will outline the steps we took initially, then the steps we took to mitigate the issues, and then the steps we took to mitigate them yet again. Not once, not twice, but thrice!

I’ll drop the SketchFab link to this project at this point, to allow those interested in assisting to see what I will attempt to describe with my words, in my indoor voice:

Remember, first try, be nice, and I won’t start crying. So, the issue: on the initial day we scanned the property (DJI Inspire 1 v2.0 with Zenmuse X3, old, but still a workhorse), we committed what I have to assume is a common rookie mistake. We didn’t ensure that we captured enough (any) imagery of the structure that was hidden by the porch roofs and the gutter soffits. Well, we figured out that boo-boo pretty quickly after we crunched all the data through Pix4D Mapper and realized our mistake. Here is the step-by-step, way-too-complicated workflow which should NOT be duplicated:

DAY 1 (Weather: Bright overcast, diffused sunlight with minimal shadows)

  1. We conducted what was essentially a training mission on the property of a friend, choosing it for the notable absence of mature trees in the vicinity of the home.
  2. We began with a process our crew is well versed in: a double-grid flight plan with Pix4D Capture (PLEASE MAKE A CLOUD-BASED INTERFACE!!!).
  3. Based on the size of the overall property, as well as the training nature of the mission, we elected to conduct the initial grid flight at 200’ AGL.
  4. The initial flight launch, collection, and termination were uneventful. A quick audit of the collected imagery showed a nearly perfect data set.
  5. With the mapping portion complete, we elected to use the DJI Go POI flight function to conduct a total of 3 full circuits around the actual house, each flown 15’ higher than the last.
  6. As evident in the model, the property is not free of collision risks in the form of bushes, fences, old growth trees, and other obstacles.
  7. We selected an initial altitude of 30’ AGL, configuring a POI circuit holding a 55’ radius from our estimated house center point.
  8. The camera was configured to collect images at a 2-second interval, which, based on our slower-than-normal speed, we assumed would be more than adequate (a rough overlap check is sketched after this list).
  9. During the first circuit, we realized that the 55’ radius at that altitude would be unable to clear an obstacle we had not judged accurately. With the aircraft now hovering, we reduced the radius to 48’ to allow for clearance. (Did that have a potential screw-up factor?)
  10. The second circuit was flown at 45’ AGL with a radius of 48’. We did not realize the 2-second auto-shutter had been reset until the aircraft had already started moving.
  11. Rather than pausing the aircraft to reset the camera, we elected to take images manually, at a rate that ended up being more like one every second, but more data is better.
  12. The third circuit was flown at 55’ AGL (only 10 feet above the previous circuit, due to the roof line and the need to mask the horizon) at the same 48’ radius.
  13. When the third circuit was complete, we decided to make an effort to improve the capture of the shed on the property, simply for practice. All images of the shed were taken while the aircraft hovered at 6’ or below, hand flown at all times.
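
A quick sanity check on the 2-second assumption from step 8: the spacing between shots along the orbit is just speed × interval, and the forward overlap follows from the camera’s horizontal field of view and the distance to the subject. The sketch below is a rough estimate only; the orbit speed and the ~84° horizontal FOV are my assumptions for a Zenmuse X3-class camera, not values taken from the flight logs.

```python
import math

# Rough forward-overlap estimate for the 48' POI orbit.
# ASSUMPTIONS: the orbit ground speed and horizontal FOV below are guesses,
# not values pulled from the original flight logs or a camera spec sheet.
radius_ft = 48.0      # orbit radius from the estimated house center point
speed_ft_s = 4.0      # assumed "slower than normal" orbit speed
interval_s = 2.0      # auto-shutter interval
hfov_deg = 84.0       # assumed horizontal field of view (Zenmuse X3-class)

# Distance the aircraft travels along the arc between consecutive shots.
spacing_ft = speed_ft_s * interval_s

# Approximate width of the scene captured in each image,
# treating the house facade as roughly radius_ft away from the camera.
footprint_ft = 2 * radius_ft * math.tan(math.radians(hfov_deg / 2))

forward_overlap = 1 - spacing_ft / footprint_ft
print(f"spacing between shots: {spacing_ft:.1f} ft")
print(f"approximate scene width per image: {footprint_ft:.1f} ft")
print(f"approximate forward overlap: {forward_overlap:.0%}")
```

With those assumed numbers the forward overlap comes out around 90%, which supports the “more than adequate” assumption; the manual one-per-second shooting in step 11 only adds margin.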

DAY 2 (Weather: High clouds, sunlight diffused, though less so than the previous day; still minimal shadows)

  1. At some point in the last 12 months, I read or heard someone claim that cell phone cameras are adequate for supplementing 3D model image sets. Based on that, and in no way happy with the 3D render errors that resulted from the missing imagery, we elected to return the next day without the drone.
  2. Using a Samsung Galaxy 8 Active cell phone, I collected approximately 180 additional images of all the areas on the house we had missed the previous day, as well as at the shed.
  3. Upon returning, I elected to rerun the project with the cell phone images (accurately identified by Pix4D) and was very disappointed to find that this resulted in no improvement. I made sure those images were in the list, which they were, and at some point realized that most, though not all, of the cell photos were designated as uncalibrated. Either way, we were looking at the same errors in the model. (A quick EXIF sanity check for those photos is sketched after this list.)
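
One mundane thing worth ruling out before blaming the mixed cameras: Pix4D seeds its camera model from each image’s EXIF make/model and focal length, so phone photos with stripped or inconsistent EXIF are harder to calibrate. Below is a minimal sketch that dumps those fields so the phone set can at least be confirmed as internally consistent; it assumes Pillow is installed, and the `day2_phone_images` folder name is a hypothetical placeholder.

```python
from pathlib import Path
from PIL import Image

# Dump the EXIF fields Pix4D relies on when picking a camera model.
# The folder name below is hypothetical -- point it at the phone photos.
IMAGE_DIR = Path("day2_phone_images")

for path in sorted(IMAGE_DIR.glob("*.jpg")):
    exif = Image.open(path).getexif()
    make = exif.get(271, "MISSING")            # 271 = Make (IFD0)
    model = exif.get(272, "MISSING")           # 272 = Model (IFD0)
    exif_ifd = exif.get_ifd(0x8769)            # Exif sub-IFD
    focal = exif_ifd.get(37386, "MISSING")     # 37386 = FocalLength
    print(f"{path.name}: {make} {model}, focal length = {focal}")
```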

DAY 3 (Weather: Overcast, sunlight diffused, no shadows)

  1. Never accepting defeat, on the third day the drone came back out, and back to work we went.
  2. Over the course of the 2 batteries used, we hand flew the aircraft at chest level and used the camera tilt to ensure we were capturing every inch under the porches and soffits.
  3. We covered the same ground as the day before, but this time with the drone’s camera, thinking the two sets of data from two different cameras might have poisoned the well the day before.
  4. We collected 230 additional images, and were satisfied with the data.

We threw the day 3 data in with the day 1 data and created a completely new project. Once again, the porches and soffits refused to render properly, despite all of the day 3 data being designated as calibrated.

Is there a glaring error I’m not seeing here? Should we have bitten the bullet on day 2 and flown every part of the mission again? I can’t see why data from 2 different days would be irreconcilable, so help me out!

Thanks!

Hey Pete,

Thank you very much for sharing detailed information about what you did and the problems you want to address along with a link to the 3D textured mesh on Sketchfab. I really appreciate it.

Please read through all of the steps, and let us know before you return to working on the project if there is any step you are not confident you understand.

With the understanding that there is more than one way to troubleshoot a project, and that without seeing a PDF copy of the latest project’s quality report I cannot be certain the following will get you back on track, I suggest you do the following:

  1. Create one Pix4Dmapper project with the images you captured on day one (if the two days’ images are mixed in a single folder, the sketch after this list shows one way to separate them).
  2. Create one Pix4Dmapper project with the images you captured on day three.
  3. Process step 1. Initial Processing for each of the projects.
  4. Ensure that Pix4Dmapper calibrates practically all of the images in each of the projects.
  5. Identify a handful of distinct features throughout the project area that appear in the automatic tie point clouds of both projects.
  6. Add a manual tie point for each of those common features.
  7. Ensure that each of the corresponding manual tie points has the exact same name.
  8. Reoptimize each of the projects.
  9. Generate a new quality report.
  10. Confirm once again that Pix4Dmapper calibrates practically all of the images in each of the projects.
  11. Close both of the Pix4Dmapper projects.
  12. Create a new Pix4Dmapper project by merging the two sub-projects.
  13. Confirm for the third time that Pix4Dmapper calibrates practically all of the images in the project and that the computed position and orientation of all of the images from both day one and day three align with each other.
  14. Add a processing area with the rayCloud so that Pix4Dmapper only reconstructs what you want to reconstruct.
  15. Process step 2. Point Cloud and Mesh.
  16. Consider the results. If the reconstruction accurately represents the geometry of the front porch and soffits, please let us know. Otherwise, please share a description of what still looks wrong in the latest results.
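
A practical note on steps 1 and 2: if the day-one and day-three images ended up in a single folder, they need to be separated before the two sub-projects can be built. The sketch below is one way to do that using the EXIF capture date; it assumes Pillow is installed, and the `all_images` / `split_by_day` paths are hypothetical placeholders.

```python
import shutil
from pathlib import Path
from PIL import Image

# Copy a mixed set of images into one folder per capture date,
# so each date can be loaded into its own Pix4Dmapper sub-project.
# SOURCE and DEST are hypothetical placeholders -- adjust as needed.
SOURCE = Path("all_images")
DEST = Path("split_by_day")

images = sorted(p for p in SOURCE.iterdir() if p.suffix.lower() in {".jpg", ".jpeg"})
for path in images:
    exif = Image.open(path).getexif()
    stamp = exif.get(306)                 # 306 = DateTime, "YYYY:MM:DD HH:MM:SS"
    day = stamp.split(" ")[0].replace(":", "-") if stamp else "unknown_date"
    out_dir = DEST / day                  # e.g. split_by_day/YYYY-MM-DD
    out_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(path, out_dir / path.name)
    print(f"{path.name} -> {out_dir}")
```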

Looking forward to hearing from you.