I ran a dataset of 999 images on an Alienware 17 laptop and on an Amazon EC2 g2.8xlarge instance. I ran both through initial processing with exactly the same settings. Both runs took about 5 hours, with only a 3-minute difference in processing time. The EC2 instance should be far faster, and a difference of only 3 minutes is not worth it. Has anyone else seen a lack of improvement when using a more powerful machine?
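For a like-for-like comparison across machines, it helps to capture the wall-clock time of the run in the same way on both. A minimal sketch in Python (the command here is only a placeholder; substitute your actual processing call):

```python
import subprocess
import time

# Placeholder command; replace with the actual initial-processing invocation.
cmd = ["echo", "initial processing"]

start = time.perf_counter()
result = subprocess.run(cmd, capture_output=True, text=True)
elapsed = time.perf_counter() - start

# Logging wall-clock time the same way on both machines makes the
# laptop-vs-EC2 numbers directly comparable.
print(f"exit code: {result.returncode}, wall-clock: {elapsed:.2f} s")
```

Running this around the same job on both machines rules out differences in how each run was timed.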
Could you please send us the log files and the quality reports of the projects processed on Amazon and on the laptop so we can investigate the performance further?
You can attach the files to a support request, which you can submit here.
I run a similar workstation with dual Xeon 2670s, and it will destroy any laptop. There is no way the Alienware should finish anywhere near the same time as the g2.8xlarge. You should easily see an improvement of around half the time.
Was there any update on this? I also ran some jobs on Amazon EC2 and was not pleased with the timing. It seems like my MacBook was not that much slower.