Maybe you have more than you know. The paper referenced the open-source Detectron. Images also get normalized down to 800x600 pixels, so data loss from downscaling really large images could interfere with this process?
https://github.com/facebookresearch/Detectron
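To make the concern concrete, here's a minimal sketch of that kind of downscaling step, assuming a fixed 800x600 target and using PIL. The function name, library choice, and target size are illustrative assumptions, not Detectron's actual preprocessing code (which configures scaling by image side length rather than a fixed frame).

```python
# Minimal sketch of the downscaling step described above (illustrative only).
# The 800x600 target and PIL are assumptions, not Detectron's real pipeline.
from PIL import Image

TARGET = (800, 600)  # assumed target size from the comment above

def downscale(path: str) -> Image.Image:
    """Shrink an image to fit within TARGET, preserving aspect ratio."""
    img = Image.open(path)
    img.thumbnail(TARGET, Image.LANCZOS)  # no-op for images already smaller
    return img

# Example: a 4000x3000 photo shrinks to 800x600, discarding ~96% of its
# pixels -- the data loss from really large images worried about above.
```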