No digital interface is 100% perfect, and there is always the potential for errors to creep in under certain conditions. The advantage that digital interfaces have over analogue is that inherent signal-transfer degradations like noise and distortion, which become progressively audible in analogue interfaces, have no effect at all on digital interfaces... up to a certain catastrophic failure point.
One of the major weaknesses of the AES3 interface is that it employs an embedded clock, and inherent cable characteristics degrade that embedded clock through jitter. So any system that relies on using the embedded clock to convert the digital data to analogue (or vice versa) can suffer signal degradation.
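For anyone wondering what 'embedded clock' actually means in practice: AES3 uses biphase-mark coding, where every bit cell begins with a voltage transition and a '1' adds a second transition in mid-cell. The receiver recovers its clock from those guaranteed edges -- so any timing error on those edges is, by definition, jitter. A rough Python sketch of my own, purely to illustrate the line code (nothing here is from the standard's text):

```python
def biphase_mark_encode(bits, level=0):
    """Biphase-mark coding: two half-cell line levels per input bit."""
    out = []
    for bit in bits:
        level ^= 1       # transition at the start of every bit cell -- this is the embedded clock
        out.append(level)
        if bit:
            level ^= 1   # a '1' transitions a second time in mid-cell
        out.append(level)
    return out

print(biphase_mark_encode([1, 0, 1, 1, 0]))
# -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
```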
That's why every manufacturer makes a song and dance about the quality of their jitter-reduction technologies.
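Most of those technologies amount to re-clocking the incoming signal through a phase-locked loop with a narrow loop bandwidth, which low-pass filters the recovered clock (FIFO buffers and asynchronous sample-rate converters are the other common approaches). A toy first-order model, with every number invented purely for illustration:

```python
import random

nominal_period = 1.0 / 48000   # ideal sample-clock period in seconds
alpha = 0.01                   # loop bandwidth: smaller = stronger jitter rejection

t_ideal = 0.0      # where the edges would fall with a perfect clock
recovered = 0.0    # the PLL's reconstructed clock phase
for n in range(10):
    t_ideal += nominal_period
    edge = t_ideal + random.gauss(0, 5e-9)            # incoming edge with ~5ns RMS jitter
    expected = recovered + nominal_period             # where the loop predicted the edge
    recovered = expected + alpha * (edge - expected)  # nudge gently toward the measured edge
    print(f"edge jitter {edge - t_ideal:+.1e}s, recovered-clock error {recovered - t_ideal:+.1e}s")
```

Run that and the recovered-clock error comes out dramatically smaller than the raw edge jitter -- which is the whole sales pitch in a few lines of arithmetic.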
Having said all that, if you are passing digital audio from one device to another, without converting it to/from analogue, interface jitter is completely irrelevant: the receiver simply buffers and re-clocks the data, and the bit values themselves are unaffected by timing errors.
Beyond a simple parity bit that can flag (but not repair) a corrupted subframe, there is no error detection/correction system involved in AES3 -- or any of the other common digital interface formats -- because over a short point-to-point link it is completely unnecessary. However, I'm afraid I'm not yet sufficiently cognizant of the arrangements involved in passing digital audio over USB3.
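To be precise about that one safeguard: bit 31 of each 32-bit AES3 subframe is an even-parity bit covering bits 4 to 31, so a receiver can flag a corrupted subframe but has nothing to repair it with. A hypothetical checker (the function name is mine):

```python
def aes3_parity_ok(subframe: int) -> bool:
    """Check even parity over bits 4-31 of a 32-bit AES3 subframe."""
    payload = (subframe >> 4) & 0x0FFFFFFF   # skip the 4 preamble bit slots
    return bin(payload).count("1") % 2 == 0  # an even count of ones means no (odd) error detected
```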
There were/are lots of problems passing digital audio over USB2, many of which were overcome with various asynchronous interface designs. So I'm not surprised that USB3 might perform better than USB2. But AES3 -- when used properly, with sensible cables and good jitter-handling designs -- has been proven over twenty-plus years to be extremely reliable and to provide bit-perfect data transfers.
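For what it's worth, 'asynchronous' in the USB context means the device clocks its own converters and periodically tells the host how many samples to send, so the host's bus timing never touches the conversion clock. A sketch of the idea (based on my understanding of the USB 2.0 audio feedback mechanism; the encoding details here are illustrative):

```python
def feedback_value(measured_sample_rate_hz: float) -> int:
    """Encode 'samples per microframe' as a 16.16 fixed-point feedback value."""
    MICROFRAMES_PER_SECOND = 8000                     # high-speed USB: 125us microframes
    samples_per_microframe = measured_sample_rate_hz / MICROFRAMES_PER_SECOND
    return round(samples_per_microframe * (1 << 16))  # 16.16 fixed point

print(hex(feedback_value(48000.0)))  # 0x60000: exactly 6 samples per microframe
print(hex(feedback_value(48012.0)))  # device clock runs fast, so the host is asked to send more
```

The point is that the device's own crystal sets the pace; whatever jitter the bus has only affects when buffers move, not when samples hit the DAC.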
So I would surmise that there was some degree of manufacturer hype or ignorance involved in that claim... not that that is unusual for a GS thread!

H