
At the risk of being boring, we’re extremely happy with our MADI set-up, which is the one described by Xvision:

“The only ‘acceptable’ one on paper seems to be MADI with an RME interface on the PC end” … although we use a Mac rather than a PC.

In fact, I had to ADD latency through MainStage to make the Ivory piano samples feel real: they were coming back at the pianists too quickly, which was uncomfortable. That’s possibly because the musicians were used to latency in the previous MOTU-based system.

There seem to be two different starting points for the discussion here: running sample sources through the system, and simply routing live mics through it while struggling with latency. These would appear to be incompatible scenarios. In any case, it’s NEVER going to be possible to compensate fully for the alignment of different sound sources; this is a problem well known to church organists, and on any large stage musicians have to compensate continuously.
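To put rough numbers on that last point, here is a small sketch (my own illustrative arithmetic, not measurements from our rig) showing why full compensation is impossible: acoustic distance alone contributes delays on the same order as, or larger than, anything an interface buffer adds.

```python
# Rough latency arithmetic: acoustic delay from distance,
# and the delay contributed by one audio buffer at a given
# sample rate. Numbers are illustrative only.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def acoustic_delay_ms(distance_m: float) -> float:
    """Time for sound to travel distance_m metres, in milliseconds."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Delay contributed by a single buffer pass, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# An organist sitting 10 m from the pipes hears them late:
print(round(acoustic_delay_ms(10.0), 1))        # ≈ 29.2 ms

# A 64-sample buffer at 48 kHz adds far less per pass:
print(round(buffer_latency_ms(64, 48000), 2))   # ≈ 1.33 ms
```

So even a low-latency interface cannot cancel the geometry of the room: two sources a few metres apart are already “misaligned” by more than the interface’s own buffer delay.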