Just wondering - why is there latency when I set up my TC verb as an insert on the S/PDIF in/out of my Digi 001? How can I work around that so I can use it during a processor-intensive mixdown without printing at low buffer settings? I figured that since the signal requires no conversion, it would avoid the latency. Guess I'm wrong. Thanks
Cool, Bri. Thanks...but there is no conversion going on here - the 102 samples of latency in the A/D/A process don't exist - the signal exits PT digitally, is processed digitally, and returns digitally. Still at a loss...
I see what you're saying... there's still buffer latency even if there's no converter latency (and I'm pretty sure there's some hardware latency in there too - you just can't get around it). Anyway, for the sake of illustration, let's assume there's no hardware or converter latency. If your h/w buffer is set to 1024 samples, you'd still have to nudge your track 2048 samples to the left to account for buffer latency: you get the buffer on the way out and again on the way back in. Maybe Kenn Legault will chime in here - he's the latency expert at Digi...
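That math is simple enough to sketch in a few lines of Python. This is just a back-of-the-envelope calculator, with a 44.1 kHz session rate assumed only so the millisecond figures mean something:

```python
def roundtrip_latency(buffer_samples, sample_rate=44100):
    """Total buffer delay for one out-and-back trip through a hardware insert."""
    total = 2 * buffer_samples          # one buffer on the way out, one on the way back in
    ms = total / sample_rate * 1000.0   # same figure in milliseconds
    return total, ms

print(roundtrip_latency(1024))  # 2048 samples, ~46.4 ms at 44.1 kHz
print(roundtrip_latency(128))   # 256 samples, ~5.8 ms
```

So a 1024-sample buffer means nudging the returned track 2048 samples left, exactly as described above.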
You know, you could run the test from that link above: just run your S/PDIF cable from the Digi S/PDIF out to the Digi S/PDIF in and measure the latency that way...
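The offline half of that loopback test can be illustrated with a toy alignment routine: send a known click out, record what comes back, and find the shift that lines the two up. This is a hypothetical, brute-force sketch (real measurement tools correlate actual recordings), using a simulated 256-sample round trip:

```python
def measure_latency(sent, returned):
    """Brute-force: the shift (in samples) of `returned` that best lines up with `sent`."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(returned)):
        # score how well `sent` matches `returned` shifted left by `lag` samples
        score = sum(s * r for s, r in zip(sent, returned[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Simulate the loopback: send a single click...
sent = [0.0] * 1024
sent[0] = 1.0
# ...and get it back 256 samples late.
returned = [0.0] * 1024
returned[256] = 1.0

print(measure_latency(sent, returned))  # 256
```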
Good call, Bri, but I suspect the results would be predictable - 256 samples at a 128 buffer setting. Didn't realize that even S/PDIF is buffered (duh!). I think we're clear here - if not... anybody? Is there any particular reason the digital signal HAS to go through a buffer? I mean, both source and destination have locked clocks. Why would there be a need to buffer it? Kenn? NIKA?... I'll get my reading glasses
I believe this is the nature of the beast - by that I mean a host-based system, where everything runs through the host computer. TDM has far less latency because the processing is mostly done on the DSP chips on the card(s). The buffers you're running into are system buffers. In other words, you don't have a straight pipe from one device to the other; everything still runs through the host. Someone please correct me if I'm wrong...
Doug, I do believe you are correct. The question I have is: if the system is clocked to one clock source, why does the S/PDIF or ADAT I/O need to be buffered? It should be synced via the clocks - no chance of running into playback errors... or am I totally wrong here? (Probably.) I wish Eric or someone from Digi would chime in and explain this to me. I think I get it, but I need some clarity here!
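For what it's worth, here's a toy model of why the buffer is there even with locked clocks (my own illustration, not Digi's actual driver code): a native system hands audio to the CPU in whole blocks, so the plug-in can't touch sample 1 until a full block has been captured, and the processed block then has to be queued behind another buffer for playback. Locked clocks keep the streams from drifting, but they don't change the block-at-a-time hand-off:

```python
# Toy model of block-based host processing; timestamps are in samples
# and the clocks are assumed perfectly locked.

BUFFER = 128  # hardware buffer size in samples

def output_time(n, buffer=BUFFER):
    """When input sample n (0-based) re-emerges from the output."""
    block = n // buffer
    capture_done = (block + 1) * buffer     # plug-in can't run until the block is full
    playback_start = capture_done + buffer  # processed block queued behind one more buffer
    return playback_start + (n % buffer)    # sample's position within its block

# Every sample comes back exactly two buffers (256 samples) late:
print(output_time(0))    # 256
print(output_time(500))  # 756
```

In this model the delay is a property of block scheduling, not of clock sync, which matches the measured two-buffer figure discussed above.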