Forum Replies Created

    #63899

    If you go the Ableton route, here’s the bit you care about. (Using MIDI “dummy” clips.)

    #63897

    DDFF_LV – yes: if you use the TCP MIDI driver, bi-directional MIDI communication happens automatically via that MIDI “device” in whatever software you point at it. For instance, I wanted to see what messages were actually being sent to/from the thing, so I had Osculator look at TCP MIDI as a MIDI device; the program change messages from dLive popped right up on scene recall, along with MIDI fader movements and so forth.
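
    If you’d rather watch that traffic from a script instead of Osculator, here’s a rough sketch using Python and the mido library (my tooling choice, nothing A&H supplies); the port name “TCP MIDI” is a placeholder for whatever the driver actually registers as on your machine:

    # Rough sketch: print everything arriving from the TCP MIDI driver.
    # Assumes Python 3 with mido and a backend installed (pip install mido python-rtmidi).
    import mido

    print(mido.get_input_names())              # find the exact port name first

    with mido.open_input('TCP MIDI') as port:  # placeholder port name
        for msg in port:
            if msg.type == 'program_change':
                # dLive scene recalls show up as program change messages
                print(f'Program change: channel {msg.channel + 1}, program {msg.program}')
            else:
                print(msg)                     # fader moves etc. show up here too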

    Example 2: in Ableton, I’ve got outgoing commands being sent to the dLive as program change messages (on clip launch), routed to the dLive by way of the “TCP MIDI” driver (which shows up as a MIDI “device”).

    If you want to, you can do this the other way around, i.e. the dLive will easily trigger Ableton and so forth. Where that gets squirrely is that Ableton won’t differentiate program change values as discrete cues: a program change on channel 16, even though I’ve got values 0-127 on bank 1, will *still* only show up as “Program Change, 16.” So you can’t map scene change triggers directly to Ableton.

    HOWEVER: if you use something like Osculator as an intermediary, you can map those discrete program change values to multiple commands simultaneously: to Ableton, and as OSC messages sent to other devices in the house (lighting, for example).
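
    For the curious, the guts of that kind of intermediary are pretty small if you roll your own. A minimal sketch, assuming Python with mido and python-osc; the port name, OSC addresses, and target IPs/ports are all placeholders, not anything Osculator- or dLive-specific:

    # Minimal Osculator-style bridge: treat each discrete program change value
    # coming off the TCP MIDI driver as its own trigger and fan it out as OSC.
    # Assumes: pip install mido python-rtmidi python-osc
    import mido
    from pythonosc.udp_client import SimpleUDPClient

    lighting = SimpleUDPClient('10.0.1.50', 7700)    # placeholder lighting target
    playback = SimpleUDPClient('127.0.0.1', 9000)    # placeholder Ableton/Max listener

    with mido.open_input('TCP MIDI') as port:        # placeholder driver port name
        for msg in port:
            if msg.type != 'program_change':
                continue
            # Unlike Ableton's MIDI map, each program value is distinguishable here.
            lighting.send_message('/cue/fire', msg.program)
            playback.send_message(f'/scene/{msg.program}/launch', 1)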

    This gets interesting in a number of ways: if you use Ableton as the clock/timing master, you can issue those cues in ways that synchronize musically with what is happening on stage. In a recent show I had the sound board issue a go command to Ableton, which then ran a timeline that issued commands to a Pro Presenter machine doing environmental projection/video, to lighting, and to timed sub-cues on the desk.
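
    Stripped of the Ableton set itself, the idea of that timeline is just “go command in, beat-counted sub-cues out.” A toy sketch of that idea in Python (the tempo, targets, and cue addresses are invented for illustration; the real thing lives in Ableton/Max, not a script):

    # Toy beat-counted cue list: fire sub-cues a fixed number of beats after "go."
    import time
    from pythonosc.udp_client import SimpleUDPClient

    BPM = 72.0
    beat = 60.0 / BPM
    video = SimpleUDPClient('10.0.1.60', 8000)   # placeholder video machine
    desk  = SimpleUDPClient('10.0.1.70', 9001)   # placeholder console bridge

    cue_list = [
        (0,  lambda: video.send_message('/cue/environment', 1)),
        (16, lambda: desk.send_message('/scene/recall', 107)),   # four bars later
        (32, lambda: video.send_message('/cue/next', 1)),
    ]

    go = time.monotonic()                        # pretend the go command just arrived
    for beats_in, fire in cue_list:
        time.sleep(max(0.0, go + beats_in * beat - time.monotonic()))
        fire()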

    In your particular example – just map your controller to clip launch buttons in Ableton, and have those clips issue the program changes to the dLive that trigger the cues you want. I’m doing that very thing with a Rocktron MIDI Mate when the band is short and they need another player. You don’t have to use Ableton, either; Osculator will do it as well. I just prefer Ableton for the musical timing.

    #63876

    Sure. …but a channel strip on a digital desk isn’t exactly an analogue gate.

    #63824

    +1 on the lag question.

    My personal iPad auto-updated the dang mixPad software and I can’t downgrade to 1.4 now, so that’s hosed until I can upgrade our consoles to 1.5. /facepalm

    …aaaand we’ll DEFINITELY be waiting on the firmware update until that lag issue is addressed. I find myself constantly poking around the filtering while building sets. That being laggy/buggy is a non-starter for us!

    Mr. X – if you stay on 1.42 you’ll be fine. Buy a dLive, like yesterday. You won’t regret it!

    #63729

    Just played with this. …How are you getting +10? My C2500 only does -inf or 0 currently. Is it a setting somewhere?

    #63727

    Great idea if you’re in a static show/venue. I could see Aux sends being a problem if you move a Channel preset from one show to another. Also… what happens if you accidentally send +10dB prefade of, say, a massive keyboard to the infra-sub plugin? Ouch. (I’d definitely want an option to scope/disable that, kind of like the “Recall Preamp” button.)

    #63724

    Thanks for the tips fellas!

    Shame about the +10 increment though; I can see a lot of situations in which that would be an accidental “find a new job” button. I’d be interested to see if there’s a way to limit fader jumps to either -inf or 0 only.

    #63661

    Just an update: it was definitely an unshielded cable issue. Pulled what was supposed to be a shielded cable out of a bin. …It was not. Running three weeks now with a heavier shielded Cat6 and zero issues so far.

    #63633

    RE: Macs, you can actually accomplish automatic reconnect with Automator. I’ve made several scripts for shows in the past to make it easy for the volunteers running the thing. It’s not perfect, but it definitely works.
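
    For reference, the guts of one of those workflows is nothing fancy: roughly the following, dropped into a “Run Shell Script” action. This is just a sketch, and the MixRack address and app name are placeholders; whether relaunching the app is actually enough depends on your setup:

    #!/usr/bin/env python3
    # Sketch: wait until the rack answers a ping, then (re)launch the app that talks to it.
    import subprocess
    import time

    MIXRACK_IP = '10.0.1.152'        # placeholder address
    APP_NAME = 'DAW Control'         # placeholder app name, not the real bundle name

    def reachable(ip):
        # one ping, 2-second timeout (BSD/macOS ping flags)
        return subprocess.call(['ping', '-c', '1', '-t', '2', ip],
                               stdout=subprocess.DEVNULL) == 0

    while not reachable(MIXRACK_IP):
        time.sleep(5)                # keep trying until the rack shows up on the network

    subprocess.call(['open', '-a', APP_NAME])   # macOS: launch/bring the app forward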

    For dLive – I have to run the A&H DAW driver, which is coded by A&H, so why would they not be able to add an automatic connection attempt to software they designed and coded? That’s a completely separate process from opening an OS X network MIDI session, so the same rules shouldn’t necessarily apply.

    By your logic, my Ableton OSC command session should also HAVE to be manually reconnected on startup. However, there is an automatic connection feature I’ve built into the Max patch running those commands. I guess what I’m saying is that, given the number of workarounds I can conceive of as a non-coding individual, the “because native OS X TCP behaves this way, third-party software must as well” argument doesn’t hold any water whatsoever.

    #63627

    I’d definitely like to see an auto-connect feature in future updates. The driver already automatically stores the last IP I entered; it seems like an oversight not to have it at least attempt to reconnect automatically. I mean, I’m literally clicking a “connect” button. It should be easy to add an “attempt connection on startup” option, no?

    #63558

    I believe the word you’re looking for is “expander.” Technically a gate *is* an expander with a really, really, reeeeally high ratio. Most consoles I’ve worked with, and just about every expander/gate plugin in any DAW, offer the ability to alter that ratio from absolute on/off to gentler expansion. I mean, even my Behringer gives me that option (see attached). I would argue it’s *more* unusual to find a gate without an option to be used as an expander. Again, small thing, but it does make a big difference when you’re dealing with subtle dynamics.
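
    To put a number on “really high ratio,” here’s the basic downward-expansion arithmetic (a toy illustration, not any particular console’s algorithm):

    # Below threshold, a downward expander adds (ratio - 1) dB of attenuation for every
    # dB the input sits under the threshold; crank the ratio and it behaves like a gate.
    def expander_gain_db(input_db, threshold_db, ratio):
        if input_db >= threshold_db:
            return 0.0                              # above threshold: untouched
        under = threshold_db - input_db             # how far below threshold we are
        return -under * (ratio - 1.0)               # extra attenuation applied

    for ratio in (2.0, 4.0, 100.0):                 # 100:1 is effectively a gate
        print(ratio, expander_gain_db(-50.0, -40.0, ratio))
    # 2:1 -> -10 dB, 4:1 -> -30 dB, 100:1 -> -990 dB (slammed shut)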

    #63286

    Thanks guys!

    I think I found the problem. It looks like the C2500 had been set up with an incompatible IP address by way of someone setting it to DHCP while the MixRack was set to static. The C2500 was only connected to the MixRack via GigaACE, with no control network connection at all, so it decided to assign itself an address:

    Wed Apr 26 2017 02:50:18.737 :: Notification: Scene Recall: 106 (../../TLD/TLDCommon/SceneManager.cpp:1020) :: 0 :: -1
    Wed Apr 26 2017 05:07:16.685 :: Surface halting :: 3 :: -1
    Wed Apr 26 2017 05:07:16.685 :: 127.0.0.1 – Server Side Disconnected (AHNet) :: 3 :: -1
    Wed Apr 26 2017 05:07:16.686 :: 10.0.1.152 Disconnected (AHNet) :: 3 :: -1
    Wed Apr 26 2017 22:23:29.703 :: Application Started – V1.42 – Rev. 32313 :: 3 :: -1
    Wed Apr 26 2017 22:23:36.605 :: GigaACE MAC Address: 00:04:C4:04:0B:A8 :: 3 :: -1
    Wed Apr 26 2017 22:23:36.605 :: DX MAC Address Base: 00:04:C4:04:0B:A9 :: 3 :: -1
    Wed Apr 26 2017 22:23:37.982 :: Surface Started :: 3 :: -1
    Wed Apr 26 2017 22:23:37.982 :: Outgoing Remote Connection 10.0.1.152 :: 3 :: -1
    Wed Apr 26 2017 22:23:39.309 :: Connection made to 10.0.1.152 (AHNet) :: 3 :: -1
    Wed Apr 26 2017 22:23:39.309 :: Incoming Remote Connection :: 3 :: -1
    Wed Apr 26 2017 22:23:39.309 :: Connection made from 127.0.0.1 – Server Side (AHNet) :: 3 :: -1

    (Dates are wrong, that was Saturday 5/6.)

    Jack, I’ll send you the full log as well. I set the C2500 to static immediately after and haven’t had anything come up in the log since then. (…and I’ve been making a concerted effort to try to crash the thing.)

    #63267

    Apologies, I was only attempting to provide a reference point for what I perceive to be a missing feature.

    I guess what I’m expecting to hear is a seamless transition between 1/4- and 1/2-note taps without burning another FX slot/return channel, or having to do a quick fade out/in. VENUE’s one delay does a pretty good job at that, and I’ve come close to the same effect with the Ableton automation layer I mentioned earlier. I absolutely understand it’s a dynamic process. What I’m saying is: in a digital system, with digital delays, there should be a built-in automation mechanism that gives the user an option to remove those audible shifts entirely, without it sounding weird or unnatural. Is that so odd a request?
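
    One common way to get there is the same dual-line crossfade we’ve been doing by hand: run two copies of the delay, switch the idle one to the new tap time, and blend between them over a few hundred milliseconds. A toy sketch in Python/NumPy (how I’d picture it, not how VENUE or dLive actually implement it):

    import numpy as np

    def retime_delay(x, fs, old_delay_s, new_delay_s, switch_at_s, fade_s=0.3):
        # Two static delay taps; neither delay time is ever swept, so there's no
        # pitch artifact -- we just crossfade their outputs around the switch point.
        n = len(x)
        d_old, d_new = int(old_delay_s * fs), int(new_delay_s * fs)
        old, new = np.zeros(n), np.zeros(n)
        old[d_old:] = x[:n - d_old]      # delay line A (old tap time)
        new[d_new:] = x[:n - d_new]      # delay line B (new tap time)
        t = np.arange(n)
        w = np.clip((t - switch_at_s * fs) / (fade_s * fs), 0.0, 1.0)
        return (1.0 - w) * old + w * new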

    #63259

    Absolutely agreed! Keep it simple when you can! I have a handful of individuals who can handle more than the “voodoo” tools; when people are actually able to learn and use that information, I give it to them and have them learn the proper way to use the tools. (For what it’s worth, I even have a couple of folks working through Bob Katz concepts right now.) Unfortunately, I also work with a handful of individuals whose learning just caps out at a very basic level, so I do have to keep a few of those tricksy tricks in my bag for those folks.

    I do agree that having a second delay on a fader is equally useful, and just as good a workaround; I typically end up fading one out before the swap anyway. Part of where I’m coming from is that it’s one more thing to keep track of (or to accidentally fade the wrong one, for certain individuals), and I’m also used to working on consoles with very limited channel strips/resources, so I tend to consolidate returns down to as few as I can get away with. In the Pro Tools example, the engineer also had to deal with not having any snapshots or automation whatsoever, so not having to worry about the position of the delay fader was a big plus. (We were simultaneously recording multi-track; janky rig, I know.) Obviously dLive doesn’t have that issue, and with the addition of modular DSP/FX racks the sky’s really the limit. …or at least your DSP host processor/RAM capacity is the limit.

    #63256

    …and to answer the “why wouldn’t you just map it directly?” point: Ableton was acting as the intermediary for contextual, musically flexible, time-based automation. In this case: kill the delay output, change the tempo tap, return the delay output. Again, it’s really an end-user decision: all the volunteers see is that they tapped the tempo, things happened, and it sounded good (on the fly, with no audible pitch shift). I use Ableton for a number of other musically timed automation controls as well: lights, console fades (an X32 on occasion, whose remote control is built on OSC), filter parameters, video cues, and so on. It just made sense to me to use it in this application.

    As an engineer, I totally respect and understand your position on delays; everything you’re saying makes sense for engineers, or for users with any base level of mixing skill. On the other hand, I have to be concerned about unskilled user interaction in a fairly comprehensive way, hence my obsession with delay “artifacts.”

    Thanks for the interesting conversation, and I appreciate your perspective, sir! I’d still like to see a delay on dLive that behaves the same way as VENUE’s short/med/long, but I definitely agree there are other things deserving of attention before then.
