that's really not how I want to control my DAW. I mentioned this a bit in the comments section of Tonebenders episode 2 (check it out!), but I want to go into more detail here.
First of all, I get why random access control surfaces are probably the future - the ability to implement custom interfaces that aren't tied to dedicated knobs and sliders is a powerful concept - but the fact of the matter is that finger access is never going to match mouse access for fast, sample-accurate, detailed editing tasks.
It also doesn't seem like you'd get the same "feel" as you would with a hardware controller if you're doing fingertip mixing. All of these "pictures under glass" offer only visual feedback when you manipulate them, which IMO takes you out of the process of manipulating sound.
It also reminded me of a little rant I came across a while back, which very eloquently details the ways our hands manipulate the world and receive feedback - and points out that the touchscreen interface basically breaks that relationship by refusing to give tactile feedback. It forces us to use our eyes to confirm that we did anything, and to gauge how much we've done.
We can close our eyes and turn a knob or move a fader, and instinctively feel how far we've moved it. That's an important part of manipulating audio, and all of these touchscreen interfaces break that connection.
I'd personally much prefer to see a system that looks like a flat tablet but actually has a grid of actuators that pop up to provide tactile feedback, much the way a refreshable braille display does. Maybe it could also be both touch and pressure sensitive, so that it could move with your hands as you manipulate the interface.
I think that would end up being more useful for audio-specific tasks, and as an interface in general. Our hands currently aren't getting any feedback from our screens.