Question

Single Sonos One working as stereo?

  • 5 December 2019
  • 41 replies
  • 370 views



Taking on board what @buzz mentions above as being the cause, I found this animation that shows the issue in action.

The two grey-coloured mono audio waves start out in phase; one is then slowed down, and you can see it drift to 180° out of phase, at which point the mixed waveform (the blue line at the bottom of the graphic) drops to zero.

 

There is clearly a lot more to audio than meets the eye.
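For anyone who would like to reproduce the cancellation from the animation numerically, here is a minimal Python/numpy sketch; the 440 Hz tone is just a stand-in for the grey mono waves:

```python
# Minimal sketch of phase cancellation: two identical mono waves,
# one inverted (180 degrees out of phase), sum to silence.
import numpy as np

sr = 44100                               # sample rate in Hz
t = np.arange(sr) / sr                   # one second of time
wave = np.sin(2 * np.pi * 440 * t)       # a 440 Hz tone standing in for the grey waves

in_phase_mix = wave + wave               # both copies in phase: level doubles
out_of_phase_mix = wave + (-wave)        # one copy inverted: complete cancellation

print(np.max(np.abs(in_phase_mix)))      # ~2.0
print(np.max(np.abs(out_of_phase_mix)))  # 0.0
```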

I am aware of this phenomenon, but I somehow can’t square it with what we are seeing here: one instrument in the music gets dropped out of the mix while everything else seems to sound OK. Also, this seems to be the first time someone has noticed something like this, so it cannot be at all common - kudos to the OP for picking it up, even if accidentally!


If that one instrument was purposefully recorded out of phase (which seems to be the case here), and the rest is recorded normally (i.e. in phase) across the two channels, this is the expected effect.  

Would that one instrument be recorded out of phase by error or by design? If the latter, why?

Yes, I think jgatie is correct here: two mono sources from the arpeggiator’s audio output were likely mixed 180° out of phase by the engineer at the mixing desk. Maybe it was a mistake, but it could possibly have been done deliberately to spark discussion and/or generate some publicity.

All the same, it’s an interesting topic, I think.

I would assume by error. The engineer probably never listened to the track in mono, and so wouldn’t have noticed it was happening.

Would that one instrument be recorded out of phase by error or by design? If the latter, why?

 

No chance of knowing.  It wouldn’t be the first time someone did funky things with the mix.

 

Hello Seba,

Thank you for the reply and the link to the discussion surrounding this issue. After looking at the diagnostic you provided, I am seeing some level of interference which could definitely cause audio quality variations. My recommendation at this point, in order to cover all bases, would be to factory reset the device and reboot your modem and router. See if this problem still arises and, if so, please send us another diagnostic.

Michael C.
Sonos | Customer Care | Contact Us  
 

 

THIS IS JUST SAD. 

Curious as to why this is sad? They identified an issue in your submission which might cause problems, even if not this particular one, and suggested a method for you to potentially resolve it. That’s what I would expect from a support line. Even if they’re unable to address my exact issue, given the nature of the discussion above, they’ve identified another issue that I hadn’t yet noticed, and helped me address it before it becomes a problem.

Using a pair of regular speakers connected to an amplifier, play a mono program source, such as a news broadcast. Now, connect one of the speakers out of phase. Notice that your sense of “space” is different. After some training you can listen to a system and determine if it is in phase or out of phase.

There is a story about Les Paul (guitarist and innovator) walking through a control room: he paused for a second, then informed the young recording engineer that the piano microphone was wired out of phase. The engineer was stunned, but after checking, verified that Les was correct.

With regard to the case that we are studying, the instrument may have been deliberately added out of phase in order to create a spatial effect. On the mixing consoles I’ve seen, one would need to take special action to add an instrument with L/R out of phase (but it is not rocket science).

 

With regard to the case that we are studying, the instrument may have been deliberately added out of phase in order to create a spatial effect. 

Is the music that we do hear on a stereo speaker pair influenced by this action, and would it sound different if mixed in the usual way? I understand from the above that the answer is yes; just checking.

In this case only the instrument in question is out of phase. The other instruments are normal.

Sometimes a single instrument (usually a lead instrument) is deliberately processed out-of-phase to give it a floaty/ethereal sound. It’s very easy to do during the mixing process. The problem comes, as you’ve discovered, when you listen in mono - the instrument disappears.
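To make that concrete, here is a small Python/numpy sketch of what the polarity flip does; the sine tones are only stand-ins for real instruments, and the mixing is deliberately simplified:

```python
# One "instrument" is added to the stereo mix with its polarity flipped on the
# right channel. It is audible in stereo, but cancels out of a mono fold-down.
import numpy as np

sr = 44100
t = np.arange(sr) / sr
drums = 0.4 * np.sin(2 * np.pi * 110 * t)   # recorded identically in both channels
synth = 0.4 * np.sin(2 * np.pi * 660 * t)   # the "floaty" instrument

left = drums + synth                         # synth added normally on the left...
right = drums - synth                        # ...but polarity-flipped on the right

mono = (left + right) / 2                    # what a single (mono) speaker plays
print(np.allclose(mono, drums))              # True: the synth has vanished
```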

Just for fun, I’ve taken this song and flipped the phase of the right channel, then added the left and right channels together (same as a mono speaker will do), and saved the result as a stereo file with this mono signal in both left and right channels (I’ve saved it this way so it sounds the same whether you listen in mono or stereo). You can hear the result by downloading the file from here.

You will notice the absence of the kick drum at the start, because it was recorded identically in both channels (bass instruments are often recorded the same in each channel because the ear is not very good at locating low frequencies in stereo). Hence flipping the phase cancels it out. Vocals are also often recorded the same in both channels, but vocal reverb is generally a stereo effect, so it can differ between channels.

At the 3:00 mark, the synth is very much audible. It was definitely recorded out of phase, as many people in this thread surmised.

At 3:31 you can just hear the vocals - this is mainly the vocal reverb. The dry (original non-reverbed) vocal part has been cancelled by the phase flip.

 

Cheers, Peter.
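For anyone who wants to repeat Peter’s experiment on their own copy of the track, here is a rough sketch of the flip-and-sum procedure in Python; the soundfile library and the file names are my own assumptions, not part of Peter’s post:

```python
# Rough sketch of the flip-and-sum check: invert the right channel, sum with
# the left, and save the mono result into both channels of a stereo file.
import numpy as np
import soundfile as sf

audio, sr = sf.read("song.wav")          # placeholder file name; shape (samples, 2)
left, right = audio[:, 0], audio[:, 1]

# Anything identical in both channels (kick drum, dry vocal) cancels here;
# anything mixed out of phase (the synth) is reinforced instead.
difference = 0.5 * (left - right)        # left + inverted right, scaled to avoid clipping

out = np.column_stack([difference, difference])
sf.write("song_flipped_mono.wav", out, sr)
```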

Curious as to why this is sad? They identified an issue in your submission which might cause problems, even if not this particular one, and suggested a method for you to potentially resolve it. That’s what I would expect from a support line. Even if they’re unable to address my exact issue, given the nature of the discussion above, they’ve identified another issue that I hadn’t yet noticed, and helped me address it before it becomes a problem.

Agreed. That was a harsh and unfair comment by the OP. The Support person was looking for problems that affect Sonos, and that is what he found. The true explanation had nothing to do with Sonos, was extremely unusual, and took a bunch of us geeks a fair while, and lots of testing and discussion, to get to the bottom of. The Support person had no chance of coming up with the actual explanation here.

In the LP mastering process the lowest frequencies are customarily cut in mono because this uses somewhat less “space” on the record surface.
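As a rough illustration of what “cutting the lowest frequencies in mono” amounts to, here is a mid/side sketch in Python; the 150 Hz crossover and the use of scipy are my own choices, not anything from the mastering chain discussed here:

```python
# Below a chosen crossover, the side (L-R) signal is removed, so the low end
# becomes identical in both channels -- i.e. the bass is mono.
import numpy as np
from scipy.signal import butter, lfilter

def mono_the_bass(left, right, sr, cutoff_hz=150):
    mid = (left + right) / 2                     # the mono sum
    side = (left - right) / 2                    # the stereo difference

    # High-pass the side signal so no stereo difference remains below the cutoff.
    b, a = butter(2, cutoff_hz, btype="highpass", fs=sr)
    side_hp = lfilter(b, a, side)

    return mid + side_hp, mid - side_hp          # reconstructed left and right
```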

By the way, modern LP mastering cutters digitize the signal; it is very difficult to find a recently cut LP that has not been digitized at some step of the process. In the digital domain it is easy to adjust the groove “pitch” (spacing) so that a loud bass note will not cut through to the adjacent groove; otherwise the recording engineer must manually ride the groove pitch control. Dynamically changing the pitch allows stuffing more music onto a side, but if the adjustment is a little late the master is ruined, because there is a “Y” in the spiral.

In the digital domain one can set the minimum groove wall thickness, then sit back and watch the process. If the groove wall is too thick, the maximum time per side is reduced; if it is too thin, you get “pre echo” and “post echo”, because cutting slightly deforms the walls of adjacent grooves (think of the cutter as a hot knife melting wax). There is also a pre or post echo associated with storing an analog master tape.

Reply