- cross-posted to:
- news@lemmy.linuxuserspace.show
Wayland. It comes up a lot: “Bug X fixed in the Plasma Wayland session.” “The Plasma Wayland session has now gained support for feature Y.” And it’s in the news quite …
I live in a time where I don't need to edit config files by hand to let multiple applications share the same audio output, since I use a sound server. If you're willing to do it by hand, then by all means continue. Though it does seem that ALSA has had support for automatically setting up dmix since 2005, after PulseAudio was released.
I also don't know whether resampling and the like are handled automatically when using dmix, but perhaps you can tell me, since it sounds like you have experience with it?
How about we keep a good fucking tone. Yes, that's great. However, my experience is that programs all want to set those properties themselves, with no way to disable it, so in practice it doesn't really matter.
Yeah, as you mention, hardware mixing used to be an option, but AFAIK hardware generally hasn't supported it for a long time.
Another reason to use PipeWire is that it enables sandboxed access to multimedia devices, for use with things like Flatpak or Snap.
You don’t need to. It just works out of box.
And after JACK. And PulseAudio development started after JACK was released.
Either by dmix or by libalsa; I've never had issues with the sample rate.
No offense intended, just saying that reading the manual is one of the best ways to get information.
For some reason I don't remember having such an issue. I also can't say for sure that I've used the same webcam in two applications, but I think I used a v4l2loopback "webcam" in VLC and Chromium at the same time.
Well, /dev/videoX can be forwarded to the sandbox. Snap and Flatpak are not designed to be sandboxes.
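(For the record, both approaches can be expressed as Flatpak overrides; a rough sketch, with `org.example.App` as a placeholder application ID:)

```
# Placeholder app ID; substitute the real one (see `flatpak list`)

# Forward raw device nodes such as /dev/videoX into the sandbox
flatpak override --user --device=all org.example.App

# Or grant access to the host PipeWire socket instead of raw devices
flatpak override --user --filesystem=xdg-run/pipewire-0 org.example.App
```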
Sorry, I forgot to reply. Better late than never.