Outsourcing compression workers
General stability improvement when doing compressions and extractions.
I sometimes compress large directories into txz archives. Just now I compressed four of those large folders at the same time to better utilize my eight-core processor. Merely opening a new Nautilus window, navigating through three folders, and finally pressing the “back” mouse button caused Nautilus to crash completely, taking all four compression workers down with it. Compressing all four folders takes a few hours in my case, so it would be good if these processes were not killed when Nautilus crashes.
One could argue that this is simply a Nautilus bug, but Nautilus is rather fragile in general (depending on what you do; working on remote locations is a horror story, for example), so there are many reasons for it to crash. However, since Nautilus is quite stable when directories are merely being browsed, the compression work seems to destabilize it, and I think it may be a good idea in general to outsource this work into separate FileRoller processes. Would this make sense?
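To illustrate the idea, here is a minimal sketch (not actual Nautilus or FileRoller code; the helper name and the direct use of `tar` are my own assumptions): a worker launched in its own session is not torn down when the parent process crashes.

```python
import subprocess

def spawn_detached_compression(src_dir: str, archive_path: str) -> subprocess.Popen:
    # Hypothetical helper: run the compression in its own session
    # (setsid) so the worker keeps running even if the parent
    # process -- the file manager in this scenario -- crashes.
    return subprocess.Popen(
        ["tar", "-cJf", archive_path, "-C", src_dir, "."],
        start_new_session=True,  # detach from the parent's session and process group
    )
```

Because the child runs in its own session, a crash of the caller leaves the compression untouched; the caller only needs a way to rediscover the worker (or its progress) afterwards.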
Benefits of the solution
Better separation of processes for a more stable user experience. If Nautilus crashes, these workers would hopefully continue running. These compressions also currently slow down the system considerably; maybe that could be improved as well.
I don’t know, though; syncing the progress from a separate process back to the progress pie animation might not be as easy.