btrfs RAID1: wrong disk usage reported
When using a btrfs file system that spans two disks in RAID1 mode, GParted appears to report the wrong used/unused sizes.
In my case, `btrfs fi usage` returns:
```
Overall:
    Device size:                  32.74TiB
    Device allocated:             28.16TiB
    Device unallocated:            4.58TiB
    Device missing:                  0.00B
    Used:                         28.15TiB
    Free (estimated):              2.29TiB  (min: 2.29TiB)
    Free (statfs, df):             2.29TiB
    Data ratio:                       2.00
    Metadata ratio:                   2.00
    Global reserve:              512.00MiB  (used: 0.00B)
    Multiple profiles:                  no

Data,RAID1: Size:14.06TiB, Used:14.06TiB (99.99%)
   /dev/sdf1      14.06TiB
   /dev/sde1      14.06TiB

Metadata,RAID1: Size:17.00GiB, Used:14.89GiB (87.61%)
   /dev/sdf1      17.00GiB
   /dev/sde1      17.00GiB

System,RAID1: Size:32.00MiB, Used:1.94MiB (6.05%)
   /dev/sdf1      32.00MiB
   /dev/sde1      32.00MiB

Unallocated:
   /dev/sdf1       2.29TiB
   /dev/sde1       2.29TiB
```
So I have 14 TiB of data, duplicated across the two disks (for a total of 28 TiB on-disk), meaning each disk actually holds about 14 TiB. GParted, however, shows each disk as only 14 TiB / 2 = 7 TiB full, which is not true.
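To illustrate, here is a minimal sketch of the RAID1 accounting using the numbers above. The second calculation is my guess at what GParted seems to be computing (dividing the logical data size by the disk count instead of multiplying by the data ratio); I have not confirmed this against GParted's source.

```python
# Numbers from `btrfs fi usage` above.
data_used = 14.06   # TiB, logical data (Data,RAID1 Used)
data_ratio = 2.0    # RAID1: every block is stored twice
disks = 2

# Actual on-disk total across the array:
on_disk_total = data_used * data_ratio       # 28.12 TiB

# Correct per-disk usage: each disk holds one full copy of the data.
per_disk_actual = on_disk_total / disks      # 14.06 TiB

# What GParted appears to report (hypothetical): the logical data
# size split across the disks, ignoring the data ratio.
per_disk_gparted = data_used / disks         # 7.03 TiB

print(per_disk_actual, per_disk_gparted)
```

In other words, with a data ratio of 2.00 the per-disk usage equals the logical data size, so halving it again undercounts by a factor of 4 relative to the raw on-disk bytes.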
This issue might be related to #106.