1. The matrix implied by the reference primaries in Table 1: [X; Y; Z] = [506752/1228815, 87881/245763, 12673/70218; 87098/409605, 175762/245763, 12673/175545; 7918/409605, 87881/737289, 1001167/1053270]*[R; G; B]. (The entries are written row-major; as originally posted this matrix was transposed, which would put the R-column coefficients in the X row.)
2. The matrix in section 5.2: [X; Y; Z] = [1031/2500, 447/1250, 361/2000; 1063/5000, 447/625, 361/5000; 193/10000, 149/1250, 1901/2000]*[R; G; B].
3. The inverse of the matrix in section 5.3: [X; Y; Z] = [248898325000/603542646087, 71938950000/201180882029, 36311670000/201180882029; 128304856250/603542646087, 143878592500/201180882029, 14525360000/201180882029; 11646692500/603542646087, 23977515000/201180882029, 191221850000/201180882029]*[R; G; B].
The distinction starts to matter for 16-bit color. The CSS people seem to take the position that the matrix implied by the primaries is the true version, while the standard's own Annex F (added in Amd. 1) seems to suggest that the 5.2 matrix is the true version and that the 5.3 matrix should be rederived at the increased precision. There's no easy way to decide, as far as I can tell.
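To make the size of the disagreement concrete, here is a minimal Python sketch (my own, not from the standard) that enters the three matrices above as exact rationals, measures the largest per-entry gap between the first two, and encodes white under each at 16 bits:

    from fractions import Fraction as F

    # Rows are X, Y, Z; columns are R, G, B (same row-major order as above).
    M_primaries = [
        [F(506752, 1228815), F(87881, 245763),  F(12673, 70218)],
        [F(87098, 409605),   F(175762, 245763), F(12673, 175545)],
        [F(7918, 409605),    F(87881, 737289),  F(1001167, 1053270)],
    ]
    M_52 = [
        [F(1031, 2500), F(447, 1250), F(361, 2000)],
        [F(1063, 5000), F(447, 625),  F(361, 5000)],
        [F(193, 10000), F(149, 1250), F(1901, 2000)],
    ]
    M_53_inv = [
        [F(248898325000, 603542646087), F(71938950000, 201180882029),  F(36311670000, 201180882029)],
        [F(128304856250, 603542646087), F(143878592500, 201180882029), F(14525360000, 201180882029)],
        [F(11646692500, 603542646087),  F(23977515000, 201180882029), F(191221850000, 201180882029)],
    ]

    def xyz(M, rgb):
        # Exact matrix-vector product.
        return [sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3)]

    # Largest per-entry disagreement between matrices 1 and 2, in 16-bit LSBs.
    worst = max(abs(a - b) for r1, r2 in zip(M_primaries, M_52)
                           for a, b in zip(r1, r2))
    print(float(worst * 65535))   # ~2.5 LSB at 16 bits, ~0.01 LSB at 8 bits

    # 16-bit XYZ code values for white (R = G = B = 1) under each matrix.
    for M in (M_primaries, M_52, M_53_inv):
        print([round(c * 65535) for c in xyz(M, [F(1), F(1), F(1)])])

The X code value for white comes out a few counts apart across the three, the kind of gap that rounds away at 8 bits but survives at 16.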
Meanwhile, I agree with the author that the ICC's black-point finagling in their published profiles has not helped with the confusion over what exactly sRGB colors are supposed to map to.
But if I embed it in a photo and then open the photo in GraphicConverter, the profile shows up as "sRGB IEC61966-2.1", which to my understanding is identical to Apple’s sRGB Color Space Profile.icm.
But that's an sRGB v2 profile. Should I download and use a v4 profile instead? Or download the ArgyllCMS sRGB.icm [1] and convert all photos to it? Or just select the Apple default sRGB profile everywhere?
I'm not a pro and don't have a calibrated display, but it annoys me when photos I upload online look vastly different in my browser than in my editing software on the same display.
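One way to pin down what a given photo actually carries, rather than what a viewer reports, is to dump the embedded profile's description with a few lines of Pillow (a sketch; photo.jpg is a placeholder path):

    import io
    from PIL import Image, ImageCms

    im = Image.open("photo.jpg")          # placeholder path
    blob = im.info.get("icc_profile")     # raw ICC bytes, if any are embedded
    if blob:
        prof = ImageCms.ImageCmsProfile(io.BytesIO(blob))
        # Prints e.g. "sRGB IEC61966-2.1" for the common v2 profile.
        print(ImageCms.getProfileDescription(prof))
    else:
        print("no embedded profile; most software will assume sRGB")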
There is always a historical reason for a colour profile; sadly, most software avoids the terminology like the plague.