Or you could average the results of many tests. Similar drives, such as a commute, reduce the variability and could yield a reasonable estimate for comparing capacity loss over time. The challenge here is that a commute doesn't completely drain the battery (hopefully!).
What about using the battery charge level as reported by the OnStar app to back out the full battery capacity? By noting charge level percentage and the kWh used in the vehicle display both before and after a drive, you can calculate how much energy the car considers to be in the battery at 100% charge.
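The single-trip arithmetic is simple enough to sketch. This is just an illustration of the calculation described above; the function name and the example numbers are made up, not from my data:

```python
def full_capacity_kwh(pct_before, pct_after, kwh_used):
    """Back out the battery's implied 100% capacity from one drive.

    pct_before, pct_after: charge level (%) from the OnStar app
    kwh_used: energy used per the vehicle display
    """
    pct_drop = pct_before - pct_after
    # kWh used per percentage point, scaled up to 100%
    return kwh_used * 100.0 / pct_drop

# Example: the app shows 87% before and 54% after a drive
# that used 6.1 kWh per the display
print(round(full_capacity_kwh(87, 54, 6.1), 1))  # -> 18.5
```

The weakness is that both percentages are reported as whole numbers, so a 1% rounding error on a short drive swings the estimate quite a bit; longer drives with a bigger percentage drop give tighter numbers.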
Using data I've collected from my car over 22 somewhat similar ~30 mi trips, I calculated a battery capacity of 17.9 kWh with a standard deviation of 0.4 kWh. The car's odometer was between 250 and 1,000 miles over this period.
There is at least one issue with this method. The battery level percentage occasionally increases after the car is turned off as, presumably, the battery recovers slightly. A reading from the app could be 65%, for example, immediately after stopping, but it may increase to 66% within an hour or so. My data was taken immediately after stopping, which could explain why my estimate is slightly low for a nominally 18.4 kWh battery: the 0.5 kWh shortfall is about 3%, in line with the variability seen after the battery recovers.
All this assumes the car is recalibrating the 100% charge level as the battery degrades. I started a separate topic about this particular question here:
Do degraded batteries still show as 100% when full? Of course, if the car didn't recalibrate as the battery degrades, we could just watch the degradation directly, with the car reporting successively lower percentages (98%, 95%, … 92%) as "fully charged." But I don't think that is how the car works.