I like your cumulative case method because it looks like you're using Jeffrey conditioning. The probability of G in light of TAG's uncertain status = P(G/TAG) × P(TAG) + P(G/~TAG) × P(~TAG). The first term would be 1 × .6, and the second would be .4 multiplied by whatever you think P(G) is given all the other evidence. In this case, you can use TAG alongside evidential arguments. That's why I think the Bayesian approach is best: it lets you do things you couldn't do if you were offering TAG as a proof. I don't think Larry would be open to this approach, though.
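To make the arithmetic explicit, here's a minimal sketch of the calculation described above. The numbers are the illustrative ones from the text: P(TAG) = .6 and P(G/TAG) = 1; the value 0.5 for P(G/~TAG) is a hypothetical stand-in for "whatever you think P(G) is given all the other evidence."

```python
def jeffrey_update(p_g_given_tag, p_tag, p_g_given_not_tag):
    """Total probability of G under uncertainty about TAG:
    P(G) = P(G/TAG) * P(TAG) + P(G/~TAG) * P(~TAG)."""
    return p_g_given_tag * p_tag + p_g_given_not_tag * (1 - p_tag)

# With the illustrative numbers: 1 * .6 + 0.5 * .4
p_g = jeffrey_update(p_g_given_tag=1.0, p_tag=0.6, p_g_given_not_tag=0.5)
print(round(p_g, 2))  # 0.8
```

The point of the formula is that TAG's contribution (the first term) and the evidential arguments' contribution (the second term) combine rather than compete.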
My initial argument was offered against someone who uses TAG as a deductive argument, so my point still holds in that context. You offer a good way to sidestep the problem. But even with your method, my main point stands: evidential arguments can make a contribution only if you're open to the possibility that TAG is wrong.
About fine-tuning: I agree for the most part, though there are a few ways I could push back. For what it's worth, there's a long tradition, going back at least to Spinoza, of arguing that P(FT/G) is either 0 or near 0. (Spinoza obviously didn't talk about fine-tuning per se, but he argued that God wouldn't/couldn't create a universe because, given his lack of wants or needs, he'd have no reason to do so.)
Also, be careful with P(FT/~G). The fine-tuning data doesn't say this number is small; it says that the life-permitting range of values is small. If someone says a constant is fine-tuned to one part in 10 billion, they aren't saying there is a 1-in-10-billion chance that the constant would have that value. They are saying that if the value of the constant were changed by one part in 10 billion, then life couldn't exist. Claims about fine-tuning are about the narrowness of the life-permitting range of values a constant could take. Additional argumentation is needed to show that it's improbable that a constant would take a value in that range. To say that P(FT/~G) is low, you need to make some philosophical assumptions, such as adopting the principle of indifference and rejecting the axiom of countable additivity.
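The gap between narrowness and improbability can be illustrated with a toy calculation. Assume (hypothetically) a constant whose life-permitting window has width 1 in some unit. Even granting a principle-of-indifference (uniform) prior, the probability of landing in that window depends entirely on the size of the comparison range you choose, and the fine-tuning data alone supplies no such range; on an unbounded range, no countably additive uniform prior exists at all.

```python
def prob_in_window(window_width, total_range):
    """Uniform-prior probability that the constant lands in the
    life-permitting window, relative to a chosen comparison range."""
    return window_width / total_range

# Same narrow window, very different probabilities, depending on
# an assumption the fine-tuning data itself does not provide:
for total in (10, 10_000, 10_000_000_000):
    print(total, prob_in_window(1, total))
```

So "the window is narrow" only becomes "the probability is low" once a comparison range and a measure over it are argued for philosophically.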
[See summary page of this discussion, with links to all the posts, here.]