If the bounds are close enough, we'd only need two, since solving for any one of the unknowns would be enough. Otherwise it depends on how tight the bounds are.
It looks like the (x+n) square-base u values for the squared c between i[t] elements get very close to BigN - n. Not close enough that it seems usable for calculation, but probably close enough to be worth noting. It also doesn't seem like there's always a value below BigN - n, but there always seems to be one above. Is this one of the upper bounds you were talking about?
I don't think there's anything to this. I made graphs comparing u - (BigN - n) to u - BigN, and they're practically identical. This pattern is more a case of C_F2's u being close to BigN than of it being close to BigN - n. That's still somewhat interesting, I suppose, but I don't think it helps; if anything, it would just add extra iterations to an iterative search for n. I could be wrong, though.
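To see why those two graphs come out nearly identical: the two differences are separated by exactly n, which is negligible at the scale of BigN. A minimal sketch, using made-up placeholder values for BigN, n, and u (none of these numbers come from the actual problem):

```python
# Hypothetical values: BigN and n are assumed integers with n much smaller
# than BigN, and u is a value slightly above BigN, as observed.
BigN = 10**12
n = 137
u = BigN + 5000

d1 = u - (BigN - n)  # distance from u to BigN - n
d2 = u - BigN        # distance from u to BigN

# The two differences are separated by exactly n, which is tiny relative
# to BigN, so plots of d1 and d2 are visually indistinguishable.
assert d1 - d2 == n
print(d1, d2, d1 - d2)
```

So any plot of u - (BigN - n) is just the plot of u - BigN shifted up by a constant n, which is why comparing the two graphs can't distinguish "close to BigN - n" from "close to BigN".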
Purely out of curiosity, if we keep going the way we have been for the last week, how much longer do you think it'll take? I'm not trying to rush anything or come across as impatient, but I can't help wondering.
How can a grid of static numbers be represented with rates of change?