I'm not sure of this, but I think it's based on the clock frequency. You might want to run a few tests to see what it is at 166, 266, and 333MHz. Then you can generate your own constants. If you find out what they are offhand, please post them here.
-PMF
"I'm a little source code, short and stout
Here is my input, here is my out."
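If anyone wants to run that test, here's a rough sketch. Untested: it assumes the unofficial PSP SDK's `scePowerSetClockFrequency(pll, cpu, bus)` and `sceRtcGetTickResolution()`, whose exact signatures and argument order may differ across SDK versions, so check your headers before building.

```c
// Sketch only -- assumes unofficial PSP SDK; verify signatures in your headers.
#include <pspkernel.h>
#include <pspdebug.h>
#include <psprtc.h>
#include <psppower.h>

PSP_MODULE_INFO("TickResTest", 0, 1, 0);

int main(void)
{
    pspDebugScreenInit();

    int speeds[] = { 166, 266, 333 };
    int i;
    for (i = 0; i < 3; i++) {
        // Bus clock is conventionally half the CPU clock.
        scePowerSetClockFrequency(speeds[i], speeds[i], speeds[i] / 2);
        u32 res = sceRtcGetTickResolution();  // ticks per second
        pspDebugScreenPrintf("CPU %d MHz: tick resolution = %lu\n",
                             speeds[i], (unsigned long)res);
    }

    sceKernelSleepThread();
    return 0;
}
```

If the resolution prints the same value at all three speeds, that would confirm the RTC ticks are clock-independent.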
When I looked at sceRtcGetTickResolution(), it returned something like 1000000, which sounds like an oddly round number for 222MHz. I suspect that if the timer is cycle-based, the RTC compensates for this in its calculations.
My guess is that the tick resolution should be independent of the CPU clock (always the same). Modifying the CPU clock only means that a given task takes more or fewer cycles; it has nothing to do with resolution. The same is true for Pentium CPUs anyway.