Sometimes I like, or need, to turn things on their heads to look at them from a different angle. Most photographers have likely come across the concept of the diffraction-limited aperture at some point while researching lenses.
The same concept can be flipped around to compute the maximum effective resolution a given aperture can produce on a given size sensor.
I would note there’s a certain amount of fuzziness to this, as the standard diffraction-limited aperture calculation isn’t necessarily accurate in practice. It’s not wrong, per se. What it does is work from the fundamental assumption that the sensor’s actual resolving power is set by its pixel pitch. That holds true for a monochrome sensor, a Foveon-style sensor that stacks all three colors vertically at each pixel, or a 3-chip system, in each case without an optical low-pass filter.
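As a minimal sketch of that textbook calculation, the following assumes the sensor resolves at exactly its pixel pitch (the assumption just discussed) and uses the Rayleigh criterion, under which two points are just resolvable when separated by the Airy disk radius, 1.22 × wavelength × f-number. The 550 nm default and the example pixel pitch are illustrative assumptions, not values from this article.

```python
def diffraction_limited_aperture(pixel_pitch_um, wavelength_nm=550):
    """f-number at which the Airy disk radius equals the pixel pitch.

    Assumes the sensor's resolving power equals its pixel pitch, which
    (as noted above) only really holds for monochrome, Foveon-style,
    or 3-chip sensors without an optical low-pass filter.
    """
    wavelength_um = wavelength_nm / 1000.0  # convert nm to um
    return pixel_pitch_um / (1.22 * wavelength_um)

# e.g. a ~24 MP full-frame sensor has roughly 3.9 um pixels
print(round(diffraction_limited_aperture(3.9), 1))  # ~f/5.8
```

Note that some calculators compare the full Airy disk diameter against two pixel widths instead, which lands in the same ballpark; the choice of criterion is part of the fuzziness.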
On the other hand, Bayer-pattern sensors, and virtually all sensors with optical low-pass filters, can’t actually resolve at their native resolution. The Bayer pattern alone reduces the usable resolving power to somewhere between the full native resolution and half of it, depending on how good the debayering algorithm is. With a lower actual resolving power, the f-number at which diffraction becomes a problem increases.
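One way to sketch that adjustment: scale the physical pixel pitch by a debayering factor somewhere between 1.0 (ideal recovery) and 2.0 (worst case) before applying the standard formula. The 1.5 default below is purely an illustrative assumption, not a value from this article.

```python
def adjusted_dla(pixel_pitch_um, bayer_factor=1.5, wavelength_nm=550):
    """Diffraction-limited aperture with the sensor's effective pitch
    inflated by a debayering quality factor (1.0 = ideal, 2.0 = worst).

    A larger effective pitch means diffraction only becomes a problem
    at a higher f-number, as described above.
    """
    effective_pitch_um = pixel_pitch_um * bayer_factor
    return effective_pitch_um / (1.22 * wavelength_nm / 1000.0)

# The same 3.9 um pixels tolerate a smaller aperture once the
# Bayer pattern's resolution loss is accounted for:
print(round(adjusted_dla(3.9, 1.0), 1))  # no loss
print(round(adjusted_dla(3.9, 1.5), 1))  # assumed mid-range loss
```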
Which brings me to this tool: instead of computing the diffraction-limited aperture, I’m computing the effective resolution of a given aperture on various sensor sizes for red, green, and blue light. If the “image resolution” is higher than the resolution of your camera, then you’re not diffraction limited. If the resolution is lower, then you are.
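The flipped calculation can be sketched as follows. This treats the Rayleigh spot spacing (1.22 × wavelength × f-number) as an effective pixel pitch and counts how many such spots fit on the sensor; the sensor dimensions, wavelengths, and f/16 example are my own illustrative assumptions, and a different resolution criterion (e.g. the full Airy disk diameter) would give different numbers.

```python
# Illustrative sensor dimensions in mm (width, height) -- assumed values.
SENSORS_MM = {
    "full frame": (36.0, 24.0),
    "APS-C":      (23.6, 15.7),
    "m4/3":       (17.3, 13.0),
}
# Representative wavelengths for each color channel -- assumed values.
WAVELENGTHS_NM = {"red": 650, "green": 550, "blue": 450}

def effective_megapixels(width_mm, height_mm, f_number, wavelength_nm):
    """Effective resolution a given f-number can deliver on a sensor,
    treating the Rayleigh-resolvable spot spacing as a pixel pitch."""
    spot_mm = 1.22 * (wavelength_nm * 1e-6) * f_number  # nm -> mm
    return (width_mm / spot_mm) * (height_mm / spot_mm) / 1e6

for name, (w, h) in SENSORS_MM.items():
    for color, wl in WAVELENGTHS_NM.items():
        print(f"{name} @ f/16, {color}: "
              f"{effective_megapixels(w, h, 16, wl):.1f} MP")
```

Reading the output the same way as the tool: if the printed megapixel figure exceeds your camera’s native resolution, that aperture isn’t diffraction limiting you for that color; if it’s lower, it is. Blue light, with its shorter wavelength, always comes out ahead.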