For FM there would be little to no audible benefit on a real transmission, even with a reference-grade tuner and playback gear. In 99.999% of actual listening situations, nah. No difference anyone would ever notice.
But if you can output to analog from 24-bit or 32-bit, you should still use the highest quality you can, unless something is technically wrong/broken with a specific model/device. 🙂 I haven't yet heard of a sound card sounding worse with increased bit depth.
Swedish National Radio, one of the most respected FM networks for quality, uses a custom codec for its studio-to-transmitter links which runs at 160 kHz, 12 bit, mono. 🙂 Zero complaints about the sound quality, from anyone, or it wouldn't even have made it on air.
That doesn't mean you can't be slightly crazy about your sound quality when it's free and easy. So yeah, crank it up! 8)
Now… the input into a processor should always be as high a bit depth as you can get, and yes, it matters A LOT. The difference between a 16-bit and a 24-bit input into a processor (from a 24-bit studio path, of course) can be audible on FM in some cases. On HD/streaming/etc. it can be audible a lot of the time, even if it's only used for the Billboard Hot 100. 😛
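A quick back-of-envelope for why the numbers above shake out the way they do: the textbook SNR of an ideal N-bit converter on a full-scale sine is roughly 6.02·N + 1.76 dB. A minimal sketch (just the rule of thumb, not a measurement of any real converter):

```python
# Theoretical quantization SNR of an ideal N-bit converter,
# full-scale sine input: SNR ≈ 6.02*N + 1.76 dB (rule of thumb).
def quant_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (12, 16, 24):
    print(f"{bits}-bit: ~{quant_snr_db(bits):.0f} dB")
# 12-bit: ~74 dB, 16-bit: ~98 dB, 24-bit: ~146 dB
```

So even 12 bit already has a noise floor around 74 dB down, which is below what an FM transmission path delivers over the air anyway, while the extra headroom of 24 bit matters once you start doing heavy processing on the signal.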