A DC/DC converter IC that I am looking into has an input voltage range of 7 to 35 V and an output voltage range of 0.8 V to Vin. Can it be used under a condition with a large difference between input and output voltages, namely a 35 V input and a 0.8 V output?
Whether this is possible can be calculated from the IC's minimum on-time specification. For example, if the minimum on-time is 250 ns, then the minimum duty ratio at a switching frequency of 100 kHz is 250 ns/(1/100 kHz)×100 = 2.5%. The minimum output voltage for a 35 V input is therefore 35 V×2.5% = 0.875 V, which is above the 0.8 V target, so a 0.8 V output cannot be regulated under these conditions. If the IC is of the type that allows the switching frequency to be selected, this can be dealt with by lowering the switching frequency until the required duty ratio (0.8 V/35 V ≈ 2.3%) exceeds the minimum duty ratio, i.e. roughly below 91 kHz in this example. However, a lower switching frequency requires larger inductor and output capacitor values.
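The arithmetic above can be sketched as a small helper script. This is only an illustration of the calculation method described here, not vendor code; the 250 ns minimum on-time and 35 V input are the example values from the answer.

```python
def min_output_voltage(vin, t_on_min, f_sw):
    """Minimum achievable buck output voltage.

    The smallest duty ratio the controller can produce is
    D_min = t_on_min * f_sw, so Vout_min = Vin * D_min.
    """
    d_min = t_on_min * f_sw
    return vin * d_min

def max_switching_freq(vin, vout, t_on_min):
    """Highest switching frequency at which the required duty
    ratio (Vout/Vin) is still at or above D_min."""
    return (vout / vin) / t_on_min

vin = 35.0        # input voltage [V]
t_on = 250e-9     # minimum on-time [s]

# At 100 kHz the minimum output is 0.875 V -- above the 0.8 V target.
print(f"Vout_min @ 100 kHz: {min_output_voltage(vin, t_on, 100e3):.3f} V")

# Frequency must be lowered to about 91 kHz or below for 0.8 V out.
print(f"Max f_sw for 0.8 V: {max_switching_freq(vin, 0.8, t_on)/1e3:.1f} kHz")
```

Running this reproduces the 0.875 V figure from the text and shows the frequency ceiling for a regulated 0.8 V output.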