Regardless of whether the customer gives the load of each device in amps or VA, everything must be converted into watts. Converting amps to VA is easy: simply multiply the amps by the nominal voltage of the device. For example, a 2.5 A device that runs on 120 V power has a rating of 300 VA. Converting VA to watts is a little more difficult, because it depends on the power factor of the device. Power factor (PF) is the ratio of watts to VA (real power to apparent power), and the problem is that it varies from device to device. Devices found in a datacenter environment (servers, switches, etc.) generally use power factor corrected (PFC) power supplies with a typical power factor of 1.0, often called unity; in these instances, the VA and wattage are identical. Devices found in a desktop environment generally have power factors between 0.6 and 0.8.

Selecting the appropriate power factor is critical to an accurate UPS recommendation. If one uses a power factor that is too low for a particular device, multiplying it by the VA yields a wattage that is artificially low. If the calculated wattage is lower than the actual wattage, the user runs the risk of overloading the UPS. For simplicity's sake, a power factor of 1.0 is always the safest value to enter.
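To make the arithmetic concrete, the sketch below shows both conversions as a small, hypothetical Python helper (the function names and structure are illustrative, not part of any vendor tool). It defaults to a power factor of 1.0, which is the conservative choice described above.

```python
def amps_to_va(amps: float, volts: float) -> float:
    """Apparent power (VA) is simply amps times the nominal voltage."""
    return amps * volts


def va_to_watts(va: float, power_factor: float = 1.0) -> float:
    """Real power (W) is apparent power scaled by the device's power factor.

    Defaulting to a power factor of 1.0 (unity) is the safe choice for UPS
    sizing: it yields the highest wattage estimate, so the UPS is not
    undersized when the true power factor is unknown.
    """
    if not 0.0 < power_factor <= 1.0:
        raise ValueError("power factor must be in (0, 1]")
    return va * power_factor


if __name__ == "__main__":
    # The 2.5 A, 120 V device from the text: 2.5 A x 120 V = 300 VA.
    va = amps_to_va(2.5, 120.0)
    print(f"{va:.0f} VA")                  # 300 VA

    # With a PFC (unity) power supply, watts equal VA.
    print(f"{va_to_watts(va):.0f} W")      # 300 W

    # A desktop-class device at PF 0.7 draws fewer real watts; assuming
    # this lower PF for a unity device would understate the load by 90 W.
    print(f"{va_to_watts(va, 0.7):.0f} W") # 210 W
```

The last two lines illustrate the risk described above: assuming a 0.7 power factor for a device that actually runs at unity understates the load by 30 percent, which is exactly how a UPS ends up overloaded.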