Is there any reason the spec couldn’t say all cables deliver 5V and the receiving end converts that to what it needs? From my understanding of electronics, if you double the voltage you halve the amperage for the same power. So supply 5V@20A and let the receiving end convert that to 20V@5A.
Less current means less heat generated in the wires (resistive loss scales with the square of the current), so you can use thinner copper. Higher voltage only requires better insulation on the wires. Altogether I think it's a much better idea to use 20V at 5A for 100W rather than 5V at 20A: for the same size wire you get far more heat and voltage loss at 20A.
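To put rough numbers on that, here's a quick back-of-the-envelope sketch of the I²R loss in a cable delivering 100W. The 0.05 Ω round-trip wire resistance is an assumed, illustrative value, not a figure from any spec.

```python
# Compare resistive (I^2 * R) heat loss in the same cable at two voltages.
WIRE_RESISTANCE_OHMS = 0.05  # assumed round-trip cable resistance (illustrative)

def cable_loss(voltage, power=100.0, resistance=WIRE_RESISTANCE_OHMS):
    """Return (current, watts dissipated in the cable) for a given supply voltage."""
    current = power / voltage          # I = P / V
    loss = current ** 2 * resistance   # P_loss = I^2 * R
    return current, loss

for volts in (5.0, 20.0):
    amps, heat = cable_loss(volts)
    print(f"{volts:>4.0f} V @ {amps:4.1f} A -> {heat:5.2f} W lost as heat in the cable")

# Output:
#    5 V @ 20.0 A -> 20.00 W lost as heat in the cable
#   20 V @  5.0 A ->  1.25 W lost as heat in the cable
```

Quadrupling the voltage cuts the current to a quarter and the cable heating to a sixteenth, whatever the actual resistance happens to be.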
Cables are limited by both current and voltage. The thickness of the copper limits the current, and the thickness of the insulation limits the voltage. 20A is a massive amount of current and isn't easily carried by anything flexible (typical wires in the walls of your house are rated for only 15A, and they are quite bulky). 20V, on the other hand, isn't that high and needs very little insulation.
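And the voltage-drop side of the same story, reusing the assumed 0.05 Ω cable from above (again an illustrative number, not a spec value): at 20A you lose a big chunk of a 5V supply before it even reaches the device, while at 5A the drop is a rounding error on 20V.

```python
# Voltage drop along the same assumed 0.05-ohm cable, as a share of the supply.
WIRE_RESISTANCE_OHMS = 0.05  # assumed round-trip cable resistance (illustrative)

for volts, amps in ((5.0, 20.0), (20.0, 5.0)):
    drop = amps * WIRE_RESISTANCE_OHMS  # V_drop = I * R
    print(f"{volts:>4.0f} V @ {amps:4.1f} A -> {drop:4.2f} V dropped "
          f"({drop / volts:.1%} of the supply)")

# Output:
#    5 V @ 20.0 A -> 1.00 V dropped (20.0% of the supply)
#   20 V @  5.0 A -> 0.25 V dropped (1.2% of the supply)
```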