The last time this exact scenario happened to me, it was the cabling.
The formula for power loss is:
P = I²R, where P is the power lost, I is the current, and R is the resistance.
The power lost increases proportionally with the resistance of the cable. So a cable can work perfectly well for data but, due to resistance, lose enough power that it cannot properly power whatever is plugged into its far end.
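To put rough numbers on it, here is a minimal sketch for the case where the network cable is also powering the device (i.e. PoE). The Cat5e resistance figure and the 350 mA draw are my assumptions, roughly the 802.3af limits, not numbers from my actual case:

```python
# Rough I^2*R loss estimate for a network cable powering a device over PoE.
# Assumed figures (illustrative, not measurements):
#   - Cat5e DC resistance: ~9.38 ohms per 100 m per conductor (TIA spec limit)
#   - device current: 350 mA (the 802.3af maximum)

def poe_cable_loss_watts(length_m, current_a, ohms_per_100m=9.38):
    per_conductor = ohms_per_100m * length_m / 100.0
    # PoE sends current out on one pair and back on another; the two
    # conductors of each pair share the current, halving the resistance.
    loop_resistance = 2 * (per_conductor / 2)
    return current_a ** 2 * loop_resistance  # P = I^2 * R

for length in (20, 50, 100):
    lost = poe_cable_loss_watts(length, 0.35)
    print(f"{length:>3} m: {lost:.2f} W lost in the cable")
```

A watt or so lost at 100 m is within the spec's budget, but a poorly terminated or substandard cable can have much higher resistance, and that's where marginal devices start dropping out.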
Using a cable with a larger copper core diameter reduces resistance. Using a shorter cable reduces resistance. Sometimes the ends are poorly fitted, causing increased resistance.
But often you just have to replace the cable with another one.
<rant on>
Let me go completely off-topic for a bit. This has big implications in the non-networking world that many people don't realise.
Let's say I have a 30m extension cord. I plug my electric saw into this, and the cord into a power point. Let's say I live in a country that uses 240V. If I buy the cheapest extension lead with a core size of 1.0mm² instead of an extension lead with a core size of 2.5mm², I'll lose about 7% to 8% of the power to my saw; in other words, my saw will be about 8% less powerful.
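You can sanity-check that figure yourself. A minimal sketch, assuming copper resistivity of about 0.0172 Ω·mm²/m and a saw drawing a fixed current (both assumptions for illustration; real motor current varies with load):

```python
# Estimate the power lost in a 30 m extension cord at 240 V.
# Assumed: copper resistivity ~0.0172 ohm*mm^2/m, fixed load current.
RESISTIVITY = 0.0172  # ohm * mm^2 / m

def cord_loss_percent(length_m, core_mm2, current_a, volts=240.0):
    # Current flows out and back, so the conductor length is doubled.
    resistance = RESISTIVITY * (2 * length_m) / core_mm2
    lost = current_a ** 2 * resistance   # P = I^2 * R
    drawn = volts * current_a            # approx. power drawn at the wall
    return 100.0 * lost / drawn

for core in (1.0, 2.5):
    for amps in (10, 15):
        pct = cord_loss_percent(30, core, amps)
        print(f"{core} mm^2 cord, {amps} A load: {pct:.1f}% lost")
```

At 10 A that works out to roughly a 4% loss in the thin cord, and if the saw's output scales roughly with the square of its supply voltage (typical for a universal motor), a ~4% voltage drop costs around 8% of the saw's power, which lines up with the figure above.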
Let's say I was instead driving a motor that has to run at a fixed speed, such as a compressor. An induction motor held at a fixed speed draws more current to deliver the same torque when the voltage sags, and that extra current heats the windings. The compressor motor will now run hot and, over a long period of time, will damage itself.
Let's say I am putting a big machine into a factory. Using the smallest core size allowed by regulations might make my big machine run properly, but the resulting power loss (compared to using a wire with a larger core size) means that machine will use more electricity than it needed to for the rest of its life.
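A back-of-the-envelope sketch of what that can cost (every number here is invented for illustration; substitute your own loss, duty cycle, and tariff):

```python
# Lifetime cost of the extra I^2*R loss from an undersized feeder cable.
# All figures are invented example values, not from any real installation.
extra_loss_kw = 0.5      # extra loss of the thin cable vs a thicker one
hours_per_year = 4000    # machine duty
tariff = 0.25            # $ per kWh
years = 20               # machine service life

wasted_kwh = extra_loss_kw * hours_per_year * years
print(f"Extra energy wasted: {wasted_kwh:,.0f} kWh "
      f"(~${wasted_kwh * tariff:,.0f} over {years} years)")
```

With those example numbers the waste comes to 40,000 kWh, around $10,000, which can easily exceed the one-off cost of running heavier cable in the first place.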
<rant off>