I thought since R is fixed, increasing Voltage would increase I, the current.
I think you're mixing up Ohm's Law with the power equation.
First, some definitions:
Ohm's Law
V=iR
This equation relates the voltage drop across a resistance to the current flowing through it.
Voltage drop = (current in the wire) x (resistance in the wire)
V(lost) = i(wire) x R(wire)
P=iV
This equation can have two meanings:
Power transmitted = (current in the wire) x (voltage across the wire)
P(wire) = i(wire) x V(wire)
or
Power lost = (current in the wire) x (voltage lost due to resistance)
P(lost) = i(wire) x V(lost)
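To see these definitions in action, here's a quick sanity check in Python. The numbers (a 0.5 ohm wire carrying 10 A) are made up purely for illustration:

```python
# Hypothetical example values, just to illustrate the definitions above.
i_wire = 10.0   # current in the wire (A)
r_wire = 0.5    # resistance of the wire (ohms)

v_lost = i_wire * r_wire   # Ohm's Law: V(lost) = i(wire) x R(wire)
p_lost = i_wire * v_lost   # P(lost) = i(wire) x V(lost)

print(v_lost)  # 5.0 V dropped across the wire
print(p_lost)  # 50.0 W dissipated as heat in the wire
```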
---------------------------------------------------------------------
Now, I'll deal with this problem in two parts:
1) Show that the current drops due to the increased voltage
2) Show that a decreased current will lead to a decrease in the power lost
1) P(wire) = i(wire) x V(wire)
The power transmitted is fixed: it is set by how much power the consumers at the other end demand, not by the wire itself. With P(wire) fixed, there is an inverse relationship between the current in the wire and the voltage being transmitted across it, so increasing one decreases the other. This means that an increase in the transmission voltage causes a decrease in the current flowing through the wire.
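The inverse relationship in part 1 can be checked numerically. Assume a hypothetical 1 MW of transmitted power and compare two transmission voltages (the values are invented for illustration):

```python
# Hypothetical scenario: the same 1 MW delivered at two different voltages.
p_transmitted = 1_000_000.0  # fixed power demanded by consumers (W)

i_low_v  = p_transmitted / 10_000.0   # current at 10 kV:  i = P / V
i_high_v = p_transmitted / 100_000.0  # current at 100 kV: i = P / V

print(i_low_v)   # 100.0 A
print(i_high_v)  # 10.0 A -- ten times the voltage, one tenth the current
```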
2) P(lost) = i(wire) x V(lost)
Now we're talking about the power lost to the voltage drop caused by the wire's resistance. This is where Ohm's Law comes in.
V(lost) = i(wire) x R(wire).
P(lost) = i(wire) x i(wire) x R(wire) = i(wire)^2 x R(wire)
Since the power lost scales with the square of the current, the lower current from part 1 means much less power is lost in the wire: ten times the voltage gives one tenth the current, and therefore one hundredth the loss.
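Continuing the same hypothetical numbers, here is the squared effect in part 2: a wire with an assumed 1 ohm of resistance, carrying the two currents from the voltage comparison above:

```python
# Hypothetical 1 ohm wire; compare losses at 100 A (low voltage)
# versus 10 A (high voltage) using P(lost) = i(wire)^2 x R(wire).
r_wire = 1.0

p_lost_low_v  = 100.0 ** 2 * r_wire  # loss when transmitting at the lower voltage
p_lost_high_v = 10.0 ** 2 * r_wire   # loss when transmitting at the higher voltage

print(p_lost_low_v)   # 10000.0 W
print(p_lost_high_v)  # 100.0 W -- one tenth the current, one hundredth the loss
```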
Hope that helps.