Not to be argumentative here, but 72W is 7.2% of the peak/ideal generation power of your solar (a 1000W array, so call it close to a tenth). Also recall that recharging a battery takes roughly 125% of the energy you drew from it, since charging isn't 100% efficient. That means replacing the 72 watt-hours (72Wh) used over an hour takes about 90Wh of charging. And if the load is drawing power while the battery is also recharging, that's another 72W unavailable to any other load. So running a 72W load for an hour while recharging the battery that supplied that same load during a past hour actually pulls about 160W from your solar panel (about 16% of your total peak/ideal generation capacity).
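
To make that arithmetic concrete, here's a minimal sketch in Python, assuming the 1000W array implied by those percentages and the ~125% recharge factor above (the constant names are just placeholders):

```python
# Energy arithmetic from the example above, under assumed round numbers.
PANEL_PEAK_W = 1000          # assumed peak/ideal generation (implied by 72W = 7.2%)
LOAD_W = 72                  # inverter standby/load draw
CHARGE_FACTOR = 1.25         # ~125% of drawn energy needed to recharge

# Watt-hours of charging needed to replace one hour of load
replace_wh = LOAD_W * CHARGE_FACTOR           # 90 Wh
# Running the load while recharging the previous hour's use
combined_w = LOAD_W + replace_wh              # ~162 W, call it 160

print(f"replacement charge: {replace_wh:.0f} Wh")
print(f"run + recharge draw: {combined_w:.0f} W "
      f"({combined_w / PANEL_PEAK_W:.0%} of peak)")
```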
Introduce cloud cover on a single generation "plant" and you can see output variability of +/-15% to +/-40%. That means you could see as little as 600W generated due to environmental causes, and the 160W run-plus-charge draw starts to hurt: it's about 27% of what you're actually generating (round it up to a third for dramatic effect).
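
Extending the same sketch to the cloud-cover derating (constants repeated so it runs on its own):

```python
# How the ~160W run + recharge draw looks against cloud-derated output.
PANEL_PEAK_W = 1000
COMBINED_W = 72 * 1.25 + 72                   # ~162 W from the example above

for derate in (0.15, 0.40):                   # the +/-15% to +/-40% range
    available_w = PANEL_PEAK_W * (1 - derate)
    print(f"{derate:.0%} derate -> {available_w:.0f} W available, "
          f"load is {COMBINED_W / available_w:.0%} of actual generation")
# At a 40% derate: ~160 W against 600 W generated is about 27%.
```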
And my point again here is that if your inverter were switched off for the period it would otherwise sit "ready to support a load", the draw-and-charge impact for that same two-hour example would be 0W, and you'd get that 27% of your generation back. If you start looking at whole-system cost (panels, charger, batteries, the fuel to haul the whole thing based on its size and weight), you might find that spending an hour reconfiguring a relay and an inverter so the inverter only draws power when it's actually needed is time well spent, versus jacking up the size of the system, running a generator to make up the shortfall, or limiting your total run-time.
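
For what that relay reconfiguration might look like in logic terms, here's a rough Python sketch; drive_relay() and the simulated demand list are hypothetical stand-ins for whatever sense input and switching hardware you actually have, not any specific product's API:

```python
# Energize the inverter's DC feed only while something actually asks for
# AC, so its ~72W overhead disappears during idle hours.

def drive_relay(energize: bool) -> None:
    """Placeholder: close/open the relay on the inverter's DC input."""
    print("inverter DC feed:", "CLOSED (drawing)" if energize else "OPEN (0W)")

def run_controller(demand_samples) -> None:
    """Match inverter state to demand; switch only on state changes."""
    state = False
    for wanted in demand_samples:
        if wanted != state:      # real hardware would add debounce/hysteresis
            drive_relay(wanted)
            state = wanted

# Simulated hourly demand: idle, load for two hours, idle again.
run_controller([False, True, True, False])
```

The point of the state-change check is that the relay (and the inverter's standby draw) only comes alive for the hours a load actually exists, instead of sitting energized around the clock.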