There is often quite a lot of confusion about how transformers actually work; for example, the belief that increasing the secondary load can cause the core to magnetically saturate, when this is not the case. The core losses (hysteresis and eddy currents during the magnetic cycle) remain fairly constant and independent of the load. It works like this:
In the off-load state (voltage applied to the primary and no secondary load), the magnetic flux cycle in the core, and the current that produces it, lags the applied voltage by 90 degrees. If there were no losses, the current would be in phase with the flux, but since there is some resistance, hysteresis and eddy current loss (all of which waste energy as heat), there must be a small component of the current in phase with the applied voltage. So the no-load primary current leads the flux by a small angle and can therefore be split into two components: a "wattless" magnetizing current in phase with the flux, and a loss current in phase with the applied voltage.
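As a rough illustration of that split, here is a minimal Python sketch using hypothetical no-load measurements (all the figures below are assumptions, not values from the text): the loss component is the no-load input power divided by the applied voltage, and the magnetizing component is what remains in quadrature.

```python
import math

# Hypothetical no-load measurements on a small mains transformer
V = 230.0    # applied primary voltage (V rms)
I0 = 0.040   # no-load primary current (A rms)
P0 = 3.5     # no-load input power (W): iron losses plus a tiny copper loss

# Component in phase with the voltage (supplies the core losses)
I_loss = P0 / V

# Quadrature ("wattless") component that sets up the flux
I_mag = math.sqrt(I0**2 - I_loss**2)

phase_lag = math.degrees(math.acos(I_loss / I0))  # angle of I0 behind V

print(f"loss component:        {I_loss*1000:.1f} mA")
print(f"magnetizing component: {I_mag*1000:.1f} mA")
print(f"no-load current lags the voltage by {phase_lag:.1f} degrees")
```

With these assumed numbers the no-load current lags the voltage by roughly 68 degrees, i.e. it is mostly magnetizing current with a small in-phase loss component, as described above.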
When a load is placed on the secondary, current flows in the secondary winding, and the magnetic field from this current acts to reduce the flux set up by the primary winding, not increase it as some people believe. It only takes a small reduction in primary flux (and hence in the back-EMF of the primary) to allow full load current to flow in the primary. Therefore, there is little error in assuming that the transformer's main flux remains constant between no-load and full-load conditions, and the magnetic effect of the secondary current is immediately neutralized by the appearance of a corresponding component in the primary current.
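To make that ampere-turn balance concrete, here is a small sketch under assumed figures (the 10:1 turns ratio, load current and no-load components are hypothetical): the primary picks up a load component of roughly I2 × N2/N1, which neutralizes the secondary MMF.

```python
import math

# Hypothetical 230 V : 23 V transformer (turns ratio 10:1)
N1_over_N2 = 10.0
I2 = 2.0          # secondary load current (A rms)
I_mag = 0.037     # magnetizing component from the no-load state (A rms)
I_loss = 0.015    # core-loss component (A rms)

# Ampere-turn balance: the primary draws an extra component that
# neutralizes the secondary MMF: I1' = I2 * (N2/N1)
I1_load = I2 / N1_over_N2

# Rough phasor sum, assuming a resistive load: the load and loss
# components are in phase with the voltage, magnetizing in quadrature
I1_total = math.hypot(I1_load + I_loss, I_mag)
print(f"referred load component: {I1_load:.3f} A")
print(f"total primary current:  ~{I1_total:.3f} A")
```

Note that the magnetizing component is unchanged from the no-load case; the load simply adds a neutralizing component on top of it.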
Assuming that, off load, a mains transformer draws a sensible magnetizing current and the peak flux is not too high, you can forget about the transformer core when it comes to the maximum load the transformer can tolerate. Any power transformer for a given mains voltage, frequency and core material should be designed so that the fixed losses from eddy currents and hysteresis, and the off-load primary current, are not excessive.
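As an illustration of "peak flux not too high", the standard transformer EMF equation V_rms = 4.44 · f · N · A · B_peak gives the off-load peak flux density. The numbers below are assumptions, not from any particular transformer. Note that nothing in this formula depends on the load, which is the point of the preceding paragraphs.

```python
# Hypothetical figures for a small E-I mains transformer
V_rms = 230.0      # primary voltage (V)
f = 50.0           # mains frequency (Hz)
N1 = 800           # primary turns
A_core = 11e-4     # effective core cross-section (m^2)

# Transformer EMF equation: V_rms = 4.44 * f * N * A * B_peak
B_peak = V_rms / (4.44 * f * N1 * A_core)
print(f"peak flux density: {B_peak:.2f} T")
# A sane design keeps this well below saturation
# (roughly 1.5-1.7 T for ordinary transformer steel)
```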
However, the thing that limits a transformer's ability to deliver more than a certain amount of power is the winding DC resistance. These are the copper losses, otherwise known as I²R losses because they increase with the square of the current. Larger transformers can simply have thicker wire and therefore lower copper losses for any given load. When you overload a transformer, its winding temperature will simply rise higher than it was designed for and the transformer will get hotter. How much of that is acceptable depends on how conservative the initial design was, and on other factors including ventilation, the quality of the enamelled wire and insulation used, etc.
(If you want, you can measure the DC resistance of the primary and secondary windings. To get a single total resistance, you can mathematically refer one winding's resistance to the other by multiplying it by the square of the turns ratio (the impedance ratio). Then, with the known load current I, you can calculate the power loss in the windings as I²R. How much the transformer heats up with that power dissipated in it is a little like a heat-sink calculation: it depends on the average degrees C per watt of the entire transformer body.)
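Here is a minimal sketch of that calculation in Python, with assumed winding resistances, turns ratio and thermal resistance (the degrees-C-per-watt figure in particular is a pure placeholder you would have to measure or estimate for a real transformer):

```python
# Hypothetical measured values for a 230 V : 23 V transformer
R_primary = 12.0       # primary DC resistance (ohms)
R_secondary = 0.30     # secondary DC resistance (ohms)
turns_ratio = 10.0     # N1/N2

# Refer the primary resistance to the secondary side by dividing by
# the square of the turns ratio (the impedance ratio)
R_total_sec = R_secondary + R_primary / turns_ratio**2

I_load = 2.0           # secondary load current (A rms)
P_copper = I_load**2 * R_total_sec   # I^2 R copper loss (W)

# Crude heat-sink style estimate: assumed average thermal resistance
# of the whole transformer body to ambient
theta = 15.0           # degrees C per watt (placeholder assumption)
temp_rise = P_copper * theta
print(f"referred winding resistance: {R_total_sec:.2f} ohms")
print(f"copper loss at {I_load} A:    {P_copper:.2f} W")
print(f"estimated temperature rise: ~{temp_rise:.0f} C above ambient")
```

With these assumed figures the copper loss is about 1.7 W and the winding runs roughly 25 C above ambient; overload the secondary and that rise grows with the square of the current.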
The above remarks of course ignore leakage reactance effects, which could be a whole other topic.