The heat rate of a fossil fuel plant or a nuclear plant measures how much thermal energy (i.e. fuel) it takes to generate a net kWh of electrical energy (net meaning after subtracting the plant's own electricity use). It is a measure of the efficiency of the plant: the lower the heat rate, the more efficient the plant.
In the US, the heat rate is measured in Btu/kWh.
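Since 1 kWh of electrical energy is equivalent to roughly 3,412 Btu of thermal energy, a heat rate converts directly to a thermal efficiency. A minimal sketch (the specific heat rates below are illustrative, not from the source):

```python
BTU_PER_KWH = 3412  # thermal-energy equivalent of 1 kWh of electricity

def efficiency(heat_rate_btu_per_kwh):
    """Convert a heat rate (Btu/kWh) to a thermal efficiency fraction."""
    return BTU_PER_KWH / heat_rate_btu_per_kwh

# Illustrative figures: a modern combined-cycle gas plant near 7,000 Btu/kWh,
# an older steam plant near 10,500 Btu/kWh.
print(f"{efficiency(7000):.1%}")   # -> 48.7%
print(f"{efficiency(10500):.1%}")  # -> 32.5%
```

A hypothetical plant with a heat rate of exactly 3,412 Btu/kWh would be 100% efficient, which is why lower heat rates mean more efficient plants.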
The EIA has a handy page showing the average heat rate across all US plants, per fuel source and per year since 2002, for coal, oil, natural gas and nuclear. Even more interesting, the California Energy Commission publishes heat rate statistics for all of its plants.
Why is heat rate an important concept? As demand peaks in a region, the grid operator must ramp up generation plants with progressively worse heat rates, driving up fuel costs and carbon emissions. Energy storage with the right discharge rate (to meet peak demand), energy capacity (to cover the whole duration of the peak) and cost (to compete with peaker plants) can alleviate the need for such peaker plants.
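The fuel-cost side of this argument can be sketched directly: fuel cost per MWh is just the heat rate times the fuel price. The plant heat rates and the gas price below are illustrative assumptions, not figures from the source:

```python
def fuel_cost_per_mwh(heat_rate_btu_per_kwh, fuel_price_per_mmbtu):
    """Fuel cost of generation in $/MWh.

    heat_rate_btu_per_kwh: Btu of fuel burned per net kWh generated
    fuel_price_per_mmbtu:  fuel price in $ per million Btu (MMBtu)
    """
    # Btu/kWh * 1000 kWh/MWh / 1e6 Btu/MMBtu = MMBtu of fuel per MWh
    mmbtu_per_mwh = heat_rate_btu_per_kwh * 1000 / 1e6
    return mmbtu_per_mwh * fuel_price_per_mmbtu

# Illustrative comparison at $4/MMBtu gas:
baseload = fuel_cost_per_mwh(7000, 4.0)   # efficient combined-cycle plant
peaker = fuel_cost_per_mwh(11000, 4.0)    # less efficient peaking turbine
print(baseload, peaker)  # -> 28.0 44.0
```

A storage operator can charge at off-peak prices near the efficient plant's cost and discharge during the peak, undercutting the worse-heat-rate peaker, which is exactly the cost gap that makes storage competitive.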