
As I recall, the context of that 6-7% was a loss compared to the best-case theoretical scenario.
From memory, the average swamp cooler reaches about 85% to 89% efficiency. Of the 15% to 19% lost, 6% to 7% was due to an uninsulated pan, and the rest was due to the heat input of the motor and pump.
A back-of-the-napkin calc, however, does not show this to be in the ballpark. A 1/3-horsepower motor draws about 300 watts (about 1,023 BTU/h). A typical recirc pump draws about an amp (about 340 BTU/h), for a total of roughly 1,360 BTU/h. Add in that 6-7% uninsulated-pan loss for another 680-ish BTU/h and you have about 2,040 BTU/h of lost cooling capacity. If we figure this loss to be 13% of total output (overall system efficiency of 87%, splitting the difference here), then total cooling output would be 15,692 BTU/h. That can't be right. It's LOW.
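That arithmetic can be sketched in a few lines. The pump wattage is an assumption on my part (roughly 100 W at an amp or so, which matches the ~340 BTU/h figure); the 3.412 BTU/h-per-watt conversion is standard:

```python
# Back-of-the-napkin parasitic-loss estimate for a swamp cooler.
# Assumed inputs (not from any spec sheet): 1/3 hp motor ~ 300 W,
# recirc pump ~ 100 W (consistent with the ~340 BTU/h figure above).
BTUH_PER_WATT = 3.412

motor_btuh = 300 * BTUH_PER_WATT           # ~1,024 BTU/h
pump_btuh = 100 * BTUH_PER_WATT            # ~341 BTU/h
parasitic_btuh = motor_btuh + pump_btuh    # ~1,365 BTU/h

pan_loss_btuh = 680                        # the 6-7% uninsulated-pan loss
total_loss_btuh = parasitic_btuh + pan_loss_btuh   # ~2,045 BTU/h

# If that loss is 13% of total output (87% overall efficiency),
# back out the implied total cooling output:
total_output_btuh = total_loss_btuh / 0.13
print(round(total_output_btuh))            # on the order of 15,700 BTU/h
```

That implied output is the "suspiciously low" number in question.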
Real-world specs on a commercial unit are a 4.3-amp draw, 4,500 CFM, and a 25°F drop with 5 GPH of water used. Just calculating the theoretical BTUs from the water evaporated gives you 36,935 BTU/h at 100% efficiency. That is off by half. Not even close. If we assume the advertisements are right, that 4,500 CFM with a 25°F drop would take 135,000 BTU/h. Again, a huge disconnect.
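The two theoretical numbers can be put side by side. This is a sketch using textbook constants I'm assuming (water at ~8.34 lb/gal, latent heat of vaporization ~1,060 BTU/lb near room temperature, and the common sensible-heat factor of 1.08 BTU/h per CFM per °F), so the exact values differ slightly from the figures quoted above, but the disconnect is the same:

```python
# Water-side vs. air-side theoretical cooling for the quoted unit.
# Constants below are standard textbook values (assumptions, not ad specs).
LB_PER_GAL = 8.34          # density of water, lb per gallon
LATENT_BTU_PER_LB = 1060   # latent heat of vaporization near room temp
SENSIBLE_FACTOR = 1.08     # BTU/h per CFM per deg F, standard air

gph, cfm, delta_t = 5, 4500, 25   # advertised water use, airflow, temp drop

# Maximum cooling the evaporated water can supply:
water_side_btuh = gph * LB_PER_GAL * LATENT_BTU_PER_LB   # ~44,200 BTU/h
# Cooling implied by the advertised airflow and temperature drop:
air_side_btuh = cfm * delta_t * SENSIBLE_FACTOR          # 121,500 BTU/h

print(round(water_side_btuh), round(air_side_btuh))
```

The air-side number is roughly triple what the water consumption can supply, which is exactly the disconnect described above.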
I think I need to go beat myself with a psychrometric table and see where the errors are.

Edit: BINGO! I found the motherlode. See: http://www.piec.com/formulasdef.htm