Allocation rate definition
What is an Allocation Rate?
An allocation rate is the standard amount of overhead applied to a unit of production or other measure of activity. It is used when shifting costs to a cost object, which may be required under the applicable accounting framework (such as GAAP or IFRS) to ensure that a full cost is assigned to inventory. This information then appears on a company’s balance sheet, in the form of fully burdened finished goods. An allocation rate can also be used as part of an internal accounting effort, to ensure that overhead costs are applied throughout a business.
How to Calculate an Allocation Rate
To calculate an allocation rate, divide the total overhead cost incurred by the number of units produced. In many cases, a more precise approach is to derive the overhead cost associated with a subset of the total units produced, and divide by just that subset of units. For example, if a factory has a separate production line for its purple widgets, it should derive an allocation rate from only the overhead costs incurred by that production line, divided by only the units produced by that line. The resulting allocation rate would then be applied only to the units produced by the purple widget production line.
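The calculation itself is a single division; what drives the result is the choice of cost pool and activity base. The short Python sketch below contrasts a plant-wide rate with a line-specific rate for the purple widget line. All dollar and unit figures are hypothetical, chosen only to show the mechanics.

```python
def allocation_rate(overhead_cost, units_produced):
    """Return the overhead cost to apply to each unit of activity."""
    return overhead_cost / units_produced

# Hypothetical figures for illustration only.
# Plant-wide rate: total factory overhead spread over all units produced.
plant_wide_rate = allocation_rate(overhead_cost=100_000, units_produced=20_000)

# Line-specific rate: only the purple widget line's overhead, divided by
# only the units that line produced.
purple_line_rate = allocation_rate(overhead_cost=30_000, units_produced=5_000)

print(f"Plant-wide rate:  ${plant_wide_rate:.2f} per unit")   # $5.00 per unit
print(f"Purple line rate: ${purple_line_rate:.2f} per unit")  # $6.00 per unit
```

The narrower the cost pool, the more closely the rate reflects the resources a product actually consumes, which is why the line-specific rate can differ from the plant-wide figure.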
Examples of Allocation Rate
As an example of an allocation rate, a business has a monthly factory overhead cost pool of $100,000 and routinely produces 20,000 widgets per month. In this case, the allocation rate is $5 per widget, which is calculated as follows:
$100,000 Cost pool ÷ 20,000 Units of production = $5 Allocation rate
As another example, a parent company allocates its corporate overhead to its subsidiaries based on their revenues. Total corporate overhead is $1 million, and the sum total of all revenue generated by the subsidiaries is $100 million. Given these activity levels, the allocation rate is $10,000 per $1 million of revenue (that is, 1% of revenue). Thus, if a subsidiary generates $20 million of revenue, the allocation rate mandates that $200,000 of corporate overhead be applied to that subsidiary.
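The same arithmetic extends to spreading a corporate overhead pool across several subsidiaries. The sketch below reproduces the figures above (a $1 million pool allocated over $100 million of total revenue); the subsidiary names and the way the remaining revenue is split are hypothetical.

```python
def allocate_by_revenue(total_overhead, revenues):
    """Split a corporate overhead pool across units in proportion to revenue."""
    total_revenue = sum(revenues.values())
    rate = total_overhead / total_revenue  # overhead charged per dollar of revenue
    return {name: rate * revenue for name, revenue in revenues.items()}

# $1 million of corporate overhead and $100 million of total subsidiary revenue,
# as in the example above. The subsidiary names and revenue split are hypothetical.
revenues = {"Sub A": 20_000_000, "Sub B": 30_000_000, "Sub C": 50_000_000}
charges = allocate_by_revenue(total_overhead=1_000_000, revenues=revenues)

for name, charge in charges.items():
    print(f"{name}: ${charge:,.0f}")
# Sub A: $200,000  (the subsidiary with $20 million of revenue)
# Sub B: $300,000
# Sub C: $500,000
```

Because the same rate is applied to every subsidiary, the individual charges sum back to the full $1 million overhead pool.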