Consider the following statement:

Our consultancy operates on a 50% margin.

Margin is the extra amount the consultancy charges on top of its costs, covering operating expenses and profit. Most consultancies have a minimum margin they’ll consider acceptable when bidding on work.

But when it comes to requirements, the details matter. A lot. The statement above is subtly ambiguous: how, precisely, is margin calculated? I can think of two interpretations that both make sense but result in wildly different profits for the consultancy.

Let’s work through a specific example in which the consultancy bills a client for a sub-contractor who charges the consultancy $2,000/day. How much should the client pay for this sub-contractor?

## Option 1: Apply margin to cost

In this method, the margin is calculated from the cost so we simply take the cost and multiply it by the margin:

bill_rate = cost * (1 + margin)

So in the example of the $2k contractor we have

$2,000 * (1 + 0.5) = $3,000

The client is billed $3k, the consultancy pays the sub-contractor $2k, and keeps a profit of $1k for itself.
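This calculation is simple enough to sketch in a few lines of Python (the function name here is mine, not the author’s):

```python
def bill_rate_markup(cost: float, margin: float) -> float:
    """Option 1: apply the margin to the cost (i.e. a markup)."""
    return cost * (1 + margin)

# The $2,000/day sub-contractor at a 50% margin:
print(bill_rate_markup(2_000, 0.50))  # 3000.0
```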

## Option 2: Apply margin to bill rate

This method says the margin is calculated from the bill rate, so that the cost accounts for *(1 − margin)* of the revenue:

cost = bill_rate * (1 - margin)

Or rearranged to

bill_rate = cost / (1 - margin)

So in the example of our $2k contractor we have

$2,000 / (1 - 0.5) = $4,000

The client is billed $4k, the consultancy pays the sub-contractor $2k, and pockets a tidy profit of $2k for itself … *double the profit* of Option 1.
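The same sketch for this interpretation (again, the function name is mine):

```python
def bill_rate_margin(cost: float, margin: float) -> float:
    """Option 2: margin as a share of the bill rate,
    so cost = bill_rate * (1 - margin)."""
    return cost / (1 - margin)

# The $2,000/day sub-contractor at a 50% margin:
print(bill_rate_margin(2_000, 0.50))  # 4000.0
```

Note that this blows up as the margin approaches 100%, which is itself a hint that the two formulas mean very different things.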

## Details Matter

One could argue that both calculations are correct. In accounting terms, Option 1 applies a 50% *markup* while Option 2 is a true 50% *margin*. The consultancy makes out like a bandit either way, but the profits are significantly higher with Option 2, which is why specific worked examples are priceless (ha!) when dealing with numbers.