Food cost is a fundamental metric for any business that serves food. It represents the total expense an operation incurs to purchase the ingredients for the menu items it sells to customers. Understanding and tracking this figure is a primary component of managing a financially healthy business, as it directly impacts profitability and informs decisions about menu pricing, inventory management, and operational strategy.
The Basic Food Cost Formula
At its core, determining the total food cost for a specific period relies on calculating the Cost of Goods Sold (COGS). This figure represents the direct cost of all the ingredients used to create the dishes sold during that time frame. The formula is straightforward: Beginning Inventory + Purchases – Ending Inventory = COGS.
The first component, Beginning Inventory, is the total monetary value of all food and beverage products you have in stock at the start of the accounting period. This includes everything in your freezers, refrigerators, and dry storage areas. To find this number, a physical count of every item is necessary, with each item’s value calculated based on its purchase price.
Purchases refers to the total value of all food and beverage products bought during that same period. This figure is found by adding up all supplier invoices for the timeframe. Include only products that become part of the menu items sold; cleaning supplies, for example, are an operational expense, not part of COGS.
The final piece of the formula is the Ending Inventory, which is the total value of all food stock remaining at the end of the period. This requires another complete physical count of all on-hand ingredients. The result of the COGS formula is the dollar cost of the ingredients that went into the food you sold.
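To make the arithmetic concrete, here is a minimal Python sketch of the COGS calculation. The dollar figures are hypothetical, chosen so the result matches the $3,000 example in the next section.

```python
def cost_of_goods_sold(beginning_inventory: float,
                       purchases: float,
                       ending_inventory: float) -> float:
    """Beginning Inventory + Purchases - Ending Inventory = COGS."""
    return beginning_inventory + purchases - ending_inventory

# Hypothetical figures for a one-week accounting period.
beginning = 4_500.00  # physical count at the start of the week
bought = 2_800.00     # sum of all supplier invoices for the week
ending = 4_300.00     # physical count at the end of the week

cogs = cost_of_goods_sold(beginning, bought, ending)
print(f"Cost of Goods Sold: ${cogs:,.2f}")  # Cost of Goods Sold: $3,000.00
```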
Calculating Food Cost Percentage
Once the Cost of Goods Sold (COGS) is determined, an operation can calculate its food cost percentage. This metric contextualizes the raw COGS number by comparing it to sales. The formula is: (COGS / Total Food Sales) x 100 = Food Cost Percentage. This calculation shows how much of every dollar in revenue is spent on ingredients.
For instance, if a restaurant has a COGS of $3,000 for a week and total food sales of $10,000 for that same week, its food cost percentage is 30%. This means that for every dollar the restaurant earned, 30 cents were spent on ingredients.
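In code, the same calculation looks like this, using the figures from the example:

```python
def food_cost_percentage(cogs: float, total_food_sales: float) -> float:
    """(COGS / Total Food Sales) x 100 = Food Cost Percentage."""
    return cogs / total_food_sales * 100

# $3,000 COGS on $10,000 of weekly food sales.
pct = food_cost_percentage(cogs=3_000.00, total_food_sales=10_000.00)
print(f"Food cost percentage: {pct:.0f}%")  # Food cost percentage: 30%
```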
While there is no single “correct” food cost percentage, an industry benchmark for many restaurants falls between 28% and 35%. This figure can vary significantly based on the type of establishment. A fine-dining steakhouse using premium cuts of meat will have a higher food cost percentage than a pizzeria, which relies on lower-cost ingredients like flour and cheese.
Determining Individual Menu Item Cost
Shifting from a broad operational view to a micro-level analysis involves costing out individual menu items, also known as recipe or plate costing. This calculation is fundamental to setting profitable menu prices. It requires a detailed breakdown of every component that goes into making one serving of a dish.
The first step is to list every ingredient in a specific recipe. For each ingredient, you must calculate the exact cost of the portion used in one serving. For example, if a 5-pound (80-ounce) bag of flour costs $10 and a recipe calls for 8 ounces of flour, the cost of that portion is $10 ÷ 80 oz × 8 oz = $1.00.
An important part of this process is accounting for yield, which is the amount of usable product left after trimming and cooking. For instance, a whole chicken will lose weight after fabrication and roasting, and this loss must be factored into the actual cost of the chicken served. Once the precise cost of each ingredient portion is calculated, they are all added together to arrive at the total plate cost.
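Here is a rough Python sketch of plate costing with yield factored in. Only the flour figures come from the example above; the other ingredients, prices, and yield values are hypothetical.

```python
# Each entry: purchase price, purchase quantity (oz), ounces used per
# serving, and yield (the usable fraction left after trim and cooking).
recipe = [
    # (name,     price, purchase_oz, used_oz, yield)
    ("flour",    10.00, 80.0,        8.0,     1.00),  # $10 per 5 lb bag
    ("chicken",  12.00, 48.0,        6.0,     0.70),  # 30% lost to trim/roast
    ("butter",    4.00, 16.0,        1.0,     1.00),
]

plate_cost = 0.0
for name, price, purchase_oz, used_oz, usable in recipe:
    # A lower yield raises the effective cost of each usable ounce.
    cost_per_usable_oz = price / (purchase_oz * usable)
    portion_cost = cost_per_usable_oz * used_oz
    plate_cost += portion_cost
    print(f"{name:<8} ${portion_cost:.2f}")

print(f"Total plate cost: ${plate_cost:.2f}")  # Total plate cost: $3.39
```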
Factors That Influence Food Cost
An operation’s food cost percentage is not a static number and is influenced by numerous internal and external variables. The most significant factors include:
- Waste and spoilage. Ingredients that spoil in improper storage, food that is over-prepped and never sold, and kitchen mistakes all add cost without generating revenue. These losses are absorbed directly into the Cost of Goods Sold.
- Inconsistent portion control. When kitchen staff serves larger portions than the recipe specifies, the cost of that dish rises with every plate. Portioning tools like scales and standardized scoops help ensure consistency.
- Supplier pricing. Market fluctuations, seasonality, and supply chain disruptions can cause the price of ingredients to change. Poor negotiation with vendors or failing to shop for competitive pricing can also lead to paying more for raw materials.
- Theft. Internal issues like employees taking inventory home or unrecorded sales where food is given away increase the amount of food used without a corresponding sale.
- Sales mix. The overall mix of items sold also plays a role. A high volume of sales on low-margin, high-cost dishes will drive the overall food cost percentage up.
Ideal Versus Actual Food Cost
A more advanced method for diagnosing operational efficiency involves comparing ideal food cost with actual food cost. Ideal food cost, sometimes called theoretical cost, represents what the food cost would be in a perfect world with no waste, no theft, and perfect portioning. It is calculated by multiplying each recipe's plate cost by the number of units sold, as sketched below.
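A minimal sketch of that calculation, with hypothetical plate costs, unit counts, and sales:

```python
# Ideal COGS: each item's plate cost multiplied by units sold,
# summed across the menu. All figures here are hypothetical.
plate_costs = {"burger": 3.10, "salad": 2.25, "pasta": 2.80}
units_sold = {"burger": 400, "salad": 250, "pasta": 300}

ideal_cogs = sum(plate_costs[item] * units_sold[item] for item in plate_costs)
food_sales = 8_750.00  # hypothetical food sales for the same period

ideal_pct = ideal_cogs / food_sales * 100
print(f"Ideal COGS: ${ideal_cogs:,.2f} ({ideal_pct:.1f}%)")
# Ideal COGS: $2,642.50 (30.2%)
```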
Actual food cost is the real-world figure calculated using the COGS formula. The difference between the ideal and actual food cost is known as the food cost variance, and it highlights the financial impact of operational inefficiencies.
For example, if a restaurant’s ideal food cost for a month is 28% but its actual food cost is 31%, that three-percentage-point variance represents lost profit. Regularly calculating and analyzing this variance allows management to identify specific problems within the operation. A small variance indicates strong controls, while a large variance signals opportunities to improve processes and increase profitability.
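To put the variance in dollar terms, here is a small sketch. The 28% ideal and 31% actual figures come from the example above; the $50,000 in monthly food sales is an assumed value.

```python
def variance_cost(ideal_pct: float, actual_pct: float,
                  food_sales: float) -> float:
    """Dollar profit lost to the gap between ideal and actual food cost."""
    return (actual_pct - ideal_pct) / 100 * food_sales

# 28% ideal vs. 31% actual on a hypothetical $50,000 sales month.
lost = variance_cost(ideal_pct=28.0, actual_pct=31.0, food_sales=50_000.00)
print(f"Profit lost to variance: ${lost:,.2f}")
# Profit lost to variance: $1,500.00
```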