A credit score indicates creditworthiness by distilling years of borrowing behavior into a single number that predicts how likely you are to repay debt. Lenders use this number to sort borrowers into risk categories, and the differences between categories are dramatic. Federal Reserve data shows that borrowers with FICO scores below 520 defaulted on new loans at a rate of 41%, while those with scores of 720 or higher defaulted just 1% of the time. That gap is why your three-digit score shapes nearly every lending decision you encounter.
What the Score Actually Measures
A credit score doesn’t measure your income, your savings, or your net worth. It measures patterns in how you’ve handled borrowed money. The FICO scoring model, used by about 90% of top lenders, weighs five categories of behavior drawn from your credit reports.
Payment history (35%) is the largest factor. Every on-time or late payment on credit cards, auto loans, mortgages, and other accounts gets recorded. A single payment that’s 30 or more days late can drop your score significantly, and the damage increases the later the payment gets. Collections, bankruptcies, and foreclosures fall into this category too.
Amounts owed (30%) looks at how much of your available credit you’re currently using. This is often called your credit utilization ratio. If you have a credit card with a $10,000 limit and carry a $7,000 balance, your utilization on that card is 70%, which signals to lenders that you may be financially stretched. Borrowers who keep utilization low, generally under 30%, tend to score higher.
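The utilization arithmetic is simple enough to sketch in a few lines. The balances and limits below are hypothetical, matching the $7,000-on-$10,000 example; note that scoring models look at utilization both per card and across all cards combined.

```python
def utilization(balance, limit):
    """Credit utilization: fraction of available credit currently in use."""
    return balance / limit

# The $7,000 balance on a $10,000 limit from the example above:
per_card = utilization(7_000, 10_000)   # 0.70, i.e. 70%

# Models also consider overall utilization across every revolving account
# (hypothetical second card with a $500 balance on a $5,000 limit):
balances = [7_000, 500]
limits = [10_000, 5_000]
overall = sum(balances) / sum(limits)   # 7,500 / 15,000 = 50%
```

Paying the first card down to $2,500 would drop per-card utilization to 25% and overall utilization to 20%, both under the 30% guideline.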
Length of credit history (15%) considers the age of your oldest account, the age of your newest account, and the average age across all accounts. A longer track record gives the model more data to work with, which reduces uncertainty about your behavior.
New credit (10%) tracks how many accounts you’ve recently opened and how many hard inquiries appear on your report. Opening several accounts in a short window statistically correlates with higher default risk, especially for people with shorter credit histories.
Credit mix (10%) reflects the variety of account types you manage. Having experience with both revolving credit (like credit cards) and installment loans (like a car loan or mortgage) suggests you can handle different repayment structures. You don’t need one of every type to score well, but having only one kind of account limits what the model can learn about you.
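FICO's actual formula is proprietary, so the following is only an illustration of how the five published category weights could blend hypothetical subscores into a single number on the familiar 300–850 range. The subscore values and the linear scaling are assumptions, not the real model.

```python
# Published FICO category weights (the blending below is illustrative only).
WEIGHTS = {
    "payment_history": 0.35,
    "amounts_owed": 0.30,
    "history_length": 0.15,
    "new_credit": 0.10,
    "credit_mix": 0.10,
}

def illustrative_score(subscores):
    """Weighted average of 0-100 category subscores, mapped onto 300-850."""
    blended = sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)  # 0-100 scale
    return 300 + blended / 100 * 550

example = {
    "payment_history": 95,  # mostly on-time payments
    "amounts_owed": 80,     # moderate utilization
    "history_length": 60,   # middling account ages
    "new_credit": 70,       # a recent inquiry or two
    "credit_mix": 75,       # cards plus one installment loan
}
print(round(illustrative_score(example)))  # 744
```

The weighting explains why a single 30-day late payment hurts so much: it lands in the category worth more than a third of the total.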
How Lenders Turn Scores Into Risk Tiers
Lenders don’t just see a number. They map your score to a risk category that determines whether you qualify for credit and on what terms. The Consumer Financial Protection Bureau uses five standard tiers based on FICO Score 8:
- Super-prime: 720 and above
- Prime: 660 to 719
- Near-prime: 620 to 659
- Subprime: 580 to 619
- Deep subprime: below 580
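The tier boundaries above amount to a straightforward lookup. A minimal sketch (the function name is ours, not the CFPB's):

```python
def risk_tier(fico_score):
    """Map a FICO Score 8 value to the CFPB's five standard risk tiers."""
    if fico_score >= 720:
        return "super-prime"
    if fico_score >= 660:
        return "prime"
    if fico_score >= 620:
        return "near-prime"
    if fico_score >= 580:
        return "subprime"
    return "deep subprime"

print(risk_tier(700))  # prime
```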
Each tier carries a different statistical likelihood of default, and lenders price their risk accordingly. A super-prime borrower with a long history of on-time payments and low utilization represents minimal risk. A deep subprime borrower, whose report likely contains late payments, collections, or maxed-out accounts, represents substantial risk. Lenders either charge that borrower more to compensate for the higher chance of loss, or they decline the application altogether.
The Dollar Cost of a Lower Score
The gap between score tiers translates directly into money. On a 30-year mortgage, CFPB data from early 2025 shows that a borrower with a 625 credit score could see interest rates ranging from 6.125% to 8.875%, while a borrower with a 700 score could see rates from 5.875% to 8.125%. At the extremes, that difference adds up to roughly $264,500 in additional interest over the life of the loan.
The same pattern plays out across auto loans, personal loans, and credit cards. A lower score means higher interest rates, larger required down payments, and sometimes mandatory fees or deposits. On a credit card, the difference between a prime and subprime APR can be 10 percentage points or more, which compounds quickly on carried balances.
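The mortgage math behind the gap can be sketched with the standard fixed-rate amortization formula. The CFPB data doesn't specify a loan amount, so the $360,000 principal below is a hypothetical chosen for illustration; the rates are the extremes quoted above.

```python
def total_interest(principal, annual_rate, years=30):
    """Total interest paid on a fully amortizing fixed-rate loan."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of monthly payments
    payment = principal * r * (1 + r) ** n / ((1 + r) ** n - 1)
    return payment * n - principal

P = 360_000  # hypothetical principal; not specified in the CFPB data
high = total_interest(P, 0.08875)  # top of the 625-score rate range
low = total_interest(P, 0.05875)   # bottom of the 700-score rate range
print(round(high - low))           # roughly $264,500 in extra interest
```

Even at less extreme rate gaps, the difference compounds over 360 payments, which is why a score improvement before a mortgage application can be worth tens of thousands of dollars.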
Why Default Rates Validate the Model
Credit scores work as indicators of creditworthiness because they genuinely predict outcomes. The Federal Reserve studied default rates on new loans originated between October 2000 and April 2001, tracking performance over the following two years. The results show a clear gradient:
- Below 520: 41% default rate
- 520 to 559: 28.4%
- 560 to 599: 22.5%
- 600 to 639: 15.8%
- 640 to 679: 8.9%
- 680 to 719: 4.4%
- 720 or higher: 1.0%
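The gradient above is easy to encode as a banded lookup; the helper function and its name are ours, but the rates are the Fed's figures from the list.

```python
# Two-year default rates by FICO band, from the Federal Reserve study of
# loans originated October 2000 - April 2001 (the list above).
DEFAULT_RATES = [
    (520, 0.410),  # below 520
    (560, 0.284),  # 520-559
    (600, 0.225),  # 560-599
    (640, 0.158),  # 600-639
    (680, 0.089),  # 640-679
    (720, 0.044),  # 680-719
    (851, 0.010),  # 720 or higher
]

def default_rate(score):
    """Historical two-year default rate for a given FICO score."""
    for upper, rate in DEFAULT_RATES:
        if score < upper:
            return rate
    return DEFAULT_RATES[-1][1]

# 41% vs 1%: the 41x gap between the top and bottom bands.
print(default_rate(500) / default_rate(750))
```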
Moving up the scoring ladder steadily shrinks the default rate, and from the mid-600s upward each tier roughly halves it. That predictive power is the core reason lenders rely on credit scores so heavily. A borrower scoring 720 or higher is statistically 41 times less likely to default than one below 520. No single piece of information on a credit report tells that story as efficiently.
What a Credit Score Doesn’t Capture
A credit score is built entirely from your credit report data. It doesn’t factor in your salary, employment status, bank account balances, or investment portfolio. Two people with identical incomes can have vastly different scores if one pays bills late and the other doesn’t. Conversely, someone earning a modest income but managing credit responsibly can outscore a high earner who’s missed payments or maxed out cards.
This is both a strength and a limitation. The score isolates borrowing behavior from everything else, making it a consistent and objective tool. But it also means that people with little or no credit history, sometimes called “thin file” consumers, can appear riskier than they actually are. Someone who has always paid rent and utilities on time but has never had a credit card or loan may have no score at all.
The industry is gradually incorporating alternative data to address this gap. Some scoring models now consider rent payments, utility bills, and bank account transaction patterns to build a profile for borrowers whose traditional credit files are sparse. Transaction data like spending ratios and payment frequency at different retailers can generate predictive signals that supplement a thin credit report. These approaches are still evolving, but they’re expanding access for people who’ve been financially responsible without using traditional credit products.
How Your Score Changes Over Time
A credit score is a snapshot, not a permanent label. It updates every time the information on your credit report changes, which can happen monthly as your creditors report new data. This means your score can shift in either direction depending on recent behavior.
Paying down a high credit card balance can boost your score within a billing cycle or two because your utilization ratio drops. Missing a payment, on the other hand, can cause an immediate and sharp decline. Negative marks like late payments, collections, and bankruptcies weigh heavily at first but lose influence over time. Most negative items fall off your credit report after seven years, though bankruptcies can remain for up to ten.
The recency of your behavior matters. A missed payment from six years ago has far less impact than one from six months ago. The scoring model is always recalculating, placing more weight on your recent track record. This is why someone recovering from a financial setback can rebuild their score steadily by establishing a pattern of on-time payments and low utilization going forward.