Instructor: Kaitlin Oglesby

Kaitlin has a BA in political science and extensive experience working in the business world as Director of Marketing and Business Development at a financial advice firm.

If you are dealing with data that has decimal values, then you're going to want to know about the decimal and numeric data types in SQL. While they are very similar, they are distinct enough that you should know when to use which. We'll learn that and more in this lesson.

Data Types for Numbers

If you've been using SQL for any period of time, then you are probably aware that data types are incredibly useful for making sure that the right information goes in the right field. Very likely, a great deal of the information you will want to store comes in the shape of numeric data. For whole numbers, you'd likely use one of the INTEGER types.

However, a great deal of your data may require a decimal. In that case, you have two options available: NUMERIC and DECIMAL. Luckily, they are used in almost exactly the same way. In this lesson, we'll see how each is used, explain how they differ from each other and from other data types, and see how to declare decimal columns.
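As a quick sketch (the table and column names here are hypothetical), both types are declared the same way, each taking a precision (total number of digits) and a scale (digits after the decimal point):

```sql
-- Hypothetical table: both columns take (precision, scale) arguments.
CREATE TABLE accounts (
    balance_n NUMERIC(10, 2),  -- up to 10 digits total, 2 after the decimal point
    balance_d DECIMAL(10, 2)   -- declared exactly the same way
);
```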


NUMERIC vs. DECIMAL

As mentioned before, you can largely use NUMERIC and DECIMAL interchangeably, although it is considered best practice to choose one and stick with it. That is because, while they are functionally the same, SQL does not always treat them as such. If you link two tables with a foreign key, for example, some systems will raise an error because they see the two columns as different types of data being compared.
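As a sketch of that best practice (the schema below is hypothetical), keep the same type on both sides of a foreign key relationship:

```sql
-- Hypothetical schema: the referencing column matches the referenced key's type exactly.
CREATE TABLE products (
    product_id DECIMAL(10, 0) PRIMARY KEY,
    price      DECIMAL(8, 2)
);

CREATE TABLE order_items (
    item_id    INTEGER PRIMARY KEY,
    product_id DECIMAL(10, 0) REFERENCES products (product_id),  -- DECIMAL here too, not NUMERIC
    quantity   INTEGER
);
```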

There is one minor difference in the SQL standard, however, and it concerns precision: the total number of digits a column can hold. A NUMERIC column holds exactly the precision you specify, while a DECIMAL column holds at least that many digits, and an implementation is free to allow more. For example, with a declared precision of 5 and a scale of 2, NUMERIC(5, 2) caps values at 999.99, while DECIMAL(5, 2) is permitted to accept larger values if the implementation gives the column extra room. Please note that in practice, most database systems implement the two identically.
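A short sketch of the distinction (how much extra room DECIMAL actually gets, if any, depends entirely on the implementation):

```sql
-- NUMERIC(5, 2) holds exactly 5 digits: values up to 999.99.
-- DECIMAL(5, 2) holds at least 5 digits; an implementation may allow more.
CREATE TABLE readings (
    exact_width   NUMERIC(5, 2),
    minimum_width DECIMAL(5, 2)
);
```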

How Are They Different from Other Data Types?

The biggest difference between NUMERIC and DECIMAL and the other numeric data types is the inclusion of a fractional part. If your data has decimals, you'll find it greatly to your advantage to label the column as such. This way, any value that does not fit the declared precision will raise an error instead of being stored silently. Further, the declared scale acts as a nice check on your entries: a whole number such as 25 is stored and displayed as 25.00, padded out to two decimal places. This helps to ensure greater precision.
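For example (hypothetical table; the exact error message varies by database system):

```sql
CREATE TABLE measurements (
    reading DECIMAL(4, 2)  -- 4 digits total, 2 after the point: at most 99.99
);

INSERT INTO measurements (reading) VALUES (25);      -- accepted, stored as 25.00
INSERT INTO measurements (reading) VALUES (123.45);  -- rejected: exceeds the declared precision
```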
