Sort sequences and normalization in SQL
A sort sequence defines how characters in a character set relate to each other when they are compared or ordered. Normalization allows you to compare strings that contain combining characters.
The sort sequence is used for all character comparisons, and for all UCS-2 and UTF-16 graphic comparisons, performed in SQL statements. There are sort sequence tables for both single-byte and double-byte character data. Each single-byte sort sequence table has an associated double-byte sort sequence table, and vice versa. Conversion between the two tables is performed when necessary to implement a query. In addition, the CREATE INDEX statement has the sort sequence (in effect at the time the statement was run) applied to the character columns referred to in the index.
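As a sketch of the behavior described above, the following assumes a shared-weight sort sequence is in effect (for example, one set in an embedded SQL program with SET OPTION SRTSEQ = *LANGIDSHR). The table and column names are illustrative only:

```sql
-- Assumption: the sort sequence in effect is shared-weight, so
-- uppercase and lowercase letters carry the same weight.
-- EMPTABLE and EMPNAME are hypothetical names.

SELECT EMPNAME
  FROM EMPTABLE
  WHERE EMPNAME = 'Jones'   -- under a shared-weight sequence, this
                            -- predicate also selects 'JONES' and 'jones'
  ORDER BY EMPNAME;         -- rows are ordered by the sequence's weights,
                            -- not by raw code-point values

CREATE INDEX EMPIDX
  ON EMPTABLE (EMPNAME);    -- the index is built with the sort sequence
                            -- in effect when CREATE INDEX is run
```

With a unique-weight or *HEX sequence instead, the same predicate would match only the exact string 'Jones'.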
- Sort sequence used with ORDER BY and row selection
The examples show how rows are ordered and selected according to the sort sequence in use.
- Sort sequence and views
Views are created with the sort sequence that is in effect when the CREATE VIEW statement is run.
- Sort sequence and the CREATE INDEX statement
Indexes are created with the sort sequence that is in effect when the CREATE INDEX statement is run.
- Sort sequence and constraints
Unique constraints are implemented with indexes. If the table on which a unique constraint is added is defined with a sort sequence, the index is created with the same sort sequence.
- ICU sort sequence
When an International Components for Unicode (ICU) sort sequence table is used, the database uses the language-specific rules to determine the weight of the data based on the locale of the table.
- Normalization
Normalization allows you to compare strings that contain combining characters.
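To illustrate why normalization matters, consider the accented character é, which can be encoded either as the single precomposed code point U+00E9 or as the letter e (U+0065) followed by the combining acute accent (U+0301). The two encodings display identically but contain different code points, so without normalization they compare unequal. A hedged sketch using UTF-16 graphic string literals:

```sql
-- Precomposed é (U+00E9) vs. decomposed e + combining acute
-- (U+0065 U+0301). Without normalization, a binary comparison of
-- the two UTF-16 strings finds them unequal even though they
-- render as the same character on screen.
VALUES CASE
         WHEN UX'00E9' = UX'00650301' THEN 'equal'
         ELSE 'not equal'
       END;
```

When normalization is applied, both strings are reduced to a single canonical form before they are compared, so logically identical text compares equal regardless of how it was originally keyed or converted.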