Sort sequences and normalization in SQL

 

A sort sequence defines how characters in a character set relate to each other when they are compared or ordered. Normalization converts strings that contain combining characters into a consistent form so that logically equivalent strings can be compared correctly.
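Whether two character strings compare as equal can therefore depend on the sort sequence in effect. The following sketch is illustrative only: the EMPLOYEE table and LASTNAME column are hypothetical, and the behavior described assumes a shared-weight sort sequence (one that assigns the same weight to the uppercase and lowercase forms of a letter) is in effect for the statement.

   -- Find employees named Jones; which rows match depends on the sort sequence
   SELECT LASTNAME
     FROM EMPLOYEE
     WHERE LASTNAME = 'JONES'

With a shared-weight sort sequence, rows containing 'Jones', 'JONES', or 'jones' all satisfy the search condition. With the *HEX sort sequence or a unique-weight sort sequence, only rows containing exactly 'JONES' are returned.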

The sort sequence is used for all character comparisons, and for all UCS-2 and UTF-16 graphic comparisons, performed in SQL statements. There are sort sequence tables for both single-byte and double-byte character data. Each single-byte sort sequence table has an associated double-byte sort sequence table, and vice versa. Conversion between the two tables is performed when necessary to implement a query. In addition, the CREATE INDEX statement applies the sort sequence that is in effect at the time the statement is run to the character columns referred to in the index.
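The following sketch illustrates the CREATE INDEX behavior under the same assumptions as above (a hypothetical EMPLOYEE table with a character LASTNAME column). The index keys are built with the sort sequence that is in effect when the CREATE INDEX statement is run:

   -- Index keys for LASTNAME are built using the sort sequence in effect now
   CREATE INDEX EMP_LASTNAME_IX
     ON EMPLOYEE (LASTNAME)

   -- The ordering of this result reflects the sort sequence in effect
   -- when the query itself is run
   SELECT LASTNAME
     FROM EMPLOYEE
     ORDER BY LASTNAME

In general, such an index can be used to implement the ordering or the comparison only when its sort sequence matches the one in effect for the query; otherwise the query is still answered correctly, but without that index.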

 

Parent topic: SQL programming

 

Related reference


Creating and using views
Creating indexes
Specifying a search condition using the WHERE clause
GROUP BY clause
HAVING clause
ORDER BY clause
Handling duplicate rows
Defining complex search conditions
Using the UNION keyword to combine subselects
Sort sequence