Lexical analysis breaks the input text into tokens. A lexical analyzer is typically specified with regular expressions that describe the token patterns and implemented as a finite automaton that recognizes those patterns in the input stream. Lex is a tool for building such analyzers: a Lex source program pairs each regular expression with an action to perform whenever a matching token is found, and Lex generates code that simulates the corresponding finite automaton. The generated lexical analyzer converts the input stream into a stream of tokens by matching input characters against the patterns defined in the Lex source program.
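
As a minimal sketch of what such a Lex source program looks like, the specification below recognizes identifiers, integer literals, and whitespace; the pattern set and the printing actions are illustrative assumptions, not drawn from the original text.

    %{
    #include <stdio.h>      /* printf is used in the actions below */
    %}
    %%
    [ \t\n]+                { /* skip whitespace between tokens */ }
    [0-9]+                  { printf("NUMBER(%s)\n", yytext); }   /* integer literal */
    [A-Za-z_][A-Za-z0-9_]*  { printf("IDENT(%s)\n", yytext); }    /* identifier */
    .                       { printf("UNKNOWN(%s)\n", yytext); }  /* any other character */
    %%
    int main(void) { yylex(); return 0; }  /* drive the generated scanner */
    int yywrap(void) { return 1; }         /* report end of input to the scanner */

Each rule pairs a regular expression with a C action, and yytext holds the characters matched by the current pattern. Running lex on this file (e.g. lex tokens.l, file name hypothetical) and compiling the generated lex.yy.c yields a scanner that reads standard input and prints one line per recognized token.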