Per previous experimentation, backtracking slows down lexing performance
significantly (by about a third). It's usually pretty easy to avoid: you
just need rules that accept an incomplete construct and do whatever the
lexer would otherwise have done.
The backtracking was introduced by the patch that added quoted variable
substitution. Back-patch to 9.0 where that was added.
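
The pattern used in the hunks below can be seen in isolation in the following
minimal, self-contained flex sketch. This is not psql's actual lexer: the
:'name' syntax mirrors psql's quoted-variable substitution, but the rule set,
the bracketed output, and main() are illustrative only. The point is the
second rule, which accepts the incomplete prefix and uses yyless() to throw
back everything but the colon, so the generated scanner never has to back up
when the closing quote is missing.

%{
#include <stdio.h>
%}

%option noyywrap nodefault

%%

:'[A-Za-z0-9_]+'	{
			/* Complete :'name' construct: report the variable name. */
			printf("[var %.*s]", (int) yyleng - 3, yytext + 2);
		}

:'[A-Za-z0-9_]*	{
			/*
			 * Incomplete construct (no closing quote).  Without this
			 * rule, failing the rule above would force the scanner to
			 * back up.  Throw back everything but the colon and emit
			 * it as ordinary text instead.
			 */
			yyless(1);
			ECHO;
		}

.|\n		{ ECHO; }

%%

int
main(void)
{
	yylex();
	return 0;
}

Running flex with -b writes a lex.backup file describing any states that
still require backing up; a backup-free rule set like the one above shows
up there as "No backing up.", which is how the fix can be verified.
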
escape_variable(true);
}
+ /*
+ * These rules just avoid the need for scanner backup if one of the
+ * two rules above fails to match completely.
+ */
+
+:'[A-Za-z0-9_]* {
+ /* Throw back everything but the colon */
+ yyless(1);
+ ECHO;
+ }
+
+:\"[A-Za-z0-9_]* {
+ /* Throw back everything but the colon */
+ yyless(1);
+ ECHO;
+ }
+
/*
* Back to backend-compatible rules.
*/
}
}
-:[A-Za-z0-9_]* {
+:[A-Za-z0-9_]+ {
/* Possible psql variable substitution */
if (option_type == OT_VERBATIM)
ECHO;
}
}
+:'[A-Za-z0-9_]* {
+ /* Throw back everything but the colon */
+ yyless(1);
+ ECHO;
+ BEGIN(xslashdefaultarg);
+ }
+
+:\"[A-Za-z0-9_]* {
+ /* Throw back everything but the colon */
+ yyless(1);
+ ECHO;
+ BEGIN(xslashdefaultarg);
+ }
+
"|" {
ECHO;
if (option_type == OT_FILEPIPE)