Closed
Changes from 4 commits
5 changes: 5 additions & 0 deletions src/dialect/databricks.rs
@@ -90,4 +90,9 @@ impl Dialect for DatabricksDialect {
    fn supports_optimize_table(&self) -> bool {
        true
    }

    /// See <https://docs.databricks.com/aws/en/sql/language-manual/functions/bangsign>
yoavcloud marked this conversation as resolved.
    fn supports_bang_not_operator(&self) -> bool {
Owner Author: Revert this change; it's not related to the issue.

Owner Author: I addressed this feedback in 8b7bdb3.

Summary: Applied the review follow-up in commit 8b7bdb36 (Limit TEXT(n) parsing fix to Snowflake). The change in src/parser/mod.rs now treats TEXT(n) as a custom type only for SnowflakeDialect, instead of changing TEXT handling for every dialect. The existing Snowflake regression test in tests/sqlparser_snowflake.rs was left intact. Validation passed:
- `CARGO_HOME=/tmp/datafusion-cargo-home CARGO_TARGET_DIR=/tmp/datafusion-sga-13064-...

        true
    }
}
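The databricks.rs hunk above opts the dialect into treating `!expr` as a logical NOT via a default-off trait hook. A minimal self-contained sketch of that pattern, with simplified stand-ins (`Dialect`, `GenericSql`, `Databricks`, `parse_prefix` here are illustrative, not the crate's real types):

```rust
// Dialect-hook pattern: a trait method with a conservative default that
// individual dialects override to enable syntax extensions.
trait Dialect {
    /// Whether `!expr` should parse as logical NOT (Databricks/Hive style).
    fn supports_bang_not_operator(&self) -> bool {
        false // default: most dialects reject a prefix `!`
    }
}

struct GenericSql;
impl Dialect for GenericSql {}

struct Databricks;
impl Dialect for Databricks {
    fn supports_bang_not_operator(&self) -> bool {
        true
    }
}

// Toy prefix-token handler: maps `!` to NOT only when the dialect allows it.
fn parse_prefix(dialect: &dyn Dialect, token: &str) -> Result<String, String> {
    match token {
        "!" if dialect.supports_bang_not_operator() => Ok("NOT".to_string()),
        "!" => Err("unexpected token `!`".to_string()),
        other => Ok(other.to_string()),
    }
}

fn main() {
    assert_eq!(parse_prefix(&Databricks, "!"), Ok("NOT".to_string()));
    assert!(parse_prefix(&GenericSql, "!").is_err());
    println!("ok");
}
```

The default implementation on the trait is what keeps the diff to five lines: only the overriding dialect changes, and every other dialect keeps the old behavior.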
10 changes: 9 additions & 1 deletion src/parser/mod.rs
@@ -11909,6 +11909,7 @@ impl<'a> Parser<'a> {
        self.advance_token();
        let next_token = self.get_current_token();
        let next_token_index = self.get_current_index();
        let next_token_span = next_token.span;

        let mut trailing_bracket: MatchedTrailingBracket = false.into();
        let mut data = match &next_token.token {
@@ -12170,7 +12171,14 @@
            self.expect_token(&Token::RParen)?;
            Ok(DataType::FixedString(character_length))
        }
        Keyword::TEXT => {
            let type_name = w.to_ident(next_token_span);
            if let Some(modifiers) = self.parse_optional_type_modifiers()? {
                Ok(DataType::Custom(type_name.into(), modifiers))
            } else {
                Ok(DataType::Text)
            }
        }
        Keyword::TINYTEXT => Ok(DataType::TinyText),
        Keyword::MEDIUMTEXT => Ok(DataType::MediumText),
        Keyword::LONGTEXT => Ok(DataType::LongText),
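The mod.rs hunk changes `TEXT` from always mapping to `DataType::Text` into a branch: with parenthesized modifiers it becomes a custom type, otherwise it stays plain `Text`. Per the review follow-up described in the comments, that branch was later gated to Snowflake only. A standalone sketch of the resulting decision, with a simplified `DataType` and a boolean standing in for the dialect check (names here are illustrative, not the crate's real signatures):

```rust
// Simplified stand-in for sqlparser's DataType variants involved here.
#[derive(Debug, PartialEq)]
enum DataType {
    Text,
    Custom(String, Vec<String>),
}

/// Mirrors the gated logic: only Snowflake preserves TEXT(n) modifiers as a
/// custom type; every other dialect (and bare TEXT) maps to DataType::Text.
fn parse_text_type(is_snowflake: bool, modifiers: Option<Vec<String>>) -> DataType {
    match modifiers {
        Some(mods) if is_snowflake => DataType::Custom("TEXT".to_string(), mods),
        _ => DataType::Text,
    }
}

fn main() {
    // Snowflake keeps the length modifier so the SQL round-trips unchanged.
    assert_eq!(
        parse_text_type(true, Some(vec!["16777216".to_string()])),
        DataType::Custom("TEXT".to_string(), vec!["16777216".to_string()])
    );
    // Bare TEXT still parses as the plain type in every dialect.
    assert_eq!(parse_text_type(true, None), DataType::Text);
    assert_eq!(parse_text_type(false, None), DataType::Text);
    println!("ok");
}
```

Keeping the modifier as an opaque string list (rather than a typed length) is what lets `TEXT(16777216)` serialize back exactly as written.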
2 changes: 1 addition & 1 deletion src/test_utils.rs
@@ -140,7 +140,7 @@ impl TestedDialects {
    ///
    /// In general, the canonical SQL should be the same (see crate
    /// documentation for rationale) and you should prefer the `verified_`
    /// variants in testing, such as [`verified_statement`] or
    /// variants in testing, such as [`verified_stmt`] or
Owner Author: Revert this change.

Owner Author: I addressed this feedback in d1dca11.

Summary: Reverted the remaining unrelated review-follow-up change in src/test_utils.rs, restoring the doc comment text to match upstream and leaving the Snowflake TEXT(n) parser fix and tests untouched. I committed that revert on the current branch as d1dca11b (Revert unrelated test_utils doc change). Validation completed successfully:
- cargo fmt --all
- cargo test --all-features
- cargo clippy --all-targets --all-features -- -D warnings
- cargo doc --document-private-items --no-deps --workspace --all-features
I u...

Owner Author: I addressed this feedback in 180f294.

Summary: Removed the unrelated baseline test from tests/sqlparser_snowflake.rs, leaving the TEXT(n) Snowflake parser fix and its relevant regression coverage intact. No parser code changed in this follow-up. Validation completed successfully:
- cargo fmt --all
- cargo test --all-features
- cargo clippy --all-targets --all-features -- -D warnings
- cargo doc --document-private-items --no-deps --workspace --all-features
Because the sandbox blocks writes to the default Cargo cache, I ran the cargo commands with `CAR...


    /// [`verified_query`].
    ///
    /// If `canonical` is non-empty, this function additionally asserts
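The doc comment touched above describes the `verified_` helpers, whose contract is a round trip: parse the SQL, serialize the AST back, and assert the output equals the input. A toy illustration of that contract, where the "AST" is just a token list and `verified_stmt` is a hypothetical analogue of the crate's helper (not its real implementation):

```rust
// Toy parse: split the statement into tokens.
fn parse(sql: &str) -> Vec<String> {
    sql.split_whitespace().map(str::to_string).collect()
}

// Toy serialize: rebuild the canonical statement from the tokens.
fn serialize(ast: &[String]) -> String {
    ast.join(" ")
}

/// Hypothetical analogue of `TestedDialects::verified_stmt`: panics unless
/// the statement survives the parse/serialize round trip unchanged.
fn verified_stmt(sql: &str) {
    assert_eq!(serialize(&parse(sql)), sql, "statement is not canonical: {sql}");
}

fn main() {
    // The TEXT(n) fix in this PR is exactly what makes a statement like
    // this survive the real round trip in the Snowflake dialect.
    verified_stmt("SELECT col::TEXT(16777216) AS col FROM t");
    println!("round trip ok");
}
```

This is why the review pushes back on unrelated doc edits here: the `verified_` helpers are the backbone of the test suite, and the new Snowflake tests below rely on exactly this round-trip guarantee.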
34 changes: 34 additions & 0 deletions tests/sqlparser_snowflake.rs
@@ -5010,3 +5010,37 @@ fn test_select_dollar_column_from_stage() {
    // With table function args, without alias
    snowflake().verified_stmt("SELECT $1, $2 FROM @mystage1(file_format => 'myformat')");
}

#[test]
fn test_text_cast_with_length_modifier() {
    let select = snowflake().verified_only_select("SELECT col::TEXT(16777216) AS col FROM t");

    match &select.projection[0] {
        SelectItem::ExprWithAlias {
            expr: Expr::Cast { data_type, .. },
            alias,
        } => {
            assert_eq!(alias, &Ident::new("col"));
            assert_eq!(
                data_type,
                &DataType::Custom(
                    ObjectName::from(vec![Ident::new("TEXT")]),
                    vec!["16777216".to_string()],
                )
            );
        }
        _ => unreachable!(),
    }

    snowflake().verified_stmt("SELECT col::TEXT(16777216) AS col FROM t");
}

#[test]
fn test_plain_text_data_type_still_parses_as_text() {
    match snowflake().verified_stmt("CREATE TABLE t (c TEXT)") {
        Statement::CreateTable(CreateTable { columns, .. }) => {
            assert_eq!(columns[0].data_type, DataType::Text);
        }
        _ => unreachable!(),
    }
}