Converting a Spark not-nullable column to an EXASOL column fails while saving #60
Hey @3cham, good to hear from you! Yes, it is a good point! From my side, I would prefer to remove the nullable check. Please feel free to send a pull request!
We create an Exasol table, if it does not already exist, before saving the Spark dataframe. A `NOT NULL` constraint was added to the create table DDL if the Spark schema field is not nullable. However, this can be a problem on the Exasol side, because Exasol stores `null` when a string is empty for `VARCHAR` or `CLOB` column types. As a result, the not-null constraint fails when inserting empty strings. This commit removes the `NOT NULL` constraints from string types even if they are not nullable. Fixes #60.
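The fix described above can be sketched as follows. This is an illustrative Python sketch (the connector itself is written in Scala), and the `to_exasol_type` mapping and type names are assumptions for the example, not the connector's real API:

```python
# Sketch: build a column's CREATE TABLE DDL fragment, skipping NOT NULL for
# string types because Exasol maps empty strings to NULL.
# The type mapping below is a simplified assumption for illustration.

STRING_TYPES = {"StringType"}

def to_exasol_type(spark_type):
    mapping = {
        "StringType": "CLOB",
        "IntegerType": "DECIMAL(10,0)",
        "DoubleType": "DOUBLE",
    }
    return mapping[spark_type]

def column_ddl(name, spark_type, nullable):
    ddl = f"{name} {to_exasol_type(spark_type)}"
    # Strings never get NOT NULL, even when Spark says the field is
    # not nullable, since inserting "" would otherwise fail.
    if not nullable and spark_type not in STRING_TYPES:
        ddl += " NOT NULL"
    return ddl

print(column_ddl("ID", "IntegerType", False))   # ID DECIMAL(10,0) NOT NULL
print(column_ddl("NAME", "StringType", False))  # NAME CLOB
```

A non-nullable integer column keeps its constraint, while a non-nullable string column is created without one.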
While creating a table in EXASOL, we infer the column information from the Spark schema.
Spark marks a column as not nullable if it does not contain any null values, even though it can still contain empty strings. Our connector then sets the column to NOT NULL in the EXASOL DDL. This leads to a problem when writing the data to EXASOL if empty strings occur.
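The failure mode above can be sketched in a few lines. This is a hypothetical Python simulation of Exasol's empty-string handling, not the connector's code; `exasol_store_value` and `insert_row` are illustrative names:

```python
# Sketch: Exasol treats an empty string as NULL on insert, so a NOT NULL
# VARCHAR/CLOB column rejects rows that Spark considered perfectly valid
# (non-null, but empty).

def exasol_store_value(value):
    """Simulates Exasol storing an empty string as NULL."""
    if value == "":
        return None
    return value

def insert_row(value, not_null=True):
    stored = exasol_store_value(value)
    if stored is None and not_null:
        raise ValueError("constraint violation: NULL not allowed")
    return stored

insert_row("hello")  # -> "hello", inserts fine
# insert_row("")     # raises ValueError: "" becomes NULL, NOT NULL fires
```

So a dataframe whose string column is not nullable in Spark, but contains empty strings, fails at insert time once the column was created with NOT NULL.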
I suggest that we remove this nullable check, for the sake of world peace, since it took a lot of time to find the cause of this problem :)
Let me know your opinions @morazow @jpizagno