How to insert data into a datasource in Dremio using JDBC

From Azure Databricks I am trying to push a dataframe so that I can insert data into an existing MySQL database table, but I get this error:

scala code:
streamingDataFrame.write.mode(SaveMode.Append).jdbc(jdbcUrl, "sentiment_analysis", connectionProperties)
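
For reference, jdbcUrl and connectionProperties are built roughly like this (the host and credentials below are placeholders):

import java.util.Properties
import org.apache.spark.sql.SaveMode

// Dremio JDBC driver pointing at the coordinator node (host and credentials are placeholders)
val jdbcUrl = "jdbc:dremio:direct=dremio-demo-vm.internal.cloudapp.net:31010"
val connectionProperties = new Properties()
connectionProperties.put("user", "dremio_user")
connectionProperties.put("password", "dremio_password")
connectionProperties.put("driver", "com.dremio.jdbc.Driver")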

error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 10.139.64.5, executor 0): java.sql.SQLException: Failed to create prepared statement: error_id: "b929187c-8dfe-4f18-8953-2bb383483eab"
endpoint {
  address: "dremio-demo-vm.internal.cloudapp.net"
  user_port: 31010
  fabric_port: 45678
  roles {
    sql_query: true
    java_executor: true
    master: true
  }
  start_time: 1592941976465
  max_direct_memory: 8589934592
  available_cores: 2
  node_tag: ""
  conduit_port: 39955
}
error_type: UNSUPPORTED_OPERATION
message: "UNSUPPORTED_OPERATION ERROR: Please contact customer support for steps to enable the iceberg tables feature.\n\n\n[Error Id: b929187c-8dfe-4f18-8953-2bb383483eab on dremio-demo-vm.internal.cloudapp.net:31010]\n\n"
original_message: "Please contact customer support for steps to enable the iceberg tables feature."

at com.dremio.jdbc.impl.DremioJdbc41Factory.newServerPreparedStatement(DremioJdbc41Factory.java:147)
at com.dremio.jdbc.impl.DremioJdbc41Factory.newPreparedStatement(DremioJdbc41Factory.java:108)
at com.dremio.jdbc.impl.DremioJdbc41Factory.newPreparedStatement(DremioJdbc41Factory.jav

Hi, I think you can only write to file system sources, using a CTAS, and, with the new Iceberg feature, using INSERT INTO statements. A sketch of both is below.
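
For example, you could run those statements over the Dremio JDBC connection from Scala, roughly like this (the source, schema, and table names below are only placeholders, not your actual sources):

import java.sql.DriverManager

// Placeholder Dremio coordinator host and credentials
val dremioUrl = "jdbc:dremio:direct=dremio-demo-vm.internal.cloudapp.net:31010"
val conn = DriverManager.getConnection(dremioUrl, "dremio_user", "dremio_password")
val stmt = conn.createStatement()

// CTAS works against file system sources (S3, ADLS, HDFS, NAS); here a source named "datalake"
stmt.executeUpdate(
  "CREATE TABLE datalake.sentiment.sentiment_analysis AS " +
  "SELECT * FROM mysql_source.mydb.sentiment_analysis")

// With the Iceberg tables feature enabled, INSERT INTO an existing table becomes possible
stmt.executeUpdate(
  "INSERT INTO datalake.sentiment.sentiment_analysis " +
  "SELECT * FROM mysql_source.mydb.sentiment_analysis_staging")

stmt.close()
conn.close()

Note that neither path writes back into the MySQL table itself; Dremio writes go to the file system / Iceberg target, so if the rows must land in MySQL you would write to it directly from Spark with MySQL's own JDBC driver instead of going through Dremio.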