# catalog_writer

## CatalogWriter

A writer for Catalog tables.

Source code in `src/cloe_nessy/integration/writer/catalog_writer.py`

### write_table(df, table_identifier, partition_by=None, options=None, mode='append') `staticmethod`
Write a table to the Unity Catalog.

Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| df | DataFrame \| None | The DataFrame to write. | required |
| table_identifier | str \| None | The table identifier in the Unity Catalog in the format 'catalog.schema.table'. | required |
| mode | str | The write mode. One of append, overwrite, error, errorifexists, ignore. | 'append' |
| partition_by | str \| list[str] \| None | Names of the partitioning columns. | None |
| options | dict[str, str] \| None | PySpark options for the DataFrame.saveAsTable operation (e.g. mergeSchema: true). | None |
Notes

- append: Append contents of this DataFrame to existing data.
- overwrite: Overwrite existing data.
- error or errorifexists: Throw an exception if data already exists.
- ignore: Silently ignore this operation if data already exists.
Raises:
| Type | Description | 
|---|---|
| ValueError | If the mode is not one of append, overwrite, error, errorifexists, ignore. | 
| ValueError | If the table_identifier is not a string or not in the format 'catalog.schema.table'. | 
| ValueError | If the DataFrame is None. |
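
The sketch below shows how `write_table` might be called from a PySpark session. It is illustrative only: the import path mirrors the source location above, and the catalog name, sample schema, and data are assumptions, not part of the documented API beyond the signature shown.

```python
# Hypothetical usage sketch; the module path and sample data are assumptions.
from pyspark.sql import SparkSession

from cloe_nessy.integration.writer.catalog_writer import CatalogWriter

spark = SparkSession.builder.getOrCreate()

# Small illustrative DataFrame.
df = spark.createDataFrame(
    [("2024-01-01", "EUR", 42.0)],
    schema="load_date string, currency string, amount double",
)

# Write to a Unity Catalog table identified as 'catalog.schema.table',
# partitioned by load_date, allowing schema evolution via mergeSchema.
CatalogWriter.write_table(
    df=df,
    table_identifier="my_catalog.my_schema.my_table",
    partition_by="load_date",
    options={"mergeSchema": "true"},
    mode="append",
)
```

Passing an unsupported mode, a malformed table identifier, or a `None` DataFrame raises `ValueError`, as listed in the table above.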