
Upgrade Guides

DataFusion 54.0.0

The Config class has been removed. It was a standalone wrapper around ConfigOptions that could not be connected to a SessionContext, making it effectively unusable. Use :py:class:`~datafusion.context.SessionConfig` instead, which is passed directly to SessionContext.

Before:

from datafusion import Config

config = Config()
config.set("datafusion.execution.batch_size", "4096")
# config could not be passed to SessionContext

After:

from datafusion import SessionConfig, SessionContext

config = SessionConfig().set("datafusion.execution.batch_size", "4096")
ctx = SessionContext(config)

DataFusion 53.0.0

This version includes an upgraded version of pyo3, which changed how an FFI object is extracted from a PyCapsule. Example:

Before:

let codec = unsafe { capsule.reference::<FFI_LogicalExtensionCodec>() };

Now:

let data: NonNull<FFI_LogicalExtensionCodec> = capsule
    .pointer_checked(Some(c_str!("datafusion_logical_extension_codec")))?
    .cast();
let codec = unsafe { data.as_ref() };

DataFusion 52.0.0

This version includes a major update to the :ref:`ffi`. Users who contribute their own CatalogProvider, SchemaProvider, TableProvider, or TableFunction via FFI must now provide access to a LogicalExtensionCodec and a TaskContextProvider. The function signatures for the methods that return these PyCapsule objects now require an additional parameter: a Python object from which the necessary FFI_LogicalExtensionCodec can be extracted.

A complete example can be found in the FFI examples. Your FFI hook methods (__datafusion_catalog_provider__, __datafusion_schema_provider__, __datafusion_table_provider__, and __datafusion_table_function__) need to be updated to accept an additional session: Bound<PyAny> parameter, as shown below.

#[pymethods]
impl MyCatalogProvider {
    pub fn __datafusion_catalog_provider__<'py>(
        &self,
        py: Python<'py>,
        session: Bound<PyAny>,
    ) -> PyResult<Bound<'py, PyCapsule>> {
        let name = cr"datafusion_catalog_provider".into();

        let provider = Arc::clone(&self.inner) as Arc<dyn CatalogProvider + Send>;

        let codec = ffi_logical_codec_from_pycapsule(session)?;
        let provider = FFI_CatalogProvider::new_with_ffi_codec(provider, None, codec);

        PyCapsule::new(py, provider, Some(name))
    }
}

To extract the FFI_LogicalExtensionCodec from the provided session object, you can implement a helper such as:

pub(crate) fn ffi_logical_codec_from_pycapsule(
    obj: Bound<PyAny>,
) -> PyResult<FFI_LogicalExtensionCodec> {
    let attr_name = "__datafusion_logical_extension_codec__";
    let capsule = if obj.hasattr(attr_name)? {
        obj.getattr(attr_name)?.call0()?
    } else {
        obj
    };

    let capsule = capsule.downcast::<PyCapsule>()?;
    validate_pycapsule(capsule, "datafusion_logical_extension_codec")?;

    let codec = unsafe { capsule.reference::<FFI_LogicalExtensionCodec>() };

    Ok(codec.clone())
}

The updated DataFusion FFI interface no longer depends directly on the datafusion core crate. You can improve your build times and potentially reduce your library binary size by removing this dependency and instead using the specific datafusion project crates.

For example, instead of importing from the core crate:

use datafusion::catalog::MemTable;

you can now write:

use datafusion_catalog::MemTable;
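The corresponding change in Cargo.toml might look like the following sketch. The version numbers and the particular sub-crates shown are assumptions; match the versions to your datafusion-ffi version and pick only the sub-crates your code actually uses:

```toml
[dependencies]
# Before: the full core crate pulled in everything
# datafusion = "52"

# After: depend only on the specific sub-crates you need
datafusion-catalog = "52"  # e.g. MemTable, CatalogProvider, SchemaProvider
datafusion-ffi = "52"      # e.g. FFI_CatalogProvider, FFI_LogicalExtensionCodec
```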