
Commit 9b2505c

fix(doc): close #19393, make upgrading guide match v51 api (#19648)
Change-Id: Id62d32d14fa19b34b592e186e7962bb96a6a6964

## Which issue does this PR close?

- Closes #19393

## Rationale for this change

Make the DataFusion docs great.

## What changes are included in this PR?

Move the “Refactoring of `FileSource` constructors and `FileScanConfigBuilder` to accept schemas upfront” section to the right place.

## Are these changes tested?

No tests needed; this is a documentation-only change.

## Are there any user-facing changes?

No

Signed-off-by: mag1cian <[email protected]>
1 parent ada0923

File tree

1 file changed (+85 / −85 lines)

docs/source/library-user-guide/upgrading.md

Lines changed: 85 additions & 85 deletions
@@ -107,6 +107,91 @@ let config = FileScanConfigBuilder::new(object_store_url, source)
The `pyarrow` feature flag has been removed. This feature has been migrated to
the `datafusion-python` repository since version `44.0.0`.
### Refactoring of `FileSource` constructors and `FileScanConfigBuilder` to accept schemas upfront

The way schemas are passed to file sources and scan configurations has been significantly refactored. File sources now require the schema (including partition columns) to be provided at construction time, and `FileScanConfigBuilder` no longer takes a separate schema parameter.

**Who is affected:**

- Users who create `FileScanConfig` or file sources (`ParquetSource`, `CsvSource`, `JsonSource`, `AvroSource`) directly
- Users who implement a custom `FileFormat`

**Key changes:**

1. **`FileSource` constructors now require `TableSchema`**: all built-in file sources now take the schema in their constructor:

```diff
- let source = ParquetSource::default();
+ let source = ParquetSource::new(table_schema);
```
2. **`FileScanConfigBuilder` no longer takes the schema as a parameter**: the schema is now passed via the `FileSource`:

```diff
- FileScanConfigBuilder::new(url, schema, source)
+ FileScanConfigBuilder::new(url, source)
```
3. **Partition columns are now part of `TableSchema`**: the `with_table_partition_cols()` method has been removed from `FileScanConfigBuilder`. Partition columns are now passed as part of the `TableSchema` given to the `FileSource` constructor (see the note after this list):

```diff
+ let table_schema = TableSchema::new(
+     file_schema,
+     vec![Arc::new(Field::new("date", DataType::Utf8, false))],
+ );
+ let source = ParquetSource::new(table_schema);
  let config = FileScanConfigBuilder::new(url, source)
-     .with_table_partition_cols(vec![Field::new("date", DataType::Utf8, false)])
      .with_file(partitioned_file)
      .build();
```
4. **`FileFormat::file_source()` now takes a `TableSchema` parameter**: custom `FileFormat` implementations must be updated:

```diff
  impl FileFormat for MyFileFormat {
-     fn file_source(&self) -> Arc<dyn FileSource> {
+     fn file_source(&self, table_schema: TableSchema) -> Arc<dyn FileSource> {
-         Arc::new(MyFileSource::default())
+         Arc::new(MyFileSource::new(table_schema))
      }
  }
```
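Taken together (and as item 3 above notes), the partition columns are declared on the `TableSchema` up front, so a file source knows its complete output schema, file columns plus partition columns, at construction time rather than having partition columns patched in later by the builder.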
**Migration examples:**

For Parquet files:

```diff
- let source = Arc::new(ParquetSource::default());
- let config = FileScanConfigBuilder::new(url, schema, source)
+ let table_schema = TableSchema::new(schema, vec![]);
+ let source = Arc::new(ParquetSource::new(table_schema));
+ let config = FileScanConfigBuilder::new(url, source)
      .with_file(partitioned_file)
      .build();
```
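Put together, a complete v51-style Parquet scan setup might look like the following sketch. The import paths, the `ObjectStoreUrl`, and the path and size passed to `PartitionedFile::new` are illustrative assumptions; only the `TableSchema::new` / `ParquetSource::new` / `FileScanConfigBuilder::new` calls are taken from the migration notes above:

```rust
use std::sync::Arc;

use arrow::datatypes::{DataType, Field, Schema};
// NOTE: import paths are assumptions; adjust to your DataFusion version's re-exports.
use datafusion::datasource::listing::PartitionedFile;
use datafusion::datasource::physical_plan::{FileScanConfig, FileScanConfigBuilder, ParquetSource};
use datafusion::execution::object_store::ObjectStoreUrl;
use datafusion_datasource::TableSchema;

fn parquet_scan_config() -> FileScanConfig {
    // Schema of the columns stored in the Parquet files themselves.
    let schema = Arc::new(Schema::new(vec![
        Field::new("id", DataType::Int64, false),
        Field::new("value", DataType::Float64, true),
    ]));

    // v51: the schema (with no partition columns here) is attached to the source.
    let table_schema = TableSchema::new(schema, vec![]);
    let source = Arc::new(ParquetSource::new(table_schema));

    // Hypothetical store URL and file entry; replace with your own.
    let url = ObjectStoreUrl::parse("file://").unwrap();
    let partitioned_file = PartitionedFile::new("data/example.parquet", 1024);

    FileScanConfigBuilder::new(url, source)
        .with_file(partitioned_file)
        .build()
}
```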
For CSV files with partition columns:

```diff
- let source = Arc::new(CsvSource::new(true, b',', b'"'));
- let config = FileScanConfigBuilder::new(url, file_schema, source)
-     .with_table_partition_cols(vec![Field::new("year", DataType::Int32, false)])
+ let options = CsvOptions {
+     has_header: Some(true),
+     delimiter: b',',
+     quote: b'"',
+     ..Default::default()
+ };
+ let table_schema = TableSchema::new(
+     file_schema,
+     vec![Arc::new(Field::new("year", DataType::Int32, false))],
+ );
+ let source = Arc::new(CsvSource::new(table_schema).with_csv_options(options));
+ let config = FileScanConfigBuilder::new(url, source)
      .build();
```
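Note that `CsvSource`'s format settings (header, delimiter, and quote) no longer travel as positional constructor arguments; they are grouped into a `CsvOptions` struct and attached with `with_csv_options()`, while `CsvSource::new()` itself now takes only the `TableSchema`.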
### Adaptive filter representation in Parquet filter pushdown

As of Arrow 57.1.0, DataFusion uses a new adaptive filter strategy when
@@ -754,91 +839,6 @@ TIMEZONE = '+00:00';
This change was made to better support using the default timezone in scalar UDF functions such as
`now`, `current_date`, `current_time`, and `to_timestamp` among others.
[85 removed lines: the old copy of the “Refactoring of `FileSource` constructors and `FileScanConfigBuilder` to accept schemas upfront” section, identical to the text added above at its new location.]
### Introduction of `TableSchema` and changes to `FileSource::with_schema()` method

A new `TableSchema` struct has been introduced in the `datafusion-datasource` crate to better manage table schemas with partition columns. This struct helps distinguish between:

- the **file schema**: the columns physically stored in the data files
- the **table schema**: the file schema plus any partition columns
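To illustrate the distinction, here is a minimal sketch. The import path and the `file_schema()` / `table_schema()` accessor names are assumptions; only `TableSchema::new(file_schema, partition_cols)` appears in the notes above:

```rust
use std::sync::Arc;

use arrow::datatypes::{DataType, Field, Schema};
// NOTE: the import path and the accessor names used below are assumptions;
// check the `datafusion-datasource` crate docs for your version.
use datafusion_datasource::TableSchema;

fn main() {
    // Columns physically present in the data files.
    let file_schema = Arc::new(Schema::new(vec![Field::new(
        "id",
        DataType::Int64,
        false,
    )]));
    // A Hive-style partition column encoded in file paths, not in the files.
    let partition_cols = vec![Arc::new(Field::new("date", DataType::Utf8, false))];

    let table_schema = TableSchema::new(file_schema, partition_cols);
    // Assumed accessors: the file schema has 1 field, the full table schema has 2.
    assert_eq!(table_schema.file_schema().fields().len(), 1);
    assert_eq!(table_schema.table_schema().fields().len(), 2);
}
```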
