The target-parquet loader writes data to Parquet files after it has been pulled from a source using an extractor.
Alternate Implementations
- Automattic (default) 🥈
- Estratégia 🥈
- Hotglue 🥇
- mirelagrigoras 🥈
- Martin Suchanek
Getting Started
Prerequisites
If you haven't already, follow the initial steps of the Getting Started guide:
Installation and configuration
- Add the target-parquet loader to your project using meltano add:
  meltano add loader target-parquet
- Configure the target-parquet settings using meltano config:
  meltano config target-parquet set --interactive
Next steps
Follow the remaining steps of the Getting Started guide:
If you run into any issues, learn how to get help.
Capabilities
The current capabilities for target-parquet may have been automatically set when originally added to the Hub. Please review the capabilities when using this loader. If you find they are out of date, please consider updating them by making a pull request to the YAML file that defines the capabilities for this loader.
This plugin has the following capabilities:
- about
- schema-flattening
- stream-maps
You can override these capabilities or specify additional ones in your meltano.yml by adding the capabilities key.
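As a sketch, overriding the capabilities in meltano.yml looks like this (the capability list mirrors the one above):

```yaml
# meltano.yml (excerpt) — overriding the declared capabilities
plugins:
  loaders:
    - name: target-parquet
      capabilities:
        - about
        - schema-flattening
        - stream-maps
```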
Settings
The target-parquet settings that are known to Meltano are documented below. To quickly find the setting you're looking for, use the list below:

- compression_method
- destination_path
- extra_fields
- extra_fields_types
- max_batch_size
- max_pyarrow_table_size
- partition_cols

You can also list these settings using the meltano config list subcommand:
meltano config target-parquet list
You can override these settings or specify additional ones in your meltano.yml by adding the settings key.
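As a sketch, a custom setting can be declared under the settings key and given a value under config (the setting name below is hypothetical):

```yaml
# meltano.yml (excerpt) — declaring a hypothetical custom setting
plugins:
  loaders:
    - name: target-parquet
      settings:
        - name: my_custom_setting   # hypothetical, not a documented setting
          kind: string
      config:
        my_custom_setting: example-value
```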
Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.
Compression Method (compression_method)
- Environment variable: TARGET_PARQUET_COMPRESSION_METHOD
- Default Value: gzip
The compression method must be supported by PyArrow; the currently available modes are snappy, zstd, brotli, and gzip.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set compression_method [value]

Destination Path (destination_path)
- Environment variable: TARGET_PARQUET_DESTINATION_PATH
The path where the output Parquet files will be written.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set destination_path [value]

Extra Fields (extra_fields)
- Environment variable: TARGET_PARQUET_EXTRA_FIELDS
Extra fields to add to the flattened record. (e.g. extra_col1=value1,extra_col2=value2)
Configure this setting directly using the following Meltano command:
meltano config target-parquet set extra_fields [value]

Extra Fields Types (extra_fields_types)
- Environment variable: TARGET_PARQUET_EXTRA_FIELDS_TYPES
Extra fields types. (e.g. extra_col1=string,extra_col2=integer)
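These two settings are typically used together, one supplying the values and the other their types; a sketch in meltano.yml (column names and values are hypothetical):

```yaml
# meltano.yml (excerpt) — hypothetical extra columns added to every record
config:
  extra_fields: environment=prod,region=eu-west-1
  extra_fields_types: environment=string,region=string
```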
Configure this setting directly using the following Meltano command:
meltano config target-parquet set extra_fields_types [value]

Max Batch Size (max_batch_size)
- Environment variable: TARGET_PARQUET_MAX_BATCH_SIZE
- Default Value: 10000
Max records to write in one batch. It can control the memory usage of the target.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set max_batch_size [value]

Max Pyarrow Table Size (max_pyarrow_table_size)
- Environment variable: TARGET_PARQUET_MAX_PYARROW_TABLE_SIZE
- Default Value: 800
Max size of pyarrow table in MB (before writing to parquet file). It can control the memory usage of the target.
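Both max_batch_size and max_pyarrow_table_size bound the target's memory usage; a sketch of turning them down for a memory-constrained runner (values are illustrative, not recommendations):

```yaml
# meltano.yml (excerpt) — smaller batches and tables to cap memory usage
config:
  max_batch_size: 5000          # records written per batch
  max_pyarrow_table_size: 200   # MB held in memory before writing a file
```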
Configure this setting directly using the following Meltano command:
meltano config target-parquet set max_pyarrow_table_size [value]

Partition Cols (partition_cols)
- Environment variable: TARGET_PARQUET_PARTITION_COLS
Column names used to partition the output Parquet files. (e.g. extra_col1,extra_col2)
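Partitioned Parquet datasets typically use Hive-style key=value directories; a sketch assuming hypothetical year and month columns exist in the stream:

```yaml
# meltano.yml (excerpt) — partition output by (hypothetical) year and month
config:
  partition_cols: year,month
# Illustrative resulting layout:
#   <destination_path>/year=2024/month=01/<file>.parquet
```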
Configure this setting directly using the following Meltano command:
meltano config target-parquet set partition_cols [value]

SDK Settings
Add Record Metadata (add_record_metadata)
- Environment variable: TARGET_PARQUET_ADD_RECORD_METADATA
Add metadata to records.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set add_record_metadata [value]

Faker Config Locale (faker_config.locale)
- Environment variable: TARGET_PARQUET_FAKER_CONFIG_LOCALE
One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization
Configure this setting directly using the following Meltano command:
meltano config target-parquet set faker_config locale [value]

Faker Config Seed (faker_config.seed)
- Environment variable: TARGET_PARQUET_FAKER_CONFIG_SEED
Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator
Configure this setting directly using the following Meltano command:
meltano config target-parquet set faker_config seed [value]

Flattening Enabled (flattening_enabled)
- Environment variable: TARGET_PARQUET_FLATTENING_ENABLED
'True' to enable schema flattening and automatically expand nested properties.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set flattening_enabled [value]

Flattening Max Depth (flattening_max_depth)
- Environment variable: TARGET_PARQUET_FLATTENING_MAX_DEPTH
The max depth to flatten schemas.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set flattening_max_depth [value]

Load Method (load_method)
- Environment variable: TARGET_PARQUET_LOAD_METHOD
- Default Value: append-only
The method to use when loading data into the destination. append-only will always write all input records, whether or not those records already exist. upsert will update existing records and insert new records. overwrite will delete all existing records and insert all input records.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set load_method [value]

Stream Map Config (stream_map_config)
- Environment variable: TARGET_PARQUET_STREAM_MAP_CONFIG
User-defined config values to be used within map expressions.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set stream_map_config [value]

Stream Maps (stream_maps)
- Environment variable: TARGET_PARQUET_STREAM_MAPS
Config object for stream maps capability. For more information check out Stream Maps.
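As a sketch, a stream map can drop or derive properties per stream (the stream and property names below are hypothetical):

```yaml
# meltano.yml (excerpt) — hypothetical stream map on a `users` stream
config:
  stream_maps:
    users:
      email: null                          # drop the raw email property
      email_domain: email.split('@')[-1]   # derived property
```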
Configure this setting directly using the following Meltano command:
meltano config target-parquet set stream_maps [value]

Validate Records (validate_records)
- Environment variable: TARGET_PARQUET_VALIDATE_RECORDS
- Default Value: true
Whether to validate the schema of the incoming streams.
Configure this setting directly using the following Meltano command:
meltano config target-parquet set validate_records [value]

Something missing?
This page is generated from a YAML file that you can contribute changes to.
Edit it on GitHub!

Looking for help?
Ask in the #plugins-general channel on the Meltano Slack.


