# Changelog

## 1.11.8 (core) / 0.27.8 (libraries)

### New

- A param `exclusions` was added to time window partition definitions to support custom calendars.
- The `dagster` library now supports `protobuf==6.x`.
- [dg] `dg scaffold defs --help` now shows descriptions for subcommands.
- [dg] A new `dg check toml` command has been added to validate your TOML configuration files.
- [dagster-databricks] The `DatabricksAssetBundleComponent` has been added in preview. Databricks tasks can now be represented as assets and submitted via Dagster.
- [dagster-dbt] The `DbtProjectComponent` now takes an optional `cli_args` configuration to allow customizing the command that is run when your assets are executed.
- [dagster-dbt] The polling interval and timeout used for runs triggered with the `DbtCloudWorkspace` resource can now be customized with the `DAGSTER_DBT_CLOUD_POLL_INTERVAL` and `DAGSTER_DBT_CLOUD_POLL_TIMEOUT` environment variables.
- [ui] Added the ability to filter to failed/missing partitions in the asset report events dialog.
- [ui] A tree view has been added in the Global Asset Lineage.
- [telemetry] Telemetry disclaimer now prints to stderr.
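The `DAGSTER_DBT_CLOUD_POLL_*` variables above are read from the process environment, so one way to set them is before the `DbtCloudWorkspace` resource is constructed. A minimal sketch — the values are illustrative, and the units are assumed to be seconds:

```python
import os

# Hedged sketch: variable names are from this changelog; the values below are
# illustrative examples, and the seconds unit is an assumption.
os.environ["DAGSTER_DBT_CLOUD_POLL_INTERVAL"] = "10"   # poll run status every 10 seconds
os.environ["DAGSTER_DBT_CLOUD_POLL_TIMEOUT"] = "900"   # stop waiting after 15 minutes
```

In a deployed environment you would more typically set these in the container or agent configuration rather than in code.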
### Bugfixes

- Fixed an issue that would require config provided to backfills to contain config for all assets in the code location rather than just the selected ones.
## 1.11.7 (core) / 0.27.7 (libraries)

### New

- `dg` will now report multiple detected errors in a configuration file instead of failing on the first detected error.
- It is now possible to supply run config when launching an asset backfill.
- Updated the root URL to display the Overview/Timeline view for locations with schedules/automations, but no jobs (thanks @dschafer!)
- Added `tzdata` as a dependency to `dagster`, to ensure that declaring timezones like `US/Central` works in all environments.
- [dagster-dg-cli] Updated scaffolded file names to handle consecutive upper case letters (`ACMEDatabricksJobComponent` → `acme_databricks_job_component.py`, not `a_c_m_e_databricks_job_component.py`).
- [dagster-dg-cli] Validating `requirements.env` is now opt-in for `dg check yaml`.
- [dagster-dbt] `DAGSTER_DBT_CLOUD_POLL_INTERVAL` and `DAGSTER_DBT_CLOUD_POLL_TIMEOUT` environment variables can now be used to configure the polling interval and timeout for fetching data from dbt Cloud.
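The scaffolded file-name change described above (`ACMEDatabricksJobComponent` → `acme_databricks_job_component.py`) amounts to an acronym-aware snake_case conversion. The following is a minimal regex-based sketch of that kind of conversion, not Dagster's actual implementation:

```python
import re

def to_snake_case(name: str) -> str:
    """Acronym-aware CamelCase -> snake_case (illustrative sketch only)."""
    # Split an acronym from a following capitalized word: "ACMEDatabricks" -> "ACME_Databricks"
    s = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1_\2", name)
    # Split a lowercase/digit from a following uppercase letter: "DatabricksJob" -> "Databricks_Job"
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)
    return s.lower()

print(to_snake_case("ACMEDatabricksJobComponent"))  # acme_databricks_job_component
```

Treating a run of capitals as a single token is what prevents the naive per-letter output `a_c_m_e_databricks_job_component`.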
### Deprecations

- [components] Removed deprecated and non-functional `asset_post_processors` fields from `SlingReplicationCollectionComponent` and `AirflowInstanceComponent`.
## 1.11.6 (core) / 0.27.6 (libraries)

### New

- Allow explicit git `platform` selection in `link_code_references_to_git`, thanks @chazmo03!
### Bugfixes

- Fixed issue causing `AutomationCondition.replace` to not update built-in sub-conditions that did not have an explicit label, thanks @dschafer!
- Fixed an issue where assets were considered stubs if they were a stub in any code location.
- Projects using components no longer cause "job definitions changed while uploading" errors on older agent versions.
- [dagster-dbt] Fixed a bug that could cause execution to fail if `enable_code_references` was set to `True` on the `DagsterDbtTranslatorSettings`.
### Documentation

- Updated documentation of `dagster.yaml` to include the `nux` option, thanks @dwisdom0!
### Dagster Plus

- Fix "Create a support ticket" dialog submissions.
## 1.11.5 (core) / 0.27.5 (libraries)

### New

- Static functions on classes decorated with `@template_var` can now optionally accept a `ComponentLoadContext` argument.
- [dg] An MCP server is available to expose `dg` CLI capabilities to MCP clients. See the `dg mcp` CLI group for details.
- [dagster-dbt] The `dagster-dbt` package no longer has a dependency on `dbt-core`.
- [dagster-dbt][preview] Users of the dbt Fusion CLI can now use the `dagster-dbt` package to run dbt commands with no changes to their existing Dagster code. This support is still in preview, as the format of the log messages produced by the dbt Fusion CLI is still subject to change. Let us know if you notice any incompatibilities.
- [dagster-databricks] Added a `PipesDatabricksServerlessClient` to support Databricks Serverless jobs with Dagster Pipes.
- [dagster-databricks] Added additional options for cluster configuration (thanks @jmccartin!)
### Bugfixes

- Various bugfixes for backfills that target assets which change their partitions definition mid-backfill.
- [ui] Fixed issue that could cause errors related to the `ObjectMetadataValue` class.
### Documentation

- Added docs for using Spark Connect and Databricks Connect with Dagster.
## 1.11.4 (core) / 0.27.4 (libraries)

### New

- Schedules now support specifying a subset of asset checks to execute in a `RunRequest`.
- [dg] A new `docs integrations` CLI is available for viewing an index of available integrations.
- [ui] Jobs can now be filtered with a selection syntax.
- [dagster-tableau] Dashboards containing hidden sheets are now correctly linked to upstream data sources.
- [dagster-tableau] Tableau sheets and dashboards now produce observation events instead of materialization events when using `refresh_and_poll` inside the `@tableau_assets` asset decorator.
### Bugfixes
- Fixed a set of issues with the asset backfill system that could, in rare cases, cause runs to be kicked off out of order or never be kicked off.
- Fixed issue where additional args passed into a PermissiveConfig object could not be accessed via dot notation (thanks @CarlyAThomas and @BoLiuV5!)
- Duplicate definitions are no longer incorrectly created when including jobs for schedules & sensors when loading from a `defs` folder.
- [components] Fixed an incorrect import being generated when scaffolding a component in Python. (thanks, @ajohnson5!)
- [dg] When assets are selected via `--assets`, other definition types will no longer be displayed.
### Documentation

- Fixed typo in the `polars.md` example doc (thanks @j1wilmot!)
- Fixed a typo in the ETL tutorial docs (thanks @yumazak!)
## 1.11.3 (core) / 0.27.3 (libraries)

### New

- Introduced `AssetExecutionContext.load_asset_value`, which enables loading asset values from the IO manager dynamically, rather than requiring asset values to be loaded as parameters to the asset function. For example:

  ```python
  @dg.asset(deps=[the_asset])
  def the_downstream_asset(context: dg.AssetExecutionContext):
      return context.load_asset_value(dg.AssetKey("the_asset"))
  ```

- Exposed the `asset_selection` parameter for the `submit_job_execution` function in `DagsterGraphQLClient`, thanks @brunobbaraujo!
- Large error stack traces from Dagster events will be automatically truncated if the message or stack trace exceeds 500kb. The exact value of the truncation can be overridden by setting the `DAGSTER_EVENT_ERROR_FIELD_SIZE_LIMIT` environment variable.
- Added `databento`, `ax`, and `botorch` kind tags, thanks @aleewen and @CompRhys!
- [dagster-k8s] Added the option to include `ownerReferences` on k8s executor step jobs, ensuring that the step job and step pod are properly garbage collected if the run pod is deleted. These can be enabled by setting the `enable_owner_references` flag on the executor config.
- [components] Added a `dg list component-tree` command which can be used to visualize the component tree of a project.
- [components] Added the ability to reference, load, and build defs for other components in the same project. In YAML, you may use the `load_component_at_path` and `build_defs_at_path` functions:

  ```yaml
  type: dagster.PythonScriptComponent

  attributes:
    execution:
      path: my_script.py
    assets:
      - key: customers_export
        deps:
          - "{{ load_component_at_path('dbt_ingest').asset_key_for_model('customers') }}"
  ```
### Bugfixes
- [components] Python component instances are now properly loaded from ordinary Python files.
- Fixed an issue that could cause asset backfills to request downstream partitions at the same time as their parent partitions in rare cases.
- Fixed a bug that could cause `@graph_asset`s to not properly apply the `AllPartitionMapping` or `LastPartitionMapping` to dependencies, thanks @BoLiuV5!
- Fixed a bug that could cause code locations to fail to load when a custom Python `AutomationCondition` was used as the operand of `AutomationCondition.any_deps_match()` or `AutomationCondition.all_deps_match()`.
- The `create-dagster` standalone executable now works on all Linux versions using glibc 2.17 or later.
- [ui] Partition tags are now properly shown on the runs page, thanks @HynekBlaha!
- [ui] Using the "Retry from Asset Failure" option when retrying a run that failed after materializing all of its assets will now correctly indicate that there is no work that needs to be retried.
- [ui] The timeline tab on the Overview page now shows runs by sensor when they were launched by an automation condition sensor, instead of showing every row in the same "Automation condition" row.
- [ui] Fixed an issue where filtering to an asset group on the lineage page did not apply the correct repository filter in code locations with multiple repositories.
- [ui] Fixed an issue where asset checks referencing asset keys that did not exist in the asset graph did not appear in the Dagster UI.
- [ui] Fixed occasional crashes of the asset graph on the asset lineage tab.
- [dagster-dbt] The `@dbt_assets` decorator and associated APIs no longer error when parsing dbt projects that contain an owner with multiple emails.
### Documentation
- Fixed typos in the ELT pipeline tutorial, thanks @aaronprice00 and @kevJ711!
- Fixed typos in components docs, thanks @tintamarre!
- Fixed error in Sling docs, thanks @nhuray!
- Updated the `AutomationCondition.replace` type signature to provide callers more information about the returned `AutomationCondition`, thanks @dschafer!