Bug description

Apache Superset's ChartWarmUpCacheCommand still relies on legacy filter scopes. The conversion from legacy filter scopes to native filters is already implemented in superset/migrations/shared/native_filters.py.

However, superset/views/utils.py still reads the legacy filter scopes, so a dashboard's filter default values are not applied when the cache is warmed up.
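For illustration, the legacy path in question can be exercised directly. The helper below does exist in superset/views/utils.py, but the ids are placeholders and exactly how ChartWarmUpCacheCommand reaches it is an assumption on my part, not a verified trace:

```python
# Rough illustration of the legacy flow (placeholder ids, approximate wiring).
from superset.views.utils import get_dashboard_extra_filters

# For a dashboard that only has native filters configured, the legacy
# filter_scopes / default_filters keys in json_metadata are empty, so this
# returns [] and the chart is warmed up without the filter's default value.
extra_filters = get_dashboard_extra_filters(123, 45)  # (slice id, dashboard id)
```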

How to reproduce the bug

  1. Set a default value on a dashboard filter.
  2. Warm the chart up via the chart warm-up cache API (see the sketch below).
  3. The default value is not applied to the warmed-up result.
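For reference, a minimal reproduction sketch against a local instance is shown below. It assumes the v1 chart warm-up endpoint (/api/v1/chart/warm_up_cache) with a chart_id/dashboard_id payload; the host, credentials, and ids are placeholders.

```python
# Reproduction sketch (assumed endpoint/payload; host, credentials, and ids are placeholders).
import requests

BASE = "http://localhost:8088"
session = requests.Session()

# Obtain a JWT via the security API (assumes database auth).
access_token = session.post(
    f"{BASE}/api/v1/security/login",
    json={"username": "admin", "password": "admin", "provider": "db", "refresh": True},
).json()["access_token"]
session.headers["Authorization"] = f"Bearer {access_token}"
# Depending on config, a CSRF token from /api/v1/security/csrf_token/ may also be required.

# Warm up a chart in the context of its dashboard; the dashboard_id is what should
# carry the dashboard's filter default values into the cached query.
resp = session.post(
    f"{BASE}/api/v1/chart/warm_up_cache",
    json={"chart_id": 123, "dashboard_id": 45},
)
print(resp.status_code, resp.json())
```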

Screenshots/recordings

No response

Superset version

master / latest-dev

Python version

3.9

Node version

16

Browser

Chrome

Additional context

No response

Checklist

  • [X] I have searched Superset docs and Slack and didn't find a solution to my problem.
  • [X] I have searched the GitHub issue tracker and didn't find a similar bug report.
  • [X] I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.

Comment From: dosubot[bot]

Hey @Smallhi! 👋 I'm here to help you with any bugs, questions, or contributions you have. Let's squash those bugs together!

To address the issue of ChartWarmUpCacheCommand still using legacy filter scopes, you need to update the superset/views/utils.py file to use native filter configurations. This ensures that the filter's default value is honored when using the warm-up API.

Here's how you can update the superset/views/utils.py file:

  1. Import the necessary functions and modules:

```python
from collections import defaultdict
from typing import Any, List

from shortid import ShortId

from superset.models.slice import Slice
from superset.utils import json
from superset.utils.dashboard_filter_scopes_converter import convert_filter_scopes
```

  2. Define the convert_filter_scopes_to_native_filters function:

```python
def convert_filter_scopes_to_native_filters(
    json_metadata: dict[str, Any],
    position_json: dict[str, Any],
    filter_boxes: List[Slice],
) -> List[dict[str, Any]]:
    shortid = ShortId()
    default_filters = json.loads(json_metadata.get("default_filters") or "{}")
    filter_scopes = json_metadata.get("filter_scopes", {})
    filter_box_ids = {filter_box.id for filter_box in filter_boxes}

    filter_scope_by_key_and_field: dict[str, dict[str, dict[str, Any]]] = defaultdict(dict)
    filter_by_key_and_field: dict[str, dict[str, dict[str, Any]]] = defaultdict(dict)

    # Dense representation of the filter scopes, falling back to the chart-level
    # scopes when no dashboard-level scopes are defined.
    for filter_box in filter_boxes:
        key = str(filter_box.id)
        filter_scope_by_key_and_field[key] = {
            **convert_filter_scopes(json_metadata, filter_boxes=[filter_box]).get(
                filter_box.id, {}
            ),
            **filter_scopes.get(key, {}),
        }

    # Construct the native filters.
    for filter_box in filter_boxes:
        key = str(filter_box.id)
        params = json.loads(filter_box.params or "{}")

        for field, filter_scope in filter_scope_by_key_and_field[key].items():
            default = default_filters.get(key, {}).get(field)

            fltr: dict[str, Any] = {
                "cascadeParentIds": [],
                "id": f"NATIVE_FILTER-{shortid.generate()}",
                "scope": {
                    "rootPath": filter_scope["scope"],
                    "excluded": [
                        id_ for id_ in filter_scope["immune"] if id_ not in filter_box_ids
                    ],
                },
                "type": "NATIVE_FILTER",
            }

            if field == "__time_col" and params.get("show_sqla_time_column"):
                fltr.update({
                    "filterType": "filter_timecolumn",
                    "name": "Time Column",
                    "targets": [{"datasetId": filter_box.datasource_id}],
                })
                if not default:
                    default = params.get("granularity_sqla")
                if default:
                    fltr["defaultDataMask"] = {
                        "extraFormData": {"granularity_sqla": default},
                        "filterState": {"value": [default]},
                    }
            elif field == "__time_grain" and params.get("show_sqla_time_granularity"):
                fltr.update({
                    "filterType": "filter_timegrain",
                    "name": "Time Grain",
                    "targets": [{"datasetId": filter_box.datasource_id}],
                })
                if not default:
                    default = params.get("time_grain_sqla")
                if default:
                    fltr["defaultDataMask"] = {
                        "extraFormData": {"time_grain_sqla": default},
                        "filterState": {"value": [default]},
                    }
            elif field == "__time_range" and params.get("date_filter"):
                fltr.update({
                    "filterType": "filter_time",
                    "name": "Time Range",
                    "targets": [{}],
                })
                if not default:
                    default = params.get("time_range")
                if default and default != "No filter":
                    fltr["defaultDataMask"] = {
                        "extraFormData": {"time_range": default},
                        "filterState": {"value": default},
                    }
            else:
                for config in params.get("filter_configs") or []:
                    if config["column"] == field:
                        fltr.update({
                            "controlValues": {
                                "defaultToFirstItem": False,
                                "enableEmptyFilter": not config.get("clearable", True),
                                "inverseSelection": False,
                                "multiSelect": config.get("multiple", False),
                                "searchAllOptions": config.get("searchAllOptions", False),
                            },
                            "filterType": "filter_select",
                            "name": config.get("label") or field,
                            "targets": [
                                {"column": {"name": field}, "datasetId": filter_box.datasource_id}
                            ],
                        })
                        if "metric" in config:
                            fltr["sortMetric"] = config["metric"]
                            fltr["controlValues"]["sortAscending"] = config["asc"]
                        if params.get("adhoc_filters"):
                            fltr["adhoc_filters"] = params["adhoc_filters"]
                        time_range = params.get("time_range")
                        if time_range and time_range != "No filter":
                            fltr.update({
                                "time_range": time_range,
                                "granularity_sqla": params.get("granularity_sqla"),
                            })
                        if not default:
                            default = config.get("defaultValue")
                            if default and config["multiple"]:
                                default = default.split(";")
                        if default:
                            if not isinstance(default, list):
                                default = [default]
                            fltr["defaultDataMask"] = {
                                "extraFormData": {
                                    "filters": [{"col": field, "op": "IN", "val": default}]
                                },
                                "filterState": {"value": default},
                            }
                        break

            if "filterType" in fltr:
                filter_by_key_and_field[key][field] = fltr

    # Determine the ancestors of each filter-box chart from the dashboard layout.
    ancestors_by_id = defaultdict(set)

    for filter_box in filter_boxes:
        for value in position_json.values():
            try:
                if (
                    isinstance(value, dict)
                    and value["type"] == "CHART"
                    and value["meta"]["chartId"] == filter_box.id
                    and value["parents"]
                ):
                    ancestors_by_id[filter_box.id] = set(value["parents"])
            except KeyError:
                pass

    # Link cascading filters: when another filter box is scoped to this one, its
    # select/time filters become cascade parents of this box's select filters.
    for this in filter_boxes:
        for other in filter_boxes:
            if this != other and any(
                this.id not in filter_scope["immune"]
                and set(filter_scope["scope"]) <= ancestors_by_id[this.id]
                for filter_scope in filter_scope_by_key_and_field[str(other.id)].values()
            ):
                for child in filter_by_key_and_field[str(this.id)].values():
                    if child["filterType"] == "filter_select":
                        for parent in filter_by_key_and_field[str(other.id)].values():
                            if (
                                parent["filterType"] in {"filter_select", "filter_time"}
                                and parent["id"] not in child["cascadeParentIds"]
                            ):
                                child["cascadeParentIds"].append(parent["id"])

    return sorted(
        [
            fltr
            for key in filter_by_key_and_field
            for fltr in filter_by_key_and_field[key].values()
        ],
        key=lambda fltr: fltr["filterType"],
    )
```

  3. Integrate the function into the warm-up path: ensure that the code superset/views/utils.py provides to the warm-up API uses this function so that legacy filter scopes are converted to native filter configurations (a rough sketch of this follows below).

By following these steps, you can ensure that the ChartWarmUpCacheCommand uses native filter configurations, so the filter's default value takes effect when using the warm-up API [1].
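A minimal sketch of what step 3 could look like is below, assuming the warm-up path keeps asking a helper in superset/views/utils.py for the dashboard's extra filters. The helper name build_native_filter_defaults and the way defaults are read from native_filter_configuration / defaultDataMask are illustrative, not the actual implementation.

```python
from typing import Any, List

from superset.models.dashboard import Dashboard
from superset.utils import json


def build_native_filter_defaults(dashboard: Dashboard) -> List[dict[str, Any]]:
    """Illustrative helper: turn native filter default values into extra filters.

    Reads the dashboard's native_filter_configuration (the shape produced by
    convert_filter_scopes_to_native_filters) and emits the filter clauses that
    the warm-up query context expects as extra_filters.
    """
    metadata = json.loads(dashboard.json_metadata or "{}")
    extra_filters: List[dict[str, Any]] = []

    for fltr in metadata.get("native_filter_configuration") or []:
        default = (fltr.get("defaultDataMask") or {}).get("extraFormData") or {}
        # Default values of select filters are already ready-made filter clauses.
        extra_filters.extend(default.get("filters") or [])

    return extra_filters
```

The warm-up helper could then fall back to something like this when the legacy default_filters key of json_metadata is empty.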

To continue talking to Dosu, mention @dosu.

Comment From: rusackas

@Smallhi I'm not personally familiar with the ChartWarmUpCacheCommand, so I don't know its intricacies, but it seems like you're close to a PR here :) Would you want to open up a PR/fix? I'm sure @villebro or @michael-s-molina would be happy to review it :D

Comment From: Smallhi

@rusackas ok, I'll fix it.

Comment From: rusackas

@Smallhi this seems to have gone quiet for a long time. Is this still an issue?

Comment From: Smallhi

Sorry for the lack of updates. I tried to fix it, but ran into new bugs... so I've kept the fix in our production environment only.

Comment From: rusackas

Can you check to see if this is still an issue in 5.0.0? We're open to PRs, but if we don't get corroborating reports or find a way to move it forward, it may eventually be closed as inactive.