Bug description

Description

When I call the "/api/v1/dashboard/{dashboard_id}/copy/" API with duplicate_slices set to true, I receive the error shown below; my code and the error notification follow. When I checked the Superset code, I found that it expects a "positions" field in the request, but the API schema does not allow that field to be supplied.

Code and error notification

[Screenshot: Screenshot 2024-08-06 143605]

        dashboard_id = 1
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {self.access_token}",
            "X-CSRFToken": self.csrf_token
        }

        # Fetch the source dashboard to read its json_metadata
        dashboard_response = requests.get(f"{superset_url}/api/v1/dashboard/{dashboard_id}",
                                          headers=headers, cookies=self.cookies)
        dashboard_data = dashboard_response.json()["result"]
        json_meta = dashboard_data.get("json_metadata", "")
        json_meta = json.loads(json_meta)
        # json_meta["position_json"] = dashboard_data.get("position_json", "")
        request_payload = {
            "dashboard_title": "intergrate_test",
            "duplicate_slices": True,
            "json_metadata": json.dumps(json_meta)
        }
        # Copy the dashboard
        r = requests.post(f'{superset_url}/api/v1/dashboard/{dashboard_id}/copy/', json=request_payload,
                          headers=headers, cookies=self.cookies)
        print(r.json())

This is the part of the code that should be checked and changed:

Apache Superset /api/v1/dashboard/{id_or_slug}/copy/ with duplicate_slices set to true

How to reproduce the bug

  1. Use the "{superset_url}/api/v1/dashboard/{dashboard_id}" API to get the dashboard's json_metadata.
  2. Pass it as a parameter to the '{superset_url}/api/v1/dashboard/{dashboard_id}/copy/' API (see the attached code.txt).
  3. You can use the attached code for an easy check; a condensed standalone sketch follows below.
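
A condensed, standalone sketch of these steps. The values of superset_url, access_token, csrf_token, and cookies are assumed to come from an earlier login/CSRF handshake; the names used here are illustrative and not part of the original report:

import json
import requests

superset_url = "http://localhost:8088"   # illustrative
access_token = "..."                     # e.g. obtained from /api/v1/security/login
csrf_token = "..."                       # e.g. obtained from /api/v1/security/csrf_token/
cookies = {}                             # session cookies from the same session

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {access_token}",
    "X-CSRFToken": csrf_token,
}
dashboard_id = 1

# Step 1: read the source dashboard's json_metadata
resp = requests.get(f"{superset_url}/api/v1/dashboard/{dashboard_id}",
                    headers=headers, cookies=cookies)
json_meta = json.loads(resp.json()["result"].get("json_metadata") or "{}")

# Step 2: call the copy endpoint with duplicate_slices=True
payload = {
    "dashboard_title": "copy_test",
    "duplicate_slices": True,
    "json_metadata": json.dumps(json_meta),
}
r = requests.post(f"{superset_url}/api/v1/dashboard/{dashboard_id}/copy/",
                  json=payload, headers=headers, cookies=cookies)

# Step 3: the failure reported above surfaces here (a server-side KeyError for 'positions')
print(r.status_code)
print(r.text)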

Screenshots/recordings

No response

Superset version

master / latest-dev

Python version

3.9

Node version

16

Browser

Chrome

Additional context

No response

Checklist

  • [X] I have searched Superset docs and Slack and didn't find a solution to my problem.
  • [X] I have searched the GitHub issue tracker and didn't find a similar bug report.
  • [X] I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.

Comment From: dosubot[bot]

Hello, @Qambitions! I'm here to help you with any bugs, questions, or contributions you have for the repository. Let's tackle this issue together!

To resolve the KeyError for the key 'positions' when calling the "/api/v1/dashboard/{dashboard_id}/copy/" API with duplicate_slices set to true, you need to ensure that the positions key is included in the json_metadata of the request payload before making the API call.

Here's how you can modify your code to include the positions key:

  1. Fetch the position_json from the dashboard data.
  2. Add the position_json to the json_metadata.

Here's the updated code:

import json
import requests

dashboard_id = 1
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {self.access_token}",
    "X-CSRFToken": self.csrf_token
}

# Fetch the dashboard data
dashboard_response = requests.get(f"{superset_url}/api/v1/dashboard/{dashboard_id}", 
                                  headers=headers, cookies=self.cookies)
dashboard_data = dashboard_response.json()["result"]

# Extract and update json_metadata
json_meta = dashboard_data.get("json_metadata", "")
json_meta = json.loads(json_meta)
json_meta["positions"] = dashboard_data.get("position_json", "")

# Prepare the request payload
request_payload = {
    "dashboard_title": "intergrate_test",
    "duplicate_slices": True,
    "json_metadata": json.dumps(json_meta)
}

# Make the copy API call
r = requests.post(f'{superset_url}/api/v1/dashboard/{dashboard_id}/copy/', 
                  json=request_payload, headers=headers, cookies=self.cookies)
print(r.json())

This code ensures that the positions key is included in the json_metadata when making the API call to /api/v1/dashboard/{dashboard_id}/copy/, preventing the KeyError [1].

To continue talking to Dosu, mention @dosu.

Comment From: Qambitions

I have checked the solution, @dosu, and the code returns the {'message': {'json_metadata': [{'positions': ['Not a valid mapping type.']}]}} error.

Comment From: dosubot[bot]

To resolve the error {'message': {'json_metadata': [{'positions': ['Not a valid mapping type.']}]}}, you need to ensure that the positions key in the json_metadata is correctly formatted as a valid mapping type (i.e., a dictionary).

Here's how you can modify your code to ensure the positions key is correctly formatted:

  1. Fetch the position_json from the dashboard data.
  2. Parse the position_json to ensure it is a valid dictionary.
  3. Add the parsed position_json to the json_metadata.

Here's the updated code:

import json
import requests

dashboard_id = 1
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {self.access_token}",
    "X-CSRFToken": self.csrf_token
}

# Fetch the dashboard data
dashboard_response = requests.get(f"{superset_url}/api/v1/dashboard/{dashboard_id}", 
                                  headers=headers, cookies=self.cookies)
dashboard_data = dashboard_response.json()["result"]

# Extract and update json_metadata
json_meta = dashboard_data.get("json_metadata", "")
json_meta = json.loads(json_meta)

# Ensure positions is a valid mapping type
positions = dashboard_data.get("position_json", "{}")
positions = json.loads(positions) if isinstance(positions, str) else positions
json_meta["positions"] = positions

# Prepare the request payload
request_payload = {
    "dashboard_title": "intergrate_test",
    "duplicate_slices": True,
    "json_metadata": json.dumps(json_meta)
}

# Make the copy API call
r = requests.post(f'{superset_url}/api/v1/dashboard/{dashboard_id}/copy/', 
                  json=request_payload, headers=headers, cookies=self.cookies)
print(r.json())

This code ensures that the positions key is included in the json_metadata as a valid dictionary, which should resolve the error [1][2][3].

To continue talking to Dosu, mention @dosu.

Comment From: Qambitions

I'm sure the current Superset version does not allow adding "positions" to json_metadata; that is why it returns the {'message': {'json_metadata': [{'positions': ['Not a valid mapping type.']}]}} error.
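
One way to check what the copy endpoint's schema actually accepts is to read the OpenAPI spec that Flask-AppBuilder publishes. A rough sketch, assuming the spec is served at /api/v1/_openapi (this may differ or require authentication depending on the deployment), and reusing the superset_url, headers, and cookies defined earlier in this thread:

import json
import requests

# superset_url, headers, cookies as defined earlier in this thread
spec = requests.get(f"{superset_url}/api/v1/_openapi",
                    headers=headers, cookies=cookies).json()

# Locate the dashboard copy endpoint and print its request-body schema
for path, methods in spec.get("paths", {}).items():
    if "dashboard" in path and path.endswith("/copy/"):
        post = methods.get("post", {})
        schema = (post.get("requestBody", {})
                      .get("content", {})
                      .get("application/json", {})
                      .get("schema", {}))
        if "$ref" in schema:  # resolve a component reference if present
            name = schema["$ref"].rsplit("/", 1)[-1]
            schema = spec["components"]["schemas"][name]
        print(path)
        print(json.dumps(schema, indent=2))

If the printed json_metadata schema has no "positions" property, that would confirm why the request above is rejected.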

Comment From: rusackas

Is this still an issue, @Qambitions ? Particularly in 4.0/4.1?

Comment From: Qambitions

It is still an issue in version 4.0; I haven't checked version 4.1 yet.

For this problem, I changed my approach to duplicating the dashboard (a rough sketch follows below):

  1. Use /api/v1/dashboard/export/ to download the dashboard as a ZIP file.
  2. Change the name and generate a new UUID inside the exported ZIP file.
  3. Use /api/v1/dashboard/import/ to post the modified ZIP file.
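
A rough sketch of that export/modify/import flow, assuming the same superset_url, headers, cookies, and dashboard_id as earlier in this thread, PyYAML installed, and that the export layout (dashboards/*.yaml files with dashboard_title and uuid keys) matches your Superset version; importing dashboards whose databases require passwords may need an extra "passwords" form field:

import io
import uuid
import zipfile

import requests
import yaml  # PyYAML

# superset_url, headers, cookies, dashboard_id as defined earlier in this thread

# Step 1: export the dashboard as a ZIP (q is a rison-encoded list of dashboard ids)
export = requests.get(f"{superset_url}/api/v1/dashboard/export/",
                      params={"q": f"!({dashboard_id})"},
                      headers=headers, cookies=cookies)

# Step 2: rewrite the dashboard title and UUID inside the archive
src = zipfile.ZipFile(io.BytesIO(export.content))
out_buf = io.BytesIO()
with zipfile.ZipFile(out_buf, "w", zipfile.ZIP_DEFLATED) as dst:
    for name in src.namelist():
        data = src.read(name)
        if "dashboards/" in name and name.endswith(".yaml"):
            doc = yaml.safe_load(data)
            doc["dashboard_title"] = "intergrate_test_copy"  # illustrative new title
            doc["uuid"] = str(uuid.uuid4())
            data = yaml.safe_dump(doc).encode("utf-8")
        dst.writestr(name, data)
out_buf.seek(0)

# Step 3: import the modified ZIP as a new dashboard
# Drop the JSON Content-Type so requests can set the multipart boundary itself
import_headers = {k: v for k, v in headers.items() if k.lower() != "content-type"}
r = requests.post(f"{superset_url}/api/v1/dashboard/import/",
                  headers=import_headers, cookies=cookies,
                  files={"formData": ("dashboard_copy.zip", out_buf, "application/zip")},
                  data={"overwrite": "false"})
print(r.status_code, r.text)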

I will check again soon and report in a follow-up comment whether this is still an issue.

Comment From: Parzi68

@Qambitions Try this code for the positions issue:

# Parse json_metadata
try:
    json_meta = json.loads(dashboard_data.get("json_metadata", "{}"))
except json.JSONDecodeError:
    json_meta = {}

# Parse position_json
try:
    positions = json.loads(dashboard_data.get("position_json", "{}")) if dashboard_data.get("position_json") else {}
except json.JSONDecodeError:
    positions = {}

# Ensure positions is added to json_metadata
json_meta["positions"] = positions

Comment From: rusackas

Anyone still facing this? It's been silent thus far this year, so I'm tempted to close it as inactive. If it's still an issue in 4.1.2, we can keep this open.

Comment From: rusackas

Closing as inactive/not-planned since folks seem to have moved along. Holler if it needs to be revisited/reopened, and we can see about updating the context here in 5.0.0/master.