Bug description
Hi, I am automating the dashboard import process by keeping all the files in a repo and importing them into Superset. I have a requirement to add multiple databases. My current folder structure for the Reporting database is below. If I need to add a new database (e.g. `Superset_metadb.yaml`), what should the structure look like to handle multiple databases?

```
/tmp/repo/databases/databases.zip

databases/
databases/metadata.yaml
databases/datasets/
databases/datasets/Reporting/
databases/datasets/Reporting/Inventory_Cyclecount_by_Task_Priority_and_Zone.yaml
databases/datasets/Reporting/Inventory_Task_Ended.yaml
databases/Reporting.yaml
```
Screenshots/recordings
No response
Superset version
5.0.0
Python version
3.11
Node version
18 or greater
Browser
Chrome
Additional context
No response
Checklist
- [x] I have searched Superset docs and Slack and didn't find a solution to my problem.
- [x] I have searched the GitHub issue tracker and didn't find a similar bug report.
- [x] I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.
Comment From: dosubot[bot]
To import multiple database connections into Superset (v5.0.0) using your automated process, structure your ZIP file like this:

```
databases/
  metadata.yaml
  Reporting.yaml
  Superset_metadb.yaml
  ...
  datasets/
    Reporting/
      Inventory_Cyclecount_by_Task_Priority_and_Zone.yaml
      Inventory_Task_Ended.yaml
    Superset_metadb/
      ... (datasets for this DB)
```

Each database connection gets its own YAML file under `databases/` (e.g., `Reporting.yaml`, `Superset_metadb.yaml`). Datasets for each database go in their own subfolder under `datasets/` (e.g., `datasets/Reporting/`, `datasets/Superset_metadb/`). This structure is explicitly supported by the import logic in Superset 5.x and is the standard approach for bulk importing databases and datasets via the UI or API.
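For instance, a minimal sketch of how the archive could be assembled from your repo checkout (the `/tmp/repo` layout is taken from your example and the `/tmp/databases.zip` output path is just a placeholder; adjust both to your setup):

```bash
# Run from the repo root so the archive contains a single top-level databases/ folder
cd /tmp/repo
zip -r /tmp/databases.zip databases/

# Quick sanity check: both database YAMLs and their dataset folders should be listed
unzip -l /tmp/databases.zip | grep -E 'Reporting|Superset_metadb'
```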
When importing, you can provide passwords for each database as a JSON map, keyed by the path to the YAML file (e.g., `{"databases/Reporting.yaml": "password1", "databases/Superset_metadb.yaml": "password2"}`).

No special naming convention is required beyond placing each database YAML in the `databases/` directory and organizing datasets by database name under `datasets/`. This approach works for automated imports and is robust for handling multiple databases in Superset.
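For reference, a rough sketch of what one of the per-database YAML files typically looks like (field names follow a standard Superset export; the values here are placeholders, not your real configuration):

```yaml
# databases/Superset_metadb.yaml -- placeholder values only
database_name: Superset_metadb
sqlalchemy_uri: postgresql+psycopg2://superset:XXXXXXXXXX@superset-db:5432/superset  # password is masked in exports
expose_in_sqllab: true
uuid: 00000000-0000-0000-0000-000000000001   # must be unique per database
version: 1.0.0
```

Because the password in `sqlalchemy_uri` is masked on export, the real password has to be supplied through the `passwords` map at import time.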
Comment From: Devika7733
@dosu Then how do we pass the passwords for multiple databases in the API call if I am keeping multiple databases in a zip file like:

```
databases/
  metadata.yaml
  Reporting.yaml
  Superset_metadb.yaml
```

I am currently using the below endpoint for uploading assets:

```bash
local response=$(curl -k -s -w "%{http_code}" -b cookies.txt -o "${asset_type}_response_body.txt" \
  --url "https://superset.test.com/api/v1/database/import/" \
  --header 'accept: application/json' \
  --header "Authorization: Bearer $ACCESS_TOKEN" \
  --header 'content-type: multipart/form-data' \
  --header "referer: https://superset.test.com" \
  --header "x-csrftoken: $CSRF_TOKEN" \
  --form "formData=@$zip_file" \
  --form overwrite=true \
  --form "passwords={\"databases/SuperSet.Metadata.yaml\": \"$SUPERSET_DB_PASS\"}")
```
Comment From: dosubot[bot]
To pass passwords for multiple databases in your API call, use the `passwords` form field as a JSON string mapping each database YAML path (as it appears in your zip, e.g., `databases/Reporting.yaml`, `databases/Superset_metadb.yaml`) to its password. The keys must match the file paths exactly, including case and directory structure. For example:
```bash
--form 'passwords={"databases/Reporting.yaml":"'"$REPORTING_PASS"'", "databases/Superset_metadb.yaml":"'"$SUPERSET_DB_PASS"'"}'
```

Your curl command should look like:

```bash
curl -k -s -w "%{http_code}" -b cookies.txt -o "${asset_type}_response_body.txt" \
  --url "https://superset.test.com/api/v1/database/import/" \
  --header 'accept: application/json' \
  --header "Authorization: Bearer $ACCESS_TOKEN" \
  --header 'content-type: multipart/form-data' \
  --header "referer: https://superset.test.com" \
  --header "x-csrftoken: $CSRF_TOKEN" \
  --form "formData=@$zip_file" \
  --form overwrite=true \
  --form 'passwords={"databases/Reporting.yaml":"'"$REPORTING_PASS"'", "databases/Superset_metadb.yaml":"'"$SUPERSET_DB_PASS"'"}'
```
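As a small follow-up sketch (assuming the `-o`/`-w` pattern from your script, where the captured variable holds only the HTTP status code and the response body goes to the file), you could fail fast on a non-200 response:

```bash
# $response holds only the HTTP status code because -o redirected the body to a file
if [ "$response" != "200" ]; then
  echo "Database import failed (HTTP $response), server response:"
  cat "${asset_type}_response_body.txt"
  exit 1
fi
```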
If you need to provide SSH tunnel credentials, use the same pattern for the relevant fields (`ssh_tunnel_passwords`, `ssh_tunnel_private_keys`, `ssh_tunnel_private_key_passwords`) as JSON objects keyed by the database YAML path.
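For example, a hedged sketch of supplying an SSH tunnel password for one of the databases (the `$REPORTING_SSH_PASS` variable is just a placeholder for illustration):

```bash
--form 'ssh_tunnel_passwords={"databases/Reporting.yaml":"'"$REPORTING_SSH_PASS"'"}'
```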
Common pitfalls: make sure your JSON is properly escaped, and the paths match exactly what’s in your zip file. If a password is missing or a path is incorrect, the import will fail.
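If the inline quoting gets error-prone, one option (a sketch assuming `jq` is available; the variable names are placeholders) is to build the JSON with `jq` first:

```bash
# Build the passwords map with jq so quoting and escaping are handled for you
passwords_json=$(jq -cn \
  --arg rep "$REPORTING_PASS" \
  --arg meta "$SUPERSET_DB_PASS" \
  '{"databases/Reporting.yaml": $rep, "databases/Superset_metadb.yaml": $meta}')

# ...and then replace the hand-written JSON in the curl call with:
#   --form "passwords=$passwords_json"
```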