
Sidhant
Helper II

Power BI Report creation through Script

Hi everyone,
Recently I was experimenting with whether we can create reports through a script, outside of Power BI Desktop, and I found that it is possible using the '.pbip' file format. I had some success: I was able to generate the 'model.bim' file (which defines the data source, i.e. the tables and their columns, extracted from an Excel file). For this I used a Python script. I started from a zip template containing the basic PBIP structure; the script first extracts this template, and a function I wrote then renames the extracted items based on a report name taken from the user (e.g. Report_Name.pbip, Report_Name.SemanticModel, Report_Name.Report, and the .platform files).

import os
import zipfile
import json
import shutil
import pandas as pd
from datetime import datetime
import copy
import uuid
import time

# Paths
STATIC_DIR = os.path.join(os.path.dirname(__file__), 'Static')
MODEL_TEMPLATE_PATH = os.path.join(STATIC_DIR, 'model_template.json')
DATA_TYPE_MAPPING_PATH = os.path.join(STATIC_DIR, 'data_type_mapping.json')
REPORT_CONFIG_TEMPLATE_PATH = os.path.join(STATIC_DIR, 'report_config_template.json')

# Load static templates
def load_json_template(path):
    with open(path, 'r') as f:
        return json.load(f)

model_template = load_json_template(MODEL_TEMPLATE_PATH)
data_type_map = load_json_template(DATA_TYPE_MAPPING_PATH)
report_static_config = load_json_template(REPORT_CONFIG_TEMPLATE_PATH)

# Function to map data types using external mapping
def map_data_type(value):
    value = str(value).strip().lower()
    mapped_type = data_type_map.get(value, "string")

    valid_types = {"int64", "double", "boolean", "string", "dateTime", "decimal"}
    return mapped_type if mapped_type in valid_types else "string"

def prepare_report_structure(report_name):
    print(" Preparing report structure...")
    ZIP_FILENAME = "Template.zip"
    EXTRACTION_FOLDER = "Extraction"
    original_report = "reportname"

    # Clear previous extraction if exists
    if os.path.exists(EXTRACTION_FOLDER):
        shutil.rmtree(EXTRACTION_FOLDER)

    # Extract the zip
    with zipfile.ZipFile(ZIP_FILENAME, 'r') as zip_ref:
        zip_ref.extractall(EXTRACTION_FOLDER)

    # Define paths
    paths = {
        "report_folder_old": os.path.join(EXTRACTION_FOLDER, f"{original_report}.Report"),
        "semantic_folder_old": os.path.join(EXTRACTION_FOLDER, f"{original_report}.SemanticModel"),
        "pbip_file_old": os.path.join(EXTRACTION_FOLDER, f"{original_report}.pbip"),
        "report_folder_new": os.path.join(EXTRACTION_FOLDER, f"{report_name}.Report"),
        "semantic_folder_new": os.path.join(EXTRACTION_FOLDER, f"{report_name}.SemanticModel"),
        "pbip_file_new": os.path.join(EXTRACTION_FOLDER, f"{report_name}.pbip"),
    }

    # Rename folders and pbip file
    os.rename(paths["report_folder_old"], paths["report_folder_new"])
    os.rename(paths["semantic_folder_old"], paths["semantic_folder_new"])
    os.rename(paths["pbip_file_old"], paths["pbip_file_new"])

    # Update .pbip file
    try:
        with open(paths["pbip_file_new"], 'r', encoding='utf-8') as file:
            data = json.load(file)
        data["artifacts"][0]["report"]["path"] = f"{report_name}.Report"
        with open(paths["pbip_file_new"], 'w', encoding='utf-8') as file:
            json.dump(data, file, indent=4)
    except Exception as e:
        print(f"Error updating .pbip: {e}")

    # Update .platform in .Report
    try:
        platform_path = os.path.join(paths["report_folder_new"], ".platform")
        with open(platform_path, 'r', encoding='utf-8') as file:
            data = json.load(file)
        data["metadata"]["displayName"] = report_name
        with open(platform_path, 'w', encoding='utf-8') as file:
            json.dump(data, file, indent=4)
    except Exception as e:
        print(f"Error updating .platform in .Report: {e}")

    # Update definition.pbir
    try:
        definition_path = os.path.join(paths["report_folder_new"], "definition.pbir")
        with open(definition_path, 'r', encoding='utf-8') as file:
            data = json.load(file)
        data["datasetReference"]["byPath"]["path"] = f"../{report_name}.SemanticModel"
        with open(definition_path, 'w', encoding='utf-8') as file:
            json.dump(data, file, indent=4)
    except Exception as e:
        print(f"Error updating definition.pbir: {e}")

    # Update .platform in .SemanticModel
    try:
        semantic_platform_path = os.path.join(paths["semantic_folder_new"], ".platform")
        with open(semantic_platform_path, 'r', encoding='utf-8') as file:
            data = json.load(file)
        data["metadata"]["displayName"] = report_name
        with open(semantic_platform_path, 'w', encoding='utf-8') as file:
            json.dump(data, file, indent=4)
    except Exception as e:
        print(f"Error updating .platform in SemanticModel: {e}")

    print(" Folder extraction, renaming, and file updates completed successfully.")
    return paths["semantic_folder_new"]

def create_model_bim(tables, output_folder):
    print(" Creating model.bim...")
    start_time = time.time()

    model_bim = copy.deepcopy(model_template)
    query_order = [tbl["name"] for tbl in tables]
    model_bim["model"]["annotations"][0]["value"] = str(query_order)

    updated_tables = []
    for tbl in tables:
        table_name = tbl["name"]

        partition = {
            "name": f"{table_name}_Partition",
            "mode": "import",
            "source": {
                "type": "m",
                "expression": (
                    f"let\n"
                    f"    Source = Sql.Database(\"server_name\", \"database_name\")\n"
                    f"in\n"
                    f"    Source{{[Schema=\"dbo\", Item=\"{table_name}\"]}}[Data]"
                )
            }
        }

        # Add sourceColumn to each column
        for col in tbl["columns"]:
            col["sourceColumn"] = col["name"]

        tbl["partitions"] = [partition]
        updated_tables.append(tbl)

    model_bim["model"]["tables"] = updated_tables

    bim_path = os.path.join(output_folder, "model.bim")
    with open(bim_path, "w", encoding="utf-8") as file:
        json.dump(model_bim, file, indent=4)

    print(f" model.bim created at: {bim_path} in {time.time() - start_time:.2f} seconds.")


def create_visual(visual_type, rows, columns, title, x, y):
    visual_id = str(uuid.uuid4())
    projections = {}

    if columns:
        projections["Y"] = [{"queryRef": f"Sum({columns[0][0]}.{columns[0][1]})"}]
    if rows:
        projections["Category"] = [{"queryRef": f"{rows[0][0]}.{rows[0][1]}", "active": True}]

    config = {
        "name": visual_id,
        "layouts": [
            {
                "id": 0,
                "position": {
                    "x": x,
                    "y": y,
                    "z": 0,
                    "width": 400.0,
                    "height": 300.0,
                    "tabOrder": 0
                }
            }
        ],
        "singleVisual": {
            "visualType": visual_type.lower().replace(" ", ""),
            "projections": projections,
            "prototypeQuery": {},
            "drillFilterOtherVisuals": True,
            "hasDefaultSort": True,
            "objects": {},
            "vcObjects": {
                "title": [
                    {
                        "properties": {
                            "text": {
                                "expr": {
                                    "Literal": {"Value": f"'{title}'"}
                                }
                            }
                        }
                    }
                ]
            }
        }
    }
    return config

def create_dynamic_report_json(report_folder_path, metadata_path):
    print(" Generating report.json from metadata...")
    start_time = time.time()

    df_types = pd.read_excel(metadata_path, sheet_name="chart_types")
    df_axes = pd.read_excel(metadata_path, sheet_name="chart_axes")

    visuals = []
    x, y = 100.0, 100.0
    visuals_per_page = 3
    visual_index = 0
    pages = []

    for _, row in df_types.iterrows():
        worksheet = row['worksheet']
        plot_type = row['plot_type']

        filtered_axes = df_axes[df_axes['worksheet'] == worksheet]

        # Gather row and column fields
        rows = filtered_axes[filtered_axes['type'].str.lower() == 'rows']
        cols = filtered_axes[filtered_axes['type'].str.lower() == 'cols']

        row_fields = list(rows[['table_name', 'column']].dropna().itertuples(index=False, name=None))
        col_fields = list(cols[['table_name', 'column']].dropna().itertuples(index=False, name=None))

        # Skip visual if missing axis info
        if not row_fields or not col_fields:
            print(f" Skipping visual '{worksheet}' due to missing row/column axis data.")
            continue

        config = create_visual(plot_type, row_fields, col_fields, worksheet, x, y)
        visuals.append({
            "config": json.dumps(config),
            "filters": "[]",
            "height": 300.0,
            "width": 400.0,
            "x": x,
            "y": y,
            "z": 0.0
        })

        visual_index += 1
        if visual_index % visuals_per_page == 0:
            x, y = 100.0, 100.0
        else:
            x += 420.0

    for i in range(0, len(visuals), visuals_per_page):
        section_id = str(uuid.uuid4())
        pages.append({
            "config": "{}",
            "displayName": f"Page {i//visuals_per_page + 1}",
            "displayOption": 1,
            "filters": "[]",
            "height": 720.0,
            "name": section_id,
            "visualContainers": visuals[i:i+visuals_per_page],
            "width": 1280.0
        })

    report_json = {
        "config": json.dumps(report_static_config["config"]),
        "layoutOptimization": report_static_config.get("layoutOptimization", 0),
        "resourcePackages": report_static_config.get("resourcePackages", []),
        "sections": pages
    }

    output_path = os.path.join(report_folder_path, "report.json")
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(report_json, f, indent=4)

    print(f" report.json created with {len(visuals)} visuals across {len(pages)} page(s) in {time.time() - start_time:.2f} seconds.")


# ---------- Main Execution ----------

report_name = input("Enter report name: ").strip()

# Step 1: Prepare report structure
semantic_model_path = prepare_report_structure(report_name)

# Step 2: Load Metadata.xlsx (only 'columns' sheet)
print(" Loading table column metadata...")
metadata_path = os.path.join("Files", "MetaData.xlsx")
df = pd.read_excel(metadata_path, sheet_name="columns")

# Group by table_name to create list of tables
tables = []
for table_name, group in df.groupby("table_name"):
    columns = [
        {
            "name": row["caption"],
            "dataType": map_data_type(row.get("type", row.get("data_type", "string")))
        } for _, row in group.iterrows()
    ]
    tables.append({"name": table_name, "columns": columns})

# Step 3: Create model.bim
create_model_bim(tables, semantic_model_path)

# Step 4: Create report.json
dynamic_report_folder_path = os.path.join("Extraction", f"{report_name}.Report")
create_dynamic_report_json(dynamic_report_folder_path, metadata_path)

General structure of the zip file: it includes a Report folder, a SemanticModel folder, and the .pbip file.
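For reference, a rough sketch of the extracted layout, limited to the files the script actually touches (actual templates may contain more files than shown here):

```
Extraction/
├── reportname.pbip
├── reportname.Report/
│   ├── .platform
│   ├── definition.pbir
│   └── report.json
└── reportname.SemanticModel/
    ├── .platform
    └── model.bim
```

The `prepare_report_structure` function renames `reportname` in each of these to the user-supplied report name and patches the paths stored inside the .pbip, .platform, and definition.pbir files accordingly.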

This is the model.bim file that is created by the script (second function, 'create_model_bim'); the tables and columns are read from a metadata file which looks like this:

(screenshot attachment: Sidhant_1-1744783457318.png — sample of the MetaData.xlsx 'columns' sheet)
Here I am focusing on three columns: table_name, caption (i.e. the column name), and type (the column's data type), which generate the model.bim structure (the skeleton of the tables).
With this I got the desired result: on opening the .pbip file I could see the data source and the tables. I then moved on to creating a single visual through the script (initially I explicitly specified that I wanted a bar chart with particular columns, executed it, and it did the job):

def create_valid_report_json(report_folder_path):
    import uuid
    section_id = str(uuid.uuid4())
    visual_id = str(uuid.uuid4())

    # Copy base config
    base_config = copy.deepcopy(report_static_config["config"])
    
    # Build the visual configuration
    visual_config = {
        "name": visual_id,
        "layouts": [
            {
                "id": 0,
                "position": {
                    "x": 100.0,
                    "y": 100.0,
                    "z": 0,
                    "width": 400.0,
                    "height": 300.0,
                    "tabOrder": 0
                }
            }
        ],
        "singleVisual": {
            "visualType": "columnChart",
            "projections": {
                "Y": [{"queryRef": "Sum(Orders.Sales)"}],
                "Category": [{"queryRef": "Orders.Customer Name", "active": True}]
            },
            "prototypeQuery": {
                "Version": 2,
                "From": [
                    {"Name": "Orders", "Entity": "Orders", "Type": 0}
                ],
                "Select": [
                    {
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": "Orders"}},
                                    "Property": "Sales"
                                }
                            },
                            "Function": 0
                        },
                        "Name": "Sum(Orders.Sales)",
                        "NativeReferenceName": "Sum of Sales"
                    },
                    {
                        "Column": {
                            "Expression": {"SourceRef": {"Source": "Orders"}},
                            "Property": "Customer Name"
                        },
                        "Name": "Orders.Customer Name",
                        "NativeReferenceName": "Customer Name"
                    }
                ],
                "OrderBy": [
                    {
                        "Direction": 2,
                        "Expression": {
                            "Aggregation": {
                                "Expression": {
                                    "Column": {
                                        "Expression": {"SourceRef": {"Source": "Orders"}},
                                        "Property": "Sales"
                                    }
                                },
                                "Function": 0
                            }
                        }
                    }
                ]
            },
            "drillFilterOtherVisuals": True,
            "hasDefaultSort": True,
            "objects": {},
            "vcObjects": {
                "title": [
                    {
                        "properties": {
                            "text": {
                                "expr": {
                                    "Literal": {"Value": "'Sales by Customer'"}
                                }
                            }
                        }
                    }
                ]
            }
        }
    }

    report_json = {
        "config": json.dumps(base_config),
        "layoutOptimization": report_static_config.get("layoutOptimization", 0),
        "resourcePackages": report_static_config.get("resourcePackages", []),
        "sections": [
            {
                "config": "{}",
                "displayName": "Page 1",
                "displayOption": 1,
                "filters": "[]",
                "height": 720.0,
                "name": section_id,
                "visualContainers": [
                    {
                        "config": json.dumps(visual_config),
                        "filters": "[]",
                        "height": 300.0,
                        "width": 400.0,
                        "x": 100.0,
                        "y": 100.0,
                        "z": 0.0
                    }
                ],
                "width": 1280.0
            }
        ]
    }

    output_path = os.path.join(report_folder_path, "report.json")
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(report_json, f, indent=4)

    print(f" report.json created at: {output_path}")


So next I wanted to do this dynamically through the metadata file, focusing mainly on two sheets: chart_axes, which tells which columns are used as rows and which as columns (denoted 'cols' in the sheet) and what the tooltips are; and chart_types, which denotes the chart type for each worksheet. So I modified the function above (as you can see in the main script I shared at the start) to read from the metadata file and check chart_types (the source and source_reporting_system columns are common across rows, so those are ignored).
The chart_types and chart_axes sheets in the metadata file:

(screenshot attachments: Sidhant_2-1744784048449.png, Sidhant_3-1744784129303.png — samples of the chart_types and chart_axes sheets)
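To make the lookup concrete, here is a minimal sketch of how the script matches the two sheets (the rows below are made up for illustration; the real values come from the screenshots above):

```python
import pandas as pd

# Hypothetical sample rows mirroring the columns the script reads from
# the 'chart_types' and 'chart_axes' sheets of MetaData.xlsx.
chart_types = pd.DataFrame({
    "worksheet": ["Sales by Customer"],
    "plot_type": ["columnChart"],
})
chart_axes = pd.DataFrame({
    "worksheet": ["Sales by Customer", "Sales by Customer"],
    "type": ["rows", "cols"],
    "table_name": ["Orders", "Orders"],
    "column": ["Customer Name", "Sales"],
})

# The per-chart lookup the main script performs: filter chart_axes by
# worksheet, then split the fields into row and column axes.
ws = chart_types.iloc[0]["worksheet"]
axes = chart_axes[chart_axes["worksheet"] == ws]
row_fields = axes[axes["type"] == "rows"]["column"].tolist()
col_fields = axes[axes["type"] == "cols"]["column"].tolist()
print(row_fields, col_fields)  # ['Customer Name'] ['Sales']
```

A chart with an empty `row_fields` or `col_fields` is exactly the case that triggers the skip logic described below.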


Initially, if no relevant information was found for a chart in chart_axes, the code created a blank visual and continued. But when I tried opening the .pbip file it gave a fatal error: 'Failed to open report'. I suspected the blank visuals were the cause, so I changed the code to skip a chart when no axis information is found for it in chart_axes (which is what the last function in my main script does), but I am still getting the same issue:

(screenshot attachment: Sidhant_4-1744784321881.png — the 'Failed to open report' error)
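One debugging suggestion for the 'Failed to open report' error (an assumption about a likely cause, not a confirmed diagnosis): report.json stores each visual's config as a JSON-encoded string, so a single malformed embedded config can break the whole report. A small validator can at least rule that out before opening the file in Desktop:

```python
import json

def validate_report_json(path):
    """Parse report.json and every JSON string embedded inside it.

    Raises json.JSONDecodeError if any embedded config is not valid
    JSON; returns a list of visuals that are missing a visualType.
    """
    with open(path, encoding="utf-8") as f:
        report = json.load(f)
    json.loads(report["config"])  # top-level config is a JSON string
    problems = []
    for section in report.get("sections", []):
        json.loads(section["config"])
        for i, vc in enumerate(section.get("visualContainers", [])):
            cfg = json.loads(vc["config"])
            if not cfg.get("singleVisual", {}).get("visualType"):
                problems.append(
                    f"visual {i} on '{section['displayName']}' has no visualType"
                )
    return problems
```

If this parses cleanly and returns no problems, the error is more likely in model.bim itself or in a queryRef that points at a table or column the model does not define.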

 

If anyone has any inputs or suggestions on how to debug this, please do share, and if you need any more details, do let me know.

Regards,
Sidhant 

1 ACCEPTED SOLUTION
Sidhant
Helper II
Hi everyone,
A quick update: I am now able to create reports in a programmatic fashion. It looks like this:

def create_model_bim(tables, output_folder):
    # Load a fresh copy of the model template
    model_bim = copy.deepcopy(model_template)
 
    # Replace {{query_order}} annotation
    for annotation in model_bim["model"].get("annotations", []):
        if annotation.get("value") == "{{query_order}}":
            annotation["value"] = str([tbl["name"] for tbl in tables])
 
    # Build table_column_map to validate relationships later
    table_column_map = {}
    
    with open("Static/data_type_mapping.json") as f:
        type_mapping = json.load(f)

    model_bim["model"]["tables"] = []
    for table in tables:
        column_names = [col["name"] for col in table["columns"]]
        table_dict = {
            "name": table["name"],
            "columns": [
                {
                    "name": col["name"],
                    "dataType": type_mapping.get(col["type"].lower(), "string"),
                    "sourceColumn": col["name"]
                } for col in table["columns"]
            ],
            "partitions": [
                {
                    "name": f"{table['name']}_Partition",
                    "mode": "import",
                    "source": {
                        "type": "m",
                        "expression": (
                            f"let Source = Sql.Database(\"server_name\", \"database_name\") "
                            f"in Source{{[Schema=\"dbo\", Item=\"{table['name']}\"]}}[Data]"
                        )
                    }
                }
            ]
        }
        
        
        model_bim["model"]["tables"].append(table_dict)
        # table_column_map[table["name"]] = [col["name"] for col in table["columns"]]
        table_column_map[table["name"]] = column_names
 
    #  Load relationships sheet and parse valid entries
    # metadata_path = os.path.join(os.path.dirname(output_folder), "Files", "MetaData.xlsx")
    metadata_path = os.path.abspath(os.path.join(output_folder, "..", "..", "Files", "MetaData.xlsx"))
    # relationships = extract_relationships_from_metadata(metadata_path)
    relationships = extract_relationships_from_metadata(metadata_path, table_column_map)
    if relationships:
        model_bim["model"]["relationships"] = relationships
 
    measures = extract_measures_from_metadata(metadata_path)
    if measures:
        measures_table = {
            "name": "__Measures",
            "columns": [
                {
                    "name": "Dummy",
                    "dataType": "string"
                }
            ],
            "measures": measures,
            "partitions": [
                {
                    "name": "__Measures_Partition",
                    "mode": "import",
                    "source": {
                        "type": "m",
                        "expression": "let Source = #table({\"Dummy\"}, {}) in Source"
                    }
                }
            ]
        }
        model_bim["model"]["tables"].append(measures_table)
    
    # Write model.bim to file
    bim_path = os.path.join(output_folder, "model.bim")
    with open(bim_path, "w", encoding="utf-8") as file:
        json.dump(model_bim, file, indent=4)
 
    print(f" model.bim created at: {bim_path}")
 
 
#-- Report.json file creation:
def create_valid_report_json(report_folder_path, chart_types_df, chart_axes_df):
    base_config = copy.deepcopy(report_static_config["config"])

    def create_visual_config(visual_type, visual_index, axes):
        visual_id = str(uuid.uuid4())
        layout = {
            "id": 0,
            "position": {
                "x": 100.0 + (visual_index % 2) * 450.0,
                "y": 100.0 + (visual_index // 2) * 350.0,
                "z": 0,
                "width": 400.0,
                "height": 300.0,
                "tabOrder": 0
            }
        }

        projections = {}
        selects = []
        from_tables = set()

        for axis_type, axis_list in axes.items():
            projections[axis_type] = []
            for axis in axis_list:
                table = axis['table_name']
                column = axis['column']
                full_ref = f"{table}.{column}"
                from_tables.add(table)

                if axis.get('aggregation'):
                    agg_func = 0  # sum
                    projections[axis_type].append({"queryRef": f"Sum({full_ref})"})
                    selects.append({
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": table}},
                                    "Property": column
                                }
                            },
                            "Function": agg_func
                        },
                        "Name": f"Sum({full_ref})",
                        "NativeReferenceName": f"Sum of {column}"
                    })
                else:
                    projections[axis_type].append({"queryRef": full_ref, "active": True})
                    selects.append({
                        "Column": {
                            "Expression": {"SourceRef": {"Source": table}},
                            "Property": column
                        },
                        "Name": full_ref,
                        "NativeReferenceName": column
                    })

        visual_config = {
            "name": visual_id,
            "layouts": [layout],
            "singleVisual": {
                "visualType": visual_type,
                "projections": projections,
                "prototypeQuery": {
                    "Version": 2,
                    "From": [{"Name": t, "Entity": t, "Type": 0} for t in from_tables],
                    "Select": selects
                },
                "drillFilterOtherVisuals": True,
                "hasDefaultSort": True,
                "objects": {},
                "vcObjects": {
                    "title": [
                        {
                            "properties": {
                                "text": {
                                    "expr": {
                                        "Literal": {
                                            "Value": f"'{visual_type.title()} Visual {visual_index + 1}'"
                                        }
                                    }
                                }
                            }
                        }
                    ]
                }
            }
        }
        return visual_config

    visual_containers = []

    for idx, row in chart_types_df.iterrows():
        visual_type = row.get('plot_type')
        worksheet = row.get('worksheet')

        if not visual_type or not worksheet:
            continue  # Skip incomplete rows

        relevant_axes = chart_axes_df[
            (chart_axes_df['worksheet'] == worksheet) &
            (chart_axes_df['type'].isin(['rows', 'cols'])) &
            # (chart_axes_df['order_id'] == 0) &
            (chart_axes_df['table_name'].notna())
        ]

        axes_dict = {}
        for axis_type in ['rows', 'cols']:
            group = relevant_axes[relevant_axes['type'] == axis_type]
            if not group.empty:
                axes_dict['Category' if axis_type == 'rows' else 'Y'] = group.apply(
                    lambda axis_row: {
                        'table_name': axis_row['table_name'],
                        'column': axis_row['column'],
                        'aggregation': str(axis_row.get('aggregation', '')).strip().lower() == 'sum'
                    },
                    axis=1
                ).tolist()

        if axes_dict:
            config = create_visual_config(visual_type, idx, axes_dict)
            container = {
                "config": json.dumps(config),
                "filters": "[]",
                "height": 300.0,
                "width": 400.0,
                "x": 100.0 + (idx % 2) * 450.0,
                "y": 100.0 + (idx // 2) * 350.0,
                "z": 0.0
            }
            visual_containers.append(container)

    report_json = {
        "config": json.dumps(base_config),
        "layoutOptimization": 0,
        "resourcePackages": [],
        "sections": [
            {
                "config": "{}",
                "displayName": "Auto Page",
                "displayOption": 1,
                "filters": "[]",
                "height": 720.0,
                "name": str(uuid.uuid4()),
                "visualContainers": visual_containers,
                "width": 1280.0
            }
        ]
    }

    output_path = os.path.join(report_folder_path, "report.json")
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(report_json, f, indent=4)

    print(f" report.json with {len(visual_containers)} visuals written at: {output_path}")

(screenshot attachment: Sidhant_0-1747641165958.png)

 

21 REPLIES
                                    }
                                }
                            }
                        }
                    ]
                }
            }
        }
        return visual_config

    visual_containers = []

    for idx, row in chart_types_df.iterrows():
        visual_type = row.get('plot_type')
        worksheet = row.get('worksheet')

        if not visual_type or not worksheet:
            continue  # Skip incomplete rows

        relevant_axes = chart_axes_df[
            (chart_axes_df['worksheet'] == worksheet) &
            (chart_axes_df['type'].isin(['rows', 'cols'])) &
            # (chart_axes_df['order_id'] == 0) &
            (chart_axes_df['table_name'].notna())
        ]

        axes_dict = {}
        for axis_type in ['rows', 'cols']:
            group = relevant_axes[relevant_axes['type'] == axis_type]
            if not group.empty:
                axes_dict['Category' if axis_type == 'rows' else 'Y'] = group.apply(
                    lambda axis_row: {
                        'table_name': axis_row['table_name'],
                        'column': axis_row['column'],
                        'aggregation': str(axis_row.get('aggregation', '')).strip().lower() == 'sum'
                    },
                    axis=1
                ).tolist()

        if axes_dict:
            config = create_visual_config(visual_type, idx, axes_dict)
            container = {
                "config": json.dumps(config),
                "filters": "[]",
                "height": 300.0,
                "width": 400.0,
                "x": 100.0 + (idx % 2) * 450.0,
                "y": 100.0 + (idx // 2) * 350.0,
                "z": 0.0
            }
            visual_containers.append(container)

    report_json = {
        "config": json.dumps(base_config),
        "layoutOptimization": 0,
        "resourcePackages": [],
        "sections": [
            {
                "config": "{}",
                "displayName": "Auto Page",
                "displayOption": 1,
                "filters": "[]",
                "height": 720.0,
                "name": str(uuid.uuid4()),
                "visualContainers": visual_containers,
                "width": 1280.0
            }
        ]
    }

    output_path = os.path.join(report_folder_path, "report.json")
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(report_json, f, indent=4)

    print(f"report.json with {len(visual_containers)} visuals written at: {output_path}")
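To make the expected metadata shape concrete, here is a minimal, runnable sketch of the two DataFrames the function above reads and the rows/cols grouping it performs (the sample values are hypothetical, but the column names match the ones the function accesses):

```python
import pandas as pd

# Hypothetical metadata, shaped like the Excel-derived DataFrames the
# function expects: plot_type/worksheet and type/table_name/column/aggregation.
chart_types_df = pd.DataFrame([
    {"worksheet": "Sheet1", "plot_type": "columnChart"},
])
chart_axes_df = pd.DataFrame([
    {"worksheet": "Sheet1", "type": "rows", "table_name": "Orders",
     "column": "Customer Name", "aggregation": ""},
    {"worksheet": "Sheet1", "type": "cols", "table_name": "Orders",
     "column": "Sales", "aggregation": "SUM"},
])

# The same grouping step the function performs: rows -> Category, cols -> Y.
axes_dict = {}
for axis_type, bucket in (("rows", "Category"), ("cols", "Y")):
    group = chart_axes_df[chart_axes_df["type"] == axis_type]
    if not group.empty:
        axes_dict[bucket] = group.apply(
            lambda r: {
                "table_name": r["table_name"],
                "column": r["column"],
                "aggregation": str(r.get("aggregation", "")).strip().lower() == "sum",
            },
            axis=1,
        ).tolist()

print(axes_dict)
```

Only rows whose `aggregation` cell is literally "sum" (case-insensitive) get the Sum() projection; everything else is projected as a plain column.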

Sidhant_0-1747641165958.png

 

Hi @Sidhant ,

 

We are pleased to hear that you have found a workaround. Mark it as "Accept as solution" to assist others with similar issues.
Thank you.

Hi @Sidhant ,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

Sidhant
Helper II
Helper II

Hi,
So through the script I am able to generate a basic skeleton that includes the schema, which is inferred from a MetaData sheet that looks like:

Sidhant_0-1746167400604.png

From this I am creating the model.bim file, which is essentially a JSON file. I have also created measures by inferring them from the MetaData sheet (for now I converted the formulas into the appropriate Power BI expressions manually, within the sheet). To keep the measures intact I created a dedicated table that stores all of them, so the measures aren't lost later when the data source gets connected.
Now the thing is, while creating report.json I have explicitly specified (kind of hard-coded) the columns and the visual to be used. This works, but when I tried to generate the visuals dynamically (create report.json by inferring the MetaData sheet), the structure was not right, so even after opening the report the visuals weren't rendered properly (i.e. even after connecting the data source there was no option to add columns), which indicated there is an issue in report.json.
 

Sidhant_2-1746168150954.png

 

Sidhant_1-1746168095759.png

So any inputs on the same how can I make this dynamic as well.

Regards,
Sidhant

 

Hi @Sidhant ,

Thanks for reaching out to the Microsoft fabric community forum. 

 

Please follow below Steps to Dynamically Generate Valid report.json

1. Infer and map dataRoles dynamically from metadata: Every visual has expected data roles (e.g., Axis, Values, Legend).
These must be declared and bound correctly in the JSON:

json code:

"dataRoles": [
    { "name": "Axis", "values": ["Orders.Product ID"] },
    { "name": "Values", "values": ["Orders.Profit"] }
]

Note: You can map roles using a config dictionary per visual type (e.g., bar chart → Axis, Values) and match it with fields in your metadata.
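Such a per-visual-type config dictionary could be sketched as follows (the visual-type names and role buckets below are assumptions drawn from this thread's examples, not an official list):

```python
# Hypothetical mapping from visual type to the data-role buckets it expects;
# extend this per visual you plan to emit from metadata.
VISUAL_DATA_ROLES = {
    "columnChart": ["Category", "Y"],
    "lineChart": ["Category", "Y"],
    "donutChart": ["Category", "Values"],
    "tableEx": ["Values"],
    "gauge": ["Y"],
}

def roles_for(visual_type, default=("Category", "Y")):
    """Return the projection buckets to fill for a given visual type,
    falling back to a generic Category/Y pair for unknown types."""
    return VISUAL_DATA_ROLES.get(visual_type, list(default))

print(roles_for("donutChart"))   # ['Category', 'Values']
print(roles_for("funnelChart"))  # unknown type -> default buckets
```

The generator can then iterate `roles_for(plot_type)` and pull matching fields from the metadata sheet instead of hard-coding which bucket each column lands in.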

2. Use queryRef properly: Each field reference should match the model structure:

json code:

"queryRef": {
    "Product ID": "Orders.Product ID",
    "Profit": "Orders.Profit"
}

Note: If your model.bim defines aliases or you’ve renamed fields, this mapping must be consistent.

3. Validate visual type GUIDs / identifiers: Some of the custom visuals in your screenshot are not rendering because they aren’t included in the report package or are referenced improperly. Either stick with the default visuals Power BI supports out of the box (e.g., bar chart, line chart), or, for custom visuals, ensure you:
- Import them into the report manually or programmatically (pbiviz.json).
- Reference their visualClassName and customVisualName properly in the visual payload.

4. Auto-generate visual layout placeholders: Create a generic grid or layout algorithm (e.g., page index * height/width) to avoid visuals overlapping or missing position info:

json code:

"position": {
    "x": 0,
    "y": 0,
    "z": 0,
    "width": 300,
    "height": 200
}
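A minimal grid-placement helper along these lines (the defaults mirror the 2-column spacing already used in the script; the function name is my own):

```python
def grid_position(index, cols=2, width=400.0, height=300.0,
                  x0=100.0, y0=100.0, gap_x=50.0, gap_y=50.0):
    """Place visual `index` on a simple left-to-right grid so that
    generated visuals never overlap or lack position info."""
    return {
        "x": x0 + (index % cols) * (width + gap_x),
        "y": y0 + (index // cols) * (height + gap_y),
        "z": 0,
        "width": width,
        "height": height,
    }

print(grid_position(0))  # top-left slot
print(grid_position(3))  # second row, second column
```

Each visual's `layouts[0].position` and its `visualContainer` x/y/width/height can both be filled from the same call, keeping the two in sync.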

Sample example: Template for One Visual in report.json

json code:

{
    "visualType": "barChart",
    "dataViewMappings": [
        {
            "conditions": [{}],
            "categorical": {
                "categories": {
                    "for": { "in": "Product ID" }
                },
                "values": {
                    "select": [{ "bind": "Profit" }]
                }
            }
        }
    ],
    "dataRoles": [
        { "name": "Axis", "values": ["Product ID"] },
        { "name": "Values", "values": ["Profit"] }
    ],
    "queryRef": {
        "Product ID": "Orders.Product ID",
        "Profit": "Orders.Profit"
    },
    "position": {
        "x": 0,
        "y": 0,
        "z": 0,
        "width": 300,
        "height": 200
    }
}

 

If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS. Feel free to reach out if you need further assistance.
Thanks and Regards

 

Hi @v-dineshya ,
Thanks for the reply, I will check this out. As of now, when I am creating the report.json within my Python script, I have hard-coded the structure (which columns need to be picked):

def create_valid_report_json(report_folder_path):
 
    section_id = str(uuid.uuid4())
    section_id_page2 = str(uuid.uuid4())
    visual_id_chart = str(uuid.uuid4())
    visual_id_table = str(uuid.uuid4())
    visual_id_pie = str(uuid.uuid4())
    visual_id_gauge = str(uuid.uuid4())
 
    # Copy base config
    base_config = copy.deepcopy(report_static_config["config"])
 
    # === 1. Bar Chart Visual ===
    bar_chart_config = {
        "name": visual_id_chart,
        "layouts": [
            {
                "id": 0,
                "position": {
                    "x": 100.0,
                    "y": 100.0,
                    "z": 0,
                    "width": 400.0,
                    "height": 300.0,
                    "tabOrder": 0
                }
            }
        ],
        "singleVisual": {
            "visualType": "columnChart",
            "projections": {
                "Y": [{"queryRef": "Sum(Orders.Sales)"}],
                "Category": [{"queryRef": "Orders.Customer Name", "active": True}]
            },
            "prototypeQuery": {
                "Version": 2,
                "From": [
                    {"Name": "Orders", "Entity": "Orders", "Type": 0}
                ],
                "Select": [
                    {
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": "Orders"}},
                                    "Property": "Sales"
                                }
                            },
                            "Function": 0
                        },
                        "Name": "Sum(Orders.Sales)",
                        "NativeReferenceName": "Sum of Sales"
                    },
                    {
                        "Column": {
                            "Expression": {"SourceRef": {"Source": "Orders"}},
                            "Property": "Customer Name"
                        },
                        "Name": "Orders.Customer Name",
                        "NativeReferenceName": "Customer Name"
                    }
                ],
                "OrderBy": [
                    {
                        "Direction": 2,
                        "Expression": {
                            "Aggregation": {
                                "Expression": {
                                    "Column": {
                                        "Expression": {"SourceRef": {"Source": "Orders"}},
                                        "Property": "Sales"
                                    }
                                },
                                "Function": 0
                            }
                        }
                    }
                ]
            },
            "drillFilterOtherVisuals": True,
            "hasDefaultSort": True,
            "objects": {},
            "vcObjects": {
                "title": [
                    {
                        "properties": {
                            "text": {
                                "expr": {
                                    "Literal": {"Value": "'Sales by Customer'"}
                                }
                            }
                        }
                    }
                ]
            }
        }
    }
 
    # === 2. Table Visual ===
    table_visual_config = {
        "name": visual_id_table,
        "layouts": [
            {
                "id": 0,
                "position": {
                    "x": 550.0,
                    "y": 100.0,
                    "z": 0,
                    "width": 500.0,
                    "height": 300.0,
                    "tabOrder": 0
                }
            }
        ],
        "singleVisual": {
            "visualType": "tableEx",
            "projections": {
                "Values": [
                    {"queryRef": "Orders.Customer Name"},
                    {"queryRef": "Sum(Orders.Sales)"}
                ]
            },
            "prototypeQuery": {
                "Version": 2,
                "From": [
                    {"Name": "Orders", "Entity": "Orders", "Type": 0}
                ],
                "Select": [
                    {
                        "Column": {
                            "Expression": {"SourceRef": {"Source": "Orders"}},
                            "Property": "Customer Name"
                        },
                        "Name": "Orders.Customer Name",
                        "NativeReferenceName": "Customer Name"
                    },
                    {
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": "Orders"}},
                                    "Property": "Sales"
                                }
                            },
                            "Function": 0
                        },
                        "Name": "Sum(Orders.Sales)",
                        "NativeReferenceName": "Sum of Sales"
                    }
                ]
            },
            "drillFilterOtherVisuals": True,
            "objects": {},
            "vcObjects": {
                "title": [
                    {
                        "properties": {
                            "text": {
                                "expr": {
                                    "Literal": {"Value": "'Customer Sales Table'"}
                                }
                            }
                        }
                    }
                ]
            }
        }
    }

    # Page-2
    # Pie chart
    pie_chart_config = {
        "name": visual_id_pie,
        "layouts": [{
            "id": 0,
            "position": {
                "x": 100.0, "y": 100.0, "z": 0,
                "width": 400.0, "height": 300.0,
                "tabOrder": 0
            }
        }],
        "singleVisual": {
            "visualType": "donutChart",
            "projections": {
                "Category": [{"queryRef": "Orders.Segment"}],
                "Values": [{"queryRef": "Sum(Orders.Sales)"}]
            },
            "prototypeQuery": {
                "Version": 2,
                "From": [{"Name": "Orders", "Entity": "Orders", "Type": 0}],
                "Select": [
                    {
                        "Column": {
                            "Expression": {"SourceRef": {"Source": "Orders"}},
                            "Property": "Segment"
                        },
                        "Name": "Orders.Segment"
                    },
                    {
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": "Orders"}},
                                    "Property": "Sales"
                                }
                            },
                            "Function": 0
                        },
                        "Name": "Sum(Orders.Sales)"
                    }
                ]
            },
            "objects": {},
            "vcObjects": {
                "title": [{
                    "properties": {
                        "text": {
                            "expr": {
                                "Literal": {"Value": "'Sales by Segment'"}
                            }
                        }
                    }
                }]
            }
        }
    }

    # -- Gauge --
    gauge_chart_config = {
        "name": visual_id_gauge,
        "layouts": [{
            "id": 0,
            "position": {
                "x": 550.0, "y": 100.0, "z": 0,
                "width": 400.0, "height": 300.0,
                "tabOrder": 0
            }
        }],
        "singleVisual": {
            "visualType": "gauge",
            "projections": {
                "Y": [{"queryRef": "Sum(Orders.Profit)"}]
            },
            "prototypeQuery": {
                "Version": 2,
                "From": [{"Name": "Orders", "Entity": "Orders", "Type": 0}],
                "Select": [
                    {
                        "Aggregation": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": "Orders"}},
                                    "Property": "Profit"
                                }
                            },
                            "Function": 0
                        },
                        "Name": "Sum(Orders.Profit)"
                    }
                ]
            },
            "objects": {},
            "vcObjects": {
                "title": [{
                    "properties": {
                        "text": {
                            "expr": {
                                "Literal": {"Value": "'Profit Gauge'"}
                            }
                        }
                    }
                }]
            }
        }
    }

 
    # === Final Report JSON ===
    report_json = {
        "config": json.dumps(base_config),
        "layoutOptimization": report_static_config.get("layoutOptimization", 0),
        "resourcePackages": report_static_config.get("resourcePackages", []),
        "sections": [
            {
                "config": "{}",
                "displayName": "Page 1",
                "displayOption": 1,
                "filters": "[]",
                "height": 720.0,
                "name": section_id,
                "visualContainers": [
                    {
                        "config": json.dumps(bar_chart_config),
                        "filters": "[]",
                        "height": 300.0,
                        "width": 400.0,
                        "x": 100.0,
                        "y": 100.0,
                        "z": 0.0
                    },
                    {
                        "config": json.dumps(table_visual_config),
                        "filters": "[]",
                        "height": 300.0,
                        "width": 500.0,
                        "x": 550.0,
                        "y": 100.0,
                        "z": 0.0
                    }
                ],
                "width": 1280.0
            },
            {
                "config": "{}",
                "displayName": "Page 2",
                "displayOption": 1,
                "filters": "[]",
                "height": 720.0,
                "name": section_id_page2,
                "visualContainers": [
                    {
                        "config": json.dumps(pie_chart_config),
                        "filters": "[]",
                        "height": 300.0,
                        "width": 400.0,
                        "x": 100.0,
                        "y": 100.0,
                        "z": 0.0
                    },
                    {
                        "config": json.dumps(gauge_chart_config),
                        "filters": "[]",
                        "height": 300.0,
                        "width": 400.0,
                        "x": 550.0,
                        "y": 100.0,
                        "z": 0.0
                    }
                ],
                "width": 1280.0
            }
        ]
    }
    
 
    output_path = os.path.join(report_folder_path, "report.json")
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(report_json, f, indent=4)
 
    print(f"report.json with bar chart + table written at: {output_path}")
 

Instead, I was thinking of templatizing this, wherein as per the requirement I would just supply the table name and the columns, and the script would fill them into the respective positions.

//report.json
 "visualContainers": [
                {
                    "config": "{\"name\": \"d8ac8514-74a7-4abb-b788-d338689a4975\", \"layouts\": [{\"id\": 0, \"position\": {\"x\": 100.0, \"y\": 100.0, \"z\": 0, \"width\": 400.0, \"height\": 300.0, \"tabOrder\": 0}}], \"singleVisual\": {\"visualType\": \"columnChart\", \"projections\": {\"Y\": [{\"queryRef\": \"Sum(Orders.Sales)\"}], \"Category\": [{\"queryRef\": \"Orders.Customer Name\", \"active\": true}]}, \"prototypeQuery\": {\"Version\": 2, \"From\": [{\"Name\": \"Orders\", \"Entity\": \"Orders\", \"Type\": 0}], \"Select\": [{\"Aggregation\": {\"Expression\": {\"Column\": {\"Expression\": {\"SourceRef\": {\"Source\": \"Orders\"}}, \"Property\": \"Sales\"}}, \"Function\": 0}, \"Name\": \"Sum(Orders.Sales)\", \"NativeReferenceName\": \"Sum of Sales\"}, {\"Column\": {\"Expression\": {\"SourceRef\": {\"Source\": \"Orders\"}}, \"Property\": \"Customer Name\"}, \"Name\": \"Orders.Customer Name\", \"NativeReferenceName\": \"Customer Name\"}], \"OrderBy\": [{\"Direction\": 2, \"Expression\": {\"Aggregation\": {\"Expression\": {\"Column\": {\"Expression\": {\"SourceRef\": {\"Source\": \"Orders\"}}, \"Property\": \"Sales\"}}, \"Function\": 0}}}]}, \"drillFilterOtherVisuals\": true, \"hasDefaultSort\": true, \"objects\": {}, \"vcObjects\": {\"title\": [{\"properties\": {\"text\": {\"expr\": {\"Literal\": {\"Value\": \"'Sales by Customer'\"}}}}}]}}}",
                    "filters": "[]",
                    "height": 300.0,
                    "width": 400.0,
                    "x": 100.0,
                    "y": 100.0,
                    "z": 0.0
                }

So in the above snippet I wanted to templatize this (I was thinking of storing a template and substituting values wherever needed, e.g. Source: table name (here it is Orders); queryRef: Table_Name.Column_Name (e.g. Orders.Customer Name)).
As of now I have already created a Static folder where I keep some boilerplate code (for report.json, model.bim); in a similar fashion, can we templatize this such that when I run the script it replaces the placeholders (in config) and I don't need to explicitly mention the visuals and the columns?
And the other thing: if I need to have a page-level filter on a column, what should the structure for that be, i.e. how do I add it?
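One way to sketch the templating idea with Python's built-in `string.Template` (the placeholder names below are my own, not a Power BI convention; the template would live in the Static folder like the other boilerplate):

```python
import json
from string import Template

# Minimal visual-config template with $placeholders; a real template would
# carry the full singleVisual/prototypeQuery structure from report.json.
VISUAL_TEMPLATE = Template(json.dumps({
    "singleVisual": {
        "visualType": "$visual_type",
        "projections": {
            "Category": [{"queryRef": "$table.$category"}],
            "Y": [{"queryRef": "Sum($table.$value)"}]
        }
    }
}))

# Fill the placeholders per visual, then parse back into a dict so the
# result can be json.dumps()-ed into the visualContainer "config" string.
config_str = VISUAL_TEMPLATE.substitute(
    visual_type="columnChart", table="Orders",
    category="Customer Name", value="Sales",
)
config = json.loads(config_str)
print(config["singleVisual"]["projections"]["Y"][0]["queryRef"])
```

Substituting into the serialized JSON (rather than into nested dicts) keeps the template file readable and diff-able, at the cost of having to re-parse before embedding.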
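On the page-level filter question: in the legacy report.json layout each section carries a `filters` property that is a JSON-encoded array, just like the visual containers' `filters`. Below is a hedged sketch of what a categorical page filter might look like; the property names follow how Power BI Desktop appears to serialize filters, but you should verify the exact shape against a report.json saved from Desktop with such a filter applied:

```python
import json

# Hypothetical page-level categorical filter on Orders[Segment]; verify the
# exact property names against a Desktop-saved report.json before relying on it.
page_filter = {
    "name": "Filter1",
    "type": "Categorical",
    "expression": {
        "Column": {
            "Expression": {"SourceRef": {"Entity": "Orders"}},
            "Property": "Segment",
        }
    },
    "filter": {
        "Version": 2,
        "From": [{"Name": "o", "Entity": "Orders", "Type": 0}],
        "Where": [{
            "Condition": {
                "In": {
                    "Expressions": [{
                        "Column": {
                            "Expression": {"SourceRef": {"Source": "o"}},
                            "Property": "Segment",
                        }
                    }],
                    "Values": [[{"Literal": {"Value": "'Consumer'"}}]],
                }
            }
        }],
    },
}

# The section's "filters" field is a string, so serialize the array:
section_filters = json.dumps([page_filter])
```

This string would replace the `"filters": "[]"` value on the section dict in the script above.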

Apologies for the late response.

Regards,
Sidhant.

Hi @Sidhant ,

Thank you for reaching out to the Microsoft Community Forum.

 

I can't replicate the issue. It's advisable to raise a support ticket with Microsoft for further backend investigation.

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

 

If my response solved your query, please mark it as the "Accepted solution" to help others find it easily.

And if my answer was helpful, I'd really appreciate a "Kudos".

 

Thanks

Hi @v-dineshya ,

Ok, thanks for the clarification, I will have a look. Just to give you an update: I am now able to resolve the issue and can create the visuals and the schema (tables and the respective columns) from the MetaData file.

Regards,
Sidhant

Hi @Sidhant ,

 

We are pleased to hear that you have found a workaround. Please share the details here and mark it as "Accept as solution" to assist others with similar issues.
Thank you.

Hi @Sidhant ,

If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.

Thank you


v-dineshya
Community Support
Community Support

Hi @Sidhant ,

Thank you for reaching out to the Microsoft Community Forum.

 

Can you please refer the Microsoft official documents.

Run Python scripts in Power BI Desktop - Power BI | Microsoft Learn

Create Power BI visuals using Python in Power BI Desktop - Power BI | Microsoft Learn

Use an external Python IDE with Power BI - Power BI | Microsoft Learn

 

If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.

Thank you

Hi @v-dineshya ,
The thing is, I am trying to use scripts to create the visuals using the .pbip file format. The documentation links you gave are, I assume, meant to be used inside Power BI Desktop (Get Data -> Python script), but that's not my requirement; mine is different.
So I was able to fix some issues:


So now I am able to open the pbip file, but even after adding the connection reference for data on the visuals, I am not able to see the add-columns option which we usually get:
Sidhant_0-1744871092969.png


So if you have any idea let me know.

Regards,
Sidhant.

Hi @Sidhant ,

Thank you for reaching out to the Microsoft Community Forum.

 

Please follow these steps:

1. Add Custom Visuals to Your Project
If your visuals are custom visuals, you need to: Download the custom .pbiviz files for the visuals you're using (barchart, linechart, etc.). Place them in the correct location and register them in your pbip project.

If you're unsure which custom visuals are needed, open the report.json and look under visualContainers -> config -> singleVisual -> visualType to identify them.

2. Reference Them in capabilities.json or the Report Definition: You’ll also need to ensure these visuals are referenced in the project metadata. That often means: Adding them to the visualPlugins array. Making sure their capabilities (like dataRoles) are defined properly so Power BI knows what fields can go in which bucket

3. Field wells appear only if the visual is valid: Once a visual is loaded successfully, Power BI Desktop will enable the field wells (X-axis, Y-axis, legend, etc.). You won’t get the “add data fields here” option unless:
- The visual is recognized.
- It has defined data roles in its capabilities.
- The dataset is connected and valid.

 

If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.

Thank you

Hi @Sidhant ,

If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.

Thank you

Hi @v-dineshya ,
Apologies for the late response, but your reply addresses something somewhat different (it may be correct as per your understanding). In my case I am looking for something else, so if you need any additional input do let me know, I will be happy to share.

Regards,
Sidhant

Hi @Sidhant ,

 

Please provide more inputs and brief explanation on your expected output.

 

Thanks

Hi @v-dineshya ,
Just to explain in short: I am exploring whether we can create an entire report programmatically. I understood that we can (using the .pbip file extension), so I created a kind of boilerplate zip package (I have shared the main code in my post, you can check that out). From the user I take the report name as input, and then the script is divided into 3 functions:
Function 1: Unzips the zip package and renames a few files and folders:

Sidhant_0-1745838385568.pngSidhant_1-1745838463978.png

.platform:

Sidhant_2-1745838533357.png

 


Inside a few files (.platform, report_name.pbip) and folders, the report_name placeholder is replaced by the value provided by the user. Then comes the table schema creation: table names, the respective columns and their types, relationships, and measure creation, all inferred from a MetaData.xlsx Excel sheet, with the file created accordingly. For the static content in these files I have created a few templates that are referred to again and again in the main code.
Like: For report.json

Sidhant_3-1745838754888.png

Once this is done, report.json is created (which visuals are used is also inferred from the sheet (chart_axes)); then I open the pbip file.
(Since I am using .bim, I need to remove the files with the .tmdl extension present inside the 'definition' folder -> cultures (en-US.tmdl).)
Now I need the skeleton to be present, that is, the visuals and the schema (there won't be any data, but as soon as I connect the appropriate DB, the visuals will show the data from the respective columns). The issue, I think, is that the report.json file is not being created properly, due to which the visuals are not rendered properly.
So for now in report.json I have explicitly specified two visuals (what the rows, columns and the visual type should be); that works, but this needs to be dynamic.
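The .tmdl cleanup step mentioned above can be scripted as part of the pipeline; a minimal sketch (the function name is my own):

```python
import os

def remove_tmdl_files(semantic_model_dir):
    """Delete leftover .tmdl files (e.g. definition/cultures/en-US.tmdl)
    so the semantic model is loaded from model.bim instead of TMDL."""
    removed = []
    for root, _dirs, files in os.walk(semantic_model_dir):
        for name in files:
            if name.endswith(".tmdl"):
                path = os.path.join(root, name)
                os.remove(path)
                removed.append(path)
    return removed
```

Calling this on the SemanticModel folder right after the zip extraction removes the need to delete the files by hand on every run.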

Do let me know if you need any more inputs.

Regards,
Sidhant.

Hi @Sidhant ,

If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.

Thank you
