Devanshu
Frequent Visitor

Dynamic Filters and Dynamic Data Grouping

Hello team,

Hope you are doing well.
I have data from a ping report that covers multiple locations and multiple destinations.

In the data, I have Start Date and Time, End Date and Time, Location, Destination, Maximum Latency, and Packets Lost.

Now I have to create a visualization that shows the 75th and 95th percentiles of Maximum Latency. I also need a granularity option, which I can keep in a slicer. I want the data grouped by Date -> Location -> Destination -> Time, and the grouping should happen dynamically so that I can use fewer charts on my canvas.

The granularity we have been asked for works like this:

If I select Hourly in the slicer, the granularity options should be 10 minutes, 20 minutes, and 30 minutes. If the granularity is 10 minutes, we should get six points per hour on the line chart.

The granularity levels I have to work with are:

Hourly: 10 minutes, 20 minutes, 30 minutes
Daily: 1 hour, 4 hours, 8 hours, 12 hours
Weekly: 12 hours, 24 hours
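To make the requirement concrete, here is a small sketch of the mapping I have in mind (the function and dictionary names are my own, not from any existing code): each time-frame choice in the slicer exposes only its allowed granularities, and the granularity determines how many points the line chart shows per window.

```python
# Hypothetical mapping of the slicer's time-frame choice to its allowed
# granularity options, expressed in minutes.
GRANULARITY_OPTIONS = {
    "Hourly": [10, 20, 30],
    "Daily": [60, 240, 480, 720],
    "Weekly": [720, 1440],
}

# Length of one window for each time frame, in minutes.
WINDOW_MINUTES = {"Hourly": 60, "Daily": 1440, "Weekly": 10080}

def points_per_window(time_frame: str, granularity_min: int) -> int:
    """Number of points a line chart would show for one window."""
    if granularity_min not in GRANULARITY_OPTIONS[time_frame]:
        raise ValueError(f"{granularity_min} min is not valid for {time_frame}")
    return WINDOW_MINUTES[time_frame] // granularity_min

# e.g. Hourly at 10-minute granularity -> 60 // 10 = 6 points per hour
```
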

I want these as dynamic filters on all the visuals that I present on the canvas.

Please help me understand how I can achieve this.

lbendlin
Super User

Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).

Do not include sensitive information. Do not include anything that is unrelated to the issue or question.

Need help uploading data? https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...

Please show the expected outcome based on the sample data you provided.

Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...

 

 

import pandas as pd

# Function to apply granularity and calculate averages
def apply_granularity(df, interval):
    # Ensure 'Start Time' is set as the index
    df = df.set_index('Start Time')
    
    aggregated_data = df.resample(interval).agg({
        'Minimum Latency': 'mean',
        'Maximum Latency': 'mean',
        'Average Latency': 'mean'
    }).dropna()
    
    return aggregated_data.round()

# Function to detect anomalies using IQR and handle NaN values
def detect_anomalies(latency_data):
    Q1 = latency_data['Maximum Latency'].quantile(0.25)
    Q3 = latency_data['Maximum Latency'].quantile(0.75)
    IQR = Q3 - Q1
    upper_bound = Q3 + 1.5 * IQR
    
    # Handle cases where upper_bound may be NaN
    if pd.isna(upper_bound):
        upper_bound = 0
    
    latency_data['Anomaly'] = latency_data['Maximum Latency'].apply(lambda x: 'Yes' if (x > upper_bound) else 'No')
    return latency_data.round(), round(upper_bound)

# Function to generate a table of latency stats
def generate_latency_table(latency_data, interval, upper_bound):
    latency_data['DateHour'] = latency_data.index.strftime('%Y-%m-%d %H:00')

    def anomaly_range(x):
        valid_anomalies = x[x > upper_bound]
        if not valid_anomalies.empty:
            return f"{int(valid_anomalies.min())}-{int(x.max())}"
        else:
            return "No anomaly"
    
    latency_data['Min Above Upper Bound'] = latency_data['Maximum Latency'].apply(lambda x: x if x > upper_bound else None)
    
    table_data = latency_data.groupby('DateHour').agg({
        'Maximum Latency': [lambda x: round(x.quantile(0.75)), lambda x: round(x.quantile(0.95))],
        'Min Above Upper Bound': 'min',
        'Anomaly': lambda x: (x == 'Yes').sum()
    })
    
    table_data.columns = ['Max. Latency 75th Percentile', 'Max. Latency 95th Percentile', 'Min Anomaly Latency', 'Anomalies Count']
    table_data = table_data.round()
    table_data['Anomalies'] = latency_data.groupby('DateHour')['Maximum Latency'].apply(anomaly_range).fillna("No anomaly")
    
    return table_data

# Granularity intervals for different time frames
# ('min' and 'h' are the current pandas offset aliases; 'T'/'H' are deprecated)
daily_granularity_intervals = {"5": '5min', "10": '10min', "15": '15min', "30": '30min'}
weekly_granularity_intervals = {"240": '4h', "480": '8h', "720": '12h'}
monthly_granularity_intervals = {"720": '12h', "1440": '24h'}
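For reference, here is a minimal, self-contained sketch of the same pipeline (resample, percentiles, IQR upper bound) run on synthetic one-hour data; the latency values are made up purely for illustration:

```python
import numpy as np
import pandas as pd

# Synthetic one-hour ping report at 1-minute resolution: a flat 10 ms
# baseline with one obvious spike in the last minute (illustrative values).
rng = pd.date_range("2024-01-21 12:00", periods=60, freq="min")
df = pd.DataFrame({
    "Start Time": rng,
    "Maximum Latency": np.r_[np.full(59, 10.0), [100.0]],
})

# Resample to a 10-minute granularity, as apply_granularity() does.
# One hour at 10-minute granularity -> 6 points on the line chart.
agg = df.set_index("Start Time").resample("10min")["Maximum Latency"].mean()

# 75th / 95th percentiles of the aggregated maximum latency.
p75 = agg.quantile(0.75)
p95 = agg.quantile(0.95)

# IQR upper bound, as detect_anomalies() does.
q1, q3 = agg.quantile(0.25), agg.quantile(0.75)
upper_bound = q3 + 1.5 * (q3 - q1)
anomaly_count = int((agg > upper_bound).sum())  # the spiked bin is flagged
```
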

 

 

 

 

The above is the Python code I wrote to get my desired output, and it can produce the Excel files I need. Now I want to do the same thing in Power BI: a table, a line chart showing the 75th and 95th percentiles of Maximum Latency, and a bubble chart showing the count of anomalies, with the bubble size varying based on the anomaly count.

Please find the data template given below. I have 14 Probs and 15 Destinations (distinct counts).

 

The granularity levels I have to focus on are:

  • 10 minutes, 20 minutes, 30 minutes for hourly data
  • 1 hour, 4 hours, 8 hours, 12 hours for daily data
  • 12 hours, 24 hours for Weekly and Monthly data
StartTime           | Prob       | Destination  | Minimum/Maximum/Average Latency, Packets Sent/Lost | EndTime
01/21/2024 12:00:00 | Location 1 | Destination1 | 28.6360                                            | 01/21/2024 12:01:00
01/21/2024 12:00:00 | Location 2 | Destination2 | 2.18260                                            | 01/21/2024 12:01:00
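Assuming the template columns above (with made-up latency and packet values, since the real numbers come from the ping report export), the data can be loaded into pandas for the kind of processing shown earlier:

```python
from io import StringIO

import pandas as pd

# Illustrative rows in the template's column order; all numeric values
# here are invented placeholders, not real report data.
csv_text = (
    "StartTime,Prob,Destination,Minimum Latency,Maximum Latency,"
    "Average Latency,Packet Sent,Packet Lost,EndTime\n"
    "01/21/2024 12:00:00,Location 1,Destination1,2.0,8.6,3.1,60,0,01/21/2024 12:01:00\n"
    "01/21/2024 12:00:00,Location 2,Destination2,1.8,6.2,2.4,60,0,01/21/2024 12:01:00\n"
)
df = pd.read_csv(StringIO(csv_text), parse_dates=["StartTime", "EndTime"])

# Overall packet-loss percentage across all rows.
loss_pct = 100 * df["Packet Lost"].sum() / df["Packet Sent"].sum()
```
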

 

I hope this information helps you assist me further.

Looking forward to hearing back from you.
