Jul1An
Frequent Visitor

Python word clustering

Hello,

I'm currently facing a Python clustering problem. I would like to extract the individual customer comments from a column (say "comment") of the table "rawdata", remove stop words, and then cluster by similar-sounding terms. Similar-sounding because umlauts like ä, ö, ... are not always saved/displayed correctly.

Unfortunately, I can't get a visual to work here. Can someone help me?

Code I tried to use:

# The following code for creating a data frame and removing duplicate rows is always executed and serves as a preamble to your script:

# dataset = pandas.DataFrame(Comments)
# dataset = dataset.drop_duplicates()
# Paste or type your script code here:
import pandas as pd
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

# Raw string so the backslashes in the Windows path are not treated as escape sequences
df = pd.read_excel(r"C:\Users\Data.xlsx")

def extract_entities(text):
    stop_words = set(stopwords.words('german'))
    word_tokens = word_tokenize(text, language='german')
    filtered_tokens = [word.lower() for word in word_tokens
                       if word.lower() not in stop_words and word.isalpha()]
    # nltk's built-in pos_tag only ships taggers for 'eng' and 'rus',
    # so the default (English) tagger is used here instead of lang='de'
    tagged = nltk.pos_tag(filtered_tokens)
    entities = nltk.chunk.ne_chunk(tagged, binary=True)
    return entities

# ne_chunk returns Tree objects, which are not hashable; store a string
# representation so the groupby below works
df['entities'] = df['Disp_Comments'].apply(extract_entities).astype(str)
df.groupby('entities')['Disp_Comments'].count()
 
Data sometimes looks like this:

ID | Comment
1  | Kündigung rückgängig machen
2  | Invoice
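
To illustrate the "similar sounding" clustering described above, here is a minimal sketch (not the original script) that transliterates umlauts, removes German stop words, and clusters the comments with character n-gram TF-IDF and KMeans from scikit-learn. The two example rows and the cluster count are assumptions for demonstration only:

import pandas as pd
from nltk.corpus import stopwords
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Stand-in for the real data; in the visual this would come from `dataset`
df = pd.DataFrame({"ID": [1, 2],
                   "Comment": ["Kündigung rückgängig machen", "Invoice"]})

def normalize(text):
    # Transliterate umlauts so "Kündigung" and "Kuendigung" look alike
    replacements = {"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"}
    text = text.lower()
    for umlaut, ascii_form in replacements.items():
        text = text.replace(umlaut, ascii_form)
    stop_words = set(stopwords.words("german"))
    return " ".join(w for w in text.split() if w.isalpha() and w not in stop_words)

df["clean"] = df["Comment"].apply(normalize)

# Character n-grams are tolerant of broken umlauts and small spelling variations
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
X = vectorizer.fit_transform(df["clean"])

# The number of clusters is a guess; tune it on the real comments
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
df["cluster"] = kmeans.fit_predict(X)
print(df[["Comment", "cluster"]])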
 
1 REPLY
lbendlin
Super User

Your Python visual needs to plot something to the default renderer.  What are you planning to display once you are done?
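
For example, a minimal script for the Python visual could aggregate the comments and render a chart with matplotlib. This is only a sketch: in the visual the `dataset` frame is created by the preamble, and the `Disp_Comments` column name is an assumption taken from the code above; a small stand-in frame is built here so the snippet runs on its own.

import pandas as pd
import matplotlib.pyplot as plt

# Stand-in for the frame the Python visual's preamble would provide as `dataset`
dataset = pd.DataFrame({"Disp_Comments": ["Kündigung rückgängig machen",
                                          "Invoice", "Invoice"]})

# Count how often each comment occurs and plot the most frequent ones
counts = dataset["Disp_Comments"].value_counts().head(10)

fig, ax = plt.subplots(figsize=(8, 5))
counts.plot.barh(ax=ax)
ax.set_xlabel("Number of comments")
ax.set_title("Most frequent comments")
plt.tight_layout()
plt.show()  # the visual renders whatever matplotlib draws to the default figure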
