Analysing comments to "Star Wars: The Last Jedi" - part 2
The greatest teacher, failure is.
Posted on December 21, 2017
As already mentioned in my first post, I also analysed the user comments from a post at www.starwars-union.de word by word. The figure shows the 'wordcloud' built from all comments (1728 so far).
To create such a nice wordcloud, I used the following code; the first part was already explained in my first post.
First I load all necessary packages and build the URLs of all comment pages.
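The original R snippet for this step is not reproduced in the post. As a rough sketch of the idea in Python (the article URL and the paging parameter below are assumptions for illustration, not the site's actual scheme):

```python
# Sketch: build one URL per comment page.
# NOTE: BASE and the "seite" paging parameter are hypothetical --
# the original post used R and the real URL scheme of starwars-union.de.
BASE = "https://www.starwars-union.de/nachrichten/12345/"  # hypothetical article URL

def comment_page_urls(n_pages):
    """Return the list of comment-page URLs, assuming simple numeric paging."""
    return [f"{BASE}?seite={page}" for page in range(1, n_pages + 1)]

urls = comment_page_urls(3)
```

Each of these URLs would then be fetched and parsed in turn.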
This is the main part for scraping all the comments: I search each HTML file for id="kommentargesamt" and extract the comments, which are saved in the variable comments.
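The R scraping code itself is not shown in the post. The core idea, collecting all text inside the element with id="kommentargesamt", can be sketched with Python's standard library (the sample HTML below is made up for illustration):

```python
from html.parser import HTMLParser

class CommentExtractor(HTMLParser):
    """Collect all text found inside the element with id="kommentargesamt"."""

    def __init__(self):
        super().__init__()
        self.depth = 0    # nesting depth while inside the target element
        self.chunks = []  # extracted comment text fragments

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.depth += 1                       # nested tag inside the target
        elif dict(attrs).get("id") == "kommentargesamt":
            self.depth = 1                        # entered the target element

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1                       # leaving a nested tag / the target

    def handle_data(self, data):
        if self.depth and data.strip():
            self.chunks.append(data.strip())

# Made-up sample page standing in for one fetched comment page:
html = '<html><div id="kommentargesamt"><p>Toller Film!</p><p>Naja.</p></div></html>'
parser = CommentExtractor()
parser.feed(html)
comments = parser.chunks
```

In practice the fetched HTML of every comment page would be fed through the parser and the results concatenated into one list.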
Now everything is prepared for creating the wordcloud. For that purpose I used the following snippet, which I once found on the internet. There are many examples of creating a wordcloud with R, and I decided to use this one:
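The R snippet itself is not reproduced here (such examples typically build a term-frequency table and pass it to a wordcloud renderer). The core step, counting word frequencies across all comments, might look like this in Python; the tokenization rule and minimum word length are my own assumptions:

```python
import re
from collections import Counter

def word_frequencies(comments, min_len=3):
    """Lowercase, tokenize (incl. German umlauts), drop short words, count."""
    words = []
    for text in comments:
        tokens = re.findall(r"[a-zäöüß]+", text.lower())  # assumed tokenization
        words += [w for w in tokens if len(w) >= min_len]
    return Counter(words)

freq = word_frequencies(["Der letzte Jedi war toll", "Toll war der Film nicht"])
# freq maps each word to its count; a wordcloud renderer would size words by it
```

The resulting frequency table is exactly what wordcloud tools consume: the more often a word appears, the larger it is drawn.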
To get a nice graphical output, I recommend saving the wordcloud directly to a file rather than exporting it from the RStudio viewer or similar.
And that's it!! I think most of the words are comprehensible even for non-German readers ;-)