youtube comments scrape r

The solution for “youtube comments scrape r” can be found here. The following code will assist you in solving the problem.

devtools::install_github("ropensci/RSelenium") # install from GitHub

library(RSelenium) # remoteDriver(), phantom()
library(rvest)     # read_html(), html_nodes(), html_text(), %>%

pJS <- phantom(pjs_cmd = "PATH TO phantomjs.exe") # path needed as I am using Windows
Sys.sleep(5) # give the binary a moment to start
remDr <- remoteDriver(browserName = 'phantomjs')
remDr$open()
remDr$navigate("") # URL of the target video (left blank in the original)
remDr$getTitle()[[1]]
# [1] "YouTube"

# scroll down so more comments get loaded
for(i in 1:5){
  remDr$executeScript(paste("scroll(0,", i * 10000, ");"))
  Sys.sleep(3)
}

# get the page source and parse it via rvest
page_source <- remDr$getPageSource()
doc    <- read_html(page_source[[1]]) # html() is deprecated; use read_html()
author <- doc %>% html_nodes(".user-name") %>% html_text()
text   <- doc %>% html_nodes(".comment-text-content") %>% html_text()

# combine the data in a data.frame
dat <- data.frame(author = author, text = text)

Result:

> head(dat)
author text
1 Kikyo bunny simpie Omg I love fluffy puff she’s so adorable when she was dancing on a rainbow it’s so cute!!!
2 Tatjana Celinska Ciao 0
3 Yvette Austin GET OUT OF MY HEAD!!!!
4 Susan II Watch narhwals
5 Greg Ginger who in the entire fandom never watched this, should be ashamed,\n\nPFFFTT!!!
6 Arnav Sinha LOL what the hell is this?
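The rvest parsing step can be tried offline, without PhantomJS or a live page. The sketch below runs the same selectors against a small hypothetical HTML snippet; the snippet and the names in it are made up for illustration, while the CSS classes ".user-name" and ".comment-text-content" are the ones used in the code above.

```r
library(rvest) # read_html(), html_nodes(), html_text(), %>%

# A stand-in for remDr$getPageSource()[[1]]: a minimal, invented HTML fragment
page_source <- '<div>
  <div class="comment">
    <span class="user-name">Alice</span>
    <p class="comment-text-content">First comment</p>
  </div>
  <div class="comment">
    <span class="user-name">Bob</span>
    <p class="comment-text-content">Second comment</p>
  </div>
</div>'

doc    <- read_html(page_source)
author <- doc %>% html_nodes(".user-name") %>% html_text()
text   <- doc %>% html_nodes(".comment-text-content") %>% html_text()

# same combining step as above
dat <- data.frame(author = author, text = text, stringsAsFactors = FALSE)
print(dat)
```

If the selectors match nothing on the real page (YouTube's markup changes over time), both vectors come back empty, so checking length(author) before building the data.frame is a sensible sanity check.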

Thank you for using DeclareCode; We hope you were able to resolve the issue.

