When we start a blog or website, we want to reach as many people as possible, as quickly as possible. There are several ways to do that, such as SEO and social media marketing, and once visitors arrive, everything comes down to how good your content is. But the first step comes first: we want traffic. One way to show Google that the site gets plenty of visitors is to simulate that traffic, and we can build a bot to do exactly that.
First, you will need the following libraries:
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
import time
import pandas as pd
import random
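If you do not have these packages installed yet, they can usually be added with pip, for example: pip install selenium webdriver-manager pandas (these are the package names as published on PyPI).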
Then you will need the following code to do the crawling:
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
import time
import pandas as pd
import random

# Start Chrome (Selenium 4 syntax; older versions used executable_path instead of Service)
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
driver.maximize_window()

# Read the list of URLs to visit from links.txt
col_name = ['url']
df_url = pd.read_csv("links.txt", names=col_name)

while True:
    # Pick a random URL from the list and open it
    website_random_URL = random.sample(df_url.url.to_list(), 1)
    driver.get(website_random_URL[0])
    time.sleep(5)

    # Total height of the page, used to know when we have scrolled to the bottom
    height = int(driver.execute_script("return document.documentElement.scrollHeight"))

    while True:
        # Scroll down slowly, 10 pixels at a time
        driver.execute_script('window.scrollBy(0,10)')
        time.sleep(0.10)
        totalScrolledHeight = driver.execute_script("return window.pageYOffset + window.innerHeight")
        # >= rather than == in case the scroll position does not land exactly on the page height
        if totalScrolledHeight >= height:
            # Scroll back to the top, return to the first tab, and move on to the next URL
            driver.find_element(By.TAG_NAME, 'body').send_keys(Keys.CONTROL + Keys.HOME)
            driver.switch_to.window(driver.window_handles[0])
            break

    print('Traffic reached')
Create a links.txt file in the same directory and put in it the URLs of your website that you want to be crawled, one per line. When you run the script, a browser window will open automatically and do the traffic job for you, visiting random pages from the list and scrolling through them.
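For example, a links.txt file might look like this (the URLs below are only placeholders; replace them with your own pages):

https://www.example.com/
https://www.example.com/blog/first-post
https://www.example.com/blog/second-post

Since the script runs in an endless loop, picking a new random URL every time it finishes a page, you can simply stop it with Ctrl+C in the terminal when you are done.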