This bot is split into two scenarios, as that is the workflow Browse.ai requires. Make sure you import both the Web Scraper Bot 1 and Web Scraper Bot 2 scenarios.
2. Scenario 2
3. Go to Browse.ai, create a free account, and log in.
4. Once you are logged in, click Dashboard.
5. Click Build New Robot.
6. Click Extract Structured Data…
7. Go to the Origin URL field and type: www.coinmarketcap.com
8. A prebuilt robot will appear at the bottom; click it.
9. Select the number of tokens you want to scrape. The robot extracts data starting from the #1 token and works down the list until it reaches the number you specified.
10. Click Next.
11. Click Start Extracting.
12. This is your Bot Dashboard, where you can see all of your bot's information.
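As a side note, the robot you just built can also be run programmatically through Browse.ai's REST API. The sketch below builds such a request; the API key, robot ID, and the `coins_limit` input-parameter name are placeholders, not values from this guide — check your robot's integration settings for the real ones.

```python
import json

# Hedged sketch: run a Browse.ai robot once via its REST API.
# API key, robot ID, and the input-parameter name are placeholders.
API_BASE = "https://api.browse.ai/v2"

def build_run_task_request(api_key: str, robot_id: str, coins_limit: int):
    """Build the URL, headers, and JSON body for running the robot once."""
    url = f"{API_BASE}/robots/{robot_id}/tasks"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    # "coins_limit" is an assumed parameter name; yours may differ.
    body = json.dumps({"inputParameters": {"coins_limit": coins_limit}})
    return url, headers, body

url, headers, body = build_run_task_request("YOUR_API_KEY", "YOUR_ROBOT_ID", 50)
```

You would then POST `body` to `url` with any HTTP client; the dashboard steps above achieve the same thing without code.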
13. Go to Make and open the Web Scraper Bot 1 scenario.
14. Click the Browse AI module, then click "Create a connection".
15. Name your Browse.ai connection.
16. Click Save.
17. Choose the robot you just created on Browse.ai. If you didn't name it, it will be called "Extract the Coinmarketcap Coins List & prices". NOTE: make sure the "Map" button is toggled off so you get the dropdown menu to choose your bot.
18. Select the number of tokens you want to scrape. The robot extracts data starting from the #1 token and works down the list until it reaches the number you specified. NOTE: this overrides the number you chose when creating the bot.
19. Click OK.
20. Go to the Web Scraper Bot 2 scenario.
21. We need to create a webhook to connect Scenario 1 with Scenario 2. On the Browse AI module, click "Create a webhook".
22. Click Add.
23. Click Save. This automatically connects your Browse.ai account to Make, and you will be able to create a webhook by choosing which bot to connect to.
24. Click the highlighted field.
25. Select the name of the bot you created on Browse.ai.
26. In the "Event" dropdown menu, select "Task finished successfully".
27. Your configuration should look like this.
28. Click Save.
29. Click OK.
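When the "Task finished successfully" event fires, the webhook delivers a JSON payload containing the robot's scraped results, which Scenario 2 then processes. The field names in this sketch ("capturedLists", "Coins list") are assumptions modeled on Browse.ai task results — inspect a real payload in Make's execution history to confirm the exact shape.

```python
# Hedged sketch: pull scraped coin rows out of a task-finished
# webhook payload. Field names are assumptions, not confirmed.

def extract_coin_rows(payload: dict) -> list:
    """Return the first captured list of rows from a webhook payload."""
    lists = payload.get("task", {}).get("capturedLists", {})
    for _name, rows in lists.items():
        return rows  # first captured list, whatever the robot named it
    return []

# Illustrative payload shaped like a successful task event:
sample = {
    "event": "task.finishedSuccessfully",
    "task": {
        "capturedLists": {
            "Coins list": [
                {"name": "Bitcoin", "price": "$67,000"},
                {"name": "Ethereum", "price": "$3,500"},
            ]
        }
    },
}
rows = extract_coin_rows(sample)
```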
30. You will need a Google spreadsheet to store all the data you scrape. Here is a prebuilt spreadsheet with all the fields already linked to our scenario: https://docs.google.com/spreadsheets/d/1Z0zoYUrLMOcP7nJYuP9GuxhJIKhnzcVgAYO_GK_RAsM/edit?usp=sharing
31. Once you open the provided spreadsheet, make a copy of it so you can edit it: click File, then "Make a copy".
32. Name your new spreadsheet.
33. Click Make a copy.
34. You will need to connect your Google Sheets account to Make.com. Click the first Google Sheets module.
35. Click Add.
36. Click Sign in with Google.
37. Choose your Google account from the list and click Allow to connect it to Make.com.
38. On the first Google Sheets module, choose your spreadsheet: click the "Spreadsheet ID" field and a search bar will appear; search for your newly created spreadsheet.
39. Click OK.
40. Connect your second Google Sheets module.
41. Click the connection dropdown menu and choose your account.
42. Choose your spreadsheet: click the "Spreadsheet ID" field, then use the search bar to find your newly created spreadsheet. (This is the same spreadsheet you chose in the previous module.)
43. Click OK.
44. Make sure your Web Scraper Bot 2 automation is set to "Immediately as data arrives", and turn it on. This way it only runs when the first automation sends data.
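Under the hood, the Google Sheets module appends one row of cell values per scraped coin. The sketch below shows that flattening step in the shape the Sheets API's `spreadsheets.values.append` call expects; the column order is an assumption standing in for the prebuilt spreadsheet's real columns.

```python
# Hedged sketch: flatten scraped rows into a values.append request body.
# The column order is assumed, not taken from the prebuilt spreadsheet.
COLUMNS = ["name", "price"]

def to_append_body(rows: list) -> dict:
    """Convert scraped row dicts into a Sheets API values payload."""
    return {"values": [[row.get(col, "") for col in COLUMNS] for row in rows]}

body = to_append_body([{"name": "Bitcoin", "price": "$67,000"}])
```

Make handles this mapping for you; the sketch only illustrates what "adding a row" amounts to.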
45. Go to your Web Scraper Bot 1 automation and select how frequently you want it to run (we suggest every 24 hours).
46. Turn on your automation.
47. Your automation is ready to run!
Well done!