What is meant by “front-end”? And why would I want to trigger something based on something changing on the front-end?

Unraveling the World of "Front-End" Automation


Introduction

In the dynamic realm of web development, terms like "front-end" often surface in discussions about user interfaces and interactions. But what exactly does "front-end" entail? Let's dive into what the front-end is and why it matters in automation processes.


Understanding Front-End Automation

When we refer to the front-end of a website, we are essentially highlighting the visible elements that users interact with. From the layout to the buttons and content display, the front-end plays a pivotal role in shaping the user experience.


Scraping and Logic in Front-End Operations

One key aspect of front-end operations is scraping data to extract valuable information. By employing filtering and comparison logic, developers can make informed decisions based on the content displayed on a webpage.
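As a sketch of that scrape-then-filter decision, here is a minimal Python example. The `scrape_button` helper is hypothetical: it stands in for a real browser scrape that returns an element's text, or an empty string when the element is not on the page.

```python
def scrape_button(page_state):
    """Stand-in for a real scrape: return the button text if it is on the page."""
    return "Load more" if page_state.get("has_load_more") else ""

def should_click_again(page_state):
    """Filter step: a non-empty scrape result means the button is still there."""
    return scrape_button(page_state) != ""

print(should_click_again({"has_load_more": True}))   # True: keep clicking
print(should_click_again({"has_load_more": False}))  # False: end of the list
```

The filter never looks at the page directly; it only compares the scraped value against "empty", which is exactly the check used later in the demo.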


Practical Application: Automation Demo

Let's explore a practical example of front-end automation using the Cloudways website. By creating a step-by-step process involving scrolling, scraping, and filtering, we can efficiently navigate through web pages and extract the relevant data.


Enhancing Efficiency with Pagination Handling

Handling pagination on websites like Cloudways requires a systematic approach. By recording browser actions, scrolling, clicking load more buttons, and applying filters, we can streamline the process of gathering comprehensive data sets.
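The pagination pattern described above can be sketched as a loop. `FakePage` here is a made-up stand-in for a real browser session, assuming only that we can scroll, check whether the Load more button is visible, and click it.

```python
class FakePage:
    """Fake browser page: each 'Load more' click reveals one batch of results."""
    def __init__(self, batches):
        self.batches = batches      # lists of results revealed per click
        self.results = []

    def scroll_down(self):
        pass                        # a real automation would scroll the page here

    def load_more_visible(self):
        return bool(self.batches)   # button disappears when nothing is left

    def click_load_more(self):
        self.results.extend(self.batches.pop(0))

def collect_all(page):
    """Scroll, then keep clicking 'Load more' until it disappears."""
    page.scroll_down()
    while page.load_more_visible():  # filter: is the button still on the page?
        page.click_load_more()
        page.scroll_down()
    return page.results

page = FakePage([["a", "b"], ["c"], ["d", "e"]])
print(collect_all(page))  # ['a', 'b', 'c', 'd', 'e']
```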


Optimizing Automation Flows

In the automation workflow, it's crucial to establish a logical sequence of steps. By incorporating scrolling, clicking, scraping, and filtering actions strategically, developers can ensure smooth and effective automation processes.


Concluding Thoughts

Front-end automation offers a robust solution for streamlining tasks and gathering essential data from websites. By harnessing scraping techniques, comparison logic, and automation tools, developers can enhance efficiency and productivity in web development projects.

Video



Steps

Step 1- What is displayed on the website is known as the front end

Notion image
 

Step 2- Click on New Automation

Notion image
 

Step 3- Click on Web

Notion image
 

Step 4- Click on Guided Templates

Notion image
 

Step 5- Paste the link in the URL column and click on Save go to URL to open the website in the browser window

Notion image
 

Step 6- Click on the Scroll down step to record a scroll down on the web page

Notion image
 

Step 7- Click on Confirm to record the Scroll down step of the website

Notion image
 

Step 8- Click on Scrape single step to record the scrape step

Notion image
 

Step 9- Click on the Load more button, then click on Confirm

Notion image
 

Step 10- Now click on the Click step to record it

Notion image
 

Step 11- Click on the Load more button, then click on Confirm

Notion image
 

Step 12- Click on Filter and fill in all the required fields

Notion image
 

Step 13- Now add a Scrape a list step

Notion image
 

Step 14- Select all the items and click on Confirm

Notion image
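Putting the recorded steps together, the whole flow reduces to one loop. The sketch below maps each part back to the steps above; `DemoPage` and its method names are illustrative stand-ins, not real automation-tool API calls.

```python
def run_flow(page):
    page.goto_url()                  # Step 5: open the website
    page.scroll_down()               # Steps 6-7: scroll down
    while page.scrape_load_more():   # Steps 8 and 12: scrape + filter on "not empty"
        page.click_load_more()       # Steps 9-11: click Load more and confirm
        page.scroll_down()
    return page.scrape_list()        # Steps 13-14: scrape the full list

class DemoPage:
    """Fake page where the Load more button survives a fixed number of clicks."""
    def __init__(self, clicks_left):
        self.clicks_left = clicks_left
        self.items = []

    def goto_url(self):
        pass

    def scroll_down(self):
        pass

    def scrape_load_more(self):
        return "Load more" if self.clicks_left > 0 else ""

    def click_load_more(self):
        self.items.append(f"batch-{self.clicks_left}")
        self.clicks_left -= 1

    def scrape_list(self):
        return self.items

print(run_flow(DemoPage(3)))  # ['batch-3', 'batch-2', 'batch-1']
```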
 

VIDEO TRANSCRIPT

When we say front end, we're talking about what is displayed on the website, what we're able to see here. So when we want to make a decision based on something on the front end, that usually involves scraping something and then using a filter or some other sort of comparison logic to make a decision on the page.

An example of that is scraping this Cloudways page. We can handle scrolling down and scraping to check if there's a Load more button. When we click this Load more button, you'll see that new results are loaded. And if we scroll down more, the Load more button is there again. After we click enough times, this Load more button is going to stop showing because we got to the end of the list.

So we may want to adjust our flow to then start scraping everything. I'm adding a video of an automation after this that is a perfect example: we scrape this and use a filter to check if it's still on the page. That's similar to how you might check for new messages on LinkedIn or Instagram: scraping something and then comparing it to what it last was to decide which steps you should run.
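That "compare it to what it last was" idea boils down to diffing the new scrape against the stored one. A minimal sketch, with made-up message names standing in for scraped results:

```python
def new_items(previous, current):
    """Return items in the current scrape that weren't in the last one."""
    seen = set(previous)
    return [item for item in current if item not in seen]

last_scrape = ["msg-1", "msg-2"]
this_scrape = ["msg-1", "msg-2", "msg-3"]
print(new_items(last_scrape, this_scrape))  # ['msg-3']
```

A non-empty result means something changed since the last run, which is the signal to run the follow-up steps.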

So to handle pagination on the Cloudways website, we can launch a new browser recording, and going to that page is our first step.

So I'll paste that here, and then we can go to that page. Then what we're going to want to do is record a scroll down step so that we can scroll down to look for this Load more button. Then we're going to scrape it and click it. The reason that helps us is we can apply a filter to decide if we still need to click this button by checking if it's still on the page.

So the first step is going to be this scroll down step. I'll record this, and we see that blue border on the left telling us that it was highlighted. Next, we can record a scrape step of the Load more button, and then we will confirm this. And then we can add a click step of this Load more button. The reason we're doing this is we are going to add a filter step so that we can decide on resetting part of this logic in case there's more to click, because we want to continue scrolling. And there's quite a few to click here.

Next, we are going to add a filter. I think I mixed up some of the logic here. Actually, we're going to want to scroll down, then click, then scrape. Then we're going to add a filter where, if step four is not empty, that means we need to click it again. Then let's go back up to the scroll down step or the click step.

Uh, we should actually probably switch the click and the scroll down steps; otherwise, continue. We'll go back through this in a second. Okay. So we're going to go to Cloudways. Then we're going to click Load more. Then we're going to scroll a little bit with all the new results, and try to scrape the button; if the button exists, the scrape is not empty.

Then we want to go back to step three. Nope, we want to go back to step two.

And perfect. Okay. So this is what we want this to do: we go to Cloudways, we click Load more, and scroll down a little bit. Then we scrape this button to check if it's still on the page. If it is, we go all the way back up to step two, which clicks it and repeats that scroll down process. At the very end of this, we're going to add a Scrape a list step so that we can grab all of these results.

So I'll select the first name and then the second name, which got everything. And then we can add a new column and get the descriptions, and that is all for this automation.
