Scraping Skool Members to Find Community Creators With Paid Content and Users
Leveraging Automation to Identify Valuable Community Creators
In today's digital age, online communities serve as hubs of creativity and collaboration, housing a diverse range of creators with valuable content and engaged users. But how can we efficiently sift through these communities to find creators with paid content and active followers? The answer lies in automation.
Introduction:
This walkthrough lays out a systematic approach to scraping online community members to identify creators with paid content and engaged users. By leveraging automation tools, we can streamline the process of gathering valuable data from profiles and make informed decisions based on the insights gathered.
Scraping Online Community Members:
The first step in this process involves scraping online community members' profiles to extract crucial information such as communities created, follower count, and other relevant data. Automation is used to navigate through profiles and gather detailed insights for analysis.
Setting Up Automation Triggers:
To ensure a comprehensive analysis, triggers are set up to handle pagination and looping over profiles. This allows for the iteration through different profiles to identify suitable creators and gather essential data for decision-making.
Iterative Analysis for Decision-Making:
The walkthrough emphasizes the importance of iterative analysis when identifying valuable community creators. By repeatedly scraping profiles and analyzing the data, we can make informed decisions about which creators to engage with based on their content quality and social media presence.
Leveraging Scraped Data for Insights:
Once the data is scraped and analyzed, it can be leveraged to identify active community members with valuable content and social media influence. This data can be used to make strategic decisions on collaborations and content partnerships within the community.
Conclusion:
In conclusion, the process of scraping online community members for valuable creators showcases the power of automation in optimizing decision-making and content creation strategies. By leveraging automation tools effectively, we can identify and engage with creators who have the potential to drive community growth and engagement.
Steps
Step 1 - Copy the community link from the browser's address bar
Step 2 - Click New automation to start a new automation
Step 3 - Click Web
Step 4 - Click Continue with cookies
Step 5 - Click Scrape a list
Step 6 - Select the text to scrape and click Confirm
Step 7 - Click Scroll down
Step 8 - Select the profile name and adjust the Scroll down wait time
Step 9 - Click the Click step to record a click on the next-page button
Step 10 - Click Trigger
Step 11 - Click Web List
Step 12 - Set the number of loops you want and click Continue
Step 13 - Drag the arrow to set which steps the loop repeats (sketched in code after this list)
Step 14 - Click Rename automation, edit the automation name, and click Save
Step 15 - Click New automation again
Step 16 - Click Web and continue with cookies, then repeat the Scrape a list steps as before
Step 17 - Paste the Google Sheets link and select the sheet from the drop-down
Step 18 - Check the details and click Looks good
Step 19 - Set the loop rows for this automation again
Step 20 - Select the profile URL variable again and click Save
Step 21 - Click Download, then click Save
Step 22 - Click Upload and click Import data
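For readers who want to see the same flow in code, here is a minimal sketch of Steps 1 through 13 in Python using Playwright. The members-page URL and the CSS selectors are placeholder assumptions; inspect the actual Skool page to find the real ones.

```python
# Hypothetical sketch of the first automation: scroll, scrape names and
# profile links, click Next, and repeat for a fixed number of pages.
from playwright.sync_api import sync_playwright

MEMBERS_URL = "https://www.skool.com/<community>"  # placeholder: paste the real member-list URL
PAGES = 30  # the loop count chosen in Step 12

def scrape_members():
    rows = []
    with sync_playwright() as pw:
        browser = pw.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(MEMBERS_URL)  # Step 1: the copied community URL
        for _ in range(PAGES):
            # Steps 7-8: scroll so lazily loaded members render (about 5 s)
            page.mouse.wheel(0, 5000)
            page.wait_for_timeout(5000)
            # Steps 5-6: scrape each member's name and profile link
            for card in page.query_selector_all(".member-card a"):  # placeholder selector
                rows.append((card.inner_text(), card.get_attribute("href")))
            # Step 9: the recorded click on the next-page button
            page.click("button[aria-label='Next']")  # placeholder selector
        browser.close()
    return rows
```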
VIDEO TRANSCRIPT
Okay, so we're scraping the online members from Skool right here. I think this is what we were talking about: we were planning to grab all of the members here, and then what we could also do is go to their profile and scrape anything we see, communities they've created, how many followers they have, whatever else here is useful for
determining if they're a good fit. So to build this automation, I'm going to grab this URL right now, because this is the member group that we're scraping, and I will start a new automation. So I'll go here, new automation, web. We don't need to use cookies in this use case, because we don't have to log in to see the members of this community since it's public, I believe.
So the first URL is going to be the skool.com/community one I was pasting. Then we're going to have a Scrape a list step, and there are a couple of different ways we can go about this. We can scrape a list by grabbing everybody's name like this, which highlights everybody on the page.
Then we can duplicate this column and grab a link to their profile like this. What this allows us to do is revisit this URL in another automation. For example, this user's profile is something we could use in a go-to-page step. So one thing we can do here is just this, which would go ahead and scrape everybody from this page.
We might need to add a scroll down step and then pagination. To do that, we can add a scroll down step, and we'll just scroll and see this blue border get highlighted. I'll confirm that; 10 seconds is probably plenty, we could probably do like five seconds. Then we'll record a click step to click on the next page button,
and that is going to be it. The way this is going to handle pagination is that we're going to use the list trigger, which I'll show you in a second. It's going to go to the Skool community page, scrape a list of the names and then the links, scroll down, and then click next. And then we're going to have another automation.
That one goes to every URL and performs some logic, checking whether they have a community, things like that. So I'll click I'm done here. Then for our trigger, we're going to set this up to be a list trigger, and we want to loop, I think for this Skool community it was going to be
probably 30 times, so let's go here and enter 30. Then we're going to drag this slider so that it's at the end, and I'll explain why in a second. What the list trigger does is allow you to run some steps and then repeat a certain section of those steps a certain number of times.
So we go to the Skool community, we scroll down first, then we scrape a list of stuff, then we click the next page button, which is step five. Then we restart from step two, which is scraping a list again, and this allows us to iterate through those pages.
So I'll click play steps really quick. I'm going to change this to just two loops so that it's pretty fast, and if I click play steps here, we will see this grab a link to everybody's profile.
In a second we'll see Anthony S get highlighted after we scroll down. Actually, we won't see Anthony, I guess, because we're scrolling down, but we'll see all of these names get highlighted, and that's us grabbing both the name and the link for each person. So there's that, and then we're going to be clicking the next page button
and repeating those steps.
So we're repeating that for the next people
and scraping again. And then we'll see that all of this data is being scraped, which we could send to a Google Sheet, or we could download as a CSV; whatever we want to do there is available. So what we'll do next is another automation. Let me quickly rename this and start setting that up.
This is going to be "scrape members". We're going to have another automation that loops over that scraped data to grab the profile information. Okay, sorry, I had to pause there for a second, but the first automation is scraping all of the online members.
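The "download as a CSV" option mentioned a moment ago is easy to mirror in code. A small sketch, assuming the (name, profile_url) rows returned by the hypothetical scrape_members() above:

```python
import csv

def save_rows(rows, path="data.csv"):
    # Write the scraped (name, profile_url) pairs with a header row.
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "profile_url"])
        writer.writerows(rows)
```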
Next we're going to set up another automation that is going to scrape the information from each person's profile. For example, I'm going to scrape the group here and then the social media accounts down here, just for this automation in particular. So the first step is starting a new automation, a web automation, and then again without cookies.
And then the first URL: this is going to come from the Google Sheet that the other automation pushes data to. That will happen after the fact; while we record this, we need to enter one ourselves, which is going to be that one profile I found that I'm recording with here. So the first scraping step is going to be adding a scrape step for this group.
We want to make sure that they have a group that they created, and then we're also going to get whether it's private and how many members it has.
Then let's get their information: their name, their username, and then their social links, the Instagram link from the profile, the Twitter link, YouTube, and LinkedIn. I'll rename these columns after.
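In code terms, the profile scrape might look like the sketch below. It reuses a Playwright page as in the earlier sketch; every selector and field name here is an assumption, since Skool's real markup will differ.

```python
from playwright.sync_api import Page

def scrape_profile(page: Page, profile_url: str) -> dict:
    # Visit the profile and pull the group and social fields described above.
    page.goto(profile_url)

    def text(selector: str) -> str:
        el = page.query_selector(selector)
        return el.inner_text() if el else ""  # empty string when the element is missing

    socials = [a.get_attribute("href")
               for a in page.query_selector_all(".social-links a")]  # placeholder selector
    return {
        "group": text(".group-name"),        # the community they created
        "privacy": text(".group-privacy"),   # e.g. "Private"
        "members": text(".group-members"),   # member count
        "name": text(".profile-name"),
        "username": text(".profile-username"),
        "socials": ", ".join(s for s in socials if s),  # Instagram/Twitter/YouTube/LinkedIn
    }
```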
And that is it. So that's it for the scraping steps. This gets everything from this page. Now, same thing as before, we're setting up another list trigger, this time using our Google Sheet. So I'll click in here and then I'll select a list and I'm going to enter a Google Sheet that the other automation is pushing data to.
So here is the Google Sheet that we are sending data to. Our other automation, scrape members, is going to be putting all of that data here, where we have the name and the profile URL. Then we're going to have another sheet, and this is where we'll export all of the other data we're scraping.
So I'm just going to enter the column headers: community, community info,
name, username, Instagram, and so on for all the other things we are scraping. But I won't set the rest of this up right now. So we need to enter this as the list that we're using: I'll click set up sheet connection, enter this, and select the worksheet that we want to pull usernames in from.
Since we want to loop over this profile URL column, we can click looks good, and now we can set up the rest. This is the same as before, where we set up how many times we want to loop: before it was pagination, and in this case it's how many users we want to run this for at once. Just to break this up into batches, you probably want to do something like a hundred or fifty, whatever it's going to be.
I'll just do five in this case, because I'm only building this out to test it. Then we're saying the automation should start at row two, because when all of this data gets entered in the Google Sheet, it starts at row two and goes onwards. That's it for setting up our trigger.
We can just click continue, and now we can enter our variable and set up the list slider up here. I'll dismiss this really quick. Then, instead of always going to the same page, we want to edit this URL to be the one from the Google Sheet, so I'll select the profile URL variable instead and save that.
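The Google-Sheet-backed list trigger can be approximated with gspread. A sketch, where the sheet URL, worksheet name, and column layout (name in column A, profile URL in column B) are assumptions based on the sheet shown in the video:

```python
import gspread

def profile_urls(batch_size=5, start_row=2):
    # Authenticate with a Google service account (credentials file required).
    gc = gspread.service_account()
    sheet = gc.open_by_url("https://docs.google.com/spreadsheets/d/<id>")  # placeholder URL
    ws = sheet.worksheet("Members")  # assumed worksheet name
    rows = ws.get_all_values()[start_row - 1:]  # start at row 2, skipping the header
    return [row[1] for row in rows[:batch_size] if len(row) > 1]  # column B holds the profile URL
```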
As far as the list itself, we want it to repeat every step of this automation: we go to that profile URL, then we scrape the group name, the information of that group, the user's name and username, and then the Instagram and other social profile information they have here.
And that is it. It's going to push all of this data to our Google Sheet once we set that up through send to Google Sheets, or we can download it as a CSV at the end. So I'll just quickly rename this to "get member info from Skool page", and then we can run both of these automations so we can see that the whole thing works.
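Putting the second automation together, a driver loop under the same assumptions might look like this: fetch a batch of profile URLs from the input sheet, scrape each profile with the hypothetical scrape_profile() above, and append the results to an output worksheet.

```python
import gspread
from playwright.sync_api import sync_playwright

def get_member_info(batch_size=5):
    # Output worksheet for the scraped profile data (names assumed).
    gc = gspread.service_account()
    out = gc.open_by_url("https://docs.google.com/spreadsheets/d/<id>").worksheet("Community info")
    with sync_playwright() as pw:
        browser = pw.chromium.launch(headless=True)
        page = browser.new_page()
        for url in profile_urls(batch_size=batch_size):  # from the gspread sketch above
            info = scrape_profile(page, url)             # from the profile sketch above
            out.append_row(list(info.values()))          # the "send to Google Sheets" step
        browser.close()
```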
So I'll go back to my build workspace, where I have that automation somewhere in here: scrape online members. I'll click play steps here. What this is going to do is put all of our data in this data out section, which I can then import into the Google Sheet; that's only because I haven't set up the send to Google Sheets step for this automation.
We could definitely do that if we wanted to.
So this is scrolling down the page, then scraping the username and the profile link, and then clicking on the next page.
All right, I think we have this first one set to go over only two pages instead of the 30 we had set there, but we'll see that this is filling out the data out section with the names and the profile URLs for those people.
So that's it for the automation. I'm going to go ahead and download this data so that we can import it into our Google Sheet. Again, we could just set up send to Google Sheets; I just didn't do that in this case. We'll go to File > Import, and then I'll upload that data.csv. Just to make this really simple for getting started, all of this can be automated through the send to Google Sheets piece, and then we'll import this.
And I think that is going to be it. It looks like there are a couple of people who didn't have names on their profiles, or maybe it was something different with their name. Again, we can rescrape the name from the profile URL here, but that is all we need to run the second automation.
So that's the first one done: we looped over these pages and scraped those results. Next, we're going to go to the other automation we have, which is getting the member info. What this is going to do is loop over each person here. Actually, I'll just set this one up so that it sends to Google Sheets.
We will export all of this information by setting up that send to Google Sheets step, which is just pasting our URL and selecting the sheet to send to. And now we're going to loop over every single URL here, scrape that info, and put it in this spreadsheet. So I'll click play steps here and let this run for just a couple of usernames.
So this person doesn't have any groups, so we won't scrape that group information here. We'll just see those steps marked as errors because the element wasn't found, although the run keeps going. We were able to find their name and so on, just no group, and in our scenario we'll be able to set up a filter for that to make sure they don't meet our criteria.
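The same "error but keep running" behavior falls out of the sketch naturally: scrape_profile() above returns an empty string when an element is missing, so filtering out the people without a group is a single predicate (the name here is illustrative).

```python
def has_created_group(info: dict) -> bool:
    # Keep only members who actually created a community; the privacy and
    # member-count fields can feed a later "paid content" filter.
    return bool(info["group"])
```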
And this is why we can run this automation a bunch of times back to back: these people may go on and off in a matter of minutes, so if we scrape constantly, or every 10 minutes or so, we can eventually get the list of all of the people that are currently online.
Or rather, all of the people that are active in that community, so that we can find the best creators on the platform.
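Since the online-members list churns minute to minute, repeated runs gradually build the full picture. A sketch of merging two runs, de-duplicating by profile URL:

```python
def merge_runs(previous, new_rows):
    # Combine two scrape runs of (name, profile_url) pairs, keeping each
    # profile URL only once so repeated runs accumulate unique members.
    seen = {url for _, url in previous}
    return previous + [(name, url) for name, url in new_rows if url not in seen]
```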
So there's us moving on to the next page
and scraping the information. We should be getting the community there. So when we went there, we got the group name, private, and free, and that's just what allows us to build out all of our logic for only finding people with courses. So again, there are a couple of different things we might do with that. We might run this more frequently so that we're always getting those online members, but as far as scraping the information here, it looks like we'll be able to get some really cool stuff and make some decisions based on their profile: getting all that social media stuff, getting their communities, and things like that. We'll see all of this being put in this data out section.
Again, some of the people's information is empty; the social media fields seem to be pretty empty for people, but besides that, we're seeing all of that get scraped here. And depending on our logic, maybe we're going to be scraping a bunch of different communities, if we want to go wider than just this one Skool community's members.
But yeah, there's a lot of flexibility we can add to this as well. And it looks like this is done, so we should see all of that in this Google Sheet. It looks like nobody here had their usernames or other social media accounts connected, but we can obviously check for that a little further.
And that's definitely something we can tweak if we find a different way people are listing that information.