Sourcing is like an Onion… you must peel it back



We live in a world of automation. We want to speed things up, or at least we do when mundane tasks are wasting our time. It comes down to two things: FRUSTRATE or AUTOMATE. FRUSTRATE: sitting in traffic at 3 MPH, watching squirrels walk faster than you. AUTOMATE: adding a 360-degree jet-propulsion system to your car and flying over everyone. FRUSTRATE: using a sourcing method that is cool but takes too many steps to be worthwhile. AUTOMATE: cutting out the extra steps, getting straight to the specific information you want, and easily repeating the process.

For instance, it’s no big secret that there are many different coding user groups and exchanges on the internet. If you source or recruit for the world of computer science, you should already be aware of these sites’ potential. As a sourcer, always remember the basics; they are what make techniques like this one work.

Now take a site like Snipplr or Google Code, which everyone knows about. They are sites for developers, with tools, code, discussions, and other technical resources. Whenever you approach ANY site, the first things that should go through your mind are:

  1. What is this site’s purpose?
  2. Who uses it?
  3. What user information is available?
  4. Can I search for users while still focusing on specifics?
  5. Can I speed up my search process while still maintaining the integrity of my search?
  6. Can I search this site from another source?

One quick look through the site and you can understand its purpose. You can view individual code projects, the contributors, discussions, code downloads, and other reference tools. More importantly, the projects are defined by their function. If I needed to find contributors for an API written in Python, I could search the site’s internal engine: python AND api.

Once I view a project’s summary, I can see its contributors and click through to their individual profiles and emails. Now this is very cool, but here’s where the FRUSTRATE piece comes in: how can you look through targeted users at a quicker pace? The search functionality on Snipplr or Google Code is not really meant for high-volume sourcing.

Instead, go to one of the user pages and investigate. Do some old-fashioned URL detective work and you’ll notice the address of a user’s page: www.snipplr.com/users/chrisaiv/tags/python/
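Pull up two or three of these pages and compare the addresses; everything the URLs share is a pattern you can exploit. Here is a tiny sketch of that comparison in Python, with the second URL invented purely for illustration:

from os.path import commonprefix

# The first URL is the real example above; the second is hypothetical.
urls = [
    "http://www.snipplr.com/users/chrisaiv/tags/python/",
    "http://www.snipplr.com/users/janedoe/tags/django/",
]

# The longest shared prefix is the constant piece of every user page.
print(commonprefix(urls))  # http://www.snipplr.com/users/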

After going through a couple of pages, the trend is clear: every user’s URL starts with snipplr.com/users/, so we can use part of that as a constant. In mathematics, science, or anything else… constants are your friend. So instead of dealing with the search functionality within the site, we go to Google and run a site: and inurl: search. The site: operator restricts results to the site itself, and the inurl: operator forces the word “users” to always follow the dot-com address, which limits us to pages with user information. That turns your Google search into:

site:snipplr.com inurl:"com/users" python (api OR wrapper OR "web service" OR restful OR rest)

The site: and inurl: pieces are the constant we will reuse, and the keyword group in parentheses will change depending on the type of experience you are focusing on. Since this is a coding site, you want to get pretty specific with your keywords: programming methods, classes, functional programming, APIs, text manipulation, OOP, and other specific technologies. Here’s that same line of thinking for Google Code:

site:code.google.com inurl:people python (api OR wrapper OR "web service" OR restful OR rest)

Once you run this search, you can easily look through all the profiles of the users who are sharing relevant software.  They will have names, usernames, emails, websites, or all of the above.  Being able to quickly search through sites like these will definitely help with the AUTOMATE piece.
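If you find yourself rebuilding these strings by hand for every site and skill set, a few lines of Python can template them. This is just a sketch: the two site constants are the ones from above, while the skill and synonym lists are placeholders you would swap for your own search.

# Template X-ray searches: pair a site's constant URL pattern with a skill
# keyword and an OR group of synonyms.
SITE_CONSTANTS = {
    "Snipplr": 'site:snipplr.com inurl:"com/users"',
    "Google Code": "site:code.google.com inurl:people",
}

def build_query(constant, skill, synonyms):
    """Glue the constant, the core skill, and an (a OR b OR c) group together."""
    or_group = "(" + " OR ".join(synonyms) + ")"
    return f"{constant} {skill} {or_group}"

if __name__ == "__main__":
    synonyms = ["api", "wrapper", '"web service"', "restful", "rest"]
    for name, constant in SITE_CONSTANTS.items():
        print(f"{name}: {build_query(constant, 'python', synonyms)}")

Keeping the constants in one dictionary means pointing the same search at a new site is a one-line change.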

As a sourcer, you want to be able to do this to ANY site at the drop of a hat. It’s really just easy pickings.

Now there is a further AUTOMATE step that you can take with a search like this. You can run these searches, copy the relevant data you want, and save it into a spreadsheet of your own for analysis. This can be done with free or low-cost tools like Outwit Hub combined with a macro program. My only warning is to do this responsibly. Many websites have safeguards in place for users who are over-zealous with their HTTP requests, and you could conceivably get banned or blocked by doing this. So let’s be responsible and not over-fish our candidate resource pools. We don’t want to ruin it for everyone now!
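Outwit Hub and a macro program are point-and-click, but if you script the collection step yourself, build the throttle in from the start. Here is a minimal sketch, assuming you already have a list of profile URLs copied out of your search results; the requests library, the five-second delay, the output file name, and the email pattern are all illustrative choices, not any particular tool’s method.

import csv
import re
import time

import requests  # third-party: pip install requests

# Hypothetical input: profile URLs pulled from your Google results.
profile_urls = [
    "http://www.snipplr.com/users/chrisaiv/",
    # ...the rest of your list
]

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

with open("candidates.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["profile_url", "emails_found"])
    for url in profile_urls:
        resp = requests.get(url, timeout=10)
        emails = sorted(set(EMAIL_RE.findall(resp.text)))
        writer.writerow([url, "; ".join(emails)])
        time.sleep(5)  # pause between requests so you don't trip the site's defenses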

- Mark Tortorici
Founder & Expert Trainer
Transform Talent Acquisition
