Saturday, August 16, 2014

Intelligent agents or Web crawlers: where is my automation level?


    As test automation engineers we must ask ourselves this question frequently. And it's not only about the new technologies, frameworks, or tools we can master. It's more abstract: how we automate our own workflow. A very good answer would be 'I automate my repeatable tasks.' But that is still not quite enough. Why?

   One example is our daily workflow activities. Take a simple process such as checking your email. If you do it manually, many times a day, and analyse each message yourself - does this ring a bell?

    If yes, then it's time to talk about multi-agent systems and intelligent agents. If we stick to the definitions, we can refer to a more specific kind - the user agent. What they can do for us we can find in their classification:

User agents, or personal agents, are intelligent agents that take action on your behalf. In this category belong those intelligent agents that already perform, or will shortly perform, the following tasks:
  • Check your e-mail, sort it according to the user's order of preference, and alert you when important emails arrive.
  • Assemble customized news reports for you. There are several versions of these, including CNN.
  • Find information for you on the subject of your choice.
  • Fill out forms on the Web automatically for you, storing your information for future reference.
  • Scan Web pages looking for and highlighting text that constitutes the "important" part of the information there.
  • Facilitate online job search duties by scanning known job boards and sending the resume to opportunities that meet the desired criteria.

   So if you have already found your 'guy', let's discuss how to 'build' it and put 'him' to work. In the interest of good design, I divide my agents according to their tasks and capabilities.

   First, let's start with the Mail User Agent (MUA). If your company already uses Outlook, you can set up the MUA to use the Outlook Web Access API and automate this task with any script or tool you know. In my examples, good old PowerShell (as we already know, Microsoft's task automation and configuration management framework) with its headless IE capabilities (which let the agent run as a daemon and notify us only when its preconditions are met) can support all of this.
    And to be more precise, if you have to log in through a firewall's Web Access API, the implementation is the same. So we kill two birds with one stone - every morning you'll be behind the wall and already know your mail content. Here you can see a sample solution.
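To illustrate that first agent, here is a minimal sketch of the idea in Python (the post's own scripts are PowerShell): poll an inbox and alert only when a message matches your criteria. It uses plain IMAP rather than Outlook Web Access, and the server, account, and keyword list are hypothetical placeholders, not taken from the sample solution.

```python
# Minimal mail-user-agent sketch: check unseen mail and surface only the
# "important" messages. Server, credentials, and keywords are assumptions.
import imaplib
import email
from email.header import decode_header

IMPORTANT_KEYWORDS = ("urgent", "build failed", "release")  # assumed criteria

def is_important(subject: str, keywords=IMPORTANT_KEYWORDS) -> bool:
    """Precondition check: alert only when the subject matches our criteria."""
    lowered = subject.lower()
    return any(k in lowered for k in keywords)

def check_inbox(host: str, user: str, password: str) -> list:
    """Return subjects of unseen important messages (one daemon-style pass)."""
    alerts = []
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        conn.select("INBOX")
        _, data = conn.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            raw, enc = decode_header(msg.get("Subject", ""))[0]
            subject = raw.decode(enc or "utf-8") if isinstance(raw, bytes) else raw
            if is_important(subject):
                alerts.append(subject)
    return alerts

# Usage (hypothetical endpoint -- replace with your own):
#   for subject in check_inbox("imap.example.com", "me", "secret"):
#       print("Important mail:", subject)
```

Run it from a scheduled task every morning and it stays silent unless something matches - the same "communicate only if preconditions are true" behaviour described above.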

    Next we can continue with assembling customized news reports and finding information on a subject of your choice (hint: you can combine all the scripts into a single start-up.ps1 and they will read all the news for you). This one searches for all jobs in a Bulgarian web portal according to your predefined criteria and gives you an HTML report. Using this one you can get Google and YouTube search results, again with predefined criteria and an HTML report. And a little extra one that can convert any text file to speech. In case you are asking why: I use it to listen to my lectures when I'm in the subway.
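The report-building step those scripts share can be sketched like this, again in Python rather than the post's PowerShell. The fetch/crawl step is stubbed out - feed it any scraped (title, url) pairs - and the function name and criteria are illustrative, not from the original scripts.

```python
# Sketch of the "customized news report" agent's output stage: filter scraped
# items by keyword criteria and render a small HTML report.
from html import escape

def build_report(items, criteria):
    """Keep (title, url) pairs whose title matches any criterion; emit HTML."""
    matched = [(t, u) for t, u in items
               if any(c.lower() in t.lower() for c in criteria)]
    rows = "\n".join(
        f'<li><a href="{escape(u)}">{escape(t)}</a></li>' for t, u in matched
    )
    return f"<html><body><h1>Daily report</h1><ul>\n{rows}\n</ul></body></html>"

# Hypothetical scraped results; in practice these come from your crawler.
sample = [("QA automation meetup", "http://example.com/1"),
          ("Cooking tips", "http://example.com/2")]
# build_report(sample, criteria=["automation"]) keeps only the first item.
```

Writing the returned string to a file and opening it in a browser gives you the same kind of one-page morning digest the scripts produce.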

    And my personal favorite: online job search duties - scanning known job boards and sending the resume to opportunities that meet the desired criteria. Again, the jobs web portal is Bulgarian, but adapting the logic and reusing it is a trivial task. And as syntactic sugar there is "Apply via LinkedIn" support. But be careful with this little baby, because it can generate an enormous number of application records on your account, and you don't want to get a call asking why you have sent your CV and cover letter 20 times for the same position. The algorithm is smart (but greedy) and applies only to jobs matching your criteria and posted since your last login, but it's still better to monitor it from time to time.
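The guarded selection step described above - greedy, but limited to matching jobs posted since the last login, and never applying twice - can be sketched as follows (Python instead of the post's PowerShell; all field names and data are illustrative, not the portal's actual schema):

```python
# Sketch of the job-board agent's safety logic: pick only new, matching
# postings and remember what was already applied to.
from datetime import datetime

def select_jobs(postings, criteria, last_login, already_applied):
    """Return postings worth applying to, skipping old ones and duplicates."""
    picks = []
    for job in postings:
        if job["id"] in already_applied:
            continue  # avoid the 20-CVs-for-one-position phone call
        if job["posted"] <= last_login:
            continue  # only postings newer than the last run
        if any(c.lower() in job["title"].lower() for c in criteria):
            picks.append(job)
            already_applied.add(job["id"])
    return picks

# Illustrative run: one new matching job, one too old, one off-topic.
applied = set()
postings = [
    {"id": 1, "title": "QA Automation Engineer", "posted": datetime(2014, 8, 16)},
    {"id": 2, "title": "QA Automation Engineer", "posted": datetime(2014, 8, 14)},
    {"id": 3, "title": "Sales Manager", "posted": datetime(2014, 8, 16)},
]
picks = select_jobs(postings, ["automation"], datetime(2014, 8, 15), applied)
```

Persisting the `already_applied` set between runs is what keeps the greedy algorithm from re-applying - exactly the behaviour you still want to spot-check from time to time.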

   So we can use them all - web crawlers or user agents - but in the end they do what they are designed to do: automate our daily lives.
