Thursday 15 March 2012

node.js - Scanning Mongo databases with JavaScript and firing events when strings are not identical


I am trying to work out a JavaScript algorithm that loops through my Mongo DB, finds documents whose URLs are identical, and compares their header objects.

Below is a snippet of an example of this object:

  {
    "url": "www.professionalsupplementcenter.com",
    "date": "Tue Mar 26 2013 15:08:31 GMT-0400 (EDT)",
    "headers": {
      "server": "Microsoft-IIS/7.5",
      "x-aspnet-version": "4.0.30319",
      "x-powered-by": "ASP.NET"
    }
  }

Specifically, my MongoDB database of HTTP header scrapes holds two collections; let's call them scrapesToday and scrapesFuture. Across these two collections I want to compare the header items, namely server, x-aspnet-version, and x-powered-by, and see whether there have been any upgrades in those values, e.g. IIS/7.5 to IIS/8.0 (in the future).

I do not have any code to show, because I do not know how to implement this at all; I do not know where to begin. I want to compare the two URLs first, then, once the program recognizes that both URLs are present, make a specific comparison of the three header fields. By scanning the attributes in order, it can then report when any of them are not the same, firing some event such as console.log("a change has been made").
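The comparison step described above can be sketched as a small pure function. This is only an illustration; the name diffHeaders and its shape are hypothetical, not from the question:

```javascript
// Compare the three header fields of interest between a document from
// today's scrape and its counterpart from the future scrape, returning
// one human-readable message per field whose value changed.
// diffHeaders is a hypothetical helper name, not from the original post.
function diffHeaders(todayDoc, futureDoc) {
  var fields = ['server', 'x-aspnet-version', 'x-powered-by'];
  var changes = [];
  fields.forEach(function (field) {
    if (todayDoc.headers[field] !== futureDoc.headers[field]) {
      changes.push(todayDoc.url + ': ' + field + ' changed from ' +
                   todayDoc.headers[field] + ' to ' + futureDoc.headers[field]);
    }
  });
  return changes;
}
```

With two documents that differ only in the server header, diffHeaders would return a single message describing that change, which you can then log or hand to whatever event mechanism you choose.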

Can someone give me some suggestions on where to start? I have been stuck on this issue for a few days and it is getting frustrating. I would really like to start implementing it, but I need some help.

First you need to install the mongodb npm package (if you have not already). Then in your app.js file:

  var mongodb = require('mongodb');
  var server = new mongodb.Server('127.0.0.1', 27017, {});

  new mongodb.Db('scraperApp', server, { w: 1 }).open(function (err, db) {
    var scrape = new mongodb.Collection(db, 'scrape');
    var scrapeFuture = new mongodb.Collection(db, 'scrapeFuture');

    scrape.find({ url: { $exists: true } }).toArray(function (err, today_docs) {
      if (!today_docs) return;

      // Walk the documents one at a time so the findOne calls run in order.
      var scrapeFn = function (i) {
        var today_doc = today_docs[i];
        scrapeFuture.findOne({ url: today_doc.url }, function (err, future_doc) {
          if (future_doc) {
            if (today_doc.headers.server !== future_doc.headers.server)
              console.log(today_doc.url + ': servers different');
            if (today_doc.headers['x-aspnet-version'] !== future_doc.headers['x-aspnet-version'])
              console.log(today_doc.url + ': x-aspnet-versions different');
            if (today_doc.headers['x-powered-by'] !== future_doc.headers['x-powered-by'])
              console.log(today_doc.url + ': x-powered-bys different');
          }
          // Move on to the next document whether or not a match was found.
          if (today_docs[i + 1]) scrapeFn(i + 1);
        });
      };
      scrapeFn(0);
    });
  });
