MySQLSearch Module


razoor
01-16-2008, 10:16 PM
Would anyone want to make a script that can search through a DB for a release that's on an FTP?
If I'm using a lot of FTPs, it's kind of a pain in the ass to search for something.
Would be great if anyone can/will/wants to/has the time to write such a module.

Like Tur's script for glftpd :) (http://www.grandis.nu/glftpd/modules.php?name=News&file=article&sid=472)

jeza
01-17-2008, 02:19 AM
There already exists a similar script as part of jcS (ReleaseCollector.7z):
http://code.google.com/p/ioftpdscripts/downloads/list
It uses FTP to connect to the remote machine(s) and creates folders on the local machine.
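
Roughly, the idea looks like this (a minimal Python sketch of the approach only, not the actual jcS code; the host, login, and paths are placeholders):

    # Sketch of what a release collector does: walk a remote FTP and
    # recreate its directory tree locally (folder names only, no files).
    # Not the actual jcS code; host, login, and paths are placeholders.
    import os
    from ftplib import FTP

    def mirror_tree(ftp, remote_dir, local_dir):
        os.makedirs(local_dir, exist_ok=True)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():          # note: some servers return full paths here
            if name in ('.', '..'):
                continue
            sub = remote_dir.rstrip('/') + '/' + name
            try:
                ftp.cwd(sub)             # only succeeds if it is a directory
            except Exception:
                continue                 # treat it as a file and skip it
            mirror_tree(ftp, sub, os.path.join(local_dir, name))

    ftp = FTP('site.example.com')        # placeholder host
    ftp.login('user', 'pass')            # placeholder login
    mirror_tree(ftp, '/releases', r'C:\mirror\site1')
    ftp.quit()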

razoor
01-17-2008, 05:23 AM
That's great, jeza.

It creates folders?
Does it delete that folder if it's deleted on the FTP too, so it keeps it up to date?
And then I would also need someone to make me a !search command for IRC. =)

jeza
01-17-2008, 11:03 AM
Yes, it creates folders on the local machine.
It is not a search engine and it will not delete/update folders.
You must manually delete them...

The source is included, so you can play with it and change it for your needs...

razoor
01-17-2008, 07:33 PM
Well, then my request stands.
My guess is it would be best as a module like nxsharedb.
But I can't do anything about this at all, so I'm counting on you guys. =)

neoxed
01-17-2008, 10:12 PM
So basically, you want the exact same feature as nxTools' dupe function, but using MySQL instead of SQLite?
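
In sketch form, that dupe idea over MySQL boils down to something like this (nxTools itself is Tcl over SQLite; this Python sketch with a hypothetical releases table and the MySQL Connector/Python driver is only an illustration):

    # Hypothetical dupe table: one row per release, inserted on creation,
    # deleted on wipe, matched by the !search trigger. All names made up.
    # Assumed schema: CREATE TABLE releases (site VARCHAR(32),
    #   path VARCHAR(255), name VARCHAR(255), INDEX (name));
    import mysql.connector

    db = mysql.connector.connect(host='dbhost', user='io',
                                 password='secret', database='dupedb')
    cur = db.cursor()

    def add_release(site, path):
        cur.execute("INSERT INTO releases (site, path, name) VALUES (%s, %s, %s)",
                    (site, path, path.rsplit('/', 1)[-1]))
        db.commit()

    def wipe_release(site, path):
        cur.execute("DELETE FROM releases WHERE site = %s AND path = %s",
                    (site, path))
        db.commit()

    def search(term):
        # what a !search <rls> IRC trigger would end up calling
        cur.execute("SELECT site, path FROM releases WHERE name LIKE %s",
                    ('%' + term + '%',))
        return cur.fetchall()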

razoor
01-18-2008, 01:34 AM
Yes, that's correct, neoxed.
So I'm able to !search <rls> from IRC.
But if it's wiped it must be wiped in the DB too, of course. =)

Yil
01-18-2008, 02:25 AM
I had a fragile solution that I used for a while a few years ago. I wrote a perl script that would connect to an FTP server and request a recursive directory listing of the server (anybody remember me adding that into the very first 6.0 release? hehe). If that failed because it wasn't supported or wasn't allowed, it would manually walk the FTP, creating the listing. I ended up putting a lot of work into making that latter case very efficient. It would compare the previous listing to what was on the server and skip listing directories whose timestamps matched and that didn't have subdirectories (or whose local subdirs were excluded via a regex, so you could make it ignore complete tags, etc.). It would also just list directories without CWDing into them, and it remembered dirs that had permission issues before, so if nothing had changed it would avoid trying them again. Finally, it would create the remote directory tree locally and copy matching files (like *.nfo, *.diz, etc.) locally. I eventually wanted to make it a true mirror script where the copy feature could target a remote machine, but I never finished that bit.
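
From memory, the timestamp trick looked something like this (the original was perl; this is a rough Python sketch, and for brevity it skips any unchanged dir where my script only skipped ones without subdirectories):

    # Rough sketch of the incremental walk: keep last run's
    # (path -> timestamp) listing and skip subtrees that match.
    # Needs a server that supports MLSD; the cache file name is made up.
    import json
    from ftplib import FTP

    def scan(ftp, path, old, new):
        for name, facts in ftp.mlsd(path):
            if facts.get('type') != 'dir':
                continue
            sub = path.rstrip('/') + '/' + name
            new[sub] = facts.get('modify', '')
            if old.get(sub) == new[sub]:
                continue                 # unchanged since last run: skip it
            scan(ftp, sub, old, new)

    ftp = FTP('site.example.com')        # placeholder host
    ftp.login('user', 'pass')            # placeholder login
    try:
        old = json.load(open('listing.json'))
    except (OSError, ValueError):
        old = {}
    new = {}
    scan(ftp, '/releases', old, new)
    json.dump(new, open('listing.json', 'w'))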

I then created a simple search index based upon the local mirrors of the remote servers and sorted them by just the last directory name to get a name-to-location index. I also wanted to infer the release date of something from the paths and find a way to store everything in memory efficiently, but didn't implement either.
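
The index itself is trivial once the local mirrors exist, something like (mirror roots are placeholders):

    # Sketch of the name-to-location index: map the last directory name
    # to every place it occurs, across all local mirror roots.
    import os
    from collections import defaultdict

    index = defaultdict(list)
    for root in (r'C:\mirror\site1', r'C:\mirror\site2'):
        for dirpath, dirnames, filenames in os.walk(root):
            for d in dirnames:
                index[d].append(os.path.join(dirpath, d))

    print(index.get('Some.Release-GRP', []))   # a !search-style lookup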

The other piece of the puzzle I wanted was driving updates externally rather than on a timer, for even faster updating. The timer event just checked a few dirs for changes, so it was fast, but this way it would catch rare changes as well. My solution was to read the new-dir and delete-dir events that had been added to the ioFTPD logfile since the last time it was checked, and to watch messages in a spam channel. Combined with a periodic refresh to catch files/folders moved manually behind the FTP's back, this would seem to work well and couldn't hurt.
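
In sketch form, the log-driven part just remembers a file offset and replays whatever is new; note that the event format matched below is a guess for illustration, not ioFTPD's actual log syntax:

    # Remember how far we read last time and replay only the new lines.
    # The NEWDIR/DELDIR pattern is a guess, not ioFTPD's real log format.
    import re

    def poll_log(logpath, offset, on_new, on_del):
        with open(logpath, 'r', errors='replace') as f:
            f.seek(offset)
            for line in f:
                m = re.search(r'(NEWDIR|DELDIR)\s+"?([^"\r\n]+)', line)
                if m:
                    (on_new if m.group(1) == 'NEWDIR' else on_del)(m.group(2))
            return f.tell()              # next poll resumes here

    offset = poll_log(r'C:\ioFTPD\logs\ioFTPD.log', 0,   # placeholder path
                      on_new=lambda p: print('add', p),
                      on_del=lambda p: print('del', p))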

I probably should dig that stuff up from a few years ago, but it makes a template for a solution for you as well. Use a similar tool that creates a full listing (which is actually cool because it could verify that dirs and files are the same across multiple FTPs), or use any mirroring script (there are many, including things like SuperFlexibleFileSynchronizer) to duplicate the remote directory tree, and then search that via a prepared listing such as one generated by find under cygwin or a similar Windows tool.
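
Searching such a prepared listing is then only a few lines, e.g. over the one-path-per-line output of find redirected to a file:

    # Sketch: scan a prepared listing file (one path per line, e.g. the
    # output of `find` under cygwin) for a release name, case-insensitively.
    def search_listing(listing_file, term):
        term = term.lower()
        with open(listing_file, 'r', errors='replace') as f:
            return [line.rstrip('\n') for line in f if term in line.lower()]

    print(search_listing('listing.txt', 'some.release'))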

Neoxed's solution would work very well for cooperating sites. The idea that each site pushes its changes to the database and periodically double-checks that nobody moved anything is a great way to do it, since everything is pretty much already there, and it could easily update two databases. If you added a way to support non-participating sites, that would cover all the bases.

razoor
01-18-2008, 09:01 AM
Sounds like you have been busy over the years, Yil. ;)
I like your replies all over the place.
You always go into the depth of things.

But I would prefer a module like neoxed's sharedb.
Then I can run a MySQL DB somewhere else, and I guess it would be most sensible to have it look through the log for wipe and delete commands on a dir, and look in the xferlog for the creation of the dir.

But I'm not a scripter or anything, so I'm just hoping someone has the time to make this.
There are a lot of scripts for glftpd that sound great. But when you are a Windows user, gl is not a solution, and I don't feel I want to learn Linux at all. Too old to learn :P

And now that we have our savior Yil, who releases more updates for ioFTPD than ever, I hope new scripters will come along and make a lot of these nice scripts, so io can "compete" with gl a little more.