Monday, June 07, 2004

My utterly fucking brilliant search idea of the day, to get me hired by google, or so that I can sue them for copying my awesome idea, is the following:

Search for keywords, limiting the scope of the search to pages in your browser's history. That way you can easily figure out "where did I read blah blah.."

Shit I'm a genius.

The trouble is that google doesn't function on the basis of searching through long lists of specific pages. On the other hand, it would be undesirable to have to keep a copy of each web page locally.

One option would be to use the Google API to construct search queries limited to the domain names of sites in your history. While your history may contain 10,000 web pages viewed, they might be under fewer than 500 domains. The results could then be filtered out in another step to include only the pages you actually visited.
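The two-step scheme above could be sketched roughly like this. This is just a sketch of the query-building and filtering logic, not of the Google API itself; the `per_query` batch size is a guess at how many `site:` clauses one query will tolerate, and the actual search call is left out.

```python
from urllib.parse import urlparse

def domains_from_history(history_urls):
    """Collapse the full list of visited URLs down to unique domains."""
    return sorted({urlparse(u).netloc for u in history_urls})

def build_queries(keywords, domains, per_query=10):
    """Batch domains into site:-restricted queries, e.g.
    'foo bar site:a.com OR site:b.com'.  per_query is an assumed
    limit on how many OR clauses a single query can hold."""
    queries = []
    for i in range(0, len(domains), per_query):
        batch = domains[i:i + per_query]
        sites = " OR ".join("site:" + d for d in batch)
        queries.append(f"{keywords} {sites}")
    return queries

def filter_to_history(result_urls, history_urls):
    """Second pass: keep only the hits for pages actually visited."""
    visited = set(history_urls)
    return [u for u in result_urls if u in visited]
```

So 10,000 history entries under 500 domains would need about 50 queries at this batch size, which is exactly the "lot of network activity" complained about below.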

Unfortunately this is still a lot of work and a lot of network activity to send out to google.

The default web cache size is 50 MB. By splitting the web cache to include a dedicated text repository, perhaps using some of the space that would have gone to caching images, a local cache of one's history may be more feasible with modern storage costs. Hell, google keeps their cache of the entire internet in RAM.
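A text-only history cache like that could be paired with a simple inverted index so the "where did I read blah blah" lookup is instant. This is a toy sketch under stated assumptions: no browser actually exposes such a hook, the tag-stripping is deliberately crude, and everything lives in memory rather than in the 50 MB cache file.

```python
import re
from collections import defaultdict

class TextCache:
    """Toy text-only history cache with an inverted index.
    Hypothetical: sketches the idea, not any real browser API."""

    def __init__(self):
        self.pages = {}                # url -> stripped text
        self.index = defaultdict(set)  # word -> set of urls containing it

    def store(self, url, html):
        # Crude tag-stripping: keep the text, drop markup (and images,
        # which wouldn't be cached in this scheme anyway).
        text = re.sub(r"<[^>]+>", " ", html)
        self.pages[url] = text
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            self.index[word].add(url)

    def search(self, *words):
        """Return the URLs whose cached text contains every keyword."""
        postings = [self.index.get(w.lower(), set()) for w in words]
        return set.intersection(*postings) if postings else set()
```

Because the index is built from the snapshot taken at visit time, a later edit or takedown of the page doesn't affect what you can find, which is the censorship point made next.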

Another advantage of this local caching would be to get around changes made by webmasters to their sites, or censored news pages (actually a fairly common occurrence).

By al - 7:04 p.m. |

Comments:
I'm not very good with mozilla plugins or programming outside a Perl-like language. I think that would be a great idea for a Google API script though. The problem is actually getting the user's pages.
 