A very simple Python caching proxy
A while ago a colleague asked me if I knew of an HTTP proxy tool that would cache the first response it got and then replay it, so that he could develop a client for a web service without hammering the service with requests during testing (or doing a lot of work to mock it out). I mentioned Fiddler, but that's Windows-only. In the end I realised it probably wouldn't take more than 20 lines of Python to achieve the desired effect. Ten minutes later I had the following script; I'm throwing it out there in case it's useful to someone else.
```python
import BaseHTTPServer
import hashlib
import os
import urllib2

class CacheHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET(self):
        m = hashlib.md5()
        m.update(self.path)
        cache_filename = m.hexdigest()
        if os.path.exists(cache_filename):
            print "Cache hit"
            data = open(cache_filename).readlines()
        else:
            print "Cache miss"
            data = urllib2.urlopen("http://targetserver" + self.path).readlines()
            open(cache_filename, 'wb').writelines(data)
        self.send_response(200)
        self.end_headers()
        self.wfile.writelines(data)

def run():
    server_address = ('', 8000)
    httpd = BaseHTTPServer.HTTPServer(server_address, CacheHandler)
    httpd.serve_forever()

if __name__ == '__main__':
    run()
```
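The script above is Python 2 (`BaseHTTPServer` and `urllib2` no longer exist in Python 3). A rough sketch of the same idea on Python 3 might look like the following; note that `http://targetserver` is still just a placeholder for whatever upstream service you are developing against, and the cache files land in the current working directory:

```python
import hashlib
import os
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Placeholder: point this at the real service you are developing against.
UPSTREAM = "http://targetserver"

def cache_key(path):
    # One cache file per request path, named after the path's MD5 digest.
    return hashlib.md5(path.encode("utf-8")).hexdigest()

class CacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cache_filename = cache_key(self.path)
        if os.path.exists(cache_filename):
            print("Cache hit")
            with open(cache_filename, "rb") as f:
                data = f.read()
        else:
            print("Cache miss")
            # Fetch from the upstream service and store the raw bytes.
            data = urlopen(UPSTREAM + self.path).read()
            with open(cache_filename, "wb") as f:
                f.write(data)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(data)

def run():
    HTTPServer(("", 8000), CacheHandler).serve_forever()

if __name__ == "__main__":
    run()
```

The same caveats apply as in the original: only GET is handled, the upstream headers and status code are discarded (everything is replayed as a 200), and the cache never expires, which is exactly what you want for this kind of test fixture.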