Tuesday, November 8, 2011

S3 and the 5GB limit!

s3cmd, as of this writing, still does not support multi-part uploads, so it can't break the 5GB single-object barrier.

But http://sprightlysoft.com/S3Upload/ does!

Now the problem with that is the ETag, which used to be the MD5 of the object, is NOT the MD5 anymore once you start using multi-part uploads. It looks like an MD5, but it isn't: it has a dash in it.
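For what it's worth, S3 appears to build that dashed ETag deterministically: take the binary MD5 of each uploaded part, concatenate them, MD5 the result, and append a dash plus the part count. AWS doesn't officially guarantee this format, so treat the sketch below as an observation, not a contract. The function name and part size are mine:

```python
import hashlib

def multipart_etag(data, part_size):
    """Reproduce the ETag S3 seems to report for a multipart upload:
    MD5 of the concatenated binary MD5s of each part, followed by a
    dash and the number of parts."""
    part_md5s = [hashlib.md5(data[i:i + part_size]).digest()
                 for i in range(0, len(data), part_size)]
    if len(part_md5s) <= 1:
        # Single-part uploads keep the plain MD5 as the ETag.
        return hashlib.md5(data).hexdigest()
    combined = hashlib.md5(b"".join(part_md5s)).hexdigest()
    return "%s-%d" % (combined, len(part_md5s))
```

If you know the part size a tool used, you can recompute this locally and compare it against the ETag to verify a multipart upload, which is about the only use the dashed value has.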

So, a slight modification to s3cmd:

@@ -633,7 +633,10 @@
             output(u"   File size: %s" % info['headers']['content-length'])
             output(u"   Last mod:  %s" % info['headers']['last-modified'])
             output(u"   MIME type: %s" % info['headers']['content-type'])
-            output(u"   MD5 sum:   %s" % info['headers']['etag'].strip('"'))
+            if info['headers'].has_key('x-amz-meta-multipart-etag'):
+                output(u"   MD5 sum:   %s" % info['headers']['x-amz-meta-multipart-etag'])
+            else:
+                output(u"   MD5 sum:   %s" % info['headers']['etag'].strip('"'))
         else:
             info = s3.bucket_info(uri)
             output(u"%s (bucket):" % uri.uri())

And bam, you're back in business with S3Upload and s3cmd.
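The patch above relies on the uploader having stashed the real MD5 in the `x-amz-meta-multipart-etag` metadata header at upload time. A minimal sketch of the uploader side, assuming your S3 client lets you attach arbitrary `x-amz-meta-*` headers (the helper name and chunk size are mine):

```python
import hashlib

def multipart_upload_headers(path, chunk_size=8 * 1024 * 1024):
    """Compute the whole-file MD5 in chunks (so huge files don't
    need to fit in memory) and return it as the custom metadata
    header the patched s3cmd looks for."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return {"x-amz-meta-multipart-etag": md5.hexdigest()}
```

Whatever tool performs the multipart upload would pass these headers along with the initiate request; S3 then returns them on every subsequent HEAD, which is what the patched `info` command reads.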

Executable version, built with standalone Python 2.5 and cx_Freeze: Here


Tuesday, November 1, 2011

New Google Reader interface

Are you kidding me? Why all the wasted space? It's TERRIBLE...

This would be so much better.