Comments on Code Intensity: "Delete an S3 Bucket Containing Thousands of Files" (by Chris) — 4 comments

Anonymous — 2009-02-11:
Or AWS::S3::Bucket.find('foo') -- let the web service do the work.

Chris — 2008-12-03:
Good call Henrik, yes, if you have a lot of buckets that'll be more effective for sure (I had fewer than 10 buckets), and it's obviously a more robust/flexible solution.

Anonymous — 2008-12-03:
Thanks, had the same situation.

Picked the bucket with this instead:

AWS::S3::Service.buckets.find {|b| b.name =~ /foo/ }

since picking by index is scary.

Saurabh — 2008-02-04:
Chris:

If you have to do something one time like these deletes, and do not need reusable code in an automated program, you can try our product, Bucket Explorer. It should be able to help with these S3 operations.
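Putting the suggestions from the thread together, a minimal sketch of the pattern: select the bucket by name rather than by a fragile index, then delete it with `:force` so all keys are removed first. The `AWS::S3::Service.buckets` and `AWS::S3::Bucket.delete` calls assume the old aws-s3 gem the commenters are using; the `Struct` stub below is a hypothetical stand-in so the selection logic can run without AWS credentials.

```ruby
# Stand-in for AWS::S3::Bucket objects (hypothetical stub, so this
# runs without credentials); real buckets also respond to #name.
Bucket = Struct.new(:name)

buckets = [Bucket.new('logs'), Bucket.new('foo-assets'), Bucket.new('backup')]

# Henrik's suggestion: pick the bucket by a name pattern, not by index.
target = buckets.find { |b| b.name =~ /foo/ }
puts target.name   # => foo-assets

# With real credentials, the equivalent aws-s3 calls would be:
#   target = AWS::S3::Service.buckets.find { |b| b.name =~ /foo/ }
#   AWS::S3::Bucket.delete(target.name, :force => true)  # empties, then deletes
```

The `:force => true` option is what makes this work on a bucket containing thousands of files: it deletes every key before deleting the bucket itself, letting the library do the looping for you.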