How can I see what is inside an S3 bucket with boto3? (i.e., do the equivalent of an "ls"?)

Doing the following:

import boto3
s3 = boto3.resource('s3')
my_bucket = s3.Bucket('some/path/')

returns:

s3.Bucket(name='some/path/')

How do I see its contents?


Current answer

First, create an S3 client object:

s3_client = boto3.client('s3')

Next, create variables to hold the bucket name and the folder. Note the trailing slash "/" after the folder name:

bucket_name = 'my-bucket'
folder = 'some-folder/'

Next, call s3_client.list_objects_v2 to get the metadata of the objects under that folder:

response = s3_client.list_objects_v2(
  Bucket=bucket_name,
  Prefix=folder
)

Finally, using the object metadata, you can fetch each S3 object by calling the s3_client.get_object function:

for object_metadata in response['Contents']:
    object_key = object_metadata['Key']
    response = s3_client.get_object(
        Bucket=bucket_name,
        Key=object_key
    )
    object_body = response['Body'].read()
    print(object_body)

As you can see, the contents of each object are obtained by calling response['Body'].read(). Note that this returns raw bytes rather than a string, so decode it if you need text.
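
For example, here is a minimal sketch of decoding the body to text, assuming the object contains UTF-8 encoded data (the bucket name and key below are hypothetical):

import boto3

s3_client = boto3.client('s3')

response = s3_client.get_object(
    Bucket='my-bucket',           # hypothetical bucket name
    Key='some-folder/data.csv'    # hypothetical object key
)
body_bytes = response['Body'].read()    # raw bytes
body_text = body_bytes.decode('utf-8')  # assumes the object is UTF-8 text
print(body_text)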

Other answers

You can also do it like this:

s3 = boto3.client('s3')

# Bucket takes the bucket name; s3_bucket_path is assumed to hold it
csv_files = s3.list_objects_v2(Bucket=s3_bucket_path)
for obj in csv_files['Contents']:
    key = obj['Key']

I am assuming you have configured authentication separately.

import boto3
s3 = boto3.resource('s3')

my_bucket = s3.Bucket('bucket_name')

for file in my_bucket.objects.all():
    print(file.key)
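
If credentials are not already configured (e.g. via aws configure or environment variables), one possible sketch is to pass them explicitly; the profile name below is hypothetical:

import boto3

# Use a named profile from ~/.aws/credentials (the profile name is hypothetical)
session = boto3.Session(profile_name='my-profile')
s3 = session.resource('s3')

# Alternatively, pass the keys directly (placeholders only):
# s3 = boto3.resource(
#     's3',
#     aws_access_key_id='AKIA...',
#     aws_secret_access_key='...',
# )

my_bucket = s3.Bucket('bucket_name')
for obj in my_bucket.objects.all():
    print(obj.key)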

Running an AWS CLI command from a Lambda function is also a good option:

import subprocess
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def run_command(command):
    command_list = command.split(' ')

    try:
        logger.info("Running shell command: \"{}\"".format(command))
        result = subprocess.run(command_list, stdout=subprocess.PIPE)
        logger.info("Command output:\n---\n{}\n---".format(result.stdout.decode('UTF-8')))
    except Exception as e:
        logger.error("Exception: {}".format(e))
        return False

    return True

def lambda_handler(event, context):
    run_command('/opt/aws s3 ls s3://bucket-name')

This is similar to 'ls', but it does not take the prefix/folder convention into account and will simply list the objects in the bucket. It is left to the reader to filter out the prefixes that are part of the key names.
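
If you do want an 'ls'-like view that treats prefixes as folders, one possible sketch uses the Delimiter parameter of list_objects_v2 (the bucket name and prefix below are hypothetical):

import boto3

s3_client = boto3.client('s3')

response = s3_client.list_objects_v2(
    Bucket='bucket-name',    # hypothetical bucket name
    Prefix='some-folder/',   # hypothetical prefix; use '' for the bucket root
    Delimiter='/'            # group keys on '/' so prefixes behave like folders
)

# "Sub-folders" directly under the prefix
for common_prefix in response.get('CommonPrefixes', []):
    print(common_prefix['Prefix'])

# Objects directly under the prefix
for obj in response.get('Contents', []):
    print(obj['Key'])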

In Python 2 (using the legacy boto library):

from boto.s3.connection import S3Connection

conn = S3Connection() # assumes boto.cfg setup
bucket = conn.get_bucket('bucket_name')
for obj in bucket.get_all_keys():
    print(obj.key)

In Python 3 (using boto3):

from boto3 import client

conn = client('s3')  # again assumes boto.cfg setup, assume AWS S3
for key in conn.list_objects(Bucket='bucket_name')['Contents']:
    print(key['Key'])

Here is a solution:

import boto3

s3 = boto3.resource('s3')
BUCKET_NAME = 'Your S3 Bucket Name'
allFiles = s3.Bucket(BUCKET_NAME).objects.all()
for file in allFiles:
    print(file.key)