
Backing up important EC2 files and a MySQL database to S3 with Python

April 5, 2019  | 移动技术网 IT编程


Today I tried using the boto library to back up files from EC2 to S3 with Python. Without further ado, here is the code:

1. Back up important files to S3:


[python] 
import os 
from boto.s3.connection import S3Connection 
from boto.s3.key import Key 
 
connected = 0 
 
def connect(): 
    access_key = 'yourkey' 
    secret_key = 'yourkey' 
    global conn 
    conn = S3Connection(access_key, secret_key) 
    global connected 
    connected = 1 
 
def put(filename, bucketname): 
    if connected == 0: 
        print('not connected!') 
    elif connected == 1: 
        local_file = filename.strip() 
        bucket = bucketname.strip() 
        b = conn.get_bucket(bucket) 
        k = Key(b) 
        k.key = local_file  # the object key is the full local path 
        k.set_contents_from_filename(local_file) 
 
if __name__ == '__main__': 
    connect() 
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/storyphotos' 
    print('story photo sync in progress') 
    for root, dirs, files in os.walk(sourcefolder): 
        for file in files: 
            print('  ' + os.path.join(root, file)) 
            put(os.path.join(root, file), 'ttgrow-photo') 
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/thumbnails' 
    print('thumbnail sync in progress') 
    for root, dirs, files in os.walk(sourcefolder): 
        for file in files: 
            print('  ' + os.path.join(root, file)) 
            put(os.path.join(root, file), 'ttgrow-photo') 
    print('finished') 
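For reference, `os.walk` recursively yields a `(root, dirs, files)` tuple for every directory under the starting folder, which is why the nested loops above reach every file in the tree. A minimal, self-contained sketch using a throwaway temp directory:

```python
import os
import tempfile

# build a tiny throwaway tree: base/a.txt and base/sub/b.txt
base = tempfile.mkdtemp()
os.makedirs(os.path.join(base, 'sub'))
for p in ('a.txt', os.path.join('sub', 'b.txt')):
    open(os.path.join(base, p), 'w').close()

# os.walk visits base first, then base/sub
found = []
for root, dirs, files in os.walk(base):
    for name in files:
        found.append(os.path.relpath(os.path.join(root, name), base))

print(sorted(found))
```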

2. Back up MySQL to S3:

[python] 
import os 
from datetime import datetime 
from boto.s3.connection import S3Connection 
from boto.s3.key import Key 
 
connected = 0 
 
def connect(): 
    access_key = 'yourkey' 
    secret_key = 'yourkey' 
    global conn 
    conn = S3Connection(access_key, secret_key) 
    global connected 
    connected = 1 
 
def put(filename, bucketname): 
    if connected == 0: 
        print('not connected!') 
    elif connected == 1: 
        local_file = filename.strip() 
        bucket = bucketname.strip() 
        b = conn.get_bucket(bucket) 
        k = Key(b) 
        k.key = local_file 
        k.set_contents_from_filename(local_file) 
 
if __name__ == '__main__': 
    # dump the database to a timestamped file, then upload it 
    temp = datetime.today() 
    filename = ('/tmp/dbbak-' + str(temp.year) + '-' + str(temp.month) + '-' 
                + str(temp.day) + '-' + str(temp.hour) + '-' + str(temp.minute) + '.sql') 
    os.system("mysqldump -h your_rds_location -u usrname -ppassword dbname > " + filename) 
    print('backup db finished') 
    connect() 
    put(filename, 'ttgrow-db') 
    print('upload to s3 finished') 
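As an aside, the timestamped filename above can also be built in one call with `strftime` (note that this variant zero-pads the fields, e.g. `-04-` rather than `-4-`):

```python
from datetime import datetime

# equivalent timestamped dump path, built in one strftime call
filename = datetime.today().strftime('/tmp/dbbak-%Y-%m-%d-%H-%M.sql')
print(filename)
```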
Add the script to a cron job and it will run automatically every day :)
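A possible crontab entry for the daily run (the script path, Python path, schedule, and log file here are all examples, not from the original article):

```shell
# edit with `crontab -e`: run the backup script every day at 03:00
0 3 * * * /usr/bin/python /opt/backup/s3_backup.py >> /var/log/s3_backup.log 2>&1
```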
