Golang : Upload and download file to/from AWS S3




One of my projects involved uploading tar-gzipped files to AWS S3 and using CloudFront (CDN) for worldwide distribution.

The process is triggered by a cron job, and in this tutorial, I will show you how to create a small program in Golang to upload a file to S3 and then download the uploaded file back from S3.

However, please note that I won't go into the details of setting up AWS or configuring the cron job.

The code below uses these third-party packages :

  "launchpad.net/goamz/s3"
  "launchpad.net/goamz/aws"

so before executing the code below, please run these two commands first :

go get launchpad.net/goamz/s3

go get launchpad.net/goamz/aws

Here you go :

 package main

 import (
 	"bufio"
 	"fmt"
 	"io"
 	"launchpad.net/goamz/aws"
 	"launchpad.net/goamz/s3"
 	"net/http"
 	"os"
 )

 func main() {
 	AWSAuth := aws.Auth{
 		AccessKey: "", // change this to yours
 		SecretKey: "",
 	}

 	// change this to your AWS region :
 	// click on the bucket name in the AWS control panel, click Properties,
 	// and the region for your bucket is shown under the "Static Website Hosting" tab
 	region := aws.USEast

 	connection := s3.New(AWSAuth, region)

 	bucket := connection.Bucket("<your bucketname>") // change this to your bucket name

 	path := "example/big.jpg" // this is the target file and location in S3

 	// we need to read big.jpg into a []byte buffer
 	// see https://www.socketloop.com/tutorials/golang-read-binary-file-into-memory

 	fileToBeUploaded := "big.jpg"

 	file, err := os.Open(fileToBeUploaded)
 	if err != nil {
 		fmt.Println(err)
 		os.Exit(1)
 	}
 	defer file.Close()

 	fileInfo, err := file.Stat()
 	if err != nil {
 		fmt.Println(err)
 		os.Exit(1)
 	}
 	size := fileInfo.Size()
 	bytes := make([]byte, size)

 	// read the entire file into the buffer -- a single buffer.Read()
 	// is not guaranteed to fill the slice, so use io.ReadFull
 	buffer := bufio.NewReader(file)
 	_, err = io.ReadFull(buffer, bytes)
 	if err != nil {
 		fmt.Println(err)
 		os.Exit(1)
 	}

 	// then we need to determine the file type
 	// see https://www.socketloop.com/tutorials/golang-how-to-verify-uploaded-file-is-image-or-allowed-file-types
 	filetype := http.DetectContentType(bytes)

 	// Upload (PUT)
 	err = bucket.Put(path, bytes, filetype, s3.ACL("public-read"))
 	if err != nil {
 		// NOTE : If you get this error message :
 		//   Get : 301 response missing Location header
 		// it is because you are using the wrong region for the bucket.
 		// If you want to figure out the bucket location automatically,
 		// see http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketGETlocation.html
 		// I've tried it out with http.Get() and just getting the authenticating
 		// requests part right is already too much work for this tutorial.
 		// See http://docs.aws.amazon.com/AmazonS3/latest/API/sig-v4-authenticating-requests.html

 		// UPDATE 15th Jan 2015: See http://camlistore.org/pkg/misc/amazon/s3/#Client.BucketLocation
 		fmt.Println(err)
 		os.Exit(1)
 	}

 	fmt.Printf("Uploaded to %s with %v bytes to S3.\n\n", path, size)

 	// Download (GET)
 	downloadBytes, err := bucket.Get(path)
 	if err != nil {
 		fmt.Println(err)
 		os.Exit(1)
 	}

 	downloadFile, err := os.Create("download.jpg")
 	if err != nil {
 		fmt.Println(err)
 		os.Exit(1)
 	}
 	defer downloadFile.Close()

 	// write through a buffered writer and flush it, so that the
 	// bytes actually reach the file before the program exits
 	downloadBuffer := bufio.NewWriter(downloadFile)
 	downloadBuffer.Write(downloadBytes)
 	downloadBuffer.Flush()

 	fmt.Printf("Downloaded from S3 and saved to download.jpg.\n\n")
 }

Sample output :

Uploaded to example/big.jpg with 150042 bytes to S3.

Downloaded from S3 and saved to download.jpg.

and to verify, check that both files have the same size :

 ls -la *.jpg
 -rw-r--r--@ 1 sweetlogic  staff  150042 Dec 10 10:13 big.jpg
 -rw-r--r--  1 sweetlogic  staff  150042 Jan  9 13:30 download.jpg

UPDATE : If you are uploading files larger than 100MB... please read this tutorial on how to upload with multipart to S3.
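For background : a multipart upload splits the file into parts that S3 stitches back together, where every part except the last must be at least 5 MB (the goamz package exposes this through its `s3.Multi` type, if I recall correctly). A tiny sketch of the part arithmetic -- `partCount` is a hypothetical helper of mine, not a goamz function :

```go
package main

import "fmt"

// s3MinPartSize is S3's documented minimum size for every part except the last.
const s3MinPartSize = 5 * 1024 * 1024

// partCount (hypothetical helper) returns how many parts a file of
// fileSize bytes is split into when uploaded in chunks of partSize bytes.
func partCount(fileSize, partSize int64) int64 {
	if partSize < s3MinPartSize {
		partSize = s3MinPartSize // clamp to S3's minimum
	}
	return (fileSize + partSize - 1) / partSize // round up
}

func main() {
	// e.g. a 150 MB archive uploaded in 5 MB parts needs 30 parts
	fmt.Println(partCount(150*1024*1024, s3MinPartSize)) // 30

	// anything at or below the part size goes up as a single part
	fmt.Println(partCount(1024, s3MinPartSize)) // 1
}
```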

References :

https://www.socketloop.com/tutorials/golang-download-file-example

http://golang.org/pkg/io/#Copy

http://godoc.org/launchpad.net/goamz/s3

  See also : Golang : List objects in AWS S3 bucket





By Adam Ng

If you gained some knowledge or the information here solved your programming problem, please consider donating to the less fortunate or to some charities that you like. Apart from donation, planting trees, volunteering or reducing your carbon footprint will be great too.

