
How to Upload to S3 — Spring Boot tutorial

theoneamin
May 24, 2021


With the growing use of media content such as images, videos, and audio files on websites and applications comes the need for dedicated storage for these files. Amazon S3, or Amazon Simple Storage Service, provides exactly that.

In this article, we are going to look at how to upload files to an Amazon S3 bucket from a Spring Boot application. If you do not already have an AWS account, register for one; most of the services are free, at least for your first year.

Once you have your account, create an IAM user and attach the AmazonS3FullAccess policy. Also, make sure to get the security credentials, i.e., the access key ID and secret access key. Then, in the services menu, search for S3 and create a bucket. For your bucket, change the permissions to allow public access. Once that is done, we can create the functionality to upload files to our bucket.

Uploading from Spring Boot

I have a Spring Boot application to which I have added the AWS SDK for Java as a dependency. You can find it here: https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk.
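If you are using Maven, the dependency declaration looks roughly like the sketch below; the version is a placeholder, so check the link above for the latest release.

<!-- AWS SDK for Java (v1), from Maven Central; replace the version with the latest release -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>REPLACE_WITH_LATEST_VERSION</version>
</dependency>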

The next thing we are going to do is create three packages, one for the configuration, another for getting the bucket name, and the final one for holding the logic to upload files.

First, let's create a bucket package and, inside it, an enum class with an appropriate name; mine is called BucketName. Put this in the class.

package com.example.test.bucket;

public enum BucketName {

    PROFILE_IMAGE("PUT_YOUR_BUCKET_NAME_HERE");

    private final String bucketName;

    BucketName(String bucketName) {
        this.bucketName = bucketName;
    }

    public String getBucketName() {
        return bucketName;
    }
}

Next, let's create a package for the configuration; I called mine amazonconfig. In it, create a class called AmazonConfig and paste the following.

package com.example.test.amazonconfig;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AmazonConfig {

    @Value("${AWS_SECRET}")
    private String awsSecret;

    @Value("${AWS_KEY}")
    private String awsKey;

    @Bean
    public AmazonS3 s3() {
        AWSCredentials awsCredentials = new BasicAWSCredentials(awsKey, awsSecret);
        return AmazonS3ClientBuilder
                .standard()
                .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                .withRegion("PUT_YOUR_REGION_HERE")
                .build();
    }
}

Note: Remember to put your AWS KEY and SECRET in your application.properties.
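For example, the two property names referenced by the @Value annotations above can be defined in application.properties (or supplied as environment variables); the values below are placeholders:

AWS_KEY=PUT_YOUR_ACCESS_KEY_ID_HERE
AWS_SECRET=PUT_YOUR_SECRET_ACCESS_KEY_HERE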

Finally, let's create the last package called storage and, in it, a Storage class. Within this class, we are going to inject the AmazonS3 client by passing it to the constructor as shown below. Then we will create a method called save, which will upload our files to the S3 bucket.

package com.example.test.storage;

import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.*;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.Optional;

@Service
public class Storage {

    private final AmazonS3 s3;

    @Autowired
    public Storage(AmazonS3 s3) {
        this.s3 = s3;
    }

    public void save(String path,
                     String fileName,
                     Optional<Map<String, String>> optionalMetadata,
                     InputStream inputStream) {
        // copy any provided metadata (content type and length) onto the S3 object
        ObjectMetadata metadata = new ObjectMetadata();
        optionalMetadata.ifPresent(map -> {
            if (!map.isEmpty()) {
                for (Map.Entry<String, String> entry : map.entrySet()) {
                    String key = entry.getKey();
                    String value = entry.getValue();
                    if (key.equals("Content-Length")) {
                        metadata.setContentLength(Long.parseLong(value));
                    }
                    if (key.equals("Content-Type")) {
                        metadata.setContentType(value);
                    }
                }
            }
        });
        // upload the object and make it publicly readable
        try {
            s3.putObject(new PutObjectRequest(path, fileName, inputStream, metadata)
                    .withCannedAcl(CannedAccessControlList.PublicRead));
        } catch (AmazonServiceException e) {
            throw new IllegalStateException("Failed to store file", e);
        }
    }
}

With these three classes, we are ready to implement this functionality. For this example, we are going to upload profile pictures of users. So let's create a UserService class and inject the Storage class as shown below. Again, remember to import all required packages for the class.

@Service
public class UserService {

    private final UserRepository userRepository;
    private final Storage storage;

    @Autowired
    public UserService(UserRepository userRepository, Storage storage) {
        this.userRepository = userRepository;
        this.storage = storage;
    }
}

Then add a method in the class to upload a profile picture:

@Transactional
public void uploadProfilePic(Long userId, MultipartFile file) {
    // check that the file is not empty
    if (file.isEmpty()) {
        throw new IllegalStateException("No file added");
    }
    // check whether the user exists
    User user = userRepository.findById(userId)
            .orElseThrow(() -> new IllegalStateException("user does not exist"));
    // prepare metadata
    Map<String, String> metadata = new HashMap<>();
    metadata.put("Content-Type", file.getContentType());
    metadata.put("Content-Length", String.valueOf(file.getSize()));
    // create a path based on the username, so that all of a user's files are in the same directory
    String path = String.format("%s/%s", BucketName.PROFILE_IMAGE.getBucketName(), user.getUsername());
    // create a file name from the original file name and a random UUID
    String filename = String.format("%s-%s", file.getOriginalFilename(), UUID.randomUUID());
    // store the file
    try {
        storage.save(path, filename, Optional.of(metadata), file.getInputStream());
    } catch (IOException e) {
        throw new IllegalStateException("error", e);
    }
}
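Depending on your application, you will probably also want to persist the generated file name on the user so the picture can be looked up later. As a minimal sketch, assuming the User entity has a hypothetical profileImageFilename field (not shown in this article) and UserRepository is a standard Spring Data repository, you could add this right after the call to storage.save:

// hypothetical setter; adjust to match your own User entity
user.setProfileImageFilename(filename);
userRepository.save(user);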

And that is it. Now we can use a controller to create an endpoint to upload files. The endpoint accepts multipart form data and a user id, so when testing, remember to send the expected data.

@PostMapping(
        path = "{userId}/profile/upload",
        consumes = MediaType.MULTIPART_FORM_DATA_VALUE,
        produces = MediaType.APPLICATION_JSON_VALUE
)
public void uploadProfilePic(@PathVariable("userId") Long userId,
                             @RequestParam("file") MultipartFile file) {
    userService.uploadProfilePic(userId, file);
}

Now we have completely built everything we need to upload files to our S3 bucket. Test your API endpoint and adjust this feature to match your application structure.
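For example, assuming the application runs locally on port 8080 and the controller has no additional class-level @RequestMapping prefix (adjust the URL to your own setup), a quick test with curl could look like this, where profile.png is any image file in your current directory:

curl -i -F "file=@profile.png" http://localhost:8080/1/profile/upload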

Thanks for reading.
