Symfony2: Configuring VichUploaderBundle and Gaufrette to use AmazonS3

Last week, I was looking to install VichUploaderBundle into a Symfony2 project to automatically handle file uploads. As I was reading through the Vich documentation, I ran across a section describing how to use Gaufrette to skip the local filesystem and push files directly to Amazon S3. Since we’d eventually need to load balance the app and push uploaded files to S3 anyway, I decided to set it up out of the gate. Unfortunately, the documentation for setting up Vich with Gaufrette is a bit opaque, so here’s a step-by-step guide to getting it going.

Install Everything

The first thing you’ll want to do is install all the required packages. If you’re using Composer, the following will work:
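A sketch of the relevant composer.json entries (the version constraints are illustrative — pin whatever matches your Symfony version):

```json
{
    "require": {
        "vich/uploader-bundle": "~0.13",
        "knplabs/knp-gaufrette-bundle": "~0.1",
        "aws/aws-sdk-php": "~2.4"
    }
}
```

After a `composer update`, remember to register VichUploaderBundle and KnpGaufretteBundle in your AppKernel.php.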

Once all the packages are installed, you’ll need to configure *both* Gaufrette and Vich. This is where the documentation broke down a bit for me. You’ll need your Amazon AWS “Access Key ID” and “Secret Key”, both available from the Security Credentials page if you’re logged into AWS.

Configure It
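Here’s a sketch of what goes in app/config/config.yml. The service id (acme.aws_s3_client), adapter/filesystem/mapping names, parameters, and bucket name are all placeholders — swap in your own. The keys themselves come from KnpGaufretteBundle and VichUploaderBundle:

```yaml
services:
    acme.aws_s3_client:
        class: Aws\S3\S3Client
        factory_class: Aws\S3\S3Client
        factory_method: factory
        arguments:
            -
                key: %aws_key%
                secret: %aws_secret%

knp_gaufrette:
    adapters:
        logo_adapter:
            aws_s3:
                service_id: acme.aws_s3_client
                bucket_name: my-bucket
    filesystems:
        logo_fs:
            adapter: logo_adapter

vich_uploader:
    db_driver: orm
    gaufrette: true
    storage: vich_uploader.storage.gaufrette
    mappings:
        logo:
            uri_prefix: https://my-bucket.s3.amazonaws.com
            upload_destination: logo_fs
            namer: vich_uploader.namer_uniqid
```

The important bits are `gaufrette: true` and pointing `storage` at the Gaufrette storage service — without them Vich writes to the local filesystem and ignores your S3 filesystem entirely.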

Once everything is configured at the YAML level, the final step is adding the Vich annotations to your entities.

Make sure you add the “@Vich\Uploadable” annotation to your Entity or Vich will fail silently.

The mapping specified in @Vich\UploadableField(mapping="logo", fileNameProperty="logo") needs to match a key under “vich_uploader.mappings” which you defined in config.yml.
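Putting it together, the entity ends up looking something like this (a sketch — the Brand class and field names are illustrative, and it includes the updatedAt workaround described below):

```php
<?php

use Doctrine\ORM\Mapping as ORM;
use Symfony\Component\HttpFoundation\File\File;
use Vich\UploaderBundle\Mapping\Annotation as Vich;

/**
 * @ORM\Entity
 * @Vich\Uploadable
 */
class Brand
{
    /**
     * Stores only the filename, the actual file lives on S3.
     *
     * @ORM\Column(type="string", length=255, nullable=true)
     */
    private $logo;

    /**
     * Not a Doctrine column — holds the uploaded file itself.
     *
     * @Vich\UploadableField(mapping="logo", fileNameProperty="logo")
     */
    private $logoVirtual;

    /**
     * @ORM\Column(type="datetime", nullable=true)
     */
    private $updatedAt;

    public function setLogoVirtual(File $logo = null)
    {
        $this->logoVirtual = $logo;

        if ($logo) {
            // Force a Doctrine change so the lifecycle listeners fire
            $this->updatedAt = new \DateTime('now');
        }
    }

    public function getLogoVirtual()
    {
        return $this->logoVirtual;
    }
}
```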

Finally, one last “gotcha” to be cognizant of is this bug – Since Vich uses Doctrine lifecycle callbacks to manage files, if no Doctrine fields are changed then the Vich code isn’t executed. The easiest way to get around this (and what we used), is just to manually update the “updated_at” column every time a form is submitted to ensure that the upload handling code is executed.

Anyway, as always, questions and comments are welcome.

Posted In: Symfony




  • ZhilasdZhil

    There is a huge performance killer in this implementation – if you use Assert\File, it means that Symfony will validate that file each time you load the entity. Yep, it will create an SFTP connection, check the dir, check the file. If you load 100 entities – pretty expensive :(

  • Michael Brauner

    And what is another solution? What are the other performance killers in this code?

  • What Andrew is referring to below is that since you’re using a remote file system, when Symfony validates the file it’s going to have to make a network connection. I don’t think his point about loading entities is correct though – Symfony should only validate when the entity is saved.

    An alternative solution would just be to hand roll the upload code yourself, but you’d have to write your own logic to be able to use a local file system vs. S3.

  • Michael Brauner

    I understand the problem completely. Is there no way to validate only on upload with VichUploaderBundle?

  • Oh sure. If you specify the constraint inside your form type instead of on the entity it’ll only be validated once when the file is uploaded.
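    As a sketch, that looks something like the following form type (the BrandType class and field names are illustrative):

```php
<?php

use Symfony\Component\Form\AbstractType;
use Symfony\Component\Form\FormBuilderInterface;
use Symfony\Component\Validator\Constraints\File as AssertFile;

class BrandType extends AbstractType
{
    public function buildForm(FormBuilderInterface $builder, array $options)
    {
        // Constraint lives on the form field, not the entity, so it
        // only runs when this form is submitted with a new upload.
        $builder->add('logoVirtual', 'file', array(
            'required'    => false,
            'constraints' => array(
                new AssertFile(array(
                    'maxSize'   => '5M',
                    'mimeTypes' => array('image/png', 'image/jpeg'),
                )),
            ),
        ));
    }

    public function getName()
    {
        return 'brand';
    }
}
```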

  • Michael Brauner

    Ah ok. Thank you! Sounds logical. And if I only use aws-sdk-php instead of Gaufrette and VichUploaderBundle – what is the advantage in that case, and in general?

  • Using the aws-sdk directly, you’ll have more control over how things work but you’ll have to write a lot of glue code that Gaufrette+Vich have already taken care of for you.

    Gaufrette+Vich has been decent in my experience – the only issue I’ve run into is some weirdness with reading files off S3 with Gaufrette.

  • Olim Saidov

    I’m not quite sure about your statement that Symfony (Doctrine?) will validate the entity on load. Could you point me to a resource or docs?

    I thought the Form was responsible for issuing validation.

  • Andrew Zhilin

    If you use the Assert\File annotation – you can simply put a breakpoint on the listener and be surprised when it’s called.

    Not sure about the latest Symfony2 versions – but in Symfony 2.3 the validation listener was called every time you submitted a form with the entity, even if file uploading was not used and even if the file field was not included in the form.
    I ran into this problem in Sonata Admin – each time a filter by entity is submitted, entity validation is called.

  • I followed these steps and created an entity named Brand, but when we configure this entity in Sonata Admin it shows me this error:

    Neither the property “logo_virtual” nor one of the methods “getLogoVirtual()”, “logoVirtual()”, “isLogoVirtual()”, “hasLogoVirtual()”, “__get()” exist and have public access in class “ABCBundle\AABundle\Entity\Brand”. Can you help me with this?

  • Do you have a field called logoVirtual on the entity? It looks like you don’t have the getters/setters for that variable defined, which is the error you are seeing.

  • Thanks for the quick response. We put it exactly as you mentioned in the doc, but when I run the command generate:doctrine:entities Brand, the getter/setter is created only for the logo field, not the logoVirtual field.

  • You’ll need to add them by hand since it isn’t a Doctrine field. Just add setLogoVirtual/getLogoVirtual methods.

  • yes we set it like this but it’s not working :-

    /**
     * If manually uploading a file (i.e. not using Symfony Form) ensure an instance
     * of 'UploadedFile' is injected into this setter to trigger the update. If this
     * bundle's configuration parameter 'inject_on_load' is set to 'true' this setter
     * must be able to accept an instance of 'File' as the bundle will inject one here
     * during Doctrine hydration.
     *
     * @param File|\Symfony\Component\HttpFoundation\File\UploadedFile $image
     */
    public function setLogoVirtual(File $image = null)
    {
        $this->logo_virtual = $image;

        if ($image) {
            // It is required that at least one field changes if you are using Doctrine,
            // otherwise the event listeners won't be called and the file is lost
            $this->updatedAt = new \DateTime('now');
        }
    }

    /**
     * @return File
     */
    public function getLogoVirtual()
    {
        return $this->logo_virtual;
    }
  • Great, thanks! You saved my time – it worked perfectly. One more thing: can you tell me how I can show the images on the front-end and back-end side?

  • andr435

    I needed to change the configuration to this one:

    class: Aws\S3\S3Client
    factory_class: Aws\S3\S3Client
    factory_method: 'factory'

    credentials: { key: %aws_key%, secret: %aws_secret_key% }
    region: %aws_region%
    version: %aws_version%

  • Devid

    Hey, thanks for the nice post. I am doing the same, but I have an issue with the edit section in Sonata Admin. When I click on the edit button it shows the upload image field again… Can you tell me why, and how I can fix this?

  • Do you mean it is just showing the field to upload a new image again? If you only want to allow them to upload it once, you can do something like if ($this->getSubject()->getImage() === null) { $form->add('image', 'image', …); } in the Admin class for the edit.

    That way once the entity has an image it will no longer output the field.
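    As a sketch, that conditional looks something like this inside a Sonata Admin class (the BrandAdmin class and field names are illustrative):

```php
<?php

use Sonata\AdminBundle\Admin\Admin;
use Sonata\AdminBundle\Form\FormMapper;

class BrandAdmin extends Admin
{
    protected function configureFormFields(FormMapper $formMapper)
    {
        $formMapper->add('name');

        // Only output the upload field while no image has been set yet
        if (null === $this->getSubject() || null === $this->getSubject()->getImage()) {
            $formMapper->add('image', 'file', array('required' => false));
        }
    }
}
```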

  • Devid

    Thanks for the reply. Yes, it is just showing the field to upload a new image again in the edit section… and I want: if the admin wants to, they upload a new image; otherwise the old image is kept.

  • Kyaw Zin

    Hi, thanks for the nice post.
    By the way, I would like to ask you:
    how do I change the public URL of my bucket from ‘http’ to ‘https’?



  • If you set your URI prefix to be over https you should be all set.

  • Kyaw Zin

    This is a code sample:

    uri_prefix: /home/banner
    upload_destination: home_banner_image_fs
    delete_on_remove: true
    delete_on_update: true
    inject_on_load: true
    namer: vich_uploader.namer_uniqid

    What should I do with the URI prefix?

  • Your uri_prefix should be something like https://mybuck.s3-ap-northern…. not /home/banner. Vich basically prepends the URI prefix to whatever filename is in your database to build the public URL.
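    In other words, something like this (a sketch – the bucket hostname and mapping name are made up, keep your own):

```yaml
vich_uploader:
    mappings:
        home_banner:
            # prepended to the stored filename to build the public https URL
            uri_prefix: https://my-bucket.s3.amazonaws.com/home/banner
            upload_destination: home_banner_image_fs
```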

  • Sidd Dev

    Hello, thanks for the article, it’s amazing :)

    I have a problem: image upload works perfectly, but when I want to update the image nothing happens… Can you help me fix it so that when I update, the image is updated on Amazon S3?

  • You might be getting tripped up by this bug: on the “update”, the entity needs to have something changed apart from the file upload property in order for the Doctrine listeners to run. We usually update an “updatedAt” time manually in order to do this. Good luck!

  • Supun Madushanka

    How do I configure AppKernel.php?
    I have this error:
    PHP Fatal error: Uncaught Symfony\Component\Debug\Exception\FatalThrowableError: Type error: Argument 1 passed to Gaufrette\Adapter\AmazonS3::__construct() must be an instance of AmazonS3, none given, called in /home/supun/PhpstormProjects/SiploELearning/app/AppKernel.php on line 48 in /home/supun/PhpstormProjects/SiploELearning/vendor/knplabs/gaufrette/src/Gaufrette/Adapter/AmazonS3.php:25
    Stack trace:
    #0 /home/supun/PhpstormProjects/SiploELearning/app/AppKernel.php(48): Gaufrette\Adapter\AmazonS3->__construct()
    #1 /home/supun/PhpstormProjects/SiploELearning/app/bootstrap.php.cache(2621): AppKernel->registerBundles()
    #2 /home/supun/PhpstormProjects/SiploELearning/app/bootstrap.php.cache(2450): Symfony\Component\HttpKernel\Kernel->initializeBundles()
    #3 /home/supun/PhpstormProjects/SiploELearning/vendor/symfony/symfony/src/Symfony/Bundle/FrameworkBundle/Console/Application.php(70): Symfony\Component\HttpKernel\Kernel->boot()
    #4 /home/supun/PhpstormProjects/SiploELearning/vendor/symfony/symfony/src/Symfony/Component/Console/Ap in /home/supun/PhpstormProjects/SiploELearning/vendor/knplabs/gaufrette/src/Gaufrette/Adapter/AmazonS3.php on line 25

  • I think you have a problem in your Symfony services configuration. Your AWS S3 client must not be configured properly.

  • Supun Madushanka

    thank you very much

  • Supun Madushanka

    No credentials were provided. The SDK attempts to retrieve Instance Profile credentials from the EC2 Instance Metadata Service, but doing this requires the “default_cache_config” option to be set in the config file or constructor in order to cache the retrieved credentials.

  • Supun Madushanka

    Is this configuration enough to upload a file to an Amazon bucket using Vich upload?

  • Supun Madushanka

    I get an error like:

    Could not write the “11246017_952948928145692_3919764058477068346_n.jpg” key content.

  • You might want to post on StackOverflow for some more help

  • Supun Madushanka

    thank you