Bitbucket Pipelines not finding archive

I am using Bitbucket Pipelines to build a jar file and deploy it to a Spring Boot server.

  1. The first step is the default pipeline, which automatically builds
     the jar file (pow-wow-0.0.2-SNAPSHOT.jar); this succeeds.

  2. The second step is a custom pipeline that copies the jar file
     (pow-wow-0.0.2-SNAPSHOT.jar) to an AWS S3 bucket, but it fails
     because it cannot find the built jar file.


I have the following:

bitbucket-pipelines.yml

image: maven:3.9.0
pipelines:
  default:
    - parallel:
        - step:
            name: Build
            caches:
              - maven
            script:
              - echo 'Maven Build'
              - mvn clean install -Dmaven.test.skip
              - echo "$(date +%Y-%m-%d) $BITBUCKET_BUILD_NUMBER $BITBUCKET_COMMIT"
            artifacts:
              - target/pow-wow-0.0.2-SNAPSHOT.jar
              - appspec.yml
  custom:
    upload-and-deploy-to-uat:
        - step:
            name: Upload to UAT S3
            script:
              - echo "Upload to AWS S3 Bucket on UAT $(date +%Y-%m-%d) (powwow-$BITBUCKET_BUILD_NUMBER-$BITBUCKET_COMMIT)"
              - pipe: atlassian/aws-code-deploy:1.2.0
                variables:
                  AWS_ACCESS_KEY_ID: $UAT_AWS_ACC_KEY
                  AWS_SECRET_ACCESS_KEY: $UAT_AWS_SEC_ACC_KEY
                  AWS_DEFAULT_REGION: $UAT_AWS_DEFAULT_REGION
                  COMMAND: 'upload'
                  ZIP_FILE: 'target/pow-wow-0.0.2-SNAPSHOT.zip'
                  APPLICATION_NAME: 'pow-wow'
                  S3_BUCKET: ${UAT_S3_BUCKET}
                  VERSION_LABEL: "powwow-${BITBUCKET_BUILD_NUMBER}-${BITBUCKET_COMMIT}"
        - step:
            name: Deploy to the UAT Server
            script:
              - echo "Deploy with AWS CodeDeploy to UAT Application Server $(date +%Y-%m-%d) (powwow-$BITBUCKET_BUILD_NUMBER-$BITBUCKET_COMMIT)"
              - pipe: atlassian/aws-code-deploy:1.2.0
                variables:
                  AWS_ACCESS_KEY_ID: $UAT_AWS_ACC_KEY
                  AWS_SECRET_ACCESS_KEY: $UAT_AWS_SEC_ACC_KEY
                  AWS_DEFAULT_REGION: $UAT_AWS_DEFAULT_REGION
                  COMMAND: 'deploy'
                  APPLICATION_NAME: 'pow-wow'
                  DEPLOYMENT_GROUP: 'deployment-group'
                  IGNORE_APPLICATION_STOP_FAILURES: 'true'
                  FILE_EXISTS_BEHAVIOR: 'OVERWRITE'
                  WAIT: 'true'
                  S3_BUCKET: ${UAT_S3_BUCKET}
                  VERSION_LABEL: 'powwow-${BITBUCKET_BUILD_NUMBER}-${BITBUCKET_COMMIT}'

Output (with error):

INFO: Uploading target/pow-wow-0.0.2-SNAPSHOT.zip to S3.
Traceback (most recent call last):
  File "/pipe.py", line 264, in <module>
    pipe.run()
  File "/pipe.py", line 254, in run
    self.upload_to_s3()
  File "/pipe.py", line 230, in upload_to_s3
    with open(self.get_variable('ZIP_FILE'), 'rb') as zip_file:
FileNotFoundError: [Errno 2] No such file or directory: 'target/pow-wow-0.0.2-SNAPSHOT.zip'

P.S. I have tried naming it .jar too, and I get the same error:

ZIP_FILE: 'target/pow-wow-0.0.2-SNAPSHOT.jar'
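One quick way to confirm the artifact genuinely isn't present in the custom pipeline run is to list the target directory at the start of the upload step (a diagnostic addition, not part of the original pipeline):

```yaml
- step:
    name: Upload to UAT S3
    script:
      # Diagnostic only: show what this step's workspace actually contains.
      # When run as a separate custom pipeline, target/ will be missing,
      # because artifacts from the default pipeline run are not carried over.
      - ls -la target/ || echo "target/ does not exist in this run"
      # ...original upload script and pipe follow here...
```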

appspec.yml

version: 0.0
os: linux
files:
  - source: target/pow-wow-0.0.2-SNAPSHOT.jar
    destination: /home/jboss/bitbucket/pow-wow/pow-wow
hooks:
  BeforeInstall:
    - location: scripts/powwow-deploy.sh
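The appspec.yml references a BeforeInstall hook at scripts/powwow-deploy.sh, which is not shown in the question. A minimal hypothetical sketch of such a hook (every path, file name, and command here is an assumption, not taken from the original) might stop a previously deployed instance so CodeDeploy can overwrite the jar:

```shell
#!/bin/sh
# Hypothetical scripts/powwow-deploy.sh BeforeInstall hook.
# Assumption: the running app records its PID in a file under the
# deploy destination from appspec.yml.
APP_DIR=/home/jboss/bitbucket/pow-wow/pow-wow
PID_FILE="$APP_DIR/powwow.pid"

if [ -f "$PID_FILE" ]; then
    # Stop the previous instance; ignore failures if it already exited.
    kill "$(cat "$PID_FILE")" || true
    rm -f "$PID_FILE"
fi
echo "BeforeInstall: previous pow-wow instance stopped"
```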

Solution:

The error is correct: there is no such file. In your pipeline definition, the default pipeline is what creates the file, but an artifact is only kept within that pipeline run. Because your upload steps are in a custom pipeline, which necessarily executes as a different pipeline run, they have no access to the file.

To fix this, either move the upload steps into the default pipeline or add the Build step to your custom pipeline. Note that in the former case the upload steps must not be placed inside the parallel block (which makes little sense with the current setup anyway, since Build is its only step), or the artifact won't be available to them.
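One way to apply the second option without duplicating the step definition is a YAML anchor, which Bitbucket Pipelines supports under a definitions section. This is a sketch reusing the step names, artifacts, and variables from the question; the upload/deploy step bodies are elided for brevity:

```yaml
image: maven:3.9.0

definitions:
  steps:
    - step: &build
        name: Build
        caches:
          - maven
        script:
          - mvn clean install -Dmaven.test.skip
        artifacts:
          - target/pow-wow-0.0.2-SNAPSHOT.jar
          - appspec.yml

pipelines:
  default:
    - step: *build
  custom:
    upload-and-deploy-to-uat:
      # Rebuild inside this run so the artifact exists for the later steps.
      - step: *build
      - step:
          name: Upload to UAT S3
          script:
            # ...same upload script and aws-code-deploy pipe as before...
            - echo "Upload to AWS S3 Bucket on UAT"
      - step:
          name: Deploy to the UAT Server
          script:
            # ...same deploy script and aws-code-deploy pipe as before...
            - echo "Deploy with AWS CodeDeploy to UAT"
```

Artifacts declared by the Build step are then available to every subsequent step in the same pipeline run, which is exactly what the upload step needs.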
