When working on a larger project built with Apify Actors, you might want to set up automatic tests and builds for your Actors. This guide shows how to set up such continuous integration using Bitbucket Pipelines. It assumes that you are familiar with Git and the Apify command-line interface (CLI); if you are not, check their documentation first.
First, you'll need a Bitbucket account. On the free account, you'll get an unlimited number of Git repositories and 50 free minutes for running your tests and builds on Bitbucket Pipelines. Once you have the account, create a new Git repository, as described in this guide.
In this tutorial, we will use the basic Actor template generated by the Apify CLI. Let's create a new Actor by running the following command in a terminal and choosing the "Basic" template:
apify create my-actor
? Select template for the Actor (Use arrow keys)
❯ Basic ("basic")
Puppeteer ("puppeteer")
Puppeteer crawler ("puppeteer_crawler")
Plain request url list crawler ("plain_request_urls_list")
Run: npm install --no-optional
npm notice created a lockfile as package-lock.json. You should commit this file.
added 76 packages in 9.501s
Success: Actor 'my-actor' was created. To run it, run "cd my-actor" and "apify run".
The command creates a directory called my-actor with several pre-created files and directories that are useful for local Actor development. You can now link your local Actor to the Apify cloud using the apify push command. Note that you also need to be logged in using apify login.
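For example, from the directory where you ran apify create, the whole sequence looks like this:
cd my-actor
apify login
apify push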
Now that the Actor is ready, it's time to commit its source code to the Git repository. Let's initialize the repository, commit the files, add the remote origin, and push everything to Bitbucket using the following commands:
git init
git add .
git commit -m "Initial commit"
git remote add origin https://drobnik@bitbucket.org/drobnik/my-actor-repo.git
git push -u origin master
Now, let's enable a Bitbucket Pipeline for the repository by adding a file called bitbucket-pipelines.yml to the root of your Git repository. You can start with a file like this one:
image: node:8.4.0 # Docker image used for testing and building
pipelines:
  default:
    - step:
        script:
          - npm install apify-cli -g
After you commit and push the file to the Git repository, you should see your pipeline start in the Bitbucket user interface:
If everything works, we can move on to setting up automatic builds of the Actor. In order to enable Bitbucket Pipelines to access your Apify account, you'll need to obtain your Apify API token (available on the Account page) and then set it as the APIFY_TOKEN environment variable in Bitbucket Pipelines (in Repository -> Settings -> Environment variables). Make sure to mark the environment variable as Secured!
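Before adding the token to the pipeline, you can optionally sanity-check it from your local machine by calling the Apify API user endpoint with it (replace the placeholder with your actual token):
curl "https://api.apify.com/v2/users/me?token=YOUR_APIFY_TOKEN"
If the token is valid, the response contains your account details as JSON.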
Now that the Apify API token is passed to the pipeline process, we can use it to authenticate the pipeline to access your Apify account. Simply update the bitbucket-pipelines.yml file as follows:
image: node:8.4.0 # Docker image used for testing and building
pipelines:
  default:
    - step:
        script:
          - npm install apify-cli -g
          - apify login --token $APIFY_TOKEN
After committing and pushing this change, verify that the pipeline can log in to Apify:
Now, let's split the Actor into two versions, latest and beta: one will be used for production and one for development. Every commit to the master branch will be built as the latest version, while every commit to develop will be built as the beta version. This can be achieved with the following bitbucket-pipelines.yml configuration:
image: node:8.4.0 # Docker image used for testing and building
pipelines:
  branches:
    master: # This runs on every commit to master
      - step:
          caches:
            - node
          script:
            - npm install apify-cli -g
            - apify login --token $APIFY_TOKEN
            - apify push --build-tag latest
    develop: # This runs on every commit to develop
      - step:
          caches:
            - node
          script:
            - npm install apify-cli -g
            - apify login --token $APIFY_TOKEN
            - apify push --build-tag beta
After committing and pushing the changes to Git, you should see that your Actor is being built and pushed to Apify. Just go to https://my.apify.com/acts, select the Actor, and open the Builds tab:
With this setup, every commit to the master or develop branch will rebuild the Apify Actor and apply the right build tag.
Now let's create a test for the Actor using the Mocha test framework:
npm install mocha --save-dev
Add a simple test by creating a file called sample_test.js inside the test directory. Let's start with an extremely simple test that should always pass:
const assert = require('assert');

describe('My Sample Test', function () {
    it('should work', function () {
        assert.equal(true, true);
    });
});
Enable the unit test by updating the package.json file:
{
    ...
    "scripts": {
        "test": "mocha"
    },
    ...
}
Now we can run the tests in the Actor directory using the npm test command. The output of this command should look like this:
My Sample Test
    ✓ should work

1 passing (13ms)
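The trivial test above only verifies that the test runner works. As a next step, you could add a test that checks something about the Actor itself, for example that its main source file and package.json are present (a minimal sketch that assumes the default file layout of the Basic template):
const assert = require('assert');
const fs = require('fs');
const path = require('path');

describe('Actor files', function () {
    it('should contain main.js and package.json', function () {
        // These file names assume the layout of the Basic Actor template.
        assert.ok(fs.existsSync(path.join(__dirname, '..', 'main.js')));
        assert.ok(fs.existsSync(path.join(__dirname, '..', 'package.json')));
    });
});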
In the last step, let's add the unit tests to the Bitbucket Pipeline. We'll update the bitbucket-pipelines.yml configuration file once more:
image: node:8.4.0 # Docker image used for testing and building
pipelines:
  branches:
    master: # This runs on every commit to master
      - step:
          caches:
            - node
          script:
            - npm install
            - npm test
            - npm install apify-cli -g
            - apify login --token $APIFY_TOKEN
            - apify push --build-tag latest
    develop: # This runs on every commit to develop
      - step:
          caches:
            - node
          script:
            - npm install
            - npm test
            - npm install apify-cli -g
            - apify login --token $APIFY_TOKEN
            - apify push --build-tag beta
Make sure to add the npm install and npm test commands to the beginning of each step's script, so that the unit tests are executed before the Actor is pushed to Apify. In our example, the unit tests should pass:
And that's it, you've just set up continuous integration for an Apify Actor! You can check the Bitbucket repository with the files from this tutorial. If you're missing anything, just let us know on chat.