{"__v":2,"_id":"586d4ef5efae8d0f002cf56f","category":{"project":"55c6bec1b9aa4e0d0016c2c3","version":"55c6bec1b9aa4e0d0016c2c6","_id":"56f97e9a4c612020008f2eaf","__v":0,"sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-03-28T18:57:30.798Z","from_sync":false,"order":3,"slug":"migrations","title":"Migrations"},"parentDoc":null,"project":"55c6bec1b9aa4e0d0016c2c3","user":"56f1c8c95ebd6d20000e2982","version":{"__v":8,"_id":"55c6bec1b9aa4e0d0016c2c6","project":"55c6bec1b9aa4e0d0016c2c3","createdAt":"2015-08-09T02:45:21.683Z","releaseDate":"2015-08-09T02:45:21.683Z","categories":["55c6bec2b9aa4e0d0016c2c7","56c14bc5826df10d00e82230","56cceed8723ad71d00cae46c","56ccf29a431ada1f00e85aae","56ccf3c28fa8b01b00b82018","56ce1e6ee538330b0021ac5d","56f97e9a4c612020008f2eaf","5734fafd146eb82000597261"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"1.0.0","version":"1.0"},"updates":["589e54f14e78cf0f00d0c838","58a7a3862171a00f00a22f5d"],"next":{"pages":[],"description":""},"createdAt":"2017-01-04T19:37:25.526Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"settings":"","results":{"codes":[]},"auth":"required","params":[],"url":""},"isReference":false,"order":1,"body":"Cassandra is no longer an actively maintained or supported persistence store. \n\nA migration to S3, GCS, or AZS is **recommended**.\n\n## 1. Create a Bucket and Folder\n\nMake up a bucket name (or account name if using Azure) that is consistent with the naming policies for the underlying storage service. For purposes of this document, we'll pick ${USER}-spinnaker since many storage services (including Amazon S3, Google GCS, and Azure AZS) require globally unique names.\n\nMake up a folder name within the bucket. The default is \"front50\". Spinnaker will store all the objects within this folder. The folder name will be used for configuration and managed by Spinnaker. 
You do not need to physically create the folder.\n\nNote that currently only Amazon Simple Storage Service (S3), Google Cloud Storage (GCS), or Azure Storage (AZS) are supported, and they are mutually exclusive. Pick only one. This is independent of where you are actually running Spinnaker.\n\n### A. Create the Bucket in S3\n\nIf you wish to use S3 as the storage service, create the bucket and underlying root folder.\n\nSee [Amazon's Documentation](http://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html) for more information. \n\nTo enable versioning on an existing bucket, follow [these](http://docs.aws.amazon.com/AmazonS3/latest/UG/enable-bucket-versioning.html) steps.\n\n### B. Create the Bucket in GCS\n\nIf you wish to use GCS as the storage service, Spinnaker can automatically create the bucket (with versioning) for you if it has the right OAuth scopes (Storage Admin). To create it yourself you are best off using the [gsutil tool](https://cloud.google.com/storage/docs/gsutil). Using `gsutil`, also [turn on versioning](https://cloud.google.com/storage/docs/object-versioning) within the bucket.\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"gsutil mb ${USER}-spinnaker\\ngsutil versioning set on ${USER}-spinnaker\",\n      \"language\": \"text\",\n      \"name\": \"Example\"\n    }\n  ]\n}\n[/block]\n### C. Create the Storage Account in AZS\n\nIf you wish to use Azure Storage, Spinnaker can automatically create the container and root folder (with versioning). Follow instructions [here](https://docs.microsoft.com/azure/storage/storage-create-storage-account#create-a-storage-account) to create a storage account and make sure to note the storage account name and a key.\n\n## 2. Disable Cassandra in spinnaker-local.yml\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"services:\\n  front50:\\n    cassandra:\\n      enabled: false\",\n      \"language\": \"yaml\"\n    }\n  ]\n}\n[/block]\n## 3. 
Enable the object store in spinnaker-local.yml\n\n### A. Enable S3\n\nIf you wish to use S3 as the storage service, set the following:\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"services:\\n  front50:\\n    storage_bucket: YOUR_S3_BUCKET_NAME (From Step #1)\\n    bucket_root: YOUR_S3_FOLDER_NAME (From Step #1)\\n    s3:\\n      enabled: true\",\n      \"language\": \"yaml\"\n    }\n  ]\n}\n[/block]\n### B. Enable GCS\n\nIf you wish to use GCS as the storage service, set the following YAML properties.\nThe default project name is the `${providers.google.primaryCredentials.project}` from `spinnaker-local.yml`. It is needed only if the bucket does not exist and Spinnaker is going to create the bucket for you. The `jsonPath` is used to convey the credentials to use in order to access GCS. The default is `${providers.google.primaryCredentials.jsonPath}` from `spinnaker-local.yml` however if you created the bucket in a different project (e.g. in the project that is running Spinnaker as opposed to the project that Spinnaker may be managing) then you may need to supply different credentials.\n\nFor more information about credentials and obtaining a JSON credentials file, see the [discussion on Service Accounts](https://support.google.com/cloud/answer/6158849?hl=en) \n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"services:\\n  front50:\\n    storage_bucket: YOUR_GCS_BUCKET_NAME (From Step #1)\\n    bucket_root: YOUR_GCS_FOLDER_NAME (From Step #1)\\n    gcs:\\n      enabled: true\\n      project: SEE NOTE\\n      jsonPath: SEE NOTE\",\n      \"language\": \"yaml\"\n    }\n  ]\n}\n[/block]\n### C. 
Enable AZS\n\nIf you wish to use Azure Storage, set the following:\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"services:\\n  front50:\\n    azs:\\n      enabled: true\\n      storageAccountName: YOUR_STORAGE_ACCOUNT_NAME (From Step #1)\\n      storageAccountKey: YOUR_STORAGE_ACCOUNT_KEY (From Step #1)\\n      storageContainerName: spinnaker\\n      rootFolder: front50\",\n      \"language\": \"yaml\"\n    }\n  ]\n}\n[/block]\n## 4. Export Existing Applications, Pipelines, Strategies, Notifications and Projects\n\nReplace \"FRONT50_HOSTNAME\" and \"FRONT50_PORT\" and run the following (\"localhost\" and \"8080\" are the defaults):\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"#!/bin/sh\\n\\nrm applications.json\\ncurl http://FRONT50_HOSTNAME:FRONT50_PORT/v2/applications | json_pp > applications.json\\n\\nrm pipelines.json\\ncurl http://FRONT50_HOSTNAME:FRONT50_PORT/pipelines | json_pp > pipelines.json\\n\\nrm strategies.json\\ncurl http://FRONT50_HOSTNAME:FRONT50_PORT/strategies | json_pp > strategies.json\\n\\nrm notifications.json\\ncurl http://FRONT50_HOSTNAME:FRONT50_PORT/notifications | json_pp > notifications.json\\n\\nrm projects.json\\ncurl http://FRONT50_HOSTNAME:FRONT50_PORT/v2/projects | json_pp > projects.json\",\n      \"language\": \"shell\"\n    }\n  ]\n}\n[/block]\n## 5. Deploy new Front50\n\nRun `sudo restart front50`\n\n## 6. 
Import Applications, Pipelines, Strategies, Notifications and Projects\n\nReplace \"FRONT50_HOSTNAME\" and \"FRONT50_PORT\" and run the following (\"localhost\" and \"8080\" are the defaults):\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"#!/bin/sh\\n\\ncurl -X POST -H \\\"Content-type: application/json\\\" --data-binary :::at:::\\\"notifications.json\\\" http://FRONT50_HOSTNAME:FRONT50_PORT/notifications/batchUpdate\\ncurl -X POST -H \\\"Content-type: application/json\\\" --data-binary @\\\"strategies.json\\\" http://FRONT50_HOSTNAME:FRONT50_PORT/strategies/batchUpdate\\ncurl -X POST -H \\\"Content-type: application/json\\\" --data-binary @\\\"pipelines.json\\\" http://FRONT50_HOSTNAME:FRONT50_PORT/pipelines/batchUpdate\\ncurl -X POST -H \\\"Content-type: application/json\\\" --data-binary @\\\"applications.json\\\" http://FRONT50_HOSTNAME:FRONT50_PORT/v2/applications/batch/applications\\ncurl -X POST -H \\\"Content-type: application/json\\\" --data-binary @\\\"projects.json\\\" http://FRONT50_HOSTNAME:FRONT50_PORT/v2/projects/batchUpdate\",\n      \"language\": \"shell\"\n    }\n  ]\n}\n[/block]","excerpt":"","slug":"front50-cassandra-to-object-store","type":"basic","title":"Front50: Cassandra to Object Store (S3, Azure, or GCS)"}

# Front50: Cassandra to Object Store (S3, Azure, or GCS)


Cassandra is no longer an actively maintained or supported persistence store. Migrating to S3, GCS, or AZS is **recommended**.

## 1. Create a Bucket and Folder

Choose a bucket name (or a storage account name, if using Azure) that conforms to the naming policies of the underlying storage service. For the purposes of this document, we'll use `${USER}-spinnaker`, since many storage services (including Amazon S3, Google GCS, and Azure AZS) require globally unique names.

Choose a folder name within the bucket. The default is "front50". Spinnaker stores all of its objects within this folder; the folder name is used for configuration and managed by Spinnaker, so you do not need to create the folder yourself.

Note that currently only Amazon Simple Storage Service (S3), Google Cloud Storage (GCS), and Azure Storage (AZS) are supported, and they are mutually exclusive: pick only one. This choice is independent of where you actually run Spinnaker.

### A. Create the Bucket in S3

If you wish to use S3 as the storage service, create the bucket and the underlying root folder.

See [Amazon's documentation](http://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html) for more information. To enable versioning on an existing bucket, follow [these steps](http://docs.aws.amazon.com/AmazonS3/latest/UG/enable-bucket-versioning.html).

### B. Create the Bucket in GCS

If you wish to use GCS as the storage service, Spinnaker can create the bucket (with versioning) for you automatically, provided it has the right OAuth scopes (Storage Admin). To create the bucket yourself, use the [gsutil tool](https://cloud.google.com/storage/docs/gsutil), and also [turn on versioning](https://cloud.google.com/storage/docs/object-versioning) within the bucket:

```shell
gsutil mb gs://${USER}-spinnaker
gsutil versioning set on gs://${USER}-spinnaker
```

### C. Create the Storage Account in AZS

If you wish to use Azure Storage, Spinnaker can create the container and root folder (with versioning) automatically. Follow the instructions [here](https://docs.microsoft.com/azure/storage/storage-create-storage-account#create-a-storage-account) to create a storage account, and make a note of the storage account name and one of its keys.

## 2. Disable Cassandra in spinnaker-local.yml

```yaml
services:
  front50:
    cassandra:
      enabled: false
```

## 3. Enable the Object Store in spinnaker-local.yml

### A. Enable S3

If you wish to use S3 as the storage service, set the following:

```yaml
services:
  front50:
    storage_bucket: YOUR_S3_BUCKET_NAME # from step 1
    bucket_root: YOUR_S3_FOLDER_NAME    # from step 1
    s3:
      enabled: true
```

### B. Enable GCS

If you wish to use GCS as the storage service, set the YAML properties shown below.

The default project name is `${providers.google.primaryCredentials.project}` from `spinnaker-local.yml`; it is needed only if the bucket does not exist and Spinnaker is going to create it for you. The `jsonPath` conveys the credentials to use when accessing GCS. It defaults to `${providers.google.primaryCredentials.jsonPath}` from `spinnaker-local.yml`; however, if you created the bucket in a different project (e.g. the project running Spinnaker, as opposed to a project Spinnaker manages), you may need to supply different credentials.
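If you want a dedicated credentials file for front50, one way to mint it is with the `gcloud` and `gsutil` CLIs. This is only a sketch, not part of the official migration steps: the service-account name `front50-sa`, the project `my-project`, and the key path `/opt/spinnaker/credentials/gcs.json` are all placeholders to replace with your own values.

```shell
# Sketch only: create a service account, grant it access to the bucket,
# and download a JSON key for front50's jsonPath. The names below
# (front50-sa, my-project, the key path) are placeholders.
gcloud iam service-accounts create front50-sa \
    --project my-project \
    --display-name "Spinnaker front50"

# Storage Admin on the bucket covers objects and versioning; if you want
# Spinnaker to create the bucket itself, grant the role on the project
# instead of on the bucket.
gsutil iam ch \
    "serviceAccount:front50-sa@my-project.iam.gserviceaccount.com:roles/storage.admin" \
    gs://${USER}-spinnaker

# The file written here is what jsonPath should point at.
gcloud iam service-accounts keys create /opt/spinnaker/credentials/gcs.json \
    --iam-account front50-sa@my-project.iam.gserviceaccount.com
```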
For more information about credentials and obtaining a JSON credentials file, see the [discussion on service accounts](https://support.google.com/cloud/answer/6158849?hl=en).

```yaml
services:
  front50:
    storage_bucket: YOUR_GCS_BUCKET_NAME # from step 1
    bucket_root: YOUR_GCS_FOLDER_NAME    # from step 1
    gcs:
      enabled: true
      project: SEE NOTE ABOVE
      jsonPath: SEE NOTE ABOVE
```

### C. Enable AZS

If you wish to use Azure Storage, set the following:

```yaml
services:
  front50:
    azs:
      enabled: true
      storageAccountName: YOUR_STORAGE_ACCOUNT_NAME # from step 1
      storageAccountKey: YOUR_STORAGE_ACCOUNT_KEY   # from step 1
      storageContainerName: spinnaker
      rootFolder: front50
```

## 4. Export Existing Applications, Pipelines, Strategies, Notifications, and Projects

Replace `FRONT50_HOSTNAME` and `FRONT50_PORT` below and run the following (`localhost` and `8080` are the defaults):

```shell
#!/bin/sh

rm -f applications.json
curl http://FRONT50_HOSTNAME:FRONT50_PORT/v2/applications | json_pp > applications.json

rm -f pipelines.json
curl http://FRONT50_HOSTNAME:FRONT50_PORT/pipelines | json_pp > pipelines.json

rm -f strategies.json
curl http://FRONT50_HOSTNAME:FRONT50_PORT/strategies | json_pp > strategies.json

rm -f notifications.json
curl http://FRONT50_HOSTNAME:FRONT50_PORT/notifications | json_pp > notifications.json

rm -f projects.json
curl http://FRONT50_HOSTNAME:FRONT50_PORT/v2/projects | json_pp > projects.json
```

## 5. Deploy the New Front50

Run `sudo restart front50`.

## 6. Import Applications, Pipelines, Strategies, Notifications, and Projects

Replace `FRONT50_HOSTNAME` and `FRONT50_PORT` below and run the following (`localhost` and `8080` are the defaults):

```shell
#!/bin/sh

curl -X POST -H "Content-type: application/json" --data-binary @"notifications.json" http://FRONT50_HOSTNAME:FRONT50_PORT/notifications/batchUpdate
curl -X POST -H "Content-type: application/json" --data-binary @"strategies.json" http://FRONT50_HOSTNAME:FRONT50_PORT/strategies/batchUpdate
curl -X POST -H "Content-type: application/json" --data-binary @"pipelines.json" http://FRONT50_HOSTNAME:FRONT50_PORT/pipelines/batchUpdate
curl -X POST -H "Content-type: application/json" --data-binary @"applications.json" http://FRONT50_HOSTNAME:FRONT50_PORT/v2/applications/batch/applications
curl -X POST -H "Content-type: application/json" --data-binary @"projects.json" http://FRONT50_HOSTNAME:FRONT50_PORT/v2/projects/batchUpdate
```
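Before running the import, it can help to confirm that each exported file is valid, non-empty JSON, so an empty or truncated export is caught before it is POSTed back to front50. This is a hypothetical sanity check, not part of the official migration; the `check_export` helper is our own, and it assumes `python3` is available as a portable JSON parser.

```shell
#!/bin/sh

# Hypothetical helper: print the number of records in an exported file,
# or fail if the file is not a non-empty JSON array.
check_export() {
  python3 - "$1" <<'EOF'
import json, sys
try:
    with open(sys.argv[1]) as f:
        data = json.load(f)
except ValueError:
    sys.exit(1)          # invalid JSON
if not isinstance(data, list) or not data:
    sys.exit(1)          # not an array, or empty
print(len(data))
EOF
}

# Check every file produced by the export step.
for f in applications pipelines strategies notifications projects; do
  if [ ! -f "$f.json" ]; then
    echo "$f.json: not found (run the export step first)"
  elif count=$(check_export "$f.json"); then
    echo "$f.json: $count records"
  else
    echo "$f.json: empty or invalid JSON" >&2
  fi
done
```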