Diffstat (limited to 'roles')
29 files changed, 1924 insertions, 0 deletions
diff --git a/roles/LICENSE b/roles/LICENSE new file mode 100644 index 0000000..3be62f8 --- /dev/null +++ b/roles/LICENSE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. 
Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "{}" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright 2018 Orange-OpenSource / lfn / ci_cd + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. 
+ You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. diff --git a/roles/README.md b/roles/README.md new file mode 100644 index 0000000..9002afc --- /dev/null +++ b/roles/README.md @@ -0,0 +1,15 @@ +Chained-CI-roles +========== + +Role +---- +Chained-CI is a way to run a set of projects, each one as a job in a top-level +pipeline. + +This project, running on a GitLab CE instance, triggers the configured projects +one after the other, or in parallel, sharing configuration through artifacts. +This allows integrating projects managed by third parties, or running +independent projects together. + +___This project hosts the roles needed to run the pipelines. The running +project hosting the pipelines and the inventory is not yet public___ diff --git a/roles/artifact_init/defaults/main.yaml b/roles/artifact_init/defaults/main.yaml new file mode 100644 index 0000000..c1ccbb9 --- /dev/null +++ b/roles/artifact_init/defaults/main.yaml @@ -0,0 +1,2 @@ +--- +step: "{{ lookup('env', 'CONFIG_NAME') | default('config', true )}}" diff --git a/roles/artifact_init/filter_plugins/filters.py b/roles/artifact_init/filter_plugins/filters.py new file mode 100644 index 0000000..db38fc6 --- /dev/null +++ b/roles/artifact_init/filter_plugins/filters.py @@ -0,0 +1,8 @@ +#!/usr/bin/env python3 + +import os +import sys + +sys.path.append(os.path.normpath(os.path.join(os.path.dirname(__file__),'../../'))) + +from library.filepath import FilterModule diff --git a/roles/artifact_init/tasks/main.yml b/roles/artifact_init/tasks/main.yml new file mode 100644 index 0000000..e4e4fb6 --- /dev/null +++ b/roles/artifact_init/tasks/main.yml @@ -0,0 +1,180 @@ +--- +## +# Warn if log level is high +## +- name: Warn if log level is high + debug: + msg: "{{ msg.split('\n') }}" + verbosity: 3 + vars: + msg: | + !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! + !! Log level is HIGH ! !! + !! Some sensitive data may be visible to everyone. !! + !! Don't forget to clean the task output ! !! + !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
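##
# Note (editorial, illustrative values only): the tasks below pull the
# configuration produced by a previous job. `artifacts_src`, when defined
# (it is set in roles/get_artifacts/tasks/url.yml), is a GitLab job-artifacts
# URL of the form
#   <gitlab.api_url>/projects/<CI_PROJECT_ID>/jobs/<job_id>/artifacts
# while `artifacts_bin` carries the same archive inline, base64-encoded by
# the `slurp` task in roles/get_artifacts/tasks/binary.yml.
##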
+ +## +# get the config +## + +- name: get artifact_src if we refer to a previous one + when: artifacts_src is defined + uri: + url: "{{ artifacts_src }}" + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + dest: "{{ playbook_dir }}/artifacts.zip" + +- name: unzip get_artifact archive + when: artifacts_src is defined or artifacts_bin is defined + unarchive: + src: "{{ playbook_dir }}/artifacts.zip" + dest: "{{ playbook_dir }}" + remote_src: "yes" + +- name: delete archive + file: + path: "{{ playbook_dir }}/artifacts.zip" + state: absent + +- name: create artifacts folders + file: + path: "{{ item }}" + state: directory + mode: 0775 + when: item[-1] == '/' + with_items: "{{ vars[lookup( 'env', 'CI_JOB_NAME')].artifacts.paths }}" + +- name: ensure configs can be written + file: + path: "{{ playbook_dir }}/{{ item }}" + mode: 0660 + ignore_errors: true + with_items: + - vars/pdf.yml + - vars/idf.yml + - vars/certificates.yml + - vars/vaulted_ssh_credentials.yml + - vars/ssh_gateways.yml + +- name: get the infra config name + set_fact: + infra_config: "{{ config.infra | default(inventory_hostname) }}" + +- name: get the infra PDF/IDF + when: infra_config != 'NONE' + block: + - name: get PDF configs + uri: + url: >- + {{ config.api }}/repository/files/{{ + [config.path | default(''), 'config'] | + filepath(infra_config, '.yaml') }}?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: pdf_get + + - name: save PDF config + copy: + content: "{{ pdf_get.json.content | b64decode }}" + dest: "{{ playbook_dir }}/vars/pdf.yml" + force: true + mode: 0660 + decrypt: false + + - name: get IDF configs + uri: + url: >- + {{ config.api }}/repository/files/{{ [config.path | default(''), + 'config'] | filepath('idf-', infra_config, '.yaml') + }}?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: idf_get + + - name: save IDF config + copy: + content: "{{ idf_get.json.content | b64decode }}" + dest: "{{ playbook_dir }}/vars/idf.yml" + force: true + mode: 0660 + decrypt: false + +- name: get certificate + uri: + url: >- + {{ config.api }}/repository/files/{{ + [config.path | default(''), 'certificats'] + | filepath(config.certificates) }}?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: certs_get + when: config.certificates is defined + +- name: save certificate + copy: + content: "{{ certs_get.json.content | b64decode }}" + dest: "{{ playbook_dir }}/vars/certificates.yml" + force: true + mode: 0660 + decrypt: false + when: config.certificates is defined + +- name: get ssh credentials + uri: + url: >- + {{ config.api }}/repository/files/{{ + [config.path | default(''), 'ssh_creds'] | + filepath(config.ssh_creds | default(ansible_ssh_creds)) + }}?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: ssh_creds_get + when: config.ansible_ssh_creds is defined or ansible_ssh_creds is defined + +- name: save ssh credentials + copy: + content: "{{ ssh_creds_get.json.content | b64decode }}" + dest: "{{ playbook_dir }}/vars/vaulted_ssh_credentials.yml" + force: true + mode: 0660 + decrypt: false + when: config.ansible_ssh_creds is defined or ansible_ssh_creds is defined + +- name: set ssh gateways config + uri: + url: >- + {{ config.api }}/repository/files/{{ + [config.path | default(''), 
'config/ssh_gateways'] + | filepath(config.ssh_access) }}?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: ssh_gw_get + when: config.ssh_access is defined + +- name: save ssh gateways config + copy: + content: "{{ ssh_gw_get.json.content | b64decode }}" + dest: "{{ playbook_dir }}/vars/ssh_gateways.yml" + force: true + mode: 0660 + decrypt: false + when: config.ssh_access is defined + +- name: set basic inventory + copy: + dest: "{{ playbook_dir }}/inventory/inventory" + content: > + jumphost ansible_host={{ jumphost.server }} + ansible_user={{ jumphost.user }} pod={{ inventory_hostname }} diff --git a/roles/get_artifacts/defaults/main.yml b/roles/get_artifacts/defaults/main.yml new file mode 100644 index 0000000..112aa4a --- /dev/null +++ b/roles/get_artifacts/defaults/main.yml @@ -0,0 +1,7 @@ +--- +previous_artifacts_folder: "{{ playbook_dir }}/previous_artifacts" +final_artifacts_folder: "{{ playbook_dir }}/FINAL_ARTIFACT" + +job_id_fetch: + max_page: 100 + per_page: 100 diff --git a/roles/get_artifacts/filter_plugins/filters.py b/roles/get_artifacts/filter_plugins/filters.py new file mode 100644 index 0000000..db38fc6 --- /dev/null +++ b/roles/get_artifacts/filter_plugins/filters.py @@ -0,0 +1,8 @@ +#!/usr/bin/env python3 + +import os +import sys + +sys.path.append(os.path.normpath(os.path.join(os.path.dirname(__file__),'../../'))) + +from library.filepath import FilterModule diff --git a/roles/get_artifacts/tasks/binary.yml b/roles/get_artifacts/tasks/binary.yml new file mode 100644 index 0000000..99ba930 --- /dev/null +++ b/roles/get_artifacts/tasks/binary.yml @@ -0,0 +1,244 @@ +--- +## +# Handle different get_artifacts types +## +- name: value change for coherency + set_fact: + config: >- + {{ config|combine({'get_artifacts': [] }) }} + when: config.get_artifacts is not defined +- name: value change for coherency + set_fact: + config: >- + {{ config|combine({'get_artifacts': + [{ 'name': config.get_artifacts }] }) }} + when: config.get_artifacts is string + +- debug: + var: config + verbosity: 3 +## +# Prepare a folder for +## + +- name: set previous_artifacts_folder + file: + path: "{{ item }}" + state: directory + loop: + - "{{ previous_artifacts_folder }}" + - "{{ final_artifacts_folder }}" + +- name: create dest folders for the jobs artifacts + file: + path: "{{ previous_artifacts_folder }}/{{ item.name }}" + state: directory + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +## +# Get all artifacts job ids +## +- name: loop on get_artifacts + include_tasks: get_one_artifact.yml + vars: + artifact_job_name: "{{ item.name }}" + artifact_in_pipeline: "{{ item.in_pipeline | default(true) }}" + when: not (item.static_src | default(false)) + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ artifact_job_name }}" + +- name: download all job artifacts + uri: + url: >- + {{ gitlab.api_url }}/projects/{{ lookup('env', 'CI_PROJECT_ID') + }}/jobs/{{ artifact_job_ids[idx] }}/artifacts + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + dest: >- + {{ previous_artifacts_folder }}/{{ item.name }}/artifacts.zip + when: not (item.static_src | default(false)) + loop: "{{ config.get_artifacts }}" + loop_control: + index_var: idx + label: "{{ item.name }}" + +- name: download all static artifacts on public projects + uri: + url: >- + {{ config.url }}/raw/{{ config.branch }}/{{ + config.path | default('') }}/config/artifacts/{{ + item.name 
}}.zip?inline=false + status_code: 200 + dest: >- + {{ previous_artifacts_folder }}/{{ item.name }}/artifacts.zip + when: (item.static_src | default(false)) and (config.api is not defined) + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +- name: download all static artifacts using api + uri: + url: >- + {{ config.api }}/repository/files/{{ + [config.path | default('') , 'config/artifacts'] | + filepath(item.name, '.zip') + }}/raw?ref={{ config.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + dest: >- + {{ previous_artifacts_folder }}/{{ item.name }}/artifacts.zip + when: (item.static_src | default(false)) and (config.api is defined) + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +- name: unarchive all artifacts + unarchive: + src: "{{ previous_artifacts_folder }}/{{ item.name }}/artifacts.zip" + dest: "{{ previous_artifacts_folder }}/{{ item.name }}/" + remote_src: "yes" + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +- name: remove all artifacts archives + file: + path: "{{ previous_artifacts_folder }}/{{ item.name }}/artifacts.zip" + state: absent + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +- name: create artifacts folders + file: + path: "{{ final_artifacts_folder }}/{{ item }}" + state: directory + recurse: true + mode: 0775 + when: item[-1] == '/' + with_items: "{{ vars['.artifacts_root'].paths }}" + +- name: copy all files if no filters + copy: + decrypt: false + src: "{{ previous_artifacts_folder }}/{{ item.name }}/" + dest: "{{ final_artifacts_folder }}/" + when: item.limit_to is not defined or item.limit_to == None + loop: "{{ config.get_artifacts }}" + loop_control: + label: "{{ item.name }}" + +- name: copy filtered files if filters + include_tasks: limit_to.yml + when: item.limit_to is defined + loop: "{{ config.get_artifacts }}" + vars: + job_name: "{{ item.name }}" + limit_to: "{{ item.limit_to }}" + loop_control: + label: "{{ item.name }}" + +## +# get list of files to archive +## +- name: get list of files to encrypt + find: + paths: "{{ final_artifacts_folder }}" + recurse: true + register: artifacts_files + +- name: set file list + set_fact: + files_list: "{{ artifacts_files.files | map(attribute='path')| list }}" + +## +# If we encode file via ansible vault +## +- name: encrypt files + shell: > + ansible-vault encrypt --vault-password-file {{ + lookup( 'env', 'VAULT_FILE') }} {{ item }} + register: res + loop: "{{ files_list }}" + failed_when: + res.rc == 1 and res.stderr != "ERROR! 
input is already encrypted" + when: + config.get_encrypt is defined and (config.get_encrypt | bool) + + +## +# Add ssh_gateways file if needed +## + +- name: get config step parameters + set_fact: + config_step: >- + {{ gitlab.git_projects[ + hostvars[inventory_hostname].scenario_steps['config'].project] | + combine(hostvars[inventory_hostname].scenario_steps['config']) }} + +- name: get ssh gateways config + uri: + url: >- + {{ config_step.api }}/repository/files/{{ + [config_step.path | default(''), 'config/ssh_gateways'] | + filepath(config.ssh_access) + }}?ref={{ config_step.branch }} + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + status_code: 200 + return_content: yes + register: ssh_gw_get + when: config.ssh_access is defined + +- name: save ssh gateways config + copy: + content: "{{ ssh_gw_get.json.content | b64decode }}" + dest: "{{ final_artifacts_folder }}/vars/ssh_gateways.yml" + force: true + mode: 0660 + when: config.ssh_access is defined + +## +# get list of files and folders to archive +## +- name: set file list + set_fact: + arch_files: + "{{ (arch_files | default([])) + + [ final_artifacts_folder + '/' + item ] }}" + loop: "{{ vars['.artifacts_root'].paths }}" + +- name: Prepare artifact archive for binary transmission + archive: + path: "{{ arch_files }}" + dest: "{{ playbook_dir }}/artifacts.zip" + format: zip + +## +# Set the artifact to send +## +- name: "Prepare artifact archive for binary transmission" + slurp: + src: artifacts.zip + register: slurped_artifact + +- name: Add artifacts bin if requested + set_fact: + artifacts_bin: "{{ slurped_artifact.content }}" + +## +# Clean +## +- name: delete temporary folders + file: + path: "{{ item }}" + state: absent + loop: + - "{{ previous_artifacts_folder }}" + - "{{ final_artifacts_folder }}" diff --git a/roles/get_artifacts/tasks/get_one_artifact.yml b/roles/get_artifacts/tasks/get_one_artifact.yml new file mode 100644 index 0000000..ccbdc48 --- /dev/null +++ b/roles/get_artifacts/tasks/get_one_artifact.yml @@ -0,0 +1,49 @@ +--- +## +# Search for a job id +# with name: artifact_job_name +# limit to pipeline if artifact_in_pipeline (default: true) +## + +- name: set empty fact for job + set_fact: + job: {} + artifact_in_pipeline: "{{ artifact_in_pipeline | default(true) }}" + +- name: get job id in this pipeline + when: artifact_in_pipeline | bool + block: + - name: "Get job successful job ids of the pipeline" + uri: + url: >- + {{ gitlab.api_url }}/projects/{{ + lookup( 'env', 'CI_PROJECT_ID') }}/pipelines/{{ + lookup( 'env', 'CI_PIPELINE_ID') }}/jobs?scope[]=success + method: GET + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + register: pipeline_success_jobs + - name: get the job id + set_fact: + job: >- + {{ { 'id': + pipeline_success_jobs.json |json_query( + '[?name==`'+ artifact_job_name + ':' + + inventory_hostname +'`].id') | last + } }} + +- name: fetch the job id corresponding to get_artifact value if not in pipeline + include_tasks: job_id_fetch.yml + loop: "{{ range(0, job_id_fetch.max_page)| list }}" + when: not (artifact_in_pipeline | bool ) + loop_control: + loop_var: page + +- name: check we found an artifact job id + fail: + msg: 'We can not found a correct job id' + when: job.id is not defined + +- name: get last successful job id + set_fact: + artifact_job_ids: "{{ (artifact_job_ids|default([])) + [job.id] }}" diff --git a/roles/get_artifacts/tasks/job_id_fetch.yml b/roles/get_artifacts/tasks/job_id_fetch.yml new file mode 100644 index 0000000..cab4bcb --- /dev/null +++ 
b/roles/get_artifacts/tasks/job_id_fetch.yml @@ -0,0 +1,20 @@ +--- + +- block: + - name: "Get successful job ids if artifact fetching" + uri: + url: >- + {{ gitlab.api_url }}/projects/{{ lookup( 'env', 'CI_PROJECT_ID') + }}/jobs?scope[]=success&per_page={{ job_id_fetch.per_page + }}&page={{ page }} + method: GET + headers: + PRIVATE-TOKEN: "{{ gitlab.private_token }}" + register: successful_jobs + - name: save successful job + set_fact: + job: >- + {{ successful_jobs.json| + selectattr('name', 'equalto', artifact_job_name)| list | + first | default({}) }} + when: job.id is not defined diff --git a/roles/get_artifacts/tasks/limit_to.yml b/roles/get_artifacts/tasks/limit_to.yml new file mode 100644 index 0000000..2e1b782 --- /dev/null +++ b/roles/get_artifacts/tasks/limit_to.yml @@ -0,0 +1,20 @@ +--- + +- debug: + var: limit_to + verbosity: 3 +- debug: + var: job_name + verbosity: 3 +- name: copy all files if filters and rename if needed + copy: + decrypt: false + src: "{{ previous_artifacts_folder }}/{{ job_name }}/{{ original }}" + dest: "{{ final_artifacts_folder }}/{{ renamed }}" + loop: "{{ limit_to }}" + vars: + original: "{{ file.keys()|first }}" + renamed: "{{ file.values()|first }}" + loop_control: + loop_var: file + label: "{{ original }}" diff --git a/roles/get_artifacts/tasks/main.yml b/roles/get_artifacts/tasks/main.yml new file mode 100644 index 0000000..605521c --- /dev/null +++ b/roles/get_artifacts/tasks/main.yml @@ -0,0 +1,34 @@ +--- +## +# Check config is prepared +## +- name: check 'step' is set + fail: + msg: 'Prepare role must be run before' + when: config is not defined + + +- name: recover previous artifacts + when: + config.get_artifacts is defined and + config.get_artifacts + block: + ## + # If we get previous artifacts via url + ## + - name: Add artifacts via source + include_tasks: url.yml + when: + (config.get_bin is not defined or not (config.get_bin | bool)) + and (config.ssh_access is not defined) + and (config.get_artifacts is string) + + ## + # If we get previous artifacts via url + ## + - name: Add artifacts via binary + include_tasks: binary.yml + when: + (config.get_bin is defined and (config.get_bin | bool)) + or (config.ssh_access is defined) + or (config.get_artifacts is not string) diff --git a/roles/get_artifacts/tasks/url.yml b/roles/get_artifacts/tasks/url.yml new file mode 100644 index 0000000..a2b5a91 --- /dev/null +++ b/roles/get_artifacts/tasks/url.yml @@ -0,0 +1,13 @@ +--- + +- name: get_artifacts with just one value + include_tasks: get_one_artifact.yml + vars: + artifact_job_name: "{{ config.get_artifacts }}" + +- name: get the url of the artifact + set_fact: + artifacts_src: >- + {{ gitlab.api_url }}/projects/{{ + lookup( 'env', 'CI_PROJECT_ID') }}/jobs/{{ + artifact_job_ids[0] }}/artifacts diff --git a/roles/gitlab-ci-generator/defaults/main.yml b/roles/gitlab-ci-generator/defaults/main.yml new file mode 100644 index 0000000..11b8726 --- /dev/null +++ b/roles/gitlab-ci-generator/defaults/main.yml @@ -0,0 +1,3 @@ +--- +ci_file: "{{ lookup('env', 'CI_FILE') + | default(playbook_dir +'/.gitlab-ci.yml', true)}}" diff --git a/roles/gitlab-ci-generator/tasks/main.yml b/roles/gitlab-ci-generator/tasks/main.yml new file mode 100644 index 0000000..a96ae7c --- /dev/null +++ b/roles/gitlab-ci-generator/tasks/main.yml @@ -0,0 +1,45 @@ +--- +## +# Warn if log level is high +## +- name: Warn if log level is high + debug: + msg: "{{ msg.split('\n') }}" + verbosity: 3 + vars: + msg: | + 
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! + !! Log level is HIGH ! !! + !! Some sensitive data may be visible to everyone. !! + !! Don't forget to clean the task output ! !! + !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! + +## +# Generate the CI file +## +- name: generate the new gitlab-ci file from inventory + run_once: true + block: + - name: create a tempfile + tempfile: + state: file + suffix: temp + register: tmp_file + - copy: + src: "{{ ci_file }}" + dest: "{{ tmp_file.path }}" + ignore_errors: true + - name: generate the gitlab-ci.yml + template: + src: gitlab-ci.yml + dest: "{{ ci_file }}" + rescue: + - name: restore gitlab-ci + copy: + src: "{{ tmp_file.path }}" + dest: "{{ ci_file }}" + always: + - name: destroy temp file + file: + path: "{{ tmp_file.path }}" + state: absent diff --git a/roles/gitlab-ci-generator/templates/gitlab-ci.yml b/roles/gitlab-ci-generator/templates/gitlab-ci.yml new file mode 100644 index 0000000..51ceb05 --- /dev/null +++ b/roles/gitlab-ci-generator/templates/gitlab-ci.yml @@ -0,0 +1,204 @@ +--- +################################################################################ +# +# !! DO NOT EDIT MANUALLY !! +# +# This file is generated by gitlab-ci-generator +# +################################################################################ + +stages: +{% for stage in stages %} + - {{ stage }} +{% endfor %} + +variables: + GIT_SUBMODULE_STRATEGY: recursive + VAULT_FILE: .vault + +################################################################################ +# Shared parameters +################################################################################ +.runner_tags: &runner_tags + tags: +{% for tag in runner.tags %} + - {{ tag }} +{% endfor %} + +.syntax_checking: &syntax_checking + only: + - pushes + stage: lint + +.artifacts_root: &artifacts_root + name: "$CI_JOB_NAME-$CI_COMMIT_REF_NAME" + paths: + - vars/ + - inventory/ + +.artifacts: &artifacts + artifacts: + <<: *artifacts_root + expire_in: 15 days + +.artifacts_longexpire: &artifacts_longexpire + artifacts: + <<: *artifacts_root + expire_in: 1 yrs + +.runner_env: &runner_env +{% for var_name, var_value in runner.env_vars.items()|default({'foo': 'bar'}) %} + {{ var_name }}: "{{ var_value }}" +{% endfor %} + +################################################################################ +# Linting +################################################################################ + +yaml_checking: + <<: *syntax_checking + <<: *runner_tags + variables: + <<: *runner_env + image: {{ runner.docker_proxy }}sdesbure/yamllint:latest + script: + - > + yamllint -d "line-length: { + max: 80, + allow-non-breakable-words: true, + allow-non-breakable-inline-mappings: true}" + .gitlab-ci.yml + - yamllint *.yml + +ansible_linting: + <<: *syntax_checking + <<: *runner_tags + variables: + <<: *runner_env + image: {{ runner.docker_proxy }}sdesbure/ansible-lint:latest + script: + - ansible-lint -x ANSIBLE0010,ANSIBLE0013 run-ci.yml + +{% if not (disable_pages | default(false)) %} +################################################################################ +# Pages +################################################################################ + +pages: + image: {{ runner.docker_proxy }}{{ runner.image }}:{{ runner.image_tag }} + stage: lint + <<: *runner_tags + variables: + <<: *runner_env + script: + - ./chained-ci-vue/init.sh ./pod_inventory + artifacts: + paths: + - public + only: + - master + except: + - triggers + - api + - 
external + - pipelines + - schedules + - web + +{% endif %} + +################################################################################ +# Jobs +################################################################################ + +.vault_mgmt: &vault_mgmt + before_script: + - echo ${ANSIBLE_VAULT_PASSWORD} > ${PWD}/${VAULT_FILE} + after_script: + - rm -f $PWD/.vault + +.set_config: &set_config + <<: *runner_tags + <<: *vault_mgmt + image: {{ runner.docker_proxy }}{{ runner.image }}:{{ runner.image_tag }} + script: + - > + ansible-playbook -i pod_inventory/inventory --limit ${pod} + --vault-password-file ${PWD}/${VAULT_FILE} + ${ansible_verbose} artifacts_init.yml + +.run_ci: &run_ci + <<: *runner_tags + <<: *vault_mgmt + image: {{ runner.docker_proxy }}{{ runner.image }}:{{ runner.image_tag }} + script: + - > + ansible-playbook -i pod_inventory/inventory --limit ${pod} + --extra-vars "step=${CI_JOB_NAME%:*}" + --vault-password-file ${PWD}/${VAULT_FILE} + ${ansible_verbose} run-ci.yml + +.trigger: &trigger + <<: *runner_tags + <<: *vault_mgmt + image: {{ runner.docker_proxy }}{{ runner.image }}:{{ runner.image_tag }} + script: + - > + ansible-playbook -i pod_inventory/inventory --limit ${pod} + --vault-password-file ${PWD}/${VAULT_FILE} + ${ansible_verbose} --extra-vars "step=trigger" trigger_myself.yml + +{% for pipeline in groups['all'] %} +################################################################################ +# {{ pipeline }} +################################################################################ + +.{{ pipeline }}_global: &{{ pipeline }}_global + variables: + pod: {{ pipeline }} + <<: *runner_env +{% if hostvars[pipeline].environment is defined %} + environment: + name: {{ hostvars[pipeline].environment }} +{% endif %} + only: + variables: + - $POD == "{{ pipeline }}" +{% if hostvars[pipeline].inpod is defined %} + - $INPOD == "{{ hostvars[pipeline].inpod }}" +{% endif %} + refs: + - web + - schedules + - triggers + +{% for stage in stages %} +{% for task in hostvars[pipeline].scenario_steps %} +{% if hostvars[pipeline].scenario_steps[task].stage | default( + gitlab.git_projects[hostvars[pipeline].scenario_steps[task].project].stage + ) == stage %} +{{ task }}:{{ pipeline }}: + stage: {{ stage }} + <<: *{{ pipeline }}_global +{% if hostvars[pipeline].scenario_steps[task].project == 'config' %} + <<: *set_config +{% elif hostvars[pipeline].scenario_steps[task].project == 'trigger' %} + <<: *trigger +{% else %} + <<: *run_ci +{% endif %} +{% if (hostvars[pipeline].scenario_steps[task].pull_artifacts + | default(gitlab.git_projects[hostvars[pipeline].scenario_steps[task].project].pull_artifacts) + | default(false)) + or task == 'config' %} + <<: *artifacts{% if hostvars[pipeline].longlife_artifact | default(false) | bool %}_longexpire{% endif %} + +{% endif %} +{% endif %} +{% endfor %} +{% endfor %} + +{% endfor %} +## +# End of generated file +## diff --git a/roles/library/filepath.py b/roles/library/filepath.py new file mode 100644 index 0000000..356f5fb --- /dev/null +++ b/roles/library/filepath.py @@ -0,0 +1,40 @@ +#!/usr/bin/env python3 + +import os +from urllib.parse import quote + +class FilterModule(object): + def filters(self): + return { + 'filepath': self.filepath + } + + def filepath(self, path, *filename): + # + # path: a string or a list of string that contains successive parts + # of the path. Nul or empty parts are removed + # filename: the optionnal filename to be used after the path. 
It may + # be specified using multiple args to be concatenate (useful + # when building dynamic names in ansible/jinja templates) + # + '''build a gitlab filepath given `path' and `filename'.''' + + if path is not None: + if not isinstance(path, list): + path = [path] + path = list(filter(None, path)) + if path: + path = os.path.normpath(os.path.join(path[0], *path[1:])) + + if filename: + filename = ''.join(list(filter(None, filename))) + + if path and filename: + path = os.path.join(path, filename) + elif filename: + path = filename + + if path: + return quote(path, safe='') + + return None diff --git a/roles/logo.png b/roles/logo.png Binary files differnew file mode 100644 index 0000000..1fc981e --- /dev/null +++ b/roles/logo.png diff --git a/roles/logo.svg b/roles/logo.svg new file mode 100644 index 0000000..b2e6875 --- /dev/null +++ b/roles/logo.svg @@ -0,0 +1,130 @@ +<?xml version="1.0" encoding="UTF-8" standalone="no"?> +<!-- Created with Inkscape (http://www.inkscape.org/) --> + +<svg + xmlns:dc="http://purl.org/dc/elements/1.1/" + xmlns:cc="http://creativecommons.org/ns#" + xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" + xmlns:svg="http://www.w3.org/2000/svg" + xmlns="http://www.w3.org/2000/svg" + xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" + xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" + width="19.586046mm" + height="19.586046mm" + viewBox="0 0 19.586046 19.586046" + version="1.1" + id="svg8" + inkscape:version="0.92.3 (2405546, 2018-03-11)" + sodipodi:docname="logo.svg" + inkscape:export-filename="/home/edby8475/Dev/chained-ci-roles/logo.png" + inkscape:export-xdpi="98" + inkscape:export-ydpi="98"> + <defs + id="defs2" /> + <sodipodi:namedview + id="base" + pagecolor="#ffffff" + bordercolor="#666666" + borderopacity="1.0" + inkscape:pageopacity="0.0" + inkscape:pageshadow="2" + inkscape:zoom="6.5333333" + inkscape:cx="7.8188803" + inkscape:cy="30.617961" + inkscape:document-units="mm" + inkscape:current-layer="g933" + showgrid="false" + inkscape:window-width="2560" + inkscape:window-height="1403" + inkscape:window-x="0" + inkscape:window-y="0" + inkscape:window-maximized="1" + fit-margin-top="0" + fit-margin-left="0" + fit-margin-right="0" + fit-margin-bottom="0" /> + <metadata + id="metadata5"> + <rdf:RDF> + <cc:Work + rdf:about=""> + <dc:format>image/svg+xml</dc:format> + <dc:type + rdf:resource="http://purl.org/dc/dcmitype/StillImage" /> + <dc:title></dc:title> + </cc:Work> + </rdf:RDF> + </metadata> + <g + inkscape:label="Calque 1" + inkscape:groupmode="layer" + id="layer1" + transform="translate(-76.099242,-143.22085)"> + <circle + style="opacity:1;vector-effect:none;fill:#4d4d4d;fill-opacity:1;stroke:#4d4d4d;stroke-width:0.50260705;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" + id="path921" + cx="85.892265" + cy="153.01387" + r="9.5417194" /> + <g + id="g933"> + <path + sodipodi:nodetypes="cccc" + inkscape:connector-curvature="0" + id="path919" + d="m 87.755086,149.65335 h -3.722685 c 3.722685,0 0.03786,6.84943 3.737003,6.84401 v 0" + style="fill:none;stroke:#f2f2f2;stroke-width:0.26458332px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" /> + <g + transform="translate(-2.1166667)" + id="g862"> + <circle + style="opacity:1;vector-effect:none;fill:#ffffff;fill-opacity:1;stroke:#44aa00;stroke-width:0.5291667;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" + 
id="path848" + cx="83.371094" + cy="149.57428" + r="2.7487407" /> + <path + style="fill:none;stroke:#44aa00;stroke-width:0.62900001;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1" + d="m 82.386279,149.28284 0.715902,0.81612 1.267143,-1.24567" + id="path850" + inkscape:connector-curvature="0" /> + </g> + <g + transform="translate(-0.03559777,6.879167)" + id="g858"> + <circle + r="2.7487407" + cy="149.57428" + cx="90.565697" + id="circle852" + style="opacity:1;vector-effect:none;fill:#ffffff;fill-opacity:1;stroke:#2a7fff;stroke-width:0.5291667;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" /> + <path + style="opacity:1;vector-effect:none;fill:#2a7fff;fill-opacity:1;stroke:none;stroke-width:0.36824697;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" + id="circle854" + sodipodi:type="arc" + sodipodi:cx="90.415558" + sodipodi:cy="149.57428" + sodipodi:rx="1.9128479" + sodipodi:ry="1.9128479" + sodipodi:start="4.7106229" + sodipodi:end="2.5736704" + d="m 90.41218,147.66143 a 1.9128479,1.9128479 0 0 1 1.881627,1.55068 1.9128479,1.9128479 0 0 1 -1.17015,2.13913 1.9128479,1.9128479 0 0 1 -2.320669,-0.74807 l 1.61257,-1.02889 z" /> + </g> + <g + transform="translate(7.1590052)" + id="g868"> + <circle + r="2.7487407" + cy="149.57428" + cx="83.371094" + id="circle864" + style="opacity:1;vector-effect:none;fill:#ffffff;fill-opacity:1;stroke:#44aa00;stroke-width:0.5291667;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" /> + <path + inkscape:connector-curvature="0" + id="path866" + d="m 82.386279,149.28284 0.715902,0.81612 1.267143,-1.24567" + style="fill:none;stroke:#44aa00;stroke-width:0.62900001;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1" /> + </g> + </g> + </g> +</svg> diff --git a/roles/prepare/README.md b/roles/prepare/README.md new file mode 100644 index 0000000..2cdcd0e --- /dev/null +++ b/roles/prepare/README.md @@ -0,0 +1,28 @@ +# Chained CI Prepare role + +This role prepare the settings before getting artifacts and run the playbook. +It: + - Warn if log level is HIGH to avoid data leaking + - Check the step parameter is set + - prepare the `config` fact + - test `only` and `except` step parameters to limit when jobs are runned. + This will __SKIP__ this job if __ONE of__ the `except` condition is + successful __AND__ if __ALL__ the `only` conditions are failing. Those + conditions are testing environment variables like this: + - `VAR`: this test the presence of a variable that is not empty + - `VAR == value`: this test the exact value of a variable + - `VAR != value`: this test the exact difference of a variable. 
+ `VAR in [value1, value2]`: this tests that the value of a variable is in a + set of possibilities + +## Example + +``` +except: + - "XXX in [aaa, aab]" + - "YYY" +only: + - "AAA == yes" + - "BBB != no" + - "CCC in [pitet, possible]" +``` diff --git a/roles/prepare/tasks/continue.yml b/roles/prepare/tasks/continue.yml new file mode 100644 index 0000000..5d664c7 --- /dev/null +++ b/roles/prepare/tasks/continue.yml @@ -0,0 +1,15 @@ +--- + +- name: we have to continue this role + debug: + msg: "{{ msg.split('\n') }}" + vars: + msg: | + ************************************************************************** + ** We continue the play + ** REASON = '{{ condition }}' + ************************************************************************** + +- name: Do not skip the run of the play + set_fact: + skip_run: false diff --git a/roles/prepare/tasks/except.yml b/roles/prepare/tasks/except.yml new file mode 100644 index 0000000..8d8abff --- /dev/null +++ b/roles/prepare/tasks/except.yml @@ -0,0 +1,55 @@ +--- +# in this file, default variable value is '-666-', I hope no one will ever +# test the number of the beast :) + + +- name: Testing 'EXCEPT' condition + debug: + var: condition + +- name: if condition is only one word + block: + - name: check variable is present + include_tasks: exit.yml + when: lookup('env', condition)| default(False, true) + when: condition.split()| length == 1 + +- name: if condition contains '==' + block: + - name: split condition with '==' + set_fact: + cond: "{{ (condition|replace(' == ', '==')).split('==') }}" + - debug: msg="{{ cond[1:]| join('==') }}" + - name: test condition + include_tasks: exit.yml + when: (lookup('env', cond[0])| default('-666-', true)) == ( + cond[1:]| join('==')) + when: condition is search('==') + +- name: if condition contains '!=' + block: + - name: split condition with '!=' + set_fact: + cond: "{{ (condition|replace(' != ', '!=')).split('!=') }}" + - name: test condition + include_tasks: exit.yml + when: (lookup('env', cond[0])| default('-666-', true)) != ( + cond[1:]| join('!=')) + when: condition is search('!=') + +- name: if condition contains 'in' + block: + - name: split condition with ' in ' + set_fact: + cond: "{{ condition.split(' in ') }}" + - name: split list + set_fact: + inlist: | + {{ (cond[1]| + replace(', ', ',')| replace(' ,', ',')| + replace(' ]', '') | replace(']', '')| + replace('[ ', '') | replace('[', '')).split(',') }} + - name: test condition + include_tasks: exit.yml + when: (lookup('env', cond[0])| default('-666-', true)) in inlist + when: condition is search(' in ') diff --git a/roles/prepare/tasks/exit.yml b/roles/prepare/tasks/exit.yml new file mode 100644 index 0000000..58fb43d --- /dev/null +++ b/roles/prepare/tasks/exit.yml @@ -0,0 +1,13 @@ +--- + +- name: we have to end this role + debug: + msg: "{{ msg.split('\n') }}" + vars: + msg: | + ************************************************************************** + ** We finish the play here + ** REASON = '{{ condition }}' + ************************************************************************** + +- meta: end_play diff --git a/roles/prepare/tasks/main.yml b/roles/prepare/tasks/main.yml new file mode 100644 index 0000000..ce08540 --- /dev/null +++ b/roles/prepare/tasks/main.yml @@ -0,0 +1,93 @@ +--- +## +# Warn if log level is high +## +- name: Warn if log level is high + debug: + msg: "{{ msg.split('\n') }}" + verbosity: 3 + vars: + msg: | + !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! + !! Log level is HIGH ! !! + !! 
Some sensitive data may be visible to everyone. !! + !! Don't forget to clean the task output ! !! + !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! + +## +# Check Step parameters +## +- name: check 'step' is set + fail: + msg: 'Step must be defined ! (use --extra-vars "step=test1")' + when: step is not defined + +## +# Check the pod is not protected +## +- name: clean var + set_fact: + protected_pods: [] + when: protected_pods|default() == None + +- name: check pod protection + fail: + msg: 'This pod is protected' + when: + inventory_hostname in protected_pods and + lookup( 'env', 'AREYOUSURE') != 'MAIS OUI !!!' + +## +# Prepare the step config +## +- name: get default step parameters + set_fact: + config: >- + {{ gitlab.git_projects[ + hostvars[inventory_hostname].scenario_steps[step].project] | + combine(hostvars[inventory_hostname].scenario_steps[step]) }} + +- name: merge step parameters + set_fact: + config: >- + {{ config| combine( + {'parameters': config.parameters| + combine(config.extra_parameters)}) }} + when: config.extra_parameters is defined + +## +# Check if we must run this step - Must be run at the end of this role +## + +- name: Set default skip_run value + set_fact: + skip_run: false + +- name: run except parameter + include_tasks: except.yml + loop: "{{ config.except }}" + loop_control: + loop_var: condition + label: "{{ condition }}" + when: config.except is defined + +- name: Set default skip_run value + set_fact: + skip_run: true + when: config.only is defined + +- name: run only parameter + include_tasks: only.yml + loop: "{{ config.only }}" + vars: + skip_all: false + loop_control: + loop_var: condition + label: "{{ condition }}" + when: config.only is defined + +- name: Skip if none of ONLY is successful + include_tasks: exit.yml + vars: + condition: "None of ONLY conditions are successful" + when: config.only is defined and skip_run diff --git a/roles/prepare/tasks/only.yml b/roles/prepare/tasks/only.yml new file mode 100644 index 0000000..893d32b --- /dev/null +++ b/roles/prepare/tasks/only.yml @@ -0,0 +1,57 @@ +--- +# in this file, default variable value is '-666-', I hope no one will ever +# test the number of the beast :) + +- name: test condition only if the previous failed + when: skip_run + block: + - name: Testing 'ONLY' condition + debug: + var: condition + + - name: if condition is only one word + block: + - name: check variable is present + include_tasks: continue.yml + when: lookup('env', condition)| default(False, true) + when: condition.split()| length == 1 + + - name: if condition contains '==' + block: + - name: split condition with '==' + set_fact: + cond: "{{ (condition|replace(' == ', '==')).split('==') }}" + - debug: msg="{{ cond[1:]| join('==') }}" + - name: test condition + include_tasks: continue.yml + when: (lookup('env', cond[0])| default('-666-', true)) == ( + cond[1:]| join('==')) + when: condition is search('==') + + - name: if condition contains '!=' + block: + - name: split condition with '!=' + set_fact: + cond: "{{ (condition|replace(' != ', '!=')).split('!=') }}" + - name: test condition + include_tasks: continue.yml + when: (lookup('env', cond[0])| default('-666-', true)) != ( + cond[1:]| join('!=')) + when: condition is search('!=') + + - name: if condition contains 'in' + block: + - name: split condition with ' in ' + set_fact: + cond: "{{ condition.split(' in ') }}" + - name: split list + set_fact: + inlist: | + {{ (cond[1]| + replace(', ', ',')| replace(' ,', ',')| + replace(' ]', '') | replace(']', 
'')| + replace('[ ', '') | replace('[', '')).split(',') }} + - name: test condition + include_tasks: continue.yml + when: (lookup('env', cond[0])| default('-666-', true)) in inlist + when: condition is search(' in ') diff --git a/roles/run-ci/tasks/grafana_start.yml b/roles/run-ci/tasks/grafana_start.yml new file mode 100644 index 0000000..183a439 --- /dev/null +++ b/roles/run-ci/tasks/grafana_start.yml @@ -0,0 +1,42 @@ +--- +- block: + - name: get start time (epoch+milliseconds) + set_fact: + time_start: "{{ lookup('pipe', 'date +%s%N | head -c 13' ) | int }}" + + - name: set tags + set_fact: + grafana_tags: "{{ [ inventory_hostname ] }}" + + - name: add inpod in tags + set_fact: + grafana_tags: "{{ grafana_tags + [ inpod ] }}" + when: inpod is defined + + - name: "Create a grafana annotation" + uri: + url: "{{ grafana.api | regex_replace('\\/$', '') }}/annotations" + method: POST + status_code: 200 + body_format: "json" + body: "{{ + { + 'time': time_start | int, + 'isRegion': true, + 'timeEnd': (time_start | int + 10000000), + 'tags': grafana_tags, + 'title': step, + 'text': text + } + }}" + headers: + Content-Type: "application/json" + Accept: "application/json" + Authorization: "Bearer {{ grafana.token }}" + register: grafana_events + vars: + text: + "<a href=\"{{ pipeline_url }}\">{{ step }}</a> running" + + delegate_to: "{{ grafana.jumphost }}" + ignore_errors: true diff --git a/roles/run-ci/tasks/grafana_stop.yml b/roles/run-ci/tasks/grafana_stop.yml new file mode 100644 index 0000000..4d5a7e7 --- /dev/null +++ b/roles/run-ci/tasks/grafana_stop.yml @@ -0,0 +1,53 @@ +--- +- block: + - name: get end time + set_fact: + time_end: "{{ lookup('pipe', 'date +%s%N | head -c 13' ) | int }}" + + - name: calculate duration + set_fact: + duration: + "{{ ((time_end|int) - (time_start|int))/1000 }}" + + - name: "update a grafana annotation start" + uri: + url: + "{{ grafana.api | regex_replace('\\/$', '') }}/annotations/{{ + grafana_events.json.id }}" + method: PUT + status_code: 200 + body_format: "json" + body: "{{ + { + 'time': time_start | int, + 'tags': grafana_tags + [ result ], + 'text': text + '<br/>Duration (s): ' + duration + } + }}" + headers: + Content-Type: "application/json" + Accept: "application/json" + Authorization: "Bearer {{ grafana.token }}" + + - name: "update a grafana annotation end" + uri: + url: + "{{ grafana.api | regex_replace('\\/$', '') }}/annotations/{{ + grafana_events.json.endId }}" + method: PUT + status_code: 200 + body_format: "json" + body: "{{ + { + 'time': time_end | int, + 'tags': grafana_tags + [ result ], + 'text': text + '<br/>Duration (s): ' + duration + } + }}" + headers: + Content-Type: "application/json" + Accept: "application/json" + Authorization: "Bearer {{ grafana.token }}" + + delegate_to: "{{ grafana.jumphost }}" + ignore_errors: true diff --git a/roles/run-ci/tasks/main.yml b/roles/run-ci/tasks/main.yml new file mode 100644 index 0000000..eed27f8 --- /dev/null +++ b/roles/run-ci/tasks/main.yml @@ -0,0 +1,270 @@ +--- + +## +# Prepare base of variables to send +## +- name: prepare variables to sent + set_fact: + params: + { + 'token': "{{ config.trigger_token }}", + 'ref': "{{ config.branch }}", + 'variables[source_job_name]': "{{ step }}", + 'variables[pod]': "{{ inventory_hostname }}", + 'variables[jumphost]': "{{ jumphost }}", + } + +## +# Prepare the artifacts to get +## + +- name: add bin artifacts param + when: artifacts_bin is defined + set_fact: + params: + "{{ params|combine({'variables[artifacts_bin]': artifacts_bin }) }}" + +- name: 
add src artifacts param + when: artifacts_src is defined + set_fact: + params: + "{{ params|combine({'variables[artifacts_src]': artifacts_src }) }}" + +- name: ensure artifacts.zip is not present + file: + path: "{{ playbook_dir }}/artifacts.zip" + state: absent + +- name: set healthchecks base url + set_fact: + base_url: "{{ gitlab.healthchecks_url }}/ping/{{ healthchecks_id }}" + when: healthchecks_id is defined + +## +# Run the step +## +- name: Run step + block: + ## + # add step parameters in the parameters to send + ## + - name: Add step parameters + set_fact: + params: "{{ params|combine({key: value}) }}" + vars: + key: "variables[{{ item.key }}]" + value: "{{ item.value }}" + with_dict: "{{ config.parameters }}" + when: config.parameters is defined and config.parameters != None + + ## + # add NOVAULT_LIST parameter in the parameters to send + ## + - name: Add NOVAULT_LIST parameter + set_fact: + params: "{{ params|combine({key: value}) }}" + vars: + key: "variables[NOVAULT_LIST]" + value: "{{ config.novault |join(\"\n\") }}" + when: config.novault is defined + + ## + # Trigger the pipeline + ## + - name: "Trigger a new pipeline for step {{ step }}" + uri: + url: "{{ config.api }}/trigger/pipeline" + method: POST + status_code: 201 + body_format: raw + body: "{{ params| urlencode }}" + headers: + Content-Type: "application/x-www-form-urlencoded" + register: trigger_out + + - name: set pipeline url + set_fact: + pipeline_url: "{{ config.url }}/pipelines/{{ trigger_out.json.id }}" + api_pipeline_url: "{{ config.api }}/pipelines/{{ trigger_out.json.id }}" + + - name: Echo running pipeline link + debug: + msg: "{{ msg.split('\n') }}" + vars: + msg: | + ****************************************************************** + * Pipeline triggered for step '{{ step }}' + * {{ pipeline_url }} + ****************************************************************** + + - name: set grafana start point + include_tasks: grafana_start.yml + when: grafana is defined + + - name: "Wait for pipeline result {{ step }}" + uri: + url: "{{ config.api }}/pipelines/{{ trigger_out.json.id }}" + method: GET + status_code: 200 + return_content: 'yes' + headers: + PRIVATE-TOKEN: + "{{ config.api_token|default(gitlab.private_token, 'true') }}" + register: pipeline_out + retries: "{{ config.timeout }}" + delay: "{{ gitlab.pipeline.delay }}" + until: (((pipeline_out.json + |default({'status':'unknown'})).status + |default('unknown')) + not in ['created', 'waiting_for_resource', 'preparing', + 'pending', 'running', 'unknown']) or ( + pipeline_out.status == 401 + ) + + + - name: Exit -1 + fail: + when: pipeline_out.json.status not in ['success'] + + ## + # When finished, recover an artifact if requested + ## + - name: pull artifacts_src + when: + config.pull_artifacts is defined and config.pull_artifacts != None + block: + - name: "Get job id for the artifact to get" + uri: + url: >- + {{ config.api + }}/pipelines/{{ trigger_out.json.id }}/jobs?scope[]=success + method: GET + headers: + PRIVATE-TOKEN: + "{{ config.api_token|default(gitlab.private_token, 'true') }}" + register: pipeline_success_jobs + + - name: download job artifact + uri: + url: "{{ config.api }}/jobs/{{ job_id[0] }}/artifacts" + headers: + PRIVATE-TOKEN: + "{{ config.api_token|default(gitlab.private_token, 'true') }}" + dest: "{{ playbook_dir }}/artifacts.zip" + vars: + job_id: >- + {{ pipeline_success_jobs.json |json_query( + '[?name==`'+ config.pull_artifacts +'`].id') }} + + - name: remove actual artifacts + file: + path: "{{ item }}" + state: absent 
+ when: item[-1] == '/' + with_items: + "{{ vars[lookup( 'env', 'CI_JOB_NAME')].artifacts.paths }}" + + - name: create artifacts folders + file: + path: "{{ item }}" + state: directory + recurse: true + mode: 0775 + when: item[-1] == '/' + with_items: + "{{ vars[lookup( 'env', 'CI_JOB_NAME')].artifacts.paths }}" + + - name: unarchive artifacts + unarchive: + src: "{{ playbook_dir }}/artifacts.zip" + dest: "{{ playbook_dir }}" + remote_src: "yes" + + - name: trigger OK healthchecks + uri: + url: "{{ base_url }}" + when: healthchecks_id is defined + ignore_errors: true + + - name: update grafana stop point + include_tasks: grafana_stop.yml + vars: + result: "{{ pipeline_out.json.status }}" + text: "<a href={{ pipeline_url }}>{{ step }}</a> succeeded" + when: grafana is defined and grafana_events is defined + + ## + # If something failed, print the jobs that failed + ## + rescue: + - name: print last pipeline result for forensic + debug: + var: pipeline_out + verbosity: 3 + + - name: update grafana stop point + include_tasks: grafana_stop.yml + vars: + result: "{{ pipeline_out.json.status }}" + text: "<a href={{ pipeline_url }}>{{ step }}</a> failed" + when: grafana is defined and grafana_events is defined + + - name: trigger Failed healthcheck + uri: + url: "{{ base_url }}/fail" + when: healthchecks_id is defined + + - name: "Show last pipeline_out value" + debug: + msg: "{{ pipeline_out.json | default('No pipeline out') }}" + verbosity: 3 + + - name: "RESCUE - Get jobs list that failed" + uri: + url: "{{ config.api }}/pipelines/{{ trigger_out.json.id }}/jobs/" + method: GET + status_code: 200 + return_content: 'yes' + headers: + PRIVATE-TOKEN: + "{{ config.api_token|default(gitlab.private_token, 'true') }}" + register: jobs_list + + - name: RESCUE - filter failed jobs + set_fact: + failed_jobs: + "{{ failed_jobs | default({}) | combine({ item.id: + {'stage': item.stage, + 'name': item.name, + 'status': item.status, + 'duration': item.duration, + 'url': url + }})}}" + vars: + url: "{{ config.url }}/-/jobs/{{ item.id }}" + when: item.status not in ['success', 'skipped'] + with_items: "{{ jobs_list.json }}" + + - name: RESCUE - run failed ! + when: true + fail: + msg: "{{ msg.split('\n') }}" + vars: + msg: | + ****************************************************************** + * Oh ! NO !!! Pipeling of the project failed !!! + * ----------------------- + * Step: {{ step }} + * Project: {{ config.project }} + * Status: {{ pipeline_out.json.status }} + * Pipeline: '{{ pipeline_url }}' + * API pipeline url: '{{ api_pipeline_url }}' + * Failed jobs: + {% for job_id, job_status in failed_jobs.items() -%} + * - id: {{ job_id }} + * name: {{ job_status.stage }}/{{ job_status.name }} + * status: {{ job_status.status }} + * duration: {{ job_status.duration }} + * link: {{ job_status.url }} + {% endfor %} + ****************************************************************** diff --git a/roles/trigger_myself/tasks/main.yml b/roles/trigger_myself/tasks/main.yml new file mode 100644 index 0000000..bbea974 --- /dev/null +++ b/roles/trigger_myself/tasks/main.yml @@ -0,0 +1,75 @@ +--- +- name: check 'step' is set + fail: + msg: 'Step must be defined ! 
(use --extra-vars "step=test1")' + when: step is not defined + +- name: get default step parameters + set_fact: + config: >- + {{ gitlab.git_projects[ + hostvars[inventory_hostname].scenario_steps[step].project] | + combine(hostvars[inventory_hostname].scenario_steps[step]) }} + +- name: merge step parameters + set_fact: + config: >- + {{ config| combine( + {'parameters': config.parameters| + combine(config.extra_parameters)}) }} + when: config.extra_parameters is defined + + +## +# Prepare base of variables to send +## +- name: prepare variables to send + set_fact: + params: + { + 'token': "{{ config.trigger_token }}", + 'ref': "{{ config.branch }}", + 'variables[source_job_name]': "{{ step }}", + 'variables[triggered_from]': "{{ lookup('env','CI_JOB_NAME') }}", + 'variables[INPOD]': "{{ inventory_hostname }}", + 'variables[jumphost]': "{{ jumphost }}", + } + +- name: Add step parameters + set_fact: + params: "{{ params|combine({key: value}) }}" + vars: + key: "variables[{{ item.key }}]" + value: "{{ item.value }}" + with_dict: "{{ config.parameters }}" + when: config.parameters is defined + + +## +# Trigger the pipeline +## +- name: "Trigger a new pipeline for step {{ step }}" + uri: + url: >- + {{ gitlab.api_url}}/projects/{{ lookup( 'env', 'CI_PROJECT_ID') + }}/trigger/pipeline + method: POST + status_code: 201 + body_format: raw + body: "{{params| urlencode}}" + headers: + Content-Type: "application/x-www-form-urlencoded" + register: trigger_out + +- name: Echo running pipeline link + debug: + msg: "{{ msg.split('\n') }}" + vars: + url: >- + {{ lookup('env','CI_PROJECT_URL') }}/pipelines/{{ + trigger_out.json.id }} + msg: | + ****************************************************************** + * Pipeline triggered for step '{{ step }}' + * {{ url }} + ******************************************************************