Troubleshoot the HashiCorp Packer AWS Instance Directly

I use GitHub to send a notifyCommit to Jenkins to kick off a pipeline job that spins up Packer, calling Ansible as a provisioner (along with some preliminary shell) to configure and install, which then runs integration tests (properly formatted Ansible covers the unit testing) against a temporary CloudFormation stack, using HashiCorp Consul service discovery plus a designated testing instance and security groups to isolate it and ensure this is a self-referencing cloud, and once the tests succeed a deploy can proceed IF the commit and the build are in the VPC assigned for production…
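The kickoff itself is just the Git plugin’s notifyCommit endpoint; a GitHub webhook (or a post-receive hook) only has to hit it. A minimal sketch, with a placeholder Jenkins host and repository URL:

    # GitHub webhook / post-receive hook pokes Jenkins; the Git plugin then
    # polls the repository and starts any pipeline job that watches it.
    curl -s "https://jenkins.example.com/git/notifyCommit?url=git@github.com:example-org/pipetestapp.git"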

Troubleshooting through this is like building a ship inside of a bottle inside of a bottle inside of a bottle. Sometimes you need to access that final Packer instance directly and find out exactly what is going wrong.

By default (unless configured to do otherwise), HashiCorp Packer arbitrarily creates and deletes a temporary EC2 key pair (the SSH key) for the instance. You can configure Packer to spin up the instance with a specific, known key instead.

Here’s the packer.json file:

{
    "variables": {
        "app_name": "",
        "aws_instance_type_1": "",
        "aws_region_1": "",
        "aws_sg_id_1": "",
        "aws_subnet_id_1": "",
        "release": "",
        "aws_source_ami_1": "",
        "orb_project_id": "",
        "consul_url": "",
        "owner": "",
        "stack": "",
        "uuid": "",
        "build_tag": "{{ env `BUILD_TAG` }}",
        "jenkins_url": "{{ env `JENKINS_URL` }}",
        "update_cloud": "",
        "vault_addr": "",
        "encrypt_boot": "true",
        "kms_key_id": "",
        "access_key": "",
        "secret_key": "",
        "token": ""
    },
    "builders": [
        {
            "name": "build-us-west-5-dev",
            "iam_instance_profile": "nebula-service",
            "type": "amazon-ebs",
            "region": "us-west-5",
            "source_ami": "{{ user `ami_us-west-5` }}",
            "instance_type": "{{ user `us-west-5_instance_type` }}",
            "ssh_username": "ec2-user",
            "security_group_id": "{{ user `us-west-5_sec_group_id` }}",
            "subnet_id": "{{ user `us-west-5_subnet_id` }}",
            "tags": {
              "Owner": "{{ user `owner` }}",
              "App": "{{ user `app_name` }}",
              "Stage": "dev",
              "Stack": "{{ user `stack` }}",
              "build_tag": "{{ user `jenkins_url` }}:{{ user `build_tag` }}",
              "update_cloud" : "{{ user `update_cloud` }}",
              "Release": "{{ user `release` }}"
            },
            "ami_name": "{{ user `stack` }}-{{ user `app_name`}}-{{ user `uuid` }}",
            "run_tags": {
                "packer_builder": "true "
            },
            "kms_key_id": "",
            "user_data_file": "cicd/packer/hcm-vendor-fix.sh",
            "access_key": "{{ user `AWS_ACCESS_KEY_ID` }}",
            "secret_key": "{{ user `AWS_SECRET_ACCESS_KEY` }}",
            "token": "{{ user `AWS_SESSION_TOKEN` }}"
        }
    ],
    "provisioners": [
        {
            "type": "shell",
            "inline": [
                "sudo echo 'adjust resolv.conf content'",
                "sudo sed -i '1i options single-request-reopen attempts:5 rotate' /etc/resolv.conf",
                "sudo echo 'verify resolv.conf'",
                "sudo cat /etc/resolv.conf",
                "sudo yum -y install python2-pip",
                "sudo pip install --upgrade pip",
                "sudo pip install ansible python-consul"
            ]
        },
        {
           "type": "ansible-local",
           "playbook_file": "cicd/ansible/access.yml"
        },
        {
           "type": "ansible-local",
           "playbook_file": "cicd/ansible/dependencies.yml"
        },
        {
           "type": "ansible-local",
           "command": "export ANSIBLE_CONSUL_URL={{ user `consul_url` }} && ansible-playbook",
           "playbook_file": "cicd/ansible/site.yml",
           "role_paths": [
             "cicd/ansible/roles/pipetestapp"
             ],
           "extra_arguments": [
             "--extra-vars 'app_name={{ user `app_name` }} release={{ user `release` }} vault_addr={{ user `vault_addr` }} build_name=$PACKER_BUILD_NAME'"
             ]
        },
        {
           "type": "ansible-local",
           "playbook_file": "cicd/ansible/drop_github_access.yml"
        }
    ]
}
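For context, the empty strings in “variables” are fed in by the Jenkins job on the command line. A hedged sketch of that shell step, with placeholder values (the real job pulls them from upstream parameters):

    packer build \
        -var "app_name=pipetestapp" \
        -var "aws_source_ami_1=ami-0123456789abcdef0" \
        -var "aws_instance_type_1=t3.medium" \
        -var "aws_sg_id_1=sg-0123456789abcdef0" \
        -var "aws_subnet_id_1=subnet-0123456789abcdef0" \
        -var "release=1.0.0" \
        -var "uuid=$(uuidgen)" \
        cicd/packer/packer.json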

To access the packer instance directly, add:

               "ssh_keypair_name": "devops ssh key to use",
               "ssh_private_key_file": "/home/ec2-user/.ssh/packer_key",
               "disable_stop_instance": "true",

immediately below “name”: in the “builders” section.
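With those lines in place, the top of the builder ends up looking something like this (the key pair name is whatever known key you provisioned):

    "builders": [
        {
            "name": "build-us-west-5-dev",
            "ssh_keypair_name": "devops ssh key to use",
            "ssh_private_key_file": "/home/ec2-user/.ssh/packer_key",
            "disable_stop_instance": "true",
            "iam_instance_profile": "nebula-service",
            ...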

Then grab the instance ID in AWS and go freeze it: enable termination protection so that instance stays available. Then log on.
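From a shell with the right credentials, that’s roughly (instance ID and address are placeholders):

    # Keep Packer's cleanup (or anyone else) from terminating the instance.
    aws ec2 modify-instance-attribute \
        --instance-id i-0123456789abcdef0 \
        --disable-api-termination

    # Log on with the known key configured above.
    ssh -i /home/ec2-user/.ssh/packer_key ec2-user@10.0.0.10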

This has made it possible to troubleshoot the scripts that run at boot to provision secrets, to verify access to the repository servers and Vault, and to find permissions and package issues that just don’t come back through the Jenkins console log.
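Once on the box, the boot-time side of that (the user_data_file above runs through cloud-init) leaves its tracks in the usual places on Amazon Linux:

    # Output of the user-data script and the rest of boot provisioning.
    sudo less /var/log/cloud-init-output.log
    sudo less /var/log/cloud-init.log

    # What actually got installed, when chasing package issues.
    sudo yum history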

Break all three bottles, pick up the loupe and go see what’s up directly.

— doug