How to Fix Ansible 403 Forbidden on Azure VM


As Senior DevOps Engineers at WebToolsWiz.com, we frequently encounter “Ansible 403 Forbidden” errors when managing Azure Virtual Machines. This error, while seemingly generic, usually points to specific authentication and authorization issues within the SSH communication channel.


Troubleshooting Guide: Ansible 403 Forbidden on Azure VM

When Ansible reports a “403 Forbidden” error while attempting to connect to an Azure VM, it generally signifies that the SSH connection was established, but the subsequent authentication or authorization failed. This is not typically a network access issue (which would likely manifest as a timeout or “connection refused”), but rather a problem with how Ansible is attempting to log in to the target VM.

1. The Root Cause

The “Ansible 403 Forbidden” error against an Azure VM almost invariably boils down to one or more of the following:

  • SSH Key Mismatch or Corruption: The private SSH key Ansible is using does not have a corresponding public key in the ~/.ssh/authorized_keys file on the Azure VM for the specified user, or the keys themselves are corrupted.
  • Incorrect Private Key Permissions: The private key file on your Ansible control node has insecure file permissions (e.g., readable by others), causing the SSH client to refuse to use it.
  • Incorrect SSH User: Ansible is attempting to connect as a user that does not exist on the Azure VM, or one that is not permitted to log in over SSH (for example, root when root login is disabled). Azure VMs typically use the admin username chosen at VM creation, commonly azureuser.
  • SSH Daemon Configuration on the Azure VM: The sshd_config file on the Azure VM is overly restrictive, for example disallowing public key authentication, restricting which users may log in, or permitting only authentication methods that Ansible is not using.
  • Incorrect authorized_keys Permissions/Ownership on Azure VM: The ~/.ssh directory or the ~/.ssh/authorized_keys file on the Azure VM has incorrect permissions or ownership, preventing the SSH daemon from reading them.
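
A quick way to rule out the first cause is to compare key fingerprints: if the fingerprint of the key Ansible presents does not appear among the keys the VM accepts, the VM simply does not know that key. A minimal sketch, assuming a reasonably recent OpenSSH and placeholder paths:

    # On the Ansible control node: derive the public key from your private key
    # and print its fingerprint (the derived key is read from stdin)
    ssh-keygen -y -f /path/to/your/azure_vm_key.pem | ssh-keygen -lf -

    # On the Azure VM: list the fingerprints of every key the target user accepts
    ssh-keygen -lf /home/<azure_vm_user>/.ssh/authorized_keys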

2. Quick Fix (CLI)

Start with these immediate checks and commands on your Ansible control node and, if accessible, directly on the Azure VM.

On your Ansible Control Node:

  1. Verify Private Key Permissions: SSH requires strict permissions on private key files.

    chmod 600 /path/to/your/azure_vm_key.pem

    Replace /path/to/your/azure_vm_key.pem with the actual path to your private key.

  2. Add Key to SSH Agent: This ensures your SSH client can easily find and use the key.

    eval "$(ssh-agent -s)"
    ssh-add /path/to/your/azure_vm_key.pem

    If ssh-add prompts for a passphrase, enter it.

  3. Test SSH Connectivity Manually: Before involving Ansible, confirm you can connect directly via SSH. This is a crucial step to isolate the problem.

    ssh -i /path/to/your/azure_vm_key.pem <azure_vm_user>@<azure_vm_public_ip_or_hostname>
    • Replace <azure_vm_user> (e.g., azureuser) and <azure_vm_public_ip_or_hostname>.
    • If this manual SSH command fails, the problem is with your core SSH setup, not Ansible itself. The error message from SSH will be more specific (e.g., “Permission denied (publickey)”).
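
  4. Debug a Failing Manual Connection: If the manual SSH test fails and the error message alone is not enough, re-run it with client-side debugging to see which keys are offered and why the server rejects them. The paths and user below are the same placeholders as above; the grep filter is optional and only narrows the output to the authentication exchange.

    ssh -vvv -i /path/to/your/azure_vm_key.pem <azure_vm_user>@<azure_vm_public_ip_or_hostname> exit 2>&1 | grep -iE "offering|denied|authentications"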

On the Azure VM (if you can access it via another method, e.g., Azure Bastion, Serial Console, or an existing working key):

  1. Verify authorized_keys Permissions and Ownership: Reference the target user's home directory explicitly (e.g., /home/azureuser), since ~ expands to the home of whichever account you are logged in as via Bastion or the Serial Console.
    sudo ls -ld /home/<azure_vm_user>/.ssh
    sudo ls -l /home/<azure_vm_user>/.ssh/authorized_keys
    • The .ssh directory should have drwx------ (700) permissions.
    • The authorized_keys file should have -rw------- (600) permissions.
    • Both must be owned by the connecting user.
    sudo chmod 700 /home/<azure_vm_user>/.ssh
    sudo chmod 600 /home/<azure_vm_user>/.ssh/authorized_keys
    sudo chown <azure_vm_user>:<azure_vm_user> /home/<azure_vm_user>/.ssh /home/<azure_vm_user>/.ssh/authorized_keys
    Replace <azure_vm_user> with the actual user (e.g., azureuser).
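
  2. Append the Missing Public Key: If the correct public key is absent from authorized_keys entirely, add it while you still have out-of-band access. A sketch with placeholder key material, paths, and Azure resource names:

    # On the control node: print the public key that matches your private key
    ssh-keygen -y -f /path/to/your/azure_vm_key.pem

    # On the Azure VM (via Bastion or Serial Console): append that single line
    echo "ssh-rsa AAAA...<public_key>... comment" >> /home/<azure_vm_user>/.ssh/authorized_keys

    # Alternative from any machine with the Azure CLI installed and logged in
    az vm user update --resource-group <resource_group> --name <vm_name> \
        --username <azure_vm_user> \
        --ssh-key-value "$(ssh-keygen -y -f /path/to/your/azure_vm_key.pem)"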

3. Configuration Check

If the quick fixes don’t resolve the issue, dive into the configurations on both your Ansible control node and the Azure VM.

On your Ansible Control Node:

  1. Ansible Inventory File (e.g., inventory.ini or hosts.yml): Ensure the ansible_user and ansible_private_key_file are correctly specified.

    INI Format:

    [azure_vms]
    my_azure_vm ansible_host=<Azure_VM_Public_IP> ansible_user=azureuser ansible_private_key_file=/path/to/your/azure_vm_key.pem

    YAML Format:

    all:
      hosts:
        my_azure_vm:
          ansible_host: <Azure_VM_Public_IP>
          ansible_user: azureuser
          ansible_private_key_file: /path/to/your/azure_vm_key.pem
    • Crucial: The ansible_user must be an existing user on the Azure VM.
    • The ansible_private_key_file must point to the correct private key on your control node.
  2. Ansible Playbook/Ad-hoc Command Line: If not using an inventory, ensure you’re passing the correct arguments.

    ansible all -i <Azure_VM_Public_IP>, -m ping -u azureuser --private-key /path/to/your/azure_vm_key.pem
    (Note the host pattern all: the trailing comma makes Ansible treat the address as an inline inventory, and all selects every host in it.)
  3. SSH Client Configuration (~/.ssh/config): If you’re using an SSH config file, ensure the settings for your Azure VM are correct and not overriding Ansible’s parameters.

    Host my_azure_vm
        Hostname <Azure_VM_Public_IP_or_DNS>
        User azureuser
        IdentityFile /path/to/your/azure_vm_key.pem
        IdentitiesOnly yes # Prevents SSH agent from offering too many keys
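
  4. Verify the Effective SSH Client Settings: To confirm that nothing in ~/.ssh/config overrides the parameters you pass to Ansible, dump the configuration the SSH client will actually apply for that host and test the alias directly. This sketch assumes the my_azure_vm alias defined above:

    # Show the effective client configuration for the alias
    ssh -G my_azure_vm | grep -iE "^(hostname|user|identityfile|identitiesonly)"

    # Confirm the alias itself authenticates before retrying Ansible
    ssh my_azure_vm whoami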

On the Azure VM (Requires SSH access via another means):

  1. Public Key in ~/.ssh/authorized_keys:

    • Content: Ensure the public key on the VM exactly matches the public part of the private key you’re using on your control node. You can get the public key from your private key:
      ssh-keygen -y -f /path/to/your/azure_vm_key.pem
      Copy this output and compare it to the contents of ~/.ssh/authorized_keys on the Azure VM for the specific user.
    • Location: The file should be /home/<azure_vm_user>/.ssh/authorized_keys.
  2. SSH Daemon Configuration (/etc/ssh/sshd_config): Inspect the sshd_config file on the Azure VM for restrictive settings.

    sudo grep -E "PasswordAuthentication|PubkeyAuthentication|AllowUsers|PermitRootLogin" /etc/ssh/sshd_config

    Look for:

    • PasswordAuthentication no: This is common and good practice; it means you must use key-based authentication. If Ansible isn’t providing a valid key, this will cause a failure.
    • PubkeyAuthentication yes: This must be enabled for key-based authentication to work.
    • AllowUsers <user1> <user2>: If present, ensure your ansible_user is listed. If your user is not listed, they will be denied.
    • PermitRootLogin no: If you’re attempting to connect as root, this will prevent it. It’s generally recommended to connect as a regular user and use sudo.

    If you make changes to sshd_config, validate the file first (see the note below) and then restart the SSH service:

    sudo systemctl restart sshd   # the service unit may be named "ssh" on Debian/Ubuntu
    # or (for older systems)
    sudo service sshd restart
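
    To validate sshd_config before restarting, run sshd in test mode; a typo in this file can otherwise lock you out of SSH entirely:

    # No output and a zero exit status mean the configuration parses cleanly
    sudo sshd -t
    # On some distributions sshd is not on the default PATH; use the full path, e.g. sudo /usr/sbin/sshd -t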

4. Verification

After performing the checks and applying any necessary fixes, verify the connection using Ansible:

ansible -i inventory.ini my_azure_vm -m ping

(If using an inventory file configured above)

Or, for an ad-hoc test:

ansible all -i <Azure_VM_Public_IP>, -m ping -u azureuser --private-key /path/to/your/azure_vm_key.pem

Expected Successful Output:

my_azure_vm | SUCCESS => {
    "changed": false,
    "ping": "pong"
}

If you receive this output, your Ansible “403 Forbidden” issue has been resolved, and Ansible can now successfully communicate with your Azure VM. If the problem persists, re-evaluate each step methodically, paying close attention to the exact error messages provided by SSH or Ansible.
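
When re-evaluating, the fastest way to see the exact SSH command Ansible runs and the server's response is to increase verbosity. A sketch using the same inventory and host alias as above:

# -vvv prints the full SSH invocation and the authentication exchange
ansible -i inventory.ini my_azure_vm -m ping -vvv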