Running scheduled scripts with API keys

SOLVED
NJNetworkGuy100
Getting noticed

I have a bunch of scripts with Dashboard API calls in them that I want to run on a regular schedule (most are scripts that grab some info via API, and send custom emails based on that data).  Any tips on a secure way to do this, since it involves API keys with access to different orgs? 

 

I am using Python for the Meraki API scripts, and have access to VMware or Azure VMs.

 

Are there any programs or utilities I can use to store our API keys for these scripts?  Any documentation or examples I can use or follow?  

 

Really just getting started using these scripts in this fashion, and looking for any help.  

 

Thanks!


4 REPLIES 4
PhilipDAth
Kind of a big deal

This is my proposal:

https://community.meraki.com/t5/Developers-APIs/A-newer-safer-way-to-access-the-dashboard-API/m-p/69... 

 

You can also use an environment variable.

https://developer.cisco.com/meraki/api-v1/#!python/usage

A nice way to do this is to create a user to run the scripts (such as "API"), and then define the environment variable in that user's account.  Then it is available only to that user and to everything that runs as that user.

You could also use the same approach with my older system.  You just put the config into that user's home directory.
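
To illustrate the environment-variable approach from the usage link above: the Meraki Python SDK reads the key from MERAKI_DASHBOARD_API_KEY when no api_key argument is passed, so a minimal sketch (run as the dedicated "API" user, with the variable set in that user's profile) looks like this:

import meraki

# No api_key argument - the SDK picks up the MERAKI_DASHBOARD_API_KEY
# environment variable, so the key never appears in the script itself.
dashboard = meraki.DashboardAPI()

# Example call: list the organizations this key can access.
for org in dashboard.organizations.getOrganizations():
    print(org['id'], org['name'])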

User-specific environment variables, with a dedicated maintenance account on that VM... that's not a bad idea.  I keep hearing about environment variables, but I always assumed people were talking about system-wide ones.

 

I'll give this a try, and also try out your other proposal to see how that goes...

 

Thanks!

We actually found another way to handle this via Azure.

 

Our Azure engineer didn't like the environment variable option... they felt it was still too insecure a place to keep passwords/keys.

 

We ended up using Azure Key Vault and an Azure VM.

 

So, we created an Azure Windows Server 2019 VM and installed all the Python programs and modules needed on it.

 

Then, we created an Azure Key Vault to store the various passwords and secrets (the VM and the Key Vault are in the same Resource Group for simpler management).

 

We use a hybrid on-prem AD and Azure environment, so this setup was easier because of that.

 

We ended up giving our AD accounts access to the Key Vault, AND we gave the Azure VM itself access to the Key Vault via RBAC (basically, via a Managed Identity).

 

On the VM and on your local laptop, you need to install the azure-keyvault-secrets and azure-identity packages.

 

azure · PyPI

 

azure-identity · PyPI

 

 

This website gives a great breakdown on how the Azure Identity module connects your code to Azure.

 

Authenticate to Azure from Python | Thomas Stringer (trstringer.com)

 

Example code:

import os  # used to read the Key Vault name from an environment variable
from azure.keyvault.secrets import SecretClient
from azure.identity import DefaultAzureCredential

# Code that needs to be in your script to connect to the Key Vault.
# On the Azure VM this authenticates via Managed Identity; on a laptop it
# falls back to your local Azure login.
azureCredentials = DefaultAzureCredential()

# We put the Key Vault name in an environment variable on the local machine.
keyVaultName = os.environ.get('KeyVault')

# The Key Vault URI, from its Azure settings.
KVUri = f'https://{keyVaultName}.vault.azure.net'

# Connect to the Key Vault.  Keep the client in a module-level variable if you
# need to call it from anywhere in your code.
client = SecretClient(vault_url=KVUri, credential=azureCredentials)

# You can then retrieve secrets from the Key Vault to use when connecting to
# various systems.
retrievedSecret = client.get_secret('SECRETNAME')
password = retrievedSecret.value

 

On the Azure VM, it will use managed identity to connect to Azure.  On your local laptop, it will use your local user credentials to connect to Azure, which is usually a GUI pop-up.  

 

We were then able to use a local admin account to run our scheduled scripts on the Azure VM via Task Scheduler.

 

And because of the way the Azure-Identity module works, we don't have to modify our code a single bit to connect to Azure and Key Vault, whether it is running on my local laptop, or on the Azure VM with a local non-domain admin account.  
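
As a quick tie-back to the original question, here is a hedged sketch of how the retrieved secret can feed the Meraki SDK (the secret name 'MerakiApiKey' is just an illustrative placeholder, not the name we actually use):

import os
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
import meraki

# Same Key Vault connection as in the example above: Managed Identity on the
# Azure VM, your own Azure login on a laptop.
keyVaultName = os.environ.get('KeyVault')
client = SecretClient(vault_url=f'https://{keyVaultName}.vault.azure.net',
                      credential=DefaultAzureCredential())

# 'MerakiApiKey' is an example secret name - use whatever you created in the vault.
merakiApiKey = client.get_secret('MerakiApiKey').value

# Hand the key straight to the Meraki SDK, so it never sits on the VM's disk.
dashboard = meraki.DashboardAPI(api_key=merakiApiKey)
orgs = dashboard.organizations.getOrganizations()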

 

If you use VS Code, there is an Azure Account extension you can use to connect your VS Code program to Azure to use the Azure-Identity module.  

 

 

Hope this helps some folks out there.  

RomanMD
Building a reputation

I do two or three different things:

1. store the key in an environment variable;

2. store it in the database, in a settings table (I usually use this when the scripts run under Django);

3. either way, I run the key through an encryption/decryption routine, so it is not stored in plain text (see the sketch below).  That does not make the key fully safe, since anyone with access to the algorithm can still decrypt it, but on its own the encrypted API key is not much use to anyone.
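
A minimal sketch of that encrypt/decrypt idea, using the Fernet symmetric cipher from the cryptography package (names and values are only illustrative):

from cryptography.fernet import Fernet

# Generate this once and keep it somewhere separate from the encrypted value
# (environment variable, file with tight permissions, etc.).
encryption_key = Fernet.generate_key()
cipher = Fernet(encryption_key)

# Encrypt the API key before storing it (e.g. in a settings table).
stored_value = cipher.encrypt(b'my-dashboard-api-key')

# Decrypt at runtime when the script needs it.
api_key = cipher.decrypt(stored_value).decode()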
