Bulk Install Browser Extensions in Indigo X Profiles via API
This guide provides a comprehensive walkthrough of using a Python script to automate the installation of your favorite browser extensions into multiple Indigo X browser profiles.
🎯 Goal
Automate the installation of a list of browser extensions into many Indigo X profiles, either during creation or as an update to existing ones.
How it Works
The Python script intelligently interacts with the Indigo X local API to:
- 🔐 Securely Authenticate: Logs into your Indigo X account to start a session.
- 💾 Cache Session Tokens: Caches your automation token to minimize login calls and speed up subsequent runs.
- 🚀 Create or Update Profiles: Based on your configuration, it will either:
- Create brand-new profiles with your specified extensions pre-installed.
- Search for existing profiles and update them to include the specified extensions.
- ⏳ Handle API Limits: Gracefully manages API rate limits with an automatic retry mechanism to ensure stability.
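The retry behavior in the last point is a simple exponential backoff: after an HTTP 429 response, the script waits base_wait_time * 2 ** attempt seconds before trying again. A minimal sketch of the resulting schedule, mirroring the defaults used in the script below:

```python
def backoff_schedule(base_wait_time=5, max_retries=3):
    """Seconds to wait before each retry after a 429 response:
    base_wait_time * 2 ** attempt, doubling on every attempt."""
    return [base_wait_time * (2 ** attempt) for attempt in range(max_retries)]

print(backoff_schedule())  # [5, 10, 20]
```

With the defaults, a request that keeps hitting the rate limit is retried after 5, 10, and finally 20 seconds before the script gives up.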
Prerequisites
Before you begin, ensure you have the following ready:
- Python 3.6+: If you don't have Python, download it from the official python.org website.
- Requests Library: The script uses this library to communicate with the Indigo X API. Install it by opening your command prompt or terminal and running this command:
  pip install requests
- Indigo X Account: An active account with the Indigo X application running.
- Extension Files (Unpacked): The script needs the source files of your extensions. The process differs slightly for Chromium- and Firefox-based profiles.
  - For Mimic (Chromium-based) profiles: You need the unpacked folder of the extension.
    - Use a "CRX Extractor/Downloader" tool to download an extension from the Chrome Web Store as a .zip file.
    - Extract the contents of the .zip file into a dedicated folder. The script will need the full path to this folder.
  - For Stealthfox (Firefox-based) profiles: You need the .xpi file of the extension.
    - Navigate to the Firefox Browser Add-ons page for your desired extension.
    - Right-click the "Add to Firefox" button.
    - Select "Save Link As..." to download the extension as an .xpi file. The script will need the full path to this file.
Step 1: The Python Script
Save the following Python code as indigo_extension_manager.py in a new folder on your computer. This script is the engine of our automation process.
import hashlib
import json
import os
import sys
import time

import requests

# --- Constants ---
API_BASE_URL = "https://api.indigobrowser.com"
CONFIG_FILE_NAME = "config.json"
TOKEN_LIFETIME_SECONDS = 23 * 60 * 60  # 23 hours for a 24h token, for safety


def load_and_validate_config():
    if not os.path.exists(CONFIG_FILE_NAME):
        sys.exit(f"Error: Configuration file '{CONFIG_FILE_NAME}' not found.")
    with open(CONFIG_FILE_NAME, 'r') as f:
        try:
            config = json.load(f)
        except json.JSONDecodeError:
            sys.exit(f"Error: Could not parse '{CONFIG_FILE_NAME}'.")

    # Common keys are always required
    required_keys = ["email", "password", "workspace_name", "extension_paths", "action"]
    missing_keys = [key for key in required_keys if key not in config]
    if missing_keys:
        sys.exit(f"Error: Config is missing required keys: {', '.join(missing_keys)}")

    action = config.get("action")
    if action == "create":
        if "create_new_profiles_config" not in config:
            sys.exit("Error: 'create_new_profiles_config' section is missing for 'create' action.")
        required_action_keys = ["base_name", "count", "folder_name", "os_type", "browser_type"]
        cfg_section = config["create_new_profiles_config"]
        section_name = "create_new_profiles_config"
    elif action == "update":
        if "update_existing_profiles_config" not in config:
            sys.exit("Error: 'update_existing_profiles_config' section is missing for 'update' action.")
        required_action_keys = ["selection_method"]
        cfg_section = config["update_existing_profiles_config"]
        section_name = "update_existing_profiles_config"
        if cfg_section.get("selection_method") == "by_folder" and "folder_name" not in cfg_section:
            sys.exit(f"Error: Config section '{section_name}' is missing 'folder_name' for 'by_folder' method.")
        elif cfg_section.get("selection_method") == "by_name_search" and "search_text" not in cfg_section:
            sys.exit(f"Error: Config section '{section_name}' is missing 'search_text' for 'by_name_search' method.")
    else:
        sys.exit(f"Error: Invalid action '{action}' in config. Must be 'create' or 'update'.")

    missing_action_keys = [key for key in required_action_keys if key not in cfg_section]
    if missing_action_keys:
        sys.exit(f"Error: Config section '{section_name}' is missing keys: {', '.join(missing_action_keys)}")

    print("Configuration loaded and validated successfully.")
    return config


def save_token_to_config(config_path, token, expiration_timestamp, workspace_id):
    config_data_to_save = {}
    if os.path.exists(config_path):
        with open(config_path, 'r') as f:
            try:
                config_data_to_save = json.load(f)
            except json.JSONDecodeError:
                print(f"Warning: Could not read existing config file ('{config_path}') to save token.")
    config_data_to_save["cached_automation_token"] = token
    config_data_to_save["token_expiration_timestamp"] = expiration_timestamp
    config_data_to_save["workspace_id"] = workspace_id
    with open(config_path, 'w') as f:
        json.dump(config_data_to_save, f, indent=4)
    print("New automation token, expiration, and workspace ID saved to config.json.")


def get_valid_cached_token(config):
    cached_token = config.get("cached_automation_token")
    expiration_ts = config.get("token_expiration_timestamp")
    workspace_id = config.get("workspace_id")
    if cached_token and isinstance(expiration_ts, (int, float)) and workspace_id:
        if time.time() < expiration_ts:
            # Basic check: attempt a lightweight API call to see if the token is active
            try:
                print("Verifying cached token activity...")
                headers = {'Authorization': f'Bearer {cached_token}', 'Accept': 'application/json'}
                verify_response = requests.get(f"{API_BASE_URL}/workspace/restrictions", headers=headers)
                if verify_response.status_code == 200:
                    print("Valid cached automation token found. Using it.")
                    return cached_token, workspace_id
                print(f"Cached token verification failed (Status: {verify_response.status_code}). Re-authenticating.")
                return None, None
            except requests.exceptions.RequestException as e:
                print(f"Error during cached token verification: {e}. Re-authenticating.")
                return None, None
        print("Cached automation token has expired.")
    # No usable cached token; always return a tuple so callers can unpack it
    return None, None


def perform_full_login(config):
    email, password, target_workspace_name = config['email'], config['password'], config['workspace_name']
    hashed_password = hashlib.md5(password.encode()).hexdigest()
    try:
        print("Performing full login to get new automation token...")
        print("Signing in...")
        response = requests.post(f"{API_BASE_URL}/user/signin", json={'email': email, 'password': hashed_password})
        response.raise_for_status()
        data_payload = response.json().get('data', {})
        initial_token, refresh_token = data_payload.get('token'), data_payload.get('refresh_token')
        if not initial_token or not refresh_token:
            print("Error: Sign-in response did not include the expected initial or refresh tokens.")
            return None, None

        print(f"Fetching workspaces to find '{target_workspace_name}'...")
        headers = {'Authorization': f'Bearer {initial_token}', 'Accept': 'application/json'}
        ws_response = requests.get(f"{API_BASE_URL}/user/workspaces", headers=headers)
        ws_response.raise_for_status()
        workspaces_list = ws_response.json().get('data', {}).get('workspaces', [])
        target_workspace = next((ws for ws in workspaces_list if ws.get('name') == target_workspace_name), None)
        if not target_workspace:
            print(f"Error: Could not find workspace named '{target_workspace_name}'.")
            return None, None
        selected_workspace_id = target_workspace.get('workspace_id')
        print(f"Found workspace ID: {selected_workspace_id}")

        print("Activating workspace...")
        refresh_payload = {'email': email, 'refresh_token': refresh_token, 'workspace_id': selected_workspace_id}
        refresh_response = requests.post(f"{API_BASE_URL}/user/refresh_token", headers=headers, json=refresh_payload)
        refresh_response.raise_for_status()
        refreshed_token = refresh_response.json().get('data', {}).get('token')
        if not refreshed_token:
            print("Error: Failed to get a new token for the selected workspace.")
            return None, None

        print("Generating automation token...")
        headers['Authorization'] = f'Bearer {refreshed_token}'
        auto_token_response = requests.get(
            f"{API_BASE_URL}/workspace/automation_token?expiration_period=24h", headers=headers)
        auto_token_response.raise_for_status()
        automation_token = auto_token_response.json().get('data', {}).get('token')
        if not automation_token:
            print("Error: Final automation token not found in response.")
            return None, None

        new_expiration_timestamp = time.time() + TOKEN_LIFETIME_SECONDS
        save_token_to_config(CONFIG_FILE_NAME, automation_token, new_expiration_timestamp, selected_workspace_id)
        print("\nSuccessfully obtained and cached new automation token!")
        return automation_token, selected_workspace_id
    except requests.exceptions.RequestException as e:
        if getattr(e, 'response', None) is not None:
            error_message = f"API Response: {e.response.text}"
        else:
            error_message = str(e)
        print(f"An error occurred during authentication: {error_message}")
        return None, None
    except Exception as e:
        print(f"An unexpected error occurred during authentication: {e}")
        return None, None


def make_api_request(method, url, token, payload=None, max_retries=3, base_wait_time=5):
    """A simple request function with exponential-backoff retry for 429 errors.

    This replaces a full APIManager class for simplicity.
    """
    headers = {
        'Authorization': f'Bearer {token}',
        'Content-Type': 'application/json',
        'Accept': 'application/json'
    }
    for attempt in range(max_retries):
        try:
            if method.upper() == 'POST':
                response = requests.post(url, headers=headers, json=payload or {})
            else:  # GET
                response = requests.get(url, headers=headers)
            if response.status_code == 401:  # Unauthorized, likely an expired token
                print(" -> Token seems to be invalid or expired (401). Will attempt to refresh via main logic.")
                return None  # Signal to the caller to re-authenticate
            if response.status_code == 429:
                wait_time = base_wait_time * (2 ** attempt)
                print(f" -> Rate limit hit (429). Retrying in {wait_time}s... (Attempt {attempt + 1}/{max_retries})")
                time.sleep(wait_time)
                continue
            response.raise_for_status()
            return response
        except requests.exceptions.RequestException as e:
            print(f" -> An API error occurred: {e}")
            # Retrying won't help for client errors such as 400
            if (getattr(e, 'response', None) is not None
                    and 400 <= e.response.status_code < 500
                    and e.response.status_code not in (401, 429)):
                break
    print(f" -> Request failed after {max_retries} attempts for URL: {url}")
    return None


def get_folder_id_by_name(config, token, folder_name):
    print(f"\nFetching folders to find '{folder_name}'...")
    try:
        response = make_api_request('GET', f"{API_BASE_URL}/workspace/folders", token)
        if response:
            folders_list = response.json().get('data', {}).get('folders', [])
            target_folder = next((f for f in folders_list if f.get('name') == folder_name), None)
            if target_folder:
                folder_id = target_folder.get('folder_id')
                print(f"Found existing folder with ID: {folder_id}")
                return folder_id
    except Exception as e:
        print(f"Warning: Could not fetch existing folders: {e}. Will attempt to create.")
    return None


def create_folder(config, token, folder_name):
    print(f"Folder '{folder_name}' not found or error fetching. Creating it now...")
    try:
        create_payload = {"name": folder_name, "comment": ""}
        response = make_api_request('POST', f"{API_BASE_URL}/workspace/folder_create", token, payload=create_payload)
        if response:
            new_folder_id = response.json().get('data', {}).get('id')
            if new_folder_id:
                print(f"Successfully created folder with new ID: {new_folder_id}")
                return new_folder_id
    except Exception as e:
        print(f"Fatal Error: Could not create the folder. {e}")
    return None


def handle_new_profiles(config, token):
    print("\n--- Running Action: Create New Profiles ---")
    create_config = config['create_new_profiles_config']
    folder_name = create_config['folder_name']
    folder_id = get_folder_id_by_name(config, token, folder_name)
    if not folder_id:
        folder_id = create_folder(config, token, folder_name)
    if not folder_id:
        print("Fatal Error: Folder could not be found or created. Aborting profile creation.")
        return

    comma_separated_paths = ",".join(config['extension_paths'])
    cmd_params = {"params": [{"flag": "load-extension", "value": comma_separated_paths}]}

    # Simple sequential processing with a fixed delay between requests
    delay_between_requests = 2
    for i in range(1, create_config['count'] + 1):
        profile_name = f"{create_config['base_name']}{i}"
        payload = {
            "name": profile_name, "browser_type": create_config['browser_type'],
            "os_type": create_config['os_type'], "folder_id": folder_id,
            "parameters": {
                "fingerprint": {"cmd_params": cmd_params},
                "storage": {"is_local": False}
            }
        }
        print(f"({i}/{create_config['count']}) Creating profile '{profile_name}'...")
        response = make_api_request('POST', f"{API_BASE_URL}/profile/create", token, payload=payload)
        if response and response.status_code < 400:
            print(" -> Success!")
        else:
            print(f" -> Failed to process profile '{profile_name}'.")  # Error details come from make_api_request
        if i < create_config['count']:
            time.sleep(delay_between_requests)
    print("\nAll profile creation requests processed.")


def handle_existing_profiles(config, token):
    print("\n--- Running Action: Update Existing Profiles ---")
    update_config = config['update_existing_profiles_config']
    search_payload = {
        "limit": 100, "offset": 0, "is_removed": False,
        "storage_type": "all", "search_text": ""
    }
    selection_method = update_config.get("selection_method")
    if selection_method == "by_folder":
        folder_name_to_search = update_config.get("folder_name")
        if not folder_name_to_search:
            sys.exit("Error: 'folder_name' is missing for 'by_folder' selection.")
        folder_id = get_folder_id_by_name(config, token, folder_name_to_search)
        if not folder_id:
            print(f"Error: Folder '{folder_name_to_search}' not found. Cannot update profiles.")
            return
        search_payload["folder_id"] = folder_id
    elif selection_method == "by_name_search":
        search_text_to_use = update_config.get("search_text")
        if search_text_to_use is None:
            sys.exit("Error: 'search_text' is missing for 'by_name_search' selection.")
        search_payload["search_text"] = search_text_to_use
    else:
        sys.exit(f"Error: Invalid selection_method '{selection_method}'.")

    try:
        response = make_api_request('POST', f"{API_BASE_URL}/profile/search", token, payload=search_payload)
        if not response:
            return
        profiles_to_update = response.json().get('data', {}).get('profiles', [])
        if not profiles_to_update:
            print("No profiles found matching your criteria.")
            return
    except Exception as e:
        print(f"Error searching for profiles: {e}")
        return

    profile_names_found = [p.get('name', 'Unnamed Profile') for p in profiles_to_update]
    print(f"Found {len(profiles_to_update)} profiles to update: {profile_names_found}")
    comma_separated_paths = ",".join(config['extension_paths'])
    cmd_params = {"params": [{"flag": "load-extension", "value": comma_separated_paths}]}

    delay_between_requests = 2  # Simple fixed delay
    for idx, profile in enumerate(profiles_to_update):
        profile_id = profile.get('id')
        profile_name_to_update = profile.get('name', 'Unnamed Profile')
        if not profile_id:
            print(f"Warning: Skipping profile with missing ID: {profile}")
            continue
        print(f"({idx + 1}/{len(profiles_to_update)}) Updating profile '{profile_name_to_update}'...")
        payload = {"profile_id": profile_id, "parameters": {"fingerprint": {"cmd_params": cmd_params}}}
        response = make_api_request('POST', f"{API_BASE_URL}/profile/partial_update", token, payload=payload)
        if response and response.status_code < 400:
            print(" -> Success!")
        else:
            print(f" -> Failed to process profile '{profile_name_to_update}'.")
        if idx < len(profiles_to_update) - 1:
            time.sleep(delay_between_requests)
    print("\nAll profile update requests processed.")


def main():
    config = load_and_validate_config()
    automation_token, workspace_id = get_valid_cached_token(config)
    if not automation_token:
        print("No valid cached token found or token expired. Proceeding with full login.")
        token_data = perform_full_login(config)
        if not token_data or not token_data[0]:
            sys.exit("Failed to obtain automation token. Exiting.")
        automation_token, workspace_id = token_data
        # perform_full_login already saved the token and workspace_id to config.json
    else:
        print("Using cached automation token.")

    # Store the validated token and workspace_id back into the runtime config
    config['api_token'] = automation_token
    config['workspace_id'] = workspace_id
    print("\n--- Authentication Complete ---")

    action = config.get("action")
    if action == "create":
        handle_new_profiles(config, automation_token)
    elif action == "update":
        handle_existing_profiles(config, automation_token)


if __name__ == "__main__":
    main()
Step 2: Prepare Your config.json
In the same folder as your Python script, create a file named config.json. This file tells the script exactly what to do. Copy and paste the template below and modify it with your details.
{
    "email": "you@example.com",
    "password": "YourIndigoXPassword",
    "workspace_name": "My Main Workspace",
    "action": "create",
    "extension_paths": [
        "/Users/YourUser/indigo_extensions/ublock_unpacked",
        "/Users/YourUser/indigo_extensions/buster.xpi"
    ],
    "create_new_profiles_config": {
        "base_name": "New-Profile-",
        "count": 2,
        "folder_name": "Automated Profiles",
        "os_type": "windows",
        "browser_type": "mimic"
    },
    "update_existing_profiles_config": {
        "selection_method": "by_folder",
        "folder_name": "Profiles To Update",
        "search_text": ""
    },
    "cached_automation_token": null,
    "token_expiration_timestamp": null,
    "workspace_id": null
}
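If you want to double-check the file before handing it to the script, a quick sketch like the one below (a simplified version of the validation the script runs on startup) catches the most common mistakes:

```python
import json

REQUIRED_KEYS = {"email", "password", "workspace_name", "extension_paths", "action"}
SECTION_FOR_ACTION = {
    "create": "create_new_profiles_config",
    "update": "update_existing_profiles_config",
}


def quick_config_check(config):
    """Return 'ok' or a short description of the first problem found."""
    missing = sorted(REQUIRED_KEYS - config.keys())
    if missing:
        return f"missing keys: {', '.join(missing)}"
    section = SECTION_FOR_ACTION.get(config["action"])
    if section is None:
        return f"invalid action: {config['action']!r}"
    if section not in config:
        return f"missing section: {section}"
    return "ok"
```

For example, loading your file with json.load and passing the result to quick_config_check will immediately flag a misspelled "action" value or a missing config section.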
Configuration Details
- "email": Your Indigo X login email.
- "password": Your Indigo X login password.
  - 🛡️ Security Note: This is stored in plain text. Ensure this file is kept in a secure location and is not shared or committed to public version control.
- "workspace_name": The exact name of the Indigo X workspace you want to use, as seen in the Indigo application.
- "action": Determines the script's operation.
  - Set to "create" to generate new profiles with extensions.
  - Set to "update" to add extensions to existing profiles.
- "extension_paths": A list of strings. Each string must be the full absolute path to either:
  - A folder containing an unpacked Mimic (Chromium) extension.
  - An .xpi file for a Stealthfox (Firefox) extension.
- "create_new_profiles_config": Used only when "action" is "create".
  - "base_name": A prefix for new profile names (e.g., "Marketing-" creates "Marketing-1", "Marketing-2").
  - "count": The number of new profiles to create.
  - "folder_name": The folder where new profiles will be created (the script creates it if it doesn't exist). By default, new profiles are stored in the cloud (is_local: false).
  - "os_type": OS for new profiles ("windows", "macos", "linux", "android").
  - "browser_type": Browser for new profiles ("mimic", "stealthfox").
- "update_existing_profiles_config": Used only when "action" is "update".
  - "selection_method": How to find profiles to update.
    - "by_folder": Targets all profiles within the specified "folder_name".
    - "by_name_search": Targets all profiles matching the specified "search_text".
  - "folder_name": The name of the folder to target (if using "by_folder").
  - "search_text": Text to search for in profile names (if using "by_name_search").
- "cached_automation_token", "token_expiration_timestamp", "workspace_id": These are managed automatically by the script. You can leave them as null initially.
⚠️ Important: How Updates Work
When you run the script with the "update" action, it replaces all existing command-line parameters on the target profiles. This means your "extension_paths" list in the config file should always contain the complete and final list of all extensions you want the profiles to have. Any extensions previously installed on a profile via this method, but not included in the new list, will be removed.
💡 Tip: Changing Credentials
If you want to change your email or password, you must clear the script's authentication cache to force a fresh login. Simply reset the cached fields in your config.json back to null:
"cached_automation_token": null,
"token_expiration_timestamp": null,
"workspace_id": null
The script will then perform a full, new login on its next run with your updated credentials.