Initializing the Client
The AIGuardClient is the main entry point for all AI Guard SDK operations. It manages authentication, connection settings, and context for classification, redaction, and metrics requests.
Basic Initialization
```python
from ai_guard import AIGuardClient
from ai_guard.api import AIPlatform

client = AIGuardClient(
    "https://ai-guard.example.com:4443",
    token="your-api-key",
    agent_id="my-agent",
    platform=AIPlatform.AMAZON_BEDROCK,
)
```

Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| url | str | Yes | Base URL of your AI Guard service (e.g., https://ai-guard.example.com:4443) |
| token | str | Yes | OneTrust API key with Data Discovery scope |
| agent_id | str | Yes | Unique identifier for your AI agent or application |
| platform | AIPlatform | Yes | The AI platform your application uses |
| pin_sha256 | str | No | Base64-encoded SHA-256 hash of the server certificate's public key for certificate pinning |
| session | requests.Session | No | Custom requests.Session for advanced TLS configuration |
Supported Platforms
The platform parameter identifies the AI platform powering your application. This value is included in all metrics events for attribution.
```python
from ai_guard.api import AIPlatform

# Available platforms:
AIPlatform.AMAZON_BEDROCK
AIPlatform.AMAZON_SAGEMAKER
AIPlatform.AZURE_FOUNDRY
AIPlatform.DATABRICKS
AIPlatform.GCP_VERTEX
```

Using Environment Variables
For production deployments, load configuration from environment variables:
```python
import os

from ai_guard import AIGuardClient
from ai_guard.api import AIPlatform

client = AIGuardClient(
    os.environ["AI_GUARD_URL"],
    token=os.environ["AI_GUARD_TOKEN"],
    agent_id=os.environ.get("AI_GUARD_AGENT_ID", "my-agent"),
    platform=AIPlatform.AMAZON_BEDROCK,
)
```

Certificate Pinning
When your AI Guard service uses self-signed or internally signed TLS certificates, use certificate pinning to verify the server's identity without managing a full CA chain:
```python
client = AIGuardClient(
    "https://ai-guard.example.com:4443",
    token="your-api-key",
    agent_id="my-agent",
    platform=AIPlatform.AMAZON_BEDROCK,
    pin_sha256="x48Lk2iu3R3nAhSiz07bExGHTusDRjHqBx9ArK3cFGE=",
)
```

The pin is a base64-encoded SHA-256 hash of the server certificate's Subject Public Key Info (SPKI). Your administrator provides this value during Light Worker Node deployment.
Pinning Behavior
When pin_sha256 is provided:
- CA chain verification is bypassed; the connection is secured by the pinned key alone
- Hostname verification is skipped; the server URL can use an IP address or localhost
- Certificate rotation is transparent as long as the same key pair is reused
If the key pair changes, the pin_sha256 value must be updated.
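Conceptually, the pin check reduces to hashing the server's DER-encoded SPKI and comparing the result to pin_sha256. A minimal sketch of that computation using only the standard library (spki_pin and example_spki are illustrative, not part of the SDK):

```python
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Return the base64-encoded SHA-256 hash of a DER-encoded SPKI."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode("ascii")

# Illustrative only: stand-in bytes, not a real public key.
example_spki = b"\x30\x82\x01\x22example-spki-bytes"
pin = spki_pin(example_spki)
print(pin)  # a 44-character base64 string
```

Because SHA-256 digests are always 32 bytes, a valid pin is always a 44-character base64 string ending in "=", which is a quick sanity check on any pin value you are handed.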
Extracting the Pin
If you have access to the server certificate, extract the pin with:
```shell
openssl x509 -in server.crt -pubkey -noout \
  | openssl pkey -pubin -outform DER \
  | openssl dgst -sha256 -binary \
  | base64
```

Custom Session
For advanced TLS configurations (e.g., custom CA bundles, mutual TLS), pass a pre-configured requests.Session:
```python
import requests

from ai_guard import AIGuardClient
from ai_guard.api import AIPlatform

session = requests.Session()
session.verify = "/path/to/corporate-ca-bundle.pem"

client = AIGuardClient(
    "https://ai-guard.example.com:4443",
    token="your-api-key",
    agent_id="my-agent",
    platform=AIPlatform.AMAZON_BEDROCK,
    session=session,
)
```
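Mutual TLS follows the same pattern: configure the client certificate on the session before passing it to the client. A hedged sketch, where all file paths are placeholders for your organization's actual credentials:

```python
import requests

session = requests.Session()
# Placeholder path: CA bundle used to verify the server's certificate.
session.verify = "/path/to/corporate-ca-bundle.pem"
# Placeholder paths: client certificate and private key presented for mTLS.
session.cert = ("/path/to/client.crt", "/path/to/client.key")
```

The configured session is then passed to AIGuardClient via the session parameter, exactly as in the custom CA bundle example above.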
Note
session and pin_sha256 are mutually exclusive. If you provide your own requests.Session, the pin_sha256 parameter is ignored. The SDK assumes you have already configured TLS on your session.
Automatic Context Injection
The client automatically injects agent_id and platform into the context of all classification and metrics requests. You do not need to include them manually when building requests.
What's Next?
- Classify Text → Send text for classification
- Redact Sensitive Data → Apply redaction policies
- Streaming Classification → Process streaming LLM responses