This guide walks you through creating a web-based chatbot with Azure and the DeepSeek API. The chatbot schedules meetings, answers queries about upcoming meetings, and displays processed tasks. It uses Azure App Service for the web interface, Azure Functions for background processing, and Azure Cosmos DB for storage. Source code is available on GitHub.
What it does: Users can submit tasks (e.g., "Schedule a meeting for tomorrow at 10 AM") via a web form, which are stored in Cosmos DB. An Azure Function processes tasks using the DeepSeek API to refine them into structured JSON (e.g., task name, date, priority). Users can query "When is the next meeting?" to see the earliest upcoming meeting, and the interface displays processed meetings.
What it does not do: No complex NLP, notifications, real-time chat, or additional metadata. Input validation ensures only meeting-related tasks are accepted.
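For instance, a submission like "Schedule a meeting for tomorrow at 10 AM" gets refined into a JSON document of roughly this shape (the field values below are illustrative, not output from the real API):

```python
import json

# Illustrative shape of a refined task document (values are examples only)
task = {
    "task_name": "Schedule a meeting",
    "due_date": "2025-07-19T10:00:00Z",  # ISO 8601 timestamp, UTC
    "priority": "Medium",                # one of "Low", "Medium", "High"
    "processed": True                    # set by the background Azure Function
}
print(json.dumps(task, indent=2))
```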
Prerequisites: Free Azure account, DeepSeek API key, Azure CLI, Python 3.8+, and a code editor.
Create accounts to access required services.
Install tools to deploy and manage the application.
az login
to authenticate with Azure.
Organize files for the Flask app and Azure Function.
mkdir meeting-scheduler
cd meeting-scheduler
mkdir -p app/templates check_tasks
touch setup-env.sh app/app.py app/requirements.txt app/templates/index.html check_tasks/task-checker-fn.py check_tasks/requirements.txt check_tasks/function.json check_tasks/host.json
Define environment variables for Azure and DeepSeek.
#!/bin/bash
# setup-env.sh
# Project root directory on the local machine
export PROJDIR=~/py/ai-python-scripts
# Azure resource group for all resources
export AGENTICAI_RG="meeting-scheduler-rg"
# Azure region for resource deployment
export AZURE_LOCATION="eastus"
# Azure subscription ID (will be set dynamically)
export SUBSCRIPTION_ID=""
# Azure CLI version (will be set dynamically)
export AZURE_CLI_VERSION=""
# App Service name for the chatbot interface
export AGENTICAI_APP_SRV="meetingschedulerapp$(date +%s)"
# Python script name for the App Service
export AGENTICAI_APP="agenticai-chat-bot"
# Directory for App Service code
export AGENTICAI_APP_DIR="$PWD/app"
# Path to the App Service deployment zip file
export AGENTICAI_APP_ZIP="$PWD/webapp.zip"
# Port for the App Service application
export AGENTICAI_APP_PORT="8000"
# Function App name for task management
export AGENTICAI_TASK_CHECKER_FN_APP="meetingschedulerfn$(date +%s)"
# Directory for Function App code
export AGENTICAI_FN_DIR="$PWD/check_tasks"
# Path to the Function App deployment zip file
export AGENTICAI_FN_ZIP="$PWD/function-app.zip"
# Python script name for the Function App
export AGENTICAI_FN_SCRIPT="task-checker-fn"
# Timeout for Function App execution (in seconds)
export AGENTICAI_FN_TIMEOUT="600"
# Cosmos DB account name
export COSMOS_DB_ACCOUNT="meetingschedulerdb$(date +%s)"
# Cosmos DB endpoint URL (will be set dynamically)
export COSMOS_DB_ENDPOINT=""
# Cosmos DB primary key (will be set dynamically)
export COSMOS_DB_KEY=""
# Cosmos DB database name
export COSMOS_DATABASE_NAME="taskdb"
# Cosmos DB container name
export COSMOS_CONTAINER_NAME="task"
# Storage Account name for file storage
export STORAGE_ACCOUNT_NAME="agenticaistorage$(date +%s)"
# Storage Account primary key (will be set dynamically)
export STORAGE_ACCOUNT_KEY=""
# Storage Account connection string for Function App (will be set dynamically)
export AGENTICAI_FN_STORAGE=""
# Blob container name for file storage
export STORAGE_CONTAINER_NAME="agenticai-files"
# Application Insights resource name
export APP_INSIGHTS_NAME="agenticai-app-insights"
# Application Insights instrumentation key (will be set dynamically)
export APP_INSIGHTS_KEY=""
# Hugging Face API URL for LLM integration
export HF_API_URL=""
# Hugging Face API key
export HF_API_KEY=""
# DeepSeek API URL for LLM integration
export DS_API_URL="https://api.deepseek.com"
# DeepSeek API key
export DS_API_KEY="your-deepseek-api-key"
# OpenAI resource name (optional)
export AGENTICAI_OPENAI="agenticai-openai"
# OpenAI API key (optional)
export OPENAI_API_KEY=""
# Log level for application logging
export AGENTICAI_LOG_LEVEL="INFO"
# Environment identifier (e.g., dev, prod)
export AGENTICAI_ENV="dev"
# Environment BUILD during deployment
export SCM_DO_BUILD_DURING_DEPLOYMENT="true"
# Ensure Azure CLI is installed
if ! command -v az &> /dev/null; then
    echo "Azure CLI is not installed. Please install it before running this script."
    exit 1
fi

# Perform az login if not already logged in
echo "Checking Azure CLI login status..."
if ! az account show &> /dev/null; then
    echo "Not logged in. Performing az login..."
    az login
else
    echo "Already logged in to Azure CLI."
fi

# Fetch Azure Subscription ID
echo "Fetching Subscription ID..."
export SUBSCRIPTION_ID=$(az account show --query id --output tsv)
if [ -z "$SUBSCRIPTION_ID" ]; then
    echo "Failed to fetch Subscription ID. Please check your Azure CLI login."
    exit 1
fi
echo "SUBSCRIPTION_ID set to $SUBSCRIPTION_ID"

# Fetch Azure CLI Version (match the azure-cli line exactly, not azure-cli-core)
echo "Fetching Azure CLI Version..."
export AZURE_CLI_VERSION=$(az --version | awk '$1 == "azure-cli" {print $2}')
if [ -z "$AZURE_CLI_VERSION" ]; then
    echo "Failed to fetch Azure CLI Version."
    exit 1
fi
echo "AZURE_CLI_VERSION set to $AZURE_CLI_VERSION"

# Fetch Cosmos DB Endpoint
echo "Fetching Cosmos DB Endpoint..."
export COSMOS_DB_ENDPOINT=$(az cosmosdb show --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --query documentEndpoint --output tsv)
if [ -z "$COSMOS_DB_ENDPOINT" ]; then
    echo "Failed to fetch Cosmos DB Endpoint. Please verify the Cosmos DB account '$COSMOS_DB_ACCOUNT' exists in resource group '$AGENTICAI_RG'."
    exit 1
fi
echo "COSMOS_DB_ENDPOINT set to $COSMOS_DB_ENDPOINT"

# Fetch Cosmos DB Primary Key
echo "Fetching Cosmos DB Primary Key..."
export COSMOS_DB_KEY=$(az cosmosdb keys list --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --query primaryMasterKey --output tsv)
if [ -z "$COSMOS_DB_KEY" ]; then
    echo "Failed to fetch Cosmos DB Primary Key. Please verify the Cosmos DB account '$COSMOS_DB_ACCOUNT' exists in resource group '$AGENTICAI_RG'."
    exit 1
fi
echo "COSMOS_DB_KEY set (redacted for security)"

# Fetch Storage Account Primary Key
echo "Fetching Storage Account Primary Key..."
export STORAGE_ACCOUNT_KEY=$(az storage account keys list --account-name $STORAGE_ACCOUNT_NAME --resource-group $AGENTICAI_RG --query "[0].value" --output tsv)
if [ -z "$STORAGE_ACCOUNT_KEY" ]; then
    echo "Failed to fetch Storage Account Primary Key. Please verify the Storage Account '$STORAGE_ACCOUNT_NAME' exists in resource group '$AGENTICAI_RG'."
    exit 1
fi
echo "STORAGE_ACCOUNT_KEY set (redacted for security)"

# Fetch Storage Account Connection String
echo "Fetching Storage Account Connection String..."
export AGENTICAI_FN_STORAGE=$(az storage account show-connection-string --name $STORAGE_ACCOUNT_NAME --resource-group $AGENTICAI_RG --query connectionString --output tsv)
if [ -z "$AGENTICAI_FN_STORAGE" ]; then
    echo "Failed to fetch Storage Account Connection String. Please verify the Storage Account '$STORAGE_ACCOUNT_NAME' exists in resource group '$AGENTICAI_RG'."
    exit 1
fi
echo "AGENTICAI_FN_STORAGE set (redacted for security)"

# Fetch Application Insights Instrumentation Key
echo "Fetching Application Insights Instrumentation Key..."
export APP_INSIGHTS_KEY=$(az monitor app-insights component show --app $APP_INSIGHTS_NAME --resource-group $AGENTICAI_RG --query instrumentationKey --output tsv)
if [ -z "$APP_INSIGHTS_KEY" ]; then
    echo "Failed to fetch Application Insights Instrumentation Key. Please verify the Application Insights resource '$APP_INSIGHTS_NAME' exists in resource group '$AGENTICAI_RG'."
    exit 1
fi
echo "APP_INSIGHTS_KEY set (redacted for security)"

# Build the Cosmos DB connection string used by the Function trigger
export COSMOS_CONNECTION_STRING="AccountEndpoint=$COSMOS_DB_ENDPOINT;AccountKey=$COSMOS_DB_KEY;"
Replace your-deepseek-api-key with your actual DeepSeek API key. Two caveats: the resource lookups at the end of the script (Cosmos DB, Storage, Application Insights) can only succeed after those resources exist, so expect them to fail the first time you source the script, before the deployment steps below. And because the account names embed $(date +%s), they change every time the script is sourced; record the names actually used when you create the resources.
Download: setup-env.sh
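The script's final line assembles the Cosmos DB connection string from the endpoint and key. The expected shape can be sketched in Python (placeholder values below, not real credentials):

```python
# Placeholder values; the real ones come from the az CLI lookups in setup-env.sh
endpoint = "https://meetingschedulerdb.documents.azure.com:443/"
key = "fake-primary-key"

# Mirrors: export COSMOS_CONNECTION_STRING="AccountEndpoint=$COSMOS_DB_ENDPOINT;AccountKey=$COSMOS_DB_KEY;"
connection_string = f"AccountEndpoint={endpoint};AccountKey={key};"
print(connection_string)
```

The trailing semicolon matters: Azure connection strings are semicolon-delimited key=value pairs.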
Implement the web interface to handle task submissions and queries.
# app/app.py
from flask import Flask, request, render_template
import json
import requests
from azure.cosmos import CosmosClient
import os
from datetime import datetime, date, timedelta
import logging
from time import sleep

app = Flask(__name__)

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# DeepSeek API configuration
DS_API_URL = os.getenv("DS_API_URL")
DS_API_KEY = os.getenv("DS_API_KEY")
logger.info(f"DS_API_URL: {DS_API_URL}, DS_API_KEY: {'set' if DS_API_KEY else 'not set'}")
headers = {"Authorization": f"Bearer {DS_API_KEY}", "Content-Type": "application/json"} if DS_API_KEY else None

def get_cosmos_client():
    """Initialize the Cosmos DB container client on demand."""
    cosmos_endpoint = os.getenv("COSMOS_DB_ENDPOINT")
    cosmos_key = os.getenv("COSMOS_DB_KEY")
    logger.info(f"Cosmos DB Endpoint: {cosmos_endpoint}, Key: {'set' if cosmos_key else 'not set'}")
    if not cosmos_endpoint or not cosmos_key:
        logger.error("Cosmos DB endpoint or key is missing")
        raise Exception("Cosmos DB configuration missing")
    try:
        client = CosmosClient(cosmos_endpoint, cosmos_key)
        database = client.get_database_client(os.getenv("COSMOS_DATABASE_NAME"))
        container = database.get_container_client(os.getenv("COSMOS_CONTAINER_NAME"))
        return container
    except Exception as e:
        logger.error(f"Failed to initialize Cosmos DB: {str(e)}")
        raise

def call_deepseek_api(prompt, max_retries=3, initial_delay=5):
    """Call the DeepSeek API with exponential-backoff retries on 503 errors."""
    logger.info(f"Attempting DeepSeek API call with prompt: {prompt[:50]}...")
    if not DS_API_URL or not DS_API_KEY:
        logger.error("DeepSeek API URL or Key is missing. Using fallback.")
        raise Exception("DeepSeek API configuration missing")
    for attempt in range(max_retries):
        try:
            payload = {
                "model": "deepseek-chat",
                "messages": [
                    {"role": "system", "content": "You are a helpful assistant that outputs JSON."},
                    {"role": "user", "content": prompt}
                ],
                "max_tokens": 200,
                "temperature": 0.5
            }
            logger.info(f"Making request to {DS_API_URL}/v1/chat/completions")
            response = requests.post(DS_API_URL + "/v1/chat/completions", headers=headers, json=payload, timeout=60)
            response.raise_for_status()
            return response.json()
        except requests.exceptions.HTTPError as e:
            if e.response is not None and e.response.status_code == 503:
                logger.warning(f"503 error on attempt {attempt + 1}: {str(e)}")
                # Exponential backoff: 5s, 10s, 20s, ...
                sleep(initial_delay * (2 ** attempt))
                continue
            else:
                logger.error(f"API error: {str(e)}")
                raise Exception(f"API Error: {str(e)}")
        except requests.exceptions.RequestException as e:
            logger.error(f"Request error: {str(e)}")
            raise Exception(f"Request Error: {str(e)}")
    raise Exception("Max retries reached for DeepSeek API")

def parse_task_fallback(task_input):
    """Fallback parsing: default the task to tomorrow at 10 AM, medium priority."""
    logger.info(f"Using fallback parsing for task: {task_input[:50]}")
    try:
        tomorrow = (date.today() + timedelta(days=1)).isoformat() + "T10:00:00Z"
        return {
            "task_name": task_input[:50],
            "due_date": tomorrow,
            "priority": "Medium"
        }
    except Exception as e:
        logger.error(f"Fallback parsing error: {str(e)}")
        raise Exception("Failed to parse task with fallback")

def is_query(input_text):
    """Check if the input is a query (e.g., 'When is the next meeting?')."""
    query_keywords = ["when is", "next meeting", "upcoming meeting"]
    return any(keyword in input_text.lower() for keyword in query_keywords)

def get_next_meeting():
    """Query Cosmos DB for the earliest upcoming meeting."""
    try:
        container = get_cosmos_client()
        query = f"SELECT c.task_name, c.due_date, c.priority, c.processed FROM c WHERE c.due_date >= '{datetime.utcnow().isoformat()}Z' ORDER BY c.due_date ASC"
        tasks = list(container.query_items(query, enable_cross_partition_query=True))
        return tasks[0] if tasks else None
    except Exception as e:
        logger.error(f"Error querying Cosmos DB: {str(e)}")
        raise Exception(f"Failed to query next meeting: {str(e)}")

def get_processed_tasks():
    """Query Cosmos DB for processed tasks."""
    try:
        container = get_cosmos_client()
        query = "SELECT c.task_name, c.due_date, c.priority FROM c WHERE c.processed = true ORDER BY c.due_date ASC"
        return list(container.query_items(query, enable_cross_partition_query=True))
    except Exception as e:
        logger.error(f"Error querying processed tasks: {str(e)}")
        return []

def validate_task_input(input_text):
    """Basic validation to check if input resembles a meeting task."""
    task_keywords = ["schedule", "meeting", "call", "sync", "appointment"]
    time_indicators = ["tomorrow", "today", "at", "on", "next"]
    input_lower = input_text.lower()
    return any(keyword in input_lower for keyword in task_keywords) or any(indicator in input_lower for indicator in time_indicators)

@app.route('/', methods=['GET', 'POST'])
def index():
    """Handle task submission, queries, and display of processed tasks."""
    logger.info("Handling request to /")
    processed_tasks = get_processed_tasks()  # Fetch processed tasks for display
    if request.method == 'POST':
        user_input = request.form['task'].strip()
        logger.info(f"Received input: {user_input}")
        try:
            if is_query(user_input):
                # Handle query for the next meeting
                next_meeting = get_next_meeting()
                if next_meeting:
                    response = f"Next meeting: {next_meeting['task_name']} on {next_meeting['due_date']} (Priority: {next_meeting['priority']}, Processed: {next_meeting['processed']})"
                else:
                    response = "No upcoming meetings found."
                return render_template('index.html', response=response, suggestion=None, processed_tasks=processed_tasks)
            else:
                # Validate task input
                if not validate_task_input(user_input):
                    logger.warning(f"Invalid task input: {user_input}")
                    return render_template('index.html', response="Error: Input does not appear to be a valid meeting task.", suggestion=None, processed_tasks=processed_tasks)
                # Store the task with fallback parsing; the Azure Function refines it in the background
                task_data = parse_task_fallback(user_input)
                task_data['id'] = str(datetime.now().timestamp())
                task_data['created_at'] = datetime.now().isoformat()
                task_data['processed'] = False
                container = get_cosmos_client()
                container.create_item(task_data)
                logger.info(f"Task stored in Cosmos DB: {task_data['id']}")
                return render_template('index.html', response="Task added successfully! Processing in background.", suggestion="Awaiting suggestion from background process.", processed_tasks=processed_tasks)
        except Exception as e:
            logger.error(f"Error processing request: {str(e)}")
            return render_template('index.html', response=f"Error: {str(e)}", suggestion=None, processed_tasks=processed_tasks)
    return render_template('index.html', processed_tasks=processed_tasks)

if __name__ == '__main__':
    logger.info("Starting Flask app")
    port = int(os.getenv("AGENTICAI_APP_PORT", 8000))
    app.run(host="0.0.0.0", port=port, debug=True)
Download: app.py
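The routing in app.py is purely keyword-based, so the two helpers can be exercised on their own. The functions below are copied verbatim from the app above:

```python
def is_query(input_text):
    # Same keyword check as app.py: queries go to the "next meeting" lookup
    query_keywords = ["when is", "next meeting", "upcoming meeting"]
    return any(keyword in input_text.lower() for keyword in query_keywords)

def validate_task_input(input_text):
    # Same meeting-task heuristic as app.py: anything else is stored as a task
    task_keywords = ["schedule", "meeting", "call", "sync", "appointment"]
    time_indicators = ["tomorrow", "today", "at", "on", "next"]
    input_lower = input_text.lower()
    return any(k in input_lower for k in task_keywords) or any(i in input_lower for i in time_indicators)

print(is_query("When is the next meeting?"))                   # True -> query path
print(validate_task_input("Schedule a meeting for tomorrow"))  # True -> task path
print(validate_task_input("Buy groceries"))                    # False -> rejected
```

Note that these are plain substring checks, so short indicators like "at" also match inside unrelated words (e.g., "weather"); that looseness is acceptable for this tutorial's scope.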
Specify Python dependencies for the Flask app.
# app/requirements.txt
Flask==2.0.1
requests==2.26.0
azure-cosmos==4.2.0
Download: requirements.txt
Design the HTML interface for user interaction.
<!-- app/templates/index.html -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Meeting Scheduler Chatbot</title>
</head>
<body>
  <h1>Meeting Scheduler Chatbot</h1>
  <p>This chatbot uses two worker agents (Task Parser and Suggestion) coordinated by a master agent to schedule tasks and provide proactive suggestions.</p>
  <form method="post">
    <input type="text" name="task" placeholder="e.g., Schedule a meeting for tomorrow at 10 AM" size="50" required>
    <button type="submit">Submit</button>
  </form>
  {% if response %}
  <h2>Response:</h2>
  <p>{{ response }}</p>
  {% endif %}
  {% if suggestion %}
  <h2>Suggestion:</h2>
  <p>{{ suggestion }}</p>
  {% endif %}
  {% if processed_tasks %}
  <h2>Upcoming Meetings:</h2>
  <ul>
    {% for task in processed_tasks %}
    <li>Meeting: {{ task.task_name }} on {{ task.due_date }} (Priority: {{ task.priority }})</li>
    {% endfor %}
  </ul>
  {% endif %}
</body>
</html>
Download: index.html
Implement the background task processor using DeepSeek API.
# check_tasks/task-checker-fn.py
import azure.functions as func
import json
import requests
import os
import time
import logging
from datetime import date, timedelta
from azure.cosmos import CosmosClient

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# DeepSeek API configuration
DS_API_URL = os.getenv("DS_API_URL")
DS_API_KEY = os.getenv("DS_API_KEY")
logger.info(f"DS_API_URL: {DS_API_URL}, DS_API_KEY: {'set' if DS_API_KEY else 'not set'}")
headers = {"Authorization": f"Bearer {DS_API_KEY}", "Content-Type": "application/json"} if DS_API_KEY else None

def parse_task_fallback(task_input):
    """Fallback parsing if the DeepSeek API fails."""
    logger.info(f"Using fallback parsing for task: {task_input[:50]}")
    try:
        tomorrow = (date.today() + timedelta(days=1)).isoformat() + "T10:00:00Z"
        return {
            "task_name": task_input[:50],
            "due_date": tomorrow,
            "priority": "Medium",
            "processed": True
        }
    except Exception as e:
        logger.error(f"Fallback parsing error: {str(e)}")
        raise Exception("Failed to parse task with fallback")

def call_deepseek_api(prompt, max_retries=3, initial_delay=5):
    """Call the DeepSeek API with exponential-backoff retries on 503 errors."""
    logger.info(f"Attempting DeepSeek API call with prompt: {prompt[:50]}...")
    if not DS_API_URL or not DS_API_KEY:
        logger.error("DeepSeek API URL or Key is missing. Using fallback.")
        raise Exception("DeepSeek API configuration missing")
    for attempt in range(max_retries):
        try:
            payload = {
                "model": "deepseek-chat",
                "messages": [
                    {"role": "system", "content": "You are a helpful assistant that outputs JSON."},
                    {"role": "user", "content": prompt}
                ],
                "max_tokens": 200,
                "temperature": 0.5
            }
            logger.info(f"Making request to {DS_API_URL}/v1/chat/completions")
            response = requests.post(DS_API_URL + "/v1/chat/completions", headers=headers, json=payload, timeout=60)
            response.raise_for_status()
            return response.json()
        except requests.exceptions.HTTPError as e:
            if e.response is not None and e.response.status_code == 503:
                logger.warning(f"503 error on attempt {attempt + 1}: {str(e)}")
                time.sleep(initial_delay * (2 ** attempt))
                continue
            else:
                logger.error(f"API error: {str(e)}")
                raise Exception(f"API Error: {str(e)}")
        except requests.exceptions.RequestException as e:
            logger.error(f"Request error: {str(e)}")
            raise Exception(f"Request Error: {str(e)}")
    raise Exception("Max retries reached for DeepSeek API")

def main(req: func.DocumentList) -> None:
    """Cosmos DB trigger to process new tasks."""
    logger.info("Function triggered by Cosmos DB change.")
    if not req:
        logger.warning("No documents received.")
        return
    try:
        cosmos_endpoint = os.getenv("COSMOS_DB_ENDPOINT")
        cosmos_key = os.getenv("COSMOS_DB_KEY")
        logger.info(f"Cosmos DB Endpoint: {cosmos_endpoint}, Key: {'set' if cosmos_key else 'not set'}")
        if not cosmos_endpoint or not cosmos_key:
            logger.error("Cosmos DB endpoint or key is missing")
            raise Exception("Cosmos DB configuration missing")
        client = CosmosClient(cosmos_endpoint, cosmos_key)
        database = client.get_database_client(os.getenv("COSMOS_DATABASE_NAME"))
        container = database.get_container_client(os.getenv("COSMOS_CONTAINER_NAME"))
        for doc in req:
            task_input = doc.get("task_name", "")
            # Skip empty or already-processed documents to avoid re-triggering on our own upserts
            if not task_input or doc.get("processed", False):
                continue
            parse_prompt = (
                f"Return a JSON object with fields: task_name (string), due_date (ISO format string, e.g., '2025-07-19T10:00:00Z'), and priority (string, one of 'Low', 'Medium', 'High'). Output only the JSON object, enclosed in ```json\n...\n```.\n"
                f"Task: {task_input}\n"
                f"Example: ```json\n{{\"task_name\": \"Schedule a meeting\", \"due_date\": \"2025-07-19T10:00:00Z\", \"priority\": \"Medium\"}}\n```"
            )
            try:
                parse_response = call_deepseek_api(parse_prompt)
                generated_text = parse_response["choices"][0]["message"]["content"]
                # Extract the JSON payload from the ```json ... ``` fence
                json_start = generated_text.find("```json\n") + 8
                json_end = generated_text.rfind("\n```")
                if json_start > 7 and json_end > json_start:
                    task_data = json.loads(generated_text[json_start:json_end])
                else:
                    logger.warning(f"Invalid JSON format: {generated_text}. Using fallback.")
                    task_data = parse_task_fallback(task_input)
            except Exception as e:
                logger.warning(f"Invalid JSON response or error: {str(e)}. Using fallback.")
                task_data = parse_task_fallback(task_input)
            task_data["id"] = doc["id"]
            task_data["created_at"] = doc.get("created_at")
            task_data["processed"] = True
            container.upsert_item(task_data)
            logger.info(f"Processed task: {task_data['task_name']}")
    except Exception as e:
        logger.error(f"Error processing documents: {str(e)}")
        raise
Download: task-checker-fn.py
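The fence-stripping logic in the Function (the find/rfind slicing on the ```json markers) can be tested in isolation with a mocked model reply:

```python
import json

def extract_json_block(generated_text):
    # Same slicing logic as task-checker-fn.py: pull the payload out of a ```json fence
    json_start = generated_text.find("```json\n") + 8
    json_end = generated_text.rfind("\n```")
    if json_start > 7 and json_end > json_start:
        return json.loads(generated_text[json_start:json_end])
    return None  # caller falls back to parse_task_fallback

# Mock DeepSeek reply with the JSON wrapped in a ```json fence
reply = 'Here you go:\n```json\n{"task_name": "Team sync", "due_date": "2025-07-19T10:00:00Z", "priority": "High"}\n```'
task = extract_json_block(reply)
print(task["task_name"], task["priority"])  # prints: Team sync High
```

Using rfind for the closing fence means any trailing commentary after the fence is ignored, while replies with no fence at all fall through to the fallback parser.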
Specify Python dependencies for the Azure Function.
# check_tasks/requirements.txt
azure-functions==1.7.2
requests==2.26.0
azure-cosmos==4.2.0
Download: requirements.txt
Configure the Cosmos DB trigger for the Azure Function.
# check_tasks/function.json
{
"bindings": [
{
"type": "cosmosDBTrigger",
"name": "req",
"direction": "in",
"leaseCollectionName": "leases",
"connectionStringSetting": "COSMOS_CONNECTION_STRING",
"databaseName": "taskdb",
"collectionName": "task",
"createLeaseCollectionIfNotExists": true
}
],
"scriptFile": "task-checker-fn.py"
}
Download: function.json
Define Azure Function app settings.
# check_tasks/host.json
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true
}
}
}
}
Download: host.json
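For local testing with Azure Functions Core Tools (func start), a local.settings.json in check_tasks/ can supply the same settings. This file is a hypothetical addition for local development only, not part of the deployment, and should never be committed with real values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "<storage connection string>",
    "COSMOS_CONNECTION_STRING": "<cosmos connection string>",
    "COSMOS_DB_ENDPOINT": "<cosmos endpoint>",
    "COSMOS_DB_KEY": "<cosmos key>",
    "COSMOS_DATABASE_NAME": "taskdb",
    "COSMOS_CONTAINER_NAME": "task",
    "DS_API_URL": "https://api.deepseek.com",
    "DS_API_KEY": "<deepseek key>"
  }
}
```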
Create a resource group to organize Azure resources.
source setup-env.sh
az group create --name $AGENTICAI_RG --location $AZURE_LOCATION
Set up Cosmos DB to store tasks.
az cosmosdb create --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --locations regionName=$AZURE_LOCATION
az cosmosdb sql database create --account-name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --name $COSMOS_DATABASE_NAME
az cosmosdb sql container create --account-name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --database-name $COSMOS_DATABASE_NAME --name $COSMOS_CONTAINER_NAME --partition-key-path "/id"
Retrieve endpoint and key for Cosmos DB.
export COSMOS_DB_ENDPOINT=$(az cosmosdb show --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --query documentEndpoint --output tsv)
export COSMOS_DB_KEY=$(az cosmosdb keys list --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --query primaryMasterKey --output tsv)
export COSMOS_CONNECTION_STRING=$(az cosmosdb keys list --name $COSMOS_DB_ACCOUNT --resource-group $AGENTICAI_RG --type connection-strings --query "connectionStrings[0].connectionString" --output tsv)
Deploy the Flask app to Azure App Service.
az appservice plan create --name meetingscheduler-plan --resource-group $AGENTICAI_RG --sku FREE --is-linux
az webapp create --name $AGENTICAI_APP_SRV --resource-group $AGENTICAI_RG --plan meetingscheduler-plan --runtime "PYTHON|3.8"
Deploy the Azure Function for background processing. The Function App needs a storage account (az functionapp create requires one), so create that first.
az storage account create --name $STORAGE_ACCOUNT_NAME --resource-group $AGENTICAI_RG --location $AZURE_LOCATION --sku Standard_LRS
az functionapp create --name $AGENTICAI_TASK_CHECKER_FN_APP --resource-group $AGENTICAI_RG --storage-account $STORAGE_ACCOUNT_NAME --consumption-plan-location $AZURE_LOCATION --runtime python --runtime-version 3.8 --os-type Linux --functions-version 3
Set environment variables for both apps.
az webapp config appsettings set --name $AGENTICAI_APP_SRV --resource-group $AGENTICAI_RG --settings \
DS_API_URL="$DS_API_URL" \
DS_API_KEY="$DS_API_KEY" \
COSMOS_DB_ENDPOINT="$COSMOS_DB_ENDPOINT" \
COSMOS_DB_KEY="$COSMOS_DB_KEY" \
COSMOS_DATABASE_NAME="$COSMOS_DATABASE_NAME" \
COSMOS_CONTAINER_NAME="$COSMOS_CONTAINER_NAME"
az functionapp config appsettings set --name $AGENTICAI_TASK_CHECKER_FN_APP --resource-group $AGENTICAI_RG --settings \
DS_API_URL="$DS_API_URL" \
DS_API_KEY="$DS_API_KEY" \
COSMOS_DB_ENDPOINT="$COSMOS_DB_ENDPOINT" \
COSMOS_DB_KEY="$COSMOS_DB_KEY" \
COSMOS_DATABASE_NAME="$COSMOS_DATABASE_NAME" \
COSMOS_CONTAINER_NAME="$COSMOS_CONTAINER_NAME" \
COSMOS_CONNECTION_STRING="$COSMOS_CONNECTION_STRING"
Package and deploy the Flask app to Azure.
cd app
zip -r ../webapp.zip app.py requirements.txt templates/
cd ..
az webapp deploy --name $AGENTICAI_APP_SRV --resource-group $AGENTICAI_RG --src-path $AGENTICAI_APP_ZIP --type zip
Package and deploy the Azure Function. Azure Functions expects each function's function.json in its own subfolder, with host.json at the package root, so stage that layout before zipping.
cd check_tasks
mkdir -p task_checker
cp task-checker-fn.py function.json task_checker/
zip -r ../function-app.zip host.json requirements.txt task_checker/
cd ..
az functionapp deployment source config-zip --name $AGENTICAI_TASK_CHECKER_FN_APP --resource-group $AGENTICAI_RG --src $AGENTICAI_FN_ZIP --build-remote
Verify the chatbot works by submitting tasks and queries.
open http://$AGENTICAI_APP_SRV.azurewebsites.net
Test inputs: try "Schedule a meeting for tomorrow at 10 AM" to submit a task, and "When is the next meeting?" to query the earliest upcoming meeting.
Check logs to debug issues.
az webapp log tail --name $AGENTICAI_APP_SRV --resource-group $AGENTICAI_RG
az functionapp logstream --name $AGENTICAI_TASK_CHECKER_FN_APP --resource-group $AGENTICAI_RG
Troubleshooting:
- Verify DS_API_URL is set to https://api.deepseek.com in the Azure app settings.
- Verify COSMOS_DB_ENDPOINT and COSMOS_DB_KEY using az cosmosdb show and az cosmosdb keys list.
- Verify COSMOS_CONNECTION_STRING and ensure the Cosmos DB trigger is configured.
- Confirm your DeepSeek API key has access to the deepseek-chat model.
To enhance the chatbot, consider adding notifications, real-time chat, or richer task metadata (the features listed under "What it does not do" above).
Visit AI Python Solutions for more AI and migration tools.