AI-Assisted Troubleshooting with Ollama
Use a local LLM to analyze your network traffic in real time. This guide shows how to set up Qwen3 with Ollama so you can ask questions about your traffic in plain English, and the model fetches the data itself.
What you'll build:
You: "What external APIs is my app calling?"
Qwen3: *fetches traffic from DevTools API*
"Based on the traffic I captured, your application is connecting to:
- api.stripe.com (payment-service container)
- api.openai.com (risk-ai-sprawl container)
- hooks.slack.com (multiple containers)
..."No copy/paste. No manual data wrangling. The LLM calls the DevTools API directly.
Prerequisites
Linux host with Qtap installed
~3GB disk space for the Qwen3 model
Quick Start (No Code)
The simplest approach: capture traffic, paste into any LLM.
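For example, a few lines of Python (or any HTTP client) can dump the current capture. The URL below is a placeholder, so substitute the DevTools address and port from your Qtap configuration:

```python
import json
import urllib.request

# Placeholder endpoint: use the DevTools address and port from your Qtap config.
DEVTOOLS_URL = "http://localhost:10001/api/requests"

with urllib.request.urlopen(DEVTOOLS_URL, timeout=5) as resp:
    traffic = json.load(resp)

# Pretty-print so the capture is easy to paste into a prompt.
print(json.dumps(traffic, indent=2))
```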
Copy the output and paste it into ChatGPT, Claude, or an `ollama run qwen3:4b` session, with a question like:
"Here's my network traffic. What external services am I calling? Are there any errors?"
This works great for one-off debugging. For a more integrated experience, continue below.
Setup
Step 1: Install Ollama and Python Library
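Install Ollama from ollama.com, install the Python client with `pip install ollama`, and pull the model. A quick sanity check that the library, the server, and the `qwen3:4b` model are all in place:

```python
import ollama

# Pull the model used in this guide (a no-op if it is already downloaded).
ollama.pull("qwen3:4b")

# One tiny generation confirms the server is reachable and the model loads.
reply = ollama.generate(model="qwen3:4b", prompt="Reply with the single word: ready")
print(reply["response"])
```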
Step 2: Start Qtap with DevTools
Verify DevTools is accessible:
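For example, from Python. The URL is a placeholder; use the address and port configured for DevTools in your Qtap setup (see the DevTools API Reference):

```python
import urllib.request

DEVTOOLS_URL = "http://localhost:10001/api/requests"  # placeholder, adjust to your config

try:
    with urllib.request.urlopen(DEVTOOLS_URL, timeout=5) as resp:
        print(f"DevTools reachable (HTTP {resp.status})")
except OSError as exc:
    print(f"DevTools not reachable: {exc}")
```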
Step 3: Create the Traffic Analysis Script
Save this as traffic-chat.py:
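A minimal sketch of such a script, assuming the DevTools API returns captured requests as JSON at a placeholder URL (replace `DEVTOOLS_URL` with the endpoint from your Qtap configuration) and using the `ollama` package's tool-calling interface; treat it as a starting point rather than a finished tool:

```python
#!/usr/bin/env python3
"""traffic-chat.py: chat with a local Qwen3 model that can fetch Qtap traffic.

This is a minimal sketch. DEVTOOLS_URL is a placeholder; replace it with the
endpoint from your Qtap DevTools configuration (see the DevTools API Reference).
"""
import json
import urllib.request

import ollama

DEVTOOLS_URL = "http://localhost:10001/api/requests"  # hypothetical endpoint
MODEL = "qwen3:4b"

SYSTEM_PROMPT = (
    "You are a network traffic analyst. When the user asks about their "
    "application's traffic, call the fetch_traffic tool first and answer "
    "based only on the data it returns."
)

# JSON schema for the single tool the model may call.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "fetch_traffic",
            "description": "Fetch recently captured network requests from the Qtap DevTools API",
            "parameters": {
                "type": "object",
                "properties": {
                    "limit": {
                        "type": "integer",
                        "description": "Maximum number of requests to return",
                    }
                },
            },
        },
    }
]


def fetch_traffic(limit: int = 50) -> str:
    """Pull captured requests from the DevTools API and return them as JSON text."""
    with urllib.request.urlopen(DEVTOOLS_URL, timeout=10) as resp:
        data = json.load(resp)
    if isinstance(data, list):
        data = data[: int(limit)]
    return json.dumps(data)


def main() -> None:
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    print("Ask about your traffic (Ctrl-C to quit).")
    while True:
        try:
            user = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print()
            return
        if not user:
            continue
        messages.append({"role": "user", "content": user})

        response = ollama.chat(model=MODEL, messages=messages, tools=TOOLS)

        # If the model asked for the tool, run it and feed the result back in.
        while response.message.tool_calls:
            messages.append(response.message)
            for call in response.message.tool_calls:
                args = call.function.arguments or {}
                if call.function.name == "fetch_traffic":
                    result = fetch_traffic(int(args.get("limit", 50)))
                else:
                    result = f"unknown tool: {call.function.name}"
                messages.append({"role": "tool", "content": result})
            response = ollama.chat(model=MODEL, messages=messages, tools=TOOLS)

        print(f"\nQwen3: {response.message.content}")
        messages.append(response.message)


if __name__ == "__main__":
    main()
```

The loop keeps calling the model until it stops requesting tools, so multi-step questions still work, and every tool result is appended as a `tool` message so the model answers from captured data rather than from memory.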
Make it executable with `chmod +x traffic-chat.py`.
Step 4: Start Chatting
Run `./traffic-chat.py` and ask questions in plain English; the exchange at the top of this guide shows what a typical session looks like.
Example Questions
Debugging:
"What external APIs is my app calling?"
"Are there any failed requests?"
"What's making requests to api.stripe.com?"
"Which requests are taking the longest?"
Security:
"What data is being sent to external services?"
"Are there any unexpected outbound connections?"
"Which containers are making the most external calls?"
Analysis:
"Summarize the traffic patterns"
"What's the breakdown of HTTP status codes?"
"How many unique endpoints are being contacted?"
Troubleshooting
"No traffic captured"
Make sure Qtap is running and DevTools is enabled:
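One way to check, using the same placeholder endpoint as above (replace it with the address from your configuration):

```python
import json
import urllib.request

DEVTOOLS_URL = "http://localhost:10001/api/requests"  # placeholder endpoint

with urllib.request.urlopen(DEVTOOLS_URL, timeout=5) as resp:
    captured = json.load(resp)

print(f"Captured requests: {len(captured)}")
# 0 here means DevTools is up but nothing has generated traffic yet; a
# connection error means Qtap or DevTools is not running at this address.
```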
Model seems slow
Try reducing the capture window or using a smaller context:
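If replies drag, shrink the prompt: cap how many captured requests, and how many characters, go to the model. The snippet below adjusts the hypothetical `fetch_traffic` helper from the sketch above; switching to a smaller Qwen3 variant (for example `qwen3:1.7b`, if that tag is available in your Ollama library) also helps on constrained hardware.

```python
import json
import urllib.request

DEVTOOLS_URL = "http://localhost:10001/api/requests"  # placeholder endpoint


def fetch_traffic(limit: int = 20) -> str:
    """Lower default limit plus a hard size cap keeps the prompt small."""
    with urllib.request.urlopen(DEVTOOLS_URL, timeout=10) as resp:
        data = json.load(resp)
    if isinstance(data, list):
        data = data[: int(limit)]
    return json.dumps(data)[:20_000]  # truncate very large captures
```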
Tool not being called
If Qwen3 responds without fetching data, be more explicit:
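Prompts that name the data source ("Fetch my captured traffic, then list the external APIs it shows") usually trigger the tool call. You can also tighten the system prompt in the sketch above so fetching always comes first; the wording below is illustrative:

```python
# A more forceful system prompt for the sketch above; wording is illustrative.
SYSTEM_PROMPT = (
    "You are a network traffic analyst. You know nothing about the user's "
    "traffic until you call the fetch_traffic tool. Always call fetch_traffic "
    "before answering any question about requests, endpoints, or errors."
)
```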
Ollama connection errors
Ensure Ollama is running:
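On Linux the installer typically registers Ollama as a system service, and `ollama serve` runs the server in the foreground. A quick check from Python, assuming the `ollama` package is installed:

```python
# If this raises a connection error, the Ollama server is not running.
import ollama

print(ollama.list())
```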
Next Steps
DevTools API Reference - Full API documentation
DevTools Interface Guide - Use the browser UI alongside the AI
Traffic Processing with Plugins - Configure what traffic gets captured