ChatGPT API
Send a prompt to ChatGPT and get the assistant reply as structured JSON
The ChatGPT API takes a single text prompt and returns the assistant's reply from chatgpt.com as structured JSON. One GET request, no account, no cookies, no session state to manage on your side.
```shell
curl "https://api.scrape.do/plugin/chatgpt/chat?token=$TOKEN&q=ping"
```

Credit Usage: Each successful request costs 10 credits. For bulk processing, use the Async API with plugins.
Key Features
- One-shot prompt-to-reply: send a prompt, get the full assistant response back in a single HTTP call.
- Structured JSON envelope: the response is the full message document, not a flat string. You get the assistant text plus citations, model slug, finish reason, and message / conversation IDs.
- Stateless from your side: every call is independent. No login, no token refresh, no conversation IDs to track.
- Citation support: when the model returns inline citations, they come back as a `content_references[]` array on the message metadata, with resolved URLs.
- No render fee: protocol traffic to `chatgpt.com` is included in the per-call price.
Endpoint
```
GET https://api.scrape.do/plugin/chatgpt/chat
```

Basic Example

```shell
curl --location --request GET 'https://api.scrape.do/plugin/chatgpt/chat?token=<SDO-token>&q=Explain+how+rainbows+form'
```

```python
import requests
import json

token = "<SDO-token>"
url = f"https://api.scrape.do/plugin/chatgpt/chat?token={token}&q=Explain+how+rainbows+form"

response = requests.get(url)
print(json.dumps(response.json(), indent=2))
```

```javascript
const axios = require('axios');

const token = "<SDO-token>";
const url = `https://api.scrape.do/plugin/chatgpt/chat?token=${token}&q=Explain+how+rainbows+form`;

axios.get(url)
  .then(response => {
    console.log(JSON.stringify(response.data, null, 2));
  })
  .catch(error => {
    console.error(error);
  });
```

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	token := "<SDO-token>"
	url := fmt.Sprintf(
		"https://api.scrape.do/plugin/chatgpt/chat?token=%s&q=Explain+how+rainbows+form",
		token,
	)

	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```

```ruby
require 'net/http'
require 'json'

token = "<SDO-token>"
url = URI("https://api.scrape.do/plugin/chatgpt/chat?token=#{token}&q=Explain+how+rainbows+form")

response = Net::HTTP.get(url)
puts JSON.pretty_generate(JSON.parse(response))
```

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ChatGPTChat {
    public static void main(String[] args) throws Exception {
        String token = "<SDO-token>";
        String url = String.format(
            "https://api.scrape.do/plugin/chatgpt/chat?token=%s&q=Explain+how+rainbows+form",
            token
        );

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("GET");

        BufferedReader reader = new BufferedReader(
            new InputStreamReader(conn.getInputStream())
        );
        String line;
        StringBuilder response = new StringBuilder();
        while ((line = reader.readLine()) != null) {
            response.append(line);
        }
        reader.close();

        System.out.println(response.toString());
    }
}
```

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        string token = "<SDO-token>";
        string url = $"https://api.scrape.do/plugin/chatgpt/chat?token={token}&q=Explain+how+rainbows+form";

        using HttpClient client = new HttpClient();
        string response = await client.GetStringAsync(url);
        Console.WriteLine(response);
    }
}
```

```php
<?php
$token = "<SDO-token>";
$url = "https://api.scrape.do/plugin/chatgpt/chat?token={$token}&q=Explain+how+rainbows+form";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

echo json_encode(json_decode($response), JSON_PRETTY_PRINT);
?>
```

```shell
curl "https://api.scrape.do/plugin/chatgpt/chat?token=$TOKEN&q=Explain+how+rainbows+form"
```

Request Parameters
Required
| Parameter | Type | Description |
|---|---|---|
| `token` | string | Your Scrape.do API authentication token |
| `q` | string | The prompt to send. URL-encode multi-word prompts |
Notes
- The prompt is capped at a maximum length per request. Very long prompts are rejected before any model call runs, so no credits are spent on them.
- One call returns one assistant reply. There is no multi-turn conversation state; to follow up, include any prior context inside the new `q`.
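Because each call is stateless, a follow-up prompt has to carry its own context. A minimal Python sketch of both points above (the prompts are illustrative; `quote_plus` handles the URL-encoding that multi-word `q` values require):

```python
from urllib.parse import quote_plus

BASE = "https://api.scrape.do/plugin/chatgpt/chat"
token = "<SDO-token>"

def chat_url(prompt: str) -> str:
    # quote_plus percent-encodes the prompt and turns spaces into '+'
    return f"{BASE}?token={token}&q={quote_plus(prompt)}"

# First call: a standalone prompt
first = chat_url("Explain how rainbows form")

# Follow-up: prepend the earlier exchange yourself, since the API keeps no state
context = "Earlier you explained how rainbows form."
follow_up = chat_url(f"{context} Now explain why double rainbows are dimmer.")
```

Each URL is then fetched with a plain GET, exactly as in the basic examples above.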
Response
The response is the full assembled message envelope as JSON; you walk the envelope yourself. The most useful paths:

- `data.message.content.parts[0]` → assistant text
- `data.message.metadata.content_references[N]` → citations
- `data.message.metadata.content_references[N].safe_urls` → resolved citation URLs
- `data.message.metadata.finish_details` → stop reason
- `data.message.metadata.model_slug` → model
- `data.message.status` → `"finished_successfully"`
- `data.message.id` → message id
- `data.conversation_id` → conversation id

Example
```json
{
  "data": {
    "message": {
      "id": "f0a2b1c4-...",
      "status": "finished_successfully",
      "content": {
        "content_type": "text",
        "parts": [
          "Rainbows form when sunlight is refracted, reflected, and dispersed inside water droplets ..."
        ]
      },
      "metadata": {
        "model_slug": "gpt-5",
        "finish_details": { "type": "stop" },
        "content_references": [
          {
            "type": "webpage",
            "title": "How Rainbows Form, NOAA SciJinks",
            "safe_urls": ["https://scijinks.gov/rainbow/"]
          }
        ]
      }
    },
    "conversation_id": "9b8a7c6d-..."
  }
}
```

Top-Level Fields
| Field | Type | Description |
|---|---|---|
| `data.message.id` | string | Unique identifier for this assistant message |
| `data.message.status` | string | Message status. `"finished_successfully"` for a complete reply |
| `data.message.content.parts[0]` | string | The assistant's reply text |
| `data.message.metadata.model_slug` | string | Model that produced the reply (e.g., `"gpt-5"`) |
| `data.message.metadata.finish_details` | object | Stop reason, e.g., `{ "type": "stop" }` |
| `data.message.metadata.content_references` | array | Inline citations. Absent when the model did not cite anything |
| `data.conversation_id` | string | Identifier for the conversation this reply belongs to |
`content_references[]`
| Field | Type | Description |
|---|---|---|
| `type` | string | Reference type (e.g., `"webpage"`, `"image"`) |
| `title` | string | Citation title as shown in the reply |
| `safe_urls` | string[] | Resolved URLs for the citation. Present on web citations |
Notes
- The envelope follows the streaming message shape used by `chatgpt.com`; we collect the full stream server-side and return the final assembled document.
- Citation delimiters that appear inline in the raw stream are stripped before the response is returned. The text in `parts[0]` is clean, and citations live separately in `metadata.content_references`.
- A reply that ends with `finish_details.type == "stop"` is a normal completion. Other values indicate the model stopped early (e.g., a tool call boundary).
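Walking the envelope along the paths documented above takes only a few lines. A Python sketch, run against an abbreviated copy of the sample response from this page:

```python
import json

# Abbreviated copy of the example envelope documented above
raw = """
{
  "data": {
    "message": {
      "id": "f0a2b1c4-...",
      "status": "finished_successfully",
      "content": {"content_type": "text",
                  "parts": ["Rainbows form when sunlight is refracted ..."]},
      "metadata": {
        "model_slug": "gpt-5",
        "finish_details": {"type": "stop"},
        "content_references": [
          {"type": "webpage",
           "title": "How Rainbows Form, NOAA SciJinks",
           "safe_urls": ["https://scijinks.gov/rainbow/"]}
        ]
      }
    },
    "conversation_id": "9b8a7c6d-..."
  }
}
"""

msg = json.loads(raw)["data"]["message"]
text = msg["content"]["parts"][0]                      # assistant text
model = msg["metadata"]["model_slug"]                  # e.g., "gpt-5"
finished = msg["metadata"]["finish_details"]["type"] == "stop"
# content_references may be absent entirely, so default to an empty list
citations = [url
             for ref in msg["metadata"].get("content_references", [])
             for url in ref.get("safe_urls", [])]
```

The `.get(..., [])` defaults matter: a reply with no citations simply omits `content_references`, so indexing it directly would raise a `KeyError`.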

