
The 便携AI aggregation API now supports the o3-pro model, with instructions for calling it

OpenAI has just released the new o3-pro model. Following o3 and o3-mini, it is a large reasoning model built for reliability rather than speed: by thinking more deeply, with more compute and longer thinking time, o3-pro can handle complex reasoning in areas such as math, science, programming, and business decision-making. The 便携AI aggregation API already supports the new o3-pro model, and this post explains how to call it.

1. Introduction to the o3-pro model

o3-pro can "think more deeply". It is based on the o3 architecture, with additional reasoning compute and improved chain-of-thought, which lifts its performance on complex tasks. OpenAI recommends using o3-pro for tasks that are "challenging, high-value, demand high accuracy, and can tolerate some latency".

Here is OpenAI's official description of o3-pro:

The o-series of models are trained with reinforcement learning to think before they answer and perform complex reasoning. The o3-pro model uses more compute to think harder and provide consistently better answers.

o3-pro is available in the Responses API only to enable support for multi-turn model interactions before responding to API requests, and other advanced API features in the future. Since o3-pro is designed to tackle tough problems, some requests may take several minutes to finish. To avoid timeouts, try using background mode.

In short, o3-pro thinks for longer and is more capable, so it can take on more complex reasoning tasks, but it is also very time-consuming.
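
Since a single o3-pro request can take several minutes, OpenAI recommends background mode. Below is a minimal sketch of submitting a request in the background and polling for the result with the openai Python package; it assumes the 便携AI gateway passes the background parameter and the retrieve endpoint straight through to OpenAI.

import time
from openai import OpenAI

client = OpenAI(
    api_key="sk-xxxx",                      # your API key
    base_url="https://api.bianxie.ai/v1"
)

# Submit the request in background mode so the HTTP call returns immediately
job = client.responses.create(
    model="o3-pro",
    input="Prove that the square root of 2 is irrational.",
    background=True
)

# Poll until the job leaves the queued / in_progress states
while job.status in ("queued", "in_progress"):
    time.sleep(10)
    job = client.responses.retrieve(job.id)

print(job.status)
print(job.output_text)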

o3-pro is priced as follows (a quick cost estimate follows the list):

  • Input tokens: $20 / 1M tokens
  • Output tokens: $80 / 1M tokens
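
Using these prices, a back-of-the-envelope estimate for a hypothetical request with 1,000 input tokens and 5,000 output tokens (reasoning tokens are billed as output tokens) looks like this:

input_cost = 1_000 / 1_000_000 * 20     # $0.02
output_cost = 5_000 / 1_000_000 * 80    # $0.40, reasoning tokens included
print(input_cost + output_cost)         # 0.42 (dollars)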

2. How to call the o3-pro model

o3-pro model names: o3-pro, o3-pro-2025-06-10

o3-pro can only be used through the Responses API; it cannot be called with the traditional Chat Completions API.

Below, Python is used to show how to call o3-pro. The api_key in the examples can be obtained from the site backend; for instructions, see the tutorial 《便携AI聚合API新建令牌(API key)教程》 on creating an API key.

1) Calling with the OpenAI package

from openai import OpenAI

def response_openai():
    # Point the client at the 便携AI aggregation gateway
    client = OpenAI(
        api_key="sk-234345",                    # your API key
        base_url="https://api.bianxie.ai/v1"
    )

    # o3-pro is only served through the Responses API
    response = client.responses.create(
        model="o3-pro",
        input=[
            {
                "role": "user",
                "content": "Write a one-sentence bedtime story about a unicorn."
            }
        ]
    )

    print(response.json())
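
Printing the full JSON is convenient for debugging; if you only want the assistant's text, recent versions of the openai package also expose a convenience property:

print(response.output_text)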

Example response:

{
    "id": "resp_6849735d219881a28f8a567b1d3070550949fa752fa96f08",
    "created_at": 1749644125,
    "error": null,
    "incomplete_details": null,
    "instructions": null,
    "metadata": {

    },
    "model": "o3-pro-2025-06-10",
    "object": "response",
    "output": [
        {
            "id": "rs_68497370d9f881a2a4c639f73758a5cb0949fa752fa96f08",
            "summary": [

            ],
            "type": "reasoning",
            "status": null
        },
        {
            "id": "msg_68497370daf481a29dc7ba6a5fd71b690949fa752fa96f08",
            "content": [
                {
                    "annotations": [

                    ],
                    "text": "As moonlight painted the clouds silver, a gentle unicorn tiptoed through dream-dust meadows, collecting children’s worries in her shimmering mane and replacing them with whispers of starlit peace.",
                    "type": "output_text"
                }
            ],
            "role": "assistant",
            "status": "in_progress",
            "type": "message"
        }
    ],
    "parallel_tool_calls": true,
    "temperature": 1,
    "tool_choice": "auto",
    "tools": [

    ],
    "top_p": 1,
    "max_output_tokens": null,
    "previous_response_id": null,
    "reasoning": {
        "effort": "medium",
        "generate_summary": null,
        "summary": null
    },
    "service_tier": "default",
    "status": "completed",
    "text": {
        "format": {
            "type": "text"
        }
    },
    "truncation": "disabled",
    "usage": {
        "input_tokens": 17,
        "input_tokens_details": {
            "cached_tokens": 0
        },
        "output_tokens": 46,
        "output_tokens_details": {
            "reasoning_tokens": 0
        },
        "total_tokens": 63
    },
    "user": null,
    "background": false,
    "store": true
}

2) Direct HTTP call (Python requests)

import requests

# Responses API endpoint on the 便携AI aggregation gateway
url = "https://api.bianxie.ai/v1/responses"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer sk-2jsodfjo"       # your API key
}
data = {
    "model": "o3-pro",
    "input": "Tell me a three sentence bedtime story about a unicorn."
}

response = requests.post(url, headers=headers, json=data)

print(response.json())
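
One practical note: o3-pro calls can run for several minutes. requests waits indefinitely by default, but if your own code or an intermediate proxy enforces a timeout, make sure it is generous. For example, an explicit client-side timeout (the 600 seconds here is just an assumed upper bound, not an official figure):

response = requests.post(url, headers=headers, json=data, timeout=600)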

Example response:

{
  "id": "resp_684977361f8081a18b547b259f2d72c20ffee4a22ce28380",
  "object": "response",
  "created_at": 1749645110,
  "status": "completed",
  "background": false,
  "error": null,
  "incomplete_details": null,
  "instructions": null,
  "max_output_tokens": null,
  "model": "o3-pro-2025-06-10",
  "output": [
    {
      "id": "rs_6849774897a081a1ac739a829fd56a230ffee4a22ce28380",
      "type": "reasoning",
      "summary": []
    },
    {
      "id": "msg_68497748998881a1b8868097f2ed564e0ffee4a22ce28380",
      "type": "message",
      "status": "in_progress",
      "content": [
        {
          "type": "output_text",
          "annotations": [],
          "text": "As Luna the silver-maned unicorn tiptoed through the moonlit forest, her horn softly glowed and stitched tiny stars back into holes in the night sky. Each twinkling stitch hummed a lullaby that drifted into every woodland burrow, lulling even the busiest fireflies to sleep. When her work was done, Luna curled beside a quiet pond, and the sky, now perfectly mended, shimmered like a gentle blanket over all the dreaming creatures below."
        }
      ],
      "role": "assistant"
    }
  ],
  "parallel_tool_calls": true,
  "previous_response_id": null,
  "reasoning": {
    "effort": "medium",
    "summary": null
  },
  "service_tier": "default",
  "store": true,
  "temperature": 1.0,
  "text": {
    "format": {
      "type": "text"
    }
  },
  "tool_choice": "auto",
  "tools": [],
  "top_p": 1.0,
  "truncation": "disabled",
  "usage": {
    "input_tokens": 17,
    "input_tokens_details": {
      "cached_tokens": 0
    },
    "output_tokens": 104,
    "output_tokens_details": {
      "reasoning_tokens": 0
    },
    "total_tokens": 121
  },
  "user": null,
  "metadata": {}
}
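
When calling the HTTP endpoint directly, the assistant's text sits inside the output array, in the item whose type is "message" (the "reasoning" item carries no text here). A minimal sketch for extracting it from the JSON shown above:

result = response.json()
for item in result["output"]:
    if item["type"] == "message":               # skip the "reasoning" item
        for part in item["content"]:
            if part["type"] == "output_text":
                print(part["text"])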