
Alibaba

Qwen 3 235B

Open Source
3.3
out of 10

Qwen 3 235B is Alibaba's largest open-source language model, released in April 2025 under the Apache 2.0 license. It uses a Mixture-of-Experts (MoE) architecture with 235 billion total parameters, of which 22 billion are active per forward pass. With a 262K context window and pricing as low as $0.20/$0.88 per 1M tokens on Alibaba Cloud, it is one of the most capable open models available at scale. Qwen 3 235B supports both a standard instruct mode and a Thinking mode for step-by-step reasoning. Its AA Intelligence Index of 17 reflects its April 2025 release date; newer open models have since surpassed it, and a refreshed Qwen3 235B 2507 version is available for those wanting the latest.
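The MoE split matters mostly for compute, not memory: all 235B parameters must be resident to serve the model, while only 22B participate in each forward pass. A back-of-envelope sketch of what that means for self-hosting (illustrative arithmetic only; real deployments add KV-cache and runtime overhead, and the precision choices here are assumptions):

```python
# Rough active-ratio and serving-memory estimate for a 235B-total /
# 22B-active MoE model. Illustrative only: actual memory use depends
# on quantization, KV cache, and inference-engine overhead.

TOTAL_PARAMS = 235e9   # all experts must be held in memory to serve
ACTIVE_PARAMS = 22e9   # parameters used per forward pass

def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    """Memory for the weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return params * bytes_per_param / 1e9

active_ratio = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active ratio: {active_ratio:.1%}")                           # ~9.4%
print(f"BF16 weights: {weight_memory_gb(TOTAL_PARAMS, 2):.0f} GB")   # 470 GB
print(f"FP8 weights:  {weight_memory_gb(TOTAL_PARAMS, 1):.0f} GB")   # 235 GB
```

So compute cost per token tracks the 22B active parameters, but the hardware budget for weights tracks the full 235B.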

Context window

262K tokens

API (blended)

$0.37/1M

Consumer access

Free (limited)

Multimodal

Text only

Strengths

  • +Apache 2.0 license — fully open source, can self-host, fine-tune, or build commercial products
  • +235B total parameters (22B active MoE) — large capable model at very low API cost
  • +262K context window — larger than most mid-tier models, though below GPT-5.2's 400K
  • +Thinking mode available — switchable reasoning capability in one model
  • +Widely available on third-party inference providers (Together AI, Fireworks, OpenRouter)
  • +$0.20/$0.88 per 1M tokens — among the cheapest routes to a large open model

Weaknesses

  • -AA Intelligence Index 17 — below average for frontier models; newer models have surpassed it
  • -Chinese company (Alibaba) — data jurisdiction concern for enterprise sensitive workloads
  • -Released April 2025 — already one generation behind (Qwen3 235B 2507 is newer)
  • -40.7 t/s output speed — slower than most comparable models
  • -Thinking mode pricing ($0.45/$3.50) erases much of the cost advantage

Best for

  • Open-source projects needing a large capable model with permissive licensing
  • Self-hosted deployments on enterprise hardware
  • Budget API workloads where Chinese jurisdiction is acceptable
  • Research and experimentation with large MoE architectures

Not ideal for

  • Sensitive enterprise workloads (Alibaba/Chinese data jurisdiction)
  • Applications needing frontier intelligence (an April 2025 model is behind current SOTA)
  • Latency-sensitive use cases (40.7 t/s output is on the slower side)

Pricing details

Subscription plans

Qwen Chat Free: web chat with Qwen models including Qwen 3 235B, free with rate limits (daily message limits; primarily designed for the Chinese market)
Free

API pricing

Alibaba Cloud (DashScope): free tier available. Alibaba Cloud pricing: $0.20/$0.88 per 1M tokens (instruct mode). Thinking mode is higher: $0.45/$3.50 per 1M via some providers. Also widely available on Together AI, Fireworks AI, OpenRouter, and Hyperbolic at varying rates. Apache 2.0 license — can self-host for free.
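The blended figure quoted above follows from the per-token rates under a 3:1 input:output weighting (the ratio commonly used for blended price comparisons; treating it as 3:1 here is an assumption of this sketch):

```python
# Blended $/1M-token price under a weighted input:output mix.
# The default 3:1 ratio is an assumption, matching common
# "blended price" conventions for LLM cost comparisons.

def blended_price(input_per_m: float, output_per_m: float,
                  input_weight: int = 3, output_weight: int = 1) -> float:
    total = input_weight + output_weight
    return (input_per_m * input_weight + output_per_m * output_weight) / total

print(round(blended_price(0.20, 0.88), 2))  # instruct mode -> 0.37
print(round(blended_price(0.45, 3.50), 2))  # thinking mode -> 1.21
```

At 3:1, thinking-mode pricing works out to roughly $1.21/1M blended versus $0.37/1M for instruct, which is why the reasoning mode erodes much of the cost advantage.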
$0.20/$0.88

Prices verified February 2026. LLM pricing changes frequently — verify at the provider's site before budgeting.

Last updated: February 2026