#gpt oss
GPT-OSS Unleashed: The Open-Source GPT Model Taking On ChatGPT and Transforming AI
• Hot Trendy News
OpenAI has released GPT-OSS, its first set of open-weight language models since GPT-2, marking a return to open development after years of closed-source releases. The launch comes in two sizes, gpt-oss-120b and gpt-oss-20b, delivering state-of-the-art reasoning that approaches GPT-3.5 quality while remaining free to download, run, and fine-tune locally.
The flagship gpt-oss-120b scores 76.4% on MMLU and 57% on GSM8K, outperforming most publicly available models except recent mixture-of-experts giants. The lighter gpt-oss-20b, designed for laptops and edge devices, still posts a respectable 68.1% on MMLU while needing only 16 GB of GPU VRAM. Both checkpoints ship under the permissive Apache 2.0 license, which allows commercial use and modification, a friendlier stance than Meta’s Llama 3 community license.
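The 16 GB figure is easy to sanity-check with back-of-envelope arithmetic: at 4-bit quantization each weight costs half a byte, plus some headroom for activations and the KV cache. A minimal sketch (the 1.2x overhead factor is an assumption for illustration, not a measurement):

```python
def quantized_vram_gb(n_params: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight storage times an overhead factor
    for activations and KV cache (the overhead is an assumption)."""
    bytes_per_weight = bits_per_weight / 8
    return n_params * bytes_per_weight * overhead / 1e9

# 20B parameters in 16-bit vs. 4-bit precision:
print(round(quantized_vram_gb(20e9, 16), 1))  # 48.0 GB -> data-center territory
print(round(quantized_vram_gb(20e9, 4), 1))   # 12.0 GB -> fits in 16 GB of VRAM
```

By the same arithmetic, the unquantized 16-bit weights alone would need roughly 40 GB, which is why the quantized builds are what end up running on consumer hardware.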
Developers can grab the weights today from the new GPT-OSS page, the Hugging Face Hub, and BitTorrent magnet links; OpenAI also published a Colab notebook and an in-browser playground for quick experimentation. Early benchmarks show gpt-oss-120b generating 8 tokens/s on a single RTX 6000 Ada with the new Flash-Attention-3 kernel, while quantized 4-bit builds of gpt-oss-20b run at chat speed on Apple Silicon MacBooks.
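To put the reported throughput in perspective, token rates convert directly into wall-clock latency. A rough sketch (the 0.75 words-per-token ratio is a common rule of thumb for English text, not a measured property of the GPT-OSS tokenizer):

```python
def generation_time_s(n_tokens: int, tokens_per_sec: float) -> float:
    """Wall-clock seconds to generate n_tokens at a steady decode rate."""
    return n_tokens / tokens_per_sec

def tokens_for_words(n_words: int, words_per_token: float = 0.75) -> int:
    """Approximate token count for an English text of n_words."""
    return round(n_words / words_per_token)

# A ~300-word answer at the reported 8 tokens/s on gpt-oss-120b:
tokens = tokens_for_words(300)          # 400 tokens
print(generation_time_s(tokens, 8.0))   # 50.0 seconds
```

At that rate, interactive use of the 120b model on a single workstation GPU is workable but not snappy, which is part of why the 20b model is pitched at laptops and edge devices.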
Within hours, the community pushed adapters for Hugging Face’s PEFT, LangChain, Ollama and the open-source ChatGPT UI, enabling plug-and-play chatbots, RAG pipelines and agentic workflows. Enterprise teams cite on-premises deployment and data-sovereignty compliance as top reasons to migrate; indie hackers highlight zero API costs and no rate limits.
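The RAG pipelines mentioned above all share the same core step: retrieving the most relevant documents by embedding similarity before prompting the model. A minimal dependency-free sketch, with toy 3-dimensional vectors standing in for a real embedding model (the document texts and vectors are invented for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, k=2):
    """Return the texts of the k documents most similar to the query."""
    ranked = sorted(corpus, key=lambda doc: cosine(query_vec, doc["vec"]), reverse=True)
    return [doc["text"] for doc in ranked[:k]]

# Toy 3-dimensional "embeddings" stand in for a real embedding model.
corpus = [
    {"text": "GPT-OSS licensing terms", "vec": [0.9, 0.1, 0.0]},
    {"text": "Baking sourdough bread",  "vec": [0.0, 0.2, 0.9]},
    {"text": "Running models locally",  "vec": [0.8, 0.3, 0.1]},
]
hits = retrieve([1.0, 0.2, 0.0], corpus, k=2)
print(hits)  # ['GPT-OSS licensing terms', 'Running models locally']
```

In a real pipeline the retrieved texts would be concatenated into the model's prompt; the LangChain and Ollama adapters wrap exactly this retrieve-then-generate loop.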
GPT-OSS lands in a crowded landscape that already features Llama 3, Mistral 7B, Mixtral 8x22B and Google’s Gemma. Yet OpenAI’s brand recognition, larger context window (128 k tokens) and permissive license could make GPT-OSS the default base model for startups looking to embed AI without vendor lock-in.
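A 128k-token window is large enough to matter in practice; rough arithmetic shows how much raw text it covers (the 0.75 words-per-token and 500 words-per-page figures are rule-of-thumb assumptions):

```python
def words_in_context(context_tokens: int, words_per_token: float = 0.75) -> int:
    """Approximate English word capacity of a context window."""
    return round(context_tokens * words_per_token)

def pages_in_context(context_tokens: int, words_per_page: int = 500) -> int:
    """Approximate page capacity, assuming dense single-spaced pages."""
    return words_in_context(context_tokens) // words_per_page

print(words_in_context(128_000))  # 96000 words
print(pages_in_context(128_000))  # 192 pages, i.e. a short book in one prompt
```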
Getting started:
• Download the model weights (≈450 GB for the 120B model, ≈30 GB for the 20B model) and the tokenizer.
• Install the reference loader: `pip install gpt-oss`.
• Run the quick-start script: `python chat.py --model gpt-oss-20b --quant 4bit`.
• Fine-tune with QLoRA on your proprietary dataset or connect to vector databases for RAG.
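The QLoRA step above is cheap precisely because LoRA trains only small low-rank adapter matrices rather than the full weights. A sketch of the parameter arithmetic (the layer count, hidden size, and number of adapted matrices are illustrative assumptions, not gpt-oss-20b's actual architecture):

```python
def lora_trainable_params(n_layers: int, d_model: int, rank: int,
                          n_target_matrices: int = 4) -> int:
    """Trainable parameters added by LoRA: each adapted weight matrix
    gets two low-rank factors, A (d_model x rank) and B (rank x d_model)."""
    per_matrix = 2 * d_model * rank
    return n_layers * n_target_matrices * per_matrix

# Illustrative transformer shape (assumed, not gpt-oss-20b's real config):
full_params = 20e9
lora_params = lora_trainable_params(n_layers=40, d_model=6144, rank=16)
print(lora_params, f"{lora_params / full_params:.4%}")
```

Under these assumptions the adapters total about 31M trainable parameters, well under 0.2% of a 20B-parameter model, which is what makes single-GPU fine-tuning feasible.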
By throwing open the doors to GPT-class capability, OpenAI has reignited the open-source AI arms race and lowered the barrier for anyone to build privacy-preserving, locally hosted generative applications—from enterprise copilots to backyard robotics.