GPT-OSS Unleashed: The Open-Source GPT Model Taking On ChatGPT and Transforming AI
OpenAI has released GPT-OSS, its first set of open-weight language models since GPT-2, marking a dramatic return to open development after years of closed-source releases. The launch comprises two models, gpt-oss-120b and gpt-oss-20b, which deliver reasoning abilities approaching GPT-3.5 quality while remaining free to download, run, and fine-tune locally.
The flagship gpt-oss-120b scores 76.4% on MMLU and 57% on GSM8K, outperforming most publicly available models except recent mixture-of-experts giants. The lighter gpt-oss-20b, designed for laptops and edge devices, still posts a respectable 68.1% on MMLU while needing only 16 GB of GPU VRAM. Both checkpoints ship under the permissive Apache 2.0 license, which allows commercial use and modification, a friendlier stance than Meta's Llama 3 community license.
Developers can grab the weights today from the new GPT-OSS page, the Hugging Face Hub, and BitTorrent magnet links; OpenAI also published a Colab notebook and an in-browser playground for quick experimentation. Early benchmarks show gpt-oss-120b generating 8 tokens/s on a single RTX 6000 Ada with the new FlashAttention-3 kernel, while quantized 4-bit builds of gpt-oss-20b run at chat speed on Apple Silicon MacBooks.
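For anyone pulling the checkpoint from the Hugging Face Hub, a standard Transformers loading flow should be enough to get a first reply. The sketch below is illustrative, not official quick-start code: the repo id `openai/gpt-oss-20b` and the chat-template call are assumptions based on the article's description of the release.

```python
# Minimal sketch: load gpt-oss-20b with Hugging Face Transformers.
# Assumes `pip install transformers accelerate torch` and that the
# checkpoint lives at the (assumed) Hub repo id "openai/gpt-oss-20b".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the checkpoint's native precision
    device_map="auto",   # let Accelerate spread layers over GPU/CPU memory
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The `device_map="auto"` setting is what makes the modest VRAM figures plausible in practice: layers that don't fit on the GPU are offloaded to system RAM rather than failing outright.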
Within hours, the community pushed adapters for Hugging Face’s PEFT, LangChain, Ollama and the open-source ChatGPT UI, enabling plug-and-play chatbots, RAG pipelines and agentic workflows. Enterprise teams cite on-premises deployment and data-sovereignty compliance as top reasons to migrate; indie hackers highlight zero API costs and no rate limits.
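The Ollama route is the quickest of those integrations to script against, since Ollama serves every model behind the same local REST endpoint. A minimal sketch, assuming the model has been pulled under the tag `gpt-oss:20b` (the tag name is a guess, not a confirmed registry entry):

```python
# Query a locally running Ollama server over its standard REST API.
# Prerequisite: `ollama pull gpt-oss:20b` (tag name is an assumption).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default endpoint
    json={
        "model": "gpt-oss:20b",             # assumed model tag
        "prompt": "Summarize the GPT-OSS release in two sentences.",
        "stream": False,                    # return a single JSON object
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```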
GPT-OSS lands in a crowded landscape that already features Llama 3, Mistral 7B, Mixtral 8x22B, and Google's Gemma. Yet OpenAI's brand recognition, larger context window (128k tokens), and permissive license could make GPT-OSS the default base model for startups looking to embed AI without vendor lock-in.
Getting started:
• Download the model weights (≈450 GB for the 120B model, ≈30 GB for the 20B) and tokenizer.
• Install the reference loader: `pip install gpt-oss`.
• Run the quick-start script: `python chat.py --model gpt-oss-20b --quant 4bit`.
• Fine-tune with QLoRA on your proprietary dataset or connect a vector database for RAG (see the sketches after this list).
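For the QLoRA step, the outline below uses the `peft`, `transformers`, `bitsandbytes`, and `datasets` libraries. It is a sketch under stated assumptions: the Hub repo id, the LoRA target module names, and the `train.jsonl` dataset path are all illustrative, and real hyperparameters would need tuning to the model and hardware.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_id = "openai/gpt-oss-20b"  # assumed Hub repo id

# Load the base model in 4-bit NF4 so it fits in modest VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# Attach small trainable LoRA adapters to the attention projections.
lora = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # hypothetical module names
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)

# Tokenize a proprietary dataset; "train.jsonl" is a placeholder path
# with one {"text": ...} record per line.
data = load_dataset("json", data_files="train.jsonl")["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512))

trainer = Trainer(
    model=model,
    train_dataset=data,
    # mlm=False makes the collator copy input_ids into labels for causal LM.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    args=TrainingArguments(
        output_dir="gpt-oss-20b-qlora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        logging_steps=10,
    ),
)
trainer.train()
model.save_pretrained("gpt-oss-20b-qlora-adapter")  # adapter weights only
```

Because only the low-rank adapter weights are trained, the 4-bit base model stays frozen and the saved adapter is small enough to version and share separately from the full checkpoint.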
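For the RAG half of that step, a bare-bones retrieval loop could look like the following. The `chromadb` vector store, collection name, and documents are illustrative choices; the assembled prompt would be sent to a locally hosted gpt-oss model, for example via the Ollama call shown earlier.

```python
# Bare-bones RAG retrieval sketch with chromadb as the vector store.
# Collection name and documents are placeholders.
import chromadb

client = chromadb.Client()  # in-memory store; persistence is optional
collection = client.create_collection("internal-docs")

# Index a few documents; chromadb embeds them with its default model.
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Our refund policy allows returns within 30 days of purchase.",
        "Support hours are 9am to 5pm Eastern, Monday through Friday.",
    ],
)

# Retrieve the most relevant snippet for a user question.
question = "When can customers get a refund?"
hits = collection.query(query_texts=[question], n_results=1)
context = hits["documents"][0][0]

# Ground the model's answer in the retrieved context.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # feed this prompt to gpt-oss, e.g. via the Ollama call above
```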
By throwing open the doors to GPT-class capability, OpenAI has reignited the open-source AI arms race and lowered the barrier for anyone to build privacy-preserving, locally hosted generative applications—from enterprise copilots to backyard robotics.