    ByteDance Releases Protenix-v1: A New Open-Source Model Achieving AF3-Level Performance in Biomolecular Structure Prediction
AI News
February 8, 2026 · 4 Mins Read
    How close can an open model get to AlphaFold3-level accuracy when it matches training data, model scale and inference budget? ByteDance has introduced Protenix-v1, a comprehensive AlphaFold3 (AF3) reproduction for biomolecular structure prediction, released with code and model parameters under Apache 2.0. The model targets AF3-level performance across protein, DNA, RNA and ligand structures while keeping the entire stack open and extensible for research and production.

    The core release also ships with PXMeter v1.0.0, an evaluation toolkit and dataset suite for transparent benchmarking on more than 6k complexes with time-split and domain-specific subsets.

    What is Protenix-v1?

Protenix is described as ‘Protenix: Protein + X’, a foundation model for high-accuracy biomolecular structure prediction. It predicts all-atom 3D structures for complexes that can include:

    • Proteins
    • Nucleic acids (DNA and RNA)
    • Small-molecule ligands

    The research team defines Protenix as a comprehensive AF3 reproduction. It re-implements the AF3-style diffusion architecture for all-atom complexes and exposes it in a trainable PyTorch codebase.
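To make the input side concrete, here is a minimal sketch of how a mixed protein/RNA/ligand complex might be described before being handed to an all-atom predictor. The field names (`entities`, `type`, `sequence`, `smiles`) are illustrative assumptions, not the actual Protenix input schema:

```python
# Illustrative only: a hypothetical spec for an all-atom complex.
# Field names are assumptions, not the actual Protenix input schema.
complex_spec = {
    "name": "example_mixed_complex",
    "entities": [
        {"type": "protein", "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"},
        {"type": "rna", "sequence": "GGGCUAUUAGCUCAGUUGGUUAGAGC"},
        {"type": "ligand", "smiles": "CC(=O)Oc1ccccc1C(=O)O"},  # aspirin
    ],
}

def validate_spec(spec: dict) -> list[str]:
    """Check that every entity has a supported type; return the type labels."""
    allowed = {"protein", "dna", "rna", "ligand"}
    types = []
    for ent in spec["entities"]:
        if ent["type"] not in allowed:
            raise ValueError(f"unknown entity type: {ent['type']}")
        types.append(ent["type"])
    return types

print(validate_spec(complex_spec))  # → ['protein', 'rna', 'ligand']
```

The point is simply that one request can mix all four entity classes the model supports; the real schema lives in the Protenix repository.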

    The project is released as a full stack:

    • Training and inference code
    • Pre-trained model weights
    • Data and MSA pipelines
    • A browser-based Protenix Web Server for interactive use

    AF3-level performance under matched constraints

According to the research team, Protenix-v1 (protenix_base_default_v1.0.0) is ‘the first fully open-source model that outperforms AlphaFold3 across diverse benchmark sets while adhering to the same training data cutoff, model scale, and inference budget as AlphaFold3.’

    The important constraints are:

    • Training data cutoff: 2021-09-30, aligned with AF3’s PDB cutoff.
    • Model scale: Protenix-v1 has 368M parameters, matching AF3’s scale class (AF3’s exact parameter count is not disclosed).
    • Inference budget: comparisons use similar sampling budgets and runtime constraints.

    On challenging targets such as antigen–antibody complexes, increasing the number of sampled candidates from several to hundreds yields consistent log-linear improvements in accuracy. This gives a clear and documented inference-time scaling behavior rather than a single fixed operating point.
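The intuition behind this best-of-N scaling can be seen in a toy simulation: draw N candidate scores, keep the best, and watch the expected best rise steeply at first and then saturate. The Uniform(0, 1) quality model below is my assumption for illustration, not Protenix's actual ranking or confidence metric:

```python
import random

def best_of_n(n_samples: int, trials: int = 2000, seed: int = 0) -> float:
    """Mean of the best candidate score out of n_samples i.i.d. draws.

    Candidate quality is modeled as Uniform(0, 1); the expected maximum
    is n / (n + 1), rising quickly at first and then saturating -- a toy
    picture of why sampling more candidates keeps helping, but with
    diminishing returns per additional sample.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.random() for _ in range(n_samples))
    return total / trials

for n in (1, 5, 25, 125, 625):
    print(n, round(best_of_n(n), 3))
```

Each 5× increase in samples buys a shrinking but nonzero gain, which is the latency-accuracy trade-off the article describes.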

    PXMeter v1.0.0: Evaluation for 6k+ complexes

    To support these claims, the research team released PXMeter v1.0.0, an open-source toolkit for reproducible structure prediction benchmarks.

    PXMeter provides:

    • A manually curated benchmark dataset, with non-biological artifacts and problematic entries removed
    • Time-split and domain-specific subsets (for example, antibody–antigen, protein–RNA, ligand complexes)
    • A unified evaluation framework that computes metrics such as complex LDDT and DockQ across models
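As a rough illustration of the first metric, here is a heavily simplified lDDT sketch: it scores how well local pairwise distances in the reference are preserved in the prediction, averaged over four tolerances. It omits the per-residue bookkeeping and stereochemistry checks of real implementations, and is not PXMeter's code:

```python
import numpy as np

def lddt(pred: np.ndarray, ref: np.ndarray,
         cutoff: float = 15.0,
         thresholds=(0.5, 1.0, 2.0, 4.0)) -> float:
    """Simplified lDDT sketch. pred/ref are (N, 3) arrays of matched atoms."""
    dr = np.linalg.norm(ref[:, None] - ref[None, :], axis=-1)
    dp = np.linalg.norm(pred[:, None] - pred[None, :], axis=-1)
    n = len(ref)
    mask = (dr < cutoff) & ~np.eye(n, dtype=bool)   # local pairs in reference
    diff = np.abs(dp - dr)[mask]
    # fraction of preserved distances, averaged over the four tolerances
    return float(np.mean([(diff < t).mean() for t in thresholds]))

ref = np.array([[0, 0, 0], [1.5, 0, 0], [3.0, 0, 0]], dtype=float)
print(lddt(ref, ref))  # a perfect prediction scores 1.0
```

DockQ similarly combines interface-contact and backbone-deviation terms into one score for docked complexes; PXMeter's value is computing both metrics identically across all compared models.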

The associated PXMeter research paper, ‘Revisiting Structure Prediction Benchmarks with PXMeter,’ evaluates Protenix, AlphaFold3, Boltz-1 and Chai-1 on the same curated tasks, and shows how different dataset designs affect model ranking and perceived performance.

How Protenix fits into the broader stack

    Protenix is part of a small ecosystem of related projects:

    • PXDesign: a binder design suite built on the Protenix foundation model. It reports 20–73% experimental hit rates and 2–6× higher success than methods such as AlphaProteo and RFdiffusion, and is accessible via the Protenix Server.
    • Protenix-Dock: a classical protein–ligand docking framework that uses empirical scoring functions rather than deep nets, tuned for rigid docking tasks.
    • Protenix-Mini and follow-on work such as Protenix-Mini+: lightweight variants that reduce inference cost using architectural compression and few-step diffusion samplers, while keeping accuracy within a few percent of the full model on standard benchmarks.
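Since diffusion inference cost scales roughly linearly with the number of denoising steps, a few-step sampler like the ones in the Mini variants can be pictured as subsampling the full step schedule. This is a toy stand-in I wrote for illustration, not Protenix-Mini's actual sampler:

```python
def few_step_schedule(full_steps: int, few_steps: int) -> list[int]:
    """Pick `few_steps` evenly spaced timesteps from a `full_steps` diffusion
    schedule -- a toy picture of how few-step samplers cut inference cost
    roughly in proportion to the number of denoising steps kept."""
    assert 1 <= few_steps <= full_steps
    stride = full_steps / few_steps
    return [round(i * stride) for i in range(few_steps)]

print(few_step_schedule(200, 5))  # → [0, 40, 80, 120, 160]
```

Running 5 steps instead of 200 would cut the denoising compute by roughly 40×, which is the kind of budget the Mini variants trade a few points of accuracy for.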

    Together, these components cover structure prediction, docking, and design, and share interfaces and formats, which simplifies integration into downstream pipelines.

    Key Takeaways

    • AF3-class, fully open model: Protenix-v1 is an AF3-style all-atom biomolecular structure predictor with open code and weights under Apache 2.0, targeting proteins, DNA, RNA and ligands.
    • Strict AF3 alignment for fair comparison: Protenix-v1 matches AlphaFold3 on critical axes: training data cutoff (2021-09-30), model scale class and comparable inference budget, enabling fair AF3-level performance claims.
    • Transparent benchmarking with PXMeter v1.0.0: PXMeter provides a curated benchmark suite over 6k+ complexes with time-split and domain-specific subsets plus unified metrics (for example, complex LDDT, DockQ) for reproducible evaluation.
    • Verified inference-time scaling behavior: Protenix-v1 shows log-linear accuracy gains as the number of sampled candidates increases, giving a documented latency–accuracy trade-off rather than a single fixed operating point.

Check out the Repo and Try it here.
