Intel's Neural Texture Compression slashes VRAM use by 18x as Nvidia shows 6.5GB reduced to 970MB

Reviewed by Nidhi Govil


Intel unveiled Texture Set Neural Compression (TSNC), an AI-powered compression technology that can reduce game texture sizes by up to 18 times while maintaining image quality. Nvidia demonstrated similar capabilities with its Neural Texture Compression, cutting VRAM usage from 6.5GB to just 970MB in demo scenes. Both solutions address mounting storage and memory challenges as modern games demand increasingly detailed assets.

Intel Launches AI-Powered Compression Technology to Address VRAM Constraints

Intel has introduced its Texture Set Neural Compression (TSNC), an AI-powered compression technology designed to significantly reduce game texture sizes and VRAM demands [1]. The technology offers two operational modes: Variant A achieves over 9x compression ratios while maintaining high image quality, and Variant B pushes compression ratios beyond 18x with modest visual trade-offs [2]. This Neural Texture Compression approach replaces conventional block compression formats with an AI-driven encoding and decoding process that stores textures in a compact neural representation [5]. For game developers facing escalating asset sizes, TSNC provides a practical path to reduce storage requirements, accelerate install times, and cut VRAM use without requiring complete pipeline overhauls.
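The encode-and-decode idea can be sketched in miniature. The following toy Python example is entirely hypothetical — Intel has not published TSNC's network architecture, and every dimension and weight here is an illustrative assumption — but it shows the general shape of neural texture decoding: the shipped asset is a compact latent representation plus small decoder weights, and texels are reconstructed on demand by a tiny MLP.

```python
import math
import random

# Toy sketch of neural texture decoding -- hypothetical shapes, NOT Intel's
# actual TSNC network: a compact latent vector, fetched per texel from a
# shared latent space, is pushed through a tiny MLP to reconstruct RGBA.
random.seed(0)

LATENT_DIM = 16    # assumed width of the shared latent space
HIDDEN_DIM = 32    # assumed hidden width of the decoder MLP
OUT_CHANNELS = 4   # RGBA

# Random weights stand in for the trained decoder shipped with the asset.
w1 = [[random.gauss(0, 0.1) for _ in range(HIDDEN_DIM)] for _ in range(LATENT_DIM)]
w2 = [[random.gauss(0, 0.1) for _ in range(OUT_CHANNELS)] for _ in range(HIDDEN_DIM)]

def decode_texel(latent):
    """Reconstruct one texel's RGBA channels from its latent vector."""
    hidden = [max(0.0, sum(latent[i] * w1[i][h] for i in range(LATENT_DIM)))
              for h in range(HIDDEN_DIM)]                       # ReLU layer
    return [1 / (1 + math.exp(-sum(hidden[h] * w2[h][c] for h in range(HIDDEN_DIM))))
            for c in range(OUT_CHANNELS)]                       # sigmoid -> [0, 1]

texel = decode_texel([random.gauss(0, 1) for _ in range(LATENT_DIM)])
print(len(texel))  # 4 channels for one sampled texel
```

A production decoder would run per pixel on the GPU (via XMX or the FMA fallback discussed below); the point here is only that what gets stored is latent data plus small weight matrices rather than raw texels.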

Source: TweakTown


How Intel's Solution Compares to Traditional Texture Compression Methods

Intel's texture compression technology leverages BC1 texture compression and linear algebra for the XMX-accelerated portion of its neural compression system [1]. Instead of compressing each texture independently, TSNC trains a neural network on related textures and encodes them into a shared latent space stored across four BC1-compressed pyramid levels [2]. In Intel's testing, standard BC compression produced approximately a 4.8x ratio, while TSNC Variant A reached more than 9x compression and Variant B exceeded 18x [2]. The company claims Variant A can compress two 4096 x 4096 64MB textures down to 10.7 MB each while retaining 4K resolution, with the remaining textures reduced to half resolution and compressed to 2.7 MB [1].
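The quoted sizes are easy to sanity-check with a little arithmetic (treating MB as MiB, as GPU texture sizes conventionally are):

```python
# Sanity-check the TSNC figures quoted above (sizes from the article).
raw_bytes = 4096 * 4096 * 4          # uncompressed RGBA8 at 4096 x 4096
raw_mb = raw_bytes / 2**20           # -> 64.0 MB, matching the article
bc_mb = raw_mb / 4.8                 # classic BC at ~4.8x -> ~13.3 MB
per_texture_ratio = raw_mb / 10.7    # Variant A: 64 MB -> 10.7 MB, ~6.0x
print(raw_mb, round(bc_mb, 1), round(per_texture_ratio, 1))
```

Note that 64 MB down to 10.7 MB is only about 6x for the two full-resolution textures; the over-9x figure applies to the texture set as a whole, where the remaining textures drop to half resolution and 2.7 MB each.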

Source: Guru3D


Fallback Mode Enables Broader Hardware Compatibility Beyond Intel GPUs

Intel has developed two execution paths for its decoder: a linear algebra path that utilizes XMX acceleration on supported Intel GPUs, and a fallback path that uses a fused multiply-add (FMA) implementation running on conventional CPU and GPU cores [1]. In a Panther Lake microbenchmark on the integrated B390 GPU, Intel measured approximately 0.661 nanoseconds per pixel on the FMA path versus 0.194 nanoseconds per pixel on the XMX path, a roughly 3.4x performance gain [2]. This dual-path approach makes Intel the only major Western GPU manufacturer offering a neural compression solution that works on graphics cards beyond its own hardware [1]. That flexibility could prove critical for widespread adoption among game developers seeking to optimize assets across diverse hardware configurations.
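Those per-pixel figures translate directly into frame-time budget. A back-of-the-envelope calculation, under the simplifying (and pessimistic) assumption of one texture decode per pixel of a full 4K frame:

```python
# Per-pixel decode cost from Intel's Panther Lake (B390) microbenchmark.
fma_ns_per_px = 0.661
xmx_ns_per_px = 0.194

speedup = fma_ns_per_px / xmx_ns_per_px      # ~3.4x, as Intel reports
pixels_4k = 3840 * 2160                      # one full 4K frame

# Hypothetical worst case: one texture decode for every pixel on screen.
fma_ms = pixels_4k * fma_ns_per_px * 1e-6    # ~5.5 ms per frame
xmx_ms = pixels_4k * xmx_ns_per_px * 1e-6    # ~1.6 ms per frame
print(round(speedup, 2), round(fma_ms, 2), round(xmx_ms, 2))
```

The one-decode-per-pixel framing is an assumption for illustration only; real scenes sample many textures per pixel, and decoders typically amortize cost by caching decompressed tiles.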

Nvidia NTC Demonstrates Dramatic VRAM Reduction in Real-World Scenarios

Nvidia showcased its Neural Texture Compression capabilities through a "Tuscan Wheels" demo, where VRAM usage dropped from approximately 6.5GB with traditional BCN-compressed textures to just 970MB using Nvidia NTC while preserving image quality close to the original [4]. The company's RTX Neural Texture Compression SDK is already available for developers to use today [3]. Nvidia uses a small neural network running on its Tensor cores to reconstruct texture data deterministically, ensuring developers retain full control over visual output [3]. Beyond texture compression, Nvidia also demonstrated Neural Materials technology, which encodes material behavior into compact latent representations. In one example, a material setup with 19 channels was reduced to eight, with Nvidia reporting 1.4x to 7.7x faster 1080p render times [4].
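The demo numbers imply roughly a 6.7x reduction in texture VRAM (assuming decimal units, which the sources do not specify):

```python
# VRAM reduction implied by Nvidia's "Tuscan Wheels" demo figures.
bcn_gb = 6.5          # traditional BCN-compressed textures
ntc_gb = 970 / 1000   # 970 MB with NTC (decimal GB assumed)
reduction = bcn_gb / ntc_gb
print(round(reduction, 1))  # ~6.7x less VRAM for the demo's textures
```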

Source: PCWorld


Quality Trade-offs and Deployment Strategies for Optimizing Game Assets

Intel acknowledges that AI texture compression introduces perceptual trade-offs, with Variant A showing approximately 5% perceptual loss and Variant B exhibiting 6% to 7% quality reduction based on FLIP perceptual analysis [2]. Variant A shows some precision loss in normals, while Variant B begins to display BC1 block artifacts in normals and ARM data [2]. Intel outlined four deployment strategies for developers: compressing textures before uploading to servers to reduce download sizes, streaming textures during game loading, streaming during gameplay, and loading textures on the fly without holding them in VRAM, the last of which is particularly beneficial for low-VRAM GPUs [1]. These flexible deployment options allow developers to target specific bottlenecks, whether related to storage, bandwidth, or memory constraints.
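One way to read the four strategies is as answers to four different bottlenecks. The mapping below is my own framing for illustration; the keys and strategy names are not Intel SDK identifiers:

```python
# Each TSNC deployment strategy targets a different bottleneck (framing is
# illustrative; these names are not Intel SDK identifiers).
STRATEGY_FOR_BOTTLENECK = {
    "download size":       "compress textures before uploading to servers",
    "load time":           "stream compressed textures during game loading",
    "streaming bandwidth": "stream compressed textures during gameplay",
    "vram capacity":       "sample textures on the fly, never holding them in VRAM",
}

def pick_strategy(bottleneck: str) -> str:
    return STRATEGY_FOR_BOTTLENECK.get(
        bottleneck.lower(), "profile first, then pick the matching strategy")

print(pick_strategy("VRAM capacity"))
```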

DirectX Integration and SDK Availability Signal Industry-Wide Adoption

Microsoft plans to build support for neural texture compression into DirectX, creating an API that will enable developers to leverage both "small models" and "scene models" for next-generation rendering [3]. Intel plans to release TSNC as a standalone SDK, with an alpha version scheduled for later this year, followed by beta testing and an eventual public release [2]. The technology can be integrated at various points in a game's lifecycle, from installation and loading to streaming and per-pixel sampling, depending on whether teams prioritize smaller installs, lower bandwidth, or reduced VRAM demands [2]. For PC gamers confronting escalating RAM costs and mounting storage pressure, these neural compression technologies offer tangible relief. Games like Hogwarts Legacy, which requires 58GB of base data plus an additional 18.3GB for its high-definition texture pack, illustrate the growing strain on storage and memory subsystems [3]. By enabling more detailed assets within existing hardware budgets, neural compression could extend the viable lifespan of older graphics cards while reducing the financial burden of keeping pace with increasingly demanding titles.
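To put the quoted ratios against those install sizes, here is a rough, illustrative projection (it optimistically assumes the HD pack is entirely compressible texture data):

```python
# Illustrative projection: apply the quoted TSNC ratios to an 18.3 GB
# HD texture pack (assumes the pack is all compressible texture data).
hd_pack_gb = 18.3
variant_a_gb = hd_pack_gb / 9    # >9x  compression -> roughly 2 GB
variant_b_gb = hd_pack_gb / 18   # >18x compression -> roughly 1 GB
print(round(variant_a_gb, 2), round(variant_b_gb, 2))
```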
