NLAC | Next Level Asset Compressor
Modern games and apps continue to grow in size, posing a dilemma for developers: preserve high-quality content, or accept losing potential users to an increased package size.
Package size not only affects install conversion rates but also has an impact on how long users will keep your application installed. Even with faster network connections, with modern games often reaching tens and sometimes hundreds of gigabytes of data, it’s not uncommon for users to wait hours for a download, which can create a poor first impression before they’ve even started playing.
Data size matters even more on the mobile market, since the majority of users around the world are on very limited tariff plans. In many cases, downloading a large app means extra charges for additional data, which indirectly increases the cost of your game or application to the user in an already highly competitive market. And even when users have access to fast, unlimited Wi-Fi, they often must still decide whether to download and keep your app or free up space for personal content like beloved cat videos, since mobile storage is far more limited than on PC. This has a direct impact on retention and your active user base.
There's also an important consideration from a content distribution standpoint. In the typical scenario of delivering tens of gigabytes of data to a large user base, package size directly impacts total network traffic. Even when using local CDNs, small reductions in data size can lead to significant cost savings over time.
Typically, package size consists mainly of texture assets, and there are two classical approaches to distributing them. The first is to use GPU-native formats such as DXT/BCn or ETC, stored either raw or with a general-purpose compressor like Zstd or Deflate applied on top. This is very desirable for runtime memory efficiency and performance, enabling optimal hardware usage and fast loading on client devices. However, because already-compressed data compresses poorly a second time, the downside is a large package size. Some content is also intricate enough that it cannot be encoded in classical GPU-friendly formats without substantial loss in quality, forcing the adoption of modern, bulkier formats like ASTC 4x4, which further increases the overall package size.
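The "already-compressed data compresses poorly" point is easy to demonstrate. A minimal sketch in Python, using random bytes as a stand-in for a high-entropy compressed texture stream (illustrative only; real BC/ETC block data retains some structure, so Zstd or Deflate may still shave off a few percent in practice):

```python
import os
import zlib

# Stand-in for an already-compressed asset: random bytes approximate the
# high-entropy output of a prior compression pass.
already_compressed = os.urandom(1 << 20)  # 1 MiB

# Apply a general-purpose compressor on top, at maximum effort.
second_pass = zlib.compress(already_compressed, 9)

# The second pass gains essentially nothing: the output is roughly the
# same size as the input (plus a small amount of framing overhead).
ratio = len(second_pass) / len(already_compressed)
print(f"second-pass ratio: {ratio:.4f}")
```

This is why wrapping GPU-native textures in Zstd or Deflate reduces download size only marginally compared to what a codec designed for the image content itself can achieve.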
The second approach is to use general-purpose image codecs like JPEG or WebP, which compress images significantly better than the first approach while maintaining good quality. The limitation is that these codecs are not optimal for hardware, especially mobile hardware, and cannot produce GPU-friendly formats, leading to decreased fps, 3–6 times more memory usage, and reduced phone battery life. Additionally, modern codecs like AVIF or JPEG XL, which offer exceptional compression, have very long decoding times, especially on weak devices, and no user wants to wait while watching a sluggish progress bar.
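The memory-usage gap follows directly from bits-per-pixel arithmetic. A sketch below compares a decoded RGBA8 image (what a general-purpose codec hands you) against two common GPU block formats; the helper function and the 2048x2048 size are illustrative, and the exact multiplier depends on which GPU format you compare against:

```python
def texture_size_bytes(width: int, height: int, bits_per_pixel: float) -> int:
    """GPU memory for a single mip level at the given bits-per-pixel rate."""
    return int(width * height * bits_per_pixel) // 8

W = H = 2048
rgba8 = texture_size_bytes(W, H, 32)  # decoded JPEG/WebP output: 8 bits x RGBA
bc1   = texture_size_bytes(W, H, 4)   # BC1/DXT1: 64 bits per 4x4 block
astc4 = texture_size_bytes(W, H, 8)   # ASTC 4x4: 128 bits per 4x4 block

print(rgba8 >> 20, bc1 >> 20, astc4 >> 20)  # 16, 2, 4 (MiB)
print(rgba8 // astc4, rgba8 // bc1)         # 4x vs ASTC 4x4, 8x vs BC1
```

So a single 2048x2048 texture costs 16 MiB decoded versus 2–4 MiB in a GPU-native format, and that gap is paid on every texture resident in memory.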
Key Points
Sounds too good to be true?
Well, no. NLAC is NOT a universal codec designed to be efficient across the full range of applications, from very blurry, blocky, overcompressed images to mathematically lossless ones — a design that would add complexity for every use case. Instead, it focuses on a narrow set of use cases to maximize performance.
The main idea is to focus on high-quality and high-resolution content, which is what most games use today and typically constitutes the largest portion of the data.
The second focus point is decoder performance. There are industry-proven codecs on the market, like AVIF or JPEG XL, that offer very good image/texture compression, but not only do they not produce GPU-friendly formats, some of them are also quite slow to decompress. You can reduce texture size significantly, but loading times rise substantially and overall performance suffers, and there is no point in trading one source of user frustration for another. This is due to the nature of compression itself, which follows an exponential-like curve of diminishing returns: each incremental improvement in compression ratio requires exponentially greater computational effort. Our goal is to strike the right balance — targeting a compression ratio that typically comes close to, and in some cases even exceeds, that of cutting-edge general-purpose codecs, while keeping computational complexity much lower. This approach significantly outperforms traditional compression methods commonly used in game development, with decoding cost dramatically lower than that of general-purpose codecs.
With that in mind, we leveraged our substantial experience in software optimization to design a performance-oriented decoder from scratch, one that performs very fast even on current low-budget devices, enabling access to millions of new users while enhancing the experience for existing ones.
*some of the numbers are calculated, interpolated or projected based on real measurements and are subject to change with further revisions without prior notice
**all sample textures shown are public domain licensed material