We took our handsome pair of new Arc A700-series GPUs out for some glamour shots. While minding standard static-related protocols, of course. Credit: Sam Machkovech
What's it like owning a brand-new Intel Arc A700-series graphics card? Is it the show-stopping clapback against Nvidia that wallet-pinched PC gamers have been dreaming of? Is it an absolute mess of unoptimized hardware and software? Does it play video games?
That last question is easy to answer: yes, and pretty well. Intel now has a series of GPUs entering the PC gaming market just in time for a few major industry trends to play out: some easing in the supply chain, some crashes in cryptocurrency markets, and more GPUs being sold near their originally announced MSRPs. If those factors continue to move in consumer-friendly directions, it will mean that people might actually get to buy and enjoy the best parts of Intel’s new A700-series graphics cards. (Sadly, limited stock remains a concern in modern GPU reviews. Without firm answers from Intel on how many units it's making, we’re left wondering what kind of Arc GPU sell-outs to expect until further notice.)
While this is a fantastic first-generation stab at an established market, it's still a first-generation stab. In great news, Intel is taking the GPU market seriously with how its Arc A770 (starting at $329) and Arc A750 (starting at $289) cards are architected. Their best results come from games built on modern and future rendering APIs, and in those gaming scenarios, their power and performance exceed their price points.
Yet our time with both Arc-branded GPUs has been like picking through a box of unlabeled chocolates. While none of our testing results were necessarily revolting, a significant percentage tasted funny enough to make a general recommendation pretty tricky.
Warning: Intel buyers will want (if not need) a ReBAR-compatible PC
A brief unboxing and examination of Intel's new Arc A700-series GPUs. These boxes sure look like shirts you'd see in '80s family photos.
This shot was taken after one of the "Let's Play" information packets had been removed.
1 x HDMI 2.1, 3 x DP 2.0.
Twin fan blower, aimed downward when installed in a typical ATX case.
Credit: Sam Machkovech
There's a lot to get into with Intel's latest major entry into the GPU market, and it's important to start by addressing a considerable barrier to entry for potential customers.
Intel strongly urges buyers of its new Arc graphics card line to triple-check their computer’s support for a pair of relatively recent features: Resizable BAR ("ReBAR") and/or Smart Access Memory. We say "and/or" because they're branded versions of the same technology. The shortest explanation is that a ReBAR-compatible motherboard can send much larger chunks of data to and from the graphics card on a regular basis, and Intel would really like you to turn the feature on if possible.
Will the Arc A750 and Arc A770 graphics cards work without Resizable BAR enabled? Yes, but we don't recommend it. Intel's Arc architecture leans heavily into ReBAR's wide-open pipeline to your GPU's frame buffer—so much so that it doesn't have a fallback when a game's workload includes constant streaming of assets like textures. The best example I found was in driving along Cyberpunk 2077's sprawling highways at high speeds. With ReBAR enabled on my AMD Ryzen 7 5800X system, I could expect smooth-enough driving at 1440p with "high" settings enabled and ray tracing disabled. (This test's "1 percent low" frame rate count, indicating the worst persistent dips, measured above 30 fps, which is pretty good.)
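If the "1 percent low" metric is unfamiliar: it's derived from the slowest slice of a benchmark run's frame-time log. Here's a minimal, hypothetical sketch of that math; capture tools differ on the exact method, and this isn't the tooling behind our benchmark numbers:

```python
# Hypothetical illustration of how average fps and "1% low" fps are commonly
# derived from per-frame render times (in milliseconds). Capture tools differ
# on the details; this is one common approach, not our benchmarking pipeline.

def fps_metrics(frame_times_ms):
    """Return (average fps, 1% low fps) from a list of frame times in ms."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds

    # "1% low": average the slowest 1 percent of frames, then convert to fps.
    slowest = sorted(frame_times_ms, reverse=True)
    slice_len = max(1, len(slowest) // 100)
    one_pct_low_fps = 1000.0 / (sum(slowest[:slice_len]) / slice_len)
    return avg_fps, one_pct_low_fps

# Made-up capture: mostly ~16.7 ms frames (about 60 fps) plus a few big hitches.
frame_times = [16.7] * 990 + [250.0] * 10
avg, low = fps_metrics(frame_times)
print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

A run like the made-up one above averages a respectable-looking 52 fps, but its 1 percent low of 4 fps tells you the hitches would be impossible to ignore.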
I then rebooted, disabled ReBAR on the BIOS level, and played the same Cyberpunk segment again. The result was nigh unplayable, thanks to constant multi-second pauses and chugs. To give this scenario a fair shake, I immediately reloaded the save file in question and tried again in case this was a matter of one-time shader compilation causing the stutters. The bad numbers persisted between the tests.
Should your favorite games revolve around tight corridors or slower runs through last-gen 3D environments, the Arc GPU difference between ReBAR enabled and disabled can range from a margin-of-error sliver to a 10–15 percent dip. But even if you can stomach those issues, you might run into significant quirks outside of gaming. In my case, Google Chrome and Microsoft Edge would both routinely glitch with ReBAR disabled while videos played in any tab. The whole browser window would turn into gibberish while the rest of the OS environment remained intact. It looked like this:
The only fix for this error was to enable ReBAR. If you don't have a relatively recent CPU-and-motherboard combo that supports either ReBAR or Smart Access Memory—basically, Intel's 10th-generation Core CPUs and up or AMD's Ryzen 3000 series and up—the rest of this review may be moot for you.
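On Windows, the easiest check is the one we used above: look for the Resizable BAR toggle in your motherboard's BIOS (Arc Control, covered later, also reports the setting). For the Linux-curious, here's a rough, unofficial heuristic sketch: with ReBAR active, one of the GPU's memory BARs is typically sized to match its VRAM rather than the classic 256MB aperture. The PCI address below is a placeholder you'd swap for your own card's.

```python
# Rough, unofficial heuristic (Linux only): read the GPU's PCI BAR sizes from
# sysfs and flag whether any BAR looks large enough to indicate Resizable BAR.
# The PCI address is a placeholder; find yours with `lspci | grep -i vga`.

PCI_ADDRESS = "0000:03:00.0"  # placeholder slot for the graphics card

def largest_bar_bytes(pci_address):
    """Return the size in bytes of the largest BAR exposed by this device."""
    sizes = []
    with open(f"/sys/bus/pci/devices/{pci_address}/resource") as f:
        for line in f:
            start, end, _flags = (int(field, 16) for field in line.split())
            if end > start:  # skip unused/empty regions
                sizes.append(end - start + 1)
    return max(sizes, default=0)

size = largest_bar_bytes(PCI_ADDRESS)
verdict = "ReBAR likely enabled" if size >= 4 * 2**30 else "ReBAR likely disabled"
print(f"largest BAR: {size / 2**30:.2f} GiB -> {verdict}")
```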
That's an unfortunate brick wall for a PC gaming market dominated by budget-priced CPUs and GPUs.
How we got here
Since around June of 2017, the single most popular GPU in the Steam Hardware Survey has been Nvidia’s GeForce GTX 1060, a midrange card that launched for $250 over six years ago. (Adorably, we called a $400 GPU price tag “hefty” in our review. We were all much younger in 2016.)
It's looking like its reign might finally come to an end in the coming months, especially now that the GPU shortage is definitively over. But after the GTX 1060, we see a rogues' gallery of mid-grade Nvidia GPUs—in order, the GTX 1650, RTX 2060, GTX 1050 Ti, RTX 3060 (laptop and desktop), GTX 1050, GTX 1660 Ti, and GTX 1660 Super. You need to get to the No. 10 spot in August's data before you hit a model number without a 6 or a 5 in it. Collectively, that run of midrange cards accounts for a little over a third of the PC GPU install base.
We bring this up not to emphasize Nvidia’s dominance of the GPU space—though it’s worth remembering the clout Nvidia has—but to point out that for most graphics card buyers, what matters most isn’t raw speed but the performance you get for the money. And that’s what Intel is counting on with its first wave of Arc GPUs. At $289, $329, and $349, respectively, the Arc A750, Arc A770 (8GB version), and Arc A770 (16GB version) are all priced underneath Nvidia's RTX 3060. The 3060 is the only competing card that Intel is using as a comparison point in its opening PR salvo. As marketing strategies go, it's not subtle.
Whatever Arc's circumstances, whatever its shortcomings, a serious deep-pocketed newcomer to the GPU market is the most significant thing to happen since AMD bought ATI in 2006, or maybe even further back—since 3dfx went under at the turn of the millennium. It's a big deal. And Intel hasn't jumped into the market tentatively.
More context: Major hires, plus an “integrated” reputation to shed
Intel's present-day dedicated graphics project dates back to November of 2017, when Intel hired graphics exec Raja Koduri away from his second stint at AMD. Intel didn't play coy about its intentions. Koduri, who also worked on graphics at Apple when the company was assembling its first custom iPhone chips and transitioning its entire product lineup to high-density "Retina" displays, was coming to Intel to "expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments."
It was a big move for a company known primarily (and not always fondly) for its integrated graphics products. But Intel had been trying to shed its reputation as a purveyor of barebones utilitarian GPUs for years at that point. By the time 2013's Haswell CPU architecture arrived, more than half of the "CPU" die in a laptop like that year's MacBook Air was actually taken up by the chip's integrated GPU. That was the same year Intel introduced an "Iris Pro" branded GPU, which was set apart from the rest of the lineup by the inclusion of a small chunk of DRAM that the GPU could use in addition to the larger, slower pool of system memory that most integrated GPUs need to share with the rest of the computer. And if the results weren't exactly as fast as the low-end dedicated GPUs of the day, they weren't laughably far behind, either.
Intel had also been taking other, non-performance-related aspects of its GPUs increasingly seriously, introducing and continuously improving a hardware-accelerated video encoding and decoding feature called "Quick Sync" and adding support for new versions of the DirectX, OpenGL, OpenCL, and (eventually) Vulkan APIs.
Making a dedicated GPU lineup seemed like a logical next step for Intel. It would leverage technology and drivers that the company had been developing for years, and it would help the company stay relevant as GPU-accelerated server and machine-learning workflows became ever more common.
By late 2019, Koduri had begun talking about Intel's "Xe" architecture, a foundation on top of which everything from its integrated GPUs to high-end server, workstation, and gaming products would be built. By late 2020, the first Xe-based GPUs came to market as fairly impressive integrated GPUs for Intel's 11th-generation Core laptop CPUs (codenamed "Tiger Lake"). By early 2021, the first Xe dedicated GPU had technically been released in the form of an oddball OEM-only card called "DG1," with whispers of a more-powerful "DG2" card following in its wake. DG2 eventually became known as Arc Alchemist (that's what the "A" in "A770" stands for), the cards we're finally holding in our hands today.
Part of the delay in getting Arc to market—and five years is a long time, even in chip development—has at least a little to do with the same manufacturing troubles that have bedeviled the rest of the company for the last decade. DG1 was built on a 10 nm Intel manufacturing process, and in that 2019 presentation, future Intel Xe chips were planned on a 7 nm process that still hasn't seen the light of day. (When it does, it will be called "Intel 4," just as today's "Intel 7" was formerly a version of the 10 nm process.) Arc is actually built on a 6 nm process from TSMC, one of Intel's arch-competitors, though it's safe to assume that this wasn't the plan when Koduri joined the company back in 2017, when Intel still did most things strictly in-house.
Intel's communication strategy in 2022 has laid the blame for continued delays at the feet of its graphics drivers; Intel's last earnings call suggested that the company was sitting on hundreds of millions of dollars worth of hardware that it simply hadn't been able to sell. But Intel has spent its summer preparing reviewers and end users for lower-than-expected performance in games based on older versions of the DirectX graphics API. That PR strategy—and aggressive pricing—may have bought Intel some time. But for the first wave of Arc buyers, those rough drivers may define your experience with the GPUs.
The (not-fully-baked) Arc Control app
We've been using and testing the A770 (16GB edition) and A750 for a little less than a week now, but we actually bought a cheaper, lower-end Arc GPU weeks ago to get a sense of how its drivers were shaping up. ASRock's Challenger A380 appeared on Newegg a couple of months ago, and we've been kicking its tires with a succession of graphics drivers since then. The 700-series cards are more powerful, but no matter what Arc card you're using, it's clear that Intel has some problems to solve.
Intel's GPU driver has four components: the driver itself; the Intel Driver and Support Assistant, which manages updates for the GPU drivers and a few other Intel-designed things in your system if you have them; the Intel Graphics Command Center, which handles basic video settings and is essentially the same app Intel provides for its integrated graphics; and Arc Control, an Arc-specific overlay app that can control game-specific 3D settings, monitor system performance, capture footage and control your webcam, and apply mild overclocks.
Things are better now than they were even a couple of months ago, when the hosts at YouTube channel Gamers Nexus ran through a long list of broken or buggy features. In the current version (31.0.101.3430), the basic features seem to work. But that doesn't mean the software feels done.
The main culprit is Arc Control, which will get your relationship with Arc off to a rocky start by demanding administrator access every time you run it (including every time you restart your PC). Also, irritatingly, it currently exists only as an overlay, even when you're not in a game. It floats over the top of your other windows, but you can't interact with any of them without closing the Arc Control window.
Expanding the Arc Control window grants access to some auto-update settings. In these early days, you'll probably want to install every new Arc driver as quickly as you can.
Arc Control can show you a bit of info about your hardware and gives you a way to check whether the all-important ReBAR setting is on.
A smaller system performance monitor overlay can be brought up in any app or game by pressing Alt+O.
A handful of game settings can be set individually from within Arc Control, though for most, you should continue tuning settings within the games themselves.
More extensive performance monitoring plus mild (and somewhat obtuse) overclocking options are available for all Arc GPUs.
Some of Arc Control's more intriguing features involve screen capturing, which leverages the GPUs' hardware-accelerated video encoding and decoding blocks and can integrate webcam footage.
Some webcam-agnostic background blurring and replacement features are available, too, though you may prefer to use your webcam's native software if it came with any.
Credit: Andrew Cunningham
Once in a game, it's nice to be able to bring up the full Arc Control overlay by pressing Alt+I or the smaller system performance monitoring overlay with Alt+O. But for desktop use, a normal application window that can coexist with your other application windows would be ideal.
Let’s do the numbers
We still have more context to address, as Intel's bold entry into the GPU market needs it, but now is a good time to look at some performance numbers.
The new cards' incredible performance in synthetic benchmarks suggests that the Arc A770 (specifically its 16GB GDDR6 VRAM model, which is the only A770 we tested) and Arc A750 (which has only one model, sporting 8GB of GDDR6 VRAM) are going to clean house top to bottom. That, of course, is why we don't rely solely on synthetic benchmarks; GPU manufacturers prioritize them, which makes their usefulness suspect.
This gallery consists primarily of benchmarks that rely on normal rasterization (i.e., no ray tracing). In general, the A770 wins each test in this gallery, while the A750 typically either leads or contends directly with the RTX 3060.
Once we get into games, at least, the Arc A770 and Arc A750 generally kick serious butt for their price range, and they’re an easier recommendation right now—even in their early, uneven state—than the comparable AMD RX 6600XT. Really, it’s simple: Based on our testing results, you should probably buy either of these GPUs before you buy a 6600XT.
Once Nvidia’s comparable modern GPU, the RTX 3060 (non-Ti), enters the conversation, however, the results aren’t necessarily as clear-cut. They’re certainly neck-and-neck, at least.
At least one example of Intel’s Arc Redemption
When it comes to tests of standard rasterization (i.e., games without ray tracing), both Arc cards enjoy a few massive leads. The biggest and most jaw-dropping comes from Red Dead Redemption 2, a PC heavyweight that stands out partly because its console version never got a "next-gen" patch. If you want to uncork this Western epic to its fullest, you'll need the PC version. In like-for-like testing, the Arc A750 exceeds the Nvidia RTX 3060 by 15 percent at 4K resolution and a whopping 23 percent at 1440p. The Arc A770 gets even bigger RDR2 wins compared to the 3060: 23 and 30 percent, respectively.
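For clarity, those percentages are simple ratios of average frame rates; here's a quick sketch of the arithmetic, using made-up fps values rather than our actual RDR2 numbers:

```python
# The arithmetic behind the percentage leads quoted above, using made-up
# frame rates rather than our actual benchmark data.

def percent_lead(challenger_fps, baseline_fps):
    """How far challenger_fps runs ahead of baseline_fps, as a percentage."""
    return (challenger_fps / baseline_fps - 1.0) * 100.0

# Hypothetical: a 73.8 fps average vs. a 60 fps baseline is a 23 percent lead.
print(f"{percent_lead(73.8, 60.0):.0f} percent")
```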
Without ray tracing enabled, both Arc cards beat the RTX 3060 in identical Watch Dogs Legion workloads as well (by margins of 11.3 percent and 25.5 percent). The same good news goes for older software like Witcher 3 and modern games like Hitman 3 and Far Cry 6.
However, despite its 8GB of GDDR6 VRAM, the Arc A750 can exhibit some dismal numbers in like-for-like tests that another 8GB GDDR6 card, AMD's RX 6600XT, does not. It's unclear whether these cases are specifically a matter of badly optimized VRAM. But in many of those non-RT scenarios, the Arc A770 either matches or exceeds the RTX 3060 in like-for-like tests, including in Guardians of the Galaxy, Assassin's Creed Valhalla, and Cyberpunk 2077.
Some of those results actually come from games running the DirectX 11 API, despite Intel's official guidance that it has designed its GPUs around the likes of DirectX 12 and Vulkan. (In a few cases, picking DX11 coughed up a few percentage points of better performance from both Arc GPUs we tested.) So you won't always want to default to DX12 when playing your favorite adventure games on PC.
Hark! The Arc has some rough waters ahead
Some of our worst Arc A700-series testing didn't make the benchmarks, so you'll want to read the text below to get additional context on Intel's pre-release shortcomings.
One of many reminders that the Arc A750 can turn up surprisingly grim results for seemingly inexplicable reasons. We're not sure what about this 1080p workload pounds it so much.
Yet a few games turned in freakish test results, and they should be treated as massive red flags before buying into Intel's first generation of Arc A700-series GPUs.
The worst, by far, comes from Grand Theft Auto V, both because of the size of the performance dip and because the title remains one of the most popular games in the world, owing to its GTA Online mode. In both 1440p and 4K workloads, the Arc cards deliver approximately half the frame rate of the RTX 3060 in the same tests. It's so bad that the RTX 3060's "lowest 1 percent" measure, which accounts for a threshold of worst-case-scenario dips, is significantly higher than either Arc card's averaged-out FPS in the same tests.
Apex Legends is massively popular as a free-to-play first-person shooter in part because it's optimized to run on potato-grade PCs. Sadly, the Arc A700-series GPUs don't hold up here. I tested this game at an intentionally low-spec settings preset (see above), which my testing rig handled with a nearly locked 144 fps refresh on comparable Nvidia and AMD cards. On both Arc GPUs, Apex Legends could never exceed an average of 100 fps, which might be fine if it weren't for considerable dips and outright stutters breaking up the frantic run-and-gun action. Firefights in particular brought the game's sense of responsiveness to its knees—bonkers for an engine that's otherwise optimized to reduce latency.
And we even ran into issues with 2007's Portal (yes, Portal 1, not Portal 2), which we detail further below in an explanation of the Arc cards' issues with APIs. You'll also see some Nvidia victories in the above benchmark gallery, including Shadow of the Tomb Raider, Deus Ex: Mankind Divided, and Assassin's Creed Odyssey. (The latter game currently exhibits bizarre behavior, with the A750 surpassing the A770 in like-for-like tests.)
With Arc, it’s usually not a shame about ray (tracing)
Quake II RTX includes a handy automatic resolution slider that can get any of these GPUs up to a 60 fps refresh. This blurs the image to some extent, but for a low-poly game like Quake II, that's fine. Still, this brutal 4K test is meant to pummel the GPU exclusively, so it's the percentage points that matter on this one. And with Quake II RTX's demanding global illumination system, this is a good reminder that some Nvidia technologies remain supreme.
This slide is a reminder that various image reconstruction techniques work with pretty much any GPU out there. XeSS compares favorably to DLSS as of press time, and both include "quality" and "balanced" presets.
One of many reminders that the Arc A750 can turn up surprisingly grim results for seemingly inexplicable reasons. We're not sure what about this 1080p workload pounds it so much.
Once again, the Arc A750 crumbles.
See the prior caption.
Similarly, this benchmark required a texture downgrade on both the Arc A750 and the RX 6600XT to get their numbers in performant range once RT was added.
Yet not every Arc A750 RT test is so bad.
Again, the brutal workload here is to emphasize percentage differences. You'll want to scale down resolution and other settings to get CP2077 into performant RT territory.
Intel confirmed that its Arc GPUs generally dedicate a certain amount of silicon to ray tracing, proportionate to each card's compute units. This bold move technically crowds out other possible options on each GPU, but Intel has made clear that it wants Arc GPUs to parse and understand DirectX 12 ray tracing calls efficiently. In some tests, this approach clearly paid off.
Control famously pushes GPUs to their limits—so much so that you can't really play the game's ray traced mode on weaker Nvidia GPUs without some form of DLSS image reconstruction turned on. In like-for-like tests with DLSS disabled, both Arc GPUs beat the Nvidia RTX 3060 in our Control workloads, and they each more than double the RX 6600XT's attempts.
In most of our standard RT tests, the A770 enjoys a slight lead over the RTX 3060, while the A750 often comes close, with a few notable failures (though, again, this could be a matter of the reduced VRAM wreaking havoc). Of note: the Arc A770 doesn't run circles around the RTX 3060 when it comes to RT on the computationally expensive Cyberpunk 2077, but it does win in like-for-like testing by 11 percent; meanwhile, the Arc A750 loses to the 3060 in the same test by a margin of 20 percent.
Other RT workloads see Nvidia take a clear lead in the same price category. In particular, Quake II RTX pummels both Arc A700-series cards, whose performance resembles that of the puny RX 6600XT, while a low-resolution test of Guardians of the Galaxy sees the RTX 3060 secure a noticeable RT-rendering win.
Ultimately, the ray traced category is where I feel most confident suggesting that budget-minded PC gamers get a helluva deal out of the Arc A700-series GPUs. Still, we do wonder whether the 8GB model of the Arc A770 will run into the same RT variability we're seeing with the Arc A750 in pre-release testing, and whether the Arc A750's worst test results may improve with future driver updates. We're reluctant to assume that Intel will wave a magic wand and solve every issue we've seen thus far, but the list of popular games with RT support is a lot easier to narrow down than, well, the soup of DX9 and DX11 issues that we imagine will emerge once more tests go live.
The API alibi
Intel has been blunt in conversations with Ars Technica, saying that you'll get better performance on its highest-end Arc cards in games with modern, low-overhead graphics APIs like DirectX 12 and Vulkan. These APIs are called "low-overhead" because they let games and other 3D apps access your GPU hardware more directly, reducing the number of things your GPU driver is responsible for mediating.
The problem for Intel is that not all popular PC games use new APIs. Popular multiplayer FPS, RTS, MMORPG, MOBA, and battle royale games can trundle along for years, adding new features without overhauling the underlying game engine. PC gaming comes with expectations about backward compatibility, and a new GPU is going to be asked to run everything from Deathloop to Minecraft to Half-Life 2.
As of this review, some significant Arc problems emerge in DirectX 11 games, which are still recent enough to need a decent GPU to run well (particularly at 4K, or at higher than 60 fps). Our performance testing in titles like GTA V and Apex Legends makes it pretty difficult to recommend even the best Arc cards to people who want to run those games. And while Intel insists that things are always getting better (each Arc driver's release notes have a lengthy list of fixed issues, usually dominated by DX11 and DX12 games), these are not always easy problems to fix. AMD rewrote both its DirectX 11 and OpenGL drivers this year to help its cards catch up to Nvidia's, and AMD has been working on dedicated GPU drivers much longer than Intel has.
The original Portal suffers from multi-second hitches on any Arc card ranging from the A380 to the A770.
Older DirectX 9 games will also present weird edge cases. Raw performance is less likely to be an issue for these games, which are generally capable of running at full speeds with settings maxed at 1080p on poky integrated GPUs. But rather than trying to implement DirectX 9 support in its drivers, Intel is relying on DirectX 9-to-DirectX 12 code translation provided by Microsoft as part of Windows.
This kind of API translation is actually in fairly wide use. MoltenVK translates Vulkan API calls to Apple's Metal API, and the Proton software that makes Windows games run on the Steam Deck is in part possible because of a gaggle of DirectX-to-Vulkan translation layers. And while they often work well, individual games can run into plenty of performance and stability edge cases.
The original Portal (from 2007's Orange Box) suffers from multi-second hitches on any Arc card ranging from the A380 to the A770, typically when new portals are opened, along with occasional graphical corruption. We could not reproduce these kinds of stutters and glitches playing the game with the exact same settings on the nominally slower integrated Radeon GPU built into AMD's Ryzen 7 5700G, which suggests that this is a software problem and not a horsepower problem.
The good and bad news is that Microsoft, not Intel, is primarily responsible for fixing D3D9On12 bugs, which means that you could eventually see fixes for these issues even if Intel shuts down its graphics division tomorrow and never releases another driver update. But it also means needing to install new OS updates to get fixes, even if those updates contain other features or regressions you'd rather not have.
More quirks, more codecs, more games
As of press time, we've run into a severe Arc A700-series issue with one high-resolution virtual reality system. Attempts to connect HP's Reverb G2 headset to the A770 resulted in imagery being rendered in obnoxious black-and-white static. We're unsure whether some kind of multi-monitor pixel maximum on the Arc A700 series is to blame. Unseating and reseating cables led to one of the headset's lenses showing a clear, smooth image for roughly five minutes, and we managed to play a bizarre one-eyed version of the popular VR game Beat Saber long enough to believe that driver updates may fix this problem and lead to performant, high-FPS VR gaming.
Like AMD FSR, XeSS does not require an Intel GPU to work, though its upscaling results are slightly less fuzzy and slightly more performant on Intel hardware than on rivals' GPUs. It's the least controversial piece of the Intel A700 conversation: it works, it looks decent, and we hear that games with DLSS and FSR implementations could potentially enjoy a drag-and-drop DLL trick to get XeSS working (though Intel has not formally acknowledged this).
Losing sound altogether in Windows was a frequent Arc testing issue, albeit one that was tough to reproduce on demand. Sometimes, HDMI audio simply cut out, despite all visual signs pointing to a normally functioning sound device on the testing PC. In one instance, putting the PC to sleep and reawakening it led to my sound device disappearing altogether, requiring a reboot. Other triggers included changing the in-game pixel resolution and quitting a game while a YouTube video was playing in the background. The audio drop-off issues persisted through our entire testing period, despite all sound being driven directly from an HDMI 2.1 connection to my TV (there was no breakout box or dongle interrupting the carriage of sound).
Intel has been bullish about the Arc A700 series' native support for video encoding via the open source AV1 codec. AV1, for the uninitiated, has enjoyed sweeping support from major device manufacturers and software makers as it moves encoding and decoding workloads away from CPUs and toward GPUs. The results will likely bring more efficiency and image quality to average video streams, thus reducing bandwidth and rendering workloads for content creators and streaming services alike. But while YouTube includes some native support for the codec, bigger fish in the creator pond—particularly the livestreaming systems at Twitch and YouTube—don't yet support it. Thus, we're hesitant to make much hay about its efficiency on Arc A700 series GPUs; by the time more platforms get on the AV1 bus, we'll see promised support from Nvidia and AMD GPUs emerge as well. Check back for updates on that front.
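For content creators who want to experiment with AV1 anyway, the most accessible route we know of is FFmpeg, whose Quick Sync ("qsv") path includes an AV1 encoder that can target Intel's hardware. The sketch below is one plausible invocation rather than an Intel-endorsed workflow; it assumes an FFmpeg build with QSV/oneVPL support, a current Intel media driver, and placeholder file names.

```python
# One plausible way to exercise Arc's hardware AV1 encoder via FFmpeg's Quick
# Sync ("qsv") path. Assumes an FFmpeg build with QSV/oneVPL enabled and a
# recent Intel media driver; file names are placeholders, not real captures.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "gameplay_capture.mp4",  # placeholder source clip
    "-c:v", "av1_qsv",             # FFmpeg's QSV-accelerated AV1 encoder
    "-b:v", "6M",                  # target bitrate; adjust to taste
    "-c:a", "copy",                # pass the audio track through untouched
    "gameplay_av1.mkv",            # placeholder output file
]
subprocess.run(cmd, check=True)
```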
Additional tests of the Arc A770 and Arc A750 revolved around games that are tricky to benchmark but turned up decent-enough performance. Elden Ring ran about as well on the A750 as the RTX 3060, though both run smack into its creators' lackluster PC support, which includes an obnoxious 60 fps cap that doesn't steadily lock to a smooth update. Skyrim Special Edition, without a single mod installed, easily reaches 60 fps on an A750 with all default settings maxed out at 4K resolution. And a few random spot checks of older PC games that we haven't covered in a while easily exceeded 100 fps at 4K, including Strafe and Metro: Last Light.
Verdict: Adventure awaits, if that’s what you want to pay for
Create your own flipbook out of these three images to get a sense of the default LED pulsing animation that ships with the Arc A770. (The Arc A750 does not include this LED strip.)
We like this color scheme, though apparently, you can use included light-management software to edit or disable it.
Solid blue moment.
After nearly a week of testing, benchmarking, and casual use, we're ecstatic to see Intel land some serious blows against the competition with the Arc A770 and Arc A750's best results. Our jobs are more interesting when computer and device manufacturers push each other to either deliver exciting new features or shake up a seemingly locked-down market. In those respects, Intel has our attention.
But we're not just here to review these GPUs' promise. We also have enough benchmarks and anecdotes available to confirm that, at least as of the retail launch, Arc is all over the place. The results are heavily game- and settings-dependent. For newer games, you're usually getting a solid deal on a GPU that punches well above its weight, beating out Nvidia's popular RTX 3060 for something like 75 percent of the price. It feels great until you fire up one of the games Arc isn't good at running, where the results can range from "glitchy" to "choppy" to "unplayably broken," and there's no guarantee when (or even if) things will get better.
No matter how solid their performance can be in modern games, the earliest Arc GPU buyers are clearly in for at least a few months of beta-quality driver software. So far, those drivers haven't benefited much, if at all, from the company's decades of experience supporting its integrated GPUs.
The fact that we could imagine recommending these GPUs to select PC owners is reflective of how bizarre the GPU marketplace has been for the past three years. We want to tell you there's a good GPU at a good price in the near future. But the Arc A700 series of today is scraping by with annoyances and mixed results, and it could be pushed to irrelevance the instant AMD or Nvidia finally refreshes its midrange lineup (or deeply discounts its older cards in light of Arc's biggest wins, if only to nuke Intel's price-to-performance-ratio advertising). When the full history of Intel is recounted, its Arc GPUs may appear merely as a footnote, a product that drove the competition's prices down.
Or maybe Intel will stick around long enough to get its next-gen "Battlemage" Arc GPUs into gamers' hands and make this hardware sector even more interesting. For now, that's the best adjective to describe the Arc A700-series GPUs: more "interesting" than "great." Or, to paraphrase one tech parody account, do you want a graphics card, or do you want an adventure?