On the one hand, the Surface Pro 9 is pretty much what we expected: a jump up to Intel’s 12th-gen CPUs. But Microsoft surprised us with a huge shakeup for its tablet PCs. There’s also a Surface Pro 9 running a custom SQ3 Arm chip with built-in 5G. Can an x86 Intel processor and a mobile Arm chip really sit side by side? We got a chance to compare the two new machines at Microsoft’s hands-on event, and to be honest, we just have more questions.
Both models look and feel the same, save for the more visible 5G antennas on the Arm model. Microsoft representatives say performance is also comparable between the SQ3 and Intel’s chips, something we’ll have to fully test to believe. (An early Geekbench 5 test on a demo unit hit 978/4,760, which is far slower than Intel 11th and 12th-gen systems we’ve reviewed. Those figures could improve with better software and firmware, though.) The same reps also noted that compatibility with legacy x86 apps has gotten better for Arm devices, and there’s an increasing number of native Windows apps that will run just fine across both platforms.
In either case, you’re getting tablet PCs that can easily transform into functional laptops with their keyboard cases. Unfortunately, those are still sold separately, as is the Slim Pen 2 Microsoft introduced last year. The Surface Pro 9 won’t change your mind about the viability of using a tablet as a PC, but on the Intel side it’s nice to see a major speed bump.
The SQ3 Arm model also has a few features the Intel version doesn’t, thanks to its neural processor. That includes some real-time enhancements to video chats, like blurring your background. (The video quality across both systems also looks fantastic.) According to Microsoft, it’s possible to bring those features to Intel chips once they have their own neural processors, but unfortunately those aren’t available on Intel’s current lineup.
Microsoft representatives admitted there may be some confusion among some shoppers, since they can easily walk out of a store with two very different computers. But it sounds like the company is willing to deal with those usability bumps, rather than splitting the Surface Pro line once again.
It’s been four years since we got the Surface Studio 2, the much-improved follow-up to Microsoft’s ever-so-flexible all-in-one desktop. Surely, the company has something special in store for the next version, right? Well, yes and no. The Surface Studio 2+ is indeed significantly faster than before, thanks to Intel’s H35 11th-gen CPU and NVIDIA’s RTX 3060 graphics.
But we’ve also spent most of this year being impressed by Intel’s terrific 12th-gen hybrid chips, which deliver vastly better performance than 11th-gen CPUs. (Just imagine how much more intriguing the Studio 2+ would be with a 16-core 12th-gen HX CPU.) You’d think a computer starting at $4,300 would have the fastest hardware available, right? No wonder it’s not called the Surface Studio 3.
If you’ve been following our coverage of the Studio line, the Studio 2+ will sound pretty familiar. It has the same 28-inch PixelSense screen, a flexible hinge that can be effortlessly pushed down to an easel-like angle, and it packs in all of its hardware in a desktop-friendly base. Following the trend we’ve seen with all of Microsoft’s Surface hardware this year, there aren’t any exterior design changes at all. But hey, at least we’ve finally got Thunderbolt 4 USB-C ports and Dolby Vision HDR.
Microsoft claims the Core i7-11370H CPU in the Surface Studio 2+ is 50 percent faster than the Studio 2, as well as five times faster than the original 2017 model. Coupled with the RTX 3060, which is twice as fast as the GTX 1060 in the Studio 2, this new desktop will definitely be a fast performer. But the hardware geek in me can’t help but wonder why Microsoft couldn’t make Intel’s 12th-gen chips work. They’re technically more efficient, and it’s not as if there wasn’t enough time to prepare for new hardware.
I’m sure the Studio 2+ will satisfy general buyers who are willing to pony up big bucks for a uniquely flexible desktop. But I can’t imagine recommending Intel’s 11th-gen chips in any other computer today, unless you’re snagging it with a serious discount. At such a premium price, they’re practically a dealbreaker.
At least the Studio 2+ is well equipped otherwise, shipping with 32GB of DDR4 RAM and a 1TB SSD. (It sure would be nice to see some SSD expansion slots, though.) But take note: if you want it to ship with Microsoft’s Surface Pen, Keyboard and Mouse, you’ll have to spend an extra $300. (That makes it a $4,500 computer with last year’s Intel chip! The horror!)
You can pre-order the Surface Studio 2+ today, and it’ll be available in select markets on October 25th.
Last year’s Surface Pro 8 was one of the biggest design upgrades for Microsoft’s tablet PC, adding long-awaited features like Thunderbolt 4 and surprising upgrades like a 120Hz display. This year’s Surface Pro 9, paradoxically, is both more of the same and a dramatic departure. It has the usual chip refresh — in this case, Intel’s far superior 12th-gen CPUs — but there’s also a new 5G-equipped model with a custom SQ3 Arm chip.
If that sounds confusing to you, well, it is. We last saw the company’s SQ chip in the 2020 Surface Pro X, a computer that we found both beautiful and frustrating, thanks to Windows’ crummy software compatibility with Arm chips. To shift that problem over to a computer with the same name as its Intel sibling is a recipe for disaster. (We can just imagine the frustrated Best Buy shoppers who are dazzled by the idea of a 5G Surface, only to learn they can’t run most of their traditional Windows apps.) The 5G Pro 9 is also broken down into millimeter-wave and Sub-6 variants, which will be sold in their respective markets.
It’s understandable why Microsoft isn’t keen to keep the Surface Pro X moniker going — the Pro 8 lifted many of its modern design cues, after all. But from what we’ve seen, Windows 11 doesn’t solve the problems we initially had with the Pro X.
Beyond the chip updates, the Pro 9 looks mostly the same as its predecessor, with a 13-inch 120Hz PixelSense display, as well as relatively slim screen bezels. Microsoft claims the 1080p webcam has been improved, and there’s also a 4-degree tilt to help keep you centered. You’ve also got a few bolder colors to choose from, including Sapphire, Forest and a new Liberty London Special Edition. (And yes, before you ask, you’ll still have to pick up a Surface Keyboard and Slim Pen 2 separately if you actually want to be productive with the Pro 9.)
Adding to the confusion of having two chip platforms under the same product name, there are several major differences between them. For example, the Intel version can be equipped with up to 32GB of LPDDR5 RAM and a 1TB SSD, while the Arm variant is limited to 16GB of LPDDR4x RAM and a 512GB SSD at most. You’ll also lose the two USB-C 4.0/Thunderbolt 4 ports on the Arm Pro 9 — instead, you’ll get two USB-C 3.2 connections. (On the plus side, the 5G model should get up to 19 hours of battery life, 3.5 more hours than the Intel version.)
The Intel-based Surface Pro 9 starts at $999 for a Core i5 model with 8GB of RAM and a paltry 128GB of storage, while the cheapest 5G model will run you $1,300 with the same specs. You’ll be able to pre-order the Surface Pro 9 in select markets starting today, with general availability beginning on October 25th.
Techtober isn’t over yet! Today, we’re gearing up to cover Microsoft’s Surface device event at its NYC store. Senior Editors Sam Rutherford and Devindra Hardawar will be watching the stream and jotting all of their thoughts down in this live blog. And once the stream is over, they’ll be on the ground to churn out some hands-on coverage of these new Surface devices. Stay tuned for some deeply nerdy (and hopefully fun!) Surface commentary.
How do you go about reviewing something like NVIDIA’s RTX 4090? Just looking at its specs alone, it’s obviously the fastest consumer GPU we’ve ever seen. So sure, I can tell you that I can play just about anything in 4K with ray tracing and every graphical nicety turned on. Hell, it can even scale up to 8K if you’re a masochist. For a $1,599 video card, it damn well better. But the real question is, who is this thing actually for?
Benching the RTX 4090 against NVIDIA and AMD’s older hardware is practically pointless. Of course it’s far faster. Of course it’ll make you jealous. If you’ve got the cash and you’re itching to upgrade, go with God (or NVIDIA’s leather-clad CEO Jensen Huang, as the company’s fans see him). But for anyone else who doesn’t need bleeding edge hardware, it exists purely as an object of lust. Sure, you could wait for the upcoming RTX 4080 cards, or whatever AMD has in the works, but it’s not a 4090. Just like the last generation of GPUs, NVIDIA is throwing down the gauntlet with a power-hungry card for the most hardcore gamers and creators.
If your mind isn’t made up, I assume you’re here just to see how much of a beast the 4090 is. And let me tell you, it’s a stunning thing to behold. Weighing in at 4.8 pounds, and approaching the size of the PlayStation 5, the RTX 4090 is a triple-slot GPU that will dominate whatever case it’s in. Seriously, if you’re thinking of getting it, be sure to measure your PC to ensure you can fit a nearly foot-long card that’s close to 5 inches thick.
Be prepared to upgrade your power supply too: The 4090 has a high 450W TDP (the same thermal design power as the 3090 Ti) and it requires an 850W PSU. (Some third-party companies are pushing that demand to 1200W PSUs!) While it can be powered by a single PCIe 5.0 cable, there still aren’t many of those PSUs on the market, so most people will likely end up using four 8-pin adapters. I cursed Jensen’s name when I realized I needed to string another PSU line, after tidying up all of my cables.
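Those PSU recommendations roughly check out with a back-of-the-envelope power budget. This is only a sketch: the 450W GPU figure comes from NVIDIA's spec, but the CPU and peripheral wattages and the headroom factor below are illustrative assumptions, not measurements.

```python
# Rough PC power-budget sketch for an RTX 4090 build.
# Only the 450W GPU TDP is from NVIDIA's spec; everything
# else here is an assumed, illustrative figure.
components = {
    "RTX 4090 (TDP)": 450,
    "CPU under sustained load": 150,   # assumed
    "Motherboard, RAM, SSD, fans": 100,  # assumed
}

total = sum(components.values())  # 700W estimated peak draw
headroom = 0.8  # run the PSU at ~80% of its rating for transients/efficiency
recommended_psu = total / headroom

print(f"Estimated draw: {total}W -> suggested PSU: ~{recommended_psu:.0f}W")
```

With those assumptions the math lands at roughly 875W, which is in the same ballpark as NVIDIA's 850W recommendation; a hungrier CPU or more conservative headroom gets you to the 1200W figure some board partners quote.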
Beyond its obscene power demands, though, NVIDIA hasn’t changed much about the 4090 Founders Edition’s design from its previous model: It’s still a high-end, all-metal card with a massive vapor chamber, heatsink array and two fans on opposite sides. NVIDIA claims they can push 20 percent more air than the 3090 Ti – in my testing, that meant the 4090 stayed at a relatively cool 70C under load.
What’s truly special about the RTX 4090, though, is everything under the hood. It features the company’s new “Ada Lovelace” architecture (named after the world’s first computer programmer, though I wonder if NVIDIA pays any royalties to turn her name into marketing). It has 16,384 CUDA cores (almost 6,000 more than the 3090 Ti), a base clock speed of 2.23GHz (boost to 2.52GHz), and 24GB of GDDR6X RAM. With figures like these, the upcoming RTX 4080 cards (which start with 7,680 CUDA cores) seem puny in comparison.
And really, that seems like the point of dropping the 4090 before the rest of NVIDIA’s new GPUs. It’s like a heavenly body so massive it warps space-time around it. This is the new standard. What other GPU can get you 135fps in Cyberpunk 2077 while playing in 4K with maxed out graphics and ray tracing?
To be clear, though, the 4090 isn’t just about brute-force power. It was able to reach that killer Cyberpunk framerate by using DLSS 3, NVIDIA’s latest AI upscaling technology that can now generate entire frames of imagery on its own. (That’s in addition to upscaling lower resolution textures using AI, like earlier versions.) DLSS 3 helped A Plague Tale Requiem perform more than twice as fast while running in 4K, delivering around 175fps (up from 74fps).
| GPU | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Control | Blender |
| --- | --- | --- | --- | --- |
| NVIDIA RTX 4090 | 16,464 | 25,405 / 117.62 fps | 4K (Native), High RT: 107 fps | 12,335 |
| NVIDIA RTX 3080 Ti | 8,683 | 12,948 / 59.95 fps | 4K (Native), Med RT: 43 fps | 5,940 |
| AMD Radeon RX 6800 XT | 7,713 | 9,104 / 42.15 fps | 4K (Native), No RT: 28-40 fps | N/A |
The RTX 4090 had no trouble delivering 107 fps in Control while playing in 4K with high ray tracing settings. But you know what’s even better? Getting a solid 128 fps when I flipped on an older version of DLSS. It’s just unfortunate it doesn’t support DLSS 3 yet, because I’m sure it would eke out even better performance. Even though the game was actually being rendered in 1440p, to my eye, DLSS still does a stunning job of making that seem like 4K. (I tested the 4090 alongside Samsung’s 55-inch Arc monitor, giving me a much larger view than my typical 34-inch ultrawide screen. If there were graphical anomalies, I would have seen them.)
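The size of that DLSS framerate bump makes sense once you compare raw pixel counts. This is a simplified view that ignores the upscaler's own overhead, but it shows why rendering internally at 1440p and upscaling to 4K is so much cheaper than native 4K:

```python
# Pixels the GPU has to shade per frame: native 4K vs.
# DLSS's internal 1440p render that's then upscaled.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1440p = 2560 * 1440  # 3,686,400 pixels

ratio = native_4k / internal_1440p
print(f"Native 4K shades {ratio:.2f}x more pixels per frame")  # 2.25x
```

In other words, the card does well under half the per-pixel shading work, which is where most of the extra frames come from (DLSS 3's generated frames stack on top of that).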
I was particularly interested in stressing ray tracing performance on the 4090, because that was a feature that still managed to bring NVIDIA’s 30-series cards to their knees. It enables more realistic lighting, shadows and reflections. For most, I’d wager the graphical facelift it delivers would be more impressive than a skyrocketing framerate count. So it’s a wonder to see an NVIDIA card that can finally deliver 4K and solid ray tracing beyond 100fps. Is that worth $1,599, though? That remains unclear, especially since we don’t know how the rest of the 40-series cards will compete.
If you’re looking for a video card that can do more than just game, the 4090 may make more sense. In the Blender 3D rendering benchmark, it scored twice as high as the RTX 3090 Ti, a GPU released earlier this year for an eye-watering $1,999. (Let’s have a moment of silence for the poor souls who jumped on that card.) When it came to transcoding a short 4K clip into 1080p, the RTX 4090 was also 10 seconds faster than the 3080 Ti. That could certainly add up if you’re rendering longer clips, episodes or feature films.
It’s hard not to covet the RTX 4090, especially once you see what it’s capable of. It’s a glimpse into a world where we can finally get uncompromised ray tracing. But with the $899 and $1,199 RTX 4080 cards on the horizon, it’s tough to drop the price of an entire computer just to get the best frame rates in town. There’s just so much more to consider these days. You could pair up one of those 4080s with a Steam Deck, for example, and bring the joys of PC gaming on the road and all over your home. Sure, you won’t have the prestige of being in the 4090 club, but you’ll probably end up having more fun.
For a few years now, gaming laptops have been some of the most intriguing PCs around. They’ve gotten thinner and lighter, naturally — but they’ve also become vastly more powerful and efficient, making them suitable for both work and play. They’ve adopted some bold innovations, like rotating hinges and near desktop-like customizability. Gaming laptops are where PC makers can get adventurous.
If you’re a professional in the market for a beefy new computer, and you like to play a few rounds of Apex Legends on occasion, it may make more sense to go for a gaming notebook instead of a MacBook Pro-like workstation. You’ll still get plenty of power for video encoding and 3D rendering, plus you may end up paying less.
Your laptop buying journey starts and ends with the amount of money you’re willing to spend. No surprise there. The good news: There are plenty of options for gamers of every budget. In particular, we’re seeing some great choices under $1,000, like Dell’s G15 lineup. PCs in this price range will definitely feel a bit flimsier than pricier models, and they’ll likely skimp on RAM, storage and overall power. But they should be able to handle most games in 1080p at 60 frames per second, which is the bare minimum you’d want from any system.
Stepping up to mid-range options beyond $1,000 is where things get interesting. At that point, you’ll start finding PCs like the ASUS Zephyrus ROG G14, one of our favorite gaming notebooks. In general, you can look forward to far better build quality than budget laptops (metal cases!), improved graphics power and enough RAM and storage space to handle the most demanding games. These are the notebooks we’d recommend for most people, as they’ll keep you gaming and working for years before you need to worry about an upgrade.
If you’re willing to spend around $1,800 or more, you can start considering more premium options like Razer’s Blade. Expect impeccably polished cases, the fastest hardware on the market, and ridiculously thin designs. The sky’s the limit here: Alienware’s uber customizable Area 51m is an enormous beast that can cost up to $4,700. Few people need a machine that pricey, but if you’re a gamer with extra cash to burn, it may be worth taking a close look at some of these pricier systems.
What kind of CPU and GPU do you want?
The answer to this question used to be relatively simple: Just get an Intel chip with an NVIDIA GPU. But over the last few years, AMD came out swinging with its Ryzen notebook processors, which are better suited for juggling multiple tasks at once (like streaming to Twitch while blasting fools in Fortnite). Intel responded with its impressive 12th-gen chips, but it’s nice to have decent AMD alternatives available, especially since they’re often cheaper than comparable Intel models.
When it comes to video cards, though, AMD is still catching up. Its Radeon RX 6000M GPU has been a fantastic performer in notebooks like ASUS’s ROG Strix G15, but it still lags behind NVIDIA when it comes to newer features like ray tracing. But at least a Radeon-powered notebook can approach the general gaming performance of NVIDIA’s RTX 3070 and 3080 GPUs.
If you want to future-proof your purchase, or you’re just eager to see how much better ray tracing can make your games look, you’re probably better off with an NVIDIA video card. They’re in far more systems, and it’s clear that NVIDIA has better optimized ray tracing technology. RTX GPUs also feature the company’s DLSS technology, which uses AI to upscale games to higher resolutions. That’ll let you play a game like Destiny 2 in 4K with faster frame rates. That’s useful if you’re trying to take advantage of a high refresh rate monitor.
NVIDIA’s RTX 3050 is a decent entry point, but we think you’d be better off with at least an RTX 3060 for solid 1080p and 1440p performance. The RTX 3070, meanwhile, is the best balance of price and performance. It’ll be able to run many games in 4K with the help of DLSS, and it can even tackle demanding titles like Control. NVIDIA’s RTX 3080 and 3080 Ti are the kings of the hill; you’ll pay a premium for any machine that includes them.
It’s worth noting that NVIDIA’s mobile GPUs aren’t directly comparable to its more powerful desktop hardware. PC makers can also tweak clock speeds and voltages to fit a GPU into a thinner case. Basically, don’t be surprised if you see notebooks that perform very differently, even if they’re all equipped with the same GPU.
What kind of screen do you want?
Screen size is a good place to start when judging gaming notebooks. In general, 15-inch laptops will be the best balance of immersion and portability, while larger 17-inch models are heftier, but naturally give you more screen real estate. There are some 13-inch gaming notebooks, like the Razer Blade Stealth, but paradoxically you’ll often end up paying more for those than slightly larger 15-inch options. We’re also seeing plenty of 14-inch options, like the Zephyrus G14 and Blade 14, which are generally beefier than 13-inch laptops while still being relatively portable.
But these days, there is plenty to consider beyond screen size. For one: refresh rates. Most monitors refresh their screens vertically 60 times per second, or at 60Hz. That’s a standard in use since black and white NTSC TVs. But over the past few years, displays have evolved considerably. Now, 120Hz 1080p screens are the bare minimum you’d want in any gaming notebook — and there are faster 144Hz, 240Hz and even 360Hz panels. All of this is in the service of one thing: making everything on your display look as smooth as possible.
For games, higher refresh rates also help eliminate screen tearing and other artifacts that could get in the way of your frag fest. And for everything else, it just leads to a better viewing experience. Even scrolling a web page on a 120Hz or faster monitor is starkly different from a 60Hz screen. Instead of seeing a jittery wall of text and pictures, everything moves seamlessly, as if you’re unwinding a glossy paper magazine. Going beyond 120Hz makes gameplay look even more responsive, which to some players gives them a slight advantage.
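To put those refresh rates in concrete terms, here's the time each rate gives the display (and the game) to produce a new frame — the shrinking interval is what makes motion feel smoother and input feel snappier:

```python
# Frame interval in milliseconds for common laptop panel refresh rates.
for hz in (60, 120, 144, 240, 360):
    interval_ms = 1000 / hz  # one second divided by refreshes per second
    print(f"{hz:3d}Hz -> {interval_ms:.2f} ms per frame")
```

At 60Hz a new frame arrives roughly every 16.67ms; at 144Hz that drops to about 6.94ms, and at 360Hz to under 3ms — which is why the jump from 60Hz to 120Hz is so much more noticeable than the jumps beyond it.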
Not to make things more complicated, but you should also keep an eye out for NVIDIA’s G-SYNC and AMD’s FreeSync. They’re both adaptive sync technologies that can match your screen’s refresh rate with the framerate of your game. That also helps to reduce screen tearing and make gameplay smoother. Consider them nice bonuses on top of a high refresh rate monitor; they’re not necessary, but they can still offer a slight visual improvement.
One more thing: Most of these suggestions are related to LCD screens, not OLEDs. While OLED makes a phenomenal choice for TVs, it’s a bit more complicated when it comes to gaming laptops. They’re mostly limited to 60Hz, though some models offer 90Hz. Still, you won’t see the smoothness of a 120Hz or 144Hz screen. OLEDs also typically come as 4K or 3.5K panels – you’ll need a ton of GPU power to run games natively at that resolution. They look incredible, with the best black levels and contrast on the market, but we think most gamers would be better off with an LCD.
A few other takeaways:
Get at least 16GB of RAM. And if you’re planning to do a ton of multitasking while streaming, 32GB is worth considering.
Storage is still a huge concern. These days, I’d recommend aiming for a 1TB M.2 SSD, which should be enough space to juggle a few large titles like Destiny 2. Some laptops also have room for standard SATA drives, which are far cheaper than M.2 drives and can hold more data.
Normally we’d recommend getting your hands on a system before you buy, but that’s tough as we’re in the midst of a pandemic. I’d recommend snagging your preferred system from a retailer with a simple return policy, like Amazon or Best Buy. If you don’t like it, you can always ship it back easily.
If you can’t tell by now, we really like the Zephyrus G14. It’s shockingly compact, at just 3.5 pounds, and features AMD’s new Ryzen chips paired with its Radeon 6000M graphics (we’d recommend the Ryzen 9 model with an RX 6700M for $1,400). While its 14-inch screen is a bit smaller than our other recommendations, it looks great and features a fast 144Hz refresh rate. We also like its retro-future design (some configurations have tiny LEDs on its rear panel for extra flair). While the G14 has jumped in price since it debuted, it’s still one of the best gaming notebooks around, especially since ASUS has finally added a built-in webcam.
We’ve been fans of Dell’s G5 line ever since it first appeared a few years ago. Now dubbed the G15, it starts at under $1,000 and features all of the latest hardware, like Intel’s 12th-generation CPUs and NVIDIA’s RTX 30-series cards. (You can also find AMD’s Ryzen chips in some models.) It’s a bit heavy, weighing over five pounds, but it’s a solid notebook otherwise. And you can even bring it into mid-range gaming territory if you spec up to the RTX 3060.
Razer continues to do a stellar job of delivering bleeding-edge hardware in a sleek package that would make Mac users jealous. The Blade 15 has just about everything you’d want, including NVIDIA’s fastest mobile GPU, the RTX 3080 Ti, as well as Intel’s 12th-gen CPUs and speedy quad-HD screens. Our recommendation? Consider the model with a Quad HD 165Hz screen and an RTX 3070 GPU for $2,050. You can easily save some cash by going for a cheaper notebook, but they won’t feel nearly as polished as the Blade.
While we’ve seen some wilder concepts from Acer, like its 360-degree hinge-equipped Triton 900, the Triton 500 is a more affordable bread and butter option. This year, it’s bumped up to a 16-inch display, giving you more of an immersive gaming experience. It’s relatively thin, weighs just over five pounds, and it can be equipped with Intel’s 11th-gen CPUs and NVIDIA’s RTX 30-series GPUs. Acer’s build quality is as sturdy as ever, and it has most of the standard features you’d need in a gaming notebook.
Take everything we loved about the Razer Blade 15, scale it up to a larger 17-inch screen, and you’re pretty much in gamer paradise. If you can live with its six-pound weight, the Blade 17 will deliver the most desktop-like gaming experience you can find in a notebook. It’s relatively slim, and it’s perfect for binging Netflix in bed. The Blade 17 is also a smart choice if you’re editing media, as its larger screen space makes it perfect for diving into larger timelines. It’s not for everyone, but sometimes you just have to go big or go home, right?
You know if you actually need a dual-screen laptop: Maybe a single 17-inch screen isn’t enough, or you want a mobile setup that’s closer to a multi-monitor desktop. If that’s the case, the Zephyrus Duo 16 is made for you. It’s powerful, and its extra 14-inch screen can easily let you multitask while gaming or dutifully working. It also has all of the latest hardware you’d want, like AMD’s new Ryzen chips and NVIDIA’s RTX 3000 GPUs. Sure, it’s nowhere near portable, but a true multitasker won’t mind.
This week, Cherlynn, Devindra and Engadget’s Sam Rutherford dive into everything we learned at Google’s Pixel 7 event. Sure, it’s nice to have new phones, but it’s even nicer to see Google developing a cohesive design for all of its new devices. The Pixel Watch actually looks cool! And while we were ready to knock the (way too late) Pixel Tablet, its speaker base seems genuinely useful. Google may have finally figured out how to combine its software and AI smarts with well-designed hardware.
Listen above, or subscribe on your podcast app of choice. If you’ve got suggestions or topics you’d like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, the Morning After and Engadget News!
Intel Arc A750 and A770 graphics cards review – 42:27
Elon Musk announces intent to buy Twitter (again) – 44:56
Tesla showed off its robot (sort of) – 46:32
Gatorade made a smart water bottle – 47:40
iPhone 14 Plus review – 49:42
Pop culture picks – 52:41
Livestream
Credits Hosts: Cherlynn Low and Devindra Hardawar Guest: Sam Rutherford Producer: Ben Ellman Music: Dale North and Terrence O’Brien Livestream producers: Julio Barrientos Graphic artists: Luke Brooks and Brian Oh
With a new batch of Pixel phones comes a new chip at the heart of them all: Google’s Tensor G2. Like last year’s Tensor, the company’s first custom mobile chip, it’s an AI-infused powerhouse built specifically around the Pixel 7’s, Pixel 7 Pro’s and Pixel Tablet’s new features. It’ll also be joined by a revamped Titan M2 chip, which handles on-device security.
On stage during its Pixel launch event, Google VP Brian Rakowski said the Tensor G2 will power the Pixel 7’s voice capabilities, including faster Assistant queries, as well as voice translation, voice typing, and more. He noted that voice typing is around two and a half times faster than using the keyboard, making it a feature that more people are relying on. You’ll even be able to verbally describe emojis, like asking for the “heart eyes cat,” while voice typing.
On the Pixel 7 and Pixel 7 Pro, the Tensor G2 also enables Photo Unblur, which can sharpen out of focus photos, as well as Super-Res Zoom, which digitally blows up photos without losing much quality (it also benefits from the phone’s new 50MP cameras). One nice bonus: You’ll also be able to touch up older photos using all of the Tensor G2’s capabilities. As for other features, the Tensor G2 lets Night Sight photos process twice as quickly, and it’s behind Cinematic Blur mode, which can artfully direct how your videos are focused. Sure, it’s not as groundbreaking as the original, but the Tensor G2 shows that Google is still committed to a strong cohesion between the Pixel phone’s software and hardware.
The Tensor G2 chip features two “Big” CPU cores, two “Medium” cores and four “Small” cores, like its predecessor. Clock speeds are only a hair higher — literally just 5MHz and 10MHz across the Big and Medium cores — and Google is sticking with Arm’s Cortex-X1 and Cortex-A55 designs for the Big and Small cores. The only major update? The Tensor G2’s Medium cores now use an Arm Cortex-A78 instead of an A76. Google says the G2 is also running a “next-generation” TPU AI accelerator.
Follow all of the news from Google’s Pixel 7 event right here!
Google will pay Arizona $85 million to settle a 2020 lawsuit, which claimed that the search giant was illegally tracking Android users, Bloomberg reports. At the time, Arizona Attorney General Mark Brnovich argued that Google continued to track users for targeted advertising, even after they turned off location data settings. If this sounds familiar, it’s because Google is also being sued by attorneys general in Texas, Washington, D.C., and Indiana over similar data tracking complaints. Brnovich’s office also notes that the $85 million settlement is the largest amount Google has paid per user in a privacy lawsuit like this.
But given that Google is currently seeing quarterly revenue over $69 billion, the punishment may seem like a drop in the bucket. It’s nothing compared to the $1.7 billion Google was fined by the EU over abusive advertising practices. In a statement, Google spokesman José Castañeda said the suit was related to older product policies that have been changed. “We provide straightforward controls and auto delete options for location data, and are always working to minimize the data we collect,” he said. “We are pleased to have this matter resolved and will continue to focus our attention on providing useful products for our users.”
Brnovich, meanwhile, says he’s “proud of this historic settlement that proves no entity, not even big tech companies, is above the law.”
Not too long ago, the notion of Intel getting into the world of discrete graphics cards seemed ludicrous. Intel?! The same company that killed its last major GPU project in 2009 and spent the 2010s focusing on weak integrated graphics? The same one th…