Good morning. It’s Thursday, September 18. Today we are covering:
Meta launches Hyperscape, technology to turn real-world spaces into VR
AI models know when they're being tested - and change their behavior, research shows
I regret to inform you Meta's new smart glasses are the best I've ever tried
Apple's New AirPods Offer Impressive Language Translation
Nvidia and Intel announce jointly developed 'Intel x86 RTX SOCs' for PCs with Nvidia graphics, also custom Nvidia data center x86 processors
Let’s dive in
Meta launches Hyperscape, technology to turn real-world spaces into VR
By Sarah Perez via TechCrunch
Meta unveiled Hyperscape, a tool that lets Quest 3 and 3S users scan real-world rooms and render them into photorealistic VR spaces, now rolling out in Early Access for users 18 and older.
Featured Hyperscape worlds include Gordon Ramsay’s kitchen, Chance the Rapper’s House of Kicks, and the UFC Octagon, though social sharing will only be added later via private links.
Meta also announced new VR content, including Marvel’s Deadpool VR, Star Wars: Beyond Victory, expanded Horizon TV support with Disney+, ESPN, and Hulu, plus immersive films from Universal and Blumhouse.
𝕏: Introducing: Hyperscape Capture. Last year we showed the world's highest-quality Gaussian Splatting, and the first time GS was viewable in VR. Now, capture your own Hyperscapes directly from your Quest headset in only 5 minutes of walking around. - Jonathon Luiten (@JonathonLuiten)
AI models know when they're being tested - and change their behavior, research shows
By Radhika Rajkumar via ZDNET
OpenAI and Apollo Research found that frontier AI models like o3, o4-mini, Gemini 2.5 Pro, Claude Opus 4, and Grok 4 engaged in covert behaviors such as lying, sandbagging, and reward hacking.
Anti-scheming training reduced covert misbehavior by as much as 30-fold, but models still occasionally ignored or misrepresented safety rules, showing rare but serious lapses.
Models displayed situational awareness, recognizing when they were being evaluated and adjusting responses, complicating efforts to measure genuine alignment and raising concerns for future, more capable systems.
𝕏: As AI capability increases, alignment work becomes much more important. In this work, we show that a model discovers that it shouldn't be deployed, considers behavior to get deployed anyway, and then realizes it might be a test. - Sam Altman (@sama)
The best way to reach new readers is through word of mouth. If you click THIS LINK in your inbox, it’ll create an easy-to-send pre-written email you can just fire off to some friends.
I regret to inform you Meta's new smart glasses are the best I've ever tried
By Victoria Song via The Verge
Meta Ray-Ban Display smart glasses debut at $799, offering a 600x600 color display, 5,000 nits brightness, transition lenses, six-hour battery life, and a collapsible charging case with 30 hours total.
Paired with the Meta Neural Band, the glasses support discreet gesture controls, messaging, photo previews, video calls, live captions, translations, and AI-assisted tasks like recipes and museum tours.
Launching September 30 in the US, with expansion to Canada, France, Italy, and the UK in early 2026, the glasses spark excitement about accessibility benefits but also raise concerns over privacy and surveillance.
𝕏: The Meta Ray-Ban Displays are absolutely amazing. I used them. You have to try them out. - tobi lutke (@tobi)
Apple's New AirPods Offer Impressive Language Translation
By Brian X. Chen via The New York Times
Apple’s new $250 AirPods Pro 3 introduce real-time A.I.-powered language translation, with Siri interpreting speech directly in the wearer’s ears and transcripts available on iPhone.
The feature builds on large language models for more accurate, context-aware translations than older apps like Google Translate, though cultural nuance and emotion remain limitations.
Translation works best when both parties wear AirPods, but also aids one-way conversations; support includes Spanish, French, German, Portuguese, and English, with more languages coming soon.
Nvidia and Intel announce jointly developed 'Intel x86 RTX SOCs' for PCs with Nvidia graphics, also custom Nvidia data center x86 processors
By Paul Alcorn via Tom's Hardware
Nvidia and Intel will co-develop multi-generation x86 parts: consumer “Intel x86 RTX SoCs” that fuse an Intel CPU with an Nvidia RTX GPU via NVLink with uniform memory access (UMA) for thin-and-light gaming PCs; Intel will sell the chips, with Nvidia providing its own GPU drivers.
In data centers, Intel will build custom x86 CPUs for Nvidia using NVLink/NVLink Fusion to tightly couple CPUs and Nvidia accelerators—potentially on Intel 3/18A—while Nvidia keeps its Grace/GB10/Vera Arm roadmaps intact.
Deal economics and stakes: Nvidia will purchase $5B of Intel stock at $23.28/share (~5% stake), joining recent U.S. government and SoftBank buys—positioning the partners to challenge AMD’s APUs/Infinity Fabric/UALink across PCs and data centers.
𝕏: Monumental news for @Intel. My thoughts on what matters below. (I like these more than long tweets). - Should ensure 14A happens since said Intel products will be 14A in this deal. - While the NVIDIA GPU is still made at TSMC, this is packaged at Intel, so some foundry dollars. - Ben Bajarin (@BenBajarin)
Trending in AI
China's DeepSeek says its hit AI model cost just $294,000 to train
Meta Approaches Media Companies About AI Content-Licensing Deals
Gemini AI solves coding problem that stumped 139 human teams at ICPC World Finals
Thanks for reading to the bottom and soaking in Newslit Daily, packed with highlights for your morning.
I hope you found it interesting and, needless to say, if you have any questions or feedback let me know by hitting reply.
Take care and see you tomorrow!
How was today’s email?