The devices use a new processing method that Sony calls “cognitive intelligence.” The company says it goes beyond standard AI to create an immersive visual and sound experience: the processor divides the screen into zones and detects the focal point of the picture. It also analyzes sound positions in the signal to match the audio to the images on the screen.

In a video demo, Sony said that as TV screens have grown, viewers have come to focus on parts of the image rather than the whole, much as we do when viewing the real world.

“The human eye uses different resolutions when we are looking at the whole picture and when we are focusing on something specific,” said Yasuo Inoue, a Sony signal processing expert. “The XR Processor analyzes the focal point and refers to that point as it processes the entire image to generate an image close to what a human sees.”

It’s impossible to tell how well the AI works without seeing the TVs in person. And if you want to test one out yourself, you’ll likely need deep pockets. Pricing and availability for the new lineup will be announced in the spring.
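
To make the zone-and-focal-point idea concrete, here is a minimal sketch of what dividing a frame into zones and scoring them might look like. This is purely illustrative: Sony's actual XR analysis is proprietary, and the contrast-based scoring below is an assumption standing in for whatever saliency measure the processor really uses.

```python
import statistics

def focal_zone(image, grid=4):
    """Split a grayscale image (a list of rows of pixel values) into a
    grid x grid set of zones and return the (row, col) index of the zone
    with the highest local contrast. The standard deviation of pixel
    values is used here as a crude stand-in for saliency."""
    h, w = len(image), len(image[0])
    zh, zw = h // grid, w // grid
    best, best_score = (0, 0), -1.0
    for r in range(grid):
        for c in range(grid):
            pixels = [image[y][x]
                      for y in range(r * zh, (r + 1) * zh)
                      for x in range(c * zw, (c + 1) * zw)]
            score = statistics.pstdev(pixels)  # contrast proxy
            if score > best_score:
                best, best_score = (r, c), score
    return best

# A flat 64x64 frame with one high-contrast patch in the lower-right zone
frame = [[0] * 64 for _ in range(64)]
for y in range(48, 64):
    for x in range(48, 64):
        frame[y][x] = 255 if (x + y) % 2 else 0  # checkerboard = high contrast
print(focal_zone(frame))  # the high-contrast zone wins: (3, 3)
```

A real processor would of course work on full-color video in real time and feed the detected focal point back into upscaling and tone mapping; this toy version only shows the zoning-and-scoring step Sony describes.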