So, they comply with the DMA… by only offering their own apps? Isn’t that the opposite of what the DMA is supposed to do?
It’s kinda funny to see US lawmakers getting so offended and worried that China might do to the US what the US has done to the rest of the world for the last couple of decades while waving its technological superiority around.
For the non-destructive option, yeah preventing it from using the network is about as good as you can make it.
For mine I intend to open it up once it’s out of warranty and try reflashing the Google TV firmware, or neutering the board entirely. From a hardware perspective, I expect the panel driver and the smarts to be on separate boards entirely. On mine, the Android TV UI renders at 1080p despite the TV being a 4K HDR panel, so there has to be some hardware switching and multiplexing going on to make that work, which means it should be possible to bypass the smarts entirely if I can figure out how it signals input and settings changes. It’s probably gonna be I2C or something.
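Pure speculation on my part, but if the panel board does take I2C commands from the smarts board, the way in would be sniffing the bus with a logic analyzer and replaying the writes. A sketch of what that could look like, where the device address, register, input codes and checksum scheme are all invented for illustration:

```python
# Hypothetical sketch: drive a TV panel board's input mux over I2C.
# The address, register, input codes and checksum below are made up --
# the real values would come from sniffing the bus on actual hardware.

PANEL_ADDR = 0x48        # invented 7-bit I2C address of the panel board
REG_INPUT_SELECT = 0x10  # invented register for input switching

INPUTS = {"hdmi1": 0x01, "hdmi2": 0x02, "android": 0x00}

def build_frame(register: int, value: int) -> bytes:
    """Build [register, value, checksum], where the checksum is a
    simple XOR of the payload (a common, but here assumed, scheme)."""
    payload = bytes([register, value])
    checksum = 0
    for b in payload:
        checksum ^= b
    return payload + bytes([checksum])

def select_input(name: str) -> bytes:
    frame = build_frame(REG_INPUT_SELECT, INPUTS[name])
    # On real hardware you'd write this via /dev/i2c-* (e.g. with the
    # smbus2 library); here we just return the frame for inspection.
    return frame

print(select_input("hdmi1").hex())  # -> "100111"
```

If the protocol turns out to be something else (UART, SPI, a vendor LVDS sideband), the same sniff-and-replay approach applies, just with different plumbing.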
If you’re not too risk averse, you can probably at minimum open it up and cut the microphone, camera and WiFi antenna. Be careful with that though: they tend to ship Bluetooth remotes these days, which share the 2.4 GHz antenna, so if you disconnect it you’ll have to use the IR fallback. With no antenna it won’t connect to any network, not even open ones.
They usually boot up faster because the Android side is just in sleep mode, so it only needs to wake the panel. My smart TV turns on in like 1-3 seconds from sleep; a cold start takes a solid minute.
My older TV displayed the splash screen for a good 5 seconds while the panel backlight warmed up, and then it had to detect the input and set it all up. That said, once it was up the menu was much, much snappier and always responsive, which is something the Android TV side, well, struggles with.
Why would you refuse to buy IoT devices unless they’re more expensive, use more battery and have less range? Like why, what does it give you to not have a 2.4 GHz network? It’s not like it’ll interfere with the 5 GHz network.
Like sure, the 2.4 GHz spectrum is pretty crowded and much slower. But at this point that’s pretty much all that’s left on 2.4 GHz: low-bandwidth, battery-powered devices at random locations around your house, on the exterior walls, and all the way across the yard.
It’s the ideal spectrum to put those devices on: it’s dirt cheap (they all seem to use ESP8266 or ESP32 chips; lots of Espressif devices on the IoT network), it uses less power, it goes through walls better, and all it needs to get through is the fact that a button has been pressed. I’m not gonna install an extra AP or two when 2.4 GHz reaches fine, just so a button can make my phone ring and a bell go ding dong, or a camera can stream at bitrates you could push over dialup internet.
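Back-of-the-envelope on the dialup claim (the resolution, frame rate and per-frame size are illustrative assumptions for a cheap IoT doorbell cam, not specs from any real device):

```python
# Rough sanity check: does a cheap IoT camera stream fit in dialup-class
# bandwidth? All figures below are illustrative assumptions.

DIALUP_BPS = 56_000  # classic 56k modem

# Assume a heavily compressed low-res stream at 5 fps,
# ~1.2 kB per frame after compression.
frames_per_sec = 5
bytes_per_frame = 1_200
stream_bps = frames_per_sec * bytes_per_frame * 8  # 48,000 bps

print(f"stream: {stream_bps} bps, dialup: {DIALUP_BPS} bps")
print("fits in dialup:", stream_bps <= DIALUP_BPS)
```

Point being: kilobits per second, not megabits. There’s no reason to burn a 5 GHz airtime slot on that.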
Phones and laptops? Yeah, they’re definitely all on 5 GHz. If anything I prefer my IoT on 2.4 GHz, because then I can make my 5 GHz network WPA3 and 11ac/11ax only, so I don’t have random IoT devices running at 11n speeds slowing it down.
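In hostapd terms the split looks roughly like this: two separate configs, one per radio. This is a sketch, not a complete config; interface names and SSIDs are placeholders, and you’d need a hostapd build recent enough for SAE and 802.11ax:

```ini
# 5 GHz radio: WPA3-SAE, VHT required so 11n-only gear can't join
interface=wlan1
ssid=home-5g
hw_mode=a
ieee80211ac=1
ieee80211ax=1
require_vht=1
wpa=2
wpa_key_mgmt=SAE
ieee80211w=2

# 2.4 GHz radio (separate config file): plain WPA2-PSK
# so ESP8266/ESP32-class IoT gear can still join
interface=wlan0
ssid=home-iot
hw_mode=g
wpa=2
wpa_key_mgmt=WPA-PSK
```

Note `ieee80211w=2` (management frame protection) is mandatory for WPA3-SAE, which is itself another thing a lot of cheap IoT chips choke on.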
Completely talking out of my ass, but bass is generally much lower frequency and higher frequencies are more energetic.
Same with light: we perceive infrared as just heat; as frequency increases it becomes visible red, then blue and violet, and eventually UV, which starts burning your skin very easily.
We know blue light affects sleep and reds are much more pleasing at night, so logically I’d expect something similar to be valid for sound as well. And we use ultrasound to clean stuff just like we use UV to sterilize stuff.
That feels like a major oversight from the EU. Users should be able to sideload whatever the fuck they want. Can it run apps as a separate package? Yes? The user should be able to install their own without restrictions.