

Source?
Source?
One of the expressed reasons for cutting funding to PBS and NPR (via the Corporation for Public Broadcasting) is that they can’t influence what is being said by those entities.
This conversation is surreal because you don’t seem to understand how disagreement works. You said the price makes sense; I am saying it doesn’t. You are free to end the discussion there if you wish, but I am going to keep responding to the person who keeps acting like their opinion is fact.
Tegra GPUs are specifically cost-reduced, low-power versions of previous generations of GeForce GPUs. The one in the Switch 2 has been rumored to be based on the 3000 series, but I have not seen any confirmation of that as yet. I feel like you are making my point for me: you keep saying that everything else costs the same, so why should this one cheaper part matter… and my response is, because it’s cheaper. Note the lack of PCIe and Thunderbolt, for instance. There is also no Windows license to worry about.
If you don’t want to reply then don’t, but seriously, it seems like you are getting upset solely because somebody has a differing opinion.
Then it seems we got off on the wrong foot when you called my disagreement meaningless.
RISC has always been fundamentally cheaper than x86, which is one reason why Nintendo has used a RISC processor in every handheld console since the Game Boy Advance.
Your last sentence is pretty much my point, though. There is no reason for that. Look at the iPad and the Mac mini, look at the Raspberry Pi… there is no reason for a RISC machine to cost more than an x86 machine.
Your response was to Simple’s comment about price. From my reading, it seemed that you were implying the price was right because the performance was similar. I was agreeing with Simple and disagreeing with that perceived implication, based on the fact that it uses a different and historically cheaper architecture, one that would typically make a dollar-per-hertz comparison useless, as you seem to have pointed out. Hence my confusion.
“there are other aspects that impact performance, so you can’t make assumptions based on that”
That is literally what I have been trying to say this whole time in response to you saying it looks comparable. I genuinely have no idea what you are arguing against at this point.
If you had asked me yesterday, I would have bet real money that it would be limited to 1080p to reduce cost. I am very curious to know more about how it actually performs in each setting, how much of it is upscaling, etc. I imagine that most 4K games won’t have much in the way of better graphics than the Switch 1, though the higher memory bandwidth could help with higher-res textures.
It’s not similar to PC hardware; it uses a Tegra processor like the Switch 1. Which means it’s more like a phone with a less-than-laptop-grade Nvidia graphics chipset thrown in. Unlike the Steam Deck, for instance, which uses a custom AMD APU, a scaled-down version of what is in the Xbox and PS5.
This is already a copyright apocalypse though, isn’t it? If there is nothing wrong with this, then where is the line? Is it okay for Disney to make a movie using an AI trained on the work of some poor sap on DeviantArt? This feels like copyright laundering. I fail to see how we aren’t just handing the keys of human creativity to only those with the ability to afford a server farm and teams of lawyers.
There is also a PWA version for those looking to try it out, and I believe it’s on F-Droid as well.
Ahem… it’s Edge WebView2. Which I promise is in no way exactly the same as Electron…
I didn’t say that. What I said was that if you swap “monopoly” for “anticompetitive practices,” my question still stands: “How is it different from how Nintendo acts with the Switch?” Keeping in mind that I had already conceded that better smartwatch access made sense.
While I appreciate semantic clarity as much as anybody else I’m not sure it changes my question in this case.
I rarely find myself defending giant corporations but after having looked at the list it seems I am going to have to.
Some of the things do make sense, like allowing other smartwatches the same notification access as Apple Watches. But others, like the audio switching, seem to lack a fundamental understanding of how that even works.
I keep trying to figure out though what exactly Apple has a monopoly in… they don’t have the largest segment of any market they are in so it makes it seem like the EU is complaining that they have a monopoly on iPhones… which… yes… but that is like saying Nintendo has a monopoly on the Switch.
Edit: I seem to have failed to express the nuance I wanted to. Nonetheless, there seem to be some issues with the demands here, and I think it will be interesting to see how this pans out.
The Internet will continue to function just fine, just as it has for 50 years. It’s the World Wide Web that is on fire. It pretty much has been since a bunch of people who don’t understand what Web 2.0 means decided they were going to start doing “Web 3.0” stuff.
My experience does not reflect yours. Computer Architecture, Discrete Math (logic gate math), and Operating System Concepts were all required classes in my CS degree from just a few years ago.
Personally, I am just not going to use the smallest screen I own to do most of the tasks they are pushing AI for. They can keep making them bigger and it’s still just going to be a phone first. If this is what they want then why can’t I just have the Watch and an iPad?
…but at the time we had no clear idea what a real landing would end up like…
Surveyor - “What am I? Chopped liver???”
Oh, how easily we all forget…