• 0 Posts
  • 58 Comments
Joined 11 months ago
Cake day: August 14th, 2023


  • The spirit of your point is right, but: game patches existed back then. The first patch for Half-Life was 1.0.0.8, released in 1999 (the release version was 1.0.0.5). I cannot find the patch notes or exact release date as my search results are flooded with “25th anniversary patch” results.

    What was true is that players patching their games was not a matter of course for many years. It was a pain in the ass. The game didn’t update itself. You didn’t have a launcher to update your game for you. No. Instead, you had to go to the game’s website and download the patch executable yourself. But it wasn’t just a simple “Game 1.1 update.exe” patch. That’d be too easy. It was a patch from 1.0.9 to 1.1, and if you were on 1.0.5.3 you had to get the patch for 1.0.5.3 to 1.0.6.2, then a patch from that to 1.0.8, then a patch from that to 1.0.9. Then you had to run all of those in sequence. This is a huge, huge part of why people eventually started to fall in love with Steam back in the day. Patches were easy and “just worked” — it was amazing compared to what came before.
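    To make that chain concrete, here’s a tiny hypothetical sketch (Python; the version numbers are just the ones from the example above, and apply_patch stands in for manually downloading and running each patch executable, so nothing here reflects any actual updater):

```python
# Hypothetical illustration of the pre-launcher patch chain described above.
# Each entry maps an installed version to the next patch you had to hunt down.
PATCH_CHAIN = {
    "1.0.5.3": "1.0.6.2",
    "1.0.6.2": "1.0.8",
    "1.0.8": "1.0.9",
    "1.0.9": "1.1",
}

def apply_patch(from_ver: str, to_ver: str) -> None:
    # Stand-in for: find the right patch executable on the game's website and run it.
    print(f"Applying patch {from_ver} -> {to_ver}")

def update(installed: str, target: str) -> None:
    # Walk the chain one patch at a time; there was no launcher to do this for you.
    while installed != target:
        next_ver = PATCH_CHAIN[installed]
        apply_patch(installed, next_ver)
        installed = next_ver

update("1.0.5.3", "1.1")  # four separate patches, run in order
```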

    The end result was that patches existed, but the game that people remember (and played) was by and large defined by what it was at release. Console games also weren’t patched, although newer printings of a game would see updates. Ocarina of Time’s 1.0 release was exclusive to Japan; the North American release was 1.1 for the first batch of sales. After the initial batch sold out, it was replaced by 1.2. That was common back then. As far as I know there was no way for consumers to get their copies updated, or even to find out about the updates. But the updates did exist.


  • Paying over a third of all revenue generated from searches on Apple’s platform. That’s incredible. I’m not a lawyer so I have no idea how this will work out legally, but I have a hard time reading such an enormous revenue share as anything other than an aggressive attempt to stymie competition. Flat dollar payments are easier to read as less damning, but willingly giving up that much of a source’s revenue suggests the revenue itself is no longer the primary target. The target is the competitive advantage of keeping (potential) competitors from accessing that source.


  • Typical corporate greed in that sense. It’s stupid but I’m not at all surprised by that attitude.

    The thing is, even if they were morally right in that sense… it’s already too late. This is trying to close the barn door not just after the horse left, but after the horse already ran off and made it two states over. There’s definitely value to LLMs in having more data and more up-to-date data, but Reddit is far from the only source and I cannot imagine that they possess enough value there to have any serious leverage.

    Reddit would/will survive being taken out of internet search results. Not without costs, though: it will arrest their growth rate (or accelerate their shrink rate, as appropriate) and make people less interested in using the site.





  • That really depends on what their goal is.

    From a business perspective it’s not worth fighting to eliminate 100% of ad block uses. The investment is too high. But if they can eliminate 50% or 70% or 90% of ad block uses with YouTube? That could be worth the effort for them. If they can “win” for Chrome and make it a bit annoying for Firefox, that would likely be enough for Google to declare it a huge success.

    People willing to really dig all the way in to get a solution they desire are not the norm. Google can be OK with the 1% of us out there as long as we aren’t also making it possible for another huge chunk of people to piggyback off it effortlessly.



  • The stuff that made Vista shitty for most end users wasn’t actually fixed by W7. For the most part, W7 was a marketing refresh that came after Vista had already been “fixed.” Not saying W7 was a small update or anything like that, just that the broken stuff had already been more or less fixed before it arrived.

    Vista’s issues at launch were almost universally a result of the change to the driver model. Hardware manufacturers, despite MS delaying things for them, still did not have good drivers ready at release. It took them years after the fact to get good, stable drivers out there. By the time that happened, Vista’s reputation as a pile of garbage was well cemented. W7 was a good chance to reset that reputation while also implementing various other major upgrades.


  • Cities Skylines sees a fairly decent improvement going to the 3D cache chips from AMD (a 17% speedup here for the 5800X3D). What’s your ability to increase the budget to go for a 7800X3D look like? If this is a genre of game you like and you want to hold off as long as possible between upgrades, it might be worth springing for the extra. The difference the 3D cache provides in some games is rather extraordinary. City builders, automation games, and similar titles tend to benefit the most; AAA games tend to benefit the least (some with effectively no gain).

    A 7600X should be more than capable of handling the game though. So it’s not a question of need but if it’s worth it to you.

    You do not want 4800 CL40 RAM though; that’s too slow. I’d strongly recommend going for 32GB of RAM as well; 16GB can be gobbled up quickly, especially if you want to use mods in Cities Skylines.

    Going up even to DDR5-6000 is not much of a price increase. I’d suggest 6000 and something in the range of CL36-CL40; there are a lot of 32GB kits with those specs in the ~$90 range. I would not build a gaming system today with 16GB of RAM.
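    For a rough sense of why 4800 CL40 reads as slow, you can compare first-word latency: CAS latency divided by the memory clock, where the clock is half the MT/s figure for DDR. A quick sketch in Python, using only the speeds mentioned above (no particular kits assumed):

```python
# First-word CAS latency in nanoseconds for DDR memory.
# The memory clock (MHz) is half the MT/s figure, so latency_ns = CL / clock_MHz * 1000.
def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    clock_mhz = mt_per_s / 2  # DDR transfers twice per clock cycle
    return cl / clock_mhz * 1000

for mt, cl in [(4800, 40), (6000, 36), (6000, 40)]:
    print(f"DDR5-{mt} CL{cl}: {cas_latency_ns(mt, cl):.1f} ns")

# DDR5-4800 CL40: 16.7 ns
# DDR5-6000 CL36: 12.0 ns
# DDR5-6000 CL40: 13.3 ns
```

    So DDR5-6000 CL36 is both higher bandwidth and meaningfully lower latency than 4800 CL40, which matters for a cache- and memory-sensitive game like a city builder.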





  • People underestimate how much production other countries are capable of. Of course, China does dominate the manufacturing game, especially mass production.

    There’s no shortage of alternatives all the same. Vietnam in particular has been doing quite well taking manufacturing work that companies are moving out of China so as to diversify their production chains. India is rising on that front too. Not to mention that the West truly does far more manufacturing than people give credit for — I’ve found that nearly every category of general goods I try to buy will have some US-made options. That’s not even touching the rest of the West. The big exception is electronics, but there Vietnam and India are growing alternatives, with Taiwan, Japan, Malaysia, and Singapore all solid players in that market.

    The overall point being: it’s entirely possible to remove China from the manufacturing chain if there’s enough money behind the push. The US economy is probably large enough to do so with some meaningful struggle; the US and major allies together could do so more easily. The difficulty is more political and temporal: getting everyone on board and committed, then seeing the multi-year process through.


  • ME2 is a good game in isolation, but I think it played a big part in getting Bioware where they are now.

    ME2 saw them move far, far more into the action-RPG direction that was wildly popular at the time, with a narrative that was in retrospect just running in place (ME2 contributes effectively nothing towards the greater plot and zero major issues are introduced if it is excised from the trilogy). I feel the wild success ME2 saw after going in this direction caused Bioware to (a) double down on trend chasing, and (b) abandon one of their core strengths of strong, cohesive narratives. ME3 chased multiplayer shooter trends, DA:I and ME:A both chased open world RPG trends, Anthem chased the live service trend, and the first try at DA3 chased more live service stuff before Anthem launched to shit and they scrapped the whole thing to start over.

    All the while, from what I saw firsthand (of those I played) or read secondhand (of those I did not), none of those games put any serious focus on Bioware’s bread and butter of well-written narratives. ME3 in particular is a narrative mess, with two solid payoffs (Krogans + Geth-Quarians) and the rest being some of the worst writing I’ve seen in a major video game.

    ME2 was great. ME2 also set Bioware on a doomed path.




  • I’m planning to upgrade from a 12 mini, which partly influenced my choice of years too (having seen the 3-year data was the main part!). If I had a 12 Pro I think I’d have kept it for an extra year, but the battery is just not sufficient for how my phone use has changed.

    To add to your details: I saw someone point out that one of Apple’s slides for the base 15 compared its performance to the base 12. Apple knows how often people upgrade. Picking the 12 as a comparison point wouldn’t be an accident — we’re the single largest target audience for the 15. And in a year, they will in all likelihood compare the 16 to the 13 for the same reason.


  • This year’s new phones are for people who last bought a phone in 2020 or earlier. If the average user is on a three-year upgrade cycle (which is what the data shows, as I recall) then you’d expect roughly 1/3 of people to upgrade every year.

    This is better for Apple, as it keeps their revenue more spread out instead of heavily concentrated in year one of a three-year cycle (rough numbers in the sketch below).
    This is better for consumers, as it means new features and upgrades are constantly being made. If they want to upgrade early they can, and they’ll get new features even if it’s only been two years.
    This is also better for both Apple and consumers because there’s more opportunities to course-correct or respond to feedback over issues. If Apple only released a phone every other or every three years, it’d take that much longer for the switch to USB-C.
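    As a back-of-the-envelope illustration of that spread-out point (the 300 is a made-up user count in millions, purely for illustration; only the three-year cycle comes from the comment above):

```python
# Hypothetical illustration: the same number of people buy a phone either way,
# but annual releases spread those purchases across the cycle instead of
# bunching them all into one launch year.
users_millions = 300  # made-up figure
cycle_years = 3       # average upgrade cycle

annual_releases = [users_millions / cycle_years] * cycle_years  # ~1/3 upgrade each year
one_release_per_cycle = [users_millions, 0, 0]                  # everyone buys in year one

print(annual_releases)        # [100.0, 100.0, 100.0]
print(one_release_per_cycle)  # [300, 0, 0]
```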

    Just because a new product is launched does not mean you need to buy it. Nvidia released a new GPU last year, but I didn’t buy it even though it’s newer than what I currently have. Arguing that new phones shouldn’t come out each year is like arguing that new cars shouldn’t come out each year. It makes no sense.