• 2 Posts
  • 733 Comments
Joined 2 years ago
Cake day: June 16th, 2023


  • It’d be an interesting project, but it seems like overkill and over-complication when the simplest solution is dual booting and giving each OS complete access to the hardware. Hypervisors for all your systems would mean a lot of configuration, plus constant overhead you can’t escape, for potentially minimal convenience gain.

    Are you hoping to run these OSes at the same time and switch between them? If so, I’m not sure the pain of the setup is worth saving a little time switching between OSes to switch tasks. If you’re hoping to run one task in one machine (like video editing) while gaming in another, it makes more sense, but you’re still running a single i7 chip, so it’ll still be a bottleneck even with all the GPUs and that RAM. Sure, you can share out the cores, but you won’t get the performance of one chip and chipset dedicated to one machine, which is what a server stack gives (and which hypervisors can make good use of).

    I’d also question how good the performance would be on a desktop motherboard with multiple GPUs assigned to different tasks. It’s doubtful you’d hit data-transfer bottlenecks, but it’s still asking a lot of hardware not designed for that purpose.

    If you intend to run the systems one at a time, then you might as well dual boot and avoid sharing system resources with an otherwise unneeded hypervisor host.

    I think if you wanted to do this and run the machines in parallel, a server stack or enterprise-level hardware would probably be better. It’s a case of “just because you can do something doesn’t mean you should”. Unless it’s just a “for fun” project and you don’t mind the downsides? Then I can see the lure.

    But if I were in your position and wanted the “best” solution, I’d probably go for a dual boot with Linux and Windows. In Linux I’d run games natively in the host OS, and use QEMU to have a virtual machine for development (passing through one of the GPUs for AI work). The good thing about this setup is that you can back up your whole development machine’s disk image and restore it easily if you make big changes to the host Linux. Windows I’d use for kernel anti-cheat games and just boot into it when I wanted.
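    That back-up point is worth underlining: a QEMU qcow2 disk image is an ordinary file, so with the guest shut down you can back it up and restore it with standard tools. A minimal sketch (the file names here are hypothetical, and the echo just stands in for a real image so the sketch runs anywhere):

    ```shell
    #!/bin/sh
    set -e

    # Hypothetical paths -- point these at your real VM image,
    # e.g. something under /var/lib/libvirt/images/.
    VM_DISK="dev-vm.qcow2"
    BACKUP="dev-vm.qcow2.bak"

    # Stand-in "disk image" so this sketch runs anywhere;
    # with a real VM you would skip this step.
    echo "guest filesystem contents" > "$VM_DISK"

    # Back up: with the guest powered off, the image is a plain file,
    # so cp (or rsync) is all you need.
    cp "$VM_DISK" "$BACKUP"

    # Restore is just the reverse copy; checksums confirm the copy.
    sha256sum "$VM_DISK" "$BACKUP"
    ```

    If you manage the VM through libvirt, saving the machine definition alongside the disk (virsh dumpxml) makes the restore complete.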

    Personally I dual boot Linux and Windows. I barely use Windows now, but in Linux I do use QEMU and have multiple virtual machines. I have a few test environments for Linux because I like to tinker, plus a Docker server stack that I use to test before deploying to a separate homelab device. I do have a Win11 VM, barely used - it doesn’t have a discrete GPU and it’s sluggish. If you’re gaming, I’d dual boot and give that OS access to the best GPU as and when you need it.

    And if you want the best performance, triple boot. Storage is cheap and you could easily have separate drives for separate OSes. I have one NVMe for Linux and another NVMe for Windows, for example. You could easily have two separate Linux installs and a Windows install. In some ways that may be best, as you’d separate Linux gaming from Linux working and reduce distractions.


  • Linux works great for gaming in my experience. I have a huge games library and I haven’t had many, if any, games that don’t run. There are certainly some games that need tweaking to get working, or optimisation to run well. I generally have those problems with older games though, as my library includes some retro games (games made for Windows 98 being the ones I have to tweak most).

    Mods certainly do work - I’ve modded Skyrim and RimWorld extensively on Linux, as well as Oblivion, Cyberpunk 2077, Stardew Valley, Cities: Skylines, Minecraft and more without issue. Proprietary mod managers may not work, but they’re often the poorer ones that are really just tools to advertise and market at you.

    The vast majority of game mods work inside the game itself, so if the game runs on Linux, the mods will work. The exception would be mods that need to run as a Windows program themselves, separate from the game’s exe. Those can also be made to work; it’s just a bit more involved. Those kinds of mods are pretty rare in my experience though. Mods that act as game launchers work fine too, but need some tweaking to ensure they launch instead of the game’s exe.

    Most game mods can be installed manually, and big games even have their own Linux-native mod managers - like Minecraft custom launchers and RimPy for RimWorld.

    I do still have Windows on my PC in case I need it, but I haven’t used it for gaming in well over a year. I have a desktop, so having a spare drive for Windows is not a big deal to me, but I’m tempted to wipe it as I don’t use it.

    The one area people do have issues with is anti-cheat software for multiplayer games. That’s not an area of gaming I do, but I have seen reports of certain games using proprietary systems that lock out Linux. If there’s a specific game like that you want which isn’t working on Linux, it’s a problem you can’t get round except by having Windows available on your system.


  • I’d take some of the claims with a pinch of salt. Selling faster now reflects better availability of the Switch 2 compared to the Switch 1 at this point in its cycle. The Switch 1 was also sold out this close to launch, but Nintendo wasn’t able to manufacture enough units to keep up.

    All this shows for now is that Nintendo is meeting initial demand better than it could with the first Switch. It does not tell us the console is more popular or how well it’ll do overall. In other words, all this stuff about it “outpacing” the Switch 1 reflects better manufacturing availability rather than how popular the console itself is going to be long term.

    While the Switch 2 has undoubtedly had a strong launch, it remains to be seen whether the mass market will clamour to buy them for Christmas when they’re relatively expensive, with a limited selection of exclusive games. Adult gamers/early adopters being enthusiastic about the Switch 2 is a good sign, but it doesn’t necessarily translate to parents buying the console for their families.

    The family and casual gamer market is the bigger one for Switch, and I honestly don’t yet see a compelling reason they’d rush out to buy one. 1080p gaming, better performance and GameChat certainly aren’t it. It needs some really compelling first-party or exclusive games. Mario Kart World and Donkey Kong Bananza plus a raft of old games really isn’t great.

    I’m not seeing a big new must-have exclusive to help drive sales for Christmas. No big new Zelda, Mario or Pokémon game? Maybe Nintendo intends Christmas 2026 to be the mass-market year for the Switch 2, with this year about keeping on top of initial demand, but that seems a bit of a risky strategy to me.


  • This is tech writers thinking everyone lives like them. An 8-year-old graphics card is fine if you’re not doing high-end gaming or video editing. That card will still run a 4K desktop, and probably multi-screen 4K desktops, without any issue.

    For most users, graphics cards have long been at a level where they don’t need upgrading. A mid-range graphics card from even 10 years ago is more than powerful enough to watch video or use desktop programs, and is even fine for a wide range of games.

    It’s only high-end 3D gaming that requires upgrades, and arguably even that has passed the point of diminishing returns in the last five years for the majority of users and titles.

    I do game a fair bit, and my RTX 3070, which is five years old, really doesn’t need upgrading. Admittedly it was higher end when it launched, but it still plays a game like Cyberpunk 2077 at high settings. It’s arguable how many users would even notice the difference with the “ultra” settings on most games, let alone actually need them. New cards are certainly very powerful, but the fidelity jump for the price and power just isn’t there in the way it would have been when upgrading a card even 10 years ago.


  • A lot of it comes down to convention, and convention is often set by those who did it first or whose work dominated a field. The mathematical notation system we use today is just one convention - not the only one that exists - but it’s the one the world has decided to standardise on.

    René Descartes is usually regarded as the originator of the current system. He used a, b, c for constants and x, y, z for unknown variables, amongst other conventions.
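    You can still see that split in how a general quadratic is written today - constants from the start of the alphabet, the unknown from the end:

    ```latex
    % a, b, c: constants (start of the alphabet)
    % x: the unknown (end of the alphabet), per Descartes's convention
    ax^2 + bx + c = 0
    \qquad\Longrightarrow\qquad
    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
    ```

    Swap those roles around and the formula still works, but every reader has to stop and re-derive which letters are the givens.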

    Sequential letter sets are easy to use because they’re easily recognised, and convenient as a result; they’re also generally accepted to have non-specific or less specific meaning. For example:

    a² + b² = c²

    That formula is a much simpler concept to get your head round using sequential letters than:

    V² + G² = z²

    Under the common system, when you don’t use sequential letters it also implies much more specific meaning to the individual letters, and that can introduce ambiguity and confusion.

    When writing a proof there can be many, many statements made, and you’d quickly run out of letters if you didn’t have a convention that a, b, c are variables and can be reused.

    We also do use symbols from other alphabets; alpha/beta/gamma is a commonly used trio. But in mathematical notation there is now a huge range of defined constants and symbols, many of which have been ascribed specific uses - pi, for example. So you risk introducing ambiguity by moving away from the accepted conventions of current maths and using other sets.

    Even e has a specific meaning and can be ambiguous if you need to stretch to five variables. When working with e, it’s not uncommon to use a different string of letters in the Latin alphabet to avoid confusion if you need variables.

    And we don’t stop at three; a, b, c, d etc. are used.


  • Yeah, it’s not about the Internet, virtual reality or fax machines etc.; it was about overpopulation and ecological collapse, among other things.

    The song was inspired by a trip to an underground city in Sendai, Japan, if you read Wikipedia. In the late 90s, Japan was a gadget-obsessed place, with neon signs and screens packed into places like Sendai. Japan had industrialised rapidly over the 20th century and gave the impression of a thriving technology and manufacturing industry.

    It was seen as a futuristic place by people from the rest of the world when they visited. Of course in reality Japan was in the first of its “lost decades” of stagnation that’s run from the early 90s to now.


  • I find the article a little ridiculous. “Chilling” is being used to describe the end of late night television for commercial reasons. People aren’t watching late night TV as much, and the advertising is not there - that isn’t chilling; that’s the world we live in.

    The way the article is written you’d think late night TV is an irreplaceable cultural touchstone. It’s nonsense - people have stopped watching so it’s already no longer a touchstone.

    This is from Wikipedia on the rival Tonight Show’s ratings in 2006:

    In 2006, The Tonight Show led in ratings for the 11th consecutive season, with a nightly average of 5.7 million viewers – 31% of the total audience in that time slot – compared to 4.2 million viewers for Late Show with David Letterman, 3.4 million for Nightline and 1.6 million for Jimmy Kimmel Live!.

    In 2025, Colbert leads the ratings with 2.42m viewers - less than half the audience of first place in 2006 - and we’re now in a time when TV advertising has declined massively in value.

    Look at the TV ratings for 2024/2025 and network TV has collapsed. The most-watched shows are on Netflix, and even the few network TV shows that break the top 20 are getting far fewer viewers than 20 years ago. Plus their audiences skew older, which is less valuable to advertisers. The demographic they want - 18-35 - doesn’t watch TV any more, and it certainly doesn’t sit down every night for late night TV shows.

    So Colbert’s Late Show being cancelled is more a sign of the times. Network TV is dying, and it’s dying fast. The merger between Skydance and Paramount/CBS is itself a sign of the times - one billionaire media family (the Redstones) is exiting old media while another (the Ellisons) is buying in. But Paramount and CBS are not doing well - Paramount Global has declining revenues, declining assets and made an operating loss of $5.3bn. The big media conglomerates failed to move fast enough with the times; Netflix has won the streaming wars while traditional TV and cinema are in massive decline. These companies don’t have any answers - they’re just managing declines while new companies come along and take advantage of the new world.

    Colbert’s show was in first place and it was cancelled to save money. I wouldn’t be surprised if there are further cancellations, although for now I suspect the networks will wait to see where the Late Show’s audience lands. But one of the shows (Jimmy Kimmel’s, I think?) sacked its house band last year to save money - the writing is on the wall.

    It’s possible the politics of the merger played a role but even if that’s the case, it shows that the value of the Late Show has declined so much for Paramount/CBS that they could dump it easily.


  • This is a combination of terrible legislation in the UK meets awful social media site.

    The Online Safety Act is an abomination, compromising the privacy and freedom of the vast majority of the UK in the name of “protecting children”.

    I’m of the view parents are responsible for protecting their children. I know it’s hard but the Online Safety Act is not a solution.

    All it will do is compromise the privacy and security of law-abiding adults, while kids will still access porn and all the really bad stuff on the Internet will be unaffected. The dark, illegal shit on the Internet is not happening on Pornhub or Reddit.

    The UK is gradually sliding further and further into censorship and authoritarianism, all in the name of do-gooders. It’s scary to watch.



  • All of these can be run on any Linux distro. Dropbox is probably a better choice than Google Drive, as Google Drive doesn’t have an official Linux app (though you can get it working beyond just using it in a web browser if it’s a must).

    I’d go with Linux Mint as it’s well supported, but any point-release distro will serve your needs well - for example Fedora KDE, openSUSE Leap or Debian. I wouldn’t recommend Ubuntu.


  • BananaTrifleViolin@lemmy.world to Linux@lemmy.ml · Swapping from Win10 on laptop

    I personally recommend Mint as a good starting distro. It is widely used, which means lots of support is readily found online. It also has some of the benefits of Ubuntu without having Snap forced on users, and it generally works well on a wide range of systems, including lower-powered ones, thanks to its selection of desktops.

    Your laptop is decent and I’d personally run a slick desktop on it, specifically KDE. But a lot of that comes down to personal preference, and Mint isn’t the best choice for KDE as it’s not one of Mint’s main desktops (although it is available).

    However, once you get to grips with the basics of Linux, I think other distros offer better, more focused benefits for different user groups. There are lots of choices, such as gaming-focused distros, rolling-release vs point-release distros, slow long-term projects like Debian vs bleeding-edge projects, immutable systems, etc.

    I personally use openSUSE Tumbleweed because it’s cutting edge but well tested prior to updates, with a good set of system tools in YaST, and decently ready for gaming and desktop use. I also like that it’s European. But that may not be a good fit for your specific use case. Leap, the openSUSE point-release distro, would be better - a nice KDE desktop with a reliable release schedule and a focus on stability over cutting edge.


  • BananaTrifleViolin@lemmy.world to Linux@lemmy.ml · Swapping from Win10 on laptop

    That’s not entirely true. Snap is a good reason to avoid Ubuntu, as you’re not given the choice of whether day-to-day apps like Firefox are native or Snap apps - you can only have the Snap versions. Having a slower, less efficient version of apps forced on you with no official alternative is a good enough reason to recommend avoiding Ubuntu.

    That is regardless of all the commercial and proprietary concerns people have.

    That does not apply to Ubuntu-based systems like Mint, where users are given choices and still benefit from other aspects of the Ubuntu ecosystem.