

  • Yes, they are reverting it. Fedora users always live on the edge. They are basically (well, not quite) “always” the first to adopt new technology. Not even Arch Linux does that. Arch users obviously live on the edge too, but for other reasons. :D

    But wasn’t Fedora only going to discontinue X11 support for the GNOME version? I thought the other spins were still allowed to support it, but it doesn’t matter anymore, because they are reverting that decision. I think. But why didn’t you switch to another distribution, instead of buying new hardware, if that was the only problem?


  • A dash is a bit problematic from a practical point of view. For example, I allow single numbers without a colon, such as just 6, which is interpreted as 6:6. Each element is optional as well, so -6 could be a negative number, a command-line option, or a range. Some languages also use dots .. instead. If I ever want to support negative numbers, then the hyphen, dash, or minus character would be in the way.

    I mean, I could do something duck-typing-like, where I accept “any” non-digit character (maybe except the minus and plus characters) with a regex. Hell, even a space could be used… But I think in general a standardized character is the better option for something like this, because from a practical point of view there is no real benefit for the end user in using a different character, in my opinion. Initially I thought about what format to use, and a colon is pretty much set in stone for me now. A rough sketch of how I imagine the parsing is below.
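
    Just to make it concrete, here is a minimal sketch in Rust (the function names are made up, not my actual code) of how the colon format could be parsed; a bare number becomes START:START and an empty side is left open so the defaulting logic can fill it in later:

    ```rust
    /// Hypothetical sketch: parse a range argument like "6", "2:5", "2:" or ":5".
    /// A bare number such as "6" is treated as "6:6"; missing parts stay `None`
    /// so the caller can apply its own defaults afterwards.
    fn parse_range(arg: &str) -> Result<(Option<u32>, Option<u32>), String> {
        let parse_part = |s: &str| -> Result<Option<u32>, String> {
            if s.is_empty() {
                Ok(None)
            } else {
                s.parse::<u32>()
                    .map(Some)
                    .map_err(|e| format!("invalid number '{s}': {e}"))
            }
        };

        match arg.split_once(':') {
            // "START:END", where either side may be empty (e.g. "2:", ":5" or ":")
            Some((start, end)) => Ok((parse_part(start)?, parse_part(end)?)),
            // a bare "6" is shorthand for "6:6"
            None => {
                let n = parse_part(arg)?;
                Ok((n, n))
            }
        }
    }

    fn main() {
        assert_eq!(parse_range("6"), Ok((Some(6), Some(6))));
        assert_eq!(parse_range("2:"), Ok((Some(2), None)));
        assert_eq!(parse_range(":4"), Ok((None, Some(4))));
    }
    ```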


  • Fedora even switched to Wayland by default in 2016 (at least for the GNOME release). I don’t know what they were thinking. They were already using Wayland 8 to 9 years ago… and it still has some “problems”. Can’t imagine what you were going through. :D

    But compared to Fedora, Ubuntu only switched to Wayland temporarily, right? I mean, it was not in an LTS version. I installed LTS 18.04 and don’t remember anything like that by default.







  • I think I’m going with these approaches. For the ‘0’, I’m now accepting it as the 0 element, which is not a 0-based index; it really means “before the first element”. So any slice with an END of 0 is always empty, and anything that starts at 0 will basically give you as many elements as END points to.

    • 0: is equivalent to : and 1: (meaning everything)
    • 0 is equivalent to 0:0 and :0 (meaning empty)
    • 1:0 is still empty, because it ends before it starts, which reads like “start at 1, give me 0 elements”
    • 1:1 gives one element, the first, which reads like “start at 1, give me 1 element”

    I feel confident about this solution. And thanks to everyone here, this was really what I needed. After trying it out on the test data I have, I personally like this model. It isn’t anything surprising, right? A small sketch of how I read these rules follows below.
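
    To make the model above concrete, here is a rough Rust sketch (the function name is made up) of how those rules map onto an actual slice:

    ```rust
    /// Hypothetical sketch of the model above: 1-based, inclusive START:END,
    /// where END == 0 is always empty and START == 0 acts like START == 1.
    fn slice_1based<T>(items: &[T], start: usize, end: usize) -> &[T] {
        // END of 0 means "before the first element": always empty.
        if end == 0 {
            return &[];
        }
        // START of 0 behaves like 1, so 0:N gives the first N elements.
        let start = start.max(1);
        // Clamp END to the list length, then convert to 0-based, exclusive end.
        let end = end.min(items.len());
        if start > end {
            // Covers "1:0"-style cases and ranges starting past the end of the list.
            return &[];
        }
        &items[start - 1..end]
    }

    fn main() {
        let v = [10, 20, 30, 40];
        assert!(slice_1based(&v, 1, 0).is_empty());    // "1:0" -> empty
        assert!(slice_1based(&v, 0, 0).is_empty());    // "0"   -> empty
        assert_eq!(slice_1based(&v, 1, 1), &[10]);     // "1:1" -> first element
        assert_eq!(slice_1based(&v, 2, 3), &[20, 30]); // "2:3" -> inclusive
        assert_eq!(slice_1based(&v, 0, 4), &v[..]);    // "0:4" -> everything
    }
    ```

    So 1:1 gives the first element and 1:0 gives nothing, matching the list above.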





  • First, thanks for the answer. As for the user base, it’s actually gaming oriented and they typically don’t interact with 0-based indexing, so I guess that makes for an obvious choice. At the moment it’s also “inclusive”: to get one element, the user needs to write 2:2. If the user gives only one number, such as 2, then I could convert it into 2:2 to get one element. Sounds logical, right? Sorry for having so many follow-up questions, my head is currently spinning.

    Do you think this somehow interferes with the logic of a “missing” slice element, which would default to “the rest of the list”? For example, 2: would then get everything from the second element to the end. This is the default behavior in Rust.

    If I have a 1-based index, how would you interpret 0? Currently the program panics at the argument-interpretation phase.
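
    For reference, this is what I mean by Rust’s default behavior when one side of the range is missing (0-based there, of course):

    ```rust
    fn main() {
        let v = [10, 20, 30, 40];
        // Missing end: everything from index 1 (0-based) to the end of the list.
        assert_eq!(&v[1..], [20, 30, 40]);
        // Missing start: everything up to, but not including, index 2.
        assert_eq!(&v[..2], [10, 20]);
        // Both missing: the whole list.
        assert_eq!(&v[..], [10, 20, 30, 40]);
    }
    ```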



  • Your final score: 9/26

    There were more nuances and surprises than anticipated. But that should be expected, because it’s on this website. :D Any language with duck typing, a lot of magic and interpretation, a long history of changes and additions, and use in a huge variety of environments is bound to be surprising. I am not surprised that JavaScript and Python are surprising.



  • I have the same doubts as you and wondered the same when reading those paragraphs about the performance claims. I will most likely do my own comparisons, but I have AMD hardware here. The claims talk about Nvidia, so maybe it’s not applicable to me. I’ll do my comparisons in the next few days, because I’m currently working on something else.

    From my research, I found an old GamingOnLinux article with a quote explaining this at a high level: https://www.gamingonlinux.com/2022/04/a-developer-made-a-shadowplay-like-high-performance-recording-tool-for-linux/

    OBS only uses the gpu for video encoding, but the window image that is encoded is sent from the GPU to the CPU and then back to the GPU. These operations are very slow and causes all of the fps drops when using OBS. OBS only uses the GPU efficiently on Windows 10 and Nvidia. This gpu-screen-recorder keeps the window image on the GPU and sends it directly to the video encoding unit on the GPU by using CUDA. This means that CPU usage remains at around 0% when using this screen recorder.

    Keep in mind that these claims may not be true anymore, because OBS has improved over the last 3 years too. Always take claims like these with a grain of salt (that’s healthy).