I think it was Mandrake Linux for me.
It no longer exists though. …I guess I’m old.
The packager should always “explicitly require” the dependencies of a Nix package… it’s not like it’s a choice; if there are missing dependencies, then that’d be a bug.
If the package doesn’t declare its dependencies properly, it might not run properly on NixOS, since there are no “system libraries” in that OS other than the ones that were installed from Nix packages.
And one of its advantages over AppImages is that instead of bundling everything together, causing redundancy and inefficient use of resources, with Nix you actually have shared libraries (not the system ones, but Nix dependencies). If you have multiple AppImages that bundle the same libraries, you can end up with the exact same version of a library installed multiple times (or loaded in memory, when running). AppImages don’t scale; you would be wasting a lot of resources if you made heavy use of them, whereas with Nix you can run an entire OS built from Nix packages.
Huh? As far as I know it has its own libraries and dependency system. What do you mean?
The nice thing about Nix/Guix is that each version of a library only needs to be installed once and it won’t really be “bundled” with the app itself. So it would be a lot easier to hunt down the packages that depend on a bad library.
Flatpak still depends on runtimes, though; I have a few different runtimes installed just because of one or two Flatpaks that required them (for example, I have both the GNOME and KDE Flatpak runtimes, despite not running either of those desktop environments)… and they can depend on specific versions of runtimes too! I remember one time Flatpak recommended I uninstall a program because it depended on a deprecated runtime that was no longer supported.
Also, some Flatpaks can depend on another Flatpak; for example, for Godot they are preparing a “parent” Flatpak (I don’t remember the terminology) that Godot games can depend on, to reduce redundancy when you have multiple Godot games installed.
Because of those things, you are still likely to need a Flatpak remote configured and an internet connection when you install a Flatpak. It’s not really a fully self-contained thing.
AppImages are more self-contained… but even those might make assumptions about which libraries the system has, which makes them not as universal as they might seem. That, or the file needs to be really big, unnecessarily so. Usually it’s a combination of, or compromise between, both problems, at the discretion of the dev doing the packaging.
The advantage of Nix is that it’s more efficient with the user’s disk space (because it makes sure you don’t get the exact same version of a library installed twice), while making it impossible to have a dependency conflict regardless of how old or new the thing you want to install is (which is something the package manager of your typical distro can’t do).
It’s also not that uncommon an acronym in web tech: all the first results when searching “PWA” are consistent, and it’s a very common way to refer to that technology. The term PWA has made the news in tech channels a few times before (like when Firefox discontinued support for PWAs on desktop).
Even if they had said “Progressive Web Apps”, it would not have been immediately clear what that means to anyone who isn’t familiar with PWAs. It’s also not the only acronym used in the article without explanation (e.g. “API”, or “iOS”, which is itself an acronym); it just so happens that this one is likely not well known in the particular Lemmy community where the article was posted. The author advertises himself as a writer dedicated to web technologies (PWAs and Web Components in particular), so it would be silly if he had to explain what those are in every one of his posts.
But C syntax clearly hints at `int *p` being the expected format. Otherwise you would only need to do `int* p, q` to declare two pointers… however, doing that only declares `p` as a pointer. You are actually required to type `*` in front of each variable name intended to hold a pointer in the declaration: `int *p, *q;`
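To make that concrete, here’s a minimal C sketch (the variable names are just placeholders of my own); the comments note which declarations actually produce pointers:

```c
#include <stdio.h>

int main(void) {
    int x = 42;

    int* p, q;   /* despite the spacing, only p is a pointer; q is a plain int */
    int *r, *s;  /* the * has to be repeated for each variable meant to be a pointer */

    p = &x;
    q = x;
    r = &x;
    s = &x;

    printf("%d %d %d %d\n", *p, q, *r, *s);
    return 0;
}
```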
Same effort as getting `&`, `*`, `(` and `)` on a US layout (so, modifier key + 7 8 9 0, respectively); the difference is you press AltGr instead of Shift as the modifier. And I’d argue it’s actually easier to press AltGr with the thumb than Shift with the pinky.
I use EURkey, which is basically a superset of the US layout extended to support symbols from several European languages.
> it’s even ISO standardized
Not only are there other layouts that are also ISO standards when it comes to software layouts, but funnily enough, when it comes to physical layouts, US keyboards normally follow an ANSI standard (not an ISO one), whereas many non-US keyboards typically follow a physical key layout known as the “ISO keyboard”. So one could argue those are more of an “ISO” standard.
> right ctrl + left shift + 9 will do?
No keyboard layout uses Ctrl like that… in fact, I don’t think you ever really need to press more than one modifier on any standard non-US layout. Unless you have a very advanced custom layout with fancy extra glyphs… but definitely not for the typical programming symbols.
ISO keyboards actually have one more key and one more modifier (“AltGr”, which is different from “Alt”) than the ANSI keyboards.
In fact, depending on the symbol it might be easier in some cases. No need to press Shift or anything for a `#` or a `+` on a German QWERTZ keyboard, unlike on the US one. Though of course for some other ones (like `=` or `\`) you might need to press 1 modifier… but never more than 1, so it isn’t any harder than doing a `)` or a `_` on the US layout.
Yes… how is “reducing exclamation marks” a good thing when you do it by adding a `'` (not to be confused with `,`, `´`, `‘` or `’` …which are all different characters)?
Does this rely on the assumption that everyone uses a US QWERTY keyboard, where `!` happens to be slightly more inconvenient than typing `'`?
Even if they did hallucinate answers, it wouldn’t be the first game that relies on the “unreliable narrator” trope.
Me neither? That’s why I was hoping they might have added some markdown extension.
I have done it in the past with the markdown-it-wikilinks npm package, for example.
Also, I’d argue the wikilinks (internal links) using `[[any term here]]` from Wikipedia, which optionally allow automatically inferring the link, are much more comfortable (and less error-prone) for the use case of a wiki system than the `[text required](/link_here_also_required_even_when_redundant)` from markdown.
I was hoping they might have added some markdown extension to do something similar, but it seems not.
I mean, it would technically be possible to build a computer out of organic, living biological tissue. It wouldn’t be very practical, but it’s technically possible.
I just don’t think it would be very reasonable to consider that the one thing making it intelligent is that it’s made of proteins and living cells instead of silicon and diodes. I’d argue that such a claim would, in itself, be a strong claim too.
Note that “real world truth” is something you can never accurately map with just your senses.
No model of the “real world” is accurate, and not everyone maps the “real world truth” they personally experience through their senses in the same way… or even necessarily in a way that’s really truly “correct”, since the senses are often deceiving.
A person who is blind experiences the “real world truth” by mapping it to a different set of models than someone who has additional visual information to mix into that model.
However, that doesn’t mean the blind person can “never understand” the “real world truth” …it just means that the extent to which they experience that truth is different, since they need to rely on other senses to form their model.
Of course, the more different the senses and experiences between two intelligent beings, the harder it will be for them to communicate with each other in a way they can truly empathize with. At the end of the day, when we say we “understand” someone, what we mean is that we have found enough evidence to hold the belief that some aspects of our models are similar enough. It doesn’t really mean that what we modeled is truly accurate, nor that if we didn’t understand them then our model (or theirs) is somehow invalid. Sometimes two people are technically referring to the same “real world truth”; they simply don’t understand each other and focus on different aspects/perceptions of it.
Someone (or something) not understanding an idea you hold doesn’t mean that they (or you) aren’t intelligent. It just means you both perceive/model reality in different ways.
+1 on this. Kobos actually use Linux under the hood. And although the default UI is proprietary, it’s super easy to install KOReader.
You don’t even need to hack custom firmware onto it, just a sideloader, which normally doesn’t break even if you update the base firmware.
Here’s the official tutorial on how to do it: https://github.com/koreader/koreader/wiki/Installation-on-Kobo-devices
The AI can only judge by using a neural network trained on what’s human and what’s AI (and by the way, for that training you need humans)… which means you can break that test by making an AI that also accesses that same neural network and uses it to self-test its responses before outputting them, providing only exactly the kind of output the other AI would give a “human” verdict on.
So I don’t think that would work very well; it’ll just be a cat-and-mouse race between the AIs.
And please, get all countries to actually start properly accepting ISO 8601 format for dates as a mandatory universal standard…
Obligatory reference: https://xkcd.com/1179/
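For reference, ISO 8601 calendar dates take the form YYYY-MM-DD; here’s a minimal C sketch (my own illustration, not from the linked comic) that prints today’s date in that format with strftime:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    char buf[32];
    time_t now = time(NULL);
    struct tm *tm_now = localtime(&now);

    /* "%Y-%m-%d" produces the ISO 8601 calendar date, e.g. 2025-02-27 */
    strftime(buf, sizeof buf, "%Y-%m-%d", tm_now);
    printf("%s\n", buf);
    return 0;
}
```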