

The statement in this meme is false. There are many programming languages which can be written by humans but which are intended primarily to be generated by other programs (such as compilers for higher-level languages).
The distinction can sometimes be missed even by people who are successfully writing code in these languages; this comment from Jeffrey Friedl (author of the book Mastering Regular Expressions) stuck with me:
I’ve written full-fledged applications in PostScript – it can be done – but it’s important to remember that PostScript has been designed for machine-generated scripts. A human does not normally code in PostScript directly, but rather, they write a program in another language that produces PostScript to do what they want. (I realized this after having written said applications :-)) —Jeffrey
(there is a lot of fascinating history in that thread on his blog…)
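As a toy illustration of the “write a program in another language that produces PostScript” idea (nothing to do with Friedl’s actual applications), generating a small but valid PostScript file from a few lines of Python looks like this:

```python
# Emit a minimal PostScript document from Python instead of writing it by hand.
lines = ["%!PS-Adobe-3.0", "/Helvetica findfont 14 scalefont setfont"]
for i, word in enumerate(["machine", "generated", "postscript"]):
    # "x y moveto (text) show" places each word on its own line of the page.
    lines.append(f"72 {720 - 20 * i} moveto ({word}) show")
lines.append("showpage")

with open("out.ps", "w") as f:
    f.write("\n".join(lines) + "\n")
```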
They have to know who the message needs to go to, granted. But they don’t have to know who the message comes from, which is why the sealed sender technique works. The recipient verifies the message via the keys that were exchanged if they have communicated with that correspondent before; otherwise it arrives as a new message request.
So I don’t see how they can build social graphs if they don’t know who the sender of each message is; they can only plot recipients, which is not enough.
You need to identify yourself to receive your messages, and you send and receive messages from the same IP address, and there are typically not many if any other Signal users sharing the same IP address. So, the cryptography of “sealed sender” is just for show - the metadata privacy remains dependent on them keeping their promise not to correlate your receiving identity with the identities of the people you’re sending to. If you assume that they’ll keep that promise, then the sealed sender cryptography provides no benefit; if they don’t keep the promise, sealed sender doesn’t really help. They outsource the keeping of their promises to Amazon, btw (a major intelligence contractor).
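To make that argument concrete, here is a rough and entirely hypothetical sketch (not Signal’s code, schema, or API) of how a server operator who retains ordinary delivery metadata could still attribute “sealed” messages to senders by IP address alone:

```python
from collections import defaultdict

# Hypothetical log records, purely to illustrate the argument above;
# this is not Signal's actual schema or code.
# auth_events: when an authenticated account connected, and from which IP.
auth_events = [
    ("2024-01-01T10:00:03Z", "alice-account", "198.51.100.7"),
]
# sealed_sends: "anonymous" submissions: time, source IP, and the recipient.
sealed_sends = [
    ("2024-01-01T10:00:04Z", "198.51.100.7", "bob-account"),
]

def build_social_graph(auth_events, sealed_sends):
    # Map each IP to the account(s) that authenticated from it.
    # (Matching only nearby timestamps would tighten this further.)
    ip_to_accounts = defaultdict(set)
    for _, account, ip in auth_events:
        ip_to_accounts[ip].add(account)

    # Attribute each "sealed" send to whoever authenticated from the same IP:
    # the sender is hidden cryptographically, but not at the network layer.
    edges = defaultdict(int)
    for _, ip, recipient in sealed_sends:
        for sender in ip_to_accounts.get(ip, ()):
            edges[(sender, recipient)] += 1
    return dict(edges)

print(build_social_graph(auth_events, sealed_sends))
# {('alice-account', 'bob-account'): 1}
```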
Just in case sealed sender was actually making it inconvenient for the server to know who is talking to whom… Signal silently falls back to “unsealed sender” messages if the server returns a 401 when the client tries to send a “sealed sender” message, which the server actually does sometimes. As the current lead dev of Signal-for-Android explains: “Sealed sender is not a guarantee, but rather a best-effort sort of thing”, so “I don’t think notifying the user of a unsealed send fallback is necessary”.
Given the above, don’t you think the fact that they’ve actually gone to the trouble of building sealed sender at all, which causes many people to espouse the belief you just did (that their cryptographic design renders them incapable of learning the social graph, not to mention learning which edges in the graph are most active, and when), puts them rather squarely in “doth protest too much” territory? 🤔
i bet you’re going to love to hate this wikipedia article https://en.wikipedia.org/wiki/Monochrome_painting 😂
because it’s stupid.
you were bamboozled
presumably you find value in some things that some other people think are stupid too; it’s OK
It looks huge on a Mercator Projection map even though it isn’t that large.
In the Mercator projection it appears to have about the same area as Africa, while in reality it is about a fourteenth of its size. But I wouldn’t say it “isn’t that large”: if Greenland were independent, it would be the 12th largest country in the world (and Denmark is, because of it).
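For anyone who wants to check the numbers, here is a quick back-of-the-envelope sketch (the areas are rounded published figures, and 72°N is just a rough mid-latitude for Greenland):

```python
import math

# Approximate real areas, in km^2 (rounded published figures).
greenland = 2_166_000
africa = 30_370_000
print(round(africa / greenland, 1))   # ~14.0: Greenland is about a 14th of Africa

# The Mercator projection inflates areas by roughly sec(latitude)^2.
# Greenland spans roughly 60-83°N; Africa straddles the equator (almost no inflation).
inflation_at_72 = 1 / math.cos(math.radians(72)) ** 2
print(round(inflation_at_72, 1))      # ~10.5x area inflation at 72°N
print(round(greenland * inflation_at_72 / 1e6, 1))
# ~22.7 million km^2 of drawn area at that latitude alone, i.e. the same order
# as Africa; the northern parts (closer to 80°N) are inflated far more (~33x).
```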
I see. What a mess.
The instructions at https://docs.searxng.org/admin/installation-docker.html mention that the docker image (which that page tells you to just pull and run) has its “sources hosted at” https://github.com/searxng/searxng-docker and has instructions for running the image without docker-compose.
But, the Dockerfile source for the image is actually in the main repo at https://github.com/searxng/searxng/blob/master/Dockerfile, and the searxng-docker repo actually contains a docker-compose.yaml and different instructions for running it under compose instead.
Anyway, in the docker-compose deployment, SEARXNG_BASE_URL (yet another name for this… neither SEARXNG_URL nor BASE_URL, but apparently base_url is set from it) is constructed from SEARXNG_HOSTNAME on line 58 here: https://github.com/searxng/searxng-docker/blob/a899b72a507074d8618d32d82f5355e23ecbe477/docker-compose.yaml#L58
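If it helps untangle the naming, here is my reading of that line as a tiny sketch; the https:// scheme and the localhost fallback are my assumptions for illustration, not something copied from the repo:

```python
import os

# A sketch of what the linked docker-compose line appears to do: derive
# SEARXNG_BASE_URL (and therefore base_url) from SEARXNG_HOSTNAME.
# The scheme and the fallback value here are assumptions, not repo code.
def base_url_from_hostname(environ=os.environ):
    hostname = environ.get("SEARXNG_HOSTNAME", "localhost")
    return environ.get("SEARXNG_BASE_URL", f"https://{hostname}/")

print(base_url_from_hostname({"SEARXNG_HOSTNAME": "search.example.org"}))
# https://search.example.org/
```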
If I had a github account associated with this pseudonym I might open an issue or PR about this, but I don’t and it isn’t easy to make one anymore 😢
yes, when the month is written non-numerically (and the year is written with four digits) there is no ambiguity.
but, the three formats in OP’s post are all about writing things numerically.
In some contexts, writing out the full month name can be clearer (at least for speakers of the language you’re writing in), but it takes more (and a variable amount of) space and the strings cannot be sorted without first parsing them into date objects.
Anywhere you want or need to write a date numerically, ISO-8601 is obviously much better and should always be used (except in the many cases where the stupid formats are required by custom or law).
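To illustrate the sorting point with some made-up dates:

```python
# The same three (made-up) dates in ISO 8601, DD/MM/YYYY, and MM/DD/YYYY form.
iso = ["2023-11-02", "2024-01-31", "2024-02-01"]
dmy = ["02/11/2023", "31/01/2024", "01/02/2024"]
mdy = ["11/02/2023", "01/31/2024", "02/01/2024"]

# Plain string sorting is chronological only for the ISO 8601 strings.
print(sorted(iso))  # ['2023-11-02', '2024-01-31', '2024-02-01']  (correct order)
print(sorted(dmy))  # ['01/02/2024', '02/11/2023', '31/01/2024']  (scrambled)
print(sorted(mdy))  # ['01/31/2024', '02/01/2024', '11/02/2023']  (2023 sorts last)

# And without knowing the convention, "01/02/2024" could be the 1st of February
# or the 2nd of January; "2024-02-01" is unambiguous.
```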
Changing SEARXNG_HOSTNAME in my .env file solved it.
nice. (but, i assume you actually mean SEARXNG_URL? either that or you’re deploying it under some environment other than the one described in the official repo, because the string HOSTNAME does not appear anywhere in the searxng repo.)
https://docs.searxng.org/admin/settings/settings_server.html says you need to set base_url, and that by default it’s set to $SEARXNG_URL.

however, https://docs.searxng.org/admin/installation-docker.html#searxng-searxng says that if you are running it under docker, the environment variable which controls base_url in the config is actually BASE_URL rather than SEARXNG_URL.
(possibly whichever variable applies is currently empty, which might make it construct a URL based on the IP address it is configured to listen on.)
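In other words, my (possibly wrong) reading of those two docs pages, as a sketch rather than anything taken from searxng’s code:

```python
import os

# A sketch of which variable appears to feed base_url, per the two docs pages
# above; this is my reading of the documentation, not searxng's actual code.
def effective_base_url(environ=os.environ, under_docker=False):
    if under_docker:
        # installation-docker.html: the container reads BASE_URL
        configured = environ.get("BASE_URL", "")
    else:
        # settings_server.html: base_url defaults to $SEARXNG_URL
        configured = environ.get("SEARXNG_URL", "")
    # If the relevant variable is empty, base_url stays unset and searxng may
    # fall back to constructing a URL from its listen address (the guess above).
    return configured or None

print(effective_base_url({"SEARXNG_URL": "https://search.example.org/"}))
# https://search.example.org/
```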
in my experience DeepL has the best results for some language pairs while Google is better for others (and has a lot more languages).
But, these days I’m happy to say Firefox translate is the first thing I try and it is often sufficient. I mostly only try the others now when the Firefox result doesn’t make sense or the language is unsupported.
Yeah, that would make sense - language detection is trivial and can be done with a small statistical model; nothing as complicated as a neural network is needed. i think just looking at bigram frequencies is accurate enough once you have more than a few words.
If that is what is happening, and it only leaks the language pair to the server the first time that pair is needed, that would be nice… I wish they made it clear whether that is actually the case 😢
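To illustrate the bigram idea, here is a toy sketch (this has nothing to do with whatever Firefox actually ships for language identification):

```python
from collections import Counter

def bigrams(text):
    # Keep letters and spaces, lowercase, and count overlapping two-character pairs.
    text = "".join(c.lower() for c in text if c.isalpha() or c.isspace())
    return Counter(text[i:i + 2] for i in range(len(text) - 1))

# Tiny "training" samples; a real model would be built from much more text.
samples = {
    "en": "the quick brown fox jumps over the lazy dog and then the other one",
    "de": "der schnelle braune fuchs springt über den faulen hund und dann über den anderen",
    "es": "el rápido zorro marrón salta sobre el perro perezoso y luego sobre el otro",
}
profiles = {lang: bigrams(text) for lang, text in samples.items()}

def guess_language(text):
    seen = bigrams(text)
    # Score each language by how much its bigram profile overlaps with the input.
    def overlap(profile):
        return sum(min(count, profile[bg]) for bg, count in seen.items())
    return max(profiles, key=lambda lang: overlap(profiles[lang]))

print(guess_language("den hund und der fuchs"))  # de
```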
Probably that’s when it makes an online connection?
since the help says it is downloading “partial language files” automatically, and the button never changes from “Download” to “Remove” if you don’t click Download, logically it must sometimes need to download more of a language which you have previously downloaded a “partial language file” of.
i am curious if the choice of which parts of the “language file” (aka model) it is downloading really does not reveal anything about the text you’re translating; i suspect it most likely does reveal something about the input text to the server… but i’m not motivated enough to research it further at the moment.
Wow, thanks for the about:translations tip - I was wondering how to do that!
Besides “Translate page” there is also a “Translate selection” option in the right-click menu so you can translate part of a page.
However, unless you download languages in the “Translation” section of Firefox preferences, it doesn’t actually always work while offline:
As you pointed out, the help page explicitly says there is “no privacy risk of sending text to third parties for analysis because translation happens on your device, not externally”, but, after I translate something in a new language I haven’t used before, it still doesn’t appear as downloaded (e.g. having a “Remove” button instead of a “Download” button) in the preferences.
The FAQ has a question, “Why do I need to install languages?”, with this answer:
Installing languages enables Firefox to perform translations locally within your browser, prioritizing your privacy and security. As you translate, Firefox downloads partial language files as you need them. To pre-install complete languages yourself, access the language settings in Firefox Settings, General panel, in the Language and Appearance section under Translations.
I wonder what the difference between the “partial” language files and the full download is, and if that is really not leaking any information about the text being translated. In doing a few experiments just now, I certainly can’t translate to new languages while offline, but after I’ve translated one paragraph in a language I do seem to be able to translate subsequent paragraphs while offline. 🤔
Anyway, it probably is a good idea to click “Download” on all the languages you want to be able to translate.
the bald guy in the middle of the photo owns the servers that Signal outsources the keeping of their privacy promises to 🤔
What you want is not an “uncensored” server, but rather a server that is moderated in a way that you find acceptable.
There is no such thing as an “uncensored/open” server. Or, when there is, it can’t last long. Every open server needs to delete some things, because if they don’t, their disk will soon be full of spam and CSAM and then the server will go away. Some servers claiming to be “uncensored” might allow nearly everything besides those two categories, but they tend to quickly become nazi bars.
Sorry i don’t have any specific suggestion, but of the 61 servers listed here hopefully there is one with a moderation policy that is to your liking.