
October 12 2016


I just went through my FOF stream a bit and… words fail me.

The things I've seen:

  • Hardcore porn. And not just penetration but very explicit, very off-mainstream stuff (scat was about the first thing that popped up, but guro and the like made up about half of it)
  • People insisting that the earth is flat, that vaccinations are EVIL, chemtrails, homeopathic remedies, spirit healing, the works
  • Nazi bullshit, in many different forms
  • More spam
  • Someone having an educated and useful discourse with various other people that I enjoyed reading
  • Mainstream porn
  • Spam

I could go on, but I do wonder when Soup became this 90+% cesspool of stuff that serves one of three purposes: making money, shocking, or swamping the place to make it useless.

Or maybe it's just that my Friends list (which I haven't touched in years) needs a culling?


October 10 2016

I have this everything in my head and I struggle to get it out.

October 05 2016

While it doesn't mention Fuller's map, this sequence from The West Wing is a nice and short comment on maps.

August 27 2016

Correct, it's an aileron roll.

July 27 2016

On the other hand, I've never heard of Polarkreis 18 %-)

July 23 2016


July 21 2016

I miss Moebius :'(

July 16 2016

Well, with a dynamically linked app, I can trivially ask what it is linked against. A static binary is opaque. And "test for the behavior" is not always sanely possible, especially with security bugs. Just replacing the shared lib (and rebuilding where necessary, which is trivial to find out) is much safer.
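The "ask what it is linked against" point can be sketched from Python. A minimal, Linux-specific illustration (it assumes `/proc` is mounted, as on any normal Linux system): reading `/proc/self/maps` lists the shared objects mapped into the running process, which is roughly the information `ldd` reports for a binary on disk.

```python
# Linux-specific sketch: list the shared objects mapped into this
# process by reading /proc/self/maps.
def loaded_libraries():
    libs = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            # The last whitespace-separated field is the mapped path
            # (or an inode number / pseudo-name like [heap]).
            path = line.split()[-1]
            if path.endswith(".so") or ".so." in path:
                libs.add(path)
    return sorted(libs)

print(loaded_libraries())
```

For a static binary, this list would be empty (or nearly so): the library code is baked in, and there is nothing left to enumerate.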

Replacing a library with an incompatible one can be handled by the package manager: until the packages that depend on the old version have been updated, you keep both versions of the library installed (in Gentoo, this is done via @preserved-rebuild).

Making every application hermetic doesn't work: things tend to talk to each other, sometimes in very subtle ways, such as via dlopen() (which is like linking, but can happen at any time during a program's run; media plugins are one example).
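The dlopen() mechanism can be illustrated without writing C: Python's ctypes loads a shared library at runtime via dlopen() under the hood. A minimal sketch, assuming a Linux system with a glibc-style math library (the `libm.so.6` fallback name is that assumption):

```python
import ctypes
import ctypes.util

# Locate the math library by its conventional short name "m" and load
# it into this process at runtime -- this is dlopen() in disguise.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature of cos() and call it through the handle.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # → 1.0
```

Nothing in the program's link-time metadata announces this dependency; it only exists while the process runs, which is exactly why "just inspect the binary" is not the whole story either.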

This hermetic approach can be done for very tightly controlled setups, like @schlingel mentioned. But I don't think it will ever work sanely for a desktop computer/workstation.

July 13 2016

I think the idea of "apps+libs will update faster" is overly optimistic. Just as bundling does not lead to patches being contributed upstream, but rather to the bundled library drifting away from lib-upstream.

Also, having multiple versions of a library installed is very much possible, if the library maintainer puts in a little effort (and even if they don't, some distros have shown that parallel installation of different versions is quite possible). Slow migration from libfoo-1.2.3 to -1.2.4 is done today already.

And developers can't always depend on the latest library: APIs change, what they considered features, lib-upstream considers a bug and has removed it. Or the library breaks with a certain use-case lib-upstream doesn't care about.

The problem with as-many-as-you-want: who does the work and the testing? How does a user decide what is the right package for them? When things break, you have a much larger cognitive load for tracking down issues, since it's never quite clear what packages are affected (users tend to under-report pertinent details).

"Just use the latest" doesn't work due to the problems I outlined above. And patching it will lead to vulnerabilities nobody is aware of, to incompatible drift of patchsets, and to quite a bit of extra work for everyone downstream. A fine example of the mess this can become is MPlayer, but there are many more. One problem is: where do you draw the line between bundled and non-bundled libraries? The libs have dependencies themselves, and with patched and bundled libs that drift, you'll soon find that more and more libraries have to be bundled, until you're basically shipping everything but glibc.

As for the pinning of overly-specific versions: in my experience, app maintainers will tend to require an exact libfoo version, even when any libfoo-1.2.*.* should work. Thus, if you have three apps of that sort, deduplication tends not to work, dragging you back to the era of static linking despite there being no necessity for it.
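The deduplication point can be made concrete with a toy resolver (all names and version numbers here are hypothetical): three apps with exact pins force three on-disk copies of libfoo, while compatible-range pins let a single copy serve all of them.

```python
def satisfies(version, spec):
    """Check a dotted version against a spec whose parts may be '*'."""
    return all(s in ("*", v)
               for v, s in zip(version.split("."), spec.split(".")))

def installed_copies(available, requirements):
    """Greedily pick library versions so every app's spec is covered."""
    copies = set()
    for spec in requirements:
        # Reuse an already-installed copy whenever the spec allows it.
        if not any(satisfies(v, spec) for v in copies):
            copies.add(next(v for v in available if satisfies(v, spec)))
    return copies

available = ["1.2.3.4", "1.2.3.5", "1.2.4.0"]

# Exact pins: three apps, three copies of libfoo on disk.
print(len(installed_copies(available, ["1.2.3.4", "1.2.3.5", "1.2.4.0"])))  # → 3
# Range pins: one copy serves all three apps.
print(len(installed_copies(available, ["1.2.*.*", "1.2.*.*", "1.2.*.*"])))  # → 1
```

With exact pins, the shared-library machinery is still there, but the sharing is gone, which is the "static linking by another name" outcome described above.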

While this can already be a problem with today's setup, the distro maintainers between the end user and upstream will have none of that and will do a certain amount of QA (there was a recent discussion about the value of having distro maintainers, I can dig up a link if you're interested.)

As for the package features: we're already at the point where binary distributions have to make a choice of enabled features for packages. For example, Debian has exim-light, exim and exim-heavy, with the corresponding feature sets and dependencies. My prediction is that if upstream does the packaging (and you seem to imply that, or at least a vastly diminished integration role for the distro maintainers), there will be only one exim package, the equivalent of exim-heavy, with all of its dependencies always enabled.

(Note: I am using exim as an example package, but you could also use something like Apache or PHP; moreover, I do not mean to imply ignorance or some such on their end. Every time I had to work with those upstreams, they have been very reasonable and nice to work with.)

July 12 2016


- Apps will pin themselves to specific library versions, whether that's necessary or not. Thus the main advantage of shared libraries (updating LibFoo once) is gone, making security a nightmare.

- Because of the same over-specification, you'll have sixteen different versions of LibFoo on your system.

- Things become entirely non-optional. A vim package without X11 support? Sorry, we don't provide that. Or you get a small subset of the combinatorial explosion that a package feature set like Apache's implies (cf. exim-light, exim-heavy).

- To avoid these shenanigans, app maintainers will do even more bundling of heavily patched versions of LibFoo, making it nigh impossible to know just how vulnerable your system is.

No, thanks.

July 07 2016


Browsing in public.

June 14 2016

The formatting of this slide annoys me more than it should.

June 06 2016

John William Waterhouse: Hylas and the nymphs


April 09 2016

One more reason not to eat oysters ;)

March 27 2016

No, not in the moral sense at all.

It is wrong because it leads to a very flawed mental model of what is going on. There are much better ways to think about AI, even if we don't know what it is made of, whether it has consciousness or not and so on.

Don't get me wrong, I am not saying that empathy or sympathy for anything (including dead things, for example because they represent a cherished memory) is wrong.

Problems arise when you ascribe human-like emotions or mental states, human motivations or outlooks, to the thing. If you do that, you are not only fooling yourself, you are also not doing right by the AI.

Consider this: a dog is a lot closer in its functioning to you than an AI ever will be. And yet, a responsible dog owner will not treat a dog exactly like a human, for it is bad for the dog and bad for the relationship between it and its owner.

Everybody realizes that you couldn't treat an alien like a human and that all your learned socializing is probably counterproductive when trying to communicate with an alien, or even remotely understand it.

An AI very likely will function so fundamentally differently to us that it will take a very long time just to be able to communicate properly.

One argument against this is that we as humans can not help but create AI in our own image, that any AI we make will be human just by virtue of the process it was created in. I think that is short-sighted and even a bit arrogant. My prediction is that the first truly sentient and conscious AI will mostly be an accident of sorts, a random confluence of events, ingredients and timing. If anything, history teaches us that at least half of all great and sudden discoveries were made to a large part by accident.

(n.b. the moral side of how to treat AIs once we are able to create them is a very deep subject that I have thought a lot about (and written two and a half novels on); I think it is a deep enough subject that I doubt this is the place to discuss it, if we want to do the topic justice.)

March 26 2016

I wrote about it here:

Bottom line: anthropomorphizing things has its pitfalls. And thinking of a true AI as something with human motivations or even emotions is foolish at best.
I thought writing novels about AIs would bring me more insight into the reactions of people to them (and their prospect).

Turns out, it just makes it harder to talk with people about AIs.

Oh well.

March 25 2016

As an addendum: even if the software is entirely beneficial and so on, treating it like a human is condescending. It's like treating everyone you meet as a member of your own culture, with perfect knowledge of in-jokes and all that.