Sok Puppette’s Techdirt Profile


About Sok Puppette

Sok Puppette’s Comments

  • Jul 13th, 2021 @ 7:31am

    You know what would make things safer?

    Not raiding people for possessing or dealing in random substances, that's what. No raid, nobody gets shot. Just repeal the fucking drug laws already.

  • Apr 29th, 2021 @ 2:37pm

    (untitled comment)

    Who (the fuck) are these people and why is everybody talking about them all of a sudden?

    From all the stuff that's been plastered all over everything I read, I have gleaned the information that they're about a 60-person company in Chicago, and that they had something to do with inflicting Ruby on Rails on the world.

    Somehow I'm having trouble caring about them or anything they do...

  • Apr 27th, 2021 @ 9:46am

    (untitled comment)

    If the UK government wants support for its anti-encryption efforts, it needs to do better than basically lying to people.

    Why? Lying works in politics.

    First you lie to yourself, and convince yourself that some single thing is The Most Important Thing. Then you come up with a bunch of Things to Do, and obviously they Must Be Done if they even might have any effect at all on The Most Important Thing. Even if none of them might have any effect, you still have to do them because Something Must Be Done.

    And it doesn't matter how much damage you do elsewhere, because no other issue is The Most Important Thing.

    Then you lie to everybody else. You exaggerate, you make wild accusations, whatever. If you want to ban mayonnaise, you say that mayonnaise is radioactive. Which you can justify because after all you're dealing with The Most Important Thing here.

    And, by the way, anybody who says anything that contradicts your lies, or even doesn't promote your view, is scum. It is Not OK to say that mayonnaise is not in fact radioactive. After all, true or not, the idea that mayonnaise is radioactive might actually convince somebody to ban it, and that's The Most Important Thing.

    For these people, protecting children from any exposure to sexuality, especially in relation to adults, is The Most Important Thing. If those same children end up impoverished, oppressed, or dead, well, sorry, that's just not as Important.

  • Aug 26th, 2020 @ 5:35pm

    (untitled comment)

    While this QI bullshit in the US is clearly based on egregious judicial activism by the Supremes (and after that a lot of apparently intentional inactivism), let's not forget that Congress could eliminate it at any moment, has had over 50 years to do it, and hasn't done so.

    And I'm not a lawyer, but I suspect that individual states could do at least something about it with respect to those officers who operate under their own authority. They haven't done it either.

    It seems like there's plenty of blame to go around for this.

    Basically everybody in any authority in government is terrified that the world will burn down if cops have to follow rules. Or they think their constituents are. So the dereliction of duty is pretty universal.

  • Jun 22nd, 2020 @ 6:17pm

    Sorry, no.

    There are two issues here: integrity and confidentiality (aka privacy). These systems are not the answer for either one.

    Integrity is best solved end-to-end using DNSSEC. It's absolutely stupid to try to do it using hop-by-hop cryptography; you're trusting every hop not to tamper with the data.

    ... and just encrypting DNS traffic doesn't solve confidentiality either. It doesn't even improve confidentiality in the large.

    1. The adversary model is incoherent. If your ISP is spying on your DNS traffic, and you cut off its view of that traffic, the ISP can just switch to watching where your actual data go. Yes, that may be slightly more costly for them, since otherwise they probably would have done it in the first place. It doesn't follow that the costs imposed on them are enough to deter the switch. In fact, they probably are not.
    2. All the proposals encourage centralization, which means that when (not if) some resolver that a lot of people are trusting goes bad, the impact is huge. Instead of a relatively large number of relatively survivable events, you create a few massive catastrophes.
    3. What this is fundamentally trying to be is an anonymity system (I guess a PIR system). Anonymity systems are HARD. Much, much harder than point-to-point cryptography. There are a million correlation and fault induction attacks, and in the case of DNS there are a million players in the protocol as well. There's been absolutely zero analysis of how easy or hard these methods may be to de-anonymize using readily observable data. They seem to be designed by people who don't even understand the basics, and think they're helping when they charge ahead blindly.

    ... not to mention that it's just psychotic to tunnel a nice simple cacheable protocol like DNS over a horrific tower of hacks like HTTP.
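    The end-to-end point above can be made concrete. A minimal sketch, assuming nothing beyond the Python standard library: a DNSSEC-aware client signals that it wants signature records by setting the EDNS0 "DNSSEC OK" (DO) bit in its query, so it can validate answers itself no matter how many hops they crossed. The transaction ID and UDP payload size below are arbitrary illustration values.

```python
import struct

def make_dnssec_query(name: str, qtype: int = 1) -> bytes:
    """Build a raw DNS query packet with the EDNS0 "DNSSEC OK" (DO)
    bit set, asking the server to include RRSIG records that the
    client can validate itself, end to end."""
    header = struct.pack(">HHHHHH",
                         0x1234,   # transaction ID (arbitrary)
                         0x0100,   # flags: standard query, recursion desired
                         1, 0, 0,  # QDCOUNT, ANCOUNT, NSCOUNT
                         1)        # ARCOUNT: one OPT pseudo-record
    qname = b"".join(bytes([len(p)]) + p.encode("ascii")
                     for p in name.rstrip(".").split(".")) + b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN
    # OPT pseudo-record (RFC 6891): root name, TYPE=41, CLASS carries the
    # UDP payload size, and the TTL field holds ext-RCODE/version/flags,
    # where the DO bit is 0x8000 in the flags half.
    opt = b"\x00" + struct.pack(">HHIH", 41, 4096, 0x00008000, 0)
    return header + question + opt
```

    Transport encryption to a resolver protects none of this: the client still has to trust the resolver's answer unless it checks the signatures itself.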

  • Apr 9th, 2020 @ 3:29pm

    (untitled comment)

    ... oh, and even if you weren't a company with any significant infrastructure, they could also come after you for providing software for P2P or other decentralized solutions. "Protocols, not platforms" only works if somebody's allowed to provide the software to speak the protocol...

  • Apr 9th, 2020 @ 3:26pm

    (untitled comment)

    Hmm. It actually may be worse than that, because it appears to apply beyond what you'd think of as "platforms".

    The recklessness and "best practices" requirements are applied to all providers of "interactive computer services". The definition of "interactive computer service" is imported by reference from 230. That definition is:

    The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

    The part about "system... that enables computer access" sweeps in all ISPs and telecommunication carriers, as well as operators of things like Tor nodes. And "access software provider" brings in all software tools and many non-software tools, including open source projects.

    Under 230, those broad definitions are innocuous, because they're only used to provide a safe harbor. An ISP or software provider is immunized if it doesn't actually know about or facilitate specific content. No ISP and almost no software provider has any actual knowledge of what passes through or uses its service, let alone edits the content or facilitates its creation, so they get the full safe harbor, with minimal or no actual cost to them. And anyway nobody has been after them on 230-ish issues, so including them doesn't hurt.

    Under EARN-IT, those same definitions would be used to impose liability, so now those parties actually get burdens from being inside the definition. That's worse than a repeal of 230. It doesn't just remove a safe harbor; it opens an avenue for positive attack.

    This commission could decide that it's a "best practice" for ISPs to block all traffic they can't decrypt. Or it could decide that it's a "best practice" not to provide any non-back-doored encryption software to the public, period.

    Or, since those might generate too much political backlash at the start, it could start boiling the frog on the slippery slope by, say, deciding that it's a "best practice" not to facilitate meaningfully anonymous communication, effectively outlawing Tor, I2P, and many standard VPN practices.

    Then it could start slowly expanding the scope of that, possibly even managing to creep into banning all non-back-doored encryption, without ever making any sudden jump that might cause a sharp public reaction.

    Back on the platform side, over time the rules could easily slide from the expected (and unacceptable) "best practice" of not building any strong encryption into your own product, to the even worse "best practice" of trying to identify and refuse to carry anything that might be encrypted. Start by applying it to messaging, then audio/video conferencing, then file storage... and then you have precedents giving you another avenue to push it all the way to ISPs.
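    As an illustration of why "refuse to carry anything that might be encrypted" is a classifier problem rather than a rule: the standard heuristic is byte entropy, sketched below in plain Python (the function name is mine, not from any proposal). It cannot distinguish encryption from ordinary compression, so any such "best practice" would sweep up ZIP files and JPEGs along with ciphertext.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte. Encrypted -- or merely
    compressed -- payloads sit near the 8.0 maximum; plain text
    and most structured formats sit well below it."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())
```

    A carrier applying a threshold to this number blocks backups, video, and software downloads just as readily as encrypted chat.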

  • Feb 1st, 2020 @ 1:20pm

    "Ban", eh?

    That's a pretty lame excuse for a ban.

    There is no reason that private surveillance camera users should be allowed to have the kind of automated, mass face recognition they're talking about "banning", any more than government users. They're at least as likely to abuse it and even less accountable.

    Nobody should be trying to connect names or any other information to any person who just enters a place where a camera happens to be pointed. Nor should anybody be using the video/images from surveillance to build any kind of face database or any other kind of database.

    Only in the US would people miss the obvious fact that the impact is the same no matter who runs the system.

  • Jan 31st, 2020 @ 12:57pm

    (untitled comment)

    I'm having trouble buying the idea that anybody at all thinks the phrase "child porn" carries any implication, or even suggestion, of legality. It's the most famously illegal thing that exists on the Internet.

    As for moderation, I will bet that almost all references to "child porn" on the Internet are in text that condemns it and/or discusses what to do to stop it. And if the pedos are in fact openly using the phrase "child porn" all over the place, what happens when they start calling it "CSAM"?

  • Jan 31st, 2020 @ 12:28pm

    (untitled comment)

    What's the actual difference between "CSAM" and child porn, and why is it important to make the distinction? Seems like another random pointless acronym being thrown around and another random pointless terminology change.

  • Jan 31st, 2020 @ 11:42am

    Re: Re: creative makeup

    I happened to be playing with the AWS Rekognition demo the other day, and I fed it a bunch of makeup jobs from the CV dazzle site, as well as various other images with "countermeasures" from around the Web.

    Given a nice clear picture, it found every single face and every single feature on every face. It also did a good job of identifying age, sex and mood, right through some pretty extreme makeup. Try it out. It's available to the public.

    The problem with the countermeasures is that you never know whether the other guy has out-evolved you.

    By the way, the good thing about Rekognition is that it seems to be crap at actually identifying faces from large groups.

    They have a celebrity recognition demo, and it did very poorly on pictures of lots of people who are in the headlines... including people who ARE in the database. It spotted Marilyn Monroe in one of her really iconic shots, but not in another perfectly clear shot that it presumably hadn't been trained on. Same thing for Einstein. Turning to the headlines, it misidentified Alexandria Ocasio-Cortez and Greta Thunberg as random minor celebrities I'd never heard of. In turn it identified random minor celebrities, like members of current boy bands, as different random minor celebrities. It does well on heads of state. And both new and very old pictures of Elizabeth II worked. It may also be OK on Really Big Stars of Today (TM). But that's about it.

    So I assume it won't really identify a random picture as belonging to somebody in a collection unless said collection has a lot of good, similar pictures of that same person.
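    For anyone who wants to reproduce the experiment, here's a hedged sketch using the real boto3 DetectFaces call. The helper names are mine; the API call itself needs AWS credentials and the boto3 package (imported lazily so the pure parsing helper works without either).

```python
def detect_face_attributes(image_bytes: bytes) -> list:
    """Call Rekognition's DetectFaces with all attributes enabled.
    Requires AWS credentials and boto3."""
    import boto3
    client = boto3.client("rekognition")
    resp = client.detect_faces(Image={"Bytes": image_bytes},
                               Attributes=["ALL"])
    return resp["FaceDetails"]

def summarize_face(detail: dict) -> dict:
    """Reduce one FaceDetails entry to the age/sex/mood attributes
    discussed above (field names per the DetectFaces response shape)."""
    age = detail["AgeRange"]
    mood = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return {"age": f"{age['Low']}-{age['High']}",
            "sex": detail["Gender"]["Value"],
            "mood": mood["Type"]}
```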

  • Jan 31st, 2020 @ 5:35am

    Re: Re: Re: Re: I have great hopes for the repeal of 230...

    In a peer to peer system, you bring your own, and you pay for it because you want to participate. Yeah, somebody has to sell it to you, but the equipment and software are general purpose, you can't tell what any individual is using them for, and anybody can make them.

    If necessary, that can be extended to the entire communication infrastructure, but in fact we're not talking about the IP layer of fiber and routers here. We're talking about application layer overlays that can clearly be done peer to peer. Facebook and Google are not infrastructure.

  • Jan 31st, 2020 @ 5:32am

    Re: Re: Re: Re: I have great hopes for the repeal of 230...

    What I'm saying is that trying to make a profit will prevent them from properly providing the service. It has nothing to do with what they "should" or "should not" do. It's simply not possible to make a buck providing an unattackable service.

  • Dec 23rd, 2019 @ 8:21am

    Re: Re: others

    ... ever bought an expensive cell phone that was locked in to a single carrier?

    No. That is, not unless I was absolutely sure I could unlock it without the carrier's help or permission. I've never been wrong about that.

    Neither should anybody else.

    or an expensive android phone where software updates ceased after 1-2 years?

    No, because I've never bought one I couldn't load a custom ROM on.

    I have been fucked in 3 to 5 years because of proprietary binary blobs, though. That shit should be illegal.

    In fact, it should be illegal to distribute any software without source code. That includes firmware and other software bundled with hardware. It should also be illegal to distribute hardware without full register descriptions, and all other information necessary to write a driver supporting all of its features. And if you have any other "internal" documentation, go ahead and throw that in too.

    No exceptions, and fuck your "trade secrets".

    And if locking something down so that it will only load signed software is legal at all, there need to be some extremely heavy, legally binding regulations on the conditions under which it is allowed. That definitely has to include the ability to update software that's gone out of support. In most cases, it should probably also include the ability for the owner of any hardware to take total control of all the software that runs on it.

    People shouldn't be tolerating this kind of abuse any longer. Not only are we suffering from wasteful obsolescence, and not only are enormous resources constantly wasted by intentionally crippled functionality and intentionally hindered interoperability, but there are massive unfixable security problems in all the shit software and abandonware that's being shoveled out.

    Meanwhile, we should be poisoning the market for this crap by mocking anybody who opts in without being absolutely forced. In the specific case of home control, there were perfectly good open alternatives that these idiots could have used instead.

  • Oct 18th, 2019 @ 4:49pm

    (untitled comment)

    Good first step. Now ban all use of it by everybody. There's nothing magically different about state surveillance.

  • Oct 11th, 2019 @ 11:35am

    Re: Re: Re: Re: Re: Re: Re: Re:

    I need to correct that slightly. That news site just turned off the name and I got a message saying "your screen name has been rejected; choose a new one" or something nonspecific like that. I only inferred that they wanted something that looked like a "real name".

  • Oct 11th, 2019 @ 11:31am

    Re: Re: Re: Re: Re: Re: Re:

    I use the name only for commenting on places like this. I have a couple of aliases, although I don't use more than one on the same site. You won't find any of them on my birth certificate. Isn't that technically what a sock puppet is?

    I use the name to make it clear to the reader that I'm not associating the comments with my "real world" identity.

    Amusingly enough, one news site decided it didn't like the name because it looked obviously fake, and made me choose one that looked like a "real name". The one I chose wasn't, of course, my actual "real name". I can't imagine what they think they're accomplishing with that nonsense.

    By the way, although I take strong stances and try to shake up assumptions, I do not write comments that I don't believe, nor do I write comments just to upset people.

    I really don't understand what pissed people off about that one, since I would think pretty much everybody would agree with it if they thought for 15 seconds. But maybe it touched some taboo or another. My first guess would be the part about the US Constitution being poorly written.

  • Oct 7th, 2019 @ 7:07am

    Let's not forget...

    It's true that these back doors are very dangerous from a technical security point of view. But they are also bad ideas when they're working as intended.

    If these companies give that kind of access to "the authorities" in the US, they have no leverage to not give it to "the authorities" in $insert_hellhole_dictatorship_here. And even in places like the US, "the authorities" routinely break the rules, overstep their bounds, and create giant unjustifiable oppressive programs. It's stupidly dangerous, to children and everybody else, to concentrate that kind of power.

    They shouldn't have that power, period, even if it could be secured, which of course it can't.

  • Oct 3rd, 2019 @ 1:37pm

    (untitled comment)

    Wow, I wish Barr would lend his enormous prestige and sterling personal reputation to MY issues...

  • Sep 24th, 2019 @ 1:41pm

    (untitled comment)

    It's the court saying that Google is required to make sure that top search results "reflects the current legal position." In other words, if someone was exonerated after being accused of a crime, that must now be the top link.

    Or, y'know, it could be the court not being absolutely perfect in drafting its decision.

    What you quoted could easily be interpreted to mean that if, say, somebody had been exonerated in a criminal case, that link would have to appear first among links mentioning the case. If the criminal case were minor 20-year-old bullshit, all of the links about it might still be on the tenth page, below more relevant material. Or nonexistent, for that matter.
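    That narrow reading is easy to state as code. A hypothetical sketch (none of these names come from the decision or from Google): reorder only the links that mention the case, floating exoneration coverage ahead of accusation coverage, while every other result keeps its relevance-based position.

```python
def rerank(results, mentions_case, is_exoneration):
    """Among the results that mention the case, move exoneration
    coverage ahead of accusation coverage; all other results stay
    exactly where relevance ranking put them."""
    idxs = [i for i, r in enumerate(results) if mentions_case(r)]
    # Stable sort: within case-related links, exonerations come first,
    # otherwise original order is preserved.
    reordered = sorted((results[i] for i in idxs),
                       key=lambda r: not is_exoneration(r))
    out = list(results)
    for i, r in zip(idxs, reordered):
        out[i] = r
    return out
```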

    And even if you don't accept that interpretation, I bet they will correct it quickly if it ever becomes a problem.

    Now, whether it's possible for a computer to meet even that standard is another question. I sure don't know how I'd write code to figure out the current legal interpretation of anything. Maybe they'll accept "best efforts and quick correction". Which is still a can of worms. But why not stick to that real can of worms rather than jumping on things that are almost certainly unintentional and won't stick anyway?


