CopyFail Was Not Disclosed to Distros

(openwall.com)

180 points | by ori_b 2 hours ago

7 comments

  • xeeeeeeeeeeenu 1 hour ago
    For context, the author of the linked post, Sam James, is a Gentoo developer.

    Anyway, this is a disaster. It was extremely irresponsible to share the exploit with the world before the distributions shipped the fix. Who knows how many shared hosting providers were hacked with this.

    It's also worrying that it seems there's no communication between the kernel security team and distribution maintainers. One would hope that the former would notify the latter, but apparently it's the responsibility of whoever finds the vulnerability.

    • Lammy 29 minutes ago
      > It was extremely irresponsible

      As a user and admin I disagree. Makes one appreciate what a masterful bit of lexical engineering “Responsible” Disclosure is, kinda like “Secure” (from me, not for me) Boot — “Responsible” Disclosure is 100% about reputation management for the various corporation/foundation middleman entities sitting between me and my computer.

      Those groups don't care that my individual computer is vulnerable but about nobody being able to say “RHEL is vulnerable” or “Ubuntu is vulnerable”. The vulnerability exists for me either way, and I'd rather have the chance to know about it and minimize risk than to be surprised by the fix and hope nothing bad happened in that meantime.

      Immediate public disclosure is the only choice that isn't irresponsible as far as I'm concerned.

      • BeetleB 1 minute ago
        So if I found a vulnerability that lets hackers withdraw all the money in your account without a trail of where the money went, you'd be fine with them disclosing it to the public at the same time as the bank learns about it?

        Even when there is no known use of the attack in the wild (other than the security researcher's)?

        > The vulnerability exists for me either way, and I'd rather have the chance to know about it and minimize risk

        By the time you hear about it, the money could be gone because 1000 hackers heard about it from the researcher before you did.

        > than to be surprised by the fix and hope nothing bad happened in that meantime.

        Hope is not a good strategy here.

      • eschaton 13 minutes ago
        “The choice that maximizes potential damage isn’t irresponsible, because it means I can mitigate my own systems immediately.”

        That’s what you’re saying here.

        • tptacek 8 minutes ago
          They're literally just restating the argument for full disclosure security. This is one of the oldest debates in information security.
      • notsound 5 minutes ago
        Those groups care about whether millions of computers are vulnerable, likely including your computer. If "immediate public disclosure" were done in all cases, every vuln would be exploited and patches would be much lower quality. Shortening the disclosure timeline might be a good idea; 90 days is starting to feel long.
    • zamalek 1 hour ago
      The disclosure was more about marketing than security. From the disclosure page:

      > Is your software AI-era safe?

      > Copy Fail was surfaced by Xint Code after about an hour of scan time against the Linux crypto/ subsystem. [...]

      > [Try Xint Code]

      More chaos makes their product seem even more attractive.

      • esseph 1 hour ago
        Your advertising for them on HN would help them too, I bet.
        • jasonmp85 50 minutes ago
          Does it? Now that I see their name again in this context they're blacklisted for life.
          • eaf7e281 7 minutes ago
            Same. They do become famous, but not in a wholly positive way.
          • selectively 40 minutes ago
            Researchers are under no obligation to engage in coordinated disclosure and are free to sell 0day for profit. Just fyi. Be glad it was disclosed at all. Be glad a patch was available prior to release.
            • lambda 33 minutes ago
              If they want to be seen as responsible rather than opportunistic, then yeah, they should do a proper coordinated disclosure.

              Sure, they have no legal obligation to disclose, but we all also have no legal obligation to buy their services. Blacklisting bad actors like this is the right move to discourage this kind of behavior.

              • selectively 31 minutes ago
                Who cares about how you are seen when you are selling 0day for big bucks? The bad actor makes more money than the 'legitimate' one without breaking any law. Punishing someone who didn't alert distros despite a patch being available encourages the company to simply find flaws and sell them for profit - it pays more to begin with.
                • maxbond 22 minutes ago
                  If they want to take advantage of disclosure for marketing, they're either going to need to accept the norms around responsible disclosure, or they're going to need to accept how shirking those norms will come off. That's life in society. Sometimes it's annoying and sometimes it doesn't feel rational, but these norms have been negotiated throughout the history of our industry and are the way they are for reasons good and bad.

                  I just don't see the point in complaining about how shirking the norms of your industry will make you look irresponsible. I don't really care that they could have decided to sell the vulnerability instead. It isn't material.

                  • tptacek 6 minutes ago
                    It is absolutely not true that viable commercial vulnerability labs need to "accept the norms around responsible disclosure". There are no such norms. "Responsible disclosure" is an Orwellian term cooked up between @Stake and Microsoft and other large vendors to coerce researchers into synchronizing with vendor release schedules. It was fantastically successful at that, and it's worth pushing back on at every opportunity.

                    Tavis Ormandy dropped Zenbleed right onto Twitter. He's doing fine. You can blacklist him if you want; I imagine he's not going to notice.

                    • SCHiM 0 minutes ago
                      Microsoft's policy is: "if you contact us with a vulnerability, you automatically agree to the terms of our responsible disclosure policy", which includes waiting 30 days after a patch was created, and says nothing about how long that process takes.

                      There is actually no way to give them a friendly heads up, and then do your own thing. The only way not to be bound is by not sending them any notification at all...

                    • maxbond 4 minutes ago
                      You're right, they don't need to. They have an alternative, to accept what people say or think about them in response. That's what I said.
                  • selectively 22 minutes ago
                    Those norms do not exist. Those are people asking companies to do stuff to benefit the person complaining for free, and many companies will not do that.
                    • maxbond 20 minutes ago
                      It seems to me you're unaware of them, but there are strong norms around disclosure. They've been discussed for decades. It is the expectation that vendors would be notified in a scenario like this.
                      • selectively 14 minutes ago
                        No, there are users who want those to be norms. Qualified researchers happily sell substantive vulns to people who pay (Governments/Cellebrite and companies like that) enough to quell any complaint.
                        • maxbond 7 minutes ago
                          Which is, again, irrelevant to the question of how disclosure works and what expectations there are around it, because that is not disclosure and is not what was being discussed.
                • dirasieb 15 minutes ago
                  it’s called building and preserving a high trust society, you wouldn’t understand
            • kelnos 4 minutes ago
              I'm pretty sure they have a legal obligation in most jurisdictions not to sell 0days for profit.

              And they absolutely have a moral obligation to do things in a way to minimize damage and impact to other people's systems. (I'm not saying "responsible disclosure" is the correct way to do that, but hoarding vulnerabilities and exploits and selling them to the highest bidder certainly isn't.)

              This is how society needs to work.

            • eschaton 8 minutes ago
              They should have a legal obligation to engage in coordinated/responsible disclosure, and it should be a crime to sell or disclose a 0day to anyone other than a state-designated security organization or the vendor/provider.

              If it won’t be handled through criminal law then it’ll be handled through civil litigation: Anyone who was exploited as a result of this disclosure should sue the discloser for contributing to the damage they’ve suffered.

          • true_religion 40 minutes ago
            Same. I did not know who they were, but now they have been named and shamed. Not all publicity is good.
          • CSSer 42 minutes ago
            Yes, exactly. Name and shame.
    • shimman 1 hour ago
      Expecting people to do the right thing is a fundamental issue here. Why would you ever expect all vulnerabilities to be disclosed privately? There's very little actual incentive to do this.

      I'm honestly unaware of what systems could be put in place to prevent this, but expecting people to always do the right thing is fantasy-level thinking. I mean, I bet the disclosers thought they were doing the right thing, which is exactly why it's a bad thing to rely on.

      edit: spelling/grammar.

      • dwedge 1 hour ago
        When the exploit is an advertisement for an exploit detection company, not doing the right thing is a bad look
        • dgellow 1 hour ago
          The worst thing would be to exploit or sell it for profit. Instead of that, publicizing the exploit is closer to neutral–good in my books; it did trigger a really quick reaction from the different actors to patch their kernels and systems.
          • ori_b 1 hour ago
            Imagine how much quicker the distros would have reacted if they were given a heads up a month ago. But, sure, I guess kudos to this company for not being actively criminal, and merely bumblingly incompetent and overly eager to get their marketing pitch out the door.
      • egonschiele 1 hour ago
        Why don't all these distro maintainers add their own back doors, and mine crypto off our machines without our knowledge? Surely, there is some legal fine print they can add that would let them do that. There is very little incentive for them to maintain these systems, given how thankless and underpaid the work is.
      • holowoodman 1 hour ago
        I can accept (and welcome) disclosure before there are patches.

        But publishing a working exploit together with the disclosure before patches are available is really really irresponsible, maybe even criminal.

        And no, the proposed mitigations don't help with half of the distributions out there...

        • akerl_ 57 minutes ago
          > maybe even criminal

          What’s your theory here? What crime?

          • michaelmrose 48 minutes ago
            If it's not a crime I see no reason not to work with partner nations to build responsible disclosure into a legal framework everywhere because it pretty obviously should be.
            • jodrellblank 4 minutes ago
              You know companies are allowed to pay people to find vulns, and pay people bug bounties?

              Instead of that, you’d rather make the law compel free individuals to limit their speech, or to hand over their work to big companies privately, so big companies can save money?

              That doesn’t sound like a nice future, if it’s even enforceable at all.

            • akerl_ 44 minutes ago
              If you wanted to somehow make coordinated disclosure into a legal framework, that would be an interesting and complex project.

              But it’s not the law anywhere I’m aware of today, and I’d not support it becoming a law.

        • SoftTalker 1 hour ago
          AIUI the exploit was fairly low-effort once you knew the vulnerability. So publishing one probably didn't change the landscape much.
        • semiquaver 1 hour ago
          Patches were available for nearly a month.
          • ori_b 1 hour ago
            Basic care would involve making sure the patches had made it into the wild before ending the embargo, and nagging the relevant parties if not.

            Edit: As of this writing, most distros, including Red Hat, Fedora, and Debian Stable, do not have patches available in the package repos, though they're being actively worked on.

            • sgjohnson 1 hour ago
              Not true, if there’s any evidence of the exploit being used in the wild, it’s much more responsible to release immediately.

              Considering that the patches have been available for a while, someone surely reversed what they were for and was actually exploiting this in the wild.

              In the age of AI, I’d argue that “responsible disclosure” is dead. Arguably even in closed source projects. Just ask Claude to do a diff between the previous version and the patched one to see whether anything fixed in there could have had security implications.

              We’re not there yet, but very soon the only way to responsibly disclose a vulnerability will be immediately.

              • ori_b 51 minutes ago
                But they didn't release immediately -- they waited a month, but forgot to tell the distros, and forgot to check whether waiting a month had actually led to distros picking up the patches and shipping them.
            • semiquaver 1 hour ago
              “Made it into the wild?” Patches landed a month ago. Should they also wait until my linksys router from 2018 has a patch ready?
              • ori_b 1 hour ago
                Patches are still in the process of landing in most major distros as of the time of this writing. Most users are not able to get an update through their distro's packaging mechanisms.
              • SoftTalker 59 minutes ago
                It's a local vulnerability at least. How many people do you let log in to your router?

                With the way linux is used these days, I'd guess the number of systems with untrusted local users is pretty limited. Even with shared hosting, you generally have root in your VM or container anyway. Unless this enables an escape from that?

                Still the risk that people who run "curl | bash" without care could get bitten, but usually it's "curl | sudo bash" anyway...

                • sgbeal 47 minutes ago
                  > Even with shared hosting, you generally have root in your VM or container

                  Lots of shared hosters don't use VMs or containers. It's some arbitrary number of people logging in to a shared system, each one with a home directory under /home/THE_USER_NAME. I've had several such hosters over the years (thankfully not right now, though).

                • michaelmrose 42 minutes ago
                  Local root is part of the path to escaping
                • dist-epoch 35 minutes ago
                  With this exploit it's trivial to jump from one container to another neighbor container. I've tried it and succeeded.

                  So containers don't protect you, only a VM.

                  • SoftTalker 26 minutes ago
                    So anyone pulling a malicious dockerfile jeopardizes the host? That would be bad...
            • GrayShade 49 minutes ago
              Fedora is patched.
          • em-bee 51 minutes ago
            only for versions 6.19.12 & 6.18.22. Older versions (which are used in distributions) are not ready yet.
        • wang_li 1 hour ago
          There is an alternative mitigation you can use which blacklists the relevant initcalls when the affected code is built into the kernel rather than as a module.
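
          To make that concrete, here's a sketch (mine, not from the advisory) of which knob applies depending on how the AF_ALG core was built. I'm assuming CONFIG_CRYPTO_USER_API is the symbol that builds the AF_ALG core and that its built-in initcall is named af_alg_init; verify both against your kernel tree before relying on this.

          ```shell
          # Decide which AF_ALG mitigation applies, given a kernel .config.
          # Assumptions (verify against your tree): CONFIG_CRYPTO_USER_API
          # builds the AF_ALG core; its initcall is named af_alg_init.
          af_alg_mitigation() {
            case "$(grep '^CONFIG_CRYPTO_USER_API=' "$1" | cut -d= -f2)" in
              m) echo "module: block it, e.g. 'install af_alg /bin/false' in /etc/modprobe.d" ;;
              y) echo "built-in: boot with initcall_blacklist=af_alg_init" ;;
              *) echo "not enabled" ;;
            esac
          }

          # Example against a sample config fragment:
          printf 'CONFIG_CRYPTO_USER_API=y\n' > /tmp/sample.config
          af_alg_mitigation /tmp/sample.config
          ```

          Neither route is a substitute for the real patch, of course.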
      • baggy_trough 1 hour ago
        Why wouldn't the linux security team notify the main linux distributions?
        • bonzini 59 minutes ago
          Partly they already have enough on their plate. It's up to the reporter to pick how to handle the disclosure, and unless a specific maintainer chooses to handle it, the Linux security team clearly says they won't.

          Partly they have a strong belief that all kernel bugs are vulnerabilities and all vulnerabilities are just bugs; sometimes taken to the extreme in both ways (on one hand this case where the vulnerability is almost ignored; on the other hand, I saw cases where a VM panic that could be triggered only by a misbehaving host—which could just choose to stop executing the VM—was given a CVE).

          • baggy_trough 24 minutes ago
            Seems a little crazy. Somebody should evaluate blast radius and do appropriate distro notifications in a case like this (I presume the impact was part of the disclosure, so not much extra work).
      • skywhopper 1 hour ago
        I think it’s reasonable to expect folks in the security community who go to the trouble of creating a website detailing security vulnerabilities in specific listed software to pre-notify the security teams of that software. The CopyFail website calls out Ubuntu and Red Hat specifically, but apparently the author of the site did not inform them of the issue?

        But even if you think making unethical decisions in personal self interest is something no one should be criticized for, surely the Linux kernel team ought to have some process for notifying the top distributions of an upcoming LPE, just out of practicality.

        • semiquaver 1 hour ago
          In what sense do you believe that the reporter did not notify the security team of the relevant software? The vulnerability is in the kernel. Reporter responsibly disclosed using the kernel’s security report mechanism and waited until a patch was ready.

          Distros are downstream of the kernel; that doesn't entitle them to expect to be contacted directly by every security reporter. That's not on them. Distros that are big enough should be plugged into the Linux security team for notifications.

          Security researchers cannot be held responsible for broken lines of communication within the org charts of projects that they study. They’re providing a valuable public service already, how much more do you want?

          • michaelmrose 21 minutes ago
            It is suggested that they, out of an abundance of caution, send 5 or 6 emails. If that is entirely too much to expect, we can always help them by mandating that they spend 6 figures annually meeting a much more robust set of requirements that will include notifying all possible affected parties, down to Hannah Montana Linux devs if any still exist.

            Any strategy that assumes the rest of the world is functional, or that makes you personally responsible for fixing all of it, is equally broken. But there is a reasonable middle ground, and sending a few more emails lies within it.

            • semiquaver 19 minutes ago

                > we can always help them by mandating that they spend 6 figures
              
              Who’s we? Mandate with what authority?

              AWS and GCP are downstream another level. Should the reporter also have worked with them? And their customers? And the customers of their customers?

              IMO this whole discussion seems like people are annoyed by the security researchers doing god’s work and wish they didn’t exist or think that they should be fully subservient to the projects and companies they are helping for free. The bugs were there before the researchers revealed them!!

          • ragall 1 hour ago
            > that doesn’t entitle them to expect to be contacted directly by the reporter

            Yes it does. That's how it's always been done and distros can ship a fix well before it ends up in a kernel release.

      • bossyTeacher 37 minutes ago
        > expecting people to always do the right thing is fantasy level thinking.

        Most people in tech think like the techie in this comic strip.

        https://xkcd.com/538/

    • lifis 46 minutes ago
      The Linux kernel is not usable as a security boundary, so anyone who wants to do "shared hosting" and not be hacked needs to use something else, like gVisor or Firecracker VMs.

      The only important system that uses it as a security boundary is Android, and there it is mitigated by the fact that APKs need user approval, plus strict SELinux and seccomp policy plus the GrapheneOS hardening, and in this case the mitigations succeeded (https://discuss.grapheneos.org/d/35110-grapheneos-is-protect...)

      • dawnerd 35 minutes ago
        A LOT of websites are tenants on WHM/cPanel hosts. Not to mention how many agencies use it for their clients' WordPress sites.
      • watermelon0 15 minutes ago
        I'm quite sure there are many application hosting providers which rely on container runtime such as runC (default runtime of containerd/Docker), and a shared kernel between users.
    • cmckn 7 minutes ago
      It’s a trade-off. The number of people CC’d on the embargo would be large, leaks become likely. Sometimes it makes sense to disclose and scramble.
    • mschuster91 1 minute ago
      > Anyway, this is a disaster. It was extremely irresponsible to share the exploit with the world before the distributions shipped the fix. Who knows how many shared hosting providers were hacked with this.

      Maybe it is irresponsible how little attention we pay to software security. Maybe software developers of all kinds should spend an entire year not developing any features at all, but fixing all the tech debt of 30 years instead.

      Yes, that sounds revolutionary, but I do not see an alternative in an age where AI agents are all you need to find kernel bugs of this scale.

    • akerl_ 1 hour ago
      Who knows how many attackers had found this vulnerability and had already been using it prior to this research finding it?
      • Quarrelsome 54 minutes ago
        well now everyone does, so the irresponsible disclosure makes it significantly worse.
        • akerl_ 53 minutes ago
          It’s your opinion that it’s irresponsible and that it makes something worse.
          • Quarrelsome 49 minutes ago
            and it's your opinion that it doesn't. Shall we continue stating the obvious? We are communicating using glyphs. This language is English. We are on Hacker News. This branch of the conversation is extremely unproductive.
            • akerl_ 45 minutes ago
              I asked a question and you replied with a statement. Your statement didn’t frame itself as an opinion but as fact.

              The hilarious bit is that the idea that they needed to coordinate is clearly broken even in just this example. They did give prior notice to the Linux developers, who issued a patch. And they’re still getting raked over the coals in this comment page by armchair quarterbacks who have decided they needed to coordinate with specific distros. If they’d coordinated with those distros, somebody would have a pet distro that didn’t make the cut and they’d be pissed about that.

              There are risks no matter how they do it, and there will be people who are pissed no matter how they do it. Security researchers don’t owe anybody a specific methodology.

              • Quarrelsome 40 minutes ago
                you seemed to suggest with your initial statement that any disclosure was acceptable, since people would have been using the exploit prior to the disclosure. I don't think that's a strong argument, given that the people who were using the exploit before disclosure are now joined by people who learned of it as a consequence of the disclosure happening before all the distributions were ready.

                So I feel like the argument reduces into "why is it a problem that now anyone could exploit it, if some people were exploiting it already". Which imho isn't a sensible argument because the issue is clearly the amount of people capable of using the exploit for nefarious purposes, which has increased.

                • akerl_ 36 minutes ago
                  Idk why you felt the need to use quotes to wrap something I didn’t say, and that is a pretty uncharitable attempt at reframing my question. If you wanted a quote, here’s what I’d say:

                  “Because we can’t know if there was exploitation by existing parties who had discovered the vulnerability on their own, there are upsides to disclosing earlier so that affected users can take mitigating steps and review their systems for indicators of compromise. Additionally, the more projects the researchers pull into the loop for coordinated disclosure, the higher the likelihood that they further leak the vulnerability to more attackers.”

                  • Quarrelsome 27 minutes ago
                    Idk why you felt the need to use quotes to wrap something I didn’t say. Despite the fact I didn't say that, its a much more interesting argument than your original statement implies and it is unfortunate we didn't start there.

                    However, the issue is that we cannot know whether the attack space has been broadened or lessened as a consequence of this disclosure, because of how eager it was. If it hadn't been so eager, we could be much more comfortable in suggesting that the attack space has probably been reduced.

                    Given the vulnerability had been living in the Linux code base undetected for so long, and given that the distributions are the principal attack vector for the exploit, I think it's fair to state that disclosing it before the distributions were ready made the situation worse, and that the researcher should reflect on their actions.

                    • akerl_ 24 minutes ago
                      … I used quotes to wrap something that I was saying. I even called out that it was something I was saying, as a more accurate variant of what you’d claimed I meant.
                      • Quarrelsome 8 minutes ago
                        and I prefaced my quotes with the statement "So I feel like the argument reduces into". I mean, idk what punctuation I'm supposed to use there that doesn't offend you, but I just figured we can all read words and it was clear that I wasn't saying you said that, but rather that, as I read it, the argument was reducible to that, and I took issue with that potential reduction.

                        The idea about the available exploit space and how the actors within it might, or might not move is a much more interesting avenue of conversation and I thank you for elaborating on your initial comment. <3

                        I do however feel that it's hard to be confident about whether the attack space has been increased or reduced as a consequence of the eager disclosure. I feel we could make the case either way.

    • 999900000999 52 minutes ago
      Counterpoint. End users have a right to mitigate this issue on their systems.

      It is a really really bad look for Linux, puts a bit of water on all hype around switching from Windows.

      • roxolotl 44 minutes ago
        It does? The disclosure even says the concern for single-user systems is very low. If someone has access to your single-user system, remote or otherwise, you’ve already lost on the sort of device people would be switching from Windows to Linux on.
        • 999900000999 32 minutes ago
          Someone like an AI coding agent, perhaps? This is the type of thing prompt injection was made for.

          No OS is perfect. The awkward rollout for this bug fix is proof of that.

      • vhantz 44 minutes ago
        As opposed to all other operating systems with no CVEs ever?
      • weavejester 44 minutes ago
        Hype around switching from Windows servers?
      • johnbarron 36 minutes ago
        >> puts a bit of water on all hype around switching from Windows.

        Said no one ever...present post excluded :-))

      • cbarnes99 25 minutes ago
        You clearly have no idea how often Windows has unpatched privesc exploits.
      • jasonmp85 50 minutes ago
        [dead]
    • johnbarron 38 minutes ago
      >> Anyway, this is a disaster. It was extremely irresponsible to share the exploit with the world before the distributions shipped the fix.

      Maybe a decade of corporations with revenue in the billions, paying peanuts and coffee money, for critical vulnerability disclosures made it....

    • deng 1 hour ago
      > It was extremely irresponsible to share the exploit with the world before the distributions shipped the fix.

      Yes, this was clearly a marketing stunt to promote Xint code.

      I, for one, will never use Xint code and will advise everyone to never use it. To anyone working there: enjoy your 15 minutes, I hope this backfires right in your face.

    • tptacek 21 minutes ago
      Without taking a position on the disclosure mechanics: any hosting provider hacked with this was already playing to lose. It is not OK to run competing untrusted tenant workloads under a single shared kernel. Kernel LPEs are not rare. This was a particularly simple and portable one, but the underlying raw capability is a CNE commodity.
  • semiquaver 1 hour ago
    > Note that for Linux kernel vulnerabilities, unless the reporter chooses to bring it to the linux-distros ML, there is no heads-up to distributions.

    Why would they imply it is incumbent on the reporter to liaise with distributions? That seems to assume a high level of familiarity with the Linux project. Vulnerability reporters shouldn’t be responsible for directly working with every downstream consumer of the Linux kernel; what’s the limiting principle there? Should the reporter also be directly talking to all device manufacturers that use Linux on their machines?

    IMO reporter did more than enough by responsibly disclosing it to linux and waiting for a patch to land.

    Aren’t there people in the linux project itself with authority over and responsibility for security vulnerabilities? One would think they would be the ones notifying downstream distros…

    • aduwah 37 minutes ago
      Especially since the reporter is explicitly asked not to notify the distro teams first.

      https://docs.kernel.org/process/security-bugs.html

      ```As such, the kernel security team strongly recommends that as a reporter of a potential security issue you DO NOT contact the “linux-distros” mailing list UNTIL a fix is accepted by the affected code’s maintainers and you have read the distros wiki page above and you fully understand the requirements that contacting “linux-distros” will impose on you and the kernel community. ```

    • sega_sai 1 hour ago
      The reporter took time to check and mention on their website specific distributions Ubuntu/RHEL/SUSE. One would have thought reporting to security teams of at least those would be responsible.
      • semiquaver 1 hour ago
        “One” would have thought? Can you point to a written policy that says that’s how it should be?
        • happyopossum 1 hour ago
          No, nor can I point to a written policy that states one should cover one’s mouth when they cough.

          Everyone involved here failed to do the right thing, and hiding behind the lack of written words is weak sauce.

        • anikom15 1 hour ago
          The tenets of decency don’t need to be written down.
          • tob_scott_a 1 hour ago
            If you can't write it down, why would you expect it to be universal and enforceable? Different cultures exist and have different opinions on what "decency" means, after all.

            A security researcher's ethical obligations are to protect users over vendors (barring any contractual agreement in place). From what has been discussed in this thread, they meet that bar.

            Sure, they could have gone the extra mile to ensure the distros were in a good place to patch before they published the exploit. That's a kindness you can wish for, but don't disparage them for not going that extra mile. It's a bonus.

            It's also possible that it simply didn't occur to them to do so this time. There's certainly lessons to be learned either way. I don't know that the right lessons will emerge from hostility.

            • Quarrelsome 52 minutes ago
              > If you can't write it down, why would you expect it to be universal and enforceable?

              and this is the problem. It used to be the case that if you were smart enough to find an exploit you were also smart enough to realise what would happen if you irresponsibly disclosed it. I guess these tools have made that pattern no longer apply.

              • true_religion 30 minutes ago
                From my point of view, they told the kernel security team which is in charge of fixing this. If it’s important for them to tell other people, then it should’ve been written down and further reiterated when they made their report.

                The skills to detect code exploits are not the same as the skills to navigate an informal org chart to the satisfaction of an amorphous audience of end users (i.e. us on HN).

                That said… as they are a company that supposedly specializes in this field, and is trying to sell a product, I do believe they should do better. Right now, I don’t have much confidence in their product.

            • anikom15 4 minutes ago
              There is little difference in culture here. Nearly all open source work is done in English.
            • scragz 50 minutes ago
              different cultures have different views on disclosing vulnerabilities to distros before the public?
              • embedding-shape 35 minutes ago
                Yes :) The blackhatter would obviously sit on it until they can sell it or use it, the whitehatter collaborates with the kernel team and distros to patch, and the greyhatter argues on HN about whether the latest *fail was responsible enough or not.
    • sparker72678 1 hour ago
      Sure, maybe it's not a _requirement_, but now we're all in more pain because the reporters are more interested in Fame than Safe Remediation.
    • skywhopper 1 hour ago
      The reporter made a website explicitly calling out Ubuntu, RedHat, Amazon, and SUSE but didn’t notify them, and you think that’s reasonable? That they might not have known those distributions are downstream from the kernel team?
    • froh 44 minutes ago
      it's trivial to find out how to report a security issue like this to Linux distros.

      Google search: https://share.google/aimode/eihDKXZJy94Z5lC1p

      and it's beyond me how one could not think of doing this, instead exposing everyone and their neighbor to this exploit up front.

      I'm certain this is even a felony in some jurisdictions, rightfully so.

  • GranPC 23 minutes ago
    Just for what it's worth, I just pushed an eBPF-based workaround for people who are running kernels in which AF_ALG is linked directly into the kernel and not as a module: https://github.com/Dabbleam/CVE-2026-31431-mitigation

    I am running this in production right now and it mitigates the attack, with no unexpected side-effects as far as I can see.
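
    One cheap sanity check that an AF_ALG-blocking mitigation (this one, or the modprobe/initcall routes mentioned elsewhere in the thread) actually took effect is to try creating an AF_ALG socket. This probe is my own sketch, not part of the linked repo:

    ```shell
    # Try to create an AF_ALG socket (address family 38 on Linux). If the
    # entry points are blocked, or AF_ALG isn't compiled in, this fails.
    if python3 -c 'import socket; socket.socket(getattr(socket, "AF_ALG", 38), socket.SOCK_SEQPACKET).close()' 2>/dev/null
    then echo "AF_ALG reachable: mitigation NOT active"
    else echo "AF_ALG blocked (or not compiled in)"
    fi
    ```

    A successful socket() alone doesn't prove exploitability; it just tells you the attack surface is still exposed.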

  • ectospheno 1 hour ago
    The Bleeping Computer link below mentions a potential remedy until a patch is ready.

    https://www.bleepingcomputer.com/news/security/new-linux-cop...

  • seniorThrowaway 32 minutes ago
    Ubuntu has patches out, tested before and after patching.
  • uberduper 1 hour ago
    `initcall_blacklist` is a thing.
  • ChrisArchitect 1 hour ago