Package hallucination: LLMs may deliver malicious code to careless devs
2025-04-14 at 15:46, by Zeljka Zorz

LLMs' tendency to "hallucinate" code packages that don't exist could become the basis for a new type of supply chain attack dubbed "slopsquatting" (courtesy of Seth Larson, Security Developer-in-Residence at the Python Software Foundation).

A known occurrence

Many software […]
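The risk described above stems from package names that generated code references but that nobody has published, leaving the name free for an attacker to register. As a minimal sketch (not from the article), the following checks whether a suggested name resolves on PyPI at all, using PyPI's public JSON metadata endpoint; the package names in the example are hypothetical, and a name that does exist still warrants scrutiny, since a slopsquatter may have registered it already.

# Minimal sketch (not from the article): check whether LLM-suggested
# package names actually exist on PyPI before installing them, via
# PyPI's public JSON metadata endpoint (https://pypi.org/pypi/<name>/json).
import urllib.request
import urllib.error

def exists_on_pypi(package_name: str) -> bool:
    """Return True if PyPI serves metadata for this package name."""
    url = f"https://pypi.org/pypi/{package_name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:   # no such package -- a likely hallucination
            return False
        raise                 # other errors: surface them instead of guessing

# Hypothetical names an LLM might emit in generated code
for name in ["requests", "fastjsonx-utils"]:
    status = "exists" if exists_on_pypi(name) else "NOT on PyPI (possible hallucination)"
    print(f"{name}: {status}")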