In April 2016, a researcher launched a tool called OnionScan that probes dark web sites for vulnerabilities and security threats. The tool, as we wrote, “lets you scan it automatically for common vulnerabilities and errors that can deanonymize the owner or users.” Now another researcher has publicly described how to deploy the tool with a Python script, so that others can scan sites the same way.
That researcher, Justin Seitz, published the results of scanning 8,000 sites using this method. He told Motherboard that the goal “was to allow others to start more large-scale analyses that are usually too technically difficult for non-technologists to jump into.” The creator of OnionScan, however, is worried about this approach: she fears users of darknet sites could be quickly deanonymized if a large number of people were to use the tool.
OnionScan searches Tor hidden services, identified by their .onion addresses, for information that could be sensitive, such as metadata in uploaded images or exposed server status pages. “When used against multiple targets, it can find shared encryption keys, implying a strong correlation between different sites.”
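The “shared encryption keys” point can be illustrated with a short, hypothetical Python sketch: given per-site scan results that include a server key fingerprint (the input format here is invented for illustration, not OnionScan’s actual report schema), any sites presenting the same key very likely sit on the same server.

```python
from collections import defaultdict

def correlate_by_key(reports):
    """Group hidden services that expose the same key fingerprint.

    `reports` is an iterable of (onion_address, fingerprint) pairs;
    a fingerprint of None means the scan found no exposed key.
    """
    groups = defaultdict(list)
    for onion, fingerprint in reports:
        if fingerprint:
            groups[fingerprint].append(onion)
    # Only groups with two or more sites imply a correlation.
    return {fp: sites for fp, sites in groups.items() if len(sites) > 1}
```

This is the whole trick: the correlation requires no deanonymization of either site on its own, only that both leaked the same identifier.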
Using OnionScan, Motherboard found eight illegal sites leaking potentially identifying data about their owners. One example of what the tool turned up:
On Mollyworld, a hidden service run by a team of vendors selling MDMA, metadata in an image revealed that the camera used was a NIKON D3100. A site run by vendor Doctor Drugs is being hosted on the same server as another hidden service, called “The Polish Connect,” possibly alluding to the vendor’s location (on other marketplaces, Doctor Drugs lists the dispatch location as the Netherlands).
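The camera-model leak above comes from standard EXIF metadata embedded in the image file. OnionScan itself is written in Go, but the same check can be sketched in a few lines of Python using the Pillow imaging library (an assumption here; any EXIF reader would do):

```python
# Minimal EXIF sketch: pull the camera make and model out of an image,
# the same kind of metadata that identified the NIKON D3100 on Mollyworld.
# Assumes the third-party Pillow library is installed (pip install Pillow).
from PIL import Image
from PIL.ExifTags import TAGS

def camera_info(path):
    """Return the (Make, Model) EXIF tags recorded in an image, if present."""
    exif = Image.open(path).getexif()
    named = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return named.get("Make"), named.get("Model")
```

A site operator can avoid this particular leak by stripping the tags before uploading, for example by re-saving the image without its EXIF block.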
The tool is clearly beneficial to the site operators it was designed for, but it becomes a dangerous weapon in the hands of law enforcement. Darknet vendors who run their own sites, such as the well-known Gammagoblin (not an .onion link), run a greater risk of being deanonymized than vendors who sell on larger markets such as Alphabay or Dream.
Sarah Jamie Lewis built OnionScan to scan a single onion address at a time, and she used the scans to help fix the vulnerabilities they revealed. She has not publicly identified specific sites or released raw results, instead providing only brief but insightful summaries of her findings.
On Thursday, July 28, however, Seitz published a detailed blog post on using OnionScan far more efficiently.
“In his write-up, Seitz steps through setting up a server with Tor, installing all the necessary software prerequisites and Go, and automating OnionScan to loop through a list of Tor hidden services,” Motherboard writes.
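The automation step described above boils down to a loop like the following sketch: feed a list of hidden services to the scanner and collect one JSON report per site. It assumes an `onionscan` binary on the PATH and a running Tor proxy; the `-jsonReport` flag reflects OnionScan’s Go-style CLI, though exact flags may vary by version.

```python
import json
import subprocess

def scan_onions(onion_list, timeout=300):
    """Run OnionScan against each address and yield (address, parsed report)."""
    for onion in onion_list:
        try:
            result = subprocess.run(
                ["onionscan", "-jsonReport", onion],
                capture_output=True, text=True, timeout=timeout,
            )
            yield onion, json.loads(result.stdout)
        except (subprocess.TimeoutExpired, OSError, json.JSONDecodeError):
            # Unreachable sites, a missing binary, or malformed output
            # are recorded as None rather than aborting the whole crawl.
            yield onion, None
```

Writing the loop as a generator means a crawl of thousands of sites can stream results to disk as they arrive instead of holding everything in memory.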
The developer of OnionScan, Lewis, says “If more people begin publishing these results then I imagine there are a whole range of deanonymization vectors that come from monitoring page changes over time. Part of the reason I destroy OnionScan results once I’m done with them is because people deserve a chance to fix the issue and move on—especially when it comes to deanonymization vectors.”
She says that, when possible, she emails the operators of vulnerable sites to help them resolve the problems.
Seitz takes a completely different approach, believing the script is too useful a tool to keep from people. “Too often we set the bar so high for the general practitioner (think journalists, detectives, data geeks) to do some of this larger scale data work that people just can’t get into it in a reasonable way. I wanted to give people a starting point,” he said.
“I am a technologist, so it’s the technology and resulting data that interest me, not the moral pros and cons of data dumping, anonymity, etc. I leave that to others, and it is a grey area that as an offensive security guy I am no stranger to.”
Whatever happens, much of the public may soon be able to perform massive crawls of darknet sites, resulting in a potential mass data breach.