Website safety check

<hidden> anonymous
Created: 5 years and 8 months ago • Updated: 2 years and 7 months ago
Given a website, use multiple online services (e.g. McAfee SiteAdvisor, Norton Safe Web, AVG Online Web Page Scanner) to report on whether it is malicious, and present the list of results as an instant answer.

Not sure of the context this should fit into, though...
Should it be a plugin that lets you say:
!safe www.whatevs.com

Or should that safety net be integrated somehow when browsing with DDG by default? (Like when Google warns "this site is not safe!" once you click on a result.)
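A minimal sketch of the aggregation behind such a !safe command, in Python. The endpoints and response shape below are placeholders, since each real service has its own API, keys, and terms of use:

```python
import requests

# Placeholder report endpoints; the real services each expose their
# own APIs with their own keys and terms of use.
SERVICES = {
    "VirusTotal": "https://example.invalid/virustotal/report",
    "Norton Safe Web": "https://example.invalid/norton/report",
    "McAfee SiteAdvisor": "https://example.invalid/siteadvisor/report",
}

def check_site(url):
    """Ask each reputation service about `url` and collect its verdict."""
    results = {}
    for name, endpoint in SERVICES.items():
        try:
            resp = requests.get(endpoint, params={"url": url}, timeout=5)
            resp.raise_for_status()
            # Assume each service answers {"verdict": "safe" | "malicious"}.
            results[name] = resp.json().get("verdict", "unknown")
        except requests.RequestException:
            results[name] = "error"
    return results

if __name__ == "__main__":
    for service, verdict in check_site("http://www.whatevs.com").items():
        print(f"{service}: {verdict}")
```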

This forum has been archived

Thank you all for the many comments, questions and suggestions. Particular thanks go to user x.15a2 for constantly monitoring, replying and helping so many users here. To continue these discussions, please head over to the DuckDuckGo subreddit.


This comment has been removed for violation of our forum rules.
posted by <hidden> • 5 years and 3 months ago
brianrisk
If a site is identified as unsafe, should it be included in search results?
posted by brianrisk • 2 years and 9 months ago Link
yegg
From user ScreapDK:

"I would like it to be displayed both on the result pages when just searching for an url [1]

And when searching for "is <URL or domain> safe" [2][3].

[1] = https://duckduckgo.com/?q=http%3A%2F%2Fw...
[2] = https://duckduckgo.com/?q=is+http%3A%2F%...
[3] = https://duckduckgo.com/?q=is+ye.gg+safe

I think you should include the following services:

1) VirusTotal - https://www.virustotal.com/
2) AVG Threat Labs - http://www.avgthreatlabs.com/sitereports...
3) Norton Safe Web - https://safeweb.norton.com/
4) McAfee SiteAdvisor - https://www.siteadvisor.com/ (see the right sidebar, where it reads something like "Website report")"
posted by yegg Staff • 5 years and 5 months ago Link
yegg
Thanks, ScreapDK! Let us know when you're working on the integration and we can update the status of this : )
posted by yegg Staff • 5 years and 1 month ago Link
yegg
Thanks, Nikhil! We have WOT ratings available via the DDG Settings (they will show inline with the results). This idea is for a site check on a per-URL basis. It's especially handy for people who don't want to see site ratings all the time.
posted by yegg Staff • 4 years and 11 months ago Link
anonymous
I've been trying to find a place to report a site in the search engine.

It would have saved me quite a bit of time to see a red dot next to the link signifying that
http://www.freerip.com/ installs extra unwanted/unsafe software called Malware Protection Live.
posted by <hidden> • 2 years and 7 months ago Link
x.15a2
Here you go: https://duck.co/help/results/spam
Thanks!
posted by x.15a2 Community Leader • 2 years and 7 months ago Link
anonymous
You could use https://www.mywot.com/; it has a pretty cool API to use.
The API is documented at https://www.mywot.com/wiki/API
posted by [UserVoice Nikhil Jha] • 4 years and 11 months ago Link
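A minimal sketch of a lookup against that API, assuming the public_link_json2 endpoint the WOT wiki documented at the time (version 0.4); endpoint, parameters, and response shape may have changed since:

```python
import requests

API_URL = "http://api.mywot.com/0.4/public_link_json2"

def wot_reputation(host, api_key):
    """Fetch WOT's trustworthiness score for `host` (needs an API key)."""
    # The documented API expects the target host to end with a slash.
    resp = requests.get(API_URL,
                        params={"hosts": host + "/", "key": api_key},
                        timeout=5)
    resp.raise_for_status()
    data = resp.json().get(host, {})
    # Component "0" is trustworthiness: [reputation 0-100, confidence].
    return data.get("0")
```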
anonymous
I'm currently working on an online service/web app that shows whether a website/URL contains viruses etc., based on several different sources.

The service will provide the result through an API. Feel free to use that API if you want. : )

I will keep you updated when it's ready.
posted by [UserVoice ScreapDK] • 5 years and 1 month ago Link
anonymous
For source suggestions, please see the comment posted by the user named DuckDuckGo (an admin on this ideas platform) starting with the text "From user ScreapDK:" (that's me).

Best regards,
Casper Qvortrup
posted by [UserVoice ScreapDK] • 5 years and 2 months ago Link
anonymous
It would also be useful if it cultivated a dynamic blacklist whose "darkness" is voted on by netizens, for example for websites that pretend to be legitimate banks/companies.
posted by [UserVoice Jean] • 5 years and 4 months ago Link
anonymous
I would like it to be displayed both on the result pages when just searching for a URL [1]

And when searching for "is <URL or domain> safe" [2][3].

[1] = https://duckduckgo.com/?q=http%3A%2F%2Fw...
[2] = https://duckduckgo.com/?q=is+http%3A%2F%...
[3] = https://duckduckgo.com/?q=is+ye.gg+safe

I think you should include the following services:

1) VirusTotal - https://www.virustotal.com/
2) AVG Threat Labs - http://www.avgthreatlabs.com/sitereports...
3) Norton Safe Web - https://safeweb.norton.com/
4) McAfee SiteAdvisor - https://www.siteadvisor.com/ (see the right sidebar, where it reads something like "Website report"*)

*The site is shown in my native language, which is not English, so this is just loosely translated. I hope it makes sense :-)
posted by [UserVoice ScreapDK] • 5 years and 7 months ago Link
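As an illustration for the first service on that list, a URL lookup against VirusTotal's v2 report endpoint could look like the sketch below (an API key is required, and free keys are heavily rate-limited):

```python
import requests

def virustotal_report(url, api_key):
    """Return (positives, total) from VirusTotal's v2 URL report."""
    resp = requests.get("https://www.virustotal.com/vtapi/v2/url/report",
                        params={"apikey": api_key, "resource": url},
                        timeout=10)
    resp.raise_for_status()
    report = resp.json()
    # "positives" = engines that flagged the URL, "total" = engines consulted.
    return report.get("positives", 0), report.get("total", 0)
```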
anonymous
Is there any way we could cache search results so as to limit the number of times we revisit a site for ratings?

For example, if someone searches for the safety of "maliciousdomainlink.com" and we run the API for the results, would we be able to cache it and then only update that record again after 24 hours?
posted by [UserVoice Zeppelin] • 4 years and 10 months ago Link
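A minimal in-process sketch of the 24-hour refresh described above; names are illustrative, and a production setup would more likely cache at the proxy layer (see the replies below):

```python
import time

CACHE_TTL = 24 * 60 * 60  # refresh a record after 24 hours
_cache = {}  # domain -> (timestamp, rating)

def cached_rating(domain, fetch_rating):
    """Return a cached rating, re-fetching once the entry is a day old."""
    now = time.time()
    entry = _cache.get(domain)
    if entry is None or now - entry[0] > CACHE_TTL:
        _cache[domain] = (now, fetch_rating(domain))
    return _cache[domain][1]
```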
yegg
Not currently, though it depends on the implementation (triggers, discoverability, etc.). We can revisit closer to completion to see if we should inquire about a different arrangement. I wouldn't worry about the limit until then! : )
posted by yegg Staff • 4 years and 10 months ago Link
yegg
@Zeppelin, yes, Spice results can be cached. You can override the defaults using the proxy_cache_valid line.
posted by yegg Staff • 4 years and 10 months ago Link
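For reference, proxy_cache_valid is an nginx directive; an override along those lines might look like the following sketch (the location and zone names are made up for illustration, not DDG's actual configuration):

```nginx
# Hypothetical cache override; assumes a proxy_cache_path zone named
# "spice_cache" is defined elsewhere in the configuration.
location /api/safety-check {
    proxy_pass        https://upstream-rating-service;
    proxy_cache       spice_cache;
    proxy_cache_valid 200 24h;   # keep successful lookups for a day
}
```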
anonymous
It should be somewhat straightforward to design an app, or associated program, that does to a certain extent what social media has done. Several factors are associated with a given issue (number of clicks, post visits, comments), and the same analytics could be modified or simply adapted to, for example, capture what people have experienced with a site, be it effectiveness or confidence. The latter would be based on a protocol that starts with simple thumbs-up/down votes for whatever reason. If people are "comfortable" accessing a site, we don't "self-censor" ourselves as much, and traffic volume goes up. Why we are more comfortable with a site is not as important as just being so. It shows.

Protocols based on the Mechanical Turk have many individuals making value judgments. Get enough subjective points of view, and it becomes objective to a degree. A protocol could be based on certain values that over time reinforce whatever the targeted issue is. That's in effect how living things "learn" anything, very basically.

As for more automated ways, which you would need as well: a basic approach I've used is simple, cheap, and usually effective, given that it's set up as a one-way passive filter. Bouncing from email/website, at least as I've noticed, is not likely perceived as a red flag when it's on the receiving, i.e. passive, end. You can keep it even within one site, but with compartmented email accounts. For some reason, using actual sites (HTTP/HTTPS and up) with a VPN seems more effective; I have no idea why. Send site feeds through several different web/email addresses, each designed to act as an increasingly picky and very flexible "firewall".

It's not the best approach, but it's cheap and usually quite effective. Also, "washing" a website through HTTPS and other filters does not in itself assure the site is kosher, so to speak. But it funnels the stream through increasingly specific filters.

Breaking up a site feed into more than one "package" is usually very effective. It seems less "creepy" that way, and it has a long history in secure radio communications. Just make sure that if there's any signal degradation, each package can stand on its own. As anyone can tell, my computer skills are very ad hoc.
posted by [UserVoice Robert] • 4 years and 9 months ago Link
anonymous
I find myself doing these checks quite often... I can see how it could be useful.

I would like to code this up as well, if you people think it's reasonable.

I'm guessing this is a Fathead...?
posted by [UserVoice Dan Droid] • 5 years and 8 months ago Link
anonymous
I am currently looking into this. However, the WOT API ToS state that there should be no more than 50,000 requests in 24 hours. Do you think we might exceed that limit?
posted by [UserVoice Svetlozar Kalchev] • 4 years and 10 months ago Link
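One defensive option, sketched below with illustrative names: count outbound WOT calls and skip live lookups once the daily quota is near, serving cached results instead.

```python
import time

DAILY_QUOTA = 50_000  # WOT API ToS limit per 24 hours

class QuotaGuard:
    """Counts outbound API calls and refuses them once the daily quota is hit."""

    def __init__(self, quota=DAILY_QUOTA):
        self.quota = quota
        self.window_start = time.time()
        self.count = 0

    def allow(self):
        now = time.time()
        if now - self.window_start >= 24 * 60 * 60:
            self.window_start, self.count = now, 0  # start a new 24h window
        if self.count >= self.quota:
            return False  # over quota: serve cached data or skip the lookup
        self.count += 1
        return True
```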