Any point in using a paid threat intelligence service over Google's Lookup API for detecting malicious URLs? [closed]

Thread starter: Adi (Guest)
I have been working on a project, part of which involves receiving a link from the user. Currently I am using a very rudimentary mechanism to verify the input (all it does is check whether the input is a URL or not). In an attempt to improve security, I have been searching for solutions. Some of the ones I came across involved checking the characters in the link, checking the link's length, etc. I also came across some websites that maintain auto-updating databases of known malicious sites. However, going through and comparing thousands of links one by one would be very inefficient, and the first approach felt a bit spotty.
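For context, the kind of "is this even a URL" check described above can be sketched with the standard library. This is a minimal illustration of shape validation only; it says nothing about whether a URL is malicious:

```python
from urllib.parse import urlparse

def looks_like_url(candidate: str) -> bool:
    """Rudimentary syntactic check: scheme must be http(s) and a host must
    be present. This validates shape only -- not safety or reachability."""
    try:
        parsed = urlparse(candidate)
    except ValueError:
        return False
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(looks_like_url("https://example.com/page"))  # True
print(looks_like_url("not a url"))                 # False
```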

After a little more searching, I came across Google's Lookup API:

https://cloud.google.com/web-risk/docs/lookup-api#python

The gist of it is that it compares submitted URLs against various Web Risk lists.
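Based on my reading of the linked docs, a Lookup API call is a single GET request against the `uris:search` endpoint with an API key, the URL to check, and one or more threat types. A stdlib-only sketch (the endpoint and parameter names should be verified against the linked documentation; the API key is a placeholder):

```python
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder -- supply your own Google Cloud API key
ENDPOINT = "https://webrisk.googleapis.com/v1/uris:search"

def build_query(url: str) -> str:
    """Build the query string; threatTypes is a repeated parameter."""
    params = [
        ("key", API_KEY),
        ("uri", url),
        ("threatTypes", "MALWARE"),
        ("threatTypes", "SOCIAL_ENGINEERING"),
        ("threatTypes", "UNWANTED_SOFTWARE"),
    ]
    return urllib.parse.urlencode(params)

def check_url(url: str) -> bool:
    """Return True if Web Risk reports the URL on one of its lists.
    Makes a network call, so it needs a valid API key to actually run."""
    with urllib.request.urlopen(f"{ENDPOINT}?{build_query(url)}", timeout=10) as resp:
        body = json.load(resp)
    # Per the docs, an empty JSON object ({}) means no threat entry matched.
    return "threat" in body
```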

But, I also came across the following thread:


The thread suggests using a paid threat intelligence service over anything free or self-made.

So, my questions are as follows:


  1. Is there a reason to use the paid services over Google's? (I realize Google limits the number of requests, but for now assume I am not dealing with a huge volume of requests.)


  2. Can I build something myself that checks URLs against updated databases quickly and easily? (The impression I got from the thread above was that this is not possible.)


  3. What services should I use if I opt against Google? (I saw some listed in the thread above; however, those services mostly cover sites that already have a fair bit of traffic. My project deals with users whose sites have little to no traffic, which could result in a false malicious tag.)
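On question 2, for what it's worth: a line-by-line comparison against thousands of entries is slow, but loading a downloaded feed into a hash set makes each membership test effectively constant time. A toy sketch with made-up hostnames (real feeds would also need URL canonicalization and periodic refreshes):

```python
# Stand-in for a downloaded blocklist feed; real feeds refresh periodically.
blocklist_lines = """\
evil.example
phish.example
malware.example
"""

# A set gives O(1) average-case membership tests, so checking each
# incoming URL's host against thousands of entries is cheap.
blocklist = set(blocklist_lines.split())

def is_blocked(host: str) -> bool:
    return host.lower() in blocklist

print(is_blocked("phish.example"))  # True
print(is_blocked("example.com"))   # False
```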

The thread posted above mirrors my situation closely, in that I have no cybersecurity experience and I too am accessing a lot of the sites' metadata.

Any advice would be much appreciated!