Solution 1:

You can inspect the page by pressing CTRL + SHIFT + I (or CMD + OPTION + I on Mac), or by right-clicking the page and choosing Inspect.

Click on the "Elements" tab, press CTRL + F (or CMD + F), and paste the link you want to check.

Hope it helps.
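
If you prefer the console to the Elements search box, a one-liner can do roughly the same search over the page's current markup (the URL here is just a placeholder):

// true if the page's markup contains the URL anywhere,
// not only inside <a> tags
document.documentElement.outerHTML.includes("https://my-link.com");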

Solution 2:

If there is a link on a web page, it will most likely be in an <a> tag's href attribute. So it is a matter of opening the console, getting all the <a> tags, and collecting their hrefs.

var allLinkElements = document.querySelectorAll("a");
var allLinks = [];

// Collect the href of every <a> element on the page
for (var i = 0; i < allLinkElements.length; i++) {
  allLinks.push(allLinkElements[i].href);
}

// Check whether the exact link you are looking for is among them
console.log(allLinks.includes('http://www.what-im-looking-for.com'));
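
Since the question asks about links like a given URL, a substring match may be more useful than the exact includes check above. A minimal variant, assuming a reasonably modern browser with Array.from support ("my-link.com" is a placeholder):

// Collect every href that merely contains the target string
var partialMatches = Array.from(document.querySelectorAll("a"))
  .map(function (a) { return a.href; })
  .filter(function (href) { return href.indexOf("my-link.com") !== -1; });

console.log(partialMatches);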

Problem:

I’d like to check whether any pages on the domain https://example.com contain links like https://my-link.com.

I know I can use the following Google search:
site:https://example.com text to find
although it only works for text content.

Any idea whether it’s possible to find links inside href attributes that way, or in some other way?

Comments

Comment posted by scunliffe

It sounds like you want to search for “backlinks” (links that point back to your site)… if so, and this is a one-time thing, there are lots of backlink checker tools out there, e.g. ahrefs.com/backlink-checker.

Comment posted by Rob

I’m voting to close this question because it is not a programming question (see “What topics can I ask about here?”).

Comment posted by engray

@scunliffe thanks for the tool! This is exactly what I want to do, although not for checking backlinks pointing to my website; I want to be sure some links I had on a WordPress page are not hardcoded in the WP template. I have WP admin access, but no FTP rights. To be precise, my question is about finding such links via a Google search.

Comment posted by engray

@Rob my question is about Google search; the page you’ve linked includes the point “software tools commonly used by programmers”. Is Google not a tool commonly used by programmers? šŸ˜‰

Comment posted by Tomas Mota

You could try using Beautiful Soup with Python. That would let you get all of the hrefs within a page.

Comment posted by engray

Thanks for the reply. I know I could do it this way, but what I’m looking for is finding all hrefs on all pages of some domain. I’ve edited my question – reading it back now, it really wasn’t specific enough šŸ˜€

Comment posted by Syed

Look, it’s not specific, that’s correct. It is universally understood that you need the source code in a database to extract a term, aka ‘href’, from each source file. How you do it is up to you: grab the sitemap file and fetch the source code of each link. You could use any simple old PHP or Python and run a regex to extract the terms into an array. No one will hand it over for free; that’s what the big business is all about.
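
For illustration, here is a minimal sketch of the workflow described above (sitemap → fetch each page → regex for hrefs), written in Node.js 18+ for its built-in fetch rather than the PHP/Python mentioned; the sitemap URL and target string are placeholders, there is no error handling, and the regexes are deliberately naive:

// Fetch a site's sitemap, then scan every listed page for hrefs
// that contain a target string.
const SITEMAP = "https://example.com/sitemap.xml"; // placeholder
const TARGET = "my-link.com";                      // placeholder

async function findLinks() {
  const xml = await (await fetch(SITEMAP)).text();
  // Pull every <loc>…</loc> URL out of the sitemap
  const pages = [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map(m => m[1]);

  for (const page of pages) {
    const html = await (await fetch(page)).text();
    // Collect href values that mention the target string
    const hits = [...html.matchAll(/href="([^"]*)"/g)]
      .map(m => m[1])
      .filter(href => href.includes(TARGET));
    if (hits.length) console.log(page, hits);
  }
}

findLinks();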

Comment posted by stackoverflow.com/questions/1439326/…

Here is a previous answer on the subject that might help.
