Question for all the smart nostrich cookies out there:

I need some way to scan WhyBitcoinOnly.com and find out when certain URL links die, route to a 404 error, or something like that.

Is there some sort of AI link-scanning tool that does that? Or some other way you guys know of?

It would be insanely tedious to go through every link on the entire site (every few months or whatever) to keep dead links cleaned out.

Thanks in advance! #asknostr

Discussion

This is definitely not the best way, but it's one I know. Put all the links into a single bookmark folder, then open all the bookmarks at once and Ctrl+W through them. Ctrl+Shift+T if you accidentally close one you didn't want to close.

We are looking for an investor who can provide our holding company with a loan of 450,000 US Dollars. With the 450,000 US Dollars loan you provide, we will establish a major online education platform offering courses in many languages and fields.

All courses on our platform will be prepared by professional instructors and will fully belong to us. Since the courses are short and detailed, even a beginner can become professional in the chosen field within just one month.

To join our platform, members will pay 100 US Dollars. After this one-time payment, they will have unlimited lifetime access to all courses. After completing their training, if they wish, they can work as interns in our holding and earn income. We will assign paid projects from companies to interns, generating more profit. From each project’s income, 10% will be given to the interns.

If an intern successfully completes 7 projects, they will be hired as a full-time employee with a fixed salary in our holding.

Because our education platform will be multilingual and cover many subjects, while also offering career and income opportunities, it will attract a wide global audience. In this way, we will generate fast revenue both from projects and from memberships. This project will create great profit for you and allow us to establish our business successfully.

πŸ’Ό Your Profit:

You will lend our holding 450,000 US Dollars. On 22.05.2026, you will receive 1,500,000 US Dollars back.

📩 To learn more details about our education project and how you can lend 450,000 US Dollars to our holding, please contact us via WhatsApp or Telegram.

My WhatsApp phone number:

+44 7842 572711

My telegram username:

@adenholding

I'd go for a scraper or end-to-end testing tool, validating whether URLs produce healthy responses in general or lead to specific sites/endpoints. Your site's sitemap.xml could be the jumping-off point, since it contains all the pages that might contain links/anchors.

So no AI required πŸ˜‰
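For illustration, here's a minimal sketch of that idea in plain Python, assuming the sitemap sits at https://whybitcoinonly.com/sitemap.xml (I haven't verified that path) and using the third-party requests library:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests  # third-party: pip install requests

SITEMAP_URL = "https://whybitcoinonly.com/sitemap.xml"  # assumed path, adjust as needed


class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith(("http://", "https://", "/")):
                self.links.append(href)


def pages_from_sitemap(sitemap_url):
    """Yield every page URL listed in the sitemap."""
    xml = requests.get(sitemap_url, timeout=30).text
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in ET.fromstring(xml).findall(".//sm:loc", ns):
        yield loc.text.strip()


def check(url):
    """Return the HTTP status code for a URL (0 if the request itself fails)."""
    try:
        return requests.get(url, timeout=30, allow_redirects=True).status_code
    except requests.RequestException:
        return 0


for page in pages_from_sitemap(SITEMAP_URL):
    collector = LinkCollector()
    collector.feed(requests.get(page, timeout=30).text)
    for link in collector.links:
        full = urljoin(page, link)
        status = check(full)
        if status >= 400 or status == 0:
            print(f"BROKEN ({status}): {full}  <- found on {page}")
```

Using GET instead of HEAD is deliberate: some servers answer HEAD requests differently, so GET gives the more honest result, just a bit slower.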

I've built similar things in the past and I'd be happy to hook you up for a fistful of sats.

What do I actually have to do? I need a 3rd grader explanation lol

There might be out-of-the-box tools for this, especially for WordPress, but my approach would involve writing a teeny bit of code and then either hosting the tool somewhere or just running it on your computer (since you're the only one interested in the result) to save 100% of the cost.

If I were to build something like this for myself (and I might, it sounds like a great idea), I'd use a quirky little Python package called scrapy ("The world's most used open source data extraction framework") and simply go through all the "a" (anchor) tags on your site (I see that pretty much all the information is on a single page, which makes things much easier), request the "href" link/URL on each one, and see what comes back, creating a list of broken links and logging them for the user to review afterwards.
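Very roughly, something like this, as an untested sketch (the spider name, start URL, and status handling are my guesses, not pulled from the actual site):

```python
import scrapy


class DeadLinkSpider(scrapy.Spider):
    """Crawls a start page, follows every <a href>, and logs broken links."""

    name = "deadlinks"
    start_urls = ["https://whybitcoinonly.com/"]  # assumed start page
    # Let error responses (404, 500, ...) reach our callback instead of being dropped.
    custom_settings = {"HTTPERROR_ALLOW_ALL": True}

    def parse(self, response):
        for href in response.css("a::attr(href)").getall():
            yield response.follow(
                href, callback=self.check_link, meta={"found_on": response.url}
            )

    def check_link(self, response):
        if response.status >= 400:
            self.logger.warning(
                "Broken link: %s (status %s, found on %s)",
                response.url, response.status, response.meta["found_on"]
            )
```

Save it as deadlinks.py, run `scrapy runspider deadlinks.py`, and anything logged as "Broken link" is a candidate for cleanup.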

If there's any chance you would be able to make that and help out, I would absolutely throw you some sats and show you love back in any way I can πŸ™πŸ™πŸ™

I think it's meant to be. I've just finished another little project like that, so I'm free to look into yours. I'll whip up a little something and get back to you; feel free to DM me if you have any further input or questions.

Just one more thing, which OS are you using? Linux, Windows, or Mac?

Hey there, I've messaged you a while ago with all the broken URLs on your site, might want to check your DMs πŸ˜‰