Certified Ethical Hacker (CEH) V13 Certification Exam

Question 437 of 448


During web footprinting, automated spiders cannot crawl a target site because of crawling restrictions declared in its root directory (e.g., a robots.txt file). The tester instead browses the site manually while an intercepting tool records every request and response. What is this technique called?

Options

  • A Using Photon to retrieve archived URLs from archive.org
  • B Using Netcraft to gather website information
  • C Examining HTML source code and cookies only
  • D User-directed spidering with tools like Burp Suite and WebScarab
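For context, option D names user-directed spidering: instead of an automated crawler, the tester's own browser is routed through an intercepting proxy that logs every request/response pair, building a map of the site the tester actually visits. Below is a minimal sketch of that recording step, assuming Python with mitmproxy installed as a stand-in for the Burp Suite/WebScarab proxies named in the option; the script and the sitemap.log file name are illustrative assumptions, not part of the question.

```python
# record_requests.py - a minimal sketch of the recording half of
# user-directed spidering, assuming mitmproxy is installed
# (a stand-in for the Burp Suite / WebScarab proxies in option D).
#
# Run:   mitmproxy -s record_requests.py
# Then point the browser's proxy at 127.0.0.1:8080 and browse the
# target manually; every request/response pair is appended to the log.
from mitmproxy import http

LOG_FILE = "sitemap.log"  # hypothetical output file name

def response(flow: http.HTTPFlow) -> None:
    """mitmproxy hook: called once per completed request/response pair."""
    with open(LOG_FILE, "a", encoding="utf-8") as log:
        log.write(
            f"{flow.request.method} {flow.request.pretty_url} "
            f"-> {flow.response.status_code} "
            f"({len(flow.response.content or b'')} bytes)\n"
        )
```

Because the proxy only records traffic the tester generates, it never issues requests of its own, so paths disallowed by robots.txt are mapped only if a human chooses to visit them.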

Discussions

No discussions have been posted for this question yet.