Job Description
About the Role
HEROIC Cybersecurity (HEROIC.com) is seeking a senior-level engineer to design, build, and operate fully automated intelligence collection systems that power our AI-driven cybersecurity and breach intelligence platforms.
This role owns the end-to-end acquisition and ingestion pipeline: continuously discovering, crawling, extracting, indexing, and normalizing millions of new artifacts daily, including documents, chats, forums, leaked datasets, repositories, threat actor communications, hacker marketplaces, unsecured infrastructure, and decentralized networks across the surface web, deep web, dark web, and anonymized networks.
Our Threat Research Team’s mission is aggressive: achieve near-total coverage of global breach and leak data with 99%+ automation. Your work directly enables HEROIC’s ability to identify exposures before they are weaponized.
What You Will Do
Automated Intelligence Collection & Discovery
Architect and operate large-scale, distributed crawling and discovery systems across:
- Surface web, deep web, and dark web
- Hacker forums, underground marketplaces, and breach communities
- Chat platforms (Telegram, Discord, IRC, WhatsApp, etc.)
- Paste sites, code repositories, and social platforms used for breach disclosure
Continuously discover, archive, and download newly released datasets, logs, credentials, and artifacts the moment they appear.
Build automated collectors and archivers for anonymized and decentralized networks (a minimal collector sketch follows this list), including:
- Tor (.onion), I2P, ZeroNet, Freenet, IPFS, GNUnet, Lokinet, Yggdrasil, and similar systems
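By way of illustration, the sketch below shows the most basic collector primitive for this kind of work: fetching a .onion page through a local Tor SOCKS proxy. The daemon address, the requests/PySocks dependency, and the onion URL are illustrative assumptions, not a description of HEROIC's production stack.

```python
# Minimal sketch: fetch a .onion page through a local Tor SOCKS proxy.
# Assumes a Tor daemon listening on 127.0.0.1:9050 and the `requests`
# package installed with SOCKS support (`pip install requests[socks]`).
import requests

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h resolves hostnames inside Tor
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_onion(url: str, timeout: int = 60) -> str:
    """Download a page over Tor and return its body as text."""
    resp = requests.get(url, proxies=TOR_PROXIES, timeout=timeout)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    # Placeholder address; a real collector would pull targets from a queue.
    print(fetch_onion("http://exampleonionaddress.onion/")[:500])
```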
Design resilient workflows for unreliable, adversarial, or episodic data sources.
Normalize and index data from non-traditional network protocols and formats.
Infrastructure & Exposure Discovery
Develop automated scanning systems (a minimal probe sketch follows this list) to identify:
- Unsecured databases (Elasticsearch, MySQL, PostgreSQL, MongoDB, etc.)
- Exposed cloud storage (AWS S3, Azure Blob Storage, Google Cloud Storage, DigitalOcean Spaces)
- Open FTP servers, backups, and misconfigured archives
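As one concrete example, a minimal probe for an unauthenticated Elasticsearch instance might look like the sketch below. The host list and port are placeholders, and any real scan must stay within legal and authorized scope.

```python
# Minimal sketch: probe hosts for unauthenticated Elasticsearch instances.
# Hosts are placeholder TEST-NET addresses; a production scanner adds
# rate limiting, TLS handling, and strict authorization scoping.
import requests

CANDIDATE_HOSTS = ["192.0.2.10", "192.0.2.11"]

def is_open_elasticsearch(host: str, port: int = 9200, timeout: int = 5) -> bool:
    """Return True if the host answers the ES root endpoint without auth."""
    try:
        resp = requests.get(f"http://{host}:{port}/", timeout=timeout)
        # An unsecured cluster returns HTTP 200 and a JSON banner with a
        # "cluster_name" field; a secured one returns 401/403 instead.
        return resp.status_code == 200 and "cluster_name" in resp.json()
    except (requests.RequestException, ValueError):
        return False

for host in CANDIDATE_HOSTS:
    if is_open_elasticsearch(host):
        print(f"open Elasticsearch found at {host}:9200")
```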
Monitor and ingest data from file hosting and distribution platforms commonly used for breach dumps.
Pipeline Engineering & Operations
Build ETL pipelines to clean, normalize, enrich, and index structured and unstructured data.
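To make the normalization step concrete, here is a minimal sketch that turns raw `email:password` combo-list lines into uniform records; the field layout and SHA-1 fingerprint are illustrative choices, not HEROIC's actual schema.

```python
# Minimal sketch: normalize raw "email:password" combo lines into records.
# The record layout and fingerprint scheme are illustrative only.
import hashlib
from datetime import datetime, timezone

def normalize_combo_line(line: str, source: str) -> dict | None:
    """Parse one combo-list line into a normalized credential record."""
    line = line.strip()
    if ":" not in line:
        return None  # skip malformed rows
    email, _, password = line.partition(":")
    email = email.strip().lower()
    if "@" not in email:
        return None
    return {
        "email": email,
        "password": password,
        "fingerprint": hashlib.sha1(f"{email}:{password}".encode()).hexdigest(),
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

print(normalize_combo_line("Alice@Example.com:hunter2", source="demo-dump"))
```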
Implement advanced anti-bot evasion strategies (proxy rotation, fingerprinting, CAPTCHA mitigation, session management).
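The simplest of these techniques is round-robin proxy rotation with retry; a minimal sketch, assuming Python `requests` and placeholder proxy URLs:

```python
# Minimal sketch: rotate outbound proxies per request, retrying on failure.
# Proxy URLs are placeholders; real pools are larger and health-checked.
import itertools
import requests

PROXY_POOL = itertools.cycle([
    "http://proxy-a.example:8080",
    "http://proxy-b.example:8080",
    "http://proxy-c.example:8080",
])

def fetch_with_rotation(url: str, attempts: int = 3, timeout: int = 15) -> requests.Response:
    """Send the request through the next proxy in the pool, retrying on failure."""
    last_error = None
    for _ in range(attempts):
        proxy = next(PROXY_POOL)
        try:
            resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=timeout)
            resp.raise_for_status()
            return resp
        except requests.RequestException as err:
            last_error = err
    raise last_error  # every attempt failed
```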
Integrate collected intelligence into centralized databases and search systems.
Design APIs and internal tooling to support downstream analysis and AI/ML workflows.
Automate deployment, scaling, and monitoring using Docker, Kubernetes, and cloud infrastructure.
Continuously optimize performance, reliability, and cost efficiency of crawler clusters.
Requirements
Minimum 4 years of hands‑on experience in data engineering, intelligence collection, crawling, or distributed data pipelines.
Strong Python expertise and experience with frameworks such as Scrapy, Playwright, Selenium, or custom async systems.
Proven experience operating high‑volume, automated data collection systems in production.
Deep understanding of web protocols, HTTP, DOM parsing, and adversarial scraping environments.
Experience with asynchronous, concurrent, and distributed architectures.
Familiarity with SQL and NoSQL databases (PostgreSQL, MongoDB, Elasticsearch, Cassandra).
Strong Linux/Unix, shell scripting, and Git‑based workflows.
Experience deploying and operating systems using Docker, Kubernetes, AWS, or GCP.
Excellent analytical, debugging, and problem‑solving skills.
Strong written and verbal communication skills.
Preferred / High‑Value Experience
Direct experience with dark web intelligence, breach data, OSINT, or threat research.
Familiarity with Tor, I2P, underground forums, stealer logs, or credential ecosystems.
Experience processing large breach datasets or stealer logs.
Background working in adversarial data environments.
Exposure to AI/ML‑driven intelligence platforms.
- Position Type: Full‑time
- Location: Remote in India. Work from wherever you please! Your home, the beach, our offices, etc.
- Compensation: USD 1,300-2,000 monthly
- Professional Growth: Amazing upward mobility in a rapidly expanding company.
- Innovative Culture: Be part of a team that leverages AI and cutting‑edge technologies.
About Us
HEROIC Cybersecurity (HEROIC.com) is building the future of cybersecurity. Unlike traditional cybersecurity solutions, HEROIC takes a predictive and proactive approach to intelligently secure our users before an attack or threat occurs. Our work environment is fast‑paced, challenging, and exciting. At HEROIC, you’ll work with a team of passionate, engaged individuals dedicated to intelligently securing the technology of people all over the world.