Available for projects

AI Automation
Engineer.

I build workflows that replace manual work permanently — lead gen pipelines, CRM automation, AI agents, and data systems that run 24/7 without babysitting.

20-30 qualified leads/day, automated
5K+ records/day, scrapers running
100% inbox rate, cold email system
5.0★ Upwork rating, 21 jobs

Selected work

Five workflows.
Zero manual steps.

01 / AI Lead Generation Pipeline

Innovapack — Autonomous Lead Engine

Daily-scheduled AI pipeline that discovers packaging companies, finds decision-makers, verifies emails, and pushes qualified leads to Pipedrive — fully hands-off.

n8n · OpenRouter AI · SerpAPI · Hunter.io · Pipedrive · Google Sheets
20-30 leads/day · 0 manual steps · 35 nodes

The problem

Innovapack's sales team was manually Googling packaging companies, copy-pasting names into LinkedIn, guessing email formats, and dropping qualified leads into a spreadsheet by hand. It was eating 4-5 hours a day and still missing hundreds of prospects.

What I built

Schedule Trigger fires daily. AI extracts company names from SerpAPI search results. A deduplication layer checks Google Sheets to skip already-processed domains. For each new company, a second AI call finds the decision-maker's name and title. Hunter.io verifies the email. A qualification AI scores the lead against Innovapack's ICP. Qualified leads are pushed to Pipedrive with full context.

The outcome

20-30 new qualified leads per day with zero manual input. The team wakes up to a full pipeline instead of an empty spreadsheet.

Node chain

Schedule Trigger → Build Search Queries → SerpAPI Discovery → AI: Extract Companies → Deduplicate (Sheets) → SerpAPI Founder Search → AI: Decision Maker → Hunter Email Finder → AI Qualification → Push to Pipedrive

Key technical details

Multi-search fallback: if a single SERP result is insufficient, the workflow aggregates from multiple search queries before qualifying.

Lock mechanism: Google Sheets append-with-dedup prevents the same company from being processed twice across runs.

AI models: OpenRouter with two separate models — fast model for extraction, reasoning model for qualification scoring.
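The cross-run dedup described above reduces to a simple set check. A minimal Python sketch of the logic, with a plain list standing in for the Google Sheets column the workflow reads at the start of each run (function and field names are illustrative, not the production node code):

```python
def dedupe_companies(candidates, processed_domains):
    """Return only companies whose domain has not been seen in a prior run.

    `processed_domains` stands in for the Sheets column of already-processed
    domains; filtered results would be appended back so the next run skips them.
    """
    seen = {d.strip().lower() for d in processed_domains}
    fresh = []
    for company in candidates:
        domain = company["domain"].strip().lower()
        if domain not in seen:
            seen.add(domain)  # also dedupes within the current batch
            fresh.append(company)
    return fresh

# Example: a second run sees one already-processed domain
processed = ["acmepack.com"]
batch = [{"name": "AcmePack", "domain": "acmepack.com"},
         {"name": "BoxCo", "domain": "boxco.io"}]
print(dedupe_companies(batch, processed))  # only BoxCo survives
```

Normalizing domains to lowercase before comparing is what keeps the lock reliable across search results that capitalize inconsistently.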

02 / AI Data Enrichment & CRM

HubSpot Prospect List Generator

Intake form accepts a name, CSV, Excel, or image — AI extracts and enriches contact data, finds emails via two providers with fallback, and writes clean records to Google Sheets.

n8n · Mistral OCR · OpenRouter AI · Snov.io · Hunter.io · HubSpot · Serper
5+ input formats · dual email fallback · 58 nodes

The problem

Sales reps were manually converting prospect lists from business card photos, PDFs, and spreadsheets into usable CRM records — hunting for emails one by one, copy-pasting into HubSpot by hand. The process was inconsistent and slow.

What I built

A Form Trigger accepts text, CSV, Excel, or an image. A router detects the input type. Mistral OCR handles images. AI normalizes all formats into a unified contact schema. A dual email-finder runs Snov.io first, falls back to Hunter.io. If email is still missing, SERP enrichment digs up context. Verified contacts are written to Google Sheets in a HubSpot-ready format.
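The Snov.io-then-Hunter.io fallback is an ordered-provider chain. A hedged sketch of the pattern, with stand-in callables rather than real API clients (provider names and signatures here are illustrative):

```python
def find_email(contact, providers):
    """Try each email provider in order; return the first hit.

    `providers` is an ordered list of callables (Snov.io first, Hunter.io as
    fallback in the real workflow); each returns an email string or None.
    """
    for provider in providers:
        email = provider(contact)
        if email:
            return email
    return None  # a miss triggers the SERP-enrichment branch downstream

# Stand-in providers: the first "fails", the second succeeds
snov = lambda c: None
hunter = lambda c: f"{c['first'].lower()}@{c['domain']}"

contact = {"first": "Dana", "domain": "boxco.io"}
print(find_email(contact, [snov, hunter]))  # dana@boxco.io
```

Keeping the providers as an ordered list makes it trivial to add a third finder or reorder by hit rate without restructuring the flow.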

The outcome

Any team member can drop in a business card photo or messy spreadsheet and get a clean, enriched, deduped list ready for HubSpot import — in seconds.

Node chain

Form Trigger → Route by Input Type → Mistral OCR (images) → AI: Extract & Structure → Snov Email Search → Hunter Fallback → SERP Enrichment → AI: Enrich from SERP → Google Sheets Append

Key technical details

58-node workflow with 4 parallel enrichment paths that converge via Merge nodes before final output.

Name-based email finding: separate sub-flow handles cases where only a name and company domain are known.

Multi-format extraction: XLSX, CSV, image, and raw text all normalize into the same schema before any enrichment runs.
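The normalize-before-enrich step above maps whatever field names the intake produced onto one contact schema. A minimal sketch of that mapping (the schema fields and alias lists are illustrative, not the workflow's exact ones):

```python
def normalize_contact(raw, source):
    """Map a record from any intake format onto one unified contact schema."""
    schema = {"first_name": "", "last_name": "", "company": "", "domain": "", "email": ""}
    aliases = {
        "first_name": ["first_name", "first", "firstname", "given_name"],
        "last_name": ["last_name", "last", "surname"],
        "company": ["company", "organization", "org"],
        "domain": ["domain", "website"],
        "email": ["email", "e-mail", "mail"],
    }
    # Case-insensitive lookup of whatever headers the CSV/XLSX/OCR step produced
    lowered = {k.lower().strip(): v for k, v in raw.items()}
    for field, names in aliases.items():
        for name in names:
            if name in lowered and lowered[name]:
                schema[field] = str(lowered[name]).strip()
                break
    schema["source"] = source  # keep provenance for debugging
    return schema
```

Because every branch converges on this one shape, the email finders and SERP enrichment never need to know which input format a record came from.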

03 / CRM + Slack + AI Sales Agent

HubSpot Task Automation — AI Sales Rep

AI reads email history for each contact, drafts the next outreach email, routes it to Slack for one-click approval, sends it via Brevo, logs it back to HubSpot, and schedules the next follow-up task automatically.

n8n · HubSpot API · OpenRouter AI · Slack · Brevo · Google Sheets
30-50 hrs/week saved · human-in-loop · 114 nodes

The problem

A sales team was manually reading through HubSpot contact histories, writing individual follow-up emails, and scheduling the next task by hand after each send. Every rep was spending 30-50 hours a week on work that required no real judgment.

What I built

A Webhook fires when a HubSpot task comes due. The workflow pulls the contact's full email history, strips HTML, and builds AI context. An AI agent drafts the next outreach email. A Slack message with Approve / Reject / Revise buttons is sent to the rep. On approval, Brevo sends the email, HubSpot logs it, the task is marked complete, and a new follow-up task is scheduled automatically.

The outcome

Reps review and approve emails — they don't write them. The full cycle from task due to next task scheduled runs in under 60 seconds.

Node chain

Webhook (Slack) → Fetch Contact Tasks → Get Email History → AI: Draft Email → Slack Approval → Route: Approve / Revise → Brevo: Send Email → HubSpot: Log Email → Mark Task Complete → Create Next Task

Key technical details

114-node workflow with full human-in-the-loop: approval, rejection, revision request, and re-generation paths all handled.

Context building: email history sorted by date, HTML stripped, truncated to fit AI context window with relevance preserved.

Timeout guard: if no Slack response within window, a timeout notification fires and the task stays open for manual handling.
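The context-building step can be sketched as a sort-strip-truncate pass. A minimal version that drops the oldest messages first when the budget is exceeded, so the most recent exchanges always survive (field names and the character budget are assumptions, not the production values):

```python
import re
from html import unescape

def build_context(emails, char_budget=8000):
    """Turn a contact's email history into a prompt-sized context string."""
    def strip_html(html):
        # Crude tag removal is enough for prompt context; entities are decoded
        return unescape(re.sub(r"<[^>]+>", " ", html)).strip()

    ordered = sorted(emails, key=lambda e: e["date"])  # oldest first
    blocks = [f"[{e['date']}] {e['direction']}: {strip_html(e['body'])}"
              for e in ordered]
    # Drop from the front (oldest) until everything fits the budget
    while blocks and sum(len(b) + 1 for b in blocks) > char_budget:
        blocks.pop(0)
    return "\n".join(blocks)
```

Truncating whole messages from the oldest end, rather than slicing mid-message, is what keeps the surviving context coherent for the drafting model.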

04 / CRM Migration & Data Pipeline

Notion → Pipedrive Migration

One-shot migration for Benjamin: pulled funds, professional contacts, and personal contacts from three Notion databases, transformed and deduplicated records, and pushed clean organizations and people to Pipedrive.

n8n · Notion API · Pipedrive API · Google Drive
3 databases migrated · zero duplicates · 17 nodes

The problem

Benjamin had years of contacts and fund data spread across three separate Notion databases with inconsistent field names, duplicate entries, and mixed relationship types. Moving to Pipedrive manually would have taken days and introduced errors.

What I built

The workflow reads all three Notion databases in parallel — Funds, Professional Contacts, Personal Contacts. Custom Code nodes transform each schema to Pipedrive's data model. A Merge node consolidates all people records. Deduplication runs before any Pipedrive writes. Organizations are created first, then people are associated to the correct org. A Google Drive attachment handler migrates linked files.

The outcome

Clean, deduped migration completed in one execution. All relationships preserved. No manual cleanup needed on the Pipedrive side.

Node chain

Manual Trigger → Notion: Get Funds DB → Notion: Pro Contacts → Notion: Personal Contacts → Transform Schemas → Merge All People → Remove Duplicates → Push Orgs to Pipedrive → Push People to Pipedrive

Key technical details

Schema transformation: Notion's property format (nested objects with type keys) requires substantial flattening before Pipedrive's flat API can accept it.

Relationship preservation: Fund → Organization mapping maintained so contacts land in the correct Pipedrive org on creation.

File migration: Google Drive export links processed and attached to records.
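The flattening the Code nodes do follows from the Notion API's page-property format, where every property is an object keyed by its type. A hedged sketch covering a few common property types (the selection of types and the fallback behavior are illustrative, not the migration's full transform):

```python
def flatten_notion_properties(props):
    """Flatten Notion's typed property objects into plain key/value pairs."""
    def plain_text(rich):
        # title and rich_text values are arrays of fragments with plain_text
        return "".join(part.get("plain_text", "") for part in rich)

    flat = {}
    for name, prop in props.items():
        kind = prop.get("type")
        if kind == "title":
            flat[name] = plain_text(prop["title"])
        elif kind == "rich_text":
            flat[name] = plain_text(prop["rich_text"])
        elif kind == "email":
            flat[name] = prop.get("email")
        elif kind == "select":
            flat[name] = (prop.get("select") or {}).get("name")
        elif kind == "url":
            flat[name] = prop.get("url")
        else:
            flat[name] = None  # unhandled types fall through explicitly
    return flat
```

Once records are flat, deduplication and the Pipedrive writes can treat all three databases identically despite their inconsistent field names.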

05 / AI Job Matching & Apply System

Upwork AI Scraper Control Panel

n8n control layer for an autonomous Upwork scraper: AI ranks scraped jobs by score, approved jobs are queued for automated application, and Telegram handles real-time approval and confirmation.

n8n · OpenRouter AI · Telegram Bot · Google Sheets · Playwright · Webhooks
AI job scoring · auto-apply pipeline · 35 nodes

The problem

Manually reviewing Upwork job feeds, evaluating fit, writing proposals, and submitting applications across dozens of jobs per day is a full-time job in itself. The goal was to automate the entire pipeline while keeping human judgment in the loop for final approval.

What I built

A Form Trigger gives control (Run, Pause, Status) via a web UI. Scraped jobs are passed to an AI ranking node that scores each job 0-10 for automation relevance, client quality, and opportunity. Jobs scoring ≥5 are written to Google Sheets. Approved jobs trigger HTTP calls to the Playwright-based scraper which handles the actual application. Webhooks return confirmation results from the scraper back to n8n, which logs outcomes and sends a Telegram notification.

The outcome

The full loop — scrape → rank → approve → apply → confirm — runs with one click. Human time is spent only on reviewing AI-ranked candidates, not on finding or applying to them.

Node chain

Form: Control UI → HTTP: Run/Pause/Status → Receive Scraped Jobs → AI: Rank All Jobs → Filter Score ≥ 5 → Write to Sheets → Read Approved Jobs → HTTP: Send to Scraper → Webhook: Confirm → Telegram Notify

Key technical details

Bidirectional control: n8n both controls the external Playwright process via HTTP and receives webhook callbacks from it — acting as the orchestration layer.

AI ranking prompt: structured scoring across three dimensions (automation fit, client quality, opportunity) with specific penalties for low budgets, unverified clients, and disqualifying criteria.

Approval gate: Google Sheets acts as the human-review queue — jobs sit there until manually approved before the apply pipeline fires.
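The three-dimension scoring plus penalties and the ≥5 gate can be sketched as a small scoring function. The penalty amounts and field names below are illustrative stand-ins for the production prompt's values:

```python
def score_job(job):
    """Combine the three AI sub-scores and apply the workflow's penalties.

    Sub-scores (0-10 each) are assumed to come back from the ranking model.
    """
    base = (job["automation_fit"] + job["client_quality"] + job["opportunity"]) / 3
    if job.get("budget", 0) < 100:            # low-budget penalty (illustrative)
        base -= 2
    if not job.get("client_verified", True):  # unverified-client penalty
        base -= 2
    return max(0.0, min(10.0, base))

def approval_queue(jobs, threshold=5):
    """Jobs at or above the threshold go to the Sheets review queue."""
    return [j for j in jobs if score_job(j) >= threshold]
```

Applying penalties after averaging means a job with one strong dimension can't sneak past a disqualifying client signal, which is what keeps the review queue short.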


Technical stack

Tools I ship with.

⚙️n8n
🔄Make.com
Zapier
🐍Python
🤖OpenAI API
🧠Claude API
🔀OpenRouter
🎯HubSpot
📊Pipedrive
📋Airtable
📝Notion API
💬Slack API
📱Telegram Bot
📧Brevo / SendGrid
🔍SerpAPI / Serper
📬Hunter.io
🌐Snov.io
🗄️PostgreSQL
📊Google Sheets
🕷️Playwright
🔗REST APIs
🪝Webhooks

Stop doing it by hand.

If your team is copy-pasting, manually following up, or running the same process every day — that's the workflow I build next.

View Upwork Profile · Email me