Job Description
Position Summary
We are seeking a Senior Full Stack Engineer to lead the development of our core shipping platform and data intelligence engines. In this role, you will architect robust pipelines for automated data acquisition and integrate Generative AI workflows to enrich and classify market data.
You will work heavily with Google Cloud Platform (GCP) to build resilient infrastructure that can handle large-scale data ingestion without compromising performance.
Key Responsibilities
- Scalable Architecture: Design and maintain microservices and event-driven architectures that power our shipping and logistics dashboard.
- Automated Data Acquisition: Build and optimize sophisticated engines for external data ingestion and market intelligence gathering, ensuring reliability and compliant handling of anti-bot protections.
- AI & LLM Integration: Implement AI-driven features using Large Language Models (LLMs) to automate data classification, extract insights from unstructured text, and enhance user workflows.
- GCP Infrastructure Management: Own the deployment and scaling of applications using Cloud Run, App Engine, and GKE.
- Performance Engineering: Optimize database queries and API response times for heavy-load environments.
- Frontend Development: Create responsive, data-rich user interfaces for our clients to manage their shipping operations.
Job Requirements
Qualifications
Required
- 5+ years of full-stack development experience.
- Deep expertise in Google Cloud Platform (Cloud Run, Pub/Sub, Cloud Functions, BigQuery).
- Proven track record in building systems for high-volume data harvesting or web automation (using tools like Puppeteer, Playwright, or Selenium).
- Experience integrating LLMs (e.g., Gemini, OpenAI, Anthropic) into production workflows, utilizing prompt engineering, and working with vector databases or RAG (Retrieval-Augmented Generation) patterns.
- Strong command of Python or Node.js, with a focus on asynchronous programming.
- Proficiency in SQL and NoSQL databases, specifically designing schemas for large datasets.
Preferred
- Experience using frameworks like LangChain or building autonomous AI agents for task automation.
- Background in shipping APIs, logistics logic, or platforms like Shopify/WooCommerce.
- Familiarity with data cleaning pipelines and normalization techniques.
- Experience with Terraform and CI/CD pipelines.