Deploy ML Model on AWS with Time Series CSV Training & Inference Pipeline


Hourly: $25.00 - $52.00

Project Overview: I'm looking for an experienced AWS + Machine Learning + DevOps engineer to build a fully automated ML pipeline on AWS. The project involves loading a machine learning model (of my choice), training it on time series CSV data, running inference via API, and automating infrastructure provisioning with Terraform.

Project Goals:
- Infrastructure-as-Code (IaC): Write Terraform scripts that deploy all resources needed for the ML pipeline, so that terraform apply spins up the full pipeline and terraform destroy tears it down.
- Model Hosting: Set up an AWS environment (SageMaker, S3, Lambda, EC2, or an appropriate combination) that loads a user-specified ML model (PyTorch, TensorFlow, etc.).
- Data Pipeline: Accept time series data via CSV uploads to S3, and trigger the training pipeline automatically or via API/CLI.
- Model Training: Set up a training job (SageMaker preferred) using the uploaded CSV files, with support for model re-training and versioning.
- Inference Endpoint: Provide a secure endpoint (API Gateway + Lambda or a SageMaker endpoint) for inference on new data inputs, returning predictions in a clean format (e.g., JSON).
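To illustrate the intended S3-to-training flow, here is a minimal sketch of a Lambda-style handler that turns an S3 CSV upload event into a SageMaker CreateTrainingJob request. This is only an assumed shape for the pipeline: the role ARN, container image URI, output bucket, instance type, and job-name prefix are all placeholders, not provided values.

```python
# Sketch of an S3-triggered training Lambda. All ARNs, URIs, and names below
# are placeholders (assumptions), not values supplied by the client.
import datetime
import json


def build_training_job_config(s3_event, role_arn, image_uri,
                              output_s3_uri, job_prefix="ts-model"):
    """Build a SageMaker CreateTrainingJob request from an S3 put event.

    Returns a dict suitable for boto3's
    sagemaker_client.create_training_job(**config).
    """
    record = s3_event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # A timestamped job name doubles as a simple model version.
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    return {
        "TrainingJobName": f"{job_prefix}-{stamp}",
        "RoleArn": role_arn,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,
            "TrainingInputMode": "File",
        },
        "InputDataConfig": [{
            "ChannelName": "training",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": f"s3://{bucket}/{key}",
            }},
            "ContentType": "text/csv",
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "ResourceConfig": {
            "InstanceType": "ml.m5.large",  # placeholder sizing
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }


def handler(event, context):
    # In the deployed Lambda this config would be passed to
    # boto3.client("sagemaker").create_training_job(**config); here we
    # only build and return it.
    config = build_training_job_config(
        event,
        role_arn="arn:aws:iam::123456789012:role/sagemaker-exec",  # placeholder
        image_uri="<training-image-uri>",                          # placeholder
        output_s3_uri="s3://my-model-artifacts/",                  # placeholder
    )
    return {"statusCode": 200, "body": json.dumps(config["TrainingJobName"])}
```

Keeping the request-building logic in a pure function like this makes the trigger easy to unit test without touching AWS.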
Deliverables:
- Terraform scripts that can provision all AWS resources, configure access roles and policies, and destroy the environment cleanly.
- End-to-end ML pipeline: model deployment, a training trigger on CSV upload, and inference via REST API.
- Documentation covering how to use the Terraform scripts, the model training workflow, and inference usage (with an example request/response).

Preferred Skills:
- AWS (SageMaker, Lambda, S3, API Gateway, IAM, EC2)
- Terraform (modules, outputs, variables, state handling)
- ML frameworks: PyTorch / TensorFlow
- Python for data handling and model logic
- MLOps and secure cloud practices
- Time series ML experience is a bonus

What I Will Provide:
- Model architecture or a saved model file
- Sample CSV training and inference data
- Example expected outputs

Optional (Bonus Points):
- Lightweight frontend to upload CSVs and view predictions
- CI/CD workflow using GitHub Actions or similar
- Logging and monitoring via CloudWatch
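As a sketch of the "inference usage with example request/response" deliverable, here is a hypothetical client for the REST endpoint. The endpoint URL, the request body shape ({"instances": [...]}), the response shape ({"predictions": [...]}), and the x-api-key header are assumptions about the eventual API contract, not a specification.

```python
# Hypothetical inference client. The URL, payload shape, and response shape
# are assumptions about the API contract, not provided by the client.
import csv
import io
import json
import urllib.request


def csv_to_payload(csv_text):
    """Convert time series CSV rows into an assumed JSON request body."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)  # e.g. timestamp,value
    instances = [dict(zip(header, row)) for row in reader]
    return {"instances": instances}


def predict(api_url, csv_text, api_key=None):
    """POST the payload to the endpoint and return the parsed JSON response."""
    body = json.dumps(csv_to_payload(csv_text)).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if api_key:  # e.g. an API Gateway API key, if the endpoint uses one
        headers["x-api-key"] = api_key
    req = urllib.request.Request(api_url, data=body, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())  # e.g. {"predictions": [...]}
```

A call like predict("https://example.execute-api.us-east-1.amazonaws.com/prod/infer", csv_text) would then return the predictions as a JSON-decoded dict, matching the "clean format" requirement above.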

Keyword: Data Cleaning


Skills: Data Science, Python, TensorFlow, Machine Learning, Amazon Web Services

 
