Job Description
Key Responsibilities:
- Architect and implement self-supervised learning frameworks (masked prediction, contrastive learning, temporal order prediction, etc.).
- Design and train transformer encoders for high-dimensional time-series data.
- Build data pipelines and augmentation strategies for large, noisy sequential datasets.
- Fine-tune shared backbones for multi-task problems (classification + regression).
- Optimize and export models for edge / mobile inference (TFLite, Core ML, ONNX, quantization).
- Develop and train time-series models (LSTM, GRU, 1D CNN, Transformer) to detect user motion and events.
- Design data preprocessing pipelines including synchronization, normalization, segmentation, and feature extraction.
- Collaborate with researchers to integrate physics-based features into ML model training.
- Perform hyperparameter tuning, validation, and cross-device generalization testing across smart devices.
- Build tools for dataset management, labeling, and feature visualization.
- Maintain documentation, experiment tracking, and reproducibility logs.
Qualifications & Skills:
- B.Tech/M.Tech or Ph.D. in Computer Science or an AI/ML-related field.
- 0-3 years of experience developing ML solutions for time-series or sensor data.
- Hands-on experience with self-supervised or contrastive learning (e.g., SimCLR, MoCo, MAE, BYOL, BERT-style pretraining).
- Strong grounding in time-series / sequential modeling (Transformers, RNNs, 1D CNNs).
- Solid understanding of data preprocessing and normalization for continuous sensor signals.
- Strong proficiency in Python, PyTorch/TensorFlow, NumPy, Pandas, and Scikit-learn.
- Hands-on experience with sequence models (RNN/LSTM/GRU/CNN).
- Familiarity with MLOps tools (Weights & Biases, MLflow) and edge deployment pipelines.
- Strong debugging, version control (Git), and collaborative documentation habits.
Skills Required
Python, PyTorch, TensorFlow, NumPy, Pandas