Join our team
Build sovereign AI for critical infrastructure
Research mentorship for MSc & PhD students
Join our mentorship program focused on generative AI research. Work on small language models, mechanistic interpretability, and domain-specific ML topics with compute access and hands-on guidance.
The Program
A hands-on mentorship program for MSc and PhD students interested in generative AI research. You’ll work on real problems — small language models, mechanistic interpretability, and domain-specific ML — with compute access and direct guidance from our research team.
What You’ll Work On
• Small language model training and evaluation
• Mechanistic interpretability experiments
• Domain-specific fine-tuning for energy applications
• Agent architecture design and benchmarking
What You Bring
• Active MSc or PhD student in ML, NLP, or a related field
• Familiarity with PyTorch and transformer architectures
• Strong motivation to publish and ship research
What We Provide
• Compute access for experiments
• Weekly 1:1 mentorship sessions
• Co-authorship on publications
• Potential path to a full-time role
Fill out our application form
ML Engineer
Own training and inference infrastructure for 1B+ parameter models. Build distributed training pipelines with FSDP, DeepSpeed, and work directly with research to turn architecture ideas into experiments.
The Role
We're looking for an ML engineer who has trained and served language models before. You'll own our training and inference infrastructure, from cluster setup to distributed training and model serving.
What You’ll Do
• Train, post-train, and iterate on 1B+ parameter models across multi-GPU clusters
• Build and optimize distributed training and inference infrastructure (FSDP, DeepSpeed, llm-d)
• Work directly with research to turn architecture ideas into running experiments
What You Bring
• Hands-on experience training language models
• Strong PyTorch skills; familiarity with distributed training frameworks
• Comfort with Linux and cluster management
• Background in HPC or cloud infrastructure (on-prem, AWS, GCP)
Nice to Have
• Experience with MoE architectures and sparse models
• Contributions to open-source ML training tools
Why EnergyAI
• Equity in the company and a competitive salary
• Access to local and cloud compute
• Experiments with direct business impact
• A startup building sovereign AI for critical infrastructure
Fill out our application form
Don't see your role?
We're always interested in hearing from exceptional people. If you think you can contribute to what we're building, reach out.