We train data engineers in the tools and techniques needed to build stable, cutting-edge cloud architectures. For more details on our training, click to download our 2022 Data Engineering Training Brochure.
Our training modules are organized into the two learning journeys below. You can also mix and match modules to create your own custom journey.
Associate Cloud Data Engineer Journey
Gain the fundamentals to design and develop efficient Cloud data pipelines.
- Python & Pandas Fundamentals for Data Engineering
- SQL Literacy
- Cloud Data Engineering Fundamentals
- API Alchemy
- Serverless Data Pipelines
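To give a flavor of what the Associate journey covers, here is a minimal sketch of the kind of transform taught in the Python & Pandas Fundamentals module. The dataset and column names are illustrative, not from a real course exercise.

```python
# Illustrative only: a small pandas transform of the kind covered in
# the "Python & Pandas Fundamentals for Data Engineering" module.
import pandas as pd

# A tiny, made-up orders dataset.
orders = pd.DataFrame({
    "customer": ["ana", "ben", "ana", "cara"],
    "amount": [120.0, 80.0, 45.5, 200.0],
})

# Aggregate total spend per customer, highest first.
totals = (
    orders.groupby("customer", as_index=False)["amount"]
    .sum()
    .sort_values("amount", ascending=False)
)
print(totals.to_string(index=False))
```

The same aggregation is revisited later in the journey in SQL (`GROUP BY` + `ORDER BY`) and at pipeline scale in the cloud modules.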
Advanced Cloud Data Engineer Journey
Learn how to scale, orchestrate, and leverage Cloud technologies, taking your cloud engineering skills to the next level.
- Data Distribution with Apache Spark
- Data Pipeline Orchestration with Apache Airflow
- Stream Processing
- Containerization at Scale with Kubernetes
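As a taste of the Stream Processing module, here is a minimal sketch of tumbling-window aggregation in plain Python. The function name and event data are illustrative; in the course itself this idea is applied with production streaming frameworks rather than hand-rolled code.

```python
# Illustrative only: tumbling-window event counts, the core idea
# behind windowed aggregation in stream processing. Data is made up.
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed-size (tumbling) time window.

    Each event is a (timestamp_seconds, payload) pair; the window key
    is the start time of the window the event falls into.
    """
    counts = Counter()
    for timestamp, _payload in events:
        window_start = timestamp - (timestamp % window_seconds)
        counts[window_start] += 1
    return dict(counts)

events = [(0, "a"), (3, "b"), (7, "c"), (11, "d"), (12, "e")]
windows = tumbling_window_counts(events, 5)  # 5-second windows
print(windows)
```

The same windowing concept carries over directly to the distribution and orchestration modules, where frameworks manage the windows across many machines.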
Design & Solution Proposal
Starting a new project or transforming your existing infrastructure? Our expertise will help you choose a design that will scale and grow with your data needs.
- Document your requirements
- Evaluate solution options
- Frame the problem and the solution
- Propose architecture and design
- Scale and performance plan
- Development plan tailored for your team
Duration: 2 weeks
C-Level Strategic Advisory
Asking the big questions? Building an ambitious startup or transforming your business? We’re here to help you make your decisions with confidence.
- Document your idea
- Staged development plan
- Choice of Cloud partner and services
- Proposed scalable architecture
- Long-term cost & performance optimization of Cloud services
- Advice on product pricing & business strategy
- Future hiring and staffing guidelines
Duration: 4 weeks
Data Pipeline Development
Need help building scalable Cloud data pipelines? We’re here to build and automate. Our experts help you design, test, and deploy with confidence.
- Clear documentation of your requirements
- Development of scalable Cloud data pipelines
- CI/CD workflows
- Automated testing and deployment
- Optimal combination of technologies
- Maintenance plan, runbook, and documentation
Duration: 4-week sprint