**Description**:
**What You Will Do**
- Design, deploy, and maintain our cloud-based infrastructure on Google Cloud Platform (GCP)
- Manage and optimize data pipelines using Dataflow, ensuring efficient data processing and integration (a minimal pipeline sketch follows this list)
- Work with BigQuery for data analysis and reporting, ensuring data accuracy and availability
- Monitor system performance and troubleshoot issues related to logging, monitoring, and alerting tools
- Implement load balancing strategies to ensure high availability and optimal resource utilization
- Utilize Cloud SQL and Redis for efficient and secure data storage and caching
- Set up networking configurations, including VPCs and firewalls, to ensure secure communication between services
- Manage and optimize the usage of Google Pub/Sub, Bigtable, and API management tools
- Utilize Cloud Scheduler to automate tasks and optimize system efficiency
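
For illustration only: Dataflow pipelines like those described above are typically written with the Apache Beam SDK. The sketch below is a minimal, locally runnable Beam pipeline; the sample records, step names, and the DirectRunner setting are placeholders and not a description of our production pipelines.

```python
# Minimal Apache Beam sketch (illustrative only; data and sinks are placeholders).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner for local testing; a real job would target the DataflowRunner on GCP.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read events" >> beam.Create(["gcp,1", "gcp,2", "aws,3"])  # stand-in for a real source
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "To key-value" >> beam.Map(lambda parts: (parts[0], int(parts[1])))
            | "Sum per key" >> beam.CombinePerKey(sum)
            | "Print" >> beam.Map(print)  # stand-in for a BigQuery or Cloud Storage sink
        )


if __name__ == "__main__":
    run()
```
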
**What You Will Need**
- 2-4 years of professional experience in DevOps/DataOps or cloud infrastructure management
- Strong proficiency in GCP services, including Dataflow, BigQuery, Cloud Storage, Compute Engine, Kubernetes, Monitoring, Logging, Alerting, Load Balancing, Cloud SQL, Redis, Networking, Pub/Sub, Bigtable, and API management (a short Pub/Sub sketch follows this list)
- Experience with orchestration tools like Cloud Composer and deployment using Cloud Functions
- Familiarity with AI Platform Notebooks and with containerization workflows built on container registries
- Proficiency in setting up and managing CI/CD pipelines using tools like Jenkins and Docker
- Hands-on experience with infrastructure-as-code (IaC) tools such as Terraform and Ansible
- Familiarity with version control systems, particularly Git and GitHub
- Knowledge of monitoring and metrics tools like StatsD
- Strong problem-solving skills and ability to troubleshoot complex technical issues
- Excellent communication and teamwork skills, with the ability to collaborate effectively across teams
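
Again for illustration only: much of the day-to-day glue between these services is scripted with the Google Cloud client libraries. The sketch below publishes a message to a Pub/Sub topic; the project ID, topic ID, and payload are hypothetical placeholders.

```python
# Illustrative Pub/Sub publish sketch; project and topic names are placeholders.
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"   # assumption: replace with a real project ID
TOPIC_ID = "deploy-events"  # assumption: replace with a real topic

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# publish() returns a future that resolves to the server-assigned message ID.
future = publisher.publish(topic_path, b'{"service": "api", "status": "deployed"}')
print("Published message", future.result())
```
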
**IMPORTANT**: