Hiring: DataOps Engineer
Job Description
In the story of Snappfood, we believe in creating value that goes beyond the ordinary. We strive to set innovative trends and are eager to have you on our team to help us overcome our business challenges with creativity, intelligence, and agility.
We are waiting for you to continue this story.
DataOps Engineer is a specialized role focused on the design, management, and optimization of data pipelines and services within an organization. This role is pivotal in ensuring the smooth operation of data workflows, from ingestion and processing to storage and access. DataOps Engineers are responsible for maintaining and enhancing data pipelines, troubleshooting issues, and ensuring reliable data delivery and performance.
Requirements:
• Kubernetes: Experience with Kubernetes for container orchestration, including deploying and managing containerized applications and services.
• Linux Expertise: Extensive knowledge of Linux operating systems and command-line tools. Proficient in system administration, including storage and network environments.
• Configuration Management and Infrastructure as Code: Hands-on experience with tools like Ansible and Terraform.
• Log Management: Experience with log tracing and analysis to identify and resolve system and data issues (e.g., Elasticsearch, Fluentd, Kibana).
• Monitoring and Alerting Tools: Experience with monitoring tools and platforms for tracking the health and performance of data pipelines and services (e.g., Grafana and Prometheus stack).
• Data Pipeline Tools: Familiarity with data pipeline orchestration tools (e.g., Apache Airflow) and workflow automation tools (e.g., GitLab CI/CD, ArgoCD); a minimal orchestration sketch follows this list.
• Batch and Stream Processing: Familiarity with batch and stream processing systems (e.g., Apache Spark, Apache NiFi).
• Programming Languages: Proficiency in Python.
• Database Systems: Familiarity with relational databases (e.g., PostgreSQL, MySQL, MSSQL), OLAP databases (e.g., ClickHouse), and data lake table formats (e.g., Apache Iceberg).
• Event Streaming: Familiarity with Kafka or other event streaming platforms.
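To make the orchestration requirement above concrete, here is a minimal sketch of a daily extract-and-load DAG in Python, the language this role calls for. It assumes Airflow 2.x; the DAG id, task names, and schedule are hypothetical, invented purely for illustration, and do not describe Snappfood's actual pipelines.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(ds, **kwargs):
    # Placeholder extract step: a real task might pull from Kafka or MySQL.
    print(f"extracting orders for {ds}")


def load_aggregates(ds, **kwargs):
    # Placeholder load step: a real task might batch-insert into ClickHouse.
    print(f"loading aggregates for {ds}")


with DAG(
    dag_id="orders_daily_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_aggregates", python_callable=load_aggregates)

    extract >> load  # run extract before load

The retry settings in default_args reflect the kind of reliability work the role description mentions: keeping data delivery dependable when individual tasks fail.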
Benefits:
• Vouchers for vacation, gym, therapy sessions, and internet costs
• Complementary Insurance
• Educational platform of advanced courses
• Snappfood’s Discount codes
• Loans
Required Skills
- Linux
- Python
- MySQL
Minimum Work Experience
- 3 to 6 years
Gender
- No preference
Military Service Status
- No preference