About the Role
We are seeking a Data Engineer to design, implement, and maintain scalable data pipelines and storage solutions that enable company-wide data-driven decision-making. You'll manage real-time analytics systems, collaborate with cross-functional teams, and ensure efficient, reliable data flow.
Technical Skills
- Programming Languages: Python, Java, or Scala
- Database Technologies: SQL, NoSQL (e.g., MongoDB, Cassandra)
- Data Processing Frameworks: Apache Spark, Hadoop
- Data Warehousing: Amazon Redshift, Google BigQuery, Snowflake
- ETL Tools: Apache Airflow, Apache NiFi, Talend, Informatica
- Cloud Platforms: AWS, Azure, Google Cloud Platform
- Version Control: Git, GitHub, GitLab
Responsibilities
- Design, develop, and maintain scalable data pipelines using Apache Spark and other big data technologies.
- Build and maintain data architectures on Hadoop or similar distributed file systems.
- Collaborate with cross-functional teams to identify, design, and implement data-driven solutions to complex business problems.
- Optimize data systems for maximum performance and scalability.
- Develop and manage real-time analytics systems, ensuring their reliability and performance.
- Propose and refine data architecture to meet evolving business needs.
- Collaborate with Business Intelligence, Ventures, and Data Science teams to ensure their data requirements are met.
- Monitor and troubleshoot data services, resolving any issues that arise.
- Set up real-time analytics solutions tailored to specific services and business demands.
- Ensure highly efficient data pipelines by identifying and fixing performance bottlenecks.
- Design, implement, and maintain data infrastructure to ensure steady, uninterrupted data flow.
Job Requirements
- Bachelor's or Master's degree in Computer Engineering/Science or equivalent experience.
- 2+ years of experience in data engineering or a related field.
- Expertise in designing and maintaining scalable data pipelines and big data systems.
- Proficiency in the Hadoop ecosystem (HDFS, YARN, Hive, Spark).
- Hands-on experience with Kafka and ZooKeeper for data streaming and coordination.
- Strong programming skills in Python, Java, Scala, or Go (minimum 2 years of experience).
- Familiarity with monitoring systems such as Grafana, Prometheus, and exporters.
- Experience working with Linux, virtualization, Docker, and Kubernetes.
- Proven experience in setting up and maintaining real-time analytics and big data systems.
- Hands-on experience with big data technologies such as Pig, Kafka, and NoSQL databases.
- Strong communication skills and ability to work collaboratively in a team environment.
- Excellent problem-solving skills and attention to detail.
- Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI, Metabase).
Bit24 is a knowledge-based company and one of the largest fintech firms specializing in digital currencies and blockchain technology.
Built by a team of the country's top specialists, Bit24 designed and launched its secure, easy-to-use platform to eliminate intermediaries and meet Iranian users' needs for buying, selling, and exchanging digital currencies. As the platform has grown over time, we are proud to lead the digital currency trading market with a user satisfaction rate of over 95 percent.
Bit24 Benefits
- Supplemental health insurance
- Training course allowance
- Loans
- Bonuses
- Flexible working hours
- Breakfast
- Lunch
- Occasional gift packages
- Snacks
- Game room
- Military service (Amriyeh) placement