Hiring: Senior Data Engineer
Job Description
About First Source Arya Solutions
First Source Arya Solutions is a software development company specialising in the global financial industry. We help our international clients grow their businesses with customer-centric products and services.
Our company offers a work environment that fosters personal and professional growth. A career with us is an opportunity to make an impact in a fast-growing, global organisation.
The team
We want to build and deliver more valuable products and services to cater to our consumers’ needs. To do this, we collect relevant data and analytics to learn more about them. What we do is critical to making smart marketing decisions, optimising our business, and boosting profitability. We lead the organisation in cultivating a data-driven culture as the organisation advances — we are the Business Intelligence team.
The role
As a Senior Data Engineer at First Source, you'll be in charge of gathering, managing, and transforming raw data into useful information. Your expertise will help us evaluate and improve our performance. You’ll be actively involved in developing, testing, and maintaining data processing architectures and building pipelines for Extract, Transform, and Load (ETL) operations. You'll be our data custodian, taking charge of the accuracy and quality of existing and new data.
Challenges
● Acquire data from various sources and ensure it is ingested efficiently for the organisation’s future use.
● Address the organisational needs for accurate, relevant, and up-to-date data by converting raw data into an easy-to-understand format for analysis and reporting purposes.
● Create new data values, convert them into useful information, and design solutions.
● Build and maintain the organisation’s database, which includes tasks such as designing, processing, and analysing data, as well as data flow optimisation.
● Oversee the pipeline architecture, including tasks such as addressing logging issues, testing, database administration, and maintaining a stable pipeline (a minimal pipeline sketch follows this list).
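To give candidates a concrete picture of this pipeline work, here is a minimal sketch of a daily ETL run expressed as an Airflow DAG. It is illustrative only: the DAG name, task names, and the extract/transform/load callables are hypothetical placeholders, not a description of our actual stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


# Hypothetical callables standing in for real extract/transform/load logic.
def extract(**context):
    ...  # e.g. pull raw records from a source system into a staging area


def transform(**context):
    ...  # e.g. clean and reshape staged data into an analysis-ready format


def load(**context):
    ...  # e.g. write the transformed data into the warehouse


with DAG(
    dag_id="daily_etl_example",       # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # run the steps in order
```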
Minimum requirements
● Knowledge of a variety of data modelling techniques such as Kimball star schema, Anchor modelling, and Data vault
● Strong skills in Python and similar object-oriented or functional scripting languages
● Understanding of relational (SQL) and NoSQL databases, especially PostgreSQL, including PITR, pg_basebackup, WAL archiving, and replication
● Proven track record in building and maintaining ETL/ELT data pipelines and workflow management solutions like Airflow
● Exposure to Google Cloud Platform (GCP) services such as Google Compute Engine (GCE), BigQuery, Dataflow, and Cloud Functions (see the BigQuery sketch after this list)
● Strong analytical mindset with the capability to translate data into usable information for corporate decision-making
● Practical experience in assisting teams to make data-driven decisions for the organisation
● Effective communication and presentation skills
● Good working knowledge of both spoken and written English
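As a rough illustration of the GCP exposure listed above, the sketch below runs a query with the google-cloud-bigquery Python client, assuming credentials are already configured in the environment. The project, dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

# Assumes credentials are available, e.g. via GOOGLE_APPLICATION_CREDENTIALS.
# "example-project" and the table referenced below are hypothetical placeholders.
client = bigquery.Client(project="example-project")

query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `example-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY order_count DESC
    LIMIT 10
"""

# Submit the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.customer_id, row.order_count)
```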
Preferred experience
● Broad awareness of cybersecurity and data protection
● Proficiency in Luigi and other similar data pipeline and workflow management tools (a brief Luigi sketch follows this list)
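For comparison with the Airflow example above, here is a minimal sketch of two dependent Luigi tasks. The task classes, date parameter, and output paths are made up for illustration.

```python
import luigi


class ExtractOrders(luigi.Task):
    """Hypothetical extract step that writes a local file."""
    date = luigi.DateParameter()

    def output(self):
        return luigi.LocalTarget(f"data/orders_{self.date}.csv")

    def run(self):
        with self.output().open("w") as out:
            out.write("customer_id,amount\n")  # stand-in for real extraction


class TransformOrders(luigi.Task):
    """Hypothetical transform step that depends on the extract step."""
    date = luigi.DateParameter()

    def requires(self):
        return ExtractOrders(date=self.date)

    def output(self):
        return luigi.LocalTarget(f"data/orders_clean_{self.date}.csv")

    def run(self):
        with self.input().open() as src, self.output().open("w") as out:
            out.write(src.read())  # stand-in for real transformation


if __name__ == "__main__":
    # Example: python luigi_example.py TransformOrders --date 2024-01-01 --local-scheduler
    luigi.run()
```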
Perks and benefits
● Growth-inducing challenges
● Productive work atmosphere
● Cooperation, support, and empowerment
● Career progression opportunities
● Competitive salary
● Annual performance bonus
● Health benefits
Required skills
- Data engineer
- SQL
- ETL
- Python
Required languages
- English
Minimum work experience
- Three to six years
Gender
- No preference
Military service status
- Completed service or permanent exemption