Vacancy description
Cloudfresh ⛅️ is a global Google Cloud Premier Partner, Zendesk Premier Partner, Asana Solutions Partner, GitLab Select Partner, Microsoft Partner, and Okta Partner.
At Cloudfresh, we put ourselves in your shoes and act as a vital link between your company and cloud technologies.
More about us here — https://bit.ly/3cj86Tb
And here — https://bit.ly/3PErkB0
We are currently looking for a Data Engineer to join our team.
As a Data Engineer/Data Analyst, you will play a pivotal role in transforming raw data into valuable insights. You will be responsible for designing and maintaining data infrastructure, performing data analysis, and generating actionable reports to support business operations and strategic planning. You will work closely with cross-functional teams, including data scientists, business analysts, and IT professionals, to ensure data integrity and accessibility.
The ideal candidate will be passionate about data and possess excellent analytical, problem-solving, and communication skills.
Requirements:
- B2+ English;
- Proven experience as a Data Engineer, Data Analyst, or in a similar role, with a minimum of 6 months of relevant work experience;
- Programming skills in Python;
- Experience with Big Data technologies such as Hadoop, Spark, and Kafka;
- Familiarity with cloud-based data storage and processing platforms such as AWS, Azure, or Google Cloud;
- Knowledge of database technologies such as SQL, NoSQL, and data warehousing;
- Proficiency in data manipulation and analysis tools (e.g., Python, SQL, R);
- Knowledge of data visualization tools (e.g., Looker, Tableau, Power BI);
- Strong problem-solving and analytical skills;
- Excellent communication and collaboration skills.
Responsibilities:
- Data Collection and Integration: Collect, clean, and preprocess data from various sources to ensure data quality and consistency. Integrate data from structured and unstructured sources into our data warehouse.
- Data Infrastructure: Design, develop, and maintain data pipelines, ETL processes, and data models. Ensure data storage, retrieval, and access mechanisms are efficient and scalable.
- Data Analysis: Perform exploratory data analysis to discover trends, patterns, and anomalies. Generate descriptive and diagnostic reports to provide insights to the business.
- Data Visualization: Create visualizations and dashboards using tools like Tableau, Power BI, or custom solutions. Present data insights to non-technical stakeholders in a clear, accessible way.
- Data Governance: Implement and maintain data governance best practices, including data security and compliance. Ensure data accuracy, consistency, and availability for internal and external users.
- Collaboration: Collaborate with data scientists and analysts to identify opportunities for data-driven decision-making. Work with IT teams to optimize data infrastructure.
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay up to date with industry trends and best practices in data engineering.
Would be a plus:
- Programming skills in Java or Scala;
- Bachelor’s degree in Computer Science, Engineering, or a related field.
Work conditions:
- Fully remote, or an office in the center of Kyiv or Prague.
- An opportunity to work with the world’s best products from Google, Zendesk, Asana, GitLab, Microsoft, and Okta.
- Competitive salary and a transparent bonus system.
- Official employment under the Labor Code, with vacation and sick leave.
- Flexible working hours: 9:00 a.m. to 6:00 p.m. or 10:00 a.m. to 7:00 p.m.
- Educational sponsorship: the company covers courses, conferences, meet-ups, and other educational opportunities.
- The opportunity to work in an international IT company that has been growing for five years in a row.