NoFluffJobs · Remote work · Expert · New

Data analyst

apreel Sp. z o.o.

Warsaw, Wrocław

23 520 - 26 880 PLN (B2B)

Requirements

  • HTML
  • Data analysis
  • Data models
  • Storage
  • Data modelling
  • Data engineering
  • Python
  • Spark
  • Shell
  • SQL
  • SQL Server
  • Neo4j
  • Snowflake
  • PostgreSQL
  • Data structures
  • Product design
  • Oracle
  • Cloud
  • BigQuery
  • Redshift
  • Azure Cloud
  • Data Lake
  • Azure SQL
  • Operating system
  • Project management (nice to have)
  • Data visualization (nice to have)
  • Tableau (nice to have)
  • BI (nice to have)

Job description

About the project:

Offer:
  • Location: 100% remote; equipment pick-up in Warsaw or Wrocław
  • Start: ASAP
  • Employment: B2B contract with apreel
  • Rate: up to 160 PLN/h + VAT

Requirements:

Skills and attributes for success:
  • Highly skilled in data modelling: experience developing data models from scratch for greenfield projects in multiple domains; deep understanding of data warehousing concepts, dimensional modelling, and normalization/denormalization techniques; expertise in tools such as Erwin Data Modeler, PowerDesigner, or similar.
  • Knowledge of data products: strong understanding of data product design principles and lifecycle.
  • Strong SQL skills and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL) and cloud databases (e.g., Snowflake, BigQuery, Redshift).
  • Good understanding of Azure cloud data services (Data Lake, Data Factory, Azure SQL).
  • Problem-solving: adept at tackling complex issues and finding effective solutions.
  • Curiosity and self-starter attitude: eager to learn and take initiative without needing constant guidance.
  • Comfortable with ambiguity: able to work efficiently even when the answers are not immediately clear.
  • Effective communication: excellent at conveying complex ideas and collaborating with stakeholders.
  • Experience with databases and data formats: familiar with various databases, operating systems, file types, and data formats.
  • Experience with different data roles (analysis, modelling, science, etc.).

Preferred skills:
  • Advanced data modelling techniques: experience with advanced modelling.
  • Business analysis expertise: ability to bridge the gap between technical and business requirements.
  • Exposure to programming: skilled in Python, Spark, and SQL.
  • Project management: skills in managing projects, timelines, and deliverables.
  • Data visualization tools: proficiency with tools such as Tableau, Power BI, or similar.
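The dimensional-modelling skills the role asks for can be illustrated with a minimal star schema. This is a hedged sketch, not anything from the posting itself: all table and column names (`dim_company`, `fact_financials`, etc.) are hypothetical, and SQLite stands in for the enterprise databases listed above.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# All names and data are illustrative, not taken from the job posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_company (
    company_key INTEGER PRIMARY KEY,
    ticker      TEXT NOT NULL,
    name        TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- e.g. 20240331
    fiscal_year INTEGER NOT NULL,
    quarter     INTEGER NOT NULL
);
CREATE TABLE fact_financials (
    company_key INTEGER REFERENCES dim_company(company_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    revenue     REAL,
    net_income  REAL
);
""")

cur.execute("INSERT INTO dim_company VALUES (1, 'ACME', 'Acme Corp')")
cur.execute("INSERT INTO dim_date VALUES (20240331, 2024, 1)")
cur.execute("INSERT INTO fact_financials VALUES (1, 20240331, 1000.0, 120.0)")

# A typical star-schema query: join the fact table to its dimensions.
row = cur.execute("""
    SELECT c.ticker, d.fiscal_year, d.quarter, f.revenue
    FROM fact_financials f
    JOIN dim_company c ON c.company_key = f.company_key
    JOIN dim_date    d ON d.date_key    = f.date_key
""").fetchone()
print(row)  # ('ACME', 2024, 1, 1000.0)
```

The same structure scales from this toy case to the conceptual/logical/physical models the role covers; the denormalized dimensions are what distinguish a star schema from a normalized transactional design.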
Daily tasks:
  • Data extraction and standardization: lead initiatives to extract and standardize financial data from various formats, including PDF, HTML, XBRL, and iXBRL, ensuring data accuracy and consistency.
  • Mentorship and development: provide guidance and support to junior analysts, fostering their growth in data analysis, programming, and statistical methodologies.
  • Exploratory data analysis: conduct exploratory data analysis to identify trends, raise important questions, and derive actionable insights.
  • Data model development: design, implement, and optimize conceptual, logical, and physical data models for enterprise-scale data products. Develop and maintain data models using ERD diagrams, and manage data dictionaries for transactional, star, and flat schemas across different storage structures.
  • Data model democratization: partner with data engineering teams to democratize the data model for designing efficient data pipelines.
  • Data modelling standards: define and enforce data modelling standards and best practices. Conduct data analysis to validate compliance with modelling standards and model accuracy, identify anomalies, and ensure data quality.
  • Data classification and taxonomies: design custom taxonomies and reference data classification methods/structures.
  • Programming and automation: use programming languages such as Python, Spark, regex, shell scripts, and SQL for data manipulation, analysis, and automation of processes, including meta-programming and dynamic code generation.
  • Database management: manage and optimize databases (SQL Server, Neo4j, Snowflake, PostgreSQL), understanding join types, aggregate functions, and data storage formats (Parquet, Avro, Delta).
  • Collaboration: collaborate with product managers, data engineers, and analysts to translate business requirements into robust data structures. Work closely with cross-functional teams to address data quality issues and implement effective solutions, promoting a cult
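The extraction-and-standardization task combined with the regex skill mentioned above might look something like the following sketch. The sample strings, the `mPLN` unit, and the pattern are all assumptions made up for illustration; real filings in PDF/XBRL would need format-specific parsers first.

```python
import re

# Hedged sketch: normalizing currency figures pulled from text-like
# filing fragments into floats. Samples and unit are illustrative only.
SAMPLES = [
    "Revenue: 1,234.5 mPLN",
    "Net income was 87.0 mPLN in Q1",
]

# Match amounts with optional thousands separators, e.g. "1,234.5 mPLN".
AMOUNT_RE = re.compile(r"([\d,]+\.\d+)\s*mPLN")

def extract_amounts(text: str) -> list[float]:
    """Strip thousands separators and return the amounts as floats."""
    return [float(m.replace(",", "")) for m in AMOUNT_RE.findall(text)]

amounts = [a for s in SAMPLES for a in extract_amounts(s)]
print(amounts)  # [1234.5, 87.0]
```

In practice the same standardization step would sit behind whichever extractor handles each source format (PDF, HTML, XBRL, iXBRL), so the downstream pipeline sees one consistent numeric representation.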