Position
Data Engineer
ventriloc
Remote work, Quebec, CA
Full-time
Company Description
At ventriloc, we develop innovative data solutions for the people responsible for making informed decisions. We firmly believe that the individuals who make up our team are our strength: a professional, highly qualified business intelligence team. By joining our ranks, you will have the opportunity to contribute to the team’s success and play a key role in enhancing the value of our partners’ data.
Learn from our experts
You will collaborate with a team of experts in the industry you are passionate about: business intelligence. You will work on captivating projects that make a real difference for our partners. At ventriloc, you will be surrounded by colleagues who are passionate about what they do and who are not afraid to share their expertise.
The Opportunity
Our data integration service is looking for someone who is passionate about data, who thrives on challenges and is not afraid to think outside the box and work with all kinds of data (structured, semi-structured and unstructured) from various sources (ERP, CRM, API, Web, etc.). If you love developing data integration flows and seize any chance to automate processes, this opportunity is for you.
- Facilitate working sessions with users (clients) to identify their analytical and automation needs;
- Modify or design the necessary data model to support the analytical requirements outlined by the user (star schema, data vault 2.0);
- Implement processes for the extraction, transformation, and loading of complex data (ETL, ELT, or ELTL);
- Develop internal and external tools to streamline data management;
- Implement automation processes in Python for our clients;
- Prepare and integrate data feeding analytical solutions using the Microsoft Azure platform (Azure Data Factory/Azure Synapse/Azure Databricks/Azure SQL Database/Azure Storage);
- Document the implemented data transformation rules;
- Collaborate with users to prepare prototypes of analytical solutions and iteratively improve them;
- Meet the delivery deadlines agreed upon with the client and team members.
Required profile
- Bachelor’s degree in Information Technology, Software Engineering, Computer Science, or a relevant field for the role;
- Minimum of 2 years of experience in implementing data extraction, transformation, and loading (ETL, ELT, or ELTL) processes;
- Demonstrated experience with databases and SQL language;
- Proven experience in Python programming;
- Excellent oral and written communication skills, including the ability to write technical reports delivered to partners;
- Strong analytical and synthesis skills, results-oriented, and customer service-oriented.
Assets for your application:
- Knowledge of the Microsoft Azure data platform (Azure Data Factory/Azure Synapse/Azure Functions/Databricks/Azure SQL Database/Azure Storage);
- Expertise in data modeling for a data warehouse (e.g., multidimensional (star schema), Data Vault 2.0);
- An active DP-203 (Data Engineering on Microsoft Azure) certification.
Your benefits
- A competitive salary commensurate with your skills and expertise;
- Incentives linked to training;
- 5 paid personal days off (e.g., illness);
- A flexible work schedule allowing for a work-life balance;
- 100% telecommuting (with occasional gatherings!);
- 50% coverage of the organization’s group insurance program fees;
- $500 per year for a health/wellness program via the Tedy platform;
- Access to a telemedicine program (Telus Health);
- A paid day off on your birthday;
- Unlimited access to all Hedhofis office locations;
- A covered BYOD (Bring Your Own Device) cell phone plan;
- New computer equipment to start your job;
- Option to contribute to Fondaction through salary deductions;
- A dynamic work environment with people who are already excited to work with you!
Ready to take your career to another level?
We are convinced that you are and we look forward to meeting you!