Data Engineer


Remote work, Quebec, CA
Full-time job

Company Description

At ventriloc, we develop innovative data solutions for the people responsible for making informed decisions. We firmly believe that the individuals who make up our team are our strength: a professional, highly qualified team in business intelligence. By joining our ranks, you will have the opportunity to contribute to the team’s success and play a key role in enhancing the value of our partners’ data.

Learn from our experts

You will have the chance to collaborate with a team of experts in the industry you are passionate about: business intelligence. You will work on captivating projects that make a real difference for our partners. At ventriloc, you will be surrounded by colleagues who are passionate about what they do and who are not afraid to share their expertise.

The Opportunity

Our data integration service is looking for someone who is passionate about data, thrives on challenges, and is not afraid to think outside the box and work with all kinds of data (structured, semi-structured and unstructured) from various sources (ERP, CRM, API, Web, etc.). If you love developing data integration flows and seize every chance to automate processes, this opportunity is for you.

As a Data Engineer, you will have the responsibilities described below:
  • Write analytical architecture plans and implement proposed solutions for clients;
  • Conduct training and knowledge transfer for ventriloc and client resources;
  • Facilitate working sessions with users (clients) to identify their analytical needs;
  • Document analytical needs according to Agile project management principles;
  • Develop a good understanding of the business context in which analytical solutions will be used;
  • Have an excellent understanding of the analytical or decision-making process followed by users of analytical applications;
  • Document the data transformation rules used in the solutions;
  • Modify or design the data model required to support the analytical needs stated by the user (star schema, data vault 2.0);
  • Set up processes for extracting, transforming and loading (ETL or ELT or ELTL) complex data;
  • Develop internal and external tools to facilitate data management;
  • Implement robotic process automation (RPA) via the Power Automate platform or custom code (Python) in our partners’ environments;
  • Set up data extraction processes using different APIs to store or transfer data between systems;
  • Prepare and integrate data for analytical solutions using the technology chosen by our customers or ventriloc (Azure Data Factory/Azure Synapse/SQL/Databricks);
  • Master Microsoft Azure data storage platforms and recommend the right platforms to customers according to their needs;
  • Implement data warehouses in our customers’ environments based on different types of technological architecture;
  • Estimate the effort and time required to complete the design, development and testing of the analytical solutions to be implemented;
  • Meet deliverable deadlines as agreed with the client and team members;
  • Keep abreast of evolving technologies related to the design and implementation of analytical solutions;
  • Follow ventriloc’s administrative processes, such as submitting weekly timesheets, expense reports and vacation requests.
Depending on the evolution of the organization and your career path, this list of responsibilities may change. It should not be interpreted as exhaustive and may be modified at any time.

Required profile

To fulfill these responsibilities, you must have the following skills:
  • Bachelor’s degree in information technology, software engineering, computer science or a field relevant to the function;
  • Master’s degree in business intelligence (important asset);
  • Minimum of 2 years of experience in business intelligence;
  • Minimum of 2 years of experience in developing extraction, transformation and loading processes (ETL or ELT or ELTL);
  • Advanced knowledge of the Microsoft Azure platform (Azure Data Factory/Azure Synapse/Databricks);
  • Proven experience in process automation (RPA) via Power Automate platform or custom code;
  • Advanced knowledge of databases and SQL language;
  • Advanced programming knowledge (Python, C#);
  • Advanced knowledge of data modeling of a data warehouse (e.g. Multidimensional (star schema), data vault 2.0);
  • Active DP-203 certification (Data Engineering on Microsoft Azure) (asset);
  • Excellent oral and written communication, including the writing of technical reports to be delivered to partners;
  • Great intellectual curiosity and on the lookout for the latest trends in business intelligence;
  • Good analytical and synthesis skills, results-oriented with a focus on customer service;
  • Strong influencing and decision-making skills;
  • Adaptability and good change management;
  • Rigor, concern for the quality of the work delivered and open-mindedness.

Your Benefits

  • A competitive salary reflecting your skills and expertise;
  • Training incentives;
  • 5 days of paid leave for personal reasons (e.g. illness);
  • A flexible work schedule allowing you to balance work and family;
  • 100% remote work (we still set aside time for everyone to get together!);
  • Payment of 50% of the costs associated with the organization’s group insurance program;
  • $500 per year toward a health/wellness program via the Tedy platform;
  • Access to a telemedicine program (Telus Health);
  • A paid day off on your birthday;
  • The option to join the company’s phone program for access to a cellular plan at advantageous rates;
  • New computer equipment to start your job;
  • The option to contribute to Fondaction via payroll deduction;
  • A dynamic working atmosphere with people who already want to work with you!

Ready to take your career to another level?

We are convinced that you are and we look forward to meeting you!