Cogeco

Specialist, Data Engineering

Quincy, MA | Full time

Our culture lifts you up—there is no ego in the way. Our common purpose? We all want to win for our customers. We aim to always be evolving, dynamic, and ambitious. We believe in the power of genuine connections. Each employee is a part of what makes us unique on the market: agile and dedicated.

Time Type:

Regular

Job Description:

Key Responsibilities

1. Data Integration & Architecture

○ Develop and orchestrate data pipelines for ingestion from various sources (e.g., MySQL, Oracle, PostgreSQL, flat files, etc.) into a cloud-based environment, and move data between multiple systems based on business needs and requirements.

○ Collaborate with Data Analysts and Data Architects on defining data models, requirements, and architecture for optimal performance in databases (e.g., BigQuery or other cloud-based relational databases).

○ Ensure robust ETL/ELT processes that support scalability, reliability, and efficient data access.

2. Data Governance & Classification

○ Implement and maintain data governance frameworks and standards, focusing on data classification, lineage, and documentation.

○ Utilize Collibra or similar platforms to manage data catalogs, business glossaries, and data policies.

○ Work closely with stakeholders to uphold best practices for data security, compliance, and privacy.

3. Process Improvement & Automation

○ Identify, design, and implement process enhancements for data delivery, ensuring scalability and cost-effectiveness.

○ Automate manual tasks using scripting languages (e.g., Bash, Python) and enterprise scheduling/orchestration tools like Airflow.

○ Conduct root cause analysis to troubleshoot data issues and implement solutions that enhance data reliability.

4. Cross-Functional Collaboration

○ Partner with cross-functional teams (IT, Analytics, Data Science, etc.) to gather data requirements and improve data-driven decision-making.

○ Provide subject matter expertise on cloud data services, data classification standards, and governance tools.

○ Monitor and communicate platform performance, proactively recommending optimizations to align with organizational goals.

Skills & Qualifications

● Technical Expertise

○ Experience with at least one major cloud platform (AWS, Azure, GCP), with GCP exposure considered a significant asset.

○ Strong understanding of RDBMS (PostgreSQL, MySQL, Oracle, SQL Server) with the ability to optimize SQL queries and maintain database performance.

○ Familiarity with version control systems (Git) to manage codebase changes and maintain a clean development workflow.

○ Familiarity with data governance and classification concepts, leveraging Collibra or similar platforms to manage data lineage, business glossaries, and metadata.

○ Knowledge of Linux/UNIX environments and experience working with APIs (XML, JSON, REST, SOAP).

● Data Pipeline Development

○ Demonstrated ability to build large-scale, complex data pipelines for ETL/ELT processes.

○ Hands-on experience with scripting/programming languages (e.g., Python, Bash) to automate data workflows and error handling.

○ Strong analytical and problem-solving skills with the ability to work with unstructured datasets.

● Security & Compliance

○ Functional knowledge of encryption technologies (SSL, TLS, SSH) and data protection measures.

○ Experience implementing governance best practices to ensure data security and regulatory compliance.

● Soft Skills

○ Excellent communication and collaboration skills to partner effectively with cross-functional teams.

○ Curiosity and a growth mindset, with the initiative to explore emerging data technologies.

Education & Experience

● Bachelor’s degree in Information Technology, Computer Science, or a related field; or an equivalent combination of education and experience.

● 5 years of progressive experience in data engineering, data analytics, or a similar role.

● Proven track record in architecting, optimizing, and delivering enterprise-grade data solutions on a major cloud platform (AWS, Azure, or GCP).

● Demonstrated commitment to continuous learning and improvement in data engineering methodologies.

If you’re passionate about leveraging cloud technologies to build efficient, scalable data pipelines—and ensuring that data remains well-classified, well-governed, and accessible—we’d love to hear from you!

You’ll benefit from:

Flexibility: Yes, we think that what you do matters. At work and at home.

Fun: We laugh a lot, it makes every day brighter.

Discounted services: We provide amazing services to our clients, and you’ll get them at home, because you deserve them.

Rewarding Pay: Let's be honest, everybody likes to make a good salary. We offer attractive compensation packages, and they come with a great culture.

Benefits: We’ve got you covered.

Career Evolution: Join us and we will give you the tools to achieve your career goals!

Technology: Do you have a passion for technology? Excellent, we do too. Here, you will manage, influence, play, create, fix, and shape the industry.

Location:

Quincy, MA

Company:

Breezeline

At Cogeco, we know that different backgrounds, perspectives, and beliefs can bring critical value to our business. The strength of this diversity enhances our ability to imagine, innovate, and grow as a company. So, we are committed to doing everything in our power to create a more diverse and inclusive world of belonging.

By creating a culture where all our colleagues can bring their best selves to work, we’re doing our part to build a more equitable workplace and world. From professional development to personal safety, Cogeco constantly strives to create an environment that welcomes and nurtures all. We make the health and well-being of our colleagues one of our highest priorities, for we know engaged and appreciated employees equate to a better overall experience for our customers.


If you need any accommodations to apply or as part of the recruitment process, please contact us confidentially at inclusion@cogeco.com.