Job Description
Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business.
Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and make the work we do possible.
As a Data Warehouse Engineer, you will support the modeling, development, optimization, and automation of the CDOPS data warehouse. You will help ensure that our Business Intelligence (BI) teams have access to accurate, high-quality data that drives informed decision-making across the organization.
This role is ideal for someone early in their data engineering career who is eager to learn modern data modeling techniques, work with cloud data platforms, and contribute to improving data delivery processes in a dynamic environment.
You will collaborate with infrastructure developers, data engineers, BI developers, product owners, and data analysts across the organization. With guidance and mentorship, you’ll help maintain the Snowflake data warehouse, contribute to an efficient data architecture, and support data integrity, performance, and cost-effectiveness.
You will also gain experience following data warehousing and ETL best practices, documenting data models and processes, and learning how to maintain high standards of data quality. Staying curious and keeping up with industry trends will help ensure our data platform continues to evolve and improve.
Day-to-Day Responsibilities
Data Modeling & Warehousing
· Support the development of dimensional data models for subject areas such as Technical Support ISO processes and Knowledge-Centric Support, Operational Expenditure, and Web Analytics.
· Assist in creating and maintaining ETL pipelines using tools such as dbt, Apache Airflow, and GitLab.
· Help manage Snowflake data warehouse assets and learn to optimize storage and compute usage using tools like Terraform.
· Participate in implementing data storage and transformation solutions in Snowflake.
· Contribute to the development of AI/BI products.
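To give a flavor of the dimensional modeling and ETL work described above, here is a minimal star-schema sketch. It is purely illustrative: the table and column names are hypothetical, and the real stack uses Snowflake, dbt, and Airflow rather than SQLite.

```python
import sqlite3

# Minimal star-schema sketch: one dimension table and one fact table,
# loaded from raw rows the way an ETL task or dbt model might.
# All table/column names are illustrative, not PTC's actual schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT UNIQUE
)""")
cur.execute("""CREATE TABLE fact_support_case (
    case_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    resolution_hours REAL
)""")

raw_cases = [
    (1, "Creo", 4.5),
    (2, "Windchill", 12.0),
    (3, "Creo", 2.0),
]

for case_id, product, hours in raw_cases:
    # Upsert the dimension row, then insert the fact row keyed by the
    # surrogate key instead of the raw product name.
    cur.execute("INSERT OR IGNORE INTO dim_product (product_name) VALUES (?)", (product,))
    cur.execute("SELECT product_key FROM dim_product WHERE product_name = ?", (product,))
    key = cur.fetchone()[0]
    cur.execute(
        "INSERT INTO fact_support_case (case_id, product_key, resolution_hours) VALUES (?, ?, ?)",
        (case_id, key, hours),
    )

# A typical BI-style query over the model: average resolution time per product.
cur.execute("""
    SELECT p.product_name, AVG(f.resolution_hours)
    FROM fact_support_case f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
    ORDER BY p.product_name
""")
result = cur.fetchall()
print(result)
```

Separating descriptive attributes (dimensions) from measurable events (facts) is the core idea BI developers rely on when building data marts.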
Collaboration & Support
· Work with BI developers to provide data sets (Business Data Marts) for analytics use cases.
· Collaborate with cross-functional teams to gather and understand business and data requirements.
· Help troubleshoot data warehouse and ETL issues with senior team members.
· Contribute to improving the performance and capabilities of the data warehouse.
Best Practices & Continuous Improvement
· Learn and apply data warehousing and ETL best practices.
· Assist in documenting data models, processes, and system designs.
· Participate in efforts to improve data quality and documentation using tools like dbt and Great Expectations.
· Support internal process improvements such as automating manual tasks and optimizing data workflows.
· Stay informed about new trends and technologies related to data warehousing and Snowflake.
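As an illustration of the kind of data-quality checks that tools like dbt tests or Great Expectations automate, here is a hand-rolled sketch in plain Python; the row format and rules are hypothetical and this is not either tool's actual API.

```python
# Hand-rolled data-quality checks in the spirit of dbt's generic tests
# (not_null, unique): each check returns the rows that violate the rule.
# The row format and column names here are illustrative only.

def expect_not_null(rows, column):
    """Return rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def expect_unique(rows, column):
    """Return rows whose `column` value has already appeared."""
    seen, dupes = set(), []
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.append(r)
        seen.add(value)
    return dupes

rows = [
    {"case_id": 1, "status": "closed"},
    {"case_id": 2, "status": None},
    {"case_id": 2, "status": "open"},  # duplicate case_id
]

null_failures = expect_not_null(rows, "status")
unique_failures = expect_unique(rows, "case_id")
print(len(null_failures), len(unique_failures))
```

Real frameworks add scheduling, reporting, and a catalog of prebuilt expectations, but the underlying idea is the same: assert properties of the data and surface the rows that fail.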
Preferred Skills & Knowledge
(You are not expected to be an expert; experience through coursework, internships, or personal projects is welcome.)
· Basic understanding of analytical and dimensional modeling concepts.
· Strong SQL fundamentals.
· Familiarity with Snowflake or other cloud data warehouse technologies.
· Exposure to dbt, Airflow, Git, or Docker is a plus.
· Experience with relational database concepts.
· Introductory experience with Python for data workflows.
· Strong communication skills and willingness to learn in a fast-paced environment.
· Organizational skills and a proactive mindset.
· Ability to work effectively with cross-functional teams.
Preferred Experience
(Nice to have, not required)
· Exposure to SaaS business models.
· Experience working in Agile environments.
· Familiarity with GitLab CI/CD.
· Basic understanding of Terraform or infrastructure-as-code concepts.
· Experience using Docker as a development environment.
Basic Qualifications
· Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
· Strong interest in data engineering and cloud data architectures.
· English fluency.
· Foundational understanding of data systems, ETL concepts, or database design (coursework acceptable).
· Internship, academic project, or equivalent experience working with data tools is a plus.
Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you.
If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?
We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.