Data Engineering with Dagster
A well-built data platform allows for fast iteration and safe deployments. This course will teach you how to design, build, and maintain a data platform that supports a wide range of data tasks. You will start by taking a simple data workflow, common to most companies, and deconstructing it into its core components. By the end of the course, you will reimplement the pipeline using modern data frameworks, running in the cloud.
Course taught by expert instructors
Senior DevEx Engineer at Materialize
Dennis is a Senior DevEx Engineer at Materialize. He has more than a decade of experience in the data space, focusing on platform engineering and data infrastructure. He has helped many companies improve their data stacks, from Wolfram Alpha and Epic to, most recently, Dutchie and Drizly.
Learn and apply skills with real-world projects.
- Project: Deconstruct a simple data pipeline into components that can be tested locally within Docker. Learn:
- The challenges of building data applications
- How to deconstruct a data workflow into its key components
- Docker and designing a local environment
- Project: Reimplement the data pipeline as a robust workflow using Dagster, leveraging its core abstractions. Learn:
- Fundamentals of Dagster
- How to handle some of the complexities of data workflows
- Project: Include your data workflow as one of many in a fully functional Dagster project, and explore the multiple ways to execute your workflows. Learn:
- How to view individual workflows as parts of a full data application
- Building both scheduled and event-based processes
- How to manage and isolate dependencies
- Project: Build a Dagster project backed by assets and a practical data mesh. Learn:
- How to define software-defined assets within your Dagster project
- How to deploy Dagster
Work on projects that bring your learning to life.
Made to be directly applicable in your work.
Live access to experts
Sessions and Q&As with our expert instructors, along with real-world projects.
Network & community
Code reviews and study groups. Share experiences and learn alongside a global network of professionals.
Support & accountability
We have a system in place to make sure you complete the course, and to help nudge you along the way.
Course success stories
Learn together and share experiences with other industry professionals
Dennis Hume is an expert and mentor on how to build out data platforms and support data teams. Dennis knows the past, present, and future of the modern data stack intimately, and how to enable companies to graduate from analytics-engineering batch SQL workflows to realtime, Python, and machine learning services. He's a leading voice in the data community (see his talks on Dagster, Materialize, the Modern Data Stack conference, and more). Any analytics engineer or data scientist would be lucky to work with Dennis, and anyone lucky enough to learn from Dennis should jump at the opportunity!
Dennis brings with him ample industry experience when it comes to designing and delivering on data platform solutions. In addition to building out the data platform at Drizly, he has been instrumental in leveling up others on topics pertaining to the data space. The clarity of thought and the structured approach to delivering solutions makes it very easy for one to engage and learn with Dennis.
This course is for...
Data Engineers looking to build more reliable pipelines to support their analytics team
Software engineers who want to be more involved in building reliable data applications
Ability to write Python and work with documented libraries
Comfort working with Docker basics (start, stop) and the command line