Training duration: 5 days. The training program is tailored to the needs of the group.
Training audience
The training is addressed to programmers and business analysts whose goal is to learn Big Data tools. Basic knowledge of Java or Scala is recommended.
Big Data with Apache Family (Hadoop, Spark)
Purpose of training
The main objective of the training is to gain practical skills in Big Data technologies and the Scala language. Participants will learn how to effectively use the technologies commonly used in designing platforms that process large amounts of data: Apache Spark, Apache Kafka, Apache NiFi, Apache Druid and Apache Hadoop. Workshops are an important part of the training.
Hadoop - Hive queries, Hive configuration, external and internal tables, metastore (table properties, partitioning), Spark on Hive
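As a taste of this module, here is a minimal Scala sketch of the external vs. internal table distinction and of querying Hive from Spark. The table name, columns and location are placeholders, and a Spark installation with Hive support is assumed:

```scala
import org.apache.spark.sql.SparkSession

object HiveDemo {
  // DDL for an EXTERNAL table: dropping it removes only metastore metadata,
  // while the files under LOCATION are kept. A managed (internal) table
  // would own and delete its data. All names here are placeholders.
  val createExternal: String =
    """CREATE EXTERNAL TABLE IF NOT EXISTS events (user_id STRING, amount DOUBLE)
      |PARTITIONED BY (day STRING)
      |STORED AS PARQUET
      |LOCATION '/data/events'""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-on-hive-demo")
      .enableHiveSupport() // resolve tables through the Hive metastore
      .getOrCreate()

    spark.sql(createExternal)
    // Spark reads table metadata from the metastore and can prune partitions
    spark.sql("SELECT day, COUNT(*) FROM events GROUP BY day").show()
    spark.stop()
  }
}
```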
Friday
Apache Kafka
guide for developers (architecture, best practices, tips)
Workshops - Apache Kafka - implementation of a Kafka producer and consumer in Scala
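For flavour, a minimal sketch of the producer side of this workshop in Scala; the broker address and topic name are placeholders, and the consumer side mirrors this setup with deserializers and a poll loop:

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

object KafkaProducerDemo {
  // Placeholder topic name
  val Topic = "events"

  def producerProps(bootstrap: String): Properties = {
    val props = new Properties()
    props.put("bootstrap.servers", bootstrap)
    props.put("key.serializer", classOf[StringSerializer].getName)
    props.put("value.serializer", classOf[StringSerializer].getName)
    props.put("acks", "all") // wait for all in-sync replicas: a common durability setting
    props
  }

  def main(args: Array[String]): Unit = {
    // Placeholder broker address
    val producer = new KafkaProducer[String, String](producerProps("localhost:9092"))
    try {
      val record = new ProducerRecord[String, String](Topic, "user-42", """{"action":"click"}""")
      producer.send(record).get() // block until the broker acknowledges the write
    } finally producer.close()
  }
}
```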
Apache Druid
a modern solution in Big Data
real-time analytics
architecture
comparison with solutions based on Hadoop
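To illustrate the real-time analytics use case above: Druid exposes a SQL endpoint over HTTP, so it can be queried from plain Scala with only the JDK's HTTP client. The router address, datasource and column names below are placeholders:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object DruidQueryDemo {
  // Placeholder router address; /druid/v2/sql is Druid's standard SQL endpoint
  val SqlEndpoint = "http://localhost:8888/druid/v2/sql"

  // Druid speaks SQL over HTTP: POST a JSON body containing the query.
  // "wikipedia" and "channel" are placeholder datasource/column names.
  val queryJson: String =
    """{"query": "SELECT channel, COUNT(*) AS edits FROM wikipedia GROUP BY channel ORDER BY edits DESC LIMIT 5"}"""

  def main(args: Array[String]): Unit = {
    val request = HttpRequest.newBuilder(URI.create(SqlEndpoint))
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(queryJson))
      .build()
    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // JSON array of result rows
  }
}
```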
Summary
Ask about the Big Data with Apache Family (Hadoop, Spark) training
Get to know your trainers
Piotr Guzik
CEO of Datumo. I am a graduate of WEiTI PW. After almost 5 years of working in the JVM environment, I came to the conclusion that I am truly passionate about the Big Data area. At Datumo, I am responsible for the company's mission and goals, while remaining an active team leader and architect.
I run Big Data projects for clients on a daily basis. I like working in cloud environments and with interesting people. I am responsible for selecting the best possible technologies to achieve the goals set by clients. Apart from my work as an architect, I care about the high morale of teams and the working atmosphere.
For many years I have been conducting training in the area of Big Data. I have had the pleasure of lecturing in postgraduate programs and delivering commercial training for clients, and I have spoken many times at industry conferences and meetups. I like people and public speaking, so conducting lectures and workshops comes easily to me and with a smile, as evidenced by the consistently high ratings from training participants. I am open to criticism and improve my training based on the feedback I receive.
Daniel Pogrebniak
Board member and co-founder of Datumo. An experienced JVM developer with many years of practice. A graduate of the Faculty of Electronics and Information Technology at the Warsaw University of Technology. I like solving complex problems in a simple way and working on projects with test coverage above 90%. I build quality-oriented software.
What do I do every day?
I manage the work of a team of ambitious developers. I am a data engineer and Spark developer at Datumo. I run Scala training. In my free time, I ride a bike.
Why do I like to train?
I like sharing knowledge and spreading good programming practices. Training allows me to popularize good patterns on a larger scale than in my daily work.
Michał Misiewicz
A person who combines work with passion. I am a graduate of WEITI PW. I started my adventure with data processing at Allegro, where I gained solid technical and business foundations. In 2017, I became an employee of the newly created Datumo start-up, where I co-created a cloud-based product based on the high-performance Apache Druid data warehouse. During the first years of work at Datumo, I gained extensive knowledge of creating modern analytical platforms that process large amounts of data, which I use as an architect.
Currently, I am responsible for the technology department at Datumo. My daily challenges include the development and implementation of the company's technological development direction, management of developer teams and architectural support. Internal training is an important part of my job.
Since 2019, I have been conducting training in Big Data technology. Training is an opportunity for further development, an opportunity to meet interesting people and identify problems that Big Data technologies should solve.
I combine my professional work with running. I take part in track and road races, and I cover the 5000 m distance in under 15 minutes.
Rafał Rozpondek
I specialize in Big Data processing, mainly in solutions from the Apache family. I support and design numerous Big Data platforms, both cloud and on-premise. I am a fan of automation technologies (Apache Airflow, Apache NiFi) and of Apache Druid. For several years, I have been conducting Big Data training, successfully passing on knowledge of Big Data technologies (Apache Hadoop, Spark, NiFi, Airflow, Druid, Kafka).
I like sharing knowledge and promoting new solutions in the world of Big Data. Training lets me spread interest in new technologies and, as a result, help overcome business challenges.
Why is it worth choosing Datumo training?
Practical classes
Training is conducted in the form of workshops, with theory explained through examples.
Learning system
A system that allows you to develop practical skills in a given technology.
Experienced trainers
Big Data specialists with many years of practice, working in the industry on a daily basis.
Workshops
They allow for an individual approach and the active involvement of every participant.