+917330907991 (Online) | +917330907992 (Classroom) info@rcptec.com

Apache Spark and Scala Training

Spark and Scala Training Course Content

Module 1

Introduction to Scala

Learning Objectives – In this module, you will understand the basic concepts of Scala, the motivation for learning a new language, and get your setup ready.

Topics

1) Why Scala?

2) What is Scala?

3) Introducing Scala

4) Installing Scala

5) Journey – Java to Scala

6) First Dive – Interactive Scala

7) Writing Scala Scripts – Compiling Scala Programs

8) Scala Basics

9) Scala Basic Types

10) Defining Functions

11) IDE for Scala, Scala Community

Module 2

Scala Essentials

Learning Objectives – In this module, you will learn the essentials of Scala needed to work with it.

Topics

1) Immutability in Scala – Semicolons

2) Method Declaration, Literals

3) Lists

4) Tuples

5) Options

6) Maps

7) Reserved Words

8) Operators

9) Precedence Rules

10) If Statements

11) Scala for Comprehensions

12) While Loops

13) Do-While Loops

14) Conditional Operators

15) Pattern Matching

16) Enumerations

Module 3

Traits and OOP in Scala

Learning Objectives – In this module, you will understand the implementation of OOP concepts in Scala and how to use traits as mixins.

Topics

1) Traits Intro – Traits as Mixins

2) Stackable Traits

3) Creating Traits, Basic OOP – Class and Object Basics

4) Scala Constructors

5) Nested Classes

6) Visibility Rules

Module 4

Functional Programming in Scala

Learning Objectives – In this module, you will learn the functional programming know-how needed for Scala.

Topics

1) What is Functional Programming?

2) Function Literals and Closures

3) Recursion

4) Tail Calls

5) Functional Data Structures

6) Implicit Function Parameters

7) Call by Name

8) Call by Value

Module 5

Introduction to Big Data and Spark

Learning Objectives – In this module, you will understand what Big Data is, its associated challenges, and the various frameworks available, and you will get a first-hand introduction to Spark.

Topics

1) Introduction to Big Data

2) Challenges with Big Data

3) Batch vs. Real-Time Big Data Analytics

4) Batch Analytics – Hadoop Ecosystem Overview

5) Real-Time Analytics Options, Streaming Data – Storm

6) In-Memory Data – Spark

7) What is Spark?

8) Modes of Spark

9) Spark Installation Demo

10) Overview of Spark on a Cluster

11) Spark Standalone Cluster

Module 6

Spark Baby Steps

Learning Objectives – In this module, you will learn how to invoke Spark shell and use it for various standard operations.

Topics

1) Invoking Spark Shell

2) Loading a File in Shell

3) Performing Some Basic Operations on Files in Spark Shell

4) Building and Running a Spark Project with sbt

5) Caching Overview, Distributed Persistence

6) Spark Streaming Overview

7) Example: Streaming Word Count

Module 7

Playing with RDDs

Learning Objectives – In this module, you will learn about one of the building blocks of Spark – RDDs – and the related manipulations used to implement business logic.

Topics

1) RDDs

2) Transformations in RDD

3) Actions in RDD

4) Loading Data in RDD

5) Saving Data through RDD

6) Scala and Hadoop Integration Hands-on

Module 8

Shark – When Spark meets Hive

Learning Objectives – In this module, you will see different offshoots of Spark such as Shark, Spark SQL, and MLlib. This session is primarily interactive, for discussing industrial use cases of Spark and the latest developments happening in this area.

Topics

1) Why Shark?

2) Installing Shark

3) Running Shark

4) Loading of Data

5) Hive Queries through Spark

6) Testing Tips in Scala

7) Performance Tuning Tips in Spark

8) Shared Variables: Broadcast Variables

9) Shared Variables: Accumulators

What is Spark?

Apache Spark is a data analytics cluster computing framework. It is open-source software. It was originally developed in the AMPLab at UC Berkeley, and it fits into the Hadoop open-source community. It builds on top of the Hadoop Distributed File System (HDFS); however, Spark is not tied to the two-stage MapReduce paradigm. It promises performance up to 100 times faster than Hadoop MapReduce for certain applications. Spark provides primitives for in-memory cluster computing: in-memory cluster computing allows user programs to load data into a cluster's memory and query it repeatedly, which makes Spark well suited to machine learning algorithms.

Spark became an Apache top-level project after previously being an Apache Incubator project. It has received code contributions from large companies that use Spark, including Yahoo and Intel, and many individual developers representing different companies have contributed code to it. The software is written in Scala, Java, and Python. It is available for the Linux, Mac, and Windows operating systems, and is released under the Apache License 2.0. The official website is spark.apache.org.
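Spark's in-memory model distributes the same collection-style operations (map, flatMap, grouping) that ordinary Scala code uses. As a rough sketch, here is the classic word count expressed on a plain local Scala collection; Spark's RDD API applies the same pipeline across a cluster (this is an illustrative local sketch, not Spark API code):

```scala
// Word count with the same collection-style operations that Spark's
// RDD API distributes across a cluster; here it runs on a plain,
// local Scala collection.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))           // split each line into words
      .filter(_.nonEmpty)                 // drop empty tokens
      .map(word => (word.toLowerCase, 1)) // pair each word with a count of 1
      .groupBy(_._1)                      // group pairs by word (akin to reduceByKey)
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit = {
    val lines = Seq("Spark is fast", "Spark is in memory")
    println(wordCount(lines))
  }
}
```

Because the data stays in memory, a pipeline like this can be re-run repeatedly over the same cached dataset, which is exactly the access pattern iterative machine learning algorithms need.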

Tags: Apache Spark and Scala Training

17 Comments

  1. Give me the fee structure for Scala and Spark.

  2. Hi, What is the duration of Spark & Scala training and what is the cost for online training?
    Is there a project involved as well?

    – Raja

  3. Looking for Scala Training

  4. What is the basic idea of Scala? Can you explain briefly?

  5. Basic idea about Scala

    Scala is a pure object-oriented language with support for functional programming. So, it is a pure object-oriented language, in which everything is an object, but one which provides support for functional programming too. When it comes to Big Data systems, object-oriented programming takes a back seat and one's style of programming typically becomes functional. That is the reason why people go for Scalding on Hadoop rather than native MapReduce, unless that becomes absolutely necessary.

    So, in short, Scala is a general-purpose programming language designed to express common programming patterns in a concise, elegant, and type-safe way. It supports both object-oriented and functional programming, so both styles can be used together. And most importantly, Scala is very much in the fabric of present and future Big Data frameworks like Scalding, Spark, and Akka. Additionally, an important advantage is that one can also write MapReduce programs in Scala; however, there is already a framework called Scalding that makes this much easier.

    Another important thing about Scala is that, to learn it, people need to have at least some understanding of programming. Though Scala is quite different from other languages, basic expertise in programming is a definite requirement for learning it.
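    The blend described above can be seen in a few lines. In this hypothetical example, Account is an ordinary class with state and methods (object-oriented style), while the report over a collection of accounts is built with functional operations and no mutation:

```scala
// A small, illustrative sketch of Scala's two styles in one program.
// Object-oriented style: a plain class holding data.
class Account(val owner: String, val balance: Double)

object Styles {
  // Functional style: no mutation, just filter/map/sum over a collection.
  def totalAbove(accounts: Seq[Account], threshold: Double): Double =
    accounts.filter(_.balance > threshold).map(_.balance).sum

  def main(args: Array[String]): Unit = {
    val accounts = Seq(new Account("a", 100.0), new Account("b", 250.0))
    println(totalAbove(accounts, 150.0)) // sums only balances above 150
  }
}
```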

  6. How is coding in Scala done?

  7. What are the things that one can do with Scala?
    First of all, Scala is a functional language. Well, that is all very good, but what does it mean, and how does it actually feel when working with it?

    Coding can be of two types: imperative and functional.
    In an imperative style of coding, two things stand out: the first mark of imperative code is that you have to tell the code every step, not only what to do but also how to do it. The second smell of imperative code is the mutability of variables.

    In a functional style of coding, we lean towards a declarative style rather than an imperative one. Rather than telling the code how to do something, we focus on what result we want to achieve and simply let the code take care of how to implement it.

    Thus, a more declarative style of coding is easier to write, more expressive, and hence a lot easier to work with. The other thing in functional coding is that we eliminate mutable variables: rather than mutating a value over and over again, we program without mutability. And in functional programming we do with functions what we normally do with objects: pass them around as values. With Scala, we have the option of using this functional style of coding.
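    The contrast described above can be sketched with one computation written both ways (an illustrative example; the names are made up):

```scala
// The same computation in the two styles described in the comment.
object SumOfSquares {
  // Imperative: step-by-step, with a mutable variable.
  def imperative(xs: Seq[Int]): Int = {
    var total = 0                // mutable state
    for (x <- xs) total += x * x // we spell out every step
    total
  }

  // Functional: declarative, no mutation; we say what we want.
  def functional(xs: Seq[Int]): Int =
    xs.map(x => x * x).sum

  def main(args: Array[String]): Unit = {
    val xs = Seq(1, 2, 3)
    println(imperative(xs)) // 14
    println(functional(xs)) // 14
  }
}
```

Both produce the same result; the functional version has no variable to mutate and reads as a description of the result rather than a recipe of steps.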

  8. How is Scala unique compared to Java?

  9. We are going to discuss some things that are unique to Scala when compared to Java. One of the things that Java provides is static typing; so, what does that really mean?

    If you are writing code in Java, at every step you have to say what the type is, which can become very annoying. At times we spell out a type even though we know what it is and the compiler knows what it is. In Scala you don't have to go through all that, because Scala infers types for you. Scala is, in fact, even more strictly statically typed than Java, and yet the programmer doesn't have to do as much typing when coding, because the language is a bit more intelligent and can figure out what the type is.

    So, Scala performs a fairly healthy dose of type inference, and the only time it doesn't infer is when the type is either ambiguous or confusing. Scala actually strikes a nice balance compared to languages like F#, which will walk into a function's implementation to explore it and then decide what the parameter types are. That can be a bit problematic: if the implementation changes, the inferred types might also change, and that might have other impacts. Scala doesn't go to those lengths to infer types; it only does inference wherever it safely can. So for the most part, the places where types need to be specified are when you are defining parameters to functions, members of a class, etc.
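    A small sketch of what this looks like in practice: local values and collection results are inferred, while function parameters (and here the return type) are written out, as the comment above notes:

```scala
// Type inference in action: Scala is statically typed, but the
// compiler infers most types, so they rarely need to be spelled out.
object InferenceDemo {
  // Parameter types must still be declared; inference does not reach here.
  def twice(x: Int): Int = x * 2

  def main(args: Array[String]): Unit = {
    val n = 42                             // inferred as Int
    val name = "Spark"                     // inferred as String
    val doubled = List(1, 2, 3).map(_ * 2) // inferred as List[Int]
    println(s"$name: ${twice(n)}, $doubled")
  }
}
```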

  10. Why go for Scala? Let me know some basic things

  11. When talking about Scala, the first thing people want to know is: why Scala? Why should we go for something like Scala? The number one reason, of course, is that it is fun to program in a language like Scala. Scala gives you an Erlang-style concurrency model, but on the JVM. So, Scala gives you some of the really interesting benefits of functional programming. Here are some of the things Scala can do for us:

    Scala is, first of all, a statically typed language on the JVM; so if you are writing code in Java, you have another option, and that is to program in Scala. And what's beneficial is that it is more statically typed than Java is. It is a hybrid functional language; what that means is that it gives you the functional style of programming but at the same time doesn't force you into it, which can be both good news and bad news depending on how you look at it. It is like a dial you can turn all the way to the left and write imperative code like you are used to in Java, or turn all the way to the right and write in a functional style like functional languages such as Erlang.

  12. Which is good, Hadoop or Spark?

  13. Please send details for Scala and Spark. I want to know about the course details and fee structure. Is it available as online classes? If so, let me know; I'm comfortable attending online.

  14. Hi,
    Pls send me the details on Scala and Spark training, duration, fees, etc
    I’m comfortable with online classes.

    Regards

  15. Which is good, Hadoop or Spark? Why do the majority of people choose Hadoop?

  16. Pls send me the details on Scala and Spark training, duration, fees, etc
    I’m comfortable with online classes.
