Spark SQL with Scala : deprecation warning for registerTempTable












I am getting the warning below when trying to create a temporary table. How can I resolve it?




scala> df.registerTempTable("df")
warning: there was one deprecation warning; re-run with -deprecation for details











  • try changing the temp table with another name other than "df"
    – Satish Karuturi
    Nov 13 at 3:09


















scala apache-spark apache-spark-sql






asked Nov 13 at 3:04 by palanivel
edited Nov 18 at 20:44 by Ram Ghadiyaram

2 Answers






The registerTempTable method is deprecated as of Spark 2.0; createOrReplaceTempView is the supported replacement.
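For reference, a minimal, self-contained sketch of the replacement call (the sample data, column names, and app name below are illustrative, not from the question):

```scala
import org.apache.spark.sql.SparkSession

object TempViewDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local[*]")
      .appName("TempViewDemo")
      .getOrCreate()
    import spark.implicits._

    // Sample DataFrame (illustrative data)
    val df = Seq(("a", 1), ("b", 2)).toDF("key", "value")

    // Deprecated since 2.0: df.registerTempTable("df")
    // Supported replacement -- creates (or replaces) a session-scoped view:
    df.createOrReplaceTempView("df")

    spark.sql("SELECT key, value FROM df WHERE value > 1").show()

    spark.stop()
  }
}
```

Note that "df" is a legal view name here; the warning in the question is triggered by the deprecated method, not by the name, so renaming the view alone will not silence it.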






answered Nov 13 at 3:13 by A. Timms
    From the Spark source (Dataset.scala), the deprecation note in the doc reads:

    Use createOrReplaceTempView(viewName) instead




    /**
     * Registers this Dataset as a temporary table using the given name. The lifetime of this
     * temporary table is tied to the [[SparkSession]] that was used to create this Dataset.
     *
     * @group basic
     * @since 1.6.0
     */
    @deprecated("Use createOrReplaceTempView(viewName) instead.", "2.0.0")
    def registerTempTable(tableName: String): Unit = {
      createOrReplaceTempView(tableName)
    }
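Note the lifetime caveat in the scaladoc above: the view is tied to the SparkSession that created the Dataset. If a view needs to be visible across sessions of the same application, Spark 2.1+ also offers createGlobalTempView; a sketch (the view names here are illustrative):

```scala
// Session-scoped view -- the direct replacement for registerTempTable:
df.createOrReplaceTempView("df")
spark.sql("SELECT * FROM df")

// Application-scoped view, shared across SparkSessions of one application.
// Global temp views are resolved through the reserved `global_temp` database:
df.createGlobalTempView("df_global")
spark.sql("SELECT * FROM global_temp.df_global")
```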


    Example usage: a demo joining two sample Datasets using createOrReplaceTempView:



    package com.examples

    import com.droolsplay.util.SparkSessionSingleton
    import org.apache.log4j.{Level, Logger}
    import org.apache.spark.internal.Logging
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    /**
     * Join example and some basics demonstration using sample data.
     *
     * @author : Ram Ghadiyaram
     */
    object JoinExamplesv2 extends Logging {
      // switch off unnecessary logs
      Logger.getLogger("org").setLevel(Level.OFF)
      Logger.getLogger("akka").setLevel(Level.OFF)
      // val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate
      val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

      /**
       * main
       *
       * @param args Array[String]
       */
      def main(args: Array[String]): Unit = {
        import spark.implicits._
        // create two DataFrames from case classes: Person (df1) and Profile (df2)
        val df1 = spark.sqlContext.createDataFrame(
          spark.sparkContext.parallelize(
            Person("Sarath", 33, 2)
              :: Person("Vasudha Nanduri", 30, 2)
              :: Person("Ravikumar Ramasamy", 34, 5)
              :: Person("Ram Ghadiyaram", 42, 9)
              :: Person("Ravi chandra Kancharla", 43, 9)
              :: Nil))

        val df2 = spark.sqlContext.createDataFrame(
          Profile("Spark", 2, "SparkSQLMaster")
            :: Profile("Spark", 5, "SparkGuru")
            :: Profile("Spark", 9, "DevHunter")
            :: Nil
        )

        // alias the DataFrames so columns can be referenced unambiguously
        val df_asPerson = df1.as("dfperson")
        val df_asProfile = df2.as("dfprofile")

        // Example 1: join at the DataFrame level;
        // the next example uses plain SQL with createOrReplaceTempView
        val joined_df = df_asPerson.join(
          df_asProfile,
          col("dfperson.personid") === col("dfprofile.personid"),
          "inner")
        joined_df.select(
          col("dfperson.name"),
          col("dfperson.age"),
          col("dfprofile.name"),
          col("dfprofile.profileDescription"))
          .show

        // Example 2: plain SQL after registering views with createOrReplaceTempView
        df_asPerson.createOrReplaceTempView("dfPerson")
        df_asProfile.createOrReplaceTempView("dfprofile")
        val dfJoin = spark.sqlContext.sql(
          """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
             FROM dfperson JOIN dfprofile
             ON dfperson.personid == dfprofile.personid""")
        logInfo("Example using sql statement after registering createOrReplaceTempView")
        dfJoin.show(false)
      }

      // models
      case class Person(name: String, age: Int, personid: Int)
      case class Profile(name: String, personId: Int, profileDescription: String)
    }


    Result:



    +--------------------+---+-----+------------------+
    | name|age| name|profileDescription|
    +--------------------+---+-----+------------------+
    | Sarath| 33|Spark| SparkSQLMaster|
    | Vasudha Nanduri| 30|Spark| SparkSQLMaster|
    | Ravikumar Ramasamy| 34|Spark| SparkGuru|
    | Ram Ghadiyaram| 42|Spark| DevHunter|
    |Ravi chandra Kanc...| 43|Spark| DevHunter|
    +--------------------+---+-----+------------------+

    18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView
    +----------------------+---+------------------+
    |name |age|profileDescription|
    +----------------------+---+------------------+
    |Sarath |33 |SparkSQLMaster |
    |Vasudha Nanduri |30 |SparkSQLMaster |
    |Ravikumar Ramasamy |34 |SparkGuru |
    |Ram Ghadiyaram |42 |DevHunter |
    |Ravi chandra Kancharla|43 |DevHunter |
    +----------------------+---+------------------+





    answered Nov 13 at 5:04, edited Nov 14 at 5:27 by Ram Ghadiyaram
      Your Answer






      StackExchange.ifUsing("editor", function () {
      StackExchange.using("externalEditor", function () {
      StackExchange.using("snippets", function () {
      StackExchange.snippets.init();
      });
      });
      }, "code-snippets");

      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "1"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });














      draft saved

      draft discarded


















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f53273182%2fspark-sql-with-scala-deprecation-warning-for-registertemptable%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      2 Answers
      2






      active

      oldest

      votes








      2 Answers
      2






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      3














      The registerTempTable method is deprecated in Spark 2.0



      createOrReplaceTempView is the supported replacement function






      share|improve this answer


























        3














        The registerTempTable method is deprecated in Spark 2.0



        createOrReplaceTempView is the supported replacement function






        share|improve this answer
























          3












          3








          3






          The registerTempTable method is deprecated in Spark 2.0



          createOrReplaceTempView is the supported replacement function






          share|improve this answer












          The registerTempTable method is deprecated in Spark 2.0



          createOrReplaceTempView is the supported replacement function







          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered Nov 13 at 3:13









          A. Timms

          764




          764

























              1














              Spark Code DataSet.scala see this message from doc




              Use createOrReplaceTempView(viewName) instead




               /**
              * Registers this Dataset as a temporary table using the given name. The lifetime of this
              * temporary table is tied to the [[SparkSession]] that was used to create this Dataset.
              *
              * @group basic
              * @since 1.6.0
              */
              @deprecated("Use createOrReplaceTempView(viewName) instead.", "2.0.0")
              def registerTempTable(tableName: String): Unit = {
              createOrReplaceTempView(tableName)
              }


              Example Usage demo with sample dataset join using createOrReplaceTempView:



                 package com.examples

              import com.droolsplay.util.SparkSessionSingleton
              import org.apache.log4j.{Level, Logger}
              import org.apache.spark.internal.Logging
              import org.apache.spark.sql.SparkSession
              import org.apache.spark.sql.functions._

              /**
              * Join Example and some basics demonstration using sample data.
              *
              * @author : Ram Ghadiyaram
              */
              object JoinExamplesv2 extends Logging {
              // switch off un necessary logs
              Logger.getLogger("org").setLevel(Level.OFF)
              Logger.getLogger("akka").setLevel(Level.OFF)
              // val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate;
              val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

              /**
              * main
              *
              * @param args Array[String]
              */
              def main(args: Array[String]): Unit = {
              import spark.implicits._
              /**
              * create 2 dataframes here using case classes one is Person df1 and another one is profile df2
              */
              val df1 = spark.sqlContext.createDataFrame(
              spark.sparkContext.parallelize(
              Person("Sarath", 33, 2)
              :: Person("Vasudha Nanduri", 30, 2)
              :: Person("Ravikumar Ramasamy", 34, 5)
              :: Person("Ram Ghadiyaram", 42, 9)
              :: Person("Ravi chandra Kancharla", 43, 9)
              :: Nil))


              val df2 = spark.sqlContext.createDataFrame(
              Profile("Spark", 2, "SparkSQLMaster")
              :: Profile("Spark", 5, "SparkGuru")
              :: Profile("Spark", 9, "DevHunter")
              :: Nil
              )

              // you can do alias to refer column name with aliases to increase readablity

              val df_asPerson = df1.as("dfperson")
              val df_asProfile = df2.as("dfprofile")
              /** *
              * Example displays how to join them in the dataframe level
              * next example demonstrates using sql with createOrReplaceTempView
              */
              val joined_df = df_asPerson.join(
              df_asProfile
              , col("dfperson.personid") === col("dfprofile.personid")
              , "inner")
              joined_df.select(
              col("dfperson.name")
              , col("dfperson.age")
              , col("dfprofile.name")
              , col("dfprofile.profileDescription"))
              .show

              /// example using sql statement after registering createOrReplaceTempView

              df_asPerson.createOrReplaceTempView("dfPerson");
              df_asProfile.createOrReplaceTempView("dfprofile")
              // this is example of plain sql
              val dfJoin = spark.sqlContext.sql(
              """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
              FROM dfperson JOIN dfprofile
              ON dfperson.personid == dfprofile.personid""")
              logInfo("Example using sql statement after registering createOrReplaceTempView ")
              dfJoin.show(false)

              }

              // models here

              case class Person(name: String, age: Int, personid: Int)

              case class Profile(name: String, personId: Int, profileDescription: String)

              }


              Result :



              +--------------------+---+-----+------------------+
              | name|age| name|profileDescription|
              +--------------------+---+-----+------------------+
              | Sarath| 33|Spark| SparkSQLMaster|
              | Vasudha Nanduri| 30|Spark| SparkSQLMaster|
              | Ravikumar Ramasamy| 34|Spark| SparkGuru|
              | Ram Ghadiyaram| 42|Spark| DevHunter|
              |Ravi chandra Kanc...| 43|Spark| DevHunter|
              +--------------------+---+-----+------------------+

              18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView
              +----------------------+---+------------------+
              |name |age|profileDescription|
              +----------------------+---+------------------+
              |Sarath |33 |SparkSQLMaster |
              |Vasudha Nanduri |30 |SparkSQLMaster |
              |Ravikumar Ramasamy |34 |SparkGuru |
              |Ram Ghadiyaram |42 |DevHunter |
              |Ravi chandra Kancharla|43 |DevHunter |





              share|improve this answer




























                1














                Spark Code DataSet.scala see this message from doc




                Use createOrReplaceTempView(viewName) instead




                 /**
                * Registers this Dataset as a temporary table using the given name. The lifetime of this
                * temporary table is tied to the [[SparkSession]] that was used to create this Dataset.
                *
                * @group basic
                * @since 1.6.0
                */
                @deprecated("Use createOrReplaceTempView(viewName) instead.", "2.0.0")
                def registerTempTable(tableName: String): Unit = {
                createOrReplaceTempView(tableName)
                }


                Example Usage demo with sample dataset join using createOrReplaceTempView:



                   package com.examples

                import com.droolsplay.util.SparkSessionSingleton
                import org.apache.log4j.{Level, Logger}
                import org.apache.spark.internal.Logging
                import org.apache.spark.sql.SparkSession
                import org.apache.spark.sql.functions._

                /**
                * Join Example and some basics demonstration using sample data.
                *
                * @author : Ram Ghadiyaram
                */
                object JoinExamplesv2 extends Logging {
                // switch off un necessary logs
                Logger.getLogger("org").setLevel(Level.OFF)
                Logger.getLogger("akka").setLevel(Level.OFF)
                // val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate;
                val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

                /**
                * main
                *
                * @param args Array[String]
                */
                def main(args: Array[String]): Unit = {
                import spark.implicits._
                /**
                * create 2 dataframes here using case classes one is Person df1 and another one is profile df2
                */
                val df1 = spark.sqlContext.createDataFrame(
                spark.sparkContext.parallelize(
                Person("Sarath", 33, 2)
                :: Person("Vasudha Nanduri", 30, 2)
                :: Person("Ravikumar Ramasamy", 34, 5)
                :: Person("Ram Ghadiyaram", 42, 9)
                :: Person("Ravi chandra Kancharla", 43, 9)
                :: Nil))


                val df2 = spark.sqlContext.createDataFrame(
                Profile("Spark", 2, "SparkSQLMaster")
                :: Profile("Spark", 5, "SparkGuru")
                :: Profile("Spark", 9, "DevHunter")
                :: Nil
                )

                // you can do alias to refer column name with aliases to increase readablity

                val df_asPerson = df1.as("dfperson")
                val df_asProfile = df2.as("dfprofile")
                /** *
                * Example displays how to join them in the dataframe level
                * next example demonstrates using sql with createOrReplaceTempView
                */
                val joined_df = df_asPerson.join(
                df_asProfile
                , col("dfperson.personid") === col("dfprofile.personid")
                , "inner")
                joined_df.select(
                col("dfperson.name")
                , col("dfperson.age")
                , col("dfprofile.name")
                , col("dfprofile.profileDescription"))
                .show

                /// example using sql statement after registering createOrReplaceTempView

                df_asPerson.createOrReplaceTempView("dfPerson");
                df_asProfile.createOrReplaceTempView("dfprofile")
                // this is example of plain sql
                val dfJoin = spark.sqlContext.sql(
                """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
                FROM dfperson JOIN dfprofile
                ON dfperson.personid == dfprofile.personid""")
                logInfo("Example using sql statement after registering createOrReplaceTempView ")
                dfJoin.show(false)

                }

                // models here

                case class Person(name: String, age: Int, personid: Int)

                case class Profile(name: String, personId: Int, profileDescription: String)

                }


                Result :



                +--------------------+---+-----+------------------+
                | name|age| name|profileDescription|
                +--------------------+---+-----+------------------+
                | Sarath| 33|Spark| SparkSQLMaster|
                | Vasudha Nanduri| 30|Spark| SparkSQLMaster|
                | Ravikumar Ramasamy| 34|Spark| SparkGuru|
                | Ram Ghadiyaram| 42|Spark| DevHunter|
                |Ravi chandra Kanc...| 43|Spark| DevHunter|
                +--------------------+---+-----+------------------+

                18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView
                +----------------------+---+------------------+
                |name |age|profileDescription|
                +----------------------+---+------------------+
                |Sarath |33 |SparkSQLMaster |
                |Vasudha Nanduri |30 |SparkSQLMaster |
                |Ravikumar Ramasamy |34 |SparkGuru |
                |Ram Ghadiyaram |42 |DevHunter |
                |Ravi chandra Kancharla|43 |DevHunter |





                share|improve this answer


























                  1












                  1








                  1






                  Spark Code DataSet.scala see this message from doc




                  Use createOrReplaceTempView(viewName) instead




                   /**
                  * Registers this Dataset as a temporary table using the given name. The lifetime of this
                  * temporary table is tied to the [[SparkSession]] that was used to create this Dataset.
                  *
                  * @group basic
                  * @since 1.6.0
                  */
                  @deprecated("Use createOrReplaceTempView(viewName) instead.", "2.0.0")
                  def registerTempTable(tableName: String): Unit = {
                  createOrReplaceTempView(tableName)
                  }


                  Example Usage demo with sample dataset join using createOrReplaceTempView:



                     package com.examples

                  import com.droolsplay.util.SparkSessionSingleton
                  import org.apache.log4j.{Level, Logger}
                  import org.apache.spark.internal.Logging
                  import org.apache.spark.sql.SparkSession
                  import org.apache.spark.sql.functions._

                  /**
                  * Join Example and some basics demonstration using sample data.
                  *
                  * @author : Ram Ghadiyaram
                  */
                  object JoinExamplesv2 extends Logging {
                  // switch off un necessary logs
                  Logger.getLogger("org").setLevel(Level.OFF)
                  Logger.getLogger("akka").setLevel(Level.OFF)
                  // val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate;
                  val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

                  /**
                  * main
                  *
                  * @param args Array[String]
                  */
                  def main(args: Array[String]): Unit = {
                  import spark.implicits._
                  /**
                  * create 2 dataframes here using case classes one is Person df1 and another one is profile df2
                  */
                  val df1 = spark.sqlContext.createDataFrame(
                  spark.sparkContext.parallelize(
                  Person("Sarath", 33, 2)
                  :: Person("Vasudha Nanduri", 30, 2)
                  :: Person("Ravikumar Ramasamy", 34, 5)
                  :: Person("Ram Ghadiyaram", 42, 9)
                  :: Person("Ravi chandra Kancharla", 43, 9)
                  :: Nil))


                  val df2 = spark.sqlContext.createDataFrame(
                  Profile("Spark", 2, "SparkSQLMaster")
                  :: Profile("Spark", 5, "SparkGuru")
                  :: Profile("Spark", 9, "DevHunter")
                  :: Nil
                  )

                  // you can do alias to refer column name with aliases to increase readablity

                  val df_asPerson = df1.as("dfperson")
                  val df_asProfile = df2.as("dfprofile")
                  /** *
                  * Example displays how to join them in the dataframe level
                  * next example demonstrates using sql with createOrReplaceTempView
                  */
                  val joined_df = df_asPerson.join(
                  df_asProfile
                  , col("dfperson.personid") === col("dfprofile.personid")
                  , "inner")
                  joined_df.select(
                  col("dfperson.name")
                  , col("dfperson.age")
                  , col("dfprofile.name")
                  , col("dfprofile.profileDescription"))
                  .show

                  /// example using sql statement after registering createOrReplaceTempView

                  df_asPerson.createOrReplaceTempView("dfPerson");
                  df_asProfile.createOrReplaceTempView("dfprofile")
                  // this is example of plain sql
                  val dfJoin = spark.sqlContext.sql(
                  """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
                  FROM dfperson JOIN dfprofile
                  ON dfperson.personid == dfprofile.personid""")
                  logInfo("Example using sql statement after registering createOrReplaceTempView ")
                  dfJoin.show(false)

                  }

                  // models here

                  case class Person(name: String, age: Int, personid: Int)

                  case class Profile(name: String, personId: Int, profileDescription: String)

                  }


                  Result :



                  +--------------------+---+-----+------------------+
                  | name|age| name|profileDescription|
                  +--------------------+---+-----+------------------+
                  | Sarath| 33|Spark| SparkSQLMaster|
                  | Vasudha Nanduri| 30|Spark| SparkSQLMaster|
                  | Ravikumar Ramasamy| 34|Spark| SparkGuru|
                  | Ram Ghadiyaram| 42|Spark| DevHunter|
                  |Ravi chandra Kanc...| 43|Spark| DevHunter|
                  +--------------------+---+-----+------------------+

                  18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView
                  +----------------------+---+------------------+
                  |name |age|profileDescription|
                  +----------------------+---+------------------+
                  |Sarath |33 |SparkSQLMaster |
                  |Vasudha Nanduri |30 |SparkSQLMaster |
                  |Ravikumar Ramasamy |34 |SparkGuru |
                  |Ram Ghadiyaram |42 |DevHunter |
                  |Ravi chandra Kancharla|43 |DevHunter |





                  share|improve this answer














                  Spark Code DataSet.scala see this message from doc




                  Use createOrReplaceTempView(viewName) instead




                   /**
                  * Registers this Dataset as a temporary table using the given name. The lifetime of this
                  * temporary table is tied to the [[SparkSession]] that was used to create this Dataset.
                  *
                  * @group basic
                  * @since 1.6.0
                  */
                  @deprecated("Use createOrReplaceTempView(viewName) instead.", "2.0.0")
                  def registerTempTable(tableName: String): Unit = {
                  createOrReplaceTempView(tableName)
                  }


                  Example Usage demo with sample dataset join using createOrReplaceTempView:



                     package com.examples

                  import com.droolsplay.util.SparkSessionSingleton
                  import org.apache.log4j.{Level, Logger}
                  import org.apache.spark.internal.Logging
                  import org.apache.spark.sql.SparkSession
                  import org.apache.spark.sql.functions._

                  /**
                  * Join Example and some basics demonstration using sample data.
                  *
                  * @author : Ram Ghadiyaram
                  */
                  object JoinExamplesv2 extends Logging {
                  // switch off un necessary logs
                  Logger.getLogger("org").setLevel(Level.OFF)
                  Logger.getLogger("akka").setLevel(Level.OFF)
                  // val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate;
                  val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

                  /**
                  * main
                  *
                  * @param args Array[String]
                  */
                  def main(args: Array[String]): Unit = {
                  import spark.implicits._
                  /**
                  * create 2 dataframes here using case classes one is Person df1 and another one is profile df2
                  */
                  val df1 = spark.sqlContext.createDataFrame(
                  spark.sparkContext.parallelize(
                  Person("Sarath", 33, 2)
                  :: Person("Vasudha Nanduri", 30, 2)
                  :: Person("Ravikumar Ramasamy", 34, 5)
                  :: Person("Ram Ghadiyaram", 42, 9)
                  :: Person("Ravi chandra Kancharla", 43, 9)
                  :: Nil))

                  val df2 = spark.sqlContext.createDataFrame(
                  Profile("Spark", 2, "SparkSQLMaster")
                  :: Profile("Spark", 5, "SparkGuru")
                  :: Profile("Spark", 9, "DevHunter")
                  :: Nil
                  )

                  // alias the DataFrames so column references stay readable

                  val df_asPerson = df1.as("dfperson")
                  val df_asProfile = df2.as("dfprofile")
                  /** *
                  * This example shows how to join at the DataFrame level;
                  * the next example uses SQL with createOrReplaceTempView.
                  */
                  val joined_df = df_asPerson.join(
                  df_asProfile
                  , col("dfperson.personid") === col("dfprofile.personid")
                  , "inner")
                  joined_df.select(
                  col("dfperson.name")
                  , col("dfperson.age")
                  , col("dfprofile.name")
                  , col("dfprofile.profileDescription"))
                  .show

                  // example using a SQL statement after createOrReplaceTempView

                  df_asPerson.createOrReplaceTempView("dfperson")
                  df_asProfile.createOrReplaceTempView("dfprofile")
                  // plain SQL against the registered temp views
                  val dfJoin = spark.sqlContext.sql(
                  """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
                  FROM dfperson JOIN dfprofile
                  ON dfperson.personid == dfprofile.personid""")
                  logInfo("Example using sql statement after registering createOrReplaceTempView ")
                  dfJoin.show(false)

                  }

                  // models here

                  case class Person(name: String, age: Int, personid: Int)

                  case class Profile(name: String, personId: Int, profileDescription: String)

                  }


                  Result:

                  +--------------------+---+-----+------------------+
                  |                name|age| name|profileDescription|
                  +--------------------+---+-----+------------------+
                  |              Sarath| 33|Spark|    SparkSQLMaster|
                  |     Vasudha Nanduri| 30|Spark|    SparkSQLMaster|
                  |  Ravikumar Ramasamy| 34|Spark|         SparkGuru|
                  |      Ram Ghadiyaram| 42|Spark|         DevHunter|
                  |Ravi chandra Kanc...| 43|Spark|         DevHunter|
                  +--------------------+---+-----+------------------+

                  18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView
                  +----------------------+---+------------------+
                  |name                  |age|profileDescription|
                  +----------------------+---+------------------+
                  |Sarath                |33 |SparkSQLMaster    |
                  |Vasudha Nanduri       |30 |SparkSQLMaster    |
                  |Ravikumar Ramasamy    |34 |SparkGuru         |
                  |Ram Ghadiyaram        |42 |DevHunter         |
                  |Ravi chandra Kancharla|43 |DevHunter         |
                  +----------------------+---+------------------+

                  share|improve this answer

                  edited Nov 14 at 5:27

                  answered Nov 13 at 5:04

                  Ram Ghadiyaram

                  15.9k54275

