Packages

  • package root
  • package ai
  • package eto
  • package rikai
  • package sql

    Rikai SQL-ML extension.

    Rikai offers DDL to manipulate ML Models:

    CREATE MODEL model_name
    [ OPTIONS (key=value, key=value, ...) ]
    [ AS "model_registry_uri" ]
    
    # List all registered models.
    SHOW MODELS
    
    # Describe the details of a model.
    (DESC | DESCRIBE) MODEL model_name
    
    # Drop a Model
    DROP MODEL model_name

    An ML_PREDICT function is provided to run model inference:

    SELECT id, ML_PREDICT(model_name, col1, col2, col3) as predicted FROM table
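    The statements above can be issued through a SparkSession with the Rikai extension installed. A sketch (the model name, OPTIONS key, registry URI, column, and table below are hypothetical placeholders, not part of the Rikai API):

    ```scala
    // Hypothetical names throughout: my_detector, the registry URI, and the
    // images table are illustrative only.
    spark.sql("""
      CREATE MODEL my_detector
      OPTIONS (device="cpu")
      AS "mlflow:///my_detector"
    """)

    spark.sql("SHOW MODELS").show()
    spark.sql("DESC MODEL my_detector").show()

    spark.sql(
      "SELECT id, ML_PREDICT(my_detector, image) AS predicted FROM images"
    ).show()

    spark.sql("DROP MODEL my_detector")
    ```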
    Definition Classes
    rikai
  • package model
  • package spark
  • package execution
  • package expressions
  • package functions
  • Python
  • RikaiSparkSessionExtensions
  • SparkRunnable

package spark

Package Members

  1. package execution
  2. package expressions
  3. package functions

Type Members

  1. class RikaiSparkSessionExtensions extends (SparkSessionExtensions) => Unit

    Rikai SparkSession extensions to enable Spark SQL ML.
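    Since the class is a (SparkSessionExtensions) => Unit function, it can be registered through Spark's standard spark.sql.extensions configuration. A minimal sketch, assuming a stock Spark setup (the local master and app name are arbitrary):

    ```scala
    import org.apache.spark.sql.SparkSession

    // Register the extension by class name via Spark's standard config key.
    val spark = SparkSession
      .builder()
      .master("local[*]")
      .appName("rikai-sql-ml")
      .config(
        "spark.sql.extensions",
        "ai.eto.rikai.sql.spark.RikaiSparkSessionExtensions"
      )
      .getOrCreate()

    // With the extension installed, Rikai's SQL-ML statements
    // (CREATE MODEL, SHOW MODELS, ML_PREDICT, ...) are parsed by this session.
    ```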

  2. trait SparkRunnable extends AnyRef

    Make ai.eto.rikai.sql.model.Model runnable on Spark.

    For an ML_PREDICT expression in Spark SQL,

    SELECT ML_PREDICT(model_zoo, col1, col2, col3) FROM t1

    It generates a LogicalPlan equivalent to

    SELECT <Model{model_zoo}.asSpark(col1, col2, col3)> FROM t1
    Example:
    1. To implement an ai.eto.rikai.sql.model.Model for RegistryFoo:

      class FooModel(val name: String, val uri: String)
          extends Model
          with SparkRunnable {

        /** Use a Spark UDF with the same name to run RegistryFoo's model. */
        def asSpark(args: Seq[Expression]): Expression =
          UnresolvedFunction(
            new FunctionIdentifier(name),
            args,
            isDistinct = false,
            Option.empty
          )
      }
    Note

    It is the ai.eto.rikai.sql.model.Registry's responsibility to implement a Model that runs with Spark SQL.

Value Members

  1. object Python
