@RequiresEnterpriseLicense(allowEval=true)
@MetaDataScope(value=DataSenseResolver.class)
@Connector(name="spark-sql", friendlyName="SparkSql", minMuleVersion="3.6")
public class SparkSqlConnector
extends Object
| Constructor and Description |
|---|
| SparkSqlConnector() |
| Modifier and Type | Method and Description |
|---|---|
| void | addCassandraDataProperties(List<CassandraDataProperties> listCassandraDataProps) - Dynamically add new datasources from Cassandra database keyspaces and tables |
| void | addFileDataProperties(List<FileDataProperties> listFileDataProps) - Dynamically add new datasources from Hadoop HDFS files |
| void | addJdbcDataProperties(List<JdbcDataProperties> listJdbcDataProps) - Dynamically add new datasources from JDBC-accessible databases |
| Object | customSql(String sql) - Run an arbitrary SQL query through the Spark SQL Context |
| void | disconnect() - Force global configuration disconnection |
| void | dropTemporaryTable(String temporaryTableName) - Drop a temporary table present in the Spark SQL Context |
| org.apache.spark.sql.DataFrame | executeSql(String sql) - Execute SQL in the SQL Context |
| ConnectorConfig | getConfig() |
| boolean | isSqlSelectValid(String sql) - Validate that a SQL statement is a select query |
| void | setConfig(ConnectorConfig config) |
| Object | sqlSelect(String sqlSelect, String temporaryTableName, Boolean streaming) - Run a SQL select query |
| String | toNativeQuery(org.mule.common.query.DsqlQuery query) |
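The sqlSelect operation in the table above returns a List<Map<String, Object>> by default, or an Iterator<Map<String, Object>> when its streaming flag is true. A minimal sketch of consumer code that handles both documented result shapes; the rows below are hypothetical in-memory stand-ins, not output from a live connector:

```java
import java.util.*;

// Sketch only: shows why a caller of sqlSelect must branch on the two
// documented result types. streaming=true yields an Iterator, which can
// be walked row by row without materializing the full result set;
// streaming=false (or unset) yields a fully loaded List.
class ResultShapes {

    // Count rows regardless of which result shape sqlSelect returned.
    static long countRows(Object result) {
        if (result instanceof Iterator) {
            // Streaming shape: consume lazily, one row at a time.
            long n = 0;
            Iterator<?> it = (Iterator<?>) result;
            while (it.hasNext()) {
                it.next();
                n++;
            }
            return n;
        }
        // Non-streaming shape: all rows are already in memory.
        return ((List<?>) result).size();
    }

    // Hypothetical sample rows standing in for real query results.
    static List<Map<String, Object>> sampleRows() {
        Map<String, Object> first = new HashMap<>();
        first.put("id", 1);
        Map<String, Object> second = new HashMap<>();
        second.put("id", 2);
        return Arrays.asList(first, second);
    }
}
```

The same counting logic accepts `sampleRows()` directly (the list shape) or `sampleRows().iterator()` (the streaming shape), which is the pattern a Mule component consuming this payload would follow.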
@Processor
public void addJdbcDataProperties(List<JdbcDataProperties> listJdbcDataProps)
throws SQLException

Dynamically add new datasources from JDBC-accessible databases.

Parameters:
listJdbcDataProps - List of JDBC-accessible database options
Throws:
SparkSqlConnectorException - If new JDBC data source options are not valid
SQLException - Error thrown during JDBC configuration

@Processor
public void addFileDataProperties(List<FileDataProperties> listFileDataProps)

Dynamically add new datasources from Hadoop HDFS files.

Parameters:
listFileDataProps - List of Hadoop HDFS file options
Throws:
SparkSqlConnectorException - If new file options are not valid

@Processor
public void addCassandraDataProperties(List<CassandraDataProperties> listCassandraDataProps)

Dynamically add new datasources from Cassandra database keyspaces and tables.

Parameters:
listCassandraDataProps - List of Cassandra database keyspaces and tables
Throws:
SparkSqlConnectorException - If new schema and/or table options are not valid

@Processor
public Object customSql(String sql)

Run an arbitrary SQL query through the Spark SQL Context.

Parameters:
sql - SQL query
Throws:
SparkSqlConnectorException - If there is any problem running the query (for example, the query is not valid)

@MetaDataScope(value=DataSenseResolver.class)
@Processor
public Object sqlSelect(@Placement(order=1) @Query String sqlSelect,
                        @Placement(order=2) @Optional String temporaryTableName,
                        @Placement(order=3) @Optional Boolean streaming)

Run a SQL select query in the Apache Spark cluster through the Spark SQL Context.

Parameters:
sqlSelect - SQL select query to run in the Apache Spark cluster through the Spark SQL Context
temporaryTableName - If set, the name of a new temporary table made available in the Spark SQL Context with the current query results
streaming - If true, the result is of type Iterator<Map<String, Object>> rather than List<Map<String, Object>>; this avoids loading large result sets into memory
Throws:
SqlSelectNotValidException - If the query is not a select query
SparkSqlConnectorException - If there is any problem running the query (for example, the query is not valid)

public boolean isSqlSelectValid(String sql)

Validate that a SQL statement is a select query.

Parameters:
sql - SQL select query

public org.apache.spark.sql.DataFrame executeSql(String sql)

Execute SQL in the SQL Context.

Parameters:
sql - SQL query
Throws:
SparkSqlConnectorException - If there is any problem running the query (for example, the query is not valid)

@Processor
public void dropTemporaryTable(String temporaryTableName)
throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Drop a temporary table present in the Spark SQL Context.

Parameters:
temporaryTableName - Temporary table name
Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException - If the table does not exist in the Spark SQL Context
EmptyTableNameException - If the table name argument is null or empty

public ConnectorConfig getConfig()

public void setConfig(ConnectorConfig config)

@Processor
public void disconnect()

Force global configuration disconnection.

@QueryTranslator
public String toNativeQuery(org.mule.common.query.DsqlQuery query)
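isSqlSelectValid(String) above reports whether a statement is a select query, and sqlSelect throws SqlSelectNotValidException otherwise. The connector's real validation logic is not documented here; the following is a deliberately naive stand-in using a case-insensitive prefix check, for illustration only:

```java
import java.util.Locale;

// Illustrative sketch only: a naive approximation of the kind of check
// isSqlSelectValid(String) implies. The actual connector may parse the
// statement; this prefix test is an assumption, not the real algorithm.
class SqlSelectCheck {

    // True when the statement starts with the SELECT keyword, ignoring
    // leading whitespace and letter case. Null or empty input is invalid,
    // mirroring the EmptyTableNameException-style null/empty handling
    // documented elsewhere in this connector.
    static boolean isSelect(String sql) {
        if (sql == null) {
            return false;
        }
        return sql.trim().toLowerCase(Locale.ROOT).startsWith("select");
    }
}
```

A check like this would reject "DROP TABLE users" before it ever reaches the Spark SQL Context, which is presumably why sqlSelect distinguishes SqlSelectNotValidException from the more general SparkSqlConnectorException.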
Copyright © 2010–2017. All rights reserved.