e2e tests Oracle source
itsmekumari committed Jan 9, 2025
1 parent 9f13a64 commit 4b3d35e
Showing 6 changed files with 241 additions and 6 deletions.
@@ -199,3 +199,39 @@ Feature: Oracle source- Verify Oracle source plugin design time validation scena
Then Enter textarea plugin property: "importQuery" with value: "invalidImportQuery"
Then Click on the Validate button
Then Verify that the Plugin Property: "user" is displaying an in-line error message: "errorMessageBlankUsername"

@Oracle_Required
Scenario: To verify Oracle source plugin validation error message with blank password
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "Oracle" from the plugins list as: "Source"
Then Navigate to the properties page of plugin: "Oracle"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Select radio button plugin property: "connectionType" with value: "service"
Then Select radio button plugin property: "role" with value: "normal"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Validate button
Then Verify that the Plugin is displaying an error message: "errorMessageBlankPassword" on the header

@Oracle_Required
Scenario: To verify Oracle source plugin validation error message with blank Host
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "Oracle" from the plugins list as: "Source"
Then Navigate to the properties page of plugin: "Oracle"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Select radio button plugin property: "connectionType" with value: "service"
Then Select radio button plugin property: "role" with value: "normal"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Validate button
Then Verify that the Plugin is displaying an error message: "errorMessageBlankHost" on the header
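The quoted values in these validation scenarios, such as "errorMessageBlankUsername", "errorMessageBlankPassword", and "errorMessageBlankHost", are property keys rather than literal messages; they resolve against errorMessage.properties, to which this commit adds the two new blank-password and blank-host entries. A minimal sketch of that indirection, assuming the file is on the test classpath (the e2e framework performs the lookup itself):

import java.io.InputStream;
import java.util.Properties;

public class ErrorMessageLookupExample {
  public static void main(String[] args) throws Exception {
    // Illustration only: load the resource directly and resolve the keys the
    // scenarios assert on. The framework does this lookup internally.
    Properties errors = new Properties();
    try (InputStream in =
             ErrorMessageLookupExample.class.getResourceAsStream("/errorMessage.properties")) {
      errors.load(in);
    }
    System.out.println(errors.getProperty("errorMessageBlankPassword"));
    System.out.println(errors.getProperty("errorMessageBlankHost"));
  }
}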
107 changes: 107 additions & 0 deletions oracle-plugin/src/e2e-test/features/source/OracleRunTime.feature
@@ -438,3 +438,110 @@ Feature: Oracle - Verify data transfer from Oracle source to BigQuery sink
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target Big Query table is equal to the values from source table

@ORACLE_SOURCE_TEST @BQ_SINK_TEST @Oracle_Required
Scenario: To verify data is getting transferred from Oracle source to BigQuery sink successfully with bounding query
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "Oracle" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "Oracle" and "BigQuery" to establish connection
Then Navigate to the properties page of plugin: "Oracle"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Select radio button plugin property: "connectionType" with value: "service"
Then Select radio button plugin property: "role" with value: "normal"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Enter textarea plugin property: "boundingQuery" with value: "boundingQuery"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "outputSchema"
Then Validate "Oracle" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Click plugin property: "truncateTable"
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target Big Query table is equal to the values from source table

@ORACLE_SOURCE_TEST @BQ_SINK_TEST @CONNECTION @Oracle_Required
Scenario: To verify data is getting transferred from Oracle source to BigQuery sink successfully with use connection
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "Oracle" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "Oracle" and "BigQuery" to establish connection
Then Navigate to the properties page of plugin: "Oracle"
And Click plugin property: "switch-useConnection"
And Click on the Browse Connections button
And Click on the Add Connection button
Then Click plugin property: "connector-Oracle"
And Enter input plugin property: "name" with value: "connection.name"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Select radio button plugin property: "connectionType" with value: "service"
Then Replace input plugin property: "database" with value: "databaseName"
Then Select radio button plugin property: "role" with value: "normal"
Then Click on the Test Connection button
And Verify the test connection is successful
Then Click on the Create button
Then Select connection: "connection.name"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "outputSchema"
Then Validate "Oracle" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Click plugin property: "truncateTable"
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target Big Query table is equal to the values from source table
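The bounding-query scenario above exercises split generation: the import query carries a $CONDITIONS placeholder and the bounding query returns the MIN and MAX of the split column (see the setTableName() hook changes later in this commit). A minimal sketch of how such a pair is typically resolved, assuming a split column named ID and illustrative bounds; the runtime, not the test code, performs the actual substitution:

public class BoundingQueryExample {
  public static void main(String[] args) {
    // Query shapes mirror the hook changes in this commit; schema and table are assumptions.
    String importQuery = "select * from HR.SOURCETABLE WHERE $CONDITIONS";
    String boundingQuery = "select MIN(ID),MAX(ID) from HR.SOURCETABLE";

    // Suppose the bounding query yields MIN(ID)=1 and MAX(ID)=1000 and two splits are
    // requested; each split replaces $CONDITIONS with its own range predicate.
    String split1 = importQuery.replace("$CONDITIONS", "ID >= 1 AND ID < 501");
    String split2 = importQuery.replace("$CONDITIONS", "ID >= 501 AND ID <= 1000");

    System.out.println(boundingQuery);
    System.out.println(split1); // select * from HR.SOURCETABLE WHERE ID >= 1 AND ID < 501
    System.out.println(split2); // select * from HR.SOURCETABLE WHERE ID >= 501 AND ID <= 1000
  }
}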

@@ -305,3 +305,64 @@ Feature: Oracle - Verify Oracle plugin data transfer with macro arguments
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target Big Query table is equal to the values from source table

@ORACLE_SOURCE_TEST @ORACLE_TARGET_TEST @Oracle_Required
Scenario: To verify data is getting transferred from Oracle to Oracle successfully when connection arguments, isolation level, bounding query are macro enabled
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "Oracle" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "Oracle" from the plugins list as: "Sink"
Then Connect plugins: "Oracle" and "Oracle2" to establish connection
Then Navigate to the properties page of plugin: "Oracle"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Select radio button plugin property: "connectionType" with value: "service"
Then Select radio button plugin property: "role" with value: "normal"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Macro button of Property: "connectionArguments" and set the value to: "connArgumentsSource"
Then Click on the Macro button of Property: "transactionIsolationLevel" and set the value to: "defaultTransactionIsolationLevel"
Then Replace input plugin property: "database" with value: "databaseName"
Then Click on the Macro button of Property: "boundingQuery" and set the value in textarea: "oracleBoundingQuery"
Then Validate "Oracle" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "Oracle2"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
Then Replace input plugin property: "database" with value: "databaseName"
Then Replace input plugin property: "tableName" with value: "targetTable"
Then Replace input plugin property: "dbSchemaName" with value: "schema"
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Enter input plugin property: "referenceName" with value: "targetRef"
Then Select radio button plugin property: "connectionType" with value: "service"
Then Select radio button plugin property: "role" with value: "normal"
Then Validate "Oracle2" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
Then Enter runtime argument value "boundingQuery" for key "oracleBoundingQuery"
Then Enter runtime argument value "transactionIsolationLevel" for key "defaultTransactionIsolationLevel"
Then Run the preview of pipeline with runtime arguments
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
Then Enter runtime argument value "boundingQuery" for key "oracleBoundingQuery"
Then Enter runtime argument value "transactionIsolationLevel" for key "defaultTransactionIsolationLevel"
Then Run the Pipeline in Runtime with runtime arguments
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target table is equal to the values from source table
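In the macro scenario above, the Macro button steps replace each property with a CDAP macro (for example ${connArgumentsSource}), and the runtime-argument steps later supply a value for each macro key during preview and deployed runs. A minimal sketch of that key-to-value mapping; the values shown are assumptions, since the real ones come from the test's plugin parameter properties:

import java.util.Map;

public class MacroRuntimeArgsExample {
  public static void main(String[] args) {
    // Keys match the macro names set in the scenario; values are illustrative only.
    Map<String, String> runtimeArgs = Map.of(
        "connArgumentsSource", "fetchsize=1000",
        "oracleBoundingQuery", "select MIN(ID),MAX(ID) from HR.SOURCETABLE",
        "defaultTransactionIsolationLevel", "TRANSACTION_SERIALIZABLE");

    runtimeArgs.forEach((key, value) -> System.out.println(key + " = " + value));
  }
}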
@@ -17,6 +17,8 @@
package io.cdap.plugin.common.stepsdesign;

import com.google.cloud.bigquery.BigQueryException;
import io.cdap.e2e.pages.actions.CdfConnectionActions;
import io.cdap.e2e.pages.actions.CdfPluginPropertiesActions;
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.plugin.OracleClient;
@@ -48,8 +50,10 @@ public static void setTableName() {
PluginPropertyUtils.addPluginProp("sourceTable", sourceTableName);
PluginPropertyUtils.addPluginProp("targetTable", targetTableName);
String schema = PluginPropertyUtils.pluginProp("schema");
PluginPropertyUtils.addPluginProp("selectQuery", String.format("select * from %s.%s", schema,
sourceTableName));
PluginPropertyUtils.addPluginProp("selectQuery", String.format("select * from %s.%s "
+ "WHERE $CONDITIONS", schema, sourceTableName));
PluginPropertyUtils.addPluginProp("boundingQuery", String.format("select MIN(ID),MAX(ID)"
+ " from %s.%s", schema, sourceTableName));
}

@Before(order = 2, value = "@ORACLE_SOURCE_TEST")
@@ -416,4 +420,25 @@ public static void dropOracleTargetDateTable() throws SQLException, ClassNotFoun
BeforeActions.scenario.write("Oracle Target Table - " + PluginPropertyUtils.pluginProp("targetTable")
+ " deleted successfully");
}

@Before(order = 1, value = "@CONNECTION")
public static void setNewConnectionName() {
String connectionName = "Oracle" + RandomStringUtils.randomAlphanumeric(10);
PluginPropertyUtils.addPluginProp("connection.name", connectionName);
BeforeActions.scenario.write("New Connection name: " + connectionName);
}

private static void deleteConnection(String connectionType, String connectionName) throws IOException {
CdfConnectionActions.openWranglerConnectionsPage();
CdfConnectionActions.expandConnections(connectionType);
CdfConnectionActions.openConnectionActionMenu(connectionType, connectionName);
CdfConnectionActions.selectConnectionAction(connectionType, connectionName, "Delete");
CdfPluginPropertiesActions.clickPluginPropertyButton("Delete");
}

@After(order = 1, value = "@CONNECTION")
public static void deleteBQConnection() throws IOException {
deleteConnection("Oracle", "connection.name");
PluginPropertyUtils.removePluginProp("connection.name");
}
}
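The new @CONNECTION hooks register a randomly suffixed connection name under the "connection.name" plugin property before each tagged scenario and delete that connection afterwards; the Gherkin step Select connection: "connection.name" presumably resolves the generated value through the same property rather than using the literal text. A minimal sketch of the name generation, with an assumed example output:

import org.apache.commons.lang3.RandomStringUtils;

public class ConnectionNameExample {
  public static void main(String[] args) {
    // Mirrors setNewConnectionName(): a fresh random suffix per scenario keeps
    // repeated @CONNECTION runs from colliding on the same connection name.
    String connectionName = "Oracle" + RandomStringUtils.randomAlphanumeric(10);
    System.out.println("New Connection name: " + connectionName); // e.g. OracleAb3dE9xQ1z
  }
}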
2 changes: 2 additions & 0 deletions oracle-plugin/src/e2e-test/resources/errorMessage.properties
@@ -17,3 +17,5 @@ errorMessageInvalidSinkDatabase=Exception while trying to validate schema of dat
errorMessageInvalidHost=Exception while trying to validate schema of database table '"table"' for connection
errorLogsMessageInvalidBoundingQuery=Spark program 'phase-1' failed with error: ORA-00936: missing expression . \
Please check the system logs for more details.
errorMessageBlankPassword=SQL error while getting query schema: ORA-01005: null password given; logon denied
errorMessageBlankHost=SQL error while getting query schema: IO Error: