
[BUG] rapids_integration-scala213-dev-github tests failing #2699

Closed · sameerz opened this issue Dec 15, 2024 · 0 comments · Fixed by #2700
Labels: ? - Needs Triage, bug (Something isn't working)

sameerz (Collaborator) commented Dec 15, 2024

Describe the bug
13 test failures, all related to com.nvidia.spark.rapids.jni.Hash.getMaxStackDepth.

Example failure:

answer = 'xro516042'
gateway_client = <py4j.clientserver.JavaClient object at 0x7f6ed7cc3280>
target_id = 'o82', name = 'sql'

    def get_return_value(answer, gateway_client, target_id=None, name=None):
        """Converts an answer received from the Java gateway into a Python object.

        For example, string representation of integers are converted to Python
        integer, string representation of objects are converted to JavaObject
        instances, etc.

        :param answer: the string returned by the Java gateway
        :param gateway_client: the gateway client used to communicate with the Java
            Gateway. Only necessary if the answer is a reference (e.g., object,
            list, map)
        :param target_id: the name of the object from which the answer comes from
            (e.g., object1 in object1.hello()). Optional.
        :param name: the name of the member from which the answer comes from
            (e.g., hello in object1.hello()). Optional.
        """
        if is_error(answer)[0]:
            if len(answer) > 1:
                type = answer[1]
                value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
                if answer[1] == REFERENCE_TYPE:
>                   raise Py4JJavaError(
                        "An error occurred while calling {0}{1}{2}.\n".
                        format(target_id, ".", name), value)
E       py4j.protocol.Py4JJavaError: An error occurred while calling o82.sql.
E       : java.lang.NoClassDefFoundError: Could not initialize class com.nvidia.spark.rapids.jni.Hash
E           at com.nvidia.spark.rapids.GpuOverrides$$anon$174.tagExprForGpu(GpuOverrides.scala:3339)
E           at com.nvidia.spark.rapids.BaseExprMeta.tagSelfForGpu(RapidsMeta.scala:1220)
E           at com.nvidia.spark.rapids.RapidsMeta.tagForGpu(RapidsMeta.scala:318)
E           at org.apache.spark.sql.rapids.BucketIdMetaUtils$.$anonfun$tagForBucketingHiveWrite$1(GpuFileFormatDataWriter.scala:957)
E           at org.apache.spark.sql.rapids.BucketIdMetaUtils$.$anonfun$tagForBucketingHiveWrite$1$adapted(GpuFileFormatDataWriter.scala:953)
E           at scala.Option.foreach(Option.scala:437)
E           at org.apache.spark.sql.rapids.BucketIdMetaUtils$.tagForBucketingHiveWrite(GpuFileFormatDataWriter.scala:953)
E           at com.nvidia.spark.rapids.shims.BucketingUtilsShim$.tagForHiveBucketingWrite(BucketingUtilsShim.scala:87)
E           at com.nvidia.spark.rapids.InsertIntoHadoopFsRelationCommandMeta.tagSelfForGpuInternal(GpuOverrides.scala:329)
E           at com.nvidia.spark.rapids.DataWritingCommandMeta.tagSelfForGpu(RapidsMeta.scala:556)
E           at com.nvidia.spark.rapids.RapidsMeta.tagForGpu(RapidsMeta.scala:318)
E           at com.nvidia.spark.rapids.RapidsMeta.$anonfun$tagForGpu$4(RapidsMeta.scala:295)
E           at com.nvidia.spark.rapids.RapidsMeta.$anonfun$tagForGpu$4$adapted(RapidsMeta.scala:295)
E           at scala.collection.immutable.List.foreach(List.scala:333)
E           at com.nvidia.spark.rapids.RapidsMeta.tagForGpu(RapidsMeta.scala:295)
E           at com.nvidia.spark.rapids.GpuOverrides$.wrapAndTagPlan(GpuOverrides.scala:4512)
E           at com.nvidia.spark.rapids.GpuOverrides.applyOverrides(GpuOverrides.scala:4844)
E           at com.nvidia.spark.rapids.GpuOverrides.$anonfun$applyWithContext$3(GpuOverrides.scala:4723)
E           at com.nvidia.spark.rapids.GpuOverrides$.logDuration(GpuOverrides.scala:458)
E           at com.nvidia.spark.rapids.GpuOverrides.$anonfun$applyWithContext$1(GpuOverrides.scala:4720)
E           at com.nvidia.spark.rapids.GpuOverrideUtil$.$anonfun$tryOverride$1(GpuOverrides.scala:4686)
E           at com.nvidia.spark.rapids.GpuOverrides.applyWithContext(GpuOverrides.scala:4740)
E           at com.nvidia.spark.rapids.GpuOverrides.apply(GpuOverrides.scala:4713)
E           at com.nvidia.spark.rapids.GpuOverrides.apply(GpuOverrides.scala:4709)
E           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1(Columnar.scala:530)
E           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1$adapted(Columnar.scala:530)
E           at scala.collection.immutable.List.foreach(List.scala:333)
E           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:530)
E           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:482)
E           at org.apache.spark.sql.execution.QueryExecution$.$anonfun$prepareForExecution$1(QueryExecution.scala:477)
E           at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:169)
E           at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:165)
E           at scala.collection.immutable.List.foldLeft(List.scala:79)
E           at org.apache.spark.sql.execution.QueryExecution$.prepareForExecution(QueryExecution.scala:476)
E           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:186)
E           at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:138)
E           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:219)
E           at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:546)
E           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:219)
E           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
E           at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:218)
E           at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:186)
E           at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:179)
E           at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:238)
E           at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:284)
E           at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:252)
E           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:117)
E           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
E           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
E           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
E           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
E           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
E           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
E           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
E           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
E           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:461)
E           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
E           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
E           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
E           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
E           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
E           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:437)
E           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:98)
E           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:85)
E           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:83)
E           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:220)
E           at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
E           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
E           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
E           at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:638)
E           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
E           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:629)
E           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:659)
E           at jdk.internal.reflect.GeneratedMethodAccessor186.invoke(Unknown Source)
E           at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
E           at java.base/java.lang.reflect.Method.invoke(Method.java:569)
E           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
E           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
E           at py4j.Gateway.invoke(Gateway.java:282)
E           at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
E           at py4j.commands.CallCommand.execute(CallCommand.java:79)
E           at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
E           at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
E           at java.base/java.lang.Thread.run(Thread.java:840)
E       Caused by: java.lang.ExceptionInInitializerError: Exception java.lang.UnsatisfiedLinkError: 'int com.nvidia.spark.rapids.jni.Hash.getMaxStackDepth()' [in thread "Thread-4"]
E           at com.nvidia.spark.rapids.jni.Hash.getMaxStackDepth(Native Method)
E           at com.nvidia.spark.rapids.jni.Hash.<clinit>(Hash.java:28)
E           ... 84 more
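
The shape of the failure follows from how the JVM handles class initialization: Hash's static initializer calls the native method getMaxStackDepth(), the corresponding JNI symbol is not found in the loaded native library (typically version skew between the plugin jar and the spark-rapids-jni native binary), the class is marked as failed, and every later reference to it throws NoClassDefFoundError: Could not initialize class. That is why 13 otherwise unrelated tests fail identically. Below is a minimal, self-contained Java sketch of the pattern; the class names are hypothetical, and the missing native symbol is simulated with a thrown UnsatisfiedLinkError rather than loading a real library.

// InitFailureDemo.java -- hedged sketch with hypothetical names; it simulates
// the missing JNI symbol instead of loading a real native library.
class NativeStub {
    static {
        // Stands in for a JNI binding whose symbol is absent from the loaded
        // native library. Throwing from <clinit> marks the class as having
        // failed initialization.
        if (true) {
            throw new UnsatisfiedLinkError("'int NativeStub.getMaxStackDepth()'");
        }
    }
    static native int getMaxStackDepth();  // never actually invoked
}

public class InitFailureDemo {
    public static void main(String[] args) {
        for (int i = 0; i < 2; i++) {
            try {
                NativeStub.getMaxStackDepth();
            } catch (Throwable t) {
                // 1st attempt: java.lang.UnsatisfiedLinkError -- an Error
                //   thrown from <clinit> is rethrown as-is.
                // 2nd attempt: java.lang.NoClassDefFoundError: Could not
                //   initialize class NativeStub -- the same shape as the
                //   13 test failures above.
                System.out.println(t);
            }
        }
    }
}

The "Exception java.lang.UnsatisfiedLinkError ... [in thread \"Thread-4\"]" wording in the Caused by line is the JVM's saved record of the original initialization failure from the first thread that touched Hash; threads arriving later only see the NoClassDefFoundError.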

Steps/Code to reproduce bug
Run the integration tests with Scala 2.13.

Expected behavior
Integration tests pass

Environment details

  • Environment location: Nightly Jenkins build


sameerz added the "? - Needs Triage" and "bug" labels on Dec 15, 2024
ustcfy self-assigned this on Dec 16, 2024
res-life self-assigned this on Dec 16, 2024