Build Logs

galliaproject/gallia-spark • 3.8.0-RC2:2025-11-28

Errors: 50
Warnings: 114
Total Lines: 848

##################################
Cloning https://github.com/galliaproject/gallia-spark.git into /build/repo using revision v0.6.1
##################################
Note: switching to '593174c0c54fc356dbe6c05994a6aa493eaee7b7'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

Would override fixed Scala version: 3.3.1
----
Preparing build for 3.8.0-RC2
Scala binary version found: 3.8
Implicitly using source version 3.8
Scala binary version found: 3.8
Implicitly using source version 3.8
Would try to apply common scalacOption (best-effort, sbt/mill only):
Append: ,REQUIRE:-source:3.8
Remove: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e

Try apply source patch:
Path: project/GalliaCommonSettings.scala
Pattern: val scala3 = "3.3.1"
Replacement: val scala3 = "3.8.0-RC2"
Starting compilation server
Compiling project (Scala 3.7.3, JVM (17))
Compiled project (Scala 3.7.3, JVM (17))
Successfully applied pattern 'val scala3 = "3.3.1"' in project/GalliaCommonSettings.scala
----
Starting build for 3.8.0-RC2
Execute tests: true
sbt project found:
Sbt version 1.9.0 is not supported, minimal supported version is 1.11.5
Enforcing usage of sbt in version 1.11.5
No prepare script found for project galliaproject/gallia-spark
##################################
Scala version: 3.8.0-RC2
Targets: io.github.galliaproject%gallia-spark
Project projectConfig: {"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
##################################
Using extra scalacOptions: ,REQUIRE:-source:3.8
Filtering out scalacOptions: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
[sbt_options] declare -a sbt_options=()
[process_args] java_version = '17'
[copyRt] java9_rt = '/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8/rt.jar'
# Executing command line:
java
-Dfile.encoding=UTF-8
-Dcommunitybuild.scala=3.8.0-RC2
-Dcommunitybuild.project.dependencies.add=
-Xmx7G
-Xms4G
-Xss8M
-Dsbt.script=/root/.sdkman/candidates/sbt/current/bin/sbt
-Dscala.ext.dirs=/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8
-jar
/root/.sdkman/candidates/sbt/1.11.5/bin/sbt-launch.jar
"setCrossScalaVersions 3.8.0-RC2"
"++3.8.0-RC2 -v"
"mapScalacOptions ",REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s" ",-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e""
"set every credentials := Nil"
"excludeLibraryDependency com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}"
"removeScalacOptionsStartingWith -P:wartremover"

moduleMappings
"runBuild 3.8.0-RC2 """{"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}""" io.github.galliaproject%gallia-spark"

[info] welcome to sbt 1.11.5 (Eclipse Adoptium Java 17.0.8)
[info] loading settings for project repo-build from akka.sbt, plugins.sbt...
[info] loading project definition from /build/repo/project
[info] compiling 4 Scala sources to /build/repo/project/target/scala-2.12/sbt-1.0/classes ...
[info] Non-compiled module 'compiler-bridge_2.12' for Scala 2.12.20. Compiling...
[info] Compilation completed in 8.842s.
[info] done compiling
[info] loading settings for project root from build.sbt...
[info] set current project to gallia-spark (in build file:/build/repo/)
Execute setCrossScalaVersions: 3.8.0-RC2
OpenCB::Changing crossVersion 3.8.0-RC2 -> 3.8.0-RC2 in root/crossScalaVersions
[info] set current project to gallia-spark (in build file:/build/repo/)
[info] Setting Scala version to 3.8.0-RC2 on 1 projects.
[info] Switching Scala version on:
[info] * root (3.8.0-RC2, 2.13.12, 2.12.18)
[info] Excluding projects:
[info] Reapplying settings...
[info] set current project to gallia-spark (in build file:/build/repo/)
Execute mapScalacOptions: ,REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
[info] Reapplying settings...
[info] set current project to gallia-spark (in build file:/build/repo/)
[info] Defining Global / credentials, credentials
[info] The new values will be used by Global / pgpSelectPassphrase, Global / pgpSigningKey and 4 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to gallia-spark (in build file:/build/repo/)
Execute excludeLibraryDependency: com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}
[info] Reapplying settings...
OpenCB::Failed to reapply settings in excludeLibraryDependency: Reference to undefined setting:

 Global / allExcludeDependencies from Global / allExcludeDependencies (CommunityBuildPlugin.scala:331)
 Did you mean allExcludeDependencies ?
 , retry without global scopes
[info] Reapplying settings...
[info] set current project to gallia-spark (in build file:/build/repo/)
Execute removeScalacOptionsStartingWith: -P:wartremover
[info] Reapplying settings...
[info] set current project to gallia-spark (in build file:/build/repo/)
[success] Total time: 0 s, completed Nov 28, 2025, 2:25:40 PM
Build config: {"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
Parsed config: Success(ProjectBuildConfig(ProjectsConfig(List(),Map()),Full,List()))
Starting build...
Projects: Set(root)
Starting build for ProjectRef(file:/build/repo/,root) (gallia-spark)... [0/1]
OpenCB::Exclude Scala3 specific scalacOption `REQUIRE:-source:3.8` in Scala 2.12.20 module Global
OpenCB::Filter out '-feature', matches setting pattern '^-?-feature'
OpenCB::Filter out '-deprecation', matches setting pattern '^-?-deprecation'
Compile scalacOptions: -encoding, UTF-8, -unchecked, -language:implicitConversions, -Wunused:implicits, -Wunused:explicits, -Wunused:imports, -Wunused:locals, -Wunused:params, -Wunused:privates, -no-indent, -Wvalue-discard, -Wconf:msg=can be rewritten automatically under:s, -source:3.8
[info] compiling 14 Scala sources to /build/repo/../bin/spark/scala-3.8.0-RC2/classes ...
[warn] -- [E175] Potential Issue Warning: /build/repo/src/main/scala/aptus/spark/SparkDriver.scala:34:28
[warn] 34 | sys.addShutdownHook {
[warn] | ^
[warn] |discarded non-Unit value of type scala.sys.ShutdownHookThread. Add `: Unit` to discard silently.
[warn] 35 | // TODO: or as "whatever is in cache?"
[warn] 36 | //println("stopping spark-context")
[warn] 37 | sc.stop() } }
[warn] -- Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:6
[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
[warn] | ^
[warn] | Line is indented too far to the right, or a `{` is missing before:
[warn] |
[warn] | left.cogroup(right).pype(postCoGroup(joinType))
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/spark/SparkRddIn.scala:9:12
[warn] 9 | def rdd (sc: SparkContext, schema: Cls, rdd: RDD[Obj]): HeadS = RddInputObjs (schema, rdd) .pipe(heads.Head.inputZ)
[warn] | ^^
[warn] | unused explicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:11:43
[warn] 11 | def innerHashJoinWithLeftBias[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:11:56
[warn] 11 | def innerHashJoinWithLeftBias[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:30:30
[warn] 30 | def leftHashJoin[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:30:43
[warn] 30 | def leftHashJoin[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:49:31
[warn] 49 | def rightHashJoin[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:49:44
[warn] 49 | def rightHashJoin[K: ClassTag, V: ClassTag](
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:41
[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
[warn] | ^
[warn] | unused implicit parameter
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:54
[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
[warn] | ^
[warn] | unused implicit parameter
[warn] there was 1 deprecation warning; re-run with -deprecation for details
[warn] 12 warnings found
[info] done compiling
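
Note: the [E175] "discarded non-Unit value" warning above is triggered by -Wvalue-discard, which this build adds to the Compile scalacOptions. The fix is the one the compiler itself suggests: an explicit `: Unit` ascription on the discarded expression. A minimal sketch, assuming a context-stopping hook like the one reported for SparkDriver.scala:34 (the helper name here is hypothetical):

  import org.apache.spark.SparkContext

  // Ascribing `: Unit` marks the ShutdownHookThread result as
  // deliberately discarded, which silences the E175 warning.
  def registerStopHook(sc: SparkContext): Unit =
    sys.addShutdownHook { sc.stop() }: Unit
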
[info] compiling 7 Scala sources to /build/repo/../bin/spark/scala-3.8.0-RC2/test-classes ...
[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:36:61
[warn] 36 | "/tmp/spark_test.csv/part-00000".readFileLines().ensuring(_ == List(
[warn] | ^
[warn] |discarded non-Unit value of type List[aptus.Line]. Add `: Unit` to discard silently.
[warn] 37 | "FOO,BAZ",
[warn] 38 | "BAR1,1",
[warn] 39 | "BAR2,2",
[warn] 40 | "")) }
[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkCsvTest.scala:22:16
[warn] 22 | res.ensuring(
[warn] | ^
[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
[warn] 23 | _ == bobjs(
[warn] 24 | bobj("foo" -> "BAR1", "baz" -> "1"),
[warn] 25 | bobj("foo" -> "BAR2", "baz" -> "2"))
[warn] 26 | .forceAObjs) } }
[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkJsonLinesTest.scala:23:15
[warn] 22 | res
[warn] 23 | .ensuring(
[warn] | ^
[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
[warn] 24 | _ == bobjs(
[warn] 25 | bobj("foo" -> "BAR1", "baz" -> 1),
[warn] 26 | bobj("foo" -> "BAR2", "baz" -> 2))
[warn] 27 | .forceAObjs) } }
[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkLinesTest.scala:23:15
[warn] 22 | res
[warn] 23 | .ensuring(
[warn] | ^
[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
[warn] 24 | _ == bobjs(
[warn] 25 | bobj(_line -> "FOO,BAZ"),
[warn] 26 | bobj(_line -> "BAR1,1"),
[warn] 27 | bobj(_line -> "BAR2,2"))
[warn] 28 | .forceAObjs) } }
[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkRddDirectlyTest.scala:25:15
[warn] 24 | res
[warn] 25 | .ensuring(
[warn] | ^
[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
[warn] 26 | _ == bobjs(
[warn] 27 | bobj(_line -> "FOO,BAZ"),
[warn] 28 | bobj(_line -> "BAR1,1"),
[warn] 29 | bobj(_line -> "BAR2,2"),
[warn] 30 | bobj(_line -> ""))
[warn] 31 | .forceAObjs) } }
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:4:21
[warn] 4 |import util.chaining._
[warn] | ^
[warn] | unused import
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:7:14
[warn] 7 |import gallia._
[warn] | ^
[warn] | unused import
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/SparkAvroTest.scala:4:14
[warn] 4 |import gallia._
[warn] | ^
[warn] | unused import
[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/SparkAvroTest.scala:5:20
[warn] 5 |import gallia.spark._
[warn] | ^
[warn] | unused import
[warn] 9 warnings found
[info] done compiling
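
Note: the test sources compile under the same -Wunused/-Wvalue-discard flags, hence the nine [E175]/[E198] warnings above. If the goal were to keep these sources untouched, a targeted -Wconf filter could demote exactly those diagnostic IDs; this is a hypothetical build.sbt tweak, not something this build applies:

  // Demote value-discard (E175) and unused-symbol (E198) diagnostics
  // to silent, for test compilation only.
  Test / scalacOptions += "-Wconf:id=E175:s,id=E198:s"
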
-------------------------------- Running Tests --------------------------------
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
25/11/28 14:26:47 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:47 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:47 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/11/28 14:26:47 INFO ResourceUtils: ==============================================================
25/11/28 14:26:47 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:47 INFO ResourceUtils: ==============================================================
25/11/28 14:26:47 INFO SparkContext: Submitted application: my-spark
25/11/28 14:26:47 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
25/11/28 14:26:47 INFO ResourceProfile: Limiting resource is cpu
25/11/28 14:26:47 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:47 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:47 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:47 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:47 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:47 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 39833.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.basic spark (non gallia) 1273ms
 java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
scala.concurrent.Future$.traverse$$anonfun$1(Future.scala:888)
scala.collection.IterableOnceOps.foldLeft(IterableOnce.scala:741)
scala.collection.IterableOnceOps.foldLeft$(IterableOnce.scala:337)
scala.collection.AbstractIterator.foldLeft(Iterator.scala:1328)
scala.concurrent.Future$.traverse(Future.scala:888)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-csv
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 33001.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-csv 26ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$2(SparkTest.scala:33)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$2(SparkTest.scala:33)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-csv-no-trailing-newline
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 39757.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-csv-no-trailing-newline 28ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$3(SparkTest.scala:34)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$3(SparkTest.scala:34)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-lines-plain
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 46199.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-lines-plain 22ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$4(SparkTest.scala:37)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$4(SparkTest.scala:37)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-lines-gz
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 46319.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-lines-gz 19ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$5(SparkTest.scala:38)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$5(SparkTest.scala:38)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-lines-bz2
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 34753.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-lines-bz2 20ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$6(SparkTest.scala:39)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$6(SparkTest.scala:39)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-rdd-directly
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 33625.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-rdd-directly 31ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkRddDirectlyTest$.apply(SparkRddDirectlyTest.scala:12)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$7(SparkTest.scala:42)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkRddDirectlyTest$.apply(SparkRddDirectlyTest.scala:12)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$7(SparkTest.scala:42)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-jsonl
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 40031.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-jsonl 22ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkJsonLinesTest$.apply(SparkJsonLinesTest.scala:11)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$8(SparkTest.scala:45)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
25/11/28 14:26:48 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
scala.Option.getOrElse(Option.scala:203)
aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
galliatesting.spark.SparkJsonLinesTest$.apply(SparkJsonLinesTest.scala:11)
galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$8(SparkTest.scala:45)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.framework.TestCallTree.run(Model.scala:45)
utest.framework.TestCallTree.run(Model.scala:43)
utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
utest.framework.Executor.utestWrap(Executor.scala:12)
utest.framework.Executor.utestWrap$(Executor.scala:5)
utest.TestSuite.utestWrap(TestSuite.scala:12)
utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
25/11/28 14:26:48 INFO SparkContext: Running Spark version 3.5.0
25/11/28 14:26:48 INFO SparkContext: OS info Linux, 6.8.0-1041-azure, amd64
25/11/28 14:26:48 INFO SparkContext: Java version 17.0.8
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/28 14:26:48 INFO ResourceUtils: ==============================================================
25/11/28 14:26:48 INFO SparkContext: Submitted application: spark-register
25/11/28 14:26:48 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/28 14:26:48 INFO SecurityManager: Changing view acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls to: root
25/11/28 14:26:48 INFO SecurityManager: Changing view acls groups to:
25/11/28 14:26:48 INFO SecurityManager: Changing modify acls groups to:
25/11/28 14:26:48 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
25/11/28 14:26:48 INFO Utils: Successfully started service 'sparkDriver' on port 39225.
25/11/28 14:26:48 INFO SparkEnv: Registering MapOutputTracker
25/11/28 14:26:48 INFO SparkEnv: Registering BlockManagerMaster
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/28 14:26:48 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
X galliatesting.spark.SparkTest.spark-register 23ms
 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
 scala.Option.getOrElse(Option.scala:203)
 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
 galliatesting.spark.SparkTest$.galliaSparkRegister(SparkTest.scala:57)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$9(SparkTest.scala:51)
 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0 [in thread "pool-18-thread-3"]
 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
[error] Test suite galliatesting.spark.SparkTest failed with java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x37971d0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x37971d0.
[error] This may be due to the ClassLoaderLayeringStrategy (ScalaLibrary) used by your task.
[error] To improve performance and reduce memory, sbt attempts to cache the class loaders used to load the project dependencies.
[error] The project class files are loaded in a separate class loader that is created for each test run.
[error] The test class loader accesses the project dependency classes using the cached project dependency classloader.
[error] With this approach, class loading may fail under the following conditions:
[error]
[error] * Dependencies use reflection to access classes in your project's classpath.
[error] Java serialization/deserialization may cause this.
[error] * An open package is accessed across layers. If the project's classes access or extend
[error] jvm package private classes defined in a project dependency, it may cause an IllegalAccessError
[error] because the jvm enforces package private at the classloader level.
[error]
[error] These issues, along with others that were not enumerated above, may be resolved by changing the class loader layering strategy.
[error] The Flat and ScalaLibrary strategies bundle the full project classpath in the same class loader.
[error] To use one of these strategies, set the ClassLoaderLayeringStrategy key
[error] in your configuration, for example:
[error]
[error] set root / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.ScalaLibrary
[error] set root / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat
[error]
[error] See ClassLoaderLayeringStrategy.scala for the full list of options.
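
Note: all nine test failures share one root cause. The first SparkContext construction throws java.lang.IllegalAccessError because org.apache.spark.storage.StorageUtils$ touches sun.nio.ch.DirectBuffer, which java.base does not export to the unnamed module on Java 17; every subsequent test then sees NoClassDefFoundError for the already-failed StorageUtils$ initializer. The ClassLoaderLayeringStrategy advice above is therefore a red herring here. The usual remedy for Spark on JDK 17 is to fork the test JVM and pass the module flags that spark-submit itself adds; a sketch of a possible build.sbt change (not something this build applies):

  // Run tests in a forked JVM so javaOptions take effect, with the
  // exports/opens Spark needs on JDK 17+.
  Test / fork := true
  Test / javaOptions ++= Seq(
    "--add-exports=java.base/sun.nio.ch=ALL-UNNAMED",
    "--add-opens=java.base/java.nio=ALL-UNNAMED",
    "--add-opens=java.base/java.util=ALL-UNNAMED"
  )
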

************************
Build summary:
[{
  "module": "gallia-spark",
  "compile": {"status": "ok", "tookMs": 12075, "warnings": 11, "errors": 0, "sourceVersion": "3.8"},
  "doc": {"status": "skipped", "tookMs": 0, "files": 0, "totalSizeKb": 0},
  "test-compile": {"status": "ok", "tookMs": 1856, "warnings": 9, "errors": 0, "sourceVersion": "3.8"},
  "test": {"status": "failed", "tookMs": 2072, "passed": 0, "failed": 9, "ignored": 0, "skipped": 0, "total": 9, "byFramework": [{"framework": "unknown", "stats": {"passed": 0, "failed": 9, "ignored": 0, "skipped": 0, "total": 9}}]},
  "publish": {"status": "skipped", "tookMs": 0},
  "metadata": {
    "crossScalaVersions": ["2.13.12", "2.12.18", "3.3.1"]
  }
}]
************************
[error] Scala3CommunityBuild$ProjectBuildFailureException: 1 module(s) finished with failures: gallia-spark
[error] (Global / runBuild) Scala3CommunityBuild$ProjectBuildFailureException: 1 module(s) finished with failures: gallia-spark
[error] Total time: 69 s (0:01:09.0), completed Nov 28, 2025, 2:26:48 PM
Build failed, not retrying.