Build Logs
galliaproject/gallia-spark • 3.8.1-RC1:2026-01-13
Errors: 50
Warnings: 112
Total Lines: 846
1##################################
2Cloning https://github.com/galliaproject/gallia-spark.git into /build/repo using revision v0.6.1
3##################################
4Note: switching to '593174c0c54fc356dbe6c05994a6aa493eaee7b7'.
5
6You are in 'detached HEAD' state. You can look around, make experimental
7changes and commit them, and you can discard any commits you make in this
8state without impacting any branches by switching back to a branch.
9
10If you want to create a new branch to retain commits you create, you may
11do so (now or later) by using -c with the switch command. Example:
12
13 git switch -c <new-branch-name>
14
15Or undo this operation with:
16
17 git switch -
18
19Turn off this advice by setting config variable advice.detachedHead to false
20
21Would override fixed Scala version: 3.3.1
22----
23Preparing build for 3.8.1-RC1
24Scala binary version found: 3.8
25Implicitly using source version 3.8
26Scala binary version found: 3.8
27Implicitly using source version 3.8
28Would try to apply common scalacOption (best-effort, sbt/mill only):
29Append: ,REQUIRE:-source:3.8
30Remove: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
31
32Trying to apply source patch:
33Path: project/GalliaCommonSettings.scala
34Pattern: val scala3 = "3.3.1"
35Replacement: val scala3 = "3.8.1-RC1"
36Starting compilation server
37Compiling project (Scala 3.7.3, JVM (17))
38Compiled project (Scala 3.7.3, JVM (17))
39Successfully applied pattern 'val scala3 = "3.3.1"' in project/GalliaCommonSettings.scala
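The patch above is a plain string substitution; a minimal before/after sketch of project/GalliaCommonSettings.scala, assuming the version is pinned exactly as the pattern shows:

    // project/GalliaCommonSettings.scala -- before the patch
    val scala3 = "3.3.1"

    // after the community build substitutes <SCALA_VERSION>
    val scala3 = "3.8.1-RC1"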
40----
41Starting build for 3.8.1-RC1
42Execute tests: true
43sbt project found:
44sbt version 1.9.0 is not supported; the minimum supported version is 1.11.5
45Enforcing usage of sbt in version 1.11.5
46No prepare script found for project galliaproject/gallia-spark
47##################################
48Scala version: 3.8.1-RC1
49Targets: io.github.galliaproject%gallia-spark
50Project projectConfig: {"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
51##################################
52Using extra scalacOptions: ,REQUIRE:-source:3.8
53Filtering out scalacOptions: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
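A sketch of the same filtering expressed directly in build.sbt, for reproducing the community-build behaviour locally (the option list is taken from the two lines above; the `~=` update is plain sbt, not part of this build):

    // drop warning-escalating flags so the -source bump cannot fail the build
    Compile / scalacOptions ~= { opts =>
      opts.filterNot(Set("-deprecation", "-feature", "-Xfatal-warnings", "-Werror"))
          .filterNot(_.matches(".*-Wconf.*any:e"))
    }
    // and require the new source level
    Compile / scalacOptions += "-source:3.8"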
54[sbt_options] declare -a sbt_options=()
55[process_args] java_version = '17'
56[copyRt] java9_rt = '/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8/rt.jar'
57# Executing command line:
58java
59-Dfile.encoding=UTF-8
60-Dcommunitybuild.scala=3.8.1-RC1
61-Dcommunitybuild.project.dependencies.add=
62-Xmx7G
63-Xms4G
64-Xss8M
65-Dsbt.script=/root/.sdkman/candidates/sbt/current/bin/sbt
66-Dscala.ext.dirs=/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8
67-jar
68/root/.sdkman/candidates/sbt/1.11.5/bin/sbt-launch.jar
69"setCrossScalaVersions 3.8.1-RC1"
70"++3.8.1-RC1 -v"
71"mapScalacOptions ",REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s" ",-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e""
72"set every credentials := Nil"
73"excludeLibraryDependency com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}"
74"removeScalacOptionsStartingWith -P:wartremover"
75
76moduleMappings
77"runBuild 3.8.1-RC1 """{"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}""" io.github.galliaproject%gallia-spark"
78
79[info] welcome to sbt 1.11.5 (Eclipse Adoptium Java 17.0.8)
80[info] loading settings for project repo-build from akka.sbt, plugins.sbt...
81[info] loading project definition from /build/repo/project
82[info] compiling 4 Scala sources to /build/repo/project/target/scala-2.12/sbt-1.0/classes ...
83[info] Non-compiled module 'compiler-bridge_2.12' for Scala 2.12.20. Compiling...
84[info] Compilation completed in 9.094s.
85[info] done compiling
86[info] loading settings for project root from build.sbt...
87[info] set current project to gallia-spark (in build file:/build/repo/)
88Execute setCrossScalaVersions: 3.8.1-RC1
89OpenCB::Changing crossVersion 3.8.1-RC1 -> 3.8.1-RC1 in root/crossScalaVersions
90[info] set current project to gallia-spark (in build file:/build/repo/)
91[info] Setting Scala version to 3.8.1-RC1 on 1 projects.
92[info] Switching Scala version on:
93[info] * root (3.8.1-RC1, 2.13.12, 2.12.18)
94[info] Excluding projects:
95[info] Reapplying settings...
96[info] set current project to gallia-spark (in build file:/build/repo/)
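The switch above is what `++3.8.1-RC1` performs after setCrossScalaVersions rewrites the cross axis; in build.sbt terms it is roughly the following (version list taken from the `root` line above):

    // crossScalaVersions after the community build's rewrite;
    // 3.8.1-RC1 replaces the project's pinned 3.3.1
    crossScalaVersions := Seq("3.8.1-RC1", "2.13.12", "2.12.18")
    scalaVersion       := "3.8.1-RC1"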
97Execute mapScalacOptions: ,REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
98[info] Reapplying settings...
99[info] set current project to gallia-spark (in build file:/build/repo/)
100[info] Defining Global / credentials, credentials
101[info] The new values will be used by Global / pgpSelectPassphrase, Global / pgpSigningKey and 4 others.
102[info] Run `last` for details.
103[info] Reapplying settings...
104[info] set current project to gallia-spark (in build file:/build/repo/)
105Execute excludeLibraryDependency: com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}
106[info] Reapplying settings...
107OpenCB::Failed to reapply settings in excludeLibraryDependency: Reference to undefined setting:
108
109 Global / allExcludeDependencies from Global / allExcludeDependencies (CommunityBuildPlugin.scala:331)
110 Did you mean allExcludeDependencies ?
111 , retry without global scopes
112[info] Reapplying settings...
113[info] set current project to gallia-spark (in build file:/build/repo/)
114Execute removeScalacOptionsStartingWith: -P:wartremover
115[info] Reapplying settings...
116[info] set current project to gallia-spark (in build file:/build/repo/)
117[success] Total time: 0 s, completed Jan 13, 2026, 8:54:48 PM
118Build config: {"projects":{"exclude":[],"overrides":{}},"java":{},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"project/GalliaCommonSettings.scala","pattern":"val scala3 = \"3.3.1\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
119Parsed config: Success(ProjectBuildConfig(ProjectsConfig(List(),Map()),Full,List()))
120Starting build...
121Projects: Set(root)
122Starting build for ProjectRef(file:/build/repo/,root) (gallia-spark)... [0/1]
123OpenCB::Exclude Scala3 specific scalacOption `REQUIRE:-source:3.8` in Scala 2.12.20 module Global
124OpenCB::Filter out '-feature', matches setting pattern '^-?-feature'
125OpenCB::Filter out '-deprecation', matches setting pattern '^-?-deprecation'
126Compile scalacOptions: -encoding, UTF-8, -unchecked, -language:implicitConversions, -Wunused:implicits, -Wunused:explicits, -Wunused:imports, -Wunused:locals, -Wunused:params, -Wunused:privates, -no-indent, -Wvalue-discard, -Wconf:msg=can be rewritten automatically under:s, -source:3.8
127[info] compiling 14 Scala sources to /build/repo/../bin/spark/scala-3.8.1-RC1/classes ...
128[warn] -- [E175] Potential Issue Warning: /build/repo/src/main/scala/aptus/spark/SparkDriver.scala:34:28
129[warn] 34 | sys.addShutdownHook {
130[warn] | ^
131[warn] |discarded non-Unit value of type scala.sys.ShutdownHookThread. Add `: Unit` to discard silently.
132[warn] |
133[warn] 35 | // TODO: or as "whatever is in cache?"
134[warn] 36 | //println("stopping spark-context")
135[warn] 37 | sc.stop() } }
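The fix the compiler suggests is a type ascription; a minimal sketch using the names from the warning above (surrounding code assumed):

    // ascribe Unit so the discarded ShutdownHookThread
    // no longer trips -Wvalue-discard
    sys.addShutdownHook {
      sc.stop()
    }: Unit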
136[warn] -- Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:6
137[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
138[warn] | ^
139[warn] | Line is indented too far to the right, or a `{` is missing before:
140[warn] |
141[warn] | left.cogroup(right).pype(postCoGroup(joinType))
142[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/spark/SparkRddIn.scala:9:12
143[warn] 9 | def rdd (sc: SparkContext, schema: Cls, rdd: RDD[Obj]): HeadS = RddInputObjs (schema, rdd) .pipe(heads.Head.inputZ)
144[warn] | ^^
145[warn] | unused explicit parameter
146[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:11:43
147[warn] 11 | def innerHashJoinWithLeftBias[K: ClassTag, V: ClassTag](
148[warn] | ^
149[warn] | unused implicit parameter
150[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:11:56
151[warn] 11 | def innerHashJoinWithLeftBias[K: ClassTag, V: ClassTag](
152[warn] | ^
153[warn] | unused implicit parameter
154[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:30:30
155[warn] 30 | def leftHashJoin[K: ClassTag, V: ClassTag](
156[warn] | ^
157[warn] | unused implicit parameter
158[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:30:43
159[warn] 30 | def leftHashJoin[K: ClassTag, V: ClassTag](
160[warn] | ^
161[warn] | unused implicit parameter
162[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:49:31
163[warn] 49 | def rightHashJoin[K: ClassTag, V: ClassTag](
164[warn] | ^
165[warn] | unused implicit parameter
166[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerHashJoin.scala:49:44
167[warn] 49 | def rightHashJoin[K: ClassTag, V: ClassTag](
168[warn] | ^
169[warn] | unused implicit parameter
170[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:41
171[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
172[warn] | ^
173[warn] | unused implicit parameter
174[warn] -- [E198] Unused Symbol Warning: /build/repo/src/main/scala/gallia/streamer/RddStreamerUtils.scala:37:54
175[warn] 37 | private def postCoGroup[K: ClassTag, V: ClassTag](joinType: JoinType)(coGrouped: RDD[(K, (Iterable[V], Iterable[V]))]): RDD[(K, (Iterable[V], Iterable[V]))] = {
176[warn] | ^
177[warn] | unused implicit parameter
178[warn] there was 1 deprecation warning; re-run with -deprecation for details
179[warn] 12 warnings found
180[info] done compiling
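The repeated E198 warnings all point at ClassTag context bounds the join bodies never use. One way to keep the signatures while silencing -Wunused:implicits is to desugar the bound into an explicit `using` parameter marked @unused; a sketch, assuming the tags really are dead (body elided):

    import scala.annotation.unused
    import scala.reflect.ClassTag

    // equivalent to `def leftHashJoin[K: ClassTag, V: ClassTag](...)`,
    // but explicit `using` parameters can carry @unused
    def leftHashJoin[K, V](using @unused kt: ClassTag[K], @unused vt: ClassTag[V]): Unit = ???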
181[info] compiling 7 Scala sources to /build/repo/../bin/spark/scala-3.8.1-RC1/test-classes ...
182[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:36:61
183[warn] 36 | "/tmp/spark_test.csv/part-00000".readFileLines().ensuring(_ == List(
184[warn] | ^
185[warn] |discarded non-Unit value of type List[aptus.Line]. Add `: Unit` to discard silently.
186[warn] |
187[warn] 37 | "FOO,BAZ",
188[warn] 38 |...
189[warn] 40 | "")) }
190[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkCsvTest.scala:22:16
191[warn] 22 | res.ensuring(
192[warn] | ^
193[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
194[warn] |
195[warn] 23 | _ == bobjs(
196[warn] 24 |...
197[warn] 26 | .forceAObjs) } }
198[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkJsonLinesTest.scala:23:15
199[warn] 22 | res
200[warn] 23 | .ensuring(
201[warn] | ^
202[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
203[warn] |
204[warn] 24 | _ == bobjs(
205[warn] 25 |...
206[warn] 27 | .forceAObjs) } }
207[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkLinesTest.scala:23:15
208[warn] 22 | res
209[warn] 23 | .ensuring(
210[warn] | ^
211[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
212[warn] |
213[warn] 24 | _ == bobjs(
214[warn] 25 |...
215[warn] 28 | .forceAObjs) } }
216[warn] -- [E175] Potential Issue Warning: /build/repo/src/test/scala/galliatesting/spark/SparkRddDirectlyTest.scala:25:15
217[warn] 24 | res
218[warn] 25 | .ensuring(
219[warn] | ^
220[warn] |discarded non-Unit value of type gallia.domain.AObjs. Add `: Unit` to discard silently.
221[warn] |
222[warn] 26 | _ == bobjs(
223[warn] 27 |...
224[warn] 31 | .forceAObjs) } }
225[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:4:21
226[warn] 4 |import util.chaining._
227[warn] | ^
228[warn] | unused import
229[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/NonGalliaSparkTest.scala:7:14
230[warn] 7 |import gallia._
231[warn] | ^
232[warn] | unused import
233[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/SparkAvroTest.scala:4:14
234[warn] 4 |import gallia._
235[warn] | ^
236[warn] | unused import
237[warn] -- [E198] Unused Symbol Warning: /build/repo/src/test/scala/galliatesting/spark/SparkAvroTest.scala:5:20
238[warn] 5 |import gallia.spark._
239[warn] | ^
240[warn] | unused import
241[warn] 9 warnings found
242[info] done compiling
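All five test-side E175 warnings come from using `.ensuring(...)` as a statement. A sketch of the quiet form, where res and expected stand in for the values in the warnings above:

    // bind the result to _ so -Wvalue-discard has nothing to flag;
    // ensuring still throws on failure
    val _ = res.ensuring(_ == expected)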
243-------------------------------- Running Tests --------------------------------
244Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
24526/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
24626/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
24726/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
24826/01/13 20:55:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24926/01/13 20:55:07 INFO ResourceUtils: ==============================================================
25026/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
25126/01/13 20:55:07 INFO ResourceUtils: ==============================================================
25226/01/13 20:55:07 INFO SparkContext: Submitted application: my-spark
25326/01/13 20:55:07 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
25426/01/13 20:55:07 INFO ResourceProfile: Limiting resource is cpu
25526/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
25626/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
25726/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
25826/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
25926/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
26026/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
26126/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 41073.
26226/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
26326/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
26426/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
26526/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
266X galliatesting.spark.SparkTest.basic spark (non gallia) 1121ms
267 java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474
268 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
269 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
270 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
271 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
272 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
273 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
274 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
275 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
276 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
277 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
27826/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
279org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
280galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
281galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
282utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
283utest.framework.TestCallTree.run(Model.scala:45)
284utest.framework.TestCallTree.run(Model.scala:43)
285utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
286utest.framework.Executor.utestWrap(Executor.scala:12)
287utest.framework.Executor.utestWrap$(Executor.scala:5)
288utest.TestSuite.utestWrap(TestSuite.scala:12)
289utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
290utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
291utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
292utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
293utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
294scala.concurrent.Future$.traverse$$anonfun$1(Future.scala:888)
295scala.collection.IterableOnceOps.foldLeft(IterableOnce.scala:741)
296scala.collection.IterableOnceOps.foldLeft$(IterableOnce.scala:337)
297scala.collection.AbstractIterator.foldLeft(Iterator.scala:1328)
298scala.concurrent.Future$.traverse(Future.scala:888)
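The IllegalAccessError above is the standard Spark-on-JDK-17 module failure: spark-core touches sun.nio.ch.DirectBuffer, which java.base no longer exports to the unnamed module, and every test below fails the same way. A build.sbt sketch of the usual workaround (fork the test JVM and open the gates; the flag list is the commonly cited minimum for Spark 3.x, not something this log verifies):

    Test / fork := true
    Test / javaOptions ++= Seq(
      // the export the stack trace names explicitly
      "--add-exports=java.base/sun.nio.ch=ALL-UNNAMED",
      // commonly needed alongside it for Spark 3.x on JDK 17+
      "--add-opens=java.base/java.nio=ALL-UNNAMED",
      "--add-opens=java.base/java.util=ALL-UNNAMED",
      "--add-opens=java.base/java.lang=ALL-UNNAMED"
    )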
29926/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
30026/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
30126/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
30226/01/13 20:55:07 INFO ResourceUtils: ==============================================================
30326/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
30426/01/13 20:55:07 INFO ResourceUtils: ==============================================================
30526/01/13 20:55:07 INFO SparkContext: Submitted application: spark-csv
30626/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
30726/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
30826/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
30926/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
31026/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
31126/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
31226/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 46857.
31326/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
31426/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
31526/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
31626/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
317X galliatesting.spark.SparkTest.spark-csv 30ms
318 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
319 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
320 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
321 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
322 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
323 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
324 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
325 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
326 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
327 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
328 scala.Option.getOrElse(Option.scala:203)
329 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
330 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
331 galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
332 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$2(SparkTest.scala:33)
333 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
334 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
335 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
336 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
337 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
338 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
339 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
340 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
341 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
342 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
343 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
34426/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
345org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
346aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
347aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
348scala.Option.getOrElse(Option.scala:203)
349aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
350gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
351galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
352galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$2(SparkTest.scala:33)
353utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
354utest.framework.TestCallTree.run(Model.scala:45)
355utest.framework.TestCallTree.run(Model.scala:43)
356utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
357utest.framework.Executor.utestWrap(Executor.scala:12)
358utest.framework.Executor.utestWrap$(Executor.scala:5)
359utest.TestSuite.utestWrap(TestSuite.scala:12)
360utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
361utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
362utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
363utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
364utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
36526/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
36626/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
36726/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
36826/01/13 20:55:07 INFO ResourceUtils: ==============================================================
36926/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
37026/01/13 20:55:07 INFO ResourceUtils: ==============================================================
37126/01/13 20:55:07 INFO SparkContext: Submitted application: spark-csv-no-trailing-newline
37226/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
37326/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
37426/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
37526/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
37626/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
37726/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
37826/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 46709.
37926/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
38026/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
38126/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
38226/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
383X galliatesting.spark.SparkTest.spark-csv-no-trailing-newline 28ms
384 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
385 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
386 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
387 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
388 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
389 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
390 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
391 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
392 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
393 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
394 scala.Option.getOrElse(Option.scala:203)
395 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
396 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
397 galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
398 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$3(SparkTest.scala:34)
399 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
400 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
401 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
402 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
403 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
404 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
405 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
406 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
407 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
408 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
409 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
41026/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
411org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
412aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
413aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
414scala.Option.getOrElse(Option.scala:203)
415aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
416gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
417galliatesting.spark.SparkCsvTest$.apply(SparkCsvTest.scala:11)
418galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$3(SparkTest.scala:34)
419utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
420utest.framework.TestCallTree.run(Model.scala:45)
421utest.framework.TestCallTree.run(Model.scala:43)
422utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
423utest.framework.Executor.utestWrap(Executor.scala:12)
424utest.framework.Executor.utestWrap$(Executor.scala:5)
425utest.TestSuite.utestWrap(TestSuite.scala:12)
426utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
427utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
428utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
429utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
430utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
43126/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
43226/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
43326/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
43426/01/13 20:55:07 INFO ResourceUtils: ==============================================================
43526/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
43626/01/13 20:55:07 INFO ResourceUtils: ==============================================================
43726/01/13 20:55:07 INFO SparkContext: Submitted application: spark-lines-plain
43826/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
43926/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
44026/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
44126/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
44226/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
44326/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
44426/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 40237.
44526/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
44626/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
44726/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
44826/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
449X galliatesting.spark.SparkTest.spark-lines-plain 27ms
450 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
451 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
452 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
453 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
454 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
455 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
456 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
457 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
458 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
459 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
460 scala.Option.getOrElse(Option.scala:203)
461 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
462 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
463 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
464 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$4(SparkTest.scala:37)
465 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
466 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
467 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
468 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
469 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
470 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
471 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
472 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
473 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
474 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
475 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
47626/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
477org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
478aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
479aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
480scala.Option.getOrElse(Option.scala:203)
481aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
482gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
483galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
484galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$4(SparkTest.scala:37)
485utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
486utest.framework.TestCallTree.run(Model.scala:45)
487utest.framework.TestCallTree.run(Model.scala:43)
488utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
489utest.framework.Executor.utestWrap(Executor.scala:12)
490utest.framework.Executor.utestWrap$(Executor.scala:5)
491utest.TestSuite.utestWrap(TestSuite.scala:12)
492utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
493utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
494utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
495utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
496utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
49726/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
49826/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
49926/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
50026/01/13 20:55:07 INFO ResourceUtils: ==============================================================
50126/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
50226/01/13 20:55:07 INFO ResourceUtils: ==============================================================
50326/01/13 20:55:07 INFO SparkContext: Submitted application: spark-lines-gz
50426/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
50526/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
50626/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
50726/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
50826/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
50926/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
51026/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 42433.
51126/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
51226/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
51326/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
51426/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
515X galliatesting.spark.SparkTest.spark-lines-gz 22ms
516 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
517 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
518 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
519 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
520 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
521 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
522 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
523 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
524 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
525 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
526 scala.Option.getOrElse(Option.scala:203)
527 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
528 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
529 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
530 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$5(SparkTest.scala:38)
531 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
532 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
533 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
534 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
535 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
536 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
537 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
538 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
539 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
540 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
541 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
54226/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
543org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
544aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
545aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
546scala.Option.getOrElse(Option.scala:203)
547aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
548gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
549galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
550galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$5(SparkTest.scala:38)
551utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
552utest.framework.TestCallTree.run(Model.scala:45)
553utest.framework.TestCallTree.run(Model.scala:43)
554utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
555utest.framework.Executor.utestWrap(Executor.scala:12)
556utest.framework.Executor.utestWrap$(Executor.scala:5)
557utest.TestSuite.utestWrap(TestSuite.scala:12)
558utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
559utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
560utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
561utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
562utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
56326/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
56426/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
56526/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
56626/01/13 20:55:07 INFO ResourceUtils: ==============================================================
56726/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
56826/01/13 20:55:07 INFO ResourceUtils: ==============================================================
56926/01/13 20:55:07 INFO SparkContext: Submitted application: spark-lines-bz2
57026/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
57126/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
57226/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
57326/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
57426/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
57526/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
57626/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 39283.
57726/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
57826/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
57926/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
58026/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
581X galliatesting.spark.SparkTest.spark-lines-bz2 20ms
582 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
583 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
584 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
585 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
586 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
587 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
588 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
589 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
590 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
591 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
592 scala.Option.getOrElse(Option.scala:203)
593 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
594 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
595 galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
596 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$6(SparkTest.scala:39)
597 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
598 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
599 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
600 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
601 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
602 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
603 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
604 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
605 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
606 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
607 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
60826/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
609org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
610aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
611aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
612scala.Option.getOrElse(Option.scala:203)
613aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
614gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
615galliatesting.spark.SparkLinesTest$.apply(SparkLinesTest.scala:11)
616galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$6(SparkTest.scala:39)
617utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
618utest.framework.TestCallTree.run(Model.scala:45)
619utest.framework.TestCallTree.run(Model.scala:43)
620utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
621utest.framework.Executor.utestWrap(Executor.scala:12)
622utest.framework.Executor.utestWrap$(Executor.scala:5)
623utest.TestSuite.utestWrap(TestSuite.scala:12)
624utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
625utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
626utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
627utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
628utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
62926/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
63026/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
63126/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
63226/01/13 20:55:07 INFO ResourceUtils: ==============================================================
63326/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
63426/01/13 20:55:07 INFO ResourceUtils: ==============================================================
63526/01/13 20:55:07 INFO SparkContext: Submitted application: spark-rdd-directly
63626/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
63726/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
63826/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
63926/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
64026/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
64126/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
64226/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 38199.
64326/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
64426/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
64526/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
64626/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
647X galliatesting.spark.SparkTest.spark-rdd-directly 26ms
648 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
649 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
650 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
651 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
652 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
653 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
654 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
655 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
656 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
657 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
658 scala.Option.getOrElse(Option.scala:203)
659 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
660 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
661 galliatesting.spark.SparkRddDirectlyTest$.apply(SparkRddDirectlyTest.scala:12)
662 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$7(SparkTest.scala:42)
663 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
664 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
665 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
666 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
667 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
668 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
669 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
670 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
671 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
672 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
673 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
67426/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
675org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
676aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
677aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
678scala.Option.getOrElse(Option.scala:203)
679aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
680gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
681galliatesting.spark.SparkRddDirectlyTest$.apply(SparkRddDirectlyTest.scala:12)
682galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$7(SparkTest.scala:42)
683utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
684utest.framework.TestCallTree.run(Model.scala:45)
685utest.framework.TestCallTree.run(Model.scala:43)
686utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
687utest.framework.Executor.utestWrap(Executor.scala:12)
688utest.framework.Executor.utestWrap$(Executor.scala:5)
689utest.TestSuite.utestWrap(TestSuite.scala:12)
690utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
691utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
692utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
693utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
694utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
69526/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
69626/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
69726/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
69826/01/13 20:55:07 INFO ResourceUtils: ==============================================================
69926/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
70026/01/13 20:55:07 INFO ResourceUtils: ==============================================================
70126/01/13 20:55:07 INFO SparkContext: Submitted application: spark-jsonl
70226/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
70326/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
70426/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
70526/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
70626/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
70726/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
70826/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 41349.
70926/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
71026/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
71126/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
71226/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
713X galliatesting.spark.SparkTest.spark-jsonl 20ms
714 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
715 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
716 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
717 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
718 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
719 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
720 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
721 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
722 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
723 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
724 scala.Option.getOrElse(Option.scala:203)
725 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
726 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
727 galliatesting.spark.SparkJsonLinesTest$.apply(SparkJsonLinesTest.scala:11)
728 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$8(SparkTest.scala:45)
729 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
730 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
731 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
732 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
733 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
734 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
735 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
736 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
737 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
738 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
739 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
74026/01/13 20:55:07 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
741org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
742aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
743aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
744scala.Option.getOrElse(Option.scala:203)
745aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
746gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
747galliatesting.spark.SparkJsonLinesTest$.apply(SparkJsonLinesTest.scala:11)
748galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$8(SparkTest.scala:45)
749utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
750utest.framework.TestCallTree.run(Model.scala:45)
751utest.framework.TestCallTree.run(Model.scala:43)
752utest.TestRunner$.$anonfun$8$$anonfun$1(TestRunner.scala:74)
753utest.framework.Executor.utestWrap(Executor.scala:12)
754utest.framework.Executor.utestWrap$(Executor.scala:5)
755utest.TestSuite.utestWrap(TestSuite.scala:12)
756utest.TestRunner$.$anonfun$8(TestRunner.scala:84)
757utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
758utest.TestRunner$.$anonfun$6$$anonfun$1(TestRunner.scala:85)
759utest.TestRunner$.evaluateFutureTree(TestRunner.scala:171)
760utest.TestRunner$.evaluateFutureTree$$anonfun$2(TestRunner.scala:174)
76126/01/13 20:55:07 INFO SparkContext: Running Spark version 3.5.0
76226/01/13 20:55:07 INFO SparkContext: OS info Linux, 6.8.0-1044-azure, amd64
76326/01/13 20:55:07 INFO SparkContext: Java version 17.0.8
76426/01/13 20:55:07 INFO ResourceUtils: ==============================================================
76526/01/13 20:55:07 INFO ResourceUtils: No custom resources configured for spark.driver.
76626/01/13 20:55:07 INFO ResourceUtils: ==============================================================
76726/01/13 20:55:07 INFO SparkContext: Submitted application: spark-register
76826/01/13 20:55:07 INFO ResourceProfileManager: Added ResourceProfile id: 0
76926/01/13 20:55:07 INFO SecurityManager: Changing view acls to: root
77026/01/13 20:55:07 INFO SecurityManager: Changing modify acls to: root
77126/01/13 20:55:07 INFO SecurityManager: Changing view acls groups to:
77226/01/13 20:55:07 INFO SecurityManager: Changing modify acls groups to:
77326/01/13 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
77426/01/13 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 38991.
77526/01/13 20:55:07 INFO SparkEnv: Registering MapOutputTracker
77626/01/13 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
77726/01/13 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
77826/01/13 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
779X galliatesting.spark.SparkTest.spark-register 23ms
780 java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
781 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
782 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
783 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
784 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
785 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
786 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
787 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
788 aptus.spark.SparkDriver$.createContext(SparkDriver.scala:29)
789 aptus.spark.SparkDriver$.context$$anonfun$1(SparkDriver.scala:18)
790 scala.Option.getOrElse(Option.scala:203)
791 aptus.spark.SparkDriver$.context(SparkDriver.scala:18)
792 gallia.spark.package$.galliaSparkContext(SparkPackage.scala:30)
793 galliatesting.spark.SparkTest$.galliaSparkRegister(SparkTest.scala:57)
794 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$9(SparkTest.scala:51)
795 java.lang.ExceptionInInitializerError: Exception java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474 [in thread "pool-18-thread-6"]
796 org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
797 org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
798 org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
799 org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
800 org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
801 org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
802 org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
803 org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
804 galliatesting.spark.NonGalliaSparkTest$.apply(NonGalliaSparkTest.scala:24)
805 galliatesting.spark.SparkTest$.$init$$$anonfun$1$$anonfun$1(SparkTest.scala:30)
806[error] Test suite galliatesting.spark.SparkTest failed with java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5bf2c474) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5bf2c474.
807[error] This may be due to the ClassLoaderLayeringStrategy (ScalaLibrary) used by your task.
808[error] To improve performance and reduce memory, sbt attempts to cache the class loaders used to load the project dependencies.
809[error] The project class files are loaded in a separate class loader that is created for each test run.
810[error] The test class loader accesses the project dependency classes using the cached project dependency classloader.
811[error] With this approach, class loading may fail under the following conditions:
812[error]
813[error] * Dependencies use reflection to access classes in your project's classpath.
814[error] Java serialization/deserialization may cause this.
815[error] * An open package is accessed across layers. If the project's classes access or extend
816[error] jvm package private classes defined in a project dependency, it may cause an IllegalAccessError
817[error] because the jvm enforces package private at the classloader level.
818[error]
819[error] These issues, along with others that were not enumerated above, may be resolved by changing the class loader layering strategy.
820[error] The Flat and ScalaLibrary strategies bundle the full project classpath in the same class loader.
821[error] To use one of these strategies, set the ClassLoaderLayeringStrategy key
822[error] in your configuration, for example:
823[error]
824[error] set root / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.ScalaLibrary
825[error] set root / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat
826[error]
827[error] See ClassLoaderLayeringStrategy.scala for the full list of options.
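The two shell commands sbt prints translate to one line in build.sbt; a sketch (note that forking the tests, as in the earlier sketch, sidesteps the layering question entirely):

    // flatten the test classloader, as the sbt message suggests
    Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat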
828
829************************
830Build summary:
831[{
832 "module": "gallia-spark",
833 "compile": {"status": "ok", "tookMs": 9566, "warnings": 11, "errors": 0, "sourceVersion": "3.8"},
834 "doc": {"status": "skipped", "tookMs": 0, "files": 0, "totalSizeKb": 0},
835 "test-compile": {"status": "ok", "tookMs": 1880, "warnings": 9, "errors": 0, "sourceVersion": "3.8"},
836 "test": {"status": "failed", "tookMs": 1956, "passed": 0, "failed": 9, "ignored": 0, "skipped": 0, "total": 9, "byFramework": [{"framework": "unknown", "stats": {"passed": 0, "failed": 9, "ignored": 0, "skipped": 0, "total": 9}}]},
837 "publish": {"status": "skipped", "tookMs": 0},
838 "metadata": {
839 "crossScalaVersions": ["2.13.12", "2.12.18", "3.3.1"]
840}
841}]
842************************
843[error] Scala3CommunityBuild$ProjectBuildFailureException: 1 module(s) finished with failures: gallia-spark
844[error] (Global / runBuild) Scala3CommunityBuild$ProjectBuildFailureException: 1 module(s) finished with failures: gallia-spark
845[error] Total time: 20 s, completed Jan 13, 2026, 8:55:08 PM
846Build failed, not retrying.