Build Logs
kaizen-solutions/fs2-kafka-jsonschema • 3.8.0-RC2:2025-11-28
Errors: 15 · Warnings: 4 · Total Lines: 453
1##################################
2Cloning https://github.com/kaizen-solutions/fs2-kafka-jsonschema.git into /build/repo using revision v0.0.1
3##################################
4Note: switching to 'cd75c5f4d4182f46d206c036ec973ff6a26e2fd7'.
5
6You are in 'detached HEAD' state. You can look around, make experimental
7changes and commit them, and you can discard any commits you make in this
8state without impacting any branches by switching back to a branch.
9
10If you want to create a new branch to retain commits you create, you may
11do so (now or later) by using -c with the switch command. Example:
12
13 git switch -c <new-branch-name>
14
15Or undo this operation with:
16
17 git switch -
18
19Turn off this advice by setting config variable advice.detachedHead to false
20
21Would override fixed Scala version: 3.3.3
22----
23Preparing build for 3.8.0-RC2
24Scala binary version found: 3.8
25Implicitly using source version 3.8
26Scala binary version found: 3.8
27Implicitly using source version 3.8
28Would try to apply common scalacOption (best-effort, sbt/mill only):
29Append: ,REQUIRE:-source:3.8
30Remove: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
31
32Try apply source patch:
33Path: build.sbt
34Pattern: val scala3 = "3.3.3"
35Replacement: val scala3 = "3.8.0-RC2"
36Starting compilation server
37Compiling project (Scala 3.7.3, JVM (17))
38Compiled project (Scala 3.7.3, JVM (17))
39Successfully applied pattern 'val scala3 = "3.3.3"' in build.sbt
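For context, a minimal sketch of the build.sbt line this source patch targets (hypothetical layout; only the patched val itself is taken from the log above, the surrounding usage is assumed):

    // Hypothetical excerpt of build.sbt; only the patched line is known from the log.
    val scala3 = "3.3.3"               // matched by the sourcePatches pattern
    // After the patch the community build compiles against the release candidate:
    // val scala3 = "3.8.0-RC2"
    ThisBuild / scalaVersion := scala3 // assumed usage of the val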
40----
41Starting build for 3.8.0-RC2
42Execute tests: true
43sbt project found:
44Sbt version 1.10.1 is not supported, minimal supported version is 1.11.5
45Enforcing usage of sbt in version 1.11.5
46No prepare script found for project kaizen-solutions/fs2-kafka-jsonschema
47##################################
48Scala version: 3.8.0-RC2
49Targets: io.kaizen-solutions%fs2-kafka-jsonschema
50Project projectConfig: {"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
51##################################
52Using extra scalacOptions: ,REQUIRE:-source:3.8
53Filtering out scalacOptions: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
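Expressed directly in sbt, the append/filter rules above would look roughly like the following (illustrative sketch only; the community build applies them through its own mapScalacOptions command, not these settings):

    // Sketch: force -source:3.8 and drop flags that escalate warnings into errors.
    Compile / scalacOptions ~= { opts =>
      val dropped = opts.filterNot { o =>
        Set("-deprecation", "-feature", "-Xfatal-warnings", "-Werror").contains(o) ||
          o.matches(".*-Wconf.*any:e")
      }
      (dropped :+ "-source:3.8").distinct
    }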
54[sbt_options] declare -a sbt_options=()
55[process_args] java_version = '17'
56[copyRt] java9_rt = '/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8/rt.jar'
57# Executing command line:
58java
59-Dfile.encoding=UTF-8
60-Dcommunitybuild.scala=3.8.0-RC2
61-Dcommunitybuild.project.dependencies.add=
62-Xmx7G
63-Xms4G
64-Xss8M
65-Dsbt.script=/root/.sdkman/candidates/sbt/current/bin/sbt
66-Dscala.ext.dirs=/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8
67-jar
68/root/.sdkman/candidates/sbt/1.11.5/bin/sbt-launch.jar
69"setCrossScalaVersions 3.8.0-RC2"
70"++3.8.0-RC2 -v"
71"mapScalacOptions ",REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s" ",-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e""
72"set every credentials := Nil"
73"excludeLibraryDependency com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}"
74"removeScalacOptionsStartingWith -P:wartremover"
75
76moduleMappings
77"runBuild 3.8.0-RC2 """{"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}""" io.kaizen-solutions%fs2-kafka-jsonschema"
78
79[info] welcome to sbt 1.11.5 (Eclipse Adoptium Java 17.0.8)
80[info] loading settings for project repo-build-build-build from metals.sbt...
81[info] loading project definition from /build/repo/project/project/project
82[info] loading settings for project repo-build-build from metals.sbt...
83[info] loading project definition from /build/repo/project/project
84[success] Generated .bloop/repo-build-build.json
85[success] Total time: 3 s, completed Nov 28, 2025, 3:11:18 PM
86[info] loading settings for project repo-build from akka.sbt, metals.sbt, plugins.sbt...
87[info] loading project definition from /build/repo/project
88[success] Generated .bloop/repo-build.json
89[info] compiling 2 Scala sources to /build/repo/project/target/scala-2.12/sbt-1.0/classes ...
90[info] Non-compiled module 'compiler-bridge_2.12' for Scala 2.12.20. Compiling...
91[info] Compilation completed in 8.785s.
92[info] done compiling
93[success] Total time: 18 s, completed Nov 28, 2025, 3:11:38 PM
94[info] loading settings for project root from build.sbt...
95[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
96Execute setCrossScalaVersions: 3.8.0-RC2
97OpenCB::Changing crossVersion 3.8.0-RC2 -> 3.8.0-RC2 in root/crossScalaVersions
98[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
99[info] Setting Scala version to 3.8.0-RC2 on 1 projects.
100[info] Switching Scala version on:
101[info] * root (3.8.0-RC2)
102[info] Excluding projects:
103[info] Reapplying settings...
104[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
105Execute mapScalacOptions: ,REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
106[info] Reapplying settings...
107[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
108[info] Defining Global / credentials, credentials
109[info] The new values will be used by Compile / scalafmtOnly, Global / pgpSelectPassphrase and 9 others.
110[info] Run `last` for details.
111[info] Reapplying settings...
112[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
113Execute excludeLibraryDependency: com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}
114[info] Reapplying settings...
115OpenCB::Failed to reapply settings in excludeLibraryDependency: Reference to undefined setting:
116
117 Global / allExcludeDependencies from Global / allExcludeDependencies (CommunityBuildPlugin.scala:331)
118 Did you mean allExcludeDependencies ?
119 , retry without global scopes
120[info] Reapplying settings...
121[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
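In plain sbt, the exclusions that excludeLibraryDependency applies correspond roughly to the following (illustrative sketch; after the Global-scope attempt above fails, the plugin retries per project rather than using these exact settings):

    // Sketch: drop compiler plugins unlikely to have 3.8.0-RC2 artifacts yet.
    excludeDependencies ++= Seq(
      ExclusionRule("com.github.ghik", s"zerowaste_${scalaVersion.value}"),
      ExclusionRule("com.olegpy", "better-monadic-for_3"),
      ExclusionRule("org.polyvariant", s"better-tostring_${scalaVersion.value}"),
      ExclusionRule("org.wartremover", s"wartremover_${scalaVersion.value}")
    )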
122Execute removeScalacOptionsStartingWith: -P:wartremover
123[info] Reapplying settings...
124[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
125[success] Total time: 0 s, completed Nov 28, 2025, 3:11:46 PM
126Build config: {"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
127Parsed config: Success(ProjectBuildConfig(ProjectsConfig(List(),Map()),Full,List()))
128Starting build...
129Projects: Set(root)
130Starting build for ProjectRef(file:/build/repo/,root) (fs2-kafka-jsonschema)... [0/1]
131OpenCB::Exclude Scala3 specific scalacOption `REQUIRE:-source:3.8` in Scala 2.12.20 module Global
132OpenCB::Filter out '-deprecation', matches setting pattern '^-?-deprecation'
133OpenCB::Filter out '-feature', matches setting pattern '^-?-feature'
134Compile scalacOptions: -release, 17, -Wunused:imports, -Wunused:explicits, -Wvalue-discard, -unchecked, -encoding, utf8, -Wunused:implicits, -Ykind-projector, -Xsemanticdb, -semanticdb-target, /build/repo/target/scala-3.8.0-RC2/meta, -Wconf:msg=can be rewritten automatically under:s, -source:3.8
135[info] compiling 4 Scala sources to /build/repo/target/scala-3.8.0-RC2/classes ...
136[warn] Option -Ykind-projector is deprecated: Use -Xkind-projector instead.
137[warn] one warning found
138[info] done compiling
139[info] compiling 2 Scala sources to /build/repo/target/scala-3.8.0-RC2/test-classes ...
140[warn] Option -Ykind-projector is deprecated: Use -Xkind-projector instead.
141[warn] one warning found
142[info] done compiling
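The two deprecation warnings above come from the project's own -Ykind-projector flag; a hypothetical one-line adjustment in build.sbt that would silence them on newer compilers:

    // Sketch: prefer the renamed flag suggested by the compiler warning.
    scalacOptions ~= { _.map {
      case "-Ykind-projector" => "-Xkind-projector"
      case other              => other
    } }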
1432025-11-28 15:13:32.682+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ConfigResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ConfigResource will be ignored.
1442025-11-28 15:13:32.690+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ContextsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ContextsResource will be ignored.
1452025-11-28 15:13:32.690+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource will be ignored.
1462025-11-28 15:13:32.690+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SchemasResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SchemasResource will be ignored.
1472025-11-28 15:13:32.690+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource will be ignored.
1482025-11-28 15:13:32.691+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.CompatibilityResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.CompatibilityResource will be ignored.
1492025-11-28 15:13:32.691+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ModeResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ModeResource will be ignored.
1502025-11-28 15:13:32.691+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ServerMetadataResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ServerMetadataResource will be ignored.
151[2025-11-28 15:13:32,988] INFO HV000001: Hibernate Validator 6.1.7.Final (org.hibernate.validator.internal.util.Version:21)
15215:13:34.871 [io-compute-0] ERROR i.c.k.s.c.CachedSchemaRegistryClient - Invalid schema type JSON
153org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
154 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:171)
155 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
156 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
157 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
158 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
159 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
160 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
161 at defer @ fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1(Serializer.scala:147)
162 at defer @ weaver.Test$.apply$$anonfun$1$$anonfun$1(Test.scala:21)
163 at product$extension @ fs2.kafka.KafkaProducer$.serializeToBytes(KafkaProducer.scala:242)
164 at map @ fs2.kafka.KafkaProducer$.asJavaRecord(KafkaProducer.scala:259)
165 at flatMap @ fs2.kafka.KafkaProducer$.produceRecord$$anonfun$1(KafkaProducer.scala:209)
166 at map2 @ fs2.Chunk.loop$1(Chunk.scala:434)
167 at map @ fs2.Chunk.loop$1(Chunk.scala:437)
168 at map @ fs2.Chunk.traverse(Chunk.scala:458)
169 at map @ fs2.kafka.KafkaProducer$.produce$$anonfun$1(KafkaProducer.scala:180)
170 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
171 at handleErrorWith @ fs2.Compiler$Target.handleErrorWith(Compiler.scala:161)
172 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
173 at get @ fs2.internal.Scope.openScope(Scope.scala:275)
174 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
175 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
176Caused by: java.io.IOException: Invalid schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}} with refs [] of type JSON
177 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lambda$lookupLatestVersion$8(AbstractKafkaSchemaSerDe.java:582)
178 at java.base/java.util.Optional.orElseThrow(Optional.java:403)
179 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:579)
180 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:557)
181 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:151)
182 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
183 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
184 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
185 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
186 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
187 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
188 at cats.effect.IOFiber.runLoop(IOFiber.scala:413)
189 at cats.effect.IOFiber.execR(IOFiber.scala:1362)
190 at cats.effect.IOFiber.run(IOFiber.scala:112)
191 at cats.effect.unsafe.WorkerThread.run(WorkerThread.scala:743)
192org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
19315:13:34.960 [qtp125623069-452] ERROR i.c.r.e.DebuggableExceptionMapper - Request Failed with exception
194io.confluent.kafka.schemaregistry.rest.exceptions.RestIncompatibleSchemaException: Schema being registered is incompatible with an earlier schema for subject "example-topic-persons-value", details: [{errorType:"PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL", description:"The new schema has an open content model and has a property or item at path '#/properties/booksRead' which is missing in the old schema'}, {oldSchemaVersion: 1}, {oldSchema: '{"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]
195 at io.confluent.kafka.schemaregistry.rest.exceptions.Errors.incompatibleSchemaException(Errors.java:134)
196 at io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource.register(SubjectVersionsResource.java:436)
197 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
198 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
199 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
200 at java.base/java.lang.reflect.Method.invoke(Method.java:568)
201 at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52)
202 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:134)
203 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:177)
204 at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:159)
205 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:81)
206 at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:475)
207 at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:397)
208 at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81)
209 at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255)
210 at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
211 at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
212 at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
213 at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
214 at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
215 at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
216 at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
217 at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:684)
218 at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
219 at org.glassfish.jersey.servlet.ServletContainer.serviceImpl(ServletContainer.java:378)
220 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:553)
221 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:494)
222 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:431)
223 at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
224 at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
225 at io.confluent.rest.metrics.JettyRequestMetricsFilter.doFilter(JettyRequestMetricsFilter.java:84)
226 at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
227 at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
228 at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
229 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
230 at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
231 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
232 at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54)
233 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
234 at io.confluent.kafka.schemaregistry.rest.RequestIdHandler.handle(RequestIdHandler.java:51)
235 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
236 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
237 at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
238 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
239 at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
240 at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
241 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
242 at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
243 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
244 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
245 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
246 at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:181)
247 at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
248 at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
249 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
250 at org.eclipse.jetty.server.Server.handle(Server.java:516)
251 at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
252 at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
253 at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
254 at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
255 at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
256 at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
257 at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
258 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
259 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
260 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
261 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
262 at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
263 at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
264 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
265 at java.base/java.lang.Thread.run(Thread.java:833)
266Caused by: io.confluent.kafka.schemaregistry.exceptions.IncompatibleSchemaException: [{errorType:"PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL", description:"The new schema has an open content model and has a property or item at path '#/properties/booksRead' which is missing in the old schema'}, {oldSchemaVersion: 1}, {oldSchema: '{"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]
267 at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.register(KafkaSchemaRegistry.java:751)
268 at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.registerOrForward(KafkaSchemaRegistry.java:882)
269 at io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource.register(SubjectVersionsResource.java:417)
270 ... 69 common frames omitted
271org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
272 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:171)
273 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
274 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
275 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
276 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
277 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
278 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
279 at defer @ fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1(Serializer.scala:147)
280 at defer @ weaver.Test$.apply$$anonfun$1$$anonfun$1(Test.scala:21)
281 at product$extension @ fs2.kafka.KafkaProducer$.serializeToBytes(KafkaProducer.scala:242)
282 at map @ fs2.kafka.KafkaProducer$.asJavaRecord(KafkaProducer.scala:259)
283 at flatMap @ fs2.kafka.KafkaProducer$.produceRecord$$anonfun$1(KafkaProducer.scala:209)
284 at map2 @ fs2.Chunk.loop$1(Chunk.scala:434)
285 at map @ fs2.Chunk.loop$1(Chunk.scala:437)
286 at map @ fs2.Chunk.traverse(Chunk.scala:458)
287 at map @ fs2.kafka.KafkaProducer$.produce$$anonfun$1(KafkaProducer.scala:180)
288 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
289 at handleErrorWith @ fs2.Compiler$Target.handleErrorWith(Compiler.scala:161)
290 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
291 at get @ fs2.internal.Scope.openScope(Scope.scala:275)
292 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
293 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
294Caused by: java.io.IOException: Incompatible schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}} with refs [] of type JSON for schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV2Bad","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"booksRead":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}. Set latest.compatibility.strict=false to disable this check
295 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:590)
296 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:557)
297 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:151)
298 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
299 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
300 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
301 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
302 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
303 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
304 at cats.effect.IOFiber.runLoop(IOFiber.scala:413)
305 at cats.effect.IOFiber.execR(IOFiber.scala:1362)
306 at cats.effect.IOFiber.run(IOFiber.scala:112)
307 at cats.effect.unsafe.WorkerThread.run(WorkerThread.scala:743)
30815:13:35.068 [qtp125623069-450] ERROR i.c.r.e.DebuggableExceptionMapper - Request Failed with exception
309io.confluent.rest.exceptions.RestNotFoundException: Schema not found
310 at io.confluent.kafka.schemaregistry.rest.exceptions.Errors.schemaNotFoundException(Errors.java:121)
311 at io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource.lookUpSchemaUnderSubject(SubjectsResource.java:132)
312 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
313 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
314 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
315 at java.base/java.lang.reflect.Method.invoke(Method.java:568)
316 at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52)
317 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:134)
318 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:177)
319 at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:159)
320 at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:81)
321 at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:475)
322 at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:397)
323 at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81)
324 at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255)
325 at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
326 at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
327 at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
328 at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
329 at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
330 at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
331 at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
332 at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:684)
333 at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
334 at org.glassfish.jersey.servlet.ServletContainer.serviceImpl(ServletContainer.java:378)
335 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:553)
336 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:494)
337 at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:431)
338 at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
339 at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
340 at io.confluent.rest.metrics.JettyRequestMetricsFilter.doFilter(JettyRequestMetricsFilter.java:84)
341 at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
342 at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
343 at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
344 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
345 at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
346 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
347 at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54)
348 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
349 at io.confluent.kafka.schemaregistry.rest.RequestIdHandler.handle(RequestIdHandler.java:51)
350 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
351 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
352 at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
353 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
354 at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
355 at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
356 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
357 at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
358 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
359 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
360 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
361 at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:181)
362 at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
363 at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
364 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
365 at org.eclipse.jetty.server.Server.handle(Server.java:516)
366 at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
367 at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
368 at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
369 at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
370 at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
371 at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
372 at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
373 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
374 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
375 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
376 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
377 at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
378 at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
379 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
380 at java.base/java.lang.Thread.run(Thread.java:833)
381org.apache.kafka.common.errors.SerializationException: Error retrieving JSON schema: {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV2Bad","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"booksRead":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}
382 at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.toKafkaException(AbstractKafkaSchemaSerDe.java:805)
383 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:173)
384 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
385 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
386 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
387 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
388 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
389 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
390 at defer @ fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1(Serializer.scala:147)
391 at defer @ weaver.Test$.apply$$anonfun$1$$anonfun$1(Test.scala:21)
392 at product$extension @ fs2.kafka.KafkaProducer$.serializeToBytes(KafkaProducer.scala:242)
393 at map @ fs2.kafka.KafkaProducer$.asJavaRecord(KafkaProducer.scala:259)
394 at flatMap @ fs2.kafka.KafkaProducer$.produceRecord$$anonfun$1(KafkaProducer.scala:209)
395 at map2 @ fs2.Chunk.loop$1(Chunk.scala:434)
396 at map @ fs2.Chunk.loop$1(Chunk.scala:437)
397 at map @ fs2.Chunk.traverse(Chunk.scala:458)
398 at map @ fs2.kafka.KafkaProducer$.produce$$anonfun$1(KafkaProducer.scala:180)
399 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
400 at handleErrorWith @ fs2.Compiler$Target.handleErrorWith(Compiler.scala:161)
401 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
402 at get @ fs2.internal.Scope.openScope(Scope.scala:275)
403 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
404 at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
405Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema not found; error code: 40403
406 at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:336)
407 at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:409)
408 at io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:500)
409 at io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:485)
410 at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getIdFromRegistry(CachedSchemaRegistryClient.java:372)
411 at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getId(CachedSchemaRegistryClient.java:651)
412 at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:155)
413 at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
414 at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
415 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
416 at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
417 at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
418 at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
419 at cats.effect.IOFiber.runLoop(IOFiber.scala:413)
420 at cats.effect.IOFiber.execR(IOFiber.scala:1362)
421 at cats.effect.IOFiber.run(IOFiber.scala:112)
422 at cats.effect.unsafe.WorkerThread.run(WorkerThread.scala:743)
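The serialization and schema-compatibility failures above are provoked deliberately by the test suite that follows; a hedged sketch (case-class encoding assumed, names taken from the schema payloads in the log) of the shapes involved:

    // Sketch of the data types behind the PersonV1 / PersonV2Bad schemas seen above.
    final case class Book(name: String, isbn: Int)
    final case class PersonV1(name: String, age: Int, books: List[Book])

    // Renaming `books` to `booksRead` adds a property to an open content model,
    // which the registry's BACKWARD check rejects (PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL).
    final case class PersonV2Bad(name: String, age: Int, booksRead: List[Book])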
423[info] io.kaizensolutions.jsonschema.JsonSchemaSerdesSpec
424[info] + JsonSchemaSerialization will automatically register the JSON Schema and allow you to send JSON data to Kafka 1s
425[info] + Enabling use latest (and disabling auto-registration) without configuring the client will fail 94ms
426[info] + Attempting to publish an incompatible change with auto-registration will fail 84ms
427[info] + Attempting to publish an incompatible change without auto-registration (using latest server schema) will fail 43ms
428[info] + Attempting to publish an incompatible change without auto-registration and not using the latest schema will fail 58ms
429[info] + Publishing a forward compatible change with auto-registration is allowed (in forward-compatibility mode) 198ms
430[info] + Reading data back from the topic with the latest schema is allowed provided you compensate for missing fields in your Decoder 3s
431[info] + Reading data back from the topic with an older schema is allowed 3s
432
433************************
434Build summary:
435[{
436 "module": "fs2-kafka-jsonschema",
437 "compile": {"status": "ok", "tookMs": 9970, "warnings": 0, "errors": 0, "sourceVersion": "3.8"},
438 "doc": {"status": "skipped", "tookMs": 0, "files": 0, "totalSizeKb": 0},
439 "test-compile": {"status": "ok", "tookMs": 8796, "warnings": 0, "errors": 0, "sourceVersion": "3.8"},
440 "test": {"status": "ok", "tookMs": 19906, "passed": 8, "failed": 0, "ignored": 0, "skipped": 0, "total": 8, "byFramework": [{"framework": "unknown", "stats": {"passed": 8, "failed": 0, "ignored": 0, "skipped": 0, "total": 8}}]},
441 "publish": {"status": "skipped", "tookMs": 0},
442 "metadata": {
443 "crossScalaVersions": ["3.3.3"]
444}
445}]
446************************
447[success] Total time: 118 s (0:01:58.0), completed Nov 28, 2025, 3:13:43 PM
448Checking patch project/plugins.sbt...
449Checking patch project/build.properties...
450Checking patch build.sbt...
451Applied patch project/plugins.sbt cleanly.
452Applied patch project/build.properties cleanly.
453Applied patch build.sbt cleanly.