Build Logs
kaizen-solutions/fs2-kafka-jsonschema • 3.8.0-RC6 : 2026-01-08
Errors: 13 | Warnings: 4 | Total Lines: 412
##################################
Cloning https://github.com/kaizen-solutions/fs2-kafka-jsonschema.git into /build/repo using revision v0.0.1
##################################
Note: switching to 'cd75c5f4d4182f46d206c036ec973ff6a26e2fd7'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

Would override fixed Scala version: 3.3.3
----
Preparing build for 3.8.0-RC6
Scala binary version found: 3.8
Implicitly using source version 3.8
Scala binary version found: 3.8
Implicitly using source version 3.8
Would try to apply common scalacOption (best-effort, sbt/mill only):
Append: ,REQUIRE:-source:3.8
Remove: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e

Try apply source patch:
Path: build.sbt
Pattern: val scala3 = "3.3.3"
Replacement: val scala3 = "3.8.0-RC6"
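The source patch above rewrites the Scala version pinned in the project's build.sbt before the build runs. A minimal sketch of the affected line, before and after patching (the use site shown is hypothetical, not taken from the project's actual build):

```scala
// val scala3 = "3.3.3"     // original line, matched by the patch pattern
val scala3 = "3.8.0-RC6"    // value substituted for <SCALA_VERSION>

ThisBuild / scalaVersion := scala3  // hypothetical use site for the pinned version
```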
Starting compilation server
Compiling project (Scala 3.7.3, JVM (17))
Compiled project (Scala 3.7.3, JVM (17))
Successfully applied pattern 'val scala3 = "3.3.3"' in build.sbt
----
Starting build for 3.8.0-RC6
Execute tests: true
sbt project found:
Sbt version 1.10.1 is not supported, minimal supported version is 1.11.5
Enforcing usage of sbt in version 1.11.5
No prepare script found for project kaizen-solutions/fs2-kafka-jsonschema
##################################
Scala version: 3.8.0-RC6
Targets: io.kaizen-solutions%fs2-kafka-jsonschema
Project projectConfig: {"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
##################################
Using extra scalacOptions: ,REQUIRE:-source:3.8
Filtering out scalacOptions: ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
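The option mapping above appends `-source:3.8` and strips flags that escalate warnings to errors, so deprecations introduced by the release candidate cannot fail the build. Done by hand, the equivalent build.sbt change might look like this (the base option list here is hypothetical, for illustration only):

```scala
// Hypothetical original options, including flags the community build filters out:
val baseOptions = Seq("-deprecation", "-feature", "-Xfatal-warnings", "-unchecked")

// Roughly what the community build's mapping does: drop warning-escalating
// flags, then require the new source level.
val warningEscalating = Set("-deprecation", "-feature", "-Xfatal-warnings", "-Werror")
ThisBuild / scalacOptions := baseOptions.filterNot(warningEscalating) :+ "-source:3.8"
```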
[sbt_options] declare -a sbt_options=()
[process_args] java_version = '17'
[copyRt] java9_rt = '/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8/rt.jar'
# Executing command line:
java
-Dfile.encoding=UTF-8
-Dcommunitybuild.scala=3.8.0-RC6
-Dcommunitybuild.project.dependencies.add=
-Xmx7G
-Xms4G
-Xss8M
-Dsbt.script=/root/.sdkman/candidates/sbt/current/bin/sbt
-Dscala.ext.dirs=/root/.sbt/1.0/java9-rt-ext-eclipse_adoptium_17_0_8
-jar
/root/.sdkman/candidates/sbt/1.11.5/bin/sbt-launch.jar
"setCrossScalaVersions 3.8.0-RC6"
"++3.8.0-RC6 -v"
"mapScalacOptions ",REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s" ",-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e""
"set every credentials := Nil"
"excludeLibraryDependency com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}"
"removeScalacOptionsStartingWith -P:wartremover"

moduleMappings
"runBuild 3.8.0-RC6 """{"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}""" io.kaizen-solutions%fs2-kafka-jsonschema"

[info] welcome to sbt 1.11.5 (Eclipse Adoptium Java 17.0.8)
[info] loading settings for project repo-build-build-build from metals.sbt...
[info] loading project definition from /build/repo/project/project/project
[info] loading settings for project repo-build-build from metals.sbt...
[info] loading project definition from /build/repo/project/project
[success] Generated .bloop/repo-build-build.json
[success] Total time: 4 s, completed Jan 8, 2026, 3:00:05 AM
[info] loading settings for project repo-build from akka.sbt, metals.sbt, plugins.sbt...
[info] loading project definition from /build/repo/project
[success] Generated .bloop/repo-build.json
[info] compiling 2 Scala sources to /build/repo/project/target/scala-2.12/sbt-1.0/classes ...
[info] Non-compiled module 'compiler-bridge_2.12' for Scala 2.12.20. Compiling...
[info] Compilation completed in 7.731s.
[info] done compiling
[success] Total time: 20 s, completed Jan 8, 2026, 3:00:26 AM
[info] loading settings for project root from build.sbt...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
Execute setCrossScalaVersions: 3.8.0-RC6
OpenCB::Changing crossVersion 3.8.0-RC6 -> 3.8.0-RC6 in root/crossScalaVersions
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
[info] Setting Scala version to 3.8.0-RC6 on 1 projects.
[info] Switching Scala version on:
[info] * root (3.8.0-RC6)
[info] Excluding projects:
[info] Reapplying settings...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
Execute mapScalacOptions: ,REQUIRE:-source:3.8,-Wconf:msg=can be rewritten automatically under:s ,-deprecation,-feature,-Xfatal-warnings,-Werror,MATCH:.*-Wconf.*any:e
[info] Reapplying settings...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
[info] Defining Global / credentials, credentials
[info] The new values will be used by Compile / scalafmtOnly, Global / pgpSelectPassphrase and 9 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
Execute excludeLibraryDependency: com.github.ghik:zerowaste_{scalaVersion} com.olegpy:better-monadic-for_3 org.polyvariant:better-tostring_{scalaVersion} org.wartremover:wartremover_{scalaVersion}
[info] Reapplying settings...
OpenCB::Failed to reapply settings in excludeLibraryDependency: Reference to undefined setting:

  Global / allExcludeDependencies from Global / allExcludeDependencies (CommunityBuildPlugin.scala:331)
  Did you mean allExcludeDependencies ?
 , retry without global scopes
[info] Reapplying settings...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
Execute removeScalacOptionsStartingWith: -P:wartremover
[info] Reapplying settings...
[info] set current project to fs2-kafka-jsonschema (in build file:/build/repo/)
[success] Total time: 0 s, completed Jan 8, 2026, 3:00:34 AM
Build config: {"projects":{"exclude":[],"overrides":{}},"java":{"version":"17"},"sbt":{"commands":[],"options":[]},"mill":{"options":[]},"tests":"full","migrationVersions":[],"sourcePatches":[{"path":"build.sbt","pattern":"val scala3 = \"3.3.3\"","replaceWith":"val scala3 = \"<SCALA_VERSION>\""}]}
Parsed config: Success(ProjectBuildConfig(ProjectsConfig(List(),Map()),Full,List()))
Starting build...
Projects: Set(root)
Starting build for ProjectRef(file:/build/repo/,root) (fs2-kafka-jsonschema)... [0/1]
OpenCB::Exclude Scala3 specific scalacOption `REQUIRE:-source:3.8` in Scala 2.12.20 module Global
OpenCB::Filter out '-deprecation', matches setting pattern '^-?-deprecation'
OpenCB::Filter out '-feature', matches setting pattern '^-?-feature'
Compile scalacOptions: -release, 17, -Wunused:imports, -Wunused:explicits, -Wvalue-discard, -unchecked, -encoding, utf8, -Wunused:implicits, -Ykind-projector, -Xsemanticdb, -semanticdb-target, /build/repo/target/scala-3.8.0-RC6/meta, -Wconf:msg=can be rewritten automatically under:s, -source:3.8
[info] compiling 4 Scala sources to /build/repo/target/scala-3.8.0-RC6/classes ...
[warn] Option -Ykind-projector is deprecated: Use -Xkind-projector instead.
[warn] one warning found
[info] done compiling
[info] compiling 2 Scala sources to /build/repo/target/scala-3.8.0-RC6/test-classes ...
[warn] Option -Ykind-projector is deprecated: Use -Xkind-projector instead.
[warn] one warning found
[info] done compiling
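The deprecation warnings above come from the project still passing `-Ykind-projector`, which current Scala 3 compilers spell `-Xkind-projector`. A hedged sketch of a build.sbt fix, assuming the option is set directly in the project's own settings:

```scala
// Rewrite the deprecated flag spelling in place; all other options pass through.
scalacOptions := scalacOptions.value.map {
  case "-Ykind-projector" => "-Xkind-projector"
  case other              => other
}
```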
2026-01-08 03:01:44.618+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ConfigResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ConfigResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ContextsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ContextsResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SchemasResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SchemasResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.CompatibilityResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.CompatibilityResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ModeResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ModeResource will be ignored.
2026-01-08 03:01:44.624+0100 warn [Providers] A provider io.confluent.kafka.schemaregistry.rest.resources.ServerMetadataResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider io.confluent.kafka.schemaregistry.rest.resources.ServerMetadataResource will be ignored.
[2026-01-08 03:01:44,790] INFO HV000001: Hibernate Validator 6.1.7.Final (org.hibernate.validator.internal.util.Version:21)
03:01:46.480 [io-compute-0] ERROR i.c.k.s.c.CachedSchemaRegistryClient - Invalid schema type JSON
org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
    at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:171)
    at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
    at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
    at defer @ fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1(Serializer.scala:147)
    at defer @ weaver.Test$.apply$$anonfun$1$$anonfun$1(Test.scala:21)
    at product$extension @ fs2.kafka.KafkaProducer$.serializeToBytes(KafkaProducer.scala:242)
    at map @ fs2.kafka.KafkaProducer$.asJavaRecord(KafkaProducer.scala:259)
    at flatMap @ fs2.kafka.KafkaProducer$.produceRecord$$anonfun$1(KafkaProducer.scala:209)
    at map2 @ fs2.Chunk.loop$1(Chunk.scala:434)
    at map @ fs2.Chunk.loop$1(Chunk.scala:437)
    at map @ fs2.Chunk.traverse(Chunk.scala:458)
    at map @ fs2.kafka.KafkaProducer$.produce$$anonfun$1(KafkaProducer.scala:180)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at handleErrorWith @ fs2.Compiler$Target.handleErrorWith(Compiler.scala:161)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at get @ fs2.internal.Scope.openScope(Scope.scala:275)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
Caused by: java.io.IOException: Invalid schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}} with refs [] of type JSON
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lambda$lookupLatestVersion$8(AbstractKafkaSchemaSerDe.java:582)
    at java.base/java.util.Optional.orElseThrow(Optional.java:403)
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:579)
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:557)
    at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:151)
    at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
    at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
    at cats.effect.IOFiber.runLoop(IOFiber.scala:413)
    at cats.effect.IOFiber.execR(IOFiber.scala:1362)
    at cats.effect.IOFiber.run(IOFiber.scala:112)
    at cats.effect.unsafe.WorkerThread.run(WorkerThread.scala:743)
org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
03:01:46.552 [qtp382889912-420] ERROR i.c.r.e.DebuggableExceptionMapper - Request Failed with exception
io.confluent.kafka.schemaregistry.rest.exceptions.RestIncompatibleSchemaException: Schema being registered is incompatible with an earlier schema for subject "example-topic-persons-value", details: [{errorType:"PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL", description:"The new schema has an open content model and has a property or item at path '#/properties/booksRead' which is missing in the old schema'}, {oldSchemaVersion: 1}, {oldSchema: '{"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]
    at io.confluent.kafka.schemaregistry.rest.exceptions.Errors.incompatibleSchemaException(Errors.java:134)
    at io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource.register(SubjectVersionsResource.java:436)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:134)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:177)
    at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:159)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:81)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:475)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:397)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81)
    at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255)
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
    at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
    at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
    at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:684)
    at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
    at org.glassfish.jersey.servlet.ServletContainer.serviceImpl(ServletContainer.java:378)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:553)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:494)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:431)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at io.confluent.rest.metrics.JettyRequestMetricsFilter.doFilter(JettyRequestMetricsFilter.java:84)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at io.confluent.kafka.schemaregistry.rest.RequestIdHandler.handle(RequestIdHandler.java:51)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:181)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
    at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: io.confluent.kafka.schemaregistry.exceptions.IncompatibleSchemaException: [{errorType:"PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL", description:"The new schema has an open content model and has a property or item at path '#/properties/booksRead' which is missing in the old schema'}, {oldSchemaVersion: 1}, {oldSchema: '{"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]
    at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.register(KafkaSchemaRegistry.java:751)
    at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.registerOrForward(KafkaSchemaRegistry.java:882)
    at io.confluent.kafka.schemaregistry.rest.resources.SubjectVersionsResource.register(SubjectVersionsResource.java:417)
    ... 69 common frames omitted
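The BACKWARD check fails because PersonV1 has an open content model: it places no constraint on unknown properties, so data already written may contain a `booksRead` field of any shape, while PersonV2Bad suddenly requires it to be an array of Book objects. A minimal sketch of that failure mode (a hand-rolled check for illustration, not the registry's actual validator; the sample instance is hypothetical):

```python
# Data that validates against PersonV1: its open content model accepts any
# value for the unknown property "booksRead".
written_under_v1 = {"name": "Ann", "age": 30, "booksRead": "not-an-array"}

def books_read_ok_under_v2bad(doc):
    """Check only the constraint PersonV2Bad adds: booksRead, if present,
    must be an array of Book objects carrying the required name and isbn."""
    books = doc.get("booksRead")
    if books is None:
        return True
    if not isinstance(books, list):
        return False
    return all(isinstance(b, dict) and "name" in b and "isbn" in b for b in books)

# Old data can violate the new constraint, so V2Bad cannot safely read
# everything written under V1 -- exactly what BACKWARD compatibility forbids.
print(books_read_ok_under_v2bad(written_under_v1))            # False
print(books_read_ok_under_v2bad({"name": "Ann", "age": 30}))  # True
```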
org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema for subject "example-topic-persons-value", details: [{errorType:"PROPERTY_ADDED_TO_OPEN_CONTENT_MODEL", description:"The new schema has an open content model and has a property or item at path '#/properties/booksRead' which is missing in the old schema'}, {oldSchemaVersion: 1}, {oldSchema: '{"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]; error code: 409
org.apache.kafka.common.errors.SerializationException: Error serializing JSON message
    at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:171)
    at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
    at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
    at defer @ fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1(Serializer.scala:147)
    at defer @ weaver.Test$.apply$$anonfun$1$$anonfun$1(Test.scala:21)
    at product$extension @ fs2.kafka.KafkaProducer$.serializeToBytes(KafkaProducer.scala:242)
    at map @ fs2.kafka.KafkaProducer$.asJavaRecord(KafkaProducer.scala:259)
    at flatMap @ fs2.kafka.KafkaProducer$.produceRecord$$anonfun$1(KafkaProducer.scala:209)
    at map2 @ fs2.Chunk.loop$1(Chunk.scala:434)
    at map @ fs2.Chunk.loop$1(Chunk.scala:437)
    at map @ fs2.Chunk.traverse(Chunk.scala:458)
    at map @ fs2.kafka.KafkaProducer$.produce$$anonfun$1(KafkaProducer.scala:180)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at handleErrorWith @ fs2.Compiler$Target.handleErrorWith(Compiler.scala:161)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at get @ fs2.internal.Scope.openScope(Scope.scala:275)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
    at flatMap @ fs2.Compiler$Target.flatMap(Compiler.scala:163)
Caused by: java.io.IOException: Incompatible schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV1","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"books":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}} with refs [] of type JSON for schema {"$schema":"http://json-schema.org/draft-04/schema#","$defs":{"Book":{"title":"Book","type":"object","required":["name","isbn"],"properties":{"name":{"description":"name of the book","type":"string"},"isbn":{"description":"international standard book number","type":"integer","format":"int32"}}}},"title":"PersonV2Bad","type":"object","required":["name","age"],"properties":{"name":{"description":"name of the person","type":"string"},"age":{"description":"age of the person","type":"integer","format":"int32"},"booksRead":{"description":"A list of books that the person has read","type":"array","items":{"$ref":"#/$defs/Book"}}}}. Set latest.compatibility.strict=false to disable this check
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:590)
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupLatestVersion(AbstractKafkaSchemaSerDe.java:557)
    at io.confluent.kafka.serializers.json.AbstractKafkaJsonSchemaSerializer.serializeImpl(AbstractKafkaJsonSchemaSerializer.java:151)
    at io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer.serialize(KafkaJsonSchemaSerializer.java:95)
    at fs2.kafka.GenericSerializer$.delegate$$anonfun$1(Serializer.scala:84)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.contramap$$anonfun$1(Serializer.scala:131)
    at fs2.kafka.GenericSerializer$$anon$1.serialize(Serializer.scala:127)
    at fs2.kafka.GenericSerializer$$anon$1.suspend$$anonfun$1$$anonfun$1(Serializer.scala:147)
    at cats.effect.IOFiber.runLoop(IOFiber.scala:413)
    at cats.effect.IOFiber.execR(IOFiber.scala:1362)
    at cats.effect.IOFiber.run(IOFiber.scala:112)
    at cats.effect.unsafe.WorkerThread.run(WorkerThread.scala:743)
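The IOException above names its own escape hatch: the serializer's client-side check against the latest registered schema version can be disabled with `latest.compatibility.strict=false`. A hedged sketch of the producer serializer configuration (the key name is taken from the error message; the other entries are the usual Confluent client settings, shown here as assumptions for context):

```properties
# Disable the client-side latest-version compatibility check
latest.compatibility.strict=false

# Assumed surrounding serializer settings, for context only:
schema.registry.url=http://localhost:8081
value.serializer=io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer
```

Note that this only silences the client-side check; the registry's own subject-level compatibility setting (BACKWARD here) would still reject registering the incompatible schema.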
03:01:46.648 [qtp382889912-420] ERROR i.c.r.e.DebuggableExceptionMapper - Request Failed with exception
io.confluent.rest.exceptions.RestNotFoundException: Schema not found
    at io.confluent.kafka.schemaregistry.rest.exceptions.Errors.schemaNotFoundException(Errors.java:121)
    at io.confluent.kafka.schemaregistry.rest.resources.SubjectsResource.lookUpSchemaUnderSubject(SubjectsResource.java:132)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:134)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:177)
    at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$VoidOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:159)
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:81)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:475)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:397)
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81)
    at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255)
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
    at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
    at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
    at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
    at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:684)
    at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
    at org.glassfish.jersey.servlet.ServletContainer.serviceImpl(ServletContainer.java:378)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:553)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:494)
    at org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:431)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at io.confluent.rest.metrics.JettyRequestMetricsFilter.doFilter(JettyRequestMetricsFilter.java:84)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
350 at io.confluent.kafka.schemaregistry.rest.RequestIdHandler.handle(RequestIdHandler.java:51)
351 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
352 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
353 at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
354 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
355 at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
356 at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
357 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
358 at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
359 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
360 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
361 at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
362 at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:181)
363 at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
364 at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
365 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
366 at org.eclipse.jetty.server.Server.handle(Server.java:516)
367 at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
368 at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
369 at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
370 at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
371 at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
372 at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
373 at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
374 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
375 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
376 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
377 at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
378 at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
379 at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
380 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
381 at java.base/java.lang.Thread.run(Thread.java:833)
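The RestNotFoundException above is the embedded Schema Registry's subject-lookup path reporting that a schema is not yet registered under a subject; the test suite below triggers this deliberately while exercising registration behavior, so it is expected noise rather than a build failure. The registry signals this condition with HTTP 404 and error code 40403 in a JSON error body. A minimal sketch of classifying such a payload (the `classify_registry_error` helper is hypothetical; the payload shape follows Confluent Schema Registry's error format):

```python
import json

# Confluent Schema Registry reports "Schema not found" as HTTP 404
# with error_code 40403 in the JSON error body.
SCHEMA_NOT_FOUND = 40403

def classify_registry_error(body: str) -> str:
    """Map a Schema Registry error payload to a coarse category."""
    payload = json.loads(body)
    code = payload.get("error_code")
    if code == SCHEMA_NOT_FOUND:
        return "schema-not-found"
    return f"other ({code})"

# Example payload in the shape the registry returns when a subject
# lookup finds no registered schema.
example = '{"error_code": 40403, "message": "Schema not found"}'
print(classify_registry_error(example))  # schema-not-found
```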
[info] io.kaizensolutions.jsonschema.JsonSchemaSerdesSpec
[info] + JsonSchemaSerialization will automatically register the JSON Schema and allow you to send JSON data to Kafka 1s
[info] + Enabling use latest (and disabling auto-registration) without configuring the client will fail 59ms
[info] + Attempting to publish an incompatible change with auto-registration will fail 69ms
[info] + Attempting to publish an incompatible change without auto-registration (using latest server schema) will fail 37ms
[info] + Attempting to publish an incompatible change without auto-registration and not using the latest schema will fail 52ms
[info] + Publishing a forward compatible change with auto-registration is allowed (in forward-compatibility mode) 205ms
[info] + Reading data back from the topic with the latest schema is allowed provided you compensate for missing fields in your Decoder 3s
[info] + Reading data back from the topic with an older schema is allowed 3s
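The test names above revolve around two Confluent serializer settings: `auto.register.schemas` (register the local schema on first use) and `use.latest.version` (serialize against the latest schema already on the server). The second test checks that turning on use-latest while turning off auto-registration fails unless the registry client is actually configured. A hedged sketch of that consistency rule (the `validate_serde_config` helper is illustrative, not part of fs2-kafka-jsonschema's API):

```python
# Illustrative consistency check over Confluent-style serializer settings;
# validate_serde_config is a hypothetical helper for this sketch.
def validate_serde_config(cfg: dict) -> list[str]:
    """Return a list of configuration problems (empty if consistent)."""
    problems = []
    if cfg.get("use.latest.version") and cfg.get("auto.register.schemas"):
        problems.append("use.latest.version conflicts with auto.register.schemas")
    if cfg.get("use.latest.version") and not cfg.get("schema.registry.url"):
        problems.append("use.latest.version requires a configured registry client")
    return problems

# Mirrors the failing test case: use-latest enabled, auto-registration
# disabled, but no registry client configured.
bad = {"auto.register.schemas": False, "use.latest.version": True}
good = {"auto.register.schemas": False,
        "use.latest.version": True,
        "schema.registry.url": "http://localhost:8081"}
print(validate_serde_config(bad))   # one problem reported
print(validate_serde_config(good))  # []
```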

************************
Build summary:
[{
  "module": "fs2-kafka-jsonschema",
  "compile": {"status": "ok", "tookMs": 8132, "warnings": 0, "errors": 0, "sourceVersion": "3.8"},
  "doc": {"status": "skipped", "tookMs": 0, "files": 0, "totalSizeKb": 0},
  "test-compile": {"status": "ok", "tookMs": 8783, "warnings": 0, "errors": 0, "sourceVersion": "3.8"},
  "test": {"status": "ok", "tookMs": 19182, "passed": 8, "failed": 0, "ignored": 0, "skipped": 0, "total": 8, "byFramework": [{"framework": "unknown", "stats": {"passed": 8, "failed": 0, "ignored": 0, "skipped": 0, "total": 8}}]},
  "publish": {"status": "skipped", "tookMs": 0},
  "metadata": {
    "crossScalaVersions": ["3.3.3"]
  }
}]
************************
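Summaries in this shape lend themselves to machine checks. A minimal sketch that flags any stage whose status is neither ok nor skipped (stage and field names are taken from the summary above; the helper itself is illustrative):

```python
import json

# Build summary in the shape emitted above, abbreviated to status fields.
summary = json.loads("""
[{
  "module": "fs2-kafka-jsonschema",
  "compile": {"status": "ok"},
  "doc": {"status": "skipped"},
  "test-compile": {"status": "ok"},
  "test": {"status": "ok", "passed": 8, "failed": 0},
  "publish": {"status": "skipped"}
}]
""")

STAGES = ("compile", "doc", "test-compile", "test", "publish")

def failed_stages(modules: list[dict]) -> list[str]:
    """Return 'module/stage' for every stage that is neither ok nor skipped."""
    return [
        f"{m['module']}/{stage}"
        for m in modules
        for stage in STAGES
        if m[stage]["status"] not in ("ok", "skipped")
    ]

print(failed_stages(summary))  # [] -- every stage passed or was skipped
```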
[success] Total time: 81 s (0:01:21.0), completed Jan 8, 2026, 3:01:55 AM
Checking patch project/plugins.sbt...
Checking patch project/build.properties...
Checking patch build.sbt...
Applied patch project/plugins.sbt cleanly.
Applied patch project/build.properties cleanly.
Applied patch build.sbt cleanly.