I am working on a Flink job that uses KafkaSinkBuilder to configure a Kafka sink, based on the Ververica example. However, when I run the job, I encounter the following exception:
Exception in thread "main" java.lang.reflect.InaccessibleObjectException: Unable to make field private final byte[] java.lang.String.value accessible: module java.base does not "opens java.lang" to unnamed module @22f71333
    at java.base/java.lang.reflect.AccessibleObject.throwInaccessibleObjectException(AccessibleObject.java:388)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:364)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:312)
    at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:183)
    at java.base/java.lang.reflect.Field.setAccessible(Field.java:177)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:106)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
    at org.apache.flink.connector.kafka.sink.KafkaSinkBuilder.setRecordSerializer(KafkaSinkBuilder.java:141)
    at com.ververica.flink.example.datausage.KafkaProducerJob.main(KafkaProducerJob.java:53)
What I've tried so far:

1. Adding --add-opens JVM arguments:

       --add-opens java.base/java.lang=ALL-UNNAMED

   This mitigates the issue, but it doesn't seem like a long-term solution.
2. Ensuring compatibility between my Java and Flink versions.
3. Refactoring my code to use an explicit serializer instead of relying on Flink's default serialization (see the sketch below).

Is there a recommended way to solve this without --add-opens? Can I bypass the ClosureCleaner entirely? If so, how would that look for a Kafka sink? Any guidance or examples would be greatly appreciated!
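For reference, after the refactoring in point 3 the sink is built roughly like this. This is a simplified sketch of the job, not the exact Ververica code; the bootstrap servers, topic name, and String payload type are placeholders:

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
    import org.apache.flink.connector.kafka.sink.KafkaSink;

    public class KafkaProducerJob {
        public static void main(String[] args) {
            KafkaSink<String> sink =
                    KafkaSink.<String>builder()
                            .setBootstrapServers("localhost:9092")  // placeholder
                            // this is the call that triggers ClosureCleaner.clean(...) in the stack trace
                            .setRecordSerializer(
                                    KafkaRecordSerializationSchema.builder()
                                            .setTopic("output-topic")  // placeholder
                                            .setValueSerializationSchema(new SimpleStringSchema())
                                            .build())
                            .build();
            // ... attach the sink to a stream and execute the environment
        }
    }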
Experimental support for Java 17 was added in Flink 1.18. With Flink 1.14, the best you can do is use Java 11.
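If you have to stay on Java 17 with an older Flink, you cannot really avoid the --add-opens flags; at most you can centralize them in flink-conf.yaml under env.java.opts so every Flink-started JVM inherits them, instead of repeating them on each command line.

As for bypassing the ClosureCleaner: ExecutionConfig has a switch to disable it, but note that your stack trace shows KafkaSinkBuilder.setRecordSerializer invoking ClosureCleaner directly, so the environment-level switch will not necessarily reach that call. A minimal sketch of the switch (assuming your functions and serializers capture no non-serializable state, since nothing is cleaned for you anymore):

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    // Skip reflective closure cleaning for functions registered on this environment.
    // Anything you register must then already be serializable as written.
    env.getConfig().disableClosureCleaner();

In practice the cleaner mainly matters for anonymous inner classes that capture their enclosing scope; implementing the record serializer as a top-level (or static nested) class sidesteps the problem at the source.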