How to resolve InaccessibleObjectException in Flink when setting a record serializer for KafkaSink?


I am working on a Flink job that uses KafkaSinkBuilder to configure a Kafka sink, based on a Ververica example. However, when I run the job, I encounter the following exception:

Exception in thread "main" java.lang.reflect.InaccessibleObjectException: Unable to make field private final byte[] java.lang.String.value accessible: module java.base does not "opens java.lang" to unnamed module @22f71333
    at java.base/java.lang.reflect.AccessibleObject.throwInaccessibleObjectException(AccessibleObject.java:388)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:364)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:312)
    at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:183)
    at java.base/java.lang.reflect.Field.setAccessible(Field.java:177)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:106)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
    at org.apache.flink.connector.kafka.sink.KafkaSinkBuilder.setRecordSerializer(KafkaSinkBuilder.java:141)
    at com.ververica.flink.example.datausage.KafkaProducerJob.main(KafkaProducerJob.java:53)

What I've Tried

  1. Adding --add-opens JVM arguments:

    --add-opens java.base/java.lang=ALL-UNNAMED
    

This mitigates the issue but doesn't seem like a long-term solution (see the configuration sketch after this list for how I'm currently applying it).

  2. Ensuring compatibility between Java and Flink versions:

    • Java version: 17
    • Flink version: 1.14
  3. Refactoring my code to use an explicit serializer instead of relying on Flink's default serialization.
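
For reference, this is roughly how I apply the flag today. env.java.opts is Flink's configuration key for passing extra JVM options to its processes; the file path and comment below are just a sketch of my setup:

    # conf/flink-conf.yaml -- applies the flag to every Flink JVM
    # (for local runs of main() from the IDE, I pass the same flag as a JVM option)
    env.java.opts: --add-opens java.base/java.lang=ALL-UNNAMED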

Questions

  1. Is there a way to avoid this exception without using --add-opens?
  2. Are there updates or specific configurations in recent Flink versions to resolve this issue?
  3. Should I be implementing a custom serializer to bypass ClosureCleaner entirely? If so, how would that look for a Kafka sink?

Any guidance or examples would be greatly appreciated!


asked Jan 29 at 15:15 by apolak

1 Answer


Experimental support for Java 17 was added in Flink 1.18. With Flink 1.14 the best you can do is to use Java 11. The underlying cause is visible in your stack trace: Flink's ClosureCleaner reflectively opens the JDK-internal field java.lang.String.value, which Java 17 forbids by default under strong encapsulation (JEP 403); that is also why --add-opens makes the error go away. To answer your questions directly: on Flink 1.14 with Java 17 there is no clean fix without --add-opens; upgrading to Flink 1.18+ is the supported path; and a custom serializer won't help, because setRecordSerializer runs the ClosureCleaner on whatever serializer you pass in.
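
For reference, here is a minimal sketch of a KafkaSink wired up with an explicit record serializer via KafkaRecordSerializationSchema.builder(); the bootstrap servers, topic name, and sample elements are placeholders. The same code runs on Flink 1.14 with Java 11, or on Flink 1.18+ if you need Java 17:

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.base.DeliveryGuarantee;
    import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
    import org.apache.flink.connector.kafka.sink.KafkaSink;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class KafkaProducerJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // "localhost:9092" and "output-topic" are placeholders -- adjust for your setup.
            KafkaSink<String> sink = KafkaSink.<String>builder()
                    .setBootstrapServers("localhost:9092")
                    .setRecordSerializer(
                            KafkaRecordSerializationSchema.builder()
                                    .setTopic("output-topic")
                                    .setValueSerializationSchema(new SimpleStringSchema())
                                    .build())
                    .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                    .build();

            env.fromElements("a", "b", "c").sinkTo(sink);
            env.execute("kafka-producer-job");
        }
    }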
