Kafka-ui supports multiple ways to serialize/deserialize data.
Int32, Int64, UInt32, UInt64
Big-endian 4-/8-byte representation of signed/unsigned integers. For example, the Int32 value 1 is encoded as the four bytes 00 00 00 01.
Base64
Base64 (RFC 4648) binary data representation. Useful when the actual data itself is not important, but an exact byte-wise copy of the key/value needs to be sent.
Hex
Hexadecimal binary data representation. The byte delimiter and letter case can be configured.
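For example, a colon-delimited, lowercase Hex serde could be configured like this (a sketch assuming the 'delimiter' and 'uppercase' property names of the built-in Hex serde; verify them against your kafka-ui version):

```yaml
kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: Hex
          properties:
            # delimiter inserted between bytes in the hex representation (assumed default: a single space)
            delimiter: ":"
            # render hex digits in lowercase instead of the assumed default uppercase
            uppercase: false
```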
ProtobufFile
Class name: io.kafbat.ui.serdes.builtin.ProtobufFileSerde
Sample configuration:
```yaml
kafka:
  clusters:
    - name: Cluster1
      # Other Cluster configuration omitted ...
      serde:
        - name: ProtobufFile
          properties:
            # protobufFilesDir specifies the root location for proto files (scanned recursively)
            # NOTE: if 'protobufFilesDir' is specified, the 'protobufFile' and 'protobufFiles' settings are ignored
            protobufFilesDir: "/path/to/my-protobufs"
            # (DEPRECATED) protobufFile is the path to a single protobuf schema (please use 'protobufFilesDir')
            protobufFile: path/to/my.proto
            # (DEPRECATED) protobufFiles is the location of one or more protobuf schemas (please use 'protobufFilesDir')
            protobufFiles:
              - /path/to/my-protobufs/my.proto
              - /path/to/my-protobufs/another.proto
            # default protobuf type used for KEY serialization/deserialization
            # optional
            protobufMessageNameForKey: my.Type1
            # mapping of topic names to protobuf types used for KEY serialization/deserialization
            # optional
            protobufMessageNameForKeyByTopic:
              topic1: my.KeyType1
              topic2: my.KeyType2
            # protobufMessageName is the default protobuf type used for VALUE serialization/deserialization
            # (applied when the topic is not found in protobufMessageNameByTopic)
            # optional, if not set the first type in the file will be used as default
            protobufMessageName: my.DefaultValType
            # mapping of topic names to protobuf types used for VALUE serialization/deserialization
            # optional
            protobufMessageNameByTopic:
              topic1: my.Type1
              "topic.2": my.Type2
```
ProtobufRawDecoder
A deserialize-only serde. Decodes a protobuf payload without a predefined schema, similar to the protoc --decode_raw command.
SchemaRegistry
The SchemaRegistry serde is automatically configured if schema registry properties are set at the cluster level. You can also add additional SchemaRegistry-typed serdes that connect to other schema-registry instances.
Class name: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
Sample configuration:
```yaml
kafka:
  clusters:
    - name: Cluster1
      # this url will be used by the "SchemaRegistry" serde by default
      schemaRegistry: http://main-schema-registry:8081
      serde:
        - name: AnotherSchemaRegistry
          className: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
          properties:
            url: http://another-schema-registry:8081
            # auth properties, optional
            username: nameForAuth
            password: P@ssW0RdForAuth
        # and also add another SchemaRegistry serde
        - name: ThirdSchemaRegistry
          className: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
          properties:
            url: http://another-yet-schema-registry:8081
```
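If your Schema Registry subjects do not follow the default '<topic>-key' / '<topic>-value' naming, the subject-name templates can be overridden per serde. A sketch assuming the 'keySchemaNameTemplate' and 'schemaNameTemplate' property names, where '%s' stands for the topic name; verify these against your kafka-ui version:

```yaml
kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: SchemaRegistryWithTemplates
          className: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
          properties:
            url: http://schema-registry-with-templates:8081
            # subject-name template for message keys (assumed default: "%s-key")
            keySchemaNameTemplate: "%s-key"
            # subject-name template for message values (assumed default: "%s-value")
            schemaNameTemplate: "%s-value"
```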
Setting serdes for specific topics
You can specify a preferred serde for a topic's key/value. This serde will be chosen by default in the UI on the topic's view/produce pages. To do so, set the topicKeysPattern/topicValuesPattern properties for the selected serde. Kafka-ui will choose the first serde whose pattern matches the topic name.
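For instance, the following sketch (topic names and the registry url are placeholders) pins String keys and SchemaRegistry values to two topics:

```yaml
kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: String
          topicKeysPattern: click-events|imp-events
        - name: SchemaRegistry
          properties:
            url: http://schema-registry:8081
          topicValuesPattern: click-events|imp-events
```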
If the selected serde cannot be applied (an exception is thrown), the fallback serde (String serde with UTF-8 encoding) will be used instead. Such messages are specially highlighted in the UI.
Custom pluggable serde registration
You can implement your own serde and register it in the kafbat-ui application. To do so:
1. Add the serde-api dependency (available via Maven Central).
2. Implement the io.kafbat.ui.serde.api.Serde interface. See the javadoc for implementation requirements.
3. Package your serde into an uber jar, or provide a directory containing a dependency-free jar and its dependency jars.
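4. Register the serde in the yaml configuration. A minimal sketch, with a placeholder class name and jar path, assuming the 'filePath' setting points kafka-ui at your jar (or at the directory with the jar and its dependencies):

```yaml
kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: MyCustomSerde
          # fully-qualified class implementing io.kafbat.ui.serde.api.Serde (placeholder)
          className: my.company.MyKafkaUiSerde
          # path to the uber jar, or to the directory with the jar and its dependencies (placeholder)
          filePath: /var/lib/kui-serde/my-kui-serde.jar
          # optional properties passed to the serde at configuration time
          properties:
            prop1: v1
```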