Test cases failing for Build your own Kafka - Send Correlation ID (#NV3)

I’m stuck on Stage #NV3 (Send Correlation ID).

I referred to the code examples and implemented mine in a similar way.

Here are my logs:

[compile] [INFO] Scanning for projects...
[compile] [INFO] 
[compile] [INFO] -----------------< io.codecrafters:codecrafters-kafka >-----------------
[compile] [INFO] Building codecrafters-kafka 1.0
[compile] [INFO]   from pom.xml
[compile] [INFO] --------------------------------[ jar ]---------------------------------
[compile] [INFO] 
[compile] [INFO] --- resources:3.3.1:resources (default-resources) @ codecrafters-kafka ---
[compile] [INFO] skip non existing resourceDirectory /app/src/main/resources
[compile] [INFO] 
[compile] [INFO] --- compiler:3.13.0:compile (default-compile) @ codecrafters-kafka ---
[compile] [INFO] Recompiling the module because of changed source code.
[compile] [INFO] Compiling 1 source file with javac [debug target 23] to target/classes
[compile] [INFO] 
[compile] [INFO] --- resources:3.3.1:testResources (default-testResources) @ codecrafters-kafka ---
[compile] [INFO] skip non existing resourceDirectory /app/src/test/resources
[compile] [INFO] 
[compile] [INFO] --- compiler:3.13.0:testCompile (default-testCompile) @ codecrafters-kafka ---
[compile] [INFO] No sources to compile
[compile] [INFO] 
[compile] [INFO] --- surefire:3.2.5:test (default-test) @ codecrafters-kafka ---
[compile] [INFO] No tests to run.
[compile] [INFO] 
[compile] [INFO] --- jar:3.4.1:jar (default-jar) @ codecrafters-kafka ---
[compile] [INFO] Building jar: /app/target/codecrafters-kafka-1.0.jar
[compile] [INFO] 
[compile] [INFO] --- assembly:3.7.1:single (make-assembly) @ codecrafters-kafka ---
[compile] [INFO] Building jar: /tmp/codecrafters-build-kafka-java/codecrafters-kafka.jar
[compile] [WARNING] Configuration option 'appendAssemblyId' is set to false.
[compile] Instead of attaching the assembly file: /tmp/codecrafters-build-kafka-java/codecrafters-kafka.jar, it will become the file for main project artifact.
[compile] NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[compile] [WARNING] Replacing pre-existing project main-artifact file: /app/target/codecrafters-kafka-1.0.jar
[compile] with assembly file: /tmp/codecrafters-build-kafka-java/codecrafters-kafka.jar
[compile] [INFO] ------------------------------------------------------------------------
[compile] [INFO] BUILD SUCCESS
[compile] [INFO] ------------------------------------------------------------------------
[compile] [INFO] Total time:  2.519 s
[compile] [INFO] Finished at: 2024-11-29T01:11:19Z
[compile] [INFO] ------------------------------------------------------------------------
[compile] Moved ./.codecrafters/run.sh → ./your_program.sh
[compile] Compilation successful.
[tester::#NV3] Running tests for Stage #NV3 (Send Correlation ID)
[tester::#NV3] $ ./your_program.sh /tmp/server.properties
[your_program] Logs from your program will appear here!
[tester::#NV3] Sending "ApiVersions" (version: 4) request (Correlation id: 7)
[tester::#NV3] [Decoder] Received:
[tester::#NV3] [Decoder] Hex (bytes 0--1)                                | ASCII
[tester::#NV3] [Decoder] ------------------------------------------------+------------------
[tester::#NV3] [Decoder] | 
[tester::#NV3] [Decoder]  ^                                                ^
[tester::#NV3] [Decoder] Error: Expected int32 length to be 4 bytes, got 0 bytes
[tester::#NV3] [Decoder] Context:
[tester::#NV3] [Decoder] - response
[tester::#NV3] [Decoder]   - message length
[tester::#NV3] [Decoder]     - INT32
[tester::#NV3] [Decoder] 
[tester::#NV3] [Decoder] Test failed (try setting 'debug: true' in your codecrafters.yml to see more details)

And here’s a snippet of my code:

public static void main(String[] args) {
    // You can use print statements as follows for debugging, they'll be visible when running tests.
    System.err.println("Logs from your program will appear here!");

    ServerSocket serverSocket = null;
    Socket clientSocket = null;
    int port = 9092;
    try {
        serverSocket = new ServerSocket(port);
        // Since the tester restarts your program quite often, setting SO_REUSEADDR
        // ensures that we don't run into 'Address already in use' errors
        serverSocket.setReuseAddress(true);
        // Wait for connection from client.
        clientSocket = serverSocket.accept();
        OutputStream out = clientSocket.getOutputStream();
        // Response: message_size (4 bytes) followed by the correlation ID (4 bytes).
        out.write(new byte[] {0, 1, 2, 3});
        out.write(new byte[] {0, 0, 0, 7});
    } catch (IOException e) {
        System.out.println("IOException: " + e.getMessage());
    } finally {
        try {
            if (clientSocket != null) {
                clientSocket.close();
            }
        } catch (IOException e) {
            System.out.println("IOException: " + e.getMessage());
        }
    }
}

Hi @veda-moda, could you upload your code to GitHub and share the link? It will be much easier to debug if I can run it directly.

@veda-moda, the latest version of your code doesn’t seem to implement any functionality.

Could you update it with your most recent changes?

I did it in Rust but ran into what I think is the same issue: the CodeCrafters test suite reports that the program doesn’t produce any output.

I solved this by partially reading the input from the client:

let mut request_bytes = [0; 36];
stream.read(&mut request_bytes).unwrap();
let request = KafkaRequest::try_from(request_bytes.as_slice()).unwrap();

Then sending the response:

let response = KafkaResponse::new(request);
let response_bytes: Vec<u8> = response.to_bytes().collect();
stream.write_all(&response_bytes).unwrap();

And finally, reading all remaining input from the client until EOF:

let mut ignored_bytes = Vec::new();
stream.read_to_end(&mut ignored_bytes).unwrap();
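
Putting those three steps together without my KafkaRequest/KafkaResponse types (which aren’t shown here), a minimal raw-bytes sketch of the same flow looks roughly like this. I use read_exact with a 12-byte buffer instead of the fixed 36-byte read above, but the idea is the same; the offsets assume the request layout for this stage: message_size (4 bytes), api_key (2), api_version (2), then correlation_id (4). As far as I can tell the tester here only validates the correlation ID, so the sketch just echoes it back with a zero message_size:

use std::io::{Read, Write};
use std::net::TcpListener;

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:9092")?;
    let (mut stream, _) = listener.accept()?;

    // Read just enough of the request to reach the correlation ID:
    // message_size (4) + api_key (2) + api_version (2) + correlation_id (4) = 12 bytes.
    let mut header = [0u8; 12];
    stream.read_exact(&mut header)?;
    let correlation_id: [u8; 4] = header[8..12].try_into().unwrap();

    // Respond right away: message_size (zero here), then the echoed correlation ID.
    let mut response = Vec::with_capacity(8);
    response.extend_from_slice(&0i32.to_be_bytes());
    response.extend_from_slice(&correlation_id);
    stream.write_all(&response)?;

    // Finally, drain the rest of the request until the tester closes the connection.
    let mut ignored = Vec::new();
    stream.read_to_end(&mut ignored)?;
    Ok(())
}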

Hi @Wattsy2020, looks like you’ve got past #nv3 — do you happen to remember what was wrong? Would love to see if we can improve the tester / instructions.

Yeah, here are the logs I got when it was going wrong (with an older version of the program):

[tester::#WA6] Running tests for Stage #WA6 (Parse Correlation ID)
[tester::#WA6] $ ./your_program.sh /tmp/server.properties
[tester::#WA6] Connecting to broker at: localhost:9092
[your_program] Logs from your program will appear here!
[your_program] Received new request
[tester::#WA6] Connection to broker at localhost:9092 successful
[tester::#WA6] Sending "ApiVersions" (version: 4) request (Correlation id: 235102252)
[tester::#WA6] Hexdump of sent "ApiVersions" request: 
[tester::#WA6] Idx  | Hex                                             | ASCII
[tester::#WA6] -----+-------------------------------------------------+-----------------
[tester::#WA6] 0000 | 00 00 00 23 00 12 00 04 0e 03 60 2c 00 09 6b 61 | ...#......`,..ka
[tester::#WA6] 0010 | 66 6b 61 2d 63 6c 69 00 0a 6b 61 66 6b 61 2d 63 | fka-cli..kafka-c
[tester::#WA6] 0020 | 6c 69 04 30 2e 31 00                            | li.0.1.
[tester::#WA6] 
[tester::#WA6] Hexdump of received "ApiVersions" response: 
[tester::#WA6] Idx  | Hex                                             | ASCII
[tester::#WA6] -----+-------------------------------------------------+-----------------
[tester::#WA6] | 
[tester::#WA6] 
[tester::#WA6] [Decoder] - .Response
[tester::#WA6] [Decoder] Received:
[tester::#WA6] [Decoder] Hex (bytes 0--1)                                | ASCII
[tester::#WA6] [Decoder] ------------------------------------------------+------------------
[tester::#WA6] [Decoder] | 
[tester::#WA6] [Decoder]  ^                                                ^
[tester::#WA6] [Decoder] Error: Expected int32 length to be 4 bytes, got 0 bytes
[tester::#WA6] [Decoder] Context:
[tester::#WA6] [Decoder] - response
[tester::#WA6] [Decoder]   - message length
[tester::#WA6] [Decoder]     - INT32
[tester::#WA6] [Decoder] 
[tester::#WA6] [Decoder] Test failed
[tester::#WA6] [Decoder] Terminating program
[your_program] Received Request: KafkaRequest { message_size: 35, request_api_key: 18, request_api_version: 4, correlation_id: 235102252 }
[your_program] Sending Response: KafkaResponse { message_size: 0, correlation_id: 235102252 }
[your_program] Sent response bytes: [0, 0, 0, 0, 14, 3, 96, 44]
[tester::#WA6] [Decoder] Program terminated successfully

From the logs you can see my program received the request (the "Received new request" line). But before it sent a response, the test code immediately tried to read one and found 0 bytes (shown after the line [tester::#WA6] Hexdump of received "ApiVersions" response:).

Finally, the last 4 lines of logs show that my program did eventually read the full request and send the response, but by then the testing code had already tried to read it, so the test still failed.

The code in main() that resulted in these logs was:

println!("Received new request");

let mut request_bytes: Vec<u8> = Vec::new();
stream.read_to_end(&mut request_bytes).unwrap();
let request = KafkaRequest::try_from(request_bytes.as_slice()).unwrap();
println!("Received Request: {request:?}");

let response = KafkaResponse::new(request);
println!("Sending Response: {response:?}");
let response_bytes: Vec<u8> = response.to_bytes().collect();
stream.write_all(&response_bytes).unwrap();
println!("Sent response bytes: {response_bytes:?}");

The main difference is that here I call read_to_end first (instead of last, as in the working solution). Maybe the issue is that the test code doesn’t send an EOF? read_to_end only returns once it hits EOF, so it was presumably blocked waiting for that, and only continued once the test code gave up and closed the connection.
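
For completeness, an alternative that doesn’t depend on EOF at all is to read exactly one length-prefixed message: read the 4-byte size, then exactly that many bytes. Just a sketch (the helper name is made up, and it assumes a std::net::TcpStream):

use std::io::Read;
use std::net::TcpStream;

// Hypothetical helper: reads exactly one length-prefixed Kafka message.
// Unlike read_to_end, this never waits for the peer to close the connection.
fn read_one_request(stream: &mut TcpStream) -> std::io::Result<Vec<u8>> {
    // message_size is a big-endian INT32 prefix.
    let mut size_bytes = [0u8; 4];
    stream.read_exact(&mut size_bytes)?;
    let message_size = i32::from_be_bytes(size_bytes) as usize;

    // Read exactly message_size bytes of payload; no EOF required.
    let mut payload = vec![0u8; message_size];
    stream.read_exact(&mut payload)?;
    Ok(payload)
}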


Nice catch! read_to_end() isn’t suitable for TCP connections in this scenario, since it only returns once the client closes its side of the connection.