Create and Deploy a Streaming Application with Google Cloud Platform Apache Kafka

This guide describes how to create a streaming application that demonstrates how to use the Micronaut Streaming API. The application consists of two Micronaut microservices that use Google Cloud Platform Apache Kafka to communicate with each other in an asynchronous and decoupled way.

Prerequisites #

Note: This guide uses paid services; you may need to enable billing in Google Cloud to complete some steps in this guide.

Follow the steps below to create the application from scratch. However, you can also download the completed example in Java.

A note regarding your development environment

Consider using Visual Studio Code, which provides native support for developing applications with the Graal Cloud Native Tools extension.

Note: If you use IntelliJ IDEA, enable annotation processing.

1. Create the Microservices #

The two microservices are:

  • Books returns a list of books. It uses a domain consisting of a book name and an International Standard Book Number (ISBN). It also publishes a message to the streaming service every time a book is accessed.
  • Analytics connects to the streaming service to update the analytics for every book (a counter). It also exposes an endpoint to retrieve the counter.

1.1. Create the Books Microservice #

  1. Open the GCN Launcher in advanced mode.

  2. Create a new project using the following selections.
    • Project Type: Application (Default)
    • Project Name: books
    • Base Package: com.example.publisher
    • Clouds: GCP
    • Language: Java (Default)
    • Build Tool: Gradle (Groovy) or Maven
    • Test Framework: JUnit (Default)
    • Java Version: 17 (Default)
    • Micronaut Version: (Default)
    • Cloud Services: Streaming
    • Features: Awaitility Framework, GraalVM Native Image, Micronaut Serialization Jackson Core, and Reactor
    • Sample Code: No
  3. Click Generate Project. The GCN Launcher creates an application with the default package com.example.publisher in a directory named books. The application ZIP file will be downloaded in your default downloads directory. Unzip it, open in your code editor, and proceed to the next steps.

Alternatively, use the GCN CLI as follows:

Gradle:

gcn create-app com.example.publisher.books \
    --clouds=gcp \
    --services=streaming \
    --features=awaitility,graalvm,reactor,serialization-jackson \
    --build=gradle \
    --lang=java \
    --example-code=false

Maven:

gcn create-app com.example.publisher.books \
    --clouds=gcp \
    --services=streaming \
    --features=awaitility,graalvm,reactor,serialization-jackson \
    --build=maven \
    --lang=java \
    --example-code=false

1.1.1. Book Domain Class

The launcher created a Book domain class in a file named lib/src/main/java/com/example/publisher/Book.java, as follows:

package com.example.publisher;

import io.micronaut.core.annotation.Creator;
import io.micronaut.serde.annotation.Serdeable;

import java.util.Objects;

@Serdeable
public class Book {

    private final String isbn;
    private final String name;

    @Creator
    public Book(String isbn, String name) {
        this.isbn = isbn;
        this.name = name;
    }

    public String getIsbn() {
        return isbn;
    }

    public String getName() {
        return name;
    }

    @Override
    public String toString() {
        return "Book{" +
                "isbn='" + isbn + '\'' +
                ", name='" + name + '\'' +
                '}';
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Book other = (Book) o;
        return Objects.equals(isbn, other.isbn) &&
                Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(isbn, name);
    }
}
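Book overrides equals and hashCode on its field values, which is what later lets the Analytics microservice use Book instances as map keys. A minimal stand-alone sketch of that value-equality behavior (using a record as a simplified stand-in for the Book class above):

```java
import java.util.HashMap;
import java.util.Map;

public class BookEqualityDemo {

    // Simplified stand-in for the guide's Book class; records generate
    // the same field-based equals/hashCode contract automatically.
    record Book(String isbn, String name) { }

    public static void main(String[] args) {
        Map<Book, Long> counts = new HashMap<>();

        // Two separately constructed instances with the same state...
        Book a = new Book("1491950358", "Building Microservices");
        Book b = new Book("1491950358", "Building Microservices");

        counts.merge(a, 1L, Long::sum);
        counts.merge(b, 1L, Long::sum);

        // ...land on the same map entry, because equality is by field values.
        System.out.println(counts.size());  // 1
        System.out.println(counts.get(a));  // 2
    }
}
```

Without the equals/hashCode overrides, each instance would be a distinct key and per-book counting would silently break.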

1.1.2. BookService

To keep this guide simple there is no database persistence: the Books microservice keeps the list of books in memory. The launcher created a class named BookService in lib/src/main/java/com/example/publisher/BookService.java with the following contents:

package com.example.publisher;

import jakarta.annotation.PostConstruct;
import jakarta.inject.Singleton;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

@Singleton
public class BookService {

    private final List<Book> bookStore = new ArrayList<>();

    @PostConstruct
    void init() {
        bookStore.add(new Book("1491950358", "Building Microservices"));
        bookStore.add(new Book("1680502395", "Release It!"));
        bookStore.add(new Book("0321601912", "Continuous Delivery"));
    }

    public List<Book> listAll() {
        return bookStore;
    }

    public Optional<Book> findByIsbn(String isbn) {
        return bookStore.stream()
                .filter(b -> b.getIsbn().equals(isbn))
                .findFirst();
    }
}

1.1.3. BookController

The launcher created a controller to access Book instances in a file named lib/src/main/java/com/example/publisher/BookController.java with the following contents:

package com.example.publisher;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

import java.util.List;
import java.util.Optional;

@Controller("/books") // <1>
class BookController {

    private final BookService bookService;

    BookController(BookService bookService) { // <2>
        this.bookService = bookService;
    }

    @Get // <3>
    List<Book> listAll() {
        return bookService.listAll();
    }

    @Get("/{isbn}") // <4>
    Optional<Book> findBook(String isbn) {
        return bookService.findByIsbn(isbn);
    }
}

1 The @Controller annotation defines the class as a controller mapped to the root URI /books.

2 Use constructor injection to inject a bean of type BookService.

3 The @Get annotation maps the listAll method to an HTTP GET request on /books.

4 The @Get annotation maps the findBook method to an HTTP GET request on /books/{isbn}.
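Returning Optional from findBook is what makes the not-found case work: Micronaut maps an empty Optional from a controller method to an HTTP 404 response. A plain-Java sketch of that mapping (hypothetical names; the framework performs this translation for you):

```java
import java.util.Map;
import java.util.Optional;

public class NotFoundSketch {

    // In-memory lookup, standing in for BookService.findByIsbn.
    static final Map<String, String> books =
            Map.of("1491950358", "Building Microservices");

    // Stand-in for the framework's handling of an Optional-returning
    // controller method: present -> 200 with a body, empty -> 404.
    static String respond(Optional<String> body) {
        return body.map(b -> "200 " + b).orElse("404");
    }

    public static void main(String[] args) {
        System.out.println(respond(Optional.ofNullable(books.get("1491950358")))); // 200 Building Microservices
        System.out.println(respond(Optional.ofNullable(books.get("INVALID"))));    // 404
    }
}
```

This is the behavior the test in the next section relies on when it expects an HttpClientResponseException for an invalid ISBN.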

1.1.4. BookControllerTest

The launcher created a test for BookController to verify the interaction with the Analytics microservice in a file named gcp/src/test/java/com/example/publisher/BookControllerTest.java with the following contents:

package com.example.publisher;

import io.micronaut.configuration.kafka.annotation.KafkaListener;
import io.micronaut.configuration.kafka.annotation.Topic;
import io.micronaut.core.type.Argument;
import io.micronaut.http.HttpRequest;
import io.micronaut.http.client.HttpClient;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.http.client.exceptions.HttpClientResponseException;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;

import jakarta.inject.Inject;
import java.util.Collection;
import java.util.Optional;
import java.util.concurrent.ConcurrentLinkedDeque;

import static io.micronaut.configuration.kafka.annotation.OffsetReset.EARLIEST;
import static java.util.concurrent.TimeUnit.SECONDS;
import static org.awaitility.Awaitility.await;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.TestInstance.Lifecycle.PER_CLASS;

@MicronautTest
@TestInstance(PER_CLASS) // <1>
class BookControllerTest {

    private static final Collection<Book> received = new ConcurrentLinkedDeque<>();

    @Inject
    AnalyticsListener analyticsListener; // <2>

    @Inject
    @Client("/")
    HttpClient client; // <3>

    @Test
    void testMessageIsPublishedToKafkaWhenBookFound() {
        String isbn = "1491950358";

        Optional<Book> result = retrieveGet("/books/" + isbn); // <4>
        assertNotNull(result);
        assertTrue(result.isPresent());
        assertEquals(isbn, result.get().getIsbn());

        await().atMost(5, SECONDS).until(() -> !received.isEmpty()); // <5>

        assertEquals(1, received.size()); // <6>
        Book bookFromKafka = received.iterator().next();
        assertNotNull(bookFromKafka);
        assertEquals(isbn, bookFromKafka.getIsbn());
    }

    @Test
    void testMessageIsNotPublishedToKafkaWhenBookNotFound() throws Exception {
        assertThrows(HttpClientResponseException.class, () -> {
            retrieveGet("/books/INVALID");
        });

        Thread.sleep(5_000); // <7>
        assertEquals(0, received.size());
    }

    @AfterEach
    void cleanup() {
        received.clear();
    }

    @KafkaListener(offsetReset = EARLIEST)
    static class AnalyticsListener {

        @Topic("analytics")
        void updateAnalytics(Book book) {
            received.add(book);
        }
    }

    private Optional<Book> retrieveGet(String url) {
        return client.toBlocking().retrieve(HttpRequest.GET(url), Argument.of(Optional.class, Book.class));
    }
}

1 Create a single class instance for all test methods, so that shared state (such as the received collection consumed by the embedded Kafka listener) persists across tests.

2 Dependency injection for the AnalyticsListener class declared later in the file. This is a listener class that replicates the functionality of the class of the same name in the Analytics microservice.

3 Dependency injection for an HTTP client that the Micronaut framework will implement at compile time to make calls to BookController.

4 Use the HttpClient to retrieve the details of a Book, which will trigger sending a message.

5 Wait a few seconds for the message to arrive; it should happen very quickly, but the message will be sent on a separate thread.

6 Verify that the message was received and that it has the correct data.

7 Wait a few seconds to ensure no message is received.
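The await() call in the test polls until the listener thread has received a message. A plain-Java sketch of the same poll-with-timeout pattern (hypothetical names; the real test delegates this to Awaitility):

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedDeque;

public class AwaitDemo {

    /** Poll until the queue is non-empty or the timeout elapses. */
    static boolean awaitNonEmpty(Queue<?> queue, long timeoutMillis) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (queue.isEmpty()) {
            if (System.currentTimeMillis() > deadline) {
                return false; // timed out without a message
            }
            Thread.sleep(50); // poll interval
        }
        return true;
    }

    public static void main(String[] args) throws InterruptedException {
        Queue<String> received = new ConcurrentLinkedDeque<>();

        // Simulate a message arriving on another thread,
        // as the Kafka listener would in the real test.
        new Thread(() -> {
            try { Thread.sleep(200); } catch (InterruptedException ignored) { }
            received.add("message");
        }).start();

        System.out.println(awaitNonEmpty(received, 5_000)); // true
    }
}
```

Awaitility wraps exactly this kind of loop behind a fluent API, which is why the test can assert on received immediately after await() returns.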

1.1.5. AnalyticsClient

The launcher created a client interface to send messages to the streaming service in a file named lib/src/main/java/com/example/publisher/AnalyticsClient.java with the contents shown below. (Micronaut generates an implementation for the client interface at compilation time.)

package com.example.publisher;

import io.micronaut.configuration.kafka.annotation.KafkaClient;
import io.micronaut.configuration.kafka.annotation.Topic;
import reactor.core.publisher.Mono;

@KafkaClient
public interface AnalyticsClient {

    @Topic("analytics") // <1>
    Mono<Book> updateAnalytics(Book book); // <2>
}

1 Set the name of the topic.

2 Send the Book POJO. Micronaut will automatically convert it to JSON before sending it.

1.1.6. AnalyticsFilter

Sending a message to the streaming service is as simple as injecting AnalyticsClient and calling its updateAnalytics method. The goal is to send a message every time the details of a book are returned from the Books microservice or, in other words, every time there is a call to http://localhost:8080/books/{isbn}. To achieve this, the launcher created an HTTP Server Filter in a file named lib/src/main/java/com/example/publisher/AnalyticsFilter.java as follows:

package com.example.publisher;

import io.micronaut.http.HttpRequest;
import io.micronaut.http.MutableHttpResponse;
import io.micronaut.http.annotation.Filter;
import io.micronaut.http.filter.HttpServerFilter;
import io.micronaut.http.filter.ServerFilterChain;
import reactor.core.publisher.Flux;
import org.reactivestreams.Publisher;

@Filter("/books/?*") // <1>
class AnalyticsFilter implements HttpServerFilter { // <2>

    private final AnalyticsClient analyticsClient;

    AnalyticsFilter(AnalyticsClient analyticsClient) { // <3>
        this.analyticsClient = analyticsClient;
    }

    @Override
    public Publisher<MutableHttpResponse<?>> doFilter(HttpRequest<?> request,
                                                      ServerFilterChain chain) { // <4>
        return Flux
                .from(chain.proceed(request)) // <5>
                .flatMap(response -> {
                    Book book = response.getBody(Book.class).orElse(null); // <6>
                    if (book == null) {
                        return Flux.just(response);
                    }
                    return Flux.from(analyticsClient.updateAnalytics(book)).map(b -> response); // <7>
                });
    }
}

1 Annotate the class with @Filter and define the Ant-style matcher pattern to intercept all calls to the desired URIs.

2 The class must implement HttpServerFilter.

3 Dependency injection for AnalyticsClient.

4 Implement the doFilter method.

5 Call the request; this will invoke the controller action.

6 Get the response from the controller and return the body as an instance of the Book class.

7 If the book is retrieved, use the client to send a message.
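The flatMap chain above can be hard to parse on first read: the filter forwards the request, inspects the response body, and only completes after the analytics message has been sent. A plain-Java sketch of that "proceed, then publish, then return the original response" shape using CompletableFuture (hypothetical names; the real filter uses Reactor's Flux and the generated Kafka client):

```java
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CopyOnWriteArrayList;

public class FilterSketch {

    // Records every published ISBN, standing in for the Kafka topic.
    static final List<String> publishedIsbns = new CopyOnWriteArrayList<>();

    // Stand-in for analyticsClient.updateAnalytics(book): async send.
    static CompletableFuture<String> updateAnalytics(String isbn) {
        return CompletableFuture.supplyAsync(() -> {
            publishedIsbns.add(isbn);
            return isbn;
        });
    }

    // Stand-in for doFilter: proceed with the chain, then publish
    // only if the response carried a body, then return the response.
    static CompletableFuture<String> doFilter(CompletableFuture<Optional<String>> chain) {
        return chain.thenCompose(body ->
                body.map(isbn -> updateAnalytics(isbn).thenApply(ignored -> "200 " + isbn))
                    .orElse(CompletableFuture.completedFuture("404")));
    }

    public static void main(String[] args) {
        String found = doFilter(CompletableFuture.completedFuture(Optional.of("1491950358"))).join();
        String notFound = doFilter(CompletableFuture.completedFuture(Optional.<String>empty())).join();
        System.out.println(found);          // 200 1491950358
        System.out.println(notFound);       // 404
        System.out.println(publishedIsbns); // [1491950358]
    }
}
```

As in the real filter, a missing body (the 404 case) short-circuits past the publish step, which is why no message reaches the analytics topic for an invalid ISBN.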

1.1.7. Test the Microservice

Use the following command to test the Books microservice:

Gradle:

./gradlew :gcp:test

Maven:

./mvnw install -pl lib -am && ./mvnw package -pl gcp -DskipTests
./mvnw test

1.2. Create the Analytics Microservice #

  1. Open the GCN Launcher in advanced mode.

  2. Create a new project using the following selections.
    • Project Type: Application (Default)
    • Project Name: analytics
    • Base Package: com.example.consumer
    • Clouds: GCP
    • Language: Java (Default)
    • Build Tool: Gradle (Groovy) or Maven
    • Test Framework: JUnit (Default)
    • Java Version: 17 (Default)
    • Micronaut Version: (Default)
    • Cloud Services: Streaming
    • Features: Awaitility Framework, GraalVM Native Image and Micronaut Serialization Jackson Core
    • Sample Code: No
  3. Click Generate Project. The GCN Launcher creates an application with the default package com.example.consumer in a directory named analytics. The application ZIP file will be downloaded in your default downloads directory. Unzip it, open in your code editor, and proceed to the next steps.

Alternatively, use the GCN CLI as follows:

Gradle:

gcn create-app com.example.consumer.analytics \
    --clouds=gcp \
    --services=streaming \
    --features=awaitility,graalvm,serialization-jackson \
    --build=gradle \
    --lang=java \
    --example-code=false

Maven:

gcn create-app com.example.consumer.analytics \
    --clouds=gcp \
    --services=streaming \
    --features=awaitility,graalvm,serialization-jackson \
    --build=maven \
    --lang=java \
    --example-code=false

1.2.1. Domain Classes

The launcher created a Book domain class in a file named lib/src/main/java/com/example/consumer/Book.java, as shown below. (This Book POJO is the same as the one in the Books microservice. In a real application this would be in a shared library but to keep things simple, just duplicate it.)

package com.example.consumer;

import io.micronaut.core.annotation.Creator;
import io.micronaut.serde.annotation.Serdeable;

import java.util.Objects;

@Serdeable
public class Book {

    private final String isbn;
    private final String name;

    @Creator
    public Book(String isbn, String name) {
        this.isbn = isbn;
        this.name = name;
    }

    public String getIsbn() {
        return isbn;
    }

    public String getName() {
        return name;
    }

    @Override
    public String toString() {
        return "Book{" +
                "isbn='" + isbn + '\'' +
                ", name='" + name + '\'' +
                '}';
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Book other = (Book) o;
        return Objects.equals(isbn, other.isbn) &&
                Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(isbn, name);
    }
}

The launcher also created a BookAnalytics domain class in a file named lib/src/main/java/com/example/consumer/BookAnalytics.java, as follows:

package com.example.consumer;

import io.micronaut.core.annotation.Creator;
import io.micronaut.serde.annotation.Serdeable;

@Serdeable
public class BookAnalytics {

    private final String bookIsbn;
    private final long count;

    @Creator
    public BookAnalytics(String bookIsbn, long count) {
        this.bookIsbn = bookIsbn;
        this.count = count;
    }

    public String getBookIsbn() {
        return bookIsbn;
    }

    public long getCount() {
        return count;
    }
}

1.2.2. AnalyticsService

To keep this guide simple there is no database persistence: the Analytics microservice keeps book analytics in memory. The launcher created a class named AnalyticsService in lib/src/main/java/com/example/consumer/AnalyticsService.java with the following contents:

package com.example.consumer;

import jakarta.inject.Singleton;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

@Singleton
public class AnalyticsService {

    private final Map<Book, Long> bookAnalytics = new ConcurrentHashMap<>(); // <1>

    public void updateBookAnalytics(Book book) { // <2>
        bookAnalytics.compute(book, (k, v) -> {
            return v == null ? 1L : v + 1;
        });
    }

    public List<BookAnalytics> listAnalytics() { // <3>
        return bookAnalytics
                .entrySet()
                .stream()
                .map(e -> new BookAnalytics(e.getKey().getIsbn(), e.getValue()))
                .collect(Collectors.toList());
    }
}

1 Keep the book analytics in memory.

2 Initialize and update the analytics for the book passed as parameter.

3 Return all the analytics.
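The compute call updates each counter atomically, so concurrent listener invocations cannot lose increments. A stand-alone sketch of the same counting-and-listing pattern (using ISBN strings in place of the Book class to keep it self-contained):

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CountingSketch {

    static final Map<String, Long> counts = new ConcurrentHashMap<>();

    static void update(String isbn) {
        // Atomic per-key update: initialize to 1 on first sight, else increment.
        counts.compute(isbn, (k, v) -> v == null ? 1L : v + 1);
    }

    static List<String> list() {
        // Snapshot of all counters, like AnalyticsService.listAnalytics().
        return counts.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .sorted()
                .toList();
    }

    public static void main(String[] args) {
        update("1491950358");
        update("1491950358");
        update("1680502395");
        System.out.println(list()); // [1491950358=2, 1680502395=1]
    }
}
```

A plain HashMap with get-then-put would have a check-then-act race here; ConcurrentHashMap.compute performs the read-modify-write as one atomic operation per key.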

1.2.3. AnalyticsServiceTest

The launcher created a test for the AnalyticsService class, in a file named gcp/src/test/java/com/example/consumer/AnalyticsServiceTest.java, as follows:

package com.example.consumer;

import static org.junit.jupiter.api.Assertions.assertEquals;

import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import org.junit.jupiter.api.Test;

import jakarta.inject.Inject;
import java.util.List;

@MicronautTest
class AnalyticsServiceTest {

    @Inject
    AnalyticsService analyticsService;

    @Test
    void testUpdateBookAnalyticsAndGetAnalytics() {
        Book b1 = new Book("1491950358", "Building Microservices");
        Book b2 = new Book("1680502395", "Release It!");

        analyticsService.updateBookAnalytics(b1);
        analyticsService.updateBookAnalytics(b1);
        analyticsService.updateBookAnalytics(b1);
        analyticsService.updateBookAnalytics(b2);

        List<BookAnalytics> analytics = analyticsService.listAnalytics();
        assertEquals(2, analytics.size());

        assertEquals(3, findBookAnalytics(b1, analytics).getCount());
        assertEquals(1, findBookAnalytics(b2, analytics).getCount());
    }

    private BookAnalytics findBookAnalytics(Book b, List<BookAnalytics> analytics) {
        return analytics
                .stream()
                .filter(bookAnalytics -> bookAnalytics.getBookIsbn().equals(b.getIsbn()))
                .findFirst()
                .orElseThrow(() -> new RuntimeException("Book not found"));
    }
}

1.2.4. AnalyticsController

The launcher created a Controller to create an endpoint for the Analytics microservice in a file named lib/src/main/java/com/example/consumer/AnalyticsController.java, as follows:

package com.example.consumer;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

import java.util.List;

@Controller("/analytics") // <1>
class AnalyticsController {

    private final AnalyticsService analyticsService;

    AnalyticsController(AnalyticsService analyticsService) {
        this.analyticsService = analyticsService;
    }

    @Get // <2>
    List<BookAnalytics> listAnalytics() {
        return analyticsService.listAnalytics();
    }
}

1 The @Controller annotation defines the class as a controller mapped to the root URI /analytics.

2 The @Get annotation maps the listAnalytics method to an HTTP GET request on /analytics.

The application doesn’t expose the method updateBookAnalytics created in AnalyticsService. This method will be invoked when reading messages from Kafka.

1.2.5. AnalyticsListener

The launcher created a class to act as a consumer of the messages sent to the streaming service by the Books microservice. The Micronaut framework implements logic to invoke the consumer at compile time. The AnalyticsListener class is in a file named lib/src/main/java/com/example/consumer/AnalyticsListener.java, as follows:

package com.example.consumer;

import io.micronaut.configuration.kafka.annotation.KafkaListener;
import io.micronaut.configuration.kafka.annotation.Topic;
import io.micronaut.context.annotation.Requires;
import io.micronaut.context.env.Environment;

@Requires(notEnv = Environment.TEST) // <1>
@KafkaListener // <2>
class AnalyticsListener {

    private final AnalyticsService analyticsService; // <3>

    AnalyticsListener(AnalyticsService analyticsService) { // <3>
        this.analyticsService = analyticsService;
    }

    @Topic("analytics") // <4>
    void updateAnalytics(Book book) {
        analyticsService.updateBookAnalytics(book); // <5>
    }
}

1 Do not load this bean in the test environment: you can run tests without access to a streaming service.

2 Annotate the class with @KafkaListener to indicate that this bean consumes messages from Kafka.

3 Constructor injection for AnalyticsService.

4 Annotate the method with @Topic and specify the topic name.

5 Call AnalyticsService to update the analytics for the book.

1.2.6. Test the Microservice

Use the following command to test the Analytics microservice:

Gradle:

./gradlew :gcp:test

Maven:

./mvnw install -pl lib -am && ./mvnw package -pl gcp -DskipTests
./mvnw test

1.2.7. Change the Port of the Analytics Microservice

The Books and Analytics microservices are both run on the same GCP Compute Engine instance, so they must run on different ports. Change the port that Analytics runs on by editing the gcp/src/main/resources/application-gcp.properties file so that it has the following contents:

micronaut.application.name=gcp
micronaut.server.port=8081

2. Run the Microservices #

To run the microservices locally, comment out the Kafka section of each service’s application-gcp.properties file by inserting “#” at the beginning of each line, as shown below:

# kafka.bootstrap.servers=${KAFKA_BOOTSTRAP_SERVERS}
# kafka.max.partition.fetch.bytes=1048576
# kafka.max.request.size=1048576
# kafka.retries=3

2.1. Start the Books Microservice #

To run the Books microservice, use the following command, which starts the application on port 8080.

Gradle:

./gradlew :gcp:run

Maven:

./mvnw install -pl lib -am && ./mvnw -pl gcp mn:run

2.2. Start the Analytics Microservice #

To run the Analytics microservice, use the following command, which starts the application on port 8081.

Gradle:

./gradlew :gcp:run

Maven:

./mvnw install -pl lib -am && ./mvnw -pl gcp mn:run

2.3. Test the Microservices #

Use curl to test the microservices, as follows.

  1. Retrieve the list of books:

     curl http://localhost:8080/books
    
     [{"isbn":"1491950358","name":"Building Microservices"},{"isbn":"1680502395","name":"Release It!"},{"isbn":"0321601912","name":"Continuous Delivery"}]
    
  2. Retrieve the details of a specified book:

     curl http://localhost:8080/books/1491950358
    
     {"isbn":"1491950358","name":"Building Microservices"}
    
  3. Retrieve the analytics:

     curl http://localhost:8081/analytics
    
     [{"bookIsbn":"1491950358","count":1}]
    

Update the curl command to the Books microservice to retrieve other books and repeat the invocations, then re-run the curl command to the Analytics microservice to see that the counts increase.

3. Generate a Native Executable Using GraalVM #

GCN supports compiling a Java application ahead-of-time into a native executable using GraalVM Native Image. You can use the Gradle or Maven plugin for GraalVM Native Image building. Packaging the application as a native executable significantly reduces its startup time and memory footprint.


Before running this native executable, you must start and connect to a Kafka instance. You can use a Kafka container.


  1. Install a Kafka container:

     docker pull spotify/kafka
    
  2. Start the Kafka container (use CTRL-C to stop it):

       docker run -p 2181:2181 \
         -p 9092:9092 \
         --name kafka-docker-container \
         --env ADVERTISED_HOST=127.0.0.1 \
         --env ADVERTISED_PORT=9092 \
         spotify/kafka
    
  3. To run the microservices locally, add the bootstrap.servers configuration to each microservice’s application-gcp.properties file. Remove the bootstrap.servers configuration when you finish testing the native executables.

     kafka.enabled=true
     kafka.bootstrap.servers=localhost:9092
    

    Alternatively, you can install and run a local Kafka instance.

  4. To generate a native executable, run the following command for each microservice:

    Gradle:

    ./gradlew :gcp:nativeCompile

    The native executable is created in the gcp/build/native/nativeCompile/ directory and can be run with the following command:

    gcp/build/native/nativeCompile/gcp

    Maven:

    ./mvnw install -pl lib -am && ./mvnw package -pl gcp -Dpackaging=native-image

    The native executable is created in the gcp/target/ directory and can be run with the following command:

    gcp/target/gcp

Start the native executables for the two microservices and run the same curl requests as before to check that everything works as expected.

The microservices behave just as they do when run from JAR files, but with reduced startup time and a smaller memory footprint.

4. Set up GCP Resources #

Start with the Kafka cluster, and then configure the GCP Compute instance.

4.1. Create a GCP Project #

Create a new GCP project named “gcn-guides” (follow the instructions contained in Creating and managing projects).


The Cloud SDK includes the gcloud command-line tool.


  1. Initialize the Cloud CLI:

    gcloud init
    
  2. Log in to the Google Cloud Platform:

    gcloud auth login
    
  3. Change your project:

    gcloud config set project gcn-guides
    

4.2. Configure GCP Kafka VM #

  1. In the Google Cloud console, open the navigation menu and click Marketplace.

  2. In the search bar that appears, enter “Kafka”.

  3. Select Apache Kafka Server on CentOS 8 Server from the drop-down list of results.

  4. Click LAUNCH.

  5. Click ENABLE to enable required Google APIs, if necessary.

  6. Enter a deployment name: “streaming-kafka”.

  7. Select the us-east1-b zone.

  8. Uncheck the check-boxes for Allow TCP port 9092 traffic from the Internet and Allow TCP port 6667 traffic from the Internet.

  9. Click DEPLOY and wait for the deployment to finish.

  10. Copy the internal IP address of the Kafka service VM. On your local computer run the following command:

    gcloud compute instances list
    

    Note: Look for the INTERNAL_IP column (for example, 10.142.0.10) and save it for later use.

4.3. Launch a GCP Compute Instance #

Create a compute instance to host the two microservices:

gcloud compute instances create streaming-instance \
--image=centos-stream-9-v20230509 \
--image-project=centos-cloud \
--machine-type=n2-standard-4 \
--tags streaming-instance

Note: You will deploy the Books and Analytics microservices to this compute instance.

4.4. Start Kafka and Create a Kafka Topic #

Note: Open three terminals.

  1. In the first terminal connect to the Kafka VM using SSH:

     gcloud compute ssh streaming-kafka-vm
    
  2. Run the following command to start the ZooKeeper service:

     cd /opt/kafka/
     sudo bin/zookeeper-server-start.sh config/zookeeper.properties
    
  3. In the second terminal, run the following command to start the Kafka broker service:

     cd /opt/kafka/
     sudo bin/kafka-server-start.sh config/server.properties
    
  4. In the third terminal, run the following command to create the analytics Kafka topic:

     cd /opt/kafka/
     sudo bin/kafka-topics.sh --create --topic analytics --bootstrap-server localhost:9092
    

5. Configure Microservices #

  1. Edit the file named gcp/src/main/resources/application-gcp.properties for the Books microservice so that it matches the following contents. (The Micronaut framework applies this configuration file only for the gcp environment.)

    micronaut.application.name=gcp
    micronaut.server.port=8080
    netty.default.allocator.max-order=3
    kafka.enabled=true
    kafka.bootstrap.servers=${KAFKA_BOOTSTRAP_SERVERS}
    kafka.max.partition.fetch.bytes=1048576
    kafka.max.request.size=1048576
    kafka.retries=3
    
  2. Edit the file named gcp/src/main/resources/application-gcp.properties for the Analytics microservice so that it matches the following contents. (The Micronaut framework applies this configuration file only for the gcp environment.)

    micronaut.application.name=gcp
    micronaut.server.port=8081
    netty.default.allocator.max-order=3
    kafka.enabled=true
    kafka.bootstrap.servers=${KAFKA_BOOTSTRAP_SERVERS}
    kafka.max.partition.fetch.bytes=1048576
    kafka.max.request.size=1048576
    kafka.retries=3
    
  3. After you deploy your applications to the GCP Compute instance, make sure you export the following environment variable:

     export KAFKA_BOOTSTRAP_SERVERS=<replace_with_the_saved_internal_kafka_vm_address>:9092
    

6. Deploy the Books service to GCP Cloud #

  1. Create a JAR file containing all the microservice’s dependencies, as follows:

    Gradle:

    ./gradlew :gcp:shadowJar

    Maven:

    ./mvnw install -pl lib -am && ./mvnw package -pl gcp -DskipTests
  2. Copy the JAR file to your GCP Compute instance, as follows:

    Gradle:

    gcloud compute scp gcp/build/libs/gcp-1.0-SNAPSHOT.jar streaming-instance:~/books_application.jar

    Maven:

    gcloud compute scp gcp/target/gcp-1.0-SNAPSHOT.jar streaming-instance:~/books_application.jar
  3. Connect to the GCP Compute instance:

     gcloud compute ssh streaming-instance
    
  4. Once connected, install GraalVM JDK with Native Image for Java 17. See Install GraalVM JDK with Native Image.

  5. Start the Books microservice, as follows:

     java -jar books_application.jar
    
  6. Enable port 8080 via firewall rules:

     gcloud compute firewall-rules create firewall-rule-port-8080 --allow tcp:8080 --target-tags=streaming-instance
    
  7. Verify that the application is running by invoking the controller at http://[GCP_PUBLIC_IP]:8080/books using curl:

     curl -i http://[GCP_PUBLIC_IP]:8080/books
    
  8. Invoke the controller endpoint to trigger a message to be published to the Streaming service. You can test other ISBNs as well.

     curl -i http://[GCP_PUBLIC_IP]:8080/books/1491950358
    

7. Deploy the Analytics Microservice to GCP Cloud #

  1. Create a JAR file containing all the microservice’s dependencies, as follows:

    Gradle:

    ./gradlew :gcp:shadowJar

    Maven:

    ./mvnw install -pl lib -am && ./mvnw package -pl gcp -DskipTests
  2. Copy the JAR file to your GCP Compute instance, as follows:

    Gradle:

    gcloud compute scp gcp/build/libs/gcp-1.0-SNAPSHOT.jar streaming-instance:~/analytics_application.jar

    Maven:

    gcloud compute scp gcp/target/gcp-1.0-SNAPSHOT.jar streaming-instance:~/analytics_application.jar
  3. Connect to the GCP Compute instance:

     gcloud compute ssh streaming-instance
    
  4. Once connected, install GraalVM JDK with Native Image for Java 17. See Install GraalVM JDK with Native Image.

  5. Start the Analytics microservice, as follows:

     java -jar analytics_application.jar
    
  6. Enable port 8081 via firewall rules:

    gcloud compute firewall-rules create firewall-rule-port-8081 --allow tcp:8081 --target-tags=streaming-instance
    
  7. Verify that the application is running by invoking the controller at http://[GCP_PUBLIC_IP]:8081/analytics using curl:

     curl -i http://[GCP_PUBLIC_IP]:8081/analytics
    

8. Deploy Native Executables #

A native executable built locally runs only on the operating system and architecture it was built on, so it is not suitable for deployment to a GCP Compute instance. Instead, build the native executables on the compute instance itself, as described in the next section.

8.1. Run Native Executables on the GCP Instance #

  1. Copy the compressed sample file to the GCP instance:

    gcloud compute scp /path/to/downloaded/sample/zip streaming-instance:~/streaming_sample.zip
    
  2. Once copied, connect to the GCP instance (if it has disconnected by now):

     gcloud compute ssh streaming-instance
    
  3. Unpack the compressed file:

    sudo yum install unzip
    unzip streaming_sample.zip
    
  4. Install the packages that GraalVM needs to work properly on GCP:

    sudo yum -y install gcc
    sudo yum -y install zlib*
    
  5. Apply the settings from section 5 (Configure Microservices), including exporting the KAFKA_BOOTSTRAP_SERVERS environment variable.

  6. To generate a native executable, run the following command for each microservice:

    Gradle:

    ./gradlew :gcp:nativeCompile

    The native executable is created in the gcp/build/native/nativeCompile/ directory and can be run with the following command:

    gcp/build/native/nativeCompile/gcp

    Maven:

    ./mvnw install -pl lib -am && ./mvnw package -pl gcp -Dpackaging=native-image

    The native executable is created in the gcp/target/ directory and can be run with the following command:

    gcp/target/gcp

    Start the native executables for the two microservices and run the same curl requests as before to check that everything works as expected.

The microservices behave just as they do when run from JAR files, but with reduced startup time and a smaller memory footprint.

9. Clean up #

When you have completed the guide, you can clean up the resources you created on Google Cloud Platform so you will not be billed for them in the future.

9.1. Delete the Project #

The easiest way to eliminate billing is to delete the project you created.

Deleting a project has the following consequences:

  • If you used an existing project, you will also delete any other work you have done in the project.

  • You cannot reuse the project ID of a deleted project. If you created a custom project ID that you plan to use in the future, you should delete the resources inside the project instead. This ensures that URLs that use the project ID, such as an appspot.com URL, remain available.

  • If you are exploring multiple guides, reusing projects instead of deleting them prevents you from exceeding project quota limits.

9.1.1. Via the CLI

To delete the project using the Google Cloud CLI, run the following command:

gcloud projects delete gcn-guides

9.1.2. Via the Cloud Platform Console

  1. In the Cloud Platform Console, go to the Projects page.

  2. In the project list, select the check box next to the project you want to delete, then click Delete project.

  3. In the dialog box, enter the project ID, and then click Shut down to delete the project.

Summary #

In this guide, you created a streaming application with the Micronaut framework, Kafka, and a Google Cloud Platform Apache Kafka VM. The two microservices, acting as producer and consumer, communicated asynchronously and in a decoupled way. You then packaged them as native executables with GraalVM Native Image for faster startup and a reduced memory footprint, and deployed them to Google Cloud Platform.