Go to the CloudFormation console, select the stack you just created, and open the Outputs tab. Scroll down and copy the value for the key KafkaClientEC2InstanceSsh.
SSH into the KafkaClientEC2Instance instance. Open a terminal on Mac (or a PuTTY session on Windows) and paste the command you copied from the Outputs tab above. (Open a new terminal; leave the terminal where the producer is producing records running.)
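The copied command will look roughly like the sketch below; the key file name and instance hostname here are placeholders, not the actual values from your stack's output.

ssh -i ~/your-key-pair.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com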
Use the Apache Kafka console consumer to consume records from the three clickstream analytics topics you created earlier.
/home/ec2-user/kafka/bin/kafka-console-consumer.sh --bootstrap-server <Bootstrap servers value that you copied in the Setup section> --topic Departments_Agg --from-beginning
Note that this reads all messages in the topic from the beginning. Press Ctrl-C to exit, then consume from the next two topics:
/home/ec2-user/kafka/bin/kafka-console-consumer.sh --bootstrap-server <Bootstrap servers value that you copied in the Setup section> --topic ClickEvents_UserId_Agg_Result --from-beginning

/home/ec2-user/kafka/bin/kafka-console-consumer.sh --bootstrap-server <Bootstrap servers value that you copied in the Setup section> --topic User_Sessions_Aggregates_With_Order_Checkout --from-beginning
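Since these aggregate topics contain keyed records, it can also help to print each record's key alongside its value. As a sketch, the standard print.key and key.separator properties of kafka-console-consumer.sh can be added to any of the commands above (Departments_Agg is shown here as an example):

/home/ec2-user/kafka/bin/kafka-console-consumer.sh --bootstrap-server <Bootstrap servers value that you copied in the Setup section> --topic Departments_Agg --from-beginning --property print.key=true --property key.separator=" : "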