How to resolve the dependency org.apache.kafka.clients.producer.Producer for the Java producer for Apache Kafka in a Maven project



This is the controller of my Java web application:

    package com.proj.controller;

import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.bind.annotation.RestController;

import com.proj.dao.Kafka_ObjectDao;
import com.proj.daoImpl.Kafka_ObjectDaoImpl;
import com.proj.model.Kafka_Object;

@RestController
public class MainController {

Kafka_ObjectDao kafka_ObjectDao = new Kafka_ObjectDaoImpl();

@RequestMapping("/")
public String welcome() {

    return "<h1>Welcome to Rest Services for Kafka</h1><br><br>";
}

    @RequestMapping(value = "/modify/student", method = RequestMethod.POST)
    public @ResponseBody int modifyStudent(@RequestBody Kafka_Object kafka_Object) {
        System.out.println("In controller");
        return kafka_ObjectDao.modifyStudent(kafka_Object);
    }
}

The DAO interface of this web application is in the com.proj.dao package:

    package com.proj.dao;

import com.proj.model.Kafka_Object;

public interface Kafka_ObjectDao {

    int modifyStudent(Kafka_Object kafka_Object);

}

The implementation of the DAO interface is

    package com.proj.daoImpl;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.google.gson.Gson;
import com.proj.dao.Kafka_ObjectDao;
import com.proj.model.Kafka_Object;

public class Kafka_ObjectDaoImpl implements Kafka_ObjectDao {
Gson gson = new Gson();

public Kafka_ObjectDaoImpl() {
}

public int modifyStudent(Kafka_Object pojo) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("acks", "all");
    props.put("retries", 0);
    props.put("batch.size", 16384);
    props.put("linger.ms", 1);
    props.put("buffer.memory", 33554432);
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    try {
        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        ProducerRecord<String, String> PR = new ProducerRecord<String, String>("test", gson.toJson(pojo));
        producer.send(PR);
        System.out.println("Sent:" + PR.toString());
        producer.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return 1;
}

}
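As an aside, the producer configuration built inline in modifyStudent above can be factored into a small reusable helper. A minimal sketch using only java.util.Properties, with the same key names and values as the code above (the KafkaProps class name and baseProps method are hypothetical, not part of the original project):

```java
import java.util.Properties;

public class KafkaProps {

    // Builds the same producer configuration as Kafka_ObjectDaoImpl,
    // parameterised on the broker list so it can be reused per environment.
    public static Properties baseProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("acks", "all");          // wait for the full commit of the record
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        Properties props = baseProps("localhost:9092");
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```

The DAO implementation could then construct its producer with new KafkaProducer<String, String>(KafkaProps.baseProps("localhost:9092")), keeping the connection settings in one place.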

The application configuration for this web application is

    package com.proj.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

import com.proj.dao.Kafka_ObjectDao;
import com.proj.daoImpl.Kafka_ObjectDaoImpl;

@Configuration
@ComponentScan(basePackages = "com.proj")
public class ApplicationConfig {

@Bean
public Kafka_ObjectDao getKafka_ObjectDao() {
    return new Kafka_ObjectDaoImpl();
}
}

The pom.xml of my maven project is

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.rest</groupId>
<artifactId>QIC_FIRST</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
<name>QIC_FIRST Maven Webapp</name>
<url>http://maven.apache.org</url>
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.9.0.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.10</artifactId>
        <version>0.10.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.6.2</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-rest-webmvc</artifactId>
        <version>2.4.2.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
        <version>3.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-jdbc</artifactId>
        <version>4.3.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.6.2</version>
    </dependency>
</dependencies>
<build>
    <finalName>QIC_FIRST</finalName>
</build>
</project>

The error I am getting is

  Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.Producer

The full stack trace is

    SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
  org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'mainController' defined in file [C:\Users\teja.k\workspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp2\wtpwebapps\QIC_FIRST\WEB-INF\classes\com\proj\controller\MainController.class]: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.proj.controller.MainController]: Constructor threw exception; nested exception is java.lang.NoClassDefFoundError: org/apache/kafka/clients/producer/Producer
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1105)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1050)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:510)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:775)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:762)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:434)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:106)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5097)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5615)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1571)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1561)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
  Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.proj.controller.MainController]: Constructor threw exception; nested exception is java.lang.NoClassDefFoundError: org/apache/kafka/clients/producer/Producer
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:159)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:89)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1098)
... 22 more
Caused by: java.lang.NoClassDefFoundError: org/apache/kafka/clients/producer/Producer
at com.proj.controller.MainController.<init>(MainController.java:16)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:147)
... 24 more
  Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.Producer
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1891)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1734)
... 30 more

Thanks in advance.

Trever Shick

Your kafka-clients dependency has provided scope; it should be compile (the default) or runtime. The provided scope does not package the dependency into the deployed artifact, so at runtime the jar must be put on the classpath by other means. The easiest fix is to remove <scope>provided</scope> from your kafka-clients dependency.
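With the scope line removed, the kafka-clients block in the pom.xml from the question would look like this (version left exactly as in the original pom):

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.9.0.1</version>
</dependency>
```

With the default compile scope, Maven packages the jar into WEB-INF/lib of the .war, so Tomcat's webapp classloader can find org.apache.kafka.clients.producer.Producer at runtime.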
