Apache Kafka

See our web site (http://kafka.apache.org) for details on the project.

You need to have Gradle and Java installed.

Kafka requires Gradle 2.0 or higher.

Java 7 should be used for building in order to support both Java 7 and Java 8 at runtime.
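
For example, a quick way to confirm which JVM the build will use (the JAVA_HOME path below is an assumption; point it at your local JDK 7 install):

java -version                                  # should report a 1.7.x JVM for release builds
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk   # hypothetical path to a JDK 7 installation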

First bootstrap and download the wrapper

cd kafka_source_dir
gradle

Now everything else will work.

Building a jar and running it

./gradlew jar

Follow instructions in http://kafka.apache.org/documentation.html#quickstart

Building source jar

./gradlew srcJar

Building aggregated javadoc

./gradlew aggregatedJavadoc

Building javadoc and scaladoc

./gradlew javadoc
./gradlew javadocJar # builds a javadoc jar for each module
./gradlew scaladoc
./gradlew scaladocJar # builds a scaladoc jar for each module
./gradlew docsJar # builds both (if applicable) javadoc and scaladoc jars for each module

Running unit tests

./gradlew test

Forcing re-running unit tests without code changes

./gradlew cleanTest test

Running a particular unit test

./gradlew -Dtest.single=RequestResponseSerializationTest core:test

Running a particular test method within a unit test

./gradlew core:test --tests kafka.api.ProducerFailureHandlingTest.testCannotSendToInternalTopic
./gradlew clients:test --tests org.apache.kafka.clients.MetadataTest.testMetadataUpdateWaitTime

Running a particular unit test with log4j output

Change the log4j setting in either clients/src/test/resources/log4j.properties or core/src/test/resources/log4j.properties
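
For example, a minimal tweak to core/src/test/resources/log4j.properties to get DEBUG output on the console might look like the following sketch (the appender names here are illustrative; keep whatever the file already defines):

log4j.rootLogger=DEBUG, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n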

./gradlew -i -Dtest.single=RequestResponseSerializationTest core:test

Generating test coverage reports

./gradlew reportCoverage

Building a binary release gzipped tarball

./gradlew clean
./gradlew releaseTarGz

The above command will fail if you haven't set up the signing key. To bypass signing the artifact, you can run:

./gradlew releaseTarGz -x signArchives

The release file can be found inside ./core/build/distributions/.

Cleaning the build

./gradlew clean

Running a task on a particular version of Scala (either 2.10.6 or 2.11.8)

Note that if building the jars with a version other than 2.10.6, you need to set the SCALA_VERSION variable or change it in bin/kafka-run-class.sh to run the quick start.
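
For example, assuming you built the jars with -PscalaVersion=2.11, the quick start scripts can be pointed at them like this (the command shown is illustrative):

SCALA_VERSION=2.11.8 ./bin/kafka-server-start.sh config/server.properties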

You can pass either the major version (e.g. 2.11) or the full version (e.g. 2.11.8):

./gradlew -PscalaVersion=2.11 jar
./gradlew -PscalaVersion=2.11 test
./gradlew -PscalaVersion=2.11 releaseTarGz

Running a task for a specific project

This is for core, examples and clients

./gradlew core:jar
./gradlew core:test

Listing all gradle tasks

./gradlew tasks

Building IDE project

Note that this is not strictly necessary (IntelliJ IDEA has good built-in support for Gradle projects, for example).

./gradlew eclipse
./gradlew idea

Building the jar for all Scala versions and for all projects

./gradlew jarAll

Running unit tests for all Scala versions and for all projects

./gradlew testAll

Building a binary release gzipped tarball for all Scala versions

./gradlew releaseTarGzAll

Publishing the jar for all versions of Scala and for all projects to Maven

./gradlew uploadArchivesAll

Please note that for this to work you should create/update ${GRADLE_USER_HOME}/gradle.properties (typically, ~/.gradle/gradle.properties) and assign the following variables:

mavenUrl=
mavenUsername=
mavenPassword=
signing.keyId=
signing.password=
signing.secretKeyRingFile=
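
A hypothetical filled-in example (every value below is a placeholder, not a real endpoint or credential):

mavenUrl=file://localhost/tmp/maven
mavenUsername=your-username
mavenPassword=your-password
signing.keyId=ABCD1234
signing.password=your-gpg-passphrase
signing.secretKeyRingFile=/home/you/.gnupg/secring.gpg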

Installing the jars to the local Maven repository

./gradlew installAll

Building the test jar

./gradlew testJar

Determining how transitive dependencies are added

./gradlew core:dependencies --configuration runtime

Determining if any dependencies could be updated

./gradlew dependencyUpdates

Running checkstyle on the Java code

./gradlew checkstyleMain checkstyleTest

Common build options

The following options should be set with a -D switch, for example ./gradlew -Dorg.gradle.project.maxParallelForks=1 test.

  • org.gradle.project.mavenUrl: sets the URL of the maven deployment repository (file://path/to/repo can be used to point to a local repository).
  • org.gradle.project.maxParallelForks: limits the maximum number of processes for each task. This is most commonly useful for automated builds where the full resources of the host running the build and tests may not be dedicated to Kafka's build.
  • org.gradle.project.showStandardStreams: shows standard out and standard error of the test JVM(s) on the console.
  • org.gradle.project.skipSigning: skips signing of artifacts.
  • org.gradle.project.testLoggingEvents: unit test events to be logged, separated by comma. For example ./gradlew -Dorg.gradle.project.testLoggingEvents=started,passed,skipped,failed test

Running in Vagrant

See vagrant/README.md.

Contribution

Apache Kafka is interested in building the community; we would welcome any thoughts or patches. You can reach us on the Apache mailing lists.

To contribute, follow the instructions here: http://kafka.apache.org/contributing.html