Learning Java ConcurrentHashMap: HashMap Vs. ConcurrentHashMap Vs. SynchronizedMap – How a HashMap Can Be Synchronized in Java

HashMap is a very powerful data structure in Java. We use it every day and in almost all applications. I have written quite a few examples before, such as how to implement a thread-safe cache and how to convert a HashMap to an ArrayList.

We used HashMap in both of the above examples, but those are pretty simple use cases. HashMap itself is a non-synchronized collection class.

Do you have any of the below questions?

  • What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map)?
  • What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map) in terms of performance?
  • ConcurrentHashMap vs Collections.synchronizedMap()
  • Popular HashMap and ConcurrentHashMap interview questions

In this tutorial we will go over all of the above questions and look at why and how we can synchronize a HashMap.

Why?

A Map is an associative container that stores elements as a combination of a unique key and a mapped value. If you have a highly concurrent application in which you may want to modify or read key-value pairs from different threads, then ConcurrentHashMap is the ideal choice. The best example is a producer-consumer setup that handles concurrent reads and writes, as sketched below.
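
To make the producer-consumer idea concrete, here is a minimal, hypothetical sketch (the class and field names are invented for illustration, not part of the original tutorial): one thread publishes values into a shared ConcurrentHashMap while another thread reads them concurrently, with no external locking around the single-key get/put calls.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class PriceBoard {

    // Shared map: safe for concurrent reads and writes without external locking
    private static final Map<String, Double> prices = new ConcurrentHashMap<String, Double>();

    public static void main(String[] args) throws InterruptedException {
        Thread producer = new Thread(new Runnable() {
            public void run() {
                for (int i = 0; i < 1000; i++) {
                    prices.put("ticker-" + (i % 10), Math.random() * 100);
                }
            }
        });

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                for (int i = 0; i < 1000; i++) {
                    // Lock-free read; returns null until the producer has written this key
                    Double latest = prices.get("ticker-" + (i % 10));
                    if (latest != null && i % 250 == 0) {
                        System.out.println("ticker-" + (i % 10) + " -> " + latest);
                    }
                }
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}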

So what does a thread-safe Map mean? If multiple threads access a hash map concurrently, and at least one of the threads modifies the map structurally, it must be synchronized externally to avoid an inconsistent view of the contents.
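
As a quick illustration of why external synchronization matters, here is a small, hypothetical demo (class and variable names invented for this example): two threads write distinct keys into a plain HashMap with no synchronization. The final size is frequently less than the expected 200,000 because updates get lost, and on some runs the map can even end up in a corrupted state.

import java.util.HashMap;
import java.util.Map;

public class UnsafeHashMapDemo {

    public static void main(String[] args) throws InterruptedException {
        final Map<String, Integer> unsafeMap = new HashMap<String, Integer>();

        Runnable writerA = new Runnable() {
            public void run() {
                for (int i = 0; i < 100000; i++) {
                    unsafeMap.put("A" + i, i);
                }
            }
        };
        Runnable writerB = new Runnable() {
            public void run() {
                for (int i = 0; i < 100000; i++) {
                    unsafeMap.put("B" + i, i);
                }
            }
        };

        Thread t1 = new Thread(writerA);
        Thread t2 = new Thread(writerB);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Expected 200000 entries; an unsynchronized HashMap often reports fewer
        System.out.println("Size after concurrent writes: " + unsafeMap.size());
    }
}

Swapping new HashMap<String, Integer>() for new ConcurrentHashMap<String, Integer>() (or wrapping it with Collections.synchronizedMap) reliably produces the expected 200,000 entries.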

How?

There are two ways we can synchronize a HashMap:

  1. Java Collections synchronizedMap() method
  2. Use ConcurrentHashMap

Hashtable Vs. synchronizedMap Vs. ConcurrentHashMap

// Hashtable (legacy, fully synchronized)
Map<String, String> normalMap = new Hashtable<String, String>();

// synchronizedMap wrapper around a HashMap
Map<String, String> synchronizedHashMap = Collections.synchronizedMap(new HashMap<String, String>());

// ConcurrentHashMap
Map<String, String> concurrentHashMap = new ConcurrentHashMap<String, String>();

ConcurrentHashMap

  • You should use ConcurrentHashMap when you need very high concurrency in your project.
  • It is thread-safe without synchronizing the whole map.
  • Reads can happen very fast, while writes are done with a lock.
  • There is no locking at the object (whole-map) level.
  • Locking is at a much finer granularity, at the hash bucket level.
  • ConcurrentHashMap doesn’t throw a ConcurrentModificationException if one thread tries to modify it while another is iterating over it (see the sketch after this list).
  • ConcurrentHashMap uses a multitude of locks.
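
Here is a short, hypothetical single-threaded demo of the iteration point above (class name invented for illustration): ConcurrentHashMap's weakly consistent iterator tolerates modification during iteration, whereas HashMap's fail-fast iterator throws a ConcurrentModificationException.

import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class IterationDemo {

    public static void main(String[] args) {
        Map<String, Integer> plainMap = new HashMap<String, Integer>();
        Map<String, Integer> concurrentMap = new ConcurrentHashMap<String, Integer>();
        for (int i = 0; i < 5; i++) {
            plainMap.put("key" + i, i);
            concurrentMap.put("key" + i, i);
        }

        // Weakly consistent iterator: no exception; the new entry may or may not be seen
        for (String key : concurrentMap.keySet()) {
            concurrentMap.put("extra", 99);
        }
        System.out.println("ConcurrentHashMap iterated fine, size = " + concurrentMap.size());

        // Fail-fast iterator: adding a new key during iteration triggers the exception
        try {
            for (String key : plainMap.keySet()) {
                plainMap.put("extra", 99);
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("HashMap iterator failed fast: " + e);
        }
    }
}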

SynchronizedHashMap

  • Synchronization is at the object level.
  • Every read/write operation needs to acquire the lock.
  • Locking the entire collection is a performance overhead.
  • This essentially gives access to only one thread for the entire map and blocks all other threads.
  • It may cause contention.
  • SynchronizedHashMap returns an Iterator, which fails fast on concurrent modification, so iteration must be manually synchronized on the map (see the sketch after this list).
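
That fail-fast behaviour is why the Collections.synchronizedMap Javadoc requires synchronizing on the returned map manually while iterating over it; below is a minimal, hypothetical sketch (class name invented for illustration).

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class SynchronizedMapIterationDemo {

    public static void main(String[] args) {
        Map<String, Integer> syncMap = Collections.synchronizedMap(new HashMap<String, Integer>());
        syncMap.put("one", 1);
        syncMap.put("two", 2);

        // Individual operations are already synchronized on the wrapper object
        syncMap.get("one");

        // Iteration is NOT covered by the wrapper's internal locking, so it must be
        // wrapped in a synchronized block on the map; otherwise a concurrent writer
        // could cause a ConcurrentModificationException mid-iteration
        synchronized (syncMap) {
            for (Map.Entry<String, Integer> entry : syncMap.entrySet()) {
                System.out.println(entry.getKey() + " = " + entry.getValue());
            }
        }
    }
}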

Now let's take a look at the code:

  1. Create class CrunchifyConcurrentHashMapVsSynchronizedMap.java
  2. Create an object for each of Hashtable, synchronizedMap, and ConcurrentHashMap
  3. Add and retrieve 500k entries from the Map in each thread
  4. Measure the start and end time and display the time in milliseconds
  5. We will use ExecutorService to run 5 threads in parallel

CrunchifyConcurrentHashMapVsSynchronizedMap.java

Java

package crunchify.com.tutorials;

import java.util.Collections;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

/**
 * @author Crunchify.com
 */
public class CrunchifyConcurrentHashMapVsSynchronizedMap {

    public final static int THREAD_POOL_SIZE = 5;

    public static Map<String, Integer> crunchifyHashTableObject = null;
    public static Map<String, Integer> crunchifySynchronizedMapObject = null;
    public static Map<String, Integer> crunchifyConcurrentHashMapObject = null;

    public static void main(String[] args) throws InterruptedException {

        // Test with Hashtable Object
        crunchifyHashTableObject = new Hashtable<String, Integer>();
        crunchifyPerformTest(crunchifyHashTableObject);

        // Test with synchronizedMap Object
        crunchifySynchronizedMapObject = Collections.synchronizedMap(new HashMap<String, Integer>());
        crunchifyPerformTest(crunchifySynchronizedMapObject);

        // Test with ConcurrentHashMap Object
        crunchifyConcurrentHashMapObject = new ConcurrentHashMap<String, Integer>();
        crunchifyPerformTest(crunchifyConcurrentHashMapObject);
    }

    public static void crunchifyPerformTest(final Map<String, Integer> crunchifyThreads) throws InterruptedException {

        System.out.println("Test started for: " + crunchifyThreads.getClass());
        long averageTime = 0;
        for (int i = 0; i < 5; i++) {

            long startTime = System.nanoTime();
            ExecutorService crunchifyExServer = Executors.newFixedThreadPool(THREAD_POOL_SIZE);

            for (int j = 0; j < THREAD_POOL_SIZE; j++) {
                crunchifyExServer.execute(new Runnable() {
                    @SuppressWarnings("unused")
                    @Override
                    public void run() {

                        // Each thread performs 500,000 get/put pairs against the shared map
                        for (int i = 0; i < 500000; i++) {
                            Integer crunchifyRandomNumber = (int) Math.ceil(Math.random() * 550000);

                            // Retrieve value. We are not using it anywhere
                            Integer crunchifyValue = crunchifyThreads.get(String.valueOf(crunchifyRandomNumber));

                            // Put value
                            crunchifyThreads.put(String.valueOf(crunchifyRandomNumber), crunchifyRandomNumber);
                        }
                    }
                });
            }

            // Make sure executor stops accepting new tasks
            crunchifyExServer.shutdown();

            // Blocks until all tasks have completed execution after a shutdown request
            crunchifyExServer.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);

            long endTime = System.nanoTime();
            long totalTime = (endTime - startTime) / 1000000L;
            averageTime += totalTime;
            System.out.println("2500K entries added/retrieved in " + totalTime + " ms");
        }
        System.out.println("For " + crunchifyThreads.getClass() + " the average time is " + averageTime / 5 + " ms\n");
    }
}

Result

Test started for: class java.util.Hashtable
500K entries added/retrieved in 1432 ms
500K entries added/retrieved in 1425 ms
500K entries added/retrieved in 1373 ms
500K entries added/retrieved in 1369 ms
500K entries added/retrieved in 1438 ms
For class java.util.Hashtable the average time is 1407 ms

Test started for: class java.util.Collections$SynchronizedMap
500K entries added/retrieved in 1431 ms
500K entries added/retrieved in 1460 ms
500K entries added/retrieved in 1387 ms
500K entries added/retrieved in 1456 ms
500K entries added/retrieved in 1406 ms
For class java.util.Collections$SynchronizedMap the average time is 1428 ms

Test started for: class java.util.concurrent.ConcurrentHashMap
500K entries added/retrieved in 413 ms
500K entries added/retrieved in 351 ms
500K entries added/retrieved in 427 ms
500K entries added/retrieved in 337 ms
500K entries added/retrieved in 339 ms
For class java.util.concurrent.ConcurrentHashMap the average time is 373 ms  <== Much faster