Using Hibernate Search

hibernate.cfg.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
		"-//Hibernate/Hibernate Configuration DTD 3.0//EN"
		"http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory name="sessionFactory">

        <property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
        <property name="hibernate.connection.url">jdbc:mysql://127.0.0.1:3306/hibernate_search</property>
        <property name="hibernate.connection.username">travis</property>
        <property name="hibernate.dialect">org.hibernate.dialect.MySQL5InnoDBDialect</property>
        <property name="hibernate.hbm2ddl.auto" value="update" /> 

                <property name="hibernate.search.lucene_version" value="LUCENE_36"/>
		<property name="hibernate.search.default.directory_provider" value="filesystem"/>
		<property name="hibernate.search.default.indexBase" value="target/lucene/indexes"/>

		<event type="post-update">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event>

		<event type="post-insert">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event>

		<event type="post-delete">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event>

		<event type="post-collection-recreate">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event>

		<event type="post-collection-remove">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event>

		<event type="post-collection-update">
			<listener class="org.hibernate.search.event.FullTextIndexEventListener" />
		</event> 

    </session-factory>
</hibernate-configuration>
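
With this file on the classpath root, a SessionFactory can be bootstrapped directly from it. The following is a minimal standalone sketch (not part of the original post; the class name is hypothetical), assuming Hibernate 4.x with the hibernate-search module available; buildSessionFactory() without a ServiceRegistry is deprecated in Hibernate 4 but still works for a quick test:

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;
import org.hibernate.search.FullTextSession;
import org.hibernate.search.Search;

public class Bootstrap {

	public static void main(String[] args) {
		// reads hibernate.cfg.xml from the classpath root
		SessionFactory sessionFactory = new Configuration().configure().buildSessionFactory();

		Session session = sessionFactory.openSession();
		// wrap the plain Session to gain access to the full-text API
		FullTextSession fullTextSession = Search.getFullTextSession(session);

		// ... build a Lucene query and call fullTextSession.createFullTextQuery(...) here ...

		fullTextSession.close();
		sessionFactory.close();
	}
}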

Spring configuration file (applicationContext.xml)

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:context="http://www.springframework.org/schema/context"
	xmlns:mvc="http://www.springframework.org/schema/mvc" xmlns:jdbc="http://www.springframework.org/schema/jdbc"
	xmlns:aop="http://www.springframework.org/schema/aop" xmlns:tx="http://www.springframework.org/schema/tx"
	xsi:schemaLocation="http://www.springframework.org/schema/mvc
	http://www.springframework.org/schema/mvc/spring-mvc-3.1.xsd
	http://www.springframework.org/schema/beans
	http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
	http://www.springframework.org/schema/aop
	http://www.springframework.org/schema/aop/spring-aop-3.1.xsd
	http://www.springframework.org/schema/tx
	http://www.springframework.org/schema/tx/spring-tx-3.1.xsd
	http://www.springframework.org/schema/jdbc
	http://www.springframework.org/schema/jdbc/spring-jdbc-3.1.xsd
	http://www.springframework.org/schema/context
	http://www.springframework.org/schema/context/spring-context-3.1.xsd">

	<context:annotation-config />

	<context:property-placeholder location="classpath:jdbc.properties" />

	<context:component-scan base-package="org.hibernate.search.hibernate.example">
		<context:exclude-filter type="annotation"
			expression="org.springframework.stereotype.Controller" />
	</context:component-scan>

	<!-- Alibaba Druid data source -->
	<!-- https://github.com/alibaba/druid/wiki/%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98 -->
	<bean id="hibernate4DataSource" class="com.alibaba.druid.pool.DruidDataSource"
		init-method="init" destroy-method="close">
		<property name="username" value="${username}"></property>
		<property name="password" value="${password}"></property>
		<property name="url" value="${log4jdbc.url}"></property>
		<property name="driverClassName" value="${log4jdbc.driverClassName}"></property>

		<!-- initial, minimum and maximum pool size -->
		<property name="initialSize" value="1" />
		<property name="minIdle" value="1" />
		<property name="maxActive" value="20" />

		<!-- maximum time (ms) to wait when acquiring a connection -->
		<property name="maxWait" value="60000" />

		<!-- how often (ms) the eviction thread checks for idle connections to close -->
		<property name="timeBetweenEvictionRunsMillis" value="60000" />

		<!-- minimum time (ms) a connection must stay idle in the pool before it can be evicted -->
		<property name="minEvictableIdleTimeMillis" value="300000" />

		<property name="validationQuery" value="SELECT 'x'" />
		<property name="testWhileIdle" value="true" />
		<property name="testOnBorrow" value="false" />
		<property name="testOnReturn" value="false" />

		<!-- enable the PreparedStatement cache and set its size per connection -->
		<property name="poolPreparedStatements" value="true" />
		<property name="maxPoolPreparedStatementPerConnectionSize"
			value="20" />

		<!-- filters for monitoring and statistics -->
		<property name="filters" value="stat,log4j" />
	</bean>

	<!-- load initial data from a SQL script -->

	<jdbc:initialize-database data-source="hibernate4DataSource">
		<jdbc:script location="classpath:hibernate_search.sql"
			encoding="GBK" />
	</jdbc:initialize-database>

	<bean id="hibernate4sessionFactory"
		class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
		<property name="dataSource" ref="hibernate4DataSource"></property>
		<property name="hibernateProperties">
			<props>
				<prop key="hibernate.dialect">org.hibernate.dialect.MySQL5InnoDBDialect</prop>
				<!-- <prop key="hibernate.show_sql">true</prop> -->
				<!-- <prop key="hibernate.hbm2ddl.auto">update</prop> -->
				<prop key="hibernate.jdbc.batch_size">20</prop>

				<!-- hibernate cache -->
				<prop key="hibernate.cache.use_query_cache">true</prop>
				<prop key="hibernate.cache.use_second_level_cache">true</prop>
				<prop key="hibernate.cache.provider_class">org.hibernate.cache.EhCacheProvider</prop>
				<prop key="hibernate.cache.region.factory_class">org.hibernate.cache.ehcache.EhCacheRegionFactory
				</prop>
				<prop key="net.sf.ehcache.configurationResourceName">ehcache.xml</prop>

				<!-- hibernate search configuration -->
				<prop key="hibernate.search.lucene_version">LUCENE_36</prop>
				<prop key="hibernate.search.default.directory_provider">filesystem</prop>
				<prop key="hibernate.search.default.indexBase">target/lucene/indexes</prop>

				<!-- hibernate search Index Optimization -->
				<prop key="hibernate.search.default.optimizer.operation_limit.max">1000</prop>
				<prop key="hibernate.search.default.optimizer.transaction_limit.max">100</prop>
			</props>
		</property>

		<property name="packagesToScan">
			<list>
				<value>org.hibernate.search.hibernate.example.model</value>
			</list>
		</property>

		<!-- <property name="annotatedClasses"> <list> <value>org.hibernate.search.hibernate.example.model.Author</value>
			<value>org.hibernate.search.hibernate.example.model.Book</value> </list>
			</property> -->

		<!-- <property name="mappingResources"> <list> <value>org/hibernate/search/hibernate/example/model/Author.hbm.xml</value>
			<value>org/hibernate/search/hibernate/example/model/Book.hbm.xml</value>
			</list> </property> -->

	</bean>

	<bean class="org.hibernate.search.hibernate.example.IndexManger"
		depends-on="hibernate4sessionFactory" />

	<bean id="hibernate4TransactionManager"
		class="org.springframework.orm.hibernate4.HibernateTransactionManager">
		<property name="sessionFactory" ref="hibernate4sessionFactory"></property>
	</bean>

	<aop:aspectj-autoproxy expose-proxy="true" />
	<tx:annotation-driven transaction-manager="hibernate4TransactionManager" />

	<tx:advice id="txAdvice" transaction-manager="hibernate4TransactionManager">
		<tx:attributes>
			<tx:method name="save*" propagation="REQUIRED" />
			<tx:method name="add*" propagation="REQUIRED" />
			<tx:method name="create*" propagation="REQUIRED" />
			<tx:method name="insert*" propagation="REQUIRED" />
			<tx:method name="update*" propagation="REQUIRED" />
			<tx:method name="merge*" propagation="REQUIRED" />
			<tx:method name="del*" propagation="REQUIRED" />
			<tx:method name="remove*" propagation="REQUIRED" />
			<tx:method name="put*" propagation="REQUIRED" />
			<tx:method name="use*" propagation="REQUIRED" />
			<!-- with Hibernate 4 these methods must still run inside a transaction, otherwise getCurrentSession() has no session bound -->
			<tx:method name="get*" propagation="REQUIRED" read-only="true" />
			<tx:method name="count*" propagation="REQUIRED" read-only="true" />
			<tx:method name="find*" propagation="REQUIRED" read-only="true" />
			<tx:method name="list*" propagation="REQUIRED" read-only="true" />
			<tx:method name="query*" propagation="REQUIRED" read-only="true" />
			<tx:method name="load*" propagation="REQUIRED" read-only="true" />
			<tx:method name="*" read-only="true" />
		</tx:attributes>
	</tx:advice>

	<aop:config expose-proxy="true">
		<aop:pointcut
			expression="execution(* org.hibernate.search.hibernate.example.service.*.*(..))"
			id="pointcut" />
		<aop:advisor advice-ref="txAdvice" pointcut-ref="pointcut" />
	</aop:config>

</beans>
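
The ${...} placeholders above are resolved from jdbc.properties on the classpath. That file is not shown in the original post; a hypothetical example matching the referenced keys could look like the following (a plain MySQL driver is assumed; the log4jdbc.* key names suggest the author may actually have routed the connection through the log4jdbc wrapper driver):

# hypothetical values; adjust to your environment
username=travis
password=
log4jdbc.driverClassName=com.mysql.jdbc.Driver
log4jdbc.url=jdbc:mysql://127.0.0.1:3306/hibernate_search?useUnicode=true&characterEncoding=UTF-8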

Entity classes

Author

package org.hibernate.search.hibernate.example.model;

import java.util.Set;

import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.ManyToMany;
import javax.persistence.Table;

import org.codehaus.jackson.annotate.JsonIgnore;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;
import org.hibernate.search.annotations.Analyze;
import org.hibernate.search.annotations.ContainedIn;
import org.hibernate.search.annotations.Field;
import org.hibernate.search.annotations.Index;
import org.hibernate.search.annotations.Store;

@Entity
@Table(catalog="hibernate_search",name="Author")
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE,region="org.hibernate.search.hibernate.example.model.Author")
public class Author {
	@Id
	@GeneratedValue(strategy=GenerationType.AUTO)
	private Integer id;

	@Field(index=Index.YES,analyze=Analyze.NO,store=Store.COMPRESS)
	private String name;

	@ManyToMany(fetch=FetchType.LAZY,mappedBy="authors"/*,cascade={CascadeType.PERSIST,CascadeType.MERGE,CascadeType.REFRESH,CascadeType.REMOVE}*/)
	@ContainedIn
	@Cache(usage = CacheConcurrencyStrategy.READ_WRITE,region="org.hibernate.search.hibernate.example.model.Book")
	@JsonIgnore
	private Set<Book> books;

	public Integer getId() {
		return id;
	}

	public void setId(Integer id) {
		this.id = id;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public Set<Book> getBooks() {
		return books;
	}

	public void setBooks(Set<Book> books) {
		this.books = books;
	}

	public Author() {
	}

	public Author(String name) {
		super();
		this.name = name;
	}

}

Book

package org.hibernate.search.hibernate.example.model;

import static org.hibernate.search.annotations.FieldCacheType.CLASS;
import static org.hibernate.search.annotations.FieldCacheType.ID;

import java.util.Date;
import java.util.HashSet;
import java.util.Set;

import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.JoinTable;
import javax.persistence.ManyToMany;
import javax.persistence.Table;

import net.paoding.analysis.analyzer.PaodingAnalyzer;

import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;
import org.hibernate.search.annotations.Analyze;
import org.hibernate.search.annotations.Analyzer;
import org.hibernate.search.annotations.Boost;
import org.hibernate.search.annotations.CacheFromIndex;
import org.hibernate.search.annotations.DateBridge;
import org.hibernate.search.annotations.DocumentId;
import org.hibernate.search.annotations.Field;
import org.hibernate.search.annotations.Index;
import org.hibernate.search.annotations.Indexed;
import org.hibernate.search.annotations.IndexedEmbedded;
import org.hibernate.search.annotations.Resolution;
import org.hibernate.search.annotations.Store;

@Entity
@Table(catalog="hibernate_search",name="Book")
@Indexed(index="book")
//@Analyzer(impl=IKAnalyzer.class)
@Analyzer(impl=PaodingAnalyzer.class)
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE,region="org.hibernate.search.hibernate.example.model.Book")
@Boost(2.0f)
@CacheFromIndex( { CLASS, ID } )
public class Book {
	@Id
	@GeneratedValue(strategy=GenerationType.AUTO)
	@DocumentId
	private Integer id;

	@Field(index = Index.YES, analyze = Analyze.YES, store = Store.COMPRESS)
	@Boost(1.5f)
	private String name;

	@Field(index = Index.YES, analyze = Analyze.YES, store = Store.COMPRESS)
	@Boost(1.2f)
	private String description;

	@Field(index = Index.YES, analyze = Analyze.NO, store = Store.YES)
	@DateBridge(resolution = Resolution.DAY)
	private Date publicationDate;

	@IndexedEmbedded(depth=1)
	@ManyToMany(cascade={CascadeType.PERSIST,CascadeType.MERGE,CascadeType.REFRESH,CascadeType.REMOVE},fetch=FetchType.LAZY)
	@JoinTable(
			catalog="hibernate_search",
			name="Book_Author",
			joinColumns={@JoinColumn(name = "book_id")},
			inverseJoinColumns = {@JoinColumn(name = "author_id")}
	)
	@Cache(usage = CacheConcurrencyStrategy.READ_WRITE,region="org.hibernate.search.hibernate.example.model.Author")
	private Set<Author> authors = new HashSet<Author>();

	public Integer getId() {
		return id;
	}

	public void setId(Integer id) {
		this.id = id;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public String getDescription() {
		return description;
	}

	public void setDescription(String description) {
		this.description = description;
	}

	public Date getPublicationDate() {
		return publicationDate;
	}

	public void setPublicationDate(Date publicationDate) {
		this.publicationDate = publicationDate;
	}

	public Set<Author> getAuthors() {
		return authors;
	}

	public void setAuthors(Set<Author> authors) {
		this.authors = authors;
	}

	public Book() {
	}

}
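
To illustrate how this mapping behaves at runtime, here is a small sketch (not from the original code; the helper class, method and example data are hypothetical). Persisting a Book inside a transaction is enough to index it, because the Hibernate Search event listeners react to the insert, and the PERSIST cascade on Book.authors saves the Author as well:

import java.util.Date;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.search.hibernate.example.model.Author;
import org.hibernate.search.hibernate.example.model.Book;

public class IndexingDemo {

	// sessionFactory obtained as in the bootstrap sketch shown earlier
	public static void saveAndIndex(SessionFactory sessionFactory) {
		Session session = sessionFactory.openSession();
		Transaction tx = session.beginTransaction();

		Book book = new Book();
		book.setName("Hibernate Search实战");           // illustrative data only
		book.setDescription("A full-text search example");
		book.setPublicationDate(new Date());
		book.getAuthors().add(new Author("张三"));       // cascaded PERSIST saves the author too

		session.persist(book);
		tx.commit();       // the Lucene index is updated when the transaction commits
		session.close();
	}
}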

Query result wrapper class

package org.hibernate.search.hibernate.example.model;

import java.util.List;

/**
 * Wrapper for one page of full-text search results.
 * @author Administrator
 *
 * @param <T>
 */
public class QueryResult<T> {

	private int searchresultsize;

	List<T> searchresult;

	public int getSearchresultsize() {
		return searchresultsize;
	}

	public void setSearchresultsize(int searchresultsize) {
		this.searchresultsize = searchresultsize;
	}

	public List<T> getSearchresult() {
		return searchresult;
	}

	public void setSearchresult(List<T> searchresult) {
		this.searchresult = searchresult;
	}

}

Dao

package org.hibernate.search.hibernate.example.dao.impl;

import java.util.ArrayList;
import java.util.List;
import java.util.Set;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.queryParser.MultiFieldQueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;
import org.apache.lucene.search.highlight.Highlighter;
import org.apache.lucene.search.highlight.QueryScorer;
import org.apache.lucene.search.highlight.SimpleHTMLFormatter;
import org.apache.lucene.util.Version;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.search.FullTextQuery;
import org.hibernate.search.FullTextSession;
import org.hibernate.search.Search;
import org.hibernate.search.hibernate.example.dao.BookDao;
import org.hibernate.search.hibernate.example.model.Author;
import org.hibernate.search.hibernate.example.model.Book;
import org.hibernate.search.hibernate.example.model.QueryResult;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Repository;

@Repository(value="bookDaoImpl")
public class BookDaoImpl implements BookDao {

	@Autowired
	@Qualifier("hibernate4sessionFactory")
	private SessionFactory sessionFactory;

	private Session getSession(){
		return sessionFactory.getCurrentSession();
	}

	@Override
	public void add(Book book) {

		getSession().persist(book);

	}

	@SuppressWarnings("unchecked")
	@Override
	public List<Book> query(int start, int pagesize) {
		return getSession().createCriteria(Book.class).setFirstResult(start).setMaxResults(pagesize).list();
	}

	@Override
	public void update(Book book) {
		getSession().merge(book);
	}

	@Override
	public void delete(Book book) {
		getSession().delete(book);
	}

	@Override
	public void delete(int id) {

		getSession().delete(load(id));

	}

	@Override
	public QueryResult<Book> query(String keyword, int start, int pagesize,Analyzer analyzer,String...field) throws Exception{

		QueryResult<Book> queryResult=new QueryResult<Book>();

		List<Book> books=new ArrayList<Book>();

		FullTextSession fullTextSession = Search.getFullTextSession(getSession());

		//query via the Hibernate Search DSL, matching across the name, description and authors.name fields
		//QueryBuilder qb = fullTextSession.getSearchFactory().buildQueryBuilder().forEntity(Book.class ).get();
		//Query luceneQuery = qb.keyword().onFields(field).matching(keyword).createQuery();

		//query via the Lucene API, matching across the name, description and authors.name fields

		MultiFieldQueryParser queryParser=new MultiFieldQueryParser(Version.LUCENE_36,new String[]{"name","description","authors.name"}, analyzer);
		Query luceneQuery=queryParser.parse(keyword);

		FullTextQuery fullTextQuery = fullTextSession.createFullTextQuery(luceneQuery, Book.class);
		int searchresultsize = fullTextQuery.getResultSize();
		queryResult.setSearchresultsize(searchresultsize);
		System.out.println("Found [" + searchresultsize + "] matching records");

		fullTextQuery.setFirstResult(start);
		fullTextQuery.setMaxResults(pagesize);

		//sort results by id, descending
		fullTextQuery.setSort(new Sort(new SortField("id", SortField.INT ,true)));

		//highlighting setup
		SimpleHTMLFormatter formatter=new SimpleHTMLFormatter("<b><font color='red'>", "</font></b>");
		QueryScorer queryScorer=new QueryScorer(luceneQuery);
		Highlighter highlighter=new Highlighter(formatter, queryScorer);

		@SuppressWarnings("unchecked")
		List<Book> tempresult = fullTextQuery.list();
		for (Book book : tempresult) {
			String highlighterString=null;
			try {
				//highlight the name field
				highlighterString=highlighter.getBestFragment(analyzer, "name", book.getName());
				if(highlighterString!=null){
					book.setName(highlighterString);
				}
				//highlight the authors.name field
				Set<Author> authors = book.getAuthors();
				for (Author author : authors) {
					highlighterString=highlighter.getBestFragment(analyzer, "authors.name", author.getName());
					if(highlighterString!=null){
						author.setName(highlighterString);
					}
				}
				//highlight the description field
				highlighterString=highlighter.getBestFragment(analyzer, "description", book.getDescription());
				if(highlighterString!=null){
					book.setDescription(highlighterString);
				}
			} catch (Exception e) {
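				// highlighting failures are non-fatal; keep the original (un-highlighted) text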
			}

			books.add(book);

			System.out.println("书名:"+book.getName()+"\n描述:"+book.getDescription()+"\n出版日期:"+book.getPublicationDate());
			System.out.println("----------------------------------------------------------");
		}

		queryResult.setSearchresult(books);

		return queryResult;
	}

	@Override
	public Book load(int id) {
		return (Book) getSession().get(Book.class, id);
	}

}
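
The BookDao interface itself is not included in the original post; a sketch inferred from the implementation above would be:

package org.hibernate.search.hibernate.example.dao;

import java.util.List;

import org.apache.lucene.analysis.Analyzer;
import org.hibernate.search.hibernate.example.model.Book;
import org.hibernate.search.hibernate.example.model.QueryResult;

public interface BookDao {

	void add(Book book);

	List<Book> query(int start, int pagesize);

	void update(Book book);

	void delete(Book book);

	void delete(int id);

	QueryResult<Book> query(String keyword, int start, int pagesize, Analyzer analyzer, String... field) throws Exception;

	Book load(int id);
}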

IndexManger (rebuilds the index at startup)

package org.hibernate.search.hibernate.example;

import org.hibernate.SessionFactory;
import org.hibernate.search.FullTextSession;
import org.hibernate.search.Search;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;

/**
 * @author Administrator
 *
 */
public class IndexManger implements InitializingBean{

	@Autowired
	@Qualifier("hibernate4sessionFactory")
	private SessionFactory sessionFactory;

	@Override
	public void afterPropertiesSet() throws Exception {
		//rebuild the whole index from the database when the Spring context starts
		FullTextSession fullTextSession = Search.getFullTextSession(sessionFactory.openSession());
		try {
			fullTextSession.createIndexer().startAndWait();
		} finally {
			fullTextSession.close();
		}
	}
}
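
createIndexer().startAndWait() rebuilds the entire index synchronously and blocks until it finishes. The MassIndexer can also be tuned; the following fragment is a sketch of common options (the values are illustrative, not from the original post), to be used inside afterPropertiesSet() in place of the plain createIndexer() call:

// requires imports for Book and org.hibernate.CacheMode
fullTextSession.createIndexer(Book.class)         // restrict mass indexing to Book (Author data is embedded)
		.batchSizeToLoadObjects(25)               // entities loaded per batch
		.threadsToLoadObjects(4)                  // parallel loader threads
		.cacheMode(CacheMode.IGNORE)              // bypass the second-level cache while indexing
		.startAndWait();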

SearchManager: a standalone query example

package org.hibernate.search.hibernate.example;

import java.util.List;
import java.util.Set;

import net.paoding.analysis.analyzer.PaodingAnalyzer;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.queryParser.MultiFieldQueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.highlight.Highlighter;
import org.apache.lucene.search.highlight.QueryScorer;
import org.apache.lucene.search.highlight.SimpleHTMLFormatter;
import org.apache.lucene.util.Version;
import org.hibernate.SessionFactory;
import org.hibernate.search.FullTextQuery;
import org.hibernate.search.FullTextSession;
import org.hibernate.search.Search;
import org.hibernate.search.hibernate.example.model.Author;
import org.hibernate.search.hibernate.example.model.Book;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class SearchManager {

	public static void main(String[] args) throws Exception{
		ApplicationContext applicationContext=new ClassPathXmlApplicationContext("applicationContext.xml");
		SessionFactory sessionFactory = applicationContext.getBean("hibernate4sessionFactory",SessionFactory.class);
		FullTextSession fullTextSession = Search.getFullTextSession(sessionFactory.openSession());

		//query via the Hibernate Search DSL, matching across the name, description and authors.name fields
//		QueryBuilder qb = fullTextSession.getSearchFactory().buildQueryBuilder().forEntity(Book.class ).get();
//		Query luceneQuery = qb.keyword().onFields("name","description","authors.name").matching("移动互联网").createQuery();

		//query via the Lucene API, matching across the name, description and authors.name fields
		//using the Paoding analyzer
		MultiFieldQueryParser queryParser=new MultiFieldQueryParser(Version.LUCENE_36, new String[]{"name","description","authors.name"}, new PaodingAnalyzer());
		Query luceneQuery=queryParser.parse("实战");

		FullTextQuery fullTextQuery =fullTextSession.createFullTextQuery(luceneQuery, Book.class);
		//page size: maximum number of results returned
		fullTextQuery.setMaxResults(5);
		//offset of the first result (0 = start of the first page)
		fullTextQuery.setFirstResult(0);

		//highlighting setup
		SimpleHTMLFormatter formatter=new SimpleHTMLFormatter("<b><font color='red'>", "</font></b>");
		QueryScorer queryScorer=new QueryScorer(luceneQuery);
		Highlighter highlighter=new Highlighter(formatter, queryScorer);

		@SuppressWarnings("unchecked")
		List<Book> resultList = fullTextQuery.list();
		System.out.println("共查找到["+resultList.size()+"]条记录");
		for (Book book : resultList) {
			String highlighterString=null;
			Analyzer analyzer=new PaodingAnalyzer();
			try {
				//highlight the name field
				highlighterString=highlighter.getBestFragment(analyzer, "name", book.getName());
				if(highlighterString!=null){
					book.setName(highlighterString);
				}
				//highlight the authors.name field
				Set<Author> authors = book.getAuthors();
				for (Author author : authors) {
					highlighterString=highlighter.getBestFragment(analyzer, "authors.name", author.getName());
					if(highlighterString!=null){
						author.setName(highlighterString);
					}
				}
				//highlight the description field
				highlighterString=highlighter.getBestFragment(analyzer, "description", book.getDescription());
				if(highlighterString!=null){
					book.setDescription(highlighterString);
				}
			} catch (Exception e) {
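				// highlighting failures are non-fatal; keep the original (un-highlighted) text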
			}

			System.out.println("书名:"+book.getName()+"\n描述:"+book.getDescription()+"\n出版日期:"+book.getPublicationDate());
			System.out.println("----------------------------------------------------------");
		}

		fullTextSession.close();
		sessionFactory.close();

	}
}
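
The commented-out lines in main() hint at the alternative approach: building the Lucene query through the Hibernate Search query DSL instead of MultiFieldQueryParser. A sketch of that variant (same entity and field names as above; the analyzer configured on the entity, here PaodingAnalyzer, is applied automatically by the keyword query):

import org.hibernate.search.query.dsl.QueryBuilder;

// inside main(), after obtaining fullTextSession:
QueryBuilder qb = fullTextSession.getSearchFactory()
		.buildQueryBuilder().forEntity(Book.class).get();

org.apache.lucene.search.Query luceneQuery = qb.keyword()
		.onFields("name", "description", "authors.name")
		.matching("实战")
		.createQuery();

FullTextQuery fullTextQuery = fullTextSession.createFullTextQuery(luceneQuery, Book.class);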