Starting today, I will record some good sentences that appear in articles, so that I can regularly collect the better ones here: 2015-10-27
-----------------------------------------------------------------------------------
Errata: This is a report that was finished in a bit of a rush for a deadline, and has numerous imperfections that I dislike but couldn’t have avoided: (imperfections were unavoidable)
We are not aware of any significant recent work in the NLIDB domain that would be based on a well-defined, published dataset and rigorously evaluate results — most research concerns building auxiliary systems that augment human database experts. (i.e. no notable recent work of that kind)
Open Question Answering Over Curated and Extracted Knowledge Bases
In a nutshell (in brief; simply put)
End-to-end differentiable neural architectures have failed to approach state-of-the-art performance until very recently.
The models presented here are general sequence models, requiring no appeal to natural language specific processing beyond tokenization
We surmise that without effective salience models on text-derived rules, reasoning is unable to leverage the systematic advantages of the MLN-based models.
recapitulated briefly here for completeness
extends the recursive neural network through a gating mechanism to allow it to learn the structure of recursive composition on the fly (on the fly: continuously, as input is added)
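The sentence above is concrete enough to sketch. What follows is a minimal, hypothetical illustration of gated recursive composition: a 3-way gate mixes a newly composed vector with copies of the two children, which is what lets the model choose its composition structure on the fly. The function names, the numpy implementation, and all dimensions are my own illustrative assumptions, not the quoted paper's actual formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_compose(left, right, W, G):
    """Merge two child vectors into a parent vector.

    A 3-way gate (new composition vs. copy-left vs. copy-right) lets the
    model decide the composition structure on the fly instead of following
    a fixed parse tree. (Illustrative sketch, not the paper's formulation.)
    """
    h_new = np.tanh(W @ np.concatenate([left, right]))   # candidate composed vector
    gate = softmax(G @ np.concatenate([left, right]))    # weights for [new, left, right]
    return gate[0] * h_new + gate[1] * left + gate[2] * right

# Toy usage: repeatedly merge adjacent word vectors until one vector remains.
d = 4
rng = np.random.default_rng(0)
W, G = rng.normal(size=(d, 2 * d)), rng.normal(size=(3, 2 * d))
seq = [rng.normal(size=d) for _ in range(5)]              # fake word embeddings
while len(seq) > 1:
    seq = [gated_compose(seq[i], seq[i + 1], W, G) for i in range(len(seq) - 1)]
print(seq[0].shape)                                       # (4,)
```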