Publication
ACL-IJCNLP 2015
Conference paper
Dependency-based convolutional neural networks for sentence embedding
Abstract
In sentence modeling and classification, convolutional neural network approaches have recently achieved state-of-the-art results, but all such efforts process word vectors sequentially and neglect long-distance dependencies. To combine deep learning with linguistic structures, we propose a dependency-based convolution approach, making use of tree-based n-grams rather than surface ones, thus utilizing non-local interactions between words. Our model improves sequential baselines on all four sentiment and question classification tasks, and achieves the highest published accuracy on TREC.
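To illustrate the core idea, the sketch below contrasts surface n-grams (consecutive words) with tree-based n-grams, where each n-gram follows a word's ancestor path in the dependency parse. The example sentence, head indices, and helper names are illustrative assumptions, not the paper's implementation.

```python
# Sketch: surface n-grams vs. dependency-tree-based n-grams.
# heads[i] gives the index of word i's syntactic head; -1 marks the root.

def surface_ngrams(words, n=3):
    """Sequential n-grams over linear word order."""
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

def tree_ngrams(words, heads, n=3):
    """For each word, collect up to n words along its ancestor path
    (word, parent, grandparent, ...), stopping at the root."""
    grams = []
    for i in range(len(words)):
        path, j = [], i
        for _ in range(n):
            path.append(words[j])
            if heads[j] < 0:  # reached the root of the parse
                break
            j = heads[j]
        grams.append(tuple(path))
    return grams

# Hypothetical parse: "saw" is the root; "I", "movie", "yesterday" attach to it.
words = ["I", "saw", "a", "movie", "yesterday"]
heads = [1, -1, 3, 1, 1]

print(surface_ngrams(words))      # consecutive-word trigrams
print(tree_ngrams(words, heads))  # ancestor-path trigrams
```

Note how the tree-based trigram for "a" is ("a", "movie", "saw"), capturing a non-local word pairing ("a" ... "saw") that no surface trigram window of the same size reaches; a convolution filter applied over these paths can thus pick up long-distance interactions.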