Publication | Open Access
From image descriptions to visual denotations: New similarity metrics for semantic inference over event descriptions
Citations: 2.4K
References: 31
Year: 2014
Keywords: Image Descriptions, Engineering, Visual Denotations, Semantics, Corpus Linguistics, Natural Language Processing, Image Analysis, Text-to-image Retrieval, Data Science, Visual Grounding, Computational Linguistics, Language Studies, Machine Translation, Machine Vision, Vision Language Model, Computer Vision, Scene Interpretation, Event Descriptions, Denotation Graph, Subsumption Hierarchy, Linguistics, Semantic Similarity, Automatic Annotation
We propose to use the visual denotations of linguistic expressions (i.e. the set of images they describe) to define novel denotational similarity metrics, which we show to be at least as beneficial as distributional similarities for two tasks that require semantic inference. To compute these denotational similarities, we construct a denotation graph, i.e. a subsumption hierarchy over constituents and their denotations, based on a large corpus of 30K images and 150K descriptive captions.
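The abstract defines a phrase's visual denotation as the set of images it describes. As an illustrative sketch only (the paper develops its own denotational similarity metrics; the function name, toy image IDs, and the choice of Jaccard overlap here are assumptions for demonstration), one simple set-based similarity over denotations could look like this:

```python
# Illustrative sketch: treat each phrase's denotation as the set of
# images it describes, and score two phrases by the Jaccard overlap
# of those sets. This is a toy stand-in, not the paper's exact metric.

def denotation_similarity(denotation_a, denotation_b):
    """Jaccard similarity between two denotations (sets of image IDs)."""
    a, b = set(denotation_a), set(denotation_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical denotations: IDs of images whose captions use each phrase.
runs = {"img1", "img2", "img3"}
jogs = {"img2", "img3", "img4"}
print(denotation_similarity(runs, jogs))  # 2 shared of 4 total -> 0.5
```

Two phrases that describe largely the same images score close to 1, capturing the intuition that denotational overlap signals semantic relatedness.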