Head-Driven Phrase Structure Grammar

Head-driven phrase structure grammar (HPSG) is a highly lexicalized, constraint-based grammar framework[1] developed by Carl Pollard and Ivan Sag.[2][3] It is a type of phrase structure grammar, as opposed to a dependency grammar, and it is the immediate successor to generalized phrase structure grammar. HPSG draws on other fields such as computer science (data type theory and knowledge representation) and uses Ferdinand de Saussure's notion of the sign. Its uniform formalism and modular organization make it attractive for natural language processing.

An HPSG grammar includes principles, grammar rules, and lexicon entries, although lexicon entries are normally not considered to belong to a grammar. The formalism is based on lexicalism: the lexicon is more than just a list of entries; it is in itself richly structured. Individual entries are marked with types, and types form a hierarchy. Early versions of the grammar were very lexicalized, with few grammatical rules (schemata). More recent research has tended to add more and richer rules, making HPSG more like construction grammar.[4]

The basic type HPSG deals with is the sign. Words and phrases are two different subtypes of sign. A word has two features: [PHON] (the sound, the phonetic form) and [SYNSEM] (the syntactic and semantic information), both of which are split into subfeatures. Signs and rules are formalized as typed feature structures.
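
The sign hierarchy described above can be sketched as a small Python class hierarchy. This is purely illustrative: the class and attribute names mirror HPSG conventions (sign, word, phrase, PHON, SYNSEM), but the concrete representation is a hypothetical choice, not part of any actual HPSG implementation.

```python
# Illustrative sketch: the sign type hierarchy as Python classes.
# PHON and SYNSEM are modelled as plain attributes; real systems use
# typed feature structures with a full type hierarchy.

class Sign:
    """Base type: every sign pairs sound with syntactic/semantic info."""
    def __init__(self, phon, synsem):
        self.phon = phon      # PHON: list of phonological strings
        self.synsem = synsem  # SYNSEM: nested feature structure (a dict here)

class Word(Sign):
    """Lexical sign, listed in the lexicon."""

class Phrase(Sign):
    """Phrasal sign, built by grammar schemata from daughter signs."""
    def __init__(self, phon, synsem, daughters):
        super().__init__(phon, synsem)
        self.daughters = daughters  # the immediate constituents

walks = Word(["walks"], {"CAT": {"HEAD": "verb"}})
```

Because word and phrase are both subtypes of sign, any constraint stated on sign applies to both.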

Sample grammar

HPSG generates strings by combining signs, which are defined by their location within a type hierarchy and by their internal feature structure, represented by attribute value matrices (AVMs).[5][6] Features take types or lists of types as their values, and these values may in turn have their own feature structure. Grammatical rules are largely expressed through the constraints signs place on one another. A sign's feature structure describes its phonological, syntactic, and semantic properties. In common notation, AVMs are written with features in upper case and types in italicized lower case. Numbered indices in an AVM represent token identical values.
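
The constraint-combining step relies on unification of feature structures. A minimal sketch, assuming feature structures are encoded as nested Python dictionaries with atomic values as strings (and ignoring the type hierarchy and structure sharing), might look like this:

```python
# Minimal sketch of feature-structure unification over nested dicts.
# Real HPSG unification also consults a type hierarchy and maintains
# structure sharing (token identity); both are omitted for brevity.

def unify(fs1, fs2):
    """Return the most specific structure compatible with both, or None."""
    if fs1 == fs2:
        return fs1
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for feat, val in fs2.items():
            if feat in result:
                sub = unify(result[feat], val)
                if sub is None:
                    return None      # feature values clash: unification fails
                result[feat] = sub
            else:
                result[feat] = val   # feature mentioned only in fs2
        return result
    return None                      # incompatible atomic values

# A bare third-person requirement is compatible with a 3sg pronoun...
assert unify({"PER": "3rd"}, {"PER": "3rd", "NUM": "sg"}) == {"PER": "3rd", "NUM": "sg"}
# ...but singular and plural values clash.
assert unify({"NUM": "sg"}, {"NUM": "pl"}) is None
```

Failure of unification (the `None` result) is how the grammar rules out ungrammatical combinations such as agreement mismatches.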

In the simplified AVM for the word "walks" below, the verb's categorial information (CAT) is divided into features that describe it (HEAD) and features that describe its arguments (VALENCE).

"Walks" is a sign of type word with a head of type verb. As an intransitive verb, "walks" has no complement but requires a subject that is a third person singular noun. The semantic value of the subject (CONTENT) is co-indexed with the verb's only argument (the individual doing the walking). The following AVM for "she" represents a sign with a SYNSEM value that could fulfill those requirements.
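
The two lexical signs just described can be rendered as nested dictionaries. This encoding is hypothetical; the numbered index that links the verb's semantic argument to its subject requirement is modelled here by having both positions reference the same Python object:

```python
# Hypothetical dict encoding of the AVMs for "walks" and "she".
# Token identity (the boxed index [1] in AVM notation) is modelled by
# two positions pointing at the same object.

index_1 = {"PER": "3rd", "NUM": "sg"}  # the individual doing the walking

walks = {
    "PHON": ["walks"],
    "SYNSEM": {
        "CAT": {
            "HEAD": "verb",
            "VALENCE": {
                "SUBJ": [{"CAT": {"HEAD": "noun"},
                          "CONTENT": {"INDEX": index_1}}],
                "COMPS": [],          # intransitive: no complements
            },
        },
        "CONTENT": {"RELN": "walk", "WALKER": index_1},
    },
}

she = {
    "PHON": ["she"],
    "SYNSEM": {"CAT": {"HEAD": "noun"},
               "CONTENT": {"INDEX": {"PER": "3rd", "NUM": "sg"}}},
}

# The verb's semantic argument and its SUBJ requirement share one index:
assert (walks["SYNSEM"]["CONTENT"]["WALKER"]
        is walks["SYNSEM"]["CAT"]["VALENCE"]["SUBJ"][0]["CONTENT"]["INDEX"])
```

The SYNSEM value of `she` is compatible with the single item on the SUBJ list of `walks`, which is what licenses their combination.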

Signs of type phrase unify with one or more children and propagate information upward. The following AVM encodes the immediate dominance rule for a head-subj-phrase, which requires two children: the head child (a verb) and a non-head child that fulfills the verb's SUBJ constraints.

The end result is a sign with a verb head, empty subcategorization features, and a phonological value that orders the two children.
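
The head-subj schema sketched above can be written as a function that checks the head daughter's SUBJ requirement against the non-head daughter, empties the valence features, and concatenates the daughters' PHON values. The dict encoding and the naive compatibility check are both assumptions for illustration; real HPSG uses full unification:

```python
# Sketch of the head-subj-phrase schema: combine a subject with a verbal
# head whose SUBJ requirement it satisfies.

def compatible(requirement, candidate):
    """True if every feature the requirement mentions matches the candidate."""
    if not isinstance(requirement, dict):
        return requirement == candidate
    return all(feat in candidate and compatible(val, candidate[feat])
               for feat, val in requirement.items())

def head_subj_phrase(subj, head):
    req = head["SYNSEM"]["CAT"]["VALENCE"]["SUBJ"][0]
    if not compatible(req, subj["SYNSEM"]):
        return None  # the non-head daughter fails the SUBJ constraint
    return {
        "PHON": subj["PHON"] + head["PHON"],   # subject precedes the head
        "SYNSEM": {
            "CAT": {"HEAD": head["SYNSEM"]["CAT"]["HEAD"],
                    "VALENCE": {"SUBJ": [], "COMPS": []}},  # saturated
            "CONTENT": head["SYNSEM"]["CONTENT"],
        },
    }

she = {"PHON": ["she"],
       "SYNSEM": {"CAT": {"HEAD": "noun"},
                  "CONTENT": {"INDEX": {"PER": "3rd", "NUM": "sg"}}}}
walks = {"PHON": ["walks"],
         "SYNSEM": {"CAT": {"HEAD": "verb",
                            "VALENCE": {"SUBJ": [{"CAT": {"HEAD": "noun"}}],
                                        "COMPS": []}},
                    "CONTENT": {"RELN": "walk"}}}

s = head_subj_phrase(she, walks)
assert s["PHON"] == ["she", "walks"]
```

The resulting sign has a verb HEAD, empty SUBJ and COMPS lists, and a PHON value ordering the subject before the head, matching the prose description.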

Although the actual grammar of HPSG is composed entirely of feature structures, linguists often use trees to represent the unification of signs where the equivalent AVM would be unwieldy.


Various parsers based on the HPSG formalism have been written, and optimizations are currently being investigated. An example of a system analyzing German sentences is provided by the Freie Universität Berlin.[7] In addition, the CoreGram[8] project of the Grammar Group of the Freie Universität Berlin provides open-source grammars implemented in the TRALE system. Currently there are grammars for German,[9] Danish,[10] Mandarin Chinese,[11] Maltese,[12] and Persian[13] that share a common core and are publicly available.

Large HPSG grammars of various languages are being developed in the Deep Linguistic Processing with HPSG Initiative (DELPH-IN).[14] Wide-coverage grammars of English,[15] German,[16] and Japanese[17] are available under an open-source license. These grammars can be used with a variety of inter-compatible open-source HPSG parsers: LKB, PET,[18] Ace,[19] and agree.[20] All of these produce semantic representations in the format of “Minimal Recursion Semantics,” MRS.[21] The declarative nature of the HPSG formalism means that these computational grammars can typically be used for both parsing and generation (producing surface strings from semantic inputs). Treebanks, also distributed by DELPH-IN, are used to develop and test the grammars, as well as to train ranking models to decide on plausible interpretations when parsing (or realizations when generating).

Enju is a freely available wide-coverage probabilistic HPSG parser for English developed by the Tsujii Laboratory at The University of Tokyo in Japan.[22]


Further reading

  • Carl Pollard, Ivan A. Sag (1987): Information-based Syntax and Semantics. Volume 1: Fundamentals. Stanford: CSLI Publications.
  • Carl Pollard, Ivan A. Sag (1994): Head-Driven Phrase Structure Grammar. Chicago: University of Chicago Press.
  • Ivan A. Sag, Thomas Wasow, Emily M. Bender (2003): Syntactic Theory: A Formal Introduction, Second Edition. Chicago: University of Chicago Press.
  • Levine, Robert D.; W. Detmar Meurers (2006). "Head-Driven Phrase Structure Grammar: Linguistic Approach, Formal Foundations, and Computational Realization" (PDF). In Keith Brown (ed.), Encyclopedia of Language and Linguistics (second ed.). Oxford: Elsevier. 
  • Müller, Stefan (2013). "Unifying Everything: Some Remarks on Simpler Syntax, Construction Grammar, Minimalism and HPSG". Language. doi:10.1353/lan.2013.0061. 

References

  1. https://www.acsu.buffalo.edu/~rchaves/hpsg-ideas.html (permanent dead link)
  2. Pollard, Carl; Ivan A. Sag (1987). Information-Based Syntax and Semantics. Volume 1: Fundamentals. CSLI Lecture Notes 13.
  3. Pollard, Carl; Ivan A. Sag (1994). Head-Driven Phrase Structure Grammar. Chicago: University of Chicago Press.
  4. Sag, Ivan A. (1997). "English Relative Clause Constructions". Journal of Linguistics 33.2: 431–484.
  5. Pollard, Carl; Ivan A. Sag (1994). Head-Driven Phrase Structure Grammar. Chicago: University of Chicago Press.
  6. Sag, Ivan A.; Thomas Wasow; Emily Bender (2003). Syntactic Theory: A Formal Introduction (2nd ed.). Chicago: University of Chicago Press.
  7. The Babel-System: HPSG Interactive
  8. The CoreGram Project
  9. Berligram
  10. DanGram
  11. Chinese
  12. Maltese
  13. Persian
  14. DELPH-IN: Open-Source Deep Processing
  15. English Resource Grammar and Lexicon
  16. Berthold Crysmann
  17. JacyTop - Deep Linguistic Processing with HPSG (DELPH-IN)
  18. DELPH-IN PET parser
  19. Ace: the Answer Constraint Engine
  20. agree grammar engineering
  21. Copestake, A.; Flickinger, D.; Pollard, C.; Sag, I. A. (2005). "Minimal Recursion Semantics: An Introduction". Research on Language and Computation 3(2–3): 281–332.
  22. Tsujii Lab: Enju parser home page. Archived 2010-03-07 at the Wayback Machine (retrieved Nov 24, 2009).


Welcome to the on-line wiki forum for DELPH-IN software and resources. It enables both developers and users to incrementally create further documentation and up-to-date information on installation and usage of DELPH-IN technology. Mostly to enforce some discipline among ourselves, these pages require that users be registered with the wiki server in order to obtain write access. Please create a WikiName for yourself, which may require obtaining a ‘textcha’ to protect against wiki spam; once registered at the wiki, to request write access please contact at. The developers hope that active DELPH-IN users will contribute to these pages over time.

Components and Resources

  • Tools and Architectures
    • LKB: Lexical Knowledge Builder --- Grammar Engineering Environment

    • [incr tsdb()]: Competence and Performance Profiler

    • Pet: Platform for Experimentation with efficient HPSG processing Techniques

    • Heart of Gold: XML-based middleware for the integration of deep and shallow NLP components

    • LOGON: Information about the LOGON machine translation infrastructure.

    • Other tools: Supporting software, addons, peripheral contributions

  • Grammars, Frameworks and Treebanks
    • Catalogue of Grammars

    • Matrix: Starter-Kit for rapid prototyping of LKB-compatible precision grammars

    • CLIMB: Tools to support grammar development of LKB-compatible precision grammars

    • Redwoods: HPSG Treebank Comprised of Analyses from the ERG

    • MRS: Minimal Recursion Semantics --- Theory and Implementation (including extensions and variants such as Robust MRS (RMRS), Elementary Dependency Structures (EDS) and Dependency MRS (DMRS))

    • Shared Corpora, Treebanks
    • Grammar discussions: Discussions for grammar developers (analyses, terminology, harmonization, …)

    • DELPH-IN RFCs (Requests For Comments; formal specifications)

  • Applications

Further Information

Additional information about the Deep Linguistic Processing with HPSG Initiative (DELPH-IN) is available from the DELPH-IN home page, including pointers to on-line information for most of the DELPH-IN resources, a bibliography of select background publications, and links to electronic copies in most cases.

There is a collection of DELPH-IN mailing lists to which users can subscribe on-line, with archives of previous postings browsable through the DELPH-IN mailing list manager. If you click on a list, there is a link to the archive near the top of the page.

There is also an experimental stack-exchange style forum.

FrontPage (last edited 2017-11-20 18:56:45 by OlgaZamaraeva)
