Dependency Structures
Dependency structures connect the words of a sentence by drawing arrows between them. Each arrow represents a grammatical relation between two words.
An arrow connects a head (also called governor, superior, or regent) to a dependent (also called modifier, inferior, or subordinate). Usually, the dependencies form a tree.
Criteria for Heads and Dependents
Here are the criteria for a syntactic relation between a head H and a dependent D in a construction C.
- H determines the syntactic category of C; that is, H can replace C.
- D specifies H.
- H is obligatory; D may be optional.
- H selects D and determines whether D is obligatory or optional.
- The form of D depends on H (agreement or government).
- The linear position of D is specified with reference to H.
Dependency Graph
A dependency structure is a directed graph G consisting of a set V of nodes and a set A of edges. In a labelled dependency graph, the nodes in V are labelled with word forms and annotations, and the edges in A are labelled with dependency types.
Notational conventions
- (wi, d, wj) ∈ A is an edge linking head wi to dependent wj with label d
- wi → wj ≡ ∃d : (wi, d, wj) ∈ A
- i → j ≡ (i, j) ∈ A (writing nodes by their positions and ignoring labels)
- i →* j ≡ i = j ∨ ∃k : i → k, k →* j (the reflexive transitive closure of →)
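The notation above can be made concrete with a short Python sketch. Here a labelled dependency graph is a set of (head, label, dependent) triples; the sentence "John saw a dog" and its labels are illustrative examples, not taken from any treebank.

```python
# Labelled dependency graph for "John saw a dog" (words numbered 1..4,
# node 0 is an artificial root). Edges are (head, label, dependent).
A = {
    (0, "root", 2),   # root -> "saw"
    (2, "nsubj", 1),  # "saw" -> "John"
    (2, "obj", 4),    # "saw" -> "dog"
    (4, "det", 3),    # "dog" -> "a"
}

def arrow(i, j, edges):
    """i -> j: some labelled edge links head i to dependent j."""
    return any(h == i and d == j for h, _, d in edges)

def reaches(i, j, edges):
    """i ->* j: the reflexive transitive closure of -> (assumes an
    acyclic graph, so the recursion terminates)."""
    if i == j:
        return True
    return any(arrow(i, k, edges) and reaches(k, j, edges)
               for _, _, k in edges)

print(arrow(2, 4, A))    # "saw" directly governs "dog"
print(reaches(0, 3, A))  # 0 -> 2 -> 4 -> 3
```
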
Formal conditions on Dependency graphs
- G is connected: for every node i, there is a node j such that i → j or j → i.
- G is acyclic: if i → j, then not j →* i.
- G obeys the single-head constraint: if i → j, then not k → j for any k ≠ i.
- G is projective: if i → j, then i →* k for every k that lies between i and j.
These conditions correspond to intuitive properties of syntactic structure:
- Connectedness: the syntactic structure is complete.
- Acyclicity: the syntactic structure is hierarchical.
- Single head: every word has at most one syntactic head.
- Projectivity: there are no crossing dependencies.
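The four conditions can be checked mechanically. Below is a minimal Python sketch over an unlabelled edge set of (head, dependent) pairs; node 0 plays the artificial root, and both example trees are invented for illustration.

```python
def _reachable(i, edges):
    """All nodes reachable from i via ->, computed iteratively."""
    seen, stack = set(), [i]
    while stack:
        n = stack.pop()
        for h, d in edges:
            if h == n and d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

def connected(edges):
    """Every node is linked to every other by some chain of edges."""
    nodes = {n for e in edges for n in e}
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        for h, d in edges:
            if h == n:
                stack.append(d)
            elif d == n:
                stack.append(h)
    return seen == nodes

def acyclic(edges):
    """No node reaches itself via ->*."""
    return all(h not in _reachable(h, edges) for h, _ in edges)

def single_head(edges):
    """No word appears as a dependent of two different heads."""
    deps = [d for _, d in edges]
    return len(deps) == len(set(deps))

def projective(edges):
    """i -> j implies i ->* k for every k strictly between i and j."""
    return all(k in _reachable(h, edges)
               for h, d in edges
               for k in range(min(h, d) + 1, max(h, d)))

tree = {(0, 2), (2, 1), (2, 4), (4, 3)}       # a projective tree
crossing = {(0, 2), (2, 4), (2, 3), (4, 1)}   # arc 4->1 crosses 2->3
print(connected(tree), acyclic(tree), single_head(tree), projective(tree))
print(projective(crossing))
```
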
The NLTK package, a set of libraries and prewritten programs for statistical analysis and natural language processing of human language, is commonly used for dependency parsing.
Parsing Methods
Deterministic Parsing
In natural language processing, deterministic parsing refers to parsing algorithms that do not backtrack. LR parsers are an example.
Minimum Spanning Tree-based
Start with a fully connected directed graph in which every edge carries a score. For each node, greedily select the highest-scoring incoming edge, and subtract that score from all of the node's incoming edges. If the selected edges contain a cycle, contract the cycle into a single node and recursively compute the MST; otherwise, the selection already forms the spanning tree.
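This procedure is the Chu-Liu/Edmonds algorithm. As a minimal sketch, the greedy first step can be written in Python as follows; the score matrix is a made-up toy example, and the cycle-contraction and recursion steps are deliberately omitted.

```python
# Toy edge scores: scores[h][d] is the score of head h governing
# dependent d. Node 0 is the artificial root.
scores = {
    0: {1: 5, 2: 9, 3: 1},
    1: {2: 3, 3: 8},
    2: {1: 6, 3: 2},
    3: {1: 2, 2: 4},
}

def greedy_heads(scores):
    """For every non-root node, pick the head with the best incoming score."""
    nodes = {d for hs in scores.values() for d in hs}
    return {d: max(scores, key=lambda h: scores[h].get(d, float("-inf")))
            for d in nodes}

def find_cycle(heads):
    """Return a set of nodes forming a cycle in the head map, or None."""
    for start in heads:
        seen, n = [], start
        while n in heads and n not in seen:
            seen.append(n)
            n = heads[n]
        if n in seen:
            return set(seen[seen.index(n):])
    return None

heads = greedy_heads(scores)
print(heads)              # best head for each dependent
print(find_cycle(heads))  # None means the greedy choice is already a tree
```

When `find_cycle` does return a cycle, the full algorithm would contract those nodes into one, adjust the incoming scores, and recurse.
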
NLTK supports dependency parsing in several ways:
1. Probabilistic, projective dependency parser- This parser predicts new sentences using human language data gathered from hand-parsed sentences. Such parsers are not very efficient, make mistakes, and work with a limited collection of data.
2. Stanford parser- The Stanford parser is a Java-based language parser that supports several languages, including English, Chinese, German, and Arabic.
FAQs
1. Define Dependency Parsing.
Dependency parsing is the process of analysing the grammatical structure of a sentence, finding related words, and determining the type of relationship between them.
2. Define a parse tree.
A parse tree represents a sentence as a tree-like structure. The leaves of the parse tree hold the POS tags that correspond to the words of the sentence, and the rest of the tree shows how these words combine to form the entire sentence.
3. Define dependency structures.
Dependency structures connect words in a sentence by putting arrows between the words. Arrows show relationships between the phrases and establish some grammatical relations.
4. What is transition-based parsing?
A transition-based parser makes deterministic, greedy choices among local transitions, guided by a good classifier.
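As a toy illustration, here is a minimal arc-standard sketch in Python: a stack, a buffer, and local SHIFT/LEFT-ARC/RIGHT-ARC transitions. A real parser would choose each transition with a trained classifier; here the sequence is hard-coded for the example sentence.

```python
def parse(words, transitions):
    """Apply a fixed sequence of arc-standard transitions and return
    the resulting arcs as (head, dependent) word-index pairs."""
    stack, buffer, arcs = [], list(range(len(words))), []
    for t in transitions:
        if t == "SHIFT":               # move next word onto the stack
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":          # second-top is a dependent of top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == "RIGHT-ARC":         # top is a dependent of second-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

words = ["John", "saw", "a", "dog"]
# Target analysis: "John" <- "saw" -> "dog" -> "a"
seq = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "SHIFT",
       "LEFT-ARC", "RIGHT-ARC"]
print(parse(words, seq))  # [(1, 0), (3, 2), (1, 3)]
```
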
5. Define context-free grammar.
We can represent a context-free grammar (CFG) as a four-tuple:
G = (N, Σ, P, S)
N is a finite set of nonterminal symbols.
Σ is a finite set of terminal symbols (tokens).
P is a finite set of production rules.
S ∈ N is the start symbol.
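As an illustration, a toy grammar can be written out as this four-tuple in Python; the grammar and the `derive` helper are invented for this example.

```python
# G = (N, Sigma, P, S) for a tiny English fragment.
N = {"S", "NP", "VP"}                  # nonterminals
Sigma = {"John", "saw", "a", "dog"}    # terminals (tokens)
P = {                                  # production rules
    "S":  [["NP", "VP"]],
    "NP": [["John"], ["a", "dog"]],
    "VP": [["saw", "NP"]],
}
S = "S"                                # start symbol

def derive(symbols):
    """Expand the first rule of every nonterminal, left to right,
    yielding one canonical derivation from the grammar."""
    out = []
    for s in symbols:
        if s in N:
            out.extend(derive(P[s][0]))
        else:
            out.append(s)
    return out

print(derive([S]))  # ['John', 'saw', 'John']
```
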
Key Takeaways
This blog covered dependency structures, dependency graphs, and various parsing methods for dependency parsing. Dependency parsing is widely used in machine learning, and its applications are increasing day by day.
Further Readings
CYK Algorithm
Restricted Boltzmann Machine on MNIST Dataset
Has-A Relationship in Java
Chomsky Normal Forms
Recursive Descent Parser