Table of contents
1. Introduction
2. Parsing Tree
3. Dependency Structures
4. Dependency Graph
5. Parsing Methods
6. FAQs
7. Key Takeaways
Last Updated: Mar 27, 2024

Dependency Parsing

Author Rajkeshav

Introduction


Dependency parsing has become popular in recent years, but its origins are much older. Dependency Parsing is the process of analysing the grammatical structure of a sentence to find related words and the type of relationship between them.

Based on these relationships, a sentence is divided into several units. The process rests on the assumption that there is a direct link between pairs of linguistic units in a sentence. These links are called dependencies.


Here we can see the connection between car and black: black modifies the meaning of car. Car acts as the head, and black depends on it. The relationship is labelled amod, which stands for "adjectival modifier": an adjective phrase that modifies a noun.
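To make the head-dependent idea concrete, here is a minimal Python sketch. The triples are hand-written for illustration, not the output of a real parser:

```python
# Dependency relations represented as (head, relation, dependent) triples.
# These triples are invented for illustration, not parser output.
dependencies = [
    ("car", "det", "the"),
    ("car", "amod", "black"),  # "black" modifies "car" (adjectival modifier)
]

def dependents_of(head, relations):
    """Return the (relation, dependent) pairs governed by the given head."""
    return [(rel, dep) for h, rel, dep in relations if h == head]

print(dependents_of("car", dependencies))  # [('det', 'the'), ('amod', 'black')]
```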

Parsing Tree


A parse tree represents the sentence as a tree-like structure. The leaves of the parse tree hold the words of the sentence, each labelled with its POS tag, and the rest of the tree shows how these words combine to form the entire sentence.

We can combine an adjective and a noun to form a Noun Phrase (the red shirt), and attach another adjective to form a larger Noun Phrase (e.g., the new red shirt).
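As a rough sketch, such a tree can be modelled with nested tuples. The bracketing below is one plausible analysis of "the new red shirt", chosen purely for illustration:

```python
# A parse tree as nested tuples: (label, children...); a leaf is (POS, word).
tree = ("NP",
        ("Det", "the"),
        ("Adj", "new"),
        ("NP",
         ("Adj", "red"),
         ("N", "shirt")))

def leaves(t):
    """Collect the words at the leaves, left to right."""
    _label, *children = t
    if len(children) == 1 and isinstance(children[0], str):
        return [children[0]]          # a (POS, word) leaf
    return [w for child in children for w in leaves(child)]

print(leaves(tree))  # ['the', 'new', 'red', 'shirt']
```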

Dependency Structures

Dependency structures connect the words in a sentence with arrows. Each arrow shows the relationship between two words and establishes a grammatical relation between them.

An arrow connects a head (also called governor, superior, or regent) with a dependent (modifier, inferior, subordinate). Usually, the dependencies form a tree.

Criteria for heads and dependents

Here are the criteria for a syntactic relation between a head H and a dependent D in a construction C:

  • H determines the syntactic category of C; that is, H can replace C.
  • D specifies H.
  • H is obligatory, while D may be optional.
  • H selects D and determines whether D is obligatory or optional.
  • The form of D depends on H (agreement or government).
  • The linear position of D is specified with reference to H.

Dependency Graph

A dependency structure is a directed graph G consisting of a set V of nodes and a set A of edges (arcs). In a labelled dependency graph, the nodes in V are labelled with word forms and annotations, and the edges in A are labelled with dependency types.

Notational convention

  • An edge (wi, d, wj) links the head wi to the dependent wj with label d
  • wi → wj ⇔ (wi, d, wj) ∈ A
  • i → j ≡ (i, j) ∈ A
  • i →* j ≡ i = j ∨ ∃k : i → k ∧ k →* j

Formal conditions on Dependency graphs

  • G is connected: for every node i, there is a node j such that i → j or j → i.
  • G is acyclic: if i → j, then not j →* i.
  • G obeys the single-head constraint: if i → j, then not k → j for any k ≠ i.
  • G is projective: if i → j, then i →* k for every k that lies between i and j.
  • Connectedness: the syntactic structure is complete.
  • Acyclicity: the syntactic structure is hierarchical.
  • Single head: every word has at most one syntactic head.
  • Projectivity: no crossing of dependencies.
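These four conditions can be checked directly over a set of (head, dependent) edges. The sketch below indexes words by linear position, with 0 as an artificial root; the edge sets are invented for illustration:

```python
# Check the formal conditions on a dependency graph given as a set of
# (head, dependent) edges over word positions, with 0 as the artificial root.

def is_single_head(edges):
    """Every word has at most one head."""
    deps = [d for _, d in edges]
    return len(deps) == len(set(deps))

def is_acyclic(edges):
    """No node is an ancestor of itself (assumes single-head holds)."""
    head = {d: h for h, d in edges}
    for start in head:
        seen, v = set(), start
        while v in head:
            if v in seen:
                return False
            seen.add(v)
            v = head[v]
    return True

def is_connected(edges):
    """Every node is reachable from every other, ignoring edge direction."""
    nodes = {n for e in edges for n in e}
    if not nodes:
        return True
    frontier, seen = [next(iter(nodes))], set()
    while frontier:
        v = frontier.pop()
        if v in seen:
            continue
        seen.add(v)
        frontier += [h for h, d in edges if d == v]
        frontier += [d for h, d in edges if h == v]
    return seen == nodes

def descendants(i, edges):
    """All j such that i ->* j (reflexive-transitive closure)."""
    out, frontier = {i}, [i]
    while frontier:
        h = frontier.pop()
        for hh, d in edges:
            if hh == h and d not in out:
                out.add(d)
                frontier.append(d)
    return out

def is_projective(edges):
    """For every edge i -> j, i ->* k for all k strictly between i and j."""
    return all(k in descendants(i, edges)
               for i, j in edges
               for k in range(min(i, j) + 1, max(i, j)))

# Example: ROOT=0, the=1, black=2, car=3, with car as the sentence head.
tree_edges = {(0, 3), (3, 1), (3, 2)}
print(is_connected(tree_edges), is_acyclic(tree_edges),
      is_single_head(tree_edges), is_projective(tree_edges))
```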

The NLTK package is a set of libraries and programs for statistical natural language processing of human language, and it can be used for dependency parsing.

Parsing Methods

Deterministic Parsing

In natural language processing, deterministic parsing refers to algorithms that do not backtrack. LR-parsers are an example.
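A common deterministic scheme in dependency parsing is transition-based (arc-standard) parsing: shift words onto a stack and greedily attach the top two items. In the sketch below, the classifier that would normally pick the next transition is replaced by an oracle that consults a gold head table; this is a toy setup, not a trained parser:

```python
# Arc-standard transition parsing with an oracle in place of a classifier.
# gold_head maps each word position (1..n) to its head position (0 = root).

def arc_standard_parse(n_words, gold_head):
    stack, buffer, arcs = [0], list(range(1, n_words + 1)), set()

    def has_all_deps(w):
        # A word may only be attached once all its own dependents are attached.
        return all((w, d) in arcs
                   for d in range(1, n_words + 1) if gold_head[d] == w)

    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            top, below = stack[-1], stack[-2]
            if gold_head.get(below) == top:                        # LEFT-ARC
                arcs.add((top, below))
                stack.pop(-2)
                continue
            if gold_head.get(top) == below and has_all_deps(top):  # RIGHT-ARC
                arcs.add((below, top))
                stack.pop()
                continue
        if not buffer:                                             # no action applies
            raise ValueError("input tree is not projective")
        stack.append(buffer.pop(0))                                # SHIFT

    return arcs

# "the black car" with heads: the -> car, black -> car, car -> ROOT
print(arc_standard_parse(3, {1: 3, 2: 3, 3: 0}))
```

The parser never backtracks: each step commits to exactly one transition, which is what makes the scheme deterministic.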

Minimum Spanning Tree-based

Start with a fully connected directed graph in which every candidate edge has a score, and find the maximum spanning tree. For each word, select the highest-scoring incoming edge; if a cycle appears, subtract the score of each cycle node's chosen edge from its other incoming edges, contract the cycle into a single node, and recursively compute the MST of the contracted graph (the Chu-Liu/Edmonds algorithm).
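The contraction step can be made concrete. Below is a compact sketch of the Chu-Liu/Edmonds maximum spanning arborescence algorithm over a score table; the scores in the example are invented, whereas a real parser would learn them:

```python
# Chu-Liu/Edmonds maximum spanning arborescence over (head, dependent) scores.

def _find_cycle(head):
    """Return a list of nodes forming a cycle in the head map, or None."""
    for start in head:
        seen, v = set(), start
        while v in head and v not in seen:
            seen.add(v)
            v = head[v]
        if v in head and v in seen:
            cycle, u = [v], head[v]
            while u != v:
                cycle.append(u)
                u = head[u]
            return cycle
    return None

def max_spanning_arborescence(scores, root=0):
    """scores maps (head, dependent) -> weight; returns {dependent: head}."""
    deps = {d for (_, d) in scores if d != root}
    # 1. Greedily pick the highest-scoring incoming edge for every word.
    best = {d: max((h for (h, dd) in scores if dd == d),
                   key=lambda h, d=d: scores[(h, d)])
            for d in deps}
    cycle = _find_cycle(best)
    if cycle is None:
        return best
    # 2. Contract the cycle into a fresh node, adjusting entering-edge scores.
    c = max(deps | {root}) + 1
    in_cycle = set(cycle)
    new_scores, origin = {}, {}
    for (h, d), s in scores.items():
        if h in in_cycle and d in in_cycle:
            continue
        nh = c if h in in_cycle else h
        nd = c if d in in_cycle else d
        ns = s - scores[(best[d], d)] if d in in_cycle else s
        if (nh, nd) not in new_scores or ns > new_scores[(nh, nd)]:
            new_scores[(nh, nd)] = ns
            origin[(nh, nd)] = (h, d)
    # 3. Recurse on the contracted graph, then expand the cycle.
    sub = max_spanning_arborescence(new_scores, root)
    result = {}
    for d, h in sub.items():
        oh, od = origin[(h, d)]
        if d == c:                      # the edge that breaks the cycle
            result[od] = oh
            for u in cycle:
                if u != od:
                    result[u] = best[u]
        else:
            result[od] = oh
    return result

# Toy scores: greedy picks form the cycle 1 <-> 2, which must be broken.
heads = max_spanning_arborescence({(0, 1): 5, (0, 2): 1, (1, 2): 10, (2, 1): 20})
print(heads)  # word 1's head is 2, word 2's head is the root
```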

We can use NLTK for dependency parsing in several ways.

1. Probabilistic, projective dependency parser- This parser predicts new sentences using human language data gathered from hand-parsed sentences. Such parsers are not always accurate: they make mistakes and work with a limited collection of training data.

2. Stanford parser- Stanford parser is a Java-based language parser. The Stanford parser supports several languages, including English, Chinese, German, and Arabic.



FAQs

1. Define Dependency Parsing.

Dependency Parsing is the process of analysing the grammatical structure in a sentence and finding out related words and the type of the relationship between them.

2. Define a parsing tree.

A parse tree represents the sentence as a tree-like structure. The leaves of the parse tree hold the words of the sentence, each labelled with its POS tag, and the rest of the tree shows how these words combine to form the entire sentence.

3. Define dependency structures.

Dependency structures connect words in a sentence by putting arrows between the words. Arrows show relationships between the phrases and establish some grammatical relations.

4. What is transition-based parsing?

A transition-based parser makes a deterministic, greedy choice of local transitions, guided by a classifier.

5. Define context-free grammar.

We can represent a Context-Free Grammar (CFG) as a 4-tuple:

G = (N, Σ, P, S)

N is a finite set of nonterminal symbols.

Σ is a finite set of terminal symbols (tokens).

P is a finite set of production rules.

S ∈ N is the start symbol.
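As an illustration, the four components can be written out for a toy grammar; the grammar itself is invented for this example:

```python
# A toy CFG written as the 4-tuple G = (N, Sigma, P, S).
N = {"S", "NP", "VP", "Det", "Adj", "Noun", "V"}   # nonterminals
SIGMA = {"the", "black", "car", "stops"}           # terminals
P = {                                              # production rules
    "S": [["NP", "VP"]],
    "NP": [["Det", "Noun"], ["Det", "Adj", "Noun"]],
    "VP": [["V"]],
    "Det": [["the"]],
    "Adj": [["black"]],
    "Noun": [["car"]],
    "V": [["stops"]],
}
S = "S"                                            # start symbol
G = (N, SIGMA, P, S)

def derive_first(symbol):
    """Expand every nonterminal by its first rule (one sample derivation)."""
    if symbol in SIGMA:
        return [symbol]
    return [w for child in P[symbol][0] for w in derive_first(child)]

print(" ".join(derive_first(S)))  # the car stops
```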

Key Takeaways

In this blog, we learned about dependency structures, dependency graphs, and various parsing methods for dependency parsing. Dependency parsing is widely used in Machine Learning, and its applications are growing day by day.

Further readings-

CYK Algorithm

Restricted Boltzmann Machine on MNIST Dataset

Has-A Relationship in Java

Chomsky Normal Forms

Recursive Descent Parser
