Tetsuya Hoya (Author)<br>Syntactic Networks—Kernel Memory Approach (Studies in Computational Intelligence)

  • Binding: Hardcover / 129 pp.
  • Language: English
  • Product code: 9783031573118

Full Description

This book proposes a novel connectionist approach to the challenging topic of language modeling within the context of kernel memory and the artificial mind system, both proposed previously by the author in the very first volume of the series, Artificial Mind System—Kernel Memory Approach (Studies in Computational Intelligence, Vol. 1). The present volume focuses on how syntactic structures of language are modeled in terms of composite connectionist architectures, each embracing both a nonsymbolic and a symbolic part. These two parts are developed via inter-module processes within the artificial mind system and eventually integrated under the unified framework of kernel memory.

The data representation by the networks embodied within the kernel memory principle is essentially local, unlike conventional artificial neural network models such as the pervasive multilayer perceptron. This locality principle gives kernel memory many attractive features: topologically unconstrained network formation; straightforward network growing, shrinking, and reconfiguration; no arduous iterative parameter tuning; transparent and hierarchical data structures; and multimodal and temporal data processing via the network representation.

Exploiting these multifaceted properties of kernel memory, interwoven with the notion of inter-module processing within the artificial mind system, provides coherent accounts of concept formation and of how various linguistic phenomena, viz. word compounding, morphology, and multiword constructions, are modeled. The description is then extended to more intricate network models of a context-dependent lexical network and syntax-oriented processing, the latter being the central theme of the present study, and further to models representing a hybrid of nonverbal and verbal thinking, as well as the semantic and pragmatic aspects of sentential meaning.
The book is intended for general readers engaged in cognitive science, computer science, engineering, linguistics, philosophy, psycholinguistics, and psychology.
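The locality principle described above can be illustrated with a toy localist network in the spirit of a probabilistic neural network (PNN), where each stored pattern is its own Gaussian kernel node. This is only a sketch of the general idea, not the author's actual kernel memory model; the class name `LocalistKernelNet` and the `sigma` parameter are illustrative assumptions.

```python
import numpy as np

class LocalistKernelNet:
    """Toy localist network: every stored pattern is a separate
    Gaussian kernel node, so the representation is local rather
    than distributed across shared weights as in an MLP."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma
        self.nodes = []  # list of (pattern_vector, class_label)

    def grow(self, pattern, label):
        # Growing the network is just appending a node; no
        # iterative re-tuning of existing parameters is needed.
        self.nodes.append((np.asarray(pattern, dtype=float), label))

    def shrink(self, index):
        # Removing one node leaves all other nodes intact.
        self.nodes.pop(index)

    def classify(self, x):
        # Sum the Gaussian kernel activations per class and
        # return the class with the largest total activation.
        x = np.asarray(x, dtype=float)
        scores = {}
        for p, label in self.nodes:
            k = np.exp(-np.sum((x - p) ** 2) / (2 * self.sigma ** 2))
            scores[label] = scores.get(label, 0.0) + k
        return max(scores, key=scores.get)

net = LocalistKernelNet(sigma=0.5)
net.grow([0.0, 0.0], "A")
net.grow([1.0, 1.0], "B")
print(net.classify([0.1, -0.1]))  # nearest kernel belongs to class "A"
```

The contrast with multilayer perceptrons mentioned in the blurb shows up directly: adding or deleting a kernel node is a single list operation, whereas an MLP would typically require retraining its shared weights.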

Contents

  • Review of the Two Existing Artificial Neural Network Models - Multilayer Perceptron and Probabilistic Neural Networks
  • Beyond the Original PNN Model - Kernel Memory for Modeling Various Neural Pattern Processing Mechanism
  • Modules within the Artificial Mind System and Their Interactions Relevant to Language Pattern Processing
  • Concept Formation