Provides download links for ebooks on various topics, such as the design of programming languages, the theory of programming languages, language features, proof and logic of programming languages, syntax and semantics of programming languages, functional languages, computer language history, automata theory, context-free grammars, and more.

Programming Languages: History, Utilities, Tutorials and Ebooks.

The following are useful pages available on the web covering computer programming language history, along with tutorials, ebooks, and other useful information.

Implementing functional languages: a tutorial

By Simon Peyton Jones and David Lester

This book gives a practical approach to understanding implementations of non-strict functional languages using lazy graph reduction. The book is intended to be a source of practical labwork material, to help make functional-language implementations `come alive', by helping the reader to develop, modify and experiment with some non-trivial compilers.

The unusual aspect of the book is that it is meant to be executed as well as read. Rather than merely presenting an abstract description of each implementation technique, we present the code for a complete working prototype of each major method, and then work through a series of improvements to it. All of the code is available in machine-readable form.

The principal content of the book is a series of implementations of a small functional language called the Core language. The Core language is designed to be as small as possible, so that it is easy to implement, but still rich enough to allow modern non-strict functional languages to be translated into it without losing efficiency. It is described in detail in Chapter 1, in which we also develop a parser and pretty-printer for the Core language.

Appendix B contains a selection of Core-language programs for use as test programs throughout the book. The main body of the book consists of four distinct implementations of the Core language.
  • Chapter 2 describes the most direct implementation, based on template instantiation.
  • Chapter 3 introduces the G-machine, and shows how to compile the program to sequences of instructions (G-code) which can be further translated to machine code.
  • Chapter 4 repeats the same exercise for a different abstract machine, the Three Instruction Machine (TIM), whose evaluation model is very different from that of the G-machine. The TIM was developed more recently than the G-machine, so there is much less other literature about it. Chapter 4 therefore contains a rather more detailed development of the TIM's evaluation model than that given for the G-machine.
  • Finally, Chapter 5 adds a new dimension by showing how to compile functional programs for a parallel G-machine.
For each of these implementations we discuss two main parts, the compiler and the machine interpreter. The compiler takes a Core-language program and translates it into a form suitable for execution by the machine interpreter.

The machine interpreter simulates the execution of the compiled program. In each case the interpreter is modelled as a state transition system so that there is a very clear connection between the machine interpreter and a `real' implementation.
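The state-transition style of interpreter can be sketched in miniature. The Python fragment below is my own illustration of the modelling style, not one of the book's machines: the state of a tiny stack machine is a (code, stack) pair, and a `step` function maps each state to its successor until no instructions remain.

```python
# Illustrative sketch of an interpreter as a state transition system.
# State = (code, stack); `step` performs exactly one transition.

def step(state):
    (op, *rest), stack = state
    if op[0] == "PUSH":
        return rest, [op[1]] + stack        # push a literal value
    if op[0] == "ADD":
        a, b, *s = stack
        return rest, [a + b] + s            # replace top two values by their sum
    raise ValueError(f"unknown instruction: {op}")

def run(code):
    state = (code, [])
    while state[0]:                         # transition until the code is exhausted
        state = step(state)
    return state[1][0]                      # the result is on top of the stack
```

Because each transition is an explicit state-to-state function, the correspondence with a "real" machine stepping through its instruction stream is direct, which is exactly the clarity the book is after.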

One important way in which the Core language is restrictive is in its lack of local function definitions. There is a well-known transformation, called lambda lifting, which turns local function definitions into global ones, thus enabling local function definitions to be written freely and transformed out later. In Chapter 6 we develop a suitable lambda lifter. This chapter is more than just a re-presentation of standard material. Full laziness is a property of functional programs which had previously been seen as inseparable from lambda lifting. In Chapter 6 we show that they are in fact quite distinct, and show how to implement full laziness in a separate pass from lambda lifting.
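As a hand-worked illustration of the transformation (my own sketch in Python, not the book's lambda lifter), lambda lifting turns a local definition with a free variable into a global one by passing the free variable as an extra parameter:

```python
# Before lambda lifting: `add_n` is a local definition whose body
# mentions the free variable `n` from the enclosing scope.
def add_all_local(n, xs):
    def add_n(x):
        return x + n
    return [add_n(x) for x in xs]

# After lambda lifting: the free variable `n` becomes an extra
# parameter, so the function can be defined at the top level.
def add_n_lifted(n, x):
    return x + n

def add_all_lifted(n, xs):
    return [add_n_lifted(n, x) for x in xs]
```

The two versions compute the same results; the lifted form simply has no local function definitions, which is the shape the Core language requires.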

Denotational Semantics: A Methodology for Language Development

By David Schmidt

Denotational semantics is a methodology for giving mathematical meaning to programming languages and systems. It was developed by Christopher Strachey’s Programming Research Group at Oxford University in the 1960s. The method combines mathematical rigor, due to the work of Dana Scott, with notational elegance, due to Strachey. Originally used as an analysis tool, denotational semantics has grown in use as a tool for language design and implementation. This book was written to make denotational semantics accessible to a wider audience and to update existing texts in the area. I have presented the topic from an engineering viewpoint, emphasizing the descriptional and implementational aspects. The relevant mathematics is also included, for it gives rigor and validity to the method and provides a foundation for further research.

The book is intended as a tutorial for computing professionals and as a text for university courses at the upper undergraduate or beginning graduate level. The reader should be acquainted with discrete structures and one or more general purpose programming languages. Experience with an applicative-style language such as LISP, ML, or Scheme is also helpful.

The Introduction and Chapters 1 through 7 form the core of the book. The Introduction provides motivation and a brief survey of semantics specification methods. Chapter 1 introduces BNF, abstract syntax, and structural induction. Chapter 2 lists those concepts of set theory that are relevant to semantic domain theory. Chapter 3 covers semantic domains, the value sets used in denotational semantics. The fundamental domains and their related operations are presented. Chapter 4 introduces basic denotational semantics. Chapter 5 covers the semantics of computer storage and assignment as found in conventional imperative languages. Nontraditional methods of store evaluation are also considered. Chapter 6 presents least fixed point semantics, which is used for determining the meaning of iterative and recursive definitions. The related semantic domain theory is expanded to include complete partial orderings; ‘‘predomains’’ (complete partial orderings less ‘‘bottom’’ elements) are used. Chapter 7 covers block structure and data structures. Chapters 8 through 12 present advanced topics. Tennent’s analysis of procedural abstraction and general binding mechanisms is used as a focal point for Chapter 8. Chapter 9 analyzes forms of imperative control and branching. Chapter 10 surveys techniques for converting a denotational definition into a computer implementation. Chapter 11 contains an overview of Scott’s inverse limit construction for building recursively defined domains. Chapter 12 closes the book with an introduction to methods for understanding nondeterminism and concurrency.
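As a rough illustration of the least fixed point idea mentioned above (my own sketch, not an example from the book), the meaning of a loop such as `while x > 0: x = x - 1` can be approximated by its finite unrollings; the n-th approximation is defined only on inputs that terminate within n iterations, and the least fixed point is the limit of these approximations.

```python
# "Bottom" (the undefined element of the semantic domain) is modelled as None.
BOTTOM = None

def approx(n):
    """n-th finite unrolling of the loop `while x > 0: x = x - 1`."""
    def f(x, depth=n):
        if depth == 0:
            return BOTTOM              # ran out of unrollings: undefined here
        if x <= 0:
            return x                   # loop guard fails: exit with final state
        return f(x - 1, depth - 1)     # one iteration, then the rest
    return f
```

Each approximation agrees with the previous one wherever the previous one is defined, and is defined on strictly more inputs, mirroring the chain whose least upper bound gives the loop's meaning.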

Throughout the book I have consistently abused the noun ‘‘access,’’ treating it as a verb. Also, ‘‘iff’’ abbreviates the phrase ‘‘if and only if.’’

Categories, Types and Structures: An Introduction to Category Theory for the Working Computer Scientist

By Andrea Asperti and Giuseppe Longo

The main methodological connection between programming language theory and category theory is the fact that both theories are essentially “theories of functions.” A crucial point, though, is that the categorical notion of morphism generalizes the set-theoretical description of function in a very broad sense, which provides a unified understanding of various aspects of the theory of programs. This is one of the reasons for the increasing role of category theory in the semantic investigation of programs if compared, say, to the set-theoretic approach. However, the influence of this mathematical discipline on computer science goes beyond the methodological issue, as the categorical approach to mathematical formalization seems to be suitable for focusing concerns in many different areas of computer science, such as software engineering and artificial intelligence, as well as automata theory and other theoretical aspects of computation.

This book is mostly inspired by this specific methodological connection and its applications to the theory of programming languages. More precisely, as expressed by the subtitle, it aims at a self-contained introduction to general category theory (part I) and at a categorical understanding of the mathematical structures that constituted, in the last twenty or so years, the theoretical background of relevant areas of language design (part II). The impact on functional programming, for example, of the mathematical tools described in part II is well known, as it ranges from the early dialects of Lisp, to Edinburgh ML, to the current work in polymorphism and modularity. Recent applications, such as CAML, which will be described, use categorical formalization for the purposes of implementation.

In addition to its direct relevance to theoretical knowledge and current applications, category theory is often used as an (implicit) mathematical jargon rather than for its explicit notions and results. Indeed, category theory may prove useful in construction of a sound, unifying mathematical environment, one of the purposes of theoretical investigation. As we have all probably experienced, it is good to know in which “category” one is working, i.e., which are the acceptable morphisms and
constructions, and the language of categories may provide a powerful standardization of methods and language. In other words, many different formalisms and structures may be proposed for what is essentially the same concept; the categorical language and approach may simplify through abstraction, display the generality of concepts, and help to formulate uniform definitions. This has been the case, for example, in the early applications of category theory to algebraic geometry.

The first part of this book should encourage even the reader with no specific interest in programming language theory to acquire at least some familiarity with the categorical way of looking at formal descriptions. The explicit use of deeper facts is a further step, which becomes easier with access to this information. Part II and some chapters in part I are meant to take this further step, at least in one of the possible directions, namely the mathematical semantics of data types and programs as objects and morphisms of categories.

Syntax and Semantics of Programming Languages

By Ken Slonneger and Barry L. Kurtz

This text developed out of our experiences teaching courses covering the formal semantics of programming languages. Independently we both developed laboratory exercises implementing small programming languages in Prolog following denotational definitions. Prolog proved to be an excellent tool for illustrating the formal semantics of programming languages. We found that these laboratory exercises were highly successful in motivating students since the hands-on experience helped demystify the study of formal semantics. At a professional meeting we became aware of each other’s experiences with a laboratory approach to semantics, and this book evolved from that conference.

Although this text has been carefully written so that the laboratory activities can be omitted without loss of continuity, we hope that most readers will try the laboratory approach and experience the same success that we have observed in our classes.

We have pursued a broad spectrum of definitional techniques, illustrated with numerous examples. Although the specification methods are formal, the presentation is “gentle”, providing just enough in the way of mathematical underpinnings to produce an understanding of the metalanguages. We hope to supply enough coverage of mathematics and formal methods to justify the definitional techniques, but the text is accessible to students with a basic grounding in discrete mathematics as presented to undergraduate computer science students.

There has been a tendency in the area of formal semantics to create cryptic, overly concise semantic definitions that intimidate students new to the study of programming languages. The emphasis in this text is on clear notational conventions with the goals of readability and understandability foremost in our minds.

The following are a few of the topics covered in this programming language book.
  • Specifying Syntax
  • Grammars and BNF
  • The programming language Wren
  • Variants of BNF
  • Abstract Syntax
  • Scanning
  • Logic Grammars
  • Parsing Wren
  • Attribute Grammars
  • Two-Level Grammars
  • The Lambda Calculus
  • Definition of Programming Languages
  • Translational Semantics
  • Traditional Operational Semantics
  • Denotational Semantics
  • Domain Theory and Fixed-Point Semantics
  • Axiomatic Semantics
  • Algebraic Semantics
  • Action Semantics
  • Logic Programming with Prolog
  • Functional Programming with Scheme
Read More/Download

Language Translation Using PCCTS and C++: A Reference Guide

By Terence John Parr

A few years ago, I implemented a programming language called NewtonScript, the application development language for the Newton(R) operating system. You may not have heard of NewtonScript, but you’ve probably heard of the tool I used to implement it: a crusty old thing called YACC.

YACC--like the C language, Huffman coding, and the QWERTY keyboard--is an example of a standard engineering tool that is standard because it was the first "80% solution". YACC opened up parsing to the average programmer. Writing a parser for a "little language" using YACC was vastly simpler than writing one by hand, which made YACC quite successful. In fact, it was so successful, progress on alternative parsing tools just about stopped.

Not everybody adopted YACC, of course. There were those who needed something better. A lot of serious compiler hackers stuck with hand-coded LL parsers, to get maximum power and flexibility. In many cases, they had to, because languages got more and more complicated--LALR just wasn’t good enough without lots of weird hacks. Of course, these people had to forego the advantages of using a parser generator.
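A hand-coded LL parser of the kind mentioned above can be sketched in a few lines. The toy grammar and Python code below are my own illustration (unrelated to PCCTS or any generated parser): `expr -> term ('+' term)*` and `term -> NUMBER`, parsed by one function per nonterminal, deciding what to do by looking at the next token.

```python
# Minimal hand-coded LL(1) recursive-descent parser for sums of numbers.
# Each grammar nonterminal becomes one function; the single token of
# lookahead (tokens[pos]) decides which production to take.

def parse_expr(tokens, pos=0):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        rhs, pos = parse_term(tokens, pos + 1)   # consume '+' then a term
        value += rhs
    return value, pos

def parse_term(tokens, pos):
    tok = tokens[pos]
    if not tok.isdigit():
        raise SyntaxError(f"expected number, got {tok!r}")
    return int(tok), pos + 1
```

For a grammar this simple the hand-written parser is short and transparent; the trade-offs the text describes appear only as grammars grow beyond what one token of lookahead can disambiguate.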

So if your language is simple, you use YACC. If your language is too complex, or if you want good error recovery, or if performance is critical, you write a parser from scratch. This has been the status quo for about 20 years.

Terence Parr and PCCTS have the potential to jolt us out of this situation. First, Terence pursued and formalized a new parsing strategy, called predicated LL(k), that combines the robustness and intelligibility of LL with the generality of LALR. Second, he implemented a parser generator, called ANTLR, that makes this power easy to use. Even the dedicated hand-coders may change their minds after a close look at this stuff. Finally, for those situations where you need to traverse a parse tree (and who doesn't?), SORCERER applies the ANTLR philosophy to that problem.

The result is a tool set that I think deserves to take over from YACC and LEX as the default answer to any parsing problem. And as Terence and others point out, a lot of problems are parsing problems.

Finally, let me mention that PCCTS is a tool with a face. Although it’s in the public domain, it’s actively supported by the tireless Terence Parr, as well as the large and helpful community of users who hang out on comp.compilers.tools.pccts. This book will help the PCCTS community to grow and prosper, so that one day predicated LL(k) will rule, YACC will be relegated to the history books, and Terence will finally achieve his goal of world domination.


Programming Languages: Application and Interpretation

by Shriram Krishnamurthi

The book is the textbook for the programming languages course at Brown University, which is taken primarily by third- and fourth-year undergraduates and beginning graduate (both MS and PhD) students. It seems very accessible to smart second-year students too, and indeed those are some of my most successful students. The book has been used at over a dozen other universities as a primary or secondary text. The book’s material is worth one undergraduate course of credit.

This book is the fruit of a vision for teaching programming languages by integrating the “two cultures” that have evolved in its pedagogy. One culture is based on interpreters, while the other emphasizes a survey of languages. Each approach has significant advantages but also huge drawbacks. The interpreter method writes programs to learn concepts, and has at its heart the fundamental belief that by teaching the computer to execute a concept we more thoroughly learn it ourselves. While this reasoning is internally consistent, it fails to recognize that understanding definitions does not imply we understand the consequences of those definitions. For instance, the difference between strict and lazy evaluation, or between static and dynamic scope, is only a few lines of interpreter code, but the consequences of these choices are enormous. The survey-of-languages school is better suited to understanding these consequences.
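The point that strictness is "only a few lines of interpreter code" can be illustrated with a toy sketch (my own, not the book's interpreters): the strict version evaluates the argument before the call, while the lazy version passes it along as an unevaluated thunk that is forced only if used.

```python
# Toy illustration of strict vs lazy argument passing.
# Arguments are represented as thunks (zero-argument callables).

def apply_strict(fn, arg_expr):
    arg = arg_expr()                 # force the argument before the call
    return fn(lambda: arg)

def apply_lazy(fn, arg_expr):
    return fn(arg_expr)              # pass the unevaluated thunk through

const_one = lambda _arg: 1           # a function that ignores its argument

diverges = lambda: 1 // 0            # "diverging" argument: raises if forced
```

Here `apply_lazy(const_one, diverges)` returns 1, while `apply_strict(const_one, diverges)` fails, even though only one line differs between the two strategies; the behavioural gap is exactly the kind of consequence the survey approach makes vivid.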

The text therefore melds these two approaches. Concretely, students program with a new set of features first, then try to distill those principles into an actual interpreter. This has the following benefits:
  • By seeing the feature in the context of a real language, students can build something interesting with it first, so they understand that it isn’t an entirely theoretical construct, and will actually care to build an interpreter for it. (Relatively few students are excited by interpreters for their own sake, and we have an obligation to appeal to the remainder too.)
  • Students get at least fleeting exposure to multiple languages, which is an important educational attribute that is being crushed by the wide adoption of industrially fashionable languages. (Better still, by experimenting widely, they may come to appreciate that industrial fashions are just that, not the last word in technological progress.) 
  • Because they have already programmed with the feature, the explanations and discussions are much more interesting than when all students have seen is an abstract model. 
  • By first building a mental model for the feature through experience, students have a much better chance of actually discovering how the interpreter is supposed to work.
In short, many more humans learn by induction than by deduction, so a pedagogy that supports it is much more likely to succeed than one that suppresses it. The book currently reflects this design, though the survey parts are done better in lecture than in the book.

Separate from this vision is a goal. My goal is to not only teach students new material, but to also change the way they solve problems. I want to show students where languages come from, why we should regard languages as the ultimate form of abstraction, how to recognize such an evolving abstraction, and how to turn what they recognize into a language. The last section of the book, on domain-specific languages, is a growing step in this direction.

Introduction to Programming Languages

This tutorial introduces the topic of programming languages by discussing five important concepts in a computer language: identifiers, expressions, control structures, input/output, and abstraction. The final lessons illustrate these concepts with an example program implementing the selection sort algorithm. Each lesson includes a set of review questions that test the important concepts from the lesson and provide practice problems.
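For reference, here is a straightforward Python version of selection sort, the algorithm the tutorial builds up to (the tutorial's own example program may differ in language and detail): repeatedly select the smallest remaining element and swap it into place.

```python
def selection_sort(items):
    """Return a sorted copy of `items` using selection sort."""
    items = list(items)                          # work on a copy
    for i in range(len(items) - 1):
        smallest = i
        for j in range(i + 1, len(items)):       # find the minimum of the tail
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items
```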

The following are a few of the topics covered in this programming language guide.
  • Introduction to Programming Languages
  • Identifiers
  • Assignment
  • Expressions
  • Boolean Expressions
  • Data Types
  • Control Structures
  • Input/Output
  • Programs
Read More/Download

Understanding Programming Languages

By M. Ben-Ari 

To say that a good programmer can write good software in any language is like saying that a good pilot can fly any aircraft: true, but irrelevant. A passenger aircraft is designed for safety, comfort and economic viability; a military aircraft is designed for performance and mission capability; an ultralight aircraft is designed for low cost and operational simplicity.

The role of language in programming has been downgraded in favor of software methodology and tools; not just downgraded, but totally repudiated when it is claimed that a well-designed system can be implemented equally well in any language. But programming languages are not just a tool; they furnish the raw material of software, the thing we look at on our screens most of the day. I believe that the programming language is one of the most important, not one of the least important, factors that influence the ultimate quality of a software system. Unfortunately, too many programmers have poor linguistic skills: passionately in love with their ``native'' programming language, they are unable to analyze and compare language constructs, or to understand the advantages and disadvantages of modern languages and language concepts. Too often, one hears statements that demonstrate conceptual confusion: ``Language L1 is more powerful (or more efficient) than language L2''.

This lack of knowledge is a contributing factor to two serious problems in software. The first is the ultra-conservatism that exists in the choice of programming languages. Despite the explosive advances in computer hardware and the sophistication of modern software systems, most programming is still done in languages that were developed about 1970, if not earlier. Extensive research in programming languages is never tested in practice, and software developers are forced to use tools and methodologies to compensate for obsolete language technology. It is as if airlines would refuse to try jet aircraft on the grounds that an old-fashioned propeller aircraft is perfectly capable of getting you from here to there.

The second problem is that language constructs are used indiscriminately, with little or no regard for safety or efficiency. This leads to unreliable software that cannot be maintained, as well as to inefficiencies that are solved by assembly language coding, rather than by refinement of the algorithms and the programming paradigms.

Programming languages exist only for the purpose of bridging the gap in the level of abstraction between the hardware and the real world. There is an inevitable tension between higher levels of abstraction that are easier to understand and safer to use, and lower levels of abstraction that are more flexible and can often be implemented more efficiently. To design or choose a programming language is to select an appropriate level of abstraction, and it is not surprising that different programmers prefer different levels, or that one language may be appropriate for one project and not for another. Within a specific language, a programmer should understand in depth the safety and efficiency implications of each construct in the language.


Read More/Download

Language, Proof and Logic

By Jon Barwise and John Etchemendy

This book is intended to introduce you to some of the most important concepts and tools of logic. Our goal is to provide detailed and systematic answers to the questions raised above. We want you to understand just how the laws of logic follow inevitably from the meanings of the expressions we use to make claims. Convention is crucial in giving meaning to a language, but once the meaning is established, the laws of logic follow inevitably.

More particularly, we have two main aims. The first is to help you learn a new language, the language of first-order logic. The second is to help you learn about the notion of logical consequence, and about how one goes about establishing whether some claim is or is not a logical consequence of other accepted claims. While there is much more to logic than we can even hint at in this book, or than any one person could learn in a lifetime, we can at least cover these most basic of issues.

The following are a few of the topics covered in this book on language, proof, and logic.
  • Propositional Logic
  • The Logic of Atomic Sentences
  • The Boolean Connectives
  • Quantifiers
  • Introduction to Quantification
  • The Logic of Quantifiers
  • Multiple Quantifiers
  • Methods of Proof for Quantifiers
  • Formal Proofs and Quantifiers
  • Applications and Metatheory
  • First-order Set Theory
  • Mathematical Induction
  • Advanced Topics in Propositional Logic
  • Advanced Topics in FOL
  • Completeness and Incompleteness
Read More/download

The Theory of Languages and Computation

By Jean Gallier and Andrew Hicks

This book covers various topics in automata theory, formal languages, computability, and more. The following are a few of the topics explained in this theory of computation book.
  • Automata Notation, Proofs
  • Set Theory
  • The Natural numbers and Induction
  • Foundations of Language Theory
  • Operations on Languages
  • Deterministic Finite Automata
  • The Cross Product Construction
  • Non-Deterministic Finite Automata
  • Directed Graphs and Paths
  • Labeled Graphs and Automata
  • The Theorem of Myhill and Nerode
  • Minimal DFAs
  • State Equivalence and Minimal DFA’s
  • Formal Languages
  • A Grammar for Parsing English
  • Context-Free Grammars
  • Derivations and Context-Free Languages
  • Normal Forms for Context-Free Grammars, Chomsky Normal Form
  • Regular Languages are Context-Free
  • Useless Productions in Context-Free Grammars
  • The Greibach Normal Form
  • Least Fixed-Points
  • Context-Free Languages as Least Fixed-Points
  • Least Fixed-Points and the Greibach Normal Form
  • Tree Domains and Gorn Trees
  • Derivation Trees
  • Ogden’s Lemma
  • Pushdown Automata
  • From Context-Free Grammars To PDA’s
  • From PDA’s To Context-Free Grammars
  • Computability 
  • Computations of Turing Machines
  • The Primitive Recursive Functions
  • The Partial Recursive Functions 
  • Recursively Enumerable Languages and Recursive Languages 
  • Phrase-Structure Grammars
  • Derivations and Type-0 Languages 
  • Type-0 Grammars and Context-Sensitive Grammars 
  • The Halting Problem
  • A Universal Machine
  • The Parameter Theorem 
  • Recursively Enumerable Languages
  • Hilbert’s Tenth Problem 
  • DNA Computing
  • Analog Computing
  • Scientific Computing/Dynamical Systems 
  • Quantum Computing 
Read More/Download

Advanced Programming Language Features for Executable Design Patterns “Better Patterns Through Reflection”

By Gregory T. Sullivan

The Design Patterns book presents 23 time-tested patterns that consistently appear in well-designed software systems. Each pattern is presented with a description of the design problem the pattern addresses, as well as sample implementation code and design considerations.

This paper explores how the patterns from the “Gang of Four”, or “GOF” book, as it is often called, appear when similar problems are addressed using a dynamic, higher-order, object-oriented programming language. Some of the patterns disappear – that is, they are supported directly by language features, some patterns are simpler or have a different focus, and some are essentially unchanged.

Peter Norvig describes design patterns as:
  • Descriptions of what experienced designers know (that isn’t written down in the Language Manual)
  • Hints/reminders for choosing classes and methods
  • Higher-order abstractions for program organization
  • To discuss, weigh and record design tradeoffs
  • To avoid limitations of implementation language.
 Read More/Download

Semantics with Applications

By Hanne Riis Nielson and Flemming Nielson

This book is one of the best for learning the formal semantics of programming languages. You will learn how to use semantics to validate prototype implementations of programming languages, how to use semantics to verify analyses used in more advanced implementations of programming languages, and how to use semantics to verify useful program properties, including information about execution time.

Read More/Download

Dictionary of Programming Languages

Welcome to the Dictionary of Programming Languages, a compendium of computer coding methods assembled to provide information and aid your appreciation for computer science history. The dictionary currently has over 120 entries.

The largest and most comprehensive list on the net, and a fine job by the CUI group at the University of Geneva and Bill Kinnersley. This list has a very good search capability, and many of the entries have links to FTP sites for compilers and tools. Unfortunately, some of the links on the list are a little out of date, but its source attribution and journal references are great.

Read More/Download

Programming Languages

By Scott F. Smith

This book is an introduction to the study of programming languages. The material has evolved from lecture notes used in a programming languages course for juniors, seniors, and graduate students at Johns Hopkins University.

The book treats programming language topics from a foundational, but not formal, perspective. It is foundational in that it focuses on core concepts in language design such as functions, records, objects, and types and not directly on applied languages such as C, C++, or Java. We show how the particular core concepts are realized in these modern languages, and so the reader should emerge from this book with a stronger sense of how they are structured.

The book is not formal in the sense that no theorems are proved about programming languages. We do, however, use several techniques that are useful in the formal study of programming languages, including operational semantics and type systems. With these techniques we can define more carefully how programs should behave.

The OCaml Language
The Caml programming language is used throughout the book, and assignments related to the book are best written in Caml. Caml is a modern dialect of ML which has the advantages of being reliable, fast, free, and available on just about any platform through http://caml.inria.fr.

This book does not provide an introduction to Caml, and we recommend the following resources for learning the basics:

    * The OCaml Manual, in particular the first two sections of Part I and the first two sections of part IV.
    * Introduction to OCaml by Jason Hickey.

The OCaml manual is complete but terse. Hickey’s book may be your antidote if you want a more descriptive explanation than that provided in the manual.

The FbDK
Complementing the book is the Fb Development Kit, FbDK. It is a set of Caml utilities and interpreters for designing and experimenting with the toy Fb and FbSR languages defined in the book. It is available from the book homepage at http://www.cs.jhu.edu/~scott/pl/book, and is documented in Appendix A.

Background Needed
The book assumes familiarity with the basics of Caml, including the module system (but not the objects, the “O” in OCaml). Beyond that there is no absolute prerequisite, but knowledge of C, C++, and Java is helpful because many of the topics in this book are implemented in these languages. The compiler presented in chapter 7 produces C code as its target, and so a very basic knowledge of C will be needed to implement the compiler. More nebulously, a certain “mathematical maturity” greatly helps in understanding the concepts, some of which are deep. For this reason, previous study of mathematics, formal logic, and other foundational topics in computer science such as automata theory, grammars, and algorithms will be a great help.

How Language Works

By Michael Gasser

I started writing this book because I was teaching an introductory linguistics course, and I was dissatisfied with the available textbooks. In particular, I felt that they did not do a good job of showing how the study of language fits into the larger field of cognitive science. Once I got into it, the book turned into more than a textbook on linguistics because it began to veer off into areas of study that usually don't count as linguistics. One way to define linguistics is as the study of language itself, which can be contrasted with language behavior. Language behavior is studied by people in the fields of psycholinguistics, language development, natural language processing, and computational linguistics, and there is often an attempt to keep these fields distinct from linguistics "proper". I believe that it is more productive to see all of these fields as making up "the language sciences" or "language science", and it is really this meta-field that is the topic of this book.

I also think that most introductory textbooks (on all topics, not just linguistics) try to introduce too many concepts and fail to tie them together in terms of a small number of themes. I believe that the way language works makes sense (not all linguists agree), and I've tried to organize the book around this idea. I also believe that a basic understanding of how language works is just as important to a basic education as an understanding of algebra or geography, and I hope that I've made it clear in the book why I believe this.

Finally, I've tried to incorporate several other novel ideas of mine about how best to teach about language: start with simplified, artificial examples; select real examples from a relatively small number of languages (especially those that are somewhat familiar to the author); and be open about the large gaps in our knowledge about language, as well as the excitement that comes with a young field.

This is edition 3.0 of How Language Works. It is quite different from the last edition (2.0). In particular, it includes material on computational approaches to language. Long after coming up with the title, I realized that there were several published books with the same title (and at least one more has appeared since I released this book). So if you refer to this book elsewhere, be sure to make it clear that you are referring to "How Language Works (edition 3.0) by Michael Gasser". The book is freely available to anyone, under the terms of the GNU Free Documentation License, Version 1.2.

The organization of this book is based on the idea that human language has a small set of basic properties, each of which plays a role in the workings of language as an instrument for communication and thought. Each chapter in the book (after this one) introduces a new property. Chapter 2 discusses words and word meaning. Chapter 3 discusses phonological categories, the units that are combined to make word forms. Chapter 4 discusses phonological processes, the ways in which the units of word form interact with one another. Chapter 5 discusses compositionality, the principle that allows complex meanings to be expressed by combinations of words. Chapter 6 discusses how words are organized into larger units and how these allow us to refer to states and events in the world. Chapter 7 discusses how the grammars of languages divide the world into abstract conceptual categories. Chapter 8 discusses the productivity and flexibility of language and how grammar makes this possible.

Practical Foundations for Programming Languages

By Robert Harper

This is a working draft of a book on the foundations of programming languages. The central organizing principle of the book is that programming language features may be seen as manifestations of an underlying type structure that governs a language's syntax and semantics. The emphasis, therefore, is on the concept of type, which codifies and organizes the computational universe in much the same way that the concept of set may be seen as an organizing principle for the mathematical universe. The purpose of this book is to explain this remark.

Introduction to Computing Explorations in Language, Logic, and Machines

By David Evans

This book is useful for computer science students and anyone who wishes to learn about the various aspects and procedures of computing, programming, and logic. This book on the theory of programming languages and computing covers the following topics in detail.
  • Computing
  • Defining Procedures
  • Language
  • Programming
  • Problems and Procedures
  • Data
  • Analyzing Procedures
  • Machines
  • Cost
  • Sorting and Searching
  • Improving Expressiveness
  • Mutation
  • Objects
  • Interpreters
  • The Limits of Computing
  • Computability
  • Intractability

Read More/Download

Privacy Policy
We use third-party advertising companies to serve ads when you visit our website. These companies use cookies to serve ads on our site. They may use information (not including your name, address, email address, or telephone number) about your visits to this and other websites in order to provide advertisements about goods and services of interest to you. If you would like more information about this practice and to know your choices about not having this information used by these companies, click here
Disclaimer
Copyright of the books and articles belongs to their respective owners. On this blog, I write reviews of articles and books and provide links that are freely available on the World Wide Web. The intention of this blog is educational, and using content from other sites in this way is fair use. If you have any complaints regarding a book's copyright, please contact the hosting server to request the book's removal. All download links were obtained from search engines, so we are not responsible for any damage caused by the links given here. This blog is for educational purposes only.