Wednesday, August 24, 2005

Chapter Outline for Ph.D. topic

  1. Introduction
    The intelligent robots envisioned by researchers at the dawn of artificial intelligence have never been created; what has instead proved wildly successful is the Web. Informal and undisciplined, the success of the Web challenges both classical artificial intelligence and the analytic philosophy that motivates it, as well as newer variations such as connectionism, dynamic systems theory, and embodied intelligence. The Web is historically situated, and trends in its future development are briefly sketched. There is a noticeable lack of philosophical and formal analysis of the Web, and this thesis provides both. The main social problem the Web causes is not information retrieval but information organisation, and a novel solution to this problem in terms of the narrative structuring of information is given to demonstrate the value of the philosophical and formal framework proposed.

  2. Computation and the Extended Mind
    Although so obvious as to be a truism, the computational theory of the mind is realized by computers, not the human brain. If we take the extended mind hypothesis seriously, then the mind is in a real sense distributed not only across the brain and the human body but also across aspects of the environment, and so by definition the human mind can include computers and architectures implemented on them, such as the Web. Computers are best understood as an inverse reflection of the capacities of humans: computers allow humans to "off-load" tasks the brain has limited capacity for, such as arithmetic and deduction. A parallel example using the development of written language and logic is given. Computation is taken in its classical mathematical definition, and is then explored more fully from a philosophical standpoint. The Web is then compared and contrasted with traditional models of language and computation. The Web presents a genuine challenge to this traditional understanding, for the Web exists primarily for the digital communication of information, distinguishing it from both the strict definition of computation and informal linguistic communication.

  3. The Web and Network Intelligence
    Artificial intelligence and the philosophy of mind both begin from a premise that has in recent years been shown to be incorrect: that intelligence emerges from the mind, assumed to be a unitary organization encased in the brain of an individual. This highly influenced artificial intelligence, which conceived that intelligence could be created by giving a computer, as a unitary organization, the correct programs and code. However, intelligence can instead be conceived of as emerging from the "extended mind", defined as a dense network of interconnections between various machines. This is called "network intelligence" to contrast it with more traditional artificial intelligence. What is traditionally conceived of as the "mind" behind intelligence is the narrative that the network produces to describe itself historically, and this narrative is not necessarily stored in any one component of the system. The Web can be taken as a primary example of network intelligence due to its definition as a "universal" network. The success of the Web is due to it being a manifestation of the extended mind that takes network intelligence into account.

  4. Information and Encoding
    The main traffic of the Web, information, is notoriously hard to analyse. We first begin with a reformulation of Brian Cantwell Smith's theory of the origin of objects in order to lay the grounds for the notions of objects and identity. His theory is extended with a notion of information. We analyse information as a two-fold phenomenon, consisting of "information content" (Dretske and Barwise) and a methodology of "encoding" (Shannon).
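    The "encoding" half of this analysis has a standard quantitative face. As a minimal sketch (the function name is ours, not part of the thesis), Shannon's measure gives the average number of bits per symbol needed to encode a message:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average bits per symbol needed to encode the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A uniform four-symbol message needs exactly 2 bits per symbol,
# while a message with no variation carries no information at all.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0.0
```

    Note that this measure says nothing about what the message is *about*; that is precisely the gap the "information content" side of the analysis (Dretske and Barwise) is meant to fill.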

  5. Digital Representation
    Building upon the definition of information previously given, the ideas of presentation and representation are separated. From the work of Haugeland and Goodman, a new definition of digitality is given. The notion of computation is tied to that of causation, and the ideas of syntax and semantics are distinguished in terms of information, digitality, and computation.

  6. Principles of Web Architecture
    The architecture of the Web is explained in terms of the previously developed concepts of digital representation and the principles of universality, extensibility, least power, and the network effect. Close inspection is given to the "Architecture of the World Wide Web" document produced by the W3C, and the current functioning of the Web is contrasted with the REST model.

  7. The Semantic Web as Types
    The Semantic Web is defined as a Web with machine-readable semantics. The current state of the Semantic Web is explained, and the Semantic Web is explained as giving a uniform encoding of identity and representation to information on the Web. This leads to two distinct notions of semantics: semantics as given by the allowed operations of a given computer program, and semantics as given by the information content of a given representation. The Semantic Web is then shown to be a distributed type system, giving a model-theory for the former and a way for users of the Web to formulate the latter. An XML-only solution to binding Semantic Web types to XML is demonstrated. This data is dual-typed: once with a "data type" and encoding specific to computational use, and once again with a "semantic type" and encoding specific to the informational content of the data.
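    The dual-typing idea can be illustrated with a minimal sketch (the class and the example URIs are ours, purely hypothetical, not the XML binding the chapter develops): every value carries both a computational "data type" telling a program how to decode it, and a "semantic type" identifying its informational content.

```python
from dataclasses import dataclass

@dataclass
class DualTypedValue:
    lexical_form: str   # the encoded data as it appears on the Web
    data_type: str      # computational type, e.g. an XML Schema datatype
    semantic_type: str  # informational type, e.g. a Semantic Web class URI

# Hypothetical example: the same string is a date to the machine
# and a birth date to the reader.
birthday = DualTypedValue(
    lexical_form="1955-06-08",
    data_type="http://www.w3.org/2001/XMLSchema#date",
    semantic_type="http://example.org/ontology#BirthDate",  # hypothetical URI
)
print(birthday.data_type)
```

    The point of the separation is that a program can validate and operate on the value via its data type while remaining agnostic about the semantic type, and vice versa.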

  8. Web Services as Functions
    Web Services are programs that can be called over the Web, and are formally equivalent to functions. Web Services given Semantic Web typing can then be shown to be functions that compute over both semantic and data typing. Given that Web Services are functions and the Semantic Web a type system, a formal analysis of the next generation of the Web can be given: a distributed, truly universal computer.
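    The view of a Web Service as a function over both kinds of typing can be sketched as follows (the classes, the endpoint URL, and the toy geocoding example are ours, for illustration only): each input and output port carries a data type and a semantic type, alongside the underlying computation.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TypedPort:
    data_type: str       # computational encoding, e.g. "xsd:string"
    semantic_type: str   # informational content, e.g. an ontology class

@dataclass
class WebService:
    url: str
    inputs: List[TypedPort]
    output: TypedPort
    call: Callable       # the underlying computation

# Hypothetical geocoding service: a place name in, coordinates out.
geocode = WebService(
    url="http://example.org/geocode",  # hypothetical endpoint
    inputs=[TypedPort("xsd:string", "ex:PlaceName")],
    output=TypedPort("xsd:string", "ex:Coordinates"),
    call=lambda place: "55.95,-3.19" if place == "Edinburgh" else "0,0",
)
print(geocode.call("Edinburgh"))  # 55.95,-3.19
```

    Type-checking a composition of such services then means checking both signatures at once: that the data types line up for the machines, and that the semantic types line up for the information content.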

  9. Computation on the Web: functionalXML
    If the next generation of the Web is a computer, it needs a programming language. A simple XML-based programming language, entitled functionalXML, has been proposed by Henry S. Thompson. The language is formally characterized by the lambda calculus; it is then shown how the language can be extended to deal with Web Services and Semantic Web typing via the typed lambda calculus, and how such an extension can be realised in practice.
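    The lambda-calculus core underlying such a characterization is small enough to sketch directly. The interpreter below is our illustration of that formal core, not the actual functionalXML implementation: terms are nested tuples, and evaluation is call-by-value with closures.

```python
# Terms: ("var", name) | ("lam", param, body) | ("app", fn, arg) | ("lit", value)
def evaluate(term, env=None):
    env = env or {}
    kind = term[0]
    if kind == "var":
        return env[term[1]]
    if kind == "lam":                       # abstraction -> closure over env
        _, param, body = term
        return ("closure", param, body, env)
    if kind == "app":                       # application: evaluate fn, then arg
        _, fn, arg = term
        _, param, body, closed_env = evaluate(fn, env)
        return evaluate(body, {**closed_env, param: evaluate(arg, env)})
    if kind == "lit":
        return term[1]
    raise ValueError(f"unknown term: {kind}")

# (λx. x) applied to 42
identity_app = ("app", ("lam", "x", ("var", "x")), ("lit", 42))
print(evaluate(identity_app))  # 42
```

    An XML surface syntax then amounts to a serialization of such terms as element trees, with the typed extension attaching Semantic Web types to the `lit` and `app` nodes.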

  10. Personalized Webs
    The question is then: what type of information does the Web traffic in? The Web, as evidenced by the growth of blogs and other personalized forms of information creation and delivery, is the antithesis of the ontologies proposed by projects such as Cyc. Instead of delivering universal "common-sense" information, information is structured to be relevant to the highly personalized environment of the agent. We present a framework in which such information can be expressed in a machine-readable manner compatible with the Semantic Web, using a format entitled Web Proper Names.

  11. Narration and Cognition
    Blogs are how the narratives of human agents on the Web are extended computationally. The work on personalized ontologies is extended to deal with linguistic narratives. The cognitive development and aspects of narratives are explored, illustrated with examples from a corpus of stories generated by children.

  12. Computation and Narration
    A formal model of narratives, the narrative calculus, is expressed in terms of a propositional calculus optionally enriched by ontologies, a temporal ordering, and a probabilistic weighting of each proposition's importance in the narrative.
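    As a concrete sketch of the three optional enrichments (the class, field names, and toy story are ours, for illustration only), a narrative can be represented as a sequence of propositions, each carrying a temporal index, an importance weight, and an optional ontology class:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class NarrativeEvent:
    proposition: str                      # propositional content, e.g. "rain(edinburgh)"
    time_index: int                       # position in the temporal ordering
    weight: float = 1.0                   # importance in the narrative, in [0, 1]
    ontology_class: Optional[str] = None  # optional ontological enrichment

# A toy three-event narrative.
story: List[NarrativeEvent] = [
    NarrativeEvent("leave(home)", 0, 0.3),
    NarrativeEvent("rain(edinburgh)", 1, 0.9, "ex:WeatherEvent"),
    NarrativeEvent("arrive(office)", 2, 0.4),
]

# The probabilistic weighting picks out the most important event.
climax = max(story, key=lambda e: e.weight)
print(climax.proposition)  # rain(edinburgh)
```

    Summarizing a narrative then reduces to operations over this structure, such as filtering by weight or projecting onto a sub-ontology.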

  13. Narrative Detection
    A pipeline for the extraction of the narrative calculus is created using a series of Web Service NLP components, composed using functionalXML, with the results stored as Semantic Web types using Web Proper Names. The results of detecting narratives are shown both using children's stories and using the activity of Web users.
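    The shape of such a pipeline, as function composition in the spirit of functionalXML, can be sketched as follows. The stages here are toy stand-ins (the names and the naive "-ed" event detector are ours), not the actual Web Service NLP components.

```python
from functools import reduce

def compose(*stages):
    """Compose stages left-to-right into a single pipeline function."""
    return lambda data: reduce(lambda acc, stage: stage(acc), stages, data)

def tokenize(text):
    return text.lower().split()

def tag_events(tokens):
    # Toy event detector: pretend any token ending in "-ed" is an event verb.
    return [t for t in tokens if t.endswith("ed")]

pipeline = compose(tokenize, tag_events)
print(pipeline("The child walked home and opened the door"))
# ['walked', 'opened']
```

    In the real pipeline each stage would be a remote Web Service call rather than a local function, with Semantic Web types checked at each composition boundary.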

  14. Narrative Generation on the Web
    A reverse pipeline for the generation of natural language texts from the narrative calculus is given, again composed using functionalXML. These texts can be considered automatically generated "blogs" documenting the Web activity of browsers of the Web. Since they are expressed in the informal language of everyday life, they are simpler for humans to understand than mere lists, and since these narratives are augmented with Semantic Web types and are open for extension, they offer a level of versatility unique to the Web.

  15. Conclusion
    This thesis analyzes the Web from both a philosophical and formal standpoint, explaining the challenge presented by the Web to artificial intelligence. It demonstrates the value of both the philosophical and formal framework by using the example application of personalized narrative generation for the management of Web information. The final chapter concludes by looking at the embedding of the Web in society and sketching out future avenues of research.
