
List of accepted submissions

 
 
Musicological Significance of Traditional Chinese Music Inheritance and Information Theory Research

Introduction

The noted physicist Sir Arthur Stanley Eddington famously argued that entropy deserves to be ranked alongside beauty and melody. Entropy is the level of disorder in a system; in information theory, it refers to the measurement of uncertainty. The significance of music originates in the expected uncertainty of timing and musical characteristics in a consequent situation, which results from the antecedent situation's estimation and evaluation of the possibilities left open by the musical inheritance method.

With the development of school songs in the first half of the 20th century, musical notations, both the stave and the numerical score, quickly spread and gained popularity in China. Traditional Chinese musical notations, because they record only pitch and reflect pace roughly or not at all, came to be regarded as obsolete; this is the very reason they were abandoned. However, a traditional Chinese musical collection (纳书楹曲谱), translated as Nashu Studio Theatrical Music, notes [1] that very detailed notation of pace is beneficial for beginners but restricts people with a profound understanding of music and excellent performing technique from developing their own performing characteristics. For instance, in the upper Gong Chi of the Gong Chi Score (工尺谱), a form of traditional Chinese musical notation, if three Yin translate to one Ban there can be more than five possible combinations of pace, and this flexibility allows performers to recreate the music. The Jian Zi Score (减字谱) of the Gu Qin, a seven-stringed plucked instrument, indirectly records pitch and even detailed playing techniques, but recording of pace is nowhere to be found in such a score. The pace of these scores is determined by the playing techniques of different schools of performers, so the same ancient score, when played by different performers, shows distinct styles. The traditional Chinese inheritance method of oral instruction and rote memory also has many uncertainties. The background, life experience, personality, aesthetic taste and mode of teachers and learners all contribute to forming different styles and characteristics during inheritance, creating uncertainties.

Significance and information are both associated with uncertainty through probability. In any communication of information, the lower the probability of a consequent event, the more uncertainty (and information) is contained in the antecedent-consequent relationship. Information is a measure of the degree of freedom in selecting a message: the greater this degree of freedom and the volume of information, the more uncertain the message is. Thus, the degree of freedom of choice, uncertainty and volume of information are positively correlated. If information theory is applied to the discussion of the significance of traditional Chinese music score inheritance, what conclusion will it lead us to?
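As an illustration of this link between freedom of choice and information, the short sketch below computes the Shannon entropy of a performer's choice among pace realizations. The five equally likely realizations and the "detailed notation" alternative are hypothetical numbers chosen only to mirror the Gong Chi example above, not figures taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical case 1: the score leaves the pace open, so five possible
# realizations of a three-Yin-to-one-Ban phrase are equally likely.
open_score = [1 / 5] * 5
print(round(shannon_entropy(open_score), 2))     # 2.32 bits: maximal uncertainty

# Hypothetical case 2: a very detailed notation pins the pace down to one
# near-certain reading, so the performer's choice carries little information.
detailed_score = [0.96, 0.01, 0.01, 0.01, 0.01]
print(round(shannon_entropy(detailed_score), 2)) # 0.32 bits
```

The more evenly the choice is spread over possible paces, the more uncertainty, and hence information, the performer's realization carries.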

The value of information itself is based on representation, expression and externalization, revealing objects together with their characteristics and significance. This paper combines algorithmic music [2] with information theory and discusses the value and significance of traditional Chinese music scores and of the inheritance method of oral instruction and rote memory, for the purpose of discovering new methods for the inheritance of traditional Chinese music.

References and Notes

  1. Original text: 板眼中另有小眼原为初学者而设,在善歌者自能生巧,若细细注明转觉束缚 (approximately: the additional small yan within the ban-yan beat structure were originally provided for beginners; skilled singers develop the knack on their own, and marking everything in fine detail comes to feel like a constraint).
  2. Algorithmic music expresses music using algorithms instead of musical notation. It is a more abstract representation of music scores, recording the internal structure of the music. The style and characteristics of performers can be recorded and analyzed through probability computation on computers.
Concept of Information: The Point of Convergence for Philosophy and Science or the Vanishing Point of Parallels?

Science and philosophy parted ways much earlier than is usually thought. The sources of the divergence lie in their foundations, set in the works of Aristotle. His division of inquiry made an explicit double distinction: between the theoretical disciplines and "other disciplines", and, within the theoretical disciplines, a tripartite division into physics, mathematics and first philosophy (i.e. ontology). Even more important was the implicit distinction between what we now call epistemology and ontology. Every attempt to converge science and philosophy has to transcend the conceptual framework, introduced by Aristotle, that reflects those divisions. The concept of information defined by the author with the use of the categorical opposition of one and many satisfies this requirement. This new conceptual framework has application to scientific analysis and philosophical reflection. Moreover, language itself, which is the means of inquiry, can be considered a special instance of the use of the concept of information. Thus, we have a common conceptual framework for epistemology and ontology, for science and philosophy.

Science and Philosophy: The Origins of Divergence

There is a popular belief that the divergence of science and philosophy began recently, but a more careful look at the intellectual history of Europe can trace this divorce much further back. Thus, it is sometimes associated with the division into the Two Cultures denounced by C. P. Snow, originally in 1956, sometimes with the period of popularity of Logical Positivism and its crusade against metaphysics, with the period of the Enlightenment, with the views of Francis Bacon revolting against the philosophical tradition in his Novum Organum and his promotion of the inductive method, or with the much earlier distinction between philosophers and mere "mathematicians" whose only concern was "saving the appearances".

It was mainly the influence of Aristotle and his philosophical views that, if not immediately then inevitably, led to the divergence. His views on the division of knowledge, which gave priority to the theoretical sciences over "other sciences" and then divided the former into mathematics, physics and first philosophy (Aristotle, 1955: Metaphysics 1025b18-1026a31), contributed to the divorce, but they were not the most important factor. At least equally important was the less explicit, but equally fundamental, division into the two aspects of inquiry expressed in the questions "What do we know?" and "How do we know?"

The division was reflected later in a pair of parallel divisions in philosophy and science. In the former, it is now reflected in the two domains of philosophical reflection, ontology and epistemology. In science, we have the division into research outcomes (empirical data and scientific theories) and research methodology (inductive methods of proceeding from the data to scientific theories, deriving logical consequences of theoretical generalizations, and their empirical testing). Without any doubt, the division was of tremendous value for the progress of both philosophy and science over the next two millennia. But the progress had its limits. In modern physics and biology the division into the two initial questions, "What do we know?" and "How do we know?", lost its original sense. For instance, relativity theories and quantum mechanics showed that the answer to the first question depends on the answer to the second. The presence of an observer (human or not) cannot be eliminated from the description of the observed.

These strategic, fundamental divisions, organizing the streams of intellectual activity into parallel directions of development, had their influence not so much through Aristotle's declarations as through the conceptual framework of his philosophy, which is present, sometimes in a hidden way or through an apparent negation, in the entire later European intellectual tradition. It is this framework that has to be examined when we search for methods of converging science and philosophy.

Problematic Relationship of Science and Philosophy

Since the main objective of this paper is to explore the potential role of the studies of information in reuniting science and philosophy, it is necessary to examine the points where science, especially science in the Newtonian and post-Newtonian paradigm (i.e. with the adaptations to Relativity, Quantum Theory, etc.), stands in a problematic relationship with philosophy. Some of these problems were already mentioned above, but there are other, equally important ones which require examination. The influence of the Aristotelian concepts of the four causes and of generation on the way of thinking had its constructive role in the early development of biology, but only up to the point when the theory of evolution entered.

This original conceptual framework became even more problematic with the development of genetics and the studies of metabolism, which brought back the old question "What is life?" Ironically, the answer given by Aristotle, referring to "self-generation", i.e. natural generation, change with an internal source, as opposed to artificial generation (Aristotle 1955: Metaphysics 1032a12-1034a8), may at first seem identical to the answer given by Maturana and Varela when they introduced the concept of autopoiesis, literally self-creation (Maturana & Varela 1980). But while the terms are basically identical, their actual philosophical meanings are fundamentally different.

Examining the concepts that have carried the Aristotelian conceptual framework through the millennia, contributing in the past to the progress of inquiry but having reached the point where their role becomes questionable, can help to justify our search for a reunification, or at least a realignment, of science and philosophy; at the same time it allows us to identify the problems that can be solved with the help of the new conceptual framework of information studies.

Conceptual Framework for Information

Thus far there has been only reference to information studies, without any clarification of their scientific or philosophical status. Here is the key point of the paper. If information studies are supposed to become the point of convergence for science and philosophy, they have to assume the dual role of both. To overcome the divisive tendencies in the comprehension of reality based on the intellectual tradition inherited from Classical Antiquity, it is necessary to reach a point beyond the origins of the present conceptualization in philosophy and in science. In order to be a bridge between science and philosophy, the concept of information cannot belong to only one of the separated sides. This is the reason why information, to assume this role, cannot be defined in the terms of any specific scientific theory or any specific philosophical system. Of course, we have to prevent the trivialization of this concept by reducing it to a common-sense expression which, through its lack of precision and rigor, allows arbitrary interpretations satisfying an uncritical, intuitive feeling of understanding.

There is a legitimate question whether the task of finding a concept of information that transcends scientific theories and philosophical systems while escaping the triviality of common sense is possible at all. The author believes that it is, and as a justification of this belief offers his own definition of information, defined and elaborated in earlier publications (Schroeder 2005, 2009, 2011a).

The definition refers to the categorical opposition of one and many. Categorical concepts or relations are those which are most general and which, for this reason, cannot be defined by any more general genus; they are undefinable. The opposition of one and many belongs to the categorical relations (or dual categories) of every European philosophical system mature enough to specify its conceptual limits, from the Pythagoreans, through Platonism, Aristotelian philosophy, Epicureanism, Neo-Platonism and Scholastic philosophy, to the philosophy of Kant and modern philosophical systems. Of course, it was at the center of attention of the philosophy of mathematics, especially of set theory at the time of its formation (Schroeder 2005).

There is another aspect of the universality of the one-many relation. It can be found as a central theme of Eastern philosophy, in particular in the discussions of the relation between Atman and Brahman in the ancient philosophical schools of Hinduism and Buddhism (James, 1967). The relation between one and many need not be described and understood as an actual opposition; in Buddhism and Taoism, for instance, the opposition is considered illusory. However, the question of its status and understanding is the most fundamental of all questions in every school of thought, and the answer to this question is frequently considered the defining statement of the school.

Thus, when we use the opposition of one and many as the only concept defining information, we are safely beyond any point of divergence in the conceptualization of reality, and certainly beyond any division into science and philosophy. This cross-cultural universality makes the one-many relation not only universal for human intellectual activity, but also a necessary condition for the comprehension of reality. We can observe the presence of this opposition in the languages of tribes whose cultures have remained unchanged for thousands of years and which lack words for numbers beyond one and two, but which clearly recognize the opposition between the words for one and for many.

Some philosophical systems have multiple categorical concepts, and the one-many relation is not always considered the most fundamental; Kant, for instance, gave a special role to the categorical concept of time. However, the present author believes that the other categories can be eliminated by defining them with the use of the relationship of one and many. For instance, time can be conceptualized in terms of the multiplicity derived from change. There is no time if there is no change. Change requires differences, and differences require some multiplicity to be differentiated. As Gregory Bateson, whose fundamental concept defines information as "any difference that makes a difference", observed, "it takes at least two somethings to create a difference": we have to have at least two of something (Bateson 1988: 72). The elements of this multiplicity are usually called moments of time, but we can disregard this terminology at this point. On the other hand, time requires unity, as an arrangement of this multiplicity. In the standard conceptualization of time the arrangement is a linear order, and therefore we are choosing one of several possible ways in which the unity can be achieved.

Concept of Information

Information is defined by the author as identification of a multiplicity, i.e. anything that makes one out of, or of, the many. One out of the many is a selection of one element out of many, which can be called a selective manifestation of information. Making one of the many is giving the many a binding structure, which can be called a structural manifestation of information. It can be shown that these two manifestations are always coexistent, but for different multiplicities, or, as the author has called them, different information carriers. The degree of determination of the selection (for instance in terms of a probability distribution and the value of entropy for this distribution) can be used as a quantitative characteristic of information when we focus on the selective manifestation. The degree to which the structure can be decomposed into a product of components describes the level of integration of information (Schroeder 2009). Both manifestations can be given one mathematical formalism, which, due to the high level of abstraction of the concept of information, is developed in terms of set theory and general algebra (general closure operators or closure spaces) (Schroeder 2011b).
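As a minimal formal gloss on the quantitative side of the selective manifestation, and assuming that the entropy intended here is Shannon's, the degree of determination of a selection from an n-element multiplicity with probability distribution p = (p_1, ..., p_n) can be written as

```latex
H(p) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i ,
\qquad 0 \;\le\; H(p) \;\le\; \log_2 n ,
```

with H(p) = 0 for a fully determined selection and H(p) = log_2 n for a completely undetermined (uniform) one.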

With the tool of a general concept of information, we can proceed to the analysis of concepts which create problems in aligning science and philosophy, such as concepts of a physical system (isolated or open), the state of such system, inertial reference frame, etc. It may be a surprise that even if they have correlates in the scientific formalism (in the example of a state of the physical system, a point in the phase space or a vector in the appropriate Hilbert space), they are not clearly defined as general concepts.

Conclusion

Why should we believe that the concept of information, no matter how general and inclusive, when defined as above with the use of a categorical relation transcending the original sources of the division between science and philosophy, can help in their convergence? What makes us believe that it will be the point of convergence and not the vanishing point of the parallels?

The author’s answer is that the concept of information has an exceptional status. Recent developments in science show that information (defined as above or in a narrower way) can be used as a fundamental concept which can replace the traditional concepts of ontology such as matter, substance, cause, etc. But at the same time the inquiry into reality is carried out with the use of language, or languages, if we consider the distinction between natural languages and the formal languages of mathematical or logical formalization. Thus, in the past, the precipice between epistemology, whose universe was the linguistic conceptual framework of inquiry within the mental realm (“mind”), and ontology, whose interest was in entities of the “physical” realm of objective reality (“body”), was impassable. The concept of information can be applied in both realms equally well. We can develop a generalized logic of information with a special instance applicable to traditional logic (Schroeder 2012), or we can think about information in the scientific terms of its dynamics, for instance to describe the process of computation (Schroeder 2013a, 2013b).

References and Notes

Aristotle, (1955). Metaphysics. In Ross, W. D. (ed.) Aristotle: Selections. Charles Scribner’s Sons, New York.

Bateson, G. (1988). Mind and Nature: A Necessary Unity. Toronto: Bantam Books.

James, W. (1967). “The One and the Many.” In McDermott, J. J. (ed.) The Writings of William James: A Comprehensive Edition. New York: Random House, pp. 258-270.

Maturana, H. R. & Varela, F. J. (1980). Autopoiesis and Cognition: The Realization of the Living. Boston Studies in the Philosophy of Science, vol. 42, Dordrecht: D. Reidel.

Schroeder, M.J., (2005). Philosophical Foundations for the Concept of Information: Selective and Structural Information. In: Proceedings of the Third International Conference on the Foundations of Information Science, Paris 2005, http://www.mdpi.org/fis2005

Schroeder, M.J., (2009). Quantum Coherence without Quantum Mechanics in Modeling the Unity of Consciousness, in: Bruza, P. et al. (Eds.) QI 2009, LNAI 5494, Berlin: Springer, pp. 97-112.

Schroeder, M. J., (2011a). Concept of Information as a Bridge between Mind and Brain. Information, 2 (3), 478-509.

Schroeder, M.J. (2011b). From Philosophy to Theory of Information. Intl. J. Information Theor. and Appl., 18 (1), 56-68.

Schroeder, M. J. (2012). Search for Syllogistic Structure of Semantic Information. J. Appl. Non Classical Logic, 22, 83-103.

Schroeder, M. J., (2013a). Dualism of Selective and Structural Manifestations of Information in Modelling of Information Dynamics. In G. Dodig-Crnkovic and R. Giovagnoli (Eds.): Computing Nature, SAPERE 7, Berlin: Springer, pp. 125-137.

Schroeder, M. J. (2013b). From Proactive to Interactive Theory of Computation. In: M. Bishop and Y. J. Erden (Eds.) The 6th AISB Symposium on Computing and Philosophy: The Scandal of Computation – What is Computation? The Society for the Study of Artificial Intelligence and the Simulation of Behaviour (pp. 47-51).

Frontiers of Science of Information

In today's knowledge-based society, the central role of data, data-driven insights, and their use in scientific, commercial, and social enterprise is widely recognized. Information is acquired, stored, communicated, curated, organized, aggregated, analyzed, valued, secured, and used to understand, optimize, and control complex processes. In the context of real systems, these tasks pose significant challenges stemming from incomplete or noisy data, varying levels of abstraction and heterogeneity, requirements on scalability, constraints on resources, considerations of privacy and security, and real-time performance. These problems lie at the core of a large class of applications, ranging from analysis of biochemical processes in living cells to building robust wide-area communication systems and understanding global-scale economic and social systems. Claude Shannon laid the foundation of information theory, demonstrating that problems of data transmission and compression can be precisely modeled, formulated, and analyzed; he also provided basic mathematical tools for addressing these problems. Motivated by Shannon's focus on the fundamental, and by his precise quantitative analysis, the Center for Science of Information aims to develop rigorous principles guiding all aspects of information, integrating elements of space, time, structure, semantics, cooperation, and value in different application contexts. Led by Purdue University, the Center's member institutions include Bryn Mawr, Howard, MIT, Princeton University, Stanford University, Texas A&M University, the University of California at Berkeley and San Diego, the University of Hawaii at Manoa, and the University of Illinois at Urbana-Champaign. The Center's mission is to advance science and technology through a new paradigm in the quantitative understanding of the representation, communication, and processing of information.

The research theme of the Center focuses on transforming data into information and information into knowledge. It targets fundamental principles, formal methodologies and frameworks, and a rich set of algorithms and computational tools that are validated on applications in the life sciences, communication systems, and economics. The Center targets the following fundamental problems: (i) modeling complex systems and developing analytical methods for quantifying information representation and flow in such systems; (ii) methods for the quantification and extraction of informative substructures; (iii) understanding cooperation, competition, and adversaries in complex systems; and (iv) developing information-theoretic models for managing, querying, and analyzing data under real-world constraints of incompleteness, noise, distribution, and limited resources.

Following the principles of Shannon and Turing, who engaged with practical systems before arriving at their theoretical abstractions, the Center focuses on specific applications with the goal of obtaining a broader and more general understanding of information. For instance, the timeliness of information is extremely important when information is used in cyber-physical systems, leading one to investigate the issue of delay, which Shannon's theory largely ignored. The use of information also brings into focus the issue of semantics: the meaning of the message is integral to the performance of the consequent task, leading one to investigate goal-oriented communication and control under constrained communication. These problems are helping the Center's scientists define information semantics in fundamentally new and relevant ways. Cyber-physical systems bring together the processing and communication of information, which is explored in the context of various applications ranging from vehicle information systems to (human) body sensor networks.

Investigation of biological systems at the Center motivates the understanding of representation, inference, and aggregation of data. Since the time of Shannon, biology has undergone a major revolution, giving rise to significant challenges in interpreting data and understanding biological mechanisms. From a historical perspective, Henry Quastler first introduced information theory into biology in 1949, just a year after the landmark paper of Shannon and four years before the inception of molecular biology (shaped by the work of Crick and Watson). Continuing this effort, Quastler organized two symposia on "Information Theory in Biology". These attempts were rather unsuccessful, as argued by Henry Linschitz, who pointed out the "difficulties in defining information of a system composed of functionally interdependent units" and of channeling information (entropy) to "produce a functioning cell". The advent of high-throughput technologies for data collection from living systems, coupled with our refined understanding of biological processes, provides new impetus for efforts aimed at understanding how biological systems (from biomolecules to tissues) represent and communicate information. How can one infer this information optimally (from genome sequencing to functional image analysis)? How can one control specific functional and structural aspects of processes based on this understanding?

In yet other applications, such as economics, questions of how information is valued are important. The flow of information in economic systems and the associated control problems are vitally important and have been recognized through recent Nobel Prizes. More recently, with the ability to collect large amounts of data from diverse systems such as sensed environments and business processes, problems in 'big and small data' have gained importance. Data analytics at scale is critically reliant on models and methods whose performance can be quantified. Issues of privacy and security pose problems for data management, obfuscation, querying, and secure computation. The Center is at the cutting edge of research in knowledge extraction from massive and structural data.

Acknowledgments

This work was supported by the NSF Center for Science of Information (CSoI), Grant CCF-0939370, and in addition by NSA Grant 130923 and NSF Grant DMS-0800568.

Information Science: New Response to Old Challenges in the Scientific View of Reality

The challenges of complexity, of the lack of a comprehensive holistic methodology, and of the antithetic aspects of life and cognition are not new. They have accompanied the development of Western philosophy and science from the very beginning. Thus, it is not their presence in the European intellectual tradition that is surprising, but their persistence. The first of these challenges, complexity, is well known, but the focus of its study is more on the limits of the unconquerable than on their elimination. Holism always fascinated European intellectuals but was never fully admitted into standard scientific methodology; only very recently, under the name of integrative medicine, has it acquired the status of an institutionally recognized way of inquiry and practice. In the study of life and cognition the challenges are multiple and commonly recognized, but there is a lack of a comprehensive, cross-disciplinary methodology to respond to them. Information science, with a sufficiently general and well-defined concept of information, can replace the crumbling foundations for science which until recently were given by physics. The concepts of modern physics have lost their intuitive character and are increasingly dependent on interpretation in terms of information, computation, or cognition. However, in order to serve as a firm new foundation they require a common comprehensive framework. The approach to information, its structure, integration and dynamics proposed by the author is an example of a conceptual framework which can serve this purpose.

Challenge of Complexity

The challenges of complexity, of the lack of a comprehensive holistic methodology, and of the antithetic aspects of life and cognition are not new. They have accompanied the development of Western philosophy and science from the very beginning. Thus, it is not their presence in the European intellectual tradition that is surprising, but their persistence.

Although the qualification for a system or process to be considered complex has changed, we still face limits in the exploration of systems which are, and seem forever likely to remain, too complex to explore, and in carrying out processes which, due to their complexity, require too many resources (mainly time) for their implementation. The simplest example of the latter case is the impossibility of implementing an algorithm which requires a number of steps that is an exponential function of the volume of data. The unplanned "gift" of technological progress in the form of Moore's Law (Moore 1965), reporting the exponential increase over time of the quantitative characteristics of electronic devices produced by industrial innovation, can push these limits but cannot eliminate them. Algorithmic complexity theory has as its objective the clarification of which algorithms are feasible in current forms of computation and which are beyond practical application. The actual challenge is in finding ways to implement algorithms of the latter type.
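A back-of-the-envelope sketch of why exponential hardware growth pushes these limits without eliminating them; the machine speed, time budget and doubling count below are illustrative assumptions, not figures from Moore's paper.

```python
import math

STEPS_PER_SECOND = 1e9   # assumed speed of today's machine
BUDGET_SECONDS = 3600    # assumed time budget: one hour

def largest_input(speedup=1.0):
    """Largest n solvable within the budget by an algorithm needing 2**n steps."""
    return int(math.log2(STEPS_PER_SECOND * speedup * BUDGET_SECONDS))

print(largest_input())        # ~41: feasible input size today
print(largest_input(2**10))   # ~51: ten Moore's-law doublings buy only ~10 more items
```

Each doubling of machine speed extends the feasible input size of an exponential-time algorithm by only one item, which is why the limits are pushed back but never removed.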

There are several other faces of complexity blocking intellectual and technological progress. Another example is the recently recognized need for a revision of statistical methodology. It is very often forgotten that traditional statistics is heavily dependent on the assumption of the independence of individual events, which justifies the use of normal distributions in the analysis of collective phenomena and gives this form of analysis its predictive power. The recognition of the necessity to consider the mutual dependence of events, and therefore to replace normal distributions with power laws, brought back the issue of the unpredictability of "black swans" (Taleb 2007).
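A small numerical illustration of what is at stake: under a normal model extreme deviations are astronomically rare, while under a power-law (Pareto) model they are commonplace. The Pareto exponent and scale below are arbitrary assumptions chosen only to show heavy-tailed behaviour, not empirical estimates.

```python
from statistics import NormalDist

# Probability of exceeding 10 on each model's own scale.
p_normal = 1 - NormalDist(mu=0, sigma=1).cdf(10)   # ten standard deviations out

# Pareto (power-law) tail: P(X > x) = (x_min / x) ** alpha, with assumed
# x_min = 1 and alpha = 2.
alpha, x_min = 2.0, 1.0
p_pareto = (x_min / 10) ** alpha

print(p_normal)   # ~7.6e-24 -- effectively "never"
print(p_pareto)   # 0.01 -- a "black swan" becomes an ordinary event
```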

In order to confront the challenges of complexity, the study of complexity has to go far beyond the traditional conceptual framework set by Warren Weaver (1948) in the 1940s, later complemented by the quantitative description in terms of algorithmic complexity.

Challenge of Holistic Methodology

Holism is a new name for the ancient view of reality which gives priority to a whole over the multiplicity of its components. Jan Christiaan Smuts introduced the term in 1927 in anticipation of a truly holistic and integrative approach to science, with specific interest in the study of complex adaptive systems such as those exemplified in biological evolution. It was an early attempt to give academic status to a way of thinking that over centuries, or even millennia, remained on the fringes of science and philosophy. This inferior, non-academic status of holistic thought did not come from a lack of interest in what the integrative approach was offering. For instance, Hermeticism, in which a holistic view of reality played the most fundamental role, influenced Renaissance thinkers no less than Neoplatonic philosophy did. Even the greatest minds of science and philosophy (e.g. Isaac Newton) were attracted to it by its alternative view of reality. The problem was the lack of a consistent and comprehensive conceptual framework which could be used to formalize its vague esoteric claims (Schroeder 2012).

The situation did not change in the more recent revival of interest in holism which came with the work of Ludwig von Bertalanffy (1950) on General System Theory. Once again, the initial interest waned when the general description of the theory was not followed by a comprehensive scientific methodology with increased power to solve problems unsolvable with earlier methodological tools. This can be blamed on excessive trust in common-sense comprehension.

Expressions of the type "a whole is more than the sum of its parts", frequently used in the context of General System Theory, can be easily understood by a lay audience but are philosophically meaningless. The danger of this easy "understanding" for an unprepared audience lay in the lack of a clear distinction between an innovative view of scientific methodology and pseudo-scientific creations such as George Van Tassel's Ministry of Universal Wisdom. Van Tassel used similar holistic terminology; his rejuvenating device, built on the instructions of Solganda from the planet Venus, was named the "Integratron". In the absence of more profound philosophical foundations or a mathematical formalism, the distinction between General System Theory and Universal Wisdom was not easy for the many who believed that they understood both. The latter gained undeserved recognition as a scientific invention; the former, the undeserved judgment of being just a toy for dilettantes.

We are currently witnessing yet another instance of the revival of interest in the holistic way of thinking. Integrative medicine, which emphasizes wellness and healing of the entire person in all bio-psycho-socio-spiritual dimensions, has come out of the fringes of the medical sciences. The Consortium of Academic Health Centers for Integrative Medicine, founded in 1999, has at present 60 members, including medical institutions of the highest world recognition such as the Mayo Clinic or Johns Hopkins University Hospital. The emphasis on the holistic aspect of human health is fundamental to this approach, as can be seen already in its name (Bell et al. 2002). It would be an ironic smile of history if the triumphant re-entry of holistic methodology happened through the interest in unorthodox methods of care for human health, a millennium after pagan Aristotelian and other Classical Antiquity philosophy re-entered the European intellectual tradition through the interest in Arabic methods of such care.

Challenge of Life and Cognition

Studies of life and cognition (consciousness) face not one challenge but a large variety of challenges, many of them old and some new. An example of an old challenge is the homunculus fallacy, reappearing again and again in attempts to explain consciousness (Schroeder 2011a, 2014). A more fundamental challenge lies in the consequences of Cartesian mind-body dualism, which, after each of its obituaries in the philosophical works of the past, was resurrected in more specific problems: the relationship between the subjective and objective forms of reality, the problem of qualia, the problem of quantum measurement, etc.

Consequences of Cartesian dualism can be found in many unexpected contexts. Even the title of this section of the abstract reflects it in the separation of the concepts of life and cognition. The need for a view in which life and cognition are only different facets of the same natural phenomenon was already recognized in the works of Humberto Maturana and Francisco Varela (1980). But the concept of autopoiesis (self-creation) at the center of their philosophy of life is yet another example of a challenge to traditional scientific methodology.

Even more revolutionary and challenging to traditional science were the ideas of Nicolas Rashevsky (1965, 1972) and Robert Rosen (1958, 1987, 1991) regarding a new order of scientific disciplines in which biology would have a more fundamental place than physics in the development of a united conceptual framework for the view of reality. This idea is definitely revolutionary, but the actual challenge in Rosen's vision is his insistence on the fundamental role of self-reference, which was banned from science as a main source of logical problems.

This brings us to the question about theoretical and philosophical tools necessary to confront these and other challenges to science.

Information Science

Even from the perspective of a single discipline, the scientific view of reality is not as firm and clear as it may seem. Physicists refer to the concept of a physical system or its state, but what exactly these concepts mean is not clear, although we can identify elements of the mathematical formalism associated with them. We can still find expressions such as "matter and energy", clearly inherited from the 19th-century view of the world in which matter could be safely associated with the physical concept of mass. Today the use of such an expression is questionable, as the former is a philosophical concept and the latter comes from a specific physical theory. The persistence of this hybrid can be viewed as a symptom of the lost conceptual framework that used to make physics a good foundation for other disciplines. Not only do the concepts of particles, waves, mass or energy have relative meaning, but the involvement of a conscious observer has become a necessary element of the scientific view of reality.

The new conceptual foundation is sought in the concepts of information and computation. They can be characterized as more fundamental than those of traditional physics and can be found in studies in the domains of physics, biology, cognition, etc. The problem is that they are not always, or even not often, clearly defined. Information is simply associated with one of its postulated measures (most frequently with Shannon's entropy) and computation with the work of a Turing machine. In order to serve as a new foundation for the scientific view of reality, both concepts have to be well defined and equipped with a comprehensive theoretical formalism. In particular, information has to be described not only through its quantitative but also its qualitative or structural characteristics.

Concept of Information

There is no reason to claim that there is only one way to define information; no non-trivial philosophical or scientific concept was ever defined in a universal, absolute way. In what follows, one particular definition is considered as a potential candidate for work on the challenges to science. Information was defined by the author in his earlier publications as identification of a multiplicity, i.e. anything that makes one out of, or of, the many (Schroeder 2005). One out of the many is a selection of one element out of many, which can be called a selective manifestation of information. Making one of the many is giving the many a binding structure, which can be called a structural manifestation of information. It can be shown that these two manifestations are always coexistent, but for different multiplicities, or, as the author has called them, different information carriers. The degree of determination of the selection (for instance in terms of a probability distribution and the value of entropy for this distribution) can be used as a quantitative characteristic of information when we focus on the selective manifestation. The degree to which the structure can be decomposed into a product of components describes the level of integration of information (Schroeder 2009). Both manifestations can be given one mathematical formalism, which, due to the high level of abstraction of the concept of information, is developed in terms of set theory and general algebra (general closure operators or closure spaces) (Schroeder 2011b).
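A minimal sketch of the integration criterion mentioned above, under the assumption that "decomposed into a product of components" is read as a direct-product factorization of the structure carrying the information:

```latex
S \;\cong\; S_1 \times S_2 \times \cdots \times S_k
```

The more non-trivial factors k into which the carrier S decomposes, the lower the level of integration of the information it carries; an indecomposable S would correspond to fully integrated information.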

With the tool of a general concept of information, we can proceed to the definition of computation in terms of the dynamics of information (Schroeder 2013b, 2013c).

Conclusion

The conceptual framework for information, its structure, integration and dynamics used by the author is certainly not the only possible one. The mathematical formalism developed for these concepts can also be considered a matter of choice. However, we can see that the idea of using some form of information science as a response to the challenges to science is feasible. Now it will be a matter of comparing the different competing approaches to information to find the best response.

References and Notes

Bell, I. R., Caspi, O., Schwartz, G. E., et al. (2002). Integrative medicine and systemic outcomes research: issues in the emergence of a new model for primary health care. Arch. Intern. Med., 162 (2), 133-140.

Von Bertalanffy, L. (1950). An Outline of General System Theory. British Journal for the Philosophy of Science, 1, 134-165.

Maturana, H. R. & Varela, F. J. (1980). Autopoiesis and Cognition: The Realization of the Living. Boston Studies in the Philosophy of Science, vol. 42, Dordrecht: D. Reidel.

Moore, G. E. (1965). Cramming More Components onto Integrated Circuits. Electronics, April 19, pp.114-117; reprinted in Moore (1998), Proc. IEEE, 86 (1), pp. 82-85.

Rashevsky, N. (1965). The Representation of Organisms in Terms of (logical) Predicates. Bull. of Math. Biophysics, 27, 477-491.

Rashevsky, N. (1972) Organismic Sets: Some Reflections on the Nature of Life and Society. Holland, Michigan: Mathematical Biology, Inc.

Rosen, R. (1958). The Representation of Biological Systems from the Standpoint of the Theory of Categories. Bull. of Math. Biophysics, 20, 317-341.

Rosen, R. (1987). Some epistemological issues in physics and biology. In B. J. Hiley and F. D. Peat (Eds.) Quantum Implications: Essays in honour of David Bohm. London: Routledge & Kegan Paul.

Rosen, R. (1991). Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life. New York: Columbia University Press.

Schroeder, M.J., (2005). Philosophical Foundations for the Concept of Information: Selective and Structural Information. In: Proceedings of the Third International Conference on the Foundations of Information Science, Paris 2005, http://www.mdpi.org/fis2005

Schroeder, M.J., (2009). Quantum Coherence without Quantum Mechanics in Modeling the Unity of Consciousness, in: Bruza, P. et al. (Eds.) QI 2009, LNAI 5494, Berlin: Springer, pp. 97-112.

Schroeder, M. J., (2011a). Concept of Information as a Bridge between Mind and Brain. Information, 2 (3), 478-509.

Schroeder, M.J. (2011b). From Philosophy to Theory of Information. Intl. J. Information Theor. and Appl., 18 (1), 56-68.

Schroeder, M. J. (2012). The Role of Information Integration in Demystification of Holistic Methodology. In P. L. Simeonov, L. S. Smith, A. C. Ehresmann (Eds.) Integral Biomathics: Tracing the Road to Reality. Berlin: Springer, pp. 283-296.

Schroeder, M. J. (2013a). The Complexity of Complexity: Structural vs. Quantitative Approach. In: Proceedings of the International Conference on Complexity, Cybernetics, and Informing Science CCISE 2013 in Porto, Portugal, http://www.iiis-summer13.org/ccise/Virtual Session/viewpaper.asp?C2=CC195GT&vc=58/ Accessed 18 July 2014.

Schroeder, M. J., (2013b). Dualism of Selective and Structural Manifestations of Information in Modelling of Information Dynamics. In G. Dodig-Crnkovic and R. Giovagnoli (Eds.): Computing Nature, SAPERE 7, Berlin: Springer, pp. 125-137.

Schroeder, M. J. (2013c). From Proactive to Interactive Theory of Computation. In: M. Bishop and Y. J. Erden (Eds.) The 6th AISB Symposium on Computing and Philosophy: The Scandal of Computation – What is Computation? The Society for the Study of Artificial Intelligence and the Simulation of Behaviour (pp. 47-51).

Schroeder, M.J. (2014). Autonomy of Computation and Observer Dependence. In Preston, J., Erden, Y.J. (Eds.) Proceedings of the 7th AISB Symposium on Computing and Philosophy: Is Computation Observer-Relative? 50th Anniversary Convention of AISB, London, April 1-4, 2014, available at http://doc.gold.ac.uk/aisb50/

Smuts, J. C. (1927). Holism and Evolution. London: Macmillan.

Taleb, N. N. (2007). The Black Swan: The Impact of the Highly Improbable. New York: Random House.

Weaver, W. (1948). Science and Complexity. American Scientist, 36(4), 536-544.

Homo Informaticus: Equal Opportunities for People With Disabilities

Information and Communication Technology (ICT) has become the central access point: more and more interaction with the environment, and also with other human beings, is mediated via ICT. The Human-Computer Interface (HCI) is the entity through which users gain access to virtual representations of real-world processes. The HCI thereby separates the interface from the actual activity and makes it an independent, adaptable entity (Miesenberger, 2009). ICT and HCI, facilitated by sensor technology (Wilson, 2005), make interaction more flexible and independent in two directions, towards the user and towards the environment (Miesenberger et al 2013):

  • Measuring, tracking and representing real-world objects, processes, and even other living or human/social beings in abstract models, as well as processing and reasoning over these models for enhanced activities, allows the integration of interaction into the HCI and advances the potential of a more adaptable and flexible interaction. For people with disabilities we call this field eAccessibility.
  • Progress in sensor technology in measuring, tracking, representing and using the individual skills of a person in terms of controlled activities (e.g. muscle, eye movements, head movements, movements of any part of the body or of the body as a whole, electromyography (EMG), electroencephalography (EEG), up to Brain-Computer Interfaces (BCI)) allows individual skills to be used more effectively for interaction and for accessing the standardized HCI, even by the most severely disabled people. We call this field Assistive Technologies (AT).

Progress in sensor technology and the increased flexibility of the HCI have certainly been core enablers of the ICT revolution in general, "at the desktop" as well as in emerging domains like mobile and embedded systems. Almost every application uses the standardized HCI and integrates into it, allowing the user to apply existing skills and known concepts of interaction to more and more applications. The same holds even more true for people with disabilities, as they, once the standard HCI can be managed, get access to the same systems and services as anybody else: a universal tool for inclusion.

The HCI is thereby fundamentally different from traditional "mechanical" interaction. Each "traditional" device tended to provide its own interface, and technical developments tended towards increasing complexity for users, as a new interface had to be learned each time. The HCI, in contrast, stays stable across tools and applications. This may read as a surprise, since we live in a world where we experience a faster and faster exchange of ICT gadgets in shorter and shorter time spans. But although we change devices and include more and more application domains, we take the known concepts of HCI interaction with us, from device to device, from application to application, and also away from the desktop, using them in many different situations for many different purposes. If new devices and applications did not integrate into this established user experience, the take-up of innovation would be much slower and resistance in society would grow. Since the invention of the desktop and the HCI in the 1960s (Müller-Prove, 2002) we have used basically the same interaction concepts (WIMP: Windows / Icons / Menus / Pointers; SILK: Speech / Images / Language / Knowledge; touch; ...) and manipulation techniques (Point&Click, Drag&Drop, Copy&Paste, swipe, ...) (Miesenberger, 2009). These principles stay stable whatever the acceleration in terms of changing hardware and applications (including "apps") might be. And even in the days of mobile and embedded computing beyond the desktop, developers have learned to support these principles; otherwise users will not follow. Only small and moderate changes, well integrated into existing user experiences, meet with acceptance. Stability and standardization of the HCI are therefore key success factors in the ICT revolution.

But at the same time, as outlined, the HCI is flexible and adaptable for the individual user. A broad range of alternatives and enhancements in terms of interaction techniques, methodologies and devices has become available, allowing individualization and adaptation of the HCI to the needs and preferences of users, the environment, the situation and other characteristics (e.g. Shneiderman, 2012). Once profiled and optimized for the user, the HCI stays stable; the user can take it with her or him and use it for more and more activities. This is what users expect when changing to the next level of the information society (e.g. "Web x.0", the cloud, the Internet of Things).

And this makes the HCI, when accessibility requirements ("eAccessibility") are taken into account, also the core enabler for enhanced eParticipation and eInclusion of people with disabilities. The key challenge is to interface with the HCI and to allow people with disabilities to become active in these virtual representations and, via them, in real-world activities. AT and eAccessibility can focus on this single, stable instance to allow access to more and more diverse systems and services. The HCI provides the freedom of selecting the medium and mode of interaction, which makes it much easier to adapt it to the needs, requirements and preferences of individual users, including those of people with disabilities. The core qualities of ICT/HCI/AT facilitating inclusion are its:

  1. flexibility and adaptability in terms of media representation and modes of interaction
  2. universality in terms of application in almost any aspect of the information society
  3. standardization and stability in its basic principles and techniques.

And with this we allow people to reach out to any systems and services, away from the desktop, to conquer the Internet of Things (Sundmaeker et al, 2010) or online services (Howard & Jones, 2004), where sensor technology provides a virtual representation of the environment (including human beings) and makes it subject to AT/HCI/ICT-mediated interaction. Exploiting this potential of millions and trillions of ("Web x.0") interconnected objects for a more flexible and adaptable Information Society is the core challenge we face in the domain of eAccessibility and AT today, of course while also taking the related risks of security, privacy and, in particular, accessibility (e.g. W3C, 2012) seriously. "The Internet of Things (IoT) is a new actualization of subject-object relationships. Me and my surroundings, objects, clothes, mobility, whatever, will have an added component, a digital potentiality that is potentially outside of 'my' control. Every generation builds its own add-ons to the notions of reality, to what it believes are the foundations of the real." [Sundmaeker et al, 2010, p.26]

Whatever the cultural impact and discussion might be, for people with disabilities this is often the first opportunity for independent, self-determined and non-mediated interaction with the physical and human/social world. This matches the changing understanding of disability: it is no longer an individual or medical phenomenon but is determined in particular by the way we design our environment, accessible or not accessible. With sensor technology and the Internet of Things the environment becomes more and more moldable, and we gain a tool for implementing accessibility. The way we design our environment will determine how people with disabilities can interact and participate.

More than any other individual or group, people with disabilities benefit from progress in sensor technology and from a more flexible and adaptable interaction via virtual representations, which should become accessible through the standardized HCI. Often only a fancy gadget for the average user, AT/HCI/ICT provides unique, and often first-time, access for people with disabilities.

This potential for inclusion, participation and enhanced democracy for many users who were so far excluded from many aspects of our society and culture must not be neglected as part of "homo informaticus". Exploring this potential and cultivating the way it is used need to be balanced.

References

  1. Howard, P. & Jones, S. (Ed.) (2004). Society online. The Internet in Context. Thousand Oaks: Sage Publishing.
  2. Miesenberger, K.; Nussbaum, G.; Ossmann, R. (2013): AsTeRICS: A Framework for Including Sensor Technology into AT Solutions for People with Motor Disabilities, in: Kouroupetroglou, G.: Assistive Technologies and Computer Access for Motor Disabilities, IGI Global.
  3. Miesenberger, K. (2009). Design for All Principles. In Sik Lányi, C.(Ed.). Principles and practice in Europe for e-Accessibility. EDeAN Publication 2009, Veszprém: Panonia University Press.
  4. Müller-Prove, M. (2002). Vision and Reality of Hypertext and Graphical User Interfaces. Dissertation. Universität Hamburg.
  5. Shneiderman, B. (2012). Handbook of Human Factors and Ergonomics. (4th Ed.). Int. J. Hum. Comput. Interaction 28(12): 838.
  6. Sundmaeker, H., Guillemin, P., Friess, P. & Woelfflé, S. (Ed.) (2010): Vision and Challenges for Realising the Internet of Things, CERP – IoT, Cluster of European Research Projects on the Internet of Things. European Commission. Retrieved October 1, 2012, from http://www.internet-of-things-research.eu/pdf/IoT_Clusterbook_March_2010.pdf
  7. W3C (2012). Web Accessibility Initiative (WAI). Retrieved October 1, 2012, from http://www.w3.org/WAI/
  8. Wilson, J. (Ed.) (2005). Sensor Technology Handbook. Burlington: Elsevier.
Orientalism and/as Information: The Indifference That Makes a Difference
Introduction

In a recent work inspired by Borgman's [1] approach to exploring issues at the intersection of information and reality, Chapman [2] outlined a framework for thinking about issues at the intersection of information and religion, viz. information about religion, information for religion, and information as religion. This framework might be complemented by another in which the same issues – that is, information and religion – are reflexively engaged, viz. religion about information, religion for information, and religion as information. However, irrespective of which framework is engaged, in attempting to consider such intersections there is a need to consider how terms should be framed – that is, what 'information' and 'religion' mean in each framework.

In an earlier work [3] which introduced a reflexive hermeneutic framework wherein race is considered from an information-theoretical perspective and information is considered from a critical race-theoretical perspective, genealogical links between Bateson's [4] conceptualization of (a unit of) information as "a difference that makes a difference" and Kantian notions of difference were briefly explored. It was shown that a growing body of critical race philosophy scholarship has demonstrated, somewhat controversially, Kant's seminal contribution to what might be described as a modern 'scientific' concept of race. By way of reference to his writings on philosophical anthropology, which describe non-European 'races' in explicitly racist terms (as ontologically-inferior in some sense), it was argued that, insofar as Kant's racism might not be accidental but rather essential to his philosophy {#1} – more specifically, his epistemology – it is possible that Kantian racism informs the Batesonian concept of information. For example, it was noted that Bateson's concept of information, and Kant's concept of aesthetic judgement which inspired it, are fundamentally teleological (or goal-oriented) in that they appeal to selection, which is a purposeful act; in addition, attention was drawn to Bateson's assertion that difference entails classification and that all classification is hierarchic. This is significant since, as stated previously, Kant, arguably the primary genealogical source of Bateson's conception of information as grounded in difference, is also committed to a hierarchical conception of difference and, in his work on philosophical anthropology, to one in which difference is understood as 'otherness' or 'alterity', rather than change or alteration.

Perhaps most significant for present purposes, however, is Bateson's distinction between what he calls 'Occidental Epistemology' (or OE) and cybernetic epistemology (CE). While it might be argued that Kant’s epistemology with its connection to, if not grounding in, Eurocentric philosophical anthropology is a paradigmatic instance of the former, it was previously argued that this move is problematic on at least three counts: Firstly, Bateson nowhere explores the connections between Kant, epistemology, race and information, nor does he explicitly identify Kant’s epistemology as an instance of OE; secondly, Kantian epistemology is often appealed to in formulating cybernetic conceptions of knowing (including that articulated by Bateson himself, at least with respect to its informational aspect), which means that identifying it as an instance of OE, as against CE, is questionable; thirdly, Bateson’s distinction between OE and CE is itself contestable. In addition, the absence of explicit reference to the 'Orient' in framing the opposition between OE and CE is significant in that it points to what might be regarded as an 'indifference' or marginalization.

In this essay, and inspired by Chun's [5] "race and / as technology", I want to explore the Bateson-Kant – and thereby the information-race – connection further, but from a somewhat different perspective, viz. one in which 'race' and 'religion' are entangled {#2}. Specifically, I want to consider the implications of what Almond [6] has referred to as Kant's Eurocentrically racialized marginalization of – or rather, his 'indifference' to – the (Islamic) Orient in his philosophical anthropology, and what this might mean for Bateson's conception of information. Extending the reflexive hermeneutic framework introduced in [3], I want to consider what it might mean to think about issues at the intersection of information and Orientalism, with the latter framed in terms of a race-religion entanglement; more precisely, I want to examine what it might mean to think about Orientalism from an information-theoretical perspective, and what it might mean to think about information in terms of Orientalism; in short, I want to engage with "Orientalism and / as information".

Methodology

Methodologically, I will proceed by augmenting the critical race theoretical component which informed the reflexive hermeneutic framework introduced earlier with a 'decolonial' perspective wherein issues of epistemology (and ontology) are engaged critically in terms of geo-political and body-political, but also ego-political and theo-political (in the sense of political theology) considerations; such an extension builds on previous work concerned with motivating the conceptualization of a 'decolonial computing' [7]. I want to suggest that Bateson's ostensibly neutral conception of information as "a difference that makes a difference", grounded as it is in Kantian epistemology, might be inflected not only by Kant's "colour-line" (or epidermally-marked) racism but also by what Medovoi [8] has referred to as "dogma-line" (or religiously-marked) racism – more specifically, by a Kantian Orientalism characterized by an indifference to an Islamic 'other' conceived as sensuous (that is, physical or materialistic), superficial (that is, outward, externalist or 'syntactic') and irrational (that is, chaotic or 'noisy'); crucially for Kant, these alleged characteristics of Islam pose a threat to (Eurocentric) rationality – specifically, "the communication of thought" [6, p.29] – the boundaries of which must be preserved through a process of marginalization (bracketing, footnoting), viz. an "indifference [to the Islamic Orient] that makes a difference [to Europe]".

Analysis

It is important to appreciate that Kant's ostensibly philosophical concerns about the 'threat' posed by the Islamic Orient are articulated against a background of much earlier theo-political anxiety about this perceived threat. As a European, Kant articulates his views from a geo-political site that historically emerged out of a prior political formation with a 'religious' orientation, viz. Western Christendom. Crucially, as Mastnak [9] has argued, "Europe as a unity that developed a 'collective identity' and the ability to orchestrate action … was, as a rule, articulated in relation to Muslims as the enemy" {#3}; in short, "European identity was formed not by Islam but, predominantly, in the relationship … to Islam" (p.3). This is significant from an information-theoretical perspective since it points to a relational – that is, systemic – conception of difference such as that proposed by Bateson, yet, given Kant's concern with 'indifference' vis-à-vis the Islamic Orient, one in which the difference that is constituted is characterized by absence – that is, a 'background' position – and 'alterity' (or 'otherness') rather than presence and change {#4}.

Mastnak [9][10][11][12] has argued that the issue is not so much about Islam as a religion in the sense of a doctrine or theology – although such concerns do feature in pre-European discourses within Western Christendom – as it is about Islam as a socio-political order, that is, as what Hodgson [13] has referred to, somewhat problematically since based on a projection from a Western Christian context onto a non-Western Islamic context, as 'Islamdom'. However, Mastnak's position is somewhat problematic: on the one hand, he is correct insofar as European political identity is framed in terms of difference from an Islamicate {#5} order that is not purely theological in the sense of doctrinal – identity and difference being "the same" in the sense of belonging-together [14] as mutually co-constitutive structures; on the other hand, both Mastnak and Hodgson are incorrect insofar as the Islam – Islamdom distinction implies a partitioning of the religious as doctrinal from the political as socially- and bodily-practiced that is an artefact of European secular modernity {#6}. Such a partitioning arguably serves to conceal the operation of racializing logics that function along the "dogma-line", a position which draws support from Lloyd's [15] insistence that "religion and race both name social practices, and bodily practices" (p.80). In the contemporary Western cum global context, "the supremacy of post-Protestant religiosity is maintained by the secularist strategy that marks other groups as having a religion – and so needing special study or accommodation. Secularism is the obverse of religious pluralism: it chooses which religions to recognize and so determines their parameters. Just as the origins of religion and race are intertwined, perhaps these means of controlling religion and race are intertwined: perhaps multiculturalism and secularism go hand in hand, jointly working to distort. The robustness of religious and racial ideas and practices is reduced to one box to check among several – a belief or a skin colour – either way subject to the hegemony of the unmarked: the white post-Protestant." (p.83)

Crucially, Lloyd maintains that in a post-Protestant / secular world, "at most individuals can have personal preferences or desires, countable and quantifiable, so race and religion are disfigured into these terms [emphasis added]." (p.83) In terms of a reflexive hermeneutic framework exploring issues at the intersection of information and religion / race, it is interesting to consider how such a disfigurement – a deformation involving contraction (compression? {#7}) – might be framed in informational terms. For example, is there a correlation between the shift from pre-Protestant 'religion' as socially-embedded and embodied practice (or discursive tradition) to post-Protestant religion as "personal preference or desires" (worldview, grand narrative, ethical framework) and the shift from a contextualized, systemic and semantic conception of information (such as that formulated by Bateson) to a decontextualised, non-systemic and syntactic conception of information (such as that proposed by Shannon) that Malik [16] claims can be and has been instrumentalised in the service of (racialized) capitalism?

Conclusions

Persisting with the theme of concealment, that is, bracketing, indifference or 'absenting' – which can be related to information as a difference that is an absence [17] – and building on earlier arguments presented in [3] which point to a persistent and pervasive "epistemology of ignorance" [18][19] – that is, an occlusion, silencing, and violent erasure of racial 'otherness' that is both foundational to, and provides the tacit contextual background of, contemporary Eurocentric discourses on information and its meaning – I maintain that the "indifference that makes a difference" examined herein has a number of implications in terms of current proposals for information-based political, social and ethical initiatives. Some of these implications will be explored through a critical race / religion theoretical analysis of positions articulated from information-centric perspectives including those due to Malik [16], Deacon [17], Floridi [20][21] and others.

Endnotes

{#1} In this connection, Almond [6] refers to "the disputed status of anthropology within Kant's oeuvre" (p.45). In an attempt at resolving this dispute, Mills [23] provides a comprehensive and critical survey of the various positions on this issue, and makes a case for how Kant's racism impacts on his political and moral philosophy. However, insofar as Kant's racist philosophical anthropology inflects his aesthetics, which Mills concedes, and given that aesthetics arguably has at least some relation to, if not bearing on, epistemology, I want to argue for a stronger, albeit more controversial, position, viz. that Kantian epistemology is itself racially-inflected.

{#2} In this connection, Lloyd [15] is representative of a critical race theoretical tendency within an emerging body of scholarship associated with critical approaches to the study of religion. On his view, "writing on religion for centuries ignored race" (p.83); however, "race and religion are thoroughly entangled, perhaps starting with a shared point of origin in modernity, or in the colonial encounter. If this is the case, religion and race is not just another token of the type 'religion and,' not just one approach to the study of religion among many. Rather, every study of religion would need to be a study of religion and race." (p.80) What this means is that any reflexive hermeneutic consideration of issues at the intersection of information and religion must, of necessity, adopt a critical race theoretical approach; for this reason, (Islamic) Orientalism is framed as both a 'religious' and 'racial' phenomenon in this study.

{#3} It is crucial to point out that the opposition / antagonism at work here between Christendom and Islamdom is not 'religious' in a confessional sense nor trans-geographical in nature. On this point, consider a recent study by Penn [24] establishing that the first Christians to encounter Muslims were not Latin-speaking Christians from the Western Mediterranean or Greek-speaking Christians from Constantinople, but rather Christians from Northern Mesopotamia. Crucially, these Syriac Christians, who lived under Muslim rule from the seventh century to the present, wrote the first and most extensive accounts of Islam, describing a complicated set of religious and cultural exchanges not reducible to the solely antagonistic.

{#4} In this connection, it is significant to note that Deacon [17], whose bio-systemic semiotic conception of information draws upon contributions from Shannon (communication theory, syntactics), Boltzmann (thermodynamics, semantics) and Darwin (evolution, pragmatics), insists that information is fundamentally concerned with absence rather than presence.

{#5} By 'Islamicate', I refer to a term coined by historian Marshall Hodgson [13], viz. something that "would refer not directly to the religion, Islam, itself, but to the social and cultural complex historically associated with Islam and the Muslims, both among Muslims themselves and even when found among non-Muslims." (p.59). In short, the Islamicate refers to that which is associated with the 'civilizational complex' grounded in and emerging from Islam, yet not necessarily characterised by fidelity to Islam in any doctrinal or 'confessional' sense. Although Hodgson's characterization of Islam as a 'religion' is problematic, his distinction between Islam and the Islamicate is useful insofar as it might function as a difference that makes a difference in terms of problematizing what 'religion' means in relation to Islam, and the implications of this for European identity formation.

{#6} According to Lloyd [15], "a focus on religious beliefs and ideas was a product of a very specific religious background, namely, Protestantism … Scholarship had unthinkingly accepted the Reformation dismissal of ritual, practice, objects, bodies, and media as magical (Catholic) non-sense, not a proper part of Christianity." (p.81) As an alternative to thinking about religion in private and 'doctrinal' terms, Lloyd follows Asad [22] and others in "turning to tradition as a frame for analysis", whereby tradition is meant "a set of practices, including styles of reasoning, that grows out of a shared history, has shared values implicit within it, and is supported by institutions. Traditions in this sense are dynamic and contested, having among their components practices for contestation and transformation." (p.82)

{#7} On this point, consider Almond's [6] characterisation of Kant as "a central figure in the Enlightenment footnoting of Islam, a pivotal stage in the rationalist reduction of the Muslim Orient to a curious appendage [emphasis added]." (p.29)

References

1. Borgmann, A. Holding on to Reality: The Nature of Information at the Turn of the Millennium. University of Chicago Press: Chicago, 2009.
2. Chapman, D.A. Information and Religion. Images of Europe: Past, Present, Future. Proceedings of the 14th International Conference of the International Society for the Study of European Ideas (ISSEI 2014). (In Press)
3. Ali, S.M. Race: The Difference That Makes a Difference. tripleC: Cognition, Communication, Cooperation 2013, 11(1), 93-106.
4. Bateson, G. Steps to an Ecology of Mind. University of Chicago Press: Chicago, 1972.
5. Chun, W.H.K. Race and/as Technology; or, How to Do Things to Race. Camera Obscura 70 2009, 24(1), 6-35.
6. Almond, I. Kant, Islam and the Preservation of Boundaries. In History of Islam in German Thought. Routledge: New York, 2011, pp.29-52.
7. Ali, S.M. Towards a Decolonial Computing. In Ambiguous Technologies: Philosophical Issues, Practical Solutions, Human Nature: Proceedings of the Tenth International Conference on Computer Ethics – Philosophical Enquiry (CEPE 2013). Elizabeth A. Buchanan, Paul B. de Laat, Herman T. Tavani and Jenny Klucarich, Eds. International Society of Ethics and Information Technology: Portugal, 2014, pp.28-35.
8. Medovoi, L. Dogma-Line Racism: Islamophobia and the Second Axis of Race. Social Text 111 2012, 30(2), 43-74.
9. Mastnak, T. Islam and the Creation of European Identity. CSD Perspectives. Centre for the Study of Democracy. Research Papers, Number 4. University of Westminster Press: Westminster, 1994.
10. Mastnak, T. Fictions in Political Thought: Las Casas, Sepulveda, the Indians, and the Turks. Fil. vest. / Acta Phil. 1994, XV(2), 127-149.
11. Mastnak, T. Europe and the Muslims: The Permanent Crusade? In The New Crusades: Constructing the Muslim Enemy. Emran Qureshi and Michael Sells, Eds. Columbia University Press: Columbia, 2003, pp.205-248.
12. Mastnak, T. Western Hostility toward Muslims: A History of the Present. In Islamophobia / Islamophilia: Beyond the Politics of Enemy and Friend. Andrew Shryock, Ed. Indiana University Press: Bloomington, 2010, pp.29-52.
13. Hodgson, M. The Venture of Islam: Conscience and History in a World Civilization. Volume 1 The Classical Age of Islam. University of Chicago Press: Chicago, 1974.
14. Heidegger, M. Identity and Difference. Translated and with an Introduction by Joan Stambaugh. Harper & Row: New York, 1969.
15. Lloyd, V. Race and religion: Contribution to symposium on critical approaches to the study of religion. Critical Research on Religion 2013, 1(1), 80-86.
16. Malik, S. Information and Knowledge. Theory, Culture & Society 2005, 22(1), 29-49.
17. Deacon, T.W. What is missing from theories of information? In Information and the Nature of Reality. Paul Davies and Niels Gregersen, Eds. Cambridge University Press: Cambridge, 2010, pp.123-142.
18. Mills, C.W. The Racial Contract. Cornell University Press: Ithaca, 1997.
19. Mills, C.W. White Ignorance. In Race and Epistemologies of Ignorance. Shannon Sullivan and Nancy Tuana, Eds. SUNY Press: Albany, 2007, pp.13-38.
20. Floridi, L. The Information Society and Its Philosophy: Introduction to the Special Issue on "The Philosophy of Information, Its Nature, and Future Developments." The Information Society 2009, 25(3), 153-158.
21. Floridi, L. Information: A Very Short Introduction. Oxford University Press: Oxford, 2010.
22. Asad, T. Genealogies of Religion: Discipline and Power in Christianity and Islam. Johns Hopkins University Press: Baltimore, 1993.
23. Mills, C.W. Kant and Race, Redux. Graduate Faculty Philosophy Journal 2014, 35(1–2), 1-33.
24. Penn, M.P. When Christians First Met Muslims: A Sourcebook of the Earliest Syriac Writings on Islam. University of California Press: California, 2015.
  • Open access
  • 119 Reads
Health Information Technology: Empowering Consumers, Patients, and Caregivers

Introduction

Traditionally in health care, knowledge and authority have rested with medical professionals and care was delivered in professional settings. Individuals have been considered solely as “patients,” i.e. defined by their relationship to doctors. Health information technology (HIT) is enabling consumers (i.e. an individual outside of a patient context), patients, and family caregivers to more fully understand health and illness, to self-manage health and illness at home when feasible, and to partner with their medical providers when necessary. As a result, there is the potential for a re-balancing of the power relationship between doctors and patients toward greater collaboration (including family caregivers), and increased attention to contexts of daily life in which “health happens.” However, the available technologies and their actual implementation are currently insufficient to empower consumers, patients and caregivers to fulfill their new responsibilities.

Key Health Information Technologies

Examples of typical HIT tools and functions for consumers, patients and caregivers include the following.

  • Communication between physicians and patients via secure messaging
  • Patient web portals, usually linked to a provider's electronic health record system
  • Personal health records
  • Web sites, social media sites, and online games
  • The purchase of health-related products and services over the Internet
  • Mobile, wearable, home-based or implanted devices that track and report data, such as heart rate, blood pressure and insulin levels
  • Online health information searches
  • Blogs, forums, and social media applications that allow people to share their experiences and pose questions [1]

Consumers, patients, and caregivers may have sole control of the technologies, for example, personal monitoring devices, or they may share control with others, such as patient portals that are part of their providers’ web sites and linked to patients’ electronic health records. When consumers, patients, and caregivers choose and control access to and use of health technologies themselves, they can determine how actively they participate, what information to include, who they want to share information with, and when to begin and end use, as well as define what is accomplished by using the technology. Consumers, patients, and caregivers often enter the information themselves in the device or the device collects information on command or passively. Technologies with shared control, such as patient portals, also allow consumers, patients, and caregivers to determine their level of participation (none, a little or a lot), but the types of information allowed, the rules for initiating and terminating use, as well as information sharing, are set by the organization providing the portal [2].
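
The sole-control versus shared-control distinction can be made concrete with a small data-model sketch in Python. This is a purely hypothetical illustration (the class, field names, and example tools are invented here, not taken from any standard, portal, or product): who controls the tool determines who sets the allowed data types and the sharing rules.

```python
# Illustrative sketch only: hypothetical names, not drawn from any real system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class HITTool:
    name: str
    controller: str                    # "patient" = sole control, "provider" = shared control
    data_types_allowed: List[str]      # set by whoever controls the tool
    shared_with: List[str] = field(default_factory=list)

    def patient_sets_data_types(self) -> bool:
        # Under sole control the patient decides what information to include;
        # under shared control the hosting organization sets the allowed types and rules.
        return self.controller == "patient"

# Sole control: the patient decides content, sharing, and when to begin and end use.
glucose_tracker = HITTool("home glucose tracker", "patient", ["glucose", "notes"])
# Shared control: the provider's portal fixes the data types and sharing rules.
portal = HITTool("provider patient portal", "provider",
                 ["labs", "visit summaries"], shared_with=["patient", "caregiver"])
```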

Medical professionals, employers, and policy makers often perceive “patient empowerment” strategies as means to the end of getting patients to take greater responsibility – both in terms of participating in decision-making and paying a larger share of costs – for their health and health care [1] [3]. Patients value HIT tools that make it easier to participate in their care. Many are eager to know at least some of what their providers know, and tools like portals are the first healthcare innovation that facilitates information sharing and communication, both of which open the door to collaboration. In their own words, they think this “levels the playing field” [4] [5].

With proper safeguards, digital health technologies can create significant opportunities to find health information; do more self-help and self-care; create and maintain personal health records; access personal health information held by providers; consult with healthcare providers through secure messaging or telehealth technologies; transact healthcare-related business; and purchase health-related goods and services electronically (digital health commerce).

Problems with HIT

While the digital divide has narrowed over time, access disparities by age, income, and education remain [6]. Issues of health literacy and usability pose barriers to HIT use across age, race, ethnicity, and literacy boundaries [2]. Nevertheless, researchers find that people with less education or limited health literacy skills will value and use HIT if someone explains the tools to them, if they perceive the tools to be helpful, and if they receive encouragement and support in registering for portals and completing tasks [1].

Other issues arise from changing definitions of what constitutes health information, from concepts of ownership and control of personal health information, and from new challenges with respect to ensuring the quality of health information being created by disparate individuals and enterprises. First, health information is no longer just the clinical data that are created by doctors' visits, hospitalizations, and lab tests and that reside in institutional medical records. It now includes an array of longitudinal information on, for example, prevention, wellness, previous health experiences, alternative and complementary medicine, and over-the-counter remedies. Currently, there are no widely employed data standards to enable all these different bits of personal health information to be seamlessly integrated. [1] Second, concerns about violations of privacy rights and the confidentiality of health information will continue to influence debates about who owns or controls health information and who should have access to which information. Even if individuals are the presumed owners of their information, the reality is that current information-handling policies and practices give individuals few concrete ways to control the movement of their information among multiple parties.

Technologies coming from the consumer-oriented market are more end-user oriented than tools offered by providers, but their products tend to be divided by function and typically are not interoperable with providers’ portals. This sector is driving innovation rapidly, but the unfortunate trend is a proliferation of new stand-alone products. Moreover, most consumer apps fall outside any regulatory framework that would prevent the companies from selling personal information. [2]

Conclusions

Even though HIT tools, such as patient portals and health apps, are increasingly available across the socioeconomic spectrum, consumers, patients, and caregivers still confront a healthcare system coming to terms with technology’s consequences for the patient experience and healthcare service delivery. Making HIT tools available is necessary but not sufficient; making them consumer-, patient- and caregiver-centric allows the tools to be truly useful to the people who will derive the greatest benefits.

Ultimately, tools for consumers, patients and caregivers will achieve their potential when they are as highly valued, and as well developed, as the professional elements of the electronic health information infrastructure. Momentum is building in many countries toward holding consumers and patients accountable financially and morally for their health. Before going further in that direction, those who influence health policy, practice, and technology would do well to understand which tools and approaches enable consumers, patients and caregivers with many different needs and capabilities to fully participate in managing health and care.

Acknowledgments

The author thanks Cynthia Baur, Ph.D., U.S. Centers for Disease Control and Prevention, for her collaboration on two recent book chapters from which this article was partially derived.

References and Notes

  1. Deering, MJ and Baur C. Patient Portals as Channels for Patient-Provider Collaboration. In Information Technology for Patient Empowerment in Healthcare; Grando MA, Rozenblum R, Bates DW, Eds.; De Gruyter: Berlin, Germany, Boston, United States; Spring 2015.
  2. Baur C and Deering MJ. E-Health for Consumers, Patients, and Caregivers, In Ethical Challenges in the Management of Health Information, 3rd ed; Harmon LB, Ed.; Aspen Publishers Inc.: New York, United States; expected publication Summer 2015.
  3. Goldzweig CL, Orshansky G, Paige NM, Towfigh AA, Haggstrom DA, Miake-Lye I, Beroes JM, Shekelle PG. Electronic patient portals: Evidence on health outcomes, satisfaction, efficiency, and attitudes: a systematic review. Ann Intern Med. 2013; 159(10):677-87.    
  4. Woods SS, Schwartz E, Tuepker A, Press NA, Nazi KM, Turvey CL, Nichol WP. Patient experiences with full electronic access to health records and clinical notes through the My HealtheVet personal health record pilot: Qualitative study. J Med Internet Res. [Internet]. 2013 [cited 2014 Jun 9]; 15(3) e65. Available from http://www.jmir.org/2013/3/e65/ doi:10.2196/jmir.2356.
  5. Osborn CY, Mayberry LS, Wallston KA, Johnson KB, Elasy TA. Understanding patient portal use: Implications for medication management. J Med Internet Res [Internet]. 2013 Mar 7 [cited 2014 Jun 9];15(7):e133. Available from http://www.jmir.org/2013/7/e133/ doi:10.2196/jmir.2589
  6. Pew Internet and American Life Project [Internet]. Internet User Demographics 2014 Jan. Washington, D.C.: Pew Internet and American Life Project. [cited 2015 March 19]. Available from http://www.pewinternet.org/data-trend/internet-use/latest-stats/.
  • Open access
  • 80 Reads
One Problem - One Thousand Faces (The Bridging of Philosophy and Science)

Introduction

Philosophy and science may appear as two distinct, even antagonistic disciplines, but this ignores their shared origin as Natural Philosophy. Natural Philosophy, from the start, focuses on grasping Earthly cause-and-effect, and this remains the case for philosophy and science. In realizing a human "information edifice," our success with asking philosophical questions and finding scientific answers yields a fount of knowledge such that philosophy and science now seem divided by that success – the One Thousand Faces referenced in the title.

This presentation reprises the core matter of Earthly cause-and-effect to infer a new foundational/informational vista. The posited view aims to complement and bridge (or perhaps even surpass) philosophic and scientific models – by developing a type of "information science" or "core informatics."

Development

This talk begins by recognizing basic issues that underpin all "asking" and "answers" – core informational topics. It then presents one way in which humanity, in partial fashion, addresses those issues. The descriptive-explanatory model used is a common Hard Disk Drive (HDD) and the Information Technology (IT) principles it embodies – which affords a "most reductive" (a priori) view. The IT principles referenced are: metadata, used to model "meaning-ful" information; and mechanical interpretation, used to designate "meaning-ful" information. Due to time limits, this talk discusses only metadata; joint coverage of metadata and interpretation is modeled in a detailed accompanying paper (PSYCHE AS AN INFORMATIONAL STRATEGY – General Information Theory [14,000 words]) available at: http://issuu.com/mabundis/docs.

The accompanying paper presents a reductive "functionalist information theory" remedy to historic issues in modeling the human Psyche (consciousness, intelligence, etc.). It uses established IT models to make a contrastive analysis of the operations of IT and Psyche. The paper argues that all information, at a minimum (whether IT or Psyche), has an inviolate dual aspect of "form + content." This unified dual material aspect is then shown to resolve the duality typical of historic views of Psyche (Hard Problem, Signal Grounding Problem). With this dual aspect in mind, the paper next posits a role for natural selection in the formation of Psyche . . . and of all functionalist (cause-and-effect) sense-ability.
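
As a rough illustration of this claimed "form + content" dual aspect in HDD/IT terms, consider the toy Python sketch below. It is not the accompanying paper's formalism; the class and field names are invented for illustration. The same stored bytes (form) only yield "meaning-ful" content once metadata tells a mechanical interpreter how to read them.

```python
from dataclasses import dataclass

@dataclass
class StoredItem:
    form: bytes       # the raw physical pattern recorded on the drive
    encoding: str     # metadata: how the bytes are to be decoded
    kind: str         # metadata: what the decoded bytes are supposed to represent

    def interpret(self):
        # "Mechanical interpretation": the system applies the metadata to the form.
        if self.kind == "text":
            return self.form.decode(self.encoding)
        return self.form  # without further metadata, only the bare form is available

item = StoredItem(form="entropy".encode("utf-8"), encoding="utf-8", kind="text")
print(item.interpret())  # the same bytes, now readable as content
```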

To present “a role for natural selection,” the paper develops a genus of Shannon’s signal entropy, Bateson’s differences, Monod’s material necessity and chance, and Darwinian reproduction, as enabling a generative/emergent foil contra natural selection. Ensuing events (recurring effective-and-efficient recombinant roles) then bring about an extant ontology and epistemology for Psyche – or, a surviving objective-subjective intelligence. This innately unified vista defines a direct link between material entropy, evident in the world in many forms, and a more personal subjective sense of signal entropy (identity), that is equally evident.

Conclusion

The paper offers a precise (unified) taxonomy as general information theory (GIT) – arguing for the development of effective Universal Theories of Information (UTIs). As such, it significantly expands the classic view of information beyond Shannon's signal entropy and displaces the typical role of thermodynamic entropy as “noise” with “material entropy.” The implication of this broad “natural informatics” (thinking like nature) is that it affords a likely organizing principle for multi-state/quantum computers, strong artificial intelligence, cognitive materials, and the like. An 11 minute video of this talk’s animated slides (without audio) is available at: http://youtu.be/dlnu-KCQ70o.

References

  1. Abundis, M. (2015). Psyche as an informational strategy (general information theory). Issuu.com [online] Available at: <http://issuu.com/mabundis/docs> [Accessed 1 January 2014].
  • Open access
  • 107 Reads
Computational Account of Emotion, an Oxymoron?

In this work we address the belief that cognitive processes such as emotions cannot be modelled computationally. We base our argument on the info-computational naturalist approach to cognition, where computation is understood as information processing on several levels of organisation of cognitive agency, and where an agent is defined as an entity capable of acting on its own behalf. We also argue that Daniel Kahneman's fast and slow thinking systems can be explained within our model. In doing so we connect information, computation and cognition as a dynamic triangular relationship.

Introduction

We take a broadly naturalist stance toward mental (cognitive) phenomena such as intentionality and the allegedly subjective and qualitative aspects pertaining to phenomenal consciousness, e.g. in emotions and sensory experience. Emotions and feelings are occasionally excluded from cognition, and it is often argued that cognitive processes are computational only in the sense of symbol processing, so that emotions and feelings cannot be computational phenomena.

We argue that feelings and emotions naturally belong to cognition and even evolutionarily precede symbol-manipulating cognition. Furthermore, we draw upon a new understanding of computation known as natural computation or computing nature, which includes both sub-symbolic and symbolic information processing and thus is capable of modelling both sub-symbolic and symbolic cognitive functions.

As Kahneman (2011) argues, we rely greatly in many contexts on our "fast thinking" capabilities that help us manage complex situations quickly. They also help us make first steps in approximations and anticipations. Fast thinking processes are very much based on experience, and thus on memory, both evolutionarily (built into the morphology of an organism) and developmentally. We propose, as implied by Strannegård et al. (2013), that Kahneman's systems 1 and 2 correspond roughly to sub-symbolic and symbolic cognition, respectively.
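
To make the proposed correspondence concrete, the following Python fragment is a hypothetical sketch (not Strannegård et al.'s architecture; the function, names, and threshold are invented for illustration): a fast, experience-based path answers when a sufficiently similar situation has been seen before, and a slow, rule-based path deliberates otherwise.

```python
# Hypothetical sketch of the System 1 / System 2 correspondence suggested above.
# 'memory' is assumed to be a dict mapping (hashable) past situations to responses.

def dual_process_decide(situation, memory, rules, similarity, threshold=0.9):
    # System 1 (sub-symbolic, fast): approximate pattern match against past experience.
    best = max(memory, key=lambda past: similarity(situation, past), default=None)
    if best is not None and similarity(situation, best) >= threshold:
        return memory[best]             # cached response, no deliberation

    # System 2 (symbolic, slow): explicit reasoning over rules.
    for condition, action in rules:
        if condition(situation):
            memory[situation] = action  # deliberation result becomes new experience
            return action
    return None
```

Caching successful deliberations as experience is one simple way to express the idea that fast thinking is "very much based on experience, and thus on memory".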

Naturalism and its Critique

Our computational approach can be described as naturalist computationalism, as expounded upon by Dodig-Crnkovic (2014). One typical criticism against naturalistic theories of mind is the claim that naturalism identifies the mind with the physical body. The important distinction that this claim fails to make is that mind is a process and not an object.

Yet, just as running presupposes the existence of a physical body, the mind cannot be decoupled from the body. We argue that a specific physical substrate such as the embodied brain is both necessary and sufficient to explain the natural occurrence of mind. The mind-body relation was succinctly expressed by Minsky (1988): minds are what brains do.

What we claim is that mind as a process can be adequately modelled as computation, specifically natural computation. When it comes to critique of computational approaches to mind, there are frequent claims that symbol manipulation, neural networks and dynamical models are all mutually exclusive. Fresco (2014) shows, however, that this criticism is unfounded. All three approaches are applicable, albeit in different domains, levels of organisation, and aspects of cognition.

Problem with Qualia

A further problem often invoked against naturalism (which is often wrongly considered to be identical with physicalism, a stronger stance than naturalism) is its supposed incapability to account for "the existence of qualia" and "the nature of intentionality".

We agree with Dennett (1996) that qualia are by no means the hard problem of consciousness, but a simple feature of a natural organism. The fact that each of us has our own subjective feeling of pain or joy is not unlike the fact that each of us has our own handwriting. Even though handwriting is not identical with a hand (or a pen!), it is a result of coordinated processes between our hand (holding a pen), arm, and the rest of the body providing the right posture. Furthermore, senses (with sensors + actuators) such as vision and touch are involved, together with the whole visual process, via nerves transmitting and partly processing information. The brain, containing memories of previously experienced writing, integrates information, processes it and makes decisions that are further propagated towards the body, arm, hand, fingers, pen… Handwriting as an artefact, then, is a result of information processing in many parts of our body. Moreover, this process proceeds on many levels of organisation – from the molecular (although this level is typically not considered to be computational, we identify molecular processes with natural computation), to the cellular level and up to organs and the organism as a whole (Dodig-Crnkovic, 2013, 2014). Organisms are themselves part of a distributed cognitive system that provides a social framework from which the rules of the alphabet and other conventions about writing come.

In short, handwriting is not a hand, and in the same way, cognition is not a brain. Qualia just reflect the fact that each human being is a unique human being. The difference in qualia between individuals might be relatively big, as it is in the case of handwriting, but we are capable of communicating our subjective feelings to others, who have no problem interpreting them. Even though one person's experience of colour is probably not identical with any other person's, neither are two persons' bodies identical, and we do not find that fact particularly difficult to understand.

Cognition as Computational Process

The aim of this naturalisation project is to understand cognition, including feeling and emotion, in terms of computational processes, from the molecular level up. In a way, it is a modern kind of reductionism, a new kind of generative reduction where we do not reduce an object to some smaller more fundamental objects (as physics reduces macroscopic bodies into atoms and even smaller elementary particles down to strings). We are proposing to reduce complex processes to interactions of simpler processes that undergo phase transitions (as observed in nature) – from the level of molecules and their networks to cells and aggregates of cells such as organisms and their networks (Dodig-Crnkovic, 2014). In that way, cognitive processes can be modelled as emerging on different levels of scale in living organisms. The aim, then, is to be able to model cognitive processes and behaviours of different classes of agents based on an understanding of the underlying chemical processes that form biological processes that exhibit cognitive behaviour.

The reductionist project in physics led to the reduction of macroscopic properties of (ideal) gases, such as pressure or temperature, to the kinetic behaviour of gas molecules. In kinetic theory an ideal gas is modelled as the random motion of large numbers of atoms or molecules. However, the important difference between an ideal gas and a living organism is in the complexity of their structures and interactions. Unlike an ideal gas, where identical molecules are assumed to move completely randomly, a living organism is highly heterogeneous and quasi-regularly organised, with a very complex unit – the cell – as the basis of each organism's organisation. Each cell consists of thousands of different types of parts which form compounds that later dissolve in a complex dynamical process. Understanding an organism's behaviour, even on the cellular level, is a goal we are still far from. However, with present-day research methods and modelling tools (in the first place computational tools), we anticipate that in the near future we will have simulation tools capable of modelling the behaviour of a living organism in increasingly realistic ways. Starting from the simplest forms of cognition in the living cell, we can increase our understanding of the underlying mechanisms on which cognition in more complex organisms is based.

We propose a constructive step in the improvement of our understanding of mental, or better, cognitive (as cognitive science has a clear naturalistic and scientific orientation) phenomena, that would connect observed macroscopic behaviour and processes with their complex and layered bio-chemical basis. This project will not reduce feeling to a molecule but will connect the observed cognitive process with its biochemical generative basis.

Computational Account of Emotions and Kahneman

The computational theory of mind has been criticised for not providing a satisfactory explanation of emotion. We argue that, on the contrary, computational theories not only explain how emotion arises, but furthermore make a strong case for the evolutionary advantage of emotion (von Haugwitz et al., 2012). Neuroscience has, over the decades since its foundation, been elucidating the biochemical basis of emotion in the brain, and the physiological effects of various neurotransmitters are increasingly well understood. The neurotransmitters most associated with emotion have also been shown to regulate learning in humans by providing an intrinsic reward system, modulating exploration, balancing long- and short-term planning, and controlling the learning rate (Doya, 2002, 2008). The capability to dynamically modulate these parameters is beneficial in non-stationary environments such as the real world, and such an algorithm is proposed by Schweighofer and Doya (2003). A computational model of how emotions are generated, in terms of the utility function of the organism, can be derived from appraisal theory, which suggests that (at least a large class of) emotions arise as a result of the organism's appraisal of a situation, rather than as a function of the situation itself (Marinier and Laird, 2009). We thus have a theory of implementation, an evolutionary and mathematical motivation, and a generative computational description of emotion.
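
The following minimal Python sketch is only loosely inspired by the neuromodulation-as-metalearning picture of Doya (2002) and Schweighofer and Doya (2003); it is not any of the cited models, and all constants and the appraisal rule are illustrative. It shows one way "emotion-like" appraisal of prediction errors might modulate a reinforcement learner's meta-parameters (learning rate, discount factor, exploration temperature).

```python
import math
import random
from collections import defaultdict

class EmotionModulatedQLearner:
    """Illustrative sketch: Q-learning whose meta-parameters are modulated by a crude
    'appraisal' signal, a running average of reward prediction errors."""

    def __init__(self, actions):
        self.q = defaultdict(float)   # Q-values indexed by (state, action)
        self.actions = actions
        self.alpha = 0.1              # learning rate (acetylcholine-like role in Doya's picture)
        self.gamma = 0.9              # discount factor (serotonin-like role)
        self.beta = 2.0               # inverse temperature (noradrenaline-like role)
        self.avg_error = 0.0          # appraisal: smoothed magnitude of prediction errors

    def act(self, state):
        # Softmax (Boltzmann) action selection: higher beta means less exploration.
        prefs = [math.exp(self.beta * self.q[(state, a)]) for a in self.actions]
        r, cum = random.random() * sum(prefs), 0.0
        for a, p in zip(self.actions, prefs):
            cum += p
            if r <= cum:
                return a
        return self.actions[-1]

    def learn(self, state, action, reward, next_state):
        # Temporal-difference error (dopamine-like teaching signal).
        target = reward + self.gamma * max(self.q[(next_state, a)] for a in self.actions)
        error = target - self.q[(state, action)]
        self.q[(state, action)] += self.alpha * error

        # Crude appraisal: track average surprise, then modulate meta-parameters
        # (more surprise: learn faster and explore more). Constants are arbitrary.
        self.avg_error = 0.9 * self.avg_error + 0.1 * abs(error)
        self.alpha = min(0.5, 0.05 + 0.2 * self.avg_error)
        self.beta = max(0.5, 3.0 - self.avg_error)
```

A richer version could replace the prediction-error average with appraisal variables such as goal relevance or controllability, in the spirit of the appraisal-theoretic model of Marinier and Laird.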

References and Notes

Dennett, D. (1996) Facing backwards on the problem of consciousness. Journal of Consciousness Studies 3 (1), pp 4–6.

Dodig-Crnkovic, G. (2014) Modeling Life as Cognitive Info-Computation, In: Computability in Europe 2014, Arnold Beckmann, Erzsébet Csuhaj-Varjú and Klaus Meer (Eds.) Proceedings of the 10th Computability in Europe 2014, Language, Life, Limits, Budapest, Hungary, June 23 - 27, 2014, LNCS, Springer

Dodig-Crnkovic G. (2013) Information, Computation, Cognition. Agency-based Hierarchies of Levels. PT-AI St Antony's College, Oxford,  20.09.2013  http://arxiv.org/abs/1311.0413

Doya, K. (2002). Metalearning and neuromodulation. Neural Networks. doi:10.1016/S0893-6080(02)00044-8

Doya, K. (2008). Modulators of decision making. Nature Neuroscience, 11(4), 410–416. doi:10.1038/nn2077

Fresco, N. (2014) Physical Computation and Cognitive Science. Berlin Heidelberg: Springer, Studies in Applied Philosophy, Epistemology and Rational Ethics, Vol. 12, XXII

von Haugwitz, R., Kitamura, Y., & Takashima, K. (2012). Modulating reinforcement-learning parameters using agent emotions. The 6th International Conference on Soft Computing and Intelligent Systems, and The 13th International Symposium on Advanced Intelligence Systems, 1281–1285. doi:10.1109/SCIS-ISIS.2012.6505340

Kahneman, D. (2011) Thinking, fast and slow. Macmillan.

Marinier, R. P., Laird, J. E., & Lewis, R. L. (2009). A computational unification of cognitive behavior and emotion. Cognitive Systems Research, 10(1), 48–69. doi:10.1016/j.cogsys.2008.03.004

Minsky, M. (1988). The Society of Mind. Simon and Schuster, New York.

Strannegård, C., von Haugwitz, R., Wessberg, J., & Balkenius, C. (2013). A Cognitive Architecture Based on Dual Process Theory. AGI.

Schweighofer, N., & Doya, K. (2003). Meta-learning in reinforcement learning. Neural Networks, 16(1), 5–9. doi:10.1016/S0893-6080(02)00228-9

  • Open access
  • 50 Reads
Emergence of Information and Value Formation - Important Categories for Theory Development - About the Idle but Wrong Attempt to Teach War Robots Ethics

If we want to grasp the essence of information, then we must take its emergence into account. [1],[2],[3] This requires an interaction between form (syntax), content (semantics) and effect (pragmatics) of information as three qualitatively different, interdependent process stages of information production and its utilization: mapping, interpreting, evaluating.
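
A minimal illustrative sketch of these three interdependent stages, written in Python (this is not the authors' formalism; the functions and example structures are invented), might look as follows. The point is only that evaluation selects which interpreted patterns are retained as information at all.

```python
def mapping(signal):
    """Syntactic stage: fix a pattern (form) out of the raw signal."""
    return tuple(signal)

def interpreting(pattern, context):
    """Semantic stage: assign the pattern a meaning relative to a receiver's context."""
    return context.get(pattern, "unknown")

def evaluating(meaning, goals):
    """Pragmatic stage: assess the effect/value of that meaning for the receiver."""
    return goals.get(meaning, 0)

def information_process(signal, context, goals):
    pattern = mapping(signal)
    meaning = interpreting(pattern, context)
    value = evaluating(meaning, goals)
    # Only evaluated (selected) meanings are retained: emergence rather than mere transmission.
    return (pattern, meaning, value) if value > 0 else None
```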

The emergence of information is a new category in the theory of biology [4] and of organization. Cybernetics (first order), i.e. control technology, always presupposed information. Informatics, which developed later, also recognizes the concept of information processing, but until now the concept of information emergence was not, or only insufficiently, acknowledged.

When we compare technical control with the regulation processes of cell metabolism, a comparison arises between the technical automaton and the living organism. This leads us to a central conclusion: the living, developing organism is principally distinguished from the technical automaton by the processes of emerging information and formation of values in a process of self-organization. [5]

The principle of creativity, of information generation, provides epistemological and methodological guidance. The principle of the generation of information has been of fundamental importance for the building of models and theories at the transition zone of physics, chemistry and biology. These questions about the characteristic features of information are particularly topical in molecular biology, in the neurosciences, in linguistics, in the paradigm controversy of cognitive science and in AI research, and even in the modern theory of enterprise organization. This clearly shows that information emergence and value formation are very important categories for theory development at the boundary between physics/chemistry and biology (to understand the origin of life) and between computers (software) and the human mind, information systems and creative, learning social organizations. [6]

The notion of complementarity was first presented by Niels Bohr in 1927 to harmonize the conflicting views taken by different physicists. Bohr gradually extended the complementarity concept to a much wider domain, and it took on the character of a general theory on the nature of human knowledge, sometimes called the "Copenhagen Spirit". Many scientists, including Schrödinger and Delbrück, were fascinated by Niels Bohr's lecture 'Light and Life', and conjectured that for the ultimate understanding of life, some novel, fundamental property of matter must first be found, most likely via the discovery of an intuitively paradoxical biological phenomenon.

The development of molecular biology showed that such a paradox does not exist. It should be said that N. Bohr, in his 1962 lecture "Light and Life Revisited," acknowledged that the success of molecular biology had meanwhile transcended the instrumental limits on the understanding of life that he had foreseen thirty years earlier. Manfred Eigen [1] clearly said [7] that we do not need a 'new physics' but something 'new in physics' – that is, 'information'. We have an information theory but not a theory of 'information generation'.

The role that the concept of information generation plays in the theory of the origin of life, as well as in model and theory formation at the boundaries between physics, chemistry and biology, and also between computer science and the humanities [7], [5], [1], [8], [9], still has to be investigated much more.

The epistemological and methodological implications of the concept of creativity, of information generation, can inspire ideas in nearly all areas of human interest. It provides methodological guidance to navigate between the Scylla of crude reductionism: reduction of life to pure physics and chemistry (physicalism) and ‘mind–brain identity’ (neurophilosophy), strong connectionist AI research, and the Charybdis of dualism: of mind-and-matter dualism and hardware-software-dualism, of strong cognitivist AI research.

With the example of the "molecular biologist and the chicken," we intend to illustrate the emergence of biological information, in order to show the basic ideas underlying our evolutionary stages concept of information. In this respect, we arrive at a general affirmation: information is not a substance; it exists as a relation between a sender and a receiver. A special aspect of this relation is the structure of the information carriers generated by evolution [9],[10],[11], such as the DNA code. The information phenomenon cannot be reduced to the mapping or syntactic aspect on any of the different levels of complex systems.

Indeed, there is no information concept that can be reduced to one of the process stages: mapping, interpreting, evaluating. It is not possible to reduce the pragmatics of information to semantics, or semantics to syntax. Humans and humanity cannot be reduced to machines.

With the "evolutionary stage concept of information" developed by us, we consider the processes of information emergence and value formation in the processes of the self organization

The realization of a function has a complex structure as a prerequisite, which can only be formed on the basis of information, which, however, in turn is only created and preserved by this special function. This connection of structure and function is arranged by meanings, which are formed only in this process of interaction. Information therefore arises only when, with the realization of the function, by its effect, an assessment (and with that a selection) has been made, by which the information gets its meaning. It is a circular process: this process has information as a prerequisite, which is generated only in this process. This chicken-or-egg problem dissolves in this complex circular interaction. [7]

Information is indeed, on the one hand, a condition for the origin of life and, on the other hand, information only arises with the origin of life.

Figure 1.

(see PDF version for the Figure).

 

We represent the evolution of information, the emergence of the meaning of information, in the interaction of (syntactic) pattern, (semantic) meaning and (pragmatic) evaluation, as an evolution across five different levels of organismic/human and social communication processes:

  1. Macromolecules; 2. Nervous system; 3. Consciousness of the environment; 4. Consciousness of society; 5. Consciousness of values.

 

Figure 2.

(see PDF version for the Figure).

 

We have to recognize a relatively new form of universal interconnection. Just as quantum physics had to learn that the motion of an electron is only one aspect of the whole, and just as biology had to learn that living organization does not simply consist of parts which can be analyzed and subsequently recomposed, we need to take into account the interconnection of mapping, interpretation and evaluation as specific and interrelated process stages in the generation, use and preservation of information.    

In this paper we look especially at the fifth level. As the level of self-awareness, it can also be described as the consciousness of values.

If we are concerned with the self-development of the human personality, we proceed from the assumption that human beings live in society and follow the social values which have been formed in the process of social development. Values serve to reduce the complexity of human behavior, of human actions and interests. At the same time, the development of society is also the development of its system of values. With the development of social information and communication, a development of societal consciousness and self-consciousness, and of social and individual values, also takes place.

This signifies that the information-processing approach (of cognitivist and connectionist AI research) indeed provides too narrow an understanding of information and values. In the case of the "New AI", the development of relatively autonomous systems like robots, it is necessary to differentiate between pre-rational (intuitive), rational, post-rational (intuitive) and irrational action. Only objective rational actions are the subject of rationalization, and only formalized actions are the subject of automation. In novel situations or when problems occur, analytic (rule-based) approaches use a selection of known possible actions.

The paradox of safety shows that by increasing the degree of automation we gain more safety and stability, but at the same time we create vulnerability of society as a whole through its increased dependence on such sophisticated technological systems. [12]

The paradox of safety makes clear that informatics has to adopt a different image of man: the concept of complete or super-automation, in the sense of a complete reduction (or exclusion) of human participation, is misleading. The concept of Man as a faulty living being, as the unsafe element to be replaced, is misleading.

Man must not be seen as a disturbing factor which can be more or less completely replaced by modern information technologies, but rather as the only creative productive force, as the subject of all progress and of all development. "Since information generation is a process that allows novelty to emerge, it goes beyond a mechanical process that can be formalised, expressed by mathematical function, or carried out by a computer." [3, p. 171]

The intuition of man represents an essential component for coping with complex tasks. Human intuition extends the possibilities of decision making beyond what would be possible with rational abilities alone.

Differentiation of actions and the value of intuition

Intuition is an essential component for mastering complex tasks. Human intuition extends the possibilities of decision making beyond what would be possible with rational (formalized) abilities alone.

Figure 3.

(see PDF version for the Figure).

 

Ethical decisions of Man, too, are not based only on (formal) rules; the possibility of making ethical decisions goes beyond the (formal) rational abilities of computers – of war robots. This is especially important when responding to alarm or fault conditions in battle.

The five-stage skill-acquisition model of Hubert and Stuart Dreyfus also has ethical implications. [13]

Parallelism between the acquisition of skills ( H. Dreyfus, S. Dreyfus [13]) and the levels of performance (J.R. Blau, K. Fuchs-Kittowski [12])

At a low level of complexity, the response is likely to be linear and parallel to the complexity of the situation. As complexity increases, however, the response to alarm or fault conditions becomes unstable and eventually unbounded.

The only way to bound the response is to introduce post-rational criteria into the system design. A differentiation between pre-rational, rational, post-rational and irrational actions is necessary.

Figure 4.

(see PDF version for the Figure).

 

We assume with Hubert and Stuart Dreyfus [14] that acting ethically is a skill. We use their phenomenological description of the five stages of skill acquisition to show that an ethics based on principles corresponds to a beginner's reliance on rules and so is developmentally inferior to an ethics based on expert response, which holds that, after long experience, the ethical expert learns to respond appropriately to each unique situation.

Civilian autonomous vehicles, too, can get into complicated accident situations. They must be programmed to behave in such a way that as many human lives as possible are spared. Is this already moral behaviour? Of course one can programme a war robot to stop shooting if a man stands with raised hands in front of it, and it is certainly possible to teach an armed drone to stop bombing if it sees a Red Cross on the roof of a house. Is this, however, really moral behaviour; do these systems follow ethical principles? Are they thus really equal to the complex situations of a combat mission? Surely not!

Ethics often exists in terms of a set of rules that are intended to supplement the basic laws of the community. Such rules, like those of professional societies, e.g. the "ACM Code of Ethics and Professional Conduct", are important, but certainly not designed to help with such serious ethical decisions as David Parnas or Edward Snowden had to make.

But even if we do not follow the phenomenological approach and think, like I. Kant and others, that there is a rational basis of morality and ethics, so that even complicated, deeper ethical decisions are based on rules and principles, one is still not able to teach the computer moral behaviour, because the basic principles are very general and can only be followed with a deep understanding of the situation. On the basis of the evolutionary stage concept of information it becomes obvious that the level of self-awareness, or consciousness of values, is a meta-level of information generation and value formation, based on the syntax of syntax, the semantics of semantics and the pragmatics of pragmatics.

Ethical values are rooted in empathy, which has "its real depth and width in the deep respect for life," as was pointed out above all by A. Schweitzer in his Nobel Peace Prize acceptance speech in Oslo, 1954 [15].

However, computers are not participants in a social process; they are not personalities whose development is shaped by living a life that involves an interweaving of biological, psychological and social processes.

As can be demonstrated on the basis of the evolutionary stage concept of information, on the stage of self-consciousness (the "stage of values"), the level of ethical expertise, ethical decision-making requires post-rational actions which are not a subject of automation, because they involve intuition and information generation. Expert (ethical) decisions no longer rely on rules, guidelines or maxims; they rely on an intuitive grasp of situations, based on deep tacit understanding.

Some scientists therefore make the unrealistic claim that they could teach ethics to war robots. It is an idle but wrong attempt! Deeper reflections on the relationship between technology and intelligence, computer (software) and the human mind in its social context show that the process of information generation and value formation is far more complicated. To ensure interhuman understanding, to be able to generate social meanings and meta-meanings, and to form social values, the robot would require a long period of socialization in the human community.

War robots and armed drones will not have any ethics; they should be banned as soon as possible to prevent a new, senseless arms race!

References

[1] Fuchs-Kittowski, K.; Reflections on the essence of information, Chr. Floyd, H. Züllighoven, R. Budde, R. Keil-Slawik Editors, Software Development and Reality Construction, Springer Verlag, Berlin, New York: 1992. p.p. 416 - 432

[2] Fuchs-Kittowski, K.; Information neither Matter nor Mind – On the Essence and on the Evolutionary Stages Concept of Information, W. Hofkirchner, Editor, The Quest for a Unified Theory of Information, Proceedings of the Second International Conference on the Foundations of Information Science. Vienna University of Technology, 11-15 June 1996, World Future, Gordon and Breach Publishers, Australia, Canada, 1997, Vol. 50. p.p. 551 – 570.

[3] Hofkirchner, W.; Emergent Information – A Unified Theory of Information Framework, World Scientific, New Jersey, London, 2013

[4] Fuchs-Kittowski, K.; Information und Biologie: Informationsentstehung – eine neue Kategorie für eine Theorie der Biologie. – In: Biochemie – ein Katalysator der Biowissenschaften. Kolloquium der Leibniz-Sozietät am 20. November 1997 anlässlich des 85. Geburtstages von Samuel Mitja Rapo­port. Sitzungsberichte der Leibnitz - Sozietät. Berlin, Leibniz-Sozietät, 22 (1998) 3. S. 5- 17

[5] Fuchs-Kittowski, K.; Probleme des Determinismus und der Kybernetik in der molekularen Biologie, VEB Gustav Fischer Verlag, Jena 1976

[6] Fuchs-Kittowski, K.; Heinrich, L. J.: Rolf, A.; Information entsteht in Organisa­tionen: - in kreativen Unternehmen- wissen­schaftstheoretische und methodologische Kon­sequenzen für die Wirt­schaftsinformatik. In: Wirtschaft­sinformatik und Wissenschafts­theorie - Bestand­saufnahme und Per­spektiven, J. Becker, W. König, R. Schütte, O. Wendt, S. Zelewski, Ed., Wiesbaden: Betrieb­swirtschaftlicher Verlag Dr. Th. Gabler GmbH 1999, 329-361

[7] Eigen, M.; Selforganization of Matter and the Evolution of Biological Macromolecules, Naturwissenschaften, 58:10 (1971), pp. 465–523.

[8] Küppers, B.-O.; Der Ursprung biologischer Information, Piper, München 1986

[9] Ebeling, W.; Freund, J.; Schweizer, S.; Entropie – Information – Komplexität, Stuttgart, Leipzig, 1998

[10] Ebeling, W.; Evolution of strings – On the borderline between order and chaos, in: H.-M. Voigt: Evolution and optimization, Series Mathematical Ecology, Akademie-Verlag, Berlin 1989

[11] Ebeling, W.; Feistel, R.; Selforganisation of Symbols and Information, in: G. Nicolis, V. Basios, Editors, Chaos, Information Processing and Paradoxical Games: To the Memory of John S. Nicolis, World Scientific, 2014, pp. 141–184

[12] Fuchs-Kittowski, K.; System design, design of work and of organization. The paradox of safety, the orgware concept, the necessity for a new culture in information systems and software development, in: P. van den Besselare, A. Clement, P. Jarvinen, Editors, North – Holland, Amsterdam, 1991

[13] Dreyfus, H. L.; What is Moral Maturity? Towards a Phenomenology of Ethical Expertise. In James Ogilvy, ed., Revisioning Philosophy. Albany: State University of New York Press, 1992.

[14] Dreyfus, H. L.; Dreyfus, S. E.; Mind over Machine – The Power of Human Intuition and Expertise in the Era of the Computer, The Free Press, A Division of Macmillan, Inc., New York, 1986

[15] Schweitzer, A.; Das Problem des Friedens in der heutigen Zeit. Rede bei der Entgegennahme des Nobel-Friedenspreises in Oslo am 4. November 1954, Verlag C.H. Beck, München 1955
