Could a person ever transcend what it is like to be in the world as a human being? Could we ever know what it is like to be other creatures? Questions about the overcoming of a human perspective are not uncommon in the history of philosophy. In the last century, those very interrogatives were notably raised by the American philosopher Thomas Nagel in the context of the philosophy of mind. In his 1974 essay What Is It Like to Be a Bat?, Nagel offered reflections on human subjectivity and its constraints. Nagel's insights were elaborated before the social diffusion of computers and could not anticipate the cultural impact of technological artefacts capable of materializing interactive simulated worlds as well as disclosing virtual alternatives to the "self." In this sense, this article proposes an understanding of computers as epistemological and ontological instruments. Embracing a phenomenological standpoint entails that philosophical issues are engaged and understood from a fundamentally practical perspective. In terms of philosophical praxis, or "applied philosophy," I explored the relationship between human phenomenologies and digital mediation through the design and development of experimental video games. For instance, I conceptualized the first-person action-adventure video game Haerfest (Technically Finished 2009) as a digital re-formulation of the questions posed in Nagel's famous essay. Experiencing a bat's perceptual equipment in Haerfest practically corroborates Nagel's conclusions: there is no way for humans to map, reproduce, or even experience the consciousness of an actual bat. Although its correspondence to the experience of actual bats is unverifiable, Haerfest still grants access to experiences and perceptions that, albeit inescapably within the boundaries of human kinds of phenomenologies, were inaccessible to humans prior to the advent of computers. Phenomenological alterations and virtual experiences disclosed by interactive digital media cannot take place without a shift in human kinds of ontologies, a shift which this study recognizes as the fundamental ground for the development of a new humanism. (I deem it necessary to specify that I am not utilizing the term "humanism" in its common connotation, that is to say the one that emerged from the encounter between the Roman civilization and the late Hellenistic culture. According to this conventional acceptation, humanism indicates the realization of the human essence through "scholarship and training in good conduct" (Heidegger 1998, p. 244). However, Heidegger observed that this understanding of humanism does not truly cater to the original essence of human beings, but rather "is determined with regard to an already established interpretation of nature, history, world, and [...] beings as a whole" (Heidegger 1998, p. 245). The German thinker found this way of embracing humanism reductive: a byproduct of Western metaphysics. As Heidegger himself specified in his 1949 essay Letter on Humanism, his opposition to the traditional acceptation of the term humanism does not advocate for the "inhuman" or a return to the "barbaric" but stems instead from the belief that humanism can only be properly understood and restored in culture as a more original way of meditating on and caring for humanity and understanding its relationship with Being.)
Additionally, this study explicitly proposes and exemplifies the use of interactive digital technology as a medium for testing, developing and disseminating philosophical notions, problems and hypotheses in ways that are alternative to the traditional textual one. Presented as virtual experiences, philosophical concepts can be accessed without the filter of subjective imagination. In a persistent, interactive, simulated environment, I claim that the crafting and the mediation of thought take on a novel, projective (In Martin Heidegger's 1927 Being and Time, the term "projectivity" indicates the way a Being opens to the world in terms of its possibilities of being (Heidegger 1962, pp. 184-185, BT 145). Inspired by Heidegger's and Vilém Flusser's work in the field of philosophy of technology as well as Helmuth Plessner's anthropological position presented in his 1928 book Die Stufen des Organischen und der Mensch. Einleitung in die philosophische Anthropologie, this study understands the concept of projectivity as the innate openness of human beings to construct themselves and their world by means of technical artefacts. In this sense, this study proposes a fundamental understanding of technology as the materialization of mankind's tendency to overcome its physical, perceptual and communicative limitations.) dimension which I propose to call "augmented ontology."
To study the ways in which compounds can induce adverse effects, toxicologists have been constructing Adverse Outcome Pathways (AOPs). An AOP can be considered a pragmatic tool to capture and visualize the mechanisms underlying different types of toxicity inflicted by any kind of stressor, and describes the interactions between key entities that lead to the adverse outcome across multiple biological levels of organization. The construction or optimization of an AOP is a labor-intensive process, which currently depends on the manual search, collection, review and synthesis of the available scientific literature. This process could, however, be largely facilitated by using Natural Language Processing (NLP) to extract the information contained in scientific literature in a systematic, objective, and rapid manner, leading to greater accuracy and reproducibility. This would allow researchers to invest their expertise in the substantive assessment of the AOPs, replacing the time spent on evidence gathering with a critical review of the data extracted by NLP. As case examples, we selected two frequent adversities observed in the liver: cholestasis and steatosis, denoting the accumulation of bile and lipid, respectively. We used deep learning language models to recognize entities of interest in text and establish causal relationships between them. We demonstrate how an NLP pipeline combining Named Entity Recognition and a simple rule-based relationship extraction model not only helps screen compounds related to liver adversities in the literature, but also extracts mechanistic information on how such adversities develop, from the molecular to the organismal level. Finally, we provide some perspectives opened by the recent progress in Large Language Models and how these could be used in the future. We propose that this work brings two main contributions: 1) a proof-of-concept that NLP can support the extraction of information from text for modern toxicology and 2) a template open-source model for the recognition of toxicological entities and the extraction of their relationships. All resources are openly accessible via GitHub (https://github.com/ontox-project/en-tox).
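To illustrate the kind of pipeline described above, the following minimal sketch combines a transformer-based Named Entity Recognition step with a simple rule-based heuristic that links entities appearing around a causal trigger phrase. It is not the en-tox pipeline itself: the model name, the trigger list, and the helper function are assumptions chosen only to show the general approach.

```python
# Minimal illustrative sketch: NER + rule-based causal relation extraction.
# NOTE: this is NOT the en-tox implementation; the model, labels and trigger
# phrases below are assumptions used only to demonstrate the general idea.
from transformers import pipeline

# A generic biomedical NER model (assumed; substitute any suitable model).
ner = pipeline(
    "token-classification",
    model="d4data/biomedical-ner-all",
    aggregation_strategy="simple",
)

# Assumed list of causal trigger phrases linking a source to a target entity.
CAUSAL_TRIGGERS = ["induces", "leads to", "causes", "results in"]

def extract_relations(sentence: str):
    """Return (source entity, trigger, target entity) triples found in one sentence."""
    entities = ner(sentence)
    relations = []
    lowered = sentence.lower()
    for trigger in CAUSAL_TRIGGERS:
        idx = lowered.find(trigger)
        if idx == -1:
            continue
        # Simple rule: entities before the trigger are sources, entities after are targets.
        sources = [e for e in entities if e["end"] <= idx]
        targets = [e for e in entities if e["start"] >= idx + len(trigger)]
        for s in sources:
            for t in targets:
                relations.append((s["word"], trigger, t["word"]))
    return relations

# Example sentence; the exact entities returned depend on the chosen NER model.
print(extract_relations(
    "Inhibition of the bile salt export pump leads to accumulation of bile acids."
))
```

In practice, such rules would be tuned to the entity types relevant to AOPs (stressors, key events, adverse outcomes), and the extracted triples would be reviewed by experts before being used to support AOP construction.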