<?xml version="1.0" encoding="utf-8"?>
<journal>
  <titleid>75447</titleid>
  <issn>2712-9934</issn>
  <journalInfo lang="ENG">
    <title>Technology and Language</title>
  </journalInfo>
  <issue>
    <volume>3</volume>
    <number>1</number>
    <altNumber>6</altNumber>
    <dateUni>2022</dateUni>
    <pages>1-154</pages>
    <articles>
      <article>
        <artType>EDI</artType>
        <langPubl>RUS</langPubl>
        <pages>1-8</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <orcid>0000-0002-0837-9129</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Guangdong University of Foreign Studies</orgName>
              <surname>Cheng</surname>
              <initials>Lin</initials>
              <address>Guangzhou, China</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">The Construction of the Robot in Language and Culture, “Intercultural Robotics” and the “Third Robot Culture”</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">Robots are not only technological artifacts but also elements of human culture. They play important roles as the double, replica, tool, and companion of humans. The anthropomorphic characteristics of robots give rise to philosophical reflection as well as linguistic and (inter)cultural phenomena. Inspiration can be drawn from exploring how robots are imagined, defined, described, comprehended, constructed or misunderstood, and from observing the changing relationships between humans and robots from an intercultural and interdisciplinary perspective. For instance, “intercultural robotics” and the “third robot culture” deserve more attention.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.01</doi>
          <udk>008:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Robot</keyword>
            <keyword>Robot culture</keyword>
            <keyword>Intercultural robotics</keyword>
            <keyword>Sobject</keyword>
            <keyword>Island hypothesis</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.1/</furl>
          <file>1-8.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>9-16</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>Capital Normal University</orgName>
              <surname>Wu</surname>
              <initials>Shijueshan</initials>
              <address>Beijing, China</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">The Intellectual Turn and Cultural Transfer of “Humanoid Automata” from the Ancient World to the Enlightenment Era</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">This study examines the origin and development of the “android” in the Western world, from antiquity to the Enlightenment era. The manufacture of android automata is not only a technological advance but also reveals an intellectual shift from the Middle Ages to the Enlightenment, involving cultural transfers from different civilizations in ancient times. “Humanoid automata” offer an insight into medieval beliefs and practices as mechanical mimesis in the investigation of the relations between art and nature. Android automata in the 18th century represent Enlightenment ideas through their affective communication. This historical context could provide an important reference for today’s research on human–robot interaction.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.02</doi>
          <udk>94:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Humanoid automata</keyword>
            <keyword>Enlightenment automata</keyword>
            <keyword>Android</keyword>
            <keyword>Robot</keyword>
            <keyword>Human–robot interaction</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.2/</furl>
          <file>9-16.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>17-28</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <orcid>0000-0001-8296-4361</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Peter the Great St. Petersburg Polytechnic University</orgName>
              <surname>Romanenko</surname>
              <initials>Taras</initials>
              <address>St. Petersburg, Russia</address>
            </individInfo>
          </author>
          <author num="002">
            <authorCodes>
              <orcid>0000-0001-9576-1002</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Peter the Great St. Petersburg Polytechnic University</orgName>
              <surname>Shcherbinina</surname>
              <initials>Polina</initials>
              <address>St. Petersburg, Russia</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Robot vs Worker</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The word “robot” first appeared in 1920 in the play “R.U.R.” by the Czech writer Karel Capek. Within a few years, the play was translated into more than 30 languages, contributing to the spread of the new term around the world. The word “robot” was preserved in almost all translations, one of the few exceptions being Alexei Tolstoy’s Russian adaptation entitled “Riot of the Machines” (1924). Although in Russian, as in Czech, there is an etymological connection between “robot” and “work” (rabota), the translator Tolstoy abandoned the new term, calling robots “workers” (rabotnik), that is, refusing to give them a separate name and equating them to working people. Although the origin of the words “work” and “worker” in Russian (as well as in Czech) is associated with slavery and forced labor, in Soviet times they acquired a strongly positive connotation. If for Capek the difference between robots and people becomes the fault line of the play, for Tolstoy their similarity – the performance of work – is what matters most. Accordingly, the theme of the robots’ rebellion against humans is replaced by the rebellion of workers, whether of natural or artificial origin, against their oppressors.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.03</doi>
          <udk>821.161.1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Robot</keyword>
            <keyword>Worker</keyword>
            <keyword>Artificial</keyword>
            <keyword>Philistine</keyword>
            <keyword>R.U.R.</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.3/</furl>
          <file>17-28.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>29-39</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>Darmstadt Technical University</orgName>
              <surname>Liggieri</surname>
              <initials>Kevin</initials>
              <address>Darmstadt, Germany</address>
            </individInfo>
          </author>
          <author num="002">
            <authorCodes>
              <orcid>0000-0001-7102-7479</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Darmstadt Technical University</orgName>
              <surname>Tamborini</surname>
              <initials>Marco</initials>
              <address>Darmstadt, Germany</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">The Body, the Soul, the Robot: 21st-Century Monism</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The thesis we will defend in the following pages is twofold. First, we indicate two linguistic-cultural turning points in the concept of the robot. The introduction of the body and the soul in the machine has paved the way towards new technical and epistemic possibilities and, thus, it has granted a new conceptual definition of robot. Second, we propose a return to Descartes as a starting point for a reinterpretation and redefinition of the concept of robot in the contemporary world. Here we will show how Cartesian dualism (in the description of humans) becomes a (material) monism in the development and construction of robots. As a result, we call on our fellow philosophers and historians of science and technology to explore, critique, reject, or further investigate the features of the 21st-century material monism proposed in this paper.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.04</doi>
          <udk>11:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Robotics</keyword>
            <keyword>Descartes</keyword>
            <keyword>Cybernetics</keyword>
            <keyword>Robotic Monism</keyword>
            <keyword>Cartesian Dualism</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.4/</furl>
          <file>29-39.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>40-46</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>Sun Yat-sen University</orgName>
              <surname>Jiang</surname>
              <initials>Hui</initials>
              <address>Guangzhou, China</address>
            </individInfo>
          </author>
          <author num="002">
            <authorCodes>
              <orcid>0000-0002-0837-9129</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Guangdong University of Foreign Studies</orgName>
              <surname>Cheng</surname>
              <initials>Lin</initials>
              <address>Guangzhou, China</address>
            </individInfo>
          </author>
          <author num="003">
            <individInfo lang="ENG">
              <orgName>Osaka University</orgName>
              <surname>Ishiguro</surname>
              <initials>Hiroshi</initials>
              <address>Osaka, Japan</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">The Blurring of the Boundaries between Humans and Robots is a Good Thing and a New Species would be Born: An Interview with Hiroshi Ishiguro</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The documentary Philosophy in the Age of Desire records a short encounter between Hiroshi Ishiguro and Markus Gabriel in 2018. Their exchange on the role of technology in human life, on the conception of human being, and other topics revealed noticeable differences between the German philosopher and the Japanese engineer. Four years later, two separate interviews follow up on their conversation. In this interview, Hiroshi Ishiguro makes several points. First, there is no clear definition of what a human being, intelligence, or emotion is, so people can come to understand human beings at a meta-level by making robots, at least drawing inspiration for understanding complex human functions from the reactions of robots. Second, robots have crossed the “uncanny valley” in some situations. Third, the blurring of the boundaries between humans and robots is a good thing: a new species will be born, and people will accept its members as human beings. Fourth, after the COVID-19 pandemic, robots that can be operated remotely will be widely utilized. In addition, regarding the roots of Japan’s robot culture, Hiroshi Ishiguro proposes the “Island Hypothesis.”</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.05</doi>
          <udk>008:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Hiroshi Ishiguro</keyword>
            <keyword>Humanoid</keyword>
            <keyword>Robot</keyword>
            <keyword>Uncanny Valley</keyword>
            <keyword>Island Hypothesis</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.5/</furl>
          <file>40-46.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>47-56</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <surname>Li</surname>
              <initials>Yue</initials>
            </individInfo>
          </author>
          <author num="002">
            <individInfo lang="ENG">
              <orgName>University of Bonn</orgName>
              <surname>Gabriel</surname>
              <initials>Markus</initials>
              <address>Bonn, Germany</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Diverse Cultures, Universal Capacity: An Interview with Markus Gabriel</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The documentary Philosophy in the Age of Desire records a short encounter between Markus Gabriel and Hiroshi Ishiguro’s Geminoid in 2018. Their exchange on the role of technology in human life, on the conception of human being, and other topics revealed noticeable differences between the German philosopher and the Japanese engineer, but can these be interpreted as “cultural” differences? Four years later, two separate interviews follow up on their conversation. This interview explores their differences by examining Gabriel’s own experiences with AI and his definitions of related concepts such as “intelligence,” “ethics,” and “consciousness.” Gabriel emphasizes that due to our organic precondition there is only a lower-level response in terms of self-understanding; it is only the variability in the expression of self-understandings that results from cultural construction. Focusing on the universal basis of humanity and the influences from Asian philosophy regarding human becoming, Gabriel calls for further investigation of the cultural presentations of artificial intelligence.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.06</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Human-Machine Interaction</keyword>
            <keyword>Intelligence</keyword>
            <keyword>Ethics</keyword>
            <keyword>Universalism</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.6/</furl>
          <file>47-56.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>57-75</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <scopusid>22233758600</scopusid>
              <orcid>0000-0001-9576-1002</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Department of Philosophy, University of Vienna</orgName>
              <surname>Coeckelbergh</surname>
              <initials>Mark</initials>
              <address>Universitätsring 1, 1010 Vienna, Austria</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">You, Robot: on the Linguistic Construction of Artificial Others</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">How can we make sense of the idea of ‘personal’ or ‘social’ relations with robots? Starting from a social and phenomenological approach to human–robot relations, this paper explores how we can better understand and evaluate these relations by attending to the ways our conscious experience of the robot and the human–robot relation is mediated by language. It is argued that our talk about and to robots is not a mere representation of an objective robotic or social-interactive reality, but rather interprets and co-shapes our relation to these artificial quasi-others. Our use of language also changes as a result of our experiences and practices. This happens when people start talking to robots. In addition, this paper responds to the ethical objection that talking to and with robots is both unreal and deceptive. It is concluded that in order to give meaning to human–robot relations, to arrive at a more balanced ethical judgment, and to reflect on our current form of life, we should complement existing objective-scientific methodologies of social robotics and interaction studies with interpretations of the words, conversations, and stories in and about human–robot relations.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.07</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Human–robot relations</keyword>
            <keyword>Robot ethics</keyword>
            <keyword>Language</keyword>
            <keyword>Phenomenology</keyword>
            <keyword>Hermeneutics</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.7/</furl>
          <file>57-75.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>76-81</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>Darmstadt Technical University</orgName>
              <surname>Ullmann</surname>
              <initials>Larissa</initials>
              <address>Darmstadt, Germany</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">The Quasi-Other as a Sobject</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">This comment on Mark Coeckelbergh’s text “You, robot: on the linguistic construction of artificial others” concerns the concrete use of linguistic terms to describe the technical other, the robot, and its relationship to humans. There are many characteristics that a robot can have that are very similar to humans and interpersonal relations, but they are not human, they are quasi-human. This phenomenon is, amongst others, constructed and interpreted linguistically, but on the other hand, there is no linguistic term that could describe it unambiguously, so it can only be studied in direct human comparison, in a quasi-human way. This comment demonstrates why the use of the quasi is problematic and suggests that the phenomenon can instead be analyzed in a techno-philosophical-phenomenological context within the framework of the sobject approach. The term sobject describes a kind of technical object to which humans can have deeper relations than to conventional objects. It therefore provides space to study the phenomenon on a phenomenological level, without the need for a permanent direct human comparison. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.08</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Human-robot relation</keyword>
            <keyword>Language</keyword>
            <keyword>Phenomenology</keyword>
            <keyword>Technical others</keyword>
            <keyword>Objects and sobjects</keyword>
            <keyword>Artificial Intelligence</keyword>
            <keyword>Hermeneutics</keyword>
            <keyword>Sobject</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.8/</furl>
          <file>76-81.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>82-103</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>Leuphana Universität Lüneburg</orgName>
              <surname>Xylander</surname>
              <initials>Cheryce von</initials>
              <address>Universitätsallee 1, 21335 Lüneburg, Germany</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Quipping Equipment: Apropos of Robots and Kantian Chatbots</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">Robots, Bourdieu, Kant, and Sex – Coeckelbergh’s philosophy of technical assemblages has it all. This commentary considers his early work “on the linguistic construction of artificial others” in light of his later elaboration of a general theory of human-technology interaction. Coeckelbergh draws on “habitus”-theory, virtue ethics and a historically recontextualized Kantianism to propose nothing less than a new general moral philosophy for the technoscientific age. In so doing, he also conjures up something beguilingly elusive if not impossible – a pluralist personalism. Readers vested in pluralist accounts of agency and epistemic contingency will appreciate his invoking Bourdieu and Kant, thinkers prioritizing communalist over particularist interests. Readers of a personalist bent will welcome the voluntarism of his moral regimen – they like their reality served up in person-shaped bits, a perspective that prioritizes self-direction and self-possession. Two for the price of one: here everyone feels affirmed. Coeckelbergh appears to take the defining parameters of experience to be wholly contextual and, in equal measure, intrinsic. In squaring the circle, he also showcases a lurid scenario: sex with robots. The electrifying effect of this bold composition is to set the mind racing toward a position more coherent and less familiar than pluralist personalism. Central to this position is a conception of Gemüt as emergent reflexivity. Its consideration takes us via Immanuel Kant and Kant-Culture Research to such strange aberrations as corporate cannibalism and cyborg pillow talk. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.09</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Commodified Agency</keyword>
            <keyword>Gemüt</keyword>
            <keyword>Kant-Culture Research</keyword>
            <keyword>Digital Cannibalism</keyword>
            <keyword>Personalism</keyword>
            <keyword>Kantbot</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.9/</furl>
          <file>82-103.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>104-110</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <orcid>0000-0002-7740-6768</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Darmstadt Technical University</orgName>
              <surname>Pezzica</surname>
              <initials>Leon</initials>
              <address>Darmstadt, Germany</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">On Talkwithability. Communicative Affordances and Robotic Deception</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">This paper operates within Mark Coeckelbergh’s framework of the linguistic construction of robots. Human-robot relations are conceptualised as affordances which are linguistically mediated, being shaped by both the linguistic performances surrounding human-robot interaction and the robot’s characteristics. If the robot signifies the affordance of engaging in human-human-like conversation (talkwithability), but lacks the real affordance to do so, the robot is to be thought of as deceptive. Robot deception is therefore a question of robot design. Deception by robots not only has ethically relevant consequences for the communicating individual, but also long-term effects on the human culture of trust. Mark Coeckelbergh’s account of the linguistic construction of robots as quasi-subjects excludes the possibility of deceptive robots. According to Coeckelbergh, to formulate such a deception objection, one needs to make problematic assumptions about the robot being a mere thing as well as about the authentic, which one must assume can be observed from an objective point of view. It is shown that the affordance-based deception objection to personal robots proposed in this paper can be defended against Coeckelbergh’s critique, as the detection of affordances is purely experience-based and the occurrence of deception via affordance-gaps is not in principle limited to robots. In addition, no claims about authenticity are made; instead, affordance-gaps are a matter of appropriate robot signals. Possible methods of bridging the affordance-gap are discussed. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.10</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Human-robot relations</keyword>
            <keyword>Robot ethics</keyword>
            <keyword>Language</keyword>
            <keyword>Affordances</keyword>
            <keyword>Deception</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.10/</furl>
          <file>104-110.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>111-126</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <researcherid>J-9548-2017</researcherid>
              <scopusid>57210142445</scopusid>
              <orcid>0000-0002-7956-4647</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Department of Social Science, Peter the Great St. Petersburg Polytechnic University</orgName>
              <surname>Bylieva</surname>
              <initials>Daria</initials>
              <address>St. Petersburg, Russia</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Language of AI</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">In modern human-robot relations, language plays a significant role. Language used to be viewed as a purely human technology, but today it is being mastered by non-humans. Chatbots, voice assistants, embodied conversational agents and robots have acquired the capacity for linguistic interaction and often present themselves as humanoid persons. Humans begin to perceive them ambivalently, as they would acknowledge an Other inside the make-believe of a game. The use of artificial neural nets instead of symbolic representations of human cognitive processes in AI technology leads to self-learning models. Thus AI uses language in a manner that is not predetermined by human ways of using it. How language is interpreted and employed by AI may influence, even alter, social reality. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.11</doi>
          <udk>1:004.032.26</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>AI</keyword>
            <keyword>Language</keyword>
            <keyword>Virtual personal assistant</keyword>
            <keyword>Neural Machine Translation</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.11/</furl>
          <file>111-126.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>127-135</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <orgName>University of Aarhus</orgName>
              <surname>Hasse</surname>
              <initials>Cathrine</initials>
              <address>Aarhus, Denmark</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Language and Robots: from Relations to Processes of Relations</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The word “robot” does not have a fixed meaning, and human interactions with robots do not somehow bring it to the fore. Mark Coeckelbergh suggests as much when he presents linguistic interaction with robots as a process of becoming aware of a quasi-personal relation. A focus on material linguistic practices yields a very different story of shifting signifiers that are subject to human experiences of changing relations with robots. The material encounter with robots is prefigured by the cultural presence of robots in many stories from popular culture. These produce an anticipation of the human-like, quasi-personal qualities of robots and an initial willingness to embrace these. Over the course of time, however, and through linguistic encounters with robots, one rather learns that they are quite foreign and, finally, merely machines. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.12</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Processes of relations</keyword>
            <keyword>Social robots</keyword>
            <keyword>Socio-linguistic Artefacts</keyword>
            <keyword>Lev Vygotsky</keyword>
            <keyword>Material-conceptual meaning</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.12/</furl>
          <file>127-135.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>136-146</pages>
        <authors>
          <author num="001">
            <individInfo lang="ENG">
              <surname>Li</surname>
              <initials>Yue</initials>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Affirming and Denying the Hybrid Character of Robots: Literary Investigations</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">The social relation between humans and robots can be observed through the words used in human-robot verbal interaction (Coeckelbergh, 2011). This study reviews Mark Coeckelbergh’s theory in the literary context by regarding writing and co-writing as linguistic interaction between humans and robots. It argues that fictional as well as documented real writing experiences reveal not only the intuitive but also the normative dimension of language. Two works of contemporary literature involving linguistic interaction, Machines Like Me by Ian McEwan and My Algorithm and Me by Daniel Kehlmann, serve as research objects. It is concluded that the intuitive does not always correlate with the normative dimension in the selected literary works. This tendency indicates a conflict between the experiential and the conceptual aspects, which deserves further attention in ethical and technical discourses. – This is one of six commentaries on a 2011 paper by Mark Coeckelbergh: “You, robot: on the linguistic construction of artificial others.” Coeckelbergh’s response also appears in this issue of Technology and Language.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.13</doi>
          <udk>008:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Human-robot interaction</keyword>
            <keyword>Linguistic turn</keyword>
            <keyword>Human-robot relationship</keyword>
            <keyword>Machines Like Me</keyword>
            <keyword>My Algorithm and Me</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.13/</furl>
          <file>136-146.pdf</file>
        </files>
      </article>
      <article>
        <artType>RAR</artType>
        <langPubl>RUS</langPubl>
        <pages>147-154</pages>
        <authors>
          <author num="001">
            <authorCodes>
              <scopusid>22233758600</scopusid>
              <orcid>0000-0001-9576-1002</orcid>
            </authorCodes>
            <individInfo lang="ENG">
              <orgName>Department of Philosophy, University of Vienna</orgName>
              <surname>Coeckelbergh</surname>
              <initials>Mark</initials>
              <address>Universitätsring 1, 1010 Vienna, Austria</address>
            </individInfo>
          </author>
        </authors>
        <artTitles>
          <artTitle lang="ENG">Response: Language and robots</artTitle>
        </artTitles>
        <abstracts>
          <abstract lang="ENG">Six commentaries on the paper “You, robot: on the linguistic construction of artificial others” articulate different points of view on the significance of linguistic interactions with robots. The author of the paper responds to each of these commentaries by highlighting salient differences. One of these regards the dangerously indeterminate notion of the “quasi-other” and whether it should be maintained. Accordingly, the critical study of the linguistic aspects of human-robot relations implies a critical study of society and culture. Another salient difference concerns the question of deception and whether there is a distinction between real and perceived affordances. The prospect of AI systems creating language or co-authoring texts raises the question of the hermeneutic responsibility of humans. And regarding the missing dimension of temporality, studies of macro- and micro-level hermeneutic change become more important.</abstract>
        </abstracts>
        <codes>
          <doi>10.48417/technolang.2022.01.14</doi>
          <udk>1:62-529</udk>
        </codes>
        <keywords>
          <kwdGroup lang="ENG">
            <keyword>Linguistic constructions of robots</keyword>
            <keyword>Affordances</keyword>
            <keyword>Hermeneutic responsibility</keyword>
            <keyword>Post-phenomenology</keyword>
            <keyword>Narrative</keyword>
          </kwdGroup>
        </keywords>
        <files>
          <furl>https://soctech.spbstu.ru/article/2022.6.14/</furl>
          <file>147-154.pdf</file>
        </files>
      </article>
    </articles>
  </issue>
</journal>
