ActInf GuestStream 062.1 ~ Michael Carl, "Deep temporal Models of the Translation Process"
https://scholar.google.com/citations?...

Abstract: Translation Process
Research (TPR) investigates how humans produce translations from one language
into another. Several methods have been used to gather evidence for the
assumed underlying translation processes, including think-aloud,
(retrospective) interviews, questionnaires, screen recording, brain imaging
technologies (EEG, fMRI), etc. In our research, we use keyloggers and
eye-trackers to collect behavioral data during the translation sessions. The
two data streams are synchronized to establish correspondences between the
sensory input (reading) and translational action (typing) and analyzed to
arrive -- among other things -- at a better understanding of the relations
between translation effort and effects, which heavily depend on the
translator's expertise, text difficulty, expected translation quality, etc.
Numerous models of the translating mind have been proposed, some of which
suggest that multiple, more or less automatized and/or conscious translation
processes complement each other during translation production. In this talk I
suggest the Free Energy Principle (FEP) and Active Inference (AIF) as a novel
framework for analyzing, specifying and modelling deep embedded translation
processes and for simulating their interaction within the POMDP framework. I
exemplify how translation-behavioral data (keystrokes and gaze data) elicit
traces of a deep temporal architecture, in which human translation production
is modelled in several temporally embedded and interacting processes that
unfold on different timelines. Following FEP/AIF, the deep temporal
architecture allows translators to arrive at a steady state of fluent
translation production by minimizing the discrepancy between the translator's
internal states and the (textual) states in the external translation
environment.

Active Inference Institute information:
Website: https://activeinference.org/
Twitter: / inferenceactive
Discord: / discord
YouTube: / activeinference
Active Inference Livestreams: https://coda.io/@active-inference-ins...
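The abstract frames translation production as minimizing the discrepancy between the translator's internal states and the external (textual) environment, cast in a POMDP framework. As a rough illustration only (this is not Carl's model; every hidden state, matrix, and number below is invented), the following sketch shows a minimal discrete Bayesian belief update in which an agent's "surprise" about a hidden interpretation of a source segment shrinks as observed cues accumulate:

```python
import math

def normalize(v):
    """Rescale nonnegative weights to a probability distribution."""
    s = sum(v)
    return [x / s for x in v]

def surprise(prior, likelihood_row):
    """Negative log evidence for one observation: the quantity a
    free-energy-minimizing agent seeks to reduce over time."""
    evidence = sum(p * l for p, l in zip(prior, likelihood_row))
    return -math.log(evidence)

def bayes_update(prior, likelihood_row):
    """Posterior over hidden states after seeing one observation."""
    return normalize([p * l for p, l in zip(prior, likelihood_row)])

# A[s][o] = P(observation o | hidden state s): two candidate
# interpretations of a source segment, two observable cues
# (all values are invented for illustration).
A = [[0.8, 0.2],
     [0.3, 0.7]]

belief = [0.5, 0.5]            # flat prior over interpretations
for obs in [0, 0, 1, 0]:       # a short stream of observed cues
    lik = [A[s][obs] for s in range(len(belief))]
    print(f"obs={obs}  surprise={surprise(belief, lik):.3f}")
    belief = bayes_update(belief, lik)

print("final belief:", [round(b, 3) for b in belief])
```

In this toy run, surprise falls as evidence for one interpretation accumulates and spikes on the one unexpected cue, a loose analogue of the steady state of fluent production described in the abstract.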