Article

Stimulus information as a determinant of reaction time

Author:
Ray Hyman

Abstract

The information conveyed by a stimulus was varied in 3 ways: "(a) the number of equally probable alternatives from which it could be chosen, (b) the proportion of times it could occur relative to the other possible alternatives, and (c) the probability of its occurrence as a function of the immediately preceding stimulus presentation. The reaction time to the amount of information in the stimulus produced a linear regression for each of the three ways…" (PsycINFO Database Record (c) 2012 APA, all rights reserved)
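For illustration, the three manipulations map onto three standard information measures, and the Hick-Hyman law predicts mean RT as a linear function of any of them. A minimal Python sketch (the intercept and slope values are invented for illustration, not taken from the paper):

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# (a) n equally probable alternatives: H = log2(n)
h_equal = entropy([1 / 8] * 8)                  # 3.00 bits

# (b) unequal stimulus probabilities lower the average information
h_unequal = entropy([0.7, 0.1, 0.1, 0.1])       # ~1.36 bits < log2(4)

# (c) sequential dependence: conditional entropy H(X_t | X_{t-1}),
# averaged over the distribution of the preceding stimulus
def conditional_entropy(p_prev, transition_rows):
    return sum(p * entropy(row) for p, row in zip(p_prev, transition_rows))

h_seq = conditional_entropy([0.5, 0.5],
                            [[0.9, 0.1],        # P(next | prev = A)
                             [0.1, 0.9]])       # P(next | prev = B)

# Hick-Hyman law: RT = a + b * H (a, b illustrative)
a, b = 0.18, 0.16
for label, h in [("(a)", h_equal), ("(b)", h_unequal), ("(c)", h_seq)]:
    print(f"{label} H = {h:.2f} bits -> predicted RT = {a + b * h:.3f} s")
```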


... The mathematical framework of information theory quantifies the processing and utilization of information, which is specified as that which reduces entropy or uncertainty in a given process (Shannon, 1948). In the psychological sciences, a classic series of experiments demonstrated that response time (RT) increases linearly with increasing information uncertainty, an effect known as the Hick-Hyman law (Hick, 1952; Hyman, 1953). Studies of the Hick-Hyman law use choice RT tasks that vary information uncertainty at the level of either the stimulus or the required response (see Proctor & Schneider, ...
... As predicted, RTs in both the externally cued and internally driven uncertainty conditions conformed to the Hick-Hyman law (Hick, 1952; Hyman, 1953), showing a linear increase with increasing uncertainty load. There have been only a couple of studies that investigate the Hick-Hyman law in aging. ...
... In addition to the comparison of internally driven and externally cued uncertainty, we further examined stimulus expectancy to provide a more robust test of our hypothesis that age-related changes to the CCN would disrupt internal representations of uncertainty, given the role of the anterior cingulate cortex and anterior insula in mediating stimulus expectancy (Davis & Hasson, 2018; Oliveira et al., 2007; Wu et al., 2021). Although RT based on individual trial expectancy is not directly defined by the Hick-Hyman law, which was outlined at the level of blocks of trials (Hyman, 1953), a recent study extended the law to fit both block- and trial-level data, including surprisal value (Mordkoff, 2017). In the present study, both YA and OA had lower accuracy and greater RTs in response to unexpected (surprising) stimuli, which occurred rarely at the lowest uncertainty load. ...
Article
Full-text available
Objective: The Hick–Hyman law states that response time (RT) increases linearly with increasing information uncertainty. The effects of aging on uncertainty representations in choice RT paradigms remain unclear, including whether aging differentially affects processes mediating externally cued versus internally driven uncertainty. This study sought to characterize age-related differences in uncertainty representations using a card-sorting task. Method: The task separately manipulated internally driven uncertainty (i.e., probability of each stimulus type with fixed number of response piles) and externally cued uncertainty (i.e., number of response piles with fixed probability of each stimulus type). Results: Older adults (OA) showed greater RT slowing than younger adults in response to uncertainty load, an effect that was stronger in the externally cued than internally driven condition. While both age groups showed lower accuracy and greater RTs in response to unexpected (surprising) stimuli in the internally driven condition at low uncertainty loads, OA were unable to distinguish between expected and nonexpected stimuli at higher uncertainty loads when the probability of each stimulus type was close to equal. Among OA, better performance on the internally driven, but not externally cued, condition was associated with better global cognitive performance and verbal fluency. Conclusions: Collectively, these findings provide behavioral evidence of age-related disruptions to bottom-up (externally cued) and top-down (supporting internally driven mental representations) resources to process uncertainty and coordinate task-relevant action.
... Distributions of response times tend to be positively skewed, resembling log-normal or gamma distributions (Lindeløv, 2019; Ratcliff, 2012). Response times increase reliably with task complexity, as described by the Hick-Hyman law (Hick, 1952; Hyman, 1953). Responses also quicken as a function of practice, a phenomenon called the power law of practice (Newell & Rosenbloom, 1981). ...
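The two regularities named in this snippet are easy to sketch numerically. Below, a right-skewed log-normal RT sample (parameters invented for illustration) and the power-law-of-practice prediction T(n) = T1 · n^(-alpha):

```python
import numpy as np

rng = np.random.default_rng(0)

# Positively skewed RTs: log-normal samples resemble empirical RT
# distributions (the mean exceeds the median under right skew).
rts = rng.lognormal(mean=-0.9, sigma=0.35, size=10_000)  # seconds
print(f"mean {rts.mean():.3f} s > median {np.median(rts):.3f} s")

# Power law of practice: RT on trial n falls as T(n) = T1 * n**(-alpha).
T1, alpha = 0.8, 0.3                     # illustrative constants
for n in (1, 10, 100, 1000):
    print(f"trial {n:>4}: predicted RT = {T1 * n ** -alpha:.3f} s")
```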
... However, averaging over many transmissions gives a clear picture of the expected rate of information gain. One of the earliest and most controversial applications of information-theoretic concepts to human behavior was the discovery by William Hick and Ray Hyman that response times vary with the amount of information provided by a stimulus. In an experiment requiring a subject to map stimuli to behavioral responses, information about the correct response is encoded as the number of distinct stimuli (Hick, 1952), for example, or the stimuli's relative probabilities (Hyman, 1953). The legitimacy of an information-theoretic analysis of this finding has been regularly disputed in the years since (Laming, 2010; Luce, 2003). ...
... While Hick's findings describe a relationship between the number of stimuli and response time, Ray Hyman varied the statistics of stimuli within the task, both by modifying the relative frequency of stimulus options and the conditional probabilities of sequences of stimuli (Hyman, 1953). Hyman's results replicate and extend Hick's main finding. ...
Article
Full-text available
Human response times conform to several regularities including the Hick-Hyman law, the power law of practice, speed-accuracy trade-offs, and the Stroop effect. Each of these has been thoroughly modeled in isolation, but no account describes these phenomena as predictions of a unified framework. We provide such a framework and show that the phenomena arise as decoding times in a simple neural rate code with an entropy stopping threshold. Whereas traditional information-theoretic encoding systems exploit task statistics to optimize encoding strategies, we move this optimization to the decoder, treating it as a Bayesian ideal observer that can track transmission statistics as prior information during decoding. Our approach allays prominent concerns that applying information-theoretic perspectives to modeling brain and behavior requires complex encoding schemes that are incommensurate with neural encoding.
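As a toy version of the idea in this abstract (invented rates and threshold; a sketch of the general technique, not the authors' fitted model), a Bayesian decoder can accumulate Poisson spike counts and stop once the posterior entropy over stimuli drops below a threshold; decoding time then plays the role of RT:

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rates = np.array([2.0, 4.0, 6.0, 8.0])   # toy Poisson rate code, spikes/bin
prior = np.full(len(rates), 1 / len(rates))
threshold = 0.5                           # stop when posterior entropy < 0.5 bits

def decode_time(true_stim, max_bins=200):
    """Accumulate evidence until the posterior over stimuli is certain enough."""
    log_post = np.log(prior)
    for t in range(1, max_bins + 1):
        k = rng.poisson(rates[true_stim])            # observed spike count
        log_post += k * np.log(rates) - rates        # Poisson log-likelihood
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        if entropy_bits(post) < threshold:
            return t
    return max_bins

times = [decode_time(s) for s in rng.integers(0, 4, size=500)]
print(f"mean decoding time: {np.mean(times):.1f} bins")
```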
... This results in highly context-dependent transitions, as identical letters can be followed by different sets of successors as a function of their position in the grammar (for instance, S1 can only be followed by Q, but S2 can be followed by either V or P). Finally, the grammar was constructed so as to avoid direct repetitions of a particular letter, because it is known (Bertelson, 1961; Hyman, 1953) that repeated stimuli elicit shorter RTs independently of their probability of presentation. (Direct repetitions can still occur because a small proportion of the trials were generated randomly, as described below.) ...
... First of all, it appears that a response that is actually executed remains primed for a number of subsequent trials (Bertelson, 1961; Hyman, 1953; Remington, 1969). In the last sessions of our data, we found that if a response follows ... The second factor may be related: Responses that are grammatical at Trial t but do not actually occur remain primed at Trial t + 1. ...
Article
Full-text available
How is complex sequential material acquired, processed, and represented when there is no intention to learn? Two experiments exploring a choice reaction time task are reported. Unknown to Ss, successive stimuli followed a sequence derived from a “noisy” finite-state grammar. After considerable practice (60,000 exposures) with Experiment 1, Ss acquired a complex body of procedural knowledge about the sequential structure of the material. Experiment 2 was an attempt to identify limits on Ss ability to encode the temporal context by using more distant contingencies that spanned irrelevant material. Taken together, the results indicate that Ss become increasingly sensitive to the temporal context set by previous elements of the sequence, up to 3 elements. Responses are also affected by priming effects from recent trials. A connectionist model that incorporates sensitivity to the sequential structure and to priming effects is shown to capture key aspects of both acquisition and processing and to account for the interaction between attention and sequence structure reported by Cohen, Ivry, and Keele (1990).
... The seminal studies on choice reaction time by Hick (1952) and Hyman (1953) provided the impetus for psychology's renewed interest in reaction time. Hick (1952) noted that there is a nonlinear increase in choice reaction time as the number of response options increases. ...
... and the information (bit) value of a particular stimulus display (Hick, 1952). At nearly the same time, Hyman (1953) published a complementary set of studies, providing greater generality to the findings initially reported by Hick (1952). Hyman's participants also responded to bulbs as they lit up, with the number of response options ranging from 1 to 8. ...
Article
Full-text available
Individual differences in processing speed and executive attention have both been proposed as explanations for individual differences in cognitive ability, particularly general and fluid intelligence (Engle et al., 1999; Kail & Salthouse, 1994). Both constructs have long intellectual histories in scientific psychology. This article attempts to describe the historical development of these constructs, particularly as they pertain to intelligence. It also aims to determine the degree to which speed and executive attention are theoretical competitors in explaining individual differences in intelligence. We suggest that attention is the more fundamental mechanism in explaining variation in human intelligence.
... A third type of sequential effects are repetition effects, in which faster RTs are observed when the same stimulus repeats (Felfoldy, 1974). Such repetition effects were first observed in the RT literature by Hyman (1953). Subsequent studies have shown that the magnitude of the repetition effect decreased as a function of the ensuing lag between consecutive stimuli (inter-stimulus interval [ISI]) (Bertelson, 1961; Bertelson & Renkin, 1966; Hale, 1967; Smith, 1968). ...
... For example, it was found that the difference in RT between repetition and nonrepetition trials increases for a given ISI, or remains constant as ISI increases. The latter occurs when either the number of stimulus alternatives gets larger (Hyman & Umiltà, 1969; Smith, 1968) or the probability of repetition increases (Bertelson, 1961; Hyman, 1953; Hyman & Umiltà, 1969). Because the current study examines sequential effects with different numbers of stimulus alternatives in baseline and filtering, it is of theoretical importance to examine the potential influence of ISI. ...
Article
Full-text available
For nearly half a century now, Garner interference has been serving as the gold standard measure of dimensional interaction and selective attention. But the mechanisms that generate Garner interference are still not well understood. The current study proposes a novel theory that ascribes the interference (and dimensional interaction in general) to episodic feature integration processes at the micro (trial-to-trial) level. The novel account builds on earlier well-established notions of “feature integration” and “object files,” and is augmented by formal derivations. The sequential binding account predicts that the magnitude of Garner interference is related to the strength of feature integration along consecutive trials. Three experiments were set to test this novel binding theory. Experiments 1 and 2 tested performance with integral dimensions (chroma and value, and width and height of rectangles), whereas Experiment 3 examined performance with a pair of separable dimensions (a circle’s size and the angle of a diameter). In addition, the time lag ensuing between consecutive trials was manipulated. The results strongly supported the predictions of the sequential binding account: (a) with integral dimensions, substantial amounts of Garner interference were correlated with large partial repetition costs (i.e., consensual markers of feature integration), but this pattern was not observed with separable dimensions, and (b) the magnitude of both Garner interference and partial repetition costs diminished as a function of the ensuing time lag between consecutive trials, pointing to a common time-dependent memory mechanism. These results adduce strong support for the predictions of the feature binding theory of Garner interference, giving currency to the idea that dimensional interaction is driven by feature integration.
... As environments have finite sizes, determining the optimal amount of information necessary for effective global decision-making at the group level becomes crucial. To quantify this situation, the British and American psychologists William Edmund Hick and Ray Hyman experimented on this question and found that there exists a crucial relationship between the average reaction time of each decision-maker and the number of choices [8,9]. This relationship is known as Hick's Law and states that the average reaction time is a logarithmic function of the number of options. ...
Preprint
Full-text available
Decision making is the cognitive process of selecting a course of action among multiple alternatives. A decision maker embedded in a complex microenvironment (one that contains multiple decision makers) has to choose among many options, which often leads to a phenomenon known as the "paradox of choice". The latter refers to the case where too many options can lead to negative outcomes, such as increased uncertainty, decision paralysis, and frustration. Here, we employ an entropy-driven mechanism within a statistical physics framework to explain the premises of the paradox. In turn, we focus on the emergence of a collective "paradox of choice", in the case of interacting decision-making agents, quantified as the decision synchronization time. Our findings reveal a trade-off between synchronization time and the sensing radius, indicating the optimal conditions for information transfer among group members, which significantly depend on the individual sensitivity parameters. Interestingly, when agents sense their microenvironment in a biased way or their decisions are influenced by their past choices, then the collective "paradox of choice" does not occur. In a nutshell, our theory offers a low-dimensional and unified statistical explanation of the "paradox of choice" at the individual and at the collective level.
... The study showed that participants had shorter RTs for T2 than for T1, possibly due to a reduction in cognitive load in T2, as they had already partially processed the olfactory prime, and to motor facilitation due to task repetition (Hyman, 1953; Mawase et al., 2018). Most important, the priming effect was observed only for T1 and not T2. ...
Article
Full-text available
The Olfactory Priming Task (OPT) is a new implicit measure developed to capture associations between odors and feeling-related words that was inspired by previous priming techniques. Participants are presented with feeling-related words and asked to categorize them as “relaxing” or “energizing” as quickly and accurately as possible, while supposedly relaxing or stimulating odors are delivered as a prime. Accuracy and response times are recorded, and participants are expected to react faster and more accurately with feeling-related words that are congruent with the primed odor. We validated the OPT in two experiments with the use of menthol/vanillin and fine fragrances, respectively. Results indicated that the OPT could discriminate odors from their relaxing/energizing properties, with participants showing faster responses to energizing-related words after priming with menthol or “Perfume 1” and to relaxing-related words after priming with vanillin or “Perfume 2.” These associations were further confirmed by subjective reports, with participants rating menthol and Perfume 1 as more energizing and vanillin and Perfume 2 as more relaxing. The results suggest that exposure to relaxing/energizing odors activates congruent feelings in consumers. The results also demonstrate the validity and reliability of the OPT as an implicit measure for capturing associations between odors and feeling-related words, making it a valuable tool for measuring consumers' affective response to flavors and fragrances.
... There are now several types of HCI predictive models that can be used depending on the context or type of task analyzed. For example, Fitts' law is used for mouse pointing [12], the Hick-Hyman law is used for reviewing a sorted list [13,14], and goals, operators, methods, and selection rules (GOMS) [15] is a high-level model that describes the cognitive processes and methods involved in using a computer to achieve specific goals. The keystroke-level model (KLM) is a more specific type of GOMS model that is used for compositions of tasks that fit how an experienced user interacts with the interface [11]. ...
... In our previous work [11], we argued that response time (RT) should be a linear function of policy complexity, which can be manipulated even when the number of states is held fixed [16]. Consistent with this prediction, we found that lower policy complexity significantly predicted shorter response times in a contextual multi-armed bandit task [11,17]. ...
Article
Full-text available
Policy compression is a computational framework that describes how capacity-limited agents trade reward for simpler action policies to reduce cognitive cost. In this study, we present behavioral evidence that humans prefer simpler policies, as predicted by a capacity-limited reinforcement learning model. Across a set of tasks, we find that people exploit structure in the relationships between states, actions, and rewards to “compress” their policies. In particular, compressed policies are systematically biased towards actions with high marginal probability, thereby discarding some state information. This bias is greater when there is redundancy in the reward-maximizing action policy across states, and increases with memory load. These results could not be explained qualitatively or quantitatively by models that did not make use of policy compression under a capacity limit. We also confirmed the prediction that time pressure should further reduce policy complexity and increase action bias, based on the hypothesis that actions are selected via time-dependent decoding of a compressed code. These findings contribute to a deeper understanding of how humans adapt their decision-making strategies under cognitive resource constraints.
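In the policy-compression literature this abstract draws on, policy complexity is typically quantified as the mutual information I(S;A) between states and actions. A minimal sketch with invented distributions (not the study's data): a deterministic policy uses the full 2 bits of state information, while a policy biased toward a high-marginal-probability action uses less:

```python
import numpy as np

def policy_complexity_bits(p_s, policy):
    """I(S;A) for a policy given as rows P(a|s); p_s is the state distribution."""
    p_sa = p_s[:, None] * policy               # joint P(s, a)
    p_a = p_sa.sum(axis=0)                     # marginal P(a)
    nz = p_sa > 0
    ratio = p_sa / (p_s[:, None] * p_a[None, :])
    return float(np.sum(p_sa[nz] * np.log2(ratio[nz])))

p_s = np.full(4, 0.25)
deterministic = np.eye(4)                      # one distinct action per state
biased = 0.6 * np.eye(4) + 0.4 * np.tile([0.7, 0.1, 0.1, 0.1], (4, 1))

for name, pi in [("deterministic", deterministic), ("compressed", biased)]:
    print(f"{name}: I(S;A) = {policy_complexity_bits(p_s, pi):.2f} bits")
```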
... These results do not support the conclusions of Kacelnik and colleagues from their studies of decision making in European starlings (Kacelnik et al., 2011) and several more recent studies done in rats that used a similar training method (Ojeda et al., 2018; Ajuwon et al., 2023). Instead, our results follow what is expected from the Hick-Hyman Law (Hick, 1952; Hyman, 1953), which states that as options increase, response latencies increase, as well as classic studies that motivated the development of drift diffusion models (Ratcliff, 1978). Our findings suggest that brain systems involved in the decision threshold will likely show learning-related changes in neural activity during early choice learning. ...
Preprint
Full-text available
Current theories of decision making propose that decisions arise through competition between choice options. Computational models of the decision process estimate how quickly information about choice options is integrated and how much information is needed to trigger a choice. Experiments using this approach typically report data from well-trained participants. As such, we do not know how the decision process evolves as a decision-making task is learned for the first time. To address this gap, we used a behavioral design separating learning the value of choice options from learning to make choices. We trained male rats to respond to single visual stimuli with different reward values. Then, we trained them to make choices between pairs of stimuli. Initially, the rats responded more slowly when presented with choices. However, as they gained experience in making choices, this slowing reduced. Response slowing on choice trials persisted throughout the testing period. We found that it was specifically associated with increased exponential variability when the rats chose the higher value stimulus. Additionally, our analysis using drift diffusion modeling revealed that the rats required less information to make choices over time. Surprisingly, we observed reductions in the decision threshold after just a single session of choice learning. These findings provide new insights into the learning process of decision-making tasks. They suggest that the value of choice options and the ability to make choices are learned separately, and that experience plays a crucial role in improving decision-making performance.
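A minimal drift-diffusion sketch (all parameters invented) illustrates the threshold finding: lowering the decision bound shortens simulated decision times, mirroring the reduction the authors report with choice experience:

```python
import numpy as np

rng = np.random.default_rng(2)

def ddm_trial(drift, bound, dt=0.001, noise=1.0, max_t=5.0):
    """One drift-diffusion trial; returns the time to hit either +/- bound."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t

for bound in (1.5, 1.0, 0.5):
    times = [ddm_trial(drift=1.2, bound=bound) for _ in range(300)]
    print(f"bound {bound}: mean decision time = {np.mean(times):.3f} s")
```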
... With Shannon's measures kept firmly in their toolkit, researchers were quick to explain pattern goodness, indeed patternness, in terms of information (e.g., Attneave, 1954, 1959; Hochberg & McAlister, 1953; Hyman, 1953). The central idea was that patterned sequences and good figures contained less information due to their makeup, which entails order and redundancy. ...
Article
Full-text available
Of the four interrelated concepts in the title, only symmetry has an exact mathematical definition. In mathematical development, symmetry is a graded variable, in marked contrast with the popular binary conception of symmetry in and out of the laboratory (i.e., an object is either symmetrical or nonsymmetrical). Because the notion does not have a direct graded perceptual counterpart (experimental participants are not asked about the amount of symmetry of an object), students of symmetry have taken various detours to characterize the perceptual effects of symmetry. Current approaches have been informed by information theory, mathematical group theory, randomness research, and complexity. Apart from reviewing the development of the main approaches, for the first time we calculated associations between figural goodness as measured in the Garner tradition and measures of algorithmic complexity and randomness developed in recent research. We offer novel ideas and analyses by way of integrating the various approaches.
... There was a clear delineation between low, medium, and high workloads. While the NASA TLX can infer the difficulty level, an additional estimate was calculated as a redundancy through the baud rate, a measure of human operator performance relative to a machine stimulus as defined by Phillips and colleagues (2007), which was driven by the Hick-Hyman (Hick, 1952; Hyman, 1953) and Fitts (Fitts, 1954) laws to assess reaction time to a stimulus and movement time to a stimulus, respectively. Given that task restrictions limited discrete controls next to the hands (cyclic control, right hand; keypad, left hand), the focus was response time rather than movement time, thereby limiting the focus to Hick-Hyman machine-initiated baud rate calculations accounting for the lights, dials, channels, and frequencies. ...
Article
Objective To evaluate neck muscle coactivation across different levels of mental workload during simulated flight tasks. Background Neck pain (NP) is highly prevalent among military aviators. Given the complex nature within the flight environment, mental workload may be a risk factor for NP. This may induce higher levels of neck muscle coactivity, which over time may accelerate fatigue, increase neck discomfort, and affect flight task performance. Method Three counterbalanced mental workload conditions represented by simulated flight tasks modulated by interstimulus frequency and complexity were investigated using the Modifiable Multitasking Environment (ModME). The primary measure was a neck coactivation index to describe the neuromuscular effort of the neck muscles as a system. Additional measures included perceived workload (NASA TLX), subjective discomfort, and task performance. Participants (n = 60; 30M, 30F) performed three test conditions over 1 hr each while seated in a simulated seating environment. Results Neck coactivation indices (CoA) and subjective neck discomfort corresponded with increasing level of mental workload. Average CoAs for low, medium, and high workloads were: .0278 (SD = .0232), .0286 (SD = .0231), and .0295 (SD = .0228), respectively. NASA TLX mental, temporal, effort, and overall scores also increased with the level of mental workload assigned. For ModME task performance, the overall performance score, monitoring accuracy, and resource management accuracy decreased while reaction times increased with the increasing level of mental workload. Communication accuracy was lowest with the low mental workload but had higher reaction times relative to increasing workload. Conclusion Mental workload affects neck muscle coactivation during combinations of simulated flight tasks within a simulated helicopter seating environment. Application The results of this study provide insights into the physical response to mental workload. With increasing multisensory modalities within the work environment, these insights may assist the consideration of physical effects from cognitive factors.
... Recent literature in statistical learning supports the view that humans are sensitive to different graph structures underlying transition probabilities [1,4,12–14]. For example, when displaying action cues drawn from transition graphs, humans can detect differences in individual transition probabilities. ...
Article
Full-text available
Humans are constantly exposed to sequences of events in the environment. Those sequences frequently evince statistical regularities, such as the probabilities with which one event transitions to another. Collectively, inter-event transition probabilities can be modeled as a graph or network. Many real-world networks are organized hierarchically and understanding how these networks are learned by humans is an ongoing aim of current investigations. While much is known about how humans learn basic transition graph topology, whether and to what degree humans can learn hierarchical structures in such graphs remains unknown. Here, we investigate how humans learn hierarchical graphs of the Sierpiński family using computer simulations and behavioral laboratory experiments. We probe the mental estimates of transition probabilities via the surprisal effect: a phenomenon in which humans react more slowly to less expected transitions, such as those between communities or modules in the network. Using mean-field predictions and numerical simulations, we show that surprisal effects are stronger for finer-level than coarser-level hierarchical transitions. Notably, surprisal effects at coarser levels of the hierarchy are difficult to detect for limited learning times or in small samples. Using a serial response experiment with human participants (n=100), we replicate our predictions by detecting a surprisal effect at the finer-level of the hierarchy but not at the coarser-level of the hierarchy. To further explain our findings, we evaluate the presence of a trade-off in learning, whereby humans who learned the finer-level of the hierarchy better tended to learn the coarser-level worse, and vice versa. Taken together, our computational and experimental studies elucidate the processes by which humans learn sequential events in hierarchical contexts. More broadly, our work charts a road map for future investigation of the neural underpinnings and behavioral manifestations of graph learning.
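The surprisal measure used in this line of work is just the negative log of the transition probability. A hypothetical two-community graph (not the Sierpiński graphs of the study) makes the point: cross-community transitions are rarer under a random walk, hence more surprising and, per the surprisal effect, slower:

```python
import numpy as np

# Two 3-node communities joined by a single bridge edge (2 <-> 3).
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)     # random-walk transition matrix

# Surprisal of transition i -> j is -log2 P(j | i).
print(f"within-community 0 -> 1: {-np.log2(P[0, 1]):.2f} bits")
print(f"cross-community  2 -> 3: {-np.log2(P[2, 3]):.2f} bits")
```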
... For example, expert chess players can easily find and remember positions as familiar configurations or chunks of pieces that they encountered previously (Chase and Simon, 1973). With regard to motor aspects, reaction times for novices are longer compared to experts (Hick, 1952; Hyman, 1953). Experts exhibit fewer motor operators both at the level of central neural programming and subsequent motor unit activation (Davids et al., 2006; McCaskie et al., 2011; Milton et al., 2007). ...
Article
Cognitive performance models have been used in several human factors domains such as driving and human-computer interaction. However, most models are limited to expert performance with rough adjustments to consider novices, despite prior studies suggesting that novices' cognitive, perceptual, and motor behaviors differ from those of experts. The objective of this study was to develop a cognitive performance model for novice law enforcement officers (N-CPM) to model their performance and memory load while interacting with in-vehicle technology. The model was validated based on a ride-along study with 10 novice law enforcement officers (nLEOs). The findings suggested that there were no significant differences between the N-CPM and observation data in most cases, while the results of the benchmark model differed from those of the N-CPM. The model can be applied to improve future nLEOs' patrol mission performance through redesigning in-vehicle technologies and training methods to reduce their workload and driving distraction.
... This is a surprising result on several counts. First, the smaller set produced more frequent presentation of particular letter pairs, and the reduced stimulus uncertainty (Hick, 1952; Hyman, 1953) and added practice thus obtained ought to have helped performance. Perhaps increased habituation or satiation outweighed any practice or priming effects (cf. ...
Article
Full-text available
Three experiments with 80 undergraduates examined 3 explanations (internal noise, priming, and relative frequency) of why "same" judgments typically are faster than "different" judgments in a physical matching task. The internal-noise principle, which predicts more errors as well as faster correct judgments on "same" pairs, was consistently confirmed even with sequential presentation. More elusive was priming, that is, the facilitation of encoding from letter repetition with sequential presentation. Priming was inhibited by the presence of intertrial letter repetition. Error data indicated that priming involves an increased efficiency in encoding, as R. W. Proctor claimed, rather than a criterion shift as suggested by L. E. Krueger and R. G. Shapiro. An alternative to the priming explanation, based on the greater susceptibility of simultaneous presentation to analytical processing, was tested and disconfirmed. Stimulus-set size did not affect the speed advantage for "same" pairs, thus disconfirming the relative-frequency explanation, according to which "same" judgments are faster because typically there are fewer unique "same" than "different" pairs. (34 ref)
... Classification of stimuli produced by correlated combinations of integral dimensions is facilitated because they have a functional interstimulus distance that is greater than that provided by either dimension alone. Felfoldy (1974; see also Garner, 1974) has suggested that interference with integral dimensions occurs because the greater number of functional stimuli in orthogonal conditions decreases the frequency (relative to control conditions) of facilitatory stimulus repetitions (e.g., Hyman, 1953; Kornblum, 1973). Thus, facilitation with correlated dimensions and interference with orthogonal dimensions is one pair of converging operations that Garner (1970; Garner & Felfoldy, 1970) has used to identify integral dimensions. ...
Article
Full-text available
For stimulus dimensions of line location and orientation with both card-sorting and discrete RT trials (4 experiments with undergraduates), facilitation occurred when dimensions were positively correlated, and interference appeared when dimensions varied orthogonally. Interference could not be attributed to differential sensory accrual arising from positional uncertainty or to a repetition-effects advantage for control over orthogonal conditions. Facilitation tended to disappear when the same response was not required for the 2 dimensions, and when competing responses were required, interference appeared with redundant dimensions (negatively correlated stimulus sets). Data seem consistent with a model that calls for automatic and parallel extraction of features and their locations, with facilitation-interference effects having their locus in postperceptual response processing. (47 ref)
... The next fastest were those conditions in which only a single parameter remained to be specified (two parameters precued), followed by the singly precued condition, with the condition of no precue having the longest reaction time. These results appear to be explicable, at least in part, on the basis of uncertainty (Hick, 1952; Hyman, 1953). As the number of stimulus-response alternatives was reduced (i.e., as more parameters were precued), there was a commensurate reduction in reaction time. ...
Article
Full-text available
Four experiments (42 18–30 yr old Ss) investigated the specification of movement parameters hypothesized to be involved in the initiation of movement. Initiation times did not systematically vary as a function of the type of parameter precued nor were there significant differences between specific and ambiguous precue conditions. Only in Exp I, in which precues and stimuli involved complex cognitive transformations, was there support for D. A. Rosenbaum's (1980) parameter specification model. When highly compatible conditions were employed, designed to reflect a real-world environment, the authors failed to obtain any tendency for movement parameters to be serially specified. Grounds for suspecting the generality of parameter specification models are discussed, and an alternative approach that is consonant with the dynamic characteristics of the motor control system is proposed. (58 ref)
... When a light briefly illuminated the box, he reported being aware of only the objects in the region he had been attending to (Carrasco, 2011). A century later, the work of Hick (1952) and Hyman (1953) provided support for the idea that RT was linearly related to the amount of information (e.g., the number of items to be processed or steps) required by a task. ...
Article
Full-text available
Cognitive psychology began over three-quarters of a century ago and we have learned a great deal in that time, including concerning the development of cognitive abilities such as perception, attention, and memory, all of which develop across infancy and childhood. Attention is one aspect of cognition that is vital to success in a variety of life activities and, arguably, the foundation of memory, learning, problem solving, decision making, and other cognitive activities. The cognitive abilities of later childhood and adulthood generally appear to depend on the reflexes, abilities, and skills of infancy. Research in developmental cognitive science can help us understand adult cognition and know when to intervene when cognitive function is at risk. This area of research can be challenging because, even in typical development, the course of cognitive development for a particular child does not always improve monotonically. In addition, the typical trajectory of this development has been understood differently from different historical perspectives. Neither the history of thought that has led to our current understanding of attention (including its various types) nor the importance of developmental aspects of attention are frequently covered in training early career researchers, especially those whose primary area of research in not attention. My goal is to provide a review that will be useful especially to those new to research in the subfield of attention. Sustained attention in adults and children has been well-studied, but a review of the history of thought on the development of reflexive attention with a focus on infancy is overdue. Therefore, I draw primarily on historical and modern literature and clarify confusing terminology as it has been used over time. I conclude with examples of how cognitive development research can contribute to scientific and applied progress.
... In order to test if RSI duration affects hierarchical learning, we conducted a first analysis in which we evaluated the height of the hierarchical structure in each experiment in the same way as in our previous study (Schmid et al., 2023). Hierarchical elaboration generates expectations about the structure of the input, which the participants' RTs reflect (Huettel et al., 2002; Hyman, 1953; Lynn et al., 2020; McCarthy & Donchin, 1981; Sternberg, 1969). Hierarchical learning therefore manifests in terms of steeper slopes of RTs for disambiguated points at a given level compared to the slopes of nondisambiguated points at the same level. ...
Preprint
Full-text available
In this article, we explore the impact of presentation rate on the extraction of hierarchical structure by manipulating the duration of the Response-to-Stimulus Interval (RSI) in a Serial Reaction Time (SRT) task. Multiple hypotheses have been put forward in the literature to account for the influence of RSI duration on sequence learning in the SRT task (Frensch & Miner, 1994; Huang et al., 2017; Willingham et al., 1997). However, this question has never been addressed from the perspective of hierarchical structure extraction. We found that RSI duration affected hierarchical elaboration in a non-linear way, with participants building higher hierarchical structures with an RSI of 250 ms compared to RSIs of 1000 ms and 100 ms. This finding suggests the presence of an optimal temporal window for sequence learning in the SRT task. This U-shaped effect cannot be accounted for by any of the existing hypotheses on the influence of RSI duration on sequence learning in the SRT task. We hypothesized that this effect results from the tension between the cognitive system's limited encoding capacity and the amount of information per unit of time delivered to the system.
... Other forms of actuation such as verbal or non-verbal body movements are not described here. A fundamental model of actuation is the Hyman-Hick law [28], [29], which hypothesizes that the difficulty of a motor task is proportional to the entropy of the situation. More commonly, the Hyman-Hick law is known in the context of decision-making, i.e., more choices equate to higher entropy, which means the person will take longer to make a decision. ...
Preprint
Full-text available
We survey the landscape of human operator modeling, ranging from the early cognitive models developed in artificial intelligence to more recent formal task models developed for model-checking of human-machine interactions. We review human performance modeling and human factors studies in the context of aviation, and models of how the pilot interacts with automation in the cockpit. The purpose of the survey is to assess the applicability of available state-of-the-art models of human operators for the design, verification and validation of future safety-critical aviation systems that exhibit higher levels of autonomy but still require human operators in the loop. These systems include single-pilot aircraft and NextGen air traffic management. We discuss the gaps in existing models and propose future research to address them.
... This bias is an artifact that would contaminate comparisons of conditions with different trial frequencies if medians were used to summarize the RTs in each condition. Originally, comparisons of such conditions were used particularly in studies of the main effects of stimulus and response probability (e.g., Hyman, 1953), attentional cuing (e.g., Posner et al., 1978), and expectancy (e.g., Mowrer et al., 1940, Zahn and Rosenthal, 1966). In addition, trial frequencies have often been varied across conditions to explore a variety of cognitive processes by investigating their interactions with probability (e.g., Broadbent and Gregory, 1965, Den Heyer et al., 1983, Miller and Pachella, 1973, Sanders, 1970, Theios et al., 1973). ...
Article
Full-text available
Contrary to the warning of Miller (1988), Rousselet and Wilcox (2020) argued that it is better to summarize each participant's single-trial reaction times (RTs) in a given condition with the median than with the mean when comparing the central tendencies of RT distributions across experimental conditions. They acknowledged that median RTs can produce inflated Type I error rates when conditions differ in the number of trials tested, consistent with Miller's warning, but they showed that the bias responsible for this error rate inflation could be eliminated with a bootstrap bias correction technique. The present simulations extend their analysis by examining the power of bias-corrected medians to detect true experimental effects and by comparing this power with the power of analyses using means and regular medians. Unfortunately, although bias-corrected medians solve the problem of inflated Type I error rates, their power is lower than that of means or regular medians in many realistic situations. In addition, even when conditions do not differ in the number of trials tested, the power of tests (e.g., t-tests) is generally lower using medians rather than means as the summary measures. Thus, the present simulations demonstrate that summary means will often provide the most powerful test for differences between conditions, and they show what aspects of the RT distributions determine the size of the power advantage for means.
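The bias at issue is easy to reproduce. With a right-skewed RT distribution (illustrative log-normal parameters), the sample median overestimates the population median, and more so with fewer trials, which is what contaminates comparisons across conditions with unequal trial counts:

```python
import numpy as np

rng = np.random.default_rng(4)

mu, sigma = -0.9, 0.6
true_median = np.exp(mu)                 # median of a log-normal is e^mu

for n_trials in (10, 40, 160):
    medians = [np.median(rng.lognormal(mu, sigma, n_trials))
               for _ in range(20_000)]
    bias = np.mean(medians) - true_median
    print(f"{n_trials:>3} trials: median bias = {bias:+.4f} (RT units)")
```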
... As a basis for our augmented benchmark, we use the DREAM benchmark of 540 PSTNs (Abrahams et al. 2019) and the CAR-SHARING dataset of 169 PSTNs, which was created by Santana et al. (2016) and then edited by Akmal et al. (2019). The Gamma benchmark has the shape parameter α set proportional to the edge length. By doing so, this benchmark can capture the finding from human behavior that tasks with longer durations tend to have more normal-looking duration distributions (Hyman 1953). The scale parameter β is set to one for consistency in comparison. ...
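A sketch of the Gamma benchmark construction just described (the proportionality constant 0.1 is a guess for illustration; the paper's value may differ). Since the skewness of a gamma distribution is 2/sqrt(alpha), tying alpha to edge length makes longer tasks look more normal:

```python
import numpy as np

rng = np.random.default_rng(5)

for edge_length in (10, 100, 1000):
    alpha = 0.1 * edge_length            # shape proportional to edge length
    draws = rng.gamma(alpha, 1.0, size=10_000)   # scale beta fixed at 1
    skew = ((draws - draws.mean()) ** 3).mean() / draws.std() ** 3
    print(f"edge length {edge_length:>4}: sample skewness = {skew:.2f}")
```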
Article
Probabilistic Simple Temporal Networks (PSTN) facilitate solving many interesting scheduling problems by characterizing uncertain task durations with unbounded probabilistic distributions. However, most current approaches assess PSTN performance using normal or uniform distributions of temporal uncertainty. This paper explores how well such approaches extend to families of non-symmetric distributions shown to better represent the temporal uncertainty introduced by, e.g., human teammates by building new PSTN benchmarks. We also build probability-aware variations of current approaches that are more reactive to the shape of the underlying distributions. We empirically evaluate the original and modified approaches over well-established PSTN datasets. Our results demonstrate that alignment between the planning model and reality significantly impacts performance. While our ideas for augmenting existing algorithms to better account for human-style uncertainty yield only marginal gains, our results surprisingly demonstrate that existing methods handle positively-skewed temporal uncertainty better.
... The primary effect of increasing the number of response alternatives was an increase of RT with more alternatives. This increase of RT is consistent with the venerable Hick-Hyman law (Hick, 1952; Hyman, 1953) relating RT to the amount of information to be processed (see Proctor & Schneider, 2018, for a historical review). This effect is likely due to two, probably related, processes. ...
Article
Full-text available
Response repetitions aid performance when a task repeats but impair performance when a task switches. Although this interaction is robust, theoretical accounts remain controversial. Here, we used an un-cued, predictable task-switching paradigm with univalent targets to explore whether a simple bias to switch the response when the task switches can explain the interaction. In Experiment 1A (n = 40), we replicated the basic interaction in a two-choice task. In Experiment 1B (n = 60), we observed the same interaction in a three-choice task, wherein a bias to switch the response when the task switches cannot prime a specific alternative response because both remaining response alternatives are equally likely. Exploratory comparisons revealed a larger interaction between task repetition and response repetition in the three-choice task than in the two-choice task for mean response time (RT) and the opposite pattern for mean error rate (ER). Critically, in the three-choice task, response-repetition costs in task switches were significant in both RT and ER. Since a bias to switch the response cannot prime a specific response alternative in a three-choice task, we conclude that such a bias cannot account for response-repetition costs in task-switch trials.
... This theory is an extension of Hartley's study [6], which defined information as "a set of symbols obtained by successive choices (including elimination)" and contributed to the development of information theory. After Shannon's information theory was published, it was adapted to psychology [7–9] and developed into an analogy that treats the human mind as a computer that processes information, later maturing into the theory of human information processing. ...
Article
Full-text available
Fitts’ approach, which examines the information processing of the human motor system, has the problem that movement speed is governed by the task's index of difficulty, so each participant settles on their own, arbitrary speed. This study aims to rigorously examine the relationship between movement speed and information processing by using Woodworth’s method to control movement speed. Furthermore, we examined movement information processing using an approach that calculates probability-based information entropy and mutual information between points from trajectory analysis. Overall, 17 experimental conditions were applied, 16 being externally controlled and one being self-paced at maximum speed. Considering that information processing occurs when irregularities decrease, the point at which information processing occurs switches at a movement frequency of approximately 3.0–3.25 Hz. Previous findings have suggested that motor control switches with increasing movement speed; thus, our approach helps explore human information processing in detail. Note that the characteristics of information processing under changing movement speed identified in this study were derived from one participant, but they are important characteristics of human motor control.
... Results have a strong grounding in information theory [27], notably Fitts' Law and the Hick-Hyman Law [28–30], which concern the movement time necessary to acquire a visual target and the relationship between information load and choice-reaction time (i.e. the time taken to determine which target/item to acquire before moving towards it), respectively. Fitts' Law and the Hick-Hyman Law are predicated on the fact that human performance is limited primarily by the capacity of the human motor system, as determined by the visual and proprioceptive feedback that permits an individual to monitor their own movement and activity. ...
Article
Full-text available
Studies comparing results captured in a simulator with those on road are important to validate the approach but are scarce in the context of secondary task distraction due to the potential ramifications of diverting attention away from safe driving. We compare distraction-related data from two studies exploring HMI design: one conducted in a static, medium-fidelity driving simulator with a vehicle enclosure and immersive visual environment, and one conducted on road. In both, 19 drivers undertook an identical selection of touchscreen, point-and-select tasks. The magnitude of visual distraction (defined as off-road glances directed towards the touchscreen) differed between the road and simulator, with drivers making more and longer off-road glances when interacting with the interface on road. However, the ordering of effects in response to changes to the complexity of interface design was the same. For example, the number and duration of off-road glances increased with increasing number of interface elements, and smaller targets attracted longer off-road glances, in both the road and simulator studies. The work demonstrates good relative validity for the use of medium-fidelity driving simulators for HMI-visual distraction testing, supporting their application in this context, and adds to the literature regarding the visual demand characteristics of in-vehicle interfaces.
... In the above equation, D and W represent the distance to and the width of the target objects, respectively, whereas a and b are the regression coefficients. The Hick-Hyman law [18,20] was developed to model the reaction time (RT) of a user based on the following mathematical expression. ...
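The expressions themselves were not captured in this snippet. For reference, the standard textbook forms (from the broader HCI literature, not reconstructed from this particular paper) are:

```latex
% Fitts' law: movement time to a target of width W at distance D
MT = a + b \log_2\!\left(\frac{2D}{W}\right)

% Hick-Hyman law: reaction time for a choice among n equiprobable
% alternatives (the +1 covers the option of not responding at all)
RT = a + b \log_2(n + 1)
```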
Article
Full-text available
Usability is generally considered a metric to judge the efficacy of any interface. This is also true for the web pages of a website. Different factors (efficiency, memorability, learnability, errors, and aesthetics) play significant roles in determining usability. In this work, we propose a computational model to predict the efficiency with which users can do a particular task on a website. We considered seventeen features of web pages that may affect the efficiency of a task. The statistical significance of these features was tested based on empirical data collected using twenty websites. For each website, a representative task was identified. Twenty participants completed these tasks in a controlled environment. Task completion times were recorded for feature identification. The one-dimensional ANOVA study reveals that sixteen out of the seventeen features are statistically significant for efficiency measurement. Using these features, a computational model was developed based on Support Vector Regression. Experimental results show that our model can predict the efficiency of web pages’ tasks with an accuracy of 90.64%.
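A minimal sketch of the modeling pipeline this abstract describes, using scikit-learn's Support Vector Regression. The feature matrix and timings below are random placeholders, not the study's seventeen features or its empirical data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 16))        # 16 page features (placeholder data)
y = X @ rng.normal(size=16) + rng.normal(scale=0.1, size=100)  # task times

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:80], y[:80])             # train on the first 80 pages
print(f"held-out R^2: {model.score(X[80:], y[80:]):.2f}")
```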
... We will return to the results obtained in studies of choice reaction time later in this paper; for now, we note that the original research interest in the growth of response time with an increasing number of alternatives arose within experimental psychology. Working within the experimental approach and drawing on C. Shannon's information theory [Shannon, 1948], W. Hick [Hick, 1952] and R. Hyman [Hyman, 1953] independently formulated a theoretical model positing a logarithmic increase in response time as the number of alternative stimuli grows. ...
Article
Studies of information-processing speed in tasks of varying complexity often run into methodological problems: it proves impossible to estimate reliable individual differences in the parameters characterizing the growth of response time with increasing task complexity, and the correlations of these parameters with external measures are unstable. Using computer simulations involving repeated data generation and subsequent latent growth modeling, this work demonstrates that the instability and systematic underestimation of the association between the growth parameter and an external measure may well result from an incorrect choice of model for describing individual growth trajectories. Moreover, choosing an insufficiently flexible function for modeling may prevent correct identification of individual differences in the shape of the growth. Although the results and conclusions are discussed in the context of information-processing speed, they hold for any research in which trajectories of change in psychological measures are modeled at the latent level.
Article
Humans are exposed to sequences of events in the environment, and the interevent transition probabilities in these sequences can be modeled as a graph or network. Many real-world networks are organized hierarchically and while much is known about how humans learn basic transition graph topology, whether and to what degree humans can learn hierarchical structures in such graphs remains unknown. We probe the mental estimates of transition probabilities via the surprisal effect phenomenon: humans react more slowly to less expected transitions. Using mean-field predictions and numerical simulations, we show that surprisal effects are stronger for finer-level than coarser-level hierarchical transitions, and that surprisal effects at coarser levels are difficult to detect for limited learning times or in small samples. Using a serial response experiment with human participants (n=100), we replicate our predictions by detecting a surprisal effect at the finer level of the hierarchy but not at the coarser level of the hierarchy. We then evaluate the presence of a trade-off in learning, whereby humans who learned the finer level of the hierarchy better also tended to learn the coarser level worse, and vice versa. This study elucidates the processes by which humans learn sequential events in hierarchical contexts. More broadly, our work charts a road map for future investigation of the neural underpinnings and behavioral manifestations of graph learning.
Article
Full-text available
The research subject is methods of building the user interface of university websites based on the intended purpose, the needs of the audience, and user limitations, including sensory-motor and cognitive-psychological limitations. As a starting point for studying the target audience and compliance with accessibility standards, eight university websites are analyzed based on open-source data. The main violations that prevent users from using a university's website to varying degrees are considered, as well as the most well-known and frequently used approaches to interface design that make interfaces more convenient without overloading the user's short-term memory or causing premature fatigue. As a result of the research, the basic requirements for designing the interface of a university's website are formulated. According to the main conclusion of this study, to adapt a university's website to the limitations of users' capabilities, it is necessary to follow the main usability and accessibility standards considered, such as GOST R 52872-2019, WCAG 2.1, and GOST R ISO 9241-20-2014, and also to take into account the peculiarities of legislation that affect the formation of the site's sections and its accessibility for people with disabilities. It is necessary to adhere to such principles of interface organization and information presentation as Hick's Law, Gestalt Principles, Miller's Law, Jacob's Law, and heuristics. The author's particular contribution is the audit of eight Russian university websites for compliance with accessibility standards. This analysis showed that even the visually impaired versions of the sites reviewed do not meet accessibility standards, which makes it difficult for people with disabilities to access information and underscores the importance of the study. The novelty of the research lies in formulating the basic requirements for the user interface of university websites. The study's results can be further used in constructing such information systems.
Article
Full-text available
An abundant literature reports on ‘sequential effects’ observed when humans make predictions on the basis of stochastic sequences of stimuli. Such sequential effects represent departures from an optimal, Bayesian process. A prominent explanation posits that humans are adapted to changing environments and erroneously assume non-stationarity of the environment, even if the latter is static. As a result, their predictions fluctuate over time. We propose a different explanation in which sub-optimal and fluctuating predictions result from cognitive constraints (or costs), under which humans nevertheless behave rationally. We devise a framework of costly inference, in which we develop two classes of models that differ by the nature of the constraints at play: in one case the precision of beliefs comes at a cost, resulting in an exponential forgetting of past observations, while in the other beliefs with high predictive power are favored. To compare model predictions to human behavior, we carry out a prediction task that uses binary random stimuli, with probabilities ranging from 0.05 to 0.95. Although in this task the environment is static and the Bayesian belief converges, subjects’ predictions fluctuate and are biased toward the recent stimulus history. Both classes of models capture this ‘attractive effect’, but they depart in their characterization of higher-order effects. Only the precision-cost model reproduces a ‘repulsive effect’, observed in the data, in which predictions are biased away from stimuli presented in more distant trials. Our experimental results reveal systematic modulations in sequential effects, which our theoretical approach accounts for in terms of rationality under cognitive constraints.
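The precision-cost mechanism can be sketched compactly: exponential forgetting amounts to a leaky running average of the stimulus history. The toy comparison below, with a hypothetical forgetting rate and not the authors' fitted model, contrasts it with the converging Bayesian estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
p_true = 0.7                                   # static environment
x = rng.random(1000) < p_true                  # binary stimulus sequence

lam = 0.05                                     # hypothetical forgetting rate
p_leaky = 0.5
for obs in x:
    p_leaky += lam * (obs - p_leaky)           # exponential forgetting
p_bayes = (x.sum() + 1) / (len(x) + 2)         # Beta(1,1) posterior mean

# p_bayes converges to p_true; p_leaky keeps fluctuating with the recent
# stimulus history, producing the 'attractive' sequential effect.
```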
Article
One of the first tasks in language acquisition is word segmentation, a process to extract word forms from continuous speech streams. Statistical approaches to word segmentation have been shown to be a powerful mechanism, in which word boundaries are inferred from sequence statistics. This approach requires the learner to represent the frequency of units from syllable sequences, though accounts differ on how much statistical exposure is required. In this study, we examined the computational limit with which words can be extracted from continuous sequences. First, we discussed why two occurrences of a word in a continuous sequence constitute the computational lower limit for this word to be statistically defined. Next, we created short syllable sequences that contained certain words either two or four times. Learners were presented with these syllable sequences one at a time, immediately followed by a test of the novel words from these sequences. We found that, with the computationally minimal amount of two exposures, words were successfully segmented from continuous sequences. Moreover, longer syllable sequences providing four exposures to words generated more robust learning results. The implications of these results are discussed in terms of how learners segment and store the word candidates from continuous sequences.
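The transitional-probability computation this account relies on can be sketched in a few lines. The syllable stream, the two-character 'syllables', and the 0.75 cut-off below are illustrative assumptions, not the study's stimuli:

```python
from collections import Counter

stream = "tupirogolabutupirobidakugolabubidaku"    # two tokens of each 'word'
sylls = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pair_counts = Counter(zip(sylls, sylls[1:]))
first_counts = Counter(sylls[:-1])
tp = {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}

# Hypothesize a word boundary after position i when TP(s_i -> s_{i+1}) dips.
boundaries = [i + 1 for i, pair in enumerate(zip(sylls, sylls[1:]))
              if tp[pair] < 0.75]
# With only two exposures per word, some boundary TPs remain high; more
# exposures sharpen the dips, echoing the two- vs four-exposure result.
```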
Chapter
Why is deciding to do something sometimes so slow and difficult? How do we make decisions when lacking key information? When making decisions, the higher areas of the brain deliberately suppress lower areas capable of generating much faster but ill-considered responses while they develop ones that are more sophisticated, based on what can be gained in return. In this engaging book, the authors explore the increasingly popular neural model that may explain these mechanisms: the linear approach to threshold with ergodic rate (LATER). Presenting a detailed description of the neurophysiological processes involved in decision-making and how these link to the LATER model, this is the first major resource covering its applications in describing human behaviour. With over 100 illustrations and a thorough discussion of the mathematics supporting the model, this is a rigorous yet accessible resource for psychologists, cognitive neuroscientists and neurophysiologists interested in decision-making.
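The core of LATER is compact enough to sketch. In the toy simulation below (illustrative parameters, not the book's fitted values), a decision signal rises linearly to threshold at a rate drawn afresh on each trial, so the reciprocal of the latency, rather than the latency itself, is approximately Gaussian:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 1.0                    # threshold minus baseline (arbitrary units)
mu, sigma = 5.0, 1.0           # mean and SD of the rise rate, 1/s

rates = rng.normal(mu, sigma, 10_000)
rates = rates[rates > 0]       # drop non-rising trials in this toy version
rt = theta / rates             # latencies in seconds

# 1/rt is approximately Gaussian: np.mean(1/rt) ~ mu, np.std(1/rt) ~ sigma,
# which is the straight line seen on a reciprobit plot.
```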
Article
Measuring the duration of cognitive processing with reaction time is fundamental to several subfields of psychology. Many methods exist for estimating movement initiation when measuring reaction time, but there is an incomplete understanding of their relative performance. The purpose of the present study was to identify and compare the tradeoffs of 19 estimates of movement initiation across two experiments. We focused our investigation on estimating movement initiation on each trial with filtered kinematic and kinetic data. Nine of the estimates involved absolute thresholds (e.g., acceleration 1000 back to 200 mm/s², micro push-button switch), and the remaining ten estimates used relative thresholds (e.g., force extrapolation, 5% of maximum velocity). The criteria were the duration of reaction time, immunity to the movement amplitude, responsiveness to visual feedback during movement execution, reliability, and the number of manually corrected trials (efficacy). The three best overall estimates, in descending order, were yank extrapolation, force extrapolation, and acceleration 1000 back to 200 mm/s². The sensitive micro push-button switch, which was the simplest estimate, had a decent overall score, but it was a late estimate of movement initiation. The relative thresholds based on kinematics had the six worst overall scores. An issue with the relative kinematic thresholds was that they were biased by the movement amplitude. In summary, we recommend measuring reaction time on each trial with one of the three best overall estimates of movement initiation. Future research should continue to refine existing estimates while also exploring new ones.
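The contrast between absolute and relative thresholds can be sketched on a toy velocity trace; the ramp profile and cut-off values below are illustrative, not the paper's pipeline:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)                        # time, s
velocity = np.where(t > 0.3, 800.0 * (t - 0.3), 0.0)   # toy ramp, mm/s

onset_abs = t[np.argmax(velocity > 50.0)]              # absolute: 50 mm/s
onset_rel = t[np.argmax(velocity > 0.05 * velocity.max())]  # 5% of peak

# The relative cut-off (0.05 * peak) grows with peak velocity, which is how
# movement amplitude can bias relative kinematic estimates of initiation.
```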
Chapter
Every time a user taps on an element on a screen, she provides some “information”. Classically, Fitts’ law accounts for the speed accuracy trade-off in this operation, and Fitts’ throughput provides the “rate of information transfer” from the human to the device. However, Fitts’ throughput is a theoretical construct, and it is difficult to interpret it in the practical design of interfaces. Our motivation is to compare this theoretical rate of information transfer with the empirical values achieved in typical, realistic pointing tasks. To do so, we developed four smartphone-based interfaces - a 1D and a 2D interface for a typical Fitts’ study and a 1D and a 2D interface for an empirical study. In the Fitts’ study, participants touched the target bar or circle as quickly as possible. In the empirical study, participants typed seven 10-digit phone numbers ten times each. We conducted a systematic, within-subjects study with 20 participants and report descriptive statistics for the Fitts’ throughput and the empirical throughput values. We also carried out statistical significance tests, the results of which are as follows. As we had expected, the Fitts’ throughput for the 1D task was significantly higher than the empirical throughput for the number typing task in 1D. Surprisingly, the difference was in the opposite direction for the 2D tasks. Further, we found that throughputs for both 2D tasks were higher than their 1D counterparts, which is also an unusual result. We compare our values with those reported in key Fitts’ law literature and propose potential explanations for these surprises, which need to be evaluated in future research. Keywords: Fitts’ law, Fitts’ throughput, index of performance
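For readers unfamiliar with the construct, Fitts' throughput is conventionally computed as the index of difficulty divided by mean movement time; a minimal sketch with illustrative values (not the study's data):

```python
import math

D, W = 80.0, 8.0              # target distance and width, mm (illustrative)
mt = 0.45                     # mean movement time, s (illustrative)

ID = math.log2(D / W + 1)     # Shannon-form index of difficulty, bits
throughput = ID / mt          # bits per second

# The 'effective' variant replaces W with We = 4.133 * SD of the observed
# endpoints before computing IDe and the effective throughput IDe / MT.
```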
Article
This paper uses visual-learning and meditation-based techniques to develop brain-training apps intended to improve cognitive performance. It seeks to understand what causes slow cognitive processing speed, and it designs an application to diagnose slow processing speed in its early stages and to support people affected by it. Substantial primary data are collected to validate the hypothesis that visual learning techniques presented through mobile applications can improve cognitive performance.
Article
Decision making often depends on vague information that leads to uncertainty, which is a quantity contingent not on choice but on probability distributions of sensory evidence and other cognitive variables. Uncertainty may be computed in parallel and interact with decision making. Here, we adapt the classic random-dot motion direction discrimination task to allow subjects to indicate their uncertainty without having to form a decision first. We measure subjects' choices and reaction times for both perceptual decisions and uncertainty responses. We then build a value-based model in which decisions are based on optimizing value computed from a drift-diffusion process. The model accounts for key features of subjects' behavior and the variation across individuals. It explains how the addition of the uncertainty option affects perceptual decision making. Our work establishes a value-based theoretical framework for studying uncertainty and perceptual decisions that can be readily applied in future investigations of the underlying neural mechanism.
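The drift-diffusion core of such a model is straightforward to simulate; the sketch below uses illustrative parameters and omits the paper's value computation and uncertainty option:

```python
import numpy as np

rng = np.random.default_rng(3)

def ddm_trial(drift=0.8, bound=1.0, sigma=1.0, dt=0.001):
    """One trial: accumulate noisy evidence until a bound is hit."""
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
    return x > 0, t            # choice, decision time

choices, dts = zip(*(ddm_trial() for _ in range(200)))
# Low motion coherence maps onto low drift: slower, less accurate decisions
# with lower expected value, which is when an uncertainty response pays off.
```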
Chapter
Sports motor science (Sportmotorik) deals with the internal control mechanisms underlying externally visible movement behaviour, and with changes in these mechanisms on different time scales, namely processes of adaptation (minutes/hours), learning (days/weeks), and development (years/stages of life). Against this background, the chapter is divided into two sections. The first section, on motor control, centres on the cognitive approaches that flourished with the beginning of the computer age, the systemic approaches of connectionism that emerged toward the end of the 20th century, and the dynamical approaches of the present day. Building on this, the second section treats performance characteristics for assessing changes in coordination; discusses learning processes that result from the reinforcement of good executions, the processing of relevant information, or the positive transfer of movement experience; and finally considers development as a highly interactive process concerning lifespan changes in motor skills as a function of learning, maturation, and growth, including interactions with higher cognitive functions.
Chapter
This chapter reviews some of the major findings, principles, and theories concerning selection and control of action that are relevant to designing for human use. Choice reaction time methods can be used not only to examine action selection for single-task performance but also for conditions in which two or more task sets must be maintained, and the person is required to switch between the various tasks periodically or to perform the tasks concurrently. Because considerable research on action selection has been conducted using both single and multiple tasks, the chapter covers single- and multiple-task performance separately. It shows that the relations between choice uncertainty and response time (Hick's law), between movement difficulty and movement time (Fitts's law), and between amount of practice and performance time (the power law of practice) are quantitative relations that can be applied to specific research and design issues in human factors and ergonomics.
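For reference, the three laws the chapter applies can be stated compactly (a and b are empirically fitted constants; N is the number of alternatives, D the movement distance, W the target width, and n the number of practice trials):

```latex
% Hick's law, Fitts's law (original form), and the power law of practice
\mathrm{RT} = a + b \log_2 N, \qquad
\mathrm{MT} = a + b \log_2\!\left(\tfrac{2D}{W}\right), \qquad
T_n = a\, n^{-b}
```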
Chapter
Information processing lies at the heart of human performance. This chapter describes the characteristics of the important stages of information processing, from perception of the environment to acting on that environment. It begins by contrasting different ways in which information processing has been treated in applied psychology, and then describes processes and transformations related to attention, perception, memory and cognition, action selection, and multiple-task performance. The chapter adopts as a framework the information-processing model depicted. Stimuli or events are sensed and attended, and the information received by the sensory system is perceived, that is, given a meaningful interpretation based on memory of past experience. The chapter also describes the means by which comprehension is achieved, and it provides information on the unification subsystem, including the corpus of knowledge it requires and the operations it performs.
Chapter
This chapter covers fundamental knowledge of mathematical modeling and shows examples of modeling single- and multiple-task performance in human factors. It begins with a description of the basics of mathematical modeling in human factors, including the expectations of an ideal model, the terminology of the domain, and the features of mathematical modeling compared with other modeling approaches. The chapter then illustrates mathematical modeling of human performance in single-task and multitask situations. It provides a brief history of top-down, theory-driven mathematical models, simulation/symbolic models, and their integration. Next, the chapter describes how to build and verify mathematical models, using a concrete example to illustrate the steps involved in this process. Finally, the chapter provides further discussion of mathematical modeling in human factors, together with specific examples as illustrations.
Article
The analytical methods of information theory are applied to the data obtained in certain choice-reaction-time experiments. Two types of experiment were performed: (a) a conventional choice-reaction experiment, with various numbers of alternatives up to ten, and with a negligible proportion of errors, and (b) a ten-choice experiment in which the subjects deliberately reduced their reaction time by allowing themselves various proportions of errors. The principal finding is that the rate of gain of information is, on the average, constant with respect to time, within the duration of one perceptual-motor act, and has a value of the order of five “bits” per second. The distribution of reaction times among the ten stimuli in the second experiment is shown to be related to the objective uncertainty as to which response will be given to each stimulus. The distribution of reaction times among the responses is also related to the same uncertainty. This is further evidence that information is intimately concerned with reaction time. Some possible conceptual models of the process are considered, but tests against the data are inconclusive.
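The constant-rate finding implies a linear fit of reaction time against stimulus information; a minimal sketch with toy values (not Hick's data), using the n + 1 convention that counts the 'no signal' alternative:

```python
import numpy as np

n_alts = np.array([1, 2, 4, 8, 10])
H = np.log2(n_alts + 1)            # stimulus information, bits
rt = 0.18 * H + 0.02               # toy RTs lying on a Hick-type line, s

slope, intercept = np.polyfit(H, rt, 1)
rate = 1.0 / slope                 # ~5.6 bits/s, the order Hick reports
```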
Die zeitlichen Verhältnisse der Willensthätigkeit
  • J Merkel
MERKEL, J. Die zeitlichen Verhältnisse der Willensthätigkeit. Philos. St., 1885, 2, 73-127.