669 results for Categorization
Abstract:
Recent brain imaging work has expanded our understanding of the mechanisms of perceptual, cognitive, and motor functions in human subjects, but research into the cerebral control of emotional and motivational function is at a much earlier stage. Important concepts and theories of emotion are briefly introduced, as are research designs and multimodal approaches to answering the central questions in the field. We provide a detailed examination of the methodological and technical challenges in assessing the cerebral correlates of emotional activation, perception, learning, memory, and emotional regulation behavior in healthy humans. fMRI is particularly challenging in structures such as the amygdala, as it is affected by susceptibility-related signal loss, image distortion, physiological and motion artifacts, and colocalized resting-state networks (RSNs). We review how these problems can be mitigated by using optimized echo-planar imaging (EPI) parameters, alternative MR sequences, and correction schemes. High-quality data can be acquired rapidly in these problematic regions with gradient-compensated multi-echo EPI or high-resolution EPI with parallel imaging and optimal gradient directions, combined with distortion correction. Although neuroimaging studies of emotion encounter many difficulties regarding the limitations of measurement precision, research design, and strategies for validating neuropsychological emotion constructs, considerable improvement in data quality and sensitivity to subtle effects can be achieved. The methods outlined offer the prospect for fMRI studies of emotion to provide more sensitive, reliable, and representative models of measurement that systematically relate the dynamics of emotional regulation behavior to topographically distinct patterns of activity in the brain. This will provide additional information as an aid to the assessment, categorization, and treatment of patients with emotional and personality disorders.
Abstract:
The IDE used in most Smalltalk dialects, such as Pharo, Squeak, or Cincom Smalltalk, has not evolved significantly over the last years, if not decades. For other languages, for instance Java, the available IDEs have made tremendous progress, as Eclipse or NetBeans illustrate. While the Smalltalk IDE served as an exemplar for many years, other IDEs have caught up with or even overtaken the erstwhile leader in terms of feature richness, usability, and code navigation facilities. In this paper we first analyze the difficulty of software navigation in the Smalltalk IDE and second illustrate with concrete examples the features we added to the Smalltalk IDE to close the gap to modern IDEs and to provide novel, improved means to navigate the source space. We show that, thanks to the agility and dynamic nature of Smalltalk, we are able to extend and enhance the Smalltalk IDE with reasonable effort to better support software navigation, program comprehension, and software maintenance in general. One such improvement is the integration of dynamic information into the familiar static source views. Other means include easing access to static information (for instance by better arranging important packages) or helping developers relocate artifacts of interest (for example with a categorization system such as smart groups).
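The smart groups mentioned above are Smalltalk-side tooling; purely as an illustration of the underlying idea, here is a minimal language-neutral sketch in Python of a predicate-based smart group whose membership is recomputed on demand (all names are hypothetical, not the paper's API):

```python
from typing import Callable, Iterable, List

class SmartGroup:
    """A named, predicate-based category: membership is computed on demand
    rather than stored, so the group stays current as the code base changes.
    (Hypothetical illustration; the paper's smart groups live in Smalltalk.)"""

    def __init__(self, name: str, predicate: Callable[[str], bool]):
        self.name = name
        self.predicate = predicate

    def members(self, artifacts: Iterable[str]) -> List[str]:
        # Re-evaluate the predicate against the current artifacts each time.
        return [a for a in artifacts if self.predicate(a)]

# Usage: group classes by a naming convention (toy artifact model).
classes = ["AccountTest", "Account", "LedgerTest", "Ledger"]
tests = SmartGroup("unit tests", lambda name: name.endswith("Test"))
print(tests.members(classes))  # ['AccountTest', 'LedgerTest']
```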
Abstract:
Audio-visual documents obtained from German TV news are classified according to the IPTC topic categorization scheme. To this end, standard text classification techniques are adapted to speech, video, and non-speech audio. For each of the three modalities, word analogues are generated: sequences of syllables for speech, "video words" based on low-level color features (color moments, color correlogram, and color wavelet), and "audio words" based on low-level spectral features (spectral envelope and spectral flatness) for non-speech audio. Such audio and video words provide a means to represent the different modalities in a uniform way. Audio-visual documents are represented by the frequencies of the word analogues: the standard bag-of-words approach. Support vector machines are used for supervised classification in a 1 vs. n setting. Classification based on speech outperforms all other single modalities. Combining speech with non-speech audio improves classification. Classification is further improved by supplementing speech and non-speech audio with video words. Optimal F-scores range between 62% and 94%, corresponding to 50%-84% above chance. The optimal combination of modalities depends on the category to be recognized. The construction of audio and video words from low-level features provides a good basis for the integration of speech, non-speech audio, and video.
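The bag-of-words representation over word analogues and the 1 vs. n SVM classification can be sketched concretely; a minimal Python/scikit-learn sketch with invented toy documents (the word analogues, labels, and identifiers are illustrative assumptions, not the authors' code or data):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

# Toy documents: each is the sequence of word analogues (syllables,
# "video words", "audio words") extracted from one broadcast item.
docs = ["vw3 aw7 syl12 vw3", "aw2 aw2 syl4 vw9", "syl12 vw3 aw7 syl12"]
labels = ["politics", "sports", "politics"]  # IPTC-style topic labels

# Bag-of-words: each document becomes a vector of word-analogue frequencies.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

# 1 vs. n setting: one binary linear SVM per topic (one-vs-rest).
clf = OneVsRestClassifier(LinearSVC()).fit(X, labels)
print(clf.predict(vectorizer.transform(["vw3 syl12 aw7"])))
```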
Abstract:
Given arbitrary pictures, we explore the possibility of using new techniques from computer vision and artificial intelligence to create customized visual games on-the-fly. This includes coloring books, link-the-dot and spot-the-difference popular games. The feasibility of these systems is discussed and we describe prototype implementation that work well in practice in an automatic or semi-automatic way.
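As an illustration of the kind of pipeline involved, here is a minimal sketch of turning an arbitrary photo into a coloring-book page with standard computer vision operations (OpenCV assumed; the file names and thresholds are invented, and this is not the authors' implementation):

```python
import cv2

# Load an arbitrary picture (path is a placeholder).
img = cv2.imread("photo.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Smooth to suppress texture, then extract strong edges: the dark
# outlines of a coloring-book page are the inverted edge map.
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)
coloring_page = cv2.bitwise_not(edges)  # black lines on white background

cv2.imwrite("coloring_page.png", coloring_page)
```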
Abstract:
This article proposes a model explaining how family control/influence in an organization affects individual stakeholders’ perceptions of benevolence. The model suggests two effects. First, based on socioemotional wealth research, we propose that family control/influence positively affects stakeholders’ perceptions of benevolence through the benevolent behavior that the organization shows toward its stakeholders. However, this effect can be negatively influenced if the family’s socioemotional wealth goals in terms of “Family control and influence” and/or “Renewal of family bonds to the firm through dynastic succession” are at risk. Second, we argue that family control/influence, to the extent that it is perceivable to the stakeholder, influences stakeholders’ perceptions of benevolence through categorization processes. However, the impact of perceivable family control/influence on stakeholders’ perceptions of benevolence is not straightforward but instead hinges on a set of individual-level contingency factors of the stakeholder, such as stakeholders’ family business in-group membership, stakeholders’ secondhand category information, and stakeholders’ firsthand category information.
Abstract:
The authors review the implicit association test (IAT), its use in marketing, and the methodology and validity issues that surround it. They focus on a validity problem that has not been investigated previously, namely, the impact of cognitive inertia on IAT effects. Cognitive inertia refers to the difficulty in switching from one categorization rule to another, which causes IAT effects to depend on the order of administration of the two IAT blocks. In Study 1, the authors observe an IAT effect when the compatible block precedes the incompatible block but not when it follows the incompatible block. In Studies 2 and 3, the IAT effect changes its sign when the order of the blocks reverses. Cognitive inertia distorts individual IAT scores and diminishes the correlations between IAT scores and predictor variables when the block order is counterbalanced between subjects. Study 4 shows that counterbalancing the block order repeatedly within subjects can eliminate cognitive inertia effects on the individual level. The authors conclude that researchers should either interpret IAT scores at the aggregate level or, if individual IAT scores are of interest, counterbalance the block order repeatedly within subjects.
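The proposed remedy can be made concrete with a toy calculation: when each participant completes both block orders, cognitive inertia slows whichever block comes second, so it enters the two raw effects with opposite signs and averages out. A hypothetical numeric sketch in Python (invented latencies, not the authors' scoring algorithm):

```python
import statistics

# Hypothetical mean latencies (ms) for one participant who completed the
# IAT twice, once in each block order (numbers invented for illustration).
runs = [
    {"order": "compatible-first",   "compatible": 650, "incompatible": 820},
    {"order": "incompatible-first", "compatible": 700, "incompatible": 760},
]

# Per-run IAT effect: incompatible minus compatible block latency.
# Inertia inflates the effect in one order and deflates it in the other.
effects = [r["incompatible"] - r["compatible"] for r in runs]
print(effects)                   # [170, 60]: block order shifts the raw effect
print(statistics.mean(effects))  # 115.0: the inertia components average out
```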
Abstract:
We demonstrate that the IAT is crucially influenced by the order in which the two IAT blocks are administered. In three studies, the IAT effect is shown to change in magnitude and sign when the order of the 'compatible' and the 'incompatible' block is reversed. Order effects are caused by cognitive inertia, the difficulty of switching from one categorization rule to another. Cognitive inertia distorts correlations between IAT scores and other variables. While the common practice of counterbalancing block order between subjects does not cancel out these distortions, we show in Study 4 that counterbalancing block order repeatedly within subjects can eliminate order effects.
Abstract:
This paper focuses on the majority population in the People's Republic of China, the Han, and their various collective identities. The Han play a pivotal role in consolidating the Chinese territory and the multiethnic Chinese nation. Therefore, the governments of the twentieth century invested substantial effort in promoting a unitary Han identity. In spite of this, powerful local identities related to native place, occupation, and family history persist. This essay traces these identities and analyzes their intertwinement. Further, it discusses the question of the ethnicity of both the Han and local identity categories, and concludes that while the Han enact ethnicity in their relations to other minzu, local identity categories are more social than ethnic. It further posits that moments of confrontation, "degree" of ethnicity, scales of categorization, and the relationality of identities are notions that should be given particular attention in studies of ethnicity in China and elsewhere.
Abstract:
This article analyzes the interaction between theories of radicalization and state responses to militancy in India. Focusing on the interpretation of the increased frequency of terrorist attacks in Indian metropolises in the last decade, the article examines the narratives surrounding those classified as terrorists in the context of rising Muslim militancy in the country. Different state agencies operate with different theories about the links between processes of radicalization and terrorist violence. The scenarios of radicalization underlying legislative efforts to prevent terrorism, the construction of motives by the police, and the interpretation of violence by the judiciary all rely on assumptions about radicalization and violence. Such narratives are used to explain terrorism both to security agencies and to the public; they inform the categories and scenarios of prevention. Prevention relies on detection of future deeds, planning, intentions, and even potential intentions. "Detection" of potential intentions relies on assumptions about specific dispositions. Identification of such dispositions in turn relies on the context-specific theories of the causes of militancy. These determine what "characteristics" of individuals or groups indicate potential threats and form the basis for their categorization as "potentially dangerous." The article explores the cultural contexts of theories of radicalization, focusing on how they are framed by societal understandings of the causes of deviance and the relation between the individual and society emerging in contemporary India. It examines the shift in the perception of threat and the categories of "dangerous others" from a focus on role to a focus on ascriptive identity.
Abstract:
In a previous paper, we presented a proposed expansion of the National Guideline Clearinghouse (NGC) classification [1]. We performed a preliminary evaluation of the classification based on 100 guidelines randomly selected from the NGC collection. We found that 89 of the 100 guidelines could be assigned to a single guideline category. To test inter-observer agreement, twenty guidelines were also categorized by a second investigator. Agreement was found to be 40-90% depending on the axis, which compares favorably with agreement among MeSH indexers (30-60%) [2]. We conclude that categorization is feasible. Further research is needed to clarify axes with poor inter-observer agreement.
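Agreement figures of this kind are typically computed per axis as raw percent agreement, optionally chance-corrected with Cohen's kappa. A minimal Python sketch with invented category assignments (the axis labels are hypothetical, not the NGC axes):

```python
from collections import Counter

def percent_agreement(a, b):
    # Fraction of items both raters placed in the same category.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    # Chance-corrected agreement: observed agreement relative to the
    # agreement expected if raters assigned categories independently.
    po = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    n = len(a)
    pe = sum(ca[c] * cb[c] for c in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

# Two investigators categorize the same guidelines on one axis.
rater_a = ["treatment", "diagnosis", "treatment", "screening", "treatment"]
rater_b = ["treatment", "diagnosis", "screening", "screening", "treatment"]
print(percent_agreement(rater_a, rater_b))        # 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))   # 0.69
```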
Abstract:
The current state of health and biomedicine includes an enormous number of heterogeneous data 'silos', collected for different purposes and represented differently, that are presently impossible to share or analyze in toto. The greatest challenge for large-scale and meaningful analyses of health-related data is to achieve a uniform data representation for data extracted from heterogeneous source representations. Based upon an analysis and categorization of these heterogeneities, a process for achieving comparable data content by using a uniform terminological representation is developed. This process addresses the types of representational heterogeneity that commonly arise in healthcare data integration problems. Specifically, it uses a reference terminology and associated "maps" to transform heterogeneous data into a standard representation for comparability and secondary use. Capturing the quality and precision of the "maps" between local terms and reference terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for the integration of data from heterogeneous source representations; this process can be applied and extended to other problems where heterogeneous data need to be merged.
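The core of such a process is a set of maps from local terms to reference-terminology concepts, each carrying a quality annotation that survives into the aggregated data. A schematic Python sketch (the concept codes, map fields, and asthma terms are invented for illustration, not the study's actual terminology):

```python
from dataclasses import dataclass

@dataclass
class TermMap:
    local_term: str   # term as it appears in one source silo
    concept_id: str   # reference-terminology concept (code invented here)
    quality: str      # precision of the map, e.g. "exact" or "broader"

# Maps from heterogeneous sources onto one reference concept.
maps = [
    TermMap("wheezing episode", "REF:0001", "exact"),
    TermMap("asthma attack",    "REF:0001", "exact"),
    TermMap("resp. distress",   "REF:0001", "broader"),
]

def to_reference(record_term: str):
    """Rewrite a source record's term into the uniform representation,
    keeping the map quality so queries can filter on precision."""
    for m in maps:
        if m.local_term == record_term:
            return {"concept": m.concept_id, "map_quality": m.quality}
    return None  # unmapped: flag the term for terminology curation

print(to_reference("asthma attack"))
# {'concept': 'REF:0001', 'map_quality': 'exact'}
```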
Abstract:
Objective: Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is no standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method. Both methods have limitations. In this paper, a hybrid method was developed that integrates deductive and inductive methods for the categorization of activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular and for improving healthcare quality and patient safety in general. Method: The hybrid method was developed using a deductive a priori classification framework with the provision of adding new categories discovered inductively in the data. The inductive process utilized line-by-line coding and constant comparison as described in Grounded Theory. Results: The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of the medical error case. Conclusions: Findings from this study provide evidence that the hybrid model of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed in this study provides methodical support for understanding, analyzing, and managing interruptions and workflow.
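Schematically, the hybrid method amounts to coding each observation against an a priori category scheme and opening a new category whenever no deductive code fits. A much-simplified Python sketch (the cues, categories, and observations are hypothetical, not the study's framework):

```python
# A priori (deductive) categories, keyed by cues that signal them.
categories = {
    "phone": "communication interruption",
    "alarm": "equipment interruption",
    "page":  "communication interruption",
}

observations = [
    "phone rings during charting",
    "alarm sounds at bedside",
    "visitor enters trauma bay",
]
new_categories = []  # codes discovered inductively

for obs in observations:
    # Deductive pass: try to fit the observation to an existing category.
    code = next((cat for cue, cat in categories.items() if cue in obs), None)
    if code is None:
        # Inductive pass: no a priori category fits, so open a new one
        # (line-by-line coding and constant comparison, much simplified).
        code = "new category: " + obs
        new_categories.append(code)
    print(obs, "->", code)
```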
Abstract:
Objective: Schizophrenia patients suffer from a variety of motor symptoms, including parkinsonism, catatonia, neurological soft signs, abnormal involuntary movements, and psychomotor slowing. Methods: Literature review of prevalence rates and presentation of our own results. Results: Parkinsonism and abnormal involuntary movements are intrinsic to schizophrenia but may also be evoked by antipsychotic treatment. Reduced motor activity is associated with negative symptoms, catatonia, and psychomotor slowing. Furthermore, 40% of schizophrenia patients are impaired in gesture performance, which is related to executive and basic motor function. Mild motor disturbances are found in the majority of patients, while severe dysfunctions are limited to a minority. Our neuroimaging studies suggest that hypokinesia is caused by defective cortico-subcortical motor loops in schizophrenia. Taken together, a dimensional approach to schizophrenia motor symptoms seems promising. A purely descriptive assessment of motor signs is preferred over theory-laden categorization. Using objective motor parameters makes it possible to find neural correlates of abnormal motor behaviour. Conclusion: The motor dimension of schizophrenia is linked to distinct disturbances in the cerebral motor system. Targeted modification of the defective motor system might become a relevant treatment option for patients suffering from schizophrenia with predominant motor features.
Abstract:
A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources. The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic. The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff. The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. Supply constraints were: (1) the total availability of each type of staff and the value of that staff member (value was determined relative to that type of staff's ability to perform the job function of an RN; e.g., the value of eight hours of RN time = 8 points, LVN = 6 points); (2) the number of personnel available for floating between units. The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
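The distribution model described here can be sketched as a small integer program. The following Python/PuLP sketch uses invented numbers (units, penalty weights, acuity demands, staff values, and availabilities) purely to illustrate the structure of the objective and the demand and supply constraints; it is not the study's model:

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum

units = ["ICU", "Med"]
staff = {"RN": 8, "LVN": 6}        # value per 8-hour shift, in acuity points
available = {"RN": 6, "LVN": 4}    # staff of each type on hand
demand = {"ICU": 40, "Med": 30}    # acuity points required per unit
min_rn = {"ICU": 2, "Med": 1}      # minimum RNs per unit
penalty = {"RN": 1.0, "LVN": 0.8}  # allocation-priority weights

prob = LpProblem("nursing_distribution", LpMinimize)
x = {(s, u): LpVariable(f"x_{s}_{u}", lowBound=0, cat="Integer")
     for s in staff for u in units}

# Objective: minimize penalty-weighted headcount used across all units.
prob += lpSum(penalty[s] * x[s, u] for s in staff for u in units)

for u in units:
    # Demand: assigned staff values must cover the unit's acuity points.
    prob += lpSum(staff[s] * x[s, u] for s in staff) >= demand[u]
    prob += x["RN", u] >= min_rn[u]  # minimum RN coverage per unit
for s in staff:
    # Supply: cannot assign more staff of a type than are available.
    prob += lpSum(x[s, u] for u in units) <= available[s]

prob.solve()  # branch and bound via the default MILP solver
for (s, u), var in x.items():
    print(s, u, int(var.value()))
```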