87 results for Terminological definition


Relevance:

10.00%

Abstract:

This paper presents a new relative measure of signal complexity, referred to here as relative structural complexity, which is based on the matching pursuit (MP) decomposition. By relative, we refer to the fact that this new measure is highly dependent on the decomposition dictionary used by MP. The structural part of the definition points to the fact that this new measure is related to the structure, or composition, of the signal under analysis. After a formal definition, the proposed relative structural complexity measure is used in the analysis of newborn EEG. To do this, firstly, a time-frequency (TF) decomposition dictionary is specifically designed to compactly represent the newborn EEG seizure state using MP. We then show, through the analysis of synthetic and real newborn EEG data, that the relative structural complexity measure can indicate changes in EEG structure as it transitions between the two EEG states, namely seizure and background (non-seizure).
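Since the measure is defined relative to the MP dictionary, its behaviour is easiest to see in code. The sketch below is a minimal illustration, not the paper's exact formula: the function `relative_complexity`, its 95%-energy stopping rule, and the toy sinusoid dictionary are all assumptions made for this example.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=50, energy_tol=0.05):
    """Greedy MP: at each step pick the unit-norm atom most correlated
    with the residual and subtract its projection."""
    residual = signal.astype(float).copy()
    coeffs = []
    e0 = np.dot(signal, signal)
    for _ in range(n_iter):
        corr = dictionary @ residual          # inner products with all atoms
        k = int(np.argmax(np.abs(corr)))
        coeffs.append((k, corr[k]))
        residual = residual - corr[k] * dictionary[k]
        if np.dot(residual, residual) <= energy_tol * e0:
            break
    return coeffs, residual

def relative_complexity(signal, dictionary, energy_tol=0.05):
    """Illustrative proxy (not the paper's definition): the fraction of
    dictionary atoms MP needs to capture 95% of the signal energy.
    Signals the dictionary represents compactly score low; signals it
    represents poorly score high."""
    coeffs, _ = matching_pursuit(signal, dictionary,
                                 n_iter=len(dictionary), energy_tol=energy_tol)
    return len(coeffs) / len(dictionary)

# Toy dictionary: unit-norm sinusoids, compactly matching tonal signals.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128, endpoint=False)
atoms = [np.sin(2 * np.pi * f * t) for f in range(1, 17)]
dictionary = np.array([a / np.linalg.norm(a) for a in atoms])

tone = np.sin(2 * np.pi * 3 * t)          # compactly represented
noise = rng.standard_normal(128)          # poorly represented
print(relative_complexity(tone, dictionary),
      relative_complexity(noise, dictionary))
```

A pure tone needs a single atom, while broadband noise exhausts the dictionary without reaching the energy threshold, so its relative complexity is maximal for this dictionary.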

Relevance:

10.00%

Abstract:

In mapping the evolutionary process of online news and the socio-cultural factors determining this development, this paper has a dual purpose. First, in reworking the definition of "online communication", it argues that despite its seemingly sudden emergence in the 1990s, the history of online news began in the early days of the telegraph and spread throughout the development of the telephone and the fax machine before becoming computer-based in the 1980s and Web-based in the 1990s. Second, merging macro-perspectives on the dynamics of media evolution by DeFleur and Ball-Rokeach (1989) and Winston (1998), the paper consolidates a critical point for thinking about new media development: that the technical feasibility of something does not always mean it will be socially accepted and/or demanded. From a producer-centric perspective, the birth and development of pre-Web online news forms were more or less generated by the traditional media's sometimes excessive hype about the power of new technologies. However, placing such an emphasis on technological potentials at the expense of their social conditions not only can be misleading but also can be detrimental to the development of new media, including the potential of today's online news.

Relevance:

10.00%

Abstract:

The subject of management is renowned for its addiction to fads and fashions. Project Management is no exception. The issue of interest for this paper is the establishment of the 'College of Complex Project Managers' and their 'competency standard for complex project managers.' Both have generated significant interest in the Project Management community, and like any other human endeavour they should be subject to critical evaluation. The results of this evaluation show significant flaws in the definition of 'complex' used in this case, in the process by which the College and its standard have emerged, and in the content of the standard. However, there is a significant case for a portfolio of research that extends the existing bodies of knowledge into large-scale complicated (or major) projects that would be owned by the relevant practitioner communities, rather than focused on one organization. Research questions are proposed that would commence this stream of activity towards an intelligent synthesis of what is required to manage in both complicated and truly complex environments.

Relevance:

10.00%

Abstract:

In the first of two articles presenting the case for emotional intelligence in a point/counterpoint exchange, we present a brief summary of research in the field, and rebut arguments against the construct presented in this issue. We identify three streams of research: (1) a four-branch abilities test based on the model of emotional intelligence defined in Mayer and Salovey (1997); (2) self-report instruments based on the Mayer–Salovey model; and (3) commercially available tests that go beyond the Mayer–Salovey definition. In response to the criticisms of the construct, we argue that the protagonists have not distinguished adequately between the streams, and have inappropriately characterized emotional intelligence as a variant of social intelligence. More significantly, two of the critical authors assert incorrectly that emotional intelligence research is driven by a utopian political agenda, rather than scientific interest. We argue, on the contrary, that emotional intelligence research is grounded in recent scientific advances in the study of emotion, specifically regarding the role emotion plays in organizational behavior. We conclude that emotional intelligence is attracting deserved continuing research interest as an individual difference variable in organizational behavior related to the way members perceive, understand, and manage their emotions.

Relevance:

10.00%

Abstract:

Faced with today’s ill-structured business environment of fast-paced change and rising uncertainty, organizations have been searching for management tools that will perform satisfactorily under such ambiguous conditions. In the arena of managerial decision making, one of the approaches being assessed is the use of intuition. Based on our definition of intuition as a non-sequential information-processing mode, which comprises both cognitive and affective elements and results in direct knowing without any use of conscious reasoning, we develop a testable model of integrated analytical and intuitive decision making and propose ways to measure the use of intuition.

Relevance:

10.00%

Abstract:

The concept of the virtual organization (VO) has engendered great interest in the literature, yet there is still little common understanding of the concept, as evidenced by the multitude of labels applied to VOs. In this article, we focus on a “Weberian-ideal-type” definition of the interorganizational VO, posited in our earlier work (Kasper-Fuehrer and Ashkanasy 2001). We argue, however, that this definition left unanswered critical questions relating to the nature and effects of interorganizational VOs. We answer these questions here by explicating the terms in the definition and deriving ten corollaries, or “natural consequences” of our definition. The corollaries posit that interorganizational VOs are temporary in nature, are network organizations, are independent, and are based on swift trust. We suggest further that interorganizational VOs enable small to medium enterprises to exploit market opportunities, and enable VO member organizations to create a value-adding partnership. We also identify information and communication technology (ICT) as the essential enabler of VOs. Finally, we argue that interorganizational VOs act as a single organizational unit and that they therefore constitute a uniquely distinguishable organizational form. We conclude with suggestions for further research, including trust, organizational behavior, transaction economics, virtual HRM, and business strategy.

Relevance:

10.00%

Abstract:

Globalisation, increasing complexity, and the need to address triple-bottom-line sustainability have seen the proliferation of Learning Organisations (LOs) which, by definition, have the capacity to anticipate environmental changes and economic opportunities and adapt accordingly. Such organisations use system dynamics modelling (SDM) for both strategic planning and the promotion of organisational learning. Although SDM has been applied in the context of tourism destination management for predictive reasons, the current literature does not analyse or recognise how it could be used as a foundation for an LO. This study introduces the concept of the Learning Tourism Destination (LTD) and discusses, on the basis of a review of six case studies, the potential of SDM as a tool for the implementation and enhancement of collective learning processes. The results reveal that SDM is capable of promoting communication between stakeholders and stimulating organisational learning. It is suggested that the LTD approach be further utilised and explored.
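To give a flavour of what system dynamics modelling involves, the toy model below simulates a single stock (annual visitors) with a reinforcing word-of-mouth loop and a balancing crowding loop, integrated by Euler stepping. The model, its parameters, and the name `simulate_destination` are hypothetical and not drawn from the six case studies.

```python
# A minimal system-dynamics sketch (hypothetical toy model): one stock
# (annual visitors) with a reinforcing word-of-mouth loop and a
# balancing loop from crowding as the destination approaches capacity.
def simulate_destination(visitors=1_000.0, capacity=50_000.0,
                         growth_rate=0.4, years=30, dt=0.25):
    trajectory = [visitors]
    for _ in range(int(years / dt)):
        # Net inflow: word-of-mouth growth damped by crowding.
        inflow = growth_rate * visitors * (1.0 - visitors / capacity)
        visitors += inflow * dt          # Euler integration of the stock
        trajectory.append(visitors)
    return trajectory

traj = simulate_destination()
print(round(traj[-1]))
```

Even this toy version exhibits the behaviour SDM is used to communicate to stakeholders: early exponential growth that saturates as the balancing loop takes over.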

Relevance:

10.00%

Abstract:

T cells recognize peptide epitopes bound to major histocompatibility complex molecules. Human T-cell epitopes have diagnostic and therapeutic applications in autoimmune diseases. However, their accurate definition within an autoantigen by T-cell bioassay, usually proliferation, involves many costly peptides and a large amount of blood. We have therefore developed a strategy to predict T-cell epitopes and applied it to tyrosine phosphatase IA-2, an autoantigen in IDDM, and HLA-DR4(*0401). First, the binding of synthetic overlapping peptides encompassing IA-2 was measured directly to purified DR4. Second, a large amount of HLA-DR4 binding data were analysed by alignment using a genetic algorithm and were used to train an artificial neural network to predict the affinity of binding. This bioinformatic prediction method was then validated experimentally and used to predict DR4 binding peptides in IA-2. The binding set encompassed 85% of experimentally determined T-cell epitopes. Both the experimental and bioinformatic methods had high negative predictive values, 92% and 95%, indicating that this strategy of combining experimental results with computer modelling should lead to a significant reduction in the amount of blood and the number of peptides required to define T-cell epitopes in humans.
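The prediction step can be caricatured in a few lines. The sketch below trains a single-layer network (a stand-in for the artificial neural network described in the abstract) on synthetic peptides in which binders carry a leucine anchor at position 2; the data, the anchor rule, and every name here are invented for illustration only.

```python
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard residues
IDX = {a: i for i, a in enumerate(AMINO)}

def one_hot(peptide):
    """Encode a 9-mer as a flat 9x20 one-hot vector."""
    v = np.zeros(9 * 20)
    for pos, aa in enumerate(peptide):
        v[pos * 20 + IDX[aa]] = 1.0
    return v

# Synthetic training data (illustrative only): binders carry a leucine
# anchor at position 2, loosely mimicking an anchor-residue preference.
rng = np.random.default_rng(1)
def random_peptide(binder):
    p = [AMINO[i] for i in rng.integers(0, 20, size=9)]
    if binder:
        p[1] = "L"
    return "".join(p)

peptides = [random_peptide(b) for b in [True] * 100 + [False] * 100]
y = np.array([1.0] * 100 + [0.0] * 100)
X = np.array([one_hot(p) for p in peptides])

# A single-layer logistic model trained by gradient descent stands in
# for the neural network of the abstract.
w = np.zeros(X.shape[1])
for _ in range(500):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (pred - y) / len(y)

def binding_score(peptide):
    """Predicted probability of binding (sigmoid of the linear score)."""
    return 1.0 / (1.0 + np.exp(-(one_hot(peptide) @ w)))

print(binding_score("ALAAAAAAA") > binding_score("AAAAAAAAA"))
```

The trained model assigns a higher score to peptides carrying the anchor, which is the qualitative behaviour the real predictor exploits to enrich candidate epitopes before laboratory testing.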

Relevance:

10.00%

Abstract:

This is the first in a series of three articles whose aim is to derive the matrix elements of the U(2n) generators in a multishell spin-orbit basis. This is a basis appropriate to many-electron systems which have a natural partitioning of the orbital space and where spin-dependent terms are also included in the Hamiltonian. The method is based on a new spin-dependent unitary group approach to the many-electron correlation problem due to Gould and Paldus [M. D. Gould and J. Paldus, J. Chem. Phys. 92, 7394 (1990)]. In this approach, the matrix elements of the U(2n) generators in the U(n) x U(2)-adapted electronic Gelfand basis are determined by the matrix elements of a single U(n) adjoint tensor operator called the del-operator, denoted by Delta^i_j (1 <= i, j <= n). Delta is a polynomial of degree two in the U(n) matrix E = [E^i_j]. The approach of Gould and Paldus is based on the transformation properties of the U(2n) generators as an adjoint tensor operator of U(n) x U(2) and application of the Wigner-Eckart theorem. Hence, to generalize this approach, we need to obtain formulas for the complete set of adjoint coupling coefficients for the two-shell composite Gelfand-Paldus basis. The nonzero shift coefficients are uniquely determined and may be evaluated by the methods of Gould et al. [see the above reference]. In this article, we define zero-shift adjoint coupling coefficients for the two-shell composite Gelfand-Paldus basis which are appropriate to the many-electron problem. By definition, these are proportional to the corresponding two-shell del-operator matrix elements, and it is shown that the Racah factorization lemma applies. Formulas for these coefficients are then obtained by application of the Racah factorization lemma. The zero-shift adjoint reduced Wigner coefficients required for this procedure are evaluated first.
All these coefficients are needed later for the multishell case, which leads directly to the two-shell del-operator matrix elements. Finally, we discuss an application to charge and spin densities in a two-shell molecular system. (C) 1998 John Wiley & Sons.
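The reduction the approach relies on is the Wigner-Eckart theorem, which in schematic, generic notation (the paper's specific U(n) x U(2) labels are suppressed here) reads:

```latex
% Wigner-Eckart theorem, schematic form: the matrix element of a tensor
% operator T^{\Lambda}_{\mu} factorizes into a reduced matrix element
% times a coupling (Wigner) coefficient.
\left\langle \Lambda'\,\mu' \,\middle|\, T^{\Lambda}_{\mu} \,\middle|\, \Lambda''\,\mu'' \right\rangle
  \;=\;
  \left\langle \Lambda' \,\middle\|\, T^{\Lambda} \,\middle\|\, \Lambda'' \right\rangle
  \left\langle \Lambda'\,\mu' \,\middle|\, \Lambda\,\mu \,;\, \Lambda''\,\mu'' \right\rangle
```

In the setting of the abstract the tensor operator is the del-operator Delta, so all U(2n) generator matrix elements reduce to reduced matrix elements of Delta multiplied by adjoint coupling coefficients; this is why the two-shell coupling coefficients are the central objects computed in this series.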

Relevance:

10.00%

Abstract:

Frequency, recency, and type of prior exposure to very low- and high-frequency words were manipulated in a 3-phase (i.e., familiarization training, study, and test) design. Increasing the frequency with which a definition for a very low-frequency word was provided during familiarization facilitated the word's recognition in both yes-no (Experiment 1) and forced-choice paradigms (Experiment 2). Recognition of very low-frequency words not accompanied by a definition during familiarization first increased, then decreased as familiarization frequency increased (Experiment 1). Reasons for these differences were investigated in Experiment 3 using judgments of recency and frequency. Results suggested that prior familiarization of a very low-frequency word with its definition may allow a more adequate episodic representation of the word to be formed during a subsequent study trial. Theoretical implications of these results for current models of memory are discussed.

Relevance:

10.00%

Abstract:

Analysis of the 16S rDNA sequences of species currently assigned to the genus Herpetosiphon revealed intrageneric phylogenetic heterogeneity. The thermotolerant freshwater species Herpetosiphon geysericola is most closely related to the type species Herpetosiphon aurantiacus in the Chloroflexus subdivision of the green non-sulfur bacteria. The marine species Herpetosiphon cohaerens, Herpetosiphon nigricans and Herpetosiphon persicus, on the other hand, were found to form a cluster with the sheathed bacterium Haliscomenobacter hydrossis in the Saprospira group of the Flexibacter-Bacteroides-Cytophaga (FBC) phylum. A proposal is made to transfer these marine species to the genus Lewinella gen. nov. as Lewinella cohaerens comb. nov., Lewinella nigricans comb. nov. and Lewinella persica comb. nov. The marine sheathed gliding bacterium Flexithrix dorotheae was also found to be a member of the FBC phylum, but on a separate phylogenetic line from the marine herpetosiphons now assigned to the genus Lewinella.

Relevance:

10.00%

Abstract:

Standard tools for the analysis of economic problems involving uncertainty, including risk premiums, certainty equivalents and the notions of absolute and relative risk aversion, are developed without making specific assumptions on functional form beyond the basic requirements of monotonicity, transitivity, continuity, and the presumption that individuals prefer certainty to risk. Individuals are not required to display probabilistic sophistication. The approach relies on the distance and benefit functions to characterize preferences relative to a given state-contingent vector of outcomes. The distance and benefit functions are used to derive absolute and relative risk premiums and to characterize preferences exhibiting constant absolute risk aversion (CARA) and constant relative risk aversion (CRRA). A generalization of the notion of Schur-concavity is presented. If preferences are generalized Schur concave, the absolute and relative risk premiums are generalized Schur convex, and the certainty equivalents are generalized Schur concave.
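In generic notation (and assuming, purely for illustration, state probabilities pi_s, which the paper's framework does not actually require), the central objects can be written:

```latex
% Certainty equivalent e(y): the sure outcome indifferent, under
% preferences W, to the state-contingent outcome vector y.
W\bigl(e(y)\,\mathbf{1}\bigr) = W(y)

% Absolute risk premium: mean outcome minus certainty equivalent.
r(y) = \bar{y} - e(y), \qquad \bar{y} = \textstyle\sum_{s} \pi_{s}\, y_{s}

% Constant absolute risk aversion (CARA): the absolute premium is
% invariant to adding a sure amount to every state.
r(y + \delta\,\mathbf{1}) = r(y) \quad \forall\, \delta

% Constant relative risk aversion (CRRA): the certainty equivalent is
% homogeneous of degree one, so the relative premium is scale-invariant.
e(\lambda y) = \lambda\, e(y) \quad \forall\, \lambda > 0
```

The paper's contribution is to characterize these objects through the distance and benefit functions rather than through a specific functional form for W.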

Relevance:

10.00%

Abstract:

In this paper, we describe a model of the human visual system (HVS) based on the wavelet transform. This model is largely based on a previously proposed model, but has a number of modifications that make it more amenable to potential integration into a wavelet based image compression scheme. These modifications include the use of a separable wavelet transform instead of the cortex transform, the application of a wavelet contrast sensitivity function (CSF), and a simplified definition of subband contrast that allows us to predict noise visibility directly from wavelet coefficients. Initially, we outline the luminance, frequency, and masking sensitivities of the HVS and discuss how these can be incorporated into the wavelet transform. We then outline a number of limitations of the wavelet transform as a model of the HVS, namely the lack of translational invariance and poor orientation sensitivity. In order to investigate the efficacy of this wavelet based model, a wavelet visible difference predictor (WVDP) is described. The WVDP is then used to predict visible differences between an original and compressed (or noisy) image. Results are presented to emphasize the limitations of commonly used measures of image quality and to demonstrate the performance of the WVDP. The paper concludes with suggestions on how the WVDP can be used to determine a visually optimal quantization strategy for wavelet coefficients and produce a quantitative measure of image quality.
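The idea of a "simplified definition of subband contrast" can be illustrated with a one-level Haar decomposition. The contrast formula below (RMS detail energy normalised by mean approximation luminance) is an illustrative stand-in, not the paper's exact definition.

```python
import numpy as np

def haar2d(image):
    """One-level 2-D Haar transform: returns the approximation (LL) and
    the horizontal, vertical, and diagonal detail subbands (LH, HL, HH)."""
    a = image[0::2, 0::2].astype(float)   # top-left of each 2x2 block
    b = image[0::2, 1::2].astype(float)   # top-right
    c = image[1::2, 0::2].astype(float)   # bottom-left
    d = image[1::2, 1::2].astype(float)   # bottom-right
    ll = (a + b + c + d) / 4.0
    lh = (a + b - c - d) / 4.0   # horizontal detail (responds to horizontal edges)
    hl = (a - b + c - d) / 4.0   # vertical detail (responds to vertical edges)
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh

def subband_contrast(detail, ll):
    """Simplified subband contrast (illustrative): RMS detail energy
    normalised by mean approximation luminance."""
    return np.sqrt(np.mean(detail ** 2)) / (np.mean(ll) + 1e-9)

# A vertical-edge test image concentrates energy in the vertical-detail
# (HL) subband, so its contrast is highest there.
img = np.zeros((64, 64))
img[:, 31:] = 255.0
ll, lh, hl, hh = haar2d(img)
print(subband_contrast(hl, ll) > subband_contrast(lh, ll))
```

Mapping noise visibility through per-subband contrasts like these, weighted by a CSF, is the kind of computation such a model performs directly on wavelet coefficients.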

Relevance:

10.00%

Abstract:

Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational and conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.
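The quoted rates imply a large enrichment over blind screening, which a two-line calculation makes concrete (the pairings of the endpoints below are illustrative):

```python
# Enrichment achieved by model-guided peptide selection, using the rates
# quoted in the abstract: 27-71% of predicted peptides were antigenic,
# versus a natural MHC-binder prevalence of 0.1-5%.
def enrichment(success_rate, prevalence):
    """Fold reduction in peptides that must be tested, all else equal."""
    return success_rate / prevalence

worst = enrichment(0.27, 0.05)    # most conservative pairing of endpoints
best = enrichment(0.71, 0.001)    # most optimistic pairing of endpoints
print(worst, best)
```

Even the conservative pairing suggests several-fold fewer peptides (and correspondingly less blood) per confirmed epitope.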

Relevance:

10.00%

Abstract:

Community awareness of the sustainable use of land, water and vegetation resources is increasing. The sustainable use of these resources is pivotal to sustainable farming systems. However, techniques for monitoring the sustainable management of these resources are poorly understood and untested. We propose a framework to benchmark and monitor resources in the grains industry. Eight steps are listed below to achieve these objectives: (i) define industry issues; (ii) identify the issues through growers, stakeholder and community consultation; (iii) identify indicators (measurable attributes, properties or characteristics) of sustainability through consultation with growers, stakeholders, experts and community members, relating to: crop productivity; resource maintenance/enhancement; biodiversity; economic viability; community viability; and institutional structure; (iv) develop and use selection criteria to select indicators that consider: responsiveness to change; ease of capture; community acceptance and involvement; interpretation; measurement error; stability, frequency and cost of measurement; spatial scale issues; and mapping capability in space and through time. The appropriateness of indicators can be evaluated using a decision making system such as a multiobjective decision support system (MO-DSS, a method to assist in decision making from multiple and conflicting objectives); (v) involve stakeholders and the community in the definition of goals and setting benchmarking and monitoring targets for sustainable farming; (vi) take preventive and corrective/remedial action; (vii) evaluate effectiveness of actions taken; and (viii) revise indicators as part of a continual improvement principle designed to achieve best management practice for sustainable farming systems. 
The major recommendations are to: (i) implement the framework for resources (land, water and vegetation, economic, community and institution) benchmarking and monitoring, and integrate this process with current activities so that awareness, implementation and evolution of sustainable resource management practices become normal practice in the grains industry; (ii) empower the grains industry to take the lead by using relevant sustainability indicators to benchmark and monitor resources; (iii) adopt a collaborative approach by involving various industry, community, catchment management and government agency groups to minimise implementation time. Monitoring programs such as Waterwatch, Soilcheck, Grasscheck and Topcrop should be utilised; (iv) encourage the adoption of a decision making system by growers and industry representatives as a participatory decision and evaluation process. Widespread use of sustainability indicators would assist in validating and refining these indicators and evaluating sustainable farming systems. The indicators could also assist in evaluating best management practices for the grains industry.
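Step (iv)'s multi-objective weighing of candidate indicators can be sketched as a simple weighted-sum score. All weights, indicators, and scores below are hypothetical; the MO-DSS mentioned in the text is not specified at this level of detail.

```python
# Weighted-sum scoring of candidate sustainability indicators against
# the selection criteria of step (iv). Weights and scores are invented
# for illustration only.
criteria_weights = {
    "responsiveness": 0.3,
    "ease_of_capture": 0.2,
    "community_acceptance": 0.2,
    "measurement_cost": 0.15,   # higher score = cheaper to measure
    "mapping_capability": 0.15,
}

# Candidate indicators scored 0-1 against each criterion (hypothetical).
indicators = {
    "soil_organic_carbon": {
        "responsiveness": 0.4, "ease_of_capture": 0.7,
        "community_acceptance": 0.8, "measurement_cost": 0.6,
        "mapping_capability": 0.9,
    },
    "crop_water_use_efficiency": {
        "responsiveness": 0.9, "ease_of_capture": 0.5,
        "community_acceptance": 0.7, "measurement_cost": 0.5,
        "mapping_capability": 0.6,
    },
}

def score(indicator):
    """Weighted sum across criteria for one indicator."""
    return sum(criteria_weights[c] * v for c, v in indicators[indicator].items())

ranked = sorted(indicators, key=score, reverse=True)
print(ranked[0], round(score(ranked[0]), 3))
```

A participatory process would negotiate the weights with growers, stakeholders and community members rather than fix them in code; the point here is only that the ranking mechanics are simple once criteria scores are agreed.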