735 results for Multiple theory


Relevance: 20.00%

Abstract:

Twenty-eight international scholars contribute 11 chapters on the key role of communication in intergroup relations. Following an introductory essay on intergroup theory and communication processes, the text focuses on specific intergroup contexts, examining communication within and between cultural, disability, age, sex and sexuality, and language groups. The remaining chapters explore the communicating of identity across communication contexts, including small group, organizational, mass, and Internet communications. The text is designed for scholars in the fields of communication and intergroup social psychology, and is also suited for use in upper-division undergraduate and introductory graduate courses in those areas. Annotation ©2004 Book News, Inc., Portland, OR

Relevance: 20.00%

Abstract:

This paper presents the stability analysis for a distribution static compensator (DSTATCOM) that operates in current control mode based on bifurcation theory. Bifurcations delimit the operating zones of nonlinear circuits and, hence, the capability to compute these bifurcations is of considerable practical interest for design. A control design for the DSTATCOM is proposed. Along with this control, a suitable mathematical representation of the DSTATCOM is proposed to carry out the bifurcation analysis efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the point of common coupling. In addition, the stability regions in the control gain space, as well as the contour lines for different Floquet multipliers, are computed. It is demonstrated through bifurcation analysis that the loss of stability in the DSTATCOM is due to the emergence of a Neimark bifurcation. The observations are verified through simulation studies.
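The stability criterion behind the abstract can be illustrated with a minimal sketch: for a sampled (Poincaré) map, the Floquet multipliers are the eigenvalues of the map's Jacobian, the orbit is stable while all multipliers lie inside the unit circle, and a Neimark(-Sacker) bifurcation corresponds to a complex-conjugate pair crossing the circle. The 2x2 Jacobian below is a generic toy, not the actual DSTATCOM model from the paper.

```python
import cmath

def floquet_multipliers(J):
    """Eigenvalues of a 2x2 Jacobian [[a, b], [c, d]] of the sampled
    map; for a periodic orbit these are its Floquet multipliers."""
    (a, b), (c, d) = J
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return ((tr + disc) / 2, (tr - disc) / 2)

def is_stable(J):
    """Orbit is stable iff every multiplier lies strictly inside the unit circle."""
    return all(abs(m) < 1.0 for m in floquet_multipliers(J))

def neimark_candidate(J, tol=1e-5):
    """Flags a Neimark bifurcation: a complex-conjugate pair of
    multipliers sitting on the unit circle (|m| ~ 1, Im(m) != 0)."""
    m1, _ = floquet_multipliers(J)
    return abs(m1.imag) > tol and abs(abs(m1) - 1.0) < tol
```

For a scaled rotation J = r·[[cos θ, −sin θ], [sin θ, cos θ]] the multipliers are r·e^{±iθ}, so the orbit is stable for r < 1 and the Neimark condition triggers at r = 1, mirroring how the paper's contour lines of Floquet multipliers delimit the stability regions.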

Relevance: 20.00%

Abstract:

Camera calibration information is required in order for multiple camera networks to deliver more than the sum of many single camera systems. Methods exist for manually calibrating cameras with high accuracy. Manually calibrating networks with many cameras is, however, time consuming, expensive and impractical for networks that undergo frequent change. For this reason, automatic calibration techniques have been vigorously researched in recent years. Fully automatic calibration methods depend on the ability to automatically find point correspondences between overlapping views. In typical camera networks, cameras are placed far apart to maximise coverage. This is referred to as a wide baseline scenario. Finding sufficient correspondences for camera calibration in wide baseline scenarios presents a significant challenge. This thesis focuses on developing more effective and efficient techniques for finding correspondences in uncalibrated, wide baseline, multiple-camera scenarios. The project consists of two major areas of work. The first is the development of more effective and efficient view-covariant local feature extractors. The second area involves finding methods to extract scene information using the information contained in a limited set of matched affine features. Several novel affine adaptation techniques for salient features have been developed. A method is presented for efficiently computing the discrete scale space primal sketch of local image features. A scale selection method was implemented that makes use of the primal sketch. The primal sketch-based scale selection method has several advantages over the existing methods. It allows greater freedom in how the scale space is sampled, enables more accurate scale selection, is more effective at combining different functions for spatial position and scale selection, and leads to greater computational efficiency.
Existing affine adaptation methods make use of the second moment matrix to estimate the local affine shape of local image features. In this thesis, it is shown that the Hessian matrix can be used in a similar way to estimate local feature shape. The Hessian matrix is effective for estimating the shape of blob-like structures, but is less effective for corner structures. It is simpler to compute than the second moment matrix, leading to a significant reduction in computational cost. A wide baseline dense correspondence extraction system, called WiDense, is presented in this thesis. It allows the extraction of large numbers of additional accurate correspondences, given only a few initial putative correspondences. It consists of the following algorithms: an affine region alignment algorithm that ensures accurate alignment between matched features; a method for extracting more matches in the vicinity of a matched pair of affine features, using the alignment information contained in the match; and an algorithm for extracting large numbers of highly accurate point correspondences from an aligned pair of feature regions. Experiments show that the correspondences generated by the WiDense system improve the success rate of computing the epipolar geometry of very widely separated views. This new method is successful in many cases where the features produced by the best wide baseline matching algorithms are insufficient for computing the scene geometry.
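The thesis's observation that the Hessian matrix is cheap to compute and well suited to blob-like structures can be sketched with a finite-difference Hessian on a raw intensity grid. This is a hypothetical minimal illustration, not the thesis's actual affine adaptation pipeline (which would operate on smoothed scale-space derivatives).

```python
def hessian_2x2(img, x, y):
    """Finite-difference Hessian [[Ixx, Ixy], [Ixy, Iyy]] of a 2-D
    intensity array (list of rows, img[y][x]) at an interior pixel."""
    ixx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    iyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    ixy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    return ixx, iyy, ixy

def is_bloblike(img, x, y):
    """det(H) > 0 means both principal curvatures share a sign (a blob),
    where the Hessian shape estimate works well; det(H) < 0 indicates a
    saddle/corner-like structure, where it is less reliable."""
    ixx, iyy, ixy = hessian_2x2(img, x, y)
    return ixx * iyy - ixy * ixy > 0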

Relevance: 20.00%

Abstract:

Focuses on a study which introduced an iterative modeling method that combines properties of ordinary least squares (OLS) with hierarchical tree-based regression (HTBR) in transportation engineering. Information on OLS and HTBR; comparison and contrast of OLS and HTBR; conclusions.
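The general idea of combining OLS with tree-based regression can be sketched as fitting a linear model and then letting a one-split regression tree (a stump) pick up structure left in the residuals. This is a hypothetical toy, not the paper's actual iterative method; all function names here are illustrative.

```python
def ols_fit(xs, ys):
    """Simple 1-D ordinary least squares: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return my - slope * mx, slope

def stump_fit(xs, rs):
    """One-split regression tree on the residuals: choose the split
    minimising the within-leaf sum of squared errors."""
    best = None
    for cut in sorted(set(xs))[1:]:
        left = [r for x, r in zip(xs, rs) if x < cut]
        right = [r for x, r in zip(xs, rs) if x >= cut]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - ml) ** 2 for r in left)
               + sum((r - mr) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, cut, ml, mr)
    return best[1:]  # (cut, left_mean, right_mean)

def hybrid_predict(x, ols, stump):
    """Linear trend from OLS plus the stump's residual correction."""
    (b0, b1), (cut, ml, mr) = ols, stump
    return b0 + b1 * x + (ml if x < cut else mr)
```

On data with a linear trend plus a step change, the stump absorbs the step that the straight line cannot, so the hybrid's squared error is strictly below OLS alone.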

Relevance: 20.00%

Abstract:

Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon, and shows one set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is some reason to believe that the human mental lexicon displays entanglement.
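The paper's claim that concept combinations can behave non-separably has a simple classical baseline: a joint distribution is separable when it factorises into its marginals, p(a, b) = p(a)·p(b). The toy check below illustrates that baseline only; it is not the paper's quantum-like lexicon model, and the probability tables are invented for illustration.

```python
def marginals(joint):
    """Row and column marginals of a joint probability table p(a, b)."""
    pa = [sum(row) for row in joint]
    pb = [sum(col) for col in zip(*joint)]
    return pa, pb

def is_separable(joint, tol=1e-9):
    """True iff p(a, b) == p(a) * p(b) in every cell, i.e. the two
    variables carry no contextual dependence on each other."""
    pa, pb = marginals(joint)
    return all(abs(joint[i][j] - pa[i] * pb[j]) < tol
               for i in range(len(pa)) for j in range(len(pb)))
```

A product table such as [[0.25, 0.25], [0.25, 0.25]] passes the test, while a perfectly correlated table such as [[0.5, 0], [0, 0.5]] fails it; non-separability of this kind is the classical analogue of the entanglement-like behaviour the paper reports.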

Relevance: 20.00%

Abstract:

The need for the development of effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core-competency lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate capabilities lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core competency and graduate capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of ‘critical thinking’ skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate a greater valuation by students of their experience in the organisation theory unit, better marks for mid-semester essay assignments, and higher evaluations on the university-administered survey of students’ satisfaction.

Relevance: 20.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
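The core of the excess-zeros argument can be checked analytically: a Poisson process with rate λ yields zeros with probability e^(−λ), and because e^(−λ) is convex, a mixture of heterogeneous site-level rates always produces more zeros than a single Poisson with the average rate (Jensen's inequality). The sketch below uses invented rates purely for illustration, not data from the paper's simulation experiment.

```python
import math

def poisson_zero_prob(lam):
    """P(count == 0) under a single Poisson(lam)."""
    return math.exp(-lam)

def mixture_zero_prob(rates):
    """Expected zero share when each site i has its own rate lambda_i
    (heterogeneous, mostly low exposure), averaged over sites."""
    return sum(math.exp(-l) for l in rates) / len(rates)

# Hypothetical site-level crash rates: mostly low exposure, a few higher.
rates = [0.05, 0.1, 0.1, 0.2, 0.5, 1.0, 2.0, 4.0]
mean_rate = sum(rates) / len(rates)

single = poisson_zero_prob(mean_rate)   # single-Poisson prediction
mixed = mixture_zero_prob(rates)        # actual expected zero share
# mixed > single: the apparent "excess" zeros come from heterogeneity
# and low exposure, with no dual-state process needed.
```

Fitting one Poisson to such data and finding "too many" zeros relative to e^(−λ̄) is exactly the situation the abstract warns against interpreting as evidence of a safe/unsafe dual-state process.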

Relevance: 20.00%

Abstract:

Statisticians along with other scientists have made significant computational advances that enable the estimation of formerly complex statistical models. The Bayesian inference framework combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler enable the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thereby relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. This paper then concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
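The IIA property the paper discusses follows directly from the MNL functional form: choice probabilities are a softmax of the systematic utilities, so the ratio of any two alternatives' probabilities depends only on their own utilities and is unchanged when a third alternative is added. A minimal sketch (illustrative utilities, not the paper's route-choice data):

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities:
    P(i) = exp(V_i) / sum_j exp(V_j)."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# IIA in action: P(1)/P(2) = exp(V1 - V2) regardless of the choice set.
two = mnl_probabilities([1.0, 0.5])        # binary choice
three = mnl_probabilities([1.0, 0.5, 2.0]) # add an irrelevant alternative
# two[0]/two[1] == three[0]/three[1] == exp(0.5)
```

The random-parameter (mixed logit) extensions mentioned in the abstract break exactly this invariance: integrating the softmax over a distribution of parameters makes probability ratios depend on the full choice set.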

Relevance: 20.00%

Abstract:

This paper outlines a method of constructing narratives about an individual’s self-efficacy. Self-efficacy is defined as “people’s judgments of their capabilities to organise and execute courses of action required to attain designated types of performances” (Bandura, 1986, p. 391), and as such represents a useful construct for thinking about personal agency. Social cognitive theory provides the theoretical framework for understanding the sources of self-efficacy, that is, the elements that contribute to a sense of self-efficacy. The narrative approach adopted offers an alternative to traditional, positivist psychology, characterised by a preoccupation with measuring psychological constructs (like self-efficacy) by means of questionnaires and scales. It is argued that these instruments yield scores which are somewhat removed from the lived experience of the person—respondent or subject—associated with the score. The method involves a cyclical and iterative process using qualitative interviews to collect data from participants, four mature-aged university students. The method builds on a three-interview procedure designed for life history research (Dolbeare & Schuman, cited in Seidman, 1998). This is achieved by introducing reflective homework tasks, as well as written data generated by research participants, as they are guided in reflecting on those experiences (including behaviours, cognitions and emotions) that constitute a sense of self-efficacy, in narrative and by narrative. The method illustrates how narrative analysis is used “to produce stories as the outcome of the research” (Polkinghorne, 1995, p. 15), with detail and depth contributing to an appreciation of the ‘lived experience’ of the participants. The method is highly collaborative, with narratives co-constructed by researcher and research participants.
The research outcomes suggest an enhanced understanding of self-efficacy contributes to motivation, application of effort and persistence in overcoming difficulties. The paper concludes with an evaluation of the research process by the students who participated in the author’s doctoral study.

Relevance: 20.00%

Abstract:

Background: The first sign of developing multiple sclerosis is a clinically isolated syndrome that resembles a multiple sclerosis relapse. Objective/methods: The objective was to review the clinical trials of two medicines in clinically isolated syndromes (interferon β and glatiramer acetate) to determine whether they prevent progression to definite multiple sclerosis. Results: In the BENEFIT trial, after 2 years, 45% of subjects in the placebo group developed clinically definite multiple sclerosis, and the rate was lower in the interferon β-1b group. Then all subjects were offered interferon β-1b, and the original interferon β-1b group became the early treatment group, and the placebo group became the delayed treatment group. After 5 years, the number of subjects with clinically definite multiple sclerosis remained lower in the early treatment than the late treatment group. In the PreCISe trial, after 2 years, the time for 25% of the subjects to convert to definite multiple sclerosis was prolonged in the glatiramer group. Conclusions: Interferon β-1b and glatiramer acetate slow the progression of clinically isolated syndromes to definite multiple sclerosis. However, it is not known whether this early treatment slows the progression to the physical disabilities experienced in multiple sclerosis.

Relevance: 20.00%

Abstract:

Background: Most people with multiple sclerosis have the relapsing-remitting type. Objective/methods: The objective was to evaluate two clinical trials of fingolimod in relapsing multiple sclerosis. Results: FREEDOMS (FTY720 Research Evaluation Effects of Daily Oral therapy in Multiple Sclerosis), a Phase III placebo-controlled trial, showed that fingolimod (0.5 or 1.25 mg) reduced the relapse rate and disability in multiple sclerosis, compared to placebo. Fingolimod (0.5 or 1.25 mg) has been compared to interferon β-1a in a Phase III clinical trial (TRANSFORMS; Trial Assessing Injectable Interferon versus FTY720 Oral in Relapsing-Remitting Multiple Sclerosis) and shown to be more efficacious than interferon β-1a in reducing relapse rates. However, fingolimod did increase the risk of infections and skin cancers. Conclusions: Only the lower dose of fingolimod (0.5 mg), which possibly has less toxicity, should be considered for prevention of relapses in relapsing-remitting multiple sclerosis.

Relevance: 20.00%

Abstract:

BLAST Atlas is a visual analysis system for comparative genomics that supports genome-wide gene characterisation, functional assignment and function-based browsing of one or more chromosomes. Inspired by applications such as the WorldWide Telescope, Bing Maps 3D and Google Earth, BLAST Atlas uses novel three-dimensional gene and function views that provide a highly interactive and intuitive way for scientists to navigate, query and compare gene annotations. The system can be used for gene identification and functional assignment or as a function-based multiple-genome comparison tool which complements existing position-based comparison and alignment viewers.