988 results for unified projection model
Abstract:
The state of the art for describing image quality in medical imaging is to assess the performance of an observer conducting a task of clinical interest. This can be done using a model observer that yields a figure of merit such as the signal-to-noise ratio (SNR). Using the non-prewhitening (NPW) model observer, we objectively characterised the evolution of its figure of merit under various acquisition conditions. The NPW model observer usually requires the modulation transfer function (MTF) as well as the noise power spectrum. However, although computing the MTF poses no problem for the traditional filtered back-projection (FBP) algorithm, this is not the case for iterative reconstruction (IR) algorithms, such as adaptive statistical iterative reconstruction (ASIR) or model-based iterative reconstruction (MBIR). Given that the target transfer function (TTF) had already been shown to accurately express the system resolution even with non-linear algorithms, we tuned the NPW model observer by replacing the standard MTF with the TTF. The TTF was estimated using a custom-made phantom containing cylindrical inserts surrounded by water. The contrast differences between the inserts and water were plotted for each acquisition condition, and mathematical transformations were then applied to obtain the TTF. As expected, the first results showed a dependency of the TTF on image contrast and noise levels for both ASIR and MBIR. Moreover, FBP also proved to be dependent on contrast and noise when using the lung kernel. These results were then introduced into the NPW model observer. We observed an enhancement of the SNR every time we switched from FBP to ASIR to MBIR. IR algorithms greatly improve image quality, especially in low-dose conditions. Based on our results, the use of MBIR could lead to further dose reduction in several clinical applications.
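As a rough illustration of how such a figure of merit can be evaluated numerically, the sketch below computes an NPW SNR on a discrete frequency grid with the TTF in place of the MTF. The Gaussian task function and the TTF/NPS curves are invented placeholders, not the study's measured data.

```python
import numpy as np

# Minimal sketch of a non-prewhitening (NPW) model-observer SNR in 2D
# frequency space, with the TTF standing in for the MTF. The task
# function W (a Gaussian signal) and the TTF/NPS arrays below are
# illustrative placeholders, not measured data.

def npw_snr(task_w, ttf, nps, df):
    """SNR_NPW^2 = [sum W^2 TTF^2]^2 / sum W^2 TTF^2 NPS (discretised)."""
    signal = np.sum((task_w * ttf) ** 2) * df ** 2
    noise = np.sum((task_w * ttf) ** 2 * nps) * df ** 2
    return signal / np.sqrt(noise)

# 2D spatial-frequency grid (cycles/mm), assuming hypothetical 0.5 mm pixels.
f = np.fft.fftfreq(256, d=0.5)
fx, fy = np.meshgrid(f, f)
fr = np.hypot(fx, fy)

task_w = np.exp(-(np.pi * 4.0 * fr) ** 2)   # Gaussian signal, ~4 mm wide
ttf = np.exp(-fr / 0.4)                     # assumed resolution fall-off
nps = 50.0 * np.exp(-fr / 0.3)              # assumed noise power spectrum

print(f"NPW SNR: {npw_snr(task_w, ttf, nps, f[1] - f[0]):.2f}")
```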
Abstract:
Purpose: We propose a social identity model of leader prototypes to address why the maleness of leader prototypes is more pronounced among men than among women (e.g., Schein, 2001). Specifically, we argue that individuals project their ingroup prototype (e.g., a male prototype) onto a valued other category (e.g., leaders) (e.g., Wenzel, Mummendey, Weber, & Waldzus, 2003) in order to maintain a positive ingroup (e.g., gender) identity. We hypothesized that both women and men engage in ingroup projection of their gender prototype onto their leader prototype, and we expected this effect to be stronger for men than for women. We also investigated intelligence as a moderator of ingroup projection. Methodology: Participants (276 students, University of Lausanne) assessed the extent to which attributes on a list of gender traits were characteristic of a successful leader. We computed relative ingroup similarity scores (e.g., Waldzus & Mummendey, 2004) representing the difference between how characteristic ingroup traits and outgroup traits are for a successful leader. Results: Men engaged in ingroup projection while women engaged in outgroup projection, and men engaged in ingroup projection to a greater extent. We also found a small but positive effect of intelligence on ingroup projection among men. Limitations: The use of a student sample might limit the external validity of our findings. Implications: Our findings contribute to research on the under-representation of women in managerial roles and introduce intelligence as a predictor of ingroup projection. Value: Our study allows for a more fine-grained understanding of men's and women's cognitive representations of leaders.
Abstract:
This work is devoted to the development of a numerical method for convection-dominated diffusion problems with reaction terms, covering both non-stiff and stiff chemical reactions. The technique is based on unified Eulerian-Lagrangian schemes (the particle transport method) within the framework of operator splitting. In the computational domain, a particle set is assigned to solve the convection-reaction subproblem along the characteristic curves created by the convective velocity. At each time step, the convection, diffusion and reaction terms are solved separately, assuming that each phenomenon occurs in a sequential fashion. Moreover, adaptivity and projection techniques are used, respectively, to add particles in regions of high gradients (steep fronts) and discontinuities, and to transfer the solution from the particle set onto grid points. The numerical results show that the particle transport method improves the solutions of convection-diffusion-reaction (CDR) problems. Nevertheless, the method is time-consuming when compared with other classical techniques, e.g., the method of lines. Apart from this drawback, the particle transport method can be used to simulate problems that involve moving steep/smooth fronts, such as the separation of two or more elements in a system.
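As a rough illustration of the splitting idea (not the thesis's actual scheme), the sketch below applies sequential operator splitting to a 1D convection-diffusion-reaction equation, with the convection step handled in Lagrangian fashion along characteristics; all coefficients are invented.

```python
import numpy as np

# Minimal sketch of first-order (sequential) operator splitting for the 1D
# problem u_t + a u_x = D u_xx - k u. The convection step moves the profile
# along characteristics; diffusion and reaction are then handled separately
# on the grid. Velocity a, diffusivity D and rate k are illustrative values.

a, D, k = 1.0, 1e-3, 0.5                  # velocity, diffusivity, rate
nx, dt, steps = 400, 5e-4, 1000
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u = np.exp(-((x - 0.2) / 0.05) ** 2)      # initial Gaussian front

for _ in range(steps):
    # 1) Convection: advect along characteristics x -> x + a*dt, then
    #    interpolate the shifted profile back onto the fixed grid.
    u = np.interp(x, x + a * dt, u, left=0.0, right=0.0)
    # 2) Diffusion: explicit central differences (dt chosen for stability).
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    # 3) Reaction: exact step for the linear decay term.
    u *= np.exp(-k * dt)

print(f"front peak after {steps} steps: {u.max():.3f} at x = {x[u.argmax()]:.3f}")
```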
Abstract:
The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. As an answer to these new pressures, modern-day systems have moved towards on-chip multiprocessing technologies. New on-chip multiprocessing architectures have emerged in order to utilize the tremendous advances of fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies pertaining to the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses the aforementioned challenges and discusses the creation of a development framework for platform-based system design, in the context of the SegBus platform, a distributed communication architecture. This research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, thus ensuring correct-by-design platforms. The solution is based on a model-based process. Both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile. Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution at high abstraction levels that is as accurate as possible. VHDL code is automatically generated, in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality-assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
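As a rough illustration of generating tests from a behavioral model (not the thesis's UML-based tool-chain), the sketch below derives input sequences that together cover every transition of a finite state machine; the toy vending-machine model is invented.

```python
from collections import deque

# Minimal sketch of test generation from a behavioral model: breadth-first
# exploration of a finite state machine until every transition is covered.
# The state machine below (a toy vending machine) is an invented example.

FSM = {                                  # state -> {input: next_state}
    "idle":    {"coin": "paid", "refund": "idle"},
    "paid":    {"select": "vending", "refund": "idle"},
    "vending": {"dispense": "idle"},
}

def generate_tests(fsm, start):
    """Return input sequences that together exercise every transition."""
    uncovered = {(s, i) for s, trans in fsm.items() for i in trans}
    tests = []
    while uncovered:
        # BFS for the shortest path from `start` through an uncovered edge.
        queue, seen = deque([(start, [])]), set()
        while queue:
            state, path = queue.popleft()
            hit = next(((state, i) for i in fsm[state]
                        if (state, i) in uncovered), None)
            if hit:
                path = path + [hit[1]]
                tests.append(path)
                s = start                 # mark the whole path as covered
                for inp in path:
                    uncovered.discard((s, inp))
                    s = fsm[s][inp]
                break
            if state in seen:
                continue
            seen.add(state)
            for inp, nxt in fsm[state].items():
                queue.append((nxt, path + [inp]))
    return tests

for t in generate_tests(FSM, "idle"):
    print(" -> ".join(t))
```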
Abstract:
Germany's socio-economic model, the "social market economy", was established in West Germany after World War II and extended to unified Germany in 1990. During a prolonged recession after the adoption of the Euro in 1999, major reforms (Agenda 2010) were introduced, which many consider the key to Germany's recent success. The reforms had mixed results: employment increased, but to a large extent it has consisted of precarious low-wage jobs. Growth depended on export surpluses based on an internal real devaluation (low unit labour costs), which makes Germany vulnerable to global recessions such as that of 2009. Overall inequality increased substantially.
Abstract:
Underlying intergroup perceptions include processes of social projection (perceiving personal traits/beliefs in others; see Krueger, 1998) and meta-stereotyping (thinking about other groups' perceptions of one's own group; see Vorauer et al., 1998). Two studies were conducted to investigate social projection and meta-stereotypes in the domain of White-Black racial relations. Study 1, a correlational study, examined the social projection of prejudice and 'prejudiced' meta-stereotypes among Whites. Results revealed that (a) Whites socially projected their intergroup attitudes onto other Whites (and Blacks) [i.e., Whites higher in prejudice against Blacks believed a large percentage of Whites (Blacks) are prejudiced against Blacks (Whites), whereas Whites low in prejudice believed a smaller percentage of Whites (Blacks) are prejudiced]; (b) Whites held the meta-stereotype that their group (Whites) is viewed by Blacks to be prejudiced; and (c) prejudiced meta-stereotypes may be formed through the social projection of intergroup attitudes (result of path-model tests). Further, several correlates of social projection and meta-stereotypes were identified, including the finding that feeling negatively stereotyped by an outgroup predicted outgroup avoidance through heightened intergroup anxiety. Study 2 replicated and extended these findings, investigating the social projection of ingroup favouritism and meta- and other-stereotypes about ingroup favouritism. These processes were examined experimentally using an anticipated intergroup contact paradigm. The goal was to understand the experimental conditions under which people would display the strongest social projection of intergroup attitudes, and when experimentally induced meta-stereotypes (vs. other-stereotypes, i.e., beliefs about the group preferences of one's outgroup) would be most damaging to intergroup contact. White participants were randomly assigned to one of six conditions and received (alleged) feedback from a previously completed computer-based test. Depending on condition, this information suggested that: (a) the participant favoured Whites over Blacks; (b) previous White participants favoured Whites over Blacks; (c) the participant's Black partner favoured Blacks over Whites; (d) previous Black participants favoured Blacks over Whites; (e) the participant's Black partner viewed the participant to favour Whites over Blacks; or (f) Black participants previously participating viewed Whites to favour Whites over Blacks. In a defensive reaction, Whites exhibited enhanced social projection of personal intergroup attitudes onto their ingroup under experimental manipulations characterized by self-concept threat (i.e., when the computer revealed that the participant favoured the ingroup or was viewed to favour the ingroup). Manipulated meta- and other-stereotype information that introduced intergroup contact threat, on the other hand, each exerted a strong negative impact on intergroup contact expectations (e.g., anxiety). Personal meta-stereotype manipulations (i.e., when the participant was informed that her/his partner thinks s/he favours the ingroup) exerted an especially negative impact on intergroup behaviour, evidenced by increased avoidance of the upcoming interracial interaction. In contrast, personal self-stereotype manipulations (i.e., the computer revealed that one favoured the ingroup) ironically improved upcoming intergroup contact expectations and intentions, likely due to an attempt to reduce the discomfort of holding negative intergroup attitudes. Implications and directions for future research are considered.
Abstract:
We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first uses a standard Wald-type statistic. The second assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets for the endogenous variables are derived by a projection technique. The latter approach has two advantages: first, the validity of the confidence set is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show that they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases in transfers from Moroccan expatriates.
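A minimal sketch of the projection technique under invented numbers: given a confidence ellipsoid for the free parameters, a conservative interval for any (possibly nonlinear) endogenous outcome g(theta) is obtained by minimising and maximising g over the ellipsoid. The outcome function, estimates, and covariance matrix below are placeholders, not a calibrated CGE model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

# Sketch of a projection-based confidence interval: optimise an endogenous
# outcome g(theta) over a 95% confidence ellipsoid for the free parameters.
# The outcome function and the estimates/covariance are placeholders.

theta_hat = np.array([0.6, 1.2])                 # estimated free parameters
V = np.array([[0.04, 0.01], [0.01, 0.09]])       # their covariance matrix
V_inv = np.linalg.inv(V)
c = chi2.ppf(0.95, df=len(theta_hat))            # ellipsoid radius

def g(theta):                                    # hypothetical endogenous
    return theta[0] * np.exp(theta[1])           # variable of the model

ellipsoid = {"type": "ineq",                     # (t-th)' V^-1 (t-th) <= c
             "fun": lambda t: c - (t - theta_hat) @ V_inv @ (t - theta_hat)}

lo = minimize(g, theta_hat, constraints=[ellipsoid]).fun
hi = -minimize(lambda t: -g(t), theta_hat, constraints=[ellipsoid]).fun
print(f"95% projection CI for g(theta): [{lo:.3f}, {hi:.3f}]")
```

Because every endogenous variable is projected from the same parameter ellipsoid, intervals built this way are simultaneously valid, which is the second advantage noted above.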
Abstract:
It is well known that standard asymptotic theory is not valid, or is extremely unreliable, in models with identification problems or weak instruments [Dufour (1997, Econometrica), Staiger and Stock (1997, Econometrica), Wang and Zivot (1998, Econometrica), Stock and Wright (2000, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. One possible way out consists in using a variant of the Anderson-Rubin (1949, Ann. Math. Stat.) procedure. The latter, however, allows one to build exact tests and confidence sets only for the full vector of the coefficients of the endogenous explanatory variables in a structural equation, which in general does not yield inference on individual coefficients. This problem may in principle be overcome by using projection techniques [Dufour (1997, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. AR-type procedures are emphasized because they are robust to both weak instruments and instrument exclusion. Until now, however, such projections could be implemented only by using costly numerical techniques. In this paper, we provide a complete analytic solution to the problem of building projection-based confidence sets from Anderson-Rubin-type confidence sets. The solution involves the geometric properties of "quadrics" and can be viewed as an extension of the usual confidence intervals and ellipsoids. Only least squares techniques are required to build the confidence intervals. We also study by simulation how "conservative" projection-based confidence sets are. Finally, we illustrate the proposed methods by applying them to three different examples: the relationship between trade and growth in a cross-section of countries, returns to education, and a study of production functions in the U.S. economy.
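As an illustration of the object being projected (not the paper's closed-form quadric solution), the sketch below computes an Anderson-Rubin confidence set for a single structural coefficient by test inversion over a grid, using simulated data; its endpoints are what a projection would read off for that coefficient.

```python
import numpy as np
from scipy.stats import f as f_dist

# Sketch of an Anderson-Rubin confidence set for the coefficient beta of an
# endogenous regressor, obtained by inverting the AR test over a grid.
# Data are simulated with fairly weak instruments; true beta is 0.5.

rng = np.random.default_rng(0)
n, k = 200, 3                                   # sample size, instruments
Z = rng.normal(size=(n, k))
v = rng.normal(size=n)
Y = Z @ np.array([0.3, 0.2, 0.1]) + v           # first stage
y = 0.5 * Y + 0.8 * v + rng.normal(size=n)      # structural equation

P = Z @ np.linalg.solve(Z.T @ Z, Z.T)           # projection onto span(Z)

def ar_stat(beta0):
    u = y - Y * beta0                           # residual under H0
    num = u @ P @ u / k
    den = u @ (u - P @ u) / (n - k)
    return num / den                            # ~ F(k, n-k) under H0

crit = f_dist.ppf(0.95, k, n - k)
grid = np.linspace(-2.0, 3.0, 2001)
accepted = [b for b in grid if ar_stat(b) <= crit]
print(f"95% AR confidence set for beta: [{min(accepted):.3f}, {max(accepted):.3f}]")
```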
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
Abstract:
This thesis falls within the field of computer vision. It addresses the calibration of stereoscopic camera systems, camera-projector correspondence, 3D reconstruction, photometric alignment of projectors, meshing of point clouds, and surface parameterization. Carried out as part of the LightTwist project of the Vision3D laboratory, it aims to enable projection onto large arbitrary surfaces using multiple projectors. This kind of projection is often used in technological arts, theatre, and architectural projection. In this thesis, the cameras are first calibrated, followed by a piecewise 3D reconstruction based on an active correspondence method, unstructured light. After automated alignment and meshing, a complete 3D model of the projection surface is available. The thesis then introduces a new approach for parameterizing 3D models based on the efficient computation of geodesic distances on meshes. The user only has to manually delimit the contour of the projection zone on the model. The final parameterization is computed using the distances obtained for each point of the model. Until now, existing methods could not parameterize models with more than a million points.
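As a rough illustration of the geodesic-distance computation such a parameterization can build on, the sketch below runs Dijkstra's algorithm over the edge graph of a triangle mesh from a set of source vertices (e.g., the user-delimited contour). The tiny mesh is invented, and graph-based Dijkstra only approximates true geodesic distances.

```python
import heapq
import numpy as np

# Sketch: approximate geodesic distances on a triangle mesh by running
# Dijkstra on its edge graph from a set of source vertices. The tiny mesh
# below is an invented example; exact geodesics would need e.g. the MMP
# or heat-method algorithms.

def mesh_distances(vertices, triangles, sources):
    adj = {i: set() for i in range(len(vertices))}
    for a, b, c in triangles:                 # connect each triangle's edges
        adj[a] |= {b, c}; adj[b] |= {a, c}; adj[c] |= {a, b}
    dist = {i: float("inf") for i in adj}
    heap = [(0.0, s) for s in sources]
    for _, s in heap:
        dist[s] = 0.0
    heapq.heapify(heap)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                          # stale queue entry
        for v in adj[u]:
            nd = d + np.linalg.norm(vertices[u] - vertices[v])
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

verts = np.array([[0., 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0], [0.5, 0.5, 0.3]])
tris = [(0, 1, 4), (1, 2, 4), (2, 3, 4), (3, 0, 4)]
print(mesh_distances(verts, tris, sources=[0]))
```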
Abstract:
Emma Hamilton (1765-1815) had a considerable impact at a pivotal moment in European history and art. Showing enormous resilience, she found an effective means of asserting her agency and was a powerful source of inspiration for generations of women and artists in their own quest for self-expression and self-realization. This thesis demonstrates that Emma drew her particular power from her ability to negotiate different and sometimes even contradictory identities: object and subject; model and sitter; artist, muse, and work of art; wife, mistress, and prostitute; commoner and aristocrat; socialite and ambassadress; and performer of a myriad of historical, biblical, literary, and mythological characters, both male and female. Wife of the English ambassador to Naples, favourite of the Queen of Naples, and lover of Admiral Horatio Nelson, she was an agent on the political scene during the revolutionary and Napoleonic era. In her dizzying social ascent from the most abject poverty to the highest ranks of the English aristocracy, she knew how to adapt, adjust, and reinvent herself. She received and entertained countless writers, artists, scientists, nobles, diplomats, and members of royalty. She participated in the development and dissemination of Neoclassicism at the very moment of its efflorescence. She created her Attitudes, a performance answering her era's taste for classicism, which was admired and imitated throughout Europe and inspired generations of female performers. She learned to dance the tarantella and introduced it into aristocratic salons. She influenced a network of women extending from Paris to Saint Petersburg and including Élisabeth Vigée-Le Brun, Germaine de Staël, and Juliette Récamier. A peerless model, she inspired several artists to produce works they acknowledged as among their best. She was depicted by the greatest artists of her time, including Angelica Kauffman, Benjamin West, Élisabeth Vigée-Le Brun, George Romney, James Gillray, Joseph Nollekens, Joshua Reynolds, Thomas Lawrence, and Thomas Rowlandson. She repeatedly pushed against social boundaries and mores. Nevertheless, Emma did not attempt to present a coherent, unified, polished identity. On the contrary, she was a kaleidoscope of multiple selves that she kept active and in dialogue with one another, continually rearranging her facets so as to simultaneously express herself fully and present to others what they wanted to see.
Abstract:
Reflection is considered a significant element of medical pedagogy and practice, yet there is no consensus on its definition or its modelling. Because reflection concurrently takes on several meanings, it is difficult to operationalize. A standard definition and model are required to improve the development of practical applications of reflection. In this thesis, we identify, explore, and thematically analyse the most influential conceptualizations of reflection, and we develop a new model and definition. Reflection is defined as the process of engaging the self (S) in attentive, critical, exploratory, and iterative (ACEI) interactions with one's thoughts and actions (PA) and their underlying conceptual frameworks (CC), aiming to change them and examining the change itself (VC). Our conceptual model comprises the five internal components of reflection and the extrinsic elements that influence it.
Studies on Pseudoscalar Meson Bound States and Semileptonic Decays in a Relativistic Potential Model
Abstract:
In this thesis, quark-antiquark bound states are considered using a relativistic two-body equation for Dirac particles. The meson mass spectrum includes bound states involving two heavy quarks or one heavy and one light quark. In order to analyse these states within a unified formalism, it is desirable to have a two-fermion equation that reduces to the one-body Dirac equation with a static interaction for the light quark when the other particle's mass tends to infinity. A suitable two-body equation has been developed by Mandelzweig and Wallace. This equation is solved in momentum space and is used to describe the complete meson spectrum. The potential used in this work contains a short-range one-gluon-exchange interaction and long-range linear confining and constant potential terms. This model is used to investigate the decay processes of heavy mesons. Semileptonic decays are more tractable since there are no final-state interactions between the leptons and hadrons that would otherwise complicate the situation. Studies of B and D meson decays are helpful for understanding the nonperturbative strong interactions of heavy mesons, which in turn is useful for extracting the details of the weak interaction process. Calculations of the form factors of these semileptonic decays of pseudoscalar mesons are also presented.
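As a loose, nonrelativistic toy analogue (explicitly not the Mandelzweig-Wallace equation used in the thesis), the sketch below solves the S-wave radial Schrödinger equation with a Cornell-type potential, i.e. a short-range Coulombic one-gluon-exchange term plus linear confinement and a constant, by finite differences; all parameter values are illustrative.

```python
import numpy as np

# Toy nonrelativistic analogue of a quarkonium model: solve the S-wave
# radial Schroedinger equation with a Cornell-type potential
#   V(r) = -(4/3) alpha_s / r + b r + c
# by finite differences. This is NOT the relativistic two-body equation of
# the thesis; parameters are rough charmonium-like values (GeV, hbar=c=1).

alpha_s, b, c = 0.4, 0.18, -0.3      # coupling, string tension (GeV^2), shift
mq = 1.5                             # assumed charm-quark mass (GeV)
mu = mq / 2.0                        # reduced mass of the quark pair

n, rmax = 2000, 20.0                 # grid points, box size (GeV^-1)
r = np.linspace(rmax / n, rmax, n)   # avoid the r = 0 singularity
h = r[1] - r[0]

V = -(4.0 / 3.0) * alpha_s / r + b * r + c

# Hamiltonian for u(r) = r R(r): -u''/(2 mu) + V u = E u, u(0)=u(rmax)=0.
off = np.full(n - 1, -1.0 / (2.0 * mu * h**2))
H = np.diag(1.0 / (mu * h**2) + V) + np.diag(off, 1) + np.diag(off, -1)

for i, e in enumerate(np.linalg.eigvalsh(H)[:2], start=1):
    print(f"{i}S level: E = {e:+.3f} GeV, mass ~ {2 * mq + e:.3f} GeV")
```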
Abstract:
Building robust recognition systems requires a careful understanding of the effects of error in sensed features. Error in these image features results in a region of uncertainty in the possible image location of each additional model feature. We present an accurate, analytic approximation for this uncertainty region when model poses are based on matching three image and model points, for both Gaussian and bounded error in the detection of image points, and for both scaled-orthographic and perspective projection models. This result applies to objects that are fully three-dimensional, where past results considered only two-dimensional objects. Further, we introduce a linear programming algorithm to compute the uncertainty region when poses are based on any number of initial matches. Finally, we use these results to extend, from two-dimensional to three-dimensional objects, robust implementations of alignment, interpretation-tree search, and transformation clustering.
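As a rough illustration of the linear-programming idea under a scaled-orthographic (affine) camera with bounded error (the model points, observations, and error bound below are invented): the pose parameters consistent with the matches form a polytope, and the extreme image coordinates of an additional model point are obtained by LP.

```python
import numpy as np
from scipy.optimize import linprog

# Sketch: bound the image location of an extra model point given matched
# points with bounded detection error, under an affine (scaled-orthographic)
# camera. Each image coordinate is linear in the 4 pose parameters of its
# row, so the extremes of the new point's coordinate are linear programs.
# The points and the error bound eps are invented for illustration.

model = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])   # matched
image_x = np.array([0.10, 0.95, 0.05, 0.22])   # observed x-coordinates
eps = 0.02                                     # bounded detection error
new_pt = np.array([0.5, 0.5, 0.5])             # unmatched model point

# Pose row p = (a1, a2, a3, tx); predicted x_i = model_i . (a1,a2,a3) + tx.
A = np.hstack([model, np.ones((len(model), 1))])
# |A p - image_x| <= eps  ->  A p <= x + eps  and  -A p <= -(x - eps)
A_ub = np.vstack([A, -A])
b_ub = np.concatenate([image_x + eps, -(image_x - eps)])
c = np.append(new_pt, 1.0)            # objective: predicted x of new point

bounds = [(None, None)] * 4           # pose parameters are unconstrained
lo = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun
hi = -linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun
print(f"x-coordinate of the new feature lies in [{lo:.4f}, {hi:.4f}]")
```

Repeating the same two LPs for the y row (and for each candidate feature) traces out the full rectangular bound on its uncertainty region.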