Abstract:
Subcycling, or the use of different timesteps at different nodes, can be an effective way of improving the computational efficiency of explicit transient dynamic structural solutions. The method that has been most widely adopted uses a nodal partition, extending the central difference method, in which small-timestep updates are performed by interpolating the displacement at neighbouring large-timestep nodes. This approach leads to narrow bands of unstable timesteps, or statistical stability. It can also be in error due to a lack of momentum conservation on the timestep interface. The author has previously proposed energy-conserving algorithms that avoid the first problem of statistical stability. However, these sacrifice accuracy to achieve stability. An approach to conserving momentum on an element interface by adding partial velocities is considered here. Applied to extend the central difference method, this approach is simple and has accuracy advantages. The method can be programmed by summing impulses of internal forces, evaluated using local element timesteps, in order to predict a velocity change at a node. However, it is still only statistically stable, so an adaptive timestep size is needed to monitor accuracy and to be adjusted if necessary. By replacing the central difference method with the explicit generalized-alpha method, it is possible to gain stability by dissipating the high-frequency response that leads to stability problems. However, coding the algorithm is less elegant, as the response depends on previous partial accelerations. Extension to implicit integration is shown to be impractical due to the neglect of remote effects of internal forces acting across a timestep interface. (C) 2002 Elsevier Science B.V. All rights reserved.
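As a rough illustration of the impulse-summation idea described in this abstract, the sketch below advances a toy 1D bar model in which each element subcycles with its own local timestep and the summed impulses of the internal forces predict the nodal velocity change over the large timestep. The element class, data layout and numbers are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch (not the paper's code): each element is integrated with its own
# local timestep, the impulses of its internal forces are summed at the nodes,
# and the summed impulse predicts the nodal velocity change over the global step.
import numpy as np
from dataclasses import dataclass

@dataclass
class BarElement:
    node_ids: tuple      # (i, j) global node numbers
    stiffness: float     # axial stiffness k
    dt_local: float      # stable local timestep for this element

    def internal_force(self, u):
        # equal and opposite axial forces on the two nodes
        f = self.stiffness * (u[self.node_ids[1]] - u[self.node_ids[0]])
        return np.array([f, -f])

def subcycled_velocity_change(u, v, mass, elements, dt_global):
    """Predict nodal velocity changes over one global step by summing impulses."""
    impulse = np.zeros_like(mass)
    for e in elements:
        n_sub = max(1, int(np.ceil(dt_global / e.dt_local)))
        dt_sub = dt_global / n_sub                 # local steps that tile the global step
        u_loc = u.copy()
        for _ in range(n_sub):
            f = e.internal_force(u_loc)
            impulse[list(e.node_ids)] += f * dt_sub
            # provisional local displacement update (central-difference flavour)
            u_loc[list(e.node_ids)] += v[list(e.node_ids)] * dt_sub
    return impulse / mass                          # velocity change per node

# toy usage: two bar elements with different local timesteps, three nodes
mass = np.array([1.0, 1.0, 1.0])
u = np.zeros(3); v = np.array([0.0, 0.1, 0.0])
elems = [BarElement((0, 1), 100.0, 1e-3), BarElement((1, 2), 400.0, 5e-4)]
dv = subcycled_velocity_change(u, v, mass, elems, 2e-3)
```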
Abstract:
An equivalent unit cell waveguide approach (WGA) to designing a multilayer microstrip reflectarray of variable-size patches is presented. In this approach, normal incidence of a plane wave on an infinite periodic array of radiating elements is considered to obtain reflection coefficient phase curves for the reflectarray's elements. It is shown that this problem is equivalent to the problem of reflection of the dominant TEM mode in a waveguide with patches interleaved by layers of dielectric. This waveguide problem is solved using a field-matching technique and a method of moments (MoM). Based on this solution, a fast computer algorithm is developed to generate reflection coefficient phase curves for a multilayer microstrip patch reflectarray. The validity of the developed algorithm is tested against alternative approaches and the Agilent High Frequency Structure Simulator (HFSS). Having confirmed the validity of the WGA approach, a small offset-fed two-layer microstrip patch array is designed and developed. This reflectarray is tested experimentally and shows good performance.
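To make the use of such phase curves concrete, the following is a minimal design-loop sketch under stated assumptions: the reflection-phase curve versus patch size is taken as given (in the paper it would come from the WGA/MoM algorithm; here it is simply tabulated), and each element's patch size is chosen so that its reflection phase compensates the spatial phase delay from the feed. The frequency, geometry and function names are illustrative only.

```python
# Illustrative reflectarray element-sizing loop; none of the numbers or the
# phase curve below are taken from the paper.
import numpy as np

c0 = 3e8
f = 10e9                                   # assumed design frequency
k0 = 2 * np.pi * f / c0

# assumed precomputed curve: patch size (mm) vs reflection phase (deg)
sizes_mm = np.linspace(6.0, 12.0, 61)
phase_deg = np.interp(sizes_mm, [6.0, 9.0, 12.0], [150.0, 0.0, -170.0])

def required_phase(elem_xy, feed_xyz):
    """Wrapped phase each element must add to collimate a broadside beam."""
    r = np.linalg.norm(np.append(elem_xy, 0.0) - feed_xyz)   # feed-to-element path
    return np.degrees((k0 * r) % (2 * np.pi)) - 180.0        # wrapped to [-180, 180)

def pick_size(phi_req):
    """Invert the tabulated phase curve by nearest match."""
    return sizes_mm[np.argmin(np.abs(phase_deg - phi_req))]

# toy usage: a few elements on a 15 mm grid, feed offset above the array
feed = np.array([0.0, -0.05, 0.15])
for ix in range(3):
    xy = np.array([ix * 0.015, 0.0])
    print(ix, pick_size(required_phase(xy, feed)))
```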
Abstract:
For zygosity diagnosis in the absence of genotypic data, or in the recruitment phase of a twin study where only single twins from same-sex pairs are being screened, or to provide a test for sample duplication leading to the false identification of a dizygotic pair as monozygotic, the appropriate analysis of respondents' answers to questions about zygosity is critical. Using data from a young adult Australian twin cohort (N = 2094 complete pairs and 519 singleton twins from same-sex pairs with complete responses to all zygosity items), we show that application of latent class analysis (LCA), fitting a 2-class model, yields results that show good concordance with traditional methods of zygosity diagnosis, but with certain important advantages. These include the ability, in many cases, to assign zygosity with specified probability on the basis of responses of a single informant (advantageous when one zygosity type is being oversampled); and the ability to quantify the probability of misassignment of zygosity, allowing prioritization of cases for genotyping as well as identification of cases of probable laboratory error. Out of 242 twins (from 121 like-sex pairs) where genotypic data were available for zygosity confirmation, only a single case was identified of incorrect zygosity assignment by the latent class algorithm. Zygosity assignment for that single case was identified by the LCA as uncertain (probability of being a monozygotic twin only 76%), and the co-twin's responses clearly identified the pair as dizygotic (probability of being dizygotic 100%). In the absence of genotypic data, or as a safeguard against sample duplication, application of LCA for zygosity assignment or confirmation is strongly recommended.
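As a rough illustration of the kind of model being fitted, the sketch below estimates a two-class latent class model for binary questionnaire items with a plain EM algorithm and returns each respondent's posterior class probability. The data, item count and starting values are invented; the cohort's actual items and the software used for the analysis are not specified here.

```python
# Minimal two-class latent class model for binary items, fitted by EM.
# Purely illustrative: the responses below are random, not twin data.
import numpy as np

rng = np.random.default_rng(0)
n_items = 4
X = rng.integers(0, 2, size=(500, n_items))   # fake 0/1 responses, rows = respondents

pi = np.array([0.5, 0.5])                     # class weights
theta = np.array([[0.8] * n_items,            # per-class item endorsement probabilities
                  [0.2] * n_items])

for _ in range(200):                          # EM iterations
    # E-step: posterior class membership for each respondent
    log_lik = (X[:, None, :] * np.log(theta[None]) +
               (1 - X[:, None, :]) * np.log(1 - theta[None])).sum(axis=2)
    log_post = np.log(pi)[None] + log_lik
    log_post -= log_post.max(axis=1, keepdims=True)
    post = np.exp(log_post)
    post /= post.sum(axis=1, keepdims=True)
    # M-step: update class weights and item probabilities
    pi = post.mean(axis=0)
    theta = (post.T @ X) / post.sum(axis=0)[:, None]
    theta = np.clip(theta, 1e-6, 1 - 1e-6)

prob_class0 = post[:, 0]   # e.g. posterior probability of the "MZ-like" class
```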
Abstract:
Published mobility measurements obtained by capillary zone electrophoresis of human growth hormone peptides are described reasonably well by the classical theoretical relationships for electrophoretic migration. This conformity between theory and experiment has rendered possible a more critical assessment of a commonly employed empirical relationship between mobility (u), net charge (z) and molecular mass (M) of peptides in capillary electrophoresis. The assumed linear dependence between u and z/M^(2/3) is shown to be an approximate description of a shallow curvilinear dependence convex to the abscissa. An improved procedure for the calculation of peptide charge (valence) is also described. (C) 2003 Elsevier B.V. All rights reserved.
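A small numerical sketch of the comparison discussed above, using invented peptide charges, masses and mobilities: a straight-line fit of u against z/M^(2/3) is compared with a quadratic fit standing in for the shallow curvilinear dependence.

```python
# Illustrative only: compare a linear fit of mobility u against z/M^(2/3) with a
# quadratic alternative.  All data values below are invented.
import numpy as np

z = np.array([1.0, 2.0, 2.5, 3.0, 4.0, 5.0])              # net charge (valence)
M = np.array([800., 1500., 2200., 3000., 4500., 6000.])   # molecular mass (Da)
u = np.array([0.9, 1.35, 1.45, 1.55, 1.75, 1.95])         # mobility (arbitrary units)

x = z / M ** (2.0 / 3.0)
lin = np.polyfit(x, u, 1)        # assumed linear dependence u = a*x + b
quad = np.polyfit(x, u, 2)       # shallow curvilinear alternative

rss_lin = np.sum((u - np.polyval(lin, x)) ** 2)
rss_quad = np.sum((u - np.polyval(quad, x)) ** 2)
print(f"linear RSS = {rss_lin:.4f}, quadratic RSS = {rss_quad:.4f}")
```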
Abstract:
In the previous two papers in this three-part series, we have examined visual pigments, ocular media transmission, and colors of the coral reef fish of Hawaii. This paper first details aspects of the light field and background colors at the microhabitat level on Hawaiian reefs, and does so from the perspective and scale of fish living on the reef. Second, information from all three papers is combined in an attempt to examine trends in the visual ecology of reef inhabitants. Our goal is to begin to see fish the way they appear to other fish. Observations resulting from the combination of results in all three papers include the following. Yellow and blue colors on their own are strikingly well matched to backgrounds on the reef such as coral and bodies of horizontally viewed water. These colors, therefore, depending on context, may be important in camouflage as well as conspicuousness. The spectral characteristics of fish colors are correlated with the known spectral sensitivities of reef fish single cones and are tuned for maximum signal reliability when viewed against known backgrounds. The optimal positions of spectral sensitivity in a modeled dichromatic visual system are generally close to the sensitivities known for reef fish. Models also predict that both UV-sensitive and red-sensitive cone types are advantageous for a variety of tasks. UV-sensitive cones are known in some reef fish; red-sensitive cones have yet to be found. Labroid colors, which appear green or blue to us, may be matched to the far-red component of chlorophyll reflectance for camouflage. Red cave/hole-dwelling reef fish are relatively poorly matched to the background they are often viewed against, but this may be visually irrelevant. The model predicts that the task of distinguishing green algae from coral is optimized with a relatively long-wavelength visual pigment pair. Herbivorous grazers whose visual pigments are known possess the longest sensitivities so far found. Labroid complex colors are highly contrasting complementary colors close up but, because of the spatial addition that results from low visual resolution, combine at a distance to match background water colors remarkably well. Therefore, they are effective for simultaneous communication and camouflage.
Abstract:
A combined Genetic Algorithm and Method of Moments design method is presented for the design of unusual near-field antennas for use in Magnetic Resonance Imaging systems. The method is successfully applied to the design of an asymmetric coil structure for use at 190 MHz and demonstrates excellent radiofrequency field homogeneity.
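A bare-bones sketch of this kind of combined search is given below. The fitness function is only a placeholder for the Method of Moments field evaluation used in the paper, and the encoding of candidate coil geometries is invented.

```python
# Minimal genetic-algorithm skeleton; the fitness call is a dummy surrogate
# standing in for an MoM evaluation of radiofrequency field homogeneity.
import numpy as np

rng = np.random.default_rng(1)
N_GENES, POP, GENS = 8, 40, 100

def field_homogeneity(genes):
    """Placeholder fitness: higher is better (stand-in for an MoM field solution)."""
    return -np.var(np.sin(3 * genes) + genes)     # arbitrary smooth surrogate

pop = rng.uniform(-1, 1, size=(POP, N_GENES))     # candidate coil geometries
for _ in range(GENS):
    fit = np.array([field_homogeneity(ind) for ind in pop])
    parents = pop[np.argsort(fit)[::-1][:POP // 2]]          # truncation selection
    children = parents.copy()
    for i in range(0, len(children) - 1, 2):                  # one-point crossover
        cut = rng.integers(1, N_GENES)
        children[i, cut:] = parents[i + 1, cut:]
        children[i + 1, cut:] = parents[i, cut:]
    children += rng.normal(scale=0.05, size=children.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([field_homogeneity(ind) for ind in pop])]
```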
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
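The core of back-to-back testing can be sketched as follows: the same input sequence is fed to two independent implementations and the residual is the difference of their outputs. The error-structuring observer described in the abstract is not reproduced here; the two toy "implementations" and the deliberate coding error are assumptions for illustration.

```python
# Bare-bones back-to-back testing: identical inputs, two implementations,
# residual = difference of outputs.  Both models below are toy first-order
# filters, one carrying a deliberate coding error.
import numpy as np

def model_ref(u, a=0.9, b=0.1):
    """Reference implementation: x[k+1] = a*x[k] + b*u[k]."""
    x, y = 0.0, []
    for uk in u:
        x = a * x + b * uk
        y.append(x)
    return np.array(y)

def model_alt(u, a=0.9, b=0.1):
    """Independent re-implementation, here with a deliberate sign error in b."""
    x, y = 0.0, []
    for uk in u:
        x = a * x - b * uk          # coding error: '-' instead of '+'
        y.append(x)
    return np.array(y)

u = np.sin(0.2 * np.arange(200))
residual = model_ref(u) - model_alt(u)   # nonzero residual flags a discrepancy
print("max |residual| =", np.abs(residual).max())
```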
Abstract:
In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
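A hedged sketch of the subspace comparison step: assuming each candidate coding error is represented by a feature matrix whose columns span the residual directions that error could produce, a candidate is kept only if the observed residuals lie numerically within that span. The definite/possible/impossible bookkeeping and the dynamic subset testing of the paper are not reproduced.

```python
# Toy subspace test for candidate errors; feature matrices and residuals are invented.
import numpy as np

def lies_in_span(residuals, feature, tol=1e-8):
    """True if every residual column is (numerically) in the column space of `feature`."""
    proj = feature @ np.linalg.pinv(feature) @ residuals
    return np.linalg.norm(residuals - proj) < tol * max(1.0, np.linalg.norm(residuals))

# toy example: residuals generated by "error 2" only
F1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # feature matrix of error 1
F2 = np.array([[0.0], [0.0], [1.0]])                   # feature matrix of error 2
R = F2 @ np.array([[0.3, -1.2, 0.7]])                  # observed residuals

for name, F in [("error 1", F1), ("error 2", F2)]:
    print(name, "possible" if lies_in_span(R, F) else "impossible")
```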
Abstract:
In this paper we investigate the concepts of 'face' and 'politeness'. We introduce a metalanguage which we believe provides a framework for simplifying the analysis of 'face' and 'politeness'. This metalanguage is based on the observation that both 'face' and 'politeness' involve external evaluations of people. This common element is represented in the metalanguage as what A shows A thinks of B and what B thinks A thinks of B. The implications of the metalanguage for the analysis of Chinese mian and lian ('face') and English face are then discussed. This is followed by an analysis of examples of politeness in English and teineisa ('politeness') in Japanese. We conclude that the metalanguage may be further developed for use in comparisons of 'face' and 'politeness' across cultures. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
The design of randomized controlled trials entails decisions that have economic as well as statistical implications. In particular, the choice of an individual or cluster randomization design may affect the cost of achieving the desired level of power, other things being equal. Furthermore, if cluster randomization is chosen, the researcher must decide how to balance the number of clusters, or sites, and the size of each site. This article investigates these interrelated statistical and economic issues. Its principal purpose is to elucidate the statistical and economic trade-offs to assist researchers to employ randomized controlled trials that have desired economic, as well as statistical, properties. (C) 2003 Elsevier Inc. All rights reserved.
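As a standard textbook illustration of the trade-off referred to here (not formulas quoted from the article): with intracluster correlation rho and m subjects per cluster, cluster randomization inflates the required sample size by the design effect, and for given per-cluster and per-subject costs there is a cost-minimizing cluster size.

```latex
% Standard design-effect and optimal-cluster-size relations: n_c is the
% cluster-randomized sample size, n_i the individually randomized one, m the
% cluster size, \rho the intracluster correlation, and c_1, c_2 the
% per-cluster and per-subject costs.  These are textbook results, given only
% to illustrate the trade-off discussed in the abstract.
\[
  n_c = n_i \bigl[\, 1 + (m-1)\rho \,\bigr], \qquad
  \text{total cost} \approx \frac{n_c}{m}\, c_1 + n_c\, c_2, \qquad
  m_{\mathrm{opt}} = \sqrt{\frac{c_1 (1-\rho)}{c_2\, \rho}} .
\]
```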
Abstract:
We present the quantum theory of the far-off-resonance continuous-wave Raman laser using the Heisenberg-Langevin approach. We show that the simplified quantum Langevin equations for this system are mathematically identical to those of the nondegenerate optical parametric oscillator in the time domain, with the following associations: pump ↔ pump, Stokes ↔ signal, and Raman coherence ↔ idler. We derive analytical results for both the steady-state behavior and the time-dependent noise spectra, using standard linearization procedures. In the semiclassical limit, these results agree with previous purely semiclassical treatments, which yield excellent agreement with experimental observations. The analytical time-dependent results predict perfect photon-statistics conversion from the pump to the Stokes and nonclassical behavior under certain operational conditions.
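To make the stated mapping concrete, a generic textbook form of the nondegenerate-OPO quantum Langevin equations is reproduced below, with coupling kappa, decay rates gamma_j, classical pump drive epsilon and input noise operators; the paper's own equations and notation may differ. Under the association given above, a_0 plays the role of the pump, a_1 the Stokes (signal) field and a_2 the Raman coherence (idler).

```latex
% Generic nondegenerate-OPO quantum Langevin equations (textbook form, not the
% paper's exact equations): a_0 = pump, a_1 = signal (Stokes), a_2 = idler
% (Raman coherence); a_j^in are input noise operators.
\begin{aligned}
  \dot{a}_1 &= -\gamma_1 a_1 + \kappa\, a_0 a_2^{\dagger} + \sqrt{2\gamma_1}\, a_1^{\mathrm{in}}, \\
  \dot{a}_2 &= -\gamma_2 a_2 + \kappa\, a_0 a_1^{\dagger} + \sqrt{2\gamma_2}\, a_2^{\mathrm{in}}, \\
  \dot{a}_0 &= -\gamma_0 a_0 - \kappa\, a_1 a_2 + \varepsilon + \sqrt{2\gamma_0}\, a_0^{\mathrm{in}}.
\end{aligned}
```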
Abstract:
Enterprise Architecture promotes the establishment of a holistic view of an organization's structure and way of working. One of the aspects addressed by Enterprise Architecture is the organization's "active structure", which concerns "who" performs the organizational activities. Several approaches have been proposed in order to provide a means of representing Enterprise Architecture, among them ARIS, RM-ODP, UPDM and ArchiMate. Despite their acceptance by the community, the existing approaches focus on different purposes, have limitations of scope, and some lack well-defined real-world semantics. Besides the modeling approaches, many ontology approaches have been proposed in order to describe the active-structure domain, including the ontologies of the SUPER Project, TOVE, the Enterprise Ontology and the W3C Org Ontology. Although specified for semantic grounding and negotiation of meaning, some of the proposed approaches have specific purposes and limited coverage. Moreover, some of the approaches are not defined using formal languages, and others are specified in languages without well-defined semantics. This work presents a well-founded reference ontology for the organizational domain. The reference organizational ontology presented here covers the basic aspects discussed in the organizational literature, such as division of labor, social relations and the classification of structural units. It also covers the organizational aspects defined in existing approaches, taking into account both modeling approaches and ontological approaches. The resulting ontology is specified in OntoUML and extends the social concepts of UFO-C.
Abstract:
The article analyses how Brazilian state actions and policies regarding peace operations during the presidency of Lula da Silva relate to the country's positions and attitudes towards United Nations peacekeeping. It argues that the inconsistencies identified in the Brazilian positions reflect the lack of a clear strategic horizon guiding the country's participation in UN peacekeeping, which consequently hinders the country's emergence as a great power.