985 results for Classical Invariant Theory
Abstract:
This perspectives paper and its associated commentaries examine Alan Rugman's conceptual contribution to international business scholarship. Most significantly, we highlight Rugman's version of internalization theory as an approach that integrates transaction cost economics and 'classical' internalization theory with elements from the resource-based view, such that it is especially relevant to strategic management. In reviewing his oeuvre, we also offer observations on his ideas for 'new internalization theory'. We classify his other novel insights into four categories: network multinationals; national competitiveness; development and public policy; and emerging-economy MNEs. This special section offers multiple views on how his work informed the larger academic debate and considers how these ideas might evolve in the longer term.
Abstract:
We construct exact vortex solutions in 3+1 dimensions to a theory that is an extension, due to Gies, of the Skyrme-Faddeev model, and that is believed to describe some aspects of the low-energy limit of the pure SU(2) Yang-Mills theory. Despite efforts over the last decades, these are the first exact analytical solutions to be constructed for this type of theory. The exact vortices appear in a very particular sector of the theory, characterized by special values of the coupling constants and by a constraint that leads to an infinite number of conserved charges. The theory is scale invariant in that sector, and the solutions satisfy Bogomolny-type equations. The energy of the static vortex is proportional to its topological charge, and waves can travel along the vortices with the speed of light, adding to the energy a term proportional to a U(1) Noether charge they create. We believe such vortices may play a role in the strong-coupling regime of the pure SU(2) Yang-Mills theory.
Abstract:
We construct analytical and numerical vortex solutions for an extended Skyrme-Faddeev model in a (3+1)-dimensional Minkowski space-time. The extension is obtained by adding to the Lagrangian a quartic term, the square of the kinetic term, and a potential that breaks the SO(3) symmetry down to SO(2). The construction makes use of an ansatz invariant under the joint action of the internal SO(2) and three commuting U(1) subgroups of the Poincaré group, which reduces the equations of motion to an ordinary differential equation for a profile function depending on the distance to the x_3 axis. The vortices have finite energy per unit length and carry waves propagating along them with the speed of light. The analytical vortices are obtained for a special choice of potentials, and the numerical ones are constructed using the successive over-relaxation method for more general potentials. The spectrum of solutions is analyzed in detail, especially its dependence upon special combinations of coupling constants.
Abstract:
The Work Limitations Questionnaire (WLQ) is used to determine the amount of work loss and lost productivity that stem from certain health conditions, including rheumatoid arthritis and cancer. The questionnaire is currently scored using methodology from Classical Test Theory. Item Response Theory (IRT), by contrast, is a theory based on analyzing individual item responses. This study sought to determine the validity of using IRT to analyze data from the WLQ. Item responses from 572 employed adults with dysthymia, major depressive disorder (MDD), double depression (both dysthymia and MDD), rheumatoid arthritis, and healthy individuals were used to assess the validity of IRT (Adler et al., 2006). PARSCALE, IRT software from Scientific Software International, Inc., was used to calculate estimates of work limitations based on item responses from the WLQ. These estimates, known as ability estimates, were then correlated with the raw scores calculated from the sum of all item responses. Concurrent validity, under which a new measurement is considered valid if its correlation with an established valid measurement is at least .90, was used to judge the IRT methodology for the WLQ. Ability estimates from IRT were fairly highly correlated with the WLQ raw scores (above .80). However, the only subscale whose correlation was high enough for IRT to be considered valid was the time management subscale (r = .90). The other subscales (mental/interpersonal, physical, and output) did not produce valid IRT ability estimates. These lower-than-expected correlations can be explained by the outliers found in the sample.
Also, acquiescent responding (AR) bias, caused by the tendency of respondents to answer every question in the same way, and the multidimensionality of the questionnaire (the WLQ comprises four dimensions and thus four different latent variables) probably had a major impact on the IRT estimates. Furthermore, it is possible that the mental/interpersonal dimension violated the monotonicity assumption of IRT, causing PARSCALE to fail to run for those estimates; the monotonicity assumption needs to be checked for that dimension. Finally, the use of multidimensional IRT methods would most likely remove the AR bias and increase the validity of using IRT to analyze data from the WLQ.
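The concurrent-validity check described above reduces to a simple correlation test. A minimal sketch, using hand-made numbers rather than the study's actual PARSCALE output (the functions and the .90 threshold mirror the procedure described, but the data are illustrative):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def is_concurrently_valid(irt_theta, raw_scores, threshold=0.90):
    """Concurrent validity: IRT ability estimates must correlate with the
    established raw-score measure at or above the threshold."""
    return pearson_r(irt_theta, raw_scores) >= threshold
```

With this criterion, a subscale correlating at .80 (as most WLQ subscales did) falls short, while one at .90 (time management) passes.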
Abstract:
2000 Mathematics Subject Classification: 16R10, 16R30.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014
Abstract:
This work extends a previously presented refined sandwich beam finite element (FE) model to vibration analysis, including dynamic piezoelectric actuation and sensing. The mechanical model is a refinement of the classical sandwich theory (CST), in which the core is modelled with a third-order shear deformation theory (TSDT). Along the beam length, the FE model assumes, electrically, a constant voltage in the piezoelectric layers and a quadratic third-order variable of the electric potential in the core, and, mechanically, a linear axial displacement, a quadratic bending rotation of the core, and a cubic transverse displacement of the sandwich beam. Despite the refined mechanical and electric behaviour of the piezoelectric core, the model leads to the same number of degrees of freedom as the previous CST one, thanks to a two-step static condensation of the internal degrees of freedom (the bending rotation and the third-order variable of the core electric potential). The results obtained with the proposed FE model are compared to available numerical, analytical and experimental ones. The results confirm that the TSDT and the induced cubic electric potential yield an extra stiffness to the sandwich beam. (C) 2007 Elsevier Ltd. All rights reserved.
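The static condensation used to eliminate internal degrees of freedom can be sketched generically. This is an illustrative Schur-complement reduction with made-up numbers, not the actual sandwich-beam element matrices: partitioning K u = f into retained (r) and condensed (c) blocks with zero load on the condensed DOFs, the reduced stiffness is K* = K_rr - K_rc K_cc^{-1} K_cr.

```python
# Static condensation of one internal DOF (scalar K_cc) out of a small
# retained block; the numbers below are hypothetical, for illustration only.
def condense(K_rr, K_rc, K_cr, K_cc):
    """Return K* = K_rr - K_rc * K_cc^{-1} * K_cr for a scalar K_cc."""
    inv_cc = 1.0 / K_cc
    n = len(K_rr)
    return [[K_rr[i][j] - K_rc[i] * inv_cc * K_cr[j] for j in range(n)]
            for i in range(n)]

# Example: condense a single internal DOF out of a 2-DOF retained set.
K_star = condense(K_rr=[[4.0, 1.0], [1.0, 3.0]],
                  K_rc=[1.0, 2.0], K_cr=[1.0, 2.0], K_cc=2.0)
```

The reduced matrix acts only on the retained DOFs, which is how the refined element keeps the DOF count of the CST element.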
Abstract:
An exact non-linear formulation of the equilibrium of elastic prismatic rods subjected to compression and planar bending is presented, electing the cross-section rotation as the primary displacement variable and taking axis extensibility into account. The formulation proves to be sufficiently general to encompass any boundary condition. Critical loads are evaluated for the five classical Euler buckling cases, allowing the effect of axis extensibility to be assessed. Quantitatively, this influence is negligible for very slender bars, but it increases dramatically as the slenderness ratio decreases. Qualitatively, its effect is that there is no longer an infinite number of critical loads, as foreseen by the classical inextensible theory. The method of multiple (spatial) scales is used to survey the post-buckling regime for the five classical Euler buckling cases, with remarkable success, since only very small deviations were observed with respect to results obtained via numerical integration of the exact equilibrium equation, even for loads much higher than the critical ones. Although these classical Euler buckling cases are known beforehand to be imperfection insensitive, the effect of load offsets was also examined, showing that the formulation is general enough to accommodate this sort of analysis. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
We review recent developments in quantum and classical soliton theory, leading to the possibility of observing both classical and quantum parametric solitons in higher-dimensional environments. In particular, we consider the theory of three bosonic fields interacting via both parametric (cubic) and quartic couplings. In the case of photonic fields in a nonlinear optical medium, this corresponds to the process of sum frequency generation (via the chi^(2) nonlinearity) modified by the chi^(3) nonlinearity. Potential applications include an ultrafast photonic AND gate. The simplest quantum solitons or energy eigenstates (bound-state solutions) of the interacting field Hamiltonian are obtained exactly in three space dimensions. They have a point-like structure, even though the corresponding classical theory is nonsingular. We show that the solutions can be regularized by imposing a momentum cut-off on the nonlinear couplings. The case of three-dimensional matter-wave solitons in coupled atomic/molecular Bose-Einstein condensates is discussed.
Abstract:
Purpose - Using Brandenburger and Nalebuff's 1995 co-opetition model as a reference, the purpose of this paper is to develop a tool that, based on the tenets of classical game theory, enables scholars and managers to identify which games may be played in response to the different conflict-of-interest situations faced by companies in their business environments. Design/methodology/approach - The literature on game theory and business strategy is reviewed and a conceptual model, the strategic games matrix (SGM), is developed. Two novel games are described and modeled. Findings - The co-opetition model is not sufficient to realistically represent most of the conflict-of-interest situations faced by companies. The paper addresses this problem through the SGM, which expands upon Brandenburger and Nalebuff's model by providing a broader perspective, incorporating an additional dimension (the power ratio between players) and three strategic postures (rival, individualistic, and associative). Practical implications - The proposed model, based on the concepts of game theory, can be used to train decision- and policy-makers to better understand, interpret and formulate conflict management strategies. Originality/value - A practical and original tool for using game models in conflict-of-interest situations is generated. Basic classical games, such as Nash, Stackelberg, Pareto, and Minimax, are mapped on the SGM to suggest in which situations they could be useful. Two innovative games are described to fit four different types of conflict situations that so far have had no corresponding game in the literature. A test application of the SGM to a classic Intel Corporation strategic management case, in the complex personal computer industry, shows that the proposed method is able to describe, interpret, analyze, and prescribe optimal competitive and/or cooperative strategies for each conflict-of-interest situation.
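The classical games the SGM maps (Nash, Stackelberg, Minimax, and so on) rest on the notion of mutual best response. A minimal sketch, not part of the SGM itself, of finding pure-strategy Nash equilibria in a two-player bimatrix game (the prisoner's-dilemma payoffs below are the standard textbook example, not data from the paper):

```python
def pure_nash(A, B):
    """Pure-strategy Nash equilibria of a bimatrix game.
    A[i][j] and B[i][j] are the row and column players' payoffs."""
    rows, cols = len(A), len(A[0])
    eqs = []
    for i in range(rows):
        for j in range(cols):
            row_best = all(A[i][j] >= A[k][j] for k in range(rows))
            col_best = all(B[i][j] >= B[i][l] for l in range(cols))
            if row_best and col_best:       # neither player can gain by deviating
                eqs.append((i, j))
    return eqs

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
A = [[3, 0], [5, 1]]
B = [[3, 5], [0, 1]]
```

Here mutual defection (1, 1) is the unique equilibrium, illustrating the kind of conflict situation a quadrant of the matrix is meant to capture.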
Abstract:
The problem of the negative values of the interaction parameter in the equation of Frumkin has been analyzed with respect to the adsorption of nonionic molecules on an energetically homogeneous surface. For this purpose, the adsorption states of a homologue series of ethoxylated nonionic surfactants at the air/water interface have been determined using four different models and literature data (surface tension isotherms). The results obtained with the Frumkin adsorption isotherm imply repulsion between the adsorbed species (corresponding to negative values of the interaction parameter), while the classical lattice theory for an energetically homogeneous surface (e.g., water/air) admits attraction alone. It appears that this serious contradiction can be overcome by assuming heterogeneity in the adsorption layer, that is, effects of partial condensation (formation of aggregates) on the surface. Such a phenomenon is suggested in the Fainerman-Lucassen-Reynders-Miller (FLM) 'Aggregation model'. Despite the limitations of the latter model (e.g., monodispersity of the aggregates), we have been able to estimate the sign and the order of magnitude of Frumkin's interaction parameter and the range of the aggregation numbers of the surface species. (C) 2004 Elsevier B.V. All rights reserved.
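The Frumkin isotherm at issue can be written, in one common sign convention (conventions differ between authors), as b*c = theta/(1 - theta) * exp(-2*a*theta), where theta is the surface coverage and a is the interaction parameter, with a > 0 attraction and a < 0 repulsion. A minimal illustrative evaluation, not tied to the paper's fitted parameters:

```python
import math

def frumkin_bulk_activity(theta, a):
    """Bulk activity b*c needed to sustain coverage theta under the
    Frumkin isotherm; a is the interaction parameter (a = 0 recovers
    the Langmuir isotherm)."""
    return theta / (1.0 - theta) * math.exp(-2.0 * a * theta)
```

Repulsion (a < 0) raises the bulk concentration required to reach a given coverage relative to the Langmuir case, which is the behaviour the negative fitted parameters imply.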
Abstract:
There has been a resurgence of interest in the mean trace length estimator of Pahl for window sampling of traces. The estimator has been dealt with by Mauldon, and by Zhang and Einstein, in recent publications. The estimator is very useful in that it is non-parametric. However, despite some discussion regarding the statistical distribution of the estimator, neither the recent works nor the original work by Pahl provide a rigorous basis for the determination of a confidence interval for the estimator, or of a confidence region for the estimator and the corresponding estimator of trace spatial intensity in the sampling window. This paper shows, by considering a simplified version of the problem but without loss of generality, that the estimator is in fact the maximum likelihood estimator (MLE) and that it can be considered essentially unbiased. As the MLE, it possesses the least variance of all estimators, and confidence intervals or regions should therefore be available through application of classical ML theory. It is shown that valid confidence intervals can in fact be determined. The results of the work and the calculations of the confidence intervals are illustrated by example. (C) 2003 Elsevier Science Ltd. All rights reserved.
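The classical ML machinery invoked above yields Wald-type intervals: for an MLE theta_hat with observed information I_n, an approximate 95% interval is theta_hat +/- 1.96 / sqrt(I_n). As a stand-in illustration (not the paper's trace-length model), take the mean of an exponential sample, whose MLE is the sample mean and whose observed information is n / theta_hat^2:

```python
import math

def exp_mean_wald_ci(sample, z=1.96):
    """Approximate 95% Wald confidence interval for the mean of an
    exponential sample, via classical ML asymptotics. Illustrative
    analogue only; the paper's estimator has its own likelihood."""
    n = len(sample)
    theta_hat = sum(sample) / n              # MLE of the mean
    se = theta_hat / math.sqrt(n)            # 1 / sqrt(observed information)
    return theta_hat - z * se, theta_hat + z * se
```

The same recipe, applied to the trace-length likelihood, is what makes the confidence intervals in the paper available once the estimator is identified as the MLE.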
Abstract:
The challenges facing modern engineering are ever greater, with the goal almost always being lighter structures, with attractive mechanical properties and often complex geometries. Given such requirements, one material that has seen growing application is the composite material. However, the structural analysis of these materials is far more complex, since they are generally formed by stacking several layers of heterogeneous material, which may be arranged in different orientations. Thus, software that can predict the mechanical properties of a composite structure through micromechanics, applying Classical Laminate Theory and a failure criterion such as Tsai-Hill, is fundamental to streamline the study of the structure to be manufactured. To meet this need, an application named CAFE – Composite Analysis For Engineers was developed in MATLAB® GUI, with an appealing graphical environment, which determines all the variables important to the study of composite structures. The application aims to support and speed up learning in this field, while also giving the user access to the calculation code, so that the equations used can be inspected and, eventually, further developed. The program was validated by comparing its results with those of another highly reliable program. It was therefore concluded that the CAFE software produces valid results and is ready for use.
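The per-ply failure check performed by tools such as the CAFE application can be sketched with the Tsai-Hill criterion under plane stress: failure is predicted when (s1/X)^2 - s1*s2/X^2 + (s2/Y)^2 + (t12/S)^2 reaches 1. A minimal sketch; the strength values in the test usage are hypothetical, not a real material:

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill plane-stress failure index for one ply.
    s1, s2, t12: longitudinal, transverse and shear ply stresses;
    X, Y, S: corresponding ply strengths."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

def ply_fails(s1, s2, t12, X, Y, S):
    """Failure is predicted when the index reaches 1."""
    return tsai_hill_index(s1, s2, t12, X, Y, S) >= 1.0
```

In a laminate analysis, the ply stresses fed to this check come from Classical Laminate Theory, evaluated ply by ply in the material axes.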
Abstract:
This paper presents a new package, the Simulator of Intelligent Transportation Systems (SITS), and a computationally oriented analysis of traffic dynamics. The SITS adopts a microscopic simulation approach to reproduce real traffic conditions, considering different types of vehicles, drivers and roads. A set of experiments with the SITS reveals the dynamic phenomena exhibited by this kind of system. For this purpose, a modelling formalism is developed that embeds statistics and the Laplace transform. The results make it possible to adopt classical system theory tools and show that traffic systems can be studied by taking advantage of the knowledge gathered with automatic control algorithms. A complementary perspective on the analysis of the traffic flow is also quantified through an entropy measure.
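An entropy measure of this kind can be illustrated generically. This sketch assumes a Shannon entropy over an empirical distribution (for instance, vehicle speeds binned into classes); the paper's exact definition is not given here:

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as
    per-bin counts, e.g. a histogram of observed vehicle speeds.
    Higher values indicate a more disordered traffic state."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)
```

A uniform spread over four speed classes gives 2 bits, while all vehicles in a single class gives 0, so the measure distinguishes free-flowing heterogeneous traffic from a uniform jammed state.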
Abstract:
ABSTRACT - This work analyzes the impact on public expenditure on medicines of the implementation of Decree-Law 48-A/2010, of 13 May, and Decree-Law 106-A/2010, of 1 October, in the years 2011 and 2012. These statutes changed the rules for setting the reference price and are believed to have contributed to the reduction of SNS expenditure on medicines observed in 2011 and 2012. It is believed that, before their implementation, the competitive market for generic medicines did not exhibit the necessary competition, with prices failing to approach marginal cost as predicted by classical economic theory. The aim is to identify the total market of the homogeneous groups and analyze 50% of its value, by identifying the effective reference price from the 1st quarter of 2011 to the 4th quarter of 2012 and calculating the expected reference price in the absence of those statutes, based on the rules in force before their implementation. By identifying the relative weight of the change in the reference-price system rules in SNS expenditure on medicines in 2011 and 2012, future strategies for controlling public expenditure on medicines can be outlined more rigorously, a factor of special relevance given the current context of austerity.