875 results for Theory and method
Abstract:
Extended gcd calculation has a long history and plays an important role in computational number theory and linear algebra. Recent results have shown that finding optimal multipliers in extended gcd calculations is difficult. We present an algorithm which uses lattice basis reduction to produce small integer multipliers x(1), ..., x(m) for the equation s = gcd(s(1), ..., s(m)) = x(1)s(1) + ... + x(m)s(m), where s(1), ..., s(m) are given integers. The method generalises to produce small unimodular transformation matrices for computing the Hermite normal form of an integer matrix.
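For reference, the sketch below computes one valid set of multipliers for the identity above by iterating the two-integer extended Euclidean algorithm; it only illustrates the identity and, unlike the lattice-basis-reduction algorithm of the paper, makes no attempt to keep the multipliers small.

```python
def ext_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) = a*x + b*y."""
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

def multi_ext_gcd(s):
    """Return (g, xs) with g = gcd(s[0], ..., s[m-1]) = sum(x_i * s_i)."""
    g, xs = s[0], [1]
    for si in s[1:]:
        g, u, v = ext_gcd(g, si)
        xs = [u * x for x in xs] + [v]   # update multipliers for the new gcd
    return g, xs

s = [1547, 2093, 4199]                   # hypothetical input integers
g, xs = multi_ext_gcd(s)
assert g == sum(x * si for x, si in zip(xs, s))
print(g, xs)
```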
Theoretical and numerical analyses of convective instability in porous media with upward throughflow
Abstract:
Exact analytical solutions have been obtained for a hydrothermal system consisting of a horizontal porous layer with upward throughflow. The boundary conditions considered are constant temperature and constant pressure at the top, and constant vertical temperature gradient and constant Darcy velocity at the bottom of the layer. After deriving the exact analytical solutions, we examine their stability using linear stability theory and the Galerkin method. It has been found that the exact solutions for such a hydrothermal system become unstable when the Rayleigh number of the system is equal to or greater than the corresponding critical Rayleigh number. For small and moderate Peclet numbers (Pe ≤ 6), an increase in upward throughflow destabilizes the convective flow in the horizontal layer. To confirm these findings, the finite element method with the progressive asymptotic approach procedure is used to compute the convective cells in such a hydrothermal system. Copyright (C) 1999 John Wiley & Sons, Ltd.
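For orientation, the Rayleigh number referred to above is the Darcy (porous-medium) Rayleigh number,
\[
Ra = \frac{\rho_0\, g\, \beta\, K\, H\, \Delta T}{\mu\, \kappa_m},
\]
where \(K\) is the permeability, \(H\) the layer thickness, \(\Delta T\) the temperature difference across the layer, and \(\kappa_m\) the effective thermal diffusivity of the saturated medium. For the classical problem with no throughflow and isothermal impermeable boundaries the critical value is \(4\pi^2\); the critical values for the boundary conditions and throughflow considered here differ and depend on the Peclet number.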
Abstract:
Objective: To compare secular trends in method-specific suicide rates among young people in Australia and England & Wales between 1968 and 1997. Methods: Australian data were obtained from the Australian Bureau of Statistics, and data for England & Wales from the Office for National Statistics. Overall and method-specific suicide rates for 15-34 year old males and females were calculated using ICD codes E950-9 and E980-9 except E988.8. Results: In both settings, suicide rates have almost doubled in young males over the past 30 years (from 16.8 to 32.9 per 100,000 in Australia and from 10.1 to 19.0 in England & Wales). Overall rates have changed little in young females. In both sexes and in both settings there have been substantial increases in suicide by hanging (a five- to seven-fold increase in Australia and a four-fold increase in England & Wales). There have also been smaller increases in gassing in the 1980s and '90s. In females, the impact of these increases on overall rates has been offset by a decline in drug overdose, the most common method in females. Conclusions: Rates of male suicide have increased substantially in both settings in recent years, and hanging has become an increasingly common method of suicide. The similarity of the observed trends in both settings supports the view that such changes may have common causes. Research should focus on understanding why hanging has increased in popularity and what measures may be taken to counter this increase.
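For reference, the method-specific rates quoted above are conventional rates per 100,000 population:
\[
\text{rate} = \frac{\text{suicides in the age/sex/method group}}{\text{population of that group}} \times 100{,}000 .
\]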
Abstract:
Background: Addressing human rights issues brings forth ethical and political responsibilities for occupational therapists and requires new epistemological and educational approaches. The way occupational therapists have faced these challenges has depended upon historical, cultural and social contexts. Aim and method: By means of literature review and historical analysis, this paper reflects on how occupational therapists have dealt with human rights issues and on the contemporary changes within the profession. Results and discussion: The paper portrays how Latin American occupational therapists have engaged in social transformation by choosing not to transform ethical and political problems into technical matters. Taking into account experiences and views from South Africa, Brazil and Chile, the paper outlines the importance of developing political literacy and interdisciplinary professional/postprofessional education to prepare the new generation of occupational therapists to engage in social transformation. Addressing issues of invisibility and lack of access to human rights, the paper reflects on the need to develop conceptual tools and strategies for change, and discusses the transformations being produced in contemporary occupational therapy. Conclusion: Occupational therapists and scientists need to be attentive to human rights issues. They also need to answer the call for interconnectedness in present-day complex societies, and engage in networking and cross-border dialogue. Nevertheless, although necessary and welcome, international cooperation requires a permanent exercise of cultural sensitivity, political awareness and self-awareness.
Abstract:
Uncontrolled systems ẋ ∈ Ax, where A is a non-empty compact set of matrices, and controlled systems ẋ ∈ Ax + Bu are considered. Higher-order systems 0 ∈ Px − Du, where P and D are sets of differential polynomials, are also studied. It is shown that, under natural conditions commonly occurring in robust control theory, with some mild additional restrictions, asymptotic stability of differential inclusions is guaranteed. The main results are variants of small-gain theorems, and the principal technique used is the Krasnosel'skii-Pokrovskii principle of absence of bounded solutions.
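For context, the classical (linear-gain) small-gain theorem, of which the results above are differential-inclusion variants, states that a feedback interconnection of two finite-gain stable systems with gains \(\gamma_1\) and \(\gamma_2\) remains stable provided
\[
\gamma_1\,\gamma_2 < 1 .
\]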
Abstract:
The generalization of the quasi mode theory of macroscopic quantization in quantum optics and cavity QED presented in the previous paper is applied to provide a fully quantum theoretic derivation of the laws of reflection and refraction at a boundary. The quasi mode picture of this process involves the annihilation of a photon travelling in the incident region quasi mode, and the subsequent creation of a photon in either the incident region or transmitted region quasi modes. The derivation of the laws of reflection and refraction is achieved through the dual application of the quasi mode theory and a quantum scattering theory based on the Heisenberg picture. Formal expressions from scattering theory are given for the reflection and transmission coefficients. The behaviour of the intensity for a localized one-photon wave packet coming in at time minus infinity from the incident direction is examined, and it is shown that at time plus infinity the light intensity is significant only where the classical laws of reflection and refraction predict. The occurrence of both refraction and reflection depends upon the quasi mode theory coupling constants between incident and transmitted region quasi modes being nonzero, and it is seen that the contributions to such coupling constants come from the overlap of the mode functions in the boundary layer region, as might be expected from a microscopic theory.
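For reference, the classical laws recovered in this limit are the law of reflection and Snell's law of refraction,
\[
\theta_r = \theta_i, \qquad n_1 \sin\theta_i = n_2 \sin\theta_t .
\]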
Abstract:
A new method is presented to determine an accurate eigendecomposition of difficult low-temperature unimolecular master equation problems. Based on a generalisation of the Nesbet method, the new method is capable of achieving complete spectral resolution of the master equation matrix with relative accuracy in the eigenvectors. The method is applied to a test case, the decomposition of ethane at 300 K from a microcanonical initial population, with energy transfer modelled by both Ergodic Collision Theory and the exponential-down model. It is demonstrated that quadruple-precision (16-byte) arithmetic is required irrespective of the eigensolution method used. (C) 2001 Elsevier Science B.V. All rights reserved.
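As a generic illustration of the precision point (not the Nesbet-based solver of the paper), the sketch below compares double- and quadruple-precision eigenvalues of a hypothetical symmetric matrix whose eigenvalues span many orders of magnitude, as they do in low-temperature master equations.

```python
# Illustrative only: double vs. quadruple precision eigendecomposition of a
# hypothetical graded symmetric matrix (stand-in for a symmetrized master equation).
import numpy as np
from mpmath import mp

mp.dps = 34  # roughly quadruple (16-byte) precision

n = 25
d = [10.0 ** (-1.5 * i) for i in range(n)]                  # widely spread scales
b = [0.1 * (d[i] * d[i + 1]) ** 0.5 for i in range(n - 1)]  # weak couplings

# Double precision (float64) eigenvalues.
A64 = np.diag(d) + np.diag(b, 1) + np.diag(b, -1)
ev64 = np.linalg.eigvalsh(A64)

# Quadruple precision eigenvalues of the same matrix via mpmath.
A = mp.matrix(n, n)
for i in range(n):
    A[i, i] = d[i]
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = b[i]
evq, _ = mp.eigsy(A)

print("smallest-magnitude eigenvalue, float64:", ev64.min())
print("smallest-magnitude eigenvalue, dps=34 :", min(evq[i, 0] for i in range(n)))
```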
Abstract:
This paper conducts a dynamic stability analysis of symmetrically laminated FGM rectangular plates with general out-of-plane support conditions, subjected to a uniaxial periodic in-plane load and undergoing uniform temperature change. Theoretical formulations are based on Reddy's third-order shear deformation plate theory and account for the temperature dependence of material properties. A semi-analytical Galerkin-differential quadrature approach is employed to convert the governing equations into a linear system of Mathieu-Hill equations, from which the boundary points of the unstable regions are determined by Bolotin's method. Free vibration and bifurcation buckling are also discussed as subset problems. Numerical results are presented in both dimensionless tabular and graphical forms for laminated plates with FGM layers made of silicon nitride and stainless steel. The influences of parameters such as material composition, layer thickness ratio, temperature change, static load level, and boundary constraints on the dynamic stability, buckling, and vibration frequencies are examined in detail through parametric studies.
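In generic matrix form (a standard formulation, not necessarily the paper's exact notation), the parametrically loaded plate reduces to a Mathieu-Hill system
\[
\mathbf{M}\ddot{\mathbf{q}} + \bigl[\mathbf{K} - (P_s + P_d\cos\theta t)\,\mathbf{K}_g\bigr]\mathbf{q} = \mathbf{0},
\]
and Bolotin's method locates the boundaries of the principal instability region by seeking periodic solutions of period \(4\pi/\theta\), which to first approximation gives
\[
\det\!\Bigl(\mathbf{K} - P_s\mathbf{K}_g \pm \tfrac{1}{2}P_d\mathbf{K}_g - \tfrac{\theta^2}{4}\,\mathbf{M}\Bigr) = 0,
\]
where \(P_s\) and \(P_d\) are the static and dynamic components of the in-plane load and \(\mathbf{K}_g\) is the geometric stiffness matrix.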
Abstract:
This article advances the theoretical integration between securitization theory and the framing approach, resulting in a set of criteria hereby called security framing. It seeks to make a twofold contribution: to sharpen the study of the ideational elements that underlie the construction of threats, and to advance towards a greater assessment of the audience's preferences. The case study under examination is the 2011 military intervention of the countries of the Gulf Cooperation Council in Bahrain. The security framing of this case will help illuminate the dynamics at play in one of the most important recent events in Gulf politics.
Abstract:
Purpose/objectives: This paper investigates whether the performance management (PM) framework adopted in Portuguese local government (PLG) fits Otley's (1999) PM framework. In particular, the research questions are (1) whether the PM framework adopted in PLG (SIADAP) fits Otley's framework, and (2) how local politicians (aldermen) see the operation of performance management systems (PMS) in PLG (focusing on the goal-setting process and on incentive and reward structures). Theoretical positioning/contributions: With this paper we intend to contribute to the literature on how Otley's PM framework can guide empirical research on the operation of PMS. In particular, the paper contributes to understanding the fit between the PMS implemented in PLG and Otley's PM framework. The analysis of this fit can help to establish whether PMS are used in PLG as a management tool or as a strategic response to external pressures (based on interviews conducted with aldermen). We believe that Otley's PM framework, as well as the extended PM framework presented by Ferreira and Otley (2009), can provide a useful research tool for understanding the operation of PMS in PLG. Research method: The first research question is the central issue in this paper and is analyzed on the basis of the main reforms introduced by the Portuguese government in the PM of public organizations (such as municipalities). In addition, interviews conducted in three large Portuguese municipalities (Oporto, Braga, and Matosinhos) show how aldermen see the operation of PMS in PLG, highlighting the goal-setting process with associated targets and the existence of incentive and reward structures linked to performance. Findings: In general, we find that the formal and regulated PM frameworks in PLG fit the main issues of Otley's PM framework. However, regarding the aldermen's perceptions of PMS in practice, we find a gap between theory and practice, especially difficulties associated with the lack of a culture of goal and target setting and the lack of incentive and reward structures linked to performance.
Abstract:
Functionally graded materials are composite materials wherein the composition of the constituent phases can vary in a smooth, continuous way with a gradation that is a function of the spatial coordinates. This characteristic is an important feature, as it can minimize abrupt variations of the material properties, which are usually responsible for localized high stresses, while simultaneously providing an effective thermal barrier in specific applications. In the present work, the static and free vibration behaviour of functionally graded sandwich plate type structures is studied, using B-spline finite strip element models based on different shear deformation theories. The effective properties of the functionally graded materials are estimated according to the Mori-Tanaka homogenization scheme. These sandwich structures can also include outer skins of piezoelectric materials, thus giving them adaptive characteristics. The performance of the models is illustrated through a set of test cases. (C) 2012 Elsevier Ltd. All rights reserved.
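As a minimal sketch of a standard Mori-Tanaka estimate for a two-phase particulate composite (spherical inclusions), of the kind typically used to obtain effective FGM properties, the snippet below uses illustrative placeholder moduli and a hypothetical power-law volume fraction, not the paper's data.

```python
# Standard Mori-Tanaka estimate for a two-phase composite (spherical inclusions
# of phase i dispersed in matrix m). Placeholder values; not the paper's data.

def mori_tanaka(K_m, G_m, K_i, G_i, V_i):
    """Effective bulk (K) and shear (G) moduli."""
    f_m = G_m * (9.0 * K_m + 8.0 * G_m) / (6.0 * (K_m + 2.0 * G_m))
    K_eff = K_m + V_i * (K_i - K_m) / (1.0 + (1.0 - V_i) * (K_i - K_m) / (K_m + 4.0 * G_m / 3.0))
    G_eff = G_m + V_i * (G_i - G_m) / (1.0 + (1.0 - V_i) * (G_i - G_m) / (G_m + f_m))
    return K_eff, G_eff

def young_poisson(K, G):
    """Convert bulk/shear moduli to Young's modulus and Poisson's ratio."""
    return 9.0 * K * G / (3.0 * K + G), (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))

# Hypothetical metal matrix / ceramic inclusion moduli (GPa) and a power-law
# through-thickness volume fraction V_i(z) = (z/h + 1/2)**n for the graded layer.
K_m, G_m = 165.0, 77.0    # hypothetical metal phase
K_i, G_i = 230.0, 120.0   # hypothetical ceramic phase
for z_over_h in (-0.5, 0.0, 0.5):
    V_i = (z_over_h + 0.5) ** 2.0        # n = 2, hypothetical exponent
    E, nu = young_poisson(*mori_tanaka(K_m, G_m, K_i, G_i, V_i))
    print(f"z/h = {z_over_h:+.1f}:  E = {E:6.1f} GPa,  nu = {nu:.3f}")
```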
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists in finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
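As a minimal, self-contained sketch of the linear mixing model with known endmember signatures (hypothetical data, not the chapter's ICA/IFA machinery), the following enforces nonnegativity exactly and the sum-to-one constraint approximately by augmenting a nonnegative least-squares problem with a heavily weighted row of ones.

```python
# Linear-mixture unmixing with known endmembers: nonnegative abundances and
# an (approximate) sum-to-one constraint. Hypothetical signatures and noise.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
bands, p = 50, 3
M = rng.uniform(0.1, 1.0, size=(bands, p))            # hypothetical endmember signatures
a_true = np.array([0.6, 0.3, 0.1])                     # abundances: a >= 0, sum(a) = 1
y = M @ a_true + 0.001 * rng.standard_normal(bands)    # linear mixing + noise

# Sum-to-one enforced softly via a heavily weighted extra equation sum(a) = 1.
delta = 1e3
A_aug = np.vstack([M, delta * np.ones((1, p))])
y_aug = np.append(y, delta)
a_hat, _ = nnls(A_aug, y_aug)
print(a_true, a_hat.round(3))
```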
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises ICA applicability to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades the ICA performance. Independent factor analysis (IFA) [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55].
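As an illustration of the dimensionality-reduction step mentioned above, the sketch below applies plain PCA via the SVD to a hypothetical data matrix; MNF or the nonnegativity-exploiting method of [49] would be substituted in practice.

```python
# PCA dimensionality reduction of a (bands x pixels) hyperspectral data matrix.
import numpy as np

def pca_project(Y, k):
    """Return the k-dimensional PCA scores and the projection basis for Y (bands x pixels)."""
    Ym = Y - Y.mean(axis=1, keepdims=True)          # remove the mean spectrum
    U, s, _ = np.linalg.svd(Ym, full_matrices=False)
    return U[:, :k].T @ Ym, U[:, :k]

Y = np.random.default_rng(1).uniform(size=(100, 500))  # hypothetical flattened data cube
scores, basis = pca_project(Y, k=5)
print(scores.shape, basis.shape)
```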
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, in which abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief overview of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
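In the spirit of the Dirichlet model sketched above (a toy generative example with hypothetical signatures, not the chapter's EM inference), abundances drawn from a Dirichlet distribution automatically satisfy positivity and full additivity.

```python
# Toy generative model: Dirichlet abundances mixed linearly plus noise.
import numpy as np

rng = np.random.default_rng(2)
p, n_pixels, bands = 3, 1000, 50
M = rng.uniform(0.1, 1.0, size=(bands, p))                   # hypothetical endmember signatures
A = rng.dirichlet(alpha=[2.0, 5.0, 3.0], size=n_pixels).T    # (p, n_pixels); each column sums to 1
Y = M @ A + 0.001 * rng.standard_normal((bands, n_pixels))   # observed mixed spectra
print(np.allclose(A.sum(axis=0), 1.0))                       # full additivity holds by construction
```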
Abstract:
Objective: The announcement, prenatally or at birth, of a cleft lip and/or palate represents a challenge for the parents. The purpose of this study is to identify parental working internal models of the child (parental representations of the child and relationship in the context of attachment theory) and posttraumatic stress disorder symptoms in mothers of infants born with a cleft. Method: The study compares mothers with a child born with a cleft (n = 22) and mothers with a healthy infant (n = 36). Results: The study shows that mothers of infants with a cleft more often experience insecure parental working internal models of the child and more posttraumatic stress symptoms than mothers of the control group. It is interesting that the severity or complexity of the cleft is not related to parental representations and posttraumatic stress disorder symptoms. The maternal emotional involvement, as expressed in maternal attachment representations, is higher in mothers of children with a cleft who had especially high posttraumatic stress disorder symptoms, as compared with mothers of children with a cleft having fewer posttraumatic stress disorder symptoms. Discussion: Mothers of children with a cleft may benefit from supportive therapy regarding parent-child attachment, even when they express low posttraumatic stress disorder symptoms.
Abstract:
We study preconditioning techniques for discontinuous Galerkin discretizations of isotropic linear elasticity problems in primal (displacement) formulation. We propose subspace correction methods, based on a splitting of the vector-valued piecewise linear discontinuous finite element space, that are optimal with respect to the mesh size and the Lamé parameters. The pure displacement, the mixed, and the traction-free problems are discussed in detail. We present a convergence analysis of the proposed preconditioners and include numerical examples that validate the theory and assess the performance of the preconditioners.
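For reference, the primal (displacement) formulation referred to above is based on the standard weak form with Lamé parameters \(\mu\) and \(\lambda\) (written here for the continuous problem; the discontinuous Galerkin bilinear form adds the usual consistency and penalty terms):
\[
2\mu\int_\Omega \varepsilon(\mathbf{u}) : \varepsilon(\mathbf{v})\,dx \;+\; \lambda\int_\Omega (\nabla\!\cdot\mathbf{u})(\nabla\!\cdot\mathbf{v})\,dx \;=\; \int_\Omega \mathbf{f}\cdot\mathbf{v}\,dx
\quad\text{for all admissible } \mathbf{v}.
\]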
Abstract:
The recently developed variational Wigner-Kirkwood approach is extended to the relativistic mean field theory for finite nuclei. A numerical application to the calculation of the surface energy coefficient in semi-infinite nuclear matter is presented. The new method is contrasted with the standard density functional theory and the fully quantal approach.