971 results for Experimental Modal Analysis


Relevância: 30.00%

Publicador:

Resumo:

The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, by the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data that yields statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL)-based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
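As a concrete illustration of the linear mixing model under the sum-to-one abundance constraint, the sketch below unmixes a single synthetic pixel with two known endmember signatures; the closed form comes from substituting the constraint into the least-squares problem. The signatures and pixel values are invented for illustration, not data from the chapter.

```python
def unmix_two_endmembers(x, m1, m2):
    """Sum-to-one constrained least-squares unmixing for a two-endmember
    linear mixture: with x ~ a*m1 + (1 - a)*m2, substituting the constraint
    reduces the problem to a 1-D projection with a closed-form solution."""
    d = [u - v for u, v in zip(m1, m2)]   # m1 - m2
    r = [u - v for u, v in zip(x, m2)]    # x - m2
    a = sum(ri * di for ri, di in zip(r, d)) / sum(di * di for di in d)
    return max(0.0, min(1.0, a))          # clip onto the abundance simplex

# synthetic 3-band pixel: 70% of endmember 1, 30% of endmember 2
m1, m2 = [0.9, 0.5, 0.2], [0.1, 0.4, 0.8]
x = [0.7 * u + 0.3 * v for u, v in zip(m1, m2)]
print(unmix_two_endmembers(x, m1, m2))  # ~0.7
```

With more than two endmembers the same idea becomes a quadratic program over the simplex, which is what fully constrained least-squares unmixing solves.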

Resumo:

The stability of faecal egg excretion and its correlation with worm burden at the initial phase of schistosomiasis mansoni were observed in two groups of mice infected with different Schistosoma mansoni cercarial burdens, by means of quantitative parasitological studies and schistosome counts after perfusion. Thus, it may be stated that a few quantitative parasitological stool examinations can be sufficient to express the infection intensity at the initial phase, on the same grounds as already demonstrated for the chronic phase. Furthermore, it is confirmed that the use of the number of eggs passed in the faeces as a tool to estimate the worm burden at the initial phase of schistosome infection is adequate.

Resumo:

Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; its application to electricity markets can therefore prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically, exhibiting the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and to define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
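A minimal sketch of the scenario-weighted decision step described above, with hypothetical bids, payoffs, and scenario probabilities (not MASCEM's actual data structures): each candidate action is scored by its expected payoff over the forecast scenarios, and the best one is chosen.

```python
def best_action(payoff, scenario_probs):
    """Pick the action maximizing expected payoff over forecast scenarios.
    payoff maps each action to its payoff in every scenario; scenario_probs
    would come from the competitor-behavior forecasting models."""
    expected = {action: sum(p * prob for p, prob in zip(row, scenario_probs))
                for action, row in payoff.items()}
    return max(expected, key=expected.get)

# toy example: three candidate bids, two forecast market scenarios
payoff = {"bid_low": [10.0, 4.0], "bid_mid": [7.0, 7.0], "bid_high": [2.0, 12.0]}
print(best_action(payoff, [0.6, 0.4]))  # "bid_low" (expected payoff 7.6)
```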

Resumo:

Immunosuppressed animals respond poorly to schistosomal chemotherapy, and a proper response can be restored by the administration of immune serum. The present study attempts to determine whether immunological stimulation would increase drug effectiveness. Swiss mice infected with 50 S. mansoni cercariae were later treated with complete Freund's adjuvant. Treatment with oxamniquine was given at 100 mg/kg b.w., 25 mg/kg b.w., and 50 mg/kg b.w., the last two doses representing a fourth and a half of the recommended curative dose. Appropriate controls for the drug, the adjuvant, and the infection were also studied. The serum level of anti-S. mansoni antibodies (ELISA) and the recovery of worms by perfusion of the portal vein system were the evaluated parameters. Statistical analysis of the results failed to reveal significant differences in worm recovery between adjuvant-stimulated animals treated with oxamniquine and any of the treated controls receiving the same amount of the drug. Although a total lack of immunity interferes with curative treatment, the usual immune response seems to be sufficient to allow for curative drug action in schistosomiasis and thus apparently does not need to be artificially stimulated.

Resumo:

Infection of Swiss/NIH mice with Leishmania major was compared with infection in isogenic resistant C57BL/6 and susceptible BALB/c mice. Swiss/NIH mice showed self-controlled lesions in the injected footpad. The production of high levels of interferon-γ (IFN-γ) and low levels of interleukin-4 (IL-4) by cells from these animals suggests that they mount a Th1-type immune response. The importance of the indigenous microbiota in the development of murine leishmaniasis was investigated by infecting germfree Swiss/NIH mice in the hind footpad with L. major and conventionalizing them after 3 weeks of infection. Lesions from conventionalized Swiss/NIH mice were significantly larger than those from conventional mice. Histopathological analysis of lesions from conventionalized animals showed abscesses of variable shapes and sizes and high numbers of parasitized macrophages. In the lesions from conventional mice, besides the absence of abscess formation, parasites were rarely observed. On the other hand, cells from conventional and conventionalized mice produced a similar Th1-type response characterized by high levels of IFN-γ and low levels of IL-4. In this study, we demonstrated that Swiss/NIH mice are resistant to L. major infection and that the absence of the normal microbiota at the beginning of infection significantly influenced the lesion size and the inflammatory response at the site of infection.

Resumo:

After canoeing became an Olympic discipline, there was a significant increase in studies and research on the biomechanics of the sport, which contributed to a reduction of competition times. However, few studies have focused on the forces developed and applied to the kayak through the footrest, creating an opportunity to develop devices for measuring them. The primary objective of this work is therefore to develop an experimental system capable of quantifying the forces generated on each side of the footrest (left and right). The system should be usable not only on a kayak ergometer but also on a sprint kayak, allowing the forces applied during the paddling cycle to be evaluated in both compression and traction. Its design was based on an existing footrest model, making it compatible with the most widely marketed competition kayaks and allowing future use on the water, requiring only a kayak on which to install the measurement system. The experimental system was tested on a kayak ergometer by seven subjects of distinct levels: six men (one Olympic medallist and five national-level athletes) and one woman (Portuguese national team level), who, among other tasks, performed a protocol of 60 s at a frequency of 75 strokes per minute, followed by an intense change of pace and force (sprint). After analysing the data obtained from each subject, we were able to identify several of their characteristics, such as: asymmetric effort of the lower limbs; heterogeneous use of the footrest strap; differences in the maximum forces applied between athletes (e.g., for an Olympic athlete the measured forces (min; max) were left foot (-444; 1087) N and right foot (-476; 1068) N). The results are not only very promising but also encouraging and consistent with previous studies, namely Begon et al. 2008 and Sturm 2010 and 2012. Finally, it can safely be stated that the objectives proposed for this force-measurement device were achieved. It characterizes the efforts developed on the footrest by each lower limb, with or without the support strap, giving coaches and athletes an insight, unknown to many, into the transmitted forces and their asymmetries. Ultimately, this knowledge will allow athletes to improve their sporting performance and facilitate training management, based on the main mechanical principles inherent to the athletes' movement in this Olympic sport.
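As an example of the asymmetry analysis such measurements enable, here is a simple percent asymmetry index (a generic symmetry metric, not necessarily the formula used in the thesis) applied to the reported peak pushing forces:

```python
def asymmetry_index(left_peak, right_peak):
    """Percent difference between left and right peak footrest forces,
    normalized by the larger of the two."""
    return 100.0 * abs(right_peak - left_peak) / max(left_peak, right_peak)

# peak pushing forces reported for the Olympic athlete (in newtons)
print(asymmetry_index(1087.0, 1068.0))  # ~1.75 % asymmetry
```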

Resumo:

Fractional dynamics is a growing topic in theoretical and experimental scientific research. A classical problem is the initialization required by fractional operators. While the problem is clear from the mathematical point of view, it constitutes a challenge in applied sciences. This paper addresses the problem of initialization and its effect upon dynamical system simulation when adopting numerical approximations. The results are compatible with system dynamics and clarify the formulation of adequate values for the initial conditions in numerical simulations.
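The initialization issue can be made concrete with the Grünwald–Letnikov numerical approximation: the fractional operator weighs the entire past of the signal, so discarding early samples (an incorrect initialization) visibly shifts the computed value. A sketch for f(t) = t, whose half-order derivative has the closed form 2*sqrt(t/pi); the sampling grid and truncation length are illustrative choices, not the paper's setup.

```python
def gl_fractional_derivative(samples, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha derivative at the
    time of the last sample, using all available history. Binomial weights
    follow the standard recursion w_k = w_{k-1} * (1 - (alpha + 1)/k)."""
    w, acc = 1.0, 0.0
    for k, f in enumerate(reversed(samples)):
        if k > 0:
            w *= 1.0 - (alpha + 1.0) / k
        acc += w * f
    return acc / h ** alpha

h = 0.01
f = [h * k for k in range(501)]                    # f(t) = t sampled on [0, 5]
full = gl_fractional_derivative(f, 0.5, h)         # close to 2*sqrt(5/pi)
short = gl_fractional_derivative(f[-50:], 0.5, h)  # early history discarded
print(full, short)  # truncating the memory noticeably biases the result
```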

Resumo:

Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double PL behavior, while, for more recent time periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, together with multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
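The classical versions of the two information measures named above can be sketched in a few lines (the generalized, fractional variants used in the paper are not reproduced here); the distributions are toy examples, not the accident data.

```python
from math import log

def shannon_entropy(p):
    """Classical Shannon entropy of a discrete distribution, in nats."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) between discrete distributions;
    note it is asymmetric: D(P||Q) != D(Q||P) in general."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(shannon_entropy(p))                        # ~1.03 nats
print(kl_divergence(p, q), kl_divergence(q, p))  # two different values
```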

Resumo:

New arguments are presented proving that successive (repeated) measurements have a memory and actually remember each other. The recognition of this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems and lead towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of Prony's decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points and allows comparison of the measured data in cases where a "best fit" model based on some specific physical principles is absent. As an example, we consider two X-ray diffractometers (referred to in the paper as A ("cheap") and B ("expensive")) that are used, after proper calibration, for measuring the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of Prony's decomposition can be used for comparison of the spectra recorded from the (A) and (B) X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition can be adapted to an "ideal" experiment without memory, while Prony's decomposition corresponds to a real measurement and can in this case be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal "filling") are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
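A minimal instance of Prony's decomposition, for a noiseless signal assumed to be the sum of two real decaying exponentials; this toy illustrates the linear-prediction step behind the method, not the paper's actual AFR fitting procedure.

```python
from math import sqrt

def prony_two_terms(x):
    """Minimal Prony decomposition for a signal assumed to be the sum of two
    real exponentials x_k = c1*r1**k + c2*r2**k. Least-squares linear
    prediction gives the characteristic polynomial z**2 - p*z - q, whose
    roots are the exponential bases r1 and r2."""
    n = len(x)
    # normal equations for x[k] = p*x[k-1] + q*x[k-2], k = 2..n-1
    a11 = sum(x[k - 1] ** 2 for k in range(2, n))
    a12 = sum(x[k - 1] * x[k - 2] for k in range(2, n))
    a22 = sum(x[k - 2] ** 2 for k in range(2, n))
    b1 = sum(x[k] * x[k - 1] for k in range(2, n))
    b2 = sum(x[k] * x[k - 2] for k in range(2, n))
    det = a11 * a22 - a12 * a12
    p = (b1 * a22 - a12 * b2) / det
    q = (a11 * b2 - a12 * b1) / det
    disc = sqrt(p * p + 4.0 * q)          # assumes two distinct real roots
    return sorted([(p + disc) / 2.0, (p - disc) / 2.0])

signal = [2.0 * 0.9 ** k + 1.0 * 0.5 ** k for k in range(20)]
print(prony_two_terms(signal))  # recovers the bases [0.5, 0.9]
```

With exact data the least-squares step recovers the recurrence coefficients exactly; with noise, more prediction coefficients and a numerically robust solver are needed.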

Resumo:

Although the issue of the out-of-plane response of unreinforced masonry structures under earthquake excitation is well known, with consensus among the research community, it is simultaneously one of the more complex and most neglected areas in the seismic assessment of existing buildings. Nonetheless, its characterization should be founded on solid knowledge of the phenomenon and on a complete understanding of the methodologies currently used to describe it. Based on this assumption, this article presents a general framework on the issue of the out-of-plane performance of unreinforced masonry structures, beginning with a brief introduction to the topic, followed by a compact state of the art in which the principal methodologies proposed to assess the out-of-plane behavior of unreinforced masonry structures are presented. Different analytical approaches are presented, namely force-based and displacement-based, complemented by a presentation of existing numerical tools for the purpose presented above. Moreover, the most relevant experimental campaigns carried out to reproduce the phenomenon are reviewed and briefly discussed.

Resumo:

Dissertation presented to obtain the Master's degree in Chemical and Biochemical Engineering

Resumo:

With the need to find an alternative to mechanical and welded joints, and at the same time to overcome some limitations of these traditional techniques, adhesive bonds can be used. Adhesive bonding is a permanent joining process that uses an adhesive to bond the components of a structure. Composite materials reinforced with fibres are becoming increasingly popular in many applications as a result of a number of competitive advantages. In the manufacture of composite structures, although advanced manufacturing techniques reduce the number of joints to a minimum, the use of connections is still required due to typical size limitations and to design, technological, and logistical aspects. Moreover, it is known that in many high-performance structures, joints between composite materials and light metals such as aluminium are required for purposes of structural optimization. This work deals with the experimental and numerical study of single lap joints (SLJ) bonded with a brittle (Nagase Chemtex Denatite XNRH6823) and a ductile adhesive (Nagase Chemtex Denatite XNR6852). These are applied to hybrid joints between aluminium (AL6082-T651) and carbon fibre reinforced plastic (CFRP; Texipreg HS 160 RM) adherends in joints with different overlap lengths (LO) under tensile loading. The Finite Element (FE) Method is used to perform detailed stress and damage analyses that explain the joints' behaviour, and the use of cohesive zone models (CZM) enables prediction of the joint strength and the creation of a simple and rapid design methodology. The use of numerical methods to simulate the behaviour of the joints can lead to savings of time and resources by optimizing the geometry and material parameters of the joints. The joints' strength and failure modes were highly dependent on the adhesive, and this behaviour was successfully modelled numerically. Using a brittle adhesive resulted in a negligible maximum load (Pm) improvement with LO.
The joints bonded with the ductile adhesive showed a nearly linear improvement of Pm with LO.
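The cohesive zone models mentioned above are commonly built on a bilinear (triangular) traction-separation law; the sketch below uses illustrative parameter values, not the measured properties of the XNRH6823 or XNR6852 adhesives.

```python
def bilinear_czm_traction(delta, k0, t_max, delta_f):
    """Bilinear (triangular) traction-separation law: a linear elastic branch
    up to the cohesive strength t_max, then linear softening down to zero
    traction at the failure separation delta_f."""
    delta_0 = t_max / k0          # separation at damage onset
    if delta <= delta_0:
        return k0 * delta         # undamaged elastic branch
    if delta >= delta_f:
        return 0.0                # complete failure of the cohesive element
    return t_max * (delta_f - delta) / (delta_f - delta_0)  # softening branch

# illustrative values: k0 in MPa/mm, t_max in MPa, separations in mm
print(bilinear_czm_traction(0.005, 1000.0, 10.0, 0.1))  # elastic branch
print(bilinear_czm_traction(0.055, 1000.0, 10.0, 0.1))  # softening branch
```

The area under the triangle is the fracture toughness, which is why brittle and ductile adhesives (small versus large delta_f) lead to such different strength-versus-overlap trends.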

Resumo:

Paracoccidioidomycosis (PCM), caused by the dimorphic fungus Paracoccidioides brasiliensis (Pb), is the most prevalent systemic mycosis in Latin America. There are few reports in the literature about the damage caused by the disease during pregnancy and its consequences for the fetuses and offspring. This study evaluated the implications of PCM during pregnancy for offspring and mothers in Wistar rats. Groups of rats were submitted to systemic Pb infection by intraperitoneal infusion and mated 30 days after the infection date. Immediately after birth, rats and neonates were sacrificed to obtain organs for standard histological examination, morphometric analysis, fungal recovery by plating (CFU), and determination of anti-Pb antibodies by ELISA. There were no stillbirths or miscarriages, and the fertility rate was 100%; however, the fetuses from infected pregnant rats had lower body and organ weights. The largest number of CFU was recovered from the organs of pregnant rats, and the pathological examination revealed more severe infection in the same group, as well as the largest numbers of granulomas and fungal fields. It can be concluded that PCM was more severe in the group of pregnant rats, with implications for the weight of the offspring.

Resumo:

ABSTRACT - The nursing profession is defined by Virginia Henderson (1966) as having the purpose of "assisting the individual, sick or well, in the performance of those activities that contribute to preserving health or recovering it, in such a way as to make the individual as independent as possible, that is, to regain his or her previous independence". Throughout human history, health care has been delivered in many ways and by diverse social actors. Health work is essential to human life and involves activities carried out by professionals and multidisciplinary teams who master the knowledge and techniques needed to assist individuals with health problems or at risk of becoming ill, in research, prevention, treatment, and rehabilitation activities. Nursing, like other professions, faces ever greater demands of efficiency, effectiveness, and versatility in its activity, with motivation playing a fundamental role in the professional's attitude, posture, and performance. This work describes a study of professional motivation in a population of Portuguese nurses at a private institution. The adapted version of a self-administered questionnaire represents several constructs involved in the motivational process in a professional context. This study aims to identify the dimensions of motivation most relevant to the nurses' motivational process, as well as their variation with personal, professional, and institutional characteristics. Based on the results obtained, it also aims to present suggestions that offer opportunities to transform the organizational environment of similar institutions.

Resumo:

With the objective of establishing biological and biochemical characteristics of a significant number of Trypanosoma cruzi strains from different geographical areas, 138 strains isolated from naturally infected humans, triatomines, or vertebrate hosts were studied; 120 were isolated from different areas of Brazil and 18 from other South and Central American countries. Inocula from triatomine or culture forms were injected into suckling Swiss mice, followed by passages into mice weighing 10 to 12 g. Biological characteristics and histopathological study permitted the inclusion of the strains into three types or biodemes: I, II, III. Isoenzymic analysis confirmed a correspondence between the biodemes and zymodemes: Type I and Z2b, Type II and Z2, Type III and Z1. Results showed the ubiquitous distribution of the several types of strains. The predominance of the same type and zymodeme in one geographical area was confirmed: Type II strains among the human cases from eastern Bahia and eastern Goiás; Type III strains from humans of northern Brazil and Central America and from silvatic vectors or vertebrates from other geographical areas. The biological types of strains correlate with different histopathological lesions with regard to cardiac involvement and neuronal lesions. These findings suggest that the biological behavior, together with isoenzyme patterns and the pathological picture in the vertebrate host, can be an important tool for establishing correlations between strain behavior and the clinico-pathological manifestations of Chagas' disease in different geographical areas.