939 results for Cross-entropy Method
Abstract:
A boundary integral equation is described for the prediction of acoustic propagation from a monofrequency coherent line source in a cutting with impedance boundary conditions onto surrounding flat impedance ground. The problem is stated as a boundary value problem for the Helmholtz equation and is subsequently reformulated as a system of boundary integral equations via Green's theorem. It is shown that the integral equation formulation has a unique solution at all wavenumbers. The numerical solution of the coupled boundary integral equations by a simple boundary element method is then described. The convergence of the numerical scheme is demonstrated experimentally. Predictions of A-weighted excess attenuation for a traffic noise spectrum are made illustrating the effects of varying the depth of the cutting and the absorbency of the surrounding ground surface.
Abstract:
The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of 'range sidelobes' which lead to spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. Therefore, by cross-correlating the echo time series from pairs of gates, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
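The gate-pair flagging step can be sketched numerically. In the snippet below, all names, the gate count, and the leakage fraction are illustrative assumptions, not values from the article: uncorrelated complex echo series stand in for distinct range gates, a fraction of one gate's echo is leaked into its neighbour, and the normalized cross-correlation magnitude flags the contaminated pair.

```python
import numpy as np

rng = np.random.default_rng(0)
n_gates, n_pulses = 8, 512

# Independent complex echo time series, one per range gate: meteorological
# echoes fluctuate at random and are uncorrelated between gates.
echoes = rng.standard_normal((n_gates, n_pulses)) \
       + 1j * rng.standard_normal((n_gates, n_pulses))

# Simulate range-sidelobe leakage: 30% (in amplitude) of gate 3 spills into gate 4.
echoes[4] += 0.3 * echoes[3]

def gate_correlation(x, y):
    """Magnitude of the normalized zero-lag cross-correlation of two gates."""
    return abs(np.vdot(x, y)) / np.sqrt(np.vdot(x, x).real * np.vdot(y, y).real)

# Uncorrelated gates give |rho| ~ 1/sqrt(n_pulses); flag pairs well above that.
threshold = 0.15
for i in range(n_gates - 1):
    rho = gate_correlation(echoes[i], echoes[i + 1])
    if rho > threshold:
        print(f"gates {i}->{i+1}: |rho| = {rho:.2f}, possible sidelobe contamination")
```

The article goes further and uses the correlation coefficients quantitatively to estimate the leaked power fraction and correct the reflectivity profile; the sketch above covers only the flagging step.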
Abstract:
This paper proposes two new tests for linear and nonlinear lead/lag relationships between time series based on the concepts of cross-correlations and cross-bicorrelations, respectively. The tests are then applied to a set of Sterling-denominated exchange rates. Our analysis indicates that there existed periods during the post-Bretton Woods era where the temporal relationship between different exchange rates was strong, although these periods have become less frequent over the past 20 years. In particular, our results demonstrate the episodic nature of the nonlinearity, and have implications for the speed of flow of information between financial series. The method generalises recently proposed tests for nonlinearity to the multivariate context.
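A minimal sketch of the two statistics (sample cross-correlation for linear lead/lag, cross-bicorrelation for nonlinear lead/lag) on synthetic series. The lag structure, coefficients, and the exact form of the bicorrelation estimator below are illustrative assumptions, not the Sterling exchange-rate analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal(n)
# y follows x with a linear lead of 2 periods plus noise (illustrative only)
y = np.empty(n)
y[:2] = rng.standard_normal(2)
y[2:] = 0.5 * x[:-2] + rng.standard_normal(n - 2)

def cross_corr(x, y, k):
    """Sample cross-correlation between x_t and y_{t+k} (linear lead/lag)."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    return float(np.mean(xs[: len(x) - k] * ys[k:]))

def cross_bicorr(x, y, r, s):
    """Sample cross-bicorrelation E[x_t x_{t+r} y_{t+s}] (nonlinear lead/lag)."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    m = len(x) - max(r, s)
    return float(np.mean(xs[:m] * xs[r : m + r] * ys[s : m + s]))

print(cross_corr(x, y, 2))  # picks up the built-in linear lag-2 dependence
print(cross_corr(x, y, 7))  # no dependence at this lag
```

For a Gaussian linear relationship like this one the cross-bicorrelations stay near zero; in the paper they become informative precisely when the lead/lag relationship is nonlinear.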
Abstract:
The i-motif structures are formed by oligonucleotides containing cytosine tracts under acidic conditions. The folding of the i-motif under physiological conditions is of great interest because of its biological role. In this study, we investigated the effect of the intra-strand cross-link on the stability of the i-motif structure. The 4-vinyl-substituted analog of thymidine (T-vinyl) was incorporated into the 5′-end of the human telomere complementary strand, which formed the intra-strand cross-link with the internal adenine. The intra-strand cross-linked i-motif displayed CD spectra similar to those of the natural i-motif at acidic pH, and was transformed into a random coil with increasing pH. The pH midpoint for the transition from the i-motif to random coil increased from pH 6.1 for the natural one to pH 6.8 for the cross-linked one. The thermodynamic parameters were obtained by measuring the thermal melting behaviors by CD and UV, and it was determined that the intra-strand cross-linked i-motif is stabilized due to a favorable entropy effect. Thus, this study has clearly indicated the validity of the intra-strand cross-linking for stabilization of the i-motif structure.
Abstract:
Introduction Human immunodeficiency virus (HIV) is a serious disease which can be associated with various activity limitations and participation restrictions. The aim of this paper was to describe how HIV affects the functioning and health of people within different environmental contexts, particularly with regard to access to medication. Method Four cross-sectional studies, three in South Africa and one in Brazil, had applied the International Classification of Functioning, Disability and Health (ICF) as a classification instrument to participants living with HIV. Each group was at a different stage of the disease. Only two groups had had continuing access to antiretroviral therapy. The existence of these descriptive sets enabled comparison of the disability experienced by people living with HIV at different stages of the disease and with differing access to antiretroviral therapy. Results Common problems experienced in all groups related to weight maintenance, with two-thirds of the sample reporting problems in this area. Mental functions presented the most problems in all groups, with sleep (50%, 92/185), energy and drive (45%, 83/185), and emotional functions (49%, 90/185) being the most affected. In those on long-term therapy, body image affected 93% (39/42) and was a major problem. The other groups reported pain as a problem, and those with limited access to treatment also reported mobility problems. Cardiopulmonary functions were affected in all groups. Conclusion Functional problems occurred in the areas of impairment and activity limitation in people at advanced stages of HIV, and more limitations occurred in the area of participation for those on antiretroviral treatment. The ICF provided a useful framework within which to describe the functioning of those with HIV and the impact of the environment. Given the wide spectrum of problems found, consideration could be given to a number of ICF core sets that are relevant to the different stages of HIV disease. 
(C) 2010 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
Abstract:
Objective Underreporting of energy intake is prevalent in food surveys, but there is controversy about which dietary assessment method produces greater underreporting rates. Our objective was to compare the validity of self-reported energy intake obtained by three dietary assessment methods against total energy expenditure (TEE) obtained by doubly labeled water (DLW) among Brazilian women. Design We used a cross-sectional study. Subjects/setting Sixty-five females aged 18 to 57 years (28 normal-weight, 10 overweight, and 27 obese) were recruited from two universities to participate. Main outcome measures TEE determined by DLW; energy intake estimated by three 24-hour recalls, a 3-day food record, and a food frequency questionnaire (FFQ). Statistical analyses performed Regression and analysis of variance with repeated measures compared TEE and energy intake values, as well as energy intake-to-TEE ratios and energy intake minus TEE differences, between dietary assessment methods. Bland and Altman plots were provided for each method. The χ² test compared the proportion of underreporters between the methods. Results Mean TEE was 2,622 kcal (standard deviation [SD] = 490 kcal), while mean energy intake was 2,078 kcal (SD = 430 kcal) for the diet recalls, 2,044 kcal (SD = 479 kcal) for the food record, and 1,984 kcal (SD = 832 kcal) for the FFQ (all energy intake values significantly differed from TEE; P<0.0001). Bland and Altman plots indicated great dispersion, negative mean differences between measurements, and wide limits of agreement. Obese subjects underreported more than normal-weight subjects in the diet recalls and in the food records, but not in the FFQ. Years of education, income, and ethnicity were associated with reporting accuracy. Conclusions The FFQ produced greater under- and overestimation of energy intake. Underreporting of energy intake is a serious and prevalent error in dietary self-reports provided by Brazilian women, as has been described in studies conducted in developed countries.
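The Bland and Altman comparison used in the study can be illustrated on simulated data. The bias and spread below are chosen to mimic the reported diet-recall means (2,622 vs 2,078 kcal), and the EI/TEE underreporting cut-off is a hypothetical choice for the sketch, not one stated in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 65
tee = rng.normal(2622, 490, n)        # TEE by doubly labeled water, kcal/day
ei = tee + rng.normal(-544, 400, n)   # self-reported intake, biased low on average

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement
diff = ei - tee
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)

# Proportion of underreporters under a hypothetical EI/TEE < 0.76 cut-off
under = float(np.mean(ei / tee < 0.76))

print(f"bias = {bias:.0f} kcal, 95% LoA = ({loa[0]:.0f}, {loa[1]:.0f}) kcal")
print(f"underreporters: {under:.0%}")
```

A wide gap between the limits of agreement, together with a clearly negative bias, is exactly the pattern the study reports for all three assessment methods.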
Abstract:
In the present study, the validation of an enzyme-linked immunosorbent assay (ELISA) for the serodiagnosis of canine brucellosis is described. Two different antigenic extracts, obtained by heat or ultrasonic homogenization of microbial antigens from a wild isolate of Brucella canis bacteria, were compared by ELISA and Western blot (WB). A total of 145 canine sera were used to define the sensitivity, specificity and accuracy of the ELISA as follows: (1) sera from 34 animals with natural B. canis infection, confirmed by blood culture and PCR, as well as 51 serum samples from healthy dogs with negative results by the agar-gel immunodiffusion (AGID) test for canine brucellosis, were used as the control panel for B. canis infection; and (2) to scrutinize the possibility of cross-reactions with other common dog infections in the same geographical area in Brazil, 60 serum samples from dogs harboring known infections by Leptospira sp., Ehrlichia canis, canine distemper virus (CDV), Neospora caninum, Babesia canis and Leishmania chagasi (10 in each group) were included in the study. The ELISA using the heat-soluble bacterial extract (HE-antigen) as antigen showed the best values of sensitivity (91.18%), specificity (100%) and accuracy (96.47%). In the WB analyses, the HE-antigen showed no cross-reactivity with sera from dogs with different infections, while the B. canis sonicate had various protein bands identified by those sera. The performance of the ELISA standardized with the heat-soluble B. canis antigen indicates that this assay can be used as a reliable and practical method to confirm infection by this microorganism, as well as a tool for seroepidemiological studies. (C) 2010 Elsevier Ltd. All rights reserved.
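The reported diagnostic figures follow directly from the panel composition; the sketch below recomputes them, with the true-positive count (31 of 34) inferred here from the stated 91.18% sensitivity rather than given explicitly in the abstract.

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

# 34 culture/PCR-confirmed infected dogs, 51 AGID-negative healthy dogs;
# tp = 31 is inferred from the reported 91.18% sensitivity (assumption).
tp, fn = 31, 3
tn, fp = 51, 0

print(f"sensitivity: {sensitivity(tp, fn):.2%}")       # 91.18%
print(f"specificity: {specificity(tn, fp):.2%}")       # 100.00%
print(f"accuracy:    {accuracy(tp, tn, fp, fn):.2%}")  # 96.47%
```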
Abstract:
Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern in these tools, i.e., the classification of the temporal organization of a data set might indicate a relatively less ordered series in relation to another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might present incorrect results due to this lack of consistency. In this study, we present a method which gains consistency by applying ApEn repeatedly over a wide range of combinations of window lengths and matching error tolerances. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noises). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn correctly does. In order to validate the tool we performed shuffled and surrogate data analyses. Statistical analysis confirmed the consistency of the method. (C) 2008 Elsevier Ltd. All rights reserved.
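The idea of sweeping ApEn over a parameter grid can be sketched as follows. The grid choices (m in {1, 2, 3}, tolerances as fractions of the series SD) and the aggregation by summation are illustrative assumptions about the vApEn construction, not the exact grid used in the paper.

```python
import numpy as np

def apen(x, m, r):
    """Approximate entropy of x with window length m and tolerance r,
    counting self-matches as in the standard ApEn definition."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def phi(m):
        win = np.array([x[i : i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of windows
        dist = np.max(np.abs(win[:, None, :] - win[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)  # fraction of windows within tolerance
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def vapen(x, ms=(1, 2, 3), r_fracs=(0.1, 0.2, 0.3, 0.4, 0.5)):
    """Volumetric ApEn (sketch): accumulate ApEn over a grid of window
    lengths and tolerances expressed as fractions of the series SD."""
    sd = np.std(x)
    return sum(apen(x, m, f * sd) for m in ms for f in r_fracs)

t = np.arange(300)
sine = np.sin(2 * np.pi * t / 25)                       # highly ordered
noise = np.random.default_rng(2).standard_normal(300)   # disordered

print(vapen(sine) < vapen(noise))  # the ordered series accumulates less entropy
```

A single (m, r) pair can rank two series inconsistently; summing over the whole grid is what gives the volumetric variant its robustness.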
Abstract:
This paper presents an automatic method to detect and classify weathered aggregates by assessing changes of colors and textures. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms commonly considered for classification purposes. The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
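As an illustration of the entropy feature, the snippet below computes the Shannon entropy of a gray-level histogram: a uniform patch scores near zero while a heavily textured patch approaches the 8-bit maximum. The patch construction is synthetic, not the aggregate images of the paper.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's gray-level histogram, usable
    as a simple texture descriptor for a downstream classifier."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(np.sum(p * np.log2(1 / p)))

rng = np.random.default_rng(3)
flat = np.full((64, 64), 128)              # single gray level: minimal texture
speckled = rng.integers(0, 256, (64, 64))  # heavy speckle: near-maximal texture

print(image_entropy(flat))      # low entropy
print(image_entropy(speckled))  # close to 8 bits
```

Such scalar features (possibly computed per color channel or per local window) are the kind of inputs a neural network classifier like the ones in the paper could consume.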
Abstract:
We report here the preparation of supported palladium nanoparticles (NPs) stabilized by pendant phosphine groups, obtained by reacting a palladium complex containing the ligand 2-(diphenylphosphino)benzaldehyde with an amino-functionalized silica surface. The Pd nanocatalyst is active for the Suzuki cross-coupling reaction, avoiding any addition of other sources of phosphine ligands. The Pd intermediates and Pd NPs were characterized by solid-state nuclear magnetic resonance and transmission electron microscopy techniques. The synthetic method was also applied to prepare magnetically recoverable Pd NPs, leading to a catalyst that could be reused for up to 10 recycles. In summary, we combined the advantages of heterogeneous catalysis, magnetic separation, and the enhanced catalytic activity of palladium promoted by phosphine ligands to synthesize a new catalyst for Suzuki cross-coupling reactions. The Pd NP catalyst prepared on the phosphine-functionalized support was more active and selective than a similar Pd NP catalyst prepared on an amino-functionalized support. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
Difficulties in cross-section measurements at very low energies, when charged particles are involved, led to the development of some indirect methods. The Trojan horse method (THM) allows us to bypass the Coulomb effects and has been successfully applied to several reactions of astrophysical interest. A brief review of the THM applications is reported together with some of the most recent results.
Abstract:
Nuclear (p,α) reactions destroying the so-called "light elements" lithium, beryllium and boron have been widely studied in the past, mainly because of their role in understanding some astrophysical phenomena, i.e. mixing phenomena occurring in young F-G stars [1]. Such mechanisms transport the surface material down to the region close to the nuclear destruction zone, where typical temperatures of the order of ~10⁶ K are reached. The corresponding Gamow energy, E₀ = 1.22 (Zx² ZX² T₆²)^(1/3) keV [2], is ~10 keV for the "boron case", obtained by setting Zx = 1, ZX = 5 and T₆ = 5 in the previous formula. Direct measurements of the two ¹¹B(p,α₀)⁸Be and ¹⁰B(p,α)⁷Be reactions in this energy region are difficult to perform, mainly because of the combined effects of Coulomb barrier penetrability and electron screening [3]. The indirect Trojan Horse Method (THM) [4-6] allows one to extract the two-body reaction cross section of interest for astrophysics without extrapolation procedures. Due to the THM formalism, the extracted indirect data have to be normalized to the available direct ones at higher energies, implying that the method is a complementary tool for solving some still open questions in both nuclear and astrophysical issues [7-12].
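Plugging the quoted numbers into the Gamow-energy formula reproduces the ~10 keV figure; note that the abstract's form of the formula, without a reduced-mass factor, is used here exactly as given.

```python
def gamow_energy_keV(z1, z2, t6):
    """Gamow peak energy in keV, using the formula as quoted in the abstract:
    E0 = 1.22 * (Zx^2 * ZX^2 * T6^2)**(1/3), with T6 the temperature in 10^6 K."""
    return 1.22 * (z1**2 * z2**2 * t6**2) ** (1 / 3)

# The "boron case": a proton (Z=1) on boron (Z=5) at T ~ 5e6 K
print(gamow_energy_keV(1, 5, 5))  # ~10.4 keV, i.e. about 10 keV
```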
Abstract:
Direct measurements in the last decades have highlighted a new problem related to the lowering of the Coulomb barrier between the interacting nuclei due to the presence of "electron screening" in laboratory measurements. It has been systematically observed that the electronic cloud around the interacting ions in measurements of nuclear reaction cross sections at astrophysical energies gives rise to an enhancement of the astrophysical S(E) factor as lower and lower energies are explored [1]. Moreover, at present such an effect is not well understood, as the value of the screening potential extracted from these measurements is higher than the upper limit of theoretical predictions (the adiabatic limit). On the other hand, the electron screening potential in laboratory measurements is different from that occurring in stellar plasmas; thus the quantity of interest in astrophysics is the so-called "bare nucleus cross section". This quantity can only be extrapolated from direct measurements. These are the reasons that led to a considerable growth of interest in indirect measurement techniques, and in particular the Trojan Horse Method (THM) [2,3]. Results concerning bare nucleus cross section measurements will be shown for several cases of astrophysical interest. In those cases the screening potential evaluated by means of the THM will be compared with the adiabatic limit and with results arising from extrapolation of direct measurements.
Abstract:
The ²H(d,p)³H and ²H(d,n)³He reactions have been indirectly studied by means of the Trojan Horse Method applied to the quasi-free ²H(³He,p³H)¹H and ²H(³He,n³He)¹H reactions at 18 MeV beam energy. This is the first experiment where the spectator (here ¹H) has been detected in coincidence with the charged participant, avoiding the limitations of standard neutron detectors. The d-d relative energy has been measured from 1.5 MeV down to 2 keV, at center-of-mass angles from 40° to 170°. Indirect angular distributions are compared with the direct behaviour in the overlapping regions.
Abstract:
The bare nucleus S(E) factors for the ²H(d,p)³H and ²H(d,n)³He reactions have been measured for the first time via the Trojan Horse Method off the proton in ³He, from 1.5 MeV down to 2 keV. This range overlaps with the relevant region for Standard Big Bang Nucleosynthesis as well as with the thermal energies of future fusion reactors and of deuterium burning in the pre-main-sequence phase of stellar evolution. This is the first pioneering experiment in the quasi-free regime where the charged spectator is detected. Both the energy dependence and the absolute value of the S(E) factors deviate by more than 15% from available direct data, with new S(0) values of 57.4 +/- 1.8 MeV b for ³H + p and 60.1 +/- 1.9 MeV b for ³He + n. None of the existing fitting curves is able to provide the correct slope of the new data over the full range, thus calling for a revision of the theoretical description. This has consequences for the calculation of the reaction rates, with more than a 25% increase at the temperatures of future fusion reactors. (C) 2011 Elsevier B.V. All rights reserved.