968 results for Information Matrix
Abstract:
Information to guide decision making is especially urgent in human-dominated landscapes in the tropics, where urban and agricultural frontiers are still expanding in an unplanned manner. Nevertheless, most studies that have investigated the influence of landscape structure on species distribution have not considered the heterogeneity of the altered habitats of the matrix, which is usually high in human-dominated landscapes. Using the distribution of small mammals in forest remnants and in the four main altered habitats of an Atlantic forest landscape, we investigated (1) how the explanatory power of models describing species distribution in forest remnants varies between landscape structure variables that do or do not incorporate matrix quality, and (2) the importance of spatial scale for analyzing the influence of landscape structure. We used standardized sampling in remnants and altered habitats to generate two indices of habitat quality, corresponding to the abundance and to the occurrence of small mammals. For each remnant, we calculated habitat quantity and connectivity at different spatial scales, with and without considering the quality of the surrounding habitats. Incorporating matrix quality increased model explanatory power across all spatial scales for half of the species that occurred in the matrix, but only when the distance between habitat patches (connectivity) was taken into account. These connectivity models were also less affected by spatial scale than habitat quantity models. The few consistent responses to variation in spatial scale indicate that, despite their small size, small mammals perceive landscape features at large spatial scales. The matrix quality index based on species occurrence performed as well as or better than the index based on species abundance. The results indicate the importance of the matrix for the dynamics of fragmented landscapes and suggest that relatively simple indices can improve our understanding of species distribution and could be applied in modeling, monitoring, and managing complex tropical landscapes.
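As a hedged illustration of the kind of connectivity measure described above, the sketch below computes a patch connectivity index in Python that optionally weights the surrounding habitat by a quality score. The exponential distance kernel, the alpha parameter, and all variable names are assumptions chosen for illustration, not the authors' exact formulation.

import numpy as np

def connectivity(areas, quality, dist, alpha=0.005, use_quality=True):
    """Connectivity of each focal patch: sum over the other patches of
    exp(-alpha * distance) * area_j, optionally weighted by a habitat or
    matrix quality score q_j in [0, 1] (Hanski-style form, assumed here)."""
    n = len(areas)
    q = np.asarray(quality, float) if use_quality else np.ones(n)
    s = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                s[i] += np.exp(-alpha * dist[i, j]) * areas[j] * q[j]
    return s

# toy example with three habitat patches (areas in ha, distances in metres)
areas = np.array([2.0, 0.5, 1.2])
quality = np.array([1.0, 0.4, 0.7])
dist = np.array([[0.0, 120.0, 300.0],
                 [120.0, 0.0, 250.0],
                 [300.0, 250.0, 0.0]])
print(connectivity(areas, quality, dist))                      # with matrix quality
print(connectivity(areas, quality, dist, use_quality=False))   # structure only

With use_quality=False the same function reduces to a purely structural connectivity measure, which is the comparison the abstract describes.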
Abstract:
A time-efficient optical model is proposed for GATE simulation of a LYSO scintillation matrix coupled to a photomultiplier. The purpose is to avoid the excessively long computation time required when the optical processes are activated in GATE. The usefulness of the model is demonstrated by comparing the simulated and experimental energy spectra obtained with the dual planar head equipment for dosimetry with a positron emission tomograph (DoPET). The procedure to apply the model is divided into two steps. First, a simplified simulation of a single crystal element of DoPET is used to fit an analytic function that models the optical attenuation inside the crystal. In the second step, the model is employed to calculate the influence of this attenuation on the energy registered by the tomograph. Using the proposed optical model is around three orders of magnitude faster than a GATE simulation with optical processes enabled. Good agreement was found between the experimental data and the data simulated with the optical model. The results indicate that optical interactions inside the crystal elements play an important role in the energy resolution and induce a considerable degradation of the spectral information acquired by DoPET. Finally, the same approach employed by the proposed optical model could be useful for simulating a scintillation matrix coupled to a photomultiplier using a single or dual readout scheme.
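The first step described above (fitting an analytic attenuation function to a simplified single-crystal simulation) could look roughly like the sketch below. The exponential form, the parameter values, and the synthetic data are assumptions for illustration; the abstract does not specify the functional form actually used.

import numpy as np
from scipy.optimize import curve_fit

def attenuation(depth_mm, a, mu):
    """Assumed analytic form: fraction of scintillation light collected as a
    function of the interaction depth inside the crystal element."""
    return a * np.exp(-mu * depth_mm)

# hypothetical light-collection data from a simplified single-crystal simulation
rng = np.random.default_rng(1)
depth = np.linspace(0.0, 20.0, 11)                        # mm along the crystal
collected = 0.85 * np.exp(-0.06 * depth) + rng.normal(0, 0.01, depth.size)

(a_fit, mu_fit), _ = curve_fit(attenuation, depth, collected, p0=(1.0, 0.05))
print(f"fitted amplitude = {a_fit:.3f}, attenuation coefficient = {mu_fit:.4f} /mm")

# in the second step, attenuation(depth, a_fit, mu_fit) would scale the energy
# deposited in each event, replacing full optical-photon tracking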
Resumo:
NMR quantum information processing studies rely on the reconstruction of the density matrix representing the so-called pseudo-pure states (PPS). An initially pure part of a PPS state undergoes unitary and non-unitary (relaxation) transformations during a computation process, causing a ""loss of purity"" until the equilibrium is reached. Besides, upon relaxation, the nuclear polarization varies in time, a fact which must be taken into account when comparing density matrices at different instants. Attempting to use time-fixed normalization procedures when relaxation is present, leads to various anomalies on matrices populations. On this paper we propose a method which takes into account the time-dependence of the normalization factor. From a generic form for the deviation density matrix an expression for the relaxing initial pure state is deduced. The method is exemplified with an experiment of relaxation of the concurrence of a pseudo-entangled state, which exhibits the phenomenon of sudden death, and the relaxation of the Wigner function of a pseudo-cat state.
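For orientation, the conventional pseudo-pure-state decomposition with a time-dependent polarization factor can be written as below (in LaTeX). This is the standard textbook form, assumed here for illustration, and not necessarily the exact expression derived in the paper.

% conventional PPS decomposition with a time-dependent polarization epsilon(t)
\rho(t) = \frac{1-\epsilon(t)}{2^{n}}\,\mathbb{1} + \epsilon(t)\,\rho_{1}(t),
\qquad
\Delta\rho(t) = \rho(t) - \frac{\mathbb{1}}{2^{n}}
             = \epsilon(t)\left(\rho_{1}(t) - \frac{\mathbb{1}}{2^{n}}\right)
% Comparing density matrices at different instants therefore requires
% normalizing the measured deviation by epsilon(t), not by a fixed epsilon(0).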
Abstract:
Amazonian oils and fats display unique triacylglycerol (TAG) profiles and, because of their economic importance as renewable raw materials and their use by the cosmetic and food industries, are often subject to adulteration and forgery. Representative samples of these oils (andiroba, Brazil nut, buriti, and passion fruit) and fats (cupuacu, murumuru, and ucuba) were characterized without pre-separation or derivatization via dry (solvent-free) matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Characteristic TAG profiles were obtained for each oil and fat. Dry MALDI-TOF MS provides typification and direct, detailed information, via TAG profiles, on their variable combinations of fatty acids. A database built from these spectra could be developed and used for fast and reliable typification, application screening, and quality control.
Abstract:
Random-effect models have been widely applied in many fields of research. However, models with uncertain design matrices for the random effects have been little investigated. In some applications with such problems, an expectation method has been used for simplicity, but this method does not include the extra information about the uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components of the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated by the proposed algorithm, whereas it was highly underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm could be developed further to infer the 'model-based' best design matrix and the corresponding best estimates.
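The abstract does not give the algorithm in detail. Purely as an illustration of the Monte Carlo ingredient, the sketch below approximates the marginal likelihood of a toy linear mixed model by averaging over design matrices sampled from known assignment probabilities, and maximizes it numerically. This is not the authors' Newton-Raphson-based EM algorithm, and every name and setting is illustrative.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# toy linear mixed model y = X b + Z u + e in which the true Z is uncertain:
# each observation belongs to one of k groups with known assignment
# probabilities (all values are illustrative)
n, k = 60, 4
X = np.column_stack([np.ones(n), rng.normal(size=n)])
probs = rng.dirichlet(np.ones(k), size=n)
Z_true = np.eye(k)[np.array([rng.choice(k, p=p) for p in probs])]
y = X @ np.array([1.0, 2.0]) + Z_true @ rng.normal(0, np.sqrt(1.5), k) \
    + rng.normal(0, np.sqrt(0.5), n)

# pre-sample candidate design matrices once (common random numbers)
Z_draws = [np.eye(k)[np.array([rng.choice(k, p=p) for p in probs])]
           for _ in range(30)]

def neg_mc_loglik(log_s2):
    """Monte Carlo approximation of the marginal likelihood, averaging over
    the sampled design matrices; variance components are on the log scale."""
    s2u, s2e = np.exp(log_s2)
    liks = []
    for Z in Z_draws:
        V = s2u * Z @ Z.T + s2e * np.eye(n)
        beta = np.linalg.solve(X.T @ np.linalg.solve(V, X),
                               X.T @ np.linalg.solve(V, y))
        liks.append(multivariate_normal.pdf(y, X @ beta, V))
    return -np.log(np.mean(liks) + 1e-300)

res = minimize(neg_mc_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
s2u_hat, s2e_hat = np.exp(res.x)
print(f"estimated variance components: sigma_u^2 = {s2u_hat:.2f}, "
      f"sigma_e^2 = {s2e_hat:.2f}")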
Abstract:
Background: There is emerging evidence that the physical environment is important for health, quality of life and care, but there is a lack of valid instruments to assess health care environments. The Sheffield Care Environment Assessment Matrix (SCEAM), developed in the United Kingdom, provides a comprehensive assessment of the physical environment of residential care facilities for older people. This paper reports on the translation and adaptation of SCEAM for use in Swedish residential care facilities for older people, including information on its validity and reliability. Methods: SCEAM was translated into Swedish and back-translated into English, and assessed for its relevance by experts using the content validity index (CVI) together with qualitative data. After modification, the validity assessments were repeated and followed by test-retest and inter-rater reliability tests in six units within a Swedish residential care facility that varied in terms of their environmental characteristics. Results: Translation and back-translation identified linguistic and semantic issues. The first content validity analysis showed that more than one third of the items had item-CVI (I-CVI) values below the critical value of 0.78. After modifying the instrument, the second content validity analysis resulted in I-CVI scores above 0.78, the suggested criterion for excellent content validity. Test-retest reliability showed high stability (96% and 95% for two independent raters, respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on two separate rating occasions). Kappa values were very good for test-retest (κ = 0.903 and 0.869) and inter-rater reliability (κ = 0.851 and 0.832). Conclusions: Adapting an instrument to a domestic context is a complex and time-consuming process, requiring an understanding of the culture in which the instrument was developed and of the one in which it is to be used. A team including the instrument's developers, translators, and researchers is necessary to ensure a valid translation and adaptation. This study showed preliminary validity and reliability evidence for the Swedish version (S-SCEAM) when used in a Swedish context. Further, we believe that the S-SCEAM has improved compared to the original instrument and suggest that it can be used as a foundation for future developments of the SCEAM model.
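For reference, the item-level content validity index used above is commonly computed as the proportion of experts who rate an item 3 or 4 on a 4-point relevance scale. The short sketch below applies that standard definition against the 0.78 criterion, using entirely hypothetical ratings; it is not the study's data or analysis code.

import numpy as np

def item_cvi(ratings):
    """I-CVI: proportion of experts rating the item 3 or 4 on a 4-point
    relevance scale (standard content-validity-index definition assumed)."""
    return np.mean(np.asarray(ratings) >= 3)

# hypothetical ratings from six experts for three items
items = {"item_01": [4, 4, 3, 4, 3, 4],
         "item_02": [4, 2, 3, 2, 3, 4],
         "item_03": [4, 4, 4, 4, 4, 3]}

for name, ratings in items.items():
    cvi = item_cvi(ratings)
    verdict = "keep" if cvi >= 0.78 else "revise"
    print(f"{name}: I-CVI = {cvi:.2f} -> {verdict}")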
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A study was conducted in a 1566 ha watershed situated in the Capivara River basin, municipality of Botucatu, São Paulo State, Brazil. This environment is fragile and can be subject to different forms of negative impact, among them soil erosion by water. The main objective of the research was to develop a methodology for assessing soil erosion fragility at the various watershed positions, using the geographic information system ILWIS version 3.3 for Windows. An impact model was created to generate the soil erosion fragility plan, based on four indicators of fragility to water erosion: land use and cover, slope, percentage of fine sand in the soil, and accumulated water flow. Thematic plans were generated in a geographic information system (GIS) environment. First, all the variables except land use and cover were described by continuous numerical plans in a raster structure. The land use and cover plan was also represented by numerical values associated with the weights attributed to each class, starting from a pairwise comparison matrix and using the analytic hierarchy process. A final field check was done to record evidence of erosive processes in the areas indicated as presenting the highest levels of fragility, i.e., sites with steep slopes, a high percentage of fine sand in the soil, a tendency to accumulate surface water flow, and pastureland. The methodology used to diagnose the environmental problems of the study area can be employed in places with similar relief, soil, and climatic conditions.
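The class weights mentioned above come from the standard analytic hierarchy process. The sketch below shows the usual principal-eigenvector computation of the weights and the consistency ratio for a 4x4 pairwise comparison matrix; the matrix values are hypothetical, not those used in the study.

import numpy as np

criteria = ["land use/cover", "slope", "fine sand %", "accumulated flow"]
A = np.array([[1,   3,   5,   3],
              [1/3, 1,   3,   1],
              [1/5, 1/3, 1,   1/3],
              [1/3, 1,   3,   1]], dtype=float)

# priority weights = normalized principal eigenvector of the comparison matrix
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# consistency ratio (Saaty's random index RI = 0.90 for a 4x4 matrix)
lambda_max = eigvals[k].real
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.90

for name, w in zip(criteria, weights):
    print(f"{name}: weight = {w:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")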
Abstract:
Structural Health Monitoring (SHM) has diverse potential applications, and many groups work on the development of tools and techniques for monitoring structural performance. These systems use arrays of sensors and can be integrated with remote or local computers. Several different approaches can be used to obtain information about the existence, location and extent of faults through non-destructive tests. In this paper, an experimental technique for damage location based on an observability Gramian matrix is proposed. The dynamic properties of the structure are identified from experimental data using the eigensystem realization algorithm (ERA). Experimental tests were carried out on a structure by varying the mass of some of its elements. Output signals were obtained using accelerometers.
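As a minimal sketch of the quantity named above, the code below computes the discrete-time observability Gramian of a state-space pair (A, C), such as ERA would identify, by solving the discrete Lyapunov equation. The model values and the use of the smallest singular value as a damage-sensitive feature are assumptions for illustration, not the paper's exact procedure.

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# hypothetical discrete-time state-space model (A, C), e.g. as identified by
# ERA from accelerometer data; the values are illustrative only
A = np.array([[0.95, 0.10],
              [-0.10, 0.90]])
C = np.array([[1.0, 0.0]])

# discrete observability Gramian Wo, solving  A' Wo A - Wo + C' C = 0
Wo = solve_discrete_lyapunov(A.T, C.T @ C)

# comparing Gramians identified before and after a mass change (e.g. via their
# smallest singular value) is one way to build a damage-sensitive feature
print("observability Gramian:\n", Wo)
print("smallest singular value:", np.linalg.svd(Wo, compute_uv=False).min())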
Abstract:
Although it has already been shown that enamel matrix derivative (Emdogain®) promotes periodontal regeneration in the treatment of intrabony periodontal defects, there is little information concerning its regenerative capacity in cases of delayed tooth replantation. To evaluate the alterations in the periodontal healing of replanted teeth after the use of Emdogain®, the central incisors of 24 Wistar rats (Rattus norvegicus albinus) were extracted and left on the bench for 6 h. Thereafter, the dental papilla and the enamel organ of each tooth were sectioned for pulp removal by the retrograde route, and the canal was irrigated with 1% sodium hypochlorite. The teeth were assigned to two groups: in group I, the root surface was treated with 1% sodium hypochlorite for 10 min (changing the solution every 5 min), rinsed with saline for 10 min and immersed in 2% acidulated-phosphate sodium fluoride for 10 min; in group II, the root surfaces were treated in the same way, except that Emdogain® was applied instead of sodium fluoride. The teeth were filled with calcium hydroxide (in group II, right before Emdogain® was applied) and replanted. All animals received antibiotic therapy. The rats were killed by anesthetic overdose 10 and 60 days after replantation. The pieces containing the replanted teeth were removed, fixed, decalcified and paraffin-embedded. Semi-serial 6-μm-thick sections were obtained and stained with hematoxylin and eosin for histologic and histometric analyses. The use of 2% acidulated-phosphate sodium fluoride resulted in more areas of replacement resorption. The use of Emdogain® resulted in more areas of ankylosis and was therefore not able to prevent dentoalveolar ankylosis. It may be concluded that neither 2% acidulated-phosphate sodium fluoride nor Emdogain® was able to prevent root resorption in delayed tooth replantation in rats.
Measurement of the top quark mass in the lepton plus jets final state with the matrix element method
Abstract:
We present a measurement of the top quark mass with the matrix element method in the lepton+jets final state. As the energy scale for calorimeter jets represents the dominant source of systematic uncertainty, the matrix element likelihood is extended by an additional parameter, defined as a global multiplicative factor applied to the standard energy scale. The top quark mass is obtained from a fit that yields the combined statistical and systematic jet energy scale uncertainty. Using a data set of 0.4 fb⁻¹ taken with the D0 experiment in Run II of the Fermilab Tevatron Collider, the mass of the top quark measured using topological information is m_top(ℓ+jets, topo) = 169.2 +5.0/−7.4 (stat+JES) +1.5/−1.4 (syst) GeV, and when information about identified b jets is included, m_top(ℓ+jets, b-tag) = 170.3 +4.1/−4.5 (stat+JES) +1.2/−1.8 (syst) GeV. The measurements yield a jet energy scale consistent with the reference scale.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This study presents a novel NO sensor made of a spin trap (an iron(II)-diethyldithiocarbamate complex, FeDETC) incorporated in a latex rubber matrix, which works as a trap for NO that is detectable by electron paramagnetic resonance (EPR). We explored the optimization of the sensor by systematically changing two fabrication parameters: the polymerization temperature of the latex rubber matrix and the FeDETC concentration inside the matrix. The sensor was prepared at four different temperatures (4, 10, 20 and 40 °C), and the FeDETC concentration was varied from 0.975 to 14.8 mM. We observed a variation of the EPR signals from sensors prepared under the different conditions, and the EPR response of the sensor was highly stable, lasting 40 days at room temperature. The best sensor was made with a latex rubber matrix polymerized at 10 °C and a FeDETC concentration of 14.8 mM. In vivo tests showed good biocompatibility of the sensor.
Abstract:
Research on the micro-structural characterization of metal-matrix composites uses X-ray computed tomography to collect information about the interior features of samples, in order to elucidate the properties they exhibit. The raw tomographic data need several computational processing steps to eliminate noise and interference. Our experience with a program (Tritom) that handles these tasks has shown that in some cases the processing steps take a very long time, and that it is not easy for a materials science specialist to interact with Tritom in order to define the most adequate parameter values and the proper sequence of the available processing steps. To ease the use of Tritom, a system based on the OpenDX visualization system was built to address these issues. OpenDX's visualization facilities are a great benefit to Tritom. The visual programming environment of OpenDX allows easy definition of a sequence of processing steps, thus fulfilling the requirement of easy use by non-specialists in computer science. The possibility of incorporating external modules into a visual OpenDX program also allows researchers to reduce the long execution time of some processing steps. The longer processing steps of Tritom have been parallelized for two different types of hardware architecture (message-passing and shared-memory); the corresponding parallel programs can easily be incorporated into a sequence of processing steps defined in an OpenDX program. The benefits of the system are illustrated through an example in which the tool is applied to study the sensitivity to crushing, and its implications, of the reinforcements used in a functionally graded syntactic metallic foam.
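Tritom's parallel modules are not shown in the abstract. Purely as an illustration of the slice-parallel, shared-memory pattern described, the sketch below parallelizes a hypothetical per-slice denoising step in Python with multiprocessing; Tritom itself is not a Python program, and the filter choice is an assumption.

import numpy as np
from multiprocessing import Pool
from scipy.ndimage import median_filter

def denoise_slice(slice_2d):
    """Apply a median filter to one tomographic slice (illustrative filter)."""
    return median_filter(slice_2d, size=5)

if __name__ == "__main__":
    volume = np.random.rand(64, 256, 256).astype(np.float32)  # synthetic volume
    with Pool(processes=4) as pool:
        filtered = np.stack(pool.map(denoise_slice, list(volume)))
    print("filtered volume shape:", filtered.shape)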