116 results for "Partial autocorrelations spectral density"
Abstract:
Radioiodinated recombinant human interferon-gamma (IFN-γ) bound to human monocytes, U937, and HL60 cells in a specific, saturable, and reversible manner. At 4 °C, the different cell types bound 3,000–7,000 molecules of IFN-γ, and binding was of comparable affinity (Ka = 4–12 × 10⁸ M⁻¹). No change in the receptor was observed after monocytes differentiated to macrophages or when the cell lines were pharmacologically induced to differentiate. The functional relevance of the receptor was validated by the demonstration that receptor occupancy correlated with induction of Fc receptors on U937. Binding studies using U937 permeabilized with digitonin showed that only 46% of the total receptor pool was expressed at the cell surface. The receptor appears to be a protein, since treatment of U937 with trypsin or pronase reduced ¹²⁵I-IFN-γ binding by 87 and 95%, respectively. At 37 °C, ligand was internalized, since 32% of the cell-associated IFN-γ became resistant to trypsin stripping. Monocytes degraded ¹²⁵I-IFN-γ into trichloroacetic acid-soluble counts at 37 °C but not at 4 °C, at an approximate rate of 5,000 molecules/cell per h. The receptor was partially characterized by SDS-polyacrylamide gel electrophoresis analysis of purified U937 membranes that had been incubated with ¹²⁵I-IFN-γ. After cross-linking, the receptor-ligand complex migrated as a broad band that displayed an Mr of 104,000 ± 18,000 at the top and 84,000 ± 6,000 at the bottom. These results thereby define and partially characterize the IFN-γ receptor of human mononuclear phagocytes.
Abstract:
We propose new methods for evaluating predictive densities. The methods include Kolmogorov–Smirnov and Cramér–von Mises-type tests for the correct specification of predictive densities, robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
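Tests of this kind build on the probability integral transform (PIT): if the predictive density is correctly specified, the PITs of the realized outcomes are i.i.d. U(0,1), and a Kolmogorov–Smirnov distance measures the departure from uniformity. The sketch below shows only this standard full-sample building block, not the paper's instability-robust statistics; the Gaussian predictive form and all function names are our assumptions.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function (math.erf)
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_pits(y, mu, sigma):
    # Probability integral transforms of outcomes y under Gaussian
    # predictive densities N(mu_t, sigma_t^2)
    return [norm_cdf((yt - m) / s) for yt, m, s in zip(y, mu, sigma)]

def ks_uniform(pits):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    PIT values and the U(0,1) CDF they should follow under a correctly
    specified predictive density."""
    z = sorted(pits)
    n = len(z)
    # sup-distance, checked just above and just below each order statistic
    return max(max(i / n - zi, zi - (i - 1) / n)
               for i, zi in enumerate(z, 1))
```

A small statistic indicates PITs close to uniform (density plausibly well specified); PITs piling up in part of the unit interval, as happens when the forecast variance is wrong, drive the statistic toward 1.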
Abstract:
Following a scheme of Levin we describe the values that functions in Fock spaces take on lattices of critical density in terms of both the size of the values and a cancelation condition that involves discrete versions of the Cauchy and Beurling-Ahlfors transforms.
Abstract:
Planning with partial observability can be formulated as a non-deterministic search problem in belief space. The problem is harder than classical planning, as keeping track of beliefs is harder than keeping track of states, and searching for action policies is harder than searching for action sequences. In this work, we develop a framework for partial observability that avoids these limitations and leads to a planner that scales up to larger problems. For this, the class of problems is restricted to those in which 1) the non-unary clauses representing the uncertainty about the initial situation are invariant, and 2) variables that are hidden in the initial situation do not appear in the body of conditional effects, which are all assumed to be deterministic. We show that such problems can be translated in linear time into equivalent fully observable non-deterministic planning problems, and that a slight extension of this translation renders the problem solvable by means of classical planners. The whole approach is sound and complete provided that, in addition, the state space is connected. Experiments are also reported.
Abstract:
This paper addresses the surprising lack of quality control on the analysis and selection of energy policies observable in the last decades. As an example, we discuss the delusional idea that it is possible to replace fossil energy with large-scale ethanol production from agricultural crops. But if large-scale ethanol production is not practical in energetic terms, why have huge amounts of money been invested in it, and why are they still being invested? In order to answer this question we introduce two concepts useful to frame, in general terms, the predicament of quality control in science: (i) the concept of “granfalloons” proposed by K. Vonnegut (1963), flagging the danger of the formation of “crusades to save the world” void of real meaning; these granfalloons are often used by powerful lobbies to distort policy decisions; and (ii) the concept of Post-Normal science by S. Funtowicz and J. Ravetz (1990), indicating a standard predicament faced by science when producing information for governance. When uncertainty, multiple scales, and legitimate but contrasting views mix together, it becomes impossible to deal with complex issues using the conventional scientific approach based on reductionism. We finally discuss the implications of a different approach to the assessment of alternative energy sources by introducing the concept of Promethean technology.
Abstract:
A consistent extension of the local spin density approximation (LSDA) to account for mass and dielectric mismatches in nanocrystals is presented. The extension accounting for a variable effective mass is exact. Illustrative comparisons with available configuration interaction calculations show that the approach is also very reliable when it comes to accounting for dielectric mismatches. The modified LSDA is as fast and computationally undemanding as LSDA. It is therefore a tool suitable for studying large particle systems in inhomogeneous media without much effort.
Abstract:
The final year project came to us as an opportunity to get involved in a topic which proved attractive during the process of majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand these days, and, according to our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument valid for the analysis of large datasets and directly related to computer science: Partial Correlation Networks. The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analyzing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series.
In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows finding valuable results in Big Data. As a result, the findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a very valid tool to represent cross-sectional interconnections between elements in large data sets. The scope of this project is, however, limited, as there are some sections in which deeper analysis would have been appropriate. Considering intertemporal connections between elements, the choice of the tuning parameter lambda, or a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analyzed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements present in a large data set. After all, partial correlation networks allow the owner of such a set to observe and analyze existing linkages that might otherwise have been overlooked.
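To fix ideas about the object being estimated: this is not the SPACE joint-sparse-regression estimator of Peng et al. (2009), but the classical route it regularizes, namely inverting a covariance matrix and reading partial correlations off the precision matrix Ω via ρ_ij = −Ω_ij / √(Ω_ii Ω_jj). A zero partial correlation means no edge between variables i and j in the network. All names below are ours; a minimal pure-Python sketch:

```python
def invert(a):
    """Invert a small square matrix by Gauss-Jordan elimination
    with partial pivoting (for illustration only; real work would
    use a numerical library)."""
    n = len(a)
    # Augment the matrix with the identity
    m = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        p = m[col][col]
        m[col] = [v / p for v in m[col]]
        for r in range(n):
            if r != col and m[r][col]:
                f = m[r][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [row[n:] for row in m]

def partial_correlations(cov):
    """Partial correlation matrix from a covariance matrix:
    rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj),
    where Omega is the precision (inverse covariance) matrix."""
    omega = invert(cov)
    n = len(omega)
    return [[1.0 if i == j
             else -omega[i][j] / (omega[i][i] * omega[j][j]) ** 0.5
             for j in range(n)] for i in range(n)]
```

In high dimensions this plain inversion is ill-conditioned or impossible, which is precisely why sparse estimators such as SPACE are needed; the network is then read off the nonzero partial correlations.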
A performance lower bound for quadratic timing recovery accounting for the symbol transition density
Abstract:
The symbol transition density in a digitally modulated signal affects the performance of practical synchronization schemes designed for timing recovery. This paper focuses on the derivation of simple performance limits for the estimation of the time delay of a noisy linearly modulated signal in the presence of various degrees of symbol correlation produced by the various transition densities in the symbol streams. The paper develops high- and low-signal-to-noise-ratio (SNR) approximations of the so-called (Gaussian) unconditional Cramér–Rao bound (UCRB), as well as general expressions that are applicable in all ranges of SNR. The derived bounds are valid only for the class of quadratic, non-data-aided (NDA) timing recovery schemes. To illustrate the validity of the derived bounds, they are compared with the actual performance achieved by some well-known quadratic NDA timing recovery schemes. The impact of the symbol transition density on the classical threshold effect present in NDA timing recovery schemes is also analyzed. Previous work on performance bounds for timing recovery from various authors is generalized and unified in this contribution.
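For context, the bounds discussed are specializations of the generic Cramér–Rao inequality; written in its standard form (not the paper's high- or low-SNR expressions), for any unbiased estimate τ̂ of the delay τ:

```latex
\operatorname{var}(\hat{\tau}) \;\ge\; \mathrm{CRB}(\tau)
  \;=\; \left[\, \mathbb{E}\!\left( -\frac{\partial^{2} \ln \Lambda(\mathbf{r};\tau)}{\partial \tau^{2}} \right) \right]^{-1}
```

where Λ(r; τ) is the likelihood of the received signal r; in the unconditional (Gaussian) variant, the random transmitted symbols are averaged out rather than treated as known, which is how the symbol transition density enters the bound.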
Abstract:
The inverse scattering problem concerning the determination of the joint time-delay–Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A “realistic” situation with respect to the transmission of a finite number of signals is further considered. In such a case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto a subspace generated by the transmitted signals.
Abstract:
Both the intermolecular interaction energies and the geometries for M–thiophene, M–pyrrole, M^n+–thiophene, and M^n+–pyrrole complexes (with M = Li, Na, K, Ca, and Mg; and M^n+ = Li⁺, Na⁺, K⁺, Ca²⁺, and Mg²⁺) have been estimated using four commonly used density functional theory (DFT) methods: B3LYP, B3PW91, PBE, and MPW1PW91. Results have been compared to those provided by HF, MP2, and MP4 conventional ab initio methods. PBE and MPW1PW91 are the only DFT methods able to provide a reasonable description of the M complexes. Regarding the M^n+ complexes, the four DFT methods have proven adequate in the prediction of these electrostatically stabilized systems, even though they tend to overestimate the interaction energies.
Abstract:
In this work we analyze the behavior of complex information in the Fresnel domain, taking into account the limited capability of current liquid crystal devices, when used as holographic displays, to display complex transmittance values. For this analysis we compute the reconstruction of Fresnel holograms at several distances using the different parts of the complex distribution (real and imaginary parts, amplitude and phase), as well as using the full complex information adjusted with a method that combines two configurations of the devices in an adding architecture. The RMS error between the amplitude of these reconstructions and the original amplitude is used to evaluate the quality of the information displayed. The results of the error analysis show different behavior for the reconstructions using the different parts of the complex distribution and using the combined method of two devices. Better reconstructions are obtained when using two devices whose configurations densely cover the complex plane when they are added. Simulated and experimental results are also presented.
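The quality metric described, the RMS error between the amplitude of a reconstruction and the original amplitude, can be sketched in a few lines. This is only the comparison metric; the Fresnel propagation and device modeling are omitted, and the function name is our own.

```python
def rms_amplitude_error(original, reconstruction):
    """Root-mean-square error between the amplitude (modulus) of a
    reconstructed complex field and that of the original field,
    both given as flat sequences of complex samples."""
    if len(original) != len(reconstruction):
        raise ValueError("fields must have the same number of samples")
    n = len(original)
    # Compare only the moduli: phase differences do not contribute
    return (sum((abs(o) - abs(r)) ** 2
                for o, r in zip(original, reconstruction)) / n) ** 0.5
```

Because only the moduli enter, a reconstruction that preserves amplitude but distorts phase scores a zero error under this metric, which matches its use here for judging the displayed amplitude information.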
Abstract:
A seasonal period of water deficit characterizes tropical dry forests (TDFs). There, sympatric tree species exhibit a diversity of growth rates, functional traits, and responses to drought, suggesting that each species may possess different strategies to grow under different conditions of water availability. The evaluation of the long-term growth responses to changes in the soil water balance should provide an understanding of how and when coexisting tree species respond to water deficit in TDFs. Furthermore, such differential growth responses may be linked to functional traits related to water storage and conductance. We used dendrochronology and climate data to retrospectively assess how the radial growth of seven coexisting deciduous tree species responded to the seasonal soil water balance in a Bolivian TDF. Linear mixed-effects models were used to quantify the relationships between basal area increment and seasonal water balance. We related these relationships with wood density and sapwood production to assess if they affect the growth responses to climate. The growth of all species responded positively to water balance during the wet season, but such responses differed among species as a function of their wood density. For instance, species with a strong growth response to water availability averaged a low wood density which may facilitate the storage of water in the stem. By contrast, species with very dense wood were those whose growth was less sensitive to water availability. Coexisting tree species thus show differential growth responses to changes in soil water balance during the wet season. Our findings also provide a link between wood density, a trait related to the ability of trees to store water in the stem, and wood formation in response to water availability.
Abstract:
Among all inflammatory cells involved in COPD, those with a cytolytic or elastolytic activity are thought to play a key role in the pathogenesis of the disease. However, there is no data about the infiltration of cells expressing the CD57 marker in small airways and parenchyma of COPD patients. In this study, surgical specimens from 43 subjects undergoing lung resection due to lung cancer (9 non-smokers, 18 smokers without COPD and 16 smokers with moderate COPD) and 16 patients undergoing double lung transplantation for very severe COPD were examined. CD57+ cells, neutrophils, macrophages and mast cells infiltrating bronchioles (epithelium, smooth muscle and connective tissue) and parenchymal interstitium were localized and quantified by immunohistochemical analysis. Compared to the other groups, the small airways of very severe COPD patients showed a significantly higher density of CD57+ cells, mainly infiltrated in the connective tissue (p=0.001), and a significantly higher density of neutrophils located characteristically in the epithelium (p=0.037). Also, the density of neutrophils was significantly higher in parenchyma of very severe COPD patients compared with the rest of the groups (p=0.001). Finally, there were significant correlations between the bronchiolar density of CD57+ cells and the FEV1 values (R=-0.43, p=0.022), as well as between the parenchymal density of neutrophils and macroscopic emphysema degree (R=0.43, p=0.048) in COPD groups. These results show that CD57+ cells may be involved in COPD pathogenesis, especially in the most severe stages of the disease.
Abstract:
In this article, we explore the possibility of modifying the silicon nanocrystal areal density in SiOx single layers while keeping their size constant. For this purpose, a set of SiOx monolayers with controlled thickness between two thick SiO2 layers has been fabricated, for four different compositions (x = 1, 1.25, 1.5, or 1.75). The structural properties of the SiOx single layers have been analyzed by transmission electron microscopy (TEM) in planar view geometry. Energy-filtered TEM images revealed an almost constant Si-cluster size and a slight increase in the cluster areal density as the silicon content increases in the layers, while high resolution TEM images show that the size of the Si crystalline precipitates largely decreases as the SiOx stoichiometry approaches that of SiO2. The crystalline fraction was evaluated by combining the results from both techniques, finding a crystallinity reduction from 75% to 40% for x = 1 and 1.75, respectively. Complementary photoluminescence measurements corroborate the precipitation of Si nanocrystals with excellent emission properties for the layers with the largest amount of excess silicon. The integrated emission from the nanoaggregates perfectly scales with their crystalline state, with no detectable emission for crystalline fractions below 40%. The combination of the structural and luminescence observations suggests that small Si precipitates are subjected to a higher compressive local stress applied by the SiO2 matrix, which could inhibit the phase separation and, in turn, promote the creation of nonradiative paths.
Abstract:
Olive oil decreases the risk of CVD. This effect may be due to the fatty acid profile of the oil, but it may also be due to its antioxidant content, which differs depending on the type of olive oil. In this study, the concentrations of oleic acid and antioxidants (phenolic compounds and vitamin E) in plasma and LDL were compared after consumption of three similar olive oils differing in their phenolic content. Thirty healthy volunteers participated in a placebo-controlled, double-blind, crossover, randomized supplementation trial. Virgin, common, and refined olive oils were administered during three periods of 3 weeks, separated by 2-week washout periods. Participants were requested to ingest a daily dose of 25 ml of raw olive oil, distributed over the three meals of the day, during the intervention periods. All three olive oils caused an increase in the plasma and LDL oleic acid content (P<0.05). Olive oils rich in phenolic compounds led to an increase in phenolic compounds in LDL (P<0.005). The concentration of phenolic compounds in LDL was directly correlated with the phenolic concentration in the olive oils. The increase in the phenolic content of LDL could account for the increase in the resistance of LDL to oxidation, and the decrease in in vivo oxidized LDL, observed in the frame of this trial. Our results support the hypothesis that a daily intake of virgin olive oil promotes protective changes in LDL ahead of its oxidation.