43 results for Partial Least Square
Abstract:
The development of liquid-crystal panels for use in commercial equipment has been aimed at improving pixel resolution and display efficiency. These improvements have led, among other outcomes, to a reduction in the thickness of such devices, which entails a loss of phase modulation. We propose a modification of the classical phase-only filter that permits its display on VGA liquid-crystal panels with constant amplitude modulation and less than 2π of phase modulation. The method was tested experimentally in an optical setup.
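For orientation, the classical phase-only filter referred to above keeps only the conjugate phase of the reference spectrum, H(u,v) = exp(-i arg F(u,v)). A minimal sketch follows; the restrict_phase step, which linearly compresses the phase into a sub-2π modulation range, is purely an illustrative placeholder and not the modification proposed in the paper.

```python
import numpy as np

def phase_only_filter(reference):
    """Classical phase-only filter: unit amplitude, conjugate phase."""
    F = np.fft.fft2(reference)
    return np.exp(-1j * np.angle(F))

def restrict_phase(H, depth):
    """Illustrative placeholder only: linearly compress the filter
    phase into the available modulation depth (< 2*pi); the paper's
    actual modification is different."""
    phi = np.angle(H)                          # phase in (-pi, pi]
    phi = (phi + np.pi) * depth / (2 * np.pi)  # map onto [0, depth)
    return np.exp(1j * phi)

ref = np.random.rand(64, 64)                   # stand-in reference scene
H = restrict_phase(phase_only_filter(ref), depth=1.8 * np.pi)
```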
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from those having smooth surfaces to others with ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, a confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and we obtained rms roughnesses by integrating areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Although small differences between measurement techniques remained in the calculated roughnesses, these could be explained mostly by surface topographical features, such as isolated particles, that affected the instruments in different ways.
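As a rough illustration of the band-limited rms roughness calculation described above, the sketch below integrates a two-dimensional PSD between fixed lower and upper spatial-frequency limits; the grid spacing, units, and band limits are arbitrary placeholder values, not those used in the study.

```python
import numpy as np

def bandlimited_rms(psd2d, fx, fy, f_lo, f_hi):
    """RMS roughness from a 2-D power spectral density.

    Integrates the PSD over the annulus f_lo <= |f| <= f_hi in
    spatial frequency, then takes the square root, so roughnesses
    from instruments with different bandwidths become comparable.
    """
    FX, FY = np.meshgrid(fx, fy, indexing="ij")
    fr = np.hypot(FX, FY)                       # radial spatial frequency
    mask = (fr >= f_lo) & (fr <= f_hi)          # fixed band limits
    dfx = fx[1] - fx[0]
    dfy = fy[1] - fy[0]
    variance = np.sum(psd2d[mask]) * dfx * dfy  # area under the PSD
    return np.sqrt(variance)

# Placeholder example: flat PSD on a symmetric frequency grid
fx = np.linspace(-5.0, 5.0, 256)
fy = np.linspace(-5.0, 5.0, 256)
psd = np.full((256, 256), 1e-3)                 # illustrative PSD level
print(bandlimited_rms(psd, fx, fy, f_lo=0.1, f_hi=2.0))
```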
Abstract:
We study the mean-first-passage-time problem for systems driven by the coin-toss square-wave signal. Exact analytic solutions are obtained for the driftless case. We also obtain approximate solutions for the potential case. The mean-first-passage time exhibits discontinuities and a remarkable nonsmooth oscillatory behavior which, to our knowledge, has not been observed for other kinds of driving noise.
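A minimal Monte Carlo sketch of the driftless setting, under our own assumptions about the setup: the particle velocity is a coin-toss square wave that draws a fresh ±1 sign on each interval of length T, and we record the time to first exit from (-L, L). All parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage_time(L=1.0, T=0.1, v=1.0, dt=0.01, t_max=1e4):
    """First exit time of dx/dt = v*xi(t) from (-L, L), where xi(t)
    is a coin-toss square wave: a fresh +/-1 sign on each interval
    of length T."""
    x, t = 0.0, 0.0
    sign = rng.choice((-1.0, 1.0))
    next_toss = T
    while abs(x) < L and t < t_max:
        if t >= next_toss:                     # new coin toss
            sign = rng.choice((-1.0, 1.0))
            next_toss += T
        x += sign * v * dt
        t += dt
    return t

mfpt = np.mean([first_passage_time() for _ in range(2000)])
print(f"estimated mean first-passage time: {mfpt:.3f}")
```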
Abstract:
The short-range resonating-valence-bond (RVB) wave function with nearest-neighbor (NN) spin pairings only is investigated as a possible description for the Heisenberg model on a square-planar lattice. A type of long-range order associated with this RVB Ansatz is identified, along with some qualitative consequences involving lattice distortions, excitations, and their coupling.
Abstract:
We characterize the Schatten class membership of the canonical solution operator to $\overline{\partial}$ acting on $L^2(e^{-2\phi})$, where $\phi$ is a subharmonic function with $\Delta\phi$ a doubling measure. The characterization obtained is in terms of $\Delta\phi$. As part of our approach, we study Hankel operators with anti-analytic symbols acting on the corresponding Fock space of entire functions in $L^2(e^{-2\phi})$.
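For reference, the standard definition of Schatten class membership used in such characterizations (a textbook fact, not specific to this paper): a compact operator belongs to $S_p$ exactly when its singular values are $\ell^p$-summable.

```latex
% Standard definition: a compact operator T lies in the Schatten
% class S_p precisely when its singular value sequence is l^p-summable.
T \in S_p
\quad\Longleftrightarrow\quad
\sum_{n \ge 1} s_n(T)^p < \infty ,
\qquad 0 < p < \infty .
```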
Abstract:
We propose an iterative procedure to minimize the sum-of-squares function that avoids the nonlinear nature of estimating the first-order moving average parameter and provides a closed form for the estimator. The asymptotic properties of the method are discussed, and the consistency of the linear least squares estimator is proved for the invertible case. We perform various Monte Carlo experiments in order to compare the sample properties of the linear least squares estimator with its nonlinear counterpart for the conditional and unconditional cases. Some examples are also discussed.
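A minimal sketch of one linear least-squares scheme for the first-order moving average parameter, in the spirit of the Hannan-Rissanen two-step method; this is our own illustration and not necessarily the exact procedure proposed in the paper.

```python
import numpy as np

def ma1_linear_ls(y, ar_order=10):
    """Linear LS estimate of theta in y_t = e_t + theta*e_{t-1}.

    Step 1: fit a long autoregression by OLS to proxy the innovations.
    Step 2: regress y_t on the lagged innovation proxy e_{t-1}.
    Both steps are linear, so no nonlinear search is needed.
    """
    y = np.asarray(y) - np.mean(y)
    n = len(y)
    # Step 1: long AR fit via OLS to recover innovation proxies.
    X = np.column_stack([y[ar_order - k - 1 : n - k - 1]
                         for k in range(ar_order)])
    target = y[ar_order:]
    phi, *_ = np.linalg.lstsq(X, target, rcond=None)
    e = target - X @ phi
    # Step 2: OLS of y_t on e_{t-1} gives a closed-form theta estimate.
    return np.dot(e[:-1], y[ar_order + 1:]) / np.dot(e[:-1], e[:-1])

# Quick check on simulated MA(1) data with theta = 0.5
rng = np.random.default_rng(1)
eps = rng.standard_normal(5000)
y = eps[1:] + 0.5 * eps[:-1]
print(ma1_linear_ls(y))   # should be close to 0.5
```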
Abstract:
The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which yields similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions, and researchers can use both of them with certain guarantees. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers to help them choose among the plurality of single-case data analysis techniques.
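As a generic illustration of the regression-based comparison of baseline and treatment phases (a simplified ordinary least squares stand-in, not the specific generalized least squares procedure under study), the sketch below fits a level-and-trend model with a phase dummy.

```python
import numpy as np

def phase_effect_ols(baseline, treatment):
    """OLS fit of y = b0 + b1*time + b2*phase + b3*(phase*time).

    b2 estimates the level change and b3 the slope change between
    the baseline and treatment phases of a single-case design.
    """
    y = np.concatenate([baseline, treatment])
    t = np.arange(len(y), dtype=float)
    phase = np.concatenate([np.zeros(len(baseline)),
                            np.ones(len(treatment))])
    X = np.column_stack([np.ones_like(t), t, phase, phase * t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return {"level_change": beta[2], "slope_change": beta[3]}

# Made-up data: 10 baseline and 10 treatment measurements
rng = np.random.default_rng(2)
base = 5 + 0.1 * np.arange(10) + rng.normal(0, 0.5, 10)
treat = 8 + 0.3 * np.arange(10, 20) + rng.normal(0, 0.5, 10)
print(phase_effect_ols(base, treat))
```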
Abstract:
Radioiodinated recombinant human interferon-gamma (IFN gamma) bound to human monocytes, U937, and HL60 cells in a specific, saturable, and reversible manner. At 4 °C, the different cell types bound 3,000-7,000 molecules of IFN gamma, and binding was of comparable affinity (Ka = 4-12 × 10^8 M^-1). No change in the receptor was observed after monocytes differentiated to macrophages or when the cell lines were pharmacologically induced to differentiate. The functional relevance of the receptor was validated by the demonstration that receptor occupancy correlated with induction of Fc receptors on U937. Binding studies using U937 permeabilized with digitonin showed that only 46% of the total receptor pool was expressed at the cell surface. The receptor appears to be a protein, since treatment of U937 with trypsin or pronase reduced 125I-IFN gamma binding by 87% and 95%, respectively. At 37 °C, ligand was internalized, since 32% of the cell-associated IFN gamma became resistant to trypsin stripping. Monocytes degraded 125I-IFN gamma into trichloroacetic acid-soluble counts at 37 °C but not at 4 °C, at an approximate rate of 5,000 molecules/cell per h. The receptor was partially characterized by SDS-polyacrylamide gel electrophoresis analysis of purified U937 membranes that had been incubated with 125I-IFN gamma. After cross-linking, the receptor-ligand complex migrated as a broad band with an Mr of 104,000 ± 18,000 at the top and 84,000 ± 6,000 at the bottom. These results thereby define and partially characterize the IFN gamma receptor of human mononuclear phagocytes.
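Association constants and receptor numbers like those quoted above are typically obtained by fitting a one-site saturation binding model, B = Bmax·[L]/(Kd + [L]). The sketch below shows such a fit on made-up data; it is a generic illustration, not the paper's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, Bmax, Kd):
    """Specific binding for a single class of receptor sites."""
    return Bmax * L / (Kd + L)

# Made-up saturation data: free ligand (nM) vs bound (molecules/cell)
L_free = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0, 10.0])
bound = np.array([900, 1600, 2900, 3900, 4900, 5800, 6200, 6500])

(Bmax, Kd), _ = curve_fit(one_site, L_free, bound, p0=(7000, 1.0))
Ka = 1.0 / (Kd * 1e-9)          # association constant in M^-1
print(f"Bmax ~ {Bmax:.0f} sites/cell, Ka ~ {Ka:.2e} M^-1")
```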
Abstract:
Planning with partial observability can be formulated as a non-deterministic search problem in belief space. The problem is harder than classical planning, as keeping track of beliefs is harder than keeping track of states, and searching for action policies is harder than searching for action sequences. In this work, we develop a framework for partial observability that avoids these limitations and leads to a planner that scales up to larger problems. For this, the class of problems is restricted to those in which 1) the non-unary clauses representing the uncertainty about the initial situation are invariant, and 2) variables that are hidden in the initial situation do not appear in the body of conditional effects, which are all assumed to be deterministic. We show that such problems can be translated in linear time into equivalent fully observable non-deterministic planning problems, and that a slight extension of this translation renders the problem solvable by means of classical planners. The whole approach is sound and complete provided that, in addition, the state space is connected. Experiments are also reported.
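To make the belief-space formulation concrete, here is a minimal sketch, our own illustration independent of the paper's translation scheme, of tracking a belief as the set of states consistent with the history, under deterministic actions and observation filtering.

```python
# Belief tracking: a belief is the set of states consistent with history.
from typing import Callable, FrozenSet, TypeVar

S = TypeVar("S")
O = TypeVar("O")

def progress(belief: FrozenSet[S], act: Callable[[S], S]) -> FrozenSet[S]:
    """Apply a deterministic action to every state in the belief."""
    return frozenset(act(s) for s in belief)

def filter_obs(belief: FrozenSet[S], obs: O,
               sense: Callable[[S], O]) -> FrozenSet[S]:
    """Keep only the states that would have produced the observation."""
    return frozenset(s for s in belief if sense(s) == obs)

# Toy example: hidden position on a line of 5 cells, action moves right.
belief = frozenset(range(5))                         # unknown initial cell
belief = progress(belief, lambda s: min(s + 1, 4))   # deterministic move
belief = filter_obs(belief, True, lambda s: s >= 3)  # "near the end?"
print(sorted(belief))                                # -> [3, 4]
```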
Abstract:
This paper addresses the surprising lack of quality control in the analysis and selection of energy policies observable over the last decades. As an example, we discuss the delusional idea that it is possible to replace fossil energy with large-scale ethanol production from agricultural crops. But if large-scale ethanol production is not practical in energetic terms, why have huge amounts of money been invested in it, and why are they still being invested? In order to answer this question, we introduce two concepts useful for framing, in general terms, the predicament of quality control in science: (i) the concept of “granfalloons” proposed by K. Vonnegut (1963), flagging the danger of the formation of “crusades to save the world” that are void of real meaning; these granfalloons are often used by powerful lobbies to distort policy decisions; and (ii) the concept of Post-Normal Science by S. Funtowicz and J. Ravetz (1990), indicating a standard predicament faced by science when producing information for governance. When uncertainty, multiple scales, and legitimate but contrasting views mix together, it becomes impossible to deal with complex issues using the conventional scientific approach based on reductionism. We finally discuss the implications of a different approach to the assessment of alternative energy sources by introducing the concept of Promethean technology.
Abstract:
Background
Depression is one of the most severe and serious health problems because of its morbidity, its disabling effects, and its societal and economic burden. Despite the variety of existing pharmacological and psychological treatments, most cases evolve with only partial remission, relapse, and recurrence. Cognitive models have contributed significantly to the understanding of unipolar depression and its psychological treatment. However, success is only partial, and many authors affirm the need to improve those models and the treatment programs derived from them. One of the issues that requires further elaboration is the difficulty these patients experience in responding to treatment and in maintaining therapeutic gains across time without relapse or recurrence. Our research group has been working on the notion of cognitive conflict, viewed as personal dilemmas according to personal construct theory. We use a novel method for identifying those conflicts using the repertory grid technique (RGT). Preliminary results with depressive patients show that about 90% of them have one or more of those conflicts. This fact might explain the blockage and difficult progress of these patients, especially the more severe and/or chronic cases. These results justify the need for specific interventions focused on the resolution of these internal conflicts. This study aims to empirically test the hypothesis that an intervention focused on the dilemma(s) specifically detected for each patient will enhance the efficacy of cognitive behavioral therapy (CBT) for depression.
Design
A therapy manual for a dilemma-focused intervention will be tested in a randomized clinical trial by comparing the outcome of two treatment conditions: combined group CBT (eight 2-hour weekly sessions) plus individual dilemma-focused therapy (eight 1-hour weekly sessions), versus CBT alone (eight 2-hour group weekly sessions plus eight 1-hour individual weekly sessions).
Method
Participants are patients aged over 18 years meeting diagnostic criteria for major depressive disorder or dysthymic disorder, with a score of 19 or above on the Beck Depression Inventory, second edition (BDI-II), and presenting at least one cognitive conflict (implicative dilemma or dilemmatic construct) as assessed using the RGT. The BDI-II is the primary outcome measure, collected at baseline, at the end of therapy, and at 3- and 12-month follow-up; other secondary measures are also used.
Discussion
We expect that adding a dilemma-focused intervention to CBT will increase the efficacy of one of the most prestigious therapies for depression, thus making a significant contribution to the psychological treatment of depression.
Trial registration
ISRCTN92443999; ClinicalTrials.gov identifier: NCT01542957.
Abstract:
The final year project came to us as an opportunity to get involved in a topic which had appeared attractive during the learning process of majoring in economics: statistics and its application to the analysis of economic data, i.e., econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument valid for the analysis of large datasets which is directly related to computer science: Partial Correlation Networks. The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analyzing a relatively large data set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows finding valuable results in Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool to represent cross-sectional interconnections between elements in large data sets. The scope of this project is, however, limited, as there are some sections in which deeper analysis would have been appropriate. Intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the statistical tool analyzed has proved to be a very useful instrument for finding relationships that connect the elements present in a large data set, and partial correlation networks allow the owner of such a data set to observe and analyze existing linkages that might otherwise have been overlooked.
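A minimal sketch of building a partial correlation network follows. Note that the SPACE estimator of Peng et al. (2009) is not available in scikit-learn, so the sketch substitutes the related graphical lasso as the sparse precision-matrix estimator; partial correlations are then recovered as rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj).

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(3)
X = rng.standard_normal((500, 8))        # placeholder data matrix
X[:, 1] += 0.8 * X[:, 0]                 # plant one genuine linkage

# Sparse precision (inverse covariance) estimate; a stand-in for SPACE.
model = GraphicalLasso(alpha=0.05).fit(X)
Omega = model.precision_

# Partial correlation between i and j given all other variables.
d = np.sqrt(np.diag(Omega))
pcorr = -Omega / np.outer(d, d)
np.fill_diagonal(pcorr, 1.0)

# Network edges: variable pairs with non-negligible partial correlation
edges = [(i, j, pcorr[i, j]) for i in range(8) for j in range(i + 1, 8)
         if abs(pcorr[i, j]) > 0.1]
print(edges)                              # should recover the (0, 1) link
```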
Abstract:
This paper provides a systematic approach to the problem of nondata-aided symbol-timing estimation for linear modulations. The study is performed under the unconditional maximum likelihood framework, where the carrier-frequency error is included as a nuisance parameter in the mathematical derivation. The second-order moments of the received signal are found to be the sufficient statistics for the problem at hand, and they allow the provision of robust performance in the presence of carrier-frequency error uncertainty. We particularly focus on the exploitation of the cyclostationary property of linear modulations. This enables us to derive simple, closed-form symbol-timing estimators which are found to be based on the well-known square timing recovery method by Oerder and Meyr. Finally, we generalize the OM method to the case of linear modulations with offset formats. In this case, the square-law nonlinearity is found to provide not only the symbol timing but also the carrier-phase error.
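A minimal sketch of the Oerder and Meyr square timing recovery estimator that the closed-form estimators above build on: with N samples per symbol, the timing offset is read from the phase of the symbol-rate spectral line of the squared envelope. The pulse shape, delay, and other waveform parameters below are placeholder values.

```python
import numpy as np

def oerder_meyr_timing(x, sps):
    """Oerder & Meyr square timing recovery.

    The squared envelope |x[n]|^2 of a linearly modulated signal
    contains a spectral line at the symbol rate; the phase of that
    line encodes the timing offset as a fraction of a symbol period.
    """
    n = np.arange(len(x))
    line = np.sum(np.abs(x) ** 2 * np.exp(-2j * np.pi * n / sps))
    return -np.angle(line) / (2 * np.pi)   # timing offset in symbols

# Placeholder test: BPSK symbols shaped with a non-constant-envelope
# pulse (peak at the symbol start), 4 samples per symbol, then delayed
# by one sample, i.e. a true offset of 0.25 symbols.
rng = np.random.default_rng(4)
sps = 4
pulse = np.abs(np.cos(np.pi * np.arange(sps) / sps))
symbols = rng.choice((-1.0, 1.0), size=1000)
x = np.kron(symbols, pulse)
x = np.roll(x, 1)                          # delay by 1 sample
print(oerder_meyr_timing(x, sps))          # ~0.25
```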