897 results for Transformation-based semi-parametric estimators
Abstract:
In this paper, the dynamical interactions between a double pendulum arm and an electromechanical shaker are investigated. The double pendulum is a three-degree-of-freedom system coupled through a magnetic field to a nonlinear shaker based on an RLC circuit, in which the capacitor voltage is a nonlinear function of the instantaneous electric charge. Numerical simulations show the existence of chaotic behavior in some regions of the parameter space, and this behavior is characterized by power spectral densities and Lyapunov exponents. A bifurcation diagram is constructed to explore the qualitative behavior of the system. This kind of electromechanical system is frequently found in robotic systems; in order to suppress the chaotic motion, the State-Dependent Riccati Equation (SDRE) control and the Nonlinear Saturation Control (NSC) techniques are analyzed. The robustness of these two controllers is tested by a sensitivity analysis to parametric uncertainties.
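As a loose illustration of how chaotic behavior of this kind is typically characterized numerically, the sketch below estimates the largest Lyapunov exponent of a generic forced nonlinear oscillator (a stand-in Duffing system, not the pendulum-shaker model of the paper) using a Benettin-style renormalization of nearby trajectories; all parameter values are illustrative assumptions.

```python
# Minimal sketch: estimating the largest Lyapunov exponent of a forced
# nonlinear oscillator (a stand-in Duffing system, NOT the paper's
# pendulum-shaker model) with the Benettin renormalization method.
import numpy as np
from scipy.integrate import solve_ivp

def duffing(t, s, delta=0.3, alpha=-1.0, beta=1.0, gamma=0.5, omega=1.2):
    """Forced Duffing oscillator: x'' + delta x' + alpha x + beta x^3 = gamma cos(omega t)."""
    x, v = s
    return [v, -delta * v - alpha * x - beta * x**3 + gamma * np.cos(omega * t)]

def largest_lyapunov(f, s0, dt=0.5, steps=1000, eps=1e-8):
    """Benettin method: follow a reference and a perturbed trajectory,
    renormalizing the separation after every short integration window."""
    s_ref = np.asarray(s0, dtype=float)
    s_pert = s_ref + np.array([eps, 0.0])
    log_sum, t = 0.0, 0.0
    for _ in range(steps):
        s_ref = solve_ivp(f, (t, t + dt), s_ref, rtol=1e-9, atol=1e-12).y[:, -1]
        s_pert = solve_ivp(f, (t, t + dt), s_pert, rtol=1e-9, atol=1e-12).y[:, -1]
        d = np.linalg.norm(s_pert - s_ref)
        log_sum += np.log(d / eps)
        s_pert = s_ref + (s_pert - s_ref) * (eps / d)   # renormalize separation
        t += dt
    return log_sum / (steps * dt)   # a positive value suggests chaotic motion

if __name__ == "__main__":
    print("largest Lyapunov exponent ~", largest_lyapunov(duffing, [0.1, 0.0]))
```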
Abstract:
Micro-electromechanical systems (MEMS) are micro-scale devices able to convert electrical energy into mechanical energy or vice versa. In this paper, the mathematical model of an electronic circuit of a resonant MEMS mass sensor, with time-periodic parametric excitation, was analyzed and controlled by a Chebyshev polynomial expansion of the Picard iteration and Lyapunov-Floquet transformation, and by Optimal Linear Feedback Control (OLFC). Both strategies combine feedback and feedforward control. The first strategy obtains the feedback control through the Picard iteration and Lyapunov-Floquet transformation; the second is based on optimal control theory. Numerical simulations show the efficiency of the two control methods, as well as the sensitivity of each control strategy to parametric errors. Without parametric errors, both control strategies were effective in keeping the system in the desired orbit. In the presence of parametric errors, on the other hand, the OLFC technique was more robust.
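The feedback component of an optimal linear feedback controller can be sketched, under simplifying assumptions, with a standard linear-quadratic design: solve an algebraic Riccati equation for a linearized plant and apply u = -R^-1 B^T P e to the deviation e from the desired orbit. The matrices below are illustrative stand-ins, not the MEMS mass-sensor model of the abstract.

```python
# Minimal sketch of the feedback part of an optimal linear feedback control
# law u = -R^{-1} B^T P e, driving the deviation e = x - x_desired of a
# linearized oscillator to zero. The system matrices are illustrative,
# not the MEMS mass-sensor model from the abstract.
import numpy as np
from scipy.linalg import solve_continuous_are

# Linearized plant for the deviation dynamics: e_dot = A e + B u
A = np.array([[0.0, 1.0],
              [-1.0, -0.05]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[0.1]])      # control weighting

P = solve_continuous_are(A, B, Q, R)        # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)             # feedback gain, u = -K e

def control(x, x_desired):
    """Feedback contribution of the control law; a feedforward term would be
    added on top of this to track a time-varying desired orbit."""
    e = np.asarray(x) - np.asarray(x_desired)
    return -(K @ e)

print("feedback gain K =", K)
```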
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) under the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for the (a) overall test statistic/fit indices, and (b) change in test statistic/fit indices. Data were generated according to a multiple-group single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
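The Monte Carlo bookkeeping behind a Type I error study can be illustrated with a deliberately simplified sketch: data are generated under a true null hypothesis, a normal-theory test is applied to categorized (ordinal) scores, and the rejection rate at alpha = .05 is recorded. This stands in for the multiple-group CFA models of the study, which would require an SEM package; the thresholds and sample sizes below are illustrative assumptions.

```python
# Simplified sketch of the Monte Carlo logic behind Type I error estimation:
# data are generated under a true null (both groups identical), a normal-theory
# test is applied to categorized (ordinal) scores, and the rejection rate at
# alpha = .05 is recorded. This illustrates the bookkeeping only; the study
# itself fits multiple-group CFA models with ML / robust ML / WLSMV estimators.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)

def categorize(x, thresholds):
    """Map continuous scores onto ordered categories (0, 1, 2, ...)."""
    return np.searchsorted(thresholds, x)

def type1_error_rate(n_per_group=200, n_reps=2000, alpha=0.05,
                     thresholds=(1.0, 1.5, 2.0)):   # asymmetric -> skewed categories
    rejections = 0
    for _ in range(n_reps):
        g1 = categorize(rng.standard_normal(n_per_group), thresholds)
        g2 = categorize(rng.standard_normal(n_per_group), thresholds)
        _, p = stats.ttest_ind(g1, g2)       # normal-theory test on ordinal data
        rejections += (p < alpha)
    return rejections / n_reps

print("empirical Type I error rate:", type1_error_rate())
```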
Abstract:
We extend the random permutation model to obtain the best linear unbiased estimator of a finite population mean accounting for auxiliary variables under simple random sampling without replacement (SRS) or stratified SRS. The proposed method provides a systematic design-based justification for well-known results involving common estimators derived under minimal assumptions that do not require specification of a functional relationship between the response and the auxiliary variables.
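Among the well-known results involving common estimators that the abstract alludes to is, for a single auxiliary variable with known population mean, the regression estimator y_reg = y_bar + b (X_bar - x_bar). The sketch below computes it under SRS on illustrative data; the connection to the paper's best linear unbiased estimator is an assumption for illustration, not a detail taken from the abstract.

```python
# Minimal sketch of the regression estimator of a finite population mean under
# simple random sampling without replacement, using one auxiliary variable x
# whose population mean X_bar is known. Data and variable names are illustrative.
import numpy as np

def regression_estimator(y_sample, x_sample, x_pop_mean):
    """y_reg = y_bar + b * (X_bar - x_bar), with b the sample slope of y on x."""
    y_bar = np.mean(y_sample)
    x_bar = np.mean(x_sample)
    b = np.cov(x_sample, y_sample, ddof=1)[0, 1] / np.var(x_sample, ddof=1)
    return y_bar + b * (x_pop_mean - x_bar)

# Toy usage: a small sample from a population with known auxiliary mean.
rng = np.random.default_rng(7)
x = rng.normal(50, 10, size=30)
y = 2.0 * x + rng.normal(0, 5, size=30)
print(regression_estimator(y, x, x_pop_mean=52.0))
```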
Abstract:
Background: Cryptococcus neoformans causes meningitis and disseminated infection in healthy individuals, but more commonly in hosts with defective immune responses. Cell-mediated immunity is an important component of the immune response to a great variety of infections, including yeast infections. We aimed to evaluate a Cryptococcus neoformans-specific lymphocyte transformation assay in order to identify immunodeficiency associated with neurocryptococcosis (NCC) as the primary cause of the mycosis. Methods: Healthy volunteers, poultry growers, and HIV-seronegative patients with neurocryptococcosis were tested for cellular immune response. Cryptococcal meningitis was diagnosed by India ink staining of cerebrospinal fluid and the cryptococcal antigen test (Immunomycol-Inc, SP, Brazil). Isolated peripheral blood mononuclear cells were stimulated with C. neoformans antigen, C. albicans antigen, and pokeweed mitogen. The amount of 3H-thymidine incorporated was assessed, and the results were expressed as stimulation index (SI) and log SI, sensitivity, specificity, and cut-off value (receiver operating characteristic curve). We applied unpaired Student t tests to compare data and considered differences significant at p < 0.05. Results: The lymphocyte transformation assay showed a low capacity, with all the stimuli, for classifying patients as responders and non-responders. The response to heat-killed antigen in patients with neurocryptococcosis was not affected by the CD4+ T cell count, and the intensity of the response did not correlate with the clinical evolution of neurocryptococcosis. Conclusion: The response to the lymphocyte transformation assay should be analyzed based on a normal range and using more than one stimulator. The use of a cut-off value to classify patients with neurocryptococcosis is inadequate. Statistical analysis should be based on the log transformation of SI. A more purified antigen for evaluating the specific response to C. neoformans is needed.
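The quantities described in the Methods, the stimulation index, its log transform, and a ROC-derived cut-off, can be illustrated with a short sketch on synthetic counts-per-minute data; the numbers, group sizes, and the use of scikit-learn are assumptions for illustration, not details from the study.

```python
# Illustrative sketch of the quantities discussed in the abstract: the
# stimulation index SI = cpm(stimulated) / cpm(unstimulated), its log
# transform, and a ROC-based cut-off. Counts and labels below are synthetic.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)

# Synthetic counts per minute for 40 subjects (20 "responders", 20 not).
cpm_unstim = rng.gamma(shape=5.0, scale=200.0, size=40)
cpm_stim = cpm_unstim * np.r_[rng.uniform(3, 12, 20),     # responders
                              rng.uniform(0.8, 2.5, 20)]  # non-responders
labels = np.r_[np.ones(20), np.zeros(20)]

si = cpm_stim / cpm_unstim          # stimulation index
log_si = np.log10(si)               # log transform used for the statistics

fpr, tpr, thresholds = roc_curve(labels, log_si)
youden = tpr - fpr                               # Youden's J per threshold
cutoff = thresholds[np.argmax(youden)]
print("AUC:", roc_auc_score(labels, log_si), "log-SI cut-off:", cutoff)
```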
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
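As a rough illustration of structural XML comparison (far simpler than the tree-edit-distance framework the paper proposes), the sketch below reduces each document to the multiset of its root-to-node tag paths and scores their weighted overlap; documents and the similarity measure are illustrative choices.

```python
# Simplified sketch of structural XML comparison: each document is reduced to
# the multiset of its root-to-node tag paths and similarity is the overlap of
# those multisets. This is a coarse stand-in for the tree-edit-distance-based
# framework described in the abstract.
import xml.etree.ElementTree as ET
from collections import Counter

def tag_paths(xml_text):
    """Collect every root-to-node tag path, e.g. 'library/book/title'."""
    def walk(node, prefix):
        path = f"{prefix}/{node.tag}" if prefix else node.tag
        yield path
        for child in node:
            yield from walk(child, path)
    return Counter(walk(ET.fromstring(xml_text), ""))

def structural_similarity(doc_a, doc_b):
    """Multiset (weighted Jaccard) overlap of tag paths, in [0, 1]."""
    a, b = tag_paths(doc_a), tag_paths(doc_b)
    inter = sum((a & b).values())
    union = sum((a | b).values())
    return inter / union if union else 1.0

doc1 = "<library><book><title/><author/></book></library>"
doc2 = "<library><book><title/><year/></book><book><title/></book></library>"
print(structural_similarity(doc1, doc2))
```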
Abstract:
Recent experimental evidence has suggested a neuromodulatory deficit in Alzheimer's disease (AD). In this paper, we present a new electroencephalogram (EEG) based metric to quantitatively characterize neuromodulatory activity. More specifically, the short-term EEG amplitude modulation rate-of-change (i.e., modulation frequency) is computed for five EEG subband signals. To test the performance of the proposed metric, a classification task was performed on a database of 32 participants partitioned into three groups of approximately equal size: healthy controls, patients diagnosed with mild AD, and those with moderate-to-severe AD. To gauge the benefits of the proposed metric, performance results were compared with those obtained using EEG spectral peak parameters, which were recently shown to outperform other conventional EEG measures. Using a simple feature selection algorithm based on area-under-the-curve maximization and a support vector machine classifier, the proposed parameters resulted in accuracy gains, relative to spectral peak parameters, of 21.3% when discriminating between the three groups and of 50% when the mild and moderate-to-severe groups were merged into one. The preliminary findings reported herein provide promising insights that automated tools may be developed to assist physicians in very early diagnosis of AD, as well as provide researchers with a tool to automatically characterize cross-frequency interactions and their changes with disease.
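The signal chain implied by the proposed metric can be sketched as: band-pass an EEG channel into a subband, extract the amplitude envelope with the Hilbert transform, and examine the spectrum of that envelope (the modulation frequencies). The band edges, window lengths, and synthetic signal below are illustrative assumptions, not the study's settings.

```python
# Sketch of the amplitude-modulation analysis implied by the abstract: band-pass
# an EEG channel into a subband, extract its amplitude envelope with the Hilbert
# transform, and inspect the envelope's spectrum (the modulation frequencies).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, welch

fs = 250.0                                   # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
# Synthetic "alpha" activity (10 Hz) whose amplitude is modulated at 0.5 Hz.
eeg = (1 + 0.6 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * 10 * t)
eeg += 0.3 * np.random.default_rng(0).standard_normal(t.size)

def subband_modulation_spectrum(x, fs, band=(8.0, 12.0)):
    """Band-pass -> Hilbert envelope -> Welch spectrum of the envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    sub = filtfilt(b, a, x)
    envelope = np.abs(hilbert(sub))
    return welch(envelope - envelope.mean(), fs=fs, nperseg=2048)

freqs, power = subband_modulation_spectrum(eeg, fs)
print("dominant modulation frequency (Hz):", freqs[np.argmax(power)])
```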
Abstract:
Industrial production of semi-synthetic cephalosporins by Penicillium chrysogenum requires supplementation of the growth media with the side-chain precursor adipic acid. In glucose-limited chemostat cultures of P. chrysogenum, up to 88% of the consumed adipic acid was not recovered in cephalosporin-related products, but was used as an additional carbon and energy source for growth. This low efficiency of side-chain precursor incorporation provides an economic incentive for studying and engineering the metabolism of adipic acid in P. chrysogenum. Chemostat-based transcriptome analysis in the presence and absence of adipic acid confirmed that adipic acid metabolism in this fungus occurs via beta-oxidation. A set of 52 adipate-responsive genes included six putative genes for acyl-CoA oxidases and dehydrogenases, the enzymes responsible for the first step of beta-oxidation. Subcellular localization of the differentially expressed acyl-CoA oxidases and dehydrogenases revealed that the oxidases were exclusively targeted to peroxisomes, while the dehydrogenases were found either in peroxisomes or in mitochondria. Deletion of the genes encoding the peroxisomal acyl-CoA oxidase Pc20g01800 and the mitochondrial acyl-CoA dehydrogenase Pc20g07920 resulted in a 1.6- and 3.7-fold increase, respectively, in the production of the semi-synthetic cephalosporin intermediate adipoyl-6-APA. The deletion strains also showed reduced adipate consumption compared to the reference strain, indicating that engineering the first step of beta-oxidation successfully redirected a larger fraction of adipic acid towards cephalosporin biosynthesis.
Abstract:
The study of the hydro-physical behavior of soils using toposequences is of great importance for a better understanding of the soil, water and vegetation relationships. This study aims to provide a hydro-physical and morphological characterization of the soil along a toposequence in Galia, state of São Paulo, Brazil. The plot covers an area of 10.24 ha (320 × 320 m), located in a semi-deciduous seasonal forest. Based on ultra-detailed soil and topographic maps of the area, a representative transect of the soil in the plot was chosen. Five profiles were opened for the morphological description of the soil horizons, and hydro-physical and micromorphological analyses were performed to characterize the soil. Arenic Haplustult, Arenic Haplustalf and Aquertic Haplustalf were the soil types observed in the plot. The superficial horizons had lower density and greater hydraulic conductivity, porosity and water retention at lower tensions than the deeper horizons. In the sub-superficial horizons, greater water retention at higher tensions and lower hydraulic conductivity were observed, due to the structure type and greater clay content. The differences observed in the water retention curves between the sandy E and the clayey B horizons were mainly due to the size distribution, shape and type of soil pores.
Abstract:
Context. To date, the CoRoT space mission has produced more than 124,471 light curves. Classifying these curves in terms of unambiguous variability behavior is mandatory for obtaining an unbiased statistical view of their controlling root causes. Aims. The present study provides an overview of semi-sinusoidal light curves observed by the CoRoT exo-field CCDs. Methods. We selected a sample of 4206 light curves presenting well-defined semi-sinusoidal signatures. The variability periods were computed based on Lomb-Scargle periodograms, harmonic fits, and visual inspection. Results. Color-period diagrams for the present sample show a trend of increasing variability periods as the stars evolve. This evolutionary behavior is also noticed when comparing the period distributions in the Galactic center and anti-center directions. These aspects indicate compatibility with stellar rotation, although more information is needed to confirm their root causes. Considering this possibility, we identified a subset of three Sun-like candidates by their photometric periods. Finally, the behavior of the variability period versus color diagram was found to be highly dependent on the reddening correction.
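A minimal sketch of period determination with a Lomb-Scargle periodogram, applied to a synthetic, irregularly sampled semi-sinusoidal light curve, is shown below; the use of astropy's LombScargle and all numerical values are assumptions for illustration, not the paper's pipeline.

```python
# Sketch of period determination with a Lomb-Scargle periodogram, as used for
# the semi-sinusoidal light curves in the abstract. The light curve below is
# synthetic and irregularly sampled; astropy's LombScargle is assumed available.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(42)
true_period = 7.3                                 # days
t = np.sort(rng.uniform(0, 60, 500))              # irregular sampling (days)
flux = 1.0 + 0.02 * np.sin(2 * np.pi * t / true_period)
flux += 0.005 * rng.standard_normal(t.size)       # photometric noise

frequency, power = LombScargle(t, flux).autopower(minimum_frequency=1 / 30,
                                                  maximum_frequency=2.0)
best_period = 1.0 / frequency[np.argmax(power)]
print(f"recovered period: {best_period:.2f} d (true {true_period} d)")
```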
Abstract:
Spark Plasma Sintering (SPS) is a promising rapid consolidation technique that allows better understanding and manipulation of sintering kinetics and therefore makes it possible to obtain Si3N4-based ceramics with tailored microstructures, consisting of grains with either equiaxed or elongated morphology. The presence of an extra liquid phase is necessary for forming tough interlocking microstructures in Yb/Y-stabilised α-sialon by hot pressing (HP). The liquid is introduced by a new method, namely by increasing the O/N ratio in the general formula RExSi12-(3x+n)Al3x+nOnN16-n while keeping the cation ratios of RE, Si and Al constant. Monophasic α-sialon ceramics with tailored microstructures, consisting of either fine equiaxed or elongated grains, have been obtained by using SPS, whether or not such an extra liquid phase is involved. The three processes, namely densification, phase transformation and grain growth, which usually occur simultaneously during conventional HP consolidation of Si3N4-based ceramics, have been precisely followed and separately investigated in the SPS process. The enhanced densification is attributed to the non-equilibrium nature of the liquid phase formed during heating. The dominant mechanism during densification is enhanced grain-boundary sliding accompanied by diffusion- and/or reaction-controlled processes. The rapid grain growth is ascribed to a dynamic ripening mechanism based on the formation of a liquid phase that is grossly out of equilibrium, which in turn generates an extra chemical driving force for mass transfer. Monophasic α-sialon ceramics with interlocking microstructures exhibit improved damage tolerance. Y/Yb-stabilised monophasic α-sialon ceramics containing approximately 3 vol% liquid, with refined interlocking microstructures, have excellent thermal-shock resistance, comparable to the best β-sialon ceramics with 20 vol% additional liquid phase prepared by HP. The obtained sialon ceramics with fine-grained microstructures show markedly improved superplasticity in the presence of an electric field. The compressive strain rate reaches the order of 10^-2 s^-1 at temperatures above 1500 °C, that is, two orders of magnitude higher than has been realised so far by any other conventional approach. The high deformation rate recorded in this work opens up possibilities for making ceramic components with complex shapes through superplastic forming.
Abstract:
This thesis investigates two aspects of Constraint Handling Rules (CHR): it proposes a compositional semantics and a technique for program transformation. CHR is a concurrent committed-choice constraint logic programming language consisting of guarded rules, which transform multi-sets of atomic formulas (constraints) into simpler ones until exhaustion [Frü06]; it belongs to the family of declarative languages. It was initially designed for writing constraint solvers, but it has recently also proven to be a general-purpose language, being Turing equivalent [SSD05a].

Compositionality is the first CHR aspect considered. A trace-based compositional semantics for CHR was previously defined in [DGM05]. The reference operational semantics for that compositional model was the original operational semantics for CHR which, due to the propagation rule, admits trivial non-termination. In this thesis we extend the work of [DGM05] by introducing a more refined trace-based compositional semantics which also includes the history. The use of a history is a well-known technique in CHR which allows the application of propagation rules to be traced and, consequently, trivial non-termination to be avoided [Abd97, DSGdlBH04]. Naturally, the reference operational semantics of our new compositional one also uses a history to avoid trivial non-termination.

Program transformation is the second CHR aspect considered, with particular regard to the unfolding technique. This technique is an appealing approach for optimizing a given program, in particular for improving its run-time efficiency or space consumption. Essentially, it consists of a sequence of syntactic program manipulations which preserve a kind of semantic equivalence, called qualified answer [Frü98], between the original program and the transformed ones. Unfolding is one of the basic operations used by most program transformation systems: it consists in replacing a procedure call by its definition. In CHR, every conjunction of constraints can be considered a procedure call, every CHR rule can be considered a procedure, and the body of that rule represents the definition of the call. While there is a large body of literature on the transformation and unfolding of sequential programs, very few papers have addressed this issue for concurrent languages. We define an unfolding rule, show its correctness, and discuss some conditions under which it can be used to delete an unfolded rule while preserving the meaning of the original program. Finally, we show that confluence and termination are maintained between the original and transformed programs.

This thesis is organized as follows. Chapter 1 gives some general notions about CHR. Section 1.1 outlines the history of programming languages, with particular attention to CHR and related languages. Section 1.2 then introduces CHR using examples. Section 1.3 gives some preliminaries which will be used throughout the thesis. Subsequently, Section 1.4 introduces the syntax and the operational and declarative semantics of the first CHR language proposed. Finally, the methodologies for solving the problem of trivial non-termination related to propagation rules are discussed in Section 1.5. Chapter 2 introduces a compositional semantics for CHR in which the propagation rules are considered. In particular, Section 2.1 contains the definition of the semantics and Section 2.2 presents the compositionality results.
Afterwards, Section 2.3 presents the correctness results. Chapter 3 presents a particular program transformation known as unfolding. This transformation requires a particular syntax, called annotated syntax, which is introduced in Section 3.1; its related modified operational semantics ω′t is presented in Section 3.2. Subsequently, Section 3.3 defines the unfolding rule and proves its correctness. Then, Section 3.4 discusses the problems related to replacing a rule by its unfolded version, which in turn yields a correctness condition that holds for a specific class of rules. Section 3.5 proves that confluence and termination are preserved by the program modifications introduced. Finally, Chapter 4 concludes by discussing related work and directions for future work.
Abstract:
We present a new strategy, based on the idea of the meccano method and a novel T-mesh optimization procedure, to construct a T-spline parameterization of 2D geometries for the application of isogeometric analysis. The proposed method only demands a boundary representation of the geometry as input data. As a result, the algorithm obtains a high-quality parametric transformation between 2D objects and the parametric domain, the unit square. First, we define a parametric mapping between the input boundary of the object and the boundary of the parametric domain. Then, we build a T-mesh adapted to the geometric singularities of the domain in order to preserve the features of the object boundary with a desired tolerance...
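One classical building block for mapping a boundary-represented 2D object onto the unit square is transfinite (Coons) interpolation of four boundary curves; the sketch below shows that construction on a toy geometry. It is only a simplified stand-in for the kind of boundary-to-domain mapping described, not the meccano/T-spline parameterization method of the abstract.

```python
# Simplified sketch of a boundary-to-domain parameterization step: given four
# boundary curves of a 2D object (parameterized over [0, 1]), a Coons patch
# blends them into a map from the unit square onto the object. This is a
# classical transfinite interpolation, not the meccano/T-spline method itself.
import numpy as np

def coons_patch(bottom, top, left, right):
    """Return S(u, v) blending four boundary curves c(t) -> R^2."""
    def surface(u, v):
        ruled = ((1 - v) * bottom(u) + v * top(u)
                 + (1 - u) * left(v) + u * right(v))
        corners = ((1 - u) * (1 - v) * bottom(0) + u * (1 - v) * bottom(1)
                   + (1 - u) * v * top(0) + u * v * top(1))
        return ruled - corners
    return surface

# Toy geometry: a unit square with a curved top edge.
bottom = lambda u: np.array([u, 0.0])
top    = lambda u: np.array([u, 1.0 + 0.2 * np.sin(np.pi * u)])
left   = lambda v: np.array([0.0, v])
right  = lambda v: np.array([1.0, v])

S = coons_patch(bottom, top, left, right)
print(S(0.5, 0.5))   # image of the centre of the parametric unit square
```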