820 results for variable interest entity
Abstract:
We read with interest the article by Khosroshahi et al. on a novel method for quantifying left ventricular hypertrabeculation/noncompaction (LVHT) using two-dimensional echocardiography in children (1). We appreciate their effort to contribute to the improvement and unification of echocardiographic diagnostic criteria for LVHT, which is urgently needed. Concerning their proposed method, we have the following questions and concerns:
Abstract:
The work presented in this thesis can be divided into three areas of interest: micropore fabrication, nanopore modification, and their applications. The first part of the thesis concerns a novel, reliable, cost-effective, portable, mass-producible, robust, and easy-to-use micropore flow cell based on the resistive pulse sensing (RPS) technique. In pursuit of our first goal, finding alternative materials and processes that would shorten production times while lowering costs and improving signal quality, polyimide film was used as a substrate in which precise pores were created by femtosecond laser, and the current blockades produced by nanoparticles of different sizes were recorded. The results show that the device can detect nano-sized particles through changes in the current level. Experimental and theoretical investigations, scanning electron microscopy, and focused ion beam analysis were performed to explain the micropore's performance. The second goal was the design and fabrication of a leak-free, easy-to-assemble, and portable polymethyl methacrylate flow cell for nanopore experiments. Here, ion current rectification was studied in our nanodevice. We demonstrated a self-assembly-based, controllable, and monitorable in situ poly(L-lysine)-g-poly(ethylene glycol) coating method under voltage-driven electrolyte flow, exploiting the electrostatic interaction between the nanopore walls and the PLL backbones. Using the designed nanopore flow cell and 20 ± 4 nm SiN nanopores functionalized in situ with a PLL-g-PEG monolayer, we observed non-sticky α-1 antitrypsin (AAT) protein translocation. Additionally, we showed enhanced translocation through this non-sticky nanopore and estimated the volume of the translocated protein. By comparing AAT translocation results from functionalized and non-functionalized nanopores, we demonstrated a 105-fold dwell-time reduction (31 to 0.59 ms), a 25% amplitude enhancement (0.24 to 0.3 nA), and a 15-fold increase in event rate (1 to 15 events/s) after functionalization in 1× PBS at physiological pH. The measured AAT protein volume was close to the calculated AAT hydrodynamic volume and to previous reports.
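As a hedged illustration of how a current blockade can be turned into a particle volume in resistive pulse sensing, the sketch below uses the widely cited excluded-volume approximation ΔI ≈ σVΛ/H_eff² (DeBlois-Bean / Fologea-type); the function name and all numerical values are illustrative placeholders, not the thesis's actual procedure or measurements.

```python
# Excluded-volume estimate commonly used in resistive pulse sensing (a sketch,
# not the thesis's method): Lambda = delta_I * H_eff^2 / (sigma * V), with an
# optional shape correction factor.
def excluded_volume(delta_i, h_eff, sigma, voltage, shape_factor=1.0):
    """Estimate the excluded (particle) volume in m^3 from a blockade depth.

    delta_i : blockade depth [A]
    h_eff   : effective pore length [m]
    sigma   : electrolyte conductivity [S/m]
    voltage : applied bias [V]
    """
    return delta_i * h_eff ** 2 / (sigma * voltage * shape_factor)

# Hypothetical numbers for a thin SiN nanopore in 1x PBS (placeholders only):
lam = excluded_volume(delta_i=0.3e-9, h_eff=10e-9, sigma=1.5, voltage=0.2)
print(f"estimated excluded volume ~ {lam * 1e27:.0f} nm^3")
```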
Abstract:
Negative-ion mode electrospray ionization, ESI(−), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(−)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy better than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M − H]⁻ ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg KOH g⁻¹. To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg KOH g⁻¹. By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
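As a rough sketch of the PLS-plus-variable-selection workflow described above (here using VIP scores, one of the three methods mentioned), assuming an intensity matrix X of assigned m/z variables and a vector y of TAN values; the number of latent variables and the VIP cutoff are illustrative choices, not the paper's settings.

```python
# PLS regression with VIP-based variable selection (illustrative sketch).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """Variable Importance in the Projection for a fitted PLSRegression model."""
    t = pls.x_scores_                       # (n_samples, A) latent scores
    w = pls.x_weights_                      # (n_vars, A) weights
    q = pls.y_loadings_                     # (n_targets, A) y-loadings
    p = w.shape[0]
    ss = (q ** 2).sum(axis=0) * (t ** 2).sum(axis=0)   # explained SS per component
    wnorm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (wnorm ** 2 @ ss) / ss.sum())

# X: (samples x m/z variables) intensities, y: TAN values (mg KOH/g)
# pls = PLSRegression(n_components=5).fit(X, y)
# keep = vip_scores(pls) > 1.0              # common VIP > 1 rule of thumb
# pls_sel = PLSRegression(n_components=5).fit(X[:, keep], y)
```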
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous Tornado Alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. Reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and a limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations. Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings), it is not too costly to over-design; this observation agrees with observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
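As a hedged illustration of the risk-optimization formulation (total expected cost = construction cost plus failure probability times failure cost, minimized over the additional partial safety factor), the following sketch uses a hypothetical lognormal capacity/demand model and made-up cost figures, not the paper's tower model or wind statistics.

```python
# Reliability-based risk optimization over an extra partial safety factor (sketch).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

C_CONSTRUCTION = 1.0   # reference construction cost (illustrative units)
C_FAILURE = 50.0       # repair, rebuilding and compensation costs (illustrative)

def failure_probability(lam):
    """Pf for lognormal capacity R (scaled by the extra factor lam) vs lognormal demand S."""
    mu_ln_r, sig_ln_r = np.log(1.2 * lam), 0.15   # hypothetical statistics
    mu_ln_s, sig_ln_s = np.log(1.0), 0.35
    beta = (mu_ln_r - mu_ln_s) / np.hypot(sig_ln_r, sig_ln_s)  # reliability index
    return norm.cdf(-beta)

def total_expected_cost(lam):
    # construction cost grows with the safety factor; the risk term is Pf * cost of failure
    return C_CONSTRUCTION * (0.8 + 0.2 * lam) + failure_probability(lam) * C_FAILURE

res = minimize_scalar(total_expected_cost, bounds=(1.0, 3.0), method="bounded")
print(f"optimal extra partial factor ~ {res.x:.2f}, Pf ~ {failure_probability(res.x):.1e}")
```

Raising the failure cost C_FAILURE shifts the optimum toward larger safety factors, which is the location- and consequence-dependence the abstract describes.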
Abstract:
Common variable immunodeficiency disorder (CVID) is the commonest cause of primary antibody failure in adults and children, and is characterized clinically by recurrent bacterial infections and autoimmune manifestations. Several innate immune defects have been described in CVID, but no study has yet investigated the frequency, phenotype or function of the key regulatory cell population, natural killer T (NKT) cells. We measured the frequencies and subsets of NKT cells in patients with CVID and compared these to healthy controls. Our results show a skewing of NKT cell subsets, with CD4+ NKT cells at higher frequencies and CD8+ NKT cells at lower frequencies. However, these cells were highly activated and expressed CD161. The NKT cells showed higher expression of CCR5 and a concomitant CCR5+CD69+CXCR6 phenotype, suggesting that the remaining NKT cell population compensates through readiness for rapid effector action.
Abstract:
Background: Genome-wide association studies (GWAS) are becoming the approach of choice to identify genetic determinants of complex phenotypes and common diseases. The astonishing amount of generated data and the use of distinct genotyping platforms with variable genomic coverage are still analytical challenges. Imputation algorithms combine information from directly genotyped markers with the haplotypic structure of the population of interest to infer poorly genotyped or missing markers, and are considered a near-zero-cost approach to allow the comparison and combination of data generated in different studies. Several reports have stated that imputed markers have overall acceptable accuracy, but no published report has performed a pairwise comparison of imputed and empirical association statistics for a complete set of GWAS markers. Results: In this report we identified a total of 73 imputed markers that yielded a nominally statistically significant association at P < 10⁻⁵ for type 2 diabetes mellitus and compared them with results obtained based on empirical allelic frequencies. Interestingly, despite their overall high correlation, association statistics based on imputed frequencies were discordant in 35 of the 73 (47%) associated markers, considerably inflating the type I error rate of imputed markers. We comprehensively tested several quality thresholds, the haplotypic structure underlying imputed markers, and the use of flanking markers as predictors of inaccurate association statistics derived from imputed markers. Conclusions: Our results suggest that association statistics from imputed markers within a specific minor allele frequency (MAF) range, located in weak linkage disequilibrium blocks, or strongly deviating from local patterns of association are prone to inflated false-positive association signals. The present study highlights the potential of imputation procedures and proposes simple procedures for selecting the best imputed markers for follow-up genotyping studies.
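A minimal sketch of the pairwise check described above: the same allelic chi-square statistic computed once from directly typed genotypes and once from hard-called imputed dosages, then compared at the P < 10⁻⁵ threshold. Variable names and the rounding rule for dosages are illustrative assumptions.

```python
# Compare association statistics from empirical vs imputed genotypes (sketch).
import numpy as np
from scipy.stats import chi2_contingency

def allelic_chi2_p(genotypes, status):
    """P-value of the 2x2 allelic test; genotypes coded 0/1/2, status 0=control, 1=case."""
    g, s = np.asarray(genotypes), np.asarray(status)
    minor_cases = g[s == 1].sum()
    minor_ctrls = g[s == 0].sum()
    table = np.array([
        [minor_cases, 2 * (s == 1).sum() - minor_cases],   # cases: minor / major alleles
        [minor_ctrls, 2 * (s == 0).sum() - minor_ctrls],   # controls
    ])
    _, p, _, _ = chi2_contingency(table, correction=False)
    return p

# p_emp = allelic_chi2_p(typed_genotypes, status)
# p_imp = allelic_chi2_p(np.rint(imputed_dosages), status)   # hard-call the dosages
# discordant = (p_imp < 1e-5) != (p_emp < 1e-5)
```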
Abstract:
We report optical observations of the luminous blue variable (LBV) HR Carinae which show that the star has reached a visual minimum phase in 2009. More importantly, we detected absorptions due to Si λλ4088-4116. To match their observed line profiles from 2009 May, a high rotational velocity of v_rot ≃ 150 ± 20 km s⁻¹ is needed (assuming an inclination angle of 30°), implying that HR Car rotates at ≃0.88 ± 0.2 of its critical velocity for breakup (v_crit). Our results suggest that fast rotation is typical in all strong-variable, bona fide Galactic LBVs which present S-Dor-type variability. Strong-variable LBVs are located in a well-defined region of the HR diagram during visual minimum (the "LBV minimum instability strip"). We suggest this region corresponds to where v_crit is reached. To the left of this strip, a forbidden zone with v_rot/v_crit > 1 is present, explaining why no LBVs are detected in this zone. Since dormant/ex LBVs like P Cygni and HD 168625 have low v_rot, we propose that LBVs can be separated into two groups: fast-rotating, strong-variable stars showing S-Dor cycles (such as AG Car and HR Car) and slow-rotating stars with much less variability (such as P Cygni and HD 168625). We speculate that supernova (SN) progenitors which had S-Dor cycles before exploding (such as in SN 2001ig, SN 2003bg, and SN 2005gj) could have been fast rotators. We suggest that the potential difficulty of fast-rotating Galactic LBVs in losing angular momentum is additional evidence that such stars could explode during the LBV phase.
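For readers unpacking the quoted numbers, the deprojection and the implied critical velocity follow from simple arithmetic; this is our reading of the abstract's figures, not values stated by the authors.

```latex
\[
  v\sin i \;=\; v_{\mathrm{rot}}\sin 30^{\circ} \;\simeq\; 75~\mathrm{km\,s^{-1}},
  \qquad
  v_{\mathrm{crit}} \;\simeq\; \frac{v_{\mathrm{rot}}}{0.88} \;=\; \frac{150}{0.88}
  \;\approx\; 170~\mathrm{km\,s^{-1}}.
\]
```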
Abstract:
Aims. In this work, we describe the pipeline for the fast supervised classification of light curves observed by the CoRoT exoplanet CCDs. We present the classification results obtained for the first four measured fields, which represent one year of in-orbit operation. Methods. The basis of the adopted supervised classification methodology has been described in detail in a previous paper, as has its application to the OGLE database. Here, we present the modifications of the algorithms and of the training set made to optimize the performance when applied to the CoRoT data. Results. Classification results are presented for the observed fields IRa01, SRc01, LRc01, and LRa01 of the CoRoT mission. Statistics on the number of variables and the number of objects per class are given, and typical light curves of high-probability candidates are shown. We also report on new stellar variability types discovered in the CoRoT data. The full classification results are publicly available.
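A generic sketch of supervised light-curve classification in the spirit of the pipeline above: periodogram-based features fed to a trained classifier. The feature set and the random-forest classifier are illustrative stand-ins; the actual CoRoT pipeline follows the methodology of the earlier paper cited in the abstract.

```python
# Feature extraction and supervised classification of light curves (illustrative sketch).
import numpy as np
from astropy.timeseries import LombScargle
from sklearn.ensemble import RandomForestClassifier

def light_curve_features(time, flux, n_freq=3):
    """Dominant frequencies, their powers, and simple statistics of a light curve."""
    freq, power = LombScargle(time, flux).autopower(nyquist_factor=5)
    top = np.argsort(power)[::-1][:n_freq]
    return np.concatenate([freq[top], power[top], [np.std(flux), np.ptp(flux)]])

# X_train = np.vstack([light_curve_features(t, f) for t, f in training_curves])
# clf = RandomForestClassifier(n_estimators=300).fit(X_train, training_labels)
# class_probs = clf.predict_proba(light_curve_features(t_new, f_new)[None, :])
```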
Abstract:
We show that scalable multipartite entanglement among light fields may be generated by optical parametric oscillators (OPOs). The tripartite entanglement existing among the three bright beams produced by a single OPO (pump, signal, and idler) is scalable to a system of many OPOs by pumping them in cascade with the same optical field. The latter serves as an entanglement distributor. The special case of two OPOs is studied, and it is shown that the resulting five bright beams share genuine multipartite entanglement. In addition, the structure of entanglement distribution among the fields can be manipulated to some degree by tuning the incident pump power. The scalability to many fields is straightforward, allowing an alternative implementation of a multipartite quantum information network with continuous variables.
Abstract:
We report the discovery with XMM-Newton of hard thermal (T ~ 130 MK) and variable X-ray emission from the Be star HD 157832, a new member of the puzzling class of gamma Cas-like Be/X-ray systems. Recent optical spectroscopy reveals the presence of a large, dense circumstellar disk seen at intermediate to high inclination. With a B1.5V spectral type, HD 157832 is the coolest gamma Cas analog known. In addition, its non-detection in the ROSAT all-sky survey shows that its average soft X-ray luminosity varied by a factor larger than ~3 over a time interval of 14 yr. These two remarkable features, a "low" effective temperature and likely high X-ray variability, make HD 157832 a promising object for understanding the origin of the unusually high-temperature X-ray emission in these systems.
Abstract:
Context tree models were introduced by Rissanen in [25] as a parsimonious generalization of Markov models. Since then, they have been widely used in applied probability and statistics. The present paper investigates non-asymptotic properties of two popular procedures of context tree estimation: Rissanen's algorithm Context and penalized maximum likelihood. After first showing how they are related, we prove finite-horizon bounds for the probability of over- and under-estimation. Concerning overestimation, no boundedness or loss-of-memory conditions are required: the proof relies on new deviation inequalities for empirical probabilities, of independent interest. The under-estimation properties rely on classical hypotheses for processes of infinite memory. These results improve on and generalize the bounds obtained in Duarte et al. (2006) [12], Galves et al. (2008) [18], Galves and Leonardi (2008) [17], and Leonardi (2010) [22], refining asymptotic results of Bühlmann and Wyner (1999) [4] and Csiszár and Talata (2006) [9].
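For readers unfamiliar with the second procedure, the following is a compact BIC-style sketch of penalized maximum likelihood context tree selection for binary chains; the penalty constant and function names are illustrative assumptions, not the exact estimator analyzed in the paper.

```python
# Penalized maximum likelihood context tree estimation, binary alphabet (sketch).
from collections import defaultdict
from math import log

def count_contexts(x, max_depth):
    """Counts of the next symbol after every context w with |w| <= max_depth."""
    counts = defaultdict(lambda: [0, 0])       # context tuple -> [count of 0, count of 1]
    for t in range(max_depth, len(x)):
        for d in range(max_depth + 1):
            counts[tuple(x[t - d:t])][x[t]] += 1
    return counts

def log_lik(c):
    """Maximized log-likelihood of the transitions out of one context."""
    n = sum(c)
    return sum(ci * log(ci / n) for ci in c if ci > 0) if n else 0.0

def best_subtree(counts, ctx, max_depth, pen):
    """Return (leaves, penalized log-likelihood) of the best subtree rooted at ctx."""
    leaf = ([ctx], log_lik(counts[ctx]) - pen)
    if len(ctx) == max_depth:
        return leaf
    kept, score = [], 0.0
    for a in (0, 1):
        child = (a,) + ctx                     # contexts grow into the past
        if sum(counts[child]) > 0:
            k, s = best_subtree(counts, child, max_depth, pen)
            kept += k
            score += s
    return leaf if not kept or leaf[1] >= score else (kept, score)

# x = [...]                                    # binary sample, list of 0/1
# counts = count_contexts(x, max_depth=4)
# tree, _ = best_subtree(counts, (), 4, pen=0.5 * log(len(x)))   # BIC-type penalty
```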
Abstract:
We consider binary infinite-order stochastic chains perturbed by random noise. This means that at each time step, the value assumed by the chain can be randomly and independently flipped with a small fixed probability. We show that the transition probabilities of the perturbed chain are uniformly close to the corresponding transition probabilities of the original chain. As a consequence, in the case of stochastic chains with unbounded but otherwise finite variable-length memory, we show that it is possible to recover the context tree of the original chain using a suitable version of the algorithm Context, provided that the noise is small enough.
Abstract:
It is widely assumed that optimal timing of larval release is of major importance to offspring survival, but the extent to which environmental factors entrain synchronous reproductive rhythms in natural populations is not well known. We sampled the broods of ovigerous females of the common shore crab Pachygrapsus transversus at both sheltered and exposed rocky shores interspersed along a 50-km coastline, during four different periods, to better assess inter-population differences in the timing of larval release and to test for the effect of wave action. Shore-specific patterns were consistent through time. Maximum release fell within 1 day around syzygies on all shores, which matched the dates of maximum tidal amplitude. Within this very narrow range, populations at exposed shores hatched earlier than those at sheltered areas, possibly due to mechanical stimulation by wave action. Average departures from syzygial release ranged consistently among shores from 2.4 to 3.3 days, but in this case we found no evidence for an effect of wave exposure. Therefore, processes varying at the scale of a few kilometres affect the precision of semilunar timing and may produce differences in the survival of recently hatched larvae. Understanding the underlying mechanisms causing departures from the presumed optimal release timing is thus important for a more comprehensive evaluation of the reproductive success of invertebrate populations.
Abstract:
The application of laser-induced breakdown spectrometry (LIBS) to the direct analysis of plant materials is a great challenge that still requires effort for its development and validation. To this end, a series of experimental approaches has been carried out to show that LIBS can be used as an alternative to wet-acid-digestion-based methods for the analysis of agricultural and environmental samples. The large amount of information provided by LIBS spectra for these complex samples increases the difficulty of selecting the most appropriate wavelengths for each analyte. Some applications have suggested that improvements in both accuracy and precision can be achieved by applying multivariate calibration to LIBS data, compared with univariate regression developed with line emission intensities. In the present work, the performance of univariate and multivariate calibration, the latter based on partial least squares regression (PLSR), was compared for the analysis of pellets of plant materials made from an appropriate mixture of cryogenically ground samples with cellulose as the binding agent. The development of a specific PLSR model for each analyte and the selection of spectral regions containing only lines of the analyte of interest were the best conditions for the analysis. In this particular application, the models showed similar performance, but PLSR seemed to be more robust due to a lower occurrence of outliers in comparison to the univariate method. The data suggest that efforts on sample presentation and the fitness of standards for LIBS analysis must be made in order to fulfill the boundary conditions for matrix-independent development and validation.
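A minimal sketch of the univariate-versus-PLSR comparison described above, using cross-validated predictions; the wavelength index, the spectral window, and the number of latent variables are illustrative assumptions, not the paper's settings.

```python
# Univariate line-intensity calibration vs PLSR on a spectral window (sketch).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

def compare_calibrations(spectra, y, line_idx, window, n_components=3):
    """RMSECV of a single-line univariate model vs a PLSR model on a window of lines."""
    uni = cross_val_predict(LinearRegression(), spectra[:, [line_idx]], y, cv=5)
    pls = cross_val_predict(PLSRegression(n_components=n_components),
                            spectra[:, window], y, cv=5).ravel()
    rmse = lambda pred: float(np.sqrt(np.mean((pred - y) ** 2)))
    return {"univariate RMSECV": rmse(uni), "PLSR RMSECV": rmse(pls)}

# spectra: (pellets x wavelengths) LIBS intensities; y: reference concentrations
# compare_calibrations(spectra, y, line_idx=1024, window=slice(1000, 1050))
```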