901 results for Non-integer voltage ratio
Abstract:
PURPOSE Gender differences in paediatric patients with inflammatory bowel disease (IBD) are frequently reported as a secondary outcome, and the results are divergent. We aimed to assess gender differences by analysing data on children with IBD collected within the Swiss IBD cohort study database since 2008, using the Montreal classification for a systematic approach. METHODS Data on gender, age, anthropometrics, disease location at diagnosis, disease behaviour, and therapy of 196 patients, 105 with Crohn's disease (CD) and 91 with ulcerative or indeterminate colitis (UC/IC), were retrieved and analysed. RESULTS The crude gender ratio (male : female) of patients with CD diagnosed at <10 years of age was 2.57 and the adjusted ratio was 2.42; in patients with UC/IC the corresponding ratios were 0.68 and 0.64. The non-adjusted gender ratio of patients diagnosed at ≥10 years was 1.58 for CD and 0.88 for UC/IC. Boys with UC/IC diagnosed at <10 years of age had a longer diagnostic delay, and greater use of azathioprine was observed in girls diagnosed with UC/IC at >10 years. No other gender difference was found after analysis of age, disease location and behaviour at diagnosis, duration of disease, familial occurrence of IBD, prevalence of extra-intestinal manifestations, complications, and requirement for surgery. CONCLUSION CD in children <10 years predominantly affects boys, with a sex ratio of 2.57; the impact of sex hormones on the development of CD in pre-pubertal male patients should be investigated.
Abstract:
The mid-Pliocene was an episode of prolonged global warmth and strong North Atlantic thermohaline circulation, interrupted briefly at circa 3.30 Ma by a global cooling event corresponding to marine isotope stage (MIS) M2. Paleoceanographic changes in the eastern North Atlantic have been reconstructed between circa 3.35 and 3.24 Ma at Deep Sea Drilling Project Site 610 and Integrated Ocean Drilling Program Site 1308. Mg/Ca ratios and δ18O from Globigerina bulloides are used to reconstruct the temperature and relative salinity of surface waters, and dinoflagellate cyst assemblages are used to assess variability in the North Atlantic Current (NAC). Our sea surface temperature data indicate warm waters at both sites before and after MIS M2 but a cooling of ~2-3°C during MIS M2. A dinoflagellate cyst assemblage overturn marked by a decline in Operculodinium centrocarpum reflects a southward shift or slowdown of the NAC between circa 3.330 and 3.283 Ma, reducing northward heat transport 23-35 ka before the global ice volume maximum of MIS M2. This would have established conditions that ultimately allowed the Greenland ice sheet to expand, leading to the global cooling event at MIS M2. Comparison with an ice-rafted debris record excludes fresh water input via icebergs in the northeast Atlantic as a cause of NAC decline. The mechanism causing the temporary disruption of the NAC may be related to a brief reopening of the Panamanian Gateway at about this time.
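As background to the Mg/Ca-based sea surface temperature estimates above, published foraminiferal Mg/Ca calibrations are typically exponential in temperature; the constants A and B below are generic placeholders that differ by species and study, and the values actually used for G. bulloides in this work are not reproduced here.

```latex
% Generic exponential Mg/Ca palaeothermometry calibration (A and B are placeholders,
% not the calibration constants used in the study above).
\[
  \mathrm{Mg/Ca} = B\,e^{A T}
  \quad\Longrightarrow\quad
  T = \frac{1}{A}\,\ln\!\left(\frac{\mathrm{Mg/Ca}}{B}\right)
\]
```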
Abstract:
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and their effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it does require more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
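For readers unfamiliar with the aggregation methods compared above, a minimal Python sketch of two of them follows: the inverse-variance weighted mean difference and the (log-scale) parametric response ratio. The numbers are hypothetical, and this is not the simulation code used in the study.

```python
import numpy as np

def weighted_mean_difference(mean_diffs, variances):
    """Inverse-variance weighted mean difference across a set of experiments."""
    w = 1.0 / np.asarray(variances, dtype=float)
    return float(np.sum(w * np.asarray(mean_diffs, dtype=float)) / np.sum(w))

def log_response_ratio(mean_treatment, mean_control):
    """Parametric response ratio, conventionally aggregated on the log scale."""
    return float(np.log(mean_treatment / mean_control))

# Hypothetical summary results from three experiments
print(weighted_mean_difference([1.2, 0.8, 1.5], [0.4, 0.9, 0.6]))
print(log_response_ratio(12.5, 10.0))
```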
Abstract:
The aim of this study is to analyse the influence of the additive scheme on the development of proportional reasoning in secondary school students. 558 secondary school students answered a questionnaire of proportional and non-proportional problems. The results indicate (i) that students' ability to identify proportional relationships in proportional problems does not necessarily imply that they are able to correctly identify additive relationships in non-proportional problems, and vice versa; and (ii) that the type of multiplicative relationship between the quantities (integer or non-integer) influenced the level of success in solving both proportional and non-proportional problems.
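For orientation, the contrast studied above can be illustrated with hypothetical items (not taken from the study's questionnaire): a proportional situation relates quantities by a constant factor, which may or may not be an integer, whereas an additive situation relates them by a constant difference.

```latex
% Hypothetical illustrative items (not from the questionnaire used in the study).
% Proportional relation (constant factor): if 3 notebooks cost 6 euros,
% then 9 notebooks cost 18 euros (scaling factor 9/3 = 3, an integer),
% while 7 notebooks cost 14 euros (scaling factor 7/3, non-integer).
\[
  y = kx
\]
% Additive relation (constant difference): two runners 10 m apart stay 10 m apart,
% so when one has run 70 m the other has run 80 m, not (80/70) times as far.
\[
  y = x + c
\]
```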
Abstract:
Adjuvant arthritis (AA) is a condition that involves systemic oxidative stress. Unexpectedly, it was found that sarcoplasmic reticulum Ca2+-ATPase (SERCA) activity was elevated in muscles of rats with AA compared to controls, suggesting possible conformational changes in the enzyme. There was no alteration in the nucleotide binding site, but rather in the transmembrane domain, according to the tryptophan polar/non-polar fluorescence ratio. Higher relative expression of SERCA and a higher content of nitrotyrosine, but no increase in phospholipid oxidation, were found in AA SR. In vitro treatments of SR with HOCl showed that in AA animals SERCA activity was more susceptible to oxidative stress, but SR phospholipids were more resistant, and SERCA could also be activated by phosphatidic acid. It was concluded that the increased SERCA activity in AA was due to increased levels of SERCA protein and structural changes to the protein, probably induced by direct and specific oxidation involving reactive nitrogen species.
Abstract:
Mathematics Subject Classification: 26A33 (main), 35A22, 78A25, 93A30
Abstract:
Aims: To investigate concordance with medication, as assessed at baseline and at 1- and 2-year follow-up, and to examine factors associated with non-concordance in a UK-resident South-Asian population. Methods: Data from the UK Asian Diabetes Study were analysed. Concordance with medications was assessed and recorded at three time points during the study. Multiple logistic regression was used to investigate the factors associated with non-concordance: the associations of baseline factors with year 1 concordance, and of baseline plus year 1 factors with year 2 concordance. Results: Data for 403 patients from seven practices participating in the UK Asian Diabetes Study were analysed. The numbers of patients who were non-concordant were: 63 (16%) at baseline; 101 (25%) at year 1; and 122 (30%) at year 2. The baseline-measured variables that were significantly associated with year 1 non-concordance included diabetes duration, history of cardiovascular disease, components of the EuroQol quality of life questionnaire, the EQ-5D score, and number of medications prescribed. In multivariable analyses, the most important determinant of year 1 non-concordance was baseline non-concordance: odds ratio 13.6 (95% confidence limits 4.7, 39.9). The number of medications prescribed for blood pressure control was also significant: odds ratio 1.8 (95% confidence limits 1.4, 2.4). Similar results were observed for year 2 non-concordance. Conclusions: Non-concordance with medications was common and more likely in people prescribed more medications. The current target-driven management of risk factor levels may lead to increasing numbers and doses of medications. Considering the high cost of medications and the implications of poor health behaviours for morbidity and mortality, further investigation of prescribing behaviours and the factors affecting patient concordance is required.
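The odds ratios reported above come from multiple logistic regression. The following minimal sketch shows how such odds ratios and confidence limits are typically obtained with statsmodels; the data are entirely synthetic and the variable names are illustrative, not the UK Asian Diabetes Study dataset or analysis code.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data: baseline non-concordance and number of BP medications
# as predictors of year-1 non-concordance (1 = non-concordant).
rng = np.random.default_rng(0)
n = 400
baseline_nc = rng.integers(0, 2, n)
n_bp_meds = rng.integers(0, 5, n)
logit_p = -2.0 + 2.6 * baseline_nc + 0.6 * n_bp_meds
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([baseline_nc, n_bp_meds]))
fit = sm.Logit(y, X).fit(disp=0)

# Exponentiated coefficients give odds ratios; exponentiated bounds give 95% limits.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```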
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and have inherent calibration and noise effects, or to software techniques based on filtering the binning effect, which do not successfully preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
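The abstract does not disclose the patented mapping itself, but the general idea of accumulating a log-transformed histogram "in a non-integer and multi-channel fashion" can be sketched as follows: each event's log-scaled position falls between two output channels, and its unit weight is split between them. This is an illustrative sketch only, with an assumed channel count and dynamic range; it is not the method developed in the dissertation.

```python
import numpy as np

def accumulate_log_histogram(values, n_channels=256, vmin=1.0, vmax=262144.0):
    """Accumulate a log-scaled histogram, splitting each event's unit weight
    between the two nearest output channels according to the fractional
    (non-integer) part of its log-scaled channel position."""
    hist = np.zeros(n_channels)
    log_min, log_max = np.log10(vmin), np.log10(vmax)
    scale = (n_channels - 1) / (log_max - log_min)
    for v in values:
        pos = (np.log10(max(v, vmin)) - log_min) * scale  # non-integer channel index
        pos = min(pos, n_channels - 1.0)                  # clamp to the top channel
        lo = int(pos)
        frac = pos - lo
        hist[lo] += 1.0 - frac
        if lo + 1 < n_channels:
            hist[lo + 1] += frac
    return hist

# Example: synthetic intensities spanning the assumed dynamic range
events = np.random.default_rng(1).uniform(1.0, 262144.0, 10000)
print(accumulate_log_histogram(events).sum())  # total weight ~ number of events
```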
Abstract:
This thesis is concerned with the question of when the double branched cover of an alternating knot can arise by Dehn surgery on a knot in S^3. We approach this problem using a surgery obstruction, first developed by Greene, which combines Donaldson's Diagonalization Theorem with the $d$-invariants of Ozsvath and Szabo's Heegaard Floer homology. This obstruction shows that if the double branched cover of an alternating knot or link L arises by surgery on a knot in S^3, then for any alternating diagram the lattice associated to the Goeritz matrix takes the form of a changemaker lattice. By analyzing the structure of changemaker lattices, we show that the double branched cover of L arises by non-integer surgery on a knot in S^3 if and only if L has an alternating diagram which can be obtained by rational tangle replacement on an almost-alternating diagram of the unknot. When one considers half-integer surgery, the resulting tangle replacement is simply a crossing change. This allows us to show that an alternating knot has unknotting number one if and only if it has an unknotting crossing in every alternating diagram. These techniques also produce several other interesting results: they have applications to characterizing slopes of torus knots; they produce a new proof of a theorem of Tsukamoto on the structure of almost-alternating diagrams of the unknot; and they provide several bounds on surgeries producing the double branched covers of alternating knots which are direct generalizations of results previously known for lens space surgeries. Here, a rational number p/q is said to be a characterizing slope for K in S^3 if the oriented homeomorphism type of the manifold obtained by p/q-surgery on K determines K uniquely. The thesis begins with an exposition of the changemaker surgery obstruction, giving an amalgamation of results due to Gibbons, Greene and the author. It then gives background material on alternating knots and changemaker lattices. The latter part of the thesis is then taken up with applications of this theory.
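For orientation only, and stated from memory of Greene's work rather than from the thesis itself: a changemaker lattice is, roughly, the orthogonal complement in a standard integer lattice of a vector satisfying the "changemaker" condition below, which says that each coordinate can be assembled as change from the smaller ones.

```latex
% Changemaker condition (as commonly stated in Greene's work; recalled here for context,
% not quoted from the thesis).
\[
  \sigma = (\sigma_1, \dots, \sigma_n), \qquad
  0 \le \sigma_1 \le \cdots \le \sigma_n, \qquad
  \sigma_i \le 1 + \sum_{j < i} \sigma_j \ \text{ for all } i.
\]
```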
Abstract:
As the semiconductor industry struggles to maintain its momentum down the path of Moore's Law, three-dimensional integrated circuit (3D IC) technology has emerged as a promising solution to achieve higher integration density, better performance, and lower power consumption. However, despite its significant improvement in electrical performance, 3D IC presents several serious physical design challenges. In this dissertation, we investigate physical design methodologies for 3D ICs with a primary focus on two areas: low-power 3D clock tree design, and reliability degradation modeling and management. Clock trees are essential parts of digital systems and dissipate a large amount of power due to their high capacitive loads. The majority of existing 3D clock tree designs focus on minimizing the total wire length, which produces sub-optimal results for power optimization. In this dissertation, we formulate a 3D clock tree design flow which directly optimizes for clock power. We also investigate the design methodology for clock gating a 3D clock tree, which uses shutdown gates to selectively turn off unnecessary clock activities. Unlike in 2D ICs, where shutdown gates are commonly assumed to be cheap and can therefore be applied at every clock node, shutdown gates in 3D ICs introduce additional control TSVs, which compete with clock TSVs for placement resources. We explore design methodologies to produce the optimal allocation and placement of clock and control TSVs so that the clock power is minimized. We show that the proposed synthesis flow saves significant clock power while accounting for available TSV placement area. Vertical integration also brings new reliability challenges, including TSV electromigration (EM) and several other reliability loss mechanisms caused by TSV-induced stress. These reliability loss models involve complex inter-dependencies between electrical and thermal conditions, which have not been investigated in the past. In this dissertation we set up an electrical/thermal/reliability co-simulation framework to capture the transient behavior of reliability loss in 3D ICs. We further derive and validate an analytical reliability objective function that can be integrated into the 3D placement design flow. The reliability-aware placement scheme enables co-design and co-optimization of both the electrical and reliability properties, thus improving both the circuit's performance and its lifetime. Our electrical/reliability co-design scheme avoids unnecessary design cycles or the application of ad-hoc fixes that lead to sub-optimal performance. Vertical integration also enables stacking DRAM on top of CPUs, providing high bandwidth and short latency. However, non-uniform voltage fluctuation and local thermal hotspots in CPU layers are coupled into DRAM layers, causing a non-uniform distribution of bit-cell leakage (and thereby of bit flips). We propose a performance-power-resilience simulation framework to capture DRAM soft errors in 3D multi-core CPU systems. In addition, a dynamic resilience management (DRM) scheme is investigated, which adaptively tunes the CPU's operating points to adjust DRAM's voltage noise and thermal condition during runtime. The DRM uses dynamic frequency scaling to achieve a resilience borrow-in strategy, which effectively enhances DRAM's resilience without sacrificing performance. The proposed physical design methodologies should act as important building blocks for 3D ICs and push 3D ICs toward mainstream acceptance in the near future.
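As a rough illustration of why clock trees dominate dynamic power, and why clock-TSV capacitance enters the 3D formulation above, the standard dynamic-power estimate P ≈ α·C·Vdd²·f can be sketched as below; the capacitance, voltage and frequency figures are hypothetical and are not values from the dissertation.

```python
def clock_dynamic_power(node_caps_f, tsv_caps_f, vdd=1.0, f_clk=1e9, activity=1.0):
    """Estimate dynamic clock power P = alpha * C_total * Vdd^2 * f.
    The clock net switches every cycle, so its activity factor is taken as ~1."""
    c_total = sum(node_caps_f) + sum(tsv_caps_f)
    return activity * c_total * vdd ** 2 * f_clk

# Hypothetical tree: 1000 sinks at 5 fF each plus 20 clock TSVs at 50 fF each
print(clock_dynamic_power([5e-15] * 1000, [50e-15] * 20))  # ~6 mW at 1 V, 1 GHz
```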
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper presents an efficiency investigation of an isolated high step-up ratio dc-dc converter aimed at energy processing from low-voltage high-current energy sources, such as batteries, photovoltaic modules or fuel cells. The considered converter consists of an interleaved active-clamp flyback topology combined with a voltage multiplier at the transformer secondary side, capable of two different operating modes, i.e. resonant and non-resonant, depending on the design of the output capacitors. The main goal of this paper is to compare these two operating modes from the component-losses point of view with the aim of maximizing the overall converter efficiency. The approach is based on loss prediction using steady-state theoretical models (designed in the Mathcad environment), taking into account both conduction and switching losses. The models are compared with steady-state simulations and experimental results for different operating modes to validate the approach. © 2012 IEEE.
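As a minimal illustration of the component-loss bookkeeping described above (conduction plus switching losses), the sketch below uses textbook first-order formulas; the device parameters are hypothetical and this is not the Mathcad model referenced in the paper.

```python
def mosfet_conduction_loss(i_rms, r_ds_on):
    """Conduction loss: P = I_rms^2 * R_ds(on)."""
    return i_rms ** 2 * r_ds_on

def mosfet_switching_loss(v_ds, i_d, t_rise, t_fall, f_sw):
    """First-order hard-switching loss: P = 0.5 * V * I * (tr + tf) * f_sw."""
    return 0.5 * v_ds * i_d * (t_rise + t_fall) * f_sw

# Hypothetical low-voltage, high-current primary-side switch
p_total = (mosfet_conduction_loss(i_rms=15.0, r_ds_on=5e-3)
           + mosfet_switching_loss(v_ds=48.0, i_d=15.0,
                                   t_rise=20e-9, t_fall=30e-9, f_sw=100e3))
print(f"{p_total:.2f} W")  # total semiconductor loss for this one device
```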