942 results for non-parametric background modeling
Abstract:
Specifying the non-functional requirements of applications and determining the resources required for their execution are activities that demand a great deal of technical knowledge and frequently result in inefficient use of resources. Cloud computing is an alternative for resource provisioning, which can be done using the provider's own infrastructure, the infrastructure of one or more public clouds, or a combination of both. It enables more flexible, elastic use of resources, but does not solve the specification problem. In this paper we present an approach that uses models at runtime to simplify the specification of non-functional requirements and resources, aiming to provide dynamic support for application execution in cloud computing environments with shared resources. © 2013 IEEE.
Abstract:
Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, taking advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could.

The idea of the ionized cluster beam (ICB) thin film deposition technique was first proposed by Takagi in 1972. It was based on using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for the formation of cluster beams suitable for thin film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development.

In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices.

The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory.

A novel in situ, real-time mass, energy and velocity measurement apparatus has been designed, built and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from the problems of other cluster size measurement methods, such as the need for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and lack of real-time capability. Ion energies measured with the electrostatic energy analyzer are in good agreement with values obtained from computer simulation. The velocity (v) is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured mass of the background gas is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes.

Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of conductor to insulator dimension, and substrate thickness.
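As a worked illustration of the time-of-flight measurement principle described above, the following sketch computes a cluster mass from a measured ion energy and a pulse-delay velocity via m = 2E/v². The drift length, energy and delay values are hypothetical placeholders, not figures from the dissertation.

    # Illustrative sketch (hypothetical numbers): cluster mass from time-of-flight data.
    # m = 2E / v^2, with E from the electrostatic energy analyzer and
    # v from the pulse-to-signal delay over a known drift length.

    E_CHARGE = 1.602176634e-19   # C, elementary charge
    AMU = 1.66053906660e-27      # kg, atomic mass unit

    def cluster_mass_amu(energy_eV: float, drift_length_m: float, delay_s: float) -> float:
        """Return the particle mass in amu from its kinetic energy and time of flight."""
        energy_J = energy_eV * E_CHARGE          # convert analyzer reading to joules
        velocity = drift_length_m / delay_s      # time-of-flight velocity
        mass_kg = 2.0 * energy_J / velocity**2   # m = 2E / v^2
        return mass_kg / AMU

    # Example with made-up numbers: a 500 eV cluster crossing 0.5 m in 60 microseconds.
    if __name__ == "__main__":
        m = cluster_mass_amu(energy_eV=500.0, drift_length_m=0.5, delay_s=60e-6)
        print(f"Estimated cluster mass: {m:.0f} amu (~{m / 65.38:.0f} Zn atoms)")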
Abstract:
Network simulation is an indispensable tool for studying Internet-scale networks due to their heterogeneous structure, immense size and changing properties. It is crucial for network simulators to generate representative traffic, which is necessary for effectively evaluating next-generation network protocols and applications. In network simulation, a distinction can be made between foreground traffic, which is generated by the target applications the researchers intend to study and therefore must be simulated with high fidelity, and background traffic, which represents the network traffic generated by other applications and does not require the same accuracy. The background traffic has a significant impact on the foreground traffic, since it competes with the foreground traffic for network resources and can therefore drastically affect the behavior of the applications that produce the foreground traffic. This dissertation aims to provide a solution for meaningfully generating background traffic, in three aspects. First is realism. Realistic traffic characterization plays an important role in determining the correct outcome of simulation studies. This work starts by enhancing an existing fluid background traffic model, removing its two unrealistic assumptions. The improved model can correctly reflect the network conditions in the reverse direction of the data traffic and can reproduce the traffic burstiness observed in measurements. Second is scalability. The trade-off between accuracy and scalability is a constant theme in background traffic modeling. This work presents a fast rate-based TCP (RTCP) traffic model, which uses analytical models to represent TCP congestion control behavior. This model outperforms other existing traffic models in that it can correctly capture the overall TCP behavior while achieving a speedup of more than two orders of magnitude over the corresponding packet-oriented simulation. Third is network-wide traffic generation. Regardless of how detailed or scalable the models are, they mainly focus on generating traffic on a single link, which cannot be extended easily to studies of more complicated network scenarios. This work presents a cluster-based spatio-temporal background traffic generation model that considers spatial and temporal traffic characteristics as well as their correlations. The resulting model can be used effectively for evaluation work in network studies.
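The dissertation's RTCP model is not reproduced here; as a generic illustration of the rate-based, analytical style of TCP modeling it refers to, the sketch below evaluates the well-known steady-state throughput approximation rate ≈ MSS / (RTT · sqrt(2p/3)) for a few hypothetical loss rates.

    # Generic rate-based TCP illustration (not the dissertation's RTCP model):
    # steady-state throughput from the Mathis approximation,
    #   rate ≈ MSS / (RTT * sqrt(2p/3)),
    # where p is the packet loss probability.
    import math

    def tcp_rate_bps(mss_bytes: int, rtt_s: float, loss_prob: float) -> float:
        """Approximate long-run TCP sending rate in bits per second."""
        segments_per_rtt = 1.0 / math.sqrt(2.0 * loss_prob / 3.0)
        return segments_per_rtt * mss_bytes * 8.0 / rtt_s

    # Hypothetical link: 1460-byte segments, 50 ms RTT, a range of loss rates.
    for p in (1e-4, 1e-3, 1e-2):
        print(f"loss={p:.0e}  rate≈{tcp_rate_bps(1460, 0.05, p) / 1e6:.2f} Mbit/s")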
Abstract:
OBJECTIVE: To demonstrate the application of causal inference methods to observational data in the obstetrics and gynecology field, particularly causal modeling and semi-parametric estimation. BACKGROUND: Human immunodeficiency virus (HIV)-positive women are at increased risk for cervical cancer and its treatable precursors. Determining whether potential risk factors such as hormonal contraception are true causes is critical for informing public health strategies as longevity increases among HIV-positive women in developing countries. METHODS: We developed a causal model of the factors related to combined oral contraceptive (COC) use and cervical intraepithelial neoplasia 2 or greater (CIN2+) and modified the model to fit the observed data, drawn from women in a cervical cancer screening program at HIV clinics in Kenya. Assumptions required for substantiation of a causal relationship were assessed. We estimated the population-level association using semi-parametric methods: g-computation, inverse probability of treatment weighting, and targeted maximum likelihood estimation. RESULTS: We identified 2 plausible causal paths from COC use to CIN2+: via HPV infection and via increased disease progression. Study data enabled estimation of the latter only with strong assumptions of no unmeasured confounding. Of 2,519 women under 50 screened per protocol, 219 (8.7%) were diagnosed with CIN2+. Marginal modeling suggested a 2.9% (95% confidence interval 0.1%, 6.9%) increase in prevalence of CIN2+ if all women under 50 were exposed to COC; the significance of this association was sensitive to method of estimation and exposure misclassification. CONCLUSION: Use of causal modeling enabled clear representation of the causal relationship of interest and the assumptions required to estimate that relationship from the observed data. Semi-parametric estimation methods provided flexibility and reduced reliance on correct model form. Although selected results suggest an increased prevalence of CIN2+ associated with COC, evidence is insufficient to conclude causality. Priority areas for future studies to better satisfy causal criteria are identified.
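For readers unfamiliar with the semi-parametric estimators named above, the following sketch shows inverse probability of treatment weighting (IPTW) for a marginal prevalence difference with a binary exposure and outcome. The column names and confounder set are hypothetical; this illustrates the general technique only, not the study's actual analysis.

    # Minimal IPTW sketch (illustrative only; hypothetical variables, not the study's analysis).
    # Estimate the marginal difference in outcome prevalence under exposure vs. no exposure.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def iptw_prevalence_difference(df: pd.DataFrame, exposure: str, outcome: str,
                                   confounders: list[str]) -> float:
        # 1. Propensity score: P(exposure = 1 | confounders).
        ps_model = LogisticRegression(max_iter=1000).fit(df[confounders], df[exposure])
        ps = ps_model.predict_proba(df[confounders])[:, 1]

        # 2. Stabilized weights: exposed get P(A=1)/ps, unexposed get P(A=0)/(1-ps).
        p_exposed = df[exposure].mean()
        w = np.where(df[exposure] == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

        # 3. Weighted outcome means per exposure group; their difference is the
        #    marginal (population-level) prevalence difference.
        exposed = df[exposure] == 1
        mean_1 = np.average(df.loc[exposed, outcome], weights=w[exposed])
        mean_0 = np.average(df.loc[~exposed, outcome], weights=w[~exposed])
        return mean_1 - mean_0

    # Usage (hypothetical column names):
    # diff = iptw_prevalence_difference(data, "coc_use", "cin2plus", ["age", "hiv_stage", "parity"])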
Abstract:
The protein lysate array is an emerging technology for quantifying the protein concentration ratios in multiple biological samples. It is gaining popularity, and has the potential to answer questions about post-translational modifications and protein pathway relationships. Statistical inference for a parametric quantification procedure has been inadequately addressed in the literature, mainly due to two challenges: the increasing dimension of the parameter space and the need to account for dependence in the data. Each chapter of this thesis addresses one of these issues. In Chapter 1, an introduction to protein lysate array quantification is presented, followed by the motivations and goals for this thesis work. In Chapter 2, we develop a multi-step procedure for the sigmoidal models, ensuring consistent estimation of the concentration level with full asymptotic efficiency. The results obtained in this chapter justify inferential procedures based on large-sample approximations. Simulation studies and real data analysis are used to illustrate the performance of the proposed method in finite samples. The multi-step procedure is simpler in both theory and computation than the single-step least squares method that has been used in current practice. In Chapter 3, we introduce a new model that accounts for the dependence structure of the errors through a nonlinear mixed effects model. We consider a method to approximate the maximum likelihood estimator of all the parameters. Using simulation studies on various error structures, we show that for data with non-i.i.d. errors the proposed method leads to more accurate estimates and better confidence intervals than the existing single-step least squares method.
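Quantification of this kind rests on fitting a sigmoidal response curve to a dilution series; the sketch below fits a four-parameter logistic model with scipy and back-calculates a relative concentration from an observed intensity. It illustrates the model class only, with made-up data, and is not the thesis's multi-step estimator.

    # Generic sigmoidal (4-parameter logistic) calibration sketch — not the thesis's
    # multi-step estimator, just an illustration of the model class it works with.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, lower, upper, log_ec50, slope):
        """4-parameter logistic: intensity as a function of log2 dilution/concentration x."""
        return lower + (upper - lower) / (1.0 + np.exp(-slope * (x - log_ec50)))

    # Hypothetical dilution series (log2 steps) and observed spot intensities.
    log_conc = np.arange(0, 8)
    intensity = np.array([0.11, 0.15, 0.24, 0.45, 0.80, 1.20, 1.48, 1.58])

    params, _ = curve_fit(four_pl, log_conc, intensity, p0=[0.1, 1.6, 3.5, 1.0])

    def invert(y, lower, upper, log_ec50, slope):
        """Back-calculate the (log) concentration that produces intensity y."""
        return log_ec50 - np.log((upper - lower) / (y - lower) - 1.0) / slope

    print("Estimated log-concentration for intensity 1.0:", invert(1.0, *params))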
Abstract:
The development of innovative carbon-based materials can be greatly facilitated by molecular modeling techniques. Although the Reax Force Field (ReaxFF) can be used to simulate the chemical behavior of carbon-based systems, the simulation settings required for accurate predictions have not been fully explored. Using ReaxFF, molecular dynamics (MD) simulations are performed to model the chemical behavior of pure carbon and hydrocarbon reactive gases that are involved in the formation of carbon structures such as graphite, buckyballs, amorphous carbon, and carbon nanotubes. It is determined that the maximum simulation time step that can be used in MD simulations with ReaxFF depends on the simulated temperature and the selected parameter set, as do the predicted reaction rates. It is also determined that different carbon-based reactive gases react at different rates, and that the predicted equilibrium structures are generally the same for the different ReaxFF parameter sets, except in the case of the predicted formation of large graphitic structures with the Chenoweth parameter set under specific conditions.
Abstract:
OBJECTIVES: We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK), 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. SETTING AND SAMPLE POPULATION: This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. MATERIALS AND METHODS: Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed that of the earlier survey as closely as possible. RESULTS: We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2,666 (90.4%) of a potential 2,948 records were collected. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. CONCLUSIONS: Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data.
Abstract:
Building Information Modelling has been changing the design and construction field ever since it entered the market. It took some time to show its capabilities, and it takes some time to master before all of its best features can be exploited. Because it was conceived to be adopted from the earliest stage of design, in order to get the most out of project decisions, it still struggles to adapt to existing buildings. In fact, a branch of the methodology dedicated to what has already been built is called Historic BIM, or HBIM. This study aims to clarify what BIM and HBIM are, both from a theoretical point of view and in practice, by applying the state of the art from scratch to a case study. The fortress of San Felice sul Panaro was chosen: a marvellous building with a thousand years of history in its bricks, which has suffered violent earthquakes but is still standing. By means of this example, the limits that can be encountered when applying the BIM methodology to existing heritage are shown, and the new possibilities that a simple 2D design could not achieve are pointed out.
Abstract:
In this thesis we focus on non-standard signatures in CMB polarisation, which might hint at the existence of new phenomena beyond the standard models of Cosmology and Particle Physics. With the Planck ESA mission, CMB temperature anisotropies have been observed at the cosmic-variance limit, but polarisation remains to be further investigated. CMB polarisation data are important not only because they contribute to tighter constraints on cosmological parameters, but also because they allow the investigation of physical processes that would be precluded if only the CMB temperature maps were considered. We take polarisation data into account to assess the statistical significance of the anomalies currently observed only in the CMB temperature map and to constrain the Cosmic Birefringence (CB) effect, which is expected in parity-violating extensions of standard electromagnetism. In particular, we propose a new one-dimensional estimator for the lack-of-power anomaly capable of taking temperature and polarisation into account jointly. With the aim of studying anisotropic CB, we develop and apply two different and complementary methods for estimating the power spectrum of the CB. Finally, by employing these estimators and methodologies on Planck data, we provide new constraints beyond what is already known in the literature. The measurement of CMB polarisation represents a technological challenge, and to make accurate estimates one has to keep exquisite control of systematic effects. In order to investigate the impact of spurious signals in forthcoming CMB polarisation experiments, we study the interplay between half-wave plate (HWP) non-idealities and the beams. Our analysis suggests that certain HWP configurations, depending on the complexity of Galactic foregrounds and the beam models, significantly impact the B-mode reconstruction fidelity and could limit the capabilities of next-generation CMB experiments. We also provide a first study of the impact of non-ideal HWPs on CB.
Abstract:
Background and Aim: Acute cardiac rejection is currently diagnosed by endomyocardial biopsy (EMB), but multiparametric cardiac magnetic resonance (CMR) may be a non-invasive alternative thanks to its capacity for characterizing myocardial structure and function. Our primary aim was to determine the utility of multiparametric CMR in identifying acute graft rejection in paediatric heart transplant recipients. The second aim was to compare textural features of parametric maps in cases of rejection versus those without rejection. Methods: Fifteen patients were prospectively enrolled for contrast-enhanced CMR followed by EMB and right heart catheterization. Images were acquired on a 1.5 Tesla scanner, including T1 mapping (modified Look-Locker inversion recovery sequence, MOLLI) and T2 mapping (modified GraSE sequence). The extracellular volume (ECV) was calculated using pre- and post-gadolinium T1 times of blood and myocardium and the patient's hematocrit. Markers of graft dysfunction, including hemodynamic measurements from echocardiography, catheterization and CMR, were collated. Patients were divided into two groups based on the degree of rejection at EMB: no rejection with no change in treatment (Group A) and acute rejection requiring new therapy (Group B). Statistical analysis included Student's t-test and Pearson correlation. Results: Acute rejection was diagnosed in five patients. Mean T1 values were significantly associated with acute rejection. A monotonic, increasing trend was noted in both mean and peak T1 values with increasing degree of rejection. ECV was significantly higher in Group B. There was no difference in T2 signal between the two groups. Conclusion: Multiparametric CMR can serve as a non-invasive screening tool during surveillance encounters and may be used to identify those patients that may be at higher risk of rejection and therefore require further evaluation. Future multicenter studies are necessary to confirm these results and to explore whether multiparametric CMR can decrease the number of surveillance EMBs in paediatric heart transplant recipients.
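The abstract states that ECV was computed from pre- and post-gadolinium T1 times and hematocrit; a commonly used formula for this (assumed here, since the exact protocol is not given) is ECV = (1 − Hct) · Δ(1/T1)_myocardium / Δ(1/T1)_blood, as in the sketch below with hypothetical values.

    # Common ECV formula from T1 mapping (assumed; the study's exact protocol is not stated):
    #   ECV = (1 - hematocrit) * (1/T1_myo_post - 1/T1_myo_pre) / (1/T1_blood_post - 1/T1_blood_pre)
    def extracellular_volume(t1_myo_pre_ms, t1_myo_post_ms,
                             t1_blood_pre_ms, t1_blood_post_ms, hematocrit):
        d_r1_myo = 1.0 / t1_myo_post_ms - 1.0 / t1_myo_pre_ms      # myocardial relaxation-rate change
        d_r1_blood = 1.0 / t1_blood_post_ms - 1.0 / t1_blood_pre_ms  # blood relaxation-rate change
        return (1.0 - hematocrit) * d_r1_myo / d_r1_blood

    # Hypothetical values (ms): native and post-gadolinium T1 of myocardium and blood.
    ecv = extracellular_volume(1000.0, 450.0, 1600.0, 300.0, hematocrit=0.40)
    print(f"ECV ≈ {ecv:.1%}")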
Abstract:
This thesis aims to understand the behavior of a low-rise unreinforced masonry (URM) building, the typical residential house in the Netherlands, when subjected to low-intensity earthquakes. In fact, in the last decades the Groningen region has been hit by several shallow earthquakes caused by the extraction of natural gas. In particular, the focus is on the internal non-structural walls and on their interaction with the structural parts of the building. A simple and cost-efficient 2D FEM model is developed, focused on the interfaces representing the mortar layers present between the non-structural walls and the rest of the structure. As a reference for geometries and materials, a prototype built at full scale at the EUCENTRE laboratory in Pavia (Italy) was taken into consideration. First, a quasi-static analysis is performed by gradually applying a prescribed displacement to the roof floor of the structure. Sensitivity analyses are conducted on some key parameters characterizing the mortar; these allow for the calibration of their values and the evaluation of the reliability of the model. Subsequently, a transient analysis is performed to subject the model to a seismic action and thus also evaluate the mechanical response of the building over time. By creating a model representing the entire considered structure, it was possible to compare the results of this analysis with the displacements recorded in the experimental tests. As a result, some conditions for the model calibration are defined. The reliability of the model is then confirmed both by the reasonable results obtained from the sensitivity analyses and by the compatibility between the top displacement of the roof floor measured in the experimental test and the same value obtained from the structural model.
Abstract:
Historic vaulted masonry structures often need strengthening interventions that can effectively improve their structural performance, especially during seismic events, while at the same time respecting the existing setting and modern conservation requirements. In this context, the use of innovative materials such as fiber-reinforced composites has been shown to be an effective solution that can satisfy both aspects. This work aims to provide insight into the computational modeling of a full-scale masonry vault strengthened with fiber-reinforced composite materials and to analyze the influence of the arrangement of the reinforcement on the efficiency of the intervention. First, a parametric model of a cross vault focusing on a realistic representation of its micro-geometry is proposed. Then, numerical pushover analyses of several barrel vaults reinforced with different reinforcement configurations are performed. Finally, the results are collected and discussed in terms of the force-displacement curves obtained for each proposed configuration.
Abstract:
Background: Obesity is a public health problem, and it is necessary to identify whether non-symptomatic obese women should undergo endometrial evaluation. Aims: To determine the prevalence of endometrial hyperplasia and cancer in non-symptomatic overweight or obese women. Methods: A cross-sectional study was carried out in 193 women submitted to an endometrial biopsy using a Pipelle de Cornier. The findings were classified as normal, hyperplasia or cancer, and the results were compared to body mass index (BMI; kg/m²). For the purpose of statistical analysis, women were divided into two groups, women of reproductive age and postmenopausal women, and classified according to BMI as overweight or obese. Results: The prevalence of endometrial cancer and hyperplasia was 1.0% and 5.8% in women of reproductive age and 3.0% and 12.1% in postmenopausal women, respectively. According to logistic regression, being postmenopausal increased the odds ratio (OR) for endometrial hyperplasia and cancer to 1.19 (95% confidence interval (CI): 0.36-3.90), while being postmenopausal and severely obese increased the OR to 1.58 (95% CI: 0.30-8.23) and being postmenopausal and morbidly obese increased the OR to 2.72 (95% CI: 0.65-11.5). No increase in risk was found in women of reproductive age who were either overweight or obese. Discussion: Our results show that non-symptomatic, severely or morbidly obese postmenopausal women have a high risk of developing endometrial hyperplasia or cancer; however, no such risk was found for women of reproductive age.
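As a generic illustration of how odds ratios and confidence intervals of this kind are obtained from a logistic regression (hypothetical variable names, not the study's dataset or model), consider the following sketch.

    # Generic logistic-regression odds-ratio sketch (hypothetical variables; not the study's data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def odds_ratios(df: pd.DataFrame, outcome: str, predictors: list[str]) -> pd.DataFrame:
        X = sm.add_constant(df[predictors])          # intercept + binary/continuous predictors
        fit = sm.Logit(df[outcome], X).fit(disp=0)   # maximum-likelihood logistic regression
        or_point = np.exp(fit.params)                # exponentiated coefficients = odds ratios
        ci = np.exp(fit.conf_int())                  # 95% CI on the OR scale
        return pd.DataFrame({"OR": or_point, "2.5%": ci[0], "97.5%": ci[1]})

    # Usage (hypothetical columns): outcome = hyperplasia/cancer, predictors = menopause and BMI class.
    # print(odds_ratios(data, "hyperplasia_or_cancer",
    #                   ["postmenopausal", "severely_obese", "morbidly_obese"]))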
Abstract:
Background: Celery (Apium graveolens) represents a relevant allergen source that can elicit severe reactions in the adult population. The aim was to investigate the sensitization prevalence and cross-reactivity of Api g 2 from celery stalks in a Mediterranean population and in a mouse model. Methodology: 786 non-randomized subjects from Italy were screened for IgE reactivity to rApi g 2, rArt v 3 (mugwort pollen LTP) and nPru p 3 (peach LTP) using an allergen microarray. Clinical data of 32 selected patients with reactivity to the LTPs under investigation were evaluated. Specific IgE titers and cross-inhibitions were determined by ELISA and allergen microarray. Balb/c mice were immunized with the purified LTPs; IgG titers were determined by ELISA, and mediator release was examined using RBL-2H3 cells. Simulated endolysosomal digestion was performed using microsomes obtained from human DCs. Results: IgE testing showed a sensitization prevalence of 25.6% to Api g 2, 18.6% to Art v 3, and 28.6% to Pru p 3, and frequent co-sensitization with correlating IgE reactivity was observed. 10/32 patients suffering from LTP-related allergy reported symptoms upon consumption of celery stalks, which mainly presented as oral allergy syndrome (OAS). Considerable IgE cross-reactivity was observed between Api g 2, Art v 3, and Pru p 3, with varying degrees of inhibition for individual patients' sera. Simulating LTP mono-sensitization in a mouse model showed the development of more congruent antibody specificities between Api g 2 and Art v 3. Notably, biologically relevant murine IgE cross-reactivity was restricted to the latter and distinct from Pru p 3 epitopes. Endolysosomal processing of the LTPs showed generation of similar clusters, which presumably represent T-cell peptides. Conclusions: Api g 2 represents a relevant celery stalk allergen in the LTP-sensitized population. The molecule shares B-cell epitopes, and endolysosomal peptides that encompass T-cell epitopes, with pollen- and plant food-derived LTPs.
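Cross-inhibition results of the kind reported above are commonly expressed as percent inhibition of the IgE signal; a minimal sketch of that standard calculation (generic formula with made-up optical densities, not the study's data) follows.

    # Standard percent-inhibition calculation for a cross-inhibition ELISA
    # (generic formula; the example values are hypothetical, not the study's raw data).
    def percent_inhibition(od_without_inhibitor: float, od_with_inhibitor: float,
                           od_blank: float = 0.0) -> float:
        """100 * (1 - blank-corrected signal with pre-incubated inhibitor / uninhibited signal)."""
        uninhibited = od_without_inhibitor - od_blank
        inhibited = od_with_inhibitor - od_blank
        return 100.0 * (1.0 - inhibited / uninhibited)

    # Example with made-up optical densities: serum pre-incubated with one LTP before testing another.
    print(f"{percent_inhibition(1.20, 0.35, od_blank=0.05):.1f}% inhibition")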