949 results for monotone missing data
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
Rockburst is characterized by a violent explosion of a block causing a sudden rupture in the rock, and is quite common in deep tunnels. It is critical to understand the phenomenon of rockburst, focusing on the patterns of occurrence, so that these events can be avoided and/or managed, saving costs and possibly lives. The failure mechanism of rockburst needs to be better understood. Laboratory experiments are under way at the State Key Laboratory for Geomechanics and Deep Underground Engineering (SKLGDUE) in Beijing, and its testing system is described. A large number of rockburst tests were performed, and their information was collected, stored in a database, and analyzed. Data Mining (DM) techniques were applied to the database in order to develop predictive models for the rockburst maximum stress (σRB) and rockburst risk index (IRB), parameters that otherwise require the results of such tests to be determined. With the developed models it is possible to predict these parameters with high accuracy using data from the rock mass and the specific project.
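The modelling step described above can be sketched in code. This is an illustrative sketch only: the feature names (depth, uniaxial compressive strength, stress ratio), the synthetic data, and the choice of a random-forest regressor are all assumptions for demonstration, not the actual SKLGDUE database or the models developed in the thesis.

```python
# Hypothetical sketch: predicting a rockburst maximum stress (sigma_RB)
# from rock-mass/project features with a regression model.
# Features and data are synthetic stand-ins, not the real test database.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical features: depth (m), UCS (MPa), in-situ stress ratio
X = rng.uniform([100, 50, 0.1], [1500, 250, 2.0], size=(n, 3))
# Synthetic target with noise, standing in for measured sigma_RB (MPa)
sigma_rb = 0.05 * X[:, 0] + 0.4 * X[:, 1] + 30 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, sigma_rb, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
score = r2_score(y_te, model.predict(X_te))  # goodness of fit on held-out tests
```

On data this clean the held-out R² is high; the point is only the shape of the pipeline (database of tests → features → fitted predictive model).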
Abstract:
In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches to calibration are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. The models are validated against eddy covariance data from the LBA site C14. Comparing the performance of the two models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of residual response to different environmental variables) the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit, fails to respond to variations in the diffuse fraction, and has skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, combined with the separation of incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
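A minimal "big leaf" light-response sketch helps fix ideas: canopy photosynthesis is often modelled as a non-rectangular hyperbola of absorbed light. The parameter values below (quantum yield alpha, light-saturated rate a_max, curvature theta) are illustrative defaults, not values calibrated to the LBA C14 site or used in the study.

```python
# Sketch of a big-leaf canopy light-response curve (non-rectangular
# hyperbola). All parameter values are illustrative assumptions.
import math

def big_leaf_gpp(par, alpha=0.05, a_max=25.0, theta=0.9):
    """Canopy assimilation (umol m-2 s-1) for absorbed PAR (umol m-2 s-1)."""
    b = alpha * par + a_max
    # Smaller root of theta*A^2 - (alpha*PAR + a_max)*A + alpha*PAR*a_max = 0
    return (b - math.sqrt(b * b - 4.0 * theta * alpha * par * a_max)) / (2.0 * theta)

low = big_leaf_gpp(200.0)    # light-limited regime
high = big_leaf_gpp(2000.0)  # approaching light saturation below a_max
```

The sun/shade approach discussed above applies a curve like this separately to sunlit leaves (direct + diffuse light) and shaded leaves (diffuse only), which is what lets it respond to variations in the diffuse fraction.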
Abstract:
A search is presented for the direct pair production of a chargino and a neutralino, pp → χ̃₁±χ̃₂⁰, where the chargino decays to the lightest neutralino and the W boson, χ̃₁± → χ̃₁⁰(W± → ℓ±ν), while the neutralino decays to the lightest neutralino and the 125 GeV Higgs boson, χ̃₂⁰ → χ̃₁⁰(h → bb̄/γγ/ℓ±νqq). The final states considered for the search have large missing transverse momentum, an isolated electron or muon, and one of the following: either two jets identified as originating from bottom quarks, or two photons, or a second electron or muon with the same electric charge. The analysis is based on 20.3 fb⁻¹ of √s = 8 TeV proton-proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with the Standard Model expectations, and limits are set in the context of a simplified supersymmetric model.
Abstract:
A search is performed for Higgs bosons produced in association with top quarks using the diphoton decay mode of the Higgs boson. Selection requirements are optimized separately for leptonic and fully hadronic final states from the top quark decays. The dataset used corresponds to an integrated luminosity of 4.5 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 7 TeV and 20.3 fb⁻¹ at 8 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. No significant excess over the background prediction is observed, and upper limits are set on the tt̄H production cross section. The observed exclusion upper limit at 95% confidence level is 6.7 times the predicted Standard Model cross section value. In addition, limits are set on the strength of the Yukawa coupling between the top quark and the Higgs boson, taking into account the dependence of the tt̄H and tH cross sections, as well as the H → γγ branching fraction, on the Yukawa coupling. Lower and upper limits at 95% confidence level are set at −1.3 and +8.0 times the Yukawa coupling strength in the Standard Model.
Abstract:
Searches are performed for resonant and non-resonant Higgs boson pair production in the hh → γγbb̄ final state using 20 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the CERN Large Hadron Collider. A 95% confidence level upper limit on the cross section times branching ratio of non-resonant production is set at 2.2 pb, while the expected limit is 1.0 pb. The corresponding limit observed for a narrow resonance ranges between 0.8 and 3.5 pb as a function of its mass.
Abstract:
Results of a search for decays of massive particles to fully hadronic final states are presented. This search uses 20.3 fb⁻¹ of data collected by the ATLAS detector in √s = 8 TeV proton-proton collisions at the LHC. Signatures based on high jet multiplicities without requirements on the missing transverse momentum are used to search for R-parity-violating supersymmetric gluino pair production with subsequent decays to quarks. The analysis is performed using a requirement on the number of jets, in combination with separate requirements on the number of b-tagged jets, as well as a topological observable formed from the scalar sum of the mass values of large-radius jets in the event. Results are interpreted in the context of all possible branching ratios of direct gluino decays to various quark flavors. No significant deviation is observed from the expected Standard Model backgrounds estimated using jet-counting as well as data-driven templates of the total-jet-mass spectra. Gluino pair decays to ten or more quarks via intermediate neutralinos are excluded for a gluino with mass mg̃ < 1 TeV for a neutralino mass mχ̃₁⁰ = 500 GeV. Direct gluino decays to six quarks are excluded for mg̃ < 917 GeV for light-flavor final states, and results for various flavor hypotheses are presented.
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
The study of the interaction between hair filaments and formulations or peptides is of utmost importance in fields like cosmetic research. The structure of keratin intermediate filaments is not fully described, which limits molecular dynamics (MD) studies in this field despite their high potential to advance the area. We developed a computational model of a truncated protofibril, and simulated its behavior in alcohol-based formulations and with one peptide. The simulations showed a strong interaction between the benzyl alcohol molecules of the formulations and the model, leading to a disorganization of the keratin chains that regresses when the alcohol molecules are removed. This behavior can explain the increased peptide uptake in hair shafts evidenced in fluorescence microscopy images. The developed model is valid for computationally reproducing the interaction between hair and alcoholic formulations, and provides a robust base for new MD studies of hair properties. This shows that MD simulations can advance hair cosmetic research, for example by improving the uptake of a compound of interest.
Abstract:
Propolis is a chemically complex biomass produced by honeybees (Apis mellifera) from plant resins, with the addition of salivary enzymes, beeswax, and pollen. The biological activities described for propolis have also been identified in the donor plants' resins, but a major challenge for the standardization of the chemical composition and biological effects of propolis remains a better understanding of the influence of seasonality on the chemical constituents of that raw material. Since propolis quality depends, among other variables, on the local flora, which is strongly influenced by (a)biotic factors over the seasons, unravelling the harvest-season effect on the propolis chemical profile is an issue of recognized importance. Fast, cheap, and robust analytical techniques seem to be the best choice for large-scale quality control in the most demanding markets, e.g., human health applications. To this end, UV-Visible (UV-Vis) scanning spectrophotometry of hydroalcoholic extracts (HE) of seventy-three propolis samples, collected over the seasons in 2014 (summer, spring, autumn, and winter) and 2015 (summer and autumn) in Southern Brazil, was adopted. Machine learning and chemometric techniques were then applied to the UV-Vis dataset to gain insights into the effect of seasonality on the claimed chemical heterogeneity of propolis samples, determined by changes in the flora of the geographic region under study. Descriptive and classification models were built following a chemometric approach, i.e., principal component analysis (PCA) and hierarchical clustering analysis (HCA), supported by scripts written in the R language. The UV-Vis profiles associated with chemometric analysis allowed the identification of a typical pattern in propolis samples collected in the summer.
Importantly, the discrimination based on PCA could be improved by using the dataset of the fingerprint region of phenolic compounds (λ = 280-400 nm), suggesting that besides their biological activities, those secondary metabolites also play a relevant role in the discrimination and classification of that complex matrix through bioinformatics tools. Finally, a series of machine learning approaches, e.g., partial least squares-discriminant analysis (PLS-DA), k-Nearest Neighbors (kNN), and Decision Trees, proved complementary to PCA and HCA, yielding relevant information as to sample discrimination.
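The PCA-plus-HCA step can be sketched as follows. The spectra below are synthetic two-"season" stand-ins, not the 73 propolis extracts, and the study's actual scripts were written in R (e.g. prcomp/hclust); this Python sketch only illustrates the shape of the analysis.

```python
# Hedged sketch: PCA scores on UV-Vis spectra in the phenolic fingerprint
# region, followed by hierarchical clustering (Ward linkage) of the scores.
# Spectra are synthetic; peak positions/amplitudes are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
wavelengths = np.arange(280, 401)  # fingerprint region, nm
band = np.exp(-((wavelengths - 320) ** 2) / 800)  # a generic phenolic band
# Two synthetic groups: "summer-like" samples with a stronger band
summer = 1.5 * band + rng.normal(0, 0.02, (10, wavelengths.size))
other = 0.8 * band + rng.normal(0, 0.02, (10, wavelengths.size))
spectra = np.vstack([summer, other])

scores = PCA(n_components=2).fit_transform(spectra)  # descriptive model (PCA)
clusters = fcluster(linkage(scores, method="ward"),  # HCA on the PCA scores
                    t=2, criterion="maxclust")
```

With a clear seasonal difference in the fingerprint region, the two groups separate along the first principal component and the clustering recovers them, which mirrors the "typical summer pattern" finding described above.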
Abstract:
DNA microarrays are one of the most widely used technologies for gene expression measurement. However, there are several distinct microarray platforms, from different manufacturers, each with its own measurement protocol, resulting in data that can hardly be compared or directly integrated. Data integration from multiple sources aims to strengthen statistical tests, reducing the data dimensionality problem. The integration of heterogeneous DNA microarray platforms comprises a set of tasks that range from the re-annotation of the features used for gene expression, to data normalization and batch effect elimination. In this work, a complete methodology for gene expression data integration and application is proposed, comprising a transcript-based re-annotation process and several methods for batch effect attenuation. The integrated data will be used to select the best feature set and learning algorithm for a brain tumor classification case study. The integration will consider data from heterogeneous Agilent and Affymetrix platforms, collected from public gene expression databases such as The Cancer Genome Atlas and Gene Expression Omnibus.
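One very simple form of the batch-effect attenuation mentioned above is per-batch centering and scaling of each gene's expression values. This is a minimal sketch on synthetic data; real integration pipelines typically use more elaborate empirical-Bayes methods (e.g. ComBat), and nothing here should be read as the thesis's actual method.

```python
# Minimal sketch of batch-effect attenuation: standardize each gene
# within each batch so platform-specific offsets are removed.
# Data are synthetic; the 10x offset simulates a strong batch effect.
import numpy as np

def center_scale_by_batch(expr, batches):
    """expr: samples x genes matrix; batches: batch label per sample."""
    out = np.empty_like(expr, dtype=float)
    for b in np.unique(batches):
        idx = batches == b
        mu = expr[idx].mean(axis=0)
        sd = expr[idx].std(axis=0)
        sd = np.where(sd == 0, 1.0, sd)  # avoid division by zero
        out[idx] = (expr[idx] - mu) / sd
    return out

rng = np.random.default_rng(2)
expr = rng.normal(0, 1, (8, 5))
expr[:4] += 10.0  # first "platform" measures with a large systematic offset
batches = np.array([1, 1, 1, 1, 2, 2, 2, 2])
adjusted = center_scale_by_batch(expr, batches)
```

After adjustment the per-batch offsets vanish, so downstream feature selection and classification see biological rather than platform variation.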
Abstract:
Transcriptional Regulatory Networks (TRNs) are a powerful tool for representing the many interactions that occur within a cell. Recent studies have provided information to help researchers in the tasks of building and understanding these networks. One of the major sources of information for building TRNs is the biomedical literature. However, due to the rapidly increasing number of scientific papers, it is quite difficult to analyse the large number of papers published on this subject. This fact has heightened the importance of Biomedical Text Mining approaches in this task. Also, owing to the lack of adequate standards, as the number of databases increases, inconsistencies concerning gene and protein names and identifiers are common. In this work, we developed an integrated approach for the reconstruction of TRNs that retrieves the relevant information from important biological databases and inserts it into a unique repository named KREN. We also applied text mining techniques over this integrated repository to build TRNs. To do so, it was necessary to create a dictionary of names and synonyms associated with these entities, and to develop an approach that retrieves all the abstracts of the related scientific papers stored on PubMed, in order to create a corpus of data about genes. Furthermore, these tasks were integrated into @Note, a software system that provides methods from the Biomedical Text Mining field, including algorithms for Named Entity Recognition (NER), extraction of all relevant terms from publication abstracts, and extraction of relationships between biological entities (genes, proteins and transcription factors). Finally, this tool was extended to allow the reconstruction of Transcriptional Regulatory Networks from the scientific literature.
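The dictionary-based NER step can be illustrated with a toy example. The dictionary entries and the sample abstract below are invented for demonstration; they are not the KREN repository contents or @Note's actual implementation.

```python
# Hedged sketch of dictionary-based named entity recognition over an
# abstract, the kind of step a biomedical text mining pipeline performs.
# Dictionary and text are toy examples (hypothetical).
import re

gene_dictionary = {
    "lexA": "gene", "recA": "gene",   # gene symbols
    "LexA": "protein", "RecA": "protein",  # protein names
}

def tag_entities(text, dictionary):
    """Return (surface form, entity type, character offset) for each hit."""
    hits = []
    for name, etype in dictionary.items():
        for m in re.finditer(r"\b" + re.escape(name) + r"\b", text):
            hits.append((name, etype, m.start()))
    return sorted(hits, key=lambda h: h[2])

abstract = "LexA represses recA transcription upon DNA damage."
entities = tag_entities(abstract, gene_dictionary)
```

Recognized entity pairs co-occurring in a sentence would then feed the relationship-extraction step, from which regulatory edges of the TRN are assembled.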
Abstract:
Dijet events produced in LHC proton-proton collisions at a center-of-mass energy √s = 8 TeV are studied with the ATLAS detector using the full 2012 data set, with an integrated luminosity of 20.3 fb⁻¹. Dijet masses up to about 4.5 TeV are probed. No resonance-like features are observed in the dijet mass spectrum. Limits on the cross section times acceptance are set at the 95% credibility level for various hypotheses of new phenomena in terms of mass or energy scale, as appropriate. This analysis excludes excited quarks with a mass below 4.09 TeV, color-octet scalars with a mass below 2.72 TeV, heavy W′ bosons with a mass below 2.45 TeV, chiral W∗ bosons with a mass below 1.75 TeV, and quantum black holes with six extra space-time dimensions with threshold mass below 5.82 TeV.
Abstract:
This Letter presents a search at the LHC for s-channel single top-quark production in proton-proton collisions at a centre-of-mass energy of 8 TeV. The analyzed data set was recorded by the ATLAS detector and corresponds to an integrated luminosity of 20.3 fb⁻¹. Selected events contain one charged lepton, large missing transverse momentum, and exactly two b-tagged jets. A multivariate event classifier based on boosted decision trees is developed to discriminate s-channel single top-quark events from the main background contributions. The signal extraction is based on a binned maximum-likelihood fit of the output classifier distribution. The analysis leads to an upper limit on the s-channel single top-quark production cross-section of 14.6 pb at the 95% confidence level. The fit gives a cross-section of σs = 5.0 ± 4.3 pb, consistent with the Standard Model expectation.
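The multivariate step above can be sketched generically: a boosted-decision-tree classifier is trained to separate signal-like from background-like events, and the distribution of its output is what enters the binned maximum-likelihood fit. The event features below are synthetic stand-ins, not ATLAS analysis variables, and the classifier settings are arbitrary illustrative choices.

```python
# Illustrative BDT sketch: separate two overlapping event populations.
# Features and populations are synthetic assumptions for demonstration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(3)
n = 1000
background = rng.normal(0.0, 1.0, (n, 3))  # stand-ins for kinematic variables
signal = rng.normal(1.0, 1.0, (n, 3))      # signal shifted in each variable
X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])

bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3).fit(X, y)
# The histogram of this discriminant would feed the binned
# maximum-likelihood fit used to extract the cross-section.
disc = bdt.predict_proba(X)[:, 1]
```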
Abstract:
Simultaneous measurements of the tt̄, W⁺W⁻, and Z/γ∗ → ττ production cross-sections using an integrated luminosity of 4.6 fb⁻¹ of pp collisions at √s = 7 TeV collected by the ATLAS detector at the LHC are presented. Events are selected with two high transverse momentum leptons consisting of an oppositely charged electron and muon pair. The three processes are separated using the distributions of the missing transverse momentum of events with zero and greater than zero jet multiplicities. Measurements of the fiducial cross-section are presented along with results that quantify for the first time the underlying correlations in the predicted and measured cross-sections due to proton parton distribution functions. These results indicate that the correlated NLO predictions for tt̄ and Z/γ∗ → ττ significantly underestimate the data, while those at NNLO generally describe the data well. The full cross-sections are measured to be σ(tt̄) = 181.2 ± 2.8 +9.7/−9.5 ± 3.3 ± 3.3 pb, σ(W⁺W⁻) = 53.3 ± 2.7 +7.3/−8.0 ± 1.0 ± 0.5 pb, and σ(Z/γ∗ → ττ) = 1174 ± 24 +72/−87 ± 21 ± 9 pb, where the cited uncertainties are due to statistics, systematic effects, luminosity and the LHC beam energy measurement, respectively.