790 results for Transmembrane Helix Prediction
Abstract:
Seasonal forecast skill of the basinwide and regional tropical cyclone (TC) activity in an experimental coupled prediction system based on the ECMWF System 4 is assessed. As part of a collaboration between the Center for Ocean–Land–Atmosphere Studies (COLA) and the ECMWF called Project Minerva, the system is integrated at the atmospheric horizontal spectral resolutions of T319, T639, and T1279. Seven-month hindcasts starting from 1 May for the years 1980–2011 are produced at all three resolutions with at least 15 ensemble members. The Minerva system demonstrates statistically significant skill for retrospective forecasts of TC frequency and accumulated cyclone energy (ACE) in the North Atlantic (NA), eastern North Pacific (EP), and western North Pacific. While the highest scores overall are achieved in the North Pacific, the skill in the NA appears to be limited by an overly strong influence of the tropical Pacific variability. Higher model resolution improves skill scores for the ACE and, to a lesser extent, the TC frequency, even though the influence of large-scale climate variations on these TC activity measures is largely independent of resolution changes. The biggest gain occurs in transition from T319 to T639. Significant skill in regional TC forecasts is achieved over broad areas of the Northern Hemisphere. The highest-resolution hindcasts exhibit additional locations with skill in the NA and EP, including land-adjacent areas. The feasibility of regional intensity forecasts is assessed. In the presence of the coupled model biases, the benefits of high resolution for seasonal TC forecasting may be underestimated.
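For reference, the accumulated cyclone energy (ACE) metric used above is conventionally computed from 6-hourly maximum sustained winds. A minimal sketch (function name and example values are illustrative, not taken from the Minerva hindcasts):

```python
import numpy as np

def accumulated_cyclone_energy(vmax_kt):
    """Accumulated cyclone energy from 6-hourly maximum sustained winds (knots).

    ACE is conventionally the sum of squared winds for all 6-hourly records at
    tropical-storm strength or above (>= 35 kt), scaled by 1e-4.
    """
    v = np.asarray(vmax_kt, dtype=float)
    v = v[v >= 35.0]                 # count only tropical-storm-strength fixes
    return 1e-4 * np.sum(v ** 2)

# Example: a short-lived storm with five 6-hourly fixes
print(accumulated_cyclone_energy([30, 40, 55, 65, 45]))  # ~1.09 (units of 1e4 kt^2)
```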
Abstract:
The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time, the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, has been performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically, the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere. Owing to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and the large-scale circulation associated with the MJO.
Abstract:
This paper describes a methodology for providing multiprobability predictions for proteomic mass spectrometry data. The methodology is based on a newly developed machine learning framework called Venn machines, which outputs a valid probability interval. For demonstration purposes, we applied this methodology to MALDI-TOF data sets in order to predict the diagnosis of heart disease and the early diagnosis of ovarian cancer and breast cancer. The experiments showed that the probability intervals are narrow, that is, the output of the multiprobability predictor is close to a single probability distribution. In addition, the probability intervals produced for the heart disease and ovarian cancer data were more accurate than the output of the corresponding probability predictor. When the Venn machines were forced to make point predictions, the accuracy of these predictions was, for most data sets, better than the accuracy of the underlying algorithm that outputs a single probability distribution over labels. Application of this methodology to the MALDI-TOF data sets empirically demonstrates its validity. The accuracy of the proposed method on the ovarian cancer data rises from 66.7% eleven months before the moment of diagnosis to 90.2% at the moment of diagnosis. The same approach was applied to the heart disease data without time dependency, although the achieved accuracy was not as high (up to 69.9%). The methodology also allowed us to confirm mass spectrometry peaks previously identified as carrying statistically significant information for discriminating between controls and cases.
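A minimal sketch of the Venn-machine idea described above, using a simple nearest-centroid taxonomy for binary labels (the taxonomy and all names are illustrative assumptions; the paper's own taxonomies for MALDI-TOF data are not reproduced here):

```python
import numpy as np

def venn_predict(X_train, y_train, x_test, labels=(0, 1)):
    """Minimal Venn predictor with a nearest-centroid taxonomy (illustrative only).

    For each hypothetical label of the test object, all objects (training set plus
    the test object carrying that label) are grouped by nearest class centroid; the
    empirical label frequencies in the test object's group form one probability
    distribution.  The collection over hypothetical labels is the multiprobability
    (interval) prediction.
    """
    distributions = []
    for hypo in labels:
        X = np.vstack([X_train, x_test])
        y = np.append(y_train, hypo)
        centroids = {c: X[y == c].mean(axis=0) for c in labels}
        # taxonomy: assign every object to its nearest centroid
        cats = np.array([min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))
                         for x in X])
        group = y[cats == cats[-1]]          # group containing the test object
        distributions.append({c: np.mean(group == c) for c in labels})
    lower = {c: min(d[c] for d in distributions) for c in labels}
    upper = {c: max(d[c] for d in distributions) for c in labels}
    return lower, upper

# Example with toy two-dimensional data
rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)
print(venn_predict(X_train, y_train, np.array([2.5, 2.5])))
```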
Abstract:
It is argued that existing polar prediction systems do not yet meet users' needs, and possible ways forward in advancing prediction capacity in polar regions and beyond are outlined. The polar regions have been attracting more and more attention in recent years, fuelled by the perceptible impacts of anthropogenic climate change. Polar climate change provides new opportunities, such as shorter shipping routes between Europe and East Asia, but also new risks, such as the potential for industrial accidents or emergencies in ice-covered seas. Here, it is argued that environmental prediction systems for the polar regions are less developed than those for other regions. There are many reasons for this situation, including the polar regions having (historically) been a lower priority, having fewer in situ observations, and featuring numerous local physical processes that are less well represented by models. By contrasting the relative importance of different physical processes in polar and lower latitudes, the need for a dedicated polar prediction effort is illustrated. Research priorities are identified that will help to advance environmental polar prediction capabilities. Examples include improvement of the polar observing system; the use of coupled atmosphere-sea ice-ocean models, even for short-term prediction; and insight into polar-lower-latitude linkages and their role in forecasting. Given the enormity of some of the challenges ahead, in an environment as harsh and remote as the polar regions, it is argued that rapid progress will only be possible with a coordinated international effort. More specifically, it is proposed to hold a Year of Polar Prediction (YOPP) from mid-2017 to mid-2019, in which the international research and operational forecasting communities will work together with stakeholders in a period of intensive observing, modelling, prediction, verification, user-engagement and educational activities.
Abstract:
In the present work, a group contribution method is proposed for estimating the viscosity of fatty compounds and biodiesel esters as a function of temperature. The databank used for regression of the group contribution parameters (1070 values for 65 types of substances) included fatty compounds such as fatty acids, methyl and ethyl esters, alcohols, tri- and diacylglycerols, and glycerol. The inclusion of new experimental data for fatty esters, a partial acylglycerol, and glycerol allowed a further refinement in the performance of this methodology, for all classes of fatty compounds, in comparison to a prior group contribution equation (Ceriani, R.; Goncalves, C. B.; Rabelo, J.; Caruso, M.; Cunha, A. C. C.; Cavaleri, F. W.; Batista, E. A. C.; Meirelles, A. J. A. Group contribution model for predicting viscosity of fatty compounds. J. Chem. Eng. Data 2007, 52, 965-972). In addition, the influence of small concentrations of partial acylglycerols, intermediate compounds of the transesterification reaction, on the viscosity of biodiesel was also investigated.
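The general structure of a temperature-dependent group-contribution viscosity correlation of this kind can be sketched as follows (the functional form and all coefficients below are placeholders only, not the regressed parameters of the Ceriani et al. model):

```python
import math

# Illustrative group-contribution correlation for dynamic viscosity as a
# function of temperature.  Coefficients are placeholders, NOT the fitted
# parameters of the published model.
GROUP_PARAMS = {
    # group: (A, B, C, D) in  ln(eta) = sum_k N_k * (A + B/T - C*ln T - D*T)
    "CH3":  (-0.10, 120.0, 0.05, 1.0e-4),
    "CH2":  (-0.02,  35.0, 0.01, 2.0e-5),
    "COOH": ( 1.20, 800.0, 0.30, 5.0e-4),
}

def ln_viscosity(groups, T):
    """groups: {group_name: count N_k}; T in kelvin; returns ln(viscosity)."""
    return sum(n * (A + B / T - C * math.log(T) - D * T)
               for g, n in groups.items()
               for A, B, C, D in [GROUP_PARAMS[g]])

# e.g. a schematic fatty acid CH3-(CH2)14-COOH at 323.15 K
print(math.exp(ln_viscosity({"CH3": 1, "CH2": 14, "COOH": 1}, 323.15)))
```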
Abstract:
Molecular hydrogen emission is commonly observed in planetary nebulae. Images taken in infrared H2 emission lines show that at least part of the molecular emission is produced inside the ionized region. In the best-studied case, the Helix nebula, the H2 emission is produced inside cometary knots (CKs), comet-shaped structures believed to be clumps of dense neutral gas embedded within the ionized gas. Most of the H2 emission of the CKs seems to be produced in a thin layer between the ionized diffuse gas and the neutral material of the knot, in a mini photodissociation region (mini-PDR). However, the PDR models published so far cannot fully explain all the characteristics of the H2 emission of the CKs. In this work, we use the photoionization code AANGABA to study the H2 emission of the CKs, particularly that produced in the H+/H0 interface of the knot, where a significant fraction of the H2 1-0 S(1) emission seems to be produced. Our results show that the production of molecular hydrogen in such a region may explain several characteristics of the observed emission, particularly the high excitation temperature of the H2 infrared lines. We find that the temperature derived from H2 observations, even of a single knot, depends very strongly on the observed transitions, with much higher temperatures derived from more excited levels. We also propose that the separation between the H alpha and [N II] peak emission observed in images of CKs may be an effect of the distance of the knot from the star, since for knots farther from the central star the [N II] line is produced closer to the border of the CK than H alpha.
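For context, the excitation temperature derived from a pair of H2 lines follows from the standard optically-thin column-density and Boltzmann relations (generic relations, not specific to the AANGABA models above):

```latex
% I_ul = line intensity, A_ul = Einstein coefficient, E_u and g_u = upper-level
% energy and statistical weight, T_ex = excitation temperature.
N_u = \frac{4\pi\, I_{ul}}{A_{ul}\, h\,\nu_{ul}},
\qquad
\frac{N_u}{g_u} \propto \exp\!\left(-\frac{E_u}{k\,T_{\mathrm{ex}}}\right)
\;\;\Longrightarrow\;\;
T_{\mathrm{ex}} = \frac{E_2 - E_1}{k\,\ln\!\left[\dfrac{N_1/g_1}{N_2/g_2}\right]}
```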
Abstract:
The evolution of commodity computing led to the possibility of efficiently using interconnected machines to solve computationally intensive tasks, which were previously solvable only by expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we review the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, which considers the chaotic properties of such behavior and automatically detects critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The results demonstrate that prediction of process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online prediction due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Process scheduling techniques consider the current load situation to allocate computing resources. These techniques use approximations, such as averages of communication, processing, and memory access, to improve process scheduling, although processes may present different behaviors during their execution: they may start with high communication requirements and later perform mostly processing. By discovering how processes behave over time, we believe it is possible to improve resource allocation. This has motivated this paper, which adopts chaos theory concepts and nonlinear prediction techniques in order to model and predict process behavior. The results confirm that the radial basis function technique provides good predictions at low processing cost, which is essential in a real distributed environment.
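A minimal sketch of radial-basis-function one-step-ahead prediction on a time-delay embedding of a process-behaviour series, assuming Gaussian kernels and illustrative parameters (not the authors' implementation):

```python
import numpy as np

def embed(series, dim, tau):
    """Time-delay embedding: row i is [x(i), x(i+tau), ..., x(i+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.array([[series[i + j * tau] for j in range(dim)] for i in range(n)])

def rbf_fit_predict(series, dim=3, tau=1, gamma=10.0):
    """One-step-ahead prediction of a behaviour series with Gaussian RBFs."""
    X = embed(series, dim, tau)
    X_train, y_train = X[:-1], series[(dim - 1) * tau + 1:]
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K = kernel(X_train, X_train)                      # kernel matrix on the centres
    w, *_ = np.linalg.lstsq(K, y_train, rcond=None)   # RBF weights (least squares)
    return (kernel(X[-1:], X_train) @ w)[0]           # forecast of the next sample

# Example: predict the next value of a noisy nonlinear (logistic-map) series
rng = np.random.default_rng(0)
x = [0.4]
for _ in range(200):
    x.append(3.8 * x[-1] * (1 - x[-1]) + 0.001 * rng.standard_normal())
print(rbf_fit_predict(np.array(x)))
```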
Abstract:
This study investigates the numerical simulation of three-dimensional, time-dependent, viscoelastic free surface flows using the Upper-Convected Maxwell (UCM) constitutive equation and an algebraic explicit model. This investigation was carried out to develop a simplified approach that can be applied to the extrudate swell problem. The relevant physics of this flow phenomenon is discussed in the paper, and an algebraic model to predict extrudate swell is presented. It is based on an explicit algebraic representation of the non-Newtonian extra stress through a kinematic tensor formed with the scaled dyadic product of the velocity field. The elasticity of the fluid is governed by a single transport equation for a scalar quantity which has the dimension of strain rate. The mass and momentum conservation equations and the constitutive equation (UCM or algebraic model) were solved by a three-dimensional, time-dependent, finite difference method. The free surface of the fluid was modeled using a marker-and-cell approach. The algebraic model was validated by comparing the numerical predictions with analytic solutions for pipe flow. In comparison with the classical UCM model, one advantage of this approach is that the computational workload is substantially reduced: the UCM model requires six differential equations, while the algebraic model uses only one. The results showed stable flows with very large extrudate growths, beyond those usually obtained with standard differential viscoelastic models. (C) 2010 Elsevier Ltd. All rights reserved.
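For reference, the UCM constitutive equation has the standard form below; in three dimensions its six independent stress components give the six differential equations mentioned above (notation is generic, not copied from the paper):

```latex
% lambda = relaxation time, eta_0 = viscosity, L = grad(u) with L_ij = du_i/dx_j,
% D = rate-of-strain tensor.
\boldsymbol{\tau}
  + \lambda\left(\frac{\partial \boldsymbol{\tau}}{\partial t}
  + \mathbf{u}\cdot\nabla\boldsymbol{\tau}
  - L\,\boldsymbol{\tau} - \boldsymbol{\tau}\,L^{\mathsf{T}}\right)
  = 2\,\eta_0\, D,
\qquad
D = \tfrac{1}{2}\left(L + L^{\mathsf{T}}\right)
```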
Abstract:
Managing software maintenance is rarely a precise task, due to uncertainties concerning resource and service descriptions. Even when a well-established maintenance process is followed, the risk of delaying tasks remains if the new services are not precisely described or if resources change during process execution. Moreover, the delay of a task at an early process stage may translate into a different delay at the end of the process, depending on complexity or on service reliability requirements. This paper presents a knowledge-based representation (Bayesian networks) of maintenance project delays, built from specialists' experience, and a corresponding tool to help manage software maintenance projects. (c) 2006 Elsevier Ltd. All rights reserved.
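A hand-rolled sketch of the kind of Bayesian-network inference involved, with an illustrative structure and made-up conditional probabilities (not the networks elicited from specialists in the paper):

```python
import itertools

# Illustrative structure:
#   ServiceSpecClear -> TaskDelay <- ResourceChange ;  TaskDelay -> ProjectDelay
P_spec_clear = {True: 0.7, False: 0.3}
P_res_change = {True: 0.2, False: 0.8}
P_task_delay = {  # P(TaskDelay=True | SpecClear, ResChange)
    (True, False): 0.10, (True, True): 0.40,
    (False, False): 0.35, (False, True): 0.70,
}
P_proj_delay = {True: 0.80, False: 0.15}  # P(ProjectDelay=True | TaskDelay)

def p_project_delay(spec_clear=None, res_change=None):
    """P(ProjectDelay=True | evidence), computed by enumeration over hidden variables."""
    num = den = 0.0
    for s, r, t in itertools.product([True, False], repeat=3):
        if spec_clear is not None and s != spec_clear:
            continue
        if res_change is not None and r != res_change:
            continue
        pt = P_task_delay[(s, r)] if t else 1 - P_task_delay[(s, r)]
        joint = P_spec_clear[s] * P_res_change[r] * pt
        num += joint * P_proj_delay[t]
        den += joint
    return num / den

print(p_project_delay(spec_clear=False, res_change=True))  # imprecise spec + resource change: higher risk
```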
Abstract:
The purpose of this article is to present a new method to predict the response variable of an observation in a new cluster for a multilevel logistic regression. The central idea is based on the empirical best estimator of the random effect. Two estimation methods for the multilevel model are compared: penalized quasi-likelihood and Gauss-Hermite quadrature. Performance measures for predicting the probability of a new-cluster observation under the multilevel logistic model, in comparison with the usual logistic model, are examined through simulations and an application.
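A minimal sketch of predicting the response probability for an observation in a new cluster of a random-intercept logistic model via Gauss-Hermite quadrature, assuming the fixed effects and variance component have already been estimated (all names and values are illustrative, not the paper's estimators):

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def expit(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_new_cluster(x_new, beta, sigma_u, X_obs, y_obs, n_nodes=20):
    """Predicted probability for x_new in a new cluster, given a few observed
    responses (X_obs, y_obs) from that cluster.

    Model: logit P(y=1|u) = x'beta + u, with u ~ N(0, sigma_u^2); the cluster
    effect is integrated out with Gauss-Hermite quadrature.
    """
    nodes, weights = hermgauss(n_nodes)          # quadrature w.r.t. exp(-t^2)
    u = np.sqrt(2.0) * sigma_u * nodes           # change of variable u = sqrt(2)*sigma*t
    eta = X_obs @ beta
    # likelihood of the observed new-cluster responses at each quadrature node
    lik = np.prod(expit(eta[:, None] + u) ** y_obs[:, None] *
                  (1 - expit(eta[:, None] + u)) ** (1 - y_obs[:, None]), axis=0)
    post = weights * lik
    post /= post.sum()                           # posterior weights for u
    # predictive probability: E[ expit(x_new'beta + u) | observed data ]
    return np.sum(post * expit(x_new @ beta + u))

# Illustrative use
beta = np.array([-0.5, 1.2]); sigma_u = 0.8
X_obs = np.array([[1.0, 0.3], [1.0, -1.1]]); y_obs = np.array([1, 0])
print(predict_new_cluster(np.array([1.0, 0.5]), beta, sigma_u, X_obs, y_obs))
```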
Abstract:
The objective of this article is to determine, through Monte Carlo simulations, the influence of the parameters of ARIMA-GARCH models on the predictions of feedforward artificial neural networks (ANN) trained with the Levenberg-Marquardt algorithm. The paper presents a study of the relationship between ANN performance and the ARIMA-GARCH model parameters, i.e., the fact that, depending on the stationarity and other parameters of the time series, the ANN structure should be selected differently. Neural networks have been widely used to predict time series, and their capacity for dealing with non-linearities is normally an outstanding advantage. However, the values of the parameters of the generalized autoregressive conditional heteroscedasticity models influence ANN prediction performance. The combination of the GARCH parameter values with the ARIMA autoregressive terms also leads to variation in ANN performance. Combining the parameters of the ARIMA-GARCH models and changing the ANN topologies, we used the Theil inequality coefficient to measure the prediction quality of the feedforward ANN.
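For reference, one common form of the Theil inequality coefficient (the abstract does not specify which variant is used) is straightforward to compute:

```python
import numpy as np

def theil_u1(y_true, y_pred):
    """Theil inequality coefficient (U1 form): 0 = perfect forecast, 1 = worst.

    U1 = RMSE / (RMS of the observations + RMS of the forecasts).
    This is one common convention; other variants (e.g. U2) exist.
    """
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (np.sqrt(np.mean(y_true ** 2)) + np.sqrt(np.mean(y_pred ** 2)))

print(theil_u1([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # ~0.03
```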
Abstract:
The purpose of this work is to verify the stability of the relationship between real activity and the interest rate spread. The test is based on Chen (1988) and Osorio and Galea (2006). The analysis is applied to Chile and the United States over the period 1980 to 1999. In general, in both cases the relationship was statistically significant in the early 1980s, but a break point is found in both countries during that decade, suggesting that the relationship depends on the monetary rule followed by the central bank.
Abstract:
In the present work, a new approach for determining the partition coefficient at different interfaces, based on density functional theory, is proposed. Our results for log P(ow) for an n-octanol/water interface in a large supercell, -0.30 for acetone and 0.95 for methane, are comparable with the experimental data (-0.24 and 0.78, respectively). We believe that the differences are mainly related to the absence of van der Waals interactions and the limited number of molecules considered in the supercell. The numerical deviations are smaller than those observed for interpolation-based tools. As the proposed model is parameter free, it is not limited to the n-octanol/water interface.
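For context, the standard thermodynamic relation linking the partition coefficient to solvation (transfer) free energies is (general definition, not the paper's specific DFT/interface protocol):

```latex
% P_ow = n-octanol/water partition coefficient; Delta G terms are the transfer
% and solvation free energies of the solute; R = gas constant, T = temperature.
\log P_{\mathrm{ow}}
  = -\frac{\Delta G_{\mathrm{w}\to\mathrm{o}}}{2.303\,RT}
  = \frac{\Delta G_{\mathrm{solv}}^{\mathrm{water}} - \Delta G_{\mathrm{solv}}^{\mathrm{octanol}}}{2.303\,RT}
```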
Abstract:
XACb0070 is an uncharacterized protein encoded on the two large plasmids isolated from Xanthomonas axonopodis pv. citri, the agent of citrus canker, which is responsible for important economic losses in world citrus production. XACb0070 presents sequence homology only with other hypothetical proteins belonging to plant pathogens, none of which have had their structures determined. The NMR-derived solution structure reveals that this protein is a homodimer in which each monomer presents two domains with different structural and dynamic properties: a folded N-terminal domain with beta-alpha-alpha topology, which mediates dimerization, and a long disordered C-terminal tail. The folded domain shows high structural similarity to the ribbon-helix-helix transcriptional repressors, a family of DNA-binding proteins with a conserved 3D fold but low sequence homology; indeed, XACb0070 binds DNA. Primary sequence and fold comparison of XACb0070 with other proteins of the ribbon-helix-helix family, together with examination of the genes in the vicinity of xacb0070, suggests that the protein might be a component of a toxin-antitoxin system. (C) 2010 Elsevier Inc. All rights reserved.