991 results for Prediction theory


Relevance:

30.00%

Publisher:

Abstract:

Protein databases contain a substantial number of proteins whose structures have been determined but whose functions remain unannotated. Understanding the relationship between function and structure can be useful for predicting function on a large scale. We have analyzed the similarities in global physicochemical parameters for a set of enzymes classified according to the four Enzyme Commission (EC) hierarchical levels. Using relevance theory, we introduced a distance between proteins in the space of physicochemical characteristics. This was done by minimizing a cost function of the metric tensor built to reflect the EC classification system. Using an unsupervised clustering method on a set of 1025 enzymes, we obtained no relevant cluster formation compatible with the EC classification. The distributions of distances between enzymes from the same EC group and from different EC groups were compared by histograms. The same analysis was also performed using sequence-alignment similarity as a distance. Our results suggest that global structure parameters are not sufficient to segregate enzymes according to the EC hierarchy, indicating that the features essential for function are local rather than global. Consequently, methods for predicting function based on global attributes should not achieve high accuracy in predicting the main EC classes without relying on similarities between enzymes in the training and validation datasets. Furthermore, these results are consistent with a substantial number of studies suggesting that function evolves fundamentally by recruitment, i.e., the same protein motif or fold can be used to perform different enzymatic functions, and a few specific amino acids (AAs) are actually responsible for enzyme activity. These essential amino acids should belong to active sites, and an effective method for predicting function should be able to recognize them.
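
The pipeline below is a minimal, hypothetical sketch of the kind of analysis the abstract describes: a weighted (metric-tensor-like) distance over global physicochemical features, followed by unsupervised clustering. All names, weights, and data here are illustrative placeholders, not the authors' fitted model or dataset.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical data: 1025 enzymes x 6 global physicochemical descriptors
# (random placeholders standing in for mass, pI, hydrophobicity, etc.).
rng = np.random.default_rng(0)
X = rng.normal(size=(1025, 6))

# Diagonal metric tensor: one weight per feature. In the paper the weights
# come from minimizing a cost function reflecting the EC hierarchy; these
# values are arbitrary and purely illustrative.
w = np.array([1.0, 0.5, 2.0, 1.0, 0.2, 1.5])

def weighted_distance(a, b):
    """Distance between two enzymes in the weighted feature space."""
    d = a - b
    return float(np.sqrt(np.sum(w * d * d)))

# Clustering in the rescaled space is equivalent to clustering under the
# weighted distance; k = 6 stands in for the main EC classes.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X * np.sqrt(w))
```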

Relevance:

30.00%

Publisher:

Abstract:

Current scientific applications produce large amounts of data, and processing, handling, and analyzing such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. To achieve this goal, distributed storage systems have adopted techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when optimizing data access. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. The new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that the new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
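
A toy sketch of the classify-then-model idea described above, assuming a simple property (lag-1 autocorrelation) and a simple model (AR(1) via least squares). The property threshold, function names, and model choice are illustrative assumptions, not the paper's actual selection logic.

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation, used here as the series 'property'."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def predict_next(x):
    """Classify the series by its properties, then predict one step ahead."""
    x = np.asarray(x, dtype=float)
    if abs(lag1_autocorr(x)) > 0.3:          # strongly autocorrelated:
        a, b = np.polyfit(x[:-1], x[1:], 1)  # fit x[t+1] ~ a*x[t] + b (AR(1))
        return a * x[-1] + b
    return float(np.mean(x))                 # otherwise: mean predictor

# Example: byte offsets of successive read requests as a time series.
trace = [10, 20, 30, 40, 50, 60, 70]
print(predict_next(trace))  # ~80: the next access is extrapolated and
                            # could be prefetched before it is requested
```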

Relevance:

30.00%

Publisher:

Abstract:

Quantum chemical calculations at the B3LYP/6-31G* level of theory were employed for the structure-activity relationship and prediction of the antioxidant activity of edaravone and structurally related derivatives, using energy (E), ionization potential (IP), bond dissociation energy (BDE), and stabilization energies (ΔE_iso). Spin-density calculations were also performed for the proposed antioxidant activity mechanism. Electron abstraction is related to electron-donating groups (EDG) at position 3, decreasing the IP compared to substitution at position 4. Hydrogen abstraction is related to electron-withdrawing groups (EWG) at position 4, decreasing the C-H BDE compared to other substitutions and resulting in better antioxidant activity. The unpaired electron formed by hydrogen abstraction from the C-H group of the pyrazole ring is localized at the 2, 4, and 6 positions, and the highest predicted scavenging activity is related to the lowest spin-density contribution at the carbon atom. The likely mechanism is hydrogen transfer. It was found that antioxidant activity depends on the presence of EDG at the C-2 and C-4 positions and that there is a correlation between IP and BDE. Our results identified three different classes of new derivatives more potent than edaravone.
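
For reference, the two reactivity descriptors above have standard definitions in terms of electronic energies; a common formulation (the paper's exact energy terms, e.g. thermal or solvation corrections, may differ) is:

```latex
\mathrm{IP} = E(\mathrm{M}^{\bullet+}) - E(\mathrm{M}),
\qquad
\mathrm{BDE}_{\mathrm{C-H}} = E(\mathrm{R}^{\bullet}) + E(\mathrm{H}^{\bullet}) - E(\mathrm{R{-}H})
```

A lower IP favors the electron-transfer route, while a lower C-H BDE favors the hydrogen-atom-transfer route identified as the likely mechanism.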

Relevance:

30.00%

Publisher:

Abstract:

At Airbus GmbH (Hamburg), a new design of the Rear Pressure Bulkhead (RPB) has been developed for the A320 family. The new model is formed with vacuum-forming technology, during which the wrinkling phenomenon occurs. This thesis describes an analytical model for the prediction of wrinkling based on Timoshenko's energy method. Large-deflection theory is used to analyze two case studies: a simply supported circular thin plate stamped by a spherical punch, and a simply supported circular thin plate formed by vacuum forming. If the edges are free to displace radially, thin plates develop radial wrinkles near the edge at a central deflection approximately equal to four plate thicknesses (w0/h ≈ 4) when stamped by a spherical punch, and at w0/h ≈ 3 when formed by vacuum forming. Initially there are four symmetrical wrinkles, but their number increases as the central deflection grows. Using experimental results, the snap-through phenomenon is described.
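
A minimal check of the onset criteria reported above, assuming only the two approximate thresholds from the abstract; the function name and interface are illustrative, not the thesis' analytical model.

```python
def wrinkling_onset(w0, h, process="punch"):
    """Flag radial wrinkling onset for a simply supported circular thin
    plate with radially free edges, using the approximate thresholds:
    w0/h ~ 4 for spherical-punch stamping, w0/h ~ 3 for vacuum forming."""
    threshold = 4.0 if process == "punch" else 3.0
    return (w0 / h) >= threshold

# Example: a 2 mm plate vacuum-formed to a 7 mm central deflection
# (w0/h = 3.5) is predicted to wrinkle.
print(wrinkling_onset(w0=7.0, h=2.0, process="vacuum"))  # True
```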

Relevance:

30.00%

Publisher:

Abstract:

Suppose that we are interested in establishing simple but reliable rules for predicting future t-year survivors via censored regression models. In this article, we present inference procedures for evaluating such binary classification rules based on various prediction precision measures, quantified by the overall misclassification rate, sensitivity and specificity, and positive and negative predictive values. Specifically, under various working models we derive consistent estimators for the above measures via substitution and cross-validation estimation procedures. Furthermore, we provide large-sample approximations to the distributions of these nonsmooth estimators without assuming that the working model is correctly specified. Confidence intervals, for example for the difference of the precision measures between two competing rules, can then be constructed. All the proposals are illustrated with two real examples, and their finite-sample properties are evaluated via a simulation study.
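
For concreteness, the sketch below computes the precision measures named above from fully observed binary labels. This naive empirical version ignores censoring; the article's substitution and cross-validation estimators are precisely what replace it when survival status is censored.

```python
import numpy as np

def precision_measures(y_true, y_pred):
    """Precision measures for a binary t-year survival rule.
    y_true, y_pred: 0/1 arrays (1 = observed / predicted t-year survivor)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return {
        "misclassification": (fp + fn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

y_true = np.array([1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 1, 0])
print(precision_measures(y_true, y_pred))
```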

Relevance:

30.00%

Publisher:

Abstract:

High-density oligonucleotide expression arrays are a widely used tool for the measurement of gene expression on a large scale, and Affymetrix GeneChip arrays appear to dominate this market. These arrays use short oligonucleotides to probe for genes in an RNA sample. Due to optical noise, non-specific hybridization, probe-specific effects, and measurement error, ad hoc measures of expression that summarize probe intensities can lead to imprecise and inaccurate results. Various researchers have demonstrated that expression measures based on simple statistical models can provide great improvements over the ad hoc procedure offered by Affymetrix. Recently, physical models based on molecular hybridization theory have been proposed as useful tools for predicting, for example, non-specific hybridization. These physical models show great potential in terms of improving existing expression measures. In this paper we demonstrate that the system producing the measured intensities is too complex to be fully described by these relatively simple physical models, and we propose empirically motivated stochastic models that complement the molecular hybridization theory to provide a comprehensive description of the data. We discuss how the proposed model can be used to obtain improved measures of expression useful to data analysts.
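
As a toy illustration of what "summarizing probe intensities into an expression measure" means, the sketch below applies background subtraction, a log transform, and a robust probe summary. It is deliberately simple and is neither the Affymetrix procedure nor the authors' stochastic model; the background value would come from whatever physical or stochastic model of non-specific hybridization is in use.

```python
import numpy as np

def expression_measure(pm, background):
    """Toy expression summary for one probe set on one array: subtract an
    estimated background / non-specific signal from each perfect-match (PM)
    intensity, floor at a small positive value, log-transform, and summarize
    the probes robustly with the median."""
    signal = np.maximum(np.asarray(pm, dtype=float) - background, 1.0)
    return float(np.median(np.log2(signal)))

# Example: 8 probe intensities for one gene; the background estimate is
# assumed to be supplied by a hybridization model.
pm = [220.0, 310.0, 180.0, 260.0, 400.0, 240.0, 300.0, 210.0]
print(expression_measure(pm, background=120.0))
```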

Relevance:

30.00%

Publisher:

Abstract:

In many complex and dynamic domains, the ability to generate and then select the appropriate course of action is based on the decision maker's "reading" of the situation, in other words, their ability to assess the situation and predict how it will evolve over the next few seconds. Current theories regarding option generation during the situation assessment and response phases of decision making offer contrasting views on the cognitive mechanisms that support superior performance. The Recognition-Primed Decision-making model (RPD; Klein, 1989) and the Take-The-First heuristic (TTF; Johnson & Raab, 2003) suggest that superior decisions are made by generating few options and then selecting the first option as the final one. Long-Term Working Memory theory (LTWM; Ericsson & Kintsch, 1995), on the other hand, posits that skilled decision makers construct rich, detailed situation models and, as a result, should have the ability to generate more of the available task-relevant options. The main goal of this dissertation was to use these theories about option generation to further the understanding of how police officers anticipate a perpetrator's actions, and make decisions about how to respond, during dynamic law enforcement situations. An additional goal was to gather information that can be used, in the future, to design training based on the anticipation skills, decision strategies, and processes of experienced officers. Two studies were conducted to achieve these goals. Study 1 identified video-based law enforcement scenarios that could discriminate between experienced and less-experienced police officers in terms of their ability to anticipate the outcome. The discriminating scenarios were used as the stimuli in Study 2, in which 23 experienced and 26 less-experienced police officers observed temporally occluded versions of the scenarios and then completed assessment and response option-generation tasks. The results provided mixed support for these accounts of option generation. Consistent with RPD and TTF, participants typically selected the first-generated option as their final one, and did so during both the assessment and response phases of decision making. Consistent with LTWM theory, participants, regardless of experience level, generated more task-relevant assessment options than task-irrelevant options. However, an expected interaction between experience level and option relevance was not observed. Collectively, the two studies provide a deeper understanding of how police officers make decisions in dynamic situations. The methods developed and employed in the studies can be used to investigate anticipation and decision making in other critical domains (e.g., nursing, military). The results are discussed in relation to how they can inform future studies of option-generation performance and how they could be applied to develop training for law enforcement officers.

Relevance:

30.00%

Publisher:

Abstract:

Anomie theorists have reported the suppression of shared welfare orientations by the overwhelming dominance of economic values within capitalist societies since before the onset of the neoliberalism debate. Obligations concerning common welfare are more and more often subordinated to the overarching aim of realizing economic success goals. This should be especially valid for social life in contemporary market societies. This empirical investigation examines the extent to which market imperatives and values of the societal community are anchored within the normative orientations of market actors. Special attention is paid to whether the shape of these normative orientations varies with the degree of market inclusion. Empirical analyses, based on data from a standardized written survey of the German working population carried out in 2002, show that different types of normative orientation can be distinguished among market actors. These types are quite similar to the well-known types of anomic adaptation developed by Robert K. Merton in "Social Structure and Anomie" and are externally valid with respect to the prediction of different forms of economic crime. Further analyses show that the type of normative orientation actors adopt in everyday life depends on the degree of market inclusion. Confirming anomie theory, it is shown that the individual willingness to subordinate matters of common welfare to the aim of economic success (radical market activism) grows stronger the more actors are included in the market sphere. Finally, the relevance of the reported findings for the explanation of violent behavior, especially with a view to varieties of corporate violence, is discussed.

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces an extended hierarchical task analysis (HTA) methodology devised to evaluate and compare user interfaces on volumetric infusion pumps. The pumps were studied along the dimensions of overall usability and propensity for generating human error. With HTA as our framework, we analyzed six pumps on a variety of common tasks using Norman's Action theory. The introduced method of evaluation divides the problem space between the external world of the device interface and the user's internal cognitive world, allowing for predictions of potential user errors at the human-device level. In this paper, one detailed analysis is provided as an example, comparing two different pumps on two separate tasks. The results demonstrate the inherent variation, often the cause of usage errors, found among infusion pumps used in hospitals today. The reported methodology is a useful tool for evaluating human performance and predicting potential user errors with infusion pumps and other simple medical devices.
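
At its core, an HTA decomposes a goal into a tree of subtasks that can each be annotated with predicted error opportunities. The fragment below is a generic illustration of that data structure, not the paper's analysis; the task names and error annotations are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One node in a hierarchical task analysis (HTA): a goal decomposed
    into subtasks, annotated with predicted error opportunities."""
    name: str
    subtasks: list["Task"] = field(default_factory=list)
    predicted_errors: list[str] = field(default_factory=list)

# Illustrative fragment for a generic infusion pump:
program_rate = Task(
    "Program infusion rate",
    subtasks=[
        Task("Select rate field", predicted_errors=["wrong field selected"]),
        Task("Enter value", predicted_errors=["decimal-point slip (10x dose)"]),
        Task("Confirm entry", predicted_errors=["confirmation skipped"]),
    ],
)

def count_error_opportunities(task: Task) -> int:
    """Total predicted error opportunities in a task subtree."""
    return len(task.predicted_errors) + sum(
        count_error_opportunities(t) for t in task.subtasks)

print(count_error_opportunities(program_rate))  # 3
```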

Relevance:

30.00%

Publisher:

Abstract:

Theory: Interpersonal factors play a major role in causing and maintaining depression. It is unclear, however, to what degree significant others of the patient need to be involved in characterizing the patient's interpersonal style. Our study therefore sought to investigate how impact messages, as perceived by the patients' significant others, add to the prediction of psychotherapy process and outcome above and beyond routine assessments and therapist factors. Method: 143 outpatients with major depressive disorder were treated by 24 therapists with CBT or Exposure-Based Cognitive Therapy. Interpersonal style was measured pre- and post-therapy with the informant-based Impact Message Inventory (IMI), in addition to the self-report Inventory of Interpersonal Problems (IIP-32). Indicators of the patients' dominance and affiliation, as well as interpersonal distress, were calculated from these measures. Depressive and general symptomatology was assessed at pre-treatment, post-treatment, and three-month follow-up, and by process measures after every session. Results: Whereas significant others' reports did not add significantly to the prediction of the early therapeutic alliance, central mechanisms of change, or post-therapy outcome after accounting for therapist factors, the best predictor of outcome three months post-therapy was an increase in dominance as perceived by significant others. Conclusions: The patients' significant others seem to provide important additional information about the patients' interpersonal style and should therefore be included in the diagnostic process. Moreover, practitioners should specifically target interpersonal change as a potential mechanism of change in psychotherapy for depression.
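
The question of whether informant reports "add to the prediction above and beyond" routine assessments is the classic incremental-validity setup: compare a baseline model to one extended with the informant-based scores. The sketch below shows that comparison with nested ordinary least squares regressions; all data and variable names are placeholders, not the study's analysis (which also modeled therapist factors).

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Hypothetical data: outcome predicted from self-report (IIP-32) scores
# alone vs. self-report plus informant-based IMI scores.
rng = np.random.default_rng(1)
n = 143
iip = rng.normal(size=(n, 2))   # self-report predictors (placeholder)
imi = rng.normal(size=(n, 2))   # significant-other ratings (placeholder)
outcome = iip @ [0.3, 0.1] + imi @ [0.4, 0.0] + rng.normal(size=n)

base = r_squared(iip, outcome)
full = r_squared(np.column_stack([iip, imi]), outcome)
print(f"incremental R^2 from informant reports: {full - base:.3f}")
```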

Relevance:

30.00%

Publisher:

Abstract:

Numerical calculations describing weathering of the Poços de Caldas alkaline complex (Minas Gerais, Brazil) by infiltrating groundwater are carried out for time spans of up to two million years in the absence of pyrite, and up to 500,000 years with pyrite present. Deposition of uranium resulting from infiltration of oxygenated, uranium-bearing groundwater through the hydrothermally altered phonolitic host rock at the Osamu Utsumi uranium mine is also included in the latter calculation. The calculations are based on the quasi-stationary-state approximation to the mass conservation equations for pure advective transport. This approximation enables the prediction of solute concentrations, mineral abundances, and porosity as functions of time and distance over geologic time spans. Mineral reactions are described by kinetic rate laws for both precipitation and dissolution, and homogeneous equilibrium is assumed to be maintained within the aqueous phase. No other constraints are imposed on the calculations than the initial composition of the unaltered host rock and the composition of the inlet fluid, taken as rainwater modified by percolation through a soil zone. The results are in qualitative agreement with field observations at the Osamu Utsumi uranium mine. They predict a lateritic cover followed by a highly porous saprolitic zone, a zone of oxidized rock with pyrite replaced by iron hydroxide, a sharp redox front at which uranium is deposited, and the reduced unweathered host rock. Uranium is deposited in a narrow zone located on the reduced side of the redox front in association with pyrite, in agreement with field observations. The calculations predict the formation of a broad dissolution front of primary kaolinite that penetrates deep into the host rock, accompanied by the precipitation of secondary illite. Secondary kaolinite occurs in a saprolitic zone near the surface and in the vicinity of the redox front. Gibbsite forms a bimodal distribution consisting of a maximum near the surface followed by a thin tongue extending downward into the weathered profile, in agreement with field observations. The results are found to be insensitive to the kinetic rate constants used to describe the mineral reactions.
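
Kinetic rate laws of the kind referred to above are commonly written in a transition-state-theory form; as a reference (not necessarily the exact form used in the paper):

```latex
r_m = k_m \, A_m \left( 1 - \frac{Q_m}{K_m} \right)
```

where, for mineral m, k_m is the rate constant, A_m the reactive surface area, Q_m the ion-activity product, and K_m the equilibrium constant. The rate is positive (dissolution) when the fluid is undersaturated (Q_m < K_m) and negative (precipitation) when it is supersaturated, which is how a single law covers both processes.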

Relevance:

30.00%

Publisher:

Abstract:

If payment for goods is easily defaulted on, economic transactions may suffer deeply from this risk. This risky environment forms a mechanism that governs how economic transactions are realized and, subsequently, how trade credit is extended. This paper distinguishes ex ante bargaining from ex post enforcement, and models how bargaining power reduces trade credit ex ante, while ex post enforcement power and the buyer's cash in hand can enhance both trade volume and trade credit in the presence of default risk. We model this relationship in order to organize findings from the previous literature and from our original micro data on detailed transactions in China, so as to understand consistently the mechanism governing trade credit. We then empirically test the structure implied by the theoretical prediction. Results show that the seller's ex post enforcement power mainly determines the size of trade credit and trade volume, and that the buyer's cash in hand can substitute for enforcement power. The seller's bargaining power is exercised to reduce trade credit and trade volume in order to avoid default risk, but it simultaneously improves enforcement power as well. We find that ex post enforcement power consists of (ex ante) bargaining power between the two parties and intervention by third parties; however, its magnitude is far smaller than the direct impact of bargaining power in reducing trade credit and trade volume.