11 results for Unstructured Grids

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 20.00%

Abstract:

In this paper, a new algebraic-graph method for identification of islanding in power system grids is proposed. The proposed method identifies all the possible cases of islanding due to the loss of a piece of equipment by means of a factorization of the bus-branch incidence matrix. The main features of this new method include: (i) simple implementation, (ii) high speed, (iii) real-time adaptability, (iv) identification of all islanding cases and (v) identification of the buses that compose each island in case of island formation. The method was successfully tested on large-scale systems such as the reduced south Brazilian system (45 buses/72 branches) and the south-southeast Brazilian system (810 buses/1340 branches). (C) 2011 Elsevier Ltd. All rights reserved.
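
The outcome the method targets (finding which buses end up in each island after an outage) can be illustrated with a plain connectivity check. The sketch below runs a union-find pass over an invented 5-bus test system; it only illustrates island identification, not the factorization of the bus-branch incidence matrix described in the abstract.

```python
# Minimal illustration: detect islands after removing a branch by checking
# connectivity of the remaining bus-branch graph with union-find.
# The 5-bus test system below is invented for demonstration purposes only.

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

def islands(buses, branches, outaged):
    """Return the list of islands (sets of buses) after 'outaged' branches are removed."""
    parent = {b: b for b in buses}
    for u, v in branches:
        if (u, v) in outaged or (v, u) in outaged:
            continue
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:
            parent[ru] = rv  # union the two components
    groups = {}
    for b in buses:
        groups.setdefault(find(parent, b), set()).add(b)
    return list(groups.values())

if __name__ == "__main__":
    buses = [1, 2, 3, 4, 5]
    branches = [(1, 2), (2, 3), (3, 4), (4, 5), (2, 4)]
    # Losing branch (4, 5) isolates bus 5 -> two islands are reported.
    print(islands(buses, branches, outaged={(4, 5)}))
```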

Relevance: 10.00%

Abstract:

Major depressive disorder (MDD) trials - investigating either non-pharmacological or pharmacological interventions - have shown mixed results. Many reasons explain this heterogeneity, but one that stands out is the trial design, due to specific challenges in the field. We therefore aimed to review the methodology of non-invasive brain stimulation (NIBS) trials and provide a framework to improve clinical trial design. We performed a systematic review of randomized, controlled MDD trials whose intervention was repetitive transcranial magnetic stimulation (rTMS) or transcranial direct current stimulation (tDCS) in MEDLINE and other databases from April 2002 to April 2008. We created an unstructured checklist based on CONSORT guidelines to extract items such as power analysis, sham method, blinding assessment, allocation concealment, operational criteria used for MDD, definition of refractory depression and primary study hypotheses. Thirty-one studies were included. We found that the main methodological issues can be divided into three groups: (1) issues related to phase II/small trials, (2) issues related to MDD trials, and (3) specific issues of NIBS studies. Taken together, they can threaten study validity and lead to inconclusive results. Feasible solutions include: estimating the sample size a priori; measuring the degree of refractoriness of the subjects; specifying the primary hypothesis and statistical tests; controlling predictor variables through stratified randomization or strict eligibility criteria; adjusting the study design to the target population; using adaptive designs; and exploring NIBS efficacy employing biological markers. In conclusion, our study summarizes the main methodological issues of NIBS trials and proposes a number of alternatives to manage them. Copyright (C) 2011 John Wiley & Sons, Ltd.
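
Since a priori sample-size estimation is singled out among the feasible solutions, the sketch below shows a standard two-arm power calculation based on the normal approximation; the effect size, alpha and power values are arbitrary placeholders, not figures taken from the review.

```python
# A priori sample-size estimate for a two-arm parallel trial (normal approximation).
# Effect size, alpha and power below are placeholder values, not taken from the review.
from math import ceil
from scipy.stats import norm

def n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-sided two-sample comparison."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Example: detecting a moderate standardized difference (Cohen's d = 0.5)
print(n_per_arm(0.5))  # roughly 63 participants per arm
```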

Relevance: 10.00%

Abstract:

This paper proposes three new hybrid mechanisms for the scheduling of grid tasks, which integrate reactive and proactive approaches. They differ in the scheduler used to define the initial schedule of an application and in the scheduler used to reschedule the application. The mechanisms are compared to purely reactive and purely proactive mechanisms. Results show that the hybrid approach produces performance close to that of the reactive mechanisms while demanding fewer migrations.
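
As a rough illustration of the hybrid idea (a proactive initial schedule plus reactive rescheduling), the sketch below combines a simple list-scheduling heuristic with a rescheduling step triggered by observed resource slowdown; the heuristic, the drift threshold and the test data are assumptions, not the mechanisms evaluated in the paper.

```python
# Simplified hybrid grid scheduler: a proactive initial assignment plus a
# reactive rescheduling step triggered only when observed resource speed
# drifts too far from the estimate. Illustrative only; the heuristic and
# threshold are assumptions, not the paper's mechanisms.

def proactive_schedule(tasks, speed):
    """Assign each task to the resource with the smallest estimated finish time."""
    finish = {r: 0.0 for r in speed}
    plan = {}
    for task, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        r = min(finish, key=lambda r: finish[r] + cost / speed[r])
        finish[r] += cost / speed[r]
        plan[task] = r
    return plan

def reactive_adjust(plan, tasks, est_speed, observed_speed, drift=0.3):
    """Reschedule tasks placed on resources whose speed dropped by more than 'drift'."""
    degraded = {r for r in est_speed
                if observed_speed[r] < (1 - drift) * est_speed[r]}
    pending = {t: c for t, c in tasks.items() if plan[t] in degraded}
    plan.update(proactive_schedule(pending, observed_speed))
    return plan, len(pending)  # number of tasks considered for migration

tasks = {"t1": 40, "t2": 25, "t3": 60, "t4": 10}   # task costs (arbitrary units)
est = {"siteA": 2.0, "siteB": 1.0}                 # estimated speeds
obs = {"siteA": 1.2, "siteB": 1.0}                 # siteA degraded at run time
plan = proactive_schedule(tasks, est)
plan, reconsidered = reactive_adjust(plan, tasks, est, obs)
print(plan, reconsidered)
```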

Relevance: 10.00%

Abstract:

The proteasome is the primary contributor to intracellular proteolysis. Oxidized or unstructured proteins can be degraded via a ubiquitin- and ATP-independent process by the free 20S proteasome (20SPT). The mechanism by which these proteins enter the catalytic chamber is not understood thus far, although the 20SPT gating conformation is considered an important barrier to the free entrance of proteins. We have previously shown that S-glutathiolation of the 20SPT is a post-translational modification affecting the proteasomal activities. Aims: The goal of this work was to investigate the mechanism that regulates 20SPT activity, which includes the identification of the Cys residues prone to S-glutathiolation. Results: Modulation of 20SPT activity by proteasome gating is at least partially due to the S-glutathiolation of specific Cys residues. The gate was open when the 20SPT was S-glutathiolated, whereas following treatment with high concentrations of dithiothreitol, the gate was closed. S-glutathiolated 20SPT was more effective at degrading both oxidized and partially unfolded proteins than its reduced form. Only 2 of the 28 Cys residues in the proteasomal alpha 5 subunit were observed to be S-glutathiolated in yeast cells grown to the stationary phase in glucose-containing medium. Innovation: We demonstrate a redox post-translational regulatory mechanism controlling 20SPT activity. Conclusion: S-glutathiolation is a post-translational modification that triggers gate opening and thereby activates the proteolytic activities of the free 20SPT. This process appears to be an important regulatory mechanism that intensifies the removal of oxidized or unstructured proteins in stressful situations by a process independent of ubiquitination and ATP consumption. Antioxid. Redox Signal. 16, 1183-1194.

Relevance: 10.00%

Abstract:

We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4767672]
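
One ingredient of such frameworks, combining rectangular grids with graph algorithms, can be sketched in a few lines: cover the phase space with boxes, build the directed graph of possible box-to-box transitions of the map, and extract strongly connected components as candidate recurrent sets. The toy example below uses networkx and an arbitrary one-dimensional map; it illustrates the general approach, not the authors' software.

```python
# Toy outer approximation of a 1-D map on a rectangular grid: each box maps to
# every box intersecting the image of its interval, and the strongly connected
# components of the resulting digraph isolate candidate recurrent dynamics.
# The logistic-type map and the grid resolution are arbitrary choices.
import numpy as np
import networkx as nx

def f(x):
    return 3.6 * x * (1.0 - x)           # example map on [0, 1]

n_boxes = 64
edges = np.linspace(0.0, 1.0, n_boxes + 1)

G = nx.DiGraph()
for i in range(n_boxes):
    # Rigorous tools bound the image of each box; here we simply sample it.
    xs = np.linspace(edges[i], edges[i + 1], 20)
    images = f(xs)
    lo, hi = images.min(), images.max()
    targets = np.nonzero((edges[1:] > lo) & (edges[:-1] < hi))[0]
    G.add_edges_from((i, int(j)) for j in targets)

# Non-trivial strongly connected components approximate the recurrent set.
recurrent = [c for c in nx.strongly_connected_components(G)
             if len(c) > 1 or G.has_edge(next(iter(c)), next(iter(c)))]
print(len(recurrent), "candidate recurrent component(s)")
```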

Relevance: 10.00%

Abstract:

Yield mapping represents the spatial variability of the features of a productive area and allows intervention in the next year's production, for example, through site-specific input application. The trial aimed to verify the influence of sampling density and of the type of interpolator on the precision of yield maps produced by manual grain sampling, a solution usually adopted when a combine equipped with a yield monitor cannot be used. A yield map was developed using data obtained from a combine equipped with a yield monitor during corn harvesting. From this map, 84 sample grids were established and, through three interpolators (inverse of squared distance, inverse of distance and ordinary kriging), 252 yield maps were created. They were then compared with the original map using the coefficient of relative deviation (CRD) and the kappa index. The loss of yield mapping information increased as the sampling density decreased and also depended on the interpolation method used. A multiple regression model was adjusted for the variable CRD as a function of the spatial variability index and the sampling density. This model is intended to help the farmer define the sampling density and thus obtain a manual yield map whenever problems occur with the yield monitor.
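
Two of the three interpolators compared (inverse of distance and inverse of squared distance) reduce to a generic inverse-distance-weighting routine; the sketch below implements it on synthetic points (ordinary kriging is left out for brevity) and is not the processing chain used in the study.

```python
# Generic inverse distance weighting (IDW): power p = 1 gives "inverse of
# distance", p = 2 gives "inverse of squared distance". Sample points and the
# query grid below are synthetic, not the corn-yield data set of the study.
import numpy as np

def idw(sample_xy, sample_z, query_xy, p=2, eps=1e-12):
    """Interpolate values at query_xy from scattered samples with IDW of power p."""
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    w = 1.0 / (d ** p + eps)              # eps avoids division by zero at sample points
    return (w * sample_z).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(84, 2))          # e.g. an 84-point manual sample grid
yield_t_ha = 6 + 2 * np.sin(pts[:, 0] / 15)      # synthetic yield surface (t/ha)
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
print(idw(pts, yield_t_ha, grid, p=2).reshape(50, 50).shape)
```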

Relevance: 10.00%

Abstract:

The flow around a smooth, fixed circular cylinder over a large range of Reynolds numbers is considered in this paper. In order to investigate this canonical case, we perform CFD calculations and apply verification and validation (V&V) procedures to draw conclusions regarding the numerical error and, afterwards, to assess the modeling errors and the capability of this (U)RANS method to solve the problem. Eight Reynolds numbers between Re = 10 and Re = 5 x 10^5 are presented with at least four geometrically similar grids and five discretizations in time for each case (when unsteady), together with strict control of iterative and round-off errors, allowing a consistent verification analysis with uncertainty estimation. Two-dimensional RANS calculations, steady or unsteady, laminar or turbulent, are performed. The original 1994 k-omega SST turbulence model by Menter is used to model turbulence. The validation procedure is performed by comparing the numerical results with an extensive set of experimental results compiled from the literature. [DOI: 10.1115/1.4007571]
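
A central step in the verification part of such a study is estimating the discretization error from a sequence of geometrically similar grids. The sketch below shows the classic Richardson-extrapolation estimate of the observed order of accuracy from three grid solutions with a constant refinement ratio; the drag-coefficient values and the refinement ratio are invented for illustration and are not results from the paper.

```python
# Observed order of accuracy and extrapolated solution from three grid levels
# with a constant refinement ratio (classic Richardson extrapolation).
# The sample drag coefficients below are invented, not results from the study.
from math import log

def richardson(phi_fine, phi_medium, phi_coarse, r):
    """Return (observed order p, extrapolated value, fine-grid error estimate)."""
    p = log(abs(phi_coarse - phi_medium) / abs(phi_medium - phi_fine)) / log(r)
    phi_exact = phi_fine + (phi_fine - phi_medium) / (r ** p - 1.0)
    return p, phi_exact, abs(phi_exact - phi_fine)

# Hypothetical drag coefficients on fine/medium/coarse grids, refinement ratio 2.
p, cd_ext, err = richardson(phi_fine=1.192, phi_medium=1.210, phi_coarse=1.285, r=2.0)
print(f"observed order = {p:.2f}, extrapolated Cd = {cd_ext:.4f}, error ~ {err:.4f}")
```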

Relevance: 10.00%

Abstract:

This study compared the dimensions of gingival papillae in anterosuperior areas presenting natural teeth (tooth sites) or single-tooth implants adjacent to natural teeth (implant-tooth sites) by analyzing predetermined distances. A total of 45 tooth sites and 46 implant-tooth sites were carefully selected. Clinical evaluation consisted of visual and quantitative analyses with millimeter grids on radiographs. Implant-tooth sites showed a smaller gingival papilla dimension than tooth sites (P < .01). Both evaluated distances (contact point to bone crest, and between the roots of adjacent teeth or from the implant platform to the root of the adjacent tooth) significantly influenced the presence/absence of gingival papillae in all groups (P < .01). (Int J Periodontics Restorative Dent 2012;32:93-100.)

Relevance: 10.00%

Abstract:

Failure detection is at the core of most fault tolerance strategies, but it often depends on reliable communication. We present new algorithms for failure detectors that are appropriate as components of a fault tolerance system deployed under adverse network conditions (such as loosely connected and loosely administered computing grids). The algorithms pack redundancy into heartbeat messages, thereby improving on the robustness of traditional protocols. Results from experimental tests conducted in a simulated environment with adverse network conditions show significant improvement over existing solutions.
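
The key idea, packing redundancy into heartbeat messages so that the loss of individual messages does not immediately trigger false suspicions, can be illustrated as below: each heartbeat repeats a window of the sender's recent sequence numbers. The window size, timeout and monitor logic are simplifying assumptions, not the protocol from the paper.

```python
# Simplified heartbeat failure detector where each message carries a window of
# the sender's most recent sequence numbers, so a lost heartbeat can be
# "recovered" from a later one. Window size and timeout are arbitrary choices.
import time

WINDOW = 5          # how many past sequence numbers each heartbeat repeats
TIMEOUT = 3.0       # seconds without any fresh sequence number before suspecting

class Monitor:
    def __init__(self):
        self.highest_seq = -1
        self.last_fresh = time.monotonic()

    def on_heartbeat(self, seq_window):
        """seq_window: the sender's last WINDOW sequence numbers (redundant copies)."""
        newest = max(seq_window)
        if newest > self.highest_seq:
            self.highest_seq = newest
            self.last_fresh = time.monotonic()

    def suspected(self):
        return time.monotonic() - self.last_fresh > TIMEOUT

def make_heartbeat(seq):
    """Sender side: heartbeat number 'seq' also repeats the previous WINDOW-1 numbers."""
    return list(range(max(0, seq - WINDOW + 1), seq + 1))

m = Monitor()
m.on_heartbeat(make_heartbeat(7))    # heartbeats 3..7 arrive in one message,
print(m.highest_seq, m.suspected())  # even if 3..6 were individually lost
```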

Relevance: 10.00%

Abstract:

Current scientific applications produce large amounts of data. The processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. In order to achieve this goal, distributed storage systems have considered techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on those properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
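
A heavily simplified stand-in for the classify-then-predict pipeline is sketched below: a trend test labels the access time series and selects either a linear-trend forecast or a short moving average. The classifier, thresholds and synthetic access sizes are assumptions and do not reproduce the paper's modeling techniques.

```python
# Simplified stand-in for "classify the access time series, then pick a
# predictor": a trend test selects between a linear-trend forecast and a
# moving-average forecast. Thresholds and models are illustrative assumptions.
import numpy as np

def classify(series, trend_threshold=0.5):
    """Label a series 'trending' if the fitted slope is large relative to its noise."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    residual_std = np.std(series - (slope * t + intercept)) + 1e-12
    return "trending" if abs(slope) * len(series) > trend_threshold * residual_std else "stationary"

def predict_next(series):
    """Forecast the next value with a model chosen by the classification."""
    if classify(series) == "trending":
        t = np.arange(len(series))
        slope, intercept = np.polyfit(t, series, 1)
        return slope * len(series) + intercept          # extrapolate the trend
    return float(np.mean(series[-5:]))                  # short moving average

requests_mb = np.array([100, 112, 121, 135, 141, 155], dtype=float)  # synthetic access sizes
print(classify(requests_mb), predict_next(requests_mb))              # prefetch hint for next access
```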

Relevance: 10.00%

Abstract:

With the increasing production of information from e-government initiatives, there is also the need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, which is a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework to promote semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural language terms and present related work in this area. The results achieved in this study comprise the architectural definition and the major components and requirements of the proposed framework. With this, it is possible to take advantage of the large volume of information generated from e-Government initiatives and use it to benefit society.
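
The fuzzy handling of natural language terms can be illustrated by treating a normalized string-similarity score as a membership degree when mapping free-text terms to ontology concept labels. The concept labels, the threshold and the use of difflib below are assumptions made for this sketch, not components of the proposed framework.

```python
# Illustrative fuzzy matching of free-text terms against ontology concept labels:
# the similarity ratio is treated as a membership degree in [0, 1]. Concept
# labels, the threshold and the use of difflib are assumptions for this sketch.
from difflib import SequenceMatcher

CONCEPTS = ["public expenditure", "public procurement", "tax revenue", "civil servant"]

def memberships(term, concepts=CONCEPTS):
    """Return {concept: membership degree} for a free-text term."""
    term = term.lower().strip()
    return {c: SequenceMatcher(None, term, c).ratio() for c in concepts}

def best_match(term, threshold=0.6):
    """Pick the concept with the highest membership, or None below the threshold."""
    scores = memberships(term)
    concept = max(scores, key=scores.get)
    return (concept, scores[concept]) if scores[concept] >= threshold else (None, 0.0)

print(best_match("public expendture"))   # a misspelled term still maps to a concept
```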