832 results for Issued-based approach


Relevance: 80.00%

Abstract:

This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings observed in traditional approaches to pointcut definition, which generally refer directly to the software structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes what is known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of every aspect must be reviewed to ensure that they remain valid. Our approach defines pointcuts against a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements, grouped by common characteristics, together with relationships between these views. Pointcut definitions are therefore written against the conceptual model rather than referencing the base model directly. Moreover, the set of relationships in the conceptual model makes it possible to verify automatically whether its classifications remain valid after a software change. To this end, development with the conceptual-views-based pointcuts approach is supported by a conceptual framework called CrossMDA2 and an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study that set up an evolution scenario and show how conceptual-views-based pointcuts help to detect and minimize pointcut fragility. The proposal is evaluated with the Goal/Question/Metric (GQM) technique, together with metrics for analyzing the efficiency of pointcut definition.
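The fragility described above can be illustrated with a small analogy outside of AspectJ or the CrossMDA2 framework. The sketch below is a minimal Python analogy, with hypothetical names (conceptual_view, OrderService, store_order): a pointcut expressed as a name pattern against the base code breaks when a method is renamed, while selection through an explicit conceptual classification of the element survives the change.

# Simplified analogy of name-based versus conceptual-view-based join point
# selection (hypothetical names; not the CrossMDA2 framework itself).
import fnmatch

VIEW_REGISTRY = {}  # conceptual view name -> set of qualified function names

def conceptual_view(view_name):
    """Classify a function under a conceptual view instead of relying on its name."""
    def decorator(func):
        VIEW_REGISTRY.setdefault(view_name, set()).add(func.__qualname__)
        return func
    return decorator

class OrderService:
    @conceptual_view("persistence-operation")
    def store_order(self, order):  # renamed from save_order in a later version
        return f"stored {order}"

def name_based_pointcut(pattern, qualname):
    """Fragile: selects join points by a structural name pattern (e.g. 'save_*')."""
    return fnmatch.fnmatch(qualname.split(".")[-1], pattern)

def view_based_pointcut(view_name, qualname):
    """More robust: selects join points through the conceptual classification."""
    return qualname in VIEW_REGISTRY.get(view_name, set())

if __name__ == "__main__":
    target = OrderService.store_order.__qualname__
    print(name_based_pointcut("save_*", target))                  # False: the old pointcut broke
    print(view_based_pointcut("persistence-operation", target))   # True: still selected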

Relevance: 80.00%

Abstract:

The Multiobjective Spanning Tree Problem is NP-hard and models applications in several areas. This research presents an experimental analysis of different strategies used in the literature to develop exact algorithms for the problem. Initially, the algorithms are classified according to the approaches used to solve the problem; some algorithms combine features of two or more approaches. The approaches investigated here are the two-stage method, branch-and-bound, k-best and the preference-based approach. The main contribution of this research is that no work to date has reported a systematic experimental analysis of exact algorithms for the Multiobjective Spanning Tree Problem, so this study can serve as a basis for other research dealing with the same problem. The computational experiments compare the performance of the algorithms with respect to processing time, efficiency as a function of the number of objectives, and the number of solutions found within a controlled time interval. The algorithms were analyzed on known instances of the problem as well as on instances obtained from a generator commonly used in the literature.
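As context for the approaches listed above, the sketch below illustrates, under simplifying assumptions, the weighted-sum scalarization that the first phase of a two-stage method typically uses for a bi-objective instance: each convex combination of the two edge costs is handed to an ordinary minimum spanning tree algorithm, and dominated objective vectors are filtered out. Unsupported efficient solutions, which the surveyed exact algorithms must also recover, are not computed here; the edge data and step count are made up.

# Weighted-sum phase for the bi-objective spanning tree problem (simplified).
# Edge format: (u, v, cost1, cost2); graph nodes are 0 .. n-1.

def kruskal(n, edges, key):
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for u, v, c1, c2 in sorted(edges, key=key):
        ru, rv = find(u), find(v)
        if ru != rv:                 # keep the edge only if it joins two components
            parent[ru] = rv
            tree.append((u, v, c1, c2))
    return tree

def supported_objective_vectors(n, edges, steps=50):
    """Scalarize both objectives and keep the non-dominated objective vectors."""
    points = set()
    for i in range(steps + 1):
        lam = i / steps
        tree = kruskal(n, edges, key=lambda e: lam * e[2] + (1 - lam) * e[3])
        points.add((sum(e[2] for e in tree), sum(e[3] for e in tree)))
    return sorted(p for p in points
                  if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points))

if __name__ == "__main__":
    edges = [(0, 1, 1, 9), (0, 2, 4, 3), (1, 2, 2, 6), (1, 3, 7, 1), (2, 3, 3, 5)]
    print(supported_objective_vectors(4, edges))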

Relevance: 80.00%

Abstract:

When crosscutting concern identification is performed from the beginning of development, during the requirements engineering activities, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents possible rework. Despite these advantages, however, identifying crosscutting concerns during requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this paper proposes a Grounded Theory based approach, called GT4CCI, for systematizing and grounding the identification of crosscutting concerns in the requirements document during the initial stages of the software development process. Grounded Theory is a renowned methodology for the qualitative analysis of data. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity across the entire software lifecycle.

Relevance: 80.00%

Abstract:

In this review, we provide a brief retrospective history of the science of animal welfare and recognize the sentience of non-human animals; however, we emphasize that crucial problems remain regarding how to define and measure animal welfare. In general, the use of physiological measures to assess welfare is discouraged. Furthermore, there is a theoretical background for measures of stress, but not for welfare states, because life may not be at risk. Instead, a preference- or choice-based approach, grounded in the animal's own decisions, is recommended. To this end, welfare is discussed and then contrasted with disease, health, stress and distress. In addition, the importance of prospective capacities for the welfare of human and non-human animals is discussed.

Relevance: 80.00%

Abstract:

Background: Hepatitis C virus (HCV) currently infects approximately three percent of the world population. In view of the lack of vaccines against HCV, there is an urgent need for an efficient treatment of the disease by an effective antiviral drug. Rational drug design has not been the primary route to major therapeutics; nevertheless, there are reports of success in developing inhibitors using a structure-based approach. One of the possible targets for drug development against HCV is the set of NS3 protease variants. Based on the three-dimensional structures of these variants we expect to identify new NS3 protease inhibitors. In order to speed up the modeling process, all NS3 protease variant models were generated on a Beowulf cluster. The potential of structural bioinformatics for the development of new antiviral drugs is discussed.

Results: The atomic coordinates of the crystallographic structures 1CU1 and 1DY9 were used as starting models for modeling the NS3 protease variant structures. The NS3 protease variant structures are composed of six subdomains, which occur in sequence along the polypeptide chain. The protease domain exhibits the dual beta-barrel fold that is common among members of the chymotrypsin serine protease family. The helicase domain contains two structurally related beta-alpha-beta subdomains and a third subdomain of seven helices and three short beta strands; the latter is usually referred to as the helicase alpha-helical subdomain. The rmsd values of bond lengths and bond angles, the average G-factor and the Verify3D values are presented for the NS3 protease variant structures.

Conclusions: This project increases the certainty that homology modeling is a useful tool in structural biology and that it can be very valuable in annotating genome sequence information and contributing to the structural and functional genomics of viruses. The structural models will be used to guide future efforts in the structure-based drug design of a new generation of inhibitors of NS3 protease variants. All models in the database are publicly accessible via our interactive website, providing a large collection of structural models for use in protein-ligand docking analysis.
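The stereochemical quality indicators cited above (rmsd of bond lengths and bond angles, the G-factor, Verify3D) come from dedicated validation tools and are not reproduced here. As a minimal, related illustration only, the sketch below computes the coordinate RMSD between a homology model and a reference structure, assuming the two structures are already superposed and their atoms are paired in the same order; the coordinates are made up.

# Coordinate RMSD between two equally ordered, pre-superposed sets of atoms.
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation over paired (x, y, z) coordinates, in the input units."""
    if len(coords_a) != len(coords_b):
        raise ValueError("coordinate sets must contain the same number of atoms")
    squared = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                  for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(squared / len(coords_a))

if __name__ == "__main__":
    model     = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]   # made-up positions (Angstrom)
    reference = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (1.6, 1.5, 0.1)]
    print(f"RMSD = {rmsd(model, reference):.3f} Angstrom")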

Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

We use the QCD pomeron model proposed by Landshoff and Nachtmann to compute the differential and total cross-sections for pp scattering, in order to discuss a QCD-based approach to the proton form factor. This model depends strongly on the experimental electromagnetic form factor, and it is not entirely clear why this form factor gives good results even at moderate momentum transfer. We replace the electromagnetic form factor with the asymptotic QCD proton form factor determined by Brodsky and Lepage (BL), supplemented by a prescription for its low-energy behavior dictated by the existence of a dynamically generated gluon mass. We fit the data with this QCD-inspired form factor and a value of the dynamical gluon mass consistent with those determined in the literature. Our results also provide a determination of the proton wave function at the origin, which appears in the BL form factor.
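For background only, the two form factors contrasted above have widely used textbook forms: the dipole parametrization of the proton electromagnetic form factor and the leading asymptotic power-law scaling of the Brodsky-Lepage QCD form factor. The expressions below state these standard forms; the paper's specific low-energy prescription involving the dynamical gluon mass is not reproduced here.

% Standard dipole parametrization and Brodsky-Lepage asymptotic scaling (background forms only).
\begin{align}
  G_{D}(t) &= \left(1 - \frac{t}{0.71\,\mathrm{GeV}^{2}}\right)^{-2}, \\
  F_{1}^{\mathrm{BL}}(Q^{2}) &\sim \frac{\bigl[\alpha_{s}(Q^{2})\bigr]^{2}}{Q^{4}}
  \qquad (Q^{2}\to\infty).
\end{align}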

Relevance: 80.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 80.00%

Abstract:

Superstitions are found everywhere in our lives, and medicine, a profession that prides itself on an evidence-based approach to treatment, is not exempt. A superstition that pervades the labor and delivery floor is that it is busier during certain phases of the lunar cycle, specifically the full moon. Although some studies have demonstrated an increase in deliveries related to the lunar cycle, there has been disagreement about when in the lunar cycle the peak volume occurs. Given the divergent results in the literature relating the lunar cycle to deliveries, the aim of this review was to survey the literature in an attempt to explain this element of popular culture on the basis of the results presented by different researchers.

Relevance: 80.00%

Abstract:

Includes bibliography

Relevance: 80.00%

Abstract:

Ecologists usually estimate means, but devote much less attention to variation. The study of variation is a key aspect of understanding natural systems and of making predictions about them. In community ecology, most studies focus on local species diversity (alpha diversity), but only in recent decades have ecologists devoted proper attention to variation in community composition among sites (beta diversity), in spite of the fact that the first attempts to estimate beta diversity date back to the pioneering work of Koch and Whittaker in the 1950s. In the last decade, progress has been made in the development both of methods and of hypotheses about the origin and maintenance of variation in community composition. For instance, methods are now available to partition the total diversity of a region (gamma diversity) into a local component (alpha) and several beta components, each corresponding to one scale in a hierarchy. The popularization of the so-called raw-data approach (based on partial constrained ordination techniques) and the distance-based approach (based on correlations of dissimilarity/distance matrices) has allowed many ecologists to address current hypotheses about beta diversity patterns. Overall, these hypotheses are based on niche and neutral theory, accounting for the relative roles of environmental and spatial processes (or a combination of them) in shaping metacommunities. Recent studies have addressed these issues on a variety of spatial and temporal scales, habitats and taxonomic groups. Moreover, life history and functional traits of species, such as dispersal ability and rarity, have begun to be considered in studies of beta diversity. In this article we briefly review some of these new tools and approaches developed in recent years, and illustrate them using case studies in aquatic ecosystems.
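As a minimal worked illustration of the diversity partition and the distance-based approach mentioned above, the sketch below computes gamma, mean alpha and both the additive and multiplicative (Whittaker) beta components for a made-up site-by-species abundance table, together with a Bray-Curtis dissimilarity matrix of the kind used in matrix-correlation analyses; it does not implement the partial constrained ordination techniques of the raw-data approach.

# Toy alpha/beta/gamma partition of species richness plus a Bray-Curtis
# dissimilarity matrix (made-up data).

abundances = [        # rows = sites, columns = species
    [5, 0, 3, 2],
    [0, 4, 1, 0],
    [2, 2, 0, 6],
]

def richness(counts):
    return sum(1 for x in counts if x > 0)

gamma = richness([sum(col) for col in zip(*abundances)])               # regional richness
alpha = sum(richness(site) for site in abundances) / len(abundances)   # mean local richness
beta_additive = gamma - alpha           # additive partition of gamma
beta_whittaker = gamma / alpha          # Whittaker's multiplicative beta

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    denominator = sum(x + y for x, y in zip(a, b))
    return sum(abs(x - y) for x, y in zip(a, b)) / denominator if denominator else 0.0

dissimilarity = [[bray_curtis(si, sj) for sj in abundances] for si in abundances]

print(f"gamma = {gamma}, mean alpha = {alpha:.2f}, "
      f"beta additive = {beta_additive:.2f}, beta multiplicative = {beta_whittaker:.2f}")
for row in dissimilarity:
    print([round(v, 2) for v in row])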