105 results for Credit Systems and Researcher Evaluation


Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to comparatively evaluate the mechanical strength of squared and rectangular 2.0 mm system miniplates against the standard configuration of two straight miniplates for stabilizing fractures in the anterior mandible. Ninety synthetic polyurethane mandible replicas were used in the mechanical tests. The samples were divided into six groups covering three different fixation methods. Groups 1, 2 and 3 presented complete symphyseal fractures, characterized by a linear separation between the central incisors, and groups 4, 5 and 6 presented complete parasymphyseal fractures with an oblique design. Groups 1 and 4 represented the standard technique with two straight miniplates parallel to each other; groups 2 and 5 were stabilized with squared miniplates; and groups 3 and 6 were fixed with the rectangular design. Each group was subjected to a mechanical test at a displacement speed of 10 mm/min on a universal testing machine, receiving a linear vertical load on the region of the left first molar. The maximum load and the load at 5 mm of displacement were recorded and statistically analyzed using 95% confidence intervals. The fixation systems using squared (G2) and rectangular (G3) miniplates obtained similar results: no statistically significant differences in maximum load or load at 5 mm of displacement were found when compared to the standard method in symphyseal fractures (G1). In parasymphyseal fractures, the method using squared miniplates (G5) showed no significant differences in maximum load or load at 5 mm compared to the standard configuration (G4), whereas the method using rectangular miniplates (G6) showed statistically significantly inferior results compared to the standard configuration (G4). The mechanical behavior of the fixation methods was therefore similar, except when rectangular miniplates were used in parasymphyseal fractures; all fixation methods showed statistically significantly better results in symphyseal fractures.
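The statistical procedure above compares groups via overlapping 95% confidence intervals. A minimal sketch of that computation, using the normal approximation (z = 1.96) and hypothetical maximum-load values, not the study's actual data:

```python
import statistics
from math import sqrt

def confidence_interval_95(loads):
    """Return (mean, lower, upper) of a 95% CI for the mean,
    using the normal approximation (z = 1.96)."""
    n = len(loads)
    mean = statistics.mean(loads)
    sem = statistics.stdev(loads) / sqrt(n)  # standard error of the mean
    margin = 1.96 * sem
    return mean, mean - margin, mean + margin

# Hypothetical maximum-load values (N) for one fixation group:
loads = [118.2, 121.5, 119.8, 122.1, 120.4]
mean, low, high = confidence_interval_95(loads)
print(f"mean = {mean:.1f} N, 95% CI = [{low:.1f}, {high:.1f}]")
```

When the intervals of two groups overlap, as for G2/G3 versus G1 here, the difference is not considered statistically significant.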

Relevance:

30.00%

Publisher:

Abstract:

The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. This raises challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. Networks-on-chip have been considered the ideal interconnection and communication mechanism for this type of architecture, due to their scalability, reusability and intrinsic parallelism. Communication in a network-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers of the network, from the source to the destination of the communication, and simultaneous communications between different source-destination pairs are possible. Building on this, it is proposed to transform the entire communication infrastructure of the network-on-chip, with its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, packets are formed by the instructions and data that represent the applications, and they are executed by the routers as they are transmitted, exploiting the pipeline and the parallelism of the communication. Traditional processors are not used; only simple cores that control access to memory. An implementation of this idea, called IPNoSys (Integrated Processing NoC System), has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support. As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing various parameters to be configured and several results to be obtained for its evaluation.
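The core idea, instructions being consumed by routers while the packet travels, can be illustrated with a toy model. This is a hypothetical sketch, not the actual IPNoSys instruction set or routing algorithm:

```python
# Toy model: a packet carries instructions and a partial result;
# each router on the path executes one instruction while forwarding,
# so computation overlaps with communication.

def router_execute(acc, instr):
    """One router consumes one instruction, updating the partial result."""
    op, value = instr
    if op == "ADD":
        return acc + value
    if op == "MUL":
        return acc * value
    raise ValueError(f"unknown op {op}")

def send_packet(path, instructions, initial=0):
    """Route a packet hop by hop; one instruction is consumed per hop."""
    acc = initial
    queue = list(instructions)
    for _hop in path:
        if queue:
            acc = router_execute(acc, queue.pop(0))
    return acc  # result is available when the packet reaches its destination

result = send_packet(path=["R0", "R1", "R2"],
                     instructions=[("ADD", 5), ("MUL", 3)], initial=2)
print(result)  # (2 + 5) * 3 = 21
```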

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, the importance of using software processes is well consolidated and considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry: it involves adapting software process models to the reality of each project, and it should also promote the reuse of past experiences when defining and developing software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the resulting software systems. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. The first study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. The second, a comparative quantitative study, considered internal attributes of software process line specifications, such as modularity, size and complexity. Finally, a controlled experiment evaluated the effort to use, and the understandability of, the investigated approaches when modeling and evolving software process line specifications. The studies bring evidence of several benefits of the annotative approach, and of the potential of integrating it with the compositional approach, to assist the variability management of software process lines.
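The annotative style keeps all variants in a single model, with elements annotated by the features that enable them; derivation filters the model against a configuration. A minimal sketch of this idea, with illustrative activity and feature names rather than the dissertation's actual notation:

```python
# Annotative software process line: each process element carries the
# set of features that must be selected for it to appear in a derived
# process. An empty set means the element is mandatory.

process_line = [
    {"activity": "Requirements elicitation", "features": set()},
    {"activity": "Formal specification",     "features": {"critical"}},
    {"activity": "Code review",              "features": {"large-team"}},
    {"activity": "Deployment",               "features": set()},
]

def derive_process(process_line, configuration):
    """Keep an activity only if all its annotated features are selected."""
    return [e["activity"] for e in process_line
            if e["features"] <= configuration]

print(derive_process(process_line, {"critical"}))
```

The compositional style, by contrast, would keep each variant in a separate module and compose the selected modules, which is the trade-off the studies above compare.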

Relevance:

30.00%

Publisher:

Abstract:

The use of clustering methods for the discovery of cancer subtypes has drawn a great deal of attention in the scientific community. While bioinformaticians have proposed new clustering methods that take advantage of characteristics of gene expression data, the medical community prefers classic clustering methods. No study so far has performed a large-scale evaluation of different clustering methods in this context. This work presents the first large-scale analysis of seven clustering methods and four proximity measures over 35 cancer gene expression data sets. The results reveal that the finite mixture of Gaussians, followed closely by k-means, exhibited the best performance in recovering the true structure of the data sets. These methods also exhibited, on average, the smallest difference between the actual number of classes in a data set and the best number of clusters indicated by our validation criteria. Furthermore, hierarchical methods, which have been widely used by the medical community, exhibited poorer recovery performance than the other methods evaluated. Finally, as a stable basis for the assessment and comparison of clustering methods for cancer gene expression data, this study provides a common group of benchmark data sets to be shared among researchers and used in comparisons with new methods.
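"Recovering the true structure" is typically measured by comparing the clustering against the known classes with a pair-counting index. A self-contained sketch of the (unadjusted) Rand index on a hypothetical two-subtype example; the study itself may use a corrected variant:

```python
from itertools import combinations

def rand_index(true_labels, cluster_labels):
    """Fraction of point pairs on which the two partitions agree
    (both together or both apart); 1.0 means perfect recovery."""
    pairs = list(combinations(range(len(true_labels)), 2))
    agree = sum(
        (true_labels[i] == true_labels[j]) == (cluster_labels[i] == cluster_labels[j])
        for i, j in pairs
    )
    return agree / len(pairs)

# Hypothetical result: two cancer subtypes, one sample misassigned.
truth    = [0, 0, 0, 1, 1, 1]
clusters = [0, 0, 1, 1, 1, 1]
print(round(rand_index(truth, clusters), 3))
```

Note that the index is invariant to cluster relabeling, which is essential since cluster numbers are arbitrary.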

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings of traditional pointcut definition approaches, which generally refer directly to software structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes the well-known pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of each aspect must be reviewed to ensure that they remain valid. Our approach defines pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements, based on common characteristics, together with relationships between these views. Pointcut definitions are thus created against the conceptual model rather than referencing the base model directly. Moreover, the conceptual model contains a set of relationships that makes it possible to automatically verify whether the classifications in the conceptual model remain valid after a software change. Development using the conceptual-views-based pointcuts approach is supported by a conceptual framework called CrossMDA2 and by an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how conceptual-views-based pointcuts help to detect and minimize pointcut fragility. The proposal is evaluated with the Goal/Question/Metric (GQM) technique, together with metrics for analyzing the efficiency of pointcut definitions.
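The fragility problem can be illustrated with a deliberately simplified analogy. All method names and the "persistence" view below are hypothetical, and real pointcut languages (e.g. AspectJ) are far richer than this name-matching sketch:

```python
import re

# Base code after an evolution step: "saveInvoice" was renamed
# to "persistInvoice".
methods = ["saveOrder", "saveCustomer", "persistInvoice"]

# 1) Syntactic pointcut: matches by name pattern, so it is coupled
#    to the naming convention and silently misses the renamed method.
syntactic = [m for m in methods if re.match(r"save.*", m)]

# 2) Conceptual view: methods classified as "persistence" in the
#    conceptual model; the classification is updated and checked
#    alongside the code when it evolves.
persistence_view = {"saveOrder", "saveCustomer", "persistInvoice"}
conceptual = [m for m in methods if m in persistence_view]

print(syntactic)   # misses the renamed method
print(conceptual)  # still covers all persistence operations
```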

Relevance:

30.00%

Publisher:

Abstract:

Brazil is going through the transition from analog to digital television transmission. In addition to providing high-quality audio and video, this new technology also allows applications to run on the television. Devices called set-top boxes are needed to receive the new signal and to create the environment required to run these applications. At first, the only way to interact with such applications is through the remote control. However, the remote control has serious usability problems when used with some types of applications. This research proposes a software implementation that creates an environment allowing a smartphone to interact with these applications. Besides this implementation, a comparative study is performed between using the remote control and using a smartphone to interact with digital television applications, taking usability-related parameters into account. The analysis of the data collected in this comparative study makes it possible to identify which device provides the more engaging interactive experience for users.

Relevance:

30.00%

Publisher:

Abstract:

Product derivation tools are responsible for automating the development process of software product lines. The configuration knowledge, which maps the problem space to the solution space, plays a fundamental role in product derivation approaches. Each approach adopts different strategies and techniques to manage the variabilities present in code assets, and there is a lack of empirical studies analyzing these different approaches. This dissertation systematically compares automatic product derivation approaches through two empirical studies. The studies are analyzed from two perspectives: (i) a qualitative one, which analyzes the characteristics of the approaches using specific criteria; and (ii) a quantitative one, which quantifies specific properties of the product derivation artifacts produced by the different approaches. A set of criteria and metrics is also proposed to support the qualitative and quantitative analyses. Two software product lines, from the web and mobile application domains, are the targets of our study.
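The problem-space-to-solution-space mapping can be made concrete with a small sketch. Feature names and file paths below are illustrative, not taken from the studied product lines:

```python
# Configuration knowledge as an explicit feature-to-asset mapping:
# given a product configuration (a set of selected features), the
# derivation step selects the code assets that compose the product.

configuration_knowledge = {
    "base":      ["core/app.py"],        # always included
    "payments":  ["modules/payment.py"],
    "mobile-ui": ["ui/mobile.py"],
    "web-ui":    ["ui/web.py"],
}

def derive_product(selected_features):
    """Resolve a configuration into the list of assets for the product."""
    assets = []
    for feature in ["base"] + sorted(selected_features):
        assets.extend(configuration_knowledge[feature])
    return assets

print(derive_product({"payments", "web-ui"}))
```

The approaches compared in the dissertation differ precisely in how this mapping is expressed and how variability is realized inside the assets themselves.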

Relevance:

30.00%

Publisher:

Abstract:

The academic community and the software industry have shown, in recent years, substantial interest in approaches and technologies related to model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work explores both trends through an experiment using MDD technology, evaluating its use to solve an actual problem in the context of enterprise system security. By building and using a visual DSL called CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted with a development team. The results and lessons learned from evaluating this tool within industry are the main contributions of this work.

Relevance:

30.00%

Publisher:

Abstract:

The increasing complexity of integrated circuits has boosted the development of communication architectures such as networks-on-chip (NoCs) as an architectural alternative for the interconnection of systems-on-chip (SoCs). Networks-on-chip favor component reuse, parallelism and scalability, enhancing reusability in dedicated application projects. Many proposals in the literature suggest different configurations for network-on-chip architectures. Among them, the IPNoSys architecture is a non-conventional one, since it allows operations to be executed while the communication process is performed. This study evaluates the execution of dataflow-based applications on IPNoSys, focusing on their adaptation to its design constraints. Dataflow-based applications are characterized by a continuous stream of data on which operations are executed. We expect this type of application to perform well on IPNoSys, because its programming model is similar to the execution model of this network. By observing the behavior of these applications running on IPNoSys, changes were made to the execution model of the IPNoSys network, enabling instruction-level parallelism. To this end, implementations of dataflow applications were analyzed and compared.

Relevance:

30.00%

Publisher:

Abstract:

Mining Software Repositories (MSR) is a research area that analyzes software repositories in order to derive relevant information for the research and practice of software engineering. The main goal of repository mining is to transform static information from repositories (e.g. the code repository or the change request system) into valuable information that supports decision making in software projects. Another research area, called Process Mining (PM), aims to discover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have performed several analyses combining MSR and PM techniques: (i) investigating the evolution of software projects; (ii) understanding the real underlying process of a project; and (iii) creating defect prediction models. However, few works have focused on analyzing the contributions of software developers by means of MSR and PM techniques. In this context, this dissertation develops two empirical studies that assess the contributions of software developers to an open-source project and to a commercial project using those techniques. The contributions of developers are assessed from three perspectives: (i) buggy commits; (ii) the size of commits; and (iii) the most important bugs. For the open-source project, 12,827 commits and 8,410 bugs were analyzed; for the commercial project, 4,663 commits and 1,898 bugs. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more important bug fixes, while for the commercial project the results could not indicate statistically significant differences between developer groups.
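One building block of such analyses is linking commits to bug fixes and comparing developer groups. A hedged sketch with a simple message heuristic; the commit data and the core/peripheral split below are hypothetical, and real studies use more robust bug-linking techniques:

```python
import re

# Hypothetical commit log with a developer-group classification.
commits = [
    {"author": "alice", "group": "core",       "message": "fix NPE in parser (bug 101)"},
    {"author": "alice", "group": "core",       "message": "add streaming API"},
    {"author": "bob",   "group": "peripheral", "message": "update docs"},
    {"author": "carol", "group": "core",       "message": "fix crash on resize"},
]

BUGFIX = re.compile(r"\b(fix|bug)\b", re.IGNORECASE)

def bugfix_ratio(commits, group):
    """Share of a group's commits whose message indicates a bug fix."""
    own = [c for c in commits if c["group"] == group]
    fixes = [c for c in own if BUGFIX.search(c["message"])]
    return len(fixes) / len(own)

print(round(bugfix_ratio(commits, "core"), 3))  # 2 of 3 core commits
```

Identifying fix commits like this is also the usual first step before tracing back to the commits that introduced the bugs.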

Relevance:

30.00%

Publisher:

Abstract:

One way to deal with the high complexity of current software systems is through self-adaptive systems. A self-adaptive system must be able to monitor itself and its environment, analyze the monitored data to determine the need for adaptation, decide how the adaptation will be performed, and finally make the necessary adjustments. One way to adapt a system is to generate, at runtime, the process that will perform the adaptation. An advantage of this approach is the possibility of taking into account features that can only be evaluated at runtime, such as the emergence of new components that allow architectural arrangements not foreseen at design time. The main objective of this work is to use a framework for the dynamic generation of processes to generate architectural adaptation plans in an OSGi environment. Our main interest is to evaluate how this framework behaves in new environments.
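The monitor/analyze/decide/adjust cycle described above follows the classic MAPE structure of self-adaptive systems. A minimal sketch of one iteration; the component names, threshold and adaptation effect are illustrative, not the OSGi framework's actual API:

```python
# One iteration of a monitor-analyze-plan-execute (MAPE) loop.

def monitor(system):
    """Collect runtime data from the managed system."""
    return {"load": system["load"]}

def analyze(data, threshold=0.8):
    """Decide whether adaptation is needed."""
    return data["load"] > threshold

def plan(system):
    """Generate an adaptation plan at runtime; here, a single step."""
    return [("add_replica", "serviceA")]

def execute(system, adaptation_plan):
    """Apply the plan to the managed system."""
    for action, target in adaptation_plan:
        if action == "add_replica":
            system["replicas"][target] += 1
            system["load"] /= 2  # simplistic effect model

system = {"load": 0.9, "replicas": {"serviceA": 1}}
data = monitor(system)
if analyze(data):
    execute(system, plan(system))
print(system)  # load halved, one replica added
```

Generating the plan at runtime, rather than selecting a predefined one, is what lets the approach exploit components that only appear after design time.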

Relevance:

30.00%

Publisher:

Abstract:

Being available as a tourist destination is a necessary but not sufficient condition for the expansion and success of tourism activity. To be successful, tourism requires investment, inputs, and appropriate planning and management, like any other economic activity. A fundamental goal of destination management is to understand how the competitiveness of a tourist destination can be improved and sustained. The competitive position of a destination can be measured and assessed by various models. Evaluating the competitiveness indicators of a tourist destination involves a multivariate analysis, ranging from issues directly related to the tourism activity itself to indirect factors; these elements are interrelated and together indicate the competitive condition of the destination. Starting from the definition and characterization of competitiveness, sustainability and management in the context of tourist destinations, understood as the main concepts of this study, we present the main theoretical and methodological models for assessing the competitiveness of tourist destinations found in the literature, which represent the state of the art in the scientific treatment of the subject. These models, designed by researchers from several countries and applied to different tourist destinations, are compared with respect to their structure, the indicators they consider and the localities in which they were applied. The aim of this study was to assess the tourist competitiveness of the destination Pólo Costa das Dunas, based on the superior-performance attributes of the Competenible model for evaluating destination competitiveness, proposed by Mazaro, which addresses the requirements of an international market aware of the strength and importance of sustainability. The competitiveness of the Rio Grande do Norte tourist destination Pólo Costa das Dunas was found to be moderate. The competitive strengths and weaknesses of the destination, revealed through the dozens of sustainability attributes of the Competenible model, point to guidelines and initiatives that can guide strategic decisions related to its planning and management. This study should thus serve as support for the strategic planning and long-term management of the sector, and as a tool for decisions related to public policies, sectoral investments, monitoring processes, strategic planning, and the direction and control of local and regional tourism development.

Relevance:

30.00%

Publisher:

Abstract:

For many decades, the problems raised by the indiscriminate use of pesticides in modern agriculture have drawn the interest of researchers seeking to understand the effects of such products on the environment and, consequently, on the lives of those who use them (farmers) and of the people living near agricultural areas. In this context, this research aimed to understand the environmental perception of the inhabitants of the Distrito Irrigado do Baixo-Açu (DIBA), located in the semiarid region of Rio Grande do Norte, regarding the use of pesticides and their possible environmental effects, and also to evaluate the toxicity of waters from agricultural runoff in this region using ecotoxicological assays with Ceriodaphnia silvestrii. Eighty-six interviews were conducted with dwellers and farmers of the DIBA. The evaluation of the interviews identified the inappropriate disposal of empty pesticide packages as one of the major local problems. The samples collected for the ecotoxicological evaluation showed variable toxicity: the collection point receiving waters from different crops presented toxicity for the tested species in four out of five samples. It is therefore concluded that the indiscriminate use of pesticides in agricultural practice has the potential to pollute the irrigation waters, and that the lack of guidance for farmers on handling these products contributes to the risk of environmental contamination and to a possible decrease in the quality of life of the region's inhabitants.

Relevance:

30.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is a methodology for evaluating the environmental impact of products and production systems over their entire life cycle, from raw material acquisition to final disposal. This work investigated the progress of LCA studies in Brazil through a bibliographic survey of events and journals that are official or recognized by the Brazilian Association of Production Engineering, and of the SciELO Brazil database. Eighty articles were identified, most from institutions in the South and Southeast regions. The Universidade de São Paulo (USP) and the Universidade Federal de Santa Catarina (UFSC) had the largest number of publications among the 50 institutions identified. Seventeen articles effectively applied the LCA methodology in a case study: 11 used it to evaluate a production process and 6 to compare materials or processes.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, there are many aspect-oriented middleware implementations that take advantage of the modularity provided by the aspect-oriented paradigm. Although these works always present an assessment of the middleware according to some quality attribute, there is no specific set of metrics to assess them comprehensively, across various quality attributes. This work proposes a suite of metrics for assessing aspect-oriented middleware systems at different development stages: design, refactoring, implementation and runtime. The work presents the metrics and how they are applied at each development stage. The suite is composed of metrics associated with static properties (modularity, maintainability, reusability, flexibility, complexity, stability, and size) and dynamic properties (performance and memory consumption). These metrics are based on existing assessment approaches for object-oriented and aspect-oriented systems. The proposed metrics are applied in the context of OiL (Orb in Lua), a CORBA-based middleware implemented in Lua, and AO-OiL, a refactoring of OiL that follows a reference architecture for aspect-oriented middleware systems. The case study performed on OiL and AO-OiL is an oil-well monitoring system. This work also presents the CoMeTA-Lua tool, which automates the collection of coupling and size metrics from Lua source code.
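A static coupling metric over Lua source can be as simple as counting distinct required modules. A minimal sketch in the spirit of such tooling (not CoMeTA-Lua's actual implementation); the Lua snippet is illustrative:

```python
import re

lua_source = '''
local oil = require "oil"
local socket = require("socket")
local oil2 = require "oil"   -- repeated dependency
'''

def coupling(source):
    """Number of distinct modules pulled in via require()."""
    modules = re.findall(r'require\s*\(?\s*"([^"]+)"', source)
    return len(set(modules))

print(coupling(lua_source))  # 2 distinct modules: oil, socket
```

Dynamic properties such as performance and memory consumption would instead be collected at runtime, which is why the suite distinguishes development stages.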