946 results for software failure prediction


Relevance:

30.00%

Publisher:

Abstract:

The financial results of organizations are the object of continuous study and analysis, and predicting their behaviour is a permanent task for entrepreneurs, investors, analysts and academics. This paper explores the impact of asset size (total asset value) on operating and net results, first analysing the relationship between these variables using traditional financial-analysis indicators, such as operating and net profitability, and using descriptive statistics to classify the data as linear or nonlinear. Having subsequently found that the 2012 financial results of the companies supervised by the Superintendencia de Sociedades behave nonlinearly, the relationship between assets and results is then analysed using phase spaces and recurrence analysis, tools well suited to chaotic and complex systems. As its source of information, the research used the year-end 2012 financial reports of the Superintendencia de Sociedades (Superintendencia de Sociedades, 2012).
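The recurrence-analysis step mentioned in the abstract can be illustrated with a minimal, self-contained sketch; the series, embedding parameters and threshold below are invented for illustration and are not the study's data:

```python
# Minimal recurrence-matrix sketch for a 1-D series (illustrative only).
# Phase-space vectors are built by time-delay embedding; two states are
# "recurrent" when their distance falls below a chosen threshold eps.

def embed(series, dim=2, delay=1):
    """Time-delay embedding: list of dim-dimensional state vectors."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + j * delay] for j in range(dim)) for i in range(n)]

def recurrence_matrix(series, dim=2, delay=1, eps=0.5):
    """Binary recurrence matrix: R[i][j] = 1 if states i and j are close."""
    states = embed(series, dim, delay)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [[1 if dist(a, b) <= eps else 0 for b in states] for a in states]

# Toy example: a periodic series shows recurrences off the main diagonal.
series = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
R = recurrence_matrix(series, dim=2, delay=1, eps=0.1)
recurrence_rate = sum(map(sum, R)) / (len(R) ** 2)
```

In a recurrence plot, diagonal line structures in R are the signature of deterministic (possibly chaotic) dynamics, which is the kind of structure the study looks for in the financial data.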

Relevance:

30.00%

Publisher:

Abstract:

Accelerated failure time models with a shared random component are described, and are used to evaluate the effect of explanatory factors and different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and baseline hazard function are considered and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright © 2004 John Wiley & Sons, Ltd.
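The log-linear AFT structure with a shared centre-level random effect can be sketched by simulation; the model form log T = b0 + b1·x + u_centre + σ·ε is the standard AFT setup, but every parameter value below is invented for illustration:

```python
import math
import random

random.seed(0)

# Accelerated failure time model with a shared random effect per centre:
#   log(T_ij) = b0 + b1 * x_ij + u_j + sigma * eps_ij
# where u_j ~ N(0, tau^2) is shared by all patients in centre j.
b0, b1, sigma, tau = 2.0, -0.5, 0.4, 0.3

def simulate_centre(n_patients, centre_effect=None):
    """Simulate survival times for one transplant centre (illustrative)."""
    u = random.gauss(0, tau) if centre_effect is None else centre_effect
    times = []
    for _ in range(n_patients):
        x = random.random()        # a standardized covariate
        eps = random.gauss(0, 1)   # log-normal AFT errors
        times.append(math.exp(b0 + b1 * x + u + sigma * eps))
    return times

# Under an AFT model the centre effect rescales time itself:
# a positive u_j stretches survival times multiplicatively by exp(u_j).
slow = simulate_centre(500, centre_effect=0.8)
fast = simulate_centre(500, centre_effect=-0.8)
```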

Relevance:

30.00%

Publisher:

Abstract:

The availability of a network depends strongly on the frequency of service outages and the recovery time for each outage. Losses of network resources include complete or partial failures of hardware and software components, power outages, scheduled maintenance of software and hardware, operational errors such as configuration mistakes, and acts of nature such as floods, tornadoes and earthquakes. This paper proposes a practical approach to enhancing QoS routing by providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and an LSP request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examines several methods, including single, double and multi-repair routes and the prioritization of signals along the protected paths, to improve Quality of Service (QoS) and throughput and to reduce protection-path placement cost, delay, congestion and collision.
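The core idea of precomputing a repair path that shares no links with the working path can be sketched on a toy topology. The node names and graph below are hypothetical, and the paper's actual scheme (protected nodes, multi-repair paths) is more elaborate than this link-disjoint sketch:

```python
from collections import deque

# Sketch: establish a working path, then find a link-disjoint repair
# path by searching the graph with the working path's links excluded.

def bfs_path(graph, src, dst, banned_links=frozenset()):
    """Shortest path by hop count, avoiding any link in banned_links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in graph.get(node, []):
            link = frozenset((node, nxt))
            if nxt not in seen and link not in banned_links:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no repair path exists

# Toy MPLS-like topology (undirected adjacency lists)
graph = {
    "ingress": ["a", "b"],
    "a": ["ingress", "egress"],
    "b": ["ingress", "c"],
    "c": ["b", "egress"],
    "egress": ["a", "c"],
}

working = bfs_path(graph, "ingress", "egress")
used = {frozenset((u, v)) for u, v in zip(working, working[1:])}
repair = bfs_path(graph, "ingress", "egress", banned_links=used)
```

Because the repair path shares no links with the working path, any single link failure on the working path leaves the repair path intact.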

Relevance:

30.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are felt mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with a computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe restrictions on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.

Relevance:

30.00%

Publisher:

Abstract:

The FunFOLD2 server is a new independent server that integrates our novel protein–ligand binding site and quality assessment protocols for the prediction of protein function (FN) from sequence via structure. Our guiding principles were, first, to provide a simple unified resource to make our function prediction software easily accessible to all via a simple web interface and, second, to produce integrated output for predictions that can be easily interpreted. The server provides a clean web interface so that results can be viewed on a single page and interpreted by non-experts at a glance. The output for the prediction is an image of the top predicted tertiary structure annotated to indicate putative ligand-binding site residues. The results page also includes a list of the most likely binding site residues and the types of predicted ligands and their frequencies in similar structures. The protein–ligand interactions can also be interactively visualized in 3D using the Jmol plug-in. The raw machine readable data are provided for developers, which comply with the Critical Assessment of Techniques for Protein Structure Prediction data standards for FN predictions. The FunFOLD2 webserver is freely available to all at the following web site: http://www.reading.ac.uk/bioinf/FunFOLD/FunFOLD_form_2_0.html.

Relevance:

30.00%

Publisher:

Abstract:

Despite the prediction of the demise of cities with the advance of new information and communication technologies in the New Economy, the software industry has emerged from cities in the USA, Europe and Asia in the past two decades. This article explores the reasons why cities are centres of software clusters, with reference to Boston, London and Dublin. It is suggested that cities' roles as centres of knowledge flows and creativity are the key determinants of their competitiveness in the knowledge-intensive software industry.

Relevance:

30.00%

Publisher:

Abstract:

This work analyses the Brazilian software industry, in particular software for export and software factories, focusing on the strategies and challenges of the Brazilian Information and Communication Technology entrepreneurs operating in this segment. Drawing on considerations of Taylorism, Fordism, labour flexibility, current labour legislation, the competitiveness of the software market, maturity in management processes and corporate social responsibility, it puts into perspective the main factors that can influence the success or failure of Brazilian Information and Communication Technology companies investing in the software-factory segment.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Mining Software Repositories (MSR) is a research area that analyzes software repositories in order to derive information relevant to the research and practice of software engineering. The main goal of repository mining is to turn the static information stored in repositories (e.g. a code repository or a change request system) into valuable information that supports decision making in software projects. Another research area, Process Mining (PM), aims to uncover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have combined MSR and PM techniques: (i) to investigate the evolution of software projects; (ii) to understand the real underlying process of a project; and (iii) to create defect prediction models. However, few works have focused on analyzing the contributions of individual software developers by means of MSR and PM techniques. In this context, this dissertation presents two empirical studies that assess the contribution of software developers to an open-source project and to a commercial project using those techniques. The contributions of developers are assessed from three perspectives: (i) buggy commits; (ii) the size of commits; and (iii) the most important bugs. For the open-source project 12,827 commits and 8,410 bugs were analyzed, while 4,663 commits and 1,898 bugs were analyzed for the commercial project. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more important bug fixes, while for the commercial project the results could not indicate statistically significant differences between developer groups.
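The core/peripheral contribution analysis described above can be sketched as follows; the commit records and the 80% core-share heuristic are invented for illustration, not the dissertation's actual data or thresholds:

```python
from collections import Counter

# Sketch: classify developers as core/peripheral by commit share and
# count buggy commits per group. The commit records are invented;
# a real MSR study extracts them from a version-control repository.
commits = [
    {"author": "alice", "buggy": True},  {"author": "alice", "buggy": False},
    {"author": "alice", "buggy": True},  {"author": "alice", "buggy": False},
    {"author": "bob",   "buggy": False}, {"author": "bob",   "buggy": True},
    {"author": "carol", "buggy": False},
]

def split_core_peripheral(commits, core_share=0.8):
    """Core developers: the smallest set authoring >= core_share of commits."""
    counts = Counter(c["author"] for c in commits)
    core, covered = set(), 0
    for author, n in counts.most_common():
        if covered / len(commits) >= core_share:
            break
        core.add(author)
        covered += n
    return core

core = split_core_peripheral(commits)
buggy_by_group = Counter(
    ("core" if c["author"] in core else "peripheral")
    for c in commits if c["buggy"]
)
```

A statistical comparison between the resulting groups (as in the dissertation) would then test whether the per-group counts differ significantly.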

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Computer software can be used to predict orthognathic surgery outcomes. The aim of this study was to subjectively compare the soft-tissue surgical simulations of 2 software programs. Methods: Standard profile pictures were taken of 10 patients with a Class III malocclusion and a concave facial profile who were scheduled for double-jaw orthognathic surgery. The patients had horizontal maxillary deficiency or horizontal mandibular excess. Two software programs (Dentofacial Planner Plus [Dentofacial Software, Toronto, Ontario, Canada] and Dolphin Imaging [version 9.0, Dolphin Imaging Software, Canoga Park, Calif]) were used to predict the postsurgical profiles. The predictive images were compared with the actual final photographs. One hundred one orthodontists, oral-maxillofacial surgeons, and general dentists evaluated the images and were asked whether they would use either software program to plan treatment for, or to educate, their patients. Results: Statistical analyses showed differences between the groups when each point was judged. Dolphin Imaging gave better predictions of the nasal tip, chin, and submandibular area. Dentofacial Planner Plus was better at predicting the nasolabial angle and the upper and lower lips. The total profile comparison showed no statistical difference between the two programs. Conclusions: The 2 types of software are similar for obtaining 2-dimensional predictive profile images of patients with Class III malocclusion treated with orthognathic surgery. (Am J Orthod Dentofacial Orthop 2010; 137: 452.e1-452.e5)
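One simple way to compare paired rater preferences of this kind is an exact binomial sign test. The sketch below uses invented counts; the study's actual statistical procedure is not specified in the abstract, so this is only an illustration of the idea:

```python
from math import comb

# Sketch: exact two-sided binomial sign test for paired rater preferences
# (e.g. how many of n raters preferred program A's prediction over B's).
# The counts below are invented for illustration.

def sign_test_p(preferred_a, preferred_b):
    """Two-sided exact binomial test under H0: P(prefer A) = 0.5."""
    n = preferred_a + preferred_b
    k = max(preferred_a, preferred_b)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

p_even = sign_test_p(10, 10)   # split vote: no evidence of a preference
p_skew = sign_test_p(18, 2)    # lopsided vote: strong evidence
```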

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

In businesses such as the software industry, which use knowledge as a resource, activities are knowledge intensive and require the constant adoption of new technologies and practices. Another feature of this environment is that the industry is particularly susceptible to failure. With this in mind, the objective of this research is to analyze the integration of Knowledge Management techniques into the risk management of software development projects in micro and small Brazilian incubated technology-based firms. The research method chosen was the multiple case study. The main risk factor for managers and developers is that the scope or goals are often unclear or misinterpreted. For risk management, firms found that Knowledge Management techniques based on the combination mode of knowledge conversion would be the most applicable; however, those most commonly used correspond to the internalization mode of conversion. © 2013 Elsevier Ltd. APM and IPMA.

Relevance:

30.00%

Publisher:

Abstract:

When a bolted joint is loaded dynamically in tension, part of this load is absorbed by the bolt and the rest is absorbed by the joint material. The portion absorbed by the bolt is determined by the joint stiffness factor. This factor influences the tension corresponding to the pre-load and the safety factor against fatigue failure, and is therefore an important quantity in the design of bolted joints. In this work, three methods of calculating the stiffness factor are compared through a spreadsheet in Excel. A graph of the initial pre-load ratio and the fatigue safety factor as functions of the stiffness factor is generated. The calculations for each method show results with only small differences. It is therefore recommended that each design case be analyzed and that, depending on its conditions and the range of stiffness values, the more or less conservative method with respect to the fatigue safety factor be chosen. In general, the approximation method provides consistent results and is easy to calculate.
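The load-sharing role of the stiffness factor can be sketched with the standard textbook relations C = k_b/(k_b + k_m), F_bolt = F_i + C·P and F_members = F_i − (1 − C)·P; all numerical values below are invented examples:

```python
# Sketch of the joint stiffness factor and bolt load sharing
# (standard bolted-joint relations; all numbers are invented examples).

def stiffness_factor(k_bolt, k_members):
    """C = kb / (kb + km): fraction of the external load taken by the bolt."""
    return k_bolt / (k_bolt + k_members)

def bolt_and_member_loads(preload, external_load, c):
    """Total bolt tension and residual clamp load on the members."""
    f_bolt = preload + c * external_load
    f_members = preload - (1 - c) * external_load
    return f_bolt, f_members  # the joint separates if f_members <= 0

# Members are usually much stiffer than the bolt, so C is small and the
# bolt sees only a small fraction of the fluctuating external load.
C = stiffness_factor(k_bolt=200e6, k_members=600e6)       # C = 0.25
fb, fm = bolt_and_member_loads(preload=20e3, external_load=8e3, c=C)
```

This is why the stiffness factor drives the fatigue safety factor: only the C-fraction of the fluctuating load alternates in the bolt.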

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to develop a model that allows testing in the wind tunnel at high angles of attack and to validate its most critical components by analyzing the results of simulations in finite element software. During the project this structure was subjected to the major loads identified for the flight conditions and, from these, the stresses were calculated in the critical regions, defined as the parts of the model with the highest failure probabilities. All aspects associated with load application methods, mesh refinement and stress analysis were taken into account in this approach. The selection of the analysis software was based on project needs, seeking greater ease of modeling and simulation. We opted for the software ANSYS®, since the entire project is being developed on CAD platforms, enabling friendly integration between the modeling and analysis software.
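The critical-region check described above amounts to comparing computed stresses against the material's allowable stress. A minimal hand-calculation sketch of the kind that could accompany such an FEA study follows; the geometry, loads and the 6061-T6 yield value are illustrative assumptions, not the study's data:

```python
# Sketch: hand check of a critical member alongside FEA
# (axial + bending stress and static safety factor; values are invented).

def axial_stress(force, area):
    """sigma = F / A for a member in tension or compression."""
    return force / area

def bending_stress(moment, c, inertia):
    """sigma = M * c / I at the outer fibre of a bent member."""
    return moment * c / inertia

def safety_factor(yield_strength, stress):
    """Static safety factor against yielding."""
    return yield_strength / stress

# Hypothetical aluminium spar of the wind-tunnel model:
# 2 kN axial load, 50 N*m bending moment, 4 cm^2 section.
sigma = axial_stress(2000.0, 4e-4) + bending_stress(50.0, 0.01, 2e-7)
n = safety_factor(276e6, sigma)   # 6061-T6 yield strength ~276 MPa
```

Such a closed-form estimate gives an order-of-magnitude sanity check on the FEA results in the critical regions.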