946 results for software failure prediction
Abstract:
Numerical models, used for atmospheric research, weather prediction and climate simulation, describe the state of the atmosphere over the heterogeneous surface of the Earth. Several fundamental properties of atmospheric models depend on orography, i.e. on the average elevation of land over a model area. The higher the model's resolution, the more the details of orography directly influence the simulated atmospheric processes. This sets new requirements for the accuracy of the model formulations with respect to the spatially varying orography. Orography is always averaged, representing the surface elevation within the horizontal resolution of the model. In order to remove the smallest scales and steepest slopes, the continuous spectrum of orography is normally filtered (truncated) even further, typically beyond a few gridlengths of the model. This means that in numerical weather prediction (NWP) models there will always be subgrid-scale orography effects, which cannot be explicitly resolved by numerical integration of the basic equations but require parametrization. In the subgrid scale, different physical processes contribute at different scales. The parametrized processes interact with the resolved-scale processes and with each other. This study contributes to the building of a consistent, scale-dependent system of orography-related parametrizations for the High Resolution Limited Area Model (HIRLAM). The system comprises schemes for handling mesoscale (MSO) and small-scale (SSO) orographic effects on the simulated flow, and a scheme for orographic effects on the surface-level radiation fluxes. Representation of orography, scale dependencies of the simulated processes and interactions between the parametrized and resolved processes are discussed. From high-resolution digital elevation data, orographic parameters are derived for both momentum and radiation flux parametrizations. Tools for diagnostics and validation are developed and presented.
The parametrization schemes applied, developed and validated in this study are currently being implemented into the reference version of HIRLAM.
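The averaging and filtering described above can be illustrated with a minimal sketch (a hypothetical toy, not the HIRLAM orography scheme): a moving-average filter splits a 1-D elevation profile into a resolved part and a subgrid-scale residual that would be left to parametrization.

```python
def smooth_orography(elevation, gridlengths=3):
    """Average elevation over a window of +/- `gridlengths` points,
    removing detail below the scale the model can resolve."""
    n = len(elevation)
    smoothed = []
    for i in range(n):
        lo = max(0, i - gridlengths)
        hi = min(n, i + gridlengths + 1)
        window = elevation[lo:hi]
        smoothed.append(sum(window) / len(window))
    return smoothed

# The residual (raw minus smoothed) is the subgrid-scale orography
# that parametrization schemes must represent.
raw = [100, 300, 120, 500, 90, 400, 110]
resolved = smooth_orography(raw, gridlengths=1)
subgrid = [r - s for r, s in zip(raw, resolved)]
```

A real scheme works on a 2-D elevation field with a spectral truncation rather than a box filter, but the split into resolved plus subgrid components is the same idea.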
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general purpose multi-core architectures. This model allows programmers to specify the structure of a program as a set of filters that act upon data, and a set of communication channels between them. The StreamIt graphs describe task, data and pipeline parallelism which can be exploited on modern Graphics Processing Units (GPUs), as they support abundant parallelism in hardware. In this paper, we describe the challenges in mapping StreamIt to GPUs and propose an efficient technique to software pipeline the execution of stream programs on GPUs. We formulate this problem - both scheduling and assignment of filters to processors - as an efficient Integer Linear Program (ILP), which is then solved using ILP solvers. We also describe a novel buffer layout technique for GPUs which facilitates exploiting the high memory bandwidth available in GPUs. The proposed scheduling utilizes both the scalar units in the GPU, to exploit data parallelism, and the multiprocessors, to exploit task and pipeline parallelism. Further, it takes into consideration the synchronization and bandwidth limitations of GPUs, and yields speedups between 1.87X and 36.83X over a single-threaded CPU.
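The filter-and-channel structure that StreamIt describes can be sketched in plain Python (hypothetical names; the real compiler maps filters to GPU kernels rather than interpreting them like this). Each filter consumes from an input channel and produces on an output channel; interleaving the firings of successive filters is the essence of software pipelining.

```python
from collections import deque

class Filter:
    """A stream filter: pops an item from its input channel, applies
    `work`, and pushes the result to its output channel."""
    def __init__(self, work, inp, out):
        self.work, self.inp, self.out = work, inp, out

    def fire(self):
        if self.inp:
            self.out.append(self.work(self.inp.popleft()))

# A two-stage pipeline: scale then offset. A GPU scheduler would assign
# each filter to a multiprocessor and overlap their firings; here we
# simply interleave them in one thread.
source = deque([1, 2, 3, 4])
mid, sink = deque(), deque()
stages = [Filter(lambda x: 2 * x, source, mid),
          Filter(lambda x: x + 1, mid, sink)]
while source or mid:
    for f in stages:
        f.fire()
# sink now holds [3, 5, 7, 9]
```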
Abstract:
The significance of treating rainfall as a chaotic system instead of a stochastic system for a better understanding of the underlying dynamics has been taken up by various studies recently. However, an important limitation of all these approaches is the dependence on a single method for identifying the chaotic nature and the parameters involved. Many of these approaches aim at only analyzing the chaotic nature and not its prediction. In the present study, an attempt is made to identify chaos using various techniques, and prediction is also done by generating ensembles in order to quantify the uncertainty involved. Daily rainfall data of three regions with contrasting characteristics (mainly in the spatial area covered), Malaprabha, Mahanadi and All-India, for the period 1955-2000 are used for the study. Auto-correlation and mutual information methods are used to determine the delay time for the phase space reconstruction. The optimum embedding dimension is determined using the correlation dimension, the false nearest neighbour algorithm and also nonlinear prediction methods. The low embedding dimensions obtained from these methods indicate the existence of low-dimensional chaos in the three rainfall series. The correlation dimension method is applied to the phase-randomized and first-derivative versions of the data series to check whether the saturation of the dimension is due to the inherent linear correlation structure or due to low-dimensional dynamics. The positive Lyapunov exponents obtained prove the exponential divergence of the trajectories and hence the unpredictability. A surrogate data test is also done to further confirm the nonlinear structure of the rainfall series. A range of plausible parameters is used for generating an ensemble of predictions of rainfall for each year separately for the period 1996-2000 using the data till the preceding year.
To analyze the sensitivity to initial conditions, predictions are done from two different months in a year, viz., from the beginning of January and June. The reasonably good predictions obtained indicate the efficiency of the nonlinear prediction method for predicting the rainfall series. Also, the rank probability skill score and the rank histograms show that the ensembles generated are reliable with a good spread and skill. A comparison of results of the three regions indicates that although they are chaotic in nature, the spatial averaging over a large area can increase the dimension and improve the predictability, thus destroying the chaotic nature. (C) 2010 Elsevier Ltd. All rights reserved.
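The phase space reconstruction step described above can be sketched as a Takens delay embedding (a minimal illustration with a made-up rainfall series; `dim` and `delay` stand for the embedding dimension and delay time the abstract's methods estimate).

```python
def phase_space_reconstruct(series, dim, delay):
    """Takens delay embedding: build dim-dimensional state vectors
    [x(t), x(t+delay), ..., x(t+(dim-1)*delay)] from a scalar series."""
    span = (dim - 1) * delay
    return [tuple(series[t + k * delay] for k in range(dim))
            for t in range(len(series) - span)]

# Made-up daily rainfall values for illustration only.
rain = [0.0, 1.2, 3.4, 2.1, 0.5, 0.0, 0.8, 2.9]
vectors = phase_space_reconstruct(rain, dim=3, delay=2)
# first vector pairs x(t) with x(t+2) and x(t+4): (0.0, 3.4, 0.5)
```

Correlation dimension, false nearest neighbours and Lyapunov exponents are all computed on such reconstructed vectors rather than on the raw series.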
Abstract:
Bandwidth allocation for multimedia applications in case of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible, promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications by deploying static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. This technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates the bandwidth for the affected multimedia applications. The designed method has been tested (analytically and by simulation) with various network sizes and conditions. The results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
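The patch-up routing idea can be sketched as a search for an alternate path that avoids the failed link (a minimal illustration with hypothetical node names; the paper's agents would perform this at the network nodes and then reallocate bandwidth along the new path).

```python
from collections import deque

def alternate_route(graph, src, dst, failed_link):
    """Breadth-first search for a patch-up route from src to dst
    that avoids the failed/congested link (given as a node pair)."""
    bad = {failed_link, failed_link[::-1]}
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph[path[-1]]:
            if (path[-1], nxt) in bad or nxt in seen:
                continue
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # no patch-up route exists

# Toy topology: the direct link A-B fails, traffic is routed via C and D.
net = {'A': ['B', 'C'], 'B': ['A', 'D'], 'C': ['A', 'D'], 'D': ['B', 'C']}
reroute = alternate_route(net, 'A', 'B', ('A', 'B'))
```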
Abstract:
The economic, political and social face of Europe has been changing rapidly in the past decades. These changes are unique in the history of Europe, but not without challenges for the nation states. The support for the European integration varies among the countries. In order to understand why certain developments or changes are perceived as threatening or as desired by different member countries, we must consider the social representations of the European integration on the national level: how the EU is represented to its citizens in media and in educational systems, particularly in the curricula and textbooks. The current study is concerned with the social representations of the European integration in the curricula and school textbooks in five European countries: France, Britain, Germany, Finland and Sweden. Besides that, the first volume of the common Franco-German history textbook was analyzed, since it has been seen as a model for a common European history textbook. As the collective representations, values and identities are predominantly mediated and imposed through media and educational systems, the national curricula and textbooks make an interesting starting point for the study of the European integration and of national and European identities. The social representations theory provides a comprehensive framework for the study of the European integration. By analyzing the curricula and the history and civics textbooks of major educational publishers, the study aimed to demonstrate what is written about the European integration and how it is portrayed, that is, how the European integration is understood, made familiar and concretized in the educational context in the five European countries. To grasp the phenomenon of the European integration in the textbooks in its entirety, it was investigated from various perspectives.
Two methods of content analysis, automatic analysis with ALCESTE and a more qualitative theory-driven content analysis, were carried out to give a more vivid and multifaceted picture of the object of the research. The analysis of the text was complemented with the analysis of visual material. Drawing on quantitative and qualitative methods, the contents, processes, visual images, transformations and structures of the social representations of European integration, as well as the communicative styles of the textbooks, were examined. This study showed the divergent social representations of the European integration, anchored in the nation states, in the five member countries of the European Union. The social representations were constructed around different central core elements: French Europe in the French textbooks, Ambivalent Europe in the British textbooks, Influential and Unifying EU in the German textbooks, Enabling and Threatening EU in the Finnish textbooks, Sceptical EU in the Swedish textbooks and EU as a World Model in the Franco-German textbook. Some elements of the representations were shared by all countries, such as peace and economic aspects of the European cooperation, whereas other elements of the representations were found more frequently in some countries than in others, such as ideological, threatening or social components of the phenomenon of European integration. The study also demonstrated the linkage between social representations of the EU and national and European identities. The findings of this study are applicable to the study of the European integration, to the study of education, as well as to the social representations theory.
Abstract:
Automatic identification of software faults has enormous practical significance. This requires characterizing program execution behavior and the use of appropriate data mining techniques on the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVM are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated for the intrusion dataset containing system call traces. The results show that kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
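The spectrum kernel over system call traces can be sketched as an inner product of k-mer counts (a minimal illustration with made-up traces; an SVM would be trained on a matrix of such kernel values to map traces to fault labels).

```python
from collections import Counter

def spectrum_features(calls, k=2):
    """Count every length-k contiguous subsequence (k-mer) of a trace."""
    return Counter(tuple(calls[i:i + k]) for i in range(len(calls) - k + 1))

def spectrum_kernel(trace_a, trace_b, k=2):
    """Spectrum kernel: inner product of the two k-mer count vectors.
    Similar traces share many k-mers and so get a large kernel value."""
    fa, fb = spectrum_features(trace_a, k), spectrum_features(trace_b, k)
    return sum(fa[m] * fb[m] for m in fa)

# Made-up system call traces for illustration.
normal = ['open', 'read', 'read', 'close']
faulty = ['open', 'read', 'write', 'close']
```

A trace is more similar to itself (`spectrum_kernel(normal, normal)`) than to a differing trace (`spectrum_kernel(normal, faulty)`), which is what the SVM exploits.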
Abstract:
Asian elephants (Elephas maximus), prominent ``flagship species'', are listed under the category of endangered species (EN - A2c, ver. 3.1, IUCN Red List 2009) and there is a need for their conservation. This requires understanding the demographic and reproductive dynamics of the species. Monitoring the reproductive status of any species is traditionally carried out through invasive blood sampling, and this is restrictive for large animals such as wild or semi-captive elephants due to legal, ethical, and practical reasons. Hence, there is a need for a non-invasive technique to assess reproductive cyclicity profiles of elephants, which will help in the species' conservation strategies. In this study, we developed an indirect competitive enzyme-linked immuno-sorbent assay (ELISA) to estimate the concentration of one of the progesterone metabolites, i.e., allopregnanolone (5 alpha-P-3OH), in fecal samples of Asian elephants. We validated the assay, which had a sensitivity of 0.25 mu M at 90% binding with an EC50 value of 1.37 mu M. Using female elephants kept under semi-captive conditions in the forest camps of Mudumalai Wildlife Sanctuary, Tamil Nadu and Bandipur National Park, Karnataka, India, we measured fecal progesterone-metabolite (5 alpha-P-3OH) concentrations in six animals and showed their clear correlation with those of serum progesterone measured by a standard radio-immuno assay. Statistical analyses using a Linear Mixed Effect model showed a positive correlation (P < 0.1) between the profiles of fecal 5 alpha-P-3OH (range: 0.5-10 mu g/g) and serum progesterone (range: 0.1-1.8 ng/mL). Therefore, our studies show, for the first time, that the fecal progesterone-metabolite assay could be exploited to predict estrus cyclicity and to potentially assess the reproductive status of captive and free-ranging female Asian elephants, thereby helping to plan their breeding strategy. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
Argues that focus on the immigrant status of the Espoo shooter deflects attention from the failure of the relevant authorities to effectively respond to a foreseeable threat to public safety. Response to public discussion of New Year's Day shootings in Espoo shopping mall.
Abstract:
Acute heart failure (AHF) is a complex syndrome associated with exceptionally high mortality. Still, characteristics and prognostic factors of contemporary AHF patients have been inadequately studied. Kidney function has emerged as a very powerful prognostic risk factor in cardiovascular disease. This is believed to be the consequence of an interaction between the heart and kidneys, also termed the cardiorenal syndrome, the mechanisms of which are not fully understood. Renal insufficiency is common in heart failure and of particular interest for predicting outcome in AHF. Cystatin C (CysC) is a marker of glomerular filtration rate with properties making it a prospective alternative to the currently used measure creatinine for assessment of renal function. The aim of this thesis is to characterize a representative cohort of patients hospitalized for AHF and to identify risk factors for poor outcome in AHF. In particular, the role of CysC as a marker of renal function is evaluated, including examination of the value of CysC as a predictor of mortality in AHF. The FINN-AKVA (Finnish Acute Heart Failure) study is a national prospective multicenter study conducted to investigate the clinical presentation, aetiology and treatment of, as well as concomitant diseases and outcome in, AHF. Patients hospitalized for AHF were enrolled in the FINN-AKVA study, and mortality was followed for 12 months. The mean age of patients with AHF is 75 years and they frequently have both cardiovascular and non-cardiovascular co-morbidities. The mortality after hospitalization for AHF is high, rising to 27% by 12 months. The present study shows that renal dysfunction is very common in AHF. CysC detects impaired renal function in forty percent of patients. Renal function, measured by CysC, is one of the strongest predictors of mortality independently of other prognostic risk markers, such as age, gender, co-morbidities and systolic blood pressure on admission. 
Moreover, in patients with normal creatinine values, elevated CysC is associated with a marked increase in mortality. Acute kidney injury, defined as an increase in CysC within 48 hours of hospital admission, occurs in a significant proportion of patients and is associated with increased short- and mid-term mortality. The results suggest that CysC can be used for risk stratification in AHF. Markers of inflammation are elevated both in heart failure and in chronic kidney disease, and inflammation is one of the mechanisms thought to mediate heart-kidney interactions in the cardiorenal syndrome. Inflammatory cytokines such as interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-α) correlate very differently to markers of cardiac stress and renal function. In particular, TNF-α showed a robust correlation to CysC, but was not associated with levels of NT-proBNP, a marker of hemodynamic cardiac stress. Compared to CysC, the inflammatory markers were not strongly related to mortality in AHF. In conclusion, patients with AHF are elderly with multiple co-morbidities, and renal dysfunction is very common. CysC demonstrates good diagnostic properties both in identifying impaired renal function and acute kidney injury in patients with AHF. CysC, as a measure of renal function, is also a powerful prognostic marker in AHF. CysC shows promise as a marker for assessment of kidney function and risk stratification in patients hospitalized for AHF.
Abstract:
We present a new computationally efficient method for large-scale polypeptide folding using coarse-grained elastic networks and gradient-based continuous optimization techniques. The folding is governed by minimization of energy based on Miyazawa–Jernigan contact potentials. Using this method we are able to substantially reduce the computation time on ordinary desktop computers for simulation of polypeptide folding starting from a fully unfolded state. We compare our results with available native state structures from the Protein Data Bank (PDB) for a few de-novo proteins and two natural proteins, Ubiquitin and Lysozyme. Based on our simulations we are able to draw the energy landscape for a small de-novo protein, Chignolin. We also use two well-known protein structure prediction software packages, MODELLER and GROMACS, to compare our results. In the end, we show how a modification of the normal elastic network model can lead to higher accuracy and a lower simulation time.
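The gradient-based minimization of an elastic-network energy can be sketched on a toy system (a 2-D three-bead network with made-up rest lengths and a numerical gradient; the study itself uses Miyazawa-Jernigan contact potentials and far larger networks).

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def energy(coords, springs):
    """Elastic-network energy: sum over springs of (d - d0)^2 / 2,
    where d0 is the spring's rest length."""
    return sum(0.5 * (dist(coords[i], coords[j]) - d0) ** 2
               for (i, j), d0 in springs.items())

def minimize(coords, springs, steps=2000, lr=0.05, h=1e-6):
    """Plain gradient descent using a central-difference gradient."""
    coords = [list(p) for p in coords]
    for _ in range(steps):
        for p in coords:
            for a in range(2):
                p[a] += h
                e_plus = energy(coords, springs)
                p[a] -= 2 * h
                e_minus = energy(coords, springs)
                p[a] += h  # restore
                p[a] -= lr * (e_plus - e_minus) / (2 * h)
    return coords

# Rest lengths of an equilateral triangle; start from a distorted shape.
springs = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
start = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
folded = minimize(start, springs)
```

Analytic gradients and a quasi-Newton optimizer would replace the numerical gradient at realistic problem sizes; the descent-to-a-minimum structure is the same.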
Abstract:
This article concentrates on the discursive construction of success and failure in narratives of post-merger integration. Drawing on extensive interview material from eight Finnish-Swedish mergers and acquisitions, the empirical analysis leads to distinguishing four types of discourse ('rationalistic', 'cultural', 'role-bound' and 'individualistic') that narrators employ in recounting their experiences. In particular, the empirical material illustrates how the discursive frameworks enable specific (discursive) strategies and moves for (re)framing the success/failure, justification/legitimization of one's own actions, and (re)construction of responsibility when dealing with socio-psychological pressures associated with success/failure. The analysis also suggests that, as a result of making use of these discursive strategies and moves, success stories are likely to lead to overly optimistic or, in the case of failure stories, overly pessimistic views on the management's ability to control these change processes. These findings imply that we should take seriously the discursive elements that both constrain our descriptions and explanations and provide opportunities for more or less intentional (re)interpretations of post-merger integration or other organizational change processes.
Abstract:
In Somalia the central government collapsed in 1991, and since then state failure has become a widespread phenomenon and one of the greatest political and humanitarian problems facing the world in this century. Thus, the main objective of this research is to answer the following question: what went wrong? Most of the existing literature on the political economy of conflict starts from the assumption that the state in Africa is predatory by nature. Unlike these studies, the present research, although it uses predation theory, starts from the social contract approach to state definition. Therefore, rather than contemplating the actions and policies of the rulers alone, this approach allows us to deliberately bring the role of the society – as citizens – and other players into the analyses. In Chapter 1, after introducing the study, a simple principal-agent model will be developed to check the logical consistency of the argument and to make the identification of the causal mechanism easier. I also identify three main actors in the process of state failure in Somalia: the Somali state, Somali society and the superpowers. In Chapter 2, so as to understand the incentives, preferences and constraints of each player in the state failure game, I analyse in some depth the evolution and structure of three central informal institutions: the identity-based patronage system of leadership, political tribalism, and the Cold War. These three institutions are considered the rules of the game in the Somali state failure. Chapter 3 summarises the successive civilian governments' achievements and failures (1960-69) concerning the main national goals, national unification and socio-economic development. Chapter 4 shows that the military regime, although it assumed power through extralegal means, served to some extent the developmental interests of the citizens in the first five years of its rule.
Chapter 5 shows the process, and the factors involved, of the military regime’s self-transformation from being an agent for the developmental interests of the society to a predatory state that not only undermines the interests of the society but that also destroys the state itself. Chapter 6 addresses the process of disintegration of the post-colonial state of Somalia. The chapter shows how the regime’s merciless reactions to political ventures by power-seeking opposition leaders shattered the entire country and wrecked the state institutions. Chapter 7 concludes the study by summarising the main findings: due to the incentive structures generated by the informal institutions, the formal state institutions fell apart.
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, one of the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with 9 different error distributions on Standard and Poor's 500 Index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, by using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not improved by allowing for skewness. By allowing the kurtosis and skewness to be time-varying, the density forecasts are not further improved but on the contrary made slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts for both 1% and 5% VaR. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
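The GARCH(1,1) conditional variance recursion that Essay 1 evaluates can be sketched as follows (illustrative parameter values, not estimates from the thesis data).

```python
def garch_variance(returns, omega, alpha, beta, var0):
    """GARCH(1,1) one-step-ahead conditional variance recursion:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = [var0]
    for r in returns:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2

# Made-up daily returns and typical-looking (but invented) parameters.
rets = [0.01, -0.03, 0.002]
path = garch_variance(rets, omega=1e-6, alpha=0.05, beta=0.9, var0=1e-4)
```

The error distribution enters at estimation and density-forecast time (normal vs. leptokurtic alternatives); the variance recursion itself is the same for every choice, which is why the essays can compare distributions on a common footing.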