The boundedness of penalty parameters in an augmented Lagrangian method with constrained subproblems
Abstract:
Augmented Lagrangian methods are effective tools for solving large-scale nonlinear programming problems. At each outer iteration, a minimization subproblem with simple constraints, whose objective function depends on the updated Lagrange multipliers and penalty parameters, is approximately solved. When the penalty parameter becomes very large, solving the subproblem becomes difficult; the effectiveness of this approach is therefore associated with the boundedness of the penalty parameters. In this paper, it is proved that the penalty parameters are bounded under assumptions more natural than those employed previously. To prove the new boundedness result, the original algorithm has been slightly modified. Numerical consequences of the modifications are discussed and computational experiments are presented.
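As a hedged illustration of the outer loop described above (not the paper's modified algorithm), the following sketch minimizes x² + y² subject to x + y = 1. The inner gradient-descent solver, the progress test, and the factor-10 penalty increase are simplified assumptions.

```python
# Illustrative augmented Lagrangian loop for
#   minimize x^2 + y^2  subject to  x + y = 1   (solution: x = y = 0.5).
# Subproblem solver, step sizes, and the penalty-update rule are simplified
# placeholders, not the algorithm analyzed in the paper.

def solve_subproblem(x, lam, rho, steps=2000, lr=0.01):
    # approximately minimize the augmented Lagrangian by gradient descent
    for _ in range(steps):
        h = x[0] + x[1] - 1.0                      # constraint violation
        g = (2 * x[0] + lam + rho * h,             # gradient of L_A
             2 * x[1] + lam + rho * h)
        x = (x[0] - lr * g[0], x[1] - lr * g[1])
    return x

def augmented_lagrangian(outer_iters=20):
    x, lam, rho = (0.0, 0.0), 0.0, 1.0
    h_prev = float("inf")
    for _ in range(outer_iters):
        x = solve_subproblem(x, lam, rho)
        h = x[0] + x[1] - 1.0
        lam += rho * h                             # multiplier update
        if abs(h) > 0.9 * abs(h_prev):             # poor progress: raise penalty
            rho *= 10.0
        h_prev = h
    return x, rho

x_opt, rho_final = augmented_lagrangian()
```

In this toy run the multiplier update reduces the constraint violation at every outer iteration, so the penalty parameter stays at its initial value — the bounded-penalty behavior the paper's analysis is about.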
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography, so the CA were examined for this "chaos" property. Accordingly, the manuscript concentrates on tests that measure chaos in CA, such as the Lyapunov exponent, entropy, and Hamming distance, as well as statistical analyses such as the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. The "chaos" property of CA is thus a good reason to employ them in cryptography, in addition to their simplicity, low implementation cost, and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
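A minimal sketch of the scheme's building blocks, assuming a toroidal grid, the Life rule B3/S23, and a hypothetical key-to-grid seeding; the paper's actual rule selection and bit-extraction scheme may differ.

```python
import random

# Toy stream cipher driven by a Life-like cellular automaton used as a PRNG.
# B3/S23 (Conway's Life) and the byte-extraction scheme are illustrative
# choices, not the chaotic rule selection made in the paper.

def step(grid, birth=(3,), survive=(2, 3)):
    # one synchronous update on an n x n toroidal grid
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            alive = sum(grid[(i + di) % n][(j + dj) % n]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if (di, dj) != (0, 0))
            if grid[i][j]:
                new[i][j] = 1 if alive in survive else 0
            else:
                new[i][j] = 1 if alive in birth else 0
    return new

def seed_grid(key, n=16):
    # derive an initial configuration from the key (hypothetical scheme)
    rng = random.Random(key)
    return [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]

def xor_cipher(data, key):
    # XOR each plaintext byte with one byte read off the evolving CA
    grid, out = seed_grid(key), []
    for b in data:
        grid = step(grid)
        k = 0
        for j in range(8):                  # 8 fixed cells -> one keystream byte
            k = (k << 1) | grid[0][j]
        out.append(b ^ k)
    return bytes(out)

ciphertext = xor_cipher(b"attack at dawn", "secret key")
plaintext = xor_cipher(ciphertext, "secret key")  # XOR cipher is symmetric
```

Because encryption and decryption are the same XOR operation with the same keystream, applying the cipher twice with the same key recovers the plaintext.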
Abstract:
Texture image analysis is an important field of investigation that has attracted the attention of the computer vision community in the last decades. In this paper, a novel approach for texture image analysis is proposed by combining graph theory and partially self-avoiding deterministic walks. From the image, we build a regular graph where each vertex represents a pixel and is connected to neighboring pixels (pixels whose spatial distance is less than a given radius). Transformations on the regular graph are applied to emphasize different image features. To characterize the transformed graphs, partially self-avoiding deterministic walks are performed to compose the feature vector. Experimental results on three databases indicate that the proposed method significantly improves the correct classification rate compared to the state of the art, e.g. from 89.37% (original tourist walk) to 94.32% on the Brodatz database, from 84.86% (Gabor filter) to 85.07% on the Vistex database, and from 92.60% (original tourist walk) to 98.00% on the plant leaves database. In view of these results, it is expected that this method could provide good results in other applications such as texture synthesis and texture segmentation. (C) 2012 Elsevier Ltd. All rights reserved.
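The walk itself can be sketched as follows, assuming a radius-1 neighborhood and a memory window `mu`; the graph transformations and the transient/attractor statistics of the full method are omitted, so this is a simplified reading rather than the paper's feature extractor.

```python
# Minimal sketch of a deterministic, partially self-avoiding walk on the
# pixel graph: vertices are pixels, edges join pixels within radius 1, and
# the walker moves to the neighboring pixel closest in intensity that was
# not visited in its last `mu` steps. Parameter names are illustrative.

def neighbors(i, j, h, w, radius=1):
    out = []
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            ni, nj = i + di, j + dj
            if (di, dj) != (0, 0) and 0 <= ni < h and 0 <= nj < w:
                out.append((ni, nj))
    return out

def tourist_walk(img, start, mu=2, max_steps=50):
    h, w = len(img), len(img[0])
    path, pos = [start], start
    for _ in range(max_steps):
        recent = set(path[-mu:])            # memory window: forbidden vertices
        cand = [p for p in neighbors(pos[0], pos[1], h, w) if p not in recent]
        if not cand:
            break
        # deterministic rule: smallest intensity difference, ties by position
        pos = min(cand, key=lambda p: (abs(img[p[0]][p[1]] - img[pos[0]][pos[1]]), p))
        path.append(pos)
    return path
```

Starting one walk from every pixel and summarizing the resulting paths (e.g. by their lengths) yields a texture descriptor in the spirit of the method.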
Abstract:
Fraud is a global problem that has demanded increasing attention due to the rapid expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud detection model is accurate enough to correctly classify a case as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining predicted values from models fitted to several replicated datasets and then combining them into a single predictive classification in order to improve classification accuracy. In this paper, we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors classification. Via a large simulation study and various real datasets, we found that the probabilistic networks are a strong modeling option, with high predictive capacity that increases further under the bagging procedure when compared to traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
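The bagging procedure described above can be sketched as follows; a 1-nearest-neighbour rule stands in for the paper's k-dependence probabilistic networks, which are not reproduced here.

```python
import random
from collections import Counter

# Bagging sketch: bootstrap-resample the training set, fit one classifier per
# replicate, and combine the predictions by majority vote. The 1-NN base
# learner is an illustrative stand-in for the paper's probabilistic networks.

def nn_predict(train, x):
    # 1-NN: label of the closest training point (squared Euclidean distance)
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def bagging_predict(X, y, x, n_models=25, seed=0):
    rng = random.Random(seed)
    n, votes = len(X), []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]      # bootstrap replicate
        votes.append(nn_predict([(X[i], y[i]) for i in idx], x))
    return Counter(votes).most_common(1)[0][0]          # majority vote
```

Each replicate sees a slightly different resampled dataset, so the vote averages out the variance of the individual classifiers.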
Abstract:
In multi-label classification, examples can be associated with multiple labels simultaneously. The task of learning from multi-label data can be addressed by methods that transform the multi-label classification problem into several single-label classification problems. The binary relevance approach is one of these methods, where the multi-label learning task is decomposed into several independent binary classification problems, one for each label in the set of labels, and the final labels for each example are determined by aggregating the predictions from all binary classifiers. However, this approach fails to consider any dependency among the labels. Aiming to accurately predict label combinations, in this paper we propose a simple approach that enables the binary classifiers to discover existing label dependency by themselves. An experimental study using decision trees, a kernel method as well as Naive Bayes as base-learning techniques shows the potential of the proposed approach to improve the multi-label classification performance.
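One way to read the proposal (the exact mechanism may differ) is a two-stage scheme in which each label's second-stage classifier also sees the other labels' first-stage predictions appended to its features. Below is a sketch with a 1-NN base learner standing in for the decision trees, kernel method, and Naive Bayes used in the paper; all function names are illustrative.

```python
# Binary relevance with label dependency: stage 1 trains one independent
# binary classifier per label; stage 2 retrains each classifier with the
# other labels' stage-1 predictions appended to the features.

def learn_1nn(X, y):
    data = list(zip(X, y))
    def clf(x):
        return min(data, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]
    return clf

def fit_br_plus(X, Y, learn=learn_1nn):
    m = len(Y[0])
    # stage 1: plain binary relevance, one classifier per label
    stage1 = [learn(X, [row[j] for row in Y]) for j in range(m)]
    preds = [[clf(x) for clf in stage1] for x in X]
    # stage 2: augment each label's features with the other labels' predictions
    stage2 = []
    for j in range(m):
        Xa = [list(x) + [p for k, p in enumerate(pr) if k != j]
              for x, pr in zip(X, preds)]
        stage2.append(learn(Xa, [row[j] for row in Y]))
    return stage1, stage2

def predict_br_plus(x, stage1, stage2):
    p1 = [clf(x) for clf in stage1]
    return [clf(list(x) + [p for k, p in enumerate(p1) if k != j])
            for j, clf in enumerate(stage2)]
```

When labels are correlated, the augmented features let each binary classifier exploit that dependency, which plain binary relevance ignores.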
Abstract:
Statistical methods have been widely employed to assess the capabilities of credit scoring classification models in order to reduce the risk of wrong decisions when granting credit facilities to clients. The predictive quality of a classification model can be evaluated based on measures such as sensitivity, specificity, predictive values, accuracy, correlation coefficients, and information-theoretical measures such as relative entropy and mutual information. In this paper we analyze the performance of a naive logistic regression model (Hosmer & Lemeshow, 1989) and a logistic regression with state-dependent sample selection model (Cramer, 2004) applied to simulated data. As a case study, the methodology is also illustrated on a data set extracted from a Brazilian bank portfolio. Our simulation results revealed no statistically significant difference in predictive capacity between the naive logistic regression models and the logistic regression with state-dependent sample selection models. However, there is a strong difference between the distributions of the estimated default probabilities from these two statistical modeling techniques, with the naive logistic regression models always underestimating such probabilities, particularly in the presence of balanced samples. (C) 2012 Elsevier Ltd. All rights reserved.
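The threshold-based measures named above can be computed directly from the confusion matrix; a minimal sketch, assuming binary labels with 1 denoting default:

```python
def metrics(y_true, y_pred):
    # confusion-matrix counts for a binary classifier (1 = default, 0 = good)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),          # true-positive rate
        "specificity": tn / (tn + fp),          # true-negative rate
        "ppv": tp / (tp + fp),                  # positive predictive value
        "accuracy": (tp + tn) / len(y_true),
    }
```

Note that these measures depend on the classification cutoff applied to the estimated default probabilities, which is why two models with similar predictive capacity can still produce very different probability distributions.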
Abstract:
The complexity of power systems has increased in recent years due to the operation of existing transmission lines closer to their limits, using flexible AC transmission system (FACTS) devices, and also due to the increased penetration of new types of generators that have more intermittent characteristics and lower inertial response, such as wind generators. This changing nature of a power system has considerable effect on its dynamic behaviors resulting in power swings, dynamic interactions between different power system devices, and less synchronized coupling. This paper presents some analyses of this changing nature of power systems and their dynamic behaviors to identify critical issues that limit the large-scale integration of wind generators and FACTS devices. In addition, this paper addresses some general concerns toward high compensations in different grid topologies. The studies in this paper are conducted on the New England and New York power system model under both small and large disturbances. From the analyses, it can be concluded that high compensation can reduce the security limits under certain operating conditions, and the modes related to operating slip and shaft stiffness are critical as they may limit the large-scale integration of wind generation.
Abstract:
In the current climate of escalating health care costs, defining value and accurately measuring it are two critical issues affecting not only the future of cancer care in particular but also the future of health care in general. Specifically, measuring and improving value in cancer-related health care are critical for continued advancements in research, management, and overall delivery of care. However, in oncology, most of this research has focused on value as it relates to insurance industry and payment reform, with little attention paid to value as the output of clinical interventions that encompass integrated clinical teams focusing on the entire cycle of care and measuring objective outcomes that are most relevant to patients. In this study, patient-centered value was defined as health outcomes achieved per dollar spent, and calculated using objective functional outcomes and total care costs. The analytic sample comprised patients diagnosed with three common head and neck cancers—cancer of the larynx, oral cavity, and oropharynx—who were treated in an integrated tertiary care center over an approximately 10-year period. The results of this study provide initial empirical data that can be used to assess and ultimately to help improve the quality and value of head and neck cancer care, and more importantly they can be used by patients and clinicians to make better-informed decisions about care, particularly what therapeutic services and outcomes matter the most to patients.
Abstract:
BACKGROUND: Managing fibromyalgia is a challenge for both health care systems and the professionals caring for these patients, due, in part, to the fact that the etiology of this disease is unknown, its symptoms are not specific and there is no standardized treatment. OBJECTIVE: The present study examines three aspects of fibromyalgia management, namely diagnostic approach, therapeutic management and the health professional-patient relationship, to explore specific areas of the health care process that professionals and patients may consider unsatisfactory. METHODS: A qualitative study involving semistructured interviews with 12 fibromyalgia patients and nine health professionals was performed. RESULTS: The most commonly recurring theme was the dissatisfaction of both patients and professionals with the management process as a whole. Both groups expressed dissatisfaction with the delay in reaching a diagnosis and obtaining effective treatment. Patients reported the need for greater moral support from professionals, whereas the latter often felt frustrated and of little help to patients. Patients and professionals agreed on one point: the uncertainty surrounding the management of fibromyalgia and, especially, its etiology. CONCLUSION: The present study contributes to a better understanding regarding why current management of fibromyalgia is neither effective nor satisfactory. It also provides insight into how health professionals can support fibromyalgia patients to achieve beneficial results. Health care services should offer greater support for these patients in the form of specific resources such as fibromyalgia clinics and health professionals with increased awareness of the disease.
Abstract:
This annotated bibliography discusses 60 key publications dealing with wave-current interaction. Each entry includes a bibliographic identification, keywords, a discussion of contents, and a statement of coastal engineering significance. An index of the entries by keywords is provided in an appendix. The recent growth of the wave-current interaction field is indicated by the fact that more than 30 percent of the selected publications were published in 1978 and 1979.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
In this paper, we discuss two-dimensional failure modeling for a system where degradation is due to age and usage. We extend the concept of minimal repair from the one-dimensional case to the two-dimensional case and characterize the failures over a two-dimensional region under minimal repair. An application of this important result to a manufacturer's servicing costs for a two-dimensional warranty policy is given, and we compare the minimal repair strategy with the strategy of replacement on failure. (C) 2003 Wiley Periodicals, Inc.
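Under minimal repair, the expected number of failures over a two-dimensional warranty region [0, W] × [0, U] is the integral of the failure intensity over that region. A numeric sketch with a hypothetical intensity λ(t, u) = 0.1 + 0.02·t·u, which is not the paper's model:

```python
def expected_minimal_repairs(rate, W, U, n=200):
    # E[N] = double integral of rate(t, u) over [0, W] x [0, U], midpoint rule
    dt, du = W / n, U / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += rate((i + 0.5) * dt, (j + 0.5) * du)
    return total * dt * du

def rate(t, u):
    # hypothetical intensity, increasing in both age t and usage u
    return 0.1 + 0.02 * t * u

expected = expected_minimal_repairs(rate, W=2.0, U=2.0)
# closed form for this rate: 0.1*W*U + 0.02*(W**2/2)*(U**2/2) = 0.48
```

Multiplying the expected failure count by the average cost of a minimal repair gives the expected warranty servicing cost, which can then be compared with the cost of replacing the item on failure.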
Abstract:
Warranty is an important element of marketing new products. The servicing of warranty results in additional costs to the manufacturer. Warranty logistics deals with various issues relating to the servicing of warranty. Proper management of warranty logistics is needed not only to reduce the warranty servicing cost but also to ensure customer satisfaction as customer dissatisfaction has a negative impact on sales and revenue. Unfortunately, warranty logistics has received very little attention. The paper links the literature on warranty and on logistics and then discusses the different issues in warranty logistics. It highlights the challenges and identifies some research topics of potential interest to operational researchers. (C) 2003 Elsevier B.V. All rights reserved.