886 results for "Lot sizing and scheduling problems"


Relevance: 100.00%

Abstract:

Objectives: This paper reports on a longitudinal qualitative study exploring the concerns of 60 patients before and after transplantation. Methods: Semi-structured interviews were conducted without time constraints in a protected space outside the hospital. Qualitative analysis was performed. Results: Prior to transplantation, all patients talked freely about negative feelings, stigmatisation, being misunderstood by others, loneliness, and culpability caused by increasing physical dependency or abandoned roles. They mentioned alternative ways of coping (magic, spirituality) and even expressed their right to let go. In a subset of 13 patients, significant others joined the interview on their own initiative or were included at the patients' request. In this modified setting, two illness-worlds were confronted. Although common themes were mentioned (e.g., modified life plans, restricted space, physical and psychological barriers), they were experienced differently. Fear of transplantation or guilt towards the donors was overtly expressed, often for the first time. Mutual hiding of anxiety, in order to protect loved ones or to prevent loss of control, was disclosed. The significant others talked about accumulated stress and exhaustion related to the physical degradation of the patient, fear of the unpredictable evolution of the illness, and financial problems, and stressed their difficulty in adapting to the fluctuating state of the patient. After transplantation, other themes emerged for which difficulty in disclosure was observed: intensive care and near-death experiences, being a transplanted person, debt to the donor and his or her family, and fear of rejection. Conclusions: With the self-imposed strategy of hiding concerns to protect one another, a discrepancy between the two illness-worlds was created. When concerns were confronted during the interviews, a new mutual understanding emerged. Patients and their families stated the need to share concerns over the course of the illness.

Relevance: 100.00%

Abstract:

Soil surveys are the main source of spatial information on soils and have a range of applications, mainly in agriculture. The continuity of this activity has, however, been severely compromised, mainly due to a lack of governmental funding. The purpose of this study was to evaluate the feasibility of two different classifiers (artificial neural networks and a maximum likelihood algorithm) for predicting soil classes in the northwest of the state of Rio de Janeiro. Terrain attributes such as elevation, slope, aspect, plan curvature and the compound topographic index (CTI), together with clay mineral and iron oxide indices and the Normalized Difference Vegetation Index (NDVI) derived from Landsat 7 ETM+ imagery, were used as discriminating variables. The two classifiers were trained and validated for each soil class using 300 and 150 samples respectively, representing the characteristics of these classes in terms of the discriminating variables. According to the statistical tests, the accuracy of the classifier based on artificial neural networks (ANNs) was greater than that of the classic Maximum Likelihood Classifier (MLC). Comparison with 126 reference points showed that the ANN map (73.81% agreement) was superior to the MLC map (57.94%). The main errors of the two classifiers were caused by: a) the geological heterogeneity of the area coupled with problems in the geological map; b) the depth of lithic contact and/or rock exposure; and c) problems with the environmental correlation model used, due to the polygenetic nature of the soils. This study confirms that the use of terrain attributes together with remote sensing data in an ANN approach can facilitate soil mapping in Brazil, primarily because of the availability of low-cost remote sensing data and the ease with which terrain attributes can be obtained.
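The comparison described above can be reproduced in outline with standard libraries. Below is a minimal sketch, assuming the terrain attributes and spectral indices have already been assembled into a table of samples; the file name and column names are illustrative, and quadratic discriminant analysis stands in for the classical Gaussian maximum-likelihood classifier:

# Sketch: compare an MLP classifier with a Gaussian maximum-likelihood classifier
# (approximated here by quadratic discriminant analysis) on tabular soil data.
# Feature names and the data file are illustrative assumptions, not from the study.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import accuracy_score, cohen_kappa_score

df = pd.read_csv("soil_samples.csv")                     # hypothetical file
features = ["elevation", "slope", "aspect", "plan_curvature",
            "cti", "clay_index", "iron_oxide_index", "ndvi"]
X, y = df[features], df["soil_class"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.3, random_state=0)

models = {
    "ANN (MLP)": make_pipeline(StandardScaler(),
                               MLPClassifier(hidden_layer_sizes=(20,),
                                             max_iter=2000, random_state=0)),
    "Gaussian ML (QDA)": QuadraticDiscriminantAnalysis(),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: accuracy={accuracy_score(y_test, pred):.3f}, "
          f"kappa={cohen_kappa_score(y_test, pred):.3f}")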

Relevance: 100.00%

Abstract:

The multiscale finite-volume (MSFV) method is designed to reduce the computational cost of elliptic and parabolic problems with highly heterogeneous anisotropic coefficients. The reduction is achieved by splitting the original global problem into a set of local problems (with approximate local boundary conditions) coupled by a coarse global problem. It has been shown recently that the numerical errors in MSFV results can be reduced systematically with an iterative procedure that provides a conservative velocity field after any iteration step. The iterative MSFV (i-MSFV) method can be obtained with an improved (smoothed) multiscale solution to enhance the localization conditions, with a Krylov subspace method [e.g., the generalized-minimal-residual (GMRES) algorithm] preconditioned by the MSFV system, or with a combination of both. In a multiphase-flow system, a balance between accuracy and computational efficiency should be achieved by finding a minimum number of i-MSFV iterations (on pressure), which is necessary to achieve the desired accuracy in the saturation solution. In this work, we extend the i-MSFV method to sequential implicit simulation of time-dependent problems. To control the error of the coupled saturation/pressure system, we analyze the transport error caused by an approximate velocity field. We then propose an error-control strategy on the basis of the residual of the pressure equation. At the beginning of simulation, the pressure solution is iterated until a specified accuracy is achieved. To minimize the number of iterations in a multiphase-flow problem, the solution at the previous timestep is used to improve the localization assumption at the current timestep. Additional iterations are used only when the residual becomes larger than a specified threshold value. Numerical results show that only a few iterations on average are necessary to improve the MSFV results significantly, even for very challenging problems. Therefore, the proposed adaptive strategy yields efficient and accurate simulation of multiphase flow in heterogeneous porous media.
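The error-control strategy can be illustrated with a toy example. The sketch below uses a 1D heterogeneous pressure equation and a plain conjugate-gradient loop in place of the MSFV-preconditioned Krylov iterations; it demonstrates only the control logic (reuse of the previous timestep's pressure as the initial guess, and further iterations only while the residual exceeds a tolerance), not the multiscale operators themselves:

# Toy illustration of the residual-based iteration control described above.
# A 1D heterogeneous pressure operator stands in for the reservoir problem and
# conjugate gradients stand in for the preconditioned Krylov iterations; this
# is a sketch of the control strategy, not of the i-MSFV method itself.
import numpy as np

rng = np.random.default_rng(0)
n = 300
k = np.exp(rng.standard_normal(n + 1))                  # heterogeneous transmissibilities
A = (np.diag(k[:-1] + k[1:])
     - np.diag(k[1:-1], 1) - np.diag(k[1:-1], -1))      # 1D "pressure" operator (SPD)

def iterate_pressure(p, q, tol, max_iter=2000):
    """Conjugate-gradient sweeps with an explicit residual-based stopping rule."""
    r = q - A @ p
    d, rs = r.copy(), r @ r
    target = tol * np.linalg.norm(q)
    iters = 0
    while np.sqrt(rs) > target and iters < max_iter:
        Ad = A @ d
        alpha = rs / (d @ Ad)
        p, r = p + alpha * d, r - alpha * Ad
        rs_new = r @ r
        d, rs = r + (rs_new / rs) * d, rs_new
        iters += 1
    return p, iters

p = np.zeros(n)
for step in range(8):
    q = np.sin(np.linspace(0, np.pi, n)) * (1 + 0.02 * step)   # slowly varying source
    tol = 1e-10 if step == 0 else 1e-3                          # tight only at start-up
    p, iters = iterate_pressure(p, q, tol)                      # previous p is the initial guess
    print(f"timestep {step}: {iters} iterations on the pressure")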

Relevance: 100.00%

Abstract:

This naturalistic cross-sectional study explores how and to what extent cannabis dependence was associated with intrapersonal aspects (anxiety, coping styles) and interpersonal aspects of adolescent functioning (school status, family relationships, peer relationships, social life). A convenience sample of 110 adolescents (aged 12 to 19) was recruited and subdivided into two groups (38 with cannabis dependence and 72 without) according to DSM-IV-TR criteria for cannabis dependence. Participants completed the State-Trait Anxiety Inventory (STAI-Y), the Coping Across Situations Questionnaire (CASQ), and the Adolescent Drug Abuse Diagnosis (ADAD) interview, which investigates psychosocial and interpersonal problems in an adolescent's life. Factors associated with cannabis dependence were explored with logistic regression analyses. The results indicated that the severity of problems in social life and peer relationships (OR = 1.68, 95% CI = 1.21–2.33) and avoidant coping (OR = 4.22, 95% CI = 1.01–17.73) were the only discriminating factors for cannabis dependence. This model correctly classified 84.5% of the adolescents. These findings are partially consistent with the "self-medication hypothesis" and underline the importance of peer relationships and dysfunctional coping strategies in adolescent cannabis dependence. Limitations of the study and implications for clinical work with adolescents are discussed.
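An analysis of this kind can be sketched with statsmodels; the data file, column names and predictor set below are hypothetical stand-ins for the ADAD, STAI-Y and CASQ scores, shown only to illustrate how the odds ratios and confidence intervals are obtained:

# Sketch of the kind of logistic-regression analysis reported above, using
# statsmodels; the dataset and variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("adolescents.csv")            # hypothetical dataset
predictors = ["social_peer_severity", "avoidant_coping",
              "trait_anxiety", "school_status"]
X = sm.add_constant(df[predictors])
y = df["cannabis_dependent"]                   # 1 = dependent, 0 = non-dependent

model = sm.Logit(y, X).fit()
conf = model.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI 2.5%": np.exp(conf[0]),
    "CI 97.5%": np.exp(conf[1]),
})
print(odds_ratios)

# Classification rate analogous to the 84.5% reported in the study.
pred = (model.predict(X) >= 0.5).astype(int)
print("correctly classified:", (pred == y).mean())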

Relevance: 100.00%

Abstract:

We describe one of the research lines of the Grup de Teoria de Funcions de la UAB UB, which deals with sampling and interpolation problems in signal analysis and their connections with complex function theory.

Relevance: 100.00%

Abstract:

Performance-related pay within public organizations is continuing to spread. Although it can help to strengthen an entrepreneurial spirit in civil servants, its implementation is marred by technical, financial, managerial and cultural problems. This article identifies an added problem, namely the contradiction that exists between a managerial discourse that emphasizes the team and collective performance, on the one hand, and the use of appraisal and reward tools that are above all individual, on the other. Based on an empirical survey carried out within Swiss public organizations, the analysis shows that the team is currently rarely taken into account and singles out the principal routes towards an integrated system for the management and rewarding of civil servants.

Relevance: 100.00%

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
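The contrast between a single-value regression map and simulation-based risk mapping can be sketched as follows. A Gaussian-process model on synthetic data stands in for the geostatistical and machine-learning models discussed in the paper; it is not the method applied to the Chernobyl data, only an illustration of how multiple conditional realizations yield an exceedance-probability map:

# Sketch contrasting a single-value prediction with simulation-based risk mapping.
# The data are synthetic and the threshold is hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
obs_xy = rng.uniform(0, 10, size=(150, 2))                 # monitoring locations
obs_z = np.sin(obs_xy[:, 0]) + 0.5 * obs_xy[:, 1] + rng.normal(0, 0.2, 150)

gp = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.05),
                              normalize_y=True).fit(obs_xy, obs_z)

# Prediction grid.
gx, gy = np.meshgrid(np.linspace(0, 10, 40), np.linspace(0, 10, 40))
grid = np.column_stack([gx.ravel(), gy.ravel()])

mean_map = gp.predict(grid)                                # single-value "regression" map
realizations = gp.sample_y(grid, n_samples=200, random_state=2)

threshold = 4.0                                            # hypothetical action level
prob_exceed = (realizations > threshold).mean(axis=1)      # probabilistic risk map
print("highest predicted value on grid:", mean_map.max())
print("max exceedance probability on grid:", prob_exceed.max())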

Relevance: 100.00%

Abstract:

Objective: To examine whether drawing is useful in the detection of problems of psychosocial adaptation in children and adolescents with type 1 diabetes (T1D) and in improving communication with health professionals. Methods: We performed an exploratory descriptive study in 199 children and adolescents with T1D aged 4–13 years. The participants were asked to render a drawing on a suggested topic. The variables analyzed were related to the drawing and to clinical and sociodemographic data. Results: Most participants showed evidence of having a well-balanced personality, but there were also signs of affective or psychosocial difficulties. Conclusion: Drawing is a useful technique by which to identify children's and adolescents' feelings and possible problems in adapting to T1D, as well as to gain information directly from the children themselves. Future studies should delimit the possibilities of this technique in clinical practice in greater detail.

Relevance: 100.00%

Abstract:

Background: The aim of this research was to characterize the experience of living with diabetes mellitus (DM) and identify patients' opinions of the quality of care received and the results of interventions. Methods: A descriptive, exploratory evaluation study using qualitative methodology was performed. Participants consisted of 40 adult patients diagnosed with DM and followed up in a public hospital in Barcelona, Spain. A semistructured interview and a focus group were used and a thematic content analysis was performed. Results: Patients described DM as a disease that is difficult to control and that provokes lifestyle changes requiring effort and sacrifice. Insulin treatment increased the perception of disease severity. The most frequent and dreaded complication was hypoglycemia. The main problems perceived by patients as affecting the quality of care were related to a disease-centered medical approach, lack of information, limited participation in decision-making, and the administrative and bureaucratic problems of the health care system. Conclusion: The bureaucratic circuits of the health care system impair patients' quality of life and perceived quality of care. Health professionals should foster patient participation in decision-making. However, this requires not only training and appropriate attitudes, but also adequate staffing and materials.

Relevance: 100.00%

Abstract:

“The liquidity crisis of the Spanish banks is largely due to the lack of confidence of foreign investors and, therefore, the changes that occur in the legislation should not affect the credibility, stability, legal certainty, predictability that markets expect” (Sergio Nasarre, 2011). In the current situation of economic crisis, many people have found they can no longer pay back the mortgage loans that were granted to them in order to purchase a dwelling. For this reason, and in light of the economic, political and social problems this poses, our paper studies the state of the Spanish real-estate system and of foreclosure, paying special attention to the solution that has recently been proposed as the best option for debtors who cannot make their mortgage payments: non-recourse mortgaging. We analyze this proposal from legal and economic perspectives in order to fully understand the effects that this change could imply. The paper also examines several alternatives we believe would ameliorate the situation of mortgage holders, among them legal reforms, mortgage insurance, and non-recourse mortgaging itself.

Relevance: 100.00%

Abstract:

Captopril, an orally active angiotensin-converting enzyme inhibitor, has been administered to 81 patients with different types of clinical hypertension. Most of the patients had previously uncontrollable high blood pressure. In order to achieve a satisfactory blood pressure control during long-term captopril therapy, a concomitant decrease in total body sodium was required in more than half of the patients. During our first two years of clinical experience with this new antihypertensive agent, side effects developed in 46.9 per cent of the patients and necessitated the withdrawal of the drug in 23.4 per cent of all patients. Only a few side effects such as hypotensive or syncopal episodes and cold extremities appeared to be due to the chronic blockade of the renin-angiotensin system. The most frequent and the most serious adverse reactions such as skin rash, altered taste, pancytopenia, and pemphigus foliaceus seemed to be specifically drug related. The incidence of cutaneous and taste problems was markedly higher in patients with impaired renal function in whom retention of captopril has been previously demonstrated. This suggests that the occurrence of adverse reactions to captopril could be lowered in the future by using smaller daily doses and by titrating them according to the renal function.

Relevance: 100.00%

Abstract:

The main goal of cyclists has always been training to improve their physiological condition and performance. Over the years cycling, like virtually every sport, has modernised, and not only technologically. This has led to the emergence of specialists: riders destined to stand out above the rest only under certain conditions. One of the most restricted of these conditions is the mass bunch finish, the terrain of the so-called sprinters, who shine above the rest thanks to their power, top speed and acceleration. Training for this speciality has revealed several ambiguities and some problems of theoretical grounding. In cycling, the sprint comes after a heavy depletion of energy reserves and considerable muscular fatigue, so training it with blocks of speed work makes little sense. Nor is the approach used by many teams a viable option: recruiting young riders from the track and letting their genetics (fast-twitch fibres) and track-rider characteristics do the rest, because over the years they lose this advantage. This study seeks a way to train and enhance a cyclist's sprint through explosive strength while preserving aerobic condition, so that endurance is not compromised. To this end, two tests were carried out: one focused on measuring the subjects' sprint performance, the other on assessing their explosive strength through vertical jumps. After the first round of results, the subjects underwent a combined overload training programme in order to observe, in the second round, whether the results were significant. In conclusion, improvement was seen in most aspects of all tests for all subjects, and there is probably a significant correlation between explosive strength and sprinting ability, although the results should be corroborated with a larger sample.
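The statistics behind these conclusions could be sketched as follows; the data file and column names are hypothetical, and the tests shown (paired t-test, Pearson and Spearman correlations) are typical choices for this design rather than necessarily those used in the study:

# Sketch of a pre/post comparison of sprint performance and of the correlation
# between vertical jump height and sprint power. Data and columns are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("sprint_tests.csv")     # one row per rider, pre/post columns

# Did the combined overload programme change peak sprint power?
t, p_value = stats.ttest_rel(df["sprint_power_pre"], df["sprint_power_post"])
print(f"paired t-test: t={t:.2f}, p={p_value:.3f}")

# Is explosive strength (vertical jump) related to sprint ability?
r, p_corr = stats.pearsonr(df["jump_height_post"], df["sprint_power_post"])
print(f"Pearson r={r:.2f}, p={p_corr:.3f}")

# With a small sample, a rank correlation is a useful robustness check.
rho, p_rho = stats.spearmanr(df["jump_height_post"], df["sprint_power_post"])
print(f"Spearman rho={rho:.2f}, p={p_rho:.3f}")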

Relevance: 100.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? Most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can solve classification, regression and probability density modelling problems in high-dimensional geo-feature spaces composed of geographical coordinates and additional relevant spatially referenced features, and they are well suited to serve as predictive engines in decision support systems for environmental data mining, including pattern recognition, modelling and prediction as well as automatic data mapping. Their efficiency is competitive with that of geostatistical models in low-dimensional geographical spaces, and they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to their software implementation: the multilayer perceptron (MLP), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, exploratory spatial data analysis (ESDA) is treated both with the traditional geostatistical approach of experimental variography and with machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect spatial patterns describable, at least, by two-point statistics. The machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a topical problem, the automatic mapping of geospatial data, for which the general regression neural network is proposed as an efficient model. The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially in the case of emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and guided examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural-hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to provide a user-friendly and easy-to-use interface.
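Since the GRNN is central to the automatic-mapping part of the thesis, a minimal sketch of a GRNN-style interpolator (Nadaraya-Watson kernel regression) on synthetic data is given below; this is a generic illustration, not the Machine Learning Office implementation:

# Minimal sketch of a GRNN-style spatial interpolator with a simple
# cross-validated choice of kernel width. Data are synthetic.
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """General regression neural network: kernel-weighted mean of training values."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, size=(300, 2))                       # measurement locations
z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1]) + rng.normal(0, 0.05, 300)

# Choose the kernel width sigma by simple hold-out validation.
val = rng.choice(300, size=60, replace=False)
train = np.setdiff1d(np.arange(300), val)
sigmas = [0.02, 0.05, 0.1, 0.2]
errors = [np.mean((grnn_predict(xy[train], z[train], xy[val], s) - z[val]) ** 2)
          for s in sigmas]
best = sigmas[int(np.argmin(errors))]
print("selected sigma:", best, "validation MSE:", min(errors))

# Automatic mapping on a regular grid with the selected width.
gx, gy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_map = grnn_predict(xy, z, grid, best)
print("mapped field range:", z_map.min(), z_map.max())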