907 results for General Linear Methods


Relevance:

30.00%

Publisher:

Abstract:

Early repolarization, which is characterized by elevation of the J-point on the 12-lead electrocardiogram, is a common finding that was considered benign for decades. In recent years, however, it has been associated with vulnerability to idiopathic ventricular fibrillation and with cardiac mortality in the general population. Recently, 4 potential ECG predictors that could differentiate the benign from the malignant form of early repolarization have been suggested. No previous study of early repolarization has been carried out in Spain. Aim. To ascertain whether the presence of an early repolarization pattern in a resting electrocardiogram is associated with a higher risk of cardiac death in a Spanish general population, and to determine whether the presence of potential predictors of malignancy in a resting electrocardiogram increases the risk of cardiac mortality in patients with an early repolarization pattern. Methods. We will analyse the presence of early repolarization and the occurrence of cardiac mortality in a retrospective cohort study of 4,279 participants aged 25 to 74 years in the province of Girona. This cohort has been followed for a mean of 9.8 years. Early repolarization will be stratified according to the degree of J-point elevation (≥0.1 mV or ≥0.2 mV), the morphology of the J-wave (slurring, notching, or either of the two), the ST-segment pattern (ascending or descending) and the localization (inferior leads, lateral leads, or both). The association of early repolarization with cardiac death will be assessed with adjusted Cox proportional hazards models.

Relevance:

30.00%

Publisher:

Abstract:

Background: Recent research based on comparisons between bilinguals and monolinguals postulates that bilingualism enhances cognitive control functions, because the parallel activation of languages necessitates control of interference. In a novel approach we investigated two groups of bilinguals, distinguished by their susceptibility to cross-language interference, asking whether bilinguals with strong language control abilities ("non-switchers") have an advantage in executive functions (inhibition of irrelevant information, problem solving, planning efficiency, generative fluency and self-monitoring) compared to bilinguals with weaker language control abilities ("switchers"). Methods: 29 late bilinguals (21 women) were evaluated using various neuropsychological tests of cognitive control (e.g., Tower of Hanoi, Ruff Figural Fluency Task, Divided Attention, Go/noGo) tapping executive functions, as well as four subtests of the Wechsler Adult Intelligence Scale. The analysis involved t-tests (two independent samples). Non-switchers (n = 16) were distinguished from switchers (n = 13) by their performance in a bilingual picture-naming task. Results: The non-switcher group demonstrated better performance on the Tower of Hanoi and the Ruff Figural Fluency Task, faster reaction times in the Go/noGo and Divided Attention tasks, and produced significantly fewer errors in the Tower of Hanoi, Go/noGo, and Divided Attention tasks than the switchers. Non-switchers also performed significantly better on two verbal subtests of the Wechsler Adult Intelligence Scale (Information and Similarities), but not on the Performance subtests (Picture Completion, Block Design). Conclusions: The present results suggest that bilinguals with stronger language control do have a cognitive advantage in the administered tests of executive functions, in particular inhibition, self-monitoring, problem solving and generative fluency, and in two of the intelligence subtests.
What remains unclear is the direction of the relationship between executive functions and language control abilities.

Relevance:

30.00%

Publisher:

Abstract:

Learning of preference relations has recently received significant attention in the machine learning community. It is closely related to classification and regression analysis and can be reduced to these tasks. However, preference learning involves predicting an ordering of the data points rather than a single numerical value, as in regression, or a class label, as in classification. Therefore, studying preference relations within a separate framework not only facilitates a better theoretical understanding of the problem, but also motivates the development of efficient algorithms for the task. Preference learning has many applications in domains such as information retrieval, bioinformatics and natural language processing. For example, algorithms that learn to rank are frequently used in search engines for ordering the documents retrieved by a query. Preference learning methods have also been applied to collaborative filtering problems for predicting individual customer choices from the vast amount of user-generated feedback. In this thesis we propose several algorithms for learning preference relations. These algorithms stem from the well-founded and robust class of regularized least-squares methods and have many attractive computational properties. To improve the performance of our methods, we introduce several non-linear kernel functions. Thus, the contribution of this thesis is twofold: kernel functions for structured data, used to take advantage of various non-vectorial data representations, and preference learning algorithms suitable for different tasks, namely efficient learning of preference relations, learning with large amounts of training data, and semi-supervised preference learning. The proposed kernel-based algorithms and kernels are applied to the parse ranking task in natural language processing, document ranking in information retrieval, and remote homology detection in bioinformatics.
Training of kernel-based ranking algorithms can be infeasible when the training set is large. This problem is addressed by proposing a preference learning algorithm whose computational complexity scales linearly with the number of training data points. We also introduce a sparse approximation of the algorithm that can be trained efficiently with large amounts of data. For situations where a small amount of labeled data but a large amount of unlabeled data is available, we propose a co-regularized preference learning algorithm. To conclude, the methods presented in this thesis address not only efficient training of the algorithms but also fast regularization parameter selection, multiple output prediction, and cross-validation. Furthermore, the proposed algorithms lead to notably better performance in many of the preference learning tasks considered.
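The pairwise, regularized least-squares view of preference learning described above can be illustrated with a minimal linear sketch. This is an assumption-laden toy (synthetic data, linear model, a single fixed regularization parameter), not the thesis's actual kernel-based algorithms:

```python
import numpy as np

def fit_pairwise_ls(X, pairs, lam=1.0):
    """Fit w so that w.(x_i - x_j) ~ 1 for each preferred pair (i, j),
    with Tikhonov regularization lam * ||w||^2 (illustrative sketch)."""
    P = np.array([X[i] - X[j] for i, j in pairs])   # pairwise difference vectors
    y = np.ones(len(pairs))                          # target: i preferred over j
    d = X.shape[1]
    # closed-form regularized least-squares solution
    return np.linalg.solve(P.T @ P + lam * np.eye(d), P.T @ y)

# toy data: items whose true ranking follows the first feature
X = np.array([[3.0, 0.1], [2.0, 0.5], [1.0, 0.9]])
pairs = [(0, 1), (1, 2), (0, 2)]                     # preferences: 0 > 1 > 2
w = fit_pairwise_ls(X, pairs)
ranking = np.argsort(-(X @ w))                       # predicted order
print(ranking.tolist())                              # -> [0, 1, 2]
```

Reducing preferences to regression over difference vectors is what makes the closed-form least-squares machinery applicable; the kernelized versions replace the explicit differences with kernel evaluations.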

Relevance:

30.00%

Publisher:

Abstract:

The general objective of the international MEDiterranean EXperiment (MEDEX) was a better understanding and forecasting of cyclones that produce high-impact weather in the Mediterranean. This paper reviews the motivation and foundation of MEDEX, the gestation, history and organisation of the project, as well as the main products and scientific achievements obtained from it. MEDEX obtained the approval of the World Meteorological Organisation (WMO) and can be considered framed within other WMO actions, such as the ALPine EXperiment (ALPEX), the Mediterranean Cyclones Study Project (MCP) and, to a certain extent, THe Observing System Research and Predictability EXperiment (THORPEX) and the HYdrological cycle in the Mediterranean EXperiment (HyMeX). Through two phases (2000-2005 and 2006-2010), MEDEX has produced a specific database with information about cyclones and severe or high-impact weather events, several main reports and a specific data targeting system field campaign (DTS-MEDEX-2009). The scientific achievements are significant in fields like the climatology, dynamical understanding of the physical processes and social impact of cyclones, as well as in aspects related to the location of sensitive zones for individual cases, the climatology of sensitivity zones and the improvement of forecasts through innovative methods like mesoscale ensemble prediction systems.

Relevance:

30.00%

Publisher:

Abstract:

Background: Documenting the distribution of radiotherapy departments and the availability of radiotherapy equipment in the European countries is an important part of HERO, the ESTRO Health Economics in Radiation Oncology project. HERO has the overall aim of developing a knowledge base of the provision of radiotherapy in Europe and building a model for health economic evaluation of radiation treatments at the European level. The aim of the current report is to describe the distribution of radiotherapy equipment in European countries. Methods: An 84-item questionnaire was sent out to European countries, principally through their national societies. The current report includes a detailed analysis of radiotherapy departments and equipment (questionnaire items 26-29), analyzed in relation to the annual number of treatment courses and the socio-economic status of the countries. The analysis is based on validated responses from 28 of the 40 European countries defined by the European Cancer Observatory (ECO). Results: A large variation between countries was found for most parameters studied. There were 2192 linear accelerators, 96 dedicated stereotactic machines, and 77 cobalt machines reported in the 27 countries where this information was available. A total of 12 countries had at least one cobalt machine in use. There was a median of 0.5 simulators per MV unit (range 0.3-1.5) and 1.4 simulators per department (range 0.4-4.4). Of the 874 simulators, a total of 654 (75%) were capable of 3D imaging (CT scanner or CBCT option). The number of MV machines (cobalt, linear accelerators, and dedicated stereotactic machines) per million inhabitants ranged from 1.4 to 9.5 (median 5.3), and the average number of MV machines per department from 0.9 to 8.2 (median 2.6). The average number of treatment courses per year per MV machine varied from 262 to 1061 (median 419). While 69% of MV units were capable of IMRT, only 49% were equipped for image guidance (IGRT).
There was a clear relation between socio-economic status, as measured by GNI per capita, and the availability of radiotherapy equipment in the countries. In many low-income countries in Southern and Central-Eastern Europe there was very limited access to radiotherapy, and especially to equipment for IMRT or IGRT. Conclusions: The European average number of MV machines per million inhabitants and per department is now better in line with the QUARTS recommendations from 2005, but the survey also showed a significant heterogeneity in access to modern radiotherapy equipment in Europe. High-income countries, especially in Northern-Western Europe, are well served with radiotherapy resources, while other countries face important shortages of both equipment in general and especially of machines capable of delivering high-precision conformal treatments (IMRT, IGRT).

Relevance:

30.00%

Publisher:

Abstract:

Rosin is a natural product from pine forests and is used as a raw material in resinate syntheses. Resinates are polyvalent metal salts of rosin acids; Ca- and Ca/Mg-resinates in particular find wide application in the printing ink industry. In this thesis, analytical methods were applied to increase general knowledge of resinate chemistry, and the reaction kinetics was studied in order to model the non-linear increase in solution viscosity during resinate syntheses by the fusion method. Solution viscosity in toluene is an important quality factor for resinates to be used in printing inks. The concept of a critical resinate concentration, c_crit, was introduced to define an abrupt change in the dependence of viscosity on resinate concentration in solution. The concept was then used to explain the non-linear increase in solution viscosity during resinate syntheses. A semi-empirical model with two estimated parameters was derived for the viscosity increase on the basis of apparent reaction kinetics. The model was used to control the viscosity and to predict the total reaction time of the resinate process. The kinetic data from the complex reaction media were obtained by acid value titration and by FTIR spectroscopic analyses, using a conventional calibration method to measure the resinate concentration and the concentration of free rosin acids. A multivariate calibration method was successfully applied to build partial least squares (PLS) models for monitoring acid value and solution viscosity in both the mid-infrared (MIR) and near-infrared (NIR) regions during the syntheses. The calibration models can be used for on-line monitoring of the resinate process. In the kinetic studies, two main reaction steps were observed during the syntheses. First, a fast irreversible resination reaction occurs at 235 °C, and then a slow thermal decarboxylation of rosin acids starts to take place at 265 °C.
Rosin oil is formed during the decarboxylation reaction step, causing significant mass loss as the rosin oil evaporates from the system while the viscosity increases to the target level. The mass balance of the syntheses was determined from the increase in resinate concentration during the decarboxylation reaction step. A mechanistic study of the decarboxylation reaction was based on the observation that resinate molecules are partly solvated by rosin acids during the syntheses. Different decarboxylation mechanisms were proposed for the free and the solvating rosin acids. The deduced kinetic model agreed with the analytical data of the syntheses over a wide resinate concentration region, over a wide range of viscosity values and at different reaction temperatures. In addition, the application of the kinetic model to the modified resinate syntheses gave a good fit. A novel synthesis method with the addition of decarboxylated rosin (i.e. rosin oil) to the reaction mixture was introduced. The conversion of rosin acid to resinate was increased to the level necessary to obtain the target viscosity for the product at 235 °C. Because the reaction temperature is lower than in the traditional fusion synthesis at 265 °C, thermal decarboxylation is avoided. As a consequence, the mass yield of the resinate syntheses can be increased from ca. 70% to almost 100% by recycling the added rosin oil.
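The fitting of a two-parameter semi-empirical viscosity model of the kind described above can be sketched as follows. The exponential form eta = A * exp(B * c) and all numbers are illustrative assumptions; the actual model and data of the work are not reproduced here:

```python
import numpy as np

# hypothetical resinate concentration (wt-%) vs. solution viscosity (mPa*s);
# the exponential model eta = A * exp(B * c) is an illustrative assumption,
# not the semi-empirical model derived in the thesis
c = np.array([30.0, 35.0, 40.0, 45.0, 50.0])
eta = np.array([12.0, 33.0, 90.0, 245.0, 660.0])

# linearize: ln(eta) = ln(A) + B * c, then fit by ordinary least squares
B, lnA = np.polyfit(c, np.log(eta), 1)
A = np.exp(lnA)
print(A, B)   # the two estimated parameters of the model
```

Monitoring predicted viscosity against the target level with such a fitted model is one way the total reaction time of a synthesis could be anticipated.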

Relevance:

30.00%

Publisher:

Abstract:

The general striving to bring down the number of municipal landfills and to increase the reuse and recycling of waste-derived materials across the EU feeds the debate concerning the feasibility and rationality of waste management systems. A substantial decrease in the volume and mass of landfill-disposed waste flows can be achieved by directing suitable waste fractions to energy recovery. Global fossil energy supplies are becoming ever more valuable and expensive, and efforts are being made to save fossil fuels. Waste-derived fuels offer one potential partial solution to two different problems. First, waste that cannot feasibly be re-used or recycled is utilized in the energy conversion process, in accordance with the EU's Waste Hierarchy. Second, fossil fuels can be saved for purposes other than energy, mainly as transport fuels. This thesis presents the principles of assessing the most sustainable system solution for an integrated municipal waste management and energy system. The assessment process includes: · formation of a SISMan (Simple Integrated System Management) model of an integrated system including mass, energy and financial flows, and · formation of a MEFLO (Mass, Energy, Financial, Legislational, Other decision-support data) decision matrix according to the selected decision criteria, including essential and optional decision criteria. The methods are described and theoretical examples of their utilization are presented in the thesis. The assessment process involves the selection of different system alternatives (process alternatives for the treatment of different waste fractions) and comparison between the alternatives. The first of the two novel contributions of the presented methods is the perspective selected for the formation of the SISMan model. Normally, waste management and energy systems are operated separately according to the targets and principles set for each system.
In the thesis, the waste management and energy supply systems are considered as one larger integrated system with the primary target of serving the customers, i.e. citizens, as efficiently as possible in the spirit of sustainable development, subject to the following requirements: · reasonable overall costs, including waste management costs and energy costs; · minimum environmental burdens caused by the integrated waste management and energy system, taking into account the requirement above; and · social acceptance of the selected waste treatment and energy production methods. The integrated waste management and energy system is described by forming a SISMan model comprising the three different flows of the system: energy, mass and financial flows. By defining the three types of flows for an integrated system, the selected factor results needed in the decision-making process for the selection of waste treatment processes for different waste fractions can be calculated. The model and its results form a transparent description of the integrated system under discussion. The MEFLO decision matrix is formed from the results of the SISMan model, combined with additional data including, e.g., environmental restrictions and regional aspects. System alternatives that do not meet the requirements set by legislation can be deleted from the comparisons before any closer numerical considerations. The second novel contribution of this thesis is the three-level ranking method for combining the factor results of the MEFLO decision matrix. As a result of the MEFLO decision matrix, a transparent ranking of the different system alternatives, including the selection of treatment processes for different waste fractions, is achieved. SISMan and MEFLO are methods meant to be utilized in municipal decision-making processes concerning waste management and energy supply as simple, transparent and easy-to-understand tools.
The methods can be utilized in the assessment of existing systems, and particularly in the planning of future regional integrated systems. The principles of SISMan and MEFLO can also be utilized in other environments where synergies from integrating two (or more) systems can be obtained. The SISMan flow model and the MEFLO decision matrix can be formed with or without any applicable commercial or free-of-charge tool/software. SISMan and MEFLO are not bound to any libraries or databases of process information, such as the emission data libraries utilized in life cycle assessments.
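The two-step logic of the decision matrix described above (delete alternatives that fail legislation, then rank the rest on factor results) can be sketched in a few lines. The alternatives, factors and weighted-sum scoring below are invented for illustration; the actual MEFLO three-level ranking method is not reproduced here:

```python
# hypothetical decision matrix for three system alternatives;
# factor results: overall cost and environmental burden (lower is better),
# social acceptance (higher is better), plus a legislative pass/fail flag
alternatives = {
    "A: landfill plus separate energy": {"cost": 70, "burden": 90, "acceptance": 40, "legal": True},
    "B: energy recovery of residual waste": {"cost": 55, "burden": 45, "acceptance": 70, "legal": True},
    "C: uncontrolled incineration": {"cost": 40, "burden": 95, "acceptance": 20, "legal": False},
}

weights = {"cost": -0.4, "burden": -0.4, "acceptance": 0.2}  # assumed weights

# step 1: delete alternatives that fail the legislative requirements
feasible = {k: v for k, v in alternatives.items() if v["legal"]}

# step 2: rank the remainder by a weighted sum of factor results
def score(v):
    return sum(weights[f] * v[f] for f in weights)

ranking = sorted(feasible, key=lambda k: score(feasible[k]), reverse=True)
print(ranking[0])   # best remaining alternative
```

The pre-filtering step mirrors the text's point that illegal alternatives are removed before any closer numerical consideration, which keeps the numeric comparison transparent.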

Relevance:

30.00%

Publisher:

Abstract:

The environmental impact of detergents and other consumer products is behind the continued interest in the chemistry of the surfactants used. Of these, linear alkylbenzene sulfonates (LASs) are the most widely employed in detergent formulations. The precursors to LASs are linear alkylbenzenes (LABs). There is also interest in the chemistry of these hydrocarbons, because they are usually present in commercial LASs (due to incomplete sulfonation), or form as one of their degradation products. Additionally, they may be employed as molecular tracers of domestic waste in the aquatic environment. The following aspects are covered in the present review: the chemistry of surfactants, in particular LAS; the environmental impact of the production of LAS; the environmental and toxicological effects of LAS; mechanisms of removal of LAS in the environment; and methods for monitoring LAS and LAB, the latter in domestic wastes. Classical and novel analytical methods employed for the determination of LAS and LAB are discussed in detail, and a brief comment on detergents in Brazil is given.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Population aging is closely related to the high prevalence of chronic conditions in developed countries. In this context, health care policies aim to increase life span cost-effectively while maintaining quality of life and functional ability. There is still, however, a need for further understanding of how chronic conditions affect these health aspects. The aim of this paper is to assess the individual and combined impact of chronic physical and mental conditions on quality of life and disability in Spain, and secondly to show gender trends. METHODS: Cross-sectional data were collected from the COURAGE study. A total of 3,625 participants over 50 years old from Spain were included. Crude and adjusted multiple linear regressions were conducted to detect associations between individual chronic conditions and disability, and between chronic conditions and quality of life. Separate models were used to assess the influence of the number of diseases on the same variables. Additional analogous regressions were performed for males and females. RESULTS: All chronic conditions except hypertension were statistically associated with poorer quality of life and disability. Depression, anxiety and stroke were found to have the greatest impact on outcomes. The number of chronic conditions was associated with substantially lower quality of life [β for 4+ diseases: -18.10 (-20.95, -15.25)] and greater disability [β for 4+ diseases: 27.64 (24.99, 30.29)]. In general, women suffered from higher rates of multimorbidity and poorer quality of life and disability outcomes. CONCLUSIONS: Chronic conditions have a great impact on quality of life and disability in the older Spanish population, especially when co-occurring diseases are added. Multimorbidity considerations should be a priority in the development of future health policies focused on quality of life and disability. Further studies would benefit from an expanded selection of diseases.
Policies should also address gender-specific differences in certain cases.
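The adjusted multiple linear regressions reported above can be illustrated in miniature. This sketch fits an ordinary least-squares regression on synthetic data (the variables, coefficients and sample size are invented for illustration and have nothing to do with the COURAGE data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
n_conditions = rng.integers(0, 5, n)        # number of chronic conditions
age = rng.uniform(50, 90, n)                # adjustment covariate

# synthetic disability score: each added condition raises disability by 6 points
disability = 10 + 6.0 * n_conditions + 0.3 * age + rng.normal(0, 2, n)

# design matrix with intercept; beta estimated by ordinary least squares
X = np.column_stack([np.ones(n), n_conditions, age])
beta, *_ = np.linalg.lstsq(X, disability, rcond=None)
print(beta)   # approximately [10, 6.0, 0.3]
```

The coefficient on the disease count is the analogue of the β values quoted in the abstract: the expected change in the outcome per additional condition, holding the other covariates fixed.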

Relevance:

30.00%

Publisher:

Abstract:

As a result of the collaboration policies and strategies between the University of Vic and the hospital, and of the desire to carry out joint training activities, a line of work was established aimed at studying and analysing the current internal logistics of the clinical analysis laboratory of the Hospital General de Vic. The work focuses on the laboratory's internal process, and the scope of the study is limited to the specific areas of haematology and coagulation, and biochemistry. For these two areas, the work carries out an exhaustive study of their internal process, identifying their activities and working methodology in order to draw up the Value Stream Map of each area. The Microbiology, Blood Bank and Emergency areas remain outside this exhaustive study, although they appear in the work because of their inevitable interaction with the overall process. The work focuses mainly on the automated processes, even though the processes carried out in the laboratory are both automated and manual. It is also limited to the laboratory's internal production system, despite the interaction of this internal system with other production centres of the wider system, such as primary care centres, the various hospitals and social-healthcare centres. At the time this work was written, the laboratory was immersed in a process of change and improvement of its internal processes, consisting mainly of the replacement of part of the current machinery, which requires the definition of a new layout and a new distribution of production across each machine. Externally, improvements are also being made to the management information system that affect part of its process.
The objective of the work is to give full visibility to the laboratory's current internal logistics process, clearly identifying the nature and sequence of the internal logistics processes and the current working methods, for both machine and human resources, in order to identify, from a value-generation perspective, the specific points of internal logistics that can be improved in terms of efficiency and productivity, so that, once identified, improvement actions and/or projects can be undertaken. The work concludes with a final analysis of the internal logistics process from a Lean perspective. To do so, it identifies the activities that add no value to the process (MUDA) and classifies them into seven categories, and several improvement proposals are made, such as the implementation of a continuous, levelled flow based on a pull concept; it also identifies activities that can be standardised and/or simplified, and proposes modifications to the physical infrastructure to give greater visibility to the process. The human side of the process is addressed from the standpoint of methodology, training, communication and the application of the 5S.

Relevance:

30.00%

Publisher:

Abstract:

Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis. This has been one of the reasons for the ever-growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches in the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. In this thesis it is shown that the results of data analysis vary according to the modeling approach used, thus making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should differ from the case where the purpose of modeling is mostly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. In this thesis, differences between data analysis methods are compared using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries. The results are compared to those from PLS and priority PLS.
The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS, and nonlinear modeling using nonlinear score vectors.
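As a reminder of what the baseline PCA step computes, a minimal sketch via SVD of the mean-centered data matrix (the data are synthetic; none of the thesis's industrial data sets are reproduced):

```python
import numpy as np

def pca(X, n_components):
    """Principal component analysis via SVD of the mean-centered data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T        # projections onto the PCs
    explained = s**2 / np.sum(s**2)          # variance ratio per component
    return scores, Vt[:n_components], explained[:n_components]

# synthetic "undesigned" data: three correlated variables driven by one
# latent factor plus a little measurement noise
rng = np.random.default_rng(1)
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, 0.5 * t]) + 0.01 * rng.normal(size=(200, 3))
scores, loadings, explained = pca(X, 1)
print(explained[0])   # close to 1.0: one latent direction dominates
```

PLS differs in that the projection directions are chosen to covary with a response rather than to maximize variance alone, which is why the two methods can rank the same variables quite differently.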

Relevance:

30.00%

Publisher:

Abstract:

Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest, especially in the evolutionary computation community. Several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms, which must be effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is not generally thoroughly understood, the research relies heavily on experimental studies to find out the properties of the different approaches. To achieve reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varying problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists.
As a part of this thesis, such a framework for generating tunable test functions for experimentally evaluating different multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may invalidate the advantages gained from the global phase.
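A bare-bones Differential Evolution loop (the DE/rand/1/bin strategy) illustrates the global phase discussed above. The niching and hybrid local-search variants studied in the thesis are not reproduced; population size, F, CR and the test function are assumptions for the sketch:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.7, CR=0.5,
                           generations=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, d))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation: combine three distinct random members
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover with at least one mutant coordinate
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                 # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = np.argmin(fit)
    return pop[best], fit[best]

# multimodal test function (Rastrigin): many local minima, global minimum 0 at x = 0
rastrigin = lambda x: np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)
x, fx = differential_evolution(rastrigin, [(-5.12, 5.12)] * 2)
print(fx)   # near 0 for the 2-D Rastrigin function
```

Note that this plain loop converges to a single optimum; niching techniques modify the selection or population structure precisely so that several optima can be retained simultaneously.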

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The aim of the Cancer Fast-track Programme was to reduce the time that elapsed between well-founded suspicion of breast, colorectal or lung cancer and the start of initial treatment in Catalonia (Spain). We sought to analyse its implementation and overall effectiveness. METHODS: A quantitative analysis of the programme was performed using data generated by the hospitals on the basis of seven fast-track monitoring indicators for the period 2006-2009. In addition, we conducted a qualitative study, based on 83 semi-structured interviews with primary and specialised health professionals and health administrators, to obtain their perception of the programme's implementation. RESULTS: About half of all new patients with breast, lung or colorectal cancer were diagnosed via the fast track, though the cancer detection rate declined across the period. The mean time from detection of suspected cancer in primary care to the start of initial treatment was 32 days for breast, 30 for colorectal and 37 for lung cancer (2009). Professionals associated with the implementation of the programme reported that general practitioners faced with a suspicion of cancer had changed their conduct with the aim of preventing delays. Furthermore, hospitals were found to have pursued three specific implementation strategies (top-down, consensus-based and participatory), which made for the cohesion and sustainability of the circuits. CONCLUSION: The programme has contributed to speeding up the diagnostic assessment and treatment of patients with suspected cancer, and to clarifying the patient pathway between primary and specialised care.

Relevance:

30.00%

Publisher:

Abstract:

Reversed-phase liquid chromatographic (LC) and ultraviolet (UV) spectrophotometric methods were developed and validated for the assay of bromopride in oral and injectable solutions. The methods were validated according to the ICH guidelines. Both methods were linear over the range of 5-25 μg mL-1 (y = 41837x - 5103.4, r = 0.9996 and y = 0.0284x - 0.0351, r = 1, respectively). Statistical analysis showed no significant difference between the results obtained by the two methods. The proposed methods were found to be simple, rapid, precise, accurate, and sensitive. The LC and UV methods can be used in the routine quantitative analysis of bromopride in oral and injectable solutions.
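A calibration line such as the LC equation y = 41837x - 5103.4 quoted above is used in routine analysis by inverting it to recover concentration from a measured response. A minimal sketch (the slope and intercept come from the abstract; the sample value is invented for the round-trip check):

```python
def lc_concentration(peak_area, slope=41837.0, intercept=-5103.4):
    """Invert the LC calibration line y = slope * x + intercept
    to obtain the concentration x (ug/mL) from a peak area y."""
    return (peak_area - intercept) / slope

# round-trip check with the area the line predicts for a 10 ug/mL standard
area_10 = 41837.0 * 10 - 5103.4
print(lc_concentration(area_10))   # -> 10.0
```

The same inversion applies to the UV line (y = 0.0284x - 0.0351) with its own slope and intercept.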

Relevance:

30.00%

Publisher:

Abstract:

The nonlinear analysis of a general mixed second-order reaction was performed, aiming to explore some basic tools of the mathematics of nonlinear differential equations. Concepts of stability around fixed points based on linear stability analysis are introduced, together with the phase plane and integral curves. The main focus is the chemical relationship between a change of limiting reagent and a transcritical bifurcation, and the investigation underlying this conclusion.
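For a mixed second-order reaction A + B -> products with rate dx/dt = k(a0 - x)(b0 - x), the fixed points are x = a0 and x = b0, and linear stability follows from the sign of f'(x) at each. The two fixed points exchange stability when a0 = b0, which is the transcritical bifurcation tied to the change of limiting reagent. A numerical check (k and the initial concentrations are illustrative values, not from the paper):

```python
# dx/dt = f(x) = k * (a0 - x) * (b0 - x), x = extent of reaction;
# fixed points at x = a0 and x = b0
def fprime(x, k, a0, b0):
    # derivative of f: f'(x) = k * (2*x - a0 - b0); negative => stable
    return k * (2 * x - a0 - b0)

k = 1.0
# case 1: A is limiting (a0 < b0): x = a0 is stable, x = b0 unstable
a0, b0 = 1.0, 2.0
print(fprime(a0, k, a0, b0), fprime(b0, k, a0, b0))   # -> -1.0 1.0
# case 2: B is limiting (a0 > b0): the stabilities exchange (transcritical)
a0, b0 = 2.0, 1.0
print(fprime(a0, k, a0, b0), fprime(b0, k, a0, b0))   # -> 1.0 -1.0
```

Physically, only the fixed point at the limiting reagent's initial concentration is reached, and swapping which reagent is limiting swaps which fixed point attracts the trajectory.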