118 results for Machine Tools


Relevance: 20.00%

Publisher:

Abstract:

Background: In haemodynamically stable patients with acute symptomatic pulmonary embolism (PE), studies have not evaluated the usefulness of combining the measurement of cardiac troponin, transthoracic echocardiography (TTE), and lower extremity complete compression ultrasound (CCUS) for predicting the risk of PE-related death. Methods: The study assessed the ability of three diagnostic tests (cardiac troponin I (cTnI), echocardiography, and CCUS) to prognosticate the primary outcome of PE-related mortality during 30 days of follow-up after a diagnosis of PE by objective testing. Results: Of 591 normotensive patients diagnosed with PE, the primary outcome occurred in 37 patients (6.3%; 95% CI 4.3% to 8.2%). Patients with right ventricular dysfunction (RVD) on TTE and concomitant deep vein thrombosis (DVT) on CCUS had a PE-related mortality of 19.6%, compared with 17.1% for patients with elevated cTnI and concomitant DVT, and 15.2% for patients with elevated cTnI and RVD. Any two-test strategy had higher specificity and positive predictive value than any single test. A combined three-test strategy did not further improve prognostication. In a subgroup analysis of high-risk patients according to the pulmonary embolism severity index (classes IV and V), the positive predictive values of the two-test strategies for PE-related mortality were 25.0%, 24.4%, and 20.7%, respectively. Conclusions: In haemodynamically stable patients with acute symptomatic PE, combining echocardiography (or troponin testing) with CCUS improved prognostication compared with any single test for identifying those at high risk of PE-related death.

Relevance: 20.00%

Publisher:

Abstract:

The present research deals with an application of artificial neural networks to multitask learning from spatial environmental data. The real case study (sediment contamination of Lake Geneva) comprises 8 pollutants. There are different relationships between these variables, from linear correlations to strong nonlinear dependencies. The main idea is to construct subsets of pollutants that can be efficiently modelled together within the multitask framework. The proposed two-step approach is based on: 1) a criterion of nonlinear predictability of each variable k, obtained by analyzing all possible models composed from the rest of the variables, using a General Regression Neural Network (GRNN) as the model; 2) multitask learning of the best model using a multilayer perceptron, followed by spatial predictions. The results of the study are analyzed using both machine learning and geostatistical tools.
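
As an illustration of the first step, the following minimal sketch scores how predictably each pollutant can be modelled from the others with a GRNN (Nadaraya-Watson kernel regression). It is not the authors' code: for brevity it uses all remaining variables rather than searching over all possible subsets, and the kernel width, synthetic data, and function names are assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    # General Regression Neural Network (Nadaraya-Watson kernel regression):
    # each prediction is a Gaussian-weighted average of the training targets.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def predictability_score(data, k, sigma=0.5):
    # Leave-one-out RMSE when predicting variable k from all the others;
    # a lower RMSE means variable k is more predictable from the rest.
    n = data.shape[0]
    X, y = np.delete(data, k, axis=1), data[:, k]
    sq_errs = []
    for i in range(n):
        mask = np.arange(n) != i
        y_hat = grnn_predict(X[mask], y[mask], X[i:i + 1], sigma)
        sq_errs.append((y[i] - y_hat[0]) ** 2)
    return np.sqrt(np.mean(sq_errs))

# Rank 8 hypothetical pollutant columns by their nonlinear predictability
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 8))  # stand-in for the 8 measured pollutants
scores = [predictability_score(data, k) for k in range(8)]
print(sorted(range(8), key=lambda k: scores[k]))
```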

Relevance: 20.00%

Publisher:

Abstract:

Purpose: Forensic imaging, and especially forensic radiology, is a new trend in forensic medicine. More and more forensic institutes are setting up their own CT scanners in order to perform postmortem cross-sectional imaging. This trend has given rise to a new subspecialty: forensic radiology. To image the vascular system after death, a postmortem CT angiography can be performed. Methods and materials: At the Institute of Forensic Medicine in Lausanne, a research group of specialists from different medical fields has developed a new technique for forensic CT angiography. The method consists in creating a postmortem circulation using a modified heart-lung machine. Angiofil, an oily contrast agent, is injected as the circulating liquid. Results: With the aid of this technique, the whole vascular system of a deceased person can be imaged in detail without autopsy. The circulating contrast agent allows the vascular system to be demonstrated under pressure, similarly to living patients. Initial experience showed that vascular pathologies such as cardiac tamponade and aortic dissection can be demonstrated well. Since the oily Angiofil remains strictly in the intravascular space, no artifacts were observed during the CT examination or the subsequently performed autopsy. Conclusion: Postmortem dynamic CT angiography is of great advantage in forensic pathology, because a detailed mapping of the entire vascular system is almost impossible with conventional autopsy tools.

Relevance: 20.00%

Publisher:

Abstract:

Political participation is often very low in Switzerland, especially among students and young citizens. In the run-up to the Swiss parliamentary election of October 2007, several online tools and campaigns were developed with the aim of increasing not only the level of information about the political programs of parties and candidates, but also the electoral participation of younger citizens. From a practical point of view, this paper describes the development, marketing efforts, distribution, and use of two of these tools: the so-called "Parteienkompass" (party compass) and the "myVote" tool, an online voting assistance tool based on an issue-matching system that compares policy preferences between voters and candidates at the individual level. We also take a look at similar Voting Advice Applications (VAA) in other countries in Western Europe. The paper closes with the results of an evaluation and an outlook on further developments and ongoing projects in Switzerland.
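
To make the mechanics concrete, here is a minimal sketch of how an issue-matching score of the kind used by such voting assistance tools might be computed. The answer scale, question names, and scoring rule are illustrative assumptions, not a description of the actual myVote algorithm.

```python
def match_score(voter: dict, candidate: dict) -> float:
    # Agreement over shared policy questions answered on a scale from
    # -2 (strongly disagree) to +2 (strongly agree); 100 = full agreement.
    shared = voter.keys() & candidate.keys()
    if not shared:
        return 0.0
    max_gap = 4  # largest possible per-question distance on this scale
    total_gap = sum(abs(voter[q] - candidate[q]) for q in shared)
    return 100.0 * (1 - total_gap / (max_gap * len(shared)))

# Hypothetical questionnaire answers
voter = {"eu_accession": -2, "co2_tax": 1, "retirement_age": 0}
candidate = {"eu_accession": -1, "co2_tax": 2, "retirement_age": -2}
print(f"{match_score(voter, candidate):.1f}% agreement")
```

Ranking all candidates by this score yields the individual-level recommendation list such tools display.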

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance, and over the last decades computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer users the ability to add their own drug models. 10 programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
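
To illustrate what Bayesian a posteriori dosage adaptation involves, below is a minimal, hypothetical sketch: a MAP (maximum a posteriori) estimate of an individual's clearance and volume for a one-compartment IV bolus model, given log-normal population priors and a single measured concentration, followed by a dose computed to reach a target level. All drug parameters and numbers are invented for illustration and correspond to no particular program in the benchmark.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical log-normal population priors for a 1-compartment IV drug
CL_POP, V_POP = 4.0, 30.0      # typical clearance (L/h) and volume (L)
OMEGA_CL, OMEGA_V = 0.3, 0.2   # between-subject SDs on the log scale
SIGMA = 0.15                   # residual (assay) SD on the log scale

def conc(dose, t, cl, v):
    # Concentration after an IV bolus in a one-compartment model
    return (dose / v) * np.exp(-(cl / v) * t)

def neg_log_posterior(theta, dose, t_obs, c_obs):
    cl, v = np.exp(theta)  # optimize on the log scale to keep cl, v > 0
    prior = ((np.log(cl / CL_POP) / OMEGA_CL) ** 2
             + (np.log(v / V_POP) / OMEGA_V) ** 2)
    likelihood = (np.log(c_obs / conc(dose, t_obs, cl, v)) / SIGMA) ** 2
    return 0.5 * (prior + likelihood)

# One measured level: 2.1 mg/L drawn 8 h after a 200 mg bolus
res = minimize(neg_log_posterior, x0=np.log([CL_POP, V_POP]),
               args=(200.0, 8.0, 2.1))
cl_i, v_i = np.exp(res.x)

# Dose needed so that the level 12 h post-dose hits a 2.5 mg/L target
dose = 2.5 * v_i * np.exp((cl_i / v_i) * 12.0)
print(f"MAP: CL={cl_i:.2f} L/h, V={v_i:.1f} L, suggested dose={dose:.0f} mg")
```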

Relevance: 20.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs; 8 programs offer the ability to add new drug models based on population PK data. 10 tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been made in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capability of data storage, and automated report generation.
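
Complementing the a posteriori example given earlier, the sketch below shows the a priori side: suggesting a starting dose from patient covariates alone, before any level is measured. The allometric weight scaling, renal adjustment, and AUC target look conventional but are hypothetical; real programs use validated population PK models.

```python
def a_priori_dose(weight_kg: float, crcl_ml_min: float,
                  target_auc24: float = 400.0) -> float:
    # Suggest a starting daily dose (mg) from covariates alone.
    # Illustrative model: clearance scales allometrically with weight
    # and linearly with creatinine clearance.
    CL_TYPICAL = 4.0  # L/h for a 70 kg adult, CrCl 100 mL/min (hypothetical)
    cl = CL_TYPICAL * (weight_kg / 70.0) ** 0.75 * (crcl_ml_min / 100.0)
    # At steady state AUC(0-24h) = daily dose / clearance,
    # so daily dose = target AUC * clearance
    return target_auc24 * cl

print(f"Suggested starting dose: {a_priori_dose(85, 60):.0f} mg/day")
```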

Relevance: 20.00%

Publisher:

Abstract:

The present research reviews the analysis and modelling of Swiss franc interest rate curves (IRC) using unsupervised (SOM, Gaussian mixtures) and supervised (MLP) machine learning algorithms. IRC are considered as objects embedded into different feature spaces: maturities; maturity-date; and the parameters of the Nelson-Siegel model (NSM). Analysis of the NSM parameters and of their temporal and clustering structures helps in understanding the relevance of the model and its potential use for forecasting. A mapping of IRC in the maturity-date feature space is presented and analyzed for visualization and forecasting purposes.
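
Since the Nelson-Siegel parameters serve as one of the feature spaces, a brief sketch of fitting the model to an observed curve may help. The yield values, starting point, and bounds below are assumptions for illustration; the functional form itself is the standard NSM.

```python
import numpy as np
from scipy.optimize import least_squares

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    # NSM yield at maturity tau: level (beta0), slope (beta1) and
    # curvature (beta2) terms, governed by the decay parameter lam
    x = tau / lam
    slope = (1 - np.exp(-x)) / x
    return beta0 + beta1 * slope + beta2 * (slope - np.exp(-x))

# Hypothetical observed curve: maturities (years) and yields (%)
tau = np.array([0.25, 0.5, 1, 2, 5, 10, 20, 30])
y = np.array([0.4, 0.5, 0.7, 1.0, 1.5, 1.9, 2.2, 2.3])

fit = least_squares(lambda p: nelson_siegel(tau, *p) - y,
                    x0=[2.0, -1.5, 0.5, 1.5],
                    bounds=([-10, -10, -10, 0.05], [10, 10, 10, 30]))
beta0, beta1, beta2, lam = fit.x
print(f"level={beta0:.2f} slope={beta1:.2f} curvature={beta2:.2f} lam={lam:.2f}")
```

The fitted (beta0, beta1, beta2, lam) vector per date is then the low-dimensional representation of each curve used for clustering and forecasting.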

Relevance: 20.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
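
As a toy illustration of the data-driven approach described here (not the authors' actual pipeline), the sketch below trains a small ensemble of neural networks on synthetic point measurements and uses the spread of the ensemble's predictions over a grid as a simple surrogate for spatial uncertainty.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in for point measurements: (x, y) locations -> activity
X = rng.uniform(0, 10, size=(300, 2))
y = (np.exp(-((X[:, 0] - 3) ** 2 + (X[:, 1] - 7) ** 2) / 4)
     + rng.normal(0, 0.05, 300))

# Ensemble of networks differing only in random initialization; the
# spread of their predictions is a simple surrogate for model uncertainty
models = [MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                       random_state=s).fit(X, y) for s in range(10)]

# Dense prediction grid covering the study area
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])

preds = np.stack([m.predict(grid) for m in models])
mean_map, sd_map = preds.mean(axis=0), preds.std(axis=0)
print(f"peak estimate {mean_map.max():.2f}, mean spread {sd_map.mean():.3f}")
```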

Relevance: 20.00%

Publisher:

Abstract:

Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour methods (NN), which are known to have limitations when dealing with high dimensional data. We apply Support Vector Machines to a dataset from Lochaber, Scotland to assess their applicability in avalanche forecasting. Support Vector Machines (SVMs) belong to a family of theoretically based techniques from machine learning and are designed to deal with high dimensional data. Initial experiments showed that SVMs gave results which were comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
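
For concreteness, here is a minimal sketch of an SVM producing both categorical and probabilistic forecasts on synthetic stand-in data; the features, labels, and hyperparameters are assumptions, not the Lochaber setup.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic stand-in for daily observations: each row is a feature vector
# (new-snow depth, wind speed, temperature, ...); label 1 = avalanche day
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 1).astype(int)

# RBF-kernel SVM; probability=True adds Platt-scaled probability
# estimates on top of the usual categorical decisions
clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", C=1.0, probability=True))
clf.fit(X[:400], y[:400])

print("categorical forecast:", clf.predict(X[400:405]))
print("probabilistic forecast:", clf.predict_proba(X[400:405])[:, 1].round(2))
```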

Relevance: 20.00%

Publisher:

Abstract:

Although cross-sectional diffusion tensor imaging (DTI) studies revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, there were 40 stable (sMCI; 9 single domain amnestic, 7 single domain frontal, 24 multiple domain) and 27 were progressive (pMCI; 7 single domain amnestic, 4 single domain frontal, 16 multiple domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum, right temporal and frontal pathways. There were no significant group-level differences between sMCI versus pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for an individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5% in the multiple domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.
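
Individual classification accuracies like those reported here are commonly estimated with leave-one-out cross-validation. The sketch below shows that procedure for an SVM on synthetic stand-in features; the group sizes mirror the 40 sMCI and 27 pMCI subjects, but the data, kernel choice, and separation are assumptions.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(7)

# Synthetic stand-in: one row of DTI-derived features per subject,
# label 0 = stable MCI, 1 = progressive MCI (group sizes as in the study)
n_stable, n_progressive, n_features = 40, 27, 50
X = np.vstack([rng.normal(0.0, 1.0, (n_stable, n_features)),
               rng.normal(0.6, 1.0, (n_progressive, n_features))])
y = np.array([0] * n_stable + [1] * n_progressive)

# Leave-one-out: each subject is classified by an SVM trained on all
# the others, mimicking prediction for a new individual
acc = cross_val_score(SVC(kernel="linear", C=1.0), X, y,
                      cv=LeaveOneOut()).mean()
print(f"LOO classification accuracy: {acc:.1%}")
```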

Relevance: 20.00%

Publisher:

Abstract:

Proponents of microalgae biofuel technologies often claim that the world demand for liquid fuels, about 5 trillion liters per year, could be supplied by microalgae cultivated on only a few tens of millions of hectares (a claim implying yields on the order of 100,000 liters per hectare per year). This perspective reviews the subject and points out that such projections are greatly exaggerated, because (1) the productivities achieved in large-scale commercial microalgae production systems, operated year-round, do not surpass those of irrigated tropical crops; (2) cultivating, harvesting, and processing microalgae solely for the production of biofuels is simply too expensive using current or prospective technology; and (3) the currently available (limited) data suggest that the energy balance of algal biofuels is very poor. Thus, microalgal biofuels are no panacea for depleting oil or global warming, and are unlikely to save the internal combustion machine.