962 results for Desiring Machines


Relevance: 10.00%

Abstract:

MOTIVATION: Analysis of millions of pyro-sequences is currently playing a crucial role in the advance of environmental microbiology. Taxonomy-independent, i.e. unsupervised, clustering of these sequences is essential for the definition of Operational Taxonomic Units. For this application, reproducibility and robustness should be the most sought-after qualities, but they have thus far largely been overlooked. RESULTS: More than 1 million hyper-variable internal transcribed spacer 1 (ITS1) sequences of fungal origin have been analyzed. The ITS1 sequences were first properly extracted from 454 reads using generalized profiles. Then otupipe, cd-hit-454, ESPRIT-Tree and DBC454, a new algorithm presented here, were used to analyze the sequences. A numerical assay was developed to measure the reproducibility and robustness of these algorithms. DBC454 was the most robust, closely followed by ESPRIT-Tree. DBC454 features density-based hierarchical clustering, which complements the other methods by providing insights into the structure of the data. AVAILABILITY: An executable is freely available for non-commercial users at ftp://ftp.vital-it.ch/tools/dbc454. It is designed to run under MPI on a cluster of 64-bit Linux machines running Red Hat 4.x, or on a multi-core OSX system. CONTACT: dbc454@vital-it.ch or nicolas.guex@isb-sib.ch.
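DBC454 itself is distributed only as a compiled MPI executable, so its internals are not shown here. As a loose illustration of the density-based clustering idea it builds on, below is a minimal pure-Python DBSCAN-style sketch on hypothetical 2-D feature points; all data and parameter values are invented for illustration and are not the paper's method:

```python
def region_query(points, i, eps):
    """Indices of all points within eps of points[i] (brute force)."""
    px, py = points[i]
    return [j for j, (qx, qy) in enumerate(points)
            if (qx - px) ** 2 + (qy - py) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    """Density-based clustering: returns one cluster id per point, -1 = noise."""
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbours = region_query(points, i, eps)
        if len(neighbours) < min_pts:
            labels[i] = -1          # provisionally noise (may become border)
            continue
        labels[i] = cid             # i is a core point: start a new cluster
        seeds = list(neighbours)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid     # border point: attach, do not expand
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = region_query(points, j, eps)
            if len(jn) >= min_pts:  # j is also core: expand through it
                seeds.extend(jn)
        cid += 1
    return labels
```

Points in dense neighbourhoods grow clusters outward from core points, while isolated points stay labelled as noise; the hierarchical variant in DBC454 repeats this at varying density thresholds.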

Relevance: 10.00%

Abstract:

Because of the development of modern transportation facilities, an ever-rising number of individuals, including many patients with preexisting diseases, visit high-altitude locations (>2500 m). High-altitude exposure triggers a series of physiologic responses intended to maintain adequate tissue oxygenation. Even in normal subjects, there is enormous interindividual variability in these responses, which may be further amplified by environmental factors such as cold temperature, low humidity, exercise, and stress. These adaptive mechanisms, although generally tolerated by most healthy subjects, may induce major problems in patients with preexisting cardiovascular diseases in whom the functional reserves are already limited. Preexposure assessment of patients helps to minimize risk and detect contraindications to high-altitude exposure. Moreover, the great variability and nonpredictability of the adaptive response should encourage physicians counseling such patients to adopt a cautious approach. Here, we will briefly review how high-altitude adjustments may interfere with and aggravate/decompensate preexisting cardiovascular diseases. Moreover, we will provide practical recommendations on how to investigate and counsel patients with cardiovascular disease desiring to travel to high-altitude locations.

Relevance: 10.00%

Abstract:

In severe forms of Diamond-Blackfan anemia, preimplantation genetic diagnosis (PGD) of histocompatibility leukocyte antigen-compatible embryos, which enables the next sibling in the family to serve as a stem-cell transplantation donor, constitutes the sole lasting cure capable of ending the enduring need for repeated transfusions. We report here an open collaboration between two renowned institutions to provide this treatment to a family who desired it but resided where the preimplantation genetic diagnosis procedure is banned.

Relevance: 10.00%

Abstract:

Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour (NN) methods, which are known to have limitations when dealing with high-dimensional data. Support Vector Machines (SVMs) belong to a family of theoretically grounded techniques from machine learning and are designed to deal with high-dimensional data. We apply SVMs to a dataset from Lochaber, Scotland, to assess their applicability in avalanche forecasting. Initial experiments showed that SVMs gave results comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
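A compact SVM demonstration needs a numerical library, but the nearest-neighbour baseline the abstract compares against can be sketched in a few lines of plain Python. The feature vectors and labels below are hypothetical stand-ins for real snowpack and weather observations:

```python
import math
from collections import Counter

def knn_forecast(train, query, k=3):
    """Majority vote among the k training days closest to the query day.

    train: list of (feature_vector, label) pairs; features might be
    quantities such as new-snow depth and wind speed, but the values
    and labels used here are invented for illustration.
    """
    nearest = sorted(train, key=lambda fl: math.dist(fl[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

A query day is classified by the outcomes of the most similar historical days, which is exactly where high-dimensional feature spaces become problematic: distances lose contrast as dimensions are added, which motivates the SVM comparison in the paper.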

Relevance: 10.00%

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to achieve efficient indoor radon decision making.
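Of the exploratory tools mentioned, cell declustering is simple enough to sketch. This toy 1-D version (illustrative only, not the thesis code) downweights samples falling in densely sampled cells so that the weighted histogram better approaches the global one:

```python
from collections import Counter

def cell_decluster_weights(xs, cell_size):
    """One weight per sample; weights sum to 1 and shrink in crowded cells."""
    cells = [int(x // cell_size) for x in xs]     # cell index per sample
    counts = Counter(cells)                       # samples per occupied cell
    n_occupied = len(counts)
    # each occupied cell gets equal total weight, split among its samples
    return [1.0 / (n_occupied * counts[c]) for c in cells]
```

Three samples crowded into one cell share that cell's weight, while an isolated sample keeps a full cell's weight to itself; this is the basic mechanism behind approaching the global histogram parameters from clustered surveys.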

Relevance: 10.00%

Abstract:

Although cross-sectional diffusion tensor imaging (DTI) studies revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, 40 were stable (sMCI; 9 single-domain amnestic, 7 single-domain frontal, 24 multiple-domain) and 27 were progressive (pMCI; 7 single-domain amnestic, 4 single-domain frontal, 16 multiple-domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for an individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5% in the multiple-domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.
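The study's individual classification uses SVMs on voxel-wise diffusivity maps, which cannot be reproduced compactly here. As a hedged stand-in, the sketch below shows the same evaluation pattern, individual prediction scored by leave-one-out cross-validation, using a much simpler nearest-centroid classifier on made-up scalar FA values:

```python
import math

def nearest_centroid(train, x):
    """Predict the label whose class centroid is closest to x."""
    groups = {}
    for feats, label in train:
        groups.setdefault(label, []).append(feats)
    best, best_d = None, float("inf")
    for label, members in groups.items():
        centroid = [sum(col) / len(members) for col in zip(*members)]
        d = math.dist(centroid, x)
        if d < best_d:
            best, best_d = label, d
    return best

def loo_accuracy(data):
    """Leave-one-out: train on all-but-one subject, test on the held-out one."""
    hits = sum(nearest_centroid(data[:i] + data[i + 1:], feats) == label
               for i, (feats, label) in enumerate(data))
    return hits / len(data)
```

Leave-one-out scoring is what turns a group-level dataset into per-subject accuracies such as the 98.4% reported above; only the classifier differs in this sketch.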

Relevance: 10.00%

Abstract:

Experience has shown that milling machines with carbide-tipped teeth have the capability of profiling most asphalt concrete (AC) and portland cement concrete (PCC) pavements. Most standard milling operations today leave a very coarse, generally objectionable surface texture. This research utilized a Cedarapids Wirtgen 1900C mill modified by adding additional teeth. There were 411 teeth at a 5 millimeter transverse spacing (standard spacing is 15 mm) on a 6 ft. 4 in. long drum. The mill was used to profile and texture the surface of one AC and two PCC pavements. One year after the milling operation there is still some noticeable change in tire noise, but the general appearance is good. The milling operation with the additional teeth provides an acceptable surface texture with improved Friction Numbers when compared to a nonmilled surface.

Relevance: 10.00%

Abstract:

OBJECTIVE: To explore the user-friendliness and ergonomics of seven new-generation intensive care ventilators. DESIGN: Prospective task-performing study. SETTING: Intensive care research laboratory, university hospital. METHODS: Ten physicians experienced in mechanical ventilation, but without prior knowledge of the ventilators, were asked to perform eight specific tasks [turning the ventilator on; recognizing mode and parameters; recognizing and setting alarms; mode change; finding and activating the pre-oxygenation function; pressure support setting; stand-by; finding and activating non-invasive ventilation (NIV) mode]. The time needed for each task was compared to a reference time (established by a trained physiotherapist familiar with the devices). A time >180 s was considered a task failure. RESULTS: For each of the tests on the ventilators, all physicians' times were significantly higher than the reference time (P < 0.001). A mean of 13 +/- 8 task failures (16%) was observed per ventilator. The most frequently failed tasks were mode and parameter recognition, starting pressure support and finding the NIV mode. The least often failed tasks were turning on the pre-oxygenation function and alarm recognition and management. Overall, there was substantial heterogeneity between machines, some exhibiting better user-friendliness than others for certain tasks, but no ventilator was clearly better than the others on all points tested. CONCLUSIONS: The present study adds to the available literature outlining the ergonomic shortcomings of mechanical ventilators. These results suggest that closer ties between end-users and manufacturers should be promoted, at an early development phase of these machines, based on the scientific evaluation of the cognitive processes engaged by users in the clinical setting.

Relevance: 10.00%

Abstract:

PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcome improves. Often the dose distribution calculated with a commercial treatment planning system (TPS) is used to estimate radiation carcinogenesis for the radiotherapy patient. However, for locations beyond the treatment field borders, the accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan administered with a Cyberknife and a Tomotherapy machine and to compare the measurements to the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for CyberKnife and TomoTherapy machines were compared to the dose calculations from the TPS. RESULTS: Both TPSs underestimate the dose far from the target volume. Quantitatively, the Cyberknife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60, the Tomotherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the Cyberknife TPS can predict doses down to approximately 10 mGy/treatment Gy, the Tomotherapy TPS down to 0.75 mGy/treatment Gy. The Cyberknife TPS can then be used up to 10 cm from the PTV border, the Tomotherapy TPS up to 35 cm. CONCLUSIONS: We determined that the Cyberknife and Tomotherapy TPSs substantially underestimate the doses far away from the treated volume. It is recommended not to use out-of-field doses from the Cyberknife TPS for applications like modeling of second cancer induction. The Tomotherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm(3) large PTV).
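The study's acceptance criterion and underestimation factor reduce to simple arithmetic, sketched below; the numeric values in the usage assertions are illustrative, not the paper's measurements:

```python
def underestimation_factor(measured, predicted):
    """How many times higher the measured out-of-field dose is than the TPS value.

    Both doses in the same units, e.g. mGy per treatment Gy.
    """
    return measured / predicted

def within_tolerance(measured, predicted, tol=0.5):
    """True if the TPS prediction is within tol relative error of the measurement.

    tol=0.5 mirrors the 50% dose uncertainty the study accepts when judging
    how far from the PTV border a TPS can still be trusted.
    """
    return abs(measured - predicted) / measured <= tol
```

Applied point by point along the phantom, a check like this yields the distance (10 cm for the Cyberknife TPS, 35 cm for Tomotherapy) beyond which predictions fall outside the accepted uncertainty.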

Relevance: 10.00%

Abstract:

Although the radiation doses involved in basic research radiology are relatively small, the increasing number of radiological procedures makes the associated risks increasingly high. Quality control techniques in radiological practice have to ensure an adequate system of protection for people exposed to radiation. These techniques belong to a quality assurance program for X-ray machines and are designed to correct problems related to equipment and radiological practices, to obtain radiological images of high quality and to reduce unnecessary exposures.

Relevance: 10.00%

Abstract:

Background: Bone health is a concern when treating early stage breast cancer patients with adjuvant aromatase inhibitors. Early detection of patients (pts) at risk of osteoporosis and fractures may be helpful for starting preventive therapies and selecting the most appropriate endocrine therapy schedule. We present statistical models describing the evolution of lumbar and hip bone mineral density (BMD) in pts treated with tamoxifen (T), letrozole (L) and sequences of T and L. Methods: Available dual-energy x-ray absorptiometry exams (DXA) of pts treated in trial BIG 1-98 were retrospectively collected from Swiss centers. Treatment arms were: A) T for 5 years; B) L for 5 years; C) 2 years of T followed by 3 years of L; and D) 2 years of L followed by 3 years of T. Pts without DXA were used as a control for detecting selection biases. Patients randomized to arm A were subsequently allowed an unplanned switch from T to L. Allowing for variations between DXA machines and centres, two repeated-measures models, using a covariance structure that allows for different times between DXA, were used to estimate changes in hip and lumbar BMD (g/cm2) from trial randomization. Prospectively defined covariates, considered as fixed effects in the multivariable models in an intention-to-treat analysis, at the time of trial randomization were: age, height, weight, hysterectomy, race, known osteoporosis, tobacco use, prior bone fracture, prior hormone replacement therapy (HRT), bisphosphonate use and previous neo-/adjuvant chemotherapy (ChT). Similarly, the T-scores for lumbar and hip BMD measurements were modeled using a per-protocol approach (allowing for treatment switch in arm A), specifically studying the effect of each therapy upon T-score percentage. Results: A total of 247 out of 546 pts had between 1 and 5 DXA; a total of 576 DXA were collected. Numbers of DXA measurements per arm were: arm A, 133; B, 137; C, 141; and D, 135. The median follow-up time was 5.8 years.
Significant factors positively correlated with lumbar and hip BMD in the multivariate analysis were weight, previous HRT use, neo-/adjuvant ChT, hysterectomy and height. Significant negatively correlated factors in the models were osteoporosis, treatment arm (B/C/D vs. A), time since endocrine therapy start, age and smoking (current vs. never). Modeling the T-score percentage, differences from T to L were -4.199% (p = 0.036) and -4.907% (p = 0.025) for the hip and lumbar measurements respectively, before any treatment switch occurred. Conclusions: Our statistical models describe the lumbar and hip BMD evolution for pts treated with L and/or T. The results at both localisations confirm that, contrary to expectation, the sequential schedules do not seem less detrimental to BMD than L monotherapy. The estimated difference in BMD T-score percentage is at least 4% from T to L.

Relevance: 10.00%

Abstract:

Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a very large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g.
convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics problematic. The challenges explored during the thesis are manifold. First, the complexity of models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications within renewable resources assessment and open a route to decrease the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
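The thesis uses support vector regression on terrain features; as a loose stand-in for kernel-based spatial prediction, here is a Nadaraya-Watson kernel-weighted average with the same RBF kernel that underlies most SVM variants. The feature vectors and wind speeds are invented for illustration:

```python
import math

def rbf(x, y, gamma):
    """Gaussian (RBF) kernel: similarity decays with squared distance."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_predict(train, x, gamma=10.0):
    """Kernel-weighted average of training targets (Nadaraya-Watson).

    train: list of (feature_vector, target) pairs, e.g. hypothetical
    [elevation, slope] features mapped to observed wind speed.
    """
    weights = [(rbf(feats, x, gamma), target) for feats, target in train]
    total = sum(w for w, _ in weights)
    return sum(w * t for w, t in weights) / total
```

Predictions interpolate smoothly between stations that are similar in feature space rather than merely close in geographical coordinates, which is the key idea that lets such models regionalize variables showing no spatial auto-correlation in the original space.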

Relevance: 10.00%

Abstract:

The aim of this study was to compare the effects of two training means on explosive strength in 9 athletes of regional and national level (19.6 ± 2.6 years, 1.76 ± 7 m and 68.9 ± 3.5 kg) who trained during the competitive period for 6 weeks at a frequency of twice per week, following an increasing periodization. The subjects were divided into 3 groups of 3 athletes each, using different training means. The first experimental group (1) combined the vibration platform and inertial machines, with a vibration exposure time of 30'' at an intensity of 45 Hz and an amplitude of 5 mm, with a 1' pause, and 3 sets of 8 repetitions executed at maximum velocity in the concentric phase and under control in the eccentric phase, with a 3' pause, on the yo-yo machines. The free-weights experimental group (2) performed 4 exercises: half squat, plyometrics (stretch-shortening cycle), horizontal multi-jumps and accelerations, following the guidelines for explosive strength training proposed by Badillo and Gorostiaga (1995). A control group (3) performed no strength training. Before and after the intervention period the following tests were carried out: squat jump (SJ) and countermovement jump (CMJ). The results indicated that groups (1), (2) and (3) significantly decreased their SJ and CMJ. It is concluded that such training, whether combining vibratory stimuli with inertial machines or using free weights, appears to be a means that must be very carefully controlled and periodized, since in the competitive period the athlete accumulates much higher levels of both muscular and physiological fatigue than in other periods of the season.

Relevance: 10.00%

Abstract:

There is currently a wide variety of brands and models of voltage stabilizers, although all are designed and built for the same purpose: to deliver a stable voltage at the device's output. The reason voltage stabilizers are manufactured is that, despite technical advances and improvements in services in the energy sector, the frequent voltage sags and surges in electrical supply networks have not been eliminated, and these can cause malfunctions in electronic equipment. This is why most users of electronic equipment place a voltage stabilizer between the supply line and their devices. The objective of this work is to build a three-coil device with a ferromagnetic core to perform the function of an AC voltage stabilizer. By applying the saturable-core reluctance technique to this practical case, the goal is to develop a reliable and economical voltage stabilizer. This work should provide a solution for, among others, working environments that use AC-powered machines which are very sensitive to mains voltage variations and, moreover, should not entail a very high cost.

Relevance: 10.00%

Abstract:

The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are given, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample sets of data, that will allow both students and researchers to put the concepts rapidly into practice.