82 results for Bookkeeping machines.
Abstract:
Objective: To assess the reproducibility and feasibility of a musculoskeletal ultrasound (US) score for rheumatoid arthritis (RA) among rheumatologists with diverse expertise in US, working in private or hospital practice.
Methods: The Swiss Sonography in Arthritis and Rheumatism (SONAR) group has developed a semi-quantitative score for RA using OMERACT criteria for synovitis and erosion. The score was taught to rheumatologists trained in US through two workshops. Subsequently, they were encouraged to practice in their offices. For the study, we used 6 US machines of different quality, each with a different patient. 19 readers, randomly selected among rheumatologists who had attended both workshops, were asked to score anonymously at least one patient. To assess whether certain factors influence the score, we asked each reader to answer a questionnaire describing his experience with US.
Results: The 19 rheumatologists performed 29 scans, each patient having been evaluated by 4 to 6 readers. Median time for exam completion was 20 minutes (range 15 to 60 min). 53% of the rheumatologists work in private practice. Graph 1 shows the global grey-scale score for each patient. Weighted kappa was calculated for each pair of readers using Stata 11. Almost all kappas indicating poor agreement were obtained with a low-quality device or by an assessor who had previously performed fewer than 5 scores himself.
Conclusions: This is the first study to show that a US score for RA is feasible for rheumatologists with diverse expertise in US, in both private and hospital practice. Reproducibility seemed to be influenced by the quality of the device and previous experience with the score.
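As an illustration of the agreement statistic used above, here is a minimal sketch of pairwise weighted kappa in Python (the original analysis used Stata 11; the reader names and score arrays below are invented placeholders, not study data):

```python
# A minimal sketch of the pairwise weighted-kappa computation described
# in the abstract (the original analysis used Stata 11). The score
# arrays below are hypothetical placeholders, not study data.
from itertools import combinations

import numpy as np
from sklearn.metrics import cohen_kappa_score

# Semi-quantitative SONAR-style scores (0-3) given by three readers
# to the same set of joints -- invented values for illustration only.
reader_scores = {
    "reader_1": np.array([0, 1, 2, 2, 3, 1, 0, 2]),
    "reader_2": np.array([0, 1, 2, 3, 3, 1, 1, 2]),
    "reader_3": np.array([1, 1, 1, 2, 3, 0, 0, 2]),
}

# Weighted kappa for every pair of readers; linear weights penalize
# disagreements proportionally to their distance on the ordinal scale.
for (name_a, a), (name_b, b) in combinations(reader_scores.items(), 2):
    kappa = cohen_kappa_score(a, b, weights="linear")
    print(f"{name_a} vs {name_b}: weighted kappa = {kappa:.2f}")
```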
Abstract:
Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated against measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10×10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used only for conventional treatments using large open fields. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly outside the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risks between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
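To make the risk-estimation step concrete, here is a hedged sketch of one common approach, a linear excess-absolute-risk model applied to mean organ doses; the coefficients, doses and organ list below are invented placeholders, not the thesis' data or models:

```python
# Illustrative only: a linear (no-threshold) excess-absolute-risk
# calculation from mean organ doses. Neither the coefficients nor the
# doses are taken from the thesis; they are placeholder values meant
# to show the shape of such a risk comparison between two techniques.
mean_organ_dose_gy = {            # hypothetical out-of-field doses
    "contralateral breast": {"wedged 3D-CRT": 0.90, "hybrid IMRT": 0.45},
    "ipsilateral lung":     {"wedged 3D-CRT": 1.80, "hybrid IMRT": 1.20},
}

# Hypothetical excess absolute risk coefficients (cases per 10,000
# person-years per Gy), standing in for published model parameters.
ear_per_gy = {"contralateral breast": 9.4, "ipsilateral lung": 7.5}

for organ, doses in mean_organ_dose_gy.items():
    for technique, dose in doses.items():
        ear = ear_per_gy[organ] * dose  # linear no-threshold assumption
        print(f"{organ}, {technique}: "
              f"{ear:.1f} excess cases per 10,000 person-years")
```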
Abstract:
In recent years, kernel methods have proven to be very powerful tools in many application domains in general, and in remote sensing image classification in particular. The special characteristics of remote sensing images (high dimension, few labeled samples and different noise sources) are efficiently dealt with by kernel machines. In this paper, we propose the use of structured output learning to improve kernel-based remote sensing image classification. Structured output learning is concerned with the design of machine learning algorithms that not only implement the input-output mapping, but also take into account the relations between output labels, thus generalizing unstructured kernel methods. We analyze the framework and introduce it to the remote sensing community. Output similarity is here encoded into SVM classifiers by modifying the model loss function and the kernel function, either independently or jointly. Experiments on a very high resolution (VHR) image classification problem show promising results and open a wide field of research with structured output kernel methods.
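As a sketch of the underlying idea, output similarity can be encoded into the kernel by multiplying an input kernel with a label-similarity term, k((x, y), (x', y')) = k_x(x, x') · s(y, y'). The class set and similarity values below are invented for illustration and are not the paper's design:

```python
# A minimal sketch of the joint input-output kernel idea behind
# structured output SVMs: k((x, y), (x', y')) = k_x(x, x') * s(y, y'),
# where s encodes how similar two land-cover labels are. The label
# similarity matrix below is an invented example, not the paper's.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

# Hypothetical 4-class problem (e.g. roads, buildings, trees, grass).
label_similarity = np.array([
    [1.0, 0.6, 0.1, 0.1],   # roads are most confusable with buildings
    [0.6, 1.0, 0.1, 0.2],
    [0.1, 0.1, 1.0, 0.7],   # trees are most confusable with grass
    [0.1, 0.2, 0.7, 1.0],
])

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 10))          # 6 pixels, 10 spectral features
y = np.array([0, 1, 2, 3, 0, 2])      # their (hypothetical) labels

# Joint kernel between all (pixel, label) training pairs: the input
# kernel is modulated by how similar the attached labels are.
K_x = rbf_kernel(X, gamma=0.1)
K_joint = K_x * label_similarity[np.ix_(y, y)]
print(K_joint.round(2))
```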
Abstract:
In this book, the author proposes a theoretical conceptualization of the co-presence of multiple worlds within a single film, addressing various parameters (heterogeneity in the texture of the image, practices of alternating editing, a typology of embeddings, serial expansion, etc.) on the basis of a corpus of recent fiction films, most of which belong to the science-fiction genre (Matrix, Dark City, Avalon, Resident Evil, Avatar, ...). The notion of "diegesis", which originates in filmology, is developed here both in terms of the potential for autonomy displayed by the world-centered conception that seems dominant today in the era of video games, in its links with narrative, and from an intermedial perspective. The films discussed have the particularity of staging machines that allow characters to pass from one world to another: the modes of figuration of these technologies are investigated in relation to the imaginaries of the cinematic apparatus and the potentialities of editing. Comparisons between films (Tron and its recent sequel, Total Recall and its remake) and between filmic and literary works (in particular the short stories of Philip K. Dick and Galouye's Simulacron-3) constitute an analytical tool for grasping the contemporaneity of this problematic, considered on the aesthetic level in the context of digital imagery.
Abstract:
An active learning method is proposed for the semi-automatic selection of training sets in remote sensing image classification. The method iteratively adds to the current training set the unlabeled pixels for which the predictions of an ensemble of classifiers based on bagged training sets show maximum entropy. This way, the algorithm selects the pixels that are the most uncertain and that will improve the model if added to the training set; the user is asked to label such pixels at each iteration. Experiments using support vector machines (SVM) on an 8-class QuickBird image show the excellent performance of the method, which equals the accuracies both of a model trained with ten times more pixels and of a model whose training set has been built using a state-of-the-art SVM-specific active learning method.
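A minimal sketch of this selection loop, assuming a bagged SVM ensemble and synthetic data (the class count, feature dimensions and query batch size below are invented placeholders):

```python
# A minimal sketch of the entropy-based active selection described in
# the abstract: an ensemble of SVMs trained on bootstrap replicates
# votes on each unlabeled pixel, and the pixels with maximum vote
# entropy are sent to the user for labeling. All data is synthetic.
import numpy as np
from scipy.stats import entropy
from sklearn.ensemble import BaggingClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X_labeled = rng.normal(size=(40, 8))          # small initial training set
y_labeled = rng.integers(0, 4, size=40)       # 4 hypothetical classes
X_unlabeled = rng.normal(size=(500, 8))       # candidate pixels

# Bagged SVM ensemble; disagreement between members signals uncertainty.
ensemble = BaggingClassifier(SVC(kernel="rbf", gamma="scale"),
                             n_estimators=10, random_state=0)
ensemble.fit(X_labeled, y_labeled)

# Vote distribution per unlabeled pixel -> entropy of the votes.
votes = np.stack([est.predict(X_unlabeled)
                  for est in ensemble.estimators_])
vote_counts = np.apply_along_axis(
    lambda v: np.bincount(v, minlength=4), 0, votes)
uncertainty = entropy(vote_counts, base=2)

# Ask the user to label the 10 most uncertain pixels at this iteration.
query_idx = np.argsort(uncertainty)[-10:]
print("pixels to label:", query_idx)
```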
Abstract:
This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the problems of analysing and modelling geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, where the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the consideration of real-space constraints like geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM, automatic mapping of soil and water system pollution using ANN, natural hazard risk analysis (avalanches, landslides), and assessments of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
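As a toy illustration of this setting, a support vector regression model can map coordinates plus DEM-derived features to an environmental variable; all data below is synthetic and does not come from the case studies:

```python
# A toy sketch of the "machine learning for geospatial mapping" setting
# described above: a Support Vector Regression model learns a mapping
# from geographic coordinates plus terrain features to a measured
# variable. All values are synthetic stand-ins, not case-study data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n = 300
coords = rng.uniform(0, 100, size=(n, 2))        # x, y in km
elevation = 500 + 3 * coords[:, 0] + rng.normal(0, 20, n)
slope = rng.uniform(0, 30, n)                    # degrees

# Geo-feature space: coordinates augmented with DEM-derived features.
X = np.column_stack([coords, elevation, slope])
# Hypothetical target, e.g. a soil property driven by terrain + noise.
y = 0.05 * elevation - 0.3 * slope + rng.normal(0, 2, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:200], y[:200])
print("R^2 on held-out locations:", model.score(X[200:], y[200:]))
```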
Abstract:
Technology (i.e. tools, methods of cultivation and domestication, systems of construction and appropriation, machines) has increased the vital rates of humans, and is one of the defining features of the transition from Malthusian ecological stagnation to a potentially perpetual rising population growth. Maladaptations, on the other hand, encompass behaviours, customs and practices that decrease the vital rates of individuals. Technology and maladaptations are part of the total stock of culture carried by the individuals in a population. Here, we develop a quantitative model for the coevolution of cumulative adaptive technology and maladaptive culture in a 'producer-scrounger' game, which can also usefully be interpreted as an 'individual-social' learner interaction. Producers (individual learners) are assumed to invent new adaptations and maladaptations by trial-and-error learning, insight or deduction, and they pay the cost of innovation. Scroungers (social learners) are assumed to copy or imitate (cultural transmission) both the adaptations and maladaptations generated by producers. We show that the coevolutionary dynamics of producers and scroungers in the presence of cultural transmission can have a variety of effects on population carrying capacity. From stable polymorphism, where scroungers bring an advantage to the population (increase in carrying capacity), to periodic cycling, where scroungers decrease carrying capacity, we find that selection-driven cultural innovation and transmission may send a population on the path of indefinite growth or to extinction.
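For intuition only, here is a deliberately simple toy dynamic in the spirit of a producer-scrounger replicator model; it is not the model developed in the paper, and every parameter below is invented:

```python
# A deliberately simple toy, NOT the paper's model: producers pay a
# cost to innovate adaptive (and some maladaptive) culture; scroungers
# copy it for free but compete with each other; carrying capacity
# tracks the net cultural stock. All parameters are invented.
p = 0.5                            # frequency of producers
adaptive, maladaptive = 1.0, 0.2   # stocks of cultural traits
K0 = 100.0

for t in range(200):
    s = 1.0 - p                    # frequency of scroungers
    # Innovation by producers; both trait types decay slowly.
    adaptive += 0.02 * p - 0.01 * adaptive
    maladaptive += 0.008 * p - 0.01 * maladaptive
    # Carrying capacity rises with adaptations, falls with maladaptations.
    K = K0 * (1.0 + 0.5 * adaptive - 0.8 * maladaptive)
    # Payoffs: producers pay an innovation cost; scrounger returns are
    # diluted as scroungers become common (producer-scrounger game).
    w_p = adaptive - 0.3
    w_s = adaptive / (1.0 + 4.0 * s)
    # Replicator dynamics on the producer frequency.
    p += 0.05 * p * s * (w_p - w_s)
    p = min(max(p, 0.001), 0.999)

print(f"producer frequency {p:.2f}, carrying capacity K = {K:.1f}")
```

With these invented parameters the toy settles into a stable producer-scrounger polymorphism; other parameter choices produce the cycling or runaway growth regimes the abstract mentions.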
Abstract:
Raman spectroscopy has become an attractive tool for the analysis of pharmaceutical solid dosage forms. In the present study it is used to ensure the identity of tablets. The two main applications of this method are the release of final products in quality control and the detection of counterfeits. Twenty-five product families of tablets have been included in the spectral library, and a non-linear classification method, Support Vector Machines (SVMs), has been employed. Two calibrations have been developed in cascade: the first one identifies the product family, while the second one specifies the formulation. A product family comprises different formulations that have the same active pharmaceutical ingredient (API) but in different amounts. Once the tablets have been classified by the SVM model, API peak detection and correlation are applied in order to make the identification specific and, in the future, to allow counterfeits to be discriminated from genuine products. This calibration strategy enables the identification of the 25 product families without error and in the absence of prior information about the sample. Raman spectroscopy coupled with chemometrics is therefore a fast and accurate tool for the identification of pharmaceutical tablets.
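A hedged sketch of such a two-stage cascade follows; the spectra, family and formulation labels below are synthetic stand-ins, not the study's spectral library:

```python
# A minimal sketch of the two-stage ("cascade") SVM strategy the
# abstract describes: a first classifier assigns a spectrum to a
# product family, then a family-specific classifier assigns the
# formulation. Spectra and labels below are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n, n_wavenumbers = 200, 60
X = rng.normal(size=(n, n_wavenumbers))      # fake Raman spectra
family = rng.integers(0, 5, size=n)          # 5 hypothetical families
formulation = rng.integers(0, 3, size=n)     # 3 dosages per family

# Stage 1: one SVM over all spectra predicts the product family.
stage1 = SVC(kernel="rbf", gamma="scale").fit(X, family)

# Stage 2: one SVM per family predicts the formulation (API amount).
stage2 = {f: SVC(kernel="rbf", gamma="scale").fit(
              X[family == f], formulation[family == f])
          for f in np.unique(family)}

def identify(spectrum):
    """Cascade prediction: family first, then formulation within it."""
    f = int(stage1.predict(spectrum[None, :])[0])
    d = int(stage2[f].predict(spectrum[None, :])[0])
    return f, d

print(identify(X[0]))
```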
Abstract:
The 2008 Data Fusion Contest organized by the IEEE Geoscience and Remote Sensing Data Fusion Technical Committee deals with the classification of high-resolution hyperspectral data from an urban area. Unlike in previous issues of the contest, the goal was not only to identify the best algorithm but also to provide a collaborative effort: the decision fusion of the best individual algorithms aimed at further improving classification performance, and the best algorithms were ranked according to their relative contribution to the decision fusion. This paper presents the five awarded algorithms and the conclusions of the contest, stressing the importance of decision fusion, dimension reduction, and supervised classification methods, such as neural networks and support vector machines.
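As a minimal illustration of decision fusion, here is a hard majority vote over heterogeneous classifiers; the data, classifier choices and sizes below are invented and do not reproduce the contest algorithms:

```python
# A minimal sketch of the decision-fusion step highlighted by the
# contest: individual classifiers are combined by majority vote over
# their predicted labels. Data and model choices are synthetic.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))            # stand-in hyperspectral pixels
y = rng.integers(0, 5, size=300)          # 5 hypothetical urban classes

fusion = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="rbf", gamma="scale")),
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
    ],
    voting="hard",                        # fuse the decisions (labels)
)
fusion.fit(X[:200], y[:200])
print("fused accuracy:", fusion.score(X[200:], y[200:]))
```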
Abstract:
Exposure to fine particles and noise has been linked to cardiovascular diseases and elevated cardiovascular mortality, affecting the worldwide population. Residence and/or work in proximity to emission sources, for example road traffic, leads to elevated exposure and a higher risk of adverse health effects. Highway maintenance workers spend most of their work time in traffic and are regularly exposed to particles and noise. The aims of this thesis were to provide a better understanding of the workers' mixed exposure to particles and noise and to assess cardiopulmonary short-term health effects in relation to this exposure. Exposure and health data were collected in collaboration with 8 maintenance centers of the Swiss Road Maintenance Services located in the cantons of Bern, Fribourg and Vaud in western Switzerland. Repeated measurements with 18 subjects were conducted during 50 non-consecutive work shifts between May 2010 and February 2012, equally distributed over all seasons. In the first part of this thesis, we tested and validated measurements of ultrafine particles with a miniature diffusion size classifier (miniDiSC), a novel particle-counting device that was used for the exposure assessment during highway maintenance work. We found that particle numbers and average particle size measured by the miniDiSC were highly correlated with data from the P-TRAK, a condensation particle counter (CPC), as well as from a scanning mobility particle sizer (SMPS). However, the miniDiSC counted significantly more particles than the P-TRAK and significantly fewer than the SMPS over its full size range. Our data suggest that the instrument-specific cut-offs were the main reason for the different particle counts. The first main objective of this thesis was to investigate the exposure of highway maintenance workers to air pollutants and noise in relation to the different maintenance activities. We have seen that the workers are regularly exposed to high particle and noise levels. This was a consequence of close proximity to highway traffic and the use of motorized working equipment such as brush cutters, chain saws, generators and pneumatic hammers, during which the highest exposure levels occurred. Although exposure to air pollutants was not critical when compared with occupational exposure limits, the elevated exposure to particles and noise may lead to a higher risk of cardiovascular disease in this worker population. The second main objective was to investigate cardiopulmonary short-term health effects in relation to the particle and noise exposure during highway maintenance work. We observed a PM2.5-related increase of the acute-phase inflammation markers C-reactive protein and serum amyloid A and a decrease of TNFα. Heart rate variability increased as a consequence of both particle and noise exposure. Increased high-frequency power indicated a stronger parasympathetic influence on the heart. Elevated noise levels during recreational time, after work, were related to increased blood pressure. Our data confirmed that highway maintenance workers are exposed to elevated levels of particles and noise compared to the average population. This exposure poses a cardiovascular health risk, and it is therefore important to make efforts to better protect the workers' health. The use of cleaner machines during maintenance work would be a major step to improve the workers' situation.
Furthermore, regulatory policies with the aim of reducing combustion and non-combustion emissions from road traffic are important for the protection of workers in traffic environments and the entire population.
Abstract:
METHODS: 20 inactive subjects (10 male, 10 female) underwent a single typical WBV session, with a total of 27 minutes of exercise on an oscillating platform at 26 Hz, involving upper and lower body muscles. Each exercise lasted 90 seconds, with 40-second pauses in between. Muscle enzymes (CK, transaminase, LDH, troponin I) were measured before and at 24, 48 and 96 hours post-exercise. Lactate was measured immediately after the session. Muscle aches were assessed during the 4 days post-exercise.
RESULTS: Subjects' mean age was 23.0 ± 3.5 years (male) and 22.4 ± 1.4 years (female), with BMI of 22.8 ± 2.3 and 22.1 ± 1.9, respectively; all had been inactive for at least 12 months. Post-exercise lactatemia was 10.0 ± 2.4 and 6.9 ± 2.4. CK elevation was significant (at least a doubling of baseline values) in 1 male and 4 female subjects, while values remained at baseline for the remaining 15 subjects. One female subject peaked at 3520 U/l at 96 hours post-exercise, and all but one peaked at the same late time point. Troponin and CK-MB never increased. No correlation was found between muscle soreness and CK levels.
CONCLUSIONS: WBV can elicit important anaerobic processes, reflected by the high lactatemia, and CK elevation was significant in 25% of subjects, peaking on the fourth day after exercise for 80% of those. Such exercises should not be regarded as trivial and "easy" as they are advertised, since they can provoke substantial anaerobic effort and CK elevation. Many fragile patients or patients treated for cardiovascular disease could benefit from WBV, but it is important to recognise these potential effects, especially in those treated with statins, which are known to cause myopathy and CK elevation. Before attributing CK elevation to a side effect of an important therapeutic agent, doctors should be aware of the possible interaction with these not-so-harmless exercising machines.
Abstract:
Cannabis cultivation for drug production is forbidden in Switzerland. Law enforcement authorities therefore regularly ask forensic laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether a plantation is legal. As required by the official EU analysis protocol, the THC content of cannabis is measured in the flowers at maturity. When laboratories are confronted with seedlings, they have to grow the plants to maturity, a time-consuming and costly procedure. This study investigated the discrimination of fibre-type from drug-type Cannabis seedlings by analysing the compounds found in their leaves and using chemometric tools. 11 legal varieties allowed by the Swiss Federal Office for Agriculture and 13 illegal ones were greenhouse-grown and analysed using a gas chromatograph interfaced with a mass spectrometer. Compounds that show high discrimination capability in seedlings were identified, and a support vector machines (SVMs) analysis was used to classify the cannabis samples. The overall set of samples shows a classification rate above 99% with false positive rates of less than 2%. This model thus allows discrimination between fibre- and drug-type Cannabis at an early stage of growth, so it is not necessary to wait for the plants to mature and quantify their THC content in order to determine their chemotype. This procedure could be used for the control of legal (fibre-type) and illegal (drug-type) Cannabis production.
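A sketch of how such a classifier and its error rates could be evaluated follows; the feature matrix and labels below are synthetic placeholders, not GC-MS data:

```python
# A hedged sketch of the evaluation reported above: an SVM separates
# fibre-type from drug-type seedlings from leaf-compound profiles, and
# both classification rate and false-positive rate are derived from
# the confusion matrix. Features here are synthetic stand-ins.
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

rng = np.random.default_rng(5)
# 24 varieties x 10 plants, 15 hypothetical leaf-compound abundances.
X = rng.normal(size=(240, 15))
y = rng.integers(0, 2, size=240)      # 0 = fibre type, 1 = drug type

pred = cross_val_predict(SVC(kernel="rbf", gamma="scale"), X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"classification rate: {(tp + tn) / len(y):.1%}")
print(f"false positive rate: {fp / (fp + tn):.1%}")
```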
Abstract:
Monitoring of posture allocations and activities enables accurate estimation of energy expenditure and may aid in obesity prevention and treatment. At present, accurate devices rely on multiple sensors distributed on the body and thus may be too obtrusive for everyday use. This paper presents a novel wearable sensor, which is capable of very accurate recognition of common postures and activities. The patterns of heel acceleration and plantar pressure uniquely characterize postures and typical activities while requiring minimal preprocessing and no feature extraction. The shoe sensor was tested on nine adults performing sitting and standing postures and walking, running, stair ascent/descent and cycling. Support vector machines (SVMs) were used for classification. A fourfold validation of a six-class subject-independent group model showed 95.2% average accuracy of posture/activity classification on the full sensor set and over 98% on an optimized sensor set. Using a combination of acceleration and pressure also enabled a pronounced reduction of the sampling frequency (from 25 to 1 Hz) without significant loss of accuracy (98% versus 93%). Subjects had shoe sizes (US) of M9.5-11 and W7-9 and body mass indices from 18.1 to 39.4 kg/m², suggesting that the device can be used by individuals with varying anthropometric characteristics.
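A minimal sketch of subject-independent validation, where folds never mix a wearer's data between training and test; the feature values below are synthetic, while the subject count and fold count mirror the abstract:

```python
# A minimal sketch of the subject-independent ("group") validation the
# abstract describes: folds are split so that no subject contributes
# to both training and test data. Sensor features are synthetic.
import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(9)
n = 900
X = rng.normal(size=(n, 12))           # acceleration + pressure features
y = rng.integers(0, 6, size=n)         # 6 posture/activity classes
subject = rng.integers(0, 9, size=n)   # 9 wearers, as in the study

scores = cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y,
                         cv=GroupKFold(n_splits=4), groups=subject)
print("fold accuracies:", scores.round(3))
```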
Abstract:
The present study aims to analyze the attitudes and beliefs of the French-speaking general Swiss population (n = 2500; female n = 1280; mean age = 43 years) regarding gambling, which to date have been studied almost exclusively in North American and Australian contexts. Beliefs related to gambling include the perceived effectiveness of preventive measures toward gambling, the comparative risk assessment of different addictive behaviors, and the perceived risks of different types of gambling; attitudes relate to the gambler's personality. The general population perceived gambling rather negatively and was conscious of its potential risks; indeed, 59.0% of the sample identified gambling as an addictive practice. Slot machines were estimated to bear the highest risk. Compared with women and older people, men and young people indicated more positive beliefs about gambling; they perceived gambling as less addictive, supported structural preventive measures less often, and perceived gambling as a less serious problem for society. Gamblers were more likely to put their practices into perspective, perceiving gambling more positively than non-gamblers. General population surveys on such beliefs can deliver insights for preventive actions, which should be targeted at young men, who showed more favorable views of gambling; such views have been shown to be associated with an increased risk of problematic gambling.