941 results for Work methods
Resumo:
A thorough evaluation of the data already available from the geophysical prospecting carried out during the first season of work, in 2006, at the archaeological site of Tchinguiz Tepe in Termez, was undertaken in order to decide the strategy to follow during the 2007 campaign. This evaluation led, on the one hand, to the decision to extend the geophysical prospecting at Tchinguiz Tepe and, on the other, to the selection of the exact locations where the archaeological interventions would be carried out. The main objective at the beginning of this new season was to crosscheck the reliability of the measurements and, at the same time, to establish the hitherto unknown archaeological and chronological sequence of Tchinguiz Tepe. Meanwhile, the geophysical prospecting was also extended to the outskirts of the city, where the localisation of a previously unknown Buddhist monastery was possible.
Resumo:
The International Pluridisciplinary Archaeological Expedition in Bactria (IPAEB) was created in 2006. The name underlines the international character of the team (which includes Uzbek, Spanish, French, British and Greek members), the presence of specialists from various fields apart from archaeology, and the fame of Bactria.
Resumo:
This year the development of our project can be divided into two clearly different parts: on the one hand, the laboratory work, in which the sampled ceramic individuals were prepared and analysed and the elaboration of the data obtained during the excavation of 2007 was finished; and, on the other hand, the field work carried out at the archaeological site during this year (2008). Within the framework of the analytical work, a significant number of ceramic individuals (144) from the different stratigraphical units of various areas of the excavations of Termez and Tchinguiz Tepe, sampled during the new and previous field seasons, have been archaeometrically characterised. This material included individuals dated to the Hellenistic and Sassanian periods, which has been confirmed by C14 dating of organic samples. At the same time, within the framework of the 2008 field work, the archaeological record whose study began during the excavation of 2007 has been completed, and two new archaeological records have been registered, one of which is located in the area of Tchinguiz Tepe. For the archaeological study, the information from the previous geophysical prospecting was taken into consideration, and the same methodology was applied to crosscheck the latter archaeological results.
Resumo:
Institute of Archaeology & Institute of Fine Arts, Academy of Science of the Republic of Uzbekistan; Universitat de Barcelona; Ministerio de Cultura (Gobierno de España); Ministerio de Ciencia e Innovación (Gobierno de España)
Resumo:
Background: Complex wounds pose a major challenge in reconstructive and trauma surgery. Several approaches to enhance the healing process have been proposed in the last decades. In this study we investigate the mechanism of action of the Vacuum Assisted Closure (VAC) device in diabetic wounds. Methods: Full-thickness wounds were excised in diabetic mice and treated with the VAC device or its isolated components: an occlusive dressing (OD) alone, subatmospheric pressure at 125 mm Hg (Suction), and a polyurethane foam without (Foam) and with (Foamc) downward compression of approximately 125 mm Hg. The last groups were treated with either the complete VAC device (VAC) or with a silicone interface that allows fluid removal (Mepithel-VAC). The effects of the treatment modes on the wound surface were quantified by a two-dimensional immunohistochemical staging system based on vasculature, as defined by blood vessel density (CD31), and cell proliferation (defined by Ki67 positivity), 7 days post wounding. Finite element modelling was used to predict wound surface deformation under the dressing modes, and cross sections of in situ fixed tissues were used to measure actual microstrain. Results: The foam-wound interface of the Vacuum Assisted Closure device causes significant wound strains (60%), producing deformation at the single-cell level and leading to a profound upregulation of cell proliferation (4-fold) and angiogenesis (2.2-fold) compared to OD-treated wounds. Polyurethane foam exposure itself causes a rather unspecific angiogenic response (Foamc, 2-fold; Foam, 2.2-fold) without changes in the cell proliferation rate of the wound bed. Suction alone, without a specific interface, does not have an effect on the measured parameters, showing results similar to untreated wounds. A perforated silicone interface caused a significantly lower microdeformation of the wound bed, correlating with changes in the wound tissues.
Conclusion: The Vacuum Assisted Closure device induces significant tissue growth in diabetic wounds. The wound-foam interface under suction causes profound microdeformation that stimulates tissue growth through angiogenesis and cell proliferation. It needs to be taken into consideration that, in the clinical setting, different wound types may profit from different elements of this suction device.
Resumo:
Purpose: This study aimed to identify self-perception variables which may predict return to work (RTW) in orthopedic trauma patients 2 years after rehabilitation. Methods: A prospective cohort investigated 1,207 orthopedic trauma inpatients, hospitalised in rehabilitation clinics, at admission, discharge, and 2 years after discharge. Information on potential predictors was obtained from self-administered questionnaires. Multiple logistic regression models were applied. Results: In the final model, a higher likelihood of RTW was predicted by: better general health and lower pain at admission; health and pain improvements during hospitalisation; a lower Impact of Event Scale (IES-R) avoidance behaviour score; a higher IES-R hyperarousal score; a higher SF-36 mental score; and low perceived severity of the injury. Conclusion: RTW is predicted not only by perceived health, pain, and severity of the accident at the beginning of a rehabilitation program, but also by the changes in pain and health perceptions observed during hospitalisation.
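A multiple logistic regression model of the kind applied here maps each predictor to a log-odds contribution and converts the sum to a probability. A minimal sketch, with invented, purely illustrative coefficients (not the study's estimates):

```python
import math

def rtw_probability(health, pain, coef=None):
    """Predicted probability of return to work from a fitted logistic model.

    health: general-health score (higher = better), pain: pain score
    (higher = worse). The default coefficients are hypothetical
    placeholders chosen only to show the mechanics of the model.
    """
    if coef is None:
        coef = {"intercept": -1.0, "health": 0.05, "pain": -0.03}
    # Linear predictor (log-odds), then the inverse-logit transform.
    z = coef["intercept"] + coef["health"] * health + coef["pain"] * pain
    return 1.0 / (1.0 + math.exp(-z))

p_good = rtw_probability(health=80, pain=2)   # good health, little pain
p_poor = rtw_probability(health=30, pain=8)   # poor health, much pain
```

With these placeholder coefficients, better perceived health and lower pain push the predicted RTW probability up, mirroring the direction of the study's reported associations.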
Resumo:
BACKGROUND: In May 2010, Switzerland introduced a heterogeneous smoking ban in the hospitality sector. While the law leaves room for exceptions in some cantons, it is comprehensive in others. This longitudinal study uses different measurement methods to examine airborne nicotine levels in hospitality venues and the level of personal exposure of non-smoking hospitality workers before and after implementation of the law. METHODS: Personal exposure to second-hand smoke (SHS) was measured by three different methods: we compared a passive sampler called the MoNIC (Monitor of NICotine) badge to salivary cotinine and nicotine concentrations as well as questionnaire data. Badges allowed the number of passively smoked cigarettes to be estimated. They were placed at the venues as well as distributed to the participants for personal measurements. To assess personal exposure at work, a time-weighted average of the workplace badge measurements was calculated. RESULTS: Prior to the ban, smoke-exposed hospitality venues yielded a mean badge value of 4.48 (95% CI: 3.7 to 5.25; n = 214) cigarette equivalents/day. At follow-up, measurements in venues that had implemented a smoking ban significantly declined to an average of 0.31 (0.17 to 0.45; n = 37) (p = 0.001). Personal badge measurements also significantly decreased, from an average of 2.18 (1.31 to 3.05; n = 53) to 0.25 (0.13 to 0.36; n = 41) (p = 0.001). Spearman rank correlations between badge exposure measures and salivary measures were small to moderate (0.3 at maximum). CONCLUSIONS: Nicotine levels significantly decreased in all types of hospitality venues after implementation of the smoking ban. In-depth analyses demonstrated that a time-weighted average of the workplace badge measurements represented typical personal SHS exposure at work more reliably than personal exposure measures such as salivary cotinine and nicotine.
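The time-weighted average used to summarise workplace exposure weights each venue badge reading by the hours worked there. A minimal sketch of the calculation, with hypothetical readings rather than study data:

```python
def time_weighted_average(readings):
    """Time-weighted average exposure.

    readings: (cigarette_equivalents_per_day, hours_worked) pairs,
    one per workplace badge measurement. Each reading is weighted
    by the time the worker spent at that location.
    """
    total_hours = sum(hours for _, hours in readings)
    weighted = sum(value * hours for value, hours in readings)
    return weighted / total_hours

# Hypothetical shift: 6 h in a smoky bar area, 2 h in a smoke-free office.
exposure = time_weighted_average([(4.0, 6), (0.2, 2)])
```

The weighting matters because a short stint in a heavily smoke-exposed room and a long shift in a clean one contribute very differently to the day's total exposure.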
Resumo:
Résumé: Following recent technological advances, digital image archives have experienced unprecedented qualitative and quantitative growth. Despite the enormous possibilities they offer, these advances raise new questions about the processing of the masses of data acquired. This question is at the root of this Thesis: the problems of processing digital information at very high spatial and/or spectral resolution are addressed using statistical learning approaches, namely kernel methods. This Thesis studies image classification problems, that is, the categorisation of pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent. The emphasis is placed on the efficiency of the algorithms, as well as on their simplicity, so as to increase their potential for adoption by users. Moreover, the challenge of this Thesis is to remain close to the concrete problems of satellite-image users without losing sight of the interest of the proposed methods for the machine learning community from which they stem. In this sense, this work is deliberately transdisciplinary, maintaining a strong link between the two sciences in all the developments proposed. Four models are proposed: the first addresses the problem of high dimensionality and data redundancy with a model that optimises classification performance by adapting to the particularities of the image. This is made possible by a system for ranking the variables (the bands) that is optimised together with the base model: in doing so, only the variables important for solving the problem are used by the classifier.
The lack of labelled information, and the uncertainty about its relevance to the problem, are the source of the next two models, based respectively on active learning and semi-supervised methods: the first improves the quality of a training set through direct interaction between the user and the machine, while the second uses unlabelled pixels to improve the description of the available data and the robustness of the model. Finally, the last model proposed considers the more theoretical question of structure among the outputs: the integration of this source of information, never before considered in remote sensing, opens new research challenges.
Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009.
Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with approaches based on data-driven algorithms relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented. The accent is put on algorithmic efficiency and on the simplicity of the approaches proposed, to avoid overly complex models that users would not adopt.
The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, this work aims at building a bridge between the machine learning and remote sensing communities, and all the models proposed have been developed keeping in mind the need for such a synergy. Four models are proposed: first, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features. This model automatically provides an accurate classifier and a ranking of the relevance of the single features. The scarcity and unreliability of labeled information were the common root of the second and third models proposed: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine or use the unlabeled data to increase the robustness and quality of the description of the data. Both solutions have been explored, resulting in two methodological contributions, based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
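The idea of obtaining a band ranking jointly with the classifier can be illustrated, in a deliberately simplified form, by fitting a linear classifier and reading band relevance off the weight magnitudes. This is a toy stand-in on synthetic data (plain logistic regression, not the adaptive kernel model of the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "image": 200 pixels with 5 spectral bands; only bands 0 and 2
# actually determine the class label, the rest are noise.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 2 * X[:, 2] > 0).astype(float)

# Logistic regression fitted by gradient descent; band relevance is then
# read from the absolute value of the learned weights.
w = np.zeros(5)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))          # predicted class probabilities
    w -= 0.1 * X.T @ (p - y) / len(y)     # gradient step on the log-loss

ranking = np.argsort(-np.abs(w))          # most relevant bands first
```

In this toy setup the two informative bands end up at the top of the ranking, which is the behaviour the thesis exploits: the classifier and the relevance ordering of the features come out of a single training procedure.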
Resumo:
Abstract: This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering, which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as on very large problems.
Résumé: This research work concerns the development and application of so-called unsupervised learning methods. The applications targeted by these methods are the analysis of forensic data and the classification of hyperspectral images in remote sensing. First, an unsupervised classification methodology based on the symbolic optimisation of an inter-sample distance measure is proposed. This measure is obtained by optimising a cost function related to the preservation of a point's neighbourhood structure between the space of the initial variables and the space of the principal components. This method is applied to the analysis of forensic data and compared to a range of existing methods. Secondly, a method based on a joint optimisation of the feature-selection and classification tasks is implemented in a neural network and applied to various databases, including two hyperspectral images. The neural network is trained with a stochastic gradient algorithm, which makes this technique applicable to very high-resolution images. The results show that such a technique makes it possible to classify very large databases without difficulty and gives results that compare favourably with existing methods.
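Spectral clustering, the family of algorithms whose distance measure the first method makes more flexible, can be sketched in a few lines: build a pairwise affinity matrix, form the graph Laplacian, and read the partition off its second eigenvector. This toy version uses a fixed RBF affinity on synthetic 2-D points rather than the learned, symbolically optimised distance:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated groups of 2-D points (stand-ins for case-data samples).
X = np.vstack([rng.normal(0, 0.3, (20, 2)),
               rng.normal(4, 0.3, (20, 2))])

# RBF affinity with a fixed bandwidth; the thesis instead optimises the
# underlying distance measure with genetic programming.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-d2 / 2.0)
np.fill_diagonal(A, 0)

# Unnormalised graph Laplacian; for two well-separated groups, the sign of
# the second-smallest eigenvector (the Fiedler vector) splits the clusters.
L = np.diag(A.sum(1)) - A
vals, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
```

The quality of the split hinges entirely on the affinity: with a poorly chosen distance the Fiedler vector no longer separates the groups, which is precisely the flexibility problem the symbolic distance optimisation addresses.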
Resumo:
"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible. (Winship & Morgan, 1999, p. 659)." This quote from earlier work by Winship and Morgan, which was instrumental in setting the groundwork for their book, captures the essence of our review of Morgan and Winship's book: It is about causality in nonexperimental settings.
Resumo:
INTRODUCTION: Radiosurgery (RS) is gaining increasing acceptance in the upfront management of brain metastases (BM). It was initially used for so-called radioresistant metastases (melanoma, renal cell, sarcoma) because it allowed a higher dose to be delivered to the tumor. Now, RS is also used for BM of other cancers. The risk of a high incidence of new BM raises the question of the need for associated whole-brain radiotherapy (WBRT). Recent evidence suggests that RS alone allows the cognitive impairment related to WBRT to be avoided, and that the latter should be reserved for salvage therapy. Thus the increasing use of RS for single and multiple BM raises new technical challenges for treatment delivery and dosimetry. We present our single-institution experience, focusing on the criteria that led to patients' selection for RS treatment with Gamma Knife (GK) in lieu of Linac. METHODS: Leksell Gamma Knife Perfexion (Elekta, Sweden) was installed in July 2010. Currently, Swiss federal health care covers the costs of RS for BM with Linac but not with GK. Therefore, in our center, we always consider first the possibility of using Linac for this indication, and only select patients for GK in specific situations. All cases of BM treated with GK were retrospectively reviewed for the criteria leading to the GK indication, clinical information, and treatment data. Further work in progress includes an a posteriori dosimetry comparison with our Linac planning system (Brainscan V.5.3, Brainlab, Germany). RESULTS: From July 2010 to March 2012, 20 patients had RS for BM with GK (7 patients with single BM, and 13 with multiple BM). During the same period, 31 had Linac-based RS. The primary tumor was melanoma in 9, lung in 7, renal in 2, and gastrointestinal tract in 2 patients.
In single BM, the reason for choosing GK was an anatomical location close to, or in, highly functional areas (1 motor cortex, 1 thalamic, 1 ventricular, 1 mesio-temporal, 3 deep cerebellar close to the brainstem), especially since most of these tumors were intended to be treated with high-dose RS (24 Gy at the margin) because of their histology (3 melanomas, 1 renal cell). In multiple BM, the reason for choosing GK in relation to the anatomical location of the lesions was either technical (limitations of Linac movements, especially in lower posterior fossa locations) or the closeness of multiple lesions to highly functional areas (typically, multiple posterior fossa BM close to the brainstem), precluding optimal dosimetry with Linac. Again, this was more critical for multiple BM needing high-dose RS (6 melanoma, 2 hypernephroma). CONCLUSION: Radiosurgery for BM may represent a technical challenge in relation to the anatomical location and multiplicity of the lesions. These considerations may be accentuated for so-called radioresistant BM, when higher-dose RS is needed. In our experience, the Leksell Gamma Knife Perfexion proves useful in addressing these challenges in the treatment of BM.
Resumo:
Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour methods (NN), which are known to have limitations when dealing with high dimensional data. We apply Support Vector Machines to a dataset from Lochaber, Scotland to assess their applicability in avalanche forecasting. Support Vector Machines (SVMs) belong to a family of theoretically based techniques from machine learning and are designed to deal with high dimensional data. Initial experiments showed that SVMs gave results which were comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
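The nearest-neighbour baseline against which the SVMs are compared can be sketched as a probabilistic forecast: the fraction of avalanche days among the k historical days most similar to today's conditions. A toy illustration, with hypothetical feature vectors rather than the Lochaber data:

```python
import numpy as np

def nn_forecast(history_X, history_y, today, k=3):
    """Probabilistic avalanche forecast by nearest neighbours.

    Returns the fraction of avalanche days (history_y == 1) among the
    k historical days whose conditions are closest to today's.
    """
    dists = np.linalg.norm(history_X - today, axis=1)
    nearest = np.argsort(dists)[:k]
    return history_y[nearest].mean()

# Hypothetical daily records: [new snowfall (cm), wind speed (m/s)];
# y = 1 means an avalanche was observed that day.
X = np.array([[0.0, 2.0], [1.0, 3.0], [30.0, 12.0], [28.0, 15.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
p = nn_forecast(X, y, np.array([29.0, 13.0]), k=2)
```

The known weakness of this scheme is the distance computation itself: in high-dimensional feature spaces the nearest neighbours become less informative, which is the motivation the abstract gives for trying SVMs instead.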