22 results for Australian Mining
at Université de Lausanne, Switzerland
Abstract:
PURPOSE: In Burkina Faso, gold mining is one of the main sources of income for a substantial part of the active population. Artisanal gold miners use mercury in the extraction process, a toxic metal whose human health risks are well known. The aim of the present study was to assess mercury exposure and to understand the exposure determinants of gold miners in Burkinabe small-scale mines. METHODS: The population examined at the selected gold mining sites comprised persons directly and indirectly involved in gold mining activities, but urinary mercury was measured only in the workers most likely to be exposed to mercury. Occupational exposure to mercury was thus evaluated among ninety-three workers from eight gold mining sites spread across six regions of Burkina Faso. Work-related exposure determinants, such as amalgamating or heating mercury, were recorded for each person at the time of urine sampling. All participants were medically examined by a local medical team in order to identify possible symptoms related to the toxic effects of mercury. RESULTS: Mercury levels were high: 69% of the measurements exceeded the ACGIH (American Conference of Governmental Industrial Hygienists) biological exposure index (BEI) of 35 µg per g of creatinine (µg/g-Cr) (prior to shift), and 16% even exceeded 350 µg/g-Cr. Both nonspecific and specific symptoms related to mercury toxicity were observed among the persons directly involved in gold mining activities. Only one-third of the studied subpopulation reported fewer than three symptoms possibly associated with mercury exposure, and nearly half of them suffered from at least five of these symptoms. Ore washers were more involved in the direct handling of mercury, whereas gold dealers were more involved in the final gold recovery activities. These differences may explain the overexposure observed in gold dealers and indicate that the refining process is the major source of exposure. CONCLUSIONS: This study shows that mercury exposure remains an issue of concern. North-South collaborations should encourage knowledge exchange between developing and developed countries, leading to a cleaner artisanal gold mining process and thus reducing the human health and environmental hazards caused by mercury use.
Abstract:
Le "data mining", ou "fouille de données", est un ensemble de méthodes et de techniques attractif qui a connu une popularité fulgurante ces dernières années, spécialement dans le domaine du marketing. Le développement récent de l'analyse ou du renseignement criminel soulève des problèmatiques auxqwuelles il est tentant de d'appliquer ces méthodes et techniques. Le potentiel et la place du data mining dans le contexte de l'analyse criminelle doivent être mieux définis afin de piloter son application. Cette réflexion est menée dans le cadre du renseignement produit par des systèmes de détection et de suivi systématique de la criminalité répétitive, appelés processus de veille opérationnelle. Leur fonctionnement nécessite l'existence de patterns inscrits dans les données, et justifiés par les approches situationnelles en criminologie. Muni de ce bagage théorique, l'enjeu principal revient à explorer les possibilités de détecter ces patterns au travers des méthodes et techniques de data mining. Afin de répondre à cet objectif, une recherche est actuellement menée au Suisse à travers une approche interdisciplinaire combinant des connaissances forensiques, criminologiques et computationnelles.
Abstract:
DNA microarray technology has arguably caught the attention of the worldwide life science community and is now systematically supporting major discoveries in many fields of study. Most of the initial technical challenges of conducting experiments have been resolved, only to be replaced with new informatics hurdles, including statistical analysis, data visualization, interpretation, and storage. Two systems of databases, one containing expression data and one containing annotation data, are quickly becoming essential knowledge repositories of the research community. The present paper surveys several databases that are considered "pillars" of research and important nodes in the network. It focuses on a generalized workflow scheme typical of microarray experiments, using two examples related to cancer research. The workflow is used to reference appropriate databases and tools for each step in the process of array experimentation. Additionally, benefits and drawbacks of current array databases are addressed, and suggestions are made for their improvement.
Abstract:
Imaging mass spectrometry (IMS) is an innovative tool in the cancer research pipeline that is increasingly being used in clinical and pharmaceutical applications. The unique properties of the technique, especially the amount of data generated, make handling data from multiple IMS acquisitions challenging. This work presents a histology-driven IMS approach that aims to identify discriminant lipid signatures through the simultaneous mining of IMS data sets from multiple samples. The feasibility of the developed workflow is evaluated on a set of three human colorectal cancer liver metastasis (CRCLM) tissue sections. Lipid IMS on tissue sections was performed using MALDI-TOF/TOF MS in both negative and positive ionization modes after 1,5-diaminonaphthalene matrix deposition by sublimation. Results from the positive- and negative-mode acquisitions were combined during data mining to simplify the process and interrogate a larger lipidome in a single analysis. To reduce the complexity of the IMS data sets, a sub-data set was generated by randomly selecting a fixed number of spectra from a histologically defined region of interest, resulting in a 10-fold data reduction. Principal component analysis confirmed that the molecular selectivity of the regions of interest is maintained after data reduction. Partial least-squares and heat map analyses demonstrated a selective signature of the CRCLM, revealing lipids that are significantly up- and down-regulated in the tumor region. This comprehensive approach is thus of interest for defining disease signatures directly from IMS data sets through combinatory data mining, opening novel routes of investigation for addressing the demands of the clinical setting.
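As an illustration of the data-reduction step described above, the following Python sketch randomly subsamples a fixed number of spectra from a region of interest and checks the reduced set with PCA. It is only a minimal, hypothetical example; the array names, sizes and scikit-learn-based implementation are assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of the spectrum-subsampling and PCA check described above.
# Assumes `spectra` is a (n_spectra x n_mz_bins) intensity matrix for one
# histologically defined region of interest; names and sizes are illustrative.
import numpy as np
from sklearn.decomposition import PCA

def subsample_roi(spectra: np.ndarray, n_keep: int, seed: int = 0) -> np.ndarray:
    """Randomly keep a fixed number of spectra from a region of interest."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(spectra.shape[0], size=n_keep, replace=False)
    return spectra[idx]

# e.g. a ~10-fold reduction of a hypothetical 5000-spectrum ROI
roi_full = np.random.rand(5000, 800)          # placeholder IMS intensities
roi_small = subsample_roi(roi_full, n_keep=500)

# PCA on the reduced set; if the ROI-specific structure is preserved, the
# scores of the reduced set should group like those of the full set.
scores = PCA(n_components=2).fit_transform(roi_small)
print(scores.shape)                            # (500, 2)
```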
Abstract:
Invasive species may carry with them parasites from their native range, differing from parasite taxa found in the invaded range. Host switching by parasites (either from the invader to native fauna or from native fauna to the invader) may have important consequences for the viability of either type of host (e.g., their survivorship, fecundity, dispersal ability, or geographic distribution). Rhabdias pseudosphaerocephala (Nematoda) is a common parasite of cane toads (Rhinella marina) in the toad's native range (South and Central America) and also in its introduced Australian range. This lungworm can depress host viability and is capable of infecting Australian frogs in laboratory trials. Despite syntopy between toads and frogs for up to 75 yr, our analyses, based on DNA sequence data of lungworms from 80 frogs and 56 toads collected from 2008 to 2011, did not reveal any cases of host switching in nature: toads and native frogs retain entirely different lungworm faunas. All lungworms in cane toads were the South and Central American species Rhabdias pseudosphaerocephala, whereas Australian frogs contained at least four taxa (mostly undescribed and currently lumped under the name Rhabdias cf. hylae). General patterns of prevalence and intensity, based on the dissection of 1,315 frogs collected between 1989 and 2011 across the toads' Australian range, show that these Australian endemic Rhabdias spp. are widely distributed geographically and across host taxa but are more common in some frog species (especially large-bodied species) than in others.
Abstract:
This paper describes a study that aimed to identify research priorities for the care of infants, children and adolescents at the sole tertiary referral hospital for children in Western Australia. The secondary aim was to stimulate nurses to explore clinical problems that would require further inquiry. Background. Planning for research is an essential stage of research development; involving clinicians in this exercise is likely to foster research partnerships that are pertinent to clinical practice. Nursing research priorities for the paediatric population have not previously been reported in Australia. Design. Delphi study. Method. Over 12 months in 2005-2006, a three-round questionnaire, using the Delphi technique, was sent to a randomly selected sample of registered nurses. This method was used to identify and prioritise nursing research topics relevant to the patient and the family. Content analysis was used to analyse Round I data, and descriptive statistics were used for Round II and III data. Results. In Round I, 280 statements were identified and reduced to 37 research priorities. Analysis of data in subsequent rounds identified the top two priority research areas as (1) identification of strategies to reduce medication incidents (Mean = 6.47; SD = 0.88) and (2) improvement in pain assessment and management (Mean = 6; SD = 1.38). Additional comments indicated that few nurses access the scientific literature or use research findings because of a lack of time or electronic access. Conclusions. Thirty-seven research priorities were identified. The identification of research priorities by nurses provided research direction for the health service and potentially for other similar health institutions for children and adolescents in Australia and internationally. Relevance to clinical practice. The nurse participants showed concern about the safety of care and the well-being of children and their families. This study also enabled the identification of potential collaborative research and the development of pain management improvement initiatives.
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in many fields, e.g. remote sensing, biometry and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty in using such data lies in its heterogeneity, which stems from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process. It allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, using the presence and absence of products. Methods from graph theory were used to extract patterns in data consisting of links between products and the place and date of seizure. A data mining process carried out using graph techniques is called "graph mining". The detected patterns then had to be interpreted and compared with prior knowledge to establish their relevance. The illicit drug profiling process is in fact an intelligence process that uses preliminary illicit drug classes to classify new samples. The methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns. This new knowledge of a repeated structure may provide valuable complementary information to profiling and become a source of intelligence.
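The graph-mining step described above can be illustrated with a minimal sketch: seizures and cutting agents form a bipartite graph, and co-occurrence patterns are read off a weighted projection onto the product nodes. The data, node names and the use of networkx are purely illustrative assumptions, not the study's actual implementation.

```python
# Minimal sketch of the "graph mining" idea: seizures and cutting agents as
# two node sets of a bipartite graph, with co-occurrence patterns read off a
# weighted projection. All records below are invented for illustration.
import networkx as nx
from networkx.algorithms import bipartite

# (seizure_id, place, date, products found) -- hypothetical records
seizures = [
    ("S1", "Lausanne", "2007-03", {"caffeine", "paracetamol"}),
    ("S2", "Geneva",   "2007-04", {"caffeine", "paracetamol", "lidocaine"}),
    ("S3", "Bern",     "2007-06", {"lidocaine"}),
]

G = nx.Graph()
for sid, place, date, products in seizures:
    G.add_node(sid, bipartite="seizure", place=place, date=date)
    for p in products:
        G.add_node(p, bipartite="product")
        G.add_edge(sid, p)

# Project onto products: edge weights count how often two cutting agents were
# found together, a simple structure that can then be interpreted against
# prior knowledge of illicit drug classes.
products = {n for n, d in G.nodes(data=True) if d["bipartite"] == "product"}
co_occurrence = bipartite.weighted_projected_graph(G, products)
for a, b, d in co_occurrence.edges(data=True):
    print(a, b, d["weight"])
```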
3D seismic facies characterization and geological patterns recognition (Australian North West Shelf)
Abstract:
EXECUTIVE SUMMARY This PhD research, funded by the Swiss National Science Foundation, is principally devoted to enhancing the recognition, visualisation and characterization of geobodies through innovative 3D seismic approaches. A series of case studies from the Australian North West Shelf ensures the development of reproducible integrated 3D workflows and gives new insight into local and regional stratigraphic as well as structural issues. The project was initiated in 2000 at the Geology and Palaeontology Institute of the University of Lausanne (Switzerland). Several collaborations ensured the improvement of the technical approaches as well as the assessment of the geological models. - Investigations into the Timor Sea structural style were carried out at the Tectonics Special Research Centre of the University of Western Australia and in collaboration with Woodside Energy in Perth. - The seismic analysis and attribute classification approaches were initiated with Schlumberger Oilfield Australia in Perth; assessments and enhancements of the integrated seismic approaches benefited from collaborations with scientists from Schlumberger Stavanger Research (Norway). Adapting and refining "linear" exploration techniques, a conceptual "helical" 3D seismic approach has been developed. In order to investigate specific geological issues, this approach, which integrates seismic attributes and visualisation tools, has been refined and adjusted, leading to the development of two specific workflows: - A stratigraphic workflow focused on the recognition of geobodies and the characterization of depositional systems. Additionally, it can support the modelling of subsidence and, incidentally, help constrain the hydrocarbon maturity of a given area. - A structural workflow used to quickly and accurately define major and secondary fault systems. The integration of the 3D structural interpretation results enables the analysis of fault network kinematics, which can affect hydrocarbon trapping mechanisms. The application of these integrated workflows brings new insight into two complex settings on the Australian North West Shelf and yields significant stratigraphic and structural outcomes. The stratigraphic workflow enables the 3D characterization of the Late Palaeozoic glacial depositional system on the Mermaid Nose (Dampier Subbasin, Northern Carnarvon Basin), which presents similarities with the glacial facies along the Neotethys margin up to Oman (chapter 3.1). A subsidence model reveals the Phanerozoic geodynamic evolution of this area (chapter 3.2) and emphasizes two distinct modes of regional extension for the Palaeozoic (Neotethys opening) and the Mesozoic (opening of the abyssal plains). The structural workflow is used to define the structural evolution of the Laminaria High area (Bonaparte Basin). Following a regional structural characterization of the Timor Sea (chapter 4.1), a thorough analysis of the Mesozoic fault architecture reveals a local rotation of the stress field and the development of reverse structures (flower structures) in an extensional setting, which form potential hydrocarbon traps (chapter 4.2). The definition of the complex Neogene structural architecture, combined with the fault kinematic analysis and a plate flexure model (chapter 4.3), suggests that the Miocene to Pleistocene reactivation phases recorded at the Laminaria High most probably result from the oblique normal reactivation of the underlying Mesozoic fault planes.
This episode is associated with the deformation of the subducting Australian plate. Based on these results, three papers were published in international journals and two additional publications will be submitted. This research also led to several communications at international conferences. Although the different workflows presented in this research were primarily developed and used for the analysis of specific stratigraphic and structural geobodies on the Australian North West Shelf, similar integrated 3D seismic approaches will have applications in hydrocarbon exploration and production, for instance improving the recognition of potential source rocks, secondary migration pathways, additional traps or reservoir breaching mechanisms. The new elements brought by this research further highlight that 3D seismic data contain a tremendous amount of hidden geological information waiting to be revealed, which will undoubtedly bring new insight into the depositional systems, structural evolution and geohistory both of areas reputed to be well explored and constrained and of others yet to be constrained. The further development of 3D texture attributes highlighting specific features of the seismic signal, the integration of quantitative analysis of stratigraphic and structural processes, the automation of the interpretation workflow, and the formal definition of "seismo-morphologic" characteristics for a wide range of geobodies from various environments would represent challenging continuations of the present research. The 21st century will most probably represent a transition period between fossil and alternative energies. The next generation of seismic interpreters prospecting for hydrocarbons will undoubtedly face new challenges, mostly due to the shortage of obvious and easy targets. They will probably have to keep integrating techniques and geological processes in order to further capitalise on the seismic data and define new potential. Imagination and creativity will most certainly be among the most important qualities required of such geoscientists.
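As a rough illustration of the attribute-classification idea underlying the stratigraphic workflow, the sketch below clusters synthetic multi-attribute samples into seismic facies classes with k-means. It is a generic, hypothetical Python example, not the Schlumberger-based classification actually used in this research.

```python
# Minimal sketch of unsupervised seismic-attribute classification, in the
# spirit of the facies-characterization workflow described above. Attribute
# values are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical attribute cube flattened to samples x attributes
# (e.g. amplitude, coherence, frequency- or texture-derived measures).
rng = np.random.default_rng(42)
attributes = rng.normal(size=(10_000, 4))

# Standardize so no single attribute dominates, then cluster into a small
# number of "seismic facies" classes that can be mapped back to the volume.
X = StandardScaler().fit_transform(attributes)
facies = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(facies))   # sample count per facies class
```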
Abstract:
Ultra-high-throughput sequencing (UHTS) techniques are evolving rapidly and may soon become affordable, routine tools for sequencing plant DNA, even in smaller plant biology labs. Here we review recent insights into intraspecific genome variation gained from UHTS, which offers a glimpse of the rather unexpected levels of structural variability among Arabidopsis thaliana accessions. The challenges that will need to be addressed to efficiently assemble and exploit this information are also discussed.
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas.
To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
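The propensity-score-matching step of the impact analysis can be sketched as follows, using invented covariates and scikit-learn. This is only an illustrative outline of the general technique (logistic-regression propensity scores with 1:1 nearest-neighbor matching), not the authors' code or the SCS data.

```python
# Minimal sketch of propensity score matching: pretest and posttest survey
# respondents are matched on age, gender and education within one
# (hypothetical) neighborhood type. Data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 400
X = np.column_stack([
    rng.integers(18, 80, n),      # age
    rng.integers(0, 2, n),        # gender (0/1)
    rng.integers(1, 4, n),        # education level (1-3)
])
wave = rng.integers(0, 2, n)      # 0 = pretest sample, 1 = posttest sample

# Propensity score: probability of belonging to the posttest wave given the
# covariates, estimated by logistic regression.
ps = LogisticRegression(max_iter=1000).fit(X, wave).predict_proba(X)[:, 1]

# 1:1 nearest-neighbor matching of posttest respondents to pretest
# respondents on the propensity score.
pre, post = np.where(wave == 0)[0], np.where(wave == 1)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[pre].reshape(-1, 1))
_, match_idx = nn.kneighbors(ps[post].reshape(-1, 1))
matched_pre = pre[match_idx.ravel()]
print(len(post), "posttest respondents matched to",
      len(set(matched_pre)), "distinct pretest respondents")
```

The observed outcome difference (e.g. in fear of crime) would then be tested on the matched samples, with a sensitivity analysis for unobserved covariates, as described in the abstract.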