Abstract:
A family of nonempty closed convex sets is built using the data of the generalized Nash equilibrium problem (GNEP). The sets are selected iteratively so that the intersection of the selected sets contains solutions of the GNEP. The algorithm introduced by Iusem and Sosa (2003) is adapted to obtain solutions of the GNEP. Finally, some numerical experiments are given to illustrate the numerical behavior of the algorithm.
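The idea of finding a solution inside the intersection of a family of closed convex sets can be illustrated with a deliberately simple stand-in: cyclic projection onto half-spaces. This is only a sketch of the projection principle, not the Iusem–Sosa scheme itself, and the half-space family below is invented for the example.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the closed convex set {y : a.y <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x                      # already inside the set
    return x - viol * a / (a @ a)     # move to the nearest boundary point

def cyclic_projections(halfspaces, x0, iters=200):
    """Project cyclically onto each set; converges into the intersection."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        for a, b in halfspaces:
            x = project_halfspace(x, np.asarray(a, dtype=float), b)
    return x

# Toy family: the constraints x >= 1 and y >= 2, written as -x <= -1, -y <= -2
sets = [(np.array([-1.0, 0.0]), -1.0), (np.array([0.0, -1.0]), -2.0)]
x = cyclic_projections(sets, [0.0, 0.0])   # lands at the corner [1, 2]
```

The actual algorithm selects the sets iteratively from the GNEP data rather than fixing them in advance, but the per-iteration building block is the same kind of projection step.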
Abstract:
PURPOSE: In Burkina Faso, gold ore is one of the main sources of income for a large part of the active population. Artisanal gold miners use mercury in the extraction, a toxic metal whose human health risks are well known. The aim of the present study was to assess mercury exposure and to understand the exposure determinants of gold miners in Burkinabe small-scale mines. METHODS: The gold miners' population examined on the selected gold mining sites was composed of persons who were directly and indirectly related to gold mining activities, but measurement of urinary mercury was performed on the workers most likely to be exposed to mercury. Thus, occupational exposure to mercury was evaluated among ninety-three workers belonging to eight gold mining sites spread across six regions of Burkina Faso. Work-related exposure determinants, such as amalgamating or heating mercury, were recorded for each person during urine sampling. All participants were medically examined by a local medical team in order to identify possible symptoms related to the toxic effects of mercury. RESULTS: Mercury levels were high: 69% of the measurements exceeded the ACGIH (American Conference of Governmental Industrial Hygienists) biological exposure index (BEI) of 35 µg per g of creatinine (µg/g-Cr, prior to shift), while 16% even exceeded 350 µg/g-Cr. Both nonspecific and specific symptoms related to mercury toxicity could be identified among the persons directly involved in gold mining activities. Only one third of the studied subpopulation reported fewer than three symptoms possibly associated with mercury exposure, and nearly half of them suffered from at least five of these symptoms. Ore washers were more involved in the direct handling of mercury, while gold dealers were more involved in the final gold recovery activities.
These differences may explain the overexposure observed in gold dealers and indicate that the refining process is the major source of exposure. CONCLUSIONS: This study attests that mercury exposure remains an issue of concern. North-South collaborations should encourage knowledge exchange between developing and developed countries, leading to a cleaner artisanal gold mining process and thus reduced human health and environmental hazards from mercury use.
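The screening arithmetic behind the reported exceedance percentages is simple to state: count the fraction of urinary mercury values above the BEI of 35 µg/g-Cr and above the tenfold level of 350 µg/g-Cr. The sample values below are made up for illustration; they are not the study's data.

```python
BEI = 35.0  # ACGIH biological exposure index, ug mercury per g creatinine

def exceedance_fractions(levels, bei=BEI):
    """Fractions of measurements above the BEI and above ten times the BEI."""
    n = len(levels)
    over_bei = sum(1 for v in levels if v > bei) / n
    over_10x = sum(1 for v in levels if v > 10 * bei) / n
    return over_bei, over_10x

# Hypothetical urinary mercury measurements (ug/g-Cr), not the study's data
sample = [12.0, 48.0, 90.0, 400.0, 20.0]
frac_bei, frac_10x = exceedance_fractions(sample)   # 0.6 and 0.2 here
```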
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt"
Abstract:
Le "data mining", ou "fouille de données", est un ensemble de méthodes et de techniques attractif qui a connu une popularité fulgurante ces dernières années, spécialement dans le domaine du marketing. Le développement récent de l'analyse ou du renseignement criminel soulève des problèmatiques auxqwuelles il est tentant de d'appliquer ces méthodes et techniques. Le potentiel et la place du data mining dans le contexte de l'analyse criminelle doivent être mieux définis afin de piloter son application. Cette réflexion est menée dans le cadre du renseignement produit par des systèmes de détection et de suivi systématique de la criminalité répétitive, appelés processus de veille opérationnelle. Leur fonctionnement nécessite l'existence de patterns inscrits dans les données, et justifiés par les approches situationnelles en criminologie. Muni de ce bagage théorique, l'enjeu principal revient à explorer les possibilités de détecter ces patterns au travers des méthodes et techniques de data mining. Afin de répondre à cet objectif, une recherche est actuellement menée au Suisse à travers une approche interdisciplinaire combinant des connaissances forensiques, criminologiques et computationnelles.
Abstract:
The DNA microarray technology has arguably caught the attention of the worldwide life science community and is now systematically supporting major discoveries in many fields of study. The majority of the initial technical challenges of conducting experiments are being resolved, only to be replaced with new informatics hurdles, including statistical analysis, data visualization, interpretation, and storage. Two systems of databases, one containing expression data and one containing annotation data, are quickly becoming essential knowledge repositories of the research community. The present paper surveys several databases that are considered "pillars" of research and important nodes in the network. It focuses on a generalized workflow scheme typical for microarray experiments, using two examples related to cancer research. The workflow is used to reference appropriate databases and tools for each step in the process of array experimentation. Additionally, benefits and drawbacks of current array databases are addressed, and suggestions are made for their improvement.
Abstract:
A longitudinal study of malaria vectors, aiming to describe the intensity of transmission, was carried out in five villages of Southern Venezuela between January 1999 and April 2000. The man-biting, sporozoite and entomological inoculation rates (EIR) were calculated based on 121 all-night collections of anophelines landing on humans, CDC light traps and ultraviolet up-draft traps. A total of 6,027 female mosquitoes representing seven species were collected. The most abundant species were Anopheles marajoara Galvão & Damasceno (56.7%) and Anopheles darlingi Root (33%), which together accounted for 89.7% of the total anophelines collected. The mean biting rate was 1.27 (SD ± 0.81) for An. marajoara, 0.74 (SD ± 0.91) for An. darlingi and 0.11 (SD ± 0.10) for Anopheles neomaculipalpus Curry; the overall biting rate was 2.29 (SD ± 1.06). A total of 5,886 mosquitoes collected by all three methods were assayed by ELISA, and 28 pools, equivalent to 28 mosquitoes, yielded positive results for Plasmodium spp. CS protein. An. neomaculipalpus had the highest sporozoite rate, 0.84% (3/356), followed by An. darlingi, 0.82% (16/1,948), and An. marajoara, 0.27% (9/3,332). The overall sporozoite rate was 0.48% (28/5,886). The rates of infection by Plasmodium species in mosquitoes were 0.37% (22/5,886) for Plasmodium vivax (Grassi & Feletti) and 0.10% (6/5,886) for Plasmodium falciparum (Welch). The estimated EIR was 2.21 infective bites/person/year for An. darlingi, 1.25 for An. marajoara and 0.34 for An. neomaculipalpus. The overall EIR was four infective bites/person/year. The biting rate, the sporozoite rate and the EIR are too low to be indicators of the efficacy of control campaigns in this area.
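The reported EIR values follow directly from the abstract's own figures: the entomological inoculation rate is the human-biting rate (bites/person/night) multiplied by the sporozoite rate, scaled to a year. A quick check:

```python
def eir_per_year(biting_rate_per_night, sporozoite_rate):
    """EIR = biting rate x sporozoite rate x 365 days."""
    return biting_rate_per_night * sporozoite_rate * 365

# Figures taken from the abstract itself
eir_darlingi  = eir_per_year(0.74, 16 / 1948)   # ~2.21 as reported
eir_marajoara = eir_per_year(1.27, 9 / 3332)    # ~1.25
eir_neomac    = eir_per_year(0.11, 3 / 356)     # ~0.34
overall       = eir_per_year(2.29, 28 / 5886)   # ~4 infective bites/person/year
```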
Abstract:
The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (thus conservative fluxes can be obtained).
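The iterative principle described here can be sketched in its simplest stationary form: repeatedly apply an approximate solve to the fine-scale residual so the error can be driven down arbitrarily. In the sketch below, a plain Jacobi preconditioner stands in for the MsFV operator, and the stationary loop stands in for the GMRES acceleration the paper actually uses; both substitutions are ours.

```python
import numpy as np

def preconditioned_richardson(A, b, iters=200):
    """Stationary residual-correction iteration x <- x + M^{-1} (b - A x)."""
    Minv = 1.0 / np.diag(A)            # Jacobi stand-in for the approximate solve
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x                  # fine-scale residual
        x = x + Minv * r               # approximate correction step
    return x

# Small diagonally dominant test system, invented for the example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = preconditioned_richardson(A, b)    # converges to the exact solution
```

Replacing this fixed-point loop with a Krylov method (GMRES with the same preconditioner) is what gives the paper's schemes unconditional stability and faster convergence.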
Abstract:
This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low-level processing stage, where the algorithms deal with a large amount of data. In a motion estimation algorithm, correspondences between two images have to be solved at the low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem can be an adequate approach to reduce the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach using a parallel organisation of every processor in the architecture is proposed.
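Why normalised correlation tolerates non-uniform illumination can be seen in a minimal sketch: subtracting each patch's mean and dividing by the norms makes the score invariant to additive and multiplicative lighting changes. The patches below are toy data, not from the paper.

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalised cross-correlation between two equally sized patches."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()                      # remove additive illumination offset
    b -= b.mean()
    denom = np.sqrt((a @ a) * (b @ b)) # remove multiplicative gain
    return float(a @ b / denom) if denom > 0 else 0.0

p = np.array([[1, 2], [3, 4]])
q = 2.0 * p + 5.0          # same pattern seen under different lighting
score = ncc(p, q)          # 1.0: a perfect match despite the lighting change
```

Each candidate correspondence requires one such score, and the scores are independent of each other, which is the regularity the proposed parallel architecture exploits.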
Abstract:
In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources, but also the light interreflections. This kind of algorithm produces very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we have taken advantage of the hierarchical nature of such an algorithm.
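The radiosity setting can be sketched with the classical gathering iteration B = E + ρ (F B): each patch's radiosity is its emission plus the reflected light gathered from all other patches. The form factors and reflectances below are invented toy numbers; the paper's hierarchical, speculative parallel solver refines this basic scheme and is not reproduced here.

```python
import numpy as np

def gather_radiosity(E, rho, F, iters=100):
    """Fixed-point sweeps of B = E + rho * (F @ B)."""
    B = E.copy()
    for _ in range(iters):
        B = E + rho * (F @ B)      # each patch gathers light from all others
    return B

E   = np.array([1.0, 0.0, 0.0])    # only patch 0 emits light
rho = np.array([0.5, 0.5, 0.5])    # patch reflectances
F   = np.array([[0.0, 0.3, 0.3],   # toy form factors (row sums < 1)
                [0.3, 0.0, 0.3],
                [0.3, 0.3, 0.0]])
B = gather_radiosity(E, rho, F)    # converged radiosity per patch
```

Because each gathering step is a matrix-vector product over all patch pairs, the cost grows quickly with scene complexity, which is what motivates both the hierarchical refinement and the parallelisation.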
Abstract:
BACKGROUND: Spain shows the highest bladder cancer incidence rates in men among European countries. The most important risk factors are tobacco smoking and occupational exposure to a range of different chemical substances, such as aromatic amines. METHODS: This paper describes the municipal distribution of bladder cancer mortality and attempts to "adjust" this spatial pattern for the prevalence of smokers, using the autoregressive spatial model proposed by Besag, York and Mollié, with the relative risk of lung cancer mortality as a surrogate. RESULTS: It has been possible to compile and ascertain the posterior distribution of relative risk for bladder cancer adjusted for lung cancer mortality, on the basis of a single Bayesian spatial model covering all of Spain's 8077 towns. Maps were plotted depicting smoothed relative risk (RR) estimates, and the distribution of the posterior probability of RR>1 by sex. Towns that registered the highest relative risks for both sexes were mostly located in the Provinces of Cadiz, Seville, Huelva, Barcelona and Almería. The highest-risk area in Barcelona Province corresponded to very specific municipal areas in the Bages district, e.g., Suría, Sallent, Balsareny, Manresa and Cardona. CONCLUSION: Mining/industrial pollution and the risk entailed in certain occupational exposures could in part be dictating the pattern of municipal bladder cancer mortality in Spain. Population exposure to arsenic is a matter that calls for attention. It would be of great interest if the relationship between the chemical quality of drinking water and the frequency of bladder cancer could be studied.
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
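The entropy measure mentioned in the abstract can be sketched as follows: normalise the diffusion tensor's eigenvalues into a probability vector and take the Shannon entropy, so low entropy signals a strongly anisotropic tensor (a clear fibre direction) and high entropy a near-isotropic one. How the entropy is then mapped to the tracking parameters is not spelled out here; this sketch only computes the measure itself.

```python
import numpy as np

def eigenvalue_entropy(tensor):
    """Shannon entropy of the diffusion tensor's normalised eigenvalues."""
    lam = np.linalg.eigvalsh(tensor)          # eigenvalues of the symmetric 3x3 tensor
    p = lam / lam.sum()                       # normalise to a probability vector
    return float(-(p * np.log(p)).sum())

isotropic   = np.eye(3)                       # equal diffusion in all directions
anisotropic = np.diag([1.0, 0.05, 0.05])      # one dominant fibre direction

h_iso   = eigenvalue_entropy(isotropic)       # maximal entropy, ln 3
h_aniso = eigenvalue_entropy(anisotropic)     # much lower entropy
```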