512 results for refining
Abstract:
The origin of species diversity has challenged biologists for over two centuries. Allopatric speciation, the divergence of species resulting from geographical isolation, is well documented. However, sympatric speciation, divergence without geographical isolation, is highly controversial. Claims of sympatric speciation must demonstrate species sympatry, sister relationships, reproductive isolation, and that an earlier allopatric phase is highly unlikely. Here we provide clear support for sympatric speciation in a case study of two species of palm (Arecaceae) on an oceanic island. A large dated phylogenetic tree shows that the two species of Howea, endemic to the remote Lord Howe Island, are sister taxa and diverged from each other well after the island was formed 6.9 million years ago. During fieldwork, we found a substantial disjunction in flowering time that is correlated with soil preference. In addition, a genome scan indicates that few genetic loci are more divergent between the two species than expected under neutrality, a finding consistent with models of sympatric speciation involving disruptive/divergent selection. This case study of sympatric speciation in plants provides an opportunity for refining theoretical models on the origin of species, and new impetus for exploring putative plant and animal examples on oceanic islands.
Abstract:
PURPOSE OF REVIEW: Amplification and overexpression of the epidermal growth factor receptor (EGFR) gene are a hallmark of primary glioblastoma (45%), making it a prime target for therapy. In addition, these amplifications are frequently associated with oncogenic mutations in the extracellular domain. However, efforts at targeting the EGFR tyrosine kinase using small molecule inhibitors or antibodies have shown disappointing efficacy in clinical trials for newly diagnosed or recurrent glioblastoma. Here, we review recent insights into molecular mechanisms relevant for effective targeting of the EGFR pathway. RECENT FINDINGS: Molecular workup of glioblastoma tissue of patients under treatment with small molecule inhibitors has established drug concentrations in the tumor tissue, and has shed light on the effectiveness of target inhibition and respective effects on pathway signaling. Further, functional analyses of interaction of small molecule inhibitors with distinct properties to bind to the active or inactive form of EGFR have provided new insights that will impact the choice of drugs. Finally, vaccination approaches targeting the EGFRvIII mutant featuring a tumor-specific antigen have shown promising results that warrant larger controlled clinical trials. SUMMARY: A combination of preclinical and clinical studies at the molecular level has provided new insights that will allow refining strategies for targeting the EGFR pathway in glioblastoma.
Abstract:
(Book summary) Seventeen innovative studies are collected in this volume, produced under the aegis of the Centre for Biblical Studies, University of Manchester, and L'Institut des sciences bibliques, Université de Lausanne. The majority of the studies engage with narrative by providing insightful worked examples. Building on the many contributions of recent narratological research, the studies in this collection for the most part avoid the technical language of narratology as they present fresh insights at many levels. Some essays focus more on the implied author, some on the implied reader or hearer, and some on the way particular messages are constructed; some consider how author, message and reader are all interconnected. There are several creative proposals for refining genre definition, from law and wisdom to gospel and apocryphal writings. Some studies highlight the way in which narratives can contain ethical, religious, and cultural messages. Some contributors also show sensitivity to narrative by exposing, in intriguing ways, the redactional processes behind the final form of texts. Students of narrative in the ancient world will find much to consider in this book, and others engaged with literary studies more generally will discover that scholars of the worlds of the Bible and Late Antiquity have much to offer them.
Abstract:
PURPOSE: In Burkina Faso, gold ore is one of the main sources of income for a large part of the active population. Artisanal gold miners use mercury in the extraction process, a toxic metal whose human health risks are well known. The aim of the present study was to assess mercury exposure and to understand the exposure determinants of gold miners in Burkinabe small-scale mines. METHODS: The population examined on the selected gold mining sites comprised persons directly and indirectly involved in gold mining activities, but urinary mercury was measured only in the workers most likely to be exposed. Occupational exposure to mercury was thus evaluated among ninety-three workers from eight gold mining sites spread over six regions of Burkina Faso. Work-related exposure determinants, such as amalgamating or heating mercury, were recorded for each person during urine sampling. All participants were medically examined by a local medical team to identify possible symptoms related to the toxic effects of mercury. RESULTS: Mercury levels were high: 69% of the measurements exceeded the ACGIH (American Conference of Governmental Industrial Hygienists) biological exposure index (BEI) of 35 µg per g of creatinine (µg/g-Cr, prior to shift), and 16% even exceeded 350 µg/g-Cr. Both non-specific and specific symptoms of mercury toxicity were observed among the persons directly involved in gold mining activities. Only one third of the studied subpopulation reported fewer than three symptoms possibly associated with mercury exposure, and nearly half suffered from at least five of these symptoms. Ore washers were more involved in the direct handling of mercury, while gold dealers were more involved in the final gold recovery activities. These differences may explain the overexposure observed in gold dealers and indicate that the refining process is the major source of exposure. CONCLUSIONS: This study attests that mercury exposure is still an issue of concern. North-South collaborations should encourage knowledge exchange between developing and developed countries, toward a cleaner artisanal gold mining process and thus reduced human health and environmental hazards from mercury use.
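The exceedance figures reported above can be reproduced with a simple calculation. A minimal Python sketch, assuming hypothetical sample values; the thresholds are those cited in the abstract, but the data are not from the study:

```python
# Illustrative sketch (not from the study): classifying urinary mercury
# measurements against the ACGIH biological exposure index (BEI).
BEI_UG_PER_G_CR = 35.0    # ACGIH BEI, µg Hg per g creatinine (prior to shift)
HIGH_UG_PER_G_CR = 350.0  # 10x BEI, the upper band cited in the abstract

def exceedance_fractions(measurements):
    """Return the fraction of measurements above each threshold."""
    n = len(measurements)
    above_bei = sum(1 for m in measurements if m > BEI_UG_PER_G_CR)
    above_high = sum(1 for m in measurements if m > HIGH_UG_PER_G_CR)
    return above_bei / n, above_high / n

# Hypothetical urinary mercury values (µg/g creatinine):
samples = [12.0, 40.0, 88.0, 360.0, 25.0]
frac_bei, frac_high = exceedance_fractions(samples)
print(f"{frac_bei:.0%} above BEI, {frac_high:.0%} above 10x BEI")
```

With the study's actual data, the same computation would yield the reported 69% and 16%.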
Abstract:
In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after agreeing on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar, it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same; the only difference relates to the characteristics of the heat-generation step, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for the phases that nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.
Abstract:
The report presents a grammar capable of analyzing the process of production of electricity in modular elements for different power-supply systems, defined using semantic and formal categories. In this way it becomes possible to identify similarities and differences in the process of production of electricity, and then measure and compare “apples” with “apples” and “oranges” with “oranges”. For instance, when comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same; the only difference relates to the characteristics of the heat-generation step, which are completely different in the two systems. In fact, the performance of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. By adopting this approach, it becomes possible to compare the performance of the two power-supply systems through their relative biophysical requirements for the phases that nuclear and fossil energy power plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. This report presents the evaluation of the biophysical requirements for the two power-supply systems, nuclear energy and fossil energy, focusing on the following requirements: (i) electricity; (ii) fossil fuels; (iii) labor; and (iv) materials.
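The phase-by-phase comparison that such a grammar enables can be sketched in code. A minimal Python illustration, in which the data structures and all numeric values are hypothetical placeholders rather than figures from the report:

```python
# Illustrative sketch of comparing two power-supply systems over the unit
# operations they share. Phase names follow the abstract; the requirement
# values used below are invented for demonstration only.
COMMON_PHASES = ["mining", "refining/enriching",
                 "generating heat/electricity", "handling wastes"]

def compare_requirements(nuclear, fossil, requirement):
    """Per-phase ratio nuclear/fossil for one biophysical requirement
    (e.g. labor hours or fossil-fuel input per unit of net electricity)."""
    return {phase: nuclear[phase][requirement] / fossil[phase][requirement]
            for phase in COMMON_PHASES}

# Hypothetical placeholder data: labor hours per GWh for each phase.
nuclear = {p: {"labor": 2.0} for p in COMMON_PHASES}
fossil = {p: {"labor": 1.0} for p in COMMON_PHASES}
print(compare_requirements(nuclear, fossil, "labor"))
```

Comparing like with like in this way is the point of the grammar: only phases present in both systems enter the ratio.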
Abstract:
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider these images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]; the earlier steps of the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous one; the methodology is not linear, however, but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase in which the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses.
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes.
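The four-step cycle described above can be sketched as a small pipeline. A minimal Python illustration, in which the field names, quality threshold, and selection rule are hypothetical assumptions, not the authors' implementation:

```python
# Illustrative sketch of one iteration of the cyclic methodology:
# collect -> organise/assess -> reconstruct -> evaluate, with a
# hypothetico-deductive step that prunes hypotheses against the clues.
def reconstruction_cycle(images, hypotheses):
    """One iteration over a list of image records and candidate hypotheses."""
    # Step 1: detect and collect pertinent trace material
    collected = [img for img in images if img["pertinent"]]
    # Step 2: organise by quality / informative potential
    assessed = sorted(collected, key=lambda img: img["quality"], reverse=True)
    # Step 3: reconstruct clues about space, time and actions
    clues = [(img["space"], img["time"], img["action"]) for img in assessed]
    # Step 4: evaluate and select images as evidence (placeholder threshold)
    evidence = [img for img in assessed if img["quality"] >= 0.5]
    # Hypothetico-deductive pruning: keep propositions consistent with clues
    surviving = [h for h in hypotheses if h(clues)]
    return evidence, surviving
```

In use, the cycle would be repeated, feeding surviving hypotheses and newly collected images back in until the reconstruction stabilises.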
Abstract:
INTRODUCTION: Gamma Knife surgery (GKS) is a non-invasive neurosurgical stereotactic procedure, increasingly used as an alternative to open functional procedures, including targeting of the ventro-intermediate nucleus of the thalamus (Vim) for tremor. We currently perform indirect targeting, as the Vim is not visible on current 3 Tesla (3T) MRI acquisitions. Our objective was to enhance anatomic imaging (aiming at refining the precision of anatomic target selection by direct visualisation) in patients treated for tremor with Vim GKS, using high-field 7T MRI. MATERIALS AND METHODS: Five young healthy subjects were scanned at 3T (T1-weighted and diffusion tensor imaging) and 7T (high-resolution susceptibility weighted imaging, SWI) in Lausanne. All images were integrated for the first time into the Gamma Plan software(®) (Elekta Instruments AB, Sweden) and co-registered (with T1 as a reference). Targeting of the Vim was simulated on the 3T images using various methods, and the position of the resulting target was correlated with the 7T SWI. The atlas of Morel et al. (Zurich, CH) was used to confirm the findings in a detailed analysis inside and outside the Gamma Plan. RESULTS: SWI provided superior resolution and improved image contrast within the basal ganglia. This allowed visualization and direct delineation of some subgroups of thalamic nuclei in vivo, including the Vim. The position of the target, as assessed on 3T, matched perfectly with the supposed position of the Vim on the SWI. Furthermore, a three-dimensional model of the Vim target area was created on the basis of the obtained images. CONCLUSION: This is the first report of the integration of high-field SWI MRI into the LGP, aiming at improved targeting validation of the Vim in tremor. The anatomical correlation between direct visualization on 7T and the current targeting methods on 3T (e.g. the quadrilatère of Guyot, histological atlases) shows very good anatomical matching. Further studies are needed to validate this technique, both to improve the accuracy of targeting of the Vim (and potentially other thalamic nuclei) and to perform clinical assessment.
Abstract:
Objective: Status epilepticus (SE) prognosis is mostly related to non-modifiable factors (especially age and etiology), but the specific role of treatment appropriateness (TA) has not been investigated. Methods: In a prospective cohort with incident SE (excluding postanoxic), TA was defined, following recent European recommendations, in terms of drug dosage (±30% deviation) and sequence. Outcome at hospital discharge was categorized as mortality, new handicap, or return to baseline. Results: Among 225 adults, treatment was inappropriate in 37%. In univariate analyses, age, etiology, SE severity and comorbidity, but not TA, were significantly related to outcome. Etiology (95% CI 4.3-82.8) and SE severity (95% CI 1.2-2.4) were independent predictors of mortality and of lack of return to baseline conditions (etiology: 95% CI 3.9-14.0; SE severity: 95% CI 1.4-2.2). Moreover, TA did not improve outcome prediction in the corresponding ROC curves. Conclusions: This large analysis suggests that TA plays a negligible prognostic role in SE, probably reflecting the overriding importance of the biological background. Pending treatment trials in SE, it appears questionable to devote further resources to refining treatment protocols involving existing compounds; rather, new therapeutic approaches should be identified and tested.
Abstract:
Theories on social capital and on social entrepreneurship have mainly highlighted the capacity of social capital to generate enterprises and to foster good relations between third-sector organizations and the public sector. This paper considers social capital in a specific third-sector enterprise: multi-stakeholder social cooperatives, seen here simultaneously as results, creators and incubators of social capital. In the particular enterprises that identify themselves as community social enterprises (SCEs), social capital, both organizational and relational, is fundamental: SCEs arise from, but also produce and disseminate, social capital. This paper aims to improve the building of relational social capital and the refining of helpful relations drawn from other arenas, where they were created and from which they are sometimes transferred to other settings where their role is carried further (often in non-profit, horizontally and vertically arranged groups that share resources and relations). To represent this perspective, we use a qualitative system dynamics approach in which social capital is measured using proxies. Cooperation of volunteers, customers, community leaders and local third-sector organizations is fundamental to establishing trust relations between local public authorities and cooperatives. These relations help the latter to maintain long-term contracts with local authorities as providers of social services and enable them to add innovation to their services, by developing experiences and management models and maintaining an interchange with civil servants on these matters. The long-term relations and the organizational relations linking SCEs and public organizations help to create and renew social capital. Thus, multi-stakeholder cooperatives that originated via social capital developed in third-sector organizations produce new social capital within the cooperatives themselves and between different cooperatives (the entrepreneurial components of the third sector) and the public sector. In their entrepreneurial life, cooperatives have to counteract the "working drift," whereby only workers remain as members of the cooperative while other stakeholders leave the organization. Those who are not workers in the cooperative are (stake)holders with "weak ties," who are nevertheless fundamental in making a workers' cooperative an authentic multi-stakeholder social cooperative. To maintain multi-stakeholder governance and relations with the third sector and civil society, social cooperatives have to reinforce participation and dialogue with civil society through ongoing efforts to include people who put forward social proposals. We represent these processes in a system dynamics model applied to local cooperatives, measuring the social capital created by the social cooperative through proxies such as the number of volunteers and strong cooperation with public institutions. Using a reverse-engineering approach, we can identify the determinants of the creation of social capital and thereby support governance that creates social capital.
Abstract:
The project falls within the field of the paper industry and, within the papermaking process, focuses specifically on the cyclonic cleaning stage, in which the aim is to remove the solid particles that accompany the paper pulp and are undesirable because they affect the characteristics of the paper and the operation of the other equipment in the manufacturing process. The LEPAMAP group (Laboratori d'Enginyeria Paperera i Materials Polímers) has requested the design of a cyclonic cleaning installation, based on a hydrocyclone, for research activities. A hydrocyclone is a device that separates particles thanks to the centrifugal force of the inlet flow and to its inverted-cone design. The basic objectives of this project are therefore to design a cleaning installation suitable for removing possible solid contaminants from paper pulp, which involves designing a centrifugal pump to drive the flow into the hydrocyclone, tanks to store and prepare the suspension, an agitator to keep the suspension homogeneous at all times, and measurement and control devices for variables such as flow rate, pressure or volume. The second objective is to commission the installation and check whether the hydrocyclone correctly separates the solid particles from the pulp flow. In parallel, the most important cleaning equipment in the paper industry has been studied, as well as the most important stages of the papermaking process.
Abstract:
Most available studies on lead smelter emissions deal with the environmental impact of outdoor particles; only a few focus on air quality at workplaces. The objective of this study is to physically and chemically characterize the Pb-rich particles emitted at different workplaces in a lead recycling plant. A multi-scale characterization was conducted, from bulk analysis down to individual particles, to assess particle properties in relation to Pb speciation and availability. Process PM from various origins were sampled and compared: Furnace and Refining PM, present in the smelter and at refinery workplaces respectively, and Emissions PM, present in channeled emissions. These particles differed first in morphology and size distribution, with finer particles found in emissions. Differences in chemical composition could be explained by the industrial processes. All PM contained the same major phases (Pb, PbS, PbO, PbSO4 and PbO·PbSO4) but differed in the nature and amount of minor phases. Due to the high Pb content of the PM, Pb concentrations in the CaCl2 extractant reached relatively high values (40 mg.L-1). However, the ratios (soluble/total) of CaCl2-exchangeable Pb were relatively low (< 0.02%) in comparison with Cd (up to 18%). These results highlight the value of assessing the soluble fractions of all metals (minor and major) and of discussing both total metal concentrations and ratios in risk evaluations. In most cases metal extractability increased with decreasing particle size; in particular, lead exchangeability was highest for channeled emissions. This type of study could help in the choice of targeted sanitary protection procedures and in further toxicological investigations. In the present context, particular attention is given to Emissions and Furnace PM. Moreover, exposure to metals other than Pb should be considered.
Abstract:
This study is part of a project entitled "Noves possibilitats en el blanqueig en les pastes destintades: Alternatives al blanqueig amb sosa càustica i peròxid d'hidrogen" (New possibilities in the bleaching of deinked pulps: alternatives to bleaching with caustic soda and hydrogen peroxide), subsidised by several European companies in the sector. It was carried out at the Centre Technique du Papier in Grenoble, France. The study covers one part of the project and aims to analyse modified Na2SiO3, magnesium hydroxide (Mg(OH)2) and calcium hydroxide (Ca(OH)2) to determine whether they can act as substitutes for NaOH. The goal is to achieve brightness values similar to those obtained with standard bleaching while substantially improving the quality of the water. According to the research centre, a reduction of COD and of cationic demand of at least 10% at the experimental level can be considered an improvement transferable to industrial scale. The pulp bleached is virgin TMP. The brightness results obtained can also be considered applicable to recovered paper, previously deinked, originating from this type of pulp.