987 results for deep level centres


Relevance: 20.00%

Abstract:

Major coastal storms, associated with strong winds, high waves and intensified currents, and occasionally with heavy rains and flash floods, are mostly known for the serious damage they can cause along the shoreline and the threats they pose to navigation. However, there is a profound lack of knowledge of the deep-sea impacts of severe coastal storms, and concurrent measurements of key parameters along the coast and in the deep sea are extremely rare. Here we present a unique data set showing how one of the most extreme coastal storms of recent decades to lash the Western Mediterranean Sea rapidly impacted the deep-sea ecosystem. The storm peaked on 26 December 2008, leading to the remobilization of a shallow-water reservoir of marine organic carbon associated with fine particles and resulting in its redistribution across the deep basin. The storm also initiated the movement of large amounts of coarse shelf sediment, which abraded and buried benthic communities. Our findings demonstrate, first, that severe coastal storms are highly efficient in transporting organic carbon from shallow water to deep water, thus contributing to its sequestration and, second, that natural, intermittent atmospheric drivers sensitive to global climate change have the potential to tremendously impact the largest and least known ecosystem on Earth, the deep-sea ecosystem.

Relevance: 20.00%

Abstract:

This chapter presents the state of the art concerning the deep-sea Mediterranean environment: geology, hydrology, biology and fisheries. These are the fields of study dealt with in the scientific papers of this volume. The authors are specialists who have focused their research on the Mediterranean deep-sea environment in recent years. This introduction is an overview rather than an exhaustive review.

Relevance: 20.00%

Abstract:

BACKGROUND: The goal of this paper is to investigate the respective influence of work characteristics, the effort-reward ratio, and overcommitment on the poor mental health of out-of-hospital care providers. METHODS: 333 out-of-hospital care providers answered a questionnaire that included queries on mental health (GHQ-12), demographics, health-related information and work characteristics, questions from the Effort-Reward Imbalance Questionnaire, and items about overcommitment. A two-level multiple regression was performed between mental health (the dependent variable) and the effort-reward ratio, the overcommitment score, weekly number of interventions, percentage of non-prehospital transport of patients out of total missions, gender, and age. Participants were first-level units, and ambulance services were second-level units. We also shadowed ambulance personnel for a total of 416 hr. RESULTS: With cutoff points of 2/3 and 3/4 positive answers on the GHQ-12, the percentages of potential cases with poor mental health were 20% and 15%, respectively. The effort-reward ratio was associated with poor mental health (P < 0.001), irrespective of age or gender. Overcommitment was associated with poor mental health; this association was stronger in women (β = 0.054) than in men (β = 0.020). The percentage of prehospital missions out of total missions was only associated with poor mental health at the individual level. CONCLUSIONS: Emergency medical services should pay attention to the way employees perceive their efforts and the rewarding aspects of their work: an imbalance of those aspects is associated with poor mental health. Low perceived esteem appeared particularly associated with poor mental health. This suggests that supervisors of emergency medical services should enhance the value of their employees' work. Employees with overcommitment should also receive appropriate consideration. Preventive measures should target individual perceptions of effort and reward in order to improve mental health in prehospital care providers.
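The effort-reward ratio central to this analysis is simple to compute. Below is a minimal sketch with simulated scores: the score ranges, the coefficient values, and the plain single-level least-squares fit are illustrative assumptions (the study itself used the Effort-Reward Imbalance Questionnaire and a two-level model), not the paper's actual data or method.

```python
import numpy as np

def effort_reward_ratio(effort, reward, n_effort=6, n_reward=11):
    """Siegrist-style ERI ratio: effort / (reward * c), where the
    correction factor c = n_effort / n_reward balances the item counts."""
    c = n_effort / n_reward
    return effort / (reward * c)

rng = np.random.default_rng(0)
n = 333                               # sample size matching the study
effort = rng.uniform(6, 24, n)        # hypothetical effort scores
reward = rng.uniform(11, 55, n)       # hypothetical reward scores
eri = effort_reward_ratio(effort, reward)

# simulate a GHQ-like score that worsens as the ERI ratio grows
ghq = 2.0 + 1.5 * eri + rng.normal(0, 1, n)

# single-level least-squares fit of GHQ on the ERI ratio
X = np.column_stack([np.ones(n), eri])
beta, *_ = np.linalg.lstsq(X, ghq, rcond=None)
print(f"intercept={beta[0]:.2f}, slope={beta[1]:.2f}")
```

A positive fitted slope reproduces, on toy data, the direction of the association reported above; the real analysis additionally nests participants within ambulance services.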

Relevance: 20.00%

Abstract:

In the last five years, Deep Brain Stimulation (DBS) has become the most popular and effective surgical technique for the treatment of Parkinson's disease (PD). The Subthalamic Nucleus (STN) is the usual target involved when applying DBS. Unfortunately, the STN is in general not visible in common medical imaging modalities. Therefore, atlas-based segmentation is commonly considered to locate it in the images. In this paper, we propose a scheme that allows us both to compare different registration algorithms and to evaluate their ability to locate the STN automatically. Using this scheme, we can weigh expert variability against the error of the algorithms, and we demonstrate that automatic STN localization is possible and as accurate as the methods currently used.
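The comparison of algorithm error against expert variability can be sketched as follows; the coordinates are hypothetical, and measuring variability as the mean pairwise distance between annotations (with the expert centroid as reference) is one simple choice, not necessarily the paper's exact protocol.

```python
import numpy as np

# Hypothetical expert annotations of the STN centre (mm, image space)
experts = np.array([[10.2, 12.1, 4.0],
                    [10.8, 11.7, 4.3],
                    [ 9.9, 12.4, 3.8]])
algo = np.array([10.5, 12.0, 4.2])  # hypothetical atlas-based estimate

# inter-expert variability: mean pairwise distance between annotations
pairs = [(i, j) for i in range(len(experts)) for j in range(i + 1, len(experts))]
variability = np.mean([np.linalg.norm(experts[i] - experts[j]) for i, j in pairs])

# algorithm error: distance to the experts' centroid
error = np.linalg.norm(algo - experts.mean(axis=0))
print(f"expert variability {variability:.2f} mm, algorithm error {error:.2f} mm")
```

An automatic method whose error stays within the inter-expert spread can be called "as accurate as the methods currently used" in the sense of the abstract.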

Relevance: 20.00%

Abstract:

An efficient method is developed for an iterative solution of the Poisson and Schrödinger equations, which allows systematic studies of the properties of the electron gas in linear deep-etched quantum wires. A much simpler two-dimensional (2D) approximation is developed that accurately reproduces the results of the 3D calculations. A 2D Thomas-Fermi approximation is then derived, and shown to give a good account of average properties. Further, we prove that an analytic form due to Shikin et al. is a good approximation to the electron density given by the self-consistent methods.
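The self-consistent Schrödinger-Poisson iteration at the heart of such calculations can be sketched in one dimension; the units, coupling constant, and simple linear mixing below are toy assumptions for illustration, not the paper's model of deep-etched wires.

```python
import numpy as np

# Toy 1D self-consistent Schrödinger-Poisson loop (hbar^2/2m = 1,
# arbitrary coupling kappa): hard-wall box plus a Hartree-like potential.
N, L_box = 200, 1.0
h = L_box / (N + 1)

# finite-difference Laplacian with hard-wall (Dirichlet) boundaries
lap = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
       + np.diag(np.ones(N - 1), -1)) / h**2
T = -0.5 * lap                      # kinetic energy operator

kappa, n_e, mix, tol = 20.0, 1.0, 0.3, 1e-6
V = np.zeros(N)
for it in range(500):
    # Schrödinger step: diagonalize H = T + diag(V)
    E, psi = np.linalg.eigh(T + np.diag(V))
    phi0 = psi[:, 0] / np.sqrt(h)   # normalize so sum(|phi0|^2) * h = 1
    dens = n_e * phi0**2
    # Poisson step: solve d^2 V_H / dx^2 = -kappa * dens, V_H = 0 at walls
    V_H = np.linalg.solve(lap, -kappa * dens)
    # simple linear mixing stabilizes the fixed-point iteration
    V_new = (1 - mix) * V + mix * V_H
    if np.max(np.abs(V_new - V)) < tol:
        V = V_new
        break
    V = V_new
print(f"converged after {it + 1} iterations, ground-state energy {E[0]:.3f}")
```

The same alternation (solve Schrödinger, rebuild the density, solve Poisson, mix) is what an "iterative solution of the Poisson and Schrödinger equations" refers to; production codes replace linear mixing with faster schemes.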

Relevance: 20.00%

Abstract:

We study the spectrum and magnetic properties of double quantum dots in the lowest Landau level for different values of the hopping and Zeeman parameters by means of exact diagonalization techniques in systems of N=6 and 7 electrons and a filling factor close to 2. We compare our results with those obtained in double quantum layers and single quantum dots. The Kohn theorem is also discussed.
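The exact-diagonalization step itself can be illustrated on a far smaller toy than the interacting N=6, 7 systems above: a single electron in a double dot with interdot hopping t and Zeeman splitting dz, whose Hamiltonian and parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy exact diagonalization: one electron on (left/right dot) x (spin),
# H = -t * tau_x (x) I + (dz/2) * I (x) sigma_z.
t, dz = 1.0, 0.4
sx = np.array([[0.0, 1.0], [1.0, 0.0]])   # pseudo-spin x: interdot hopping
sz = np.array([[1.0, 0.0], [0.0, -1.0]])  # spin z: Zeeman term
I2 = np.eye(2)

H = -t * np.kron(sx, I2) + 0.5 * dz * np.kron(I2, sz)
E = np.linalg.eigvalsh(H)
print(E)  # expected: [-t - dz/2, -t + dz/2, t - dz/2, t + dz/2]
```

For the interacting dots of the paper, the same diagonalization is applied to a many-electron basis restricted to the lowest Landau level, which is where the real computational work lies.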

Relevance: 20.00%

Abstract:

In addition to genetic changes affecting the function of gene products, changes in gene expression have been suggested to underlie many or even most of the phenotypic differences among mammals. However, detailed gene expression comparisons were, until recently, restricted to closely related species, owing to technological limitations. We therefore took advantage of the latest technologies (RNA-Seq) to generate extensive qualitative and quantitative transcriptome data for a unique collection of somatic and germline tissues from representatives of all major mammalian lineages (placental mammals, marsupials and monotremes) and birds, the evolutionary outgroup.

In the first major project of my thesis, we performed global comparative analyses of gene expression levels based on these data. Our analyses provided fundamental insights into the dynamics of transcriptome change during mammalian evolution (e.g., the rate of expression change across species, tissues and chromosomes) and allowed the exploration of the functional relevance and phenotypic implications of transcription changes at a genome-wide scale (e.g., we identified numerous potentially selectively driven expression switches).

In a second project of my thesis, also based on the unique transcriptome data generated in the first project, we focused on the evolution of alternative splicing in mammals. Alternative splicing contributes to transcriptome complexity by generating several transcript isoforms from a single gene, which can thus perform various functions. To complement the global comparative analysis of gene expression changes, we explored patterns of alternative splicing evolution. This work uncovered several general and unexpected patterns (e.g., we found that alternative splicing evolves extremely rapidly) as well as a large number of conserved alternative isoforms that may be crucial for the functioning of mammalian organs.

Finally, the third project of my PhD consisted in analyzing in detail the unique functional and evolutionary properties of the testis by exploring the extent of its transcriptome complexity. This organ was previously shown to evolve rapidly at both the phenotypic and molecular levels, apparently because of the specific pressures that act on it and are associated with its reproductive function. Moreover, my analyses of the amniote tissue transcriptome data described above revealed strikingly widespread transcriptional activity of both functional and nonfunctional genomic elements in the testis compared with the other organs. To elucidate the cellular source and mechanisms underlying this promiscuous transcription in the testis, we generated deep-coverage RNA-Seq data for all major testis cell types as well as epigenetic data (DNA and histone methylation), using the mouse as a model system. The integration of this complete dataset revealed that meiotic and especially post-meiotic germ cells are the major contributors to the widespread functional and nonfunctional transcriptome complexity of the testis, and that this "promiscuous" spermatogenic transcription results, at least partially, from an overall transcriptionally permissive chromatin state. We hypothesize that this particularly open chromatin state results from the extensive chromatin remodeling that occurs during spermatogenesis, which ultimately leads to the replacement of histones by protamines in mature spermatozoa. Our results have important functional and evolutionary implications (e.g., regarding new gene birth and testicular gene expression evolution).

Generally, these three large-scale projects of my thesis provide complete and massive datasets that constitute valuable resources for further functional and evolutionary analyses of mammalian genomes.
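A cross-species expression comparison of the kind described above can be sketched with a rank correlation between expression profiles; the species names, noise levels, and simulated values below are illustrative assumptions, not the thesis's RNA-Seq data.

```python
import numpy as np

# Toy sketch: Spearman correlation between log-expression profiles,
# where divergence from a shared expression program grows with the
# simulated noise level s.
def rankdata(v):
    order = np.argsort(v)
    ranks = np.empty(len(v))
    ranks[order] = np.arange(len(v))
    return ranks

def spearman(a, b):
    ra, rb = rankdata(a), rankdata(b)
    ra -= ra.mean()
    rb -= rb.mean()
    return float(ra @ rb / np.sqrt((ra @ ra) * (rb @ rb)))

rng = np.random.default_rng(7)
base = rng.normal(5, 2, 1000)                   # shared expression program
species = {name: base + rng.normal(0, s, 1000)  # divergence grows with s
           for name, s in [("human", 0.5), ("mouse", 1.0), ("platypus", 2.0)]}

rho_hm = spearman(species["human"], species["mouse"])
rho_hp = spearman(species["human"], species["platypus"])
print(f"human-mouse rho={rho_hm:.2f}, human-platypus rho={rho_hp:.2f}")
```

On such toy data the correlation decays with divergence, which is the qualitative pattern a comparative transcriptome analysis quantifies across real lineages.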

Relevance: 20.00%

Abstract:

A mobile ad hoc network (MANET) is a decentralized, infrastructure-less network. This thesis aims to provide system-level support for developers of applications or protocols in such networks. To do this, we propose contributions in both the algorithmic realm and the practical realm. In the algorithmic realm, we contribute to the field by proposing different context-aware broadcast and multicast algorithms for MANETs, namely six-shot broadcast, six-shot multicast, PLAN-B, and a generic algorithmic approach to optimize the power consumption of existing algorithms. We compare each algorithm we propose to existing algorithms that are either probabilistic or context-aware, and then evaluate their performance based on simulations. We demonstrate that in some cases context-aware information, such as location or signal strength, can improve efficiency. In the practical realm, we propose a testbed framework, ManetLab, to implement and deploy MANET-specific protocols and to evaluate their performance. This testbed framework aims to increase the accuracy of performance evaluation compared with simulations, while keeping the ease of use offered by simulators to reproduce a performance evaluation. By evaluating the performance of different probabilistic algorithms with ManetLab, we observe that simulations and testbeds should be used in a complementary way. In addition to these original contributions, we also provide two surveys about system-level support for ad hoc communications in order to establish the state of the art: the first covers existing broadcast algorithms, and the second covers existing middleware solutions and the way they deal with privacy, especially location privacy.
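The probabilistic broadcast idea evaluated in the thesis can be illustrated with a toy simulation: each node that receives a message rebroadcasts it with probability p, so p = 1 is plain flooding. The random-geometric topology and parameter values below are illustrative assumptions, not the six-shot algorithms themselves.

```python
import random

# Toy gossip-vs-flooding comparison on a random geometric graph.
def geometric_graph(n, radius, rng):
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    nbrs = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
            if dx * dx + dy * dy <= radius * radius:
                nbrs[i].append(j)
                nbrs[j].append(i)
    return nbrs

def broadcast(nbrs, source, p, rng):
    """Each node that first receives the message rebroadcasts with
    probability p. Returns (set of reached nodes, transmission count)."""
    reached, frontier, tx = {source}, [source], 0
    while frontier:
        nxt = []
        for node in frontier:
            if rng.random() <= p:        # probabilistic rebroadcast decision
                tx += 1
                for nb in nbrs[node]:
                    if nb not in reached:
                        reached.add(nb)
                        nxt.append(nb)
        frontier = nxt
    return reached, tx

rng = random.Random(42)
g = geometric_graph(60, 0.35, rng)
flood, flood_tx = broadcast(g, 0, 1.0, rng)
gossip, gossip_tx = broadcast(g, 0, 0.7, rng)
print(f"flooding: {len(flood)} nodes, {flood_tx} tx; "
      f"gossip: {len(gossip)} nodes, {gossip_tx} tx")
```

Context-aware variants replace the fixed probability p with a decision based on, for example, location or signal strength, which is the direction the thesis's algorithms take.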

Relevance: 20.00%

Abstract:

The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. 
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
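The propensity score matching step described above can be sketched with a logistic model of "posttest membership" on the matching covariates, followed by greedy nearest-neighbour matching on the estimated score; the simulated data and coefficients below are illustrative assumptions, not the SCS survey data.

```python
import numpy as np

# Sketch of propensity-score matching on age, gender and education.
rng = np.random.default_rng(1)
n = 400
age = rng.uniform(18, 80, n)
gender = rng.integers(0, 2, n).astype(float)
educ = rng.integers(1, 4, n).astype(float)
# "treatment" (posttest membership) depends mildly on the covariates
logit = -2.0 + 0.02 * age + 0.3 * gender
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([np.ones(n), age, gender, educ])

# fit logistic regression by Newton's method (IRLS)
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (treated - p)
    H = X.T @ (X * W[:, None])
    beta += np.linalg.solve(H, grad)
score = 1 / (1 + np.exp(-X @ beta))

# greedy 1-nearest-neighbour matching of treated units to distinct controls
t_idx = np.flatnonzero(treated)
available = set(np.flatnonzero(~treated).tolist())
pairs = []
for i in t_idx:
    j = min(available, key=lambda c: abs(score[i] - score[c]))
    pairs.append((i, j))
    available.remove(j)
print(f"matched {len(pairs)} treated units to distinct controls")
```

Balancing pretest and posttest samples on the score in this way is what allows the subsequent outcome comparison to be interpreted as (approximately) like-with-like, with the sensitivity analysis probing what unobserved covariates could still overturn.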

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the long-term success rate and complications of nonpenetrating deep sclerectomy with collagen implant in open-angle glaucoma. PATIENTS AND METHODS: Clinical, prospective, monocentric, nonrandomized, unmasked study on 105 patients with medically uncontrolled glaucoma. A standard deep sclerectomy with collagen implant procedure was performed. Complete examinations were performed before surgery and postoperatively at 1 and 7 days; at 1, 2, 3, 6, 9, and 12 months; and then every 6 months for the following 10 years. RESULTS: The mean follow-up was 101.5+/-43.1 (3 to 144) months [mean+/-SD, (range)]. The preoperative intraocular pressure (IOP) was 26.8+/-7.7 (14 to 52) mm Hg and the best-corrected visual acuity 0.71+/-0.33 (0.02 to 1.5). Ten years after surgery, IOP was 12.2+/-4.7 (6 to 20) mm Hg and best-corrected visual acuity 0.63+/-0.34 (0.01 to 1.2) (number of remaining patients=52). The mean number of medications per patient went from 2.3+/-0.7 (1 to 4) down to 1.3+/-1.1 (0 to 3). An IOP ≤21 mm Hg was achieved in 47.7% of patients without medication and in 89% with or without treatment. One major complication was reported. Goniopuncture was performed in 61 eyes (59.8%), and 5-fluorouracil treatment was given to 25 patients postoperatively, including needling (n=5). CONCLUSIONS: On the basis of a 10-year follow-up, deep sclerectomy with collagen implant demonstrated its efficacy in controlling IOP with few postoperative complications.

Relevance: 20.00%

Abstract:

Recently a new Bell inequality has been introduced by Collins et al. [Phys. Rev. Lett. 88, 040404 (2002)], which is strongly resistant to noise for maximally entangled states of two d-dimensional quantum systems. We prove that a larger violation, or equivalently a stronger resistance to noise, is found for a nonmaximally entangled state. It is shown that the resistance to noise is not a good measure of nonlocality, and we introduce some other possible measures. The nonmaximally entangled state turns out to be more robust also for these alternative measures. From these results it follows that two von Neumann measurements per party may not be optimal for detecting nonlocality. For d=3,4, we point out some connections between this inequality and distillability. Indeed, we demonstrate that any state violating it, with the optimal von Neumann settings, is distillable.

Relevance: 20.00%

Abstract:

ABSTRACT: Background: The aim of this study was to evaluate the midterm biocompatibility of a new x-shaped implant made of zirconium in an animal model of glaucoma surgery. Methods: Preoperatively, ultrasound biomicroscopy (UBM), intraocular pressure (IOP) and outflow facility (OF) data were acquired. Upon surgery, one eye was chosen randomly to receive an implant, while the other received none. Ten rabbits went through a 1-, 2-, 3-, 4- and 6-month follow-up. IOP was measured regularly, and UBM was performed at 1, 3 and 6 months after surgery. At the end of the follow-up, OF was again measured. Histology sections were analyzed. Results: For both groups IOP control was satisfactory, while OF initially increased at month 1 before resuming preoperative values thereafter. Eyes with implants had larger filtration blebs, which decreased faster than in eyes without the implant. Drainage vessel density, inflammatory cell number and fibrosis were higher in tissues near the implant. Conclusions: The zirconium implant initially promoted the positive effects of the surgery (IOP control, OF increase). Nevertheless, after several months, foreign body reactions and fibrosis had occurred on some implants, restraining the early benefit of the procedure. Modifications of the zirconium implant geometry could enhance the overall success rate.

Relevance: 20.00%

Abstract:

Purpose: Precise diagnosis of DVT of the legs is a challenging problem, not only when PE is suspected, but also in any presentation of leg pain, warmth and swelling. Clinical diagnosis has a low accuracy, and further investigations are mandatory in order to diagnose DVT. Among the possible investigations, US has a high specificity and a good NPV. However, many pathologies unrelated to the veins may mimic the signs and symptoms of DVT and have to be recognized in order to make the correct diagnosis. The purpose of this paper is to review the results of the US investigations of the legs performed in our Department during the last three years for a suspicion of DVT, and to describe alternative diagnoses mimicking DVT. Methods and materials: Through a RIS-based search, we retrospectively reviewed all the cases of US of the legs performed in our Department between January 2006 and December 2008 for a suspicion of DVT. We selected the cases of positive findings unrelated to the veins and illustrated these findings with characteristic images. Results: 419 US of the legs were performed between December 2006 and December 2008 for a suspicion of DVT. Among these, 75 were positive for DVT, and 79 for an alternative diagnosis. The most common alternative diagnosis was edema of the legs (31%), followed by hematoma (23%). Other findings were Baker cysts (13%), cellulitis (10%) and lymphoceles (5%). Rare diagnoses were arterio-venous malformations, pseudoaneurysms, pelvic masses, necrotizing fasciitis, intramuscular abscesses, subcutaneous seromas, sarcoma and ganglion cysts. Conclusion: A greater knowledge of the US appearance of the pathologies mimicking DVT may help to make the correct diagnosis, avoiding further expensive investigations or inappropriate anticoagulant therapy.