29 results for "merged beams"
Abstract:
BACKGROUND AND PURPOSE: The DRAGON score predicts functional outcome in the hyperacute phase of intravenous thrombolysis treatment of ischemic stroke patients. We aimed to validate the score in a large multicenter cohort in anterior and posterior circulation. METHODS: Prospectively collected data of consecutive ischemic stroke patients who received intravenous thrombolysis in 12 stroke centers were merged (n=5471). We excluded patients lacking data necessary to calculate the score and patients with missing 3-month modified Rankin scale scores. The final cohort comprised 4519 eligible patients. We assessed the performance of the DRAGON score with area under the receiver operating characteristic curve in the whole cohort for both good (modified Rankin scale score, 0-2) and miserable (modified Rankin scale score, 5-6) outcomes. RESULTS: Area under the receiver operating characteristic curve was 0.84 (0.82-0.85) for miserable outcome and 0.82 (0.80-0.83) for good outcome. Proportions of patients with good outcome were 96%, 93%, 78%, and 0% for 0 to 1, 2, 3, and 8 to 10 score points, respectively. Proportions of patients with miserable outcome were 0%, 2%, 4%, 89%, and 97% for 0 to 1, 2, 3, 8, and 9 to 10 points, respectively. When tested separately for anterior and posterior circulation, there was no difference in performance (P=0.55); areas under the receiver operating characteristic curve were 0.84 (0.83-0.86) and 0.82 (0.78-0.87), respectively. No sex-related difference in performance was observed (P=0.25). CONCLUSIONS: The DRAGON score showed very good performance in the large merged cohort in both anterior and posterior circulation strokes. The DRAGON score provides rapid estimation of patient prognosis and supports clinical decision-making in the hyperacute phase of stroke care (eg, when invasive add-on strategies are considered).
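The discrimination statistic used above, the area under the receiver operating characteristic curve, can be computed directly from score/outcome pairs via the rank-based Mann-Whitney formulation. A minimal sketch with synthetic data standing in for DRAGON scores and 3-month mRS outcomes (all values below are illustrative, not taken from the study):

```python
import numpy as np
from scipy.stats import rankdata

def auc_roc(y_true, y_score):
    """Rank-based AUC, equivalent to the normalised Mann-Whitney U statistic."""
    y_true = np.asarray(y_true, dtype=bool)
    ranks = rankdata(y_score)                 # average ranks handle tied scores
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    u = ranks[y_true].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

rng = np.random.default_rng(0)
score = rng.integers(0, 11, size=2000)        # hypothetical 0-10 scores
p_bad = 1 / (1 + np.exp(-(score - 7)))        # simulated risk rising with score
miserable = rng.random(2000) < p_bad          # simulated mRS 5-6 outcome
auc = auc_roc(miserable, score)
```

A perfectly discriminating score yields an AUC of 1.0; 0.5 corresponds to chance.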
Abstract:
PURPOSE: We investigated the influence of beam modulation on treatment planning by comparing four available stereotactic radiosurgery (SRS) modalities: Gamma Knife Perfexion, Novalis Tx Dynamic Conformal Arc (DCA) and Dynamic Multileaf Collimation Intensity-Modulated Radiotherapy (DMLC-IMRT), and CyberKnife. MATERIAL AND METHODS: Patients with arteriovenous malformations (n = 10) or acoustic neuromas (n = 5) were planned with the different treatment modalities. The Paddick conformity index (CI), dose heterogeneity (DH), gradient index (GI) and beam-on time were used as dosimetric indices. RESULTS: Gamma Knife Perfexion achieved a high degree of conformity (CI = 0.77 ± 0.04) with limited low-dose spread (GI = 2.59 ± 0.10) around an inhomogeneous dose distribution (DH = 0.84 ± 0.05), at the cost of treatment time (68.1 min ± 27.5). Novalis Tx DCA improved on this inhomogeneity (DH = 0.30 ± 0.03) and treatment time (16.8 min ± 2.2) at the cost of conformity (CI = 0.66 ± 0.04), and Novalis Tx DMLC-IMRT improved on the DCA conformity (CI = 0.68 ± 0.04) and inhomogeneity (DH = 0.18 ± 0.05) at the cost of low-dose spread (GI = 3.94 ± 0.92) and treatment time (21.7 min ± 3.4) (p < 0.01). CyberKnife achieved comparable conformity (CI = 0.77 ± 0.06) at the cost of low-dose spread (GI = 3.48 ± 0.47) around a homogeneous dose distribution (DH = 0.22 ± 0.02) and treatment time (28.4 min ± 8.1) (p < 0.01). CONCLUSIONS: Gamma Knife Perfexion complies with all SRS constraints (high conformity while minimizing low-dose spread). Multiple focal entries (Gamma Knife Perfexion and CyberKnife) achieve better conformity than the High-Definition MLC of Novalis Tx, at the cost of treatment time. Non-isocentric beams (CyberKnife) or IMRT beams (Novalis Tx DMLC-IMRT) spread more low dose than multiple isocenters (Gamma Knife Perfexion) or dynamic arcs (Novalis Tx DCA). Inverse planning and modulated fluences (Novalis Tx DMLC-IMRT and CyberKnife) deliver the most homogeneous treatment.
Furthermore, linac-based systems (Novalis and CyberKnife) can perform image verification at the time of treatment delivery.
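The dosimetric indices compared above have simple volume-based definitions: the Paddick CI is TV_PIV² / (TV × PIV), and the GI is the ratio of the half-prescription isodose volume to the prescription isodose volume. A small sketch with made-up volumes (not values from the study):

```python
def paddick_ci(tv, piv, tv_piv):
    """Paddick conformity index: TV_PIV^2 / (TV * PIV).

    tv     -- target volume (cm^3)
    piv    -- prescription isodose volume (cm^3)
    tv_piv -- target volume covered by the prescription isodose (cm^3)
    """
    return tv_piv ** 2 / (tv * piv)

def gradient_index(piv_half, piv):
    """Gradient index: half-prescription isodose volume / PIV."""
    return piv_half / piv

# Hypothetical volumes, chosen only for illustration
ci = paddick_ci(tv=10.0, piv=12.0, tv_piv=9.5)
gi = gradient_index(piv_half=30.0, piv=12.0)
```

A CI of 1 means the prescription isodose exactly matches the target; a lower GI means a steeper dose fall-off outside the target.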
Abstract:
BACKGROUND AND AIMS: The structured IBD Ahead 'Optimised Monitoring' programme was designed to obtain the opinion, insight and advice of gastroenterologists on optimising the monitoring of Crohn's disease activity in four settings: (1) assessment at diagnosis, (2) monitoring in symptomatic patients, (3) monitoring in asymptomatic patients, and (4) postoperative follow-up. For each of these settings, four monitoring methods were discussed: (a) symptom assessment, (b) endoscopy, (c) laboratory markers, and (d) imaging. Based on a literature search and expert opinion compiled during an international consensus meeting, recommendations were given to answer the question 'which diagnostic method, when, and how often'. The International IBD Ahead Expert Panel advised tailoring this guidance to the healthcare system and the specific prerequisites of each country. The IBD Ahead Swiss National Steering Committee proposes best-practice recommendations adapted for Switzerland. METHODS: The IBD Ahead Steering Committee identified key questions and provided the Swiss Expert Panel with a structured literature review. The expert panel agreed on a set of statements. During an international expert meeting, the consolidated outcome of the national meetings was merged into final statements agreed by the participating International and National Steering Committee members: the IBD Ahead 'Optimised Monitoring' Consensus. RESULTS: A systematic assessment of symptoms, endoscopy findings, and laboratory markers, with special emphasis on faecal calprotectin, is deemed necessary even in symptom-free patients. The choice of recommended imaging methods is adapted to the specific situation in Switzerland and highlights the importance of ultrasonography and magnetic resonance imaging besides endoscopy.
CONCLUSION: The recommendations stress the importance of monitoring disease activity on a regular basis and with objective parameters, such as faecal calprotectin and endoscopy with detailed documentation of findings. Physicians should not rely on symptoms alone and should adapt the monitoring schedule and choice of options to individual situations.
Abstract:
In this paper, aggregate migration patterns during fluid concrete castings are studied through experiments, a dimensionless approach and numerical modeling. The experimental results obtained on two beams show that gravity-induced migration primarily affects the coarsest aggregates, resulting in a decrease of the coarse-aggregate volume fraction with horizontal distance from the pouring point and in a puzzling vertical multi-layer structure. The origin of this multi-layer structure is discussed and analyzed with the help of numerical simulations of free-surface flow. Our results suggest that it finds its origin in the non-Newtonian nature of fresh concrete and that increasing the casting rate should decrease the magnitude of gravity-induced particle migration.
Abstract:
Objective: To report our institutional experience in the treatment of diffuse intrinsic pontine glioma (DIPG) with a hypofractionated external beam radiotherapy schedule. Materials and Methods: Between April 1996 and January 2004, 22 patients, aged 2.9-12.5 years, with newly diagnosed DIPG were treated with hypofractionated radiation therapy delivering a total dose of 45 Gy in daily fractions of 3 Gy over 3 weeks. No other treatment was applied concomitantly. Results: Fourteen of the 22 patients received the prescribed dose of 45 Gy in 15 fractions of 3 Gy; two patients received total doses of 60 and 45 Gy with a combination of two different beams (photons and neutrons); in 5 cases the daily fraction was reduced to 2 Gy because of poor tolerance; and one patient died of severe intracranial hypertension after two fractions of 3 Gy and one of 2 Gy. Fourteen of the 22 patients showed clinical improvement, usually starting in the second week of treatment. No grade 3 or 4 acute toxicity from radiotherapy was observed, and no treatment interruption was needed. In six patients, steroids could be discontinued within one month after the end of radiotherapy. The median time to progression and the median overall survival were 5.7 months and 7.6 months, respectively. Conclusion: External radiotherapy with a radical hypofractionated regimen is feasible and well tolerated in children with newly diagnosed DIPG. However, this regimen does not seem to change overall survival in this setting. It could represent a short-duration alternative to more protracted regimens.
Abstract:
PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcomes improve. The dose distribution calculated with a commercial treatment planning system (TPS) is often used to estimate radiation carcinogenesis for the radiotherapy patient; however, for locations beyond the treatment field borders, its accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan administered with a CyberKnife and a TomoTherapy machine, and to compare the measurements with the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure the absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for the CyberKnife and TomoTherapy machines were compared with the dose calculations of the TPS. RESULTS: Both TPSs underestimate the dose far from the target volume. Quantitatively, the CyberKnife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60, and the TomoTherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the CyberKnife TPS can predict doses down to approximately 10 mGy per treatment Gy and the TomoTherapy TPS down to 0.75 mGy per treatment Gy; the CyberKnife TPS can then be used up to 10 cm from the PTV border, the TomoTherapy TPS up to 35 cm. CONCLUSIONS: We determined that the CyberKnife and TomoTherapy TPSs substantially underestimate doses far from the treated volume. It is recommended not to use out-of-field doses from the CyberKnife TPS for applications such as modeling of second cancer induction. The TomoTherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm³ PTV).
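The underestimation factors quoted above are simply ratios of measured to TPS-calculated out-of-field dose. A trivial sketch (the dose values are hypothetical, chosen only to mirror the reported factor of 60):

```python
def underestimation_factor(measured_dose, tps_dose):
    """How many times larger the measured out-of-field dose is than the TPS prediction."""
    return measured_dose / tps_dose

# Hypothetical TLD measurement vs TPS prediction at 40 cm from the PTV border,
# both in mGy per treatment Gy (illustrative numbers only)
factor = underestimation_factor(measured_dose=6.0, tps_dose=0.1)
```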
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones has created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies, both in software and in hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
BACKGROUND AND PURPOSE: Several prognostic scores have been developed to predict the risk of symptomatic intracranial hemorrhage (sICH) after ischemic stroke thrombolysis. We compared the performance of these scores in a multicenter cohort. METHODS: We merged prospectively collected data on consecutive ischemic stroke patients who received intravenous thrombolysis in 7 stroke centers. We identified and evaluated 6 scores that can provide an estimate of the risk of sICH in hyperacute settings: MSS (Multicenter Stroke Survey); HAT (Hemorrhage After Thrombolysis); SEDAN (blood sugar, early infarct signs, [hyper]dense cerebral artery sign, age, NIH Stroke Scale); GRASPS (glucose at presentation, race [Asian], age, sex [male], systolic blood pressure at presentation, and severity of stroke at presentation [NIH Stroke Scale]); SITS (Safe Implementation of Thrombolysis in Stroke); and the SPAN (stroke prognostication using age and NIH Stroke Scale)-100 positive index. We included only patients for whom the variables for all scores were available. We calculated the area under the receiver operating characteristic curve (AUC-ROC) and also performed logistic regression and the Hosmer-Lemeshow test. RESULTS: The final cohort comprised 3012 eligible patients, of whom 221 (7.3%) had sICH per the National Institute of Neurological Disorders and Stroke criteria, 141 (4.7%) per the European Cooperative Acute Stroke Study II criteria, and 86 (2.9%) per the Safe Implementation of Thrombolysis in Stroke criteria. The performance of the scores assessed with AUC-ROC for predicting European Cooperative Acute Stroke Study II sICH was: MSS, 0.63 (95% confidence interval, 0.58-0.68); HAT, 0.65 (0.60-0.70); SEDAN, 0.70 (0.66-0.73); GRASPS, 0.67 (0.62-0.72); SITS, 0.64 (0.59-0.69); and SPAN-100 positive index, 0.56 (0.50-0.61). SEDAN had significantly higher AUC-ROC values than all other scores except GRASPS, for which the difference was nonsignificant. SPAN-100 performed significantly worse than the other scores.
The discriminative ranking of the scores was the same for the National Institute of Neurological Disorders and Stroke and the Safe Implementation of Thrombolysis in Stroke definitions, with SEDAN performing best, GRASPS second, and SPAN-100 worst. CONCLUSIONS: SPAN-100 had the worst predictive power, and SEDAN consistently the highest. However, none of the scores performed better than moderately.
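Besides AUC-ROC, the evaluation above uses the Hosmer-Lemeshow test, which bins predicted risks and compares observed with expected event counts per bin. A sketch on simulated data (the group count, sample size, and risk range are arbitrary illustration choices, not study parameters):

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic and p-value."""
    order = np.argsort(p)
    y = np.asarray(y, dtype=float)[order]
    p = np.asarray(p, dtype=float)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(y)), groups):
        n_g = len(idx)
        obs, exp = y[idx].sum(), p[idx].sum()
        pi_bar = exp / n_g                       # mean predicted risk in the bin
        stat += (obs - exp) ** 2 / (n_g * pi_bar * (1 - pi_bar) + 1e-12)
    return stat, chi2.sf(stat, groups - 2)       # chi-square with g-2 df

rng = np.random.default_rng(42)
p = rng.uniform(0.01, 0.3, 3000)                 # simulated predicted sICH risks
y = rng.random(3000) < p                          # outcomes drawn from those risks
stat, p_value = hosmer_lemeshow(y, p)
```

A large p-value indicates no evidence of miscalibration; the test complements AUC-ROC, which measures discrimination only.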
Abstract:
A sense of calling in one's career is supposed to have positive implications for individuals and organizations, but current theoretical development is plagued by incongruent conceptualizations of what does or does not constitute a calling. The present study used cluster analysis to identify essential and optional components of the presence of a calling among 407 German undergraduate students from different majors. Three types of calling emerged: "negative career self-centered", "pro-social religious", and "positive varied work orientation". All types could be described as vocational identity achieved (high commitment/high self-exploration) and as high in career confidence and career engagement. Centrality of work or religion, endorsement of specific work values, and positivity of core self-evaluations were not defining characteristics. The results suggest that callings entail intense self-exploration and might be beneficial because they correspond with identity achievement and promote career confidence and engagement, while not necessarily having pro-social orientations. Suggestions for future research, theory and practice are offered.
Abstract:
Intensity-Modulated RadioTherapy (IMRT) is a treatment technique that uses beams of modulated fluence. IMRT is now widespread in industrialized countries, owing to its improved dose conformation around the target volume and its ability to lower doses to organs at risk in complex clinical cases. One way to carry out beam modulation is to sum smaller beams (beamlets) with the same incidence; this technique is called step-and-shoot IMRT. In a clinical context, treatment plans must be verified before the first irradiation, and plan verification is still an open issue for this technique. An independent monitor unit calculation (representative of the weight of each beamlet) cannot be performed for step-and-shoot IMRT, because beamlet weights are not known a priori but are calculated by inverse planning. Besides, treatment plan verification by comparison with measured data is time-consuming and is performed in a simplified geometry, usually in a cubic water phantom with all machine angles set to zero. In this work, an independent method for monitor unit calculation for step-and-shoot IMRT is described. The method is based on the Monte Carlo code EGSnrc/BEAMnrc; the Monte Carlo model of the linear accelerator head was validated by comparing simulated and measured dose distributions over a large range of situations. The beamlets of an IMRT treatment plan are calculated individually by Monte Carlo in the exact geometry of the treatment, and the resulting dose distributions are converted to absorbed dose to water per monitor unit. The dose of the whole treatment in each volume element (voxel) can then be expressed as a linear matrix equation of the monitor units and the dose per monitor unit of every beamlet. This equation is solved with a Non-Negative Least Squares (NNLS) fitting algorithm. Because computer limitations prevent using every voxel inside the patient volume, several voxel-selection strategies were tested; the best choice is to use the voxels inside the Planning Target Volume (PTV). The method was tested on eight clinical cases representative of usual radiotherapy treatments. The monitor units obtained lead to clinically equivalent global dose distributions. This independent monitor unit calculation method for step-and-shoot IMRT is therefore validated and can be used in clinical routine. A similar method could be applied to other treatment modalities, such as tomotherapy or volumetric modulated arc therapy.
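The matrix formulation described above, total dose per voxel as a linear combination of per-beamlet monitor units and dose-per-MU distributions, can be sketched with scipy's NNLS solver. The matrix sizes and values below are synthetic placeholders, not data from the thesis:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_voxels, n_beamlets = 200, 12            # hypothetical PTV voxels and beamlets

# Dose per monitor unit for each (voxel, beamlet) pair, e.g. from Monte Carlo
dose_per_mu = rng.random((n_voxels, n_beamlets))

# Synthesise a "planned" dose from known monitor units, then try to recover them
mu_true = rng.uniform(5.0, 50.0, n_beamlets)
planned_dose = dose_per_mu @ mu_true

# Non-negative least squares: find mu >= 0 minimising ||dose_per_mu @ mu - planned_dose||
mu_fit, residual = nnls(dose_per_mu, planned_dose)
```

The non-negativity constraint matters physically: a beamlet cannot deliver negative monitor units.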
Abstract:
La biologie de la conservation est communément associée à la protection de petites populations menacées d?extinction. Pourtant, il peut également être nécessaire de soumettre à gestion des populations surabondantes ou susceptibles d?une trop grande expansion, dans le but de prévenir les effets néfastes de la surpopulation. Du fait des différences tant quantitatives que qualitatives entre protection des petites populations et contrôle des grandes, il est nécessaire de disposer de modèles et de méthodes distinctes. L?objectif de ce travail a été de développer des modèles prédictifs de la dynamique des grandes populations, ainsi que des logiciels permettant de calculer les paramètres de ces modèles et de tester des scénarios de gestion. Le cas du Bouquetin des Alpes (Capra ibex ibex) - en forte expansion en Suisse depuis sa réintroduction au début du XXème siècle - servit d?exemple. Cette tâche fut accomplie en trois étapes : En premier lieu, un modèle de dynamique locale, spécifique au Bouquetin, fut développé : le modèle sous-jacent - structuré en classes d?âge et de sexe - est basé sur une matrice de Leslie à laquelle ont été ajoutées la densité-dépendance, la stochasticité environnementale et la chasse de régulation. Ce modèle fut implémenté dans un logiciel d?aide à la gestion - nommé SIM-Ibex - permettant la maintenance de données de recensements, l?estimation automatisée des paramètres, ainsi que l?ajustement et la simulation de stratégies de régulation. Mais la dynamique d?une population est influencée non seulement par des facteurs démographiques, mais aussi par la dispersion et la colonisation de nouveaux espaces. Il est donc nécessaire de pouvoir modéliser tant la qualité de l?habitat que les obstacles à la dispersion. Une collection de logiciels - nommée Biomapper - fut donc développée. 
Son module central est basé sur l?Analyse Factorielle de la Niche Ecologique (ENFA) dont le principe est de calculer des facteurs de marginalité et de spécialisation de la niche écologique à partir de prédicteurs environnementaux et de données d?observation de l?espèce. Tous les modules de Biomapper sont liés aux Systèmes d?Information Géographiques (SIG) ; ils couvrent toutes les opérations d?importation des données, préparation des prédicteurs, ENFA et calcul de la carte de qualité d?habitat, validation et traitement des résultats ; un module permet également de cartographier les barrières et les corridors de dispersion. Le domaine d?application de l?ENFA fut exploré par le biais d?une distribution d?espèce virtuelle. La comparaison à une méthode couramment utilisée pour construire des cartes de qualité d?habitat, le Modèle Linéaire Généralisé (GLM), montra qu?elle était particulièrement adaptée pour les espèces cryptiques ou en cours d?expansion. Les informations sur la démographie et le paysage furent finalement fusionnées en un modèle global. Une approche basée sur un automate cellulaire fut choisie, tant pour satisfaire aux contraintes du réalisme de la modélisation du paysage qu?à celles imposées par les grandes populations : la zone d?étude est modélisée par un pavage de cellules hexagonales, chacune caractérisée par des propriétés - une capacité de soutien et six taux d?imperméabilité quantifiant les échanges entre cellules adjacentes - et une variable, la densité de la population. Cette dernière varie en fonction de la reproduction et de la survie locale, ainsi que de la dispersion, sous l?influence de la densité-dépendance et de la stochasticité. Un logiciel - nommé HexaSpace - fut développé pour accomplir deux fonctions : 1° Calibrer l?automate sur la base de modèles de dynamique (par ex. calculés par SIM-Ibex) et d?une carte de qualité d?habitat (par ex. calculée par Biomapper). 2° Faire tourner des simulations. 
Il permet d?étudier l?expansion d?une espèce envahisseuse dans un paysage complexe composé de zones de qualité diverses et comportant des obstacles à la dispersion. Ce modèle fut appliqué à l?histoire de la réintroduction du Bouquetin dans les Alpes bernoises (Suisse). SIM-Ibex est actuellement utilisé par les gestionnaires de la faune et par les inspecteurs du gouvernement pour préparer et contrôler les plans de tir. Biomapper a été appliqué à plusieurs espèces (tant végétales qu?animales) à travers le Monde. De même, même si HexaSpace fut initialement conçu pour des espèces animales terrestres, il pourrait aisément être étndu à la propagation de plantes ou à la dispersion d?animaux volants. Ces logiciels étant conçus pour, à partir de données brutes, construire un modèle réaliste complexe, et du fait qu?ils sont dotés d?une interface d?utilisation intuitive, ils sont susceptibles de nombreuses applications en biologie de la conservation. En outre, ces approches peuvent également s?appliquer à des questions théoriques dans les domaines de l?écologie des populations et du paysage.<br/><br/>Conservation biology is commonly associated to small and endangered population protection. Nevertheless, large or potentially large populations may also need human management to prevent negative effects of overpopulation. As there are both qualitative and quantitative differences between small population protection and large population controlling, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The alpine Ibex (Capra ibex ibex) - which experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century - was used as paradigm species. 
This task was achieved in three steps: A local population dynamics model was first developed specifically for Ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with addition of density-dependence, environmental stochasticity and culling. This model was implemented into a management-support software - named SIM-Ibex - allowing census data maintenance, parameter automated assessment and culling strategies tuning and simulating. However population dynamics is driven not only by demographic factors, but also by dispersal and colonisation of new areas. Habitat suitability and obstacles modelling had therefore to be addressed. Thus, a software package - named Biomapper - was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA) whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, results validation and further processing; a module also allows mapping of dispersal barriers and corridors. ENFA application domain was then explored by means of a simulated species distribution. It was compared to a common habitat suitability assessing method, the Generalised Linear Model (GLM), and was proven better suited for spreading or cryptic species. Demography and landscape informations were finally merged into a global model. To cope with landscape realism and technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, population density. 
The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package, named HexaSpace, was developed to achieve two functions: (1) calibrating the automaton based on local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper), and (2) running simulations. It allows studying the spread of an invasive species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they may have many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
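The age- and sex-structured dynamics described above can be sketched as a Leslie-matrix projection with density dependence and culling. The function below is a minimal illustration: the names, the ceiling-type density dependence, and the per-class cull quota are assumptions for the sketch, not SIM-Ibex's actual implementation.

```python
import numpy as np

def project_population(n, fecundity, survival, carrying_capacity, cull=None):
    """Advance an age-structured population one time step.

    Leslie-matrix projection with a simple ceiling-type density
    dependence and optional culling. Illustrative sketch only.
    """
    k = len(n)
    A = np.zeros((k, k))
    A[0, :] = fecundity                              # top row: age-specific fecundity
    A[np.arange(1, k), np.arange(k - 1)] = survival  # sub-diagonal: survival rates
    n_next = A @ n
    total = n_next.sum()
    if total > carrying_capacity:                    # ceiling-type density dependence
        n_next *= carrying_capacity / total
    if cull is not None:                             # harvest quota per age class
        n_next = np.maximum(n_next - np.asarray(cull), 0.0)
    return n_next
```

Environmental stochasticity, which the abstract also mentions, could be added by drawing the survival and fecundity rates from random distributions at each step.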
Resumo:
BACKGROUND: Artemether-lumefantrine is the most widely used artemisinin-based combination therapy for malaria, although treatment failures occur in some regions. We investigated the effect of dosing strategy on efficacy in a pooled analysis from trials done in a wide range of malaria-endemic settings. METHODS: We searched PubMed for clinical trials that enrolled and treated patients with artemether-lumefantrine and were published from 1960 to December, 2012. We merged individual patient data from these trials using standardised methods. The primary endpoint was the PCR-adjusted risk of Plasmodium falciparum recrudescence by day 28. Secondary endpoints consisted of the PCR-adjusted risk of P falciparum recurrence by day 42, PCR-unadjusted risk of P falciparum recurrence by day 42, early parasite clearance, and gametocyte carriage. Risk factors for PCR-adjusted recrudescence were identified using a Cox regression model with frailty shared across the study sites. FINDINGS: We included 61 studies done between January, 1998, and December, 2012, comprising 14 327 patients in our analyses. The PCR-adjusted therapeutic efficacy was 97·6% (95% CI 97·4-97·9) at day 28 and 96·0% (95·6-96·5) at day 42. After controlling for age and parasitaemia, patients prescribed a higher dose of artemether had a lower risk of having parasitaemia on day 1 (adjusted odds ratio [OR] 0·92, 95% CI 0·86-0·99 for every 1 mg/kg increase in daily artemether dose; p=0·024), but not on day 2 (p=0·69) or day 3 (p=0·087). In Asia, children weighing 10-15 kg who received a total lumefantrine dose less than 60 mg/kg had the lowest PCR-adjusted efficacy (91·7%, 95% CI 86·5-96·9). In Africa, the risk of treatment failure was greatest in malnourished children aged 1-3 years (PCR-adjusted efficacy 94·3%, 95% CI 92·3-96·3).
A higher artemether dose was associated with a lower gametocyte presence within 14 days of treatment (adjusted OR 0·92, 95% CI 0·85-0·99; p=0·037 for every 1 mg/kg increase in total artemether dose). INTERPRETATION: The recommended dose of artemether-lumefantrine provides reliable efficacy in most patients with uncomplicated malaria. However, therapeutic efficacy was lowest in young children from Asia and young underweight children from Africa; a higher dose regimen should be assessed in these groups. FUNDING: Bill & Melinda Gates Foundation.
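The efficacy figures above are proportions with 95% confidence intervals. As a minimal sketch of how such a cure proportion and interval might be tabulated from merged patient data, the helper below computes a Wilson score interval; this is an assumption for illustration only, since the pooled analysis itself used survival methods and Cox regression with shared frailty, and does not state this CI method.

```python
import math

def efficacy_with_ci(n_cured, n_total, z=1.96):
    """Crude cure proportion with a Wilson score interval (z=1.96 for 95%).

    Illustrative only; not the estimator used in the pooled analysis.
    """
    p = n_cured / n_total
    denom = 1 + z ** 2 / n_total
    centre = (p + z ** 2 / (2 * n_total)) / denom           # shrunk point estimate
    half = (z / denom) * math.sqrt(p * (1 - p) / n_total
                                   + z ** 2 / (4 * n_total ** 2))
    return p, (centre - half, centre + half)
```

Unlike the normal-approximation interval, the Wilson interval stays inside [0, 1] even for proportions near 100%, which matters at efficacy levels like 97·6%.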
Resumo:
BACKGROUND: Deep brain stimulation (DBS) is recognized as an effective treatment for movement disorders. We recently changed our technique, limiting the number of brain penetrations to three per side. OBJECTIVES: The first aim was to evaluate electrode placement precision on both sides of surgery since we implemented this surgical technique. The second aim was to analyse whether electrode placement was improved by microrecording and macrostimulation. METHODS: We retrospectively reviewed the operation protocols and MRIs of 30 patients who underwent bilateral DBS. For microrecording and macrostimulation, we used three parallel channels of the 'Ben Gun' centred on the MRI-planned target. Pre- and post-operative MRIs were merged. The distance between the planned target and the centre of the implanted electrode artefact was measured. RESULTS: There was no significant difference in targeting precision between the two sides of surgery. More intra-operative adjustment of the second electrode's position was made on the basis of microrecording and macrostimulation, which brought the electrode significantly closer to the MRI-planned target on the medial-lateral axis. CONCLUSION: More electrode adjustment was needed on the second side, possibly related to brain shift. We therefore suggest performing a single central track with electrophysiological and clinical assessment, with multidirectional exploration on demand for suboptimal clinical responses.
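The precision measure described (distance between the planned target and the electrode artefact centre on merged pre- and post-operative MRIs) reduces to a Euclidean distance once both points are in the same coordinate frame. The helpers below are hypothetical names for illustration, not the study's software; the axis convention (axis 0 as medial-lateral) is also an assumption.

```python
import math

def targeting_error(planned, artefact_centre):
    """Euclidean distance (e.g. in mm) between the MRI-planned target and
    the centre of the implanted electrode artefact, both expressed in the
    same merged-image coordinate frame."""
    return math.dist(planned, artefact_centre)

def axis_deviation(planned, artefact_centre, axis=0):
    """Signed deviation along a single axis, e.g. axis 0 for the
    medial-lateral direction examined in the study."""
    return artefact_centre[axis] - planned[axis]
```

The signed per-axis deviation is what lets one ask whether adjustments systematically moved electrodes in one direction, as the medial-lateral finding suggests.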
Resumo:
The first molecular phylogenies of the flowering plant family Ranunculaceae were published more than twenty years ago, and have led to major changes in the infrafamilial classification. However, the current phylogeny is not yet well supported, and relationships among subfamilies and tribes of Ranunculaceae remain an open question. Eight molecular markers from the three genomes (nuclear, chloroplast and mitochondrial) were selected to investigate these relationships, including new markers for the family (two homologs of the nuclear CYCLOIDEA gene, the chloroplast gene ndhF, and the mitochondrial intron nad4-I1). The combination of multiple markers led to better resolution and higher support of phylogenetic relationships among subfamilies of Ranunculaceae, and among tribes within subfamily Ranunculoideae. Our results challenge the monophyly of Ranunculoideae as currently circumscribed due to the position of tribe Adonideae (Ranunculoideae), sister to Thalictroideae. We suggest that Thalictroideae could be merged with Ranunculoideae in an enlarged single subfamily.
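Combining eight markers from three genomes for a joint analysis typically involves concatenating the per-marker alignments into a single supermatrix with partition bounds per marker. The sketch below illustrates that step under stated assumptions (the data structures and the '?'-padding of taxa missing a marker are illustrative; the study's actual pipeline is not described in the abstract).

```python
def concatenate_markers(alignments):
    """Build a concatenated supermatrix from per-marker alignments.

    `alignments` maps marker name -> {taxon: aligned sequence}; taxa
    missing a marker are padded with '?'. Returns the supermatrix and
    1-based partition bounds per marker. Illustrative sketch only.
    """
    taxa = sorted({t for aln in alignments.values() for t in aln})
    rows = {t: [] for t in taxa}
    partitions, pos = {}, 0
    for marker, aln in alignments.items():
        length = len(next(iter(aln.values())))       # alignment column count
        partitions[marker] = (pos + 1, pos + length) # 1-based bounds for this marker
        pos += length
        for t in taxa:
            rows[t].append(aln.get(t, '?' * length)) # pad taxa lacking the marker
    return {t: ''.join(parts) for t, parts in rows.items()}, partitions
```

The partition bounds allow each marker (e.g. ndhF vs. nad4-I1) to keep its own substitution model in a partitioned phylogenetic analysis.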