167 results for galaxies: cluster: general
Abstract:
A semisupervised support vector machine is presented for the classification of remote sensing images. The method exploits the wealth of unlabeled samples for regularizing the training kernel representation locally by means of cluster kernels. The method learns a suitable kernel directly from the image and thus avoids assuming a priori signal relations by using a predefined kernel structure. Good results are obtained in image classification examples when few labeled samples are available. The method scales almost linearly with the number of unlabeled samples and provides out-of-sample predictions.
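The cluster-kernel idea described above can be sketched roughly as follows. This is an assumed, simplified form, not the paper's exact method: a base RBF kernel computed on all samples is blended with a cluster-agreement kernel estimated from both labeled and unlabeled data, so that unlabeled structure regularizes the kernel seen by the supervised learner. All data, weights and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmeans(x, init, iters=50):
    """Plain k-means; init is a list of seed-point indices."""
    centers = x[init]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(0) for j in range(len(init))])
    return labels

# two well-separated blobs; only two points per blob carry labels
x = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labeled = np.array([0, 1, 50, 51])

clusters = kmeans(x, init=[0, len(x) - 1])  # uses ALL samples, labels ignored
k_rbf = rbf_kernel(x, x)
k_cluster = (clusters[:, None] == clusters[None, :]).astype(float)
k_mix = 0.5 * k_rbf + 0.5 * k_cluster       # cluster-regularized kernel

def predict(i):
    """Nearest class mean in the feature space induced by k_mix."""
    scores = [k_mix[i, labeled[y[labeled] == c]].mean() for c in (0, 1)]
    return int(np.argmax(scores))

preds = np.array([predict(i) for i in range(len(x))])
print((preds == y).mean())  # high accuracy from only four labeled samples
```

Because the cluster-agreement term dominates between points the unlabeled structure places together, four labels suffice to classify both blobs; the paper's actual method additionally learns the kernel from the image and scales near-linearly in the unlabeled sample count.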
Abstract:
Despite two international studies, there is still no consensus concerning prostate cancer screening. The results of a meta-analysis are making us question our convictions concerning pneumococcal vaccination. The preoperative work-up of cataract surgery can be simplified. When describing the efficacy of a treatment to a patient, relative risks are better understood than absolute risks. For rotator cuff syndrome, intramuscular corticosteroid injections are as effective as intra-articular injections. In patients prescribed clopidogrel, a proton pump inhibitor is not absolutely necessary. The arrival of an anticoagulant that does not need blood monitoring is an interesting option in atrial fibrillation.
Abstract:
Background: The type of anesthesia to be used for total hip arthroplasty (THA) is still a matter of debate. We compared the occurrence of per- and post-anesthesia incidents in patients receiving either general (GA) or regional anesthesia (RA). Methods: We used data from 29 hospitals, routinely collected in the Anaesthesia Databank Switzerland register between January 2001 and December 2003, and applied multi-level logistic regression models. Results: There were more per- and post-anesthesia incidents under GA than under RA (35.1% vs 32.7%, n = 3191, and 23.1% vs 19.4%, n = 3258, respectively). In multi-level logistic regression analysis, RA was significantly associated with a lower incidence of per-anesthetic problems, especially hypertension, compared with GA. During the post-anesthetic period, RA was also less associated with pain. Conversely, RA was more associated with post-anesthetic hypotension, especially for the epidural technique. In addition, age and ASA status were more strongly associated with incidents under GA than under RA, whereas male sex was more strongly associated with per-anesthetic problems under RA than under GA. Whereas increased age (>67), male sex, and ASA status were linked with the choice of RA, we noticed that this choice also depended on hospital practices after we adjusted for the other variables. Conclusions: Compared to RA, GA was associated with an increased proportion of per- and post-anesthesia incidents. Although this study is only observational, it is rooted in daily practice. Whereas RA might be routinely proposed, GA might be indicated because of contraindications to RA, patients' preferences, or other surgical or anaesthesiology-related reasons. Finally, the choice of a type of anesthesia seems to depend on local practices that may differ between hospitals.
Abstract:
From our reading over the year 2010, we have singled out eight items that seem to us significant for the practice of medicine. Small doses of colchicine are useful in the treatment of gout. No efficacious treatment for muscular cramps can be recommended. A cervical collar can be usefully prescribed for the treatment of cervical radiculopathy. A single dose of azithromycin can be envisaged as a third-line treatment of syphilis. High doses of vitamin D should not be prescribed for the prevention of fractures in elderly women because of the risk of falls; the wearing of bifocals can be associated with the same risk. A clinical score is available to help with the diagnosis of thoracic pain. NT-proBNP is of limited use for the follow-up of patients suffering from heart failure.
Abstract:
The project "Quantification and qualification of ambulatory health care", financed by the Swiss National Science Foundation and covering the Cantons of Vaud and Fribourg, has two main goals: --a structural study of the elements of the ambulatory care sector, carried out through inventories of the professions concerned (physicians, public health nurses, physiotherapists, pharmacists, medical laboratories), allowing us to better characterize the "offer". This inventory work includes the collection and analysis of existing statistical data as well as surveys, by questionnaires sent (from September 1980) to the different professions and by interviews. --a functional study, inspired by the US National Ambulatory Medical Care Survey and similar studies elsewhere, investigating the modes of practice of various providers, with particular regard to interprofessional collaboration (through the study of referrals from one provider to another). The first months of the project have been devoted to methodological research in this regard, centered on the use of systems analysis, and to the elaboration of adequate instruments.
Abstract:
With the advancement of high-throughput sequencing and the dramatic increase of available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to phylogenetic inference of species' evolutionary history. Among different types of genome sequences, protein-coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. The current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplified assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to the current mechanistic codon models are (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the mentioned assumptions. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one, which holds all the assumptions, to the most general one, which relaxes all of them.
The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplified assumptions on real data sets, as well as to find the best model aligned with the underlying characteristics of the data sets. Our experiments show that in none of the real data sets is it realistic to hold all three of the assumptions mentioned; using simple models that hold them can therefore be misleading and can result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes the three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. In addition, I show through several experiments that the proposed general model is biologically plausible.
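The Kronecker-product machinery this abstract refers to can be illustrated with a generic sketch (not the thesis's exact KCM formulation): under assumption (a), where only single-nucleotide substitutions have nonzero rate, a 64x64 codon-level generator can be assembled from 4x4 nucleotide matrices as a Kronecker sum; relaxing (a) would add cross terms such as Q⊗Q⊗I for double substitutions. The toy HKY-like matrix below is an assumption for illustration.

```python
import numpy as np

def nucleotide_rate_matrix(kappa=2.0):
    """Toy HKY-like 4x4 generator (order A, C, G, T); kappa scales transitions."""
    q = np.ones((4, 4))
    q[0, 2] = q[2, 0] = kappa  # A<->G transitions
    q[1, 3] = q[3, 1] = kappa  # C<->T transitions
    np.fill_diagonal(q, 0.0)
    np.fill_diagonal(q, -q.sum(axis=1))  # rows of a generator sum to zero
    return q

def kronecker_sum_codon(q1, q2, q3):
    """Codon-level generator allowing only single-position changes
    (Kronecker sum of one 4x4 matrix per codon position)."""
    i = np.eye(4)
    return (np.kron(np.kron(q1, i), i)
            + np.kron(np.kron(i, q2), i)
            + np.kron(np.kron(i, i), q3))

Q = nucleotide_rate_matrix()
Qc = kronecker_sum_codon(Q, Q, Q)  # same mutation process at all 3 positions
print(Qc.shape)                         # (64, 64)
print(np.allclose(Qc.sum(axis=1), 0))   # True
# double substitutions (e.g. AAA -> CCA, codon indices 0 -> 20) get zero rate
print(Qc[0, 20])                        # 0.0
```

Passing three different matrices to `kronecker_sum_codon` corresponds to relaxing assumption (b) (mutation variation across codon positions); swapping the HKY-like matrix for a richer nucleotide model corresponds to relaxing (c).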
Abstract:
Although the sport of triathlon provides an opportunity to research the effect of multi-disciplinary exercise on health across the lifespan, much remains to be done. The literature has failed to consistently or adequately report subject age group, sex, ability level, and/or event-distance specialization. The demands of training and racing are relatively unquantified. Multiple definitions and reporting methods for injury and illness have been implemented. In general, risk factors for maladaptation have not been well-described. The data thus far collected indicate that the sport of triathlon is relatively safe for the well-prepared, well-supplied athlete. Most injuries 'causing cessation or reduction of training or seeking of medical aid' are not serious. However, as the extent to which they recur may be high and is undocumented, injury outcome is unclear. The sudden death rate for competition is 1.5 (0.9-2.5) [mostly swim-related] occurrences for every 100,000 participations. The sudden death rate is unknown for training, although stroke risk may be increased, in the long-term, in genetically susceptible athletes. During heavy training and up to 5 days post-competition, host protection against pathogens may also be compromised. The incidence of illness seems low, but its outcome is unclear. More prospective investigation of the immunological, oxidative stress-related and cardiovascular effects of triathlon training and competition is warranted. Training diaries may prove to be a promising method of monitoring negative adaptation and its potential risk factors. More longitudinal, medical-tent-based studies of the aetiology and treatment demands of race-related injury and illness are needed.
Abstract:
General Summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity-enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much, and how fast, did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below.
The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH, comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE, comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Therefore, popular fears about the trade effects of differences in environmental regulations might be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labor that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects). We find that the positive scale (+9.5%) and the negative technique (-12.5%) effects are the main driving forces of emission changes.
Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (under the omission of price effects) to compute a static first-order trade effect. The latter increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, this effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that actual emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension.
This allows us to formally write present productivity as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions. First, it provides new estimates of the orders of magnitude of the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
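The center-of-mass computation described in the last chapter can be sketched as follows (an illustrative version with hypothetical city weights, not the chapter's data): each location is weighted by its economic output, mapped to 3-D Cartesian coordinates on the unit sphere, averaged, and the mean point is projected back to latitude/longitude.

```python
import numpy as np

def center_of_gravity(lat_deg, lon_deg, weights):
    """Weighted center of mass of points on the Earth's surface,
    projected back onto the sphere and returned as (lat, lon) in degrees."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    w = np.asarray(weights, float)
    w = w / w.sum()
    # Cartesian coordinates on the unit sphere
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    cx, cy, cz = (w * x).sum(), (w * y).sum(), (w * z).sum()
    # the weighted mean lies inside the sphere; project it back out
    lat_c = np.degrees(np.arctan2(cz, np.hypot(cx, cy)))
    lon_c = np.degrees(np.arctan2(cy, cx))
    return lat_c, lon_c

# hypothetical weights for three cities (New York, London, Tokyo)
lat, lon = [40.7, 51.5, 35.7], [-74.0, -0.1, 139.7]
print(center_of_gravity(lat, lon, [1.0, 1.0, 1.0]))
print(center_of_gravity(lat, lon, [1.0, 1.0, 3.0]))  # mass shifts eastward
```

Working in 3-D Cartesian coordinates rather than averaging raw longitudes avoids the wrap-around problem at the 180th meridian, which is presumably why the chapter uses the physical center-of-mass concept.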
Abstract:
BACKGROUND: Fourmidable is an infrastructure to curate and share the emerging genetic, molecular, and functional genomic data and protocols for ants. DESCRIPTION: The Fourmidable assembly pipeline groups nucleotide sequences into clusters before independently assembling each cluster. Subsequently, assembled sequences are annotated via Interproscan and BLAST against general and insect-specific databases. Gene-specific information can be retrieved using gene identifiers, searching for similar sequences or browsing through inferred Gene Ontology annotations. The database will readily scale as ultra-high throughput sequence data and sequences from additional species become available. CONCLUSION: Fourmidable currently houses EST data from two ant species and microarray gene expression data for one of these. Fourmidable is publicly available at http://fourmidable.unil.ch.
Abstract:
Conviction statistics were the first criminal statistics available in Europe during the nineteenth century. Their main weaknesses as crime measures and for comparative purposes were identified by Alphonse de Candolle in the 1830s. Currently, they are seldom used by comparative criminologists, although they provide a less valid but more reliable measure of crime and formal social control than police statistics. This article uses conviction statistics, compiled from the four editions of the European Sourcebook of Crime and Criminal Justice Statistics, to study the evolution of the number of persons convicted in European countries from 1990 to 2006. Trends in persons convicted for six offences (intentional homicide, assault, rape, robbery, theft, and drug offences) in up to 26 European countries are analysed. These trends are established for the whole of Europe as well as for a cluster of Western European countries and a cluster of Central and Eastern European countries. The analyses show similarities between both regions of Europe at the beginning and at the end of the period under study. After a general increase in the rate of persons convicted in the early 1990s across the whole of Europe, trends followed different directions in Western and in Central and Eastern Europe. However, during the 2000s, a certain stability of the rates of persons convicted for intentional homicide can be observed throughout Europe, accompanied by a general decrease in the rate of persons convicted for property offences and an increase in the rate of those convicted for drug offences. The latter goes together with an increase in the rate of persons convicted for non-lethal violent offences, which only reached some stability at the end of the time series. These trends show that there is no general crime drop in Europe. After a discussion of possible theoretical explanations, a multifactor model, inspired by opportunity-based theories, is proposed to explain the trends observed.
Abstract:
Objective: There is little evidence regarding the benefit of stress ulcer prophylaxis (SUP) outside the critical care setting, and over-prescription of SUP is not devoid of risks. This prospective study aimed to evaluate the use of proton pump inhibitors (PPIs) for SUP in a general surgery department. Methods: Data were collected prospectively by pharmacists during an 8-week period on patients hospitalized in a general surgery department (58 beds). Patients with a PPI prescription for the treatment of ulcers, gastro-oesophageal reflux disease, oesophagitis or epigastric pain were excluded. Patients admitted twice during the study period were not re-included. The American Society of Health-System Pharmacists guidelines on SUP were used to assess the appropriateness of de novo PPI prescriptions. Results: Among 255 consecutive patients in the study, 138 (54%) received prophylaxis with a PPI, of which 86 (62%) were de novo PPI prescriptions. One hundred twenty-nine patients (94%) received esomeprazole (according to the hospital drug policy); the most frequent dosage was 40 mg/day. Use of PPIs for SUP was evaluated in 67 patients: fifty-three (79%) had no risk factors for SUP, twelve had one risk factor, and two had two. At discharge, PPI prophylaxis was continued in 34% of patients with a de novo PPI prescription. Conclusion: This study highlights the overuse of PPIs in non-ICU patients and the inappropriate continuation of PPI prescriptions at discharge.
Abstract:
Rapid antagonist induction under anesthesia is a method increasingly used to detoxify opiate addicts. These procedures are useful to reduce the duration and discomfort of withdrawal. However, the high risk and cost of these methods require randomized clinical trials to evaluate their safety and clinical effectiveness. The University Substance Abuse Division of Lausanne and the Intensive Care Unit of the St-Loup Hospital are working on a randomized clinical trial comparing anesthesia-assisted versus traditional clonidine detoxification, combined with an additional psychosocial week. This paper describes the anesthesia technique used in our study. Our clinical experience suggests that integrating this technique into a multidisciplinary network, with a strong emphasis on post-anesthetic follow-up, is a viable and safe option in the treatment of opiate dependence.
Abstract:
OBJECTIVE: To assess the effect of a governmentally-led, center-based child care physical activity program (Youp'la Bouge) on child motor skills. Patients and methods: We conducted a single-blinded cluster randomized controlled trial in 58 Swiss child care centers. Centers were randomly selected and assigned 1:1 to a control or intervention group. The intervention lasted from September 2009 to June 2010 and included training of the educators, adaptation of the child care built environment, parental involvement and daily physical activity. Motor skills were the primary outcome; body mass index (BMI), physical activity and quality of life were secondary outcomes. The implementation of the intervention was also assessed. RESULTS: At baseline, 648 children present on the motor test day were included (age 3.3 +/- 0.6 years, BMI 16.3 +/- 1.3 kg/m2, 13.2% overweight, 49% girls) and 313 received the intervention. Relative to children in the control group (n = 201), children in the intervention group (n = 187) showed no significant increase in motor skills (delta of mean change (95% confidence interval): -0.2 (-0.8 to 0.3), p = 0.43) or in any of the secondary outcomes. Not all child care centers implemented all the intervention components. Within the intervention group, several predictors were positively associated with trial outcomes: 1) free access to a movement space and a parental information session, for motor skills; 2) highly motivated and trained educators, for BMI; 3) free access to a movement space and purchase of mobile equipment, for physical activity (all p < 0.05). CONCLUSION: This "real-life" physical activity program in child care centers confirms the complexity of implementing an intervention outside a study setting and identified potentially relevant predictors that could improve future programs. Trial registration: clinicaltrials.gov NCT00967460, http://clinicaltrials.gov/ct2/show/NCT00967460.