55 results for new trends
at Université de Lausanne, Switzerland
Abstract:
New products available for food creations include a wide variety of supposedly food-grade aerosol sprays. However, the gas propellants used cannot be considered safe. None of the applicable legislation sets maximum residue limits for them, even though limits exist for these compounds when used for other food purposes. This study presents a preliminary monitoring of propane, butane and dimethyl ether residues in cakes and chocolate after spraying, when these gases are used as propellants in food aerosol sprays. Release kinetics of propane, butane and dimethyl ether were measured over one day in sprayed food left at room temperature or in the fridge after spraying. The alkane and dimethyl ether analyses were performed by headspace gas chromatography-mass spectrometry/thermal conductivity detection, using monodeuterated propane and butane generated in situ as internal standards. According to the results obtained, and extrapolating from the maximum residue limits that exist for these substances, different waiting periods should be observed, depending on the storage conditions and the gas propellant, before the sprayed food can be consumed safely.
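Quantification against an in-situ internal standard typically reduces to a response-ratio calculation, and one-day release kinetics can be summarised by a first-order decay fit. A minimal sketch of both steps, assuming invented peak areas, response factors, time points and residue limits rather than the study's actual data:

```python
import numpy as np
from scipy.optimize import curve_fit

def quantify(analyte_area, is_area, is_amount_ug, response_factor=1.0):
    """Internal-standard quantification: amount = (A_analyte / A_IS) * m_IS / RF."""
    return (analyte_area / is_area) * is_amount_ug / response_factor

# Illustrative headspace peak areas for propane at several times after spraying (hours).
t = np.array([0.5, 2.0, 6.0, 12.0, 24.0])
analyte_areas = np.array([9.8e5, 6.4e5, 2.9e5, 1.1e5, 2.0e4])
is_areas = np.full_like(analyte_areas, 5.0e5)  # d1-propane generated in situ
residues = quantify(analyte_areas, is_areas, is_amount_ug=10.0)

# First-order release model: C(t) = C0 * exp(-k * t)
decay = lambda t, c0, k: c0 * np.exp(-k * t)
(c0, k), _ = curve_fit(decay, t, residues, p0=(residues[0], 0.1))
print(f"fitted k = {k:.3f} 1/h, half-life = {np.log(2) / k:.1f} h")

# Waiting period until residue falls below a hypothetical extrapolated limit.
limit = 0.5  # ug per sample, illustrative only
t_safe = np.log(c0 / limit) / k
print(f"residue drops below {limit} ug after ~{t_safe:.1f} h")
```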
Abstract:
Cataract surgery is the most frequently performed surgery in the world. Its modernization is a continuous process, and recent technological progress has enlarged the spectrum of treatable refractive errors, improved the safety of surgery, sped up visual recovery and reduced complication rates. Thus, in recent years, refractive intraocular lenses such as toric and multifocal IOLs have been introduced into practice, as have torsional phacoemulsification and corneal microincision. For endophthalmitis prophylaxis, modern management includes intracameral injection of antibiotics. The future of cataract surgery probably lies in replacing phacoemulsification with laser surgery, which is safer and more reproducible.
Abstract:
Standard care for newly diagnosed glioblastoma multiforme (GBM) previously consisted of resection to the greatest extent feasible, followed by radiotherapy. The role of chemotherapy was controversial and its efficacy was marginal at best. Five years ago temozolomide (TMZ) was approved specifically for the treatment of recurrent malignant glioma. The role of TMZ chemotherapy administered alone or as an adjuvant therapy for newly diagnosed GBM has been evaluated in a large randomized trial whose results suggested a significant prolongation of survival following treatment. Findings of correlative molecular studies have indicated that methylguanine methyltransferase promoter methylation may be used as a predictive factor in selecting patients most likely to benefit from such treatment. In this short review the authors summarize the current role of TMZ chemotherapy in the management of GBM, with an emphasis on approved indications and practical aspects.
Abstract:
The emergence of electronic cigarettes (e-cigs) has given cannabis smokers a new method of inhaling cannabinoids. E-cigs differ from traditional marijuana cigarettes in several respects. First, vaporizing cannabinoids at lower temperatures is assumed to be safer because it produces smaller amounts of toxic substances than the hot combustion of a marijuana cigarette. Recreational cannabis users can discreetly "vape" deodorized cannabis extracts with minimal annoyance to the people around them and less chance of detection. There are nevertheless several drawbacks worth mentioning: although manufacturing commercial (or homemade) cannabinoid-enriched electronic liquids (e-liquids) requires lengthy, complex processing, some are readily available on the Internet despite their lack of quality control, expiry dates, conditions of preservation and, above all, any toxicological and clinical assessment. Besides these safety problems, the regulatory situation surrounding e-liquids is often unclear. More simply, ground cannabis flowering heads or concentrated, oily THC extracts (such as butane honey oil, or BHO) can be vaped in specially designed, pen-sized marijuana vaporizers. Analysis of a commercial e-liquid rich in cannabidiol showed that it contained a smaller dose of active ingredient than advertised; testing our laboratory-made, purified BHO, however, confirmed that it could be vaped in an e-cig to deliver a psychoactive dose of THC. The health consequences specific to vaping these cannabis preparations remain largely unknown and speculative owing to the absence of comprehensive, robust scientific studies. The most significant health concerns involve the vaping of cannabinoids by children and teenagers, for whom e-cigs could provide an alternative gateway to cannabis use. Furthermore, vaping cannabinoids could lead to environmental and passive contamination.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry employs over 200'000 people today in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments produced the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture design is an industry separate from RISC chip manufacturing. The RISC ecosystem also offers several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, giving customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today more RISC computer systems run Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-driven disruptions in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption driven by these new trends. The thesis examines the current state of industrial automation and the competition among the incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually affect industrial automation as well, through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
Bacteria have long been the targets for genetic manipulation, but more recently they have been synthetically designed to carry out specific tasks. Among the simplest of these tasks is chemical compound and toxicity detection coupled to the production of a quantifiable reporter signal. In this Review, we describe the current design of bacterial bioreporters and their use in a range of assays to measure the presence of harmful chemicals in water, air, soil, food or biological specimens. New trends for integrating synthetic biology and microengineering into the design of bacterial bioreporter platforms are also highlighted.
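The core of such an assay is a calibration that maps the quantifiable reporter signal (for example bioluminescence) back to an analyte concentration. A minimal sketch, assuming invented luminescence readings and a generic Hill-type dose-response model rather than any specific reporter strain described in the review:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, s_min, s_max, ec50, n):
    """Hill-type dose-response: signal rises with concentration and saturates."""
    return s_min + (s_max - s_min) * c**n / (ec50**n + c**n)

# Illustrative calibration standards (ug/L) vs. reporter luminescence (a.u.).
conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0, 100.0])
signal = np.array([340, 650, 1150, 1980, 3300, 4100, 4500])

params, _ = curve_fit(hill, conc, signal, p0=(100, 4600, 10, 1), maxfev=10000)

def estimate_conc(s, s_min, s_max, ec50, n):
    """Invert the fitted curve to estimate concentration in an unknown sample."""
    frac = (s - s_min) / (s_max - s_min)
    frac = np.clip(frac, 1e-9, 1 - 1e-9)  # keep the inversion numerically safe
    return ec50 * (frac / (1 - frac)) ** (1 / n)

unknown_signal = 2500.0
print(f"estimated analyte: {estimate_conc(unknown_signal, *params):.1f} ug/L")
```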
Abstract:
In Switzerland, the annual cost of damage caused by natural hazards has been increasing for several years despite the introduction of protective measures. Because this damage is mainly material, building insurance companies bear the majority of the cost. In many European countries, governments and insurance companies now frame their prevention strategies in terms of vulnerability reduction. In Switzerland, since 2004, the cost of damage due to natural hazards has exceeded the cost of damage due to fire, the traditional line of business of the cantonal insurance companies (ECA); this reversal results in particular from an effective fire-prevention policy pursued for many years, notably through the reduction of building vulnerability. Drawing on actuarial data and newly developed analysis tools, the thesis seeks to illustrate the relevance of such an approach when applied to damage caused by natural hazards, and examines the place of insurance and its involvement in the targeted prevention of natural disasters.

Integrated risk management requires a sound understanding of all risk parameters. The first part of the thesis is therefore devoted to the theoretical development of the key concepts that influence risk management, such as hazard, vulnerability, exposure and damage. The literature on this subject, very prolific in recent years, is reviewed and put in perspective in the context of this study, namely building insurance. Among the risk parameters, the thesis shows that vulnerability is a factor that can be influenced efficiently in order to limit the cost of damage to buildings. This is confirmed through the development of an analysis method, which led to a tool for assessing flood damage to buildings. The tool, designed for property insurers and, where appropriate, owners, comprises several steps: - vulnerability and damage-potential assessment; - proposals for remedial and risk-reduction measures derived from an analysis of the costs of a potential flood; - adaptation of a global strategy in high-risk areas based on the elements at risk.

The final part of the thesis is devoted to the study of a hail event in order to provide a better understanding of damage to buildings and of its structure. Two samples were selected and analysed from the claims data available to the study; the results, both at the level of the insured portfolio and of the individual analyses, reveal new trends. A second objective of the study was to develop a hail model based on the available data. The model simulates a random distribution of intensities and, coupled with a risk model, provides a simulation of damage costs for the chosen study area. The perspectives opened by this work allow a sharper focus on the role of insurance and its needs in terms of prevention.
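The coupling of a random intensity distribution with a risk model, as described for the hail model above, is essentially a Monte Carlo loop over simulated events. A minimal sketch, assuming an invented intensity distribution, vulnerability curve and building portfolio rather than the thesis's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative portfolio: insured values (CHF) of buildings in the study area.
insured_values = rng.uniform(300_000, 2_000_000, size=500)

def vulnerability(intensity_mm):
    """Hypothetical vulnerability curve: mean damage ratio vs. hailstone size (mm)."""
    return np.clip(0.002 * np.maximum(intensity_mm - 20.0, 0.0) ** 1.5, 0.0, 1.0)

n_events = 10_000
event_costs = np.empty(n_events)
for i in range(n_events):
    # Random hail intensity per building (lognormal, illustrative parameters).
    intensity = rng.lognormal(mean=3.0, sigma=0.4, size=insured_values.size)
    # Only a random subset of buildings lies inside the hail swath.
    hit = rng.random(insured_values.size) < 0.3
    event_costs[i] = np.sum(hit * vulnerability(intensity) * insured_values)

print(f"mean event cost: CHF {event_costs.mean():,.0f}")
print(f"99th percentile: CHF {np.percentile(event_costs, 99):,.0f}")
```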
Abstract:
No study to date has focused specifically on the reasons for and against disclosure of HIV-positive status among sub-Saharan migrant women. Thirty HIV-positive women from 11 sub-Saharan countries living in French-speaking Switzerland participated in semi-structured individual interviews. The reasons the women reported for disclosure or nondisclosure of their HIV serostatus fell into three categories: social, medical, and ethical. The women identified the stigma associated with HIV as a major social reason for nondisclosure. However, this study identifies new trends in disclosure for medical and ethical reasons. Being undetectable played an important role in the lives of the sub-Saharan migrant women, and the analysis revealed medical reasons for both disclosure and nondisclosure. Disclosure to new sexual partners occurred when women had a more positive perception of HIV and when they believed themselves to be in a long-term relationship. Women reported nondisclosure to family members when they did not need help beyond the support provided by medical and social services. The results on ethical reasons suggested that challenging stigma was a reason for disclosure. As the women's perceptions of HIV changed and they came to see it as a chronic disease, disclosure occurred in an attempt to normalize life with HIV in their migrant communities and to challenge racism and discrimination. Our findings can help health providers better understand the communication needs of sub-Saharan migrant women with respect to HIV/AIDS and sexuality, and offer them appropriate disclosure advice that takes migration and gender issues into account.
Abstract:
Infectious diseases (ID) are a major cause of morbidity and mortality after solid organ transplantation (SOT). Since May 2008, the Swiss Transplant Cohort Study (STCS) has registered 95% of all SOT recipients in Switzerland. The extensive data set includes pre- and post-transplant variables that are collected prospectively at transplantation, 6 months post-transplant, and yearly thereafter. All ID events are recorded using internationally validated definitions. We obtained data from 1101 patients (79 heart, 685 kidney, 29 kidney-pancreas, 212 liver, and 96 lung transplants). So far the median observation times were 0.8 years (IQR 0.3-1.4) for heart, 1.1 (0.6-1.8) for kidney, 1.1 (0.6-1.9) for kidney-pancreas, 1.0 (0.5-1.7) for liver, and 0.9 (0.5-1.5) for lung transplants. The highest rate of proven or probable ID events was seen in lung (76%), followed by liver (64%), heart (62%), kidney-pancreas (62%) and kidney (58%) recipients. During the observation period, ID was the cause of death in 19 patients (1.7%). Rates of infection per person-year according to pathogen and type of transplantation are shown in Figure 1. The data indicate that viral infections are second only to bacterial infections, whereas fungal infections occur at relatively low rates. This prospective, standardized long-term collection of all ID events will allow a comprehensive assessment of the burden of ID across all SOT types in Switzerland. Regular analysis will identify new trends, serve as quality control and help design anti-infective interventions aimed at increasing safety and improving overall transplantation outcomes.
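Rates per person-year in such a cohort are simply event counts divided by summed follow-up time, usually with a Poisson-based confidence interval. A minimal sketch, assuming invented event counts and follow-up times rather than STCS data:

```python
import math

def incidence_rate(events, person_years, z=1.96):
    """Events per person-year with a 95% CI from the log-scale normal approximation."""
    rate = events / person_years
    if events > 0:
        se_log = 1.0 / math.sqrt(events)  # SE of log(rate) for Poisson counts
        lo = rate * math.exp(-z * se_log)
        hi = rate * math.exp(z * se_log)
    else:
        # Upper limit of the exact two-sided 95% CI when zero events are observed.
        lo, hi = 0.0, 3.69 / person_years
    return rate, lo, hi

# Illustrative numbers only: 48 bacterial infections over 210 person-years of follow-up.
rate, lo, hi = incidence_rate(events=48, person_years=210.0)
print(f"rate = {rate:.2f} per person-year (95% CI {lo:.2f}-{hi:.2f})")
```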
Abstract:
Gauguin's first attempts at still-life painting, around 1875, followed the Dutch tradition, influenced mainly by Manet's palette. But he took occasional liberties, depicting flowers with more fluid colour and dynamic backgrounds. From 1879 his style shows the influence of the Impressionists: Pissarro in the landscapes and Degas in the composition of his still-lifes. He was also open to the new trends developing among artists in Paris and applied them in his paintings, using still-lifes as his main means of testing them. He did not escape the contemporary fascination with Japonism, and even experimented briefly with Pointillism in Still Life with Horse's Head. His stays in Brittany between 1886 and 1890 correspond to an extremely rich and innovative period, in which still-lifes served for increasing experimentation. "Fête Gloanec" and Three Puppies reflect his preoccupations: rejection of perspective, use of areas of flat colour, and mixed styles. These pictures amount to an aesthetic manifesto; many are also imbued with strong symbolism, as in the Portrait of Meyer de Haan, a melancholic reflection on the fall of man. In Still-Life with Japanese Print, frail blue flowers seem to grow out of the head of the artist-martyr, a pure product of the painter's "restless imagination". Gauguin thus showed that art is an "abstraction" through a genre reputed to lend itself with difficulty to anything other than mimesis. Although he moved away from still-life after 1890, Gauguin was one of the first artists to radically renew the role and status of still-life at the end of the 19th century, well before the Fauvists and Cubists.
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are discussed, and real case studies based on environmental and pollution data are carried out. The book comes with a CD-ROM containing the Machine Learning Office software, including sample data sets, which allows both students and researchers to put the concepts rapidly into practice.
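As a flavour of the kind of workflow the book covers, here is a minimal sketch of classifying spatially distributed measurements with a support vector machine; the synthetic coordinates, contamination labels and scikit-learn usage are illustrative assumptions, not the book's Machine Learning Office software:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic monitoring network: (x, y) coordinates with a binary contamination label.
n = 400
xy = rng.uniform(0.0, 10.0, size=(n, 2))
# Contaminated zone: a noisy disc around a hypothetical source at (6, 4).
dist = np.linalg.norm(xy - np.array([6.0, 4.0]), axis=1)
labels = (dist + rng.normal(0.0, 0.8, size=n) < 3.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(xy, labels, test_size=0.25, random_state=0)

# RBF-kernel SVM on standardized coordinates, a common choice for spatial classification.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")

# The fitted model can then map predicted contamination over a regular grid.
grid = np.stack(np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50)), axis=-1).reshape(-1, 2)
contamination_map = model.predict(grid).reshape(50, 50)
```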