1000 results for Research labs
Abstract:
Despite numerous discussions, workshops, reviews and reports about the responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for completing a rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate that there are reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have the duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that classifies the nano laboratory into three hazard classes, similar to a control banding approach (from Nano 3, highest hazard, to Nano 1, lowest hazard). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly access the precautionary hazard class of their activities and the corresponding adequate safety and health measures. We have succeeded in convincing scientists dealing with nano-activities that adequate safety measures and management promote innovation and discoveries by ensuring a safe environment even in the case of very novel products. The proposed measures are not seen as constraints but as support for their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. In our opinion it would be useful to other research and academic institutions as well. [Authors]
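As a rough illustration of the control-banding idea described in this abstract, the Python sketch below maps a couple of yes/no questions about a lab's nano-activities to one of the three precautionary hazard classes and looks up a per-class list of measures. The questions, class boundaries and measures are placeholders invented for illustration; they are not the actual EPFL decision tree or its prescribed measures.

```python
# Toy control-banding classification: worst-case activity determines the lab's
# hazard class; each class carries a predefined list of mitigation measures.
# The criteria and measures below are illustrative placeholders only.
from enum import Enum

class HazardClass(Enum):
    NANO1 = "Nano 1 (lowest hazard)"
    NANO2 = "Nano 2"
    NANO3 = "Nano 3 (highest hazard)"

def classify_lab(handles_free_nanopowder: bool, aerosols_or_suspensions_possible: bool) -> HazardClass:
    """Assign a precautionary hazard class from two simplified questions."""
    if handles_free_nanopowder:
        return HazardClass.NANO3
    if aerosols_or_suspensions_possible:
        return HazardClass.NANO2
    return HazardClass.NANO1

# Placeholder mapping from hazard class to technical/organizational/personal measures
MEASURES = {
    HazardClass.NANO3: ["fume hood or glove box", "restricted access", "respiratory protection"],
    HazardClass.NANO2: ["local exhaust ventilation", "specific training", "gloves and lab coat"],
    HazardClass.NANO1: ["standard lab practice", "basic training"],
}

lab_class = classify_lab(handles_free_nanopowder=True, aerosols_or_suspensions_possible=True)
print(lab_class.value, "->", MEASURES[lab_class])
```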
Abstract:
This dissertation focuses on new technology commercialization, innovation and new business development. Industry-based novel technology may achieve commercialization through its transfer to a large research laboratory acting as a lead user and technical partner, and providing the new technology with complementary assets and meaningful initial use in social practice. The research lab benefits from the new technology and innovation through major performance improvements and cost savings. Such mutually beneficial collaboration between the lab and the firm does not require any additional administrative effort or funds from the lab, yet it requires openness to technologies and partner companies that may not be previously known to the lab. Labs achieve the benefits by applying a proactive procurement model that promotes active pre-tender search for new technologies and pre-tender testing and piloting of these technological options. The collaboration works best when based on the development needs of both parties. This means, first, that the lab has significant engineering activity with well-defined technological needs and, second, that the firm has advanced prototype technology yet needs further testing, piloting and the initial market and references to achieve a market breakthrough. The empirical evidence of the dissertation is based on a longitudinal multiple-case study with the European Laboratory for Particle Physics. The key theoretical contribution of this study is that large research labs, including those doing basic research, play an important role in product and business development toward the end, rather than the front end, of the innovation process. This also implies that product orientation and business orientation can contribute to basic research. The study provides practical managerial and policy guidelines on how to initiate and manage mutually beneficial lab-industry collaboration and proactive procurement.
Abstract:
Nanomaterials have properties that are often very different from those of conventional materials of the same substance, which can be used to create novel products with exciting properties. However, the health and environmental impact of these nanomaterials is also changed, and their potential risks need to be studied. There is evidence that some nanomaterials can pass through tissue barriers (including the blood-brain barrier) and cell membranes. This is interesting for medical applications, but it raises concerns about the impact of non-medical nanomaterials. Current research aims at better coordinating research efforts and at better communication between researchers and involved stakeholders. Many research labs and production sites currently follow strategies that were established for dealing with very toxic chemicals and powders, until future research in this field helps identify the appropriate level of protection. All these efforts will ultimately ensure safe, healthy and environmentally friendly production, use and disposal of nanomaterials.
Abstract:
In recent years correlative microscopy, combining the power and advantages of different imaging systems, e.g., light, electrons, X-ray, NMR, etc., has become an important tool for biomedical research. Among all the possible combinations of techniques, light and electron microscopy have made an especially big step forward and are being implemented in more and more research labs. Electron microscopy profits from the high spatial resolution, the direct recognition of the cellular ultrastructure and the identification of organelles. It has, however, two severe limitations: the restricted field of view and the fact that no live imaging can be done. Light microscopy, on the other hand, has the advantage of live imaging, following a fluorescently tagged molecule in real time, and at lower magnifications its large field of view facilitates the identification and location of sparse individual cells in a large context, e.g., tissue. The combination of these two imaging techniques appears to be a valuable approach to dissect biological events at a submicrometer level. Light microscopy can be used to follow a labelled protein of interest, or a visible organelle such as mitochondria, over time; the sample is then fixed and the exact same region is investigated by electron microscopy. The time resolution depends on the speed of penetration and fixation when chemical fixatives are used, and on the reaction time of the operator for cryo-fixation. Light microscopy can also be used to identify cells of interest, e.g., a special cell type in tissue or cells that have been modified by either transfection or RNAi, in a large population of non-modified cells. A further application is to find fluorescence labels in cells on a large section to reduce searching time in the electron microscope. Multiple fluorescence labelling of a series of sections can be correlated with the ultrastructure of the individual sections to obtain 3D information on the distribution of the marked proteins: array tomography. More and more effort is being put into either converting a fluorescence label into an electron-dense product or preserving the fluorescence throughout preparation for electron microscopy. Here, we will review successful protocols and, where possible, try to extract common features to better understand the importance of the individual steps in the preparation. Furthermore, new instruments and software intended to ease correlative light and electron microscopy are discussed. Last but not least, we will detail the approach we have chosen for correlative microscopy.
Abstract:
A significant number of autistic children have macrocephaly. Despite several studies of head circumference in autism, few studies have been carried out on adults. Moreover, the current adult head circumference (HC) references date back about 20 years. The objectives of this study were to construct a reference scale for adult HC and to compare macrocephaly rates between a group of autistic adults and a group of neurotypical adults. In this study, 221 adult male subjects were recruited from different settings to determine the best predictive model of HC and to build the reference scale. Height and weight were measured for each participant to determine their influence on cranial dimensions. For the comparative part, 30 autistic and 36 neurotypical subjects, all adults, were recruited from the research laboratory's database. For the reference scale, the results showed positive correlations of HC with height and weight. After analysis, the joint contribution of height and weight to HC was found to be the model offering the most significant results in predicting HC. For the comparative part, macrocephaly rates reached 10.00% among autistic adults versus 2.56% among neurotypical adults, according to the linear regression formula obtained from the model. However, Fisher's exact test revealed no significant difference between the two groups. My results suggest that height and weight must be taken into account when constructing an HC reference and that, even with the new reference, macrocephaly rates remain higher in autistic adults than in neurotypical adults despite the absence of significant differences.
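A minimal sketch of the kind of reference model this abstract describes: an ordinary least-squares fit of head circumference (HC) on height and weight, with macrocephaly flagged when a measured HC lies well above the prediction. The +2 residual-SD cut-off, the variable names and the synthetic example measurements are assumptions for illustration, not the study's actual regression formula or threshold.

```python
# Illustrative only: joint height-and-weight reference model for adult HC.
# Macrocephaly here means HC more than z residual SDs above the model prediction.
import numpy as np

def fit_hc_reference(height_cm, weight_kg, hc_cm):
    """Ordinary least squares: HC ~ intercept + height + weight."""
    height_cm, weight_kg, hc_cm = (np.asarray(a, dtype=float) for a in (height_cm, weight_kg, hc_cm))
    X = np.column_stack([np.ones_like(height_cm), height_cm, weight_kg])
    coef, *_ = np.linalg.lstsq(X, hc_cm, rcond=None)
    residual_sd = np.std(hc_cm - X @ coef, ddof=X.shape[1])
    return coef, residual_sd

def is_macrocephalic(height_cm, weight_kg, hc_cm, coef, residual_sd, z=2.0):
    """Flag HC values more than z residual SDs above the predicted reference."""
    predicted = coef[0] + coef[1] * height_cm + coef[2] * weight_kg
    return hc_cm > predicted + z * residual_sd

# Made-up measurements (cm, kg) just to show the calling convention
heights = [170.0, 180.0, 165.0, 175.0, 185.0]
weights = [70.0, 85.0, 60.0, 78.0, 92.0]
hcs     = [56.5, 58.0, 55.2, 57.1, 59.0]
coef, sd = fit_hc_reference(heights, weights, hcs)
print(is_macrocephalic(172.0, 75.0, 61.0, coef, sd))  # flag a 61.0 cm HC for a 172 cm, 75 kg adult
```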
Abstract:
An introductory lecture on Web Science, taking a kind of devil's advocate position by suggesting that the Web is a piece of runaway technology that escaped from research labs prematurely.
Abstract:
With the new discoveries of oil and gas, the exploration of fields in various geological basins, imports of other oils and the development of alternative fuels, more and more research labs have evaluated and characterized new types of petroleum and derivatives. Investment in new techniques and equipment for sample analysis, to determine physical and chemical properties, composition, possible contaminants and product specifications, among others, has therefore multiplied in recent years, so the development of techniques for rapid and efficient characterization is extremely important for better economic recovery of oil. In this context, this work has two main objectives. The first is to characterize the oil by thermogravimetry coupled with mass spectrometry (TG-MS) and to correlate these results with data from other types of characterization reported previously. The second is to use the technique to develop a methodology for obtaining the evolution curve of hydrogen sulfide gas in the oil. Thus, four samples were analyzed by TG-MS and X-ray fluorescence spectrometry (XRF). TG results can be used to indicate the nature of the oil, its tendency toward coke formation, its distillation and cracking temperatures, and other features. The MS evaluations showed the behavior of the oil's main compounds with temperature, the points at which certain fractions volatilized, and the evolved gas analysis of hydrogen sulfide, which is compared with the evolution curve obtained by Petrobras using another methodology.
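A minimal sketch of the data-reduction step implied by the hydrogen sulfide methodology: pulling the m/z 34 ion current out of exported TG-MS data and expressing it against sample temperature. The file layout, column names and the crude baseline handling are assumptions for illustration, not the methodology developed in this work or the one used by Petrobras.

```python
# Illustrative extraction of an H2S evolution curve from exported TG-MS data.
# Assumes a CSV export with one temperature column and one m/z 34 ion-current column.
import pandas as pd

def h2s_evolution_curve(csv_path, temp_col="temperature_C", mz34_col="ion_current_mz34"):
    """Return temperature vs. baseline-corrected, normalized m/z 34 signal."""
    data = pd.read_csv(csv_path)
    signal = data[mz34_col] - data[mz34_col].min()        # crude baseline correction
    normalized = signal / signal.max() if signal.max() > 0 else signal
    return data[temp_col].to_numpy(), normalized.to_numpy()
```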
Abstract:
The European Union set the ambitious target of reducing energy consumption by 20% by 2020. This goal demands a tremendous change in how we generate and consume energy and urgently calls for an aggressive policy on energy efficiency. Since 19% of European electrical energy is used for lighting, considerable savings can be achieved with the development of novel and more efficient lighting systems. In this thesis, carried out in the framework of the EU project CELLO, I report some selected goals we achieved in attempting to develop highly efficient, flat, low-cost and flexible light sources using Light-Emitting Electrochemical Cells (LECs) based on ionic cyclometalated iridium(III) complexes. After an extensive introduction to LECs and solid-state lighting in general, I focus on the research we carried out on cyclometalated iridium(III) complexes displaying deep-blue emission, which has turned out to be a rather challenging task. In order to demonstrate the wide versatility of this class of compounds, I also report a case in which some tailored iridium(III) complexes act as near-infrared (NIR) sources. In fact, standard NIR-emitting devices are typically expensive and, also in this case, LECs could serve as low-cost alternatives in fields where NIR luminescence is crucial, such as telecommunications and bioimaging. Since LECs are based on only one active material, in the last chapter I stress the importance of an integrated approach to the selection of suitable emitters, not only from the photophysical point of view but also from that of materials science. An iridium(III) complex, once in the device, interacts with ionic liquids, metal cathodes, electric fields, etc. All these interactions should be taken into account if Europe really wants to implement more efficient lighting paradigms, generating light beyond research labs.
Abstract:
Synthetic oligonucleotides and peptides have found wide applications in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development. The global annual sales of peptide drugs in 2010 were estimated at $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated at $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions are never complete, which generates undesired truncated failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It requires large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated to be more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences. In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods do not need chromatography, and the drawbacks of HPLC no longer apply. With them, purification is achieved by simple manipulations such as shaking and extraction. They are therefore suitable for large-scale purification of oligonucleotide and peptide drugs, and also ideal for high-throughput purification, which is currently in high demand for research projects involving total gene synthesis. The dissertation will present the details of the development of these techniques. Chapter 1 will introduce oligodeoxynucleotides (ODNs), their synthesis and purification. Chapter 2 will describe detailed studies of using the catching failure sequences by polymerization method to purify ODNs. Chapter 3 will describe further optimization of the catching failure sequences by polymerization ODN purification technology to the level of practical use. Chapter 4 will present the use of the catching full-length sequences by polymerization method for ODN purification with an acid-cleavable linker. Chapter 5 will introduce peptides, their synthesis and purification. Chapter 6 will describe studies using the catching full-length sequences by polymerization method for peptide purification.
Abstract:
The microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data. Thus, effective data processing and analysis are critical for making reliable inferences from the data. The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study on different classification tools was performed and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. The classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior. The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated in this dissertation. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species. As a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
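Two of the pooling techniques named in this abstract reduce to short formulas. The sketch below gives generic implementations of Fisher's inverse chi-square method and the (Liptak-)Stouffer weighted Z-method for combining per-gene p-values from independent experiments; it is a textbook illustration, not the dissertation's code, and the example p-values and weights are made up.

```python
# Generic p-value pooling across independent experiments.
import numpy as np
from scipy import stats

def fisher_pool(pvalues):
    """Fisher's method: -2*sum(ln p) follows a chi-square with 2k degrees of freedom."""
    p = np.asarray(pvalues, dtype=float)
    statistic = -2.0 * np.sum(np.log(p))
    return stats.chi2.sf(statistic, df=2 * p.size)

def stouffer_pool(pvalues, weights=None):
    """(Liptak-)Stouffer weighted Z-method: combine the z-scores of the p-values."""
    p = np.asarray(pvalues, dtype=float)
    w = np.ones_like(p) if weights is None else np.asarray(weights, dtype=float)
    z = stats.norm.isf(p)                                  # one-sided z-scores
    z_combined = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
    return stats.norm.sf(z_combined)

# Example: the same gene tested in three independent experiments
print(fisher_pool([0.01, 0.04, 0.20]))
print(stouffer_pool([0.01, 0.04, 0.20], weights=[30, 25, 40]))  # e.g. weights from sample sizes
```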
Abstract:
Documents pertaining to the establishment of teaching facilities, research labs, and a medical library for the College of Medicine.
Abstract:
G-Protein Coupled Receptors (GPCRs) are not only the largest protein family in the human genome but also the single biggest target for therapeutic agents. Research into GPCRs is therefore growing at a fast pace, and the range of techniques that can be applied to GPCRs is vast and continues to grow. This book provides an invaluable bench-side guide to the best and most up-to-date techniques for current and future research on GPCRs. With contributions from leading international authorities, it equips readers with clear and detailed protocols for both well-known and up-and-coming techniques, along with hints and tips for success. All the methods have been tried and tested by leading international research labs and are presented in easy-to-follow stages, along with a useful overview of each technique. This book is an essential resource for all researchers in molecular biology, biochemistry and pharmacology, and for graduate students. © 2010 John Wiley & Sons, Ltd.
Abstract:
The semiconductor industry's urge toward faster, smaller and cheaper integrated circuits has led the industry to smaller node devices. The integrated circuits that are now under volume production belong to the 22 nm and 14 nm technology nodes. In 2007 the 45 nm technology came with the revolutionary high-k/metal gate structure. The 22 nm technology utilizes a fully depleted tri-gate transistor structure. The 14 nm technology is a continuation of the 22 nm technology; Intel is using second-generation tri-gate technology in 14 nm devices. After 14 nm, the semiconductor industry is expected to continue the scaling with 10 nm devices followed by 7 nm. Recently, IBM announced the successful production of 7 nm node test chips. This is how the nanoelectronics industry is proceeding with its scaling trend. For the present technology nodes, selective deposition and selective removal of materials are required. Atomic layer deposition and atomic layer etching are the respective techniques used for selective deposition and selective removal. Atomic layer deposition still remains a futuristic manufacturing approach that deposits materials and films in exact places. In addition to the nano/microelectronics industry, ALD is also widening its application areas and acceptance. The usage of ALD equipment in industry exhibits a diversification trend: large-area, batch-processing, particle and plasma-enhanced ALD equipment is becoming prominent in industrial applications. In this work, the development of an atomic layer deposition tool with microwave plasma capability is described, which is affordable even for lightly funded research labs.
Abstract:
The evolution and maturation of Cloud Computing have created an opportunity for the emergence of new Cloud applications. High-Performance Computing (HPC), a class of complex problem solving, arises as a new business consumer by taking advantage of the Cloud premises and leaving behind expensive datacenter management and difficult grid development. Now in an advanced maturing phase, today's Cloud has discarded many of its drawbacks, becoming more and more efficient and widespread. Performance enhancements, price drops due to massification and customizable on-demand services have attracted increased attention from other markets. HPC, despite being a very well established field, traditionally has a narrow frontier concerning its deployment and runs on dedicated datacenters or large computing grids. The problem with this common placement is mainly the initial cost and the inability to fully use the resources, which not all research labs can afford. The main objective of this work was to investigate new technical solutions to allow the deployment of HPC applications on the Cloud, with particular emphasis on private on-premise resources, the lower end of the chain, which reduces costs. The work includes many experiments and analyses to identify obstacles and technology limitations. The feasibility of the objective was tested with new modeling, a new architecture and the migration of several applications. The final application integrates, in a simplified way, both public and private Cloud resources, as well as HPC application scheduling, deployment and management. It uses a well-defined user-role strategy based on federated authentication and a seamless procedure for daily usage that balances low cost and performance.
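As a purely hypothetical illustration of the kind of placement decision such a scheduler faces, the sketch below prefers free private on-premise capacity and bursts a job to a public Cloud pool only when it does not fit. The class names, capacities and cost figures are invented; they do not reflect the thesis's architecture or implementation.

```python
# Toy placement policy: cheapest pool that can hold the whole job.
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    free_cores: int
    cost_per_core_hour: float  # 0.0 for already-owned on-premise hardware

def place_job(cores_needed, pools):
    """Pick the cheapest pool with enough free cores, or None if nothing fits."""
    candidates = [p for p in pools if p.free_cores >= cores_needed]
    if not candidates:
        return None  # queue or split the job, depending on policy
    return min(candidates, key=lambda p: p.cost_per_core_hour)

pools = [
    Pool("private-onpremise", free_cores=64, cost_per_core_hour=0.0),
    Pool("public-cloud", free_cores=10_000, cost_per_core_hour=0.05),
]
print(place_job(128, pools).name)  # job too big for on-premise, so it bursts to "public-cloud"
```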