45 results for new technology
at Université de Lausanne, Switzerland
Abstract:
Bio-nano interactions can be defined as the study of interactions between nanoscale entities and biological systems such as, but not limited to, peptides, proteins, lipids, DNA and other biomolecules, cells and cellular receptors, and organisms including humans. Studying bio-nano interactions is particularly useful for understanding engineered materials that have at least one dimension in the nanoscale. Such materials may consist of discrete particles or nanostructured surfaces. Much of biology functions at the nanoscale; therefore, our ability to manipulate materials such that they are taken up at the nanoscale, and engage biological machinery in a designed and purposeful manner, opens new vistas for more efficient diagnostics, therapeutics (treatments) and tissue regeneration, so-called nanomedicine. Additionally, this ability of nanomaterials to interact with and be taken up by cells allows nanomaterials to be used as probes and tools to advance our understanding of cellular functioning. Yet, as with any new technology, the safety of nanomaterials and the applicability of existing regulatory frameworks to them must be investigated in parallel with the development of novel applications. The Royal Society meeting 'Bio-nano interactions: new tools, insights and impacts' provided an important platform for open dialogue on the current state of knowledge on these issues, bringing together scientists, industry, regulatory and legal experts to concretize existing discourse in science, law and policy. This paper summarizes these discussions and the insights that emerged.
Abstract:
ABSTRACT This dissertation focuses on new technology commercialization, innovation and new business development. Industry-based novel technology may achieve commercialization through its transfer to a large research laboratory acting as a lead user and technical partner, and providing the new technology with complementary assets and meaningful initial use in social practice. The research lab benefits from the new technology and innovation through major performance improvements and cost savings. Such mutually beneficial collaboration between the lab and the firm does not require any additional administrative efforts or funds from the lab, yet requires openness to technologies and partner companies that may not be previously known to the lab. Labs achieve the benefits by applying a proactive procurement model that promotes active pre-tender search of new technologies and pre-tender testing and piloting of these technological options. The collaboration works best when based on the development needs of both parties. This means, first, that the lab has significant engineering activity with well-defined technological needs and, second, that the firm has advanced prototype technology that still needs further testing, piloting, and an initial market and references to achieve a market breakthrough. The empirical evidence of the dissertation is based on a longitudinal multiple-case study with the European Laboratory for Particle Physics. The key theoretical contribution of this study is that large research labs, including basic research, play an important role in product and business development toward the end, rather than the front-end, of the innovation process. This also implies that product-orientation and business-orientation can contribute to basic research. The study provides practical managerial and policy guidelines on how to initiate and manage mutually beneficial lab-industry collaboration and proactive procurement.
Abstract:
Mobile technologies have brought about major changes in police equipment and police work. While a utopian narrative remains strongly attached to the adoption of new technologies, which are often presented as 'magic bullets' for real occupational problems, there are important tensions between their 'imagined' outcomes and the (unexpected) effects that accompany their daily 'practical' use by police officers. This article offers an analysis of police officers' perceptions of and interactions with security devices. In so doing, it develops a conceptual typology of strategies for coping with new technology, inspired by Le Bourhis and Lascoumes: challenging, neutralizing and diverting. To that purpose, we adopt an ethnographic approach that focuses on the discourses, practices and actions of police officers in relation to three security devices: the mobile digital terminal, the mobile phone and the body camera. Based on a case study of a North American municipal police department, the article addresses how these technological devices are perceived and experienced by police officers on the beat.
Abstract:
Introduced in 2008, the femtosecond laser is a promising new technological advance which plays an ever increasing role in cataract surgery where it automates the three main surgical steps: corneal incision, capsulotomy and lens fragmentation. The proven advantages over manual surgery are: a better quality of incision with reduced induced astigmatism; increased reliability and reproducibility of the capsulotomy with increased stability of the implanted lens; a reduction in the use of ultrasound. Regarding refractive results or safety, however, no prospective randomized study to date has shown significant superiority compared with standard manual technique. The significant extra cost generated by this laser, undertaken by the patient, is a limiting factor for both its use and study. This review outlines the potential benefits of femtosecond-laser-assisted cataract surgery due to the automation of key steps and the safety of this new technology.
Abstract:
AIM: To assess the rate of bile duct injuries (BDI) and overall biliary complications during single-port laparoscopic cholecystectomy (SPLC) compared to conventional laparoscopic cholecystectomy (CLC). METHODS: SPLC has recently been proposed as an innovative surgical approach for gallbladder surgery. So far, its safety with respect to bile duct injuries has not been specifically evaluated. A systematic review of the literature published between January 1990 and November 2012 was performed. Randomized controlled trials (RCT) comparing SPLC versus CLC reporting BDI rate and overall biliary complications were included. The quality of RCT was assessed using the Jadad score. Analysis was made by performing a meta-analysis, using Review Manager 5.2. This study was based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines. A retrospective study including all retrospective reports on SPLC was also performed alongside. RESULTS: From 496 publications, 11 RCT including 898 patients were selected for meta-analysis. No studies were rated as high quality (Jadad score ≥ 4). Operative indications included benign gallbladder disease operated in an elective setting in all studies, excluding all emergency cases and acute cholecystitis. The median follow-up was 1 mo (range 0.03-18 mo). The incidence of BDI was 0.4% for SPLC and 0% for CLC; the difference was not statistically significant (P = 0.36). The incidence of overall biliary complications was 1.6% for SPLC and 0.5% for CLC; the difference did not reach statistical significance (P = 0.21, 95%CI: 0.66-15). Sixty non-randomized trials including 3599 patients were also analysed. The incidence of BDI reported there was 0.7%. CONCLUSION: The safety of SPLC cannot be assumed based on the current evidence. Hence, this new technology cannot be recommended as a standard technique for laparoscopic cholecystectomy.
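The kind of comparison reported above (0.4% vs. 0% BDI, P = 0.36) can be sketched with a simple two-sided z-test for the difference of two proportions. This is a minimal illustration using the normal approximation, not the actual Review Manager 5.2 analysis; the event counts below are illustrative assumptions, not the trial data.

```python
# Sketch: normal-approximation test for a difference of two event rates.
from math import sqrt, erf

def two_proportion_p(events_a, n_a, events_b, n_b):
    """Two-sided z-test for the difference of two proportions."""
    p_a, p_b = events_a / n_a, events_b / n_b
    pooled = (events_a + events_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0  # no events in either arm: no detectable difference
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

p = two_proportion_p(2, 450, 0, 448)  # illustrative counts only
```

With rare events and zero counts in one arm, as here, such approximations are fragile, which is one reason meta-analytic tools pool across trials rather than test single studies.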
Abstract:
SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by a high-throughput DNA sequencing procedure. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. Sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that could be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome.
We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
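Two of the ideas in the abstract above, filtering artifactual hot-spot loci and unbiased random down-sampling of sequence tags, can be sketched as follows. The threshold and the toy tag positions are illustrative assumptions, not the thesis's actual parameters or data.

```python
# Sketch: hot-spot filtering and random down-sampling of ChIP-Seq tags.
import random
from collections import Counter

def filter_hotspots(tag_positions, max_tags_per_position=10):
    """Drop tags at positions whose count exceeds a hot-spot threshold."""
    counts = Counter(tag_positions)
    return [p for p in tag_positions if counts[p] <= max_tags_per_position]

def random_sample_tags(tags, n, seed=0):
    """Unbiased random sample of n tags (without replacement)."""
    rng = random.Random(seed)
    return rng.sample(tags, n)

# Toy data: 50 tags piled on one locus (an artifact) plus 40 scattered tags.
tags = [100] * 50 + list(range(1000, 1040))
clean = filter_hotspots(tags)          # the pile-up at 100 is removed
subset = random_sample_tags(clean, 10)  # reproducible random subset
```

In practice the threshold would be derived from the genome-wide tag distribution rather than fixed by hand, but the filter-then-sample structure is the same.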
Abstract:
This dissertation focuses on the practice of regulatory governance, through the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). In an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996).
Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up to regulate very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, some hard questions about their role as political actors, and about their performance, remain unaddressed, even though, together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions.
Abstract:
The increasing number of bomb attacks involving improvised explosive devices, as well as the nature of the explosives, give rise to concern among safety and law enforcement agencies. The substances used in explosive charges are often everyday products diverted from their primary licit applications. Thus, reducing or limiting their accessibility for prevention purposes is difficult. Ammonium nitrate, employed in agriculture as a fertiliser, is used worldwide in small and large homemade bombs. Black powder, dedicated to hunting and shooting sports, is used illegally as a filling in pipe bombs, causing extensive damage. While the main developments of instrumental techniques in explosive analysis have constantly pushed the limits of detection, their actual contribution to the investigation of explosives in terms of source discrimination remains limited. Forensic science has seen the emergence of a new technology, isotope ratio mass spectrometry (IRMS), that shows promising results. Its very first application in forensic science dates back to 1979, when Liu et al. analysed cannabis plants coming from different countries [Liu et al. 1979]. This preliminary study highlighted its potential to discriminate specimens coming from different sources. Thirty years later, the keen interest in this new technology has given rise to a flourishing number of publications in forensic science. The countless applications of IRMS to a wide range of materials and substances attest to its success and suggest that the technique is ready to be used in forensic science. However, many studies are characterised by a lack of methodology and fundamental data. They have been undertaken in a top-down approach, applying this technique in an exploratory manner on a restricted sampling. Such an approach often does not allow the researcher to answer a number of questions, such as: do the specimens come from the same source, what do we mean by source, and what is the inherent variability of a substance?
The production of positive results has prevailed at the expense of forensic fundamentals. This research focused on the evaluation of the contribution of the information provided by isotopic analysis to the investigation of explosives. More specifically, this evaluation was based on a sampling of black powders and ammonium nitrate fertilisers coming from known sources. Not only has the methodology developed in this work enabled us to highlight crucial elements inherent to the methods themselves, but also to evaluate both the longitudinal and transversal variabilities of the information. First, the study of the variability of the profile over time was undertaken. Secondly, the variability of black powders and ammonium nitrate fertilisers within the same source and between different sources was evaluated. The contribution of this information to the investigation of explosives was then evaluated and discussed.
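The quantity that IRMS profiling compares across specimens is usually expressed in delta notation, the per mil deviation of a measured isotope ratio from an international standard. A minimal sketch of that conversion follows; the standard ratio used is the commonly cited VPDB 13C/12C value, and the measured ratio is an illustrative assumption, not data from this research.

```python
# Sketch: isotope ratio -> delta notation (per mil), as used in IRMS profiling.
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB standard (commonly cited value)

def delta_per_mil(r_sample, r_standard=R_VPDB):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

d = delta_per_mil(0.0109565)  # illustrative measured ratio, delta ~ -25 per mil
```

Comparing such delta values within and between sources is what the longitudinal and transversal variability studies described above operate on.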
Abstract:
An objective analysis of image quality parameters was performed for a computed radiography (CR) system using both standard single-side and prototype dual-side read plates. The pre-sampled modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) for the systems were determined at three different beam qualities representative of pediatric chest radiography, at an entrance detector air kerma of 5 microGy. The NPS and DQE measurements were realized under clinically relevant x-ray spectra for pediatric radiology, including x-ray scatter radiations. Compared to the standard single-side read system, the MTF for the dual-side read system is reduced, but this is offset by a significant decrease in image noise, resulting in a marked increase in DQE (+40%) in the low spatial frequency range. Thus, for the same image quality, the new technology permits the CR system to be used at a reduced dose level.
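The MTF/NPS/DQE chain described above is commonly combined as DQE(f) = MTF(f)^2 / (q · NNPS(f)), where NNPS is the NPS normalised by the squared mean signal and q is the incident photon fluence. The sketch below shows that combination on illustrative numbers; none of the values are the paper's measurements.

```python
# Sketch: combining MTF and normalised NPS into DQE at each spatial frequency.
def dqe(mtf, nnps, q):
    """Detective quantum efficiency: DQE(f) = MTF(f)^2 / (q * NNPS(f))."""
    return [m * m / (q * n) for m, n in zip(mtf, nnps)]

mtf = [1.0, 0.8, 0.5, 0.2]               # pre-sampled MTF at f = 0..3 cyc/mm (illustrative)
nnps = [4.0e-5, 3.5e-5, 3.0e-5, 2.5e-5]  # normalised NPS in mm^2 (illustrative)
q = 30000.0                              # incident photons per mm^2 (assumed fluence)
d = dqe(mtf, nnps, q)                    # DQE falls with frequency, as expected
```

This makes the trade-off in the abstract concrete: a lower MTF can still yield a higher DQE if the noise (NNPS) drops by more than the MTF squared, which is what the dual-side read plates achieve at low spatial frequencies.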
Abstract:
Background. Early identification of pathogens from blood cultures using matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry may optimize the choice of empirical antibiotic therapy in the setting of bloodstream infections. We aimed to assess the impact of this new technology on the use of antibiotic treatment in patients with gram-negative bacteremia. Methods. We conducted a prospective observational study from January to December 2010 to evaluate the sequential and separate impacts of Gram stain reporting and MALDI-TOF bacterial identification performed on blood culture pellets in patients with gram-negative bacteremia. The primary outcome was the impact of MALDI-TOF on empirical antibiotic choice. Results. Among 202 episodes of gram-negative bacteremia, Gram stain reporting had an impact in 42 cases (20.8%). MALDI-TOF identification led to a modification of empirical therapy in 71 of all 202 cases (35.1%), and in 16 of 27 cases (59.3%) of monomicrobial bacteremia caused by AmpC-producing Enterobacteriaceae. The most frequently observed impact was an early appropriate broadening of the antibiotic spectrum, in 31 of 71 cases (43.7%). In total, 143 of 165 episodes (86.7%) of monomicrobial bacteremia were correctly identified at the genus level by MALDI-TOF. Conclusions. In an area of low prevalence of extended-spectrum beta-lactamases (ESBL) and multiresistant gram-negative bacteria, MALDI-TOF performed on blood culture pellets had an impact on the clinical management of 35.1% of all gram-negative bacteremia cases, demonstrating a greater impact than Gram stain reporting. Thus, MALDI-TOF could become a vital second step beside the Gram stain in guiding the empirical treatment of patients with bloodstream infection.
Abstract:
This is the second edition of the compendium. Since the first edition, a number of important initiatives have been launched in the shape of large projects targeting integration of research infrastructure and new technology for toxicity studies and exposure monitoring. The demand for research in the area of human health and environmental safety management of nanotechnologies has been present for a decade and has been identified by several landmark reports and studies. Several guidance documents have been published. It is not the intention of this compendium to report on these, as they are widely available. It is also not the intention to publish scientific papers and research results, as this task is covered by scientific conferences and the peer-reviewed press. The intention of the compendium is to bring together researchers, create synergy in their work, and establish links and communication between them, mainly during the actual research phase before publication of results. Towards this purpose, we find it useful to give emphasis to communication of projects' strategic aims, extensive coverage of specific work objectives and of methods used in research, strengthening human capacities and laboratory infrastructure, and supporting collaboration for common goals and joint elaboration of future plans, without compromising scientific publication potential or IP rights. These targets are far from being achieved with the publication in its present shape. We shall continue working, though, and hope, with the assistance of the research community, to make significant progress. The publication will take the shape of a dynamic, frequently updated, web-based document available free of charge to all interested parties. Researchers in this domain are invited to join the effort by communicating the work being done.
Abstract:
This article reviews: 1) some of the results of drug-eluting stent trials (SYNTAX and FAME); 2) the questionable benefit of physical training in heart failure patients (HF-ACTION); 3) the benefit of cardiac resynchronisation on cardiac remodelling in heart failure patients (REVERSE study); and 4) the role of rate versus rhythm control in patients with atrial fibrillation and heart failure (AF-CHF study). This article also reports the encouraging evolution of new technology allowing percutaneous implantation of stent-valves. Finally, it addresses the screening of athletes for cardiac diseases.
Abstract:
Since the development of the first whole-cell living biosensor or bioreporter about 15 years ago, construction and testing of new genetically modified microorganisms for environmental sensing and reporting has proceeded at an ever increasing rate. One and a half decades appear as a reasonable time span for a new technology to reach the maturity needed for application and commercial success. It seems, however, that the research into cellular biosensors is still mostly in a proof-of-principle or demonstration phase and not close to extensive or commercial use outside of academia. In this review, we consider the motivations for bioreporter developments and discuss the suitability of extant bioreporters for the proposed applications to stimulate complementary research and to help researchers to develop realistic objectives. This includes the identification of some popular misconceptions about the qualities and shortcomings of bioreporters.
Abstract:
BACKGROUND/AIMS: The present report examines a new pig model for progressive induction of high-grade stenosis, for the study of chronic myocardial ischemia and the dynamics of collateral vessel growth. METHODS: Thirty-nine Landrace pigs were instrumented with a novel experimental stent (GVD stent) in the left anterior descending coronary artery. Eight animals underwent transthoracic echocardiography at rest and under low-dose dobutamine. Seven animals were examined by nuclear PET and SPECT analysis. Epi-, mid- and endocardial fibrosis and the numbers of arterial vessels were examined by histology. RESULTS: Functional analysis showed a significant decrease in global left ventricular ejection fraction (24.5 +/- 1.6%) 3 weeks after implantation. There was a trend to increased left ventricular ejection fraction after low-dose dobutamine stress (36.0 +/- 6.6%) and a significant improvement of the impaired regional anterior wall motion. PET and SPECT imaging documented chronic hibernation. Myocardial fibrosis increased significantly in the ischemic area with a gradient from epi- to endocardial. The number of arterial vessels in the ischemic area increased and coronary angiography showed abundant collateral vessels of Rentrop class 1. CONCLUSION: The presented experimental model mimics the clinical situation of chronic myocardial ischemia secondary to 1-vessel coronary disease.