936 results for "Best available techniques"


Relevance: 30.00%

Abstract:

Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, xenon CT is an equilibrium technique based on a freely diffusible tracer. The first pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods are proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray doses delivered to patients need to be addressed. MR can also assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method that assesses brain perfusion without injection of contrast. In this case, the blood flowing in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limits that will be illustrated and summarised.

Learning objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).
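To illustrate why the linear relationship between iodine concentration and x-ray attenuation makes first-pass CT perfusion quantitative, the sketch below estimates cerebral blood volume (CBV) as the ratio of the areas under a tissue and an arterial enhancement curve. The curve shapes, the function name, and the scaling constant are illustrative assumptions, not a clinical implementation.

```python
import numpy as np

def cbv_from_curves(tissue_hu, artery_hu, dt):
    """Estimate CBV (mL/100 g) from first-pass enhancement curves.

    Relies on the linear HU-vs-concentration relationship of iodinated
    contrast, so areas under the HU curves stand in for contrast mass.
    The density/hematocrit conversion is folded into an assumed `scale`.
    """
    scale = 100.0  # illustrative conversion to mL/100 g
    area_tissue = np.sum(tissue_hu) * dt   # rectangle-rule integration
    area_artery = np.sum(artery_hu) * dt
    return scale * area_tissue / area_artery

# Synthetic enhancement curves (Gaussian boluses), sampled once per second.
t = np.arange(0, 40, 1.0)                             # seconds
artery = 300 * np.exp(-0.5 * ((t - 12) / 3) ** 2)     # arterial input, HU
tissue = 12 * np.exp(-0.5 * ((t - 16) / 5) ** 2)      # tissue enhancement, HU

print(round(cbv_from_curves(tissue, artery, 1.0), 2))
```

Because both curves are read in the same HU units, the scaling cancels out of the ratio; this linearity is exactly what the non-linear gadolinium signal response denies to first-pass MR perfusion.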

Relevance: 30.00%

Abstract:

This special issue aims to cover some problems related to non-linear and non-conventional speech processing. It originates in the ISCA Tutorial and Research Workshop on Non-Linear Speech Processing, NOLISP’09, held at the Universitat de Vic (Catalonia, Spain) on June 25–27, 2009. The series of NOLISP workshops, started in 2003, has become a biennial event whose aim is to discuss alternative techniques for speech processing that, in a sense, do not fit into mainstream approaches. A selection of papers based on the presentations delivered at NOLISP’09 has given rise to this issue of Cognitive Computation.

Relevance: 30.00%

Abstract:

The pharmaceutical industry has been facing several challenges during the last years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
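The benchmark's success criterion can be made concrete with a short sketch: a docked pose counts as a correct binding mode when its RMSD to the crystal pose is below 2 Å. This is an illustrative calculation, not EADock's actual code; the one-to-one atom pairing and the coordinates are assumed.

```python
import numpy as np

def rmsd(pose_a, pose_b):
    """Root-mean-square deviation (Å) between two (N, 3) coordinate arrays.

    Assumes the atoms are already paired one-to-one and no additional
    superposition is applied (poses share the crystal frame).
    """
    diff = np.asarray(pose_a) - np.asarray(pose_b)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Hypothetical 3-atom ligand: crystal pose vs. a uniformly shifted pose.
crystal = np.array([[0.0, 0.0, 0.0],
                    [1.5, 0.0, 0.0],
                    [1.5, 1.5, 0.0]])
docked = crystal + 0.5          # 0.5 Å shift along each axis

d = rmsd(docked, crystal)       # sqrt(3 * 0.25) ≈ 0.87 Å
print(d < 2.0)                  # True: counts as a correct binding mode
```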

Relevance: 30.00%

Abstract:

The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e., row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis techniques workshops.
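Hypercube generation of this kind follows the classic MapReduce pattern: mappers emit (cell, 1) pairs keyed by a tuple of bin indices, and reducers sum the counts per cell. Below is a minimal single-process sketch of that pattern; the field names (`mag`, `parallax`) and bin widths are hypothetical, not Gaia's actual data model or the paper's framework.

```python
from collections import defaultdict
from itertools import chain

def mapper(star):
    # Emit a (cell, 1) pair; the cell is the tuple of bin indices
    # along each dimension of the hypercube.
    mag_bin = int(star["mag"])             # 1-magnitude-wide bins
    plx_bin = int(star["parallax"] // 5)   # 5 mas-wide bins (assumed)
    yield (mag_bin, plx_bin), 1

def reducer(pairs):
    # Sum counts per cell, as the framework would after the shuffle phase.
    cube = defaultdict(int)
    for cell, count in pairs:
        cube[cell] += count
    return dict(cube)

stars = [{"mag": 12.3, "parallax": 7.1},
         {"mag": 12.9, "parallax": 8.4},
         {"mag": 15.2, "parallax": 2.0}]
cube = reducer(chain.from_iterable(mapper(s) for s in stars))
print(cube)   # {(12, 1): 2, (15, 0): 1}
```

In a real MapReduce deployment the mapper and reducer run distributed over many nodes, and the storage layout (row- vs. column-oriented) determines how cheaply the mapper can read just the binned columns.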


Relevance: 30.00%

Abstract:

The current means and methods of verifying that high-strength bolts have been properly tightened are very laborious and time consuming. In some cases, the techniques require special equipment; in other cases, the verification itself may be somewhat subjective. While some commercially available verification techniques do exist, these options still have limitations and might be considered costly. The main objectives of this project were to explore high-strength bolt-tightening and verification techniques and to investigate the feasibility of developing and implementing new alternatives. A literature search and a survey of state departments of transportation (DOTs) were conducted to collect information on various bolt-tightening techniques, so that an understanding of available and under-development techniques could be obtained. During the literature review, the requirements for materials, inspection, and installation methods outlined in the Research Council on Structural Connections specification were also reviewed and summarized. To guide the search for new alternatives and technology development, a working group meeting was held at the Iowa State University Institute for Transportation on October 12, 2015. During the meeting, topics central to the research were discussed with Iowa DOT engineers and other professionals with relevant experience.

Relevance: 30.00%

Abstract:

This research project was initiated in 1988 to study the effectiveness of four different construction techniques for establishing a stable base on a granular surfaced roadway. After base stabilization, the roadway was then seal coated, eliminating dust problems associated with granular surfaced roads. When monies become available, the roadway can be surfaced with a more permanent structure. A 2.8 mile section of the Horseshoe Road in Dubuque County was divided into four divisions for the study. This report discusses the procedures used during construction of these different divisions. Problems and possible solutions have been analyzed to better understand the capabilities of the materials and construction techniques used on the project.

Relevance: 30.00%

Abstract:

Genetics is the study of heredity, which means the study of genes and factors related to all aspects of genes. The scientific history of genetics began with the works of Gregor Mendel in the mid-19th century. Prior to Mendel, genetics was primarily theoretical; after Mendel, the science of genetics was broadened to include experimental genetics. Developments in all fields of genetics and genetic technology in the first half of the 20th century provided a basis for the later developments. In the second half of the 20th century, the molecular background of genetics became more understandable. Rapid technological advancements, followed by the completion of the Human Genome Project, have contributed a great deal to the knowledge of genetic factors and their impact on human life and diseases. Currently, more than 1800 disease genes have been identified, more than 2000 genetic tests have become available, and, in conjunction with this, at least 350 biotechnology-based products have been released onto the market. Novel technologies, particularly next generation sequencing, have dramatically accelerated the pace of biological research, while at the same time increasing expectations. In this paper, a brief summary of genetic history with short explanations of the most popular genetic techniques is given.

Relevance: 30.00%

Abstract:

We provide robust and compelling evidence of the marked impact of tertiary education on the economic growth of less developed countries, and of its relatively smaller impact on the growth of developed ones. Our results argue in favor of the accumulation of high skill levels, especially in technologically under-developed countries and, contrary to common wisdom, independently of the fact that these economies might initially produce lower-technology goods or perform technology imitation. Our results are robust to the different measures used to proxy human capital and to the adjustments made for cross-country differences in the quality of education. Country-specific institutional quality, as well as other indicators including legal origin, religious fractionalization, and openness to trade, have been used to check the robustness of the results. These factors are also shown to speed up technology convergence, thereby confirming previous empirical studies. Our estimates tackle problems of endogeneity by adopting a variety of techniques, including instrumental variables (for both panel and cross-section analyses) and the two-step efficient dynamic system GMM.
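One of the endogeneity techniques mentioned, instrumental variables, can be sketched as two-stage least squares (2SLS) on synthetic data: the first stage projects the endogenous regressor on the instrument, and the second regresses the outcome on the fitted values. The data-generating process and all coefficient values below are invented for illustration and bear no relation to the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)               # instrument: correlated with x, not with u
u = rng.normal(size=n)               # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)          # endogenous regressor
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true coefficient on x is 2.0

# Stage 1: fit x on z (no intercept; variables are mean-zero by construction).
x_hat = z * (z @ x) / (z @ z)

# Stage 2: regress y on the fitted values; this recovers the causal effect.
beta_iv = (x_hat @ y) / (x_hat @ x_hat)

# Naive OLS, biased upward because the confounder u drives both x and y.
beta_ols = (x @ y) / (x @ x)
print(round(beta_iv, 2), round(beta_ols, 2))
```

The IV estimate lands near the true value 2.0, while OLS is pulled well above it by the confounder; the same logic, with panels and many instruments, underlies the system GMM estimator the abstract cites.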

Relevance: 30.00%

Abstract:

Computed tomography (CT) is a modality of choice for the study of the musculoskeletal system for various indications including the study of bone, calcifications, internal derangements of joints (with CT arthrography), as well as periprosthetic complications. However, CT remains intrinsically limited by the fact that it exposes patients to ionizing radiation. Scanning protocols need to be optimized to achieve diagnostic image quality at the lowest radiation dose possible. In this optimization process, the radiologist needs to be familiar with the parameters used to quantify radiation dose and image quality. CT imaging of the musculoskeletal system has certain specificities including the focus on high-contrast objects (i.e., in CT of bone or CT arthrography). These characteristics need to be taken into account when defining a strategy to optimize dose and when choosing the best combination of scanning parameters. In the first part of this review, we present the parameters used for the evaluation and quantification of radiation dose and image quality. In the second part, we discuss different strategies to optimize radiation dose and image quality at CT, with a focus on the musculoskeletal system and the use of novel iterative reconstruction techniques.
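The dose quantities the optimization process turns on can be illustrated with a toy calculation: the dose-length product (DLP = CTDIvol × scan length) and an effective dose estimated from it via a region-dependent conversion coefficient k. All numbers below, including k, are assumed illustrative values, not reference coefficients.

```python
def dose_length_product(ctdi_vol_mgy, scan_length_cm):
    """DLP (mGy*cm) = CTDIvol (mGy) * scan length (cm)."""
    return ctdi_vol_mgy * scan_length_cm

def effective_dose_msv(dlp_mgy_cm, k_msv_per_mgy_cm):
    """Effective dose estimate: E = k * DLP, k depending on body region."""
    return k_msv_per_mgy_cm * dlp_mgy_cm

# Hypothetical musculoskeletal protocol: 10 mGy CTDIvol over 20 cm.
dlp = dose_length_product(10.0, 20.0)     # 200.0 mGy*cm
print(effective_dose_msv(dlp, 0.0004))    # assumed k; ≈ 0.08 mSv
```

Halving CTDIvol (e.g., by lowering tube current and recovering image quality with iterative reconstruction) halves DLP and the effective dose estimate, which is the arithmetic behind the optimization strategies the review discusses.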


Relevance: 30.00%

Abstract:

Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution, where machines would interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering is focused on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and how it can be manipulated through interactions with artificial devices including brain–computer interfaces and neuroprosthetics.

Relevance: 30.00%

Abstract:

A study of the partial USEPA 3050B and total ISO 14869-1:2001 digestion methods for sediments was performed. USEPA 3050B was recommended as the simpler method with less operational risk. However, the extraction ability of the method should be taken into account for the best environmental interpretation of the results. FAAS was used to quantify metal concentrations in the sediment solutions. The alternative use of ICP-OES quantification should be conditioned on a prior detailed investigation, and eventual correction, of the matrix effect. For the first time, the EID method was employed for the detection and correction of the matrix effect in sediment ICP-OES analysis. Finally, some considerations were made about the level of metal contamination in the area under study.

Relevance: 30.00%

Abstract:

The current study aims to identify the best method for rapid and efficient extraction of flavonoids from Alpinia zerumbet. Dried leaves were extracted with distilled water and 70% ethanol using shaking maceration, ultrasonic, microwave, and stirring extraction methods. Rutin and kaempferol-3-O-glucuronide were detected by TLC and reversed-phase HPLC. Ethanol 70% was more efficient than water for flavonoid extraction. No significant yield variation was verified among the ultrasonic, microwave, and stirring methods using 70% ethanol (11 to 14%). The relative concentrations of rutin and kaempferol-3-O-glucuronide were, respectively, 1.5 and 5.62 mg g-1 dried leaves with the ultrasonic method, and 1.0 and 6.64 mg g-1 dried leaves with the microwave method, using ethanol. Rapid and simplified extraction procedures optimize phytochemical work and the acquisition of secondary metabolites.