822 results for current medical knowledge
Abstract:
The rapid emergence of infectious diseases calls for immediate attention to determine practical solutions for intervention strategies. To this end, it becomes necessary to obtain a holistic view of the complex host-pathogen interactome. Advances in omics and related technology have resulted in massive generation of data for the interacting systems at unprecedented levels of detail. Systems-level studies with the aid of mathematical tools contribute to a deeper understanding of biological systems, where intuitive reasoning alone does not suffice. In this review, we discuss different aspects of host-pathogen interactions (HPIs) and the available data resources and tools used to study them. We discuss in detail models of HPIs at various levels of abstraction, along with their applications and limitations. We also present a few case studies, which incorporate different modeling approaches, providing significant insights into disease. (c) 2013 Wiley Periodicals, Inc.
Abstract:
Telecommunication, broadcasting and other instrumented towers carry power and/or signal cables from their ground end to their upper regions. During a direct hit to the tower, significant currents can be induced in these mounted cables. In order to provide adequate protection to the equipment connected to them, protection schemes have been evolved in the literature. Development of more effective protection schemes requires quantitative knowledge of various parameters; however, such quantitative knowledge is difficult to find at present. Among these aspects, the present work aims to investigate two important questions: (i) what the nature of the induced currents would be, and (ii) what the current sharing will be if, as per common practice, the sheath of the cable is connected to the down conductor/tower. These results will be useful in the design of protection schemes and also in analyzing the field structure around instrumented towers.
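A minimal sketch of the kind of current-sharing estimate this abstract refers to, assuming a simple lumped frequency-domain model in which the tower/down conductor and the bonded cable sheath are treated as two parallel R-L paths; the model and all component values below are illustrative assumptions, not taken from the paper.

```python
# Illustrative current-divider sketch (assumed lumped model, not the paper's method):
# split an injected stroke-current component between a tower/down conductor and a
# bonded cable sheath, modelled as two parallel R-L impedances.
import numpy as np

def current_sharing(i_total, f, r_tower, l_tower, r_sheath, l_sheath):
    """Divide a sinusoidal current of frequency f (Hz) between two parallel
    R-L paths in inverse proportion to their complex impedances."""
    w = 2 * np.pi * f
    z_tower = r_tower + 1j * w * l_tower
    z_sheath = r_sheath + 1j * w * l_sheath
    i_sheath = i_total * z_tower / (z_tower + z_sheath)  # current-divider rule
    return i_total - i_sheath, i_sheath

# Example: a 10 kA component at 100 kHz with assumed path parameters.
i_tower, i_sheath = current_sharing(10e3, 100e3,
                                    r_tower=0.5, l_tower=20e-6,
                                    r_sheath=2.0, l_sheath=30e-6)
print(f"tower: {abs(i_tower):.0f} A, sheath: {abs(i_sheath):.0f} A")
```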
Abstract:
Electrical Impedance Tomography (EIT) is a computerized medical imaging technique which reconstructs the electrical impedance images of a domain under test from the boundary voltage-current data measured by an EIT electronic instrumentation using an image reconstruction algorithm. Being a computed tomography technique, EIT injects a constant current into the patient's body through the surface electrodes surrounding the domain to be imaged (Omega) and calculates the spatial distribution of electrical conductivity or resistivity of the closed conducting domain from the potentials developed at the domain boundary (partial derivative Omega). Practical phantoms are essential for studying, testing and calibrating a medical EIT system before it is certified for diagnostic imaging on patients. EIT phantoms are therefore required to generate boundary data for studying and assessing the instrumentation and inverse solvers in EIT. For proper assessment of an inverse solver of a 2D EIT system, a true 2D practical phantom is required. As practical phantoms are assemblies of objects with 3D geometries, developing a practical 2D phantom is a great challenge, and the boundary data generated from practical phantoms with 3D geometry are therefore inappropriate for assessing a 2D inverse solver. Furthermore, the boundary data errors contributed by the instrumentation are difficult to separate from the errors introduced by the 3D phantoms. Hence, error-free boundary data are essential for assessing the inverse solver in 2D EIT. In this direction, a MATLAB-based Virtual Phantom for 2D EIT (MatVP2DEIT) is developed to generate accurate boundary data for assessing 2D-EIT inverse solvers and image reconstruction accuracy. MatVP2DEIT is a MATLAB-based computer program which simulates a phantom in the computer and generates boundary potential data as outputs from combinations of different phantom parameters supplied as inputs. Phantom diameter, inhomogeneity geometry (shape, size and position), number of inhomogeneities, applied current magnitude, background resistivity and inhomogeneity resistivity are all set as phantom variables, which are provided as input parameters to MatVP2DEIT for simulating different phantom configurations. A constant current injection is simulated at the phantom boundary with different current injection protocols and the boundary potential data are calculated. Boundary data sets are generated for different phantom configurations obtained from different combinations of the phantom variables, and the resistivity images are reconstructed using EIDORS. Boundary data of virtual phantoms containing inhomogeneities with complex geometries are also generated for different current injection patterns using MatVP2DEIT, and the resistivity imaging is studied. The effect of the regularization method on image reconstruction is also studied with the data generated by MatVP2DEIT. Resistivity images are evaluated by studying the resistivity parameters and contrast parameters estimated from the elemental resistivity profiles of the reconstructed phantom domain. Results show that MatVP2DEIT generates accurate boundary data for different types of single or multiple objects, which are efficient and accurate enough to reconstruct the resistivity images in EIDORS.
The spatial resolution studies show that resistivity imaging conducted with the boundary data generated by MatVP2DEIT with 2048 elements can reconstruct two circular inhomogeneities placed with a minimum boundary-to-boundary distance of 2 mm. It is also observed that, in MatVP2DEIT with 2048 elements, the boundary data generated for a phantom with a circular inhomogeneity whose diameter is less than 7% of that of the phantom domain can produce resistivity images in EIDORS with a 1968-element mesh. Results also show that MatVP2DEIT accurately generates boundary data for neighbouring, opposite reference and trigonometric current patterns, which are very suitable for resistivity reconstruction studies. MatVP2DEIT-generated data are also found suitable for studying the effect of different regularization methods on the reconstruction process. By comparing the reconstructed image with the original geometry defined in MatVP2DEIT, it becomes easier to study the resistivity imaging procedures as well as the inverse solver performance. Using the proposed MatVP2DEIT software with modified domains, the cross-sectional anatomy of a number of body parts can be simulated on a PC and the impedance image reconstruction of human anatomy can be studied.
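To make the workflow concrete, the sketch below mimics, in miniature, what a virtual 2D phantom does: assign a conductivity map with an inhomogeneity, simulate a constant current injected between two boundary nodes, and read off the boundary potentials. It is a coarse finite-difference stand-in written for illustration; it is not the MatVP2DEIT code, whose interface is not described in the abstract, and every grid size and material value is an assumption.

```python
# Illustrative sketch (assumptions throughout): a coarse finite-difference forward
# model of a 2D EIT phantom. A square conductivity map stands in for the circular
# domain, current is injected between two adjacent boundary nodes (a "neighbouring"
# pattern), and the potentials on one boundary edge are printed.
import numpy as np

def forward_eit(sigma, inject, sink, current=1e-3):
    """Solve div(sigma grad u) = 0 on a grid with a point current source and
    sink on the boundary; returns the potential at every node."""
    n, m = sigma.shape
    N = n * m
    idx = lambda i, j: i * m + j
    A = np.zeros((N, N))
    b = np.zeros(N)
    for i in range(n):
        for j in range(m):
            k = idx(i, j)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < m:
                    # conductance between neighbouring cells (harmonic mean)
                    g = 2 * sigma[i, j] * sigma[ii, jj] / (sigma[i, j] + sigma[ii, jj])
                    A[k, k] += g
                    A[k, idx(ii, jj)] -= g
    b[idx(*inject)] = +current
    b[idx(*sink)] = -current
    # fix the potential of one node so the system is non-singular
    ref = idx(0, 0)
    A[ref, :] = 0.0
    A[ref, ref] = 1.0
    b[ref] = 0.0
    return np.linalg.solve(A, b).reshape(n, m)

# Homogeneous background with one resistive circular inhomogeneity (assumed values).
n = 16
sigma = np.full((n, n), 1.0)                      # background conductivity, S/m
yy, xx = np.mgrid[0:n, 0:n]
sigma[(xx - 10) ** 2 + (yy - 5) ** 2 < 4] = 0.1   # resistive inclusion
u = forward_eit(sigma, inject=(0, 3), sink=(0, 4))
print("boundary potentials (top edge):", np.round(u[0, :], 4))
```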
Abstract:
Insulated gate bipolar transistors (IGBTs) are used in high-power voltage-source converters rated up to hundreds of kilowatts or even a few megawatts. Knowledge of device switching characteristics is required for reliable design and operation of the converters. Switching characteristics are studied widely at high current levels, and corresponding data are available in datasheets. However, the devices in a converter also switch low currents close to the zero crossings of the line currents, and the switching behaviour under these conditions could significantly influence the output waveform quality, including zero-crossover distortion. Hence, the switching characteristics of high-current IGBTs (300-A and 75-A IGBT modules) at low load current magnitudes are investigated experimentally in this paper. The collector current, gate-emitter voltage and collector-emitter voltage are measured at various low values of current (less than 10% of the device rated current). A specially designed, in-house-constructed coaxial current transformer (CCT) is used for device current measurement without increasing the loop inductance in the power circuit. Experimental results show that the device voltage rise time increases significantly during turn-off transitions at low currents.
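As an illustration of the kind of post-processing such measurements imply, the sketch below extracts a 10-90% collector-emitter voltage rise time from a sampled turn-off waveform. The waveform, bus voltage and time constant are synthetic assumptions, not the authors' data.

```python
# Illustrative sketch (assumed, not the authors' code): extract the 10-90%
# collector-emitter voltage rise time from a sampled turn-off waveform.
import numpy as np

def rise_time(t, v_ce, lo=0.1, hi=0.9):
    """Time for v_ce to first cross lo*V_final and hi*V_final."""
    v_final = v_ce[-1]
    t_lo = t[np.argmax(v_ce >= lo * v_final)]
    t_hi = t[np.argmax(v_ce >= hi * v_final)]
    return t_hi - t_lo

# Synthetic turn-off waveform: 600 V bus, exponential-like rise (assumed values).
t = np.linspace(0, 5e-6, 5001)                 # 5 us window, 1 ns steps
v_ce = 600 * (1 - np.exp(-t / 0.8e-6))
print(f"10-90% voltage rise time: {rise_time(t, v_ce) * 1e9:.0f} ns")
```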
Abstract:
The broader goal of the research described here is to automatically acquire diagnostic knowledge from documents in the domain of manual and mechanical assembly of aircraft structures. These documents are treated as a discourse used by experts to communicate with others; it therefore becomes possible to use discourse analysis to enable machine understanding of the text. The research challenge addressed in the paper is to identify documents, or sections of documents, that are potential sources of knowledge. In a subsequent step, domain knowledge will be extracted from these segments. The segmentation task requires partitioning the document into relevant segments and understanding the context of each segment. In discourse analysis, the division of a discourse into segments is achieved through certain indicative clauses called cue phrases that signal changes in the discourse context. However, in formal documents such language may not be used. Hence, the use of a domain-specific ontology and an assembly process model is proposed to segregate chunks of the text based on a local context. Elements of the ontology/model and their related terms serve as indicators of the current context of a segment and of changes in context between segments. Local contexts are aggregated for increasingly larger segments to identify whether the document (or portions of it) pertains to the topic of interest, namely assembly. Knowledge acquired through such processes enables acquisition and reuse of knowledge during any part of the lifecycle of a product.
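A minimal sketch of the proposed idea, under assumptions: a toy "ontology" is just a set of assembly-related terms, and a segment's local context is scored by the fraction of those terms it contains. The term list, scoring rule and threshold below are hypothetical illustrations, not the paper's actual ontology or process model.

```python
# Illustrative sketch (hypothetical ontology and threshold): score document
# segments by the density of assembly-ontology terms to decide which segments
# carry the local context of interest.
import re

ASSEMBLY_ONTOLOGY = {"rivet", "fastener", "jig", "shim", "drill", "torque",
                     "bracket", "spar", "skin panel", "sealant"}  # assumed terms

def context_score(segment, ontology=ASSEMBLY_ONTOLOGY):
    """Fraction of ontology terms that appear in a text segment."""
    text = segment.lower()
    hits = sum(1 for term in ontology
               if re.search(r"\b" + re.escape(term) + r"\b", text))
    return hits / len(ontology)

def relevant_segments(segments, threshold=0.2):
    """Keep segments whose local context looks like assembly discourse."""
    return [(i, s) for i, s in enumerate(segments) if context_score(s) >= threshold]

doc = ["Drill the pilot holes and install each rivet to the specified torque.",
       "The quarterly budget review is scheduled for next month.",
       "Apply sealant to the spar before the skin panel is fastened with a jig."]
for i, seg in relevant_segments(doc):
    print(i, "->", seg)
```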
Abstract:
This work was presented at the V AERNA Conference, Faro (Portugal), 30 May-1 June 2012, and the IV Workshop on Valuation Methods in Agro-food and Environmental Economics, Castelldefels (Barcelona, Spain), 12-13 July 2012.
Abstract:
The tendency of the tunas, especially the yellowfin (Neothunnus macropterus), to be more abundant in the near vicinity of islands and seamounts, or "banks", than in the surrounding oceanic areas is well known to commercial fishermen. This has been confirmed by statistical analysis of fishing vessel logbook records, which demonstrates that the catch per day's fishing is, indeed, higher in the near vicinity of these features. It is hypothesized that islands and seamounts cause changes in the physical circulation or the biochemical cycle, resulting in greater supplies of food for tunas in their immediate environs. In order to examine this hypothesis, and in order to study possible mechanisms involved, the "Island Current Survey" was undertaken from 8 May to 12 June, 1957, under the joint auspices of the Inter-American Tropical Tuna Commission and the Scripps Institution of Oceanography. Surveys of varying nature and extent were made from M/V Spencer F. Baird near Alijos Rocks, Clarion Island, Shimada Bank and Socorro Island (Figure 1). These studies sought to provide knowledge of the action of islands and seamounts in arresting, stalling or deflecting the mean current past them, in establishing convergence and divergence in the surface flow, in producing vertical motion (mixing and upwelling), and in influencing the primary production and the standing crops of phytoplankton and zooplankton. Each survey is discussed below in detail. Observations made at a front on 10 June will be discussed in another paper.
Abstract:
This technical memorandum fulfills Task 2 of Agreement 03-495 between El Dorado County and the Office of Water Programs at California State University, Sacramento, and its co-authors, Bachand & Associates and the University of California Tahoe Research Group: 1) a review of current stormwater treatment Best Management Practices (BMPs) in the Tahoe Basin and their potential effectiveness in removing fine particles and reducing nutrient concentrations; 2) an assessment of the potential for improving the performance of different types of existing BMPs through retrofitting or better maintenance practices; 3) a review of additional promising treatment technologies not currently in use in the Tahoe Basin; and 4) a list of recommendations to help address the knowledge gaps in BMP design and performance. ... (PDF contains 67 pages)
Abstract:
Executive Summary: Observations show that warming of the climate is unequivocal. The global warming observed over the past 50 years is due primarily to human-induced emissions of heat-trapping gases. These emissions come mainly from the burning of fossil fuels (coal, oil, and gas), with important contributions from the clearing of forests, agricultural practices, and other activities. Warming over this century is projected to be considerably greater than over the last century. The global average temperature since 1900 has risen by about 1.5°F. By 2100, it is projected to rise another 2 to 11.5°F. The U.S. average temperature has risen by a comparable amount and is very likely to rise more than the global average over this century, with some variation from place to place. Several factors will determine future temperature increases. Increases at the lower end of this range are more likely if global heat-trapping gas emissions are cut substantially. If emissions continue to rise at or near current rates, temperature increases are more likely to be near the upper end of the range. Volcanic eruptions or other natural variations could temporarily counteract some of the human-induced warming, slowing the rise in global temperature, but these effects would only last a few years. Reducing emissions of carbon dioxide would lessen warming over this century and beyond. Sizable early cuts in emissions would significantly reduce the pace and the overall amount of climate change. Earlier cuts in emissions would have a greater effect in reducing climate change than comparable reductions made later. In addition, reducing emissions of some shorter-lived heat-trapping gases, such as methane, and some types of particles, such as soot, would begin to reduce warming within weeks to decades. Climate-related changes have already been observed globally and in the United States. These include increases in air and water temperatures, reduced frost days, increased frequency and intensity of heavy downpours, a rise in sea level, and reduced snow cover, glaciers, permafrost, and sea ice. A longer ice-free period on lakes and rivers, lengthening of the growing season, and increased water vapor in the atmosphere have also been observed. Over the past 30 years, temperatures have risen faster in winter than in any other season, with average winter temperatures in the Midwest and northern Great Plains increasing more than 7°F. Some of the changes have been faster than previous assessments had suggested. These climate-related changes are expected to continue while new ones develop. Likely future changes for the United States and surrounding coastal waters include more intense hurricanes with related increases in wind, rain, and storm surges (but not necessarily an increase in the number of these storms that make landfall), as well as drier conditions in the Southwest and Caribbean. These changes will affect human health, water supply, agriculture, coastal areas, and many other aspects of society and the natural environment. This report synthesizes information from a wide variety of scientific assessments (see page 7) and recently published research to summarize what is known about the observed and projected consequences of climate change on the United States. It combines analysis of impacts on various sectors such as energy, water, and transportation at the national level with an assessment of key impacts on specific regions of the United States.
For example, sea-level rise will increase risks of erosion, storm surge damage, and flooding for coastal communities, especially in the Southeast and parts of Alaska. Reduced snowpack and earlier snow melt will alter the timing and amount of water supplies, posing significant challenges for water resource management in the West. (PDF contains 196 pages)
Abstract:
With growing interest in establishing conservation efforts for groundfish stocks and continued scrutiny of the value of marine protected areas along the west coast, enhancing our knowledge of seabed characteristics through mapping activities, and doing so in a timely manner, is becoming increasingly important. Shortly after the inception of the Seabed Mapping Initiative instituted with the US Geological Survey (USGS), the National Marine Sanctuary Program (NMSP) assembled a panel of habitat mapping experts. They determined that the status of existing data sets and future data acquisition needs varied widely among the individual sanctuaries and that more detailed site assessments were needed to better prioritize mapping efforts and outline an overall joint strategy. To assist with that specific effort and provide pertinent information for the Olympic Coast National Marine Sanctuary's (OCNMS) Management Plan Review, this report summarizes the mapping efforts that have taken place at the site to date; calculates a timeframe for completion of baseline mapping efforts when operating under current data acquisition limitations; describes an optimized survey strategy to dramatically reduce the time required to complete baseline surveying; and provides estimates of the vessel sea-days (DAS) needed to accomplish baseline survey completion within 2-, 5- and 10-year timeframes. (PDF contains 38 pages.)
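As a rough illustration of how such sea-day estimates can be apportioned across alternative schedules, the arithmetic below divides an assumed remaining survey area by an assumed daily coverage rate; all figures are placeholders, not the report's numbers.

```python
# Illustrative arithmetic (all numbers assumed, not the report's figures):
# apportion the vessel days at sea (DAS) needed to finish baseline mapping
# over 2-, 5- and 10-year schedules.
remaining_area_km2 = 2000.0        # assumed unmapped seabed area
coverage_km2_per_das = 12.0        # assumed survey coverage per day at sea

total_das = remaining_area_km2 / coverage_km2_per_das
for years in (2, 5, 10):
    print(f"{years:2d}-year plan: {total_das / years:5.1f} DAS per year "
          f"({total_das:.0f} DAS total)")
```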
Abstract:
Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasizes the need for robust cost modelling. Such models should enable these institutions to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. In order to justify the costs, institutions also need to describe the expected benefits of preserving digital information. This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges with regard to the preservation of research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place in the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.