990 results for Domain elimination method


Relevance: 30.00%

Abstract:

The mismatching of alveolar ventilation and perfusion (VA/Q) is the major determinant of impaired gas exchange. The gold standard for measuring VA/Q distributions is based on measurements of the elimination and retention of infused inert gases. Conventional multiple inert gas elimination technique (MIGET) uses gas chromatography (GC) to measure the inert gas partial pressures, which requires tonometry of blood samples with a gas that can then be injected into the chromatograph. The method is laborious and requires meticulous care. A new technique based on micropore membrane inlet mass spectrometry (MMIMS) facilitates the handling of blood and gas samples and provides nearly real-time analysis. In this study we compared MIGET by GC and MMIMS in 10 piglets: 1) 3 with healthy lungs; 2) 4 with oleic acid injury; and 3) 3 with isolated left lower lobe ventilation. The different protocols ensured a large range of normal and abnormal VA/Q distributions. Eight inert gases (SF6, krypton, ethane, cyclopropane, desflurane, enflurane, diethyl ether, and acetone) were infused; six of these gases were measured with MMIMS, and six were measured with GC. We found close agreement of retention and excretion of the gases and the constructed VA/Q distributions between GC and MMIMS, and predicted PaO2 from both methods compared well with measured PaO2. VA/Q by GC produced more widely dispersed modes than MMIMS, explained in part by differences in the algorithms used to calculate VA/Q distributions. In conclusion, MMIMS enables faster measurement of VA/Q, is less demanding than GC, and produces comparable results.
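
At the heart of MIGET is the steady-state relation between a gas's blood:gas partition coefficient λ and a compartment's VA/Q ratio: retention R = λ/(λ + VA/Q). A minimal sketch of this relation (not the study's analysis code; the partition coefficients are rough illustrative values):

```python
# Steady-state inert gas retention for a single lung compartment:
# R = lambda / (lambda + VA/Q). Partition coefficients below are
# illustrative placeholders, not measured values from the study.
import numpy as np

def retention(lam: float, va_q: np.ndarray) -> np.ndarray:
    """Fraction of infused gas retained in arterial blood."""
    return lam / (lam + va_q)

va_q = np.logspace(-2, 2, 50)  # VA/Q ratios from 0.01 to 100
gases = {"SF6": 0.005, "ethane": 0.1, "cyclopropane": 0.5,
         "diethyl ether": 12.0, "acetone": 300.0}

for name, lam in gases.items():
    print(f"{name:14s} retention at VA/Q = 1: "
          f"{retention(lam, np.array([1.0]))[0]:.3f}")
```

Low-solubility gases such as SF6 probe shunt and low-VA/Q regions, while highly soluble gases such as acetone probe high-VA/Q regions, which is why gases spanning several orders of magnitude in λ are infused.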

Relevance: 30.00%

Abstract:

The stable oxygen isotope composition of atmospheric precipitation (δ18Op) was examined at 39 stations distributed over Switzerland and its border zone. Monthly amount-weighted δ18Op values averaged over the 1995–2000 period showed the expected strong linear altitude dependence (−0.15 to −0.22‰ per 100 m) only during the summer season (May–September). Steeper gradients (~ −0.56 to −0.60‰ per 100 m) were observed in winter months over a low-elevation belt, while hardly any altitudinal difference was seen for high-elevation stations. This dichotomous pattern can be explained by the characteristically shallower vertical atmospheric mixing height in the winter season, and it provides empirical evidence for recently simulated effects of stratified atmospheric flow on orographic precipitation isotopic ratios. It also helps explain the "anomalous", deflected altitudinal water isotope profiles reported from many other high-relief regions. Grids and isotope distribution maps of monthly δ18Op were calculated over the study region for 1995–1996. The adopted interpolation method took into account both the variable mixing heights and the seasonal difference in the isotopic lapse rate, and combined them with residual kriging. The presented data set allows point estimation of δ18Op at monthly resolution. According to test calculations executed on subsets, this biannual data set can be extended back to 1992 with maintained fidelity and, with a reduced station subset, even back to 1983, at the expense of reduced reliability of the derived δ18Op estimates, mainly in the eastern part of Switzerland. Before 1983, reliable results can only be expected for the Swiss Plateau, since important stations representing eastern and south-western Switzerland were not yet in operation.
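
The two-step estimation described above (an elevation trend with a season-dependent lapse rate, plus spatial interpolation of the residuals) can be sketched as follows; the numbers are hypothetical, and simple inverse-distance weighting stands in here for the residual kriging actually used:

```python
# Sketch of lapse-rate detrending plus residual interpolation
# (IDW as a stand-in for kriging; station values are invented).
import numpy as np

def predict_d18o(stations, target_xy, target_z, lapse_per_100m):
    """stations: list of (x_km, y_km, z_m, d18o). Returns d18o estimate."""
    xyz = np.array([(s[0], s[1], s[2]) for s in stations])
    obs = np.array([s[3] for s in stations])
    # 1) remove the altitude trend from the observations
    residuals = obs - lapse_per_100m * xyz[:, 2] / 100.0
    # 2) interpolate the residuals horizontally (inverse-distance weights)
    d = np.hypot(xyz[:, 0] - target_xy[0], xyz[:, 1] - target_xy[1])
    w = 1.0 / np.maximum(d, 1e-6) ** 2
    resid_t = np.sum(w * residuals) / np.sum(w)
    # 3) add the trend back at the target elevation
    return resid_t + lapse_per_100m * target_z / 100.0

stations = [(0, 0, 400, -9.5), (30, 10, 1600, -12.1), (10, 40, 900, -10.4)]
print(predict_d18o(stations, (15, 20), 1200, lapse_per_100m=-0.18))  # summer
```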

Relevance: 30.00%

Abstract:

BACKGROUND Quantitative light intensity analysis of the strut core by optical coherence tomography (OCT) may enable assessment of changes in the light reflectivity of a bioresorbable polymeric scaffold as the polymer is replaced by provisional matrix and connective tissue, through full disappearance and integration of the scaffold into the vessel wall. The aim of this report was to describe the methodology and to apply it to serial human OCT images post procedure and at 6, 12, 24 and 36 months in the ABSORB cohort B trial. METHODS AND RESULTS In serial frequency-domain OCT pullbacks, corresponding struts at different time points were identified in a 3-dimensional foldout view. The peak and median values of light intensity were measured in the strut core with dedicated software. A total of 303 corresponding struts were analyzed serially at 3 time points. In the sequential analysis, peak light intensity increased gradually over the first 24 months after implantation and then reached a plateau (relative difference with respect to baseline [%Dif]: 61.4% at 12 months, 115.0% at 24 months, 110.7% at 36 months), while the median intensity kept increasing at 36 months (%Dif: 14.3% at 12 months, 75.0% at 24 months, 93.1% at 36 months). CONCLUSIONS Quantitative light intensity analysis by OCT was capable of detecting subtle changes in the appearance of bioresorbable struts over time, and could be used to monitor the bioresorption and integration process of polylactide struts.
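
The %Dif metric reported above is a plain relative difference with respect to the post-procedure baseline. A minimal sketch of the measurement step (synthetic data; not the dedicated software used in the study):

```python
# Peak and median light intensity in a strut-core ROI, and the relative
# difference with respect to baseline: %Dif = 100 * (v_t - v_0) / v_0.
import numpy as np

def strut_metrics(roi: np.ndarray) -> tuple[float, float]:
    """Peak and median light intensity of a strut-core region of interest."""
    return float(roi.max()), float(np.median(roi))

def pct_dif(value: float, baseline: float) -> float:
    return 100.0 * (value - baseline) / baseline

rng = np.random.default_rng(0)
roi_base = rng.normal(50, 5, (8, 8))  # synthetic post-procedure ROI
roi_24m = rng.normal(90, 5, (8, 8))   # synthetic 24-month ROI

p0, m0 = strut_metrics(roi_base)
p1, m1 = strut_metrics(roi_24m)
print(f"peak %Dif: {pct_dif(p1, p0):.1f}%, median %Dif: {pct_dif(m1, m0):.1f}%")
```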

Relevance: 30.00%

Abstract:

Porcine reproductive and respiratory syndrome virus (PRRSV) is widespread in pig populations globally. In many regions of Europe with intensive pig production and high herd densities, the virus is endemic and can cause disease and production losses. This fuels discussion about the feasibility and sustainability of eliminating the virus from larger geographic regions. The implementation of a program aiming at virus elimination in areas with high pig density is unprecedented, and its potential success is unknown. The objective of this work was to apply a simple method to pig population data that could support assessing the feasibility of sustainable regional PRRSV elimination. Based on known risk factors such as pig herd structure and neighborhood conditions, an index characterizing an individual herd's potential for endemic virus circulation and reinfection was designed. This index was then used to compare data on all pig herds in two regions of Lower Saxony (north-west Germany) with different pig and herd densities, where PRRSV is endemic. The distribution of the indexed herds was displayed using GIS. Clusters of high index density, forming potential risk hot spots, were identified; these could represent key target areas for surveillance and biosecurity measures under a control program aimed at virus elimination. In an additional step, for the study region with the higher pig density (2463 pigs/km² of farmland), the potential distribution of PRRSV-free and non-free herds during the implementation of a national control program aiming at nationwide virus elimination was modeled. Complex herd and trade network structures suggest that PRRSV elimination in regions with intensive pig farming, such as those of central Europe, would have to involve legal regulation and be accompanied by substantial trade and animal movement restrictions. The proposed risk index mapping methodology could be adapted to areas varying in size, herd structure and density. Interpreted in the regional context, it could help classify the density of risk and target resources and measures for elimination accordingly.
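
As a purely hypothetical illustration of such a herd index (the study's actual risk factors and weighting are not reproduced here), a score might combine herd structure with neighborhood density:

```python
# Hypothetical herd-level risk index; factor names, weights and cutoffs are
# invented for illustration and are NOT those of the study.
from dataclasses import dataclass

@dataclass
class Herd:
    sows: int                  # size of the breeding herd, if any
    fattening_places: int
    neighbors_within_1km: int  # neighboring pig herds

def risk_index(h: Herd) -> float:
    structure = 0.5 * (h.sows > 0) + 0.2 * min(h.fattening_places / 1000, 1.0)
    neighborhood = 0.3 * min(h.neighbors_within_1km / 10, 1.0)
    return structure + neighborhood  # scaled 0 (low) to 1 (high)

for h in [Herd(200, 800, 12), Herd(0, 1500, 2)]:
    print(f"{h} -> index {risk_index(h):.2f}")
```

Mapping such indices over a region and smoothing them into a density surface is what reveals the hot-spot clusters mentioned above.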

Relevance: 30.00%

Abstract:

We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography B-scans. The proposed algorithm models the observed noise probabilistically and allows for dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to estimate systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and non-biological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral-domain optical coherence tomography (OCT) measurement systems, including a swept-source OCT. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used, and in all cases estimates noise parameters within the confidence interval of those found by the observers.
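
A minimal sketch of the general idea, assuming a Gaussian background model and a fixed-multiple cutoff (the paper's probabilistic model is more refined):

```python
# Estimate noise statistics from a region assumed to contain no sample
# signal, then derive a rendering threshold from the fitted model.
import numpy as np

def noise_threshold(bscan: np.ndarray, bg_rows: int = 20, k: float = 3.0) -> float:
    """Fit a Gaussian to the top rows (assumed sample-free) and return
    the threshold mu + k * sigma separating noise from signal."""
    background = bscan[:bg_rows, :]
    mu, sigma = background.mean(), background.std()
    return float(mu + k * sigma)

rng = np.random.default_rng(1)
bscan = rng.normal(10, 2, (512, 1024))  # synthetic noise floor
bscan[200:260, 300:700] += 40           # synthetic sample signal

t = noise_threshold(bscan)
print(f"threshold = {t:.1f}, signal pixels = {(bscan > t).sum()}")
```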

Relevance: 30.00%

Abstract:

The implementation of new surgical techniques offers opportunities but carries risks. Usually, several years pass before a critical appraisal and a balanced opinion of a new treatment method become available, based on evidence from the literature and expert opinion. The frozen elephant trunk (FET) technique has been increasingly used to treat complex pathologies of the aortic arch and the descending aorta, but there is still an ongoing discussion within the surgical community about the optimal indications. This paper represents a joint effort of the Vascular Domain of EACTS together with several surgeons with particular expertise in aortic surgery, and summarizes the current knowledge and state of the art of the FET technique. The majority of the information about the FET technique has been extracted from 97 focused publications available in the PubMed database (cohort studies, case reports, reviews, small series, meta-analyses and best-evidence topics) published in English.

Relevance: 30.00%

Abstract:

Despite having been identified over thirty years ago and definitively established as having a critical role in driving tumor growth and predicting resistance to therapy, the KRAS oncogene remains a target in cancer for which there is no effective treatment. KRas is activated by mutations at a few sites, primarily amino acid substitutions at codon 12, which promote a constitutively active state. I have found that different amino acid substitutions at codon 12 can activate different KRas downstream signaling pathways, determine clonogenic growth potential, and determine patient response to molecularly targeted therapies. Computer modeling of the KRas structure shows that the particular amino acid substituted at codon 12 influences how KRas interacts with its effectors. In the absence of a direct inhibitor of mutant KRas, several agents directly targeting two of the common downstream effector pathways of KRas, namely the Mapk pathway and the Akt pathway, have recently entered clinical trials alone and in combination. These inhibitors were evaluated for efficacy against different KRAS-activating mutations. An isogenic panel of colorectal cells, in which wild-type KRas was replaced with KRas G12C, G12D, or G12V at the endogenous locus, differed in sensitivity to Mek and Akt inhibition. In contrast, in a screen of a broad panel of lung cell lines, no correlation was seen with the type of activating KRAS mutation, likely due to concurrent oncogenic lesions. To find a new way to inhibit KRAS-driven tumors, siRNA screens were performed in isogenic lines with and without active KRas. Knockdown of CNKSR1 (CNK1) showed selective growth inhibition in cells with oncogenic KRAS. Depletion of CNK1 reduces the expression of mitotic cell-cycle proteins and arrests cells with active KRas in the G1 phase of the cell cycle, similar to the deletion of activated KRas, regardless of the activating substitution. CNK1 has a PH domain responsible for localizing it to membrane lipids, making CNK1 potentially amenable to inhibition with small molecules. This work has identified a series of small molecules capable of binding to this PH domain and inhibiting CNK1-facilitated KRas signaling.

Relevance: 30.00%

Abstract:

Background: The failure rate of health information systems is high, partially due to fragmented, incomplete, or incorrect identification and description of specific and critical domain requirements. In order to systematically transform work requirements into a real information system, an explicit conceptual framework is essential to summarize the work requirements and guide system design. Recently, Butler, Zhang, and colleagues proposed a conceptual framework called Work Domain Ontology (WDO) to formally represent users' work. This WDO approach has been successfully demonstrated in a real-world design project on aircraft scheduling. However, as a top-level conceptual framework, WDO has not defined an explicit and well-specified schema (WDOS), and it lacks a generalizable, operationalized procedure that can be easily applied to develop a WDO. Moreover, a WDO has not been developed for any concrete healthcare domain. These limitations hinder the utility of WDO in real-world information systems in general and in health information systems in particular. Objective: The objective of this research is to formalize the WDOS, operationalize a procedure to develop a WDO, and evaluate the WDO approach using the Self-Nutrition Management (SNM) work domain. Method: Concept analysis was used to formalize the WDOS. Focus group interviews were conducted to capture concepts in the SNM work domain. Ontology engineering methods were adopted to model the SNM WDO. Some of the concepts under the primary goal "staying healthy" for SNM were selected and transformed into a semi-structured survey to evaluate the acceptance, explicitness, completeness, consistency, and experience dependency of the SNM WDO. Result: Four concepts, "goal, operation, object and constraint", were identified and formally modeled in the WDOS with definitions and attributes. 72 SNM WDO concepts under the primary goal were selected and transformed into semi-structured survey questions. The evaluation indicated that the major concepts of the SNM WDO were accepted by 41 overweight subjects. The SNM WDO is generally independent of users' domain experience but partially dependent on SNM application experience. 23 of 41 paired concepts had significant correlations. Two concepts were identified as ambiguous, and eight additional concepts were recommended to improve the completeness of the SNM WDO. Conclusion: The preliminary WDOS is ready, with an operationalized procedure. The SNM WDO has been developed to guide future SNM application design. This research is an essential step towards Work-Centered Design (WCD).
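
A minimal sketch of the four WDOS concepts named above as plain data structures (attribute names are illustrative assumptions, not the published schema):

```python
# Goal, operation, object and constraint as simple linked records, with an
# SNM example under the primary goal "staying healthy". Illustrative only.
from dataclasses import dataclass, field

@dataclass
class WorkObject:
    name: str

@dataclass
class Constraint:
    description: str

@dataclass
class Operation:
    name: str
    acts_on: list[WorkObject] = field(default_factory=list)
    constraints: list[Constraint] = field(default_factory=list)

@dataclass
class Goal:
    name: str
    operations: list[Operation] = field(default_factory=list)

snm = Goal("staying healthy", [
    Operation("record a meal",
              acts_on=[WorkObject("meal entry")],
              constraints=[Constraint("daily calorie limit")]),
])
print(snm)
```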

Relevance: 30.00%

Abstract:

Zircons from the oldest magmatic and metasedimentary rocks in the Podolia domain of the Ukrainian shield were studied and dated by the U-Pb method on a NORDSIM secondary-ion mass spectrometer. The age of zircon cores in enderbite gneisses sampled in the Kazachii Yar and Odessa quarries, on opposite banks of the Yuzhnyi Bug River, reaches 3790 Ma. Cores of terrigenous zircons in quartzites from the Odessa quarry, as well as in garnet gneisses from the Zaval'e graphite quarry, have ages within 3650–3750 Ma. Zircon rims record two metamorphic events at around 2750–2850 Ma and 1900–2000 Ma. The extremely low U content in zircons of the second age group indicates granulite-facies metamorphic conditions within the Podolia domain in the Paleoproterozoic. The data measured on orthorocks (enderbite gneiss) and metasedimentary rocks unambiguously indicate the existence of ancient Paleoarchean crust in the Podolia (Dniester-Bug) domain of the Ukrainian shield, and contribute to our knowledge of the scale of formation and the geochemical features of the primordial crust.

Relevance: 30.00%

Abstract:

By means of a GTAP-based CGE model, we investigate the impact of eliminating import tariffs and non-tariff policy barriers (NTPBs) on agricultural trade under East Asian FTAs. To do so, we first measure the NTPBs using a widely used method derived from the literature on border effects. Next, after adding our NTPB estimates to the GTAP database, which by its nature does not incorporate them, we compute the impact on GDP of the complete elimination of policy barriers (the full removal of import tariffs and NTPBs).
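
One common way the border-effects literature converts an estimated home-bias coefficient into an ad valorem tariff equivalent is shown below; this is a generic formulation given for orientation, not necessarily the exact specification used in the paper:

```latex
% Gravity equation with a border (home-bias) dummy B_{ij}:
%   \ln x_{ij} = \alpha_i + \alpha_j + \delta \ln d_{ij}
%                + \beta B_{ij} + \varepsilon_{ij}
% With elasticity of substitution \sigma, the estimated home bias
% \hat{\beta} maps to an ad valorem tariff equivalent of the NTPB:
\[
  \widehat{t}_{ij} = \exp\!\left(\frac{\hat{\beta}}{\sigma - 1}\right) - 1
\]
```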

Relevance: 30.00%

Abstract:

Access to medical literature collections such as PubMed, MedScape or Cochrane has increased notably in recent years thanks to web-based tools that provide instant access to the information. However, more sophisticated methodologies are needed to exploit all that information efficiently. The lack of advanced search methods in the clinical domain means that, even using well-defined questions for a particular disease, clinicians receive too many results. Since no information analysis is applied afterwards, relevant results that do not appear at the top of the result collection may be ignored by the expert, causing a significant loss of information. In this work we present a new method to improve scientific article search using patient information for query generation. Using a federated search strategy, it is able to search several resources simultaneously and present a single relevant literature collection. And by applying NLP techniques, it presents semantically similar publications together, facilitating the identification of relevant information by clinicians. This method aims to be the foundation of a collaborative environment for sharing clinical knowledge related to patients and scientific publications.
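
A minimal sketch of the "present semantically similar publications together" step, using TF-IDF cosine similarity as one plausible NLP technique (not necessarily the one used by the authors):

```python
# Group retrieved publications by pairwise textual similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "beta blockers reduce mortality in chronic heart failure",
    "heart failure outcomes improve with beta blockade",
    "insulin therapy regimens in type 2 diabetes",
]
tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
sim = cosine_similarity(tfidf)

# Present together articles whose similarity exceeds a chosen cutoff.
for i in range(len(abstracts)):
    near = [j for j in range(len(abstracts)) if j != i and sim[i, j] > 0.2]
    print(f"article {i} grouped with {near}")
```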

Relevance: 30.00%

Abstract:

Web 2.0 applications enabled users to classify information resources using their own vocabularies. The bottom-up nature of these user-generated classification systems has turned them into interesting knowledge sources, since they provide a rich terminology generated by potentially large user communities. Previous research has shown that it is possible to elicit some emergent semantics from the aggregation of individual classifications in these systems. However, the generation of ontologies from them is still an open research problem. In this thesis we address the problem of how to tap into user-generated classification systems for building domain ontologies. Our objective is to design a method to develop domain ontologies from user-generated classification systems. To do so, we rely on ontologies in the Web of Data to formalize the semantics of the knowledge collected from the classification system. Current ontology development methodologies have recognized the importance of reusing knowledge from existing resources; our work is therefore framed within the NeOn methodology scenario for building ontologies by reusing and reengineering non-ontological resources. The main contributions of this work are: an integrated method to develop ontologies from user-generated classification systems, with which we extract a domain terminology from the classification system and then formalize the semantics of this terminology by reusing ontologies in the Web of Data; the identification and adaptation of existing techniques to implement the activities in the method so that they fulfill the requirements of each activity; and a novel study of emergent semantics in user-generated lists.
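
A minimal sketch of one step of such a method: anchoring a tag term extracted from the classification system to the Web of Data via a SPARQL label lookup (DBpedia is used here as an illustrative endpoint; this is not the thesis implementation):

```python
# Look up Web of Data resources whose label matches an extracted tag term.
from SPARQLWrapper import SPARQLWrapper, JSON

def lookup_term(term: str, limit: int = 5) -> list[str]:
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery(f"""
        SELECT DISTINCT ?s WHERE {{
            ?s rdfs:label "{term}"@en .
        }} LIMIT {limit}
    """)
    sparql.setReturnFormat(JSON)
    rows = sparql.query().convert()["results"]["bindings"]
    return [r["s"]["value"] for r in rows]

print(lookup_term("Photography"))
```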

Relevance: 30.00%

Abstract:

In recent decades, meshless methods (MMs), like the element-free Galerkin method (EFGM), have been widely studied, and interesting results have been reached when solving partial differential equations. However, such solutions show a problem around boundary conditions, where adequate accuracy is not achieved. This is caused by the use of moving least squares or reproducing kernel particle methods to obtain the shape functions needed in MMs, since such methods are accurate enough in the interior of the integration domain but less so at its boundaries. Bernstein curves, which themselves form a partition of unity, can solve this problem with the same accuracy in the interior of the domain and at its boundaries.
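
The key property invoked above is easy to verify: the Bernstein basis polynomials B_{i,n}(x) = C(n,i) x^i (1-x)^(n-i) sum to one everywhere on [0, 1], including at the boundary points. A short sketch:

```python
# Bernstein basis polynomials form a partition of unity on [0, 1].
import numpy as np
from math import comb

def bernstein_basis(n: int, x: np.ndarray) -> np.ndarray:
    """Rows: basis index i = 0..n; columns: evaluation points."""
    return np.array([comb(n, i) * x**i * (1 - x)**(n - i)
                     for i in range(n + 1)])

x = np.linspace(0.0, 1.0, 11)
B = bernstein_basis(5, x)
print(np.allclose(B.sum(axis=0), 1.0))  # True, also at x = 0 and x = 1
```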

Relevance: 30.00%

Abstract:

The boundary element method (BEM) has been applied successfully to many engineering problems during the last decades. Compared with domain-type methods like the finite element method (FEM) or the finite difference method (FDM), the BEM can handle problems where the medium extends to infinity much more easily, since there is no need to develop special boundary conditions (quiet or absorbing boundaries) or infinite elements to limit the studied domain. The determination of the dynamic stiffness of arbitrarily shaped footings is just one of the fields where the BEM has been the method of choice, especially in the 1980s. With the continuous development of computer technology and the available hardware, the size of the problems under study grew and, as the flop count for solving the resulting linear system of equations grows with the third power of the number of equations, there was a need for iterative methods with better performance. In [1] the GMRES algorithm was presented, which is now widely used in implementations of the collocation BEM. While the FEM results in sparsely populated coefficient matrices, the BEM leads, in general, to fully or densely populated ones, depending on the number of subregions, posing a serious memory problem even for today's computers. If the geometry of the problem permits the surface of the domain to be meshed with equally shaped elements, many of the resulting coefficients will be calculated and stored repeatedly. The present paper shows how these unnecessary operations can be avoided, reducing both the calculation time and the storage requirement. To this end, a similar coefficient identification algorithm (SCIA) has been developed and implemented in a program written in Fortran 90. The vertical dynamic stiffness of a single pile in layered soil was chosen to test the performance of the implementation. The results obtained with the 3D model may be compared with those obtained with an axisymmetric formulation, which are considered the reference values since the mesh quality is much better. The entire 3D model comprises more than 35,000 DOFs, the largest single region being a soil region with 21,168 DOFs. Note that the memory necessary to store all coefficients of this single region is about 6.8 GB, an amount usually not available on personal computers. In the problem under study, the interface zone between the two adjacent soil regions, as well as the surface of the top layer, may be meshed with equally sized elements. In this case, the application of the SCIA leads to an important reduction in memory requirements: the maximum memory used during the calculation was reduced to 1.2 GB. The application of the SCIA thus permits problems to be solved on personal computers which would otherwise require much more powerful hardware.
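
A hypothetical sketch of the reuse idea behind the SCIA: when two source-element/field-point pairs share the same relative geometry, their influence coefficients coincide, so a cache keyed by a canonical geometric signature avoids recomputing and re-storing them (the actual Fortran 90 implementation and kernel are not reproduced here):

```python
# Cache boundary-element influence coefficients by relative geometry.
import numpy as np

cache: dict[tuple, float] = {}

def signature(src: np.ndarray, fld: np.ndarray, tol: float = 1e-9) -> tuple:
    """Canonical key: the source-to-field offset, quantized to a tolerance."""
    return tuple(np.round((fld - src) / tol).astype(np.int64))

def influence(src: np.ndarray, fld: np.ndarray) -> float:
    key = signature(src, fld)
    if key not in cache:
        # stand-in for the expensive kernel integration over the element
        cache[key] = 1.0 / max(np.linalg.norm(fld - src), 1e-12)
    return cache[key]

# Equally shaped, equally spaced elements hit the cache on the second call:
influence(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
influence(np.array([2.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0]))
print(len(cache))  # 1 -> the coefficient was computed only once
```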

Relevance: 30.00%

Abstract:

This paper proposes a methodology for developing a speech into sign language translation system following a user-centered strategy. This methodology consists of four main steps: analysis of technical and user requirements, data collection, technology adaptation to the new domain, and finally, evaluation of the system. The two most demanding tasks are sign generation and translation rule generation. Many other aspects can be updated automatically from a parallel corpus that includes sentences (in Spanish and LSE: Lengua de Signos Española) related to the application domain. In this paper, we explain how to apply this methodology to develop two translation systems in two specific domains: bus transport information and hotel reception.
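
A minimal sketch of the kind of rule/lexicon lookup that such a parallel corpus can feed (phrases and glosses are invented examples, not entries from the actual Spanish–LSE corpus):

```python
# Map a recognized Spanish utterance to a sequence of LSE glosses, with a
# naive word-by-word fallback when no rule matches. Entries are invented.
RULES = {
    "a que hora sale el autobus": ["AUTOBUS", "SALIR", "HORA", "CUAL"],
    "quiero una habitacion doble": ["YO", "QUERER", "HABITACION", "DOBLE"],
}

def translate(utterance: str) -> list[str]:
    key = utterance.lower().strip()
    if key in RULES:
        return RULES[key]
    return [w.upper() for w in key.split()]  # naive fallback

print(translate("Quiero una habitacion doble"))
```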