27 results for Quantitative information

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

70.00%

Abstract:

A confocal imaging and image processing scheme is introduced to visualize and evaluate the spatial distribution of spectral information in tissue. The image data are recorded using a confocal laser-scanning microscope equipped with a detection unit that provides high spectral resolution. The processing scheme is based on spectral data, is less error-prone than intensity-based visualization and evaluation methods, and provides quantitative information on the composition of the sample. The method was tested and validated in the context of the development of dermal drug delivery systems, and a quantitative uptake indicator is introduced to compare the performance of different delivery systems. A drug penetration study was performed in vitro. The results show that the method is able to detect, visualize and measure spectral information in tissue. In the penetration study, uptake efficiencies of different experimental setups could be discriminated and quantitatively described. The developed uptake indicator is a step towards a quantitative assessment and, in a more general view beyond pharmaceutical research, provides valuable information on tissue composition. It can potentially be used for clinical in vitro and in vivo applications.

Relevance:

60.00%

Abstract:

Gene therapy of the heart has been attempted in a number of clinical trials with the injection of naked DNA, although quantitative information on myocellular transfection rates is not available. The present study aimed to quantify the efficacy of electropulsing protocols that differ in pulse duration and number to stimulate transfection of cardiomyocytes and to determine the impact on myocardial integrity.

Relevance:

60.00%

Abstract:

PURPOSE: There is a need for valid and reliable short scales that can be used to assess social networks and social supports and to screen for social isolation in older persons. DESIGN AND METHODS: The present study is a cross-national and cross-cultural evaluation of the performance of an abbreviated version of the Lubben Social Network Scale (LSNS-6), which was used to screen for social isolation among community-dwelling older adult populations in three European countries. Based on the concept of lack of redundancy of social ties, we defined clinical cut-points of the LSNS-6 for identifying persons deemed at risk for social isolation. RESULTS: In all three samples, the LSNS-6 and its two subscales (Family and Friends) demonstrated high levels of internal consistency, stable factor structures, and high correlations with criterion variables. The proposed clinical cut-points showed good convergent validity, and classified 20% of the respondents in Hamburg, 11% of those in Solothurn (Switzerland), and 15% of those in London as at risk for social isolation. IMPLICATIONS: We conclude that abbreviated scales such as the LSNS-6 should be considered for inclusion in practice protocols of gerontological practitioners. Screening older persons with the LSNS-6 provides quantitative information on their family and friendship ties, and identifies persons at increased risk for social isolation who might benefit from in-depth assessment and targeted interventions.
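A minimal scoring sketch for a screen like the one described: the abstract does not give the item scoring or the numeric cut-point, so the six items scored 0-5 (total 0-30) and the cut-point of 12 used below are assumptions based on the published LSNS-6 convention, not values taken from this study.

```python
def lsns6_score(items):
    """Total LSNS-6 score: six items (3 family, 3 friends), each scored 0-5."""
    if len(items) != 6 or any(not 0 <= x <= 5 for x in items):
        raise ValueError("LSNS-6 expects six item scores in 0..5")
    return sum(items)

def at_risk(items, cut_point=12):
    """Flag risk of social isolation when the total falls below the cut-point
    (cut_point=12 is an assumption; the study defines its own cut-points)."""
    return lsns6_score(items) < cut_point

# Hypothetical respondent: sparse family ties, moderate friendship ties
print(lsns6_score([1, 0, 2, 3, 2, 1]))  # 9
print(at_risk([1, 0, 2, 3, 2, 1]))      # True
```

In practice such a screen would be one step in a protocol, with flagged respondents referred for the in-depth assessment the abstract mentions.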

Relevance:

60.00%

Abstract:

Ligament balancing in total knee arthroplasty may have an important influence on joint stability and prosthesis lifetime. In order to provide quantitative information and assistance during ligament balancing, a device that intraoperatively measures knee joint forces and moments was developed. Its performance and surgical advantages were evaluated on six cadaver specimens mounted on a knee joint loading apparatus allowing unconstrained knee motion as well as compression and varus-valgus loading. Four different experiments were performed on each specimen. (1) Knee joints were axially loaded. Comparison between applied and measured compressive forces demonstrated the accuracy and reliability of in situ measurements (1.8 N). (2) Assessment of knee stability based on condyle contact forces or varus-valgus moments was compared to the current surgical method (difference of varus-valgus loads causing condyle lift-off). The force-based approach was equivalent to the surgical method, while the moment-based approach, which is considered optimal, showed a tendency towards lateral imbalance. (3) To estimate the importance of keeping the patella in its anatomical position during imbalance assessment, the effect of patellar eversion on the mediolateral distribution of tibiofemoral contact forces was measured. One fourth of the contact force induced by the patellar load was shifted to the lateral compartment. (4) The effect of minor and major medial collateral ligament releases was biomechanically quantified. On average, the medial contact force was reduced by 20% and 46%, respectively. Large variation among specimens reflected the difficulty of ligament release and the need for intraoperative force monitoring. This series of experiments thus demonstrated the device's potential to improve ligament balancing and the survivorship of total knee arthroplasty.

Relevance:

60.00%

Abstract:

Despite the facts that magnetic resonance spectroscopy (MRS) is applied as a clinical tool in non-specialized institutions and that semi-automatic acquisition and processing tools can be used to produce quantitative information from MRS exams without expert input, issues of spectral quality and quality assessment are neglected in the MR spectroscopy literature. Even worse, there is no consensus among experts on concepts or detailed criteria for the quality assessment of MR spectra. Furthermore, artifacts are not at all conspicuous in MRS and can easily be mistaken for true, interpretable features. This article aims to increase interest in issues of spectral quality and quality assessment, to start a broader debate on generally accepted criteria that spectra must fulfil to be clinically and scientifically acceptable, and to provide a sample gallery of artifacts that can be used to raise awareness of potential pitfalls in MRS.

Relevance:

60.00%

Abstract:

A regional hydrogeochemical model was developed to evaluate the geochemical evolution of different groundwaters in an alluvial aquifer system in the Interior of Oman. In combination with environmental isotopes, the model is able to extract qualitative and quantitative information about recharge, groundwater flow paths and hydraulic connections between different aquifers. The main source of water to the alluvial aquifer along the flow paths of Wadi Abyadh and Wadi M'uaydin in the piedmont is groundwater from the high-altitude areas of the Jabal Akhdar and local infiltration along the wadi channels. In contrast, the piedmont alluvial aquifer along Wadi Halfayn is primarily replenished by lateral recharge from the ophiolite foothills to the east, besides smaller contributions from the Jabal Akhdar and local infiltration. Further down gradient, in the Southern Alluvial Plain aquifer, a significant source of recharge is direct infiltration of rain and surface runoff, originating from a moisture source that approaches Oman from the south. The model shows that the main geochemical evolution of the alluvial groundwaters occurs along the flow path from the piedmont to the Southern Alluvial Plain, where dedolomitization is responsible for the observed changes in the chemical and carbon isotope composition in these waters.

Relevance:

60.00%

Abstract:

The isotope composition of selenium (Se) can provide important constraints on biological, geochemical, and cosmochemical processes taking place in different reservoirs on Earth and during planet formation. To provide precise qualitative and quantitative information on these processes, accurate and highly precise isotope data need to be obtained. The currently applied ICP-MS methods for Se isotope measurements are compromised by the necessity to perform a large number of interference corrections. Differences in these correction methods can lead to discrepancies in published Se isotope values of rock standards which are significantly higher than the acclaimed precision. An independent analytical approach applying a double spike (DS) and state-of-the-art TIMS may yield better precision due to its smaller number of interferences and could test the accuracy of data obtained by ICP-MS approaches. This study shows that the precision of Se isotope measurements performed with two different Thermo Scientific™ Triton™ Plus TIMS is distinctly deteriorated, by about ±1‰ (2 s.d.) for δ80/78Se, by a memory Se signal of up to several millivolts and by additional minor residual mass bias which could not be corrected for with the common isotope fractionation laws. This memory Se has a variable isotope composition with a DS fraction of up to 20% and accumulates with an increasing number of measurements. Thus, it represents an accumulation of Se from previous Se measurements with a potential addition from a sample or machine blank. Several techniques for cleaning the MS parts were tried to decrease the memory signal, but they were not sufficient to allow precise Se isotope analysis. If these serious memory problems can be overcome in the future, the precision and accuracy of Se isotope analysis with TIMS should be significantly better than those of the current ICP-MS approaches.

Relevance:

40.00%

Abstract:

We present the results of an investigation into the nature of the information needs of software developers who work in projects that are part of larger ecosystems. This work is based on a quantitative survey of 75 professional software developers. We corroborate the results identified in the survey with needs and motivations proposed in a previous survey and discover that tool support for developers working in an ecosystem context is even more meager than we thought: mailing lists and internet search are the most popular tools developers use to satisfy their ecosystem-related information needs.

Relevance:

30.00%

Abstract:

Deep tissue imaging has become state of the art in biology, but now the problem is to quantify spatial information in a global, organ-wide context. Although access to the raw data is no longer a limitation, the computational tools to extract biologically useful information out of these large data sets are still catching up. In many cases, to understand the mechanism behind a biological process, where molecules or cells interact with each other, it is mandatory to know their mutual positions. We illustrate this principle here with the immune system. Although the general functions of lymph nodes as immune sentinels are well described, many cellular and molecular details governing the interactions of lymphocytes and dendritic cells remain unclear to date and prevent an in-depth mechanistic understanding of the immune system. We imaged ex vivo lymph nodes isolated from both wild-type and transgenic mice lacking key factors for dendritic cell positioning and used software written in MATLAB to determine the spatial distances between the dendritic cells and the internal high endothelial vascular network. This allowed us to quantify the spatial localization of the dendritic cells in the lymph node, which is a critical parameter determining the effectiveness of an adaptive immune response.
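The study's analysis was done in MATLAB; as a language-neutral sketch of the core step (nearest distance from each cell centroid to the vessel network), here is a brute-force Python version with made-up coordinates. A KD-tree would scale better to real image volumes.

```python
import math

def min_distances(cells, vessel_points):
    """For each cell centroid, the Euclidean distance to the nearest
    sampled point of the vessel network (brute force)."""
    return [min(math.dist(c, v) for v in vessel_points) for c in cells]

# Hypothetical 3D positions (e.g. in micrometres)
cells = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
vessels = [(3.0, 4.0, 0.0), (9.0, 0.0, 0.0)]
print(min_distances(cells, vessels))  # [5.0, 1.0]
```

A histogram of these distances is one way to summarize how closely a cell population hugs the vascular network.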

Relevance:

30.00%

Abstract:

We report the identification of quantitative trait loci (QTL) affecting carcass composition, carcass length, fat deposition and lean meat content using a genome scan across 462 animals from a combined intercross and backcross between Hampshire and Landrace pigs. Data were analysed using multiple linear regression fitting additive and dominance effects. This model was compared with a model including a parent-of-origin effect to detect evidence of imprinting. Several precisely defined muscle phenotypes were measured in order to dissect body composition in more detail. Three significant QTL were detected in the study at the 1% genome-wide level, and twelve significant QTL were detected at the 5% genome-wide level. These QTL comprise loci affecting fat deposition and lean meat content on SSC1, 4, 9, 10, 13 and 16, a locus on SSC2 affecting the ratio between the weight of meat and bone in the back and in the ham, and two loci affecting carcass length on SSC12 and 17. The well-defined phenotypes in this study enabled us to detect QTL for the sizes of individual muscles and to obtain information relevant for describing the complexity underlying other carcass traits.

Relevance:

30.00%

Abstract:

The aim of this study was to identify quantitative trait loci (QTL) for osteochondrosis (OC) and palmar/plantar osseous fragments (POF) in fetlock joints in a whole-genome scan of 219 South German Coldblood horses. Symptoms of OC and POF were checked by radiography in 117 South German Coldblood horses at a mean age of 17 months. The radiographic examination comprised the fetlock and hock joints of all limbs. The genome scan included 157 polymorphic microsatellite markers. All microsatellite markers were equally spaced over the 31 autosomes and the X chromosome, with an average distance of 17.7 cM and a mean polymorphism information content (PIC) of 63%. Sixteen chromosomes harbouring putative QTL regions were further investigated by genotyping the animals with 93 additional markers. QTL that had chromosome-wide significance by non-parametric Z-means and LOD scores were found on 10 chromosomes. This included seven QTL for fetlock OC and one QTL on ECA18 associated with hock OC and fetlock OC. Significant QTL for POF in fetlock joints were located on equine chromosomes 1, 4, 8, 12 and 18. This genome scan is an important step towards the identification of genes responsible for OC in horses.
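The abstract reports a mean polymorphism information content (PIC) of 63% for its microsatellite markers. As context, the standard PIC formula (Botstein et al.) for a marker with allele frequencies p_i can be sketched as follows; this is the conventional definition, not necessarily the exact computation used in this study, and the example frequencies are hypothetical.

```python
from itertools import combinations

def pic(freqs):
    """Polymorphism information content for allele frequencies summing to 1:
    PIC = 1 - sum(p_i^2) - sum over i<j of 2 * p_i^2 * p_j^2."""
    if abs(sum(freqs) - 1.0) > 1e-9:
        raise ValueError("allele frequencies must sum to 1")
    het = 1.0 - sum(p * p for p in freqs)
    correction = sum(2 * p * p * q * q for p, q in combinations(freqs, 2))
    return het - correction

# A biallelic marker at p = q = 0.5 gives the textbook maximum of 0.375
print(round(pic([0.5, 0.5]), 3))        # 0.375
# A marker with four equally frequent alleles is more informative
print(round(pic([0.25, 0.25, 0.25, 0.25]), 3))  # 0.703
```

Highly polymorphic markers (PIC well above 0.5, as in this scan) are what make linkage mapping of QTL regions feasible.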

Relevance:

30.00%

Abstract:

BACKGROUND: The prognostic relevance of the collateral circulation is still controversial. The goal of this study was to assess the impact on survival of quantitatively obtained, recruitable coronary collateral flow in patients with stable coronary artery disease during 10 years of follow-up. METHODS AND RESULTS: Eight hundred forty-five individuals (age, 62+/-11 years), 106 patients without coronary artery disease and 739 patients with chronic stable coronary artery disease, underwent a total of 1053 quantitative, coronary pressure-derived collateral measurements between March 1996 and April 2006. All patients were prospectively included in a collateral flow index (CFI) database containing information on recruitable collateral flow parameters obtained during a 1-minute coronary balloon occlusion. CFI was calculated as follows: CFI = (P(occl) - CVP)/(P(ao) - CVP), where P(occl) is mean coronary occlusive pressure, P(ao) is mean aortic pressure, and CVP is central venous pressure. Patients were divided into groups with poorly developed (CFI < 0.25) or well-grown collateral vessels (CFI ≥ 0.25). Follow-up information on the occurrence of all-cause mortality and major adverse cardiac events after study inclusion was collected. Cumulative 10-year survival rates in relation to all-cause deaths and cardiac deaths were 71% and 88%, respectively, in patients with low CFI and 89% and 97% in the group with high CFI (P=0.0395, P=0.0109). Through the use of Cox proportional hazards analysis, the following variables independently predicted elevated cardiac mortality: age, low CFI (as a continuous variable), and current smoking. CONCLUSIONS: A well-functioning coronary collateral circulation saves lives in patients with chronic stable coronary artery disease. Depending on the exact amount of collateral flow recruitable during a brief coronary occlusion, long-term cardiac mortality is reduced to one fourth compared with the situation without collateral supply.
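The CFI formula and the study's 0.25 cut-off translate directly into a short calculation; the pressure values below (in mmHg) are hypothetical, chosen only to illustrate the arithmetic.

```python
def collateral_flow_index(p_occl, p_ao, cvp):
    """CFI = (P_occl - CVP) / (P_ao - CVP); all pressures in mmHg."""
    return (p_occl - cvp) / (p_ao - cvp)

def collateral_group(cfi, threshold=0.25):
    """Classify as in the study: poorly developed (< 0.25) vs well-grown (>= 0.25)."""
    return "well-grown" if cfi >= threshold else "poorly developed"

# Hypothetical measurement: P_occl = 30, P_ao = 90, CVP = 5 mmHg
cfi = collateral_flow_index(30, 90, 5)
print(round(cfi, 3))          # 0.294
print(collateral_group(cfi))  # well-grown
```

Because CFI normalizes the occlusive pressure against the aortic-venous gradient, it expresses collateral flow as a fraction of the flow the vessel would carry if fully patent.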

Relevance:

30.00%

Abstract:

The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) Is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each of the compartments must also be estimated by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random. If not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of this method.
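The RDI and chi-squared counting scheme described above can be sketched in a few lines; the compartment counts below are made up purely for illustration.

```python
def rdi_analysis(observed, grid_hits):
    """Relative deposition index per compartment plus partial and total
    chi-squared values. `observed` holds particle counts per compartment;
    `grid_hits` holds the number of test-grid points hitting each compartment,
    which gives the expected counts under a random distribution."""
    n = sum(observed)
    total_hits = sum(grid_hits)
    expected = [n * h / total_hits for h in grid_hits]
    rdi = [o / e for o, e in zip(observed, expected)]
    partial = [(o - e) ** 2 / e for o, e in zip(observed, expected)]
    return rdi, partial, sum(partial)

# Hypothetical counts for three compartments
observed = [50, 30, 20]
grid_hits = [60, 25, 15]
rdi, partial, chi2 = rdi_analysis(observed, grid_hits)
print([round(r, 2) for r in rdi])  # [0.83, 1.2, 1.33]
print(round(chi2, 2))              # 4.33
```

The total chi-squared value would then be compared against the appropriate critical value (degrees of freedom = number of compartments - 1) to decide whether the distribution departs from random, with the partial values pointing to the preferentially targeted compartments.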

Relevance:

30.00%

Abstract:

ABSTRACT: Nanotechnology in its widest sense seeks to exploit the special biophysical and chemical properties of materials at the nanoscale. While the potential technological, diagnostic or therapeutic applications are promising, there is a growing body of evidence that the special technological features of nanoparticulate material are associated with biological effects formerly not attributed to the same materials at a larger particle scale. Therefore, studies that address the potential hazards of nanoparticles on biological systems, including human health, are required. Due to its large surface area, the lung is one of the major sites of interaction with inhaled nanoparticles. One of the great challenges of studying particle-lung interactions is the microscopic visualization of nanoparticles within tissues or single cells both in vivo and in vitro. Once a certain type of nanoparticle can be identified unambiguously using microscopic methods, it is desirable to quantify the particle distribution within a cell, an organ or the whole organism. Transmission electron microscopy provides an ideal tool to perform qualitative and quantitative analyses of particle-related structural changes of the respiratory tract, to reveal the localization of nanoparticles within tissues and cells and to investigate the 3D nature of nanoparticle-lung interactions. This article provides information on the applicability, advantages and disadvantages of electron microscopic preparation techniques and several advanced transmission electron microscopic methods including conventional, immuno and energy-filtered electron microscopy as well as electron tomography for the visualization of both model nanoparticles (e.g. polystyrene) and technologically relevant nanoparticles (e.g. titanium dioxide). Furthermore, we highlight possibilities to combine light and electron microscopic techniques in a correlative approach. Finally, we demonstrate a formal quantitative, i.e. stereological, approach to analyze the distributions of nanoparticles in tissues and cells. This comprehensive article aims to provide a basis for scientists in nanoparticle research to integrate electron microscopic analyses into their study design and to select the appropriate microscopic strategy.

Relevance:

30.00%

Abstract:

Soil degradation is a major problem in the agriculturally dominated country of Tajikistan, which makes it necessary to determine and monitor the state of soils. For this purpose, a soil spectral library was established, as it enables the determination of soil properties with relatively low cost and effort. A total of 1465 soil samples were collected from three 10x10 km test sites in western Tajikistan. The diffuse reflectance of the samples was measured with a FieldSpec PRO FR from ASD in the spectral range from 380 to 2500 nm in the laboratory. 166 samples were finally selected based on their spectral information and analysed for total C and N, organic C, pH, CaCO₃, extractable P, exchangeable Ca, Mg and K, and the clay, silt and sand fractions. Multiple linear regression was used to set up the models. Two thirds of the chemically analysed samples were used to calibrate the models; one third was used for hold-out validation. Very good prediction accuracy was obtained for total C (R² = 0.76, RMSEP = 4.36 g kg⁻¹), total N (R² = 0.83, RMSEP = 0.30 g kg⁻¹) and organic C (R² = 0.81, RMSEP = 3.30 g kg⁻¹), and good accuracy for pH (R² = 0.61, RMSEP = 0.157) and CaCO₃ (R² = 0.72, RMSEP = 4.63%). No models could be developed for extractable P, exchangeable Ca, Mg and K, and the clay, silt and sand fractions. It can be concluded that the spectral library approach has a high potential to substitute standard laboratory methods where rapid and inexpensive analysis is required.
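The calibration/validation workflow described above (multiple linear regression, two-thirds calibration, one-third hold-out, R² and RMSEP on the hold-out set) can be sketched as follows. The data are synthetic stand-ins with five predictors rather than real reflectance spectra, so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 166 "samples", 5 spectral predictors, one soil property
X = rng.normal(size=(166, 5))
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + rng.normal(scale=0.3, size=166)

# Two thirds calibration, one third hold-out validation, as in the study
n_cal = 2 * len(X) // 3
Xc, yc, Xv, yv = X[:n_cal], y[:n_cal], X[n_cal:], y[n_cal:]

# Multiple linear regression via least squares (with intercept column)
A = np.column_stack([np.ones(n_cal), Xc])
coef, *_ = np.linalg.lstsq(A, yc, rcond=None)

# Hold-out prediction, then the study's two accuracy metrics
pred = np.column_stack([np.ones(len(Xv)), Xv]) @ coef
rmsep = float(np.sqrt(np.mean((yv - pred) ** 2)))
r2 = float(1 - np.sum((yv - pred) ** 2) / np.sum((yv - yv.mean()) ** 2))
print(f"R2 = {r2:.2f}, RMSEP = {rmsep:.2f}")
```

With real spectra, predictor selection or dimension reduction would normally precede the regression, since reflectance bands are far more numerous and collinear than this toy example suggests.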