237 results for Quantitative micrographic parameters


Relevance: 20.00%

Abstract:

Tumor antigen-specific cytotoxic T cells (CTLs) play a major role in the adaptive immune response to cancers. This CTL response is often insufficient because of functional impairment, tumor escape mechanisms, or an inhibitory tumor microenvironment. However, little is known about the fate of individual tumor-specific CTL clones in cancer patients, and studies in patients with favorable outcomes may be especially informative. In this longitudinal study, we tracked, quantified, and functionally characterized antigen-specific T-cell clones ex vivo, in peripheral blood and at tumor sites, in two long-term melanoma survivors. MAGE-A10-specific CD8+ T-cell clones with high avidity for antigenic peptide and tumor lytic capability persisted in peripheral blood for more than 10 years, with quantitative variations correlating with the clinical course. These clones were also found in emerging metastases, and, in one patient, circulating clonal T cells displayed a fully differentiated effector phenotype at the time of relapse. Longevity, tumor homing, differentiation phenotype, and quantitative adaptation to the disease phases suggest that the tracked tumor-reactive clones contributed to tumor control in these long-term metastatic survivors. Focusing research on patients with favorable outcomes may help to identify parameters crucial for an efficient antitumor response and to optimize cancer immunotherapy.

Relevance: 20.00%

Abstract:

OBJECTIVES: To investigate the effect of a change in second-hand smoke (SHS) exposure on heart rate variability (HRV) and pulse wave velocity (PWV), this study utilized a quasi-experimental setting when a smoking ban was introduced. METHODS: HRV, a quantitative marker of autonomic activity of the nervous system, and PWV, a marker of arterial stiffness, were measured in 55 non-smoking hospitality workers before and 3-12 months after a smoking ban and compared to a control group that did not experience an exposure change. SHS exposure was determined with a nicotine-specific badge and expressed as inhaled cigarette equivalents per day (CE/d). RESULTS: PWV and HRV parameters significantly changed in a dose-dependent manner in the intervention group as compared to the control group. A one CE/d decrease was associated with a 2.3 % (95 % CI 0.2-4.4; p = 0.031) higher root mean square of successive differences (RMSSD), a 5.7 % (95 % CI 0.9-10.2; p = 0.02) higher high-frequency component and a 0.72 % (95 % CI 0.40-1.05; p < 0.001) lower PWV. CONCLUSIONS: PWV and HRV significantly improved after introducing smoke-free workplaces indicating a decreased cardiovascular risk.
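RMSSD, the time-domain HRV index whose dose-dependent improvement is reported above, has a simple definition; a minimal sketch in Python (the RR-interval values are illustrative, not data from the study):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between adjacent RR
    intervals (ms), a standard time-domain HRV index; higher values
    reflect greater parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR series (ms); real recordings are much longer.
rr = [812, 790, 804, 830, 815]
rmssd_ms = rmssd(rr)  # ≈ 19.9 ms
```

Under the reported effect size, each one-CE/d reduction in exposure would correspond to multiplying this value by roughly 1.023.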

Relevance: 20.00%

Abstract:

BACKGROUND AND AIMS: Although it is well known that fire acts as a selective pressure shaping plant phenotypes, there are no quantitative estimates of the heritability of any trait related to plant persistence under recurrent fires, such as serotiny. In this study, the heritability of serotiny in Pinus halepensis is calculated, and an evaluation is made as to whether fire has left a selection signature on the level of serotiny among populations by comparing the genetic divergence of serotiny with the expected divergence of neutral molecular markers (QST-FST comparison). METHODS: A common garden of P. halepensis was used, located in inland Spain and composed of 145 open-pollinated families from 29 provenances covering the entire natural range of P. halepensis in the Iberian Peninsula and Balearic Islands. Narrow-sense heritability (h²) and quantitative genetic differentiation among populations for serotiny (QST) were estimated by means of an 'animal model' fitted by Bayesian inference. In order to determine whether genetic differentiation for serotiny is the result of differential natural selection, QST estimates for serotiny were compared with FST estimates obtained from allozyme data. Finally, a test was made of whether levels of serotiny in the different provenances were related to different fire regimes, using summer rainfall as a proxy for fire regime in each provenance. KEY RESULTS: Serotiny showed a significant narrow-sense heritability (h²) of 0.20 (credible interval 0.09-0.40). Quantitative genetic differentiation among provenances for serotiny (QST = 0.44) was significantly higher than expected under a neutral process (FST = 0.12), suggesting adaptive differentiation. A significant negative relationship was found between the serotiny level of trees in the common garden and the summer rainfall of their provenance sites. CONCLUSIONS: Serotiny is a heritable trait in P. halepensis, and selection acts on it, giving rise to contrasting serotiny levels among populations depending on the fire regime and supporting the role of fire in generating genetic divergence for adaptive traits.
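The QST statistic behind the QST-FST comparison has a simple closed form in terms of variance components; a sketch, with values that are purely illustrative (chosen only so the result lands near the reported QST = 0.44):

```python
def qst(var_between, var_additive_within):
    """Quantitative genetic differentiation among populations:
    QST = Vb / (Vb + 2 * Va), where Vb is the between-population genetic
    variance and Va the additive genetic variance within populations
    (estimated in the study with an 'animal model')."""
    return var_between / (var_between + 2 * var_additive_within)

# Hypothetical variance components for serotiny, not the study's estimates.
q = qst(0.8, 0.5)    # 0.8 / (0.8 + 1.0) ≈ 0.44
adaptive = q > 0.12  # QST clearly exceeding the neutral FST suggests selection
```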

Relevance: 20.00%

Abstract:

This paper examines a dataset derived from observational tracking in order to analyze where and how middle-class working families spend time at home. We use an ethnographic approach to study the everyday lives of Italian dual-income middle-class families, with the aim of quantitatively analyzing the use of home spaces and the types of activities of family members on weekday afternoons and evenings. The different analyses (multiple correspondence analysis, agglomerative hierarchical clustering, discriminant analysis) show how particular spaces, and the activities within them, are dominated by certain family members. We suggest the combination of qualitative and quantitative methodologies as a useful tool for exploring in detail the everyday lives of families and for understanding how family members use domestic spaces. In particular, we consider quantitative analyses relevant for examining ethnographic data, especially in connection with methodological reflexivity among researchers.

Relevance: 20.00%

Abstract:

PURPOSE OF REVIEW: An important goal of neurocritical care is the management of secondary brain injury (SBI), that is, pathological events occurring after the primary insult that add further burden to outcome. Brain oedema, cerebral ischemia, energy dysfunction, seizures and systemic insults are the main components of SBI. We here review recent data showing the clinical utility of brain multimodality monitoring (BMM) for the management of SBI. RECENT FINDINGS: Despite being recommended by international guidelines, standard intracranial pressure (ICP) monitoring may be insufficient to detect all episodes of SBI. ICP monitoring combined with brain tissue oxygen (PbtO2), cerebral microdialysis and regional cerebral blood flow might help to target therapy (e.g. management of cerebral perfusion pressure, blood transfusion, glucose control) to patient-specific pathophysiology. Physiological parameters derived from BMM, including PbtO2 and the microdialysis lactate/pyruvate ratio, correlate with outcome and have recently been incorporated into neurocritical care guidelines. Advanced intracranial devices can be complemented by quantitative electroencephalography to monitor changes in brain function and nonconvulsive seizures. SUMMARY: BMM offers an online, comprehensive scrutiny of the injured brain and is increasingly used for the management of SBI. Integration of monitored data using new informatics tools may help optimize therapy of brain-injured patients and the quality of care.

Relevance: 20.00%

Abstract:

Previous studies have demonstrated that poultry-house workers are exposed to very high levels of organic dust and consequently have an increased prevalence of adverse respiratory symptoms. However, the influence of the age of broilers on bioaerosol concentrations has not been investigated. To evaluate the evolution of bioaerosol concentrations during the fattening period, bioaerosol parameters (inhalable dust, endotoxin and bacteria) were measured in 12 poultry confinement buildings in Switzerland at 3 different stages of the birds' growth. Samples of air taken from within the breathing zones of individual poultry-house employees as they caught the chickens ready to be transported for slaughter were also analysed. Quantitative PCR (Q-PCR) was used to assess the quantity of total airborne bacteria and total airborne Staphylococcus species. Bioaerosol levels increased significantly during the fattening period of the chickens. During the task of catching mature birds, the mean inhalable dust concentration for a worker was 31 ± 4.7 mg/m³, and the endotoxin concentration was 11,080 ± 3,436 EU/m³ of air, more than ten-fold higher than the Swiss occupational recommended value (1,000 EU/m³). The mean exposure of bird catchers to total bacteria and Staphylococcus species measured by Q-PCR was also very high, reaching 72 (± 11) × 10⁷ cells/m³ of air and 70 (± 16) × 10⁶ cells/m³ of air, respectively. It was concluded that, in the absence of protective breathing apparatus, chicken catchers in Switzerland risk exposure beyond recommended limits for all measured bioaerosol parameters. Moreover, the use of Q-PCR to estimate total and specific numbers of airborne bacteria is a promising tool for evaluating any modifications intended to improve the safety of current working practices.
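Q-PCR quantification of airborne bacteria rests on a standard curve relating the cycle threshold (Ct) to a copy number, which is then scaled to the sampled air volume; a minimal sketch of that conversion (the calibration constants and sampling volumes here are hypothetical, not the study's):

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a qPCR standard curve Ct = slope * log10(copies) + intercept.
    A slope of -3.32 corresponds to 100% amplification efficiency; both
    calibration constants are hypothetical, not the study's values."""
    return 10 ** ((ct - intercept) / slope)

def cells_per_m3(ct, elution_ml, analyzed_ml, air_m3):
    """Scale the copies in the analyzed aliquot up to the sampled air volume."""
    return copies_from_ct(ct) * (elution_ml / analyzed_ml) / air_m3

# Hypothetical sample: Ct of 34.68, 5 mL eluate, 1 mL analyzed, 0.5 m³ of air.
concentration = cells_per_m3(34.68, 5.0, 1.0, 0.5)  # ≈ 100 cells/m³
```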

Relevance: 20.00%

Abstract:

The goal of this study was to investigate the impact of computing parameters and of the location of volumes of interest (VOIs) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement (the sampling distances b_x,y,z, the VOI lengths L_x,y,z, the number of VOIs N_VOI, and the structured noise) was investigated to minimize measurement errors. The effect of VOI location on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (along the phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with a DFOV of 200 mm (L_x,y,z = 64, b_x,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This shows that VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
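The ensemble 3D NPS estimation described above can be sketched as follows; the sampling distance b_x,y = 0.391 mm is taken from the abstract, but the test volumes are synthetic white noise and the first-order detrending is simplified here to mean subtraction:

```python
import numpy as np

def nps_3d(vois, bx, by, bz):
    """Ensemble 3D noise power spectrum from noise-only VOIs:
    NPS = (bx*by*bz / (Nx*Ny*Nz)) * mean over VOIs of |FFT3(VOI - trend)|^2.
    Only the mean is removed per VOI here; the study fits a first-order
    polynomial to suppress structured noise."""
    nx, ny, nz = vois[0].shape
    acc = np.zeros((nx, ny, nz))
    for v in vois:
        acc += np.abs(np.fft.fftn(v - v.mean())) ** 2
    return (bx * by * bz) / (nx * ny * nz) * acc / len(vois)

# Sanity check on synthetic white noise (sigma = 10 HU): the NPS of white
# noise is flat, and integrating it over frequency returns the variance.
rng = np.random.default_rng(0)
vois = [rng.normal(0.0, 10.0, (16, 16, 16)) for _ in range(32)]
nps = nps_3d(vois, bx=0.391, by=0.391, bz=1.0)
variance = nps.sum() / ((16 * 0.391) ** 2 * (16 * 1.0))  # ≈ 100
```

The white-noise check exploits Parseval's relation: summing the NPS over all frequency bins, times the frequency-bin volume, recovers the voxel variance.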

Relevance: 20.00%

Abstract:

Agricultural practices, such as spreading liquid manure or using land as animal pasture, can result in faecal contamination of water resources. Rhodococcus coprophilus is used in microbial source tracking to indicate animal faecal contamination in water. Methods previously described for detecting R. coprophilus in water were neither sensitive nor specific. Therefore, the aim of this study was to design and validate a new quantitative polymerase chain reaction (qPCR) assay to improve the detection of R. coprophilus in water. The new PCR assay was based on the R. coprophilus 16S rRNA gene. Validation showed that the new approach was specific and sensitive for DNA from the target host species. Compared with other PCR assays tested in this study, the detection limit of the new qPCR was 1 to 3 log units lower. The method, including a filtration step, was further validated and successfully used in a field investigation in Switzerland. Our work demonstrated that the new detection method is sensitive and robust for detecting R. coprophilus in surface and spring water. Compared with PCR assays available in the literature and with the culture-dependent method, the new molecular approach improves the detection of R. coprophilus.

Relevance: 20.00%

Abstract:

PPARs are members of the nuclear hormone receptor superfamily and are primarily involved in lipid metabolism. The expression patterns of all 3 PPAR isotypes in 22 adult rat organs were analyzed by a quantitative ribonuclease protection assay. The data obtained allowed comparison of the expression of each isotype to the others and provided new insight into the less studied PPAR beta (NR1C2) expression and function. This isotype shows a ubiquitous expression pattern and is the most abundant of the three PPARs in all analyzed tissues except adipose tissue. Its expression is especially high in the digestive tract, in addition to kidney, heart, diaphragm, and esophagus. After an overnight fast, PPAR beta mRNA levels are dramatically down-regulated in liver and kidney by up to 80% and are rapidly restored to control levels upon refeeding. This tight nutritional regulation is independent of the circulating glucocorticoid levels and the presence of PPAR alpha, whose activity is markedly up-regulated in the liver and small intestine during fasting. Finally, PPAR gamma 2 mRNA levels are decreased by 50% during fasting in both white and brown adipose tissue. In conclusion, fasting can strongly influence PPAR expression, but in only a few selected tissues.

Relevance: 20.00%

Abstract:

Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions that are addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' of the application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach to the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).

Relevance: 20.00%

Abstract:

Disasters are often perceived as sudden, random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. Modifying this perception of risk requires advocacy tools; quantitative methods were therefore developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, such analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-hazard risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments, and the results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain as limitations. It is clear, however, that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
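The hazard, exposure and vulnerability decomposition at the core of the model can be sketched as a per-cell computation (all numbers, and the simple additive aggregation into a country index, are illustrative rather than the actual UNEP/UNISDR formulation):

```python
def cell_risk(hazard_freq, exposed_pop, vulnerability):
    """Expected annual losses in one grid cell (arbitrary units): risk exists
    only where a hazard, exposed people and vulnerability coincide, so the
    model is multiplicative."""
    return hazard_freq * exposed_pop * vulnerability

grid = [
    # (annual hazard frequency, exposed population, vulnerability in [0, 1])
    (0.10, 50_000, 0.8),   # hazard-prone cell with high vulnerability
    (0.10, 50_000, 0.2),   # same hazard and exposure, lower vulnerability
    (0.01, 500_000, 0.2),  # large exposed population, rare hazard
]
risks = [cell_risk(*cell) for cell in grid]
country_index = sum(risks)  # ≈ 6000
```

The first two cells illustrate the abstract's point that the level of development (here, vulnerability) reshapes risk even when hazard and exposure are held fixed.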

Relevance: 20.00%

Abstract:

As modern molecular biology moves towards the analysis of biological systems, as opposed to their individual components, the need for appropriate mathematical and computational techniques for understanding the dynamics and structure of such systems is becoming more pressing. For example, the modeling of biochemical systems using ordinary differential equations (ODEs) based on high-throughput, time-dense profiles is becoming more commonplace, which necessitates the development of improved techniques to estimate model parameters from such data. Due to the high dimensionality of this estimation problem, straightforward optimization strategies rarely produce correct parameter values, and current methods therefore tend to rely on genetic/evolutionary algorithms to perform nonlinear parameter fitting. Here, we describe a completely deterministic approach based on interval analysis. This allows us to examine entire sets of parameters, and thus to exhaust the global search within a finite number of steps. In particular, we show how our method may be applied to a generic class of ODEs used for modeling biochemical systems, called Generalized Mass Action (GMA) models. In addition, we show that for GMA models our method is amenable to the interval-arithmetic technique called constraint propagation, which greatly improves its efficiency. To illustrate the applicability of our method, we apply it to several networks of biochemical reactions from the literature, showing in particular that, in addition to estimating system parameters in the absence of noise, our method may also be used to recover the topology of these networks.
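The core move (evaluate the model over an entire box of parameter values at once, and discard the box whenever the resulting residual interval cannot contain zero) can be sketched on a one-parameter toy model, dx/dt = -k*x; the paper's GMA setting and its constraint propagation are far more general:

```python
def box_consistent(k_lo, k_hi, x, dxdt, tol):
    """Interval evaluation of the residual dx/dt + k*x over k in [k_lo, k_hi].
    For x > 0 the residual is monotone in k, so its exact range is spanned by
    the endpoints; the box survives only if that range can contain zero."""
    r1, r2 = dxdt + k_lo * x, dxdt + k_hi * x
    return min(r1, r2) - tol <= 0.0 <= max(r1, r2) + tol

def bisect_search(box, data, tol=1e-3, width=1e-4):
    """Deterministic global search: bisect parameter boxes, pruning any box
    refuted by some datum, until surviving boxes are narrower than `width`.
    Every feasible parameter value is guaranteed to lie in a kept box."""
    stack, kept = [box], []
    while stack:
        lo, hi = stack.pop()
        if not all(box_consistent(lo, hi, x, d, tol) for x, d in data):
            continue  # the whole box is refuted: this is the pruning step
        if hi - lo < width:
            kept.append((lo, hi))
        else:
            mid = 0.5 * (lo + hi)
            stack += [(lo, mid), (mid, hi)]
    return kept

# Noise-free synthetic slope data from dx/dt = -k*x with true k = 0.7.
data = [(x, -0.7 * x) for x in (0.5, 1.0, 2.0)]
boxes = bisect_search((0.0, 10.0), data)
k_est = sum(0.5 * (lo + hi) for lo, hi in boxes) / len(boxes)  # ≈ 0.7
```

Because boxes are only discarded when provably inconsistent, the search is exhaustive: unlike a genetic algorithm, it cannot silently miss a feasible region.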