225 results for quantification géométrique
Abstract:
Aim The aim of this paper was to explore the concept of expertise in nursing from the perspective of how it relates to current driving forces in health care, and to discuss the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. Background Expert nursing practice can be argued to be central to high quality, holistic, individualized patient care. However, changes in government policy which have led to the inception of comprehensive guidelines or protocols of care are in danger of relegating the ‘expert nurse’ to being an icon of the past. Indeed, it could be argued that expert nurses are an expensive commodity within the nursing workforce. Consequently, this shift to the use of clinical guidelines calls into question how expert nursing practice will develop within this framework of care. Method The article critically reviews the evidence related to the role of the expert nurse in an attempt to identify the key concepts and ideas, and how the inception of care protocols has implications for this role. Conclusion Nursing expertise which focuses on the provision of individualized, holistic care and is based largely on intuitive decision making cannot, and should not, be reduced to being articulated in positivist terms. However, the dominant power and decision-making focus in health care means that nurses must be confident in articulating the value of a concept which may be outside the scope of knowledge of those with whom they are debating. Relevance to clinical practice The principles of abduction or fuzzy logic may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
This paper explores the concept of expertise in intensive care nursing practice from the perspective of its relationship to the current driving forces in healthcare. It discusses the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. It argues that nursing expertise which focuses on the provision of individualised, holistic care and which is based largely on intuitive decision-making cannot and should not be reduced to being articulated in positivist terms. The principles of abduction or fuzzy logic, derived from computer science, may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
Rapidly developing proteomic tools are improving detection of deregulated kallikrein-related peptidase (KLK) expression, at the protein level, in prostate and ovarian cancer, as well as facilitating the determination of functional consequences downstream. Mass spectrometry (MS)-driven proteomics uniquely allows for the detection, identification and quantification of thousands of proteins in a complex protein pool, and this has served to identify certain KLKs as biomarkers for these diseases. In this review we describe applications of this technology in KLK biomarker discovery, and elucidate MS-based techniques which have been used for unbiased, global screening of KLK substrates within complex protein pools. Although MS-based KLK degradomic studies are limited to date, they have helped to discover an array of novel KLK substrates. Substrates identified by MS-based degradomics are reported with improved confidence over those determined by incubating a purified or recombinant substrate with the protease of interest in vitro. We propose that these novel proteomic approaches represent the way forward for KLK research, in order to correlate proteolysis of biological substrates with tissue-related consequences, toward clinical targeting of KLK expression and function for cancer diagnosis, prognosis and therapies.
Abstract:
Purpose. To quantify the molecular lipid composition of patient-matched tear and meibum samples and compare tear and meibum lipid molecular profiles. Methods. Lipids were extracted from tears and meibum by biphasic methods using 10:3 tert-butyl methyl ether:methanol, washed with aqueous ammonium acetate, and analyzed by chip-based nanoelectrospray ionization tandem mass spectrometry. Targeted precursor ion and neutral loss scans identified individual molecular lipids, and quantification was obtained by comparison to internal standards in each lipid class. Results. Two hundred and thirty-six lipid species were identified and quantified from nine lipid classes comprising cholesterol esters, wax esters, (O-acyl)-ω-hydroxy fatty acids, triacylglycerols, phosphatidylcholine, lysophosphatidylcholine, phosphatidylethanolamine, sphingomyelin, and phosphatidylserine. With the exception of phospholipids, lipid molecular profiles were strikingly similar between tears and meibum. Conclusions. Comparisons between tears and meibum indicate that meibum is likely to supply the majority of lipids in the tear film lipid layer. However, the observed higher mole ratio of phospholipid in tears shows that analysis of meibum alone does not provide a complete understanding of the tear film lipid composition.
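Quantification against class-matched internal standards, as described in the abstract above, reduces to a simple intensity-ratio calculation. The sketch below illustrates that principle only: the function name and example values are hypothetical, and real workflows apply isotope and response-factor corrections that are omitted here.

```python
# Illustrative sketch (not the study's code): single-point internal-standard
# quantification as commonly used in shotgun lipidomics. All names and the
# example intensities below are hypothetical.

def quantify(analyte_intensity, standard_intensity, standard_conc_uM):
    """Estimate an analyte concentration from its intensity ratio to a
    class-matched internal standard spiked at a known concentration."""
    if standard_intensity <= 0:
        raise ValueError("internal standard not detected")
    return (analyte_intensity / standard_intensity) * standard_conc_uM

# A wax ester signal at 2.4x the intensity of a 10 uM internal standard
# is estimated at roughly 24 uM (ignoring response-factor corrections).
print(quantify(2.4e5, 1.0e5, 10.0))  # -> 24.0
```

Because each lipid class gets its own standard, the same ratio logic is applied class by class across all identified species.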
Abstract:
Highway construction works have a significant bearing on all aspects of sustainability. With the increasing level of public awareness and government regulatory measures, the construction industry is experiencing a cultural shift to recognise, embrace and pursue sustainability. Stakeholders are now keen to identify sustainable alternatives and the financial implications of including them on a life-cycle basis. They need tools that can aid the evaluation of investment options. To date, however, there have not been many financial assessments of the sustainability aspects of highway projects, because the existing life-cycle costing analysis (LCCA) models tend to focus on economic issues alone and are not able to deal with sustainability factors. This paper provides insights into the current practice of life-cycle cost analysis, and into the identification and quantification of sustainability-related cost components in highway projects, through a literature review, questionnaire surveys and semi-structured interviews. The results can serve as a platform for highway project stakeholders to develop practical tools to evaluate highway investment decisions and reach an optimum balance between financial viability and sustainability deliverables.
Abstract:
Global climate change is one of the most significant environmental issues that can harm human development. One central issue for the building and construction industry in addressing global climate change is the development of a credible and meaningful way to measure greenhouse gas (GHG) emissions. While Publicly Available Specification (PAS) 2050, the first international GHG standard, has proven successful in standardizing the quantification process, its contribution to the management of carbon labels for construction materials is limited. With the publication of ISO 14067 (Greenhouse gases – carbon footprint of products – requirements and guidelines for quantification and communication) in May 2013, it is necessary for the building and construction industry to understand the past, present and future of carbon labelling practices for construction materials. A systematic review shows that international GHG standards have been evolving towards providing additional guidance on communication and comparison, as well as allowing less flexibility in the use of carbon labels. At the same time, carbon labelling schemes have been evolving towards standardization and benchmarking. In addition, future actions are needed in raising consumer awareness, providing benchmarking, ensuring standardization and developing simulation technologies in order for carbon labelling schemes for construction materials to provide credible, accurate and transparent information on GHG emissions.
Abstract:
Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. On the other hand, metrics to quantify process compliance have only recently been defined. A major criticism is that existing measures appear unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions such as trace equivalence, and they can be calculated efficiently. As a validation, we present an implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, these notions are exponential to compute and yield a Boolean result. In many cases, however, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
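The pairwise relations that make up a behavioural profile (order, exclusiveness, interleaving) can be illustrated by deriving them directly from observed traces and scoring how far two profiles agree. The sketch below is an assumption for illustration, not the article's algorithm: it enumerates traces instead of decomposing a workflow net, and it uses a simple symmetric agreement score rather than the article's consistency notion.

```python
# Illustrative sketch: behavioural-profile relations over activity pairs,
# derived from enumerated traces, plus a naive profile-agreement score.
from itertools import product

def weak_order(traces):
    """Pairs (a, b) such that a occurs before b in at least one trace."""
    order = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                order.add((a, b))
    return order

def profile(traces):
    """Classify each activity pair as strict order ('->'), reverse order
    ('<-'), interleaving ('||'), or exclusiveness ('+')."""
    order = weak_order(traces)
    acts = sorted({a for t in traces for a in t})
    rels = {}
    for a, b in product(acts, acts):
        fwd, rev = (a, b) in order, (b, a) in order
        rels[(a, b)] = ('||' if fwd and rev else
                        '->' if fwd else
                        '<-' if rev else '+')
    return rels

def agreement(model_traces, log_traces):
    """Fraction of shared activity pairs whose relation in the log
    matches the relation in the model's behavioural profile."""
    pm, pl = profile(model_traces), profile(log_traces)
    shared = pm.keys() & pl.keys()
    return sum(pm[k] == pl[k] for k in shared) / len(shared)

model = [('a', 'b', 'c'), ('a', 'c', 'b')]  # b and c may interleave
log = [('a', 'b', 'c')]                     # log only ever runs b before c
print(agreement(model, log))  # 7 of 9 pair relations agree
```

The key property motivating the approach survives even in this toy form: the result is a graded score over pairwise relations rather than the Boolean verdict that trace equivalence or bisimulation would give.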
Abstract:
This contribution outlines synchrotron-based X-ray micro-tomography and its potential use in structural geology and rock mechanics, complementing several recent reviews of X-ray micro-tomography. We summarize the general approach to data acquisition, post-processing and analysis, and thereby aim to provide an entry point for the interested reader. The paper includes tables listing relevant beamlines, the available imaging techniques, and free and commercial software packages for data visualization and quantification. We highlight potential applications in a review of relevant literature, including time-resolved experiments and digital rock physics. The paper concludes with a report on ongoing developments and upgrades at synchrotron facilities, to frame the future possibilities for imaging sub-second processes in centimetre-sized samples.
Abstract:
Road infrastructure is one of the most expensive and extensive infrastructure assets of the built environment globally. It also has a significant impact on the natural environment during the different phases of its life, e.g. construction, use, maintenance and end-of-life disposal. The growing emphasis on sustainable development to meet the needs of future generations requires mitigation of the environmental impacts of road infrastructure during all of these phases. Life-cycle analysis (LCA), a method that quantifies impacts across all stages of life, has recently been studied as a way to explore all the environmental components of road projects, given the limitations of generic environmental assessments. LCA ensures collection and assessment of the inputs and outputs relating to any potential environmental factor of a system throughout its life. However, the absence of a defined system boundary covering all potential environmental components restricts the findings of current LCA studies. A review of the relevant published LCA studies has identified that environmental components such as the rolling resistance of pavement, the effect of solar radiation on pavement (albedo), traffic congestion during construction, and roadway lighting and signals are not considered by most of the studies. These components have potentially higher weightings for environmental damage than several commonly considered components such as materials, transportation and equipment. This paper presents the findings of the literature review and suggests a system boundary model for LCA studies of road infrastructure projects covering the potential environmental components.
Abstract:
Objective There are no objective ambulatory studies on the temporal relationship between reflux and cough in children. Commercial pHmetry loggers have slow capture rates (0.25 Hz) that limit objective quantification of reflux and cough. The authors aimed to evaluate whether there is a temporal association between cough and acid pH in ambulatory children with chronic cough. Setting and patients The authors studied children (aged <14 years) with chronic cough, suspected of acid reflux and considered for pHmetry, using a specifically built ambulatory pHmetry–cough logger that enabled the simultaneous ambulatory recording of cough and pH with a fast (10 Hz) capture rate. Main outcome measures Coughs within (before and after) 10, 30, 60 and 120 s of a reflux episode (pH<4 for >0.5 s). Results Analysis of 5628 coughs in 20 children. Most coughs (83.9%) were independent of a reflux event. Cough–reflux (median 19, IQR 3–45) and reflux–cough (24.5, 13–51) sequences were equally likely to occur within 120 s. Within the 10 and 30 s time frames, reflux–cough sequences (10 s: median 2.5, IQR 0–7.25; 30 s: 6.5, 1.25–22.25) were significantly less frequent than reflux–no cough sequences (10 s: 27, IQR 15–65; 30 s: 24.5, 14.5–55.5) (p=0.0001 and p=0.001, respectively). No differences were found for the 60 and 120 s time frames. Cough–reflux sequences (median 1.0, IQR 0–8) were significantly less frequent than no cough–reflux sequences (median 29.5, 15–67) within 10 s (p=0.0001), 30 s (p=0.006) and 60 s (p=0.048), but not within 120 s (p=0.47). Conclusions In children with chronic cough and suspected gastro-oesophageal reflux disease, the temporal relationship between acid reflux and cough is unlikely to be causal.
Abstract:
Background The genetic regulation of flower color has been widely studied, notably as a character used by Mendel and his predecessors in the study of inheritance in pea. Methodology/Principal Findings We used the genome sequence of model legumes, together with their known synteny to the pea genome to identify candidate genes for the A and A2 loci in pea. We then used a combination of genetic mapping, fast neutron mutant analysis, allelic diversity, transcript quantification and transient expression complementation studies to confirm the identity of the candidates. Conclusions/Significance We have identified the pea genes A and A2. A is the factor determining anthocyanin pigmentation in pea that was used by Gregor Mendel 150 years ago in his study of inheritance. The A gene encodes a bHLH transcription factor. The white flowered mutant allele most likely used by Mendel is a simple G to A transition in a splice donor site that leads to a mis-spliced mRNA with a premature stop codon, and we have identified a second rare mutant allele. The A2 gene encodes a WD40 protein that is part of an evolutionarily conserved regulatory complex.
Abstract:
Plant microRNAs (miRNAs) are a class of endogenous small RNAs that are essential for plant development and survival. They arise from larger precursor RNAs with a characteristic hairpin structure and regulate gene activity by targeting mRNA transcripts for cleavage or translational repression. Efficient and reliable detection and quantification of miRNA expression has become an essential step in understanding their specific roles. The expression levels of miRNAs can vary dramatically between samples and they often escape detection by conventional technologies such as cloning, northern hybridization and microarray analysis. The stem-loop RT-PCR method described here is designed to detect and quantify mature miRNAs in a fast, specific, accurate and reliable manner. First, a miRNA-specific stem-loop RT primer is hybridized to the miRNA and then reverse transcribed. Next, the RT product is amplified and monitored in real time using a miRNA-specific forward primer and the universal reverse primer. This method enables miRNA expression profiling from as little as 10 pg of total RNA and is suitable for high-throughput miRNA expression analysis.
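Downstream of the protocol above, real-time amplification data are commonly converted to relative expression with the 2^-ΔΔCt (Livak) method. The abstract does not prescribe this analysis step, so the sketch below is a hedged illustration with hypothetical cycle-threshold (Ct) values, assuming roughly 100% amplification efficiency.

```python
# Illustrative sketch (an assumption, not part of the published protocol):
# relative quantification of a miRNA from real-time PCR Ct values using
# the standard 2^-ddCt (Livak) method. All Ct values are hypothetical.

def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    """Fold change of the target miRNA in a sample relative to a
    calibrator sample, normalized to a reference RNA."""
    d_ct_sample = ct_target - ct_reference              # normalize sample
    d_ct_calibrator = ct_target_cal - ct_reference_cal  # normalize calibrator
    dd_ct = d_ct_sample - d_ct_calibrator
    return 2 ** -dd_ct

# The target crosses threshold 2 cycles earlier in the sample than in the
# calibrator (reference Ct unchanged): roughly a 4-fold increase.
print(relative_expression(22.0, 18.0, 24.0, 18.0))  # -> 4.0
```

Each one-cycle shift in Ct corresponds to a two-fold change in template, which is why the fold change is exponential in the Ct differences.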
Abstract:
Plant small RNAs are a class of 19- to 25-nucleotide (nt) RNA molecules that are essential for genome stability, development and differentiation, disease, cellular communication, signaling, and adaptive responses to biotic and abiotic stress. Small RNAs comprise two major RNA classes, short interfering RNAs (siRNAs) and microRNAs (miRNAs). Efficient and reliable detection and quantification of small RNA expression has become an essential step in understanding their roles in specific cells and tissues. Here we provide protocols for the detection of miRNAs by stem-loop RT-PCR. This method enables fast and reliable miRNA expression profiling from as little as 20 pg of total RNA extracted from plant tissue and is suitable for high-throughput miRNA expression analysis. In addition, this method can be used to detect other classes of small RNAs, provided the sequence is known and their GC contents are similar to those specific for miRNAs.
Abstract:
Recent developments in analytical technologies have driven significant advances in lipid science. The sensitivity and selectivity of modern mass spectrometers can now provide for the detection and even quantification of many hundreds of lipids in a single analysis. In parallel, increasing evidence from structural biology suggests that a detailed knowledge of lipid molecular structure, including carbon-carbon double bond position, stereochemistry and acyl chain regiochemistry, is required to fully appreciate the biochemical role(s) of individual lipids. Here we review the capabilities and limitations of tandem mass spectrometry to provide this level of structural specificity in the analysis of lipids present in complex biological extracts. In particular, we focus on the capabilities of a novel technology termed ozone-induced dissociation to identify the position(s) of double bonds in unsaturated lipids, and discuss its possible role in efforts to develop workflows that provide for complete structure elucidation of lipids by mass spectrometry alone: so-called top-down lipidomics. This article is part of a Special Issue entitled: Lipidomics and Imaging Mass Spectrometry.