918 results for quantitative phase analysis


Relevance: 100.00%

Abstract:

Rapid, quantitative SERS analysis of nicotine at ppm/ppb levels has been carried out using stable and inexpensive polymer-encapsulated Ag nanoparticles (gel-colls). The strongest nicotine band (1030 cm⁻¹) was measured against a d5-pyridine internal standard (974 cm⁻¹) that was introduced during preparation of the stock gel-colls. Calibration plots of I_nic/I_pyr against nicotine concentration were non-linear, but plotting I_nic/I_pyr against [nicotine]^x (x = 0.6-0.75, depending on the exact experimental conditions) gave linear calibrations over the 0.1-10 ppm range, with R² typically ca. 0.998. The RMS prediction error was 0.10 ppm when the gel-colls were used for quantitative determination of unknown nicotine samples at the 1-5 ppm level. The main advantage of the method is that the gel-colls constitute a highly stable and reproducible SERS medium that allows high-throughput (50 samples h⁻¹) measurements.
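The power-law linearisation described above is easy to reproduce numerically. The following is a minimal sketch, with hypothetical calibration data and numpy/scipy assumed, of searching the reported exponent range for the most linear calibration and inverting it for an unknown sample:

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: nicotine concentration (ppm) and the
# measured band-intensity ratio I_nic(1030 cm^-1) / I_pyr(974 cm^-1).
conc = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])
ratio = np.array([0.08, 0.21, 0.33, 0.62, 0.95, 1.52])

def fit_power_law_calibration(conc, ratio, exponents=np.arange(0.60, 0.76, 0.01)):
    """Try exponents x in the reported 0.6-0.75 range and keep the one
    giving the most linear plot of ratio vs. conc**x (highest R^2)."""
    return max(
        ((x, stats.linregress(conc**x, ratio)) for x in exponents),
        key=lambda item: item[1].rvalue**2,
    )

x, reg = fit_power_law_calibration(conc, ratio)
print(f"best exponent x = {x:.2f}, R^2 = {reg.rvalue**2:.4f}")

# Predict an unknown sample from its measured ratio by inverting the line.
unknown_ratio = 0.50
conc_pred = ((unknown_ratio - reg.intercept) / reg.slope) ** (1.0 / x)
print(f"predicted nicotine concentration: {conc_pred:.2f} ppm")
```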

Relevance: 100.00%

Abstract:

Neuropeptides are produced from larger precursors by limited proteolysis, first by endopeptidases and then by carboxypeptidases. Major endopeptidases required for these cleavages include prohormone convertase (PC) 1/3 and PC2. In this study, quantitative peptidomics analysis was used to characterize the specific role PC1/3 plays in this process. Peptides isolated from the hypothalamus, amygdala, and striatum of PC1/3 null mice were compared with those from heterozygous and wild-type mice. Extracts were labeled with stable isotopic tags and fractionated by HPLC, after which relative peptide levels were determined using tandem mass spectrometry. In total, 92 peptides were found, of which 35 were known neuropeptides or related peptides derived from 15 distinct secretory pathway proteins: 7B2, chromogranin A and B, cocaine- and amphetamine-regulated transcript, procholecystokinin, proenkephalin, promelanin concentrating hormone, proneurotensin, propituitary adenylate cyclase-activating peptide, proSAAS, prosomatostatin, provasoactive intestinal peptide, provasopressin, secretogranin III, and VGF. Among the peptides derived from these proteins, approximately one-third were decreased in the PC1/3 null mice relative to wild-type mice, approximately one-third showed no change, and approximately one-third were increased in the PC1/3 null mice. Cleavage sites were analyzed in peptides that showed no change or that decreased in PC1/3 null mice, and these results were compared with peptides that showed no change or decreased in previous peptidomic studies with PC2 null mice. Analysis of these sites showed that while PC1/3 and PC2 have overlapping substrate preferences, there are particular cleavage site residues that distinguish peptides preferred by each PC.
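The roughly one-third decreased / unchanged / increased split reported above amounts to bucketing each peptide by its null-to-wild-type abundance ratio. A minimal sketch, assuming hypothetical peptide names, ratios, and two-fold cutoffs (the study's actual criteria are not given here):

```python
import numpy as np

# Hypothetical peptide-level ratios (PC1/3 null vs. wild type) derived from
# isotopic-tag intensities in tandem MS; names and values are illustrative.
ratios = {
    "proenkephalin_frag": 0.3,   # lower in the null
    "proSAAS_frag": 1.0,         # unchanged
    "proneurotensin_frag": 2.1,  # higher in the null
}

def classify(ratio, low=0.5, high=2.0):
    """Bucket a null/wild-type ratio using (assumed) two-fold cutoffs."""
    if ratio <= low:
        return "decreased"
    if ratio >= high:
        return "increased"
    return "no change"

for peptide, r in ratios.items():
    print(f"{peptide}: ratio {r:.1f} -> {classify(r)}")
```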

Relevance: 100.00%

Abstract:

Risk analysis is one of the critical functions of the risk management process. It relies on a detailed understanding of risks and their possible implications. Construction projects, because of their large and complex nature, are plagued by a variety of risks which must be considered and responded to in order to ensure project success. This study conducts an extensive comparative analysis of major quantitative risk analysis techniques in the construction industry. The techniques discussed and comparatively analyzed in this report include: Programme Evaluation and Review Technique (PERT), Judgmental Risk Analysis Process (JRAP), Estimating Using Risk Analysis (ERA), the Monte Carlo simulation technique, Computer Aided Simulation for Project Appraisal and Review (CASPAR), Failure Modes and Effects Analysis (FMEA), and the Advanced Programmatic Risk Analysis and Management model (APRAM). The findings show that each risk analysis technique addresses risks in one or more of the following areas: schedule risks, budget risks, or technical risks. The comparative analysis reveals that the majority of risk analysis techniques focus on schedule or budget risks; very little has been documented in terms of technical risk analysis techniques. In an era where clients demand and expect higher-quality projects and finishes, project managers must invest time and resources to ensure that the few existing technical risk analysis techniques are developed and further refined, and that new technical risk analysis techniques are developed to suit the current construction industry's requirements.
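Monte Carlo simulation, one of the techniques compared above, is easy to illustrate for schedule risk. The sketch below samples triangular task durations for a serial schedule; the three-activity network, duration parameters, and overrun threshold are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical three-activity serial schedule with (min, mode, max)
# durations in days; triangular distributions are a common choice
# for expert-elicited estimates in Monte Carlo schedule risk analysis.
tasks = [(10, 12, 18), (20, 25, 40), (5, 6, 10)]

n = 10_000
total = sum(rng.triangular(lo, mode, hi, size=n) for lo, mode, hi in tasks)

print(f"mean duration : {total.mean():.1f} days")
print(f"P80 duration  : {np.percentile(total, 80):.1f} days")
print(f"P(overrun 45d): {(total > 45).mean():.1%}")
```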

Relevance: 100.00%

Abstract:

A set of NIH Image macro programs was developed to perform qualitative and quantitative analyses of digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, and true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations in processing time, scanning techniques, and programming concepts are also discussed.
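Elevation reconstruction from SEM stereo pairs typically rests on the standard photogrammetric relation between measured parallax, tilt angle, and height. A minimal sketch in Python (rather than the NIH Image macro language), with hypothetical parallax values and tilt angle:

```python
import numpy as np

def height_from_parallax(parallax_um, tilt_deg, magnification=1.0):
    """Relative elevation from the standard SEM stereo relation
    h = p / (2 * M * sin(theta / 2)), where p is the measured parallax,
    theta the total tilt between the two images, and M the magnification
    already applied to the parallax measurement."""
    theta = np.radians(tilt_deg)
    return parallax_um / (2.0 * magnification * np.sin(theta / 2.0))

# Hypothetical parallax values (micrometres) along a profile, 10 degree tilt.
parallax = np.array([0.0, 0.8, 1.5, 1.1, 0.2])
profile = height_from_parallax(parallax, tilt_deg=10.0)
print(np.round(profile, 2))  # relative elevation profile in micrometres
```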

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

A method for quantitative mineralogical analysis by ATR-FTIR has been developed. The method relies on the use of the main band of calcite as a reference for the normalization of the IR spectrum of a mineral sample. In this way, the molar absorptivity coefficient in the Lambert-Beer law and the components of a mixture, in mole percent, can be calculated. The GAMS equation-modeling environment and the NLP solver CONOPT (© ARKI Consulting and Development) were used to correlate the experimental data for the samples considered. Mixtures of different minerals and gypsum were used to determine the minimum band intensity that must be considered in the calculations and the detection limit; accordingly, bands of intensity lower than 0.01 were discarded. The detection limit for gypsum was about 7% (mol/total mol). Good agreement was obtained when this FTIR method was applied to ceramic tiles previously analyzed by X-ray diffraction (XRD) and to mineral mixtures prepared in the lab.
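Once the band intensities are normalized to the main calcite band, the Lambert-Beer step reduces to a constrained linear fit. The following is a simplified stand-in for the GAMS/CONOPT formulation, using non-negative least squares and entirely hypothetical band intensities:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical normalized band intensities: rows are wavenumber bands,
# columns are pure-mineral reference intensities after normalization
# to the main calcite band, as in the method described above.
A = np.array([
    [1.00, 0.05],   # calcite reference band
    [0.10, 0.90],   # gypsum main band
    [0.02, 0.30],   # secondary band
])
mixture = np.array([0.62, 0.44, 0.14])  # measured, normalized spectrum

# Bands weaker than 0.01 would be discarded before the fit, mirroring
# the intensity threshold reported above. Non-negative least squares
# is a simplified substitute for the paper's NLP formulation.
coeffs, residual = nnls(A, mixture)
fractions = coeffs / coeffs.sum()
for name, f in zip(["calcite", "gypsum"], fractions):
    print(f"{name}: {100 * f:.1f} mol%")
```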

Relevance: 100.00%

Abstract:

We have employed an inverse engineering strategy based on quantitative proteome analysis to identify changes in intracellular protein abundance that correlate with increased specific recombinant monoclonal antibody production (qMab) by engineered murine myeloma (NS0) cells. Four homogeneous NS0 cell lines differing in qMab were isolated from a pool of primary transfectants. The proteome of each stably transfected cell line was analyzed at mid-exponential growth phase by two-dimensional gel electrophoresis (2D-PAGE), and individual protein spot volume data derived from digitized gel images were compared statistically. To identify changes in protein abundance associated with qMab, datasets were screened for proteins that exhibited either a linear correlation with cell line qMab or a conserved change in abundance specific only to the cell line with the highest qMab. Several proteins with altered abundance were identified by mass spectrometry. Proteins exhibiting a significant increase in abundance with increasing qMab included molecular chaperones known to interact directly with nascent immunoglobulins during their folding and assembly (e.g., BiP, endoplasmin, protein disulfide isomerase). 2D-PAGE analysis showed that in all cell lines Mab light chain was more abundant than heavy chain, indicating that this is a likely prerequisite for efficient Mab production. In summary, these data reveal both the adaptive responses and the molecular mechanisms enabling mammalian cells in culture to achieve high-level recombinant monoclonal antibody production. © 2004 Wiley Periodicals, Inc.
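The screen for proteins whose abundance correlates linearly with qMab can be sketched as a per-protein correlation test; the qMab values and spot volumes below are purely illustrative, not the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: qMab for four cell lines and normalized 2D-PAGE spot
# volumes for one candidate protein (e.g. a chaperone such as BiP).
qmab = np.array([2.0, 5.0, 9.0, 15.0])        # illustrative units per cell
spot_volume = np.array([0.8, 1.1, 1.6, 2.3])  # normalized spot volume

r, p = pearsonr(qmab, spot_volume)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")
if r > 0 and p < 0.05:
    print("abundance rises with qMab -> candidate for follow-up by MS")
```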

Relevance: 100.00%

Abstract:

Part 17: Risk Analysis

Relevance: 90.00%

Abstract:

The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools has tended to take a deficit perspective of schools and teachers, with a lack of resources and teacher 'technophobia' most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within the long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting.

To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys' school in urban Australia. The study employed an 'explanatory' two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation.

In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were built using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage; other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students' engagement with the Web 2.0 learning initiative.

The qualitative phase that followed provided insights into these tensions by shifting the analytics from individual attitudes and behaviours to the shared social and cultural reasoning practices that explain students' engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students' accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice. While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that hindered the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of the social and cultural imperatives at play within the school: (i) 'cool/uncool', (ii) 'dominant staff/compliant student', and (iii) 'digital learning/academic performance'. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students' perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines.

Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being 'cool' (or at least 'not uncool'), (ii) being sufficiently 'compliant', and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others. These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt to and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility, a dispositional construct comprising personal innovativeness, cognitive playfulness and learning goals orientation. For these individuals the logic is 'both/and' rather than 'either/or': a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school.

In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a 'third way' of theorising technological and pedagogical innovation in schools, one that is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school and its complex relationship to students' lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
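The incremental CART modelling described above can be sketched with scikit-learn's implementation of classification trees. The synthetic data below merely echoes the reported finding that peer support best predicts usage; it is not the thesis dataset, and the feature names are the questionnaire constructs reused as labels:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(seed=0)

features = ["peer_support", "ease_of_use", "usefulness",
            "playfulness", "learning_goals"]

# Synthetic stand-in for the N=481 questionnaire responses.
X = rng.normal(size=(481, len(features)))
# Usage label loosely driven by peer support, mimicking the key result.
y = (X[:, 0] + 0.3 * rng.normal(size=481) > 1.0).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=features))
```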

Relevance: 90.00%

Abstract:

Background: In response to the need for more comprehensive quality assessment within Australian residential aged care facilities, the Clinical Care Indicator (CCI) Tool was developed to collect outcome data as a means of making inferences about quality. A national trial of its effectiveness and a Brisbane-based trial of its use within the quality improvement context determined that the CCI Tool represented a potentially valuable addition to the Australian aged care system. This document describes the next phase in the CCI Tool's development, the aims of which were to establish the validity and reliability of the CCI Tool and to develop quality indicator thresholds (benchmarks) for use in Australia. The CCI Tool is now known as the ResCareQA (Residential Care Quality Assessment).

Methods: The study aims were achieved through a combination of quantitative data analysis and expert panel consultations using a modified Delphi process. The expert panel consisted of experienced aged care clinicians, managers, and academics; they were initially consulted to determine face and content validity of the ResCareQA, and later to develop thresholds of quality. To analyse its psychometric properties, ResCareQA forms were completed for all residents (N=498) of nine aged care facilities throughout Queensland. Kappa statistics were used to assess inter-rater and test-retest reliability, and Cronbach's alpha coefficient was calculated to determine internal consistency. For concurrent validity, equivalent items on the ResCareQA and the Resident Classification Scale (RCS) were compared using Spearman's rank order correlations, while discriminative validity was assessed using the known-groups technique, comparing ResCareQA results between groups with differing care needs, as well as between male and female residents. Rank-ordered facility results for each clinical care indicator (CCI) were circulated to the panel; upper and lower thresholds for each CCI were nominated by panel members and refined through a Delphi process. These thresholds indicate excellent care at one extreme and questionable care at the other.

Results: Minor modifications were made to the assessment, and it was renamed the ResCareQA. Agreement on its content was reached after two Delphi rounds; the final version contains 24 questions across four domains, enabling the generation of 36 CCIs. Both test-retest and inter-rater reliability were sound, with median kappa values of 0.74 (test-retest) and 0.91 (inter-rater); internal consistency was not as strong, with a Cronbach's alpha of 0.46. Because the ResCareQA does not provide a single combined score, comparisons for concurrent validity were made with the RCS on an item-by-item basis, with most resultant correlations being quite low. Discriminative validity analyses, however, revealed highly significant differences in the total number of CCIs between high-care and low-care groups (t(199) = 10.77, p < 0.001), while the differences between male and female residents were not significant (t(414) = 0.56, p = 0.58). Clinical outcomes varied both within and between facilities; agreed upper and lower thresholds were finalised after three Delphi rounds.

Conclusions: The ResCareQA provides a comprehensive, easily administered means of monitoring quality in residential aged care facilities that can be reliably used on multiple occasions. The relatively modest internal consistency score was likely due to the multi-factorial nature of quality and the absence of an aggregate result for the assessment. Measurement of concurrent validity proved difficult in the absence of a gold standard, but the sound discriminative validity results suggest that the ResCareQA has acceptable validity and could be confidently used as an indication of care quality within Australian residential aged care facilities. The thresholds, while preliminary due to the small sample size, enable users to make judgements about quality within and between facilities. It is therefore recommended that the ResCareQA be adopted for wider use.
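Both reliability statistics reported above are straightforward to compute. A minimal sketch, with hypothetical ratings, of Cohen's kappa for two raters and Cronbach's alpha from a subjects-by-items matrix:

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters scoring the same categorical items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)  # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)  # chance agreement
             for c in np.union1d(r1, r2))
    return (po - pe) / (1.0 - pe)

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
    where items is an (n_subjects, k_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

# Hypothetical ratings: two raters on 10 items; 6 residents x 4 items.
print(cohens_kappa([1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
                   [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]))
print(cronbach_alpha([[2, 3, 3, 2], [1, 1, 2, 1], [3, 3, 3, 3],
                      [2, 2, 1, 2], [1, 2, 2, 1], [3, 2, 3, 3]]))
```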

Relevance: 90.00%

Abstract:

Virtual methods for assessing the fit of a fracture fixation plate have been proposed recently; however, they have limitations such as simplified fit criteria or manual data processing. This study aims to automate a fit analysis procedure using clinically based criteria, and then to analyse the results further for borderline fit cases. Three-dimensional (3D) models of 45 bones and of a precontoured distal tibial plate were utilized to assess the fit of the plate automatically. A Matlab program was developed to automatically measure the shortest distance between the bone and the plate at three regions of interest, as well as the plate-bone angle. The measured values, including the fit assessment results, were recorded in a spreadsheet as part of the batch-process routine. An automated fit analysis procedure will enable the processing of larger bone datasets in a significantly shorter time, providing data more representative of the target population for plate shape design and validation. As a result, better-fitting plates can be manufactured and made available to surgeons, thereby reducing the risk and cost associated with complications or corrective procedures. This, in turn, is expected to translate into improved quality of life for patients.
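The batch fit check can be sketched outside Matlab as nearest-neighbour distance queries per region of interest. In the sketch below, the 2 mm gap threshold, toy geometry, and region boundaries are assumptions for illustration, not the study's clinical criteria:

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_check(bone_pts, plate_pts, roi_masks, max_gap_mm=2.0):
    """For each region of interest, find the shortest bone-plate distance
    and flag the fit against an (assumed) gap threshold.
    bone_pts, plate_pts: (n, 3) surface points from the 3D models;
    roi_masks: boolean masks selecting plate points per region."""
    tree = cKDTree(bone_pts)
    results = {}
    for name, mask in roi_masks.items():
        d, _ = tree.query(plate_pts[mask])  # nearest bone point per plate point
        results[name] = (d.min(), d.min() <= max_gap_mm)
    return results

# Hypothetical toy geometry: a flat bone patch and a plate 1 mm above it.
xx, yy = np.meshgrid(np.arange(10.0), np.arange(10.0))
bone = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])
plate = bone + np.array([0.0, 0.0, 1.0])  # uniform 1 mm offset

rois = {
    "proximal": plate[:, 1] < 3,
    "middle": (plate[:, 1] >= 3) & (plate[:, 1] < 7),
    "distal": plate[:, 1] >= 7,
}
for region, (gap, ok) in fit_check(bone, plate, rois).items():
    print(f"{region}: min gap {gap:.2f} mm -> {'fit' if ok else 'poor fit'}")
```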