955 results for analytical methods
Abstract:
Advances in safety research—trying to improve the collective understanding of motor vehicle crash causes and contributing factors—rest upon the pursuit of numerous lines of research inquiry. The research community has focused considerable attention on analytical methods development (negative binomial models, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might logically seek to know which lines of inquiry might provide the most significant improvements in understanding crash causation and/or prediction. It is the contention of this paper that the exclusion of important variables (causal or surrogate measures of causal variables) causes omitted variable bias in model estimation and is an important and neglected line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are omitted from crash models—but they offer significant opportunities to better understand contributing factors and/or causes of crashes. This study examines the role of important variables (other than Average Annual Daily Traffic (AADT)) that are generally omitted from intersection crash prediction models. In addition to geometric and traffic-regulatory information about the intersection, the proposed model includes many spatial factors, such as local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools—representing a mix of potential environmental and human factors that are theoretically important but rarely used. Results suggest that these variables, in addition to AADT, have significant explanatory power, and that their exclusion leads to omitted variable bias. Evidence is provided that variable exclusion overstates the effect of minor-road AADT by as much as 40% and major-road AADT by 14%.
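As a hedged illustration of the omitted-variable-bias argument (not the paper's own data or model), the following Python sketch simulates crash counts whose mean depends on log AADT and a correlated spatial factor, then fits negative binomial models with and without that factor; all variable names and effect sizes are invented for illustration.

```python
# Sketch: omitted variable bias in a negative binomial crash model.
# All variable names and effect sizes are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

log_aadt = rng.normal(9.0, 0.6, n)                 # log of minor-road AADT
near_school = rng.binomial(1, 0.3, n)              # "omitted" spatial factor
# Make the spatial factor correlated with AADT (busier roads near schools).
log_aadt += 0.4 * near_school

mu = np.exp(-6.0 + 0.7 * log_aadt + 0.5 * near_school)
crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))  # NB counts

X_full = sm.add_constant(np.column_stack([log_aadt, near_school]))
X_omit = sm.add_constant(log_aadt)

full = sm.NegativeBinomial(crashes, X_full).fit(disp=0)
omit = sm.NegativeBinomial(crashes, X_omit).fit(disp=0)

print("AADT coefficient, full model   :", round(full.params[1], 3))
print("AADT coefficient, omitted model:", round(omit.params[1], 3))  # inflated
```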
Abstract:
A multiple reaction monitoring mass spectrometric assay for the quantification of PYY in human plasma has been developed. A two-stage sample preparation protocol was employed in which plasma containing the full-length neuropeptide was first digested using trypsin, followed by solid-phase extraction to isolate the digested peptide from the complex plasma matrix. The peptide extracts were analysed by LC-MS using multiple reaction monitoring to detect and quantify PYY. The method has been validated for plasma samples, yielding linear responses over the range 5–1,000 ng mL−1. The method is rapid, robust and specific for plasma PYY detection.
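A minimal Python sketch of the calibration-curve step implied by the validated linear range is shown below; the calibrator levels, peak-area ratios and the unknown value are invented for illustration and are not assay data.

```python
# Sketch: linear calibration curve for an MRM assay and back-calculation
# of an unknown concentration. All numbers are illustrative, not assay data.
import numpy as np

# Calibrator concentrations (ng/mL) spanning the validated 5-1000 ng/mL range.
conc = np.array([5, 25, 100, 250, 500, 1000], dtype=float)
# Peak-area ratios (analyte / internal standard) - hypothetical values.
area = np.array([0.021, 0.10, 0.41, 1.02, 2.05, 4.10])

slope, intercept = np.polyfit(conc, area, 1)      # weighted fits are also common
r2 = np.corrcoef(conc, area)[0, 1] ** 2

unknown_area = 0.83
unknown_conc = (unknown_area - intercept) / slope
print(f"r^2 = {r2:.4f}, estimated concentration = {unknown_conc:.1f} ng/mL")
```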
Abstract:
Despite their ecological significance as decomposers and their evolutionary significance as the most speciose eusocial insect group outside the Hymenoptera, termite (Blattodea: Termitoidae or Isoptera) evolutionary relationships have yet to be well resolved. Previous morphological and molecular analyses strongly conflict at the family level and are marked by poor support for backbone nodes. A mitochondrial (mt) genome phylogeny of termites was produced to test relationships between the recognised termite families, improve nodal support and test the phylogenetic utility of rare genomic changes found in the termite mt genome. Complete mt genomes were sequenced for 7 of the 9 extant termite families, with additional representatives of each of the two most speciose families, Rhinotermitidae (3 of 7 subfamilies) and Termitidae (3 of 8 subfamilies). The mt genome of the well-supported sister group of termites, the subsocial cockroach Cryptocercus, was also sequenced. A highly supported tree of termite relationships was produced by all analytical methods and data treatment approaches; however, the relationship of the termites + Cryptocercus clade to other cockroach lineages was strongly affected by the pronounced nucleotide compositional bias found in termites relative to other dictyopterans. The phylogeny supports previously proposed suprafamilial termite lineages, the Euisoptera and Neoisoptera, a later-derived Kalotermitidae as sister group of the Neoisoptera, and a monophyletic clade of dampwood (Stolotermitidae, Archotermopsidae) and harvester termites (Hodotermitidae). In contrast to previous termite phylogenetic studies, nodal supports were very high for family-level relationships within termites. Two rare genomic changes in the mt genome control region were found to be molecular synapomorphies for major clades. An elongated stem-loop structure defined the clade Polyphagidae + (Cryptocercus + termites), and a further series of compensatory base changes in this stem-loop is synapomorphic for the Neoisoptera. The complicated repeat structures first identified in Reticulitermes, composed of short (A-type) and long (B-type) repeats, define the clade Heterotermitinae + Termitidae, while the secondary loss of A-type repeats is synapomorphic for the non-macrotermitine Termitidae.
Abstract:
Ratchetting failure of the railhead material adjacent to the endpost, which is placed in the air gap between the two rail ends at insulated rail joints, causes significant economic problems for railway operators, who rely on the proper functioning of these joints for train control using the signalling track circuitry. The ratchetting failure is a localised problem and is very difficult to predict even when complex analytical methods are employed. This paper presents a novel experimental technique that enables measurement of the progressive ratchetting. A special-purpose test rig was developed for this purpose and commissioned by the Centre for Railway Engineering at Central Queensland University. The rig also provides the capability of testing wheel/rail rolling contact conditions. The results provide confidence that accurate measurement of the localised failure of railhead material can be achieved using the test rig.
Abstract:
Proteoglycans (PGs) are crucial extracellular matrix (ECM) components that are present in all tissues and organs. Pathological remodeling of these macromolecules can lead to severe diseases such as osteoarthritis or rheumatoid arthritis. To date, PG-associated ECM alterations are routinely diagnosed by invasive analytical methods. Here, we employed Raman microspectroscopy, a laser-based, marker-free and non-destructive technique that allows the generation of spectra with peaks originating from molecular vibrations within a sample, to identify specific Raman bands that can be assigned to PGs within human and porcine cartilage samples and chondrocytes. Based on the non-invasively acquired Raman spectra, we further revealed that a prolonged in vitro culture leads to phenotypic alterations of chondrocytes, resulting in a decreased PG synthesis rate and loss of lipid contents. Our results are the first to demonstrate the applicability of Raman microspectroscopy as an analytical and potential diagnostic tool for non-invasive cell and tissue state monitoring of cartilage in biomedical research.
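A small, hypothetical Python sketch of the kind of band-level analysis described (locating peaks in a spectrum and comparing the intensity of a candidate PG-related band between conditions) is given below; the spectra, band positions and assignments are synthetic stand-ins, not measured Raman data.

```python
# Sketch: locating candidate Raman bands and comparing band intensity between
# two conditions. Wavenumbers and intensities are synthetic; real band
# assignments to proteoglycans would come from reference spectra.
import numpy as np
from scipy.signal import find_peaks

wavenumber = np.linspace(400, 1800, 1400)          # cm^-1 axis

def synthetic_spectrum(pg_level):
    """Baseline plus a few Gaussian bands; one band scales with PG content."""
    bands = [(1003, 1.0), (1062, pg_level), (1450, 0.8), (1660, 0.9)]
    y = 0.05 * np.ones_like(wavenumber)
    for center, height in bands:
        y += height * np.exp(-0.5 * ((wavenumber - center) / 8.0) ** 2)
    return y + np.random.default_rng(1).normal(0, 0.01, wavenumber.size)

fresh = synthetic_spectrum(pg_level=1.0)
cultured = synthetic_spectrum(pg_level=0.5)        # lower PG synthesis in culture

peaks, _ = find_peaks(fresh, prominence=0.2)
print("candidate bands (cm^-1):", np.round(wavenumber[peaks]).astype(int))
idx = np.argmin(np.abs(wavenumber - 1062))         # hypothetical PG-related band
print("band intensity ratio (cultured/fresh):", round(cultured[idx] / fresh[idx], 2))
```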
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, thereby reducing their utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and simpler alternative, yet equivalent, representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to 3 orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer-aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like the ones made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
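As a rough illustration of the kind of geometric test that tetrahedral-mesh navigation relies on (and only that; this is plain Python, not GEANT4 code or the optimised navigator described above), a standard signed-volume point-in-tetrahedron check can be sketched as follows.

```python
# Sketch: point-in-tetrahedron test via signed volumes (barycentric sign check).
# This stands in for tetrahedral-mesh navigation; it is not GEANT4 code.
import numpy as np

def signed_volume(a, b, c, d):
    """Signed volume of tetrahedron (a, b, c, d); the sign encodes orientation."""
    return np.dot(np.cross(b - a, c - a), d - a) / 6.0

def point_in_tetrahedron(p, verts, eps=1e-12):
    """True if point p lies inside (or on) the tetrahedron with vertices verts."""
    a, b, c, d = (np.asarray(v, dtype=float) for v in verts)
    p = np.asarray(p, dtype=float)
    vols = [
        signed_volume(p, b, c, d),
        signed_volume(a, p, c, d),
        signed_volume(a, b, p, d),
        signed_volume(a, b, c, p),
    ]
    # Inside when all sub-volumes share the sign of the full tetrahedron.
    total = signed_volume(a, b, c, d)
    return all(v * total >= -eps for v in vols)

tet = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(point_in_tetrahedron((0.2, 0.2, 0.2), tet))   # True
print(point_in_tetrahedron((0.9, 0.9, 0.9), tet))   # False
```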
Abstract:
In our laboratory we have developed a quantitative polymerase chain reaction (Q-PCR) strategy to examine the differential expression of the adenosine receptors (ADOR) A(1), A(2A), A(2B) and A(3), and the estrogen receptors (ER) alpha and beta. Brain and uterine mRNA were first used to optimise specific amplification conditions prior to SYBR Green I real-time analysis of receptor subtype expression. SYBR Green I provided a convenient and sensitive means of examining specific PCR amplification product in real time, and allowed the generation of standard curves from which relative receptor abundance could be determined. Real-time Q-PCR analysis was then performed to examine changes in receptor expression levels in the brains of adult female Wistar rats 3 months post-ovariectomy. Comparison with sham-operated, age-matched control rats demonstrated both comparative and absolute copy-number changes in receptor levels. The evaluation of both analytical methods investigated 18S rRNA as an internal reference for comparative gene expression analysis in the brain. The results of this study revealed preferential repression of ADORA(2A) (>4-fold down) and consistent (>2-fold) down-regulation of ADORA(1), ADORA(3), and ER-beta following ovariectomy. No change was found in ADORA(2B) or ER-alpha. Analysis of absolute copy number in this study revealed a correlation between receptor expression in response to ovariectomy and relative receptor subtype abundance in the brain.
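A brief Python sketch of standard-curve quantification with normalisation to an 18S rRNA reference, in the spirit of the workflow described, is given below; the standard-curve parameters and Ct values are invented for illustration.

```python
# Sketch: standard-curve based Q-PCR quantification and fold change relative
# to an 18S rRNA reference. All Ct values and curve parameters are invented.

def copies_from_ct(ct, slope, intercept):
    """Standard curve: Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical curve parameters (slope ~ -3.32 implies ~100% PCR efficiency).
slope, intercept = -3.32, 38.0

samples = {
    "sham":        {"ADORA2A": 27.8, "18S": 10.1},
    "ovariectomy": {"ADORA2A": 30.0, "18S": 10.2},
}

rel = {}
for group, ct in samples.items():
    target = copies_from_ct(ct["ADORA2A"], slope, intercept)
    ref = copies_from_ct(ct["18S"], slope, intercept)
    rel[group] = target / ref                       # normalise to 18S rRNA

fold_change = rel["ovariectomy"] / rel["sham"]
print(f"ADORA2A fold change after ovariectomy: {fold_change:.2f}")  # < 1 => repression
```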
Abstract:
The condition of bridges deteriorates with age due to different critical factors, including changes in loading, fatigue, environmental effects and natural events. In order to rate a network of bridges based on their structural condition, the condition of the components of a bridge and their effects on the behaviour of the bridge should be reliably estimated. In this paper, a new method for quantifying the criticality and vulnerability of the components of the railway bridges in a network will be introduced. The types of structural analysis for identifying the criticality of the components for carrying train loads will be determined. In addition, the analytical methods for identifying the vulnerability of the components to natural events with a significant probability of occurrence, such as flood, wind, earthquake and collision, will be determined. In order to keep this method practical for application to a network of thousands of railway bridges, the simplicity of the structural analysis has been taken into account. Demand-by-capacity ratios of the components at both safety and serviceability condition states, as well as weighting factors used in current bridge management systems (BMS), are taken into consideration. It will be explained what types of information related to the structural condition of a bridge are required to be obtained, recorded and analysed. The authors of this paper will use this method in a new rating system introduced previously. Enhancing the accuracy and reliability of evaluating and predicting the vulnerability of railway bridges to environmental effects and natural events will be the significant achievement of this research.
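A hypothetical Python sketch of how demand-by-capacity ratios and component weighting factors might be combined into a single bridge score is shown below; the component names, weights and ratios are illustrative assumptions, not the authors' calibrated values.

```python
# Sketch: combining demand/capacity (D/C) ratios with component weighting
# factors into a single bridge score. Names, weights and ratios are hypothetical.

components = {
    # component: (weighting factor, D/C at safety state, D/C at serviceability state)
    "main girder": (0.40, 0.85, 0.70),
    "pier":        (0.30, 0.90, 0.60),
    "bearing":     (0.20, 0.65, 0.55),
    "deck":        (0.10, 0.50, 0.45),
}

def bridge_score(components, safety_weight=0.7):
    """Weighted sum of D/C ratios; higher scores flag more critical bridges."""
    score = 0.0
    for weight, dc_safety, dc_service in components.values():
        combined = safety_weight * dc_safety + (1 - safety_weight) * dc_service
        score += weight * combined
    return score

print(f"criticality score: {bridge_score(components):.3f}")
```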
Abstract:
Fossils and sediments preserved in caves are an excellent source of information for investigating the impacts of past environmental changes on biodiversity. Until recently, studies have relied on morphology-based palaeontological approaches, but recent advances in molecular analytical methods offer excellent potential for extracting a greater array of biological information from these sites. This study presents a thorough assessment of DNA preservation from late Pleistocene–Holocene vertebrate fossils and sediments from Kelly Hill Cave, Kangaroo Island, South Australia. Using a combination of extraction techniques and sequencing technologies, ancient DNA was characterised from over 70 bones and 20 sediment samples from 15 stratigraphic layers ranging in age from >20 ka to ∼6.8 ka. A combination of primers targeting marsupial and placental mammals and reptiles, together with two universal plant primers, was used to reveal genetic biodiversity for comparison with the mainland and with the morphological fossil record for Kelly Hill Cave. We demonstrate that Kelly Hill Cave has excellent long-term DNA preservation, back to at least 20 ka. This contrasts with the majority of Australian cave sites thus far explored for ancient DNA preservation, and highlights the great promise Kangaroo Island caves hold for yielding the hitherto-elusive DNA of extinct Australian Pleistocene species.
Abstract:
The 2008 NASA Astrobiology Roadmap provides one way of theorising this developing field, a way which has become the normative model for the discipline: science- and scholarship-driven funding for space. By contrast, a novel re-evaluation of funding policies is undertaken in this article to reframe astrobiology, terraforming and associated space travel and research. Textual visualisation, discourse and numeric analytical methods, and value theory are applied to historical data and contemporary sources to re-investigate significant drivers of and constraints on the mechanisms enabling space exploration. Two data sets are identified and compared: the business objectives and outcomes of major 15th–17th century European joint-stock exploration and trading companies, and a case study of a current space-industry entrepreneur company. Comparison of these analyses suggests that viable funding policy drivers can exist outside the normative science- and scholarship-driven roadmap. The two drivers identified in this study are (1) the intrinsic value of space as a territory to be experienced and enjoyed, not just studied, and (2) the instrumental, commercial value of exploiting these experiences by developing infrastructure and retail revenues. Filtering of these results also offers an investment rationale for companies operating in, or about to enter, the space business marketplace.
Abstract:
Several analytical methods for Dynamic System Optimum (DSO) assignment have been proposed, but they basically fall into two kinds. This chapter attempts to establish DSO by equilibrating the path dynamic marginal time (DMT). The authors analyzed the path DMT for a single path with tandem bottlenecks and showed that the path DMT is not the simple summation of the DMT associated with each bottleneck along the path. Next, the authors examined the DMT of several paths passing through a common bottleneck. It is shown that the externality at the bottleneck is shared by the paths in proportion to their demand from the current time until the queue vanishes. This sharing of the externality is caused by the departure rate shift under first in, first out (FIFO), and the externality propagates to the downstream bottlenecks. However, the externalities propagated downstream are cancelled out if downstream bottlenecks exist. Therefore, the authors concluded that the path DMT can be evaluated without considering the propagation of the externalities, just as in the evaluation of the path DMT for a single path passing through a series of bottlenecks between the origin and destination. Based on the DMT analysis, the authors finally proposed a heuristic solution algorithm and verified it by comparing the numerical solution with the analytical one.
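As a generic illustration of the bottleneck externality idea (a deterministic point queue, not the chapter's own formulation or algorithm), the Python sketch below checks numerically that the marginal increase in total delay from one extra vehicle entering at time t0 is approximately the time from t0 until the queue vanishes; the demand profile and capacity are invented.

```python
# Sketch: numerical check that, for a deterministic point-queue bottleneck,
# the increase in total delay caused by one extra vehicle entering at t0 is
# (to first order) the time from t0 until the queue vanishes.
import numpy as np

def total_delay(arrival_rate, capacity, dt):
    """Total vehicle-hours of delay for a deterministic point queue."""
    queue, delay = 0.0, 0.0
    for rate in arrival_rate:
        queue = max(0.0, queue + (rate - capacity) * dt)
        delay += queue * dt
    return delay

dt = 0.001
t = np.arange(0.0, 3.0, dt)
capacity = 1000.0                                   # veh/h
arrival = np.where(t < 1.0, 1500.0, 400.0)          # peak demand, then low demand

base = total_delay(arrival, capacity, dt)

t0 = 0.5                                            # add one extra vehicle at t0
perturbed = arrival.copy()
perturbed[round(t0 / dt)] += 1.0 / dt               # a one-step pulse of 1 vehicle
marginal = total_delay(perturbed, capacity, dt) - base

# Analytical check: the queue grows at 500 veh/h until t = 1, then drains at
# 600 veh/h, vanishing at t = 1 + 500/600 ~= 1.833 h.
queue_clears = 1.0 + 500.0 / 600.0
print(f"numerical marginal delay : {marginal:.3f} h")
print(f"time until queue vanishes: {queue_clears - t0:.3f} h")
```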
Abstract:
As a sequel to a paper that dealt with the analysis of two-way quantitative data in large germplasm collections, this paper presents analytical methods appropriate for two-way data matrices consisting of mixed data types, namely ordered multicategory and quantitative data. While various pattern analysis techniques have been identified as suitable for analysis of the mixed data types which occur in germplasm collections, the clustering and ordination methods used often cannot deal explicitly with the computational consequences of large data sets (i.e. greater than 5000 accessions) with incomplete information. However, it is shown that the ordination technique of principal component analysis and the mixture maximum likelihood method of clustering can be employed to achieve such analyses. Germplasm evaluation data for 11436 accessions of groundnut (Arachis hypogaea L.) from the International Crops Research Institute for the Semi-Arid Tropics, Andhra Pradesh, India were examined. Data for nine quantitative descriptors measured in the post-rainy season and five ordered multicategory descriptors were used. Pattern analysis results generally indicated that the accessions could be distinguished into four regions along the continuum of growth habit (or plant erectness). Interpretation of accession membership in these regions was found to be consistent with taxonomic information, such as subspecies. Each growth habit region contained accessions from three of the most common groundnut botanical varieties. This implies that within each of the habit types there is the full range of expression for the other descriptors used in the analysis. Using these types of insights, the patterns of variability in germplasm collections can provide scientists with valuable information for their plant improvement programs.
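A compact Python sketch of the ordination-plus-model-based-clustering workflow described (principal component analysis followed by mixture maximum likelihood clustering, here via a Gaussian mixture from scikit-learn) is given below; the synthetic descriptors only stand in for real germplasm evaluation data.

```python
# Sketch: PCA ordination followed by mixture-model (maximum likelihood)
# clustering, as one way to handle large mixed-type evaluation data.
# The synthetic descriptors below only stand in for real germplasm data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 1000

# Quantitative descriptors plus ordered multicategory descriptors (e.g. growth
# habit scored 1-4); all values are hypothetical.
growth_habit = rng.integers(1, 5, n)
quantitative = rng.normal(growth_habit[:, None] * 2.0, 1.0, size=(n, 9))
ordered_cats = np.column_stack(
    [np.clip(growth_habit + rng.integers(-1, 2, n), 1, 5) for _ in range(5)]
)
X = StandardScaler().fit_transform(np.column_stack([quantitative, ordered_cats]))

scores = PCA(n_components=3).fit_transform(X)       # ordination step
labels = GaussianMixture(n_components=4, random_state=0).fit_predict(scores)

for k in range(4):
    print(f"cluster {k}: n = {np.sum(labels == k)}, "
          f"mean growth habit = {growth_habit[labels == k].mean():.2f}")
```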