855 results for quantifying heteroskedasticity
Abstract:
Quantifying the impact of biochemical compounds on collective cell spreading is an essential element of drug design, with various applications including developing treatments for chronic wounds and cancer. Scratch assays are a technically simple and inexpensive method used to study collective cell spreading; however, most previous interpretations of scratch assays are qualitative and do not provide estimates of the cell diffusivity, D, or the cell proliferation rate, λ. Estimating D and λ is important for investigating the efficacy of a potential treatment and provides insight into the mechanism through which the potential treatment acts. While a few methods for estimating D and λ have been proposed, these previous methods lead to point estimates of D and λ, and provide no insight into the uncertainty in these estimates. Here, we compare various types of information that can be extracted from images of a scratch assay, and quantify D and λ using discrete computational simulations and approximate Bayesian computation. We show that it is possible to robustly recover estimates of D and λ from synthetic data, as well as a new set of experimental data. For the first time, our approach also provides a method to estimate the uncertainty in our estimates of D and λ. We anticipate that our approach can be generalized to deal with more realistic experimental scenarios in which we are interested in estimating D and λ, as well as additional relevant parameters such as the strength of cell-to-cell adhesion or the strength of cell-to-substrate adhesion.
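The abstract pairs stochastic simulation with approximate Bayesian computation (ABC) to turn scratch-assay observations into posterior estimates of D and λ. A minimal rejection-ABC sketch of that idea, with made-up summary statistics (a logistic cell density and a diffusive spread length) standing in for the paper's image-derived data; priors, tolerances, and time scales here are illustrative assumptions, not the authors' choices:

```python
import numpy as np

def summaries(D, lam, t=48.0, n0=0.1):
    # Stand-ins for image-derived summary statistics: logistic cell density
    # at time t (growth rate lam) and a diffusive spread length ~ sqrt(4*D*t).
    density = 1.0 / (1.0 + (1.0 - n0) / n0 * np.exp(-lam * t))
    spread = np.sqrt(4.0 * D * t)
    return density, spread

def abc_rejection(obs_density, obs_spread, n=50_000, seed=0):
    # Rejection ABC: sample (D, lam) from uniform priors and keep draws whose
    # simulated summaries land within fixed tolerances of the observations.
    rng = np.random.default_rng(seed)
    D = rng.uniform(0.0, 2.0, n)       # prior on diffusivity (arbitrary units)
    lam = rng.uniform(0.0, 0.2, n)     # prior on proliferation rate
    density, spread = summaries(D, lam)
    keep = (np.abs(density - obs_density) < 0.05) & (np.abs(spread - obs_spread) < 0.5)
    return D[keep], lam[keep]

# Generate synthetic "observations" at known parameters, then recover them.
true_D, true_lam = 0.5, 0.05
obs_density, obs_spread = summaries(true_D, true_lam)
D_post, lam_post = abc_rejection(obs_density, obs_spread)
```

The accepted samples approximate a posterior, so their spread quantifies the uncertainty that point estimates hide, which is the advance the abstract emphasizes.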
Abstract:
Contralateral bones are often used in medical applications on the assumption that their bilateral differences are insignificant. Previous studies used a limited number of distance measurements to quantify the corresponding differences; therefore, little is known about bilateral 3D surface asymmetries. The aim of this study was to develop a comprehensive method to quantify geometrical asymmetries between the left and right tibia, in order to provide first results on whether the contralateral tibia can be used as an equivalent reference. In this study, 3D bone models were reconstructed from CT scans of seven pairs of tibiae, and 34 variables consisting of 2D and 3D measurements were taken from various anatomical regions. All 2D measurements, together with the lateral plateau and distal subchondral bone surface measurements, showed insignificant differences (p > 0.05), but the remaining surfaces showed significant differences (p < 0.05). Our results suggest that the contralateral tibia can be used as a reference, especially in surgical applications such as articular reconstructions, since the bilateral differences in the subchondral bone surfaces were less than 0.3 mm. The method is also potentially transferable to other studies that require the accurate quantification of bilateral bone asymmetries.
Abstract:
Lead compounds are known genotoxicants, principally affecting the integrity of chromosomes. Lead chloride and lead acetate induced concentration-dependent increases in micronucleus frequency in V79 cells, starting at 1.1 μM lead chloride and 0.05 μM lead acetate. The difference between the lead salts, which was expected based on their relative abilities to form complex acetato-cations, was confirmed in an independent experiment. CREST analyses of the micronuclei verified that lead chloride and acetate were predominantly aneugenic (CREST-positive response), which was consistent with the morphology of the micronuclei (larger micronuclei, compared with micronuclei induced by a clastogenic mechanism). The effects of high concentrations of lead salts on the microtubule network of V79 cells were also examined using immunofluorescence staining. The dose effects of these responses were consistent with the cytotoxicity of lead(II), as visualized in the neutral-red uptake assay. In a cell-free system, 20-60 μM lead salts inhibited tubulin assembly dose-dependently. The no-observed-effect concentration of lead(II) in this assay was 10 μM. This inhibitory effect was interpreted as a shift of the assembly/disassembly steady-state toward disassembly, e.g., by reducing the concentration of assembly-competent tubulin dimers. The effects of lead salts on microtubule-associated motor-protein functions were studied using a kinesin-gliding assay that mimics intracellular transport processes in vitro by quantifying the movement of paclitaxel-stabilized microtubules across a kinesin-coated glass surface. There was a dose-dependent effect of lead nitrate on microtubule motility. Lead nitrate affected the gliding velocities of microtubules starting at concentrations above 10 μM and reached half-maximal inhibition of motility at about 50 μM. The processes reported here point to relevant interactions of lead with tubulin and kinesin at low dose levels.
Abstract:
Cane fibre content has increased over the past ten years. Some of that increase can be attributed to new varieties selected for release. This paper reviews the existing methods for quantifying the fibre characteristics of a variety, including fibre content and fibre quality measurements – shear strength, impact resistance and short fibre content. The variety selection process is presented and it is reported that fibre content has zero weighting in the current selection index. An updated variety selection approach is proposed, potentially replacing the existing selection process relating to fibre. This alternative approach involves the use of a more complex mill area level model that accounts for harvesting, transport and processing equipment, taking into account capacity, efficiency and operational impacts, along with the end use for the bagasse. The approach will ultimately determine a net economic value for the variety. The methodology lends itself to a determination of the fibre properties that have a significant impact on the economic value so that variety tests can better target the critical properties. A low-pressure compression test is proposed as a good test to provide an assessment of the impact of a variety on milling capacity. NIR methodology is proposed as a technology to lead to a more rapid assessment of fibre properties, and hence the opportunity to more comprehensively test for fibre impacts at an earlier stage of variety development.
Abstract:
Background: Human saliva mirrors the body's health and can be collected non-invasively; collection does not require specialized skills and is suitable for large population-based screening programs. The aims were twofold: to evaluate the suitability of commercially available saliva collection devices for quantifying proteins present in saliva, and to provide levels of C-reactive protein (CRP), myoglobin, and immunoglobulin E (IgE) in the saliva of healthy individuals as a baseline for future studies. Methods: Saliva was collected from healthy volunteers (n = 17, ages 18-33 years). The following collection methods were evaluated: drool; Salimetrics (R) Oral Swab (SOS); Salivette (R) Cotton and Synthetic (Sarstedt); and Greiner Bio-One Saliva Collection System (GBO SCS (R)). We used AlphaLISA (R) assays to measure CRP, IgE and myoglobin levels in human saliva. Results: Significant (p < 0.05) differences in salivary flow rates were observed based on the method of collection: flow rates were significantly lower for unstimulated saliva (i.e. drool and SOS) than for the mechanically stimulated methods (Salivette (R) Cotton and Synthetic) and the acid-stimulated method (SCS (R)). Saliva collected using the SOS yielded significantly (p < 0.05) lower concentrations of myoglobin and CRP, whilst saliva collected using the Salivette (R) Cotton and Synthetic swabs yielded significantly (p < 0.05) lower myoglobin and IgE concentrations, respectively. Conclusions: The results demonstrated significant differences in analyte levels based on the collection method. Significant differences in salivary flow rates were also observed depending on the collection method. The data provide preliminary baseline values for salivary CRP, myoglobin, and IgE levels in healthy participants by collection method. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
In current practice, urban-rural development has been regarded as one of the key pillars in driving regenerative development that balances economic, social, and environmental concerns. In association with rapid urbanization, an important contemporary issue in China is that its rural areas increasingly lag behind urban areas in their development, and a coordinated provision of public facilities in rural areas is necessary to achieve a better balance. A model is therefore introduced for quantifying the effect of individual infrastructure projects on urban-rural balance (e-UR) by focusing on two attributes, namely efficiency and equity. The approach is demonstrated through a multi-criteria model, developed with data collected from infrastructure projects in Chongqing, with the criteria values for each project scored by comparing data collected from the project with e-UR neutral "benchmark" values derived from a survey of experts in the field. The model helps evaluate the contribution of projects to improving rural-urban balance, and hence enables government decision-makers, for the first time, to rigorously prioritize future projects in terms of their likely contribution.
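The scoring idea described above can be sketched in a few lines: criterion values for a project are compared against "e-UR neutral" benchmark values and combined with weights. All numbers, criteria names, and example projects below are hypothetical, not Chongqing data:

```python
# Hypothetical e-UR scoring sketch: expert-derived "neutral" benchmarks and
# criterion weights (both assumed values, for illustration only).
benchmarks = {"efficiency": 0.6, "equity": 0.5}
weights = {"efficiency": 0.5, "equity": 0.5}

def e_ur_score(project):
    # Positive score: the project improves urban-rural balance relative to
    # the neutral benchmark; negative: it worsens it.
    return sum(weights[c] * (project[c] - benchmarks[c]) for c in benchmarks)

# Two made-up candidate projects scored on the same criteria.
road = {"efficiency": 0.8, "equity": 0.4}
school = {"efficiency": 0.55, "equity": 0.9}
ranked = sorted([("road", e_ur_score(road)), ("school", e_ur_score(school))],
                key=lambda kv: kv[1], reverse=True)
```

Ranking projects by such a score is what would let decision-makers prioritize future investments, as the abstract suggests.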
Abstract:
This thesis investigated in detail the physics of the small X-ray fields used in radiotherapy treatments. As a result of this work, the ability to accurately measure dose from these very small X-ray fields has been improved in several ways. These include scientifically quantifying when highly accurate measurements are required, by introducing the concept of a 'very small field', and the invention of a new detector that responds the same way in very small fields as in normal fields.
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
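The paper's Bayesian semi-parametric hierarchical model is beyond a few lines, but the core idea of attributing variability to the scales of a spatial hierarchy can be illustrated with a crude method-of-moments decomposition on simulated data. The hierarchy, variances, and sample sizes below are assumptions for illustration, not Great Barrier Reef measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate coral cover on a three-level hierarchy (reefs > sites > transects),
# injecting variability at each scale with assumed standard deviations.
n_reefs, n_sites, n_trans = 20, 5, 10
reef_eff = rng.normal(0.0, 10.0, size=(n_reefs, 1, 1))
site_eff = rng.normal(0.0, 4.0, size=(n_reefs, n_sites, 1))
noise = rng.normal(0.0, 2.0, size=(n_reefs, n_sites, n_trans))
cover = 40.0 + reef_eff + site_eff + noise

# Method-of-moments variance decomposition: each level's sample variance is
# corrected for the sampling noise leaking up from the level below.
var_noise = cover.var(axis=2, ddof=1).mean()
var_site = cover.mean(axis=2).var(axis=1, ddof=1).mean() - var_noise / n_trans
var_reef = (cover.mean(axis=(1, 2)).var(ddof=1)
            - var_site / n_sites - var_noise / (n_sites * n_trans))
```

With most of the variance sitting at the reef level, aggregate conclusions across reefs carry the most uncertainty, which mirrors the abstract's suggestion to focus management at the scale of individual reefs.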
Abstract:
Many areas of biochemistry and molecular biology, both fundamental and applications-orientated, require an accurate construction, representation and understanding of the protein molecular surface and its interaction with other, usually small, molecules. There are, however, many situations in which the protein molecular surface comes into physical contact with larger objects, either biological, such as membranes, or artificial, such as nanoparticles. This contribution presents a methodology for describing and quantifying the molecular properties of proteins by geometrical and physico-chemical mapping of their molecular surfaces, with several analytical relationships proposed for molecular surface properties. The relevance of the molecular surface-derived properties has been demonstrated through the calculation of the statistical strength of the prediction of protein adsorption. It is expected that the extension of this methodology to other phenomena involving proteins near solid surfaces, in particular protein interaction with nanoparticles, will result in important benefits for the understanding and design of protein-specific solid surfaces. © 2013 Nicolau et al.
Abstract:
The characterisation of facial expression through landmark-based analysis methods such as FACEM (Pilowsky & Katsikitis, 1994) has a variety of uses in psychiatric and psychological research. In these systems, important structural relationships are extracted from images of facial expressions by the analysis of a pre-defined set of feature points. These relationship measures may then be used, for instance, to assess the degree of variability and similarity between different facial expressions of emotion. FaceXpress is a multimedia software suite that provides a generalised workbench for landmark-based facial emotion analysis and stimulus manipulation. It is a flexible tool that is designed to be specialised at runtime by the user. While FaceXpress has been used to implement the FACEM process, it can also be configured to support any other similar, arbitrary system for quantifying human facial emotion. FaceXpress also implements an integrated set of image processing tools and specialised tools for facial expression stimulus production including facial morphing routines and the generation of expression-representative line drawings from photographs.
Abstract:
This paper presents a new metric, which we call the lighting variance ratio, for quantifying descriptors in terms of their variance under illumination changes. In many applications it is desirable to have descriptors that are robust to changes in illumination, especially in outdoor environments. The lighting variance ratio is useful for comparing descriptors and for determining whether a descriptor is sufficiently lighting invariant for a given environment. The metric is analysed across a number of datasets, cameras and descriptors. The results show that the upright SIFT descriptor is typically the most lighting invariant descriptor.
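The abstract does not spell out the metric's formula, so the sketch below is one plausible formalization, not the paper's definition: compare the variance of a scene point's descriptor across lighting conditions against the variance between different points, so that lower ratios mean more lighting-invariant descriptors:

```python
import numpy as np

def lighting_variance_ratio(descs):
    # descs: shape (n_points, n_lightings, dim), one descriptor per scene
    # point per lighting condition.
    # Within-point variance: how much a point's descriptor moves with lighting.
    within = descs.var(axis=1).mean()
    # Between-point variance: how spread out different points' descriptors
    # are, measured on per-point mean descriptors.
    between = descs.mean(axis=1).var(axis=0).mean()
    return within / between  # lower => more lighting invariant

# Synthetic descriptors: 100 points, 5 lighting conditions, 64 dimensions.
rng = np.random.default_rng(1)
points = rng.normal(size=(100, 1, 64))
stable = points + 0.01 * rng.normal(size=(100, 5, 64))   # lighting barely matters
unstable = points + 0.5 * rng.normal(size=(100, 5, 64))  # lighting dominates
r_stable = lighting_variance_ratio(stable)
r_unstable = lighting_variance_ratio(unstable)
```

A ratio like this lets two descriptors be compared on the same imagery, which is the comparison use case the abstract describes.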
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives: a pure management objective, a pure learning objective, and an objective that is a weighted mixture of the two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved. © 2010 Elsevier Ltd.
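The weighted-mixture objective above can be illustrated with a stylized sketch: with two candidate system models, an action's management value is its expected benefit under current beliefs, while its learning value is proxied by how strongly the models disagree about its outcome. The action set, benefit numbers, and learning proxy are all assumptions for illustration, not the paper's eight algorithms:

```python
import numpy as np

# Hypothetical setup: expected population benefit of each action under two
# competing models (A and B), plus the current belief that model A is correct.
benefit = np.array([[1.0, 0.9],   # action 0: models nearly agree (little learning)
                    [1.2, 0.2],   # action 1: models disagree strongly (informative)
                    [0.8, 0.8]])  # action 2: models identical (no learning)
p_model_a = 0.5

def management_value(a):
    # Expected benefit under the current model belief.
    return p_model_a * benefit[a, 0] + (1 - p_model_a) * benefit[a, 1]

def learning_value(a):
    # Crude proxy for expected information gain: the models' disagreement,
    # since observing the outcome then discriminates between them.
    return abs(benefit[a, 0] - benefit[a, 1])

def choose_action(w):
    # w = 0: pure management objective; w = 1: pure learning objective;
    # intermediate w: the weighted mixture of the two.
    scores = [(1 - w) * management_value(a) + w * learning_value(a)
              for a in range(len(benefit))]
    return int(np.argmax(scores))
```

Sweeping w from 0 to 1 moves the chosen action from the safest immediate benefit toward the most informative experiment, which is exactly the trade-off the three objectives frame.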
Abstract:
Quantifying the competing rates of intake and elimination of persistent organic pollutants (POPs) in the human body is necessary to understand the levels and trends of POPs at a population level. In this paper we reconstruct the historical intake and elimination of ten polychlorinated biphenyls (PCBs) and five organochlorine pesticides (OCPs) from Australian biomonitoring data by fitting a population-level pharmacokinetic (PK) model. Our analysis exploits two sets of cross-sectional biomonitoring data for PCBs and OCPs in pooled blood serum samples from the Australian population that were collected in 2003 and 2009. The modeled adult reference intakes in 1975 for PCB congeners ranged from 0.89 to 24.5 ng/kg bw/day, lower than the daily intakes of OCPs ranging from 73 to 970 ng/kg bw/day. Modeled intake rates are declining with half-times from 1.1 to 1.3 years for PCB congeners and 0.83 to 0.97 years for OCPs. The shortest modeled intrinsic human elimination half-life among the compounds studied here is 6.4 years for hexachlorobenzene, and the longest is 30 years for PCB-74. Our results indicate that it is feasible to reconstruct intakes and to estimate intrinsic human elimination half-lives using the population-level PK model and biomonitoring data only. Our modeled intrinsic human elimination half-lives are in good agreement with values from a similar study carried out for the population of the United Kingdom, and are generally longer than reported values from other industrialized countries in the Northern Hemisphere.
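The abstract does not specify the population-level PK model's equations; a minimal one-compartment sketch with first-order elimination and an exponentially declining intake captures the qualitative behaviour. The parameter values below are illustrative, chosen within the ranges the abstract reports, not the paper's fitted values:

```python
import numpy as np

def body_burden(t_grid, intake0, intake_halftime, elim_halflife):
    # One-compartment PK sketch: dB/dt = I(t) - k*B, where intake I(t)
    # declines exponentially with half-time intake_halftime and elimination
    # is first-order with intrinsic half-life elim_halflife. Euler stepping.
    k_in = np.log(2) / intake_halftime
    k_out = np.log(2) / elim_halflife
    dt = t_grid[1] - t_grid[0]
    burden = np.zeros_like(t_grid)
    for i in range(1, len(t_grid)):
        intake = intake0 * np.exp(-k_in * t_grid[i - 1])
        burden[i] = burden[i - 1] + dt * (intake - k_out * burden[i - 1])
    return burden

# Illustrative run: intake half-time 1.2 years, elimination half-life 10 years
# (both inside the abstract's reported ranges), over 40 years from a 1975 start.
t = np.linspace(0.0, 40.0, 4001)
b = body_burden(t, intake0=10.0, intake_halftime=1.2, elim_halflife=10.0)
```

Because intake declines much faster than elimination, the body burden peaks and then decays at roughly the elimination rate; fitting such a model to two cross-sectional biomonitoring snapshots is what lets intake histories and intrinsic half-lives be reconstructed simultaneously.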
Abstract:
Water to air methane emissions from freshwater reservoirs can be dominated by sediment bubbling (ebullitive) events. Previous work to quantify methane bubbling from a number of Australian sub-tropical reservoirs has shown that this can contribute as much as 95% of total emissions. These bubbling events are controlled by a variety of different factors including water depth, surface and internal waves, wind seiching, atmospheric pressure changes and water level changes. Key to quantifying the magnitude of this emission pathway is estimating both the bubbling rate as well as the areal extent of bubbling. Both bubbling rate and areal extent are seldom constant and require persistent monitoring over extended time periods before true estimates can be generated. In this paper we present a novel system for persistent monitoring of both bubbling rate and areal extent using multiple robotic surface chambers and adaptive sampling (grazing) algorithms to automate the quantification process. Individual chambers are self-propelled and self-guided, and communicate with each other without the need for supervised control. They can maintain station at a sampling site for a desired incubation period and continuously monitor, record and report fluxes during the incubation. To exploit the methane sensor detection capabilities, the chamber can be automatically lowered to decrease the head-space and increase concentration. The grazing algorithms assign a hierarchical order to chambers within a preselected zone. Chambers then converge on the individual recording the highest 15-minute bubbling rate. Individuals maintain a specified distance apart from each other during each sampling period, before all individuals are required to move to different locations based on a sampling algorithm (systematic or adaptive) exploiting prior measurements. This system has been field tested on a large-scale subtropical reservoir, Little Nerang Dam, over monthly timescales.
Using this technique, localised bubbling zones on the water storage were found to produce over 50,000 mg m⁻² d⁻¹, and the areal extent ranged from 1.8 to 7% of the total reservoir area. The drivers behind these changes, as well as lessons learnt from the system implementation, are presented. This system exploits relatively cheap materials, sensing and computing, and can be applied to a wide variety of aquatic and terrestrial systems.
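A highly simplified sketch of one grazing cycle as described in the abstract: every chamber measures its local rate, the highest recorder becomes the leader, and the rest converge on it while keeping a minimum separation so the fleet samples distinct patches around the peak. The flux field, geometry, and step sizes are made up, and the abstract's between-cycle relocation (systematic or adaptive) is omitted:

```python
import numpy as np

def bubbling_rate(pos):
    # Hypothetical flux field: a single localised bubbling hotspot.
    hotspot = np.array([30.0, 70.0])
    return 1000.0 * np.exp(-np.sum((pos - hotspot) ** 2) / (2 * 15.0 ** 2))

def graze(positions, n_rounds=30, step=8.0, min_sep=3.0):
    # One grazing cycle: rank chambers by measured rate, then steer everyone
    # toward the highest recorder (the leader), stopping min_sep short of it
    # so chambers sample distinct patches rather than piling up.
    positions = positions.copy()
    rates = np.array([bubbling_rate(p) for p in positions])
    leader = positions[np.argmax(rates)].copy()
    for _ in range(n_rounds):
        for i, p in enumerate(positions):
            offset = leader - p
            dist = np.linalg.norm(offset)
            if dist > min_sep:
                positions[i] = p + min(step, dist - min_sep) * offset / dist
    return positions

# Four chambers scattered over a 100 m x 100 m zone converge on the best site.
rng = np.random.default_rng(2)
start = rng.uniform(0.0, 100.0, size=(4, 2))
final = graze(start)
```

After a cycle the fleet has clustered around the strongest bubbling measurement, concentrating chamber time where the flux signal is; repeating cycles from new starting layouts would build up the areal-extent estimate.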
Abstract:
Due to the increasing recognition of global climate change, the building and construction industry is under pressure to reduce carbon emissions. A central issue in striving towards reduced carbon emissions is the need for a practicable and meaningful yardstick for assessing and communicating greenhouse gas (GHG) results. ISO 14067 was published by the International Organization for Standardization in May 2013. By providing specific requirements for the life cycle assessment (LCA) approach, the standard clarifies GHG assessment in the areas of choosing system boundaries and simulating use and end-of-life phases when quantifying the carbon footprint of products (CFPs). More importantly, the standard, for the first time, provides step-by-step guidance and a standardized template for communicating CFPs in the form of a CFP external communication report, CFP performance tracking report, CFP declaration and CFP label. ISO 14067 therefore makes a valuable contribution to GHG quantification and to the transparent communication and comparison of CFPs. In addition, as cradle-to-grave should be used as the system boundary if use and end-of-life phases can be simulated, ISO 14067 will hopefully promote the development and implementation of simulation technologies, with Building Information Modelling (BIM) in particular, in the building and construction industry.