890 results for Percolation threshold


Relevance: 10.00%

Abstract:

Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method in which we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools, and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration, since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the cell density at which the leading edge is defined, over a range of approximately 1-5% of the maximum cell density.
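
To make the threshold sensitivity concrete, the following minimal sketch (not the authors' pipeline; the intensity profile and threshold values are hypothetical) shows how the detected leading-edge position shifts as the detection threshold is varied:

```python
# Illustrative sketch only: how the detected leading edge of a spreading cell population
# can depend on the intensity threshold. Assumes a 1D column-averaged intensity profile
# that decays from the confined region towards the spreading front; all values are made up.
import numpy as np

def leading_edge_position(profile, threshold):
    """Return the outermost index where the profile still exceeds the threshold."""
    above = np.nonzero(profile >= threshold)[0]
    return above.max() if above.size else None

# Synthetic, monotonically decaying profile standing in for a 24/48/72 h image column average.
x = np.linspace(0, 1, 500)
profile = np.exp(-5 * x)          # normalised intensity: 1.0 at the barrier, ~0 far away

for thr in (0.01, 0.03, 0.05):    # thresholds of ~1-5% of the maximum density
    edge = leading_edge_position(profile, thr)
    print(f"threshold={thr:.2f} -> detected edge at x={x[edge]:.3f}")
```

Even on this toy profile the detected edge moves appreciably between the 1% and 5% thresholds, which is the sensitivity the abstract describes.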

Relevance: 10.00%

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several meters without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detection of incorrect integers leads to a hazardous result and should be strictly controlled; in ambiguity resolution the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. The method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed based on extensive data simulation. The results show that, with a proper stochastic model, the fixed failure rate approach is a more reasonable ambiguity validation method.
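
As a minimal illustration of the ratio-test acceptance step discussed above (a generic sketch, not the paper's implementation; in the fixed failure rate approach the critical value would be looked up from the precomputed criteria table rather than set by hand):

```python
# Sketch of the ratio test for ambiguity validation. `q_best` and `q_second` are the
# weighted squared distances from the float ambiguity vector to the best and second-best
# integer candidates; `critical` is the acceptance threshold (table lookup in the fixed
# failure rate approach). Matrices and values below are hypothetical.
import numpy as np

def weighted_sq_dist(a_float, a_int, Q_inv):
    d = a_float - a_int
    return float(d @ Q_inv @ d)

def ratio_test(a_float, candidates, Q_inv, critical):
    """Accept the best integer candidate only if the ratio exceeds the critical value."""
    dists = sorted(weighted_sq_dist(a_float, c, Q_inv) for c in candidates)
    ratio = dists[1] / dists[0]            # second-best over best
    return ratio >= critical, ratio

# Toy example with a 2-dimensional ambiguity vector.
Q_inv = np.linalg.inv(np.array([[0.04, 0.01], [0.01, 0.03]]))
a_float = np.array([3.1, -2.2])
candidates = [np.array([3, -2]), np.array([3, -3]), np.array([4, -2])]
accepted, ratio = ratio_test(a_float, candidates, Q_inv, critical=2.0)
print(f"ratio = {ratio:.2f}, accepted = {accepted}")
```

The acceptance logic is the same in the empirical and fixed failure rate variants; only the way the critical value is chosen differs.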

Relevance: 10.00%

Abstract:

Introduction. Calculating segmental (vertebral level-by-level) torso masses in Adolescent Idiopathic Scoliosis (AIS) patients allows the gravitational loading on the scoliotic spine during relaxed standing to be determined. This study used CT scans of AIS patients to measure segmental torso masses and explored how joint moments in the coronal plane are affected by changes in the position of the intervertebral joint's axis of rotation, particularly at the apex of a scoliotic major curve. Methods. Existing low-dose CT data from the Paediatric Spine Research Group was used to calculate vertebral level-by-level torso masses and joint torques occurring in the spine for a group of 20 female AIS patients (mean age 15.0 ± 2.7 years, mean Cobb angle 53 ± 7.1°). The image processing software ImageJ (v1.45, NIH, USA) was used to threshold the T1 to L5 CT images and calculate the segmental torso volume and mass corresponding to each vertebral level. Body segment masses for the head, neck and arms were taken from published anthropometric data. Intervertebral (IV) joint torques at each vertebral level were found using principles of static equilibrium together with the segmental body mass data. Summing the torque contributions for each level above the required joint allowed the cumulative joint torque at a particular level to be found. Since there is some uncertainty in the position of the coronal plane Instantaneous Axis of Rotation (IAR) for scoliosis patients, it was assumed that the IAR was located at the centre of the IV disc. A sensitivity analysis was performed to see what effect the assumed IAR position had on the joint torques by moving it laterally 10 mm in both directions. Results. The magnitude of the torso masses from T1-L5 increased inferiorly, with a 150% increase in mean segmental torso mass from 0.6 kg at T1 to 1.5 kg at L5. The magnitudes of the calculated coronal plane joint torques during relaxed standing were typically 5-7 Nm at the apex of the curve, with the highest apex joint torque of 7 Nm being found in patient 13. Shifting the assumed IAR by 10 mm towards the convexity of the spine increased the joint torque at that level by a mean 9.0%, showing that the calculated joint torques were moderately sensitive to the assumed IAR location. When the IAR midline position was moved 10 mm away from the convexity of the spine, the joint torque reduced by a mean 8.9%. Conclusion. Coronal plane joint torques as high as 7 Nm can occur during relaxed standing in scoliosis patients, which may help to explain the mechanics of AIS progression. This study provides new anthropometric reference data on vertebral level-by-level torso mass in AIS patients, which will be useful for biomechanical models of scoliosis progression and treatment. However, the CT scans were performed in the supine position (with no gravitational load on the spine), and supine curve magnitudes are known to be smaller than those measured in standing.
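
The static-equilibrium summation described in the Methods can be sketched as follows (illustrative only: the segment masses, lateral offsets and IAR shift below are made up, not patient data):

```python
# Sketch of the cumulative coronal-plane joint torque calculation: sum m*g*d over all
# segments above the joint, where d is the lateral distance of each segment's centre of
# mass from the assumed IAR. All numbers below are invented for illustration.
G = 9.81  # m/s^2

def coronal_joint_torque(segments, iar_lateral_offset_m=0.0):
    """segments: list of (mass_kg, lateral_centroid_offset_m) for levels above the joint."""
    return sum(m * G * (d - iar_lateral_offset_m) for m, d in segments)

# Hypothetical segments above an apical joint.
segments = [(0.6, 0.010), (0.8, 0.025), (1.0, 0.040), (1.2, 0.055)]

baseline = coronal_joint_torque(segments)
shifted = coronal_joint_torque(segments, iar_lateral_offset_m=0.010)  # IAR moved 10 mm
print(f"baseline torque = {baseline:.2f} N·m, with 10 mm IAR shift = {shifted:.2f} N·m")
```

Moving the assumed IAR changes every segment's lever arm, which is why the computed torque is sensitive to the IAR location, as reported in the Results.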

Relevance: 10.00%

Abstract:

Experimentally, hydrogen-free diamond-like carbon (DLC) films have been assembled by pulsed laser deposition (PLD), in which energetic small carbon clusters are deposited on the substrate. In this paper, the chemisorption of energetic C2 and C10 clusters on the diamond (001)-(2×1) surface was investigated by molecular dynamics simulation. The influence of cluster size and impact energy on the structural character of the deposited clusters is mainly addressed. The impact energy was varied from a few tens of eV to 100 eV. Chemisorption of C10 was found to occur only when its incident energy is above a threshold value (Eth), whereas the C2 cluster adsorbed easily on the surface even at much lower incident energies. With increasing impact energy, the structures of the deposited C2 and C10 become different from those of the free clusters. Finally, the growth of films synthesized by energetic C2 and C10 clusters was simulated. The statistics indicate that the C2 cluster has a high probability of adsorption and that films assembled from C2 show a slightly higher sp3 fraction than C10 films, especially at higher impact energy and lower substrate temperature. Our results support the experimental findings; moreover, the simulation elucidates the deposition mechanism at the atomic scale.
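
The adsorption statistics mentioned above amount to Monte Carlo bookkeeping over many impact trajectories. The sketch below shows that bookkeeping only; run_trajectory is a stand-in for the actual Brenner-potential MD step, and the threshold energy used inside it is invented:

```python
# Illustrative statistics-gathering loop: for each incident energy, launch many trajectories
# with random impact position/orientation and record the fraction that chemisorb.
import random

def run_trajectory(energy_ev, rng):
    """Placeholder for one MD cluster impact; returns True if the cluster chemisorbed.
    A fake threshold-like response is used purely to exercise the statistics code."""
    e_th = 40.0                               # hypothetical threshold energy (eV)
    return rng.random() < max(0.0, min(1.0, (energy_ev - e_th) / 30.0))

def sticking_probability(energy_ev, n_traj=500, seed=0):
    rng = random.Random(seed)
    hits = sum(run_trajectory(energy_ev, rng) for _ in range(n_traj))
    return hits / n_traj

for e in (20, 40, 60, 80, 100):
    print(f"E = {e:3d} eV -> chemisorption probability ~ {sticking_probability(e):.2f}")
```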

Relevance: 10.00%

Abstract:

The impact-induced chemisorption of hydrocarbon molecules (CH3 and CH2) on the H-terminated diamond (001)-(2×1) surface was investigated by molecular dynamics simulation using the many-body Brenner potential. The deposition dynamics of the CH3 radical at impact energies of 0.1-50 eV per molecule was studied and the energy threshold for chemisorption was calculated. The impact-induced dissociation of hydrogen atoms and the dimer-opening mechanism on the surface were investigated. Furthermore, the probability of a dimer-opening event induced by chemisorption of CH3 was simulated by randomly varying the impact position as well as the orientation of the molecule relative to the surface. Finally, energetic hydrocarbons were modelled slowing down one after the other to simulate the initial fabrication of diamond-like carbon (DLC) films, and the structural characteristics of the synthesized films with different hydrogen fluxes were studied. Our results indicate that CH3, CH2 and H are highly reactive and important species in diamond growth. In particular, the fraction of C atoms in the film having sp3 hybridization is enhanced in the presence of H atoms, which is in good agreement with experimental observations.
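
A common way to obtain the sp3 fraction reported in such simulations is to classify each carbon atom by its coordination number within a bonding cutoff. The sketch below illustrates that post-processing step; the cutoff and the toy coordinates are assumptions, not values from the paper (a periodic film would also need minimum-image distances):

```python
# Illustrative post-processing: count each carbon's neighbours within a bonding cutoff
# and treat 4-fold coordinated carbons as sp3.
import numpy as np

def sp3_fraction(carbon_positions, cutoff=1.85):
    """carbon_positions: (N, 3) array of Cartesian coordinates in angstroms."""
    pos = np.asarray(carbon_positions, dtype=float)
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)            # ignore self-distances
    coordination = (dist < cutoff).sum(axis=1)
    return float(np.mean(coordination == 4))

# Tiny toy configuration just to show the call; a real film would have thousands of atoms.
toy = np.random.default_rng(1).uniform(0, 6.0, size=(50, 3))
print(f"sp3 fraction of toy configuration: {sp3_fraction(toy):.2f}")
```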

Relevance: 10.00%

Abstract:

The adsorption of low-energy C20 isomers on the diamond (001)-(2×1) surface was investigated by molecular dynamics simulation using the Brenner potential, and the energy dependence of the chemisorption characteristics was studied. We found that an energy threshold exists for chemisorption of C20 to occur. Between 10 and 20 eV, the C20 fullerene has a high probability of chemisorption and the adsorbed cage retains its original structure, which supports the experimental observations of memory effects. However, the structures of the adsorbed bowl and ring C20 isomers were different from their original ones; in this case, the local order in cluster-assembled films would differ from that of the free clusters.

Relevance: 10.00%

Abstract:

The deposition of hyperthermal CH3 on the diamond (001)-(2×1) surface at room temperature has been studied by means of molecular dynamics simulation using the many-body hydrocarbon potential. An energy threshold effect was observed: with fixed collision geometry, chemisorption can occur only when the incident energy of CH3 is above a critical value (Eth). With increasing incident energy, dissociation of hydrogen atoms from the incident molecule was observed. The chemisorption probability of CH3 as a function of its incident energy was calculated and compared with that of C2H2. We found that below 10 eV the chemisorption probability of C2H2 is much lower than that of CH3 on the same surface; interestingly, it is even lower than that of CH3 on a hydrogen-covered surface at the same impact energy. This indicates that the reactive CH3 molecule is a more important species than C2H2 in low-energy diamond synthesis, which is in good agreement with experimental observation.

Relevance: 10.00%

Abstract:

In this paper, the collision of a C36 cluster with D6h symmetry on the diamond (001)-(2×1) surface was investigated using molecular dynamics (MD) simulation based on the semi-empirical Brenner potential. The incident kinetic energy of the C36 ranged from 20 to 150 eV per cluster, and the collision dynamics was investigated as a function of the impact energy Ein. The C36 cluster was first impacted towards the center of two dimers with a fixed orientation. It was found that when Ein was lower than 30 eV, C36 bounced off the surface without breaking up. Increasing Ein to 30-45 eV, bonds formed between C36 and surface dimer atoms, and the adsorbed C36 retained its original free-cluster structure. Around 50-60 eV, the C36 rebounded from the surface with cage defects. Above 70 eV, fragmentation both in the cluster and on the surface was observed. Our simulation supports the experimental finding that during low-energy cluster beam deposition small fullerenes can keep their original structure after adsorption (i.e. the memory effect), provided Ein is within a certain range. Furthermore, we found that the energy threshold for chemisorption is sensitive to the orientation of the incident C36 and its impact position on the asymmetric surface.

Relevance: 10.00%

Abstract:

Background Knowledge of current trends in nurse-administered procedural sedation and analgesia (PSA) in the cardiac catheterisation laboratory (CCL) may provide important insights into how to improve the safety and effectiveness of this practice. Objective To characterise current practice as well as education and competency standards regarding nurse-administered PSA in Australian and New Zealand CCLs. Design A quantitative, cross-sectional, descriptive survey design was used. Methods Data were collected using a web-based questionnaire on practice, educational standards and protocols related to nurse-administered PSA. Descriptive statistics were used to analyse the data. Results A sample of 62 nurses, each from a different CCL, completed a questionnaire that focused on PSA practice. Over half of the estimated total number of CCLs in Australia and New Zealand was represented. Nurse-administered PSA was used in 94% (n = 58) of respondents' CCLs. All respondents indicated that benzodiazepines, opioids or a combination of both are used for PSA (n = 58). One respondent indicated that propofol was also used. Twenty per cent (n = 12) indicated that deep sedation is purposefully induced for defibrillation threshold testing and cardioversion without a second medical practitioner present. Sedation monitoring practices vary considerably between institutions. Thirty-one per cent (n = 18) indicated that comprehensive education about PSA is provided. Forty-five per cent (n = 26) indicated that nurses who administer PSA should undergo competency assessment. Conclusion By characterising nurse-administered PSA in Australian and New Zealand CCLs, a baseline for future studies has been established. Areas of particular importance for improvement include protocols for patient monitoring and comprehensive PSA education for CCL nurses in Australia and New Zealand.

Relevance: 10.00%

Abstract:

Making institutional expectations explicit using clear and common language engages commencing students and promotes help-seeking behaviour. When first year students enter university they cross the threshold into an unfamiliar environment (Devlin, Kift, Nelson, Smith & McKay, 2012). Universities endeavour to provide appropriate learning support services and resources; however, research suggests that there is limited uptake of these services, particularly among high-risk students (Nelson-Field & Goodman, 2005). The Successful Student Skills Checklist is a tool which will be trialled during the 2013 Orientation period at the QUT Caboolture campus. The new tool is a response to the university's commitment to provide "an environment where [students] are supported to take responsibility for their own learning, and to embrace an active role in succeeding to their full potential" (QUT, 2012, 6.2.1). This paper outlines the design of the support tool implemented during Orientation and discusses the anticipated outcomes of the trial.

Relevance: 10.00%

Abstract:

During the last several decades, the quality of natural resources and their services has been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, the development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to maintain the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of their populations for present and future generations, as well as to maintain the sustainability of their ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches to planning sustainable cities is 'ecological planning'. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p. 4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Therefore, ecological planning is a powerful method for creating sustainable urban ecosystems.

In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy and decision-makers improve their actions towards sustainable urban development. Among the several methods used in urban ecosystem sustainability assessment, sustainability indicators and composite indices are the most commonly used tools for assessing progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assessing local and micro-level sustainability, and that no benchmark value exists for most of the indicators due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) makes this point in stating that "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison leads to biased assessments, as data only exist for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need to develop an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. To develop such a model, it is practical to adopt an approach that uses indicators to collect data, designates certain threshold values or ranges, performs a comparative sustainability assessment via indices at the micro-level, and aggregates these assessment findings to the local level.
Through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and to provide useful results that inform local planning, conservation and development decision-making so as to secure sustainable ecosystems and urban futures. To advance research in this area, this study investigated the environmental impacts of an existing urban context using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, the study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the 'Micro-level Urban-ecosystem Sustainability IndeX' (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to help policy-making concerning environmental issues.

A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop the theoretical framework and select the indicators. Afterwards, quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results revealed the sustainability performance of the current urban settings with reference to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design and (6) efficiency. For each category, a set of core indicators was assigned which are intended to: (1) benchmark the current situation, strengths and weaknesses; (2) evaluate the efficiency of implemented plans; and (3) measure the progress towards sustainable development. While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best-practice development solutions.
These strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management in order to preserve the Earth's water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through developing pollution prevention regulations and policies in order to promote high-quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods in order to promote safe environments and healthy communities;
• Designing the urban environment sustainably through climate-responsive design in order to increase the efficient use of solar energy and provide thermal comfort; and
• Using renewable resources through creating efficient communities in order to provide long-term management of natural resources for the sustainability of future generations.
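
The abstract does not give the MUSIX formulas, weights or threshold values, so the following is only a generic sketch of the kind of indicator normalisation and parcel-to-neighbourhood aggregation described above, with invented benchmark ranges and scores:

```python
# Illustrative indicator-based composite index: normalise each raw indicator against a
# benchmark range, aggregate to a parcel score, then average parcels to a neighbourhood
# score. Indicator names, benchmarks, weights and data are all hypothetical.
def normalise(value, worst, best):
    """Map a raw indicator value onto 0-1 against a (worst, best) benchmark range."""
    score = (value - worst) / (best - worst)
    return min(1.0, max(0.0, score))

def parcel_index(indicators, benchmarks, weights):
    """Weighted aggregation of normalised indicator scores for one parcel."""
    total_w = sum(weights.values())
    return sum(weights[k] * normalise(v, *benchmarks[k]) for k, v in indicators.items()) / total_w

benchmarks = {"imperviousness_pct": (100, 0), "tree_cover_pct": (0, 60)}   # (worst, best)
weights = {"imperviousness_pct": 1.0, "tree_cover_pct": 1.0}
parcels = [{"imperviousness_pct": 80, "tree_cover_pct": 10},
           {"imperviousness_pct": 40, "tree_cover_pct": 35}]

scores = [parcel_index(p, benchmarks, weights) for p in parcels]
print("parcel scores:", [round(s, 2) for s in scores])
print("neighbourhood composite:", round(sum(scores) / len(scores), 2))
```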

Relevance: 10.00%

Abstract:

In a classification problem we typically face two challenging issues: the diverse characteristics of negative documents, and the fact that many negative documents lie close to positive documents. It is therefore hard for a single classifier to clearly classify incoming documents into classes. This paper proposes a novel gradual problem-solving approach that creates a two-stage classifier. The first stage identifies reliable negatives (negative documents with weak positive characteristics); it concentrates on minimizing the number of false negative documents (recall-oriented). We use Rocchio, an existing recall-based classifier, for this stage. The second stage is a precision-oriented "fine tuning" that concentrates on minimizing the number of false positive documents by applying pattern (statistical phrase) mining techniques. In this stage a pattern-based scoring is followed by threshold setting (thresholding). Experiments show that our statistical-phrase-based two-stage classifier is promising.
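
A highly simplified sketch of the two-stage idea is given below (not the paper's algorithm: Rocchio is reduced to a centroid-difference score and the pattern-mining stage is reduced to shared bigrams; all thresholds and documents are illustrative):

```python
# Two-stage toy classifier: stage 1 is a recall-oriented Rocchio-style filter that drops
# "reliable negatives"; stage 2 re-scores surviving documents with a crude pattern score
# (shared bigrams with the positive training set) and applies a threshold.
from collections import Counter

def bag(text):
    return Counter(text.lower().split())

def centroid(docs):
    c = Counter()
    for d in docs:
        c.update(bag(d))
    total = sum(c.values())
    return {w: n / total for w, n in c.items()}

def rocchio_score(doc, pos_c, neg_c):
    return sum(n * (pos_c.get(w, 0.0) - neg_c.get(w, 0.0)) for w, n in bag(doc).items())

def bigrams(text):
    toks = text.lower().split()
    return set(zip(toks, toks[1:]))

def classify(doc, pos_docs, neg_docs, stage1_thr=0.0, stage2_thr=1):
    pos_c, neg_c = centroid(pos_docs), centroid(neg_docs)
    if rocchio_score(doc, pos_c, neg_c) < stage1_thr:
        return "negative"                      # stage 1: reliable negative
    pos_patterns = set().union(*(bigrams(d) for d in pos_docs))
    return "positive" if len(bigrams(doc) & pos_patterns) >= stage2_thr else "negative"

pos = ["machine learning for text", "text classification with patterns"]
neg = ["cooking pasta recipes", "holiday travel guide"]
print(classify("pattern based text classification", pos, neg))
```

The design point carried over from the abstract is the asymmetry of the two stages: the first threshold is set permissively to avoid false negatives, while the second threshold is set strictly to remove false positives.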

Relevance: 10.00%

Abstract:

Migraine is a common neurological disease with a genetic basis, affecting approximately 12% of the population. Pain during a migraine attack is associated with activation of the trigeminal nerve system, which carries pain signals from the meninges, and from the blood vessels infusing the meninges, to the trigeminal nucleus in the brain stem. The release of inflammatory mediators following cortical spreading depression (CSD) may further promote and sustain the activation and sensitization of meningeal nociceptors, inducing the persistent throbbing headache characteristic of migraine. Lymphotoxin α (LTA) is a cytokine secreted by lymphocytes and a member of the tumour necrosis factor (TNF) family. Genetic variation within the TNF and LTA genes may contribute to threshold brain excitability, propagation of neuronal hyperexcitability and thus the initiation and maintenance of a migraine attack. Three LTA variants, rs2009658, rs2844482 and rs2229094, were identified in a recent pGWAS study conducted in the Norfolk Island population as being potentially implicated in migraine, with nominally significant p values of p = 0.0093, p = 0.0088 and p = 0.033, respectively. To determine whether these SNPs play a role in migraine in a general outbred population, they were genotyped in a large case-control Australian Caucasian population and tested for association with migraine. All three SNPs showed no association in our cohort (p > 0.05). Validation of GWAS data in independent case-control cohorts is essential to establish risk validity within specific population groups. The importance of cytokines in modulating neural inflammation and pain threshold, together with other studies showing associations between TNF-α and SNPs in the LTA gene with migraine, suggests that LTA could be an important factor contributing to migraine. Although the present study did not support a role for the tested LTA variants in migraine, investigation of other variants within the LTA gene is still warranted.

Relevance: 10.00%

Abstract:

Migraine is a painful and debilitating neurovascular disease, and current migraine head pain treatments work with differing efficacies in migraineurs. The opioid system plays an important role in diverse biological functions including analgesia, drug response and pain reduction. The A118G single nucleotide polymorphism (SNP) in exon 1 of the μ-opioid receptor gene (OPRM1) has been associated with elevated pain responses and decreased pain threshold in a variety of populations. The aim of this preliminary study was to test whether genotypes of the OPRM1 A118G SNP are associated with head pain severity in a clinical cohort of female migraineurs. A total of 153 chronic migraine with aura sufferers were assessed for migraine head pain using the Migraine Disability Assessment Score instrument and classified into high and low pain severity groups. DNA was extracted and genotypes obtained for the A118G SNP. Logistic regression analysis adjusting for age effects showed the A118G SNP of the OPRM1 gene to be significantly associated with migraine pain severity in the test population (P = 0.0037). In particular, G118 allele carriers were more likely to be high pain sufferers than homozygous carriers of the A118 allele (OR = 3.125, 95% CI = 1.41, 6.93, P = 0.0037). These findings suggest that A118G genotypes of the OPRM1 gene may influence migraine-associated head pain in females. Further investigations are required to fully understand the effect of this gene variant on migraine head pain, including studies in males and in different migraine subtypes, as well as in response to head pain medication.
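
For illustration, a dominant-model odds ratio of the kind reported above (G118 carriers versus A118 homozygotes, high versus low pain) can be computed from a 2x2 table as sketched below; the counts are invented, not the study's data, and the study itself used logistic regression adjusted for age rather than this simple Wald estimate:

```python
# Odds ratio with a Wald 95% confidence interval from a 2x2 genotype-by-severity table.
# All counts below are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed controls, c=unexposed cases, d=unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

# Invented counts: G-carrier high pain, G-carrier low pain, AA high pain, AA low pain.
or_, lo, hi = odds_ratio_ci(25, 25, 30, 75)
print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```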

Relevance: 10.00%

Abstract:

How do agents with limited cognitive capacities flourish in informationally impoverished or unexpected circumstances? Aristotle argued that human flourishing emerged from knowing about the world and our place within it. If he is right, then the virtuous processes that produce knowledge best explain flourishing. Influenced by Aristotle, virtue epistemology defends an analysis of knowledge where beliefs are evaluated for their truth and for the intellectual virtue or competences relied on in their creation. However, human flourishing may emerge from how degrees of ignorance are managed in an uncertain world. Perhaps decision-making in the shadow of knowledge best explains human wellbeing—a Bayesian approach? In this dissertation I argue that a hybrid of virtue and Bayesian epistemologies explains human flourishing—what I term homeostatic epistemology. Homeostatic epistemology supposes that an agent has a rational credence p when p is the product of reliable processes aligned with the norms of probability theory; whereas an agent knows that p when a rational credence p is the product of reliable processes such that: 1) p meets some relevant threshold for belief (such that the agent acts as though p were true and indeed p is true), 2) p coheres with a satisficing set of relevant beliefs, and 3) the relevant set of beliefs is coordinated appropriately to meet the integrated aims of the agent. Homeostatic epistemology recognizes that justificatory relationships between beliefs are constantly changing to combat uncertainties and to take advantage of predictable circumstances. Contrary to holism, justification is built up and broken down across limited sets, like the anabolic and catabolic processes that maintain homeostasis in the cells, organs and systems of the body. It is the coordination of choristic sets of reliably produced beliefs that creates the greatest flourishing given the limitations inherent in the situated agent.