391 results for Coastal sensitivity mapping
Abstract:
Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools: a standard measure of cell migration can vary by as much as 25% for the same experimental images, depending on the details of the analysis. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration, since previous results have been obtained using different image analysis techniques whose details are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. This interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density: varying the threshold parameter is equivalent to varying the location of the leading edge within approximately 1-5% of the maximum cell density.
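A minimal sketch of the threshold sensitivity discussed above, assuming a one-dimensional density profile extracted from an image; the synthetic profile, units, and function name are illustrative and are not the authors' detection pipeline:

```python
import numpy as np

def leading_edge_position(density, x, threshold_fraction):
    """Return the outermost position where the cell density still
    exceeds `threshold_fraction` of the maximum density."""
    cutoff = threshold_fraction * density.max()
    above = np.nonzero(density >= cutoff)[0]
    return x[above[-1]] if above.size else np.nan

# Synthetic monotonically decaying profile standing in for image data.
x = np.linspace(0.0, 2000.0, 400)      # position (micrometres, illustrative)
density = np.exp(-x / 500.0)           # normalised cell density profile

for frac in (0.01, 0.02, 0.05):        # thresholds at 1-5% of maximum density
    print(frac, leading_edge_position(density, x, frac))
```

Even on this smooth toy profile, moving the threshold across the 1-5% range shifts the detected edge by hundreds of micrometres, which is the sensitivity the abstract describes.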
Abstract:
This chapter contains sections titled: Introduction; ICZM and sustainable development of coastal zone; International legal framework for ICZM; Implementation of international legal obligations in domestic arena; Concluding remarks; References.
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy: Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best represented using unstructured meshes. The finite volume method is a popular method for solving the conservation laws that describe sea water intrusion, and is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which combine computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function. This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally-intensive aspects of the implicit time stepping scheme run on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
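The matrix-free step described above can be illustrated with a short sketch: a Krylov solver only needs the action of the Jacobian on a vector, which one extra residual evaluation approximates. The toy residual below is a stand-in for the CV-FE discretisation, not the thesis' actual equations:

```python
import numpy as np

def residual(u):
    # Toy nonlinear residual F(u) standing in for the discretised
    # variably-saturated flow and transport equations.
    return u**3 - np.roll(u, 1) + 1.0

def jacobian_vector_product(F, u, v, eps=1e-7):
    """Approximate J(u) v by the finite difference (F(u + eps*v) - F(u)) / eps."""
    return (F(u + eps * v) - F(u)) / eps

u = np.linspace(0.1, 1.0, 8)   # current Newton iterate (illustrative)
v = np.ones_like(u)            # Krylov direction vector
print(jacobian_vector_product(residual, u, v))
```

Because only residual evaluations are needed, the same data-parallel residual kernel that runs on the GPU serves the Newton-Krylov iteration without ever assembling a Jacobian matrix.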
Abstract:
The application of electrical muscle stimulation (EMS) at different current thresholds activates not only the muscle but also peripheral sensory axons that send proprioceptive and pain signals to the cerebral cortex. A 32-channel time-domain fNIRS instrument was employed to map regional cortical activities under varied EMS current intensities applied to the right wrist extensor muscle. Eight healthy volunteers underwent four EMS sessions at different current thresholds based on their individual maximal tolerated intensity (MTI): 10%, 50%, 100%, and over 100% of the MTI. Time courses of the absolute oxygenated and deoxygenated hemoglobin concentrations, primarily over the bilateral sensorimotor cortical (SMC) regions, were extracted, and cortical activation maps were determined by a general linear model using the NIRS-SPM software. The stimulation-induced wrist extension paradigm significantly increased activation of the contralateral SMC region in an intensity-dependent manner, while the ipsilateral SMC region showed no significant changes. This could be due in part to a nociceptive response to the higher EMS current intensities, and also to increased sensorimotor integration in these cortical regions.
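As a hedged illustration of the general linear model step (the study itself used NIRS-SPM), the sketch below fits a boxcar stimulation regressor to a synthetic oxygenated-hemoglobin time course by least squares; all signals, timings, and amplitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 120.0, 1.0)                   # time in seconds
stim = ((t % 40.0) < 20.0).astype(float)         # 20 s on / 20 s off boxcar
X = np.column_stack([stim, np.ones_like(t)])     # design matrix with intercept
hbo = 0.8 * stim + 0.1 * rng.standard_normal(t.size)  # synthetic HbO signal

beta, *_ = np.linalg.lstsq(X, hbo, rcond=None)
print(beta[0])  # estimated stimulation-locked activation (beta weight)
```

Repeating this fit per channel and thresholding the beta weights is, in outline, how an activation map is formed from the hemoglobin time courses.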
Abstract:
Nha Trang Bay (NTB) is located on the Central Vietnam coast, western South China Sea. Recent coastal development of Nha Trang City has raised public concern over an increasing level of pollution within the bay and the degradation of nearby coral reefs. In this study, multiple proxies (e.g., trace metals, rare earth elements (REEs), and Y/Ho ratios) recorded in a massive Porites lutea coral colony were used to reconstruct changes in seawater conditions in the NTB from 1995 to 2009. A 14-year record of REEs and other trace metals revealed that the concentrations of terrestrial trace metals have increased dramatically in response to an increase in coastal development projects such as road, port, and resort construction, port and river dredging, and dumping activities since 2000. The effects of such development are also evident in changes in REE patterns and Y/Ho ratios through time, suggesting that both parameters are critical proxies for marine pollution.
Abstract:
Text categorisation is challenging due to the complex structure of documents, with their heterogeneous and changing topics. The performance of text categorisation relies on the quality of samples, the effectiveness of document features, and the topic coverage of categories, depending on the strategies employed: supervised or unsupervised, single-labelled or multi-labelled. To deal with these reliability issues, we propose an unsupervised multi-labelled text categorisation approach that maps local knowledge in documents to global knowledge in a world ontology to optimise the categorisation result. The conceptual framework of the approach consists of three modules: pattern mining for feature extraction, feature-subject mapping for categorisation, and concept generalisation for optimised categorisation. The approach has been promisingly evaluated by comparison with typical text categorisation methods, based on ground truth encoded by human experts.
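A toy rendering of the three modules, assuming an invented mini-ontology and term-subject mapping; it sketches only the shape of the pipeline, not the authors' pattern-mining or generalisation algorithms:

```python
from collections import Counter

# Invented mini-ontology: term -> subject, and subject -> parent concept.
SUBJECT_OF = {"reef": "Marine biology", "coral": "Marine biology",
              "tax": "Public finance", "revenue": "Public finance"}
PARENT = {"Marine biology": "Environmental science",
          "Public finance": "Economics"}

def categorise(document, min_support=2):
    # Pattern mining stand-in: keep terms above a frequency threshold.
    terms = Counter(document.lower().split())
    features = [t for t, n in terms.items()
                if n >= min_support and t in SUBJECT_OF]
    # Feature-subject mapping: project features onto ontology subjects.
    subjects = Counter(SUBJECT_OF[t] for t in features)
    # Concept generalisation: back off to parent concepts for weak evidence.
    return [s if n > 1 else PARENT[s] for s, n in subjects.items()]

print(categorise("coral reef surveys describe coral bleaching on the reef"))
```

Returning a list of subjects, rather than a single label, is what makes the approach multi-labelled.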
Abstract:
The first fiber Bragg grating (FBG) accelerometer using direct transverse forces is demonstrated by fixing the FBG at its two ends and placing a transversely moving inertial object at its middle. It is very sensitive because a lightly stretched FBG is more sensitive to transverse forces than to axial forces. Its resonant frequency and static sensitivity are analyzed using classical spring-mass theory, assuming the axial force changes little. The experiments show that the theory can be modified for cases where this assumption does not hold: the resonant frequency can be corrected by an experimentally derived linear relationship, and the static sensitivity by a proposed alternative method. Over-range protection and low cross-axis sensitivity are achieved by limiting the movement of the FBG, and were validated experimentally. Sensitivities of 1.333 and 0.634 nm/g were experimentally achieved with 5.29 and 2.83 g inertial objects at 10 Hz from 0.1 to 0.4 g (g = 9.8 m/s²), and their resonant frequencies were around 25 Hz. The theoretical static sensitivities and resonant frequencies found by the modifications are 1.188 nm/g and 26.81 Hz for the 5.29 g object, and 0.784 nm/g and 29.04 Hz for the 2.83 g object.
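The classical spring-mass relation f = (1/2π)√(k/m) connects the quoted masses and resonant frequencies. The short computation below back-solves the effective stiffness implied for the 5.29 g device, using only numbers from the abstract; the stiffness itself is an inferred quantity, not a reported measurement:

```python
import math

m = 5.29e-3                         # inertial mass, kg (5.29 g)
f = 26.81                           # theoretical resonant frequency, Hz
k = m * (2.0 * math.pi * f) ** 2    # implied effective stiffness, N/m (~150)
print(k)

# Consistency check: the spring-mass formula recovers the frequency.
print(math.sqrt(k / m) / (2.0 * math.pi))   # 26.81 Hz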
Abstract:
Instances of parallel ecotypic divergence, where adaptation to similar conditions repeatedly causes similar phenotypic changes in closely related organisms, are useful for studying the role of ecological selection in speciation. Here we used a combination of traditional and next-generation genotyping techniques to test for the parallel divergence of plants from the Senecio lautus complex, a phenotypically variable groundsel that has adapted to disparate environments in the South Pacific. Phylogenetic analysis of a broad selection of Senecio species showed that members of the S. lautus complex form a distinct lineage that has diversified recently in Australasia. An inspection of thousands of polymorphisms in the genomes of 27 natural populations from the S. lautus complex in Australia revealed a signal of strong genetic structure independent of habitat and phenotype. Additionally, genetic differentiation between populations was correlated with the geographical distance separating them, and the genetic diversity of populations strongly depended on geographical location. Importantly, coastal forms appeared in several independent phylogenetic clades, a pattern consistent with the parallel evolution of these forms. Analyses of the patterns of genomic differentiation between populations further revealed that adjacent populations displayed greater genomic heterogeneity than allopatric populations and are differentiated according to variation in soil composition. These results are consistent with a process of parallel ecotypic divergence in the face of gene flow.
Abstract:
Objective: To calculate pooled risk estimates of the association between pigmentary characteristics and basal cell carcinoma (BCC) of the skin. Methods: We searched three electronic databases and reviewed the reference lists of the retrieved articles until July 2012 to identify eligible epidemiologic studies. Eligible studies were those published between 1965 and July 2012 that permitted quantitative assessment of the association between histologically confirmed BCC and any of the following characteristics: hair colour, eye colour, skin colour, skin phototype, tanning and burning ability, and presence of freckling or melanocytic nevi. We included 29 studies from the 2236 initially identified. We calculated summary odds ratios (ORs) as weighted averages of the log OR, using random effects models. Results: We found the strongest associations with red hair (OR 2.02; 95% CI: 1.68, 2.44), fair skin colour (OR 2.11; 95% CI: 1.56, 2.86), and having skin that burns and never tans (OR 2.03; 95% CI: 1.73, 2.38). All other factors had weaker but positive associations with BCC, with the exception of freckling of the face in adulthood, which showed no association. Conclusions: Although most studies report risk estimates in the same direction, there is significant heterogeneity in the size of the estimates. The associations were quite modest and remarkably similar, with ORs between about 1.5 and 2.5 for the highest risk level of each factor. Given the public health impact of BCC, this meta-analysis will make a valuable contribution to our understanding of BCC.
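A minimal sketch of random-effects pooling of log odds ratios; the DerSimonian-Laird estimator used here is one common choice, and the abstract does not name the exact estimator. The per-study ORs and variances are illustrative, not the study's data:

```python
import numpy as np

def pool_random_effects(log_or, var):
    """DerSimonian-Laird random-effects pooling of log odds ratios."""
    w = 1.0 / var
    fixed = np.sum(w * log_or) / np.sum(w)
    Q = np.sum(w * (log_or - fixed) ** 2)          # heterogeneity statistic
    df = len(log_or) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                  # between-study variance
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

log_or = np.log(np.array([1.8, 2.3, 2.0, 1.6]))   # per-study ORs (invented)
var = np.array([0.04, 0.09, 0.02, 0.06])          # variances of the log ORs
print(pool_random_effects(log_or, var))           # pooled OR and 95% CI
```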
Abstract:
Background Loss of heterozygosity (LOH) is an important marker for one of the 'two hits' required for tumor suppressor gene inactivation. Traditional methods for mapping LOH regions require the comparison of both tumor and patient-matched normal DNA samples. However, for many archival samples, patient-matched normal DNA is not available, leading to the under-utilisation of this important resource in LOH studies. Here we describe a new method for LOH analysis that relies on the genome-wide comparison of heterozygosity of single nucleotide polymorphisms (SNPs) between cohorts of cases and unmatched healthy control samples. Regions of LOH are defined by consistent decreases in heterozygosity across a genetic region in the case cohort compared to the control cohort. Methods DNA was collected from 20 Follicular Lymphoma (FL) tumor samples, 20 Diffuse Large B-cell Lymphoma (DLBCL) tumor samples, the neoplastic B-cells of 10 B-cell Chronic Lymphocytic Leukemia (B-CLL) patients, and buccal cell samples matched to 4 of these B-CLL patients. The cohort heterozygosity comparison method was developed and validated using LOH regions derived in a small cohort of B-CLL patients by traditional comparison of tumor and normal DNA samples, and compared to the only alternative method for LOH analysis without patient-matched controls. LOH candidate regions were then generated for enlarged cohorts of B-CLL, FL and DLBCL samples using our cohort heterozygosity comparison method in order to evaluate potential LOH candidate regions in these non-Hodgkin's lymphoma tumor subtypes. Results Using a small cohort of B-CLL samples with patient-matched normal DNA, we have validated the utility of this method and shown that it detects LOH candidate regions with greater accuracy and sensitivity than the only alternative method, the Hidden Markov Model (HMM) method. Subsequently, using B-CLL, FL and DLBCL tumor samples, we have utilised cohort heterozygosity comparisons to localise LOH candidate regions in these subtypes of non-Hodgkin's lymphoma. Detected LOH regions included both previously described regions of LOH and novel genomic candidate regions. Conclusions We have proven the efficacy of the use of cohort heterozygosity comparisons for genome-wide mapping of LOH and shown it to be in many ways superior to the HMM method. Additionally, the use of this method to analyse SNP microarray data from 3 common forms of non-Hodgkin's lymphoma yielded interesting tumor suppressor gene candidates, including the ETV3 gene, which was highlighted in both B-CLL and FL.
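A minimal sketch of the cohort heterozygosity comparison idea, assuming genotypes coded as 0/1/2 copies of the minor allele so heterozygotes are coded 1; the genotype matrices, the drop threshold, and the 10-SNP smoothing window are synthetic illustrations, not the paper's implementation:

```python
import numpy as np

def heterozygosity_rate(genotypes):
    # genotypes: samples x SNPs, coded 0/1/2; heterozygotes are coded 1.
    return (genotypes == 1).mean(axis=0)

rng = np.random.default_rng(0)
# Synthetic control cohort in Hardy-Weinberg proportions (q = 0.4).
controls = rng.choice([0, 1, 2], size=(20, 100), p=[0.36, 0.48, 0.16])
cases = controls.copy()
cases[:, 40:60] = rng.choice([0, 2], size=(20, 20))  # simulated LOH region

# Consistent heterozygosity drop in cases, smoothed across adjacent SNPs.
drop = heterozygosity_rate(controls) - heterozygosity_rate(cases)
window = np.convolve(drop, np.ones(10) / 10, mode="same")
print(np.nonzero(window > 0.2)[0])  # candidate LOH region (SNP indices)
```

The key property, as in the abstract, is that no patient-matched normal sample is needed: the control cohort supplies the expected heterozygosity baseline.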
Abstract:
To understand the underlying genetic architecture of cardiovascular disease (CVD) risk traits, we undertook a genome-wide linkage scan to identify CVD quantitative trait loci (QTLs) in 377 individuals from the Norfolk Island population. The central aim of this research focused on the utilization of a genetically and geographically isolated population of individuals from Norfolk Island for the purposes of variance component linkage analysis to identify QTLs involved in CVD risk traits. Substantial evidence supports the involvement of traits such as systolic and diastolic blood pressures, high-density lipoprotein-cholesterol, low-density lipoprotein-cholesterol, body mass index and triglycerides as important risk factors for CVD pathogenesis. In addition to the environmental influences of poor diet, reduced physical activity, increasing age, cigarette smoking and alcohol consumption, many studies have illustrated a strong involvement of genetic components in the CVD phenotype through family and twin studies. We undertook a genome scan using 400 markers spaced approximately 10 cM in 600 individuals from Norfolk Island. Genotype data was analyzed using the variance components methods of SOLAR. Our results gave a peak LOD score of 2.01 localizing to chromosome 1p36 for systolic blood pressure and replicated previously implicated loci for other CVD relevant QTLs.
Abstract:
The Chemistry Discipline Network has recently completed two distinct mapping exercises. The first is a snapshot of chemistry taught at 12 institutions around Australia in 2011. There were many similarities but also important differences in the content taught and assessed at different institutions. There were also significant differences in delivery, particularly laboratory contact hours, as well as in the forms and weightings of assessment. The second exercise mapped the chemistry degrees at three institutions to the Threshold Learning Outcomes (TLOs) for chemistry. Importantly, some of the TLOs were addressed by multiple units at all institutions, while others were not met, or were met at an introductory level only. The exercise also exposed some challenges in using the TLOs as currently written.
Abstract:
Democratic governments raise taxes and charges and spend the revenue on delivering peace, order and good government. The delivery process begins with a legislature, which can provide a framework of legally enforceable rules enacted according to the government's constitution. These rules confer rights and obligations that allow particular people to carry on particular functions at particular places and times. Metadata standards as applied to public records contain information about the functioning of government as distinct from the non-government sector of society. Metadata standards apply to database construction. Data entry, storage, maintenance, interrogation and retrieval depend on a controlled vocabulary to enable accurate retrieval of suitably catalogued records in a global information environment. Queensland's socioeconomic progress now depends in part on technical efficiency in database construction to address queries about who does what, where and when; under what legally enforceable authority; and how the evidence of those facts is recorded. The Survey and Mapping Infrastructure Act 2003 (Qld) addresses technical aspects of 'where' questions, typically the officially recognised name of a place and a description of its boundaries. The current 10-year review of the Survey and Mapping Regulation 2004 provides a valuable opportunity to consider whether the Regulation makes sense in the context of a number of later laws concerned with management of Public Sector Information (PSI), as well as policies for ICT hardware and software procurement. Removing ambiguities about how official place names are to be regarded on a whole-of-government basis can achieve some short-term goals. Longer-term goals depend on a more holistic approach to information management, and current aspirations for more open government and community engagement are unlikely to be realised without such a longer-term vision.