968 results for Dispute resolution advocacy


Relevance:

20.00%

Publisher:

Abstract:

A model is developed for predicting the resolution of component pairs of interest and calculating the optimum temperature-programming conditions in comprehensive two-dimensional gas chromatography (GC x GC). Based on at least three isothermal runs, retention times and peak widths at half-height on both dimensions are predicted for any linear temperature-programmed run on the first dimension combined with isothermal runs on the second dimension. The calculation of the optimum temperature-programming conditions is based on the predicted resolution of "difficult-to-separate" components in a given mixture. The resolution of all neighboring peaks on the first dimension is obtained from the predicted retention times and peak widths on that dimension; the resolution on the second dimension is calculated only for adjacent components that are insufficiently resolved on the first dimension and elute within the same modulation period on the second dimension. The optimum temperature-programming conditions are reached when the resolutions of all components of interest in the GC x GC separation meet the analytical requirement and the analysis time is minimized. The validity of the model has been demonstrated by using it to predict and optimize the GC x GC temperature-programming conditions for an alkylpyridine mixture. (c) 2005 Elsevier B.V. All rights reserved.
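The first-dimension screening step described in this abstract can be sketched with the standard chromatographic resolution formula for peak widths at half-height. This is a generic illustration, not the paper's model; the 1.5 baseline threshold and the peak values are assumptions.

```python
# Sketch of a first-dimension resolution screen (illustrative values only).
# For Gaussian peaks, Rs = 1.18 * (t2 - t1) / (wh1 + wh2), where the wh are
# peak widths at half-height; the factor 1.18 converts half-height widths
# to the base-width convention.

def resolution(t1: float, t2: float, wh1: float, wh2: float) -> float:
    """Resolution of two adjacent peaks from retention times (t1 < t2)
    and peak widths at half-height."""
    return 1.18 * (t2 - t1) / (wh1 + wh2)

# Flag adjacent pairs that would need second-dimension separation
# (hypothetical (retention time, width) pairs; 1.5 is a common
# baseline-resolution criterion).
peaks = [(10.20, 0.08), (10.32, 0.09), (12.00, 0.10)]
for (ta, wa), (tb, wb) in zip(peaks, peaks[1:]):
    rs = resolution(ta, tb, wa, wb)
    print(f"Rs = {rs:.2f}, resolve on 2nd dimension: {rs < 1.5}")
```

In the scheme the abstract describes, only pairs failing this first-dimension check (and co-eluting within one modulation period) would be re-examined on the second dimension.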

Relevance:

20.00%

Publisher:

Abstract:

Submitted to Appl. Magn. Reson. Sponsorship: EPSRC / EU

Relevance:

20.00%

Publisher:

Abstract:

Africa faces problems of ecological devastation caused by economic exploitation, rapid population growth, and poverty. Capitalism, residual colonialism, and corruption undermine Africa's efforts to forge a better future. The dissertation describes how in Africa the mounting ecological crisis has religious, political, and economic roots that enable and promote social and environmental harm. It presents the thesis that religious traditions, including their ethical expressions, can effectively address the crisis, ameliorate its impacts, and advocate for social and environmental betterment, now and in the future. First, it examines African traditional religion and Christian teaching, which together provide the foundation for African Christianity. Critical examination of both religious worldviews uncovers their complementary emphases on human responsibility toward planet Earth and future generations. Second, an analysis of the Gwembe Tonga of Chief Simamba explores the interconnectedness of all elements of the universe in African cosmologies. In Africa, an interdependent, participatory relationship exists between the world of animals, the world of humans, and the Creator. In discussing the annual lwiindi (rain calling) ceremony of Simamba, the study explores ecological overtones of African religions. Such rituals illustrate the involvement of ancestors and high gods in maintaining ecological integrity. Third, the foundation of the African morality of abundant life is explored. Across Sub-Saharan Africa, ancestors' teachings are the foundation of morality; ancestors are guardians of the land. A complementary teaching that Christ is the ecological ancestor of all life can direct ethical responses to the ecological crisis. Fourth, the eco-social implications of ubuntu (what it means to be fully human) are examined. Some aspects of ubuntu are criticized in light of economic inequalities and corruption in Africa. 
However, ubuntu can be transformed to advocate for eco-social liberation. Fifth, the study recognizes that in some cases conflicts exist between ecological values and religious teachings. This conflict is examined in terms of the contrast between awareness of socioeconomic problems caused by population growth, on the one hand, and advocacy of a traditional African morality of abundant children, on the other hand. A change in the latter religious view is needed since overpopulation threatens sustainable living and the future of Earth. The dissertation concludes that the identification of Jesus with African ancestors and theological recognition of Jesus as the ecological ancestor, woven together with ubuntu, an ethic of interconnectedness, should characterize African consciousness and promote resolution of the socio-ecological crisis.

Relevance:

20.00%

Publisher:

Abstract:

Do humans and animals learn exemplars or prototypes when they categorize objects and events in the world? How are different degrees of abstraction realized through learning by neurons in inferotemporal and prefrontal cortex? How do top-down expectations influence the course of learning? Thirty related human cognitive experiments (the 5-4 category structure) have been used to test competing views in the prototype-exemplar debate. In these experiments, during the test phase, subjects unlearn in a characteristic way items that they had learned to categorize perfectly in the training phase. Many cognitive models do not describe how an individual learns or forgets such categories through time. Adaptive Resonance Theory (ART) neural models provide such a description, and also clarify both psychological and neurobiological data. Matching of bottom-up signals with learned top-down expectations plays a key role in ART model learning. Here, an ART model is used to learn incrementally in response to 5-4 category structure stimuli. Simulation results agree with experimental data, achieving perfect categorization in training and a good match to the pattern of errors exhibited by human subjects in the testing phase. These results show how the model learns both prototypes and certain exemplars in the training phase. ART prototypes are, however, unlike the ones posited in the traditional prototype-exemplar debate. Rather, they are critical patterns of features to which a subject learns to pay attention based on past predictive success and the order in which exemplars are experienced. Perturbations of old memories by newly arriving test items generate a performance curve that closely matches the performance pattern of human subjects. The model also clarifies exemplar-based accounts of data concerning amnesia.
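The resonance-and-match mechanism sketched in this abstract (bottom-up input matched against learned top-down expectations, with a new category recruited when no match is good enough) is often illustrated with fuzzy ART, a common simplified variant of the ART family. The code below is a minimal sketch under that assumption, not the authors' exact model; all parameter values are illustrative.

```python
import numpy as np

# Minimal fuzzy ART sketch: complement-coded inputs, a choice function,
# a vigilance (match) test, and fast learning. Hypothetical simplification,
# not the model used in the paper.

class FuzzyART:
    def __init__(self, rho=0.7, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # learned category weight vectors (critical feature patterns)

    def _code(self, x):
        x = np.asarray(x, dtype=float)
        return np.concatenate([x, 1.0 - x])  # complement coding

    def train(self, x):
        i = self._code(x)
        # Rank categories by the choice function T_j = |I ^ w_j| / (alpha + |w_j|).
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(i, self.w[j]).sum()
                       / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:  # vigilance passed: resonance, so learn
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(i.copy())   # no resonance: recruit a new category
        return len(self.w) - 1
```

Similar inputs resonate with an existing category and refine its weights; an input that fails the vigilance test against every category founds a new one, which is how exemplar order shapes the learned prototypes.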

Relevance:

20.00%

Publisher:

Abstract:

Numerous laboratory experiments have been performed in an attempt to mimic atmospheric secondary organic aerosol (SOA) formation. However, it is still unclear how closely the aerosol particles generated in laboratory experiments resemble atmospheric SOA in their detailed chemical composition. In this study, we generated SOA in a simulation chamber from the ozonolysis of α-pinene and of a biogenic volatile organic compound (BVOC) mixture containing α- and β-pinene, Δ3-carene, and isoprene. The detailed molecular composition of the laboratory-generated SOA was compared with that of background ambient aerosol collected at a boreal forest site (Hyytiälä, Finland) and at an urban location (Cork, Ireland) using direct-infusion nanoelectrospray ultrahigh-resolution mass spectrometry. Kendrick mass defect and Van Krevelen approaches were used to identify and compare compound classes and distributions of the detected species. The laboratory-generated SOA contained a distinguishable group of dimers that was not observed in the ambient samples. The presence of dimers was less pronounced in the SOA from the VOC mixture than in the single-precursor system. The elemental composition of the compounds identified in the monomeric region from the ozonolysis of both α-pinene and the VOC mixture represented the ambient organic composition of particles collected at the boreal forest site reasonably well, with about 70% of molecular formulae in common. In contrast, large differences were found between the laboratory-generated BVOC samples and the ambient urban sample. To our knowledge, this is the first direct comparison of the molecular composition of laboratory-generated SOA from BVOC mixtures with that of ambient samples.
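The two screening approaches named above can be illustrated for a simple CHO formula. This is a generic sketch of the standard CH2-based Kendrick mass defect and Van Krevelen coordinates; the example formulae are illustrative assumptions, not compounds reported in the study.

```python
# Kendrick mass defect (KMD) and Van Krevelen coordinates for CHO formulae.
# Members of a CH2 homologous series share the same KMD, which is what makes
# the plot useful for grouping compound classes in complex spectra.

MASS = {"C": 12.0, "H": 1.007825, "O": 15.994915}
CH2 = MASS["C"] + 2 * MASS["H"]  # 14.015650

def exact_mass(c: int, h: int, o: int) -> float:
    return c * MASS["C"] + h * MASS["H"] + o * MASS["O"]

def kendrick_mass_defect(m: float) -> float:
    """CH2-based KMD: nominal Kendrick mass minus Kendrick mass."""
    km = m * 14.0 / CH2           # rescale so CH2 weighs exactly 14
    return round(km) - km

def van_krevelen(c: int, h: int, o: int) -> tuple:
    """Van Krevelen coordinates (O/C, H/C)."""
    return o / c, h / c

# Two hypothetical CH2 homologues line up at the same KMD:
m1 = exact_mass(10, 16, 3)  # e.g. a C10H16O3 composition
m2 = exact_mass(11, 18, 3)  # its CH2 homologue
print(kendrick_mass_defect(m1), kendrick_mass_defect(m2))
```

Plotting KMD against nominal Kendrick mass places each homologous series on a horizontal line, while the (O/C, H/C) plane separates broad compound classes by oxidation state and saturation.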

Relevance:

20.00%

Publisher:

Abstract:

Very Long Baseline Interferometry (VLBI) polarisation observations of the relativistic jets from Active Galactic Nuclei (AGN) allow the magnetic field environment around the jet to be probed. In particular, multi-wavelength observations of AGN jets allow the creation of Faraday rotation measure maps which can be used to gain an insight into the magnetic field component of the jet along the line of sight. Recent polarisation and Faraday rotation measure maps of many AGN show possible evidence for the presence of helical magnetic fields. The detection of such evidence is highly dependent both on the resolution of the images and the quality of the error analysis and statistics used in the detection. This thesis focuses on the development of new methods for high resolution radio astronomy imaging in both of these areas. An implementation of the Maximum Entropy Method (MEM) suitable for multi-wavelength VLBI polarisation observations is presented and the advantage in resolution it possesses over the CLEAN algorithm is discussed and demonstrated using Monte Carlo simulations. This new polarisation MEM code has been applied to multi-wavelength imaging of the Active Galactic Nuclei 0716+714, Mrk 501 and 1633+382, in each case providing improved polarisation imaging compared to the case of deconvolution using the standard CLEAN algorithm. The first MEM-based fractional polarisation and Faraday-rotation VLBI images are presented, using these sources as examples. Recent detections of gradients in Faraday rotation measure are presented, including an observation of a reversal in the direction of a gradient further along a jet. Simulated observations confirming the observability of such a phenomenon are conducted, and possible explanations for a reversal in the direction of the Faraday rotation measure gradient are discussed. These results were originally published in Mahmud et al. (2013). 
Finally, a new error model for the CLEAN algorithm is developed which takes into account correlation between neighbouring pixels. Comparison of error maps calculated using this new model with Monte Carlo maps shows striking similarities when the sources considered are well resolved, indicating that the method correctly reproduces at least some component of the overall uncertainty in the images. The calculation of many useful quantities using this model is demonstrated, and the advantages it offers over traditional single-pixel calculations are illustrated. The limitations of the model as revealed by Monte Carlo simulations are also discussed; unfortunately, the error model does not work well when applied to compact regions of emission.
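The rotation measure maps discussed in this abstract rest on a per-pixel linear fit of polarisation angle against wavelength squared, chi(lambda^2) = chi_0 + RM * lambda^2. A minimal sketch of that fit follows; the wavelengths and RM value are synthetic, and it assumes the n*pi angle ambiguities have already been resolved.

```python
import numpy as np

# Per-pixel rotation measure fit: the observed polarisation angle is
# linear in wavelength squared, with slope RM (rad/m^2) and intercept
# chi_0 (the intrinsic polarisation angle).

def fit_rm(wavelengths_m, angles_rad):
    """Least-squares fit of (RM, chi_0) from polarisation angles
    observed at several wavelengths. Assumes unwrapped angles."""
    lam2 = np.asarray(wavelengths_m) ** 2
    rm, chi0 = np.polyfit(lam2, np.asarray(angles_rad), 1)
    return rm, chi0

# Synthetic pixel: RM = 250 rad/m^2, chi_0 = 0.3 rad, four bands.
lams = np.array([0.021, 0.026, 0.037, 0.060])  # metres (~14 to ~5 GHz)
chis = 0.3 + 250.0 * lams ** 2
rm, chi0 = fit_rm(lams, chis)
print(rm, chi0)
```

A gradient (or a reversal of a gradient) in RM transverse to the jet, of the kind the thesis reports, would appear as a systematic change in the fitted slope across neighbouring pixels.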

Relevance:

20.00%

Publisher:

Abstract:

info:eu-repo/semantics/nonPublished

Relevance:

20.00%

Publisher:

Abstract:

The recent emergence of human connectome imaging has led to high demand for angular and spatial resolution in diffusion magnetic resonance imaging (MRI). While there has been significant progress in high angular resolution diffusion imaging, the improvement in spatial resolution is still limited by a number of technical challenges, such as low signal-to-noise ratio and high motion artifacts. As a result, the benefit of high spatial resolution in whole-brain connectome imaging has not been fully evaluated in vivo. In this brief report, the impact of spatial resolution was assessed in a newly acquired whole-brain three-dimensional diffusion tensor imaging data set with an isotropic spatial resolution of 0.85 mm. It was found that the delineation of short cortical association fibers is drastically improved, as is the definition of fiber pathway endings at the gray/white matter boundary, both of which will help construct a more accurate structural map of the human brain connectome.

Relevance:

20.00%

Publisher:

Abstract:

DNaseI footprinting is an established assay for identifying transcription factor (TF)-DNA interactions with single base pair resolution. High-throughput DNase-seq assays have recently been used to detect in vivo DNase footprints across the genome. Multiple computational approaches have been developed to identify DNase-seq footprints as predictors of TF binding. However, recent studies have pointed to a substantial cleavage bias of DNase and its negative impact on predictive performance of footprinting. To assess the potential for using DNase-seq to identify individual binding sites, we performed DNase-seq on deproteinized genomic DNA and determined sequence cleavage bias. This allowed us to build bias corrected and TF-specific footprint models. The predictive performance of these models demonstrated that predicted footprints corresponded to high-confidence TF-DNA interactions. DNase-seq footprints were absent under a fraction of ChIP-seq peaks, which we show to be indicative of weaker binding, indirect TF-DNA interactions or possible ChIP artifacts. The modeling approach was also able to detect variation in the consensus motifs that TFs bind to. Finally, cell type specific footprints were detected within DNase hypersensitive sites that are present in multiple cell types, further supporting that footprints can identify changes in TF binding that are not detectable using other strategies.
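The bias-correction idea described above, estimating background cut rates from deproteinized DNA and then scoring candidate sites by observed versus expected cuts, can be caricatured as follows. The data structures, function names, and counts are hypothetical illustrations, not the authors' pipeline.

```python
# Sketch of sequence cleavage-bias correction for DNase-seq footprinting.
# Step 1: learn per-hexamer background cut propensities from deproteinized
# (naked) DNA, where all cleavage preference is sequence bias.
# Step 2: score a candidate footprint by its observed/expected cut ratio.

def bias_model(naked_cuts):
    """naked_cuts: iterable of (hexamer, count) from deproteinized DNA.
    Returns each hexamer's share of all background cuts."""
    total = sum(c for _, c in naked_cuts)
    return {h: c / total for h, c in naked_cuts}

def bias_corrected_score(observed, site_hexamers, model, total_cuts):
    """Observed/expected cut ratio at a candidate site. Values well
    below 1 indicate protection beyond what bias alone explains."""
    expected = total_cuts * sum(model[h] for h in site_hexamers)
    return observed / expected if expected > 0 else float("inf")

# Toy example: a strongly bias-favoured hexamer that nevertheless shows
# few cuts in vivo, i.e. a bias-robust footprint signal.
model = bias_model([("AAATTT", 900), ("GGGCCC", 100)])
score = bias_corrected_score(50, ["AAATTT"], model, 1000)
print(score)  # well below 1: protection, not just low intrinsic cut rate
```

The point of the correction is exactly the toy case above: without the naked-DNA background, a low cut count could reflect either TF protection or merely a disfavoured sequence, and the two are indistinguishable.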

Relevance:

20.00%

Publisher:

Abstract:

Preclinical imaging has a critical role in phenotyping, in drug discovery, and in providing a basic understanding of mechanisms of disease. Translating imaging methods from humans to small animals is not an easy task. The purpose of this work is to review high-resolution computed tomography (CT) also known as micro-CT for small-animal imaging. We present the principles, the technologies, the image quality parameters, and the types of applications. We show that micro-CT can be used to provide not only morphological but also functional information such as cardiac function or vascular permeability. Another way in which micro-CT can be used in the study of both function and anatomy is by combining it with other imaging modalities, such as positron emission tomography or single-photon emission tomography. Compared to other modalities, micro-CT imaging is usually regarded as being able to provide higher throughput at lower cost and higher resolution. The limitations are usually associated with the relatively poor contrast mechanisms and the radiation damage, although the use of novel nanoparticle-based contrast agents and careful design of studies can address these limitations.

Relevance:

20.00%

Publisher:

Abstract:

© 2015 IOP Publishing Ltd and Deutsche Physikalische Gesellschaft. A key component in calculations of exchange and correlation energies is the Coulomb operator, which requires the evaluation of two-electron integrals. For localized basis sets, these four-center integrals are most efficiently evaluated with the resolution-of-identity (RI) technique, which expands basis-function products in an auxiliary basis. In this work we show the practical applicability of a localized RI variant ('RI-LVL'), which expands products of basis functions only in the subset of auxiliary basis functions located on the same atoms as the basis functions. We demonstrate the accuracy of RI-LVL for Hartree-Fock calculations, for the PBE0 hybrid density functional, and for RPA and MP2 perturbation theory. Molecular test sets include the S22 set of weakly interacting molecules, the G3 test set, the G2-1 and BH76 test sets, and heavy elements including titanium dioxide, copper, and gold clusters. Our RI-LVL implementation paves the way for linear-scaling RI-based hybrid-functional calculations for large systems and for all-electron many-body perturbation theory with significantly reduced computational and memory cost.
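The RI factorisation referred to above replaces four-center integrals (ij|kl) with contractions of three-center integrals (ij|P) and the inverse of the two-center Coulomb metric V_PQ, i.e. (ij|kl) ≈ Σ_PQ (ij|P) [V⁻¹]_PQ (Q|kl). The following toy sketch shows only the tensor contraction; the arrays are synthetic random stand-ins, not real integrals, and the sizes are arbitrary.

```python
import numpy as np

# Toy RI contraction: approximate four-center integrals from synthetic
# three-center integrals t3[i,j,P] and a symmetric positive-definite
# Coulomb metric v[P,Q]. Illustrative only; not a real integral code.

rng = np.random.default_rng(0)
n, naux = 4, 10                       # basis and auxiliary-basis sizes

t = rng.normal(size=(n, n, naux))
t3 = t + t.transpose(1, 0, 2)         # enforce (ij|P) = (ji|P)
a = rng.normal(size=(naux, naux))
v = a @ a.T + naux * np.eye(naux)     # SPD stand-in for V_PQ

# eri[i,j,k,l] = sum_PQ t3[i,j,P] * Vinv[P,Q] * t3[k,l,Q]
eri = np.einsum("ijP,PQ,klQ->ijkl", t3, np.linalg.inv(v), t3)

# The RI form preserves the bra-ket symmetry (ij|kl) = (kl|ij):
print(np.allclose(eri, eri.transpose(2, 3, 0, 1)))
```

The localized RI-LVL variant the abstract describes restricts, for each product pair, the sum over P and Q to auxiliary functions centered on the same atoms as the two basis functions, which is what makes the contraction linear-scaling in practice.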

Relevance:

20.00%

Publisher:

Abstract:

info:eu-repo/semantics/published

Relevance:

20.00%

Publisher:

Abstract:

It was shown in previous papers that the resolution of a confocal scanning microscope can be significantly improved by measuring, for each scanning position, the full diffraction image and by inverting these data to recover the value of the object at the confocal point. In the present work, the authors generalize the data-inversion procedure by allowing the data samples recorded at other scanning positions to be used when reconstructing the object at a given point. This leads to a family of generalized inversion formulae, either exact or approximate. Some previously known formulae are re-derived here as special cases in a particularly simple way.

Relevance:

20.00%

Publisher:

Abstract:

info:eu-repo/semantics/published