18 results for Tool Complex
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Carbon and nitrogen stable isotope analysis (SIA) has identified the terrestrial subsidy of freshwater food webs but relies on different 13C fractionation in aquatic and terrestrial primary producers. However, dissolved inorganic carbon (DIC) derives partly from respiration of 13C-depleted terrestrial C and from 'old' C released by weathering of catchment geology. SIA thus fails to differentiate between the contributions of old and recently fixed terrestrial C. DIC in alkaline lakes is partially derived from weathering of 14C-free carbonaceous bedrock. This yields an artificial age offset, leading samples to appear significantly older than their actual age. As such, 14C can be used as a biomarker to identify the proportion of autochthonous C in the food web. With terrestrial C inputs likely to increase, the origin and utilisation of 'old' or 'recent' allochthonous C in the food web can also be determined. Stable isotopes and 14C were measured for biota, particulate organic matter (POM), DIC and dissolved organic carbon (DOC) from Lough Erne, Northern Ireland, a humic but alkaline lake. High winter δ15N values in calanoid zooplankton (δ15N = 24‰) relative to phytoplankton and POM (δ15N = 6‰ and 12‰ respectively) may reflect several microbial trophic levels between terrestrial C and calanoids. Furthermore, winter calanoid 14C ages are consistent with DOC from inflowing rivers (87 and 75 years BP respectively) but not with phytoplankton (355 years BP). Summer calanoid δ13C, δ15N and 14C (312 years BP) indicate greater reliance on phytoplankton. There is also temporal and spatial variation in DIC, DOC and POM C isotopes.
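The winter 14C ages quoted above invite a simple mass-balance reading. As a rough sketch (not a calculation from the paper), a two-endmember mixing model over fraction-modern values can apportion calanoid carbon between riverine DOC and phytoplankton; the 8033-year Libby mean life is the standard radiocarbon convention, but the endmember choice and function names here are illustrative:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; conventional 14C ages assume the Libby half-life

def fraction_modern(age_bp):
    """Convert a conventional radiocarbon age (years BP) to fraction modern."""
    return math.exp(-age_bp / LIBBY_MEAN_LIFE)

def endmember_fraction(mix_age, a_age, b_age):
    """Fraction of end-member A in a two-source mixture, from 14C ages alone."""
    f_mix, f_a, f_b = (fraction_modern(x) for x in (mix_age, a_age, b_age))
    return (f_mix - f_b) / (f_a - f_b)

# Winter values quoted above: calanoids 87 BP, riverine DOC 75 BP, phytoplankton 355 BP
share_doc = endmember_fraction(87.0, 75.0, 355.0)
print(f"riverine DOC share of calanoid C: {share_doc:.2f}")
```

On these numbers the calanoids plot almost on top of the riverine DOC endmember, which is the qualitative point the abstract makes.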
Abstract:
Globally, lakes bury and remineralise significant quantities of terrestrial C, and the associated flux of terrestrial C strongly influences their functioning. Changing deposition chemistry, land use and climate-induced impacts on hydrology will affect soil biogeochemistry and terrestrial C export [1], and hence lake ecology, with potential feedbacks for regional and global C cycling. C and nitrogen stable isotope analysis (SIA) has identified the terrestrial subsidy of freshwater food webs. The approach relies on different 13C fractionation in aquatic and terrestrial primary producers, but also on the fact that the inorganic C demands of aquatic primary producers are partly met by 13C-depleted C from respiration of terrestrial C and by 'old' C derived from weathering of catchment geology. SIA thus fails to differentiate between the contributions of old and recently fixed terrestrial C. Natural-abundance 14C can be used as an additional biomarker to untangle riverine food webs [2] where aquatic and terrestrial δ13C overlap, but may also be valuable for examining the age and origin of C in the lake. Primary production in lakes is based on dissolved inorganic C (DIC). DIC in alkaline lakes is partially derived from weathering of carbonaceous bedrock, a proportion of which is 14C-free. The low 14C activity yields an artificial age offset, leading samples to appear hundreds to thousands of years older than their actual age. As such, 14C can be used to identify the proportion of autochthonous C in the food web. With terrestrial C inputs likely to increase, the origin and utilisation of 'fossil' or 'recent' allochthonous C in the food web can also be determined. Stable isotopes and 14C were measured for biota, particulate organic matter (POM), DIC and dissolved organic carbon (DOC) from Lough Erne, Northern Ireland, a humic alkaline lake. Temporal and spatial variation was evident in DIC, DOC and POM C isotopes, with implications for the fluctuation of terrestrial export processes.
Ramped pyrolysis of lake surface sediment indicates the burial of two C components. The 14C activity (507 ± 30 BP) of sediment combusted at 400 °C was consistent with algal values and younger than bulk sediment values (1097 ± 30 BP). The sample was subsequently combusted at 850 °C, yielding 14C values (1471 ± 30 BP) older than the bulk sediment age, suggesting that fossil terrestrial carbon is also buried in the sediment. Stable isotopes in the food web indicate that terrestrial organic C is also utilised by lake organisms. High winter δ15N values in calanoid zooplankton (δ15N = 24‰) relative to phytoplankton and POM (δ15N = 6‰ and 12‰ respectively) may reflect several microbial trophic levels between terrestrial C and calanoids. Furthermore, winter calanoid 14C ages are consistent with DOC from an inflowing river (75 ± 24 BP), not phytoplankton (367 ± 70 BP). Summer calanoid δ13C, δ15N and 14C (345 ± 80 BP) indicate greater reliance on phytoplankton.
1. Monteith, D.T. et al. (2007) Dissolved organic carbon trends resulting from changes in atmospheric deposition chemistry. Nature, 450: 537-540.
2. Caraco, N. et al. (2010) Millennial-aged organic carbon subsidies to a modern river food web. Ecology, 91: 2385-2393.
Abstract:
A new approach to evaluating all the complex roots, including multiple roots, of an analytical function f(z) confined to a specified rectangular domain of the complex plane has been developed and implemented in Fortran code. Generally, f(z), despite being a holomorphic function, does not have a closed analytical form, thereby inhibiting explicit evaluation of its derivatives. The latter constraint poses a major challenge to the implementation of a robust numerical algorithm. This work is at the instrumental level and provides an enabling tool for solving a broad class of eigenvalue problems and polynomial approximations.
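The abstract does not disclose its algorithm, but a standard derivative-free ingredient for this class of problem is the argument principle: the number of zeros of an analytic f inside a contour equals the winding of arg f(z) along it, which needs only function evaluations. A minimal sketch (in Python rather than the paper's Fortran; names and sampling density are illustrative):

```python
import cmath

def count_zeros_in_rectangle(f, x0, x1, y0, y1, n=2000):
    """Count zeros (with multiplicity) of an analytic f inside a rectangle by
    tracking the winding of arg f(z) around the boundary (argument principle).
    n must be large enough that the phase never jumps by >= pi between
    consecutive boundary samples."""
    corners = [complex(x0, y0), complex(x1, y0), complex(x1, y1), complex(x0, y1)]
    total = 0.0
    prev = cmath.phase(f(corners[0]))
    for i in range(4):                     # traverse the four edges counterclockwise
        a, b = corners[i], corners[(i + 1) % 4]
        for k in range(1, n + 1):
            cur = cmath.phase(f(a + (b - a) * k / n))
            d = cur - prev
            # unwrap the phase jump into (-pi, pi]
            while d <= -cmath.pi:
                d += 2 * cmath.pi
            while d > cmath.pi:
                d -= 2 * cmath.pi
            total += d
            prev = cur
    return round(total / (2 * cmath.pi))

# (z-1)(z+1)(z-0.5j) has three roots; two lie inside [-2, 0.2] x [-1, 1]
print(count_zeros_in_rectangle(lambda z: (z - 1) * (z + 1) * (z - 0.5j), -2, 0.2, -1, 1))  # -> 2
```

A full solver would then bisect the rectangle recursively until each sub-rectangle isolates one root; the point here is only that the count needs no derivatives of f.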
Abstract:
Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools for the initial screening of interesting compounds is appealing. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base, and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool.
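As an illustration only (the paper does not specify its knowledge representation, and the actual system is certainly richer than this), a toy query against an imprecise probabilistic knowledge base might combine feature-level probability intervals by intersection; every feature name and number below is invented:

```python
# Hypothetical knowledge base: structural feature -> (lower, upper) bounds on
# P(compound is a substrate), elicited from past screening experiments.
KB = {
    "para_hydroxyl": (0.6, 0.9),
    "bulky_substituent": (0.1, 0.4),
    "methyl_ester": (0.5, 0.8),
}

def predict(features):
    """Combine the probability intervals of a compound's known features by
    intersection, starting from the vacuous interval [0, 1]."""
    lo, hi = 0.0, 1.0
    for feat in features:
        if feat in KB:
            f_lo, f_hi = KB[feat]
            lo, hi = max(lo, f_lo), min(hi, f_hi)
    if lo > hi:  # conflicting evidence: report the gap between the intervals
        lo, hi = hi, lo
    return lo, hi

print(predict(["para_hydroxyl", "methyl_ester"]))  # -> (0.6, 0.8)
```

The interval output mirrors the abstract's point that a probability (here, probability bounds) is returned as an indication rather than a hard classification.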
Abstract:
This paper addresses the problems often faced by social workers and their supervisors in decision making where human rights considerations and child protection concerns collide. High-profile court cases in the United Kingdom and Europe have consistently called for social workers to convey more clarity when justifying their reasons for interfering with human rights in child protection cases. The themes emerging from these case law decisions imply that social workers need to be better at giving reasons and evidence in more explicit ways to support any actions they propose which cause interference with Convention Rights. Toulmin (1958, 1985) offers a structured approach to argumentation which may have relevance to the supervision of child protection cases when social workers and managers are required to balance these human rights considerations. One of the key challenges in this balancing act is the need for decision makers to feel confident that any interventions resulting in the interference of human rights are both justified and proportionate. Toulmin's work has already been shown to have relevance for assisting social workers to navigate pathways through cases involving competing ethical and moral demands (Osmo and Landau, 2001) and, more recently, for human rights and decision making in child protection (Duffy et al., 2006). Toulmin's model takes the practitioner through a series of stages where any argument or proposed recommendation (claim) is subjected to intense critical analysis involving exposition of its strengths and weaknesses. The author therefore proposes that explicit argumentation (Osmo and Landau, 2001) may help supervisors and practitioners towards safer and more confident decision making in child protection cases involving the interference of the human rights of children and parents.
In addition to highlighting the broader context of human rights currently permeating child protection decision making, the paper includes case material to practically demonstrate the application of Toulmin's model of argumentation to the supervision context. In this way the paper adopts a strong practice approach, assisting practitioners with the problems and dilemmas they may encounter in decision making in complex cases.
Abstract:
Traditional static analysis fails to auto-parallelize programs with complex control and data flow. Furthermore, thread-level parallelism in such programs is often restricted to pipeline parallelism, which can be hard for a programmer to discover. In this paper we propose a tool that, based on profiling information, helps the programmer to discover parallelism. The programmer hand-picks code transformations from among the proposed candidates, which are then applied by automatic code transformation techniques.
This paper contributes to the literature by presenting a profiling tool for discovering thread-level parallelism. We track dependencies at the whole-data-structure level rather than at the element or byte level in order to limit the profiling overhead. We perform a thorough analysis of the needs and costs of this technique. Furthermore, we present and validate the hypothesis that programs with complex control and data flow contain significant amounts of exploitable coarse-grain pipeline parallelism in their outer loops. This observation validates our approach to whole-data-structure dependencies. As state-of-the-art compilers focus on loops iterating over data structure members, it also explains why our approach finds coarse-grain pipeline parallelism in cases that have remained out of reach for state-of-the-art compilers. In cases where traditional compilation techniques do find parallelism, our approach discovers higher degrees of parallelism, allowing a 40% speedup over traditional compilation. Moreover, we demonstrate real speedups on multiple hardware platforms.
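The coarse-grain pipeline parallelism described above can be sketched as stages that pass whole data structures (here, lists) through queues, one thread per outer-loop stage; this is a generic illustration of the execution model, not output of the proposed tool:

```python
import threading
from queue import Queue

DONE = object()  # sentinel that closes the pipeline

def stage(inbox, outbox, fn):
    """Run one pipeline stage: apply fn to each whole data structure that
    arrives, forward the result, and propagate the termination sentinel."""
    while (item := inbox.get()) is not DONE:
        outbox.put(fn(item))
    outbox.put(DONE)

q0, q1, q2 = Queue(), Queue(), Queue()
produce = threading.Thread(target=stage, args=(q0, q1, lambda xs: [x * 2 for x in xs]))
consume = threading.Thread(target=stage, args=(q1, q2, sum))
produce.start(); consume.start()

for block in ([1, 2], [3, 4], [5, 6]):   # outer-loop iterations
    q0.put(block)
q0.put(DONE)
produce.join(); consume.join()

results = []
while (r := q2.get()) is not DONE:
    results.append(r)
print(results)  # -> [6, 14, 22]
```

Because each queue item is a whole list, the only dependencies the pipeline must respect are between stages, which is exactly the granularity at which the profiler above tracks them.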
Abstract:
In this paper, we assess realistic evaluation’s articulation with evidence-based practice (EBP) from the perspective of critical realism. We argue that the adoption by realistic evaluation of a realist causal ontology means that it is better placed to explain complex healthcare interventions than the traditional method used by EBP, the randomized controlled trial (RCT). However, we do not conclude from this that the use of RCTs is without merit, arguing that it is possible to use both methods in combination under the rubric of realist theory. More negatively, we contend that the rejection of critical theory and utopianism by realistic evaluation in favour of the pragmatism of piecemeal social engineering means that it is vulnerable to accusations that it promotes technocratic interpretations of human problems. We conclude that, insofar as realistic evaluation adheres to the ontology of critical realism, it provides a sound contribution to EBP, but insofar as it rejects the critical turn of Bhaskar’s realism, it replicates the technocratic tendencies inherent in EBP.
Abstract:
A numerical method is developed to simulate complex two-dimensional crack propagation in quasi-brittle materials considering random heterogeneous fracture properties. Potential cracks are represented by pre-inserted cohesive elements with tension and shear softening constitutive laws modelled by spatially varying Weibull random fields. Monte Carlo simulations of a concrete specimen under uni-axial tension were carried out, with extensive investigation of the effects of important numerical algorithms and material properties on numerical efficiency and stability, crack propagation processes and load-carrying capacities. It was found that the homogeneous model led to incorrect crack patterns and load-displacement curves with strong mesh-dependence, whereas the heterogeneous model predicted realistic, complicated fracture processes and a load-carrying capacity with little mesh-dependence. Increasing the variance of the tensile strength random fields, i.e. increasing the heterogeneity, led to a reduction in the mean peak load and an increase in its standard deviation. The developed method provides a simple but effective tool for the assessment of structural reliability and the calculation of characteristic material strength for structural design.
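One ingredient above can be sketched in a few lines: drawing spatially varying Weibull tensile strengths for the cohesive elements. Unlike the paper's random fields, the samples below are independent (no spatial correlation), and all parameter values are illustrative:

```python
import math
import random
import statistics

random.seed(1)  # reproducible Monte Carlo sample

def element_strengths(n, mean=3.0, shape=12.0):
    """Per-element tensile strengths (MPa) from a Weibull distribution; a
    smaller shape parameter gives larger scatter, i.e. more heterogeneity.
    The scale is chosen so every field has the same mean strength."""
    scale = mean / math.gamma(1.0 + 1.0 / shape)
    return [random.weibullvariate(scale, shape) for _ in range(n)]

nearly_homogeneous = element_strengths(1000, shape=60.0)  # tight scatter
heterogeneous = element_strengths(1000, shape=6.0)        # strong scatter

# Weakest-link reading of the trend reported above: at equal mean strength,
# more heterogeneity lowers the weakest cohesive element, where cracking
# localises first, hence the lower mean peak load.
print(round(min(nearly_homogeneous), 2), round(min(heterogeneous), 2))
```

The same sampler, repeated per Monte Carlo realisation, is the natural entry point for the peak-load statistics the abstract reports.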
Abstract:
Animal communities are sensitive to environmental disturbance, and several multivariate methods have recently been developed to detect changes in community structure. The complex taxonomy of soil invertebrates constrains the use of the community level in monitoring environmental changes, since species identification requires expertise and time. However, recent literature data on marine communities indicate that little multivariate information is lost in the taxonomic aggregation of species data to higher-rank taxa. In the present paper, this hypothesis was tested on two oribatid mite (Oribatida, Acari) assemblages under two different kinds of disturbance: metal pollution and fires. Results indicate that data sets built at the genus and family ranks can detect the effects of disturbance with little loss of information. This is an encouraging result in view of the use of the community level as a preliminary tool for describing patterns in human-disturbed soil ecosystems.
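The aggregation being tested can be sketched as follows: species abundances are summed into their genus and a community dissimilarity (Bray-Curtis here, as one common multivariate choice; the paper's methods may differ) is recomputed at the coarser rank. All taxa and counts are invented:

```python
# Hypothetical species-to-genus map and abundance data for two samples.
GENUS = {"sp_a1": "A", "sp_a2": "A", "sp_b1": "B", "sp_c1": "C"}

polluted = {"sp_a1": 12, "sp_a2": 3, "sp_b1": 0, "sp_c1": 5}
control  = {"sp_a1": 4,  "sp_a2": 9, "sp_b1": 7, "sp_c1": 6}

def aggregate(sample, rank_map):
    """Sum species abundances into their higher-rank taxon."""
    out = {}
    for sp, n in sample.items():
        out[rank_map[sp]] = out.get(rank_map[sp], 0) + n
    return out

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance dictionaries."""
    keys = set(x) | set(y)
    num = sum(abs(x.get(k, 0) - y.get(k, 0)) for k in keys)
    den = sum(x.get(k, 0) + y.get(k, 0) for k in keys)
    return num / den

species_d = bray_curtis(polluted, control)
genus_d = bray_curtis(aggregate(polluted, GENUS), aggregate(control, GENUS))
print(round(species_d, 3), round(genus_d, 3))
```

The paper's question is then how closely the genus- and family-level dissimilarity matrices reproduce the disturbance pattern seen at species level.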
Abstract:
Here we describe the development of MALTS, a generalized software tool that simulates Lorentz Transmission Electron Microscopy (LTEM) contrast of magnetic nanostructures. Complex magnetic nanostructures typically have multiple stable domain structures. MALTS works in conjunction with the open-access micromagnetic software Object Oriented Micromagnetic Framework (OOMMF) or MuMax. Magnetically stable trial magnetization states of the object of interest are input into MALTS, and simulated LTEM images are output. MALTS computes the magnetic and electric phases accrued by the transmitted electrons via the Aharonov-Bohm expressions. Transfer and envelope functions are used to simulate the progression of the electron wave through the microscope lenses, and the final contrast image due to these effects is determined by Fourier optics. Similar approaches have been used previously for simulations of specific cases of LTEM contrast; the novelty here is the integration with micromagnetic codes via a simple user interface, enabling the computation of the contrast from any structure. The output from MALTS is in good agreement with both experimental data and published LTEM simulations. A widely available generalized code for the analysis of Lorentz contrast is a much-needed step towards the use of LTEM as a standardized laboratory technique.
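A much-reduced 1-D sketch of the simulation chain described above (not MALTS itself): the magnetic phase of a 180° domain wall is accrued via the Aharonov-Bohm expression, and the defocused (Fresnel) contrast is obtained by applying a defocus transfer function in Fourier space. All parameter values are illustrative and sign conventions vary between treatments:

```python
import cmath
import math

N, dx = 256, 2e-9                    # grid points, pixel size (m)
t, Bs = 20e-9, 1.0                   # film thickness (m), saturation induction (T)
lam, df = 2.5e-12, 1e-3              # electron wavelength (m), defocus (m)
E_OVER_HBAR = 1.602e-19 / 1.055e-34  # e / hbar (SI)

# 180-degree domain wall at the centre: the in-plane induction flips sign
By = [Bs if i < N // 2 else -Bs for i in range(N)]

# Aharonov-Bohm magnetic phase: d(phi)/dx = (e t / hbar) * B_y
phi, acc = [], 0.0
for b in By:
    acc += E_OVER_HBAR * t * b * dx
    phi.append(acc)
psi = [cmath.exp(1j * p) for p in phi]  # pure phase object (no absorption)

def dft(v, sign):
    """Unitary discrete Fourier transform by direct summation (slow but simple)."""
    n = len(v)
    return [sum(v[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k in range(n)) / math.sqrt(n) for j in range(n)]

# Fourier-optics step: multiply the spectrum by the defocus transfer function
spec = dft(psi, -1)
q = [(j if j < N // 2 else j - N) / (N * dx) for j in range(N)]
spec = [s * cmath.exp(-1j * math.pi * lam * df * qj * qj) for s, qj in zip(spec, q)]
I = [abs(v) ** 2 for v in dft(spec, +1)]

# the wall shows up as a bright or dark Fresnel fringe in the defocused image
print(round(max(abs(v - 1.0) for v in I), 2))
```

A full simulation adds the electrostatic phase, aperture and envelope functions, and 2-D magnetization maps imported from the micromagnetic code; the pure-phase object here only demonstrates why defocus is needed to make the wall visible at all (in focus, |psi|^2 = 1 everywhere).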
Abstract:
Despite considerable advances in reducing the production of dioxin-like toxicants in recent years, contamination of the food chain still occasionally occurs, resulting in huge losses to the agri-food sector and risk to human health through exposure. Dioxin-like toxicity is exhibited by a range of stable and bioaccumulative compounds including polychlorinated dibenzo-p-dioxins (PCDDs) and dibenzofurans (PCDFs), produced by certain types of combustion, and man-made coplanar polychlorinated biphenyls (PCBs), as found in electrical transformer oils. While dioxinergic compounds act by a common mode of action, making biomarker-based exposure detection techniques potentially useful, the influence of co-contaminating toxicants on such approaches needs to be considered. To assess the impact of possible interactions, the biological responses of H4IIE cells to challenge by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in combination with PCB-52 and benzo[a]pyrene (BaP) were evaluated by a number of methods in this study. Ethoxyresorufin-O-deethylase (EROD) induction in TCDD-exposed cells was suppressed by increasing concentrations of PCB-52, PCB-153 or BaP up to 10 µM. BaP levels below 1 µM suppressed TCDD-stimulated EROD induction, but at higher concentrations EROD induction was greater than the maximum observed when cells were treated with TCDD alone. A similar biphasic interaction of BaP with TCDD co-exposure was noted in the AlamarBlue assay and, to a lesser extent, with PCB-52. Surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF) profiling of the peptidomic responses of cells exposed to compound combinations was compared. Cells co-exposed to TCDD in the presence of BaP or PCB-52 produced the most differentiated spectra, with a substantial number of non-additive interactions observed.
These findings suggest that interactions between dioxin and other toxicants create novel, additive, and non-additive effects, which may be more indicative of the types of responses seen in exposed animals than those of single exposures to the individual compounds.
Abstract:
Objectives: Interference between a target and simultaneous maskers occurs both at the cochlear level through energetic masking and more centrally through informational masking (IM). Hence, quantifying the amount of IM requires strict control of the energetic component. Presenting target and maskers on different sides (i.e., dichotically) reduces energetic masking but provides listeners with important lateralization cues that also drastically reduce IM. The main purpose of this study (Experiment 1) was to evaluate a "switch" manipulation aimed at restoring most of the IM despite dichotic listening. Experiment 2 was designed to investigate the source of the difficulty induced by this switching dichotic condition.
Design: In Experiment 1, the authors presented 60 normal-hearing young adults with a detection task in which a regularly repeating target was embedded in a randomly varying background masker. The authors evaluated the spatial masking release induced by three different dichotic listening conditions in comparison with a diotic baseline. Dichotic stimuli were presented in either a nonswitching or a switching condition. In the latter case, the presentation sides of the dichotic target and maskers alternated several times throughout 10-sec sequences. The impact of the number of switches on IM was investigated parametrically, with both pure- and complex-tone sequences. In Experiment 2, the authors compared the performance of 13 young, normal-hearing listeners in a monotic and a dichotic version of the rapidly switching condition, using pure-tone sequences.
Results: When target and maskers switched rapidly within sequences, IM was significantly stronger than in nonswitching dichotic sequences and was comparable with the masking induced by diotic sequences. Furthermore, Experiment 2 suggests that rapidly switching target and maskers prevents listeners from relying on the lateralization cues inherent to the dichotic condition, hence preserving important amounts of IM.
Conclusions: This paradigm thus provides an original tool to isolate IM in signal and maskers having overlapping spectra.
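The switching manipulation above amounts to an ear-assignment schedule: the sequence is cut into equal segments and the ear carrying the target alternates at each cut, with the maskers always on the opposite ear. A toy sketch (segment counts and durations are illustrative, not the study's parameter values):

```python
def ear_schedule(duration_s=10.0, n_switches=4, target_first="left"):
    """Onset time (s) of each segment and the ear carrying the target;
    the maskers go to the opposite ear in every segment."""
    ears = ("left", "right") if target_first == "left" else ("right", "left")
    seg = duration_s / (n_switches + 1)
    return [(round(i * seg, 3), ears[i % 2]) for i in range(n_switches + 1)]

print(ear_schedule())
```

Rapid switching makes the lateral position of the target unpredictable moment to moment, which is why the paradigm removes the lateralization cue while keeping the presentation dichotic.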
Abstract:
Aflatoxin B1 (AFB1), ochratoxin A (OTA) and fumonisin B1 (FB1) are important mycotoxins in terms of human exposure via food, their toxicity and the regulatory limits that exist worldwide. Mixtures of toxins can frequently be present in foods; however, due to the complications of determining their combined toxicity, legal limits of exposure are determined for single compounds, based on long-standing toxicological techniques. High content analysis (HCA) may be a useful tool to determine the total toxicity of complex mixtures of mycotoxins. Endpoints including cell number (CN), nuclear intensity (NI), nuclear area (NA), plasma membrane permeability (PMP), mitochondrial membrane potential (MMP) and mitochondrial mass (MM) were compared to the conventional 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and neutral red (NR) endpoints in MDBK cells. Individual concentrations of each mycotoxin (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml) revealed no cytotoxicity with MTT or NR, but HCA showed significant cytotoxic effects up to 41.6% (p < 0.001) and 10.1% (p < 0.05) for OTA and AFB1, respectively. The tertiary mixture (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml) detected up to 37.3% and 49.8% more cytotoxicity using HCA over MTT and NR, respectively. Binary combinations of OTA (3 mg/ml) and FB1 (8 mg/ml) revealed synergistic interactions using HCA (MMP, MM and NI endpoints) that were not detected using MTT or NR. HCA is a highly novel and sensitive tool that could substantially help determine future regulatory limits for single and combined toxins present in food, ensuring legislation is based on the true risks of human exposure.
Abstract:
Ready-to-eat (RTE) foods can be consumed with minimal or no further preparation; because they are composed of mixed ingredients, their processing is complex and involves thorough decontamination processes. Compared with conventional preservation technologies, novel processing technologies can enhance the safety and quality of these complex products by reducing the risk of pathogens and/or by preserving related health-promoting compounds. These novel technologies can be divided into two categories: thermal and non-thermal. As a non-thermal treatment, High Pressure Processing is a very promising methodology that can be applied even to already-packaged RTE foods. A new "volumetric" microwave heating technology is an interesting cooking and decontamination method applied directly to foods. Cold Plasma technology is a potential substitute for chlorine washing in fresh vegetable decontamination. Ohmic heating is a heating method applicable to viscous products as well as meat products. Producers of RTE foods have to deal with challenging decisions, from the choice of ingredient suppliers to the distribution chain, taking into account not only cost but also the benefits and the safety and quality of the food products. Novel processing technologies can be a valuable yet large investment for many SME food manufacturers, but such producers need supporting data to make adequate decisions. Within the FP7 Cooperation programme funded by the European Commission, the STARTEC project aims to develop an IT decision-support tool to help food business operators in their risk assessment and future decision making when producing RTE foods with or without novel preservation technologies.