935 results for pragmatic problem of induction


Relevance:

100.00%

Publisher:

Abstract:

In this study we address the problem of the response of an (electro)chemical oscillator to chemical perturbations of different magnitudes. The chemical perturbation was achieved by addition of distinct amounts of trifluoromethanesulfonate (TFMSA), a rather stable and non-specifically adsorbing anion, and the system under investigation was the methanol electro-oxidation reaction under both stationary and oscillatory regimes. Increasing the anion concentration resulted in a decrease in the reaction rates of methanol oxidation and a general narrowing of the parameter window where oscillations occurred. Furthermore, the addition of TFMSA was found to decrease the induction period and the total duration of oscillations. The mechanism underlying these observations was derived mathematically and revealed that inhibition of methanol oxidation through blockage of active sites further accelerates the intrinsic non-stationarity of the unperturbed system. Altogether, the presented results are among the few concerning the experimental assessment of the sensitivity of an oscillator to chemical perturbations. The universal nature of the complex chemical oscillator investigated here makes it a potential reference system when studying the dynamics of other, less accessible perturbed networks of (bio)chemical reactions.

Relevance:

100.00%

Publisher:

Abstract:

Introduction

1.1 Occurrence of polycyclic aromatic hydrocarbons (PAH) in the environment

Worldwide industrial and agricultural developments have released a large number of natural and synthetic hazardous compounds into the environment due to careless waste disposal, illegal waste dumping and accidental spills. As a result, there are numerous sites in the world that require cleanup of soils and groundwater. Polycyclic aromatic hydrocarbons (PAHs) are one of the major groups of these contaminants (Da Silva et al., 2003). PAHs constitute a diverse class of organic compounds consisting of two or more aromatic rings with various structural configurations (Prabhu and Phale, 2003). Being derivatives of benzene, PAHs are thermodynamically stable. In addition, these chemicals tend to adhere to particle surfaces, such as soils, because of their low water solubility and strong hydrophobicity, and this results in greater persistence under natural conditions. This persistence, coupled with their potential carcinogenicity, makes PAHs problematic environmental contaminants (Cerniglia, 1992; Sutherland, 1992). PAHs are widely found in high concentrations at many industrial sites, particularly those associated with the petroleum, gas production and wood preserving industries (Wilson and Jones, 1993).

1.2 Remediation technologies

Conventional techniques used for the remediation of soil polluted with organic contaminants include excavation of the contaminated soil and disposal to a landfill, or capping (containment) of the contaminated areas of a site. These methods have some drawbacks. The first method simply moves the contamination elsewhere and may create significant risks in the excavation, handling and transport of hazardous material. Additionally, it is very difficult and increasingly expensive to find new landfill sites for the final disposal of the material. The cap-and-containment method is only an interim solution, since the contamination remains on site, requiring monitoring and maintenance of the isolation barriers long into the future, with all the associated costs and potential liability. A better approach than these traditional methods is to completely destroy the pollutants, if possible, or to transform them into harmless substances. Some technologies that have been used are high-temperature incineration and various types of chemical decomposition (for example, base-catalyzed dechlorination and UV oxidation). However, these methods have significant disadvantages, principally their technological complexity, high cost, and the lack of public acceptance. Bioremediation, by contrast, is a promising option for the complete removal and destruction of contaminants.

1.3 Bioremediation of PAH contaminated soil & groundwater

Bioremediation is the use of living organisms, primarily microorganisms, to degrade or detoxify hazardous wastes into harmless substances such as carbon dioxide, water and cell biomass. Most PAHs are biodegradable under natural conditions (Da Silva et al., 2003; Meysami and Baheri, 2003), and bioremediation for cleanup of PAH wastes has been extensively studied at both laboratory and commercial levels. It has been implemented at a number of contaminated sites, including the cleanup of the Exxon Valdez oil spill in Prince William Sound, Alaska in 1989, the Mega Borg spill off the Texas coast in 1990 and the Burgan Oil Field, Kuwait in 1994 (Purwaningsih, 2002). Different strategies for PAH bioremediation, such as in situ, ex situ or on-site bioremediation, have been developed in recent years. In situ bioremediation is a technique applied to soil and groundwater at the site without removing the contaminated soil or groundwater, based on the provision of optimum conditions for microbiological contaminant breakdown. Ex situ bioremediation of PAHs, on the other hand, is a technique applied to soil and groundwater which has been removed from the site via excavation (soil) or pumping (water). Hazardous contaminants are converted in controlled bioreactors into harmless compounds in an efficient manner.

1.4 Bioavailability of PAH in the subsurface

Frequently, PAH contamination in the environment occurs as contaminants sorbed onto soil particles rather than in a separate phase (NAPL, non-aqueous phase liquids). It is known that the biodegradation rate of most PAHs sorbed onto soil is far lower than the rates measured in solution cultures of microorganisms with pure solid pollutants (Alexander and Scow, 1989; Hamaker, 1972). It is generally believed that only the fraction of PAHs dissolved in the solution can be metabolized by microorganisms in soil. The amount of contaminant that can be readily taken up and degraded by microorganisms is defined as the bioavailability (Bosma et al., 1997; Maier, 2000). Two phenomena have been suggested to cause the low bioavailability of PAHs in soil (Danielsson, 2000). The first is strong adsorption of the contaminants to the soil constituents, which then leads to very slow release rates of contaminants to the aqueous phase. Sorption is often well correlated with soil organic matter content (Means, 1980) and significantly reduces biodegradation (Manilal and Alexander, 1991). The second phenomenon is slow mass transfer of pollutants, such as pore diffusion in the soil aggregates or diffusion in the organic matter in the soil. The complex set of these physical, chemical and biological processes is schematically illustrated in Figure 1. As shown in Figure 1, biodegradation processes take place in the soil solution, while diffusion processes occur in the narrow pores in and between soil aggregates (Danielsson, 2000). Seemingly contradictory studies can be found in the literature indicating that the rate and final extent of metabolism may be either lower or higher for PAHs sorbed onto soil than for pure PAHs (Van Loosdrecht et al., 1990). These contrasting results demonstrate that the bioavailability of organic contaminants sorbed onto soil is far from being well understood. Besides bioavailability, there are several other factors influencing the rate and extent of biodegradation of PAHs in soil, including microbial population characteristics, physical and chemical properties of PAHs and environmental factors (temperature, moisture, pH, degree of contamination).

Figure 1: Schematic diagram showing possible rate-limiting processes during bioremediation of hydrophobic organic contaminants in a contaminated soil-water system (not to scale) (Danielsson, 2000).

1.5 Increasing the bioavailability of PAH in soil

Attempts to improve the biodegradation of PAHs in soil by increasing their bioavailability include the use of surfactants, solvents or solubility enhancers. However, the introduction of synthetic surfactants may result in the addition of one more pollutant (Wang and Brusseau, 1993). A study conducted by Mulder et al. showed that the introduction of hydroxypropyl-β-cyclodextrin (HPCD), a well-known PAH solubility enhancer, significantly increased the solubilization of PAHs although it did not improve the biodegradation rate of PAHs (Mulder et al., 1998), indicating that further research is required in order to develop a feasible and efficient remediation method. Enhancing the extent of PAH mass transfer from the soil phase to the liquid phase might prove an efficient and environmentally low-risk way of addressing the problem of slow PAH biodegradation in soil.
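To make the bioavailability argument above concrete, the toy model below (a speculative illustration, not taken from this thesis) couples a sorbed PAH pool that desorbs slowly into the aqueous phase with first-order biodegradation of the dissolved fraction only; all rate constants are arbitrary, but the sketch shows how a small desorption rate, rather than the biodegradation rate, ends up controlling overall removal.

```python
# Toy illustration (not from the thesis): only the dissolved PAH fraction is
# biodegraded, so slow desorption from soil limits overall removal.
# All rate constants are hypothetical values chosen for demonstration only.

def simulate(k_des, k_bio, s0=1.0, c0=0.0, dt=0.01, t_end=100.0):
    """Euler integration of
         dS/dt = -k_des * S               (desorption from soil)
         dC/dt =  k_des * S - k_bio * C   (biodegradation of dissolved PAH)
       Returns the fraction of the initial PAH mass removed at t_end."""
    s, c = s0, c0
    for _ in range(int(t_end / dt)):
        ds = -k_des * s
        dc = k_des * s - k_bio * c
        s += ds * dt
        c += dc * dt
    removed = s0 + c0 - (s + c)
    return removed / (s0 + c0)

if __name__ == "__main__":
    for k_des in (1.0, 0.1, 0.01):   # fast -> slow desorption
        frac = simulate(k_des=k_des, k_bio=0.5)
        print(f"k_des = {k_des:5.2f}  ->  fraction removed after t=100: {frac:.2f}")
```

With a fixed biodegradation rate, lowering the desorption constant sharply reduces the removed fraction, which is the mass-transfer limitation the thesis sets out to overcome.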

Relevance:

100.00%

Publisher:

Abstract:

BTES (borehole thermal energy storage) systems exchange thermal energy by conduction with the surrounding ground through borehole materials. The spatial variability of the geological properties and the space-time variability of hydrogeological conditions affect the real power rate of heat exchangers and, consequently, the amount of energy extracted from / injected into the ground. For this reason, it is not an easy task to identify the underground thermal properties to use in design. At the current state of technology, the Thermal Response Test (TRT) is the in situ test that characterizes ground thermal properties with the highest degree of accuracy, but it does not fully solve the problem of characterizing the thermal properties of a shallow geothermal reservoir, simply because it characterizes only the neighborhood of the heat exchanger at hand and only for the test duration. Different analytical and numerical models exist for the characterization of shallow geothermal reservoirs, but they are still inadequate and not exhaustive: more sophisticated models must be taken into account, and a geostatistical approach is needed to tackle natural variability and estimate uncertainty. The approach adopted for reservoir characterization is the “inverse problem”, typical of oil & gas field analysis. Similarly, we create different realizations of thermal properties by direct sequential simulation and find the one that best fits the real production data (fluid temperature over time). The software used to develop the heat production simulation is FEFLOW 5.4 (Finite Element subsurface FLOW system). A geostatistical reservoir model has been set up based on literature thermal property data and spatial variability hypotheses, and a real TRT has been tested. We also analyzed and used two other codes (SA-Geotherm and FV-Geotherm), which are two implementations of the same numerical model as FEFLOW (the Al-Khoury model).
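The realization-selection step described above (finding the realization that best fits the real production data) can be sketched as a simple misfit-minimization loop. The snippet below is only a schematic outline under our own assumptions: run_forward_model stands in for whatever forward heat-transport simulator is used (e.g. an external call to a simulation code) and is not the FEFLOW API.

```python
# Rank geostatistical realizations of thermal properties by how well their
# simulated fluid temperatures match the measured production data.
import math

def rmse(simulated, measured):
    """Root-mean-square misfit between simulated and measured temperature series."""
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured)) / len(measured))

def best_realization(realizations, measured_temps, run_forward_model):
    """Return (index, misfit) of the realization whose simulated fluid
    temperature over time best fits the measured data."""
    best_idx, best_misfit = None, float("inf")
    for i, props in enumerate(realizations):
        simulated = run_forward_model(props)   # forward heat-production simulation
        misfit = rmse(simulated, measured_temps)
        if misfit < best_misfit:
            best_idx, best_misfit = i, misfit
    return best_idx, best_misfit
```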

Relevance:

100.00%

Publisher:

Abstract:

Patients with moderate to severe psoriasis are undertreated. To solve this persistent problem, a consensus programme was performed to define goals for treatment of plaque psoriasis with systemic therapy and to improve patient care. An expert consensus meeting and a collaborative Delphi procedure were carried out. Nineteen dermatologists from different European countries met for a face-to-face discussion and defined items through a four-round Delphi process. Severity of plaque psoriasis was graded into mild and moderate to severe disease. Mild disease was defined as body surface area (BSA) ≤10 and psoriasis area and severity index (PASI) ≤10 and dermatology life quality index (DLQI) ≤10, and moderate to severe psoriasis as (BSA > 10 or PASI > 10) and DLQI > 10. Special clinical situations may change mild psoriasis to moderate to severe, including involvement of visible areas or severe nail involvement. For systemic therapy of plaque psoriasis two treatment phases were defined: (1) the induction phase, the treatment period until week 16, which depending on the type of drug and dose regimen used may be extended until week 24; and (2) the maintenance phase, defined for all drugs as the treatment period after the induction phase. For the definition of treatment goals in plaque psoriasis, the change of PASI from baseline until the time of evaluation (ΔPASI) and the absolute DLQI were used. After induction and during maintenance therapy, treatment can be continued if the reduction in PASI is ≥75%. The treatment regimen should be modified if the improvement in PASI is <50%. In a situation where the therapeutic response improved ≥50% but <75%, as assessed by PASI, therapy should be modified if the DLQI is >5 but can be continued if the DLQI is ≤5. This programme defines the severity of plaque psoriasis for the first time using a formal consensus of 19 European experts. In addition, treatment goals for moderate to severe disease were established. Implementation of treatment goals in the daily management of psoriasis will improve patient care and mitigate the problem of undertreatment. It is planned to evaluate the implementation of these treatment goals in a subsequent programme involving patients and physicians.
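The severity grading and treatment-goal algorithm stated above can be written out explicitly. The sketch below is an illustrative encoding of the criteria exactly as given in the abstract, not a clinical decision tool; the function and variable names are ours.

```python
def treatment_goal(delta_pasi_percent, dlqi):
    """Treatment-goal rule as summarized in the abstract:
       continue if PASI improvement >= 75%; modify if < 50%;
       for >=50% but <75%, decide on the DLQI (continue if <= 5, modify if > 5)."""
    if delta_pasi_percent >= 75:
        return "continue"
    if delta_pasi_percent < 50:
        return "modify"
    return "continue" if dlqi <= 5 else "modify"

def is_mild(bsa, pasi, dlqi):
    """Mild disease: BSA <= 10 and PASI <= 10 and DLQI <= 10."""
    return bsa <= 10 and pasi <= 10 and dlqi <= 10

def is_moderate_to_severe(bsa, pasi, dlqi):
    """Moderate to severe disease: (BSA > 10 or PASI > 10) and DLQI > 10."""
    return (bsa > 10 or pasi > 10) and dlqi > 10
```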

Relevance:

100.00%

Publisher:

Abstract:

The production of immunoglobulin A (IgA) in mammals exceeds that of all other isotypes, and IgA is mostly exported across mucous membranes. The discovery of IgA and the realization that it dominates humoral mucosal immunity, in contrast to the IgG dominance of the systemic immune system, was early evidence for the distinct nature of mucosal immunology. It is now clear that IgA can function in high-affinity modes for neutralization of toxins and pathogenic microbes, and as a low-affinity system to contain the dense commensal microbiota within the intestinal lumen. The basic map of induction of IgA B cells in the Peyer's patches, which then circulate through the lymph and bloodstream to seed the mucosa with precursors of plasma cells that produce dimeric IgA for export through the intestinal epithelium, has been known for more than 30 years. In this review, we discuss the mechanisms underlying selective induction of mucosal B cells for IgA production and the immune geography of their homing characteristics. We also review the functionality of secretory IgA directed against both commensal organisms and pathogens.

Relevance:

100.00%

Publisher:

Abstract:

GOALS OF WORK: In patients with locally advanced esophageal cancer, only those responding to the treatment ultimately benefit from preoperative chemoradiation. We investigated whether changes in subjective dysphagia or eating restrictions after two cycles of induction chemotherapy can predict the histopathological tumor response observed after chemoradiation. In addition, we examined general long-term quality of life (QoL) and, in particular, eating restrictions after esophagectomy. MATERIALS AND METHODS: Patients with resectable, locally advanced squamous cell carcinoma or adenocarcinoma of the esophagus were treated with two cycles of chemotherapy followed by chemoradiation and surgery. They were asked to complete the EORTC oesophageal-specific QoL module (EORTC QLQ-OES24) and linear analogue self-assessment QoL indicators before and during neoadjuvant therapy and quarterly until 1 year postoperatively. A median change of at least eight points was considered clinically meaningful. MAIN RESULTS: Clinically meaningful improvements in the median scores for dysphagia and eating restrictions were found during induction chemotherapy. These improvements were not associated with the histopathological response observed after chemoradiation, but they enhanced treatment compliance. Postoperatively, dysphagia scores remained low at 1 year, while eating restrictions persisted more frequently in patients with extended transthoracic resection than in those with limited transhiatal resection. CONCLUSIONS: The improvement of dysphagia and eating restrictions after induction chemotherapy did not predict the tumor response observed after chemoradiation. One year after esophagectomy, dysphagia was a minor problem and global QoL was rather good. Eating restrictions persisted depending on the surgical technique used.

Relevance:

100.00%

Publisher:

Abstract:

In lucid dreams the dreamer is aware of dreaming and is often able to influence the ongoing dream content. Lucid dreaming is a learnable skill, and a variety of techniques has been suggested for lucid dream induction. This systematic review evaluated the evidence for the effectiveness of induction techniques. A comprehensive literature search was carried out in biomedical databases and specific resources. Thirty-five studies were included in the analysis (11 sleep laboratory and 24 field studies), of which 26 employed cognitive techniques, 11 external stimulation and one drug application. The methodological quality of the included studies was relatively low. None of the induction techniques were verified to induce lucid dreams reliably and consistently, although some of them look promising. On the basis of the reviewed studies, a taxonomy of lucid dream induction methods is presented. Several methodological issues are discussed, and further directions for future studies are proposed.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE A case is presented and a systematic review of the literature is provided to update our current knowledge of induction of fear by cortical stimulation. METHODS We present a case of refractory epilepsy associated with a lesion where fear could be induced by intraoperative electrical stimulation of the posterior inner part of the superior temporal gyrus. We performed a systematic review of the literature using PubMed with the key words "epilepsy AND emotion", "cortical stimulation AND emotion," and "human brain stimulation AND behavior". RESULTS Intraoperative cortical stimulation of the inner part of the posterior superior temporal gyrus reliably induced fear and progressive screaming behavior. Stimulation through subdural grid electrodes did not induce this phenomenon. A systematic review of the literature identified fear induction by stimulation of different widespread cortical areas including the temporal pole, the insula, and the anterior cingulate cortex. The posterior part of the superior temporal gyrus has so far not been associated with fear induction after electrical stimulation. CONCLUSION Although our observation suggests that this area of the brain could be part of a network involved in the elicitation of fear, dysfunction of this network induced by epilepsy could also explain the observed phenomenon. Electrophysiologic and imaging studies must be conducted to improve our understanding of the cortical networks forming the neuroanatomical substrate of higher brain functions and experiences such as fear.

Relevance:

100.00%

Publisher:

Abstract:

This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first-order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, what their orientation is, or how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large even for very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
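As a rough illustration of the enumeration idea (not the paper's first-order-logic formulation), the toy example below works in a one-dimensional "scene": each of two cameras looks along a row of voxels from one end and records the colour of the first occupied voxel it meets. Enumerating every occupancy/colour assignment and keeping those that reproduce both observed pixels shows directly how many distinct scenes can explain the same images.

```python
# Toy 1D analogue: many voxel scenes are consistent with the same two pixels.
from itertools import product

COLOURS = ("red", "green")
EMPTY = None
BACKGROUND = "background"

def render(scene, from_left=True):
    """Pixel value seen when looking along the row from one end."""
    ordered = scene if from_left else tuple(reversed(scene))
    for voxel in ordered:
        if voxel is not EMPTY:
            return voxel
    return BACKGROUND

def consistent_scenes(n_voxels, left_pixel, right_pixel):
    """All assignments of {empty, colours} to the voxels that reproduce both pixels."""
    states = (EMPTY,) + COLOURS
    return [scene for scene in product(states, repeat=n_voxels)
            if render(scene, True) == left_pixel
            and render(scene, False) == right_pixel]

if __name__ == "__main__":
    true_scene = ("red", EMPTY, "green")
    left, right = render(true_scene, True), render(true_scene, False)
    solutions = consistent_scenes(len(true_scene), left, right)
    print(f"{len(solutions)} scenes are consistent with the two pixels, e.g. {solutions[:3]}")
```

Because the state space is discrete, the total number of candidate scenes (here 3^3) and the number of consistent ones can be counted exactly, which mirrors the paper's point that the solution space size is easy to compute.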

Relevance:

100.00%

Publisher:

Abstract:

A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member each generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient size of training batch each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although using independent patterns is a very inefficient use of training patterns in general, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.
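A minimal simulation of the setting described above might look as follows: a generational GA over binary perceptron weights, with fitness measured on a freshly drawn batch of teacher-labelled patterns every generation. All parameter values and operator choices here are our own illustrative assumptions, not the paper's formalism.

```python
# Sketch: GA searching binary weight vectors of a perceptron, evaluated each
# generation on a new random batch of patterns labelled by a teacher perceptron.
import numpy as np

rng = np.random.default_rng(0)
N, POP, BATCH, GENS, MUT = 50, 40, 30, 100, 1.0 / 50

teacher = rng.choice([-1, 1], size=N)              # target binary weights
population = rng.choice([-1, 1], size=(POP, N))    # candidate binary weights

def batch_fitness(pop, batch_size):
    """Fraction of a fresh batch of random patterns classified as the teacher does."""
    patterns = rng.choice([-1, 1], size=(batch_size, N))
    targets = np.sign(patterns @ teacher)
    outputs = np.sign(patterns @ pop.T)            # shape (batch, POP)
    return (outputs == targets[:, None]).mean(axis=0)

for gen in range(GENS):
    fitness = batch_fitness(population, BATCH)     # new batch every generation
    probs = np.exp(5.0 * fitness)                  # Boltzmann-like selection
    probs /= probs.sum()
    parents = population[rng.choice(POP, size=POP, p=probs)]
    mask = rng.random((POP, N)) < 0.5              # uniform crossover
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    flip = rng.random((POP, N)) < MUT              # bit-flip mutation
    children[flip] *= -1
    population = children

overlap = (population @ teacher) / N               # generalization grows with overlap
print("best teacher overlap:", overlap.max())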

Relevance:

100.00%

Publisher:

Abstract:

In a series of studies, I investigated the developmental changes in children’s inductive reasoning strategy, methodological manipulations affecting the trajectory, and driving mechanisms behind the development of category induction. I systematically controlled the nature of the stimuli used, and employed a triad paradigm in which perceptual cues were directly pitted against category membership, to explore under which circumstances children used perceptual or category induction. My induction tasks were designed for children aged 3-9 years old using biologically plausible novel items. In Study 1, I tested 264 children. Using a wide age range allowed me to systematically investigate the developmental trajectory of induction. I also created two degrees of perceptual distractor – high and low – and explored whether the degree of perceptual similarity between target and test items altered children’s strategy preference. A further 52 children were tested in Study 2, to examine whether children showing a perceptual-bias were in fact basing their choice on maturation categories. A gradual transition was observed from perceptual to category induction. However, this transition could not be due to the inability to inhibit high perceptual distractors as children of all ages were equally distracted. Children were also not basing their strategy choices on maturation categories. In Study 3, I investigated category structure (featural vs. relational category rules) and domain (natural vs. artefact) on inductive preference. I tested 403 children. Each child was assigned to either the featural or relational condition, and completed both a natural kind and an artefact task. A further 98 children were tested in Study 4, on the effect of using stimuli labels during the tasks. I observed the same gradual transition from perceptual to category induction preference in Studies 3 and 4. This pattern was stable across domains, but children developed a category-bias one year later for relational categories, arguably due to the greater demands on executive function (EF) posed by these stimuli. Children who received labels during the task made significantly more category choices than those who did not receive labels, possibly due to priming effects. Having investigated influences affecting the developmental trajectory, I continued by exploring the driving mechanism behind the development of category induction. In Study 5, I tested 60 children on a battery of EF tasks as well as my induction task. None of the EF tasks were able to predict inductive variance, therefore EF development is unlikely to be the driving factor behind the transition. Finally in Study 6, I divided 252 children into either a comparison group or an intervention group. The intervention group took part in an interactive educational session at Twycross Zoo about animal adaptations. Both groups took part in four induction tasks, two before and two a week after the zoo visits. There was a significant increase in the number of category choices made in the intervention condition after the zoo visit, a result not observed in the comparison condition. This highlights the role of knowledge in supporting the transition from perceptual to category induction. I suggest that EF development may support induction development, but the driving mechanism behind the transition is an accumulation of knowledge, and an appreciation for the importance of category membership.

Relevance:

100.00%

Publisher:

Abstract:

Physical distribution plays an important role in contemporary logistics management. Both customer satisfaction and company competitiveness can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem which consists of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, the existing solution approaches for the MDVRP are not satisfactory because some unrealistic assumptions were made on the first sub-problem of the MDVRP, the customer assignment problem. To refine the approaches, the focus of this paper is confined to this problem only. This paper formulates the customer assignment problem as a minimax-type integer linear programming model with the objective of minimizing the cycle time of the depots, where setup times are explicitly considered. Since the problem is proven to be NP-complete, a genetic algorithm is developed for solving it. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
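A simple version of such a customer-assignment GA might look like the sketch below, which assigns each customer to a depot and minimizes the largest depot cycle time (setup time plus assigned workload). The encoding, operators and data are illustrative assumptions, not the paper's actual model.

```python
# Hedged sketch of a GA for a minimax customer-assignment problem.
import random

random.seed(1)
N_CUSTOMERS, N_DEPOTS = 30, 3
service = [random.uniform(1, 5) for _ in range(N_CUSTOMERS)]   # per-customer workload
setup = [random.uniform(2, 6) for _ in range(N_DEPOTS)]        # per-depot setup time

def cycle_time(assign):
    """Minimax objective: the largest depot completion time."""
    load = list(setup)
    for cust, depot in enumerate(assign):
        load[depot] += service[cust]
    return max(load)

def ga(pop_size=60, gens=200, p_mut=0.05):
    pop = [[random.randrange(N_DEPOTS) for _ in range(N_CUSTOMERS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cycle_time)                 # best assignments first
        new_pop = pop[:2]                        # keep two elites
        while len(new_pop) < pop_size:
            a, b = random.sample(pop[:20], 2)    # parents from the best 20
            cut = random.randrange(1, N_CUSTOMERS)
            child = a[:cut] + b[cut:]            # one-point crossover
            child = [random.randrange(N_DEPOTS) if random.random() < p_mut else g
                     for g in child]             # mutation: reassign a customer
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=cycle_time)

best = ga()
print("best max cycle time:", round(cycle_time(best), 2))
```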

Relevance:

100.00%

Publisher:

Abstract:

We investigate the problem of determining the stationary temperature field on an inclusion from given Cauchy data on an accessible exterior boundary. On this accessible part the temperature (or the heat flux) is known, and, additionally, on a portion of this exterior boundary the heat flux (or temperature) is also given. We propose a direct boundary integral approach in combination with Tikhonov regularization for the stable determination of the temperature and flux on the inclusion. To determine these quantities on the inclusion, boundary integral equations are derived using Green’s functions, and properties of these equations are shown in an L2-setting. An effective way of discretizing these boundary integral equations, based on the Nyström method and trigonometric approximations, is outlined. Numerical examples are included, both with exact and noisy data, showing that accurate approximations can be obtained with small computational effort, and that the accuracy increases with the length of the portion of the boundary where the additional data is given.
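The Tikhonov step can be illustrated generically: after discretization of the boundary integral equations one faces an ill-conditioned linear system A x = b relating the unknown boundary data to the measured Cauchy data, which is replaced by the regularized normal equations (AᵀA + αI) x = Aᵀb. The sketch below uses a synthetic ill-conditioned matrix rather than the paper's operators; A, b and α here are placeholders.

```python
# Generic Tikhonov regularization of an ill-conditioned linear system.
import numpy as np

def tikhonov_solve(A, b, alpha):
    """Solve min ||A x - b||^2 + alpha ||x||^2 via the regularized normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 20
    # Severely ill-conditioned test matrix (Hilbert-like) and noisy data.
    A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
    x_true = np.sin(np.linspace(0, np.pi, n))
    b = A @ x_true + 1e-4 * rng.standard_normal(n)
    for alpha in (1e-2, 1e-6, 1e-10):
        x = tikhonov_solve(A, b, alpha)
        print(f"alpha={alpha:.0e}  error={np.linalg.norm(x - x_true):.3f}")
```

Running the snippet shows the usual trade-off: too large a regularization parameter over-smooths, too small a one amplifies the noise, so an intermediate value gives the best reconstruction.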

Relevance:

100.00%

Publisher:

Abstract:

We propose an iterative procedure for the inverse problem of determining the displacement vector on the boundary of a bounded planar inclusion given the displacement and stress fields on an infinite (planar) line-segment. At each iteration step mixed boundary value problems in an elastostatic half-plane containing the bounded inclusion are solved. For efficient numerical implementation of the procedure these mixed problems are reduced to integral equations over the bounded inclusion. Well-posedness and numerical solution of these boundary integral equations are presented, and a proof of convergence of the procedure for the inverse problem to the original solution is given. Numerical investigations are presented both for the direct and inverse problems, and these results show in particular that the displacement vector on the boundary of the inclusion can be found in an accurate and stable way with small computational cost.

Relevance:

100.00%

Publisher:

Abstract:

Previously, we have shown that a maternal low protein diet, fed exclusively during the preimplantation period of mouse development (Emb-LPD), is sufficient to induce by the blastocyst stage a compensatory growth phenotype in late gestation and postnatally, correlating with increased risk of adult-onset cardiovascular disease and behavioural dysfunction. Here, we examine the mechanisms of induction of maternal Emb-LPD programming and the early compensatory responses by the embryo. Emb-LPD induced changes in maternal serum metabolites at the time of blastocyst formation (E3.5), notably reduced insulin and increased glucose, together with reduced levels of free amino acids (AAs) including the branched-chain AAs leucine, isoleucine and valine. Emb-LPD also caused a reduction in the branched-chain AAs within uterine fluid at the blastocyst stage. These maternal changes coincided with an altered content of blastocyst AAs and reduced mTORC1 signalling within blastocysts, evident in reduced phosphorylation of the effector S6 ribosomal protein and its ratio to total S6 protein, but no change in the phosphorylated and total pools of the effector 4E-BP1. These changes were accompanied by increased proliferation of blastocyst trophectoderm and total cells, and subsequently increased spreading of trophoblast cells in blastocyst outgrowths. We propose that induction of metabolic programming following Emb-LPD is achieved through mTORC1 signalling, which acts as a sensor for preimplantation embryos to detect maternal nutrient levels via branched-chain AAs and/or insulin availability. Moreover, this induction step is associated with changes in extra-embryonic trophectoderm behaviour occurring as early compensatory responses leading to later nutrient recovery. © 2012 Fleming et al.