72 results for two-temperature model



Relevance: 100.00%

Abstract:

Aim/Background: TRALI is hypothesised to develop via a two-event mechanism involving both the patient's underlying morbidity and blood product factors. The storage of cellular products has been implicated in cases of non-antibody-mediated TRALI; however, the pathophysiological mechanisms are undefined. We investigated blood product storage-related modulation of inflammatory cells and mediators involved in TRALI. Methods: In an in vitro model, fresh human whole blood was mixed with culture media (control) or LPS as a 1st event and "transfused" with 10% (v/v) pooled supernatant (SN) from Day 1 (D1, n=75) or Day 42 (D42, n=113) packed red blood cells (PRBCs) as a 2nd event. After 6 hrs, culture SN was used to assess the overall inflammatory response (cytometric bead array), and a duplicate assay containing protein transport inhibitor was used to assess neutrophil- and monocyte-specific inflammatory responses using multi-colour flow cytometry. Panels: IL-6, IL-8, IL-10, IL-12, IL-1, TNF, MCP-1, IP-10, MIP-1. Statistics: one-way ANOVA, 95% CI. Results: In the absence of LPS, exposure to D1 or D42 PRBC-SN reduced monocyte expression of IL-6, IL-8 and IL-10. D42 PRBC-SN also reduced monocyte IP-10, and overall IL-8 production was increased. In the presence of LPS, D1 PRBC-SN modified only overall IP-10 levels, which were reduced. However, compared with LPS alone, the combination of LPS and D42 PRBC-SN resulted in increased neutrophil and monocyte production of IL-1 and IL-8 as well as reduced monocyte TNF production. Additionally, LPS and D42 PRBC-SN resulted in overall inflammatory changes: elevated IL-8,

Relevance: 100.00%

Abstract:

The uncertainty associated with how projected climate change will affect global C cycling could have a large impact on predictions of soil C stocks. The purpose of our study was to determine how various soil decomposition and chemistry characteristics relate to soil organic matter (SOM) temperature sensitivity. We accomplished this objective using long-term soil incubations at three temperatures (15, 25, and 35°C) and pyrolysis molecular beam mass spectrometry (py-MBMS) on 12 soils from 6 sites along a mean annual temperature (MAT) gradient (2–25.6°C). The Q10 values calculated from the CO2 respired during a long-term incubation using the Q10-q method showed decomposition of the more resistant fraction to be more temperature sensitive, with a Q10-q of 1.95 ± 0.08 for the labile fraction and 3.33 ± 0.04 for the more resistant fraction. We compared the fit of soil respiration data using a two-pool model (active and slow) with first-order kinetics against a three-pool model and found that the two- and three-pool models fit the data statistically equally well. The three-pool model changed the size and rate constant of the more resistant pool. The size of the active pool in these soils, calculated using the two-pool model, increased with incubation temperature and ranged from 0.1 to 14.0% of initial soil organic C. Sites with an intermediate MAT and the lowest C/N ratio had the largest active pool. Pyrolysis molecular beam mass spectrometry showed declines in carbohydrates with conversion from grassland to wheat cultivation, and a greater amount of protected carbohydrates in allophanic soils, which may have led to the differences found between the total amount of CO2 respired, the size of the active pool, and the Q10-q values of the soils.
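The two-pool kinetics and the Q10 calculation described above can be sketched in code. This is a minimal illustration, not the authors' implementation; the pool sizes and rate constants below are hypothetical.

```python
import math

def two_pool_respiration(t, c_active, k_active, c_slow, k_slow):
    """Cumulative CO2-C respired by time t (days) under a two-pool
    first-order model: each pool contributes C * (1 - exp(-k*t))."""
    return (c_active * (1.0 - math.exp(-k_active * t))
            + c_slow * (1.0 - math.exp(-k_slow * t)))

def q10(k_warm, k_cool, t_warm, t_cool):
    """Q10 temperature sensitivity from rate constants at two temperatures."""
    return (k_warm / k_cool) ** (10.0 / (t_warm - t_cool))

# Hypothetical pool sizes (mg C / g soil) and rate constants (1/day)
resp_30d = two_pool_respiration(30, 1.0, 0.05, 20.0, 0.0005)
sens = q10(k_warm=0.10, k_cool=0.05, t_warm=35.0, t_cool=25.0)  # -> 2.0
```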

Relevance: 90.00%

Abstract:

The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville, near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds is used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced, and a numerical groundwater flow model is developed from it. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments, for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed from bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit.
Both models were calibrated to steady-state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ±10% for all parameters, and still reasonably stable for changes up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site: water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady-state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short-term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it is necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by pump testing to investigate the hydraulic properties of the aquifer.
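The control that the low-permeability upper layer exerts on infiltration can be illustrated with the standard thickness-weighted harmonic mean for the equivalent vertical hydraulic conductivity of layered sediments. The layer thicknesses and conductivities below are hypothetical, not values from the study.

```python
def equivalent_vertical_k(thicknesses, conductivities):
    """Equivalent vertical hydraulic conductivity of a layered sequence
    (thickness-weighted harmonic mean): Kv = sum(b) / sum(b / K)."""
    total_b = sum(thicknesses)
    return total_b / sum(b / k for b, k in zip(thicknesses, conductivities))

# Hypothetical layers: 12 m of silty clay (1e-4 m/day) over 4 m of
# coarse sand and fine gravel (50 m/day)
kv = equivalent_vertical_k([12.0, 4.0], [1e-4, 50.0])
# kv is dominated by the clay layer, i.e. close to its K, not the gravel's
```

The harmonic mean shows why even a thin low-K unit governs vertical flow: the equivalent conductivity stays within a factor of two of the clay's value regardless of how permeable the basal aquifer is.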

Relevance: 90.00%

Abstract:

The melting of spherical nanoparticles is considered from the perspective of heat flow in a pure material and as a moving boundary (Stefan) problem. The dependence of the melting temperature on both the size of the particle and the interfacial tension is described by the Gibbs-Thomson effect, and the resulting two-phase model is solved numerically using a front-fixing method. Results show that interfacial tension increases the speed of the melting process, and furthermore, the temperature distribution within the solid core of the particle exhibits behaviour that is qualitatively different to that predicted by the classical models without interfacial tension.
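The Gibbs-Thomson size dependence of the melting temperature can be sketched directly. The sketch below uses the common form T_m(R) = T_bulk * (1 - 2*sigma / (rho * L * R)); the material constants are illustrative (loosely based on gold) and are not taken from the paper.

```python
def melting_temperature(radius, t_bulk, sigma, rho, latent_heat):
    """Size-dependent melting temperature from the Gibbs-Thomson relation:
    T_m(R) = T_bulk * (1 - 2*sigma / (rho * L * R))."""
    return t_bulk * (1.0 - 2.0 * sigma / (rho * latent_heat * radius))

# Illustrative constants, loosely based on gold (assumed, not from the paper):
# sigma ~ 0.27 J/m^2, rho ~ 19300 kg/m^3, L ~ 6.4e4 J/kg, T_bulk ~ 1337 K
tm_50nm = melting_temperature(50e-9, 1337.0, 0.27, 19300.0, 6.4e4)
tm_2nm = melting_temperature(2e-9, 1337.0, 0.27, 19300.0, 6.4e4)
# The depression grows as 1/R: a 2 nm particle melts far below the bulk value.
```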

Relevance: 90.00%

Abstract:

Extreme cold and heat waves, characterised by a number of cold or hot days in succession, place a strain on people’s cardiovascular and respiratory systems. The increase in deaths due to these waves may be greater than that predicted by extreme temperatures alone. We examined cold and heat waves in 99 US cities over 14 years (1987–2000) and investigated how the risk of death depended on the temperature threshold used to define a wave, and on a wave’s timing, duration and intensity. We defined cold and heat waves as temperatures below a cold threshold or above a heat threshold for two or more days. We tried five cold thresholds, using the first to fifth percentiles of temperature, and five heat thresholds, using the ninety-fifth to ninety-ninth percentiles. The extra wave effects were estimated using a two-stage model to ensure that they were estimated after removing the general effects of temperature. The increases in deaths associated with cold waves were generally small and not statistically significant, and there was even evidence of a decreased risk during the coldest waves. Heat waves generally increased the risk of death, particularly for the hottest heat threshold. Cold waves of a colder intensity or longer duration were not more dangerous. Cold waves earlier in the cool season were more dangerous, as were heat waves earlier in the warm season. In general there was no increased risk of death during cold waves above the known increased risk associated with cold temperatures. Cold or heat waves earlier in the cool or warm season may be more dangerous because of a build-up in the susceptible pool or a lack of preparedness for cold or hot temperatures.
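The wave definition used here, two or more consecutive days beyond a percentile threshold, is straightforward to implement. A minimal sketch with a hypothetical daily temperature series:

```python
def find_waves(temps, threshold, min_days=2, hot=True):
    """Return (start_index, length) of each run of >= min_days consecutive
    days at or beyond a threshold: a heat wave (hot=True) or cold wave."""
    waves, run_start = [], None
    for i, t in enumerate(temps):
        beyond = t >= threshold if hot else t <= threshold
        if beyond and run_start is None:
            run_start = i
        elif not beyond and run_start is not None:
            if i - run_start >= min_days:
                waves.append((run_start, i - run_start))
            run_start = None
    if run_start is not None and len(temps) - run_start >= min_days:
        waves.append((run_start, len(temps) - run_start))
    return waves

# Hypothetical daily maxima; 35 stands in for a city's 95th percentile
temps = [28, 35, 36, 34, 29, 37, 30, 36, 37]
heat_waves = find_waves(temps, threshold=35, min_days=2)  # -> [(1, 2), (7, 2)]
```

Note that the single hot day at index 5 does not qualify: the succession requirement is what separates a wave from an isolated extreme.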

Relevance: 90.00%

Abstract:

Topic modeling has been widely utilized in fields such as information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of a topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which makes topics ambiguous to interpret; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. To overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
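A toy version of the pattern-based idea can be sketched by representing a topic with frequent word pairs rather than single words. This illustrates only the intuition, not the paper's actual pattern mining method; the documents below are made up.

```python
from collections import Counter
from itertools import combinations

def pattern_topic_representation(topic_docs, top_n=3):
    """Represent a topic by its most frequent word *pairs* (a minimal
    stand-in for pattern mining) instead of single high-probability words."""
    pair_counts = Counter()
    for doc in topic_docs:
        words = sorted(set(doc.lower().split()))  # unique words, fixed order
        pair_counts.update(combinations(words, 2))
    return [pair for pair, _ in pair_counts.most_common(top_n)]

# Made-up documents assigned to one topic
docs = ["topic model inference", "topic model evaluation", "graphical model"]
top_pattern = pattern_topic_representation(docs, top_n=1)
```

The pair ('model', 'topic') wins because it co-occurs across documents, whereas a single-word representation would rank the ambiguous word "model" alone at the top.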

Relevance: 90.00%

Abstract:

The use of mobile phones while driving is more prevalent among young drivers, a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision whilst engaged in a mobile phone conversation. The CARRS-Q Advanced Driving Simulator was used to test a sample of young drivers on various simulated driving tasks, including an event that originated within the driver’s peripheral vision, whereby a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were 21 to 26 years old and split evenly by gender. Drivers’ reaction times to a pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Two different model specifications were also tested to account for the structured heterogeneity arising from the repeated measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best-fitting model and identified four significant variables influencing reaction times: phone condition, driver’s age, license type (Provisional license holder or not), and self-reported frequency of use of handheld phones while driving. The reaction times of drivers were more than 40% longer in the distracted condition compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders.
A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated.
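The multiplicative effect of covariates in a Weibull AFT model can be sketched as follows. The coefficients are hypothetical, chosen so that the acceleration factor for the distracted condition is 1.4 (reaction times roughly 40% longer, as reported above); this is not the fitted model from the study.

```python
import math

def aft_median_reaction_time(beta0, beta1, distracted, shape):
    """Median reaction time under a Weibull AFT model:
    T_median = exp(beta0 + beta1 * x) * (ln 2) ** (1 / shape).
    Covariates act multiplicatively on time, so exp(beta1) is the
    acceleration factor for the distracted condition."""
    scale = math.exp(beta0 + beta1 * (1 if distracted else 0))
    return scale * math.log(2.0) ** (1.0 / shape)

# Hypothetical coefficients: exp(beta1) = 1.4, i.e. ~40% longer when distracted
beta0, beta1, shape = math.log(1.1), math.log(1.4), 2.5
ratio = (aft_median_reaction_time(beta0, beta1, True, shape)
         / aft_median_reaction_time(beta0, beta1, False, shape))  # -> 1.4
```

Because the covariate enters the time scale multiplicatively, the 40% slowdown applies across the whole reaction-time distribution, not just its mean.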

Relevance: 90.00%

Abstract:

Unsaturated water flow in soil is commonly modelled using Richards’ equation, which requires the hydraulic properties of the soil (e.g., porosity, hydraulic conductivity, etc.) to be characterised. Naturally occurring soils, however, are heterogeneous in nature; that is, they are composed of a number of interwoven homogeneous soils, each with its own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards’ equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to give the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than the scale of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards’ equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails in cases when the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques that allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner.
Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
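The core operation of an exponential time integrator, applying exp(hA) to the current state so a linear stiff system is advanced exactly in one step, can be sketched in a few lines. The truncated Taylor series below is adequate only for modest h*||A||; production codes use Krylov or scaling-and-squaring methods, and the two-component test system here is purely illustrative.

```python
def expm_times(A, v, h, terms=30):
    """Apply exp(h*A) to vector v via a truncated Taylor series:
    sum_k (h*A)^k v / k!  -- the core step of an exponential integrator.
    Accurate only for modest h*||A|| at this truncation."""
    n = len(v)
    result = v[:]
    term = v[:]
    for k in range(1, terms):
        # term <- (h/k) * A @ term, accumulating (hA)^k v / k!
        term = [h / k * sum(A[i][j] * term[j] for j in range(n))
                for i in range(n)]
        result = [result[i] + term[i] for i in range(n)]
    return result

# Illustrative stiff system y' = A y with one fast and one slow decay rate,
# advanced in a single step of size h with no stability restriction.
A = [[-5.0, 0.0], [0.0, -0.1]]
y = expm_times(A, [1.0, 1.0], h=0.5)  # -> approx [exp(-2.5), exp(-0.05)]
```

Avoiding the Jacobian-based linear solves of implicit schemes in this way is what lets both scales of the two-scale model be advanced together in a fully coupled manner.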

Relevance: 90.00%

Abstract:

BACKGROUND CONTEXT: The Neck Disability Index is frequently used to measure neck outcomes. The statistical rigor of the Neck Disability Index has been assessed with conflicting outcomes. To date, Confirmatory Factor Analysis of the Neck Disability Index has not been reported for a suitably large population study. Because the Neck Disability Index is not a condition-specific measure of neck function, initial Confirmatory Factor Analysis should consider problematic neck patients as a homogeneous group. PURPOSE: We sought to analyze the factor structure of the Neck Disability Index through Confirmatory Factor Analysis in a symptomatic, homogeneous neck population, with respect to the pooled population and gender subgroups. STUDY DESIGN: This was a secondary analysis of pooled data. PATIENT SAMPLE: A total of 1,278 symptomatic neck patients (67.5% female, median age 41 years), 803 nonspecific and 475 with whiplash-associated disorder. OUTCOME MEASURES: The Neck Disability Index was used to measure outcomes. METHODS: We analyzed pooled baseline data from six independent studies of patients with neck problems who completed Neck Disability Index questionnaires at baseline. The Confirmatory Factor Analysis was considered in three scenarios: the full sample and each sex separately. Models were compared empirically for best fit. RESULTS: Two-factor models have good psychometric properties across both the pooled and sex subgroups. However, according to these analyses, the one-factor solution is preferable from both a statistical perspective and parsimony. The two-factor model was close to significant for the male subgroup (p<.07), where questions separated into constructs of mental function (pain, reading, headaches and concentration) and physical function (personal care, lifting, work, driving, sleep, and recreation).
CONCLUSIONS: The Neck Disability Index demonstrated a one-factor structure when analyzed by Confirmatory Factor Analysis in a pooled, homogeneous sample of neck problem patients. However, a two-factor model did approach significance for male subjects, where questions separated into constructs of mental and physical function. Further investigation in different conditions and in subgroup- and sex-specific populations is warranted.

Relevance: 90.00%

Abstract:

The focus of this paper is two-dimensional computational modelling of water flow in unsaturated soils consisting of weakly conductive disconnected inclusions embedded in a highly conductive connected matrix. When the inclusions are small, a two-scale Richards’ equation-based model has been proposed in the literature taking the form of an equation with effective parameters governing the macroscopic flow coupled with a microscopic equation, defined at each point in the macroscopic domain, governing the flow in the inclusions. This paper is devoted to a number of advances in the numerical implementation of this model. Namely, by treating the micro-scale as a two-dimensional problem, our solution approach based on a control volume finite element method can be applied to irregular inclusion geometries, and, if necessary, modified to account for additional phenomena (e.g. imposing the macroscopic gradient on the micro-scale via a linear approximation of the macroscopic variable along the microscopic boundary). This is achieved with the help of an exponential integrator for advancing the solution in time. This time integration method completely avoids generation of the Jacobian matrix of the system and hence eases the computation when solving the two-scale model in a completely coupled manner. Numerical simulations are presented for a two-dimensional infiltration problem.

Relevance: 90.00%

Abstract:

In this study, we investigate the qualitative and quantitative effects of an R&D subsidy for a clean technology and a Pigouvian tax on a dirty technology on environmental R&D when it is uncertain how long the research will take to complete. The model is formulated as an optimal stopping problem, in which the number of successes required to complete the R&D project is finite and learning about the probability of success is incorporated. We show that the optimal R&D subsidy with the consideration of learning is higher than that without it. We also find that an R&D subsidy performs better than a Pigouvian tax unless suppliers have sufficient incentives to continue cost-reduction efforts after the new technology successfully replaces the old one. Moreover, by using a two-project model, we show that a uniform subsidy is better than a selective subsidy.
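The learning component can be illustrated with a Beta-Bernoulli update of the belief about the per-trial success probability, the standard way such learning is modelled; the prior, outcome counts, and remaining-success target below are hypothetical, and this is only a fragment, not the paper's full optimal stopping model.

```python
def posterior_success_prob(alpha, beta, successes, failures):
    """Posterior mean of the per-trial success probability under
    Beta-Bernoulli learning: prior Beta(alpha, beta) updated by outcomes."""
    return (alpha + successes) / (alpha + beta + successes + failures)

def expected_remaining_trials(successes_needed, p):
    """Expected further trials for a project still needing a given number
    of successes, if each trial succeeds independently with probability p."""
    return successes_needed / p

# Hypothetical uniform prior Beta(1, 1); after observing 3 successes in
# 10 trials, the firm's belief (and hence its continuation value) updates:
p = posterior_success_prob(1.0, 1.0, successes=3, failures=7)      # -> 1/3
trials_left = expected_remaining_trials(successes_needed=2, p=p)   # ~ 6.0
```

Each observed outcome shifts the belief, which is why a subsidy calibrated with learning in mind differs from one computed against a fixed success probability.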

Relevance: 80.00%

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined on the basis of the users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness. Term-based approaches are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient. These approaches also have to deal with low-frequency pattern issues. The measures used by the data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering, and can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch. This stage is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy; the most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
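The two-stage architecture can be caricatured in a few lines: a threshold-based topic filter followed by pattern-based ranking of the survivors. This toy sketch mirrors only the structure described above; the scoring functions, profile, and patterns are made up and are not those of the thesis.

```python
def two_stage_filter(docs, profile_terms, patterns, threshold):
    """Stage 1 (topic filtering): discard documents whose term-overlap
    score with the user profile falls below a threshold.
    Stage 2 (pattern mining): rank survivors by how many profile
    patterns (word sets) each document contains."""
    survivors = []
    for doc in docs:
        words = set(doc.lower().split())
        score = len(words & profile_terms) / max(len(profile_terms), 1)
        if score >= threshold:
            survivors.append((doc, words))
    ranked = sorted(survivors,
                    key=lambda dw: sum(p <= dw[1] for p in patterns),
                    reverse=True)
    return [doc for doc, _ in ranked]

# Made-up profile, patterns and stream
profile = {"topic", "filtering", "model"}
patterns = [{"topic", "filtering"}, {"pattern", "mining"}]
docs = ["pattern mining for topic filtering", "cooking recipes", "topic model"]
result = two_stage_filter(docs, profile, patterns, threshold=0.5)
```

The cheap first stage removes the off-topic document entirely, so the more expensive pattern matching of the second stage runs only on the small set of plausible candidates.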