925 results for Models and Methods

Relevance: 100.00%

Publisher:

Abstract:

INTRODUCTION: Platinum agents can cause the formation of DNA adducts and induce apoptosis to eliminate tumor cells. The aim of the present study was to investigate the influence of genetic variants of MDM2 on chemotherapy-related toxicities and clinical outcomes in patients with advanced non-small-cell lung cancer (NSCLC). MATERIALS AND METHODS: We recruited 663 patients with advanced NSCLC who had been treated with first-line platinum-based chemotherapy. Five tagging single nucleotide polymorphisms (SNPs) in MDM2 were genotyped in these patients. The associations of these SNPs with clinical toxicities and outcomes were evaluated using logistic regression and Cox regression analyses. RESULTS: Two SNPs (rs1470383 and rs1690924) showed significant associations with chemotherapy-related toxicities (i.e., overall, hematologic, and gastrointestinal toxicity). Compared with carriers of the wild-type AA genotype, patients with the GG genotype of rs1470383 had an increased risk of overall toxicity (odds ratio [OR], 3.28; 95% confidence interval [CI], 1.34-8.02; P = .009) and hematologic toxicity (OR, 4.10; 95% CI, 1.73-9.71; P = .001). Likewise, patients with the AG genotype of rs1690924 were more sensitive to gastrointestinal toxicity than those with the wild-type GG homozygote (OR, 2.32; 95% CI, 1.30-4.14; P = .004). Stratified survival analysis revealed significant associations between rs1470383 genotypes and overall survival in patients without overall or hematologic toxicity (P = .007 and P = .0009, respectively). CONCLUSION: The results of our study suggest that SNPs in MDM2 might be used to predict the toxicities of platinum-based chemotherapy and overall survival in patients with advanced NSCLC. Additional validation of these associations is warranted.
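In its unadjusted form, a genotype-toxicity association like those reported above reduces to an odds ratio from a 2x2 table with a Wald confidence interval. A minimal plain-Python sketch with hypothetical counts (not the study's data; the paper's ORs come from covariate-adjusted logistic regression):

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a/b = cases/controls among carriers, c/d among non-carriers."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: GG carriers with/without toxicity vs AA carriers
or_, lo, hi = odds_ratio_wald(20, 10, 100, 200)
```

With these illustrative counts the point estimate is 4.0 with CI roughly (1.80, 8.87).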

BACKGROUND: QRS prolongation is associated with adverse outcomes in mostly white populations, but its clinical significance is not well established for other groups. We investigated the association between QRS duration and mortality in African Americans. METHODS AND RESULTS: We analyzed data from 5146 African Americans in the Jackson Heart Study stratified by QRS duration on baseline 12-lead ECG. We defined QRS prolongation as QRS≥100 ms. We assessed the association between QRS duration and all-cause mortality using Cox proportional hazards models and reported the cumulative incidence of heart failure hospitalization. We identified factors associated with the development of QRS prolongation in patients with normal baseline QRS. At baseline, 30% (n=1528) of participants had QRS prolongation. The cumulative incidences of mortality and heart failure hospitalization were greater with versus without baseline QRS prolongation: 12.6% (95% confidence interval [CI], 11.0-14.4) versus 7.1% (95% CI, 6.3-8.0) and 8.2% (95% CI, 6.9-9.7) versus 4.4% (95% CI, 3.7-5.1), respectively. After risk adjustment, QRS prolongation was associated with increased mortality (hazard ratio, 1.27; 95% CI, 1.03-1.56; P=0.02). There was a linear relationship between QRS duration and mortality (hazard ratio per 10 ms increase, 1.06; 95% CI, 1.01-1.12). Older age, male sex, prior myocardial infarction, lower ejection fraction, left ventricular hypertrophy, and left ventricular dilatation were associated with the development of QRS prolongation. CONCLUSIONS: QRS prolongation in African Americans was associated with increased mortality and heart failure hospitalization. Factors associated with developing QRS prolongation included age, male sex, prior myocardial infarction, and left ventricular structural abnormalities.
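The cumulative incidences quoted above are one minus a Kaplan-Meier survival estimate at the follow-up horizon. A bare-bones plain-Python version (illustrative only; it ignores tie-ordering subtleties and the competing-risk refinements a full analysis would use):

```python
def km_cumulative_incidence(times, events, horizon):
    """Kaplan-Meier estimate of cumulative incidence by `horizon`.
    times: follow-up times; events: 1 = event (e.g. death), 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    for t, e in data:
        if t > horizon:
            break
        if e:  # each event multiplies survival by (n-1)/n at that time
            surv *= (n_at_risk - 1) / n_at_risk
        n_at_risk -= 1  # event or censoring removes one subject from risk
    return 1.0 - surv
```

For example, with events at t=1 and t=3 among five subjects (censoring at t=2, 4, 5), the 5-year cumulative incidence is 1 - (4/5)(2/3) = 7/15.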

CT and digital subtraction angiography (DSA) are ubiquitous in the clinic. Their preclinical equivalents are valuable imaging methods for studying disease models and treatment. We have developed a dual source/detector X-ray imaging system that we have used for both micro-CT and DSA studies in rodents. The control of such a complex imaging system requires substantial software development for which we use the graphical language LabVIEW (National Instruments, Austin, TX, USA). This paper focuses on a LabVIEW platform that we have developed to enable anatomical and functional imaging with micro-CT and DSA. Our LabVIEW applications integrate and control all the elements of our system including a dual source/detector X-ray system, a mechanical ventilator, a physiological monitor, and a power microinjector for the vascular delivery of X-ray contrast agents. Various applications allow cardiac- and respiratory-gated acquisitions for both DSA and micro-CT studies. Our results illustrate the application of DSA for cardiopulmonary studies and vascular imaging of the liver and coronary arteries. We also show how DSA can be used for functional imaging of the kidney. Finally, the power of 4D micro-CT imaging using both prospective and retrospective gating is shown for cardiac imaging.
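As an illustration of the retrospective-gating idea (a generic sketch, not the LabVIEW implementation described above), each projection can be assigned to a cardiac-phase bin according to where its acquisition time falls within the enclosing R-R interval of the physiological monitor's ECG trace:

```python
import bisect

def assign_cardiac_phase(acq_times, r_peaks, n_bins):
    """Retrospective gating: map each projection acquisition time to a
    cardiac-phase bin in [0, n_bins) using the surrounding R-R interval."""
    bins = []
    for t in acq_times:
        i = bisect.bisect_right(r_peaks, t)
        if i == 0 or i == len(r_peaks):
            bins.append(None)  # outside the recorded R-R intervals
            continue
        t0, t1 = r_peaks[i - 1], r_peaks[i]
        phase = (t - t0) / (t1 - t0)  # 0 at the R-peak, approaching 1 at the next
        bins.append(min(int(phase * n_bins), n_bins - 1))
    return bins
```

Projections sharing a bin are then reconstructed together into one frame of the 4D dataset.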

Trend analysis is widely used for detecting changes in hydrological data. Parametric methods for this employ pre-specified models and associated tests to assess significance, whereas non-parametric methods generally apply rank tests to the data. Neither approach is suitable for exploratory analysis, because parametric models impose a particular, perhaps unsuitable, form of trend, while testing may confirm that trend is present but does not describe its form. This paper describes semi-parametric approaches to trend analysis using local likelihood fitting of annual maximum and partial duration series and illustrates their application to the exploratory analysis of changes in extremes in sea level and river flow data. Bootstrap methods are used to quantify the variability of estimates.
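The bootstrap step can be illustrated in a few lines: fit a least-squares slope, resample (year, value) pairs with replacement, and take percentile limits. A plain-Python sketch (pair resampling with a simple linear trend, not the paper's local-likelihood scheme):

```python
import random

def linear_trend(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def bootstrap_slope_ci(xs, ys, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the slope, resampling (x, y) pairs."""
    rng = random.Random(seed)
    pairs = list(zip(xs, ys))
    slopes = []
    for _ in range(n_boot):
        sample = [rng.choice(pairs) for _ in pairs]
        sx = [p[0] for p in sample]
        sy = [p[1] for p in sample]
        if len(set(sx)) > 1:  # guard against degenerate resamples
            slopes.append(linear_trend(sx, sy))
    slopes.sort()
    lo = slopes[int(alpha / 2 * len(slopes))]
    hi = slopes[int((1 - alpha / 2) * len(slopes)) - 1]
    return lo, hi
```

The same resampling loop works unchanged when the fitted quantity is a local-likelihood estimate rather than a global slope.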

Electromagnetic Levitation (EML) is a valuable method for measuring the thermo-physical properties of metals (surface tension, viscosity, thermal/electrical conductivity, specific heat, hemispherical emissivity, etc.) beyond their melting temperature. In EML, a small test specimen is suspended and melted by Joule heating in an AC coil. Once in the liquid state, a small perturbation causes the liquid envelope to oscillate, and the oscillation frequency is then used to compute the surface tension by the well-known Rayleigh formula. Similarly, the rate at which the oscillation is damped relates to the viscosity. To measure thermal conductivity, a sinusoidally varying laser source may be used to heat one pole of the droplet, with the temperature response measured at the opposite pole; the resulting phase shift yields the thermal conductivity. All these theoretical methods assume that convective effects due to flow within the droplet are negligible compared to conduction, and similarly that the flow conditions are laminar; a situation that can only be realised under microgravity conditions. Hence EML is the method favoured for Spacelab experiments (viz. TEMPUS). Under terrestrial conditions, the full gravity force has to be countered by a much larger induced magnetic field. The magnetic field generates strong flow within the droplet, which for droplets of practical size becomes rotational and turbulent. At the same time, the droplet oscillation envelope is no longer ellipsoidal. Both these conditions invalidate simple theoretical models and prevent widespread EML use in terrestrial laboratories. The authors have shown in earlier publications that it is possible to suppress most of the turbulent convection generated in the droplet skin layer through use of a static magnetic field.
Using a pseudo-spectral discretisation method, it is possible to compute very accurately the dynamic variation of the suspended fluid envelope and simultaneously compute the time-varying electromagnetic, flow and thermal fields. The use of a DC field as a damping agent was also demonstrated in cold crucible melting, where suppression of turbulence was achieved in a much larger liquid metal volume and led to increased superheat in the melt and reduced heat losses to the water-cooled walls. In this paper, the authors describe the pseudo-spectral technique as applied to EML to compute the combined effects of AC and DC fields, accounting for all the flow-induced forces acting on the liquid volume (Lorentz, Marangoni, surface tension, gravity), and show example simulations.
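For reference, the well-known Rayleigh formula mentioned above gives the surface tension from the frequency of the fundamental (l = 2) oscillation mode, and Lamb's classical result gives the viscosity from the damping time of that mode. Both expressions assume the laminar, microgravity conditions discussed in the abstract; a small Python helper:

```python
import math

def rayleigh_surface_tension(mass_kg, freq_hz):
    """Surface tension from the l=2 oscillation frequency of a levitated
    droplet (Rayleigh): sigma = 3*pi*m*f^2 / 8."""
    return 3 * math.pi * mass_kg * freq_hz ** 2 / 8

def lamb_viscosity(mass_kg, radius_m, damping_time_s):
    """Viscosity from the damping time of the l=2 mode (Lamb):
    eta = 3*m / (20*pi*R*tau)."""
    return 3 * mass_kg / (20 * math.pi * radius_m * damping_time_s)
```

For a 1 g droplet oscillating at 40 Hz this gives a surface tension of about 1.88 N/m; under the terrestrial conditions described above, where the flow is turbulent and the envelope non-ellipsoidal, these simple formulas no longer apply.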

Background: A number of factors are known to influence food preferences and the acceptability of new products. These include their sensory characteristics and strong, innate neural influences. In designing foods for any target group, it is important to consider the intrinsic and extrinsic characteristics which may contribute to the palatability and acceptability of foods. Objective: To assess age and gender influences on sensory perceptions of novel low-cost nutrient-rich food products developed using traditional Ghanaian food ingredients. Materials and Methods: In this study, a range of food products were developed from Ghanaian traditional food sources using the Food Multimix (FMM) concept. These products were subjected to sensory evaluation to assess the role of sensory perception in their acceptability among different target age groups across the life cycle (aged 11-68 years) and to ascertain any possible influences of gender on preference and choice. Variables including taste, odour, texture, flavour and appearance were tested, the results captured on a Likert scale, and scores of liking and acceptability analysed. Multivariate analyses were used to develop prediction models for targeted recipe development for different target groups. Multiple factor analysis of variance (ANOVA) and logistic linear regression were employed to test the strength of acceptability and to ascertain age and gender influences on product preference. Results: The results showed a positive trend in acceptability (r = 0.602) which tended towards statistical significance (p = 0.065), with a very high product favourability rating (91% acceptability; P = 0.005). However, age [odds ratio = 1.44 (11-15 years); odds ratio = 2.01 (18-68 years)] and gender (P < 0.001) were major influences on product preference, with children and females (irrespective of age) showing clear preferences for or dislike of products containing certain ingredients.
Conclusion: These findings are potentially useful in planning recipes for feeding interventions involving different vulnerable and target groups.
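The acceptability figures above derive from Likert ratings; the basic quantity, the share of ratings at or above a "like" threshold, is simple to compute. An illustrative Python helper (not the study's analysis pipeline, which additionally fits ANOVA and logistic models):

```python
def acceptability(scores, threshold=4):
    """Proportion of Likert ratings (1-5 scale) at or above `threshold`,
    e.g. 4 = 'like' and 5 = 'like very much'."""
    return sum(s >= threshold for s in scores) / len(scores)

def by_group(ratings, threshold=4):
    """Acceptability per group, e.g. {'11-15 years': [...], '18-68 years': [...]}."""
    return {g: acceptability(s, threshold) for g, s in ratings.items()}
```

Comparing these per-group proportions is the starting point for the age and gender contrasts the study reports.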

Many different models have been postulated over the years for sizing feeder drives; these models have different bases, some rationally founded and others more rule-of-thumb. The experience of Jenike & Johanson, and likewise of The Wolfson Centre, in trouble-shooting feeder drives has shown that drive powers are often poorly matched, so there is clearly still some way to go towards establishing a universally used, reliable approach. This paper presents an on-going programme of work designed to measure feeder forces experimentally on a purpose-designed testing rig and to compare these against some of the best-known available models, as well as against a full-size installation. A novel aspect is the monitoring of the transition between the “filling stress field” load on the feeder and the “flowing stress field” load.

Aim: Ecological niche modelling can provide valuable insight into species' environmental preferences and aid the identification of key habitats for populations of conservation concern. Here, we integrate biologging, satellite remote-sensing and ensemble ecological niche models (EENMs) to identify predictable foraging habitats for a globally important population of the grey-headed albatross (GHA) Thalassarche chrysostoma. Location: Bird Island, South Georgia; Southern Atlantic Ocean. Methods: GPS and geolocation-immersion loggers were used to track at-sea movements and activity patterns of GHA over two breeding seasons (n = 55; brood-guard). Immersion frequency (landings per 10-min interval) was used to define foraging events. EENM combining Generalized Additive Models (GAM), MaxEnt, Random Forest (RF) and Boosted Regression Trees (BRT) identified the biophysical conditions characterizing the locations of foraging events, using time-matched oceanographic predictors (Sea Surface Temperature, SST; chlorophyll a, chl-a; thermal front frequency, TFreq; depth). Model performance was assessed through iterative cross-validation and extrapolative performance through cross-validation among years. Results: Predictable foraging habitats identified by EENM spanned neritic (<500 m), shelf break and oceanic waters, coinciding with a set of persistent biophysical conditions characterized by particular thermal ranges (3–8 °C, 12–13 °C), elevated primary productivity (chl-a > 0.5 mg m−3) and frequent manifestation of mesoscale thermal fronts. Our results confirm previous indications that GHA exploit enhanced foraging opportunities associated with frontal systems and objectively identify the APFZ as a region of high foraging habitat suitability. Moreover, at the spatial and temporal scales investigated here, the performance of multi-model ensembles was superior to that of single-algorithm models, and cross-validation among years indicated reasonable extrapolative performance. 
Main conclusions: EENM techniques are useful for integrating the predictions of several single-algorithm models, reducing potential bias and increasing confidence in predictions. Our analysis highlights the value of EENM for use with movement data in identifying at-sea habitats of wide-ranging marine predators, with clear implications for conservation and management.
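The ensemble step, combining GAM, MaxEnt, RF and BRT predictions into one suitability surface, amounts to a (possibly skill-weighted) mean per grid cell. A schematic plain-Python version; the skill-weighting by cross-validated performance is an assumption for illustration, not necessarily the authors' exact scheme:

```python
def ensemble_suitability(predictions, weights=None):
    """Weighted mean of habitat-suitability maps from several models.
    predictions: dict model_name -> list of per-cell suitabilities in [0, 1];
    weights: dict model_name -> skill weight (e.g. cross-validated AUC)."""
    names = list(predictions)
    if weights is None:
        weights = {n: 1.0 for n in names}  # unweighted mean by default
    total = sum(weights[n] for n in names)
    n_cells = len(next(iter(predictions.values())))
    return [sum(weights[n] * predictions[n][i] for n in names) / total
            for i in range(n_cells)]
```

Averaging across structurally different models damps the idiosyncratic errors of any single algorithm, which is the bias-reduction argument made in the conclusions.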

Marine legislation is becoming more complex and marine ecosystem-based management is specified in national and regional legislative frameworks. Shelf-seas community and ecosystem models (hereafter termed ecosystem models) are central to the delivery of ecosystem-based management, but there is limited uptake and use of model products by decision makers in Europe and the UK in comparison with other countries. In this study, the challenges to the uptake and use of ecosystem models in support of marine environmental management are assessed using the UK capability as an example. The UK has a broad capability in marine ecosystem modelling, with at least 14 different models that support management, but few examples exist of ecosystem modelling that underpins policy or management decisions. To improve understanding of policy and management issues that can be addressed using ecosystem models, a workshop was convened that brought together advisors, assessors, biologists, social scientists, economists, modellers, statisticians, policy makers, and funders. Some policy requirements were identified that can be addressed without further model development, including: attribution of environmental change to underlying drivers, integration of models and observations to develop more efficient monitoring programmes, assessment of indicator performance for different management goals, and the costs and benefits of legislation. Multi-model ensembles are being developed in cases where many models exist, but model structures are very diverse, making a standardised approach to combining outputs a significant challenge, and there is a need for new methodologies for describing, analysing, and visualising uncertainties. A stronger link to social and economic systems is needed to increase the range of policy-related questions that can be addressed.
It is also important to improve communication between policy and modelling communities so that there is a shared understanding of the strengths and limitations of ecosystem models.

We consider the problem of train planning or scheduling for large, busy, complex train stations, which are common in Europe and elsewhere, though not in North America. We develop the constraints and objectives for this problem, but these are too computationally complex to solve by standard combinatorial search or integer programming methods. Also, the problem is somewhat political in nature; that is, it does not have a clear objective function because it involves multiple train operators with conflicting interests. We therefore develop scheduling heuristics analogous to those successfully adopted by train planners using "manual" methods. We tested the model and algorithms by applying them to a typical large station that exhibits most of the complexities found in practice. The results compare well with those found by traditional methods, and take account of cost and preference trade-offs not handled by those methods. With successive refinements, the algorithm eventually took only a few seconds to run, the time depending on the version of the algorithm and the scheduling problem. The scheduling models and algorithms developed and tested here can be used on their own, or as key components of a more general system for train scheduling for a rail line or network.

Train scheduling for a busy station includes ensuring that there are no conflicts between several hundred trains per day going in and out of the station on intersecting paths from multiple in-lines and out-lines to multiple platforms, while ensuring that each train is allowed at least its minimum required headways, dwell time, turnaround time and trip time. This has to be done while minimizing (costs of) deviations from desired times, platforms or lines, allowing for conflicts due to through-platforms, dead-end platforms, multiple sub-platforms, and possible constraints due to infrastructure, safety or business policy.
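The core feasibility check any such heuristic must perform, whether a train can occupy a platform without violating headways, can be sketched as a greedy first-fit assignment. This is an illustrative toy only; the heuristics described above additionally handle costs, preferences, retiming, and path conflicts between platforms:

```python
def assign_platforms(trains, n_platforms, headway):
    """Greedy heuristic: give each train (arrival, departure) the first
    platform that is free for its whole occupation plus a headway buffer."""
    platform_busy = [[] for _ in range(n_platforms)]  # (start, end) per platform
    assignment = {}
    for name, (arr, dep) in sorted(trains.items(), key=lambda kv: kv[1][0]):
        for p, busy in enumerate(platform_busy):
            # feasible if every existing occupation is separated by >= headway
            if all(dep + headway <= s or e + headway <= arr for s, e in busy):
                busy.append((arr, dep))
                assignment[name] = p
                break
        else:
            assignment[name] = None  # unplaceable: needs retiming or re-routing
    return assignment
```

A conflicting pair of trains lands on different platforms, and a later train can reuse a platform once the headway has elapsed.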

Aims. We use observations and models of molecular D/H ratios to probe the physical conditions and chemical history of the gas and to differentiate between gas-phase and grain-surface chemical processing in star forming regions. Methods: As a follow-up to previous observations of HDCO/H2CO and DCN/HCN ratios in a selection of low-mass protostellar cores, we have measured D2CO/H2CO and N2D+/N2H+ ratios in these same sources. For comparison, we have also measured N2D+/N2H+ ratios towards several starless cores and have searched for N2D+ and deuterated formaldehyde towards hot molecular cores (HMCs) associated with high-mass star formation. We compare our results with predictions from detailed chemical models and with other observations made in these sources. Results: Towards the starless cores and low-mass protostellar sources we have found very high N2D+ fractionation, which suggests that the bulk of the gas in these regions is cold and heavily depleted. The non-detections of N2D+ in the HMCs indicate higher temperatures. We did detect HDCO towards two of the HMCs, with abundances 1-3% of H2CO. These are the first detections of deuterated formaldehyde in high-mass sources since Turner (1990) measured HDCO/H2CO and D2CO/H2CO towards the Orion Compact Ridge. Figures 1-5 are only available in electronic form at http://www.aanda.org
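The quoted fractionation ratios are quotients of derived column densities, and their uncertainties follow from first-order error propagation. A generic helper (illustrative; not tied to the paper's excitation or radiative-transfer analysis):

```python
import math

def abundance_ratio(n_num, sig_num, n_den, sig_den):
    """Column-density ratio (e.g. N2D+/N2H+) with first-order error
    propagation: sigma_r = r * sqrt((s_num/n_num)^2 + (s_den/n_den)^2)."""
    r = n_num / n_den
    sig = r * math.sqrt((sig_num / n_num) ** 2 + (sig_den / n_den) ** 2)
    return r, sig
```

For example, column densities of 1e12 and 1e13 cm^-2, each with 10% uncertainty, give a ratio of 0.10 +/- 0.014.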

Exam timetabling is one of the most important administrative activities that takes place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. These last ten years have seen an increased level of attention on this important topic. There has been a range of significant contributions to the scientific literature, both in terms of theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.

We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues on decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community in the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.
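The "graph heuristics" mentioned above treat exams as vertices, join two exams by an edge when they share a student, and colour the graph greedily, with colours standing for timeslots. A largest-degree-first sketch in Python (a textbook variant for illustration, not a specific surveyed algorithm):

```python
def greedy_timetable(conflicts, exams):
    """Largest-degree-first graph colouring: exams are vertices, an edge
    joins exams sharing a student, and colours are timeslots."""
    degree = {e: len(conflicts.get(e, ())) for e in exams}
    slots = {}
    for exam in sorted(exams, key=lambda e: -degree[e]):
        used = {slots[n] for n in conflicts.get(exam, ()) if n in slots}
        slot = 0
        while slot in used:  # take the smallest timeslot free of conflicts
            slot += 1
        slots[exam] = slot
    return slots
```

Ordering by decreasing degree schedules the most constrained exams first, which tends to reduce the number of timeslots needed.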

Objective: The aim was to investigate the association between periodontal health and the serum levels of various antioxidants including carotenoids, retinol and vitamin E in a homogenous group of Western European men.
Materials and Methods: A representative sample of 1258 men aged 60-70 years, drawn from the population of Northern Ireland, was examined between 2001 and 2003. Each participant had six or more teeth, completed a questionnaire and underwent a clinical periodontal examination. Serum lipid-soluble antioxidant levels were measured by high-performance liquid chromatography with diode array detection. Multivariable analysis was carried out using logistic regression with adjustment for possible confounders. Models were constructed using two measures of periodontal status (low- and high-threshold periodontitis) as dependent variables and the fifths of each antioxidant as a predictor variable.
Results: The levels of α- and β-carotene, β-cryptoxanthin and zeaxanthin were highly significantly lower in the men with low-threshold periodontitis (p<0.001). These carotenoids were also significantly lower in high-threshold periodontitis. There were no significant differences in the levels of lutein, lycopene, α- and γ-tocopherol or retinol in relation to periodontitis. In fully adjusted models, there was an inverse relationship between a number of carotenoids (α- and β-carotene and β-cryptoxanthin) and low-threshold periodontitis. β-Carotene and β-cryptoxanthin were the only antioxidants that were associated with an increased risk of high-threshold severe periodontitis. The adjusted odds ratio for high-threshold periodontitis in the lowest fifth relative to the highest fifth of β-cryptoxanthin was 4.02 (p=0.003).
Conclusion: It is concluded that low serum levels of a number of carotenoids, in particular β-cryptoxanthin and β-carotene, were associated with an increased prevalence of periodontitis in this homogenous group of 60-70-year-old Western European men.
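The fifths-based analysis reduces, in its unadjusted form, to ranking serum levels into fifths and comparing disease odds in the lowest versus highest fifth. A plain-Python illustration (unadjusted; the reported ORs are covariate-adjusted via logistic regression):

```python
def fifths(values):
    """Assign each value to a fifth (1 = lowest ... 5 = highest) by rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    cuts = [len(values) * k // 5 for k in range(1, 5)]
    fifth = [0] * len(values)
    for rank, i in enumerate(order):
        fifth[i] = 1 + sum(rank >= c for c in cuts)
    return fifth

def or_lowest_vs_highest(values, cases):
    """Unadjusted odds ratio of disease for the lowest vs the highest fifth."""
    f = fifths(values)
    a = sum(1 for fi, c in zip(f, cases) if fi == 1 and c)      # low, case
    b = sum(1 for fi, c in zip(f, cases) if fi == 1 and not c)  # low, control
    c_ = sum(1 for fi, c in zip(f, cases) if fi == 5 and c)     # high, case
    d = sum(1 for fi, c in zip(f, cases) if fi == 5 and not c)  # high, control
    return (a * d) / (b * c_)
```

An OR above 1 here means periodontitis is more common among men in the lowest fifth of the antioxidant.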

The goal of this study is to identify cues for the cognitive process of attention in ancient Greek art, aiming to find confirmation of its possible use by ancient Greek audiences and artists. Evidence of cues that trigger attention's psychological dispositions was sought through content analysis of image reproductions of ancient Greek sculpture and fine vase painting from the archaic to the Hellenistic period (ca. 7th-1st cent. BC). Through this analysis, it was possible to observe the presence of cues that trigger orientation to the work of art (i.e. amplification, contrast, emotional salience, simplification, symmetry), of a cue that triggers distributed attention to the parts of the work (i.e. distribution of elements), and of cues that activate selective attention to specific elements in the work of art (i.e. contrast of elements, salient color, central positioning of elements, composition regarding the flow of elements, and significant objects). Results support the universality of those dispositions, probably connected with basic competencies that are hard-wired in the nervous system and in cognitive processes.