887 results for OR-join, Synchronizing Merge, Cancelation, YAWL, Workflow Patterns, Reset Nets


Relevance: 30.00%

Abstract:

PURPOSE OF REVIEW: Energy metabolism is increasingly recognized as a key factor in the pathogenesis of acute brain injury (ABI). We review the role of cerebral lactate metabolism and summarize evidence showing that lactate may act as a supplemental fuel after ABI. RECENT FINDINGS: The role of cerebral lactate has shifted from a waste product to a potentially preferential fuel and signaling molecule. According to the astrocyte-neuron lactate shuttle model, glycolytic lactate might act as a glucose-sparing substrate. Lactate is also emerging as a key signal regulating cerebral blood flow (CBF) and as a neuroprotective agent after experimental ABI. Clinical investigation using cerebral microdialysis shows the existence of two main lactate patterns, ischemic - from anaerobic metabolism - and nonischemic, from activated glycolysis, whereby lactate can be used as a supplemental energy fuel. Preliminary clinical data suggest that hypertonic lactate solutions improve cerebral energy metabolism and are an effective treatment for elevated intracranial pressure (ICP) after ABI. SUMMARY: Lactate can be a supplemental fuel for the injured brain and is important in regulating glucose metabolism and CBF. Exogenous lactate supplementation may be neuroprotective after experimental ABI. Recent clinical data from ABI patients suggest hypertonic lactate solutions may be a valid therapeutic option for secondary energy dysfunction and elevated ICP.

Relevance: 30.00%

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable slow clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communication would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: Could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC started to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated architectures consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production is dependent on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of Machine-to-Machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation subject to the competition between the incumbents: first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 30.00%

Abstract:

In this thesis, we study the behavioural aspects of agents interacting in queueing systems, using simulation models and experimental methodologies. Each period, customers must choose a service provider. The objective is to analyse the impact of the customers' and providers' decisions on the formation of queues. In a first case, we consider customers with a certain degree of risk aversion. Based on their perception of the average waiting time and of its variability, they form an estimate of the upper bound of the waiting time at each provider. Each period, they choose the provider for which this estimate is lowest. Our results indicate that there is no monotonic relationship between the degree of risk aversion and overall performance. Indeed, a population of customers with an intermediate degree of risk aversion generally incurs a higher average waiting time than a population of risk-indifferent or highly risk-averse agents. Next, we incorporate the providers' decisions by allowing them to adjust their service capacity based on their perception of the average arrival rate. The results show that the customers' behaviour and the providers' decisions exhibit strong path dependence. Moreover, we show that the providers' decisions cause the weighted average waiting time to converge towards the market's reference waiting time. Finally, a laboratory experiment in which subjects play the role of a service provider allowed us to conclude that capacity installation and dismantling delays significantly affect the subjects' performance and decisions. In particular, a provider's decisions are influenced by its order backlog, its currently available service capacity and the capacity adjustment decisions it has already taken but not yet implemented. - Queuing is a fact of life that we witness daily. We all have had the experience of waiting in line for some reason and we also know that it is an annoying situation. As the adage says "time is money"; this is perhaps the best way of stating what queuing problems mean for customers. Human beings are not very tolerant, but they are even less so when having to wait in line for service. Banks, roads, post offices and restaurants are just some examples where people must wait for service. Studies of queuing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions. The individual behaviour of the agents involved in queueing systems and their decision-making processes have received little attention. Although this work has been useful to improve the efficiency of many queueing systems, or to design new processes in social and physical systems, it has only provided us with a limited ability to explain the behaviour observed in many real queues. In this dissertation we differ from this traditional research by analysing how the agents involved in the system make decisions instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010).
We focus on studying behavioural aspects in queueing systems and incorporate this still underdeveloped framework into the operations management field. In the first chapter of this thesis we provide a general introduction to the area, as well as an overview of the results. In Chapters 2 and 3, we use Cellular Automata (CA) to model service systems where captive interacting customers must decide each period which facility to join for service. They base this decision on their expectations of sojourn times. Each period, customers use new information (their most recent experience and that of their best-performing neighbour) to form expectations of sojourn time at the different facilities. Customers update their expectations using an adaptive expectations process to combine their memory and their new information. We label "conservative" those customers who give more weight to their memory than to the new information. In contrast, when they give more weight to new information, we call them "reactive". In Chapter 2, we consider customers with different degrees of risk-aversion who take into account uncertainty. They choose which facility to join based on an estimated upper bound of the sojourn time, which they compute using their perceptions of the average sojourn time and the level of uncertainty. We assume the same exogenous service capacity for all facilities, which remains constant throughout. We first analyse the collective behaviour generated by the customers' decisions. We show that the system achieves low weighted average sojourn times when the collective behaviour results in neighbourhoods of customers loyal to a facility and the customers are approximately equally split among all facilities. The lowest weighted average sojourn time is achieved when exactly the same number of customers patronises each facility, implying that they do not wish to switch facility. In this case, the system has achieved the Nash equilibrium. We show that there is a non-monotonic relationship between the degree of risk-aversion and system performance. Customers with an intermediate degree of risk-aversion typically achieve higher sojourn times; in particular they rarely achieve the Nash equilibrium. Risk-neutral customers have the highest probability of achieving the Nash equilibrium. Chapter 3 considers a service system similar to the previous one but with risk-neutral customers, and relaxes the assumption of exogenous service rates. In this sense, we model a queueing system with endogenous service rates by enabling managers to adjust the service capacity of the facilities. We assume that managers do so based on their perceptions of the arrival rates and use the same principle of adaptive expectations to model these perceptions. We consider service systems in which the managers' decisions take time to be implemented. Managers are characterised by a profile which is determined by the speed at which they update their perceptions, the speed at which they take decisions, and how coherent they are when accounting for their previous decisions still to be implemented when taking their next decision. We find that the managers' decisions exhibit strong path dependence: owing to the initial conditions of the model, the facilities of managers with identical profiles can evolve completely differently. In some cases the system becomes "locked in" to a monopoly or duopoly situation.
The competition between managers causes the weighted average sojourn time of the system to converge to the exogenous benchmark value which they use to estimate their desired capacity. Concerning the managers' profile, we found that the more conservative a manager is regarding new information, the larger the market share his facility achieves. Additionally, the faster he takes decisions, the higher the probability that he achieves a monopoly position. In Chapter 4 we consider a one-server queueing system with non-captive customers. We carry out an experiment aimed at analysing the way human subjects, taking on the role of the manager, take decisions in a laboratory regarding the capacity of a service facility. We adapt the model proposed by van Ackere et al. (2010). This model relaxes the assumption of a captive market and allows current customers to decide whether or not to use the facility. Additionally, the facility also has potential customers who currently do not patronise it, but might consider doing so in the future. We identify three groups of subjects whose decisions cause similar behavioural patterns. These groups are labelled: gradual investors, lumpy investors, and random investors. Using an autocorrelation analysis of the subjects' decisions, we illustrate that these decisions are positively correlated to the decisions taken one period earlier. Subsequently, we formulate a heuristic to model the decision rule considered by subjects in the laboratory. We found that this decision rule fits very well for those subjects who gradually adjust capacity, but it does not capture the behaviour of the subjects of the other two groups. In Chapter 5 we summarise the results and provide suggestions for further work. Our main contribution is the use of simulation and experimental methodologies to explain the collective behaviour generated by customers' and managers' decisions in queueing systems as well as the analysis of the individual behaviour of these agents. In this way, we differ from the typical literature related to queueing systems which focuses on optimising performance measures and the analysis of equilibrium solutions. Our work can be seen as a first step towards understanding the interaction between customer behaviour and the capacity adjustment process in queueing systems. This framework is still in its early stages and accordingly there is a large potential for further work that spans several research topics. Interesting extensions to this work include incorporating other characteristics of queueing systems which affect the customers' experience (e.g. balking, reneging and jockeying); providing customers and managers with additional information to take their decisions (e.g. service price, quality, customers' profile); analysing different decision rules and studying other characteristics which determine the profile of customers and managers.
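
As a minimal sketch of the adaptive-expectations updating and risk-averse facility choice described in the summary above (Python; the smoothing weight, the upper-bound form "mean + k * spread" and the uniform stand-in for experienced sojourn times are illustrative assumptions, not the dissertation's actual model or parameters):

```python
import random
import statistics

def update_expectation(old_expectation, new_observation, weight):
    """Adaptive expectations: blend memory with new information.
    A 'reactive' customer uses a large weight, a 'conservative' one a small weight."""
    return weight * new_observation + (1 - weight) * old_expectation

def choose_facility(perceived_means, perceived_spreads, risk_aversion):
    """Risk-averse choice: join the facility with the lowest estimated
    upper bound on sojourn time (perceived mean + k * perceived spread)."""
    upper_bounds = [m + risk_aversion * s
                    for m, s in zip(perceived_means, perceived_spreads)]
    return min(range(len(upper_bounds)), key=lambda i: upper_bounds[i])

# Tiny illustrative run: two facilities, ten periods, one representative customer.
random.seed(1)
means, spreads = [5.0, 5.0], [1.0, 1.0]
history = [[5.0], [5.0]]
for period in range(10):
    choice = choose_facility(means, spreads, risk_aversion=1.5)
    observed = random.uniform(2, 8)      # stand-in for the experienced sojourn time
    means[choice] = update_expectation(means[choice], observed, weight=0.3)
    history[choice].append(observed)
    spreads[choice] = statistics.stdev(history[choice])
    print(f"period {period}: facility {choice}, observed sojourn {observed:.2f}")
```

Setting risk_aversion to zero recovers a risk-neutral customer, while a small versus large weight corresponds to the "conservative" versus "reactive" profiles mentioned above.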

Relevance: 30.00%

Abstract:

New plate-tectonic reconstructions of the Gondwana margin suggest that the location of Gondwana-derived terranes should not only be guided by the models, but should also consider the possible detrital input from some Asian blocks (Hunia), supposed to have been located along the Cambrian Gondwana margin and accreted to the North-Chinese block in the Silurian. Consequently, the Gondwana margin has to be subdivided into a more western domain, where the future Avalonian blocks will be separated from Gondwana by the opening Rheic Ocean, whereas in its eastern continuation, hosting the future basement areas of Central Europe, different periods of crustal extension should be distinguished. Instead of applying a rather cylindrical model, it is supposed that crustal extension follows a much more complex pattern, where local back-arcs or intra-continental rifts are involved. Guided by the age data of magmatic rocks and the pattern of subsidence curves, the following extensional events can be distinguished: During the early to middle Cambrian, a back-arc setting guided the evolution at the Gondwana margin. Contemporaneous intra-continental rift basins developed at other places related to a general post-PanAfrican extensional phase affecting Africa. Upper Cambrian formation of oceanic crust is manifested in the Chamrousse area, and may have lateral cryptic relics preserved in other places. This is regarded as the oceanisation of some marginal basins in a context of back-arc rifting. These basins were closed in a mid-Ordovician tectonic phase, related to the subduction of buoyant material (mid-ocean ridge?). Since the Early Ordovician, a new phase of extension is observed, accompanied by large-scale volcanic activity; erosion of the rift shoulders generated detritus (Armorican Quartzite), and the rift basins collected detrital zircons from a wide hinterland. This phase heralded the opening of Palaeotethys, but it failed due to the Silurian collision (Eo-Variscan phase) of an intra-oceanic arc with the Gondwana margin. During this time period, the drift of the future Hunia microcontinents began at the eastern wing of the Gondwana margin, through the opening of an eastern prolongation of the already existing Rheic Ocean. The passive margin of the remaining Gondwana was composed of the Galatian superterranes, constituents of the future Variscan basement areas. Remaining under the influence of crustal extension, they would start their drift towards Laurussia from the earliest Devonian onwards, during the opening of the Palaeotethys Ocean. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

The objective of this study was to evaluate the performance of stacked species distribution models in predicting the alpha and gamma species diversity patterns of two important plant clades along elevation in the Andes. We modelled the distribution of the species in the Anthurium genus (53 species) and the Bromeliaceae family (89 species) using six modelling techniques. We combined all of the predictions for the same species in ensemble models based on two different criteria: the average of the rescaled predictions by all techniques and the average of the best techniques. The rescaled predictions were then reclassified into binary predictions (presence/absence). By stacking either the original predictions or binary predictions for both ensemble procedures, we obtained four different species richness models per taxon. The gamma and alpha diversity per elevation band (500 m) were also computed. To evaluate the predictive ability of the four species richness and gamma diversity predictions, the models were compared with real data along an elevation gradient that was independently compiled by specialists. Finally, we also tested whether our richness models performed better than a null model of altitudinal changes of diversity based on the literature. Stacking of the ensemble predictions of the individual species models generated richness models that proved to be well correlated with the observed alpha diversity richness patterns along elevation and with the gamma diversity derived from the literature. Overall, these models tend to overpredict species richness. The use of the ensemble predictions from the species models built with different techniques seems very promising for modelling species assemblages. Stacking of the binary models reduced the over-prediction, although more research is needed. The randomisation test proved to be a promising method for testing the performance of the stacked models, but other implementations may still be developed.
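
A minimal sketch of the stacking step described above, assuming per-species ensemble suitability maps are already available as NumPy arrays and a binarisation threshold exists for each species (array names, grid size and thresholds are illustrative placeholders, not the study's data):

```python
import numpy as np

def stack_species_models(suitability_maps, thresholds):
    """Stack individual species distribution models into richness predictions.

    suitability_maps: dict species -> 2D array of rescaled (0-1) ensemble suitability
    thresholds: dict species -> binarisation threshold for presence/absence
    Returns (richness_binary, richness_raw):
      - sum of thresholded presence/absence layers
      - sum of the continuous suitability layers
    """
    shape = next(iter(suitability_maps.values())).shape
    richness_binary = np.zeros(shape)
    richness_raw = np.zeros(shape)
    for species, suitability in suitability_maps.items():
        richness_binary += (suitability >= thresholds[species]).astype(float)
        richness_raw += suitability
    return richness_binary, richness_raw

# Illustrative example with three fake species on a 4x4 grid.
rng = np.random.default_rng(0)
maps = {f"sp{i}": rng.random((4, 4)) for i in range(3)}
thresholds = {f"sp{i}": 0.5 for i in range(3)}
binary_richness, raw_richness = stack_species_models(maps, thresholds)
print(binary_richness)
```

Summing the binary layers corresponds to the binary stacking discussed above, while summing the continuous layers corresponds to stacking the original (rescaled) predictions.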

Relevance: 30.00%

Abstract:

Human low-grade astrocytomas frequently recur and progress to states of higher malignancy. During tumor progression, TP53 alterations are among the first genetic changes, while derangement of the p16/p14ARF/RB-1 system occurs later. To probe the pathogenetic significance of TP53 and RB-1 alterations, we introduced a v-src transgene driven by glial fibrillary acidic protein (GFAP) regulatory elements (which causes preneoplastic astrocytic lesions and, stochastically, astrocytomas of varying degrees of malignancy) into TP53+/- or RB-1+/- mice. Hemizygosity for TP53 or RB-1 did not increase the incidence or shorten the latency of astrocytic tumors in GFAP-v-src mice over a period of up to 76 weeks. Single-strand conformation analysis of exons 5 to 8 of non-ablated TP53 alleles revealed altered migration patterns in only 3/16 tumors analyzed. Wild-type RB-1 alleles were retained in all astrocytic tumors analyzed that were derived from RB-1+/- GFAP-v-src mice, and pRb immunostaining revealed protein expression in all tumors. Conversely, the GFAP-v-src transgene did not influence the development of extraneural tumors related to TP53 or RB-1 hemizygosity. Therefore, the present study indicates that neither loss of RB-1 nor loss of TP53 confers a growth advantage in vivo to preneoplastic astrocytes expressing v-src, and suggests that RB-1 and TP53 belong to a single complementation group along with v-src in this transgenic model of astrocytoma development. The stochastic development of astrocytic tumors in GFAP-v-src, TP53+/- GFAP-v-src, and RB-1+/- GFAP-v-src transgenic mice indicates that additional, hitherto unknown genetic lesions of astrocytes contribute to tumorigenesis; their elucidation may prove important for our understanding of astrocytoma initiation and progression.

Relevance: 30.00%

Abstract:

BACKGROUND: The Advisa MRI system is designed to safely undergo magnetic resonance imaging (MRI). Its influence on image quality is not well known. OBJECTIVE: To evaluate cardiac magnetic resonance (CMR) image quality and to characterize myocardial contraction patterns by using the Advisa MRI system. METHODS: In this international trial with 35 participating centers, an Advisa MRI system was implanted in 263 patients. Of those, 177 were randomized to the MRI group and 150 underwent MRI scans at the 9-12-week visit. Left ventricular (LV) and right ventricular (RV) cine long-axis steady-state free precession MR images were graded for quality. Signal loss along the implantable pulse generator and leads was measured. The tagging CMR data quality was assessed as the percentage of trackable tagging points on complementary spatial modulation of magnetization acquisitions (n=16) and segmental circumferential fiber shortening was quantified. RESULTS: Of all cine long-axis steady-state free precession acquisitions, 95% of LV and 98% of RV acquisitions were of diagnostic quality, with 84% and 93%, respectively, being of good or excellent quality. Tagging points were trackable from systole into early diastole (360-648 ms after the R-wave) in all segments. During RV pacing, tagging demonstrated a dyssynchronous contraction pattern, which was not observed in nonpaced (n = 4) and right atrial-paced (n = 8) patients. CONCLUSIONS: In the Advisa MRI study, high-quality CMR images for the assessment of cardiac anatomy and function were obtained in most patients with an implantable pacing system. In addition, this study demonstrated the feasibility of acquiring tagging data to study the LV function during pacing.

Relevance: 30.00%

Abstract:

Kinetic parameters of the interaction of the T cell receptor (TCR) with its ligand have been proposed to control T cell activation. Analyses of the kinetic data obtained so far have produced conflicting insights; here, we offer a consideration of this problem. As a model system, association and dissociation of a soluble TCR (sT1) and its specific ligand, an azidobenzoic acid derivative of the peptide SYIPSAEK-(ABA)I (residues 252-260 from Plasmodium berghei circumsporozoite protein), bound to the class I MHC H-2K(d)-encoded molecule (MHCp), were studied by surface plasmon resonance. The association time courses exhibited biphasic patterns. The fast and dominant phase was assigned to ligand association with the major fraction of TCR molecules, whereas the slow component was attributed to the presence of traces of TCR dimers. The association rate constant derived for the fast phase, assuming a reversible, single-step reaction mechanism, was relatively slow and markedly temperature-dependent, decreasing from 7.0 × 10^3 M^-1 s^-1 at 25 °C to 1.8 × 10^2 M^-1 s^-1 at 4 °C. Hence, it is suggested that these observed slow rate constants are the result of unresolved elementary steps of the process. Indeed, our analysis of the kinetic data shows that the time courses of TCR-MHCp interaction fit well to two different, yet closely related mechanisms, in which either an induced fit or a pre-equilibrium of two unbound TCR conformers is operational. These mechanisms may provide a rationale for the reported conformational flexibility of the TCR and its unusual ligand recognition properties, which combine high specificity with considerable cross-reactivity.
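
For illustration, the reversible, single-step (1:1) binding mechanism assumed above implies an observed association rate k_obs = k_on * C + k_off and an association-phase response R(t) = R_eq * (1 - exp(-k_obs * t)). The sketch below simulates such a time course; k_on uses the 25 °C value quoted above, while k_off, the analyte concentration and R_max are purely illustrative assumptions, not values from the study:

```python
import numpy as np

def association_response(t, k_on, k_off, conc, r_max):
    """1:1 Langmuir (reversible single-step) association phase:
    R(t) = R_eq * (1 - exp(-(k_on*C + k_off) * t)),
    with R_eq = R_max * k_on*C / (k_on*C + k_off)."""
    k_obs = k_on * conc + k_off
    r_eq = r_max * k_on * conc / k_obs
    return r_eq * (1.0 - np.exp(-k_obs * t))

def dissociation_response(t, k_off, r_start):
    """Dissociation phase: simple exponential decay R(t) = R_start * exp(-k_off*t)."""
    return r_start * np.exp(-k_off * t)

t = np.linspace(0, 300, 301)        # seconds
k_on = 7.0e3                        # M^-1 s^-1, association rate constant reported at 25 °C
k_off = 1.0e-2                      # s^-1, illustrative assumption
conc = 1.0e-5                       # M, illustrative analyte concentration
assoc = association_response(t, k_on, k_off, conc, r_max=100.0)
dissoc = dissociation_response(t, k_off, r_start=assoc[-1])
print(f"k_obs = {k_on*conc + k_off:.3f} s^-1, association plateau ~ {assoc[-1]:.1f} RU")
```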

Relevance: 30.00%

Abstract:

Amnestic mild cognitive impairment (aMCI) is characterized by memory deficits alone (single-domain, sd-aMCI) or associated with other cognitive disabilities (multi-domain, md-aMCI). The present study assessed the patterns of electroencephalographic (EEG) activity during the encoding and retrieval phases of short-term memory in these two aMCI subtypes, to identify potential functional differences according to the neuropsychological profile. Continuous EEG was recorded in 43 aMCI patients, of whom 16 had sd-aMCI and 27 md-aMCI, and in 36 age-matched controls (EC) during delayed match-to-sample tasks for face and letter stimuli. At encoding, attended stimuli elicited a parietal alpha (8-12 Hz) power decrease (desynchronization), whereas distracting stimuli were associated with an alpha power increase (synchronization) over right central sites. No difference was observed in parietal alpha desynchronization among the three groups. For attended faces, the alpha synchronization underlying suppression of distracting letters was reduced in both aMCI subgroups, but more severely in md-aMCI cases, which differed significantly from EC. At retrieval, the early N250r recognition effect was significantly reduced for faces in md-aMCI as compared to both sd-aMCI and EC. The results suggest a differential alteration of working memory cerebral processes for faces in the two aMCI subtypes, with covert face recognition processes being specifically altered in md-aMCI.
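
A minimal sketch of how alpha-band (8-12 Hz) power increases and decreases of the kind reported above are commonly quantified: Welch power spectra plus an event-related (de)synchronization percentage relative to a baseline. The sampling rate, band limits and synthetic signals below follow common practice and are illustrative assumptions, not this study's actual pipeline:

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Mean power spectral density in the alpha band (8-12 Hz) via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def erd_ers_percent(task_segment, baseline_segment, fs):
    """Event-related (de)synchronization as percent change from baseline:
    negative values = desynchronization (power decrease),
    positive values = synchronization (power increase)."""
    p_task = alpha_band_power(task_segment, fs)
    p_base = alpha_band_power(baseline_segment, fs)
    return 100.0 * (p_task - p_base) / p_base

# Synthetic example: baseline with a strong 10 Hz rhythm, task with an attenuated rhythm.
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(42)
baseline = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
task = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(f"alpha ERD/ERS: {erd_ers_percent(task, baseline, fs):.1f} %")
```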

Relevance: 30.00%

Abstract:

Until the mid-1990s, gastric cancer was the leading cause of cancer death worldwide. Rates have been declining for several decades, and gastric cancer has become a relatively rare cancer in North America and in most of Northern and Western Europe, but not in Eastern Europe, Russia and selected areas of Central and South America or East Asia. We analyzed gastric cancer mortality in Europe and other areas of the world from 1980 to 2005 using joinpoint regression analysis, and provided updated site-specific incidence rates from 51 selected registries. Over the last decade, the annual percent change (APC) in mortality rate was around -3% to -4% for the major European countries. The APCs were similar for the Republic of Korea (APC = -4.3%), Australia (-3.7%), the USA (-3.6%), Japan (-3.5%), Ukraine (-3%) and the Russian Federation (-2.8%). In Latin America, the decline was less marked, but constant, with APCs around -1.6% in Chile and Brazil, -2.3% in Argentina and Mexico and -2.6% in Colombia. Cancers in the fundus and pylorus are more common in high-incidence and high-mortality areas and have been declining more than cardia gastric cancer. Steady downward trends persist in gastric cancer mortality worldwide, even in middle-aged populations, and hence further appreciable declines are likely in the near future.
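
For reference, the annual percent change quoted above is conventionally obtained from a log-linear fit of rates against calendar year, APC = 100 * (exp(b) - 1), where b is the fitted slope. The sketch below applies that formula to made-up rates, not the study's data:

```python
import numpy as np

def annual_percent_change(years, rates):
    """Estimate the annual percent change (APC) from a log-linear trend:
    ln(rate) = a + b*year  =>  APC = 100 * (exp(b) - 1)."""
    years = np.asarray(years, dtype=float)
    log_rates = np.log(np.asarray(rates, dtype=float))
    slope, _intercept = np.polyfit(years, log_rates, 1)
    return 100.0 * (np.exp(slope) - 1.0)

# Illustrative (made-up) mortality rates declining by roughly 3% per year.
years = np.arange(1995, 2006)
rates = 20.0 * 0.97 ** (years - years[0])
print(f"APC = {annual_percent_change(years, rates):.2f} % per year")
```

Joinpoint analysis, as used in the study, additionally searches for the calendar years at which this slope changes and fits a separate segment (and hence a separate APC) between consecutive joinpoints.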

Relevance: 30.00%

Abstract:

Background: There may be a considerable gap between the LDL cholesterol (LDL-C) and blood pressure (BP) goal values recommended by the guidelines and the results achieved in daily practice. Design: Prospective cross-sectional survey of cardiovascular disease risk profiles and management, with a focus on lipid lowering and BP lowering in clinical practice. Methods: In phase 1, the cardiovascular risk of patients with known lipid profile visiting their general practitioner was anonymously assessed in accordance with the PROCAM score. In phase 2, high-risk patients who did not achieve the LDL-C goal of less than 2.6 mmol/l in phase 1 could be further documented. Results: Six hundred thirty-five general practitioners collected the data of 23 892 patients with known lipid profile. Forty percent were high-risk patients (diabetes mellitus or coronary heart disease or PROCAM score >20%), compared with 27% estimated by the physicians. The goal attainment rate was almost twice as high for BP as for LDL-C in high-risk patients (62 vs. 37%). Both goals were attained by 25%. LDL-C values in phases 1 and 2 were available for 3097 high-risk patients not at the LDL-C goal in phase 1; 32% of these patients achieved the LDL-C goal of less than 2.6 mmol/l after a mean of 17 weeks. The most successful strategies for LDL-C reduction were implemented in only 22% of the high-risk patients. Conclusion: Although patients at high cardiovascular risk were treated more intensively than low- or medium-risk patients, the majority remained insufficiently controlled, which is an incentive for intensified medical education. Adequate implementation of Swiss and international guidelines would be expected to contribute to improved achievement of LDL-C and BP goal values in daily practice.

Relevance: 30.00%

Abstract:

Background: The association between dietary patterns and head and neck cancer has rarely been addressed. Patients and methods: We used individual-level pooled data from five case-control studies (2452 cases and 5013 controls) participating in the International Head and Neck Cancer Epidemiology consortium. A posteriori dietary patterns were identified through a principal component factor analysis carried out on 24 nutrients derived from study-specific food-frequency questionnaires. Odds ratios (ORs) and corresponding 95% confidence intervals (CIs) were estimated using unconditional logistic regression models on quintiles of factor scores. Results: We identified three major dietary patterns named 'animal products and cereals', 'antioxidant vitamins and fiber', and 'fats'. The 'antioxidant vitamins and fiber' pattern was inversely related to oral and pharyngeal cancer (OR = 0.57, 95% CI 0.43-0.76 for the highest versus the lowest score quintile). The 'animal products and cereals' pattern was positively associated with laryngeal cancer (OR = 1.54, 95% CI 1.12-2.11), whereas the 'fats' pattern was inversely associated with oral and pharyngeal cancer (OR = 0.78, 95% CI 0.63-0.97) and positively associated with laryngeal cancer (OR = 1.69, 95% CI 1.22-2.34). Conclusions: These findings suggest that diets rich in animal products, cereals, and fats are positively related to laryngeal cancer, and those rich in fruit and vegetables inversely related to oral and pharyngeal cancer.
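
A compact sketch of the analysis pipeline described above - a posteriori patterns from a principal component analysis of nutrient intakes, then odds ratios from logistic regression on quintiles of the factor scores - using made-up data; the number of components, the synthetic case/control outcome and the use of scikit-learn are illustrative assumptions, not the consortium's actual methods or data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects, n_nutrients = 500, 24

# Made-up nutrient intake matrix and case/control outcome.
nutrients = rng.normal(size=(n_subjects, n_nutrients))
outcome = rng.integers(0, 2, size=n_subjects)

# 1) A posteriori dietary patterns: principal components of standardised nutrient intakes.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(nutrients))

# 2) Quintiles of the first factor score as an ordinal exposure.
quintile = np.digitize(scores[:, 0], np.quantile(scores[:, 0], [0.2, 0.4, 0.6, 0.8]))

# 3) Unconditional logistic regression on quintile indicators (lowest quintile = reference);
#    exponentiated coefficients approximate odds ratios per quintile.
dummies = np.eye(5)[quintile][:, 1:]      # drop the reference category
model = LogisticRegression().fit(dummies, outcome)
odds_ratios = np.exp(model.coef_[0])
print("OR (Q2-Q5 vs Q1):", np.round(odds_ratios, 2))
```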

Relevance: 30.00%

Abstract:

While there is evidence that the two ubiquitously expressed thyroid hormone (T3) receptors, TRalpha1 and TRbeta1, have distinct functional specificities, the mechanism by which they discriminate potential target genes remains largely unexplained. In this study, we demonstrate that the thyroid hormone response elements (TRE) from the malic enzyme and myelin basic protein genes (METRE and MBPTRE, respectively) are not functionally equivalent. The METRE, a direct repeat motif with a 4-base-pair gap between the two half-site hexamers, binds the thyroid hormone receptor as a heterodimer with the 9-cis-retinoic acid receptor (RXR) and mediates a high T3-dependent activation in response to TRalpha1 or TRbeta1 in NIH3T3 cells. In contrast, the MBPTRE, which consists of an inverted palindrome formed by two hexamers spaced by 6 base pairs, confers efficient transactivation by TRbeta1 but poor transactivation by TRalpha1. While both receptors form heterodimers with RXR on MBPTRE, the poor transactivation by TRalpha1 also correlates with its ability to bind efficiently as a monomer. This monomer, which is only observed with TRalpha1 bound to MBPTRE, interacts neither with N-CoR nor with SRC-1, explaining its functional inefficacy. However, in Xenopus oocytes, in which RXR proteins are not detectable, the transactivation mediated by TRalpha1 and TRbeta1 is equivalent and independent of an RXR supply, raising the question of the identity of the thyroid hormone receptor partner in these cells. Thus, in mammalian cells, the binding characteristics of TRalpha1 on MBPTRE (i.e. high monomer binding efficiency and low transactivation activity) might explain the particular pattern of T3 responsiveness of MBP gene expression during central nervous system development.

Relevance: 30.00%

Abstract:

BACKGROUND: Mortality among HIV-infected persons is decreasing, and causes of death are changing. Classification of deaths is hampered because of low autopsy rates, frequent deaths outside of hospitals, and shortcomings of International Statistical Classification of Diseases and Related Health Problems (ICD-10) coding. METHODS: We studied mortality among Swiss HIV Cohort Study (SHCS) participants (1988-2010) and causes of death using the Coding Causes of Death in HIV (CoDe) protocol (2005-2009). Furthermore, we linked the SHCS data to the Swiss National Cohort (SNC) cause of death registry. RESULTS: AIDS-related mortality peaked in 1992 [11.0/100 person-years (PY)] and decreased to 0.144/100 PY (2006); non-AIDS-related mortality ranged between 1.74/100 PY (1993) and 0.776/100 PY (2006); mortality of unknown cause ranged between 2.33 and 0.206/100 PY. From 2005 to 2009, 459 of 9053 participants (5.1%) died. Underlying causes of death were: non-AIDS malignancies [total, 85 (19%) of 446 deceased persons with known hepatitis C virus (HCV) status; HCV-negative persons, 59 (24%); HCV-coinfected persons, 26 (13%)]; AIDS [73 (16%); 50 (21%); 23 (11%)]; liver failure [67 (15%); 12 (5%); 55 (27%)]; non-AIDS infections [42 (9%); 13 (5%); 29 (14%)]; substance use [31 (7%); 9 (4%); 22 (11%)]; suicide [28 (6%); 17 (7%); 11 (6%)]; myocardial infarction [28 (6%); 24 (10%); 4 (2%)]. Characteristics of deceased persons differed in 2005 vs. 2009: median age (45 vs. 49 years, respectively); median CD4 count (257 vs. 321 cells/μL, respectively); the percentage of individuals who were antiretroviral therapy-naïve (13 vs. 5%, respectively); the percentage of deaths that were AIDS-related (23 vs. 9%, respectively); and the percentage of deaths from non-AIDS-related malignancies (13 vs. 24%, respectively). Concordance in the classification of deaths was 72% between CoDe and ICD-10 coding in the SHCS, and 60% between the SHCS and the SNC registry. CONCLUSIONS: Mortality in HIV-positive persons decreased to 1.33/100 PY in 2010. Hepatitis B or C virus coinfections increased the risk of death. Between 2005 and 2009, 84% of deaths were non-AIDS-related. Causes of death varied according to data source and coding system.
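
As a quick illustration of the rate unit used above, a mortality rate per 100 person-years is simply the number of deaths divided by the total follow-up time, multiplied by 100; the figures below are made up, not the cohort's:

```python
def rate_per_100_person_years(deaths, person_years):
    """Incidence or mortality rate expressed per 100 person-years of follow-up."""
    return 100.0 * deaths / person_years

# Illustrative (made-up) example: 120 deaths observed over 9000 person-years of follow-up.
print(f"{rate_per_100_person_years(120, 9000):.2f} deaths per 100 PY")
```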

Relevance: 30.00%

Abstract:

Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. UV irradiance is monitored via different techniques including ground measurements and satellite observations. However, it is difficult to translate such observations into human UV exposure or dose because of confounding factors (shape of the exposed surface, shading, behavior, etc.). A collaboration between public health institutions, a meteorological office and an institute specialized in computing techniques developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop this tool, which estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed separately for direct, diffuse and reflected radiation. The radiation components are deduced from corresponding measurements of UV irradiance, and the related UV dose received by each triangle of the virtual manikin is computed accounting for shading by other body parts and any protection measures. The model was verified with dosimetric measurements (n=54) in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately. The symmetric mean absolute percentage error was 13%. Half of the predictions were within 17% of the measurements. This model allows the assessment of outdoor occupational and recreational UV exposures without requiring time-consuming individual dosimetry, and has numerous potential uses in skin cancer prevention and research. Using this tool, we investigated solar UV exposure patterns with respect to the relative contribution of the direct, diffuse and reflected radiation. We assessed exposure doses for various body parts and exposure scenarios of a standing individual (static and dynamic postures). As input, the model used erythemally weighted ground irradiance data measured in 2009 at Payerne, Switzerland. A year-round daily exposure (8 am to 5 pm) without protection was assumed. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer), but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose. Acute diffuse exposures were also observed on cloudy summer days. The importance of diffuse UV radiation should not be underestimated when advocating preventive measures. Messages focused on avoiding acute direct exposures may be of limited efficiency in preventing skin cancers associated with chronic exposure (e.g., squamous cell carcinomas).
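
The validation metric quoted above, the symmetric mean absolute percentage error, can be computed as in the sketch below (one common SMAPE definition; the measured and predicted daily doses are made-up placeholders, not the study's dosimetry data):

```python
import numpy as np

def smape(measured, predicted):
    """Symmetric mean absolute percentage error (%), one common definition:
    mean of |P - M| / ((|M| + |P|) / 2) * 100 over all measurement pairs."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs(predicted - measured) /
                   ((np.abs(measured) + np.abs(predicted)) / 2.0)) * 100.0

# Illustrative daily UV doses in SED (made-up measured vs. modelled values).
measured = [6.2, 8.5, 10.1, 14.6, 7.3]
predicted = [6.8, 7.9, 11.0, 13.5, 8.0]
print(f"SMAPE = {smape(measured, predicted):.1f} %")
```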