977 results for high-definition television (HDTV)


Relevance:

30.00%

Publisher:

Abstract:

The broad aim of biomedical science in the postgenomic era is to link genomic and phenotype information to allow deeper understanding of the processes leading from genomic changes to altered phenotype and disease. The EuroPhenome project (http://www.EuroPhenome.org) is a comprehensive resource for raw and annotated high-throughput phenotyping data arising from projects such as EUMODIC. EUMODIC is gathering data from the EMPReSSslim pipeline (http://www.empress.har.mrc.ac.uk/), which is performed on inbred mouse strains and knock-out lines arising from the EUCOMM project. The EuroPhenome interface allows the user to access the data via phenotype or genotype, and in a variety of ways, including graphical display, statistical analysis and access to the raw data via web services. The raw phenotyping data captured in EuroPhenome are annotated by an annotation pipeline which automatically identifies statistically different mutants from the appropriate baseline and assigns ontology terms for that specific test. Mutant phenotypes can be quickly identified using two EuroPhenome tools: PhenoMap, a graphical representation of statistically relevant phenotypes, and ontology-term mining for mutants of interest. To assist with data definition and cross-database comparisons, phenotype data are annotated using combinations of terms from biological ontologies.
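The annotation step described above — flag a mutant line whose measurements deviate from the matched baseline, then attach an ontology term for that test — can be sketched as below. The z-score cut-off, the sample readings, and the ontology term ID are illustrative assumptions, not EuroPhenome's actual statistical pipeline.

```python
from statistics import mean, stdev

def annotate_mutant(baseline, mutant, ontology_term, z_cutoff=3.0):
    """Flag a mutant line whose mean deviates from the baseline by more than
    z_cutoff baseline standard deviations, and attach an ontology term.
    (Illustrative only: the real pipeline uses proper statistical tests.)"""
    mu, sd = mean(baseline), stdev(baseline)
    z = (mean(mutant) - mu) / sd
    significant = abs(z) > z_cutoff
    return {"z_score": z, "significant": significant,
            "annotation": ontology_term if significant else None}

# Hypothetical phenotyping readings (arbitrary units); term ID illustrative
result = annotate_mutant(baseline=[9.8, 10.0, 10.2, 10.0],
                         mutant=[12.0, 11.8, 12.2],
                         ontology_term="MP:0001515")
```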


INTRODUCTION: This study describes the characteristics of the metabolic syndrome in HIV-positive patients in the Data Collection on Adverse Events of Anti-HIV Drugs study and discusses the impact of different methodological approaches on estimates of the prevalence of metabolic syndrome over time. METHODS: We described the prevalence of the metabolic syndrome in patients under follow-up at the end of six calendar periods from 2000 to 2007. We modified the definition of the metabolic syndrome to take account of the use of lipid-lowering and antihypertensive medication, measurement variability and missing values, and assessed the impact of these modifications on the estimated prevalence. RESULTS: For all definitions considered, there was an increasing prevalence of the metabolic syndrome over time, although the prevalence estimates themselves varied widely. Using our primary definition, we found an increase in prevalence from 19.4% in 2000/2001 to 41.6% in 2006/2007. Modification of the definition to incorporate antihypertensive and lipid-lowering medication had relatively little impact on the prevalence estimates, as did modification to allow for missing data. In contrast, modification to allow the metabolic syndrome to be reversible and to allow for measurement variability lowered prevalence estimates substantially. DISCUSSION: The prevalence of the metabolic syndrome in cohort studies is largely based on the use of nonstandardized measurements as they are captured in daily clinical care. As a result, bias is easily introduced, particularly when measurements are both highly variable and may be missing. We suggest that the prevalence of the metabolic syndrome in cohort studies should be based on two consecutive measurements of the laboratory components in the syndrome definition.
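The closing recommendation — count a laboratory component of the syndrome only when two consecutive measurements are abnormal — can be sketched as follows. The thresholds and component names are common metabolic-syndrome conventions used for illustration, not the study's exact definition.

```python
def confirmed_abnormal(values, threshold):
    """True only if two *consecutive* measurements exceed the threshold,
    which damps one-off measurement variability."""
    return any(a > threshold and b > threshold
               for a, b in zip(values, values[1:]))

# Illustrative serial measurements and thresholds (assumptions, not the study's)
patient = {
    "triglycerides_mg_dl": ([160, 180, 175], 150),
    "glucose_mg_dl":       ([98, 112, 95], 100),   # single spike: not confirmed
    "waist_cm":            ([104, 105, 106], 102),
}
n_components = sum(confirmed_abnormal(v, t) for v, t in patient.values())
has_mets = n_components >= 3  # classic "any 3 of 5 components" rule
```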


This paper presents the recent history of a large prealpine lake (Lake Bourget) using chironomids, diatoms and organic matter analysis, and deals with the ability of the paleolimnological approach to define an ecological reference state for the lake in the sense of the European Water Framework Directive. A low-resolution study of subfossil chironomids in a 4-m-long core shows the remarkable stability over the last 2.5 kyr of the profundal community, dominated by a Micropsectra-association until the beginning of the twentieth century, when oxyphilous taxa disappeared. Focusing on this key recent period, a high-resolution, multiproxy study of two short cores reveals a progressive evolution of the lake's ecological state. Until AD 1880, Lake Bourget showed low organic matter content in the deep sediments (TOC less than 1%) and a well-oxygenated hypolimnion that allowed the development of a profundal oxyphilous chironomid fauna (Micropsectra-association). Diatom communities were characteristic of oligotrophic conditions. Around AD 1880, a slight increase in the TOC was the first sign of changes in lake conditions. This was followed by a first limited decline in oligotrophic diatom taxa and the disappearance of two oxyphilous chironomid taxa at the beginning of the twentieth century. The 1940s were a major turning point in recent lake history. Diatom assemblages and the accumulation of well-preserved planktonic organic matter in the sediment provide evidence of strong eutrophication. The absence of profundal chironomid communities reveals permanent hypolimnetic anoxia. From AD 1995 to 2006, the diatom assemblages suggest a reduction in nutrients and a return to mesotrophic conditions, a result of improved wastewater management. However, no change in hypolimnion benthic conditions has been shown by either the organic matter or the subfossil profundal chironomid community. Our results emphasize the relevance of the paleolimnological approach for the assessment of reference conditions for modern lakes. Before AD 1900, the profundal Micropsectra-association and the Cyclotella-dominated diatom community can be considered the Lake Bourget reference community, which reflects the reference ecological state of the lake.


BACKGROUND AND OBJECTIVES: Evaluation of glomerular hyperfiltration (GH) is difficult; the variable reported definitions impede comparisons between studies. A clear and universal definition of GH would help in comparing results of trials aimed at reducing GH. This study assessed how GH is measured and defined in the literature. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Three databases (Embase, MEDLINE, CINAHL) were systematically searched using the terms "hyperfiltration" or "glomerular hyperfiltration". All studies reporting a GH threshold or studying the effect of a high GFR in a continuous manner against another outcome of interest were included. RESULTS: The literature search was performed from November 2012 to February 2013 and updated in August 2014. Of the 2013 retrieved studies, 405 were included. Threshold use to define GH was reported in 55.6% of studies. Of these, 88.4% used a single threshold and 11.6% used several thresholds adapted to participant sex or age. In 29.8% of the studies, the choice of a GH threshold was not based on a control group or literature references. After 2004, the use of a GH threshold increased (P<0.001), but the use of a control group to precisely define that GH threshold decreased significantly (P<0.001); the threshold did not differ among pediatric, adult, or mixed-age studies. The GH threshold ranged from 90.7 to 175 ml/min per 1.73 m(2) (median, 135 ml/min per 1.73 m(2)). CONCLUSION: Thirty percent of studies did not justify the choice of threshold values. The decrease of GFR in the elderly was rarely considered in defining GH. From a methodologic point of view, an age- and sex-matched control group should be used to define a GH threshold.
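The review's methodological recommendation — derive the hyperfiltration threshold from an age- and sex-matched control group — is often operationalised as control mean + 2 SD of GFR. That convention, and the GFR values below, are assumptions for illustration rather than the review's prescription.

```python
from statistics import mean, stdev

def gh_threshold(control_gfr):
    """Hyperfiltration threshold as control mean + 2 SD (a common convention,
    assumed here). GFR in ml/min per 1.73 m^2."""
    return mean(control_gfr) + 2 * stdev(control_gfr)

# Hypothetical GFR values from an age- and sex-matched control group
threshold = gh_threshold([100, 110, 120, 130, 140])
```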


OBJECTIVES: This study aimed to characterize myocardial infarction after percutaneous coronary intervention (PCI) based on cardiac marker elevation, as recommended by the new universal definition, and on the detection of late gadolinium enhancement (LGE) by cardiovascular magnetic resonance (CMR). We also assessed whether baseline inflammatory biomarkers were higher in patients developing myocardial injury. BACKGROUND: Cardiovascular magnetic resonance accurately assesses infarct size. Baseline C-reactive protein (CRP) and neopterin predict prognosis after stent implantation. METHODS: Consecutive patients with baseline troponin (Tn) I within normal limits and no LGE in the target vessel underwent baseline and post-PCI CMR. The Tn-I was measured until 24 h after PCI. Serum high-sensitivity CRP and neopterin were assessed before coronary angiography. RESULTS: Of the 45 patients, aged 64 (53 to 72) years, 33% developed LGE with an infarct size of 0.83 g (interquartile range: 0.32 to 1.30 g). A Tn-I elevation above the 99th percentile upper reference limit (i.e., myocardial necrosis) (median Tn-I: 0.51 μg/l, interquartile range: 0.16 to 1.23) and a Tn-I >3× upper reference limit (i.e., type 4a myocardial infarction [MI]) occurred in 58% and 47% of patients, respectively. LGE was undetectable in 42% and 43% of patients with periprocedural myocardial necrosis and type 4a MI, respectively. Agreement between LGE and type 4a MI was moderate (kappa = 0.45). The levels of CRP or neopterin did not significantly differ between patients with or without myocardial injury, whether detected by CMR or according to the new definition (p = NS). CONCLUSIONS: This study shows a lack of substantial agreement between the new universal definition and CMR for the diagnosis of small-size periprocedural myocardial damage after complex PCI. Baseline levels of CRP or neopterin were not predictive of the development of periprocedural myocardial damage.
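The marker-based arm of the comparison reduces to two cut-offs on peak post-PCI troponin: above the 99th-percentile upper reference limit (URL) counts as periprocedural necrosis, and above 3× URL as type 4a MI. A sketch of that classification; the URL value used in the example is a placeholder assay limit, not the study's.

```python
def classify_troponin(tn_peak_ug_l, url_ug_l):
    """Classify post-PCI myocardial injury by peak troponin I, following the
    universal-definition cut-offs described in the study."""
    if tn_peak_ug_l > 3 * url_ug_l:
        return "type 4a MI"
    if tn_peak_ug_l > url_ug_l:
        return "myocardial necrosis"
    return "no marker-defined injury"

# 0.04 ug/l is a placeholder URL; 0.51 is the study's reported median peak
label = classify_troponin(0.51, url_ug_l=0.04)
```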


This review covers some of the contributions to date from cerebellar imaging studies performed at ultra-high magnetic fields. A short overview of the general advantages and drawbacks of the use of such high-field systems for imaging is given. One of the biggest advantages of imaging at high magnetic fields is the improved spatial resolution, achievable thanks to the increased available signal-to-noise ratio. This high spatial resolution better matches the dimensions of the cerebellar substructures, allowing a better definition of such structures in the images. The implications of the use of high-field systems are discussed for several imaging sequences and image contrast mechanisms. This review covers studies which were performed in vivo in both rodents and humans, with a special focus on studies directed towards the observation of the different cerebellar layers.


BACKGROUND: Regional rates of hospitalization for ambulatory care sensitive conditions (ACSC) are used to compare the availability and quality of ambulatory care, but the risk adjustment for population health status is often minimal. The objectives of the study were to examine the impact of more extensive risk adjustment on regional comparisons and to investigate the relationship between various area-level factors and the properly adjusted rates. METHODS: Ours is an observational study based on routine data on 2 million anonymized insured persons in 26 Swiss cantons, followed over one or two years. A negative binomial regression was modeled with increasingly detailed information on health status (age and gender only, inpatient diagnoses, outpatient conditions inferred from dispensed drugs, and frequency of physician visits). Hospitalizations for ACSC were identified from principal diagnoses detecting 19 conditions, with an updated list of ICD-10 diagnostic codes. Co-morbidities and surgical procedures were used as exclusion criteria to improve the specificity of the detection of potentially avoidable hospitalizations (PAH). The impact of the adjustment approaches was measured by changes in the standardized ratios calculated with and without other data besides age and gender. RESULTS: 25% of cases identified by inpatient main diagnoses were removed by applying exclusion criteria. Cantonal ACSC hospitalization rates varied from 1.4 to 8.9 per 1,000 insured per year. Morbidity inferred from diagnoses and drugs dramatically increased the predictive performance, with the greatest effect found for conditions linked to an ACSC. More visits were associated with fewer PAH, although very high users were at greater risk and subjects who had not consulted were at negligible risk. With maximal health status adjustment, two thirds of the cantons changed their adjusted ratio by more than 10 percent. Cantonal variations remained substantial but unexplained by supply or demand.
CONCLUSION: Additional adjustment for health status is required when using ACSC to monitor ambulatory care. Drug-inferred morbidities are a promising approach.
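The standardized ratios that drive the regional comparison above are observed-over-expected counts, where the expected count applies stratum-specific reference rates to the canton's population. A minimal sketch of that calculation; the strata, rates and counts are invented:

```python
def expected_cases(reference_rates, population):
    """Indirectly standardized expected count: sum over strata of
    reference rate x stratum population."""
    return sum(reference_rates[s] * n for s, n in population.items())

def standardized_ratio(observed, reference_rates, population):
    """Observed hospitalizations over the expected count for this population."""
    return observed / expected_cases(reference_rates, population)

# Invented example: two age strata with per-person annual ACSC rates
rates = {"under_65": 0.002, "65_plus": 0.008}
canton = {"under_65": 40000, "65_plus": 10000}
ratio = standardized_ratio(200, rates, canton)  # expected = 80 + 80 = 160
```

A ratio above 1 means the canton hospitalizes more for ACSC than its age structure alone predicts; the paper's point is that adding morbidity data to the expected-count model changes many of these ratios substantially.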


An attractive treatment of cancer consists in inducing tumor-eradicating CD8(+) CTL specific for tumor-associated Ags, such as NY-ESO-1 (ESO), a strongly immunogenic cancer germ line gene-encoded tumor-associated Ag, widely expressed on diverse tumors. To establish optimal priming of ESO-specific CTL and to define critical vaccine variables and mechanisms, we used HLA-A2/DR1 H-2(-/-) transgenic mice and sequential immunization with immunodominant DR1- and A2-restricted ESO peptides. Immunization of mice first with the DR1-restricted ESO(123-137) peptide and subsequently with mature dendritic cells (DCs) presenting this and the A2-restricted ESO(157-165) epitope generated abundant, circulating, high-avidity primary and memory CD8(+) T cells that efficiently killed A2/ESO(157-165)(+) tumor cells. This prime-boost regimen was superior to other vaccine regimens, required strong Th1 cell responses and copresentation of MHC class I and MHC class II peptides by the same DC, and resulted in upregulation of sphingosine 1-phosphate receptor 1, and thus egress of freshly primed CD8(+) T cells from the draining lymph nodes into circulation. This well-defined system allowed detailed mechanistic analysis, which revealed that 1) the Th1 cytokines IFN-gamma and IL-2 played key roles in CTL priming, namely by upregulating on naive CD8(+) T cells the chemokine receptor CCR5; 2) the inflammatory chemokines CCL4 (MIP-1beta) and CCL3 (MIP-1alpha) chemoattracted primed CD4(+) T cells to mature DCs and activated, naive CD8(+) T cells to DC-CD4 conjugates, respectively; and 3) blockade of these chemokines or their common receptor CCR5 ablated priming of CD8(+) T cells and upregulation of sphingosine 1-phosphate receptor 1. These findings provide new opportunities for improving T cell cancer vaccines.


Tractable cases of the binary CSP are mainly divided into two classes: constraint language restrictions and constraint graph restrictions. To better understand and identify the hardest binary CSPs, in this work we propose methods to increase their hardness by increasing the balance of both the constraint language and the constraint graph. The balance of a constraint is increased by maximizing the number of domain elements with the same number of occurrences. The balance of the graph is defined using the classical definition from graph theory. To this end we present two graph models: a first model that increases the balance of a graph by maximizing the number of vertices with the same degree, and a second one that additionally increases the girth of the graph, since a high girth implies a high treewidth, an important parameter for binary CSP hardness. Our results show that our more balanced graph models and constraints result in instances that are harder, by several orders of magnitude, than typical random binary CSP instances. We also observe, at least for sparse constraint graphs, a higher treewidth for our graph models.
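The degree-balance notion behind the first graph model can be made concrete as the fraction of vertices that share the most common degree (1.0 for a regular graph). This particular metric is one straightforward way to quantify balance, chosen here for illustration rather than taken from the paper:

```python
from collections import Counter

def degree_balance(degrees):
    """Fraction of vertices sharing the most frequent degree.
    1.0 means a perfectly balanced (regular) constraint graph."""
    counts = Counter(degrees)
    return max(counts.values()) / len(degrees)

regular = degree_balance([3, 3, 3, 3])   # degree sequence of a 3-regular graph
skewed = degree_balance([1, 1, 2, 4])    # unbalanced degree sequence
```

A balanced graph model would then search for edge placements that push this fraction toward 1.0 while, in the second model, also avoiding short cycles to keep the girth high.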


Several pieces of equipment and methodologies have been developed to make precision agriculture viable, especially considering the high cost of its implementation and sampling. An interesting possibility is to define management zones, dividing producing areas into smaller zones that can be treated differently and that serve as a source of recommendation and analysis. Thus, this trial used physical and chemical properties of soil, together with yield, to generate management zones and to identify whether they can be used for recommendation and analysis. Management zones were generated by the Fuzzy C-Means algorithm and were evaluated by calculating the reduction of variance and performing means tests. The division of the area into two management zones was considered appropriate because most soil properties and yield presented distinct averages. The methodology used allowed the generation of management zones that can serve as a source of recommendation and soil analysis; although the relative efficiency showed a reduced variance for all attributes when the area was divided into three sub-regions, the ANOVA did not show significant differences among the management zones.
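A minimal one-dimensional Fuzzy C-Means — the clustering algorithm the trial used to delineate management zones — can be sketched with the textbook update rules. The initialisation scheme and the soil-property values below are illustrative, not the trial's data or implementation:

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=100):
    """1-D Fuzzy C-Means: alternate the standard membership and
    membership-weighted centre updates until iters is exhausted."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (hi - lo) * (i + 0.5) / c for i in range(c)]
    for _ in range(iters):
        # membership of point k in cluster i (standard FCM update rule)
        U = []
        for x in xs:
            d = [abs(x - v) or 1e-12 for v in centers]
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # centres as membership-weighted means
        centers = [sum(U[k][i] ** m * xs[k] for k in range(len(xs)))
                   / sum(U[k][i] ** m for k in range(len(xs)))
                   for i in range(c)]
    return centers, U

# Invented soil-property values forming two clear zones
centers, memberships = fuzzy_c_means([0.9, 1.0, 1.1, 4.8, 5.0, 5.2])
```

Each sample point then gets a membership grade in every zone rather than a hard label, which is what makes the method attractive for gradual spatial transitions in soil properties.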


High-speed semiconductor lasers are an integral part in the implementation of high-bit-rate optical communications systems. They are compact, rugged, reliable, long-lived, and relatively inexpensive sources of coherent light. Due to the very low attenuation window that exists in silica-based optical fiber at 1.55 μm and the zero-dispersion point at 1.3 μm, they have become the mainstay of optical fiber communication systems. For the fabrication of lasers with gratings, such as distributed Bragg reflector or distributed feedback lasers, etching is the most critical step. Etching defines the lateral dimensions of the structure, which determine the performance of optoelectronic devices. In this thesis, studies and experiments were carried out on the existing etching processes for InP, and a novel dry etching process was developed. The newly developed process is based on Cl2/CH4/H2/Ar chemistry and results in very smooth surfaces and vertical side walls. With this process the grating definition was significantly improved compared to other technological developments in the respective field. A surface-defined grating approach is used in this thesis work, which does not require any regrowth steps and makes the whole fabrication process simpler and more cost-effective. Moreover, this grating fabrication process is fully compatible with nanoimprint lithography and can be used for high-throughput, low-cost manufacturing. With the usual etching techniques reported before, it is not possible to etch very deep because of the aspect-ratio-dependent etching phenomenon, in which the etch rate slows down with increasing etch depth, resulting in non-vertical side walls and footing effects. Although quite vertical side walls were achieved with our developed process, footing was still a problem. To overcome the challenges related to grating definition and deep etching, a completely new three-step gas-chopping dry etching process was developed. 
This was the very first time that a time-multiplexed etching process for an InP-based material system was demonstrated. The developed gas-chopping process showed extraordinary results, including a high mask selectivity of 15, a moderate etching rate, very vertical side walls and a record-high aspect ratio of 41. Both developed etching processes are completely compatible with nanoimprint lithography and can be used for low-cost, high-throughput fabrication. A large number of broad-area lasers, ridge-waveguide lasers, distributed feedback lasers, distributed Bragg reflector lasers and coupled-cavity injection grating lasers were fabricated using the developed one-step etching process. Very extensive characterization was done to optimize all the important design and fabrication parameters. The developed devices have shown excellent performance, with a very high side-mode suppression ratio of more than 52 dB, an output power of 17 mW per facet, a high efficiency of 0.15 W/A, stable operation over temperature and injected currents, and a threshold current as low as 30 mA for an almost 1 mm long device. A record-high modulation bandwidth of 15 GHz with electron-photon resonance and open eye diagrams for 10 Gbps data transmission were also shown.


We have designed and implemented a low-cost digital system using closed-circuit television cameras coupled to a digital acquisition system for the recording of in vivo behavioral data in rodents and for allowing observation and recording of more than 10 animals simultaneously at a reduced cost, as compared with commercially available solutions. This system has been validated using two experimental rodent models: one involving chemically induced seizures and one assessing appetite and feeding. We present observational results showing comparable or improved levels of accuracy and observer consistency between this new system and traditional methods in these experimental models, discuss advantages of the presented system over conventional analog systems and commercially available digital systems, and propose possible extensions to the system and applications to nonrodent studies.


The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using as a starting point a number of the current tools and techniques which attempt to obtain 'the value' of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research work on the value of information in various areas. A "characteristic"-based framework of information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian Network diagram method is introduced to the framework to build the linkage between the characteristics and information value, in order to quantitatively calculate the quality and value of information. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives a reasonably accurate result; the differences between the model calculation and the training judgements are summarised and their potential causes discussed. Finally, several further issues, including the challenges of the framework and the implementation of this evaluation method, are raised.
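The Bayesian Network linkage from characteristics to value can be illustrated, in heavily simplified form, as a two-state value node with each characteristic treated as conditionally independent given the value (a naive-Bayes factorisation). All priors and conditional probabilities below are invented for illustration, not the paper's trained parameters:

```python
def posterior_high_value(evidence, prior_high, cpt):
    """P(value = high | observed characteristics) under a naive-Bayes
    factorisation. cpt[char] = (P(char present | high), P(char present | low))."""
    p_high, p_low = prior_high, 1.0 - prior_high
    for char, present in evidence.items():
        p_h, p_l = cpt[char]
        p_high *= p_h if present else 1.0 - p_h
        p_low *= p_l if present else 1.0 - p_l
    return p_high / (p_high + p_low)  # normalise over the two value states

# Invented characteristics and conditional probability tables
cpt = {"accurate": (0.9, 0.3), "current": (0.8, 0.4)}
p = posterior_high_value({"accurate": True, "current": True}, 0.5, cpt)
```

A full Bayesian Network would additionally model dependencies between characteristics; this flat version only conveys how observed characteristics shift the probability that a document is worth retaining.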


The prevalence of the metabolic syndrome (MetS), CVD and type 2 diabetes (T2D) is known to be higher in populations from the Indian subcontinent compared with the general UK population. While identification of this increased risk is crucial to allow for effective treatment, there is controversy over the applicability of diagnostic criteria, and particularly measures of adiposity, in ethnic minorities. Diagnostic cut-offs for BMI and waist circumference have been largely derived from predominantly white Caucasian populations and are therefore inappropriate for, and not transferable to, Asian groups. Many Asian populations, particularly South Asians, have a higher total and central adiposity for a similar body weight compared with matched Caucasians, and greater CVD risk associated with a lower BMI. Although the causes of CVD and T2D are multi-factorial, diet is thought to make a substantial contribution to the development of these diseases. Low dietary intakes and tissue levels of long-chain (LC) n-3 PUFA in South Asian populations have been linked to high-risk abnormalities in the MetS. Conversely, increasing the dietary intake of LC n-3 PUFA in South Asians has proved an effective strategy for correcting such abnormalities as dyslipidaemia in the MetS. Appropriate diagnostic criteria that include a modified definition of adiposity must be in place to facilitate the early detection and thus targeted treatment of increased risk in ethnic minorities.
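The point about non-transferable adiposity cut-offs can be made concrete with the commonly cited WHO Asian-population BMI action points (23 and 27.5 kg/m², versus 25 and 30 for general populations). Treat the exact values here as illustrative conventions, not the paper's recommendation:

```python
def bmi_category(weight_kg, height_m, south_asian=False):
    """Classify BMI with ethnicity-specific action points (WHO-style values,
    used illustratively). South Asian cut-points are lower for the same risk."""
    bmi = weight_kg / height_m ** 2
    overweight, obese = (23.0, 27.5) if south_asian else (25.0, 30.0)
    if bmi >= obese:
        return "obese"
    if bmi >= overweight:
        return "overweight"
    return "not overweight"

# The same body is flagged only under the South Asian cut-points (BMI ~24.2)
general = bmi_category(70, 1.70)
asian = bmi_category(70, 1.70, south_asian=True)
```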
