902 results for: Validation and certification competences process


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Complete investigation of thrombophilic or hemorrhagic clinical presentations is a time-, apparatus-, and cost-intensive process. Sensitive screening tests for characterizing the overall function of the hemostatic system, or defined parts of it, would be very useful. For this purpose, we are developing an electrochemical biosensor system that allows measurement of thrombin generation in whole blood as well as in plasma. METHODS: The measuring system consists of a single-use electrochemical sensor in the shape of a strip and a measuring unit connected to a personal computer, recording the electrical signal. Blood is added to a specific reagent mixture immobilized in dry form on the strip, including a coagulation activator (e.g., tissue factor or silica) and an electrogenic substrate specific to thrombin. RESULTS: Increasing thrombin concentrations gave standard curves with progressively increasing maximal current and decreasing time to reach the peak. Because the measurement was unaffected by color or turbidity, any type of blood sample could be analyzed: platelet-poor plasma, platelet-rich plasma, and whole blood. The test strips with the predried reagents were stable when stored for several months before testing. Analysis of the combined results obtained with different activators allowed discrimination between defects of the extrinsic, intrinsic, and common coagulation pathways. Activated protein C (APC) predried on the strips allowed identification of APC-resistance in plasma and whole blood samples. CONCLUSIONS: The biosensor system provides a new method for assessing thrombin generation in plasma or whole blood samples as small as 10 microL. The assay is easy to use, thus allowing it to be performed in a point-of-care setting.
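The read-out described above (maximal current and time to reach the peak, extracted from an amperometric trace) can be sketched as follows; the trace and its shape here are invented for illustration, not taken from the biosensor data:

```python
import numpy as np

def peak_metrics(t, current):
    """Return (maximal current, time to reach the peak) from a current trace."""
    i = int(np.argmax(current))
    return float(current[i]), float(t[i])

# Toy amperometric trace: current rises as thrombin cleaves the electrogenic
# substrate, then decays (constructed to peak at t = 60 s).
t = np.linspace(0.0, 300.0, 301)      # seconds
trace = t * np.exp(-t / 60.0)         # arbitrary units
imax, tpeak = peak_metrics(t, trace)
```

In the standard curves described above, increasing thrombin concentration shifts `imax` up and `tpeak` down.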


BACKGROUND: Microarray genome analysis is realising its promise for improving detection of genetic abnormalities in individuals with mental retardation and congenital abnormality. Copy number variations (CNVs) are now readily detectable using a variety of platforms, and a major challenge is the distinction of pathogenic from ubiquitous, benign polymorphic CNVs. The aim of this study was to investigate replacement of time-consuming, locus-specific testing for specific microdeletion and microduplication syndromes with microarray analysis, which theoretically should detect all known syndromes with CNV aetiologies as well as new ones. METHODS: Genome-wide copy number analysis was performed on 117 patients using Affymetrix 250K microarrays. RESULTS: 434 CNVs (195 losses and 239 gains) were found, including 18 pathogenic CNVs and 9 identified as "potentially pathogenic". Almost all pathogenic CNVs were larger than 500 kb, significantly larger than the median size of all CNVs detected. Segmental regions of loss of heterozygosity larger than 5 Mb were found in 5 patients. CONCLUSIONS: Genome microarray analysis has improved diagnostic success in this group of patients. Several examples of recently discovered "new syndromes" were found, suggesting they are more common than previously suspected and collectively are likely to be a major cause of mental retardation. The findings have several implications for clinical practice. The study revealed the potential to make genetic diagnoses that were not evident in the clinical presentation, with implications for pretest counselling and the consent process. The importance of contributing novel CNVs to high-quality databases for genotype-phenotype analysis, and of reviewing guidelines for the selection of individuals for microarray analysis, is emphasised.
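The size-based triage implied by the results (almost all pathogenic CNVs exceeded 500 kb) can be sketched as a simple filter; the CNV records below are hypothetical:

```python
# Hypothetical CNV calls: (chromosome, start_bp, end_bp, type).
cnvs = [
    ("chr7",  100_000,    750_000,    "loss"),
    ("chr15", 2_000_000,  2_120_000,  "gain"),
    ("chr22", 17_000_000, 19_500_000, "loss"),
]

def flag_large(calls, threshold_bp=500_000):
    """Keep calls whose span exceeds the threshold -- 500 kb, the size above
    which almost all pathogenic CNVs fell in this cohort."""
    return [c for c in calls if c[2] - c[1] > threshold_bp]

large = flag_large(cnvs)   # chr7 (650 kb) and chr22 (2.5 Mb) survive
```

Such a filter only prioritizes candidates; pathogenicity still requires comparison against curated databases, as the abstract notes.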


OBJECTIVE: In ictal scalp electroencephalogram (EEG), the presence of artefacts and the wide-ranging patterns of discharges are hurdles to good diagnostic accuracy. Quantitative EEG aids the lateralization and/or localization of epileptiform activity. METHODS: Twelve patients achieving Engel Class I/IIa outcome following temporal lobe surgery (1 year) were selected, with approximately 1-3 ictal EEGs analyzed per patient. The EEG signals were denoised with the discrete wavelet transform (DWT), followed by computation of the normalized absolute slopes and spatial interpolation of the scalp topography coupled with detection of local maxima. For localization, the region with the highest normalized absolute slopes at the time when epileptiform activities were registered (>2.5 times the standard deviation) was designated as the region of onset. For lateralization, the cerebral hemisphere registering the first appearance of normalized absolute slopes >2.5 times the standard deviation was designated as the side of onset. As a comparison, all the EEG episodes were reviewed by two neurologists blinded to clinical information to determine the localization and lateralization of seizure onset by visual analysis. RESULTS: 16/25 seizures (64%) were correctly localized by the visual method and 21/25 seizures (84%) by the quantitative EEG method. 12/25 seizures (48%) were correctly lateralized by the visual method and 23/25 seizures (92%) by the quantitative EEG method. The McNemar test showed p=0.15 for localization and p=0.0026 for lateralization when comparing the two methods. CONCLUSIONS: The quantitative EEG method yielded significantly more seizure episodes that were correctly lateralized, and there was a trend towards more correctly localized seizures. SIGNIFICANCE: Coupling the DWT with the absolute slope method helps clinicians achieve better EEG diagnostic accuracy.
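A minimal sketch of the thresholding step (normalized absolute slopes, >2.5 standard deviations) on synthetic two-channel data; the DWT denoising stage is omitted and the signals are invented:

```python
import numpy as np

def normalized_abs_slopes(eeg, fs):
    """Per-channel absolute slopes (|dV/dt|), z-normalized per channel."""
    slopes = np.abs(np.diff(eeg, axis=1)) * fs
    mu = slopes.mean(axis=1, keepdims=True)
    sd = slopes.std(axis=1, keepdims=True)
    return (slopes - mu) / sd

def first_channel_above(z, k=2.5):
    """Channel whose normalized slope first exceeds k standard deviations
    (the lateralization rule described above); None if no channel does."""
    hits = np.argwhere(z > k)               # (channel, sample) pairs
    if hits.size == 0:
        return None
    return int(hits[hits[:, 1].argmin(), 0])
```

With real multi-channel EEG the same rule would be applied per hemisphere rather than per channel.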


Today, Digital Systems and Services for Technology Supported Learning and Education are recognized as key drivers in transforming the way that individuals, groups and organizations “learn” and the way to “assess learning” in the 21st century. These transformations influence: Objectives - moving from acquiring new “knowledge” to developing new and relevant “competences”; Methods - moving from “classroom”-based teaching to “context-aware” personalized learning; and Assessment - moving from “life-long” degrees and certifications to “on-demand” and “in-context” accreditation of qualifications. Within this context, promoting Open Access to Formal and Informal Learning is currently a key issue in the public discourse and the global dialogue on Education, including Massive Open Online Courses (MOOCs) and Flipped School Classrooms. This volume on Digital Systems for Open Access to Formal and Informal Learning contributes to the international dialogue between researchers, technologists, practitioners and policy makers in Technology Supported Education and Learning. It addresses emerging issues related to both theory and practice, as well as methods and technologies that can support Open Access to Formal and Informal Learning. The twenty chapters, contributed by international experts who are actively shaping the future of Educational Technology around the world, present topics such as: the evolution of University Open Courses in Transforming Learning; supporting Open Access to Teaching and Learning of People with Disabilities; assessing Student Learning in Online Courses; Digital Game-based Learning for School Education; Open Access to Virtual and Remote Labs for STEM Education; Teachers’ and Schools’ ICT Competence Profiling; and Web-Based Education and Innovative Leadership in a K-12 International School Setting.
An in-depth blueprint of the promise, potential, and imminent future of the field, Digital Systems for Open Access to Formal and Informal Learning is necessary reading for researchers and practitioners, as well as undergraduate and postgraduate students, in educational technology.


By means of fixed-links modeling, the present study assessed processes involved in visual short-term memory functioning and investigated how these processes are related to intelligence. Using a color change detection task, short-term memory demands were increased across three experimental conditions as a function of the number of presented stimuli. We measured the amount of information retained in visual short-term memory by hit rate, and the speed of visual short-term memory scanning by reaction time. For both measures, fixed-links modeling revealed a constant process, reflecting processes irrespective of task manipulation, as well as two increasing processes reflecting the increasing short-term memory demands. For visual short-term memory scanning, a negative association between intelligence and the constant process was found, but no relationship between intelligence and the increasing processes. Thus, basic processing speed, rather than speed influenced by visual short-term memory demands, differentiates between high- and low-intelligence individuals. Intelligence was positively related to the experimental processes of short-term memory retention but not to the constant process. In sum, significant associations with intelligence were obtained only when the specific processes of short-term memory were decomposed, emphasizing the importance of a thorough assessment of cognitive processes when investigating their relation to intelligence.
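At the single-subject level, the split into a constant and a load-dependent component can be given a rough, non-latent flavor with ordinary least squares (fixed-links modeling itself is a latent-variable technique; the reaction times below are invented):

```python
import numpy as np

# Invented mean reaction times (ms) of one participant in the three
# set-size conditions of the color change detection task.
set_size = np.array([2.0, 4.0, 6.0])
rt_ms = np.array([520.0, 610.0, 700.0])

# Least-squares split of each condition mean into a constant component
# (intercept: load-independent processing speed) and a component growing
# with memory load (slope) -- a single-subject analogue of the constant
# vs. increasing latent factors estimated by fixed-links modeling.
X = np.column_stack([np.ones_like(set_size), set_size])
(intercept, slope), *_ = np.linalg.lstsq(X, rt_ms, rcond=None)
```

The study's finding then amounts to: the intercept-like component, not the slope-like one, correlates (negatively) with intelligence for scanning speed.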


Recent studies have identified relationships between landscape form, erosion and climate in regions of landscape rejuvenation associated with increased denudation. Most of these landscapes are located in non-glaciated mountain ranges and are characterized by transient geomorphic features. The landscapes of the Swiss Alps are likewise in a transient geomorphic state, as evidenced by multiple knickzones. In this mountain belt, the transient state has been related to erosional effects during the Last Glacial Maximum (LGM). Here, we focus on the catchment scale and categorize hillslopes based on erosional mechanisms, landscape form and land cover. We then explore relationships of these variables to precipitation and the extent of LGM glaciers to disentangle modern versus palaeo controls on the modern shape of the Alpine landscape. We find that in grasslands, the downslope flux of material mainly involves unconsolidated material through hillslope creep, attesting to a transport-limited erosional regime. Alternatively, strength-limited hillslopes, where erosion is driven by bedrock failure, are covered by forests and/or expose bedrock, and they display oversteepened hillslopes and channels. There, hillslope gradients and relief are more closely correlated with LGM ice occurrence than with precipitation or the erodibility of the underlying bedrock. We relate the spatial occurrence of the transport- and strength-limited process domains to the erosive effects of LGM glaciers. In particular, strength-limited, rock-dominated basins are situated above the equilibrium line altitude (ELA) of the LGM, reflecting the ability of glaciers to scour the landscape beyond threshold slope conditions. In contrast, transport-limited, soil-mantled landscapes are common below the ELA. Hillslopes covered by forests occupy the elevations around the ELA and are constrained by the tree line.
We conclude that the current erosional forces at work in the Central Alps are still responding to LGM glaciation, and that the modern climate has not yet impacted on the modern landscape.


The development of northern high-latitude peatlands played an important role in the carbon (C) balance of the land biosphere since the Last Glacial Maximum (LGM). At present, carbon storage in northern peatlands is substantial and estimated to be 500 ± 100 Pg C (1 Pg C = 10^15 g C). Here, we develop and apply a peatland module embedded in a dynamic global vegetation and land surface process model (LPX-Bern 1.0). The peatland module features a dynamic nitrogen cycle, a dynamic C transfer between peatland acrotelm (upper oxic layer) and catotelm (deep anoxic layer), hydrology- and temperature-dependent respiration rates, and peatland-specific plant functional types. Nitrogen limitation down-regulates average modern net primary productivity over peatlands by about half. Decadal acrotelm-to-catotelm C fluxes vary between -20 and +50 g C m^-2 yr^-1 over the Holocene. Key model parameters are calibrated with reconstructed peat accumulation rates from peat-core data. The model reproduces the major features of the peat-core data and of the observation-based modern circumpolar soil carbon distribution. Results from a set of simulations for possible evolutions of northern peat development and areal extent show that soil C stocks in modern peatlands increased by 365–550 Pg C since the LGM, of which 175–272 Pg C accumulated between 11 and 5 kyr BP. Furthermore, our simulations suggest a persistent C sequestration rate of 35–50 Pg C per 1000 yr in present-day peatlands under current climate conditions, and that this C sink could either be sustained or turn into a source by 2100 AD depending on climate trajectories as projected for different representative greenhouse gas concentration pathways.
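The bookkeeping behind such stock estimates is time-integration of net fluxes; the per-millennium rates below are invented, with only the order of magnitude (tens of Pg C per 1000 yr today) anchored in the abstract:

```python
import numpy as np

# Invented net carbon accumulation rates for northern peatlands, one value
# per millennium (Pg C per 1000 yr).
rates_pg_per_kyr = np.array([10.0, 20.0, 40.0, 50.0, 45.0, 40.0])

# The change in the peat soil-carbon stock over the window is the
# time-integral of the net flux (here: rate x 1-kyr bins).
stock_gain_pg = float(np.sum(rates_pg_per_kyr * 1.0))
```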


Apoptosis, a form of programmed cell death, is critical to homoeostasis, normal development, and physiology. Dysregulation of apoptosis can lead to the accumulation of unwanted cells, such as occurs in cancer, and to the removal of needed cells or disorders of normal tissues, as in heart, neurodegenerative, and autoimmune diseases. Noninvasive detection of apoptosis may play an important role in the evaluation of disease states and of the response to therapeutic intervention for a variety of diseases, so it is desirable to have an imaging method to accurately detect and monitor this process in patients. In this study, we developed annexin A5-conjugated polymeric micellar nanoparticles dual-labeled with a near-infrared fluorescence fluorophore (Cy7) and a radioisotope (111In), termed 111In-labeled annexin A5-CCPM. In vitro studies demonstrated that annexin A5-CCPM could strongly and specifically bind to apoptotic cells. In vivo studies showed that apoptotic tissues could be clearly visualized by both single photon emission computed tomography (SPECT) and fluorescence molecular tomography (FMT) after intravenous injection of 111In-labeled annexin A5-CCPM in 6 different apoptosis models. In contrast, there was little signal in the respective healthy tissues. All the biodistribution data confirmed the imaging results. Moreover, histological analysis revealed that the radioactivity count correlated with the fluorescence signal from the nanoparticles, and both signals co-localized with the region of apoptosis. In sum, 111In-labeled annexin A5-CCPM allowed visualization of apoptosis by both nuclear and optical imaging techniques. The complementary information acquired with multiple imaging techniques should be advantageous in improving diagnostics and the management of patients.


Virtual worlds have moved from being a geek topic to one of mainstream academic interest. This transition is contingent not only on the augmented economic, societal and cultural value of these virtual realities and their effect upon real life, but also on their convenience as fields for experimentation, for testing models and paradigms. User creation is, however, not something that has been transplanted from the real to the virtual world, but a phenomenon and a dynamic process that happens from within and is defined through complex relationships between the commercial and the non-commercial, the commodified and the non-commodified, the individual and the communal, the amateur and the professional, art and non-art. Accounting for this complex environment, the present paper explores user-created content in virtual worlds, its dimensions and value, and above all its constraints by code and law. It puts forward suggestions for better understanding and harnessing this creativity.


Digital technologies have often been perceived as imperilling traditional cultural expressions (TCE). This angst has interlinked technical and socio-cultural dimensions. On the technical side, it is related to the affordances of digital media that allow, among other things, instantaneous access to information without real location constraints, data transport at the speed of light and effortless reproduction of the original without any loss of quality. In a socio-cultural context, digital technologies have been regarded as the epitome of globalisation forces - not only driving and deepening the process of globalisation itself but also spreading its effects. The present article examines the validity of these claims and sketches a number of ways in which digital technologies may act as benevolent factors. We illustrate in particular that some digital technologies can be instrumentalised to protect TCE forms, reflecting more appropriately the specificities of TCE as a complex process of creation of identity and culture. The article also seeks to reveal that digital technologies - and more specifically the Internet and the World Wide Web - have had a profound impact on the ways cultural content is created, disseminated, accessed and consumed. We argue that this environment may have generated various opportunities for better accommodating TCE, especially in their dynamic sense of human creativity.


Increased pulmonary artery pressure is a well-known phenomenon of hypoxia and is seen in patients with chronic pulmonary diseases, and also in mountaineers on high-altitude expeditions. Different mediators are known to regulate pulmonary artery vessel tone. However, the exact mechanisms are not fully understood, and a multimodal process involving a whole panel of mediators is thought to cause pulmonary artery vasoconstriction. We hypothesized that increased hypoxemia is associated with an increase in vasoconstrictive mediators and a decrease in vasodilators, leading to a vasoconstrictive net effect. Furthermore, we proposed that oxidative stress is partly involved in the changes in these parameters. Oxygen saturation (Sao2) and clinical parameters were assessed in 34 volunteers before and during a Swiss research expedition to Mount Muztagh Ata (7549 m) in Western China. Blood samples were taken at four different sites up to an altitude of 6865 m. A mass spectrometry-based targeted metabolomic platform was used to detect multiple parameters, and revealed functional impairment of enzymes that require oxidation-sensitive cofactors. Specifically, the tetrahydrobiopterin (BH4)-dependent enzyme nitric oxide synthase (NOS) showed significantly lower activities (citrulline-to-arginine ratio decreased from a baseline median of 0.21 to 0.14 at 6265 m), indicating lower NO availability and hence less vasodilative activity. Correspondingly, an increase in systemic oxidative stress was found, with a significant increase in the percentage of methionine sulfoxide from a median of 6% under normoxic conditions to a median level of 30% (p<0.001) in camp 1 at 5533 m. Furthermore, significant increases in vasoconstrictive mediators (e.g., tryptophan, serotonin, and peroxidation-sensitive lipids) were found. During ascent up to 6865 m, significant altitude-dependent changes in multiple vessel-tone-modifying mediators, with an excess of vasoconstrictive metabolites, could be demonstrated. These changes, as well as the highly significant increase in systemic oxidative stress, may be predictive of increases in acute mountain sickness score and changes in Sao2.
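The two read-outs used above can be computed directly from metabolite concentrations; the concentrations in the example are invented, chosen only to reproduce the baseline ratio of 0.21 and the 30% sulfoxide level quoted in the abstract:

```python
def cit_arg_ratio(citrulline_um, arginine_um):
    """Citrulline-to-arginine ratio: a read-out of NOS activity, since NOS
    converts arginine into citrulline plus NO."""
    return citrulline_um / arginine_um

def met_so_percent(met_so_um, met_um):
    """Methionine sulfoxide as a percentage of total methionine, used as a
    marker of systemic oxidative stress."""
    return 100.0 * met_so_um / (met_so_um + met_um)

# Invented concentrations (micromolar).
ratio_baseline = cit_arg_ratio(21.0, 100.0)   # baseline median in the study: 0.21
oxidized_pct = met_so_percent(30.0, 70.0)     # camp-1 median in the study: 30%
```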


Compared to μ→eγ and μ→eee, the process μ→e conversion in nuclei receives enhanced contributions from Higgs-induced lepton flavor violation. Upcoming μ→e conversion experiments with drastically increased sensitivity will be able to put extremely stringent bounds on Higgs-mediated μ→e transitions. We point out that the theoretical uncertainties associated with these Higgs effects, encoded in the couplings of quark scalar operators to the nucleon, can be accurately assessed using our recently developed approach based on SU(2) chiral perturbation theory that cleanly separates two- and three-flavor observables. We emphasize that with input from lattice QCD for the coupling to strangeness f_s^N, hadronic uncertainties are appreciably reduced compared to the traditional approach where f_s^N is determined from the pion-nucleon σ term by means of an SU(3) relation. We illustrate this point by considering Higgs-mediated lepton flavor violation in the standard model supplemented with higher-dimensional operators, the two-Higgs-doublet model with generic Yukawa couplings, and the minimal supersymmetric standard model. Furthermore, we compare bounds from present and future μ→e conversion and μ→eγ experiments.
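For reference, the couplings of the quark scalar operators to the nucleon mentioned above are conventionally defined through nucleon matrix elements (a standard convention in this literature, not quoted from the abstract):

```latex
m_N\, f_q^N = \langle N |\, m_q\, \bar{q} q\, | N \rangle \quad (q = u, d, s),
\qquad
\sigma_{\pi N} = \hat{m}\, \langle N |\, \bar{u}u + \bar{d}d\, | N \rangle ,
\quad \hat{m} = \tfrac{1}{2}(m_u + m_d).
```

In this notation, the traditional approach fixes the strangeness coupling f_s^N from σ_{πN} via an SU(3) relation, whereas lattice QCD determines f_s^N directly.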


Histomorphometric evaluation of the buccal aspects of periodontal tissues in rodents requires reproducible alignment of maxillae and highly precise sections containing central sections of buccal roots; this is a cumbersome and technically sensitive process due to the small specimen size. The aim of the present report is to describe and analyze a method to transfer virtual sections of micro-computed tomographic (micro-CT)-generated image stacks to the microtome for undecalcified histological processing, and to describe the anatomy of the periodontium in rat molars. A total of 84 undecalcified sections of all buccal roots of seven untreated rats were analyzed. The accuracy of the section coordinate transfer from virtual micro-CT slice to histological slice, right-left side differences, and the measurement error for linear and angular measurements on micro-CT and on histological micrographs were calculated using the Bland-Altman method, the intraclass correlation coefficient and the method of moments estimator. Also, manual alignment of the micro-CT-scanned rat maxilla was compared with multiplanar computer-reconstructed alignment. The supra-alveolar rat anatomy is rather similar to human anatomy, whereas the alveolar bone is of the compact type and the keratinized gingival epithelium bends apically to join the junctional epithelium. The high methodological standardization presented herein ensures retrieval of histological slices with excellent display of anatomical microstructures in a reproducible manner, minimizes random errors, and thereby may contribute to a reduction in the number of animals needed.
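The Bland-Altman part of the error analysis reduces to a bias and 95% limits of agreement on paired differences; the paired measurements below are invented:

```python
import numpy as np

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods, per Bland-Altman."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = float(d.mean())
    half_width = 1.96 * float(d.std(ddof=1))
    return bias, bias - half_width, bias + half_width

# Invented paired linear measurements (mm): micro-CT vs. histology on the
# same sections.
ct_mm   = [1.20, 1.35, 1.10, 1.50, 1.25]
hist_mm = [1.18, 1.30, 1.12, 1.46, 1.24]
bias, lo, hi = bland_altman(ct_mm, hist_mm)
```

A bias near zero with narrow limits of agreement indicates that the virtual-to-histological coordinate transfer introduces little systematic or random error.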


Nitrous oxide (N2O) is an important greenhouse gas and ozone-depleting substance that has anthropogenic as well as natural marine and terrestrial sources. The tropospheric N2O concentrations have varied substantially in the past in concert with changing climate on glacial–interglacial and millennial timescales. It is not well understood, however, how N2O emissions from marine and terrestrial sources change in response to varying environmental conditions. The distinct isotopic compositions of marine and terrestrial N2O sources can help disentangle the relative changes in marine and terrestrial N2O emissions during past climate variations. Here we present N2O concentration and isotopic data for the last deglaciation, from 16,000 to 10,000 years before present, retrieved from air bubbles trapped in polar ice at Taylor Glacier, Antarctica. With the help of our data and a box model of the N2O cycle, we find a 30 per cent increase in total N2O emissions from the late glacial to the interglacial, with terrestrial and marine emissions contributing equally to the overall increase and generally evolving in parallel over the last deglaciation, even though there is no a priori connection between the drivers of the two sources. However, we find that terrestrial emissions dominated on centennial timescales, consistent with a state-of-the-art dynamic global vegetation and land surface process model that suggests that during the last deglaciation emission changes were strongly influenced by temperature and precipitation patterns over land surfaces. The results improve our understanding of the drivers of natural N2O emissions and are consistent with the idea that natural N2O emissions will probably increase in response to anthropogenic warming.
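The core of the isotopic source partitioning is two-end-member mixing; the δ values below are invented and stand in for whatever signatures the study's box model actually uses:

```python
def terrestrial_fraction(delta_atm, delta_terr, delta_marine):
    """Two-end-member mixing: fraction of total N2O emissions that is
    terrestrial, given the flux-weighted atmospheric source signature."""
    return (delta_atm - delta_marine) / (delta_terr - delta_marine)

# Invented delta-15N signatures (per mil).
f_terr = terrestrial_fraction(delta_atm=-5.0, delta_terr=-15.0, delta_marine=5.0)
# f_terr == 0.5: marine and terrestrial sources contribute equally
```

The box model in the study adds time-dependence and fractionation to this idea, but the lever is the same: a source signature between the two end members pins down their relative contributions.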


Clear cell renal cell carcinoma (ccRCC) characterized by a tumor thrombus (TT) extending into the inferior vena cava (IVC) generally indicates poor prognosis. Nevertheless, the risk of tumor recurrence after nephrectomy and thrombectomy varies. An applicable and accurate prediction system for identifying ccRCC patients with TT of the IVC (ccRCC/TT) at high risk after nephrectomy is urgently needed, but has not been established to date. To our knowledge, a possible role of microRNAs (miRs) in the development of ccRCC/TT, or their impact as prognostic markers in ccRCC/TT, has not yet been explored. Therefore, we analyzed the expression of the previously described onco-miRs miR-200c, miR-210, miR-126, miR-221, let-7b, miR-21, miR-143 and miR-141 in a study collective of 74 ccRCC patients. Using the expression profiles of these eight miRs, we developed classification systems that accurately differentiate ccRCC from non-cancerous renal tissue and ccRCC/TT from tumors without TT. In the subgroup of 37 ccRCC/TT cases we found that miR-21, miR-126 and miR-221 predicted cancer-related death (CRD) accurately and independently of other clinico-pathological features. Furthermore, a combined risk score (CRS) based on the expression of miR-21, miR-126 and miR-221 was developed and showed high sensitivity and specificity in predicting cancer-specific survival (CSS) in ccRCC/TT. Using the CRS, we were able to classify ccRCC/TT patients correctly into high- and low-risk cases. The risk stratification by the CRS will benefit from further cohort validation and might have potential for clinical application as a molecular prediction system to identify high-risk ccRCC/TT patients.
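Structurally, a combined risk score of this kind is a weighted sum of the three miR expression values dichotomized at a cutoff; the weights and cutoff below are invented, not the study's fitted values:

```python
def combined_risk_score(expression, weights, cutoff):
    """Weighted sum of miR expression values, dichotomized at a cutoff into
    high- vs. low-risk. Weights and cutoff here are illustrative only."""
    score = sum(w * expression[mir] for mir, w in weights.items())
    return score, ("high risk" if score >= cutoff else "low risk")

weights = {"miR-21": 0.8, "miR-126": -0.5, "miR-221": 0.6}  # invented
patient = {"miR-21": 2.0, "miR-126": 1.0, "miR-221": 1.5}   # invented
score, label = combined_risk_score(patient, weights, cutoff=1.5)
```

In practice the weights would come from a survival model (e.g. Cox regression) fitted on the cohort, and the cutoff from an ROC or maximally selected rank statistic.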