457 results for artifact
Abstract:
A cardiac-triggered, free-breathing, 3D balanced FFE projection renal MR angiography (MRA) technique with a 2D pencil-beam aortic labeling pulse for selective aortic spin tagging was developed. For respiratory motion artifact suppression during free breathing, a prospective real-time navigator was implemented for renal MRA. Images obtained with the new approach were compared with standard contrast-enhanced (CE) 3D breath-hold MRA in seven swine. Signal properties and vessel visualization were analyzed. The presented technique yielded high-resolution, high-contrast renal projection MRA with superior vessel length visualization, including a greater visible number of distal renal artery branches, compared to standard breath-hold CE-MRA. The present results warrant clinical studies in patients with renal artery disease.
Abstract:
Diffusion magnetic resonance studies of the brain are typically performed using volume coils. Although in the human brain this leads to a near-optimal filling factor, studies of the rodent brain must contend with the fact that only a fraction of the head volume can be ascribed to the brain. The use of a surface coil as a transceiver increases the signal-to-noise ratio (SNR), reduces radiofrequency power requirements and opens the possibility of parallel transmit schemes, which are likely to allow efficient acquisitions, of critical importance for reducing the long scan times involved in diffusion tensor imaging. This study demonstrates the implementation of a semiadiabatic echo planar imaging sequence (echo time = 40 ms, four interleaves) at 14.1 T using a quadrature surface coil as a transceiver. It resulted in artifact-free images with excellent SNR throughout the brain. Diffusion tensor-derived parameters obtained within the rat brain were in excellent agreement with reported values.
Abstract:
Background: It has been shown in a variety of organisms, including mammals, that genes that appeared recently in evolution, for example orphan genes, evolve faster than older genes. Low functional constraints at the time of origin of novel genes may explain these results. However, this observation has recently been attributed to an artifact caused by the inability of Blast to detect the fastest-evolving genes in different eukaryotic genomes. Distinguishing between these two possible explanations would be of great importance for any study dealing with the taxon distribution of proteins and the origin of novel genes. Results: Here we used simulations of protein sequences to examine the capacity of Blast to detect proteins of diverse evolutionary rates in the different species of a eukaryotic phylogenetic tree that included metazoans, fungi and plants. We simulated the evolution of protein genes with the same evolutionary rates as those observed in functional mammalian genes and with among-site rate heterogeneity. Under these conditions, we found that only a very small percentage of simulated ancestral eukaryotic proteins was affected by the Blast artifact. We show that the good detectability of Blast is due to the heterogeneity of protein evolutionary rates at different sites, since a small conserved motif in a sequence suffices to detect its homologues. Our results indicate that Blast, at least when applied within eukaryotes, only misses homologues of extremely fast-evolving sequences, which are rare in the mammalian genome, as well as sequences evolving homogeneously, or pseudogenes. Conclusion: Although great care should be exercised in the recognition of remote homologues, most functional mammalian genes can be detected in eukaryotic genomes by Blast. That is, the majority of functional mammalian genes do not evolve so fast as to escape detection in other metazoans, fungi or plants, had they been present in these organisms. Thus, the correlation previously found between age and rate does not seem to be a pure Blast artifact, at least in mammals. This may have important implications for understanding the mechanisms by which novel genes originate.
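The core of the simulation argument lends itself to a short illustration. The sketch below is not the authors' actual pipeline; the sequence length, gamma shape, branch length, window size and the identity proxy are all assumptions. It evolves a protein under gamma-distributed among-site rates and then measures the best-conserved window, since a single conserved motif is what seeds a Blast hit:

```python
# Sketch: evolve a protein with among-site rate heterogeneity and ask
# whether any window stays conserved enough to seed a Blast hit.
# All parameters (length, gamma shape, branch length, window size)
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
AA = np.arange(20)                       # 20 amino-acid alphabet
L, ALPHA, BRANCH = 300, 0.5, 2.0         # length, gamma shape, subs/site

ancestor = rng.choice(AA, size=L)
rates = rng.gamma(ALPHA, 1.0 / ALPHA, size=L)  # mean-1 per-site rates
p_sub = 1.0 - np.exp(-rates * BRANCH)          # P(site has changed)

descendant = ancestor.copy()
changed = rng.random(L) < p_sub
descendant[changed] = rng.choice(AA, size=changed.sum())

# Proxy for detectability: best %identity over a sliding window,
# since one conserved motif suffices to detect the homologue.
W = 30
ident = (ancestor == descendant).astype(float)
best = max(ident[i:i + W].mean() for i in range(L - W + 1))
print(f"best {W}-residue window identity: {best:.2f}")
```

With rate heterogeneity (small gamma shape), some window usually stays well conserved even at high average divergence; with homogeneous rates (large shape), identity decays uniformly and detection fails sooner, which is the behavior the abstract describes.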
Abstract:
EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which should be rejected or cleaned up by the practitioner. Since manual screening of human EEGs is inherently error-prone and might induce experimental bias, automatic artifact detection is an issue of importance: it is the best guarantee of objective and clean results. We present a new approach, based on the time-frequency shape of muscular artifacts, to achieve reliable and automatic scoring. This methodology makes it possible to evaluate the impact of muscular activity on the signal, placing the emphasis on the analysis of EEG activity. The method is used to discriminate evoked potentials from several types of recorded muscular artifacts, with a sensitivity of 98.8% and a specificity of 92.2%. Automatic cleaning of EEG data is then successfully realized using this method, combined with independent component analysis. The outcome of the automatic cleaning is then compared with the Slepian multitaper spectrum based technique introduced by Delorme et al (2007 Neuroimage 34 1443–9).
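The paper's exact time-frequency shape criterion is not reproduced here, but the general idea can be illustrated with a crude proxy: muscular artifacts spread broadband power above roughly 30 Hz, whereas cortical EEG concentrates at lower frequencies. A minimal sketch, with an assumed sampling rate, synthetic data and an arbitrary threshold:

```python
# Sketch of a time-frequency score for muscular contamination: the
# fraction of power above 30 Hz per time bin. This is a generic proxy,
# not the paper's actual shape-based criterion.
import numpy as np
from scipy.signal import spectrogram

fs = 256                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
eeg = rng.standard_normal(10 * fs)         # stand-in for one EEG channel

f, t, Sxx = spectrogram(eeg, fs=fs, nperseg=fs)
hf = Sxx[f > 30].sum(axis=0)               # broadband "muscle" band
score = hf / Sxx.sum(axis=0)               # fraction of power above 30 Hz
flagged = t[score > 0.5]                   # time bins likely contaminated
print(flagged)
```

In the approach summarized above, a score of this kind is combined with independent component analysis so that flagged muscular components can be removed rather than whole epochs rejected.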
Abstract:
In this paper, we present a comprehensive study of different Independent Component Analysis (ICA) algorithms for the calculation of the coherency and sharpness of electroencephalogram (EEG) signals, in order to investigate the possibility of early detection of Alzheimer's disease (AD). We found that ICA algorithms can help in artifact rejection and noise reduction, improving the discriminative power of features in high-frequency bands (especially in the high alpha and beta ranges). In addition to comparing different ICA algorithms, we investigate the optimum number of selected components, in order to inform decision processes in future work.
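As a rough illustration of such a pipeline (ICA-based cleaning followed by a coherency feature in the high-alpha/beta band), here is a minimal sketch; the channel count, band edges and component-rejection rule are placeholder assumptions, not the paper's settings:

```python
# Sketch: ICA cleaning, then magnitude-squared coherence between two
# channels in an assumed 10-30 Hz (high alpha + beta) band.
import numpy as np
from scipy.signal import coherence
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

fs, n_ch = 128, 4
rng = np.random.default_rng(2)
X = rng.standard_normal((n_ch, 60 * fs))      # stand-in EEG, channels x time

ica = FastICA(n_components=n_ch, random_state=0)
S = ica.fit_transform(X.T)                    # sources: time x components
S[:, np.argmax(kurtosis(S, axis=0))] = 0.0    # toy rule: drop most kurtotic comp
X_clean = ica.inverse_transform(S).T          # back to channel space

f, Cxy = coherence(X_clean[0], X_clean[1], fs=fs)
band = (f >= 10) & (f <= 30)                  # high alpha + beta range
print(f"mean 10-30 Hz coherence: {Cxy[band].mean():.3f}")
```

The kurtosis-based rejection rule here is only a common heuristic for artifact-like components; the study compares several ICA algorithms and numbers of retained components rather than fixing one rule.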
Abstract:
Staphylococcus aureus is an opportunistic pathogen whose infectious capacity depends on surface proteins, which enable the bacteria to colonize and invade host tissues and cells. We analyzed "trypsin-shaved" surface proteins of S. aureus cultures by high-resolution LC-MS/MS at different growth stages and culture conditions. Some modified peptides were identified, with a mass shift corresponding to the addition of a CH2O group (+30.0106 u). We present evidence that this shift corresponds to a hydroxymethylation of asparagine and glutamine residues. This known but poorly documented post-translational modification was only found in a few proteins of S. aureus grown under specific conditions; this specificity seems to exclude the hypothesis of an artifact due to sample preparation. Altogether, hydroxymethylation was observed in 35 peptides from 15 proteins in our dataset, corresponding to 41 modified sites, 35 of which were unequivocally localized. While no function can currently be assigned to this post-translational modification, we hypothesize that it could be linked to the modulation of virulence factors, since it was mostly found on some surface proteins of S. aureus.
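As a quick consistency check, the reported shift matches the monoisotopic mass of an added CH2O group:

$$m(\mathrm{CH_2O}) = 12.000000 + 2 \times 1.007825 + 15.994915 = 30.010565\ \mathrm{u},$$

which agrees with the observed +30.0106 u within rounding.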
Abstract:
In the main report concerning the role that magnesium may have in highway concrete aggregate, over 20,000 electron microprobe data points were obtained, primarily from automated scans, or traverses, across dolomite aggregate grains and the adjacent cement paste. Representative traverses were shown in figures and averages of the data were presented in Table II. In this Appendix, detailed representative and selected analyses of carbonate aggregate only are presented. These analyses were not presented in the main report because they would be of interest to only a few specialists in dolomite rocks. In this Appendix, individual point analyses of mineral compositions in the paste have been omitted, along with dolomite compositions at grain boundaries and cracks. Clay minerals and quartz inclusions in the aggregate are also not included. In the analyses, the first three column headings from left to right show line number, x-axis, and y-axis (line number is an artifact of the computer print-out for each new traverse; consecutive line numbers indicate a continuous traverse with distances between each point of 1.5 to a few μm; x-axis and y-axis are coordinates on the electron microscope stage). The next columns present the weight percent oxide contents of FeO, K2O, CaO, SiO2, Al2O3, MgO, SrO, BaO, MnO, Na2O, and CO2 (calculated assuming the number of moles of CO2 is equal to the sum of moles of oxides, chiefly CaO and MgO), TOTAL (the sum of all oxides), and total (the sum of all oxides excluding CO2). In many of the analyses total is omitted.
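The CO2 convention above can be made concrete with a small worked example. In the sketch below the oxide weight percents are hypothetical (close to stoichiometric dolomite); moles of CO2 are set equal to the summed moles of CaO and MgO:

```python
# Worked illustration of the stated CO2 convention: moles of CO2 equal
# the summed moles of the cation oxides (chiefly CaO and MgO).
# The wt% inputs are hypothetical, near-stoichiometric dolomite values.
M = {"CaO": 56.0774, "MgO": 40.3044, "CO2": 44.0095}  # molar masses, g/mol

wt = {"CaO": 30.4, "MgO": 21.7}                       # assumed wt% oxides
moles = {ox: wt[ox] / M[ox] for ox in wt}
co2_wt = sum(moles.values()) * M["CO2"]               # calculated wt% CO2
total_all = sum(wt.values()) + co2_wt                 # the TOTAL column
total_no_co2 = sum(wt.values())                       # the total column
print(f"CO2 = {co2_wt:.1f} wt%, TOTAL = {total_all:.1f}, total = {total_no_co2:.1f}")
```

For near-stoichiometric dolomite this yields roughly 47-48 wt% CO2 and a TOTAL near 100 wt%, while total (excluding CO2) sits near 52 wt%, which is the distinction between the two summary columns.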
Abstract:
When individuals learn by trial-and-error, they perform randomly chosen actions and then reinforce those actions that led to a high payoff. However, individuals do not always have to physically perform an action in order to evaluate its consequences. Rather, they may be able to mentally simulate actions and their consequences without actually performing them. Such fictitious learners can select actions with high payoffs without making long chains of trial-and-error learning. Here, we analyze the evolution of an n-dimensional cultural trait (or artifact) by learning, in a payoff landscape with a single optimum. We derive the stochastic learning dynamics of the distance to the optimum in trait space when choice between alternative artifacts follows the standard logit choice rule. We show that for both trial-and-error and fictitious learners, the learning dynamics stabilize at an approximate distance of $\sqrt{n/(2\lambda_e)}$ away from the optimum, where $\lambda_e$ is an effective learning performance parameter that depends on the learning rule under scrutiny. Individual learners are thus unlikely to reach the optimum when traits are complex (n large), and so face a barrier to further improvement of the artifact. We show, however, that this barrier can be significantly reduced in a large population of learners performing payoff-biased social learning, in which case $\lambda_e$ becomes proportional to population size. Overall, our results illustrate the effects of errors in learning, levels of cognition, and population size on the evolution of complex cultural traits.
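The stated $\sqrt{n/(2\lambda_e)}$ scaling can be checked with a toy simulation of the logit choice rule. The sketch below assumes a quadratic payoff (minus the squared distance to the optimum) and treats $\lambda$ as a fixed choice-intensity parameter; with symmetric random modifications, the logit acceptance rule has stationary density proportional to $e^{\lambda\pi(x)}$, so the artifact's distance settles near $\sqrt{n/(2\lambda)}$:

```python
# Toy trial-and-error learning with the logit choice rule on a
# single-optimum landscape. Payoff = -|x|^2 (optimum at the origin);
# dimension n, intensity lambda, and step size are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n, lam, sigma, steps = 10, 5.0, 0.1, 100_000

x = rng.standard_normal(n)                  # current artifact
for _ in range(steps):
    y = x + sigma * rng.standard_normal(n)  # candidate modification
    dpi = -np.sum(y**2) + np.sum(x**2)      # payoff difference
    if rng.random() < 1.0 / (1.0 + np.exp(-lam * dpi)):  # logit rule
        x = y

print(f"final distance:     {np.linalg.norm(x):.2f}")
print(f"sqrt(n/(2*lambda)): {np.sqrt(n / (2 * lam)):.2f}")
```

Raising n (more complex artifacts) pushes the equilibrium distance up, while raising the effective $\lambda$ (e.g. via payoff-biased social learning in a large population, as the abstract notes) pulls it toward the optimum.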
Abstract:
1. There is ample epidemiological and anecdotal evidence that a PFO increases the risk of stroke in both young and elderly patients, although only in a modest way: PFOs are more prevalent in patients with cryptogenic (unexplained) stroke than in healthy subjects, and are more prevalent in cryptogenic stroke than in strokes of other causes. Furthermore, multiple case series confirm an association of paradoxical embolism across a PFO in patients with deep vein thrombosis and/or pulmonary emboli.
2. Is stroke recurrence risk in PFO patients really not elevated compared to PFO-free patients, as suggested by traditional observational studies? This finding is an epidemiological artifact called "the paradox of recurrence risk research" (Dahabreh & Kent, JAMA 2011) and is due to one (minor) risk factor, such as PFO, being wiped out by other, stronger risk factors in the control population.
3. Having identified PFO as a risk factor for a first stroke and probably also for recurrences, we have to treat it, because treating risk factors has always paid off. No one would nowadays question the aggressive treatment of other stroke risk factors such as hypertension, atrial fibrillation, smoking, or hyperlipidemia.
4. In order to be effective, the preventive treatment has to control the risk factor (i.e. close the PFO effectively), and has to have little or no side effects. Both these conditions are now fulfilled thanks to the increasing expertise of cardiologists with technically advanced closure devices and solid back-up by multidisciplinary stroke teams.
5. Closing a PFO does not dispense us from treating other stroke risk factors aggressively, given that these are cumulative with PFO.
6. The most frequent reason why patients have a stroke recurrence after PFO closure is not that closure is ineffective, but that the initial stroke etiology was insufficiently investigated and not PFO-related, and that the recurrence is due to another mechanism because of poor risk factor control.
7. Similarly, the randomized CLOSURE study was negative because a) patients were included who had a low chance that their initial event was due to the PFO, b) patients were selected with a low chance that a PFO-related recurrence would occur, c) there was an unacceptably high rate of closure-related side effects, and d) the number of randomized patients was too small for a prevention trial.
8. It is only a question of time until a sufficiently large randomized clinical trial with true PFO-related stroke patients and a high PFO-related recurrence risk is performed and shows the effectiveness of closure.
9. PFO being a rather modest risk factor for stroke does not mean we should prevent our patients from getting the best available prevention by the best physicians in the best stroke centers.
Therefore, a PFO closure performed by an excellent cardiologist following the recommendation of an expert neurovascular specialist after a thorough workup in a leading stroke center is one of the most effective stroke prevention treatments available in 2011.
Abstract:
The aim of this research paper is to present a macroscopic study of the feasibility and efficiency of mobile devices in computing Least-Cost Paths (LCP). This kind of artifact must work in off-line mode and must allow loading data for a mountain zone, such as digital terrain models and meteorological data.
The research strategy has two steps:
- First, we need to identify the set of software components to implement inside the IT artifact. This set of components has to be able to perform LCP calculations, visualize results and present a well-adapted human interface. The main goal of this first step is to demonstrate the feasibility of a mobile geographic information system by following the "Design & Creation" research strategy.
- Second, the goal is to evaluate the reliability and usability of this IT artifact through an "Experiments" research approach. In this step we want to characterize the behavior of the artifact in terms of fidelity and LCP processing speed. This evaluation will be carried out by external users.
Throughout this paper, we will see that this kind of geographic information system (the IT artifact) has the minimal requirements needed to carry out LCP calculations on mobile devices, although it has several limitations and constraints in terms of usability and reliability. We will point out qualitative and quantitative elements related to the IT artifact's performance while doing this kind of computation.
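To make the kind of computation concrete: a least-cost path over a digital terrain model is typically computed with Dijkstra's algorithm on a raster grid, with a step cost that penalizes slope. The sketch below is a minimal illustration under assumed data (a random grid standing in for the DTM) and an assumed cost model, not the artifact's actual implementation:

```python
# Sketch: least-cost path over a small raster terrain grid using
# Dijkstra with a slope-penalized step cost. Grid, cost model and
# endpoints are illustrative assumptions.
import heapq
import numpy as np

rng = np.random.default_rng(4)
elev = rng.random((20, 20)) * 100          # stand-in digital terrain model
start, goal = (0, 0), (19, 19)

def neighbors(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < elev.shape[0] and 0 <= cc < elev.shape[1]:
            yield rr, cc

dist = {start: 0.0}
pq = [(0.0, start)]
while pq:
    d, node = heapq.heappop(pq)
    if node == goal:
        break
    if d > dist[node]:
        continue
    for nb in neighbors(*node):
        step = 1.0 + abs(elev[nb] - elev[node])   # slope-penalized cost
        nd = d + step
        if nd < dist.get(nb, float("inf")):
            dist[nb] = nd
            heapq.heappush(pq, (nd, nb))

print(f"least-cost distance to goal: {dist[goal]:.1f}")
```

On a mobile device the same search would run over tiled raster data loaded off-line, and the step cost could additionally weight meteorological layers of the kind mentioned above.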
Abstract:
The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective, or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modeling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis demonstrates the value of the ontology for business strategy and Information Systems (IS) alignment.
Structure of this thesis: The dissertation is structured in nine parts. Chapter 1 presents the motivation for this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models; it defines what is meant by business models in this dissertation, situates them in the context of the firm, and outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology; it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology in a prototype tool: the Business Model Modelling Language BM2L, an XML-based description language that allows the business model of a firm to be captured and described, and that has a large potential for further applications. Chapter 7 covers the evaluation of the business model ontology, building on a literature review, a set of interviews with practitioners, and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology, the main areas of interest being the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, chapter 9 presents some conclusions.
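As a toy illustration of the four-pillar decomposition described above, an instance of the ontology can be represented as a simple data structure; all element names and values below, beyond the four pillars themselves, are hypothetical placeholders rather than the ontology's actual vocabulary:

```python
# Toy sketch of the four-pillar business model decomposition as a data
# structure. Field contents are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class BusinessModel:
    product: dict = field(default_factory=dict)             # value proposition
    customer_interface: dict = field(default_factory=dict)  # customers, relationships
    infrastructure: dict = field(default_factory=dict)      # intra-/inter-firm setup
    finance: dict = field(default_factory=dict)             # profit model

bm = BusinessModel(
    product={"value_proposition": "curated live-music experience"},
    customer_interface={"target_customer": "festival audience"},
    infrastructure={"partners": ["sponsors", "venues"]},
    finance={"revenue_streams": ["tickets", "sponsorship"]},
)
print(bm.product["value_proposition"])
```

A serialization of such a structure to XML is essentially what the BM2L prototype described in chapter 6 captures.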
Abstract:
Evidence collected from smartphone users shows a growing desire for the personalization offered by services for mobile devices. However, the need to accurately identify users' contexts has important implications for user privacy, and it increases the amount of trust that users are asked to place in service providers. In this paper, we introduce a model that describes the role of personalization and control in users' assessment of the costs and benefits associated with the disclosure of private information. We present an instantiation of this model, a context-aware application for smartphones based on the Android operating system, in which users' private information is protected. Focus group interviews were conducted to examine users' privacy concerns before and after having used our application. The results obtained confirm the utility of our artifact and provide support for our theoretical model, which extends previous literature on privacy calculus and users' acceptance of context-aware technology.
Abstract:
Electroencephalographic (EEG) recordings are, more often than not, corrupted by spurious artifacts, which should be rejected or cleaned by the practitioner. As human scalp EEG screening is error-prone, automatic artifact detection is an issue of capital importance to ensure objective and reliable results. In this paper we propose a new approach for the discrimination of muscular activity in the human scalp quantitative EEG (QEEG), based on time-frequency shape analysis. The impact of muscular activity on the EEG can be evaluated with this methodology. We present an application of this scoring as a preprocessing step for EEG signal analysis, in order to evaluate the amount of muscular activity in two sets of EEG recordings: from dementia patients with early-stage Alzheimer's disease and from age-matched control subjects.