Abstract:
Introduction: Coronary magnetic resonance angiography (MRA) is a medical imaging technique that collects data from consecutive heartbeats, always at the same time in the cardiac cycle, in order to minimize heart motion artifacts. The technique relies on the assumption that the coronary arteries follow the same trajectory from heartbeat to heartbeat. Until now, the acquisition window in the cardiac cycle was chosen exclusively on the basis of the position of minimal coronary motion. The goal of this study was to test the hypothesis that there are time intervals during the cardiac cycle when coronary beat-to-beat repositioning is optimal. The repositioning uncertainty values in these intervals were then compared with the intervals of low coronary motion in order to propose an optimal acquisition window for coronary MRA. Methods: Cine breath-hold x-ray angiograms with synchronous ECG were collected from 11 patients who underwent elective routine diagnostic coronary angiography. Twenty-three bifurcations of the left coronary artery were selected as markers to evaluate repositioning uncertainty and velocity during the cardiac cycle. Each bifurcation was tracked by two observers with the help of a user-assisted algorithm implemented in Matlab (The MathWorks, Natick, MA, USA) that compared the trajectories of the markers from consecutive heartbeats and computed the coronary repositioning uncertainty in 50 ms steps up to 650 ms after the R-wave. Repositioning uncertainty was defined as the diameter of the smallest circle encompassing the points to be compared at the same time after the R-wave. Student's t-tests with a false discovery rate (FDR, q=0.1) correction for multiple comparisons were applied to test whether coronary repositioning and velocity vary statistically during the cardiac cycle. Bland-Altman plots and linear regression were used to assess intra- and inter-observer agreement.
Results: The analysis of left coronary artery beat-to-beat repositioning uncertainty shows a tendency toward better repositioning in mid systole (less than 0.84±0.58 mm) and mid diastole (less than 0.89±0.6 mm) than in the rest of the cardiac cycle (highest value at 50 ms: 1.35±0.64 mm). According to Student's t-tests with FDR correction for multiple comparisons (q=0.1), two intervals, in mid systole (150-200 ms) and mid diastole (550-600 ms), provide statistically better repositioning than early systole and early diastole. Coronary velocity analysis reveals that the left coronary artery moves more slowly in end systole (14.35±11.35 mm/s at 225 ms) and mid diastole (11.78±11.62 mm/s at 625 ms) than in the rest of the cardiac cycle (highest value at 25 ms: 55.96±22.34 mm/s). This was confirmed by Student's t-tests with FDR correction for multiple comparisons (q=0.1, FDR-corrected p-value=0.054): the coronary velocity values at 225, 575 and 625 ms do not differ much from one another but are statistically lower than all the others. Bland-Altman plots and linear regression show that intra-observer agreement (y=0.97x+0.02 with R²=0.93 at 150 ms) is better than inter-observer agreement (y=0.8x+0.11 with R²=0.67 at 150 ms). Discussion: The present study demonstrates that there are two time intervals in the cardiac cycle, one in mid systole and one in mid diastole, where left coronary artery repositioning uncertainty reaches local minima. Velocity was found to be lowest in end systole and mid diastole. Since systole is less influenced by heart rate variability than diastole, we propose testing an acquisition window between 150 and 200 ms after the R-wave.
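The repositioning uncertainty defined in the Methods, the diameter of the smallest circle enclosing corresponding marker positions from consecutive heartbeats, can be computed exactly for small point sets. The study used a user-assisted Matlab tool; the sketch below is an illustrative Python reimplementation of the metric only, using the fact that the smallest enclosing circle of a point set is always determined by two or three of the points, so a brute force over pairs and triples suffices:

```python
from itertools import combinations
import math

def circle_from_two(a, b):
    # Circle with segment a-b as diameter.
    cx, cy = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    return (cx, cy, math.dist(a, b) / 2)

def circle_from_three(a, b, c):
    # Circumcircle of three points; None if they are (nearly) collinear.
    d = 2 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1]))
    if abs(d) < 1e-12:
        return None
    ux = ((a[0]**2 + a[1]**2) * (b[1] - c[1]) + (b[0]**2 + b[1]**2) * (c[1] - a[1])
          + (c[0]**2 + c[1]**2) * (a[1] - b[1])) / d
    uy = ((a[0]**2 + a[1]**2) * (c[0] - b[0]) + (b[0]**2 + b[1]**2) * (a[0] - c[0])
          + (c[0]**2 + c[1]**2) * (b[0] - a[0])) / d
    return (ux, uy, math.dist((ux, uy), a))

def contains_all(circle, pts, eps=1e-9):
    cx, cy, r = circle
    return all(math.dist((cx, cy), p) <= r + eps for p in pts)

def repositioning_uncertainty(pts):
    """Diameter of the smallest circle enclosing the marker positions
    observed at the same delay after the R-wave in consecutive heartbeats."""
    if len(pts) < 2:
        return 0.0
    best = None
    for a, b in combinations(pts, 2):
        c = circle_from_two(a, b)
        if contains_all(c, pts) and (best is None or c[2] < best[2]):
            best = c
    for a, b, c3 in combinations(pts, 3):
        c = circle_from_three(a, b, c3)
        if c and contains_all(c, pts) and (best is None or c[2] < best[2]):
            best = c
    return 2 * best[2]

# Four hypothetical marker positions at the corners of a 2 mm square:
# the smallest enclosing circle has the diagonal as diameter, 2*sqrt(2) mm.
print(repositioning_uncertainty([(0, 0), (2, 0), (0, 2), (2, 2)]))  # ≈ 2.83
```

Note that the smallest enclosing circle is not in general the largest pairwise distance: for an acute triangle of points the circumcircle is needed, which is why the three-point case is checked as well.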
Abstract:
Gastrointestinal bleeding is among the major clinical challenges for gastroenterologists, and the initial approach is very complex. For a large proportion of bleeding lesions, it is important to perform endoscopic hemostasis after the introduction of an intravenous treatment (which must be started as soon as there is clinical suspicion of an upper gastrointestinal bleeding). The significant progress made in recent years has, firstly, made it possible to visualize the entire small bowel mucosa (video capsule) and, secondly, produced new treatments that have successfully replaced surgical interventions.
Abstract:
Introduction: Population aging leads to a considerable increase in the prevalence of specific diseases. We aimed to assess whether those changes were already reflected in an Internal Medicine ward. Methods: Anonymized data were obtained from the administrative database of the department of internal medicine of the Lausanne University Hospital (CHUV). All hospitalizations of adult (>=18 years) patients occurring between 2003 and 2011 were included. Infections, cancers and diseases according to body system (heart, lung...) were defined by the first letter of the ICD-10 code for the main cause of hospitalization. Specific diseases (myocardial infarction, heart failure...) were defined by the first three letters of the ICD-10 code for the main cause of hospitalization. Results: Data from 32,741 hospitalizations occurring between 2003 and 2011 were analyzed. Cardiovascular (ICD-10 code I) and respiratory (ICD-10 code J) diseases ranked first and second, respectively, and their ranks did not change during the study period (figure). Digestive and endocrine diseases decreased, while psychiatric diseases increased from rank 9 in 2003 to rank 6 in 2011 (figure). Among specific diseases, pneumonia (organism unspecified, code J18) ranked first in 2003 and second in 2011. Acute myocardial infarction (code I21) ranked second in 2003 and third in 2011. Chronic obstructive pulmonary disease with acute lower respiratory infection (code J44) ranked third in 2003 and decreased to rank 8 in 2011. Conversely, heart failure (code I50) increased from rank 8 in 2003 to rank 1 in 2011, and delirium (not induced by alcohol and other psychoactive substances, code F05) increased from below rank 20 in 2003 to rank 4 in 2011. For more details, see table. Conclusion: In less than 10 years, considerable changes occurred in the presentation of patients attending an Internal Medicine ward. These changes call for adaptations in hospital staff and logistics.
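The grouping rule described in the Methods (chapter-level diseases by the first letter of the ICD-10 code, specific diseases by the first three characters) is straightforward to reproduce. A minimal Python sketch with hypothetical, purely illustrative diagnosis codes (the study's real data are not reproduced here):

```python
from collections import Counter

# Hypothetical main-diagnosis codes, one per hospitalization (illustrative only)
codes = ["I21", "I50", "J18", "J44", "F05", "I50", "J18", "I21", "K57", "E11"]

# Body-system / chapter-level grouping: first letter of the ICD-10 code
by_chapter = Counter(code[0] for code in codes)

# Specific-disease grouping: first three characters of the ICD-10 code
by_disease = Counter(code[:3] for code in codes)

# Rank diseases by frequency, as in the yearly rankings reported above
ranking = [code for code, _ in by_disease.most_common()]

print(by_chapter)   # counts per chapter, e.g. 'I' (cardiovascular), 'J' (respiratory)
print(ranking)      # most frequent specific diseases first
```

Repeating the same counting per calendar year would reproduce the rank trajectories (e.g. heart failure, code I50, moving from rank 8 to rank 1) reported in the Results.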
Abstract:
In recent years, analysis of the genomes of many organisms has received increasing international attention. The bulk of the effort to date has centred on the Human Genome Project and analysis of model organisms such as yeast, Drosophila and Caenorhabditis elegans. More recently, the revolution in genome sequencing and gene identification has begun to impact on infectious disease organisms. Initially, much of the effort was concentrated on prokaryotes, but small eukaryotic genomes, including the protozoan parasites Plasmodium, Toxoplasma and trypanosomatids (Leishmania, Trypanosoma brucei and T. cruzi), as well as some multicellular organisms, such as Brugia and Schistosoma, are benefiting from the technological advances of the genome era. These advances promise a radical new approach to the development of novel diagnostic tools, chemotherapeutic targets and vaccines for infectious disease organisms, as well as to the more detailed analysis of cell biology and function. Several networks or consortia linking laboratories around the world have been established to support these parasite genome projects[1] (for more information, see http://www.ebi.ac.uk/parasites/paratable.html). Five of these networks were supported by an initiative launched in 1994 by the Special Programme for Research and Training in Tropical Diseases (TDR) of the WHO[2, 3, 4, 5, 6]. The Leishmania Genome Network (LGN) is one of these[3]. Its activities are reported at http://www.ebi.ac.uk/parasites/leish.html, and its current aim is to map and sequence the genome of Leishmania by the year 2002. All the mapping, hybridization and sequence data are also publicly available from LeishDB, an AceDB-based genome database (http://www.ebi.ac.uk/parasites/LGN/leissssoft.html).
Abstract:
Purpose: Animal models are essential to study pathological mechanisms and to test new therapeutic strategies. Many mouse models mimic human rod loss, but only a limited number simulate cone dystrophies. The importance of cone function for human vision highlights the need to engineer a model of cone degeneration. An approach of lentiviral-directed transgenesis was tested in mice to express a dominant mutant gene described in a human cone dystrophy. Methods: Lentiviral vectors (LV) encoding either hrGFPII or the human double mutant GUCY2DE837D/R838S cDNA under the control of a region of the pig arrestin-3 promoter (Arr3) were produced and used for lentiviral-derived transgenesis. PCR genotyping determined the transgenic mouse ratio. The expression of GFP was then analyzed both in vivo and by immunohistochemistry in Arr3-GFPII mice. Functional analysis was performed by ERG at 5, 9, 16 and 24 weeks for Arr3-GUCY2DE837D/R838S mice. Mice were sacrificed at 10 months of age for both histological analysis and RNA extraction. Results: While all the newborns from the transgenesis using the LV-Arr3-GFPII were transgenic, one third of the newborns from the LV-Arr3-GUCY2DE837D/R838S transgenesis were positive. Expression of GFPII was demonstrated by in vivo imaging, while expression of the mutant GUCY2D transcript was detected using RT-PCR. No severe alteration of the functional response was observed up to 24 weeks of age in the transgenic mice, and no obvious modification of the retinal morphology was identified either. Conclusions: Lentiviral-directed transgenesis is a rapid and straightforward method to engineer transgenic mice. Protein expression can be specifically targeted to the retina and thus could help to study the effect of expression of dominant mutant proteins. In our case, Arr3-GUCY2DE837D/R838S mice have a less severe phenotype than that described for human patients.
Further analyses are required to understand this difference, but several modifications of the expression cassette might also help to increase the expression of the mutant protein and reinforce the phenotype. Interestingly, the same construct is less effective in mouse than in pig retina (see Arsenijevic et al., ARVO 2011 abstract).
Abstract:
This final-year project deals with the different perspectives on and approaches to the Nazi Holocaust in literature and film. The Shoah, the term Jews use for the Holocaust, is one of the most important moments in European history. We would like to see whether this delicate subject is treated from different perspectives and approaches in literature and film. We would also like to find out which film is best for teaching what the Holocaust was to an adult, to a group of adolescents, to children, etc., and whether there is any film or novel that goes further, in the sense of treating the Holocaust from a symbolic rather than a realistic point of view, of Kafkaesque inspiration, as in the case of "L'incinerador de cadàvers". To find these different perspectives (more realist and verist, or more symbolic) we will need to examine various kinds of literature (biographical, semi-biographical and fictional), films (war films, dramas, biopics, etc.), articles... At the same time, we will analyse what different authors say about the Holocaust as an unnarratable event. We will compare the two or three points of view involved (the tortured, the torturers and the witnesses).
Abstract:
The purpose of this practicum is to monitor and assess the introduction of ICT in two secondary schools (IES) within the framework of the Xarxipèlag project of the Balearic Islands. The aim is to see how teachers have, or have not, integrated the use of ICT in their classrooms, as well as the impact of the training they have received.
Abstract:
The decision to publish educational materials openly and under free licenses raises the challenge of doing so in a sustainable way. Some lessons can be learned from the business models for the production, maintenance and distribution of Free and Open Source Software. The Free Technology Academy (FTA) has taken on these challenges and has implemented some of these models. We briefly review the FTA's educational programme, methodologies and organisation, and examine to what extent these models are proving successful in the case of the FTA.
Abstract:
Introduction: Estimating the time since death based on the gastric content remains a controversial subject. Many studies have been carried out, all leaving the same uncertainty: intra- and inter-individual variability. Aim: This review was prompted by a homicide case in which a specialized gastroenterologist was called upon to estimate the time of death based on the gastric contents and his experience in clinical practice. We therefore decided to review the scientific literature to see whether this method has become more reliable nowadays. Material and methods: We chose articles from 1979 onwards that describe the estimation of the gastric emptying rate according to several factors, together with forensic articles on estimating the time of death from the gastric content. Results: Most of the articles cited by the specialized gastroenterologist were studies of living healthy people and the effects of several factors (medication, supine versus upside-down position, body mass index or different types of food). Forensic articles frequently concluded that estimation of the time since death by analyzing the gastric content can be used, but not as the sole method. Conclusion: Estimation of the time since death by analysis of the gastric contents is a method that can still be used nowadays. However, it cannot be the only method, as inter- and intra-individual variability remains an important source of bias.
Abstract:
Institutional repositories are one of the fundamental instruments of scholarly communication. In recent years we have observed rapid growth both in the number of repositories and in the volume of their contents. At the same time, it is clear that efforts are being concentrated on increasing their visibility and on encouraging authors to deposit new works. By contrast, policies and actions that anticipate and ensure the sustainability and future preservation of repositories are lacking. This paper analyses the current situation, at the Spanish and international levels, of actions in favour of the preservation of institutional repositories, including the software used and the existence of explicit policies. Finally, it sets out the minimum elements that an action plan should contain to ensure the sustainability and future preservation of any archive.
Abstract:
Priorities for museums are changing. The mission of the new museology is to turn museums into places for enjoyment and learning, which means they must manage their finances much like a social enterprise competing in the leisure sector. Over time, museums have had to establish and apply the criteria necessary for survival, paving the way for other public institutions to be more open in their efforts to communicate and disseminate their heritage. We can already speak of some commonly accepted conclusions about visitor behaviour, which are needed to plan future exhibitions that see learning as a constructive process, collections as objects with meaning, and exhibitions themselves as communication media that should transform the viewer's way of thinking and that serve the message itself. The internet seems to be an effective medium for achieving these goals, since it is able (a) to adapt to the interests and intellectual characteristics of a diverse public; (b) to rediscover the meanings of objects and gain sociocultural recognition of their value through its interactive potential; and (c) to use attractive, stimulating elements that everyone can enjoy. To this end, it is essential to ask the following questions: What criteria should a virtual museum follow to optimize the dissemination of its heritage? What elements encourage users to stay on a website and make virtual visits that they find satisfying? What role does the usability of the application play in all this?
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
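One of the parametrizations mentioned, the power transformation, can be made concrete with the Box-Cox family, which interpolates continuously between an untransformed (linear) scale and a logarithmic one; this is the kind of mechanism by which a correspondence-analysis-style method can be deformed smoothly toward a logratio analysis. A minimal sketch (the function name and the illustrative value are my own, not from the presentation):

```python
import math

def box_cox(x, alpha):
    """Box-Cox power transform: (x**alpha - 1)/alpha.
    As alpha -> 0 this converges to log(x), its limiting case."""
    if abs(alpha) < 1e-12:
        return math.log(x)
    return (x ** alpha - 1.0) / alpha

# Varying alpha in small steps, "frame by frame": at alpha = 1 the transform
# is linear (x - 1); as alpha shrinks the values approach log(x).
x = 2.5
for alpha in (1.0, 0.5, 0.1, 0.01, 0.001):
    print(alpha, box_cox(x, alpha))
print("log(2.5) =", math.log(2.5))
```

Recomputing a map at each such step is exactly what produces the smooth "movie" between two methods described above.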
Abstract:
The chapter provides an account of the changing role played by active labour market policies (ALMPs) in Europe since the post-war years. Focusing on six countries (Sweden, Denmark, France, Germany, Italy, and the United Kingdom), it shows that the role of ALMPs is related to the broad economic situation. At times of rapid expansion and labour shortage, like the 1950s and 1960s, their key objective was to upskill the workforce. After the oil shocks of the 1970s, the raison d'être of ALMPs shifted from economic to social policy, and since the mid-1990s we see the development of a new function, well captured by the notion of activation, which refers to the strengthening of work incentives and the removal of obstacles to employment, mostly for low-skilled people. The fit between economic context and policy is not always optimal, though. Like others, this policy domain suffers from inertia, with the result that the countries that have led the way in one period have more difficulty adapting to the economic conditions prevailing in the following one.
Abstract:
Factor analysis, a frequent technique for multivariate data inspection, is also widely used for compositional data analysis. The usual approach is to apply a centred logratio (clr) transformation to obtain the random vector y of dimension D. The factor model is then

y = Λf + e (1)

with the factors f of dimension k ≪ D, the error term e, and the loadings matrix Λ. Under the usual model assumptions (see, e.g., Basilevsky, 1994), the factor analysis model (1) can be written as

Cov(y) = ΛΛ^T + ψ (2)

where ψ = Cov(e) has diagonal form. The diagonal elements of ψ, as well as the loadings matrix Λ, are estimated from an estimate of Cov(y). Let the observed clr-transformed data Y be realizations of the random vector y. Outliers or deviations from the idealized model assumptions of factor analysis can severely affect the parameter estimation. As a way out, robust estimation of the covariance matrix of Y will lead to robust estimates of Λ and ψ in (2); see Pison et al. (2003). Well-known robust covariance estimators with good statistical properties, like the MCD or the S-estimators (see, e.g., Maronna et al., 2006), rely on a full-rank data matrix Y, which is not the case for clr-transformed data (see, e.g., Aitchison, 1986). The isometric logratio (ilr) transformation (Egozcue et al., 2003) solves this singularity problem. The data matrix Y is transformed to a matrix Z by using an orthonormal basis of lower dimension. Using the ilr-transformed data, a robust covariance matrix C(Z) can be estimated. The result can be back-transformed to the clr space by

C(Y) = V C(Z) V^T

where the matrix V with orthonormal columns comes from the relation between the clr and the ilr transformation. Now the parameters in model (2) can be estimated (Basilevsky, 1994), and the results have a direct interpretation, since the links to the original variables are preserved. The above procedure will be applied to data from geochemistry. Our special interest is in comparing the results with those of Reimann et al. (2002) for the Kola project data.
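The singularity of clr-transformed data and its repair by the ilr transformation can be checked numerically. The sketch below (NumPy, with simulated compositions and a Helmert-type orthonormal basis as one standard choice; the abstract does not fix a particular basis) shows that the clr covariance matrix has rank D-1 rather than D, that the ilr covariance has full rank in its D-1 dimensions, and that the back-transformation C(Y) = V C(Z) V^T recovers the clr covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical compositional data: 10 samples, D = 4 parts summing to 1
X = rng.dirichlet(np.ones(4), size=10)

# Centred logratio (clr): log of each part minus the row mean; rows sum to zero
L = np.log(X)
clr = L - L.mean(axis=1, keepdims=True)

# Orthonormal basis V (D x (D-1)) of the zero-sum subspace via Helmert-type
# contrasts: column j has ones in rows 0..j and -(j+1) in row j+1
D = X.shape[1]
H = np.zeros((D, D - 1))
for j in range(D - 1):
    H[: j + 1, j] = 1.0
    H[j + 1, j] = -(j + 1)
V = H / np.linalg.norm(H, axis=0)   # orthonormal columns, each summing to 0

# Isometric logratio (ilr): project clr coordinates onto the basis
ilr = clr @ V

# clr covariance is singular (rank D-1 = 3, not D = 4);
# ilr covariance has full rank 3 in its 3 dimensions
print(np.linalg.matrix_rank(np.cov(clr.T)))
print(np.linalg.matrix_rank(np.cov(ilr.T)))

# Back-transformation to the clr space: C(Y) = V C(Z) V^T
C_clr_back = V @ np.cov(ilr.T) @ V.T
print(np.allclose(C_clr_back, np.cov(clr.T)))  # True
```

In the robust setting described above, np.cov would simply be replaced by a robust estimator (e.g. MCD) applied to the full-rank ilr coordinates before back-transforming.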