961 results for Three-state Potts model
Abstract:
INTRODUCTION: Cartilage defects are common pathologies, and surgical cartilage repair shows promising results. For its postoperative evaluation, the magnetic resonance observation of cartilage repair tissue (MOCART) score, which uses different variables to describe the constitution of the cartilage repair tissue and the surrounding structures, is widely used. High-field magnetic resonance imaging (MRI) and 3-dimensional (3D) isotropic sequences may combine ideal preconditions to enhance the diagnostic performance of cartilage imaging. The aim of this study was to introduce an improved 3D MOCART score that exploits the possibilities of an isotropic 3D true fast imaging with steady-state precession (True-FISP) sequence in the postoperative evaluation of patients after matrix-associated autologous chondrocyte transplantation (MACT), and to compare the results with those of the conventional 2D MOCART score based on standard MR sequences. MATERIAL AND METHODS: The study was approved by the local ethics commission. One hundred consecutive MR scans in 60 patients at standard follow-up intervals of 1, 3, 6, 12, 24, and 60 months after MACT of the knee joint were prospectively included. The mean follow-up interval of this cross-sectional evaluation was 21.4 +/- 20.6 months; the mean age of the patients was 35.8 +/- 9.4 years. MRI was performed on a 3.0 Tesla unit. All variables of the standard 2D MOCART score were part of the new 3D MOCART score. Furthermore, additional variables and options were included with the aims of using the capabilities of isotropic MRI, incorporating the results of recent studies, and adapting to the needs of patients and physicians in clinical routine examinations. A proton-density turbo spin-echo sequence, a T2-weighted dual fast spin-echo (dual-FSE) sequence, and a T1-weighted turbo inversion recovery magnitude (TIRM) sequence were used to assess the standard 2D MOCART score; an isotropic 3D True-FISP sequence was prepared to evaluate the new 3D MOCART score. All 9 variables of the 2D MOCART score were compared with the corresponding variables obtained by the 3D MOCART score using the Pearson correlation coefficient; additionally, the subjective quality and possible artifacts of the MR sequences were analyzed. RESULTS: For the 8 variables "defect fill," "cartilage interface," "surface," "adhesions," "structure," "signal intensity," "subchondral lamina," and "effusion," the correlation between the standard 2D MOCART score and the new 3D MOCART score was highly significant (P < 0.001), with Pearson coefficients between 0.566 and 0.932. The variable "bone marrow edema" correlated significantly (P < 0.05; Pearson coefficient: 0.257). The subjective quality of the 3 standard MR sequences was comparable to that of the isotropic 3D True-FISP sequence; however, artifacts were more frequently visible in the 3D True-FISP sequence. CONCLUSION: In the clinical routine follow-up after cartilage repair, the 3D MOCART score, assessed with only 1 high-resolution isotropic MR sequence, provides information comparable to the standard 2D MOCART score. Hence, the new 3D MOCART score has the potential to combine the information of the standard 2D MOCART score with the possible advantages of isotropic 3D MRI at high field. A clear limitation of the 3D True-FISP sequence was the high number of artifacts. Future studies have to prove the clinical benefits of the 3D MOCART score.
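As a minimal illustration of the statistical comparison described above (not the study's actual data), the Pearson coefficient between paired 2D and 3D ratings of a single MOCART variable can be computed as follows; the variable name and all scores are hypothetical placeholders:

    # Minimal sketch of the per-variable 2D vs. 3D comparison; hypothetical data.
    from math import sqrt

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length rating lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical paired ratings of one variable ("defect fill") from the
    # 2D protocol and the isotropic 3D protocol.
    scores_2d = [4, 3, 4, 2, 5, 3, 4, 4]
    scores_3d = [4, 3, 3, 2, 5, 3, 4, 5]
    print(f"r = {pearson(scores_2d, scores_3d):.3f}")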
Abstract:
BACKGROUND: In the acute respiratory distress syndrome, the amount of potentially recruitable lung volume is currently under debate. Helium-3 magnetic resonance imaging (³He-MRI) offers the possibility of visualizing alveolar recruitment directly. METHODS: With the approval of the state animal care committee, unilateral lung damage was induced in seven anesthetized pigs by saline lavage of the right lung; the left lung served as an intraindividual control (healthy lung). Unilateral lung damage was confirmed by conventional proton MRI and spiral CT scanning. The total aerated lung volume was determined at positive end-expiratory pressures (PEEP) of 0 and 10 mbar from three-dimensionally reconstructed ³He images, for both healthy and damaged lungs. The fractional increase of aerated volume following the PEEP increase from 0 to 10 mbar was compared between damaged and healthy lungs. RESULTS: The aerated gas space was visualized with high spatial resolution in the three-dimensionally reconstructed ³He-MR images, and aeration defects in the lavaged lungs matched the regional distribution of atelectasis in proton MRI. After recruitment and PEEP increase, the aerated volume increased significantly both in healthy lungs, from 415 ml [270-445] (median [min-max]) to 481 ml [347-523], and in lavaged lungs, from 264 ml [71-424] to 424 ml [129-520]. The fractional increase in lavaged lungs was significantly larger than that in healthy lungs (healthy: 17% [11-38] vs. lavage: 42% [14-90]; P=0.031). CONCLUSION: The ³He-MRI signal might offer an experimental approach to discriminate atelectatic from poorly aerated lung areas in an animal model of lung damage. Our results confirm the presence of potentially recruitable lung volume, caused by either alveolar collapse or alveolar flooding, in accordance with previous reports based on computed tomography.
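As a point of arithmetic, the fractional increase discussed above is simply (volume at PEEP 10 - volume at PEEP 0) / (volume at PEEP 0), computed per animal. A minimal sketch using the quoted median volumes (the study reports medians of the per-animal fractions, so its percentages differ from these):

    # Fractional increase in aerated volume after raising PEEP from 0 to 10 mbar,
    # computed from the median volumes quoted above (the study itself reports
    # medians of the per-animal fractions, so its values differ slightly).
    def fractional_increase(v_peep0, v_peep10):
        return (v_peep10 - v_peep0) / v_peep0

    print(f"healthy: {fractional_increase(415, 481):.0%}")  # ~16%
    print(f"lavaged: {fractional_increase(264, 424):.0%}")  # ~61%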
Abstract:
Dynamic models for electrophoresis are based on model equations derived from transport concepts in solution together with user-specified conditions. They can predict the movement of ions theoretically and are as such the most versatile tool for exploring the fundamentals of electrokinetic separations. Since the inception of dynamic computer simulation three decades ago, the software and its use have progressed significantly, and Electrophoresis has played a pivotal role in that endeavor, as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF, and EKC, and their combinations, under almost exactly the same conditions used in the laboratory. It has been employed to show the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and in any instrumental format, including free-fluid preparative, gel, capillary, and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples, and a discussion of the applications and achievements of dynamic simulation.
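To make the transport idea concrete, the sketch below advects and diffuses a single analyte zone in one dimension under a constant electric field. It is a toy illustration of the governing equation ∂c/∂t = -μE ∂c/∂x + D ∂²c/∂x², not any particular simulator; real dynamic simulators solve coupled equations for all ionic species with the local field and pH computed self-consistently, and all parameter values here are merely plausible assumptions:

    # 1D electromigration + diffusion of one analyte zone, explicit scheme.
    import numpy as np

    nx, dx, dt = 400, 1e-5, 1e-3       # grid points, grid spacing (m), time step (s)
    mu, D = 3e-8, 1e-9                 # mobility (m^2/V/s), diffusivity (m^2/s)
    E = 5e3                            # electric field (V/m)
    x = np.arange(nx) * dx
    c = np.exp(-((x - 1e-3) / 1e-4) ** 2)   # initial Gaussian analyte zone

    v = mu * E                         # electromigration velocity
    for _ in range(2000):
        # upwind advection + central diffusion (both stable for these values)
        adv = -v * (c - np.roll(c, 1)) / dx
        dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c += dt * (adv + dif)

    print(f"zone peak now at x = {x[np.argmax(c)]*1e3:.2f} mm")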
Abstract:
This article gives an overview of trends in policy on the functions and role of social work in French society over the past twenty years. The author suggests several reasons for the current feeling of a crisis of professional identity among professionals and for the complexity of the political demands made on social work. These are analysed as a consequence of the specific French context of decentralisation of the State and of the multitude of approaches to organisation and to professional training programs. On a broader level, it seems that French social work reflects many characteristics of "modernity": difficulty in defining a clear identity, the lack of a clear territorial and institutional base (or "disembeddedness", to borrow Giddens' term), and difficulty in finding a clear voice to make its values heard in an increasingly politicised arena of public debate.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element of the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test's open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, the web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market (social networking sites, video forums, and virtual worlds) promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1) and the specific merits of these two distinct approaches (section 2) will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
Reduced glutathione (GSH) protects cells against injury by oxidative stress and maintains a range of vital functions. In vitro cell cultures have been used as experimental models to study the role of GSH in chemical toxicity in mammals; however, this approach has rarely been used with fish cells to date. The present study aimed to evaluate the sensitivity and specificity of three fluorescent dyes for measuring pro-oxidant-induced changes in GSH content in fish cell lines: monochlorobimane (mBCl), 5-chloromethylfluorescein diacetate (CMFDA), and 7-amino-4-chloromethylcoumarin (CMAC-blue). Two cell lines were studied: the EPC line, established from a skin tumour of the carp Cyprinus carpio, and the BF-2 line, established from fins of the bluegill sunfish Lepomis macrochirus. The cells were exposed for 6 and 24 h to low cytotoxic concentrations of pro-oxidants, including hydrogen peroxide, paraquat (PQ), copper, and the GSH synthesis inhibitor L-buthionine-SR-sulfoximine (BSO). The results indicate moderate differences in the GSH response between EPC and BF-2 cells, but distinct differences in the magnitude of the GSH response across the four pro-oxidants. Further, the choice of GSH dye can critically affect the results, with CMFDA appearing to be less specific for GSH than mBCl and CMAC-blue.
Abstract:
With a steady increase of regulatory requirements for business processes, automation support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we reflect on and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Based on the search results, we propose a roadmap for further research in model-based business process compliance checking.
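For concreteness, one common kind of compliance rule such approaches handle is a "response" rule: every occurrence of activity A must eventually be followed by activity B. The sketch below checks this over execution traces; the rule, activity names, and traces are invented for illustration, and real approaches typically verify such rules against process models, for example via temporal logic and model checking:

    # Check a "response" compliance rule over invented execution traces.
    def satisfies_response(trace, a, b):
        """True if every occurrence of a is eventually followed by a b."""
        expecting_b = False
        for activity in trace:
            if activity == a:
                expecting_b = True
            elif activity == b:
                expecting_b = False
        return not expecting_b

    traces = [
        ["receive_order", "check_credit", "approve", "ship"],
        ["receive_order", "check_credit", "ship"],   # missing "approve"
    ]
    for t in traces:
        print(satisfies_response(t, "check_credit", "approve"), t)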
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands to gain particularly from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O'Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and access to the relevant methodology of related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach and the weighted phylogenetic approach) and no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this we mean that there is a need for a statistical, empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have differed from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three 'artificial traditions' (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding 'trivial' variation such as orthographic and spelling changes from stemmatic analysis.
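One simple, well-known way to score a variant location against a stemma hypothesis, offered here purely as an illustrative sketch and not as the authors' method, is Fitch parsimony: count the minimum number of reading changes the tree requires, and compare it with the number of distinct readings minus one. A variant meeting that minimum is consistent with a single genealogical origin per reading. The stemma and readings below are invented:

    # Fitch parsimony on an invented stemma with invented variant readings.
    def fitch_changes(tree, readings, node):
        """Return (possible_states, change_count) for the subtree at node."""
        children = tree.get(node, [])
        if not children:                  # leaf: its reading is known
            return {readings[node]}, 0
        sets, changes = [], 0
        for child in children:
            s, c = fitch_changes(tree, readings, child)
            sets.append(s)
            changes += c
        inter = set.intersection(*sets)
        if inter:
            return inter, changes
        return set.union(*sets), changes + 1

    # Hypothetical stemma: archetype X with hyparchetypes Y and Z.
    tree = {"X": ["Y", "Z"], "Y": ["A", "B"], "Z": ["C", "D"]}
    readings = {"A": "sun", "B": "sun", "C": "moon", "D": "moon"}
    states, changes = fitch_changes(tree, readings, "X")
    print(changes)   # 1 change for 2 readings: consistent with the stemma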
Abstract:
Aeromonas salmonicida subsp. salmonicida is an important pathogen in salmonid aquaculture and is responsible for typical furunculosis. The type-three secretion system (T3SS) is one of its major virulence systems. In this work, we review the structure and function of this highly sophisticated nanosyringe in A. salmonicida. Based on the literature as well as personal experimental observations, we document the genetic (re)organization, expression regulation, anatomy, putative functional origin, and roles in the infectious process of this T3SS. We propose a model of pathogenesis in which A. salmonicida induces a state of temporary immunosuppression in fish in order to gain free access to host tissues. Finally, we highlight potentially important therapeutic and vaccine strategies to prevent furunculosis in salmonid fish.
Abstract:
The aim of the study was to determine the effects of the growth of the welfare state on the labor market of the Federal Republic of Germany. For this purpose, 2,171 men and women of the birth cohorts 1929-31, 1939-41, and 1949-51 were surveyed between 1981 and 1983 in standardized interviews about their life courses, specifically with regard to social origin, education, employment, family, and mobility. Of particular interest was the question of employment in the public sector versus the private sector, and whether and when a change from one sector to the other had taken place. Some results: the educational expansion of the 1970s led to an increasing number of university graduates finding employment in the public sector; since the stagnation of the welfare state from around 1980, employment opportunities for highly qualified labor market entrants there have declined again. In the private sector, unskilled labor is being replaced by more highly qualified labor, whereas in the public sector, with a smaller supply of jobs, fewer highly qualified entrants are in demand.
Abstract:
In order to overcome the limitations of the linear-quadratic model and to include synergistic effects of heat and radiation, a novel radiobiological model is proposed. The model is based on a chain of cell populations which are characterized by the number of radiation-induced damages (hits). Cells can shift downward along the chain by collecting hits and upward through a repair process. The repair process is governed by a repair probability which depends on state variables used for a simplistic description of the impact of heat and radiation on repair proteins. With the parameters used, populations with up to 4-5 hits are relevant for the calculation of survival. The model intuitively describes the mathematical behaviour of apoptotic and nonapoptotic cell death. The linear-quadratic-linear behaviour of the logarithmic cell survival, fractionation, and (with one exception) the dose-rate dependencies are described correctly. The model covers the dependence of the synergistic cell killing on the time gap between combined applications of heat and radiation, but further validation of the proposed approach against experimental data is needed. Nevertheless, the model offers a workbench for testing different biological concepts of damage induction, repair, and statistical approaches for calculating the state variables.
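A minimal numerical sketch of the hit-chain idea follows; all rate constants are invented for illustration, and the paper's repair probability additionally depends on state variables describing the effects of heat and radiation on repair proteins, which this toy version omits:

    # Cell populations indexed by number of hits; damage shifts cells down
    # the chain, repair shifts them up. Invented constants, explicit Euler.
    import numpy as np

    n_hits = 5                 # track populations with 0..5 hits
    damage_rate = 0.8          # hits per unit dose
    repair_prob = 0.3          # per-time repair rate of one hit
    dose_rate, dt, steps = 1.0, 0.01, 500

    p = np.zeros(n_hits + 1)
    p[0] = 1.0                 # start with all cells undamaged
    for _ in range(steps):
        hit = damage_rate * dose_rate * dt * p   # flow k -> k+1
        rep = repair_prob * dt * p               # flow k -> k-1
        rep[0] = 0.0                             # nothing to repair at 0 hits
        p = p - hit - rep
        p[1:] += hit[:-1]                        # damaged cells gain a hit
        p[:-1] += rep[1:]                        # repaired cells lose a hit
        # cells pushed beyond the last tracked class count as inactivated

    print(f"surviving fraction after dose {dose_rate*dt*steps:.1f}: {p.sum():.3f}")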
Abstract:
Global wetlands are believed to be climate sensitive and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and the corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol, driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or for global simulations. The models also varied in their methods of calculating wetland size and location: some models simulate wetland area prognostically, while others rely on remotely sensed inundation datasets or take an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models shows extensive disagreement in simulated wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that determine wetland area independently. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, globally spatially uniform), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, globally spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th-century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently do not have wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate because of extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland area are accounted for.
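As a small illustration of the spread metric quoted above, each model's deviation from the all-model mean can be expressed as a percentage of that mean; the per-model values below are invented and only loosely mirror the reported mean of about 190 Tg CH4 per year:

    # Inter-model spread as percentage deviation from the all-model mean.
    # The per-model emission values are invented for illustration.
    emissions = [115, 140, 165, 180, 190, 200, 215, 230, 250, 265]  # Tg CH4/yr
    mean = sum(emissions) / len(emissions)
    spread = [(e - mean) / mean for e in emissions]
    print(f"all-model mean: {mean:.0f} Tg CH4/yr")
    print(f"range: {min(spread):+.0%} to {max(spread):+.0%} of the mean")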
Abstract:
A three-dimensional, regional coupled atmosphere-ocean model with full physics is developed to study air-sea interactions during winter storms off the U.S. east coast. Because of the scarcity of open-ocean observations, models such as this offer valuable opportunities to investigate how oceanic forcing drives atmospheric circulation and vice versa. The study presented here considers conditions of strong atmospheric forcing (high wind speeds) and strong oceanic forcing (significant sea surface temperature (SST) gradients). A simulated atmospheric cyclone evolves in a manner consistent with the Eta reanalysis, and the simulated air-sea heat and momentum exchanges strongly affect the circulations in both the atmosphere and the ocean. For the simulated cyclone of 19-20 January 1998, maximum ocean-to-atmosphere heat fluxes first appear over the Gulf Stream in the South Atlantic Bight, resulting in rapid deepening of the cyclone off the Carolina coast. As the cyclone moves eastward, the heat flux maximum shifts into the region near Cape Hatteras and later northeast of Hatteras, where it enhances the wind locally. The oceanic response to the atmospheric forcing is closely related to the wind direction. Southerly and southwesterly winds tend to strengthen surface currents in the Gulf Stream, whereas northeasterly winds weaken the surface currents in the Gulf Stream and generate southwestward flows on the shelf. The oceanic feedback to the atmosphere moderates the cyclone strength. Compared with a simulation in which the oceanic model always passes the initial SST to the atmospheric model, the coupled simulation, in which the oceanic model passes the evolving SST to the atmospheric model, produces higher ocean-to-atmosphere heat flux near Gulf Stream meander troughs. This is due to wind-driven lateral shifts of the stream, which in turn enhance the local northeasterly winds. Away from the Gulf Stream, the coupled simulation produces surface winds that are 5-10% weaker. Differences in the surface ocean currents between the two experiments are significant on the shelf and in the open ocean.
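For context on the air-sea exchange terms at the heart of such a coupled model, sensible heat flux is commonly parameterized with a bulk formula, Q_H = ρ c_p C_H U (SST - T_air). The sketch below uses typical coefficient values and plausible cold-air-outbreak conditions over the Gulf Stream; it is a generic illustration, not the paper's specific parameterization:

    # Bulk formula for ocean-to-atmosphere sensible heat flux (typical values).
    rho_air = 1.2      # air density, kg/m^3
    cp_air = 1004.0    # specific heat of air, J/(kg K)
    C_H = 1.2e-3       # bulk transfer (Stanton) coefficient, dimensionless
    U = 20.0           # 10-m wind speed during the storm, m/s
    sst = 22.0         # Gulf Stream sea surface temperature, deg C
    t_air = 5.0        # cold continental air temperature, deg C

    Q_sensible = rho_air * cp_air * C_H * U * (sst - t_air)   # W/m^2
    print(f"sensible heat flux: {Q_sensible:.0f} W/m^2")      # ~490 W/m^2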