930 results for Analysis tools


Relevance:

30.00%

Publisher:

Abstract:

The performance of the H → ZZ* → 4l analysis is studied in the context of the High Luminosity upgrade of the LHC collider, with the CMS detector. The high luminosity (up to L = 5 × 10^34 cm^−2 s^−1) of the accelerator poses very challenging experimental conditions. In particular, the number of overlapping events per bunch crossing will increase to 140. To cope with this difficult environment, the CMS detector will be upgraded in two stages: Phase-I and Phase-II. The tools used in the analysis are the CMS Full Simulation and the fast parametrized Delphes simulation. A validation of Delphes against the Full Simulation is performed using reference Phase-I detector samples. Delphes is then used to simulate the Phase-II detector response. The Phase-II configuration is compared with the Phase-I detector and with the same Phase-I detector affected by aging processes, both modeled with the Full Simulation framework. Conclusions are drawn for these three scenarios: the degradation in performance observed in the “aged” scenario shows that a major upgrade of the detector is mandatory, while the specific upgrade configuration studied maintains the same performance as in Phase-I and, in the four-muon channel, even exceeds it.
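To make the validation step concrete: comparing a fast parametrized simulation against a full simulation ultimately means comparing binned distributions of the same observable from both. A minimal sketch of such a per-bin comparison, using synthetic stand-in samples rather than any CMS data, might look like this:

```python
# Sketch of a fast-vs-full simulation validation: histogram the same
# observable (e.g., a four-lepton invariant mass) from both simulations and
# report the per-bin ratio. Both samples below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
full_sim = rng.normal(125.0, 2.0, size=10_000)   # GeV; stand-in for Full Sim
delphes = rng.normal(125.0, 2.2, size=10_000)    # stand-in for Delphes output

bins = np.linspace(115, 135, 21)                 # 20 bins of 1 GeV
h_full, _ = np.histogram(full_sim, bins=bins)
h_fast, _ = np.histogram(delphes, bins=bins)

ratio = np.divide(h_fast, h_full, out=np.zeros(len(h_fast)), where=h_full > 0)
for lo, r in zip(bins[:-1], ratio):
    print(f"[{lo:5.1f}, {lo + 1:5.1f}) GeV: Delphes/Full = {r:4.2f}")
```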

Relevance:

30.00%

Publisher:

Abstract:

Approaching the world of the fairy tale as an adult, one soon realizes that things are not what they once seemed during story time in bed. Something that once appeared so innocent and simple can become rather complex when digging into its origin. A kiss, for example, can mean something else entirely. I can clearly remember my sister, who is ten years older than I am, telling me that the fairy tales I was told had a mysterious hidden meaning I could not understand. I was probably 9 or 10 when she told me that the story of Sleeping Beauty, which I used to love so much in Disney’s rendering, was nothing more than the story of an adolescent girl, with all the steps necessary to become a woman, the bleeding of menstruation and the sexual awakening, even though she did not really put it in these terms. This shocking news troubled me for a while, so much so that I have not watched that movie since. But in reality it was not fear that my sister had implanted in me: it was curiosity, the feeling that I was missing something terribly important behind the words and images. It was not until last year, during my semester abroad in Germany, where I had the chance to take a very interesting English literature seminar, that I fully understood what I had been looking for all these years. Thanks to what I learned from the work of Bruno Bettelheim, Jack Zipes, Vladimir Propp, and many other authors who wrote extensively on the subject, I feel I finally have the right tools to really get to know this fairy tale. But what I also know now is that the message behind fairy tales is not to be sought in only one version: on the contrary, since they come from oral traditions and their form was slowly shaped by centuries of retellings, the more one digs, the more complete one’s understanding of the tale will be. I will therefore look for Sleeping Beauty’s hidden meaning by asking why the tale has persisted so consistently through time. To achieve this goal, I have organized my analysis in three chapters: in the first chapter, I analyze the first known literary version of the tale, the French Perceforest, and then compare it with the subsequent Italian version, Basile’s Sun, Moon, and Talia; in the second chapter, I focus on the most famous and by now classical literary versions of Sleeping Beauty, La Belle au Bois Dormant, written by the Frenchman Perrault, and the German Dornröschen, recorded by the Brothers Grimm; finally, in the last chapter, I analyze Almodóvar’s film Talk to Her as a modern rewriting of this tale which, on closer inspection, appears closely related to the earliest version of the story, Perceforest.

Relevance:

30.00%

Publisher:

Abstract:

Over the last 10 years, several molecular markers have been established as useful tools in the hematologist's armamentarium. As a consequence, the number of hematologic molecular analyses performed has increased immensely. Often, such tests replace or complement other laboratory methods. Molecular markers can be useful in many ways: they can serve for diagnostics, describe the prognostic profile, predict which types of drugs are indicated, and be used for therapeutic monitoring of the patient to indicate an adequate response or to predict resistance or relapse of the disease. Many markers fulfill more than one of these roles. Most important, however, is the right choice of analyses at the right time points.

Relevance:

30.00%

Publisher:

Abstract:

Advances in novel molecular biological diagnostic methods are changing how metabolic disorders such as growth hormone deficiency are diagnosed and studied. Faster sequencing and genotyping methods require strong bioinformatics tools to make sense of the vast amounts of data generated by modern laboratories. Advances in genome sequencing, together with the computational power to analyze whole-genome sequences, will guide the diagnostics of the future. In this chapter, some basic bioinformatics resources needed to study metabolic disorders are reviewed, and some examples of bioinformatics analysis of the human growth hormone gene, protein, and structure are provided.
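As a small illustration of the kind of protein-level bioinformatics analysis mentioned, the sketch below uses Biopython's ProtParam module to compute basic physicochemical properties of a protein sequence. The sequence shown is a made-up placeholder fragment, not the actual GH1 gene product; in practice one would substitute the real sequence retrieved from a database such as UniProt.

```python
# A minimal sketch, assuming Biopython is installed (pip install biopython).
# The sequence below is a made-up placeholder, NOT the real human GH1 product.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MATGSRTSLLLAFGLLCLPWLQEGSA"  # hypothetical placeholder fragment
analysis = ProteinAnalysis(seq)

print(f"Length:             {len(seq)} residues")
print(f"Molecular weight:   {analysis.molecular_weight():.1f} Da")
print(f"Isoelectric point:  {analysis.isoelectric_point():.2f}")
print(f"GRAVY (hydropathy): {analysis.gravy():.3f}")
```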

Relevance:

30.00%

Publisher:

Abstract:

In the context of drug hypersensitivity, our group has recently proposed a new model based on the structural features of drugs (pharmacological interaction with immune receptors; p-i concept) to explain their recognition by T cells. According to this concept, even chemically inert drugs can stimulate T cells because certain drugs interact in a direct way with T-cell receptors (TCR) and possibly major histocompatibility complex molecules without the need for metabolism and covalent binding to a carrier. In this study, we investigated whether mouse T-cell hybridomas transfected with drug-specific human TCR can be used as an alternative to drug-specific T-cell clones (TCC). Indeed, they behaved like TCC and, in accordance with the p-i concept, the TCR recognize their specific drugs in a direct, processing-independent, and dose-dependent way. The presence of antigen-presenting cells was a prerequisite for interleukin-2 production by the TCR-transfected cells. The analysis of cross-reactivity confirmed the fine specificity of the TCR and also showed that TCR transfectants might provide a tool to evaluate the potential of new drugs to cause hypersensitivity due to cross-reactivity. Recombining the alpha- and beta-chains of sulfanilamide- and quinolone-specific TCR abrogated drug reactivity, suggesting that both original alpha- and beta-chains were involved in drug binding. The TCR-transfected hybridoma system showed that the recognition of two important classes of drugs (sulfanilamides and quinolones) by TCR occurs according to the p-i concept, and it provides an interesting tool to study drug-TCR interactions and their biological consequences and to evaluate the cross-reactivity potential of new drugs of the same class.

Relevance:

30.00%

Publisher:

Abstract:

Acute lung injury is associated with a variety of histopathological alterations, such as oedema formation, damage to the components of the blood–air barrier, and impairment of the surfactant system. Stereological methods are indispensable tools with which to properly quantify these structural alterations at the light and electron microscopic level. The stereological parameters relevant to the analysis of acute lung injury are reviewed in the present article.
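One of the most basic stereological estimators used in such analyses is the volume fraction obtained by point counting, Vv = P(hit)/P(total), pooled over fields of view as a ratio of sums. A minimal sketch with invented counts (not data from the article):

```python
# Minimal sketch of the point-counting estimator for volume fraction,
# Vv = P_hit / P_total, pooled across fields of view as a ratio of sums.
# The counts below are made up for illustration.

fields = [
    # (test points hitting the structure of interest, total test points)
    (12, 100),
    (9, 100),
    (15, 100),
]

hits = sum(h for h, _ in fields)
total = sum(t for _, t in fields)
vv = hits / total
print(f"Estimated volume fraction Vv = {vv:.3f}")  # 36/300 = 0.120
```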

Relevance:

30.00%

Publisher:

Abstract:

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of traffic accident victims, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of external body findings and of injury-inflicting instruments. The correlation of the body's injuries to the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, the involved vehicles, and the inflicting tools, as well as analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included processing the obtained data into 3D models, determining the driving direction of the vehicle, correlating injuries to the vehicle damage, geometrically determining the impact situation, and evaluating further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damage to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are demonstrated in two examined cases.

Relevance:

30.00%

Publisher:

Abstract:

As the demand for miniature products and components continues to increase, the need for manufacturing processes to provide these products and components has also increased. To meet this need, successful macroscale processes are being scaled down and applied at the microscale. Unfortunately, many challenges have been encountered when directly scaling down macro processes. Initially, frictional effects were believed to be the largest challenge; however, recent studies have found that the greatest challenge lies with size effects. Size effect is a broad term that largely refers to the thickness of the material being formed and how this thickness directly affects the product dimensions and manufacturability. At the microscale, the thickness becomes critical due to the reduced number of grains. When surface contact between the forming tools and the material blank occurs at the macroscale, there is enough material (hundreds of layers of grains) across the blank thickness to compensate for material flow and the effect of grain orientation. At the microscale, there may be fewer than 10 grains across the blank thickness. With a decreased number of grains across the thickness, the influence of grain size, shape, and orientation is significant, and any material defects (either naturally occurring or introduced during material preparation) play a significant role in altering the forming potential. To date, various micro metal forming and micro materials testing equipment setups have been constructed in the Michigan Tech lab. Initially, the research focus was to create a micro deep drawing setup to potentially build micro sensor encapsulation housings. The focus then shifted to micro metal materials testing setups, including the construction and testing of a micro mechanical bulge test, a micro sheet tension test (testing micro tensile bars), a micro strain analysis (using optical lithography and chemical etching), and a micro sheet hydroforming bulge test. Recently, the focus has shifted to studying a micro tube hydroforming process, targeting fuel cell, medical, and sensor encapsulation applications. While the tube hydroforming process is widely understood at the macroscale, the microscale process also poses significant challenges in terms of size effects. Current work applies direct current to enhance micro tube hydroforming formability; initial experiments adding direct current to various metal forming operations have shown remarkable results, and the focus of current research is to determine the validity of this approach.
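A quick back-of-the-envelope check of the size-effect regime described above: the number of grains across the blank thickness is roughly the thickness divided by the mean grain size. The values below are illustrative, not measurements from the lab:

```python
# Rough size-effect check: grains across the blank thickness, N = t / d,
# where t is blank thickness and d is mean grain size. Values are invented.

def grains_across_thickness(thickness_um: float, grain_size_um: float) -> float:
    return thickness_um / grain_size_um

for t, d in [(2000, 20), (200, 20), (50, 20)]:  # macro, meso, micro blanks
    n = grains_across_thickness(t, d)
    regime = "polycrystal averaging" if n >= 10 else "size effects dominate"
    print(f"t = {t:5d} um, d = {d} um -> N = {n:5.1f} grains ({regime})")
```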

Relevance:

30.00%

Publisher:

Abstract:

Finite element tire modeling can be a challenging process, due to the overall complexity of the tire and the many variables required to produce capable predictive simulations. Utilizing tools from the Abaqus finite element software, adequate predictive simulations that represent actual operational conditions can be produced. Many variables resulting from complex geometries and materials, multiple loading conditions, and surface contact can be incorporated into the modeling simulations. This thesis outlines modeling practices used to conduct analysis on specific tire variants of the STL3 series OTR tire line, produced by Titan Tire. Finite element models were created to represent an inflated tire-and-rim assembly supporting a 30,000 lb load while resting on a flat surface. Simulations were conducted with reinforcement belt cords at variable angles in order to understand how belt cord arrangement affects tire components and stiffness response.
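The thesis's Abaqus models cannot be reproduced here, but the directional effect of belt cord angle on in-plane stiffness can be sketched with the standard transformed reduced-stiffness relation from classical lamination theory, a much simpler stand-in for the full finite element analysis. The ply constants below are invented cord-rubber values, not the STL3 tire's actual properties:

```python
# Sketch: how ply (belt cord) angle changes in-plane stiffness, using the
# standard transformed reduced-stiffness relation from classical lamination
# theory. Material constants are invented cord-rubber values.
import math

E1, E2, G12, nu12 = 40_000.0, 10.0, 4.0, 0.45  # MPa; hypothetical ply data
nu21 = nu12 * E2 / E1
denom = 1 - nu12 * nu21
Q11, Q22 = E1 / denom, E2 / denom
Q12, Q66 = nu12 * E2 / denom, G12

def qbar11(theta_deg: float) -> float:
    """Effective stiffness Qbar11 of a ply rotated by theta degrees."""
    m = math.cos(math.radians(theta_deg))
    n = math.sin(math.radians(theta_deg))
    return Q11 * m**4 + 2 * (Q12 + 2 * Q66) * m**2 * n**2 + Q22 * n**4

for angle in (0, 15, 30, 45, 60, 90):
    print(f"belt cord angle {angle:2d} deg -> Qbar11 = {qbar11(angle):9.1f} MPa")
```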

Relevance:

30.00%

Publisher:

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamical systems that dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating user observation and understanding of the flow field in a clear manner. My research mainly focuses on the analysis and visualization of flow fields using various techniques, e.g., information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines to capture flow patterns and how to pick good viewpoints to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates when the view changes gradually. When projecting 3D streamlines onto 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters and spatiotemporal regions, and their interconnections, in the transformed space. Most viewpoint selection methods consider only external viewpoints outside the flow field, which fails to convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to deal with unsteady flow fields. Besides flow field visualization, some other visualization topics also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we have developed a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
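As a taste of the information-theoretic machinery involved, a very common building block is the Shannon entropy of a quantized direction histogram, which can serve as a crude importance score for a streamline or a view. A minimal sketch with synthetic velocity samples (the bin count and data are illustrative choices, not the actual method of [81] or [56]):

```python
# Sketch: Shannon entropy of a 2D vector field's direction histogram, the
# kind of information-theoretic score used to rank streamlines/viewpoints.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2D velocity samples (e.g., along a candidate streamline).
u, v = rng.normal(size=1000), rng.normal(size=1000)

angles = np.arctan2(v, u)                      # directions in [-pi, pi]
hist, _ = np.histogram(angles, bins=36, range=(-np.pi, np.pi))
p = hist / hist.sum()
p = p[p > 0]                                   # drop empty bins before log
entropy = -np.sum(p * np.log2(p))
print(f"Direction entropy: {entropy:.2f} bits (max {np.log2(36):.2f})")
```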

Relevance:

30.00%

Publisher:

Abstract:

Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.
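The simplest conceivable version of the segmentation step is a global grayscale threshold separating bright roots from darker soil; the actual workflows described use SPRING's segmentation and classification or change analysis in ArcMap. A toy sketch on synthetic stand-in data:

```python
# Sketch of the simplest possible root/soil segmentation: global thresholding
# of a grayscale minirhizotron-style image. The array below is synthetic.
import numpy as np

rng = np.random.default_rng(1)
image = rng.normal(60, 10, size=(100, 100))        # dark soil background
image[40:45, :] = rng.normal(180, 10, (5, 100))    # one bright "root" strip

threshold = 120                                    # hypothetical cutoff
root_mask = image > threshold
coverage = root_mask.mean() * 100
print(f"Pixels classified as root: {coverage:.1f}% of image")
```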

Relevance:

30.00%

Publisher:

Abstract:

Global transcriptomic and proteomic profiling platforms have yielded important insights into the complex response to ionizing radiation (IR). Nonetheless, little is known about the ways in which small cellular metabolite concentrations change in response to IR. Here, a metabolomics approach using ultraperformance liquid chromatography coupled with electrospray time-of-flight mass spectrometry was used to profile, over time, the hydrophilic metabolome of TK6 cells exposed to IR doses ranging from 0.5 to 8.0 Gy. Multivariate data analysis of the positive ions revealed dose- and time-dependent clustering of the irradiated cells and identified certain constituents of the water-soluble metabolome as being significantly depleted as early as 1 h after IR. Tandem mass spectrometry was used to confirm metabolite identity. Many of the depleted metabolites are associated with oxidative stress and DNA repair pathways. Included are reduced glutathione, adenosine monophosphate, nicotinamide adenine dinucleotide, and spermine. Similar measurements were performed with a transformed fibroblast cell line, BJ, and it was found that a subset of the identified TK6 metabolites were effective in IR dose discrimination. The GEDI (Gene Expression Dynamics Inspector) algorithm, which is based on self-organizing maps, was used to visualize dynamic global changes in the TK6 metabolome that resulted from IR. It revealed dose-dependent clustering of ions sharing the same trends in concentration change across radiation doses. "Radiation metabolomics," the application of metabolomic analysis to the field of radiobiology, promises to increase our understanding of cellular responses to stressors such as radiation.
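The multivariate step that reveals dose-dependent clustering is typically a projection method such as principal component analysis. A minimal sketch of PCA via singular value decomposition on a synthetic samples-by-metabolites intensity matrix (all numbers invented, not TK6 data):

```python
# Sketch of the multivariate step: PCA (via SVD) on a samples-by-metabolites
# intensity matrix to look for dose-dependent clustering. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
doses = np.repeat([0.5, 2.0, 8.0], 5)             # Gy, 5 replicates each
X = rng.normal(size=(15, 40))                     # 40 hypothetical metabolites
X[:, :5] -= doses[:, None] * 0.8                  # a few deplete with dose

Xc = X - X.mean(axis=0)                           # mean-center each column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                                    # sample coordinates (PCs)
for d, pc1 in zip(doses, scores[:, 0]):
    print(f"dose {d:3.1f} Gy -> PC1 = {pc1:+.2f}")
```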

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: In search of an optimal compression therapy for venous leg ulcers, a systematic review and meta-analysis was performed of randomized controlled trials (RCTs) comparing compression systems based on stockings (MCS) with various bandages. METHODS: RCTs were retrieved from six sources and reviewed independently. The primary endpoint, completion of healing within a defined time frame, and the secondary endpoints, time to healing and pain, were entered into a meta-analysis using the tools of the Cochrane Collaboration. Additional subjective endpoints were summarized. RESULTS: Eight RCTs (published 1985-2008) fulfilled the predefined criteria. Data presentation was adequate and showed moderate heterogeneity. The studies included 692 patients (21-178/study, mean age 61 years, 56% women). Analyzed were 688 ulcerated legs, present for 1 week to 9 years, measuring 1 to 210 cm². The observation period ranged from 12 to 78 weeks. Patient and ulcer characteristics were evenly distributed in three studies, favored the stocking groups in four, and the bandage group in one. Data on the pressure exerted by stockings and bandages were reported in seven and two studies, amounting to 31-56 and 27-49 mm Hg, respectively. The proportion of ulcers healed was greater with stockings than with bandages (62.7% vs 46.6%; P < .00001). The average time to healing (seven studies, 535 patients) was 3 weeks shorter with stockings (P = .0002). In no study did bandages perform better than MCS. Pain was assessed in three studies (219 patients), revealing an important advantage of stockings (P < .0001). Other subjective parameters and issues of nursing revealed an advantage of MCS as well. CONCLUSIONS: Leg compression with stockings is clearly better than compression with bandages, has a positive impact on pain, and is easier to use.
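To illustrate the scale of the primary-endpoint comparison, the sketch below computes a risk ratio with an approximate confidence interval from the pooled healing proportions. The arm sizes are hypothetical (the abstract reports only the pooled percentages), so this illustrates the calculation rather than reproducing the Cochrane meta-analysis:

```python
# Sketch of the comparison behind the primary endpoint: risk ratio of healing
# for stockings vs. bandages. Arm sizes are hypothetical equal splits; only
# the pooled percentages (62.7% vs 46.6%) come from the abstract.
import math

n_stock, n_band = 344, 344                 # assumed equal arms (illustrative)
healed_stock = round(0.627 * n_stock)      # 216
healed_band = round(0.466 * n_band)        # 160

p1, p2 = healed_stock / n_stock, healed_band / n_band
rr = p1 / p2
# Approximate 95% CI on the log risk ratio (delta method).
se = math.sqrt((1 - p1) / healed_stock + (1 - p2) / healed_band)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print(f"Risk ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```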

Relevance:

30.00%

Publisher:

Abstract:

Research and professional practices have the joint aim of restructuring preconceived notions of reality. Both seek to gain an understanding of social reality. Social workers use their professional competence to grasp the reality of their clients, while researchers seek to unlock the secrets of their research material. Development and research are now so intertwined and inherent in almost all professional practices that making distinctions between practising, developing and researching has become difficult and in many respects irrelevant. Moving towards research-based practices is possible and is easily applied within the framework of the qualitative research approach (Dominelli 2005, 235; Humphries 2005, 280). Social work can be understood as acts and speech acts crisscrossing between social workers and clients. When trying to catch the verbal and non-verbal hints of each other's behaviour, the actors have to make many interpretations in a more or less uncertain mental landscape. Our point of departure is the idea that the study of social work practices requires tools which effectively reveal the internal complexity of social work (see, for example, Adams & Dominelli & Payne 2005, 294-295). The boom of qualitative research methodologies in recent decades is associated with a profound rupture in the humanities known as the linguistic turn (Rorty 1967). The idea that language does not transparently mediate our perceptions and thoughts about reality but, on the contrary, constitutes it was new and even confusing to many social scientists. Nowadays we are used to reading research reports that apply different branches of discourse analysis or narratological or semiotic approaches. Although the differences between those orientations are subtle, they share the idea of the predominance of language. Despite the lively research work in today's social work and the research-minded atmosphere of social work practice, semiotics has rarely been applied in social work research. However, social work as a communicative practice concerns symbols, metaphors and all kinds of representative structures of language. Those items are at the core of semiotics, the science of signs, the science which examines how people use signs in their mutual interaction and in their endeavours to make sense of the world they live in, that is, their semiosis. When thinking of the practice of social work and researching it, a number of interpretational levels have to be passed before reaching the research phase. First of all, social workers have to interpret their clients' situations, which are then recorded in the files. In some very rare cases those past situations will be reflected in discussions or interviews, or put under the scrutiny of some researcher in the future. Each and every new observation adds its own flavour to the mixture of meanings. Social workers combine their observations with previous experience and professional knowledge; furthermore, the situation at hand also influences their reactions. In addition, the interpretations made by social workers over the course of their daily working routines are never limited to being part of the personal process of the social worker, but are also always inherently cultural.
The work aiming at social change is defined by the presence of an initial situation, a specific goal, and the means and ways of achieving it, which are, or should be, agreed upon by the social worker and the client in a situation which is unique and at the same time socially driven. Because of the inherently plot-based nature of social work, the practices related to it can be analysed as stories (see Dominelli 2005, 234), given, of course, that they signify something and are told by someone. Research on these practices concentrates on impressions, perceptions, judgements, accounts, documents, and so on. All these multifarious elements can be scrutinized as textual corpora, but not just any textual material: in semiotic analysis, the material studied is characterised as verbal or textual and loaded with meanings. We present a methodological contribution, semiotic analysis, which to our mind has at least implicit relevance to social work practices. Our examples of semiotic interpretation are drawn from our dissertations (Laine 2005; Saurama 2002). The data are official documents from the archives of a child welfare agency and transcriptions of interviews with shelter employees. These data can be defined as stories told by the social workers of what they have seen and felt. The official documents present only fragments and are often written in the passive voice (Saurama 2002, 70). The interviews carried out in the shelters can be described as stories whose narrators are more familiar and known; this material is characterised by the interaction between interviewer and interviewee. The levels of the story and of the telling of the story become apparent when interviews or documents are examined with semiotic tools. The roots of semiotic interpretation can be found in three different branches: American pragmatism, Saussurean linguistics in Paris, and so-called formalism in Moscow and Tartu. In this paper, however, we engage with the so-called Parisian School of semiology, whose prominent figure was A. J. Greimas. The Finnish sociologists Pekka Sulkunen and Jukka Törrönen (1997a; 1997b) have further developed the ideas of Greimas in their studies on socio-semiotics, and we lean on their ideas. In semiotics, social reality is conceived as a relationship between subjects, observations, and interpretations, and it is seen as mediated by natural language, which is the most common sign system among human beings (Mounin 1985; de Saussure 2006; Sebeok 1986). Signification is the act of associating an abstract content (signified) with some physical instrument (signifier). These two elements together form the basic concept, the "sign", which never constitutes any kind of meaning alone; meaning arises in a process of distinction in which signs are related to other signs. In this chain of signs, the meaning diverges from reality (Greimas 1980, 28; Potter 1996, 70; de Saussure 2006, 46-48). One interpretative tool is to think of speech as a surface beneath which deep structures, i.e. values and norms, exist (Greimas & Courtes 1982; Greimas 1987). To our mind, semiotics is very much about playing with two different levels of text: the syntagmatic surface, which is more or less faithful to the grammar, and the paradigmatic, semantic structure of values and norms hidden in the deeper meanings of interpretations.
Semiotic analysis deals precisely with the level of meaning which exists beneath the surface, but the only way to reach those meanings is through the textual level, the written or spoken text; that is why the tools are needed. In our studies, we have used the semiotic square and actant analysis. The former is based on the distinction and categorisation of meanings, and the latter on opening up the plot of narratives in order to reach the value structures.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The mollicute Mycoplasma conjunctivae is the etiological agent of infectious keratoconjunctivitis (IKC) in domestic sheep and wild Caprinae. Although this pathogen is relatively benign for domestic animals, which can be treated with antibiotics, it can lead wild animals to blindness and death, and it is a major cause of death in protected species in the Alps (e.g., Capra ibex, Rupicapra rupicapra). METHODS: The genome was sequenced using a combination of GS-FLX (454) and Sanger sequencing, and annotated by an automatic pipeline that we designed using several tools interconnected via Perl scripts. The resulting annotations are stored in a MySQL database. RESULTS: The annotated sequence has been deposited in the EMBL database (FM864216) and uploaded into the mollicutes database MolliGen http://cbi.labri.fr/outils/molligen/, allowing for comparative genomics. CONCLUSION: We show that our automatic pipeline can annotate a complete mycoplasma genome, and we present several examples of analysis in search of biological targets (e.g., pathogenic proteins).
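The pipeline itself is Perl scripts feeding a MySQL database; as a rough illustration of the orchestration pattern described (run a tool, parse its output, store the features), here is a minimal Python sketch. The gene_finder CLI, its tab-separated output format, and the SQLite backend are stand-ins, not components of the actual pipeline:

```python
# Minimal sketch of an annotation-pipeline step: run an external tool, parse
# its output, store features in a database. The real pipeline is Perl + MySQL;
# the tool name, its output format, and SQLite here are hypothetical stand-ins.
import sqlite3
import subprocess

def run_gene_finder(fasta_path: str) -> list[tuple[str, int, int]]:
    """Run a hypothetical gene finder emitting 'name<TAB>start<TAB>end' lines."""
    result = subprocess.run(
        ["gene_finder", "--input", fasta_path],   # hypothetical CLI
        capture_output=True, text=True, check=True,
    )
    features = []
    for line in result.stdout.splitlines():
        name, start, end = line.split("\t")
        features.append((name, int(start), int(end)))
    return features

def store(features, db_path="annotations.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS cds (name TEXT, start INT, end INT)")
    con.executemany("INSERT INTO cds VALUES (?, ?, ?)", features)
    con.commit()
    con.close()

if __name__ == "__main__":
    store(run_gene_finder("M_conjunctivae.fasta"))  # hypothetical input file
```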