928 results for Freedom of choice


Relevance: 90.00%

Abstract:

The indole ring is one of the most common features in natural products and in small molecules with important bioactivity. Larock reported a new methodology for the synthesis of the indole ring system based on the palladium-catalyzed heteroannulation of 2-iodoanilines and substituted alkynes. This procedure was subsequently extended to the preparation of other nitrogen- and oxygen-containing heterocycles. It is the process of choice for the synthesis of a large number of heterocyclic derivatives, as it provides outstanding regioselectivity and good to excellent yields.

Relevance: 90.00%

Abstract:

Determination of free urinary cortisol is a test of choice in the diagnosis of Cushing's syndrome. In this study, cortisol was quantified by reversed-phase high-performance liquid chromatography (RP-HPLC) in urine samples previously extracted with ether, using triamcinolone acetonide as the internal standard (IS). A BDS-Hypersil-C18® column and a water-acetonitrile mobile phase (72:28, v/v) were used, with a flow rate of 1.0 mL/min and detection at 243 nm. The method proved to be both effective and efficient, with sensitivity and linearity over the range of 2.50 to 150 μg/L, and can be used as a substitute for the radioimmunoassay technique within this concentration range.
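Quantification against an internal standard of this kind is typically done by fitting a linear calibration of the cortisol/IS peak-area ratio against standard concentrations and inverting the fit for unknowns. A minimal sketch in Python, using made-up calibration points inside the reported 2.50-150 μg/L linear range (the thesis's actual calibration data are not given here):

```python
import numpy as np

# Hypothetical calibration standards within the reported linear range (2.50-150 µg/L):
# cortisol concentration (µg/L) and measured cortisol/IS peak-area ratios.
conc = np.array([2.5, 10.0, 50.0, 100.0, 150.0])
ratio = np.array([0.05, 0.20, 1.00, 2.00, 3.00])

# Least-squares linear fit: ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, ratio, 1)

def cortisol_ug_per_l(sample_ratio):
    """Back-calculate concentration from a sample's analyte/IS peak-area ratio."""
    return (sample_ratio - intercept) / slope

print(round(cortisol_ug_per_l(1.0), 1))  # with these made-up standards, a ratio of 1.00 maps to 50.0 µg/L
```

With a real calibration, the slope and intercept come from measured standards, and the fit quality should be checked before back-calculating samples.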

Relevance: 90.00%

Abstract:

This study is basic research within the field of caring science. The aim is a theoretical and ontological investigation of what space is in the world of caring. The basic proposition is that space, as a fundamental dimension, shapes how one's mental health and suffering are appreciated, and vice versa. The overall purpose is to develop a theoretical model of space from the caring science point of view and to offer an ideal concept of space to caring science. Guided by a theoretical horizon (Eriksson 1993, Eriksson 1995, Eriksson 2001) and a methodological approach grounded in Gadamer's philosophical and existential hermeneutics, a three-stage analysis and interpretation is conducted. The hermeneutic spiral of this investigation begins with a procedure in accordance with Eriksson's model (1997) of concept definition. The goal is to clarify the etymology of the concept as well as the semantic differences between synonymous concepts, i.e. to identify the different extents of the concept of `space` (`rum`) in order to bring them closer for exploration. The second phase analyses and interprets a sample of narratives, here literary texts, in order to explicate the ontological nature and meaning of space. The goal is to clarify the characteristics of the inner nature of space as it is shaped in relation to the human being encountering suffering. In the third phase, an interview study is carried out, focused on the phenomenon of space as it is known by a patient in a landscape of psychiatric care, i.e. what space is in a contextual meaning. A gradual hermeneutic understanding of space is then attempted using theories from caring science as well as from other disciplines. Metaphors are used because they are vivid and expressive tools for generating meaning.
Different metaphoric space formations depict a variety of purports that, although not identical, share extensive elements. Six metaphorically summarized entities of meaning emerged. The comprehensive form of space is identified as the Mobile-Immobile Room; further forms are the Standby, the Asylum, the Wall and the Place. In further dialogue with the texts, the understanding deepened ontologically. The theoretical model of space sums up the vertical, horizontal and inward extents of depth in the movement of mental health. Three entities of ontological meaning emerged as three significant rooms: the Common Land emerges as the ideal concept of mutual creation in the freedom of doing, being and becoming health; on the interpersonal level it means freedom, including the sovereignty, choice and dignity of the human being. The Ice World signifies, ultimately, space as a kind of frozenness of despair which "wallpapers" the person's entire being-in-the-world in the drama of suffering. The Spiritual Home is shaped when the human being has acquired the very core of his or her inner and outer placeness as a kind of "at-homeness" and rootedness. Time is a central element and the inward extent of depth of this trialectic space. Each metaphor is thus the human being's unique, even paradoxical, way of conceiving reality and mastering spiritual suffering. The metaphors condense characteristic structures and patterns of a dynamic scenery that takes place within the movement of health. Space encloses a contradictory spatiality constituted through the dynamic field of meaningfulness and meaninglessness. Yet it is not through purging these contradictions but through bringing them together in a drama of suffering that space is shaped as ontologically good and meaningful in the world of caring.

Relevance: 90.00%

Abstract:

This thesis describes the process of design and modeling of an instrument for knee joint kinematics measurement that can work for both in-vivo and in-vitro subjects. It is designed to be compatible with an imaging machine in the sagittal plane; because of the invasiveness of the imaging machine, the instrument is also able to function independently. The flexibility of the instrument allows it to measure anthropometrically different subjects. Among the six degrees of freedom of a knee, three rotational degrees and one translational degree of freedom can be measured for both types of subject. The translational (proximal-distal) motion is stimulated by an external force applied directly along its axis. The angular and linear displacements are measured by magnetic sensors and high-precision potentiometers, respectively.

Relevance: 90.00%

Abstract:

It has been known since the 1970s that the laser beam is suitable for processing paper materials. In this thesis, the term paper materials covers all wood-fibre based materials, such as dried pulp, copy paper, newspaper, cardboard, corrugated board and tissue paper. Accordingly, laser processing here means all laser treatments resulting in material removal, such as cutting, partial cutting, marking, creasing and perforation, that can be used to process paper materials. Laser technology offers many advantages for the processing of paper materials: it is a non-contact method, it gives freedom of processing geometry, and it is a reliable technology for non-stop production. The packaging industry in particular is a very promising area for laser processing applications. However, there were only a few industrial laser processing applications worldwide even at the beginning of the 2010s. One reason for the small-scale use of lasers in paper material manufacturing is the shortage of published research and scientific articles. Another problem restraining the use of lasers for processing paper materials is the colouration of the material, i.e. the yellowish and/or greyish colour of the cut edge that appears during or after cutting. These are the main reasons why the topic of this thesis concerns the characterization of the interaction of a laser beam and paper materials. The study was carried out in the Laboratory of Laser Processing at Lappeenranta University of Technology (Finland). The laser used was a TRUMPF TLF 2700 carbon dioxide laser producing a beam with a wavelength of 10.6 μm and a power range of 190-2500 W (laser power on the work piece). The interaction of the laser beam and paper material was studied by treating dried kraft pulp (grammage 67 g/m²) with different laser power levels, focal plane position settings and interaction times, and was monitored with different devices, i.e. a spectrometer, a pyrometer and an active illumination imaging system. In this way it was possible to create an input and output parameter diagram and to study the effects of the input and output parameters. Once the interaction phenomena are understood, process development can be carried out and even new innovations developed; filling this gap in knowledge can pave the way for wider use of laser technology in the paper making and converting industries.
It was concluded in this thesis that the interaction of a laser beam and paper material involves two mechanisms that depend on the focal plane position. In the experimental set-up used, the assumed interaction mechanism B appears at average focal plane positions between 2.4 mm and 3.4 mm, and the assumed interaction mechanism A between -0.6 mm and 0.4 mm; a focal plane position of 1.4 mm represents the midzone between the two mechanisms. Holes form gradually during the interaction: first a small hole forms in the interaction area at the centre of the laser beam cross-section, and then, as a function of interaction time, the hole expands until the interaction between the laser beam and the dried kraft pulp ends. Image analysis shows that at the beginning of the interaction small holes of very good quality are formed, while black colour and a heat-affected zone appear as a function of interaction time. This reveals that there are distinct interaction phases within mechanisms A and B, appearing as a function of time and of the peak intensity of the laser beam. The limit peak intensity is the value that divides each mechanism from one-phase into dual-phase interaction: all peak intensity values under the limit belong to MAOM (interaction mechanism A, one-phase mode) or MBOM (interaction mechanism B, one-phase mode), and values over it belong to MADM (interaction mechanism A, dual-phase mode) or MBDM (interaction mechanism B, dual-phase mode).
The decomposition of cellulose between 380 and 500°C is the evolution of hydrocarbons: the long cellulose molecule is split into smaller volatile hydrocarbons in this temperature range. As the temperature increases, the decomposition process changes; in the range of 700-900°C the cellulose molecule is mainly decomposed into H2 gas, which is why this range is called the evolution of hydrogen. Interaction in this range starts (as in MAOM and MBOM) when a small, good-quality hole is formed. This is due to "direct evaporation" of the pulp via the evolution of hydrogen, and it can be seen in the spectrometer as a high-intensity peak of yellow light (in the range of 588-589 nm), corresponding to a temperature of ~1750°C. The pyrometer does not detect this peak, since it cannot detect the physical phase change from solid kraft pulp to gaseous compounds. As the interaction continues, the hypothesis is that three auto-ignition processes occur; the auto-ignition temperature of a substance is the lowest temperature at which it spontaneously ignites in a normal atmosphere without an external source of ignition, such as a flame or spark. Three auto-ignition processes appear in the range of MADM and MBDM: the auto-ignition temperature of hydrogen (H2) is 500°C, that of carbon monoxide (CO) is 609°C, and that of carbon (C) is 700°C. These processes lead to the formation of a plasma plume with strong emission of radiation in the visible range. The formation of this plasma plume can be seen as an increase of intensity in the wavelength range of ~475-652 nm, and the pyrometer shows the maximum temperature just after this ignition. The plasma plume is assumed to scatter the laser beam so that it interacts with a larger area of the dried kraft pulp than the actual beam cross-section, which also reduces the peak intensity. The results thus show that presumably scattered light of low peak intensity interacts with a large area of the hole edges, and because of the low peak intensity this interaction happens at a low temperature. The interaction between the laser beam and the dried kraft pulp therefore turns from the evolution of hydrogen to the evolution of hydrocarbons, which leads to the black colour of the hole edges.
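The mode nomenclature above reduces to a small decision rule: the focal plane position selects mechanism A or B, and the peak intensity relative to the limit peak intensity selects one-phase or dual-phase behaviour. A minimal sketch of that rule (the numeric limit peak intensity passed in is a placeholder; the thesis determines it experimentally for its own set-up):

```python
def interaction_mode(focal_plane_mm, peak_intensity, limit_peak_intensity):
    """Classify laser / dried-kraft-pulp interaction using the thesis's mode names.

    Mechanism A: average focal plane position -0.6 mm to 0.4 mm;
    mechanism B: 2.4 mm to 3.4 mm; 1.4 mm lies in the midzone between them.
    Below the limit peak intensity the interaction is one-phase (OM),
    above it dual-phase (DM).
    """
    if -0.6 <= focal_plane_mm <= 0.4:
        mechanism = "A"
    elif 2.4 <= focal_plane_mm <= 3.4:
        mechanism = "B"
    else:
        return "midzone/undefined"
    phase = "OM" if peak_intensity < limit_peak_intensity else "DM"
    return "M" + mechanism + phase

# With a placeholder limit peak intensity of 1.0 (arbitrary units):
print(interaction_mode(0.0, 0.5, 1.0))   # MAOM: mechanism A, one-phase
print(interaction_mode(3.0, 2.0, 1.0))   # MBDM: mechanism B, dual-phase
```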

Relevance: 90.00%

Abstract:

Dental injuries are common, and the incidence of maxillofacial injuries has increased over recent decades in Finland. Accidental injuries are the leading global cause of death among children over the age of one year and among adults under the age of 40. Significant resources and costs are needed for the treatment of these patients. Prevention is the most economical way to reduce trauma rates and costs, and for prevention it is crucial to know the prevalences, incidences and risk factors related to injuries. To improve the quality of treatment, it is essential to explore the causes, trauma mechanisms and management of trauma. These were the aims of this thesis. With a large epidemiological cohort study (5737 participants) it was possible to estimate the lifetime prevalence of, and risk factors for, dental trauma in the general population (Study I). The prevalence of dental fractures was 43% and the prevalence of dental luxations and avulsions was 14%. Male gender, a history of previous non-dental injuries, mental distress, overweight and high alcohol consumption were positively associated with the occurrence of dental injuries. Study II explored the differences in type and multiplicity of mandibular fractures in three countries (Canada, Finland and Kuwait). This retrospective study showed that the differences in mandibular fracture multiplicity and location are based on different etiologies and demographic patterns; these data can be exploited in planning measures to prevent traumatic facial fractures. The etiology, management and outcome of 63 pediatric skull base fracture patients (Study III) and 20 pediatric frontobasal fracture patients (Study IV) were explored. These retrospective studies showed that both skull base fractures and frontobasal fractures are rare injuries in childhood and that, although intracranial injuries and morbidity are frequent, permanent neurological or neuropsychological deficits are infrequent.
A systematic algorithm (Study V) for computed tomography (CT) image review was aimed at clinicians and radiologists to improve the assessment of patients with complex upper midface and cranial base trauma. The cohort study was cross-sectional, and data were collected in the Turku and Oulu University Hospitals. A novel image-reviewing algorithm was created to enhance the specificity of CT for the diagnosis of frontobasal fractures. The study showed that an image-reviewing algorithm standardizes the frontobasal trauma detection procedure and leads to better control and assessment. The purpose of the retrospective subcranial craniotomy study (Study VI) was to review the types of frontobasal fractures and their management, complications and outcome when the fracture is approached subcranially. The subcranial approach appears to be successful and to have a reasonably low complication rate, and it may be recommended as the technique of choice in multiple and the most complicated frontal base fractures where the endoscopic endonasal approach is not feasible.

Relevance: 90.00%

Abstract:

Objective of the study: The aim of this study is to understand, in a spatial context, the institutional implications of Abenomics, the contemporary economic reform taking place in Japan that is intended to finally end over two decades of economic malaise. As its theoretical perspective of choice, the study explores a synthesis of institutionalism as the main approach, complemented by economies of agglomeration in spatial economics, or New Economic Geography (NEG). The outcomes include a narrative with implications for future research, as well as possible future implications for the economy of Japan itself. The narrative seeks to depict the dialogue between public discourse and governmental communication in order to create a picture of how this phenomenon is being socially constructed. This is done by studying the official communications of the Cabinet along with public media commentary on the respective topics. The reform is studied with reference to the historical socio-cultural and economic evolution of Japan, which in turn is explored through a literature review; this serves to assess the unique institutional characteristics of Japan pertinent to the reform. Research method: This is a social and exploratory qualitative study, an institutional narrative case study. The methodological approach was kept practical: in addition to the literature review, a narrative, thematic content analysis with structural emphasis was used to construct the contemporary narrative based on the Cabinet communication. This was combined with practical analytic tools borrowed from critical discourse analysis, which were used to assess the implicit intertextual agenda within the sources. Findings: What appears to characterize the discourse is a status quo bias that comes in multiple forms. The bias is also coded into the institutions surrounding the reform, wherein stakeholders have vested interests in protecting the current state of affairs. This correlates with the uncertainty avoidance characteristic of Japan.
Japan heeds the international criticism to deregulate on a rhetorical level but, consistent with its history, the Cabinet's solutions appear increasingly bureaucratic. Hence, the imposed western information-age paradigm of liberal cluster agglomeration seems ill-suited to Japan, which lacks risk takers and a felicitous entrepreneurial culture. The Japanese, however, possess vast innovative potential ascribed to some institutional practices and traits but restrained by others. The derived conclusion is to study successful intrapreneur cases in the Japanese institutional setting as a potential benchmark for Japan-specific cluster agglomeration and a solution to the structural problems impeding its growth.

Relevance: 90.00%

Abstract:

The future of privacy in the information age is a highly debated topic. In particular, new and emerging technologies such as ICTs and cognitive technologies are seen as threats to privacy. This thesis explores images of the future of privacy among non-experts within the time frame from the present until the year 2050. The aims of the study are to conceptualise privacy as a social and dynamic phenomenon, to understand how privacy is conceptualised among citizens and to analyse ideal-typical images of the future of privacy using the causal layered analysis method. The theoretical background of the thesis combines critical futures studies and critical realism, and the empirical material is drawn from three focus group sessions held in spring 2012 as part of the PRACTIS project. From a critical realist perspective, privacy is conceptualised as a social institution which creates and maintains boundaries between normative circles and preserves the social freedom of individuals. Privacy changes when actors with particular interests engage in technology-enabled practices which challenge current privacy norms. The thesis adopts a position of technological realism as opposed to determinism or neutralism. In the empirical part, the focus group participants are divided into four clusters based on differences in privacy conceptions and perceived threats and solutions. The clusters are fundamentalists, pragmatists, individualists and collectivists. Correspondingly, four ideal-typical images of the future are composed: ‘drift to low privacy’, ‘continuity and benign evolution’, ‘privatised privacy and an uncertain future’, and ‘responsible future or moral decline’. The images are analysed using the four layers of causal layered analysis: litany, system, worldview and myth. Each image has its strengths and weaknesses. The individualistic images tend to be fatalistic in character while the collectivistic images are somewhat utopian. 
In addition, the images have two common weaknesses: lack of recognition of ongoing developments and simplistic conceptions of privacy based on a dichotomy between the individual and society. The thesis argues for a dialectical understanding of futures as present images of the future and as outcomes of real processes and mechanisms. The first steps in promoting desirable futures are the awareness of privacy as a social institution, the awareness of current images of the future, including their assumptions and weaknesses, and an attitude of responsibility where futures are seen as the consequences of present choices.

Relevance: 90.00%

Abstract:

Smartphones have become part and parcel of our lives, with mobility providing the freedom of not being bound by time and space. In addition, the number of smartphones produced each year is skyrocketing. However, this has also created discrepancies, or fragmentation, among devices and operating systems, which in turn has made it exceedingly hard for developers to deliver hundreds of similarly featured applications in various versions for market consumption. This thesis investigates whether cloud based mobile development platforms can mitigate and eventually eliminate fragmentation challenges. During this research, we selected and analyzed the most popular cloud based development platforms and tested their integrated cloud features. The research showed that cloud based mobile development platforms may be able to reduce mobile fragmentation and make it possible to deliver a mobile application for different platforms from a single codebase.

Relevance: 90.00%

Abstract:

The pentavalent antimonial (Sb⁵⁺) meglumine is the drug of choice for the treatment of cutaneous leishmaniasis (CL) in Brazil. Although the cardiotoxicity of high-dose, long-term Sb⁵⁺ therapy is well known, the use of low-dose, short-term meglumine has been considered safe and relatively free from significant cardiac effects. In order to investigate the cardiotoxicity of low-dose, short-term meglumine therapy in cutaneous leishmaniasis, 62 CL patients treated with meglumine were studied. A standard ECG was obtained before and immediately after the first cycle of treatment (15 mg Sb⁵⁺ kg⁻¹ day⁻¹). The electrocardiographic interpretation was carried out blindly by two investigators using the Minnesota Code. There were no significant differences in qualitative ECG variables before and after meglumine treatment. However, the corrected QT interval was clearly prolonged after antimonial therapy (420.0 vs 429.3 ms, P < 10⁻⁶). QTc augmentation exceeded 40 ms in 12 patients, 7 of whom developed marked QTc interval enlargement (>500 ms) after meglumine therapy. This previously unrecognized cardiac toxicity induced by short-term, low-dose antimonial therapy has potentially important clinical implications. Since sudden death has been related to QTc prolongation over 500 ms induced by high-dose antimonial therapy, routine electrocardiographic monitoring is probably indicated even in CL patients treated with short-term, low-dose meglumine schedules. Until further studies establish the interactions between pentavalent antimonials and other drugs, special care is recommended when using meglumine in combination with other medications, in particular drugs that also increase the QTc interval.

Relevance: 90.00%

Abstract:

The use of limiting dilution assay (LDA) for assessing the frequency of responders in a cell population is a method extensively used by immunologists. A series of studies addressing the statistical method of choice in an LDA have been published. However, none of these studies has addressed the point of how many wells should be employed in a given assay. The objective of this study was to demonstrate how a researcher can predict the number of wells that should be employed in order to obtain results with a given accuracy, and, therefore, to help in choosing a better experimental design to fulfill one's expectations. We present the rationale underlying the expected relative error computation based on simple binomial distributions. A series of simulated in machina experiments were performed to test the validity of the a priori computation of expected errors, thus confirming the predictions. The step-by-step procedure of the relative error estimation is given. We also discuss the constraints under which an LDA must be performed.
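Under the single-hit Poisson model usually assumed for an LDA (each well receives c cells, and a well is negative with probability e^(-fc) for responder frequency f), the expected relative error of the frequency estimate for a given number of wells can be checked by simulation, in the spirit of the abstract's in machina experiments. A sketch under those assumptions, with all numbers illustrative:

```python
import math
import random

def simulate_relative_error(f, cells_per_well, n_wells, n_assays=2000, seed=1):
    """Average relative error |f_hat - f| / f over many simulated LDAs.

    Single-hit Poisson model: a well is negative with probability exp(-f * c),
    and f is estimated from the fraction of negative wells as
    f_hat = -ln(negatives / n_wells) / c.
    """
    random.seed(seed)
    p_neg = math.exp(-f * cells_per_well)
    errors = []
    for _ in range(n_assays):
        negatives = sum(random.random() < p_neg for _ in range(n_wells))
        if negatives in (0, n_wells):
            continue  # all-positive or all-negative assays are uninformative; skip
        f_hat = -math.log(negatives / n_wells) / cells_per_well
        errors.append(abs(f_hat - f) / f)
    return sum(errors) / len(errors)

# More wells give a smaller expected relative error, so a target accuracy
# can be translated into a required number of wells before running the assay:
err_24 = simulate_relative_error(f=1e-4, cells_per_well=10_000, n_wells=24)
err_96 = simulate_relative_error(f=1e-4, cells_per_well=10_000, n_wells=96)
print(err_24 > err_96)  # True
```

In practice one would sweep n_wells until the simulated (or binomially computed) expected relative error falls below the accuracy one is willing to accept.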

Relevance: 90.00%

Abstract:

Genotyping techniques are valuable tools for the epidemiologic study of Staphylococcus aureus infections in the hospital setting. Pulsed-field gel electrophoresis (PFGE) is the current method of choice for S. aureus strain typing; however, the method is laborious and requires expensive equipment. In the present study, we evaluated the natural polymorphism of the genomic 16S-23S rRNA region for genotyping purposes by PCR-based ribotyping. Three primer pairs were tested to determine the size of the amplicons produced and to obtain better discrimination with agar gel electrophoresis and ethidium bromide staining. The resolution of the typing system was determined using sets of bacteria obtained from clinical specimens from a large tertiary care hospital. These included DNA from three samples obtained from a bacteremic patient, six strains with known and diverse PFGE patterns, and 88 strains collected over a 3-month period in the same hospital. Amplification patterns obtained from the samples from the same patient were identical, while the strains known from PFGE to be different produced three genotypes. Amplification of DNA from 61 methicillin-resistant isolates produced only one pattern. Methicillin-sensitive strains yielded a diversity of patterns, pointing to a truly polyclonal distribution throughout the hospital (22 unique patterns from 27 strains). Computer-based software can be used to differentiate among identifiable strains, given the low number of bands and good characterization of the PCR products. PCR-based ribotyping can thus be a useful technique for genotyping methicillin-sensitive S. aureus strains, but it is of limited value for methicillin-resistant strains.

Relevance: 90.00%

Abstract:

A total of 1712 strains of Haemophilus influenzae isolated from patients with invasive diseases were obtained from ten Brazilian states from 1996 to 2000. β-Lactamase production was assessed, and the minimum inhibitory concentrations (MIC) of ampicillin, chloramphenicol, ceftriaxone and rifampin were determined by broth microdilution in Haemophilus test medium. The prevalence of β-lactamase-producing strains ranged from 6.6 to 57.7%, with an overall prevalence of 18.4%. A high frequency of β-lactamase-mediated ampicillin resistance was observed in the Distrito Federal (25%), São Paulo (21.7%) and Paraná (18.5%). Of the 1712 strains analyzed, none was β-lactamase-negative, ampicillin-resistant. A total of 16.8% of the strains were resistant to chloramphenicol; 13.8% of these also presented resistance to ampicillin, and only 3.0% were resistant to chloramphenicol alone. All strains were susceptible to ceftriaxone and rifampin, with MIC90 values of 0.015 µg/ml and 0.25 µg/ml, respectively. Ceftriaxone is the drug of choice for empirical treatment of bacterial meningitis in pediatric patients who have not been screened for drug susceptibility. The emergence of drug resistance is a serious challenge for the management of invasive H. influenzae disease, which emphasizes the fundamental role of laboratory-based surveillance of antimicrobial resistance.

Relevance: 90.00%

Abstract:

When the offset of a visual stimulus (GAP condition) precedes the onset of a target, saccadic reaction times are reduced in relation to the condition with no offset (overlap condition) - the GAP effect. However, the existence of the GAP effect for manual responses is still controversial. In two experiments using both simple (Experiment 1, N = 18) and choice key-press procedures (Experiment 2, N = 12), we looked for the GAP effect in manual responses and investigated possible contextual influences on it. Participants were asked to respond to the imperative stimulus that would occur under different experimental contexts, created by varying the array of warning-stimulus intervals (0, 300 and 1000 ms) and conditions (GAP and overlap): i) intervals and conditions were randomized throughout the experiment; ii) conditions were run in different blocks and intervals were randomized; iii) intervals were run in different blocks and conditions were randomized. Our data showed that no GAP effect was obtained for any manipulation. The predictability of stimulus occurrence produced the strongest influence on response latencies. In Experiment 1, simple manual responses were shorter when the intervals were blocked (247 ms, P < 0.001) in relation to the other two contexts (274 and 279 ms). Despite the use of choice key-press procedures, Experiment 2 produced a similar pattern of results. A discussion addressing the critical conditions to obtain the GAP effect for distinct motor responses is presented. In short, our data stress the relevance of the temporal allocation of attention for behavioral performance.

Relevance: 90.00%

Abstract:

Allogeneic hematopoietic stem cell transplantation (AHSCT) is the treatment of choice for young patients with severe aplastic anemia (SAA). The combination of antithymocyte globulin (ATG) and cyclophosphamide (CY) is the most frequently used conditioning regimen for this disease. We performed this retrospective study in order to compare the outcomes of HLA-matched sibling donor AHSCT in 41 patients with SAA receiving cyclophosphamide plus ATG (ATG-CY, N = 17) or cyclophosphamide plus busulfan (BU-CY, N = 24). The substitution of BU for ATG was motivated by the high cost of ATG. There were no differences in the clinical features of the two groups, including age, gender, cytomegalovirus status, ABO match, interval between diagnosis and transplant, and number of total nucleated cells infused. No differences were observed in the time to neutrophil and platelet engraftment or in the risk of veno-occlusive disease and hemorrhage. However, there was a higher risk of mucositis in the BU-CY group (71 vs 24%, P = 0.004). There were no differences in the incidence of neutrophil and platelet engraftment, acute and chronic graft-versus-host disease, or transplant-related mortality. There was a higher incidence of late rejection in the ATG-CY group (41 vs 4%, P = 0.009). Although the ATG-CY group had a longer follow-up (101 months) than the BU-CY group (67 months, P = 0.04), overall survival was similar between the groups (69 vs 58%, respectively, P = 0.32). We conclude that the BU-CY combination is a feasible alternative to the conventional ATG-CY regimen in this population.