Abstract:
This article is motivated by a lung cancer study in which a regression model is involved and the response variable is too expensive to measure but the predictor variable can be measured easily at relatively negligible cost. This situation occurs quite often in medical studies, quantitative genetics, and ecological and environmental studies. In this article, using the idea of ranked-set sampling (RSS), we develop sampling strategies that can reduce cost and increase the efficiency of the regression analysis for the above-mentioned situation. The developed method is applied retrospectively to a lung cancer study, in which the interest is to investigate the association between smoking status and three biomarkers: polyphenol DNA adducts, micronuclei, and sister chromatid exchanges. Optimal sampling schemes with different optimality criteria such as A-, D-, and integrated mean square error (IMSE)-optimality are considered in the application. With set size 10 in RSS, the improvement of the optimal schemes over simple random sampling (SRS) is substantial. For instance, by using the optimal scheme with IMSE-optimality, the IMSEs of the estimated regression functions for the three biomarkers are reduced to about half of those incurred by using SRS.
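The balanced ranked-set sampling idea underlying these strategies can be illustrated with a small simulation. The population, set size and cycle count below are hypothetical, and ranking is done on the true values (perfect judgment ranking) for simplicity; this is a sketch of plain balanced RSS, not the article's optimal allocation schemes.

```python
import random

def ranked_set_sample(population, set_size, cycles, rng):
    """Balanced RSS: in each cycle, for every rank i draw an independent
    set of `set_size` units, rank the set, and measure only the unit
    holding rank i. Ranking is assumed cheap; measurement is expensive."""
    sample = []
    for _ in range(cycles):
        for i in range(set_size):
            judged = sorted(rng.sample(population, set_size))
            sample.append(judged[i])  # only this unit is actually measured
    return sample

rng = random.Random(42)
population = [rng.gauss(0.0, 1.0) for _ in range(10_000)]
rss = ranked_set_sample(population, set_size=10, cycles=20, rng=rng)
srs = rng.sample(population, len(rss))  # same measurement budget
```

Both designs make the same number of expensive measurements (200 here), but the RSS sample is spread more evenly over the distribution, which is what drives the variance and IMSE reductions reported above.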
Abstract:
The primary goal of a phase I trial is to find the maximally tolerated dose (MTD) of a treatment. The MTD is usually defined in terms of a tolerable probability, q*, of toxicity. Our objective is to find the highest dose with toxicity risk that does not exceed q*, a criterion that is often desired in designing phase I trials. This criterion differs from that of finding the dose with toxicity risk closest to q*, which is used in methods such as the continual reassessment method. We use the theory of decision processes to find optimal sequential designs that maximize the expected number of patients within the trial allocated to the highest dose with toxicity not exceeding q*, among the doses under consideration. The proposed method is very general in the sense that criteria other than the one considered here can be optimized and that optimal dose assignment can be defined in terms of patients within or outside the trial. It includes the continual reassessment method as an important special case. A numerical study indicates that the strategy compares favourably with other phase I designs.
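The targeting criterion itself, the highest dose whose toxicity does not exceed q*, is simple to state in code. The toxicity estimates below are hypothetical placeholders for whatever estimates the sequential design produces at an interim point.

```python
def best_dose(tox_estimates, q_star):
    """Index of the highest dose whose estimated toxicity probability
    does not exceed q_star; None if even the lowest dose is too toxic."""
    best = None
    for dose, p in enumerate(tox_estimates):
        if p <= q_star:
            best = dose  # keep climbing while toxicity stays acceptable
    return best

best_dose([0.05, 0.12, 0.22, 0.41], q_star=0.25)  # -> 2
```

By contrast, a closest-to-q* criterion, as in the continual reassessment method, may select a dose whose toxicity lies above q*; the criterion above never does.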
Abstract:
In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As the practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita ‘think, reflect, ponder, consider’. As a continuation of previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, those concerning more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages of increasing complexity, proceeding from univariate via bivariate to multivariate techniques. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical application to the phenomenon under study, thus extending the work of Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories are indeed associated with the use and selection of the studied THINK lexemes; however, a substantial proportion of these features is not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position.
In terms of the overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context using the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan’s (2007) and others’ (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context (represented as a feature cluster) that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
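The step from binary to polytomous logistic regression amounts to replacing the logistic function with a softmax over one coefficient vector per outcome class. The features and weights below are purely hypothetical, merely illustrating how such a model distributes probability over the four THINK verbs given a context feature vector.

```python
import math

def softmax(scores):
    """Turn one linear score per class into a probability distribution."""
    m = max(scores)  # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict_proba(weights, features):
    """weights: one coefficient vector per outcome class (here: per verb)."""
    scores = [sum(w * x for w, x in zip(wk, features)) for wk in weights]
    return softmax(scores)

# Hypothetical model: 3 binary context features -> four THINK verbs
weights = [
    [0.5, -0.2, 0.1],   # ajatella
    [-0.3, 0.4, 0.0],   # miettiä
    [0.1, 0.1, -0.5],   # pohtia
    [-0.2, -0.1, 0.3],  # harkita
]
probs = predict_proba(weights, [1.0, 0.0, 1.0])
```

With two classes and one weight vector fixed at zero, this reduces exactly to binary logistic regression, which is why the polytomous model is the natural extension of the dichotomous designs cited above.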
Abstract:
The single electron transfer-nitroxide radical coupling (SET-NRC) reaction has been used to produce multiblock polymers with high molecular weights in under 3 min at 50 °C by coupling a difunctional telechelic polystyrene (Br-PSTY-Br) with a dinitroxide. The well-known combination of dimethyl sulfoxide as solvent and Me6TREN as ligand facilitated the in situ disproportionation of Cu(I)Br to the highly active nascent Cu(0) species. This SET reaction allowed polymeric radicals to be rapidly formed from their corresponding halide end-groups. Trapping of these carbon-centred radicals by dinitroxides at close to diffusion-controlled rates resulted in high-molecular-weight multiblock polymers. Our results showed that the disproportionation of Cu(I) was critical in obtaining these ultrafast reactions, and confirmed that activation proceeded primarily through Cu(0). We took advantage of the reversibility of the NRC reaction at elevated temperatures to decouple the multiblock back to the original PSTY building block by capping the chain-ends with monofunctional nitroxides. These alkoxyamine end-groups were further exchanged with an alkyne monofunctional nitroxide (TEMPO–≡) and ‘clicked’ by a Cu(I)-catalyzed azide/alkyne cycloaddition (CuAAC) reaction with N3–PSTY–N3 to re-form the multiblocks. This final ‘click’ reaction, even after the consecutive decoupling and nitroxide-exchange reactions, still produced high-molecular-weight multiblocks efficiently. These SET-NRC reactions would have ideal applications in re-usable plastics and possibly in self-healing materials.
Abstract:
This study examines strategies used to translate various thematic and character-delineating allusions in two of Reginald Hill's detective novels, The Wood Beyond and On Beulah Height, and their Swedish translations Det mörka arvet and Dalen som dränktes. In this study, thematic allusions and allusions used in character delineation are regarded as intertextual networks. Intertextual networks comprise all the texts that are in one way or another embedded in a text, all the texts referred to in it, and even the texts somehow rejected from a text's own canon. Studying allusions as intertextual networks justifies paying minute attention to even the smallest of details; seen together, these little details form extensive networks of meaning that readers use to interpret the text. An allusion can be defined as a reference, often covert or indirect, to another text in a way that brings into the text some of the associations of that other text. A text is here understood broadly; hence, sources of allusions include all cultural texts, from literature and history to cinema and television serials. Allusions are culture-bound, and each culture tends to allude to its own cultural products; the set of transcultural allusions is therefore fairly small. Translation strategies are translatorial ways of solving translation problems, and being culture-bound, allusions are potential translation problems. In order to transmit the thoughts evoked by the allusions in source-text readers to target-text readers, translators may add guidance to the translated text. Often guidance is not added, which may leave themes or character delineation that are clear in the source text confusing or incomprehensible in the target text. However, norms in the target culture may not always allow translators the possibility of making the text comprehensible.
My analyses of translation strategies show that in the two translated novels studied, minimum change is a very frequently used strategy. This results in themes and character delineation losing some of the effect they have in the source texts. Perhaps surprisingly, the result is very much the same even where it is possible to discern that the two translators have had differing translation principles. Keywords: allusions, intertextuality, literary translation, translation strategies, norms, crime fiction, Hill, Reginald
Abstract:
This study sets out to provide new information about the interaction between abstract religious ideas and actual acts of violence in the early crusading movement. The sources are asked whether such a concept as religious violence can be singled out as an independent or distinguishable source of aggression at the moment of actual bloodshed. The analysis concentrates on the practitioners of sacred violence, the crusaders: their mental processing of the use of violence, their concept of the violent act, and the set of values and attitudes defining this concept. The scope of the study, the early crusading movement, covers the period from the late 1080s to the crusader conquest of Jerusalem on 15 July 1099. The research has been carried out by contextual reading of the relevant sources. Eyewitness reports are compared with texts produced by ecclesiastics in Europe; critical reading of the texts reveals both connecting ideas and interesting differences between them. The sources share a positive attitude towards crusading and were principally written to propagate the crusade institution and find new recruits. The emphasis of the study is on the interpretation of images: the sources are not asked what really happened in chronological order, but what the crusader understanding of reality was like. Fictional material can be even more crucial for understanding the crusading mentality. Crusader sources from around the turn of the twelfth century accept violent encounters with non-Christians on the grounds of external hostility directed towards the Christian community. The enemies of Christendom can be identified with non-Christians living outside Christian society (Muslims), non-Christians living within Christian society (Jews), or Christian heretics. Western Christians are described as both victims and avengers of the surrounding forces of diabolical evil.
Although the ideal of universal Christianity and the gradual eradication of the non-Christian is present, the practical means of achieving a united Christendom are not discussed. The objective of crusader violence was thus entirely Christian: the punishment of the wicked and the restoration of Christian morals and the divine order. The means used to achieve these objectives, however, were not. Given the scarcity of written regulations concerning the use of force in bello, perceptions concerning the practical use of violence were drawn from a multitude of notions comprising an adaptable network of secular and ecclesiastical, pre-Christian and Christian traditions. Though essentially ideological and often religious in character, the early crusader concept of the practice of violence was not exclusively rooted in Christian thought. The main conclusion of the study is that a definable crusader ideology of the use of force existed by 1100. The crusader image of violence involved several levels of thought. Predominantly, violence denotes a means of achieving higher spiritual rewards: eternal salvation and immortal glory.
Abstract:
Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We elucidate that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, casting a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems nonidentical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed-form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore make the recommendation that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
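The ambiguity that makes a corrective transform necessary can be seen in a toy factorisation: any invertible G maps one valid (projection, structure) pair to another equally consistent pair. The small matrices below are arbitrary illustrative values, not data from the paper.

```python
def matmul(A, B):
    """Plain-Python matrix product of two nested lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

M = [[1.0, 2.0], [0.0, 1.0], [3.0, -1.0]]   # "projection" factor (3x2)
S = [[1.0, 0.0, 2.0], [0.5, 1.0, -1.0]]     # "structure" factor (2x3)
W = matmul(M, S)                            # observed measurement matrix

# Any invertible corrective transform G yields another valid factorisation:
# W = (M G)(G^-1 S), so a factorisation alone cannot pin down M and S.
G = [[2.0, 1.0], [1.0, 1.0]]        # det = 1, so invertible
Ginv = [[1.0, -1.0], [-1.0, 2.0]]   # its exact inverse
W2 = matmul(matmul(M, G), matmul(Ginv, S))  # identical to W
```

Fixing G (or, in the non-rigid case, several blocks of it) is exactly the "corrective transform" problem the abstract describes; the closed-form bootstrap amounts to recovering the remaining blocks once a partial solution is known.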
Abstract:
The Gesture of Exposure: On the Presentation of the Work of Art in the Modern Art Exhibition. The topic of this dissertation is the presentation of art works in the modern art exhibition as the established and conventionalized form of art encounter. It investigates the possibility of a theorization of the art exhibition as a separate object of research, and attempts to examine the relationship between the art work and its presentation in a modern art exhibition. The study takes its point of departure in the area vaguely defined as exhibition studies, and in the lack of a general problematization of the analytical tools used for closer examination of the modern art exhibition. Another lacking aspect is a closer consideration of what happens to the work of art when it is exposed in an art exhibition. The aim of the dissertation is to find a set of concepts that can be used for further theorization. The art exhibition is here treated, on the one hand, as an act of exposure, as a showing gesture; on the other hand, it is seen as a spatiality, a space that is produced in the act of showing. Both aspects are seen to be intimately involved in knowledge production. The dissertation is divided into four parts, in which different aspects of the art exhibition are analyzed using different theoretical approaches. The first part uses the archaeological model of Michel Foucault and discusses the exhibition as a discursive formation based on communicative activity. The second part analyses the derived concepts of gesture and space. This leads to the proposition of three metaphorical spatialities (the frame, the agora and the threshold), which are seen as providing a possibility for a further extension of the theory of exhibitions. The third part extends the problematization of the relationship between the individual work of art and its exposure through the ideas of Walter Benjamin and Maurice Blanchot.
The fourth part carries out a close reading of three presentations from the modern era in order to further examine the relationship between the work of art and its presentation, using the tools that have been developed during the study. In the concluding section, it is possible to see clearer borderlines and conditions for the development of an exhibition theory. The concepts that have been analysed and developed into tools are shown to be useful, and the examples take the discussion into a consideration of the altered premises for the encounter with the postmodern work of art.
Abstract:
To improve the sustainability and environmental accountability of the banana industry, there is a need to develop a set of soil health indicators that integrate physical, chemical and biological soil properties. These indicators would allow banana growers, extension workers and researchers to improve soil health management practices. To determine changes in soil properties due to the cultivation of bananas, a paired-site survey was conducted comparing soil properties under conventional banana systems with those under less intensively managed vegetation systems, such as pastures and forest. Measurements were made of physical, chemical and biological soil properties at seven locations in tropical and sub-tropical banana-producing areas. Soil nematode community composition was used as a bioindicator of the biological properties of the soil. Soils under conventional banana production tended to have greater soil bulk density, less soil organic carbon (C) (both total C and labile C), greater exchangeable cations, higher extractable P, greater numbers of plant-parasitic nematodes and less nematode diversity, relative to less intensively managed plant systems. The organic banana production systems at two locations had greater labile C relative to conventional banana systems, but there was no significant change in nematode community composition. There were significant interactions between physical, chemical and nematode community measurements in the soil, particularly with soil C measurements, confirming the need for a holistic set of indicators to aid soil management. There was no single indicator of soil health for the Australian banana industry; rather, a set of soil health indicators that would allow the measurement of soil improvement should include bulk density, soil C, pH, EC, total N, extractable P, ECEC and soil nematode community structure.
Abstract:
A new approach is proposed to solve for the growth as well as the movement of hydrogen bubbles during solidification in aluminum castings. A level-set methodology has been adopted to handle this multiphase phenomenon. A microscale domain is considered, and the growth and movement of hydrogen bubbles in this domain have been studied. The growth characteristics of hydrogen bubbles have been evaluated under free-growth conditions in a melt with a hydrogen input caused by solidification occurring around the microdomain.
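The essence of the level-set representation can be sketched analytically: the bubble interface is the zero contour of a signed distance function, and both growth (radius change) and movement (centre advection) become updates of that function. The growth speed and melt velocity below are arbitrary illustrative values, not the paper's solidification-driven model.

```python
import math

def phi(x, y, cx, cy, r):
    """Signed distance level-set for a circular bubble of radius r
    centred at (cx, cy): negative inside the bubble, zero on the
    interface, positive outside in the melt."""
    return math.hypot(x - cx, y - cy) - r

def phi_t(x, y, t, cx0=0.0, cy0=0.0, r0=0.1, v=0.02, ux=0.05, uy=0.0):
    """Toy evolution: the bubble grows at constant speed v while its
    centre is advected with a uniform melt velocity (ux, uy)."""
    return phi(x, y, cx0 + ux * t, cy0 + uy * t, r0 + v * t)
```

A numerical level-set solver would instead evolve phi on a grid via a transport equation with the interface speed; the analytic form above just shows how a point can pass from the melt (phi > 0) into the bubble (phi < 0) as the bubble grows and moves.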
Abstract:
Purpose This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials Using the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system's NTCP algorithm and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all patients (n=12) before and after set-up error simulations. Conclusions Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
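The LKB model used here reduces the dose-volume histogram to an effective uniform dose and passes it through a standard normal CDF. A minimal sketch follows; the parameter values in the example call are illustrative placeholders, not the study's fitted lung parameters.

```python
import math

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.
    doses[i] is the dose (Gy) to fractional volume volumes[i]
    (volumes sum to 1); td50, m and n are the organ's LKB parameters:
    td50 = uniform dose giving 50% complication probability,
    m = slope parameter, n = volume-effect parameter."""
    d_eff = sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (d_eff - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# A uniform whole-organ dose equal to TD50 gives, by construction, NTCP = 0.5
lkb_ntcp([24.5], [1.0], td50=24.5, m=0.18, n=0.87)  # -> 0.5
```

Set-up errors shift the DVH bins (the doses list), which moves d_eff and hence the NTCP, which is how the isocentric-shift simulations above translate into complication probabilities.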
Abstract:
Electrical and magnetic properties of La3Ni2O7 and La4Ni3O10 have been investigated in comparison with those of La2NiO4, LaNiO3, and LaSrNiO4. The results suggest an increasingly three-dimensional character across the homologous series Lan+1NinO3n+1 with increasing n. Accordingly, the electrical resistivity decreases in the order La3Ni2O7, La4Ni3O10, LaNiO3, and this trend is suggested to be related to the percolation threshold. The magnetic properties of these oxides also show some interesting trends across the series.
Abstract:
OBJECTIVE Corneal confocal microscopy is a novel diagnostic technique for the detection of nerve damage and repair in a range of peripheral neuropathies, in particular diabetic neuropathy. Normative reference values are required to enable clinical translation and wider use of this technique. We have therefore undertaken a multicenter collaboration to provide worldwide age-adjusted normative values of corneal nerve fiber parameters. RESEARCH DESIGN AND METHODS A total of 1,965 corneal nerve images from 343 healthy volunteers were pooled from six clinical academic centers. All subjects underwent examination with the Heidelberg Retina Tomograph corneal confocal microscope. Images of the central corneal subbasal nerve plexus were acquired by each center using a standard protocol and analyzed by three trained examiners using manual tracing and semiautomated software (CCMetrics). Age trends were established using simple linear regression, and normative corneal nerve fiber density (CNFD), corneal nerve fiber branch density (CNBD), corneal nerve fiber length (CNFL), and corneal nerve fiber tortuosity (CNFT) reference values were calculated using quantile regression analysis. RESULTS There was a significant linear age-dependent decrease in CNFD (-0.164 no./mm² per year for men, P < 0.01, and -0.161 no./mm² per year for women, P < 0.01). There was no change with age in CNBD (0.192 no./mm² per year for men, P = 0.26, and -0.050 no./mm² per year for women, P = 0.78). CNFL decreased in men (-0.045 mm/mm² per year, P = 0.07) and women (-0.060 mm/mm² per year, P = 0.02). CNFT increased with age in men (0.044 per year, P < 0.01) and women (0.046 per year, P < 0.01). Height, weight, and BMI did not influence the 5th percentile normative values for any corneal nerve parameter.
CONCLUSIONS This study provides robust worldwide normative reference values for corneal nerve parameters to be used in research and clinical practice in the study of diabetic and other peripheral neuropathies.
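The age trends reported above come from simple linear regression of each nerve parameter on age. A pure-Python least-squares sketch follows; the CNFD readings are hypothetical values chosen to mimic the reported slope of roughly -0.16 no./mm² per year, not data from the study.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical CNFD readings (no./mm^2) against age in years
ages = [20, 30, 40, 50, 60, 70]
cnfd = [36.0, 34.5, 32.8, 31.2, 29.9, 28.0]
slope, intercept = linear_fit(ages, cnfd)  # slope ~ -0.16 no./mm^2 per year
```

The study's normative reference values themselves were derived by quantile regression rather than least squares, since percentile bounds, not the conditional mean, define the normal range.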
Abstract:
This paper describes an algorithm to compute the union, intersection and difference of two polygons using a scan-grid approach. In this method, the screen is divided into cells and the algorithm is applied to each cell in turn; the output from all the cells is then integrated to yield a representation of the output polygon. In most cells no computation is required, which makes the algorithm fast. The algorithm has been implemented for polygons but can be extended to polyhedra as well. It is shown to take O(N) time in the average case, where N is the total number of edges of the two input polygons.
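The cell-by-cell idea can be illustrated with a crude rasterised variant: classify each cell (here reduced to its centre point) against both polygons and combine the answers with the requested set operation. This is a didactic sketch of the scan-grid principle only; the paper's algorithm produces true polygonal output per cell rather than a cell classification.

```python
def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test for a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            xint = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xint:
                inside = not inside
    return inside

def grid_boolean(poly_a, poly_b, op, cells=20, extent=10.0):
    """Rasterised Boolean op: classify each cell centre against both
    polygons and keep the cells selected by the set operation."""
    ops = {"union": lambda a, b: a or b,
           "intersection": lambda a, b: a and b,
           "difference": lambda a, b: a and not b}
    step = extent / cells
    hits = set()
    for i in range(cells):
        for j in range(cells):
            cx, cy = (i + 0.5) * step, (j + 0.5) * step
            if ops[op](point_in_polygon(cx, cy, poly_a),
                       point_in_polygon(cx, cy, poly_b)):
                hits.add((i, j))
    return hits

square_a = [(1, 1), (6, 1), (6, 6), (1, 6)]
square_b = [(4, 4), (9, 4), (9, 9), (4, 9)]
inter = grid_boolean(square_a, square_b, "intersection")
```

The speed argument carries over: cells entirely inside or outside both polygons need no edge computation at all, which is why the average cost stays linear in the total edge count.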