930 results for Precision-recall analysis


Relevance:

30.00%

Abstract:

A high-performance liquid chromatographic method using the polar organic mode was developed to analyze albendazole (ABZ), albendazole sulfone (ABZSO(2)) and the chiral, active metabolite albendazole sulfoxide (ABZSOX, ricobendazole), and was further applied in stereoselective fungal biotransformation studies. The chromatographic separation was performed on a Chiralpak AS column using acetonitrile:ethanol (97:3, v/v) plus 0.2% triethylamine and 0.2% acetic acid as the mobile phase at a flow rate of 0.5 mL min(-1). Hollow-fiber liquid-phase microextraction was employed for sample preparation. The method was linear over the concentration ranges of 25-5000 ng mL(-1) for each ABZSOX enantiomer, 200-10,000 ng mL(-1) for ABZ and 50-1000 ng mL(-1) for the ABZSO(2) metabolite, with correlation coefficients (r) > 0.9934. The mean recoveries for ABZ, rac-ABZSOX and ABZSO(2) were, respectively, 9%, 33% and 20%, with relative standard deviations below 10%. Within-day and between-day precision and accuracy for these analytes were studied at three concentration levels and were lower than 15%. This study opens the possibility of using fungi to obtain the active metabolite ricobendazole. Nigrospora sphaerica (Sacc.) E. W. Mason (5567), Pestalotiopsis foedans (VR8), Papulaspora immersa Hotson (SS13) and Mucor rouxii were able to stereoselectively metabolize ABZ into its chiral metabolite. Among them, the fungus Mucor rouxii was the most efficient in the production of (+)-ABZSOX. (C) 2011 Elsevier B.V. All rights reserved.
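The precision figures quoted above (relative standard deviations below 10-15%) follow the standard definition RSD = 100·s/mean over replicate measurements. A minimal sketch, using hypothetical replicate peak areas rather than the study's data:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical replicate peak areas for one analyte at one concentration level
replicates = [102.0, 98.5, 101.2, 99.8, 100.5]
print(rsd_percent(replicates) < 10.0)  # precision acceptance in the abstract's terms
```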

Relevance:

30.00%

Abstract:

We performed a macroscopic and microscopic study of the tongues of common opossums, Didelphis marsupialis, from South America. We studied two males and two females. We collected morphometric data on the tongue with precision calipers. For the light microscopy and scanning electron microscopy analyses, we fixed tissue fragments in 10% formaldehyde and 2.5% glutaraldehyde, respectively. The opossum tongues averaged 5.87 +/- 0.20 cm in length, 3.27 +/- 0.15 cm in width at the lingual body, and 3.82 +/- 0.15 cm in width at the root. The mean thickness of the lingual body was 1.8 +/- 0.1 cm, and the thickness of the root was 3.82 +/- 0.15 cm. Sharp filiform papillae were scattered across the entire tongue; conical filiform papillae occurred on the lingual body and tongue tip; fungiform papillae were scattered among the filiform papillae on the lingual body and tongue tip; and there were three vallate papillae at the root of the tongue. We found two strands of papillary projections in the tongue root. Despite the low variability observed in the lingual papillae, the morphological data obtained in this study may be related to the opossum's diverse food habits and the extensive geographic distribution of the species throughout America. Microsc. Res. Tech. 2012. (c) 2012 Wiley Periodicals, Inc.

Relevance:

30.00%

Abstract:

A high-performance liquid chromatography (HPLC) method for the enantioselective determination of donepezil (DPZ), 5-O-desmethyl donepezil (5-ODD), and 6-O-desmethyl donepezil (6-ODD) in Czapek culture medium, to be applied to biotransformation studies with fungi, is described for the first time. The HPLC analysis was carried out on a Chiralpak AD-H column with hexane/ethanol/methanol (75:20:5, v/v/v) plus 0.3% triethylamine as the mobile phase and UV detection at 270 nm. Sample preparation was carried out by liquid-liquid extraction using ethyl acetate as the extraction solvent. The method was linear over the concentration range of 100-10,000 ng mL(-1) for each enantiomer of DPZ (r ≥ 0.9985) and of 100-5,000 ng mL(-1) for each enantiomer of 5-ODD (r ≥ 0.9977) and 6-ODD (r ≥ 0.9951). Within-day and between-day precision and accuracy, evaluated by relative standard deviations and relative errors, respectively, were lower than 15% for all analytes. The validated method was used to assess DPZ biotransformation by the fungi Beauveria bassiana American Type Culture Collection (ATCC) 7159 and Cunninghamella elegans ATCC 10028B. Using the fungus B. bassiana ATCC 7159, a predominant formation of (R)-5-ODD was observed, while with the fungus C. elegans ATCC 10028B, DPZ was biotransformed to (R)-6-ODD with an enantiomeric excess of 100%.

Relevance:

30.00%

Abstract:

This work investigated the effects of the frequency and precision of feedback on the learning of a dual-motor task. One hundred and twenty adults were randomly assigned to six groups defined by knowledge of results (KR) frequency (100%, 66% or 33%) and precision (specific or general). In the stabilization phase, participants performed the dual task (a combination of linear positioning and manual force control) with the provision of KR. Ten non-KR adaptation trials were then performed for the same task, but with the introduction of an opposing electromagnetic traction force. The analysis showed a significant main effect for frequency of KR: participants who received KR in 66% of the stabilization trials showed better adaptation performance than those who received it in 100% or 33% of the trials. This finding reinforces the idea that there is an optimal level of information, neither too high nor too low, for motor learning to be effective.

Relevance:

30.00%

Abstract:

Hierarchical multi-label classification is a complex classification task in which the classes are hierarchically structured and each example may simultaneously belong to more than one class at each hierarchical level. In this paper, we extend our previous works, in which we investigated a new local classification method that incrementally trains a multi-layer perceptron for each level of the classification hierarchy. Predictions made by the neural network at a given level are used as inputs to the neural network responsible for prediction at the next level. We compare the proposed method with one state-of-the-art global method and two decision-tree induction methods, using several hierarchical multi-label classification datasets. We perform a thorough experimental analysis, showing that our method obtains results competitive with a robust global method with regard to both precision and recall.
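The data flow of the level-wise cascade described above, where each level's prediction vector is appended to the inputs of the next level's classifier, can be sketched as follows. The per-level classifiers here are hypothetical linear stand-ins for the trained multi-layer perceptrons, used only to show the incremental wiring:

```python
# Sketch of the local cascaded scheme: the prediction produced at level i
# is appended to the input features of the classifier at level i+1.
# The "classifiers" are hypothetical linear scorers, not trained MLPs.

def linear_scorer(weights, bias):
    def predict(features):
        s = sum(w * x for w, x in zip(weights, features)) + bias
        return [1.0 if s > 0 else 0.0]  # one class per level, for brevity
    return predict

def cascade_predict(levels, features):
    """Run the per-level classifiers, feeding each level's output forward."""
    outputs = []
    current = list(features)
    for clf in levels:
        pred = clf(current)
        outputs.append(pred)
        current = current + pred  # augment inputs for the next level
    return outputs

level1 = linear_scorer([1.0, -1.0], 0.0)
level2 = linear_scorer([0.5, 0.5, 2.0], -1.0)  # sees 2 features + level-1 output
print(cascade_predict([level1, level2], [2.0, 1.0]))
```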

Relevance:

30.00%

Abstract:

INTRODUCTION: With the aim of searching for genetic factors associated with the response to an immune treatment based on autologous monocyte-derived dendritic cells pulsed with autologous inactivated HIV, we performed an exome analysis, screening more than 240,000 putative functional exonic variants in 18 HIV-positive Brazilian patients who underwent the immune treatment. METHODS: Exome analysis was performed using the ILLUMINA Infinium HumanExome BeadChip. The zCall algorithm allowed us to recall rare variants. Quality control and SNP-centred analysis were done with the GenABEL R package. An in-house implementation of the Wang method permitted gene-centred analysis. RESULTS: The CCR4-NOT transcription complex, subunit 1 (CNOT1) gene (16q21) showed the strongest association with the modification of the response to the therapeutic vaccine (p=0.00075). The CNOT1 SNP rs7188697 A/G was significantly associated with DC treatment response. The presence of a G allele indicated poor response to the therapeutic vaccine (p=0.0031; OR=33.00; CI=1.74-624.66), and the SNP behaved in a dominant model (A/A vs. A/G+G/G: p=0.0009; OR=107.66; 95% CI=3.85-3013.31), with the A/G genotype present only in weak/transient responders, conferring susceptibility to poor response to the immune treatment. DISCUSSION: CNOT1 is known to be involved in the control of mRNA deadenylation and mRNA decay. Moreover, CNOT1 has recently been described as being involved in the regulation of inflammatory processes mediated by tristetraprolin (TTP). The TTP-CCR4-NOT complex (CNOT1 is the binding site for TTP in the CCR4-NOT complex) has been reported to interfere with HIV replication through post-transcriptional control. Therefore, we can hypothesize that genetic variation occurring in the CNOT1 gene could impair the TTP-CCR4-NOT complex, thus interfering with HIV replication and/or the host immune response.
CONCLUSIONS: Being aware that our findings are exclusive to the 18 patients studied and need replication, and that the genetic variant of the CNOT1 gene, localized in intron 3, has no known functional effect, we propose a novel potential candidate locus for the modulation of the response to the immune treatment, and open a discussion on the necessity of considering the host genome as another variable to be evaluated when designing an immune therapy study.
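The effect sizes reported above (e.g. OR=33.00; CI=1.74-624.66) are odds ratios with confidence intervals; the very wide intervals reflect the small sample of 18 patients. A hedged sketch of a Woolf-type computation on a hypothetical 2x2 genotype-by-response table (not the study's actual counts):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf-type 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls.
    A 0.5 continuity correction is applied when any cell is zero."""
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: G-allele carriers vs. non-carriers by response status
or_, lo, hi = odds_ratio_ci(10, 2, 5, 20)
print(f"OR={or_:.1f} (95% CI {lo:.2f}-{hi:.2f})")
```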

Relevance:

30.00%

Abstract:

Type Ia supernovae have been successfully used as standardized candles to study the expansion history of the Universe. In the past few years, these studies led to the exciting result of an accelerated expansion caused by the repelling action of some sort of dark energy. This result has been confirmed by measurements of the cosmic microwave background radiation, the large-scale structure, and the dynamics of galaxy clusters. The combination of all these experiments points to a “concordance model” of the Universe with flat large-scale geometry and a dominant component of dark energy. However, several points related to supernova measurements need careful analysis in order to establish the validity of the concordance model beyond doubt. As the amount and quality of data increase, the need to control possible systematic effects which may bias the results becomes crucial. Also important is improving our knowledge of the physics of supernova events to secure, and possibly refine, their calibration as standardized candles. This thesis addresses some of those issues through the quantitative analysis of supernova spectra. The stress is put on a careful treatment of the data and on the definition of spectral measurement methods. The comparison of measurements for a large set of spectra from nearby supernovae is used to study their homogeneity and to search for spectral parameters which may further refine the calibration of the standardized candle. One such parameter is found to reduce the dispersion in the distance estimation of a sample of supernovae to below 6%, a precision comparable with the current lightcurve-based calibration, and obtained in an independent manner. Finally, the comparison of spectral measurements from nearby and distant objects is used to test for possible evolution with cosmic time of the intrinsic brightness of type Ia supernovae.

Relevance:

30.00%

Abstract:

Traceability is often perceived by food industry executives as an additional cost of doing business, one to be avoided if possible. However, a traceability system can in fact ensure compliance with regulatory requirements, increase food safety and recall performance, improve marketing performance and improve supply chain management. Traceability thus affects the business performance of firms in terms of the costs and benefits determined by traceability practices. Costs and benefits are affected by factors such as firms' characteristics, the level of traceability and, lastly, the costs and benefits perceived prior to traceability implementation. This thesis was undertaken to understand how these factors are linked and how they affect the outcome of costs and benefits. Analysis of the results of a plant-level survey of the Italian ichthyic processing industry revealed that processors generally adopt various levels of traceability, while government support appears to increase the level of traceability and both the expected and actual costs and benefits. None of the firms' characteristics, with the exception of government support, influences costs or the level of traceability. Only firm size and the level of QMS certification are linked with benefits, while the precision of traceability increases benefits without affecting costs. Finally, traceability practices appear to be driven by requests from “external” stakeholders such as government, authorities and customers rather than by “internal” factors (e.g. improving firm management), while the traceability system does not provide any added value from the market in terms of a price premium or an increase in market share.

Relevance:

30.00%

Abstract:

Precision horticulture and spatial analysis applied to orchards are a growing and evolving part of precision agriculture technology. The aim of this discipline is to reduce production costs by monitoring and analysing orchard-derived information to improve crop performance in an environmentally sound manner. Georeferencing and geostatistical analysis, coupled with point-specific data mining, make it possible to devise and implement management decisions tailored to the single orchard. Potential applications include the opportunity to verify in real time, along the season, the effectiveness of cultural practices in achieving production targets in terms of fruit size, number, yield and, in the near future, fruit quality traits. These data will impact not only the pre-harvest stage; their effect will extend to the post-harvest sector of the fruit chain. Chapter 1 provides an updated overview of precision horticulture, while in Chapter 2 a preliminary spatial statistical analysis of the variability in apple orchards, before and after manual thinning, is provided; an interpretation of this variability and of how it can be managed to maximize orchard performance is offered. In Chapter 3 a stratification of spatial data into management classes to interpret and manage spatial variation in the orchard is undertaken. An inverse model approach is also applied to verify whether crop production explains environmental variation. In Chapter 4 an integration of the techniques adopted before is presented, and a new key for reading the information gathered within the field is offered. The overall goal of this dissertation was to probe the feasibility, desirability and effectiveness of a precision approach to fruit growing, following the lines of other areas of agriculture that already adopt this management tool. As existing applications of precision horticulture have shown, crop specificity is an important factor to be accounted for.
This work focused on apple because of its importance in the area where the work was carried out, and worldwide.
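As a toy illustration of the georeferenced, point-specific analysis described in this chapter overview, the sketch below interpolates a per-tree measurement across an orchard with inverse-distance weighting, one of the simplest geostatistical interpolators (the thesis itself may use different methods; positions and values here are hypothetical):

```python
import math

def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from georeferenced
    point samples given as [(xi, yi, value), ...]."""
    num = den = 0.0
    for xi, yi, v in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0.0:
            return v  # exactly at a sample point
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Hypothetical per-tree fruit counts at four georeferenced positions (metres)
trees = [(0.0, 0.0, 120.0), (10.0, 0.0, 80.0), (0.0, 10.0, 100.0), (10.0, 10.0, 60.0)]
print(round(idw(trees, 5.0, 5.0), 1))  # 90.0: all four trees are equidistant
```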

Relevance:

30.00%

Abstract:

«Fiction of frontier»: phenomenology of an open form/voice. Francesco Giustini's PhD dissertation fits into a genre of research usually neglected by literary criticism which is nevertheless arousing much interest in recent years: the relationship between Literature and Space. In this context, the specific issue of his work is the category of the Frontier, including its several implications for twentieth-century fiction. The preliminary step, at the beginning of the first section of the dissertation, is a semantic analysis: with precision, Giustini describes the meaning of the word "frontier", here declined in a multiplicity of cultural, political and geographical contexts: from the American frontier of the pioneers who headed West, to the exotic frontiers of the world with which imperialistic colonization came into contact; from semi-uninhabited areas like deserts, highlands and virgin forests, to the ethnic frontiers between Indian and white people in South America, and to the internal frontiers of countries, such as those between the District and the Capital City, or the Centre and the Outskirts. In the next step, Giustini focuses on a real "myth of the frontier" able to nourish the cultural and literary imagination. Indeed, literature has told the frontier and chosen it as the scenery for many stories; especially in the 20th century, it made the frontier a problematic space in light of events and changes that transformed the perception of space and our relationship with it. The dissertation therefore proposes a critical category, tracing the hallmarks of a specific literary phenomenon defined as the "Fiction of the Frontier", present in many literary traditions during the 20th century.
The term "Fiction" (not "Literature" or "Poetics") does not define a genre but rather a "procedure", focusing on a constant issue pointed out by the texts examined in this work: the strong call to the act of narration and to its oral traditions. The "Fiction of the Frontier" is perceived as an approach to the world, a way of watching and feeling objects, an emotion that is lived and told through the story, a story in which the narrator, through his body and his voice, takes the role of the witness. The following parts, which are analytic in style, are constructed on the basis of this theoretical and methodological reflection. The second section gives a wide range of examples in which we can find the figure and the myth of the frontier, through textual analyses ranging over several literary traditions: from monographic chapters (García Márquez, Callado, McCarthy) to comparative readings of pairs of texts (Calvino and Vargas Llosa, Buzzati and Coetzee, Arguedas and Rulfo). The selection of texts is introduced so as to underline a particular aspect or form of the frontier in each reading. This section is articulated into thematic entries which recall actions that can be taken in the ambiguous and liminal space of the frontier (to communicate, to wait, to "trans-culturate", to imagine, to live in, to not-live in). In this phenomenology, the frontier comes to light as a physical and concrete element or as a cultural, imaginary, linguistic, ethnic and existential category. Finally, the third section is centred on a more defined and elaborate analysis of two authors considered fundamental for the comprehension of the "Fiction of the Frontier": Joseph Conrad and João Guimarães Rosa. Even if they are very different, belonging to unlike literary traditions, these two authors show many connections, which are highlighted by the comparative analysis.
Conrad is perhaps the first author to understand the feeling of the frontier, freeing himself from the adventure romance and from the exotic nineteenth-century tradition. João Guimarães Rosa, in his turn, is the great narrator of the Brazilian and South American frontier; he is the man of the sertão and of the endless spaces of the centre of Brazil. His production is strongly linked to that of the author of Heart of Darkness.

Relevance:

30.00%

Abstract:

3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and avoiding the soft tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with in-silico preliminary studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolution, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, (f) user errors. The effect of each criticality was quantified and verified with an in-vivo preliminary study on the elbow joint. The dominant source of error was identified in the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process.
To solve this problem, two different approaches were followed. To increase the convergence basin around the optimal pose, the local approach used sequential alignments of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as for methodological research studies; the mono-planar analysis may be enough for clinical applications where analysis time and cost may be an issue. A further reduction of the user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed, which halved the analysis time by delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semi-automatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study on foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
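The local strategy of aligning the six degrees of freedom sequentially, in order of sensitivity, can be illustrated with a toy coordinate-descent sketch. The cost function, poses, and ordering below are hypothetical; the actual method scores poses against projected silhouettes of bone models:

```python
def sequential_align(cost, pose, order, step=1.0, shrink=0.5, iters=20):
    """Optimize one degree of freedom at a time, in a given sensitivity
    order, shrinking the step size after each full pass."""
    pose = list(pose)
    s = step
    for _ in range(iters):
        for i in order:
            improved = True
            while improved:
                improved = False
                for delta in (s, -s):
                    trial = list(pose)
                    trial[i] += delta
                    if cost(trial) < cost(pose):
                        pose, improved = trial, True
                        break
        s *= shrink
    return pose

# Hypothetical 6-DOF cost: squared distance from a known "true" pose
# (in practice, out-of-plane parameters would dominate the sensitivity order).
true_pose = [1.0, -2.0, 0.5, 10.0, -5.0, 2.0]
cost = lambda p: sum((a - b) ** 2 for a, b in zip(p, true_pose))
estimate = sequential_align(cost, [0.0] * 6, order=[3, 4, 5, 0, 1, 2])
print([round(x, 2) for x in estimate])  # converges to the true pose
```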

Relevance:

30.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Due to both the different observing sites involved and the huge number of frames expected (≃100,000), it is essential to maintain maximum homogeneity in data quality, acquisition and treatment; particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and eventually correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
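The aperture photometry step in such a pipeline reduces, at its core, to summing counts inside a circular aperture and subtracting a sky level estimated from a surrounding annulus. A minimal sketch on a synthetic frame (the real pipeline works on pre-reduced CCD images with many more refinements):

```python
import math
import statistics

def aperture_photometry(image, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted aperture sum: pixels within r_ap of (cx, cy),
    with the sky estimated as the median of an annulus [r_in, r_out)."""
    ap_sum, n_ap, annulus = 0.0, 0, []
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            d = math.hypot(x - cx, y - cy)
            if d <= r_ap:
                ap_sum += val
                n_ap += 1
            elif r_in <= d < r_out:
                annulus.append(val)
    sky = statistics.median(annulus)
    return ap_sum - sky * n_ap

# Hypothetical 9x9 frame: flat sky of 10 counts with a bright pixel at centre
img = [[10.0] * 9 for _ in range(9)]
img[4][4] = 110.0
print(aperture_photometry(img, 4, 4, 1.5, 3.0, 4.5))  # 100.0: source counts, sky removed
```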

Relevance:

30.00%

Abstract:

The present thesis is a contribution to the theory of algebras of pseudodifferential operators on singular settings. In particular, we focus on the $b$-calculus and the calculus on conformally compact spaces in the sense of Mazzeo and Melrose, in connection with the notion of spectral invariant transmission operator algebras. We summarize results given by Gramsch et al. on the construction of $\Psi_0$- and $\Psi^*$-algebras and the corresponding scales of generalized Sobolev spaces, using commutators of certain closed operators and derivations. In the case of a manifold with corners $Z$ we construct a $\Psi^*$-completion $A_b(Z,{}^b\Omega^{1/2})$ of the algebra of zero-order $b$-pseudodifferential operators $\Psi_{b,cl}(Z,{}^b\Omega^{1/2})$ in the corresponding $C^*$-closure $B(Z,{}^b\Omega^{1/2})\hookrightarrow L(L^2(Z,{}^b\Omega^{1/2}))$. The construction also ensures that, localized to the (smooth) interior of $Z$, the operators in $A_b(Z,{}^b\Omega^{1/2})$ can be represented as ordinary pseudodifferential operators. In connection with the notion of solvable $C^*$-algebras, introduced by Dynin, we calculate the length of the $C^*$-closure of $\Psi_{b,cl}^0(F,{}^b\Omega^{1/2},R^{E(F)})$ in $B(F,{}^b\Omega^{1/2},R^{E(F)})$ by localizing $B(Z,{}^b\Omega^{1/2})$ along the boundary face $F$ using the (extended) indicial family $I^B_{FZ}$. Moreover, we discuss how one can localize a certain solving ideal chain of $B(Z,{}^b\Omega^{1/2})$ in neighbourhoods $U_p$ of arbitrary points $p\in Z$. This localization process recovers the singular structure of $U_p$; further, the induced length function $l_p$ is shown to be upper semi-continuous. We give construction methods for $\Psi^*$- and $C^*$-algebras admitting only infinitely long solving ideal chains. These algebras are first realized as unconnected direct sums of (solvable) $C^*$-algebras and then refined such that the resulting algebras have arcwise connected spaces of one-dimensional representations.
In addition, we recall the notion of transmission algebras on manifolds with corners $(Z_i)_{i\in\mathbb{N}}$ following an idea of Ali Mehmeti, Gramsch et al. Thereby, we connect the underlying $C^\infty$-function spaces using point evaluations in the smooth parts of the $Z_i$ and use generalized Laplacians to generate an appropriate scale of Sobolev spaces. Moreover, it is possible to associate generalized (solving) ideal chains to these algebras, such that for every $n\in\mathbb{N}$ there exists an ideal chain of length $n$ within the algebra. Finally, we discuss the $K$-theory for algebras of pseudodifferential operators on conformally compact manifolds $X$ and give an index theorem for these operators. In addition, we prove that the Dirac operator associated to the metric of a conformally compact manifold $X$ is not a Fredholm operator.

Relevance:

30.00%

Abstract:

Hypernuclear physics is currently attracting renewed interest, due to the important role of hypernuclear spectroscopy (hyperon-hyperon and hyperon-nucleon interactions) as a unique tool to describe the baryon-baryon interactions in a unified way and to understand the origin of their short range.

Hypernuclear research will be one of the main topics addressed by the PANDA experiment at the planned Facility for Antiproton and Ion Research (FAIR). Thanks to the use of stored $\overline{p}$ beams, copious production of double $\Lambda$ hypernuclei is expected at the PANDA experiment, which will enable high-precision $\gamma$ spectroscopy of such nuclei for the first time. At PANDA, excited states of $\Xi^-$ hypernuclei will be used as a basis for the formation of double $\Lambda$ hypernuclei. For their detection, a dedicated hypernuclear detector setup is planned. This setup consists of a primary nuclear target for the production of $\Xi^{-}+\overline{\Xi}$ pairs, a secondary active target for the hypernuclei formation and the identification of associated decay products, and a germanium array detector to perform $\gamma$ spectroscopy.

In the present work, the feasibility of performing high-precision $\gamma$ spectroscopy of double $\Lambda$ hypernuclei at the PANDA experiment has been studied by means of a Monte Carlo simulation. For this purpose, the design and simulation of the dedicated detector setup, as well as of the mechanism to produce double $\Lambda$ hypernuclei, have been optimized together with the performance of the whole system.
In addition, the production yields of double hypernuclei in excited particle-stable states have been evaluated within a statistical decay model.

A strategy for the unique assignment of various newly observed $\gamma$-transitions to specific double hypernuclei has been successfully implemented by combining the predicted energy spectra of each target with the measurement of two pion momenta from the subsequent weak decays of a double hypernucleus.

For the background handling, a method based on time measurement has also been implemented. However, the percentage of tagged events related to the production of $\Xi^{-}+\overline{\Xi}$ pairs varies between 20% and 30% of the total number of produced events of this type. As a consequence, further considerations have to be made to increase the tagging efficiency by a factor of 2.

The contribution of the background reactions to the radiation damage on the germanium detectors has also been studied within the simulation. Additionally, a test to check the degradation of the energy resolution of the germanium detectors in the presence of a magnetic field has been performed. No significant degradation of the energy resolution or of the electronics was observed. A correlation between rise time and pulse shape has been used to correct the measured energy.

Based on the present results, one can say that performing $\gamma$ spectroscopy of double $\Lambda$ hypernuclei at the PANDA experiment seems feasible. A further improvement of the statistics is needed for the background rejection studies. Moreover, a more realistic layout of the hypernuclear detectors has been suggested using the results of these studies, to accomplish a better balance between the physical and the technical requirements.

Relevance:

30.00%

Abstract:

In technical design processes in the automotive industry, digital prototypes rapidly gain importance, because they allow for the detection of design errors in early development stages. The technical design process includes the computation of swept volumes for maintainability analysis and clearance checks. The swept volume is very useful, for example, to identify problem areas where a safety distance might not be kept. With the explicit construction of the swept volume an engineer gets evidence of how the shape of components that come too close has to be modified.
In this thesis a concept for the approximation of the outer boundary of a swept volume is developed. For safety reasons, it is essential that the approximation is conservative, i.e., that the swept volume is completely enclosed by the approximation. On the other hand, one wishes to approximate the swept volume as precisely as possible. In this work, we show that the one-sided Hausdorff distance is the adequate measure for the error of the approximation when the intended usage is clearance checks, continuous collision detection and maintainability analysis in CAD. We present two implementations that apply the concept and generate a manifold triangle mesh that approximates the outer boundary of a swept volume. Both algorithms are two-phased: a sweeping phase, which generates a conservative voxelization of the swept volume, and the actual mesh generation, which is based on restricted Delaunay refinement. This approach ensures a high precision of the approximation while respecting conservativeness.
The benchmarks for our tests are, amongst others, real-world scenarios that come from the automotive industry. Further, we introduce a method to relate parts of an already computed swept volume boundary to those triangles of the generator that come closest during the sweep. We use this to verify, as well as to colorize, meshes resulting from our implementations.
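The one-sided Hausdorff distance used as the error measure asks, for every point of the approximation, how far it is from the reference set, and takes the worst case. On densely sampled surfaces, a brute-force sketch looks like this (the sample points are hypothetical; real implementations query the triangle meshes through spatial acceleration structures):

```python
import math

def one_sided_hausdorff(points_a, points_b):
    """max over a in A of min over b in B of |a - b|: how far the
    approximation A can stray from reference samples B (brute force,
    O(|A| * |B|))."""
    return max(min(math.dist(a, b) for b in points_b) for a in points_a)

# Hypothetical 2D samples: an approximation lying slightly outside the reference
ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
approx = [(0.0, 0.1), (1.1, 0.0), (0.0, 1.0)]
print(round(one_sided_hausdorff(approx, ref), 3))  # 0.1
```

Note the asymmetry: `one_sided_hausdorff(approx, ref)` bounds how far the approximation strays outward, which is the relevant direction for a conservative enclosure.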