5 results for Precision-recall analysis

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 30.00%

Abstract:

Traceability is often perceived by food industry executives as an additional cost of doing business, one to be avoided if possible. However, a traceability system can in fact comply with regulatory requirements, increase food safety and recall performance, improve marketing performance and improve supply chain management. Traceability thus affects firms' business performance through the costs and benefits determined by traceability practices. These costs and benefits are in turn affected by factors such as the firms' characteristics, the level of traceability and, lastly, the costs and benefits perceived prior to traceability implementation. This thesis was undertaken to understand how these factors are linked and how they affect the resulting costs and benefits. Analysis of the results of a plant-level survey of the Italian ichthyic processing industry revealed that processors generally adopt various levels of traceability, while government support appears to increase the level of traceability as well as the expected and actual costs and benefits. None of the firms' characteristics, with the exception of government support, influences costs or the level of traceability. Only firm size and the level of QMS certification are linked with benefits, while the precision of traceability increases benefits without affecting costs. Finally, traceability practices appear to be driven by requests from "external" stakeholders such as government, authorities and customers rather than by "internal" factors (e.g. improving firm management), while the traceability system does not provide any added value from the market in terms of a price premium or increased market share.

Relevance: 30.00%

Abstract:

Precision horticulture and spatial analysis applied to orchards are a growing and evolving part of precision agriculture technology. The aim of this discipline is to reduce production costs by monitoring and analysing orchard-derived information to improve crop performance in an environmentally sound manner. Georeferencing and geostatistical analysis, coupled with point-specific data mining, make it possible to devise and implement management decisions tailored to the individual orchard. Potential applications range from the opportunity to verify in real time, during the season, the effectiveness of cultural practices in achieving production targets in terms of fruit size, number, yield and, in the near future, fruit quality traits. These data will affect not only the pre-harvest stage; their effect will extend to the post-harvest sector of the fruit chain. Chapter 1 provides an updated overview of precision horticulture, while in Chapter 2 a preliminary spatial statistical analysis of the variability in apple orchards, before and after manual thinning, is provided; an interpretation of this variability and of how it can be managed to maximize orchard performance is offered. In Chapter 3, a stratification of spatial data into management classes is undertaken to interpret and manage spatial variation in the orchard. An inverse model approach is also applied to verify whether crop production explains environmental variation. Chapter 4 presents an integration of the techniques adopted before and offers a new key for reading the information gathered within the field. The overall goal of this Dissertation was to probe the feasibility, desirability and effectiveness of a precision approach to fruit growing, following the lines of other areas of agriculture that already adopt this management tool. As existing applications of precision horticulture have already shown, crop specificity is an important factor to be accounted for.
This work focused on apple because of its importance in the area where the work was carried out, and worldwide.
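As a simple illustration of the point-to-surface interpolation that this kind of georeferenced orchard analysis relies on, the sketch below implements inverse-distance weighting over hypothetical per-tree fruit counts. The coordinates, counts and function name are illustrative assumptions, not data or code from the thesis:

```python
import math

def idw_interpolate(points, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    georeferenced samples given as [(xi, yi, value), ...]."""
    num = den = 0.0
    for xi, yi, v in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exact hit on a sample location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical per-tree fruit counts at field coordinates (metres)
samples = [(0.0, 0.0, 120.0), (10.0, 0.0, 80.0), (0.0, 10.0, 100.0)]
estimate = idw_interpolate(samples, 5.0, 5.0)  # ≈ 100.0: all three samples are equidistant
```

A real orchard map would interpolate many such estimates onto a grid and then stratify the grid into management classes, as described in Chapter 3.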

Relevance: 30.00%

Abstract:

«Fiction of the frontier». Phenomenology of an open form/voice. Francesco Giustini's PhD dissertation fits into a line of research usually neglected by literary criticism but which has nevertheless aroused much interest in recent years: the relationship between Literature and Space. Within this context, the specific subject of his work is the category of the Frontier and its several implications for twentieth-century fiction. The preliminary step, at the beginning of the first section of the dissertation, is a semantic analysis: Giustini precisely describes the meanings of the word "frontier" as declined in a multiplicity of cultural, political and geographical contexts: from the American frontier of the pioneers who headed West, to the exotic frontiers of the world with which imperialist colonization came into contact; from semi-uninhabited areas like deserts, highlands and virgin forests, to the ethnic frontiers between Indian and white people in South America, and to the internal frontiers of countries, such as those between district and capital city, centre and outskirts. In the next step, Giustini focuses on a veritable "myth of the frontier" capable of nourishing the cultural and literary imagination. Indeed, literature has told the frontier and chosen it as the scenery for many stories; in the 20th century especially, it made the frontier a problematic space in the light of events and changes that have transformed the perception of space and our relationship with it. The dissertation therefore proposes a critical category, tracing the hallmarks of a specific literary phenomenon defined as the "Fiction of the frontier", present in many literary traditions during the 20th century.
The term "Fiction" (rather than "Literature" or "Poetics") defines not a genre but a "procedure", focusing on a constant issue brought out by the texts examined in this work: the strong call to the act of narration and to its oral traditions. The "Fiction of the Frontier" is perceived as an approach to the world, a way of watching and feeling objects, an emotion that is lived and told through the story, a story in which the narrator, through his body and his voice, takes the role of the witness. The following parts, analytic in style, are constructed on the basis of this theoretical and methodological reflection. The second section offers a wide range of examples in which the figure and the myth of the frontier can be found, through textual analyses ranging over several literary traditions: from monographic chapters (García Márquez, Callado, McCarthy) to comparative readings of pairs of texts (Calvino and Vargas Llosa, Buzzati and Coetzee, Arguedas and Rulfo). The selection of texts is arranged so that each reading underlines a particular aspect or form of the frontier. This section is articulated into thematic entries recalling actions that can be taken in the ambiguous and liminal space of the frontier (to communicate, to wait, to "trans-culturate", to imagine, to live in, to not live in). In this phenomenology, the frontier comes to light as a physical and concrete element or as a cultural, imaginary, linguistic, ethnic and existential category. Finally, the third section centres on a more detailed and elaborate analysis of two authors considered fundamental for the comprehension of the "Fiction of the frontier": Joseph Conrad and João Guimarães Rosa. Although very different, belonging to unlike literary traditions, these two authors show many connections, which the comparative analysis points out.
Conrad is perhaps the first author to grasp the feeling of the frontier, freeing himself from the adventure romance and from the exotic nineteenth-century tradition. João Guimarães Rosa, in his turn, is the great narrator of the Brazilian and South American frontier, the man of the sertão and of the endless spaces of the centre of Brazil. His production is strongly linked to that of the author of Heart of Darkness.

Relevance: 30.00%

Abstract:

3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and thus avoiding the soft tissue artefacts that limit the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth, in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with preliminary in-silico studies as: (a) geometric distortion and calibration errors; (b) 2D image and 3D model resolution; (c) incorrect contour extraction; (d) bone model symmetries; (e) optimization algorithm limitations; (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified in the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process.
To solve this problem, two different approaches were followed: to enlarge the convergence basin around the optimal pose, the local approach used sequential alignment of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, as in methodological research studies; the mono-planar analysis may be enough for clinical applications where analysis time and cost are an issue. A further reduction of user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed that halved the analysis time, delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semi-automatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study of foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
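The global strategy described above pairs population-based exploration with local refinement, which is what distinguishes a memetic algorithm from a plain genetic one. The following minimal sketch shows that general scheme on a toy two-parameter cost function; the cost function, bounds and all parameter values are illustrative assumptions, not the thesis implementation (which optimizes a 6-DOF pose against fluoroscopic images):

```python
import random

def memetic_minimize(f, bounds, pop_size=20, generations=40, seed=0):
    """Toy memetic optimizer: genetic-style global search whose
    offspring are refined by a local hill-climbing step."""
    rng = random.Random(seed)

    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def local_search(x, step=0.1, iters=20):
        # The "meme": local refinement applied to each new individual
        best = list(x)
        for _ in range(iters):
            cand = [min(max(v + rng.gauss(0, step), lo), hi)
                    for v, (lo, hi) in zip(best, bounds)]
            if f(cand) < f(best):
                best = cand
        return best

    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                      # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)    # blend crossover + mutation
            child = [(u + v) / 2 + rng.gauss(0, 0.05) for u, v in zip(a, b)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            children.append(local_search(child))
        pop = parents + children
    return min(pop, key=f)

# Toy 2-parameter "pose" cost with its minimum at (1, -2)
cost = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best = memetic_minimize(cost, [(-5.0, 5.0), (-5.0, 5.0)])
```

Because the population samples the whole bounded domain, no user-supplied starting pose is needed, which is the property the thesis exploits to remove manual initialization.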

Relevance: 30.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care must be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and eventually correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
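To illustrate the core measurement behind such an aperture photometry catalogue, the sketch below sums pixel counts inside a circular aperture and subtracts a median sky level estimated in a surrounding annulus. The frame, source position and radii are invented for illustration and are not taken from the pipeline itself:

```python
import math

def aperture_photometry(image, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted counts of a source at (cx, cy):
    sum pixels within radius r_ap, estimate the sky as the
    median of the annulus r_in <= d <= r_out."""
    ap_flux, n_ap, sky = 0.0, 0, []
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            d = math.hypot(x - cx, y - cy)
            if d <= r_ap:
                ap_flux += val
                n_ap += 1
            elif r_in <= d <= r_out:
                sky.append(val)
    sky.sort()
    sky_med = sky[len(sky) // 2]          # median sky per pixel
    return ap_flux - n_ap * sky_med       # remove sky under the aperture

# Hypothetical 9x9 frame: flat sky of 10 counts plus a 3-count
# excess "star" spread over the 3x3 block centred on (4, 4)
image = [[10.0] * 9 for _ in range(9)]
for y in range(3, 6):
    for x in range(3, 6):
        image[y][x] += 3.0
flux = aperture_photometry(image, 4, 4, 1.5, 3.0, 4.0)  # 9 px * 3 counts = 27.0
```

Running this measurement frame by frame for each SPSS, and plotting the sky-subtracted flux against time, is the kind of operation from which short-term light curves are assembled.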