29 results for likelihood profile function
at Universidade do Minho
Abstract:
Master's dissertation in Applied Biochemistry – Biomedicine
Abstract:
This paper presents the outcomes of research work consisting of the development of an Electric Vehicle Assistant (EVA), which creates and stores a driver profile containing the driving behaviours related to the EV's energy consumption, the EV battery charging information, and the routes performed. It is an application for mobile devices that is able to passively track the driver's behaviour and to access several kinds of information related to the EV in real time. A probability-based range prediction approach is also proposed, to take into account the unpredictable effects of personal driving style, traffic, or weather.
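The probability-based range idea can be sketched, for illustration only (the paper's actual model is not given here), as an empirical-percentile estimate over a driver's logged consumption rates: a higher confidence level picks a more pessimistic consumption rate, yielding a more conservative range.

```python
def predicted_range_km(battery_kwh, consumption_samples, confidence=0.9):
    """Estimate remaining range from logged consumption rates (kWh/km).

    `confidence` selects an empirical percentile of past consumption:
    higher confidence -> assume a higher (worse) rate -> shorter range.
    Illustrative sketch only; names and approach are assumptions.
    """
    rates = sorted(consumption_samples)
    # index of the `confidence` quantile (conservative: high consumption)
    idx = min(len(rates) - 1, int(confidence * len(rates)))
    worst_rate = rates[idx]
    return battery_kwh / worst_rate
```

With a 40 kWh battery and rates between 0.14 and 0.20 kWh/km, an 80% confidence estimate uses the worst observed rate and reports the shortest range.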
Abstract:
Nowadays the main honey-producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynological analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and highly skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that, after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucalyptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensor sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10-20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. The proposed technique also enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency suggests its applicability for honey labeling and geographical-origin identification. Nevertheless, this approach is not a full alternative to traditional melissopalynological analysis; rather, it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation.
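The overfitting safeguard described above, repeated K-fold cross-validation that always holds out part of the samples for internal validation, can be illustrated with a stdlib-only sketch. The simple one-variable regression and synthetic data below are stand-ins, not the paper's sensor models:

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def repeated_kfold_r2(xs, ys, k=5, repeats=10, seed=0):
    """Average out-of-fold R^2 over `repeats` random K-fold splits.

    Each fold holds out ~1/k of the samples (20% for k=5) for internal
    validation, mirroring the 10-20% hold-out described in the abstract.
    """
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    scores = []
    for _ in range(repeats):
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]
        for fold in folds:
            train = [i for i in idx if i not in fold]
            a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
            ybar = sum(ys[i] for i in fold) / len(fold)
            ss_res = sum((ys[i] - (a + b * xs[i])) ** 2 for i in fold)
            ss_tot = sum((ys[i] - ybar) ** 2 for i in fold)
            scores.append(1 - ss_res / ss_tot)
    return sum(scores) / len(scores)
```

On noiseless linear data the averaged out-of-fold R² is 1; on real sensor data the spread across folds (like the reported 0.91 ± 0.15) signals how stable the model is.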
Abstract:
This paper presents a framework of competences developed for Industrial Engineering and Management (IEM) that can be used as a tool for curriculum analysis and design, including the teaching and learning processes as well as the alignment of the curriculum with the professional profile. The framework was applied to the Industrial Engineering and Management program at the University of Minho (UMinho), Portugal, and it provides an overview of the connection between IEM knowledge areas and the competences defined in its curriculum. The framework of competences was developed through a process of analysis using a combination of methods and sources for data collection, according to four main steps: 1) characterization of IEM knowledge areas; 2) definition of IEM competences; 3) survey; 4) application of the framework to the IEM curriculum. The findings showed that the framework is useful for building an integrated vision of the curriculum. The most visible aspect in the learning outcomes of the IEM program is the lack of balance between technical and transversal competences: there are almost no references to transversal competences, and those that do appear are concentrated mainly in Project-Based Learning courses. The framework presented in this paper contributes to the definition of the IEM professional profile through a set of competences which need to be explored further. In addition, it may be a relevant tool for IEM curriculum analysis and a contribution towards bridging the gap between universities and companies.
Abstract:
In the present work, the benefits of using graphics processing units (GPU) to aid the design of complex-geometry profile extrusion dies are studied. For that purpose, a 3D finite-volume-based code that employs unstructured meshes to solve and couple the continuity, momentum and energy conservation equations governing the fluid flow, together with a constitutive equation, was used. To evaluate the possibility of reducing the time spent on the numerical calculations, the numerical code was parallelized on the GPU, using a simple programming approach without complex memory manipulations. For verification purposes, simulations were performed for three benchmark problems: Poiseuille flow, lid-driven cavity flow and flow around a cylinder. Subsequently, the code was used in the design of two real-life extrusion dies, for the production of a medical catheter and of a wood-plastic composite decking profile. To evaluate the benefits, the results obtained with the GPU-parallelized code were compared, in terms of speedup, with a serial implementation of the same code that traditionally runs on the central processing unit (CPU). The results show that, even with the simple parallelization approach employed, a significant reduction of the computation times was obtained.
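For illustration (this is not the paper's solver), the structure that makes such codes amenable to GPU parallelization is that each cell's update depends only on neighbouring values from the previous iteration, so one thread can own one cell. A serial stdlib sketch of such an update, here a Jacobi sweep for a 1D Poisson problem:

```python
def jacobi_step(u, source, dx):
    """One Jacobi sweep for u'' = source on a 1D grid.

    Each interior cell is updated from its neighbours' OLD values only,
    so every cell could be computed concurrently (one GPU thread per
    cell) without complex memory manipulations. Illustrative sketch.
    """
    new = u[:]  # boundary values stay fixed
    for i in range(1, len(u) - 1):
        new[i] = 0.5 * (u[i - 1] + u[i + 1] - dx * dx * source[i])
    return new

# Iterating the sweep converges to the solution; with zero source and
# boundary values 0 and 1, the converged profile is linear.
u = [0.0] * 10 + [1.0]
for _ in range(2000):
    u = jacobi_step(u, [0.0] * 11, 0.1)
```

On a GPU, the inner `for i` loop is what gets replaced by a grid of threads; the sweep-to-sweep iteration stays sequential, which is why even a simple parallelization approach pays off.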
Abstract:
PhD thesis in Chemical and Biological Engineering.
Abstract:
[Excerpt] The advantages resulting from the use of numerical modelling tools to support the design of processing equipment are almost consensual. The design of calibration systems in profile extrusion is not an exception. However, the complex geometries and heat exchange phenomena involved in this process require numerical solvers able to model the heat exchange in more than one domain (calibrator and polymer), to compatibilize the heat transfer at the profile-calibrator interface, and to deal with complex geometries. The combination of all these features is usually hard to find in commercial software. Moreover, the dimension of the meshes required to obtain accurate results leads to computational times that are prohibitive for industrial application. (...)
Abstract:
[Excerpt] Thermoplastic profiles are very attractive due to their inherent design freedom. However, the usual methodologies employed to design extrusion forming tools, based on experimental trial-and-error procedures, are highly dependent on the designer's experience and lead to high resource consumption. Despite the relatively low cost of the raw materials employed in the production of this type of profile, the resources involved in the die design process significantly increase their cost. These difficulties are even more evident when a complex-geometry profile has to be produced and there is no previous experience with similar geometries. Therefore, novel design approaches are required in order to reduce the resources needed and to guarantee a good performance of the produced profile. (...)
Abstract:
Various differential cross-sections are measured in top-quark pair (tt¯) events produced in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV at the LHC with the ATLAS detector. These differential cross-sections are presented in a data set corresponding to an integrated luminosity of 4.6 fb−1. The differential cross-sections are presented in terms of kinematic variables of a top-quark proxy referred to as the pseudo-top-quark, whose dependence on theoretical models is minimal. The pseudo-top-quark can be defined in terms of either reconstructed detector objects or stable particles in an analogous way. The measurements are performed on tt¯ events in the lepton+jets channel, requiring exactly one charged lepton and at least four jets, with at least two of them tagged as originating from a b-quark. The hadronic and leptonic pseudo-top-quarks are defined via the leptonic or hadronic decay mode of the W boson produced by the top-quark decay in events with a single charged lepton. The cross-section is measured as a function of the transverse momentum and rapidity of both the hadronic and leptonic pseudo-top-quark, as well as the transverse momentum, rapidity and invariant mass of the pseudo-top-quark pair system. The measurements are corrected for detector effects and are presented within a kinematic range that closely matches the detector acceptance. Differential cross-section measurements of the pseudo-top-quark variables are compared with several Monte Carlo models that implement next-to-leading-order or leading-order multi-leg matrix-element calculations.
Abstract:
The tt¯ production cross-section dependence on jet multiplicity and jet transverse momentum is reported for proton-proton collisions at a centre-of-mass energy of 7 TeV in the single-lepton channel. The data were collected with the ATLAS detector at the CERN Large Hadron Collider and comprise the full 2011 data sample, corresponding to an integrated luminosity of 4.6 fb−1. Differential cross-sections are presented as a function of the jet multiplicity for up to eight jets using jet transverse momentum thresholds of 25, 40, 60, and 80 GeV, and as a function of jet transverse momentum up to the fifth jet. The results are shown after background subtraction and corrections for all detector effects, within a kinematic range closely matched to the experimental acceptance. Several QCD-based Monte Carlo models are compared with the results. Sensitivity to the parton shower modelling is found at the higher jet multiplicities, at high transverse momentum of the leading jet, and in the transverse momentum spectrum of the fifth leading jet. The MC@NLO+HERWIG MC is found to predict too few events at higher jet multiplicities.
Abstract:
In longitudinal studies of disease, patients may experience several events over a follow-up period. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of bivariate survival, marginal distributions and the conditional distribution of gap times. In this work we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches are considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite-sample behavior of the estimators through simulations. The methods proposed in this article are applied to a data set from a German Breast Cancer Study; they are used to obtain predictors of the conditional survival probabilities, as well as to study the influence of recurrence on overall survival.
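The building block shared by all the estimators mentioned, the Kaplan-Meier curve and a conditional survival ratio derived from it, can be sketched as follows (toy data; the article's actual estimators are more elaborate):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate; returns a step function S(t).

    `events[i]` is 1 for an observed event, 0 for a censored time.
    Minimal sketch for illustration, no ties-handling refinements.
    """
    event_times = sorted(set(t for t, e in zip(times, events) if e))
    steps, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        s *= 1 - d / at_risk
        steps.append((t, s))
    def S(t):
        out = 1.0
        for ti, si in steps:
            if ti <= t:
                out = si
        return out
    return S

def conditional_survival(S, t, s):
    """P(T > t | T > s) = S(t) / S(s), for t >= s."""
    return S(t) / S(s)
```

The ratio form is exactly the "survival conditional on having reached time s" quantity: given five patients with one censored at time 3, the probability of surviving past time 2 given survival past time 1 is 0.6/0.8 = 0.75.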
Abstract:
Purpose: The purpose of this study was to evaluate the effect of orthokeratology, for different degrees of myopia correction, on the relative location of tangential (FT) and sagittal (FS) power errors across the central 70° of the visual field in the horizontal meridian. Methods: Thirty-four right eyes of 34 patients with a mean age of 25.2 ± 6.4 years were fitted with Paragon CRT (Mesa, AZ) rigid gas permeable contact lenses to treat myopia (−2.15 ± 1.26D, range: −0.88 to −5.25D). Axial and peripheral refraction were measured along the central 70° of the horizontal visual field with the Grand Seiko WAM5500 open-field auto-refractor. Spherical equivalent (M), as well as tangential (FT) and sagittal (FS) power errors, were obtained. Analysis was stratified into three groups according to baseline spherical equivalent: Group 1 [MBaseline = −0.88 to −1.50D; n = 11], Group 2 [MBaseline = −1.51 to −2.49D; n = 11], and Group 3 [MBaseline = −2.50 to −5.25D; n = 12]. Results: Spherical equivalent was significantly more myopic after treatment beyond the central 40° of the visual field (p ≤ 0.001). FT became significantly more myopic for all groups in the nasal and temporal retina at 25° (p ≤ 0.017), 30° (p ≤ 0.007) and 35° (p ≤ 0.004) of eye rotation. The myopic change in FS was less consistent, achieving statistical significance for all groups only at 35° in the nasal and temporal retina (p ≤ 0.045). Conclusions: Orthokeratology significantly changes FT in the myopic direction beyond the central 40° of the visual field for all degrees of myopia. Changes induced by orthokeratology in relative peripheral M, FT and FS at 35° of eye rotation were significantly correlated with axial myopia at baseline.
Abstract:
This article focuses on the personal experiences of Portuguese women regarding separation and divorce. The sample included 96 women, with at least 1 child, who responded to an inventory that addressed conflict, dysfunctional conjugality, emotional experiences, social support, and adaptation to divorce. Higher levels of conflict and marital dysfunction in litigious divorces were found, as well as more conflict when different lawyers were employed. Those women who were satisfied with alimony and visiting rights reported less conflict, fewer negative emotional experiences, and greater social support. Level of education and duration of separation influenced women’s perceptions. Implications for intervention are addressed.
Abstract:
Several suction–water-content (s–w) calibrations for the filter paper method (FPM) used for soil-suction measurement have been published. Most of these calibrations involve a bilinear function (i.e., two different equations) with an inflection point occurring at about 60 kPa. A continuous s–w calibration function with a smooth transition between the high- and low-suction ranges, based on a regression analysis of various previously published calibrations obtained for Whatman No. 42 (W42) filter paper, is presented and discussed. The approach is applied herein to data from three established bilinear calibrations (six equations) for W42 filter paper to determine the two fitting parameters of the continuous function. An experimental evaluation of the new calibration shows that the suctions estimated by the contact FPM test using the proposed function compare well with suctions measured by other laboratory techniques for two different soils for the suction range of 50 kPa (...)
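For context, a widely used bilinear W42 calibration has the following shape; the constants shown are ASTM D5298-style values and are an assumption here, not the parameters fitted in the paper. The paper's contribution is replacing the two branches with a single smooth continuous function:

```python
# Illustrative bilinear W42 filter paper calibration (ASTM D5298-style
# constants, assumed for demonstration): suction in kPa from the filter
# paper's gravimetric water content w (%).
W_BREAK = 45.3  # breakpoint water content; corresponds to ~60 kPa

def suction_kpa(w):
    """Bilinear log10(suction)-vs-w calibration with two branches."""
    if w < W_BREAK:
        log_h = 5.327 - 0.0779 * w   # dry branch (high suction)
    else:
        log_h = 2.412 - 0.0135 * w   # wet branch (low suction)
    return 10 ** log_h
```

The two branches meet near 60 kPa, which is the inflection the abstract refers to; a continuous function smooths exactly this kink.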
Abstract:
The research aimed to establish tyre-road noise models by using a Data Mining approach that made it possible to build a predictive model and to assess the importance of the tested input variables. The data modelling took into account three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture and unevenness and, for the first time, damping. The importance of those variables was also measured by using a sensitivity analysis procedure. Two types of models were set up: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally set up by speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were Speed, Temperature, Aggregate size, Mean Profile Depth, and Damping, which had the highest importance, even though influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are applicable to truck tyre-noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. Therefore, the obtained models are highly useful for the design of pavements and for noise prediction by road authorities and contractors.
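The sensitivity-analysis idea, measuring how much a fitted model's error grows when one input is scrambled while the others are left alone, can be sketched generically (this is a stand-in, not the specific procedure used in the study):

```python
import random

def permutation_importance(model, X, y, seed=0):
    """One-at-a-time sensitivity analysis for a fitted model.

    For each input column, shuffle that column only and report the
    resulting increase in mean squared error: inputs the model truly
    relies on produce a large increase. Generic illustrative sketch.
    """
    rng = random.Random(seed)
    def mse(Xm):
        return sum((model(row) - yi) ** 2 for row, yi in zip(Xm, y)) / len(y)
    base = mse(X)
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        rng.shuffle(col)
        Xp = [row[:j] + [col[i]] + row[j + 1:] for i, row in enumerate(X)]
        importances.append(mse(Xp) - base)
    return importances
```

An input the model ignores scores zero; an input it depends on (like Speed in the abstract's ranking) scores high, which is how a variable-importance ranking such as Speed > Megatexture can be produced.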