940 results for Multi-cycle, Expectation, and Conditional Estimation Method


Relevance:

100.00%

Publisher:

Abstract:

Lithium-ion (Li-ion) batteries have attracted attention in recent decades because of their indisputable advantages over other types of batteries. They power many of the devices we rely on daily, such as cell phones, laptop computers, cameras, and many other electronic devices. They are also used in smart grid technology, stand-alone wind and solar systems, Hybrid Electric Vehicles (HEV), and Plug-in Hybrid Electric Vehicles (PHEV). Despite the rapid increase in the use of Li-ion batteries, the lack of useful battery models remains a significant problem: the available models are either too limited or too complex, as with the electrochemical models developed by chemists. A battery management system (BMS) aims to optimize the use of the battery, making the whole system more reliable, durable, and cost-effective. Perhaps the most important function of the BMS is to provide an estimate of the State of Charge (SOC). SOC is the ratio of the available ampere-hours (Ah) in the battery to the total Ah of a fully charged battery. The Open Circuit Voltage (OCV) of a fully relaxed battery has an approximately one-to-one relationship with the SOC; therefore, if this voltage is known, the SOC can be found. However, the relaxed OCV can only be measured when the battery is at rest and its internal chemistry has reached equilibrium. This thesis focuses on Li-ion battery cell modelling and SOC estimation. In particular, it introduces a simple but comprehensive battery model and a novel on-line, accurate, and fast SOC estimation algorithm intended primarily for electric and hybrid-electric vehicles and microgrid systems. The thesis aims to (i) form a baseline characterization for dynamic modelling and (ii) provide a tool for use in state-of-charge estimation. The proposed modelling and SOC estimation schemes are validated through comprehensive simulation and experimental results.
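The OCV-SOC relationship described above can be exploited as a lookup: given a table of relaxed OCV measurements at known SOC levels, the curve can be inverted by interpolation. A minimal sketch in Python, with hypothetical calibration points (not taken from the thesis):

```python
import numpy as np

# Hypothetical OCV-SOC calibration points for a Li-ion cell (illustrative only)
soc_pts = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])     # state of charge [-]
ocv_pts = np.array([3.00, 3.45, 3.60, 3.75, 3.95, 4.20])  # relaxed OCV [V]

def soc_from_ocv(ocv_volts):
    """Invert the (monotonic) OCV-SOC curve by linear interpolation."""
    return float(np.interp(ocv_volts, ocv_pts, soc_pts))
```

Since this only holds for a relaxed cell, a practical BMS would combine such a lookup with on-line tracking (e.g. coulomb counting) between rest periods.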

Relevance:

100.00%

Publisher:

Abstract:

The very nature of computer science, with its constant change, forces those who wish to keep up to adapt and react quickly. Large companies invest in staying up to date in order to generate revenue and remain active in the market. Universities, on the other hand, need to apply the same practice of staying up to date with industry needs in order to produce industry-ready engineers. By interviewing former students, now engineers in industry, and current university staff, this thesis aims to learn whether there is room for enhancing education through different lecturing approaches and/or curriculum adaptation and development. To address these concerns, qualitative research was conducted, focusing on data collected through semi-structured real-life interviews. The method follows the seven stages of research interviewing introduced by Kvale and focuses on collecting and preparing relevant data for analysis. The collected data is transcribed, refined, and further analyzed in the "Findings and analysis" chapter. The analysis focuses on answering the three research questions: how higher education impacts a Computer Science and Informatics Engineer's job, how to better manage the transition from studies to working in industry, and how to develop a curriculum that supports the previous two. Unaltered quoted extracts are presented and individually analyzed. To paint a fuller picture, a theme-wise analysis is presented, summarizing valuable themes that recurred throughout the interviewing phase. The findings imply that several factors directly influence the quality of education: on the student side, mostly expectations of and dedication to their studies; on the university side, commitment to the curriculum development process.
Due to time and resource limitations, this research provides findings from a narrowed scope, but it can serve as a solid foundation for further development, possibly as PhD research.

Relevance:

100.00%

Publisher:

Abstract:

Strong convective events can produce extreme precipitation, hail, lightning, or gusts, potentially causing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events from large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and a high scale parameter value; a radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of occurrence of extreme convective events is estimated from the KO index, indicating stability, and the relative humidity at 1000 hPa; both variables are computed from the ERA-Interim reanalysis. In a first version of the methodology, these two variables are used to estimate the spatial rainfall distribution and the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. To take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented; while generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and by the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events.
Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions over the period 1979-2014, initialized every year. Decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after initialization.
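The Weibull characterization above can be illustrated with a short sketch: fit a two-parameter Weibull distribution to a rainfall sample and read off the shape and scale parameters. The data here are synthetic (the study uses radar-based rainfall within 90 km of a reporting station), and scipy's `weibull_min` is assumed available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic rainfall sample standing in for radar-based rain rates [mm/h];
# drawn from a Weibull with low shape (0.7) and high scale (8.0)
rain = rng.weibull(0.7, size=5000) * 8.0

# Fit a two-parameter Weibull by maximum likelihood (location fixed at 0),
# mirroring the low-shape / high-scale characterization of convective rain
shape, loc, scale = stats.weibull_min.fit(rain, floc=0)
```

With the fitted parameters in hand, the exceedance probability of any rainfall threshold follows directly from the Weibull survival function.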

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, technological advancement has pushed industry and research towards the automation of various processes. Automation reduces costs and improves product quality; for this reason, companies push research to investigate new technologies. The agriculture industry has always looked towards automating various processes, from product processing to storage. In recent years, the automation of the harvest and cultivation phases has also become attractive, driven by advances in autonomous driving. Nevertheless, ADAS systems alone are not enough: merging different technologies will be the way to achieve total automation of agricultural processes. For example, sensors that estimate products' physical and chemical properties can be used to evaluate the ripeness of fruit. The fusion of these technologies therefore plays a key role in industrial process automation. This dissertation treats both ADAS systems and sensors for precision agriculture. Several measurement procedures for characterizing commercial 3D LiDARs are proposed and tested to address the growing need for comparison tools; both axial and transversal errors are investigated. Moreover, a measurement method and setup for evaluating the effect of fog on 3D LiDARs is proposed. Each presented measurement procedure has been tested, and the results highlight the versatility and effectiveness of the proposed approaches. Regarding precision agriculture sensors, a measurement approach for in-field estimation of the moisture content and density of crops is presented. The approach employs a near-infrared (NIR) spectrometer together with Partial Least Squares (PLS) statistical analysis. The approach and the model are described together with a first laboratory prototype used to evaluate the NIRS approach. Finally, a prototype for in-field analysis is built and tested.
The test results are promising, showing that the proposed approach is suitable for moisture content and density estimation.

Relevance:

100.00%

Publisher:

Abstract:

The challenges of current global food systems are often framed around feeding the world's growing population while meeting sustainable development goals for future generations. Globalization has led to a fragmentation of food spaces, producing a flexible and mutable supply chain. This poses a major challenge to food and nutrition security, also affecting rural-urban dynamics in territories. Furthermore, recent crises have highlighted the vulnerability of food systems and ecosystems to shocks and disruptions, owing to the intensive management of natural, human, and economic capital. Hence, a sustainable and resilient transition of food systems is required, through a multi-faceted approach that tackles the causes of unsustainability and promotes sustainable practices at all levels of the food system. In this respect, a territorial approach becomes a relevant entry point for analysing the food system's multifunctionality, and it can support the evaluation of sustainability by quantifying impacts with quantitative methods and by understanding the territorial responsibility of different actors with qualitative ones. Against this background, the present research aims to (i) investigate the environmental, costing, and social indicators suitable for a scoring system able to measure the integrated sustainability performance of food initiatives within the City/Region territorial context; (ii) develop a territorial assessment framework to measure the sustainability impacts of agricultural systems; and (iii) define an integrated methodology to match production and consumption at the territorial level so as to foster a long-term vision of short food supply chains. From a methodological perspective, the research adopts a mixed quantitative and qualitative method.
The outcomes provide an in-depth view of the environmental and socio-economic impacts of food systems at the territorial level, investigating possible indicators, frameworks, and business strategies to foster their future sustainable development.

Relevance:

100.00%

Publisher:

Abstract:

In the field of educational and psychological measurement, the shift from paper-based to computerized tests has become a prominent trend in recent years. Computerized tests allow more complex and personalized test administration procedures, such as Computerized Adaptive Testing (CAT). CAT, built on Item Response Theory (IRT) models, dynamically assembles tests based on test-taker responses, driven by statistical algorithms. Although CAT structures are complex, they are flexible and convenient; however, concerns about test security must be addressed. Frequent item administration can lead to item exposure and cheating, necessitating preventive and diagnostic measures. In this thesis a method called "CHeater identification using Interim Person fit Statistic" (CHIPS) is developed, designed to identify and limit cheaters in real time during test administration. CHIPS uses response times (RTs) to calculate an Interim Person fit Statistic (IPS), allowing on-the-fly intervention using a more secure item bank. A slight modification, the Modified-CHIPS (M-CHIPS), is also proposed to handle test-takers working at constant speed. A simulation study assesses CHIPS, highlighting its effectiveness in identifying and controlling cheaters, but it reveals limitations when cheaters possess all the correct answers; M-CHIPS overcomes this limitation. Furthermore, the method is shown not to be influenced by the cheaters' ability distribution or by the level of correlation between the ability and speed of test-takers. Finally, the method proves flexible with respect to the choice of significance level and the transition from fixed-length to variable-length tests. The thesis discusses potential applications, including the suitability of the method for multiple-choice tests, assumptions about the RT distribution, and the level of item pre-knowledge.
Limitations are also discussed to outline future developments, such as different RT distributions, unusual behaviors of honest respondents, and field testing in real-world scenarios. In summary, CHIPS and M-CHIPS offer real-time cheating detection in CAT, enhancing test security and ability estimation while not penalizing honest test respondents.
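The core idea — standardize observed response times against item-level norms as the test proceeds and intervene when a test-taker is implausibly fast — can be sketched as below. This is an illustrative simplification, not the thesis's actual IPS; the item norms and the flagging threshold are invented:

```python
import math

# Hypothetical item-level log-RT norms (mean, sd) from calibration data
item_norms = [(3.0, 0.5), (3.2, 0.4), (2.8, 0.5), (3.1, 0.45)]

def interim_rt_flag(log_rts, z_crit=-2.0):
    """Standardize each observed log response time against its item norm,
    aggregate the z-scores seen so far, and flag if suspiciously fast."""
    zs = [(lrt - mu) / sd for lrt, (mu, sd) in zip(log_rts, item_norms)]
    mean_z = sum(zs) / len(zs)
    # the mean of k independent z-scores has standard deviation 1/sqrt(k)
    stat = mean_z * math.sqrt(len(zs))
    return stat, stat < z_crit
```

Recomputing the statistic after every item is what makes the check "interim": a flagged test-taker can be switched to unexposed items mid-test rather than detected only afterwards.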

Relevance:

100.00%

Publisher:

Abstract:

Tsunamis are rare events, but their impact can be devastating and may extend over large geographical areas. For low-probability, high-impact events like tsunamis, it is crucial to implement all possible actions to mitigate the risk. Tsunami hazard assessment is the result of a scientific process that integrates traditional geological methods, numerical modelling, and the analysis of tsunami sources and historical records. Analysing past events and understanding how they interacted with the land is therefore the only way to inform tsunami source and propagation models and to quantitatively test forecast models such as hazard analyses. The primary objective of this thesis is to establish an explicit relationship between the macroseismic intensity, derived from historical descriptions, and the quantitative physical parameters measuring tsunami waves. This is done first by defining an approximate estimation method, based on a simplified 1D onshore propagation model, to convert the available observations into one reference physical metric: wave height at the coast was chosen as the reference because of its stability and independence from inland effects. This method was then applied to a set of well-known past events to build a homogeneous dataset containing both macroseismic intensity and wave height. By performing an orthogonal regression, a direct and invertible empirical relationship could be established between the two parameters, accounting for their relevant uncertainties. The target relationship is extensively tested and finally applied to the Italian Tsunami Effect Database (ITED), providing a homogeneous estimation of wave height for all existing tsunami observations in Italy. This enables meaningful comparison with models and simulations, as well as quantitative testing of tsunami hazard models for the Italian coasts, informing tsunami risk management initiatives.
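The orthogonal regression step, which allows for errors in both the intensity and the wave-height variables, can be sketched with scipy's `odr` module. The paired data below are invented for illustration, as are the assumed measurement uncertainties `sx` and `sy`:

```python
import numpy as np
from scipy import odr

# Hypothetical paired observations: tsunami intensity vs log wave height
intensity = np.array([2.0, 3.0, 3.0, 4.0, 4.0, 5.0, 5.0, 6.0])
log_h = np.array([-0.6, -0.3, -0.2, 0.1, 0.2, 0.5, 0.6, 0.9])

# Orthogonal distance regression: minimize residuals in both variables,
# weighted by the assumed per-variable uncertainties sx and sy
model = odr.Model(lambda beta, x: beta[0] * x + beta[1])
data = odr.RealData(intensity, log_h, sx=0.5, sy=0.1)
fit = odr.ODR(data, model, beta0=[0.3, -1.0]).run()
slope, intercept = fit.beta
```

Because the fitted line is an ordinary linear relation, it is directly invertible: a wave height estimated for an intensity value, or an intensity recovered from a measured wave height.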

Relevance:

100.00%

Publisher:

Abstract:

A method to quantify lycopene and β-carotene in freeze-dried tomato pulp by high-performance liquid chromatography (HPLC) was validated according to the criteria of selectivity, sensitivity, precision, and accuracy, and the measurement uncertainty was estimated from the data obtained in the validation. The validated method is selective for the analysis and showed good precision and accuracy. The detection limits for lycopene and β-carotene were 4.2 and 0.23 mg 100 g-1, respectively. With expanded uncertainty (k = 2), the lycopene content was 104 ± 21 mg 100 g-1 and the β-carotene content was 6.4 ± 1.5 mg 100 g-1.
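The expanded uncertainty quoted above follows the usual convention U = k · u_c, with coverage factor k = 2 corresponding to roughly 95% coverage. A minimal sketch (the combined standard uncertainty value below is hypothetical, chosen only to reproduce the ±21 figure):

```python
def expanded_uncertainty(combined_std_uncertainty, k=2):
    """Expanded uncertainty U = k * u_c; k = 2 gives ~95% coverage
    assuming an approximately normal distribution of the measurand."""
    return k * combined_std_uncertainty

# A hypothetical combined standard uncertainty of 10.5 mg/100 g for
# lycopene would yield the reported expanded uncertainty of 21 mg/100 g
u_lycopene = expanded_uncertainty(10.5)
```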

Relevance:

100.00%

Publisher:

Abstract:

A rapid, low-cost method to determine Cr(VI) in soils, based on alkaline extraction of the metal at room temperature, is proposed as a semi-quantitative procedure to be performed in the field. A color comparison against standards with Cr(VI) contents in the range of 10 to 150 mg kg-1 was used throughout. For the different types of soil studied, more than 75% of the fortified soluble Cr(VI) was recovered at all spike levels tested, for both the proposed and the standard methods. In the analysis of a heavily contaminated soil sample, recoveries of 83% and 99% were obtained for the proposed and the standard methods, respectively.

Relevance:

100.00%

Publisher:

Abstract:

This article describes and discusses a method to determine the root curvature radius using cone-beam computed tomography (CBCT). The severity of root canal curvature is essential for selecting the instrument and instrumentation technique. The diagnosis and planning of root canal treatment have traditionally been based on periapical radiography. However, the higher accuracy of CBCT images in identifying anatomic and pathologic alterations, compared with panoramic and periapical radiographs, has been shown to reduce the incidence of false-negative results. In high-resolution images, the root curvature radius can be measured via the circumcenter: based on 3 mathematical points determined with the working tools of the Planimp® software, the root curvature radius can be calculated in both the apical and coronal directions. The CBCT-aided method for determining root curvature radius presented in this article is easy to perform and reproducible, and it allows more reliable and predictable endodontic planning, which translates directly into more effective preparation of curved root canals.
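The circumcenter-based measurement reduces to elementary geometry: three points marked on the curved canal segment define a unique circle, whose radius is the curvature radius. A sketch (the point coordinates would come from the imaging software's measuring tools; `Planimp` itself is not involved here):

```python
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three non-collinear points:
    R = (a*b*c) / (4*Area), with a, b, c the triangle side lengths."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # twice the signed triangle area via the cross product, then halved
    area = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0
    return a * b * c / (4.0 * area)
```

A small radius means a sharply curved canal; measuring it separately in the apical and coronal directions matches the article's two-direction procedure.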

Relevance:

100.00%

Publisher:

Abstract:

The quantification of the available energy in the environment is important because it determines photosynthesis, evapotranspiration and, therefore, the final yield of crops. Instruments for measuring the energy balance are costly, so indirect estimation alternatives are desirable. This study assessed the performance of Deardorff's model over a full cycle of a sugarcane crop in Piracicaba, State of São Paulo, Brazil, in comparison with the aerodynamic method. This mechanistic model simulates the energy fluxes (sensible heat, latent heat, and net radiation) at three levels (atmosphere, canopy, and soil) using only the air temperature, relative humidity, and wind speed measured at a reference level above the canopy, the crop leaf area index, and some pre-calibrated parameters (canopy albedo, soil emissivity, atmospheric transmissivity, and hydrological characteristics of the soil). The analysis was carried out for different time scales, insolation conditions, and seasons (spring, summer, and autumn). Over all the 15-minute data, the model performed well in simulating net radiation under the different insolation conditions and seasons. The latent and sensible heat fluxes in the atmosphere did not differ from the aerodynamic-method data during autumn. The sensible heat flux in the soil was poorly simulated, owing to the poor performance of the soil water balance method. Overall, Deardorff's model improved the flux simulations relative to the aerodynamic method when more insolation was available.
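One ingredient of such comparisons, the bulk aerodynamic form of the sensible heat flux, can be sketched as below. The constants and inputs are illustrative; the actual aerodynamic method used in the study involves profile measurements and stability corrections not shown here:

```python
RHO_AIR = 1.2    # air density [kg m-3] (assumed near-surface value)
CP_AIR = 1004.0  # specific heat of air at constant pressure [J kg-1 K-1]

def sensible_heat_flux(t_surface, t_air, r_aero):
    """Bulk aerodynamic sensible heat flux H = rho * cp * (Ts - Ta) / ra,
    in W m-2, with ra the aerodynamic resistance [s m-1]."""
    return RHO_AIR * CP_AIR * (t_surface - t_air) / r_aero
```

The latent heat flux has an analogous bulk form driven by the vapour pressure deficit, and net radiation closes the surface energy balance against these turbulent fluxes and the soil heat flux.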

Relevance:

100.00%

Publisher:

Abstract:

The strategy used to treat HCV infection depends on the genotype involved, so an accurate and reliable genotyping method is of paramount importance. We describe here, for the first time, the use of a liquid microarray for HCV genotyping. The liquid microarray is based on the 5'UTR, the most highly conserved region of HCV, and on the variable-region NS5B sequence. Genotyping two regions simultaneously can be used to confirm findings and should detect inter-genotypic recombination. Plasma samples from 78 patients infected with viruses whose genotypes and subtypes had been determined with the Versant™ HCV Genotype Assay LiPA (version I; Siemens Medical Solutions, Diagnostics Division, Fernwald, Germany) were tested with the new liquid microarray method. The method successfully determined the genotypes of 74 of the 78 samples previously genotyped with the Versant™ assay (74/78, 95%). The concordance between the two methods for genotype determination was 100% (74/74). At the subtype level, all 3a and 2b samples gave identical results with both methods (17/17 and 7/7, respectively). Two 2c samples were correctly identified by the microarray, but could only be determined to the genotype level by the Versant™ assay. Genotype 1 subtypes (1a and 1b) were correctly identified by the Versant™ assay and the microarray in 68% and 40% of cases, respectively. No genotype discordance was found for any sample. HCV was successfully genotyped with both methods, which is of prime importance for treatment planning. Liquid microarray assays may therefore be added to the list of methods suitable for HCV genotyping. They provide comparable results and may readily be adapted for the detection of other viruses that frequently co-infect HCV patients. Liquid array technology is thus a reliable and promising platform for HCV genotyping.

Relevance:

100.00%

Publisher:

Abstract:

Aims. We derive lists of proper motions and kinematic membership probabilities for 49 open clusters and possible open clusters in the zone of the Bordeaux PM2000 proper motion catalogue (+11° ≤ δ ≤ +18°). We test different parametrisations of the proper motion and position distribution functions, select the most successful one and, in the light of those results, analyse some objects individually. Methods. We differentiate between cluster and field member stars, and assign membership probabilities, by applying a new, fully automated method based on parametrisations of the proper motion and position distribution functions, combined with genetic-algorithm optimization heuristics and a derivative-based hill-climbing algorithm for the likelihood optimization. Results. We present a catalogue comprising kinematic parameters and associated membership probability lists for 49 open clusters and possible open clusters in the Bordeaux PM2000 catalogue region. This is the first determination of proper motions for five of these open clusters. We confirm the non-existence of two kinematic populations in the region of 15 previously suspected non-existent objects.
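A common baseline for this kind of kinematic membership assignment is a two-component mixture in proper-motion space: a tight Gaussian for the cluster and a broad one for the field, with posterior probabilities serving as memberships. The sketch below uses scikit-learn's EM-based `GaussianMixture` on synthetic data, rather than the paper's genetic-algorithm and hill-climbing likelihood optimizer:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Synthetic proper motions [mas/yr]: a compact cluster plus a dispersed field
cluster = rng.normal([-5.0, 3.0], 0.3, size=(200, 2))
field = rng.normal([0.0, 0.0], 5.0, size=(800, 2))
pm = np.vstack([cluster, field])

# Fit a two-component Gaussian mixture by expectation-maximization
gmm = GaussianMixture(n_components=2, random_state=0).fit(pm)

# The component with the smaller variance is taken as the cluster;
# posterior probabilities for it act as membership probabilities
tight = int(np.argmin(gmm.covariances_[:, 0, 0]))
p_member = gmm.predict_proba(pm)[:, tight]
```

The paper additionally parametrises the position distribution, which sharpens membership for stars whose proper motions alone are ambiguous.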

Relevance:

100.00%

Publisher:

Abstract:

In the Hammersley-Aldous-Diaconis process, infinitely many particles sit in R, with at most one particle allowed at each position. A particle at x whose nearest neighbour to the right is at y jumps at rate y - x to a position uniformly distributed in the interval (x, y). The basic coupling between trajectories with different initial configurations induces a process with different classes of particles. We show that the invariant measures for the two-class process can be obtained as follows. First, a stationary M/M/1 queue is constructed as a function of two homogeneous Poisson processes: the arrivals, with rate lambda, and the (attempted) services, with rate rho > lambda. Then first-class particles are placed at the instants of departures (effective services) and second-class particles at the instants of unused services. The procedure is generalized to the n-class case by using n - 1 queues in tandem with n - 1 priority types of customers. A multi-line process is introduced; it consists of a coupling (different from Liggett's basic coupling) having as invariant measure a product of Poisson processes. The definition of the multi-line process involves the dual points of the space-time Poisson process used in the graphical construction of the reversed process. The coupled process is a transformation of the multi-line process, and its invariant measure is the transformation, described above, of the product measure.
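The queueing construction can be sketched as a direct simulation: superpose two Poisson processes on a finite time window, run the queue, and record departures versus unused services. This only illustrates the mechanism (the queue is started empty rather than from its stationary distribution, so there is a boundary effect); the rates and horizon are arbitrary:

```python
import random

random.seed(42)

def two_class_particles(lam=1.0, rho=2.0, horizon=50.0):
    """Build first/second class particle positions on [0, horizon] from an
    M/M/1 queue driven by Poisson arrivals (rate lam) and attempted
    services (rate rho > lam)."""
    def poisson_times(rate):
        t, out = 0.0, []
        while True:
            t += random.expovariate(rate)
            if t > horizon:
                return out
            out.append(t)

    arrivals = poisson_times(lam)
    services = poisson_times(rho)
    events = sorted([(t, "a") for t in arrivals] + [(t, "s") for t in services])

    queue = 0
    first, second = [], []
    for t, kind in events:
        if kind == "a":
            queue += 1
        elif queue > 0:
            queue -= 1
            first.append(t)   # departure (effective service): first class
        else:
            second.append(t)  # unused service (queue empty): second class
    return first, second
```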

Relevance:

100.00%

Publisher:

Abstract:

This study tested the concurrent and construct validity of the newly developed OMNI-Kayak RPE Scale with 8 male kayakers who performed a flatwater load-incremented "shuttle" test over a 500-m course and 3 estimation-production trials over a 1,000-m course. Velocity, blood lactate concentration, heart rate, and rating of perceived exertion (RPE), measured with both the OMNI-Kayak RPE Scale and the Borg 6-20 Scale, were recorded. OMNI-Kayak Scale RPE was highly correlated with velocity, Borg 6-20 Scale RPE, blood lactate, and heart rate for both the load-incremented test (rs = .87-.96) and the estimation trials (rs = .75-.90). There were no significant differences in velocity, heart rate, or blood lactate concentration between the estimation and production trials. The OMNI-Kayak RPE Scale showed concurrent and construct validity for assessing perception of effort in flatwater kayaking and is a valid tool for self-regulation of exercise intensity.