979 results for interval approach


Relevance:

100.00%

Publisher:

Abstract:

A demographic model is developed based on interbirth intervals and is applied to estimate the population growth rate of humpback whales (Megaptera novaeangliae) in the Gulf of Maine. Fecundity rates in this model are based on the probabilities of giving birth at time t after a previous birth and on the probabilities of giving birth first at age x. Maximum likelihood methods are used to estimate these probabilities using sighting data collected for individually identified whales. Female survival rates are estimated from these same sighting data using a modified Jolly–Seber method. The youngest age at first parturition is 5 yr, the estimated mean birth interval is 2.38 yr (SE = 0.10 yr), the estimated noncalf survival rate is 0.960 (SE = 0.008), and the estimated calf survival rate is 0.875 (SE = 0.047). The population growth rate (λ) is estimated to be 1.065; its standard error is estimated as 0.012 using a Monte Carlo approach, which simulated sampling from a hypothetical population of whales. The simulation is also used to investigate the bias in estimating birth intervals by previous methods. The approach developed here is applicable to studies of other populations for which individual interbirth intervals can be measured.
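A minimal sketch of the Monte Carlo idea (illustrative only, not the authors' model): build a crude two-stage (calf/non-calf) projection matrix from the reported rates, perturb each rate by its reported standard error, and read off the spread of the resulting growth rates. The matrix structure and the fecundity approximation are assumptions made for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    def growth_rate(s_calf, s_adult, birth_interval):
        # Crude two-stage (calf, non-calf) projection matrix; fecundity is
        # approximated as 0.5 female calves per adult female per birth
        # interval, spread over the interval length in years.
        f = 0.5 / birth_interval
        A = np.array([[0.0,    f * s_adult],
                      [s_calf, s_adult]])
        return max(abs(np.linalg.eigvals(A)))  # dominant eigenvalue ~ lambda

    # Point estimate from the values reported in the abstract
    print(growth_rate(0.875, 0.960, 2.38))

    # Monte Carlo: perturb each input by its reported SE, collect lambdas
    lams = [growth_rate(rng.normal(0.875, 0.047),
                        rng.normal(0.960, 0.008),
                        rng.normal(2.38, 0.10))
            for _ in range(10_000)]
    print(np.std(lams))  # rough analogue of the reported standard error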

Relevance:

60.00%

Publisher:

Abstract:

This work presents an interval approach for dealing with images that contain uncertainty, and for treating that uncertainty through morphological operations. Two interval models are presented. For the first, a three-valued algebraic space is introduced, constructed on the basis of Łukasiewicz's three-valued logic. With this algebraic structure, a theory of interval binary images is introduced that extends the classic binary model by including uncertainty information. It can be used to represent binary images with uncertainty in pixels, originated, for example, during image acquisition. The lattice structure of these images allows the definition of morphological operators in which the uncertainties are treated locally. The second model extends the classic model to gray-level images, where the functions representing these images map into a finite set of interval values. The algebraic structure belongs to the class of complete lattices, which likewise allows the definition of the elementary operators of mathematical morphology, dilation and erosion, for these images. An interval theory applied to mathematical morphology is thus established for dealing with uncertainty problems in images.
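As a hedged sketch of how such interval uncertainty can be carried through a morphological operator (this shows the generic endpoint-wise idea, not the paper's Łukasiewicz-based construction):

    import numpy as np

    # Interval binary image: each pixel is an interval [lo, hi], lo, hi in {0, 1}.
    # [0, 0] = certainly background, [1, 1] = certainly foreground,
    # [0, 1] = uncertain (e.g. noise introduced during acquisition).
    lo = np.array([[0, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 0]])
    hi = np.array([[0, 0, 0, 0],
                   [0, 1, 1, 0],   # pixel (1, 2) is uncertain: lo=0, hi=1
                   [0, 0, 0, 0]])

    def dilate(img):
        # Plain binary dilation by a 3x3 structuring element: window maximum.
        padded = np.pad(img, 1)
        return np.array([[padded[i:i + 3, j:j + 3].max()
                          for j in range(img.shape[1])]
                         for i in range(img.shape[0])])

    # Dilation is monotone on the lattice, so applying it endpoint-wise gives
    # an interval [dilate(lo), dilate(hi)] bracketing every possible dilation
    # of the uncertain image; erosion works dually with the window minimum.
    d_lo, d_hi = dilate(lo), dilate(hi)
    still_uncertain = d_hi - d_lo    # 1 where the result remains uncertain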

Relevance:

40.00%

Publisher:

Abstract:

This paper investigates robust H∞ control for Takagi-Sugeno (T-S) fuzzy systems with interval time-varying delay. By employing a new and tighter integral inequality and constructing an appropriate type of Lyapunov functional, delay-dependent stability criteria are derived for the control problem. Because neither model transformations nor free weighting matrices are employed in our theoretical derivation, the developed stability criteria significantly improve upon and simplify the existing stability conditions. Also, the maximum allowable upper delay bound and the controller feedback gains can be obtained simultaneously from the developed approach by solving a constrained convex optimization problem. Numerical examples are given to demonstrate the effectiveness of the proposed methods.
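Criteria of this kind are checked numerically as linear matrix inequalities (LMIs). As a deliberately simplified illustration of that machinery (a delay-free quadratic-stability test only; the paper's actual criteria add Lyapunov-Krasovskii terms for the interval delay), a cvxpy sketch:

    import cvxpy as cp
    import numpy as np

    # Toy stable system matrix; the paper's conditions also involve the
    # delayed dynamics, which this sketch omits.
    A = np.array([[-2.0, 1.0],
                  [0.0, -1.0]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),                 # P positive definite
                   A.T @ P + P @ A << -eps * np.eye(n)]  # Lyapunov inequality
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print(prob.status)  # 'optimal' certifies quadratic stability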

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a chance-constrained programming approach for constructing maximum-margin classifiers which are robust to interval-valued uncertainty in training examples. The methodology ensures that uncertain examples are classified correctly with high probability by employing chance constraints. The main contribution of the paper is to pose the resulting optimization problem as a Second Order Cone Program by using large deviation inequalities due to Bernstein. Apart from the support and mean of the uncertain examples, these Bernstein-based relaxations make no further assumptions on the underlying uncertainty. Classifiers built using the proposed approach are less conservative, yield higher margins and hence are expected to generalize better than existing methods. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle interval-valued uncertainty than the state of the art.
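The flavour of the resulting optimization can be sketched as follows (a hedged illustration: this keeps the simpler hard worst case over the interval boxes, whereas the paper softens it into probabilistic second-order-cone constraints via the Bernstein bounds):

    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic interval-valued data: centres X, per-feature half-widths R,
    # labels y in {-1, +1}. Entirely illustrative.
    X = np.vstack([rng.normal(+2, 0.5, (20, 2)),
                   rng.normal(-2, 0.5, (20, 2))])
    R = np.full_like(X, 0.3)
    y = np.hstack([np.ones(20), -np.ones(20)])

    w, b = cp.Variable(2), cp.Variable()
    # The worst case over the box x_i +/- r_i erodes the margin by r_i . |w|,
    # so correct classification of the whole box needs
    #   y_i (w . x_i + b) >= 1 + r_i . |w|.
    cons = [cp.multiply(y, X @ w + b) >= 1 + R @ cp.abs(w)]
    cp.Problem(cp.Minimize(cp.norm(w, 2)), cons).solve()
    print(w.value, b.value)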

Relevance:

40.00%

Publisher:

Abstract:

Uncertainties that are not considered in the analytical model of a plant dramatically decrease the performance of fault detection in practice. To cope better with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis which takes those uncertainties in the plant model into account. A fault detection method is developed based on this model; it is quite robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the major behaviour of the fault that has occurred, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks appropriately.
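A minimal sketch of the interval consistency test behind such detectors (illustrative plant and numbers, not the paper's; the ANFIS accommodation step is omitted):

    # First-order plant x[k+1] = a*x[k] + b*u[k] with parameters known only
    # as intervals. Inputs and states are assumed non-negative so the interval
    # endpoints below combine monotonically.
    a_lo, a_hi = 0.78, 0.82
    b_lo, b_hi = 0.95, 1.05

    def step(x_lo, x_hi, u):
        return a_lo * x_lo + b_lo * u, a_hi * x_hi + b_hi * u

    x_lo = x_hi = 1.0
    for k, (u, y_meas) in enumerate([(1.0, 1.85), (1.0, 2.40), (1.0, 6.00)]):
        x_lo, x_hi = step(x_lo, x_hi, u)
        # A measurement consistent with SOME model in the interval family
        # raises no alarm, so model uncertainty cannot cause false alarms.
        if not (x_lo <= y_meas <= x_hi):
            print(f"fault detected at step {k}: {y_meas} is outside "
                  f"[{x_lo:.2f}, {x_hi:.2f}]")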

Relevance:

40.00%

Publisher:

Abstract:

Recently, Branzei, Dimitrov, and Tijs (2003) introduced cooperative interval-valued games. Among other insights, the notion of an interval core has been coined and proposed as a solution concept for interval-valued games. In this paper we will present a general mathematical programming algorithm which can be applied to find an element in the interval core. As an example, we discuss lot sizing with uncertain demand to provide an application for interval-valued games and to demonstrate how interval core elements can be computed. Also, we reveal that pitfalls exist if interval core elements are computed in a straightforward manner by considering the interval borders separately.
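For a single border game, one core element can be found with a feasibility LP, as sketched below with illustrative values (the paper's algorithm treats the interval structure jointly, precisely because handling the two borders separately can mislead):

    from itertools import combinations
    from scipy.optimize import linprog

    players = (0, 1, 2)
    # Illustrative coalition values for one border game (not from the paper).
    v = {(0,): 0, (1,): 0, (2,): 0,
         (0, 1): 4, (0, 2): 4, (1, 2): 4, (0, 1, 2): 9}

    # Core constraints: x(S) >= v(S) for every proper coalition S, plus
    # x(N) = v(N). linprog uses "<=", so x(S) >= v(S) becomes -x(S) <= -v(S).
    A_ub, b_ub = [], []
    for size in (1, 2):
        for S in combinations(players, size):
            A_ub.append([-1.0 if p in S else 0.0 for p in players])
            b_ub.append(-v[S])
    res = linprog(c=[0, 0, 0], A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1.0, 1.0, 1.0]], b_eq=[v[(0, 1, 2)]],
                  bounds=[(None, None)] * 3)
    print(res.x)  # one allocation in the core of this border game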

Relevance:

40.00%

Publisher:

Abstract:

One of the earliest accounts of duration perception by Karl von Vierordt implied a common process underlying the timing of intervals in the sub-second and the second range. To date, there are two major explanatory approaches for the timing of brief intervals: the Common Timing Hypothesis and the Distinct Timing Hypothesis. While the common timing hypothesis also proceeds from a unitary timing process, the distinct timing hypothesis suggests two dissociable, independent mechanisms for the timing of intervals in the sub-second and the second range, respectively. In the present paper, we introduce confirmatory factor analysis (CFA) to elucidate the internal structure of interval timing in the sub-second and the second range. Our results indicate that the assumption of two mechanisms underlying the processing of intervals in the second and the sub-second range might be more appropriate than the assumption of a unitary timing mechanism. In contrast to the basic assumption of the distinct timing hypothesis, however, these two timing mechanisms are closely associated with each other and share 77% of common variance. This finding suggests either a strong functional relationship between the two timing mechanisms or a hierarchically organized internal structure. Findings are discussed in the light of existing psychophysical and neurophysiological data.

Relevance:

40.00%

Publisher:

Abstract:

This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely area under ROC curve, sensitivity, specificity, precision, F1 measure and Brier score, for evaluation of feature subsets and as the objectives of the problem. One of the characteristics of these objective functions is the existence of noise in their values that should be appropriately handled during optimization. Our proposed algorithm consists of two major techniques which are specially designed for the feature subset selection problem. The first is a solution ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method for learning a joint probabilistic model of objectives and variables which is used to generate new solutions and advance through the search space. To simplify model estimation, l1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm is able to obtain comparable or better performance on the tested datasets.
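One standard interval-dominance rule, shown as a sketch (not necessarily the exact ranking method proposed in the paper): with noisy objectives summarised as intervals, a solution is discarded only if another solution beats it even in the worst case.

    # Each objective value is an interval (lo, hi), e.g. a confidence interval
    # built from repeated noisy evaluations. Minimisation is assumed.
    def dominates(a, b):
        """True if a beats b on every objective even in a's worst case."""
        return all(a_hi < b_lo for (_, a_hi), (b_lo, _) in zip(a, b))

    def nondominated(pop):
        return [s for s in pop
                if not any(dominates(t, s) for t in pop if t is not s)]

    pop = [[(0.10, 0.20), (0.30, 0.40)],
           [(0.25, 0.35), (0.50, 0.60)],   # worst-case dominated by the first
           [(0.15, 0.30), (0.20, 0.45)]]   # intervals overlap: it is kept
    print(nondominated(pop))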

Relevance:

30.00%

Publisher:

Abstract:

The increasing prevalence of obesity in society has been associated with a number of atherogenic risk factors such as insulin resistance. Aerobic training is often recommended as a strategy to induce weight loss, with a greater impact of high-intensity levels on cardiovascular function and insulin sensitivity, and a greater impact of moderate-intensity levels on fat oxidation. Anaerobic high-intensity (supramaximal) interval training has been advocated to improve cardiovascular function, insulin sensitivity and fat oxidation. However, obese individuals tend to have a lower tolerance of high-intensity exercise due to discomfort. Furthermore, some obese individuals may compensate for the increased energy expenditure by eating more and/or becoming less active. Recently, both moderate- and high-intensity aerobic interval training have been advocated as alternative approaches. However, it is still uncertain which approach is more effective in terms of increasing fat oxidation, given the issues with levels of fitness and motivation, and compensatory behaviours. Accordingly, the objectives of this thesis were to compare the influence of moderate- and high-intensity interval training (MIIT and HIIT) on fat oxidation and eating behaviour in overweight/obese men.

Two exercise interventions were undertaken by 10-12 overweight/obese men to compare their responses to the study variables, including fat oxidation and eating behaviour, during MIIT and HIIT. The acute training intervention was a methodological study designed to examine the validity of using exercise intensity from the graded exercise test (GXT), which measured the intensity eliciting maximal fat oxidation (FATmax), to prescribe interval training during 30-min MIIT. The 30-min MIIT session involved 5-min repetitions of workloads 20% below and 20% above FATmax. The acute intervention was extended to involve HIIT in a cross-over design comparing the influence of MIIT and HIIT on eating behaviour, using subjective appetite sensations and food preference assessed through the liking and wanting test. The HIIT consisted of 15-sec intervals at 85 %VO2peak interspersed with 15-sec unloaded recovery, with total mechanical work equal to MIIT. The medium-term training intervention was a cross-over 4-week (12-session) MIIT and HIIT training block with a 6-week detraining washout period. The MIIT sessions consisted of 5-min cycling stages at ±20% of the mechanical work at 45 %VO2peak, and the HIIT sessions consisted of repeated 30-sec work bouts at 90 %VO2peak with 30-sec interval rests, during identical exercise sessions of between 30 and 45 mins. Assessments included a constant-load test (45 %VO2peak for 45 mins) followed by 60-min recovery, at baseline and at the end of the 4-week training, to determine fat oxidation rate. Participants' responses to exercise, namely blood lactate (BLa), heart rate (HR) and rating of perceived exertion (RPE), were measured during the constant-load test and in the first training session of every week during training. Eating behaviour responses were assessed by measuring subjective appetite sensations, liking and wanting, and ad libitum energy intake.

Results of the acute intervention showed that FATmax is a valid method to estimate VO2 and BLa, but not HR and RPE, in the MIIT session. While the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax (0.16 ±0.09 and 0.14 ±0.08 g/min, respectively), fat oxidation was significantly higher at minute 25 of MIIT (P≤0.01). In addition, there was no significant difference between MIIT and HIIT in appetite sensations after exercise, although there was a tendency towards a lower rate of hunger after HIIT. Different intensities of interval exercise also did not affect explicit liking or implicit wanting.

Results of the medium-term intervention indicated that the interval training used did not affect body composition, fasting insulin or fasting glucose. Maximal aerobic capacity increased significantly (P≤0.01; 2.8 and 7.0% after MIIT and HIIT, respectively) during the GXT, and fat oxidation increased significantly (P≤0.01; 96 and 43% after MIIT and HIIT, respectively) during the acute constant-load exercise test. RPE decreased significantly more after HIIT than after MIIT (P≤0.05), and the decrease in BLa during the constant-load test was greater after HIIT than after MIIT, although this difference did not reach statistical significance (P=0.09). In addition, following constant-load exercise, exercise-induced hunger and desire to eat decreased more after HIIT than after MIIT, but not significantly (p = 0.07 for desire to eat). Exercise-induced liking of high-fat sweet (HFSW) and high-fat non-sweet (HFNS) foods increased after MIIT and decreased after HIIT (p = 0.09 for HFNS). The intervention explained 12.4% of the change in fat intake (p = 0.07).

This research is significant in that it confirmed two points in the acute study. While the rate of fat oxidation increased during MIIT, the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax. In addition, manipulating the intensity of acute interval exercise did not affect appetite sensations or liking and wanting. In the medium-term intervention, constant-load exercise-induced fat oxidation increased significantly after interval training, independent of exercise intensity. In addition, desire to eat, explicit liking for HFNS foods and fat intake collectively confirmed that MIIT is accompanied by greater compensation in eating behaviour than HIIT. Findings from this research will assist in developing exercise strategies that provide obese men with various training options. In addition, the finding that overweight/obese men expressed a lower RPE and decreased BLa after HIIT compared with MIIT is contrary to the view that obese individuals may not tolerate high-intensity interval training. Therefore, high-intensity interval training can be advocated for the obese adult male population. Future studies may extend this work by using a longer-term intervention.

Relevance:

30.00%

Publisher:

Abstract:

Through the application of process mining, valuable evidence-based insights can be obtained about business processes in organisations. As a result, the field has seen increased uptake in recent years, as evidenced by success stories and growing tool support. However, despite this impact, current performance analysis capabilities remain somewhat limited in the context of information-poor event logs; for example, natural daily and weekly patterns are not considered. In this paper, a new framework for analysing event logs is defined, based on the concept of an event gap. The framework allows for a systematic approach to sophisticated performance-related analysis of event logs containing varying degrees of information. The paper formalises a range of event gap types and then presents an implementation as well as an evaluation of the proposed approach.
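A minimal sketch of the simplest such notion, the elapsed time between consecutive events of the same case (the framework formalises a richer range of gap types than this one):

    import pandas as pd

    # Toy event log with two cases (illustrative data).
    log = pd.DataFrame({
        "case": ["c1", "c1", "c1", "c2", "c2"],
        "activity": ["register", "review", "decide", "register", "decide"],
        "timestamp": pd.to_datetime([
            "2024-01-01 09:00", "2024-01-01 17:00", "2024-01-03 10:00",
            "2024-01-02 08:00", "2024-01-02 09:30"]),
    })

    log = log.sort_values(["case", "timestamp"])
    # Event gap: time since the previous event of the same case.
    log["gap"] = log.groupby("case")["timestamp"].diff()
    print(log)  # NaT marks the first event of each case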

Relevance:

30.00%

Publisher:

Abstract:

The dissertation consists of three essays on the misplanning of wealth and health accumulation. Conventional economics assumes that individuals' intertemporal preferences are exponential (exponential preferences, EP). Recent findings in behavioural economics have shown that people actually discount the near future relatively more heavily than the distant future, which implies hyperbolic intertemporal preferences (HP). Essays I and II concentrate especially on the effects of delayed completion of tasks, a feature of behaviour that HP enables. Essay III uses current Finnish data to analyse the evolution of quality-adjusted life years (QALYs) and inconsistencies in measuring them.

Essay I studies the effects of the existence of a lucrative retirement savings program (SP) on the retirement savings of different individual types with HP. If an individual does not know that he will also have HP in the future, i.e. he is naïve, then under certain conditions he delays enrolment in the SP until he abandons it. A very interesting finding is that the naïve individual then retires poorer in the presence of the SP than in its absence. Under the same conditions, an individual who knows that he will also have HP in the future, i.e. he is sophisticated, gains from the existence of the SP and retires with greater retirement savings in its presence than in its absence. Finally, the capability to learn from past behaviour and about one's intertemporal preferences improves the possibility of gaining from the existence of the SP, but an adequate time to learn must then be guaranteed.

Essay II studies delayed doctor's visits, their effects on the costs of a public health care system, and the government's attempts to control patient behaviour and fund the system. The control devices are a consultation fee and a deductible on it. The deductible applies only to a patient whose diagnosis reveals a disease that would not be cured without the doctor's visit. The naïves delay their visits the longest, while EP patients are the quickest visitors. To control the naïves, the government should implement a low fee and a high deductible, while for the sophisticates the opposite is true. Finally, if all types exist in an economy, then the incorrect conventional assumption that all individuals have EP leads to a worse situation, and requires higher tax rates, than the incorrect but unconventional assumption that only naïves exist.

Essay III studies the development of QALYs in Finland in 1995/96-2004. The essay concentrates on developing a consistent measure, i.e. one independent of discounting, of age- and gender-specific QALY changes and their incidence. For the given time interval, the use of the relative change out of the attainable change appears to be almost unaffected by discounting, and reveals that the greatest gains accrue to older age groups.
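The EP/HP contrast can be made concrete with a small sketch (parameter values are illustrative, not taken from the essays):

    # Discount weights assigned to a payoff t periods away.
    delta, beta, k = 0.95, 0.7, 0.25

    for t in range(6):
        exp_w = delta ** t                           # exponential (EP)
        hyp_w = 1.0 / (1.0 + k * t)                  # hyperbolic (HP)
        qh_w = 1.0 if t == 0 else beta * delta ** t  # beta-delta quasi-hyperbolic
        print(t, round(exp_w, 3), round(hyp_w, 3), round(qh_w, 3))

    # The HP weights fall steeply between t=0 and t=1 and flatten afterwards,
    # which is exactly what makes postponing a task "one more period" tempting.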

Relevance:

30.00%

Publisher:

Abstract:

An error-free computational approach is employed for finding the integer solution to a system of linear equations, using finite-field arithmetic. This approach is also extended to find the optimum solution of linear inequalities such as those arising in interval linear programming problems.
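A minimal sketch of the finite-field idea (illustrative, not the paper's algorithm): solving Ax = b over GF(p) by Gaussian elimination with modular inverses involves no rounding error at all; integer solutions can then be recovered, for example by combining the results for several primes with the Chinese remainder theorem.

    # Solve A x = b over GF(p) using exact modular arithmetic (Python 3.8+).
    def solve_mod_p(A, b, p=10_007):
        n = len(A)
        M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
        for col in range(n):
            piv = next(r for r in range(col, n) if M[r][col])  # nonzero pivot
            M[col], M[piv] = M[piv], M[col]
            inv = pow(M[col][col], -1, p)      # modular inverse
            M[col] = [x * inv % p for x in M[col]]
            for r in range(n):
                if r != col and M[r][col]:
                    f = M[r][col]
                    M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
        return [row[n] for row in M]

    # 2x2 example: 2x + y = 4, x + 3y = 7 has the integer solution (1, 2).
    print(solve_mod_p([[2, 1], [1, 3]], [4, 7]))  # -> [1, 2]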

Relevance:

30.00%

Publisher:

Abstract:

Radiation therapy (RT) currently plays a significant role in the curative treatment of several cancers. External beam RT is mostly carried out using the megavoltage beams of linear accelerators. Tumour eradication and normal tissue complications correlate with the dose absorbed in tissues. Normally this dependence is steep, and it is crucial that the actual dose within the patient accurately corresponds to the planned dose. All factors in an RT procedure contain uncertainties, requiring strict quality assurance. From a hospital physicist's point of view, technical quality control (QC), dose calculations and methods for verifying the correct treatment location are the most important subjects.

The most important factor in technical QC is verifying that the radiation production of an accelerator, called its output, is within narrow acceptable limits. The output measurements are carried out according to a locally chosen dosimetric QC program defining the measurement time interval and action levels. Dose calculation algorithms need to be configured for the accelerators using measured beam data; the uncertainty of such data sets a limit on the best achievable calculation accuracy. All these dosimetric measurements require considerable experience, are labour-intensive, take up resources needed for treatments, and are prone to several random and systematic sources of error. Appropriate verification of the treatment location is more important in intensity-modulated radiation therapy (IMRT) than in conventional RT, because steep dose gradients are produced within, or close to, healthy tissues located only a few millimetres from the targeted volume.

This thesis investigated the quality of dosimetric measurements, the efficacy of dosimetric QC programs, the verification of measured beam data, and the effect of positional errors on the dose received by the major salivary glands in head and neck IMRT. A method was developed for estimating the effect of different dosimetric QC programs on the overall uncertainty of dose, and data were provided to facilitate the choice of a sufficient QC program. The method takes into account local output stability and the reproducibility of the dosimetric QC measurements; a method based on model fitting of the QC measurement results was proposed for estimating both of these factors. The reduction of random measurement errors and the optimisation of the QC procedure were also investigated, and a method and suggestions were presented for these purposes. The accuracy of beam data was evaluated in Finnish RT centres, a sufficient accuracy level was estimated for the beam data, and a method based on the use of reference beam data was developed for the QC of beam data. Dosimetric and geometric accuracy requirements were evaluated for head and neck IMRT when the function of the major salivary glands is intended to be spared; these criteria are based on the dose response obtained for the glands.

Random measurement errors could be reduced, enabling action levels to be lowered and the measurement time interval to be prolonged from 1 month to as much as 6 months while maintaining dose accuracy. The combined effect of the proposed methods, suggestions and criteria was found to facilitate the avoidance of maximal dose errors of up to about 8%. In addition, their use may make the strictest recommended overall dose accuracy level of 3% (1 SD) achievable.
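The proposed model fitting of QC measurement results can be pictured with a hedged sketch (made-up numbers): fitting a linear drift to an output series separates long-term output stability (the slope) from measurement reproducibility (the residual scatter), the two factors that drive the choice of action levels and measurement interval.

    import numpy as np

    # Monthly accelerator output QC readings relative to baseline (%, made up).
    t = np.arange(12)  # months
    output = np.array([100.0, 100.2, 99.9, 100.3, 100.1, 100.4,
                       100.2, 100.5, 100.3, 100.6, 100.4, 100.7])

    slope, intercept = np.polyfit(t, output, 1)
    residuals = output - (slope * t + intercept)
    print(f"drift: {slope:.3f} %/month")                        # output stability
    print(f"reproducibility (1 SD): {residuals.std(ddof=2):.3f} %")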

Relevance:

30.00%

Publisher:

Abstract:

We address the problem of high-resolution reconstruction in frequency-domain optical-coherence tomography (FDOCT). The traditional method employed uses the inverse discrete Fourier transform, which is limited in resolution due to the Heisenberg uncertainty principle. We propose a reconstruction technique based on zero-crossing (ZC) interval analysis. The motivation for our approach lies in the observation that, for a multilayered specimen, the backscattered signal may be expressed as a sum of sinusoids, and each sinusoid manifests as a peak in the FDOCT reconstruction. The successive ZC intervals of a sinusoid exhibit high consistency, with the intervals being inversely related to the frequency of the sinusoid. The statistics of the ZC intervals are used for detecting the frequencies present in the input signal. The noise robustness of the proposed technique is improved by using a cosine-modulated filter bank for separating the input into different frequency bands, and the ZC analysis is carried out on each band separately. The design of the filter bank requires the design of a prototype, which we accomplish using a Kaiser window approach. We show that the proposed method gives good results on synthesized and experimental data. The resolution is enhanced, and noise robustness is higher compared with the standard Fourier reconstruction. (c) 2012 Optical Society of America
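The core ZC observation can be sketched for a single noisy sinusoid (no filter bank; the paper adds the cosine-modulated bank with a Kaiser-window prototype for noise robustness):

    import numpy as np

    fs = 10_000.0                 # sampling rate (Hz)
    t = np.arange(0, 0.1, 1 / fs)
    f_true = 440.0
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * f_true * t) + 0.05 * rng.normal(size=t.size)

    # Zero-crossing locations: indices where the sign of the signal changes.
    zc = np.flatnonzero(np.signbit(x[:-1]) != np.signbit(x[1:]))
    intervals = np.diff(zc) / fs  # ZC intervals in seconds

    # Each ZC interval of a sinusoid is ~half a period; the median makes the
    # estimate robust to occasional noise-induced extra crossings.
    f_est = 1.0 / (2.0 * np.median(intervals))
    print(f_est)                  # close to 440 Hz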

Relevance:

30.00%

Publisher:

Abstract:

In several systems, the physical parameters vary over time or with the operating point. A popular way of representing such plants with structured or parametric uncertainty is by means of interval polynomials. However, ensuring the stability of such systems is a robust control problem. Fortunately, Kharitonov's theorem enables the analysis of such interval plants and also provides tools for the design of robust controllers in such cases. The present paper considers one such case, where the interval plant is connected with a time-invariant, static, odd, sector-type nonlinearity in its feedback path. This paper provides necessary conditions for the existence of self-sustaining periodic oscillations in such interval plants, and indicates a possible design algorithm for avoiding such periodic solutions, or limit cycles. The describing function technique is used to approximate the nonlinearity and thereby arrive at the results. Furthermore, the value set approach, along with the Mikhailov conditions, is used to provide graphical techniques for deriving the conditions and the subsequent controller design algorithm.
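For reference, the Kharitonov test itself is short, as the sketch below shows with illustrative interval coefficients (the paper's describing-function limit-cycle analysis is not reproduced here):

    import numpy as np

    # Interval coefficients [lo, hi] of a_0 + a_1 s + a_2 s^2 + a_3 s^3.
    lo = np.array([1.0, 3.0, 2.5, 0.5])
    hi = np.array([2.0, 4.0, 3.5, 1.0])

    # Kharitonov's four vertex polynomials interleave the bounds with a
    # period-4 pattern over the coefficient index k = 0, 1, 2, ...
    patterns = [(lo, lo, hi, hi), (hi, hi, lo, lo),
                (lo, hi, hi, lo), (hi, lo, lo, hi)]

    for i, pat in enumerate(patterns, 1):
        coeffs = np.array([pat[k % 4][k] for k in range(len(lo))])
        roots = np.roots(coeffs[::-1])     # np.roots wants highest power first
        print(f"K{i} Hurwitz: {bool(np.all(roots.real < 0))}")
    # The whole interval family is robustly stable iff all four are Hurwitz.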