908 results for Discrete Wavelet Transforms


Relevance:

20.00%

Publisher:

Abstract:

Objective: To explore gastroenterologists' preferences regarding biosimilar drugs in Crohn's disease and to reveal the trade-offs between the perceived risks and benefits of biosimilars. Method: A discrete choice experiment was carried out with 51 Hungarian gastroenterologists in May 2014. The following attributes were used to describe hypothetical choice sets: 1) type of treatment (biosimilar/originator); 2) severity of disease; 3) availability of a continuous medicine supply; 4) frequency of efficacy check-ups. A multinomial logit model was used to differentiate between three attitude types: 1) always opting for the originator; 2) willing to consider a biosimilar for biologically naïve patients only; 3) willing to consider biosimilar treatment for both types of patients. A conditional logit model was used to estimate the probabilities of choosing a given profile. Results: Men, senior consultants, physicians working in an IBD center, and those treating more patients were more likely to consider a biosimilar for biologically naïve patients only. Treatment type (originator/biosimilar) was the most important determinant of choice for patients already treated with biologicals, and the availability of a continuous medicine supply was most important for biologically naïve patients. Under current reimbursement conditions, the probabilities of choosing the biosimilar with all the benefits offered over the originator were 89% vs. 11% for new patients and 44% vs. 56% for patients already treated with a biological. Conclusions: Gastroenterologists were willing to trade between the perceived risks and benefits of biosimilars. A continuous medicine supply would be one of the major benefits of biosimilars. However, the benefits offered in the scenarios do not compensate for switching patients already treated with biologicals from the originator to the biosimilar.
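The choice probabilities in such a discrete choice experiment come from a conditional (McFadden) logit rule: each profile's utility is a sum of attribute part-worths, and probabilities are a softmax over utilities. A minimal sketch; the attribute coefficients below are purely illustrative, not the study's estimates:

```python
import math

def choice_probabilities(utilities):
    """Conditional (McFadden) logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical part-worth utilities for two profiles (coefficients are
# illustrative stand-ins, not estimates from the experiment):
beta = {"biosimilar": -1.2, "continuous_supply": 0.9, "fewer_checkups": 0.4}
v_originator = 0.0                           # reference profile
v_biosimilar = beta["biosimilar"] + beta["continuous_supply"] + beta["fewer_checkups"]

p_orig, p_bio = choice_probabilities([v_originator, v_biosimilar])
```

With these made-up coefficients the offered benefits slightly outweigh the biosimilar penalty, so the biosimilar profile is chosen a bit more than half the time.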


Empirical macroeconomic analyses are usually restricted to the examination of time series; however, a growing number of studies conclude that understanding processes at the level of frequency domains is also necessary to obtain a more accurate picture of the direction, strength, and dynamics of the relationships between variables. The aim of this paper is to use continuous wavelet transforms, which allow analysis in both the time and frequency domains, and the associated wavelet coherence to present the co-movement of the inflation, industrial output, and GDP indicators of the Swedish and Norwegian economies with the oil price. The results suggest that analysis with continuous wavelets is a useful complement to the usual time series techniques and may also be suitable for uncovering new, previously unknown relationships.
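A continuous wavelet transform of the kind described can be sketched as a naive Morlet convolution over a grid of scales. The series, noise level, and scale grid below are synthetic stand-ins for the macroeconomic data, and the normalization is simplified:

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Real-valued Morlet wavelet at a given scale (simplified L2 normalization)."""
    x = t / scale
    return np.cos(w0 * x) * np.exp(-0.5 * x**2) / np.sqrt(scale)

def cwt(signal, scales, dt=1.0):
    """Naive continuous wavelet transform by direct convolution."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        out[i] = np.convolve(signal, morlet(t, s), mode="same") * dt
    return out

# Toy "macro series": a slow oscillation with period 32 samples plus noise.
rng = np.random.default_rng(0)
n = 256
sig = np.sin(2 * np.pi * np.arange(n) / 32) + 0.2 * rng.standard_normal(n)
scales = np.arange(2, 49)
power = cwt(sig, scales) ** 2
mean_power = power[:, 32:-32].mean(axis=1)   # drop edges (cone of influence)
best_scale = scales[np.argmax(mean_power)]
```

For a Morlet wavelet with `w0 = 6`, scale and Fourier period are nearly equal, so the power should peak near scale 30 for a period-32 oscillation; wavelet coherence would repeat this per-scale analysis on the cross-spectrum of two series.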


In this study, discrete-time one-factor models of the term structure of interest rates and their application to the pricing of interest rate contingent claims are examined theoretically and empirically. The first chapter provides a discussion of the issues involved in the pricing of interest rate contingent claims and a description of the Ho and Lee (1986), Maloney and Byrne (1989), and Black, Derman, and Toy (1990) discrete-time models. In the second chapter, a general discrete-time model of the term structure is presented from which the Ho and Lee, Maloney and Byrne, and Black, Derman, and Toy models can all be obtained. The general model also provides for the specification of an additional model, the ExtendedMB model. The third chapter illustrates the application of the discrete-time models to the pricing of a variety of interest rate contingent claims. In the final chapter, the performance of the Ho and Lee, Black, Derman, and Toy, and ExtendedMB models in the pricing of Eurodollar futures options is investigated empirically. The results indicate that the Black, Derman, and Toy and ExtendedMB models outperform the Ho and Lee model; little difference in the performance of the Black, Derman, and Toy and ExtendedMB models is detected.
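The common skeleton of these discrete-time models is backward induction on a recombining binomial short-rate lattice. A hedged sketch with additive (Ho-Lee-style) rate moves; the drift and volatility inputs are illustrative, not calibrated values from the dissertation:

```python
# Price a zero-coupon bond on an additive binomial short-rate lattice by
# backward induction.  theta (per-step drift) and sigma (per-step volatility)
# are illustrative inputs, not calibrated parameters.

def zcb_price(r0, theta, sigma, n_steps, dt=1.0):
    # Short rate at step t, node j (j = number of up-moves):
    #   r(t, j) = r0 + theta*t*dt + sigma*(2j - t)
    values = [1.0] * (n_steps + 1)           # payoff of 1 at maturity
    for t in range(n_steps - 1, -1, -1):
        new = []
        for j in range(t + 1):
            r = r0 + theta * t * dt + sigma * (2 * j - t)
            expected = 0.5 * (values[j + 1] + values[j])   # risk-neutral p = 1/2
            new.append(expected / (1.0 + r * dt))          # discrete discounting
        values = new
    return values[0]

price = zcb_price(r0=0.05, theta=0.001, sigma=0.01, n_steps=4)
```

Contingent claims are priced the same way, replacing the terminal payoff of 1 with the claim's payoff at each node; calibrating the drift at each step to the observed yield curve is what distinguishes the specific models.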


This thesis develops and validates the framework of a specialized maintenance decision support system for a discrete part manufacturing facility. Its construction utilizes a modular approach based on the fundamental philosophy of Reliability Centered Maintenance (RCM). The proposed architecture uniquely integrates System Decomposition, System Evaluation, Failure Analysis, Logic Tree Analysis, and Maintenance Planning modules, presenting a solution to the unique maintenance inadequacies of modern discrete part manufacturing systems. Well-established techniques are incorporated as building blocks of the system's modules, including Failure Mode, Effects, and Criticality Analysis (FMECA), Logic Tree Analysis (LTA), the Theory of Constraints (TOC), and an Expert System (ES). A Maintenance Information System (MIS) performs the system's support functions. Validation was performed by field testing of the system at a Miami-based manufacturing facility. Such a maintenance support system potentially reduces downtime losses and contributes to higher product quality output; ultimately, improved profitability is the outcome.
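The Failure Analysis step of such a framework typically ranks failure modes by a FMECA-style risk priority number (RPN = severity x occurrence x detection), which the Logic Tree Analysis then uses to assign maintenance tasks. A minimal sketch; the failure modes and scores below are hypothetical, not data from the thesis:

```python
# FMECA-style criticality ranking: risk priority number (RPN) is the product
# of severity, occurrence, and detection scores (each on a 1-10 scale here).
# Failure modes and scores are hypothetical examples.

failure_modes = [
    {"mode": "spindle bearing wear", "severity": 8, "occurrence": 5, "detection": 4},
    {"mode": "coolant pump leak",    "severity": 4, "occurrence": 7, "detection": 2},
    {"mode": "servo drive fault",    "severity": 9, "occurrence": 2, "detection": 6},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Logic-tree step: the highest-RPN modes get proactive maintenance tasks first.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
```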


Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is a more robust set of statistics that balances the more fragile principal components analysis.
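The principal-components step described above amounts to diagonalizing the covariance matrix of yield-curve shifts. A minimal sketch on simulated shifts, where a level and a slope factor are injected artificially (real Treasury data would replace `shifts`):

```python
import numpy as np

# PCA of simulated daily yield-curve shifts (maturities are columns).  The
# factor structure is injected artificially for illustration only.
rng = np.random.default_rng(1)
maturities = np.array([0.25, 1, 2, 5, 10, 30])
n_days, n_mat = 1000, len(maturities)

level = rng.standard_normal((n_days, 1)) * np.ones((1, n_mat))   # parallel shift
slope = rng.standard_normal((n_days, 1)) * np.linspace(-1, 1, n_mat)  # steepener
noise = 0.1 * rng.standard_normal((n_days, n_mat))
shifts = 5 * level + 2 * slope + noise           # basis-point shifts

cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalue order
eigvals = eigvals[::-1]                          # descending
explained = eigvals / eigvals.sum()
top2 = explained[:2].sum()                       # variance share of top 2 PCs
```

With only two true factors plus noise, the top two principal components explain nearly all the variation, mirroring the dimension-reduction finding in the dissertation.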


Access to healthcare is a major problem in which patients are deprived of timely admission to care. Poor access has resulted in significant but avoidable healthcare costs, poor quality of care, and deterioration in general public health. Advanced Access is a simple and direct approach to appointment scheduling in which the majority of a clinic's appointment slots are kept open in order to provide access for immediate or same-day healthcare needs and thereby alleviate the problem of poor access to healthcare. This research formulates a non-linear, discrete, stochastic mathematical model of the Advanced Access appointment scheduling policy. The model objective is to maximize the expected profit of the clinic subject to constraints on the minimum access to healthcare provided. Patient behavior is characterized with probabilities for no-shows, balking, and related patient choices. Structural properties of the model are analyzed to determine whether Advanced Access patient scheduling is feasible. To solve the complex combinatorial optimization problem, a heuristic that combines a greedy construction algorithm with a neighborhood improvement search was developed. The model and the heuristic were used to evaluate the Advanced Access appointment policy against existing policies. Trade-offs between profit and access to healthcare are established, and an analysis of the input parameters was performed. The trade-off curve is a characteristic curve and was observed to be concave. This implies that there exists an access level at which the clinic can be operated at optimal profit. The results also show that, in many scenarios, by switching from an existing scheduling policy to the Advanced Access policy, clinics can improve access without any decrease in profit. Further, the success of the Advanced Access policy in providing improved access and/or profit depends on the expected value of demand, the variation in demand, and the ratio of demand for same-day versus advance appointments. The contributions of the dissertation are a model of Advanced Access patient scheduling, a heuristic to solve the model, and the use of the model to understand the scheduling policy trade-offs which healthcare clinic managers must make.
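The greedy-construction-plus-neighborhood-search idea can be sketched on a toy version of the problem: split a day's slots between prebooked appointments (subject to no-shows) and open same-day slots (subject to random demand). All parameters below are illustrative, not the dissertation's:

```python
import math

# Toy expected-profit model: `total` daily slots split between prebooked
# appointments (no-show probability q) and open same-day slots, with
# same-day demand D ~ Poisson(lam).  Parameters are illustrative only.

def expected_profit(open_slots, total=20, revenue=100.0, q=0.25, lam=6.0):
    booked = total - open_slots
    seen_booked = booked * (1 - q)                       # expected shows
    # expected same-day patients seen = E[min(D, open_slots)]
    seen_open = sum(min(d, open_slots) * math.exp(-lam) * lam**d / math.factorial(d)
                    for d in range(60))
    return revenue * (seen_booked + seen_open)

# Greedy construction: open more slots while expected profit improves ...
best = 0
while best < 20 and expected_profit(best + 1) > expected_profit(best):
    best += 1

# ... then a (trivial here) neighborhood improvement search around the
# greedy solution.
neighborhood = [k for k in (best - 1, best, best + 1) if 0 <= k <= 20]
best = max(neighborhood, key=expected_profit)
```

The marginal value of the k-th open slot is `revenue * P(D >= k)` against the certain `revenue * (1 - q)` from a prebooked slot, so the greedy rule stops exactly where those cross; adding a minimum-access constraint would shift this stopping point, which is the profit/access trade-off the model studies.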


This dissertation presents a unique research opportunity by using recordings which provide an electrocardiogram (ECG) plus a reference breathing signal (RBS). ECG-derived breathing (EDR) is measured and correlated against the RBS. Standard deviations of multiresolution wavelet analysis coefficients (SDMW) are obtained from heart rate and classified using the RBS. Prior work by others used selected patients for sleep apnea scoring with EDR but no RBS; other prior work classified selected heart disease patients with SDMW but no RBS. This study used randomly chosen sleep disorder patient recordings: central and obstructive apneas, with and without heart disease.

Implementation required creating an application because existing systems were limited in power and scope. A review survey was created to choose a development environment; the survey is presented as a learning tool and teaching resource. The development objectives were rapid development using limited resources (manpower and money). Open Source resources were used exclusively for the implementation.

The results show: (1) Three groups of patients exist in the study. Grouping RBS correlations shows a response with either ECG interval or amplitude variation; a third group exists where neither ECG intervals nor amplitude variation correlate with breathing. (2) Previous work by other groups analyzed SDMW. Similar results were found in this study, but some subjects had higher SDMW, attributed to a large number of apneas, arousals, and/or disconnects. SDMW does not need the RBS to show that apneic conditions exist within ECG recordings. (3) The results support the assertion that autonomic nervous system variation was measured with SDMW; the measurements are not corrupted by breathing even though respiration overlaps the same frequency band.

Overall, this work becomes an Open Source resource which can be reused, modified, and/or expanded, and it might fast-track additional research. In the future the system could also be used for public-domain data: prerecorded data exist in similar formats in public databases, which could provide additional research opportunities.
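The SDMW measure described above can be sketched with a hand-rolled Haar multiresolution decomposition: decompose the heart-rate (RR-interval) series into dyadic levels and take the standard deviation of the detail coefficients at each level. The RR series below is synthetic, not patient data:

```python
import numpy as np

def haar_wavedec(x, levels):
    """Multiresolution Haar decomposition; returns detail coefficients per level."""
    a = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]                 # truncate to even length
        d = (a[0::2] - a[1::2]) / np.sqrt(2)     # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)     # approximation for next level
        details.append(d)
    return details

# Toy RR-interval series: 0.8 s mean plus a slowly varying component that
# persists for 16 beats (a crude respiratory-scale stand-in, not patient data).
rng = np.random.default_rng(2)
n = 512
slow = 0.05 * np.repeat(rng.standard_normal(n // 16), 16)
rr = 0.8 + slow + 0.01 * rng.standard_normal(n)

# SDMW: standard deviation of the detail coefficients at each wavelet level.
sdmw = [float(np.std(d)) for d in haar_wavedec(rr, levels=5)]
```

The slow component shows up as a much larger standard deviation at the coarse levels than at the fine (beat-to-beat) levels, which is the kind of level-wise signature SDMW classification relies on.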


This research aims at a study of the hybrid flow shop problem which has parallel batch-processing machines in one stage and discrete-processing machines in the other stages to process jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted as: FF: batch1,sj:Cmax. The problem is formulated as a mixed-integer linear program. The commercial solver, AMPL/CPLEX, is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution for even a small problem; a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational (time) problem encountered while using the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules the sub-problems one by one. The proposed BFD heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete-processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. An experimental design is conducted to evaluate the effectiveness of the proposed BFD, which is further evaluated against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average, within negligible time, when the problem size is less than 50 jobs.
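The batching sub-problem for the batch-processing stage can be sketched with a simple first-fit-decreasing rule (a generic technique, not the study's own algorithm): group jobs of arbitrary sizes into capacity-limited batches, where a batch's processing time is the longest job in it. Job data are illustrative:

```python
# Jobs of arbitrary sizes are grouped into batches on a batch-processing
# machine of capacity C by first-fit decreasing on size.  A batch runs for
# as long as its longest job; on a single batch machine the makespan is the
# sum of the batch times.  Job data are illustrative examples.

def batch_ffd(jobs, capacity):
    """jobs: list of (size, proc_time).  Returns a list of batches (job lists)."""
    batches = []                       # each entry: [remaining_capacity, [jobs]]
    for size, p in sorted(jobs, reverse=True):      # decreasing size
        for b in batches:
            if b[0] >= size:           # first batch the job fits into
                b[0] -= size
                b[1].append((size, p))
                break
        else:
            batches.append([capacity - size, [(size, p)]])
    return [b[1] for b in batches]

jobs = [(4, 7), (3, 2), (2, 5), (2, 3), (1, 4)]     # (size, processing time)
batches = batch_ffd(jobs, capacity=6)
makespan = sum(max(p for _, p in batch) for batch in batches)
```

Here the five jobs pack into two batches, and the makespan is the sum of the two batch maxima; the BFD heuristic embeds a step of this kind inside the stage-by-stage decomposition.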


We organized an international campaign to observe the blazar 0716+714 in the optical band. The observations took place from February 24, 2009 to February 26, 2009. The global campaign was carried out by observers from more than sixteen countries and resulted in an extended light curve nearly seventy-eight hours long. The analysis and modeling of this light curve form the main work of this dissertation project.

In the first part of this work, we present the time series and noise analyses of the data. The time series analysis utilizes discrete Fourier transform and wavelet analysis routines to search for periods in the light curve. We then present results of the noise analysis, which is based on the idea that each microvariability curve is the realization of the same underlying stochastic noise processes in the blazar jet. Neither recurring periods nor random noise can successfully explain the observed optical fluctuations.

Hence, in the second part, we propose and develop a new model to account for the microvariability we see in blazar 0716+714. We propose that the microvariability is due to emission from turbulent regions in the jet that are energized by the passage of relativistic shocks. Emission from each turbulent cell forms a pulse of emission which, when convolved with the other pulses, yields the observed light curve. We use the model to obtain estimates of the physical parameters of the emission regions in the jet.
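The discrete-Fourier-transform period search can be sketched with a periodogram of an evenly sampled synthetic light curve; the injected period, sampling, and noise level are illustrative stand-ins for the campaign data:

```python
import numpy as np

# DFT periodogram of a synthetic, evenly sampled light curve, as a minimal
# stand-in for the campaign's period search.  All parameters are illustrative.
rng = np.random.default_rng(3)
n, dt = 1024, 0.1                 # samples, hours per sample
t = np.arange(n) * dt
period_true = 4.0                 # injected period, in hours
flux = 1.0 + 0.05 * np.sin(2 * np.pi * t / period_true) \
           + 0.02 * rng.standard_normal(n)

power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=dt)
peak_freq = freqs[np.argmax(power[1:]) + 1]     # skip the zero-frequency bin
peak_period = 1.0 / peak_freq
```

The recovered peak period matches the injected one to within the frequency resolution 1/(n*dt); on the real light curve, the absence of any such dominant peak is what motivates the turbulence/pulse-convolution model.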


Despite a growing recognition that the solutions to current environmental problems will be developed through collaborations between scientists and stakeholders, substantial challenges stifle such cooperation and slow the transfer of knowledge. Challenges occur at several levels, including the individual, disciplinary, and institutional. All of these have implications for scholars working at academic and research institutions. Fortunately, creative ideas and tested models exist that provide opportunities for conversation and serious consideration of how such institutions can facilitate the dialogue between scientists and society.


In the finance literature many economic theories and models have been proposed to explain and estimate the relationship between risk and return. Assuming risk averseness and rational behavior on the part of the investor, models are developed which are supposed to help in forming efficient portfolios that either maximize the expected rate of return for a given level of risk or minimize risk for a given rate of return. One of the most widely used models for forming these efficient portfolios is Sharpe's Capital Asset Pricing Model (CAPM). In the development of this model it is assumed that investors have homogeneous expectations about the future probability distribution of the rates of return; that is, every investor assumes the same values of the parameters of the probability distribution. Likewise, financial volatility homogeneity is commonly assumed, where volatility is taken as investment risk, usually measured by the variance of the rates of return. Typically the square root of the variance is used to define financial volatility, and it is often further assumed that the data-generating process is made up of independent and identically distributed random variables. This again implies that financial volatility is measured from homogeneous time series with stationary parameters.

In this dissertation, we investigate the assumption of homogeneity of market agents. We provide evidence of heterogeneity in market participants' information, objectives, and expectations about the parameters of the probability distribution of prices, as given by the differences in the empirical distributions corresponding to different time scales, which in this study are associated with different classes of investors. We also demonstrate that the statistical properties of the underlying data-generating processes, including the volatility in the rates of return, are quite heterogeneous. In other words, we provide empirical evidence against the traditional views about homogeneity using non-parametric wavelet analysis on trading data. The results show heterogeneity of financial volatility at different time scales, and time scale is one of the most important respects in which trading behavior differs. In fact, we conclude that heterogeneity as posited by the Heterogeneous Markets Hypothesis is the norm and not the exception.
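Scale-dependent volatility of the kind described can be sketched as a Haar wavelet variance: decompose a return series into dyadic scales and compare the detail-coefficient variance per scale. The two-component return series below is synthetic, not the dissertation's trading data:

```python
import numpy as np

def haar_wavelet_variance(returns, levels):
    """Variance of Haar detail coefficients at dyadic scales 2, 4, 8, ..."""
    a = np.asarray(returns, dtype=float)
    variances = []
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
        variances.append(float(np.var(d)))
    return variances

# Synthetic returns from two "classes of traders": fast white noise (for which
# the wavelet variance would be flat across scales) plus a component that
# persists for 16 ticks, a stand-in for longer-horizon investors.
rng = np.random.default_rng(4)
n = 4096
fast = 0.01 * rng.standard_normal(n)
slow = 0.02 * np.repeat(rng.standard_normal(n // 16), 16)
wv = haar_wavelet_variance(fast + slow, levels=6)
```

Under homogeneity (i.i.d. returns) the wavelet variance is flat across levels; the concentration of variance at the coarse scales here is the signature of heterogeneous trading horizons that the dissertation documents in real data.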


Research has found that children with autism spectrum disorders (ASD) show significant deficits in receptive language skills (Wiesmer, Lord, & Esler, 2010). One of the primary goals of applied behavior analytic intervention is to improve the communication skills of children with autism by teaching receptive discriminations. Both receptive discriminations and receptive language entail matching spoken words with corresponding objects, symbols (e.g., pictures or words), actions, people, and so on (Green, 2001). In order to develop receptive language skills, children with autism often undergo discrimination training within the context of discrete trial training. This training entails teaching the learner how to respond differentially to different stimuli (Green, 2001). It is through discrimination training that individuals with autism learn and develop language (Lovaas, 2003). The present study compares three procedures for teaching receptive discriminations: (1) simple/conditional (Procedure A), (2) conditional only (Procedure B), and (3) conditional discrimination of two target cards (Procedure C). Six children with an autism diagnosis, ranging in age from 2 to 5 years, were taught to receptively discriminate nine sets of stimuli. The results suggest that the extra training steps included in the simple/conditional and conditional-only procedures may not be necessary to teach children with autism to receptively discriminate. For all participants, Procedure C appeared to be the most efficient and effective procedure for teaching young children with autism receptive discriminations. Response maintenance and generalization probes conducted one month after the end of training indicate that, even though Procedure C resulted in fewer training sessions overall, no one procedure resulted in better maintenance and generalization than the others. In other words, more training sessions, as with the simple/conditional and conditional-only procedures, did not facilitate participants' ability to respond accurately or to generalize one month after training. The present study contributes to the literature on the most efficient and effective way to teach receptive discrimination during discrete trial training to children with ASD. These findings are critical, as research shows that receptive language skills are predictive of better outcomes and adaptive behaviors in the future.


Communication has become an essential function in our civilization. With the increasing demand for communication channels, it is now necessary to find ways to optimize the use of their bandwidth. One way to achieve this is by transforming the information before it is transmitted. This transformation can be performed by several techniques, one of the newest being the use of wavelets. Wavelet transformation refers to the act of breaking down a signal into components called details and trends by using small waveforms that have a zero average in the time domain. After this transformation the data can be compressed by discarding the details and transmitting only the trends. At the receiving end, the trends are used to reconstruct the image. In this work, the wavelet used for the transformation of an image is selected from a library of available bases. The accuracy of the reconstruction, after the details are discarded, depends on the wavelets chosen from the wavelet basis library. The system developed in this thesis takes a 2-D image and decomposes it using a wavelet bank. A digital signal processor is used to achieve near real-time performance in this transformation task. A contribution of this thesis project is the development of a DSP-based test bed for the future development of new real-time wavelet transformation algorithms.
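The discard-the-details scheme can be sketched with a one-level 2-D Haar transform (the simplest wavelet in such a basis library, used here for illustration): keep only the trend (LL) subband, then reconstruct by upsampling. The image is a small synthetic gradient:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar transform (averaging convention): trend + 3 details."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0    # vertical averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0    # vertical differences
    trend = (a[:, 0::2] + a[:, 1::2]) / 2.0    # 2x2 block averages (LL subband)
    det_h = (a[:, 0::2] - a[:, 1::2]) / 2.0    # horizontal detail
    det_v = (d[:, 0::2] + d[:, 1::2]) / 2.0    # vertical detail
    det_d = (d[:, 0::2] - d[:, 1::2]) / 2.0    # diagonal detail
    return trend, (det_h, det_v, det_d)

def reconstruct_from_trend(trend):
    """Discard all details: each trend coefficient becomes a 2x2 pixel block."""
    return np.kron(trend, np.ones((2, 2)))

# Smooth synthetic 8x8 "image" (a diagonal gradient), which the trend
# subband captures well.
img = np.add.outer(np.arange(8.0), np.arange(8.0))
trend, details = haar_dwt2(img)
approx = reconstruct_from_trend(trend)
max_err = float(np.abs(approx - img).max())
kept_fraction = trend.size / img.size          # 4:1 reduction from LL only
```

Transmitting only the trend gives a 4:1 reduction per level at the cost of a bounded reconstruction error; the choice of wavelet from the basis library governs how large that error is for a given image class.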


The purpose of this study was to correct some mistakes in the literature and to derive a necessary and sufficient condition for the mean residual life (MRL) to follow the roller-coaster pattern of the corresponding failure rate function. It was also desired to find the conditions under which the discrete failure rate function has an upside-down bathtub shape when the corresponding MRL function has a bathtub shape. The study showed that if the discrete MRL has a bathtub shape, then under some conditions the corresponding failure rate function has an upside-down bathtub shape. The study also corrected some mistakes in the proofs of Tang, Lu, and Chew (1999) and established a necessary and sufficient condition for the MRL to follow the roller-coaster pattern of the corresponding failure rate function. Similarly, some mistakes in Gupta and Gupta (2000) were corrected, with the ensuing results being expanded and proved thoroughly to establish the relationship between the crossing points of the failure rate and the associated MRL functions. The new results derived in this study will be useful for modeling various lifetime data that occur in environmental studies, medical research, electronics engineering, and many other areas of science and technology.
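The two functions related by these results can be computed directly from a lifetime pmf: the discrete failure rate h(k) = P(T = k) / P(T >= k) and the mean residual life m(k) = E[T - k | T >= k]. A minimal sketch; the pmf below is an illustrative example, not one from the study:

```python
# Discrete failure rate and mean residual life from an arbitrary lifetime pmf
# on {0, 1, ..., n-1}.  The example pmf is illustrative only.

def failure_rate_and_mrl(pmf):
    n = len(pmf)
    h, m = [], []
    for k in range(n):
        surv_ge = sum(pmf[k:])                      # P(T >= k)
        h.append(pmf[k] / surv_ge)                  # h(k) = P(T = k) / P(T >= k)
        residual = sum((t - k) * pmf[t] for t in range(k, n))
        m.append(residual / surv_ge)                # m(k) = E[T - k | T >= k]
    return h, m

pmf = [0.10, 0.25, 0.30, 0.20, 0.10, 0.05]          # lifetime on {0, ..., 5}
h, m = failure_rate_and_mrl(pmf)
```

Note that m(0) is the mean lifetime and h always ends at 1; plotting h and m from such a computation is the easiest way to see the bathtub / upside-down-bathtub correspondence and the crossing points that the study characterizes.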


The Tara Oceans Expedition (2009-2013) was a global survey of ocean ecosystems aboard the Sailing Vessel Tara. It carried out extensive measurements of environmental conditions and collected plankton (viruses, bacteria, protists, and metazoans) for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological, and functional diversity of plankton. The present data set includes properties of seawater, particulate matter, and dissolved matter that were measured from discrete water samples collected with Niskin bottles during the 2009-2013 expedition. Properties include pigment concentrations from HPLC analysis (10 depths per vertical profile, 25 pigments per depth); the carbonate system (surface and 400 m: pH (total scale), CO2, pCO2, fCO2, HCO3, CO3, total alkalinity, total carbon, OmegaAragonite, OmegaCalcite, and dosage flags); nutrients (10 depths per vertical profile: NO2, PO4, NO2/NO3, Si, quality flags); DOC; CDOM; and dissolved oxygen isotopes. The Service National d'Analyse des Paramètres Océaniques du CO2 at the Université Pierre et Marie Curie determined CT and AT potentiometrically. More than 200 vertical profiles of these properties were made across the world ocean. DOC, CDOM, and dissolved oxygen isotopes are available only for the Arctic Ocean and Arctic Seas (2013).