676 results for real-world


Relevance: 60.00%

Abstract:

Background: Microarray data is frequently used to characterize the expression profile of a whole genome and to compare the characteristics of that genome under several conditions. Geneset analysis methods have been described previously to analyze the expression values of several genes related by known biological criteria (metabolic pathway, pathology signature, co-regulation by a common factor, etc.) at the same time, and the cost of these methods allows for the use of more values to help discover the underlying biological mechanisms.

Results: As several methods assume different null hypotheses, we propose to reformulate the main question that biologists seek to answer. To determine which genesets are associated with expression values that differ between two experiments, we focused on three ad hoc criteria: expression levels, the direction of individual gene expression changes (up- or down-regulation), and correlations between genes. We introduce the FAERI methodology, tailored from a two-way ANOVA to examine these criteria. The significance of the results was evaluated according to the self-contained null hypothesis, using label sampling or by inferring the null distribution from normally distributed random data. Evaluations performed on simulated data revealed that FAERI outperforms currently available methods for each type of set tested. We then applied the FAERI method to analyze three real-world datasets on hypoxia response. FAERI was able to detect more genesets than other methodologies, and the genesets selected were coherent with current knowledge of the cellular response to hypoxia. Moreover, the genesets selected by FAERI were confirmed when the analysis was repeated on two additional related datasets.

Conclusions: The expression values of genesets are associated with several biological effects. The underlying mathematical structure of the genesets allows for analysis of data from several genes at the same time. Focusing on expression levels, the direction of the expression changes, and correlations, we showed that two-step data reduction allowed us to significantly improve the performance of geneset analysis using a modified two-way ANOVA procedure, and to detect genesets that current methods fail to detect.
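
As a rough illustration of the kind of procedure described above, the hedged sketch below scores a geneset (a genes-by-samples expression matrix observed under two conditions) with an F-like ratio and evaluates it against the self-contained null hypothesis by label sampling, i.e. by permuting the sample labels. The statistic, the function names (geneset_score, label_sampling_pvalue) and the synthetic data are illustrative assumptions; this is not the FAERI implementation.

```python
"""Illustrative sketch (not the FAERI implementation): score a geneset on a
genes x samples expression matrix under two conditions, then assess its
significance under the self-contained null hypothesis via label sampling."""
import numpy as np

def geneset_score(expr, labels):
    """F-like ratio of between-condition to within-condition variability,
    pooled over all genes in the set (a stand-in for FAERI's two-way ANOVA)."""
    groups = [expr[:, labels == g] for g in np.unique(labels)]
    grand = expr.mean(axis=1, keepdims=True)
    ss_between = sum(g.shape[1] * ((g.mean(axis=1, keepdims=True) - grand) ** 2).sum()
                     for g in groups)
    ss_within = sum(((g - g.mean(axis=1, keepdims=True)) ** 2).sum() for g in groups)
    return ss_between / max(ss_within, 1e-12)

def label_sampling_pvalue(expr, labels, n_perm=1000, seed=0):
    """Empirical p-value: fraction of label permutations scoring >= observed."""
    rng = np.random.default_rng(seed)
    observed = geneset_score(expr, labels)
    perm_scores = np.array([geneset_score(expr, rng.permutation(labels))
                            for _ in range(n_perm)])
    return (1 + np.sum(perm_scores >= observed)) / (n_perm + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    labels = np.array([0] * 6 + [1] * 6)      # two experimental conditions
    expr = rng.normal(size=(20, 12))          # synthetic 20-gene set, 12 samples
    expr[:, labels == 1] += 0.8               # shift expression in condition 1
    print("geneset p-value:", label_sampling_pvalue(expr, labels))
```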

Relevance: 60.00%

Abstract:

BACKGROUND/AIMS: Switzerland's drug policy model has always been unique and progressive, but there is a need to reassess this system in a rapidly changing world. The IMPROVE study was conducted to gain an understanding of the attitudes and beliefs towards opioid maintenance therapy (OMT) in Switzerland with regard to quality of and access to treatment. To obtain a "real-world" view on OMT, the study approached its goals from two different angles: from the perspectives of the OMT patients and of the physicians who treat patients with maintenance therapy. The IMPROVE study collected a large body of data on OMT in Switzerland. This paper presents a small subset of the dataset, focusing on the research design and methodology, the profile of the participants and the responses to several key questions addressed by the questionnaires.

METHODS: IMPROVE was an observational, questionnaire-based cross-sectional study on OMT conducted in Switzerland. Respondents consisted of OMT patients and treating physicians from various regions of the country. Data were collected using questionnaires in German and French. Physicians were interviewed by phone with a computer-based questionnaire. Patients self-completed a paper-based questionnaire at the physicians' offices or OMT treatment centres.

RESULTS: A total of 200 physicians and 207 patients participated in the study. Liquid methadone and methadone tablets or capsules were the medications most commonly prescribed by physicians (60% and 20% of patient load, respectively), whereas buprenorphine use was less frequent. Patients (88%) and physicians (83%) were generally satisfied with the OMT currently offered. The current political framework and a lack of training or information were cited as determining factors that deter physicians from engaging in OMT. About 31% of the OMT physicians interviewed were ≥60 years old, indicating an ageing physician population. Diversion and misuse were considered a significant problem in Switzerland by 45% of the physicians.

CONCLUSION: The subset of IMPROVE data presented gives a present-day, real-life overview of the OMT landscape in Switzerland. It represents a valuable resource for policy makers, key opinion leaders and drug addiction researchers and will be a useful basis for improving the current Swiss OMT model.

Relevance: 60.00%

Abstract:

The main objective of this project has been to delve into software construction, addressing all the stages of a software construction project from the perspective of software engineering (analysis, design, implementation and testing) and using the Object-Oriented programming paradigm through J2EE technology, together with software frameworks of great importance in the real world and, therefore, in today's software development and technology landscape.

Relevance: 60.00%

Abstract:

Tone Mapping is the problem of compressing the range of a High Dynamic Range image so that it can be displayed on a Low Dynamic Range screen without losing details or introducing novel ones: the final image should produce in the observer a sensation as close as possible to the perception produced by the real-world scene. We propose a tone mapping operator with two stages. The first stage is a global method that implements visual adaptation, based on experiments on human perception; in particular, we point out the importance of cone saturation. The second stage performs local contrast enhancement, based on a variational model inspired by color vision phenomenology. We evaluate this method with a metric validated by psychophysical experiments and, in terms of this metric, our method compares very well with the state of the art.
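
As a hedged illustration of the global visual-adaptation stage mentioned above, the sketch below applies a Naka-Rushton-style response curve, a standard model of photoreceptor adaptation in which the semi-saturation level tracks the scene's log-average luminance. It is not the operator proposed in the paper, and the second, variational contrast-enhancement stage is omitted; the exponent and the adaptation rule are assumptions.

```python
"""Minimal sketch of a *global* tone-mapping stage using a Naka-Rushton-style
response curve. This only illustrates the first (global) stage described
above; the local variational contrast-enhancement stage is omitted."""
import numpy as np

def global_tonemap(hdr, n=0.73):
    """Map HDR luminance to [0, 1] with R = L^n / (L^n + sigma^n),
    where sigma adapts to the scene's log-average luminance."""
    L = np.maximum(hdr, 1e-8)
    log_avg = np.exp(np.mean(np.log(L)))   # log-average ("key") luminance
    sigma = log_avg                        # semi-saturation tied to the adaptation level
    return L ** n / (L ** n + sigma ** n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hdr = np.exp(rng.normal(0.0, 2.5, size=(4, 4)))   # synthetic high-dynamic-range luminances
    print(global_tonemap(hdr).round(3))
```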

Relevance: 60.00%

Abstract:

Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, allows for reduced computational storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
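
The following sketch illustrates the kind of predictability test described above under simplified assumptions: a synthetic, one-dimensional descriptor time series is fitted with a least-squares autoregressive model, and its forecast error at several horizons is compared against a naive last-value baseline. The descriptor, model order, horizons and data are invented for illustration and do not reproduce the paper's setup.

```python
"""Sketch of a simple predictability test: fit an AR(p) model by least squares
to a one-dimensional "descriptor" series and compare its horizon-h forecast
error against repeating the last observed value."""
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) coefficients (lag 1 first) for series x."""
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def forecast(x, coef, h):
    """Iterate the AR model h steps ahead from the end of x."""
    p = len(coef)
    hist = list(x[-p:])
    for _ in range(h):
        hist.append(float(np.dot(coef, hist[::-1][:p])))
    return hist[-1]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # synthetic "descriptor" series with some temporal structure
    x = np.cumsum(rng.normal(size=2000)) * 0.05 + np.sin(np.arange(2000) * 0.05)
    p, split = 8, 1500
    coef = fit_ar(x[:split], p)
    for h in (1, 10, 50):
        idx = np.arange(split, len(x) - h, 25)
        ar_err = np.mean([abs(forecast(x[:i], coef, h) - x[i - 1 + h]) for i in idx])
        naive_err = np.mean([abs(x[i - 1] - x[i - 1 + h]) for i in idx])
        print(f"horizon {h:3d}: AR MAE={ar_err:.3f}  naive MAE={naive_err:.3f}")
```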

Relevance: 60.00%

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. The general motivation of my thesis is therefore the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems.

One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for the successful application of such schemes: relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and the electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is incorporated directly into the inverse problem.

Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
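
As a loose illustration of the deconvolution idea behind the source-wavelet estimation discussed above (not the thesis' iterative scheme, which alternates wavelet estimation with the full waveform inversion), the sketch below recovers a wavelet by stabilized frequency-domain division of observed traces by the responses predicted for the current model, averaged over traces. The damping, the synthetic spike responses and all names are assumptions introduced for illustration.

```python
"""Toy sketch of deconvolution-based source-wavelet estimation: if observed
traces are approximately the unknown wavelet convolved with the medium
response predicted by the current model, the wavelet can be estimated by
stabilized frequency-domain division, averaged over traces."""
import numpy as np

def estimate_wavelet(observed, predicted_response, eps=1e-3):
    """Wiener-style estimate: W(f) ~ sum_i D_i(f) G_i*(f) / (sum_i |G_i(f)|^2 + damping)."""
    D = np.fft.rfft(observed, axis=1)
    G = np.fft.rfft(predicted_response, axis=1)
    num = np.sum(D * np.conj(G), axis=0)
    den = np.sum(np.abs(G) ** 2, axis=0) + eps * np.max(np.abs(G) ** 2)
    return np.fft.irfft(num / den, n=observed.shape[1])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n_traces, nt = 16, 256
    t = np.arange(nt)
    true_wavelet = np.exp(-((t - 30.0) / 6.0) ** 2) * np.sin(0.4 * (t - 30.0))  # short oscillatory pulse
    responses = rng.normal(size=(n_traces, nt)) * (rng.random((n_traces, nt)) < 0.03)  # sparse spike "responses"
    # synthetic observed traces = response (*) wavelet, circular convolution for simplicity, plus noise
    observed = np.fft.irfft(np.fft.rfft(responses, axis=1) * np.fft.rfft(true_wavelet), n=nt, axis=1)
    observed += 0.01 * rng.normal(size=observed.shape)
    est = estimate_wavelet(observed, responses)
    print("correlation with true wavelet:",
          round(float(np.corrcoef(est, true_wavelet)[0, 1]), 3))
```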

Relevance: 60.00%

Abstract:

This paper examines the incentive of atomistic agricultural producers within a specific geographical region to differentiate and collectively market products. We develop a model that allows us to analyze the market and welfare effects of the main types of real-world producer organizations, using it to derive economic insights regarding the circumstances under which these organizations will evolve, and describing the implications of the results in the context of an ongoing debate between the European Union and the United States. As the anticipated fixed costs of development and marketing increase and the anticipated size of the market falls, it becomes essential to increase the ability of the producer organization to control supply in order to ensure the coverage of fixed costs. Whenever a collective organization allows a market (with a new product) to exist that otherwise would not have existed, there is an increase in societal welfare. Counterintuitively, stronger property right protection for producer organizations may be welfare enhancing even after a differentiated product has been developed. The reason for this somewhat paradoxical result is that legislation aimed at curtailing the market power of producer organizations may induce large technological distortions.
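
A small, illustrative break-even condition (not taken from the paper) makes the fixed-cost argument explicit: with per-unit margin p - c, anticipated market size Q and fixed development and marketing cost F, costs are covered only if the margin times the quantity sold at least equals F, so a larger F or a smaller Q forces a higher margin and hence tighter supply control.

```latex
% Illustrative break-even condition (an assumption for exposition, not the
% paper's model): the producer organization covers its fixed costs only if
\[
  (p - c)\,Q \;\ge\; F
  \quad\Longleftrightarrow\quad
  p \;\ge\; c + \frac{F}{Q}.
\]
% As F rises or the anticipated market size Q falls, the required markup
% p - c grows, which is sustainable only if the organization can restrict
% supply enough to support that price.
```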

Relevance: 60.00%

Abstract:

The past four decades have witnessed an explosive growth in the field of network-based facility location modeling. This is not at all surprising, since location policy is one of the most profitable areas of applied systems analysis in regional science, and ample theoretical and applied challenges are offered. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of facilities or services in discrete space or networks that are related to the public sector, such as emergency services (ambulances, fire stations, and police units), school systems and postal facilities. The paper is structured as follows: first, we will focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section will examine models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section will examine new trends in public sector facility location modeling.
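
For reference, the P-Median problem mentioned above is usually stated as the following integer program (the textbook formulation, not anything specific to this paper): choose p facility sites so as to minimize total demand-weighted distance.

```latex
% Standard P-Median formulation (textbook form):
\[
\begin{aligned}
  \min \;& \sum_{i \in I} \sum_{j \in J} h_i\, d_{ij}\, y_{ij} \\
  \text{s.t. } & \sum_{j \in J} y_{ij} = 1 \quad \forall i \in I
    && \text{(each demand node assigned to exactly one facility)} \\
  & y_{ij} \le x_j \quad \forall i \in I,\; j \in J
    && \text{(assign only to open facilities)} \\
  & \sum_{j \in J} x_j = p
    && \text{(exactly } p \text{ facilities opened)} \\
  & x_j,\, y_{ij} \in \{0,1\},
\end{aligned}
\]
% where h_i is the demand at node i, d_{ij} the distance from demand node i
% to candidate site j, x_j = 1 if a facility is opened at j, and y_{ij} = 1
% if demand node i is served by facility j.
```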

Relevance: 60.00%

Abstract:

This dissertation focuses on the strategies consumers use when making purchase decisions. It is organized in two main parts, one centering on descriptive and the other on applied decision making research. In the first part, a new process tracing tool called InterActive Process Tracing (IAPT) is presented, which I developed to investigate the nature of consumers' decision strategies. This tool is a combination of several process tracing techniques, namely Active Information Search, Mouselab, and retrospective verbal protocol. To validate IAPT, two experiments on mobile phone purchase decisions were conducted where participants first repeatedly chose a mobile phone and then were asked to formalize their decision strategy so that it could be used to make choices for them. The choices made by the identified strategies correctly predicted the observed choices in 73% (Experiment 1) and 67% (Experiment 2) of the cases. Moreover, in Experiment 2, Mouselab and eye tracking were directly compared with respect to their impact on information search and strategy description. Only minor differences were found between these two methods. I conclude that IAPT is a useful research tool to identify choice strategies, and that using eye tracking technology did not increase its validity beyond that gained with Mouselab.

In the second part, a prototype of a decision aid is introduced that was developed building in particular on the knowledge about consumers' decision strategies gained in Part I. This decision aid, which is called the InterActive Choice Aid (IACA), systematically assists consumers in their purchase decisions. To evaluate the prototype regarding its perceived utility, an experiment was conducted where IACA was compared to two other prototypes that were based on real-world consumer decision aids. All three prototypes differed in the number and type of tools they provided to facilitate the process of choosing, ranging from low (Amazon) to medium (Sunrise/dpreview) to high functionality (IACA). Overall, participants slightly preferred the prototype of medium functionality, and this prototype was also rated best on the dimensions of understandability and ease of use. IACA was rated best regarding the two dimensions of ease of elimination and ease of comparison of alternatives. Moreover, participants' choices were more in line with the normatively oriented weighted additive strategy when they used IACA than when they used the medium functionality prototype. The low functionality prototype was the least preferred overall. It is concluded that consumers can and will benefit from highly functional decision aids like IACA, but only when these systems are easy to understand and to use.
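
The weighted additive (WADD) strategy referred to above scores each alternative as the sum of its attribute values weighted by the decision maker's importance weights; the minimal sketch below shows that computation on invented mobile-phone data. The attributes, weights and values are hypothetical, and this is not the IAPT or IACA code.

```python
"""Minimal sketch of the normative weighted additive (WADD) decision strategy:
each alternative's score is the importance-weighted sum of its attribute values."""

def weighted_additive(alternatives, weights):
    """Return alternatives ranked by their weighted additive score."""
    scores = {
        name: sum(weights[attr] * value for attr, value in attrs.items())
        for name, attrs in alternatives.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # hypothetical mobile phones rated 0-10 on each attribute
    phones = {
        "Phone A": {"battery": 8, "camera": 6, "price": 4},
        "Phone B": {"battery": 5, "camera": 9, "price": 7},
        "Phone C": {"battery": 7, "camera": 7, "price": 6},
    }
    importance = {"battery": 0.5, "camera": 0.3, "price": 0.2}
    for name, score in weighted_additive(phones, importance):
        print(f"{name}: {score:.2f}")
```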

Relevance: 60.00%

Abstract:

We analyze the linkage between protectionism and invasive species (IS) hazard in the context of two-way trade and multilateral trade integration, two major features of real-world agricultural trade. Multilateral integration includes the joint reduction of tariffs and trade costs among trading partners. Multilateral trade integration is more likely to increase damages from IS than predicted by unilateral trade opening under the classic Heckscher-Ohlin-Samuelson (HOS) framework because domestic production (the base susceptible to damages) is likely to increase with expanding export markets. A country integrating its trade with a partner characterized by relatively higher tariff and trade costs is also more likely to experience increased IS damages via expanded domestic production for the same reason. We illustrate our analytical results with a stylized model of the world wheat market.

Relevance: 60.00%

Abstract:

Each winter, Iowa Department of Transportation (Iowa DOT) maintenance operators are responsible for plowing snow off federal and state roads in Iowa. Drivers typically work long shifts under treacherous conditions. In addition to properly navigating the vehicle, drivers are required to operate several plowing mechanisms simultaneously, such as plow controls and salt spreaders. There is little opportunity for practicing these skills in real-world situations. A virtual reality training program would provide operators with the opportunity to practice these skills under realistic yet safe conditions, as well as provide basic training to novice or less-experienced operators. In order to provide such training to snowplow operators in Iowa, the Iowa DOT purchased a snowplow simulator. The Iowa DOT commissioned a study through Iowa State University designed to (1) assess the use of this simulator as a training tool and (2) examine personality and other characteristics associated with being an experienced snowplow operator. The results of this study suggest that Iowa DOT operators of all ages and levels of experience enjoyed and seemed to benefit from virtual reality snowplow simulator training. Simulator sickness ratings were relatively low, implying that the simulator is appropriate for training a wide range of Iowa DOT operators. Many reported that simulator training was the most useful aspect of training for them.

Relevance: 60.00%

Abstract:

Forecasting real-world quantities on the basis of information from textual descriptions has recently attracted significant interest as a research problem, although previous studies have focused on applications involving only the English language. This document presents an experimental study on the subject of making predictions with textual contents written in Portuguese, using documents from three distinct domains. I specifically report on experiments using different types of regression models, using state-of-the-art feature weighting schemes, and using features derived from cluster-based word representations. Through controlled experiments, I have shown that prediction models using the textual information achieve better results than simple baselines such as taking the average value over the training data, and that richer document representations (i.e., using Brown clusters and the Delta-TF-IDF feature weighting scheme) result in slight performance improvements.
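
The comparison described above can be sketched, under simplified assumptions, as a regression on bag-of-words features versus a predict-the-training-mean baseline. In the sketch below, plain TF-IDF and ridge regression stand in for the Delta-TF-IDF weighting and the cluster-based representations used in the study, and the tiny Portuguese toy corpus is invented for illustration.

```python
"""Sketch of text-based numeric prediction: Ridge regression on TF-IDF
features versus the "average of the training data" baseline."""
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

# toy (document, target value) pairs, e.g. review texts paired with a rating
train_texts = ["pessimo produto nao recomendo", "produto razoavel pelo preco",
               "otimo produto recomendo muito", "excelente qualidade e entrega rapida"]
train_y = [1.0, 3.0, 4.5, 5.0]
test_texts = ["produto otimo recomendo", "nao recomendo qualidade pessima"]
test_y = [4.5, 1.0]

vec = TfidfVectorizer()
X_train, X_test = vec.fit_transform(train_texts), vec.transform(test_texts)

baseline_pred = np.full(len(test_y), np.mean(train_y))   # mean-of-training-data baseline
model = Ridge(alpha=1.0).fit(X_train, train_y)            # text-based regression model

print("mean baseline MAE: ", mean_absolute_error(test_y, baseline_pred))
print("ridge (TF-IDF) MAE:", mean_absolute_error(test_y, model.predict(X_test)))
```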

Relevance: 60.00%

Abstract:

This paper presents a stylized model of international trade and asset price bubbles. Its central insight is that bubbles tend to appear and expand in countries where productivity is low relative to the rest of the world. These bubbles absorb local savings, eliminating inefficient investments and liberating resources that are in part used to invest in high productivity countries. Through this channel, bubbles act as a substitute for international capital flows, improving the international allocation of investment and reducing rate-of-return differentials across countries. This view of asset price bubbles could eventually provide a simple account of some real-world phenomena that have been difficult to model before, such as the recurrence and depth of financial crises or their puzzling tendency to propagate across countries.

Relevance: 60.00%

Abstract:

This paper reviews the recent literature on monetary policy rules. We exposit the monetary policy design problem within a simple baseline theoretical framework. We then consider the implications of adding various real-world complications. Among other things, we show that the optimal policy implicitly incorporates inflation targeting. We also characterize the gains from making credible commitments to fight inflation. In contrast to conventional wisdom, we show that gains from commitment may emerge even if the central bank is not trying to inadvisedly push output above its natural level. We also consider the implications of frictions such as imperfect information.
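
For orientation only, a generic Taylor-type interest-rate rule of the kind this literature studies can be written as below; this is textbook shorthand for a "monetary policy rule" and is not the optimal rule derived in the paper.

```latex
% Generic Taylor-type interest-rate rule (textbook shorthand, shown only to
% fix notation for the kind of policy rule discussed above):
\[
  i_t \;=\; r^{*} + \pi^{*} + \phi_{\pi}\,(\pi_t - \pi^{*}) + \phi_{x}\, x_t ,
  \qquad \phi_{\pi} > 1,
\]
% where i_t is the nominal policy rate, r^* the equilibrium real rate,
% pi_t inflation, pi^* the inflation target, and x_t the output gap;
% phi_pi > 1 captures the lean-against-inflation property associated with
% inflation targeting.
```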