675 results for "real world mathematics"
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way, using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet, and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e., different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, requires less storage, and still maintains highly competitive accuracies when compared to state-of-the-art systems.
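To make the kind of experiment this abstract describes concrete, the following is a minimal sketch of measuring descriptor predictability with an autoregressive model against a naive baseline. The synthetic series, the AR order, and the use of statsmodels' AutoReg are assumptions for illustration, not the descriptors or models used in the paper.

```python
# Minimal sketch: quantify the predictability of a music-descriptor
# time series with an autoregressive model and compare against a
# naive mean baseline. The series here is synthetic; the paper works
# with descriptors extracted from real audio recordings.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
t = np.arange(2000)
# Synthetic "descriptor": a slow oscillation plus noise.
series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(t.size)

train, test = series[:1500], series[1500:]
fit = AutoReg(train, lags=20).fit()
pred = fit.predict(start=len(train), end=len(series) - 1)

mse_model = np.mean((test - pred) ** 2)
mse_naive = np.mean((test - train.mean()) ** 2)
print(f"AR(20) MSE: {mse_model:.3f}  naive MSE: {mse_naive:.3f}")
```

A model error below the baseline at long forecast horizons is the kind of mid- and long-term predictability the abstract reports.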
Abstract:
For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for the successful application of such schemes; relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data.
Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is directly incorporated into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
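As an illustration of the deconvolution-based source-wavelet estimation discussed above, here is a minimal numpy sketch of one least-squares wavelet update in the frequency domain. The water-level stabilization and the placeholder inputs are assumptions made for the example, not details taken from the thesis.

```python
# Minimal sketch of a least-squares source-wavelet update: given
# observed traces d_i(t) and forward-modelled impulse responses
# g_i(t) from the current subsurface model, estimate the wavelet by
# stabilized deconvolution in the frequency domain.
import numpy as np

def estimate_wavelet(observed, simulated, water_level=1e-3):
    """observed, simulated: arrays of shape (n_traces, n_samples)."""
    D = np.fft.rfft(observed, axis=1)
    G = np.fft.rfft(simulated, axis=1)
    num = np.sum(np.conj(G) * D, axis=0)            # stack over traces
    den = np.sum(np.abs(G) ** 2, axis=0)
    den = np.maximum(den, water_level * den.max())  # water-level stabilizer
    return np.fft.irfft(num / den, n=observed.shape[1])

# Placeholder usage: in practice, 'simulated' would come from FDTD
# forward modelling with a unit source, and the update would be
# repeated as the inversion iterates.
rng = np.random.default_rng(1)
d = rng.standard_normal((8, 256))
g = rng.standard_normal((8, 256))
print(estimate_wavelet(d, g).shape)  # (256,)
```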
Abstract:
This paper examines the incentive of atomistic agricultural producers within a specific geographical region to differentiate and collectively market products. We develop a model that allows us to analyze the market and welfare effects of the main types of real-world producer organizations, using it to derive economic insights regarding the circumstances under which these organizations will evolve, and describing the implications of the results in the context of an ongoing debate between the European Union and the United States. As the anticipated fixed costs of development and marketing increase and the anticipated size of the market falls, it becomes essential to increase the ability of the producer organization to control supply in order to ensure the coverage of fixed costs. Whenever a collective organization allows a market (with a new product) to exist that otherwise would not have existed, there is an increase in societal welfare. Counterintuitively, stronger property-right protection for producer organizations may be welfare enhancing even after a differentiated product has been developed. The reason for this somewhat paradoxical result is that legislation aimed at curtailing the market power of producer organizations may induce large technological distortions.
Abstract:
The past four decades have witnessed an explosive growth in the field of network-based facility location modeling. This is not at all surprising, since location policy is one of the most profitable areas of applied systems analysis in regional science, and ample theoretical and applied challenges are offered. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives, generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of facilities or services in discrete space or networks that are related to the public sector, such as emergency services (ambulances, fire stations, and police units), school systems, and postal facilities. The paper is structured as follows: first, we focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section examines models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section examines new trends in public sector facility location modeling.
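For readers unfamiliar with the P-Median problem mentioned here: it selects p facility sites so as to minimize the total demand-weighted distance from each demand node to its nearest open facility. Below is a minimal greedy sketch of that objective; the greedy heuristic and the toy distances are illustrative only, not an algorithm from the paper.

```python
# Minimal greedy sketch of the P-Median problem: choose p facility
# sites from candidate nodes to minimize total demand-weighted
# distance from each demand node to its nearest open facility.
def greedy_p_median(dist, demand, p):
    """dist[i][j]: distance from demand node i to candidate site j."""
    n_demand, n_sites = len(dist), len(dist[0])
    chosen = []
    for _ in range(p):
        best_site, best_cost = None, float("inf")
        for j in range(n_sites):
            if j in chosen:
                continue
            cost = sum(
                demand[i] * min(dist[i][k] for k in chosen + [j])
                for i in range(n_demand)
            )
            if cost < best_cost:
                best_site, best_cost = j, cost
        chosen.append(best_site)
    return chosen, best_cost

# Example: 4 demand nodes, 3 candidate sites, open p=2 facilities.
dist = [[0, 5, 9], [5, 0, 4], [9, 4, 0], [3, 6, 7]]
sites, cost = greedy_p_median(dist, demand=[1, 1, 1, 1], p=2)
print(sites, cost)
```

Exact formulations solve the same objective as an integer program; the greedy version above only approximates it but shows the structure of the decision.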
Abstract:
This dissertation focuses on the strategies consumers use when making purchase decisions. It is organized in two main parts, one centering on descriptive and the other on applied decision making research. In the first part, a new process tracing tool called InterActive Process Tracing (IAPT) is presented, which I developed to investigate the nature of consumers' decision strategies. This tool is a combination of several process tracing techniques, namely Active Information Search, Mouselab, and retrospective verbal protocol. To validate IAPT, two experiments on mobile phone purchase decisions were conducted in which participants first repeatedly chose a mobile phone and then were asked to formalize their decision strategy so that it could be used to make choices for them. The choices made by the identified strategies correctly predicted the observed choices in 73% (Experiment 1) and 67% (Experiment 2) of the cases. Moreover, in Experiment 2, Mouselab and eye tracking were directly compared with respect to their impact on information search and strategy description. Only minor differences were found between these two methods. I conclude that IAPT is a useful research tool for identifying choice strategies, and that using eye tracking technology did not increase its validity beyond that gained with Mouselab. In the second part, a prototype of a decision aid is introduced that was developed building in particular on the knowledge about consumers' decision strategies gained in Part I. This decision aid, called the InterActive Choice Aid (IACA), systematically assists consumers in their purchase decisions. To evaluate the prototype regarding its perceived utility, an experiment was conducted in which IACA was compared to two other prototypes that were based on real-world consumer decision aids. All three prototypes differed in the number and type of tools they provided to facilitate the process of choosing, ranging from low (Amazon) to medium (Sunrise/dpreview) to high functionality (IACA). Overall, participants slightly preferred the prototype of medium functionality, and this prototype was also rated best on the dimensions of understandability and ease of use. IACA was rated best regarding the two dimensions of ease of elimination and ease of comparison of alternatives. Moreover, participants' choices were more in line with the normatively oriented weighted additive strategy when they used IACA than when they used the medium-functionality prototype. The low-functionality prototype was the least preferred overall. It is concluded that consumers can and will benefit from highly functional decision aids like IACA, but only when these systems are easy to understand and to use.
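The weighted additive strategy referenced in this abstract scores each alternative by an importance-weighted sum of its attribute values and picks the highest score. A minimal sketch follows; the attributes and weights are invented for illustration and are not data from the experiments.

```python
# Minimal sketch of the normative weighted additive (WADD) strategy:
# score each alternative by the importance-weighted sum of its
# attribute values and choose the highest-scoring one.
weights = {"battery": 0.5, "camera": 0.3, "price": 0.2}

phones = {
    "A": {"battery": 0.9, "camera": 0.4, "price": 0.7},
    "B": {"battery": 0.6, "camera": 0.8, "price": 0.5},
}

def wadd_score(attrs):
    return sum(weights[a] * v for a, v in attrs.items())

best = max(phones, key=lambda name: wadd_score(phones[name]))
print(best, {p: round(wadd_score(a), 2) for p, a in phones.items()})
```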
Abstract:
We analyze the linkage between protectionism and invasive species (IS) hazard in the context of two-way trade and multilateral trade integration, two major features of real-world agricultural trade. Multilateral integration includes the joint reduction of tariffs and trade costs among trading partners. Multilateral trade integration is more likely to increase damages from IS than predicted by unilateral trade opening under the classic Heckscher-Ohlin-Samuelson (HOS) framework because domestic production (the base susceptible to damages) is likely to increase with expanding export markets. A country integrating its trade with a partner characterized by relatively higher tariff and trade costs is also more likely to experience increased IS damages via expanded domestic production for the same reason. We illustrate our analytical results with a stylized model of the world wheat market.
Abstract:
Each winter, Iowa Department of Transportation (Iowa DOT) maintenance operators are responsible for plowing snow off federal and state roads in Iowa. Drivers typically work long shifts under treacherous conditions. In addition to properly navigating the vehicle, drivers are required to operate several plowing mechanisms simultaneously, such as plow controls and salt spreaders. There is little opportunity for practicing these skills in real-world situations. A virtual reality training program would provide operators with the opportunity to practice these skills under realistic yet safe conditions, as well as provide basic training to novice or less-experienced operators. In order to provide such training to snowplow operators in Iowa, the Iowa DOT purchased a snowplow simulator. The Iowa DOT commissioned a study through Iowa State University designed to (1) assess the use of this simulator as a training tool and (2) examine personality and other characteristics associated with being an experienced snowplow operator. The results of this study suggest that Iowa DOT operators of all ages and levels of experience enjoyed and seemed to benefit from virtual reality snowplow simulator training. Simulator sickness ratings were relatively low, implying that the simulator is appropriate for training a wide range of Iowa DOT operators. Many reported that simulator training was the most useful aspect of training for them.
Abstract:
Forecasting real-world quantities based on information from textual descriptions has recently attracted significant interest as a research problem, although previous studies have focused on applications involving only the English language. This document presents an experimental study on the subject of making predictions with textual contents written in Portuguese, using documents from three distinct domains. I specifically report on experiments with different types of regression models, state-of-the-art feature weighting schemes, and features derived from cluster-based word representations. Through controlled experiments, I show that prediction models using the textual information achieve better results than simple baselines such as taking the average value over the training data, and that richer document representations (i.e., using Brown clusters and the Delta-TF-IDF feature weighting scheme) result in slight performance improvements.
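A minimal sketch of the comparison this abstract describes, i.e., a text-based regression model against the training-mean baseline, is shown below. Plain TF-IDF weighting stands in for the Delta-TF-IDF scheme, and the tiny corpus is a placeholder, not the study's data.

```python
# Minimal sketch: predict a numeric quantity from text and compare
# against the training-mean baseline the abstract mentions. Plain
# TF-IDF stands in for the Delta-TF-IDF scheme used in the study.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

train_texts = ["otimo produto, recomendo", "pessimo, nao funciona",
               "razoavel pelo preco", "excelente qualidade"]
train_y = np.array([5.0, 1.0, 3.0, 5.0])     # e.g., review ratings
test_texts = ["produto excelente", "nao recomendo"]

vec = TfidfVectorizer()
model = Ridge().fit(vec.fit_transform(train_texts), train_y)
pred = model.predict(vec.transform(test_texts))

baseline = np.full(len(test_texts), train_y.mean())
print("model:", pred, "baseline:", baseline)
```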
Abstract:
This paper presents a stylized model of international trade and asset price bubbles. Its central insight is that bubbles tend to appear and expand in countries where productivity is low relative to the rest of the world. These bubbles absorb local savings, eliminating inefficient investments and liberating resources that are in part used to invest in high-productivity countries. Through this channel, bubbles act as a substitute for international capital flows, improving the international allocation of investment and reducing rate-of-return differentials across countries. This view of asset price bubbles could eventually provide a simple account of some real-world phenomena that have been difficult to model before, such as the recurrence and depth of financial crises or their puzzling tendency to propagate across countries.
Abstract:
This paper reviews the recent literature on monetary policy rules. We exposit the monetary policy design problem within a simple baseline theoretical framework. We then consider the implications of adding various real-world complications. Among other things, we show that the optimal policy implicitly incorporates inflation targeting. We also characterize the gains from making credible commitments to fight inflation. In contrast to conventional wisdom, we show that gains from commitment may emerge even if the central bank is not trying to inadvisedly push output above its natural level. We also consider the implications of frictions such as imperfect information.
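The simple policy rules this literature studies are commonly illustrated by a Taylor-type interest-rate rule, in which the nominal rate responds more than one-for-one to inflation and also to the output gap. A minimal sketch with standard textbook coefficients follows; the coefficients are illustrative, not estimates from the paper.

```python
# Minimal sketch of a Taylor-type policy rule, the canonical example
# of the simple monetary policy rules this literature studies.
def taylor_rate(inflation, output_gap,
                r_star=2.0, pi_star=2.0,
                phi_pi=0.5, phi_y=0.5):
    """All quantities in percent; textbook coefficients, not the
    paper's estimates."""
    return (r_star + inflation
            + phi_pi * (inflation - pi_star)
            + phi_y * output_gap)

print(taylor_rate(inflation=3.0, output_gap=1.0))  # -> 6.0
```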
Abstract:
Many classifiers achieve high levels of accuracy but have limited applicability in real-world situations because they do not lead to a greater understanding of, or insight into, the way features influence the classification. In areas such as health informatics, a classifier that clearly identifies the influences on classification can be used to direct research and formulate interventions. This research investigates the practical applications of Automated Weighted Sum (AWSum), a classifier that provides accuracy comparable to other techniques whilst providing insight into the data. This is achieved by calculating a weight for each feature value that represents its influence on the class value. The merits of this approach in classification and insight are evaluated on cystic fibrosis and diabetes datasets, with positive results.
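To illustrate the general idea behind per-feature-value weights (this is not the published AWSum formula), the sketch below scores each feature value by the conditional frequency of the positive class given that value; classification would then sum such weights across an instance's feature values.

```python
# Illustrative sketch of the idea behind per-feature-value weights:
# each (feature, value) pair gets a weight reflecting its influence
# on the class, here the conditional frequency of the positive class
# given that value. NOT the published AWSum formula.
from collections import defaultdict

def feature_value_weights(rows, labels):
    """rows: list of dicts feature->value; labels: 0/1 class labels."""
    counts = defaultdict(lambda: [0, 0])   # (feature, value) -> [neg, pos]
    for row, y in zip(rows, labels):
        for fv in row.items():
            counts[fv][y] += 1
    return {fv: pos / (neg + pos) for fv, (neg, pos) in counts.items()}

rows = [{"smoker": "yes"}, {"smoker": "no"}, {"smoker": "yes"}]
print(feature_value_weights(rows, labels=[1, 0, 1]))
```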
Abstract:
Being able to measure and record different types of magnitudes, such as pressure, force, and temperature, has become a necessity for many current applications. These magnitudes can have very diverse origins, such as the environment, or can be generated by mechanical or electrical systems. Data acquisition systems are used to acquire these magnitudes: they take analog samples from the real world and transform them into digital data that can be manipulated by an electronic system. Practically any magnitude can be measured with the appropriate sensor. One magnitude widely used in data acquisition systems is temperature. Temperature acquisition systems are very common; they appear as standalone systems whose purpose is to display the acquired data, or as part of control systems, providing inputs needed for correct operation, stability, safety, and so on. This project, promoted by the company Elausa, acquires the input signal of two thermocouples. These measure the temperatures of electronic circuits placed inside Elausa's climate chamber and subjected to different temperature conditions in order to obtain homologation of the circuit. The system must display the acquired data in real time and store it on a PC located in an office about 30 m from the test room. The system consists of an electronic circuit that acquires and conditions the thermocouple output signal, adapting it to the input voltage of the analog-to-digital converter of the microcontroller integrated on the board. This information is then sent through a radio-frequency transmitter module to the PC, where the acquired data is displayed. The objectives were as follows: design the electronic acquisition and signal-conditioning circuit; design, fabricate, and assemble the printed circuit board of the acquisition board; write the microcontroller control program; write the program to display and store the data on a PC; acquire two temperatures through thermocouples with an input range of -40°C to +240°C; and transmit the data via RF. The project results were satisfactory and the stated objectives were met.
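As a minimal sketch of the PC-side conversion such a system needs, the following maps a raw ADC count back to a temperature in the stated -40°C to +240°C range. The 10-bit resolution and the linear signal-conditioning stage are assumptions; a real thermocouple chain would also need cold-junction compensation and linearization.

```python
# Minimal sketch: map a raw ADC count back to a temperature, assuming
# a 10-bit ADC and a linear conditioning stage spanning the stated
# -40..+240 degC input range. The real board's scaling (and any
# thermocouple linearization) would replace these assumptions.
ADC_BITS = 10
T_MIN, T_MAX = -40.0, 240.0

def adc_to_celsius(raw):
    full_scale = (1 << ADC_BITS) - 1   # 1023 for a 10-bit ADC
    return T_MIN + (raw / full_scale) * (T_MAX - T_MIN)

print(adc_to_celsius(0))     # -> -40.0
print(adc_to_celsius(1023))  # -> 240.0
```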
Abstract:
The competitiveness of businesses is increasingly dependent on their electronic networks with customers, suppliers, and partners. While the strategic and operational impact of external integration and IOS adoption has been extensively studied, much less attention has been paid to the organizational and technical design of electronic relationships. The objective of our longitudinal research project is the development of a framework for understanding and explaining B2B integration. Drawing on existing literature and empirical cases, we present a reference model (a classification scheme for B2B integration). The reference model comprises technical, organizational, and institutional levels to reflect the multiple facets of B2B integration. In this paper we investigate the current state of electronic collaboration in global supply chains, focusing on the technical view. Using an in-depth case analysis we identify five integration scenarios. In the subsequent confirmatory phase of the research we analyse 112 real-world company cases to validate these five integration scenarios. Our research advances and deepens existing studies by developing a B2B reference model which reflects the current state of practice and is independent of specific implementation technologies. In the next stage of the research the emerging reference model will be extended to create an assessment model for analysing the maturity level of a given company in a specific supply chain.
Abstract:
BACKGROUND: The long-term incidence of stent thrombosis (ST) and complications after sirolimus-eluting stent (SES) implantation is still a matter of debate. METHOD: We conducted a systematic follow-up, on the day of their 5-year SES implantation anniversary, of a series of consecutive real-world patients treated with a SES. The use of SES implantation was not restricted to "on-label" indications, and target lesions included in-stent restenosis, vein grafts, left main stem locations, bifurcations, and long lesions. The Academic Research Consortium criteria were used for ST classification. RESULTS: Three hundred fifty consecutive patients were treated with SES between April and December 2002 in 3 Swiss hospitals. Mean age was 63 ± 6 years, 78% were men, 20% presented with acute coronary syndrome, and 19% were patients with diabetes. Five-year follow-up was obtained in 98% of eligible patients. Stent thrombosis had occurred in 12 patients (3.6%): definite in 6 (1.8%), probable in 1 (0.3%), and possible in 5 (1.5%). Eighty-one percent of the population was free of complications. Major adverse cardiac events occurred in 74 (21%) patients and were as follows: cardiac death 3%, noncardiac death 4%, myocardial infarction 2%, target lesion revascularization 8%, non-target lesion target vessel revascularization 3%, coronary artery bypass graft 2%. Non-TVR was performed in 8%. CONCLUSION: Our data confirm the good long-term outcome of patients treated with SES. The incidence of complications and subacute thrombosis at 5 years in routine clinical practice reproduces the results of prospective randomized trials.