83 results for Inherent Audiences
Abstract:
Positioning a robot with respect to objects using data provided by a camera is a well-known technique called visual servoing. In order to perform a task, the object must exhibit visual features that can be extracted from different points of view. Visual servoing is therefore object-dependent, as it relies on the object's appearance, and the positioning task cannot be performed in the presence of non-textured objects or objects for which extracting visual features is too complex or too costly. This paper proposes a solution to this limitation inherent to current visual servoing techniques. Our proposal is based on the coded structured light approach as a reliable and fast way to solve the correspondence problem: a coded light pattern is projected onto the scene, providing robust visual features independently of the object's appearance.
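For orientation, the control law that image-based visual servoing schemes of this kind typically employ (a textbook relation, not a detail stated in the abstract) drives the camera velocity from the feature error:

```latex
% Classic image-based visual servoing law (standard in the literature,
% not specific to this paper). s are the current visual features (here,
% extracted from the projected coded pattern), s* the desired features,
% \widehat{L_e}^{+} the pseudo-inverse of an estimate of the interaction
% matrix, and \lambda > 0 a gain.
\[
  \mathbf{v}_c \;=\; -\lambda\, \widehat{\mathbf{L}_e}^{+}\, \mathbf{e},
  \qquad \mathbf{e} = \mathbf{s} - \mathbf{s}^{*}
\]
```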
Abstract:
Innovation is a research topic with a long tradition. However, the learning processes from which innovations emerge, and the dynamics of change and development, have traditionally been studied in relation to the manufacturing sector. Moreover, the objects of study have usually been process innovations and tangible product innovations. Although researchers have recently turned their attention to other sectors, more research on service innovation should be carried out. Furthermore, regarding innovation in tourism, there is a need to adapt generic theories to the tourism sector and to contribute new ideas. In order to find out what the origins of innovation processes are, it is necessary to look into two fundamental subjects inherent to innovation: learning and interaction. The two are closely related. The first appears to be an intrinsic condition of individuals, but it can also be identified in organizations; learning thus allows individuals as well as organizations to develop. However, learning and development are not possible without taking the environment into account. Hence, interactions must take place between individuals, groups of individuals, organizations, and so on. Furthermore, the concept of interaction implies the transfer of knowledge, which is the basis for innovation. The purposes of this master thesis are to study several of these topics in detail and to develop a conceptual framework for research on innovation in tourism.
Abstract:
This letter presents a comparison between three Fourier-based motion compensation (MoCo) algorithms for airborne synthetic aperture radar (SAR) systems. These algorithms circumvent the limitations of conventional MoCo, namely the assumption of a reference height and the beam-center approximation. All these approaches rely on the inherent time–frequency relation in SAR systems but exploit it differently, with consequent differences in accuracy and computational burden. After a brief overview of the three approaches, the performance of each algorithm is analyzed with respect to azimuthal topography accommodation, angle accommodation, and the maximum frequency of track deviations with which the algorithm can cope. An analysis of the computational complexity is also presented. Quantitative results are shown using real data acquired by the Experimental SAR system of the German Aerospace Center (DLR).
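The time–frequency relation the letter builds on takes, in its simplest form (broadside geometry, straight nominal track; symbols are the standard ones, not taken from the letter), the shape of a linear azimuth chirp:

```latex
% Approximate azimuth (Doppler) time-frequency relation in SAR:
% v is the platform velocity, \lambda the carrier wavelength, R_0 the
% closest-approach range and t the azimuth time relative to closest
% approach. MoCo algorithms exploit this mapping between azimuth time
% and instantaneous frequency in different ways.
\[
  f_a(t) \;\approx\; -\frac{2 v^2}{\lambda R_0}\, t
\]
```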
Abstract:
The integration of biocompatible materials into nanotechnology has allowed this area to find applications in the fields of biology and medicine, a fact that has given rise to nanobiotechnology. The vast majority of these applications rest on one fundamental aspect: the interaction between biological constituents (usually proteins) and biocompatible materials. Carbon nanotubes exhibit inherent cytotoxicity, whereas boron nitride nanotubes (BNNTs), isosteric with carbon nanotubes, are inherently non-cytotoxic and show a natural affinity for proteins. This dissertation presents results on the interaction of BNNTs with basic biomolecular constituents (molecules representing functional groups, and amino acids) in the absence of solvent, obtained through modelling techniques and quantum-chemical calculations with periodic treatment performed with the CRYSTAL09 code. First, GGA-based DFT methods were found to give excessively low band-gap values (2.7-4.6 eV) compared with the experimental value (5.5 eV), whereas the hybrid B3LYP functional gives good band-gap values, the most accurate being for a BNNT with index (9,0) (5.4 eV). The interaction of BNNTs with molecules was found to be driven by: i) dative interactions with B; ii) hydrogen bonding with N; iii) pi-stacking interactions. The first two interaction forces are favoured in small-radius BNNTs, which interact very favourably with polar molecules, whereas the third is favoured in large-radius BNNTs, which interact very favourably with aromatic systems or systems containing double bonds. The interaction of (6,0) BNNTs with molecules containing functional groups present in amino acid residues was studied and a relative affinity scale was established, which indicates that the strongest interactions come from molecules that form dative B(nanotube)-N(molecule) interactions, followed by molecules that can form pi-stacking interactions, and finally molecules that form dative B(nanotube)-O(molecule) interactions. Lastly, the interaction of BNNTs with different amino acids (glycine, lysine, glutamic acid and phenylalanine) was studied and a relative affinity scale established, which agrees with the trends observed for the molecules containing functional groups of amino acid residues.
Abstract:
In recent decades, a growing stock of literature has been devoted to the criticism of GDP as an indicator of societal wealth. A relevant question is: what are the prospects for building, on the existing knowledge and consensus, alternative measures of prosperity? A starting point may be to connect the well-being research agenda with the sustainability one. However, there is no doubt that a lot of complexity and fuzziness is inherent in multidimensional concepts such as sustainability and well-being. This article analyses the theoretical foundations and the empirical validity of some multidimensional technical tools that can be used for well-being evaluation and assessment. Of course, one should not forget that policy conclusions derived through any mathematical model also depend on the conceptual framework used, i.e. which representation of reality (and thus which societal values and interests) has been considered.
Abstract:
Background: The ultimate goal of synthetic biology is the conception and construction of genetic circuits that are reliable with respect to their designed function (e.g. oscillators, switches). This goal has yet to be attained, due to the inherent synergy of the biological building blocks and to insufficient feedback between experiments and mathematical models. Nevertheless, progress in these directions has been substantial. Results: It has been emphasized in the literature that the architecture of a genetic oscillator must include positive (activating) and negative (inhibiting) genetic interactions in order to yield robust oscillations. Our results point out that the oscillatory capacity is affected not only by the interaction polarity but also by how it is implemented at the promoter level. For a chosen oscillator architecture, we show by means of numerical simulations that the existence or lack of competition between activator and inhibitor at the promoter level affects the probability of producing oscillations and also leaves characteristic fingerprints on the associated period/amplitude features. Conclusions: In comparison with non-competitive binding at promoters, competition drastically reduces the region of parameter space characterized by oscillatory solutions. Moreover, while competition leads to pulse-like oscillations with long-tailed distributions in period and amplitude for various parameters or noisy conditions, the non-competitive scenario shows a characteristic frequency and confined amplitude values. Our study also situates the competition mechanism in the context of existing genetic oscillators, with emphasis on the Atkinson oscillator.
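As a rough illustration of the competitive versus non-competitive promoter logic compared here, a minimal sketch (illustrative equations and parameters, not the paper's actual model) could look as follows; whether a given parameter set oscillates, and with what period/amplitude statistics, is exactly what such simulations map out:

```python
# Sketch of an activator-inhibitor circuit where promoter occupancy is
# modelled with either competitive or non-competitive binding.
import numpy as np
from scipy.integrate import solve_ivp

def promoter_noncompetitive(a, r, Ka=1.0, Kr=1.0, n=2):
    # Activator and inhibitor bind independent operator sites:
    # activation and repression factors simply multiply.
    return (a**n / (Ka**n + a**n)) * (Kr**n / (Kr**n + r**n))

def promoter_competitive(a, r, Ka=1.0, Kr=1.0, n=2):
    # Activator and inhibitor compete for the same operator site:
    # a single saturating denominator couples the two species.
    an, rn = (a / Ka)**n, (r / Kr)**n
    return an / (1.0 + an + rn)

def rhs(t, y, promoter):
    a, r = y
    beta_a, beta_r = 50.0, 5.0    # maximal synthesis rates (assumed)
    delta_a, delta_r = 1.0, 0.1   # degradation rates (assumed)
    p = promoter(a, r)            # both genes driven by the same promoter logic
    return [beta_a * p - delta_a * a,
            beta_r * p - delta_r * r]

sol = solve_ivp(rhs, (0.0, 300.0), [0.5, 0.0],
                args=(promoter_competitive,), max_step=0.5)
```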
Abstract:
The space and time discretizations inherent to all FDTD schemes introduce non-physical dispersion errors, i.e. deviations of the speed of sound from the theoretical value predicted by the governing Euler differential equations. A general methodology for computing this dispersion error via straightforward numerical simulations of the FDTD schemes is presented. The method is shown to provide remarkable accuracies, of the order of 1/1000, in a wide variety of two-dimensional finite difference schemes.
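For reference, the analytical dispersion relation of the standard two-dimensional staggered leapfrog scheme, a textbook case that a general numerical methodology of this kind would reproduce, reads:

```latex
% Numerical dispersion relation of the standard 2-D staggered leapfrog
% (Yee-type) scheme, given here only as the textbook example; the
% letter's method instead measures dispersion of arbitrary FDTD schemes
% by direct simulation. c is the physical sound speed, \Delta t,
% \Delta x, \Delta y the steps, and (k_x, k_y) the wavevector.
\[
  \frac{1}{(c\,\Delta t)^2}\sin^2\!\Big(\frac{\omega \Delta t}{2}\Big)
  \;=\;
  \frac{1}{\Delta x^2}\sin^2\!\Big(\frac{k_x \Delta x}{2}\Big)
  + \frac{1}{\Delta y^2}\sin^2\!\Big(\frac{k_y \Delta y}{2}\Big)
\]
% The numerical phase speed follows as c_num = \omega(k)/|k|, and the
% dispersion error is the deviation c_num/c - 1.
```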
Abstract:
This communication presents the Aula de Tests project, developed to support the first-year physics and mathematics courses of the engineering degrees at the Escola Superior Politècnica of Universitat Pompeu Fabra. The project aims to design self-learning and continuous-assessment tools, accessible online through the Moodle environment, to support the students' learning process. The context of this experience is characterized by the inherent difficulty of engineering studies, by the fact that many students enter university with substantial gaps in their knowledge of these areas, and by the heterogeneity of their pre-university training. We describe the characteristics of the programmed activities and the context in which they were applied, and we present results on satisfaction, participation and grades that give lecturers useful information for adjusting the planning and activities of subsequent courses.
Abstract:
Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represents the positions of the cases or sample units; these are weighted and usually standardized in some way, unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is displayed in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
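The generic SVD construction that all such biplots share can be sketched as below (the paper's standard biplot refines the scaling step; data and the choice of alpha here are purely illustrative):

```python
# Generic SVD biplot: decompose the centred matrix and split the
# singular values between row and column coordinates.
import numpy as np

X = np.random.default_rng(0).normal(size=(20, 5))   # placeholder data
Xc = X - X.mean(axis=0)                             # centre columns

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

alpha = 1.0                 # 1.0: row-principal; 0.0: column-principal
F = U * s**alpha            # row (case) coordinates
G = Vt.T * s**(1 - alpha)   # column (variable) coordinates

# Rank-2 biplot: plot the first two columns of F as points and of G as
# vectors; the inner products F[:, :2] @ G[:, :2].T approximate Xc.
```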
Abstract:
I describe the customer valuations game, a simple intuitive game that can serve as a foundation for teaching revenue management. The game requires little or no preparation, props or software, takes around two hours (and hence can be finished in one session), and illustrates the formation of classical (airline and hotel) revenue management mechanisms such as advanced purchase discounts, booking limits and fixed multiple prices. I normally use the game as a base to introduce RM and to develop RM forecasting and optimization concepts off it. The game is particularly suited for non-technical audiences.
Abstract:
Although we found a general trend favouring the omnivorousness thesis, as soon as we adjusted for a set of structural factors and consumer tastes it became clear that this trend was driven by elitist inclusive omnivores who had increased the scope of their tastes. In general, younger cohorts were becoming less omnivorous; nevertheless, they were also becoming more educated and had access to higher levels of income, making the youth more omnivorous. As expected, upscale consumers set limits on their popular taste: musical genres whose audiences had educational levels below the mean profile were less preferred by upscale respondents. In spite of this, as time passed, some popular brows gained social status.
Abstract:
Correspondence analysis has found extensive use in ecology, archeology, linguistics and the social sciences as a method for visualizing the patterns of association in a table of frequencies or nonnegative ratio-scale data. Inherent to the method is the expression of the data in each row or each column relative to their respective totals, and it is these sets of relative values (called profiles) that are visualized. This relativization of the data makes perfect sense when the margins of the table represent samples from sub-populations of inherently different sizes. But in some ecological applications sampling is performed on equal areas or equal volumes, so that the absolute levels of the observed occurrences may be of relevance, in which case relativization may not be required. In this paper we define the correspondence analysis of the raw unrelativized data and discuss its properties, comparing this new method to regular correspondence analysis and to a related variant of non-symmetric correspondence analysis.
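A minimal sketch of the standard, relativized computation that the paper departs from (toy data and variable names are illustrative; the proposed variant would skip the division by the margins):

```python
# Standard correspondence analysis: profiles enter through the
# chi-square standardization before the SVD.
import numpy as np

N = np.array([[10., 4., 2.],
              [ 3., 9., 5.],
              [ 1., 2., 8.]])          # nonnegative frequency table
P = N / N.sum()                        # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)    # row and column masses

S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

rows = (U * sv) / np.sqrt(r)[:, None]  # principal row coordinates
```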
Abstract:
Recent research shows that financial reports are losing relevance. This is mainly due to the growing strategic importance of intangible assets in the performance of a company. A possible solution is to modify accounting standards so that statements include more self-generated intangible assets, taking into account their inherent risk and difficulty of valuation. We surveyed loan officers who were asked to assess the creditworthiness of a hypothetical company. The only information given was a simplified version of financial statements. Half the group got statements where research and development costs had been capitalized; the other half got statements in which these costs had been treated as an expense. The findings show that capitalization was significantly more likely to attract a positive response to a loan request. The paper raises the question of whether accounting for intangibles might provide managers with one more creative accounting technique and, in consequence, considers its ethical implications.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance, and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least lack consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
Abstract:
We construct a weighted Euclidean distance that approximates any distance or dissimilarity measure between individuals that is based on a rectangular cases-by-variables data matrix. In contrast to regular multidimensional scaling methods for dissimilarity data, the method leads to biplots of individuals and variables while preserving all the good properties of dimension-reduction methods that are based on the singular-value decomposition. The main benefits are the decomposition of variance into components along principal axes, which provide the numerical diagnostics known as contributions, and the estimation of nonnegative weights for each variable. The idea is inspired by the distance functions used in correspondence analysis and in principal component analysis of standardized data, where the normalizations inherent in the distances can be considered as differential weighting of the variables. In weighted Euclidean biplots we allow these weights to be unknown parameters, which are estimated from the data to maximize the fit to the chosen distances or dissimilarities. These weights are estimated using a majorization algorithm. Once this extra weight-estimation step is accomplished, the procedure follows the classical path in decomposing the matrix and displaying its rows and columns in biplots.
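A minimal sketch of the weighted Euclidean distance at the core of the method (weights fixed by hand here and data randomly generated; the paper estimates the weights from the target dissimilarities with a majorization algorithm, which is not reproduced):

```python
# Weighted Euclidean distances for a cases-by-variables matrix X and
# nonnegative variable weights w.
import numpy as np

def weighted_euclidean(X, w):
    # d_ij = sqrt( sum_k w_k * (x_ik - x_jk)^2 )
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff**2 * w).sum(axis=-1))

X = np.random.default_rng(1).normal(size=(6, 4))
w = np.array([0.5, 2.0, 1.0, 0.25])
D = weighted_euclidean(X, w)
# Once w is estimated, X scaled column-wise by sqrt(w) reproduces these
# distances under the plain Euclidean metric and feeds the usual
# SVD/biplot path.
```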