Resumo:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of the MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms, such as the pixel purity index (PPI) [35] and N-FINDR [40], still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
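The skewer-projection step of PPI described above can be sketched as follows. This is a minimal illustrative sketch in Python/NumPy, not the published implementation; the MNF preprocessing step is omitted, and the array shapes and score bookkeeping are assumptions:

```python
import numpy as np

def ppi(spectra, n_skewers=1000, seed=0):
    """Pixel Purity Index sketch: project every spectral vector onto
    random skewers and count how often each pixel is an extreme."""
    rng = np.random.default_rng(seed)
    n_pixels, n_bands = spectra.shape
    scores = np.zeros(n_pixels, dtype=int)
    for _ in range(n_skewers):
        skewer = rng.standard_normal(n_bands)   # random direction
        proj = spectra @ skewer                 # projection of all pixels
        scores[np.argmax(proj)] += 1            # extreme in one direction
        scores[np.argmin(proj)] += 1            # extreme in the other
    return scores  # highest-scoring pixels are the purest candidates
```

Because a linear function over a simplex attains its extremes at vertices, pixels that are strict convex combinations of the endmembers almost never register as extremes, so the counts concentrate on the pure pixels.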
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
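The iterative orthogonal-projection idea can be illustrated with a short sketch. This is a simplified stand-in for the published VCA algorithm, under stated assumptions: the random direction choice and the least-squares projection onto the span of the found endmembers are illustrative simplifications, not the authors' exact procedure, and the subspace-identification preprocessing is omitted:

```python
import numpy as np

def vca_like(spectra, p, seed=0):
    """Simplified VCA-style extraction: at each step, project the data
    onto a direction orthogonal to the endmembers found so far and keep
    the pixel whose projection is extreme."""
    rng = np.random.default_rng(seed)
    n_pixels, n_bands = spectra.shape
    indices = []
    E = np.zeros((n_bands, 0))  # endmember signatures found so far
    for _ in range(p):
        w = rng.standard_normal(n_bands)
        if E.shape[1] > 0:
            # remove the component of w lying in span(E)
            w = w - E @ np.linalg.lstsq(E, w, rcond=None)[0]
        f = w / np.linalg.norm(w)
        idx = int(np.argmax(np.abs(spectra @ f)))  # extreme of projection
        indices.append(idx)
        E = np.column_stack([E, spectra[idx]])
    return indices
```

Since interior points of the simplex project strictly inside the projected vertices, each iteration recovers a new vertex (a pure pixel) whenever one is present in the data, which is why VCA, like PPI and N-FINDR, relies on the pure-pixel assumption.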
Resumo:
Mobile devices are personal, non-transferable and increasingly widely used, which makes them a good tool for delivering a range of services in the hotel industry. Among the services requiring personal identification are the possibility for a guest to book a room or to use room service. Currently, accommodation venues commonly use a smart card that gives the guest access to some of the available services. The purpose of this document is to present an alternative to the card system, using mobile devices for that purpose. To guarantee security and a usage experience similar to the existing card system, NFC (Near Field Communication) technology was used; by supporting card-emulation mode, it eases the transition from the existing smart card system to the use of mobile devices for the same functions. More specifically, the use of smartphones in the door-opening process will be addressed. For a better understanding of the capabilities and limits of NFC technology, its use cases were studied. This document also presents the development process of a native application for the Android operating system, whose goal is to give the guest of an accommodation venue a new way of accessing the room using NFC technology. Beyond this functionality, the application also lets the user make reservations, check in and check out, among other features. Finally, conclusions and possible future work are presented.
Resumo:
In a highly competitive market, companies know that having quality products or providing good services is not enough to keep customers "faithful". Currently, quality of products/services, location and price are fundamental aspects customers expect from every purchase, so customers look for other ways to distinguish companies. This can happen either in a strictly materialistic way or through intangible benefits, such as having their opinion appreciated or being part of a select group of "premium" customers. Therefore, companies must find ways to value and reward their customers in order to keep them "faithful" to their products or services. Loyalty systems are one means to achieve this goal; however, due to their nature and how they are implemented, companies often end up with low acceptance, without achieving the intended objectives. In an era of technological revolution, where global average adoption of smartphones and tablets is 74% and 40%, respectively [Our Mobile Planet, 2014], the opportunity to reinvent loyalty systems reappears. Throughout this thesis a new tool, relying on the latest technologies and aiming to fulfill this market opportunity, will be presented. The main idea is to take long-standing loyalty concepts, such as stamp or points cards, and transform them into digital cards to be used in digital wallets, introducing an innovative technology component based on Apple's Passbook technology. The main goal is to create a platform for managing the cards' life cycle, allowing anyone to create, edit, distribute and analyze the data, and also to create a new communication channel with customers, improving the customer-supplier relationship and enhancing mobile marketing.
Resumo:
IEEE 802.11 is one of the most well-established and widely used standards for wireless LANs. Its Medium Access Control (MAC) layer assumes that devices adhere to the standard's rules and timers to ensure fair access to, and sharing of, the medium. However, the flexibility and configurability of wireless card drivers make it possible for selfish misbehaving nodes to take advantage of the other well-behaving nodes. The existence of selfish nodes degrades the QoS of the other devices in the network and may increase their energy consumption. In this paper we propose a green solution for selfish misbehavior detection in IEEE 802.11-based wireless networks. The proposed scheme works in two phases: a Global phase, which detects whether or not the network contains selfish nodes, and a Local phase, which identifies which node or nodes within the network are selfish. Usually, the network must be examined frequently for selfish nodes during its operation, since any node may act selfishly. Our solution is green in the sense that it saves network resources: it avoids wasting the nodes' energy by not examining every individual node for selfishness when it is not necessary. The proposed detection algorithm is evaluated using extensive OPNET simulations. The results show that the Global network metric clearly indicates the existence of a selfish node, while the Local node metric successfully identifies the selfish node(s). We also provide a mathematical analysis of selfish misbehavior and derive formulas for the successful channel access probability.
Resumo:
OBJECTIVES: To determine the frequency of radiological manifestations of chest tuberculosis among the tuberculosis outpatients at the Santa Casa de Misericórdia de São Paulo Hospital, and to correlate these radiological findings with the sputum bacilloscopy. SAMPLE AND METHODS: A review was made of the medical record cards and chest X-rays of all patients attended between January 1996 and December 1998. Patients with a diagnosis of tuberculosis who presented intrathoracic manifestations of the disease and negative anti-HIV serology were selected. RESULTS: The selection included 153 patients, with an average age of 37.5 years, who were predominantly male (60.8%) and white (56.9%). Pulmonary lesions were present in 121 (79.9%) and extrapulmonary lesions in 32 (20.1%). Parenchymal-infiltrate lesions appeared in 56 patients (36.6%), cavity lesions in 55 (36.0%), pleural effusion in 28 (18.3%), isolated nodules in 6 (3.9%), mediastinal enlargement in 4 (2.6%) and miliary pattern in 4 (2.6%). Cavities were present in 45.5% of the patients with pulmonary lesions, generally in association with the parenchymal-infiltrate lesions. Parenchymal infiltrate was present in 86.8% of the patients with pulmonary lesions. There was significant presence of alcohol-acid resistant bacillus in the sputum of patients with cavities (76.4%), in comparison with those without cavities (50%) (p = 0.003). CONCLUSIONS: Parenchymal-infiltrate lesions are the most frequent radiological manifestation of pulmonary tuberculosis, and they are generally associated with cavities. There is a relationship between the presence of acid fast bacilli in sputum and pulmonary cavity lesions.
Resumo:
The development of the internet, the proliferation of Web 2.0 tools, and of technology in general, are leveraging new ways for people to communicate, collaborate, and interact. This new world and these new markets, changing daily, are enabling the emergence of new innovative enterprises and services that take advantage of the new technologies and of the global network. Cardmobili is a Portuguese start-up company working in the area of mobile services. The company provides a mobile service to manage rewards and membership cards, enabling users to store them in the cloud while using mobile applications to present them in store, collect and use rewards, and share cards and information with other users and friends on social networks. Cardmobili is linked to merchants' loyalty management systems, enabling users to access exclusive offers delivered to their mobile application and web account. The company provides complete services to make any loyalty or membership program mobile: branding, new customer registration, integration of customer account balances, mobile vouchers, coupons and offers, and mobile communication.
Resumo:
Parasitic infection is one of the problems that affect human health, especially in developing countries. In this study, all of the fast food shops, restaurants, and roast meat outlets of Khorramabad (Western Iran) and all the staff employed by them, some 210 people, were selected through a census, and their stools were examined for the presence of parasites. The parasitological tests of direct wet-mount, Lugol's iodine staining, formaldehyde-ether sedimentation and trichrome staining were performed on the samples. The data were analyzed with a chi-square test, and logistic regression was selected as the analytical model. The results showed that 19 (9%) stool specimens were positive for different intestinal parasites: Giardia lamblia 2.9%, Entamoeba coli 4.3%, Blastocystis sp. 1.4%, and Hymenolepis nana 0.5%. There was a significant association between intestinal parasite infection and the presence of a valid health card, awareness of intestinal parasite transmission, and participation in environmental health training courses (p < 0.05). No statistically significant difference was found in literacy rate or gender among patients infected with intestinal parasites (p > 0.05). To control parasitic infection in food handlers, several strategies are recommended, such as stool examinations every three months, public education, application of health regulations, controlling the validity of health cards, and training on parasitic infection transmission. In this regard, the findings of the present study can be used as a basis to develop preventive programs targeting food handlers, because the spread of disease via food handlers is a common problem worldwide.
Resumo:
We investigate the determinants of giving in a lab-in-the-field experiment with large stakes. Study participants in urban Mozambique play dictator games where their counterpart is the closest person to them outside their household. Dictators share more with counterparts when they have the option of giving in kind (in the form of goods), compared to giving that must be in cash. Qualitative post-experiment responses suggest that this effect is driven by a desire to control how recipients use gifted resources. Standard economic determinants such as the rate of return to giving and the size of the endowment also affect giving, but the effects of even large changes in these determinants are significantly smaller than the effect of the in-kind option. Our results support theories of giving where the utility of givers depends on the composition (not just the level) of gift-recipient expenditures, and givers thus seek control over transferred resources.
Resumo:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Resumo:
Master's dissertation presented to ISPA - Instituto Universitário
Resumo:
Following the Introduction, which surveys existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures and find new solutions at the economic regulation level that promote social welfare. In Chapter 1 we analyze a regulatory issue concerning access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare as compared to fixed access pricing or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks.
Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare via regulation of the volume of advertising on TV may need to also regulate advertising quality or, if regulating quality proves impractical, take the effect of advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchant transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.
Resumo:
Taking performance studies as a starting point for the analysis of terrorism, this dissertation resulted in the possibility of reflecting on tactics of embodiment, reperformance and meta-theatre, three concepts that make it possible to understand how art assimilates, and understands itself in relation to, terrorism. On the one hand, it presents official documents that demonstrate the existence of a conflict over the definition of terrorism, reflecting on "state" and "counter-state" terrorism. On the other hand, drawing on the analysis of the Surveillance Camera Players and the performance Three Posters, and of artists such as Hasan Elahy and Alyson Wyper, this dissertation argues that art reperforms the "tactics of representation" and media staging of terrorism, namely panoptic theatre, torture as performance, and martyrs' video testimonies as portraits and video performances.
Resumo:
Master's internship report in Music Teaching
Resumo:
Accessible version at http://ace2015.info/wp-content/uploads/2015/11/ACE_2015_submission_148.pdf
Resumo:
OBJECTIVE: To compare the pattern of exploratory eye movements during visual scanning of Rorschach and TAT test cards in people with schizophrenia and controls. METHOD: 10 participants with schizophrenia and 10 controls matched by age, schooling and intellectual level participated in the study. Severity of symptoms was evaluated with the Positive and Negative Syndrome Scale. Test cards were divided into three groups: TAT cards with scene content, TAT cards with interaction content (TAT-faces), and Rorschach cards with abstract images. Eye movements were analyzed for total number, duration and location of fixations, and length of saccadic movements. RESULTS: A different pattern of eye movement was found, with schizophrenia participants showing a lower number of fixations but longer fixation durations on Rorschach and TAT-faces cards. The largest difference was observed for Rorschach cards, followed by TAT-faces and TAT-scene cards. CONCLUSIONS: The results suggest an alteration in visual exploration mechanisms, possibly related to the integration of abstract visual information.