538 results for Revisit


Relevance:

10.00%

Publisher:

Abstract:

We revisit the visibility problem, traditionally defined in the computer graphics and vision fields as the process of computing the (potentially) visible set of primitives in the computational model of a scene. We propose a hybrid solution that uses a lean structure (in the sense of data reduction), a triangulation of type Ja1, to accelerate the search for visible primitives. The resulting solution is useful for real-time, online, interactive applications such as 3D visualization, where the main goal is to load as few primitives from the scene as possible during the rendering stage. To this end, our algorithm performs culling using a hybrid paradigm that combines viewing-frustum culling, back-face culling, and occlusion models. Results show substantial improvement over these traditional approaches applied separately. This novel approach can be used on devices with no dedicated graphics processor or with low processing power, such as cell phones or embedded displays, or to visualize data over the Internet, as in virtual museum applications.
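The two cheap tests the abstract combines can be sketched in a few lines. This is an illustrative sketch only (the Ja1 triangulation and the occlusion stage are omitted, and all names are ours): a sphere-vs-frustum rejection followed by a back-face rejection, in the order a renderer would typically apply them.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def back_face_visible(normal, centroid, eye):
    # a face can only be visible if its normal points toward the eye
    view = tuple(e - c for e, c in zip(eye, centroid))
    return dot(normal, view) > 0.0

def in_frustum(centroid, radius, planes):
    # planes: (a, b, c, d) tuples with inward-pointing unit normals;
    # reject a bounding sphere only if it lies entirely behind some plane
    return all(dot(p[:3], centroid) + p[3] > -radius for p in planes)

def hybrid_cull(faces, eye, planes):
    # faces: (normal, centroid, bounding_radius) tuples; keep the
    # potentially visible set that survives both cheap tests
    return [f for f in faces
            if in_frustum(f[1], f[2], planes)
            and back_face_visible(f[0], f[1], eye)]
```

Only the primitives returned by `hybrid_cull` would then be loaded and passed to the more expensive occlusion stage.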

Relevance:

10.00%

Publisher:

Abstract:

The electrical conductivity of solid-state matter is a fundamental physical property and can be precisely derived from the resistance measured via the four-point probe technique, excluding contributions from parasitic contact resistances. Over time, this method has become an interdisciplinary characterization tool in materials science, the semiconductor industry, geology, physics, etc., and is employed for both fundamental and application-driven research. However, the correct derivation of the conductivity is a demanding task which faces several difficulties, e.g. the homogeneity of the sample or the isotropy of the phases. In addition, these sample-specific characteristics are intimately related to technical constraints such as the probe geometry and the size of the sample. In particular, the latter is of importance for nanostructures, which can now be probed technically on very small length scales. On the occasion of the 100th anniversary of the four-point probe technique, introduced by Frank Wenner, in this review we revisit and discuss various correction factors which are mandatory for an accurate derivation of the resistivity from the measured resistance. Among others, sample thickness, dimensionality, anisotropy, and the relative size and geometry of the sample with respect to the contact assembly are considered. We also derive the correction factors for 2D anisotropic systems on circular finite areas with variable probe spacings. All these aspects are illustrated by state-of-the-art experiments carried out using a four-tip STM/SEM system. We are aware that this review article can only cover some of the most important topics; regarding further aspects, e.g. technical realizations, the influence of inhomogeneities, or different transport regimes, we refer to other review articles in this field.
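As a concrete illustration of what such correction factors look like, the two textbook limiting cases for collinear, equally spaced probes can be written down directly (the function names and the thin-film assumption t ≪ s below are ours, not taken from the review):

```python
import math

def resistivity_bulk(V, I, s):
    """Semi-infinite (bulk) sample, collinear probes with equal spacing s:
    rho = 2 * pi * s * V / I."""
    return 2.0 * math.pi * s * V / I

def resistivity_thin_film(V, I, t):
    """Thin film of thickness t << s on an insulating substrate:
    rho = (pi * t / ln 2) * V / I; dividing by t gives the sheet
    resistance R_s = (pi / ln 2) * V / I, about 4.532 * V / I."""
    return (math.pi / math.log(2.0)) * t * V / I
```

Any deviation from these idealized geometries (finite lateral size, anisotropy, probes near an edge) multiplies the result by a further correction factor, which is exactly what the review derives case by case.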

Relevance:

10.00%

Publisher:

Abstract:

Despite all intentions in the course of the Bologna Process and decades of investment into improving the social dimension, results in many national and international studies show that inequity remains stubbornly persistent, and that inequity based on socio-economic status, parental education, gender, country of origin, rural background and more continues to prevail in our Higher Education systems and in the labour market. While improvement has been shown, extrapolation of the gains of the last 40 years in the field shows that it could take over 100 years for disadvantaged groups to catch up with their more advantaged peers, should the current rate of improvement be maintained. Many of the traditional approaches to improving equity have also necessitated large-scale public investments, in the form of direct support to underrepresented groups. In an age of austerity, many countries in Europe are finding it necessary to revisit and scale down these policies, so as to accommodate other priorities, such as balanced budgets or dealing with an aging population. An analysis of the current situation indicates that the time is ripe for disruptive innovations to move the cause forward by leaps and bounds, instead of through incrementalist approaches. Despite the list of programmes in this analysis, there is very little evidence of a causal link between programmes, methodologies for their use, and increases or improvements in equity in institutions. This creates a significant information gap for institutions and public authorities seeking indicators by which to allocate limited resources to equity-improving initiatives, without adequate evidence of effectiveness. The IDEAS project and this publication aim to address and narrow this information gap. (DIPF/Orig.)

Relevance:

10.00%

Publisher:

Abstract:

We currently live in an era in which the internet is becoming an ever more essential tool in our daily lives. The internet, and social networks in particular, emerged to keep the consumer permanently connected to the world, one click away. In this study, YouTube was the social network chosen, and the main objective is to understand which characteristics a YouTube channel must have for consumers to intend to revisit it. The methodology adapted a model for assessing website quality, the WebQual model. An online questionnaire was administered, with the questions adapted to a YouTube channel. To test the study's hypotheses, a multiple linear regression was performed. It was possible to conclude that the constructs Usefulness, Ease of Use, Entertainment, and Complementary Relationship influence Revisit Intention; it can therefore be said that the YouTube channel studied shows quality in the eyes of consumers.
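The hypothesis test described above, a multiple linear regression of revisit intention on the four WebQual-derived constructs, can be sketched as follows. The data here are synthetic and the coefficients are invented for illustration; they are not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# synthetic 1-7 Likert-style scores for the four constructs:
# usefulness, ease of use, entertainment, complementary relationship
X = rng.uniform(1, 7, size=(n, 4))
beta = np.array([0.4, 0.3, 0.2, 0.1])       # invented effect sizes
y = 0.5 + X @ beta + rng.normal(0, 0.3, n)  # revisit intention

# ordinary least squares: prepend an intercept column and solve
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef.round(2))  # recovers roughly [0.5, 0.4, 0.3, 0.2, 0.1]
```

In the study itself the fitted coefficients and their significance levels are what support or reject each hypothesis about the channel's quality.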

Relevance:

10.00%

Publisher:

Abstract:

Prior research shows that electronic word of mouth (eWOM) wields considerable influence over consumer behavior. However, as the volume and variety of eWOM grows, firms are faced with challenges in analyzing and responding to this information. In this dissertation, I argue that to meet the new challenges and opportunities posed by the expansion of eWOM and to more accurately measure its impacts on firms and consumers, we need to revisit our methodologies for extracting insights from eWOM. This dissertation consists of three essays that further our understanding of the value of social media analytics, especially with respect to eWOM. In the first essay, I use machine learning techniques to extract semantic structure from online reviews. These semantic dimensions describe the experiences of consumers in the service industry more accurately than traditional numerical variables. To demonstrate the value of these dimensions, I show that they can be used to substantially improve the accuracy of econometric models of firm survival. In the second essay, I explore the effects on eWOM of online deals, such as those offered by Groupon, the value of which to both consumers and merchants is controversial. Through a combination of Bayesian econometric models and controlled lab experiments, I examine the conditions under which online deals affect online reviews and provide strategies to mitigate the potential negative eWOM effects resulting from online deals. In the third essay, I focus on how eWOM can be incorporated into efforts to reduce foodborne illness, a major public health concern. I demonstrate how machine learning techniques can be used to monitor hygiene in restaurants through crowd-sourced online reviews. I am able to identify instances of moral hazard within the hygiene inspection scheme used in New York City by leveraging a dictionary specifically crafted for this purpose. 
To the extent that online reviews provide some visibility into the hygiene practices of restaurants, I show how losses from information asymmetry may be partially mitigated in this context. Taken together, this dissertation contributes by revisiting and refining the use of eWOM in the service sector through a combination of machine learning and econometric methodologies.
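As a toy illustration of the kind of unsupervised extraction the first essay relies on (the dissertation's actual models are far richer), non-negative matrix factorization can pull latent service dimensions out of a review term-count matrix. The vocabulary, reviews, and update rule below are our own minimal example, not the dissertation's:

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Factor a nonnegative docs-x-terms matrix V into W @ H using
    Lee-Seung multiplicative updates; rows of H are latent dimensions."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 1e-3
    H = rng.random((k, V.shape[1])) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# toy term counts: rows are reviews, columns are vocabulary words
vocab = ["food", "tasty", "service", "slow", "clean", "dirty"]
V = np.array([[3, 2, 0, 0, 0, 0],
              [2, 3, 0, 0, 1, 0],
              [0, 0, 3, 2, 0, 0],
              [0, 0, 2, 3, 0, 1]], float)
W, H = nmf(V, k=2)
for topic in H:  # inspect the top words of each latent dimension
    print([vocab[i] for i in np.argsort(topic)[-2:]])
```

Each row of `W` then scores a review on the latent dimensions, and those scores, rather than a single star rating, feed the downstream econometric models.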

Relevance:

10.00%

Publisher:

Abstract:

Radiation in the first days of a supernova explosion contains rich information about the physical properties of the exploding star. In the past three years, I used the intermediate Palomar Transient Factory (iPTF) to conduct one-day cadence surveys in order to systematically search for infant supernovae. I show that the one-day cadences in these surveys were strictly controlled, that the real-time image subtraction pipeline managed to deliver transient candidates within ten minutes of images being taken, and that we were able to undertake follow-up observations with a variety of telescopes within hours of transients being discovered. So far, iPTF has discovered over a hundred supernovae within a few days of explosion, forty-nine of which were spectroscopically classified within twenty-four hours of discovery.

Our observations of infant Type Ia supernovae provide evidence for both the single-degenerate and double-degenerate progenitor channels. On the one hand, a low-velocity Type Ia supernova, iPTF14atg, revealed a strong ultraviolet pulse within four days of its explosion. I show that the pulse is consistent with the expected emission produced by collision between the supernova ejecta and a companion star, providing direct evidence for the single-degenerate channel. By comparing the distinct early-phase light curves of iPTF14atg to an otherwise similar event, iPTF14dpk, I show that the viewing-angle dependence of the supernova-companion collision signature is probably responsible for the difference between the early light curves. I also show evidence for a dark period between the supernova explosion and the first light of the radioactively powered light curve. On the other hand, a peculiar Type Ia supernova, iPTF13asv, revealed strong near-UV emission and an absence of iron in the spectra within the first two weeks of explosion, suggesting a stratified ejecta structure with iron-group elements confined to the slow-moving part of the ejecta. With its total ejecta mass estimated to exceed the Chandrasekhar limit, I show that the stratification and large mass of the ejecta favor the double-degenerate channel.

In a separate approach, iPTF found the first progenitor system of a Type Ib supernova, iPTF13bvn, in pre-explosion HST archival images. Independently, I used the early-phase optical observations of this supernova to constrain its progenitor radius to be no larger than several solar radii. I also used its early radio detections to derive a mass-loss rate of 3e-5 solar masses per year for the progenitor right before the supernova explosion. These constraints on the physical properties of the iPTF13bvn progenitor provide a comprehensive data set to test Type Ib supernova theories. A recent HST revisit to the iPTF13bvn site two years after the supernova explosion has confirmed the progenitor system.

Moving forward, the next frontier in this area is to extend these single-object analyses to a large sample of infant supernovae. The upcoming Zwicky Transient Facility with its fast survey speed, which is expected to find one infant supernova every night, is well positioned to carry out this task.

Relevance:

10.00%

Publisher:

Abstract:

The past decade has seen a great deal of research on statistics-based network protocol identification using machine learning techniques. Prior studies have shown promising results in terms of high accuracy and fast classification speed. However, most works have embodied an implicit assumption that all protocols are known in advance and present in the training data, which is unrealistic since real-world networks constantly witness emerging traffic patterns as well as unknown protocols in the wild. In this paper, we revisit the problem by proposing a learning scheme with unknown pattern extraction for statistical protocol identification. The scheme is designed for a more realistic setting, where the training dataset contains labeled samples from a limited number of protocols, and the goal is to tell these known protocols apart from each other and from potential unknown ones. Preliminary results derived from real-world traffic are presented to show the effectiveness of the scheme.
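A minimal sketch of such open-set ("known vs. unknown") identification is shown below. It substitutes a nearest-centroid rule with a distance-based rejection threshold for the paper's actual scheme; the feature layout, protocol names, and threshold are ours:

```python
import math

def centroid(rows):
    # component-wise mean of a list of equal-length feature vectors
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def fit(samples):
    """samples: {protocol_name: [feature_vectors]} -> per-class centroids."""
    return {proto: centroid(rows) for proto, rows in samples.items()}

def classify(model, x, radius):
    """Assign the nearest known protocol, or 'unknown' if every centroid
    lies farther than `radius` (the open-set rejection threshold)."""
    proto, d = min(((p, dist(c, x)) for p, c in model.items()),
                   key=lambda t: t[1])
    return proto if d <= radius else "unknown"
```

Flows labeled "unknown" can then be clustered separately to extract the emerging traffic patterns the paper is concerned with.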

Relevance:

10.00%

Publisher:

Abstract:

Large and persistent gaps in subnational public expenditure have important implications for growth, equity, and migration. In this context, we revisit the question of expenditure convergence across the American states to provide more nuanced evidence than found by the small number of previous studies. We employ a methodology due to Smeekes (Bootstrap sequential tests to determine the stationary units in a panel, 2011) that sequentially tests for unit roots in pairwise (real per capita) expenditure gaps based on user-specified fractions. In a panel of 48 combined state-local government units (1957-2008), we found that expenditures on highways, sanitation, utilities, and education were far more convergent than expenditures on health and hospitals, police and fire protection, and public welfare. There was little evidence of "club convergence" based on the proportion of intraregional convergent pairs. Several historically high grant-receiving states showed relatively strong evidence of convergence. Our results bode well for future output convergence and opportunities for Tiebout-type migration across jurisdictions. They also imply a diminished role for public infrastructure and education spending in business location choices over time, and a mixed role for federal grants in inducing convergence.
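The building block of such convergence tests is a unit-root regression on each pairwise expenditure gap: if the gap has no unit root, it is mean-reverting and the two states' expenditures converge. A stripped-down Dickey-Fuller t-statistic, not Smeekes's bootstrap sequential procedure, can be sketched on simulated gaps as follows (the series length and parameters are illustrative; the actual panel has roughly 52 annual observations per pair):

```python
import numpy as np

def df_tstat(gap):
    """t-statistic on rho in: diff(g_t) = alpha + rho * g_{t-1} + e_t.
    Strongly negative values reject a unit root, i.e. the expenditure
    gap is mean-reverting and the pair of states converges."""
    y, x = np.diff(gap), gap[:-1]
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(1)
T = 200
gap = np.zeros(T)                     # mean-reverting (converging) gap
for t in range(1, T):
    gap[t] = 0.2 * gap[t - 1] + rng.normal()
walk = np.cumsum(rng.normal(size=T))  # unit-root (non-converging) gap
print(df_tstat(gap), df_tstat(walk))
```

In practice the statistic is compared with Dickey-Fuller (or, in Smeekes's procedure, bootstrap) critical values, roughly -2.9 at the 5% level for this specification.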

Relevance:

10.00%

Publisher:

Abstract:

This Article aims to revisit the historical development of the doctrine of exemplary or punitive damages. Punitive damages are anomalous in that they lie in both tort and crime, a matter that has led to much criticism by modern commentators. Yet a definitive history of punitive damages does not exist to explain this anomaly. The main contribution of this Article, then, is to begin such a history by way of a meta-narrative. It identifies and links the historically significant moments that led to punitive damages: from the background period of classical Roman law, through its renewed reception in Western Europe in the twelfth and thirteenth centuries that coincided with the emergence of the English common law, and the English statutes of the late thirteenth century, to the court cases of Wilkes v. Wood and Huckle v. Money in the eighteenth century that heralded the "first explicit articulation" of the legal principle of punitive damages. This Article argues that this history is not linear in nature but historically contingent. This is a corrective to present scholarship, which fails to adequately connect or contextualize these historical moments, or over-simplifies this development over time.

Relevance:

10.00%

Publisher:

Abstract:

This research enhances the understanding of consumer behaviour and customer experience in the context of town centres. First, it defines town centre customer experience (TCCE) as a multifaceted journey that combines interactions with a diverse range of public and private organisations, including retailers and social and community elements; this results in a unique experience co-created with the consumer across a series of functional and experiential touchpoints. Second, combining qualitative and quantitative insights, this research reveals a series of specific functional and experiential TCCE touchpoints, which underpin the consumer internal response (motivation to visit) and outward behaviour (desire to stay and revisit intentions) in the town centre. In addition to enhancing town centre and customer experience knowledge, these findings offer important new insights to those managing town centres and seeking to retain customer loyalty in the high street. Above all, these findings can help identify the touchpoints that need to be reinforced and/or improved to differentiate a town from its competing centres and to create tailored marketing strategies. Taken together, such initiatives have the potential to positively impact the revitalisation of the high street and the town centre economy.

Relevance:

10.00%

Publisher:

Abstract:

In this concluding chapter our purpose is two-fold. The first is to draw out some of the common themes which underpin the chapters. In part, we commenced this task in arranging the book into the four sections of Images of Schooling, Performing Pedagogy Visually, Power and Representation and Ethical Issues. However, in recognition that, like all categorisations, this was arbitrary and potentially reductive, we now revisit the contributions making connections across and between the chapters. A related and second task of this conclusion is to highlight gaps and limitations of what we have gathered together in this collection. Inevitably, this book does not speak to all of the issues embedded in a visual approach to educational research. In recognising this partiality, our aim is to gesture towards the types of questions and concerns that VRMs raise and still require educational researchers to think about — and in differing ways.

Relevance:

10.00%

Publisher:

Abstract:

The paper notebook with its companion pencil or pen is a creative tool for many contemporary choreographers and their dancers. Using the notebook affords a relationship with a set of external objects inscribed “on the page” in the form of drawn sketches, notations, and diagrams combined with text (Blackwell et al.). This relationship can be described in cognitive terms, for example, where the page becomes a surrogate for working memory, or a way for seeing something new by modeling structures or processes. The notebook in this sense becomes a site for the encounter of cognition and creativity, providing a place for thinking generatively with external objects (sketches, notations, etc.), an idea this essay will revisit.

Relevance:

10.00%

Publisher:

Abstract:

The use of renewable energy has been increasing in response to the EU 2030 Climate and Energy targets. However, non-dispatchable, intermittent renewables such as wind and solar generally cannot match supply to demand, which can also cause problems in the grid. Interest in energy storage has therefore grown, and there is now an urgent need for larger energy storage capacity. Compressed Air Energy Storage (CAES) is a proven technology for storing large quantities of electrical energy in the form of high-pressure air for later use when electricity is needed. It has existed since the 1970s and is one of the few energy storage technologies suitable for long-duration (tens of hours) and utility-scale (hundreds to thousands of MW) applications. It is also one of the most cost-effective solutions for small- to large-scale storage applications. CAES can be integrated into, and bring advantages to, different levels of the electric system, from generation to transmission and distribution; in this paper we therefore revisit CAES in order to better understand what it can be used for and how it can serve our modern energy storage needs.
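To give a feel for the utility scale involved, the ideal isothermal work recoverable from a storage cavern can be estimated with the textbook exergy formula. The cavern size and pressure below are round, Huntorf-like numbers of our own choosing, and the isothermal assumption makes this an upper bound:

```python
import math

def isothermal_air_exergy(volume_m3, p_store_pa, p_amb_pa=1.013e5):
    """Ideal isothermal work recoverable from air stored at pressure
    p_store in a fixed cavern volume, relative to ambient pressure p0:
    W = p * V * ln(p / p0) - (p - p0) * V   [joules]."""
    p, p0, V = p_store_pa, p_amb_pa, volume_m3
    return p * V * math.log(p / p0) - (p - p0) * V

# a 300,000 m^3 salt cavern charged to 70 bar
mwh = isothermal_air_exergy(300_000, 70e5) / 3.6e9  # J -> MWh
print(round(mwh))
```

Real plants recover considerably less than this bound: only part of the pressure swing is used, and the heat of compression is lost unless stored, which is why diabatic plants such as Huntorf burn fuel during expansion.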