869 results for broadcast search
Abstract:
This article is concerned with the liability of search engines for algorithmically produced search suggestions, such as those of Google’s ‘autocomplete’ function. Liability in this context may arise when automatically generated associations have an offensive or defamatory meaning, or may even induce infringement of intellectual property rights. The increasing number of cases brought before courts all over the world raises questions about the conflict between the fundamental freedoms of speech and access to information on the one hand, and the personality rights of individuals (under a broader right of informational self-determination) on the other. In the light of the recent judgment of the Court of Justice of the European Union (EU) in Google Spain v AEPD, this article concludes that many requests for removal of suggestions including private individuals’ information will be successful on the basis of EU data protection law, even absent prejudice to the person concerned.
Abstract:
Trading commercial real estate involves a process of exchange that is costly and that occurs over an extended and uncertain period of time. This has consequences for the performance and risk of real estate investments. Most research on transaction times has focused on residential rather than commercial real estate. We study the time taken to transact commercial real estate assets in the UK using a sample of 578 transactions over the period 2004 to 2013. We measure average time to transact from a buyer and a seller perspective, distinguishing the search and due diligence phases of the process, and we conduct econometric analysis to explain variation in due diligence times between assets. The median time for purchase of real estate from introduction to completion was 104 days, and the median time for sale from marketing to completion was 135 days. There is considerable variation around these times, and results suggest that some of this variation is related to market state, type and quality of asset, and type of participants involved in the transaction. Our findings shed light on the drivers of liquidity at an individual asset level and can inform models that quantify the impact of uncertain time on market on real estate investment risk.
Abstract:
More than two decades have passed since the fall of the Berlin Wall and the transfer of the Cold War file from a daily preoccupation of policy makers to a more detached assessment by historians. Scholars of U.S.-Latin American relations are beginning to take advantage both of the distance in time and of newly opened archives to reflect on the four decades that, from the 1940s to the 1980s, divided the Americas, as they did much of the world. Others are seeking to understand U.S. policy and inter-American relations in the post-Cold War era, a period that not only lacks a clear definition but also still has no name. Still others have turned their gaze forward to offer policies in regard to the region for the new Obama administration. Numerous books and review essays have addressed these three subjects: the Cold War, the post-Cold War era, and current and future issues on the inter-American agenda. Few of these studies attempt, however, to connect the three subjects or to offer new and comprehensive theories to explain the course of U.S. policies from the beginning of the twentieth century until the present. Indeed, some works and policy makers continue to use the mind-sets of the Cold War as though that conflict were still being fought. With the benefit of newly opened archives, some scholars have nevertheless drawn insights from the depths of the Cold War that improve our understanding of U.S. policies and inter-American relations, but they do not address whether the United States has escaped the longer cycle of intervention followed by neglect that has characterized its relations with Latin America. Another question is whether U.S. policies differ markedly before, during, and after the Cold War. In what follows, we ask whether the books reviewed here provide any insights in this regard and whether they offer a compass for the future of inter-American relations. We also offer our own thoughts on how their various perspectives could be synthesized to address these questions more comprehensively.
Abstract:
Task relevance affects emotional attention in healthy individuals. Here, we investigate whether the association between anxiety and attention bias is affected by the task relevance of emotion during an attention task. Participants completed two visual search tasks. In the emotion-irrelevant task, participants were asked to indicate whether a discrepant face in a crowd of neutral, middle-aged faces was old or young. Irrelevant to the task, target faces displayed angry, happy, or neutral expressions. In the emotion-relevant task, participants were asked to indicate whether a discrepant face in a crowd of middle-aged neutral faces was happy or angry (target faces also varied in age). Trait anxiety was not associated with attention in the emotion-relevant task. However, in the emotion-irrelevant task, trait anxiety was associated with a bias for angry over happy faces. These findings demonstrate that the task relevance of emotional information affects conclusions about the presence of an anxiety-linked attention bias.
Abstract:
Searching for and mapping the physical extent of unmarked graves using geophysical techniques has proven difficult in many cases. The success of individual geophysical techniques for detecting graves must be assessed on a site-by-site basis. Significantly, detection of graves often results from measured contrasts that are linked to the background soils rather than to the type of archaeological feature associated with the grave. Buried remains should therefore be investigated within a 3D space, as the burial environment can vary considerably through the grave. Within this paper, we demonstrate the need for a multi-method survey strategy to investigate unmarked graves, as applied at a “planned” but unmarked pauper’s cemetery. The outcome of this case study provides new insights into the strategy that is required at such sites. Perhaps the most significant conclusion is that unmarked graves are best understood in terms of characterization rather than identification. We argue for a methodological approach that, while following the current trend of using multiple techniques, is fundamentally dependent on a structured approach to the analysis of the data. The ramifications of this case study illustrate the necessity of an integrated strategy to provide a more holistic understanding of unmarked graves, which may aid the management of these unseen but important aspects of our heritage. We conclude that the search for graves remains an open debate, and one that will be settled by methodological rather than technique-based arguments.
Abstract:
The challenge of moving past the classic Window Icons Menus Pointer (WIMP) interface, i.e. by turning it ‘3D’, has resulted in much research and development. To evaluate the impact of 3D on the ‘finding a target picture in a folder’ task, we built a 3D WIMP interface that allowed the systematic manipulation of visual depth, visual aids, and the semantic category distribution of targets versus non-targets, as well as the detailed measurement of lower-level stimulus features. Across two separate experiments (one large-sample web-based experiment, to understand associations, and one in a controlled lab environment, using eye tracking to understand user focus), we investigated how visual depth, use of visual aids, use of semantic categories, and lower-level stimulus features (i.e. contrast, colour and luminance) impact how successfully participants are able to search for, and detect, the target image. Moreover, in the lab-based experiment, we captured pupillometry measurements to allow consideration of the influence of increasing cognitive load as a result of either an increasing number of items on the screen or the inclusion of visual depth. Our findings showed that increasing the visible layers of depth, and the inclusion of converging lines, did not impact target detection times, errors, or failure rates. Low-level features, including colour, luminance, and number of edges, did correlate with differences in target detection times, errors, and failure rates. Our results also revealed that semantic sorting algorithms significantly decreased target detection times. Increased semantic contrast between a target and its neighbours correlated with an increase in detection errors.
Finally, pupillometric data did not provide evidence of any correlation between the number of visible layers of depth and pupil size; however, using structural equation modelling, we demonstrated that cognitive load does influence detection failure rates when there are luminance contrasts between the target and its surrounding neighbours. The results suggest that WIMP interaction designers should consider stimulus-driven factors, which were shown to influence the efficiency with which a target icon can be found in a 3D WIMP interface.
Abstract:
Primordial quark nuggets, remnants of the quark-hadron phase transition that may hide most of the baryon number in superdense chunks of matter, have been discussed for years, always from the theoretical point of view. While they originally seemed fragile at intermediate cosmological temperatures, it has become increasingly clear that they may survive owing to a variety of effects affecting their (surface and volume) evaporation rates. A search for these objects, to elucidate their existence, has never been attempted. We discuss in this note how to search directly for cosmological fossil nuggets among the small asteroids approaching Earth. ‘Asteroids’ with a high visible-to-infrared flux ratio, constant lightcurves, and no spectral features would signal a possible nugget nature. A viable search for these very definite primordial quark nugget features can be conducted as a spinoff of ongoing and forthcoming NEA observation programmes.
Abstract:
Background and aim: Knowledge about the genetic factors responsible for noise-induced hearing loss (NIHL) is still limited. This study investigated whether genetic factors are associated with susceptibility to NIHL. Subjects and methods: Family history and genotypes at candidate genes were studied in 107 individuals with NIHL, 44 with other causes of hearing impairment, and 104 controls. Mutations frequently found among deaf individuals were investigated (35delG and 167delT in GJB2, Δ(GJB6-D13S1830) and Δ(GJB6-D13S1854) in GJB6, and A1555G in MT-RNR1); allelic and genotypic frequencies were also determined at the SNP rs877098 in DFNB1, for deletions of GSTM1 and GSTT1, and for sequence variants in the MTRNR1 and MTTS1 genes, as well as for mitochondrial haplogroups. Results: When those with NIHL were compared with the control group, a significant increase was detected in the number of relatives affected by hearing impairment, in the genotype corresponding to the presence of both the GSTM1 and GSTT1 enzymes, and in cases with mitochondrial haplogroup L1. Conclusion: The findings suggest effects of a family history of hearing loss, of the GSTT1 and GSTM1 enzymes, and of mitochondrial haplogroup L1 on the risk of NIHL. This study also describes novel sequence variants of the MTRNR1 and MTTS1 genes.
Abstract:
We analyzed ostriches from an equipped farm in the southeast region of Brazil for the presence of Salmonella spp. The bacterium was investigated in 80 samples of ostrich droppings, 90 eggs, 30 samples of feed, and 30 samples of rodent droppings. Additionally, at the slaughterhouse, the bacterium was investigated in droppings, caecal content, spleen, liver, and carcasses from 90 slaughtered ostriches from the studied farm. Blood serum from these animals was also harvested and submitted to serum plate agglutination using a commercial Salmonella Pullorum antigen. No Salmonella spp. was detected in any of the eggs, caecal content, liver, spleen, carcass, or dropping samples from ostriches or rodents. However, Salmonella Javiana and Salmonella enterica subsp. enterica 4,12:i:- were isolated from some samples of feed. The serologic test was negative for all samples. Good sanitary farming management and the application of HACCP principles and GMP during the slaughtering process could explain the absence of Salmonella spp. in the tested samples.
Abstract:
A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well-known that the usual large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost-function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
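The pole-dependence described above can be illustrated with a minimal numerical sketch. The code below is not the article's exact-gradient/Levenberg-Marquardt procedure: it builds a discrete-time Laguerre filter bank (the simplest orthonormal basis, with a single real pole, given here as a stated assumption in place of the Kautz/GOBF cases), fits the second-order Volterra polynomial coefficients by least squares, and evaluates the output error for a candidate pole. All signals, dimensions, and function names are illustrative.

```python
import numpy as np

def laguerre_states(u, pole, n_filters):
    # Discrete-time Laguerre orthonormal filter bank with real pole a:
    #   L_1(z)     = sqrt(1 - a^2) z^-1 / (1 - a z^-1)
    #   L_{k+1}(z) = L_k(z) * (z^-1 - a) / (1 - a z^-1)
    # X[t, k] is the output of the (k+1)-th filter driven by input u.
    a, c = pole, np.sqrt(1.0 - pole * pole)
    X = np.zeros((len(u), n_filters))
    for t in range(1, len(u)):
        X[t, 0] = a * X[t - 1, 0] + c * u[t - 1]
        for k in range(1, n_filters):
            X[t, k] = a * X[t - 1, k] + X[t - 1, k - 1] - a * X[t, k - 1]
    return X

def volterra_features(X):
    # Linear terms plus all second-order products x_i * x_j (i <= j):
    # the static polynomial part of the Wiener/OBF structure.
    n = X.shape[1]
    quad = [X[:, i] * X[:, j] for i in range(n) for j in range(i, n)]
    return np.column_stack([X] + quad)

def fit_mse(u, y, pole, n_filters=3):
    # For a fixed pole the Volterra coefficients enter linearly,
    # so they are estimated by ordinary least squares.
    Phi = volterra_features(laguerre_states(u, pole, n_filters))
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    resid = y - Phi @ coef
    return float(np.mean(resid * resid))
```

In the article the pole search uses exact analytic gradients obtained by back-propagation-through-time inside a Levenberg-Marquardt iteration; with this sketch one can instead simply evaluate `fit_mse` over a grid of candidate poles and pick the minimizer.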
Abstract:
We present the results of searches for dipolar-type anisotropies in different energy ranges above 2.5 × 10^17 eV with the surface detector array of the Pierre Auger Observatory, reporting on both the phase and the amplitude measurements of the first harmonic modulation in the right-ascension distribution. Upper limits on the amplitudes are obtained, which provide the most stringent bounds at present, below 2% at 99% C.L. for EeV energies. We also compare our results to those of previous experiments as well as with some theoretical expectations.
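For reference, the classical first-harmonic (Rayleigh) analysis in right ascension used in such searches can be sketched in a few lines. This is a generic textbook illustration on synthetic angles, not the Observatory's full analysis chain, which also involves exposure and systematic corrections.

```python
import numpy as np

def first_harmonic(alpha):
    """Rayleigh first-harmonic analysis of right-ascension angles (radians).

    Returns (amplitude r, phase phi, chance probability of exceeding r
    for an isotropic sky), using the standard estimators
        a = (2/N) sum cos(alpha),  b = (2/N) sum sin(alpha),
        r = sqrt(a^2 + b^2),  phi = atan2(b, a),  P(>r) = exp(-N r^2 / 4).
    """
    n = len(alpha)
    a = 2.0 * np.mean(np.cos(alpha))
    b = 2.0 * np.mean(np.sin(alpha))
    r = float(np.hypot(a, b))
    return r, float(np.arctan2(b, a)), float(np.exp(-n * r * r / 4.0))
```

For a sky modulated as 1 + d cos(alpha - phi0), the estimator recovers amplitude r ≈ d and phase phi ≈ phi0, while for an isotropic sky r fluctuates at the level of sqrt(pi/N).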
Abstract:
Brazil's State of São Paulo Research Foundation
Abstract:
We present parallel algorithms on the BSP/CGM model, with p processors, to count and generate all the maximal cliques of a circle graph with n vertices and m edges. To count the number of all the maximal cliques, without actually generating them, our algorithm requires O(log p) communication rounds with O(nm/p) local computation time. We also present an algorithm to generate the first maximal clique in O(log p) communication rounds with O(nm/p) local computation, and to generate each one of the subsequent maximal cliques this algorithm requires O(log p) communication rounds with O(m/p) local computation. The maximal cliques generation algorithm is based on generating all maximal paths in a directed acyclic graph, and we present an algorithm for this problem that uses O(log p) communication rounds with O(m/p) local computation for each maximal path. We also show that the presented algorithms can be extended to the CREW PRAM model.
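The maximal-path subproblem underlying the clique-generation algorithm is easy to state sequentially. The sketch below is a plain depth-first enumeration, not the BSP/CGM version: it lists every path of a DAG that starts at a vertex with no incoming edge and ends at a vertex with no outgoing edge, so the path cannot be extended at either end. The edge-list input format and vertex labels are illustrative choices.

```python
from collections import defaultdict

def maximal_paths(edges):
    """Enumerate all maximal paths of a DAG given as a list of (u, v) edges.

    A maximal path runs from a source (no incoming edge) to a sink
    (no outgoing edge); such paths cannot be extended at either end.
    """
    adj = defaultdict(list)
    nodes, has_pred = set(), set()
    for u, v in edges:
        adj[u].append(v)
        nodes.update((u, v))
        has_pred.add(v)

    out = []

    def dfs(path):
        u = path[-1]
        if not adj[u]:          # sink reached: path is maximal
            out.append(list(path))
            return
        for v in adj[u]:        # otherwise extend along every out-edge
            dfs(path + [v])

    for s in sorted(nodes - has_pred):  # start from every source
        dfs([s])
    return out
```

Note that the number of maximal paths can be exponential in the size of the DAG, which is why the article charges O(log p) communication rounds and O(m/p) local computation per generated path rather than for the whole output.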