913 results for Amazon metric


Relevance: 10.00%

Abstract:

In an exploration of intellectual property and fashion, this article examines the question of the intermediary liability of online auction-houses for counterfeiting. In the United States, the illustrious jewellery store, Tiffany & Co, brought a legal action against eBay Inc, alleging direct trademark infringement, contributory trademark infringement, false advertising, unfair competition and trademark dilution. The luxury store depicted the online auction-house as a pirate bazaar, a flea-market and a haven for counterfeiting. During epic litigation, eBay Inc successfully defended itself against these allegations in a United States District Court and the United States Court of Appeals for the Second Circuit. Tiffany & Co made a desperate, unsuccessful effort to appeal the matter to the Supreme Court of the United States. The matter featured a number of interventions from amici curiae: Tiffany was supported by Coty, the Fashion Designer's Guild, and the International Anticounterfeiting Coalition, while eBay was defended by public-spirited civil society groups such as the Electronic Frontier Foundation, Public Citizen, and Public Knowledge, as well as Yahoo!, Google Inc, Amazon.com, and associations representing telecommunications carriers and internet service providers. The litigation in the United States can be counterpointed with the fusillade of legal action against eBay in the European Union. In contrast to Tiffany & Co, Louis Vuitton triumphed over eBay in the French courts, claiming its victory as vindication of the need to protect the commercial interests and cultural heritage of France. However, eBay has fared somewhat better in a dispute with L’Oréal in Great Britain and before the European Court of Justice. It is argued that, in a time of flux and uncertainty, Australia should follow the position of the United States courts in Tiffany & Co v eBay Inc. The final part examines the ramifications of this litigation over online auction-houses for trade mark law reform and consumer rights; for parallel disputes over intermediary liability and safe harbours in the field of copyright law; and for the Anti-Counterfeiting Trade Agreement 2010. The conclusion calls for a revision of trade mark law, animated by a respect for consumers’ rights and interests in the electronic marketplace.

Relevance: 10.00%

Abstract:

This study uses the reverse salient methodology to contrast subsystems in video game consoles in order to discover, characterize, and forecast the most significant technology gap. We build on current methodologies for measuring the magnitude of Reverse Salience (the Performance Gap and the Time Gap) by demonstrating the effectiveness of the Performance Gap Ratio (PGR). The three subsystems examined in this analysis are CPU performance, GPU core frequency, and video memory bandwidth. CPU performance is captured by the CPU Score, a metric developed for this project, defined as the product of the core frequency, the number of parallel cores, and the instruction size. We measure the Performance Gap of each subsystem against concurrently available PC hardware on the market. Using PGR, we normalize the evolution of these technologies for comparative analysis. The results indicate that while CPU performance has historically been the Reverse Salient, video memory bandwidth has taken over as the fastest-growing technology gap in the current generation. Finally, we create a technology forecasting model that shows how much the video memory bandwidth gap will grow through 2019 should the current trend continue. This analysis can assist console developers in assigning resources to the next generation of platforms, which will ultimately result in longer hardware life cycles.
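As a concrete illustration of the two quantities, the sketch below (Python) computes the CPU Score as the product stated above and a Performance Gap Ratio taken here as the console value divided by the concurrent PC value; the exact PGR formula and all numbers are illustrative assumptions, not figures from the study.

```python
# Minimal sketch of the CPU Score and Performance Gap Ratio (PGR) described above.
# The CPU Score product comes from the abstract; the PGR definition used here
# (console value / concurrent PC value) and all numbers are assumptions.

def cpu_score(core_freq_ghz: float, n_cores: int, instruction_bits: int) -> float:
    """CPU Score = core frequency x number of parallel cores x instruction size."""
    return core_freq_ghz * n_cores * instruction_bits

def performance_gap_ratio(console_value: float, pc_value: float) -> float:
    """Normalized gap of a console subsystem relative to concurrent PC hardware."""
    return console_value / pc_value

# Hypothetical example: a console CPU compared with a contemporary PC CPU.
console = cpu_score(core_freq_ghz=1.6, n_cores=8, instruction_bits=64)
pc = cpu_score(core_freq_ghz=3.5, n_cores=8, instruction_bits=64)
print(f"PGR (CPU): {performance_gap_ratio(console, pc):.2f}")
```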

Relevance: 10.00%

Abstract:

We propose a new information-theoretic metric, the symmetric Kullback-Leibler divergence (sKL-divergence), to measure the difference between two water diffusivity profiles in high angular resolution diffusion imaging (HARDI). Water diffusivity profiles are modeled as probability density functions on the unit sphere, and the sKL-divergence is computed from a spherical harmonic series, which greatly reduces computational complexity. Adjustment of the orientation of diffusivity functions is essential when the image is being warped, so we propose a fast algorithm to determine the principal direction of diffusivity functions using principal component analysis (PCA). We compare sKL-divergence with other inner-product based cost functions using synthetic samples and real HARDI data, and show that the sKL-divergence is highly sensitive in detecting small differences between two diffusivity profiles and therefore shows promise for applications in the nonlinear registration and multisubject statistical analysis of HARDI data.
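The following sketch illustrates the sKL-divergence numerically, assuming two diffusivity profiles that have already been normalized to probability densities on the unit sphere; the paper's spherical-harmonic evaluation and PCA-based reorientation are not reproduced here, and the latitude-longitude quadrature is only an approximation.

```python
import numpy as np

# Numerical sketch of the symmetric KL divergence between two profiles on the
# unit sphere: sKL(p, q) = integral of (p - q) * log(p / q) over the sphere.
def skl_divergence(p, q, theta, dtheta, dphi, eps=1e-12):
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    # Surface element on the unit sphere: sin(theta) dtheta dphi.
    return np.sum((p - q) * np.log(p / q) * np.sin(theta) * dtheta * dphi)

# Illustrative grid and two toy profiles (isotropic vs. mildly anisotropic).
n_theta, n_phi = 64, 128
theta, phi = np.meshgrid(np.linspace(0.0, np.pi, n_theta),
                         np.linspace(0.0, 2.0 * np.pi, n_phi), indexing="ij")
dtheta, dphi = np.pi / n_theta, 2.0 * np.pi / n_phi

def normalize(f):
    """Scale a non-negative profile so it integrates to 1 over the sphere."""
    return f / np.sum(f * np.sin(theta) * dtheta * dphi)

p = normalize(np.ones_like(theta))              # isotropic profile
q = normalize(1.0 + 0.5 * np.cos(theta) ** 2)   # anisotropic profile
print(f"sKL(p, q) = {skl_divergence(p, q, theta, dtheta, dphi):.4f}")
```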

Relevance: 10.00%

Abstract:

We apply an information-theoretic cost metric, the symmetrized Kullback-Leibler (sKL) divergence, or J-divergence, to fluid registration of diffusion tensor images. The difference between diffusion tensors is quantified based on the sKL-divergence of their associated probability density functions (PDFs). Three-dimensional DTI data from 34 subjects were fluidly registered to an optimized target image. To allow large image deformations but preserve image topology, we regularized the flow with a large-deformation diffeomorphic mapping based on the kinematics of a Navier-Stokes fluid. A driving force was developed to minimize the J-divergence between the deforming source and target diffusion functions, while reorienting the flowing tensors to preserve fiber topography. In initial experiments, we showed that the sKL-divergence based on full diffusion PDFs is adaptable to higher-order diffusion models, such as high angular resolution diffusion imaging (HARDI). The sKL-divergence was sensitive to subtle differences between two diffusivity profiles, showing promise for nonlinear registration applications and multisubject statistical analysis of HARDI data.
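To make the tensor-level cost concrete, here is a hedged sketch of the J-divergence between two diffusion tensors, treating each as the covariance of a zero-mean Gaussian displacement PDF as described above; the closed form below is the standard zero-mean Gaussian result, and any additional scaling or square root used in the registration functional itself is not reproduced.

```python
import numpy as np

def j_divergence(D1: np.ndarray, D2: np.ndarray) -> float:
    """Symmetrized KL (J-)divergence between zero-mean Gaussians with
    covariances D1 and D2: 0.5 * (tr(D2^-1 D1) + tr(D1^-1 D2)) - d."""
    d = D1.shape[0]
    return 0.5 * (np.trace(np.linalg.solve(D2, D1)) +
                  np.trace(np.linalg.solve(D1, D2))) - d

# Illustrative example: an isotropic tensor vs. a prolate (fibre-like) tensor.
# Diffusivity values are hypothetical, in mm^2/s.
D_iso = np.diag([0.7e-3, 0.7e-3, 0.7e-3])
D_prolate = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
print(f"J-divergence: {j_divergence(D_iso, D_prolate):.4f}")
```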

Relevance: 10.00%

Abstract:

Head motion (HM) is a critical confounding factor in functional MRI. Here we investigate whether HM during resting-state functional MRI (RS-fMRI) is influenced by genetic factors in a sample of 462 twins (65% female; 101 MZ (monozygotic) and 130 DZ (dizygotic) twin pairs; mean age 21 years (SD = 3.16), range 16-29). Heritability estimates for three HM components (mean translation (MT), maximum translation (MAXT) and mean rotation (MR)) ranged from 37% to 51%. We detected a significant common genetic influence on HM variability, with about two-thirds of the variance shared between MR, MT and MAXT (genetic correlations range 0.76-1.00). A composite metric (HM-PC1), which aggregated these three components, was also moderately heritable (h2 = 42%). Using a sub-sample (N = 35) of the twins, we confirmed that mean and maximum translational and rotational motions were consistent "traits" over repeated scans (r = 0.53-0.59); reliability was even higher for the composite metric (r = 0.66). In addition, phenotypic and cross-trait cross-twin correlations between HM and resting-state functional connectivities (RS-FCs) with Brodmann areas (BA) 44 and 45, in which RS-FCs were found to be moderately heritable (BA44: h2 = 0.23 (SD = 0.041); BA45: h2 = 0.26 (SD = 0.061)), indicated that HM might not represent a major bias in genetic studies using FCs. Even so, the HM effect on FC was not completely eliminated after regression. HM may be a valuable endophenotype whose relationship with brain disorders remains to be elucidated.
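A composite score of this kind can be illustrated with a small sketch that takes the first principal component of the standardized MT, MAXT and MR values across subjects; the exact preprocessing behind HM-PC1 is not specified here, and the input data below are randomly generated placeholders.

```python
import numpy as np

def hm_pc1(hm: np.ndarray) -> np.ndarray:
    """hm: (n_subjects, 3) array with columns MT, MAXT, MR.
    Returns first-principal-component scores of the standardized components."""
    z = (hm - hm.mean(axis=0)) / hm.std(axis=0)        # standardize each component
    _, _, vt = np.linalg.svd(z, full_matrices=False)   # PCA via SVD
    return z @ vt[0]                                   # project onto the first component

# Placeholder data: 462 subjects (as in the study), three motion summaries each.
rng = np.random.default_rng(0)
hm = rng.lognormal(mean=-1.0, sigma=0.5, size=(462, 3))
print(hm_pc1(hm)[:5])
```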

Relevance: 10.00%

Abstract:

Automatic labeling of white matter fibres in diffusion-weighted brain MRI is vital for comparing brain integrity and connectivity across populations, but is challenging. Whole brain tractography generates a vast set of fibres throughout the brain, but it is hard to cluster them into anatomically meaningful tracts, due to wide individual variations in the trajectory and shape of white matter pathways. We propose a novel automatic tract labeling algorithm that fuses information from tractography and multiple hand-labeled fibre tract atlases. As streamline tractography can generate a large number of false positive fibres, we developed a top-down approach to extract tracts consistent with known anatomy, based on a distance metric to multiple hand-labeled atlases. Clustering results from different atlases were fused, using a multi-stage fusion scheme. Our "label fusion" method reliably extracted the major tracts from 105-gradient HARDI scans of 100 young normal adults.
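A simplified sketch of this top-down idea is given below: each candidate streamline is scored against hand-labeled atlas bundles with a distance metric, assigned the closest label only if the distance is small enough (rejecting likely false positives), and the per-atlas decisions are fused by majority vote. The mean closest-point distance, the 10 mm threshold, and the simple vote are stand-ins; the paper's actual distance metric and multi-stage fusion scheme are not reproduced.

```python
import numpy as np

def mean_closest_point_distance(fibre: np.ndarray, tract: np.ndarray) -> float:
    """fibre: (n, 3) streamline points; tract: (m, 3) points pooled from an atlas bundle."""
    d = np.linalg.norm(fibre[:, None, :] - tract[None, :, :], axis=-1)
    return float(d.min(axis=1).mean())

def label_from_atlas(fibre, atlas, max_dist=10.0):
    """atlas: dict mapping tract name -> (m, 3) point cloud. Returns a label or None."""
    dists = {name: mean_closest_point_distance(fibre, pts) for name, pts in atlas.items()}
    best = min(dists, key=dists.get)
    return best if dists[best] <= max_dist else None   # reject likely false positives

def fuse_labels(fibre, atlases, max_dist=10.0):
    """Majority vote over the labels proposed by each hand-labeled atlas."""
    votes = [label_from_atlas(fibre, a, max_dist) for a in atlases]
    votes = [v for v in votes if v is not None]
    return max(set(votes), key=votes.count) if votes else None
```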

Relevance: 10.00%

Abstract:

We study the influence of the choice of template in tensor-based morphometry. Using 3D brain MR images from 10 monozygotic twin pairs, we defined a tensor-based distance in the log-Euclidean framework [1] between each image pair in the study. Relative to this metric, twin pairs were found to be closer to each other on average than random pairings, consistent with evidence that brain structure is under strong genetic control. We also computed the intraclass correlation and associated permutation p-value at each voxel for the determinant of the Jacobian matrix of the transformation. The cumulative distribution function (CDF) of the voxel-wise p-values was computed for each of the templates and compared to the null distribution. Surprisingly, there was very little difference between the CDFs of statistics computed from analyses using different templates. As the brain with the least log-Euclidean deformation cost, the mean template defined here avoids the blurring caused by creating a synthetic image from a population and, when selected from a large population, avoids bias by being geometrically centered, in a metric that is sensitive enough to anatomical similarity that it can even detect genetic affinity among anatomies.
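For a single voxel, the log-Euclidean distance used in this framework can be sketched as the Frobenius norm of the difference of matrix logarithms of two symmetric positive-definite tensors; how voxel-wise values are aggregated into an image-pair distance is an assumption here (a simple sum), not the paper's exact definition.

```python
import numpy as np

def spd_log(s: np.ndarray) -> np.ndarray:
    """Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(s)
    return v @ np.diag(np.log(w)) @ v.T

def log_euclidean_distance(s1: np.ndarray, s2: np.ndarray) -> float:
    """d(S1, S2) = || log(S1) - log(S2) ||_F in the log-Euclidean framework."""
    return float(np.linalg.norm(spd_log(s1) - spd_log(s2), ord="fro"))

def field_distance(field1, field2) -> float:
    """Illustrative image-pair distance: sum of voxel-wise log-Euclidean distances
    over two corresponding deformation-tensor fields (iterables of SPD matrices)."""
    return sum(log_euclidean_distance(a, b) for a, b in zip(field1, field2))

# Toy voxel-level example with two nearly isotropic tensors.
s1 = np.diag([1.2, 0.9, 1.0])
s2 = np.eye(3)
print(f"log-Euclidean distance: {log_euclidean_distance(s1, s2):.4f}")
```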

Relevance: 10.00%

Abstract:

Our aim was to make a quantitative comparison of the response of the different visual cortical areas to selective stimulation of the two different cone-opponent pathways [long- and medium-wavelength (L/M)- and short-wavelength (S)-cone-opponent] and the achromatic pathway under equivalent conditions. The appropriate stimulus-contrast metric for the comparison of colour and achromatic sensitivity is unknown, however, and so a secondary aim was to investigate whether equivalent fMRI responses of each cortical area are predicted by stimulus contrast matched in multiples of detection threshold that approximately equates for visibility, or direct (cone) contrast matches in which psychophysical sensitivity is uncorrected. We found that the fMRI response across the two colour and achromatic pathways is not well predicted by threshold-scaled stimuli (perceptual visibility) but is better predicted by cone contrast, particularly for area V1. Our results show that the early visual areas (V1, V2, V3, VP and hV4) all have robust responses to colour. No area showed an overall colour preference, however, until anterior to V4 where we found a ventral occipital region that has a significant preference for chromatic stimuli, indicating a functional distinction from earlier areas. We found that all of these areas have a surprisingly strong response to S-cone stimuli, at least as great as the L/M response, suggesting a relative enhancement of the S-cone cortical signal. We also identified two areas (V3A and hMT+) with a significant preference for achromatic over chromatic stimuli, indicating a functional grouping into a dorsal pathway with a strong magnocellular input.
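One common way to express the direct cone-contrast metric mentioned above is the Weber contrast of each cone class's excitation against the background, combined into a single magnitude; the sketch below uses a root-mean-square combination, which is an illustrative assumption rather than the paper's exact definition.

```python
import numpy as np

def cone_contrasts(stimulus_lms: np.ndarray, background_lms: np.ndarray) -> np.ndarray:
    """Weber cone contrasts (dL/L, dM/M, dS/S) of a stimulus against the background."""
    return (stimulus_lms - background_lms) / background_lms

def cone_contrast_magnitude(stimulus_lms, background_lms) -> float:
    """Single contrast value: root-mean-square of the three cone contrasts."""
    c = cone_contrasts(np.asarray(stimulus_lms, float), np.asarray(background_lms, float))
    return float(np.sqrt(np.mean(c ** 2)))

# Hypothetical L, M, S cone excitations for a background and an S-cone-isolating stimulus.
background = [10.0, 10.0, 10.0]
s_isolating = [10.0, 10.0, 14.0]   # only the S-cone excitation changes
print(f"cone contrast: {cone_contrast_magnitude(s_isolating, background):.3f}")
```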

Relevance: 10.00%

Abstract:

In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media: the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy. The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books, which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’, considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.

Relevance: 10.00%

Abstract:

In this paper, we present a machine learning approach to measuring the visual quality of JPEG-coded images. The features for predicting the perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality assessment involves estimating the functional relationship between HVS features and subjective test scores. The quality of the compressed images is obtained without referring to their original images (a 'no-reference' metric). Here, the problem of quality estimation is transformed into a classification problem and solved using the extreme learning machine (ELM) algorithm. In ELM, the input weights and the bias values are randomly chosen and the output weights are analytically calculated. The generalization performance of the ELM algorithm for classification problems with an imbalance in the number of samples per quality class depends critically on the input weights and the bias values. Hence, we propose two schemes, namely the k-fold selection scheme (KS-ELM) and the real-coded genetic algorithm (RCGA-ELM), to select the input weights and the bias values such that the generalization performance of the classifier is maximized. Results indicate that the proposed schemes significantly improve the performance of the ELM classifier under imbalanced conditions for image quality assessment. The experimental results show that the estimated visual quality of the proposed RCGA-ELM emulates the mean opinion score very well. The experimental results are compared with an existing JPEG no-reference image quality metric and a full-reference structural similarity image quality metric.
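The core of the ELM classifier described here can be sketched compactly: input weights and biases are drawn at random, hidden activations are computed with a sigmoid, and the output weights follow analytically from a Moore-Penrose pseudoinverse. The HVS feature extraction and the KS-ELM / RCGA-ELM weight-selection schemes are not reproduced; the data below are random placeholders.

```python
import numpy as np

def train_elm(X, T, n_hidden=50, rng=None):
    """X: (n_samples, n_features); T: (n_samples, n_classes) one-hot targets."""
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))   # random input weights
    b = rng.uniform(-1, 1, size=n_hidden)                  # random bias values
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                 # hidden-layer outputs
    beta = np.linalg.pinv(H) @ T                           # analytic output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)                     # predicted quality class

# Hypothetical usage: four HVS-style features, five quality classes.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = rng.integers(0, 5, size=200)
T = np.eye(5)[y]                                           # one-hot targets
W, b, beta = train_elm(X, T, n_hidden=40, rng=0)
print(predict_elm(X[:5], W, b, beta))
```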

Relevance: 10.00%

Abstract:

Estimating the economic burden of injuries is important for setting priorities, allocating scarce health resources and planning cost-effective prevention activities. As a metric of burden, costs account for multiple injury consequences (death, severity, disability, body region, nature of injury) in a single unit of measurement. In a 1989 landmark report to the US Congress, Rice et al [1] estimated the lifetime costs of injuries in the USA in 1985. By 2000, the epidemiology and burden of injuries had changed enough that the US Congress mandated an update, resulting in a book on the incidence and economic burden of injury in the USA [2]. To make these findings more accessible to the larger realm of scientists and practitioners, and to provide a template for conducting the same economic burden analyses in other countries and settings, a summary [3] was published in Injury Prevention. Corso et al reported that, between 1985 and 2000, injury rates declined roughly 15%. The estimated lifetime cost of these injuries declined 20%, totalling US$406 billion, including US$80 billion in medical costs and US$326 billion in lost productivity. While incidence reflects problem size, the relative burden of injury is better expressed using costs.

Relevance: 10.00%

Abstract:

Ship seakeeping operability refers to the quantification of motion performance in waves relative to mission requirements. This is used to make decisions about preferred vessel designs, but it can also be used as a comprehensive assessment of the benefits of ship-motion-control systems. Traditionally, operability computation aggregates statistics of motion computed over the envelope of likely environmental conditions in order to determine a coefficient in the range from 0 to 1 called operability. When used for the assessment of motion-control systems, the increase in operability is taken as the key performance indicator. The operability coefficient is often given the interpretation of the percentage of time operable. This paper considers an alternative probabilistic approach to this traditional computation of operability. It characterises operability not as a number to which a frequency interpretation is attached, but as a hypothesis that a vessel will attain the desired performance in one mission, considering the envelope of likely operational conditions. This enables the use of Bayesian theory to compute the probability that this hypothesis is true, conditional on data from simulations. Thus, the metric considered is the probability of operability. This formulation not only adheres to recent developments in reliability and risk analysis, but also allows more accurate descriptions of ship-motion-control systems to be incorporated into the analysis, since the analysis is not limited to linear ship responses in the frequency domain. The paper also discusses an extension of the approach to the assessment of increased levels of autonomy for unmanned marine craft.
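As an illustration of this Bayesian reading of operability, the sketch below updates a Beta prior with the outcomes of simulated missions and reports the posterior probability (and a credible interval) that the vessel attains the desired performance in one mission; this conjugate Beta-Binomial formulation and the numbers used are assumptions, not the paper's actual model.

```python
from scipy import stats

def probability_of_operability(successes: int, trials: int, a: float = 1.0, b: float = 1.0):
    """Posterior over the per-mission success probability, given simulated outcomes
    and a Beta(a, b) prior. Returns the posterior mean and a 95% credible interval."""
    posterior = stats.beta(a + successes, b + trials - successes)
    return posterior.mean(), posterior.interval(0.95)

# Hypothetical example: 930 of 1000 simulated missions met the motion criteria.
mean, (lo, hi) = probability_of_operability(successes=930, trials=1000)
print(f"P(operable) ~ {mean:.3f}, 95% credible interval [{lo:.3f}, {hi:.3f}]")
```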

Relevance: 10.00%

Abstract:

Purpose: The post-illumination pupil response (PIPR) has been quantified using four metrics, but the spectral sensitivity of only one is known; here we determine the other three. To optimize the human PIPR measurement, we determine the protocol producing the largest PIPR, the duration of the PIPR, and the metric(s) with the lowest coefficient of variation. Methods: The consensual pupil light reflex (PLR) was measured with a Maxwellian view pupillometer. Experiment 1: The spectral sensitivity of four PIPR metrics (plateau, 6 s, and area under the curve (AUC) early and late recovery) was determined from a criterion PIPR to a 1 s pulse and fitted with a vitamin A1 nomogram (λmax = 482 nm). Experiment 2: The PLR was measured as a function of three stimulus durations (1 s, 10 s, 30 s), five irradiances spanning low to high melanopsin excitation levels (retinal irradiance: 9.8 to 14.8 log quanta.cm-2.s-1), and two wavelengths, one with high (465 nm) and one with low (637 nm) melanopsin excitation. Intra- and inter-individual coefficients of variation (CV) were calculated. Results: The melanopsin (opn4) photopigment nomogram adequately describes the spectral sensitivity of all four PIPR metrics. The PIPR amplitude was largest with 1 s short-wavelength pulses (≥ 12.8 log quanta.cm-2.s-1). The plateau and 6 s PIPR showed the least intra- and inter-individual CV (≤ 0.2). The maximum duration of the sustained PIPR was 83.0 ± 48.0 s (mean ± SD) for 1 s pulses and 180.1 ± 106.2 s for 30 s pulses (465 nm; 14.8 log quanta.cm-2.s-1). Conclusions: All current PIPR metrics provide a direct measure of the intrinsic melanopsin photoresponse. To measure progressive changes in melanopsin function in disease, we recommend that the PIPR be measured using short-duration pulses (e.g., ≤ 1 s) with high melanopsin excitation and analyzed with the plateau and/or 6 s metrics. Our PIPR duration data provide a baseline for the selection of inter-stimulus intervals between consecutive pupil testing sequences.
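Two of the metrics named above can be sketched directly from a pupil-diameter trace: the 6 s PIPR (pupil size 6 s after light offset, relative to the pre-stimulus baseline) and the plateau PIPR (mean redilation level over a late window). The baseline and plateau windows below, and the synthetic trace, are illustrative assumptions, not the study's protocol.

```python
import numpy as np

def pipr_metrics(t, diameter, stim_offset,
                 baseline_window=(-5.0, 0.0), plateau_window=(20.0, 40.0)):
    """t: time in seconds (stimulus onset at 0); diameter: pupil diameter samples;
    stim_offset: time the light pulse ends. Returns (six_second_pipr, plateau_pipr)
    as fractions of the pre-stimulus baseline diameter."""
    t, diameter = np.asarray(t, float), np.asarray(diameter, float)
    baseline = diameter[(t >= baseline_window[0]) & (t < baseline_window[1])].mean()
    six_second = np.interp(stim_offset + 6.0, t, diameter) / baseline
    late = (t >= stim_offset + plateau_window[0]) & (t <= stim_offset + plateau_window[1])
    plateau = diameter[late].mean() / baseline
    return six_second, plateau

# Synthetic trace: constriction at light onset followed by slow redilation.
t = np.arange(-5.0, 60.0, 0.1)
diameter = 6.0 - 2.0 * np.exp(-np.clip(t, 0, None) / 20.0) * (t >= 0)
print(pipr_metrics(t, diameter, stim_offset=1.0))
```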

Relevance: 10.00%

Abstract:

Purpose: The post-illumination pupil response (PIPR) has been quantified in the literature by four metrics. The spectral sensitivity of only one metric is known, and this study quantifies the other three. To optimize the measurement of the PIPR in humans, we also determine the stimulus protocol producing the largest PIPR, the duration of the PIPR, and the metric(s) with the lowest coefficient of variation. Methods: The consensual pupil light reflex (PLR) was measured with a Maxwellian view pupillometer (35.6° diameter stimulus). Experiment 1: The spectral sensitivity of four PIPR metrics (plateau, 6 s, and area under the curve (AUC) early and late recovery) was determined from a criterion PIPR (n = 2 participants) to a 1 s pulse at five wavelengths (409-592 nm) and fitted with a vitamin A nomogram (λmax = 482 nm). Experiment 2: The PLR was measured in five healthy participants [29 to 42 years (mean = 32.6 years)] as a function of three stimulus durations (1 s, 10 s, 30 s), five irradiances spanning low to high melanopsin excitation levels (retinal irradiance: 9.8 to 14.8 log quanta.cm-2.s-1), and two wavelengths, one with high (465 nm) and one with low (637 nm) melanopsin excitation. Intra- and inter-individual coefficients of variation (CV) were calculated. Results: The melanopsin (opn4) photopigment nomogram adequately described the spectral sensitivity derived from all four PIPR metrics. The largest PIPR amplitude was observed with 1 s short-wavelength pulses (retinal irradiance ≥ 12.8 log quanta.cm-2.s-1). Of the four PIPR metrics, the plateau and 6 s PIPR showed the least intra- and inter-individual CV (≤ 0.2). The maximum duration of the sustained PIPR was 83.4 ± 48.0 s (mean ± SD) for 1 s pulses and 180.1 ± 106.2 s for 30 s pulses (465 nm; 14.8 log quanta.cm-2.s-1). Conclusions: All current PIPR metrics provide a direct measure of intrinsic melanopsin retinal ganglion cell function. To measure progressive changes in melanopsin function in disease, we recommend that the intrinsic melanopsin response be measured using a 1 s pulse with high melanopsin excitation and that the PIPR be analyzed with the plateau and/or 6 s metrics. Given that the PIPR can remain constricted for as long as 3 minutes, our PIPR duration data provide a baseline for the selection of inter-stimulus intervals between consecutive pupil testing sequences.

Relevance: 10.00%

Abstract:

The possible integration of the single-electron transistor (SET) with CMOS technology is making the study of semiconductor SETs more important than that of metallic SETs; consequently, the study of energy quantization effects on semiconductor SET devices and circuits is gaining significance. In this paper, for the first time, the effects of energy quantization on SET inverter performance are examined through analytical modeling and Monte Carlo simulations. It is observed that the primary effect of energy quantization is to change the Coulomb blockade region and drain current of SET devices and, as a result, to affect the noise margin, power dissipation, and propagation delay of the SET inverter. A new model for the noise margin of the SET inverter is proposed which includes the energy quantization effects. Using the noise margin as a metric, the robustness of the SET inverter is studied against the effects of energy quantization. It is shown that an SET inverter designed with CT : CG = 1/3 (where CT and CG are the tunnel junction and gate capacitances, respectively) offers maximum robustness against energy quantization.