18 results for Nielsen, Poul Runge: EU markedsret

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 10.00%

Abstract:

This work surveys the content management systems on the market that specialize in educational environments and is divided into two parts. The first part compares the content managers Moodle, Atutor and Aula Wiki through a heuristic evaluation based on Jakob Nielsen's principles. In the second part, the content manager that scored highest in the heuristic evaluation is subjected to a user test with secondary-education teachers from several subject areas.

Relevance: 10.00%

Abstract:

This project focuses on the analysis and heuristic evaluation of a multi-player game designed for mobile phones, based on an adaptation of Nielsen's and Molich's heuristics carried out by a group of researchers at Lancaster University.

Relevance: 10.00%

Abstract:

Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows.

Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well behaved, leading to significantly larger step sizes.

Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
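As a concrete illustration of the basic τ-leap update the abstract builds on, the following minimal Python sketch applies the plain Poisson τ-leap (not the paper's RK extension) to a single decay channel X → ∅ with propensity a(x) = kx; all names and parameter values here are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def poisson_tau_leap(x0, k, tau, t_end, rng):
    """Plain Poisson tau-leap for a single decay channel X -> 0 with
    propensity a(x) = k * x.  Illustrative sketch only: the paper's RK
    extension changes how leap increments are combined, not shown here."""
    x, t = x0, 0.0
    while t < t_end and x > 0:
        a = k * x                       # propensity of the decay channel
        n_fired = rng.poisson(a * tau)  # firings drawn for the whole leap
        x = max(x - n_fired, 0)         # update copy number, clamp at zero
        t += tau
    return x

rng = np.random.default_rng(0)
# The expected value should track the deterministic decay x0 * exp(-k*t),
# i.e. roughly 1000 * exp(-1) at t_end = 1 for these parameters.
samples = [poisson_tau_leap(1000, k=1.0, tau=0.01, t_end=1.0, rng=rng)
           for _ in range(200)]
mean_final = np.mean(samples)
```

As the abstract notes, enlarging `tau` keeps this mean roughly correct while the sample variance degrades, which is the behaviour the RK coefficients are chosen to control.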

Relevance: 10.00%

Abstract:

"A recent study by the consulting firm Nielsen estimates at 49,650 million euros the annual losses worldwide arising from investment in ineffective advertising." This headline reflects a fact that causes alarm in society. Such wasted expenditure on a project that yields no benefit is a cause for concern among economists and marketing professionals. Certainly, we would understand if many people threw up their hands at the evidence of such squandering. At this point, two facts must be stated. First, we must clarify that this is a fictitious headline and the figures it gives are unreal. Unfortunately, second, the real estimated figures are, optimistically, double those cited above (1 "Advertising expenditure forecast 2008" ZenithOptimedia – Nielsen Facts 2008).

Marketing experts are currently in the midst of a search for new formulas that increase the effectiveness and reduce the cost of their advertising campaigns, and that is the terrain of viral marketing, a phenomenon in full expansion thanks to the growing importance of the Internet in our lives. This is what drove us to investigate the discipline and its methods: to discover what lay behind the "viral" label and to recognize its unstoppable growth, realizing how we had been both executioners and martyrs in the propagation of advertising messages.

In our work we want to give a new meaning to the concept of savings in viral marketing, trying to discover whether it is possible to carry out a low-cost transformation of the viral marketing concept and apply it successfully to a segment of the population in which we can monitor the effects of such a campaign; we therefore chose the students of the Ciutadella campus of the Universidad Pompeu Fabra as the target of our campaign. We also set out to analyse the savings of our campaign through a cost-result analysis and comparison with other traditional marketing channels, using a novel yet precise concept, "real advertising effectiveness", in which we draw on socio-psychological studies of consumers to establish benchmarks closer to reality than the most common methods of measuring advertising results.

To do so, it is necessary to create a completely new identity. This is where the fictitious brand, franchise, company, organization and philosophy Kimbi arises. And the challenge demands a notable effort: to make known a fictitious commercial and corporate identity that offers no service or product to a population segment, and to create expectations in the target, catch its attention, engrave our identifiers in its mind and hope that the virus of the "Kimbi movement" spreads successfully. That is, to compete in the saturated advertising landscape against a multitude of multinationals that already have loyal users and an established reputation, that quite often offer products and services free of charge, and that have vast amounts of resources and capital to advertise their commercial identity across a multitude of mainstream media in order to attract the target and make an advertising impact. We hope to have achieved, at least, that the reader feels the desire to see the work. Do you want to discover the result?

Relevance: 10.00%

Abstract:

The depth of the water table and the clay content are determinant factors for the exploitability of natural aggregates, such as the alluvial sands and gravels found in the fluvial domain of the Ter River. In this preliminary study, carried out in the Celrà basin, we conclude that these variables can be determined by means of geophysical methods, and we recommend the use of such methods in studies of a regional character.

Relevance: 10.00%

Abstract:

Background: Glycogen-depleting exercise can lead to supercompensation of muscle glycogen stores, but the biochemical mechanisms of this phenomenon are still not completely understood.

Methods: Using chronic low-frequency stimulation (CLFS) as an exercise model, the tibialis anterior muscle of rabbits was stimulated for either 1 or 24 hours, inducing a reduction in glycogen of 90% and 50% respectively. Glycogen recovery was subsequently monitored during 24 hours of rest.

Results: In muscles stimulated for 1 hour, glycogen recovered basal levels during the rest period. However, in those stimulated for 24 hours, glycogen was supercompensated and its levels remained 50% higher than basal levels after 6 hours of rest, although the newly synthesized glycogen had fewer branches. This increase in glycogen correlated with an increase in hexokinase-2 expression and activity, a reduction in the glycogen phosphorylase activity ratio and an increase in the glycogen synthase activity ratio, due to dephosphorylation of site 3a, even in the presence of elevated glycogen stores. During supercompensation there was also an increase in 5′-AMP-activated protein kinase phosphorylation, correlating with a stable reduction in ATP and total purine nucleotide levels.

Conclusions: Glycogen supercompensation requires a coordinated chain of events at two levels in the context of decreased cell energy balance: first, an increase in the glucose phosphorylation capacity of the muscle and, secondly, control of the enzymes directly involved in the synthesis and degradation of the glycogen molecule. However, supercompensated glycogen has fewer branches.

Relevance: 10.00%

Abstract:

A transformed kernel estimator suited to heavy-tailed distributions is presented. Using a transformation based on the Beta probability distribution, the choice of the bandwidth parameter is very direct. An application to insurance data is presented, and it is shown how to compute the Value at Risk.
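The transformed-kernel idea can be sketched in a few lines. This minimal Python example uses a fitted lognormal CDF as a stand-in for the paper's Beta-based transformation (an assumption for illustration only), estimates the density on the transformed scale, changes variables back, and reads off an empirical Value-at-Risk; the data and function names are hypothetical:

```python
import numpy as np
from scipy import stats

def transformed_kde_pdf(data, x_grid):
    """Sketch of a transformed kernel density estimate for heavy-tailed
    data: map observations to (0, 1) with a fitted lognormal CDF (a
    stand-in for the paper's Beta-based transformation), run a Gaussian
    KDE on the transformed scale, and change variables back."""
    shape, loc, scale = stats.lognorm.fit(data, floc=0)
    u = stats.lognorm.cdf(data, shape, loc, scale)       # map data to (0, 1)
    kde = stats.gaussian_kde(u)                          # KDE on unit scale
    u_grid = stats.lognorm.cdf(x_grid, shape, loc, scale)
    jac = stats.lognorm.pdf(x_grid, shape, loc, scale)   # change of variables
    return kde(u_grid) * jac

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=500)    # synthetic "claims"
grid = np.linspace(0.1, 10, 50)
pdf = transformed_kde_pdf(losses, grid)
# The empirical Value-at-Risk at the 95% level is a tail quantile:
var_95 = np.quantile(losses, 0.95)
```

Working on the transformed (bounded) scale is what makes the bandwidth choice straightforward for heavy tails, since the transformed sample no longer has extreme outliers.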

Relevance: 10.00%

Abstract:

An experimental method of studying shifts between concentration-versus-depth profiles of vacancy- and interstitial-type defects in ion-implanted silicon is demonstrated. The concept is based on deep level transient spectroscopy measurements utilizing the filling pulse variation technique. The vacancy profile, represented by the vacancy–oxygen center, and the interstitial profile, represented by the interstitial carbon–substitutional carbon pair, are obtained at the same sample temperature by varying the duration of the filling pulse. The effect of the capture in the Debye tail has been extensively studied and taken into account. Thus, the two profiles can be recorded with a high relative depth resolution. Using low doses, point defects have been introduced in lightly doped float zone n-type silicon by implantation with 6.8 MeV boron ions and 680 keV and 1.3 MeV protons at room temperature. The effect of the angle of ion incidence has also been investigated. For all implantation conditions the peak of the interstitial profile is displaced towards larger depths compared to that of the vacancy profile. The amplitude of this displacement increases as the width of the initial point defect distribution increases. This behavior is explained by a simple model where the preferential forward momentum of recoiling silicon atoms and the highly efficient direct recombination of primary point defects are taken into account.

Relevance: 10.00%

Abstract:

A transformed kernel estimator suited to heavy-tailed distributions is presented. Using a transformation based on the Beta probability distribution, the choice of the bandwidth parameter is very direct. An application to insurance data is presented, and it is shown how to compute the Value at Risk.

Relevance: 10.00%

Abstract:

It has been argued that a black hole horizon can support the long-range fields of a Nielsen-Olesen string and that one can think of such a vortex as black hole "hair." In this paper, we examine the properties of an Abelian Higgs vortex in the presence of a charged black hole as we allow the hole to approach extremality. Using both analytical and numerical techniques, we show that the magnetic field lines (as well as the scalar field) of the vortex are completely expelled from the black hole in the extreme limit. This was to be expected, since extreme black holes in Einstein-Maxwell theory are known to exhibit such a "Meissner effect" in general. This would seem to imply that a vortex does not want to be attached to an extreme black hole. We calculate the total energy of the vortex fields in the presence of an extreme black hole. When the hole is small relative to the size of the vortex, it is energetically favored for the hole to remain inside the vortex region, contrary to the intuition that the hole should be expelled. However, as we allow the extreme horizon radius to become very large compared to the radius of the vortex, we do find evidence of an instability. This proves that it is energetically unfavorable for a thin vortex to interact with a large extreme black hole. This would seem to dispel the notion that a black hole can support "long" Abelian Higgs hair in the extreme limit. We show that these considerations do not go through in the near-extreme limit. Finally, we discuss the implications for strings that end at black holes, as in the processes where a string snaps by nucleating black holes.

Relevance: 10.00%

Abstract:

It has been argued that a black hole horizon can support the long range fields of a Nielsen-Olesen string, and that one can think of such a vortex as black hole hair. We show that the fields inside the vortex are completely expelled from a charged black hole in the extreme limit (but not in the near extreme limit). This would seem to imply that a vortex cannot be attached to an extreme black hole. Furthermore, we provide evidence that it is energetically unfavorable for a thin vortex to interact with a large extreme black hole. This dispels the notion that a black hole can support long Abelian Higgs hair in the extreme limit.

Relevance: 10.00%

Abstract:

The problem of prediction is considered in a multidimensional setting. Extending an idea presented by Barndorff-Nielsen and Cox, a predictive density for a multivariate random variable of interest is proposed. This density has the form of an estimative density plus a correction term. It gives simultaneous prediction regions with coverage error of smaller asymptotic order than the estimative density. A simulation study is also presented showing the magnitude of the improvement with respect to the estimative method.
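A minimal Monte Carlo check makes the estimative method's shortfall concrete: for a multivariate normal with unknown mean, the plug-in (estimative) prediction region undercovers for small samples, which is exactly the gap a correction term of the kind described above is designed to close. The setup and names below are illustrative assumptions, not the paper's construction:

```python
import numpy as np
from scipy import stats

def estimative_coverage(n, dim, level=0.9, n_rep=2000, seed=0):
    """Monte Carlo coverage of an *estimative* predictive region for a
    multivariate normal with unknown mean and known identity covariance:
    the plug-in chi-squared region ignores the estimation error in the
    mean, so its actual coverage falls below the nominal level for small n."""
    rng = np.random.default_rng(seed)
    chi2_cut = stats.chi2.ppf(level, df=dim)     # plug-in region radius
    hits = 0
    for _ in range(n_rep):
        sample = rng.standard_normal((n, dim))   # training data, true mean 0
        mu_hat = sample.mean(axis=0)             # estimated mean
        z = rng.standard_normal(dim)             # future observation
        if np.sum((z - mu_hat) ** 2) <= chi2_cut:
            hits += 1
    return hits / n_rep

# With n = 5 the true coverage is about 0.85 rather than the nominal 0.9,
# since z - mu_hat has variance (1 + 1/n) per coordinate, not 1.
cov = estimative_coverage(n=5, dim=2)
```

The correction term in the predictive density compensates for exactly this extra `1/n` spread, reducing the coverage error to a smaller asymptotic order.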

Relevance: 10.00%

Abstract:

Background: Enzymatic biodiesel is becoming an increasingly popular topic in bioenergy literature because of its potential to overcome the problems posed by chemical processes. However, the high cost of the enzymatic process still remains the main drawback for its industrial application, mostly because of the high price of refined oils. Unfortunately, low cost substrates, such as crude soybean oil, often release a product that hardly accomplishes the final required biodiesel specifications and need an additional pretreatment for gums removal. In order to reduce costs and to make the enzymatic process more efficient, we developed an innovative system for enzymatic biodiesel production involving a combination of a lipase and two phospholipases. This allows performing the enzymatic degumming and transesterification in a single step, using crude soybean oil as feedstock, and converting part of the phospholipids into biodiesel. Since the two processes have never been studied together, an accurate analysis of the different reaction components and conditions was carried out.

Results: Crude soybean oil, used as low cost feedstock, is characterized by a high content of phospholipids (900 ppm of phosphorus). However, after the combined activity of different phospholipases and liquid lipase Callera Trans L, a complete transformation into fatty acid methyl esters (FAMEs >95%) and a good reduction of phosphorus (P <5 ppm) was achieved. The combination of enzymes allowed avoidance of the acid treatment required for gums removal, the consequent caustic neutralization, and the high temperature commonly used in degumming systems, making the overall process more eco-friendly and with higher yield. Once the conditions were established, the process was also tested with different vegetable oils with variable phosphorus contents.

Conclusions: Use of liquid lipase Callera Trans L in biodiesel production can provide numerous and sustainable benefits. Besides reducing the costs derived from enzyme immobilization, the lipase can be used in combination with other enzymes such as phospholipases for gums removal, thus allowing the use of much cheaper, non-refined oils. The possibility to perform degumming and transesterification in a single tank involves a great efficiency increase in the new era of enzymatic biodiesel production at industrial scale.

Relevance: 10.00%

Abstract:

Freshwater ecosystems and their biodiversity are presently seriously threatened by global development and population growth, leading to increases in nutrient inputs and intensification of eutrophication-induced problems in receiving fresh waters, particularly in lakes. Climate change constitutes another threat exacerbating the symptoms of eutrophication and species migration and loss. Unequivocal evidence of climate change impacts is still highly fragmented despite the intensive research, in part due to the variety and uncertainty of climate models and underlying emission scenarios but also due to the different approaches applied to study its effects. We first describe the strengths and weaknesses of the multi-faceted approaches that are presently available for elucidating the effects of climate change in lakes, including space-for-time substitution, time series, experiments, palaeoecology and modelling. Reviewing combined results from studies based on the various approaches, we describe the likely effects of climate changes on biological communities, trophic dynamics and the ecological state of lakes. We further discuss potential mitigation and adaptation measures to counteract the effects of climate change on lakes and, finally, we highlight some of the future challenges that we face to improve our capacity for successful prediction.