990 results for ETH Zurich
Abstract:
Social desirability and the fear of negative consequences often deter a considerable share of survey respondents from answering sensitive questions truthfully, biasing the resulting prevalence estimates. Indirect techniques for asking sensitive questions, such as the Randomized Response Technique (RRT), are intended to mitigate misreporting by providing complete concealment of individual answers. However, it is far from clear whether these indirect techniques actually produce more valid measurements than standard direct questioning. To evaluate the validity of different sensitive-question techniques, we carried out an online validation experiment on Amazon Mechanical Turk in which respondents' self-reports of norm-breaking behavior (cheating in dice games) were validated against observed behavior. This document describes the design of the validation experiment and provides details on the questionnaire, the implementations of the different sensitive-question techniques, the fieldwork, and the resulting dataset. The appendix contains a codebook of the data and facsimiles of the questionnaire pages and other survey materials.
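The concealment-and-recovery logic of the RRT can be sketched numerically. The following toy simulation (an illustration assuming a forced-response design with die-roll probabilities of 1/6, not the experiment's actual implementation) shows how a population prevalence is recovered from answers that never reveal any individual's true status:

```python
import random

def rrt_forced_response_estimate(answers, p_yes=1/6, p_no=1/6):
    """Recover prevalence from the observed 'yes' rate under forced response:
    pi = (lambda - p_yes) / (1 - p_yes - p_no)."""
    lam = sum(answers) / len(answers)  # observed proportion of 'yes' answers
    return (lam - p_yes) / (1 - p_yes - p_no)

def respond(carries_trait, p_yes=1/6, p_no=1/6, rng=random):
    """One respondent's answer: forced 'yes' or 'no' with known probabilities
    (e.g. a die showing 6 or 1), otherwise a truthful answer."""
    u = rng.random()
    if u < p_yes:
        return 1                    # forced 'yes'
    if u < p_yes + p_no:
        return 0                    # forced 'no'
    return int(carries_trait)       # truthful answer

random.seed(42)
true_prev = 0.30                    # assumed true prevalence for the toy run
answers = [respond(random.random() < true_prev) for _ in range(100_000)]
est = rrt_forced_response_estimate(answers)
```

Because each "yes" may be forced, no single answer is incriminating, yet the aggregate estimate converges on the true prevalence.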
Abstract:
Is Benford's law a good instrument for detecting fraud in reports of statistical and scientific data? For a valid test, the probabilities of false positives and false negatives have to be low. However, it is very doubtful whether the Benford distribution is an appropriate tool for discriminating between manipulated and non-manipulated estimates. Further research should focus more on the validity of the test, and test results should be interpreted more carefully.
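As a concrete illustration of the kind of test whose validity is being questioned, the sketch below (a toy example, not taken from the paper) compares a dataset's first-digit distribution against Benford's law, P(d) = log10(1 + 1/d), using Pearson's X²:

```python
import math
from collections import Counter

def benford_expected(n):
    """Expected first-digit counts for n observations under Benford's law."""
    return {d: n * math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading significant digit of a positive int or plain decimal float."""
    return int(str(abs(x)).lstrip("0.")[0])

def benford_chi2(values):
    """Pearson X^2 of observed first digits against the Benford distribution."""
    obs = Counter(first_digit(v) for v in values)
    exp = benford_expected(len(values))
    return sum((obs.get(d, 0) - exp[d]) ** 2 / exp[d] for d in range(1, 10))

# Fibonacci numbers are a classic example of close conformity to Benford's law.
fib = [1, 1]
while len(fib) < 500:
    fib.append(fib[-1] + fib[-2])
x2 = benford_chi2(fib)  # compare against the chi-square critical value, 8 df
```

The paper's point is precisely that a small X² here does not certify honest data, nor does a large X² prove manipulation; base rates and the data-generating process matter.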
Abstract:
Abstract The increasing interest in wireless sensor networks can be readily understood simply by thinking about what they essentially are: a large number of small, self-powered sensing nodes that gather information or detect special events and communicate wirelessly, with the end goal of delivering their processed data to a base station. The sensor nodes are densely deployed inside the area of interest, can be deployed at random, and have cooperative capabilities. Usually these devices are small and inexpensive, so that they can be produced and deployed in large numbers, but their resources in terms of energy, memory, computational speed, and bandwidth are severely constrained. Sensing, processing, and communication are three key elements whose combination in one tiny device gives rise to a vast number of applications. Sensor networks provide endless opportunities but at the same time pose formidable challenges, such as the fact that energy is a scarce and usually non-renewable resource. However, recent advances in low-power Very Large Scale Integration, embedded computing, communication hardware and, in general, the convergence of computing and communications are making this emerging technology a reality. Likewise, advances in nanotechnology and Micro-Electro-Mechanical Systems are pushing toward networks of tiny distributed sensors and actuators. There are different kinds of sensors, such as pressure sensors, accelerometers, cameras, thermal sensors, and microphones. They monitor conditions at different locations, such as temperature, humidity, vehicular movement, lighting conditions, pressure, soil makeup, noise levels, the presence or absence of certain kinds of objects, mechanical stress levels on attached objects, and momentary characteristics such as the speed, direction, and size of an object. The state of Wireless Sensor Networks will be reviewed, along with the best-known protocols.
As Radio Frequency Identification (RFID) is becoming increasingly widespread and important, it will be examined as well. RFID has a crucial role to play going forward, for businesses and individuals alike. The impact of wireless identification is exerting strong pressure on RFID technology and services, research and development, standards development, security and privacy compliance, and more. Its economic value has been proven in some countries, while others are still at the planning or pilot stage; wider usage has yet to take hold through the modernisation of business models and applications. Possible applications of sensor networks are of interest to the most diverse fields: environmental monitoring, warfare, child education, surveillance, micro-surgery, and agriculture are only a few examples. Some real hardware applications in the United States of America will be reviewed, as it is probably the country that has researched this area the most. Universities such as Berkeley, UCLA (University of California, Los Angeles), and Harvard, and enterprises such as Intel, are leading these investigations. But the USA is not alone in using and investigating wireless sensor networks. The University of Southampton, for example, is developing technology to monitor glacier behaviour using sensor networks, contributing to fundamental research both in glaciology and in wireless sensor networks. Coalesenses GmbH (Germany) and ETH Zurich are likewise applying wireless sensor networks in many different areas. A Spanish solution will be the one examined most thoroughly, for being innovative, adaptable, and multipurpose. This study of the sensor focuses mainly on traffic applications, though the compilation of more than 50 different applications published by the sensor's manufacturer should not be overlooked.
Currently there are many vehicle-surveillance technologies, including loop sensors, video cameras, image sensors, infrared sensors, microwave radar, GPS, etc. Their performance is acceptable but not sufficient because of their limited coverage and the high costs of implementation and, especially, maintenance. They have defects such as line-of-sight requirements, low accuracy, strong dependence on environment and weather, inability to operate continuously day and night, and high installation and maintenance costs. Consequently, in actual traffic applications the received data are insufficient or poor in real-time terms, owing to the limited number of detectors and their cost. With the increasing number of vehicles in urban road networks, vehicle-detection technologies face new requirements. Wireless sensor networks are a state-of-the-art technology and a revolution in remote information sensing and collection, with broad application prospects in intelligent transportation systems. To this end, an application for target tracking and counting using a network of binary sensors has been developed. It allows the device to spend much less energy when transmitting information and makes the devices more independent, enabling better traffic control. The application focuses on the efficacy of collaborative tracking rather than on the communication protocols used by the sensor nodes. Holiday traffic peaks are a good example of why it is necessary to keep count of the cars on the roads. To this end a Matlab simulation has been produced for target tracking and counting using a network of binary sensors, which could, for example, be implemented on the sensor developed by Libelium, the enterprise whose sensor will be examined in depth.
The promising results obtained indicate that binary proximity sensors can form the basis of a robust architecture for wide-area surveillance and tracking. When the target paths are smooth enough, the ClusterTrack particle filter algorithm gives excellent performance in identifying and tracking different target trajectories. This algorithm could, of course, be used for other applications, and that line of work could be pursued in future research. It is not surprising that binary proximity sensor networks have attracted a lot of attention lately: despite the minimal information a binary proximity sensor provides, networks of such sensors can track many different classes of targets with sufficient accuracy.
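To illustrate why even one-bit readings carry useful location information, here is a minimal sketch: a naive centroid localizer over a hypothetical sensor grid, far simpler than the ClusterTrack particle filter the thesis evaluates. Each sensor reports only whether a target is within its sensing radius, and the target position is estimated as the centroid of the triggered sensors:

```python
import math

def triggered(sensors, target, radius):
    """Binary readings: each sensor reports only whether the target is in range."""
    return [math.dist(s, target) <= radius for s in sensors]

def centroid_estimate(sensors, readings):
    """Estimate the target position as the centroid of the triggered sensors."""
    hits = [s for s, hit in zip(sensors, readings) if hit]
    if not hits:
        return None
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

# Hypothetical 5x5 sensor grid with unit spacing and sensing radius 1.2.
grid = [(x, y) for x in range(5) for y in range(5)]
target = (2.1, 1.9)
estimate = centroid_estimate(grid, triggered(grid, target, radius=1.2))
```

With dense enough deployment the centroid already lands within a fraction of the grid spacing of the true position; tracking smooth trajectories then amounts to filtering a sequence of such estimates, which is where particle filters like ClusterTrack come in.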
Abstract:
Gaining valid answers to so-called sensitive questions is an age-old problem in survey research. Various techniques have been developed to guarantee anonymity and minimize the respondent's feelings of jeopardy. Two such techniques are the randomized response technique (RRT) and the unmatched count technique (UCT). In this study we evaluate the effectiveness of different implementations of the RRT (using a forced-response design) in a computer-assisted setting and also compare the use of the RRT to that of the UCT. The techniques are evaluated according to various quality criteria, such as the prevalence estimates they provide, the ease of their use, and respondent trust in the techniques. Our results indicate that the RRTs are problematic in several respects, such as the limited trust they inspire and elevated non-response, and that the RRT estimates are unreliable due to a strong false-"no" bias, especially for the more sensitive questions. The UCT, however, performed well compared to the RRTs on all the evaluated measures, and the UCT estimates also had more face validity than the RRT estimates. We conclude that the UCT is a promising alternative to the RRT in self-administered surveys and that future research should be directed towards evaluating and improving the technique.
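The UCT estimator itself is simple enough to sketch. In the standard design (illustrated here with made-up counts, not the study's data), a control group reports how many items on an innocuous baseline list apply to them, a treatment group gets the same list plus the sensitive item, and the prevalence estimate is the difference in mean counts:

```python
from statistics import mean

def uct_estimate(control_counts, treatment_counts):
    """Unmatched count technique: prevalence of the sensitive item is the
    difference in mean item counts between the treatment group (baseline
    list plus the sensitive item) and the control group (baseline only)."""
    return mean(treatment_counts) - mean(control_counts)

# Made-up counts for a hypothetical 4-item innocuous baseline list.
control = [1, 2, 2, 3, 1, 2, 3, 2, 1, 2]    # baseline items that apply
treatment = [2, 2, 3, 3, 1, 3, 3, 2, 2, 2]  # baseline + sensitive item
prevalence = uct_estimate(control, treatment)
```

No respondent ever reveals which items apply, only how many, which is the source of the technique's face validity; the cost is a much larger sampling variance than direct questioning.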
Abstract:
The counterfactual decomposition technique popularized by Blinder (1973) and Oaxaca (1973) is widely used to study mean outcome differences between groups. For example, the technique is often used to analyze wage gaps by sex or race. The present paper summarizes the technique and addresses a number of complications such as the identification of effects of categorical predictors in the detailed decomposition or the estimation of standard errors. A new Stata command called -oaxaca- is introduced and examples illustrating its usage are given.
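The twofold decomposition can be sketched for the single-regressor case (a toy illustration of the Blinder-Oaxaca idea, not the -oaxaca- Stata command itself): the mean outcome gap splits into a part explained by group differences in the predictor and an unexplained part attributed to coefficient differences.

```python
from statistics import mean

def ols(x, y):
    """One-regressor OLS; returns (intercept, slope)."""
    xb, yb = mean(x), mean(y)
    b = (sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
         / sum((xi - xb) ** 2 for xi in x))
    return yb - b * xb, b

def oaxaca_twofold(xA, yA, xB, yB):
    """Twofold Blinder-Oaxaca decomposition of mean(yA) - mean(yB),
    using group B's coefficients as the reference."""
    aA, bA = ols(xA, yA)
    aB, bB = ols(xB, yB)
    explained = (mean(xA) - mean(xB)) * bB          # endowment differences
    unexplained = (aA - aB) + mean(xA) * (bA - bB)  # coefficient differences
    return explained, unexplained

# Made-up example: x = years of schooling, y = log wage.
xA, yA = [10, 12, 14, 16], [2.0, 2.4, 2.8, 3.2]
xB, yB = [9, 11, 13, 15], [1.8, 2.2, 2.6, 3.0]
explained, unexplained = oaxaca_twofold(xA, yA, xB, yB)
gap = mean(yA) - mean(yB)                           # explained + unexplained
```

In this constructed example both groups share the same wage equation, so the entire gap is explained by schooling differences; the complications the paper addresses (categorical predictors, standard errors) arise in the multivariate detailed decomposition.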
Abstract:
A new Stata command called -mgof- is introduced. The command is used to compute distributional tests for discrete (categorical, multinomial) variables. Apart from classic large sample $\chi^2$-approximation tests based on Pearson's $X^2$, the likelihood ratio, or any other statistic from the power-divergence family (Cressie and Read 1984), large sample tests for complex survey designs and exact tests for small samples are supported. The complex survey correction is based on the approach by Rao and Scott (1981) and parallels the survey design correction used for independence tests in -svy:tabulate-. The exact tests are computed using Monte Carlo methods or exhaustive enumeration. An exact Kolmogorov-Smirnov test for discrete data is also provided.
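The Monte Carlo exact-test idea can be sketched in a few lines (a simplified illustration of the general approach, not the -mgof- implementation, which also handles power-divergence statistics and complex survey corrections): simulate multinomial samples under the null and take the p-value as the share of simulated statistics at least as extreme as the observed one.

```python
import random

def pearson_x2(obs, probs):
    """Pearson X^2 of observed cell counts against hypothesized probabilities."""
    n = sum(obs)
    return sum((o - n * p) ** 2 / (n * p) for o, p in zip(obs, probs))

def mc_exact_pvalue(obs, probs, reps=2000, rng=random):
    """Monte Carlo exact test: draw multinomial samples under H0 and count
    how often the simulated X^2 is at least as extreme as the observed one."""
    n, k = sum(obs), len(probs)
    x2_obs = pearson_x2(obs, probs)
    cum = [sum(probs[:i + 1]) for i in range(k)]
    hits = 0
    for _ in range(reps):
        counts = [0] * k
        for _ in range(n):
            u = rng.random()
            # default k-1 guards against cum[-1] < 1 from float rounding
            counts[next((i for i, c in enumerate(cum) if u <= c), k - 1)] += 1
        if pearson_x2(counts, probs) >= x2_obs:
            hits += 1
    return (hits + 1) / (reps + 1)  # add-one correction for a valid MC p-value

random.seed(1)
p_value = mc_exact_pvalue([8, 12, 10], [1/3, 1/3, 1/3])
```

Unlike the large-sample chi-square approximation, this remains valid for small n and sparse cells, at the price of simulation noise in the p-value.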
Abstract:
Exchange between anonymous actors in Internet auctions corresponds to a one-shot prisoner's-dilemma-like situation. In any given auction, therefore, the risk is high that seller and buyer will cheat and, as a consequence, that the market will collapse. However, mutual cooperation can be attained through the simple and very efficient institution of a public rating system, which gives sellers incentives to invest in reputation in order to enhance their future chances of business. Using data from about 200 auctions of mobile phones, we empirically explore the effects of the reputation system. In general, the analysis of unobtrusive data from auctions may help to gain a deeper understanding of basic social processes of exchange, reputation, trust, and cooperation, and of the impact of institutions on the efficiency of markets. In this study we report empirical estimates of the effects of reputation on characteristics of transactions such as the probability of a successful deal, the mode of payment, and the selling price (highest bid). In particular, we try to answer the question of whether sellers receive a "premium" for reputation. Our results show that buyers are willing to pay higher prices for reputation in order to diminish the risk of exploitation. Sellers, on the other hand, protect themselves from cheating buyers by choosing an appropriate payment mode. Therefore, despite the risk of mutual opportunistic behavior, simple institutional settings lead to cooperation, relatively rare events of fraud, and efficient markets.
Abstract:
The Graduate Institute organized an academic workshop and roundtable on the occasion of EFTA's 50th Anniversary in Geneva under the chairmanship of H.E. Doris Leuthard, President of the Swiss Confederation. Pierre Sauve, Deputy Managing Director and Director of Studies, WTI, and Co-leader of the NCCR-Trade work programme on preferentialism, and Anirudh Shingal, Senior Research Fellow, WTI, and Co-leader of the NCCR-Trade work programme on impact assessment of trade, co-authored a paper on the nature of preferentialism in services trade, which Anirudh presented at the workshop. The event was extremely well attended by high-profile dignitaries and academics, including President Leuthard; the Director-General of the WTO, Pascal Lamy; the trade ministers of Brazil and Finland; Jan Kubis, Executive Secretary of the UNECE; and several current and former ambassadors. The academic workshop, moderated by Theresa Carpenter (Graduate Institute, Geneva), began in the morning with Prof. Victor Norman's (Norwegian School of Economics & Business Administration) presentation on the future of EFTA. Other presentations included those by Prof. Peter Egger (ETH Zurich) on the structural estimation of gravity models with market entry dynamics and by Prof. Richard Baldwin (Graduate Institute, Geneva) on 21st-century regionalism. The high-profile panel in the afternoon, moderated by Prof. Richard Baldwin, was led by President Leuthard, who spoke on free trade agreements and the multilateral trading system in 2020. The keynote address at the panel was delivered by Prof. Jagdish Bhagwati (Columbia University), who spoke on strengthening defences against protectionism and liberalizing trade.
Abstract:
Marine invertebrates representing at least five phyla are symbiotic with dinoflagellates from the genus Symbiodinium. This group of single-celled protists was once considered to be a single pandemic species, Symbiodinium microadriaticum. Molecular investigations over the past 25 years have revealed, however, that Symbiodinium is a diverse group of organisms with at least eight (A-H) divergent clades that in turn contain multiple molecular subclade types. The diversity within this genus may subsequently determine the response of corals to normal and stressful conditions, leading to the proposal that the symbiosis may impart unusually rapid adaptation to environmental change by the metazoan host. These questions have added importance due to the critical challenges that corals and the reefs they build face as a consequence of current rapid climate change. This review outlines our current understanding of the diverse genus Symbiodinium and explores the ability of this genus and its symbioses to adapt to rapid environmental change. (c) 2006 Rubel Foundation, ETH Zurich. Published by Elsevier GmbH. All rights reserved.
Abstract:
The year 14,226 BP marks an important boundary in the current radiocarbon (14C) calibration curve: the high resolution and precision characterising the first part (0–14,226 BP) of the curve are due to tree-ring datasets, which directly provide the atmospheric 14C content at the time of tree-ring formation with high resolution. Resolution and precision systematically decrease going back in time, where only a few floating tree-ring chronologies alternate with other low-resolution records. The lack of resolution in dating before 14,226 years BP leads to significant issues in interpreting and untangling difficult episodes of our past in the field of human evolution. Research on sub-fossil trees and the construction of new Glacial tree-ring chronologies can significantly improve radiocarbon dating in terms of temporal resolution and precision back to 55,000 years BP, helping to resolve open questions in human evolutionary history. This thesis presents the dendrochronological study, radiocarbon dating, and extraction of environmental and climate information from sub-fossil trees found on the Portuguese foreshore, remnants of a Glacial lagoonal forest. Careful sampling, dendrochronological measurement and cross-dating, application of the most suitable cellulose-extraction protocol, and the most advanced technologies of the MICADAS system at ETH Zurich led to the construction of a new 220-year tree-ring site chronology and to high-resolution, highly reliable radiocarbon ages with tight error ranges. At present it is impossible to date this radiocarbon sequence absolutely by comparing the Δ14C of the trees with 10Be fluctuations from ice cores. For this reason, tree-growth analysis, comparison with a living pine stand, and reconstruction of forest-fire history have made it possible to hypothesise site and climate characteristics that help constrain the positioning in time of the obtained radiocarbon sequence.
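For readers unfamiliar with the convention, an uncalibrated (conventional) radiocarbon age is derived from the measured fraction of modern carbon using the Libby mean life of 8033 years; the calibration curve discussed above is what turns such ages into calendar dates. A minimal sketch of the conversion:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; radiocarbon ages conventionally use Libby's value

def conventional_14c_age(f14c):
    """Uncalibrated radiocarbon age in years BP from fraction modern carbon:
    t = -8033 * ln(F14C)."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# A sample retaining half of its modern 14C activity dates to one Libby
# half-life, about 5568 years.
age = conventional_14c_age(0.5)
```

Near the 55,000-year BP limit mentioned above, F14C falls close to measurement background, which is why high-precision AMS systems such as MICADAS and careful cellulose extraction are essential.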