780 results for Menu fraud
Abstract:
Antike Theater und Masken online contains around 800 color photographs of ancient theaters and 600 of masks from the countries around the Mediterranean, from Western Europe, and from numerous museums. The collection of the Basel theater historian Karl Gotthilf Kachler (1906-2000) was assembled between the late 1950s and the early 1980s on numerous research trips. In 1999 he handed over the originally more than 5,000 slides to the Institut für Theaterwissenschaft (ITW) of the University of Bern for processing. In 2003 the catalogue Antike Theater und Masken was published by Chronos Verlag Zürich, with an accompanying DVD presenting a representative selection of 1,400 photographs that had been thematically contextualized and annotated at the ITW by Sara Aebi and Regula Brunner. Antike Theater und Masken online is edited by Andreas Kotte, director of the ITW Bern and project leader.
Abstract:
Crowdsourcing linguistic phenomena with smartphone applications is relatively new. In linguistics, apps have predominantly been developed to create pronunciation dictionaries, to train acoustic models, and to archive endangered languages. This paper presents the first account of how apps can be used to collect data suitable for documenting language change: we created an app, Dialäkt Äpp (DÄ), which predicts users’ dialects. For 16 linguistic variables, users select a dialectal variant from a drop-down menu. DÄ then geographically locates the user’s dialect by suggesting a list of communes where dialect variants most similar to their choices are used. Underlying this prediction are 16 maps from the historical Linguistic Atlas of German-speaking Switzerland, which documents the linguistic situation around 1950. Where users disagree with the prediction, they can indicate what they consider to be their dialect’s location. With this information, the 16 variables can be assessed for language change. Thanks to the playfulness of its functionality, DÄ has reached many users; our linguistic analyses are based on data from nearly 60,000 speakers. Results reveal relative stability for the phonetic variables, while the lexical and morphological variables seem more prone to change. Crowdsourcing large amounts of dialect data with smartphone apps has the potential to complement existing data collection techniques and to provide evidence that traditional methods cannot, with normal resources, hope to gather. Nonetheless, it is important to emphasize a range of methodological caveats, including sparse knowledge of users’ linguistic backgrounds (users indicate only their age and sex) and users’ self-declaration of their dialect. These are discussed and evaluated in detail here. The findings remain intriguing nevertheless: as a means of quality control, we report that traditional dialectological methods have revealed trends similar to those found by the app, which underlines the validity of the crowdsourcing method. We are presently extending the DÄ architecture to other languages.
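The prediction step described in this abstract lends itself to a compact illustration. Below is a minimal sketch of such commune matching, assuming a hypothetical atlas lookup table and hypothetical variable names; it is not the actual Dialäkt Äpp implementation.

```python
# Minimal sketch of the dialect-locating logic described above (not the
# actual Dialäkt Äpp code): each commune is scored by how many of the
# user's chosen variants match the atlas entry for that commune.
# The atlas data and variable names are hypothetical placeholders.
from collections import Counter

# atlas[commune][variable] -> variant recorded around 1950 (hypothetical)
atlas = {
    "Bern":   {"v01": "a", "v02": "x"},
    "Zürich": {"v01": "b", "v02": "x"},
    "Basel":  {"v01": "a", "v02": "y"},
}

def predict_communes(user_choices, top_n=3):
    """Rank communes by the number of variables matching the user's choices."""
    scores = Counter()
    for commune, variants in atlas.items():
        scores[commune] = sum(
            variants.get(var) == choice for var, choice in user_choices.items()
        )
    return scores.most_common(top_n)

print(predict_communes({"v01": "a", "v02": "x"}))  # [('Bern', 2), ...]
```

In the real app the same idea would run over all 16 variables and the full set of communes from the atlas maps.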
Abstract:
Consumers are often less satisfied with a product chosen from a large assortment than with one chosen from a limited assortment. Experienced choice difficulty presumably causes this, as consumers have to engage in a great number of individual comparisons. In two studies we tested whether partitioning the choice task, so that consumers decide sequentially on each individual attribute, may provide a solution. In a Starbucks coffee house, consumers who chose from the menu rated the coffee as less tasty when it was chosen from a large rather than a small assortment. However, when consumers chose it by deciding sequentially about one attribute at a time, the effect reversed. In a tailored-suit customization task, consumers who chose multiple attributes at a time were less satisfied with their suit than those who chose one attribute at a time. Sequential attribute-based processing thus proves to be an effective strategy for reaping the benefits of a large assortment.
Abstract:
Background: It is as yet unclear whether there are differences between using electronic key feature problems (KFPs) and electronic case-based multiple choice questions (cbMCQs) for the assessment of clinical decision making. Summary of Work: Fifth-year medical students took part in clerkships which ended with a summative exam. Knowledge was assessed per exam with 6-9 KFPs, 9-20 cbMCQs, and 9-28 MC questions. Each KFP consisted of a case vignette and three key features (KFs) using a “long menu” as the question format. We sought students’ perceptions of the KFPs and cbMCQs in focus groups (n of students=39). Furthermore, statistical data from 11 exams (n of students=377) concerning the KFPs and (cb)MCQs were compared. Summary of Results: The analysis of the focus groups resulted in four themes reflecting students’ perceptions of KFPs and their comparison with (cb)MCQs: KFPs were perceived as (i) more realistic, (ii) more difficult, and (iii) more motivating for the intense study of clinical reasoning than (cb)MCQs, and (iv) showed good overall acceptance when some preconditions are taken into account. The statistical analysis revealed no difference in difficulty; however, KFPs showed higher discrimination and reliability (G-coefficient) even when corrected for testing times. Correlation of the different exam parts was intermediate. Conclusions: Students perceived the KFPs as more motivating for the study of clinical reasoning. Statistically, KFPs showed higher discrimination and higher reliability than cbMCQs. Take-home messages: Including KFPs with long-menu questions in summative clerkship exams seems to offer positive educational effects.
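For readers unfamiliar with the item statistics mentioned here, the following is an illustrative sketch of one standard discrimination index, the corrected point-biserial correlation, computed on hypothetical scores; it is not the study's actual analysis, and the reported G-coefficient (a generalizability-theory reliability estimate) is not reproduced.

```python
# Illustrative sketch of a common item-discrimination index (corrected
# point-biserial correlation) on hypothetical data; not the study's own
# analysis, and not the G-coefficient it reports.
import statistics

def corrected_point_biserial(item_scores, total_scores):
    """Correlate an item's scores with the rest-of-test score (total minus item)."""
    rest = [t - i for i, t in zip(item_scores, total_scores)]
    return statistics.correlation(item_scores, rest)  # requires Python 3.10+

# Hypothetical: 6 students, one dichotomous item, total exam scores.
item = [1, 0, 1, 1, 0, 1]
total = [18, 9, 15, 17, 11, 20]
print(round(corrected_point_biserial(item, total), 2))
```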
Abstract:
Research question/Introduction: It is unclear to what extent there are differences between using key feature problems (KFPs) with long-menu questions and case-based type A questions (FTA) for assessing clinical reasoning in the clinical training of medical students. Methods: Fifth-year medical students took part in their clinical pediatrics rotation, which ended with a summative exam. Knowledge was assessed electronically per exam with 6-9 KFPs [1], [3], 9-20 FTAs, and 9-28 non-case-based multiple choice questions (NFTA). Each KFP consisted of a case vignette and three key features and used a so-called long menu [4] as the answer format. We examined students' perceptions of the KFPs and FTAs in focus groups [2] (n of students=39). Furthermore, the statistical parameters of the KFPs and FTAs from 11 exams (n of students=377) were compared. Results: The analysis of the focus groups yielded four themes reflecting the perception of the KFPs and their comparison with FTAs: KFPs were perceived as (1) more realistic, (2) more difficult, and (3) more motivating for intensive self-study of clinical reasoning than FTAs, and (4) showed good overall acceptance provided certain preconditions are taken into account. The statistical analysis showed no difference in difficulty; however, the KFPs showed higher discrimination and reliability (G-coefficient) even when corrected for testing time. The correlation of the different exam parts was intermediate. Discussion/Conclusion: Students experienced the KFPs as more motivating for self-study of clinical reasoning. Statistically, the KFPs showed higher discrimination and higher reliability than the FTAs. Including KFPs with long menus in exams of the clinical phase of study appears promising and seems to have an educational effect.
Abstract:
Background. Diets high in fat and calories are promoted by the toxic food environment, in which high-fat, high-calorie foods are readily accessible, thus contributing to high rates of overweight and obesity. Hypothesis. Changing the food environment to make low-fat, low-calorie foods readily identifiable and accessible, while simultaneously offering incentives for choosing those foods, will result in increased consumption of the targeted foods, thus decreasing caloric and fat intake and ultimately decreasing obesity rates. Objective. To conduct an outcome evaluation study of the effectiveness of The Fresh & Healthy Program, a health promotion project designed to promote healthy eating among The Methodist Hospital employees by labeling and promoting low-calorie, low-fat items in the hospital cafeteria. Program. By promoting healthy eating, this program seeks to address unhealthy dietary behaviors, one of the most widely known and influential behavioral causes of obesity. Food items included in the program meet nutritional criteria for calories and fat and are labeled with a special logo. Program participants receive incentives for purchasing Fresh & Healthy items. The program was designed and implemented by a team of registered dietitians, two health education specialists, and retail foodservice managers at The Methodist Hospital in the Texas Medical Center in Houston and has been in existence since April 2006. Methods. The evaluation uses a non-randomized, one-group, time-series design to evaluate the effect of the program on sales of targeted food items. Key words. point-of-purchase, menu labeling, environmental obesity interventions, food pricing interventions
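A one-group time-series design of the kind named in the Methods is often analyzed with segmented regression. The following is a minimal sketch on simulated monthly sales, assuming a single level change at program launch; the data and model are hypothetical, not the study's actual analysis.

```python
# Illustrative sketch of a one-group interrupted time-series analysis
# (segmented regression) on monthly sales of labeled items; hypothetical
# simulated data, not the study's own analysis.
import numpy as np

months = np.arange(24)                     # 12 months pre, 12 months post
post = (months >= 12).astype(float)        # indicator: after program launch
sales = 100 + 0.5 * months + 15 * post \
        + np.random.default_rng(0).normal(0, 3, 24)

# Design matrix: intercept, secular trend, level change at launch.
X = np.column_stack([np.ones_like(months, dtype=float), months, post])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"estimated level change after launch: {coef[2]:.1f} sales units")
```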
Abstract:
Background. Modern-day slavery, known today as human trafficking, is a growing pandemic and a grave human rights violation. Estimates suggest that 12.3 million people are working under conditions of force, fraud, or coercion. Working toward eradication is a worthy effort; it would free millions of people, mostly women and children, from slavery, as well as uphold basic human rights. One tactic for eradicating human trafficking is to increase the identification of victims among those likely to encounter them. Purpose. This study aims to develop an intervention that improves certain stakeholders' ability, in the health clinic setting, to appropriately identify and report victims of human trafficking to the National Human Trafficking Resource Center. Methods. The Intervention Mapping (IM) process was used by program planners to develop an intervention for health professionals. This methodology is a six-step process that guides program planners in developing an intervention. Each step builds on the others through the execution of a needs assessment and the development of matrices based on performance objectives and determinants of the targeted health behavior. The end product is an ecological, theoretical, and evidence-based intervention. Discussion. The IM process served as a useful protocol for program planners to take an ecological approach as well as to incorporate theory and evidence into the intervention. Consultation with key informants, the planning group, adopters, implementers, and the individuals responsible for institutionalization also contributed to the practicality and feasibility of the intervention. Program planners believe that this intervention fully meets the recommendations set forth in the literature. Conclusions. The Intervention Mapping methodology enabled program planners to develop an intervention that is appropriate and acceptable to the implementer and the recipients.
Abstract:
This paper reconstructs the trajectory of the fraudulent regime during the government of Agustín Justo (1932-1938). The coercive actions deployed by the caudillos on election day were its most visible feature, but the instrumentation of fraud had deeper reach. Throughout this experience, the open exercise of coercion against the main opposition party and the trampling of citizens' rights were accompanied by a reorganization of the State that broke the equilibrium between the branches of government, subordinating the judiciary and the legislature to the directives of the governing group. This text identifies the practices and decisions deployed by the political parties of the opposition and of the ruling camp in relation to this process.
Abstract:
Democracy is not necessarily consolidated simply by the introduction of formal democratic institutions. It is often observed in new democracies that democratic institutions are neglected and eroded in actual practice. In particular, electoral fraud committed by a ruler is one of the main problems in this regard. This paper deals with two questions: (1) under what conditions does a ruler have an incentive to hold fair elections (or to rig them), and (2) what makes a ruler prefer to establish an independent election-governing institution? Assuming that a ruler prefers to maintain her power, she basically has an incentive to rig elections in order to be victorious in the political competition. A ruler, however, faces the risk of losing power if the opposition stages successful protests on a sufficiently large scale. If opponents are able to pose a credible threat to a ruler, she will have an incentive to hold fair elections. The problem is that information on electoral fraud is not shared by every player in the game. For the opposition, imperfect information deepens their coordination problems. Imperfect information, on the other hand, in some cases causes a problem for the ruler: if the opposition is sufficiently cohesive and has little tolerance for cheating, even unverified suspicions of fraud may trigger menacing protests. In such a case, the ruler has an incentive to establish an independent election commission to avoid unnecessary collisions by revealing the true nature of the elections.
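One way to make the incentive argument concrete is a minimal expected-payoff comparison; the notation below is purely illustrative and not the paper's own model.

```latex
% Illustrative notation, not the paper's own model.
% B: value of holding power; q: probability of winning a fair election;
% p: probability that rigging triggers a successful protest that ousts
%    the ruler. Rigging is preferred iff
\[
  (1 - p)\,B \;>\; q\,B
  \quad\Longleftrightarrow\quad
  p \;<\; 1 - q ,
\]
% so a credible opposition threat (a sufficiently large p) makes fair
% elections the ruler's best response.
```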
Abstract:
The difficulty of holding fair elections continues to be a critical problem in many newly democratized countries. The core of the problem is the electoral administration's lack of political autonomy and of the capability to regulate fraud. This paper seeks to identify the conditions for establishing an autonomous and capable electoral administration system. An electoral administration system has two main functions: to disclose the nature of elections and to prevent fraud. We argue in this paper that an autonomous and capable electoral administration system exists if the major political players have an incentive to disclose information on the elections and to secure the ruler's credible commitment to fair elections. We examine this argument through comparative case studies of Korea and the Philippines. Despite similar historical and institutional settings, their election commissions exhibit contrasting features. The difference in the incentive structures of the major political players seems to have caused the divergence in the institutional evolution of the election commissions in the two countries.
Abstract:
To date, big data applications have focused on the store-and-process paradigm. In this paper we describe an initiative to deal with big data applications for continuous streams of events. In many emerging applications, the volume of data being streamed is so large that the traditional 'store-then-process' paradigm is either not suitable or too inefficient. Moreover, soft real-time requirements may severely constrain the engineering solutions. Many scenarios fit this description. In network security for cloud data centres, for instance, very high volumes of IP packets and events from sensors at firewalls, network switches, routers, and servers need to be analyzed to detect attacks in minimal time, in order to limit the effect of the malicious activity on the IT infrastructure. Similarly, in the fraud department of a credit card company, payment requests must be processed online, as quickly as possible, in order to provide meaningful results in real time. An ideal system would detect fraud during the authorization process, which lasts hundreds of milliseconds, and deny the payment authorization, minimizing the damage to the user and the credit card company.
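The process-on-the-fly idea described here can be illustrated with a toy velocity rule evaluated per authorization, without ever storing the stream. The fields, thresholds, and rule below are hypothetical placeholders, not a real fraud system.

```python
# Minimal sketch of processing authorizations on the fly: each event is
# scored against a per-card sliding window instead of being stored first.
# Fields and thresholds are hypothetical (amount is unused in this toy rule).
from collections import defaultdict, deque

WINDOW_SECONDS = 300
MAX_TXNS_IN_WINDOW = 5          # hypothetical velocity rule

recent = defaultdict(deque)     # card_id -> timestamps of recent authorizations

def authorize(card_id, timestamp, amount):
    """Return False (deny) if the card exceeds the velocity rule in-window."""
    window = recent[card_id]
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()        # evict events outside the sliding window
    window.append(timestamp)
    return len(window) <= MAX_TXNS_IN_WINDOW

# A few events from a hypothetical stream:
for t in (0, 10, 20, 30, 40, 50):
    print(authorize("card-42", t, 19.90))   # the sixth request is denied
```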
Abstract:
Digital signage is a digital communications technology that has been used in recent years to replace traditional printed advertising. It improves the presentation and promotion of advertised products and facilitates the exchange of information thanks to its placement in public or outdoor locations. The applications of this new advertising method are varied, ranging from private settings in companies to public places such as shopping malls. Although the first and main use of digital signage is advertising that prompts users to purchase products, the possibility of offering more information about particular items through new technologies is also very important in this field. The application produced in this project is a program developed in Adobe Flash and driven by XML. Through a touch screen, a museum visitor can interactively access a menu showing the different styles of art in a given period of history. A timeline gives access to information about each object on display in the exhibition, together with images of the object's most important details, which cannot be seen with the naked eye because handling the objects is not permitted. The interactive screen serves the exhibition visitor as an extra tool for gathering information about what they are seeing, through a new technology that is easy for everyone to use, since it only requires one's own hands. Ease of use is very important in applications like this, since the end user may have no technological background, so the information must be presented clearly. In conclusion, digital signage is an expanding market and companies should invest in content development: technologies will advance even if digital signage does not, and this sector could be very useful in the near future, since the information it can convey to viewers everywhere is far more effective and useful than that provided by a simple printed poster on a billboard.
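To illustrate what an XML-driven timeline of this kind might look like, here is a minimal sketch that parses a hypothetical exhibit description; the tag and attribute names are invented for illustration and are not the project's actual schema or Flash code.

```python
# Illustrative sketch (not the project's actual implementation): how an
# XML-driven museum timeline could be modeled and read. Tag and attribute
# names are hypothetical placeholders.
import xml.etree.ElementTree as ET

TIMELINE_XML = """
<timeline>
  <period name="Baroque" start="1600" end="1750">
    <object id="obj-01" title="Gilded chalice">
      <detail image="chalice_engraving.jpg" caption="Engraved base"/>
    </object>
  </period>
</timeline>
"""

root = ET.fromstring(TIMELINE_XML)
for period in root.findall("period"):
    print(period.get("name"), period.get("start"), "-", period.get("end"))
    for obj in period.findall("object"):
        print("  object:", obj.get("title"))
        for detail in obj.findall("detail"):
            print("    detail image:", detail.get("image"))
```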
Abstract:
In recent years, applications in domains such as telecommunications, network security, and large-scale sensor networks have shown the limits of the traditional store-then-process paradigm. In this context, Stream Processing Engines emerged as a candidate solution for all these applications demanding high processing capacity with low processing-latency guarantees. With Stream Processing Engines, data streams are not persisted but rather processed on the fly, producing results continuously. Current Stream Processing Engines, whether centralized or distributed, do not scale with the input load due to single-node bottlenecks. Moreover, they are based on static configurations that lead to either under- or over-provisioning. This Ph.D. thesis discusses StreamCloud, an elastic parallel-distributed stream processing engine that enables the processing of large data stream volumes. StreamCloud minimizes the distribution and parallelization overhead by introducing novel techniques that split queries into parallel subqueries and allocate them to independent sets of nodes. Moreover, StreamCloud's elastic and dynamic load-balancing protocols enable effective adjustment of resources depending on the incoming load. Together with the parallelization and elasticity techniques, StreamCloud defines a novel fault tolerance protocol that introduces minimal overhead while providing fast recovery. StreamCloud has been fully implemented and evaluated using several real-world applications, such as fraud detection and network analysis applications. The evaluation, conducted on a cluster with more than 300 cores, demonstrates the scalability of StreamCloud and the effectiveness of its elasticity and fault tolerance mechanisms.
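The query-splitting idea summarized above rests on partitioning the input stream so that independent nodes can run the same subquery on disjoint slices of the data. The following is a minimal sketch of such hash partitioning; it is an illustration of the general technique, not StreamCloud's actual implementation.

```python
# Minimal sketch of hash-partitioning a stream across parallel subquery
# nodes (illustrative only, not StreamCloud's code). A production system
# would use a stable hash rather than Python's salted built-in hash().
from collections import defaultdict

NUM_NODES = 4

def route(event):
    """Assign an event to a subquery node by hashing its partitioning key."""
    return hash(event["card_id"]) % NUM_NODES

partitions = defaultdict(list)
stream = [
    {"card_id": "A", "amount": 10.0},
    {"card_id": "B", "amount": 99.0},
    {"card_id": "A", "amount": 12.5},
]
for event in stream:
    partitions[route(event)].append(event)  # same key -> same node

for node, events in sorted(partitions.items()):
    print(f"node {node} processes {len(events)} event(s)")
```

Because all events with the same key land on the same node, per-key state (for example, a per-card fraud window) never needs to be shared across nodes, which is what makes the parallelization overhead small.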