26 results for DDM Data Distribution Management testbed benchmark design implementation instance generator
Abstract:
While mobile technologies can provide highly personalized services to mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where a user's behavior, movements, and habits can be associated with his or her personal identity. In this thesis, I study privacy issues in the mobile context, focusing in particular on the design of an adaptive privacy management system for context-aware mobile devices, and explore the role of personalization and of control over a user's personal data. This work makes multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive Single Sign-On solution that uses a user's context information to protect private information on smartphones. To validate this solution, I first show that a user's context is a unique user identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. Following a design science research paradigm, I then implement this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I build on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examine the roles of personalization and of control in my model, and how these two elements interact with privacy calculus and the mobile technology acceptance model. In the practical realm, this thesis contributes to the understanding of the tradeoff between the benefits of personalized services and the privacy concerns they may cause.
By pointing out new opportunities to rethink how a user's context information can protect private data, it also suggests new elements for privacy-related business models.
Abstract:
Natural genetic variation can have a pronounced influence on human taste perception, which in turn may influence food preference and dietary choice. Genome-wide association studies (GWAS) represent a powerful tool to understand this influence. To help optimize the design of future genome-wide association studies of human taste perception, we used the well-known TAS2R38-PROP association as a tool to determine the relative power and efficiency of different phenotyping and data-analysis strategies. The results show that the choice of both the data collection and the data processing scheme can have a very substantial impact on the power to detect genotypic variation that affects chemosensory perception. Based on these results, we provide practical guidelines for the design of future GWAS on chemosensory phenotypes. Moreover, in addition to the TAS2R38 gene, past studies have implicated a number of other genetic loci in taste sensitivity to PROP and the related bitter compound PTC. None of these other loci showed genome-wide significant associations in our study. To facilitate further, target-gene-driven studies of PROP taste perception, we provide the genome-wide list of p-values for all SNPs genotyped in the current study.
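The claim that the phenotyping scheme affects power can be illustrated with a small Monte-Carlo sketch: dichotomizing a quantitative taste rating into a binary taster/non-taster phenotype discards information and lowers the power of an association test. All settings below (sample size, effect size, significance threshold) are illustrative assumptions, not values from the study.

```python
import math
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def assoc_power(n=60, beta=0.5, alpha_z=2.576, reps=500, dichotomize=False, seed=1):
    """Monte-Carlo power of a correlation test between genotype dosage
    (0/1/2, allele frequency 0.5) and a phenotype; optionally the
    phenotype is first split at its median into a binary trait."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        g = [rng.randint(0, 1) + rng.randint(0, 1) for _ in range(n)]
        y = [beta * gi + rng.gauss(0.0, 1.0) for gi in g]
        if dichotomize:
            cut = sorted(y)[n // 2]
            y = [1.0 if yi >= cut else 0.0 for yi in y]
        r = pearson(g, y)
        z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)  # Fisher z
        if abs(z) > alpha_z:
            hits += 1
    return hits / reps

power_quant = assoc_power(dichotomize=False)
power_binary = assoc_power(dichotomize=True)
```

With these toy settings the quantitative phenotype retains more of the genotype signal, so its estimated power exceeds that of the median-split binary phenotype, mirroring the abstract's point that data collection and processing choices matter.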
Abstract:
Despite the increasing popularity of enterprise architecture management (EAM) in practice, many EAM initiatives either fail or do not fully meet their targets. Several frameworks have been suggested as guidelines for EA implementation, but companies seldom follow prescriptive frameworks. Instead, they follow very diverse implementation approaches that depend on their organizational contingencies and on the way they adopt and evolve EAM over time. This research strives for a broader understanding of EAM by exploring context-dependent EAM adoption approaches and by identifying the main EA principles that affect EA effectiveness. Based on two studies, this dissertation addresses two main questions: (1) EAM design: which approaches do companies follow when adopting EAM? (2) EA principles and their impact: what impact do EA principles have on EA effectiveness and quality? By utilizing both qualitative and quantitative research methods, this research contributes to exploring different EAM designs under different organizational contingencies, as well as to using EA principles as an effective means to achieve a principle-based EAM design. This research can help companies identify a suitable EAM design that fits their organizational settings and shape their EA through a set of principles.
Abstract:
We present models predicting the potential distribution of a threatened ant species, Formica exsecta Nyl., in the Swiss National Park (SNP). Data to fit the models were collected according to a random-stratified design with an equal number of replicates per stratum. The basic aim of such a sampling strategy is to allow formal testing of biological hypotheses about the factors most likely to account for the distribution of the modeled species. The stratifying factors used in this study were vegetation, slope angle, and slope aspect, the latter two serving as surrogates for solar radiation, considered one of the basic requirements of F. exsecta. The results show that, although the basic stratifying predictors account for more than 50% of the deviance, incorporating additional, non-spatially-explicit predictors measured in the field increases model performance (to nearly 75%). However, this was not corroborated by permutation tests. Implementation on a national scale was carried out for one model only, owing to the difficulty of obtaining similar predictors at this scale. The resulting national-scale map suggests that the species may once have had a broader distribution in Switzerland. Its particular abundance within the SNP may be related to habitat fragmentation and vegetation transformation outside the SNP boundaries.
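The random-stratified design described above can be sketched in a few lines: partition candidate survey plots by stratum (here a vegetation x slope x aspect combination, with illustrative class names) and draw an equal number of replicates from each. The plot ids, strata, and counts are hypothetical.

```python
import itertools
import random

def stratified_sample(plots, strata_of, n_per_stratum, seed=0):
    """Draw an equal number of survey plots at random from every stratum.
    `plots` is a list of plot ids; `strata_of` maps a plot id to its
    stratum, e.g. a (vegetation, slope, aspect) tuple."""
    rng = random.Random(seed)
    by_stratum = {}
    for p in plots:
        by_stratum.setdefault(strata_of[p], []).append(p)
    sample = {}
    for stratum, members in sorted(by_stratum.items()):
        sample[stratum] = rng.sample(members, n_per_stratum)
    return sample

# Illustrative strata: 2 vegetation classes x 2 slope classes x 2 aspects.
strata = list(itertools.product(["meadow", "forest"],
                                ["gentle", "steep"],
                                ["south", "north"]))
plots = list(range(400))
strata_of = {p: strata[p % len(strata)] for p in plots}
design = stratified_sample(plots, strata_of, n_per_stratum=5)
```

Equal replication per stratum is what later allows a balanced, formal comparison of the stratifying factors' contributions to the fitted model.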
Abstract:
Time-lapse geophysical measurements are widely used to monitor the movement of water and solutes through the subsurface. Yet commonly used deterministic least-squares inversions typically suffer from relatively poor mass recovery, spread overestimation, and a limited ability to appropriately estimate nonlinear model uncertainty. We describe herein a novel inversion methodology designed to reconstruct the three-dimensional distribution of a tracer anomaly from geophysical data and to provide consistent uncertainty estimates using Markov chain Monte Carlo simulation. Posterior sampling is made tractable by using a lower-dimensional model space related both to the Legendre moments of the plume and to predefined morphological constraints. Benchmark results using cross-hole ground-penetrating radar travel time measurements during two synthetic water tracer application experiments involving increasingly complex plume geometries show that the proposed method not only conserves mass but also provides better estimates of plume morphology and posterior model uncertainty than deterministic inversion results.
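The core idea, sampling the posterior in a low-dimensional parameterization (such as plume moments) rather than over voxel values, can be sketched with a toy random-walk Metropolis sampler. The linear forward operator, noise level, and two-parameter "moment" vector below are illustrative assumptions, not the paper's actual GPR physics.

```python
import math
import random

# Toy linear forward model d = G m, with m a low-dimensional vector of
# plume "moments" rather than a full 3-D voxel image (hypothetical numbers).
G = [[1.0, 0.2],
     [0.3, 1.0],
     [0.5, 0.5]]
m_true = [1.0, 0.5]
sigma = 0.05  # data noise standard deviation

def forward(m):
    return [sum(gij * mj for gij, mj in zip(row, m)) for row in G]

def log_likelihood(m, d_obs):
    return -0.5 * sum((dp - do) ** 2 for dp, do in zip(forward(m), d_obs)) / sigma ** 2

def metropolis(d_obs, n_iter=5000, step=0.05, seed=7):
    """Random-walk Metropolis in the reduced model space."""
    rng = random.Random(seed)
    m = [0.0, 0.0]
    ll = log_likelihood(m, d_obs)
    chain, accepted = [], 0
    for _ in range(n_iter):
        cand = [mi + rng.gauss(0.0, step) for mi in m]
        ll_cand = log_likelihood(cand, d_obs)
        if math.log(rng.random()) < ll_cand - ll:  # Metropolis accept/reject
            m, ll = cand, ll_cand
            accepted += 1
        chain.append(list(m))
    return chain, accepted / n_iter

rng = random.Random(3)
d_obs = [d + rng.gauss(0.0, sigma) for d in forward(m_true)]
chain, acc_rate = metropolis(d_obs)
burn = chain[1000:]                       # discard burn-in samples
m_est = [sum(s[j] for s in burn) / len(burn) for j in range(2)]
```

Because the chain explores only two parameters instead of thousands of voxels, posterior sampling stays tractable, which is the efficiency argument the abstract makes for the moment-based parameterization.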
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources.
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
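The propensity-score matching step described above can be sketched as follows: fit a logistic model of membership in the posttest sample on the balancing covariates, then pair each posttest respondent with the nearest pretest respondent on the estimated score. The hand-rolled gradient-descent fit, synthetic respondents, and 1-nearest-neighbor matching with replacement are all simplifying assumptions for illustration.

```python
import math
import random

def fit_logistic(X, y, lr=0.05, n_iter=300):
    """Logistic regression by batch gradient descent (intercept included)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(n_iter):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def propensity(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic respondents: [age (standardized), gender, education level];
# the posttest sample is deliberately skewed older.
rng = random.Random(42)
pre  = [[rng.gauss(0.0, 1.0), rng.randint(0, 1), rng.randint(1, 3)] for _ in range(300)]
post = [[rng.gauss(0.5, 1.0), rng.randint(0, 1), rng.randint(1, 3)] for _ in range(100)]

w = fit_logistic(pre + post, [0] * len(pre) + [1] * len(post))

# 1-nearest-neighbor matching with replacement on the propensity score.
matched = []
for xi in post:
    s = propensity(w, xi)
    matched.append(min(pre, key=lambda c, s=s: abs(propensity(w, c) - s)))

mean = lambda rows, j: sum(r[j] for r in rows) / len(rows)
diff_before = mean(post, 0) - mean(pre, 0)    # age imbalance, unmatched
diff_after = mean(post, 0) - mean(matched, 0)  # age imbalance, matched
```

Matching shrinks the age imbalance between the two survey waves, which is exactly the covariate-balancing role the matching step plays before the outcome test in the study.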
Abstract:
The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches for operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated: it requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and on prior experience with risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage over banks that opt for less sophisticated approaches. - Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period between 2004 and 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and by external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship.
- The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies' Value. Previous research has shown the importance of external regulation for banks' behavior. Some inefficient standards may accentuate risk taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of the Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period between 2008 and 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of the risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.
Abstract:
A medical and scientific multidisciplinary consensus meeting on Anti-Doping in Sport was held from 29 to 30 November 2013 at the Home of FIFA in Zurich, Switzerland, to create a roadmap for the implementation of the 2015 World Anti-Doping Code. The consensus statement and accompanying papers set out the priorities for the anti-doping community in research, science and medicine. The participants achieved consensus on a strategy for the implementation of the 2015 World Anti-Doping Code. Key components of this strategy include: (1) sport-specific risk assessment, (2) prevalence measurement, (3) sport-specific test distribution plans, (4) storage and reanalysis, (5) analytical challenges, (6) forensic intelligence, (7) psychological approaches to optimise the deterrent effect, (8) the Athlete Biological Passport (ABP) and confounding factors, (9) data management systems (Anti-Doping Administration & Management System, ADAMS), (10) education, (11) research needs and necessary advances, (12) inadvertent doping, and (13) management and ethics of biological data. True implementation of the 2015 World Anti-Doping Code will depend largely on the ability to align thinking around these core concepts and strategies. FIFA, jointly with all other engaged International Federations (IFs), the International Olympic Committee (IOC) and the World Anti-Doping Agency (WADA), is ideally placed to lead transformational change with the unwavering support of the wider anti-doping community. The outcome of the consensus meeting was the creation of an ad hoc Working Group charged with the responsibility of moving this agenda forward.
Abstract:
Background: There may be a considerable gap between the LDL cholesterol (LDL-C) and blood pressure (BP) goal values recommended by the guidelines and the results achieved in daily practice. Design: Prospective cross-sectional survey of cardiovascular disease risk profiles and their management, with a focus on lipid lowering and BP lowering in clinical practice. Methods: In phase 1, the cardiovascular risk of patients with a known lipid profile visiting their general practitioner was anonymously assessed according to the PROCAM score. In phase 2, high-risk patients who had not achieved the LDL-C goal of less than 2.6 mmol/l in phase 1 could be further documented. Results: Six hundred and thirty-five general practitioners collected data on 23 892 patients with a known lipid profile. Forty percent were high-risk patients (diabetes mellitus, coronary heart disease, or PROCAM score >20%), compared with the 27% estimated by the physicians. In high-risk patients, the goal attainment rate was almost twice as high for BP as for LDL-C (62 vs. 37%). Both goals were attained by 25%. LDL-C values in phases 1 and 2 were available for 3097 high-risk patients not at the LDL-C goal in phase 1; 32% of these patients achieved the LDL-C goal of less than 2.6 mmol/l after a mean of 17 weeks. The most successful strategies for LDL-C reduction were implemented in only 22% of the high-risk patients. Conclusion: Although patients at high cardiovascular risk were treated more intensively than low- or medium-risk patients, the majority remained insufficiently controlled, which is an incentive for intensified medical education. Adequate implementation of the Swiss and international guidelines would be expected to improve the achievement of LDL-C and BP goal values in daily practice.
Abstract:
The current challenge in a context of major environmental change is to anticipate the responses of species to future landscape and climate scenarios. In the Mediterranean basin, climate change is one of the most powerful driving forces of fire dynamics, with fire frequency and impact having markedly increased in recent years. Species distribution modelling plays a fundamental role in this challenge, but better integration of available ecological knowledge is needed to adequately guide conservation efforts. Here, we quantified changes in habitat suitability of an early-succession bird in Catalonia, the Dartford Warbler (Sylvia undata), globally evaluated as Near Threatened on the IUCN Red List. We assessed potential changes in species distribution between 2000 and 2050 under different fire management and climate change scenarios, and described landscape dynamics using a spatially explicit fire-succession model that simulates fire impacts on the landscape and post-fire regeneration (the MEDFIRE model). Dartford Warbler occurrence data were acquired at two different spatial scales from (1) the Atlas of European Breeding Birds (EBCC) and (2) the Catalan Breeding Bird Atlas (CBBA). Habitat suitability was modelled using five widely used modelling techniques in an ensemble forecasting framework. Our results indicate considerable losses of habitat suitability (between 47% and 57% in baseline scenarios), modulated to a large extent by fire regime changes derived from fire management policies and climate change. This result highlights the need to take the spatial interaction between climate change, fire-mediated landscape dynamics and fire management policies into account in order to coherently anticipate habitat suitability changes for early-succession bird species.
We conclude that fire management programs need to be integrated into conservation plans to effectively preserve sparsely forested, early-succession habitats and their associated species in the face of global environmental change.
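A common way to combine the five modelling techniques in an "ensemble forecasting framework" is a skill-weighted average of the per-cell suitability predictions. The model names, suitability values, and skill scores (e.g. TSS-like weights) below are hypothetical placeholders, not values from the study.

```python
def ensemble_suitability(predictions, skills):
    """Skill-weighted ensemble of habitat-suitability maps.
    `predictions` maps model name -> list of per-cell suitabilities in [0, 1];
    `skills` maps model name -> evaluation score used as the weight."""
    total = sum(skills.values())
    n_cells = len(next(iter(predictions.values())))
    return [sum(skills[m] * predictions[m][i] for m in predictions) / total
            for i in range(n_cells)]

# Hypothetical suitabilities for four grid cells from three model types.
predictions = {
    "glm": [0.9, 0.4, 0.1, 0.6],
    "gam": [0.8, 0.5, 0.2, 0.7],
    "brt": [0.7, 0.3, 0.1, 0.5],
}
skills = {"glm": 0.6, "gam": 0.8, "brt": 0.7}  # illustrative skill scores
ens = ensemble_suitability(predictions, skills)
```

Weighting by evaluation skill lets better-performing models dominate the consensus map while retaining the variance-damping benefit of averaging across techniques.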
Abstract:
Segment poses and joint kinematics estimated from skin markers are strongly affected by soft tissue artifact (STA) and its rigid motion component (STARM). While four-marker clusters can decrease the non-rigid motion component of STA during gait, additional data, such as marker locations or STARM patterns, are crucial to compensate for STA in clinical gait analysis. The present study proposed (1) to devise a comprehensive average map of the spatial distribution of STA over the lower limb during treadmill gait and (2) to analyze STARM for four-marker clusters assigned to areas extracted from this spatial distribution. All experiments used a stereophotogrammetric system to track the skin markers and a bi-plane fluoroscopic system to track the knee prosthesis. The spatial distribution of STA was computed for 19 subjects using 80 markers affixed to the lower limb. Three different areas were extracted from the distribution map of the thigh. Marker displacement reached maxima of 24.9 mm and 15.3 mm in the proximal areas of the thigh and shank, respectively. STARM was larger on the thigh than on the shank, with RMS errors in cluster orientation between 1.2° and 8.1°. Translation RMS errors were also large (3.0 mm to 16.2 mm). No marker cluster correctly compensated for STARM. However, the coefficient of multiple correlation showed excellent agreement between skin-derived and bone kinematics, as well as for STARM across subjects. These correlations highlight dependencies between STARM and the kinematic components. This study provides new insights for modeling STARM during gait.
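The coefficient of multiple correlation used to compare skin-derived and bone kinematics can be sketched with the classic variance-ratio formulation, assuming waveforms sampled at the same frames; the two ramp-like "knee angle" curves below are synthetic stand-ins for real kinematic data.

```python
import math

def cmc(waveforms):
    """Coefficient of multiple correlation for G waveforms of F frames each,
    one common variance-ratio formulation: values near 1 indicate nearly
    identical waveform shapes and offsets."""
    G = len(waveforms)
    F = len(waveforms[0])
    frame_means = [sum(w[f] for w in waveforms) / G for f in range(F)]
    grand_mean = sum(frame_means) / F
    # Within-frame scatter between waveforms vs. total scatter about the grand mean.
    within = sum((w[f] - frame_means[f]) ** 2 for w in waveforms for f in range(F))
    total = sum((w[f] - grand_mean) ** 2 for w in waveforms for f in range(F))
    ratio = (within / (F * (G - 1))) / (total / (G * F - 1))
    return math.sqrt(max(0.0, 1.0 - ratio))

skin = [0.1 * f for f in range(101)]        # synthetic angle curve from skin markers
bone = [0.1 * f + 0.2 for f in range(101)]  # same motion from fluoroscopy, small offset
score = cmc([skin, bone])
```

Because the between-waveform offset is tiny relative to the range of motion, the CMC stays close to 1, which is the kind of "excellent" agreement the abstract reports between skin and bone kinematics.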