Abstract:
A major weakness of the loading models proposed in recent years for pedestrians walking on flexible structures is the set of uncorroborated assumptions made in their development, concerning both the spatio-temporal characteristics of pedestrian loading and the nature of multi-object interactions. To alleviate this problem, a framework for determining localised pedestrian forces on full-scale structures is presented using a wireless attitude and heading reference system (AHRS). An AHRS comprises a triad of tri-axial accelerometers, gyroscopes and magnetometers managed by a dedicated data processing unit, allowing motion in three-dimensional space to be reconstructed. A pedestrian loading model based on a single-point inertial measurement from an AHRS is derived and shown to perform well against benchmark data collected on an instrumented treadmill. Unlike other models, the current model neither takes any predefined form nor requires any extrapolation as to the timing and amplitude of pedestrian loading. To correctly assess the influence of a moving pedestrian on the behaviour of a structure, an algorithm for tracking the point of application of the pedestrian force is developed based on data from a single AHRS attached to a foot. A set of controlled walking tests with a single pedestrian is conducted on a real footbridge for validation purposes. A remarkably good match between the measured and simulated bridge response is found, confirming the applicability of the proposed framework.
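As a rough illustration of the single-point inertial model described above, the sketch below rotates body-frame AHRS accelerations into the global frame and applies Newton's second law. It is a schematic reading of the approach with invented sample values and an assumed body mass, not the paper's implementation.

```python
# A minimal sketch of a single-point inertial loading model: the pedestrian
# force is estimated as F(t) = m * a_global(t), with body-frame acceleration
# rotated into the global frame using the AHRS orientation estimate and
# gravity removed. Sample data and body mass are illustrative assumptions.
import numpy as np
from scipy.spatial.transform import Rotation as R

GRAVITY = np.array([0.0, 0.0, 9.81])   # m/s^2, global z-up convention

def pedestrian_force(acc_body, quats, mass):
    """acc_body: (N, 3) accelerations in the sensor frame [m/s^2]
    quats: (N, 4) orientation quaternions (x, y, z, w) from the AHRS
    mass:  pedestrian body mass [kg]
    Returns the (N, 3) inertial force in the global frame."""
    acc_global = R.from_quat(quats).apply(acc_body)   # rotate to global frame
    return mass * (acc_global - GRAVITY)              # remove gravity, F = m*a

# Illustrative use: a stationary, level sensor should yield ~zero net force.
acc = np.tile([0.0, 0.0, 9.81], (5, 1))       # accelerometer reads +1 g
quat = np.tile([0.0, 0.0, 0.0, 1.0], (5, 1))  # identity orientation
print(pedestrian_force(acc, quat, mass=75.0)) # ~[0, 0, 0] rows
```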
Abstract:
Global environmental change requires responses that involve marked or qualitative changes in individuals, institutions, societies, and cultures. Yet, while there has been considerable effort to develop theory about such processes, there has been limited research on practices for facilitating transformative change. We present a novel pathways approach, called Three Horizons, that helps participants work with complex and intractable problems and uncertain futures. The approach is important for helping groups work with uncertainty while also generating agency in ways not always addressed by existing futures approaches. Using examples, we explain how the approach provides a simple framework for structured and guided dialogue around different patterns of change. We then discuss some of the key characteristics of the practice that facilitators and participants have found useful. These include: (1) providing a simple structure for working with complexity, (2) helping develop future consciousness (an awareness of the future potential in the present moment), (3) helping distinguish between incremental and transformative change, (4) making explicit the processes of power and patterns of renewal, (5) enabling the exploration of how to manage transitions, and (6) providing a framework for dialogue among actors with different mindsets. The complementarity of Three Horizons with other approaches (e.g., scenario planning, dilemma thinking) is then discussed. Overall, we highlight the need for much greater attention to researching practices of transformation in ways that bridge different kinds of knowledge, including episteme and phronesis. Achieving this will itself require changes to contemporary systems of knowledge production. The practice of Three Horizons could be a useful way to explore how such transformations in knowledge production and use could be achieved.
Abstract:
An Euler-Lagrange particle tracking model, developed for simulating fire-atmosphere/sprinkler-spray interactions, is described. Full details of the model, along with the approximations made and the restrictions that apply, are presented. Errors commonly found in previous formulations of the source terms used in this two-phase approach are described and corrected. To demonstrate the capabilities of the model, it is applied to the simulation of a fire in a long corridor containing a sprinkler. The simulation presented is three-dimensional and transient, and considers mass, momentum and energy transfer between the gaseous atmosphere and the injected liquid droplets.
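The sketch below illustrates, under generic textbook assumptions, how one Euler-Lagrange coupling step can be organised: a droplet is advanced with a drag law, and the reaction force is returned as a gas-phase momentum source term. The Schiller-Naumann drag correlation and all constants are stand-ins, not the paper's formulation.

```python
# A minimal Euler-Lagrange coupling step: advance one droplet under drag and
# accumulate the equal-and-opposite momentum change as a source term for the
# gas-phase cell. Constants and the drag correlation are generic assumptions.
import numpy as np

RHO_GAS, MU_GAS = 1.2, 1.8e-5          # air density [kg/m^3], viscosity [Pa s]

def drag_coefficient(re):
    """Schiller-Naumann correlation, valid for Re < ~1000."""
    return 24.0 / re * (1.0 + 0.15 * re**0.687) if re > 1e-6 else 0.0

def advance_droplet(x, v, u_gas, d, rho_d, dt):
    """Advance one droplet; return new state and the momentum source
    (reaction force on the gas, N) for the cell containing the droplet."""
    rel = u_gas - v
    re = RHO_GAS * np.linalg.norm(rel) * d / MU_GAS
    cd = drag_coefficient(re)
    area = 0.25 * np.pi * d**2
    f_drag = 0.5 * RHO_GAS * cd * area * np.linalg.norm(rel) * rel
    mass = rho_d * np.pi * d**3 / 6.0
    v_new = v + dt * f_drag / mass      # droplet momentum equation
    x_new = x + dt * v_new
    return x_new, v_new, -f_drag        # reaction source term on the gas

x, v = np.zeros(3), np.zeros(3)
u_gas = np.array([2.0, 0.0, 0.0])      # gas moving at 2 m/s past the droplet
print(advance_droplet(x, v, u_gas, d=1e-3, rho_d=1000.0, dt=1e-3))
```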
Abstract:
An important aspect of sustainability is to maintain biodiversity and ecosystem functioning while improving human well-being. For this, the ecosystem service (ES) approach has the potential to bridge the still existing gap between ecological management and social development, especially by focusing on trade-offs and synergies between ES and between their beneficiaries. Several frameworks have been proposed to account for trade-offs and synergies between ES, and between ES and other components of social-ecological systems. However, to date, insufficient explicit attention has been paid to the three facets encompassed in the ES concept, namely potential supply, demand, and use, leading to incomplete descriptions of ES interactions. We expand on previous frameworks by proposing a new influence network framework (INF) based on an explicit consideration of influence relationships between these three ES facets, biodiversity, and external driving variables. We tested its ability to provide a comprehensive view of complex social-ecological interactions around ES through a consultative process focused on environmental management in the French Alps. We synthesized the interactions mentioned during this consultative process and grouped variables according to their overall propensity to influence or be influenced by the system. The resulting directed sequence of influences distinguished between: (1) mostly influential variables (dynamic social variables and ecological state variables), (2) target variables (provisioning and cultural services), and (3) mostly impacted variables (regulating services and biodiversity parameters). We discussed possible reasons for the discrepancies between actual and perceived influences and proposed options to overcome them. We demonstrated that the INF holds the potential to deliver collective assessments of ES relations by: (1) including ecological as well as social aspects, (2) providing opportunities for co-learning processes between stakeholder groups, and (3) supporting communication about complex social-ecological systems and consequences for environmental management.
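A hypothetical sketch of the core INF idea follows: elicited influence relations form a directed graph, and each variable's net propensity to influence (out-degree minus in-degree) orders the sequence from mostly influential to mostly impacted. The edge list is invented for illustration, not the French Alps consultation data.

```python
# Stakeholder-elicited influence relations between ES facets, biodiversity,
# and drivers form a directed graph; net influence (out-degree minus
# in-degree) ranks variables from mostly influential to mostly impacted.
import networkx as nx

edges = [
    ("tourism demand", "cultural services"),
    ("grassland state", "potential supply"),
    ("potential supply", "provisioning services"),
    ("provisioning services", "regulating services"),
    ("cultural services", "biodiversity"),
    ("regulating services", "biodiversity"),
]
g = nx.DiGraph(edges)
net_influence = {v: g.out_degree(v) - g.in_degree(v) for v in g}

# Variables sorted from mostly influential (positive) to mostly impacted.
for var, score in sorted(net_influence.items(), key=lambda kv: -kv[1]):
    print(f"{score:+d}  {var}")
```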
Abstract:
The shift from hunting and gathering to committed agriculture is regarded as one of the major transitions in human history. Archeologists and anthropologists have invested significant efforts in explaining the origins of agriculture. A period of gathering intensification and experimentation and pursuing a mixed economic strategy seems the most plausible explanation for the transition to agriculture, and provides an approach to study a process in which several nonlinear processes may have played a role. However, the mechanisms underlying the transition to full agriculture are not completely clear. This is partly due to the nature of the archeological record, which registers a practice only once it has become clearly established. Thus, points of transition have limited visibility and the mechanisms involved in the process are difficult to untangle. The complexity of such transitions also implies that shifts can be distinctively different in particular environments and under varying historical and social conditions. In this paper we discuss some of the elements involved in the transition to food production within the framework of resilience theory. We propose a theoretical conceptual model in which the resilience of livelihood strategies lies at the intersection of three spheres: the environmental, economic, and social domains. Transitions occur when the rate of change, in one or more of these domains, is so elevated or its magnitude so large that the livelihood system is unable to bounce back to its original state. In this situation, the system moves to an alternative stable state, from one livelihood strategy to another.
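As a schematic reading of the alternative-stable-state argument, the toy model below (not from the paper) treats a livelihood system as a ball in a double-well potential: only a sufficiently large push past the unstable threshold carries it into the other basin.

```python
# Toy bistable system: wells near x = -1 ("foraging") and x = +1 ("farming").
# A small perturbation relaxes back; a large one tips the system over.
def step(x, push, dt=0.01):
    """Overdamped dynamics dx/dt = x - x^3 + push."""
    return x + dt * (x - x**3 + push)

for push in (0.2, 0.6):                 # small vs large perturbation
    x = -1.0                            # start in the "foraging" basin
    for _ in range(2000):               # push applied for a finite period
        x = step(x, push)
    for _ in range(2000):               # relax after the push is removed
        x = step(x, 0.0)
    print(f"push={push}: settles near x = {x:+.2f}")
```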
Abstract:
Individuals and corporate users are persistently considering cloud adoption due to its significant benefits compared to traditional computing environments. The data and applications in the cloud are stored in an environment that is separated, managed and maintained externally to the organisation. Therefore, it is essential for cloud providers to demonstrate and implement adequate security practices to protect the data and processes put under their stewardship. Security transparency in the cloud is likely to become the core theme that underpins the systematic disclosure of security designs and practices that enhance customer confidence in using cloud service and deployment models. In this paper, we present a framework that enables a detailed analysis of security transparency for cloud-based systems. In particular, we consider security transparency at three different levels of abstraction, i.e., the conceptual, organisational and technical levels, and identify the relevant concepts within these levels. This allows us to provide an elaboration of the essential concepts at the core of transparency and analyse the means for implementing them from a technical perspective. Finally, an example from a real-world migration context is given to provide a solid discussion on the applicability of the proposed framework.
Abstract:
This paper presents a three-dimensional, thermo-mechanical modelling approach to the cooling and solidification phases associated with the shape casting of metals, i.e. die, sand and investment casting. Novel vertex-based Finite Volume (FV) methods are described and employed with regard to the small-strain, non-linear Computational Solid Mechanics (CSM) capabilities required to model shape casting. The CSM capabilities include the non-linear material phenomena of creep and thermo-elasto-visco-plasticity at high temperatures and thermo-elasto-plasticity at low temperatures, and also multi-body deformable contact, which can occur between the metal casting and the mould. The vertex-based FV methods, which can be readily applied to unstructured meshes, are included within a comprehensive FV modelling framework, PHYSICA. The additional heat transfer (by conduction and convection), filling, porosity and solidification algorithms existing within PHYSICA for the complete modelling of the shape casting process employ cell-centred FV methods. The thermo-mechanical coupling is performed in a staggered incremental fashion, which addresses the possible gap formation between the component and the mould, and is ultimately validated against a variety of shape casting benchmarks.
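The sketch below shows one way a staggered thermo-mechanical increment with gap-dependent interface heat transfer can be organised: each increment solves a thermal step, feeds the new temperature to a mechanical step, and the resulting casting-mould gap degrades the interface conductance for the next increment. The lumped scalar model and all coefficients are invented for illustration, not PHYSICA's algorithms.

```python
# Staggered coupling sketch: thermal step -> mechanical step -> gap update.
# Thermal contraction of the casting opens a gap that slows further cooling.
T_CAST, T_MOULD = 700.0, 25.0        # initial temperatures [C]
ALPHA, L0 = 2.0e-5, 0.1              # thermal expansion [1/C], size [m]
H_CONTACT, H_GAP = 1000.0, 50.0      # interface conductances [W/m^2 K]

temp, gap = T_CAST, 0.0
for increment in range(5):
    # Thermal step: interface conductance depends on the current gap.
    h = H_GAP if gap > 0.0 else H_CONTACT
    temp -= 1e-4 * h * (temp - T_MOULD)            # lumped cooling update
    # Mechanical step: thermal contraction of the casting opens a gap.
    shrinkage = ALPHA * L0 * (T_CAST - temp)
    gap = max(0.0, shrinkage)                      # no interpenetration
    print(f"inc {increment}: T = {temp:6.1f} C, gap = {gap*1e6:7.1f} um")
```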
Abstract:
The difficulties encountered in implementing large-scale computational mechanics (CM) codes on multiprocessor systems are now fairly well understood. Despite the claims of shared-memory architecture manufacturers to provide effective parallelizing compilers, these have not proved adequate for large or complex programs. Significant programmer effort is usually required to achieve reasonable parallel efficiencies on significant numbers of processors. The paradigm of Single Program Multiple Data (SPMD) domain decomposition with message passing, where each processor runs the same code on a subdomain of the problem and communicates through the exchange of messages, has for some time been demonstrated to provide the required level of efficiency, scalability, and portability across both shared and distributed memory systems, without the need to re-author the code into a new language or to support differing message-passing implementations. Extension of the methods into three dimensions has been enabled through the engineering of PHYSICA, a framework for supporting 3D, unstructured mesh and continuum mechanics modeling. In PHYSICA, six inspectors are used. Part of the challenge for automating parallelization is being able to prove the equivalence of inspectors so that they can be merged into as few as possible.
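A minimal SPMD halo-exchange sketch in the spirit described above follows, using mpi4py as a stand-in for the message-passing layer of a CM code: every rank runs this same program on its own subdomain and exchanges one-cell halos with its neighbours before each local update. The Jacobi smoothing step is a generic placeholder kernel.

```python
# SPMD domain decomposition: each rank owns a slice of a 1D array and
# exchanges halo cells by message passing before a local Jacobi update.
# Run with e.g. `mpirun -n 4 python spmd_demo.py` (file name illustrative).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 8                                   # cells owned by this rank
u = np.full(n_local + 2, float(rank))         # +2 halo cells at either end
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(10):
    # Halo exchange: send edge cells to neighbours, receive their edges.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    u[1:-1] = 0.5 * (u[:-2] + u[2:])          # local Jacobi update
print(f"rank {rank}: interior mean = {u[1:-1].mean():.3f}")
```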
Abstract:
Purpose – The paper aims to conceptualise cosmopolitanism drivers from the third-level power perspective by drawing on Lukes' (1974; 2005) theory of power. In addition, the paper aims to investigate the relationship between entrepreneurs' cosmopolitan dispositions and habitus, i.e. a pattern of an individual's demeanour, as understood by Bourdieu.
Design/methodology/approach – This conceptual paper makes use of Bourdieu's framework (habitus) by extending it to the urban cosmopolitan environment and linking habitus to the three-dimensional theory of power and, importantly, to power's third dimension: preference-shaping.
Findings – Once cosmopolitanism is embedded in an urban area's values, it creates endless rounds of mutual influence (by power holders onto entrepreneurs via political and business elites, and by entrepreneurs onto power holders via the same channels), with mutual benefit. This mutually beneficial influence, which transpires in continuous support of a cosmopolitan city's environment, may be viewed as one of the factors that enhance cosmopolitan cities' resilience to changes in macroeconomic conditions.
Originality/value – The paper offers a theoretical model that enriches understanding of the power-cosmopolitanism-entrepreneurship link by emphasising the preference-shaping capacity of power, which leads to the embedding of cosmopolitanism in societal values. As a value shared by political and business elites, cosmopolitanism is also actively promoted by entrepreneurs through their disposition and habitus. This ensures not only their willing compliance with power and the environment, but also their enhancement of favourable business conditions. Entrepreneurs depart from mere acquiescence (to power and its explicit dominance) and instead exercise their cosmopolitan influence through active preference-shaping.
Abstract:
There is a limited amount of research in the area of missing persons, especially adults. The aim of this research is to expand understanding of missing people by examining adults' behaviours while missing and determining whether distinct behavioural themes exist. Based on previous literature, it was hypothesised that three behavioural themes would be present: dysfunctional, escape, and unintentional. Thirty-six behaviours were coded from 362 missing person police reports and analysed using smallest space analysis (SSA). This produced a spatial representation of the behaviours, showing three distinct behavioural themes. Seventy percent of the adult missing person reports were classified under one dominant theme: 41% 'unintentional', 18% 'dysfunctional', and 11% 'escape'. The relationships between a missing person's dominant behavioural theme and their assigned risk level and demographic characteristics were also analysed. Significant associations were found between a missing person's dominant behavioural theme and their age, occupational status, mental health issues, and assigned risk level. The findings are a first step in the development of a standardised checklist for missing person investigations, with implications for how practitioners prioritise missing adults and for interventions to prevent individuals from going missing.
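Smallest space analysis is closely related to non-metric multidimensional scaling, so the sketch below shows the general shape of such an analysis: binary behaviour codes across cases are converted to inter-behaviour dissimilarities and mapped to a 2D configuration in which co-occurring behaviours sit close together. The data are random stand-ins, not the 362 coded reports, and scikit-learn's MDS is an assumed substitute for dedicated SSA software.

```python
# SSA-style analysis via non-metric MDS: behaviours that co-occur across
# cases end up close together in the 2D configuration.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
cases = rng.integers(0, 2, size=(362, 36)).astype(bool)  # cases x behaviours

# Jaccard dissimilarity between behaviours (columns), standing in for the
# association coefficients SSA computes between items.
diss = squareform(pdist(cases.T, metric="jaccard"))

coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(diss)
print(coords[:5])   # 2D positions; themes appear as regions of the plot
```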
Abstract:
An extensive literature review failed to uncover an adequate operational definition of dyslexia applicable to education. The predominant fields of research that have produced most of the studies on dyslexia are neurology, neurolinguistics and genetics. Their perspectives were shown to be more pertinent to medical experts than to teachers. The categorization of surface and deep dyslexia was shown to be the best description of dyslexia in an educational context. The purpose of the present thesis was to develop a theoretical conceptual framework in which the reading difficulties of dyslexic children are treated as a problem-solving difficulty in information processing, linking dyslexia, a text-processing model and problem solving. This conceptual framework was validated by three experts, specializing respectively in cognitive psychology, dyslexia and teaching. The concept of problem solving was based on information-processing theories in cognitive psychology. This framework applies specifically to the reading difficulties manifested by dyslexic children.
Abstract:
Background: Over the last few decades, the prevalence of young adults with disabilities (YAD) has steadily risen as a result of advances in medicine, clinical treatment, and biomedical technology that enhanced their survival into adulthood. Despite investments in services, family supports, and insurance, they experience poor health status and barriers to successful transition into adulthood. Objectives: We investigated the collective roles of multi-faceted factors at the intrapersonal, interpersonal and community levels within the social ecological framework on health-related outcomes, including self-rated health (SRH), of YAD. The three specific aims are: 1) to examine sociodemographic differences and health insurance coverage in adolescence; 2) to investigate the role of social skills in relationships with family and peers developed in adolescence; and 3) to collectively explore the association of sociodemographic characteristics, social skills, and community participation in adolescence with SRH. Methods: Using longitudinal data (N=5,020) from the National Longitudinal Transition Study (NLTS2), we conducted multivariate logistic regression analyses to understand the association between insurance status as well as social skills in adolescence and YAD's health-related outcomes. Structural equation modeling (SEM) assessed the confluence of multi-faceted factors from the social ecological model that link to health in early adulthood. Results: Compared with YAD who had private insurance, YAD who had public health insurance in adolescence are at higher odds of experiencing poorer health-related outcomes in self-rated health [adjusted odds ratio (aOR) = 2.89, 95% confidence interval (CI): 1.16, 7.23], problems with health (aOR = 2.60, 95% CI: 1.26, 5.35), and missing social activities due to health problems (aOR = 2.86, 95% CI: 1.39, 5.85). At the interpersonal level, overall social skills developed through relationships with family and peers in adolescence do not appear to be associated with health-related outcomes in early adulthood. Finally, at the community level, community participation in adolescence does not have an association with SRH in early adulthood. Conclusions: Having public health insurance coverage does not equate to good health. YAD need additional supports to achieve positive health outcomes. The findings on social skills and community participation suggest other potential factors may be at play for health-related outcomes for YAD, and the need for further investigation.
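A sketch of the kind of multivariate logistic regression reported above follows, on simulated stand-in data (not NLTS2), showing how adjusted odds ratios and 95% confidence intervals are obtained as exponentiated coefficients. The covariates and effect size are invented for illustration.

```python
# Multivariate logistic regression sketch: aOR and 95% CI from exp(coef).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5020
df = pd.DataFrame({
    "public_insurance": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "age": rng.integers(13, 17, n),
})
logit = -2.0 + 1.06 * df["public_insurance"]   # true aOR ~ e^1.06 ~ 2.9
df["poor_srh"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["public_insurance", "female", "age"]])
fit = sm.Logit(df["poor_srh"], X).fit(disp=False)
aor = np.exp(fit.params)                       # adjusted odds ratios
ci = np.exp(fit.conf_int())                    # 95% CI bounds
print(pd.DataFrame({"aOR": aor, "2.5%": ci[0], "97.5%": ci[1]}))
```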
Abstract:
Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to symbolically execute. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation to the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app's Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of a synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that the model is sufficient to allow SymDroid to execute a range of apps.
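A toy sketch of the adaptive concretization idea follows: the most influential unknown (here, which operator a program hole uses) is concretized explicitly, and each resulting sub-problem is then searched exhaustively, standing in for the SAT/SMT-backed symbolic search the thesis describes. The spec, holes, and candidate grammar are all invented for illustration.

```python
# Adaptive-concretization-style search: concretize the influential unknown
# (the operator), then search the residual sub-problem for coefficients.
import itertools
import operator

SPEC = [(1, 2, 5), (2, 3, 8), (0, 4, 8)]          # (x, y, expected f(x, y))
OPS = {"add": operator.add, "mul": operator.mul}  # influential unknown

def synthesize():
    for op_name, op in OPS.items():               # explicit concretization
        # "Symbolic" search over the remaining unknown coefficients a, b
        # in candidate programs f(x, y) = op(a*x, b*y).
        for a, b in itertools.product(range(-3, 4), repeat=2):
            if all(op(a * x, b * y) == out for x, y, out in SPEC):
                return f"f(x, y) = {op_name}({a}*x, {b}*y)"
    return None

print(synthesize())   # expect f(x, y) = add(1*x, 2*y)
```

In the real algorithm, each concretized sub-problem would be dispatched to a parallel worker and solved with a constraint solver rather than enumeration.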
Abstract:
Processors with large numbers of cores are becoming commonplace. In order to utilise the available resources in such systems, the programming paradigm has to move towards increased parallelism. However, increased parallelism does not necessarily lead to better performance. Parallel programming models have to provide not only flexible ways of defining parallel tasks, but also efficient methods to manage the created tasks. Moreover, in a general-purpose system, applications residing in the system compete for the shared resources. Thread and task scheduling in such a multiprogrammed multithreaded environment is a significant challenge. In this thesis, we introduce a new task-based parallel reduction model, called the Glasgow Parallel Reduction Machine (GPRM). Our main objective is to provide high performance while maintaining ease of programming. GPRM supports native parallelism; it provides a modular way of expressing parallel tasks and the communication patterns between them. Compiling a GPRM program results in an Intermediate Representation (IR) containing useful information about tasks and their dependencies, as well as the initial mapping information. This compile-time information helps reduce the overhead of runtime task scheduling and is key to high performance. Generally speaking, the granularity and the number of tasks are major factors in achieving high performance. These factors are even more important in the case of GPRM, as it is highly dependent on tasks rather than threads. We use three basic benchmarks to provide a detailed comparison of GPRM with Intel OpenMP, Cilk Plus, and Threading Building Blocks (TBB) on the Intel Xeon Phi, and with GNU OpenMP on the Tilera TILEPro64. GPRM shows superior performance in almost all cases, simply by controlling the number of tasks. GPRM also provides a low-overhead mechanism, called "Global Sharing", which improves performance in multiprogramming situations. We use OpenMP, the most popular model for shared-memory parallel programming, as the main GPRM competitor for solving three well-known problems on both platforms: LU factorisation of sparse matrices, image convolution, and linked list processing. We focus on proposing solutions that best fit into GPRM's model of execution. GPRM outperforms OpenMP in all cases on the TILEPro64. On the Xeon Phi, our solution for the LU factorisation results in notable performance improvement for sparse matrices with large numbers of small blocks. We investigate the overhead of GPRM's task creation and distribution for very short computations using the image convolution benchmark. We show that this overhead can be mitigated by combining smaller tasks into larger ones. As a result, GPRM can outperform OpenMP for convolving large 2D matrices on the Xeon Phi. Finally, we demonstrate that our parallel worksharing construct provides an efficient solution for linked list processing and performs better than OpenMP implementations on the Xeon Phi. The results are very promising, as they verify that our parallel programming framework for manycore processors is flexible and scalable, and can provide high performance without sacrificing productivity.
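A sketch of the task-granularity point made above follows: in a task-based model, per-task overhead dominates for very short computations, so combining many small work items into fewer, coarser tasks improves performance. Python's thread pool stands in for GPRM's task scheduling; the chunk size plays the role of controlling the number of tasks.

```python
# Task-based reduction with controllable granularity: one task per chunk,
# so larger chunks mean fewer, coarser tasks and less scheduling overhead.
from concurrent.futures import ThreadPoolExecutor

def process(item):
    return item * item                      # a very short computation

def run(items, chunk_size, workers=4):
    chunks = [items[i:i + chunk_size]
              for i in range(0, len(items), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # One task per chunk: fewer, coarser tasks amortize the overhead.
        partials = pool.map(lambda c: sum(process(x) for x in c), chunks)
        return sum(partials)                # final reduction step

items = list(range(10_000))
assert run(items, chunk_size=1) == run(items, chunk_size=1024)
print(run(items, chunk_size=1024))
```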