897 results for Process theory


Relevance:

60.00%

Publisher:

Abstract:

A method is given for proving efficiency of the NPMLE, directly linked to empirical process theory. The conditions, in general, are appropriate consistency of the NPMLE, differentiability of the model, differentiability of the parameter of interest, local convexity of the parameter space, and a Donsker class condition for the class of efficient influence functions obtained by varying the parameters. For the case that the model is linear in the parameter and the parameter space is convex, as with most nonparametric missing data models, we show that the method leads to an identity for the NPMLE which almost says that the NPMLE is efficient and provides us straightforwardly with a consistency and efficiency proof. This identity is extended to an almost linear class of models which contains biased sampling models. To illustrate, the method is applied to the univariate censoring model, random truncation models, the interval censoring case I model, the class of parametric models, and a class of semiparametric models.
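The efficiency statement at stake can be sketched in generic notation (a schematic form of asymptotic linearity, not the paper's exact identity):

```latex
% Schematic efficiency statement (generic notation, not the paper's
% exact identity): the NPMLE-based estimate of the parameter \Psi is
% efficient when it is asymptotically linear in the efficient influence
% function,
\[
  \Psi(\hat{P}_n) - \Psi(P_0)
    \;=\; \frac{1}{n}\sum_{i=1}^{n} \tilde{\ell}\,(X_i \mid P_0)
    \;+\; o_P\!\bigl(n^{-1/2}\bigr),
\]
% where \tilde{\ell}(\cdot \mid P_0) is the efficient influence function
% of \Psi at the true distribution P_0 --- the same class of functions
% required to satisfy the Donsker condition above.
```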

Relevance:

60.00%

Publisher:

Abstract:

Opportunistic routing (OR) employs a list of candidates to improve wireless transmission reliability. However, conventional list-based OR restricts the freedom of opportunism, since only the listed nodes are allowed to compete for packet forwarding. Additionally, the list is generated statically, based on a single network metric, prior to data transmission, which is not appropriate for mobile ad-hoc networks (MANETs). In this paper, we propose a novel OR protocol, Context-aware Adaptive Opportunistic Routing (CAOR), for MANETs. CAOR abandons the idea of a candidate list and allows all qualified nodes to participate in packet transmission. CAOR forwards packets by simultaneously exploiting multiple pieces of cross-layer context information, such as link quality, geographic progress, energy, and mobility. With the help of Analytic Hierarchy Process theory, CAOR adjusts the weights of the context information based on their instantaneous values to adapt the protocol's behavior at run-time. Moreover, CAOR uses an active suppression mechanism to reduce packet duplication. Simulation results show that CAOR can provide efficient routing in highly mobile environments. The adaptivity feature of CAOR is also validated.
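The weighting step can be illustrated with a minimal Analytic Hierarchy Process sketch (not CAOR's implementation): weights for the context metrics are derived from a pairwise comparison matrix, here approximated by row geometric means. The matrix values are made up for illustration.

```python
# Illustrative sketch (not CAOR's implementation) of deriving context-
# metric weights with the Analytic Hierarchy Process. The pairwise
# comparison matrix is a made-up example over four routing metrics.
import math

metrics = ["link_quality", "geo_progress", "energy", "mobility"]

# A[i][j] = relative importance of metric i over metric j.
A = [
    [1,   2,   3,   4],
    [1/2, 1,   2,   3],
    [1/3, 1/2, 1,   2],
    [1/4, 1/3, 1/2, 1],
]

def ahp_weights(matrix):
    """Approximate the principal eigenvector by row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

weights = ahp_weights(A)
for name, w in zip(metrics, weights):
    print(f"{name}: {w:.3f}")
```

In a CAOR-like protocol the comparison matrix would itself be rebuilt from the instantaneous metric values, so the weights adapt at run-time.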

Relevance:

60.00%

Publisher:

Abstract:

In an American postsecondary context, conflict is inherent (Gianneschi & Yanagiura, 2006; Valian, 1999). Successful navigation of conflict in the academy is vital for those who aspire to leadership positions (Nadler & Nadler, 1987; Walters, Stuhlmacher, & Meyer, 1998). Presently, however, women face significant barriers to achieving success in higher education administration, including gender expectations for conflict resolution behavior (Bartunek, 1992; Bowles, Babcock, & McGinn, 2005; Gayle, Preiss, & Allen, 2002). While a considerable body of literature exists for understanding gender negotiation, it remains rooted in a masculine paradigm (Kolb & Putnam, 2006; Shuter & Turner, 1997), and, as such, established theories lack a feminist epistemological perspective. Consequently, my primary research question is: How do women leaders experience and perceive conflict in the higher education work environment? I conduct a qualitative study that examines the workplace conflict experiences of 15 women leaders from diverse personal and professional backgrounds. Hartsock's (1983) three-tiered gender-sensitive analysis of power, updated to include multicultural perspectives, serves as my theoretical framework. It is a lens through which I evaluate theories, finding multicultural organizational, higher education conflict, and gender negotiation theories most applicable to this study. The framework also creates the foundation upon which I build my study. Specifically, I determine that a feminist research method is most relevant to this investigation. To analyze data obtained through in-depth interviews, I employ a highly structured form of grounded theory called dimensional analysis. Based on my findings, I co-construct with study participants a Feminist Conflict Process Theory and Flowchart, in which initially the nature of the relationship, and subsequently the level of risk to the relationship, institution, or self, is evaluated. This study supports what is observed in the conflict resolution practitioner literature, but is unique in its observation of factors that influence decisions within a dynamic conflict resolution process. My findings are significant to women who aspire to serve in leadership positions in higher education, as well as to the academy as a whole, for they expand our knowledge of women's ontological and epistemological perspectives on resolving conflict in postsecondary education.

Relevance:

60.00%

Publisher:

Abstract:

In this paper I show how the conserved quantity theory, or more generally the process theory of Wesley Salmon and myself, provides a sufficient condition in an analysis of causation. To do so, I show how it handles the problem of alleged 'misconnections': I show what the conserved quantity theory says about such cases, and why intuitions are not to be taken as sacrosanct.

Relevance:

60.00%

Publisher:

Abstract:

An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action `open tap' leads with certainty to `tap open', whereas whether there will be a fluid flow, and how long it might last, is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time is proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
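The tap example can be sketched as a toy Qualitative Process Theory style inference: a process is active only while its preconditions hold, and while active it exerts influences on quantities. The state variables and influence labels below are hypothetical illustrations, not the thesis's actual planner representation.

```python
# Toy sketch of a Qualitative Process Theory style inference for the tap
# example (hypothetical state variables and influence labels, not the
# thesis's actual planner representation).

def flow_active(state):
    """The `fluid flow' process is active only while its conditions hold."""
    return state["tap_open"] and state["tank_level"] > 0

def step(state):
    """Apply the process's direct influences for one qualitative step."""
    if flow_active(state):
        state["tank_level"] -= 1   # negative influence on the source
        state["cup_level"] += 1    # positive influence on the sink
    return state

state = {"tap_open": True, "tank_level": 2, "cup_level": 0}
while flow_active(state):
    step(state)
print(state)  # the flow deactivates itself once the tank is empty
```

This captures the point in the abstract: `open tap' guarantees `tap open', but whether fluid actually flows, and for how long, is an indirect effect that must be inferred from the process's conditions.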

Relevance:

60.00%

Publisher:

Abstract:

An applied psychological framework for coping with performance uncertainty in sport and work systems is presented. The theme of personal control serves to integrate ideas prevalent in industrial and organisational psychology, the stress literature and labour process theory. These commonly focus on the promotion of tacit knowledge and learned resourcefulness in individual performers. Finally, data from an empirical evaluation of a development training programme to facilitate self-regulation skills in professional athletes are briefly highlighted.

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this study was to analyze the evolution of Florida state level policy efforts and to assess the responding educational policy development and implementation at the local school district level. The focus of this study was the secondary language arts curriculum in Miami-Dade County Public Schools. Data were collected using document analysis as a source of meaning making out of the language sets proffered by agencies at each level. A matrix was created based on Klein's levels of curriculum decision-making and Functional Process Theory categories of policy formation. The matrix allowed the researcher to code and classify specific information in terms of accountability/high-stakes testing; authority; outside influences; and operational/structural organization. Federal policy documents provided a background and impetus for much of what originated at the State level. The State then produced policy directives, which were accepted by the District as specific policy directives and guidelines for practice. No evidence was found indicating the involvement of any other agencies in the development, transmission or implementation of the State level initiated policies. After analyzing the evolutionary process, it became clear that state policy directives were never challenged or discussed. Rather, they were accepted as standards to be met and, as such, school districts complied. Policy implementation is shown to be a top-down phenomenon. No evidence was found indicating a dialogue between state and local systems; rather the state, as the source of authority, issued specifically worded policy directives and the district complied. Finally, this study recognizes that outside influences play an important role in shaping education reform policy in the state of Florida. The federal government, through NCLB and other initiatives, created a climate which led almost naturally to the creation of the Florida A+ Plan. Similarly, the concern of the business community, always interested in the production of competent workers, continued to support efforts at raising the minimum skill level of Florida high school graduates. Suggestions are made for future research, including the examination of local school sites in order to assess the overall nature of the school experience rather than rely upon performance indicators mandated by state policy.

Relevance:

60.00%

Publisher:

Abstract:

Behavior of granular material subjected to repeated load triaxial compression tests is characterized by a model based on rate process theory. Starting with the Arrhenius equation from chemical kinetics, the relationship of temperature, shear stress, normal stress and volume change to deformation rate is developed. The proposed model equation includes these factors as a product of exponential terms. An empirical relationship between deformation and the cube root of the number of stress applications at constant temperature and normal stress is combined with the rate equation to yield an integrated relationship of temperature, deviator stress, confining pressure and number of deviator stress applications to axial strain. The experimental program consists of 64 repeated load triaxial compression tests, 52 on untreated crushed stone and 12 on the same crushed stone material treated with 4% asphalt cement. Results were analyzed with multiple linear regression techniques and show substantial agreement with the model equations. Experimental results fit the rate equation somewhat better than the integrated equation when all variable quantities are considered. The coefficient of shear temperature gives the activation enthalpy, which is about 4.7 kilocalories/mole for untreated material and 39.4 kilocalories/mole for asphalt-treated material. This indicates the activation enthalpy is about that of the pore fluid. The proportionality coefficient of deviator stress may be used to measure flow unit volume. The volumes thus determined for untreated and asphalt-treated material are not substantially different. This may be coincidental since comparison with flow unit volumes reported by others indicates flow unit volume is related to gradation of untreated material. The flow unit volume of asphalt-treated material may relate to asphalt cement content. 
The proposed model equations provide a more rational basis for further studies of factors affecting deformation of granular materials under stress similar to that in pavement subjected to transient traffic loads.
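As a hedged sketch of the rate-process form behind the model, the Arrhenius-type temperature term times an exponential stress term can be written directly in code. The pre-exponential factor A and the stress coefficient beta are illustrative placeholders, not the study's fitted values; only the activation enthalpy of about 4.7 kcal/mol echoes the reported figure for untreated material.

```python
# Hedged sketch of the rate-process form behind the model: an
# Arrhenius-type temperature term times an exponential stress term.
# A and beta are illustrative placeholders, not the study's fitted
# coefficients; dH ~ 4.7 kcal/mol echoes the reported untreated value.
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def deformation_rate(T_kelvin, deviator_stress,
                     dH_kcal_per_mol=4.7, A=1.0e3, beta=0.05):
    """rate = A * exp(-dH / (R*T)) * exp(beta * deviator_stress)"""
    return (A * math.exp(-dH_kcal_per_mol / (R * T_kelvin))
              * math.exp(beta * deviator_stress))

# Thermally activated: warmer material deforms faster at equal stress,
# and higher deviator stress deforms faster at equal temperature.
r_cold = deformation_rate(278.0, 50.0)
r_warm = deformation_rate(308.0, 50.0)
print(r_warm > r_cold)
```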

Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

60.00%

Publisher:

Abstract:

Current Ambient Intelligence and Intelligent Environment research focuses on interpreting a subject's behaviour at the activity level by logging Activities of Daily Living (ADLs) such as eating and cooking. In general, the sensors employed (e.g. PIR sensors, contact sensors) provide low-resolution information. Meanwhile, the expansion of ubiquitous computing allows researchers to gather additional information from different types of sensor, which makes it possible to improve activity analysis. Building on previous research on sitting posture detection, this research further analyses human sitting activity. The aim of this research is to use a non-intrusive, low-cost chair system with embedded pressure sensors to recognize a subject's activity from their detected postures. There are three steps to this research: the first is to find a hardware solution for low-cost sitting posture detection; the second is to find a suitable strategy for sitting posture detection; and the last is to correlate time-ordered sitting posture sequences with sitting activity. The author built a prototype sensing system called IntelliChair for sitting posture detection. Two experiments were conducted to determine the hardware architecture of the IntelliChair system. The prototype work examines sensor selection and the integration of various sensors, and identifies the best choice for a low-cost, non-intrusive system. Subsequently, this research applies signal processing theory to explore the frequency characteristics of sitting posture, in order to determine a suitable sampling rate for the IntelliChair system. For the second and third steps, ten subjects were recruited for sitting posture and sitting activity data collection. The former dataset was collected by asking subjects to perform certain pre-defined sitting postures on IntelliChair, and it is used for the posture recognition experiment. The latter dataset was collected by asking the subjects to perform their normal sitting activity routine on IntelliChair for four hours, and it is used for the activity modelling and recognition experiment. For the posture recognition experiment, two Support Vector Machine (SVM) based classifiers are trained (one for spine postures and the other for leg postures), and their performance is evaluated. A Hidden Markov Model is used for sitting activity modelling and recognition, in order to establish the selected sitting activities from sitting posture sequences. After experimenting with possible sensors, the Force Sensing Resistor (FSR) was selected as the pressure sensing unit for IntelliChair. Eight FSRs are mounted on the seat and back of a chair to gather haptic (i.e., touch-based) posture information. Furthermore, the research explores the possibility of using an alternative non-intrusive sensing technology (the vision-based Kinect sensor from Microsoft) and finds that the Kinect sensor is not reliable for sitting posture detection due to a joint-drifting problem. A suitable sampling rate for IntelliChair, determined experimentally, is 6 Hz. The posture classification results show that the SVM-based classifier is robust to "familiar" subject data (accuracy is 99.8% for spine postures and 99.9% for leg postures). When dealing with "unfamiliar" subject data, the accuracy is 80.7% for spine posture classification and 42.3% for leg posture classification. Activity recognition achieves 41.27% accuracy among the four selected activities (relax, play game, work with PC and watch video). The results of this thesis show that individual body characteristics and sitting habits influence both sitting posture and sitting activity recognition. This suggests that IntelliChair is suitable for individual use, but that a training stage is required.
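The posture-recognition step can be illustrated with a minimal SVM sketch. The 8-channel data below are synthetic stand-ins (random numbers, not IntelliChair recordings), and scikit-learn is assumed only as a convenient SVM implementation.

```python
# Illustrative sketch of the posture-recognition step: an SVM trained on
# 8-channel FSR pressure vectors. All data here are synthetic stand-ins
# (random numbers, not IntelliChair recordings).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two invented spine postures: "upright" loads the 4 backrest sensors,
# "leaning forward" loads the 4 seat sensors.
upright = rng.normal(loc=[0.8] * 4 + [0.2] * 4, scale=0.1, size=(100, 8))
forward = rng.normal(loc=[0.2] * 4 + [0.8] * 4, scale=0.1, size=(100, 8))
X = np.vstack([upright, forward])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

With real chair data the "unfamiliar subject" gap reported above would appear when the held-out test set contains subjects absent from training.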

Relevance:

60.00%

Publisher:

Abstract:

This research explores the business model (BM) evolution process of entrepreneurial companies and investigates the relationship between BM evolution and firm performance. Recently, it has been increasingly recognised that the innovative design (and re-design) of BMs is crucial to the performance of entrepreneurial firms, as BMs can be associated with superior value creation and competitive advantage. However, there has been limited theoretical and empirical evidence in relation to the micro-mechanisms behind the BM evolution process and the entrepreneurial outcomes of BM evolution. This research seeks to fill this gap by opening up the ‘black box’ of the BM evolution process, exploring the micro-patterns that facilitate the continuous shaping, changing, and renewing of BMs, and examining how BM evolution creates and captures value in a dynamic manner. Drawing together the BM and strategic entrepreneurship literature, this research seeks to understand: (1) how and why companies introduce BM innovations and imitations; (2) how BM innovations and imitations interplay as patterns in the BM evolution process; and (3) how BM evolution patterns affect firm performance. This research adopts a longitudinal multiple case study design that focuses on the emerging phenomenon of BM evolution. Twelve entrepreneurial firms in the Chinese Online Group Buying (OGB) industry were selected for their continuous and intensive development of BMs and their varying success rates in this highly competitive market. Two rounds of data collection were carried out between 2013 and 2014, generating 31 interviews with founders/co-founders and, in total, 5,034 pages of data. Following a three-stage research framework, the data analysis begins by mapping the BM evolution process of the twelve companies and classifying the changes in the BMs into innovations and imitations. The second stage focuses on the BM level, addressing BM evolution as a dynamic process by exploring how BM innovations and imitations unfold and interplay over time. The final stage focuses on the firm level, providing theoretical explanations of the effects of BM evolution patterns on firm performance. This research provides new insights into the nature of BM evolution by elaborating on the missing link between BM dynamics and firm performance. The findings identify four patterns of BM evolution that have different effects on a firm's short- and long-term performance. This research contributes to the BM literature by presenting what the BM evolution process actually looks like. Moreover, it takes a step towards a process theory of the interplay between BM innovations and imitations, which addresses the role of companies' actions and, more importantly, their reactions to competitors. Insights are also given into how entrepreneurial companies achieve and sustain value creation and capture by successfully combining the BM evolution patterns. Finally, the findings on BM evolution contribute to the strategic entrepreneurship literature by increasing our understanding of how companies compete in a more dynamic and complex environment. They reveal that achieving superior firm performance is more than a simple question of whether to innovate or imitate; rather, it is a matter of integrating innovation and imitation strategies over time. This study concludes with a discussion of the findings and their implications for theory and practice.

Relevance:

60.00%

Publisher:

Abstract:

Previous research with the ratio-bias task found larger response latencies for conflict trials, where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success), when compared to no-conflict trials, where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges for the debate between dual-process and single-process accounts, which are discussed.
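The C and H estimates come from standard process-dissociation algebra, which can be shown in a short sketch; the two probabilities below are invented for illustration, not the study's data.

```python
# Hedged sketch of the process-dissociation estimates for this paradigm
# (standard PDP algebra; the two probabilities below are invented for
# illustration, not the study's data).

def process_dissociation(p_heuristic_no_conflict, p_heuristic_conflict):
    """
    No-conflict trials: heuristic and analytic processes agree, so
      P(heuristic-consistent choice) = C + (1 - C) * H.
    Conflict trials: the heuristic choice opposes the analytic one, so
      P(heuristic-consistent choice) = (1 - C) * H.
    Subtracting the second from the first gives C; dividing recovers H.
    """
    C = p_heuristic_no_conflict - p_heuristic_conflict
    H = p_heuristic_conflict / (1 - C)
    return C, H

C, H = process_dissociation(0.90, 0.30)
print(f"C = {C:.2f}, H = {H:.2f}")
```

Because C and H are estimated separately, a manipulation can move one parameter while leaving the other unchanged, which is exactly the dissociation the abstract reports.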

Relevance:

60.00%

Publisher:

Abstract:

Background and aims: Advances in modern medicine have led to improved outcomes after stroke, yet an increased treatment burden has been placed on patients. Treatment burden is the workload of health care for people with chronic illness and the impact that this has on functioning and well-being. Those with comorbidities are likely to be particularly burdened. Excessive treatment burden can negatively affect outcomes. Individuals are likely to differ in their ability to manage health problems and follow treatments, defined as patient capacity. The aim of this thesis was to explore the experience of treatment burden for people who have had a stroke and the factors that influence patient capacity. Methods: There were four phases of research. 1) A systematic review of the qualitative literature that explored the experience of treatment burden for those with stroke. Data were analysed using framework synthesis, underpinned by Normalisation Process Theory (NPT). 2) A cross-sectional study of 1,424,378 participants >18 years, demographically representative of the Scottish population. Binary logistic regression was used to analyse the relationship between stroke and the presence of comorbidities and prescribed medications. 3) Interviews with twenty-nine individuals with stroke, fifteen analysed by framework analysis underpinned by NPT and fourteen by thematic analysis. The experience of treatment burden was explored in depth, along with factors that influence patient capacity. 4) Integration of the findings in order to create a conceptual model of treatment burden and patient capacity in stroke. Results: Phase 1) A taxonomy of treatment burden in stroke was created. The following broad areas of treatment burden were identified: making sense of stroke management and planning care; interacting with others, including health professionals, family and other stroke patients; enacting management strategies; and reflecting on management. Phase 2) 35,690 people (2.5%) had a diagnosis of stroke and, of the 39 comorbidities examined, 35 were significantly more common in those with stroke. The proportion of those with stroke who had >1 additional morbidities present (94.2%) was almost twice that of controls (48%) (odds ratio (OR), adjusted for age, gender and socioeconomic deprivation: 5.18; 95% confidence interval (CI): 4.95-5.43), and 34.5% had 4-6 comorbidities compared to 7.2% of controls (OR: 8.59; 95% CI: 8.17-9.04). In the stroke group, 12.6% of people had a record of >11 repeat prescriptions, compared to only 1.5% of the control group (OR, adjusted for age, gender, deprivation and morbidity count: 15.84; 95% CI: 14.86-16.88). Phase 3) The taxonomy of treatment burden from Phase 1 was verified and expanded. Additionally, treatment burdens were identified as arising from either the workload of healthcare or the endurance of care deficiencies. A taxonomy of patient capacity was created. Six factors were identified that influence patient capacity: personal attributes and skills; physical and cognitive abilities; support network; financial status; life workload; and environment. A conceptual model of treatment burden was created. Healthcare workload and the presence of care deficiencies can influence, and be influenced by, patient capacity. The quality and configuration of health and social care services influence healthcare workload, care deficiencies and patient capacity. Conclusions: This thesis provides important insights into the considerable treatment burden experienced by people who have had a stroke and the factors that affect their capacity to manage health. Multimorbidity and polypharmacy are common in those with stroke, and levels of these are high. The findings have important implications for the design of clinical guidelines and healthcare delivery: for example, co-ordination of care should be improved, shared decision-making enhanced, and patients better supported following discharge from hospital.
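The odds ratios reported in Phase 2 can be illustrated with a minimal (unadjusted) sketch of the underlying 2x2 calculation. The counts below are invented for illustration; the study's reported ORs were additionally adjusted for age, gender and deprivation via logistic regression, so they differ from a raw 2x2 estimate.

```python
# Minimal sketch of an (unadjusted) odds-ratio calculation of the kind
# reported in Phase 2. The 2x2 counts below are invented; the study's
# reported ORs were additionally adjusted for age, gender and
# deprivation via logistic regression.

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d) for a 2x2 exposure-outcome table."""
    return (exposed_cases / exposed_noncases) / \
           (unexposed_cases / unexposed_noncases)

# e.g. comorbidity present vs absent, stroke group vs controls:
or_unadjusted = odds_ratio(942, 58, 480, 520)
print(f"unadjusted OR: {or_unadjusted:.2f}")
```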