391 results for STOCHASTIC PROCESSES
Abstract:
This study investigated personal and social processes of adjustment at different stages of illness for individuals with brain tumour. A purposive sample of 18 participants with mixed tumour types (9 benign and 9 malignant) and 15 family caregivers was recruited from a neurosurgical practice and a brain tumour support service. In-depth semi-structured interviews focused on participants’ perceptions of their adjustment, including personal appraisals, coping and social support since their brain tumour diagnosis. Interview transcripts were analysed thematically using open, axial and selective coding techniques. The primary theme that emerged from the analysis entailed “key sense making appraisals”, which was closely related to the following secondary themes: (1) Interactions with those in the healthcare system, (2) reactions and support from the personal support network, and (3) a diversity of coping efforts. Adjustment to brain tumour involved a series of appraisals about the illness that were influenced by interactions with those in the healthcare system, reactions and support from people in their support network, and personal coping efforts. Overall, the findings indicate that adjustment to brain tumour is highly individualistic; however, some common personal and social processes are evident in how people make sense of and adapt to the illness over time. A preliminary framework of adjustment based on the present findings and its clinical relevance are discussed. In particular, it is important for health professionals to seek to understand and support individuals’ sense-making processes following diagnosis of brain tumour.
Abstract:
Ocean processes are dynamic and complex events that occur on multiple spatial and temporal scales. To obtain a synoptic view of such events, ocean scientists focus on the collection of long-term time series data sets. Generally, these time series measurements are continually provided in real or near-real time by fixed sensors, e.g., buoys and moorings. In recent years, mobile sensor platforms, e.g., Autonomous Underwater Vehicles, have increasingly been used to enable dynamic acquisition of time series data sets. However, these mobile assets are not utilized to their full capabilities, generally only performing repeated transects or user-defined patrolling loops. Here, we provide an extension to repeated patrolling of a designated area. Our algorithms provide the ability to adapt a standard mission to increase information gain in areas of greater scientific interest. By implementing a velocity control optimization along the predefined path, we are able to increase or decrease spatiotemporal sampling resolution to satisfy the sampling requirements necessary to properly resolve an oceanic phenomenon. We present a path planning algorithm that defines a sampling path, which is optimized for repeatability. This is followed by the derivation of a velocity controller that defines how the vehicle traverses the given path. The application of these tools is motivated by an ongoing research effort to understand the oceanic region off the coast of Los Angeles, California. The computed paths are implemented with the computed velocities onto autonomous vehicles for data collection during sea trials. Results from this data collection are presented and compared for analysis of the proposed technique.
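The velocity-control idea in this abstract can be sketched simply: with a sensor sampling at a fixed rate, traversal speed directly sets along-track sampling resolution. The following is a minimal illustration (the segment interest weights, speed limits, and sensor rate are assumptions for the example, not values from the paper):

```python
# Hedged sketch of velocity control along a fixed path: slow down where
# scientific interest is high, so a fixed-rate sensor samples more densely.
# Interest weights and speed bounds below are illustrative only.

def segment_speeds(interest, v_min=0.5, v_max=2.0):
    """Map per-segment interest in [0, 1] to a traversal speed (m/s):
    higher interest -> lower speed -> denser spatial sampling."""
    return [v_max - w * (v_max - v_min) for w in interest]

def spatial_resolution(speeds, sensor_hz=1.0):
    """Along-track distance between consecutive samples per segment (m)."""
    return [v / sensor_hz for v in speeds]

speeds = segment_speeds([0.0, 0.5, 1.0])   # low, medium, high interest
res = spatial_resolution(speeds)
```

The highest-interest segment is traversed slowest, yielding the finest sample spacing; a real controller would additionally respect mission-duration and vehicle dynamics constraints.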
Abstract:
Path planning and trajectory design for autonomous underwater vehicles (AUVs) is of great importance to the oceanographic research community because automated data collection is becoming more prevalent. Intelligent planning is required to maneuver a vehicle to high-valued locations to perform data collection. In this paper, we present algorithms that determine paths for AUVs to track evolving features of interest in the ocean by considering the output of predictive ocean models. While traversing the computed path, the vehicle provides near-real-time, in situ measurements back to the model, with the intent to increase the skill of future predictions in the local region. The results presented here extend preliminary developments of the path planning portion of an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. This extension is the incorporation of multiple vehicles to track the centroid and the boundary of the extent of a feature of interest. Similar algorithms to those presented here are under development to consider additional locations for multiple types of features. The primary focus here is on algorithm development utilizing model predictions to assist in solving the motion planning problem of steering an AUV to high-valued locations, with respect to the data desired. We discuss the design technique to generate the paths, present simulation results and provide experimental data from field deployments for tracking dynamic features by use of an AUV in the Southern California coastal ocean.
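A minimal sketch of the feature-tracking step described above: threshold a gridded model prediction to get the feature's extent, then compute its centroid as a waypoint for one vehicle (a second vehicle could patrol the boundary cells). The field values and threshold here are made up for illustration:

```python
# Illustrative sketch, not the paper's algorithm: locate a feature in a
# gridded ocean-model field by thresholding, and take its centroid as an
# AUV waypoint. The scalar field and threshold below are fabricated.

def feature_cells(field, thresh):
    """Grid cells (row, col) whose predicted value exceeds the threshold."""
    return [(i, j) for i, row in enumerate(field)
            for j, v in enumerate(row) if v > thresh]

def centroid(cells):
    """Mean grid position of the feature's cells."""
    n = len(cells)
    return (sum(i for i, _ in cells) / n, sum(j for _, j in cells) / n)

field = [[0.0, 0.2, 0.1],
         [0.1, 0.9, 0.8],
         [0.0, 0.7, 0.6]]
cells = feature_cells(field, 0.5)
cx, cy = centroid(cells)
```

In practice the model field is re-queried as new in situ measurements are assimilated, so the waypoint evolves with the feature.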
Abstract:
Children often have difficulties in learning spatial representations. This study investigated the effect of four different instructional formats on learning outcomes and strategies used when dealing with spatial tasks such as assembly procedures. It was hypothesised that instructional material that imposed the least extraneous cognitive load would facilitate enhanced learning. Forty secondary students were presented with four types of instruction: orthographic drawing, isometric drawing, physical model, and isometric drawing and physical model together. The findings provide evidence to suggest that working from physical models caused the least extraneous cognitive load compared to the isometric and orthographic groups. The model group took less time, had more correctly completed models, required fewer extra looks, spent less time studying the instruction and made fewer errors. Problem decomposition, forward working and attending to information in the foreground of the graphical representation strategies were analysed.
Abstract:
Manufacturing organisations invest heavily in Business Process Improvement initiatives to become more competitive in the growing global market. This paper presents a Rapid Improvement Workshop (RIW) framework which companies can use to identify the critical factors regulating the diffusion of business process improvement in their company. The framework can then be used to address how process improvement can be efficiently implemented. We use the results from case studies at Caterpillar India. The paper identifies the critical factors that contribute to the successful implementation of process improvement programs in manufacturing organisations. We further identify certain technological and cultural barriers to the implementation of process improvement programs and how Indian manufacturing companies can overcome these barriers to attain competitive advantage in the global markets.
Abstract:
This paper presents a robust stochastic model for the incorporation of natural features within data fusion algorithms. The representation combines Isomap, a non-linear manifold learning algorithm, with Expectation Maximization, a statistical learning scheme. The representation is computed offline and results in a non-linear, non-Gaussian likelihood model relating visual observations such as color and texture to the underlying visual states. The likelihood model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The likelihoods are expressed as a Gaussian Mixture Model so as to permit convenient integration within existing nonlinear filtering algorithms. The resulting compactness of the representation is especially suitable to decentralized sensor networks. Real visual data consisting of natural imagery acquired from an Unmanned Aerial Vehicle is used to demonstrate the versatility of the feature representation.
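The online step described above reduces to evaluating a Gaussian Mixture Model likelihood at an observed feature value. A minimal one-dimensional sketch follows; the mixture parameters are illustrative stand-ins, not weights learned by the Isomap/EM pipeline the abstract describes:

```python
import math

# Minimal sketch of evaluating a GMM likelihood online for an observed
# visual feature (e.g. a colour value). The two-component parameters
# below are assumptions for illustration, not learned offline as in the paper.

def gauss_pdf(x, mean, var):
    """Univariate Gaussian density N(x; mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def gmm_likelihood(x, weights, means, vars_):
    """p(x) = sum_k w_k * N(x; mu_k, sigma_k^2)."""
    return sum(w * gauss_pdf(x, m, v)
               for w, m, v in zip(weights, means, vars_))

# Likelihood of an observed 1-D feature under a two-component mixture.
p = gmm_likelihood(0.5, weights=[0.6, 0.4], means=[0.4, 0.9], vars_=[0.04, 0.02])
```

Because the likelihood is a weighted sum of Gaussians, it plugs directly into Gaussian-mixture-based nonlinear filters, which is the compactness argument the abstract makes for decentralized sensor networks.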
Abstract:
Process models in organizational collections are typically modeled by the same team and using the same conventions. As such, these models share many characteristic features like size range, type and frequency of errors. In most cases merely small samples of these collections are available due to, e.g., the sensitive information they contain. Because of their sizes, these samples may not provide an accurate representation of the characteristics of the originating collection. This paper deals with the problem of constructing collections of process models, in the form of Petri nets, from small samples of a collection for accurate estimations of the characteristics of this collection. Given a small sample of process models drawn from a real-life collection, we mine a set of generation parameters that we use to generate arbitrarily large collections that feature the same characteristics as the original collection. In this way we can estimate the characteristics of the original collection on the generated collections. We extensively evaluate the quality of our technique on various sample datasets drawn from both research and industry.
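The mine-then-generate idea can be sketched on a single characteristic, model size: fit simple parameters from the small sample, then draw an arbitrarily large synthetic collection with matching statistics. A real Petri-net generator would control many more parameters (connectivity, error types); the sample sizes below are invented for the example:

```python
import random
import statistics

# Sketch of generating a large synthetic collection from a small sample.
# Only one generation parameter (model size, assumed roughly Gaussian)
# is mined here; the sample values are fabricated for illustration.

def mine_parameters(sample_sizes):
    """Estimate generation parameters from the available small sample."""
    return {"mu": statistics.mean(sample_sizes),
            "sigma": statistics.stdev(sample_sizes)}

def generate_collection(params, n, seed=0):
    """Draw n synthetic model sizes sharing the sample's characteristics."""
    rng = random.Random(seed)
    return [max(1, round(rng.gauss(params["mu"], params["sigma"])))
            for _ in range(n)]

sample = [12, 15, 11, 14, 13]          # sizes of the small available sample
params = mine_parameters(sample)
synthetic = generate_collection(params, 1000)
```

Statistics computed on the large synthetic collection then serve as estimates for the inaccessible original collection.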
Abstract:
Gay community media functions as a system with three nodes, in which the flows of information and capital theoretically benefit all parties: the gay community gains a sense of cohesion and citizenship through media; the gay media outlets profit from advertisers’ capital; and advertisers recoup their investments in lucrative ‘pink dollar’ revenue. But if a necessary corollary of all communication systems is error or noise, where—and what—are the errors in this system? In this paper we argue that the ‘error’ in the gay media system is Queerness, and that the gay media system ejects (in a process of Kristevan abjection) these Queer identities in order to function successfully. We examine the ways in which Queer identities are excluded from representation in such media through a discourse and content analysis of The Sydney Star Observer (Australia’s largest gay and lesbian paper). First, we analyse the way Queer bodies are excluded from the discourses that construct and reinforce both the ideal gay male body and the notions of homosexual essence required for that body to be meaningful. We then argue that abject Queerness returns in the SSO’s discourses of public health through the conspicuous absence of the AIDS-inflicted body (which we read as the epitome of the abject Queer), since this absence paradoxically conjures up a trace of that which the system tries to expel. We conclude by arguing that because the ‘Queer error’ is integral to the SSO, gay community media should practise a politics of Queer inclusion rather than exclusion.
Abstract:
Competitive markets are increasingly driving new initiatives for shorter cycle times resulting in increased overlapping of project phases. This, in turn, necessitates improving the interfaces between the different phases to be overlapped (integrated), thus allowing transfer of processes, information and knowledge from one individual or team to another. This transfer between phases, within and between projects, is one of the basic challenges to the philosophy of project management. To make the process transfer more transparent with minimal loss of momentum and project knowledge, this paper draws upon Total Quality Management (TQM) and Business Process Re-engineering (BPR) philosophies to develop a Best Practice Model for managing project phase integration. The paper presents the rationale behind the model development and outlines its two key parts; (1) Strategic Framework and (2) Implementation Plan. Key components of both the Strategic Framework and the Implementation Plan are presented and discussed.
Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic sequences and amino acid sequences. The results of these investigations inspire further research in biological fields in return. These biological sequences, which may be considered as multiscale sequences, have some specific features which require more refined methods to characterise. This project aims to study some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster some unknown proteins, and classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions. The first step in this development is to classify proteins and predict their families. This motivates us to study some unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies, and link them to simulate some unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis aims to explore the relationship of proteins based on a layered comparison with their components. Many methods are based on homology of proteins because the resemblance at the protein sequence level normally indicates the similarity of functions and structures. However, some proteins may have similar functions with low sequential identity. We consider protein sequences at a detailed level to investigate the problem of comparison of proteins.
The comparison is based on the empirical mode decomposition (EMD), and protein sequences are decomposed into intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detection of functional relationships of proteins. The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expressions has become a focus in genomic analysis, researchers have tried to understand the mechanisms of the yeast genome for many years. How cells control gene expressions still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene. We modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated with the information from five possible transcriptional regulators. Some regulators of these target genes are verified with the related references. With these results, we construct a transcriptional regulatory network for the genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
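The abstract does not give the exact form of its stochastic differential equation, so the following is only a generic sketch of the idea: expression x of a target gene driven by a regulator activity r(t), simulated with the Euler-Maruyama scheme. All rate constants and the regulator signal are assumptions for illustration, not values estimated from yeast data:

```python
import math
import random

# Euler-Maruyama sketch of an SDE for a target gene's expression x:
#   dx = (k * r(t) - d * x) dt + sigma dW
# k (production), d (degradation), sigma (noise) and the periodic
# regulator r(t) are illustrative assumptions, not fitted parameters.

def simulate(k=2.0, d=1.0, sigma=0.05, dt=0.01, steps=2000, seed=1):
    rng = random.Random(seed)
    x, path = 0.0, []
    for t in range(steps):
        r = 0.5 * (1 + math.sin(0.01 * t))      # periodic regulator activity
        dw = rng.gauss(0.0, math.sqrt(dt))      # Brownian increment
        x += (k * r - d * x) * dt + sigma * dw
        path.append(x)
    return path

path = simulate()
```

Fitting k against observed expression profiles, for each of several candidate regulators, is the kind of estimation step from which a regulatory network could then be assembled.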
Abstract:
Focusing on the conditions that an optimization problem may comply with, so-called convergence conditions are proposed, and subsequently a stochastic optimization algorithm, named the DSZ algorithm, is presented to deal with both unconstrained and constrained optimization. The principle is discussed in the theoretical model of the DSZ algorithm, from which we derive the practical model. The efficiency of the practical model is demonstrated by comparison with similar algorithms such as Enhanced Simulated Annealing (ESA), Monte Carlo Simulated Annealing (MCS), Sniffer Global Optimization (SGO), Directed Tabu Search (DTS), and the Genetic Algorithm (GA), using a set of well-known unconstrained and constrained optimization test cases. Further attention is given to strategies for optimizing high-dimensional unconstrained problems with the DSZ algorithm.
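The abstract does not spell out the DSZ algorithm itself, so as an illustration here is a generic simulated-annealing-style stochastic optimizer of the kind it is benchmarked against (e.g. ESA, MCS), shown on the one-dimensional sphere function. All parameters are illustrative:

```python
import math
import random

# Generic simulated-annealing sketch (NOT the DSZ algorithm, whose details
# the abstract omits): random perturbations, always accept improvements,
# accept worsening moves with Boltzmann probability under a cooling schedule.

def anneal(f, x0, t0=1.0, cooling=0.995, steps=3000, seed=0):
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)          # random perturbation
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc                    # accept the move
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                            # cool the temperature
    return best_x, best_f

best_x, best_f = anneal(lambda x: x * x, x0=5.0)
```

Constrained test cases are typically handled in such schemes by adding a penalty term to f for constraint violations.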
Abstract:
Real-world business processes rely on the availability of scarce, shared resources, both human and non-human. Current workflow management systems support allocation of individual human resources to tasks but lack support for the full range of resource types used in practice, and the inevitable constraints on their availability and applicability. Based on past experience with resource-intensive workflow applications, we derive generic requirements for a workflow system which can use its knowledge of resource capabilities and availability to help create feasible task schedules. We then define the necessary architecture for implementing such a system and demonstrate its practicality through a proof-of-concept implementation. This work is presented in the context of a real-life surgical care process observed in a number of German hospitals.
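At the core of such a resource-aware workflow system is a feasibility check: a task can only be scheduled in a slot if some resource has the required capability and is free then. A minimal sketch, with invented resources and capabilities loosely echoing the surgical-care setting:

```python
# Hedged sketch of the capability-and-availability check a resource-aware
# workflow scheduler needs. Resource names, capabilities, and busy hours
# are fabricated for illustration.

resources = {
    "surgeon_a": {"capabilities": {"surgery"}, "busy": set()},
    "or_room_1": {"capabilities": {"operating_room"}, "busy": {9}},
}

def feasible(task_capability, hour):
    """Resources that could perform the task at the given hour."""
    return [name for name, r in resources.items()
            if task_capability in r["capabilities"] and hour not in r["busy"]]

def allocate(task_capability, hour):
    """Greedily commit the first feasible resource, or None if none exists."""
    candidates = feasible(task_capability, hour)
    if not candidates:
        return None
    chosen = candidates[0]
    resources[chosen]["busy"].add(hour)          # resource is now committed
    return chosen
```

A full implementation would also model applicability constraints (e.g. a resource usable only for certain case types) and search over slots rather than allocate greedily.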
Abstract:
This study examined the effect that temporal order within the entrepreneurial discovery-exploitation process has on the outcomes of venture creation. Consistent with sequential theories of discovery-exploitation, the general flow of venture creation was found to be directed from discovery toward exploitation in a random sample of nascent ventures. However, venture creation attempts which specifically follow this sequence derive poor outcomes. Moreover, simultaneous discovery-exploitation was the most prevalent temporal order observed, and venture attempts that proceed in this manner more likely become operational. These findings suggest that venture creation is a multi-scale phenomenon that is at once directional in time, and simultaneously driven by symbiotically coupled discovery and exploitation.
Abstract:
The structure and dynamics of a modern business environment are very hard to model using traditional methods. Such complexity raises challenges to effective business analysis and improvement. The importance of applying business process simulation to analyze and improve business activities has been widely recognized. However, one remaining challenge is the development of approaches to human resource behavior simulation. To address this problem, we describe a novel simulation approach where intelligent agents are used to simulate human resources by performing allocated work from a workflow management system. The behavior of the intelligent agents is driven by a state transition mechanism called a Hierarchical Task Network (HTN). We demonstrate and validate our simulator via a medical treatment process case study. Analysis of the simulation results shows that the behavior driven by the HTN is consistent with the design of the workflow model. We believe these preliminary results support the development of more sophisticated agent-based human resource simulation systems.
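The HTN mechanism mentioned above boils down to recursively decomposing compound tasks into subtasks until only primitive actions remain, which the agent then executes in order. A minimal sketch, with made-up medical-treatment tasks (not the paper's case study):

```python
# Minimal HTN decomposition sketch: compound tasks expand via methods into
# ordered subtasks; anything without a method is a primitive action.
# Task names are fabricated, loosely echoing a medical treatment process.

methods = {  # compound task -> ordered list of subtasks
    "treat_patient": ["triage", "examine", "discharge"],
    "examine": ["take_history", "run_tests"],
}

def decompose(task):
    """Depth-first expansion of a task into a flat plan of primitive actions."""
    if task not in methods:                     # primitive action: execute as-is
        return [task]
    plan = []
    for sub in methods[task]:
        plan.extend(decompose(sub))
    return plan

plan = decompose("treat_patient")
```

In an agent-based simulator, each primitive action in the plan would be matched against work items offered by the workflow management system, driving the agent's state transitions.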