845 results for Dual-process Model
Abstract:
Objectives: Family caregivers play a vital role in maintaining the lives of individuals with advanced illness living in the community. However, the responsibility of caregiving for an end-of-life family member can have profound consequences on the psychological, physical and financial well-being of the caregiver. While the literature has identified caregiver stress or strain as a complex process with multiple contributing factors, few comprehensive studies exist. This study examined a wide range of theory-driven variables contributing to family caregiver stress. Method: Data variables from interviews with primary family caregivers were mapped onto the factors within the Stress Process Model theoretical framework. A hierarchical multiple linear regression analysis was used to determine the strongest predictors of caregiver strain as measured by a validated composite index, the Caregiver Strain Index. Results: The study included 132 family caregivers across south-central/western Ontario, Canada. About half of these caregivers experienced high strain, the extent of which was predicted by lower perceived program accessibility, lower functional social support, greater weekly amount of time caregivers committed to the care recipient, younger caregiver age and poorer caregiver self-perceived health. Conclusion: This study examined the influence of a multitude of factors in the Stress Process Model on family caregiver strain, finding stress to be a multidimensional construct. Perceived program accessibility was the strongest predictor of caregiver strain, more so than intensity of care, highlighting the importance of the availability of community resources to support the family caregiving role.
Abstract:
Due to the variability of wind power, it is imperative to forecast wind generation accurately and in a timely manner to enhance the flexibility and reliability of real-time power system operation and control. Special events such as ramps and spikes are hard to predict with traditional methods that use only recently measured data. In this paper, a new Gaussian Process model with hybrid training data, drawn from both the local time window and a historical dataset, is proposed and applied to make short-term predictions from 10 minutes to one hour ahead. A key idea is that historical data with similar patterns are properly selected and embedded in the Gaussian Process model to make the predictions. The results of the proposed algorithm are compared to those of the standard Gaussian Process model and the persistence model. It is shown that the proposed method reduces not only magnitude error but also phase error.
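The pattern-selection idea can be sketched in a few lines of code. The following is a minimal illustration under stated assumptions (the window length, Euclidean similarity measure, RBF-plus-noise kernel, and function names are all invented here), not the paper's algorithm:

```python
# Minimal sketch (not the paper's exact algorithm): a Gaussian Process
# trained on hybrid data -- recent measurements plus historical windows
# whose pattern resembles the current one. Window length, similarity
# measure, and kernel are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def hybrid_gp_forecast(history, recent, window=6, n_similar=20):
    """Forecast the next value of `recent` (e.g. a 10-min wind power series)."""
    # Build (pattern -> next value) pairs from the historical record.
    X_hist = np.array([history[i:i + window]
                       for i in range(len(history) - window)])
    y_hist = np.array([history[i + window]
                       for i in range(len(history) - window)])

    # Select historical windows most similar to the latest observed window.
    query = np.asarray(recent[-window:])
    dist = np.linalg.norm(X_hist - query, axis=1)
    idx = np.argsort(dist)[:n_similar]

    # Also include the most recent local windows as training data.
    X_loc = np.array([recent[i:i + window]
                      for i in range(len(recent) - window)])
    y_loc = np.array([recent[i + window]
                      for i in range(len(recent) - window)])

    X = np.vstack([X_hist[idx], X_loc])
    y = np.concatenate([y_hist[idx], y_loc])

    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                  normalize_y=True)
    gp.fit(X, y)
    mean, std = gp.predict(query.reshape(1, -1), return_std=True)
    return mean[0], std[0]
```

In this sketch the historical windows closest to the latest observed pattern supplement the local data, which is the mechanism the abstract credits for capturing special events such as ramps and spikes.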
Abstract:
The conversion of biomass for the production of liquid fuels can help reduce the greenhouse gas (GHG) emissions that are predominantly generated by the combustion of fossil fuels. Oxymethylene ethers (OMEs) are a series of liquid fuel additives that can be obtained from syngas, which is produced from the gasification of biomass. The blending of OMEs in conventional diesel fuel can reduce soot formation during combustion in a diesel engine. In this research, a process for the production of OMEs from woody biomass has been simulated. The process consists of several unit operations including biomass gasification, syngas cleanup, methanol production, and conversion of methanol to OMEs. The methodology involved the development of process models, the identification of the key process parameters affecting OME production based on the process model, and the development of an optimal process design for high OME yields. It was found that up to 9.02 tonnes day⁻¹ of OME3, OME4, and OME5 (which are suitable as diesel additives) can be produced from 277.3 tonnes day⁻¹ of wet woody biomass. Furthermore, an optimal combination of the parameters, which was generated from the developed model, can greatly enhance OME production and thermodynamic efficiency. This model can further be used in a techno-economic assessment of the whole biomass conversion chain to produce OMEs. The results of this study can be helpful for petroleum-based fuel producers and policy makers in determining the most attractive pathways of converting bio-resources into liquid fuels.
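As a quick orientation on the reported figures, the implied overall mass yield can be checked with simple arithmetic; only the two flow rates come from the abstract, and the moisture assumption below is illustrative, not taken from the study:

```python
# Back-of-the-envelope mass yield implied by the abstract's figures.
ome_out = 9.02          # tonnes/day of OME3-OME5
biomass_in = 277.3      # tonnes/day of wet woody biomass

wet_yield = ome_out / biomass_in
print(f"OME yield on a wet-biomass basis: {wet_yield:.1%}")   # ~3.3%

moisture = 0.40         # assumed moisture fraction (illustrative only)
dry_yield = ome_out / (biomass_in * (1 - moisture))
print(f"OME yield on an assumed dry basis: {dry_yield:.1%}")  # ~5.4%
```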
Abstract:
The contemporary world is crowded with large, interdisciplinary, complex systems made of other systems, personnel, hardware, software, information, processes, and facilities. The Systems Engineering (SE) field proposes an integrated, holistic approach to tackling these socio-technical systems, one that is crucial for taking proper account of their multifaceted nature and numerous interrelationships and that provides the means to enable their successful realization. Model-Based Systems Engineering (MBSE) is an emerging paradigm in the SE field and can be described as the formalized application of modelling principles, methods, languages, and tools to the entire lifecycle of those systems, enhancing communication and knowledge capture, shared understanding, design precision and integrity, and development traceability, while reducing development risks. This thesis is devoted to the application of the novel MBSE paradigm to the Urban Traffic & Environment domain. The proposed system, GUILTE (Guiding Urban Intelligent Traffic & Environment), addresses a challenging present-day problem on the agenda of world leaders, national governments, local authorities, research agencies, academia, and the general public. The main purposes of the system are to provide an integrated development framework for municipalities and to support the (short-term and real-time) operation of urban traffic through Intelligent Transportation Systems, highlighting two fundamental aspects: the evaluation of the related environmental impacts (in particular, air pollution and noise) and the dissemination of information to citizens, endorsing their involvement and participation. These objectives are related to the high-level, complex challenge of developing sustainable urban transportation networks. The development process of the GUILTE system is supported by a new methodology, LITHE (Agile Systems Modelling Engineering), which aims to lighten the complexity and burden of existing methodologies by emphasizing agile principles such as continuous communication, feedback, stakeholder involvement, short iterations, and rapid response. These principles are realized through a universal and intuitive SE process, the SIMILAR process model (redefined in light of modern international standards), a lean MBSE method, and a coherent System Model developed with the benchmark graphical modeling languages SysML and OPDs/OPL. The main contributions of the work are, in essence, models, and can be summarized as: a revised process model for the SE field, an agile methodology for MBSE development environments, a graphical tool to support the proposed methodology, and a System Model for the GUILTE system. The comprehensive literature reviews provided for the main scientific field of this research (SE/MBSE) and for the application domain (Traffic & Environment) can also be seen as a relevant contribution.
Abstract:
years 8 months) and 24 older (M = 7 years 4 months) children. A Monitoring Process Model (MPM) was developed and tested in order to ascertain at which component process of the MPM age differences would emerge. The MPM had four components: (1) assessment; (2) evaluation; (3) planning; and (4) behavioural control. The MPM was assessed directly using a referential communication task in which the children were asked to make a series of five Lego buildings (a baseline condition and one building for each MPM component). Children listened to instructions from one experimenter while a second experimenter in the room (a confederate) interjected varying levels of verbal feedback in order to assist the children and control the component of the MPM. This design allowed us to determine at which "stage" of processing children would most likely have difficulty monitoring themselves in this social-cognitive task. Developmental differences were observed for the evaluation, planning and behavioural control components, suggesting that older children were able to be more successful with the more explicit metacomponents. Interestingly, however, there was no age difference in terms of Lego task success in the baseline condition, suggesting that without the intervention of the confederate younger children monitored the task about as well as older children. This pattern of results indicates that the younger children were disrupted by the feedback rather than helped. On the other hand, the older children were able to incorporate the feedback offered by the confederate into a plan of action. Another aim of this study was to assess similar processing components to those investigated by the MPM Lego task in a more naturalistic observation. Together, the use of the Lego Task (a social-cognitive task) and the naturalistic social interaction allowed for the appraisal of cross-domain continuities and discontinuities in monitoring behaviours. In this vein, analyses were undertaken in order to ascertain whether or not successful performance in the MPM Lego Task would predict cross-domain competence in the more naturalistic social interchange. Indeed, success in the two latter components of the MPM (planning and behavioural control) was related to overall competence in the naturalistic task. However, this cross-domain prediction was not evident for all levels of the naturalistic interchange, suggesting that the nature of the feedback a child receives is an important determinant of response competency. Individual difference measures reflecting the children's general cognitive capacity (Working Memory and Digit Span) and verbal ability (vocabulary) were also taken in an effort to account for more variance in the prediction of task success. However, these individual difference measures did not serve to enhance the prediction of task performance in either the Lego Task or the naturalistic task. Similarly, parental responses to questionnaires pertaining to their child's temperament and social experience also failed to increase prediction of task performance. On-line measures of the children's engagement, positive affect and anxiety also failed to predict competence ratings.
Abstract:
We present a statistical image-based shape + structure model for Bayesian visual hull reconstruction and 3D structure inference. The 3D shape of a class of objects is represented by sets of contours from silhouette views simultaneously observed from multiple calibrated cameras. Bayesian reconstructions of new shapes are then estimated using a prior density constructed with a mixture model and probabilistic principal components analysis. We show how the use of a class-specific prior in a visual hull reconstruction can reduce the effect of segmentation errors from the silhouette extraction process. The proposed method is applied to a data set of pedestrian images, and improvements in the approximate 3D models under various noise conditions are shown. We further augment the shape model to incorporate structural features of interest; unknown structural parameters for a novel set of contours are then inferred via the Bayesian reconstruction process. Model matching and parameter inference are done entirely in the image domain and require no explicit 3D construction. Our shape model enables accurate estimation of structure despite segmentation errors or missing views in the input silhouettes, and works even with only a single input view. Using a data set of thousands of pedestrian images generated from a synthetic model, we can accurately infer the 3D locations of 19 joints on the body based on observed silhouette contours from real images.
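The prior-driven reconstruction idea can be sketched compactly: learn a low-dimensional subspace over concatenated multi-view contour vectors and project a noisy observation onto it. The sketch below uses a single PCA subspace in place of the paper's mixture of probabilistic principal components analyzers; the data layout and function names are assumptions, not the paper's implementation.

```python
# Minimal sketch of a class-specific shape prior: denoise a multi-view
# silhouette-contour vector by projecting it onto a subspace learned
# from training examples. The paper uses a mixture of probabilistic
# PCA models; a single PCA subspace is used here for brevity.
import numpy as np
from sklearn.decomposition import PCA

def fit_shape_prior(training_shapes, n_components=20):
    """training_shapes: (n_examples, d) array, each row concatenating the
    contour points from all calibrated views of one example (assumed layout)."""
    prior = PCA(n_components=n_components)
    prior.fit(training_shapes)
    return prior

def reconstruct(prior, noisy_shape):
    """Project a noisy contour vector onto the learned shape subspace."""
    coeffs = prior.transform(np.asarray(noisy_shape).reshape(1, -1))
    return prior.inverse_transform(coeffs).ravel()
```

Projecting onto the learned subspace discards contour variation that the training class never exhibits, which is how a class-specific prior suppresses segmentation errors or fills in missing views.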
Abstract:
Although the concept of wisdom has been widely studied by experts in fields such as philosophy, religion and psychology, it still faces limitations regarding its definition and assessment. For this reason, the present work aims to formulate a definition of the concept of wisdom that supports a proposal for assessing the concept as a competency in managers. To this end, a qualitative documentary analysis was carried out. Various texts on the history, definitions and methodologies for assessing both wisdom and competencies were analyzed, differentiating wisdom from other constructs and examining the difference between general and managerial competencies, in order subsequently to define wisdom as a managerial competency. As a result of this analysis, a test prototype called SAPIENS-O was developed, through which the aim is to assess wisdom as a managerial competency. The scope of the instrument includes the possibility of measuring wisdom as a competency in managers, the possibility of offering a new perspective on the theoretical and empirical difficulties surrounding wisdom, and the possibility of facilitating the study of wisdom in real settings, more specifically in organizational settings.
Abstract:
Purpose – The purpose of this paper is to propose a process model for knowledge transfer, drawing on theories of knowledge communication and knowledge translation. Design/methodology/approach – Most of what is put forward in this paper is based on a research project titled "Procurement for innovation and knowledge transfer (ProFIK)". The project is funded by a UK government research council – the Engineering and Physical Sciences Research Council (EPSRC). The discussion is mainly grounded in a thorough review of the literature carried out as part of the research project. Findings – The process model developed in this paper builds on the theory of knowledge transfer and the theory of communication. Knowledge transfer, per se, is not a mere handover of knowledge; it involves different stages of knowledge transformation. Depending on the context of knowledge transfer, it can also be influenced by many factors, some positive and some negative. The developed model of knowledge transfer attempts to encapsulate all these issues in order to create a holistic framework. Originality/value – The paper brings together some of the significant theories and findings relating to knowledge transfer, making it an original and valuable contribution.
Abstract:
The paper analyzes how impact investors select their portfolio companies in Latin America and which criteria are evaluated in the process. Since practically no research on this topic has been conducted to date, and since the selection process model applied in venture capital is not dissimilar, that approach was adopted. The results reveal that impact investors source and evaluate deals in a way similar to venture capitalists, but that some criteria are adjusted and others added in order to reflect the dual objective of impact investing. Impact investors may source deals passively, but they prefer to seek out social ventures proactively: personal contacts, access to networks and industry events are crucial in this context. Impact investors considering an investment search across the whole of Latin America for honest, trustworthy social entrepreneurs committed to social impact; eligible social ventures must be profitable with potential for scalability; the product must have a social impact, that is, create value for the individual consumer and for the community at large; market size and market growth are crucial external factors; and the desired business characteristics depend on the investor's risk attitude and on the prospects of a successful exit, in both financial and social terms. Impact investors are also willing to provide non-financial support before an investment if a social venture shows high potential to achieve its dual objective.
Abstract:
Nowadays, more than half of software development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proved to be a well-defined method for specifying, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is independent of the implementation platform and therefore lacks the ability to save and propagate changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the aforementioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods that enable the automatic conversion of DEMO ontological models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems supporting BPMN.
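The thesis targets a full, standards-based conversion; as a rough illustration of the kind of mapping involved, the toy sketch below turns a single DEMO transaction (request, promise, execute, state, accept) into a skeletal BPMN process. The element choices and naming are assumptions for illustration, not the thesis's conversion rules.

```python
# Toy illustration of mapping one DEMO transaction onto a skeletal BPMN
# process: the standard DEMO coordination steps become a simple chain of
# BPMN tasks. NOT the thesis's conversion rules.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

def demo_transaction_to_bpmn(transaction_id, product, initiator, executor):
    ET.register_namespace("", BPMN_NS)
    definitions = ET.Element(f"{{{BPMN_NS}}}definitions")
    process = ET.SubElement(definitions, f"{{{BPMN_NS}}}process",
                            id=f"proc_{transaction_id}", name=product)

    steps = [("request", initiator), ("promise", executor),
             ("execute", executor), ("state", executor),
             ("accept", initiator)]
    ids = []
    for step, actor in steps:
        task_id = f"{transaction_id}_{step}"
        ET.SubElement(process, f"{{{BPMN_NS}}}task",
                      id=task_id, name=f"{actor}: {step} {product}")
        ids.append(task_id)

    # Chain the tasks with sequence flows in DEMO's standard order.
    for src, tgt in zip(ids, ids[1:]):
        ET.SubElement(process, f"{{{BPMN_NS}}}sequenceFlow",
                      id=f"flow_{src}_{tgt}", sourceRef=src, targetRef=tgt)

    return ET.tostring(definitions, encoding="unicode")

print(demo_transaction_to_bpmn("T01", "membership", "customer", "clerk"))
```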
Abstract:
We explore here the issue of duality versus spectrum equivalence in dual theories generated through the master action approach. Specifically we examine a generalized self-dual (GSD) model where a Maxwell term is added to the self-dual model. A gauge embedding procedure applied to the GSD model leads to a Maxwell-Chern-Simons (MCS) theory with higher derivatives. We show here that the latter contains a ghost mode contrary to the original GSD model. By figuring out the origin of the ghost we are able to suggest a new master action which interpolates between the local GSD model and a nonlocal MCS model. Those models share the same spectrum and are ghost free. Furthermore, there is a dual map between both theories at classical level which survives quantum correlation functions up to contact terms. The remarks made here may be relevant for other applications of the master action approach.
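For context, the models being compared are standard in (2+1)-dimensional field theory. In one common convention (signs, normalizations, and the Maxwell coefficient a below are illustrative and may differ from the paper's), the self-dual (SD), generalized self-dual (GSD) and Maxwell-Chern-Simons (MCS) Lagrangians read:

```latex
% One common convention for the (2+1)D models named in the abstract;
% signs, normalizations, and the Maxwell coefficient a are illustrative.
\mathcal{L}_{\mathrm{SD}}  = \tfrac{m^{2}}{2}\, f_{\mu}f^{\mu}
    - \tfrac{m}{2}\,\epsilon^{\mu\nu\rho} f_{\mu}\partial_{\nu}f_{\rho},
\qquad
\mathcal{L}_{\mathrm{GSD}} = \mathcal{L}_{\mathrm{SD}}
    - \tfrac{a}{4}\, F_{\mu\nu}(f)\,F^{\mu\nu}(f),
\qquad
\mathcal{L}_{\mathrm{MCS}} = -\tfrac{1}{4}\, F_{\mu\nu}(A)\,F^{\mu\nu}(A)
    + \tfrac{m}{2}\,\epsilon^{\mu\nu\rho} A_{\mu}\partial_{\nu}A_{\rho},
\quad\text{with } F_{\mu\nu}(f) = \partial_{\mu}f_{\nu} - \partial_{\nu}f_{\mu}.
```

The gauge embedding of the GSD model then yields an MCS-type theory with higher derivatives, which is where the ghost mode discussed in the abstract appears.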
Abstract:
This paper presents a new model for representing the electrode filaments of hot-cathode fluorescent lamps during preheating processes based on the injection of currents with constant root-mean-square (rms) values. The main improvement obtained with this model is the prediction of the R_h/R_c ratio during the preheating process as a function of the preheating time and of the rms current injected into the electrodes. Using the proposed model, it is possible to obtain an estimate of the time interval and the current that should be provided by the electronic ballast in order to ensure a suitable preheating process. This estimate of time and current can be used as input data in the design of electronic ballasts with programmed lamp start, permitting the prediction of the R_h/R_c ratio during the initial steps of the design (theoretical analysis and digital simulation). Therefore, the use of the proposed model reduces the need for several empirical adjustments to the prototype in order to set the operation of electronic ballasts during the preheating process. This reduces the time and costs associated with the overall design procedure of electronic ballasts.
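To make the role of the R_h/R_c prediction concrete, the sketch below integrates a generic lumped heat balance for a filament driven by a constant rms current. This is not the paper's model; the heat-balance form and every parameter value are illustrative assumptions, chosen only to show how a ratio estimate can follow from preheat time and current.

```python
# Generic first-order thermal sketch of filament heating under a constant
# rms preheat current -- NOT the paper's model. All parameter values and
# the lumped heat-balance form are illustrative assumptions.
import numpy as np

def rh_rc_ratio(i_rms, t_preheat, r_cold=2.0, alpha=4.5e-3,
                c_th=5.0e-4, k_loss=4.2e-3, t_amb=25.0, dt=1e-3):
    """Integrate C*dT/dt = I^2*R(T) - k*(T - T_amb), with the filament
    resistance R(T) = R_cold*(1 + alpha*(T - T_amb)); return R_hot/R_cold."""
    temp = t_amb
    for _ in np.arange(0.0, t_preheat, dt):
        r = r_cold * (1.0 + alpha * (temp - t_amb))
        temp += dt * (i_rms**2 * r - k_loss * (temp - t_amb)) / c_th
    r_hot = r_cold * (1.0 + alpha * (temp - t_amb))
    return r_hot / r_cold

# Example: ratio reached after 0.5 s and 1.0 s of preheating at 0.6 A rms.
print(rh_rc_ratio(0.6, 0.5), rh_rc_ratio(0.6, 1.0))
```

A calibrated model of this kind is what lets a ballast designer read off, for a candidate rms current, how long the preheat phase must last to reach a target R_h/R_c ratio before ignition.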
Abstract:
In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what in an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret and understand different concepts associated with different stages of e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated into final solutions based on the proposed unified framework. Also, as an application of the framework, we demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, called BP3 (Business Process Patterns Perspective), uses a question-answer interface to capture different business requirements from designers. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements, by means of a set of production rules, to complete the inter-process communication among these patterns.
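The question-answer-driven selection of process patterns can be pictured with a toy rule engine. The questions, rules, and pattern names below are invented for illustration and are not taken from the BP3 methodology.

```python
# Toy sketch of selecting pre-defined process patterns from designer
# answers via production rules. Questions, rules, and pattern names are
# invented for illustration; they are not BP3's actual rules.
RULES = [
    (lambda a: a["payment"] == "before_delivery", "Prepayment pattern"),
    (lambda a: a["payment"] == "after_delivery",  "Invoicing pattern"),
    (lambda a: a["delivery"] == "digital",        "Online-delivery pattern"),
    (lambda a: a["delivery"] == "physical",       "Logistics pattern"),
]

def select_patterns(answers):
    """Return the process patterns whose rule conditions hold."""
    return [pattern for condition, pattern in RULES if condition(answers)]

answers = {"payment": "before_delivery", "delivery": "digital"}
print(select_patterns(answers))
# -> ['Prepayment pattern', 'Online-delivery pattern']
```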
Abstract:
The purpose of this doctoral thesis is to prove existence for a mutually catalytic random walk with infinite branching rate on countably many sites. The process is defined as a weak limit of an approximating family of processes. An approximating process is constructed by adding jumps to a deterministic migration on an equidistant time grid. As the law of the jumps we choose the invariant probability measure of the mutually catalytic random walk with finite branching rate in the recurrent regime. This model was introduced by Dawson and Perkins (1998), and this thesis relies heavily on their work. Due to the properties of this invariant distribution, which is in fact the exit distribution of planar Brownian motion from the first quadrant, it is possible to establish a martingale problem for the weak limit of any convergent sequence of approximating processes. We prove a duality relation for the solution to this martingale problem, which goes back to Mytnik (1996) in the case of finite rate branching, and this duality gives rise to weak uniqueness for the solution to the martingale problem. Using standard arguments we show that this solution is in fact a Feller process and has the strong Markov property. For the case of only one site, we prove that the model we have constructed is the limit of finite rate mutually catalytic branching processes as the branching rate approaches infinity. Therefore, it seems natural to refer to the above model as an infinite rate branching process. However, a result for convergence on infinitely many sites remains open.
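For orientation, the finite-rate mutually catalytic branching model of Dawson and Perkins, whose infinite-branching-rate limit is the object studied here, is usually written (in one common notation, which is illustrative rather than the thesis's own) as the system of stochastic differential equations

```latex
% Finite-rate mutually catalytic branching (Dawson-Perkins), in one common
% notation; a(i,j) is the migration kernel, gamma > 0 the branching rate.
dX_{t}(i) = \sum_{j} a(i,j)\bigl(X_{t}(j)-X_{t}(i)\bigr)\,dt
            + \sqrt{\gamma\, X_{t}(i)\,Y_{t}(i)}\; dB^{1}_{t}(i),
\qquad
dY_{t}(i) = \sum_{j} a(i,j)\bigl(Y_{t}(j)-Y_{t}(i)\bigr)\,dt
            + \sqrt{\gamma\, X_{t}(i)\,Y_{t}(i)}\; dB^{2}_{t}(i),
```

where the B^1(i), B^2(i) are independent Brownian motions indexed by the countable site space; the thesis concerns the regime in which the branching rate γ tends to infinity.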