839 results for "Exactly solvable model of an asset market"


Relevance: 100.00%

Abstract:

Recently, the automotive market's interest in hybrid vehicles has increased due to more restrictive pollutant-emissions legislation and the need to reduce fossil fuel consumption, since this solution allows a consistent improvement of overall vehicle efficiency. The term hybridization refers to the energy flow in the powertrain of a vehicle: a standard vehicle usually has only one energy source and one energy tank, whereas a hybrid vehicle has at least two energy sources. In most cases, the prime mover is an internal combustion engine (ICE), while the auxiliary energy source can be mechanical, electrical, pneumatic or hydraulic. The control unit of a hybrid vehicle is expected to use the ICE in its high-efficiency working zones and to shut it down when convenient, while using the electric motor-generator (EMG) at partial loads and for fast torque response during transients. However, the battery state of charge may limit such a strategy, which is why, in most cases, energy management strategies are based on state-of-charge (SOC) control. Several studies have been conducted on this topic and many different approaches have been illustrated. The purpose of this dissertation is to develop an online (usable on-board) control strategy in which the operating modes are defined using an instantaneous optimization method that minimizes the equivalent fuel consumption of a hybrid electric vehicle. The equivalent fuel consumption is calculated by taking into account the total energy used by the hybrid powertrain during the propulsion phases. The first chapter presents the characteristics of hybrid vehicles. The second chapter describes the global model, with a particular focus on the energy management strategies usable for supervisory control of such a powertrain. The third chapter compares the performance of the implemented controller on an NEDC cycle with that obtained with the original control strategy.
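
A minimal sketch of the instantaneous-optimization idea, in the spirit of an equivalent consumption minimization strategy (ECMS): at each instant, a torque split is chosen that minimizes the engine fuel rate plus the battery power converted to an equivalent fuel rate. All maps, constants and the SOC-correction term below are illustrative assumptions, not the controller developed in the dissertation.

```python
# Hedged ECMS-style sketch: pick the engine/motor torque split that minimises
# the instantaneous equivalent fuel rate. Maps and constants are assumptions.
import numpy as np

LHV = 42.5e6        # fuel lower heating value [J/kg] (assumed)
S_EQ = 2.5          # base equivalence factor for electric energy (assumed)

def engine_fuel_rate(torque, speed):
    """Toy engine fuel-rate map [kg/s]; a real controller would use measured maps."""
    power = torque * speed                                   # [W], speed in rad/s
    efficiency = np.maximum(0.36 * np.exp(-((torque - 120.0) / 150.0) ** 2), 1e-3)
    return np.where(torque > 0, power / (efficiency * LHV), 0.0)

def ecms_split(torque_request, speed, soc, soc_target=0.6):
    """Return (engine torque, motor torque) minimising the equivalent fuel rate."""
    engine_torque = np.linspace(0.0, torque_request, 50)     # candidate splits
    motor_torque = torque_request - engine_torque
    battery_power = motor_torque * speed / 0.9               # assumed 90% electric-path efficiency
    s = S_EQ * (1.0 + 3.0 * (soc_target - soc))              # simple SOC-sustaining correction
    equivalent_rate = engine_fuel_rate(engine_torque, speed) + s * battery_power / LHV
    best = int(np.argmin(equivalent_rate))
    return engine_torque[best], motor_torque[best]

print(ecms_split(torque_request=200.0, speed=150.0, soc=0.55))
```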

Relevance: 100.00%

Abstract:

The purpose of this thesis was to examine the mediating effects of job-related negative emotions on the relationship between workplace aggression and outcomes. Additionally, the moderating effects of workplace social support and intensity of workplace aggression are considered. A total of 321 working individuals participated through an online survey. The results of this thesis suggest that job-related negative emotions mediate the relationship between workplace aggression and outcomes, with both full and partial mediation supported. Workplace social support was found to buffer the relationship between workplace aggression and outcomes, regardless of the source of aggression (supervisor or co-worker) or the source of the social support. Finally, intensity of aggression was found to be a strong moderator of the relationship between workplace aggression and outcomes.

Relevance: 100.00%

Abstract:

Increased activity of the noradrenergic system in the amygdala has been suggested to contribute to the hyperarousal symptoms associated with post-traumatic stress disorder (PTSD). However, only two studies have examined the content of noradrenaline or its metabolites in the amygdala of rats previously exposed to traumatic stress, and their results were inconsistent. The aim of this study was to investigate the effects of an inescapable foot shock (IFS) procedure (1) on reactivity to novelty in an open field (as an index of hyperarousal), and (2) on noradrenaline release in the amygdala during acute stress. To test the role of noradrenaline in the amygdala, we also investigated the effects of microinjections of propranolol, a β-adrenoreceptor antagonist, and clenbuterol, a β-adrenoreceptor agonist, into the amygdala of IFS and control animals. Finally, we evaluated the mRNA expression levels of β-adrenoreceptors (β1 and β2) in the amygdala, the hippocampus and the prefrontal cortex. Male Wistar rats (3 months old) were stereotaxically implanted with bilateral guide cannulae. After recovering from surgery, animals were exposed to IFS (10 shocks, 0.86 mA, 6 seconds per shock), and seven days later either microdialysis or microinjections were performed in the amygdala. Animals exposed to IFS showed reduced locomotion compared to non-shocked animals during the first 5 minutes in the open field. In the amygdala, IFS animals showed an enhanced stress-induced increase of noradrenaline compared to control animals. Bilateral microinjections of propranolol (0.5 μg) into the amygdala one hour before testing in the open field normalized the decreased locomotion observed in IFS animals. On the other hand, bilateral microinjections of clenbuterol (30 ng) into the amygdala of control animals did not change the exploratory activity induced by novelty in the open field. IFS modified the mRNA expression of β1 and β2 adrenoreceptors in the prefrontal cortex and the hippocampus. These results suggest that increased noradrenergic activity in the amygdala contributes to the expression of hyperarousal in an animal model of PTSD.

Relevance: 100.00%

Abstract:

Assessment processes are essential to guarantee quality and continuous improvement of software in healthcare, as they measure software attributes over the software lifecycle, verify the degree of alignment between the software and its objectives, and identify unpredicted events. This article analyses the use of an assessment model based on software metrics for three healthcare information systems of a public hospital that provides secondary and tertiary care in the region of Ribeirão Preto. Compliance with the metrics was investigated using questionnaires in guided interviews with the system analysts responsible for the applications. The outcomes indicate that most of the procedures specified in the model can be adopted to assess the systems that serve the organization, particularly for the attributes of compatibility, reliability, safety, portability and usability.
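
As an illustration only, a small sketch of aggregating questionnaire answers into a per-attribute compliance percentage is given below; the attribute names and pass/fail encoding are assumptions and do not reproduce the article's metric model.

```python
# Illustrative aggregation of guided-interview questionnaire answers into a
# per-attribute compliance percentage. Attribute names and answers are assumed.
from collections import defaultdict

def compliance_by_attribute(answers):
    """answers: iterable of (attribute, complied) pairs, complied being True/False."""
    totals, passed = defaultdict(int), defaultdict(int)
    for attribute, complied in answers:
        totals[attribute] += 1
        passed[attribute] += int(complied)
    return {a: 100.0 * passed[a] / totals[a] for a in totals}

example = [("reliability", True), ("reliability", False),
           ("usability", True), ("portability", True)]
print(compliance_by_attribute(example))
# {'reliability': 50.0, 'usability': 100.0, 'portability': 100.0}
```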

Relevance: 100.00%

Abstract:

Inflammatory bowel disease (IBD) is a chronic inflammation that affects the gastrointestinal tract (GIT). One of the best ways to study the immunological mechanisms involved in the disease is the T cell transfer model of colitis. In this model, immunodeficient mice (RAG−/− recipients) are reconstituted with naive CD4+ T cells from healthy wild-type hosts. This model allows examination of the earliest immunological events leading to disease and chronic inflammation, when the gut inflammation perpetuates but does not depend on a defined antigen. To study the potential role of antigen-presenting cells (APCs) in the disease process, it is helpful to have an antigen-driven disease model in which a defined commensal-derived antigen leads to colitis. An antigen-driven colitis model has hence been developed. In this model, OT-II CD4+ T cells, which recognize only specific epitopes of the OVA protein, are transferred into RAG−/− hosts challenged with CFP-OVA-expressing E. coli. This model allows the examination of interactions between APCs and T cells in the lamina propria.

Relevance: 100.00%

Abstract:

The use of multi-material structures in industry, especially the automotive industry, is increasing. To overcome the difficulties in joining these structures, adhesives have several benefits over traditional joining methods. Accurate simulation of the entire fracture process, including the adhesive layer, is therefore crucial. In this paper, material parameters of a previously developed meso-mechanical finite element (FE) model of a thin adhesive layer are optimized using the Strength Pareto Evolutionary Algorithm (SPEA2). The objective functions are defined as the error between experimental data and simulation data. The experimental data come from previously performed experiments in which an adhesive layer was loaded in monotonically increasing peel and shear. The two objective functions depend on 9 model parameters (decision variables) in total and are evaluated by running two FE simulations, one loading the adhesive layer in peel and the other in shear. The original study converted the two objective functions into a single function, which resulted in one optimal solution. In this study, however, a Pareto front is obtained by employing the SPEA2 algorithm. Thus, more insight into the material model, objective functions, optimal solutions and decision space is acquired using the Pareto front. We compare the results and show good agreement with the experimental data.
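
A minimal, self-contained sketch of the non-dominated (Pareto) filtering step over sampled parameter vectors follows; it is not an implementation of SPEA2 itself, and the two objective functions are toy stand-ins for the peel and shear FE-simulation errors.

```python
# Hedged sketch: extract the Pareto front from sampled parameter vectors.
# Toy objective functions stand in for the peel and shear error objectives.
import numpy as np

rng = np.random.default_rng(0)

def objectives(theta):
    """Toy stand-ins for the peel-error and shear-error objective functions."""
    peel_error = np.sum((theta - 0.3) ** 2)
    shear_error = np.sum((theta - 0.7) ** 2)
    return np.array([peel_error, shear_error])

def pareto_front(points):
    """Return indices of non-dominated points (minimisation in every objective)."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep

population = rng.uniform(0.0, 1.0, size=(200, 9))   # 9 decision variables, as in the paper
scores = np.array([objectives(t) for t in population])
front = pareto_front(scores)
print(f"{len(front)} non-dominated trade-offs between peel and shear error")
```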

Relevance: 100.00%

Abstract:

This is a study comparing the Dubovik aerosol optical depth (AOD) retrievals from AEROCAN (AERONET) stations with AOD estimates from simulations provided by a chemical transport model (GEOS-Chem: Goddard Earth Observing System Chemistry). The AOD products associated with the Dubovik product are divided into total, fine-mode and coarse-mode components. The retrieval period is from January 2009 to January 2013 for 5 Arctic stations (Barrow, Alaska; Resolute Bay, Nunavut; 0PAL and PEARL (Eureka), Nunavut; and Thule, Greenland). We also employed AOD retrievals from 10 other mid-latitude Canadian stations for comparison with the Arctic stations. The results of our investigation were submitted to Atmosphere-Ocean. To briefly summarize those results, the model generally, but not always, tended to underestimate the (monthly) averaged AOD and its components. We found that the subdivision into fine- and coarse-mode components could provide unique signatures of particular events (Asian dust) and that the means of characterizing the statistics (log-normal versus normal frequency distributions) was common to both the retrievals and the model.
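
A minimal sketch of the frequency-distribution comparison (normal versus log-normal) on a synthetic AOD sample, using SciPy maximum-likelihood fits; the data and parameter values are assumptions, not the study's retrievals.

```python
# Compare normal vs log-normal fits to a synthetic AOD sample; a higher
# log-likelihood for the log-normal fit illustrates the kind of evidence used
# when characterising AOD statistics as log-normal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
aod = rng.lognormal(mean=np.log(0.08), sigma=0.6, size=500)   # synthetic fine-mode-like AOD

normal_ll = np.sum(stats.norm.logpdf(aod, *stats.norm.fit(aod)))
lognorm_ll = np.sum(stats.lognorm.logpdf(aod, *stats.lognorm.fit(aod, floc=0)))

print(f"log-likelihood  normal: {normal_ll:.1f}   log-normal: {lognorm_ll:.1f}")
```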

Relevance: 100.00%

Abstract:

The issue of ‘rigour vs. relevance’ in IS research has generated an intense, heated debate for over a decade. However, only a limited number of contributions can be identified on how to increase the relevance of IS research without compromising its rigour. Based on a lifecycle view of IS research, we propose the notion of ‘reality checks’ in order to review IS research outcomes in the light of actual industry demands. We assume that five barriers impede the efficient transfer of IS research outcomes: lack of awareness, lack of understandability, lack of relevance, lack of timeliness, and lack of applicability. In seeking to understand the effect of these barriers on the transfer of mature IS research into practice, we used focus groups. We chose DeLone and McLean’s IS success model as our stimulus because it is one of the more widely researched areas of IS.

Relevance: 100.00%

Abstract:

Two distinct maintenance-data-models are studied: a government Enterprise Resource Planning (ERP) maintenance-data-model, and the Software Engineering Industries (SEI) maintenance-data-model. The objectives are to: (i) determine whether the SEI maintenance-data-model is sufficient in the context of ERP (by comparing it with an ERP case), (ii) identify whether the ERP maintenance-data-model in this study has adequately captured the essential and common maintenance attributes (by comparing it with the SEI model), and (iii) propose a new ERP maintenance-data-model as necessary. Our findings suggest that: (i) there are variations to the SEI model in an ERP context, and (ii) there is room for improvement in our ERP case’s maintenance-data-model. Thus, a new ERP maintenance-data-model capturing the fundamental ERP maintenance attributes is proposed. This model is imperative for: (i) enhancing the reporting and visibility of maintenance activities, (ii) monitoring maintenance problems, resolutions and performance, and (iii) helping maintenance managers to better manage maintenance activities and make well-informed maintenance decisions.
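
For illustration, a minimal sketch of a maintenance-request record follows; the field names are assumptions meant to reflect commonly reported maintenance attributes, not the fields of the maintenance-data-model proposed in the paper.

```python
# Illustrative maintenance-request record; field names are assumptions, not the
# attributes of the ERP maintenance-data-model proposed in the paper.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MaintenanceRequest:
    request_id: str
    module: str                      # affected ERP module, e.g. "FI" or "HR"
    category: str                    # corrective / adaptive / perfective / preventive
    reported_on: date
    priority: str = "medium"
    resolution: Optional[str] = None
    closed_on: Optional[date] = None
    effort_hours: float = 0.0

    def is_open(self) -> bool:
        return self.closed_on is None

requests = [MaintenanceRequest("M-001", "FI", "corrective", date(2024, 1, 5), "high")]
print([r.request_id for r in requests if r.is_open()])   # simple backlog visibility
```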

Relevance: 100.00%

Abstract:

OBJECTIVE The aim of this research project was to obtain an understanding of the barriers to and facilitators of providing palliative care in neonatal nursing. This article reports the first phase of this research: to develop and administer an instrument to measure the attitudes of neonatal nurses to palliative care. METHODS The instrument developed for this research (the Neonatal Palliative Care Attitude Scale) underwent face and content validity testing with an expert panel and was pilot tested to establish temporal stability. It was then administered to a population sample of 1285 neonatal nurses in Australian NICUs, with a response rate of 50% (N = 645). Exploratory factor-analysis techniques were conducted to identify scales and subscales of the instrument. RESULTS Data-reduction techniques using principal components analysis were used. Using the criterion of eigenvalues greater than 1, the items in the Neonatal Palliative Care Attitude Scale extracted 6 factors, which accounted for 48.1% of the variance among the items. By further examining the questions within each factor and the Cronbach’s α of items loading on each factor, factors were accepted or rejected. This resulted in acceptance of 3 factors indicating the barriers to and facilitators of palliative care practice. The constructs represented by these factors related to (1) the organization in which the nurse practices, (2) the resources available to support a palliative model of care, and (3) the technological imperatives and parental demands. CONCLUSIONS The subscales identified by this analysis contained items that measured both barriers to and facilitators of palliative care practice in neonatal nursing. While preliminary reliability of the instrument was established using exploratory factor-analysis techniques, further testing of this instrument with different samples of neonatal nurses is necessary using a confirmatory factor-analysis approach.
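
A minimal sketch of the two statistics mentioned above, the eigenvalue-greater-than-one retention rule and Cronbach's α, applied to synthetic Likert-style responses (not the Neonatal Palliative Care Attitude Scale data):

```python
# Eigenvalue > 1 (Kaiser) retention rule and Cronbach's alpha on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
responses = rng.integers(1, 6, size=(300, 12)).astype(float)   # 300 respondents, 12 items (synthetic)

# Components with eigenvalues > 1 from the item correlation matrix
eigenvalues = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))[::-1]
n_factors = int(np.sum(eigenvalues > 1.0))
print(f"factors retained by the eigenvalue > 1 rule: {n_factors}")

def cronbach_alpha(items):
    """items: respondents x items matrix for one candidate subscale."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

print(f"alpha of first 4 items: {cronbach_alpha(responses[:, :4]):.2f}")
```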

Relevance: 100.00%

Abstract:

Historically, asset management focused primarily on the reliability and maintainability of assets; organisations have since accepted the notion that a much larger array of processes governs the life and use of an asset. With this, asset management’s new paradigm seeks a holistic, multi-disciplinary approach to the management of physical assets. A growing number of organisations now seek to develop integrated asset management frameworks and bodies of knowledge. This research seeks to complement the existing outputs of these organisations through the development of an asset management ontology. Ontologies define a common vocabulary for both researchers and practitioners who need to share information in a chosen domain. A by-product of ontology development is the realisation of a process architecture, of which there is also no evidence in the published literature. To develop the ontology and the subsequent asset management process architecture, a standard knowledge-engineering methodology is followed. This involves text analysis, definition and classification of terms, and visualisation through an appropriate tool (in this case, the Protégé application was used). The result of this research is a first attempt at developing an asset management ontology and process architecture.
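
As a rough illustration of capturing ontology terms programmatically, the sketch below defines a few asset-management classes with the owlready2 library rather than Protégé (which the research used); the class names and IRI are assumptions, not the ontology developed here.

```python
# Hedged sketch: a tiny OWL class hierarchy and one object property with owlready2.
from owlready2 import get_ontology, Thing, ObjectProperty

onto = get_ontology("http://example.org/asset-management.owl")   # illustrative IRI

with onto:
    class Asset(Thing): pass
    class PhysicalAsset(Asset): pass
    class MaintenanceProcess(Thing): pass
    class appliesTo(ObjectProperty):          # links a process to the asset it manages
        domain = [MaintenanceProcess]
        range = [Asset]

onto.save(file="asset-management.owl", format="rdfxml")
```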

Relevance: 100.00%

Abstract:

There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews of project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors contributing to project complexity may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issue of ‘measuring’ or assessing complexity. A research agenda is proposed to further the investigation of the phenomena reported in this initial study.

Relevance: 100.00%

Abstract:

The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds are used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer comprised of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments, for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed from bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady-state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ±10% for all parameters and still reasonably stable for changes up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site, where water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, and that the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady-state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short-term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it is necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by pump testing to investigate hydrogeological properties of the aquifer.
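
For illustration, a minimal two-layer steady-state MODFLOW model can be scripted with the flopy library, as sketched below; the study itself used GMS 6.5 with MODFLOW and PEST, and every grid dimension, conductivity and boundary value here is an assumed placeholder.

```python
# Hedged sketch of a two-layer steady-state MODFLOW model scripted with flopy.
# All grid dimensions, conductivities and boundary values are assumed placeholders.
import flopy

m = flopy.modflow.Modflow("josephville_sketch", exe_name="mf2005")

# Layer 1: fine-grained semi-confining unit; layer 2: coarse sand/gravel basal aquifer.
flopy.modflow.ModflowDis(m, nlay=2, nrow=40, ncol=40, delr=25.0, delc=25.0,
                         top=20.0, botm=[9.0, 5.0])              # elevations in metres (assumed)
flopy.modflow.ModflowBas(m, ibound=1, strt=15.0)                 # starting heads (assumed)
flopy.modflow.ModflowLpf(m, hk=[0.05, 30.0], vka=[0.005, 3.0])   # K per layer, m/day (assumed)
flopy.modflow.ModflowRch(m, rech=2.0e-4)                         # recharge, m/day (assumed)

# River cells in the basal aquifer along one model edge (layer index 1 = second layer).
river_cells = [[1, r, 0, 14.0, 50.0, 12.0] for r in range(40)]   # lay, row, col, stage, cond, rbot
flopy.modflow.ModflowRiv(m, stress_period_data={0: river_cells})

flopy.modflow.ModflowPcg(m)
flopy.modflow.ModflowOc(m)
m.write_input()          # m.run_model() would additionally require an MF2005 executable
```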

Relevance: 100.00%

Abstract:

Background. The objective is to estimate the cost-effectiveness of an intervention that reduces hospital readmission among older people at high risk. A cost-effectiveness model to estimate the costs and health benefits of the intervention was implemented. Methodology/Principal Findings. The model used data from a randomised controlled trial conducted in an Australian tertiary metropolitan hospital. Participants were acute medical admissions aged >65 years with at least one risk factor for readmission: multiple comorbidities, impaired functionality, aged >75 years, recent multiple admissions, poor social support, or history of depression. The intervention was a comprehensive nursing and physiotherapy assessment and an individually tailored program of exercise strategies and nurse home visits with telephone follow-up, commencing in hospital and continuing after discharge for 24 weeks. The change to cost outcomes, including the costs of implementing the intervention and all subsequent use of health care services, and the change to health benefits, represented by quality-adjusted life years, were estimated for the intervention compared to existing practice. The mean changes to total costs and quality-adjusted life years for an average individual participating in the intervention over 24 weeks were: cost savings of $333 (95% Bayesian credible interval $-1,932 to $1,282) and 0.118 extra quality-adjusted life years (95% Bayesian credible interval 0.1 to 0.136). The mean net monetary benefit per individual for the intervention group compared to the usual care condition was $7,907 (95% Bayesian credible interval $5,959 to $9,995) for the 24-week period. Conclusions/Significance. The estimation model that describes this intervention predicts cost savings and improved health outcomes. A decision to remain with existing practices causes unnecessary costs and reduced health. Decision makers should consider adopting this program for elderly hospitalised patients.
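
A minimal sketch of the net-monetary-benefit arithmetic using the means reported above; the willingness-to-pay threshold is an assumption chosen for illustration, since the abstract reports the resulting benefit rather than the threshold used.

```python
# Net monetary benefit from the reported mean cost change and QALY gain.
delta_cost = -333.0          # mean change in total cost per participant ($), i.e. a saving
delta_qaly = 0.118           # mean extra quality-adjusted life years over 24 weeks
wtp_per_qaly = 64_000.0      # assumed willingness-to-pay per QALY (illustrative)

# Value the QALY gain in dollars, then subtract the cost change.
nmb = wtp_per_qaly * delta_qaly - delta_cost
print(f"net monetary benefit per participant: ${nmb:,.0f}")
# With these assumed inputs the NMB is about $7,885, close to the $7,907 reported.
```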

Relevance: 100.00%

Abstract:

In Australia, airports have emerged as important sub-regional activity centres and now pose challenges for both airport operation and planning in the surrounding urban and regional environment. The changing nature of airports in their metropolitan context and the emergence of new pressures and problems require the introduction of a fresh conceptual framework to assist a better understanding of these complex roles and spatial interactions. The approach draws upon the meta-concept of interfaces of an ‘airport metropolis’ as an organising device consisting of four main domains: economic development, land use, infrastructure, and governance. The paper uses the framework to further discuss airport and regional interactions and highlights the use of sustainability criteria to operationalise the model. The approach aims to move research and practice beyond the traditionally compartmentalised analysis of airport issues and policy-making by highlighting interdependencies between airports and regions.