949 results for event based


Relevance: 40.00%

Abstract:

Discrete Event Simulation (DES) is a very popular simulation technique in Operational Research. Recently, another technique, Agent Based Simulation (ABS), has emerged. Although there is extensive literature on both DES and ABS, we have found less work that explores the capabilities of both in tackling human behaviour issues. To understand the gap between these two simulation techniques, our aim is therefore to examine how DES and ABS models differ from the real-world phenomenon when modelling and simulating human behaviour. To achieve this aim, we have carried out a case study at a department store. The DES and ABS models are compared on the same problem domain, concerning management policy in a fitting room. The behaviour of staff while working and the satisfaction of customers are modelled in both, in order to understand the behaviour of the two models.
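The fitting-room scenario described above can be sketched as a minimal discrete-event simulation. The sketch below is a hypothetical illustration only (the arrival and try-on rates, and the number of rooms, are assumptions, not the study's data): each room's next-free time sits in a heap, and each arriving customer waits for the earliest-free room.

```python
import heapq
import random

def simulate_fitting_room(n_rooms=4, n_customers=200, seed=1):
    """Minimal DES sketch: customers arrive, wait for a free fitting
    room, try clothes on, and leave. Returns the mean waiting time.
    All rates are illustrative assumptions."""
    random.seed(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_customers):
        t += random.expovariate(1 / 2.0)       # mean inter-arrival: 2 min
        arrivals.append(t)
    # each heap entry is the time at which one room becomes available
    free_at = [0.0] * n_rooms
    heapq.heapify(free_at)
    waits = []
    for arrival in arrivals:
        room_free = heapq.heappop(free_at)     # earliest-free room
        start = max(arrival, room_free)        # wait if no room is free yet
        waits.append(start - arrival)
        service = random.expovariate(1 / 6.0)  # mean try-on time: 6 min
        heapq.heappush(free_at, start + service)
    return sum(waits) / len(waits)

print(round(simulate_fitting_room(), 2))
```

A management-policy question such as "how many rooms are enough?" then reduces to comparing the mean wait across values of `n_rooms`.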

Relevance: 40.00%

Abstract:

In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human centric complex systems. In this paper we focus on human reactive behaviour, as it can be implemented in both modelling approaches using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of the department's staff and customers. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall, we have found that for our case study both discrete event simulation and agent based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.

Relevance: 40.00%

Abstract:

In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and an Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is better suited to modelling human reactive behaviour in the retail sector. To study the output accuracy of both models, we carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment used a large UK department store as a case study, in which we had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we found that both simulation models were a good representation of the real system when modelling human reactive behaviour.
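Where a DES model is driven by a global event list, the ABS alternative compared in these abstracts models each customer as an individual agent with its own reactive rule. The time-stepped sketch below is hypothetical (the balking rule, probabilities, and patience range are assumptions, not the study's model): each customer agent walks away if the queue exceeds its personal patience.

```python
import random

class Customer:
    """Agent with a simple reactive rule: balks (walks away) if the
    queue is longer than its individual patience. Hypothetical rule."""
    def __init__(self, patience):
        self.patience = patience

def run_abs(policy_rooms=3, steps=500, seed=7):
    """Time-stepped ABS sketch of the fitting-room operation.
    Returns (customers served, customers who balked)."""
    random.seed(seed)
    queue, rooms_busy, served, balked = [], 0, 0, 0
    for _ in range(steps):
        if random.random() < 0.5:                   # one arrival this tick
            c = Customer(patience=random.randint(2, 8))
            if len(queue) > c.patience:
                balked += 1                         # reactive behaviour
            else:
                queue.append(c)
        if rooms_busy and random.random() < 0.3:    # a room frees up
            rooms_busy -= 1
        while queue and rooms_busy < policy_rooms:  # admit from the queue
            queue.pop(0)
            rooms_busy += 1
            served += 1
    return served, balked

served, balked = run_abs()
print(served, balked)
```

The count of balked customers is the kind of emergent, behaviour-dependent output that distinguishes the ABS view from the aggregate DES view.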

Relevance: 40.00%

Abstract:

In the metal industry, and more specifically in forging, scrap material is a crucial issue, and reducing it is an important goal: not only would this help companies to be more environmentally friendly and more sustainable, but it would also reduce energy use and lower costs. At the same time, Industry 4.0 techniques and advances in Artificial Intelligence (AI), especially in the field of Deep Reinforcement Learning (DRL), may play an important role in achieving this objective. This document presents the thesis work, a contribution to the SmartForge project, performed during a semester abroad at Karlstad University (Sweden). The project aims to solve the aforementioned problem for a business case of the company Bharat Forge Kilsta, located in Karlskoga (Sweden). The thesis work includes the design and subsequent development of an event-driven architecture with microservices, to support the processing of data coming from sensors set up in the company's industrial plant, and finally the implementation of an algorithm based on DRL techniques to control the electrical power used in the plant.
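The event-driven shape of such an architecture can be sketched with an in-memory queue standing in for a message broker, and a threshold rule standing in for the DRL policy. Everything here is a hypothetical illustration: the sensor names, the power readings, and the control rule are assumptions, not details of the SmartForge system.

```python
from queue import Queue

def make_power_controller(limit_kw):
    """Sketch of one microservice in an event-driven pipeline: it
    consumes sensor readings and emits control events when measured
    power exceeds a limit. A rule-based stand-in for a DRL policy."""
    def handle(event):
        if event["power_kw"] > limit_kw:
            return {"type": "reduce_power",
                    "by_kw": event["power_kw"] - limit_kw}
        return {"type": "hold"}
    return handle

bus = Queue()   # in-memory stand-in for a message broker
for reading in (950, 1020, 880):
    bus.put({"sensor": "furnace-1", "power_kw": reading})

controller = make_power_controller(limit_kw=1000)
actions = []
while not bus.empty():
    actions.append(controller(bus.get())["type"])
print(actions)   # ['hold', 'reduce_power', 'hold']
```

Swapping the queue for a real broker and the rule for a trained policy leaves the event-driven structure unchanged, which is the point of the architecture.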

Relevance: 30.00%

Abstract:

We report on the event structure and double helicity asymmetry (A_LL) of jet production in longitudinally polarized p + p collisions at √s = 200 GeV. Photons and charged particles were measured by the PHENIX experiment at midrapidity |η| < 0.35 with the requirement of a high-momentum (> 2 GeV/c) photon in the event. Event structure, such as multiplicity, p_T density and thrust in the PHENIX acceptance, was measured and compared with the results from the PYTHIA event generator and the GEANT detector simulation. The shape of jets and the underlying event were well reproduced at this collision energy. For the measurement of jet A_LL, photons and charged particles were clustered with a seed-cone algorithm to obtain the cluster p_T sum (p_T^reco). The effect of detector response and of the underlying event on p_T^reco was evaluated with the simulation. The production rate of reconstructed jets is satisfactorily reproduced by the next-to-leading-order perturbative quantum chromodynamics jet production cross section. For 4 < p_T^reco < 12 GeV/c with an average beam polarization of <P> = 49% we measured A_LL = -0.0014 ± 0.0037(stat) in the lowest p_T^reco bin (4-5 GeV/c) and -0.0181 ± 0.0282(stat) in the highest p_T^reco bin (10-12 GeV/c), with a beam polarization scale error of 9.4% and a p_T scale error of 10%. Jets in the measured p_T^reco range arise primarily from hard-scattered gluons with momentum fraction 0.02 < x < 0.3 according to PYTHIA. The measured A_LL is compared with predictions that assume various ΔG(x) distributions based on the Glück-Reya-Stratmann-Vogelsang parameterization. The present result imposes the limit -1.1 < ∫_0.02^0.3 dx ΔG(x, μ² = 1 GeV²) < 0.4 at 95% confidence level, or ∫_0.02^0.3 dx ΔG(x, μ² = 1 GeV²) < 0.5 at 99% confidence level.

Relevance: 30.00%

Abstract:

Chagas disease is still a major public health problem in Latin America. Its causative agent, Trypanosoma cruzi, can be typed into three major groups, T. cruzi I, T. cruzi II and hybrids. These groups each have specific genetic characteristics and epidemiological distributions. Several highly virulent strains are found in the hybrid group; their origin is still a matter of debate. The null hypothesis is that the hybrids are of polyphyletic origin, evolving independently from various hybridization events. The alternative hypothesis is that all extant hybrid strains originated from a single hybridization event. We sequenced both alleles of genes encoding EF-1 alpha, actin and SSU rDNA of 26 T. cruzi strains and DHFR-TS and TR of 12 strains. This information was used for network genealogy analysis and Bayesian phylogenies. We found T. cruzi I and T. cruzi II to be monophyletic and that all hybrids had different combinations of T. cruzi I and T. cruzi II haplotypes plus hybrid-specific haplotypes. Bootstrap values (networks) and posterior probabilities (Bayesian phylogenies) of clades supporting the monophyly of hybrids were far below the 95% confidence interval, indicating that the hybrid group is polyphyletic. We hypothesize that T. cruzi I and T. cruzi II are two different species and that the hybrids are extant representatives of independent events of genome hybridization, which sporadically have sufficient fitness to impact on the epidemiology of Chagas disease.

Relevance: 30.00%

Abstract:

The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely, the level of work-in-process, the intercell moves and the total machinery investment. A genetic algorithm performs a search in the design space in order to approximate the Pareto optimal set. The values of the objectives for each candidate solution in a population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by a particular solution. The potential of this approach is evaluated via its application to an illustrative example and a case from the relevant literature. The results obtained are analyzed and reviewed, and it is concluded that this approach is capable of generating a set of alternative manufacturing cell configurations considering the optimization of multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
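The core of such a multiobjective search is the Pareto-dominance relation used to rank candidate cell configurations. The sketch below is hypothetical: the `evaluate` function is a toy cost model standing in for the discrete-event simulation run described in the abstract, and the encoding (a machine-to-cell assignment list) is an assumption for illustration.

```python
import random

def dominates(a, b):
    """a Pareto-dominates b if it is no worse in every objective and
    strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evaluate(assignment):
    """Toy stand-in for a discrete-event simulation run: returns
    (work-in-process, intercell moves, machinery cost) for a given
    machine-to-cell assignment. Hypothetical cost model."""
    cells = set(assignment)
    wip = sum(assignment.count(c) ** 2 for c in cells)
    moves = sum(1 for i in range(len(assignment) - 1)
                if assignment[i] != assignment[i + 1])
    return (wip, moves, 10 * len(cells))

def pareto_front(population):
    """Keep the candidates not dominated by any other candidate."""
    return [p for p in population
            if not any(dominates(evaluate(q), evaluate(p))
                       for q in population if q is not p)]

random.seed(0)
pop = [[random.randint(0, 2) for _ in range(6)] for _ in range(30)]
front = pareto_front(pop)
print(len(front), "non-dominated cell configurations")
```

In the full approach, a genetic algorithm would repeatedly vary the population and retain such a non-dominated set, with each `evaluate` call replaced by an automatically generated simulation model.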

Relevance: 30.00%

Abstract:

Distributed control systems consist of sensors, actuators and controllers interconnected by communication networks, and are characterized by a high number of concurrent processes. This work presents a proposal for a procedure to model and analyze communication networks for distributed control systems in intelligent buildings. The approach considered for this purpose is based on the characterization of the control system as a discrete event system and the application of coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of an intelligent building, considering the relationships between the various building systems. This procedure provides a structured development of models, facilitating the process of specifying the control algorithm. An application example is presented in order to illustrate the main features of this approach.
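A coloured Petri net differs from an ordinary one in that tokens carry data values (colours). The minimal interpreter below is a hypothetical illustration of the idea, not the paper's models: a building-automation "send" transition consumes one command token and one network-access token and produces a sent-message token carrying the same colour.

```python
from collections import Counter

# Places hold multisets of coloured tokens (hypothetical example net).
marking = {
    "commands": Counter({("lamp", "on"): 1}),   # colour = (device, action)
    "network_free": Counter({"token": 1}),      # plain token: network idle
    "sent": Counter(),
}

def fire_send(m):
    """Transition 'send': enabled when some command token and the
    network token are both present; consumes them and produces a
    sent-message token with the command's colour."""
    for colour, n in list(m["commands"].items()):
        if n >= 1 and m["network_free"]["token"] >= 1:
            m["commands"][colour] -= 1
            m["network_free"]["token"] -= 1
            m["sent"][colour] += 1
            return True
    return False

fired = fire_send(marking)
print(fired, dict(marking["sent"]))
```

Analysis methods such as reachability checking then amount to exploring which markings the transitions can produce, which is what makes the formalism useful for verifying control solutions.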

Relevance: 30.00%

Abstract:

In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In the face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the "Automatic Dependent Surveillance-Broadcasting" (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The oxidation of critical cysteines/related thiols of adenine nucleotide translocase (ANT) is believed to be an important event of the Ca²⁺-induced mitochondrial permeability transition (MPT), a process mediated by cyclosporine A/ADP-sensitive permeability transition pore (PTP) opening. We addressed the ANT-Cys56 relative mobility status resulting from the interaction of ANT/surrounding cardiolipins with Ca²⁺ and/or ADP by means of computational chemistry analysis (Molecular Interaction Fields and Molecular Dynamics studies), supported by classic mitochondrial swelling assays. The following events were predicted: (i) Ca²⁺ interacts preferentially with the ANT-surrounding cardiolipins bound to the H4 helix of the translocase, (ii) weakens the cardiolipin/ANT interactions and (iii) destabilizes the initial ANT-Cys56 residue, increasing its relative mobility. The binding of ADP, which stabilizes the "m" conformation of ANT, and/or of cardiolipin, respectively to the H5 and H4 helices, could stabilize their contacts with the short helix h56 that includes Cys56, accounting for its reduced relative mobility. The results suggest that Ca²⁺ binding to the adenine nucleotide translocase (ANT)-surrounding cardiolipins in the c-state of the translocase enhances ANT-Cys56 relative mobility, and that this may constitute a potential critical step of Ca²⁺-induced PTP opening. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Background From the mid-1980s to mid-1990s, the WHO MONICA Project monitored coronary events and classic risk factors for coronary heart disease (CHD) in 38 populations from 21 countries. We assessed the extent to which changes in these risk factors explain the variation in the trends in coronary-event rates across the populations. Methods In men and women aged 35-64 years, non-fatal myocardial infarction and coronary deaths were registered continuously to assess trends in rates of coronary events. We carried out population surveys to estimate trends in risk factors. Trends in event rates were regressed on trends in risk score and in individual risk factors. Findings Smoking rates decreased in most male populations but trends were mixed in women; mean blood pressures and cholesterol concentrations decreased, body-mass index increased, and overall risk scores and coronary-event rates decreased. The model of trends in 10-year coronary-event rates against risk scores and single risk factors showed a poor fit, but this was improved with a 4-year time lag for coronary events. The explanatory power of the analyses was limited by imprecision of the estimates and homogeneity of trends in the study populations. Interpretation Changes in the classic risk factors seem to partly explain the variation in population trends in CHD. Residual variance is attributable to difficulties in measurement and analysis, including time lag, and to factors that were not included, such as medical interventions. The results support prevention policies based on the classic risk factors but suggest potential for prevention beyond these.

Relevance: 30.00%

Abstract:

Objective: In this study we assessed how often patients manifesting a myocardial infarction (MI) would not be considered candidates for intensive lipid-lowering therapy based on the current guidelines. Methods: In 355 consecutive patients manifesting ST elevation MI (STEMI), admission plasma C-reactive protein (CRP) was measured, and the Framingham risk score (FRS), PROCAM risk score, Reynolds risk score, ASSIGN risk score, QRISK, and SCORE algorithms were applied. Cardiac computed tomography and carotid ultrasound were performed to assess the coronary artery calcium score (CAC), carotid intima-media thickness (cIMT) and the presence of carotid plaques. Results: Less than 50% of STEMI patients would be identified as having high risk before the event by any of these algorithms. With the exception of FRS (9%), all other algorithms would assign low risk to about half of the enrolled patients. Plasma CRP was <1.0 mg/L in 70% and >2 mg/L in 14% of the patients. The average cIMT was 0.8 ± 0.2 mm, and it was ≥1.0 mm in only 24% of patients. Carotid plaques were found in 74% of patients. CAC >100 was found in 66% of patients. Adding CAC >100 plus the presence of carotid plaque, a high-risk condition would be identified in 100% of the patients using any of the above-mentioned algorithms. Conclusion: More than half of patients manifesting STEMI would not be considered as candidates for intensive preventive therapy by the current clinical algorithms. The addition of anatomical parameters such as CAC and the presence of carotid plaques can substantially reduce the CVD risk underestimation. (C) 2010 Elsevier Ireland Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Depression is the most frequent psychiatric disorder in Parkinson's disease (PD). Although evidence suggests that depression in PD is related to the degenerative process that underlies the disease, further studies are necessary to better understand the neural basis of depression in this population of patients. In order to investigate the neuronal alterations underlying depression in PD, we studied thirty-six patients with idiopathic PD. Twenty of these patients had a diagnosis of major depressive disorder and sixteen did not. The two groups were matched for PD motor severity according to the Unified Parkinson Disease Rating Scale (UPDRS). First, we conducted a functional magnetic resonance imaging (fMRI) study using an event-related parametric emotional perception paradigm with a test-retest design. Our results showed decreased activation in the left mediodorsal (MD) thalamus and in the medial prefrontal cortex in PD patients with depression compared to those without depression. Based upon these results and the increased neuron count in the MD thalamus found in previous studies, we conducted a region of interest (ROI) guided voxel-based morphometry (VBM) study comparing the thalamic volume. Our results showed an increased volume in the mediodorsal thalamic nuclei bilaterally. Converging morphological changes and functional emotional processing in the mediodorsal thalamus highlight the importance of the limbic thalamus in PD depression. In addition, these data support the link between neurodegenerative alterations and mood regulation. (C) 2009 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

Background: There have been few population based studies on stroke risk factors and prognosis conducted in Brazil. The objective of this study was to evaluate, over a 2 year period, the incidence of the subtypes of first ever stroke, the prevalence of cardiovascular risk factors and functional prognosis in a city located in the south of Brazil. Methods: The period from January 2005 to December 2006 was evaluated prospectively by compiling data on first ever stroke cases, medications used prior to the morbidity and the incidence of traditional risk factors. The annual incidence was adjusted for age using the direct method. Patients were monitored for at least 6 months following the event. Results: Of 1323 stroke cases, 759 were first ever stroke cases. Of these, 610 were classified as infarctions, 94 as intracerebral haemorrhage and 55 as subarachnoid haemorrhage. The crude incidence rate per 100 000 inhabitants was 61.8 for infarction (95% CI 57.0 to 66.9), 9.5 for intracerebral haemorrhage (95% CI 7.7 to 11.6) and 5.6 for subarachnoid haemorrhage (95% CI 4.2 to 7.3). The 30 day case fatality was 19.1%. The most prevalent cardiovascular risk factor was arterial hypertension. By post-stroke month 6, 25% had died (95% CI 21.4 to 29.1) whereas 61.5% had regained their independence (95% CI 56.2 to 68.3). Conclusions: Case fatality rate, prognosis and incidence adjusted for stroke subtypes were similar to those found in other population based studies. The prevalence rates of ischaemic heart disease, dyslipidaemia, arterial hypertension and diabetes suggest that Joinville presents a mixed pattern of cardiovascular risk, a pattern seen in developed and developing countries alike.

Relevance: 30.00%

Abstract:

The objective of this study was to use a population-based register of acute cardiac events to investigate the association between survival after an acute event and history of smoking and alcohol consumption. The population was all residents of the Lower Hunter Region of Australia aged 25 to 69 years who suffered myocardial infarction or sudden cardiac death between 1986 and 1994. Among 10,170 events, 2504 resulted in death within 28 days. After adjusting for sex, age and medical history, current smokers had a similar risk of dying after an acute cardiac event to never-smokers [odds ratio (OR)=1.10, 95% confidence interval (CI) 0.94-1.29]. People who consumed more than 8 alcoholic drinks per day on more than 2 days per week (OR=1.93, 95% CI 1.39-2.69) and former moderate to heavy drinkers (OR=4.59, 95% CI 3.65-5.76) were more likely to die than people who were nondrinkers. The results of this large community study, suggesting no effect of smoking on case fatality and an increased risk of death after an acute cardiac event for heavy drinkers and former moderate to heavy drinkers, highlight the importance of a population view of case fatality. These results can also shed some light on reasons for the paradoxical results from clinical trials. (C) 2001 Elsevier Science Inc. All rights reserved.
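The odds ratios with 95% confidence intervals reported above come from a standard 2x2-table calculation (before covariate adjustment). The sketch below shows that calculation with a Wald interval; the counts are hypothetical illustrative numbers, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% CI:
        a = exposed cases,    b = exposed non-cases,
        c = unexposed cases,  d = unexposed non-cases.
    OR = (a*d)/(b*c); the CI is computed on the log-odds scale,
    where the standard error is sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 120/400 heavy drinkers vs 80/500 nondrinkers
# dying within 28 days of an acute cardiac event.
or_, lo, hi = odds_ratio_ci(120, 400, 80, 500)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval whose lower bound exceeds 1, as in the study's heavy-drinker estimates, indicates a statistically elevated risk of death relative to the reference group.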