30 results for system dynamics analysis


Relevance: 90.00%

Abstract:

Simulation is an effective method for improving supply chain performance. However, there is limited advice available to assist practitioners in selecting the most appropriate method for a given problem, and much of the advice that does exist relies on custom and practice rather than rigorous conceptual or empirical analysis. An analysis of the different modelling techniques applied in the supply chain domain was conducted, and the three main simulation approaches in use were identified: System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). This research examined these approaches in two stages. First, a first-principles analysis was carried out to challenge the received wisdom about their strengths and weaknesses, and a series of propositions was developed from this initial analysis. The second stage used the case study approach to test these propositions and to provide further empirical evidence to support their comparison. The contributions of this research are both to knowledge and to practice. In terms of knowledge, this research is the first holistic cross-paradigm comparison of the three main approaches in the supply chain domain. The case studies involved building ‘back-to-back’ models of the same supply chain problem using SD and a discrete approach (either DES or ABM). This has led to contributions concerning the limitations of applying SD to operational problem types. SD has also been found to carry risks when applied to strategic and policy problems, whereas discrete methods have been found to have potential for exploring strategic problem types, and discrete simulation methods have been shown to model material and information feedback successfully. Further insights have been gained into the relationship between modelling purpose and modelling approach.
In terms of practice, the findings have been summarised in the form of a framework linking modelling purpose, problem characteristics and simulation approach.
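The stock-and-flow view that distinguishes SD from the discrete approaches can be sketched in a few lines. Below is a minimal, hypothetical single-stock inventory model with an order-up-to policy, integrated by Euler's method; all names and parameter values are illustrative, not taken from the thesis.

```python
# Minimal System Dynamics sketch: one stock (inventory) with an
# order-up-to replenishment policy. Parameter values are illustrative.

def simulate_inventory(target=100.0, adjust_time=4.0, demand=10.0,
                       dt=0.25, horizon=40.0):
    """Single stock: d(inventory)/dt = order rate - demand rate."""
    inventory = 50.0          # initial stock level
    trace = []
    for _ in range(int(horizon / dt)):
        # Orders cover demand and close the gap to target over adjust_time
        orders = demand + (target - inventory) / adjust_time
        inventory += (orders - demand) * dt   # Euler integration of the stock
        trace.append(inventory)
    return trace

trace = simulate_inventory()
print(round(trace[-1], 2))    # → 100.0: the stock settles at its target
```

The same policy expressed in DES or ABM would track individual orders or agents; here only the aggregate rates matter, which is the essence of the SD formulation.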

Relevance: 90.00%

Abstract:

Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it remains in routine use for the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for objective assessment of the visual field in glaucoma patients, compared with the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey visual field analyser (HFA) 24-2 tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (SNR) (ANOVA, p < 0.001, 95% CI).
The differences between the superior and inferior hemifields were statistically significant in all 11 sectors in the glaucoma patient group (t-test, p < 0.001), partially significant in the glaucoma suspect group (5/11 sectors, t-test, p < 0.01), and not significant in most sectors in the normal group (only 1/11 significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, and for glaucoma suspects 89% and 79%. DISCUSSION: The results showed that the new analysis protocol was able to confirm existing field defects detected by standard HFA and to differentiate between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
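The hemifield comparison at the heart of the HSA protocol amounts to a per-sector two-sample test on response amplitudes across the horizontal midline. A sketch with synthetic data; the sector layout, trial counts, noise floor and amplitude values are invented for illustration and do not reproduce the study's recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mfVEP response amplitudes for 11 paired sectors (superior vs
# inferior hemifield) in one glaucoma eye, 30 trials per sector. The sector
# layout, trial counts, noise floor and amplitudes are invented; the real
# definitions come from the HSA protocol.
superior = rng.normal(1.2, 0.2, size=(11, 30))
inferior = rng.normal(0.8, 0.2, size=(11, 30))
noise_floor = 0.1                      # assumed mean noise-window amplitude

# Signal-to-noise ratio per sector: response amplitude over the noise floor
snr_superior = superior.mean(axis=1) / noise_floor

# Two-sample t statistic per sector across the horizontal midline
n = superior.shape[1]
t = (superior.mean(axis=1) - inferior.mean(axis=1)) / np.sqrt(
    superior.var(axis=1, ddof=1) / n + inferior.var(axis=1, ddof=1) / n)

# |t| > 2.66 corresponds roughly to p < 0.01 at ~58 degrees of freedom
print(int((np.abs(t) > 2.66).sum()), "of 11 sectors asymmetric")
```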

Relevance: 90.00%

Abstract:

Several levels of complexity are available for modelling wastewater treatment plants. Modelling local effects relies on computational fluid dynamics (CFD) approaches, whereas activated sludge models (ASM) represent the global methodology. By applying both modelling approaches to pilot plant and full scale systems, this paper evaluates the value of each method and especially their potential combination. Model structure identification for ASM is discussed based on the modelling of a full-scale closed loop oxidation ditch. It is illustrated how, and in what circumstances, information obtained via CFD analysis, residence time distribution (RTD) and other experimental means can be used. Furthermore, CFD analysis of the multiphase flow mechanisms is employed to obtain a correct description of the oxygenation capacity of the system studied, including an easy implementation of this information in classical ASM modelling (e.g. oxygen transfer). The combination of CFD and activated sludge modelling of wastewater treatment processes is applied to three reactor configurations: a perfectly mixed reactor, a pilot-scale activated sludge basin (ASB) and a full-scale ASB. The application of the biological models to the CFD model is validated against experimental data for the pilot-scale ASB and against a classical global ASM model response. A first step in the evaluation of the potential of the combined CFD-ASM model is performed using a full-scale oxidation ditch system as the testing scenario.
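One common way to combine the two levels of modelling is to use CFD/RTD results to fix a compartment structure, then run simple biokinetics in each compartment. Below is a sketch of an oxidation-ditch-like loop of well-mixed tanks with first-order substrate removal standing in for a full ASM model; the compartment count, flows and rate constants are illustrative assumptions, not values from the paper.

```python
# Loop of n well-mixed compartments (a structure one might derive from a
# CFD/RTD analysis). Feed enters tank 0, effluent leaves the last tank, and
# a circulation flow qc carries liquid around the loop. First-order removal
# (rate k) stands in for full ASM kinetics. All parameters are illustrative.

def simulate_ditch(n_tanks=6, q=10.0, qc=50.0, k=0.3,
                   s_in=100.0, v=20.0, dt=0.01, t_end=50.0):
    s = [0.0] * n_tanks                    # substrate concentration per tank
    for _ in range(int(t_end / dt)):
        new = []
        for i in range(n_tanks):
            q_in = qc if i == 0 else qc + q   # tank 0 is fed by the loop only
            feed = q * s_in if i == 0 else 0.0
            # Mass balance: transport in/out plus first-order removal
            ds = (q_in * s[i - 1] + feed - (qc + q) * s[i]) / v - k * s[i]
            new.append(s[i] + ds * dt)
        s = new
    return s

s = simulate_ditch()
print(round(s[-1], 1))   # → 17.8: steady effluent level for these parameters
```

With high circulation relative to the feed, the loop behaves almost like one completely mixed reactor, with a small concentration gradient around the ditch, which is exactly the kind of structure RTD data can confirm or refute.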

Relevance: 90.00%

Abstract:

CONCLUSIONS: The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. PURPOSE: The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA, p < 0.001 with a 95% confidence interval).
The differences between the superior and inferior hemifields were statistically significant in all 11 sectors in the glaucoma patient group (t-test, p < 0.001), partially significant in the glaucoma suspect group (5/11 sectors, t-test, p < 0.01), and not significant in most sectors of the normal group (only 1/11 sectors significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients 89% and 79%.
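The reported sensitivity and specificity follow from a standard 2x2 confusion-table calculation. The counts below are illustrative, chosen only to land near the reported percentages for the 36 glaucoma and 38 normal eyes; the paper's actual table is not given in the abstract.

```python
# Illustrative confusion-table counts (assumed, not the study's data):
tp, fn = 35, 1    # glaucoma eyes flagged / missed by the HSA protocol
tn, fp = 33, 5    # normal eyes correctly passed / falsely flagged

sensitivity = tp / (tp + fn)    # 35/36 ≈ 0.97
specificity = tn / (tn + fp)    # 33/38 ≈ 0.87
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```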

Relevance: 90.00%

Abstract:

Due to high-speed rotation, rotor mechanics and dynamics problems are more severe in outer rotor high-speed machines than in conventional ones. In view of these problems, a mechanical and dynamic analysis of an outer rotor high-speed permanent magnet claw pole motor is carried out. An analytical model for the rotor stress is derived, and the stress distribution is also calculated by the finite element method, which agrees with the analytical results. In addition, the stress distribution of the outer rotor yoke and the permanent magnets (PMs) is calculated considering centrifugal force and temperature effects, and the influence of factors such as pole-arc coefficient and speed on the rotor stress distribution is analysed. The rotor natural frequencies and critical speed are calculated by modal analysis, and the influence of the gyroscopic effect on the dynamic characteristics is analysed using a Campbell diagram. Based on these results, an outer rotor high-speed permanent magnet claw pole motor is designed and verified.
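The dominant mechanical constraint described above, centrifugal loading of the rotating yoke, can be order-of-magnitude checked with the thin-ring formula, hoop stress = density × (tip speed)². The material and geometry values below are assumptions for illustration, not the paper's design.

```python
import math

# Back-of-envelope check of centrifugal loading on an outer rotor yoke,
# treated as a thin rotating ring. All values are illustrative assumptions.
rho = 7650.0           # kg/m^3, electrical steel (assumption)
r = 0.05               # m, mean yoke radius (assumption)
speed_rpm = 30000.0    # rotational speed (assumption)

omega = speed_rpm * 2.0 * math.pi / 60.0   # rad/s
v_tip = omega * r                          # circumferential tip speed, m/s
sigma = rho * v_tip ** 2                   # Pa, thin-ring hoop stress

print(f"tip speed {v_tip:.0f} m/s, hoop stress {sigma / 1e6:.0f} MPa")
```

A full analysis, as in the paper, must add thermal and interference stresses and resolve the radial/tangential distribution with FEM; the ring formula only bounds the order of magnitude.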

Relevance: 80.00%

Abstract:

Control design for stochastic uncertain nonlinear systems is traditionally based on minimizing the expected value of a suitably chosen loss function. Moreover, most control methods usually assume the certainty equivalence principle to simplify the problem and make it computationally tractable. We offer an improved probabilistic framework that is not constrained by these assumptions and provides a more natural setting for incorporating and dealing with uncertainty. The focus of this paper is on developing this framework to obtain an optimal control law using a fully probabilistic approach for information extraction from process data, which does not require detailed knowledge of the system dynamics. Moreover, the proposed framework allows the problem of input-dependent noise to be handled. A basic paradigm is proposed and the resulting algorithm is discussed. The proposed probabilistic control method applies to the general class of discrete-time nonlinear systems and is demonstrated theoretically on the affine class. A nonlinear simulation example is also provided to validate the theoretical development.
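The difference between certainty-equivalent design and averaging the loss over the noise model, including input-dependent noise, can be seen in a one-step example. The scalar dynamics and all numbers below are illustrative stand-ins, not the system class treated in the paper.

```python
import numpy as np

# One-step illustration of why averaging the loss over the noise model
# matters when noise is input-dependent. All values are illustrative.
rng = np.random.default_rng(1)
a, b, r = 0.9, 1.0, 0.1        # dynamics x' = a x + b u + noise; input weight r
c0, c1 = 0.1, 0.5              # noise scale grows with |u| (input-dependent)
x = 2.0                        # current state

u_grid = np.linspace(-3.0, 3.0, 601)
w = rng.standard_normal(5000)  # Monte Carlo noise samples

# Expected one-step loss E[x'^2 + r u^2] with x' = a x + b u + (c0 + c1|u|) w
losses = [np.mean((a * x + b * u + (c0 + c1 * abs(u)) * w) ** 2) + r * u ** 2
          for u in u_grid]
u_star = u_grid[int(np.argmin(losses))]

# Certainty equivalence ignores the noise model entirely
u_ce = -a * b * x / (b ** 2 + r)
print(f"certainty-equivalent u = {u_ce:.2f}, expected-loss u = {u_star:.2f}")
```

Because larger inputs inflate the noise, the expected-loss controller backs off relative to the certainty-equivalent one; capturing this kind of effect in general is what the fully probabilistic design is for.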

Relevance: 80.00%

Abstract:

The airway epithelium is the first point of contact in the lung for inhaled material, including infectious pathogens and particulate matter, and protects against toxicity from these substances by trapping and clearing them via the mucociliary escalator, maintaining a protective barrier with tight junctions, and initiating a local inflammatory response. The inflammatory response involves recruitment of phagocytic cells to neutralise and remove invading materials and is often modelled using rodents. However, development of valid in vitro airway epithelial models is of great importance due to the restrictions on animal studies for cosmetic compound testing implicit in the 7th amendment to the European Union Cosmetics Directive. Further, rodent innate immune responses differ fundamentally from those of humans. Pulmonary endothelial cells and leukocytes are also involved in the innate response initiated during pulmonary inflammation. Co-culture models of the airways, in particular where epithelial cells are cultured at air liquid interface with the presence of tight junctions and differentiated mucociliary cells, offer a solution to this problem. Ideally, validated models will allow for detection of early biomarkers of response to exposure and investigation into the inflammatory response during exposure. This thesis describes the approaches taken towards developing an in vitro epithelial/endothelial cell model of the human airways and identifying biomarkers of response to xenobiotic exposure. The model comprised normal human primary microvascular endothelial cells and either the bronchial epithelial cell line BEAS-2B or normal human bronchial epithelial cells. BEAS-2B were chosen because, although their characterisation at air liquid interface is limited, they are robust in culture and were therefore predicted to provide a more reliable test system. Proteomics analysis was undertaken on challenged cells to investigate biomarkers of exposure.
BEAS-2B morphology at air liquid interface was characterised and compared with normal human bronchial epithelial cells. The results indicate that BEAS-2B cells at an air liquid interface form tight junctions, as shown by expression of the tight junction protein zonula occludens-1; to this author’s knowledge this is the first time this result has been reported. The inflammatory response of BEAS-2B air liquid interface mono-cultures (measured as secretion of the inflammatory mediators interleukin-8 and -6) to Escherichia coli lipopolysaccharide or particulate matter (fine and ultrafine titanium dioxide) was comparable to published data for epithelial cells. Cells were also exposed to polymers of “commercial interest” in the nanoparticle size range (referred to as particles hereafter). BEAS-2B mono-cultures showed increased secretion of inflammatory mediators after challenge. Inclusion of microvascular endothelial cells resulted in protection against LPS- and particle-induced epithelial toxicity, measured as cell viability and inflammatory response, indicating the importance of co-cultures for investigations into toxicity. Two-dimensional proteomic analysis of lysates from particle-challenged cells failed to identify biomarkers of toxicity due to assay interference and experimental variability. Separately, decreased plasma concentrations of serine protease inhibitors and of the negative acute phase proteins transthyretin, histidine-rich glycoprotein and alpha2-HS glycoprotein were identified as potential biomarkers of methyl methacrylate/ethyl methacrylate/butyl acrylate treatment in rats.

Relevance: 80.00%

Abstract:

The supply chain can be a source of competitive advantage for the firm, and simulation is an effective tool for investigating supply chain problems. The three main simulation approaches in the supply chain context are System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). A sample from the literature suggests that whilst SD and ABM have been used to address strategic and planning problems, DES has mainly been used on planning and operational problems. A review of received wisdom suggests that, historically, certain simulation techniques have been focused on certain problem types, driven by custom and practice. A theoretical review of the techniques, however, suggests that the scope of their application should be much wider and that supply chain practitioners could benefit from applying them in this broader way.

Relevance: 80.00%

Abstract:

This paper focuses on the concept of innovation, which is increasingly recognized as being of significant importance for companies across different business sectors. The paper initially provides a review of the innovation literature in terms of the types, classifications and sources of innovation that have been proposed over time. Then, innovation in the context of the food industry is examined, and an attempt is made to identify the innovation strategies followed by Greek food companies based on a value-driven view of innovation. Finally, the paper provides insights from eight Greek food companies selected from four subsectors: fruit and vegetables, dairy products, meat products (cured meats), and bakery products. The criterion used for selection was market success and outstanding performance (e.g. market share, achieved results). Evidence indicates that companies tend to innovate along the dimension of offerings, which is more closely related to the traditional view of product and process innovation.

Relevance: 80.00%

Abstract:

We present what is, to our knowledge, the first comprehensive investigation of the use of blazed fiber Bragg gratings (BFBGs) to interrogate wavelength division multiplexed (WDM) in-fiber optical sensor arrays. We show that the light outcoupled from the core of these BFBGs is radiated with sufficient optical power that it may be detected with a low-cost charge-coupled device (CCD) array. We present a thorough system performance analysis showing sufficient spectral-spatial resolution to decode sensors with a WDM separation of 75 pm, a signal-to-noise ratio greater than 45 dB over a bandwidth of 70 nm, and drift of only 0.1 pm. We show the system to be insensitive to polarization state, making the BFBG-CCD spectral analysis technique a practical, extremely low-cost alternative to traditional tunable filter approaches.
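Reading WDM sensor wavelengths off a CCD spectrum reduces to locating each grating's reflection peak and refining it to sub-pixel precision, for instance by centroiding. A sketch on a synthetic spectrum; the channel spacing, peak widths and detector parameters are assumptions for illustration, not the reported system's values.

```python
import numpy as np

# Synthetic CCD spectrum: a 70 nm band imaged onto 1024 pixels, with three
# Gaussian reflection peaks standing in for multiplexed sensors. All values
# here are illustrative assumptions.
pixels = np.arange(1024)
wavelength = 1540.0 + pixels * (70.0 / 1024)       # nm per pixel

def peak(center_nm, width_nm=0.2):
    return np.exp(-0.5 * ((wavelength - center_nm) / width_nm) ** 2)

channels = [1550.0, 1555.0, 1560.0]                # nominal sensor wavelengths
spectrum = sum(peak(c) for c in channels)
spectrum += 0.001 * np.random.default_rng(2).standard_normal(pixels.size)

# Centroid within a fixed window around each channel's nominal wavelength
measured = []
for c in channels:
    window = np.abs(wavelength - c) < 1.0          # 2 nm window per channel
    weights = np.clip(spectrum[window], 0.0, None)
    measured.append(float(np.sum(wavelength[window] * weights) / np.sum(weights)))

print([round(m, 2) for m in measured])             # → [1550.0, 1555.0, 1560.0]
```

Centroiding recovers the peak position to a small fraction of a pixel, which is how a coarse CCD can resolve picometre-scale wavelength shifts.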

Relevance: 80.00%

Abstract:

As more of the economy moves from traditional manufacturing to the service sector, the nature of work is becoming less tangible, and thus the representation of human behaviour in models is becoming more important. Representing human behaviour and decision making in models is challenging, both in terms of capturing the essence of the processes and in terms of how those behaviours and decisions are, or can be, represented in the models themselves. In order to advance understanding in this area, a useful first step is to evaluate and begin to classify the various types of behaviour and decision making that need to be modelled. This talk will attempt to set out an initial classification of the different types of behaviour and decision making that a modeller might want to represent in a model. The three main simulation methods, System Dynamics, Agent Based Modelling and Discrete Event Simulation, are then assessed in terms of their capability to represent these various aspects, and there is some evidence that all three can, within limits, represent the key aspects of the system being modelled. Illustrations of behavioural modelling are provided from cases in supply chain management, evacuation modelling and rail disruption.

Relevance: 80.00%

Abstract:

The persistence of Salmonella spp. in low moisture foods is a challenge for the food industry as, despite control strategies already in place, notable outbreaks still occur. The aim of this study was to characterise isolates of Salmonella known to be persistent in the food manufacturing environment by comparing their microbiological characteristics with a panel of matched clinical and veterinary isolates. The gross morphology of the challenge panel was phenotypically characterised in terms of cellular size, shape and motility. In all the parameters measured, the factory isolates were indistinguishable from the human clinical and veterinary strains. Further detailed metabolic profiling was undertaken using the Biolog Microbial ID system. Multivariate analysis of the metabolic microarray revealed differences in the metabolism of the factory isolate of S. Montevideo, based on its upregulated ability to utilise glucose and the sugar alcohol groups; the remainder of the serotype-matched isolates were metabolically indistinguishable. Temperature and humidity are known to influence bacterial survival, and experimental parameters were defined through environmental monitoring. The results revealed that Salmonella survival on stainless steel was affected by environmental temperatures that may be experienced in a food processing environment, with higher survival (D25 = 35.4) at 25°C and lower humidity (15% RH) and a rapid decline in cell count (D10 = 3.4) at 10°C and higher humidity (70% RH). Several resident factory strains survived in higher numbers on stainless steel (D25 = 29.69) than serotype-matched clinical and veterinary isolates (D25 = 22.98).
Factory isolates of Salmonella did not show an enhanced growth rate compared with serotype-matched isolates grown in Luria broth, nutrient broth and M9 minimal medium, indicating that growth as an independent factor was unlikely to be a major driver of Salmonella persistence. A live/dead stain coupled with fluorescence microscopy revealed that, when no longer culturable, isolates of S. Schwarzengrund entered a viable but nonculturable state. The biofilm-forming capacity of the panel was characterised, and all isolates were able to form biofilms; none of the factory isolates showed an enhanced capability to form biofilms compared with serotype-matched isolates. In disinfection studies, planktonic cells were more susceptible to disinfectants than cells in biofilm, and all the disinfectants tested were successful in reducing bacterial load; contact time was one of the most important factors for reducing bacterial populations in a biofilm. The genomes of eight strains were sequenced. At the nucleotide and amino acid level the food factory isolates were similar to isolates from other environments, and no major genomic rearrangements were observed, supporting the conclusions of the phenotypic and metabolic analysis. In conclusion, having investigated a variety of morphological, biochemical and genomic factors, it is unlikely that the persistence of Salmonella in the food manufacturing environment is attributable to a single phenotypic, metabolic or genomic factor. Whilst a combination of microbiological factors may be involved, it is also possible that strain persistence in the factory environment is a consequence of failure to apply established hygiene management principles.
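The D-values quoted above are decimal reduction times: the time needed for a tenfold (1-log10) drop in viable count at the stated temperature. A sketch of the calculation from two plate counts; the counts are illustrative, not the study's data.

```python
import math

# Decimal reduction time (D-value) from a log-linear survival curve.
def d_value(n0, nt, days):
    """Time per 1-log10 reduction, from counts n0 at day 0 and nt at day `days`."""
    return days / (math.log10(n0) - math.log10(nt))

# A drop from 1e6 to 1e4 CFU over 70 days is two log reductions:
print(d_value(1e6, 1e4, 70.0))   # → 35.0, i.e. one log lost every 35 days
```

In practice D is usually fitted by linear regression of log10(count) against time over several sampling points rather than from just two counts.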

Relevance: 50.00%

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action, directly measuring neuronal activity via the resulting magnetic field fluctuations, MEG possesses a number of useful qualities that could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands, the so-called alpha, delta, beta, etc. bands commonly referred to in both academic and lay publications. Other efforts have centred on generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which give rise to the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the view that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis is to show that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive; the unfortunate side effect is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but this has notable drawbacks. In particular, it is difficult to synchronise high-frequency activity of interest, and such signals are often cancelled out by the averaging process. Other problems are the high cost and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has hitherto been restricted to large institutions able to afford the costs associated with their procurement and maintenance. In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in research areas ranging from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
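A standard first step in the dynamical-systems treatment of a single unaveraged channel is state-space reconstruction by delay embedding (Takens' theorem). A sketch on a synthetic series standing in for MEG data; the embedding dimension and delay are illustrative choices, normally selected via mutual-information and false-nearest-neighbour criteria.

```python
import numpy as np

# Delay embedding: turn one scalar time series into dim-dimensional state
# vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)].
def delay_embed(x, dim, tau):
    """Stack delayed copies of x into dim-dimensional reconstructed states."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Noisy sine as a stand-in for a single unaveraged MEG channel
t = np.linspace(0.0, 20.0 * np.pi, 2000)
signal = np.sin(t) + 0.05 * np.random.default_rng(3).standard_normal(t.size)

states = delay_embed(signal, dim=3, tau=25)
print(states.shape)   # → (1950, 3): one reconstructed 3-D state per sample
```

Nonlinear measures such as correlation dimension, Lyapunov exponents or nonlinear prediction error are then computed on `states` rather than on the raw series.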

Relevance: 50.00%

Abstract:

The computational mechanics approach has been applied to the orientational behavior of water molecules in a molecular dynamics simulated water–Na+ system. The distinctly different statistical complexity of water molecules in the bulk and in the first solvation shell of the ion is demonstrated. The molecules are shown to undergo more complex orientational motion when surrounded by other water molecules than when constrained by the electric field of the ion. However, the spatial coordinates of the oxygen atom show the opposite complexity behavior, in that complexity is higher for the solvation-shell molecules. New information about the dynamics of water molecules in the solvation shell is provided, additional to that given by traditional methods of analysis.
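Computational mechanics starts from a symbolised trajectory. As a crude illustration of comparing dynamical regimes, one can discretise two series and compare their block entropies; note that block entropy is not statistical complexity itself (an i.i.d. random sequence has maximal block entropy but zero statistical complexity), and the full epsilon-machine construction used in the study goes further by grouping histories into causal states. The two synthetic series below are invented stand-ins, not simulation output.

```python
import math
import random
from collections import Counter

# Shannon entropy (bits) of length-k blocks of a symbol sequence: a simple
# information measure over the symbolised trajectory.
def block_entropy(symbols, k):
    blocks = Counter(tuple(symbols[i:i + k])
                     for i in range(len(symbols) - k + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log2(c / total) for c in blocks.values())

rng = random.Random(4)
erratic = [rng.randrange(4) for _ in range(5000)]   # freely reorienting
regular = [(i // 50) % 4 for i in range(5000)]      # slowly cycling

print(block_entropy(erratic, 3) > block_entropy(regular, 3))   # → True
```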

Relevance: 50.00%

Abstract:

Transmembrane proteins play crucial roles in many important physiological processes. The intracellular domain of membrane proteins is key to their function, interacting with a wide variety of cytosolic proteins, and it is therefore important to examine these interactions. A recently developed method to study them, based on the use of liposomes as a model membrane, involves covalently coupling the cytoplasmic domains of membrane proteins to the liposome membrane. This allows for the analysis of interaction partners requiring both protein and membrane lipid binding. This thesis further establishes the liposome recruitment system and utilises it to examine the intracellular interactome of the amyloid precursor protein (APP), best known for the proteolytic cleavage that results in the production and accumulation of amyloid beta fragments, the main constituent of amyloid plaques in Alzheimer’s disease pathology. Despite this, the physiological function of APP remains largely unclear. Through the use of the proteo-liposome recruitment system, two novel interactions of APP’s intracellular domain (AICD) are examined with a view to gaining greater insight into APP’s physiological function. The first novel interaction is between AICD and the mTOR complex, a serine/threonine protein kinase that integrates signals from nutrients and growth factors; the kinase domain of mTOR binds directly to AICD, and the N-terminal amino acids of AICD are crucial for this interaction. The second novel interaction is between AICD and the endosomal PIKfyve complex, a lipid kinase involved in the production of phosphatidylinositol-3,5-bisphosphate (PI(3,5)P2) from phosphatidylinositol-3-phosphate, which has a role in controlling endosome dynamics; the scaffold protein Vac14 of the PIKfyve complex binds directly to AICD, and the C-terminus of AICD is important for its interaction with the PIKfyve complex.
Using a recently developed intracellular PI(3,5)P2 probe, it is shown that APP controls the formation of PI(3,5)P2-positive vesicular structures and that the PIKfyve complex is involved in the trafficking and degradation of APP. Both of these novel APP interactors have important implications for APP function and for Alzheimer’s disease. The proteo-liposome recruitment method is further validated through its use to examine the recruitment and assembly of the AP-2/clathrin coat from purified components onto two membrane proteins containing different sorting motifs. Taken together, this thesis highlights the proteo-liposome recruitment system as a valuable tool for studying the intracellular interactome of membrane proteins: it allows the protein to be mimicked in its native configuration, thereby identifying weaker interactions that are not detected by more conventional methods, as well as interactions that are mediated by membrane phospholipids.