234 results for classical over barrier model (COBM)


Relevance:

30.00%

Publisher:

Abstract:

The residence time distribution (RTD) is a crucial parameter when treating engine exhaust emissions with a Dielectric Barrier Discharge (DBD) reactor. In this paper, the residence time of such a reactor is investigated using finite-element-based software: COMSOL Multiphysics 4.3. Non-thermal plasma (NTP) discharge is emerging as a promising method for reducing pollutant emissions, and DBD is one of the most advantageous NTP technologies. In a two-cylinder co-axial DBD reactor, tubes are placed between two electrodes and the flow passes through the annulus between these barrier tubes. If the mean residence time in a DBD reactor increases, the reaction time increases correspondingly and, consequently, the pollutant removal efficiency can improve. However, pollutant formation can also occur during an increased mean residence time, so the proportion of fluid that remains for periods significantly longer than the mean residence time is of great importance. In this study, the residence time distribution is first calculated for the standard reactor used by the authors for ultrafine particle (10-500 nm) removal. Different geometries and various inlet velocities are then considered. Finally, for selected cases, roughness elements are added inside the reactor and the residence time is recalculated. These results will form the basis for a COMSOL plasma and CFD module investigation.
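
For reference, the residence time distribution and its mean are standard textbook quantities rather than results of this paper; a compact form is sketched below, where C(t) is the tracer response, V the reactor volume and Q the volumetric flow rate.

```latex
% Standard RTD relations (textbook definitions, not taken from the paper):
E(t) = \frac{C(t)}{\int_0^{\infty} C(t)\,dt}, \qquad
\bar{t} = \int_0^{\infty} t\,E(t)\,dt,
\qquad \text{with } \bar{t} = \frac{V}{Q} \text{ for an ideal closed vessel.}
```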

Relevance:

30.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative to traditional reliability analysis is to model condition indicators, operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the theory of the Proportional Hazards Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have, to some extent, been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models do not fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into the covariate-based hazard model. This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three sources of asset health information into the modelling of hazard and reliability predictions, and captures the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; they therefore update the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators may be nil in EHM, condition indicators are always present, because they are observed and measured for as long as an asset remains operational. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators and their association with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. According to the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industrial applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, which is a distribution-free model, has been developed. The development of EHM into these two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimates with those of the existing covariate-based hazard models. The comparison demonstrates that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
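
For context only, the standard PHM hazard and one possible reading of the EHM structure described above can be written as follows; the EHM line is my paraphrase of the abstract (a baseline hazard depending on time and condition indicators z_c, and a covariate function of operating environment indicators z_e), not the authors' exact notation.

```latex
% Proportional Hazards Model (standard form):
h(t \mid \mathbf{z}) = h_0(t)\,\exp(\boldsymbol{\gamma}'\mathbf{z})

% EHM as described in the abstract (assumed notation, not the authors'):
h(t \mid \mathbf{z}_c, \mathbf{z}_e) =
  h_0\!\left(t, \mathbf{z}_c(t)\right)\,
  \psi\!\left(\boldsymbol{\gamma}'\mathbf{z}_e(t)\right)
```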

Relevance:

30.00%

Publisher:

Abstract:

The preparedness theory of classical conditioning proposed by Seligman (1970, 1971) has been applied extensively over the past 40 years to explain the nature and "source" of human fear and phobias. In this review we examine the formative studies that tested the four defining characteristics of prepared learning with animal fear-relevant stimuli (typically snakes and spiders) and consider claims that fear of social stimuli, such as angry faces or faces of racial out-group members, may also be acquired via the same preferential learning mechanism. Exposition of critical differences between fear learning to animal and social stimuli suggests that a single account cannot adequately explain fear learning with both classes of stimuli. We demonstrate that fear conditioned to social stimuli is less robust than fear conditioned to animal stimuli, as it is susceptible to cognitive influence, and we propose that it may instead reflect negative stereotypes and social norms. Thus, a theoretical model that can accommodate the influence of both biological and cultural factors is likely to have broader utility in explaining fear and avoidance responses than accounts based on a single mechanism.

Relevance:

30.00%

Publisher:

Abstract:

Moving fronts of cells are essential features of embryonic development, wound repair and cancer metastasis. This paper describes a set of experiments to investigate the roles of random motility and proliferation in driving the spread of an initially confined cell population. The experiments include an analysis of cell spreading when proliferation was inhibited. Our data have been analysed using two mathematical models: a lattice-based discrete model and a related continuum partial differential equation model. We obtain independent estimates of the random motility parameter, D, and the intrinsic proliferation rate, λ, and we confirm that these estimates lead to accurate modelling predictions of the position of the leading edge of the moving front as well as the evolution of the cell density profiles. Previous work suggests that systems with a high λ/D ratio are characterised by steep fronts, whereas systems with a low λ/D ratio lead to shallow, diffuse fronts, and this is confirmed in the present study. Our results provide evidence that continuum models, based on the Fisher–Kolmogorov equation, are a reliable platform upon which to interpret and predict such experimental observations.
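
For reference, the continuum model referred to here is the Fisher–Kolmogorov (Fisher–KPP) reaction–diffusion equation; its standard form and minimum travelling-wave speed are given below (textbook results, with K denoting the carrying capacity, not values taken from the paper).

```latex
% Fisher-Kolmogorov equation for cell density u(x,t):
\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2}
  + \lambda\,u\left(1 - \frac{u}{K}\right),
\qquad c_{\min} = 2\sqrt{\lambda D}
% The steep-versus-diffuse front observation is consistent with the decay
% rate of the travelling-wave profile scaling with sqrt(lambda/D).
```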

Relevance:

30.00%

Publisher:

Abstract:

Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems that run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper and propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
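
The abstract does not spell out the statistical model, so the Python sketch below only illustrates the general idea in the same spirit: link users whose posts cluster in time, prune weak links, and read off connected components as conversing groups. The windowing and threshold choices are illustrative assumptions, not the authors' algorithm.

```python
import itertools
import networkx as nx

def conversing_groups(posts, window=30.0, min_weight=3):
    """posts: list of (timestamp_seconds, user) tuples, ordered by time.
    Users whose posts frequently fall within `window` seconds of each other
    are linked; connected components of the link graph approximate groups."""
    g = nx.Graph()
    for (t1, u1), (t2, u2) in itertools.combinations(posts, 2):
        if u1 != u2 and abs(t1 - t2) <= window:
            w = g.get_edge_data(u1, u2, {"weight": 0})["weight"] + 1
            g.add_edge(u1, u2, weight=w)
    # drop weak links: a crude stand-in for the "cleaning" step
    g.remove_edges_from([(a, b) for a, b, d in g.edges(data=True)
                         if d["weight"] < min_weight])
    return list(nx.connected_components(g))

# usage:
# groups = conversing_groups([(0.0, "alice"), (5.0, "bob"), (400.0, "carol")])
```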

Relevance:

30.00%

Publisher:

Abstract:

Introduction: QC and EQA are integral to good pathology laboratory practice. Medical Laboratory Science students undertake a project exploring the internal QC and EQA procedures used in chemical pathology laboratories. Each student represents an individual lab, and the class group represents the peer group of labs performing the same assay using the same method. Methods: Using a manual BCG assay for serum albumin, normal and abnormal controls are run with a patient sample over 7 weeks. The QC results are assessed each week using calculated z-scores and both 2s and 3s control rules to determine whether a run is ‘in control’. At the end of the 7 weeks a completed Levey-Jennings (LJ) chart is assessed using the Westgard multirules. Students investigate causes of error and the implications for both lab practice and patient care if runs are not ‘in control’. Twice in the 7 weeks, two EQA samples (with target values unknown) are assayed alongside the weekly QC and patient samples. Results from each student are collated and form the basis of an EQA program. ALP are provided and students complete a Youden plot, which is used to analyse the performance of each ‘lab’ and of the method, to identify bias. Students explore the possible clinical implications of a biased method and address the actions that should be taken if a lab is not in consensus with the peer group. Conclusion: This project is a model of ‘real world’ practice in which students demonstrate an understanding of the importance of QC procedures in a pathology laboratory, apply and interpret statistics, QC rules and charts, apply critical thinking and analytical skills to quality performance data to make recommendations for further practice, and improve their technical competence and confidence.
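
As a reference for the z-score and 2s/3s control-rule step described above, a minimal Python sketch is given below; the limits follow the usual Westgard convention (1_2s as a warning, 1_3s as a rejection), and the numbers in the usage comment are made up, not taken from the class data.

```python
def z_score(value, target_mean, target_sd):
    """z-score of a QC result against the assigned mean and SD."""
    return (value - target_mean) / target_sd

def assess_run(qc_values, target_mean, target_sd):
    """Apply simple 2s/3s single-rule checks to one run's QC results."""
    zs = [z_score(v, target_mean, target_sd) for v in qc_values]
    if any(abs(z) > 3 for z in zs):
        return zs, "reject (1_3s violation)"
    if any(abs(z) > 2 for z in zs):
        return zs, "warning (1_2s) - inspect with the multirules"
    return zs, "in control"

# usage (illustrative numbers only):
# assess_run([36.1, 41.8], target_mean=38.0, target_sd=1.5)
```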

Relevance:

30.00%

Publisher:

Abstract:

We introduce a lightweight biometric solution for user authentication over networks using online handwritten signatures. The proposed algorithm is based on a modified Hausdorff distance and has favorable characteristics such as low computational cost and minimal training requirements. Furthermore, we investigate an information-theoretic model for capacity and performance analysis of biometric authentication, which brings additional theoretical insight to the problem. A fully functional proof-of-concept prototype that relies on commonly available off-the-shelf hardware is developed as a client-server system that supports Web services. Initial experimental results show that the algorithm performs well despite its low computational requirements and is resilient against over-the-shoulder attacks.
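
The abstract does not specify which modification of the Hausdorff distance is used; a common variant for point-set matching is the modified Hausdorff distance of Dubuisson and Jain (directed average nearest-neighbour distance), sketched below in Python as an illustration of the general idea, with signatures assumed to be sampled as (x, y) point sequences.

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff distance between two point sets a, b of shape (N, 2).
    Directed distance = mean, over points of one set, of the nearest-neighbour
    distance to the other set; MHD is the max of the two directions."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)  # pairwise distances
    d_ab = d.min(axis=1).mean()   # a -> b
    d_ba = d.min(axis=0).mean()   # b -> a
    return max(d_ab, d_ba)

# usage (toy signatures):
# sig1 = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 1.5]])
# sig2 = np.array([[0.1, 0.0], [1.1, 0.9], [2.0, 1.6]])
# score = modified_hausdorff(sig1, sig2)   # compare against a decision threshold
```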

Relevance:

30.00%

Publisher:

Abstract:

This work has led to the development of empirical mathematical models to quantitatively predict the changes in morphology of the osteocyte-like cell line MLO-Y4 in culture. MLO-Y4 cells were cultured at low density and the changes in morphology recorded over 11 hours. Cell area and three dimensionless shape features (aspect ratio, circularity and solidity) were then determined using widely accepted image analysis software (ImageJ). Based on the data obtained from the image analysis, mathematical models were developed using non-linear regression. The developed mathematical models accurately predict the morphology of MLO-Y4 cells for different culture times and can therefore be used as a reference model for analysing MLO-Y4 cell morphology changes within various biological/mechanical studies, as necessary.
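
The abstract does not give the functional form of the fitted models; purely as an illustration of the non-linear regression step, the sketch below fits a hypothetical saturating-exponential model of cell area versus culture time with SciPy. The model form and the data are assumptions for illustration, not the authors' results.

```python
import numpy as np
from scipy.optimize import curve_fit

def area_model(t, a0, a_max, k):
    """Hypothetical saturating model: area grows from a0 towards a_max."""
    return a_max - (a_max - a0) * np.exp(-k * t)

# illustrative data: time in hours, mean cell area in um^2 (made up)
t_hours = np.array([0, 2, 4, 6, 8, 11], dtype=float)
area = np.array([310, 420, 505, 560, 595, 620], dtype=float)

params, _ = curve_fit(area_model, t_hours, area, p0=[300, 650, 0.3])
a0, a_max, k = params
print(f"a0={a0:.0f} um^2, a_max={a_max:.0f} um^2, k={k:.2f} per hour")
```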

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To develop a novel 3-D cell culture model with a view to studying the pathomechanisms underlying the development of age-related macular degeneration (AMD). Our central hypothesis is that the silk structural protein fibroin, used in conjunction with cultured human cells, can be used to mimic the structural relationships between the RPE and choriocapillaris in health and disease. Methods: Co-cultures of human RPE cells (ARPE-19 cells grown in Miller’s medium) and microvascular endothelial cells (HMEC-1 cells grown in endothelial culture medium) were established on opposing sides of a synthetic Bruch’s membrane (3 microns thick) constructed from B. mori silk fibroin. Cell attachment was facilitated by pre-coating the fibroin membrane with vitronectin (for ARPE-19 cells) and gelatin (for HMEC-1 cells), respectively. The effects of tropoelastin on the attachment of ARPE-19 cells were also examined. Barrier function was examined by measurement of trans-epithelial resistance (TER) using a voltohmmeter (EVOM-2). The phagocytic activity of the synthetic RPE was tested using vitronectin-coated microspheres (2 micron diameter FluoSpheres). In some cultures, membrane defects were created by puncturing with a 24 G needle. The architecture of the synthetic tissue before and after wounding was examined by confocal microscopy after staining for ZO-1 and F-actin. Results: The RPE layer of the 3D model developed a cobblestone morphology (validated by staining for ZO-1 and F-actin), displayed barrier function (validated by measurement of TER) and demonstrated cytoplasmic uptake of vitronectin-coated microspheres. Attachment of ARPE-19 cells to fibroin was unaffected by tropoelastin. Microvascular endothelial cells attached well to the gelatin-coated surface of the fibroin membrane and remained physically separated from the overlying RPE layer. The fibroin membranes were amenable to puncturing without collapse, thus providing the opportunity to study transmembrane migration of the endothelial cells. Conclusions: Synthetic Bruch’s membranes constructed from silk fibroin, vitronectin and gelatin support the co-cultivation of RPE cells and microvascular endothelial cells. The resulting RPE layer displays functions similar to those of native RPE, and the entire tri-layered structure displays potential to be used as an in vitro model of choroidal neovascularization.

Relevance:

30.00%

Publisher:

Abstract:

We consider a Cooperative Intrusion Detection System (CIDS), a distributed AIS-based (Artificial Immune System) IDS in which nodes collaborate over a peer-to-peer overlay network. The AIS uses the negative selection algorithm for the selection of detectors (e.g., vectors of features such as CPU utilization, memory usage and network activity). For better detection performance, selection of all possible detectors for a node is desirable, but this may not be feasible due to storage and computational overheads. Limiting the number of detectors, on the other hand, comes with the danger of missing attacks. We present a scheme for the controlled and decentralized division of detector sets in which each IDS is assigned to a region of the feature space, and we investigate the trade-off between scalability and robustness of detector sets. We address the problem of self-organization in CIDS so that each node generates a distinct set of detectors to maximize the coverage of the feature space, while pairs of nodes exchange their detector sets to provide a controlled level of redundancy. Our contribution is twofold. First, we use deterministic techniques from combinatorial design theory and graph theory, based on Symmetric Balanced Incomplete Block Designs, Generalized Quadrangles and Ramanujan Expander Graphs, to decide how many detectors, and which ones, are exchanged between which pairs of IDS nodes. Second, we use a classical epidemic model (the SIR model) to show how properties of these deterministic techniques can help reduce the attack spread rate.
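
For reference, the classical SIR epidemic model mentioned above is the standard compartmental system below (textbook form, with β the infection rate and γ the recovery rate; the paper's specific parameterisation is not given in the abstract).

```latex
% Classical SIR model (susceptible S, infected I, recovered/removed R):
\frac{dS}{dt} = -\beta S I, \qquad
\frac{dI}{dt} = \beta S I - \gamma I, \qquad
\frac{dR}{dt} = \gamma I,
\qquad R_0 = \frac{\beta S(0)}{\gamma}.
```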

Relevance:

30.00%

Publisher:

Abstract:

All civil and private aircraft are required to comply with the airworthiness standards set by their national airworthiness authority and must remain in a condition of safe operation throughout their operational life. Aviation accident data show that over 20% of all fatal aviation accidents are due to airworthiness issues, specifically aircraft mechanical failures. Ultimately it is the responsibility of each registered operator to ensure that their aircraft remain in a condition of safe operation, and this is done through both effective management of airworthiness activities and effective programme governance of safety outcomes. Typically, the projects within these airworthiness management programmes are focused on acquiring, modifying and maintaining the aircraft as a capability supporting the business. Programme governance provides the structure through which the goals and objectives of airworthiness programmes are set, along with the means of attaining them. Whilst the principal causes of failure in many programmes can be traced to inadequate programme governance, many failures in large-scale projects have their root causes in the organizational culture and, more specifically, in the organizational processes related to decision-making. This paper examines the primary theme of project- and programme-based enterprises, and introduces a model for measuring organizational culture in airworthiness management programmes using measures drawn from 211 respondents in Australian airline programmes. The paper describes the theoretical perspectives applied in modifying an original model to focus specifically on measuring the organizational culture of airworthiness management programmes, identifies the most important factors needed to explain the relationships between the measures collected, and describes the nature of these factors. The paper concludes by identifying the model that best describes the organizational culture data collected from seven airworthiness management programmes.

Relevance:

30.00%

Publisher:

Abstract:

We applied a texture-based flow visualisation technique to a numerical hydrodynamic model of the Pumicestone Passage in southeast Queensland, Australia. The quality of the visualisations produced by our flow visualisation tool is compared with animations generated using more traditional drogue-release plots and velocity contour and vector techniques. The texture-based method is found to be far more effective in visualising advective flow within the model domain. In some instances, it also makes it easier for the researcher to identify specific hydrodynamic features within the complex flow regimes of this shallow tidal barrier estuary compared with the direct and geometric-based methods.
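
The abstract does not name the texture-based technique used; a common representative of this family is line integral convolution (LIC), and the Python sketch below illustrates the basic idea (averaging a noise texture along short streamlines of the velocity field) under that assumption, with vx, vy and the noise array as hypothetical inputs rather than the authors' tool.

```python
import numpy as np

def lic(vx, vy, noise, length=20):
    """Very simplified line integral convolution: for each pixel, average the
    noise texture along a short streamline traced forward and backward through
    the velocity field (vx, vy). All arrays share the same (ny, nx) shape."""
    ny, nx = noise.shape
    out = np.zeros_like(noise)
    for j in range(ny):
        for i in range(nx):
            total, count = noise[j, i], 1
            for sign in (1.0, -1.0):              # trace forward and backward
                x, y = float(i), float(j)
                for _ in range(length):
                    u = vx[int(y) % ny, int(x) % nx]
                    v = vy[int(y) % ny, int(x) % nx]
                    speed = np.hypot(u, v)
                    if speed < 1e-12:
                        break
                    x += sign * u / speed         # unit pixel step along the flow
                    y += sign * v / speed
                    total += noise[int(y) % ny, int(x) % nx]
                    count += 1
            out[j, i] = total / count
    return out

# usage: vx, vy interpolated from the hydrodynamic model onto a pixel grid,
# noise = np.random.rand(*vx.shape); image = lic(vx, vy, noise)
```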

Relevance:

30.00%

Publisher:

Abstract:

A hydrogen gas sensor based on a Pt/nanostructured ZnO Schottky diode has been developed. Our proposed theoretical model explains the superior dynamic performance of the reverse-biased diode compared with forward-bias operation. The sensor was evaluated with low-concentration H2 gas exposures over a temperature range of 280°C to 430°C. Upon exposure to H2 gas, the effective change in free carrier concentration at the Pt/nanostructured ZnO interface is amplified by an enhancement factor, effectively lowering the reverse barrier and producing a large voltage shift. The lowering of the reverse barrier permits a faster response in reverse-bias operation than in forward-bias operation.
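
As background only, Schottky-diode gas sensors are commonly analysed with the thermionic-emission relation below (textbook expressions, not taken from the paper); a hydrogen-induced change in the barrier height ΔΦ_B shifts the I–V characteristic, which at a fixed bias current is read out as a voltage shift.

```latex
% Thermionic emission over a Schottky barrier of height \Phi_B
% (A = contact area, A^* = Richardson constant, n = ideality factor):
I = A A^{*} T^{2} \exp\!\left(-\frac{q\Phi_B}{kT}\right)
    \left[\exp\!\left(\frac{qV}{nkT}\right) - 1\right],
\qquad \Delta V \big|_{I=\mathrm{const}} \approx n\,\Delta\Phi_B
\quad \text{(forward bias)} .
```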

Relevance:

30.00%

Publisher:

Abstract:

High-energy bone fractures resulting from impact trauma are often accompanied by subcutaneous soft tissue injuries, even if the skin remains intact. There is evidence that such closed soft tissue injuries affect the healing of bone fractures, and vice versa. Despite this knowledge, most impact trauma studies in animals have focussed on bone fractures or soft tissue trauma in isolation. However, given the simultaneous impact on both tissues, a better understanding of the interaction between these two injuries is necessary to optimise clinical treatment. The aim of this study was therefore to develop a new experimental model and characterise, for the first time, the healing of a complex fracture with concurrent closed soft tissue trauma in sheep. A pendulum impact device was designed to deliver a defined and standardised impact to the distal thigh of sheep, causing a reproducible contusion injury to the subcutaneous soft tissues. In a subsequent procedure, a reproducible femoral butterfly fracture (AO C3-type) was created at the sheep’s femur, which was initially stabilised for 5 days by an external fixator construct to allow soft tissue swelling to recede, and ultimately in a bridging construct using locking plates. The combined injuries were applied to twelve sheep and healing was observed for four or eight weeks (six animals per group) until sacrifice. The pendulum impact led to a moderate to severe circumferential soft tissue injury with significant bruising, haematomas and partial muscle disruptions. Post-traumatic measurements showed elevated intra-compartmental pressure and circulating tissue breakdown markers, with recovery to normal, pre-injury values within four days. Clinically, no neurovascular deficiencies were observed. Bi-weekly radiological analysis of the healing fractures showed progressive callus healing over time, with the average number of callus bridges increasing from 0.4 at two weeks to 4.2 at eight weeks. Biomechanical testing after sacrifice showed torsional stiffness increasing from 10% to 100%, and ultimate torsional strength increasing from 10% to 64% (relative to the contralateral control limb), between four and eight weeks of healing. Our results demonstrate the robust healing of a complex femur fracture in the presence of a severe soft tissue contusion injury in sheep, and establish a clinically relevant experimental model for research aimed at improving the treatment of bone fractures accompanied by closed soft tissue injuries.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Participants may respond to phases of a workplace walking program at different rates. This study evaluated the factors that contribute to the number of steps taken through the phases of the program. The intervention was automated through a web-based program designed to increase workday walking. Methods: The study examined the influence of independent variables throughout phases I–III. A convenience sample of university workers (n=56; 43.6±1.7 years; BMI 27.44±2.15 kg/m2; 48 female) was recruited at worksites in Australia. These workers were given a pedometer (Yamax SW-200) and access to the website program. For analysis, step counts entered by workers into the website were downloaded and mean workday steps were compared using a seemingly unrelated regression. This model was employed to capture the contemporaneous correlation within individuals in the study across the observed time periods. Results: The model predicts that the 36 subjects with complete information took an average of 7460 steps in the baseline two-week period. After phase I, statistically significant increases in steps (from baseline) were explained by age, working status (full or part time), occupation (academic or professional), and self-reported public transport (PT) use (marginally significant). Full-time workers walked about 440 steps more than part-time workers, professionals walked about 300 steps more than academics, and PT users walked about 400 steps more than non-PT users. The ability to differentiate steps among participants after two weeks suggests a differential effect of the program after only two weeks. On average, participants increased steps from week two to week four by about 525 steps, but regular auto users took nearly 750 fewer steps than non-auto users at week four. The effect of age was diminished in the fourth week of observation and accounted for 34 steps per year of age. In phase III, discriminating between participants became more difficult, with only age effects differentiating their increase over baseline. The marginal effect of age in phase III, compared with phase I, increased from 36 to 50 steps, suggesting a 14-step-per-year increase from the second to the sixth week. Discussion: The findings suggest that participants responded to the program at different rates, with uniformity of effect achieved by the sixth week. Participants increased steps; however, a tapering off occurred over time. Age played the most consistent role in predicting steps over the program. PT use was associated with increased step counts, while auto use was associated with decreased step counts.
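
For context, a seemingly unrelated regression treats the step-count equation for each observation period as one equation in a system whose errors are correlated across periods for the same person; a generic form is sketched below (my notation, not the authors').

```latex
% One equation per observation period p for worker i; errors are correlated
% across periods within a worker and the system is estimated jointly by
% feasible generalised least squares:
y_{ip} = \mathbf{x}_{ip}'\boldsymbol{\beta}_{p} + \varepsilon_{ip},
\qquad p = 1,\dots,P, \qquad
\operatorname{Cov}(\varepsilon_{ip}, \varepsilon_{iq}) = \sigma_{pq}.
```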