914 results for Up-flow anaerobic sludge blankets
Abstract:
Despite the Revised International Prognostic Index's (R-IPI) undoubted utility in diffuse large B-cell lymphoma (DLBCL), significant clinical heterogeneity within R-IPI categories persists. Emerging evidence indicates that circulating host immunity is a robust, R-IPI-independent prognosticator, most likely reflecting the immune status of the intratumoral microenvironment. We hypothesized that direct quantification of immunity within lymphomatous tissue would better permit stratification within R-IPI categories. We analyzed 122 consecutive newly diagnosed DLBCL patients treated with rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP) chemo-immunotherapy. Median follow-up was 4 years. As expected, the R-IPI was a significant predictor of outcome, with 5-year overall survival (OS) of 87% for very good, 87% for good, and 51% for poor-risk R-IPI scores (P < 0.001). Consistent with previous reports, systemic immunity also predicted outcome (86% OS for a high lymphocyte-to-monocyte ratio [LMR] versus 63% for a low LMR, P = 0.01). Multivariate analysis confirmed the LMR as independently prognostic. Flow cytometry on fresh diagnostic lymphoma tissue identified CD4+ T-cell infiltration as the most significant predictor of outcome, with ≥23% infiltration dividing the cohort into high- and low-risk groups with regard to event-free survival (EFS, P = 0.007) and OS (P = 0.003); these associations were independent of the R-IPI and LMR. Importantly, within very good/good R-IPI patients, CD4+ T-cells still distinguished patients with different 5-year OS (high 96% versus low 63%, P = 0.02). These results illustrate the importance of circulating and local intratumoral immunity in DLBCL treated with R-CHOP.
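The kind of stratified survival comparison described above can be sketched in a few lines. The snippet below is a minimal illustration only: it assumes a patient table with columns for follow-up time, an event indicator and absolute lymphocyte/monocyte counts, and the column names and LMR cutoff are hypothetical, not taken from the study.

```python
# Hedged sketch: stratify patients by lymphocyte-to-monocyte ratio (LMR) and
# compare overall survival with Kaplan-Meier estimates and a log-rank test.
# Column names and the cutoff are illustrative, not from the study.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_os_by_lmr(df: pd.DataFrame, cutoff: float = 3.0):
    """df needs columns: 'lymphocytes', 'monocytes', 'os_years', 'death' (1 = died)."""
    lmr = df["lymphocytes"] / df["monocytes"]
    high, low = df[lmr >= cutoff], df[lmr < cutoff]

    kmf = KaplanMeierFitter()
    for label, grp in [("high LMR", high), ("low LMR", low)]:
        kmf.fit(grp["os_years"], event_observed=grp["death"], label=label)
        print(label, "5-year OS ≈", kmf.survival_function_at_times(5.0).iloc[0])

    test = logrank_test(high["os_years"], low["os_years"],
                        event_observed_A=high["death"], event_observed_B=low["death"])
    print("log-rank P =", test.p_value)
```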
Abstract:
Collisions between pedestrians and vehicles continue to be a major problem throughout the world. Pedestrians trying to cross roads and railway tracks without caution are highly susceptible to collisions with vehicles and trains. Continuing financial, human and other losses have prompted transport-related organizations to come up with various solutions addressing this issue. However, the quest for new and significant improvements in this area is still ongoing. This work addresses the issue by building a general framework that uses computer vision techniques to automatically monitor pedestrian movements in such high-risk areas, to enable better analysis of activity and the creation of future alerting strategies. As a result of rapid development in the electronics and semiconductor industry, there is extensive deployment of CCTV cameras in public places to capture video footage, which can then be used to analyse crowd activities in those places. This work seeks to identify abnormal behaviour of individuals in video footage. We propose using a Semi-2D Hidden Markov Model (HMM), a Full-2D HMM and a Spatial HMM to model the normal activities of people; the outliers of the model (i.e. observations with insufficient likelihood) are identified as abnormal activities. Location features, flow features and optical flow textures are used as the features for the model. The proposed approaches are evaluated using the publicly available UCSD datasets, and we demonstrate improved performance using the Semi-2D Hidden Markov Model compared to other state-of-the-art methods. Further, we illustrate how the proposed methods can be applied to detect anomalous events at rail level crossings.
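The core idea (fit an HMM to normal activity and flag low-likelihood observations as anomalous) can be sketched as follows. This is a simplified, plain Gaussian HMM using the hmmlearn library, not the Semi-2D/Full-2D/Spatial HMMs or the exact features of this work; the window length and threshold choice are assumptions.

```python
# Hedged sketch of HMM-based anomaly detection: train on feature vectors from
# "normal" video, then flag test windows whose log-likelihood falls below a
# threshold estimated on the training data.
import numpy as np
from hmmlearn import hmm

def train_normal_model(normal_features: np.ndarray, n_states: int = 5):
    """normal_features: (n_frames, n_dims) array of e.g. location/optical-flow features."""
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(normal_features)
    return model

def anomaly_threshold(model, normal_features, window=20, percentile=1.0):
    """Per-window log-likelihoods on normal data; a low percentile becomes the cutoff."""
    scores = [model.score(normal_features[i:i + window])
              for i in range(0, len(normal_features) - window, window)]
    return np.percentile(scores, percentile)

def is_abnormal(model, window_features, threshold):
    """A window with insufficient likelihood under the normal model is flagged."""
    return model.score(window_features) < threshold
```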
Abstract:
Vacuum cleaners can release large concentrations of particles, both in their exhaust air and through resuspension of settled dust. However, the size, variability and microbial diversity of these emissions are unknown, despite evidence suggesting they may contribute to allergic responses and infection transmission indoors. This study aimed to evaluate bioaerosol emission from various vacuum cleaners. We sampled the air in an experimental flow tunnel where vacuum cleaners were run, and their airborne emissions were collected with closed-face cassettes. Dust samples were also collected from the dust bag. Total bacteria, total archaea, Penicillium/Aspergillus and total Clostridium cluster 1 were quantified with specific qPCR protocols, and emission rates were calculated. The presence of Clostridium botulinum and of antibiotic resistance genes was assessed in each sample using endpoint PCR. Bacterial diversity was also analyzed using denaturing gradient gel electrophoresis (DGGE), image analysis and band sequencing. We demonstrated that emissions of bacteria and moulds (Pen/Asp) can reach values as high as 1E05/min and that those emissions are not related to each other. The bacterial and mould content of the bag dust was also consistent across the vacuums assessed, reaching up to 1E07 bacteria or mould equivalents/g. Antibiotic resistance genes were detected in several samples. No archaea or C. botulinum were detected in any air samples. Diversity analyses showed that most bacteria are from human sources, in keeping with other recent results. These results highlight the capability of vacuum cleaners to disseminate appreciable quantities of moulds and human-associated bacteria indoors and their role as a source of exposure to bioaerosols.
Abstract:
Background aims: Mesenchymal stromal cells (MSCs) cultivated from the corneal limbus (L-MSCs) provide a potential source of cells for corneal repair. In the present study, we investigated the immunosuppressive properties of human L-MSCs and putative rabbit L-MSCs to develop an allogeneic therapy and animal model of L-MSC transplantation. Methods: MSC-like cultures were established from the limbal stroma of human and rabbit (New Zealand white) corneas using either serum-supplemented medium or a commercial serum-free MSC medium (MesenCult-XF Culture Kit; Stem Cell Technologies, Melbourne, Australia). L-MSC phenotype was examined by flow cytometry. The immunosuppressive properties of L-MSC cultures were assessed using mixed leukocyte reactions. L-MSC cultures were also tested for their ability to support colony formation by primary limbal epithelial (LE) cells. Results: Human L-MSC cultures were typically CD34−, CD45− and HLA-DR− and CD73+, CD90+, CD105+ and HLA-ABC+. High levels (>80%) of CD146 expression were observed for L-MSC cultures grown in serum-supplemented medium but not for cultures grown in MesenCult-XF (approximately 1%). Rabbit L-MSCs were approximately 95% positive for major histocompatibility complex class I and expressed lower levels of major histocompatibility complex class II (approximately 10%), CD45 (approximately 20%), CD105 (approximately 60%) and CD90 (<10%). Human L-MSCs and rabbit L-MSCs suppressed human T-cell proliferation by up to 75%. Conversely, L-MSCs from either species stimulated a 2-fold to 3-fold increase in LE cell colony formation. Conclusions: L-MSCs display immunosuppressive qualities in addition to their established non-immunogenic profile and stimulate LE cell growth in vitro across species boundaries. These results support the potential use of allogeneic L-MSCs in the treatment of corneal disorders and suggest that the rabbit would provide a useful pre-clinical model.
Abstract:
The double-pass counter-flow v-groove collector is considered one of the most efficient solar air collectors. In this design, the inlet air initially flows through the top part of the collector, changes direction once it reaches the end of the collector, and then flows below the collector to the outlet. A mathematical model is developed for this type of collector and simulations are carried out in MATLAB. The simulation results were verified against the results of three previous studies, and it was found that the simulation predicts the performance of the air collector accurately, as shown by the comparison of experimental data with simulation. The difference between the predicted and experimental results is at most approximately 7%, which is within an acceptable limit given the uncertainties in the input parameter values. A parametric study was performed and it was found that solar radiation, inlet air temperature, flow rate and collector length have a significant effect on the efficiency of the air collector. Additionally, the results are compared with those of a single-flow V-groove collector.
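For context, the instantaneous thermal efficiency of an air collector is commonly computed from the useful heat gain relative to the incident radiation. The sketch below uses that standard energy-balance relation with purely illustrative parameter values; it is not the paper's full double-pass v-groove model.

```python
# Hedged sketch: standard instantaneous thermal efficiency of a solar air collector,
#   eta = m_dot * cp * (T_out - T_in) / (A * G)
# Values below are illustrative only; the paper's MATLAB model resolves the full
# double-pass v-groove geometry.
def collector_efficiency(m_dot, t_in, t_out, area, irradiance, cp_air=1005.0):
    """m_dot [kg/s], temperatures [°C or K, consistent], area [m^2], irradiance [W/m^2]."""
    useful_gain = m_dot * cp_air * (t_out - t_in)   # useful heat gain in W
    return useful_gain / (area * irradiance)

# Example (illustrative numbers): 0.04 kg/s of air heated from 30 °C to 55 °C
# on a 2 m^2 collector receiving 900 W/m^2 gives eta ≈ 0.56.
print(f"eta = {collector_efficiency(0.04, 30.0, 55.0, 2.0, 900.0):.2f}")
```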
Abstract:
Crashes on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing crashes will help address congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists, and that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with traffic flow data from the hour prior to each crash using an incident detection algorithm. Traffic flow trends (traffic speed/occupancy time series) revealed that crashes can be clustered with regard to the dominant traffic flow pattern prior to the crash. Using the k-means clustering method allowed the crashes to be clustered based on their flow trends rather than their distance. Four major trends were found in the clustering results. Based on these findings, crash likelihood estimation algorithms can be fine-tuned to the monitored traffic flow conditions with a sliding window of 60 minutes, to increase the accuracy of the results and minimize false alarms.
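A minimal version of the clustering step described here (grouping pre-crash speed profiles with k-means) might look like the sketch below. The number of clusters and the crash count come from the abstract, but the data layout, sampling interval and preprocessing are assumptions.

```python
# Hedged sketch: cluster one-hour pre-crash traffic speed profiles with k-means.
# Each row of `profiles` is one crash: speeds sampled (here, every 5 minutes) over
# the hour before the crash. Layout and scaling choices are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_precrash_profiles(profiles: np.ndarray, n_clusters: int = 4, seed: int = 0):
    """profiles: (n_crashes, n_timesteps) array of pre-crash speeds."""
    scaled = StandardScaler().fit_transform(profiles)                 # normalise each timestep
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(scaled)
    return km.labels_, km.cluster_centers_

# Example with synthetic data: 824 crashes x 12 five-minute speed readings.
rng = np.random.default_rng(0)
profiles = rng.normal(70, 15, size=(824, 12))
labels, centres = cluster_precrash_profiles(profiles)
print(np.bincount(labels))   # size of each dominant pre-crash trend
```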
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with the corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. Using the K-Means clustering method with a Euclidean distance function allowed the crashes to be clustered. Then, normal-situation data were extracted based on the time distribution of crashes and were clustered for comparison with the “high risk” clusters. Five major trends were found in the clustering results for both high-risk and normal conditions. The study found that the traffic regimes differed in their speed trends. Based on these findings, crash likelihood estimation models can be fine-tuned to the monitored traffic conditions with a sliding window of 30 minutes, to increase the accuracy of the results and minimize false alarms.
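Building on the clustered trends, an online component of a crash-likelihood estimator could assign the most recent 30-minute window of speeds to its nearest high-risk or normal cluster centroid. The sketch below shows such a nearest-centroid assignment; it is an illustrative building block, not the authors' estimation model, and the labelling of clusters as "high risk" versus "normal" is assumed.

```python
# Hedged sketch: assign the latest 30-minute speed window to the nearest cluster
# centroid (Euclidean distance). Centroids would come from an earlier K-Means step.
import numpy as np

def nearest_cluster(window: np.ndarray, centroids: np.ndarray) -> int:
    """window: (n_timesteps,) recent speeds; centroids: (k, n_timesteps)."""
    distances = np.linalg.norm(centroids - window, axis=1)
    return int(np.argmin(distances))

# Example: six 5-minute readings (30 minutes) compared against 5 trend centroids.
centroids = np.random.default_rng(1).normal(70, 10, size=(5, 6))
recent = np.array([68, 64, 55, 47, 40, 36], dtype=float)   # decelerating flow
print("closest trend:", nearest_cluster(recent, centroids))
```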
Abstract:
In the recent manuscript published by Egodawatta et al. (2013), the authors investigated the build-up process of heavy metals (HMs) associated with road-deposited sediment (RDS) on residential road surfaces, and presented empirical models for the prediction of both the surface loads and build-up rates of HMs on these surfaces...
Abstract:
The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
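Cardiac parasympathetic activity in this kind of study is typically indexed from heart-rate variability; one widely used vagal marker is RMSSD, the root mean square of successive differences of R-R intervals. The sketch below computes RMSSD from a series of R-R intervals as a generic illustration; it is not necessarily the specific index or analysis pipeline used in this review, and the example values are invented.

```python
# Hedged sketch: RMSSD (root mean square of successive differences of R-R intervals),
# a common index of cardiac parasympathetic (vagal) activity.
import numpy as np

def rmssd(rr_intervals_ms: np.ndarray) -> float:
    """rr_intervals_ms: 1-D array of consecutive R-R intervals in milliseconds."""
    diffs = np.diff(rr_intervals_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Example: during recovery, RMSSD would be expected to rise back towards baseline.
baseline = np.array([820, 835, 810, 845, 830, 825], dtype=float)
post_exercise = np.array([540, 545, 538, 552, 547, 541], dtype=float)
print(f"baseline RMSSD = {rmssd(baseline):.1f} ms, "
      f"post-exercise RMSSD = {rmssd(post_exercise):.1f} ms")
```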
Abstract:
A numerical investigation is carried out for natural convection heat transfer in an isosceles triangular enclosure partitioned in the centre by a vertical wall with infinite conductivity. A sudden temperature difference between the two zones of the enclosure is imposed to trigger natural convection. As a result, heat is transferred between both sides of the enclosure through the conducting vertical wall, with natural convection boundary layers forming adjacent to the middle partition and the two inclined surfaces. The finite-volume-based software Ansys 14.5 (Fluent) is used for the numerical simulations. Numerical results are obtained for different values of the aspect ratio, A (0.2, 0.5 and 1.0), and the Rayleigh number, Ra (10^5 ≤ Ra ≤ 10^8), for a fixed Prandtl number of air, Pr = 0.72. The simulations indicate that the development of the coupled thermal boundary layers adjacent to the partition undergoes several distinct stages, including an initial stage, a transitional stage and a steady stage. Time-dependent features of the coupled thermal boundary layers, as well as the overall natural convection flow in the partitioned enclosure, are discussed in this study.
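For reference, the governing dimensionless groups quoted above follow from their standard definitions, and can be evaluated directly from the fluid properties and the imposed temperature difference. The sketch below does this for air; the property values and enclosure dimension are illustrative, not the paper's exact inputs.

```python
# Hedged sketch: Rayleigh and Prandtl numbers from their standard definitions,
#   Ra = g * beta * dT * L**3 / (nu * alpha),   Pr = nu / alpha.
# Property values are illustrative figures for air near room temperature.
def rayleigh(g, beta, delta_t, length, nu, alpha):
    return g * beta * delta_t * length ** 3 / (nu * alpha)

def prandtl(nu, alpha):
    return nu / alpha

g, beta = 9.81, 1.0 / 300.0      # gravity [m/s^2], thermal expansion coefficient [1/K]
nu, alpha = 1.5e-5, 2.1e-5       # kinematic viscosity, thermal diffusivity [m^2/s]
delta_t, length = 10.0, 0.5      # imposed temperature difference [K], enclosure height [m]

print(f"Ra = {rayleigh(g, beta, delta_t, length, nu, alpha):.2e}")   # ~1.3e8
print(f"Pr = {prandtl(nu, alpha):.2f}")                               # ~0.71
```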
Abstract:
Passenger flow studies in airport terminals have shown consistent statistical relationships between airport spatial layout and pedestrian movement, facilitating prediction of movement from terminal designs. However, these studies are done at an aggregate level and do not incorporate how individual passengers make decisions at a microscopic level; therefore, they do not explain the formation of complex movement flows. In addition, existing models mostly focus on standard airport processing procedures such as immigration and security, but seldom consider discretionary activities of passengers, and thus are not able to truly describe the full range of passenger flows within airport terminals. Because the route-choice decision-making of passengers involves many uncertain factors within airport terminals, the mechanisms governing route choice have proven difficult to capture and quantify. Could the study of cognitive factors of passengers (i.e. the mental preferences that determine which on-airport facility to use) be useful for tackling these issues? Assuming that movement in virtual simulated environments is analogous to movement in real environments, passenger behaviour dynamics can be studied through virtual experiments. Three levels of dynamics have been devised for motion control: the localised field, the tactical level, and the strategic level. The localised field refers to basic motion capabilities, such as walking speed, direction and avoidance of obstacles; the other two levels represent cognitive route-choice decision-making. This research views passenger flow problems via a "bottom-up approach", regarding individual passengers as independent intelligent agents who behave autonomously and are able to interact with others and the ambient environment. In this regard, passenger flow formation becomes an emergent phenomenon arising from large numbers of passengers interacting with one another. In the thesis, the passenger flow in airport terminals was first investigated, with discretionary activities of passengers integrated with standard processing procedures. The localised field for passenger motion dynamics was constructed with a force-based model. Next, advanced traits of passengers (such as their desire to shop, their comfort with technology and their willingness to ask for assistance) were formulated to facilitate tactical route-choice decision-making. The traits consist of quantified measures of the mental preferences of passengers as they travel through airport terminals; each category of traits indicates a decision which passengers may take. The traits were inferred through a Bayesian network model by analysing probabilities based on currently available data, and route-choice decisions were finalised by calculating the corresponding utilities from those probabilities. Three kinds of simulation outcomes were generated: queuing length before checkpoints, average dwell time of passengers at service facilities, and instantaneous space utilisation. Queuing length reflects the number of passengers in a queue; long queues cause significant delay in processing procedures. The dwell time of each passenger agent at the service facilities was recorded, and the overall dwell time of passenger agents at typical facility areas was analysed to demonstrate utilisation in the temporal aspect.
For the spatial aspect, the number of passenger agents dwelling within specific terminal areas can be used to estimate service rates. All outcomes were demonstrated for typical simulated passenger flows and directly reflect terminal capacity. The simulation results strongly suggest that integrating discretionary activities of passengers makes the passenger flows more intuitive, and that observing the probabilities of mental preferences by inferring advanced traits provides an approach capable of supporting tactical route-choice decision-making. On the whole, the research studied passenger flows in airport terminals with an agent-based model, investigating the individual characteristics of passengers and their impact on passengers' route-choice decisions. Finally, intuitive passenger flows in airport terminals were able to be realised in simulation.
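To make the route-choice mechanism concrete, a highly simplified agent sketch is given below: each passenger agent carries trait probabilities (e.g. desire to shop, willingness to ask for assistance) and picks the facility with the highest expected utility. The trait names, utilities and weighting are illustrative stand-ins, not the thesis's Bayesian-network formulation.

```python
# Hedged sketch: a passenger agent choosing its next discretionary facility by
# expected utility. Trait probabilities stand in for Bayesian-network inference;
# all names, weights and utilities are illustrative.
from dataclasses import dataclass, field

@dataclass
class Passenger:
    # Probability (0..1) that this agent exhibits each advanced trait.
    traits: dict = field(default_factory=lambda: {
        "desire_to_shop": 0.6,
        "comfort_with_technology": 0.8,
        "asks_for_assistance": 0.3,
    })

    def choose_facility(self, facilities: dict) -> str:
        """facilities: name -> {trait: utility contribution if the trait is present}."""
        def expected_utility(payoffs):
            return sum(self.traits.get(t, 0.0) * u for t, u in payoffs.items())
        return max(facilities, key=lambda name: expected_utility(facilities[name]))

facilities = {
    "duty_free":    {"desire_to_shop": 1.0},
    "self_checkin": {"comfort_with_technology": 0.9},
    "info_desk":    {"asks_for_assistance": 1.0},
}
print(Passenger().choose_facility(facilities))   # -> "self_checkin" for this agent
```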
Abstract:
An experimental dataset representing a typical flow field in a stormwater gross pollutant trap (GPT) was visualised. A technique was developed to apply the image-based flow visualisation (IBFV) algorithm to the raw dataset. Particle image velocimetry (PIV) software was previously used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding stormwater pollutant capture and retention behaviour within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate the possible flow paths of pollutants entering the GPT. The investigated flow paths were compared with the behaviour of pollutants monitored during experiments.
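As a lighter-weight alternative view of the same kind of data, scattered PIV velocity vectors can be interpolated onto a regular grid and rendered as streamlines. The sketch below uses SciPy and Matplotlib for this; it is not the IBFV technique applied in the study, and the synthetic data merely stand in for GPT measurements.

```python
# Hedged sketch: visualise scattered 2-D PIV velocity vectors by interpolating them
# onto a regular grid and drawing streamlines (a simple alternative to IBFV).
import numpy as np
from scipy.interpolate import griddata
import matplotlib.pyplot as plt

def plot_streamlines(x, y, u, v, resolution=100):
    """x, y, u, v: 1-D arrays of scattered vector positions and velocity components."""
    xi = np.linspace(x.min(), x.max(), resolution)
    yi = np.linspace(y.min(), y.max(), resolution)
    XI, YI = np.meshgrid(xi, yi)
    # Interpolate onto the grid; fill gaps outside the convex hull with zeros.
    UI = np.nan_to_num(griddata((x, y), u, (XI, YI), method="linear"))
    VI = np.nan_to_num(griddata((x, y), v, (XI, YI), method="linear"))
    plt.streamplot(XI, YI, UI, VI, density=1.5)
    plt.gca().set_aspect("equal")
    plt.show()

# Example with synthetic swirling data standing in for GPT measurements.
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 500), rng.uniform(-1, 1, 500)
plot_streamlines(x, y, -y, x)   # solid-body-rotation-like field
```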
Abstract:
Because of increased competition between healthcare providers, higher customer expectations, stringent checks on insurance payments and new government regulations, it has become vital for healthcare organisations to enhance the quality of the care they provide, to increase efficiency, and to improve the cost-effectiveness of their services. Consequently, a number of quality management concepts and tools are employed in the healthcare domain to achieve the most efficient use of time, manpower, space and other resources. Emergency departments are designed to provide a high-quality medical service with immediate availability of resources to those in need of emergency care. The challenge of maintaining a smooth flow of patients in emergency departments is a global problem. This study attempts to improve patient flow in emergency departments by combining Lean techniques and Six Sigma methodology in a comprehensive conceptual framework. The proposed research will develop a systematic approach through the integration of Lean techniques with Six Sigma methodology to improve patient flow in emergency departments. The results reported in this paper are based on a standard questionnaire survey of 350 patients in the Emergency Department of Aseer Central Hospital in Saudi Arabia. The results of the study allowed us to determine the most significant variables affecting patient satisfaction with patient flow, including waiting time during treatment in the emergency department, the effectiveness of the system when dealing with patients' complaints, and the layout of the emergency department. The proposed model will be developed into a performance evaluation metric based on these critical variables and will be evaluated in future work using fuzzy logic for continuous quality improvement.
Abstract:
Background: Microvessel density, an indirect measure of angiogenesis, has been shown to be an independent prognostic marker in many solid tumours including non-small cell lung cancer (NSCLC). Platelets transport and release angiogenic growth factors. Platelets are increasingly likely to adhere to tumour microvessels due to raised expression of platelet-binding proteins and stasis in blood flow. Increased vascular permeability in tumour microvessels facilitates platelet extravasation into the extracellular matrix. Adherence and extravasation both lead to platelet activation and release of growth factors capable of instigating the angiogenic process. Methods: A total of 181 patients were identified who underwent resection of stage I-IIIa NSCLC with a post-operative survival >60 days. Patients were followed up for a minimum of 24 months. Sections from the tumour periphery were stained for the endothelial marker CD34 (Novocastra NCL-END) using standard ABC immunohistochemistry. Chalkley counting was used to assess microvessel density. Results: A pre-operative platelet count greater than the median and above the normal range (>400) was associated with a poor outcome (P = 0.01 and P = 0.04, respectively). Tumours with an above-median and high Chalkley count (upper tertile) had a worse prognosis (P = 0.007 and P = 0.0006, respectively). There was no association between platelet count and Chalkley count. Conclusions: Platelet and microvessel counts are both potential prognostic markers for NSCLC. The role of platelets in the angiogenic process needs to be further investigated. © 2000 Elsevier Science Ireland Ltd.