186 results for Non-normal process


Relevance: 30.00%

Publisher:

Abstract:

The field of prognostics has attracted significant interest from the research community in recent times. Prognostics enables the prediction of failures in machines, giving plant operators benefits such as shorter downtimes, higher operational reliability, reduced operations and maintenance costs, and more effective maintenance and logistics planning. Prognostic systems have been successfully deployed for the monitoring of relatively simple rotating machines. However, machines and associated systems today are increasingly complex, so there is an urgent need to develop prognostic techniques for such complex systems operating in the real world. This review focuses on prognostic techniques that can be applied to rotating machinery operating under non-linear and non-stationary conditions. The general concept of these techniques, the pros and cons of applying them, and their applications in the research field are discussed. Finally, the opportunities and challenges in implementing prognostic systems and developing effective techniques for monitoring machines operating under non-stationary and non-linear conditions are also discussed.
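The review discusses prognostic techniques in general terms rather than prescribing one algorithm. As a rough, hypothetical illustration of the basic prognostic idea it surveys (trending a condition indicator and extrapolating it to a failure threshold to estimate remaining useful life), the sketch below fits an exponential degradation model to made-up vibration data; the model form, threshold and all values are assumptions, not methods recommended by the paper.

```python
import numpy as np

# Hypothetical condition-monitoring data: vibration amplitude over time (hours).
t = np.array([0, 50, 100, 150, 200, 250, 300], dtype=float)
y = np.array([1.0, 1.1, 1.3, 1.6, 2.0, 2.6, 3.3])   # degradation indicator
FAILURE_THRESHOLD = 8.0                              # assumed failure level

# Fit an exponential degradation model y = a * exp(b * t) via a log-linear fit.
b, log_a = np.polyfit(t, np.log(y), 1)
a = np.exp(log_a)

# Extrapolate to the time at which the indicator is predicted to cross the threshold.
t_fail = np.log(FAILURE_THRESHOLD / a) / b
rul = t_fail - t[-1]                                 # remaining useful life from "now"
print(f"Predicted failure at ~{t_fail:.0f} h; RUL ~{rul:.0f} h")
```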

Relevance: 30.00%

Publisher:

Abstract:

With the extensive use of rating systems on the web, and their significance in users' decision-making processes, the need for more accurate aggregation methods has emerged. The naïve aggregation method, using the simple mean, is no longer adequate for providing accurate reputation scores for items [6]; hence, several studies have been conducted to provide more accurate alternative aggregation methods. Most current reputation models do not consider the distribution of ratings across the different possible rating values. In this paper, we propose a novel reputation model that generates more accurate reputation scores for items by applying the normal distribution over ratings. Experiments show promising results for our proposed model over state-of-the-art ones on both sparse and dense datasets.
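The abstract does not spell out the model's exact formulation. As a purely illustrative sketch of the general idea (letting a normal distribution fitted to an item's ratings down-weight atypical ratings, instead of taking the plain mean), the snippet below compares a naïve mean with a Gaussian-weighted score; the weighting scheme and all names are assumptions, not the authors' model.

```python
import numpy as np

def naive_reputation(ratings):
    """Baseline: simple arithmetic mean of the ratings."""
    return float(np.mean(ratings))

def normal_weighted_reputation(ratings):
    """Illustrative only: weight each rating by its density under a normal
    distribution fitted to the item's ratings, so outlying ratings contribute
    less than typical ones."""
    r = np.asarray(ratings, dtype=float)
    mu, sigma = r.mean(), r.std()
    if sigma == 0:                                   # all ratings identical
        return float(mu)
    w = np.exp(-0.5 * ((r - mu) / sigma) ** 2)       # unnormalised Gaussian density
    return float(np.sum(w * r) / np.sum(w))

ratings = [5, 5, 4, 5, 1]                            # one outlying rating
print(naive_reputation(ratings))                     # 4.0
print(normal_weighted_reputation(ratings))           # higher, as the outlier is down-weighted
```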

Relevance: 30.00%

Publisher:

Abstract:

Background: The VEGF pathway has become an important therapeutic target in lung cancer, where VEGF has long been established as a potent pro-angiogenic growth factor expressed by many types of tumors. While Bevacizumab (Avastin) has proven successful in increasing the objective tumor response rate and in prolonging progression-free and overall survival in patients with NSCLC, the survival benefit is relatively short and the majority of patients eventually relapse. The current use of tyrosine kinase inhibitors alone and in combination with chemotherapy has been underwhelming, highlighting an urgent need for new targeted therapies. In this study, we examined the mechanisms of VEGF-mediated survival in NSCLC cells and the role of the Neuropilin receptors in this process. Methods: NSCLC cells were screened for expression of VEGF and its receptors. The effects of recombinant VEGF and its blockade on lung tumor cell proliferation and cell cycle were examined. Phosphorylation of Akt and Erk1/2 proteins was examined by high-content analysis and confocal microscopy. The effects of silencing VEGF on cell proliferation and survival signaling were also assessed. A Neuropilin-1 (NP1) stably transfected cell line was generated, and cell growth characteristics, together with pAkt and pErk1/2 signaling, were studied in response to VEGF and its blockade. Tumor growth studies were carried out in nude mice following subcutaneous injection of NP1-overexpressing cells. Results: Inhibition of the VEGF pathway with anti-VEGF and anti-VEGFR-2 antibodies or siRNA against VEGF, NP1 and NP2 resulted in growth inhibition of NP1-positive tumor cell lines, associated with down-regulation of PI3K and MAPK kinase signaling. Stable transfection of NP1-negative cells with NP1 induced proliferation in vitro, which was further enhanced by exogenous VEGF. In vivo, NP1-overexpressing cells significantly increased tumor growth in xenografts compared to controls. Conclusions: Our data demonstrate that VEGF is an autocrine growth factor in NSCLC, signaling at least in part through NP1. Targeting this VEGF receptor may offer potential as a novel therapeutic approach, and also supports the evaluation of NP1 as a biomarker predicting sensitivity or resistance to VEGF- and VEGFR-targeted therapies in the clinical arena.

Relevance: 30.00%

Publisher:

Abstract:

Companies standardise and automate their business processes in order to improve process efficiency and minimise operational risks. However, it is difficult to eliminate all process risks during the process design stage, because processes often run in complex and changeable environments and rely on human resources. Timely identification of process risks is crucial in order to ensure the achievement of process goals. Business processes are often supported by information systems that record information about their executions in event logs. In this article we present an approach, and a supporting tool, for evaluating overall process risk and predicting process outcomes based on the analysis of information recorded in event logs. It can help managers evaluate the overall risk exposure of their business processes, track the evolution of overall process risk, identify changes, and predict process outcomes based on the current value of overall process risk. The approach was implemented and validated using synthetic event logs and through a case study with a real event log.
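The abstract does not define how overall process risk is computed. Purely as an illustration of the kind of event-log analysis described (deriving a risk figure from completed cases and tracking its evolution), the sketch below uses a simple proxy, the fraction of completed cases with an undesirable outcome, over a hypothetical log; the data structure, field names and risk measure are assumptions rather than the paper's definitions.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical, simplified event-log representation: one record per completed case.
@dataclass
class Case:
    case_id: str
    duration_days: float
    outcome_ok: bool          # True if the case met its goal (e.g. deadline, no fault)

def overall_process_risk(cases: List[Case]) -> float:
    """Fraction of completed cases with an undesirable outcome (simple proxy)."""
    if not cases:
        return 0.0
    return sum(1 for c in cases if not c.outcome_ok) / len(cases)

def risk_trend(cases: List[Case], window: int = 50) -> List[float]:
    """Overall risk over a sliding window, to track its evolution over time."""
    return [overall_process_risk(cases[max(0, i - window):i])
            for i in range(window, len(cases) + 1)]

log = [Case("c1", 3.0, True), Case("c2", 9.5, False), Case("c3", 4.2, True)]
print(overall_process_risk(log))   # 1/3 of cases ended badly
```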

Relevance: 30.00%

Publisher:

Abstract:

Non-motorised underwater treadmills are commonly used in fitness activities. However, no studies have examined the physiological and biomechanical responses to walking on non-motorised treadmills at different intensities and depths. Fifteen middle-aged healthy women underwent two underwater walking tests at two different depths, immersed either up to the xiphoid process (deep water) or the iliac crest (shallow water), at 100, 110, 120 and 130 steps per minute (spm). Oxygen consumption (VO2), heart rate (HR), blood lactate concentration, perceived exertion and step length were determined. Compared to deep water, walking in shallow water elicited significantly higher VO2 (+13.5% on average) and HR (+8.1% on average) responses at all intensities. Water depth did not influence lactate concentration, whereas perceived exertion was higher in shallow than in deep water only at 120 (+40%) and 130 (+39.4%) spm. Average step length decreased as the intensity increased (from 100 to 130 spm), irrespective of water depth. Expressed as a percentage of maximum, average VO2 and HR were 64–76% of peak VO2 and 71–90% of maximum HR, respectively, at both water depths. Accordingly, this form of exercise can be classified in the "vigorous" range of exercise intensity at any of the step frequencies used in this study.

Relevance: 30.00%

Publisher:

Abstract:

Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes, which is a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study developed three theoretical scenarios, based on research findings that variations in the particle size fractions <150 µm and >150 µm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of the variability characteristics of pollutant build-up and wash-off processes into stormwater quality models. The research outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
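The paper's theoretical scenarios and their mathematical formulation are not reproduced here. As a generic, hypothetical illustration of how variability in build-up and wash-off parameters propagates into stormwater quality model outputs, the sketch below applies commonly used exponential build-up and wash-off formulations with assumed parameter spreads; the equations, coefficients and units are illustrative assumptions, not the study's scenarios.

```python
import numpy as np

rng = np.random.default_rng(0)

def buildup(days, b_max, k_b):
    """Commonly used exponential build-up: B(t) = B_max * (1 - exp(-k_b * t))."""
    return b_max * (1.0 - np.exp(-k_b * days))

def washoff(b0, intensity, duration, k_w):
    """Commonly used exponential wash-off of the available load B0."""
    return b0 * (1.0 - np.exp(-k_w * intensity * duration))

# Propagate assumed variability in the coefficients to the predicted wash-off load.
n = 10_000
k_b = rng.normal(0.4, 0.08, n)     # build-up rate coefficient, assumed spread
k_w = rng.normal(0.12, 0.03, n)    # wash-off coefficient, assumed spread
loads = washoff(buildup(7, b_max=120.0, k_b=k_b), intensity=20.0, duration=1.5, k_w=k_w)
print(f"median {np.median(loads):.1f}, 90% interval "
      f"[{np.percentile(loads, 5):.1f}, {np.percentile(loads, 95):.1f}] (arbitrary load units)")
```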

Relevance: 30.00%

Publisher:

Abstract:

Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to catastrophic fusion in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
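As a minimal numerical sketch of the idea described (fuse a candidate range measurement only if the Gaussian process model statistically improves), the snippet below compares the per-point log marginal likelihood of a GP with and without the candidate point. The RBF kernel, fixed hyperparameters and the per-point acceptance rule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(x, y, noise=0.05):
    """Standard GP log marginal likelihood with an RBF kernel and fixed hyperparameters."""
    K = rbf_kernel(x, x) + noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2.0 * np.pi))

def is_consistent(x, y, x_new, y_new, noise=0.05):
    """Accept the candidate point if the per-point log marginal likelihood
    does not degrade when the point is fused with the existing data."""
    lml_old = log_marginal_likelihood(x, y, noise) / len(x)
    x_aug, y_aug = np.append(x, x_new), np.append(y, y_new)
    lml_new = log_marginal_likelihood(x_aug, y_aug, noise) / len(x_aug)
    return lml_new >= lml_old

x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)
print(is_consistent(x, y, 2.5, np.sin(2.5)))   # consistent observation
print(is_consistent(x, y, 2.5, 3.0))           # inconsistent (far from the surface)
```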

Relevance: 30.00%

Publisher:

Abstract:

While there is clear recognition of the need to incorporate sustainable development into university curricula, there is limited research that examines how to achieve that integration or evaluates its impacts on student learning. This paper responds to these knowledge gaps through a case study of curriculum renewal that involved embedding sustainability into a first year engineering curriculum. The initiative was guided by a deliberative and dynamic model for curriculum renewal that brought together internal and external stakeholders through a structured sequence of facilitated workshops and meetings. That process identified sustainability-related knowledge and skills relevant to first year engineering, and faculty members teaching in the first year program were guided through a process of curriculum renewal to meet those needs. The way in which the whole-of-curriculum renewal was undertaken is innovative and provides a case study of precedent in the field of education for sustainability. The study also demonstrates the contribution that a web-based sustainability portal can make in supporting curriculum renewal. Learning and teaching outcomes were evaluated through 'before and after' surveys of the first year engineering students. Statistically significant increases in students' self-reported knowledge of sustainability were measured as a result of exposure to the renewed first year curriculum, confirming the value of the initiative in terms of enhancing student learning. While applied in this case to engineering, the approach used to integrate sustainability into the curriculum is likely to have value for other academic disciplines. Considering student performance on assignments and exam questions relating to sustainability would provide a stronger basis for future research to understand the impact of initiatives like this on student learning.

Relevance: 30.00%

Publisher:

Abstract:

The insecurity of fossil fuel supply is pushing the scientific community to boost investment in the renewable energy sector. Among the many renewable fuels currently available around the world, biodiesel offers an immediate impact on our energy supply, and the strong interest in related research indicates a promising future for biodiesel technology. Heterogeneously catalysed production of biodiesel has emerged as a preferred route because it is environmentally benign, requires no water washing, and makes product separation much easier. The number of well-defined catalyst complexes able to catalyse transesterification reactions efficiently has expanded significantly in recent years. The activity of catalysts, specifically solid acid/base catalysts in transesterification reactions, depends on their structure, the strength of their basicity/acidity, their surface area, and their stability. There are also various process intensification technologies based on the use of alternative energy sources such as ultrasound and microwaves. The latest advances in research and development related to biodiesel production are represented by the non-catalytic supercritical method, and these processes are considered here as forthcoming transesterification routes. The latest developments in this field, featuring highly active catalyst complexes, are outlined in this review. A deeper knowledge of the advances in biofuels research will allow greater insight into the mechanisms of these technologies toward meeting critical future energy challenges.

Relevance: 30.00%

Publisher:

Abstract:

Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is the exception rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using the Γ distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise. The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving more slowly than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
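The HAL-HAS mixture models and search algorithms themselves are not reproduced here. As background for the building block the abstract mentions, the sketch below shows how a standard GTR rate matrix is assembled from exchangeability parameters and equilibrium base frequencies and converted into substitution probabilities over a branch; the parameter values are arbitrary illustrations, not estimates from the paper.

```python
import numpy as np
from scipy.linalg import expm

# Standard GTR construction (illustrative parameter values only):
# Q[i, j] = s[i, j] * pi[j] for i != j, with the diagonal set so rows sum to zero.
pi = np.array([0.30, 0.20, 0.25, 0.25])   # equilibrium frequencies of A, C, G, T
s = np.array([[0.0, 1.0, 4.0, 1.0],       # symmetric exchangeability parameters
              [1.0, 0.0, 1.0, 4.0],
              [4.0, 1.0, 0.0, 1.0],
              [1.0, 4.0, 1.0, 0.0]])

Q = s * pi[None, :]                        # off-diagonal instantaneous rates
np.fill_diagonal(Q, -Q.sum(axis=1))        # rows sum to zero
Q /= -np.sum(pi * np.diag(Q))              # normalise: one expected substitution per unit time

P = expm(Q * 0.1)                          # substitution probabilities over branch length 0.1
print(P.round(3))                          # each row sums to 1
assert np.allclose(P.sum(axis=1), 1.0)
```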

Relevance: 30.00%

Publisher:

Abstract:

Process view technology is attracting increasing attention in modern business process management, as it enables the customisation of business process representation. This capability helps improve privacy protection, authority control, display flexibility, and so on, in business process modelling. One approach to generating process views is to allow users to construct an aggregate over their underlying processes. However, most aggregation approaches rely on the strong assumption that business processes are always well-structured, which is overly strict for BPMN. Aiming to build process views for non-well-structured BPMN processes, this paper investigates the characteristics of BPMN structures, such as tasks, events and gateways, and proposes a formal process view aggregation approach to facilitate BPMN process view creation. A set of consistency rules and construction rules is defined to regulate the aggregation and guarantee order preservation and structural and behavioural correctness, and a novel aggregation technique, called EP-Fragment, is developed to tackle non-well-structured BPMN processes.
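The EP-Fragment technique and the full sets of consistency and construction rules are not reproduced here. As a simplified illustration of the general idea behind aggregation-based view construction (collapse a selected region into one node only if it has a single entry and a single exit, so that ordering with the rest of the process is preserved), the sketch below works on a plain directed graph; the function names and the check are assumptions and ignore BPMN gateway and event semantics.

```python
from typing import Dict, List, Set

def is_single_entry_single_exit(graph: Dict[str, List[str]], region: Set[str]) -> bool:
    """The region may be aggregated only if at most one of its nodes receives edges
    from outside it (single entry) and at most one sends edges outside it (single exit)."""
    entries = {v for u, succs in graph.items() if u not in region
               for v in succs if v in region}
    exits = {u for u in region for v in graph.get(u, []) if v not in region}
    return len(entries) <= 1 and len(exits) <= 1

def aggregate(graph: Dict[str, List[str]], region: Set[str], name: str = "agg"):
    """Collapse the region into a single node, rewiring external edges to/from it."""
    new_graph = {}
    for u, succs in graph.items():
        if u in region:
            continue
        new_graph[u] = [name if v in region else v for v in succs]
    new_graph[name] = sorted({v for u in region for v in graph.get(u, []) if v not in region})
    return new_graph

g = {"start": ["A"], "A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["end"], "end": []}
region = {"A", "B", "C", "D"}
if is_single_entry_single_exit(g, region):
    print(aggregate(g, region))    # {'start': ['agg'], 'end': [], 'agg': ['end']}
```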

Relevance: 30.00%

Publisher:

Abstract:

This paper presents an approach, based on the Lean production philosophy, for rationalising the processes involved in producing specification documents for construction projects. Current construction literature erroneously depicts the creation of construction specifications as a linear process, and this traditional understanding often culminates in process wastes. On the contrary, the evidence suggests that, although generalised, the activities involved in producing specification documents are non-linear. Drawing on the outcomes of participant observation, this paper presents an optimised approach for representing construction specifications. The actors typically involved in producing specification documents are identified, the processes suitable for automation are highlighted, and the central role of tacit knowledge is integrated into a conceptual template of construction specifications. By applying the transformation, flow, value (TFV) theory of Lean production, the paper argues that value creation can be realised by eliminating the wastes associated with the traditional preparation of specification documents, with a view to integrating specifications into digital models such as Building Information Models (BIM). The paper therefore presents an approach for applying the TFV theory as a method for optimising current approaches to generating construction specifications, based on a revised specification-writing model.

Relevance: 30.00%

Publisher:

Abstract:

Organisations are always focussed on ensuring that their business operations are performed in the most cost-effective manner and that processes are responsive to ever-changing cost pressures. In many organisations, however, strategic cost-based decisions at the managerial level are not directly or quickly translatable into process-level operational support. A primary reason for this disconnect is the limited system-based support for cost-informed decisions at the process-operational level in real time. In this paper, we describe the different ways in which a workflow management system (WfMS) can support process-related decisions, guided by cost-informed considerations at the operational level, during execution. As a result, cost information is elevated from its non-functional attribute role to a first-class, fully functional process perspective. The paper defines the success criteria that a WfMS should meet to provide such support, and discusses a reference implementation within the YAWL workflow environment that demonstrates how the various types of cost-informed decision rules are supported, using an illustrative example.
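The YAWL reference implementation and its interfaces are not shown here. As a generic sketch of one kind of cost-informed decision rule the paper discusses (choosing among eligible resources at run time on the basis of estimated cost), consider the following; the resource attributes and the allocation rule are assumptions, not the implementation described in the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Resource:
    name: str
    hourly_rate: float      # assumed cost attribute attached to the resource
    available: bool

def cost_informed_allocation(task_hours: float, candidates: List[Resource]) -> Resource:
    """A simple cost-informed rule: among available resources, pick the one
    with the lowest estimated cost for this work item."""
    available = [r for r in candidates if r.available]
    if not available:
        raise RuntimeError("no available resource for this work item")
    return min(available, key=lambda r: r.hourly_rate * task_hours)

pool = [Resource("senior_clerk", 80.0, True),
        Resource("junior_clerk", 45.0, True),
        Resource("contractor", 60.0, False)]
print(cost_informed_allocation(task_hours=2.0, candidates=pool).name)   # junior_clerk
```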

Relevance: 30.00%

Publisher:

Abstract:

Objective: To examine if streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.
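The abstract does not report the number of applications or the working-days-per-year convention behind the 614 working-years figure. The back-of-the-envelope calculation below only illustrates how such a total follows from an average time per application; the application count and conventions used are hypothetical.

```python
# Purely illustrative arithmetic: the application count and the working-days-per-year
# convention below are assumptions, not figures reported in the study.
avg_days_per_application = 38        # reported average after streamlining
applications = 3700                  # hypothetical number of submissions
working_days_per_year = 230          # hypothetical convention

total_days = avg_days_per_application * applications
total_years = total_days / working_days_per_year
print(f"{total_years:.0f} working years")   # ~611 with these assumed inputs
```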

Relevance: 30.00%

Publisher:

Abstract:

AIM: To examine the prevalence of dyslexia and Meares–Irlen syndrome (MIS) among female students and to determine their level of visual stress in comparison with normal subjects. METHODS: A random sample of 450 female medical students of King Saud University, Riyadh (age range 18–30 years) responded to a wide range of questions designed to accomplish the aims of this study. The detailed questionnaire consisted of 54 questions, with 12 questions enquiring about the ocular history and demographics of participants and 42 questions on visual symptoms. Items were categorised into critical and non-critical questions (CQ and NCQ) and rated on a four-point Likert scale. Based on the responses obtained, the subjects were grouped into normal readers (control), dyslexics with or without MIS (Group 1), and subjects with MIS only (Group 2). Responses were analysed as averages; mean scores were calculated and compared between groups using one-way analysis of variance to evaluate total (TVSS = NCQ + CQ), critical and non-critical visual stress scores. The relationships between categorical variables such as age, handedness and condition were assessed with the chi-square test. RESULTS: The completion rate was 96.8%; the majority of the respondents (92%) were normal readers, 2% were dyslexic and 6% had MIS. The groups were age-matched. More than half of the participants had visited an eye care practitioner in the last two years. About 13% had been recommended eye exercises, and one participant experienced pattern glare. Hand preference was not associated with any condition, but Group 1 subjects (3/9, 33%) were significantly more likely to have been diagnosed with lazy eye than Group 2 (2/27, 7%) and control (27/414, 5%) subjects. The mean ± SD of the TVSS responses was 63 ± 14, compared with 44 ± 9 for CQ and 19 ± 5 for NCQ. Responses for all three variables were normally distributed, but the CQ responses were on average more positive (82%) in Group 2 and less positive (46%) in Group 1 than in controls. With NCQ, the responses were equally less positive in Groups 1 and 2 than in controls. Group 2 subjects showed significantly higher TVSS (P = 0.002), NCQ (P = 0.006) and CQ (P = 0.008) visual stress scores than controls, but no difference between Group 1 and control subjects was observed for any of the scores (P > 0.05 for all comparisons). CONCLUSION: The prevalence of dyslexia and MIS among Saudi female students was 2% and 6%, respectively. Critical questions performed best for assessing visual stress symptoms in dyslexic and MIS subjects. In general, students with MIS were more sensitive to visual stress than normal students, but dyslexics were more likely to present with a lazy eye than MIS subjects and normal readers.
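The questionnaire items and scoring details are not reproduced in the abstract beyond TVSS = CQ + NCQ on a four-point Likert scale. The sketch below simply illustrates that scoring scheme for one hypothetical participant; the item counts and responses are made up.

```python
import numpy as np

# Hypothetical responses for one participant: values 1-4 on a four-point Likert scale.
critical_items = np.array([3, 4, 2, 3, 4, 3, 2, 4, 3, 3])   # CQ responses (assumed)
non_critical_items = np.array([2, 1, 2, 3, 2, 1, 2, 2])     # NCQ responses (assumed)

cq_score = int(critical_items.sum())
ncq_score = int(non_critical_items.sum())
tvss = cq_score + ncq_score                                  # TVSS = CQ + NCQ, as in the abstract
print(f"CQ = {cq_score}, NCQ = {ncq_score}, TVSS = {tvss}")
```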