977 results for Operation analysis
Abstract:
Objectives: To assess the potential source of variation that the surgeon may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy, involving 43 surgeons. The primary end point of the trial was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, giving a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted using the method of maximum likelihood in SAS®. Results: There were many convergence problems. These were resolved using a variety of approaches, including: treating all effects as fixed for the initial model building; modelling the variance of a parameter on a logarithmic scale; and centring continuous covariates. The initial model building indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power, and the variance estimates were small with large standard errors, indicating that their precision may be questionable.
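As an illustration of the modelling approach described above (not the authors' SAS code), here is a minimal Python sketch of a random-intercept logistic model with surgeon as the random effect, using statsmodels. The file name and column names ('complication', 'op_type', 'age', 'surgeon') are hypothetical, and the variational Bayes fit stands in for the paper's maximum likelihood estimation.

```python
# Hedged sketch: logistic mixed model with a random surgeon intercept.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical data: one row per patient.
df = pd.read_csv("hysterectomy_trial.csv")

# Centring the continuous covariate, as the abstract reports, aids convergence.
df["age_c"] = df["age"] - df["age"].mean()

model = BinomialBayesMixedGLM.from_formula(
    "complication ~ C(op_type) + age_c",   # fixed effects
    {"surgeon": "0 + C(surgeon)"},         # random intercept per surgeon
    df,
)
result = model.fit_vb()                    # variational Bayes approximation
print(result.summary())
```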
Abstract:
This paper examines the life cycle GHG emissions from existing UK pulverized coal power plants. The life cycle of the electricity generation plant includes construction, operation and decommissioning. The operation phase is extended to upstream and downstream processes. Upstream processes include the mining and transport of coal, including methane leakage, and the production and transport of limestone and ammonia, which are necessary for flue gas clean-up. Downstream processes, on the other hand, include waste disposal and the recovery of land used for surface mining. The methodology used is material-based process analysis, which allows calculation of the total emissions for each process involved. A simple model for predicting the energy and material requirements of the power plant is developed. Preliminary calculations reveal that for a typical UK coal-fired plant, the life cycle emissions amount to 990 g CO2-e/kWh of electricity generated, which compares well with previous UK studies. The majority of these emissions result from direct fuel combustion (882 g/kWh; 89% of the total), with methane leakage from mining operations accounting for 60% of indirect emissions. In total, mining operations (including methane leakage) account for 67.4% of indirect emissions, while limestone and other material production and transport account for 31.5%. The methodology developed is also applied to a typical IGCC power plant. It is found that IGCC life cycle emissions are 15% less than those from PC power plants. Furthermore, upon investigating the influence of power plant parameters on life cycle emissions, it is determined that, while the effect of changing the load factor is negligible, increasing efficiency from 35% to 38% can reduce emissions by 7.6%. The current study is funded by the UK Natural Environment Research Council (NERC) and is undertaken as part of the UK Carbon Capture and Storage Consortium (UKCCSC). Future work will investigate the life cycle emissions from other power generation technologies with and without carbon capture and storage. The current paper reveals that, when CCS is employed, the emissions during generation may decrease to a level where the emissions from upstream processes (i.e. coal production and transport) become dominant, and so the life cycle efficiency of the CCS system can be significantly reduced. The location of coal, coal composition and mining method are important in determining the overall impacts. In addition to studying the net emissions from CCS systems, future work will also investigate the feasibility and technoeconomics of these systems as a means of carbon abatement.
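The reported split can be checked with simple arithmetic; the sketch below merely re-derives the indirect component and its shares from the figures quoted in the abstract.

```python
# Back-of-envelope check of the emission split reported above.
total = 990.0      # g CO2-e/kWh, life cycle total
direct = 882.0     # g CO2-e/kWh, direct fuel combustion

indirect = total - direct
print(f"indirect: {indirect:.0f} g/kWh ({direct / total:.1%} direct)")

# Shares of the indirect component quoted in the abstract.
mining = 0.674 * indirect     # mining incl. methane leakage
methane = 0.600 * indirect    # methane leakage alone
materials = 0.315 * indirect  # limestone and other materials
print(f"mining: {mining:.0f} g/kWh, methane: {methane:.0f} g/kWh, "
      f"materials: {materials:.0f} g/kWh")
```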
Abstract:
Different types of mental activity are utilised as an input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data were possible. The method of Independent Component Analysis (ICA) can be utilised to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity and other components of non-cerebral origin. Choosing specific components and using them to reconstruct “denoised” single-trial data could improve the signal quality, thus allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA on the raw ERP data. Signal improvement is measured using contrast-to-noise measures. It was found that such analysis improves the signal quality in all single trials.
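The reconstruct-from-selected-components idea can be sketched generically with scikit-learn's FastICA (the paper does not specify an implementation; the data shapes and kept-component indices below are illustrative placeholders).

```python
# Hedged sketch: ICA decomposition and "denoised" reconstruction of one trial.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
trial = rng.standard_normal((32, 512))        # channels x samples (placeholder)

ica = FastICA(n_components=32, random_state=0)
sources = ica.fit_transform(trial.T)          # samples x components

keep = [0, 3, 7]                              # components judged ERP-related
cleaned = np.zeros_like(sources)
cleaned[:, keep] = sources[:, keep]

denoised = ica.inverse_transform(cleaned).T   # back to channels x samples
```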
Abstract:
The IPLV overall coefficient, published by the Air-Conditioning and Refrigeration Institute (ARI) of America, describes the part-load running status of the air-conditioning system's refrigerating host only. No comparable coefficient has yet been developed to reflect the operation of the whole air-conditioning system under part load. In this study, the running-time proportions of air-conditioning systems under part load were obtained through analysis of energy consumption data recorded during practical operation in public buildings in Chongqing, using statistical methods based on the month-by-month distribution of the buildings' energy consumption. Compared with the IPLV weighting numbers, the part-load operation coefficient derived in this research reflects not only the status of the system's refrigerating host but also the energy efficiency of the whole air-conditioning system. Because the coefficient results from processing and analysing practical running data, it shows the actual running status of a given area and building type. The method also differs from the model analysis used to derive the IPLV weighting numbers in that it yields both the four standard load-proportion weights and a part-load operation coefficient of the air-conditioning system at any load ratio, as required.
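For context, the IPLV itself is a fixed-weight average of efficiency at four part-load points; a minimal sketch follows, with the weights as given in ARI Standard 550/590 and purely hypothetical COP values.

```python
# IPLV-style weighted average (ARI 550/590 weights; COP values hypothetical).
weights = {1.00: 0.01, 0.75: 0.42, 0.50: 0.45, 0.25: 0.12}  # load ratio -> weight
cop = {1.00: 5.0, 0.75: 5.5, 0.50: 5.2, 0.25: 4.1}          # measured COP per point

iplv = sum(w * cop[plr] for plr, w in weights.items())
print(f"IPLV = {iplv:.2f}")
```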
Abstract:
Accurate single-trial P300 classification lends itself to fast and accurate control of Brain-Computer Interfaces (BCIs). Highly accurate classification of single-trial P300 ERPs is achieved by characterizing the EEG via corresponding stationary and time-varying Wackermann parameters. Subsets of maximally discriminating parameters are then selected using the Network Clustering feature selection algorithm and classified with Naive Bayes and Linear Discriminant Analysis classifiers. The method is assessed on two different data sets from BCI competitions and is shown to produce accuracies of between approximately 70% and 85%. This is promising for the use of Wackermann parameters as features in the classification of single-trial ERP responses.
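The final classification step can be illustrated with scikit-learn's LDA; the feature matrix below is a random placeholder standing in for the selected Wackermann parameters, so the printed accuracy will be near chance.

```python
# Hedged sketch: LDA classification of single-trial feature vectors.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 12))   # 200 trials x 12 placeholder parameters
y = rng.integers(0, 2, 200)          # target vs non-target labels

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```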
Abstract:
The overall operation and internal complexity of a particular piece of production machinery can be depicted in terms of clusters of multidimensional points which describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis provides a promising step forward in the field of multivariable control of manufacturing systems.
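The mean-tracking algorithm itself is not available in standard libraries; as a rough stand-in, scikit-learn's MeanShift (a related density-seeking clusterer) can illustrate on synthetic data how process states separate into behavioural regions.

```python
# Stand-in sketch: density-based clustering of synthetic process states.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(2)
# Rows are process-state snapshots; columns are measured machine variables.
states = np.vstack([
    rng.normal(0.0, 0.3, (100, 4)),   # e.g. an error-free operating region
    rng.normal(3.0, 0.3, (40, 4)),    # e.g. an error region
])

labels = MeanShift().fit_predict(states)
print(np.bincount(labels))            # points per discovered cluster
```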
Abstract:
The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case, it is capable of solving the transform-invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as characterising its time complexity. However, the main emphasis of the work is on the resource allocation aspect of Stochastic Diffusion Search operation. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest Urn Model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. An approximate solution in the case of two alternative solutions is also proposed and compared with predictions of the extended Ehrenfest Urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space. It appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS which would strike a different balance between these two modes of search space processing. Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis is illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain, enabling careful control of search conditions.
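For readers unfamiliar with the classical model the thesis generalises, a tiny simulation of the Ehrenfest urn is given below (written here for illustration, not taken from the thesis): N balls sit in two urns, and at each step a uniformly chosen ball changes urn.

```python
# Classical Ehrenfest urn: the chain fluctuates around the binomial mean N/2.
import random

N, steps = 100, 10_000
in_urn_a = N                       # start with every ball in urn A
trajectory = []
for _ in range(steps):
    if random.randrange(N) < in_urn_a:
        in_urn_a -= 1              # the chosen ball was in A; it moves to B
    else:
        in_urn_a += 1              # the chosen ball was in B; it moves to A
    trajectory.append(in_urn_a)

print(sum(trajectory[-1000:]) / 1000)   # close to N/2 = 50 at equilibrium
```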
Abstract:
In this paper we present a connectionist searching technique, the Stochastic Diffusion Search (SDS), capable of rapidly locating a specified pattern in a noisy search space. In operation, SDS finds the position of the pre-specified pattern or, if it does not exist, its best instantiation in the search space. This is achieved via parallel exploration of the whole search space by an ensemble of agents searching in a competitive, cooperative manner. We prove mathematically the convergence of Stochastic Diffusion Search: SDS converges to a statistical equilibrium when it locates the best instantiation of the object in the search space. Experiments presented in this paper indicate the high robustness of SDS and show good scalability with problem size. The convergence characteristic of SDS makes it a fully adaptive algorithm and suggests applications in dynamically changing environments.
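A minimal agent-based sketch of standard SDS on a string search follows (a toy illustration of the test and diffusion phases; the string, pattern and parameters are arbitrary).

```python
# Toy Stochastic Diffusion Search: agents converge on the best-fit position.
import random
from collections import Counter

search_space = "xxxhelloxxhelpxx"
pattern = "hello"
n_agents, n_iters = 50, 100
max_pos = len(search_space) - len(pattern) + 1

positions = [random.randrange(max_pos) for _ in range(n_agents)]
active = [False] * n_agents

for _ in range(n_iters):
    # Test phase: each agent checks one randomly chosen pattern component.
    for i, pos in enumerate(positions):
        j = random.randrange(len(pattern))
        active[i] = search_space[pos + j] == pattern[j]
    # Diffusion phase: inactive agents copy an active agent's hypothesis,
    # or pick a fresh random one if the polled agent is also inactive.
    for i in range(n_agents):
        if not active[i]:
            k = random.randrange(n_agents)
            positions[i] = positions[k] if active[k] else random.randrange(max_pos)

print(Counter(positions).most_common(3))   # the largest cluster marks the match
```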
Abstract:
This paper describes the novel use of cluster analysis in the field of industrial process control. The severe multivariable process problems encountered in manufacturing have often led to machine shutdowns, where the need for corrective actions arises in order to resume operation. Production faults caused by processes running in less efficient regions may be prevented or diagnosed using reasoning based on cluster analysis. Indeed, the internal complexity of production machinery may be depicted in clusters of multidimensional data points which characterise the manufacturing process. The application of a Mean-Tracking cluster algorithm (developed in Reading) to field data acquired from high-speed machinery will be discussed. The objective of such an application is to illustrate how machine behaviour can be studied, in particular how regions of erroneous and stable running behaviour can be identified.
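Once such regions have been identified, fault diagnosis can reduce to checking where a new process state falls; a simple nearest-centre rule is sketched below (centres, labels and threshold are all hypothetical).

```python
# Sketch: diagnosing incoming machine states against known cluster centres.
import numpy as np

centres = np.array([[0.0, 0.0, 0.0],    # centre of a stable running region
                    [3.0, 3.0, 3.0]])   # centre of an erroneous region
labels = ["stable", "erroneous"]

def classify(state, max_dist=1.5):
    """Assign a state to its nearest centre, or flag it as unknown."""
    d = np.linalg.norm(centres - state, axis=1)
    return labels[d.argmin()] if d.min() <= max_dist else "unknown region"

print(classify(np.array([0.2, -0.1, 0.3])))   # -> stable
```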
Abstract:
In biological mass spectrometry (MS), two ionization techniques are predominantly employed for the analysis of larger biomolecules, such as polypeptides. These are nano-electrospray ionization [1, 2] (nanoESI) and matrix-assisted laser desorption/ionization [3, 4] (MALDI). Both techniques are considered to be “soft”, allowing the desorption and ionization of intact molecular analyte species and thus their successful mass-spectrometric analysis. One of the main differences between these two ionization techniques lies in their ability to produce multiply charged ions. MALDI typically generates singly charged peptide ions, whereas nanoESI easily provides multiply charged ions, even for peptides as low as 1000 Da in mass. The production of highly charged ions is desirable as this allows the use of mass analyzers, such as ion traps (including orbitraps) and hybrid quadrupole instruments, which typically offer only a limited m/z range (< 2000–4000). It also enables more informative fragmentation spectra using techniques such as collision-induced dissociation (CID) and electron capture/transfer dissociation (ECD/ETD) in combination with tandem MS (MS/MS) [5, 6]. Thus, there is a clear advantage in using ESI in research areas where peptide sequencing or, in general, the structural elucidation of biomolecules by MS/MS is required. Nonetheless, MALDI, with its higher tolerance to contaminants and additives, ease of operation, potential for high-speed and automated sample preparation and analysis, as well as its MS imaging capabilities, is an ionization technique that can cover bioanalytical areas for which ESI is less suitable [7, 8]. If these strengths could be combined with the analytical power of multiply charged ions, new instrumental configurations and large-scale proteomic analyses based on MALDI MS(/MS) would become feasible.
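The m/z argument is simple arithmetic: for a peptide of mass M carrying n protons, m/z = (M + n·mH)/n. The sketch below works this through for a hypothetical 3000 Da peptide.

```python
# Why multiple charging matters for analyzers with a limited m/z window.
PROTON = 1.00728          # Da, mass of a proton
M = 3000.0                # Da, hypothetical peptide mass

for n in (1, 2, 3):
    mz = (M + n * PROTON) / n         # m/z of the [M + nH]^n+ ion
    print(f"{n}+ ion: m/z = {mz:.1f}")
# 1+ -> ~3001, 2+ -> ~1501, 3+ -> ~1001: only the multiply charged species
# fall inside a 2000 m/z window.
```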
Abstract:
This article argues in favour of a functional analysis of proprietary estoppel which focuses on the role of the doctrine in enabling claims to the informal acquisition of property rights in land. The article shows that adopting such an analysis both assists our understanding of two recent decisions of the English House of Lords and helps to resolve issues of taxonomy that arise in relation to the doctrine. A functional analysis both unites the sub-categories of proprietary estoppel into a single principle and distinguishes this principle from other types of estoppel claim. It is suggested, however, that the unification of common law and equitable estoppel remains both possible and desirable as long as ‘unification’ is understood broadly and is not confined to the recognition of a doctrine that is identical in its scope and operation in all cases. It is further shown that despite a lack of discussion of the concept in the House of Lords, unconscionability continues to play a key role in proprietary estoppel and therefore in the informal acquisition of property rights. Unconscionability may now benefit from a closer connection with the other elements of claims which should prevent abuse of the concept and allay concerns of ‘palm-tree justice’.
Abstract:
The use of inter-laboratory test comparisons to determine the performance of individual laboratories for specific tests (or for calibration) [ISO/IEC Guide 43-1, 1997. Proficiency testing by interlaboratory comparisons - Part 1: Development and operation of proficiency testing schemes] is called Proficiency Testing (PT). In this paper we propose the use of the generalized likelihood ratio test to compare the performance of a group of laboratories for specific tests relative to the assigned value, and we illustrate the procedure using actual data from the PT program in the area of volume. The proposed test extends the test criteria currently in use, allowing one to test for the consistency of the group of laboratories. Moreover, the class of elliptical distributions is considered for the obtained measurements. © 2008 Elsevier B.V. All rights reserved.
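As a simplified illustration only (under normality rather than the paper's more general elliptical setting, and with made-up measurements), a generalized likelihood ratio test of a laboratory group's mean against an assigned value reduces to a function of the usual t statistic:

```python
# GLR test of H0: mu == assigned value, assuming normal measurements.
import numpy as np
from scipy import stats

assigned = 10.0                                           # assigned reference value
x = np.array([10.02, 9.98, 10.05, 10.01, 9.97, 10.04])    # hypothetical lab results
n = len(x)

t = (x.mean() - assigned) / (x.std(ddof=1) / np.sqrt(n))
glr = n * np.log(1 + t**2 / (n - 1))     # -2 log(likelihood ratio) for this problem
p = stats.chi2.sf(glr, df=1)             # asymptotic chi-square(1) reference
print(f"-2 log Lambda = {glr:.3f}, p = {p:.3f}")
```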
Abstract:
The simple halogenation of alkynes in conventional organic reactions gives a blend of cis and trans isomers. A stereospecific trans halogenation of alkynes is therefore proposed, using palladacycles as intermediates. Recrystallization of the compound obtained by bromination of 2-styrylpyridine via a cyclopalladated intermediate yields a single crystal, which was subjected to X-ray diffraction. The crystal packing is established through weak interactions of three types. The first is π⋯π interactions between centroids related by a symmetry operation. The second is C-X⋯π interactions. The last is an anomalous intermolecular interaction between halogens, C-X⋯X-C, with bond distances smaller than the sum of the van der Waals radii. The conformation about the C=C bond is trans and the dihedral angle between the aromatic rings is 18.1(3)° (with approximate e.s.d.). © 2010 Elsevier B.V. All rights reserved.
Abstract:
The aim of the study was to describe patients' experiences after gastric bypass surgery. An empirical study with a qualitative approach was carried out, in which interviews were conducted with six people who had undergone a gastric bypass operation. Data were analysed using qualitative content analysis. The interviews revealed that the informants felt the operation had no negative impact on their everyday lives. They experienced increased quality of life and improved health, which had led to a more physically active life. The weight loss, together with positive reactions from those around them to their new bodies, had strengthened their self-confidence. The greatest change the informants experienced in everyday life was their altered eating habits. They described that food was in focus and that they planned their days around meals. The consequences that emerged were not experienced as major problems; rather, the weight loss and the positive health experiences dominated. The informants felt well informed, had realistic expectations of the operation, and were prepared for the consequences that could arise. The study shows that the specialised care provided at obesity clinics can help to ease the period after the operation for the patient. It also emerged that the informants perceived a lack of knowledge about gastric bypass operations among healthcare staff, in both primary and inpatient care. To ensure that patients feel safe and confident, it is therefore important to increase knowledge of bariatric surgery among healthcare professionals.
Abstract:
Purpose – This case study presents an impact assessment of the Corporate Social Responsibility (CSR) programs of the TFM Company in order to understand how they contribute to the sustainable development of communities in the areas in which it operates. Design/Methodology/Approach – Data for this study were collected using qualitative methods that included semi-structured interviews and Focus Group Discussions (FGDs), most of them audio- and video-recorded. Documentary analysis and a field visit were also undertaken for the purpose of quality analysis of the CSR programs on the ground. The data collected were analyzed using the Seven Questions to Sustainability (7QS) framework, an evaluation tool developed by the North America chapter of the Mining, Minerals and Sustainable Development (MMSD) project. Content analysis, on the other hand, was used to examine the interviews and FGDs with study participants. Findings – Results show that the CSR programs of TFM SA do contribute to community development, as there have been notable changes in the communities' living conditions. Whether they have contributed to sustainable development is not yet clear: programs that enhance the capacity of communities and other stakeholders to sustain these projects beyond the implementation stage and the mine's operating lifetime still need to be considered and implemented. Originality/Value – In the DRC, there is a paucity of research studies that focus on impact assessment of CSR programs in general, and specifically those of mining companies and their contribution to the sustainable development of local communities. Many of the available studies cover issues of minerals and conflict, or 'conflict minerals' as they are mostly referred to. This study addresses this gap.