838 results for computer-based instrumentation
Abstract:
Parliamentary debates about the resolution of the EU debt crisis seem to provide a good example of the frequently assumed “politicization” of European governance. Against this background, the paper argues that in order to make sense of this assumption, a clearer differentiation of three thematic focal points of controversies is needed: the assessment of government leadership, the debate between competing party ideologies within the left/right dimension, and the assessment of supranational integration. Applying this threefold distinction, the paper uses a theory of differential Europeanization to explain differences in the thematic structure of debates in the Austrian Nationalrat, the British House of Commons, and the German Bundestag. Empirically, the paper is based on data gained from the computer-based coding of plenary debates about the resolution of the European debt crisis between 2010 and 2011.
Abstract:
Virtual learning environments (VLEs) are computer-based online learning environments, which provide opportunities for online learners to learn at the time and location of their choosing, whilst allowing interactions and encounters with other online learners, as well as affording access to a wide range of resources. They have the capability of reaching learners in remote areas around the country or across country boundaries at very low cost. Personalized VLEs are those VLEs that provide a set of personalization functionalities, such as personalizing learning plans, learning materials and tests, and are capable of initiating the interaction with learners by providing advice, necessary instant messages, etc., to online learners. One of the major challenges involved in developing personalized VLEs is to achieve effective personalization functionalities, such as personalized content management, the learner model, the learner plan and adaptive instant interaction. Autonomous intelligent agents provide an important technology for accomplishing personalization in VLEs. A number of agents work collaboratively to enable personalization by recognizing an individual's eLearning pace and reacting correspondingly. In this research, firstly, a personalization model has been developed that demonstrates dynamic eLearning processes; secondly, the study proposes an architecture for the personalized VLE (PVLE) that uses the autonomous, pre-active and proactive behaviors of intelligent decision-making agents. A prototype system has been developed to demonstrate the implementation of this architecture. Furthermore, a field experiment has been conducted to investigate the performance of the prototype by comparing the eLearning effectiveness of the PVLE with that of a non-personalized VLE. Data regarding participants' final exam scores were collected and analyzed. The results indicate that intelligent agent technology can be employed to achieve personalization in VLEs, and as a consequence to improve eLearning effectiveness dramatically.
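The abstract does not give implementation details of the agent architecture, so the following is only a minimal illustrative sketch of how a single personalization agent might track a learner's pace and react to it; all class and field names (LearnerModel, PersonalizationAgent, target_minutes_per_unit) are hypothetical and not taken from the study.

```python
from dataclasses import dataclass

@dataclass
class LearnerModel:
    """Tracks one learner's observed eLearning pace (hypothetical fields)."""
    learner_id: str
    completed_units: int = 0
    avg_minutes_per_unit: float = 0.0

class PersonalizationAgent:
    """Reacts to a learner's pace by adapting the plan and offering proactive advice."""
    def __init__(self, target_minutes_per_unit: float = 30.0):
        self.target = target_minutes_per_unit

    def observe(self, model: LearnerModel, minutes_spent: float) -> None:
        # Update a running average of the learner's pace.
        n = model.completed_units
        model.avg_minutes_per_unit = (model.avg_minutes_per_unit * n + minutes_spent) / (n + 1)
        model.completed_units = n + 1

    def act(self, model: LearnerModel) -> str:
        # Proactive behaviour: suggest easier or harder material based on pace.
        if model.avg_minutes_per_unit > 1.5 * self.target:
            return "Suggest remedial material and send an encouraging message."
        if model.avg_minutes_per_unit < 0.5 * self.target:
            return "Offer enrichment content and a harder test."
        return "Keep the current learning plan."

agent = PersonalizationAgent()
learner = LearnerModel("student-01")
agent.observe(learner, minutes_spent=55.0)
print(agent.act(learner))
```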
Abstract:
Objective: To evaluate the efficacy of Lactobacillus rhamnosus GG in the prevention of antibiotic-associated diarrhoea. Data Sources: A computer-based search of MEDLINE, CINAHL, AMED, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews was conducted. A hand-search of the bibliographies of relevant papers and previous meta-analyses was undertaken. Review Methods: Trials were included in the review if they compared the effects of L. rhamnosus GG and placebo and listed diarrhoea as a primary end-point. Studies were excluded if they were not placebo-controlled or utilised other probiotic strains. Results: Six trials were found that met all eligibility requirements. Significant statistical heterogeneity of the trials precluded meta-analysis. Four of the six trials found a significant reduction in the risk of antibiotic-associated diarrhoea with co-administration of Lactobacillus GG. One of the trials found a reduced number of days with antibiotic-induced diarrhoea with Lactobacillus GG administration, whilst the final trial found no benefit of Lactobacillus GG supplementation. Conclusion: Additional research is needed to further clarify the effectiveness of Lactobacillus GG in the prevention of antibiotic-associated diarrhoea. Copyright (c) 2005 S. Karger AG, Basel.
Abstract:
The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (that adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths, but which overcome its weaknesses in the face of local optima. Using the first of these methods an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed either by numerical instability incurred through problem ill-posedness, or when a local objective function minimum is encountered. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and for detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
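As background to the abstract above, the core Gauss-Marquardt-Levenberg update combines a Gauss-Newton step with Marquardt damping. The sketch below shows only that basic step for a simple least-squares objective; it is not the enhanced subspace-search or trajectory-repulsion algorithm described in the paper, and the function name gml_step is hypothetical.

```python
import numpy as np

def gml_step(residuals, jacobian, params, lam=1e-2):
    """One basic GML update: delta = (J^T J + lam*I)^-1 J^T r.

    residuals: observed minus simulated values, shape (n_obs,)
    jacobian:  d(simulated)/d(params), shape (n_obs, n_params)
    lam:       Marquardt lambda; larger values damp the step, trading
               Gauss-Newton speed for gradient-descent robustness when
               the problem is ill-conditioned.
    """
    J = np.asarray(jacobian, dtype=float)
    r = np.asarray(residuals, dtype=float)
    JtJ = J.T @ J
    delta = np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), J.T @ r)
    return np.asarray(params, dtype=float) + delta
```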
Abstract:
Aim: The aim of this systematic review was to assess the quality and outcomes of clinical trials investigating the effect of St John's wort extracts on the metabolism of drugs by CYP3A. Methods: Prospective clinical trials assessing the effect of St John's wort (SJW) extracts on metabolism by CYP3A were identified through computer-based searches (from their inception to May 2005) of Medline, Cinahl, PsycINFO, AMED, Current Contents and Embase, hand-searches of bibliographies of relevant papers and consultation with manufacturers and researchers in the field. Two reviewers selected trials for inclusion, independently extracted data and recorded details on study design. Results: Thirty-one studies met the eligibility criteria. More than two-thirds of the studies employed a before-and-after design, less than one-third of the studies used a crossover design, and only three studies were double-blind and placebo controlled. In 12 studies the SJW extract had been assayed, and 14 studies stated the specific SJW extract used. Results from 26 studies, including all of the 19 studies that used high-dose hyperforin extracts (>10 mg day⁻¹), had outcomes consistent with CYP3A induction. The three studies using low-dose hyperforin extracts (<4 mg day⁻¹) demonstrated no significant effect on CYP3A. Conclusion: There is reasonable evidence to suggest that high-dose hyperforin SJW extracts induce CYP3A. More studies are required to determine whether decreased CYP3A induction occurs after low-dose hyperforin extracts. Future studies should adopt study designs with a control phase or control group, identify the specific SJW extract employed and provide quantitative analyses of key constituents.
Abstract:
Study Design. Development of an automatic measurement algorithm and comparison with manual measurement methods. Objectives. To develop a new computer-based method for automatic measurement of vertebral rotation in idiopathic scoliosis from computed tomography images and to compare the automatic method with two manual measurement techniques. Summary of Background Data. Techniques have been developed for vertebral rotation measurement in idiopathic scoliosis using plain radiographs, computed tomography, or magnetic resonance images. All of these techniques require manual selection of landmark points and are therefore subject to interobserver and intraobserver error. Methods. We developed a new method for automatic measurement of vertebral rotation in idiopathic scoliosis using a symmetry ratio algorithm. The automatic method provided values comparable with Aaro and Ho's manual measurement methods for a set of 19 transverse computed tomography slices through apical vertebrae, and with Aaro's method for a set of 204 reformatted computed tomography images through vertebral endplates. Results. Confidence intervals (95%) for intraobserver and interobserver variability using manual methods were in the range 5.5 to 7.2. The mean (± SD) difference between automatic and manual rotation measurements for the 19 apical images was -0.5° ± 3.3° for Aaro's method and 0.7° ± 3.4° for Ho's method. The mean (± SD) difference between automatic and manual rotation measurements for the 204 endplate images was 0.25° ± 3.8°. Conclusions. The symmetry ratio algorithm allows automatic measurement of vertebral rotation in idiopathic scoliosis without intraobserver or interobserver error due to landmark point selection.
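The paper's exact symmetry ratio algorithm is not reproduced in the abstract; the following is a loosely analogous, hypothetical sketch that scores left-right symmetry of a segmented axial CT slice over candidate rotation angles and returns the best-scoring angle. The scoring function, angle range, and function name are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import rotate

def estimate_rotation(slice_img, angles=np.arange(-30.0, 30.5, 0.5)):
    """Pick the rotation angle that makes the segmented vertebra slice most
    left-right symmetric about the image's vertical midline."""
    best_angle, best_score = 0.0, -np.inf
    for a in angles:
        rotated = rotate(slice_img, a, reshape=False, order=1)
        mirrored = np.flip(rotated, axis=1)           # mirror about the vertical midline
        overlap = np.minimum(rotated, mirrored).sum()
        union = np.maximum(rotated, mirrored).sum() + 1e-9
        score = overlap / union                        # symmetry ratio in [0, 1]
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle
```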
Abstract:
The increasing use of information and communications technologies among government departments and non-government agencies has fundamentally changed the implementation of employment services policy in Australia. The administrative arrangements for governing unemployment and unemployed people are now constituted by a complex contractual interplay between government departments as ‘purchasers’ and a range of small and large private organizations as ‘providers’. Assessing, tracking and monitoring the activities of unemployed people through the various parts of the employment services system has been made possible by developments in information technology and tailored computer programs. Consequently, the discretionary capacity that is traditionally associated with ‘street-level bureaucracy’ has been partly transformed into more prescriptive forms of ‘screen-level bureaucracy’. The knowledge embedded in these new computer-based technologies is considered superior because it is based on ‘objective calculations’, rather than subjective assessments of individual employees. The relationship between the sociopolitical context of unemployment policy and emerging forms of e-government is explored using illustrative findings from a qualitative pilot study undertaken in two Australian sites. The findings suggest that some of the new technologies in the employment services system are welcomed, while other applications are experienced as contradictory to the aims of delivering a personalized and respectful service.
Abstract:
Computer-based, socio-technical systems projects are frequently failures. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; i.e. the system's emergent behaviour cannot be predicted from a knowledge of the subsystems. This paper suggests that uncertainty management is a fundamental unifying concept in the analysis and design of complex systems, and goes on to indicate that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper presents a model of the propagation of a system change which indicates that the introduction of two or more changes over time can cause chaotic emergent behaviour.
Abstract:
The physical implementation of quantum information processing is one of the major challenges of current research. In the last few years, several theoretical proposals and experimental demonstrations on a small number of qubits have been carried out, but a quantum computing architecture that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is still lacking. In particular, a major ultimate objective is the construction of quantum simulators, yielding massively increased computational power in simulating quantum systems. Here we investigate promising routes towards the actual realization of a quantum computer based on spin systems. The first one employs molecular nanomagnets with a doublet ground state to encode each qubit and exploits the wide chemical tunability of these systems to obtain the proper topology of inter-qubit interactions. Indeed, recent advances in coordination chemistry allow us to arrange these qubits in chains, with tailored interactions mediated by magnetic linkers. These act as switches of the effective qubit-qubit coupling, thus enabling the implementation of one- and two-qubit gates. Molecular qubits can be controlled either by uniform magnetic pulses or by local electric fields. We introduce here two different schemes for quantum information processing with either global or local control of the inter-qubit interaction and demonstrate the high performance of these platforms by simulating the system time evolution with state-of-the-art parameters. The second architecture we propose is based on a hybrid spin-photon qubit encoding, which exploits the best characteristics of photons, whose mobility is exploited to efficiently establish long-range entanglement, and of spin systems, which ensure long coherence times. The setup consists of spin ensembles coherently coupled to single photons within superconducting coplanar waveguide resonators. The tunability of the resonators' frequency is exploited as the only manipulation tool to implement a universal set of quantum gates, by bringing the photons into/out of resonance with the spin transition. The time evolution of the system subject to the pulse sequence used to implement complex quantum algorithms has been simulated by numerically integrating the master equation for the system density matrix, thus including the harmful effects of decoherence. Finally, a scheme to overcome the leakage of information due to inhomogeneous broadening of the spin ensemble is pointed out. Both of the proposed setups are based on state-of-the-art technological achievements. By extensive numerical experiments we show that their performance is remarkably good, even for the implementation of long sequences of gates used to simulate interesting physical models. Therefore, the systems examined here are promising building blocks of future scalable architectures and can be used for proof-of-principle experiments of quantum information processing and quantum simulation.
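For reference, the master equation mentioned above is normally taken in the standard Lindblad form; the generic expression below is a textbook statement, not necessarily the specific Hamiltonian or dissipators used in this work:

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\left[H(t),\,\rho\right]
  + \sum_{k} \gamma_{k}\left( L_{k}\,\rho\,L_{k}^{\dagger}
  - \tfrac{1}{2}\left\{ L_{k}^{\dagger}L_{k},\,\rho \right\} \right)
```

Here H(t) would include the control pulses that implement the gate sequence, while the operators L_k with rates γ_k model the decoherence channels whose harmful effects the simulations account for.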
Abstract:
This review will discuss the use of manual grading scales, digital photography, and automated image analysis in the quantification of fundus changes caused by age-related macular disease. Digital imaging permits processing of images for enhancement, comparison, and feature quantification, and these techniques have been investigated for automated drusen analysis. The accuracy of automated analysis systems has been enhanced by the incorporation of interactive elements, such that the user is able to adjust the sensitivity of the system, or manually add and remove pixels. These methods capitalize on both computer and human image feature recognition and the advantage of computer-based methodologies for quantification. The histogram-based adaptive local thresholding system is able to extract useful information from the image without being affected by the presence of other structures. More recent developments involve compensation for fundus background reflectance, which has most recently been combined with the Otsu method of global thresholding. This method is reported to provide results comparable with manual stereo viewing. Developments in this area are likely to encourage wider use of automated techniques. This will make the grading of photographs easier and cheaper for clinicians and researchers. © 2007 Elsevier Inc. All rights reserved.
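As a point of reference for the thresholding methods mentioned above, Otsu's global method simply picks the grey level that maximises the between-class variance of the intensity histogram. The sketch below is a generic implementation of that textbook method, assuming an 8-bit, background-compensated fundus image; it does not reproduce the adaptive local thresholding or reflectance-compensation steps discussed in the review.

```python
import numpy as np

def otsu_threshold(gray):
    """Global Otsu threshold: the grey level that maximises the
    between-class variance of the 8-bit intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # cumulative class probability
    mu = np.cumsum(p * np.arange(256))        # cumulative class mean
    mu_t = mu[-1]                             # global mean
    # Between-class variance; extremes give 0/0, which become NaN and are ignored.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

# Example on a synthetic image: pixels above the returned level would be
# treated as candidate drusen, those below as background.
img = np.random.randint(0, 256, size=(64, 64))
print(otsu_threshold(img))
```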
Abstract:
The role of technology management in achieving improved manufacturing performance has been receiving increased attention as enterprises are becoming more exposed to competition from around the world. In the modern market for manufactured goods the demand is now for more product variety, better quality, shorter delivery and greater flexibility, while the financial and environmental cost of resources has become an urgent concern to manufacturing managers. This issue of the International Journal of Technology Management addresses the question of how the diffusion, implementation and management of technology can improve the performance of manufacturing industries. The authors come from a large number of different countries and their contributions cover a wide range of topics within this general theme. Some papers are conceptual, others report on research carried out in a range of different industries including steel production, iron founding, electronics, robotics, machinery, precision engineering, metal working and motor manufacture. In some cases they describe situations in specific countries. Several are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University, at which the conference theme was 'Achieving Competitive Edge: Getting Ahead Through Technology and People'. The first two papers deal with questions of advanced manufacturing technology implementation and management. Firstly, Beatty describes a three-year longitudinal field study carried out in ten Canadian manufacturing companies using CAD/CAM and CIM systems. Her findings relate to speed of implementation, choice of system type, the role of individuals in implementation, organization and job design. This is followed by a paper by Bessant in which he argues that a more strategic approach should be taken towards the management of technology in the 1990s and beyond. Also considered in this paper are the capabilities necessary in order to deploy advanced manufacturing technology as a strategic resource and the way such capabilities might be developed within the firm. These two papers, which deal largely with the implementation of hardware, are supplemented by Samson and Sohal's contribution in which they argue that a much wider perspective should be adopted based on a new approach to manufacturing strategy formulation. Technology transfer is the topic of the following two papers. Pohlen again takes the case of advanced manufacturing technology and reports on his research which considers the factors contributing to successful realisation of AMT transfer. The paper by Lee then provides a more detailed account of technology transfer in the foundry industry. Using a case study based on a firm which has implemented a number of transferred innovations, a model is illustrated in which the 'performance gap' can be identified and closed. The diffusion of technology is addressed in the next two papers. In the first of these, by Lowe and Sim, the managerial technologies of 'Just in Time' and 'Manufacturing Resource Planning' (or MRP II) are examined. A study is described from which a number of factors are found to influence the adoption process, including rate of diffusion and size. Dahlin then considers the case of a specific item of hardware technology, the industrial robot. Her paper reviews the history of robot diffusion since the early 1960s and then tries to predict how the industry will develop in the future.
The following two papers deal with the future of manufacturing in a more general sense. The future implementation of advanced manufacturing technology is the subject explored by de Haan and Peters, who describe the results of their Dutch Delphi forecasting study conducted among a panel of experts including scientists, consultants, users and suppliers of AMT. Busby and Fan then consider a type of organisational model, 'the extended manufacturing enterprise', which would represent a distinct alternative to pure market-led and command structures by exploiting the shared knowledge of suppliers and customers. The three country-based papers consider some strategic issues relating to manufacturing technology. In a paper based on investigations conducted in China, He, Liff and Steward report their findings from strategy analyses carried out in the steel and watch industries with a view to assessing technology needs and organizational change requirements. This is followed by Tang and Nam's paper which examines the case of the machinery industry in Korea and its emerging importance as a key sector in the Korean economy. In his paper, which focuses on Venezuela, Ernst then considers the particular problem of how this country can address falling oil revenues. He sees manufacturing as being an important contributor to Venezuela's future economy and proposes a means whereby government and private enterprise can co-operate in the development of the manufacturing sector. The last six papers all deal with specific topics relating to the management of manufacturing. Firstly, Youssef looks at the question of manufacturing flexibility, introducing and testing a conceptual model that relates computer-based technologies to flexibility. Dangerfield's paper which follows is based on research conducted in the steel industry. He considers the question of scale and proposes a modelling approach for determining the plant configuration necessary to meet market demand. Engstrom presents the results of a detailed investigation into the need for reorganising material flow where group assembly of products has been adopted. Sherwood, Guerrier and Dale then report the findings of a study into the effectiveness of Quality Circle implementation. Stillwagon and Burns consider how manufacturing competitiveness can be improved in individual firms by describing how the application of 'human performance engineering' can be used to motivate individual performance as well as to integrate organizational goals. Finally, Sohal, Lewis and Samson describe, using a case study example, how just-in-time control can be applied within the context of computer numerically controlled flexible machining lines. The papers in this issue of the International Journal of Technology Management cover a wide range of topics relating to the general question of improving manufacturing performance through the dissemination, implementation and management of technology. Although they differ markedly in content and approach, they have the collective aim of addressing the concepts, principles and practices which provide a better understanding of the technology of manufacturing and assist in achieving and maintaining a competitive edge.
Abstract:
If product cycle time reduction is the mission, and the multifunctional team is the means of achieving the mission, what then is the modus operandi by which this means is to accomplish its mission? This paper asserts that a preferred modus operandi for the multifunctional team is to adopt a process-oriented view of the manufacturing enterprise, and for this it needs the medium of a process map [16]. The substance of this paper is a methodology which enables the creation of such maps. Specific examples of process models drawn from the product development life cycle are presented and described in order to support the methodology's integrity and value. The specific deliverables we have so far obtained are: a methodology for process capture and analysis; a collection of process models spanning the product development cycle; and an engineering handbook which hosts these models and presents a computer-based means of navigating through these processes in order to allow users a better understanding of the nature of the business, their role in it, and why the job that they do benefits the work of the company. We assert that this kind of thinking is the essence of concurrent engineering implementation, and further that the systemigram process models uniquely stimulate and organise such thinking.
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operation conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was studied first experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady state conditions using adequate sampling techniques for the dispersed and continuous phases as well as during the transients of the column with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by addition of stages at the column entrances. Two parameters were incorporated in the model, namely a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and the backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and the model-predicted concentration profiles at steady state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and were incorporated in the model equations. The model equations comprise a stiff differential-algebraic system. This system was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured. A very good agreement of the two profiles was achieved within a percent relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulative variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) as well as a centralised MIMO (Multi-Input Multi-Output) system using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). Control performance of each control scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection.
For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction. Multivariable MPC showed more effective performance than the other conventional techniques since it accounts for loop interactions, time delays, and input-output variable constraints.
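For illustration of the RGA-based pairing mentioned above, the relative gain array of a square steady-state gain matrix G is the element-wise product of G with the transpose of its inverse; pairings whose relative gains are close to 1 are preferred. The sketch below uses a purely hypothetical 2x2 gain matrix, not values identified in the thesis.

```python
import numpy as np

def relative_gain_array(G):
    """RGA = G .* (G^-1)^T; elements near 1 indicate favourable loop pairings."""
    G = np.asarray(G, dtype=float)
    return G * np.linalg.inv(G).T

# Hypothetical steady-state gains:
# rows = [raffinate concentration, extract concentration]
# cols = [rotor speed, solvent flowrate]
G = np.array([[1.8, 0.4],
              [0.6, 2.2]])
print(relative_gain_array(G))   # diagonal elements close to 1 favour the diagonal pairing
```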
Abstract:
Hazard and operability (HAZOP) studies on chemical process plants are very time consuming, and often tedious, tasks. The requirement for HAZOP studies is that a team of experts systematically analyse every conceivable process deviation, identifying possible causes and any hazards that may result. The systematic nature of the task, and the fact that some team members may be unoccupied for much of the time, can lead to tedium, which in turn may lead to serious errors or omissions. Fault trees are an aid to HAZOP: they present the system failure logic graphically such that the study team can readily assimilate their findings. Fault trees are also useful for the identification of design weaknesses, and may additionally be used to estimate the likelihood of hazardous events occurring. The one drawback of fault trees is that they are difficult to generate by hand, because of the sheer size and complexity of modern process plants. The work in this thesis proposes a computer-based method to aid the development of fault trees for chemical process plants. The aim is to produce concise, structured fault trees that are easy for analysts to understand. Standard plant input-output equation models for major process units are modified such that they include ancillary units and pipework. This results in a reduction in the nodes required to represent a plant. Control loops and protective systems are modelled as operators which act on process variables. This modelling maintains the functionality of loops, making fault tree generation easier and improving the structure of the fault trees produced. A method, called event ordering, is proposed which allows the magnitude of deviations of controlled or measured variables to be defined in terms of the control loops and protective systems with which they are associated.
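To make the fault tree representation concrete, the sketch below shows one simple way to encode AND/OR gates and basic events and to evaluate the top-event probability under the usual independence assumption. The event names and probabilities are hypothetical, and the structure is far simpler than trees generated for a real process plant.

```python
from dataclasses import dataclass
from typing import List, Union
from math import prod

@dataclass
class BasicEvent:
    name: str
    prob: float          # probability of the basic failure event

@dataclass
class Gate:
    kind: str            # "AND" or "OR"
    children: List[Union["Gate", BasicEvent]]

def probability(node) -> float:
    """Top-event probability assuming independent basic events."""
    if isinstance(node, BasicEvent):
        return node.prob
    child_p = [probability(c) for c in node.children]
    if node.kind == "AND":
        return prod(child_p)
    return 1.0 - prod(1.0 - p for p in child_p)   # OR gate

# Hypothetical tree: high level in a vessel if the inlet valve fails open AND
# (the level controller fails OR the operator misses the alarm).
tree = Gate("AND", [
    BasicEvent("inlet valve fails open", 0.01),
    Gate("OR", [BasicEvent("level controller fails", 0.02),
                BasicEvent("alarm missed", 0.05)]),
])
print(f"Top event probability: {probability(tree):.5f}")
```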
Abstract:
Geometric information relating to most engineering products is available in the form of orthographic drawings or 2D data files. For many recent computer-based applications, such as Computer Integrated Manufacturing (CIM), these data are required in the form of a sophisticated model based on Constructive Solid Geometry (CSG) concepts. A recent novel technique in this area transfers 2D engineering drawings directly into a 3D solid model called `the first approximation'. In many cases, however, this does not represent the real object. In this thesis, a new method is proposed and developed to enhance this model. This method uses the notion of expanding an object in terms of other solid objects, which are either primitive or first approximation models. To achieve this goal, in addition to the prepared subroutine that calculates the first approximation model of the input data, two other wireframe models are found for the extraction of sub-objects. One is the wireframe representation of the input, and the other is the wireframe of the first approximation model. A new fast method is developed for the latter special-case wireframe, which is named the `first approximation wireframe model'. This method avoids the use of a solid modeller. Detailed descriptions of algorithms and implementation procedures are given. In these techniques, the utilisation of dashed-line information is also considered in improving the model. Different practical examples are given to illustrate the functioning of the program. Finally, a recursive method is employed to automatically modify the output model towards the real object. Some suggestions for further work are made to increase the domain of objects covered and to provide a commercially usable package. It is concluded that the current method promises the production of accurate models for a large class of objects.
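To illustrate the idea of building up an object from other solids, the sketch below represents solids implicitly as point-membership tests and combines them with CSG union and difference operators; the block primitives and the 'notch' refinement are hypothetical examples, not the thesis's algorithm for deriving sub-objects from wireframes.

```python
from typing import Callable

# A solid is represented implicitly by a membership test: (x, y, z) -> inside?
Solid = Callable[[float, float, float], bool]

def block(dx: float, dy: float, dz: float) -> Solid:
    """Axis-aligned block primitive anchored at the origin."""
    return lambda x, y, z: 0 <= x <= dx and 0 <= y <= dy and 0 <= z <= dz

def union(a: Solid, b: Solid) -> Solid:
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def difference(a: Solid, b: Solid) -> Solid:
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# Hypothetical refinement: start from a first-approximation block and
# subtract a smaller block identified as a sub-object.
first_approx = block(10, 10, 10)
notch = block(4, 4, 10)
refined = difference(first_approx, notch)
print(refined(1, 1, 5), refined(8, 8, 5))   # False inside the notch, True elsewhere
```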