51 results for computer-based technology


Relevance:

80.00%

Publisher:

Abstract:

The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models, where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (which adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths but overcome its weaknesses in the face of local optima. In the first of these methods, an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality whenever progress of the parameter estimation process is slowed, either by numerical instability arising from problem ill-posedness or by encounter with a local objective function minimum. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. This provides a useful means of inquiring into the well-posedness of a parameter estimation problem and of detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model-run efficiency for the new method.
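The second enhancement described above, starting each new calibration run at a point maximally removed from the parameter trajectories of previous runs, amounts to a maximin placement problem. The sketch below is an illustration of that idea only, not the paper's implementation; the names next_start_point, visited_points, bounds and n_candidates are assumptions.

```python
# Hedged sketch (not the paper's implementation): choose a new starting point for a
# gradient-based (GML-type) calibration run that is maximally removed from the
# parameter trajectories of all previous runs.
import numpy as np

def next_start_point(visited_points, bounds, n_candidates=1000, rng=None):
    """Pick the candidate start whose minimum distance to all previously
    visited parameter vectors is largest (a maximin criterion)."""
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(bounds, dtype=float).T   # bounds: [(min, max), ...] per parameter
    candidates = rng.uniform(lo, hi, size=(n_candidates, len(lo)))
    visited = np.asarray(visited_points, dtype=float)   # shape (n_visited, n_params)
    # distance of every candidate to every previously visited point
    d = np.linalg.norm(candidates[:, None, :] - visited[None, :, :], axis=-1)
    return candidates[np.argmax(d.min(axis=1))]

# Example: two parameters, one earlier trajectory of three points
start = next_start_point(visited_points=[[0.2, 0.3], [0.25, 0.35], [0.3, 0.4]],
                         bounds=[(0.0, 1.0), (0.0, 1.0)])
print(start)
```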

Relevance:

80.00%

Publisher:

Abstract:

Aim The aim of this systematic review was to assess the quality and outcomes of clinical trials investigating the effect of St John's wort extracts on the metabolism of drugs by CYP3A. Methods Prospective clinical trials assessing the effect of St John's wort (SJW) extracts on metabolism by CYP3A were identified through computer-based searches (from their inception to May 2005) of Medline, Cinahl, PsycINFO, AMED, Current Contents and Embase, hand-searches of bibliographies of relevant papers, and consultation with manufacturers and researchers in the field. Two reviewers selected trials for inclusion, independently extracted data and recorded details of study design. Results Thirty-one studies met the eligibility criteria. More than two-thirds of the studies employed a before-and-after design, fewer than one-third used a crossover design, and only three studies were double-blind and placebo-controlled. In 12 studies the SJW extract had been assayed, and 14 studies stated the specific SJW extract used. Results from 26 studies, including all 19 studies that used high-dose hyperforin extracts (> 10 mg day⁻¹), had outcomes consistent with CYP3A induction. The three studies using low-dose hyperforin extracts (< 4 mg day⁻¹) demonstrated no significant effect on CYP3A. Conclusion There is reasonable evidence to suggest that high-dose hyperforin SJW extracts induce CYP3A. More studies are required to determine whether decreased CYP3A induction occurs with low-dose hyperforin extracts. Future studies should adopt designs with a control phase or control group, identify the specific SJW extract employed and provide quantitative analyses of key constituents.

Relevance:

80.00%

Publisher:

Abstract:

Study Design. Development of an automatic measurement algorithm and comparison with manual measurement methods. Objectives. To develop a new computer-based method for automatic measurement of vertebral rotation in idiopathic scoliosis from computed tomography images and to compare the automatic method with two manual measurement techniques. Summary of Background Data. Techniques have been developed for vertebral rotation measurement in idiopathic scoliosis using plain radiographs, computed tomography, or magnetic resonance images. All of these techniques require manual selection of landmark points and are therefore subject to interobserver and intraobserver error. Methods. We developed a new method for automatic measurement of vertebral rotation in idiopathic scoliosis using a symmetry ratio algorithm. The automatic method provided values comparable with Aaro and Ho's manual measurement methods for a set of 19 transverse computed tomography slices through apical vertebrae, and with Aaro's method for a set of 204 reformatted computed tomography images through vertebral endplates. Results. Confidence intervals (95%) for intraobserver and interobserver variability using manual methods were in the range 5.5 to 7.2. The mean (± SD) difference between automatic and manual rotation measurements for the 19 apical images was -0.5° ± 3.3° for Aaro's method and 0.7° ± 3.4° for Ho's method. The mean (± SD) difference between automatic and manual rotation measurements for the 204 endplate images was 0.25° ± 3.8°. Conclusions. The symmetry ratio algorithm allows automatic measurement of vertebral rotation in idiopathic scoliosis without intraobserver or interobserver error due to landmark point selection.
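The abstract does not specify the symmetry ratio algorithm itself, so the following is only a hedged sketch of the underlying idea: estimate axial rotation as the angle at which a CT slice is most left-right symmetric. The function name, the candidate-angle grid and the correlation-based symmetry score are all assumptions.

```python
# Illustrative only: rotate a CT slice through candidate angles and score how
# left-right symmetric it is; the best-scoring angle is taken as the rotation.
import numpy as np
from scipy.ndimage import rotate

def estimate_rotation(slice_img, angles=np.arange(-30.0, 30.5, 0.5)):
    """Return the candidate angle (degrees) maximising left-right symmetry."""
    best_angle, best_score = 0.0, -np.inf
    for a in angles:
        r = rotate(slice_img, a, reshape=False, order=1)
        mirrored = np.fliplr(r)
        # symmetry score: normalised correlation between the image and its mirror
        score = np.corrcoef(r.ravel(), mirrored.ravel())[0, 1]
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle
```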

Relevance:

80.00%

Publisher:

Abstract:

Computer-based, socio-technical systems projects are frequently failures. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; that is, the system's emergent behaviour cannot be predicted from knowledge of the subsystems. This paper suggests that uncertainty management is a fundamental unifying concept in the analysis and design of complex systems, and indicates that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper presents a model of the propagation of a system change which indicates that the introduction of two or more changes over time can cause chaotic emergent behaviour.
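The paper's propagation model is not reproduced in this abstract. As a stand-in illustration of how two interacting changes can produce chaotic emergent behaviour, the toy simulation below couples two logistic maps; every parameter and name in it is an assumption, not the authors' model.

```python
# Stand-in toy model: two coupled logistic maps representing two interacting
# changes propagating through a system. Chaotic regimes show sensitive
# dependence on initial conditions, i.e. unpredictable emergent behaviour.
import numpy as np

def propagate(x0, y0, r=3.9, coupling=0.05, steps=200):
    """Iterate two coupled logistic maps and return the trajectory."""
    x, y = x0, y0
    trace = []
    for _ in range(steps):
        x_next = r * x * (1 - x) + coupling * (y - x)
        y_next = r * y * (1 - y) + coupling * (x - y)
        x, y = float(np.clip(x_next, 0, 1)), float(np.clip(y_next, 0, 1))
        trace.append((x, y))
    return trace

# Two nearly identical initial "changes" diverge rapidly over time.
a = propagate(0.200, 0.300)
b = propagate(0.201, 0.300)
print(a[-1], b[-1])
```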

Relevance:

50.00%

Publisher:

Abstract:

Computer display height and desk design that allows forearm support are two critical design features of workstations for information technology tasks. However, there is currently no 3D description of head and neck posture at different computer display heights, and no direct comparison with paper-based information technology tasks. There is also inconsistent evidence on the effect of forearm support on posture, and no evidence on whether these features interact. This study compared the 3D head, neck and upper limb postures of 18 male and 18 female young adults while working under different display and desk design conditions. There was no substantial interaction between display height and desk design. Lower display heights increased head and neck flexion, with more spinal asymmetry when working with paper. The curved desk, designed to provide forearm support, increased scapula elevation/protraction and shoulder flexion/abduction.

Relevance:

40.00%

Publisher:

Abstract:

Interactive health communication using Internet technologies is expanding the range and flexibility of intervention and teaching options available in preventive medicine and the health sciences. Advantages of interactive health communication include the enhanced convenience, novelty, and appeal of computer-mediated communication; its flexibility and interactivity; and automated processing. We outline some of these fundamental aspects of computer-mediated communication as it applies to preventive medicine. Further, a number of key pathways of information technology evolution are creating new opportunities for the delivery of professional education in preventive medicine and other health domains, as well as for delivering automated, self-instructional health behavior change programs through the Internet. We briefly describe several of these key evolutionary pathways and give examples from work we have done in Australia. These demonstrate how we have responded creatively to the challenges of these new information environments, and how they may be pursued in the education of preventive medicine and other health care practitioners and in the development and delivery of health behavior change programs through the Internet. Innovative and thoughtful applications of this new technology can increase the consistency, reliability, and quality of information delivered.

Relevance:

40.00%

Publisher:

Abstract:

We are currently in the midst of a second quantum revolution. The first quantum revolution gave us new rules that govern physical reality. The second quantum revolution will take these rules and use them to develop new technologies. In this review we discuss the principles upon which quantum technology is based and the tools required to develop it. We discuss a number of examples of research programs that could deliver quantum technologies in the coming decades, including quantum information technology, quantum electromechanical systems, coherent quantum electronics, quantum optics, and coherent matter technology.

Relevance:

40.00%

Publisher:

Abstract:

The absence of considerations of technology in policy studies reinforces the popular notion that technology is a neutral tool. Through an analysis of the role played by computers in the policy processes of Australia's Department of Social Security, this paper argues that computers are political players in policy processes. Findings indicate that computers make aspects of the social domain knowable and therefore governable. The use of computers makes previously infeasible policies possible. Computers also operate as bureaucrats and as agents of client surveillance. Increased policy change, reduced discretion, and increasingly targeted and complex policies can be attributed to the use of computer technology. If policy processes are to be adequately understood and analysed, then the role of technology in those processes must be considered.

Relevance:

40.00%

Publisher:

Abstract:

In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system in solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing a combined knowledge representation approach, which uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment through implementation of the knowledge-based system approach. For system verification, results from design examples are presented.
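LADOME itself is not described in enough detail here to reproduce, but the combined rules-plus-blackboard idea can be sketched in miniature. The classes, rule contents and design steps below are purely illustrative assumptions, not the system's actual knowledge base.

```python
# Minimal sketch of a rule-based blackboard loop: rules fire when their
# conditions hold and write new facts until the blackboard stops changing.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Blackboard:
    facts: dict = field(default_factory=dict)

@dataclass
class Rule:
    condition: Callable[[Blackboard], bool]   # when the rule may fire
    action: Callable[[Blackboard], None]      # what it writes to the blackboard

def run(blackboard: Blackboard, rules: list, max_cycles: int = 100) -> Blackboard:
    """Fire applicable rules until no rule changes the blackboard (quiescence)."""
    for _ in range(max_cycles):
        snapshot = dict(blackboard.facts)
        for rule in rules:
            if rule.condition(blackboard):
                rule.action(blackboard)
        if blackboard.facts == snapshot:
            break
    return blackboard

# Illustrative design steps: configuration -> analysis -> member sizing
rules = [
    Rule(lambda b: "span" in b.facts and "geometry" not in b.facts,
         lambda b: b.facts.update(geometry=f"dome grid for span {b.facts['span']} m")),
    Rule(lambda b: "geometry" in b.facts and "member_forces" not in b.facts,
         lambda b: b.facts.update(member_forces="nonlinear analysis results")),
    Rule(lambda b: "member_forces" in b.facts and "sections" not in b.facts,
         lambda b: b.facts.update(sections="optimised member sizes")),
]
print(run(Blackboard({"span": 40}), rules).facts)
```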

Relevance:

40.00%

Publisher:

Abstract:

BP Refinery (Bulwer Island) Ltd (BP), located on the eastern Australian coast, is currently undergoing a major expansion as part of the Queensland Clean Fuels Project. The associated wastewater treatment plant upgrade will provide a better quality of treated effluent than is currently possible with the existing infrastructure, and will be of a sufficiently high standard to meet not only the requirements of imposed environmental legislation but also BP's environmental objectives. A number of challenges were faced when considering the upgrade, particularly cost constraints and limited plot space, highly variable wastewater, toxicity issues, and limited available hydraulic head. Sequencing Batch Reactor (SBR) technology was chosen for the lagoon upgrade for the following reasons: SBR technology allowed a retro-fit of the existing earthen lagoon without the need for any additional substantial concrete structures; a dual lagoon system allowed partial treatment of wastewaters during construction; SBRs give substantial process flexibility; SBRs allow process parameters to be modified easily without any physical modifications; and there were significant cost benefits. This paper presents the background to this application, outlines laboratory studies carried out on the wastewater, and details the full-scale design issues and methods for providing a cost-effective, efficient treatment system using the existing lagoon system.

Relevance:

40.00%

Publisher:

Abstract:

A specialised reconfigurable architecture targeted at wireless base-band processing is presented. It is built to cater for multiple wireless standards, has lower power consumption than a processor-based solution, and can be scaled to run in parallel for processing multiple channels. Test resources are embedded in the architecture and testing strategies are included. The architecture is functionally partitioned according to the common operations found in wireless standards, such as CRC error correction, convolution and interleaving. These modules are linked via Virtual Wire Hardware modules and route-through switch matrices, and data can be processed in any order through this interconnect structure. Virtual Wire ensures the same flexibility as normal interconnects, but the area occupied and the number of switches needed are reduced. The testing algorithm scans all possible paths within the interconnection network exhaustively and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by making loops from the shared memory to each of the switches. The local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of Bluetooth and GSM base-band processing onto the platform.
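The published test sequence, scanning external memory, exercising the master controller, looping through each switch matrix from shared memory, scanning local memory, and finally driving the processing modules with pre-defined vectors, can be summarised in a short driver sketch. The device object and its methods below are hypothetical stand-ins; the real platform is hardware, so only the ordering of steps follows the abstract.

```python
# Hedged sketch of the test ordering only; dev and all of its methods
# (scan_memory, test_controller, loop_test_switch, load_vectors, run_module)
# are hypothetical stand-ins for the platform's embedded test resources.
def run_built_in_self_test(dev):
    faults = []

    # 1. Scan the externally addressable memory space and test the master controller.
    faults += dev.scan_memory("external")
    faults += dev.test_controller()

    # 2. Test every switch in the route-through switch matrix by forming loops
    #    from the shared memory through each switch and back.
    for sw in dev.route_through_switches():
        if not dev.loop_test_switch(sw, source="shared_memory"):
            faults.append(("route_through_switch", sw))

    # 3. Test the local switch matrix in the same way, then scan local memory.
    for sw in dev.local_switches():
        if not dev.loop_test_switch(sw, source="shared_memory"):
            faults.append(("local_switch", sw))
    faults += dev.scan_memory("local")

    # 4. Load pre-defined test vectors into local memory and exercise each
    #    processing module (e.g. CRC, convolution, interleaving).
    for module, vectors in dev.test_vectors().items():
        dev.load_vectors(module, vectors)
        if not dev.run_module(module):
            faults.append(("processing_module", module))

    return faults
```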

Relevance:

40.00%

Publisher:

Abstract:

We review progress at the Australian Centre for Quantum Computer Technology towards the fabrication and demonstration of spin qubits and charge qubits based on phosphorus donor atoms embedded in intrinsic silicon. Fabrication is being pursued via two complementary pathways: a 'top-down' approach for near-term production of few-qubit demonstration devices and a 'bottom-up' approach for large-scale qubit arrays with sub-nanometre precision. The 'top-down' approach employs a low-energy (keV) ion beam to implant the phosphorus atoms. Single-atom control during implantation is achieved by monitoring on-chip detector electrodes, integrated within the device structure. In contrast, the 'bottom-up' approach uses scanning tunnelling microscope lithography and epitaxial silicon overgrowth to construct devices at an atomic scale. In both cases, surface electrodes control the qubit using voltage pulses, and dual single-electron transistors operating near the quantum limit provide fast read-out with spurious-signal rejection.