982 results for Computer techniques
Abstract:
The design of liquid retaining structures involves many decisions that the designer must make based on rules of thumb, heuristics, judgment, codes of practice and previous experience. The design parameters to be chosen include configuration, material and loading, among others. A novice engineer may face many difficulties in the design process. Recent developments in artificial intelligence and the emerging field of knowledge-based systems (KBS) have led to widespread applications in different fields. However, no attempt has been made to apply this intelligent technology to the design of liquid retaining structures. The objective of this study is, thus, to develop a KBS able to assist engineers in the preliminary design of liquid retaining structures. Moreover, it can provide expert advice to the user on the selection of design criteria, design parameters and the optimum configuration based on minimum cost. This paper presents the development of a prototype KBS for the design of liquid retaining structures (LIQUID), using a blackboard architecture with hybrid knowledge representation techniques, including a production rule system and an object-oriented approach. An expert system shell, Visual Rule Studio, is employed to facilitate the development of this prototype system. (C) 2002 Elsevier Science Ltd. All rights reserved.
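The forward-chaining, rule-plus-blackboard style described in the abstract can be sketched in a few lines. The rules, thresholds and attribute names below are hypothetical illustrations, not taken from the LIQUID system:

```python
# Illustrative blackboard with production rules for preliminary tank design.
# All rule conditions, thresholds and attribute names are invented.

blackboard = {"capacity_m3": 500.0, "site": "open ground"}

def rule_configuration(bb):
    # Hypothetical rule: small tanks circular, large tanks rectangular.
    if "configuration" not in bb and "capacity_m3" in bb:
        bb["configuration"] = "circular" if bb["capacity_m3"] <= 1000 else "rectangular"
        return True
    return False

def rule_material(bb):
    # Hypothetical rule: choose reinforced concrete once configuration is known.
    if "material" not in bb and "configuration" in bb:
        bb["material"] = "reinforced concrete"
        return True
    return False

RULES = [rule_configuration, rule_material]

def infer(bb):
    # Forward-chain: keep firing rules until none changes the blackboard.
    fired = True
    while fired:
        fired = any(rule(bb) for rule in RULES)
    return bb

design = infer(blackboard)
```

Each rule only fires when its premises are on the blackboard, so later rules can build on facts posted by earlier ones.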
Abstract:
The majority of the world's population now resides in urban environments, and information on the internal composition and dynamics of these environments is essential to enable preservation of certain standards of living. Remotely sensed data, especially the global coverage of moderate spatial resolution satellites such as Landsat, the Indian Resource Satellite and Système Pour l'Observation de la Terre (SPOT), offer a highly useful data source for mapping the composition of these cities and examining their changes over time. The utility and range of applications for remotely sensed data in urban environments could be improved with a more appropriate conceptual model relating urban environments to the sampling resolutions of imaging sensors and processing routines. Hence, the aim of this work was to take the Vegetation-Impervious surface-Soil (VIS) model of urban composition and match it with the most appropriate image processing methodology to deliver information on VIS composition for urban environments. Several approaches were evaluated for mapping the urban composition of Brisbane city (south-east Queensland, Australia) using Landsat 5 Thematic Mapper data and 1:5000 aerial photographs. The methods evaluated were: image classification; interpretation of aerial photographs; and constrained linear mixture analysis. Over 900 reference sample points on four transects were extracted from the aerial photographs and used as a basis to check the output of the classification and mixture analysis. Distinctive zonations of VIS related to urban composition were found in the per-pixel classification and aggregated air-photo interpretation; however, significant spectral confusion also resulted between classes. In contrast, the VIS fraction images produced from the mixture analysis enabled distinctive densities of commercial, industrial and residential zones within the city to be clearly defined, based on their relative amount of vegetation cover.
The soil fraction image served as an index for areas being (re)developed. The logical match of a low (L)-resolution spectral mixture analysis approach with the moderate spatial resolution image data ensured that the processing model matched the spectrally heterogeneous nature of urban environments at the scale of Landsat Thematic Mapper data.
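Constrained linear mixture analysis of the kind evaluated above amounts to solving, for each pixel, a least-squares problem in which the endmember fractions are constrained to sum to one. A minimal numpy sketch, using invented endmember spectra (not measured Landsat TM signatures):

```python
import numpy as np

# Columns: vegetation, impervious surface, soil reflectance in 4 bands.
# These spectra are illustrative only.
E = np.array([
    [0.05, 0.30, 0.20],
    [0.08, 0.28, 0.25],
    [0.04, 0.27, 0.30],
    [0.45, 0.25, 0.35],
])

def unmix(pixel, endmembers, weight=1e3):
    # Enforce the sum-to-one constraint with a heavily weighted extra row
    # appended to the linear system.
    A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, weight)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions

# Synthesise a pixel as a known mixture, then recover the fractions.
true_f = np.array([0.5, 0.3, 0.2])
pixel = E @ true_f
f = unmix(pixel, E)
```

Applied per pixel, the recovered fractions form the V, I and S fraction images; a full treatment would also enforce non-negativity.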
Abstract:
Teaching the PSP: Challenges and Lessons Learned by Jurgen Borstler, David Carrington, Gregory W. Hislop, Susan Lisack, Keith Olson, and Laurie Williams, pp. 42-48. Software engineering educators need to provide environments where students learn about the size and complexity of modern software systems and the techniques available for managing these difficulties. Five universities used the Personal Software Process (PSP) to teach software engineering concepts in a variety of contexts.
Abstract:
The paper presents methods for the measurement of convective heat transfer distributions in a cold-flow, supersonic blowdown wind tunnel. The techniques use the difference between the model surface temperature and the adiabatic wall temperature as the driving temperature difference for heat transfer, so no active heating or cooling of the test gas or model is required. Thermochromic liquid crystals are used for surface temperature indication, and results from experiments in a Mach 3 flow indicate that measurements of the surface heat transfer distribution under swept shock wave boundary layer interactions can be made. (C) 2002 Elsevier Science Ltd. All rights reserved.
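A common data-reduction step for transient liquid-crystal measurements of this kind is to invert the one-dimensional semi-infinite conduction solution for the heat transfer coefficient h. The sketch below assumes that standard model; the material properties and temperatures are illustrative, not values from the paper:

```python
import math

def wall_temp_ratio(h, t, rho_c_k):
    # Semi-infinite solid with convective boundary condition:
    # theta = (Tw - Ti)/(Taw - Ti) = 1 - exp(beta^2) * erfc(beta),
    # where beta = h * sqrt(t) / sqrt(rho * c * k).
    beta = h * math.sqrt(t) / math.sqrt(rho_c_k)
    return 1.0 - math.exp(beta * beta) * math.erfc(beta)

def solve_h(theta, t, rho_c_k, lo=1.0, hi=5000.0):
    # theta increases monotonically with h, so bisection suffices.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if wall_temp_ratio(mid, t, rho_c_k) < theta:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rho_c_k = 1190.0 * 1470.0 * 0.19   # perspex-like model wall (illustrative)
h_true = 500.0                      # W m^-2 K^-1 (illustrative)
theta = wall_temp_ratio(h_true, 5.0, rho_c_k)   # "measured" ratio at t = 5 s
h_est = solve_h(theta, 5.0, rho_c_k)
```

In practice theta comes from the liquid-crystal colour-change time at each surface point, giving the full heat transfer distribution.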
Abstract:
Aims: To determine the degree of inter-institutional agreement in the assessment of dobutamine stress echocardiograms using modern stress echocardiographic technology in combination with standardized data acquisition and assessment criteria. Methods and Results: Among six experienced institutions, 150 dobutamine stress echocardiograms (dobutamine up to 40 μg·kg⁻¹·min⁻¹ and atropine up to 1 mg) were performed on patients with suspected coronary artery disease using fundamental and harmonic imaging following a consistent digital acquisition protocol. Each dobutamine stress echocardiogram was assessed at every institution regarding endocardial visibility and left ventricular wall motion, without knowledge of any other data, using standardized reading criteria. No patients were excluded due to poor image quality or inadequate stress level. Coronary angiography was performed within 4 weeks and demonstrated significant coronary artery disease (≥50% diameter stenosis) in 87 patients. Using harmonic imaging, an average of 5.2±0.9 institutions agreed on dobutamine stress echocardiogram results as being normal or abnormal (mean kappa 0.55; 95% CI 0.50-0.60). Agreement was higher in patients with no coronary artery disease (equal assessment of dobutamine stress echocardiogram results by 5.5±0.8 institutions) or three-vessel disease (5.4±0.8 institutions) and lower in one- or two-vessel disease (5.0±0.9 and 5.2±1.0 institutions, respectively; P=0.041). Disagreement on test results was greater when only minor wall motion abnormalities were present. Agreement on dobutamine stress echocardiogram results was lower using fundamental imaging (mean kappa 0.49; 95% CI 0.44-0.54; P
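The kappa values reported above measure chance-corrected agreement between readers. A minimal two-reader Cohen's kappa on binary (normal/abnormal) calls, with invented example readings:

```python
def cohens_kappa(a, b):
    # a, b: equal-length lists of binary readings (1 = abnormal, 0 = normal).
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n              # each reader's "abnormal" rate
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical readings of 8 studies by two institutions.
site_1 = [1, 1, 0, 0, 1, 0, 1, 1]
site_2 = [1, 0, 0, 0, 1, 0, 1, 1]
kappa = cohens_kappa(site_1, site_2)
```

A study-level mean over all reader pairs gives the multi-institution figure quoted in the abstract.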
Abstract:
Most sugarcane breeding programs in Australia use large unreplicated trials to evaluate clones in the early stages of selection. Replicated commercial varieties provide a method of local control of soil fertility. Although such methods may be useful in detecting broad trends in the field, variation often occurs on a much smaller scale. Methods such as spatial analysis adjust a plot for variability by using information from its immediate neighbours. These techniques are routinely used to analyse cereal data in Australia and have resulted in increased accuracy and precision in the estimates of variety effects. In this paper, spatial analyses in which the variability is decomposed into local, natural, and extraneous components are applied to early-stage selection trials in sugarcane. Interplot competition in cane yield and trend in sugar content were substantial in many of the trials, and there were often large differences between the selections made by the spatial method and by the current method used by the Bureau of Sugar Experiment Stations. A joint modelling approach for tonnes of sugar per hectare in response to fertility trends and interplot competition is recommended.
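A simple relative of the spatial adjustments described above is the Papadakis method, which corrects each plot using the residuals of its immediate neighbours. The sketch below uses an invented one-row layout and a fixed adjustment coefficient of 1 (a full analysis would estimate this coefficient and model the trend explicitly):

```python
import numpy as np

def papadakis_adjust(yields, varieties):
    y = np.asarray(yields, dtype=float)
    v = np.asarray(varieties)
    # Residuals from the (unadjusted) variety means.
    resid = y - np.array([y[v == g].mean() for g in v])
    adj = y.copy()
    for j in range(1, len(y) - 1):
        # Subtract the mean residual of the two immediate neighbours,
        # which carries the local fertility trend.
        adj[j] = y[j] - 0.5 * (resid[j - 1] + resid[j + 1])
    return adj

# Two varieties alternating along a row, with a linear fertility trend
# (0.1 units per plot) and a true variety difference of 1.0.
varieties = np.array(["A", "B"] * 10)
yields = np.where(varieties == "A", 0.0, 1.0) + 0.1 * np.arange(20)
adjusted = papadakis_adjust(yields, varieties)
```

In this noise-free example the adjustment removes the fertility trend entirely: all interior plots of a variety end up with the same adjusted value.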
Abstract:
The microbiological quality of routinely processed tripe and rumen pillars was compared with that of product derived after emptying the rumen (paunch) without using water (dry dumping) and after deliberately bursting the paunches before processing. Prior to packing, the mean log10 aerobic plate counts (APC) for the routinely processed tripe and rumen pillars were 3.55±1.08 and 3.28±0.87/g, respectively. The corresponding mean log10 total coliform counts (TCC) were 1.27±1.28 and 2.08±0.87. The mean log10 APC on tripe and rumen pillars after dry dumping were 3.06±0.60 and 3.90±0.75/g, respectively; the corresponding mean log10 TCC were 1.03±0.60/g and 2.75±1.14/g. After deliberately bursting the paunches before processing, the mean log10 APC on tripe and rumen pillars were 3.55±0.83/g and 3.50±0.59/g, and the mean log10 TCC were 1.54±0.95/g and 2.66±0.82/g, respectively. In all cases the prevalence of Salmonella and Campylobacter spp. was less than 3%. The results indicate that both tripe and rumen pillars can be produced after dry dumping without compromising product quality. Similarly, incidentally burst paunches that become contaminated with ingesta on the serosal surface can be processed without compromising product quality. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
Abstract:
This paper presents results on the simulation of the solid-state sintering of copper wires using Monte Carlo techniques based on elements of lattice theory and cellular automata. The initial structure is superimposed onto a triangular, two-dimensional lattice, where each lattice site corresponds to either an atom or a vacancy. The number of vacancies varies with the simulation temperature, and a cluster of vacancies constitutes a pore. To simulate sintering, lattice sites are picked at random and reoriented in terms of an atomistic model governing mass transport. The probability that an atom has sufficient energy to jump to a vacant lattice site is related to the jump frequency, and hence the diffusion coefficient, while the probability that an atomic jump will be accepted is related to the change in energy of the system as a result of the jump, as determined by the change in the number of nearest neighbours. The jump frequency is also used to relate model time, measured in Monte Carlo steps, to the actual sintering time. The model incorporates bulk, grain boundary and surface diffusion terms and includes vacancy annihilation on the grain boundaries. The predictions of the model were found to be consistent with experimental data, both in terms of the microstructural evolution and in terms of the sintering time. (C) 2002 Elsevier Science B.V. All rights reserved.
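The acceptance rule described above, with energy changes counted from nearest-neighbour bonds, is the standard Metropolis criterion. A minimal sketch of the atom-vacancy exchange step, using six axial neighbour offsets to mimic a triangular lattice; the lattice size, bond energy and temperature are illustrative, not the paper's parameters:

```python
import math
import random

random.seed(0)
N, J, KT = 20, 1.0, 0.05          # lattice size, bond energy, temperature
# Six neighbours of a triangular lattice in axial coordinates.
NEIGH = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

# 1 = atom, 0 = vacancy; start from a random ~20% vacancy structure.
grid = [[0 if random.random() < 0.2 else 1 for _ in range(N)] for _ in range(N)]

def energy(g):
    # Total energy: -J per atom-atom bond (periodic boundaries);
    # each bond is seen from both ends, hence the factor of 2.
    twice_bonds = sum(g[(i + di) % N][(j + dj) % N]
                      for i in range(N) for j in range(N) if g[i][j]
                      for di, dj in NEIGH)
    return -J * twice_bonds / 2

def atom_neighbours(g, i, j, skip):
    # Atoms adjacent to (i, j), excluding the exchange partner site.
    return sum(g[(i + di) % N][(j + dj) % N]
               for di, dj in NEIGH
               if ((i + di) % N, (j + dj) % N) != skip)

def step(g):
    # Attempt one atom-vacancy exchange with Metropolis acceptance.
    i, j = random.randrange(N), random.randrange(N)
    di, dj = random.choice(NEIGH)
    k, l = (i + di) % N, (j + dj) % N
    if g[i][j] == g[k][l]:
        return                              # need one atom and one vacancy
    if g[i][j] == 0:
        (i, j), (k, l) = (k, l), (i, j)     # make (i, j) the atom
    # Energy change = -J * (bonds at the new site - bonds at the old site).
    d_e = -J * (atom_neighbours(g, k, l, (i, j)) - atom_neighbours(g, i, j, (k, l)))
    if d_e <= 0 or random.random() < math.exp(-d_e / KT):
        g[i][j], g[k][l] = 0, 1             # accept: move the atom

atoms0, e0 = sum(map(sum, grid)), energy(grid)
for _ in range(5000):
    step(grid)
```

At this low temperature the exchanges are overwhelmingly downhill, so atoms coarsen and pores shrink while the atom count is conserved; the full model adds the separate diffusion mechanisms and vacancy annihilation at grain boundaries.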
Abstract:
This study integrated the research streams of computer-mediated communication (CMC) and group conflict by comparing the expression of different types of conflict in CMC groups and face-to-face (FTF) groups over time. The main aim of the study was to compare the cues-filtered-out approach against social information processing theory. A laboratory study was conducted with 39 groups (19 CMC and 20 FTF) in which members were required to work together over three sessions. The frequencies of task, process, and relationship conflict were analyzed. The findings supported social information processing theory: there was more process and relationship conflict in CMC groups than in FTF groups on Day 1, but this difference disappeared on Days 2 and 3. There was no difference between CMC and FTF groups in the amount of task conflict expressed on any day.
Abstract:
Chest clapping, vibration, and shaking were studied in 10 physiotherapists who applied these techniques to an anesthetized animal model. Hemodynamic variables (such as heart rate, blood pressure, pulmonary artery pressure, and right atrial pressure) were measured during the application of these techniques to verify claims of adverse events. In addition, expired tidal volume and peak expiratory flow rate were measured to ascertain the effects of these techniques. Physiotherapists in this study applied chest clapping at a rate of 6.2±0.9 Hz, vibration at 10.5±2.3 Hz, and shaking at 6.2±2.3 Hz. At these rates, esophageal pressure swings of 8.8±5.0, 0.7±0.3, and 1.4±0.7 mmHg resulted from clapping, vibration, and shaking, respectively. Variability in the rates and forces generated by these techniques was related to physiotherapists' characteristics, which accounted for 80% of the variance in shaking force (P = 0.003). The application of these techniques by physiotherapists was found to have no significant effects on hemodynamic and most ventilatory variables in this study. We conclude that chest clapping, vibration, and shaking 1) can be consistently performed by physiotherapists; 2) are significantly related to physiotherapists' characteristics, particularly clinical experience; and 3) caused no significant hemodynamic effects.
Abstract:
Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult; there are many preferences which are hard to accommodate. This paper argues that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of principles from the study of artifacts. This point was raised in the Curriculum 2001 discussions, and debate needs to start in good time for the next curriculum standard. This paper provides a starting point for that debate by outlining a process by which principles and artifacts may be separated, and presents a sample curriculum to illustrate the possibilities. This sample curriculum has some positive points, though these are incidental to the need to start debating the issue. Other models, with a less rigorous ordering of principles before artifacts, would still gain from making it clearer whether a specific concept is fundamental or a property of a specific technology. (C) 2003 Elsevier Ltd. All rights reserved.