105 results for object orientation processing
Abstract:
In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system in solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing the combined knowledge representation approach, which uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment with implementation of the knowledge-based system approach. For system verification, results from design examples are presented.
Abstract:
The genetic relationship between lower (information processing speed), intermediate (working memory), and higher levels (complex cognitive processes as indexed by IQ) of mental ability was studied in a classical twin design comprising 166 monozygotic and 190 dizygotic twin pairs. Processing speed was measured by a choice reaction time (RT) task (2-, 4-, and 8-choice), working memory by a visual-spatial delayed response task, and IQ by the Multidimensional Aptitude Battery. Multivariate analysis, adjusted for test-retest reliability, showed the presence of a genetic factor influencing all variables and a genetic factor influencing 4- and 8-choice RTs, working memory, and IQ. There were also genetic factors specific to 8-choice RT, working memory, and IQ. The results confirmed a strong relationship between choice RT and IQ (phenotypic correlations: -0.31 to -0.53 in females, -0.32 to -0.56 in males; genotypic correlations: -0.45 to -0.70) and a weaker but significant association between working memory and IQ (phenotypic: 0.26 in females, 0.13 in males; genotypic: 0.34). A significant part of the genetic variance (43%) in IQ was not related to either choice RT or delayed response performance, and may represent higher order cognitive processes.
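The heritability estimates in twin designs like this one come from comparing monozygotic and dizygotic resemblance. As a rough illustration only (the study fits a full multivariate genetic model, not this shortcut), Falconer's classic estimate can be sketched as follows; the correlation values are illustrative, not taken from the paper:

```python
# Hedged sketch: Falconer's back-of-envelope heritability estimate from
# twin correlations. This is NOT the multivariate model used in the
# study; it only illustrates the logic of a classical twin design.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Broad heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Illustrative (assumed) twin correlations, not values from the paper:
h2 = falconer_heritability(r_mz=0.80, r_dz=0.50)
print(round(h2, 2))  # 0.6
```

Because monozygotic twins share all their genes and dizygotic twins on average half, doubling the difference in correlations crudely isolates the genetic share of the variance.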
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
Abstract:
A scheme is presented to incorporate a mixed potential integral equation (MPIE) using Michalski's formulation C with the method of moments (MoM) for analyzing the scattering of a plane wave from conducting planar objects buried in a dielectric half-space. The robust complex image method with a two-level approximation is used for the calculation of the Green's functions for the half-space. To further speed up the computation, an interpolation technique for filling the matrix is employed. While the induced current distributions on the object's surface are obtained in the frequency domain, the corresponding time domain responses are calculated via the inverse fast Fourier transform (FFT). The complex natural resonances of targets are then extracted from the late time response using the generalized pencil-of-function (GPOF) method. We investigate the pole trajectories as we vary the distance between strips and the depth and orientation of single, buried strips. The variation from the pole position of a single strip in a homogeneous dielectric medium was only a few percent for most of these parameter variations.
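The frequency-to-time-domain step described above can be sketched with a standard inverse FFT. The single-pole spectrum below is a synthetic stand-in for the MoM-computed current response, and the pole value is an arbitrary assumption, not a resonance from the paper:

```python
# Hedged sketch: recovering a time-domain response from frequency-domain
# samples via the inverse FFT, as the abstract describes. The spectrum
# here is a synthetic single-resonance response, not MoM data.
import numpy as np

f = np.linspace(0.0, 1.0e9, 512)        # frequency samples (Hz), assumed grid
s0 = -1.0e8 + 2j * np.pi * 3.0e8        # illustrative complex natural resonance
H = 1.0 / (2j * np.pi * f - s0)         # one-pole frequency response
h_t = np.fft.ifft(H)                    # time-domain response; its late-time
                                        # part would feed a GPOF pole extraction
```

In the paper's workflow, a method such as GPOF would then fit complex exponentials to the late-time portion of `h_t` to recover poles like `s0`.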
Abstract:
The discovery of periodic mesoporous MCM-41 and related molecular sieves has attracted significant attention from a fundamental as well as applied perspective. They possess well-defined cylindrical/hexagonal mesopores with a simple geometry, tailored pore size, and reproducible surface properties. Hence, there is an ever-growing scientific interest in the challenges posed by their processing and characterization and by the refinement of various sorption models. Further, MCM-41-based materials are currently under intense investigation with respect to their utility as adsorbents, catalysts, supports, ion-exchangers, and molecular hosts. In this article, we provide a critical review of the developments in these areas with particular emphasis on adsorption characteristics, progress in controlling the pore sizes, and a comparison of pore size distributions using traditional and newer models. The model proposed by the authors for adsorption isotherms and criticalities in capillary condensation and hysteresis is found to explain unusual adsorption behavior in these materials while providing a convenient characterization tool.
Abstract:
Recent progress in the production, purification, and experimental and theoretical investigation of carbon nanotubes for hydrogen storage is reviewed. From the industrial point of view, the chemical vapor deposition process has shown advantages over laser ablation and electric-arc-discharge methods. The ultimate goal in nanotube synthesis should be to gain control over geometrical aspects of nanotubes, such as location and orientation, and the atomic structure of nanotubes, including helicity and diameter. There is currently no effective and simple purification procedure that fulfills all requirements for processing carbon nanotubes. Purification is still the bottleneck for technical applications, especially where large amounts of material are required. Although the alkali-metal-doped carbon nanotubes showed high H-2 weight uptake, further investigations indicated that some of this uptake was due to water rather than hydrogen. This discovery indicates a potential source of error in evaluation of the storage capacity of doped carbon nanotubes. Nevertheless, currently available single-wall nanotubes yield a hydrogen uptake value near 4 wt% under moderate pressure and room temperature. A further 50% increase is needed to meet U.S. Department of Energy targets for commercial exploitation. Meeting this target will require combining experimental and theoretical efforts to achieve a full understanding of the adsorption process, so that the uptake can be rationally optimized to commercially attractive levels. Large-scale production and purification of carbon nanotubes and remarkable improvement of H-2 storage capacity in carbon nanotubes represent significant technological and theoretical challenges in the years to come.
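The "4 wt%" figure above is a gravimetric uptake. As a minimal sketch (definitions vary between papers; one common convention is shown, and the masses are illustrative, not measured values):

```python
# Hedged sketch of the gravimetric wt% calculation behind hydrogen
# storage figures. Convention assumed: wt% = m_H2 / (m_sorbent + m_H2),
# one of the common definitions; the input masses are illustrative.

def h2_weight_percent(m_h2_g: float, m_sorbent_g: float) -> float:
    """Hydrogen uptake as a percentage of total (sorbent + H2) mass."""
    return 100.0 * m_h2_g / (m_sorbent_g + m_h2_g)

# e.g. 0.042 g of H2 adsorbed on 1.0 g of nanotube sample is about 4 wt%
print(round(h2_weight_percent(0.042, 1.0), 1))  # 4.0
```

Note that the alternative convention wt% = m_H2 / m_sorbent gives slightly higher numbers, which is one reason reported capacities are hard to compare across studies.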
Abstract:
The present study investigates human visual processing of simple two-colour patterns using a delayed match to sample paradigm with positron emission tomography (PET). This study is unique in that we specifically designed the visual stimuli to be the same for both pattern and colour recognition, with all patterns being abstract shapes, not easily verbally coded, composed of two-colour combinations. We did this to explore those brain regions required for both colour and pattern processing and to separate those areas of activation required for one or the other. We found that both tasks activated similar occipital regions, the major difference being more extensive activation in pattern recognition. A right-sided network that involved the inferior parietal lobule, the head of the caudate nucleus, and the pulvinar nucleus of the thalamus was common to both paradigms. Pattern recognition also activated the left temporal pole and right lateral orbital gyrus, whereas colour recognition activated the left fusiform gyrus and several right frontal regions. (C) 2001 Wiley-Liss, Inc.
Abstract:
The human nervous system constructs a Euclidean representation of near (personal) space by combining multiple sources of information (cues). We investigated the cues used for the representation of personal space in a patient with visual form agnosia (DF). Our results indicated that DF relies predominantly on binocular vergence information when determining the distance of a target despite the presence of other (retinal) cues. Notably, DF was able to construct a Euclidean representation of personal space from vergence alone. This finding supports previous assertions that vergence provides the nervous system with veridical information for the construction of personal space. The results from the current study, together with those of others, suggest that: (i) the ventral stream is responsible for extracting depth and distance information from monocular retinal cues (i.e. from shading, texture, perspective) and (ii) the dorsal stream has access to binocular information (from horizontal image disparities and vergence). These results also indicate that DF was not able to use size information to gauge target distance, suggesting that intact temporal cortex is necessary for learned size to influence distance processing. Our findings further suggest that in neurologically intact humans, object information extracted in the ventral pathway is combined with the products of dorsal stream processing for guiding prehension. Finally, we studied the size-distance paradox in visual form agnosia in order to explore the cognitive use of size information. The results of this experiment were consistent with a previous suggestion that the paradox is a cognitive phenomenon.
Abstract:
Two hazard risk assessment matrices for the ranking of occupational health risks are described. The qualitative matrix uses qualitative measures of probability and consequence to determine risk assessment codes for hazard-disease combinations. A walk-through survey of an underground metalliferous mine and concentrator is used to demonstrate how the qualitative matrix can be applied to determine priorities for the control of occupational health hazards. The semi-quantitative matrix uses attributable risk as a quantitative measure of probability and uses qualitative measures of consequence. A practical application of this matrix is the determination of occupational health priorities using existing epidemiological studies. Calculated attributable risks from epidemiological studies of hazard-disease combinations in mining and minerals processing are used as examples. These historic response data do not reflect the risks associated with current exposures. A method using current exposure data, known exposure-response relationships and the semi-quantitative matrix is proposed for more accurate and current risk rankings.
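A qualitative probability-by-consequence matrix of the kind described above can be sketched as follows. The category labels, scoring rule, and risk codes are illustrative assumptions, not the matrices from the paper:

```python
# Hedged sketch of a qualitative hazard risk matrix: a probability
# category and a consequence category map to a coarse risk assessment
# code. All labels, thresholds, and codes below are assumptions made
# for illustration, not the paper's actual matrix.

PROBABILITY = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["negligible", "minor", "moderate", "major", "catastrophic"]

def risk_code(probability: str, consequence: str) -> str:
    """Rank a hazard-disease combination by a probability x consequence score."""
    score = (PROBABILITY.index(probability) + 1) * (CONSEQUENCE.index(consequence) + 1)
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

print(risk_code("likely", "major"))  # extreme
print(risk_code("rare", "minor"))    # low
```

The semi-quantitative variant described in the abstract would replace the qualitative probability category with an attributable risk estimated from epidemiological studies, keeping the consequence axis qualitative.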
Abstract:
As marketers and researchers we understand quality from the consumer's perspective, and throughout contemporary service quality literature there is an emphasis on what the consumer is looking for, or at least that is the intention. Through examining the underlying assumptions of dominant service quality theories, an implicit dualistic ontology is highlighted (where subject and object are considered independent) and argued to effectively negate the said necessary consumer orientation. This fundamental assumption is discussed, as are the implications, following a critical review of dominant service quality models. Consequently, we propose an alternative approach to service quality research that aims towards a more genuine understanding of the consumer's perspective on quality experienced within a service context. Essentially, contemporary service quality research is suggested to be limited in its inherent third-person perspective and the interpretive, specifically phenomenographic, approach put forward here is suggested as a means of achieving a first-person perspective on service quality.