884 results for Computational Geometry and Object Modelling


Relevance: 100.00%

Publisher:

Abstract:

The human brain is often considered to be the most cognitively capable among mammalian brains and to be much larger than expected for a mammal of our body size. Although the number of neurons is generally assumed to be a determinant of computational power, and despite the widespread quotes that the human brain contains 100 billion neurons and ten times more glial cells, the absolute number of neurons and glial cells in the human brain remains unknown. Here we determine these numbers by using the isotropic fractionator and compare them with the expected values for a human-sized primate. We find that the adult male human brain contains on average 86.1 ± 8.1 billion NeuN-positive cells ("neurons") and 84.6 ± 9.8 billion NeuN-negative ("nonneuronal") cells. With only 19% of all neurons located in the cerebral cortex, greater cortical size (representing 82% of total brain mass) in humans compared with other primates does not reflect an increased relative number of cortical neurons. The ratios between glial cells and neurons in the human brain structures are similar to those found in other primates, and their numbers of cells match those expected for a primate of human proportions. These findings challenge the common view that humans stand out from other primates in their brain composition and indicate that, with regard to numbers of neuronal and nonneuronal cells, the human brain is an isometrically scaled-up primate brain. J. Comp. Neurol. 513:532-541, 2009. (c) 2009 Wiley-Liss, Inc.
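The reported counts make the headline ratios easy to verify; the following sketch simply re-derives the glia/neuron ratio and the cortical neuron count from the numbers quoted in the abstract:

```python
# Quick check of the ratios implied by the reported counts
# (values taken from the abstract above).
neurons = 86.1e9        # NeuN-positive cells
nonneuronal = 84.6e9    # NeuN-negative cells
cortical_fraction = 0.19

glia_neuron_ratio = nonneuronal / neurons
cortical_neurons = cortical_fraction * neurons

print(f"glia/neuron ratio: {glia_neuron_ratio:.2f}")            # roughly 1:1
print(f"cortical neurons: {cortical_neurons / 1e9:.1f} billion")
```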

Relevance: 100.00%

Publisher:

Abstract:

This study aimed to develop a plate to treat fractures of the mandibular body in dogs and to validate the design using finite element analysis and biomechanical tests. Mandible prototypes were produced with 10 oblique ventrorostral fractures (favorable) and 10 oblique ventrocaudal fractures (unfavorable). Three groups were established for each fracture type. Osteosynthesis used a pure titanium plate of double-arch geometry and locked monocortical screws of free angulation. The mechanical resistance of the prototype with the unfavorable fracture was lower than that of the favorable fracture. In both fractures, the deflection increased and the relative stiffness decreased proportionally as the number of screws diminished. The finite element analysis validated this plate design, since the maximum stress concentration observed on the plate was lower than the stress limit tolerated by titanium. In conclusion, the double-arch geometry plate fixed with locked monocortical screws has sufficient resistance to stabilize oblique fractures without compromising mandibular dental or neurovascular structures. J Vet Dent 24 (7); 212 - 221, 2010

Relevance: 100.00%

Publisher:

Abstract:

Introduction. The objective of this study was to analyse the accommodation needs of people with intellectual disability over the age of 18 years in Toowoomba and contiguous shires. In 2004, a group of carers established the Toowoomba Intellectual Disability Support Association (TIDSA) to address the lack of supported accommodation for people with intellectual disability over the age of 18 and the concerns of ageing carers. The Centre for Rural and Remote Area Health (CRRAH) was engaged by TIDSA to ascertain this need and undertook a research project funded by the Queensland Gambling Community Benefit Fund. While data specifically relating to people with intellectual disability and their carers are difficult to obtain, the Australian Bureau of Statistics reports that carers of people with a disability are more likely to be female and at least 65 years of age. Projections by the National Centre for Social and Economic Modelling (NATSEM) show that disability rates are increasing and carer rates are decreasing. Thus the problem of providing appropriate support to the increasing number of ageing carers and those they care for will be a major challenge to policy makers and is an issue of immediate concern. In general, what was once the norm of accommodating people with intellectual disability in large institutions is now changing to accommodation in community-based residences (Annison, 2000; Young, Ashman, Sigafoos, & Grevell, 2001). However, in Toowoomba and contiguous shires, TIDSA have noted that the availability of suitable accommodation for people with intellectual disability over the age of 18 years is declining, with no new options available in an environment of increasing demand. Most effort seemed to be directed towards crisis provision.
Method. This study employed two phases of data gathering, the first being the distribution of a questionnaire through local service providers, and upon individual request, to the carers of people with intellectual disability over the age of 18. The questionnaire comprised Likert-type items intended to measure various aspects of current and future accommodation issues. Most questions were followed with space for free-response comments to give carers the opportunity to further clarify and expand on their responses. The second phase comprised semi-structured interviews conducted with ten carers and ten people with intellectual disability who had participated in the Phase One questionnaire. Interviews were transcribed verbatim and subjected to content analysis in which major themes were explored. Results. Age and gender: carer participants in this study totalled 150. The mean age of these carers was 61.5 years, ranging from 40-91 years. Females comprised 78% of the sample (mean age = 61.49; range 40-91) and 22% were male (mean age = 61.7; range 43-81). The mean age of people with intellectual disability in our study was 37.2 years, ranging from 18-79 years, with 40% female (mean age = 39.5; range 19-79) and 60% male (mean age = 35.6; range 18-59). The average age of carers caring for a person over the age of 18 who is living at home is 61 years. The average age of the carer who cares for a person who is living away from home is 62 years. The overall age range of both these groups of carers is between 40 and 81 years. The oldest group of carers (mean age = 70 years) were those where the person with intellectual disability lives away from home in a large residential facility. Almost one quarter of people with an intellectual disability who currently live at home are cared for by one primary carer, almost exclusively a parent.

Relevance: 100.00%

Publisher:

Abstract:

The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in the industry. The first problem is that neural network models do not appear to be readily available to a corrosion engineer. Therefore the first part of this paper describes a neural network model of CO2 corrosion which was created using a standard commercial software package and simple modelling strategies. It was found that such a model was able to capture practically all of the trends noticed in the experimental data with acceptable accuracy. This exercise has proven that a corrosion engineer could readily develop a neural network model such as the one described here for any problem at hand, given that sufficient experimental data exist. This applies even in cases where the understanding of the underlying processes is poor. The second problem arises in cases where not all the required inputs for a model are known, or where they can be estimated only with a limited degree of accuracy. It seems advantageous to have models that can take as input a range rather than a single value. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons are shown which illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
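The Monte Carlo idea described in the abstract — feeding the model ranges rather than point values — can be sketched as follows. The `corrosion_rate` function is a hypothetical stand-in for the paper's trained neural network, and the input ranges are illustrative:

```python
import random
import statistics

def corrosion_rate(temp_c, pH, co2_partial_pressure):
    # Hypothetical stand-in for the trained model; any predictor
    # (e.g. a neural network) could be substituted here.
    return 0.1 * temp_c - 2.0 * (pH - 4.0) + 1.5 * co2_partial_pressure

def monte_carlo(model, input_ranges, n_samples=10_000, seed=0):
    """Propagate input ranges through the model by uniform sampling."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        sample = {k: rng.uniform(lo, hi) for k, (lo, hi) in input_ranges.items()}
        outputs.append(model(**sample))
    return outputs

# Each uncertain input is given as a (low, high) range instead of a point value.
ranges = {"temp_c": (50, 70), "pH": (4.5, 5.5), "co2_partial_pressure": (1.0, 2.0)}
rates = monte_carlo(corrosion_rate, ranges)
print(f"mean = {statistics.mean(rates):.2f}, stdev = {statistics.stdev(rates):.2f}")
```

The spread of the output distribution is exactly the sensitivity the abstract describes: widening one input range and rerunning shows how much of the output uncertainty that parameter contributes.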

Relevance: 100.00%

Publisher:

Abstract:

As marketers and researchers we understand quality from the consumer's perspective, and throughout contemporary service quality literature there is an emphasis on what the consumer is looking for, or at least that is the intention. Through examining the underlying assumptions of dominant service quality theories, an implicit dualistic ontology is highlighted (where subject and object are considered independent) and argued to effectively negate the said necessary consumer orientation. This fundamental assumption is discussed, as are the implications, following a critical review of dominant service quality models. Consequently, we propose an alternative approach to service quality research that aims towards a more genuine understanding of the consumer's perspective on quality experienced within a service context. Essentially, contemporary service quality research is suggested to be limited in its inherent third-person perspective and the interpretive, specifically phenomenographic, approach put forward here is suggested as a means of achieving a first-person perspective on service quality.

Relevance: 100.00%

Publisher:

Abstract:

We use published and new trace element data to identify element ratios which discriminate between arc magmas from the supra-subduction zone mantle wedge and those formed by direct melting of subducted crust (i.e. adakites). The clearest distinction is obtained with those element ratios which are strongly fractionated during refertilisation of the depleted mantle wedge, ultimately reflecting slab dehydration. Hence, adakites have significantly lower Pb/Nd and B/Be but higher Nb/Ta than typical arc magmas and continental crust as a whole. Although Li and Be are also overenriched in continental crust, behaviour of Li/Yb and Be/Nd is more complex and these ratios do not provide unique signatures of slab melting. Archaean tonalite-trondhjemite-granodiorites (TTGs) strongly resemble ordinary mantle wedge-derived arc magmas in terms of fluid-mobile trace element content, implying that they did not form by slab melting but that they originated from mantle which was hydrated and enriched in elements lost from slabs during prograde dehydration. We suggest that Archaean TTGs formed by extensive fractional crystallisation from a mafic precursor. It is widely claimed that the time between the creation and subduction of oceanic lithosphere was significantly shorter in the Archaean (i.e. 20 Ma) than it is today. This difference was seen as an attractive explanation for the presumed preponderance of adakitic magmas during the first half of Earth's history. However, when we consider the effects of a higher potential mantle temperature on the thickness of oceanic crust, it follows that the mean age of oceanic lithosphere has remained virtually constant. Formation of adakites has therefore always depended on local plate geometry and not on potential mantle temperature.

Relevance: 100.00%

Publisher:

Abstract:

The effects of convective and absolute instabilities on the formation of drops formed from cylindrical liquid jets of glycerol/water issuing into still air were investigated. Medium-duration reduced gravity tests were conducted aboard NASA's KC-135 and compared to similar tests performed under normal gravity conditions to aid in understanding the drop formation process. In reduced gravity, the Rayleigh-Chandrasekhar equation was found to accurately predict the transition between a region of absolute and convective instability as defined by a critical Weber number. Observations of the physics of the jet, its breakup, and subsequent drop dynamics under both gravity conditions and the effects of the two instabilities on these processes are presented. All the normal gravity liquid jets investigated, in regions of convective or absolute instability, were subject to significant stretching effects, which affected the geometry and dynamics of the subsequent drops. These effects were not displayed in reduced gravity; the liquid jets therefore formed drops that took longer to form (a reduction in drop frequency), were larger in size, and were more spherical (surface tension effects). Most observed changes, in regions of either absolute or convective instability, were due to a reduction in the buoyancy force and an increased importance of the surface tension force acting on the liquid contained in the jet or formed drop. Reduced gravity environments allow better investigations to be performed into the physics of liquid jets, subsequently formed drops, and the effects of instabilities on these systems. In reduced gravity, drops form up to three times more slowly and as a consequence are up to three times larger in volume in the theoretical absolute instability region than in the theoretical convective instability region. This difference was not seen in the corresponding normal gravity tests due to the masking effects of gravity.
A drop is shown to be able to form and detach in a region of absolute instability, and spanning the critical Weber number (from a region of convective to absolute instability) resulted in a marked change in dynamics and geometry of the liquid jet and detaching drops. (C) 2002 American Institute of Physics.
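The critical Weber number above compares inertial and surface-tension forces; a minimal sketch using the standard definition We = ρv²d/σ (the fluid values below are illustrative, not taken from the paper):

```python
def weber_number(density, velocity, diameter, surface_tension):
    """We = rho * v^2 * d / sigma — ratio of inertial to surface-tension forces."""
    return density * velocity**2 * diameter / surface_tension

# Illustrative values for a glycerol/water jet (hypothetical, not from the paper).
rho = 1150.0      # kg/m^3, liquid density
v = 0.5           # m/s, jet exit velocity
d = 2.0e-3        # m, nozzle diameter
sigma = 0.065     # N/m, surface tension

We = weber_number(rho, v, d, sigma)
print(f"We = {We:.1f}")
# Comparing We with the critical value from the Rayleigh-Chandrasekhar
# analysis indicates whether the jet sits in the absolutely or
# convectively unstable regime.
```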

Relevance: 100.00%

Publisher:

Abstract:

In order to understand the earthquake nucleation process, we need to understand the effective frictional behavior of faults with complex geometry and fault gouge zones. One important aspect of this is the interaction between the friction law governing the behavior of the fault on the microscopic level and the resulting macroscopic behavior of the fault zone. Numerical simulations offer a possibility to investigate the behavior of faults on many different scales and thus provide a means to gain insight into fault zone dynamics on scales which are not accessible to laboratory experiments. Numerical experiments have been performed to investigate the influence of the geometric configuration of faults with a rate- and state-dependent friction at the particle contacts on the effective frictional behavior of these faults. The numerical experiments are designed to be similar to laboratory experiments by DIETERICH and KILGORE (1994) in which a slide-hold-slide cycle was performed between two blocks of material and the resulting peak friction was plotted vs. holding time. Simulations with a flat fault without a fault gouge have been performed to verify the implementation. These have shown close agreement with comparable laboratory experiments. The simulations performed with a fault containing fault gouge have demonstrated a strong dependence of the critical slip distance D-c on the roughness of the fault surfaces and are in qualitative agreement with laboratory experiments.
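The slide-hold-slide behaviour being reproduced can be sketched with the aging form of a Dieterich-style rate- and state-dependent friction law, in which peak friction grows roughly logarithmically with hold time. The parameter values below are illustrative, not those of the paper:

```python
import math

# Illustrative rate-and-state parameters (not the paper's values).
mu0, a, b = 0.6, 0.010, 0.015
v0 = 1e-6      # reference slip rate, m/s
Dc = 1e-5      # critical slip distance, m

def peak_friction_after_hold(t_hold, v_slide=1e-6):
    """Peak friction on re-sliding after a stationary hold.

    Aging law: during a truly stationary hold the state variable grows as
    theta = theta_ss + t_hold, and the healing term b*ln(theta/theta_ss)
    raises friction above its steady-state value.
    """
    theta_ss = Dc / v_slide                 # steady-state value before the hold
    theta = theta_ss + t_hold               # state after the hold (aging law)
    return mu0 + a * math.log(v_slide / v0) + b * math.log(v0 * theta / Dc)

for t in (1, 10, 100, 1000):                # hold times in seconds
    print(f"hold {t:>5} s -> peak friction {peak_friction_after_hold(t):.4f}")
```

Plotting peak friction against the logarithm of holding time reproduces the near-linear trend of the Dieterich and Kilgore experiments that the simulations are benchmarked against.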

Relevance: 100.00%

Publisher:

Abstract:

This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied across any field, irrespective of the type of response.
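A minimal sketch of the two-stage template on synthetic data: an exploratory screen to find a parsimonious variable set, followed by a simple predictive fit. The variable names, data, and threshold are illustrative; in the paper's workflow the second stage would be a decision tree, GAM, or Bayesian model rather than the univariate slopes used here:

```python
import random
import statistics

random.seed(1)

# Synthetic stand-in data: the response depends on x1 and x2; x3 is pure noise.
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
x3 = [random.gauss(0, 1) for _ in range(n)]
y = [2 * a - b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

def corr(u, v):
    """Pearson correlation coefficient."""
    mu, mv = statistics.mean(u), statistics.mean(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

# Stage 1 (exploration): screen candidate variables by association with y.
candidates = {"x1": x1, "x2": x2, "x3": x3}
screened = {k: v for k, v in candidates.items() if abs(corr(v, y)) > 0.3}
print("retained variables:", sorted(screened))

# Stage 2 (prediction): fit a simple model to the retained set. Univariate
# least-squares slopes stand in for the decision tree / GAM / Bayesian
# models the template would use in practice.
for name, v in sorted(screened.items()):
    slope = corr(v, y) * statistics.stdev(y) / statistics.stdev(v)
    print(f"{name}: slope = {slope:.2f}")
```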

Relevance: 100.00%

Publisher:

Abstract:

Blast fragmentation can have a significant impact on the profitability of a mine. An optimum run of mine (ROM) size distribution is required to maximise the performance of downstream processes. If this fragmentation size distribution can be modelled and controlled, the operation will have made a significant advancement towards improving its performance. Blast fragmentation modelling is an important step in Mine to Mill™ optimisation. It allows the estimation of blast fragmentation distributions for a number of different rock mass, blast geometry, and explosive parameters. These distributions can then be modelled in downstream mining and milling processes to determine the optimum blast design. When a blast hole is detonated, rock breakage occurs in two different stress regions - compressive and tensile. In the first region, compressive stress waves form a 'crushed zone' directly adjacent to the blast hole. The second region, termed the 'cracked zone', occurs outside the crushed zone. The widely used Kuz-Ram model does not recognise these two blast regions. In the Kuz-Ram model the mean fragment size from the blast is approximated and is then used to estimate the remaining size distribution. Experience has shown that this model predicts the coarse end reasonably accurately, but it can significantly underestimate the amount of fines generated. As part of the Australian Mineral Industries Research Association (AMIRA) P483A Mine to Mill™ project, the Two-Component Model (TCM) and Crush Zone Model (CZM), developed by the Julius Kruttschnitt Mineral Research Centre (JKMRC), were compared and evaluated against measured ROM fragmentation distributions. An important criterion for this comparison was the variation of model results from measured ROM in the fine to intermediate section (1-100 mm) of the fragmentation curve. This region of the distribution is important for Mine to Mill™ optimisation.
The comparison of modelled and Split ROM fragmentation distributions has been conducted in harder ores (UCS greater than 80 MPa). Further work involves modelling softer ores. The comparisons will be continued with future site surveys to increase confidence in the comparison of the CZM and TCM to Split results. Stochastic fragmentation modelling will then be conducted to take into account variation of input parameters. A window of possible fragmentation distributions can then be compared to those obtained by Split. Following this work, an improved fragmentation model will be developed in response to these findings.
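For reference, the Kuz-Ram approach drapes a Rosin-Rammler curve over an estimated mean fragment size; a minimal sketch (the x50 and uniformity index n below are illustrative inputs, not values derived from blast parameters):

```python
import math

def rosin_rammler_passing(x_mm, x50_mm, n):
    """Fraction passing size x for a Rosin-Rammler distribution,
    anchored so that exactly 50% passes at x50 (the 0.693 = ln 2 factor)."""
    return 1.0 - math.exp(-0.693 * (x_mm / x50_mm) ** n)

x50 = 250.0   # mean fragment size in mm (illustrative)
n = 1.2       # uniformity index (illustrative; Cunningham's formula would
              # normally derive it from blast geometry)

for x in (1, 10, 100, 250, 1000):
    print(f"{x:>5} mm: {100 * rosin_rammler_passing(x, x50, n):6.2f} % passing")
```

A single Rosin-Rammler curve places very little mass at small sizes, which is exactly why Kuz-Ram underestimates fines — the gap the CZM and TCM address by modelling the crushed zone separately.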

Relevance: 100.00%

Publisher:

Abstract:

The authors investigated the extent to which the joint-attention behaviors of gaze following, social referencing, and object-directed imitation were related to each other and to infants' vocabulary development in a sample of 60 infants between the ages of 8 and 14 months. Joint-attention skills and vocabulary development were assessed in a laboratory setting. Split-half reliability analyses on the joint-attention measures indicated that the tasks reliably assessed infants' capabilities. In the main analysis, no significant correlations were found among the joint-attention behaviors; the only significant relationship was between gaze following and the number of names in infants' productive vocabularies. The overall pattern of results did not replicate previous studies (e.g., M. Carpenter, K. Nagell, & M. Tomasello, 1998) that found relationships between various emerging joint-attention behaviors.
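The split-half reliability analysis mentioned above correlates scores on two halves of a task and steps the result up with the Spearman-Brown formula; a minimal sketch on hypothetical item scores (the data below are invented for illustration):

```python
def pearson(u, v):
    """Pearson correlation coefficient."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

def split_half_reliability(scores):
    """Correlate odd-item and even-item half scores, then step the
    half-test correlation up with the Spearman-Brown formula."""
    odd = [sum(items[0::2]) for items in scores]
    even = [sum(items[1::2]) for items in scores]
    r_half = pearson(odd, even)
    return 2 * r_half / (1 + r_half)

# Hypothetical item scores (rows = infants, columns = task items).
scores = [
    [3, 2, 3, 3], [1, 1, 2, 1], [2, 2, 2, 3],
    [3, 3, 3, 3], [1, 2, 1, 1], [2, 3, 2, 2],
]
print(f"split-half reliability: {split_half_reliability(scores):.2f}")
```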

Relevance: 100.00%

Publisher:

Abstract:

Forest cover of the Maringá municipality, located in northern Paraná State, was mapped in this study. Mapping was carried out using high-resolution HRC sensor imagery and medium-resolution CCD sensor imagery from the CBERS satellite. Images were georeferenced and forest vegetation patches (TOFs - trees outside forests) were classified using two methods of digital classification: reflectance-based (the digital number of each pixel) and object-oriented. The area of each polygon was calculated, which allowed each polygon to be segregated into size classes. Thematic maps were built from the resulting polygon size classes, and summary statistics were generated for each size class in each area. It was found that most forest fragments in Maringá were smaller than 500 m². There was also a difference of 58.44% in the amount of vegetation detected between the high-resolution and medium-resolution imagery, due to the distinct spatial resolutions of the sensors. It was concluded that high-resolution geotechnology is essential to provide reliable information on urban green areas and forest cover in highly human-perturbed landscapes.
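The polygon size-class segregation step can be sketched as a simple binning of fragment areas; the areas and class boundaries below are illustrative, not the paper's:

```python
from collections import Counter

# Hypothetical fragment areas in m^2 (the study derives these from
# classified HRC/CCD polygons).
areas = [120, 80, 450, 1500, 95, 300, 5200, 60, 740, 210]

# Size classes in m^2: (label, lower bound, upper bound).
classes = [("<500", 0, 500), ("500-1000", 500, 1000), (">=1000", 1000, float("inf"))]

def classify(area):
    for label, lo, hi in classes:
        if lo <= area < hi:
            return label

counts = Counter(classify(a) for a in areas)
for label, _, _ in classes:
    n = counts.get(label, 0)
    print(f"{label:>9}: {n} fragments ({100 * n / len(areas):.0f}%)")
```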

Relevance: 100.00%

Publisher:

Abstract:

The attached document is the post-print version (the version corrected by the editor).

Relevance: 100.00%

Publisher:

Abstract:

Power system planning, control and operation require adequate use of existing resources to increase system efficiency. The use of optimal solutions in power systems allows huge savings, stressing the need for adequate optimization and control methods. These must be able to solve the envisaged optimization problems in time scales compatible with operational requirements. Power systems are complex, uncertain and changing environments that make the use of traditional optimization methodologies impracticable in most real situations. Computational intelligence methods present good characteristics to address this kind of problem and have already proved to be efficient for very diverse power system optimization problems. Evolutionary computation, fuzzy systems, swarm intelligence, artificial immune systems, neural networks, and hybrid approaches are presently seen as the most adequate methodologies to address several planning, control and operation problems in power systems. Future power systems, with intensive use of distributed generation and electricity market liberalization, will be more complex and bring huge challenges to the forefront of the power industry. Decentralized intelligence and decision making require more effective optimization and control techniques so that the involved players can make the most adequate use of existing resources in the new context. The application of computational intelligence methods to several problems of future power systems is presented in this chapter. Four different applications are presented to illustrate the promise and potential of computational intelligence.
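As a flavour of the swarm-intelligence methods mentioned, here is a minimal particle swarm optimiser applied to a toy economic-dispatch-style problem (two generators jointly meeting a demand, with the constraint handled by a quadratic penalty). All coefficients are illustrative and not from the chapter:

```python
import random

random.seed(0)

# Toy dispatch objective: two generators with quadratic cost must jointly
# meet a 100 MW demand (constraint enforced by a large penalty term).
def cost(p):
    p1, p2 = p
    generation_cost = 0.02 * p1**2 + 2.0 * p1 + 0.04 * p2**2 + 1.5 * p2
    demand_penalty = 1000.0 * (p1 + p2 - 100.0) ** 2
    return generation_cost + demand_penalty

def pso(objective, dim=2, bounds=(0.0, 100.0), n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimiser (global-best topology)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest, objective(gbest)

best, best_cost = pso(cost)
print(f"p1 = {best[0]:.1f} MW, p2 = {best[1]:.1f} MW, cost = {best_cost:.1f}")
```

The same loop, with a different objective and encoding, is the skeleton behind many of the planning and operation applications the chapter surveys.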

Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)