968 results for Modeling Development
Abstract:
This study examined the development of fatness, as indexed by skinfold thickness, in healthy Caucasian children and adolescents residing in the same location in Canada in the 1960s and the 1990s. The data come from two longitudinal studies, conducted approximately 30 years apart, of children aged 8-16 years. The first study (1964-1973) measured 207 males and 140 females annually. The second study (1991-1997) repeatedly measured 113 males and 115 females. Identical measurement tools and protocols were used for height, body mass, and skinfolds. Maturational age was estimated in years from the age of peak height velocity. Males from the second study matured significantly (P < 0.05) earlier. Multilevel regression modeling was used to determine developmental curves for the individuals within the two populations. When differences in height, body mass, and maturity were controlled, skinfold thicknesses of the males and females in the second study were significantly greater (P < 0.05) than those of age- and sex-matched peers in the first study. No such difference was seen in models of BMI. The results suggest that, with maturity and size controlled, the fatness of children and adolescents increased over the 30 years. (C) 2002 Wiley-Liss, Inc.
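For illustration, a minimal sketch of the kind of multilevel (mixed-effects) growth model described above, in Python with statsmodels. The file name and column names (skinfold, height, mass, maturity, cohort, subject) are assumptions for the sketch, not the study's actual variables.

```python
# Hedged sketch: random intercepts per subject handle the repeated
# annual measures; the 'cohort' coefficient (0 = 1960s study,
# 1 = 1990s study) tests whether the later sample is fatter once
# height, body mass, and maturity are controlled.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("growth_long.csv")  # hypothetical long-format data

model = smf.mixedlm(
    "skinfold ~ height + mass + maturity + cohort",  # assumed columns
    data=df,
    groups=df["subject"],  # one random intercept per child
)
print(model.fit().summary())
```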
Abstract:
Much research has been devoted over the years to investigating and advancing the techniques and tools that analysts use when they model. In contrast to what academics, software providers, and their resellers promote as what should be happening, the aim of this research was to determine whether practitioners still take conceptual modeling seriously. In addition, which techniques and tools are most popular for conceptual modeling, and for which major purposes is conceptual modeling used? The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (a proxy for large, complex projects). Technique use was also found to follow a significant inverted U-shaped curve, contrary to some prior explanations. An important contribution of this study was the identification of the factors that uniquely influence analysts' decisions to continue using modeling, namely communication to/from stakeholders (using diagrams), lack of internal knowledge of techniques, user expectations management, understanding how models integrate into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Ecologists and economists both use models to help develop strategies for biodiversity management. The practical use of disciplinary models, however, can be limited because ecological models tend not to address the socioeconomic dimension of biodiversity management, whereas economic models tend to neglect the ecological dimension. Given these shortcomings, there is a need to combine ecological and economic knowledge in ecological-economic models. It is insufficient for scientists to work separately in their own disciplines and combine their knowledge only when it comes to formulating management recommendations: such an approach does not capture feedback loops between the ecological and the socioeconomic systems. Furthermore, each discipline poses the management problem in its own way and arrives at its own most appropriate solution. These disciplinary solutions, however, are likely to be so different that a combined solution considering aspects of both disciplines cannot be found. Preconditions for a successful model-based integration of ecology and economics include (1) an in-depth knowledge of the two disciplines, (2) the adequate identification and framing of the problem to be investigated, and (3) a common understanding between economists and ecologists of modeling and scale. To further advance ecological-economic modeling, the development of common benchmarks, quality controls, and refereeing standards for ecological-economic models is desirable.
Abstract:
Purpose: To determine whether a significant relationship exists between fat mass (FM) development and physical activity (PA) and/or sugar-sweetened drink (SD) consumption in healthy boys and girls aged 8-19 yr. Methods: A total of 105 males and 103 females were assessed during childhood and adolescence for a maximum of 7 yr and a median of 5 yr. Height was measured biannually. Fat-free mass (FFM) and FM were assessed annually by dual x-ray absorptiometry (DXA). PA was evaluated two to three times annually using the PAQ-C/A. Energy intake and SD were assessed using a 24-h dietary intake questionnaire, also completed two to three times per year. Years from peak height velocity were used as a biological maturity age indicator. Multilevel random effects models were used to test the relationships. Results: When controlling for maturation, FFM, and energy intake adjusted for SD, PA level was negatively related to FM development in males (P < 0.05) but not in females (P > 0.05). In contrast, there was no relationship between SD and FM development in males or females (P > 0.05). There was also no interaction effect between SD and PA (P > 0.05) on FM development. Conclusion: This finding lends support to the idea that increasing PA in male youths aids in the control of FM development. The models employed showed no relationship between SD and FM in either gender.
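A similar hedged sketch for the model described here, again with invented file and column names; the pa:sd term corresponds to the interaction the authors report as non-significant.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fm_long.csv")  # hypothetical long-format data

# Fat mass as a function of fat-free mass, energy intake, maturity,
# physical activity, sugar-sweetened drinks, and the PA x SD interaction;
# random intercepts per subject handle the repeated measures.
model = smf.mixedlm("fm ~ ffm + energy + maturity + pa + sd + pa:sd",
                    data=df, groups=df["subject"])
print(model.fit().summary())
```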
Abstract:
This paper shows how formal and informal modeling languages can be used cooperatively within the MDA framework, and how transformations between models in these languages can be achieved using an MDA development environment. The integrated approach also provides an effective verification and validation (V&V) technique for MDA.
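As a loose illustration of the model-transformation idea underlying MDA (not the paper's actual toolchain), the following sketch maps a hypothetical platform-independent class model to a relational platform-specific model; all names are invented.

```python
# Toy MDA-style model-to-model transformation: each PIM class becomes
# a PSM table with a surrogate key. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class PimClass:                 # platform-independent model element
    name: str
    attributes: list = field(default_factory=list)

@dataclass
class Table:                    # platform-specific model element
    name: str
    columns: list = field(default_factory=list)

def pim_to_psm(cls: PimClass) -> Table:
    # The transformation rule: lowercase the name, prepend an id column.
    return Table(name=cls.name.lower(), columns=["id"] + cls.attributes)

print(pim_to_psm(PimClass("Customer", ["name", "email"])))
```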
Abstract:
The design, development, and use of complex systems models raises a unique class of challenges and potential pitfalls, many of which are commonly recurring problems. Over time, researchers gain experience in this form of modeling, choosing algorithms, techniques, and frameworks that improve the quality, confidence level, and speed of development of their models. This increasing collective experience of complex systems modelers is a resource that should be captured. Fields such as software engineering and architecture have benefited from the development of generic solutions to recurring problems, called patterns. Using pattern development techniques from these fields, insights from communities such as learning and information processing, data mining, bioinformatics, and agent-based modeling can be identified and captured. Collections of such 'pattern languages' would allow knowledge gained through experience to be readily accessible to less-experienced practitioners and to other domains. This paper proposes a methodology for capturing the wisdom of computational modelers by introducing example visualization patterns and a pattern classification system for analyzing the relationship between micro and macro behavior in complex systems models. We anticipate that a new field of complex systems patterns will provide an invaluable resource for both practicing and future generations of modelers.
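A hedged sketch of what a machine-readable pattern record might look like; the fields follow the conventional software-engineering pattern form (name, context, problem, solution), and the example content is invented, not taken from the paper.

```python
# Minimal pattern record for a complex-systems 'pattern language'.
from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    context: str        # when the pattern applies
    problem: str        # the recurring difficulty
    solution: str       # the generic remedy
    related: list = field(default_factory=list)  # linked patterns

example = Pattern(
    name="Micro-Macro Trace",
    context="Agent-based model with emergent aggregate behaviour",
    problem="Macro-level outputs cannot be traced to micro-level rules",
    solution="Log per-agent state alongside aggregates and visualize both",
    related=["Aggregate Dashboard"],
)
print(example.name, "-", example.problem)
```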
Abstract:
The ability to grow microscopic spherical birefringent crystals of vaterite, a calcium carbonate mineral, has allowed the development of an optical microrheometer based on optical tweezers. However, since these crystals are birefringent, and worse, are expected to have non-uniform birefringence, computational modeling of the microrheometer is a highly challenging task. Modeling the microrheometer, and optical tweezers in general, typically requires large numbers of repeated calculations for the same trapped particle, which places strong demands on the efficiency of the computational methods used. While our usual method of choice for computational modeling of optical tweezers, the T-matrix method, meets this requirement of efficiency, it is restricted to homogeneous isotropic particles. General methods that can model complex structures such as the vaterite particles, for example the finite-difference time-domain (FDTD) or finite-difference frequency-domain (FDFD) methods, are inefficient. Therefore, we have developed a hybrid FDFD/T-matrix method that combines the generality of volume-discretisation methods such as FDFD with the efficiency of the T-matrix method. We have used this hybrid method to calculate optical forces and torques on model vaterite spheres in optical traps. We present and compare the results of computational modeling and experimental measurements.
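To illustrate why the T-matrix method suits repeated trap calculations (the property the hybrid method preserves), here is a minimal numerical sketch: once the T-matrix is known, each new incident beam costs only a matrix-vector product. The matrix below is random placeholder data, not a physical vaterite T-matrix.

```python
# The expensive step (finding T, e.g. via FDFD for an inhomogeneous
# particle) is done once; every subsequent beam/position is cheap.
import numpy as np

rng = np.random.default_rng(0)
n = 50                                    # truncated multipole modes
T = rng.standard_normal((n, n)) * 0.01    # placeholder T-matrix

def scattered_coefficients(a_inc: np.ndarray) -> np.ndarray:
    # p = T a: scattered-field coefficients from incident-field ones.
    return T @ a_inc

for step in range(3):                     # e.g. positions along a scan
    a = rng.standard_normal(n)            # placeholder beam coefficients
    p = scattered_coefficients(a)
    print(step, np.linalg.norm(p))
```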
Abstract:
The part-of or part-whole construct is a fundamental element of many conceptual modeling grammars that is used to associate one thing (a component) with another thing (a composite). Substantive theoretical issues surrounding the part-whole construct remain to be resolved, however. For instance, contrary to widespread claims, the relationship between components and composites is not always transitive. Moreover, how the part-whole construct should be represented in a conceptual schema diagram remains a contentious issue. Some analysts argue composites should be represented as a relationship or association; others argue they should be represented as an entity. In this paper we use an ontological theory to support our argument that composites should be represented as entities, not as relationships or associations. We also describe an experiment undertaken to test whether representing composites as relationships or as entities enables users to understand a domain better. Our results support our argument that using entities to represent composites enables users to better understand a domain.
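A minimal sketch of the paper's recommendation in code form: the composite is given identity and attributes of its own (an entity), rather than existing only as an association between components. All names are illustrative.

```python
# Composite-as-entity: Bicycle has its own identity (frame_no) and
# attributes (colour); part-of links to components hang off the entity.
from dataclasses import dataclass, field

@dataclass
class Wheel:
    serial: str

@dataclass
class Bicycle:                      # the composite, modeled as an entity
    frame_no: str                   # identity of its own
    colour: str                     # attributes of its own
    wheels: list = field(default_factory=list)  # part-of links

bike = Bicycle(frame_no="F-123", colour="red",
               wheels=[Wheel("W-1"), Wheel("W-2")])
print(bike.frame_no, len(bike.wheels))
```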
Abstract:
The application of systems thinking to designing, managing, and improving business processes has led to a new "holonic-based" process modeling methodology. The theoretical background and the methodology are described using examples taken from a large organization that designs and manufactures capital goods equipment and operates within a complex and dynamic environment. A key point of differentiation of this methodology is that it allows a set of models to be produced without taking a task-breakdown approach, instead using systems thinking and a construct known as the "holon" to build process descriptions as a system of systems (i.e., a holarchy). The process-oriented holonic modeling methodology has been used for total quality management and business process engineering exercises in different industrial sectors, and builds models that connect the strategic vision of a company to its operational processes. Exercises have been conducted in response to environmental pressures to align operations with strategic thinking while becoming increasingly agile and efficient. This methodology is best applied in environments of high complexity, low volume, and high variety, where repeated learning opportunities are few and far between (e.g., large development projects). © 2007 IEEE.
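A hedged sketch of the holarchy idea: each holon is simultaneously a whole and a part, so a process description becomes a recursive system of systems rather than a flat task breakdown. Names and structure are invented for illustration.

```python
# A holon contains sub-holons; the holarchy is the recursive structure.
from dataclasses import dataclass, field

@dataclass
class Holon:
    name: str
    parts: list = field(default_factory=list)  # sub-holons

    def depth(self) -> int:
        # How many levels of the holarchy sit below this holon.
        return 1 + max((p.depth() for p in self.parts), default=0)

enterprise = Holon("Deliver capital equipment", parts=[
    Holon("Design product", parts=[Holon("Model requirements")]),
    Holon("Manufacture product"),
])
print(enterprise.depth())  # -> 3 levels of holarchy
```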
Abstract:
The investigation of insulation debris generation, transport, and sedimentation becomes more important with regard to reactor safety research for pressurized water reactors and boiling water reactors when considering the long-term behavior of emergency core cooling systems during all types of loss-of-coolant accidents (LOCAs). The insulation debris released near the break during a LOCA consists of a mixture of disparate particle populations that vary in size, shape, consistency, and other properties. Some fractions of the released insulation debris can be transported into the reactor sump, where it may perturb or impinge on the emergency core cooling systems. Open questions of generic interest are, for example, the particle load on strainers and the corresponding pressure drop, the sedimentation of the insulation debris in a water pool, and its possible resuspension and transport in the sump water flow. A joint research project on such questions is being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and Forschungszentrum Dresden-Rossendorf. The project deals with the experimental investigation and the development of computational fluid dynamics (CFD) models for the description of particle transport phenomena in coolant flow. While the experiments are performed at the University of Applied Sciences Zittau/Görlitz, the theoretical work is concentrated at Forschungszentrum Dresden-Rossendorf. In the current paper the basic concepts for CFD modeling are described, and feasibility studies including the conceptual design of the experiments are presented.
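As a back-of-the-envelope illustration of one sub-problem in this debris-transport modeling (not the project's actual CFD models), the sketch below estimates the terminal settling velocity of a small insulation particle in still water under Stokes drag; all parameter values are assumed.

```python
# Stokes settling: valid for small particle Reynolds numbers (Re_p << 1).
g = 9.81          # gravitational acceleration, m/s^2
rho_p = 1100.0    # particle density, kg/m^3 (assumed)
rho_f = 1000.0    # water density, kg/m^3
mu = 1.0e-3       # dynamic viscosity of water, Pa*s
d = 1.0e-4        # particle diameter, m (assumed)

# Terminal velocity from the Stokes drag balance.
v_t = (rho_p - rho_f) * g * d**2 / (18 * mu)
Re_p = rho_f * v_t * d / mu   # check the Stokes assumption holds
print(f"settling velocity {v_t:.2e} m/s, Re_p {Re_p:.3f}")
```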
Abstract:
The objective of this work was to design, construct, and commission a new ablative pyrolysis reactor and a high-efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial-scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high-efficiency cyclones for char removal, and a disk-and-doughnut quench column combined with a wet-walled electrostatic precipitator, mounted directly on top, for liquids collection. To aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies, and operating conditions to be determined. A modular approach was taken because of the iterative nature of some of the design methodologies, with the output of one module serving as the input to the next. Separate modules were developed for determining the biomass ablation rate and for specifying the reactor capacity, the cyclones, the quench column, and the electrostatic precipitator. These models enabled a rigorous design protocol capable of specifying the required reactor and product collection system size for given biomass throughputs, operating conditions, and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity of 12.1 m/s between the heated surface and the reacting biomass particle. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply; the reactor itself is capable of far higher throughput, but this would require a new feeder and drive motor. Modelling showed that the reactor can achieve a throughput of approximately 30 kg/hr, an area that should be considered in future work, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second; operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned owing to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry-wood-fed basis. The liquid yield would likely have been higher had there been sufficient development time to overcome certain operational difficulties, and had longer runs been attempted to offset the product losses that arise when collecting all available product from a large-scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency above 99% on a mass basis, and this was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which allowed the mass of condensable product leaving the unit to be measured and confirmed a collection efficiency in excess of 99% on a mass basis.
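A hedged sketch of the modular design chain described above, in which each module's output feeds the next; the formulas and numbers are placeholders for illustration, not the thesis's actual design correlations.

```python
# Module 1: mass rate consumed at the heated surface (illustrative).
def ablation_module(ablation_rate_mm_s: float, contact_area_cm2: float):
    wood_density = 0.5  # g/cm^3, assumed for pine
    g_per_s = ablation_rate_mm_s / 10 * contact_area_cm2 * wood_density
    return g_per_s * 3.6  # convert g/s to kg/hr

# Module 2: size the reactor for the requested throughput (placeholder).
def reactor_module(throughput_kg_hr: float):
    return {"throughput_kg_hr": throughput_kg_hr, "design_margin": 1.5}

capacity = ablation_module(ablation_rate_mm_s=0.63, contact_area_cm2=220)
spec = reactor_module(capacity)   # output of one module feeds the next
print(f"{capacity:.1f} kg/hr -> {spec}")
```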
Abstract:
The investigation of insulation debris generation, transport, and sedimentation becomes important with regard to reactor safety research for PWR and BWR, when considering the long-term behavior of emergency core cooling systems during all types of loss-of-coolant accidents (LOCA). The insulation debris released near the break during a LOCA incident consists of a mixture of disparate particle populations that vary in size, shape, consistency, and other properties. Some fractions of the released insulation debris can be transported into the reactor sump, where it may perturb or impinge on the emergency core cooling systems. Open questions of generic interest are the sedimentation of the insulation debris in a water pool, its possible re-suspension and transport in the sump water flow, and the particle load on strainers and the corresponding pressure drop. A joint research project on such questions is being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Forschungszentrum Dresden-Rossendorf. The project deals with the experimental investigation of particle transport phenomena in coolant flow and the development of CFD models for its description. While the experiments are performed at the University of Applied Sciences Zittau/Görlitz, the theoretical modeling efforts are concentrated at Forschungszentrum Dresden-Rossendorf. In the current paper the basic concepts for CFD modeling are described, and feasibility studies including the conceptual design of the experiments are presented. © 2009 Elsevier B.V. All rights reserved.
Abstract:
The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBAs) that benefits most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. The purpose of this paper is to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data considered useful for defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies forecasting the impact of climate change on protected areas or on the ecological niche of an African bird.
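A minimal client-side sketch of invoking a WPS such as eHabitat, using the OWSLib library; the endpoint URL, process identifier, and input name below are placeholders to be replaced with values from the real service's GetCapabilities and DescribeProcess responses.

```python
# Query a WPS endpoint and execute one process with one input.
from owslib.wps import WebProcessingService

wps = WebProcessingService("https://example.org/wps")  # placeholder URL
wps.getcapabilities()
print([p.identifier for p in wps.processes])  # list offered processes

# Execute a (hypothetical) habitat-similarity process for one area.
execution = wps.execute("ehabitat", inputs=[("ProtectedArea", "12345")])
print(execution.status)
```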
Abstract:
The focus of this study is the development of a parallelised version of severely sequential, iterative numerical algorithms on a multi-threaded parallel platform such as a graphics processing unit (GPU). This requires the design and development of a platform-specific numerical solution that can benefit from the parallel capabilities of the chosen platform. A GPU was chosen as the parallel platform for the design and development of a numerical solution for a specific physical model in non-linear optics. This problem arises in describing ultra-short pulse propagation in bulk transparent media, which has recently been the subject of several theoretical and numerical studies. The mathematical model describing this phenomenon is challenging and complex, and its numerical modeling is limited on current workstations. Numerical modeling of this problem requires the parallelisation of essentially serial algorithms and the elimination of numerical bottlenecks. The main challenge to overcome is the parallelisation of the globally non-local mathematical model. This thesis presents a numerical solution that eliminates the numerical bottleneck associated with the non-local nature of the mathematical model. The accuracy and performance of the parallel code are verified by back-to-back testing against a similar serial version.
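A hedged sketch of one standard way to handle a globally non-local term on a GPU (not necessarily the thesis's method): if the non-local response has convolution form, FFTs turn the all-to-all coupling that serialises a naive loop into element-wise products that parallelise well. CuPy is used here as a stand-in platform, and the field and kernel are placeholders.

```python
# Convolution-type non-local term evaluated via GPU FFTs with CuPy.
import cupy as cp

n = 1 << 16
field = cp.random.standard_normal(n) + 1j * cp.random.standard_normal(n)
kernel = cp.exp(-cp.linspace(0, 10, n))   # placeholder response kernel

# Non-local term: circular convolution of kernel and field via FFT,
# replacing an O(n^2) serial accumulation with parallel transforms.
nonlocal_term = cp.fft.ifft(cp.fft.fft(field) * cp.fft.fft(kernel))
print(float(cp.abs(nonlocal_term).max()))
```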