993 results for 120499 Engineering Design not elsewhere classified


Relevance:

100.00%

Publisher:

Abstract:

The Virtual Learning Environment (VLE) is one of the fastest growing areas in educational technology research and development. In order to achieve learning effectiveness, ideal VLEs should be able to identify learning needs and customize solutions, with or without an instructor to supplement instruction. Such systems are called Personalized VLEs (PVLEs). For PVLEs to succeed, comprehensive conceptual models corresponding to PVLEs are essential. Such conceptual modeling development is important because it facilitates early detection and correction of system development errors. Therefore, in order to capture PVLE knowledge explicitly, this paper focuses on the development of conceptual models for PVLEs, including models of knowledge primitives in terms of learner, curriculum, and situational models, models of VLEs on general pedagogical bases, and, in particular, the definition of the ontology of PVLEs on the constructivist pedagogical principle. Based on those comprehensive conceptual models, a prototype multiagent-based PVLE has been implemented. A field experiment was conducted to investigate learning achievements by comparing personalized and non-personalized systems. The result indicates that the PVLE we developed under our comprehensive ontology successfully provides significant learning achievements. These comprehensive models also provide a solid knowledge representation framework for PVLE development practice, guiding the analysis, design, and development of PVLEs. (c) 2005 Elsevier Ltd. All rights reserved.
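As a purely hypothetical illustration of how the three knowledge primitives named above (learner, curriculum, and situational models) might be rendered in software, the sketch below represents them as simple record types and applies a toy personalization rule. All field names and the selection logic are assumptions for illustration, not the paper's ontology.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical rendering of the three knowledge primitives as record types;
# field names are invented for illustration, not taken from the paper's ontology.

@dataclass
class LearnerModel:
    learner_id: str
    prior_knowledge: Dict[str, float]     # concept -> mastery estimate in [0, 1]
    preferences: Dict[str, str]           # e.g. {"media": "video"}

@dataclass
class CurriculumModel:
    concepts: List[str]                   # ordered learning objectives
    prerequisites: Dict[str, List[str]]   # concept -> required prior concepts

@dataclass
class SituationalModel:
    device: str                           # delivery context, e.g. "desktop"
    instructor_present: bool
    session_length_min: int

def next_concept(learner: LearnerModel, curriculum: CurriculumModel,
                 mastery_threshold: float = 0.7) -> str:
    """Pick the first concept whose prerequisites the learner has mastered but
    which the learner has not yet mastered -- a stand-in for the constructivist
    personalization rules the paper formalizes in its ontology."""
    for c in curriculum.concepts:
        mastered = learner.prior_knowledge.get(c, 0.0) >= mastery_threshold
        prereqs_ok = all(learner.prior_knowledge.get(p, 0.0) >= mastery_threshold
                         for p in curriculum.prerequisites.get(c, []))
        if not mastered and prereqs_ok:
            return c
    return "review"
```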

Relevance:

100.00%

Publisher:

Abstract:

Nucleation is the first stage in any granulation process where binder liquid first comes into contact with the powder. This paper investigates the nucleation process where binder liquid is added to a fine powder with a spray nozzle. The dimensionless spray flux approach of Hapgood et al. (Powder Technol. 141 (2004) 20) is extended to account for nonuniform spray patterns and allow for overlap of nuclei granules rather than spray drops. A dimensionless nuclei distribution function which describes the effects of the design and operating parameters of the nucleation process (binder spray characteristics, the nucleation area ratio between droplets and nuclei and the powder bed velocity) on the fractional surface area coverage of nuclei on a moving powder bed is developed. From this starting point, a Monte Carlo nucleation model that simulates full nuclei size distributions as a function of the design and operating parameters that were implemented in the dimensionless nuclei distribution function is developed. The nucleation model was then used to investigate the effects of the design and operating parameters on the formed nuclei size distributions and to correlate these effects to changes of the dimensionless nuclei distribution function. Model simulations also showed that it is possible to predict nuclei size distributions beyond the drop controlled nucleation regime in Hapgood's nucleation regime map. Qualitative comparison of model simulations and experimental nucleation data showed similar shapes of the nuclei size distributions. In its current form, the nucleation model can replace the nucleation term in one-dimensional population balance models describing wet granulation processes. Implementation of more sophisticated nucleation kinetics can make the model applicable to multi-dimensional population balance models.
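A minimal Monte Carlo sketch of the nucleation picture described above follows. It uses the drop-based dimensionless spray flux of Hapgood et al., psi_a = 3*Vdot / (2*Adot*d_d), an assumed nucleation area ratio K_a between nucleus and drop footprints, and merges nuclei whose circular footprints overlap on the strip of powder passing under the spray. All parameter values are invented for illustration, and the overlap rule is a simplification of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative operating parameters (invented, not taken from the paper)
V_dot = 1.0e-7     # binder volumetric spray rate, m^3/s
d_d   = 500e-6     # volume-mean drop diameter, m
width = 0.05       # spray zone width across the powder bed, m
v_bed = 0.04       # powder bed surface velocity, m/s
K_a   = 2.0        # assumed nucleation area ratio (nucleus area / drop area)
t_sim = 1.0        # simulated spray time, s

# Drop-based dimensionless spray flux of Hapgood et al.
A_dot = width * v_bed                       # powder surface flux through spray zone, m^2/s
psi_a = 3.0 * V_dot / (2.0 * A_dot * d_d)

# Monte Carlo: place drops at random on the strip of powder that passes
# under the spray; nuclei whose circular footprints overlap are merged.
drop_vol = np.pi / 6.0 * d_d**3
n_drops  = int(V_dot * t_sim / drop_vol)
length   = v_bed * t_sim                    # length of sprayed powder strip, m
x = rng.uniform(0.0, length, n_drops)
y = rng.uniform(0.0, width, n_drops)
r_n = 0.5 * d_d * np.sqrt(K_a)              # nucleus footprint radius

parent = np.arange(n_drops)                 # union-find over overlapping nuclei
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(n_drops):
    for j in np.flatnonzero((x - x[i])**2 + (y - y[i])**2 < (2.0 * r_n)**2):
        ri, rj = find(i), find(int(j))
        if ri != rj:
            parent[rj] = ri

roots = np.array([find(i) for i in range(n_drops)])
_, drops_per_nucleus = np.unique(roots, return_counts=True)

print(f"dimensionless spray flux psi_a = {psi_a:.2f}")
print(f"{n_drops} drops formed {drops_per_nucleus.size} nuclei "
      f"(mean {drops_per_nucleus.mean():.2f} drops/nucleus, max {drops_per_nucleus.max()})")
```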

Relevance:

100.00%

Publisher:

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking. Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data is available. An algorithm approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing different levels of health development. Results: Modelled life expectancies at birth and cause of death structures were within expected ranges based on published estimates for countries at comparable levels of health development. Total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'. Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
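The paper's algorithm is not reproduced here, but the core sample-size reasoning can be sketched with a simple Poisson argument: if deaths from the rarest cause of interest occur at a rate of lambda per person-year, the relative standard error of the estimated rate is roughly 1/sqrt(expected deaths), so the minimum person-years follow directly from the target precision. The rates and the precision target below are illustrative assumptions.

```python
def min_person_years(rate_per_1000_py: float, target_rse: float = 0.2) -> float:
    """Minimum person-years of observation such that a Poisson-distributed
    count of deaths from the rarest cause of interest has a relative
    standard error of at most target_rse (RSE ~ 1/sqrt(expected deaths))."""
    deaths_needed = (1.0 / target_rse) ** 2
    return deaths_needed / (rate_per_1000_py / 1000.0)

# Illustrative death rates (per 1000 person-years) for a hypothetical rarest
# cause in three populations at different levels of health development --
# the values are invented, not taken from the paper.
for label, rate in [("high mortality", 0.50),
                    ("intermediate",   0.20),
                    ("low mortality",  0.05)]:
    py = min_person_years(rate, target_rse=0.2)
    print(f"{label:>14}: {rate:.2f}/1000 py  ->  {py:,.0f} person-years of observation")
```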

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a new control design method is proposed for stable processes which can be described using Hammerstein-Wiener models. The internal model control (IMC) framework is extended to accommodate multiple IMC controllers, one for each subsystem. The concept of passive systems is used to construct the IMC controllers, which approximate the inverses of the subsystems to achieve dynamic control performance. The Passivity Theorem is used to ensure closed-loop stability. (c) 2005 Elsevier Ltd. All rights reserved.
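As a rough illustration of the block structure involved (not the paper's passivity-based construction), the sketch below simulates a simple discrete-time Hammerstein-Wiener plant and an IMC-style controller assembled from one piece per subsystem: an inverse of the output nonlinearity, a filtered inverse of the linear block, and an inverse of the input nonlinearity. The plant, the nonlinearities, and the filter constant are invented for illustration.

```python
import numpy as np

# Minimal discrete-time Hammerstein-Wiener plant (illustrative, not the paper's example):
#   v  = f(u)            static input nonlinearity (Hammerstein part)
#   x+ = a*x + b*v       stable first-order linear dynamics
#   y  = h(x)            static output nonlinearity (Wiener part)
a, b = 0.9, 0.1
f = lambda u: u + 0.2 * u**3            # strictly increasing, hence invertible
h, h_inv = np.tanh, np.arctanh

def f_inv(v, iters=25):
    """Numerical inverse of f via Newton's method."""
    u = v
    for _ in range(iters):
        u -= (u + 0.2 * u**3 - v) / (1.0 + 0.6 * u**2)
    return u

# IMC-style control built from three pieces, one per subsystem:
#   1) invert h to map the setpoint into the linear block's coordinates,
#   2) a filtered inverse of the linear block (IMC filter parameter lam),
#   3) invert f to recover the physical input u.
lam = 0.6            # IMC filter constant: smaller = faster, less robust
r = 0.8              # output setpoint (within the range of h)
z_ref = h_inv(r)     # setpoint as seen by the linear block
x = x_m = z_f = 0.0  # plant state, internal-model state, filtered reference

for k in range(50):
    y = h(x)
    # feedback through the model mismatch (zero here because the model is perfect)
    d_hat = h_inv(np.clip(y, -0.999, 0.999)) - x_m
    z_f = lam * z_f + (1.0 - lam) * (z_ref - d_hat)   # IMC reference filter
    v = (z_f - a * x_m) / b      # exact one-step-ahead inverse of the linear block
    u = f_inv(v)                 # undo the Hammerstein nonlinearity
    x = a * x + b * f(u)         # plant update
    x_m = a * x_m + b * v        # internal linear model update

print(f"setpoint y_r = {r:.2f}, plant output after 50 steps = {h(x):.3f}")
```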

Relevance:

100.00%

Publisher:

Abstract:

This paper describes an ongoing collaboration between Boeing Australia Limited and the University of Queensland to develop and deliver an introductory course on software engineering. The aims of the course are to provide a common understanding of the nature of software engineering for all Boeing Australia's engineering staff, and to ensure they understand the practices used throughout the company. The course is designed so that it can be presented to people with varying backgrounds, such as recent software engineering graduates, systems engineers, quality assurance personnel, etc. The paper describes the structure and content of the course, and the evaluation techniques used to collect feedback from the participants and the corresponding results. The immediate feedback on the course indicates that it has been well received by the participants, but also indicates a need for more advanced courses in specific areas. The long-term feedback from participants is less positive, and the long-term feedback from the managers of the course participants indicates a need to expand on the coverage of the Boeing-specific processes and methods. (C) 2004 Elsevier Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Geospatio-temporal conceptual models provide a mechanism to explicitly represent geospatial and temporal aspects of applications. Such models, which focus on both what and when/where, need to be more expressive than conventional conceptual models (e.g., the ER model), which primarily focus on what is important for a given application. In this study, we view conceptual schema comprehension of geospatio-temporal data semantics in terms of matching the external problem representation (that is, the conceptual schema) to the problem-solving task (that is, syntactic and semantic comprehension tasks), an argument based on the theory of cognitive fit. Our theory suggests that an external problem representation that matches the problem solver's internal task representation will enhance performance, for example, in comprehending such schemas. To assess performance on geospatio-temporal schema comprehension tasks, we conducted a laboratory experiment using two semantically identical conceptual schemas, one of which mapped closely to the internal task representation while the other did not. As expected, we found that the geospatio-temporal conceptual schema that corresponded to the internal representation of the task enhanced the accuracy of schema comprehension; comprehension time was equivalent for both. Cognitive fit between the internal representation of the task and conceptual schemas with geospatio-temporal annotations was, therefore, manifested in accuracy of schema comprehension and not in time for problem solution. Our findings suggest that the annotated schemas facilitate understanding of data semantics represented on the schema.

Relevance:

100.00%

Publisher:

Abstract:

The goal of this manuscript is to introduce a framework for consideration of designs for population pharmacokinetic or pharmacokinetic-pharmacodynamic studies. A standard one-compartment pharmacokinetic model with first-order input and elimination is considered. A series of theoretical designs is considered that explores the influence of optimizing the allocation of sampling times, allocating patients to elementary designs, consideration of sparse sampling and unbalanced designs, and the influence of single- vs. multiple-dose designs. It was found that what appears to be relatively sparse sampling (fewer blood samples per patient than the number of fixed-effect parameters to estimate) can also be highly informative. Overall, it is evident that exploring the population design space can yield many parsimonious designs that are efficient for parameter estimation and that may not otherwise have been considered without the aid of optimal design theory.
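For reference, the structural model referred to here, one compartment with first-order absorption and elimination after a single oral dose, has the familiar closed form C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)) with ke = CL/V. The sketch below evaluates it at a deliberately sparse two-sample design; the parameter values and sampling times are illustrative, not those of the study.

```python
import numpy as np

def conc_one_cpt_oral(t, dose, ka, CL, V, F=1.0):
    """One-compartment model with first-order absorption (ka) and first-order
    elimination (ke = CL/V) after a single oral dose; assumes ka != ke."""
    ke = CL / V
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# A deliberately sparse design: two samples per subject, fewer than the
# three fixed-effect parameters (ka, CL, V).  All numbers are illustrative.
t_sparse = np.array([1.0, 8.0])                       # sampling times, h
c = conc_one_cpt_oral(t_sparse, dose=100.0, ka=1.0, CL=4.0, V=20.0)
print(dict(zip(t_sparse.tolist(), np.round(c, 3))))   # concentration at each time
```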

Relevance:

100.00%

Publisher:

Abstract:

Optimal sampling times are found for a study in which one of the primary purposes is to develop a model of the pharmacokinetics of itraconazole in patients with cystic fibrosis for both capsule and solution doses. The optimal design is expected to produce reliable estimates of population parameters for two different structural PK models. Data collected at these sampling times are also expected to provide the researchers with sufficient information to reasonably discriminate between the two competing structural models.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores the potential for the RAMpage memory hierarchy to use a microkernel with a small memory footprint, held in specialized cache-speed static RAM (tightly-coupled memory, TCM). Dreamy memory is DRAM kept in low-power mode unless referenced. Simulations show that a small microkernel suits RAMpage well, in that it achieves significantly better speed and energy gains from adding TCM than a standard hierarchy does. RAMpage, in its best 128KB L2 case, gained 11% speed using TCM and reduced energy by 14%; equivalent gains for the conventional hierarchy were under 1%. While a 1MB L2 was significantly faster than the lower-energy cases with the smaller L2, the larger SRAM's energy does not justify the speed gain. Using a 128KB L2 cache in a conventional architecture resulted in a best-case overall run time of 2.58s, compared with the best dreamy-mode run time (RAMpage without context switches on misses) of 3.34s, a speed penalty of 29%. Energy in the fastest 128KB L2 case was 2.18J vs. 1.50J, a reduction of 31%. The same RAMpage configuration without dreamy mode took 2.83s as simulated and used 2.39J, an acceptable trade-off (penalty under 10%) for being able to switch easily to a lower-energy mode.
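The quoted percentages can be checked directly from the run times and energies given in the abstract; the short calculation below makes the baselines explicit (the choice of baseline for each comparison is our reading of the text).

```python
# Check the percentage comparisons quoted above against the raw numbers.
conventional_t, dreamy_t = 2.58, 3.34   # best 128KB-L2 run times, s
conventional_e, dreamy_e = 2.18, 1.50   # corresponding energies, J
rampage_t = 2.83                        # same RAMpage config, dreamy mode off, s

speed_penalty      = dreamy_t / conventional_t - 1.0   # ~0.29, the quoted 29%
energy_saving      = 1.0 - dreamy_e / conventional_e   # ~0.31, the quoted 31%
non_dreamy_penalty = rampage_t / conventional_t - 1.0  # ~0.10, the quoted "under 10%"

print(f"dreamy-mode speed penalty : {speed_penalty:.1%}")
print(f"dreamy-mode energy saving : {energy_saving:.1%}")
print(f"non-dreamy RAMpage penalty: {non_dreamy_penalty:.1%}")
```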

Relevance:

100.00%

Publisher:

Abstract:

Background: Methodological challenges such as recruitment problems and participant burden make clinical trials in palliative care difficult. In 2001-2004, two community-based randomized controlled trials (RCTs) of case conferences in palliative care settings were independently conducted in Australia: the Queensland Case Conferences trial (QCC) and the Palliative Care Trial (PCT). Design: A structured comparative study of the QCC and PCT was conducted, organized by known practical and organizational barriers to clinical trials in palliative care. Results: Differences in funding dictated study designs and recruitment success; PCT had 6 times the budget of QCC. Sample size attainment: only PCT achieved the sample size goal; QCC focused on reducing attrition through gatekeeping while PCT maximized participation through detailed recruitment strategies and planned for significant attrition. Testing sustainable interventions: QCC achieved a higher percentage of planned case conferences; the QCC strategy required minimal extra work for clinicians while PCT superimposed conferences on normal work schedules. Minimizing participant burden: differing strategies of data collection were implemented to reduce participant burden; QCC had short survey instruments, while PCT incorporated all data collection into normal clinical nursing encounters. Other: both studies had acceptable withdrawal rates, intention-to-treat analyses are planned, and both studies included substudies to validate new outcome measures. Conclusions: Health service interventions in palliative care can be studied using RCTs. Detailed comparative information on strategies, successes and challenges can inform the design of future trials. Key lessons include adequate funding, a focus on recruitment, sustainable interventions, and mechanisms to minimize participant burden.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, numerical simulations are used in an attempt to find optimal source profiles for high-frequency radiofrequency (RF) volume coils. Biologically loaded, shielded/unshielded circular and elliptical birdcage coils operating at 170 MHz, 300 MHz and 470 MHz are modelled using the FDTD method for both 2D and 3D cases. Taking advantage of the fact that some aspects of the electromagnetic system are linear, two approaches have been proposed for the determination of the drives for individual elements in the RF resonator. The first method is an iterative optimization technique with a kernel for the evaluation of RF fields inside an imaging plane of a human head model using pre-characterized sensitivity profiles of the individual rungs of a resonator; the second method is a regularization-based technique. In the second approach, a sensitivity matrix is explicitly constructed and a regularization procedure is employed to solve the ill-posed problem. Test simulations show that both methods can improve the B1-field homogeneity in both focused and non-focused scenarios. While the regularization-based method is more efficient, the first optimization method is more flexible as it can take into account other issues such as controlling SAR or reshaping the resonator structures. It is hoped that these schemes and their extensions will be useful for the determination of multi-element RF drives in a variety of applications.
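The regularization-based approach corresponds to a standard Tikhonov-regularized least-squares problem for the complex drive weights of the rungs. The sketch below solves it with a random stand-in for the FDTD-derived sensitivity matrix; the matrix dimensions, the homogeneous target field, and the lambda values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: 8 rungs, 500 voxels in the imaging plane.
# Column j of the sensitivity matrix S holds the (complex) B1 field produced
# in the plane by unit drive on rung j.  S is random here; in practice it
# would come from FDTD simulations of the loaded coil.
n_vox, n_rungs = 500, 8
S = rng.standard_normal((n_vox, n_rungs)) + 1j * rng.standard_normal((n_vox, n_rungs))
b_target = np.ones(n_vox, dtype=complex)       # desired homogeneous B1 over the plane

def regularized_drives(S, b, lam):
    """Tikhonov-regularized least squares for the rung drive vector w:
    minimize ||S w - b||^2 + lam * ||w||^2 via the normal equations."""
    A = S.conj().T @ S + lam * np.eye(S.shape[1])
    return np.linalg.solve(A, S.conj().T @ b)

for lam in (1e-3, 1e-1, 1e1):
    w = regularized_drives(S, b_target, lam)
    nrmse = np.linalg.norm(S @ w - b_target) / np.linalg.norm(b_target)
    print(f"lambda = {lam:7.3f}: field NRMSE = {nrmse:.3f}, "
          f"max |drive| = {np.max(np.abs(w)):.3f}")
```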

Relevance:

100.00%

Publisher:

Abstract:

Understanding the interfacial interactions between the nanofiller and the polymer matrix is important for improving the design and manufacture of polymer nanocomposites. This paper reports a molecular dynamics study on the interfacial interactions and structure of a clay-based polyurethane intercalated nanocomposite. The results show that the intercalation of surfactant (i.e. dioctadecyldimethyl ammonium) and polyurethane (PU) into the nanoconfined gallery of the clay leads to a multilayer structure for both the surfactant and the PU, and to the absence of phase separation for the PU chains. These structural characteristics are attributed to competitive interactions among the surfactant, the PU and the clay surface, including van der Waals, electrostatic and hydrogen-bonding interactions.

Relevance:

100.00%

Publisher:

Abstract:

Virus-like particles (VLPs) are of interest in vaccination, gene therapy and drug delivery, but their potential has yet to be fully realized. This is because existing laboratory processes, when scaled, do not easily give a compositionally and architecturally consistent product. Research suggests that new process routes might ultimately be based on chemical processing by self-assembly, involving the precision manufacture of precursor capsomeres followed by in vitro VLP self-assembly and scale-up to required levels. A synergistic interaction of biomolecular design and bioprocess engineering (i.e. biomolecular engineering) is required if these alternative process routes and, thus, the promise of new VLP products, are to be realized.