466 results for Modeling Development
at Queensland University of Technology - ePrints Archive
Abstract:
Objective: The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Background: Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively yet are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. Method: A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Results: Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. Conclusion: The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Application: Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
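A minimal sketch of the underlying approach, assuming a random-intercept multilevel model and a 90% prediction interval of the kind described above. The variable names (traffic_count, sector, workload) and the simulated data are illustrative stand-ins for the paper's dynamic density metrics and air traffic data.

```python
# Multilevel workload model with a 90% prediction interval (illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sectors, n_obs = 8, 40
df = pd.DataFrame({
    "sector": np.repeat(np.arange(n_sectors), n_obs),
    "traffic_count": rng.poisson(12, n_sectors * n_obs),
})
sector_effect = rng.normal(0.0, 1.0, n_sectors)[df["sector"]]
df["workload"] = 2 + 0.4 * df["traffic_count"] + sector_effect + rng.normal(0, 1, len(df))

# Random intercept per work unit (sector) captures between-unit variability.
model = smf.mixedlm("workload ~ traffic_count", df, groups=df["sector"]).fit()

# Approximate 90% prediction interval for a new observation:
# fixed-effect prediction +/- 1.645 * total (between-unit + within-unit) SD.
pred = model.params["Intercept"] + model.params["traffic_count"] * df["traffic_count"]
total_sd = np.sqrt(model.cov_re.iloc[0, 0] + model.scale)
upper = pred + 1.645 * total_sd
print("share of reports above the upper bound:", np.mean(df["workload"] > upper))
```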
Abstract:
In the early stages of design and modeling, computers and computer applications are often considered an obstacle rather than a facilitator of the process. Most notably, brainstorming, process modeling with business experts, and development planning are often performed by a team in front of a whiteboard. While "whiteboarding" is recognized as an effective tool, low-tech solutions that allow remote participants to contribute are still not generally available. This is a striking observation, considering that the vast majority of teams in large organizations are distributed. This observation was also one of the key triggers behind the project described in this article, in which a team of corporate researchers set out to identify state-of-the-art technologies that could facilitate the scenario mentioned above. This paper is an account of a research project in the area of enterprise collaboration, with a strong focus on aspects of human-computer interaction in mixed-mode environments, especially in areas of collaboration where computers still play a secondary role. It describes a corporate research project that is currently in progress.
Abstract:
Process modeling is an emergent area of Information Systems research characterized by an abundance of conceptual work and little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions. First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used in further empirical studies of phenomena in the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
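The instrument items themselves are not reproduced in the abstract, but reliability of the kind reported here is conventionally checked with statistics such as Cronbach's alpha over item responses. A self-contained sketch with simulated Likert-style data:

```python
# Cronbach's alpha as a reliability check for a measurement instrument.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))                          # common factor
responses = latent + rng.normal(scale=0.8, size=(200, 6))   # six correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")           # > 0.7 is commonly read as reliable
```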
Abstract:
If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
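As an illustration of the structural similarity the authors note, the sketch below implements a minimal two-pool soil carbon model with first-order decay; the pool names, rate constants, and litter input are hypothetical and are not drawn from any of the six models compared.

```python
# Minimal first-order carbon-pool model (illustrative parameters).
def step_soil_carbon(fast_c, slow_c, litter_in, dt=1.0,
                     k_fast=0.3, k_slow=0.02, transfer=0.4):
    """Advance a two-pool soil C model one time step (units e.g. t C/ha, years)."""
    fast_decay = k_fast * fast_c * dt
    slow_decay = k_slow * slow_c * dt
    fast_c += litter_in * dt - fast_decay
    slow_c += transfer * fast_decay - slow_decay
    co2_flux = (1 - transfer) * fast_decay + slow_decay   # respired carbon
    return fast_c, slow_c, co2_flux

fast, slow = 5.0, 50.0
for year in range(3):
    fast, slow, co2 = step_soil_carbon(fast, slow, litter_in=2.0)
    print(year, round(fast, 2), round(slow, 2), round(co2, 2))
```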
Abstract:
In Service-Oriented Architectures (SOAs), software systems are decomposed into independent units, namely services, that interact with one another through message exchanges. To promote reuse and evolvability, these interactions are explicitly described right from the early phases of the development lifecycle. Up to now, emphasis has been placed on capturing structural aspects of service interactions. Gradually though, the description of behavioral dependencies between service interactions is gaining increasing attention as a means to push forward the SOA vision. This paper deals with the description of these behavioral dependencies during the analysis and design phases. The paper outlines a set of requirements that a language for modeling service interactions at this level should fulfill, and proposes a language whose design is driven by these requirements.
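The proposed language itself is not reproduced in the abstract; the hypothetical sketch below only illustrates the underlying idea of recording behavioral dependencies, here simple precedence constraints, between message exchanges.

```python
# Hypothetical representation of service interactions and their
# behavioral (ordering) dependencies; not the paper's notation.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Interaction:
    sender: str
    receiver: str
    message: str

@dataclass
class InteractionModel:
    interactions: list = field(default_factory=list)
    precedes: list = field(default_factory=list)  # (before, after) constraints

    def add(self, i: Interaction, after: Interaction | None = None):
        self.interactions.append(i)
        if after is not None:
            self.precedes.append((after, i))

order = Interaction("Customer", "Supplier", "PurchaseOrder")
confirm = Interaction("Supplier", "Customer", "OrderConfirmation")
model = InteractionModel()
model.add(order)
model.add(confirm, after=order)   # confirmation may only follow the order
```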
Abstract:
This paper develops a methodology and strategy for concurrent finite element modeling of civil infrastructure at different scale levels for analyses of structural deterioration. The modeling strategy and method were investigated to develop a concurrent multi-scale model of structural behavior (CMSM-of-SB) in which the global structural behavior and the nonlinear damage features of local details in a large, complicated structure can be analyzed concurrently, meeting the needs of structural-state evaluation as well as deterioration analysis. In the proposed method, "large-scale" modeling is adopted for the global structure, whose stress-strain response is linear, while "small-scale" modeling is used for nonlinear damage analyses of the local welded details. A longitudinal truss in a steel bridge deck was selected as a case study of how a CMSM-of-SB is developed. A reduced-scale specimen of the longitudinal truss was studied in the laboratory to measure its dynamic and static behavior at the global truss and the local welded details, while multi-scale models using constraint equations and substructuring were developed for numerical simulation. A comparison of the dynamic and static responses calculated by the different models indicated that the proposed multi-scale model was the most efficient and accurate. Verification of the model against results from the tested truss under specific loading showed that responses at the material scale in the vicinity of local details, as well as global structural behavior, could be obtained and fitted the measured results well. The proposed concurrent multi-scale modeling strategy and implementation procedures were applied to the Runyang cable-stayed bridge (RYCB), for which a CMSM-of-SB of the bridge deck system was constructed as a practical application.
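A small sketch of the constraint-equation idea mentioned above, assuming the common master-slave formulation in which fine-scale boundary DOFs are written as linear combinations of coarse-scale DOFs and the stiffness matrix is condensed through the constraint transformation; the 3-DOF system and interpolation weights are illustrative only.

```python
# Master-slave constraint coupling: u = T @ u_m, K condensed to master DOFs.
import numpy as np

K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  4.0]])
f = np.array([0.0, 1.0, 0.0])

# Constraint equation: slave DOF u2 = 0.5*u0 + 0.5*u1 (interpolated from masters).
T = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])

K_red = T.T @ K @ T              # condensed stiffness in master DOFs
f_red = T.T @ f
u_m = np.linalg.solve(K_red, f_red)
u = T @ u_m                      # recover the full displacement vector
print(u)
```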
Abstract:
This paper is a continuation of "Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy," with emphasis on updating and verifying the developed concurrent multi-scale model. The sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters, as well as updating procedures for the multi-scale model, were investigated based on a sensitivity analysis of the selected model parameters. Experimental modal data and static responses, in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic response- and static response-oriented model updating, respectively. The updated multi-scale model was then verified to act as the baseline model, taken to be the finite element model closest to the real state of the structure and available for subsequent numerical simulation. A comparison of the dynamic and static responses calculated by the final model with measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification are finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as a practical engineering application of the proposed procedures.
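A minimal sketch of sensitivity-based parameter updating, assuming a Gauss-Newton style iteration with a finite-difference sensitivity matrix; the one-parameter "model" below stands in for a finite element solve whose outputs would be modal frequencies and stresses.

```python
# Sensitivity-based model updating: iterate parameters to match reference data.
import numpy as np

def model_response(theta):
    # Stand-in for an FE solve returning, e.g., natural frequencies.
    return np.array([2.0 * np.sqrt(theta[0]), 5.0 * np.sqrt(theta[0])])

measured = np.array([2.2, 5.5])      # reference (measured) data
theta = np.array([1.0])              # initial parameter estimate
for _ in range(10):
    r = measured - model_response(theta)   # residual
    eps = 1e-6                             # finite-difference sensitivity matrix
    S = (model_response(theta + eps) - model_response(theta))[:, None] / eps
    dtheta, *_ = np.linalg.lstsq(S, r, rcond=None)
    theta = theta + dtheta
print("updated parameter:", theta)         # converges to about 1.21
```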
Abstract:
New mathematical models relating four key techno-economic indexes of highway rapid passenger through transportation were established based on the principles of transportation economics. Research on feasible solutions for the associated parameters, compared against actual values, revealed limitations in the existing transport organization method. To overcome these, two new transport organization methods, namely the CD (Collecting and Distributing) method and the Relay method, are proposed. Further research was carried out to examine their characteristics, such as feasibility, operation flows, and fields of applicability. The analysis shows that the two methods can offset the shortcomings of rapid passenger through transportation. To ensure that highway rapid passenger transport develops harmoniously, three-stage development targets are suggested for integrating the different organization methods.
Abstract:
This chapter discusses reference modelling languages for business systems analysis and design. In particular, it reports on reference models in the context of the design-for/by-reuse paradigm, explains how traditional modelling techniques fail to provide the conceptual expressiveness needed for easy model reuse by configuration or adaptation, and elaborates on the need for reference modelling languages to be configurable. We discuss requirements for, and the development of, reference modelling languages that reflect the need for configurability. As an example, we report on the development, definition, and configuration of configurable event-driven process chains. We further outline how configurable reference modelling languages and the corresponding design principles can be used in future scenarios such as process mining and data modelling.
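As a sketch of the configurability idea, the hypothetical code below restricts a configurable connector to a less expressive type when a reference model is adapted to one organization; the restriction table follows commonly cited rules for configurable EPC connectors, but the chapter's formal notation is not reproduced here.

```python
# Configurable connector: a reference-model OR may be restricted on configuration.
from dataclasses import dataclass

# Assumed restriction options: OR -> OR/XOR/AND/SEQ, XOR -> XOR/SEQ, AND -> AND.
ALLOWED = {"OR": {"OR", "XOR", "AND", "SEQ"},
           "XOR": {"XOR", "SEQ"},
           "AND": {"AND"}}

@dataclass
class ConfigurableConnector:
    kind: str                        # connector type in the reference model

    def configure(self, choice: str) -> str:
        if choice not in ALLOWED[self.kind]:
            raise ValueError(f"{self.kind} cannot be restricted to {choice}")
        return choice

split = ConfigurableConnector("OR")
print(split.configure("XOR"))        # a valid restriction of the reference model
```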
Abstract:
Since the 1960s, numerous studies on problem solving have revealed the complexity of the domain and the difficulty in translating research findings into practice. The literature suggests that the impact of problem solving research on the mathematics curriculum has been limited. Furthermore, our accumulation of knowledge on the teaching of problem solving is lagging. In this first discussion paper we initially present a sketch of 50 years of research on mathematical problem solving. We then consider some factors that have held back problem solving research over the past decades and offer some directions for how we might advance the field. We stress the urgent need to take into account the nature of problem solving in various arenas of today’s world and to accordingly modernize our perspectives on the teaching and learning of problem solving and of mathematical content through problem solving. Substantive theory development is also long overdue—we show how new perspectives on the development of problem solving expertise can contribute to theory development in guiding the design of worthwhile learning activities. In particular, we explore a models and modeling perspective as an alternative to existing views on problem solving.
Abstract:
This paper is the second in a pair that Lesh, English, and Fennewald will present at ICME TSG 19 on Problem Solving in Mathematics Education. The first paper describes three shortcomings of past research on mathematical problem solving. The first shortcoming can be seen in the fact that knowledge has not accumulated; in fact, it has atrophied significantly during the past decade, as unsuccessful theories continue to be recycled and embellished. One reason for this is that researchers generally have failed to develop the research tools needed to reliably observe, document, and assess the development of the concepts and abilities that they claim are important. The second shortcoming is that existing theories and research have failed to make clear how concept development (or the development of basic skills) is related to the development of problem solving abilities, especially when attention shifts beyond the word problems found in school to the kinds of problems found outside school, where the requisite skills and even the questions to be asked may not be known in advance. The third shortcoming has to do with inherent weaknesses in observational studies and teaching experiments, and with the assumption that a single grand theory should be able to describe all of the conceptual systems, instructional systems, and assessment systems that are strongly molded and shaped by the same theoretical perspectives being used to develop them. This paper therefore describes theoretical perspectives and methodological tools that are proving effective in combating the preceding kinds of shortcomings. We refer to our theoretical framework as models & modeling perspectives (MMP) on problem solving (Lesh & Doerr, 2003), learning, and teaching. One of the main methodologies of MMP is called multi-tier design studies (MTD).
Abstract:
This study aimed to develop and assess the reliability and validity of a pair of self-report questionnaires to measure self-efficacy and expectancy associated with benzodiazepine use: the Benzodiazepine Refusal Self-Efficacy Questionnaire (BRSEQ) and the Benzodiazepine Expectancy Questionnaire (BEQ). The internal structure of the questionnaires was established by principal component analysis (PCA) in a sample of 155 respondents and verified by confirmatory factor analysis (CFA) in a second independent sample (n = 139) using structural equation modeling. The PCA of the BRSEQ resulted in a 16-item, 4-factor scale, and the BEQ formed an 18-item, 2-factor scale. Both scales were internally reliable. CFA confirmed these internal structures and reduced the questionnaires to a 14-item self-efficacy scale and a 12-item expectancy scale. Lower self-efficacy and higher expectancy were moderately associated with higher scores on the SDS-B. The scales provide reliable measures for assessing benzodiazepine self-efficacy and expectancies. Future research will examine the utility of the scales in the prospective prediction of benzodiazepine cessation.
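A self-contained sketch of the PCA step, assuming scikit-learn and simulated item scores in place of the study's respondent data (the real analyses used the samples of n = 155 and n = 139).

```python
# Principal component analysis of questionnaire item scores (illustrative).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
factors = rng.normal(size=(155, 2))               # two latent factors
loadings = rng.uniform(0.5, 0.9, size=(2, 18))
items = factors @ loadings + rng.normal(scale=0.5, size=(155, 18))

pca = PCA().fit(items)
# Components are commonly retained by explained variance, eigenvalue > 1,
# or scree inspection; the first two dominate in this simulated data.
print(pca.explained_variance_ratio_[:4].round(2))
```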
Abstract:
The accuracy of data derived from linked-segment models depends on how well the system has been represented. Previous investigations describing the gait of persons with partial foot amputation did not account for the unique anthropometry of the residuum or the inclusion of a prosthesis and footwear in the model and, as such, are likely to have underestimated the magnitude of the peak joint moments and powers. This investigation determined the effect of inaccuracies in the anthropometric input data on the kinetics of gait. Toward this end, a geometric model was developed and validated to estimate body segment parameters of various intact and partial feet. These data were then incorporated into customized linked-segment models, and the kinetic data were compared with that obtained from conventional models. Results indicate that accurate modeling increased the magnitude of the peak hip and knee joint moments and powers during terminal swing. Conventional inverse dynamic models are sufficiently accurate for research questions relating to stance phase. More accurate models that account for the anthropometry of the residuum, prosthesis, and footwear better reflect the work of the hip extensors and knee flexors to decelerate the limb during terminal swing phase.
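A sketch of the inverse-dynamics step whose accuracy depends on those body segment parameters, assuming a standard 2D Newton-Euler formulation for a single segment; the mass, inertia, and geometry below are placeholders rather than values from the validated geometric model.

```python
# Net proximal joint force/moment for one segment (2D Newton-Euler).
import numpy as np

def proximal_joint_load_2d(m, I, a_com, alpha, r_prox, r_dist,
                           F_dist, M_dist, g=np.array([0.0, -9.81])):
    """Force/moment balance about the segment centre of mass (z-moment, 2D)."""
    F_prox = m * (a_com - g) - F_dist
    cross = lambda r, F: r[0] * F[1] - r[1] * F[0]
    M_prox = I * alpha - M_dist - cross(r_dist, F_dist) - cross(r_prox, F_prox)
    return F_prox, M_prox

# Terminal swing: no distal ground reaction force on the foot segment.
F_p, M_p = proximal_joint_load_2d(
    m=1.2, I=0.01, a_com=np.array([1.0, -2.0]), alpha=5.0,
    r_prox=np.array([0.0, 0.2]), r_dist=np.array([0.0, -0.2]),
    F_dist=np.zeros(2), M_dist=0.0)
print(F_p, M_p)
```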
Abstract:
Scoliosis is a three-dimensional spinal deformity which requires surgical correction in progressive cases. In order to optimize correction and avoid complications following scoliosis surgery, patient-specific finite element models (FEM) are being developed and validated by our group. In this paper, the modeling methodology is described and two clinically relevant load cases are simulated for a single patient. First, a pre-operative patient flexibility assessment, the fulcrum bending radiograph, is simulated to assess the model's ability to represent spine flexibility. Second, intra-operative forces during single-rod anterior correction are simulated. Clinically, the patient had an initial Cobb angle of 44 degrees, which reduced to 26 degrees during fulcrum bending. Surgically, the coronal deformity corrected to 14 degrees. The simulated initial Cobb angle was 40 degrees, which reduced to 23 degrees under the fulcrum bending load case. The simulated surgical procedure corrected the coronal deformity to 14 degrees. The computed results for the patient-specific FEM are within the accepted clinical Cobb measuring error of 5 degrees, suggesting that this modeling methodology is capable of capturing the biomechanical behaviour of a scoliotic human spine during anterior corrective surgery.
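A small sketch of the Cobb angle measurement used to compare simulation and radiographs, assuming the angle is taken between the direction vectors of the two most-tilted endplates in the coronal plane; the vectors below are made up for illustration.

```python
# Cobb angle from two endplate direction vectors (coronal plane).
import numpy as np

def cobb_angle(upper_endplate, lower_endplate):
    """Angle (degrees) between two endplate direction vectors."""
    u = upper_endplate / np.linalg.norm(upper_endplate)
    v = lower_endplate / np.linalg.norm(lower_endplate)
    return np.degrees(np.arccos(np.clip(abs(u @ v), -1.0, 1.0)))

# Endplates tilted +22 and -22 degrees from horizontal -> Cobb angle of ~44 deg.
t = np.radians(22)
upper = np.array([np.cos(t),  np.sin(t)])
lower = np.array([np.cos(-t), np.sin(-t)])
print(round(cobb_angle(upper, lower), 1))
```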