955 results for Interpretative structural modeling
Abstract:
Numerous research studies have evaluated whether distance learning is a viable alternative to traditional learning methods. These studies have generally used cross-sectional surveys to collect data, comparing distance with traditional learners with the intent of validating the former as a viable educational tool. Inherent fundamental differences between traditional and distance learning pedagogies, however, reduce the reliability of these comparative studies and constrain the validity of analyses resulting from this approach. This article presents the results of a research project undertaken to analyze the expectations and experiences of distance learners with their degree programs. Students were given surveys designed to examine factors expected to affect their overall value assessment of their distance learning program. Multivariate statistical analyses were used to examine the correlations among variables of interest to support hypothesized relationships among them. Focusing on distance learners overcomes some of the limitations of assessments that compare off- and on-campus student experiences. Evaluation and modeling of distance learner responses on perceived value for money of the distance education they received indicate that the two most important influences are course communication requirements, which had a negative effect, and course logistical simplicity, which had a positive effect. Combined, these two factors accounted for approximately 47% of the variability in the sampled students' perceived value for money of their educational program. A detailed focus on comparing the expectations and outcomes of distance learners complements the existing literature, which is dominated by comparative studies of distance and nondistance learners.
Abstract:
A total of 1,625 tornadoes occurred in the United States in 2011, resulting in economic losses that exceeded $25 billion. Two tornado outbreaks stand out because they caused more than half of those losses. The tornadoes that cut through Tuscaloosa, Alabama, on April 27 and Joplin, Missouri, on May 22 were responsible for a combined 223 fatalities and more than 13,000 damaged buildings in the two cities. Although the economic losses associated with tornado damage are well documented, the writers argue that the overall impact should encompass longer term, broader considerations such as the social disruption and psychological effects that impact communities. This paper examines observations by tornado damage assessment teams led by the first author in these two medium-sized cities and suggests that the evolution of building codes and past approaches to construction have led to conditions that made this extent of damage possible. The authors outline a multidisciplinary path forward that incorporates engineering research and social and economic studies into a new design paradigm leading to building code changes and social practices that will improve resistance and mitigate future losses at a community level from tornadoes.
Abstract:
This paper presents the blast response, damage mechanism and evaluation of residual load capacity of a concrete–steel composite (CSC) column using dynamic computer simulation techniques. This study is an integral part of a comprehensive research program which investigated the vulnerability of structural framing systems to catastrophic and progressive collapse under blast loading and is intended to provide design information on blast mitigation and safety evaluation of load bearing vulnerable columns that are key elements in a building. The performance of the CSC column is compared with that of a reinforced concrete (RC) column with the same dimensions and steel ratio. Results demonstrate the superior performance of the CSC column, compared to the RC column in terms of residual load carrying capacity, and its potential for use as a key element in structural systems. The procedure and results presented herein can be used in the design and safety evaluation of key elements of multi-storey buildings for mitigating the impact of blast loads.
Abstract:
With a view to minimising spiralling labour costs, the concrete masonry industry is developing thin layer mortar technology (known as thin bed technology) in collaboration with Queensland University of Technology. Similar technologies are practised in Europe, mainly for clay brick masonry; in the UK, thin layer mortared concrete masonry has been researched under commercial contract, with limited information published. This paper presents numerous experimental data generated over the past three years. It is shown that this form of masonry requires a special dry-mixed mortar containing a minimum of 2% polymer for improved workability, and blocks with tighter height tolerance, both of which might increase the cost of these constituent materials. However, through semi-skilled labour, tools to dispense and control the thickness of mortar, and the associated increase in productivity, a reduction in the overall costs of this form of construction can be achieved. Further, the polymer mortar provides several advantages: (1) improved sustainability due to dry curing, (2) the potential to construct mortar layers of 2 mm thickness, and (3) the ability to mechanise mortar application and control its thickness without the need for skilled labour.
Abstract:
Amiton (O,O-diethyl-S-[2-(diethylamino)ethyl]phosphorothiolate), otherwise known as VG, is listed in schedule 2 of the Chemical Weapons Convention (CWC) and has a structure closely related to VX (O-ethyl-S-(2-diisopropylamino)ethylmethylphosphonothiolate). Fragmentation of protonated VG in the gas phase was performed using electrospray ionisation ion trap mass spectrometry (ESI-ITMS) and revealed several characteristic product ions. Quantum chemical calculations provide the most probable structures for these ions as well as the likely unimolecular mechanisms by which they are formed. The decomposition pathways predicted by computation are consistent with deuterium-labeling studies. The combination of experimental and theoretical data suggests that the fragmentation pathways of VG and analogous organophosphorus nerve agents, such as VX and Russian VX, are predictable and thus ESI tandem mass spectrometry is a powerful tool for the verification of unknown compounds listed in the CWC. Copyright (c) 2006 Commonwealth of Australia. Published by John Wiley & Sons, Ltd.
Abstract:
Due to rapidly diminishing international supplies of fossil fuels, such as petroleum and diesel, the cost of fuel is constantly increasing, leading to higher costs of living as a result of many industries' significant reliance on motor vehicles. Many technologies have been developed to replace part or all of a fossil fuel with bio-fuels. One such dual fuel technology is fumigation of ethanol in diesel engines, which injects ethanol into the intake air stream of the engine. The advantage of this approach is that it avoids any costly modification of the engine's high pressure diesel injection system, while reducing the volume of diesel required and potentially increasing the power output and efficiency. This paper investigates the performance of a diesel engine converted to implement ethanol fumigation. The project uses existing experimental data along with computer-modeled results generated using the program AVL Boost. The data from both the experiments and the numerical simulation indicate desirable results for the peak pressure and the indicated mean effective pressure (IMEP). Increasing ethanol substitution resulted in elevated combustion pressure and an increase in IMEP, while varying the ethanol injection location resulted in negligible change. These increases in cylinder pressure led to higher work output and total efficiency in the engine as the ethanol substitution was increased. In comparing the numerical and experimental results, the simulation showed a slight overestimation, attributed to inaccuracies in the heat release models. Future work is required to improve the combustion model and investigate the effect of varying the location of ethanol injection.
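The indicated mean effective pressure referred to in this abstract is the indicated work of a cycle divided by the swept volume, IMEP = (∮ p dV) / V_swept. A minimal sketch of that calculation from a closed pressure-volume trace; the trapezoidal integration and the rectangular test loop are illustrative assumptions, not data or methods from the study:

```python
import numpy as np

def imep(pressure_pa, volume_m3, swept_volume_m3):
    """Indicated mean effective pressure (Pa) from a closed p-V trace:
    IMEP = (cyclic integral of p dV) / swept volume.
    The trace should traverse the full cycle, ending where it starts."""
    p = np.asarray(pressure_pa, dtype=float)
    v = np.asarray(volume_m3, dtype=float)
    # Trapezoidal evaluation of the cyclic integral of p dV = indicated work (J).
    work = np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(v))
    return work / swept_volume_m3
```

For a closed loop, the integral equals the enclosed p-V area, so a higher-pressure expansion stroke (as reported with increasing ethanol substitution) raises the computed IMEP directly.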
Abstract:
Perfluorooctanoic acid (PFOA) and perfluorooctane sulfonic acid (PFOS) have been used for a variety of applications, including fluoropolymer processing, fire-fighting foams and surface treatments, since the 1950s. Both PFOS and PFOA are polyfluoroalkyl chemicals (PFCs), man-made compounds that are persistent in the environment and in humans; some PFCs have shown adverse effects in laboratory animals. Here we describe the application of a simple one-compartment pharmacokinetic model to estimate total intakes of PFOA and PFOS for the general population of urban areas on the east coast of Australia. Key parameters for this model include the elimination rate constants and the volume of distribution within the body. A volume of distribution of 170 ml/kg bw was calibrated for PFOA using data from two communities in the United States where the residents' serum concentrations could be assumed to result primarily from a known and characterized source: drinking water contaminated with PFOA by a single fluoropolymer manufacturing facility. For PFOS, a value of 230 ml/kg bw was used, based on adjustment of the PFOA value. Applying measured Australian serum data to the model gave mean ± standard deviation intake estimates for PFOA of 1.6 ± 0.3 ng/kg bw/day for males and females >12 years of age combined, based on samples collected in 2002-2003, and 1.3 ± 0.2 ng/kg bw/day based on samples collected in 2006-2007. Mean intakes of PFOS were 2.7 ± 0.5 ng/kg bw/day for males and females >12 years of age combined, based on samples collected in 2002-2003, and 2.4 ± 0.5 ng/kg bw/day for the 2006-2007 samples. ANOVA demonstrated significant differences in PFOA intake by age group (p=0.03), sex (p=0.001) and date of collection (p<0.001). Estimated intake rates were highest in those aged >60 years, higher in males than in females, and higher in 2002-2003 than in 2006-2007. The same pattern was seen for PFOS intake, with significant differences by age group (p<0.001), sex (p=0.001) and date of collection (p=0.016).
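At steady state, a one-compartment model of the kind described in this abstract relates a measured serum concentration to daily intake via intake = C_serum × V_d × k_e, where k_e is the elimination rate constant. A minimal sketch; the serum concentration and the elimination half-life below are illustrative placeholders, not values from the study:

```python
import math

def daily_intake(serum_ng_per_ml, vd_ml_per_kg_bw, half_life_years):
    """Steady-state daily intake (ng/kg bw/day) under a one-compartment model:
    intake = C_serum * Vd * k_e, with k_e = ln(2) / t_half converted to per-day."""
    k_e_per_day = math.log(2) / (half_life_years * 365.25)
    return serum_ng_per_ml * vd_ml_per_kg_bw * k_e_per_day

# Hypothetical inputs: 5 ng/ml serum PFOA, the calibrated Vd of 170 ml/kg bw,
# and an assumed multi-year elimination half-life.
intake_pfoa = daily_intake(5.0, 170.0, 3.8)
```

The long elimination half-lives of these compounds are why a single serum sample can stand in for an average of years of intake.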
Abstract:
The incidence of major storm surges in the last decade has dramatically emphasized the immense destructive capability of extreme water level events, particularly when driven by severe tropical cyclones. Given this risk, it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood and erosion management, engineering design and future land-use planning, and to ensure that the risk of catastrophic structural failure due to under-design, or of expensive waste due to over-design, is minimised. Australia has a long history of coastal flooding from tropical cyclones. Using a novel integration of two modeling techniques, this paper provides the first estimates of present day extreme water level exceedance probabilities around the whole coastline of Australia, and the first estimates that combine the influence of astronomical tides, storm surges generated by both extra-tropical and tropical cyclones, and seasonal and inter-annual variations in mean sea level. Initially, an analysis of tide gauge records has been used to assess the characteristics of tropical cyclone-induced surges around Australia. However, given the dearth of information (both temporal and spatial) around much of the coastline, and therefore the inability of these gauge records to adequately describe the regional climatology, an observationally based stochastic tropical cyclone model has been developed to synthetically extend the tropical cyclone record to 10,000 years. Wind and pressure fields derived for these synthetically generated events have then been used to drive a hydrodynamic model of the Australian continental shelf region, with annual maximum water levels extracted to estimate exceedance probabilities around the coastline. To validate this methodology, selected historic storm surge events have been simulated and the resultant storm surges compared with gauge records.
Tropical cyclone induced exceedance probabilities have been combined with estimates derived from a 61-year water level hindcast described in a companion paper to give a single estimate of present day extreme water level probabilities around the whole coastline of Australia. Results of this work are freely available to coastal engineers, managers and researchers via a web-based tool (www.sealevelrise.info). The described methodology could be applied to other regions of the world, like the US east coast, that are subject to both extra-tropical and tropical cyclones.
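One way a long synthetic record of annual maxima, such as the 10,000-year record described in this abstract, can be turned into exceedance probabilities is through empirical plotting positions. A minimal sketch, assuming one annual-maximum water level per simulated year; the Weibull plotting position used here is one common convention, not necessarily the study's:

```python
import numpy as np

def empirical_exceedance(annual_maxima):
    """Empirical annual exceedance probabilities (AEP) and average recurrence
    intervals (ARI, years) from a record of annual-maximum water levels,
    using the Weibull plotting position p_i = i / (N + 1)."""
    levels = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]  # descending
    n = levels.size
    aep = np.arange(1, n + 1) / (n + 1.0)  # rank-based exceedance probability
    ari = 1.0 / aep                        # recurrence interval in years
    return levels, aep, ari
```

With 10,000 synthetic years the largest simulated level corresponds to an average recurrence interval of roughly 10,000 years, which is what motivates extending the short observational record synthetically.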
Abstract:
Some minerals are colloidal and show no X-ray diffraction patterns. Vibrational spectroscopy offers one of the few methods for assessing the structure of these types of mineral. Among this group of minerals is pitticite, described simply as Fe, AsO4, SO4, H2O. The objective of this research is to determine the molecular structure of the mineral pitticite using vibrational spectroscopy. Raman microscopy offers a useful method for the analysis of such colloidal minerals. Raman and infrared bands are attributed to arsenate, sulfate and water stretching vibrations. The Raman spectrum is dominated by a very intense sharp band at 983 cm−1, assigned to the symmetric stretching mode. A strong Raman band observed at 1041 cm−1 is assigned to the antisymmetric stretching mode. Low intensity Raman bands at 757 and 808 cm−1 may be assigned to the antisymmetric and symmetric stretching modes. Raman bands observed at 432 and 465 cm−1 are attributable to the doubly degenerate ν2(SO4)2− bending mode.
Abstract:
This paper presents two novel nonlinear models of u-shaped anti-roll tanks for ships, and their linearizations. In addition, a third simplified nonlinear model is presented. The models are derived using Lagrangian mechanics. This formulation not only simplifies the modeling process, but also allows one to obtain models that satisfy energy-related physical properties. The proposed nonlinear models and their linearizations are validated using model-scale experimental data. Unlike other models in the literature, the nonlinear models in this paper are valid for large roll amplitudes. Even at moderate roll angles, the nonlinear models have three orders of magnitude lower mean square error relative to experimental data than the linear models.
Abstract:
This article describes a Matlab toolbox for parametric identification of fluid-memory models associated with the radiation forces on ships and offshore structures. Radiation forces are a key component of force-to-motion models used in simulators, in motion control designs, and in initial performance evaluation of wave-energy converters. The software described provides tools for preparing non-parametric data and for identification with automatic model-order detection. The identification problem is considered in the frequency domain.
Abstract:
Real-world business process models may consist of hundreds of elements and have a sophisticated structure. Although there are tasks for which such models are valuable and appreciated, in general complexity has a negative influence on model comprehension and analysis. Thus, means for managing the complexity of process models are needed. One approach is abstraction of business process models: the creation of a process model which preserves the main features of the initial elaborate process model but leaves out insignificant details. In this paper we study the structural aspects of process model abstraction and introduce an abstraction approach based on process structure trees (PSTs). The developed approach ensures that the abstracted process model preserves the ordering constraints of the initial model. It surpasses pattern-based process model abstraction approaches in that it can handle graph-structured process models of arbitrary structure. We also provide an evaluation of the proposed approach.
Abstract:
Behavioral models capture operational principles of real-world or designed systems. Formally, each behavioral model defines the state space of a system, i.e., its states and the principles of state transitions. Such a model is the basis for analysis of the system’s properties. In practice, state spaces of systems are immense, which results in huge computational complexity for their analysis. Behavioral models are typically described as executable graphs, whose execution semantics encodes a state space. The structure theory of behavioral models studies the relations between the structure of a model and the properties of its state space. In this article, we use the connectivity property of graphs to achieve an efficient and extensive discovery of the compositional structure of behavioral models; behavioral models get stepwise decomposed into components with clear structural characteristics and inter-component relations. At each decomposition step, the discovered compositional structure of a model is used for reasoning on properties of the whole state space of the system. The approach is exemplified by means of a concrete behavioral model and verification criterion. That is, we analyze workflow nets, a well-established tool for modeling behavior of distributed systems, with respect to the soundness property, a basic correctness property of workflow nets. Stepwise verification allows the detection of violations of the soundness property by inspecting small portions of a model, thereby considerably reducing the amount of work to be done to perform soundness checks. Besides formal results, we also report on findings from applying our approach to an industry model collection.
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modeling processes in the context of a dynamic environment and adaptive process participant behavior. The approach allows the definition of execution constraints that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. The flexible process graph focuses on what can be done to perform a process, with participants' routing decisions based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can connect any number of nodes. Hypergraphs are used to define the execution semantics of processes formally. We provide a process scenario to motivate and illustrate the approach.