689 results for Requirements engineering


Relevance:

20.00%

Publisher:

Abstract:

Most buildings constructed in Australia must comply with the Building Code of Australia (BCA). Checking for compliance against the BCA is a major task for both designers and building surveyors. This project carries out prototype research using the EDM Model Checker and the SMC Model Checker for automated design checking against the Building Code of Australia for use in professional practice. In this project, we develop a means of encoding design requirements and domain-specific knowledge for building codes, and investigate the flexibility of building models to contain design information. After assessing two implementations, in EDM and SMC, that check compliance against deemed-to-satisfy provisions of building codes relevant to access by people with disabilities, an approach to automated code checking using a shared object-oriented database is established. This approach can be applied in other areas, including checking a building design for non-compliance with many other types of design requirements. Recommendations for future development and use in other potential areas of the construction industry are discussed.
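The rule-encoding approach described above can be sketched in miniature. The Python fragment below is purely illustrative: the class, the rule, and the 850 mm clear-width figure are invented stand-ins, not the actual BCA provision or the EDM/SMC encoding.

```python
# Hypothetical sketch: encode one deemed-to-satisfy accessibility rule and
# check it against objects from a building model. All names and the numeric
# limit are illustrative assumptions, not taken from the BCA.

MIN_DOOR_CLEAR_WIDTH_MM = 850  # invented limit for illustration only

class Door:
    def __init__(self, door_id, clear_width_mm, on_accessible_path):
        self.door_id = door_id
        self.clear_width_mm = clear_width_mm
        self.on_accessible_path = on_accessible_path

def check_door_width(doors):
    """Return the doors that fail the encoded accessibility provision."""
    return [d for d in doors
            if d.on_accessible_path and d.clear_width_mm < MIN_DOOR_CLEAR_WIDTH_MM]

doors = [Door("D01", 900, True), Door("D02", 760, True), Door("D03", 700, False)]
failures = check_door_width(doors)
print([d.door_id for d in failures])  # → ['D02']
```

In the shared object-oriented database approach, such rules would be evaluated directly over the stored building objects rather than over hand-built lists.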

Relevance:

20.00%

Publisher:

Abstract:

The requirement to monitor the rapid pace of environmental change due to global warming and to human development is producing large volumes of data, but placing much stress on the capacity of ecologists to store, analyse and visualise that data. To date, much of the data has been provided by low-level sensors monitoring soil moisture, dissolved nutrients, light intensity, gas composition and the like. However, a significant part of an ecologist’s work is to obtain information about species diversity, distributions and relationships. This task typically requires the physical presence of an ecologist in the field, listening and watching for species of interest. It is an extremely difficult task to automate because of the higher-order difficulties in bandwidth, data management and intelligent analysis if one wishes to emulate the highly trained eyes and ears of an ecologist. This paper is concerned with just one part of the bigger challenge of environmental monitoring: the acquisition and analysis of acoustic recordings of the environment. Our intention is to provide helpful tools to ecologists, tools that apply information and computational technologies to all aspects of the acoustic environment. The online system which we are building in conjunction with ecologists offers an integrated approach to recording, data management and analysis. The ecologists we work with have different requirements, and therefore we have adopted a toolbox approach; that is, we offer a number of different web services that can be concatenated according to need. In particular, one group of ecologists is concerned with identifying the presence or absence of species and their distributions in time and space. Another group, motivated by legislative requirements for measuring habitat condition, is interested in summary indices of environmental health. In both cases, the key issues are scalability and automation.

Relevance:

20.00%

Publisher:

Abstract:

This paper provides an overview of a new framework for a design-stage Building Environmental Assessment (BEA) tool, and discusses strategic responses to existing tool issues and stakeholder requirements that led to the development of this tool, founded on new information and communication technology (ICT) arising from developments in 3D CAD. After introducing the context of BEA and some of their team’s new work, the authors:
• critique current BEA tool theory;
• review previous assessments of stakeholder needs;
• introduce a new framework applied to analyse such tools;
• highlight key results concerning illustrative ICT capabilities; and
• discuss their potential significance for BEA tool stakeholders.

Relevance:

20.00%

Publisher:

Abstract:

Theories on teaching and learning for adult learners are constantly being reviewed and discussed in the higher education environment. Theories are not static and appear to be in a constant developmental process. This paper discusses three of these theories: pedagogy, andragogy and heutagogy. It is argued that although educators engage in many of the principles of both student-centered (andragogy) and self-determined (heutagogy) learning, it is not possible to fully implement either theory. The two main limitations are the requirements of internal and external stakeholders, such as accrediting bodies, and the requirement to assess all student learning. A reversion to teacher-centered learning (pedagogy) ensues. In summary, we engage in many action-oriented learning activities but revert to teacher-centered approaches in terms of content and assessment.

Relevance:

20.00%

Publisher:

Abstract:

Engineering education is underrepresented in Australia at the primary, middle school and high school levels. Understanding preservice teachers’ preparedness to be involved in engineering will be important for developing an engineering curriculum. This study administered a literature-based survey to 36 preservice teachers, which gathered data about their perceptions of engineering and their predispositions for teaching engineering. Findings indicated that the four constructs associated with the survey had acceptable Cronbach alpha scores (i.e., personal professional attributes .88, student motivation .91, pedagogical knowledge .91, and fused curricula .89). Moreover, no “disagree” or “strongly disagree” response exceeded 22% for any of the 25 survey items. Generally, these preservice teachers indicated predispositions for teaching engineering in the middle school. Extensive scaffolding and support within education programs will assist preservice teachers to develop confidence in this field. Governments and education departments need to recognise the importance of engineering education, and universities must take a stronger role in developing engineering education curricula.
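The internal-consistency figures quoted above are Cronbach alpha scores. As a reminder of how such scores are obtained, the sketch below computes alpha for a toy response matrix; the responses are invented, only the formula is the standard one.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The response matrix is fabricated purely to exercise the formula.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Five respondents answering three fairly consistent items on a 1-5 scale.
responses = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 2, 3], [4, 4, 5]]
print(round(cronbach_alpha(responses), 2))  # → 0.9
```

Values of .88 to .91, as reported for the four constructs, indicate that items within each construct vary together strongly.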

Relevance:

20.00%

Publisher:

Abstract:

There is currently a strong focus worldwide on the potential of large-scale Electronic Health Record (EHR) systems to cut costs and improve patient outcomes through increased efficiency. This is accomplished by aggregating medical data from isolated Electronic Medical Record databases maintained by different healthcare providers. Concerns about the privacy and reliability of Electronic Health Records are crucial to healthcare service consumers. Traditional security mechanisms are designed to satisfy confidentiality, integrity, and availability requirements, but they fail to provide a measurement tool for data reliability from a data-entry perspective. In this paper, we introduce a Medical Data Reliability Assessment (MDRA) service model to assess the reliability of medical data by evaluating the trustworthiness of its sources: typically the healthcare provider that created the data and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record. The result is then expressed by manipulating health record metadata to alert medical practitioners relying on the information to possible reliability problems.
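A minimal sketch of the kind of scoring the MDRA model describes might look as follows. The weights, the 0.5 alert threshold, and the field names are invented for illustration; they are not the paper's actual model.

```python
# Hypothetical reliability scoring: combine the trustworthiness of the two
# sources named above (provider and practitioner) and attach the result to
# the record's metadata. Weights and threshold are illustrative assumptions.

def assess_reliability(provider_trust, practitioner_trust,
                       w_provider=0.4, w_practitioner=0.6):
    """Both trust inputs are assumed to lie in [0, 1]."""
    return w_provider * provider_trust + w_practitioner * practitioner_trust

record = {"diagnosis": "hypertension", "metadata": {}}
score = assess_reliability(provider_trust=0.9, practitioner_trust=0.4)
record["metadata"]["reliability"] = score        # ≈ 0.6
record["metadata"]["alert"] = score < 0.5        # flag low-reliability entries
print(record["metadata"])
```

Storing the score in metadata, rather than altering the record itself, matches the paper's approach of alerting practitioners without touching clinical content.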

Relevance:

20.00%

Publisher:

Abstract:

Electronic Health Record (EHR) systems are being introduced to overcome the limitations associated with paper-based and isolated Electronic Medical Record (EMR) systems. This is accomplished by aggregating medical data and consolidating them in one digital repository. Though an EHR system provides obvious functional benefits, there is a growing concern about the privacy and reliability (trustworthiness) of Electronic Health Records. Security requirements such as confidentiality, integrity, and availability can be satisfied by traditional hard security mechanisms. However, measuring data trustworthiness from the perspective of data entry is an issue that cannot be solved with traditional mechanisms, especially since degrees of trust change over time. In this paper, we introduce a Time-variant Medical Data Trustworthiness (TMDT) assessment model to evaluate the trustworthiness of medical data by evaluating the trustworthiness of its sources, namely the healthcare organisation where the data was created and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record, with respect to a certain period of time. The result can then be used by the EHR system to manipulate health record metadata to alert medical practitioners relying on the information to possible reliability problems.
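The time-variant aspect described above can be illustrated with a simple decay rule: a source's trust contribution shrinks as the record ages, so the same entry is considered less trustworthy once it is stale. The half-life decay and the min-combination below are our assumptions, not the TMDT model's actual functions.

```python
# Illustrative time-variant trust: exponential decay with a half-life,
# combining the two sources named above (organisation and practitioner)
# pessimistically via min(). All parameters are invented for illustration.
import math  # not strictly needed here; 0.5 ** x suffices for the decay

def time_variant_trust(base_trust, age_years, half_life_years=5.0):
    """Halve the trust contribution every `half_life_years`."""
    return base_trust * 0.5 ** (age_years / half_life_years)

def record_trustworthiness(org_trust, practitioner_trust, age_years):
    return min(time_variant_trust(org_trust, age_years),
               time_variant_trust(practitioner_trust, age_years))

fresh = record_trustworthiness(0.9, 0.8, age_years=0)   # → 0.8
stale = record_trustworthiness(0.9, 0.8, age_years=10)  # → 0.2 (two half-lives)
print(fresh, stale)
```

As in the paper, the resulting score would be written into health record metadata so that consumers of an old entry see a weaker trust signal than consumers of a fresh one.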

Relevance:

20.00%

Publisher:

Abstract:

In 2004, in response to growing demands in Anhui for restrictions on vehicle overloading, a comprehensive provincial survey of overloaded freight transport was conducted, with the support of the World Bank, to evaluate the extent of overloading and the efficiency of enforcement. Site surveys were conducted at six locations (Hefei, Fuyang, Luan, Wuhu, Huainan and Huangshan), each covering four main areas: traffic volume, axle load, freight information and registration information. Statistical analysis of the survey data led to the following conclusions: vehicle overloading is now a widespread and serious problem on arterial highways in Anhui; traffic loads have far exceeded the design capacity of the highways and have caused widespread premature pavement damage, especially on rigid pavements; and overloaded trucks in practice operate unimpeded in highway freight transport, owing to disordered enforcement strategies and deficient inspection technologies.

Relevance:

20.00%

Publisher:

Abstract:

This appendix describes the Order Fulfillment process followed by a fictitious company named Genko Oil. The process is freely inspired by the VICS (Voluntary Inter-industry Commerce Solutions) reference model and provides a demonstration of YAWL’s capabilities in modelling complex control-flow, data and resourcing requirements.

Relevance:

20.00%

Publisher:

Abstract:

Historically, asset management focused primarily on the reliability and maintainability of assets; organisations have since accepted the notion that a much larger array of processes governs the life and use of an asset. Accordingly, asset management’s new paradigm seeks a holistic, multi-disciplinary approach to the management of physical assets. A growing number of organisations now seek to develop integrated asset management frameworks and bodies of knowledge. This research seeks to complement the existing outputs of these organisations through the development of an asset management ontology. Ontologies define a common vocabulary for both researchers and practitioners who need to share information in a chosen domain. A by-product of ontology development is the realisation of a process architecture, for which there is likewise no evidence in the published literature. To develop the ontology and the subsequent asset management process architecture, a standard knowledge-engineering methodology is followed. This involves text analysis, definition and classification of terms, and visualisation through an appropriate tool (in this case, the Protégé application was used). The result of this research is a first attempt at developing an asset management ontology and process architecture.

Relevance:

20.00%

Publisher:

Abstract:

With increasingly complex engineering assets and tight economic requirements, asset reliability becomes more crucial in Engineering Asset Management (EAM). Improving the reliability of systems has always been a major aim of EAM. Reliability assessment using degradation data has become a significant approach to evaluating the reliability and safety of critical systems. Degradation data often provide more information than failure-time data for assessing reliability and predicting the remnant life of systems. In general, degradation is the reduction in performance, reliability, and life span of assets. Many failure mechanisms can be traced to an underlying degradation process. Degradation is a stochastic phenomenon and can therefore be modelled in several ways. Degradation modelling techniques have generated a great amount of research in the reliability field, yet, although degradation models play a significant role in reliability analysis, there are few review papers on them. This paper presents a review of the existing literature on commonly used degradation models in reliability analysis. Current research and developments in degradation models are reviewed and summarised, and the models are synthesised and classified into groups. Additionally, the paper attempts to identify the merits, limitations, and applications of each model, and indicates potential applications of these degradation models in asset health and reliability prediction.
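One commonly used model from the class reviewed above is the Wiener process with drift, X(t) = x0 + mu·t + sigma·B(t), where failure is declared when the degradation path first crosses a threshold. The sketch below simulates such a path; all parameter values are invented for illustration.

```python
# Simulate a Wiener-process degradation path and find its first passage
# over a failure threshold. Parameters are illustrative, not from any asset.
import numpy as np

def wiener_degradation(x0, mu, sigma, dt, steps, rng):
    """Path X(t) = x0 + mu*t + sigma*B(t), sampled every dt."""
    increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(steps)
    return x0 + np.concatenate(([0.0], np.cumsum(increments)))

def first_passage_time(path, threshold, dt):
    hits = np.nonzero(path >= threshold)[0]
    return hits[0] * dt if hits.size else None   # None: no failure observed

rng = np.random.default_rng(42)
path = wiener_degradation(x0=0.0, mu=0.5, sigma=0.2, dt=0.1, steps=500, rng=rng)
t_fail = first_passage_time(path, threshold=10.0, dt=0.1)
print(t_fail)  # close to threshold/mu = 20 time units, perturbed by noise
```

Repeating the simulation many times yields an empirical first-passage-time distribution, which is how such models connect degradation data to remnant-life prediction.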

Relevance:

20.00%

Publisher:

Abstract:

"This book investigates the origins and implications of the securitization crisis, described by the chief executive of ANZ as a "financial services bloodbath". Based on extensive interviews it offers an integrated series of case studies drawn from the United States, the United Kingdom and Australia. A central purpose is to not only chart what went wrong with the investment houses and why the regulatory systems failed, but also provide policy guidance. The book therefore combines the empirical with the normative. In so doing, it provides a route map to navigate one of the most significant financial and regulatory failures in modern times."

Relevance:

20.00%

Publisher:

Abstract:

In this paper we discuss our current efforts to develop and implement an exploratory, discovery-mode assessment item into the total learning and assessment profile for a target group of about 100 second-level engineering mathematics students. The assessment item under development is composed of two parts: a set of "pre-lab" homework problems (which focus on relevant prior mathematical knowledge, concepts and skills), and complementary computing laboratory exercises which are undertaken within a fixed (1 hour) time frame. In particular, the computing exercises exploit the algebraic manipulation and visualisation capabilities of the symbolic algebra package MAPLE, with the aim of promoting understanding of certain mathematical concepts and skills via visual and intuitive reasoning, rather than a formal or rigorous approach. The assessment task we are developing is aimed at providing students with a significant learning experience, in addition to providing feedback on their individual knowledge and skills. To this end, a noteworthy feature of the scheme is that marks awarded for the laboratory work are primarily based on the extent to which reflective, critical thinking is demonstrated, rather than the number of CBE-style tasks completed by the student within the allowed time. With regard to student learning outcomes, a novel and potentially critical feature of our scheme is that the assessment task is designed to be intimately linked to the overall course content, in that it aims to introduce important concepts and skills (via individual student exploration) which will be revisited somewhat later in the pedagogically more restrictive formal lecture component of the course (typically a large-group plenary format).
Furthermore, the time delay involved, or "incubation period", is also a deliberate design feature: it is intended to allow students the opportunity to undergo potentially important internal re-adjustments in their understanding, before being exposed to lectures on related course content, which are invariably delivered in a more condensed, formal and mathematically rigorous manner. In our presentation, we will discuss in more detail our motivation and rationale for trialling such a scheme for the targeted student group. Some of the advantages and disadvantages of our approach (as we perceived them at the initial stages) will also be enumerated. In a companion paper, the theoretical framework for our approach will be more fully elaborated, and measures of student learning outcomes (as obtained from, e.g., student-provided feedback) will be discussed.

Relevance:

20.00%

Publisher:

Abstract:

In this study, poly(ε-caprolactone) (PCL) and its collagen composite blend (PCL/Col) were fabricated into scaffolds using an electrospinning method. The incorporated collagen was present on the surface of the fibers, and it modulated the attachment and proliferation of pig bone marrow mesenchymal cells (pBMMCs). Osteogenic differentiation markers were more pronounced when these cells were cultured on PCL/Col fibrous meshes, as determined by immunohistochemistry for collagen type I, osteopontin, and osteocalcin. Matrix mineralization was observed only on osteogenically induced PCL/Col constructs. Long bone analogs were created by wrapping osteogenic cell sheets around the PCL/Col meshes to form hollow cylindrical cell-scaffold constructs. Culturing these constructs under dynamic conditions enhanced bone-like tissue formation and mechanical strength. We conclude that electrospun PCL/Col mesh is a promising material for bone engineering applications. Its combination with osteogenic cell sheets offers a novel and promising strategy for engineering tubular bone analogs.