901 results for Systems development
Abstract:
Building prefabrication is known as Industrialised Building Systems (IBS) in Malaysia. This construction method possesses unique characteristics that are central to sustainable construction. For example, offsite construction enables efficient management of construction wastage by identifying the major causes of waste arising during both the design and construction stages. These causes may then be eliminated through improvements to the manufacturing of IBS components. However, current decisions on using IBS are typically financially driven, which hinders wider adoption. In addition, current misconceptions about IBS and the failure of rating schemes to evaluate the sustainability of IBS affect its implementation. A new approach is required to provide stakeholders with a better understanding of the sustainability potential of IBS. Such an approach should also help project the outcomes of each level of decision-making in responding to social, economic and environmental challenges. This paper presents interim findings of research aimed at developing a framework for sustainable IBS development and suggests a more holistic approach to achieving sustainability. A framework for embedding sustainability factors is considered across three main phases of IBS construction: 1) pre-construction, 2) construction and 3) post-construction. SWOT analysis was used to evaluate the strengths, weaknesses, opportunities and threats involved in IBS implementation. Action plans are formulated from the analysis of sustainability objectives. This approach shows where and how sustainability should be integrated to improve IBS construction. A mix of quantitative and qualitative methods was used in this research to explore the potential of IBS for integrating sustainability. The tools used in the study are questionnaires and semi-structured interviews. Outcomes from these tools led to the identification of viable approaches involving 18 critical factors for improving sustainability in IBS construction. Finally, guidelines for decision-making are being developed to provide a useful source of information and support, to the mutual benefit of stakeholders, in integrating sustainability issues and concepts into IBS applications.
Abstract:
Since the 1960s, many developing countries have introduced IP laws to help them in their social and economic development. Introducing these laws was considered a civilised act and a precondition of developing countries' progress from being 'under-developed' to becoming 'developed'. In 2004, Brazil and Argentina presented a comprehensive proposal on behalf of developing countries to establish the Development Agenda in the World Intellectual Property Organisation (WIPO). They put forward the view that IP laws in their current form are not helping those countries in their development, as is constantly suggested by developed countries, and that there is a need to rethink the international IP system and the work of WIPO. The research undertaken examines the correlation between IP and social and economic development. It investigates how IP systems in developing countries could work to advance their development, especially in the context of the internet. The research considers the theory and practice of IP and development, and proposes a new IP framework which developing countries could employ to further their social and economic development.
Abstract:
In various industrial and scientific fields, conceptual models are derived from real-world problem spaces to understand and communicate the entities they contain and the coherencies between them. Abstracted models mirror the common understanding and information demand of engineers, who apply conceptual models in performing their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication processes, as a common understanding is not shared or implemented in the specific models. To support direct communication between experts from several disciplines, a visual language is developed which allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.
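As an illustration of the arrangement described above, the sketch below (in Python, with entirely hypothetical names and coordinates) places each discipline-specific model on its own plane in a three-dimensional space and links corresponding elements across planes; it is a minimal sketch of the layout idea, not the visual language itself.

```python
from dataclasses import dataclass

@dataclass
class ModelElement:
    """An entity of a discipline-specific conceptual model, placed in 3D space."""
    name: str
    discipline: str   # e.g. "Process Management", "PLM", "ERP"
    x: float = 0.0    # position within the discipline's plane
    y: float = 0.0
    z: float = 0.0    # depth of the plane assigned to the discipline

@dataclass
class CrossDisciplineLink:
    """A correspondence between elements of different discipline models."""
    source: ModelElement
    target: ModelElement
    relation: str = "corresponds-to"

def assign_planes(elements, spacing=10.0):
    """Give each discipline its own z-plane so models stay visually separated
    while corresponding elements can still be linked across planes."""
    planes = {}
    for element in elements:
        if element.discipline not in planes:
            planes[element.discipline] = len(planes) * spacing
        element.z = planes[element.discipline]
    return planes

# Illustrative use: a product concept that appears in both a PLM and an ERP model.
plm_part = ModelElement("Part", "PLM", x=1.0, y=2.0)
erp_item = ModelElement("Material", "ERP", x=3.0, y=1.0)
assign_planes([plm_part, erp_item])
link = CrossDisciplineLink(plm_part, erp_item)
print(plm_part.z, erp_item.z, link.relation)
```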
Abstract:
This paper proposes that the theoretical framework of ecological dynamics can provide an influential model of the learner and the learning process to pre-empt effective behaviour changes. Here we argue that ecological dynamics supports a well-established model of the learner ideally suited to the environmental education context because of its emphasis on the learner-environment relationship. The model stems from perspectives on behaviour change in ecological psychology and dynamical systems theory. The salient points of the model are highlighted for educators interested in manipulating environmental constraints in the learning process, with the aim of designing effective learning programs in environmental education. We conclude by providing generic principles of application which might define the learning process in environmental education programs.
Abstract:
With the progressive exhaustion of fossil energy and the enhanced awareness of environmental protection, more attention is being paid to electric vehicles (EVs). Inappropriate siting and sizing of EV charging stations could have negative effects on the development of EVs, the layout of the city traffic network, and the convenience of EV drivers, and could lead to an increase in network losses and a degradation in voltage profiles at some nodes. Given this background, the optimal sites of EV charging stations are first identified by a two-step screening method that takes into account environmental factors and the service radius of EV charging stations. Then, a mathematical model for the optimal sizing of EV charging stations is developed, with the minimization of the total cost associated with the EV charging stations to be planned as the objective function, and solved by a modified primal-dual interior point algorithm (MPDIPA). Finally, simulation results for the IEEE 123-node test feeder demonstrate that the developed model and method can not only attain a reasonable planning scheme for EV charging stations, but also reduce the network loss and improve the voltage profile.
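The sizing model above is solved with a modified primal-dual interior point algorithm (MPDIPA); that solver is not reproduced here. The following minimal sketch instead uses a generic constrained minimizer from SciPy with entirely hypothetical cost coefficients, only to illustrate the structure of a total-cost objective over candidate sites subject to a demand constraint.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Candidate sites already chosen by a screening step (hypothetical data).
fixed_cost = np.array([120.0, 150.0, 100.0])   # per-station fixed cost
unit_cost  = np.array([2.0, 1.8, 2.2])         # cost per charger installed
loss_coeff = np.array([0.05, 0.08, 0.04])      # proxy for network-loss cost
demand     = 90.0                              # total chargers needed

def total_cost(x):
    # Investment cost plus a quadratic network-loss penalty per site.
    return np.sum(fixed_cost + unit_cost * x + loss_coeff * x**2)

# Serve the total demand; each site is limited to 60 chargers.
cons = LinearConstraint(np.ones(3), lb=demand, ub=np.inf)
res = minimize(total_cost, x0=np.full(3, demand / 3),
               bounds=[(0, 60)] * 3, constraints=[cons])
print(np.round(res.x, 1), round(res.fun, 1))   # sizes per site, total cost
```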
Abstract:
Destruction of cancer cells by genetically modified viral and nonviral vectors has been the aim of many research programs. The ability to target cytotoxic gene therapies to the cells of interest is an essential prerequisite, and the treatment has always had the potential to provide better and more long-lasting therapy than existing chemotherapies. However, the potency of these infectious agents requires effective testing systems, in which hypotheses can be explored both in vitro and in vivo before the establishment of clinical trials in humans. The real prospect of off-target effects should be eliminated in the preclinical stage, if current prejudices against such therapies are to be overcome. In this review we have set out, using adenoviral vectors as a commonly used example, to discuss some of the key parameters required to develop more effective testing, and to critically assess the current cellular models for the development and testing of prostate cancer biotherapy. Only by developing models that more closely mirror human tissues will we be able to translate literature publications into clinical trials and hence into acceptable alternative treatments for the most commonly diagnosed cancer in humans.
Abstract:
This paper presents an approach to developing indicators for expressing the resilience of a generic water supply system. The system is contextualised as a meta-system consisting of three subsystems representing the water catchment and reservoir, the treatment plant and the distribution system supplying the end-users. The level of final service delivery to end-users is considered a surrogate measure of systemic resilience. A set of modelled relationships is used to explore the interactions between system components when placed under simulated stress. Conceptual system behaviour under specific types of simulated pressure is created to illustrate parameters for indicator development. The approach is based on the hypothesis that an in-depth knowledge of resilience would enable the development of decision support system capability, which in turn will contribute to enhanced management of a water supply system. In contrast to conventional water supply system management approaches, a resilience approach facilitates improvement in system efficiency by emphasising awareness of points of intervention where system managers can adjust operational control measures across the meta-system (and within subsystems) rather than expanding the system in its entirety in the form of new infrastructure development.
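A minimal sketch of the indicator idea is given below, assuming a purely illustrative series arrangement of the three subsystems and hypothetical capacities: the fraction of end-user demand met under a simulated stress profile serves as the surrogate resilience measure.

```python
import numpy as np

def service_delivery(capacities, demand, stress):
    """Final delivery through three subsystems in series
    (catchment/reservoir -> treatment -> distribution).
    `stress` scales each subsystem's effective capacity (1.0 = no stress)."""
    flow = demand
    for cap, s in zip(capacities, stress):
        flow = min(flow, cap * s)   # a subsystem passes at most its effective capacity
    return flow / demand            # fraction of end-user demand met

def resilience_indicator(capacities, demand, stress_profile):
    """Average service level over a simulated stress time series."""
    levels = [service_delivery(capacities, demand, s) for s in stress_profile]
    return float(np.mean(levels))

# Hypothetical meta-system (capacities in ML/day) and a drought-like episode
# that mainly affects the catchment/reservoir subsystem.
capacities = [120.0, 100.0, 110.0]
demand = 95.0
stress_profile = [(1.0, 1.0, 1.0)] * 5 + [(0.6, 1.0, 0.95)] * 10 + [(1.0, 1.0, 1.0)] * 5
print(round(resilience_indicator(capacities, demand, stress_profile), 3))
```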
Abstract:
The knowledge economy relies on the diffusion and use of knowledge as well as its creation (Houghton and Sheehan, 2000). The future success of economic activity will depend on the capacity of organisations to transform by increasing their flexibility. In particular, this transformation is dependent on a decentralised, networked and multi-skilled workforce. To help organisations with this transition, new strategies and structures for education are required. Education systems need to concentrate less on specialist skills and more on developing adaptable people with broad-based problem-solving skills, together with the social and interpersonal communication skills necessary for networking and communication. This paper presents the findings of a ‘Knowledge Economy Market Development Mapping Study’ conducted to identify the value of design education programs from primary through to tertiary level in Queensland, Australia. The relationship of these programs to the development of the capacities mentioned above is explored. The study includes the collection of qualitative and quantitative data consisting of a literature review, focus groups and a survey. Recommendations for the future development of design education programs in Queensland, Australia are proposed, and future research opportunities are presented and discussed.
Abstract:
Higher education (HE) has been changing rapidly due to globalisation, which has increased the interconnectedness between nations and people throughout the world (Mok, 2012). As HE has manifested in different forms and been governed by competing rationales in recent years, this paper focuses on transnational HE, an example of the interconnectedness of universities beyond national borders. Indonesia is also influenced by the above changes. It took part in free-trade agreements that include HE as a sector to be liberalised and accessed by international providers (Nizam, 2006). Indonesian universities found themselves bracing for global competition for students while simultaneously having to improve their quality in order to survive amidst the growing competition. This competition gave birth to joint transnational HE programs with overseas partners at many Indonesian universities (Macaranas, 2010).
Abstract:
Since the identification of the gene family of kallikrein-related peptidases (KLKs), their function has been robustly studied at the biochemical level. In vitro biochemical studies have shown that KLK proteases are involved in a number of extracellular processes that initiate intracellular signaling pathways by hydrolysis, as reviewed in Chapters 8, 9, and 15, Volume 1. These events have been associated with more invasive phenotypes of ovarian, prostate, and other cancers. Concomitantly, aberrant expression of KLKs has been associated with poor prognosis of patients with ovarian and prostate cancer (Borgoño and Diamandis, 2004; Clements et al., 2004; Yousef and Diamandis, 2009), with prostate-specific antigen (PSA, KLK3) being a long-standing, clinically employed biomarker for prostate cancer (Lilja et al., 2008). Data generated from patient samples in clinical studies, along with biochemical activity, suggest that KLKs function in the development and progression of these diseases. To bridge the gap between their function at the molecular level and the clinical need for efficacious treatment and prognostic biomarkers, functional assessment at the in vitro cellular level, using various culture models, is increasing, particularly in a three-dimensional (3D) context (Abbott, 2003; Bissell and Radisky, 2001; Pampaloni et al., 2007; Yamada and Cukierman, 2007).
Abstract:
This paper is concerned with the unsupervised learning of object representations by fusing visual and motor information. The problem is posed for a mobile robot that develops its representations as it incrementally gathers data. The scenario is problematic as the robot has only limited information at each time step with which to generate and update its representations. Object representations are refined as multiple instances of sensory data are presented; however, it is uncertain whether two data instances correspond to the same object. This process can easily diverge from stability. The premise of the presented work is that a robot's motor information instigates the successful generation of visual representations. An understanding of self-motion enables a prediction to be made before performing an action, resulting in a stronger belief in data association. The system is implemented as a data-driven partially observable semi-Markov decision process. Object representations are formed as the process's hidden states and are coordinated with motor commands through state transitions. Experiments show that the prediction process is essential in enabling the unsupervised learning method to converge to a solution, improving precision and recall over using sensory data alone.
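The sketch below is not the paper's partially observable semi-Markov formulation; it is a toy illustration, with hypothetical feature vectors, of how a motion-based prediction can aid data association by matching a new observation against the predicted appearance rather than the stored representations alone.

```python
import numpy as np

class ObjectModel:
    """A simple stand-in for an object representation: a running mean of
    observed feature vectors."""
    def __init__(self, feature):
        self.mean = np.asarray(feature, dtype=float)
        self.count = 1

    def update(self, feature):
        self.count += 1
        self.mean += (np.asarray(feature) - self.mean) / self.count

def associate(objects, predicted_feature, threshold=1.0):
    """Data association aided by a motor-based prediction: the robot predicts
    what it should see after its own motion and matches stored
    representations against that prediction."""
    if not objects:
        return None
    dists = [np.linalg.norm(o.mean - predicted_feature) for o in objects]
    best = int(np.argmin(dists))
    return best if dists[best] < threshold else None

# Hypothetical loop step: predict, observe, then either update or create.
objects = [ObjectModel([1.0, 0.0])]
predicted = np.array([1.05, 0.05])   # expected view after the commanded motion
observed  = np.array([1.1, 0.1])
idx = associate(objects, predicted)
if idx is None:
    objects.append(ObjectModel(observed))   # treat as a new object
else:
    objects[idx].update(observed)           # refine the existing representation
print(len(objects), objects[0].mean)
```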
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracies on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after it. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that can be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission to a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
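MCDTK's internals are not reproduced here; the sketch below only illustrates, with hypothetical dose grids, the step of combining per-beam Monte Carlo results according to the planned monitor units and crudely comparing the result against a planning-system dose grid.

```python
import numpy as np

def combine_beam_doses(beam_doses, monitor_units):
    """Combine per-beam Monte Carlo dose grids into one 3D distribution,
    weighting each beam by its planned monitor units. Assumes every grid is
    already normalized to dose per monitor unit on the same voxel grid."""
    total = np.zeros_like(beam_doses[0])
    for dose, mu in zip(beam_doses, monitor_units):
        total += dose * mu
    return total

def agreement_fraction(mc_dose, tps_dose, tolerance=0.03):
    """Very crude check: fraction of voxels where the Monte Carlo dose is
    within `tolerance` (relative) of the planning-system dose.
    A real verification would use a gamma analysis."""
    ref = np.maximum(tps_dose, 1e-9)
    within = np.abs(mc_dose - tps_dose) / ref <= tolerance
    return float(np.mean(within))

# Hypothetical two-beam plan on a small dose grid (Gy per MU).
grid = (10, 10, 10)
beam_doses = [np.random.rand(*grid) * 0.01, np.random.rand(*grid) * 0.01]
monitor_units = [120.0, 95.0]
mc = combine_beam_doses(beam_doses, monitor_units)
tps = mc * 1.01   # pretend planning-system dose for illustration
print(mc.shape, round(agreement_fraction(mc, tps), 2))
```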
Abstract:
The increasing use of computerized systems in our daily lives creates new adversarial opportunities in which complex mechanisms are exploited, fuelling the rapid development of new attacks. Behavioral biometrics appears to be one of the promising responses to these attacks. However, as it is a relatively new research area, specific frameworks for the evaluation and development of behavioral biometric solutions are not yet available. In this paper we present the conception of a generic framework and runtime environment that will enable researchers to develop, evaluate and compare their behavioral biometric solutions through repeatable experiments, under the same conditions and with the same data.
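As an illustration of what such a generic framework might look like, the sketch below defines a hypothetical common interface for behavioral biometric solutions and a single evaluation routine that runs any implementation against the same data split, so that experiments are repeatable and comparable.

```python
from abc import ABC, abstractmethod

class BiometricSolution(ABC):
    """Interface a behavioral-biometric method would implement so that
    different solutions can be run and compared under identical conditions."""

    @abstractmethod
    def enroll(self, user_id, samples):
        """Build a reference template for a user from training samples."""

    @abstractmethod
    def verify(self, user_id, sample) -> bool:
        """Return True if the sample is accepted as the claimed user."""

def evaluate(solution, train_data, test_data):
    """Run the same train/test split against any solution and report
    false rejection and false acceptance rates."""
    for user_id, samples in train_data.items():
        solution.enroll(user_id, samples)

    false_rejects = false_accepts = genuine = impostor = 0
    for claimed_id, true_id, sample in test_data:
        accepted = solution.verify(claimed_id, sample)
        if claimed_id == true_id:
            genuine += 1
            false_rejects += not accepted
        else:
            impostor += 1
            false_accepts += accepted
    return {"FRR": false_rejects / max(genuine, 1),
            "FAR": false_accepts / max(impostor, 1)}
```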
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform the authentication process only once, initially. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process: iii-2) how someone behaves. In recent years various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability issues, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
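A minimal sketch of a free-text keystroke-dynamics check is shown below, assuming hypothetical digraph flight-time features and a simple deviation score; real systems would use richer features and tuned thresholds to reach acceptable error rates and response times.

```python
import statistics
from collections import defaultdict

def digraph_times(keystrokes):
    """Extract digraph (key-pair) flight times from a stream of
    (key, press_time_ms) events, as used in free-text approaches."""
    times = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(keystrokes, keystrokes[1:]):
        times[(k1, k2)].append(t2 - t1)
    return times

def build_profile(keystrokes):
    """Reference profile: mean flight time per digraph."""
    return {dg: statistics.mean(v) for dg, v in digraph_times(keystrokes).items()}

def verification_score(profile, recent_keystrokes):
    """Mean relative deviation of recent typing from the profile;
    lower means the current user types like the enrolled user."""
    deviations = []
    for dg, samples in digraph_times(recent_keystrokes).items():
        if dg in profile and profile[dg] > 0:
            for s in samples:
                deviations.append(abs(s - profile[dg]) / profile[dg])
    return statistics.mean(deviations) if deviations else float("inf")

# Hypothetical enrolment stream and a recent verification window.
enrol  = [("t", 0), ("h", 110), ("e", 205), ("t", 400), ("h", 505), ("e", 600)]
window = [("t", 0), ("h", 120), ("e", 230)]
profile = build_profile(enrol)
print(round(verification_score(profile, window), 3))  # accept if below a tuned threshold
```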