677 results for "implementation method"
Abstract:
Urban design that harnesses natural features (such as green roofs and green walls) to improve design outcomes is gaining significant interest, particularly as there is growing evidence of links between human health and wellbeing, and contact with nature. The use of such natural features can provide many significant benefits, such as reduced urban heat island effects, reduced peak energy demand for building cooling, enhanced stormwater attenuation and management, and reduced air pollution and greenhouse gas emissions. The principle of harnessing natural features as functional design elements, particularly in buildings, is becoming known as ‘biophilic urbanism’. Given the potential for global application and benefits for cities from biophilic urbanism, and the growing number of successful examples of this, it is timely to develop enabling policies that help overcome current barriers to implementation. This paper describes a basis for inquiry into policy considerations related to increasing the application of biophilic urbanism. The paper draws on research undertaken as part of the Sustainable Built Environment National Research Centre (SBEnrc) in Australia in partnership with the Western Australian Department of Finance, Parsons Brinckerhoff, Green Roofs Australasia, and Townsville City Council (CitySolar Program). The paper discusses the emergence of a qualitative, mixed-method approach that combines an extensive literature review, stakeholder workshops and interviews, and a detailed study of leading case studies. It highlights the importance of experiential and contextual learnings to inform biophilic urbanism and provides a structure to distil such learnings to benefit other applications.
Abstract:
The management and improvement of business processes are a core topic of the information systems discipline. The persistent demand in corporations within all industry sectors for increased operational efficiency and innovation, an emerging set of established and evaluated methods, tools, and techniques, as well as the quickly growing body of academic and professional knowledge, are indicative of the standing that Business Process Management (BPM) has nowadays. During the last decades, intensive research has been conducted with respect to the design, implementation, execution, and monitoring of business processes. Comparatively little attention, however, has been paid to questions related to organizational issues such as the adoption, usage, implications, and overall success of BPM approaches, technologies, and initiatives. This research gap motivated us to edit a corresponding special focus issue for the journal BISE/WIRTSCHAFTSINFORMATIK. We are happy to present a selection of three research papers and a state-of-the-art paper in the scientific section of the issue at hand. As these papers differ in the topics they investigate, the research methods they apply, and the theoretical foundations they build on, the diversity within the BPM field becomes evident. The academic papers are complemented by an interview with Phil Gilbert, IBM’s Vice President for Business Process and Decision Management, who reflects on the relationship between business processes and the data flowing through them, the need to establish a process context for decision making, and the calibration of BPM efforts toward executives who see processes as a means to an end, rather than a first-order concept in its own right.
Abstract:
The regulation of overweight trucks is of increasing importance. Quickly growing heavy vehicle volumes contribute disproportionately to roadway damage. Rising maintenance costs and compromised road safety are also becoming a major concern for managing agencies. Pavement wear is minimized by regulating overloaded trucks on major highways at weigh stations. However, due to lengthy inspections and insufficient capacities, weigh stations tend to be inefficient. New practices, called preclearance programs, using Radio Frequency Identification (RFID) transponders and weigh-in-motion technologies, have been set up in a number of countries. The primary aim of this study is to investigate the current issues with regard to the implementation and operation of preclearance programs. The State of Queensland, Australia, is used as a case study. The investigation focuses on four aspects. The first emphasizes identifying the need for improvement of the current regulation programs in Queensland. Second, the operators of existing preclearance programs are interviewed about their lessons learned and the marketing strategies used for promoting their programs. Third, trucking companies in Queensland are interviewed about their experiences with the current weighing practices and their attitudes toward a potential preclearance system. Finally, the estimated benefit of preclearance program deployment in Queensland is analyzed. The final part brings these four parts together and provides the study findings and recommendations. The framework and study findings could be valuable inputs for other roadway agencies considering a similar preclearance program or looking to promote their existing ones.
Abstract:
Theme Paper for Curriculum innovation and enhancement theme AIM: This paper reports on a research project that trialled an educational strategy implemented in an undergraduate nursing curriculum. The project aimed to explore the effectiveness of ‘think aloud’ as a strategy for improving clinical reasoning for students in simulated clinical settings. BACKGROUND: Nurses are required to apply and utilise critical thinking skills to enable clinical reasoning and problem solving in the clinical setting (Lasater, 2007). Nursing students are expected to develop and display clinical reasoning skills in practice, but may struggle to articulate the reasons behind decisions about patient care. The ‘think aloud’ approach is an innovative learning/teaching method which can create an environment suitable for developing clinical reasoning skills in students (Banning, 2008; Lee and Ryan-Wenger, 1997). This project used the ‘think aloud’ strategy within a simulation context to provide a safe learning environment in which third year students were assisted to uncover cognitive approaches to assist in making effective patient care decisions, and to improve their confidence, clinical reasoning and active critical reflection about their practice. METHODS: In Semester 2, 2011 at QUT, third year nursing students undertook high fidelity simulation (some for the first time), commencing in September 2011. There were two cohorts for strategy implementation (group 1 used ‘think aloud’ as a strategy within the simulation; group 2 used no specific strategy beyond the nursing assessment frameworks used by all students) in relation to problem solving patient needs. The ‘think aloud’ strategy was described to students in their pre-simulation briefing, with time allowed for clarification of this strategy. All other aspects of the simulations remained the same (resources, suggested nursing assessment frameworks, simulation session duration, size of simulation teams, preparatory materials).
Ethics approval has been obtained for this project. RESULTS: Results of a qualitative analysis (in progress; to be completed by March 2012) of student and facilitator reports on students’ ability to meet the learning objectives of solving patient problems using clinical reasoning and their experience with the ‘think aloud’ method will be presented. A comparison of clinical reasoning learning outcomes between the two groups will determine the effect on clinical reasoning for students responding to patient problems. CONCLUSIONS: In an environment of increasingly constrained clinical placement opportunities, exploration of alternative strategies to improve critical thinking skills and develop clinical reasoning and problem solving for nursing students is imperative in preparing nurses to respond to changing patient needs.
Abstract:
Knowledge has been widely recognised as a determinant of business performance. Business capabilities require effective sharing of resources and knowledge. Specifically, knowledge sharing (KS) between different companies and departments can improve manufacturing processes, since intangible knowledge plays an essential role in achieving competitive advantage. This paper presents a mixed method research study into the impact of KS on the effectiveness of new product development (NPD) in achieving desired business performance (BP). Firstly, an empirical study utilising moderated regression analysis was conducted to test whether, and to what extent, KS has leveraging power on the relationship between NPD and BP constructs and variables. Secondly, this empirically verified hypothesis was validated through explanatory case studies involving two Taiwanese manufacturing companies, using a qualitative interaction term pattern matching technique. The study provides evidence that knowledge sharing and management activities are essential for deriving competitive advantage in the manufacturing industry.
Abstract:
We consider a two-dimensional space-fractional reaction diffusion equation with a fractional Laplacian operator and homogeneous Neumann boundary conditions. The finite volume method is used with the matrix transfer technique of Ilić et al. (2006) to discretise in space, yielding a system of equations that requires the action of a matrix function to solve at each timestep. Rather than form this matrix function explicitly, we use Krylov subspace techniques to approximate the action of this matrix function. Specifically, we apply the Lanczos method, after a suitable transformation of the problem to recover symmetry. To improve the convergence of this method, we utilise a preconditioner that deflates the smallest eigenvalues from the spectrum. We demonstrate the efficiency of our approach for a fractional Fisher’s equation on the unit disk.
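The key computational idea here, approximating the action of a matrix function on a vector from a small Lanczos (Krylov) projection rather than forming the matrix function explicitly, can be sketched in a few lines. The following is a generic, pure-Python illustration, not the authors' implementation: the test matrix, the choice f = exp (rather than the fractional-Laplacian matrix function of the paper), the Taylor evaluation of the small matrix exponential, and the absence of the deflation preconditioner are all simplifying assumptions.

```python
import math

def matvec(A, x):
    # dense matrix-vector product for a matrix stored as nested lists
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def expm_taylor(T, terms=40):
    # exponential of a small dense matrix via a truncated Taylor series
    n = len(T)
    E = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    P = [row[:] for row in E]
    for k in range(1, terms):
        P = [[sum(P[i][l] * T[l][j] for l in range(n)) / k
              for j in range(n)] for i in range(n)]
        E = [[E[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    return E

def lanczos_expv(A, b, m):
    """Approximate exp(A) b using m Lanczos steps (A symmetric):
       exp(A) b ~ ||b|| * V_m * exp(T_m) * e_1."""
    n = len(b)
    beta0 = math.sqrt(sum(x * x for x in b))
    V = [[x / beta0 for x in b]]
    alphas, betas = [], []
    for j in range(m):
        w = matvec(A, V[j])
        alpha = sum(wi * vi for wi, vi in zip(w, V[j]))
        w = [wi - alpha * vi for wi, vi in zip(w, V[j])]
        if j > 0:
            w = [wi - betas[-1] * vi for wi, vi in zip(w, V[j - 1])]
        alphas.append(alpha)
        if j < m - 1:
            beta = math.sqrt(sum(x * x for x in w))
            betas.append(beta)
            V.append([x / beta for x in w])
    # tridiagonal projection T_m
    T = [[0.0] * m for _ in range(m)]
    for i in range(m):
        T[i][i] = alphas[i]
        if i + 1 < m:
            T[i][i + 1] = T[i + 1][i] = betas[i]
    E = expm_taylor(T)
    return [beta0 * sum(V[j][i] * E[j][0] for j in range(m)) for i in range(n)]
```

The point of the construction is that only matrix-vector products with A are needed; the matrix function is evaluated on the small tridiagonal T_m, whose dimension m is typically far below n.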
Abstract:
This paper presents a formal methodology for attack modeling and detection for networks. Our approach has three phases. First, we extend the basic attack tree approach [1] to capture (i) the temporal dependencies between components, and (ii) the expiration of an attack. Second, using the enhanced attack trees (EAT), we build a tree automaton that accepts a sequence of actions from the input stream if there is a traversal of an attack tree from the leaves to the root node. Finally, we show how to construct an enhanced parallel automaton (EPA) that has each tree automaton as a subroutine and can process the input stream by considering multiple trees simultaneously. As a case study, we show how to represent the attacks in IEEE 802.11 and construct an EPA for it.
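To make the detection idea concrete, the sketch below checks when a stream of observed actions first satisfies an attack tree built from AND/OR nodes over leaf actions. It is a deliberately simplified Python illustration, not the paper's construction: it omits the temporal-dependency and expiration extensions, and the tree and the action names are hypothetical.

```python
# A node is ("leaf", action), ("and", children) or ("or", children).
def satisfied(node, seen):
    # recursively evaluate the tree against the set of observed actions
    kind, payload = node
    if kind == "leaf":
        return payload in seen
    if kind == "and":
        return all(satisfied(c, seen) for c in payload)
    return any(satisfied(c, seen) for c in payload)  # "or" node

def detect(tree, stream):
    """Return the index of the action that first completes the tree
    (leaves-to-root), or -1 if the stream never satisfies the root."""
    seen = set()
    for i, action in enumerate(stream):
        seen.add(action)
        if satisfied(tree, seen):
            return i
    return -1
```

A real monitor would run many such trees in parallel over the stream, which is what the enhanced parallel automaton generalises.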
Abstract:
We have compared the effects of different sterilization techniques on the properties of Bombyx mori silk fibroin thin films with a view to subsequent use for corneal tissue engineering. The transparency, tensile properties, corneal epithelial cell attachment and degradation of the films were used to evaluate the suitability of several sterilization techniques, including gamma-irradiation (in air or nitrogen), steam treatment and immersion in aqueous ethanol. The investigations showed that gamma-irradiation, performed either in air or in a nitrogen atmosphere, did not significantly alter the properties of the films. The films sterilized by gamma-irradiation or by immersion in ethanol had a transparency greater than 98% and tensile properties comparable to human cornea and amniotic membrane, the materials of choice in the reconstruction of the ocular surface. Although steam-sterilization produced stronger, stiffer films, they were less transparent, and cell attachment was affected by the variable topography of these films. It was concluded that gamma-irradiation should be considered the most suitable method for the sterilization of silk fibroin films; however, treatment with ethanol is also an acceptable method.
Abstract:
Background: Predicting protein subnuclear localization is a challenging problem. Some previous works, based on non-sequence information including Gene Ontology annotations and kernel fusion, have their respective limitations. The aim of this work is twofold: one is to propose a novel individual feature extraction method; the other is to develop an ensemble method to improve prediction performance using comprehensive information represented in the form of a high dimensional feature vector obtained by 11 feature extraction methods. Methodology/Principal Findings: A novel two-stage multiclass support vector machine is proposed to predict protein subnuclear localizations. It only considers those feature extraction methods based on amino acid classifications and physicochemical properties. In order to speed up our system, an automatic search method for the kernel parameter is used. The prediction performance of our method is evaluated on four datasets: the Lei dataset, a multi-localization dataset, the SNL9 dataset and a new independent dataset. In leave-one-out cross validation, the overall accuracy of prediction for 6 localizations on the Lei dataset is 75.2%, that for 9 localizations on the SNL9 dataset is 72.1%, and the accuracies for the multi-localization dataset and the new independent dataset are 71.7% and 69.8%, respectively. Comparisons with existing methods show that our method performs better for both single-localization and multi-localization proteins and achieves more balanced sensitivities and specificities on large-size and small-size subcellular localizations. The overall accuracy improvements are 4.0% and 4.7% for single-localization proteins and 6.5% for multi-localization proteins. The reliability and stability of our classification model are further confirmed by permutation analysis. Conclusions: It can be concluded that our method is effective and valuable for predicting protein subnuclear localizations. A web server has been designed to implement the proposed method.
It is freely available at http://bioinformatics.awowshop.com/snlpred_page.php.
Abstract:
A new deterministic method for predicting simultaneous inbreeding coefficients at three and four loci is presented. The method involves calculating the conditional probability of IBD (identical by descent) at one locus given IBD at other loci, and multiplying this probability by the prior probability of the latter loci being simultaneously IBD. The conditional probability is obtained by applying a novel regression model, and the prior probability from the theory of digenic measures of Weir and Cockerham. The model was validated for a finite monoecious population mating at random, with a constant effective population size, and with or without selfing, and also for an infinite population with a constant intermediate proportion of selfing. We assumed discrete generations. Deterministic predictions were very accurate when compared with simulation results, and robust to alternative forms of implementation. These simultaneous inbreeding coefficients were more sensitive to changes in effective population size than in marker spacing. Extensions to predict simultaneous inbreeding coefficients at more than four loci are now possible.
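The chain-rule step at the heart of the method, multiplying a conditional IBD probability by a prior joint probability, is simple to state in code. The numerical values below are hypothetical placeholders, not results from the paper; in the method itself the conditional term comes from the regression model and the prior from Weir and Cockerham's digenic measures.

```python
def three_locus_ibd(prior_two_locus, cond_third_given_two):
    """P(A, B, C simultaneously IBD)
       = P(C IBD | A, B IBD) * P(A, B IBD)."""
    return cond_third_given_two * prior_two_locus

# hypothetical illustrative values only
p_ab = 0.12          # prior: probability that loci A and B are jointly IBD
p_c_given_ab = 0.55  # conditional P(C IBD | A, B IBD), from the regression model
p_abc = three_locus_ibd(p_ab, p_c_given_ab)
```

The four-locus coefficient follows the same pattern, conditioning one locus on the other three being simultaneously IBD.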
Abstract:
This paper presents a method for investigating ship emissions, the plume capture and analysis system (PCAS), and its application in measuring airborne pollutant emission factors (EFs) and particle size distributions. The current investigation was conducted in situ, aboard two dredgers (the Amity, a cutter suction dredger, and the Brisbane, a hopper suction dredger), but the PCAS is also capable of performing such measurements remotely at a distant point within the plume. EFs were measured relative to the fuel consumption using the fuel combustion derived plume CO2. All plume measurements were corrected by subtracting background concentrations sampled regularly from upwind of the stacks. Each measurement typically took 6 minutes to complete, and during one day, 40 to 50 measurements were possible. The relationship between the EFs and plume sample dilution was examined to determine the plume dilution range over which the technique could deliver consistent results when measuring EFs for particle number (PN), NOx, SO2 and PM2.5, within a targeted dilution factor range of 50-1000 suitable for remote sampling. The EFs for NOx, SO2 and PM2.5 were found to be independent of dilution for dilution factors within that range. The EF measurement for PN was corrected for coagulation losses by applying a time dependent particle loss correction to the particle number concentration data. For the Amity, the EF ranges were PN: 2.2-9.6 × 10^15 (kg-fuel)^-1; NOx: 35-72 g(NO2).(kg-fuel)^-1; SO2: 0.6-1.1 g(SO2).(kg-fuel)^-1; and PM2.5: 0.7-6.1 g(PM2.5).(kg-fuel)^-1. For the Brisbane they were PN: 1.0-1.5 × 10^16 (kg-fuel)^-1; NOx: 3.4-8.0 g(NO2).(kg-fuel)^-1; SO2: 1.3-1.7 g(SO2).(kg-fuel)^-1; and PM2.5: 1.2-5.6 g(PM2.5).(kg-fuel)^-1. The results are discussed in terms of the operating conditions of the vessels’ engines. Particle number emission factors as a function of size, as well as the count median diameter (CMD) and geometric standard deviation of the size distributions, are provided.
The size distributions were found to be consistently uni-modal in the range below 500 nm, and this mode was within the accumulation mode range for both vessels. The representative CMDs for the various activities performed by the dredgers ranged from 94-131 nm in the case of the Amity, and 58-80 nm for the Brisbane. A strong inverse relationship between CMD and EF(PN) was observed.
Abstract:
For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step, but rather require the computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, the approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations.
The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to the Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur. In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale).
The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
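The exponential Euler update described in this abstract, u_{n+1} = u_n + h φ(hJ) F(u_n) with φ(z) = (e^z - 1)/z, can be sanity-checked on a scalar linear problem, where each step reproduces the exact flow map. This pure-Python sketch is an illustration only; the thesis applies the method to large nonlinear systems with Krylov approximation of φ, none of which appears here.

```python
import math

def phi(z):
    # phi(z) = (e^z - 1)/z, with the z -> 0 limit handled explicitly
    return (math.exp(z) - 1.0) / z if abs(z) > 1e-12 else 1.0 + z / 2.0

def exponential_euler(a, c, u0, h, steps):
    """Integrate u' = a*u + c with the exponential Euler method:
       u_{n+1} = u_n + h * phi(h*a) * (a*u_n + c)."""
    u = u0
    for _ in range(steps):
        u = u + h * phi(h * a) * (a * u + c)
    return u
```

Because u' = au + c is linear and autonomous, each EEM step equals the exact solution operator over the step, so the scheme is exact here regardless of step size, which makes this a convenient unit test when implementing φ.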
Abstract:
An analytical method for the detection of carbonaceous gases by a non-dispersive infrared (NDIR) sensor has been developed. The calibration plots of six carbonaceous gases, including CO2, CH4, CO, C2H2, C2H4 and C2H6, were obtained and the reproducibility determined to verify the feasibility of this gas monitoring method. The results show that the squared correlation coefficients for the six gas measurements are greater than 0.999. The reproducibility is excellent, indicating that this analytical method is useful for determining the concentrations of carbonaceous gases.
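A calibration plot of this kind reduces to an ordinary least-squares line and its squared correlation coefficient. The sketch below shows that computation in plain Python; the concentration/response values are made-up placeholders, not data from the study.

```python
def linear_fit_r2(x, y):
    """Least-squares line y = m*x + b and squared correlation r^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)          # sum of squares of x
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)          # sum of squares of y
    m = sxy / sxx
    b = my - m * mx
    r2 = sxy * sxy / (sxx * syy)                   # squared correlation
    return m, b, r2

# hypothetical calibration points: concentration (ppm) vs. detector response
conc = [0.0, 100.0, 200.0, 300.0, 400.0]
resp = [0.002, 0.051, 0.103, 0.149, 0.201]
slope, intercept, r2 = linear_fit_r2(conc, resp)
```

An r^2 above 0.999, as reported in the abstract, corresponds to a calibration line that explains essentially all of the variance in the detector response.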
Abstract:
Throughout the Asia Pacific Economic Cooperation (APEC) and across the world, riders and passengers of motorcycles and scooters are among the most vulnerable road users. Such vulnerability is especially pertinent for nations that more often use motorcycles and scooters as a method of transportation. In developing effective countermeasures to reduce motorcycle and scooter death and injury across multiple regions, consideration is required of various situational and socio-cultural factors that vary across APEC economies. This compendium aims to facilitate implementation of best practice countermeasures to improve motorcycle safety in APEC member economies.
Abstract:
Throughout the Asia Pacific Economic Cooperation (APEC) and across the world, riders and passengers of motorcycles and scooters are among the most vulnerable road users. There is a considerably greater likelihood of death or injury from use of these vehicles compared with other motor vehicles. Such vulnerability is especially pertinent for economies that more often use motorcycles and scooters as a method of transportation. In developing effective countermeasures to reduce motorcycle and scooter death and injury across multiple regions, consideration is required of various situational and socio-cultural factors that vary across APEC economies. The study presented here sought to understand important motorcycle and scooter safety issues across APEC economies and any current barriers that might exist to implementing potentially effective countermeasures. This is the first stage of a wider project which also includes a literature review and, ultimately, the production of a compendium to facilitate implementation of best practice countermeasures.