343 results for requirement


Relevance:

10.00%

Publisher:

Abstract:

Due to the huge number of available Web services, finding an appropriate Web service that meets a service consumer's requirements is still a challenge. Moreover, a single Web service is sometimes unable to fully satisfy those requirements. In such cases, combinations of multiple inter-related Web services can be utilised. This paper proposes a method that first uses a semantic kernel model to find related services and then models these related Web services as nodes of a graph. An all-pair shortest-path algorithm is applied to find the best compositions of Web services that are semantically related to the service consumer's requirement. Finally, both individual Web services and Web service compositions are recommended for a service request. Empirical evaluation confirms that the proposed method significantly improves the accuracy of service discovery in comparison to traditional keyword-based discovery methods.
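The all-pair shortest-path step can be sketched as follows: with related services as graph nodes and semantic distances as edge weights, Floyd-Warshall yields the cheapest composition chain between any two services. All service names and weights below are illustrative, not from the paper.

```python
from itertools import product

# Hypothetical semantic-distance graph between related Web services.
INF = float("inf")
services = ["FlightSearch", "HotelSearch", "Payment", "Invoice"]
dist = {(a, b): (0.0 if a == b else INF) for a, b in product(services, services)}
# Lower weight = stronger semantic relatedness between two services.
edges = {("FlightSearch", "HotelSearch"): 0.3,
         ("HotelSearch", "Payment"): 0.2,
         ("FlightSearch", "Payment"): 0.9,
         ("Payment", "Invoice"): 0.1}
for (a, b), w in edges.items():
    dist[(a, b)] = dist[(b, a)] = w

nxt = {(a, b): b for a, b in product(services, services)}

# Floyd-Warshall: all-pair shortest paths over the service graph.
for k, i, j in product(services, services, services):
    if dist[(i, k)] + dist[(k, j)] < dist[(i, j)]:
        dist[(i, j)] = dist[(i, k)] + dist[(k, j)]
        nxt[(i, j)] = nxt[(i, k)]

def composition(src, dst):
    """Reconstruct the best (cheapest) composition chain from src to dst."""
    path = [src]
    while path[-1] != dst:
        path.append(nxt[(path[-1], dst)])
    return path

print(composition("FlightSearch", "Invoice"))
# -> ['FlightSearch', 'HotelSearch', 'Payment', 'Invoice']
```

Here the direct FlightSearch-Payment edge (0.9) loses to the chain via HotelSearch (0.3 + 0.2), so the recommended composite service is the four-node chain.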

Relevance:

10.00%

Publisher:

Abstract:

Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message into a fixed-length output referred to as the hash value. As a security requirement, a hash value should not serve as the image of two distinct input messages, and it should be difficult to recover the input message from a given hash value. Secure hash functions provide data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), provide data integrity and data-origin authentication in the secret-key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic, or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
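These properties can be illustrated with Python's standard hashlib and hmac modules: a minimal sketch (the messages and keys are arbitrary) showing the fixed-length output of a Merkle–Damgård-style hash and a keyed hash (HMAC) for data-origin authentication.

```python
import hashlib
import hmac

# Arbitrary-length input, fixed-length output: SHA-256 always yields 32 bytes.
for msg in (b"a", b"a much longer input message" * 100):
    digest = hashlib.sha256(msg).digest()
    assert len(digest) == 32

# A keyed hash (MAC): the same message under different keys gives different
# tags, providing data-origin authentication in the secret-key setting.
tag1 = hmac.new(b"secret-key-1", b"payload", hashlib.sha256).hexdigest()
tag2 = hmac.new(b"secret-key-2", b"payload", hashlib.sha256).hexdigest()
assert tag1 != tag2

# Constant-time comparison avoids timing side channels when verifying a tag.
assert hmac.compare_digest(tag1, tag1)
```

Note that MD5 and SHA-1, mentioned above as Merkle–Damgård designs, are no longer collision-resistant; SHA-256 is used here as the standard-library example.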

Relevance:

10.00%

Publisher:

Abstract:

Post-traumatic stress disorder (PTSD) is a serious medical condition affecting both military and civilian populations. While its etiology remains poorly understood, it is characterized by high and prolonged levels of fear responding. One biological unknown is whether individuals expressing high or low conditioned fear encode the memory differently and whether that difference underlies the fear response. In this study we examined cellular mechanisms that underlie high and low conditioned fear behavior using an advanced intercrossed mouse line (B6D2F1) selected for high and low Pavlovian fear response. Phosphorylated mitogen-activated protein kinase (p44/42 (ERK) MAPK; pMAPK) in the lateral amygdala (LA), a known requirement for consolidation of fear memory, is a reliable marker of fear learning-related plasticity. We asked whether high and low conditioned fear behavior is associated with differential pMAPK expression in the LA and, if so, whether this is due to an increase in the number of neurons expressing pMAPK or to increased pMAPK per neuron. To examine this, we quantified pMAPK-expressing neurons in the LA at baseline and following Pavlovian fear conditioning. Results indicate that high-fear-phenotype mice have more pMAPK-expressing neurons in the LA. This finding suggests that increased endogenous plasticity in the LA may be a component of higher conditioned fear responses and begins to explain at the cellular level how different fear responders encode fear memories. Understanding how high and low fear responders encode fear memory will help identify novel ways in which the risk of fear-related illness can be better predicted and treated.

Relevance:

10.00%

Publisher:

Abstract:

The requirement for isolated relays is one of the prime obstacles to utilizing sequential slotted cooperative protocols in Vehicular Ad-hoc Networks (VANETs). Significant research has been devoted to improving the diversity-multiplexing trade-off (DMT) of cooperative protocols in conventional mobile networks, with little attention to vehicular ad-hoc networks. We extend the concept of sequential slotted amplify-and-forward (SAF) protocols to urban vehicular ad-hoc networks. Multiple Input Multiple Output (MIMO) reception is used at relaying vehicular nodes to isolate the relays effectively. The proposed approach adds pragmatic value to sequential slotted cooperative protocols while achieving attractive performance gains in urban VANETs. We analyse the DMT bounds and the outage probabilities of the proposed scheme. The results suggest that the proposed scheme can achieve an optimal DMT matching the DMT upper bound of the sequential SAF. Furthermore, the proposed scheme outperforms the SAF protocol in outage performance by 2.5 dB at a target outage probability of 10^-4.

Relevance:

10.00%

Publisher:

Abstract:

Knowledge of the pollutant build-up process is a key requirement for developing stormwater pollution mitigation strategies. In this context, process variability is a concept that needs to be understood in depth. Analysis of particulate build-up on three road surfaces in an urban catchment confirmed that particles <150 µm and >150 µm have characteristically different build-up patterns, and these patterns are consistent over different field conditions. Three theoretical build-up patterns were developed based on the size-fractionated particulate build-up patterns; these patterns explain the variability in particle behavior and the variation in particle-bound pollutant load and composition over the antecedent dry period. The behavioral variability of particles <150 µm was found to exert the most significant influence on build-up process variability. As the characterization of process variability is particularly important in stormwater quality modeling, it is recommended that the influence of the behavioral variability of particles <150 µm on pollutant build-up be specifically addressed. This would eliminate model deficiencies in replicating the build-up process, facilitate accounting for the inherent process uncertainty, and thereby enhance water quality predictions.
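Size-fractionated build-up over a dry period is commonly modelled with a saturating function such as B(t) = B_max(1 - e^(-kt)), the exponential build-up form used in SWMM-style stormwater models. The sketch below uses that generic form with invented parameters for the two size fractions; the actual fitted patterns are those reported in the paper, not these.

```python
import math

def build_up(t_days, b_max, k):
    """Exponential pollutant build-up over the antecedent dry period:
    B(t) = b_max * (1 - exp(-k * t)). b_max is the asymptotic load,
    k the build-up rate constant (1/day)."""
    return b_max * (1.0 - math.exp(-k * t_days))

# Hypothetical parameter sets for the two size fractions, reflecting the
# finding that <150 um and >150 um particles build up differently.
fine = dict(b_max=60.0, k=0.4)     # <150 um: approaches its maximum faster
coarse = dict(b_max=140.0, k=0.1)  # >150 um: slower, larger asymptote

for t in (1, 3, 7, 14):
    print(f"day {t:2d}: fine {build_up(t, **fine):6.1f}"
          f"  coarse {build_up(t, **coarse):6.1f}")
```

With these illustrative constants, the fine fraction reaches most of its asymptote within a few dry days while the coarse fraction keeps accumulating, so the composition of the particle-bound pollutant load shifts over the antecedent dry period.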

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we derive a new nonlinear two-sided space-fractional diffusion equation with variable coefficients from the fractional Fick's law. A semi-implicit difference method (SIDM) for this equation is proposed, and the stability and convergence of the SIDM are discussed. For the implementation, we develop a fast, accurate iterative method for the SIDM by decomposing the dense coefficient matrix into a combination of Toeplitz-like matrices. This fast iterative method significantly reduces the storage requirement from O(n²) to O(n) and the computational cost from O(n³) to O(n log n), where n is the number of grid points, while retaining the same accuracy as the underlying SIDM solved with Gaussian elimination. Finally, some numerical results are shown to verify the accuracy and efficiency of the new method.
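The Toeplitz structure is what enables the O(n log n) cost: a Toeplitz matrix is determined by its first row and column alone (O(n) storage), and its matrix-vector product can be computed via FFTs by embedding it in a circulant matrix. A self-contained sketch of that standard trick, checked against the dense O(n²) product on a toy matrix (not the SIDM coefficient matrix):

```python
import cmath

def fft(a, invert=False):
    """Radix-2 Cooley-Tukey FFT (input length must be a power of two)."""
    n = len(a)
    if n == 1:
        return a[:]
    sign = 1 if invert else -1
    even, odd = fft(a[0::2], invert), fft(a[1::2], invert)
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k], out[k + n // 2] = even[k] + w, even[k] - w
    return out

def ifft(a):
    return [x / len(a) for x in fft(a, invert=True)]

def toeplitz_matvec(c, r, x):
    """Multiply an n x n Toeplitz matrix T (first column c, first row r,
    with c[0] == r[0]) by x in O(n log n): embed T in a 2n x 2n circulant
    matrix, whose action diagonalises under the FFT."""
    n = len(x)
    col = list(c) + [0.0] + list(r[1:])[::-1]   # first column of the circulant
    fa = fft([complex(v) for v in col])
    fb = fft([complex(v) for v in x] + [0j] * n)
    y = ifft([a * b for a, b in zip(fa, fb)])
    return [v.real for v in y[:n]]

# Check against the dense product on a small example (2n = 16 is a power of 2).
n = 8
c = [float(i + 1) for i in range(n)]            # first column
r = [c[0]] + [0.5 * i for i in range(1, n)]     # first row
x = [1.0] * n
dense = [sum((c[i - j] if i >= j else r[j - i]) * x[j] for j in range(n))
         for i in range(n)]
fast = toeplitz_matvec(c, r, x)
assert all(abs(a - b) < 1e-9 for a, b in zip(dense, fast))
```

Inside a Krylov-type iterative solver, each iteration needs only such matrix-vector products, which is how the paper's method avoids both the O(n²) storage and the O(n³) elimination cost.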

Relevance:

10.00%

Publisher:

Abstract:

Background: The requirement for dual screening of titles and abstracts to select papers for full-text examination can create a huge workload, not least when the topic is complex and a broad search strategy is required, resulting in a large number of search results. An automated system that reduces this burden while still assuring high accuracy has the potential to provide substantial efficiency savings within the review process. Objectives: To undertake a direct comparison of manual screening with a semi-automated process (priority screening) using a machine classifier. The research is being carried out as part of the current update of a population-level public health review. Methods: Authors have hand-selected studies for the review update, in duplicate, using the standard Cochrane Handbook methodology. A retrospective analysis simulating a quasi-'active learning' process (whereby a classifier is repeatedly retrained on 'manually' labelled data) will be completed, using different starting parameters. Tests will be carried out to see how far the choice and size of the training set affect classification performance; i.e. what percentage of papers would need to be manually screened to locate 100% of the papers included by the traditional manual method. Results: From a search retrieval set of 9555 papers, authors excluded 9494 papers at title/abstract and 52 at full text, leaving 9 papers for inclusion in the review update. The ability of the machine classifier to reduce the percentage of papers that need to be manually screened to identify all the included studies, under different training conditions, will be reported. Conclusions: The findings of this study will be presented along with an estimate of any efficiency gains for the author team if the screening process can be semi-automated using text mining methodology, together with a discussion of the implications of text mining for screening papers within complex health reviews.
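The quasi-active-learning loop can be simulated in a few lines: rank the unscreened records with a classifier trained on the labels collected so far, screen the top-ranked record, retrain, and repeat; the quantity of interest is how many records are screened before all includes are found. The corpus, vocabulary and word-overlap "classifier" below are toy stand-ins, not the review's data or the machine classifier used in the study.

```python
import random

random.seed(1)

# Toy corpus: 200 irrelevant records plus 5 includes sharing a topic vocabulary.
filler = ["cohort", "survey", "trial", "cells", "policy", "imaging"]
corpus = [(set(random.sample(filler, 3)), False) for _ in range(200)]
corpus += [({"exercise", "obesity", "children"}, True) for _ in range(5)]
random.shuffle(corpus)

def score(words, positive_vocab):
    """Crude relevance score: word overlap with known includes."""
    return len(words & positive_vocab)

screened, found, positive_vocab = 0, 0, set()
remaining = list(corpus)
while found < 5:
    # Re-rank by the current model; with no labels yet this is a blind pass.
    remaining.sort(key=lambda rec: score(rec[0], positive_vocab), reverse=True)
    words, is_include = remaining.pop(0)
    screened += 1
    if is_include:
        found += 1
        positive_vocab |= words   # "retrain" on the newly labelled include
print(f"screened {screened} of {len(corpus)} records to find all includes")
```

Once the first include is labelled, the remaining includes jump to the top of the ranking, so the manual effort collapses to roughly the cost of finding that first positive, which is the saving priority screening aims for.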

Relevance:

10.00%

Publisher:

Abstract:

Lipopolysaccharide is a major immunogenic structure of the pathogen Yersinia pseudotuberculosis and contains the O-specific polysaccharide (OPS) that is presented on the cell surface. The OPS contains many repeats of the oligosaccharide O-unit and exhibits a preferred modal chain length that has been shown to be crucial for cell protection in Yersinia. It is well established that the Wzz protein determines the preferred chain length of the OPS; in its absence, the polymerization of O units by the Wzy polymerase is uncontrolled. However, a wzz mutation has never been described for Y. pseudotuberculosis. In this study, we examine the effect of Wzz loss in Y. pseudotuberculosis serotype O:2a and compare the lipopolysaccharide chain-length profile to that of Escherichia coli serotype O111. In the absence of Wzz, the lipopolysaccharides of the two species showed significant differences in Wzy polymerization: Y. pseudotuberculosis O:2a exhibited only OPS with very short chain lengths, which is atypical of the wzz-mutant phenotypes observed in other species. We hypothesise that the Wzy polymerase of Y. pseudotuberculosis O:2a has a unique default activity in the absence of Wzz, revealing the requirement for Wzz to drive O-unit polymerization to greater lengths.

Relevance:

10.00%

Publisher:

Abstract:

This thesis considers whether the Australian Privacy Commissioner's use of its powers supports compliance with the requirement to 'take reasonable steps' to protect personal information under National Privacy Principle 4 of the Privacy Act 1988 (Cth). Two unique lenses were used: first, the Commissioner's use of powers was assessed against the principles of transparency, balance and vigorousness; secondly, it was assessed for alignment with an industry-practice approach to securing information. Following a comprehensive review of publicly available materials, interviews and investigation file records, this thesis found that the Commissioner's use of its powers has not been transparent, balanced or vigorous, nor has it been supportive of an industry-practice approach to securing data. Accordingly, it concludes that the Privacy Commissioner's use of its regulatory powers is unlikely to result in any significant improvement to the security of personal information held by organisations in Australia.

Relevance:

10.00%

Publisher:

Abstract:

Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed into impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces changes the stormwater runoff characteristics, whilst a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems depends closely on the state of knowledge of the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which significantly underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme on pollutant build-up, an urban catchment monitoring programme on stormwater quality, and the outcomes of advanced statistical analyses provided the platform for the knowledge creation.
Two case studies and two real-world applications are discussed to illustrate the translation of the knowledge created into practical use, in relation to the role of rainfall and catchment characteristics in urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach, in which stormwater treatment systems are designed based solely on stormwater quantity data. Additionally, how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics was also investigated. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters, such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential factors which should be incorporated into modelling of catchment characteristics also include urban form and impervious surface area distribution. The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice such as hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement.
Furthermore, this monograph demonstrates how fundamental knowledge of stormwater quality processes can be translated into guidance on engineering practice, through the comprehensive application of multivariate data analysis techniques and a paradigm for the integrative use of computer models and mathematical models to derive practical outcomes.
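As a toy illustration of why treatment design benefits from quality-based rainfall classification (the event records and the two-class intensity split below are entirely hypothetical, not the book's methodology):

```python
# Group rainfall events by average intensity and ask which class carries
# most of the pollutant wash-off load.
events = [  # (depth mm, duration h, pollutant load kg) -- made-up records
    (4, 2.0, 0.8), (25, 3.0, 2.1), (8, 6.0, 1.5), (40, 4.0, 2.4),
    (3, 1.0, 0.6), (12, 8.0, 2.0), (18, 2.0, 1.7), (6, 4.0, 1.1),
]

def classify(depth, duration, threshold=3.0):
    """Two-class split on average intensity (mm/h); threshold is arbitrary."""
    return "high-intensity" if depth / duration > threshold else "low-intensity"

loads = {}
for depth, duration, load in events:
    cls = classify(depth, duration)
    loads[cls] = loads.get(cls, 0.0) + load

total = sum(load for *_, load in events)
for cls, load in sorted(loads.items()):
    print(f"{cls}: {100 * load / total:.0f}% of pollutant load")
```

In this made-up dataset the low-intensity class delivers roughly half the pollutant load while contributing well under half the rainfall depth, the kind of pattern a quantity-only design criterion would miss and a quality-based rainfall classification is intended to capture.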

Relevance:

10.00%

Publisher:

Abstract:

Wind energy, the fastest growing renewable energy source in the world today, requires a large number of wind turbines to transform wind energy into electricity. One factor driving the cost of this energy is the reliable operation of these turbines. Therefore, there is a growing requirement within the wind farm community to monitor the operation of wind turbines on a continuous basis so that a possible fault can be detected ahead of time. As a wind turbine operates in an environment of constantly changing wind speed, it is a challenging task to design a fault detection technique that can accommodate the stochastic operational behavior of the turbines. Addressing this issue, this paper proposes a novel fault detection criterion which is robust against operational uncertainty and able to quantify the severity level of a drivetrain abnormality within an operating wind turbine. A benchmark wind turbine model has been utilized to simulate a drivetrain fault condition, and the effectiveness of the proposed technique has been tested accordingly. The simulation results show that the proposed criterion exhibits consistent performance for drivetrain faults under varying wind speed and has a linear relationship with the fault severity level.
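A minimal sketch of the two desired properties, assuming a synthetic vibration signal in which wind speed scales the normal response and a fault injects energy at a known "fault frequency". All signals, frequencies and amplitudes below are invented for illustration and are not taken from the benchmark model or the paper's criterion.

```python
import math
import random

random.seed(0)

def drivetrain_signal(wind_speed, severity, n=2048):
    """Toy vibration record: the normal component scales with wind speed,
    while a drivetrain fault adds a component at a fixed fault frequency."""
    return [wind_speed * math.sin(2 * math.pi * 0.01 * t)
            + severity * math.sin(2 * math.pi * 0.07 * t)
            + 0.05 * random.gauss(0, 1)
            for t in range(n)]

def fault_indicator(signal):
    """Amplitude of the fault-frequency component, estimated by correlating
    with a reference sinusoid; the normal (wind-speed-scaled) component at a
    different frequency contributes almost nothing, making the indicator
    insensitive to the operating point."""
    n = len(signal)
    c = sum(x * math.sin(2 * math.pi * 0.07 * t) for t, x in enumerate(signal))
    return 2.0 * c / n

for severity in (0.0, 0.5, 1.0):
    vals = [fault_indicator(drivetrain_signal(w, severity)) for w in (5, 10, 15)]
    print(severity, [round(v, 2) for v in vals])
```

The printed indicator tracks the injected severity almost linearly and stays nearly constant across the three wind speeds, mirroring the two properties claimed for the proposed criterion.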

Relevance:

10.00%

Publisher:

Abstract:

With increasing competitiveness in global markets, many developing nations are striving to constantly improve their services in search of the next competitive edge. As a result, the demand and need for Business Process Management (BPM) in these regions is rising rapidly. Yet there is a lack of professional expertise and knowledge to cater to that need. Therefore, the development of well-structured BPM training/education programs has become an urgent requirement for these industries. Furthermore, the lack of textbooks or other self-education material that goes beyond the basics of BPM further reinforces the need for case-based teaching and related cases that enable the next generation of professionals in these countries. Teaching cases create an authentic learning environment where the complexities and challenges of the 'real world' can be presented in a narrative, enabling students to develop crucial skills such as problem analysis, problem solving and creativity within constraints, as well as the application of appropriate tools (BPMN) and techniques (including best practices and benchmarking) within richer and more realistic scenarios. The aim of this paper is to provide a comprehensive teaching case demonstrating the means to tackle a developing nation's legacy government process undermined by inefficiency and ineffectiveness. The paper also includes thorough teaching notes. The article is presented in three main parts: (i) an introduction that provides a brief background setting the context of the paper, (ii) the teaching case, and (iii) the teaching notes.

Relevance:

10.00%

Publisher:

Abstract:

The new Australian Curriculum and national standardised testing have placed the teaching of numeracy across the curriculum at the forefront of what Australian schools must do. However, it has been left to schools to determine how they do this. Although there is a growing body of literature giving examples of pedagogies that embed numeracy in various learning areas, there are few studies of cross-curricular numeracy from the management perspective. This paper responds to the research question: How do selected Queensland secondary schools interpret and apply the Australian Curriculum requirement to embed numeracy throughout the curriculum? A multiple case study design was used to investigate the actions of senior managers and mathematics teachers in three large secondary schools located in outer Brisbane. The numeracy practices in the three schools were interpreted from a social constructivist perspective. The study found that in each school key managers had differing constructions of numeracy, which led to confusion in administrative practices, policy development and leadership. The lack of coordinated cross-curricular action on numeracy in all three schools points to the difficulty that arises when teachers do not share the cross-curricular vision of numeracy present in the Australian Curriculum. The managers identified teachers' commitment, understanding and skills in relation to numeracy as significant barriers to the successful implementation of numeracy in their schools. Adoption of the Australian Curriculum expectation of embedding numeracy across the curriculum will require school managers to explicitly commit to initiatives that require persistence, time and, most importantly, money.

Relevance:

10.00%

Publisher:

Abstract:

Building information modelling (BIM) is radically changing practices in architecture, engineering and construction (AEC) and creating new job opportunities. Many governments, such as that of the United Kingdom, have made BIM a mandatory requirement, which substantially drives the demand for a BIM-literate workforce. Universities face the challenge of incorporating BIM into their curricula and producing "BIM ready" graduates to meet the needs of the industry. Like other universities, Queensland University of Technology (QUT) is at the heart of this change and aspires to develop collaborative BIM education across AEC. Previous BIM education studies identify inadequate BIM awareness among AEC academics as one of the challenges in developing a BIM curriculum, and there is a dearth of learning and teaching support for academics on BIM education. Equipping AEC academics for a more BIM-focused curriculum is therefore all the more important. This paper aims to leverage knowledge drawn from a Learning & Teaching project currently undertaken at QUT. Its specific objectives are to: 1) review the existing learning and teaching initiatives on BIM education; and 2) briefly describe the learning and teaching activities on collaborative BIM education at QUT. The significance of the paper lies in revealing the importance of building up the capacity of AEC academics for collaborative BIM education. The paper contributes to sparking interest in better equipping AEC academics to understand what curriculum changes would assist BIM uptake within the relevant courses, to provide context for changes in units, and to show how the use of BIM can improve students' understanding of the large body of professional knowledge they need to function effectively as graduates.

Relevance:

10.00%

Publisher:

Abstract:

Embedded many-core architectures contain dozens to hundreds of CPU cores that are connected via a highly scalable NoC interconnect. Our multiprocessor system-on-chip CoreVA-MPSoC combines the advantages of tightly coupled bus-based communication with the scalability of NoC approaches by adding a CPU cluster as an additional level of hierarchy. In this work, we analyze different cluster interconnect implementations with 8 to 32 CPUs and compare them in terms of resource requirements and performance to hierarchical NoC approaches. Using 28 nm FD-SOI technology, the area requirement for 32 CPUs and an AXI crossbar is 5.59 mm², including 23.61% for the interconnect, at a clock frequency of 830 MHz. In comparison, a hierarchical MPSoC with 4 CPU clusters and 8 CPUs per cluster requires only 4.83 mm², including 11.61% for the interconnect. To evaluate performance, we use a compiler for streaming applications to map programs to the different MPSoC configurations. We use this approach for a design-space exploration to find the most efficient architecture and partitioning for an application.
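The quoted interconnect shares translate to absolute areas as follows, a simple back-of-the-envelope check using only the numbers above:

```python
# Reproducing the area arithmetic quoted above (28 nm FD-SOI figures).
flat = {"total_mm2": 5.59, "interconnect_pct": 23.61}  # 32 CPUs, AXI crossbar
hier = {"total_mm2": 4.83, "interconnect_pct": 11.61}  # 4 clusters x 8 CPUs

for name, cfg in (("flat crossbar", flat), ("hierarchical", hier)):
    ic = cfg["total_mm2"] * cfg["interconnect_pct"] / 100
    print(f"{name}: interconnect {ic:.2f} mm2 of {cfg['total_mm2']} mm2 total")
```

So the hierarchical organisation spends roughly 0.56 mm² on interconnect versus about 1.32 mm² for the flat 32-CPU crossbar, i.e. less than half the interconnect area, on top of the smaller total area.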