Abstract:
A novel composite material based on the deposition of nanosized zero-valent iron (nZVI) particles on acid-leached diatomite was synthesised for the removal of a chlorinated contaminant from water. The nZVI/diatomite composites were characterised by X-ray diffraction, scanning electron microscopy, elemental analysis, transmission electron microscopy and X-ray photoelectron spectroscopy. Compared with pure nZVI particles, better dispersion of the nZVI particles on the surface or inside the pores of the diatom shells was observed. The herbicide simazine was selected as the model chlorinated contaminant, and the removal efficiency of the nZVI/diatomite composite was compared with that of pristine nZVI and commercial iron powder. The diatomite-supported nZVI composite prepared by centrifugation was found to decompose simazine more efficiently than commercial Fe, lab-synthesised nZVI and the composite prepared via rotary evaporation, and the optimum experimental conditions were determined from a series of batch experiments. This study on immobilising nZVI particles onto diatomite opens a new avenue for the practical application of nZVI, and diatomite-supported nanosized zero-valent iron composite materials have potential applications in environmental remediation.
Abstract:
This paper explores the potential for online video as a mechanism to transform the ways students learn, as measured by research, user experience and usage, following surveys and trials of patron-driven acquisition undertaken collaboratively by Queensland University of Technology, La Trobe University and Kanopy.
Abstract:
Due to the demand for better and deeper analysis in sports, organizations (both professional teams and broadcasters) are looking to use spatiotemporal data, in the form of player tracking information, to gain an advantage over their competitors. However, due to the large volume of the data, its unstructured nature, and the lack of associated team activity labels (e.g. strategic/tactical), effective and efficient strategies for dealing with such data have yet to be deployed. A bottleneck restricting such solutions is the lack of a suitable representation (i.e. ordering of players) that is immune to the potentially enormous number of possible permutations of player orderings, in addition to the high dimensionality of the temporal signal (e.g. a game of soccer lasts for 90 min). We leverage a recent method which utilizes a "role representation", as well as a feature reduction strategy based on a spatiotemporal bilinear basis model, to form a compact spatiotemporal representation. Using this representation, we find the most likely formation patterns of a team associated with match events across nearly 14 hours of continuous player and ball tracking data in soccer. Additionally, we show that we can accurately segment a match into distinct game phases and detect highlights (i.e. shots, corners, free-kicks, etc.) completely automatically using a decision-tree formulation.
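The role-based ordering can be illustrated with a toy sketch: for one frame, players are assigned to the slots of a formation template by minimising the total squared distance over all assignments, giving an ordering that is invariant to how the players happen to be listed. The template coordinates and positions below are hypothetical, and the brute-force search over permutations stands in for the assignment algorithm a real system would use at scale.

```python
import itertools

def assign_roles(positions, template):
    """Return, for each player, the index of the template slot ("role")
    given by a minimum-cost assignment. Brute force over permutations;
    real systems use the Hungarian algorithm instead."""
    best_perm, best_cost = None, float("inf")
    for perm in itertools.permutations(range(len(template))):
        cost = sum((px - template[r][0]) ** 2 + (py - template[r][1]) ** 2
                   for (px, py), r in zip(positions, perm))
        if cost < best_cost:
            best_perm, best_cost = perm, cost
    return list(best_perm)

# hypothetical 4-role template and one frame of (shuffled) player positions
template = [(0, 0), (10, 0), (0, 10), (10, 10)]
positions = [(9.5, 0.2), (0.1, 10.1), (10.2, 9.8), (0.3, -0.1)]
roles = assign_roles(positions, template)
# listing players by role index yields a permutation-invariant ordering
ordering = [roles.index(r) for r in range(len(template))]
```

Once every frame is expressed in this fixed role order, the per-frame vectors can be stacked and compressed with the bilinear basis model described above.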
Abstract:
Pretreatments of sugarcane bagasse by three high-boiling-point polyol solutions were compared in acid-catalysed processes. Pretreatments by ethylene glycol (EG) and propylene glycol solutions containing 1.2 % H2SO4 and 10 % water at 130 °C for 30 min removed 89 % of the lignin from bagasse, resulting in a glucan digestibility of 95 % with a cellulase loading of ~20 FPU/g glucan. Pretreatment by glycerol solution under the same conditions removed 57 % of the lignin, with a glucan digestibility of 77 %. Further investigations with EG solutions showed that increases in acid content, pretreatment temperature and time, and a decrease in water content improved pretreatment effectiveness. A good linear correlation of glucan digestibility with delignification was observed (R² = 0.984). Bagasse samples pretreated with EG solutions were characterised by scanning electron microscopy, Fourier transform infrared spectroscopy and X-ray diffraction, which confirmed that the improved glucan enzymatic digestibility is mainly due to delignification and defibrillation of the bagasse. Pretreatment by acidified EG solutions likely led to the formation of EG-glycosides. Up to 36 % of the total lignin was recovered from the pretreatment hydrolysate, which may improve the pretreatment efficiency of recycled EG solution.
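The reported correlation between glucan digestibility and delignification is an ordinary least-squares fit; a minimal sketch of the R² computation, using hypothetical (delignification, digestibility) pairs rather than the paper's data:

```python
def r_squared(xs, ys):
    """Coefficient of determination for a least-squares line y = a + b*x.
    For simple linear regression, R^2 = sxy^2 / (sxx * syy)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    return sxy ** 2 / (sxx * syy)

# hypothetical delignification (%) vs glucan digestibility (%) pairs
delig = [57, 70, 80, 89]
digest = [77, 84, 90, 95]
fit_quality = r_squared(delig, digest)
```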
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: be that information for a specific study, tweets which can inform emergency services or other responders during an ongoing crisis, or data which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand, and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset could be filtered down to match the expected capacity of emergency responders, and knowledge as to the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created — in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence in Twitter changes over time. We also discuss the opportunities and methods to extract smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid in future data collection. The intention is that through the papers presented, and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
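The content-analysis scoring described in the first paper can be sketched as a simple keyword scorer that filters a stream down to responder capacity; the keyword lists, weights and threshold below are hypothetical placeholders for the continually refined lists the panel describes.

```python
# hypothetical topic and urgency keyword weights for crisis triage
TOPIC_TERMS = {"flood": 2.0, "evacuation": 2.0, "#qldfloods": 3.0}
URGENCY_TERMS = {"trapped": 3.0, "help": 2.0, "urgent": 2.0}

def score_tweet(text):
    """Score a tweet for topical relevance plus urgency; tweets above a
    threshold would be placed in front of responders for manual action."""
    words = text.lower().split()
    topic = sum(TOPIC_TERMS.get(w, 0.0) for w in words)
    urgency = sum(URGENCY_TERMS.get(w, 0.0) for w in words)
    return topic + urgency

def triage(tweets, threshold=4.0):
    """Keep only tweets scoring at or above the capacity-derived threshold."""
    return [t for t in tweets if score_tweet(t) >= threshold]
```

In practice the same scores can feed back into collection: terms that co-occur with high-scoring tweets become candidates for the next keyword list.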
Abstract:
The existence of travelling wave solutions to a haptotaxis-dominated model is analysed. A version of this model was derived in Perumpanani et al. (1999) to describe tumour invasion, where diffusion is neglected as it is assumed to play only a small role in the cell migration. By instead allowing diffusion to be small, we reformulate the model as a singular perturbation problem, which can then be analysed using geometric singular perturbation theory. We prove the existence of three types of physically realistic travelling wave solutions in the case of small diffusion. These solutions reduce to the no-diffusion solutions in the singular limit as diffusion is taken to zero. A fourth travelling wave solution is also shown to exist, but it is physically unrealistic as it has a component with negative cell population. The numerical stability, and in particular the wavespeed, of the travelling wave solutions is also discussed.
Abstract:
We study a version of the Keller–Segel model for bacterial chemotaxis, for which exact travelling wave solutions are explicitly known in the zero attractant diffusion limit. Using geometric singular perturbation theory, we construct travelling wave solutions in the small diffusion case that converge to these exact solutions in the singular limit.
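For concreteness, the travelling wave reduction can be sketched; the logarithmic-sensitivity form below is an assumption about the precise version of the Keller–Segel model studied, with u the bacterial density, w the attractant concentration and ε ≥ 0 the attractant diffusion:

```latex
u_t = \left( u_x - \chi\, u\, (\ln w)_x \right)_x, \qquad
w_t = \varepsilon\, w_{xx} - u\, w .
```

Substituting the travelling wave ansatz $(u, w)(x, t) = (U, W)(z)$ with $z = x - ct$ gives

```latex
-c\, U' = \left( U' - \chi\, U\, (\ln W)' \right)', \qquad
-c\, W' = \varepsilon\, W'' - U\, W ,
```

a singularly perturbed ODE system in ε: the exact solutions known at ε = 0 form the singular limit to which the constructed small-diffusion waves converge.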
Abstract:
The overall aim of this research project was to provide a broader range of value propositions (beyond upfront traditional construction costs) that could transform both the demand side and supply side of the housing industry. The project involved gathering information about how building information is created, used and communicated, and classifying building information, leading to the formation of an Information Flow Chart and Stakeholder Relationship Map. These were then tested via broad housing industry focus groups and surveys. The project revealed four key relationships that appear to operate in isolation from the wider housing sector and may have a significant impact on the sustainability outcomes and life cycle costs of dwellings over their life cycle. It also found that although a lot of information about individual dwellings already exists, this information is not coordinated or inventoried in any systematic manner, and that national building information files or building passports would present value to a wide range of stakeholders.
Abstract:
Global awareness for cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is ‘how can we manage this constantly evolving grid?’ The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with some Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM) to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimization (PSO) to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks (three-phase, and Single Wire Earth Return or SWER) have been modelled; three-phase networks are usually used in dense networks such as urban areas, while SWER networks are widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: a) what is already in place and how it performs under current and future loads, b) what can be done to manage it and plan the future grid and c) how these upgrades/new installations will perform over time. The number of small-scale distributed generators, e.g.
PV and battery, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (Step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (Step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (Step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of the sub-areas. To illustrate this, using the impact that battery and PV can have on the two types of networks mentioned above, three design conditions can be identified (amongst others):
· Urban conditions
  o Feeders that have a low take-up of solar generators may benefit from adding solar panels
  o Feeders that need voltage support at specific times may be assisted by installing batteries
· Rural conditions (SWER network)
  o Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.
This small example demonstrates that no single solution can be applied across all three conditions, and there is a need to be selective about which one is applied to each branch of the network. This is currently the function of the engineer, who can define various scenarios against a configuration, test them and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
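The optimisation step (b) can be illustrated with a minimal global-best particle swarm optimiser; the two-variable quadratic cost below is a hypothetical stand-in for the project's network upgrade cost model, and the PSO parameters (inertia 0.7, cognitive/social weights 1.5) are standard textbook choices rather than the ones used in the platform.

```python
import random

def pso(cost, dim=2, particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Minimal global-best PSO minimising cost over [lo, hi]^dim."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [x[:] for x in xs]                 # per-particle best positions
    pbest_c = [cost(x) for x in xs]
    g = min(range(particles), key=lambda i: pbest_c[i])
    gbest, gbest_c = pbest[g][:], pbest_c[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * rng.random() * (pbest[i][d] - xs[i][d])
                            + 1.5 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            c = cost(xs[i])
            if c < pbest_c[i]:
                pbest[i], pbest_c[i] = xs[i][:], c
                if c < gbest_c:
                    gbest, gbest_c = xs[i][:], c
    return gbest, gbest_c

# hypothetical "upgrade cost" surface with optimum at (2, -1)
best, best_cost = pso(lambda x: (x[0] - 2) ** 2 + (x[1] + 1) ** 2)
```

In the platform, the decision variables would instead encode the location, sizing and timing of candidate assets, with the ABM supplying the cost evaluation.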
Abstract:
Freestanding membranes created from Bombyx mori silk fibroin (BMSF) offer a potential vehicle for corneal cell transplantation since they are transparent and support the growth of human corneal epithelial (HCE) cells. Fibroin derived from the wild silkworm Antheraea pernyi (APSF) might provide a superior material by virtue of containing putative cell-attachment sites that are absent from BMSF. Thus we have investigated the feasibility of producing transparent, freestanding membranes from APSF and have analysed the behaviour of HCE cells on this material. No significant differences in cell numbers or phenotype were observed in short-term HCE cell cultures established on either fibroin. Production of transparent freestanding APSF membranes, however, proved to be problematic, as cast solutions of APSF were more prone to becoming opaque, displayed significantly lower permeability and were more brittle than BMSF membranes. Cultures of HCE cells established on either membrane developed a normal stratified morphology, with cytokeratin pair 3/12 being immuno-localized to the superficial layers. We conclude that while it is feasible to produce transparent freestanding membranes from APSF, the technical difficulties associated with this biomaterial, along with an absence of enhanced cell growth, currently favour the continued development of BMSF as the preferred vehicle for corneal cell transplantation. Nevertheless, it remains possible that refinement of techniques for processing APSF might yet lead to improvements in the handling properties and performance of this material.
Abstract:
Ever since sodium fluorescein (‘fluorescein’ [FL]) was first used to investigate the ocular surface over a century ago, the term ‘staining’ has been taken to mean the presence of ocular surface fluorescence [1]. This term has not necessarily been taken to infer any particular mechanism of causation, and indeed can be attributed to a variety of possible aetiologies [2]. In recent times, there has been considerable interest in a form of ocular surface fluorescence seen in association with the use of certain combinations of soft contact lenses and multipurpose solutions. The first clinical account of this phenomenon was reported by Jones et al. [3], which was followed by a more formal investigation by the same author in 2002 [4]. Jones et al. described this appearance as a ‘classic solution-based toxicity reaction’. Subsequently, this appearance has come to be known as ‘solution-induced corneal staining’ or, more recently, by the acronym ‘SICS’ [5]. The term SICS is potentially problematic in that, from a cell biology point of view, there is an inference that ‘staining’ means the entry of a dye into corneal epithelial cells. Morgan and Maldonado-Codina [2] noted there was no foundation of solid scientific literature underpinning our understanding of the true basic causative mechanisms of this phenomenon; since that time, further work has been published in this field [6] and [7], but questions still remain about the precise aetiology of this phenomenon...
Abstract:
Most research virtually ignores the important role of the blood clot in supporting bone healing. In this study, we investigated the effects of the surface functional groups carboxyl and alkyl on whole blood coagulation, complement activation and blood clot formation. We synthesised and tested a series of materials with different ratios of carboxyl (–COOH) and alkyl (–CH3, –CH2CH3 and –(CH2)3CH3) groups. We found that surfaces with –COOH/–(CH2)3CH3 induced faster coagulation activation than those with –COOH/–CH3 and –COOH/–CH2CH3, regardless of the –COOH ratios. An increase in –COOH ratios on –COOH/–CH3 and –COOH/–CH2CH3 surfaces decreased the rate of coagulation activation. The pattern of complement activation closely mirrored that of surface-induced coagulation. All material-coated surfaces resulted in clots with thicker fibrin in a denser network at the clot/material interface and a significantly slower initial fibrinolysis when compared with uncoated glass surfaces. The amounts of platelet-derived growth factor-AB (PDGF-AB) and transforming growth factor-β1 (TGF-β1) released from an intact clot were higher than those from a lysed clot. The release of PDGF-AB was found to be correlated with the fibrin density. This study demonstrated that surface chemistry can significantly influence the activation of blood coagulation and the complement system, the resultant clot structure, the susceptibility to fibrinolysis and the release of growth factors, all of which are important factors determining the bone healing process.
Abstract:
Exact solutions of partial differential equation models describing the transport and decay of single and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, which restrict the initial condition in the sense that its choice has an important impact on whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by using a different method to invert the Laplace transform, which produces a power series solution. To demonstrate the utility of this technique, we apply it to two example problems with initial conditions that cannot be solved exactly using traditional transform techniques.
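The power-series idea can be sketched on the single-species advection-decay equation c_t = −v c_x − k c (a simplified stand-in for the reactive transport models above): writing c = Σ a_n(x) tⁿ, the PDE itself gives the recurrence (n+1) a_{n+1} = −v a_n′ − k a_n, which is equivalent to a term-by-term series inversion in time. A minimal sketch with a polynomial initial condition, where the exact solution c = f(x − vt) e^{−kt} is available for comparison:

```python
import math

def deriv(p):
    """d/dx of a polynomial stored as a coefficient list [c0, c1, ...]."""
    return [i * p[i] for i in range(1, len(p))] or [0.0]

def axpy(a, p, b, q):
    """a*p + b*q for coefficient lists of possibly different lengths."""
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return [a * pi + b * qi for pi, qi in zip(p, q)]

def evalp(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

def series_solution(init, v, k, x, t, terms=30):
    """Power-series (in t) solution of c_t = -v c_x - k c:
    c = sum_n a_n(x) t^n with (n+1) a_{n+1} = -v a_n' - k a_n."""
    a = list(init)
    total = 0.0
    for n in range(terms):
        total += evalp(a, x) * t ** n
        a = axpy(-v / (n + 1), deriv(a), -k / (n + 1), a)
    return total

# initial condition c(x, 0) = x^2; exact solution c = (x - v t)^2 exp(-k t)
v, k, x, t = 1.0, 0.3, 1.0, 0.5
approx = series_solution([0.0, 0.0, 1.0], v, k, x, t)
exact = (x - v * t) ** 2 * math.exp(-k * t)
```

The paper's contribution handles general smooth initial conditions and coupled multispecies reactions; this sketch only illustrates why a power series sidesteps the exact-inversion requirement of traditional transform techniques.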
Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
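The heart of the Jacobian-free approach is that the inner Krylov solver only ever needs Jacobian-vector products, which can be approximated by a directional finite difference Jv ≈ (F(u + εv) − F(u))/ε without ever forming J. A minimal sketch on a small symmetric test problem (a discretised −u″ + eᵘ = 0, chosen so that plain conjugate gradients can stand in for the GMRES-with-banded-preconditioner used for the ship wave system):

```python
import math

def F(u, h):
    """Residual of -u'' + exp(u) = 0 on (0,1), u(0)=u(1)=0, centred differences."""
    n = len(u)
    r = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r.append((-left + 2 * u[i] - right) / h ** 2 + math.exp(u[i]))
    return r

def jv(u, v, Fu, h):
    """Jacobian-free product: J(u) v ~ (F(u + eps*v) - F(u)) / eps,
    with eps scaled to the norms of u and v as is standard for JFNK."""
    nv = math.sqrt(sum(vi * vi for vi in v)) or 1.0
    eps = 1e-7 * (1.0 + math.sqrt(sum(ui * ui for ui in u))) / nv
    up = [ui + eps * vi for ui, vi in zip(u, v)]
    return [(a - b) / eps for a, b in zip(F(up, h), Fu)]

def cg(matvec, b, iters=300, tol=1e-10):
    """Conjugate gradients for a symmetric positive definite operator."""
    x = [0.0] * len(b)
    r, p = b[:], b[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

n, h = 15, 1.0 / 16
u = [0.0] * n
for _ in range(8):                       # outer Newton iterations
    Fu = F(u, h)
    step = cg(lambda p: jv(u, p, Fu, h), [-f for f in Fu])
    u = [ui + si for ui, si in zip(u, step)]
residual = max(abs(f) for f in F(u, h))
```

Each Jacobian "application" here is just one extra residual evaluation, which is also what makes the method amenable to GPU acceleration: the expensive operation is a batch of residual evaluations, not a factorisation.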
Abstract:
We revisit the venerable question of access credentials management, which concerns the techniques that we, humans with limited memory, must employ to safeguard our various access keys and tokens in a connected world. Although many existing solutions can be employed to protect a long secret using a short password, those solutions typically require certain assumptions on the distribution of the secret and/or the password, and are helpful against only a subset of the possible attackers. After briefly reviewing a variety of approaches, we propose a user-centric comprehensive model to capture the possible threats posed by online and offline attackers, from the outside and the inside, against the security of both the plaintext and the password. We then propose a few very simple protocols, adapted from the Ford-Kaliski server-assisted password generator and the Boldyreva unique blind signature in particular, that provide the best protection against all kinds of threats, for all distributions of secrets. We also quantify the concrete security of our approach in terms of online and offline password guesses made by outsiders and insiders, in the random-oracle model. The main contribution of this paper lies not in the technical novelty of the proposed solution, but in the identification of the problem and its model. Our results have an immediate and practical application for the real world: they show how to implement single-sign-on stateless roaming authentication for the Internet, in an ad hoc, user-driven fashion that requires no change to protocols or infrastructure.
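The blinding idea underlying Ford-Kaliski style server-assisted protocols can be sketched in a toy group: the client hashes the password to a group element h, blinds it with a random exponent r so the server learns nothing about the password, and unblinds the server's response (h^r)^k to recover the hardened secret h^k. The tiny safe prime and all parameter choices below are for illustration only; a real deployment needs a standardised large group and a proper hash-to-group map.

```python
import hashlib
import secrets

# toy safe-prime group: P = 2Q + 1 with Q prime; G = 4 generates the
# order-Q subgroup of squares (illustration only -- far too small for real use)
P, Q, G = 2879, 1439, 4

def hash_to_group(password):
    """Map a password into the order-Q subgroup (toy hash-to-group)."""
    e = 1 + int.from_bytes(hashlib.sha256(password).digest(), "big") % (Q - 1)
    return pow(G, e, P)

def client_blind(password):
    r = 2 + secrets.randbelow(Q - 2)       # blinding exponent in [2, Q-1]
    return pow(hash_to_group(password), r, P), r

def server_evaluate(blinded, k):
    return pow(blinded, k, P)              # server applies its long-term key k

def client_unblind(response, r):
    return pow(response, pow(r, -1, Q), P)  # response^(1/r mod Q) = h^k

k = 2 + secrets.randbelow(Q - 2)           # toy server key
blinded, r = client_blind(b"correct horse")
hardened = client_unblind(server_evaluate(blinded, k), r)
# hardened equals h^k, yet the server only ever saw the blinded value h^r
```

The hardened value h^k can then be fed to a key-derivation function to protect the long secret; because unblinding works for any h, the construction imposes no assumption on the distribution of the password.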