994 results for Port Weller Dry Docks Limited.
Abstract:
In this study, 39 sets of hard turning (HT) experimental trials were performed on a Mori-Seiki SL-25Y (4-axis) computer numerical controlled (CNC) lathe to study the effect of cutting parameters on the machined surface roughness. In all trials, an AISI 4340 steel workpiece (hardened up to 69 HRC) was machined with a commercially available CBN insert (Warren Tooling Limited, UK) under dry conditions. The surface topography of the machined samples was examined using a white-light interferometer, and the measurements were reconfirmed with a Form Talysurf. The machining outcomes were used as input to develop regression models for predicting the average machined surface roughness of this material. Three regression models (multiple regression, random forest, and quantile regression) were applied to the experimental outcomes. To the best of the authors' knowledge, this paper is the first to apply random forest or quantile regression techniques to the machining domain. The models were compared with each other to ascertain how feed, depth of cut, and spindle speed affect surface roughness, and to obtain a mathematical equation correlating these variables. It was concluded that the random forest regression model is a superior choice over multiple regression models for predicting surface roughness during machining of AISI 4340 steel (69 HRC).
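As an illustration of the modelling step described above, the following is a minimal sketch of fitting a random forest regressor to predict average roughness (Ra) from feed, depth of cut, and spindle speed. The data values, column order, and hyperparameters are placeholders for illustration, not the paper's 39 experimental trials or its fitted model.

```python
# Minimal sketch: random forest regression of surface roughness (Ra) on cutting
# parameters. All numbers below are placeholder values, not the paper's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# columns: feed (mm/rev), depth of cut (mm), spindle speed (rpm) -- placeholders
X = np.array([
    [0.10, 0.10, 1000],
    [0.10, 0.15, 1500],
    [0.15, 0.10, 2000],
    [0.15, 0.15, 1000],
    [0.20, 0.10, 1500],
    [0.20, 0.15, 2000],
])
y = np.array([0.42, 0.45, 0.61, 0.66, 0.83, 0.88])  # placeholder Ra values (micrometres)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=3, scoring="neg_mean_absolute_error")
model.fit(X, y)

print("cross-validated mean absolute error:", -scores.mean())
print("predicted Ra for feed=0.12, doc=0.12, speed=1200:",
      model.predict([[0.12, 0.12, 1200]])[0])
```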
Abstract:
Large, thin (50 μm) dry polymer sheets containing numerous surface-enhanced Raman spectroscopy (SERS) active Ag nanoparticle aggregates have been prepared by drying aqueous mixtures of hydroxyethylcellulose (HEC) and preaggregated Ag colloid in 10 x 10 cm molds. In these dry films, the particle aggregates are protected from the environment during storage and are easy to handle; for example, they can be cut to size with scissors. In use, the highly swellable HEC polymer allowed the films to rapidly absorb aqueous analyte solutions while simultaneously releasing the Ag nanoparticle aggregates to interact with the analyte and generate large SERS signals. The films could either be immersed in the analyte solution or have 5 μL droplets applied to the surface; in the latter method, the local swelling caused the active area to dome upward, but the swollen film remained physically robust and could be handled as required. Importantly, encapsulation and release did not significantly compromise the SERS performance of the colloid; the signals given by the swollen films were similar to the very high signals obtained from the parent citrate-reduced colloid and were an order of magnitude larger than those of a commercially available nanoparticle substrate. These "Poly-SERS" films retained 70% of their SERS activity after being stored for 1 year in air. The films were sufficiently homogeneous to give a standard deviation of 3.2% in the absolute signal levels obtained from a test analyte, largely because they suppress "coffee ring" drying marks, which meant that quantitative analysis without an internal standard was possible. The majority of the work used aqueous thiophenol as the test analyte; however, preliminary studies showed that the Poly-SERS films could also be used with nonaqueous solvents and for a range of other analytes, including the therapeutic drug theophylline at a concentration as low as 1.0 x 10^-5 mol dm^-3 (1.8 mg/dm^3), well below the sensitivity required for theophylline monitoring, where the target range is 10-20 mg/dm^3.
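As a quick check of the detection level quoted above, the snippet below converts 1.0 x 10^-5 mol dm^-3 of theophylline to a mass concentration; the molar mass (about 180.16 g/mol) is supplied here as an assumption rather than taken from the abstract.

```python
# Arithmetic check: molar concentration -> mass concentration for theophylline.
molar_mass_theophylline = 180.16   # g/mol (assumed value)
concentration_molar = 1.0e-5       # mol/dm^3, the detection level quoted above

concentration_mass = concentration_molar * molar_mass_theophylline * 1000  # mg/dm^3
print(f"{concentration_mass:.1f} mg/dm^3")  # ~1.8 mg/dm^3, below the 10-20 mg/dm^3 target range
```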
Abstract:
Pseudomonas aeruginosa is an opportunistic pathogen and an important cause of infection, particularly amongst cystic fibrosis (CF) patients. While specific strains capable of patient-to-patient transmission are known, many infections appear to be caused by unique and unrelated strains. There is a need to understand the relationship between strains capable of colonising the CF lung and the broader set of P. aeruginosa isolates found in natural environments. Here we report the results of a multilocus sequence typing (MLST)-based study designed to understand the genetic diversity and population structure of an extensive regional sample of P. aeruginosa isolates from South East Queensland, Australia. The analysis is based on 501 P. aeruginosa isolates obtained from environmental, animal and human (CF and non-CF) sources, with particular emphasis on isolates from the Lower Brisbane River and isolates from CF patients obtained from the same geographical region. Overall, MLST identified 274 different sequence types, of which 53 were shared across more than one ecological setting. Our analysis revealed a limited association between genotype and environment and evidence of frequent recombination. We also found that the genetic diversity of P. aeruginosa in Queensland, Australia was indistinguishable from that of the global P. aeruginosa population. Several CF strains were encountered frequently in multiple ecological settings; however, the most frequently encountered CF strains were confined to CF patients. Overall, our data confirm a non-clonal epidemic structure and indicate that most CF strains are a random sample of the broader P. aeruginosa population. The increased abundance of some CF strains in different geographical regions is likely a product of chance colonisation events followed by adaptation to the CF lung and horizontal transmission among patients.
Abstract:
Punctal plugs (PPs) are miniature medical implants that were initially developed for the treatment of dry eyes. Since their introduction in 1975, many PPs of different materials and designs have been developed. Although generally successful, PPs suffer from drawbacks such as epiphora and suppurative canaliculitis. To overcome these issues, intelligent PP designs have been proposed (e.g. SmartPLUG™ and Form Fit™). PPs are also gaining interest among pharmaceutical scientists for sustained drug delivery to the eye. This review provides an overview of PPs for dry eye treatment and for drug delivery to treat a range of ocular diseases, and discusses the current challenges in using PPs for ocular diseases.
On the complexity of solving polytree-shaped limited memory influence diagrams with binary variables
Abstract:
Influence diagrams are intuitive and concise representations of structured decision problems. When the problem is non-Markovian, an optimal strategy can be exponentially large in the size of the diagram. We can avoid this inherent intractability by constraining the size of admissible strategies, giving rise to limited memory influence diagrams. A valuable question is then how small strategies need to be to enable efficient optimal planning. Arguably, the smallest strategies one can conceive simply prescribe an action for each time step, without considering past decisions or observations. Previous work has shown that finding such optimal strategies is NP-hard even for polytree-shaped diagrams with ternary variables and a single value node, but the case of binary variables was left open. In this paper we address that case, first noting that optimal strategies can be obtained in polynomial time for polytree-shaped diagrams with binary variables and a single value node, and then showing that the same problem is NP-hard if the diagram has multiple value nodes. These two results close the fixed-parameter complexity analysis of optimal strategy selection in influence diagrams parametrized by the shape of the diagram, the number of value nodes and the maximum variable cardinality.
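To make the notion of these minimal strategies concrete, here is a small, illustrative sketch (not the paper's algorithm) that brute-forces the best fixed-action strategy for a toy influence diagram with binary variables and two value nodes; the toy network, probabilities and utilities below are all assumptions chosen for illustration.

```python
# Brute-force selection of the smallest possible strategies: one fixed action
# per decision node, ignoring observations. Toy diagram, illustrative numbers.
from itertools import product

# chance variables: P(X) and P(Y | X), all binary
p_x = {0: 0.7, 1: 0.3}
p_y_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

decisions = ["d1", "d2"]   # two binary decision nodes
actions = [0, 1]

def utility(x, y, d1, d2):
    """Sum of two value nodes (illustrative payoffs)."""
    u1 = 4.0 if d1 == x else 0.0   # value node 1 rewards matching X
    u2 = 2.0 if d2 == y else 0.0   # value node 2 rewards matching Y
    return u1 + u2

best_strategy, best_eu = None, float("-inf")
for d1, d2 in product(actions, repeat=len(decisions)):
    eu = sum(p_x[x] * p_y_given_x[x][y] * utility(x, y, d1, d2)
             for x in (0, 1) for y in (0, 1))
    if eu > best_eu:
        best_strategy, best_eu = (d1, d2), eu

print("best fixed-action strategy:", best_strategy, "expected utility:", best_eu)
```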
Abstract:
We present a new algorithm for exactly solving decision-making problems represented as an influence diagram. We do not require the usual assumptions of no forgetting and regularity, which allows us to solve problems with limited information. The algorithm, which implements a sophisticated variable elimination procedure, is empirically shown to outperform a state-of-the-art algorithm in randomly generated problems of up to 150 variables and 10^64 strategies.
Abstract:
We present a new algorithm for exactly solving decision making problems represented as influence diagrams. We do not require the usual assumptions of no forgetting and regularity; this allows us to solve problems with simultaneous decisions and limited information. The algorithm is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems of up to 150 variables and 10^64 solutions. We show that these problems are NP-hard even if the underlying graph structure of the problem has low treewidth and the variables take on a bounded number of states, and that they admit no provably good approximation if variables can take on an arbitrary number of states.
Abstract:
We present a new algorithm for exactly solving decision making problems represented as influence diagrams. We do not require the usual assumptions of no forgetting and regularity; this allows us to solve problems with simultaneous decisions and limited information. The algorithm is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems of up to 150 variables and 10^64 solutions. We show that the problem is NP-hard even if the underlying graph structure of the problem has small treewidth and the variables take on a bounded number of states, but that a fully polynomial time approximation scheme exists for these cases. Moreover, we show that the bound on the number of states is a necessary condition for any efficient approximation scheme.
Abstract:
Hulun Lake, China's fifth-largest inland lake, experienced a severe decline in water level over the period 2000-2010, prompting concerns that the lake is gradually drying up. A multi-million US dollar engineering project to construct a channel transferring part of the flow from a nearby river to maintain the water level was completed in August 2010. This study aimed to advance the understanding of the key processes controlling the lake's water level variation over the last five decades, and to investigate the impact of the river transfer project on the water level. A water balance model was developed to investigate the lake water level variations over the last five decades, using hydrological and climatic data as well as satellite-based measurements and results from land surface modelling. The investigation reveals that the severe reduction of river discharge into the lake (-364±64 mm/yr, ∼70% of the five-decade average) was the key factor behind the decline of the lake water level between 2000 and 2010. The decline of river discharge was due to the reduction of total runoff from the lake watershed, which in turn resulted from reduced soil moisture driven by the decrease in precipitation (-49±45 mm/yr) over this period. The water budget calculation suggests that groundwater from the surrounding lake area, together with surface runoff from the un-gauged area surrounding the lake, contributed a net inflow of ∼210 Mm3/yr (equivalent to ∼100 mm/yr) to the lake. The results also show that the water diversion project prevented a further water level decline of over 0.5 m by the end of 2012. Overall, the monthly water balance model gave an excellent prediction of the lake water level fluctuation over the last five decades and can be a useful tool for managing the lake's water resources in the future.
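As a sketch of how a monthly water balance model of this kind updates lake storage, the snippet below advances storage by one month from precipitation, river inflow, net groundwater/ungauged-runoff and evaporation terms; the lake area and all monthly values are illustrative placeholders, not the study's calibrated inputs.

```python
# Minimal monthly water balance update (illustrative terms and numbers only).
def update_lake(volume_mcm, area_km2, precip_mm, inflow_mcm, net_ground_mcm, evap_mm):
    """Advance lake storage by one month (volumes in million m^3, depths in mm)."""
    precip_mcm = precip_mm * 1e-3 * area_km2   # mm over lake area -> Mm^3
    evap_mcm = evap_mm * 1e-3 * area_km2       # mm over lake area -> Mm^3
    return volume_mcm + precip_mcm + inflow_mcm + net_ground_mcm - evap_mcm

# one illustrative month (all values hypothetical)
v1 = update_lake(volume_mcm=11_000, area_km2=2_000,
                 precip_mm=20, inflow_mcm=30, net_ground_mcm=17.5, evap_mm=80)
print(f"storage after one month: {v1:.1f} Mm^3")
```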
Abstract:
The solubility of carbon dioxide in five tetraalkylphosphonium superbase ionic liquids, namely trihexyltetradecylphosphonium phenoxide, trihexyltetradecylphosphonium benzotriazolide, trihexyltetradecylphosphonium benzimidazolide, trihexyltetradecylphosphonium 1,2,3-triazolide, and trihexyltetradecylphosphonium 1,2,4-triazolide, was studied experimentally under dry and wet conditions at 22 °C and atmospheric pressure, using a gravimetric saturation technique. The effects of the anion structure and of the presence or absence of water in the solution on the carbon dioxide solubility were then deduced from the data. 1H and 13C NMR spectroscopy and ab initio calculations were also conducted to probe the interactions in these solutions, as carbon dioxide and water can compete within the ionic liquid structure during the absorption process. Additionally, the viscosity of selected superbase ionic liquids was measured under dry and wet conditions, in the presence or absence of CO2, to evaluate their practical applicability in carbon dioxide capture processes. Finally, the recyclability of trihexyltetradecylphosphonium 1,2,4-triazolide under dry and wet conditions was determined to probe the ability of the selected solvents to chemically absorb a high concentration of carbon dioxide and then release it in a low-energy-demand process.
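For orientation, the snippet below shows the arithmetic behind a gravimetric saturation measurement: converting a measured mass gain into CO2 uptake per mole of ionic liquid. The sample mass, mass gain, and ionic liquid molar mass are hypothetical placeholders, not values from the study.

```python
# Illustrative gravimetric uptake calculation (hypothetical numbers).
M_CO2 = 44.01     # g/mol
M_IL = 576.0      # g/mol, hypothetical molar mass of a phosphonium superbase ionic liquid

mass_il = 1.000   # g of dry ionic liquid loaded (hypothetical)
mass_gain = 0.070 # g of CO2 absorbed at saturation (hypothetical)

uptake = (mass_gain / M_CO2) / (mass_il / M_IL)  # mol CO2 per mol IL
print(f"CO2 uptake ~ {uptake:.2f} mol CO2 per mol IL")
```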
Abstract:
The use of power ultrasound treatment in dry red kidney beans as a means to reduce the rehydration step during canning production while maintaining high nutritional value. IFT Annual Meeting, Chicago, 13-16/7/2013. (Poster)
Abstract:
We describe some unsolved problems of current interest; these involve quantum critical points in ferroelectrics and problems which are not amenable to the usual density functional theory, nor to classical Landau free energy approaches (they are kinetically limited), nor even to the Landau–Kittel relationship for domain size (they do not satisfy the assumption of infinite lateral diameter), because they are dominated by finite aperiodic boundary conditions.
Abstract:
Dry reforming is a promising reaction to utilise the greenhouse gases CO2 and CH4. Nickel-based catalysts are the most popular catalysts for the reaction, and coke formation on the catalysts is the main obstacle to the commercialisation of dry reforming. In this study, the whole reaction network of dry reforming on both flat and stepped nickel catalysts (Ni(111) and Ni(211)) as well as on nickel carbide (flat: Ni3C(001); stepped: Ni3C(111)) is investigated using density functional theory calculations. The overall reaction energy profiles in the free energy landscape are obtained, and kinetic analyses are used to evaluate the activity of the four surfaces. Careful examination of the results shows the following regarding activity: (i) flat surfaces are more active than stepped surfaces for dry reforming and (ii) metallic nickel catalysts are more active than nickel carbide, so the phase transformation from nickel to nickel carbide will reduce the activity. With respect to coke formation, we find that (i) the coke formation propensity can be gauged by the rate ratio of the CH oxidation pathway to the C oxidation pathway (r(CH)/r(C)) and by the barrier of CO dissociation, (ii) on Ni(111), coke is unlikely to form, and (iii) coke formation on the stepped surfaces of both nickel and nickel carbide can readily occur. A deactivation scheme that rationalises the experimental results is proposed.
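As an illustration of the kinetic criterion mentioned above, the snippet below estimates the ratio r(CH)/r(C) from effective barriers using a simple Arrhenius-type expression; the temperature and barrier values are hypothetical placeholders, not the paper's DFT results.

```python
# Rate ratio r_CH / r_C from effective barriers (hypothetical values).
import math

K_B = 8.617e-5        # Boltzmann constant, eV/K
T = 1000.0            # typical dry-reforming temperature, K (assumed)

barrier_ch_ox = 1.1   # eV, hypothetical effective barrier for CH oxidation
barrier_c_ox = 1.5    # eV, hypothetical effective barrier for C oxidation

ratio = math.exp(-(barrier_ch_ox - barrier_c_ox) / (K_B * T))
print(f"r_CH / r_C ~ {ratio:.1f}  (large ratio -> CH is oxidised before coke can form)")
```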
Abstract:
This paper presents a novel method of audio-visual fusion for person identification where both the speech and facial modalities may be corrupted, and there is a lack of prior knowledge about the corruption. Furthermore, we assume there is a limited amount of training data for each modality (e.g., a short training speech segment and a single training facial image for each person). A new representation and a modified cosine similarity are introduced for combining and comparing bimodal features with limited training data as well as vastly differing data rates and feature sizes. Optimal feature selection and multicondition training are used to reduce the mismatch between training and testing, thereby making the system robust to unknown bimodal corruption. Experiments have been carried out on a bimodal data set created from the SPIDRE and AR databases with variable noise corruption of speech and occlusion in the face images. The new method has demonstrated improved recognition accuracy.
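As a sketch of one way such a comparison can work (not necessarily the paper's exact representation), the snippet below L2-normalises each modality's feature vector separately before concatenation, so that a cosine similarity on the joint vector treats speech and face features of very different sizes comparably; the feature dimensions and data are illustrative.

```python
# Bimodal cosine similarity with per-modality normalisation (illustrative sketch).
import numpy as np

def l2norm(v):
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def bimodal_cosine(speech_a, face_a, speech_b, face_b):
    """Cosine similarity between two bimodal samples, each modality L2-normalised first."""
    a = np.concatenate([l2norm(speech_a), l2norm(face_a)])
    b = np.concatenate([l2norm(speech_b), l2norm(face_b)])
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
speech_enrol, face_enrol = rng.normal(size=600), rng.normal(size=4096)   # illustrative sizes
speech_test, face_test = speech_enrol + 0.1 * rng.normal(size=600), face_enrol.copy()
print(f"similarity: {bimodal_cosine(speech_test, face_test, speech_enrol, face_enrol):.3f}")
```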