962 results for MONOLAYER COVERAGE


Relevance:

10.00%

Publisher:

Abstract:

An adequate amount of graphene oxide (GO) was first prepared by oxidation of graphite, and GO/epoxy nanocomposites were subsequently prepared by a typical solution-mixing technique. X-ray diffraction (XRD) patterns and X-ray photoelectron (XPS), Raman and Fourier transform infrared (FTIR) spectroscopy confirmed the successful preparation of GO. Scanning electron microscopy (SEM) and transmission electron microscopy (TEM) images of the graphite oxide showed that it consists of a large number of graphene oxide platelets with a curled morphology comprising thin, wrinkled sheet-like structures. An AFM image of the exfoliated GO indicated that the average thickness of the GO sheets is ~1.0 nm, which is consistent with a GO monolayer. The mechanical properties of the as-prepared GO/epoxy nanocomposites were investigated. Significant improvements in both Young's modulus and tensile strength were observed for the nanocomposites at very low GO loadings. The Young's modulus of the nanocomposite containing 0.5 wt% GO was 1.72 GPa, 35% higher than that of the pure epoxy resin (1.28 GPa). The effective reinforcement of the GO-based epoxy nanocomposites can be attributed to the good dispersion of, and strong interfacial interactions between, the GO sheets and the epoxy matrix.
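A quick arithmetic check of the stiffness gain quoted in the abstract, using only the two modulus values reported there:

```python
# Young's modulus of the 0.5 wt% GO/epoxy nanocomposite vs. neat epoxy,
# values (GPa) as reported in the abstract above.
E_composite = 1.72
E_epoxy = 1.28

# Relative improvement in stiffness.
gain_pct = (E_composite - E_epoxy) / E_epoxy * 100
print(f"Modulus gain: {gain_pct:.1f}%")  # ~34.4%, i.e. the ~35% quoted
```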

Relevance:

10.00%

Publisher:

Abstract:

With a monolayer honeycomb lattice of sp2-hybridized carbon atoms, graphene has demonstrated exceptional electrical, mechanical and thermal properties. One of its promising applications is the creation of graphene-polymer nanocomposites with tailored mechanical and physical properties. In general, the mechanical properties of the graphene nanofiller and of the graphene-polymer interface govern the overall mechanical performance of graphene-polymer nanocomposites. However, the strengthening and toughening mechanisms in these novel nanocomposites are not yet well understood. In this work, the deformation and failure of graphene sheets and of the graphene-polymer interface were investigated using molecular dynamics (MD) simulations, as was the effect of structural defects on the mechanical properties of graphene and the graphene-polymer interface. The results showed that while structural defects in graphene (e.g. Stone-Wales and multi-vacancy defects) can significantly degrade the fracture strength of graphene, the remaining strength of the defective graphene may still be fully utilised, preserving the interfacial strength and the overall mechanical performance of graphene-polymer nanocomposites.

Relevance:

10.00%

Publisher:

Abstract:

Critical road infrastructure (such as tunnels and overpasses) is of major significance to society and constitutes a major component of interdependent systems and networks. Failure in critical components of these wide-area infrastructure systems can often result in cascading disturbances with secondary and tertiary impacts, some of which may become initiating sources of failure in their own right, triggering further system failures across wider networks. Perrow [1] considered the impact of our increasing use of technology in high-risk fields, analysed its implications for everyday life, and argued that the designers of these types of infrastructure systems cannot predict every possible failure scenario nor create perfect contingency plans for operators. Challenges exist for transport system operators in conceptualising and implementing response and subsequent recovery planning for significant events. Disturbances can range from reduced traffic flow, which causes congestion throughout the local road network(s) and possible loss of income to business and industry, to a major incident causing loss of life or the complete loss of an asset. Many organisations and institutions, despite increasing recognition of the effects of crisis events, are not adequately prepared to manage crises [2]. It is argued that operators of land transport infrastructure are in a similar state of readiness, given recent instances of failures in road tunnels. These unexpected infrastructure failures, and their ultimately identified causes, suggest there is significant room for improvement. As a result, risk profiles for road transport systems are often complex, owing to human behaviour, the mix of technical and organisational components, and the managerial coverage needed for both the socio-technical components and the physical infrastructure.
In this sense, the span of managerial oversight may require new approaches to asset management that combine the notions of risk and continuity management. This paper examines challenges in the planning of response and recovery practices by owner/operators of transport systems (above and below ground) in Australia, covering:
• Ageing or established infrastructure; and
• New-build infrastructure.
With reference to relevant international contexts, this paper suggests options for enhancing the planning and practice of crisis response in these transport networks and, as a result, supporting the resilience of Critical Infrastructure.

Relevance:

10.00%

Publisher:

Abstract:

Text categorisation is challenging due to the complex structure and heterogeneous, changing topics of documents. The performance of text categorisation relies on the quality of samples, the effectiveness of document features, and the topic coverage of the categories, and depends on the strategies employed: supervised or unsupervised, single-labelled or multi-labelled. To address these reliability issues, we propose an unsupervised multi-labelled text categorisation approach that maps local knowledge in documents to global knowledge in a world ontology in order to optimise the categorisation result. The conceptual framework of the approach consists of three modules: pattern mining for feature extraction; feature-subject mapping for categorisation; and concept generalisation for optimised categorisation. The approach was evaluated with promising results against typical text categorisation methods, based on a ground truth encoded by human experts.
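The feature-subject mapping step can be illustrated with a minimal sketch: mined term patterns from a document are looked up in a (toy) ontology that maps terms to subjects, and candidate categories are ranked by how many patterns they cover. The ontology entries and pattern names below are invented for illustration, not the authors' data:

```python
# Toy "world ontology": term pattern -> subject (category).
# All entries are illustrative placeholders.
from collections import Counter

ontology = {
    "neural network": "Machine Learning",
    "backpropagation": "Machine Learning",
    "stock price": "Finance",
    "interest rate": "Finance",
}

def categorise(patterns):
    """Rank subjects by the number of mined patterns they cover."""
    counts = Counter(ontology[p] for p in patterns if p in ontology)
    return counts.most_common()

doc_patterns = ["neural network", "backpropagation", "stock price"]
print(categorise(doc_patterns))  # [('Machine Learning', 2), ('Finance', 1)]
```

A real implementation would then generalise matched subjects up the ontology's concept hierarchy, as the third module describes.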

Relevance:

10.00%

Publisher:

Abstract:

Cross-Lingual Link Discovery (CLLD) is a new problem in Information Retrieval. The aim is to automatically identify meaningful and relevant hypertext links between documents in different languages. This is particularly helpful in knowledge discovery when a multilingual knowledge base is sparse in one language or another, or when the topical coverage in each language differs; such is the case with Wikipedia. Techniques for identifying new and topically relevant cross-lingual links are a current topic of interest at NTCIR, where the CrossLink task has been running since NTCIR-9 in 2011. This paper presents the evaluation framework for benchmarking cross-lingual link discovery algorithms in the context of NTCIR-9. The framework includes topics, document collections, assessments, metrics, and a toolkit for pooling, assessment, and evaluation. The assessments are divided into two separate sets: manual assessments performed by human assessors, and automatic assessments based on links extracted from Wikipedia itself. Using this framework, we show that manual assessment is more robust than automatic assessment in the context of cross-lingual link discovery.
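A minimal sketch of how discovered links can be scored against an assessment set, in the spirit of the evaluation framework described above. The link identifiers and the "manual" relevance set are invented for the example; the abstract does not specify the exact metrics used:

```python
# Score a ranked list of proposed cross-lingual links against a set of
# links judged relevant (e.g. by human assessors). Identifiers are made up.
def precision_at_k(ranked_links, relevant, k):
    """Fraction of the top-k proposed links that are judged relevant."""
    top_k = ranked_links[:k]
    return sum(1 for link in top_k if link in relevant) / k

proposed = ["en:Sydney->zh:悉尼", "en:Sydney->zh:墨尔本", "en:Sydney->zh:澳大利亚"]
manual_qrels = {"en:Sydney->zh:悉尼", "en:Sydney->zh:澳大利亚"}
print(precision_at_k(proposed, manual_qrels, 3))  # 2 of 3 judged relevant -> 2/3
```

Running the same scorer against a manual qrel set and a Wikipedia-derived automatic set is one way to compare the two assessment styles.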

Relevance:

10.00%

Publisher:

Abstract:

Nick Herd begins his institutional history of Australian commercial television in the 1880s, when an amateur inventor named Henry Sutton designed the 'telephane' with the intent of watching the Melbourne Cup from his home town of Ballarat. The 'race that stops a nation' was not broadcast live on television until 1960, but Sutton's initiative indicates how closely sport and television were aligned in Australia even before the medium existed. The first licensed commercial stations to begin regular broadcasting went on air in Sydney and Melbourne shortly before the 1956 Melbourne Olympic Games, although Herd claims that this was 'almost accidental' rather than planned (49). Only Melbourne viewers were able to see some events live, many via television sets in Ampol service stations following the company's last-minute sponsorship of coverage on Melbourne station GTV-9...

Relevance:

10.00%

Publisher:

Abstract:

The combined techniques of in situ Raman microscopy and scanning electron microscopy (SEM) have been used to study the selective oxidation of methanol to formaldehyde and the ethene epoxidation reaction over polycrystalline silver catalysts. The nature of the oxygen species formed on silver was found to depend critically upon the exact morphology of the catalyst studied. Bands at 640, 780 and 960 cm⁻¹ were identified only on silver catalysts containing a significant proportion of defects. These peaks were assigned, respectively, to subsurface oxygen species situated in the vicinity of surface dislocations, Ag(III)=O sites formed on silver atoms modified by the presence of subsurface oxygen, and O₂⁻ species stabilized on subsurface-oxygen-modified silver sites. The selective oxidation of methanol to formaldehyde was determined to occur at defect sites, where reaction of methanol with subsurface oxygen initially produced subsurface OH species (451 cm⁻¹) and adsorbed methoxy species. Two distinct forms of adsorbed ethene were identified on oxidised silver sites. One of these was created on silver sites modified by the interaction of subsurface oxygen species, and the other on silver crystal planes containing a surface coverage of atomic oxygen species. The selective oxidation of ethene to ethylene oxide was achieved by the reaction between ethene adsorbed on modified silver sites and electrophilic Ag(III)=O species, whereas the combustion reaction was perceived to take place by the reaction of adsorbed ethene with nucleophilic surface atomic oxygen species. Defects were determined to play a critical role in the epoxidation reaction, as these sites allowed the rapid diffusion of oxygen into subsurface positions and consequently facilitated the formation of the catalytically active Ag(III)=O sites.

Relevance:

10.00%

Publisher:

Abstract:

Increases in the functionality, power and intelligence of modern engineered systems have led to complex systems with large numbers of interconnected dynamic subsystems. In such machines, faults in one subsystem can cascade and affect the behavior of numerous other subsystems. This complicates traditional fault monitoring procedures because of the need to train models of the faults that the monitoring system needs to detect and recognize. Unavoidable design defects, quality variations and differing usage patterns make it infeasible to foresee all possible faults, resulting in limited diagnostic coverage that can only deal with previously anticipated and modeled failures. This leads to missed detections and the costly blind swapping of acceptable components, because of one's inability to accurately isolate the source of previously unseen anomalies. To circumvent these difficulties, a new paradigm for diagnostic systems is proposed and discussed in this paper. Its feasibility is demonstrated through application examples in automotive engine diagnostics.

Relevance:

10.00%

Publisher:

Abstract:

The hepatitis C virus (HCV) affects some 150 million people worldwide. However, unlike hepatitis A and B, there is no vaccination for HCV, and approximately 75% of people exposed to HCV develop chronic hepatitis. In Australia, around 226,700 people live with chronic HCV infection, costing the government approximately $252 million per year. Historically, the standard approved/licensed treatment for HCV has been pegylated interferon with ribavirin. There are major drawbacks to interferon-based therapy, including side effects, long duration of therapy, and limited access and affordability. Our previous survey of an at-risk population reported HCV treatment coverage of only 5%. Since April 2013, a new class of interferon-free treatments for chronic HCV has been subsidised under the Pharmaceutical Benefits Scheme: boceprevir and telaprevir, estimated to cost the Australian Government in excess of $220 million over five years. Other biologic interferon-free therapeutic agents are scheduled to enter the Australian market. The use of small-molecule generic pharmaceuticals has been advocated as a means of public cost savings. However, with the new biologic agents, generics (biosimilars) may not be feasible or straightforward, owing to long patent life, marketing exclusivity, and regulatory complexity for these newer products.

Relevance:

10.00%

Publisher:

Abstract:

In 2010, a couple in Cairns were charged with, and later found not guilty of, illegally obtaining a medical abortion through the use of medication imported from overseas. The court case reignited the contentious debate surrounding the illegality and social acceptance of abortion in Queensland, Australia. Based on a critical discourse analysis of 150 online news media articles covering the Cairns trial, this paper argues that the media shapes perceptions of deviance and stigma in relation to abortion through the use of language. In this case, the Cairns couple were positioned as deviant for pursuing abortion on the basis that they were rejecting the social norm of motherhood. This paper identifies three key themes evident in the articles analysed that contribute to the construction of deviance: the humanising of the foetus, the stereotyping of the traditional female role of mother, and the demonising of women who choose abortion. It argues that the use of specific language in media coverage of abortion has the power to disrespect and invalidate the experiences, rights, and health of women who choose to terminate pregnancies.

Relevance:

10.00%

Publisher:

Abstract:

A bulk amount of graphite oxide was prepared by oxidation of graphite using the modified Hummers method, and its ultrasonication in organic solvents yielded graphene oxide (GO). X-ray diffraction (XRD) patterns and X-ray photoelectron (XPS), Raman and Fourier transform infrared (FTIR) spectroscopy confirmed the successful preparation of GO. The XPS survey spectrum of GO revealed the presence of 66.6 at% C and 30.4 at% O. Scanning electron microscopy (SEM) and transmission electron microscopy (TEM) images of the graphene oxide showed that it consists of a large number of graphene oxide platelets with a curled morphology comprising thin, wrinkled sheet-like structures. An AFM image of the exfoliated GO indicated that the average thickness of the GO sheets is ~1.0 nm, which is consistent with a GO monolayer. GO/epoxy nanocomposites were prepared by a typical solution-mixing technique, and the influence of GO on the mechanical and thermal properties of the nanocomposites was investigated. Regarding mechanical behaviour, 0.5 wt% GO in the nanocomposite achieved the maximum increases in elastic modulus (~35%) and tensile strength (~7%). TEM analysis provided clear images of a microstructure with homogeneous dispersion of GO in the polymer matrix. The improved strength of the GO/epoxy nanocomposites can be attributed to the inherent strength of GO, its good dispersion, and the strong interfacial interactions between the GO sheets and the polymer matrix. However, incorporation of GO had a significant negative effect on the composite's glass transition temperature (Tg), which may arise from the interference of GO with the curing reaction of the epoxy.

Relevance:

10.00%

Publisher:

Abstract:

Service bundles, in the context of e-government, are used to group together services that relate to a certain citizen need. These bundles can then be presented on a governmental one-stop portal to structure the available service offerings according to citizen expectations. To ensure that citizens use the one-stop portal and the service bundles it comprises for future transactions, the quality of these service bundles needs to be managed and maximised accordingly. Consequently, models and tools for assessing service bundle quality play an important role in increasing or retaining citizens' usage behaviour. This study provides a rigorous and structured literature review of e-government outlets with regard to their coverage of service bundle quality and e-service quality themes. The study contributes to academia and practice by providing a framework for structuring and classifying existing studies relevant to the assessment of quality for government portals. Furthermore, it provides insights into the status quo of quality models that governments can use to assess the quality of their service bundles. Directions for future research and limitations of the present study are also provided.

Relevance:

10.00%

Publisher:

Abstract:

1. Essential hypertension occurs in people with an underlying genetic predisposition who subject themselves to adverse environmental influences. The number of genes involved is unknown, as is the extent to which each contributes to final blood pressure and the severity of the disease. 2. In the past, studies of potential candidate genes have been performed by association (case-control) analysis of unrelated individuals or linkage (pedigree or sib-pair) analysis of families. These studies have resulted in several positive findings but, as one might expect, also an enormous number of negative results. 3. In order to uncover the major genetic loci for essential hypertension, it is proposed that systematically scanning the genome in 100-200 affected sibships should prove successful. 4. This involves genotyping sets of hypertensive sibships to determine their complement of several hundred microsatellite polymorphisms. Markers that are highly informative, by virtue of high heterozygosity, are most suitable. The markers also need to be spaced sufficiently evenly across the genome to ensure adequate coverage. 5. Tests are performed to detect increased segregation of alleles of each marker with hypertension. The analytical tools involve specialised statistical programs that can detect such differences; non-parametric multipoint analysis is an appropriate approach. 6. In this way, loci for essential hypertension are beginning to emerge.
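The marker-selection criterion in point 4 (high heterozygosity) can be made concrete with the standard expected-heterozygosity formula H = 1 − Σ pᵢ². The allele frequencies below are invented for illustration:

```python
def heterozygosity(allele_freqs):
    """Expected heterozygosity H = 1 - sum(p_i^2) for allele frequencies p_i."""
    assert abs(sum(allele_freqs) - 1.0) < 1e-9, "frequencies must sum to 1"
    return 1.0 - sum(p * p for p in allele_freqs)

# A microsatellite with several evenly distributed alleles is highly
# informative; one dominated by a single allele is not.
print(heterozygosity([0.25, 0.25, 0.25, 0.25]))  # 0.75 -> informative
print(heterozygosity([0.9, 0.1]))                # 0.18 -> uninformative
```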

Relevance:

10.00%

Publisher:

Abstract:

A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with applications in a vast practical domain such as security surveillance and health care, it suffers from tremendous constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation develops a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks, and of optimal camera configuration determination. Addressing the first problem, multi-object tracking and localisation, requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are the calibration of internal parameters, multi-camera calibration for localisation, and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced and required by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements.
Camera internal parameters, including the focal length, principal point, lens distortion parameter and the angle and axis of rotation, can be recovered from a minimum set of two images from the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for the calibration of a network of non-overlapping DWSCs in terms of their ground-plane homographies, which can then be used for localising objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image-plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path, using a probability maximisation process with locally generated descriptions. The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types, etc. of the cameras must be chosen so that the utility of the network is maximised (e.g. maximum coverage) while user requirements are met.
To deal with this, a statistical formulation of the problem of determining optimal camera configurations is introduced, and a Trans-Dimensional Simulated Annealing (TDSA) algorithm is proposed to solve it effectively.
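The localisation step described above can be sketched minimally: once a camera's ground-plane homography H is known, an image-plane detection (u, v) is mapped into the global coordinate frame. The homography matrix and point below are made up for illustration:

```python
# Map an image point to the ground plane via a 3x3 homography H
# (projective transform: apply H to the homogeneous point, then dehomogenise).
import numpy as np

def image_to_ground(H, u, v):
    """Return ground-plane coordinates for image point (u, v)."""
    x, y, w = H @ np.array([u, v, 1.0])
    return float(x / w), float(y / w)

H = np.array([[0.02, 0.0, -3.0],   # made-up homography for one camera
              [0.0, 0.02, -2.0],
              [0.0, 0.0, 1.0]])
print(image_to_ground(H, 320, 240))  # ≈ (3.4, 2.8) in this toy example
```

In the dissertation's scheme, each camera would estimate its own H from corresponding robot positions (global frame) and robot detections (image plane).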

Relevance:

10.00%

Publisher:

Abstract:

Research background: For almost 80 years the Chuck Taylor (or Chuck T's) All Star basketball shoe has been an iconic item of fashion apparel. The Chuck T's were first designed in 1921 by Converse, an American shoe company, and over the decades they became popular not purely for sports and athletic purposes but evolved into the shoe of choice for many subcultural groups as a fashion item. In some circles the Chuck Taylor is still seen as the "coolest" sneaker of all time, one which will never go out of fashion regardless of changing trends. With over 600 million pairs sold all over the world since its release, the Converse shoe is representative not only of a fashion culture but also of a consumption culture that evolved as the driving force behind the massive growth of the Western economic system during the 20th century. Artisan Gallery (Brisbane), in conjunction with the exhibition Reboot: Function, Fashion and the Sneaker, a history of the sneaker, selected 20 designers to customise and re-design the classic Converse Chuck Taylor All Stars shoe, and in doing so highlighted the diversity of forms possible for creative outcomes. As Artisan Gallery curator Kirsten Fitzpatrick states, "We were expecting people to draw and paint on them. Instead, we had shoes... mounted as trophies," referring to the presentation of "Converse Consumption". The exhibition ran from 21 June to 16 August 2012. Research question: The Chuck T's is one of many overwhelmingly commercially successful designs of the last century. Nowadays we are faced with the significant problem of overconsumption and the stress this places on the natural ecosystem, and consequently on people. As an active member of the industrial design fraternity, a discipline that sits at the core of this problem, how can I use this opportunity to comment on the significant issue of consumption? An effective way to do this was to associate the consumption of goods with the consumption of sugar.
There are significant similarities between our ceaseless desire to consume products and our fervent need to consume indulgent sweet foods. Artisan statement: Delicious, scrumptious, delectable... your pupils dilate, your blood pressure spikes, your liver goes into overdrive. Immediately, your brain cuts off the adenosine receptors, preventing drowsiness. Your body increases dopamine production, in turn stimulating the pleasure receptors in your brain. Your body absorbs all the sweetness and turns it into fat, while all the nutrients that you actually require begin to be destroyed and expelled. And this is only after one bite! After some time, though, your body comes crashing back to earth. You become irritable and begin to feel sluggish. Your eyelids seem heavy while your breathing pattern changes. Your body has consumed all the energy and destroyed all available nutrients. You literally begin to shut down. These are the physiological effects of sugar consumption: a perfect analogy for our modern consumer-driven world. Enjoy your dessert! Research contribution: "Converse Consumption" contributes to the conversation about over-consumption by compelling people to reflect on their consumption behaviour through the reconceptualising of the deconstructed Chuck T's in an attractive, edible form. The viewer must reconcile the desire to consume the indulgent-looking dessert with the contradictory fact that it comprises a pair of shoes. That the shoes are Chuck T's makes the effect even more powerful, given their iconic status. These clashing motivations are what make "Converse Consumption" a bizarre yet memorable experience. Significance: The exhibition was viewed by more than 1000 people and generated exceptional media coverage and public exposure and impact.
As Artisan Gallery curator Kirsten Fitzpatrick states, "20 of Brisbane's best designers were given the opportunity to customise their own Converse sneakers, with The Converse Blank Canvas Project." Selection for this project attests to the calibre and prominence of the design work.