961 results for Database application


Relevance: 20.00%

Abstract:

Rigid lenses, which were originally made from glass (between 1888 and 1940) and later from polymethyl methacrylate or silicone acrylate materials, are uncomfortable to wear and are now seldom fitted to new patients. Contact lenses became a popular mode of ophthalmic refractive error correction following the discovery of the first hydrogel material – hydroxyethyl methacrylate – by Czech chemist Otto Wichterle in 1960. To satisfy the requirements for ocular biocompatibility, contact lenses must be transparent and optically stable (for clear vision), have a low elastic modulus (for good comfort), have a hydrophilic surface (for good wettability), and be permeable to certain metabolites, especially oxygen, to allow for normal corneal metabolism and respiration during lens wear. A major breakthrough in respect of the last of these requirements was the development of silicone hydrogel soft lenses in 1999 and techniques for making the surface hydrophilic. The vast majority of contact lenses distributed worldwide are mass-produced using cast molding, although spin casting is also used. These advanced mass-production techniques have facilitated the frequent disposal of contact lenses, leading to improvements in ocular health and fewer complications. More than one-third of all soft contact lenses sold today are designed to be discarded daily (i.e., ‘daily disposable’ lenses).

Relevance: 20.00%

Abstract:

With new developments in battery technologies, increasing application of Battery Energy Storage Systems (BESS) in power systems is anticipated in the near future. BESS have already been used for primary frequency regulation. This paper examines the feasibility of combining BESS with load shedding in response to large disturbances in power systems. Load shedding is a conventional response to large disturbances, and its frequency-control performance improves when combined with a BESS. Recent reports indicate that high-power BESS will enter practical service within the next five years. A simple low-order SMR model is used as the test system, together with an incremental model of the BESS. Because continuous disturbances are not the main concern of this paper, the rate of change of frequency (df/dt) is not considered.
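The combined load-shedding-plus-BESS response described above can be sketched with a toy low-order frequency model. This is an illustrative reconstruction, not the paper's model: the parameter values, the droop-style BESS command, the power limits and the shedding threshold are all assumptions.

```python
# Illustrative low-order frequency-response sketch (NOT the paper's model):
# a single-machine swing equation with underfrequency load shedding and a
# first-order BESS lag. All parameter values are assumed, in per-unit.

def simulate(dp_loss=0.2, h=5.0, d=1.0, bess_gain=5.0, t_bess=0.1,
             shed_thresh=-0.5, shed_step=0.1, shed_max=0.2,
             dt=0.01, t_end=10.0):
    """Return the frequency-deviation trace after a sudden generation loss."""
    f, p_bess, shed = 0.0, 0.0, 0.0
    trace = []
    for _ in range(round(t_end / dt)):
        if f < shed_thresh and shed < shed_max:      # underfrequency shedding
            shed += shed_step
        p_ref = max(-0.3, min(0.3, -bess_gain * f))  # droop command, +/-0.3 p.u.
        p_bess += dt / t_bess * (p_ref - p_bess)     # BESS first-order response
        dp = -dp_loss + shed + p_bess - d * f        # net power imbalance
        f += dt * dp / (2.0 * h)                     # swing equation step
        trace.append(f)
    return trace
```

With these assumed defaults the BESS droop arrests the decline well above the shedding threshold, so no load is shed; a larger `dp_loss` would trigger the shedding stage as well.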

Relevance: 20.00%

Abstract:

Low-speed rotating machines, the most critical components in the drive train of wind turbines, are threatened by a range of technical and environmental defects. These factors strengthen the economic case for health monitoring and condition monitoring of such systems. A defect in such a system releases energy at only a very low rate, so condition monitoring techniques that rely on detecting energy loss are difficult, if not impossible, to apply. The Acoustic Emission (AE) technique largely overcomes this issue and is well suited to detecting very small energy release rates. AE as a technique is more than 50 years old and detects the sounds associated with the failure of materials. An acoustic emission is a non-stationary signal arising from elastic stress waves in a failing component; it permits online monitoring and is very sensitive for fault diagnosis. This paper first discusses the history and background of AE across its main periods of development: the Age of Enlightenment (1950-1967), the Golden Age of AE (1967-1980) and the Period of Transition (1980-present). The next section discusses the application of AE condition monitoring to machinery processes and the various systems in which AE has been applied for health monitoring. Finally, an experimental result from a QUT test rig, in which an outer-race bearing fault was simulated, demonstrates the sensitivity of AE for detecting incipient faults in low-speed machines.

Relevance: 20.00%

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned with a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior to the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
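The conditioning step the abstract describes, fitting a linear state-space prior to design data by Kalman smoothing, can be sketched for a scalar model. This is a generic Rauch-Tung-Striebel smoother, not the authors' emulator: the Gaussian-process treatment of the innovation terms as functions of model parameters is omitted, and the values of `a`, `q`, `r`, `x0` and `p0` are assumptions.

```python
import numpy as np

# Generic sketch of conditioning a simplified linear state-space prior on
# design-data outputs (NOT the authors' emulator):
#   x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (simplified linear dynamics)
#   y_t = x_t + v_t,          v_t ~ N(0, r)   (design-data outputs)

def rts_smooth(y, a=0.9, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """Condition the linear prior on outputs y; return the smoothed means."""
    n = len(y)
    xp = np.zeros(n); pp = np.zeros(n)   # one-step predictions
    xf = np.zeros(n); pf = np.zeros(n)   # filtered estimates
    x, p = x0, p0
    for t in range(n):                   # forward Kalman filter
        x, p = a * x, a * a * p + q      # predict
        xp[t], pp[t] = x, p
        k = p / (p + r)                  # Kalman gain (observation H = 1)
        x, p = x + k * (y[t] - x), (1.0 - k) * p
        xf[t], pf[t] = x, p
    xs = xf.copy()
    for t in range(n - 2, -1, -1):       # backward RTS smoothing pass
        g = a * pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs
```

Because both passes are linear-time in the number of output steps, this conditioning stays cheap even with many closely spaced time points, which is exactly the regime where plain GASP emulators run into numerical trouble.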

Relevance: 20.00%

Abstract:

The terrorist attacks in the United States on September 11, 2001 appeared to be a harbinger of increased terrorism and violence in the 21st century, bringing terrorism and political violence to the forefront of public discussion. Questions about these events abound, and “Estimating the Historical and Future Probabilities of Large Scale Terrorist Event” [Clauset and Woodard (2013)] asks specifically, “how rare are large scale terrorist events?” and, in general, encourages discussion on the role of quantitative methods in terrorism research and in policy and decision-making. Answering the primary question raises two challenges. The first is identifying terrorist events. The second is finding a simple yet robust model for rare events that has good explanatory and predictive capabilities. The challenge of identifying terrorist events is acknowledged and addressed by reviewing and using data from two well-known and reputable sources: the Memorial Institute for the Prevention of Terrorism-RAND database (MIPT-RAND) [Memorial Institute for the Prevention of Terrorism] and the Global Terrorism Database (GTD) [National Consortium for the Study of Terrorism and Responses to Terrorism (START) (2012), LaFree and Dugan (2007)]. Clauset and Woodard (2013) provide a detailed discussion of the limitations of the data and the models used, in the context of the larger issues surrounding terrorism and policy.
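A common quantitative ingredient in this line of work is extrapolating the probability of very large events from a heavy-tailed fit to event severities. The sketch below uses the standard continuous power-law maximum-likelihood estimator; it is a generic illustration, not Clauset and Woodard's actual procedure, and the threshold `xmin` and any data fed to it are assumptions.

```python
import math

# Generic heavy-tail sketch (NOT Clauset and Woodard's actual method):
# fit the tail exponent of event severities above an assumed threshold
# xmin with the continuous power-law MLE, then extrapolate P(X >= x).

def power_law_alpha(severities, xmin):
    """Continuous power-law MLE: alpha = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in severities if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def tail_prob(x, xmin, alpha, p_tail):
    """P(X >= x) = p_tail * (x / xmin)**(1 - alpha), with p_tail = P(X >= xmin)."""
    return p_tail * (x / xmin) ** (1.0 - alpha)
```

For example, with a fitted exponent `alpha = 2.0` and a tail fraction `p_tail = 1.0` at `xmin = 10`, an event ten times the threshold has extrapolated exceedance probability 0.1.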

Relevance: 20.00%

Abstract:

Many emerging economies are dangling the patent system as an incentive to stimulate biotechnological innovation, on the premise that such innovation will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with this mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of the sequence listing data in their own databases. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university, and builds and encourages research networks to complement the skills needed to make a contribution in the real world.

Relevance: 20.00%

Abstract:

In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through an analogy between the states of a quantum system and the terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of the probabilities generated by Logical Imaging. By using Quantum Theory it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to validate this empirically, the foundations for doing so are provided.

Relevance: 20.00%

Abstract:

An online interactive map and associated database including textual extracts and audiovisual material of film/novel/play locations in Australia.

Relevance: 20.00%

Abstract:

There has been significant research in the field of database watermarking recently. However, insufficient attention has been given to providing reversibility (the ability to revert from the watermarked relation back to the original relation) and blindness (not needing the original relation for detection) at the same time. Schemes lacking these two properties have several disadvantages compared with reversible and blind watermarking (which requires only the watermarked relation and a secret key, from which the watermark is detected and the original relation restored): the inability to identify the rightful owner after a successful secondary watermarking attack, the inability to revert the relation to the original data set (required in high-precision industries), and the need to store the unmarked relation in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We utilize difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage. We have implemented our scheme, and results show that the success rate is limited to 11% even when 48% of the tuples are modified.
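Difference expansion, the reversibility primitive named in the abstract, embeds one bit into an integer pair by doubling the pair's difference while preserving its integer average, so the original pair is recovered exactly on extraction. The sketch below shows the primitive on a single pair; how the scheme selects pairs from tuples and keys the watermark is scheme-specific and not shown.

```python
# Difference expansion on a single integer pair (Tian-style primitive).
# The integer average is invariant under embedding; the difference is
# doubled with the payload bit appended in its low-order position.

def de_embed(x, y, bit):
    """Embed one bit into the pair (x, y); return the marked pair."""
    l = (x + y) // 2          # integer average (preserved by embedding)
    h = x - y                 # difference
    h2 = 2 * h + bit          # expanded difference carrying the payload bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    """Recover the embedded bit and restore the original pair exactly."""
    l = (x2 + y2) // 2        # same integer average as before embedding
    h2 = x2 - y2
    bit = h2 & 1              # payload bit sits in the low-order position
    h = h2 >> 1               # arithmetic shift = floor division by 2
    return (l + (h + 1) // 2, l - h // 2), bit
```

For instance, embedding bit 1 into (7, 5) gives (9, 4), and extraction returns ((7, 5), 1): the detector needs only the marked pair, which is what makes the primitive both blind and reversible.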


Relevance: 20.00%

Abstract:

This is the protocol for a review and there is no abstract. The objectives are as follows: To determine the evidence supporting the use of recruitment manoeuvres in mechanically ventilated neonates and identify the optimal method of lung recruitment. To determine the effects of lung recruitment manoeuvres in neonates receiving ventilatory support on neonatal mortality and development of chronic lung disease when compared to no recruitment. If data are available, subgroup analyses will include: chronological age, gestational age, lung pathophysiology and pre-existing lung disease, mode and length of ventilation, timing and frequency of recruitment techniques.

Relevance: 20.00%

Abstract:

This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, with resulting misunderstandings of concepts being prevalent...

Relevance: 20.00%

Abstract:

The past decade has seen an increase in the number of significant natural disasters that have caused considerable loss of life as well as damage to property markets in the affected areas. In many cases these disasters have caused not only significant property damage but the total destruction of property in the location. With these disasters attracting considerable media attention, the public are more aware of where the affected property markets are, and of the overall damage to the properties concerned. This heightened level of awareness must have an impact on participants in the property market, whether developer, vendor or investor. To assess this issue, a residential property market that was affected by a significant natural disaster within the past two years has been analysed to determine the overall impact of the disaster on buyer, renter and vendor behaviour, as well as on prices in these residential markets. This paper is based on data from the Brisbane flood of January 2011, a natural disaster that resulted in loss of life and in the partial or total devastation of considerable residential property sectors. The research draws on the residential sales and rental listings for each week of the study period to determine the level of activity in the specific property sectors, compared with the median house prices for the various suburbs over the same period, with suburbs classified as either flood-affected or flood-free. As 48 suburbs are included in the study, it has been possible to group them on a socio-economic basis to identify possible differences due to location and value. Data were accessed from realestate.com.au (a free real estate site that provides current rental and sales listings on a suburb basis), RP Data (a commercial property sales database) and the Australian Bureau of Statistics.
The paper found that sales listings fell immediately after the flood in the affected areas, with no corresponding fall or rise in sales listings in the flood-free suburbs. There was a significant decrease in the number of rental listings following the flood as affected parties sought alternative accommodation. The greatest fall in rental listings was in areas close to the flood-affected suburbs, indicating a desire to remain close to the flooded property during the repair period.

Relevance: 20.00%

Abstract:

Zero valent iron (ZVI) was prepared by reducing natural goethite (NG-ZVI) and synthetic goethite (SG-ZVI) in hydrogen at 550 °C. XRD, TEM, FESEM/EDS and a specific surface area (SSA) and pore analyser were used to characterize the goethites and the reduced goethites. Both NG-ZVI and SG-ZVI, with sizes ranging from the nanoscale to several hundred nanometers, were obtained by reducing the goethites at 550 °C. The reductive capacity of the ZVIs was assessed by removal of Cr(VI) at ambient temperature, in comparison with that of commercial iron powder (CIP). The effects of contact time, initial concentration and reaction temperature on Cr(VI) removal were investigated. Furthermore, the uptake mechanism was discussed with reference to isotherms, thermodynamic analysis and XPS results. The results showed that SG-ZVI had the best reductive capacity toward Cr(VI), reducing Cr(VI) to Cr(III). The results suggest that hydrogen reduction is a good approach for preparing ZVI, and that this type of ZVI is potentially useful for remediating heavy metals as a permeable reactive barrier material.
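The isotherm analysis mentioned above is commonly performed by fitting a Langmuir model to the equilibrium uptake data. The sketch below fits the linearised Langmuir form C/q = C/q_max + 1/(K·q_max) by least squares; it is a generic illustration with hypothetical data, not the paper's actual analysis or values.

```python
# Generic Langmuir isotherm fit (NOT the paper's analysis): the linearised
# form C/q = C/q_max + 1/(K*q_max) is a straight line in (C, C/q), so a
# least-squares line recovers q_max from the slope and K from the intercept.

def langmuir_fit(c, q):
    """Fit q = q_max*K*C/(1 + K*C) to equilibrium data; return (q_max, K)."""
    y = [ci / qi for ci, qi in zip(c, q)]    # linearised ordinate C/q
    n = len(c)
    sx, sy = sum(c), sum(y)
    sxx = sum(ci * ci for ci in c)
    sxy = sum(ci * yi for ci, yi in zip(c, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    q_max = 1.0 / slope                      # slope = 1/q_max
    k = slope / intercept                    # intercept = 1/(K*q_max)
    return q_max, k
```

On synthetic data generated with q_max = 10 and K = 0.5 the fit recovers both parameters exactly, which is a quick self-check before applying it to measured Cr(VI) uptake values.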

Relevance: 20.00%

Abstract:

Evaluates trends in the imagery built into GIS applications to supplement existing vector data of streets, boundaries, infrastructure and utilities. These include large area digital orthophotos, Landsat and SPOT data. Future developments include 3 to 5 metre pixel resolutions from satellites, 1 to 2 metres from aircraft. GPS and improved image analysis techniques will also assist in improving resolution and accuracy.