834 results for 1007 Nanotechnology


Relevance:

10.00%

Publisher:

Abstract:

The roles of weather variability and sunspot activity in the occurrence of cyanobacteria blooms were investigated using cyanobacteria cell data collected from the Fred Haigh Dam, Queensland, Australia. A time-series generalized linear model and a classification and regression tree (CART) model were used in the analysis. Data on notified cell numbers of cyanobacteria and weather variables over the period 2001 to 2005 were provided by the Australian Department of Natural Resources and Water and the Australian Bureau of Meteorology, respectively. The results indicate that monthly minimum temperature (relative risk [RR]: 1.13; 95% confidence interval [CI]: 1.02-1.25) and rainfall (RR: 1.11; 95% CI: 1.03-1.20) were positively associated, while relative humidity (RR: 0.94; 95% CI: 0.91-0.98) and wind speed (RR: 0.90; 95% CI: 0.82-0.98) were negatively associated with cyanobacterial cell numbers, after adjustment for seasonality and autocorrelation. The CART model showed that the cyanobacteria numbers were best described by an interaction between minimum temperature, relative humidity, and sunspot numbers. When minimum temperature exceeded 18°C and relative humidity was under 66%, the number of cyanobacterial cells rose 2.15-fold. We conclude that weather variability and sunspot activity may affect cyanobacterial blooms in dams.
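The relative risks and confidence intervals quoted above are obtained by exponentiating the log-scale coefficients of such a generalized linear model. A minimal sketch of that conversion, using hypothetical coefficient values (not from the paper) chosen so the output matches the reported minimum-temperature estimate (RR 1.13; 95% CI 1.02-1.25):

```python
import math

def relative_risk_ci(beta, se, z=1.96):
    """Convert a log-scale GLM coefficient and its standard error
    into a relative risk with a 95% confidence interval."""
    rr = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return rr, lo, hi

# Hypothetical coefficient and standard error, chosen to reproduce
# the reported minimum-temperature estimate.
rr, lo, hi = relative_risk_ci(beta=0.122, se=0.052)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 1.13 1.02 1.25
```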


This talk proceeds from the premise that IR should engage in a more substantial dialogue with cognitive science. After all, how users decide relevance, or how they choose terms to modify a query, are processes rooted in human cognition. Recently, there has been a growing literature applying quantum theory (QT) to model cognitive phenomena. This talk will survey recent research, in particular the modelling of interference effects in human decision making. One aspect of QT will be illustrated: how quantum entanglement can be used to model word associations in human memory. The implications of this will be briefly discussed in terms of a new approach for modelling concept combinations. Tentative links to human abductive reasoning will also be drawn. The basic theme behind this talk is that QT can potentially provide a new genre of information processing models (including search) more closely aligned with human cognition.


In this paper, we classify, review, and experimentally compare the major methods exploited in the definition, adoption, and utilization of element similarity measures in the context of XML schema matching. We aim to present a unified view that is useful when developing a new element similarity measure, when implementing an XML schema matching component, when using an XML schema matching system, and when comparing XML schema matching systems.
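The simplest family of such measures is string-based similarity between element names. As an illustration only (a generic measure and hypothetical element names, not ones drawn from the paper), a Dice coefficient over character trigrams:

```python
def ngrams(s, n=3):
    """Set of character n-grams of a lower-cased string."""
    s = s.lower()
    return {s[i:i + n] for i in range(len(s) - n + 1)}

def dice_similarity(a, b, n=3):
    """Dice coefficient over character n-grams, a common string-based
    element similarity measure used in schema matching."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

# Hypothetical element names from two schemas being matched.
score = dice_similarity("CustomerAddress", "CustAddress")
print(round(score, 2))  # 0.64
```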


Scoliosis is a three-dimensional spinal deformity which requires surgical correction in progressive cases. To optimize correction and avoid complications following scoliosis surgery, patient-specific finite element models (FEM) are being developed and validated by our group. In this paper, the modeling methodology is described and two clinically relevant load cases are simulated for a single patient. Firstly, a pre-operative patient flexibility assessment, the fulcrum bending radiograph, is simulated to assess the model's ability to represent spine flexibility. Secondly, intra-operative forces during single-rod anterior correction are simulated. Clinically, the patient had an initial Cobb angle of 44 degrees, which reduced to 26 degrees during fulcrum bending. Surgically, the coronal deformity corrected to 14 degrees. The simulated initial Cobb angle was 40 degrees, which reduced to 23 degrees following the fulcrum bending load case. The simulated surgical procedure corrected the coronal deformity to 14 degrees. The computed results for the patient-specific FEM are within the accepted clinical Cobb measuring error of 5 degrees, suggesting that this modeling methodology is capable of capturing the biomechanical behaviour of a scoliotic human spine during anterior corrective surgery.


Programs written in languages of the Oberon family usually contain runtime tests on the dynamic type of variables. In some cases it may be desirable to reduce the number of such tests. Typeflow analysis is a static method of determining bounds on the types that objects may possess at runtime. We show that this analysis is able to reduce the number of tests in certain plausible circumstances. Furthermore, the same analysis is able to detect, at compile time, certain program errors that would normally be detected only at program execution. This paper introduces the concepts of typeflow analysis and details its use in the reduction of runtime overhead in Oberon-2.


The portability and runtime safety of programs executed on the Java Virtual Machine (JVM) make the JVM an attractive target for compilers of languages other than Java. Unfortunately, the JVM was designed with the Java language in mind, and lacks many of the primitives required for a straightforward implementation of other languages. Here, we discuss how the JVM may be used to implement other object-oriented languages. As a practical example of the possibilities, we report on a comprehensive case study. The open source Gardens Point Component Pascal compiler compiles the entire Component Pascal language, a dialect of Oberon-2, to JVM bytecodes. This compiler achieves runtime efficiencies comparable to native-code implementations of procedural languages.


Managed execution frameworks, such as the .NET Common Language Runtime or the Java Virtual Machine, provide a rich environment for the creation of application programs. These execution environments are ideally suited to languages that depend on type-safety and the declarative control of feature access. Furthermore, such frameworks typically provide a rich collection of library primitives specialized for almost every domain of application programming. Thus, when a new language is implemented on one of these frameworks it becomes necessary to provide some kind of mapping from the new language to the libraries of the framework. The design of such mappings is challenging, since the type system of the new language may not span the domain exposed in the library application programming interfaces (APIs). The nature of these design considerations was clarified during the implementation of the Gardens Point Component Pascal (gpcp) compiler. In this paper we describe the issues, and the solutions that we settled on in this case. The problems that were solved have wider applicability than our example alone, since they arise whenever a similar language is hosted in such an environment.


Stroke is a leading cause of disability and death. This study evaluated the association between temperature variation and emergency admissions for stroke in Brisbane, Australia. Daily emergency admissions for stroke, together with meteorological and air pollution data, were obtained for the period January 1996 to December 2005. The relative risk of emergency admissions for stroke was estimated with a generalized estimating equations (GEE) model. For primary intracerebral hemorrhage (PIH) emergency admissions, the average daily PIH for the group aged < 65 increased by 15% (95% confidence interval [CI]: 5, 26%) and 12% (95% CI: 2, 22%) for a 1°C increase in daily maximum temperature and minimum temperature in summer, respectively, after controlling for the potential confounding effects of humidity and air pollutants. For ischemic stroke (IS) emergency admissions, the average daily IS for the group aged ≥ 65 decreased by 3% (95% CI: -6, 0%) for a 1°C increase in daily maximum temperature in winter, after adjustment for confounding factors. Temperature variation was significantly associated with emergency admissions for stroke, and its impact varied with the type of stroke. Health authorities should pay greater attention to a possible increased need for emergency stroke care when temperatures change, in both summer and winter.


A major focus of research in nanotechnology is the development of novel, high-throughput techniques for the fabrication of arbitrarily shaped surface nanostructures at sub-100 nm to atomic scales. A related pursuit is the development of simple and efficient means for the parallel manipulation and redistribution of adsorbed atoms, molecules and nanoparticles on surfaces – adparticle manipulation. These techniques will be used for the manufacture of nanoscale surface-supported functional devices in nanotechnologies such as quantum computing, molecular electronics and lab-on-a-chip, as well as for modifying surfaces to obtain novel optical, electronic, chemical, or mechanical properties. A favourable approach to the formation of surface nanostructures is self-assembly. In self-assembly, nanostructures are grown by aggregation of individual adparticles that diffuse by thermally activated processes on the surface. The passive nature of this process means it is generally not suited to the formation of arbitrarily shaped structures. The self-assembly of nanostructures at arbitrary positions has been demonstrated, though this has typically required a pre-patterning treatment of the surface using sophisticated techniques such as electron beam lithography. On the other hand, a parallel adparticle manipulation technique would be suited to directing the self-assembly process to occur at arbitrary positions, without the need for pre-patterning the surface. There is at present a lack of techniques for the parallel manipulation and redistribution of adparticles to arbitrary positions on the surface. This is an issue that needs to be addressed, since such techniques can play an important role in nanotechnology. In this thesis, we propose such a technique – thermal tweezers. In thermal tweezers, adparticles are redistributed by localised heating of the surface. This locally enhances the surface diffusion of adparticles so that they rapidly diffuse away from the heated regions.
Using this technique, the redistribution of adparticles to form a desired pattern is achieved by heating the surface at specific regions. In this project, we have focussed on the holographic implementation of this approach, where the surface is heated by holographic patterns of interfering pulsed laser beams. This implementation is suitable for the formation of arbitrarily shaped structures; the only condition is that the shape can be produced by holographic means. In the simplest case, the laser pulses are linearly polarised and intersect to form an interference pattern that is a modulation of intensity along a single direction. Strong optical absorption at the intensity maxima of the interference pattern results in an approximately sinusoidal variation of the surface temperature along one direction. The main aim of this research project is to investigate the feasibility of the holographic implementation of thermal tweezers as an adparticle manipulation technique. Firstly, we investigate theoretically the surface diffusion of adparticles in the presence of a sinusoidal modulation of the surface temperature. Very strong redistribution of adparticles is predicted when there is strong interaction between the adparticle and the surface, and the amplitude of the temperature modulation is ~100 K. We have proposed a thin metallic film deposited on a glass substrate and heated by interfering laser beams (at optical wavelengths) as a means of generating a very large amplitude of surface temperature modulation. Indeed, we predict theoretically, by numerical solution of the thermal conduction equation, that the amplitude of the temperature modulation on the metallic film can be much greater than 100 K when heated by nanosecond pulses with an energy of ~1 mJ. The formation of surface nanostructures of less than 100 nm in width is predicted at optical wavelengths in this implementation of thermal tweezers.
Furthermore, we propose a simple extension to this technique in which a spatial phase shift of the temperature modulation effectively doubles or triples the resolution. Increased resolution is also predicted when the wavelength of the laser pulses is reduced. In addition, we present two distinctly different, computationally efficient numerical approaches for the theoretical investigation of the surface diffusion of interacting adparticles – the Monte Carlo Interaction Method (MCIM) and the random potential well method (RPWM). Using each of these approaches we have investigated thermal tweezers for the redistribution of both strongly and weakly interacting adparticles. We have predicted that strong interactions between adparticles can increase the effectiveness of thermal tweezers, by demonstrating practically complete adparticle redistribution into the low temperature regions of the surface. This is promising from the point of view of thermal tweezers applied to the directed self-assembly of nanostructures. Finally, we present a new and more efficient numerical approach to the theoretical investigation of thermal tweezers for non-interacting adparticles. In this approach, the local diffusion coefficient is determined from the solution of the Fokker-Planck equation. The diffusion equation is then solved numerically using the finite volume method (FVM) to directly obtain the probability density of adparticle position. We compare the predictions of this approach to those of the Ermak algorithm solution of the Langevin equation, and relatively good agreement is shown at intermediate and high friction. In the low friction regime, we predict and investigate the phenomenon of ‘optimal’ friction and describe its occurrence as due to very long jumps of adparticles as they diffuse from the hot regions of the surface. Future research directions, both theoretical and experimental, are also discussed.
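The finite volume step behind the final approach can be illustrated in one dimension. The sketch below is generic and assumes a made-up diffusion-coefficient profile (it is not the thesis code); it shows the property that motivates using the FVM for a probability density: total probability is conserved exactly, even with a spatially varying local diffusion coefficient:

```python
import math

def fvm_diffusion_step(p, D, dx, dt):
    """One explicit finite-volume update of dp/dt = d/dx(D(x) dp/dx)
    on a periodic 1-D grid, with D evaluated at cell faces."""
    n = len(p)
    new = p[:]
    for i in range(n):
        ip, im = (i + 1) % n, (i - 1) % n
        # Face-centred diffusion coefficients (arithmetic mean).
        d_right = 0.5 * (D[i] + D[ip])
        d_left = 0.5 * (D[im] + D[i])
        flux_right = -d_right * (p[ip] - p[i]) / dx
        flux_left = -d_left * (p[i] - p[im]) / dx
        new[i] = p[i] - dt * (flux_right - flux_left) / dx
    return new

# Hypothetical setup: a sinusoidal diffusion-coefficient profile
# standing in for the temperature-enhanced local diffusivity.
n, dx, dt = 64, 1.0, 0.1
D = [1.0 + 0.8 * math.sin(2 * math.pi * i / n) for i in range(n)]
p = [0.0] * n
p[n // 2] = 1.0  # all probability initially in one cell
for _ in range(200):
    p = fvm_diffusion_step(p, D, dx, dt)
print(round(sum(p), 9))  # 1.0: total probability is conserved
```

Conservation holds by construction because the flux leaving one cell through a face is exactly the flux entering its neighbour, so the face fluxes cancel in the sum over cells.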


The two outcome indices described in a companion paper (Sanson et al., Child Indicators Research, 2009) were developed using data from the Longitudinal Study of Australian Children (LSAC). These indices, one for infants and the other for 4 to 5 year old children, were designed to fill the need for parsimonious measures of children’s developmental status, to be used in analyses by a broad range of data users and to guide government policy and interventions that support young children’s optimal development. This paper presents evidence from Wave 1 data from LSAC to support the validity of these indices and their three domain scores of Physical, Social/Emotional, and Learning. Relationships between the indices and child, maternal, family, and neighborhood factors which are known to relate concurrently to child outcomes were examined. Meaningful associations were found with the selected variables, thereby demonstrating the usefulness of the outcome indices as tools for understanding children’s development in their family and socio-cultural contexts. It is concluded that the outcome indices are valuable tools for increasing understanding of influences on children’s development, and for guiding policy and practice to optimize children’s life chances.


The Longitudinal Study of Australian Children (LSAC) is a major national study examining the lives of Australian children, using a cross-sequential cohort design and data from parents, children, and teachers for 5,107 infants (3–19 months) and 4,983 children (4–5 years). Its data are publicly accessible and are used by researchers from many disciplinary backgrounds. It contains multiple measures of children’s developmental outcomes as well as a broad range of information on the contexts of their lives. This paper reports on the development of summary outcome indices of child development using the LSAC data. The indices were developed to fill the need for indicators suitable for use by diverse data users in order to guide government policy and interventions which support young children’s optimal development. The concepts underpinning the indices and the methods of their development are presented. Two outcome indices (infant and child) were developed, each consisting of three domains—health and physical development, social and emotional functioning, and learning competency. A total of 16 measures make up these three domains in the Outcome Index for the Child Cohort, and six measures in that for the Infant Cohort. These measures are described, and evidence supporting the structure of the domains and their underlying latent constructs is provided for both cohorts. The factorial structure of the Outcome Index was adequate for both cohorts, but stronger for the child than the infant cohort. It is concluded that the LSAC Outcome Index is a parsimonious measure representing the major components of development that is suitable for non-specialist data users. A companion paper (Sanson et al. 2010) presents evidence of the validity of the Index.


The scalable video coding (SVC) extension of the H.264/AVC standard enables adaptive and flexible delivery to multiple devices under various network conditions. Only a few works have addressed the influence of the different scalability parameters (frame rate, spatial resolution, and SNR) on user-perceived quality (UPQ), and only within a limited scope. In this paper, we have conducted a subjective quality assessment experiment for video sequences encoded with H.264/SVC to gain a better understanding of the correlation between video content and UPQ at all scalable layers, and of the impact of the rate-distortion method and the different scalabilities on bitrate and UPQ. Findings from this experiment will contribute to a user-centered design of adaptive delivery of scalable video streams.


A series of mobile phone prototypes called The Swarm has been developed in response to the user needs identified in a three-year empirical study of young people’s use of mobile phones. The prototypes take cues from user-led innovation and provide multiple avatars that allow individuals to define and manage their own virtual identity. This paper briefly maps the evolution of the prototypes and then describes how the pre-defined, color-coded avatars in the latest version are being given greater context and personalization through the use of digital images. This not only gives ‘serendipity a nudge’ by allowing groups to come together more easily; it also provides contextual information that can reduce gratuitous contact.
