165 results for Emulsion template


Relevance:

10.00%

Publisher:

Abstract:

The article discusses recent developments in Freedom of Information (FOI) in Queensland. It notes recent calls for a new FOI model, pointing to a radical departure from the old FOI template and the emergence of a significantly different FOI regime. Two of these reforms are the Right to Information Bill 2009 (RTI) and the Information Privacy Bill 2009 (IP). It also describes the new FOI Public Interest Test under the RTI Act.

Relevance:

10.00%

Publisher:

Abstract:

This paper considers three fields of interest in the recording process: the performer and the song; the technology of the recording context; and the commercial ambitions of the record company, and positions the record producer as a nexus at the interface of all three. The author reports his structured recollection of several recordings that all achieved substantial commercial success. The processes are considered from the author’s perspective as the record producer, and from inception of the project to completion of the recorded work. What were the processes of engagement? Do the actions reported conform to the template of nexus? This paper proposes that in all recordings the function of producer/nexus is present and necessary—it exists in the interaction of the artistry and the technology—and is a useful paradigm for analysis of the recording process.

Relevance:

10.00%

Publisher:

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centered algorithms from R to C++ becomes straightforward. The algorithms retain their overall structure as well as readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
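As a hedged illustration of the R-to-C++ workflow described above (a minimal sketch with invented names, not code from the paper), a single Kalman-filter covariance-prediction step written against the RcppArmadillo interface might look like this:

    // [[Rcpp::depends(RcppArmadillo)]]
    #include <RcppArmadillo.h>

    // Illustrative sketch: one covariance-prediction step of a Kalman filter.
    // Armadillo's expression templates pool the chained products
    // F * P * F' + Q into a single fused evaluation with few temporaries.
    // [[Rcpp::export]]
    arma::mat predict_covariance(const arma::mat& F,  // state-transition matrix
                                 const arma::mat& P,  // state covariance
                                 const arma::mat& Q)  // process-noise covariance
    {
        return F * P * F.t() + Q;
    }

Compiled and loaded from R via Rcpp::sourceCpp(), the exported function is callable directly on R matrices, which is the bidirectional link with the host R environment that the abstract mentions.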

Relevance:

10.00%

Publisher:

Abstract:

Within the communicative space that online Social Network Sites (SNS) afford, Niche Social Network Sites (NSNS) have emerged around particular geographic, demographic or topic-based communities to provide what broader SNS do not: specified and targeted content for an engaged and interested community. Drawing on a research project developed at the Queensland University of Technology in conjunction with the Australian Smart Services Cooperative Research Centre that produced an NSNS based around Adventure Travel, this paper outlines the main drivers for community creation and sustainability within NSNS. The paper asks what factors motivate users to join and stay with these sites and what, if any, common patterns can be noted in their formation. It also outlines the main barriers to online participation and content creation in NSNS, and the similarities and differences between SNS and NSNS business models. Having built a community of 100 registered members, the staywild.com.au project was a living laboratory, enabling us to document the steps taken in producing an NSNS and in cultivating and retaining active contributors. The paper incorporates observational analysis of user-generated content (UGC) and user profile submissions, statistical analysis of site usage, and findings from a survey of our membership pool in noting areas of success and of failure. In drawing on our project in this way we provide a template for future iterations of NSNS initiation and development across various other social settings: not only niche communities, but also the media and advertising with which they engage and interact. Positioned within the context of online user participation and UGC research, our paper concludes with a discussion of the ways in which the tools afforded by NSNS extend earlier understandings of online ‘communities of interest’. It also outlines the relevance of our research to larger questions about the diversity of the social media ecology.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a framework for both gradient descent image and object alignment in the Fourier domain. Our method centers upon the classical Lucas & Kanade (LK) algorithm, where we represent the source and template/model in the complex 2D Fourier domain rather than in the spatial 2D domain. We refer to our approach as the Fourier LK (FLK) algorithm. The FLK formulation is advantageous when one pre-processes the source image and template/model with a bank of filters (e.g. oriented edges, Gabor, etc.), as: (i) it can handle substantial illumination variations; (ii) the inefficient pre-processing filter bank step can be subsumed within the FLK algorithm as a sparse diagonal weighting matrix; (iii) unlike traditional LK, the computational cost is invariant to the number of filters, making the approach far more efficient; and (iv) the approach can be extended to the inverse compositional form of the LK algorithm, where nearly all steps (including the Fourier transform and filter bank pre-processing) can be pre-computed, leading to an extremely efficient and robust approach to gradient descent image matching. Further, these computational savings translate to non-rigid object alignment tasks that are considered extensions of the LK algorithm, such as those found in Active Appearance Models (AAMs).
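As a sketch of the core identity we read the abstract as exploiting (illustrative Armadillo code with invented names, not the authors' implementation), Parseval's theorem lets the LK sum-of-squared-differences cost be evaluated in the Fourier domain, where a linear filter bank collapses into one per-frequency diagonal weighting:

    #include <armadillo>

    // Hedged sketch of a Fourier-domain weighted SSD. By Parseval's theorem,
    // spatial-domain SSD equals Fourier-domain SSD (up to scale), so a bank
    // of linear filters reduces to a single diagonal weighting over
    // frequencies; the cost of evaluation is then invariant to the number
    // of filters in the bank.
    double flk_weighted_ssd(const arma::mat& image,
                            const arma::mat& tmpl,
                            const arma::mat& freq_weights) // assumed per-frequency gains
    {
        arma::cx_mat I = arma::fft2(image); // source in the 2D Fourier domain
        arma::cx_mat T = arma::fft2(tmpl);  // template/model in the 2D Fourier domain
        arma::mat residual = arma::abs(I - T); // per-frequency residual magnitude
        return arma::accu(freq_weights % arma::square(residual)) / residual.n_elem;
    }

In a full FLK iteration this weighted residual would drive the usual LK gradient descent update; the sketch shows only why the filter-bank step can be folded into a diagonal weighting.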

Relevance:

10.00%

Publisher:

Abstract:

Development of hypoxia-mimicking bone tissue engineering scaffolds is of great importance in stimulating angiogenesis for bone regeneration. Dimethyloxalylglycine (DMOG) is a cell-permeable, competitive inhibitor of hypoxia-inducible factor prolyl hydroxylase (HIF-PH), which can stabilize hypoxia-inducible factor 1α (HIF-1α) expression. The aim of this study was to develop hypoxia-mimicking scaffolds by delivering DMOG in mesoporous bioactive glass (MBG) scaffolds and to investigate whether the delivery of DMOG could induce a hypoxic microenvironment for human bone marrow stromal cells (hBMSC). MBG scaffolds with varied mesoporous structures (e.g. surface area and mesopore volume) were prepared by controlling the content of the mesopore-template agent. The composition, large-pore microstructure and mesoporous properties of the MBG scaffolds were characterized. The effect of the mesoporous properties on the loading and release of DMOG in MBG scaffolds was investigated. The effects of DMOG delivery on the cell morphology, cell viability, HIF-1α stabilization, vascular endothelial growth factor (VEGF) secretion and bone-related gene expression (alkaline phosphatase, ALP; osteocalcin, OCN; and osteopontin, OPN) of hBMSC in MBG scaffolds were systematically investigated. The results showed that the loading and release of DMOG in MBG scaffolds can be efficiently controlled by regulating their mesoporous properties via the addition of different amounts of the mesopore-template agent. DMOG delivery in MBG scaffolds had no cytotoxic effect on the viability of hBMSC. DMOG delivery significantly induced HIF-1α stabilization, VEGF secretion and bone-related gene expression of hBMSC in MBG scaffolds, in which DMOG counteracted the effect of HIF-PH and stabilized HIF-1α expression under normoxic conditions. Furthermore, MBG scaffolds with slow DMOG release enhanced the expression of bone-related genes significantly more than those with instant DMOG release. The results suggest that the controllable delivery of DMOG in MBG scaffolds can mimic a hypoxic microenvironment, which not only improves the angiogenic capacity of hBMSC but also enhances their osteogenic differentiation.

Relevance:

10.00%

Publisher:

Abstract:

Adaptation of novels and other source texts into theatre has proven to be a recurring and popular form of writing through the ages. This study argues that, as the theoretical discourse has moved on from outmoded notions of fidelity to original sources, the practice of adaptation is a method of re-invigorating theatre forms and inventing new ones. This practice-led research employed a tripartite methodology comprising the writing of two play adaptations, participation by the author/researcher in their productions, and exegetical components focused on the development and deployment of analytical tools. These tools were derived from theoretical literature and from a creative practice based on acquired professional artistry "learnt by doing" over a longstanding professional career as actor, director and writer. A suite of analytical tools was developed through the three phases of the first project, the adaptation of Nick Earls’ novel Perfect Skin. The tools draw on Cardwell’s "comparative analysis", which encompasses close consideration of generic context, authorial context and medium-specific context; and on Stam’s "mechanics of narrative": order, duration, frequency, the narrator and point of view. A third analytical lens was developed from an awareness of the significance of the commissioning brief, and of ethical considerations and obligations to the source text, its author and its audience. The tripartite methodology provided an adaptation template that was applied to the writing and production of the second play, Red Cap, which used factual and anecdotal sources. The second play’s exegesis (Chapter 10) analyses the effectiveness of the suite of analytical tools and the reception of the production, in order to conclude the study with a workable model for use in the practice of adapting existing texts, both factual and fictional, for the theatre.

Relevance:

10.00%

Publisher:

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, and as a result the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility, and it is designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
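To make the quantization-and-encoding stage concrete (a minimal sketch under our own assumptions, not the dissertation's algorithm), a common 1-bit quantizer learns per-feature thresholds from training data and then binarizes each query feature vector against them:

    #include <armadillo>

    // Hedged sketch: binarize real-valued features against per-feature
    // thresholds learnt from a training set. The learnt thresholds must be
    // stored with the system, which is precisely the information-leakage
    // concern raised above.
    arma::uvec binary_hash(const arma::mat& training_features, // one sample per row
                           const arma::rowvec& query_features)
    {
        // Median of each feature over the training set: a 1-bit quantizer
        // that balances the two output symbols.
        arma::rowvec thresholds = arma::median(training_features, 0);
        // Binary encoding: bit i is 1 iff feature i exceeds its threshold.
        return arma::conv_to<arma::uvec>::from(query_features > thresholds);
    }

How representative the training set is of deployment data determines where the thresholds fall, which is why the dissertation argues quantizer training affects both accuracy and security.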

Relevance:

10.00%

Publisher:

Abstract:

A comprehensive study was conducted on mesoporous MCM-41. Spectroscopic examinations demonstrated that three types of silanol groups can be observed: single, (SiO)3Si-OH; hydrogen-bonded, (SiO)3Si-OH···HO-Si(SiO)3; and geminal, (SiO)2Si(OH)2. The number of silanol groups per nm², as determined by NMR, varies between 2.5 and 3.0 depending on the template-removal method. All these silanol groups were found to be active sites for the adsorption of pyridine, with desorption energies of 91.4 and 52.2 kJ mol⁻¹. However, only free silanol groups (single and geminal silanols) are highly accessible to the silylating agent chlorotrimethylsilane. Silylation can modify both the physical and the chemical properties of MCM-41.

Relevance:

10.00%

Publisher:

Abstract:

The Australian region spans some 60° of latitude and 50° of longitude and displays considerable regional climate variability both today and during the Late Quaternary. A synthesis of marine and terrestrial climate records, combining findings from the Southern Ocean, temperate, tropical and arid zones, identifies a complex response of climate proxies to a background of changing boundary conditions over the last 35,000 years. Climate drivers include the seasonal timing of insolation, greenhouse gas content of the atmosphere, sea level rise and ocean and atmospheric circulation changes. Our compilation finds few climatic events that could be used to construct a climate event stratigraphy for the entire region, limiting the usefulness of this approach. Instead we have taken a spatial approach, looking to discern the patterns of change across the continent. The data identify the clearest and most synchronous climatic response at the time of the Last Glacial Maximum (LGM) (21 ± 3 ka), with unambiguous cooling recorded in the ocean, and evidence of glaciation in the highlands of tropical New Guinea, southeast Australia and Tasmania. Many terrestrial records suggest drier conditions, but with the timing of inferred snowmelt, and changes to the rainfall/runoff relationships, driving higher river discharge at the LGM. In contrast, the deglaciation is a time of considerable south-east to north-west variation across the region. Warming was underway in all regions by 17 ka. Post-glacial sea level rise and its associated regional impacts have played an important role in determining the magnitude and timing of climate response in the north-west of the continent in contrast to the southern latitudes. No evidence for cooling during the Younger Dryas chronozone is evident in the region, but the Antarctic cold reversal clearly occurs south of Australia. The Holocene period is a time of considerable climate variability associated with an intense monsoon in the tropics early in the Holocene, giving way to a weakened monsoon and an increasingly El Niño-dominated ENSO to the present. The influence of ENSO is evident throughout the southeast of Australia, but not the southwest. This climate history provides a template from which to assess the regionality of climate events across Australia and make comparisons beyond our region.

Relevance:

10.00%

Publisher:

Abstract:

The T-box family transcription factor gene TBX20 acts in a conserved regulatory network, guiding heart formation and patterning in diverse species. Mouse Tbx20 is expressed in cardiac progenitor cells, differentiating cardiomyocytes, and developing valvular tissue, and its deletion or RNA interference-mediated knockdown is catastrophic for heart development. TBX20 interacts physically, functionally, and genetically with other cardiac transcription factors, including NKX2-5, GATA4, and TBX5, mutations of which cause congenital heart disease (CHD). Here, we report nonsense (Q195X) and missense (I152M) germline mutations within the T-box DNA-binding domain of human TBX20 that were associated with a family history of CHD and a complex spectrum of developmental anomalies, including defects in septation, chamber growth, and valvulogenesis. Biophysical characterization of wild-type and mutant proteins indicated how the missense mutation disrupts the structure and function of the TBX20 T-box. Dilated cardiomyopathy was a feature of the TBX20 mutant phenotype in humans and mice, suggesting that mutations in developmental transcription factors can provide a sensitized template for adult-onset heart disease. Our findings are the first to link TBX20 mutations to human pathology. They provide insights into how mutation of different genes in an interactive regulatory circuit leads to diverse clinical phenotypes, with implications for diagnosis, genetic screening, and patient follow-up.

Relevance:

10.00%

Publisher:

Abstract:

The encapsulation and release of bioactive molecules from polymeric vehicles represents the holy grail of drug and growth factor delivery therapies, whereby sustained and controlled release is crucial in eliciting a positive therapeutic effect. To this end, electrospraying is rapidly emerging as a popular technology for the production of polymeric particles containing bioactive molecules. Compared with traditional emulsion fabrication techniques, electrospraying has the potential to reduce denaturation of protein drugs and affords tighter regulation over particle size distribution and morphology. In this article, we review the importance of the electrospraying parameters that enable reproducible tailoring of the particles' physical and in vitro drug release characteristics, along with a discussion of existing in vivo data. Controlled morphology and monodispersity of particles can be achieved with electrospraying, with high encapsulation efficiencies and without unfavorable denaturation of bioactive molecules during the process. Finally, the combination of electrospraying with electrospun scaffolds, with an emphasis on tissue regeneration, is reviewed, depicting a technique in its relative infancy but one holding great promise for the future of regenerative medicine.

Relevance:

10.00%

Publisher:

Abstract:

A numerical investigation has been carried out of the coupled thermal boundary layers on both sides of a partition placed in an isosceles triangular enclosure along its middle symmetric line. The working fluid is air, which is initially quiescent. A sudden temperature difference between the two zones of the enclosure is imposed to trigger natural convection. It is observed from the numerical simulations that the development of the coupled thermal boundary layers adjacent to the partition undergoes three distinct stages: an initial stage, a transitional stage and a steady-state stage. Time-dependent features of the coupled thermal boundary layers, as well as the overall natural convection flow in the partitioned enclosure, are discussed and compared with those of the non-partitioned enclosure. Moreover, heat transfer, in the form of local and overall average Nusselt numbers through the coupled thermal boundary layers and along the inclined walls, is also examined.
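For readers outside the field, the Nusselt numbers referred to here follow the standard convention (our gloss; the symbols below are assumptions, not notation from the paper): with wall-normal coordinate n, hot- and cold-zone temperatures T_h and T_c, and wall length L,

    \mathrm{Nu}_{\mathrm{local}}
      = \left. -\frac{L}{T_h - T_c}\,\frac{\partial T}{\partial n} \right|_{\mathrm{wall}},
    \qquad
    \overline{\mathrm{Nu}} = \frac{1}{L}\int_0^L \mathrm{Nu}_{\mathrm{local}}\,\mathrm{d}s,

so the overall average is the local value integrated along the wall (arclength s).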

Relevance:

10.00%

Publisher:

Abstract:

Particles having at least regions of at least one metal oxide with nano-sized grains are produced by providing particles of a material having an initial, non-equiaxed particle shape, preparing a mixture of these particles and at least one metal oxide precursor, and treating the mixture such that the precursor reacts with the particles. The process can be a co-precipitation process, sol-gel synthesis, micro-emulsion method, surfactant-based process, or a process that uses polymers. Complex metal oxide nanoparticles are produced by (a) preparing a solution containing metal cations, (b) mixing the solution with a surfactant to form micelles within the solution, and (c) heating the micellar liquid to form the metal oxide and to remove the surfactant. The formed metal oxide particles have essentially the same morphology (particle size and shape) as the initial morphology of the material particles provided.

Relevance:

10.00%

Publisher:

Abstract:

With the increasing popularity of the galvanic replacement approach to the development of bimetallic nanocatalysts, special emphasis has been placed on minimizing the use of expensive metals (e.g. Pt) in the finally formed nanomaterials (e.g. the Ag/Pt system as a possible catalyst for fuel cells). However, complete removal of the less active sacrificial template is generally not achieved during galvanic replacement, and its residual presence may significantly affect the electrocatalytic properties of the final material. Here, we investigate the hydrogen evolution reaction (HER) activity of Ag nanocubes replaced with different amounts of Pt, and demonstrate how the bimetallic composition significantly affects the activity of the alloyed nanomaterial.