930 results for STATIONARY SECTORIAL SAMPLER


Relevance: 10.00%

Abstract:

Increasing numbers of preclinical and clinical studies are utilizing pDNA (plasmid DNA) as the vector. In addition, there has been a growing trend towards the use of ever-larger doses of pDNA in human trials. The growing demand for pDNA places pressure on manufacturers to produce more in less time. A key intervention has been the use of monoliths as stationary phases in liquid chromatography. Monolithic stationary phases offer fast separation of pDNA owing to their large pore size, which makes pDNA in the size range from 100 nm to over 300 nm easily accessible. However, the convective transport mechanism of monoliths does not guarantee plasmid purity. The recovery of pure pDNA hinges on a proper balance in the properties of the adsorbent phase, the mobile phase and the feedstock. The effects of the pH and ionic strength of the binding buffer, the feedstock temperature, the active-group density and the pore size of the stationary phase were examined as avenues to improve the recovery and purity of pDNA using a methacrylate-based monolithic adsorbent and Escherichia coli DH5α-pUC19 clarified lysate as feedstock. pDNA recovery was found to be critically dependent on the pH and ionic strength of the mobile phase. A maximum recovery of approximately 92% was obtained under optimal pH and ionic strength. Increasing the feedstock temperature to 80°C increased the purity of pDNA owing to the greater thermal stability of pDNA relative to contaminants such as proteins. Results from toxicological studies of the plasmid samples using an endotoxin standard (E. coli O55:B5 lipopolysaccharide) show that the endotoxin level decreases with increasing salt concentration. These results show that large quantities of pure pDNA can be obtained with minimal extra effort simply by optimizing process parameters and conditions for pDNA purification.

Relevance: 10.00%

Abstract:

Systemic splits between pre-compulsory and compulsory early years education impact on transitions to school through discontinuities in children’s experience. This paper presents data from a critical participatory action research project about transitions between pre-compulsory early years education and compulsory schooling in Australia. The project aim was to investigate how transitions to school might be enhanced by developing deeper professional relationships and shared understandings between teachers from both sectors. Within the communicative space afforded by a professional learning community, the participants engaged in critical conversations about their understandings of transitions practices and conditions, including systemic differences. Data analysis provides a snapshot of changes in teachers’ thinking about professional relationships, continuity and factors influencing cross-sectorial professional relationships. Findings suggest that affording opportunities for teachers to re-frame cross-sectorial professional relationships has led to transformative changes in transitions practices, understandings and conditions.

Relevance: 10.00%

Abstract:

Using cameras onboard a robot for detecting a coloured stationary target outdoors is a difficult task. Apart from the complexity of separating the target from the background scenery over different ranges, there are also inconsistencies in direct and reflected illumination from the sun, clouds, and moving and stationary objects. These can vary both the illumination on the target and its colour as perceived by the camera. In this paper, we analyse the effect of environment conditions, range to target, camera settings and image processing on the reported colours of various targets. The analysis indicates the colour space and camera configuration that provide the most consistent colour values over varying environment conditions and ranges. This information is used to develop a detection system that provides range and bearing to detected targets. The system is evaluated over lighting conditions ranging from bright sunlight to shadows and overcast days and demonstrates robust performance. The accuracy of the system is compared against a laser beacon detector, with preliminary results indicating it to be a valuable asset for long-range coloured target detection.
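
As a rough illustration of the kind of colour-based detection discussed above (not the paper's actual system), the sketch below thresholds an assumed target hue in HSV space, which tends to be more stable than RGB under changing outdoor illumination, and reports the target's pixel centroid as a bearing proxy. The hue range, OpenCV usage and synthetic test frame are all assumptions.

```python
import cv2
import numpy as np

# Illustrative sketch only: detect an assumed orange/red target by HSV
# thresholding and report its centroid and pixel area.

def detect_coloured_target(bgr_frame, hue_lo=5, hue_hi=20):
    """Return (found, centroid_xy, pixel_area) for the assumed target colour."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Saturation/value floors reject washed-out sky and dark shadow regions.
    mask = cv2.inRange(hsv, np.array([hue_lo, 80, 60]), np.array([hue_hi, 255, 255]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] < 50:                       # too few pixels: no target
        return False, None, 0.0
    cx = moments["m10"] / moments["m00"]
    cy = moments["m01"] / moments["m00"]
    return True, (cx, cy), moments["m00"]

# Example usage with a synthetic frame containing an orange square:
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.rectangle(frame, (150, 100), (180, 130), (0, 100, 255), -1)   # BGR orange
found, centroid, area = detect_coloured_target(frame)
print(found, centroid, area)
```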

Relevance: 10.00%

Abstract:

Introduction: Exposure to bioaerosols in indoor environments has been linked to various adverse health effects, such as airway disorders and upper respiratory tract symptoms. The aim of this study was to assess exposure to bioaerosols in the school environment in Brisbane, Australia. Methods: Culturable fungi and endotoxin measurements were conducted in six schools between October 2010 and May 2011. Culturable fungi (2 indoor air and 1-2 outdoor air samples per school) were assessed using a Biotest RCS High Flow Air Sampler, with a flow rate of either 50 L/min or 20 L/min. Rose bengal agar was used for recovery and was incubated prior to counting and partial identification. Endotoxins were sampled (8 h, 2 L/min) using SKC glass fibre filters (4 indoor air samples per school) and analysed using an endpoint chromogenic LAL assay. Results: The arithmetic mean fungi concentration in indoor and outdoor air was 710 cfu/m3 (125-1900 cfu/m3) and 524 cfu/m3 (140-1250 cfu/m3), respectively. The most frequently isolated fungal genus from the outdoor air was Cladosporium (over 40%), followed by Penicillium (21%) and Aspergillus (12%). The percentages of Penicillium, Cladosporium and Aspergillus in indoor air samples were 32%, 32% and 8%, respectively. The arithmetic mean endotoxin concentration was 0.59 EU/m3 (0-2.2 EU/m3). Discussion: The results of the current study are in agreement with previously reported studies, in that airborne fungi and endotoxin concentrations varied extensively and were mostly dependent on climatic conditions. In addition, the indoor air mycoflora largely reflected the fungal flora present in the outdoor air, with Cladosporium being the most common in both outdoor and indoor (with Penicillium) air. In indoor air, unusually high endotoxin levels, over 1 EU/m3, were detected at 2 schools. Although these schools were not affected by the recent Brisbane floods, persistent rain prior to and during the study period could explain the results.

Relevance: 10.00%

Abstract:

Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive, on the basis of the convective Exner equation, a stochastic fluvial process model that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and the position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to decompose the Exner equation into two separate equations: the first is the mean equation, which yields the mean sediment thickness, and the second is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers formed by fluvial processes while incorporating the stochasticity of the paleoflow velocity.
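
The following sketch is not the spectral perturbation solution described above; it is a simple Monte Carlo check of the same idea, propagating assumed velocity statistics (mean and variance) through a convective Exner-type advection of a sediment-thickness profile and estimating the resulting mean and variance of the bed topography. Grid, time step, initial profile and velocity statistics are illustrative assumptions.

```python
import numpy as np

# Monte Carlo sketch: propagate uncertainty in a constant advection velocity v
# through a convective Exner-type equation, dh/dt + v * dh/dx = 0, and
# estimate the mean and variance of sediment thickness h(x, t).

def advect_upwind(h0, v, dx, dt, n_steps):
    """First-order upwind advection of profile h0 with constant velocity v > 0."""
    h = h0.copy()
    c = v * dt / dx  # Courant number; keep <= 1 for stability
    for _ in range(n_steps):
        h[1:] = h[1:] - c * (h[1:] - h[:-1])
    return h

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1000.0, 201)            # channel reach [m]
dx = x[1] - x[0]
h0 = np.exp(-((x - 300.0) / 50.0) ** 2)      # initial sediment mound [m]

v_mean, v_std = 0.5, 0.1                     # assumed velocity statistics [m/s]
dt, n_steps = 2.0, 500                       # 1000 s of simulated transport

samples = np.array([
    advect_upwind(h0, max(rng.normal(v_mean, v_std), 1e-3), dx, dt, n_steps)
    for _ in range(500)
])
h_mean = samples.mean(axis=0)   # analogue of the mean (Exner) equation
h_var = samples.var(axis=0)     # analogue of the perturbation (variance) equation
print(f"max mean thickness: {h_mean.max():.3f} m, max variance: {h_var.max():.4f} m^2")
```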

Relevance: 10.00%

Abstract:

The digital divide is the disparity in access to information, in the ability to communicate, and in the capacity to make information and communication serve full participation in the information society. Indeed, the conversation about the digital divide has developed over the last decade from a focus on connectivity and access to information and communication technologies, to a conversation that encompasses the ability to use them and the utility that usage provides (Wei et al., 2011). However, this conversation, while transitioning from technology to the skills of the people who use them and to the fruits of their use, is limited in its ability to take into account the social role of information and communication technologies (ICTs). One successful attempt at conceptualizing the social impact of differences in access to and utilization of digital communication technologies was developed by van Dijk (2005), whose sequential model for analyzing the divide states that: “1. Categorical inequalities in society produce an unequal distribution of resources; 2. An unequal distribution of resources causes unequal access to digital technologies; 3. Unequal access to digital technologies also depends on the characteristics of these technologies; 4. Unequal access to digital technologies brings about unequal participation in society; 5. Unequal participation in society reinforces categorical inequalities and unequal distributions of resources.” (p. 15) As van Dijk’s model demonstrates, the divide’s impact is the exclusion of individuals from participation. Still left to be defined are the “categorical inequalities,” the “resources,” the “characteristics of digital technologies,” and the different levels of “access” that result in differentiated levels of participation, as these change over time due to the evolving nature of technology and the dynamics of society. And most importantly, the meaning of “participation” in contemporary society needs to be determined, as it is differentiated levels of participation that are the result of the divide and the engine of the ever-growing disparities. Our argument is structured in the following manner: We first claim that contemporary digital media differ from the previous generation of ICTs along four dimensions: they offer an abundance of information resources and communication channels compared to the relative paucity of both in the past; they offer mobility as opposed to the stationary nature of their predecessors; they are interactive in that they provide users with the capability to design their own media environments, in contrast to the dictated environs of previous architectures; and they allow users to communicate using multiple forms of mediation, unlike the uniformity of sound or word that limited users in the past. We then submit that involvement in the information society calls for egalitarian access to all four dimensions of the user experience that make contemporary media different from their predecessors, and that the ability to experience all four affects the level at which humans partake in the shaping of society. The model being cyclical, we then discuss how lower levels of participation contribute to the enhancement of social inequalities. Finally, we discuss why participation is needed in order to achieve full membership in the information society and what political philosophy should govern policy solutions targeting the re-inclusion of those digitally excluded.

Relevance: 10.00%

Abstract:

Electrospun nanofiber meshes have emerged as a new generation of scaffold membranes possessing a number of features suitable for tissue regeneration. One of these features is the flexibility to modify their structure and composition to orchestrate specific cellular responses. In this study, we investigated the effects of nanofiber orientation and surface functionalization on human mesenchymal stem cell (hMSC) migration and osteogenic differentiation. We used an in vitro model to examine hMSC migration into a cell-free zone on nanofiber meshes and mitomycin C treatment to assess the contribution of proliferation to the observed migration. Poly(ε-caprolactone) meshes with oriented topography were created by electrospinning aligned nanofibers on a rotating mandrel, while randomly oriented controls were collected on a stationary collector. Both aligned and random meshes were coated with a triple-helical, type I collagen-mimetic peptide containing the glycine-phenylalanine-hydroxyproline-glycine-glutamate-arginine (GFOGER) motif. Our results indicate that nanofiber GFOGER peptide functionalization and orientation modulate cellular behavior individually and in combination. GFOGER significantly enhanced the migration, proliferation, and osteogenic differentiation of hMSCs on nanofiber meshes. Aligned nanofiber meshes displayed increased cell migration along the direction of fiber orientation compared to random meshes; however, fiber alignment did not influence osteogenic differentiation. Comparing the two cues, GFOGER coating resulted in greater proliferation-driven cell migration, whereas fiber orientation appeared to generate a larger direct migratory effect. This study demonstrates that peptide surface modification and topographical cues associated with fiber alignment can be used to direct cellular behavior on nanofiber mesh scaffolds, which may be exploited for tissue regeneration.

Relevance: 10.00%

Abstract:

Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is sufficient to model the evolution of the data accurately over the whole tree. However, an increasing body of data suggests that evolution under these conditions is the exception rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using the Γ (gamma) distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise. The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving slower than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
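
As a small numerical aside on the SRH terminology used above, the sketch below builds a GTR rate matrix from arbitrary exchangeabilities and base frequencies and verifies the stationarity and reversibility (detailed balance) conditions; the parameter values are made up purely for illustration.

```python
import numpy as np

# Sketch: construct a GTR rate matrix Q from symmetric exchangeabilities s and
# a stationary distribution pi, then check stationarity (pi Q = 0) and
# reversibility (pi_i q_ij = pi_j q_ji). Values are illustrative assumptions.

pi = np.array([0.3, 0.2, 0.2, 0.3])                  # A, C, G, T frequencies
s = np.array([[0.0, 1.0, 2.5, 0.8],                  # symmetric exchangeabilities
              [1.0, 0.0, 0.7, 2.2],
              [2.5, 0.7, 0.0, 1.0],
              [0.8, 2.2, 1.0, 0.0]])

Q = s * pi[np.newaxis, :]                            # q_ij = s_ij * pi_j
np.fill_diagonal(Q, -Q.sum(axis=1))                  # rows sum to zero

print("stationarity, pi @ Q ~ 0:", np.allclose(pi @ Q, 0.0))
print("reversibility, pi_i q_ij == pi_j q_ji:",
      np.allclose(pi[:, None] * Q, (pi[:, None] * Q).T))
```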

Relevance: 10.00%

Abstract:

Based on protein molecular dynamics, we investigate the fractal properties of energy, pressure and volume time series using multifractal detrended fluctuation analysis (MF-DFA) and the topological and fractal properties of their converted horizontal visibility graphs (HVGs). The energy parameters of protein dynamics we considered are bonded potential, angle potential, dihedral potential, improper potential, kinetic energy, Van der Waals potential, electrostatic potential, total energy and potential energy. The shape of the h(q) curves from MF-DFA indicates that these time series are multifractal. The numerical values of the exponent h(2) of MF-DFA show that the series of total energy and potential energy are non-stationary and anti-persistent; the other time series are stationary and persistent, apart from the series of pressure (with H ≈ 0.5, indicating the absence of long-range correlation). The degree distributions of their converted HVGs show that these networks are exponential. The results of fractal analysis show that fractality exists in these converted HVGs. For each energy, pressure or volume parameter, it is found that the values of h(2) from MF-DFA on the time series, the exponent λ of the exponential degree distribution and the fractal dimension d_B of their converted HVGs do not change much for different proteins (indicating some universality). We also found that, after taking the average over all proteins, there is a linear relationship between ⟨h(2)⟩ (from MF-DFA on the time series) and ⟨d_B⟩ of the converted HVGs for the different energy, pressure and volume parameters.
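
A minimal sketch of the horizontal visibility graph construction referred to above: two time points are linked whenever every intermediate value lies strictly below the smaller of the two. The random-walk input series is an illustrative stand-in for the protein energy, pressure and volume series analysed in the study.

```python
import numpy as np

# Sketch: build the horizontal visibility graph (HVG) of a time series and
# inspect its degree statistics. Simple O(n^2) reference implementation.

def horizontal_visibility_graph(x):
    """Return the HVG adjacency list: i and j are linked if every value
    strictly between them is below min(x[i], x[j])."""
    n = len(x)
    adj = {i: set() for i in range(n)}
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                adj[i].add(j)
                adj[j].add(i)
            if x[j] >= x[i]:     # later points are hidden behind x[j]; stop
                break
    return adj

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(300))   # toy non-stationary series
adj = horizontal_visibility_graph(series)
degrees = np.array([len(neigh) for neigh in adj.values()])
print(f"nodes: {len(adj)}, mean degree: {degrees.mean():.2f}, max degree: {degrees.max()}")
```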

Relevance: 10.00%

Abstract:

We consider the problem of controlling a Markov decision process (MDP) with a large state space, so as to minimize average cost. Since it is intractable to compete with the optimal policy for large-scale problems, we pursue the more modest goal of competing with a low-dimensional family of policies. We use the dual linear programming formulation of the MDP average cost problem, in which the variable is a stationary distribution over state-action pairs, and we consider a neighborhood of a low-dimensional subset of the set of stationary distributions (defined in terms of state-action features) as the comparison class. We propose a technique based on stochastic convex optimization and give bounds showing that the performance of our algorithm approaches the best achievable by any policy in the comparison class. Most importantly, this result depends on the size of the comparison class, but not on the size of the state space. Preliminary experiments show the effectiveness of the proposed algorithm in a queuing application.
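
To make the dual formulation concrete, the toy example below (not the paper's stochastic convex optimization algorithm) solves the dual LP of a small average-cost MDP directly, with the variable being a stationary distribution over state-action pairs; the three-state, two-action MDP is randomly generated purely for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Sketch: the dual LP of the average-cost MDP. Variable mu(s, a) is a
# stationary state-action distribution; minimize expected per-step cost.

n_s, n_a = 3, 2
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_s), size=(n_s, n_a))   # P[s, a, s']: transition kernel
cost = rng.uniform(0.0, 1.0, size=(n_s, n_a))      # per-step costs c(s, a)

c = cost.ravel()                                   # flatten mu(s, a) into a vector

# Stationarity: sum_a mu(s', a) = sum_{s, a} P[s, a, s'] * mu(s, a) for all s',
# plus the normalisation sum_{s, a} mu(s, a) = 1.
A_eq = np.zeros((n_s + 1, n_s * n_a))
for sp in range(n_s):
    for s in range(n_s):
        for a in range(n_a):
            A_eq[sp, s * n_a + a] -= P[s, a, sp]
    A_eq[sp, sp * n_a:(sp + 1) * n_a] += 1.0
A_eq[n_s, :] = 1.0
b_eq = np.zeros(n_s + 1)
b_eq[n_s] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
mu = res.x.reshape(n_s, n_a)
print("optimal average cost:", round(res.fun, 4))
print("stationary state-action distribution:\n", mu.round(4))
```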

Relevance: 10.00%

Abstract:

This paper presents an essay aimed at prompting broad discussion crucial in keeping the interaction design discourse fresh, critical, and in motion. We trace the changing role of people who have advanced from consumers to producers, from stationary office workers to mobile urban nomads, from passive members of the plebs to active instigators of change. Yet, interaction designers often still refer to them only as ‘users.’ We follow some of the historic developments from the information superhighway to the smart city in order to provide the backdrop in front of which we critically analyse three core areas. First, the issue of echo chambers and filter bubbles in social media results in a political polarisation that jeopardises the formation of a functioning public sphere. Second, pretty lights and colourful façades in media architecture are increasingly making way for situated installations and interventions fostering community engagement. And third, civic activism is often reduced to forms of slacktivism. We synthesise our discussion to propose ‘citizen-ability’ as an alternative goal for interaction designers to aspire to in order to create new polities and civics for a better quality of life.

Relevance: 10.00%

Abstract:

Phosphorus has a number of indispensable biochemical roles, but its slow natural deposition, the low solubility of phosphates and their rapid transformation to insoluble forms commonly make the element the growth-limiting nutrient, particularly in aquatic ecosystems. Famously, phosphorus that reaches water bodies is commonly the main cause of eutrophication. This undesirable process can severely affect many aquatic biota around the world. Various management practices have been proposed, but long-term monitoring of phosphorus levels is necessary to ensure that eutrophication does not occur. Passive sampling techniques, which have been developed over recent decades, offer several advantages over conventional sampling methods, including simpler sampling devices, more cost-effective sampling campaigns, and the provision of flow-proportional loads as well as representative time-averaged concentrations of phosphorus in the environment. Although some types of passive samplers are commercially available, their use is still scarcely reported in the literature. In Japan, passive sampling techniques have seen limited application for monitoring phosphorus, even in agricultural environments. This paper aims to introduce these relatively new P-sampling techniques and their potential for use in environmental monitoring studies.

Relevance: 10.00%

Abstract:

We consider a single server queue with the interarrival times and the service times forming a regenerative sequence. This traffic class includes the standard models: iid, periodic, Markov modulated (e.g., the BMAP model of Lucantoni [18]) and their superpositions. This class also includes recently proposed traffic models for high-speed networks that exhibit long-range dependence. Under minimal conditions we obtain the rates of convergence to stationary distributions, finiteness of stationary moments, various functional limit theorems and the continuity of stationary distributions and moments. We use the continuity results to obtain approximations for the stationary distributions and moments of an MMPP/GI/1 queue where the modulating chain has a countable state space. We extend all our results to feedforward networks where the external arrivals to each queue can be regenerative. Finally, we show that the output process of a leaky bucket is regenerative if the input process is, and hence our results extend to a queue with arrivals controlled by a leaky bucket.
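
As a purely illustrative aside, the sketch below simulates Lindley's recursion for a single-server queue to approximate its stationary waiting-time distribution; the exponential (M/M/1-style) inputs stand in for the regenerative arrival and service sequences treated in the paper.

```python
import numpy as np

# Sketch: Lindley's recursion W_{n+1} = max(W_n + S_n - A_{n+1}, 0) for a
# single-server queue, used to estimate the stationary mean waiting time by
# long-run simulation and compared against the M/M/1 formula.

rng = np.random.default_rng(0)
n = 200_000
arrival_rate, service_rate = 0.8, 1.0          # utilisation rho = 0.8
A = rng.exponential(1.0 / arrival_rate, n)     # interarrival times
S = rng.exponential(1.0 / service_rate, n)     # service times

W = np.zeros(n)
for i in range(n - 1):
    W[i + 1] = max(W[i] + S[i] - A[i + 1], 0.0)

burn_in = n // 10                              # discard the transient phase
mm1_wait = arrival_rate / (service_rate * (service_rate - arrival_rate))
print(f"empirical mean wait:   {W[burn_in:].mean():.3f}")
print(f"M/M/1 theoretical wait: {mm1_wait:.3f}")
```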

Relevance: 10.00%

Abstract:

The ergodic or long-run average cost control problem for a partially observed finite-state Markov chain is studied via the associated fully observed separated control problem for the nonlinear filter. Dynamic programming equations for the latter are derived, leading to existence and characterization of optimal stationary policies.

Relevance: 10.00%

Abstract:

Time-frequency analysis of various simulated and experimental signals due to elastic wave scattering from damage is performed using the wavelet transform (WT) and the Hilbert-Huang transform (HHT), and their performances are compared in the context of quantifying the damage. A spectral finite element method is employed for the numerical simulation of wave scattering. An analytical study is carried out to examine the effects of higher-order damage parameters on the wave reflected from a damage site. Based on this study, error bounds are computed for the signals in both the spectral and the time-frequency domains. It is shown how such an error bound can provide an estimate of the error in modelling wave propagation in a structure with damage. Measures of damage based on the WT and HHT are derived to quantify the damage information hidden in the signal. The aim of this study is to obtain detailed insights into the problems of (1) identifying localised damage, (2) characterising the dispersion of multi-frequency non-stationary signals after they interact with various types of damage, and (3) quantifying the damage. Sensitivity analysis of the scattered-wave signal based on its time-frequency representation helps to correlate the variation of damage index measures with damage parameters such as damage size and material degradation factors.
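
As a hedged illustration of the wavelet-based damage measures discussed above (not the paper's actual formulation), the sketch below computes a continuous wavelet transform of a synthetic signal containing an incident tone burst and a weaker delayed reflection, and forms a crude wavelet-energy damage index from the reflected-wave window; the signal, wavelet choice and time windows are all assumptions.

```python
import numpy as np
import pywt

# Sketch: CWT scalogram of a synthetic non-stationary signal and a simple
# wavelet-energy damage index (reflected energy / incident energy).

fs = 1.0e6                                    # sampling rate [Hz]
t = np.arange(0.0, 2e-3, 1.0 / fs)            # 2 ms record

def tone_burst(t0, f0, amp):
    """Gaussian-windowed tone burst centred at t0 with carrier frequency f0."""
    return amp * np.exp(-((t - t0) / 5e-5) ** 2) * np.sin(2 * np.pi * f0 * (t - t0))

signal = tone_burst(0.3e-3, 100e3, 1.0) + tone_burst(1.2e-3, 100e3, 0.2)

scales = np.arange(4, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
scalogram = np.abs(coeffs) ** 2               # wavelet energy map (scale x time)

# Damage index: wavelet energy in the reflected-wave window normalised by the
# energy in the incident-wave window.
incident = (t > 0.1e-3) & (t < 0.6e-3)
reflected = (t > 1.0e-3) & (t < 1.5e-3)
damage_index = scalogram[:, reflected].sum() / scalogram[:, incident].sum()
print(f"wavelet-energy damage index: {damage_index:.3f}")
```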