225 results for Rate constant


Relevance:

20.00%

Publisher:

Abstract:

We alternately measured on-road and in-vehicle ultrafine (<100 nm) particle (UFP) concentrations for 5 passenger vehicles spanning an age range of 18 years. A range of cabin ventilation settings was assessed during 301 trips through a 4 km road tunnel in Sydney, Australia. Outdoor airflow (ventilation) rates under these settings were quantified on open roads using tracer gas techniques. Significant variability in tunnel-trip-average median in-cabin/on-road (I/O) UFP ratios was observed (0.08 to ∼1.0). Based on data spanning all test automobiles and ventilation settings, a positive linear relationship was found between outdoor airflow rate and I/O ratio, with the former accounting for a substantial proportion of variation in the latter (R² = 0.81). UFP concentrations recorded in-cabin during tunnel travel were significantly higher than those reported by comparable studies performed on open roadways. A simple mathematical model predicted tunnel-trip-average in-cabin UFP concentrations with good accuracy. Our data indicate that under certain conditions, in-cabin UFP exposures incurred during tunnel travel may contribute significantly to daily exposure. The UFP exposure of automobile occupants appears strongly related to their choice of ventilation setting and vehicle.
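
The positive linear relationship reported between outdoor airflow rate and I/O ratio can be reproduced with an ordinary least-squares fit. The sketch below is only illustrative: the array names and units are assumptions, and it is not the authors' predictive model, just the kind of regression the abstract describes.

```python
import numpy as np

def fit_io_vs_airflow(airflow_m3h, io_ratio):
    """Least-squares fit of in-cabin/on-road (I/O) UFP ratio against
    outdoor airflow rate, returning slope, intercept and R^2.

    Hypothetical helper; array names and units are assumptions, not
    taken from the study itself.
    """
    airflow = np.asarray(airflow_m3h, dtype=float)
    ratio = np.asarray(io_ratio, dtype=float)

    # First-order polynomial fit: io_ratio ~ slope * airflow + intercept
    slope, intercept = np.polyfit(airflow, ratio, 1)

    # Coefficient of determination for the linear model
    predicted = slope * airflow + intercept
    ss_res = np.sum((ratio - predicted) ** 2)
    ss_tot = np.sum((ratio - ratio.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    return slope, intercept, r_squared
```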

Relevance:

20.00%

Publisher:

Abstract:

Little is known about the psychological underpinnings of young people’s mobile phone behaviour. In the present research, 292 young Australians, aged 16–24 years, completed an online survey assessing the effects of self-identity, in-group norm, the need to belong, and self-esteem on their frequency of mobile phone use and mobile phone involvement, conceptualised as people’s degree of cognitive and behavioural association with their mobile phone. Structural equation modelling revealed that age (younger) and self-identity significantly predicted the frequency of mobile phone use. In contrast, age (younger), gender (female), self-identity and in-group norm predicted young people’s mobile phone involvement. Neither self-esteem nor the need to belong significantly predicted mobile phone behaviour. The present study contributes to our understanding of this phenomenon and provides an indication of the characteristics of young people who may become highly involved with their mobile phone.

Relevance:

20.00%

Publisher:

Abstract:

Quality and bitrate modeling is essential to effectively adapt the bitrate and quality of videos delivered to multi-platform devices over resource-constrained heterogeneous networks. The recent model proposed by Wang et al. estimates the bitrate and quality of videos in terms of the frame rate and quantization parameter. However, to build an effective video adaptation framework, it is crucial to incorporate the spatial resolution into the analytical model for bitrate and perceptual quality adaptation. Hence, this paper proposes an analytical model to estimate the bitrate of videos in terms of quantization parameter, frame rate, and spatial resolution. The model fits the measured data accurately, as is evident from the high Pearson correlation. The proposed model is based on the observation that the relative reduction in bitrate due to decreasing spatial resolution is independent of the quantization parameter and frame rate. This model can be used in a rate-constrained bit-stream adaptation scheme that selects the scalability parameters to optimize the perceptual quality for a given bandwidth constraint.
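
The observation that the relative bitrate reduction from lowering spatial resolution is independent of quantization parameter and frame rate suggests a separable model of the form R(q, t, s) = R_max · f(q, t) · g(s). The sketch below uses illustrative power-law factors; the exponents a, b, c and the exact functional forms are assumptions, since the abstract does not state them.

```python
def estimate_bitrate(q, t, s, q_min, t_max, s_max, r_max, a=1.0, b=1.0, c=1.0):
    """Separable bitrate estimate R(q, t, s) = R_max * f(q, t) * g(s).

    q, t, s : quantization parameter, frame rate, spatial resolution (pixels)
    q_min   : smallest QP used when measuring r_max
    t_max   : highest frame rate
    s_max   : full spatial resolution (pixels)
    r_max   : bitrate measured at (q_min, t_max, s_max)
    a, b, c : illustrative model exponents (assumed, not from the paper)

    The spatial factor g(s) simply multiplies the (q, t) factor, reflecting
    the observation that the relative bitrate reduction from downsampling is
    independent of QP and frame rate.
    """
    f_qt = (q_min / q) ** a * (t / t_max) ** b   # QP and frame-rate factor
    g_s = (s / s_max) ** c                       # spatial-resolution factor
    return r_max * f_qt * g_s
```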

Relevance:

20.00%

Publisher:

Abstract:

Free surface flows of a rotational fluid past a two-dimensional semi-infinite body are considered. The fluid is assumed to be inviscid, incompressible, and of finite depth. A boundary integral method is used to solve the problem for the case where the free surface meets the body at a stagnation point. Supercritical solutions which satisfy the radiation condition are found for various values of the Froude number and the dimensionless vorticity. Subcritical solutions are also found; however, these solutions violate the radiation condition and are characterized by a train of waves upstream. It is shown numerically that the amplitude of these waves increases as the Froude number, the vorticity, or the height of the body above the bottom increases.
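
For a finite-depth stream, the criticality referred to here is measured by the depth-based Froude number F = U/√(gH), with F > 1 supercritical and F < 1 subcritical. The helper below computes F and one common dimensionless vorticity, ωH/U; that particular nondimensionalisation is an assumption on my part, not necessarily the one used in the paper.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude_number(speed, depth):
    """Depth-based Froude number F = U / sqrt(g * H)."""
    return speed / math.sqrt(G * depth)

def dimensionless_vorticity(vorticity, depth, speed):
    """One common nondimensionalisation, omega * H / U (an assumption here)."""
    return vorticity * depth / speed

def flow_regime(speed, depth):
    """Classify the upstream flow as supercritical or subcritical."""
    return "supercritical" if froude_number(speed, depth) > 1.0 else "subcritical"
```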

Relevance:

20.00%

Publisher:

Abstract:

The free surface flow of a finite depth fluid past a semi-infinite body is considered. The fluid is assumed to have constant vorticity throughout and the free surface is assumed to attach smoothly to the front face of the body. Numerical solutions are found using a boundary integral method in the physical plane and it is shown that solutions exist for all supercritical Froude numbers. The related problem of the cusp-like flow due to a submerged sink in a corner is also considered. Vorticity is included in the flow and it is shown that the behaviour of the solutions is qualitatively the same as that found in the problem described above.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes the use of the Bayes Factor as a distance metric for speaker segmentation within a speaker diarization system. The proposed approach uses a pair of constant-sized, sliding windows to compute the value of the Bayes Factor between the adjacent windows over the entire audio. Results obtained on the 2002 Rich Transcription Evaluation dataset show improved segmentation performance compared to previous approaches reported in the literature using the Generalized Likelihood Ratio. When applied in a speaker diarization system, this approach results in a 5.1% relative improvement in the overall Diarization Error Rate compared to the baseline.
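
For reference, the baseline distance this work is compared against, the Generalized Likelihood Ratio between two adjacent fixed-length windows each modelled by a single full-covariance Gaussian, can be sketched as follows. This is not the paper's Bayes Factor computation, and the window and hop sizes are placeholders.

```python
import numpy as np

def glr_distance(x, y):
    """Generalized Likelihood Ratio distance between two feature windows.

    x, y : (n_frames, n_dims) arrays of acoustic features (e.g. MFCCs).
    Models each window, and their union, with a single full-covariance
    Gaussian and compares the "same speaker" and "speaker change" hypotheses.
    """
    z = np.vstack([x, y])
    nx, ny, nz = len(x), len(y), len(z)

    def logdet_cov(data):
        # Log-determinant of the ML covariance estimate of the window
        _, logdet = np.linalg.slogdet(np.cov(data, rowvar=False))
        return logdet

    return 0.5 * (nz * logdet_cov(z) - nx * logdet_cov(x) - ny * logdet_cov(y))

def change_scores(features, win=300, hop=30):
    """Slide a pair of adjacent fixed-length windows over the feature matrix
    and score each candidate boundary; window/hop sizes are illustrative."""
    scores = []
    for start in range(0, len(features) - 2 * win, hop):
        left = features[start:start + win]
        right = features[start + win:start + 2 * win]
        scores.append((start + win, glr_distance(left, right)))
    return scores
```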

Relevance:

20.00%

Publisher:

Abstract:

This paper analyzes the effects of different practice task constraints on heart rate (HR) variability during 4v4 small-sided football games. Participants were sixteen football players divided into two age groups (U13, mean age: 12.4 ± 0.5 yrs; U15: 14.6 ± 0.5 yrs). The task consisted of a 4v4 sub-phase without goalkeepers, on a 25 x 15 m field, of 15 minutes duration with an active recovery period of 6 minutes between conditions. We recorded players’ heart rates using heart rate monitors (Polar Team System, Polar Electro, Kempele, Finland) as the scoring mode was manipulated (line goal: scoring by dribbling past an extended line; double goal: scoring in either of two lateral goals; and central goal: scoring only in one goal). Subsequently, %HR reserve was calculated with the Karvonen formula. We performed a time-series analysis of HR for each individual in each condition. Mean data for intra-participant variability showed that the autocorrelation function was associated with more short-range dependence processes in the “line goal” condition compared to the other conditions, demonstrating that the “line goal” constraint induced more randomness in the HR response. With respect to inter-individual variability, line goal constraints showed lower %CV and %RMSD (U13: 9% and 19%; U15: 10% and 19%) compared with double goal (U13: 12% and 21%; U15: 12% and 21%) and central goal (U13: 14% and 24%; U15: 13% and 24%) task constraints, respectively. Results suggested that line goal constraints imposed more randomness on the cardiovascular stimulation of each individual and lower inter-individual variability than double goal and central goal constraints.
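
The Karvonen-based %HR reserve used here expresses each measured heart rate as a percentage of the gap between resting and maximal heart rate. A minimal sketch of that calculation, and of a coefficient-of-variation measure like the %CV reported, is below; the variable names are mine, not the study's.

```python
import numpy as np

def percent_hr_reserve(hr, hr_rest, hr_max):
    """Karvonen-style %HR reserve: where the measured HR sits between
    resting and maximal heart rate, expressed as a percentage."""
    return 100.0 * (hr - hr_rest) / (hr_max - hr_rest)

def karvonen_target_hr(intensity, hr_rest, hr_max):
    """Karvonen formula for a target heart rate at a fractional intensity
    (0-1): HR_target = HR_rest + intensity * (HR_max - HR_rest)."""
    return hr_rest + intensity * (hr_max - hr_rest)

def percent_cv(values):
    """Coefficient of variation (%CV): sample SD as a percentage of the mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()
```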

Relevance:

20.00%

Publisher:

Abstract:

A series of lithium niobate powders was synthesized by the combustion method at different heating rates. The effect of heating rate on the crystal composition of the lithium niobate powders was investigated by powder X-ray diffraction measurements. It was found that the lithium content in the as-synthesized lithium niobate powders increases with decreasing heating rate. On the basis of the established structure-property relationship of lithium niobate single crystals, it was concluded that high-quality lithium niobate powders can be effectively synthesized at a lower heating rate (in the range of 5-10 °C/min) by the combustion method.

Relevance:

20.00%

Publisher:

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance, and capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances.

The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. They needed to be calibrated using data acquired at these locations, and their output validated with data acquired at the same sites, so that the outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. Finally, the models needed to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities.

It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for their calibration and validation over a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled, and some unusual manoeuvres were considered unwarranted to model. Nevertheless, the models developed contain the principal processes of freeway operations: merging and lane changing.

Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this behaviour. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams, requiring on-ramp and total upstream flow as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.

Constant departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections are located upstream of on-ramps than when signalised intersections are, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance.

A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure, and this model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and to provide further insight into the nature of operations.
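
Cowan's M3 headway model referred to above treats a proportion α of vehicles as free, with shifted-exponential headways beyond a minimum headway Δ, and the remainder as bunched at exactly Δ. A minimal sketch using the usual parameterisation λ = αq/(1 − qΔ) follows; the site-specific calibration of α against flow described in the abstract is not reproduced here.

```python
import math

def m3_decay_rate(flow, alpha, delta):
    """Decay parameter lambda of Cowan's M3 model,
    lambda = alpha * q / (1 - q * delta), with q in veh/s and delta in s."""
    return alpha * flow / (1.0 - flow * delta)

def m3_prob_headway_exceeds(t, flow, alpha, delta):
    """P(headway > t) under Cowan's M3 model:
    1 for t < delta, alpha * exp(-lambda * (t - delta)) otherwise."""
    if t < delta:
        return 1.0
    lam = m3_decay_rate(flow, alpha, delta)
    return alpha * math.exp(-lam * (t - delta))
```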

Relevance:

20.00%

Publisher:

Abstract:

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting, and Replenishment, owes much to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).

Relevance:

20.00%

Publisher:

Abstract:

The mechanism for the decomposition of hydrotalcite remains unresolved. Controlled rate thermal analysis (CRTA) enables this decomposition pathway to be explored. The thermal decomposition of hydrotalcites with hexacyanoferrate(II) and hexacyanoferrate(III) in the interlayer has been studied using CRTA. X-ray diffraction shows the hydrotalcites studied have d(003) spacings of 11.1 and 10.9 Å, which compare with d-spacings of 7.9 and 7.98 Å for hydrotalcites with carbonate or sulphate in the interlayer. Calculations based upon the CRTA measurements show that 7 moles of water are lost, establishing the formula of the hexacyanoferrate(II) intercalated hydrotalcite as Mg6Al2(OH)16[Fe(CN)6]0.5·7H2O and that of the hexacyanoferrate(III) intercalated hydrotalcite as Mg6Al2(OH)16[Fe(CN)6]0.66·9H2O. Dehydroxylation combined with loss of CN units occurs in three steps, between (a) 310 and 367 °C, (b) 367 and 390 °C, and (c) 390 and 428 °C, for both the hexacyanoferrate(II) and hexacyanoferrate(III) intercalated hydrotalcites.

Relevance:

20.00%

Publisher:

Abstract:

This study investigated the grain size dependence of the mechanical properties and deformation mechanisms of microcrystalline (mc) and nanocrystalline (nc: grain size below 100 nm) Mg-5 wt% Al alloys. The Hall-Petch relationship was investigated by both instrumented indentation tests and compression tests, and the results from the two test types match well with each other. The breakdown of the Hall-Petch relationship and the elevated strain rate sensitivity (SRS) of the present Mg-5 wt% Al alloys when the grain size was reduced below 58 nm indicate a more significant role of grain boundary (GB) mediated mechanisms in the plastic deformation process. However, the relatively small SRS values compared to those expected for GB sliding and Coble creep suggest that plastic deformation in the current study is still dominated by dislocation-mediated mechanisms.
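
The Hall-Petch relationship referred to here is σ_y = σ_0 + k·d^(-1/2), a linear dependence of strength on the inverse square root of grain size, and the strain rate sensitivity is m = ∂ln σ / ∂ln ε̇. The sketch below fits both from measured data; it is a generic illustration, not the study's own analysis.

```python
import numpy as np

def fit_hall_petch(grain_size_nm, yield_strength_mpa):
    """Fit sigma_y = sigma_0 + k * d**(-1/2) to measured data.

    grain_size_nm      : grain sizes d (nm)
    yield_strength_mpa : corresponding yield strengths (MPa)
    Returns (sigma_0, k), with k in MPa * nm**0.5.
    """
    d = np.asarray(grain_size_nm, dtype=float)
    sigma = np.asarray(yield_strength_mpa, dtype=float)

    inv_sqrt_d = d ** -0.5
    k, sigma_0 = np.polyfit(inv_sqrt_d, sigma, 1)  # slope = k, intercept = sigma_0
    return sigma_0, k

def strain_rate_sensitivity(stress, strain_rate):
    """Strain rate sensitivity m = d(ln sigma) / d(ln strain rate),
    estimated as the slope of a log-log fit."""
    m, _ = np.polyfit(np.log(strain_rate), np.log(stress), 1)
    return m
```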

Relevance:

20.00%

Publisher:

Abstract:

An existing model for solvent penetration and drug release from a spherically-shaped polymeric drug delivery device is revisited. The model has two moving boundaries, one that describes the interface between the glassy and rubbery states of polymer, and another that defines the interface between the polymer ball and the pool of solvent. The model is extended so that the nonlinear diffusion coefficient of drug explicitly depends on the concentration of solvent, and the resulting equations are solved numerically using a front-fixing transformation together with a finite difference spatial discretisation and the method of lines. We present evidence that our scheme is much more accurate than a previous scheme. Asymptotic results in the small-time limit are presented, which show how the use of a kinetic law as a boundary condition on the innermost moving boundary dictates qualitative behaviour, the scalings being very different from those of the similar moving boundary problem that arises from modelling the melting of an ice ball. The implication is that the model considered here exhibits what is referred to as "non-Fickian" or Case II diffusion which, together with the initially constant rate of drug release, has certain appeal from a pharmaceutical perspective.
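
The numerical approach described, a front-fixing transformation followed by a method-of-lines discretisation, can be illustrated on a much simpler one-phase planar Stefan-type problem with a single moving boundary. The sketch below is therefore only an analogue of the solvent-penetration scheme, not the paper's two-moving-boundary spherical model; scipy's solve_ivp performs the time integration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def solve_stefan_front_fixing(n=50, s0=0.1, t_end=1.0):
    """Method-of-lines solution of a one-phase Stefan-type problem
    (u_t = u_xx on 0 < x < s(t), u(0)=1, u(s)=0, ds/dt = -u_x(s))
    after the front-fixing map xi = x / s(t).

    Illustrative analogue only, not the spherical two-boundary
    drug-release model of the paper.
    """
    h = 1.0 / n
    xi = np.linspace(0.0, 1.0, n + 1)        # fixed computational grid

    def rhs(t, y):
        s = y[-1]                            # current front position
        v = np.empty(n + 1)
        v[0], v[-1] = 1.0, 0.0               # fixed-value boundary conditions
        v[1:-1] = y[:-1]                     # interior unknowns

        # Stefan condition via a one-sided difference at xi = 1
        s_dot = -(v[-1] - v[-2]) / h / s

        # Transformed PDE: v_t = v_xixi / s^2 + xi * (s_dot / s) * v_xi
        dv = np.empty(n - 1)
        for i in range(1, n):
            diff = (v[i + 1] - 2.0 * v[i] + v[i - 1]) / (s * h) ** 2
            adv = xi[i] * (s_dot / s) * (v[i + 1] - v[i - 1]) / (2.0 * h)
            dv[i - 1] = diff + adv
        return np.append(dv, s_dot)

    y0 = np.append(1.0 - xi[1:-1], s0)       # linear initial profile, small seed front
    sol = solve_ivp(rhs, (0.0, t_end), y0, method="BDF", rtol=1e-6, atol=1e-8)
    return sol.t, sol.y[-1]                  # times and front position s(t)
```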

Relevance:

20.00%

Publisher:

Abstract:

The proportion of functional sequence in the human genome is currently a subject of debate. The most widely accepted figure is that approximately 5% is under purifying selection. In Drosophila, estimates are an order of magnitude higher, though this corresponds to a similar quantity of sequence. These estimates depend on the difference between the distribution of genomewide evolutionary rates and that observed in a subset of sequences presumed to be neutrally evolving. Motivated by the widening gap between these estimates and experimental evidence of genome function, especially in mammals, we developed a sensitive technique for evaluating such distributions and found that they are much more complex than previously apparent. We found strong evidence for at least nine well-resolved evolutionary rate classes in an alignment of four Drosophila species and at least seven classes in an alignment of four mammals, including human. We also identified at least three rate classes in human ancestral repeats. By positing that the largest of these ancestral repeat classes is neutrally evolving, we estimate that the proportion of nonneutrally evolving sequence is 30% of human ancestral repeats and 45% of the aligned portion of the genome. However, we also question whether any of the classes represent neutrally evolving sequences and argue that a plausible alternative is that they reflect variable structure-function constraints operating throughout the genomes of complex organisms.
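
The idea of resolving a genome-wide rate distribution into discrete classes can be illustrated with a finite mixture fit whose number of components is chosen by BIC. This is only a generic stand-in for the authors' (more sensitive) technique, applied to a hypothetical vector of per-window substitution-rate estimates.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def resolve_rate_classes(rates, max_classes=10, seed=0):
    """Fit 1..max_classes Gaussian mixtures to a vector of evolutionary-rate
    estimates and keep the model with the lowest BIC.

    Generic illustration only; the study's own method for resolving rate
    classes is more sensitive than a plain Gaussian mixture.
    """
    rates = np.asarray(rates, dtype=float).reshape(-1, 1)

    best_model, best_bic = None, np.inf
    for k in range(1, max_classes + 1):
        gmm = GaussianMixture(n_components=k, random_state=seed).fit(rates)
        bic = gmm.bic(rates)
        if bic < best_bic:
            best_model, best_bic = gmm, bic

    order = np.argsort(best_model.means_.ravel())
    return {
        "n_classes": best_model.n_components,
        "class_means": best_model.means_.ravel()[order],
        "class_weights": best_model.weights_[order],
    }
```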