46 results for Molar - Average distances
Abstract:
Bounded parameter Markov Decision Processes (BMDPs) address the issue of dealing with uncertainty in the parameters of a Markov Decision Process (MDP). Unlike the case of an MDP, the notion of an optimal policy for a BMDP is not entirely straightforward. We consider two notions of optimality based on optimistic and pessimistic criteria. These have been analyzed for discounted BMDPs. Here we provide results for average reward BMDPs. We establish a fundamental relationship between the discounted and the average reward problems, prove the existence of Blackwell optimal policies and, for both notions of optimality, derive algorithms that converge to the optimal value function.
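The two optimality notions can be made concrete with a small sketch of interval value iteration for the discounted case, which the abstract relates to the average reward problem. This is a hypothetical toy implementation of an optimistic/pessimistic Bellman backup over interval transition models, not the authors' algorithm:

```python
import numpy as np

def extreme_dist(lo, hi, values, optimistic=True):
    """Pick the transition distribution inside the interval model [lo, hi]
    that maximises (optimistic) or minimises (pessimistic) expected value."""
    order = np.argsort(values)[::-1] if optimistic else np.argsort(values)
    p = lo.copy()
    slack = 1.0 - p.sum()
    for s in order:                      # pour remaining mass onto best states first
        add = min(hi[s] - lo[s], slack)
        p[s] += add
        slack -= add
        if slack <= 1e-12:
            break
    return p

def interval_value_iteration(R, P_lo, P_hi, gamma=0.9, optimistic=True, tol=1e-8):
    """Value iteration for a discounted BMDP; R[s,a], P_lo/P_hi[s,a,s']."""
    n, m = R.shape
    V = np.zeros(n)
    while True:
        Q = np.empty((n, m))
        for s in range(n):
            for a in range(m):
                p = extreme_dist(P_lo[s, a], P_hi[s, a], V, optimistic)
                Q[s, a] = R[s, a] + gamma * p @ V
        V_new = Q.max(axis=1)
        if np.abs(V_new - V).max() < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new
```

When the intervals collapse to a point (`P_lo == P_hi`), both criteria reduce to ordinary value iteration on the underlying MDP.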
Abstract:
The impact of climate change on the health of vulnerable groups such as the elderly has been of increasing concern. However, to date there has been no meta-analysis of the current literature relating to the effects of temperature fluctuations upon mortality amongst the elderly. We synthesised risk estimates of the overall impact of daily mean temperature on elderly mortality across different continents. A comprehensive literature search was conducted using MEDLINE and PubMed to identify papers published up to December 2010. Selection criteria including suitable temperature indicators, endpoints, study designs and identification of thresholds were used. A two-stage Bayesian hierarchical model was used to summarise the percent increase in mortality with a 1°C temperature increase (or decrease) with 95% confidence intervals in hot (or cold) days, with lagged effects also measured. Fifteen studies met the eligibility criteria and almost 13 million elderly deaths were included in this meta-analysis. Overall, all-cause mortality increased by 2-5% for a 1°C increment during hot temperature intervals, and by 1-2% for a 1°C decrease during cold temperature intervals. Lags of up to 9 days in exposure to cold temperature intervals were substantially associated with all-cause mortality, but no substantial lagged effects were observed for hot intervals. Thus, both hot and cold temperatures substantially increased mortality among the elderly, but the magnitude of heat-related effects seemed to be larger than that of cold effects within a global context.
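The pooling step can be illustrated with a simpler frequentist analogue of the two-stage Bayesian hierarchical model: DerSimonian-Laird random-effects pooling. The per-study estimates and standard errors below are invented for illustration, not the fifteen included studies:

```python
import numpy as np

def pool_random_effects(est, se):
    """DerSimonian-Laird random-effects pooling of per-study estimates
    (e.g. percent mortality increase per 1 degree C) with standard errors."""
    est, se = np.asarray(est, float), np.asarray(se, float)
    w = 1.0 / se**2                                    # fixed-effect weights
    mu_fe = np.sum(w * est) / np.sum(w)
    Q = np.sum(w * (est - mu_fe)**2)                   # heterogeneity statistic
    tau2 = max(0.0, (Q - (len(est) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))  # between-study variance
    w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
    mu = np.sum(w_re * est) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return mu, (mu - 1.96 * se_mu, mu + 1.96 * se_mu)  # estimate, 95% CI

# Hypothetical heat-effect estimates from five studies
mu, ci = pool_random_effects([3.1, 2.4, 4.0, 1.8, 3.5], [0.6, 0.5, 0.9, 0.4, 0.7])
```

The random-effects weights down-weight precise but heterogeneous studies relative to a fixed-effect analysis, which is why the two-stage model suits multi-continent data.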
Abstract:
This presentation explores molarization and overcoding of social machines and relationality within an assemblage consisting of empirical data of immigrant families in Australia. Immigration is key to the sustainable development of Western societies like Australia and Canada. Newly arrived immigrants enter a country and are literally taken over by the Ministry of Immigration regarding housing, health, education and access to job possibilities. If the immigrants do not know the official language(s) of the country, they enroll in language classes for new immigrants. Language classes do more than simply teach language. Language is presented in local contexts (celebrating the national day, what to do to get a job) and, in control societies, language classes foreground the values of a nation state in order for immigrants to integrate. In the current project, policy documents from Australia reveal that while immigration is the domain of government, the subject/immigrant is nevertheless at the core of policy. While support is provided, it is the transcendent view of the subject/immigrant that prevails. The onus remains on the immigrant to “succeed”. My perspective lies within transcendental empiricism and deploys Deleuzian ontology (how one might live) to examine how segmentary lines of power (pouvoir), reflected in policy documents and operationalized in language classes, rupture into lines of flight of nomad immigrants. The theoretical framework is Multiple Literacies Theory (MLT); reading is intensive and immanent. The participants are one Korean and one Sudanese family and their children, who have recently immigrated to Australia. Classroom observations were conducted and followed by interviews based on the observations. Families also borrowed small video cameras and filmed places, people and things relevant to them in terms of becoming citizen and immigrating to and living in a different country. Interviews followed. Rhizoanalysis informs the process of reading the data.
Rhizoanalysis is a research event performed with an assemblage (MLT, data/vignettes, researcher, etc.). It is a way to work with transgressive data. Based on the concept of the rhizome, a bloc of data has no beginning and no ending. A researcher enters in the middle and exits somewhere in the middle, an intermezzo suggesting that the challenges to molar immigration lie in experimenting and creating molecular processes of becoming citizen.
Abstract:
Purpose: To determine the effect of moderate levels of refractive blur and simulated cataracts on nighttime pedestrian conspicuity in the presence and absence of headlamp glare. Methods: The ability to recognize pedestrians at night was measured in 28 young adults (M=27.6 years) under three visual conditions: normal vision, refractive blur and simulated cataracts; mean acuity was 20/40 or better in all conditions. Pedestrian recognition distances were recorded while participants drove an instrumented vehicle along a closed road course at night. Pedestrians wore one of three clothing conditions and oncoming headlamps were present for 16 participants and absent for 12 participants. Results: Simulated visual impairment and glare significantly reduced the frequency with which drivers recognized pedestrians and the distance at which the drivers first recognized them. Simulated cataracts were significantly more disruptive than blur even though photopic visual acuity levels were matched. With normal vision, drivers responded to pedestrians at 3.6x and 5.5x longer distances on average than for the blur or cataract conditions, respectively. Even in the presence of visual impairment and glare, pedestrians were recognized more often and at longer distances when they wore a “biological motion” reflective clothing configuration than when they wore a reflective vest or black clothing. Conclusions: Drivers’ ability to recognize pedestrians at night is degraded by common visual impairments even when the drivers’ mean visual acuity meets licensing requirements. To maximize drivers’ ability to see pedestrians, drivers should wear their optimum optical correction, and cataract surgery should be performed early enough to avoid potentially dangerous reductions in visual performance.
Abstract:
Rapid urbanisation and the resulting continuous increase in traffic have been recognised as key factors contributing increased pollutant loads to urban stormwater and, in turn, to receiving waters. Urbanisation primarily increases anthropogenic activities and the percentage of impervious surfaces in urban areas. These processes are collectively responsible for urban stormwater pollution. In this regard, urban traffic and land use related activities have been recognised as the primary pollutant sources. This is primarily due to the generation of a range of key pollutants such as solids, heavy metals and PAHs. Appropriate treatment system design is the most viable approach to mitigating stormwater pollution. However, limited understanding of pollutant processes and transport pathways constrains effective treatment design. This highlights the necessity for a detailed understanding of traffic and other land use related pollutant processes and pathways in relation to urban stormwater pollution. This study has created new knowledge in relation to pollutant processes and transport pathways encompassing atmospheric pollutants, atmospheric deposition and build-up on ground surfaces of traffic generated key pollutants. The research study was primarily based on in-depth experimental investigations. This thesis describes the extensive knowledge created relating to the processes of atmospheric pollutant build-up, atmospheric deposition and road surface build-up, and establishes their relationships as a chain of processes. The analysis of atmospheric deposition revealed that both traffic and land use related sources contribute total suspended particulate matter (TSP) to the atmosphere. Traffic sources become dominant during weekdays, whereas land use related sources become dominant during weekends due to the reduction in traffic sources.
The analysis further concluded that atmospheric TSP, polycyclic aromatic hydrocarbons (PAHs) and heavy metals (HMs) concentrations are highly influenced by total average daily heavy duty traffic, traffic congestion and the fraction of commercial and industrial land uses. A set of mathematical equations was developed to predict TSP, PAHs and HMs concentrations in the atmosphere based on the influential traffic and land use related parameters. Dry deposition samples were collected for different antecedent dry days and wet deposition samples were collected immediately after rainfall events. The dry deposition was found to increase with the antecedent dry days and consisted of relatively coarser particles (greater than 1.4 µm) when compared to wet deposition. The wet deposition showed a strong affinity to rainfall depth, but was not related to the antecedent dry period. It was also found that smaller size particles (less than 1.4 µm) travel much longer distances from the source and deposit mainly with the wet deposition. Pollutants in wet deposition are less sensitive to the source characteristics compared to dry deposition. Atmospheric deposition of HMs is not directly influenced by land use but rather by proximity to high emission sources such as highways. Therefore, it is important to consider atmospheric deposition as a key pollutant source to urban stormwater in the vicinity of these types of sources. Build-up was analysed for five different particle size fractions, namely, <1 µm, 1-75 µm, 75-150 µm, 150-300 µm and >300 µm, for solids, PAHs and HMs. The outcomes of the study indicated that PAHs and HMs in the <75 µm size fraction are generated mainly by traffic related activities, whereas the >150 µm size fraction is generated by both traffic and land use related sources. Atmospheric deposition is an important source for HMs build-up on roads, whereas the contribution of PAHs from atmospheric sources is limited.
A comprehensive approach was developed to predict traffic and other land use related pollutants in urban stormwater based on traffic and other land use characteristics. This approach primarily included the development of a set of mathematical equations to predict traffic generated pollutants by linking traffic and land use characteristics to stormwater quality through mathematical modelling. The outcomes of this research will contribute to the design of appropriate treatment systems to safeguard urban receiving water quality under future traffic growth scenarios. The “real world” application of the knowledge generated was demonstrated through mathematical modelling of solids in urban stormwater, accounting for the variability in traffic and land use characteristics.
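The kind of predictive relationship described, linking traffic and land-use parameters to pollutant concentrations, can be sketched as a multiple linear regression fitted by least squares. The predictors, site data and coefficients below are hypothetical stand-ins, not the study's actual equations:

```python
import numpy as np

# Hypothetical site data: [heavy-duty traffic (veh/day), congestion index,
# commercial + industrial land-use fraction] and observed TSP concentration.
X = np.array([[1200, 0.30, 0.40],
              [800,  0.20, 0.10],
              [2500, 0.55, 0.70],
              [400,  0.10, 0.05],
              [1800, 0.45, 0.60],
              [950,  0.25, 0.20]], dtype=float)
tsp = np.array([57.0, 43.0, 93.5, 31.5, 75.5, 48.5])

# Fit TSP = b0 + b1*traffic + b2*congestion + b3*land_fraction
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, tsp, rcond=None)

def predict_tsp(traffic, congestion, land_fraction):
    """Predicted TSP concentration for given traffic/land-use parameters."""
    return float(coef @ np.array([1.0, traffic, congestion, land_fraction]))
```

A fitted equation of this form is what lets future traffic growth scenarios be translated into stormwater quality estimates for treatment design.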
Expression and distribution of cell-surface proteoglycans in the normal Lewis rat molar periodontium
Abstract:
Cell-surface proteoglycans participate in several biological functions such as cell-cell and cell-matrix interactions, cell adhesion, the binding of various growth factors as co-receptors, and repair. To better understand the expression and distribution of cell-surface proteoglycans in the periodontal tissues, an immunohistochemical evaluation of the normal Lewis rat molar periodontium was carried out using panels of antibodies against syndecan-1, -2 and -4, glypican and betaglycan. Our results demonstrated the expression and distribution of all of these proteoglycans in the suprabasal gingival epithelium and in soft and hard connective tissues. Both cellular and matrix localization was evident within the various periodontal compartments. The presence of these cell-surface proteoglycans indicates potential roles in tissue homeostasis, repair or regeneration in the periodontium, each of which requires further study.
Abstract:
The average structure (C1̄) of a volcanic plagioclase megacryst with composition An48, from the Hogarth Ranges, Australia, has been determined using three-dimensional, single-crystal neutron and X-ray diffraction data. Least-squares refinements, incorporating anisotropic thermal motion of all atoms and an extinction correction, resulted in weighted R factors (based on intensities) of 0.076 and 0.056, respectively, for the neutron and X-ray data. Very weak e reflections could be detected in long-exposure X-ray and electron diffraction photographs of this crystal, but the refined average structure is believed to be unaffected by the presence of such a weak superstructure. The ratio of the scattering power of Na to that of Ca is different for X-ray and neutron radiation, and this radiation-dependence of scattering power has been used to determine the distribution of Na and Ca over a split-atom M site (two sites designated M' and M") in this An48 plagioclase. Relative peak-height ratios M'/M", revealed in difference Fourier sections calculated from neutron and X-ray data, formed the basis for the cation-distribution analysis. As neutron and X-ray data sets were directly compared in this analysis, it was important that systematic bias between refined neutron and X-ray positional parameters could be demonstrated to be absent. In summary, with an M-site model constrained only by the electron-microprobe-determined bulk composition of the crystal, the following values were obtained for the M-site occupancies: Na(M') = 0.29(7), Na(M") = 0.23(7), Ca(M') = 0.15(4), and Ca(M") = 0.33(4). These results indicate that restrictive assumptions about M sites, on which previous plagioclase refinements have been based, are not applicable to this An48 plagioclase, and possibly not to the entire compositional range. T-site ordering determined by ⟨T-O⟩ bond-length variation (t₁o = 0.51(1), t₁m = t₂o = t₂m = 0.32(1)) is weak, as might be expected from the volcanic origin of this megacryst.
Abstract:
Average speed enforcement is a relatively new approach gaining popularity throughout Europe and Australia. This paper reviews the evidence regarding the impact of this approach on vehicle speeds, crash rates and a number of additional road safety and public health outcomes. The economic and practical viability of the approach as a road safety countermeasure is also explored. A literature review, with an international scope, of both published and grey literature was conducted. There is a growing body of evidence to suggest a number of road safety benefits associated with average speed enforcement, including high rates of compliance with speed limits, reductions in average and 85th percentile speeds, and reduced speed variability between vehicles. Moreover, the approach has been demonstrated to be particularly effective in reducing excessive speeding behaviour. Reductions in crash rates have also been reported in association with average speed enforcement, particularly in relation to fatal and serious injury crashes. In addition, the approach has been shown to improve traffic flow and reduce vehicle emissions, and has also been associated with high levels of public acceptance. Average speed enforcement offers a greater network-wide approach to managing speeds that reduces the impact of the time and distance halo effects associated with other automated speed enforcement approaches. Although comparatively expensive, it represents a highly reliable approach to speed enforcement that produces considerable returns on investment through reduced social and economic costs associated with crashes.
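The core computation behind point-to-point (average speed) enforcement is simple: a vehicle's average speed over the section is the section length divided by the time between camera detections. A minimal sketch follows; the tolerance threshold is a hypothetical parameter, since real schemes apply jurisdiction-specific tolerances:

```python
from datetime import datetime

def section_average_speed_kmh(t_entry, t_exit, section_km):
    """Average speed between the two camera sites, in km/h."""
    hours = (t_exit - t_entry).total_seconds() / 3600.0
    return section_km / hours

def is_offence(t_entry, t_exit, section_km, limit_kmh, tolerance_kmh=2.0):
    """Flag a vehicle whose section average exceeds limit + tolerance."""
    return section_average_speed_kmh(t_entry, t_exit, section_km) > limit_kmh + tolerance_kmh

# A vehicle covering a 10 km section in 5 minutes averages 120 km/h
t0 = datetime(2024, 5, 1, 8, 0, 0)
t1 = datetime(2024, 5, 1, 8, 5, 0)
```

Because only the section average matters, briefly slowing at a camera site does not avoid detection, which is the mechanism behind the reduced time and distance halo effects noted above.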
Abstract:
This paper presents a methodology for real-time estimation of exit movement-specific average travel time on urban routes by integrating real-time cumulative plots, probe vehicles, and historic cumulative plots. Two approaches, component based and extreme based, are discussed for route travel time estimation. The methodology is tested with simulation and is validated with real data from Lucerne, Switzerland, that demonstrate its potential for accurate estimation. Both approaches provide similar results. The component-based approach is more reliable, with a greater chance of obtaining a probe vehicle in each interval, although additional data from each component is required. The extreme-based approach is simple and requires only data from upstream and downstream of the route, but the chances of obtaining a probe that traverses the entire route might be low. The performance of the methodology is also compared with a probe-only method. The proposed methodology requires only a few probes for accurate estimation; the probe-only method requires significantly more probes.
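The extreme-based idea can be sketched as follows: under a FIFO assumption, the travel time of the n-th vehicle is the horizontal gap between the upstream and downstream cumulative count curves at rank n. This toy version assumes no detector counting drift, which is what the probe vehicles correct for in practice:

```python
import numpy as np

def avg_travel_time(t_up, cum_up, t_down, cum_down):
    """Mean horizontal gap between upstream and downstream cumulative
    vehicle-count curves (FIFO, no detector drift assumed)."""
    n_max = int(min(cum_up[-1], cum_down[-1]))
    ranks = np.arange(1, n_max + 1)
    pass_up = np.interp(ranks, cum_up, t_up)        # time n-th vehicle passes upstream
    pass_down = np.interp(ranks, cum_down, t_down)  # ...and downstream
    return float(np.mean(pass_down - pass_up))

# Synthetic example: one vehicle per second upstream; the same platoon
# reaches the downstream detector 120 s later
t_up = np.arange(0.0, 301.0)
cum_up = np.arange(0.0, 301.0)
t_down = t_up + 120.0
cum_down = cum_up.copy()
```

The component-based approach applies the same gap computation piecewise, per link and exit movement, which is why it needs data from each component but tolerates sparser probe coverage on any single one.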
Abstract:
Melt electrospinning in a direct writing mode is a recent additive manufacturing approach to fabricate porous scaffolds for tissue engineering applications. In this study, we describe porous and cell-invasive poly(ε-caprolactone) scaffolds fabricated by combining melt electrospinning and a programmable x–y stage. Fibers were 7.5 ± 1.6 µm in diameter and separated by interfiber distances ranging from 8 to 133 µm, with an average of 46 ± 22 µm. Micro-computed tomography revealed that the resulting scaffolds had a highly porous (87%), three-dimensional structure. Due to the high porosity and interconnectivity of the scaffolds, a top-seeding method was adequate to achieve fibroblast penetration, with cells present throughout and underneath the scaffold. This was confirmed histologically, whereby a 3D fibroblast-scaffold construct with full cellular penetration was produced after 14 days in vitro. Immunohistochemistry was used to confirm the presence and even distribution of the key dermal extracellular matrix proteins, collagen type I and fibronectin. These results show that melt electrospinning in a direct writing mode can produce cell-invasive scaffolds, using simple top-seeding approaches.
Abstract:
Nitrous oxide emissions from soil are known to be spatially and temporally volatile. Reliable estimation of emissions over a given time and space depends on measuring with sufficient intensity, but deciding on the number of measuring stations and the frequency of observation can be vexing. The question also arises of whether low frequency manual observations can provide results comparable to high frequency automated sampling. Data collected from a replicated field experiment were intensively studied with the intention of giving some statistically robust guidance on these issues. The experiment had nitrous oxide soil-to-air flux monitored within 10 m by 2.5 m plots by automated closed chambers at a 3 h average sampling interval and by manual static chambers at a three day average sampling interval over sixty days. Observed trends in flux over time from the static chambers were mostly within the auto chamber bounds of experimental error. Cumulative nitrous oxide emissions as measured by each system were also within error bounds. Under the temporal response pattern in this experiment, no significant loss of information was observed after culling the data to simulate results under various low frequency scenarios. Within the confines of this experiment, observations from the manual chambers were not spatially correlated above distances of 1 m. Statistical power was therefore found to improve with increased replicates per treatment or chambers per replicate. Careful after-action review of experimental data can deliver savings for future work.
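The culling exercise described can be sketched numerically: integrate a flux series sampled every 3 h with the trapezoidal rule, then again after thinning to a 3-day interval, and compare cumulative emissions. The flux series below is synthetic, not the experiment's data:

```python
import numpy as np

def cumulative_emission(times_h, flux):
    """Time-integrated flux over the campaign (trapezoidal rule)."""
    return np.trapz(flux, times_h)

rng = np.random.default_rng(1)
t = np.arange(0.0, 60 * 24 + 1, 3.0)              # 60 days at 3-hourly sampling
flux = 10 + 5 * np.sin(2 * np.pi * t / (60 * 24)) + rng.normal(0, 0.3, t.size)

dense = cumulative_emission(t, flux)              # "auto chamber" estimate
culled = cumulative_emission(t[::24], flux[::24]) # thinned to every 3 days
rel_err = abs(dense - culled) / dense
```

For a smoothly varying temporal pattern like this one, the thinned estimate stays within a few percent of the dense one, mirroring the experiment's finding; sharp emission pulses between manual visits would break that agreement.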
Abstract:
Lens average and equivalent refractive indices are required for purposes such as lens thickness estimation and optical modeling. We modeled the refractive index gradient as a power function of the normalized distance from lens center. Average index along the lens axis was estimated by integration. Equivalent index was estimated by raytracing through a model eye to establish ocular refraction, and then backward raytracing to determine the constant refractive index yielding the same refraction. Assuming center and edge indices remained constant with age, at 1.415 and 1.37 respectively, average axial refractive index increased (1.408 to 1.411) and equivalent index decreased (1.425 to 1.420) with age increase from 20 to 70 years. These values agree well with experimental estimates based on different techniques, although the latter show considerable scatter. The simple model of index gradient gives reasonable estimates of average and equivalent lens indices, although refinements in modeling and measurements are required.
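The axial averaging step admits a compact sketch. Assuming the gradient has the power-function form n(r) = n_center + (n_edge − n_center)·r^p, with r the normalized distance from the lens center, the average along the axis is the integral over r in [0, 1], with closed form n_center + (n_edge − n_center)/(p + 1). The exponent values below are illustrative, not the paper's fitted parameters:

```python
import numpy as np

def axial_average_index(n_center, n_edge, p, samples=100_001):
    """Average refractive index along the axis for the power-function
    gradient n(r) = n_center + (n_edge - n_center) * r**p."""
    r = np.linspace(0.0, 1.0, samples)
    n = n_center + (n_edge - n_center) * r**p
    return np.trapz(n, r)

# With center/edge indices fixed at 1.415 / 1.37, a larger exponent
# (a flatter central plateau) pushes the average toward the center index.
avg_young = axial_average_index(1.415, 1.37, p=6)    # illustrative exponents
avg_old = axial_average_index(1.415, 1.37, p=10)
```

This reproduces the qualitative trend in the abstract: as the gradient flattens with age, the average axial index rises toward the center value even though the endpoint indices are held constant. The equivalent index requires the raytracing step and is not sketched here.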
Abstract:
Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose locations where measurements are taken so as to maximise information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, in finding a design which maximises the average utility. We use models for correlations of observations on the stream network that are based on stream network distances and described by moving average error models. Utility functions used reflect the needs of the experimenter, such as prediction of location values or estimation of parameters. We propose an algorithmic approach to design with the mean utility of a design estimated using Monte Carlo techniques and an exchange algorithm to search for optimal sampling designs. In particular, we focus on the problem of finding an optimal design from a set of fixed designs and finding an optimal subset of a given set of sampling locations. As there are many different variables to measure, such as chemical, physical and biological measurements at each location, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space filling to clustered designs and mixtures of these, but given the utility function, designs are relatively robust to the type of response variable.
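The exchange search can be illustrated with a stripped-down utility: instead of a Monte Carlo average over a prior, use a fixed covariance with a log-determinant (entropy-style) criterion, swapping single sites while any swap improves. All data here are synthetic and the utility is a simplified stand-in for the paper's averaged utilities:

```python
import numpy as np

def exchange_design(n_sites, k, utility, seed=0):
    """Greedy exchange search: swap one design point at a time while the
    utility improves; returns a locally optimal k-site design."""
    rng = np.random.default_rng(seed)
    design = list(rng.choice(n_sites, size=k, replace=False))
    best = utility(design)
    improved = True
    while improved:
        improved = False
        for i in range(k):
            for cand in range(n_sites):
                if cand in design:
                    continue
                trial = design.copy()
                trial[i] = cand
                u = utility(trial)
                if u > best + 1e-12:
                    design, best, improved = trial, u, True
    return sorted(design), best

# Synthetic 1-D "stream": correlation decays with distance along the network
pos = np.array([0.0, 0.1, 5.0, 10.0])
cov = np.exp(-np.abs(pos[:, None] - pos[None, :]))

def logdet_utility(idx):
    # Entropy-style criterion: prefers weakly correlated (well-spread) sites
    return np.linalg.slogdet(cov[np.ix_(idx, idx)])[1]

design, _ = exchange_design(len(pos), k=2, utility=logdet_utility)
```

Under this spread-rewarding utility the search picks the two most distant sites; a prediction-variance utility over the same covariance could instead favour clustered designs, which is the sensitivity to utility choice the abstract reports.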