921 results for non-major


Relevance: 20.00%

Abstract:

Hydrocarbon spills on roads are a major safety concern for the driving public and can have severe cost impacts, both on pavement maintenance and on the economy through disruption to services. The time taken to clean up spills and re-open roads in a safe driving condition is an issue of increasing concern given traffic levels on major urban arterials. Thus, the primary aim of the research was to develop a sorbent material that facilitates rapid clean-up of road spills. The methodology involved extensive research into a range of materials (organic, inorganic and synthetic sorbents), comprehensive testing at laboratory, scale-up and field levels, and product design (i.e. concept to prototype). The study also applied chemometrics to provide consistent, comparative methods of sorbent evaluation and performance assessment. In addition, sorbent materials at every stage were compared against a commercial benchmark. For the first time, the impact of diesel on asphalt pavement has been quantified and assessed in a systematic way. Contrary to conventional thinking and anecdotal observations, the study determined that the action of diesel on asphalt was quite rapid (i.e. hours rather than weeks or months). This significant finding demonstrates the need to minimise the impact of hydrocarbon spills and the potential application of the sorbent option. To better understand the adsorption phenomenon, surface characterisation techniques were applied to selected sorbent materials (i.e. sand, organo-clay and cotton fibre). Brunauer-Emmett-Teller (BET) and thermal analysis indicated that the main adsorption mechanism for the sorbents occurred on the external surface of the material in the diffusion region (sand and organo-clay) and/or capillaries (cotton fibre). Using environmental scanning electron microscopy (ESEM), it was observed that adsorption by the inter-fibre capillaries contributed to the high uptake of hydrocarbons by the cotton fibre.
Understanding the adsorption mechanism for these sorbents provided guidance and a scientific basis for the selection of materials. The study determined that non-woven cotton mats were ideal sorbent materials for the clean-up of hydrocarbon spills. The prototype sorbent was found to perform significantly better than the commercial benchmark, displaying the following key properties:

• superior hydrocarbon pick-up from the road pavement;
• high hydrocarbon retention capacity under an applied load;
• adequate field skid resistance post treatment;
• functional and easy to use in the field (e.g. routine handling, transportation, application and recovery);
• relatively inexpensive to produce, due to the use of raw cotton fibre and a simple production process;
• environmentally friendly (e.g. renewable materials, non-toxic to the environment and operators, and biodegradable); and
• rapid response time (e.g. two minutes total clean-up time, compared with thirty minutes for reference sorbents).

The major outcomes of the research project include:

a) development of a specifically designed sorbent material suitable for cleaning up hydrocarbon spills on roads;
b) submission of a patent application (serial number AU2005905850) for the prototype product; and
c) preparation of a Commercialisation Strategy to advance the sorbent product to the next phase (i.e. R&D to product commercialisation).

Relevance: 20.00%

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because there was no readily available code that included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines that can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study has been made of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To apply the Monte Carlo code effectively to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained.
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:

1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about the development (or otherwise) of a physical prototype DPA(+) imaging system.

The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
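The two- and three-component decompositions described above can be sketched as linear systems: two log-attenuation measurements determine two tissue thicknesses, and the extra DPA(+) path-length measurement supplies the third equation needed for three components. The attenuation coefficients and thicknesses below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Illustrative (made-up) attenuation coefficients per unit thickness
# (1/cm) at two beam energies -- NOT real tissue data.
MU = {
    "bone": {"lo": 0.60, "hi": 0.30},
    "fat":  {"lo": 0.19, "hi": 0.17},
    "lean": {"lo": 0.22, "hi": 0.19},
}

def dexa_two_component(log_att_lo, log_att_hi):
    """Two-component DEXA: solve for bone and lean thickness from the
    two log-attenuation measurements ln(I0/I) at low and high energy."""
    A = np.array([[MU["bone"]["lo"], MU["lean"]["lo"]],
                  [MU["bone"]["hi"], MU["lean"]["hi"]]])
    return np.linalg.solve(A, np.array([log_att_lo, log_att_hi]))

def dpa_plus(log_att_lo, log_att_hi, path_length):
    """DPA(+)-style decomposition: the linear path-length measurement
    adds a third equation (thicknesses sum to the path length), so
    bone, fat and lean thickness can all be recovered."""
    A = np.array([[MU["bone"]["lo"], MU["fat"]["lo"], MU["lean"]["lo"]],
                  [MU["bone"]["hi"], MU["fat"]["hi"], MU["lean"]["hi"]],
                  [1.0,              1.0,             1.0]])
    b = np.array([log_att_lo, log_att_hi, path_length])
    return np.linalg.solve(A, b)

# Synthesize a measurement from known thicknesses, then invert it.
t_true = np.array([0.5, 3.0, 6.5])  # bone, fat, lean (cm)
lo = sum(MU[k]["lo"] * t for k, t in zip(MU, t_true))
hi = sum(MU[k]["hi"] * t for k, t in zip(MU, t_true))
t_est = dpa_plus(lo, hi, t_true.sum())
print(np.round(t_est, 3))
```

With these illustrative coefficients the three-component system is noticeably worse conditioned than the two-component one, which is at least consistent with the poorer precision reported for DPA(+).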

Relevance: 20.00%

Abstract:

Keyword spotting is the task of detecting keywords of interest within continuous speech. The applications of this technology range from call centre dialogue systems to covert speech surveillance devices. Keyword spotting is particularly well suited to data mining tasks such as real-time keyword monitoring and unrestricted vocabulary audio document indexing. However, to date, many keyword spotting approaches have suffered from poor detection rates, high false alarm rates, or slow execution times, thus reducing their commercial viability. This work investigates the application of keyword spotting to data mining tasks. The thesis makes a number of major contributions to the field of keyword spotting. The first major contribution is the development of a novel keyword verification method named Cohort Word Verification. This method combines high level linguistic information with cohort-based verification techniques to obtain dramatic improvements in verification performance, in particular for the problematic short duration target word class. The second major contribution is the development of a novel audio document indexing technique named Dynamic Match Lattice Spotting. This technique augments lattice-based audio indexing principles with dynamic sequence matching techniques to provide robustness to erroneous lattice realisations. The resulting algorithm obtains significant improvement in detection rate over lattice-based audio document indexing while still maintaining extremely fast search speeds. The third major contribution is the study of multiple verifier fusion for the task of keyword verification. The reported experiments demonstrate that substantial improvements in verification performance can be obtained through the fusion of multiple keyword verifiers. The research focuses on combinations of speech background model based verifiers and cohort word verifiers. The final major contribution is a comprehensive study of the effects of limited training data for keyword spotting. This study is performed with consideration as to how these effects impact the immediate development and deployment of speech technologies for non-English languages.
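Dynamic Match Lattice Spotting combines lattice search with dynamic sequence matching. The sketch below illustrates only the dynamic-matching ingredient: a standard edit-distance comparison between a keyword's phone sequence and candidate phone sequences read off a lattice, accepting near matches so that erroneous lattice realisations are tolerated. The phone labels, costs and threshold are illustrative assumptions, not the thesis's actual algorithm.

```python
def edit_distance(target, hypothesis, sub_cost=1, ins_cost=1, del_cost=1):
    """Classic Levenshtein dynamic program over phone sequences;
    a low cost means a close match."""
    m, n = len(target), len(hypothesis)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * del_cost
    for j in range(1, n + 1):
        d[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = d[i - 1][j - 1] + (0 if target[i - 1] == hypothesis[j - 1]
                                     else sub_cost)
            d[i][j] = min(sub, d[i - 1][j] + del_cost, d[i][j - 1] + ins_cost)
    return d[m][n]

def spot_keyword(target, lattice_sequences, threshold=1):
    """Return the lattice phone sequences whose edit distance to the
    target keyword's phone sequence falls within the threshold."""
    return [seq for seq in lattice_sequences
            if edit_distance(target, seq) <= threshold]

# Target keyword "seven" as phones, matched against noisy realisations.
target = ["s", "eh", "v", "ax", "n"]
lattice = [["s", "eh", "v", "ax", "n"],   # exact realisation
           ["s", "ih", "v", "ax", "n"],   # one phone substituted
           ["t", "uw"]]                   # unrelated word
print(spot_keyword(target, lattice, threshold=1))
```

Allowing a small nonzero threshold is what gives robustness to recogniser errors; raising it trades false alarms against missed detections.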

Relevance: 20.00%

Abstract:

Various load compensation schemes proposed in the literature assume that the voltage source at the point of common coupling (PCC) is stiff. In practice, however, the load is remote from a distribution substation and is supplied by a feeder. In the presence of feeder impedance, the PWM inverter switchings distort both the PCC voltage and the source currents. In this paper, load compensation with such a non-stiff source is considered. A switching control of the voltage source inverter (VSI) based on state feedback is used for load compensation with a non-stiff source. The design of the state feedback controller requires careful consideration in the choice of the gain matrix and in the generation of reference quantities. These aspects are addressed in this paper. Detailed simulation and experimental results are given to support the control design.

Relevance: 20.00%

Abstract:

Nitrous oxide (N2O) is a potent agricultural greenhouse gas (GHG). More than 50% of the global anthropogenic N2O flux is attributable to emissions from soil, primarily due to large fertilizer nitrogen (N) applications to corn and other non-leguminous crops. Quantification of the trade-offs between N2O emissions, fertilizer N rate, and crop yield is an essential requirement for informing management strategies that aim to reduce the agricultural sector's GHG burden without compromising productivity and producer livelihood. There is currently great interest in developing and implementing agricultural GHG reduction offset projects for inclusion within carbon offset markets. Nitrous oxide, with a global warming potential (GWP) of 298, is a major target for these endeavours due to the high payback associated with preventing its emission. In this paper, we use robust quantitative relationships between fertilizer N rate and N2O emissions, along with a recently developed approach for determining economically profitable N rates for optimized crop yield, to propose a simple, transparent, and robust N2O emission reduction protocol (NERP) for generating agricultural GHG emission reduction credits. This NERP has the advantage of providing an economic and environmental incentive for producers and other stakeholders, a necessary requirement for the implementation of agricultural offset projects.
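The credit arithmetic behind such a protocol can be sketched in a few lines: the avoided fertilizer N is converted to avoided N2O via an emission factor, then to CO2-equivalents via the GWP of 298 cited above. The 1% emission factor used here is the IPCC Tier 1 default for direct soil emissions; the N rates and area are illustrative, and an actual NERP would use the paper's rate-dependent emission relationships rather than a flat factor.

```python
GWP_N2O = 298            # 100-year global warming potential, from the paper
EF_DIRECT = 0.01         # kg N2O-N per kg fertilizer N (IPCC Tier 1 default)
N2O_PER_N = 44.0 / 28.0  # convert N2O-N mass to N2O mass (molar masses)

def co2e_credits(n_baseline_kg_ha, n_optimal_kg_ha, area_ha):
    """CO2-equivalent credits (kg) from lowering the fertilizer N rate
    from a baseline to an economically optimal rate."""
    n_saved = (n_baseline_kg_ha - n_optimal_kg_ha) * area_ha
    n2o_avoided = n_saved * EF_DIRECT * N2O_PER_N
    return n2o_avoided * GWP_N2O

# Example: a 100 ha corn field, baseline 200 kg N/ha cut to 160 kg N/ha.
print(round(co2e_credits(200, 160, 100), 1))
```

Because the payback scales with the GWP of 298, even modest rate reductions translate into sizeable CO2-equivalent credits, which is the economic incentive the paper emphasises.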

Relevance: 20.00%

Abstract:

Australian climate, soils and agricultural management practices are significantly different from those of the northern hemisphere nations. Consequently, experimental data on greenhouse gas production from European and North American agricultural soils and its interpretation are unlikely to be directly applicable to Australian systems.

Relevance: 20.00%

Abstract:

This paper aimed to assess the magnitude of sewage pollution in an urban lake in Dhaka, Bangladesh, using quantitative PCR (qPCR) of the sewage-associated Bacteroides HF183 marker. qPCR was also used for the quantitative detection of the ruminant wastewater-associated CF128 marker, along with the enumeration of a traditional fecal indicator bacterium, enterococci. The number of enterococci in lake water samples ranged from 1.1 × 10⁴ to 1.9 × 10⁵ CFU/100 ml of water. Of the 20 water samples tested, 14 (70%) and 7 (35%) were PCR positive for the HF183 and CF128 markers, respectively. The numbers of the HF183 and CF128 markers in lake water samples were 3.9 × 10⁴ to 6.3 × 10⁷ and 9.3 × 10³ to 6.3 × 10⁵ genomic units (GU)/100 ml of water, respectively. The high numbers of enterococci and HF183 markers indicate sewage pollution and potential health risks to those who use the lake water for non-potable purposes such as bathing and washing clothes. This is the first study to investigate the presence of microbial source tracking (MST) markers in Dhaka, Bangladesh, where diarrhoeal disease is one of the major causes of childhood mortality. The molecular assays used in this study can provide valuable information on the extent of sewage pollution, facilitating the development of robust strategies to minimise potential health risks.
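qPCR quantification of markers such as HF183 rests on a standard curve relating the quantification cycle (Cq) to the log10 copy number. The sketch below shows that inversion; the slope and intercept are illustrative assumptions (a perfect-efficiency assay has a slope near -3.32), not values from this study.

```python
# Illustrative standard-curve parameters: Cq = SLOPE*log10(copies) + INTERCEPT.
SLOPE, INTERCEPT = -3.32, 38.0

def copies_from_cq(cq):
    """Marker copies per reaction implied by a Cq value, inverting the
    standard-curve relation."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

# A sample crossing threshold ~10 cycles earlier than another carries
# roughly three orders of magnitude more marker copies:
print(round(copies_from_cq(28.0) / copies_from_cq(38.0)))
```

Scaling copies per reaction up to genomic units per 100 ml, as reported in the paper, then only requires the extraction and template volumes of the assay.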

Relevance: 20.00%

Abstract:

BACKGROUND: The standard treatment for a non-union of a hallux metatarsophalangeal joint fusion has been to revise the fusion. Revision fusion is technically more demanding, often involving bone grafting, more substantial fixation and a prolonged period of immobilization postoperatively. We present data to suggest that removal of hardware and debridement alone is an alternative treatment option.

MATERIALS AND METHODS: A case note review identified patients with a symptomatic non-union after hallux metatarsophalangeal joint (MTPJ) fusion. It is our practice to offer these patients revision fusion or removal of hardware and debridement. For the seven patients who chose hardware removal and were left with a pseudarthrosis, a matched control group was selected from patients who had had successful fusions. Three outcome scores were used. Hallux valgus and dorsiflexion angles were recorded.

RESULTS: One hundred and thirty-nine hallux MTPJ arthrodeses were carried out. Fourteen non-unions were identified. The rate of non-union in males and following previous hallux MTPJ surgery was 19% and 24%, respectively. In females undergoing a primary MTPJ fusion, the rate was 2.4%. Twelve non-union patients were reviewed at a mean of 27 months. Eleven patients had elected to undergo removal of hardware and debridement. Four patients with pseudarthrosis were unhappy with the results and proceeded to either revision fusion or MTPJ replacement. The seven non-union patients who had removal of hardware alone had outcome scores marginally worse than those with successful fusions.

CONCLUSION: Removal of hardware alone is a reasonable option to offer as a relatively minor procedure following a failed arthrodesis of the first MTPJ. This must be accepted on the proviso that in this study four of 11 (36%) patients proceeded to a revision first MTPJ fusion or first MTPJ replacement. We also found that the rate of non-union in primary first MTPJ fusion was significantly higher in males and in patients who had undergone previous surgery.

Relevance: 20.00%

Abstract:

Background: This research addresses the development of a digital stethoscope for use with a telehealth communications network to allow doctors to examine patients remotely (a digital telehealth stethoscope). A telehealth stethoscope would allow remote auscultation of patients who do not live near a major hospital. Travelling from remote areas to major hospitals is expensive for patients, and a telehealth stethoscope could result in significant cost savings. Using a stethoscope requires great skill. To design a telehealth stethoscope that meets doctors' expectations, the use of existing stethoscopes in clinical contexts must be examined.

Method: Observations were conducted of 30 anaesthetic preadmission consultations. The observations were videotaped. Interactions between the doctor, the patient and non-human elements in the consultation were "coded" to transform the video into data. The data were analysed to reveal essential aspects of the interactions.

Results: The analysis has shown that the doctor controls the interaction during auscultation. The conduct of auscultation draws heavily on the doctor's tacit knowledge, allowing the doctor to treat the acoustic stethoscope as infrastructure; that is, the stethoscope sinks into the background and becomes completely transparent in use.

Conclusion: Two important, and related, implications for the design of a telehealth stethoscope have arisen from this research. First, as a telehealth stethoscope will be a shared device, doctors will not be able to make use of their existing expertise in using their own stethoscopes. Very simply, a telehealth stethoscope will sound different to a doctor's own stethoscope. Second, the collaborative interaction required to use a telehealth stethoscope will have to be invented and refined. A telehealth stethoscope will need to be carefully designed to address these issues and achieve successful use. This research challenges the concept of a telehealth stethoscope by raising questions about the ease and confidence with which doctors could use such a device.

Relevance: 20.00%

Abstract:

Separability is a concept that is very difficult to define, and yet much of our scientific method is implicitly based upon the assumption that systems can sensibly be reduced to a set of interacting components. This paper examines the notion of separability in the creation of bi-ambiguous compounds, using an approach based upon the CHSH and CH inequalities. It reports results of an experiment showing that violations of the CHSH and CH inequalities can occur in human conceptual combination.
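For readers unfamiliar with the inequality, the CHSH quantity can be stated in a few lines: S = E(a,b) - E(a,b') + E(a',b) + E(a',b') satisfies |S| ≤ 2 for any separable (local) model, while the quantum singlet correlation E(x,y) = -cos(x - y) exceeds that bound at suitably chosen angles. The sketch below simply evaluates S at the standard angles; it is not the paper's experimental procedure for conceptual combinations.

```python
import math

def chsh(E, a, ap, b, bp):
    """CHSH combination of four pairwise correlations E(x, y)."""
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

# Quantum singlet-state correlation between measurement angles x and y.
singlet = lambda x, y: -math.cos(x - y)

# Standard angle choices giving the maximal quantum violation |S| = 2*sqrt(2).
S = chsh(singlet, 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(round(abs(S), 3))  # exceeds the separable bound of 2
```

Any experimental |S| above 2 (for conceptual combinations, as for photons) is evidence that the system cannot be decomposed into independently behaving components.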

Relevance: 20.00%

Abstract:

Objective: Research is beginning to provide an indication of the co-occurring substance abuse and mental health needs of the driving under the influence (DUI) population. This study aimed to examine the extent of such psychiatric problems among a large sample of DUI offenders entering treatment in Texas.

Methods: This is a study of 36,373 past-year DUI clients and 308,714 non-past-year DUI clients admitted to Texas treatment programs between 2005 and 2008. Data were obtained from the State's administrative dataset.

Results: Analysis indicated that non-past-year DUI clients were more likely to present with more severe illicit substance use problems, while past-year DUI clients were more likely to have a primary problem with alcohol. Nevertheless, a cannabis use problem was also found to be significantly associated with DUI recidivism in the last year. With regard to mental health status, a major finding was that depression was the most common psychiatric condition reported by DUI clients, including those with more than one DUI offence in the past year. This cohort also reported elevated levels of bipolar disorder compared with the general population, and such a diagnosis was also associated with an increased likelihood of not completing treatment. Additionally, female clients were more likely than males to be diagnosed with mental health problems, to be placed on medications at admission, and to have problems with methamphetamine, cocaine, and opiates.

Conclusions: DUI offenders are at an increased risk of experiencing comorbid psychiatric disorders, and thus corresponding treatment programs need to cater for a range of mental health concerns that are likely to affect recidivism rates.

Relevance: 20.00%

Abstract:

We have developed a bioreactor vessel design that has the advantages of simplicity and ease of assembly and disassembly and, with an appropriately determined flow rate, even allows a scaffold to be suspended freely regardless of its weight. This article reports our experimental and numerical investigations to evaluate the performance of a newly developed non-perfusion conical bioreactor by visualizing the flow through scaffolds with 45° and 90° fiber lay down patterns. The experiments were conducted at Reynolds numbers (Re) of 121, 170, and 218, based on the local velocity and the width of the scaffolds. The flow fields were captured using short-time exposures of 60 µm particles suspended in the bioreactor and illuminated using a thin laser sheet. The effects of scaffold fiber lay down pattern and Reynolds number were obtained and compared against results from a computational fluid dynamics (CFD) software package. The objectives of this article are twofold: first, to investigate the hypothesis that there may be an insufficient exchange of medium within the interior of the scaffold when using our non-perfusion bioreactor; and second, to compare the flows within and around scaffolds of 45° and 90° fiber lay down patterns. Scaffold porosity was also found to influence flow patterns. It was therefore shown that fluidic transport could be achieved within scaffolds with our bioreactor design, despite it being a non-perfusion vessel. Fluid velocities were generally of the same order of magnitude as the inlet flow velocity, or one order lower. Additionally, the 90° fiber lay down pattern scaffold was found to allow slightly higher fluid velocities within it than the 45° fiber lay down pattern scaffold. This was due to the architecture and pore arrangement of the 90° fiber lay down pattern scaffold, which allows fluid to flow directly through (channel-like flow).
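For reference, a Reynolds number defined on local velocity and scaffold width takes the form Re = ρvw/μ. The sketch below backs out the local velocities implied by the reported Re values of 121, 170, and 218 under assumed water-like medium properties and an assumed 5 mm scaffold width; these are illustrative assumptions, not the article's actual fluid or dimensions.

```python
# Assumed water-like culture medium at room temperature (illustrative).
RHO = 1000.0   # density, kg/m^3
MU = 1.0e-3    # dynamic viscosity, Pa.s
W = 0.005      # assumed scaffold width, m (5 mm)

def reynolds(velocity):
    """Re = rho * v * w / mu for a given local velocity (m/s)."""
    return RHO * velocity * W / MU

def velocity_for(re):
    """Local velocity (m/s) implied by a target Reynolds number."""
    return re * MU / (RHO * W)

for re in (121, 170, 218):
    print(re, round(velocity_for(re) * 1000, 1), "mm/s")
```

Under these assumptions the reported Re range corresponds to local velocities of a few tens of mm/s, i.e. a slow laminar regime, consistent with the particle-visualization approach described above.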

Relevance: 20.00%

Abstract:

This article reviews the international evidence on the impact of civil and criminal sanctions upon serious tax noncompliance by individuals. This construct lacks sharp definitional boundaries but includes large tax fraud and large-scale evasion that are not dealt with as fraud. Although substantial research and theory have been developed on general tax evasion and compliance, their conclusions might not apply to large-scale intentional fraudsters. No scientifically defensible studies have directly compared civil and criminal sanctions for tax fraud, although one U.S. study reported that significantly enhanced criminal sanctions had a greater effect than enhanced audit levels. Prosecution is public, whereas administrative penalties are confidential, and this fact encourages those caught to pay heavy penalties to avoid publicity, a criminal record, and imprisonment.

Relevance: 20.00%

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts: variation over and above that accounted for by the Poisson density. This extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord through the exploration of additional dispersion functions and the use of an independent data set, presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics.
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
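The fixed versus modeled dispersion distinction can be made concrete with the negative binomial variance function Var(y) = μ + αμ². A fixed dispersion parameter makes the extra-Poisson variance a fixed multiple of μ² everywhere, whereas modeling α as a function of covariates lets the extra-variance differ across sites. The mean and dispersion functions below are assumed illustrative forms, not the specifications estimated from the Georgia data.

```python
import math

def nb_variance(mu, alpha):
    """Negative binomial variance: Poisson part plus extra-variation."""
    return mu + alpha * mu ** 2

def alpha_fixed(_flow, alpha0=0.2):
    """Conventional assumption: one dispersion parameter for all sites."""
    return alpha0

def alpha_modeled(flow, g0=-2.0, g1=0.1):
    """Assumed log-linear dispersion function of a covariate,
    alpha = exp(g0 + g1 * flow), in the spirit of Miaou and Lord."""
    return math.exp(g0 + g1 * flow)

for flow in (2.0, 8.0):                   # e.g. thousands of vehicles/day
    mu = math.exp(-1.0 + 0.5 * flow)      # assumed log-linear mean function
    print(flow,
          round(nb_variance(mu, alpha_fixed(flow)), 2),
          round(nb_variance(mu, alpha_modeled(flow)), 2))
```

When the mean function already absorbs the covariate effects well, the estimated g1 would shrink toward zero and the two variance functions coincide, which is exactly the pattern the findings above describe.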